Everyone is a Hypocrite and a Spammer

One hates to give Jason Calacanis any additional exposure, but how can a person be so against SEO while selling text links for scuba blackjack online? Is he ahead of the market on global warming?

Grow up.. the only thing you're ever going to prove by trying to game my SeRP is that you're low-class idiots.

True, or maybe we are looking for scuba blackjack customers, and knew that you publish high quality original content and ads for that market.

[Video] What is a Self Reinforcing Authority (and a Self Reinforcing Market Position)?


Video Summary:

Some documents and websites build self reinforcing authority that makes them hard to beat for their targeted search terms. This video explains how that works, gives examples of some self reinforcing market authorities, and offers tips on how to make these types of sites and pages.

Resources Mentioned in the Video:

Examples of Self Reinforcing Authorities From This Video:

  • us debt clock

  • xe currency converter
  • search engine history
  • search engine ranking factors
  • black hat seo
  • seo code of ethics
  • seo today / search engine watch

Things I Should Have Mentioned That I Forgot:

  • Your title is important because most people will reference your document by its title.

  • Statistics, standards, and scientific-sounding things can easily become self reinforcing powerhouses, especially if they feed into the ego of the target audience.
  • If you get large media coverage of your idea, leverage it to get more coverage. Show it off to seem exceptionally legitimate and trustworthy.
  • US News & World Report ranks colleges, and is a great example of a self reinforcing authority.
  • There are common ways to undermine authority that may prevent a site or article from becoming authoritative.
  • If someone has an authoritative idea in another market, but nobody has applied it to your market, that may present an easy opportunity.

Signs of a Low Quality Website

Webmasterworld recently had a good thread about signs of low quality websites. The less a person knows about your topic, the more likely they are to rely on general signs of quality (or the lack thereof) when considering whether they should link to your site.

Common Quality Questions:

Is the design clean? Is the content well organized? Do they have major misspellings on their homepage? Who is behind the site? Is it easy to contact them? Are they referenced by any other credible sources? How unique and useful is the content? How aggressively are ads blended into the content? etc. etc. etc.

Why Proxies for Quality Are Important:

Recently someone spread a "God Hates Fags" song website. Friends were instant messaging me asking whether it was real. Some journalists got it wrong. People are getting better at creating fakes. The easier we make it for people to trust us in a snap judgment, the more people will trust us (and link to our sites).

These proxies for trust are important, especially when you are new to an established industry, are in a new industry with a small community of support, are in a rapidly growing industry that the media is having a feeding frenzy over, or are the seedy arm of a larger industry.

Example of the Importance of Outside Perception:

If an industry is new, the early leaders of that industry might be determined by mainstream media perception (or other perception outside of that industry). Using blogs as an example, if the media did not constantly pump up the Weblogs Inc. story that company still might be unprofitable today. That media exposure led to more media exposure, gave the sites the link juice to help them rank, and gave them brand exposure that brought in advertisements.

Relating This to the SEO Industry:

With SEO, it is easier to be seen as an SEO expert if you are first seen as an expert on search. It is easier to be trusted as an expert on any topic if your site does not flag common signals of crap.

I just got a link from the WSJ to my keyword research tool, but if I had scored lower on the proxies for value maybe they never would have linked. And when you get that type of link you can leverage it as an additional signal of trust that makes it easier for others to link to you.

With BlackHatSEO.com, I mentioned "as seen in Clickz and Search Engine Watch," but what I didn't mention was that both mentions were brief and in the same syndicated article. When the London Times interviewed me about that site I quickly put up another "as seen in" at the top of the home page, which will make it easier to get more exposure. You want your press coverage to lead to more press coverage, because those are some of the most trusted links, and links that money alone usually can't buy.

But I am Already Doing Well:

Many people who buy consultations are already doing far better than I would expect them to do, given some of the obvious flaws I see with their site structure and marketing methods. Some state that they are already doing well. The point of these signs of crap is not that you need to fill every hole to do well, or that you can't do well without filling them. But if a site was already doing well with many errors that undermined its credibility and linkability, imagine how much better it can do after it fixes those obvious errors.

Straight Out of the Andy Hagans Playbook

Andy Hagans reveals virtually everything there is to know about link baiting and social marketing.

Hidden Content Costs & Understanding Your Profit Potential Per Page

Ever since Google got more selective about what it will index, the model for profitable SEO has changed: instead of chucking up pages and hoping some of them are profitable, it now makes sense to put more strategy into what you are willing to publish.

The Supplemental Index Hates Parasitic SEO:

Each site will only get so many pages indexed given a certain link authority. And each of those pages will rank based on the domain's authority score, and the authority of the individual page, but each page needs a minimum authority score to get indexed and stay out of the supplemental results - this is how Google is trying to fight off parasitic SEO.

Given that many people are leveraging trusted domains, it makes sense to leverage yours thoughtfully if you have one. CNN will rank for a lot of queries, but it does not make sense for Google to return nothing but CNN. It is good for the health of Google to have some variety in their search results. This is why smaller sites can still compete with the bigger ones: Google needs the smaller sites to have variety and to have leverage over the larger sites...to keep the larger sites honest if they are too aggressive in leveraging their authority, or have holes that others are exploiting.

Extending a Profitable Website:

If you have a 100 page niche website you may be able to expand it out to 500 pages without seeing too much of a drop in revenue on those first 100 pages, but eventually you will see some drop off where the cost of additional content (via link authority that it pulls from other pages on your site) nearly matches the revenue potential of the new pages. And then at some point, especially if you are not doing good keyword research, have bad information architecture, create pages that compete with other pages on your site, are not actively participating in your market (gaining links and mindshare), or if you are expanding from a higher margin keyword set to a lower margin one, you may see revenues drop as you add more pages.

The solution to this problem is to build editorial linkage data and stop adding pages unless they have a net positive profit potential.

What are the costs of content?

  • the time and money that went into creating it

  • link equity (and the potential to be indexed) that the page takes from other pages
  • the mindshare and effort that could have been used doing something potentially more productive
  • the time it takes to maintain the content
  • if it is bad or off topic content, anything that causes people to unsubscribe, hurts conversion rates, or lowers your perceived value is a cost

How can a Page Create Profit?

  • anything that leads people toward telling others about you (links or other word of mouth marketing) is a form of profit

  • anything that makes more people pay attention to you or boosts the credibility of your site is a form of profit
  • anything that thickens your margins, increases conversion rates, or increases lifetime value of a customer creates profit
  • anything that reduces the amount of bad customers you have to deal with is a form of profit

Mixing Up Quality for Profit Potential:

I am still a firm believer in creating content of various quality levels and cost levels, using the authoritative content to get the lower quality content indexed, and using the lower quality content earnings to finance the higher quality ideas, but rather than thinking of each page as another chance to profit it helps to weigh the risks and rewards when mapping out a site and site structure.

Increasing Profit:

Rather than covering many fields broadly consider going deeper into the most profitable areas by

  • creating more pages in the expensive niches

  • making articles about the most profitable topics semantically correct with lots of variation and rich unique content
  • highly representing the most valuable content in your navigational scheme and internal link structure
  • creating self reinforcing authority pages in the most profitable verticals
  • requesting visitors add content to the most valuable sections or give you feedback on what content ideas they would like to see covered in your most valuable sections
  • If your site has more authority than you know what to do with consider adding a user generated content area to your site

Take Out the Trash:

If Google is only indexing a portion of your site make sure you make it easy for them to index your most important content. If you have an under-performing section on your site consider:

  • de-weighting its integration in the site's navigational scheme and link structure

  • placing more internal and external link weight on the higher performing sections
  • if it does not have much profit potential and nobody is linking at it you may want to temporarily block Googlebot from indexing that section using robots.txt, or remove the weak content until you have more link authority and/or a better way to monetize it
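As a sketch, temporarily blocking Googlebot from a weak section via robots.txt might look like this (the /coupons/ path is a hypothetical example, not from the post):

```
User-agent: Googlebot
Disallow: /coupons/
```

Disallow rules match URL path prefixes, so this blocks everything under /coupons/; remove the rule once you have the link authority or monetization to support that section again.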

Google Offers More Link Data

Google's link: command has been broken forever, but now Google lets you see a far more representative sample of external links to your site, along with your internal link structure, if you verify that you own your site by signing up at Google Webmaster Central. They also allow you to export your linkage data as an Excel file. Some ways to use this data:

  • look at internal link structure of important pages and make sure they are well represented

  • look at the external link structure of important pages and make sure they are well represented
  • look at which pages on your site are well represented and make sure they link to other key pages
  • download your external linkage data and sort by date to look for new link sources (and why they are linking at your site)
  • run the Excel sheet through a duplicate site remover or C-class IP range checker to see how diverse your linking profile is
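The last two steps can be sketched in Python. This is a minimal illustration rather than a real tool: the URL/IP pairs are made-up sample data passed in by hand (in practice you would resolve each linking domain yourself), and grouping on the first three octets approximates the old C-class (/24) block notion:

```python
from urllib.parse import urlparse
from collections import defaultdict

def link_diversity(urls_with_ips):
    """Group linking URLs by domain and by C-class (/24) IP block.

    urls_with_ips: list of (url, ip_address) tuples. IPs are passed in
    here so the sketch stays self-contained and offline.
    """
    domains = defaultdict(int)
    c_blocks = defaultdict(set)
    for url, ip in urls_with_ips:
        # normalize the host: lowercase, strip a leading "www."
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        domains[domain] += 1
        # first three octets of the IP define the /24 (C-class) block
        c_block = ".".join(ip.split(".")[:3])
        c_blocks[c_block].add(domain)
    return domains, c_blocks

links = [
    ("http://www.example.com/page1", "192.0.2.10"),
    ("http://example.com/page2",     "192.0.2.10"),
    ("http://blog.other.org/post",   "198.51.100.7"),
]
domains, c_blocks = link_diversity(links)
print(len(domains), "unique domains,", len(c_blocks), "distinct C-class blocks")
```

Many links but few unique domains or C-class blocks suggests a narrow, less natural-looking link profile.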

If you have shifty sites obviously there would be little to no upside in verifying those sites with Google, but if your sites are generally above board you might find this tool useful.

Thanks to Adam.

Triple Your Google AdWords CTR Overnight by Doing Nothing, Guaranteed!

I have seen many ad studies that empirically proved that the person doing the study did not collect enough data to publish their findings as irrefutable facts. While recently split testing my new and old landing pages I came across an example that collected far more data than many of these studies do: Google AdWords ad clickthrough rates.

Same ad title. Same ad copy. Same display URL. Same keyword. And yet one has 3x the CTR as the other.

If you collect enough small data samples you can prove whatever point you want to. And you may even believe you are being honest when you do so. Numbers don't lie, but they don't always tell the truth, either.

Want to know when your data is accurate? Create another ad in the split group that is an exact copy of one of the ads being tested. Once the variance between the clones goes away, the rest of the split test data might be a bit more believable.
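Another sanity check before trusting a split test is a two-proportion z-test on the clickthrough rates; it asks how likely a CTR gap this large would be if the two ads really performed identically. A minimal sketch (the click and impression counts are made-up illustrations, not from my account):

```python
import math

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Two-sided two-proportion z-test on clickthrough rates.
    Returns the p-value: the probability of seeing a CTR gap at
    least this large if the two ads truly perform the same."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF (via erf)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# a 3% vs 9% CTR looks like a 3x winner, but on only 100
# impressions each the gap is not significant at the 5% level
print(round(ctr_z_test(3, 100, 9, 100), 3))
```

With ten times the impressions the same CTR gap would be overwhelmingly significant, which is exactly why small samples let you "prove" whatever you want.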

Anonymous Voting is Garbage

What is the problem with free anonymous votes? Just like an infinite supply of money, it has no real value. Get one account banned and start working on the next. I might be able to believe this drivel if I didn't know so many cases that proved this wrong:

We strongly believe attempts to game Digg are ineffective. While it would be foolish to say that Digg has never been artificially manipulated in the 2+ years (50,000,000+ diggs) we’ve been live, we’re confident that such attempts do not impact the content that reaches the home page.

Beyond self interested manipulation, allowing people (or bots) to vote on content takes the focus away from the content and makes people interested in arbitrary voting or in how voters may respond. It makes the content watered down, average, bland, and generally worthless. It takes the focus away from your value and treats anonymous input as having some real value, but outside of gaming them for links, it doesn't...just look at how fast those visitors leave sites.

Your creativity and value are in your addictions.

Making a Couple SEO Videos

A long time ago I made a keyword research video which was WAY too long and came out a bit grainy. I recently bought the latest copy of Camtasia Studio and am thinking about making a few short SEO videos. Here are some possible topic ideas:

  • what are quality links

  • what web directories are worthwhile
  • link baiting and how do I appeal to web 2.0
  • how to use SEO for Firefox
  • how to do competitive research
  • how to find the most profitable keywords
  • how to write page titles and meta description tags
  • how to do on the page optimization
  • how to structure a website to be search engine friendly AND convert well
  • how do I pick a niche
  • how to evaluate the health of a website

So the question is... what topics would you like to see me make short tutorial videos about? I can't guarantee I will make them all, but I will try to make at least a few, and then listen to feedback to make a few more that are hopefully a bit better.

User Generated Tags Are Useless Noise

A current dumb, but popular, trend is to get users to tag pages.

How valuable is a Technorati tag page to a Google user? Probably just about worthless, IMHO. The only reason they exist is that it gives bloggers crumbs of exposure in exchange for their link equity, and it gives Technorati a way to build authority and get an automated scraper to pass as real content. Other large sites have started following this tag example, and allow users to use non-descriptive labels like 2000, hip, and cool to tag their content. As if this tag noise was not bad enough for people trying to look past the clutter and actually find something, some of these sites use the tags to create additional content pages.

What are these pages? They are a perfect example of low information quality. Some dumb content management systems and blog plug-ins take the noise one step further by cross referencing the tags, generating a virtually infinite set of tag pages that keep multiplying until search engines get sick of wasting their bandwidth and your link equity indexing the low value garbage.
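The cross-referencing problem grows combinatorially: even a handful of tags can spawn dozens of thin pages. A toy illustration in Python (the tag list is hypothetical):

```python
from itertools import combinations

tags = ["seo", "google", "links", "blogging", "2000", "hip", "cool"]

# one page per tag, plus one page per cross-referenced tag pair
tag_pages = len(tags)
pair_pages = len(list(combinations(tags, 2)))  # n * (n - 1) / 2 pairs
print(tag_pages + pair_pages)  # 7 + 21 = 28 thin pages from 7 tags
```

Add triple cross-references or date archives on top and the page count explodes further, all of it soaking up crawl bandwidth and link equity.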

A set of loosely defined tag pages is no better than a low quality search result page. Search engines long ago decided they generally don't want to index the search results of other search engines. When too many of their own results are these noisy tag pages, eventually they are going to turn against tags...maybe not via any official statements, but some sites will just not rank as well.

Search engines react to the noise in the marketplace; then the marketplace creates new types of noise to pollute the SERPs; then the cycle repeats. Tags are noise, and they will have their day.

Why would you want to let users outside of your business interests control your information architecture and internal link structure when Google is getting picky about what they are willing to index? Why waste your link equity and bandwidth?
