For-Profit Websites Have No Value Until They Rank

May 10th

If you are passionate, a site can have value without ranking, as rankings are a lagging indicator of site quality, market timing, and/or marketing savvy. But if you are offering something substantially similar to competing sites, it has virtually no value until it ranks at the top of the results. In the quest to build value, mindshare, and rankings it is easy to focus on unimportant things that eat time and provide little return. For example, you could write a 3,000 page website that is the encyclopedia for your topic, or you could try to create the ultimate branded property, but if nobody sees it then the content or brand doesn't flourish. You need the site to look good enough to compete, but there is little value in trying to make it perfect right out of the gate.

Brand Development and Market Leverage

While you are writing page after page or tweaking away at building a perfect new site, competitors are leveraging Indian copywriters who write thin informational pieces wrapped in AdSense. Those same low quality sites garner self-reinforcing links because they are already ranking, and most people are lazy, just linking to whatever they can easily find.

Premature Testing

Any tests to monetize a low traffic site are going to provide inadequate and inconclusive results, which will also likely feed into your biases and expected outcomes. If you build authority first and then come back and test later, you will receive a greater ROI for the amount of effort required to perform the tests.

To put into perspective the testing errors that small samples can create, a friend of mine has a site which makes virtually the same amount from AdSense every day. The same site sells leads. Some days it generates 6 conversions and other days it does 21, all while the traffic flow and AdSense earnings are fairly constant. If you compared one revenue stream to the other, the obvious winner would look different based on what day you chose.

Everything on the Web is Broken

Looking really polished is not necessarily remarkable. You are not cutting edge if you have to be perfect before you are willing to be seen. If I had not been willing to release my first ebook before it was really ready, you probably would not be reading these words right now.

Everything on the Web is Biased

I believe people have more of a tendency to talk about and share things that are unpolished. Google gets talked about for getting sued, Digg gets talked about for getting gamed, Fox News gets talked about for selling entertainment as news, and so on.

When you try to come out of the gate perfect, it is hard to relate to your end audience without spending thousands of dollars on marketing. It is far more remarkable to come out of the gate slightly broken and biased and appeal to the overt biases of those who can give you authority. I am not suggesting being racist or sexist or anything like that, but people are generally more receptive to (and thus more likely to share) things that reinforce their worldview. Appeal to a known bias, market that story, then create another story that works for another group. Do it over and over until you have enough authority to clean up the site and become the market leader.

Rough edges that appeal to deep niches are a far better marketing approach than something broad and polished to a fine dull.

In Summary...

Get authority by appealing to smaller groups of your audience, grab marketshare, THEN try to look authoritative. Most people don't know HOW you acquired your authority...it is not something most think to question, and if they do you can always change your look and feel as needed to accommodate the market.

You don't have to do anything deceptive to gain authority, but if you think perfect content is the answer you are only deceiving yourself.

Search Engines Giving You the Tools to Kill Yourself

May 7th

Many publishers hide additional information in sections that readers can choose to expand if they show interest in the topic. For example, each of Think Progress's navigational sections is expandable, and some publishers use "more information" links or other cues to make additional page content visible. These can be used deceptively, but if you have a strong brand and are using them with the end user in mind, I doubt search engines will think the intent is bad.

AdSense Section Targeting:

As search has taken a larger and larger piece of the web, search engines have given us ways to mark up our pages to suit their needs. AdSense section targeting made it easier for Google to target content ads to your site. That sounds like a good idea, but search engines also offer tags that provide publishers no value.
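
For reference, section targeting is implemented with a pair of HTML comments wrapped around the content you want Google's ad targeting to emphasize (or, with weight=ignore, to de-emphasize). A minimal sketch, with made-up content:

    <!-- google_ad_section_start -->
    <p>The main article text you want AdSense to weigh when picking ads.</p>
    <!-- google_ad_section_end -->

    <!-- google_ad_section_start(weight=ignore) -->
    <p>Navigation, boilerplate, or off-topic text you want the ad targeting to skip.</p>
    <!-- google_ad_section_end -->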

Google's NoFollow:

Nofollow was originally recommended as a way to stop blog comment spam, but it has morphed into a tag that Matt Cutts wants you to use on any paid or unnatural link. What makes a link unnatural? In one form or another almost everything is paid for, whether by giving away value, exchanging currency, or through nepotism.
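
For anyone who has not used it, nofollow is just an attribute value added to an ordinary anchor tag. A minimal sketch, using an illustrative URL:

    <!-- a normal, editorially given link -->
    <a href="http://www.example.com/">example anchor text</a>

    <!-- the same link with nofollow, telling engines not to count it as a vote -->
    <a href="http://www.example.com/" rel="nofollow">example anchor text</a>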

Do You Trust Yourself?

If a page has many nofollow tags on it, isn't that another way of saying that the publisher does not trust their own content? If a publisher says that they don't trust their own content or their own advertisers, then why would search engines (or savvy webmasters) want to trust them?

The Machine is Broken:

Bob Massa recently highlighted how absurd the current use of the nofollow attribute is:

Mr. Cutts, speaking on behalf of Google presumably, made the comment, "if you want to buy links just for traffic, totally fine just don’t do it so they affect search engines".

This concept is completely flawed. This self serving philosophy is also at the very core of the problem. When the machine attempts to modify the behavior of people to satisfy its own ends, the machine is broken. What people do should not be seen as affecting the search engine. What people do should be the very reason for the engine to exist in the first place. If the search engine is being affected by the actions of people, is any logical person going to honestly assume that it is the people that are broken? That is exactly what is happening here.

Yahoo!'s Robots-Nocontent Attribute:

Search engines have gotten better at identifying duplicate content. Some search engines may strip obvious boilerplate navigational elements from pages. Some may place pages with too much duplicate content in supplemental results. Some may put sites with too much duplicate content into a reduced crawling status.

With all of these ways to fight off content duplication already in place, Yahoo! now offers a robots-nocontent tag. One of the first people to comment on the news was Google's Matt Cutts, who said:

Danny, can you ask how Yahoo intends to treat links in the "robots-nocontent" section?

Don't Use the Robots-Nocontent Attribute:

It might be easy to add class="robots-nocontent" to some of your divs, but should you? I think it has little value. Sure you could use it in a sneaky way, as suggested by Jay Westerdal, but the problems with that are:

  • it looks sneaky

  • you are removing content from your pages (and will thus rank for fewer phrases)
  • there are easier and more effective ways of changing the meaning of a page without looking so sneaky...like just rewriting an article, adding a spammy comment that looks like it came from a third party, or adding a few additional words here or there.
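
For reference, the markup itself is nothing exotic: it is a class value Yahoo!'s crawler recognizes on ordinary block elements. A minimal sketch, with placeholder content:

    <!-- Yahoo!-specific: asks Slurp not to treat this block as page content -->
    <div class="robots-nocontent">
      <p>Repeated footer disclaimer, ad block, or other boilerplate.</p>
    </div>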

Yahoo! is the top network of sites on the web. Internally they have publishing teams and an SEO team. If their search engineers can't figure out how to use their own internal traffic stats and other relevancy measurements to refine their duplicate detection algorithms, they deserve to bleed market share until they no longer have relevancy in the marketplace.

How to Change the Focus of a Page Without Using Robots-Nocontent:

If you want to change the focus of your pages, here are some of the best ways to do it:

  • Ensure your page title and meta description are unique (a markup sketch follows this list). Do not place the same words at the start of every page title on all the pages of a new website.

  • Make your h1 headings and subheadings target a slightly different word set than your page title.
  • If your page is thin on content, add more relevant, unique content to the page. The solution to not getting killed by duplicate content filters is adding more unique content, not stripping out the obvious, required duplication (such as navigation and advertisements) that search engines should be able to figure out on their own.
  • If your site has comments or consumer feedback you can post or encourage feedback that targets other keywords. Comments offer free text. A 500 word page with an additional 1,000 words in the comment section may rank for 2 or 3 times as many search queries. Don't throw away the free content.
  • For those who are really aggressive and have crusty links that will never be removed, consider placing your commercial messages on one of your highly trusted, high ranking pages. People buy and sell websites; who is to say that the contents of a URL can't change?
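
To make the first few items concrete, here is a minimal sketch of a page head and heading that follow that advice; the titles, descriptions, and topic are invented examples:

    <head>
      <!-- page-specific title, not the same boilerplate words repeated site-wide -->
      <title>Blue Widget Installation Guide | Example Widget Shop</title>
      <!-- a unique meta description written for this page only -->
      <meta name="description" content="Step by step instructions for installing blue widgets, with photos and troubleshooting tips.">
    </head>
    <body>
      <!-- heading targets a slightly different word set than the page title -->
      <h1>How to Install a Blue Widget in Five Easy Steps</h1>
      <p>Unique page content continues here.</p>
    </body>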

Why I Love Google's Supplemental Index

May 6th

Forbes recently wrote an article about Google's supplemental results, painting it as webpage hell. The article states that pages in Google's supplemental index are trusted less than pages in the regular index:

Google's programmers appear to have created the supplemental index with the best intentions. It's designed to lighten the workload of Google's "spider," the algorithm that constantly combs and categorizes the Web's pages. Google uses the index as a holding pen for pages it deems to be of low quality or designed to appear artificially high in search results.

Matt Cutts was quick to state that supplemental results are not a big deal, as Rand did here too, but supplemental results ARE a big deal. They are an indication of the health of a website.

I have worked on some of the largest sites and networks of sites on the web (hundreds of millions of pages and beyond). When looking for duplicate content or information architecture related issues, the search engines do not let you view deep enough to see all indexing problems, so one of the first things I do is use this search to find low quality pages (i.e., things that suck PageRank and do not add much unique content to the site). After you find some of the major issues you can dig deeper by filtering out some of the core issues that showed up on your first supplemental searches. For example, here are threadwatch.org supplemental results that do not contain the word node in the URL.
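
The exact query formats shift over time, but as a rough sketch of the kind of searches meant here: one commonly circulated way to surface mostly supplemental pages at the time of writing was a site: query padded with *** and an excluded nonsense word, which can then be narrowed with -inurl: exclusions (for example, to filter out URLs containing node). Treat the specific strings as illustrative; they may stop working whenever Google changes how supplemental results are displayed.

    site:threadwatch.org *** -sljktf
    site:threadwatch.org *** -sljktf -inurl:node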

If you have duplicate content issues, at best you are splitting your PageRank, but you might also affect your crawl priorities. If Google thinks 90% of a site is garbage (or not worth trusting much) I am willing to bet that they also trust anything else on that domain a bit less than they otherwise would, and are more restrictive with their willingness to crawl the rest of the site. As noted in Wasting Link Authority on Ineffective Internal Link Structure, ShoeMoney increased his search traffic 1400% after blocking some of his supplemental pages.

People Don't Look Beyond the Page

May 3rd

I once saw a college professor, on his official profile page, cite a page about caffeine from a low quality site about pornography, gambling, and drugs. Many people never look beyond the page when linking to a story.

This is not to say that one should put a story on a bad website, but that one should make the story page they are currently marketing as clean as possible so it is easy to link at. And you are probably better off placing your marketing stories on your key site if you think they will still spread.

Over time people will become more aware of content bait being hosted on crappy sites, but for now most people don't look beyond the page when referencing a story.

Digg to Pligg - Easy Social News Links

Apr 25th

Social news sites come to prominence largely through the controversies associated with people gaming them, and without people gaming them few would ever garner a critical mass. Marketers spamming a social news site is part of the growth cycle.

If I can come up with an easy search string that detects that many Pligg sites, you have to think that as people and spambots abuse them the search engines will discount most of their votes, but in both the short and long term there is still going to be value in many of them.

Why buy low quality PR2 and PR3 links from inactive parts of the web when you can get on-topic ones for free? Of course most of these communities will have limited value and die (failing to build a critical mass), but if you are submitting useful content to the real ones, that will also lead to indirect links and other signs of trust and quality.

Content networks with virtually no content cost, free software, and limited editorial control might call people who submit self promotional stuff spammers, but what are all these sites until they build a critical mass? Parasitic useless noise, a form of spam.

The difference between a spammer and a contributor is that a contributor will post at least a few entries that are not self promotional, and they will also create content worthy of exposure, both of which help build the community.

If you feel bad about gaming markets just remember that every market and every value system is both self-promotional and gamed.

As background, here are some tips on formatting linkbait and free linkbait ideas.

Buying Links for Underdeveloped, Unremarkable Sites

Apr 17th

If search engines already have a reason to trust your site then leveraging SEO may help you gain more exposure. However, if your conversion process is not smooth, search as an isolated marketing channel is rarely an effective long-term business model.

Flat Rate Paid Inclusion: Yahoo! Search Submit Basic Returns

It has not been talked about much, but my partner noticed Yahoo! once again shifted paid inclusion to a yearly flat rate.

While Search Submit Pro gives you more control over listings and is sold on a per click basis, Search Submit Basic allows you to submit URLs for $49 per year per URL.

In the past Search Submit Basic was called Search Submit Express, which charged a flat inclusion price and sold clicks on a CPC basis. Here is an Archive.org link to the old program.

If you have launched a new site and are not getting much Yahoo! traffic, submitting a few of your highest value pages is a good call. If you have key deep high value pages that are not staying indexed in Yahoo! this program also makes sense.

Exerting Influence & Moving Markets

Apr 6th

There are two basic ways to do SEO. One is to look for the criteria you think the search engine wants to see, and then work to slowly build it day after day, chipping away at keyword research and picking up good links one at a time here and there. If you understand what the search engines are looking for this is still readily possible in most markets, but with each passing day it gets harder.

The other way to do SEO is to move markets. When I interviewed Bob Massa, his words "search engines follow people" stuck in my head. So what does it mean to move markets? People are using the word linkarati. It wasn't a word until recently. Rand made it up. As that word spreads, his brand equity, market position, and link authority all improve. Does that make Rand an SEO expert or a person good with words? Probably both, as far as engines and the public are concerned.

I have seen friends get free homepage links from businesses that are making tens of millions of dollars in profit per year. I have had Fortune 500 companies contact me with free co-branding offers for new sites. I have come up with content ideas that naturally made it to the #1 position on Netscape and stuck there for 20+ hours straight. I still fail often and have a lot to learn, but I do know this: if you are the featured content on most of the sites in your field then YOU are relevant, and search engines will pick up on it unless their algorithms are broken.

When I was new to SEO I did much more block and tackle SEO. I had to, because I had limited knowledge, no trust, no leverage, no money, and was a bad writer. The little things mattered a lot. They had to. As I learned more about the web I tried to transition into the second mode of marketing. Neither method is right or wrong, and each works better for different people at different stages, but as more people come online I think the second path is easier, safer, more stable, more profitable, and more rewarding.

If you are empathetic towards a market and have interests aligned with a market you do not need to understand exactly how search engines work. Search engines follow people.

It is still worth doing the little things right so that when the big things hit you are as efficient as possible, but if you can mix research, active marketing, and reactive marketing into your site strategy you will be more successful than you would be if you ignored one of them.

Different Links Have Different Goals

Apr 6th

WMW has a good thread about some of the changes people are noticing at Google. Two big things are happening: more and more pages are getting thrown into Google's supplemental results, and Google may be getting more aggressive with re-ranking results based on local inter-connectivity and other quality related criteria. You need some types of links to have enough raw PageRank to keep most of your pages indexed, and to have your deeper pages included in the final selection set of long tail search results. You need links from trusted related sites in order to get a boost in result re-ranking.

There are also a few other types of links to look at, if you want to take a more holistic view:

  • links from general trusted seed sites

  • links that drive sales
  • links that lead to additional trusted links
  • links that gain you mindshare or subscribers

Some of those other links may not even be traditional links, but may come from a well placed ad buy.

Every unbranded site is heavily unbalanced in its link profile. If you do not have a strong brand, then the key people in your community who should be talking about you are not (and thus you are lacking those links).

Most branded sites do not create enough content or do enough keyword research to fully leverage their brand strength, but occasionally you see some of them get a bit too aggressive.

Benchmarking Information Quality

Apr 3rd

Wikipedia ranks #2 for Aaron right now. They also rank for millions of other queries. They don't rank because their information is of great quality; they rank because everything else is so bad. About.com was once considered best of breed, but scaling content costs and profitability is hard. Google doesn't hate thin affiliate sites because they are bad. They only hate them because the same thing already exists elsewhere. Search engines try to benchmark information quality, and create a structure which encourages the creation and open sharing of higher quality content. When you see poor sites at the top of search results, view it as a sign of opportunity. Realize that whatever ranks today is probably not what search engines want, but it is what is considered best given the lack of structure on the web and how poor most websites are.
