Using Similar Versions of a Keyword in a Page

SEO Question: My main keywords for my site are e-book and ebook. Should I sprinkle both versions of the word on every page?

SEO Answer:

Semantically Related Phrases:

I think search engines will eventually get better at determining which words relate to one another. In many cases they are already good at it.

If you search Google for ~term -term it will show you some of the terms Google understands to be related. There are also semantic relationship tools that do that process for you.

Ways to Target Multiple Similar Versions of a Keyword Phrase:

There are multiple ways to target both versions of a phrase.

  • Sometimes adding one version somewhere in the meta description, and maybe in the page footer area, is a good way to target the less popular of the two. You may also be able to work both versions into your page title, but you really want to consider how search engines will display your page titles and descriptions. If you have a dynamically generated site it is much easier to create formulas for the page title and meta description, which lets you test many of them without wasting a ton of time editing each page one at a time.

  • Another good option for picking up the secondary phrase is to get a few external citations to the important pages with the less common term in the anchor text, or maybe use a sitemap which pushes the secondary version. It is pretty easy to syndicate articles and do other things like that to pick up a few low to mid quality links with decent anchor text.
  • Some sites, like About.com, use a related phrases section on definition pages, which outlines other versions of a phrase. If you sell parts you could call it something like "alternate part numbers". If you use this you need to make it look professional and get some quality citations so that your site seems as though it is above board, and not just trying to spam the engines.
  • Finally, the last way I can think of to tackle the problem is to create different versions of a page that target the different phrases. If you do this it is easiest to frame them as mini blog posts or something similar. You want the pages to look legitimate, so their contents should not be exact duplicates except for a find and replace, because that could look suspicious, as if the pages exist only for search spiders. Duplicate content filters are improving daily as well, and are getting better at detecting find-and-replace duplication.
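
The formula-driven title and meta description idea above can be sketched in a few lines. This is a hypothetical illustration (the helper names and copy are made up), not a recommendation for exact wording:

```python
# Hypothetical sketch: formula-driven titles and descriptions that work
# both keyword variants ("ebook" and "e-book") into each page without
# hand-editing every page one at a time.

def build_title(product, primary="ebook", secondary="e-book"):
    # Lead with the more popular variant; the title carries the most weight.
    return f"{product} {primary.title()} - Download Your {product} {secondary.title()}"

def build_meta_description(product, primary="ebook", secondary="e-book"):
    # Work the less popular variant into the description copy.
    return (f"Instant download of our {product} {primary}. "
            f"This {secondary} covers everything you need to get started.")

title = build_title("SEO")
desc = build_meta_description("SEO")
```

Once the formula is in place, testing a new title pattern across thousands of pages is a one-line change instead of a page-by-page edit.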

Influencing Word Relations:

There are subtle ways to drive search volume, but trying to change the way people use language is a long, hard, and involved process. It may also be darn near impossible if most of the market discussion occurs offline.

Many people would consider linking off to search results in a sales letter a no-no, but if you can get people to search Google for your coupon or brand name then your brand might be recommended more frequently in things like inline search suggestions or see-also searches for broader related search queries.

If you use online or offline techniques to drive search volume, make sure you follow up by creating an online resource, or someone else will target the term and Google may show an inline suggestion for it.

How you use language on other sites can also help determine what phrases engines think are related to one another, especially if the patterns you create are reinforced on many pages of multiple large independent sites. Yahoo!'s see also patterns seem to be driven at least partially by word patterns on pages.

How Popular is Each Version?

When considering whether you want to go after one version or both, the first thing to do is get a rough indication of demand for each term. Use Google's keyword tool and perhaps combine it with mine. Keep in mind that mine is driven off of Overture data, and there are flaws in the data collection and sharing models of any keyword tool, so these are just estimates.

Using those two tools should show you which version is the most popular. People often search the same way they create content, so another good backup indicator is searching Google for ["e-book"] and [ebook] and comparing the result counts.

I also have a tool which uses the Google API to give the approximate number of hits for each version. My Competition Finder tool will show how many results there are for pages that use the terms in the title and/or anchor text. If the terms occur in the page title and anchor text then those pages are likely far more targeted on a topic than pages that merely have the words somewhere on the page. Sometimes my tool is a bit broken, so after this semester is done hopefully my programmer buddy will have a few hours to fix it up.
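
As a rough illustration of what such a competition-checking tool does, here is a sketch that builds the restricted queries; the function name is hypothetical, and it assumes Google-style intitle:/inanchor: operators:

```python
# Hypothetical sketch of the queries a competition-checking tool might
# issue. Counting results where a term appears in the title and/or the
# inbound anchor text is a rough proxy for how many pages truly target
# that term, versus merely mentioning it somewhere on the page.

def competition_queries(term):
    quoted = f'"{term}"'
    return {
        "anywhere": quoted,                  # baseline: term anywhere on the page
        "in_title": f"intitle:{quoted}",     # term in the page title
        "in_anchor": f"inanchor:{quoted}",   # term in inbound link text
    }

queries = competition_queries("e-book")
```

Comparing the result counts for the three queries gives a feel for how much of the competition is actually optimized versus incidental.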

How Competitive is Each Keyword Phrase?

The number of hits might give you some idea of how competitive each version is, but a more accurate way to find out is to look at the top search results for each version. If official-type sites tend to target one version and spam sites target the other, you may be better off going after the less popular and less competitive version from the start, especially if you are working on a limited budget.

Before you commit to any targeting method it may be worth considering:

  • how easy or hard it will be to change what you are targeting as your site influence and income increase.

  • whether or not you will need to worry about updating the aged content, or if your site structure allows you to focus on creating new content without the structure of the old content hurting you too much

Using things like a content management system or server side includes might make a lot of sense if you are going to be working on a large site.

How Does Google Create Multi Link Listings?

SEO Question:

My site already ranks number 1 in Google. How do I get Google to post a mini site map in the search results?

SEO Answer:

I believe that Google primarily displays multi link listings when they feel a query has a strong chance of being navigational in nature. I think they can determine that something is navigational in nature based on linkage data and click streams. If the domain is well aligned with the term that could be another signal to consider.

If you have 10,000 legit links for a term that nobody else has more than a few dozen external citations for, then odds are pretty good that your site is the official brand source for that term. I think overall relevancy, as primarily determined by link reputation, is the driving factor for whether or not they post mini site map links near your domain.

This site ranks for many terms, but for most of them I don't get the multi link map love. For the exceptionally navigational type terms (like seobook or seo book) I get multi links.

The mini site maps are query specific. For Aaron Wall I do not get the mini site map. Most people refer to the site by its domain name instead of my name.

Google may also include subdomains in their mini sitemaps. In some cases they will list those subdomains as part of the mini site map and also list them in the regular search results as additional results.

Michael Nguyen put together a post comparing the mini site maps to Alexa traffic patterns. I think that the mini site maps may roughly resemble traffic patterns, but I think the mini links may also be associated with internal link structure.

For instance, I have a sitewide link to my sales letter page which I use the word testimonials as the anchor text. Google lists a link to the sales letter page using the word testimonials.

When I got sued, the page referencing the lawsuit got tons and tons of links from many sources, which not only built up a ton of linkage data, but also sent tons of traffic to that specific page. That page was never listed in the Google mini site map, which would indicate that if they place heavy emphasis on external traffic or external linkage data, they either smooth the data out over a significant period of time and/or place heavy emphasis on internal linkage.

My old site used to also list the monthly archives on the right side of each page, and the February 2004 category used to be one of the mini site map links in Google.

You should present the pages you want people to visit the most to search bots the most often as well. If you can get a few extra links to some of your most important internal pages and use smart channeling of internal linkage data then you should be able to help control which pages Google picks as being the most appropriate matches for your mini site map.

Sometimes exceptionally popular sites will get mini site map navigational links for broad queries. SEO Chat had them for the term SEO, but after they ticked off some of their lead moderators they stopped being as active and stopped getting referenced as much. The navigational links may ebb and flow like that on broad generic queries. For your official brand term it may make sense to try to get them, but for broad generic untargeted terms in competitive markets the amount of effort necessary to try to get them will likely exceed the opportunity cost for most webmasters.

Monthly Keyword Research Marketing Data

SEO Question:

Are you aware of a tool or a service that can provide reliable search volume history for certain keywords? Like how many searches there were for "keyword phrase" in each month of 2005.

SEO Answer:

Keyword Intelligence and Keyword Discovery (in depth review of many keyword research tools) both offer seasonal data to some extent, but both are limited in their database depth. Search traffic falls off sharply when you leave Google and Yahoo!. Unless you are Google or Yahoo! it is just plain tough to gather exceptionally useful and meaningful data (unless you are in a non English market with other major players).

With Google and Yahoo! their own commercial motivations are to show increased volume on core terms to create artificially competitive markets. Google and Yahoo! don't make tons of money when advertisers buy a bucket full of long tail 10 cent clicks instead of bidding up the core heavily searched industry terms. They don't mind if you find some long tail terms, but they want everyone bidding up the core terms.

Due to the engines recommending the most obvious terms and some advertisers feeling they NEED to advertise on those terms some keywords get so competitive that the margins are negative. This slim or negative margin environment spurs on rank checkers, click fraud, and other market manipulating activities which drive up the core search volume numbers provided by the major engines on the most common terms.

Yahoo!'s keyword research tool (my cooler version tool is available here) will show you the raw number of searches for the prior month. For example, last month they stated that SEO had 51,787 searches on Yahoo!.

Since that is a short term in a hyper-saturated market, you shouldn't be surprised if 90% of that search volume is junky automated traffic or ego searching. I usually rank in the top 10 in Google and Yahoo! for SEO and it typically sends me about 5 visitors a day. While Yahoo! shows "SEO Book" as getting only about 5% of the search volume of the term SEO, I get way more traffic for SEO Book.

Yahoo! has recently started mailing out keyword promos reminding people to bid up the most competitive holiday related terms. Those are probably good words to steer clear of paying for. Many ignorant bidders jumping into the market at the same time creates an ugly overpriced playing field, although if you can sell them their PPC traffic the overpriced bidding may be a beautiful thing. Keyword terms with a large standard deviation may create good arbitrage opportunities.

Google recently started offering 12 months of seasonal search data with their keyword tool. Unlike Overture, they only show graphical estimates and trends instead of exact search volume numbers. But I think it is only important to get a glance at trends, since exact numbers usually do not matter much (due to automated traffic etc.). They also provide quick snapshots of ad market competition and estimated bids, both of which may be useful in deciding which markets are worth entering prior to investing in content creation.

In many markets the breadth of the keyword space matters much more than just the volume of the top few keywords. Some markets which are driven around a well known brand with few well known product names may have 90% of the search volume come in under the brand name or a couple slight variations of it. Most markets have much more traffic at the other end of the keyword spectrum though. Keyword phrase modifiers and alternate phrases may be huge. For example, yesterday over 75% of the search queries referring visitors to this site were unique.
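
If you have your own referrer logs you can estimate this kind of keyword spread yourself: just measure what fraction of distinct referring queries appeared only once. A minimal sketch, with a made-up query sample:

```python
# Estimate how long-tail your search traffic is: the fraction of
# distinct referring queries that occurred only once. The sample
# query list below is made up for illustration.
from collections import Counter

def unique_query_fraction(queries):
    counts = Counter(q.lower().strip() for q in queries)
    singletons = sum(1 for c in counts.values() if c == 1)
    return singletons / len(counts)

sample = ["seo book", "seo book", "seo ebook tips", "learn seo fast",
          "aaron wall seo", "seo book review basics"]
fraction = unique_query_fraction(sample)  # 4 of 5 distinct queries occur once
```

A high fraction suggests the market's traffic lives in the long tail of modifiers and alternate phrases rather than in a few head terms.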

You can also learn about many of the odd search patterns, consistent seasonal trends, and how search relates to society by reading the Hitwise blog. While they do not have anywhere near the amount of data Google or Yahoo! do, the Hitwise blog is always an interesting read, and does a great job of marketing their services.

Mind-share leads to more mind-share. And mind-share leads to expression. If the search volume trends are going up then likely the number of people talking about the topic will go up as well. If you are looking for what is hot right now there are numerous buzz trackers.

When thinking about keyword data remember:

  • the biases of the providers (wanting to sell expensive clicks or having a small keyword database)

  • the numbers provided by any tool are just estimates
  • how spread out the search terms likely are in your industry

If you are new to an industry, have limited capital, limited brand equity built up, and your market is hyper-saturated it may be far more profitable to go for niche long tail search phrases, since those will be easier to compete for and they typically have more implied value / targeting / demand.

Mixing Organic SEO and Pay Per Click Marketing

SEO Question: What is the best way to determine what resources should be put into pay per click marketing versus organic SEO?

SEO Answer: There is a near endless number of factors in determining how you should spend your marketing money online. The good thing about search is the implied intent while people are searching, which can lead to quick feedback on efficient accounts, but there are certain businesses that are hard to sell via search.

This site ranks fairly well via search, but most of my conversions come from other marketing mechanisms because there is so much hype in online marketing and so much distrust toward marketing ebooks. About the only search terms that convert for this site are searches for my name or the official name of the site (part of that is also because the brand name of this site is rather generic in nature). When selling unbranded commodity based products at low cost I think search works much better than expensive products or services that require building trust first. If you build a brand it makes it hard for competitors to compete on your branded terms because your conversion rate will be so much higher on the branded searches.

I think prior to determining how you break down your marketing spend you first have to determine what your short term and long term goals are. Do you want to rank for certain competitive terms in Google? Is your goal to get a certain amount of traffic? A certain amount of profit? Develop a brand or market reach that allows you to profit indirectly?

Some business models work great with pay per click marketing, particularly small uncompetitive niches or high value markets that do not have much advertising depth. Using PPC to market in local niche markets tends to offer under-priced leads. In many markets people bid on the most common terms but leave off higher value related terms. Also, some markets are far under-priced since PPC is newer there. Based on talking to a few friends, I think PPC in Germany on average would offer higher returns than PPC in the UK or US.

Some business models work horribly with pay per click marketing, particularly businesses that have no recurring income streams and/or lower product prices in a market crowded by competitors with higher price points or higher profit margins. If your product is priced out of the more common high value terms, you still may be able to find a few niche terms and bid on your brand, but you may need to rely more on organic search for traffic in this scenario.

Within pay per click marketing I have seen some topics where the Google AdWords ROI is much greater, but, more commonly, Yahoo! has less reach but greater ROI. Because of differences in how their systems work it may mean that leads which are prohibitively expensive in one channel may be cheap in another.

Since right now MSN has few ad distribution partners and is still in beta, they should have some of the cleanest traffic and the least competition within their new beta system. But they may not have much traffic in some markets due to their small search market share compared to Google or Yahoo!.

To do pay per click well you really have to track your conversions so you can calculate your lead value / income per unique visitor. If it is hard to track the exact lead value it is important to find a proxy for value. If your costs seem prohibitively expensive and your business model is similar to competing sites you need to look at what is going wrong in your conversion process. Competitive PPC markets force you to be more efficient, which helps you with conversions on both PPC and organic search.

Many non search ads are also sold through the PPC interfaces at the major search engines. Cash rich companies or exceptionally efficient businesses may consider bidding low on contextual ads to help give them a brand lift. Since many of these ads have a low clickthrough rate you can get hundreds of thousands or millions of impressions for a few hundred dollars. Increased mindshare leads to greater search volume, so the contextual ads play back into your PPC and organic search marketing campaigns.

There is an appeal to the concept of retail without the risk, or turn key operations, but a business without risk is a business without growth or purpose. Even if things seem like they are churning along smoothly with pay per click marketing the players may change the rules of the game, and overnight many of the terms and techniques that were once exceptionally profitable are less so. In much the same way they want to keep noise out of their regular search results to keep them relevant they also want to keep ads relevant. And then competitors can enter the market and shift the game plan overnight as well. This can happen in organic or paid results, so using both can help lower your risk from things going wrong with either, plus you can take what you learn from either discipline and use it to refine the other.

As far as organic SEO goes I could write a 100 page long post that nobody would read (or perhaps I could sell it as an ebook and then people would read it), but generally the four major questions are:

  1. Should I do PPC? This breaks down into the following elements:

    • Do I have enough cash to at least give PPC a try?

    • Does my business model preclude PPC?
    • If so, are there ways I can improve my business model?
    • You can learn a lot from PPC, like market value estimates and what terms are really important. I think just about everyone should try and track PPC, at least for their own brand names and some of the underpriced edges of the market (although I think it is best to stick with the major players - Google, Yahoo! and MSN Search).
  2. Is there enough traffic to justify outlaying an SEO expenditure?
  3. Can your site compete in Yahoo! and MSN?
  4. Can your site compete in Google?

One way to test how much traffic there is for a given keyword phrase or group of keyword phrases is to start up a test Google AdWords account. If you need a primer on PPC marketing my free PPC tips ebook may be of use.

You can also estimate the size of a keyword market using keyword research tools, although many of them have sampling errors due to small search volume or inflate the search volumes of the most competitive keywords due to automated traffic sources.

If you learn SEO yourself and are in a small niche market you may be able to do it for a hundred or a few hundred dollars. But also do not forget the value of your time. SEO Moz also has a free keyword difficulty estimation tool which some people may find useful.

If you outsource SEO it is hard to find someone who is honest and willing to give you personalized attention unless you can afford a decent spend. Some people may not know what their work is worth and be willing to work dirt cheap, but if you are paying less than $1,000 you probably have about a 95% chance of being disappointed. Depending on your market the cost can scale up to a much larger number. Some people spend hundreds of thousands of dollars a year.

With MSN it seems that just publishing content, using targeted anchor text, and getting low quality links (like links from junky general directories and article syndication sites) is all you need to do to rank. Yahoo! is the same way, although they are a bit more advanced than MSN search is.

With Google, to compete in saturated markets, you need to have an old trusted domain name, or be able to come up with ways to get natural citations from quality sites - and even then it helps as the site ages.

There are ways to get some quality links that may seem like natural citations (like perhaps links for donating to related charities) but the easier it is to get a link the quicker that source will get spammed out. The more abstract your donations are the harder it is for competitors to compete with you. Realistically all links occur due to donations. Creating funny, useful, or compelling content is in a sense a donation to whoever reads it or watches it.

If you are in below-the-radar industries and are creative some of your links can stick for an extended period of time, but if you are competing in savvy fields you also want to ensure you get some legitimate citations that would be hard for your competitors to duplicate. Also keep in mind that if you get exceptionally powerful links via creative means some people in other industries may do research to see what other links you have, and may even start competing in your industry.

You need one or more of the following to compete in Google:

  • a brand that you can leverage

  • viral marketing
  • a rabid following that you can contact
  • influential web friends who can help spread your message

In non competitive markets you still may be able to do well in Google right away, but the keys there are to make sure you mix your link anchor text and also create content that is long tail in nature.

As stated above, the budget mix is going to be hard to come up with exact percentages due to various competitive landscapes and different business models working better with different parts of the search space. If a site already has a large brand it is important to make sure your content management system is working well with search and your site is getting well indexed.

For just about any long-term website I would recommend doing at least the following for organic SEO either before or in conjunction with starting a pay per click account:

  • Unique page titles on each page. If you have a huge branded content site and were not doing this you may see your traffic double just by placing unique titles on each page.

  • Ensure your site is getting well indexed (which has multiple parts to it):
  • Ensure you do not have duplicate content issues
    • each page has unique content on it

    • the same page is not available at multiple URLs
    • the PageRank should be the same at site.com and www.site.com (or at least they should not be different numbers unless one of them is 0)
    • when you did the site searches mentioned above you did not see the exact same content over and over again listed many times in one search result page
    • if you have many pages indexed in the major search engines you should probably be fine on this front.
    • also make sure that your Google listings do not have the words "supplemental result" near them. Those are typically caused by things like orphaned pages or duplicate content.
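
For the www vs non-www issue in the checklist above, the usual fix is a sitewide 301 redirect to one canonical hostname. Here is a minimal sketch of the decision logic (the hostname and function name are hypothetical; in practice this usually lives in your web server configuration rather than application code):

```python
# Sketch of canonical-host redirect logic: send every request on a
# non-canonical host to the same path on the canonical host with a
# permanent (301) redirect, so engines consolidate the duplicate URLs.

CANONICAL_HOST = "www.site.com"  # hypothetical canonical hostname

def canonical_redirect(host, path):
    # Already on the canonical host: no redirect needed.
    if host == CANONICAL_HOST:
        return None
    # Otherwise issue a permanent redirect to the canonical URL.
    return (301, f"http://{CANONICAL_HOST}{path}")

redirect = canonical_redirect("site.com", "/tools/")
```

The 301 (rather than a 302) matters: a permanent redirect tells engines to transfer the old URL's link equity to the canonical version.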

Should I Have Yahoo! or Google Set Up My PPC Account?

SEO Question: I am new to pay per click marketing. Should I have Google and Yahoo! set up my pay per click account?

SEO Answer: Indeed, Google and Yahoo! are both willing to set up your PPC account for you, but I would not recommend Google AdWords Jumpstart or Yahoo! Search Marketing Fast Start.

For small non-competitive niche markets it may not hurt to let the engines set up your campaigns, but if your market is not well established, odds are pretty good that the search engines will not do a good job of deep keyword research (since they will have few competing accounts to build your keyword list from).

In competitive markets many people end up losing money. It benefits the engines if most advertisers bid up some of the most common terms (and thus fully value or overvalue the terms that are frequently searched for). The people who make money off pay per click often avoid or underbid the most common terms, and spend more time thinking laterally and bidding on terms that competitors have not yet found. So the goals of the engines may not be well aligned with your own goals (ie: efficient profitable accounts do not mean the same thing when you look at the perspectives of buyers vs sellers).

For competitive terms you want to compete on, you may want to frequently test and retest your landing pages and ad copy to make your account more competitive (so in that regard you need to learn about PPC anyhow).

Google's keyword research tool is constantly improving. There are other free keyword tools on the market as well, like Yahoo!'s or mine.

I was able to write most of what I know about PPC in about 30 pages in this free PDF. I do recommend starting with the largest players (Google AdWords, Yahoo! Search Marketing, and MSN AdCenter), but in a game of margins you really need to do more than accept a default account set up provided by the person selling you traffic.

Also there are a number of questions you can't expect the traffic sellers to honestly answer, like:

  • does PPC even make sense given your current business model

  • what percent of your budget should be spent with a competitor
  • should content syndication be turned on
  • how should you bid on content ads
  • should you bid on the most common terms? what is the best position to rank?

Even if they know exactly what different keywords and ad positions will cost they still do not know your business well enough to know what is best for you. Good accounts should use ad targeting to limit their spend...instead of tying arbitrary budgets to bad bids and bad targeting, but it takes a while of learning and tweaking to set up an appropriate account...more work than most engines would like to do. And could you blame them for not wanting to tweak your account to REMOVE some of their income opportunities?

Keeping in tune with your account and your search data also helps you keep in tune with your customers.

Can I Get Penalized for Who Links at My Site?

SEO Question: I was recently threatened by a competitor about them pointing bad links at my site. Can I be penalized based on who links to my site?

SEO Answer: For most people it is unlikely that a competitor is going to go to such lengths to try to sabotage your business, and it is probably not worth being too paranoid over. The whole reason SEO works well is that few people actively practice it.

Having said all of that, the answer to your question is yes. I have seen it done a couple times and there are many different mechanisms people can use to hurt your rankings. Google is constantly testing new algorithms. Sometimes sites will not rank for their own names due to too much similar anchor text. Then at other times sometimes Google creates new algorithmic holes while trying to patch old ones.

As far as building shady links goes, some search algorithms may ignore them and some may give them a bit of a negative weighting on your overall relevancy. Generally though the more positive signs of quality your site has the more low quality signs you can get away with. In that regard probably the best way to protect your site from competitive sabotage is to ensure you don't have domain canonicalization issues (ie: engines realize www.site.com and site.com are the same) and work hard to build legitimate signs of trust. Dan Thies offers some good link building advice in this video, but there are a limited number of quality votes any site can get. The key to beating competitors in the link game is to create more legitimate reasons for people to want to link to your site.

Different engines have different mechanisms for analyzing your link profile. For example, Yahoo! may place too much emphasis on sitewide links while the same links may not help you as much in Google. If you push the low quality links hard enough it may boost your site to #1 in MSN and/or Yahoo!, but you may end up with a link profile that prevents you from ranking well in Google (audio here). Also keep in mind that if competitors try to use links to hurt your site in Google they may also be boosting your Yahoo! or MSN rankings.

In summary I think the two best ways to avoid competitive threats are to stay away from hyper competitive industries OR work hard to create enough legitimate signs of quality that your site is hard to harm.

Can You Build Links Too Quickly?

SEO Question: I believe link popularity is the #1 criteria to rank in most search algorithms. Is it possible to gain links too quickly?

SEO Answer: Yes, you can gain links too quickly, though I think it is rare. Here is an example of Google temporarily banning one of their own sites for building too many links too quickly. You have to appreciate the strength of Google's brand, and that is part of the reason their then-new AdSense blog could have gained so many legitimate links so quickly - it is rare...an anomaly.

When people get in trouble for building links too quickly typically they are using automated link building methods, link exchange networks, or lack focus on link quality - all of which give a site an unnatural link profile with an emphasis on low quality linkage data (see TrustRank and the Company You Keep as an example).

If you are getting natural citations in a viral marketing campaign I would not want it to stop for anything. Even if a site did temporarily get banned by a bad search algorithm, as long as the fault is not your own the site will come back strongly. Plus natural viral link campaigns have the following bonuses:

  • they are hard for competitors to duplicate

  • competitors who request links the wrong way from certain opinionated high authority authors can end up hurting their own brand equity
  • they drive usage data - ie: they usually spread through the active portions of the web
  • they give you a safety net...if your site is ever removed from the search results viral links will still provide direct traffic (and revenue) as well as help fill up search results for your brand with positive comments that further help improve your trustworthiness and conversion rate

If you are building links by submitting to directories and submitting articles to syndication sites, I don't think it hurts to build 20 to 50 links at a time, so long as you keep actively building links over time or already have an old established site.

Of course when you build links it makes sense to mix up your anchor text and descriptions so that you are relevant for a basket of keywords and do not make your link profile too unnatural looking.

What are Google Supplemental Results?

SEO Question: Much of my website is in Google's supplemental index. What is the supplemental index? How does it work?

SEO Answer: What a timely question...where to start...well, if the supplemental problem has only hit your site recently (as compared to the date of this post) it may be a Google-specific problem that has recently caused them to dump thousands of sites.

Believe it or not, other than the home page most of this site is currently in supplemental results as of typing this, and with the current Google hole you can throw sites into supplemental hell within 72 hours.

Matt Cutts, a well known Google engineer, recently asked for feedback on the widespread supplemental indexing issue in this thread. As noted by Barry, in comment 195 Matt said:

Based on the specifics everyone has sent (thank you, by the way), I'm pretty sure what the issue is. I'll check with the crawl/indexing team to be sure though. Folks don't need to send any more emails unless they really want to. It may take a week or so to sort this out and be sure, but I do expect these pages to come back to the main index.

In this thread SEM4U points out that 72.14.207.104 was showing fewer supplemental sites than he saw on others like 64.233.179.104.

Some people are theorizing that lots of listed pages were generally dropped and only the longstanding supplemental pages remain, but that theory is garbage on my site...I still see a strong PageRank 6 page that was ranking in the SERPs for competitive phrases until it recently went supplemental.

I did a site redesign just after this supplemental deal occurred, but that was coincidental. One good thing about the Movable Type update is that the last version of Movable Type I was using created these awful nuclear-waste redirect pages...version 3.2 doesn't do that.

As far as other reasons this site could have possibly been hit supplemental:

  • too much similar text on each page - but I think it is common to have shared sales elements on many pages of a site, so I doubt that is it
  • redirect links - affiliate links via Clickbank and the direct affiliate program might have flagged some sort of trigger if Google was trying to work on 301 & 302 issues... but whatever they did I don't think they did it better ;)
  • Google is a bit hosed right now

What are supplemental results?

Supplemental results usually only show up in the search index after the normal results. They are a way for Google to extend their search database while also preventing questionable pages from getting massive exposure.

How does a page go supplemental?

From my experience, pages have typically gone supplemental when they became isolated doorway-type pages (lost their inbound link popularity) or when they were deemed duplicate content. For example, if Google indexes both the www. version and the non-www. version of your site, then most pages of one of those versions will likely end up in the supplemental results.
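One common fix for the www vs non-www problem is a sitewide 301 redirect to a single canonical hostname. Here is a minimal .htaccess sketch for Apache with mod_rewrite enabled, assuming www.example.com is the preferred host (example.com is a placeholder - substitute your own domain):

```apache
# Hypothetical example: 301-redirect every request for the bare domain
# to the canonical www host, so search engines index only one version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A permanent (301) redirect, rather than a 302, is what signals to the engines that the non-www version should be consolidated into the www version.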

If you put a ton of DMOZ content and Wikipedia content on your site that sort of stuff may go supplemental as well. If too much of your site is considered to be useless or duplicate junk then Google may start trusting other portions of your site less.

Negative side effects of supplemental:
Since supplemental results are not trusted much and rarely rank, they are not crawled often either. Because they are generally not trusted and rarely crawled, odds are pretty good that links from supplemental pages do not pull much - if any - weight in Google.

How to get out of Google Supplemental results?
If you were recently thrown into them, the problem may be Google's, and you may just want to wait it out. Also check that you are not making errors such as serving both www and non-www versions, content management errors that deliver the same content at multiple URLs (such as rotating product URLs), or too much duplicate content for other reasons. You may also want to check that nobody outside your domain shows up when you search Google for site:mysite.com, and you can look for duplicate content with Copyscape.
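A cheap first pass at finding internal duplicate content is to compare page titles across your own pages, since pages that share a title usually share body copy too. A minimal shell sketch, assuming pages have been saved as HTML files under a local crawl/ directory (the file names and titles below are made up for illustration):

```shell
# Create a tiny fake crawl dump so the example is self-contained.
mkdir -p crawl
printf '<html><head><title>Blue Widgets</title></head></html>' > crawl/page1.html
printf '<html><head><title>Blue Widgets</title></head></html>' > crawl/page2.html
printf '<html><head><title>About Us</title></head></html>' > crawl/page3.html

# Count how many pages share each <title>; any count above 1 flags a
# candidate duplicate-content problem worth investigating.
grep -h -o '<title>[^<]*</title>' crawl/*.html | sort | uniq -c | sort -rn
```

This only catches duplication within files you already have locally; for copies hosted on other domains you would still lean on site: searches and a tool like Copyscape.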

If you have pages that have been orphaned, or if your site's authority has gone down, Google may not be crawling as deep through your site. If you have a section that needs more link popularity to get indexed, don't be afraid to point link popularity at that section instead of trying to point more at the home page. If you add thousands and thousands of pages you may need more link popularity to get them all indexed.

After you solve the problem it still may take a while for many of the supplementals to go away. As long as the number of supplementals is not growing, your content is unique, and Google is ranking your site well across a broad set of keywords then supplementals are probably nothing big to worry about.

Why Do Search Engines Favor Informational Sites Over Commercial Sites?

SEO Question: I have noticed many more content heavy websites in Google's search results over the last year or two. Why does it seem it is getting harder for commercial sites to rank?

SEO Answer: Within the commercial realm there are more and more competing sites. Building content, at one time primarily a hobby project, has become far more lucrative in recent years. Not only have content management systems like Movable Type and WordPress become cheaply or freely available, but AdSense and affiliate marketing have vastly increased the number of real and fake content sites on the market over the last couple of years.

Duplicate content filters have improved, and many shell product catalogs have been filtered out of Google's search results. Some older sites still seem to be getting away with rather shoddy stuff in Google, but as engines gather more user data and more people create quality content, you can expect that loophole to close.

Search algorithms prefer informational websites over commercial ones for many reasons:

  • they want commercial sites to buy their ads
  • search ads already provide commercial results, so engines prefer some informational results to help balance out the page
  • in competitive marketplaces there tend to be many more commercial sites than informational sites
  • if multiple merchants have similar product databases, it does not drastically improve the user experience to show hollow shell pages over and over again from a wide variety of merchants
  • many quality informational sites link to related resources that lead searchers to more abstract answers that search engines are not yet advanced enough to provide
  • many informational sites are monetized using contextual ads provided by search engines, which give the engines a second chance at revenue after the search

Also keep in mind that most merchant sites focus on the same small core group of keywords. Anything at a big business can take weeks or months to do...or longer if the bureaucracy is thick or the content management system is highly complex.

For a content based website it takes no time at all to do keyword research using some of the keyword research tools on the market, and then quickly create pages around common customer questions, concerns and buying points. If few sites cover those topics with specific pages then it is low hanging fruit waiting to be claimed. I think it was Peter D who said the key to making money on search was to dig where other people were not digging.

Yahoo! currently offers a paid inclusion program (sidenote: which I generally recommend avoiding) which ensures sites are indexed in Yahoo!. Yahoo! charges those sites a flat rate per click for traffic Yahoo! delivers. That per click fee means that for many search queries it may make sense for them to allow many commercial sites to rank in the search results.

As the largest content site, Yahoo! also offers quick links in its search results to many of its internal content channels, which lessens its need for content from other sources. Make no mistake though: Yahoo! has the ability to determine how commercial a website is. See their Yahoo! Mindset tool for an example of how results can be weighted toward either commercial or informational resources.

If you look at the Mindset dial and use it to compare the default search results from Yahoo! and Google think of Google as being turned much further toward research. If Yahoo! drops their paid inclusion program you can bet that they will dial their results more toward the research angle, just like Google is.

Some commercial websites, like Amazon, offer rich interactive features that make them easy to reference (both from a webmaster perspective and a search engine perspective), but generally most commercial sites are not highly interactive and most webmasters would typically be far more inclined to link to quality content sites than overtly commercial sites.

If you are in a competitive field it may make sense to look at Librarians' Internet Index or read this newsletter to see what sort of content sites librarians prefer and trust.

The average person on the web may not be as information savvy as librarians are, so it may also help to look at ideas that go viral by looking at sites like Digg, Memeorandum, or Del.icio.us.

You can also get a lot of content ideas by looking at some of the top-ranked content sites in your vertical and in related verticals you are interested in and knowledgeable about.

Even commercial sites can still be highly linkable if they are feature rich or offer quality answers to relevant topical questions that competing sites typically ignore.

How do I do Search Engine Optimization for a Small Site?

SEO question: How do I do SEO for a small commercial website? Adding more pages would make it look unprofessional, so that is not something I really want to do.

SEO Answer: Sometimes small sites can be easier to do SEO for than big sites.

Faults of big commercial sites:
Some big sites that are product catalogs may require significant link popularity just to get indexed. Also, if you are dealing with many thousands of pages, it can be hard to make them unique enough to stay indexed as search algorithms continue to advance. Search engines are getting better at comparing pages and sites. If the only difference across most pages of your many-thousand-page site is a few part numbers, then many pages may be considered duplicate content.

Benefits of a small site:
If a site is small that makes it easy to concentrate your internal link popularity on the most important issues, ideas, and keywords. Small hyper targeted sites also work well at being able to dominate those niche markets. You can create a site name based on the vertical and use the domain name to your advantage.

If you are trying to tackle insurance then a small site is not going to get you anywhere unless you are targeting niche types of insurance.

I tend to be a bit verbose (which is perhaps why I wrote an ebook ;) but I also do not buy that adding pages to a commercial site makes a site less professional. Web pages are just a bunch of bits, but those bits are your salesmen.

Which site would YOU trust more:

  1. Get the lead or sale or the prospective client can screw off. If they want anything they must pay first.

  2. Offers substantial information about the products they sell. Also builds credibility with FAQ section, answering common questions along the buying cycle with content focused on the issues that people tend to think are important before making a purchase.

If you hype it enough, have a high price point and get affiliates pushing it hard enough #1 may win, but in most markets most of the time site #2 will win.

If your site is exceptionally small, then adding a few pages, such as an about-us page and a frequently asked questions page, should allow you to build credibility and target new traffic streams.

If competing sites have a huge brand that you can't afford to compete with one of the best ways to chip away at them is to create useful content, tools, and ideas that solve market problems that have not yet been solved.

If your content is great it may garner some natural citations, but you need to build at least a few links before search engines will trust your site enough for others to find it.

Some webmasters are also afraid to link out to relevant resources. I think that most good websites link out to at least a few decent resources. Don't be afraid to link at relevant .gov or .edu pages, industry trade organizations, local chamber of commerce sites and other sites that make sense to reference.
