SEO Question: Many people say write naturally for SEO, but what does that mean?
SEO Answer: About a month and a half ago the New York Times published an article by Steve Lohr titled This Boring Headline Is Written for Google. The article flirts with the idea of writing newspaper articles with Google in mind. That story got a decent amount of buzz because newspapers usually do not put much consideration into search engine marketing.
Old School Search Engine Optimization:
A few years ago you could do SEO like this:
start your page title with your keyword or keyword phrase
include that keyword phrase on most every heading or subheading on that page
link to the page sitewide with that same keyword in the anchor text
build a ton of links from external locations, with most (or all) of them containing that keyword phrase
Does Old School Still Work?
For MSN (and, to some extent, Yahoo!) you could still use a somewhat similar keyword stuffing philosophy and see outstanding results, but the problem with the "stick my core phrase everywhere" SEO method™ is that Google does not want to show the most optimized content. They want to show the most relevant content.
As noted in the New York Times article above, most news articles (and likely most quality web documents) are not heavily focused on optimizing for a single keyword. Instead they use the natural language associated with that topic.
If too many of your signals are focused on just one word or phrase and you lack the supporting vocabulary in your document you may get filtered out of the search results for your primary keyword targets. It has happened to me several times, and it is a pretty common occurrence, especially for websites that have few authoritative trustworthy votes and try to make up for it by aggressive use of a phrase in the page content.
Here is a snapshot of a spam page I saw ranking for a long tail keyword:
The problem is, that page was ranking for Michigan Smoker's Life Insurance when it targeted a way different phrase. The page will never rank for the main phrase it was targeting, so unless they redirect searchers to a more relevant page it is going to be hard for them to convert any visitors that land on a page like that.
So if old hat optimization is considered overoptimization and/or is potentially detrimental to your rankings what do you do?
say screw Google they will eventually rank me if I get this keyword on the page 1 more time ;)
say screw Google I am pulling in plenty of money from Yahoo! and MSN
not worry about SEO at all
evolve SEO to a more productive state
Onward and upward I say. How to mix it up to become Google friendly:
Start the page title with a modifier or a couple of non-keywords instead of placing your primary keyword phrase as the first word of the page title. For example, instead of search engine marketing company start your title with professional search engine marketing...
Stemming is your friend. Use plural, singular, and -ing versions of your keywords. I have seen pages that used a bunch of the plural version filtered out of Google for the plural version but still ranking for the singular version. If you mix it up you can catch both.
Mix up the anchor text, subheaders and page content. Use semantically related phrases, and, in some cases, write subheaders that are useful for humans even if some of them do not have any keyword phrases in them.
Make sure each page is somewhat unique and focused in nature.
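The mix-it-up advice above can be sketched in code. Here is a toy variant generator in the spirit of the stemming tip; the inflection rules are naive assumptions (a real stemming library or a human editor does far better), and the example phrase is just an illustration.

```python
# Sketch: generate simple plural / singular / -ing variants of the last
# word in a keyword phrase, so you can mix them across titles, headers,
# and anchor text. The inflection rules here are deliberately naive.

def keyword_variants(phrase):
    """Return a set of simple variants of the given keyword phrase."""
    words = phrase.split()
    head, last = words[:-1], words[-1]
    variants = {last}
    if last.endswith("s"):
        variants.add(last[:-1])          # plural -> singular
    else:
        variants.add(last + "s")         # singular -> plural
    if last.endswith("e"):
        variants.add(last[:-1] + "ing")  # quote -> quoting
    else:
        variants.add(last + "ing")
    return {" ".join(head + [v]) for v in variants}

print(keyword_variants("life insurance quote"))
```

Rotating a few of these variants through your page copy and anchor text is usually enough to catch both the singular and plural versions of a query.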
Semantically related phrases:
If you think of words as having relationships to one another and you visualize optimizing for a keyword as optimizing for a basket of relevant related keywords it will help you draw in relevant related search traffic while also making your page more relevant for its core keywords.
For example, the acronym SEO would have the following as some of its semantically related phrases:
Now you wouldn't necessarily need to get all of those in your page copy, but if a person was writing naturally about the topic of SEO it would be common for many of those kinds of words to appear on the page.
In addition to using words that are semantically related, it makes sense to use words that are common modifiers. For example, common buying / shopping searches might include words like:
I created a keyword modifiers spreadsheet with free keyword modifier ideas for a few different search, transaction, and classification types. I might try to expand it a bit if people find it useful.
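The modifier idea is easy to automate. Here is a sketch that crosses core terms with a modifier list to build a basket of related phrases; the example terms and modifiers are made up, in the spirit of the keyword modifiers spreadsheet mentioned above.

```python
# Sketch: expand core keywords with common buying modifiers to build
# a basket of related phrases. The terms below are illustrative only.
from itertools import product

core_terms = ["life insurance", "term life insurance"]
buying_modifiers = ["buy", "cheap", "compare", "best"]

basket = [f"{m} {t}" for m, t in product(buying_modifiers, core_terms)]
print(len(basket))   # 8 phrases
print(basket[0])     # buy life insurance
```

You would not stuff all of these onto one page; the basket is a menu of phrases to sprinkle naturally across your copy and to seed new content ideas.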
If it All Sounds Like a Bit Much...
If it seems too complex or complicated, then don't focus so heavily on the modifiers, the semantically related phrases, or even your core keyword.
First write your article about your topic without even thinking about the search engines. Then go back and tweak it to include relevant modifiers and semantically related phrases. Make sure that you use multiple versions of your primary keyword phrase if it has multiple versions.
To make the page easy to read, and to make it easy to add related phrases and alternate versions of your keywords, break up the page using many subheaders. Also add leading questions that guide people from one section to the next. For example, I could say did you find this search engine marketing article helpful in your website promotion quest? Do you think it will help you become a better search engine optimizer and more holistic internet marketer?
I am a bit tired and I think this was a bit verbose, but hopefully it helps somebody. If not, arggggg... hehehe.
SEO Question: So here is my question. I have followed a good deal of your advice and am thankful for it as I see myself sitting pretty for some of my keyword phrases. However, my friend in Idaho sees different results and my friend in Saudi sees different results - and I wanna know - does Google index differently according to geographical locations?
Assuming that you mean where you rank in the SERPs (the search engine results pages) and not PageRank (which is a rough estimate of global link popularity), there are a number of factors which may show you different search results than what your friends see when they search Google. The 3 major factors are:
Google's Data Centers:
Google has a boatload of data centers around the world. In fact some of them may even be running in shipping containers. They usually route search queries to the data center that is nearest you. In a recent interview Google's Matt Cutts said:
In fact, even at different data centers we have different binaries, different algorithms, different types of data always being tested.
If they roll new filters or make large changes to their algorithms you might notice different results as you hit one data center or another.
Geographic Biasing:
A friend of mine owns a site in a hyper competitive market that used to be owned by a person from Australia. While the site does not yet rank as well as my friend would like on Google.com, it ranks for amazingly competitive single word queries in Google Australia (Google.com.au) due to having many links from websites that are located in Australia.
Within Google's local search they allow people to search for all websites or just local ones. You can appear in local databases by hosting your site there, using a domain with a local extension, or having many links from sites that are deemed local in nature.
If you are in Canada even if you search on Google.com those search results will be biased toward Canadian websites. If you are located in another country but want to see what Google's search results look like in the US you can search Google from a proxy.
Personalized Search:
If you are logged into a Google Account they will bias your search results based on websites you have visited, especially those you have clicked through to from search results.
If you visit a site or page frequently they will improve the positioning of that page in your personalized search results. If you only visit a page occasionally to check rankings, sometimes clicking onto your result and then clicking back nearly immediately, Google will demote those pages in the SERPs.
You can turn off Google personalized results by clicking a link on the results that says something like Turn Off Personalized Results
Free Google Keyword Rank Checking Tools to Use:
There are a number of free tools that make it easy to track where you rank.
I like tracking some core keywords using Rank Checker. It is free, stores historical results, and refreshes the data as often as you would like. And since it sits on your desktop you don't have to worry about others spying on and aggregating your data to compete against you.
If you just want to check where you rank in Google I have a few rank checkers in the tool section on my website as well.
Keep in mind that many of the rank checkers will set the number of results per page to be a different number than the default 10, and that will cause a slight ranking skew. Also if you wanted to check your rankings on different data centers McDar has a free tool which makes it easy to check your rankings across a number of data centers all at once.
You could also manually go to any of the data centers directly and do a search query from their IP address, like 126.96.36.199 or 188.8.131.52
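To give a feel for what these rank checkers do under the hood, here is a sketch that finds your domain's position in an ordered list of result URLs. How you obtain the URLs (an API, a data center query, a tool export) is left out, and the example URLs are hypothetical.

```python
# Sketch: compute the 1-based rank of a domain within an ordered list
# of search result URLs, matching the domain and its subdomains.
from urllib.parse import urlparse

def rank_of(domain, result_urls):
    """Return the position of the first result hosted on `domain`,
    or None if the domain does not appear at all."""
    for position, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == domain or host.endswith("." + domain):
            return position
    return None

results = [
    "http://example.org/some-page",
    "http://www.seobook.com/archives/001120.shtml",
    "http://example.net/",
]
print(rank_of("seobook.com", results))  # 2
```

Note that if your checker requests 50 or 100 results per page instead of the default 10, the engine may return a slightly different ordering, which is the ranking skew mentioned above.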
Paid Rank Checkers
If you need to track rankings in search engines outside of the core global players (Google, Microsoft Bing, & Yahoo! Search), such as Baidu or Yandex, then you might want to give Advanced Web Ranking a look. In addition to storing your rank data they store the entire top 50, allow you to create graphs with it, and provide many different useful reports that give you various looks at ranking trends amongst competing sites.
Server Logs & Tracking:
It is easy to get hung up on where you rank for a specific term, but it is far more important to try to rank for a diverse set of terms. See what terms are driving searchers to your site and track which of those terms are converting. From that data you can create more content around what converts or spruce up the top performing pages by adding a few modifiers or making the content more compelling to human visitors, and thus further increasing conversions.
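Pulling those terms out of your server logs can be as simple as parsing the referrer URLs. Here is a minimal sketch assuming Google-style referrers with a `q` parameter; a real log analyzer would handle your server's actual log format and other engines' parameter names.

```python
# Sketch: tally the search phrases hidden in search engine referrer
# URLs. Assumes a Google-style "q" query parameter; the sample
# referrer strings below are made up.
from urllib.parse import urlparse, parse_qs
from collections import Counter

def search_terms(referrers):
    """Count search phrases found in a list of referrer URLs."""
    terms = Counter()
    for ref in referrers:
        parsed = urlparse(ref)
        if "google" in parsed.netloc:
            q = parse_qs(parsed.query).get("q")
            if q:
                terms[q[0].lower()] += 1
    return terms

refs = [
    "http://www.google.com/search?q=seo+book",
    "http://www.google.com/search?q=link+building+tips",
    "http://www.google.com/search?q=seo+book",
]
print(search_terms(refs).most_common(1))  # [('seo book', 2)]
```

Join these counts with your conversion tracking and you can see at a glance which phrases deserve more content and which top pages are worth sprucing up.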
Ways to Target Multiple Similar Versions of a Keyword Phrase:
There are multiple ways to target both versions of a phrase.
Sometimes adding one version somewhere in the meta description and maybe in the page footer area is a good way to target the less popular of the two. You may also be able to work both versions into your page title, but you really want to consider how search engines will display your page titles and descriptions. If you have a dynamically generated site it is much easier to create formulas for the page title and meta descriptions which help you to test many of them without needing to waste a ton of time editing each page one at a time.
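For a dynamically generated site, the title and description formulas might look like the following sketch. The field names and template wording are hypothetical; the point is that editing one template retargets every page at once instead of editing pages one at a time.

```python
# Sketch: formula-driven page titles and meta descriptions that work
# two versions of a phrase into each page. Field names are made up.

def page_title(item):
    # lead with a modifier rather than the raw keyword phrase
    return f"{item['modifier']} {item['keyword']} - {item['site']}"

def meta_description(item):
    # slip the alternate version of the phrase into the description
    return (f"Compare {item['keyword']} and {item['alternate']} "
            f"options at {item['site']}.")

item = {
    "modifier": "Professional",
    "keyword": "search engine marketing",
    "alternate": "search marketing",
    "site": "Example.com",
}
print(page_title(item))  # Professional search engine marketing - Example.com
```

Because the formula is centralized, you can test many title and description variations quickly, watching which ones draw more clicks from the search results.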
Another good option for picking up the secondary phrase is to get a few external citations to the important pages with the less common term in the anchor text, or maybe use a sitemap which pushes the secondary version. It is pretty easy to syndicate articles and do other things like that to pick up a few low to mid quality links with decent anchor text.
Some sites, like About.com, use a related phrases section on definition pages, which outlines other versions of a phrase. If you sell parts you could call it something like "alternate part numbers". If you use this you need to make it look professional and get some quality citations so that your site seems as though it is above board, and not just trying to spam the engines.
Finally, the last way I can think of to tackle the problem is to create different versions of the page that target the different phrases, though if you do this it is easier to write mini blog posts or something similar. You want to make it look legitimate, so the page contents should not be exact duplicates except for a find and replace, because that could look suspicious, as if the pages exist only for search spiders. Duplicate content filters are improving daily as well, and are getting better at detecting find and replace duplication.
Influencing Word Relations:
There are subtle ways to drive search volume, but it is a long, hard, and involved process to try to change the way people use language. It may also be darn near impossible if most of the market discussion occurs offline.
Many people would consider linking off to search results in a sales letter a no-no, but if you can get people to search Google for your coupon or brand name then your brand might be recommended more frequently in things like inline search suggestions or see also searches for broader related search queries.
How you use language on other sites can also help determine what phrases engines think are related to one another, especially if the patterns you create are reinforced on many pages of multiple large independent sites. Yahoo!'s see also patterns seem to be driven at least partially by word patterns on pages.
How Popular is Each Version?
When considering if you want to go after one version or both the first thing you have to do is get a rough indication of demand for each term. Use Google's keyword tool and perhaps combine that with mine. Keep in mind that mine is driven off of Overture, and there are flaws to the data collection and sharing models at any keyword tool, so these are just estimates.
Using those two tools should show you which version is the most popular. People often search the same way they create content, so another good backup indicator would be searching Google for ["e-book"] and [ebook] and comparing the result counts.
I also have a tool which uses the Google API to give the approximate number of hits for each version. My Competition Finder tool will show how many results there are for pages that use the terms in the title and / or anchor text. If terms occur in the page title and anchor text then those pages are likely going to be far more targeted on a topic than pages that may just have the words somewhere on the page. Sometimes my tool is a bit broken, so after this semester is done hopefully my programmer buddy will have a few hours to fix it up.
How Competitive is Each Keyword Phrase?
The number of hits might give you some idea of how competitive each version is, but a more accurate way to find out is just to look at the top search results for each version. If official type sites tend to target one version and spam sites target the other, you may be better off going after the less popular and less competitive version at the start, especially if you are working on a limited budget.
Before you commit to any targeting method it may be worth considering:
how easy or hard it will be to change what you are targeting as your site influence and income increase.
whether or not you will need to worry about updating the aged content, or if your site structure allows you to focus on creating new content without the structure of the old content hurting you too much
Using things like a content management system or server side includes might make a lot of sense if you are going to be working on a large site.
SEO Question: My site already ranks number 1 in Google. How do I get Google to post a mini site map in the search results?
I believe that Google primarily displays multi link listings when they feel a query has a strong chance of being navigational in nature. I think they can determine that something is navigational in nature based on linkage data and click streams. If the domain is well aligned with the term that could be another signal to consider.
If you have 10,000 legit links for a term that nobody else has more than a few dozen external citations for, then odds are pretty good that your site is the official brand source for that term. I think overall relevancy, as primarily determined by link reputation, is the driving factor for whether or not they post mini site map links near your domain.
This site ranks for many terms, but for most of them I don't get the multi link map love. For the exceptionally navigational type terms (like seobook or seo book) I get multi links.
The mini site maps are query specific. For Aaron Wall I do not get the mini site map. Most people usually refer to the site by its domain name instead of my name.
Google may also include subdomains in their mini sitemaps. In some cases they will list those subdomains as part of the mini site map and also list them in the regular search results as additional results.
For instance, I have a sitewide link to my sales letter page which I use the word testimonials as the anchor text. Google lists a link to the sales letter page using the word testimonials.
When I got sued, the page referencing the lawsuit got tons and tons of links from many sources, which not only built up a ton of linkage data, but also sent tons of traffic to that specific page. That page was never listed on the Google mini site map, which would indicate that if they place heavy emphasis on external traffic or external linkage data, they either smooth the data out over a significant period of time and / or place a heavy emphasis on internal linkage.
My old site used to also list the monthly archives on the right side of each page, and the February 2004 category used to be one of the mini site map links in Google.
You should present the pages you want people to visit the most to search bots the most often as well. If you can get a few extra links to some of your most important internal pages and use smart channeling of internal linkage data then you should be able to help control which pages Google picks as being the most appropriate matches for your mini site map.
Sometimes exceptionally popular sites will get mini site map navigational links for broad queries. SEO Chat had them for the term SEO, but after they ticked off some of their lead moderators they stopped being as active and stopped getting referenced as much. The navigational links may ebb and flow like that on broad generic queries. For your official brand term it may make sense to try to get them, but for broad generic untargeted terms in competitive markets the amount of effort necessary to try to get them will likely exceed the opportunity cost for most webmasters.
SEO Question: Are you aware of a tool or a service that can provide reliable search volume history for certain keywords? Like how many searches there were for "keyword phrase" in each month of 2005.
Keyword Intelligence and Keyword Discovery (in depth review of many keyword research tools) both offer seasonal data to some extent, but both are limited in their database depth. Search traffic falls off sharply when you leave Google and Yahoo!. Unless you are Google or Yahoo! it is just plain tough to gather exceptionally useful and meaningful data (unless you are in a non English market with other major players).
With Google and Yahoo! their own commercial motivations are to show increased volume on core terms to create artificially competitive markets. Google and Yahoo! don't make tons of money when advertisers buy a bucket full of long tail 10 cent clicks instead of bidding up the core heavily searched industry terms. They don't mind if you find some long tail terms, but they want everyone bidding up the core terms.
Due to the engines recommending the most obvious terms and some advertisers feeling they NEED to advertise on those terms some keywords get so competitive that the margins are negative. This slim or negative margin environment spurs on rank checkers, click fraud, and other market manipulating activities which drive up the core search volume numbers provided by the major engines on the most common terms.
Since SEO is a short term in a hyper saturated market, you shouldn't be surprised if 90% of that search volume is junky automated traffic or ego searching. I usually rank in the top 10 in Google and Yahoo! for SEO and it typically sends me about 5 visitors a day. And while Yahoo! only shows "SEO Book" as getting about 5% of the search volume of the term SEO, I get way more traffic for SEO Book.
Yahoo! has recently started mailing out keyword promos reminding people to bid up the most competitive holiday related terms. Those are probably good words to steer clear of paying for. Many ignorant bidders jumping into the market at the same time creates an ugly overpriced playing field, although if you can sell them their PPC traffic the overpriced bidding may be a beautiful thing. Keyword terms with a large standard deviation may create good arbitrage opportunities.
Google recently started offering 12 months of seasonal search data with their keyword tool. Unlike Overture, they only show graphical estimates and trends instead of exact search volume numbers. But I think it is only important to get a glance at trends, since exact numbers usually do not matter much (due to automated traffic etc.). They also provide quick snapshots of ad market competition and estimated bids, both of which may be useful in deciding which markets are valuable to enter prior to investing in content creation.
In many markets the breadth of the keyword space matters much more than just the volume of the top few keywords. Some markets which are driven around a well known brand with few well known product names may have 90% of the search volume come in under the brand name or a couple slight variations of it. Most markets have much more traffic at the other end of the keyword spectrum though. Keyword phrase modifiers and alternate phrases may be huge. For example, yesterday over 75% of the search queries referring visitors to this site were unique.
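That spread is easy to measure from your own referral data. Here is a sketch that computes what share of the distinct referring phrases showed up only once in a day's logs; the sample queries are made up.

```python
# Sketch: measure how spread out your search traffic is by computing
# the share of distinct referring phrases that appeared only once.
from collections import Counter

def unique_query_share(queries):
    """Fraction of distinct query phrases seen exactly once."""
    counts = Counter(q.lower() for q in queries)
    singles = sum(1 for c in counts.values() if c == 1)
    return singles / len(counts)

queries = ["seo book", "seo", "link building tips",
           "seo book", "keyword research spreadsheet"]
print(unique_query_share(queries))  # 0.75 -> 3 of the 4 distinct phrases
```

A high share of one-off phrases tells you the long tail, not the top few keywords, is where your market's traffic lives.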
You can also learn about many of the odd search patterns, consistent seasonal trends, and how search relates to society by reading the Hitwise blog. While they do not have anywhere near the amount of data Google or Yahoo! do, the Hitwise blog is always an interesting read, and does a great job of marketing their services.
When looking at keyword data, remember:
the biases of the providers (wanting to sell expensive clicks or having a small keyword database)
the numbers provided by any tools are just estimates
consider how spread out the search terms likely are in your industry.
If you are new to an industry, have limited capital, limited brand equity built up, and your market is hyper-saturated it may be far more profitable to go for niche long tail search phrases, since those will be easier to compete for and they typically have more implied value / targeting / demand.
SEO Question: What is the best way to determine what resources should be put into pay per click marketing versus organic SEO?
SEO Answer: There are a nearly endless number of factors in determining how you should spend your marketing money online. The good thing about search is the implied intent while people are searching - which can lead to quick feedback on efficient accounts - but there are certain businesses that are hard to sell via search.
This site ranks fairly well via search, but most of my conversions come from other marketing mechanisms because there is so much hype in online marketing and so much distrust toward marketing ebooks. About the only search terms that convert for this site are searches for my name or the official name of the site (part of that is also because the brand name of this site is rather generic in nature). I think search works much better for selling low cost unbranded commodity based products than for expensive products or services that require building trust first. If you build a brand it makes it hard for competitors to compete on your branded terms because your conversion rate will be so much higher on the branded searches.
I think prior to determining how you break down your marketing spend you first have to determine what your short term and long term goals are. Do you want to rank for certain competitive terms in Google? Is your goal to get a certain amount of traffic? A certain amount of profit? Develop a brand or market reach that allows you to profit indirectly?
Some business models work great with pay per click marketing, particularly small uncompetitive niches or high value markets that do not have much advertising depth. Using PPC to market in local niche markets tends to offer under-priced leads. In many markets people bid on the most common terms but leave off higher value related terms. Also some markets are far under-priced since PPC is newer there. Based on talking to a few friends I think PPC in Germany on average would offer higher returns than PPC in the UK or US.
Some business models work horribly with pay per click marketing, particularly businesses that have no recurring income streams and/or lower product prices in a market crowded by competitors with higher price points or higher profit margins. If you have a product which may be priced out of the more common high value terms you still may be able to find a few niche terms and bid on your brand, but you may need to rely more on organic search for traffic in this scenario.
Within pay per click marketing I have seen some topics where the Google AdWords ROI is much greater, but, more commonly, Yahoo! has less reach but greater ROI. Because of differences in how their systems work it may mean that leads which are prohibitively expensive in one channel may be cheap in another.
Since right now MSN has few ad distribution partners and is still in beta they should have some of the cleanest traffic and least competition within their new beta system. But they may not have much traffic in some markets due to their small search market share compared to Google or Yahoo!
To do pay per click well you really have to track your conversions so you can calculate your lead value / income per unique visitor. If it is hard to track the exact lead value it is important to find a proxy for value. If your costs seem prohibitively expensive and your business model is similar to competing sites, you need to look at what is going wrong in your conversion process. Competitive PPC markets force you to be more efficient, which helps you with conversions on both PPC and organic search.
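The lead value arithmetic is simple enough to sketch, with made-up numbers: divide tracked conversions by unique visitors, multiply by profit per conversion, and that value per visitor is the ceiling on what a click can cost before you lose money.

```python
# Sketch: back out the value of a unique visitor to find your ceiling
# bid. All the numbers below are illustrative, not real campaign data.

visitors = 2000              # unique visitors from a PPC campaign
conversions = 40             # tracked sales or leads
profit_per_conversion = 25.0

conversion_rate = conversions / visitors                      # 0.02
value_per_visitor = conversion_rate * profit_per_conversion   # 0.50

# bidding above the value per visitor loses money on every click
max_profitable_cpc = value_per_visitor
print(f"max profitable bid: ${max_profitable_cpc:.2f}")  # $0.50
```

If competitors can afford to bid above your ceiling, the fix is usually in the conversion process (or the product economics), not the bid.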
Many non search ads are also sold through the PPC interfaces at the major search engines. Cash rich companies or exceptionally efficient businesses may consider bidding low on contextual ads to help give them a brand lift. Since many of these ads have a low clickthrough rate you can get hundreds of thousands or millions of impressions for a few hundred dollars. Increased mindshare leads to greater search volume, so the contextual ads play back into your PPC and organic search marketing campaigns.
There is an appeal to the concept of retail without the risk, or turnkey operations, but a business without risk is a business without growth or purpose. Even if things seem like they are churning along smoothly with pay per click marketing, the players may change the rules of the game, and overnight many of the terms and techniques that were once exceptionally profitable are less so. In much the same way they want to keep noise out of their regular search results to keep them relevant, they also want to keep ads relevant. And then competitors can enter the market and shift the game plan overnight as well. This can happen in organic or paid results, so using both can help lower your risk from things going wrong with either, plus you can take the information you learn from either discipline and use it to refine the other.
As far as organic SEO goes I could write a 100 page long post that nobody would read (or perhaps I could sell it as an ebook and then people would read it), but generally the four major questions are:
The first question, Should I do PPC?, is composed of the following elements:
Do I have enough cash to at least give PPC a try?
Does my business model preclude PPC?
If so, are there ways I can improve my business model?
You can learn a lot from PPC, like market value estimates and what terms are really important. I think just about everyone should try and track PPC, at least for their own brand names and some of the underpriced edges of the market (although I think it is best to stick with the major players - Google, Yahoo! and MSN Search).
Is there enough traffic to justify outlaying an SEO expenditure?
Can your site compete in Yahoo! and MSN?
Can your site compete in Google?
One way to test how much traffic there is for a given keyword phrase or group of keyword phrases is to start up a test Google AdWords account. If you need a primer on PPC marketing my free PPC tips ebook may be of use.
You can also estimate the size of a keyword market using keyword research tools, although many of them have sampling errors due to small search volume or inflate the search volumes of the most competitive keywords due to automated traffic sources.
If you learn SEO yourself and are in a small niche market you may be able to do it for a hundred or a few hundred dollars. But also do not forget the value of your time. SEO Moz also has a free keyword difficulty estimation tool which some people may find useful.
If you outsource SEO it is hard to find someone who is honest and willing to give you personalized attention unless you can afford a decent spend. Some people may not know what their work is worth and be willing to work dirt cheap, but if you are paying less than $1,000 you probably have about a 95% chance of being disappointed. Depending on your market the cost can scale up to a much larger number. Some people spend hundreds of thousands of dollars a year.
With MSN it seems that just publishing content, using targeted anchor text, and getting low quality links (like links from junky general directories and article syndication sites) is all you need to do to rank. Yahoo! is the same way, although they are a bit more advanced than MSN search is.
With Google, to compete in saturated markets, you need to have an old trusted domain name, or be able to come up with ways to get natural citations from quality sites - and even then it helps as the site ages.
There are ways to get some quality links that may seem like natural citations (like perhaps links for donating to related charities) but the easier it is to get a link the quicker that source will get spammed out. The more abstract your donations are the harder it is for competitors to compete with you. Realistically all links occur due to donations. Creating funny, useful, or compelling content is in a sense a donation to whoever reads it or watches it.
If you are in below-the-radar industries and are creative some of your links can stick for an extended period of time, but if you are competing in savvy fields you also want to ensure you get some legitimate citations that would be hard for your competitors to duplicate. Also keep in mind that if you get exceptionally powerful links via creative means some people in other industries may do research to see what other links you have, and may even start competing in your industry.
You need one or more of the following to compete in Google:
a brand that you can leverage
a rabid following that you can contact
influential web friends who can help spread your message
In non competitive markets you still may be able to do well in Google right away, but the keys there are to make sure you mix your link anchor text and also create content that is long tail in nature.
As stated above, the budget mix is going to be hard to come up with exact percentages due to various competitive landscapes and different business models working better with different parts of the search space. If a site already has a large brand it is important to make sure your content management system is working well with search and your site is getting well indexed.
For just about any long-term website I would recommend doing at least the following for organic SEO either before or in conjunction with starting a pay per click account:
Unique page titles on each page. If you have a huge branded content site and were not doing this you may see your traffic double just by placing unique titles on each page.
Ensure your site is getting well indexed (which has multiple parts to it):
In small, non-competitive niche markets it may not hurt to let the engines set up your campaigns for you, but if your market is not well established the odds are good that the search engines will not do a good job of deep keyword research (since they will have few competing accounts to build your keyword list from).
In competitive markets many people end up losing money. It benefits the engines if most advertisers bid up some of the most common terms (and thus fully value or overvalue the terms that are frequently searched for). The people who make money off pay per click often avoid or underbid the most common terms, and spend more time thinking laterally and bidding on terms that competitors have not yet found. So the goals of the engines may not be well aligned with your own goals (ie: efficient profitable accounts do not mean the same thing when you look at the perspectives of buyers vs sellers).
For competitive terms that you want to compete on you may want to frequently test and retest your landing page and ad copy to make your account more competitive (so in that regard you need to learn about PPC anyhow).
I was able to write most of what I know about PPC in about 30 pages in this free PDF. I do recommend starting with the largest players (Google AdWords, Yahoo! Search Marketing, and MSN AdCenter), but in a game of margins you really need to do more than accept a default account setup provided by the person selling you traffic.
Also there are a number of questions you can't expect the traffic sellers to honestly answer, like:
does PPC even make sense given your current business model
what percent of your budget should be spent with a competitor
should content syndication be turned on
how should you bid on content ads
should you bid on the most common terms, and what is the best ad position to rank in?
Even if they know exactly what different keywords and ad positions will cost, they still do not know your business well enough to know what is best for you. Good accounts should use ad targeting to limit their spend...instead of tying arbitrary budgets to bad bids and bad targeting, but it takes a good deal of learning and tweaking to set up an appropriate account...more work than most engines would like to do. And could you blame them for not wanting to tweak your account to REMOVE some of their income opportunities?
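The bidding trade-off above comes down to simple arithmetic: the most you can profitably pay per click is roughly your conversion rate times your profit per conversion. A quick sketch, with illustrative numbers that are my own assumptions rather than figures from the article:

```python
def max_profitable_cpc(conversion_rate, profit_per_conversion):
    """Break-even cost per click: the expected profit one click generates.
    Bidding above this number loses money on average; bidding below it
    leaves margin for profit."""
    return conversion_rate * profit_per_conversion

# Illustrative assumption: a 2% conversion rate and $40 profit per sale
# give a break-even bid of $0.80 per click.
breakeven = max_profitable_cpc(0.02, 40.0)
```

This is also why the common head terms are so often unprofitable: when every advertiser bids them up to (or past) this break-even number, the margin moves to the lateral, less-contested terms.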
Keeping in tune with your account and your search data also helps you keep in tune with your customers.
SEO Question: I was recently threatened by a competitor about them pointing bad links at my site. Can I be penalized based on who links to my site?
SEO Answer: For most people it is unlikely that a competitor is going to go to such lengths to try to sabotage your business, and it is probably not worth being too paranoid over. The whole reason SEO works well is that few people actively practice it.
Having said all of that, the answer to your question is yes. I have seen it done a couple times, and there are many different mechanisms people can use to hurt your rankings. Google is constantly testing new algorithms. Sometimes sites will not rank for their own names due to too much similar anchor text. At other times Google creates new algorithmic holes while trying to patch old ones.
As far as building shady links goes, some search algorithms may ignore them and some may give them a bit of a negative weighting on your overall relevancy. Generally though the more positive signs of quality your site has the more low quality signs you can get away with. In that regard probably the best way to protect your site from competitive sabotage is to ensure you don't have domain canonicalization issues (ie: engines realize www.site.com and site.com are the same) and work hard to build legitimate signs of trust. Dan Thies offers some good link building advice in this video, but there are a limited number of quality votes any site can get. The key to beating competitors in the link game is to create more legitimate reasons for people to want to link to your site.
Different engines have different mechanisms for analyzing your link profile. For example, Yahoo! may place too much emphasis on sitewide links while the same links may not help you as much in Google. If you push the low quality links hard enough it may boost your site to #1 in MSN and/or Yahoo!, but you may end up with a link profile that prevents you from ranking well in Google (audio here). Also keep in mind that if competitors try to use links to hurt your site in Google they may also be boosting your Yahoo! or MSN rankings.
In summary I think the two best ways to avoid competitive threats are to stay away from hyper competitive industries OR work hard to create enough legitimate signs of quality that your site is hard to harm.
SEO Question: I believe link popularity is the #1 criteria to rank in most search algorithms. Is it possible to gain links too quickly?
SEO Answer: Yes, you can gain links too quickly, though I think it rarely happens. Here is an example of Google temporarily banning one of their own sites for building too many links too quickly. You have to appreciate the strength of Google's brand; that is part of the reason their then-new AdSense blog could gain so many legitimate links so quickly - it is an anomaly.
When people get in trouble for building links too quickly typically they are using automated link building methods, link exchange networks, or lack focus on link quality - all of which give a site an unnatural link profile with an emphasis on low quality linkage data (see TrustRank and the Company You Keep as an example).
If you are getting natural citations in a viral marketing campaign I would not want it to stop for anything. Even if a site is temporarily banned by a bad search algorithm, as long as the fault is not your own the site will come back strongly. Plus natural viral link campaigns have the following bonuses:
are hard for competitors to duplicate
can even hurt competitors' brand equity, if they request links the wrong way from opinionated high-authority authors
drive usage data - ie: they usually spread through the active portions of the web
give you a safety net...if your site is ever removed from the search results viral links will still provide direct traffic (and revenue) as well as help fill up search results for your brand with positive comments that further help improve your trustworthiness and conversion rate
If you are building links by submitting to directories and submitting articles to syndication sites I don't think it hurts to build 20 to 50 links at a time, so long as you keep actively building links over time or already have an old established site.
Of course when you build links it makes sense to mix up your anchor text and descriptions so that you are relevant for a basket of keywords and do not make your link profile too unnatural looking.
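One way to sanity-check how mixed your anchor text is, per the advice above, is to tally the distribution across your known backlinks. A minimal sketch, assuming you have exported a list of (source URL, anchor text) pairs from whatever backlink tool you use:

```python
from collections import Counter

def anchor_text_distribution(backlinks):
    """backlinks: iterable of (source_url, anchor_text) pairs.
    Returns each anchor text's share of the total, most common first."""
    counts = Counter(anchor for _source, anchor in backlinks)
    total = sum(counts.values())
    return [(anchor, count / total) for anchor, count in counts.most_common()]

# If a single phrase dominates (say, well over half of all links), the
# profile may look unnatural to an engine weighing anchor-text diversity.
```

There is no published threshold for what counts as "too unnatural," so treat the output as a rough diagnostic rather than a pass/fail score.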
SEO Question: Much of my website is in Google's Supplemental index. What is their supplemental index? How does it work?
SEO Answer: What a timely question...where to start...well, if the supplemental problem has only hit your site recently (as compared to the date of this post) it may be a Google-specific problem that recently caused them to dump thousands of sites.
Matt Cutts, a well known Google engineer, recently asked for feedback on the widespread supplemental indexing issue in this thread. As noted by Barry, in comment 195 Matt said:
Based on the specifics everyone has sent (thank you, by the way), I'm pretty sure what the issue is. I'll check with the crawl/indexing team to be sure though. Folks don't need to send any more emails unless they really want to. It may take a week or so to sort this out and be sure, but I do expect these pages to come back to the main index.
Some people theorize that lots of recently listed pages were dropped and only the longstanding supplemental pages remain, but that theory does not hold on my site, since I have a strong PageRank 6 page that was ranking in the SERPs for competitive phrases before it recently went supplemental.
I did a site redesign just after this supplemental issue occurred, but that was coincidental. One good thing about the MovableType update is that the prior version of MovableType I was using created these awful nuclear-waste redirect pages...it does not do that on version 3.2.
As far as other reasons this site could have possibly been hit supplemental:
too much similar text on each page - but I do think it is common to have common sales elements on many pages of a site, so I doubt that is it
redirect links - affiliate links via Clickbank and the direct affiliate program might have flagged some sort of trigger if Google was trying to work on 301 & 302 issues... but whatever they did I don't think they did it better ;)
Google is a bit hosed right now
What are supplemental results?
Supplemental results usually only show up in the search index after the normal results. They are a way for Google to extend their search database while also preventing questionable pages from getting massive exposure.
How does a page go supplemental?
From my experience, pages have typically gone supplemental when they became isolated doorway-type pages (lost their inbound link popularity) or when they were deemed to be duplicate content. For example, if Google indexes both the www. version of your site and the non-www. version, then most of the pages on one of those versions will likely end up in the supplemental results.
If you put a ton of DMOZ content and Wikipedia content on your site that sort of stuff may go supplemental as well. If too much of your site is considered to be useless or duplicate junk then Google may start trusting other portions of your site less.
Negative side effects of supplemental:
Since supplemental results are not trusted much and rarely rank, they are not crawled often either. Because they are little-trusted and rarely crawled, odds are pretty good that links from supplemental pages do not pull much - if any - weight in Google.
How to get out of Google Supplemental results?
If you were recently thrown into them the problem may be Google. You may want to wait it out, but also check that you are not making errors like www vs non-www duplication, content management errors delivering the same content at multiple URLs (doing things like rotating product URLs), or too much duplicate content for other reasons (you may also want to check that nobody outside your domain is showing up in Google when you search for site:mysite.com, and you can also look for duplicate content with Copyscape).
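The www vs non-www check above can be partially automated: request each hostname variant and see whether the non-canonical ones cleanly 301-redirect to the canonical host. Here is a minimal sketch of the decision logic; you would feed it the status code and Location header from live requests (the hostnames shown in the comments are placeholders).

```python
from urllib.parse import urlparse

def diagnose_canonicalization(responses, canonical_host):
    """responses: mapping of hostname -> (status_code, Location header or None),
    as returned by requesting each variant of the domain.
    Flags hostnames that serve content directly (or redirect elsewhere)
    instead of 301-redirecting to the canonical host -- those duplicate
    versions are what can push pages into the supplemental results."""
    problems = []
    for host, (status, location) in responses.items():
        if host == canonical_host:
            continue
        target = urlparse(location).hostname if location else None
        if status != 301 or target != canonical_host:
            problems.append(host)
    return problems

# e.g. if site.com answers with a 200 page of its own instead of a 301
# to www.site.com, both versions can get indexed as duplicates.
```

A 301 (rather than a 302) is the redirect engines treat as permanent, which is why it is the one to use for consolidating the two hostnames.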
If you have pages that have been orphaned, or if your site's authority has gone down, Google may not be crawling as deep through your site. If you have a section that needs more link popularity to get indexed, don't be afraid to point link popularity at that section instead of trying to point more at the home page. If you add thousands and thousands of pages you may need more link popularity to get it all indexed.
After you solve the problem it still may take a while for many of the supplementals to go away. As long as the number of supplementals is not growing, your content is unique, and Google is ranking your site well across a broad set of keywords then supplementals are probably nothing big to worry about.