[Tim] Mayer noted that what's relevant for a query can often change over time. Google's Udi Manber, vice president of engineering, made similar remarks when I spoke with him about human-crafted results during a visit to Google yesterday.
One example he pointed out was how Google's human quality reviewers -- people Google pays to double-check the quality of its results so it can better tune the search algorithm -- started to downgrade results for [cars] when information about the movie Cars started turning up. The algorithm had picked up that the movie was important to that term before some of the human reviewers were aware of it.
Human review is used at all the major search engines, but even when reviews are outsourced, humans have limits, just as they do with producing content. Even if Google has 10,000 quality raters, those people can only be trained to find and rate certain things.
Information architecture is probably the single most important and most under-rated aspect of the search marketing strategy for large websites.
A Recurring Error
I have been reviewing some client sites that could use work on the information architecture front. Some of them want to rank for keywords they do not actively target. The key to ranking is to create meaningful navigation schemes that reinforce the goals of your site and your target keyword phrases. In addition, a site that is improperly categorized due to poor internal navigation does not flow PageRank properly through the site, which means your ranking data and market feedback will be broken and will not reflect your true potential.
Conversion-oriented structure is a type of content. It is one of the biggest advantages smaller players have over large players that operate in many fields, and it adds to the bottom line of any site that takes it seriously.
Compare the following...
What Happens to a Site With Bad Internal Navigation?
A piecemeal site with hacked-up internal navigation exhibits the following characteristics:

- navigation is inconsistent and confusing, so it is hard for spiders to know which pages are important and hard for humans to work their way through the conversion process
- if the spiders do not rank the correct pages, odds are pretty good that visitors will come into the site on the wrong page (and have a hard time converting if they start out at the wrong spot)
- it is hard to buy broad keywords using PPC because competing sites are better at funneling visitors through the conversion process
- it is hard to buy category-level keywords using PPC because it is hard to place people on meaningful content if it does not exist. Category pages should be more than a link list or groups of irrelevant chunks of content
- pages that should act as category pages do not add context, build trust, or build credibility - they are essentially placeholders gluing together various unrelated content
- if you do not have well-defined and reinforced category pages, the site is not structured to rank for mid-level, category-related keywords
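The PageRank point above can be seen with a toy example. This is a simplified power-iteration sketch, not Google's actual algorithm, and the four-page site is hypothetical: a site with clear home-to-category-to-product navigation concentrates PageRank on the pages you want to rank, while a hacked-up site that links everything to everything spreads it evenly across pages regardless of importance.

```python
# Toy PageRank (power iteration, damping 0.85) over two hypothetical
# internal-link structures. Simplified for illustration only.

def pagerank(links, iters=50, d=0.85):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = d * pr[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its weight evenly
                for q in pages:
                    new[q] += d * pr[p] / n
        pr = new
    return pr

# Well-structured: home links to the category page, which links to products.
structured = {
    "home": ["cat"],
    "cat": ["home", "prod1", "prod2"],
    "prod1": ["cat"],
    "prod2": ["cat"],
}

# Hacked-up: every page links to every other page equally.
flat = {p: [q for q in structured if q != p] for p in structured}

for name, graph in (("structured", structured), ("flat", flat)):
    pr = pagerank(graph)
    print(name, {p: round(v, 3) for p, v in sorted(pr.items())})
```

In the structured graph the category page accumulates the most PageRank, reinforcing the keywords you target there; in the flat graph every page ends up with an identical share, so nothing stands out.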
When creating a robots.txt file, if you specify rules for all bots (using a *) and later specify rules for a specific bot (like Googlebot), search engines tend to ignore the broad rules and follow the ones you defined specifically for them. From Matt Cutts:
If there's a weak specification and a specific specification for Googlebot, we'll go with the one for Googlebot. If you include specific directions for Googlebot and also want Googlebot to obey the "generic" directives, you'd need to include allows/disallows from the generic section in the Googlebot section.
I believe most/all search engines interpret robots.txt this way--a more specific directive takes precedence over a weaker one.
You can learn a lot about how search results will change by watching the recent changes engines have made and by seeing what tests they are running. As SEOs we track the algorithms quite intensively, but the search result display is just as important. Google lets webmasters see what search tests it is currently performing via Google Experimental Search.
Google bought YouTube, but is struggling to iron out revenue shares and advertising. What is the easiest way for Google to fix these issues? Integrate YouTube and Google Video directly into Google's search results.
Using what legal loopholes they may and something they call universal search, you can now listen to music videos directly from Google.com search results. This creates a marketplace that many businesses will need to be in to stay relevant, destroys a whole vertical of web spam, AND allows Google to monetize the organic search results (via YouTube). If you think video is a passing fad you bet wrong, but if you are doing it you are best off branding your videos to be associated with your domain name and uploading them to YouTube. Eli offered tips on how to make $1,000 a month re-purposing video, but now the number is more like $20,000.
Sure, Google has done many YouTube users wrong, but if you need exposure, Google turned back the clock on SEO. Top rankings have never been easier. All you need to succeed is to format your content as video and upload it to YouTube.
If Google or Amazon were your bank or credit card, they'd let you know which merchants had the best prices for the same products, so you'd be a smarter shopper next time. They'd let merchants know what products were popular with people who also bought related products. They'd help merchants stock the right products by zip code. They'd let you know when you were spending more on dining out than you have set in your family budget. They'd let you know when you were approaching your credit limit, with a real-time fuel gauge, not just a "Sorry, your card has been declined."
By making search richer, Google gives you less reason to leave. Google started with targeted text ads, but it is even better if it can combine that targeting with trusted brands and offers while making its ads look like a useful piece of content in any format.
A friend recently launched a new site and promptly crafted a great linkbait award idea that got so many links that over 95% of the website's inbound links were reciprocal links. The award program worked so well that traditional PR firms used our list of award winners to seed their list of people they wanted to contact to talk about a client.
The site did not rank anywhere near as well as it should have because there were too many reciprocal links gained far too quickly when you consider the rest of the site's link profile.
One of the reasons it is so important to mix link types is that if any of your marketing really takes off, you want some semblance of balance in your link profile.