Frank mentioned this NYP article about how some companies are buying sites outright rather than increasing their AdWords bid prices. I expect this to be a large and growing trend for at least a couple of years. As Google gets more efficient at pricing the ads, it increases the value of the top ranked sites that sit alongside those ads. Internet Search Metrics, quoted in the NYP article as Internet Search Management, is providing audits of the competitive landscape of search:
ISM's audits track the top 4.5 million search phrases on Google and Yahoo!, a total of 7.3 billion searches a month, to determine which companies across 50 business sectors pop up most frequently in the top three or four positions in natural search. ...
The ISM audits, to be released in London, break down which of 50 business sectors are locked up - that is, have large chunks of natural search dominated by a handful of companies - and which are wide open.
I have not yet seen any of the reports, but the network is still young. If you love marketing, are in tune with web trends, and are well funded I am guessing that many of the markets that appear locked up are still wide open.
Creating Shadow Brands & Buying Top Ranked Competing Sites
While small businesses worry about the risks of buying or renting a few links, some large corporations are launching shadow brands or buying out competing domains en masse. There are thousands, if not millions, of examples, so it is somewhat unfair to single any out, but here are a few for the sake of argument.
How many different verticals does Yahoo! cover the Nintendo Wii in? Off the top of my head, at least 9: their brand universe, Yahoo! Tech, Yahoo! Shopping, Yahoo! News, the Yahoo! Directory, Ask Yahoo!, Yahoo! Answers, videogames.yahoo, games.yahoo, etc. (and that doesn't even count geolocal subdomains for answers, shopping, etc.)
What happened to result diversity? When and why did Google stop caring about that?
Why is buying links bad, when using infinite domains or buying a bunch of sites are both legitimate? Why is it ok for the WSJ to publish this type of content, but wrong for me to do whatever necessary to compete in a marketplace cluttered with that information pollution?
The point here is not to say that big businesses are bad or doing anything wrong, but to show the stupidity Google is relying on when they scaremonger newer and smaller webmasters about the risks of buying a link here or there. The big businesses do all of the above, gain more organic links by being well known, and still buy links because the technique works. Whatever Google ranks is what people will create more of, so long as it is profitable to do so.
If you create a real brand you can buy more links and be far spammier with your optimization with a lower risk profile, because Google has to rank your site or they lose marketshare. Create something that is best of breed and then market the hell out of it. If marketing requires buying a few links then open up the wallet and get ready to rank.
What happens to the value of your content when search engines get better at providing answers directly in the search results? Is your site the type of site they would like to cite, or does it fall further down the list on another category of queries? What can you do to make them more likely to want to source your site? Does your site have enough perceived trust and value to draw clicks after they put your content directly in the search results?
As search engines work harder at things like universal search, search personalization, and Cyc-style knowledge systems, any site that is only facts and filler won't get much exposure.
Some top ranking sites do not deserve to rank where they do. Being lucky enough to be in that situation lets you get away with being lazy, because a site does not have to be very efficient to make money if it is well represented for targeted search queries that send free traffic. But every website has upside potential, even if it already ranks #1.
Improve Internal Navigation & Usability
One client of mine ranks just below the official manufacturer for the manufacturer's name. His site had inadequate internal navigation. I took a day to improve the navigation, and the result was a 150% increase in sales. The last 8 days of last month sold nearly as much as the rest of the month combined. His business model looked like it was about to die, but that one day of work made it viable for at least a couple more years.
High Profit Parallel Markets
I had a site which made a couple hundred dollars a day that was well established in its market, but did not dominate it. Taking the path of least resistance, I branched the site into two parallel markets of greater commercial interest where the competition was weaker. For an investment of less than what the site now earns in a month, I was able to increase its income 5x, without even doing much link building.
An Undeserved Ranking
One of my friends is in a high profit market where the competition is absolutely clueless. Basic SEO brought that site to a #1 ranking in Google. The site is highly conversion oriented and makes great income, but now that it already ranks it probably makes sense to reinvest some of the profit into improving content quality and reinforcing that market position. Businesses that do not reinvest eventually fall, especially if they are winning only because the competition is clueless. Competitors spending a couple thousand dollars a day on AdWords will eventually start to look into SEO.
The Value of Branding
If a site ranks #1, and is monetized via PPC ads, it still might only make a portion of what it should because AdSense is not as efficient as some people would lead you to believe. If a site is strong enough to attract brand advertisers they will pay a premium just for getting their brand seen. Scraper sites and thin content sites don't attract brand advertisers, even if they convert. I have seen a site that was making $80 a month on AdSense make over $10,000 a month selling brand advertisements.
By the time people are looking to automate a no cost SEO technique, as a competitive strategy it is already dead. Blog comment spamming was once highly effective, but by the time commercial blog comment spam software was widely available, the practice had already stopped working in Google.
Automated Article Submission Software
At SMX Advanced a Yahoo! engineer noted that if they detect content as duplicate, they are less likely to trust it to seed crawling of other documents. Marketers are pushing article submission software to submit articles to article directories, but if most of the content on an article directory is duplicate, marketers are spamming it via automated systems, and the content networks accept automated submissions, then obviously it is not going to be a clean, trusted part of the web you can go back to again and again. It may be worth trying here or there for a bit of anchor text or other market testing, but it is probably not worth automating on a mass scale, especially if your site lacks important signs of quality.
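To make the duplicate-content point concrete, here is a minimal sketch of near-duplicate detection using word shingles and Jaccard similarity — one common textbook approach, offered purely as an illustration of how an engine *might* flag spun or recycled articles, not a description of Yahoo!'s actual method:

```python
# Toy near-duplicate detector: compare documents by their overlapping
# word shingles (k-word windows). High overlap => likely duplicate.

def shingles(text, k=4):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

original = "ten tips for buying a used car at a fair price today"
spun     = "ten tips for buying a used car at a fair price online"
fresh    = "how to grow tomatoes in a small backyard garden plot"

sim_dup = jaccard(shingles(original), shingles(spun))
sim_new = jaccard(shingles(original), shingles(fresh))

# The lightly "spun" article shares almost all shingles with the
# original; the genuinely fresh article shares none.
print(sim_dup, sim_new)
```

An article directory full of submissions that score near 1.0 against content already on the web is exactly the kind of neighborhood an engine would learn to distrust.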
Hundreds of Engineers Work to Kill Spamming Techniques
The spam detection and anti-spam algorithms are driven by people. If something is commonplace in a market then the search engines try their best to stop it. If they can automate it they will. If they have to demote it manually they will.
In the second video here Matt Cutts talked about how spam prevention methods may be different based on language, country, or even market...noting that many real estate sites rely too much on reciprocal link spam.
The less your site's marketing methods look like spam and the harder it is to duplicate what you have done the less likely you are to get hurt by the next update. By the time there is a mass market automated spamming solution the technique is already dead.
[Tim] Mayer reminded that what's relevant for a query can often change over time. Google's Udi Manber, vice president of engineering, made similar remarks when I spoke with him about human-crafted results when I was visiting at Google yesterday.
One example he pointed out was how Google's human quality reviewers -- people that Google pays to provide a human double-check on the quality of its results, so they can then better tune the search algorithm -- started to downgrade results for [cars] when information about the movie Cars started turning up. The algorithm had picked up that the movie was important to that term before some of the human reviewers were aware of it.
Obviously human review is used at all major search engines, but even with outsourced reviews, humans have limits, just as they do with producing content. Even if Google has 10,000 quality raters, those people can only be trained to find and rate certain things.
Information architecture is probably the single most important and most under-rated aspect of the search marketing strategy for large websites.
A Recurring Error
I have been reviewing some client sites that could use work on the information architecture front. Some of them want to rank for keywords that they do not actively target. The key to ranking is to create meaningful navigation schemes that reinforce the goals of your site and your target keyword phrases. In addition, a site that is improperly categorized due to poor internal navigation does not flow PageRank properly through the site, which means your ranking data and market feedback will be broken and will not show your true potential.
Conversion oriented structure is a type of content. It is one of the biggest advantages smaller players have over large players that operate in many fields, and adds to the bottom line of any site that takes it seriously.
Compare the following...
What Happens to a Site With Bad Internal Navigation?
A piecemeal site with hacked up internal navigation exhibits the following characteristics:
navigation is inconsistent and confusing, thus it is hard for spiders to know what pages are important and it is hard for humans to work their way through the conversion process
if the spiders do not rank the correct pages odds are pretty good that the visitors will come into the site on the wrong page (and have a hard time working their way through the conversion process if they start out at the wrong spot)
hard to buy broad keywords using PPC because competing sites are better at funneling visitors through the conversion process
hard to buy category level keywords using PPC because it is hard to place people on meaningful content if it does not exist. category pages should be more than a link list or groups of irrelevant chunks of content
pages that should act as category pages do not add context, build trust, or build credibility - they are essentially placeholders gluing together various unrelated content
if you do not have well defined and reinforced category pages the site is not structured to rank for the mid level category related keywords
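The PageRank-flow point above can be sketched with a toy power-iteration model. This is a simplified textbook PageRank, not Google's production algorithm, and the two site graphs are hypothetical, but it shows how a hub-and-spoke structure concentrates link equity on category pages while a flat, hacked-up structure leaves them starved:

```python
# Toy PageRank comparison: structured navigation vs. flat navigation.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                # Dangling page: redistribute its rank evenly.
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Structured site: home links to category hubs, hubs link to their
# products, and products link back up to their category.
structured = {
    "home": ["cat-a", "cat-b"],
    "cat-a": ["prod-1", "prod-2", "home"],
    "cat-b": ["prod-3", "prod-4", "home"],
    "prod-1": ["cat-a"], "prod-2": ["cat-a"],
    "prod-3": ["cat-b"], "prod-4": ["cat-b"],
}

# Hacked-up site: home links to everything equally and nothing else
# interlinks, so no page signals which sections matter.
flat = {
    "home": ["cat-a", "cat-b", "prod-1", "prod-2", "prod-3", "prod-4"],
    "cat-a": ["home"], "cat-b": ["home"],
    "prod-1": ["home"], "prod-2": ["home"],
    "prod-3": ["home"], "prod-4": ["home"],
}

for name, graph in [("structured", structured), ("flat", flat)]:
    print(name, "cat-a rank:", round(pagerank(graph)["cat-a"], 3))
```

In the structured graph the category page collects rank from both the homepage and every product beneath it, so it is positioned to rank for those mid level category keywords; in the flat graph it receives one sixth of the homepage's vote and nothing else.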