Captain Caveman posts on Google's duplicate content filters.
Interesting tactic by Google. If too many pages on the same site trip a duplicate content filter, Google does not just filter through to find the best result; sometimes it filters out ALL the pages from that site.
This creates an added opportunity cost to creating keyword driftnets & deep databases of near-identical, useless information. One page left in the results = no big deal. Zero pages = big deal.
Not only would this type of filter whack empty junk directories, thematic screen-scraper sites, and cookie-cutter affiliate sites, but it could also hit regular merchant sites that have little unique information on each page.
On commercial searches many merchants will be left in the cold & the SERPs will be heavily biased toward unique, information-dense websites.
If your site was filtered, there is always AdWords. And if there are few commercial sites in the organic results, then the AdWords CTR goes up. Everyone is happy, except the commercial webmaster sitting in the cold.
Yet another example of Google trying to nullify SEO techniques that work amazingly well in its competitors' results. I wonder what percent of SEOs are making different sites targeted at different engines' algorithms.
I have to be somewhat careful watching these types of duplicate content filters, because I have a mini sales letter on many pages of this site, and this site could get whacked by one of these algorithms. If it does, changes will occur. Perhaps using PHP to render the text as an image, or some other similar technique.
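The idea behind that last technique is simple: text baked into an image is invisible to a crawler's duplicate-content comparison, so a blurb repeated on every page stops counting against the site. As a rough sketch (in Python with the Pillow imaging library rather than PHP's GD; the function name and filenames are my own, purely illustrative):

```python
# Hypothetical sketch: render a repeated sales blurb as a PNG so crawlers
# never see the same block of text duplicated across every page.
# Assumes the Pillow library is installed (pip install Pillow).
from PIL import Image, ImageDraw


def text_to_image(text, path, width=400, line_height=16, padding=10):
    """Render plain text onto a white PNG, one image row per text line."""
    lines = text.splitlines() or [""]
    height = padding * 2 + line_height * len(lines)
    img = Image.new("RGB", (width, height), "white")
    draw = ImageDraw.Draw(img)
    for i, line in enumerate(lines):
        # Uses Pillow's built-in bitmap font; a real site would load a
        # TrueType font via ImageFont.truetype() for nicer output.
        draw.text((padding, padding + i * line_height), line, fill="black")
    img.save(path, "PNG")
    return path


text_to_image("Subscribe to the newsletter.\nGet the free e-book.", "blurb.png")
```

The page would then reference `<img src="blurb.png">` instead of repeating the paragraph. The obvious trade-off: any keywords in the blurb stop counting for you too, and the text becomes invisible to screen readers, so this is a last resort rather than a default.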