Interesting tactic by Google. If too many pages on the same site trip a duplicate content filter, Google does not just filter through to find the best result; sometimes it filters out ALL the pages from that site.
This creates an added opportunity cost to creating keyword driftnets & deep databases of near-identical, useless information. One page left in the results = no big deal. Zero pages = big deal.
Not only would this type of filter whack empty junk directories, thematic screen scraper sites, and cookie cutter affiliate sites, but it could also hit regular merchant sites which have little unique information on each page.
On commercial searches many merchants will be left out in the cold & the SERPs will be heavily biased toward unique content & information-dense websites.
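Google's actual filter is not public, but a minimal sketch of one common approach, shingle-based near-duplicate detection, shows how a site full of near-identical pages could get dropped all at once. The page copy, threshold, and shingle size below are illustrative guesses, not anything Google has confirmed:

```python
def shingles(text: str, size: int = 4) -> set:
    """Break text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def filter_site(pages: dict, threshold: float = 0.5) -> dict:
    """Drop every page that is a near-duplicate of another page on the
    same site. Note that BOTH copies get dropped, which is the
    'zero pages left' behavior described above."""
    urls = list(pages)
    duplicates = set()
    for i, u in enumerate(urls):
        for v in urls[i + 1:]:
            if similarity(pages[u], pages[v]) >= threshold:
                duplicates.update({u, v})
    return {u: t for u, t in pages.items() if u not in duplicates}

pages = {
    "/widget-red": "cheap red widget page boilerplate text here for sale today",
    "/widget-blue": "cheap blue widget page boilerplate text here for sale today",
    "/guide": "a long unique guide to choosing widgets with original advice",
}
print(filter_site(pages))  # only the unique guide survives
```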
If your site is filtered there is always AdWords. And if there are few commercial sites in the organic results then the AdWords CTR goes up. Everyone is happy, except the commercial webmaster sitting out in the cold.
Yet another example of Google trying to nullify SEO techniques that work amazingly well in its competitors' results. I wonder what percent of SEOs are making different sites targeted at different engines' algorithms.
I have to be somewhat careful in watching some of these types of duplicate content filters, because I have a mini salesletter on many pages of this site, and this site could get whacked by one of these algorithms. If it does, changes will occur. Perhaps using PHP to render the text as an image, or some other similar technique.
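The post mentions PHP for this, but the idea is language-agnostic. Here is a minimal sketch of the text-as-image workaround in Python with Pillow; the sizing constants and output path are assumptions for illustration, not anything actually running on this site:

```python
# Render repeated boilerplate (e.g. a salesletter) to a PNG so it no
# longer appears as duplicated on-page text. Requires Pillow
# (pip install Pillow); sizing constants are rough guesses.
from PIL import Image, ImageDraw

def text_to_image(text: str, path: str = "salesletter.png") -> None:
    lines = text.split("\n")
    width = 10 * max(len(line) for line in lines) + 20
    height = 18 * len(lines) + 20
    img = Image.new("RGB", (width, height), "white")
    draw = ImageDraw.Draw(img)
    for i, line in enumerate(lines):
        # uses Pillow's default bitmap font; pass font= for anything nicer
        draw.text((10, 10 + 18 * i), line, fill="black")
    img.save(path)

text_to_image("Like this site?\nBuy the ebook and learn how search engines work.")
```

The obvious tradeoff: text in an image is invisible to engines and screen readers alike, so you also give up whatever ranking value that copy had.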
Recently I did a paid consult with a person who runs a number of websites who wanted to increase his AdSense earnings. He wanted to know the secret of tweaking in-page copy for SEO perfection.
As he kept tweaking his page copy he kept raising the keyword density and unknowingly pulling out some of the modifiers and other semantically related terms.
Since his site did not have an amazing authority score he was not ranking for the most common terms. Most of his traffic was coming in from longer queries. As he tweaked the page copy his pages became less linkable / linkworthy, and he removed many of the terms that were responsible for the 3 and 4 word queries that were bringing visitors to his website. His traffic kept dropping, so he kept tweaking. Traffic kept dropping, kept tweaking, repeat cycle...
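A rough sketch of that tradeoff, with made-up page copy and a made-up target term: pushing keyword density up by stripping out modifiers also strips out the distinct qualifier words that the 3 and 4 word queries were matching:

```python
def keyword_density(text: str, term: str) -> float:
    words = text.lower().split()
    return words.count(term.lower()) / len(words)

def modifiers(text: str, term: str) -> set:
    """Distinct non-target words: the qualifiers that catch the
    3 and 4 word queries."""
    return {w for w in text.lower().split() if w != term.lower()}

before = "cheap blue widgets for small home offices plus widgets repair advice"
after = "widgets widgets cheap widgets deals widgets"  # "optimized" copy

for label, copy in (("before", before), ("after", after)):
    print(label, round(keyword_density(copy, "widgets"), 2),
          len(modifiers(copy, "widgets")), "modifiers")
```

The "optimized" copy nearly quadruples the density while gutting the pool of modifier words that long-tail queries match.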
The site might be absolutely offensive to a ton of people, but that site will likely get links from BOTH people who like it AND people who hate it. The site is equally unique and offensive, which is something that is oh-so-easy to link at.
One well known search engineer in the past also recommended creating a grammar nazi site that went around fixing everyone's broken grammar and linking back to the home site.
The same words, sent in the exact same way, carried two completely different meanings. In the "default" case, it's just another shill hawking just another product. In the second, it's a real request from a real person who is not even directly involved with the product, who happened to think it (and, more importantly, the folks involved with it) were neat, and wanted to get the word out.
Same words. Same medium. Very different meanings.
I think some of the people email spamming with poor English would probably do far better if they also tried sounding young or whatever in some of them. Sounding authentic is the key.
Most of my link requests talk about other subjects as well. If only I were 20 years younger...arg..am...getting...old.
I really like the Threadwatch tagging thingie. It's where I found this link & is a good way for people to submit stories without actually having to submit them :)
So when I announced Backlink Analyzer I posted a detailed blog post, which got many links from solid, authoritative industry-related sites.
I later moved the bulk of the info to the download page, and now most people will probably link at that.
The reason I posted so much info on the blog part is that I wanted to make sure that people read it / saw it. I probably should have had a bit more self confidence with that and placed the bulk of the information on its own permanent page right from the start.
One of the biggest things many webmasters do that hurt their sites is not being consistent with internal linking, or not being consistent with where they tell others to link.
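A small sketch of what that inconsistency costs: if internal links and the URLs you hand out mix www/non-www, trailing slashes, and index pages, link popularity splits across several URL variants instead of pooling on one. The link list here is made up for illustration:

```python
# Flag pages that are being linked under multiple URL variants.
from urllib.parse import urlparse
from collections import defaultdict

def canonical(url: str) -> str:
    """Collapse the common variants to one canonical form."""
    p = urlparse(url.lower())
    host = p.netloc.removeprefix("www.")
    path = p.path
    if path.endswith(("index.html", "index.php")):
        path = path.rsplit("/", 1)[0] + "/"
    if not path.endswith("/"):
        path += "/"
    return f"{host}{path}"

links = [
    "http://example.com/tools",
    "http://www.example.com/tools/",
    "http://example.com/tools/index.html",
]

variants = defaultdict(set)
for url in links:
    variants[canonical(url)].add(url)

for page, forms in variants.items():
    if len(forms) > 1:
        print(f"{page} is linked {len(forms)} different ways: {sorted(forms)}")
```

Pick one canonical form and use it everywhere, internally and in every link request.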
Some of them start two auctions in parallel supporting different charities such that bidders aim to outbid the other item to show how much more important their charity is and how much more they support it.
After paying back the costs sometimes the links are way cheaper than buying similar links directly, and you help charities. Win win.
I see the guide as being pretty good for people new to the search market, as it gives them a snapshot of a variety of voices, helping them get a better picture of the keyword research and internet marketing process.
The three big criticisms I have of the guide are:
Most emails that are sent from unknown strangers are garbage. Sending a somewhat vanilla looking link request email means people are going to be predisposed to ignore you.
Seth explains why timely, targeted, and personalized link requests are much more effective than the average link request (although he is talking about a different topic, I decided to try to relate it to SEO). When I was new to the web I worked much harder on link building than I do today. I usually found the best link requests only worked if I took the time to really understand the motives of the webmasters, or made it look as though I was just trying to help them out.
One example technique: when a site was taken down and redirected to another site, I:
used a tool similar to Hub Finder to find authority links pointed at the old site.
manually reviewed the sites to look for their motives and how on topic the page was to my site
emailed webmasters of some rather powerful websites reminding them that they had an outdated link, told them the new site location, and listed a few other highly recommendable resources (one of which was my own site).
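A sketch of that qualification pipeline in Python. Everything here is a stand-in: BACKLINKS fakes what a tool like Hub Finder or Link Harvester would return, the domains are placeholders, and worth_emailing() is a crude proxy for the manual review step, which in practice was a judgment call about motives and topicality:

```python
# Faked backlink data: pages that linked to the old, now-redirected site.
BACKLINKS = {
    "http://lib.example.edu/webmaster-resources.html": "old-tools.example.net",
    "http://blog.example.org/links/": "old-tools.example.net",
    "http://banner-farm.example.com/links.html": "old-tools.example.net",
}

def worth_emailing(page_url: str) -> bool:
    """Step 2 proxy: prefer .edu/.org resource pages. In practice this
    was a manual review of the page's motives and topic, not a rule."""
    return ".edu" in page_url or ".org" in page_url

def draft_email(page_url: str, old_site: str, new_site: str, my_site: str) -> str:
    """Step 3: a helpful-sounding note; the outdated link, the new
    location, and a short list of recommendable resources including
    your own site."""
    return (f"Hi, the link to {old_site} on {page_url} is outdated; "
            f"that site moved to {new_site}. A few other resources worth "
            f"listing: {new_site}, {my_site}.")

for page, old_site in BACKLINKS.items():
    if worth_emailing(page):
        print(draft_email(page, old_site, "new-home.example.com",
                          "your-site.example.com"))
```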
By breaking "link to me" down into a 4 or 5 step process to highly qualify the leads and make myself look like I was helping them, I was much more effective at building links than with random rogue hunting methods.
By making it look like I was trying to help them make their sites better my link conversion rate was like 30% to 50%, and these were free, powerful links.
Around the same time I wrote an article about Google's Florida Update that got to be somewhat well known. Some people want to know your status or whatever, and when some asked I played down any success I had up to that point and said, well, I just wrote an article a couple days ago... and based on that even more people converted.
Here is another, similar example that blatantly failed. A while ago I used tools like Hub Finder, Link Harvester (I should soon add a feature to limit the domain type on Link Harvester), and Yahoo!'s Advanced Search Page to find some of the college sites that linked to search engine submission companies years ago.
I wanted to see if I could persuade them that search engine submission was outdated, and that they could keep current with search news by linking through to some of the search related blogs (one of them of course being my blog).
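The core of that hub-finding step is just a set intersection: pages that link to several of the submission companies at once are likely the outdated resource pages worth contacting. The backlink sets below are made-up placeholders; a tool like Hub Finder works from real backlink data:

```python
# Sketch of the co-citation idea behind hub finding: surface pages
# linking to two or more of the target sites. All URLs are made up.
from collections import Counter

SUBMISSION_COMPANIES = {
    "submit-co-a.example.com": {
        "http://lib.example.edu/webmaster.html",
        "http://help.example.edu/promote-your-site.html",
    },
    "submit-co-b.example.com": {
        "http://lib.example.edu/webmaster.html",
        "http://it.example.org/links.html",
    },
}

counts = Counter(page
                 for backlinks in SUBMISSION_COMPANIES.values()
                 for page in backlinks)
hubs = [page for page, n in counts.items() if n >= 2]
print(hubs)  # -> ['http://lib.example.edu/webmaster.html']
```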
That campaign had horrific conversion rates, and probably was a complete waste of time. Why? Because I was asking people to make multiple changes at once:
First it requires them to admit that their information is outdated, incorrect, or useless; and unlike a site moving location I did not have blatant, obvious proof of this fact assembled;
I did not sell why they should change the page as much as I needed to;
students changing the sites need permission to change stuff;
they have little reason to believe me or care;
most of them get paid next to nothing and would probably rather work on their homework or real job than worry about me;
most professors like to think of their own information as pure and correct.
How could that have been more effective? I could have asked Danny Sullivan or Robert Clough or one of the other authoritative search site owners if they would publish an article about search engine submission being outdated. I could have either written that article myself or paid a friend to write it and have it published. Then I could have used that article to quote various search reps saying submission is outdated, link through to other more effective ways to list and rank sites, and use that as reasoning for people to avoid submissions.
I could also have written a part two to the article describing many submission services as scams, going over how they use submit-your-site buttons to gather link popularity, and how many of them are also notoriously well known as email spammers.
By sourcing authoritative voices on the topic, or at least by developing my credibility beyond that of a random person sending emails, I would have had much greater success at link building.
And the other hidden tip sorta mentioned in the post is that while you have credibility, traffic, media coverage, authority, etc etc etc you should work harder at building links or spreading ideas, as your effort will be much more fruitful if launched on the back of some other success. Find a hit and run with it.
Sorta makes it compelling to create an open source site, or some site that has a pure sounding mission, which makes people want to heavily link at it, so that you can push that link popularity through the rest of your high profit network.
Are search algorithms saying every web based business should start off with a strong relationship to a socially conscious 501(c)(3) (or equivalent)?
Then again, even Google is paying Mozilla for making Google the default Firefox search engine, so it seems search engines MUST endorse donating to charities for search engine traffic.