If Google dials up its weighting on large authority sites before Christmas, maybe the solution is to buy ad pages on some of them. I bet there are some great underpriced ad links and advertisement pages if people would look hard enough.
Link to Jim Boykin's new tool...still a bad tool name though, IMHO.
I have not put much effort into following most directory-type sites that use redirect links (especially if they are not ranked well in the related search results), but will engines eventually count many weird links as votes if they notice that users click on a link and like what is on the other end of the series of redirects? Will those links ever count as much as, or more than, static links that never get clicked on?
2005 SEO - Yahoo and MSN: pound with lots of links at once and keep pounding with anything you can get for backlinks, with a focused backlink text campaign. With Google, the older the site the better; slow and steady link building with a large variety of backlink text wins (notice it's the opposite of Yahoo and MSN).
I think that for most searches, the top 10 will consist mostly of these types of pages. I think Google does this on purpose to show a variety of types of pages to the user.
If you're targeting a phrase, you should start by figuring out what type of result your site will be, what its role is in the top 10, who your "real" and "direct" competitor is, and what it will take to replace them.
Some information gets smarter with more input, and some gets less smart with more input. One of the hard parts about SEO is that everything is debatable. Some additional opinions will poison data, whereas others will make it way better.
Beyond the debatable points, getting input from the right people makes some data seem much more credible and much easier to spread. If you seek input from the right people in your industry, like Danny Sullivan in search, you can help ensure that an idea spreads far and fast.
I bet that within a month or two that page will be the best-linked document on the SEOmoz site, and it is something just about anyone can do in any industry: design problems, site usability problems, gardening problems, airplane landing problems, etc.
The key is knowing who to ask for help and being trusted enough that they want to help you. Of course, you also have to appeal to their ego. Other than including MM in the data sources, I don't think there is much Rand could have done to make that page more linkable. I love the smiley faces.
Another nice thing about the page is that it could be redone, asking the same questions of self-proclaimed search spam gurus. Give DaveN, Greg Boser, Oilman, Baked Jake, and a few other guys the same set of criteria and see how they answer it. Then the responses can be cross-compared.
A Google Zeitgeist of SEO factors with biggest gainers, biggest losers, and a top ten would be amazing link bait that reminded people to visit frequently and link in every month. And then maybe redo the whole ranking factors survey once a year or so.
If you are struggling for creative ways to uniquify your content and make it more linkable, sometimes a good technique is to search an old established community site for a common word. The older the community site the better, as there will be lots more random stuff from before the web became so commercial.
Interesting tactic by Google. If too many pages on the same site trip a duplicate content filter, Google does not just filter through to find the best result; sometimes they filter out ALL the pages from that site.
This creates an added opportunity cost to creating keyword driftnets & deep databases of near identical useless information. One page left in the results = no big deal. Zero pages = big deal.
Not only would this type of filter whack empty junk directories, thematic screen-scraper sites, and cookie-cutter affiliate sites, but it could also hit regular merchant sites with little unique information on each page.
On commercial searches many merchants will be left in the cold & the SERPs will be heavily biased toward unique content & information dense websites.
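The site-wide behavior described above can be sketched as a toy filter. Google's actual duplicate detection is not public; word shingling with Jaccard similarity is just a common textbook stand-in, and the thresholds and sample pages below are invented for illustration.

```python
# Toy site-level duplicate-content filter. The real algorithm is
# unknown; shingling + Jaccard similarity and the thresholds here
# are assumptions for illustration only.

def shingles(text, k=3):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def filter_site(pages, sim_threshold=0.7, dup_fraction=0.5):
    """If too many pages are near-duplicates of each other,
    drop ALL of the site's pages, not just the extras."""
    sets = [shingles(p) for p in pages]
    dup_count = sum(
        1 for i, s in enumerate(sets)
        if any(jaccard(s, t) >= sim_threshold
               for j, t in enumerate(sets) if j != i)
    )
    if dup_count / len(pages) >= dup_fraction:
        return []  # zero pages survive, per the behavior above
    return pages

site = [
    "buy blue widgets cheap blue widgets for sale today",
    "buy blue widgets cheap blue widgets for sale now",
    "our company history began in a small garage in 1990",
]
print(len(filter_site(site)))  # 0 -- the whole site is filtered out
```

The point of the toy thresholds: one leftover near-duplicate page is tolerable, but once most of a site trips the similarity check, everything goes, which is exactly the opportunity cost described above.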
If your site was filtered, there is always AdWords. And if there are few commercial sites in the organic results, then the AdWords CTR goes up. Everyone is happy, except the commercial webmaster sitting in the cold.
Yet another example of Google trying to nullify SEO techniques that work amazingly well in its competitors' results. I wonder what percent of SEOs are making different sites targeted at different engines' algorithms.
I have to be somewhat careful watching these types of duplicate content filters, because I have a mini sales letter on many pages of this site, and this site could get whacked by one of these algorithms. If it does, changes will occur. Perhaps using PHP to render the text as an image, or some other similar technique.
Recently I did a paid consult with a person who runs a number of websites who wanted to increase his AdSense earnings. He wanted to know the secret of tweaking in page copy for SEO perfection.
As he kept tweaking his page copy he kept raising the keyword density and unknowingly pulling out some of the modifiers and other semantically related terms.
Since his site did not have an amazing authority score, he was not ranking for the most common terms. Most of his traffic was coming in from longer queries. As he tweaked the page copy his pages became less linkable / linkworthy, and he removed many of the terms that were responsible for the 3 and 4 word queries that were bringing visitors to his website. His traffic kept dropping, so he kept tweaking. Traffic kept dropping, keep tweaking, repeat cycle...
The site might be absolutely offensive to a ton of people, but that site will likely get links from BOTH people who like it AND people who hate it. The site is equally unique and offensive, which is something that is oh-so-easy to link at.
One well-known search engineer in the past also recommended creating a grammar nazi site that went around fixing everyone's broken grammar and linking back to the home site.
The same words, sent in the exact same way, carried two completely different meanings. In the "default" case, it's just another shill hawking just another product. In the second, it's a real request from a real person who is not even directly involved with the product, who happened to think it (and, more importantly, the folks involved with it) were neat, and wanted to get the word out.
Same words. Same medium. Very different meanings.
I think some of the people email spamming with poor English would probably do far better if they also tried sounding young, or whatever, in some of the messages. Sounding authentic is the key.
Most of my link requests talk about other subjects as well. If only I were 20 years younger...arg..am...getting...old.
I really like the Threadwatch tagging thingie. It's where I found this link & is a good way for people to submit stories without actually having to submit them :)