Usage Data Will Not Replace Link Reputation

I am a big fan of usage data as an SEO and marketing mechanism (especially because usage data leads to editorial citations if the site is linkworthy), but I doubt usage data alone will fully replace linkage data anytime soon. Why? The web is but a small fraction of the overall advertising market. With search companies controlling so little offline media, you would hardly expect them to let ads outside their networks (i.e., paid links) control their relevancy, would you?

Why does Matt Cutts frequently obfuscate the policies on buying and selling links outside of AdWords? Because abusing links undermines Google's relevancy and Google does not get a cut of the action.

Google's current results are heavily biased toward informational resources. If Google were heavily reliant on usage data it would commercially bias its organic search results and make its paid ads seem less appealing.

Published: July 30, 2006 by Aaron Wall in seo tips

Comments

July 30, 2006 - 11:27pm

Yeah... many of the top 10 ranked pages that are .edu or .gov pages are NOT great pages, and are rarely visited... Google's love of .edu and .gov domains puts them there. Perhaps some usage data would remove those, but you're right, they won't use it. Fill the top 10 with what Google feels are "resources" and anyone selling things can use AdWords, even if many of the "resources" are crap.

July 30, 2006 - 11:38pm

I think Google has learned its lesson about leaning too heavily on any one ranking factor.

If usage data became anything more than just "another factor" it would quickly be subjected to all forms of abuse.

Hell, anonymizing proxies are a dime a dozen these days. It's not hard to spoof user data.

But I do believe that usage data does, and needs to, play a role; it just needs to be used correctly.

mblair
July 30, 2006 - 11:49pm

Another complicating factor is that when you have the market share in search that Google has, "good usage data" for specific sites becomes kind of a self-fulfilling prophecy, as any old junk Google sticks up at the top of the SERPs is going to have a huge advantage over what's on page 12 of the results.

But I doubt that usage data will be used so directly, since it is subject to manipulation of a specific site's results. (Think about an army of zombie computers reporting toolbar data, artificially bumping up usage metrics.)

Right now, I'd say that Google is probably still relying upon link reputation far more than it would like to in commercial queries. That is partially why they are so focused on pushing the Google Toolbar: access to additional metrics lets them "water down" link reputation in the most commercial sectors, where there is serious, well-funded competition to manipulate links.

I also think that the TrustRank concept is going to be expanded in such a way that it becomes a sort of formal "credit" assigned to a domain or group of domains with common ownership. Google could safely use the upper crust of the usage data (the hardest to influence) to help establish a grouping of "creditable" sites that influence overall rankings based upon whom they link to. Of course, if these sites are determined to be abusing their authority, their credit could be diminished, causing their own rankings to slip as well, creating an incentive for webmasters to conserve their "credit". Knowledgeable webmasters are already concerned with this, as witnessed by the widespread adoption of the nofollow tag by those in the know.

This kind of indirect usage of usage data could, I believe, supersede link reputation as a decisive factor in "qualifying" top commercial results fairly readily. In general, the more Google's algorithms are based on this kind of indirect conclusion, the more difficult they are to influence by those intent upon doing so.
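To make that seed-based "credit" idea concrete, here is a minimal sketch of TrustRank-style propagation, where trust flows out from a hand-vetted seed set and decays with each hop. The site names, damping value, and iteration count are illustrative assumptions, not anything Google has disclosed about its own system:

```python
def propagate_trust(links, seeds, damping=0.85, iterations=20):
    """Propagate trust from a trusted seed set through outlinks,
    TrustRank-style. `links` maps each page to the pages it links to;
    `seeds` is the hand-vetted "upper crust". Purely illustrative."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    # Seed pages start with equal shares of all trust; everyone else with none.
    trust = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    for _ in range(iterations):
        # Each round, seeds get a fresh injection of trust...
        new_trust = {p: (1 - damping) * (1.0 / len(seeds) if p in seeds else 0.0)
                     for p in pages}
        # ...and every page splits its damped trust across its outlinks.
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * trust[page] / len(targets)
            for target in targets:
                new_trust[target] += share
        trust = new_trust
    return trust

# Toy web: a trusted seed links to a good site, which links to a spammy one.
links = {
    "seed.edu": ["goodsite.com"],
    "goodsite.com": ["spamsite.biz"],
    "spamsite.biz": [],
}
print(propagate_trust(links, seeds={"seed.edu"}))
```

Because trust decays with every hop from the seeds, a "creditable" site that links to junk passes along (and ultimately bleeds) its own credit, which is exactly the incentive to conserve it described above.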

August 1, 2006 - 3:48pm

Usage data shouldn't be all-important. If it were, new, unique content would never become visible to the search engines, so linking is still important. You can't neglect usage data entirely, though, so it may be one factor among many.

August 3, 2006 - 1:04am

I have always found you to be the most honest voice in SEO as well as being generous with your tools!

Link-driven relevancy is a fragile model for most vertical segments, and Google's stated policies are a band-aid to preserve their approach. It is good to hear someone say out loud that the Google policy on link buying is a combination of bullying, blustering and illusion.

I asked the link buying panel at SES NYC whether the policy of trying to strip link credit for paid links and warning webmasters to use the nofollow tag on paid links (under threat of banishment) was ethical or hypocritical.

Most of the panel didn't want to take a stand. Greg Boser was the only panelist to directly take up the issue, agreeing that someone with a cynical view of the world might see it that way (anyone who knows Greg even a little knows his world view is pretty cynical).

My only concern is that while it is easy to assail the tower and burn the reigning monarch in effigy, what alternatives are better?

My other question is this, oh mighty oracle of SEO code: any clue how to create a link to the results page of a search query on an indisputable authority site?
A company I work for is listed as an Adobe Print Partner, but the page won't ever get spidered because the content is dynamic. I thought I might publish a link on one of my sites advertising them as an Adobe certified partner.

The search page is http://partners.adobe.com/public/partnerfinder/psp/show_find.do . I want to link to the results of a query for partners in Arlington, Texas, and thus give the spiders a path to find our listing. Any suggestions are appreciated.

August 3, 2006 - 1:25am

>what alternatives are better?

If I knew that and could implement it I would try to shop it, just like Larry Page tried shopping PageRank.

As far as showing your Adobe partnership, it would be better to link to a page like this
http://partners.adobe.com/public/partnerfinder/psp/show_provider.do?part...

August 3, 2006 - 7:42am

Aaron:

Thanks for the tip. Actually, the company is http://partners.adobe.com/public/partnerfinder/psp/show_provider.do?part..., but I appreciate the quick catch of something I missed. From your understanding of TrustRank, should a link from Adobe, on a previously unspidered page with a PR of 0 but a presumed TR of 10, be important?

As far as alternatives, do you remember an engine called Direct Hit that was around for about a year (2000-2001?) before it got gobbled up? They had great data showing what people actually clicked when they read SERPs.

All three major engines give much better relevancy than we saw in 2000, so this may not be the trump card it once was, but the next evolution in search will need to be heavily influenced by what users actually choose as the best (last navigated to?) result. Some of the social networking sites are trying to find the right formula, but the solution lies more in tapping the collective decision making of millions of users than in the bookmarks of a couple hundred thousand geeks and SEO/SEM types.

Something like the Google Toolbar thumbs-up, but built into the architecture of the ranking.

August 3, 2006 - 7:47am

An Adobe link should be a good thing, I would think.

Direct Hit was before my time, but I remember the model and reading about how often they had to rewrite their relevancy and spam detection algorithms.

I think the next step for search engine relevancy improvement probably comes more from priority content partnerships and a framework that supports higher quality content than from just tapping usage data.

August 3, 2006 - 8:20am

Aaron:

Direct Hit was a form of meta engine that ranked sites based on the data it got from its partners (GoTo, Yahoo and Excite, if I recall). That data was simply the most popular site chosen from the SERP for each term. Obviously the logic had flaws, but it was actually not easy to spam, because the user had to select the site from the SERP on the partner engines. The only way to spam Direct Hit was to get top clickthrough on the other engines.
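As a rough sketch of that click-popularity model, a tally of which result searchers actually clicked for each query, reordered by count. The click log and URLs below are made up for illustration; Direct Hit's real system was certainly more involved:

```python
from collections import Counter, defaultdict

# Hypothetical click log fed back from partner engines:
# (query, url the searcher clicked on that engine's SERP).
click_log = [
    ("adobe print partner", "partners.adobe.com"),
    ("adobe print partner", "partners.adobe.com"),
    ("adobe print partner", "example-print-shop.com"),
]

# Tally clicks per query.
clicks = defaultdict(Counter)
for query, url in click_log:
    clicks[query][url] += 1

def rank(query):
    """Order URLs for a query by how often searchers actually clicked them."""
    return [url for url, _ in clicks[query].most_common()]

print(rank("adobe print partner"))
# ['partners.adobe.com', 'example-print-shop.com']
```

The spam-resistance noted above falls straight out of the structure: the only input is clicks earned on the partner engines' SERPs, so gaming the ranking meant winning clicks somewhere else first.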

I remember it fondly because it let an SEO guy demonstrate his effectiveness to potential clients very easily. If your site shows up number 1 in Direct Hit for a keyword, you are getting more search traffic than any other company for that term.

The problem with content partnerships and AI frameworks for content analysis is that they both tend to heavily favor the entrenched companies with business development relationships and content staff. How can a small and highly relevant site hope to place well in the face of a large and entrenched content provider/network? More importantly, once a company is officially part of a content network, it has an opportunity and an incentive to monetize that the same way sites sell links today. It repeats the essential issue of how an engine can rely on external editorial judgements to determine rank and still prevent those sites from selling that editorial positioning.

It's hard to imagine a place for a freelance SEO guy or a small firm when the vertical is sewn up by a company with a content deal that is controlling the SERPs for the category. This kind of arrangement seems too close to the type of deals engines were trying to orchestrate when Google stepped in and stole the market.

I really love how the blogosphere snuck up on everyone and suddenly became so important for SEO and marketing in general. If nothing else, it forced individuals to publish their thoughts to the world instead of sharing them at lunch.
