Trusted vs Untrusted Links

About a year and a half ago I wrote an article called TrustRank and the Company You Keep which offered an image showing how many of the cheesy "buy PageRank here" type general directories were not well meshed into the web. That same image can be extended well beyond directories. Article submissions, reciprocal links, press releases, and other low effort low cost links put your site in a community of low trust sites. Even if the source originally had great trust, if they offer much greater value than cost, market forces such as:

  • other marketers using the same marketing techniques to promote low quality sites

  • improving relevancy algorithms

are going to neutralize the value. And then all you are left with is the risk.

Worse yet, a new site which is heavily co-cited alongside low quality sites may never be able to build enough quality votes to offset all of the non-quality votes. So after you gain too many garbage votes, even when you decide to put in the effort or spend the money necessary to get quality votes it may not matter. The site's status may be beyond repair.

And as long as you think of SEO as I need links I need links I need links then you are going to be more inclined to pick up a disproportionate volume of junky links, especially if you are not thinking of the web as a large social network. If you know your market well enough to read market demands then it is much easier to get editorial links that will hold value, and perhaps even increase in value as relevancy algorithms evolve.

Nothing is absolute of course, but it is all ratio driven. If the first thing you do with your site is put it in a community of low trusted sites then you are going to need to work much harder to develop a trusting relationship with Google. If you go for quality first then you have more room for error down the road.

Each engine has its own values which determine the quality of a link. Google is typically the best at scrubbing link quality, and Microsoft is generally no good at it. If the market seems so saturated that you think Google will be pretty much out of reach no matter what you do, then it might make sense to concede Google rankings and be a bit more aggressive about getting bulk low quality links to dominate Yahoo! and MSN.

When I started with SEO I ranked for search engine marketing inside of 9 months on like $300 just by getting whatever spammy links I could that had PageRank, but Google's algorithms have long since evolved. The fact that many votes count as negative votes means that you can't just pick off the easiest links pointing at competing sites and catch up that way. You have to get some of their higher quality links right away to have a good enough trust-to-junk ratio for the bad stuff not to whack you.
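The trust-to-junk ratio idea can be sketched as a toy calculation. To be clear, the weights and link sources below are invented purely for illustration; no search engine publishes trust scores, and real ranking algorithms are far more complex than a single ratio:

```python
# Toy illustration of a "trust-to-junk ratio" for a link profile.
# Scores are made up: positive = trusted editorial vote,
# negative = junk/spammy vote. Not a real ranking formula.

def trust_to_junk_ratio(links):
    """links: list of (source, score) with score in [-1, 1]."""
    trusted = sum(score for _, score in links if score > 0)
    junk = -sum(score for _, score in links if score < 0)
    # With no junk votes at all, the ratio is effectively unbounded.
    return trusted / junk if junk else float("inf")

# Hypothetical link profile: two quality votes, two junky ones.
profile = [
    ("edu-resource-page", 0.9),
    ("industry-blog", 0.6),
    ("spam-directory", -0.2),
    ("link-farm", -0.3),
]

print(trust_to_junk_ratio(profile))  # 3.0
```

The point of the sketch is simply that a few strong trusted votes can outweigh a larger count of weak junk votes, which matches the "ratio driven" framing above.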

Published: October 29, 2006 by Aaron Wall in seo tips

Comments

rk
October 31, 2006 - 9:43am

Excellent post, Aaron

What are your views on Trusted Content vs Untrusted Content?
Do you think a two-year-old site which was using low quality content for a couple of years can start ranking well with Google if the quality of the content increases? The site didn't have many IBLs earlier, but those can be developed once the quality of the content increases. Also, the site does not rank with Google but ranks well with Yahoo! and MSN. Or is it better to post the high quality content on a new domain so as to avoid any distrust with Google from the earlier content, if there's any such thing? Thanks.

October 31, 2006 - 10:35am

Content quality is largely determined by 4 main criteria, IMHO:

  • linguistics & naturalness - if you have "natural" copy it will rank for a far more diverse set of keywords...it will also help you rank better for the core terms. If the content is too heavily focused on a core keyword then that can suppress rankings.
  • link profile - if you have strong quality inbound and outbound links then it may be assumed that your content is of decent quality
  • uniqueness - if it is very similar to content on other sites, or you have internal navigational issues causing duplicate content, that can screw you over
  • site size and growth rate - the more age and link equity the more you can get away with here, but if your site grows rapidly relative to its history and other sites in your neighborhood that could be a sign of low quality content

Assuming you are in the index and rank somewhere, then you can easily improve your rankings for many keywords by adding content, building links, or improving the quality of the content on your site.

rk
October 31, 2006 - 3:02pm

thanks for your reply, Aaron - much appreciated

seopractices
October 31, 2006 - 6:15pm

thanks Aaron, as usual great information.

SEO BB
November 1, 2006 - 10:47am

You simply cannot consider anything on the web in isolation. What if you counteract the poor quality IBLs with some high quality OBLs?

BB

November 1, 2006 - 11:37am

Sure you can consider things in isolation. Each thing has the potential to help or hurt you. As a marketer all you are doing is risk analysis.

I presume that you can build the high quality outbound links AND focus on high quality inbound links.

October 29, 2006 - 6:50am

I think you're right, but do you think search engines are able to sort out an unwanted increase of low quality links?
i.e.:
A new site with a few but high quality links (.edu, .gov...)
gets a link or two on Wikipedia (legitimate ones).
The Wikipedia pages are mirrored a few hundred times.
The handful of high quality links are drowned in hundreds of links from MFA sites.

I hope search engines will be smart enough about those junky neighbourhoods.

nuevojefe
October 29, 2006 - 7:49am

Do it the opposite way (trusted domain, throw tons of crap links at it) and you're set... for now.

October 29, 2006 - 12:10pm

I was thinking... it is quite easy for people to build up a huge quantity of low-quality links.
Don't you think that it could be used as a "tactic" to lower the position of competitors ?

October 29, 2006 - 4:10pm

I have seen it done. And I have seen it both work and not work, on a case by case basis.

The newer and less established a site is the easier it is to blow them out of the SERPs...but older established sites pick up many automated scraper links just by ranking well, so Google wouldn't want to nuke those sites for something that is a direct result of being authoritative and ranking well.

October 29, 2006 - 7:33pm

"I was thinking... it is quite easy for people to build up a huge quantity of low-quality links.
Don't you think that it could be used as a "tactic" to lower the position of competitors ?"

It's an idea, but it's the kind of thing that requires the same amount of effort as it would building quality links for your own web properties.

So is it worth your time? Considering that you could be spending that time doing much the same thing but with quality for yourself...

October 29, 2006 - 11:59pm

It is actually far easier to build spam links for a competitor than it is to build quality links for yourself...like soooo easy to automate...but there is no guarantee it will hurt them.

jwm0z
October 30, 2006 - 4:47pm

Over the past 2 weeks I have submitted a site I'm working on to over 100 directories. The result - we finally popped up onto the first page of Google for our main keyword.

October 30, 2006 - 7:36pm

jwm0z, that may be true...but I'd be interested to see how long that lasts. A link-blitz may get you good rankings fast, but there is little retention as far as I'm concerned. Good content gets you links, albeit more slowly, and it also keeps good retention.
