TrustRank Algorithm

A buddy of mine pointed me to a paper by Zoltan Gyongyi, Hector Garcia-Molina, and Jan Pedersen about a concept called TrustRank (PDF).

Human editors help search engines combat search engine spam, but reviewing all content is impractical. TrustRank places a core vote of trust on a seed set of reviewed sites to help search engines identify pages that would be considered useful from pages that would be considered spam. This trust is attenuated to other sites through links from the seed sites.
TrustRank can be used to:

  • automatically boost pages that have a high probability of being good, as well as demote the rankings of pages that have a high probability of being bad.

  • help search engines identify which pages should be good candidates for quality review

Some common ideas that TrustRank is based upon:

  • Good pages rarely link to bad ones. Bad pages often link to good ones in an attempt to improve their hub scores.

  • The care with which people add links to a page is often inversely proportional to the number of links on the page.
  • Trust score is attenuated as it passes from site to site.

To select seed sites, they looked for sites that link to many other sites. DMOZ clones and other similar sites produced many non-useful seed candidates.

Sites which were not listed in any of the major directories were removed from the seed set. Of the remaining sites, only those backed by government, educational, or corporate bodies were accepted as seed sites.

When deciding which sites to review, it is most important to identify high-PageRank spam sites, since they are the most likely to show in the results, and it would be too expensive to closely monitor the tail.

TrustRank can be bolted onto PageRank to significantly improve search relevancy.
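As a rough sketch of the mechanics, TrustRank can be thought of as a biased PageRank: instead of the random jump landing on any page uniformly, it lands only on the hand-reviewed seed set, trust is attenuated by the damping factor at each hop, and a page's trust is split among its outlinks. This is a simplified illustration, not the paper's exact implementation; the graph, seed values, and parameters here are made up:

```python
def trustrank(links, seed_trust, alpha=0.85, iters=50):
    """Biased-PageRank sketch of trust propagation.

    links[i] is the list of pages that page i links to.
    seed_trust[i] is nonzero only for hand-reviewed good seed pages.
    alpha attenuates trust at every hop; the (1 - alpha) "teleport"
    mass returns to the seed set rather than to all pages uniformly.
    """
    n = len(links)
    t = list(seed_trust)
    for _ in range(iters):
        nxt = [(1 - alpha) * seed_trust[i] for i in range(n)]
        for i, outlinks in enumerate(links):
            if outlinks:
                share = alpha * t[i] / len(outlinks)  # trust splits evenly
                for j in outlinks:
                    nxt[j] += share
        t = nxt
    return t

# Chain 0 -> 1 -> 2 with the seed at page 0: trust decays at each hop,
# and page 3, unreachable from any seed, accumulates no trust at all.
scores = trustrank([[1], [2], [], []], [1.0, 0.0, 0.0, 0.0])
```

Note how a spam page that nobody trusted links to ends up with zero trust no matter how many spam pages link to it, which is the property the paper relies on.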

Published: February 7, 2005 by Aaron Wall in technology


October 19, 2006 - 4:16pm

Would creating links to high value sites increase one's TR?


August 13, 2007 - 7:36am

Google seems to have a large focus on combating web spam. I am quite new to the concept of TrustRank, but I have been following the "nofollow" vs. "dofollow" debate.

While there is an obvious benefit to saying "this link is more trusted than that link", my question would be whether it remains democratic, as a bias towards an .edu domain is still a bias.

As a soon-to-graduate student, I can tell you that there is a fair amount of spam on university web hosting. It almost feels a little archaic that government TLDs are given more trust - they have huge budgets and thousands of employees chattering away and creating link bait. Their credibility should come from this alone, and not the fact that they are a government department.

And although I agree that making quality links is a matter of having good enough content that people will link to you, an organisation which can pay 50 employees to write articles will have a major advantage over a company that can only employ one. In the end while it comes down to quality content, it can also come down to budget.

Nonetheless, I found this article very informative. I know there are sites that demonstrate a possible algorithm for PR; are there any reliable ones for TR?

charles turner
July 18, 2007 - 10:17pm


I think we are starting to see some real differences in Google.com and some of the country-specific Googles.

If I search in the UK for the search term conservatory, there are no edus, wikis, orgs, govs in the top ten. Just commercial sites. If I search in the .com version, half of them fall into this category.

Could it be that Google is being country-specific in how it adjusts its algorithms for TrustRank?

May 16, 2007 - 8:50am

TrustRank, like PageRank, will again get diluted by spammers, the same way link building has been spammed.

July 15, 2006 - 7:00pm

I think TrustRank is probably becoming the single most important factor in rankings now, since we see how Google loves old and trusted sites and they're always ranked better, as are their subdomains.

Do you think pagerank and trustrank are somehow related, e.g. one is a part of the other?

July 15, 2006 - 7:24pm

There is also another paper about link spam mass that ties in nicely with the Trustrank paper.

I think in many ways raw PageRank is far less important than it once was, and I think some of the easier ways to manipulate it have also been devalued by algorithms that scrub link quality.

Most high trust link sources probably have a decently high PageRank on their site as well.
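For the curious, the core idea of that link spam mass paper can be sketched roughly like this: estimate how much of a page's PageRank is contributed by a known-good core of pages versus by everything else, and treat a large "unexplained" fraction as a spam signal. A simplified illustration; the function name and the notion of a good-core score as inputs are my framing, not the paper's exact formulation:

```python
def relative_spam_mass(pagerank, good_pagerank):
    """Fraction of a page's PageRank not explained by good-core links.

    pagerank: the page's full PageRank score.
    good_pagerank: the PageRank the page would receive if the random
    jump were restricted to a trusted good core (a TrustRank-style
    score). Values near 1.0 suggest most of the score comes from
    spam links; values near 0.0 suggest mostly legitimate links.
    """
    if pagerank <= 0:
        return 0.0
    return max(0.0, (pagerank - good_pagerank) / pagerank)
```

So a high-PageRank page whose score is almost entirely unexplained by trusted sources is exactly the kind of candidate worth flagging for review.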

February 7, 2005 - 9:55pm

Excellent link and paper, Aaron. I wouldn't be surprised to see human editors become a large part of search engine innovation over the next few years.

July 16, 2006 - 7:00am

Nice comments, but how do you determine the best keywords for PageRank that will work? If you have a UNIQUE product or service with no proper-noun name, the only way to develop keywords is to use multiple high-value keywords (technology product or service). A searcher would almost be barred from finding the most appropriate answer to the query, since the keyword value is based on selling products and services.

How do you solve THIS problem? In the early days of the internet, there wasn't such a "high value" for keywords. Now with internet marketing, this original benefit of the internet is no longer available. There needs to be a NON-marketing, non-selling search engine and PORTAL that does not allow ANY promotion. That way the keywords for this search engine would have no economic value.

I've tried filtering out words such as sale, best, worst, expensive, free, low cost, budget, etc., but the sales/marketing spam keeps creeping in. Any ideas?

It sounds easy but it isn't. Let's say you develop a device that has never existed before and that will solve problems for a niche of potential researchers or users. You need to use a cultural language requiring lots of descriptors: adjectives, adverbs, nouns, most of which would be high-value (high-cost) keywords. If you aren't selling anything, you have a bigger problem. Yes, you can always publish info about it, but what about keywords? You will always have to rely on the cultural language (English or?) to describe anything about the object.

November 30, 2006 - 5:33am

I wish I knew when my sites would be ranked.

July 16, 2006 - 9:30pm

Language is not fixed or static.

New markets open daily.

When this site started Overture showed zero search volume for SEO Book. Recently it has been getting over 1,000 referrals per day for that query.

If you are not selling anything and you share high quality content then it is typically much easier to get free links than if you have a commercially oriented site.

October 22, 2006 - 11:44pm

Searching within the tourism/leisure sector in which I operate returns mostly official tourist sites for a particular town/region. Is this TrustRank in action? Often the .gov site is either at the top of the SERPs page or close - followed by other official or semi-official sites - such as the university or football club or similar established sites.

This didn't used to be the case - a search for a resort town would produce loads of accommodation directories and similar. So from what I can see, TrustRank seems to be in operation. That's OK if you are highly trusted, but what happens if you're a small commercial operator? Low trust and low ranks, presumably? How do you gain TrustRank, and is it more important than PageRank?

October 23, 2006 - 12:57am

Hi David
What you are describing is TrustRank (or some similar or derivative type of algorithm) in action, plus Google getting better at filtering out low quality links, plus Google placing less weight on anchor text, plus Google getting better at understanding linguistics and natural writing.

I think it is mostly the first of those four that is really taking effect, but the other three largely play in as factors as well.

May 9, 2006 - 10:47pm

Great stuff.
My frustration is that it's difficult to establish a site that you'd like to get widespread use, primarily because it's a tool that would benefit a lot of folks but isn't really a big money maker. (Right now trying to get our site started has been a challenge, but we'll keep pushing on.)
Thanks for the helpful tips Aaron... keep up the good work!

June 27, 2006 - 6:48pm

I use LinkMachine for reciprocal links; it has an automatic link exchange for everyone that uses the system.

So are you saying that it is worthless and I should manually make a links page, updating it on my own without software?

I was on page one of Google searches, then 3 months ago went to page four. I have been following the guidelines of the SEO Book and some info from forums and got to page 2.

Still trying to get back to page 1. I submitted my site to the main directories recommended in the SEO book. I did that last month. But they are not showing as back links in google yet.

Any advice?


June 27, 2006 - 7:53pm

Google only shows a sample of backlinks and they only update that sample every so often.

I would move away from using widespread systems that were generally used by low quality sites. Automated link exchange stuff is not usually used by too many real sites, so having that type of a footprint in your site is probably not a good thing if you want to rank well in Google.

November 4, 2006 - 2:28am

Is TrustRank reversible, kind of like PageRank, where it can go up or down? Let's take a hypothetical case where a site got links from "bad neighborhoods" and then later gets more legitimate links from major trusted sources. Would the addition of "good links" offset the bad links and restore the TrustRank of a page? Can anyone share any experience with TrustRank?

April 27, 2005 - 10:29am

I have a feeling this will make people buy even more links on high PR (TR) pages. Sure it'll be harder to get a link on such a page but still not impossible and getting on one will probably have such a positive impact on ranking that people will pay a lot for it.

lee johnson
July 12, 2007 - 4:26pm

Very interesting article. I feel this is the way search engines should go. Internet marketing is common sense. Why give a recommendation if you think little of them? You don't. It's the same with inbound links: you only get them if the owner of the site who is linking to you feels you have something that could be of use for their viewers/customers. Roll it out and let's see the spam boys drop like a stone.

November 4, 2006 - 2:07pm

I think if you go too far it can be quite hard to dig yourself out of a hole if you already incurred some penalties, but if you didn't dig too deeply it can probably be reversed.

November 5, 2006 - 6:54pm

Thank you Aaron for your thoughts and advice. I haven't done anything that could be "too far"... It is a reputable site and I am trying to do my best to increase popularity and rank "the correct" way... The only thing I might have done borderline "questionably" were some links from 2-3 blogs that might be considered "overdoing"... I deleted those since... What is interesting is the fact that on some Google datacenters we are still ranked for a handful of keywords in the top 10 (vs. #1) and on other datacenters we are not ranked on some of those at all... sometimes we show, sometimes we don't show at all... (we lost many other rankings). This makes me wonder if it isn't a "Google algorithm glitch"...

September 11, 2006 - 8:35pm

Hi Aaron,
When is your SEO Book going to have a major update to remove all the old stuff that does not work anymore and add what is current and effective? Topics such as proper link exchange, how to get one-way links, etc.

Also, does a link from an internal page of an .edu have any benefit? For example,
if this person has a personal web site, and has a links page and gives me a link, do I get any benefit at all? How important?

September 12, 2006 - 11:02am

Hi Manuel
I would not describe my book as being full of old and outdated and ineffective information. If you did not find any information about building one way links I would suggest that you did not read it very closely.

June 2, 2005 - 4:20pm

Haha, can't wait. Going to screw over all the PR spammers, SEO wannabes, text link brokers, and all the idiots who keep emailing asking for a link exchange.

January 28, 2007 - 11:03pm

My website was registered on the 29th of December, obviously with PageRank 0. After the Google Dance on January 27, my PageRank was changed to 4. How could this be possible, since I wasn't linked on DMOZ?

May 17, 2006 - 8:30am

Searches have to go back to being human edited. It is too easy to spam a site to the top through link farming and copying a line of content from multiple sites. Besides that, based on mathematical algorithms and bits and bytes, they can't see a web site as a human visitor does, which in the end makes their evaluation 40% useless.

As an example, I could write a programme to translate your web site into German. If you do not speak German, how do you know it's a 100% bona fide translation? The only way is to get a German-speaking person to proofread it... I would guarantee that the translation made by a computer is only 40% correct. It can't read "between the lines", can't evaluate emotions, or decide if the content is really valuable or only a collection of keywords. As for TrustRank, what you're basically saying is, if I'm not in some way linked to a government body, university or international business, I may as well forget it. It will not work. Why? I'll just pay a student to build his home page on his university web site, and link from that. It has to go back to human editing.

May 17, 2006 - 1:01pm

Hi Jon
Humans have biases just like algorithms do. But they are more expensive than computers are.

Keep in mind that to some extent evaluating links IS using humans to help organize the web.

Steve Nelson
August 5, 2007 - 2:03am

This all reminds me of something the first really smart SEO consultant told me... Don't try to outsmart the search engines... Instead, work to improve the user experience and the value of your site... And be natural in your promotion of the site.

Anyway, thanks for a good article. Very useful.

A Reader
July 30, 2005 - 5:35am

what pages should be good candidates for quality review


September 30, 2005 - 8:35pm

>what pages should be good candidates for quality review

High PageRank ones which lack matching high quality indication scores.

September 19, 2007 - 11:34am

It was a very helpful article. I am waiting for your next release, and be sure that I will be your best customer.
Thanks for providing us with useful information about website building and SEO strategies to promote our websites on the Internet.

April 3, 2008 - 11:38am

My site was top ranked in Google for the keyword "offshore outsourcing", but the algorithm changed and my site no longer ranks for it. Why?

July 9, 2008 - 2:53am

That is a good point. Is TrustRank reversible? Can you dip into the negatives like PageRank?

November 14, 2008 - 10:04pm

Great PDF and read. A lot of people misunderstand link building and keep building tons of spammy links, not realising that if they removed a few spammy links and improved their content they might already be good enough to rank well.
