The 100+ Ranking Variables Google Uses, And Why You Shouldn't Care

Continuing on with our community questions, here are a few requests for specific ranking information:

"What are the 100+ variables Google considers in their ranking algorithm?"

Cheeky :)

Easy to say, hard to do. Take a job at Google, work your way up the ranks and join the inner circle.

Another question we received is along the same lines:

"How do you outrank a super-established website in your niche, one where Google is giving them sitelinks and their domain is older?"

Again, easy to say, hard to do. Either forget outranking the domain and buy it, or spend time doing exactly what they have done, and hope they stop their SEO efforts long enough to let you catch up.

These types of questions arise often: "If I could just learn a few quick-fix insider secrets, I could outrank everyone!"

If there were a quick 'n' easy secret formula that guaranteed high rankings, why would those who know it reveal it?

The reality is that quick-fix secret formulas don't exist.

Sure, there are quirks in the algorithms that can be exploited, but they are often trumped by historical factors, like authority metrics, that are difficult to fake. One common blackhat technique is to hack an established domain and place "money" pages on it. That's an admission, if ever there was one, that technical trickery on your own domain is either too time-consuming or doesn't work so well.

I know some of the world's top SEOs, and I can't recall them spending much time talking about secret sauce. What they do talk about is making money and growing empires. They're more focused on the business strategy of SEO.

Many SEO techniques will lose their effectiveness soon, anyway.

What you need to think about for the future is user interaction.

The Future Of SEO

Have a read of this document, by my good friend and Merlot drinker, Mike Grehan. Mike outlines his view on the future of search, and he makes a number of important points:

  • The web crawler model is nearing the end of its useful life
  • Signals from users, not content creators, will become more important
  • Universal Search changed the ranking game forever
  • Forget rank, think engagement

If you want to future-proof your SEO strategy, take heed of Mike's words.

The crawler model is failing because the crawler was designed for structured text, not multimedia. The crawler can't see behind pay-walls. It has trouble navigating databases in which the data isn't interlinked or marked-up. The search engines will need to look for other ways of finding and making sense of data.
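
To make that concrete, here is a minimal sketch of the classic crawl loop, using only the Python standard library (the seed URL is just a placeholder). It can only discover what is publicly linked in fetched HTML, which is exactly why paywalled pages, form-driven databases and un-linked records never enter its queue. This illustrates the general model, not any particular engine's crawler.

    # A minimal sketch of the classic crawl loop (illustrative only, Python
    # standard library). It discovers pages solely by following <a href> links
    # in fetched HTML - which is exactly why paywalled pages, form-driven
    # databases and un-linked records never make it into the queue.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed, max_pages=50):
        queue, seen = deque([seed]), {seed}
        while queue and len(seen) <= max_pages:
            url = queue.popleft()
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except Exception:
                continue  # login walls, paywalls, errors: the crawler just moves on
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href)
                if absolute.startswith("http") and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
        return seen

    # Placeholder seed URL - anything not reachable by links from it stays invisible.
    print(crawl("https://example.com/"))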

Social networks, blogs, Twitter, etc. indicate a move away from the webmaster as the signaler of importance, i.e. who you choose to link out to. The search engines will need to mine the social signals from those networks. Users will signal where their attention is focused by their interactions and the paths they take.

Universal Search, in many cases, has pushed the results listings down below the fold. For example, getting a client seen high up on the results page may involve making sure they are featured on Google Maps. Similarly, if they have video content, it should be placed on YouTube. Google has shown they are increasingly looking to the aggregators for results and featuring their content in prominent positions.

That list of search results is becoming more and more personalized, and this will continue. Who knows, we may not have a list before too long. More and more "search" data - meaning "answers to questions" - might be pushed to us, rather than us having to go hunt for it.

The future of SEO, therefore, will be increasingly about engaging people. The search engines will be measuring the signals users send. In the past, it's all been about the signals webmasters send, i.e. links and marked-up content.
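
To make the contrast concrete, here's a purely hypothetical sketch. Nobody outside the search engines knows which user signals they actually measure, so the log format, field names and numbers below are invented for illustration - it simply aggregates clicks into two of the metrics most often speculated about, click-through rate and dwell time.

    # Purely illustrative: aggregate a hypothetical click log into the kind of
    # user signals often speculated about - click-through rate and dwell time.
    # The log format and the numbers are invented; the engines' real signals
    # and weightings are unknown.
    from collections import defaultdict

    # (query, result_url, was_clicked, seconds_on_page)
    click_log = [
        ("blue widgets", "https://example.com/widgets", True, 95),
        ("blue widgets", "https://example.com/widgets", True, 210),
        ("blue widgets", "https://example.net/thin-page", True, 4),
        ("blue widgets", "https://example.net/thin-page", False, 0),
    ]

    stats = defaultdict(lambda: {"impressions": 0, "clicks": 0, "dwell": 0})
    for query, url, clicked, seconds in click_log:
        s = stats[(query, url)]
        s["impressions"] += 1
        if clicked:
            s["clicks"] += 1
            s["dwell"] += seconds

    for (query, url), s in stats.items():
        ctr = s["clicks"] / s["impressions"]
        avg_dwell = s["dwell"] / s["clicks"] if s["clicks"] else 0
        # A very short dwell (the user "pogo-sticking" back to the results)
        # is often read as a dissatisfaction signal - again, speculative.
        print(f"{query} -> {url}: CTR={ctr:.2f}, avg dwell={avg_dwell:.0f}s")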

For now, you still need to cover the obvious bases - create crawlable, on-topic content, backed by quality linking. But you'll also need to think about the users - and the signals they send - in order to future-proof your site. Google has long placed the user at the center of the web. Their algorithms are surely heading towards measuring users, too.

What are these signals? Ah, now there's a question.....

Published: January 23, 2009 by A Reader in seo tips

Comments

dilipshaw
January 23, 2009 - 12:27pm

Take a job at Google, work your way up the ranks and join the inner circle.

By the time they do, will they really need a website? They will be old and happy with the fat salary Google pays them ;-)

OK... on a serious note, a good site sticking to the basics of SEO need not bother about the "100+ ranking variables" any search engine is using. Knowingly or unknowingly, it is already covering them.

SEOBook is one such example.

Dilip

CureDream
January 23, 2009 - 4:37pm

There are a few attacks on entrenched sites that work. It depends on the kind of site you're competing against.

One of them is the site that's a lot more general than your site but ranks for terms you're interested in. Wikipedia is a common example. A focused site with keywords in the domain name can win this war if you make a vigorous effort.

More generally, a general site can be attacked with a network of specific sites. I think a lot of people overrate the value of general keywords: one of my sites got a penalty that killed my ranking for a generic term I wanted, but my search traffic doubled because I improved my medium tail traffic. At one time, two generic queries brought in 20% of my traffic (about 10% each), but the medium and long tail brought in 80%. I could probably double or triple the traffic I get from my weaker generic, but that would just be a 20% increase for the site as a whole.

And, like Peter says, user engagement is the real key. If you want to beat an entrenched #1 site, it helps to make a better site. I've fought battles against sites that have been around for most of a decade: they were cool sites in 2001, but they're no match (in the long run) for the Web 3.0 sites I'm making today.

The Gypsy
January 23, 2009 - 6:06pm

What's funny is that talk of behavioral metrics seems to be more of a buzz in the SEO community than it is in the IR community. There are even those who feel implicit user feedback is a dead horse... while others simply haven't cracked that particular nut. Among the problems at this point are:

*Noise – trying to understand intent/satisfaction through actions is problematic at best.

*Resource heavy – more signals = more processing – in a slow economy there are considerations to be had here.

*Click bias – research shows that people trust search results and click in SERP order, not necessarily based on quality – which leads to false positives (a toy illustration follows below).

*Spam – personalized search is more likely, as it's hard to spam oneself – Ralph at Fantomaster wasted no time in digging into potential ways to manipulate these signals in the regular index.

Now, research into implicit/explicit user feedback has shown SERP improvements; it simply hasn't dealt with some of the related problems (addressed above).
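
On the click-bias item above, the usual research workaround is to normalise each result's observed click-through rate by the rate you'd expect at its position, so a page that out-performs its rank stands out. A toy sketch, with made-up position priors and CTR figures:

    # Toy illustration of correcting for click (position) bias: divide each
    # result's observed CTR by the CTR you'd expect at its rank. The position
    # priors and CTR figures below are made up purely for illustration.
    position_prior = {1: 0.42, 2: 0.16, 3: 0.10, 4: 0.07, 5: 0.05}

    observed = [
        # (url, rank, observed_ctr)
        ("https://example.com/a", 1, 0.40),
        ("https://example.com/b", 2, 0.18),
        ("https://example.com/c", 5, 0.09),
    ]

    for url, rank, ctr in observed:
        adjusted = ctr / position_prior[rank]
        print(f"{url}: raw CTR {ctr:.2f}, bias-adjusted score {adjusted:.2f}")
    # The page at rank 5 beats its expected CTR by the widest margin - something
    # a naive "most clicks wins" metric would miss entirely.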

I think SEOs should look into areas such as Personalized PageRank, as it is more likely the route the folks at Google are utilizing at this point, and into explicit signals (for personalization) such as Search Wiki and the more recent tests with Preferred Sites.
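
For anyone who hasn't dug into it: Personalized PageRank is essentially the standard PageRank iteration with the random "teleport" jump biased towards a user-specific set of pages instead of spread uniformly over the whole web. Whether this is literally what Google runs is anyone's guess; the toy graph and preference set below are invented purely for illustration.

    # Minimal Personalized PageRank sketch on a made-up link graph. The only
    # change from vanilla PageRank is that the random "teleport" jump lands on
    # the user's preferred pages instead of being spread over all pages.
    def personalized_pagerank(graph, preferred, damping=0.85, iterations=50):
        pages = list(graph)
        teleport = {p: (1 / len(preferred) if p in preferred else 0.0) for p in pages}
        rank = {p: 1 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {}
            for page in pages:
                incoming = sum(
                    rank[src] / len(graph[src])
                    for src in pages
                    if page in graph[src]
                )
                new_rank[page] = (1 - damping) * teleport[page] + damping * incoming
            rank = new_rank
        return rank

    # Toy link graph: which pages link out to which (invented for illustration).
    toy_graph = {
        "home": ["blog", "shop"],
        "blog": ["home", "shop"],
        "shop": ["home"],
        "forum": ["home", "blog"],
    }

    # A user whose explicit signals (say, a preferred-sites list) favour the blog.
    print(personalized_pagerank(toy_graph, preferred={"blog"}))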

While engagement data may not be completely ready for prime time, it is good webmastering to consider it, as doing so will lead to a better user experience.

2c....

hugoguzman
January 23, 2009 - 6:15pm

Great piece! I would argue that citation of some kind (inbound links or other types of "citations") will always play a role in natural search, but I think that your overall idea is spot on.

Sadly, there are still a lot of people making decisions for small and large companies that think SEO is about tricks and secret sauce.

MikeTek
January 23, 2009 - 6:18pm

Couldn't agree more - and as someone who has offered SEO services, I'm honestly tired of talking about "SEO tricks." Instead I'm telling existing and prospective clients that while you still need sound code and site structure, there are few "tricks" left. Getting backlinks is more about adding true value to the user experience, creating a resource or otherwise engaging the user - and you can't fake your way through that.

CureDream
January 23, 2009 - 6:54pm

@Gypsy,

a simplistic use of behavioral metrics would simply put the SERPs under the control of black hats. The issues are that:

(i) participation rates in web community systems are 5% of readership rates, and
(ii) to fool search engines into giving you a certain amount of traffic, you'd need to simulate the amount of traffic you want to get

Today's comment spammers generate thousands, sometimes tens of thousands, of web requests against my sites every day. If they're lucky they get one link a month for their trouble. On the other hand, there's a directory out there that uses voting to decide the order of links. A cron job that votes once a day is sufficient to keep a site at the top of the listing and bring in another 20-30 direct visitors a day.

User engagement, however, is a different matter. If you can get people excited about your site, get them to spend time on it, get them to make links about it, get them to share it with their Facebook friends or tweet about it, you can really pump your traffic.

Tom McCracken
January 23, 2009 - 9:46pm

Nice post.

Going in the direction of Web 3.0

fantomaster
January 24, 2009 - 5:39pm

Make no mistake: be it social "engagement", "personalized search", "personalized PageRank" or what have you - as long as it's automatable, it's eminently spammable. That technology is already there and it's being used at this very moment - still pretty primitive in many respects, maybe, but let's not forget that "spam" tech evolves as much and as fast (in very many areas actually a whole lot faster...) as search. The better the ROI, the more efficient your spam tech will get.

From what Dave Harry and Bill Slawsky and others monitoring the patents circus keep digging up, implicit user data positively sucks at creating useful, actionable and reliable signals to help determine a page's relevance for whichever search phrase, so personally I'm not too bullish on seeing this tech deployed on a large scale as yet.

Arguably the more important question here is whether we'll see an end of generalized search altogether anytime soon. The more granular user data gets, and the more of it the major search engines are able to accrue, the easier it may seem to say good-bye to the old one-search-for-all model. Mike Grehan's absolutely right on that score IMV.

But there are also some pretty important non-technological factors to consider: for example, the European Union (esp. the Brussels Commission) already views US predominance in search as a prime informational and essentially political, social, you might even say strategic issue, which is why they're pumping tons of taxpayer money into competing projects of their own.

Not that I expect a lot to come of this within the foreseeable future, but it does positively indicate that the political class is growing aware of what kind of an informational monster they're actually dealing with. And they are getting dead set on having a huge chunk of the pie for themselves.

At the end of the day, for the ruling elites it's always about who dominates "reality production", because this translates 1:1 to mass control. So five will give you twenty that they won't be sitting on their hands watching it all stream by unaddressed for very much longer. It's been happening in China, North Korea, Singapore, Iran, Germany and other beacons of liberalism - admittedly on an even more primitive technological level than the majority of current social network spam, but the initial claims have been staked and established, which was and is the most important part of the process. The rest will simply be a question of evolution and expansion of administrative powers, political and social censorship, law enforcement, etc.

And at this point in time it's really anybody's guess what kind of proactive follow-up operations this will trigger.

Again: none of this relates to search technology as such, but we can definitely expect it to be impacted dramatically by such developments.

Aaron
January 24, 2009 - 8:07pm

"Reality production" - I love that phrase Fantomaster. Are there any books you would recommend about that topic?

bjewelled
January 25, 2009 - 11:11am

Aaron,

perhaps, "1984"

deMib
January 25, 2009 - 12:06pm

Mike is a good friend of mine too and I like him a lot, but he sometimes makes the same "mistake" I used to make: looking too much at what may (or may not) happen in 2, 5 or 10 years.

From an academic point of view it's VERY interesting. I admit that (probably why I fall into the trap myself). But from a production SEO point of view it may not always be so interesting or useful.

One good example of my own "mistakes" in this area is "natural language", which I have been arguing for almost 10 years is a factor search engines WILL have to deal with better at some point. Understanding the true meaning of what people search for and the meaning of documents (not just the words) is key to me - still is.

However, it is only now (the other day) that Google has said they will (probably) be doing more of that this year. So far it just doesn't count.

Social indicators and user data are still very theoretical - even more so than human language computing is. It's an interesting theory, but we have yet to see that it really works well, scales, and will take over as a key factor.

Link data used to be like that. When it was introduced, everyone (at least those working at the search engines) thought spam would be dead. Now, is it? :)

PeterD
January 25, 2009 - 8:39pm

Thanks Ralph. Good to hear from you :)

I agree that these metrics, like all the metrics they've used thus far, are spammable. They're also low quality, but then that hasn't stopped the search engines from using low-quality signals in the past either, i.e. the link graph, on-page density, etc.

As Mike pointed out, the search engines are faced with changing web usage patterns. People used to build and publish websites. Now, a lot of people are twittering, facebooking, watching video and otherwise engaging with each other, within established platforms.

The old SE metrics don't appear suitable....

fantomaster
January 26, 2009 - 6:33pm

@Aaron: Thanks for your comments. In terms of books focusing on the "reality production" topic, I think it's an excellent idea to ramp up a list of sorts some time. (But it may really take a while: while I fully agree with bjewelled's suggestion of "1984" - Orwell's "Animal Farm" being another fictional case in point, as are Huxley's "Brave New World" and most of Borges's stuff - the topic touches on so many different disciplines, ranging from social studies to mass psychology, behaviorism, structuralist anthropology, advertising, political history, etc., all of which makes it a lot harder to pin down than one would like.)

@deMib: Indeed, we all have to be extremely careful not to confuse registered patents with actually deployed (or even deployable) technology. Generally speaking, Mike hasn't been off the mark too much in his previous "predictions" (well, they're really extrapolations, aren't they?), so I'll always root for taking them seriously.
But you're quite right in that we as SEOs tend to obsess about lots of issues the search engines themselves couldn't really care less about.
The darn thing being that we'll usually only find out after the event, just like with any old roulette "system", lol.

@PeterD: Long time no see, hope you're well.
Yep, old SE metrics are rapidly falling out of favor, and yep again, even the most obvious spammability hasn't prevented the engines from going for whatever tech happened to enthrall their current executives' whimsies anyway.

Yes, Web usage patterns are mutating into all sorts of chaotic sub groups once more: they've done it before e.g. when people abandoned human edited directories for crawling based search engines; or when they began hitting forums in search of more qualified information; or the current Twitter, Facebook, Myspace, and StumbleUpon fads, and there's no indication it won't happen again.

Guess the engines will see themselves forced to get ever more granular in defining their own markets/clientele - which is possibly what all that "personalized search" hoopla is really about in the first place. At the very least, the classic "one size fits all" brute force search model is increasingly beginning to look like a major loser strategy.
