Search Engine Optimization - Evolution or Extinction?

The following is a guest blog post by Jeremy L. Knauff from Wildfire Marketing Group, highlighting many of the recent changes to the field of SEO.

Marketing is constantly evolving and no form of marketing has evolved more over the last ten years than search engine optimization. That fact isn’t going to change anytime soon. In fact, the entire search engine optimization industry is headed for a major paradigm shift over the next twelve months. Like many of the major algorithm updates in the past, some people will be prepared while some will sit teary-eyed amongst their devastation wondering what happened and scrambling to pick up the pieces. Unlike the major algorithm updates of the past, you won’t be able to simply fix the flaws in your search engine optimization and jump back to the top of the SERPs.

Why is this change going to be so different? In the past, the search engines have incrementally updated certain aspects of their algorithms to improve the quality of their SERPs; for example, eliminating the positive effect of meta keyword stuffing, which was being abused by spammers. Anyone who has been in the SEO industry for more than a few years probably remembers the chaos and panic when the major search engines stopped ranking websites based on that approach. This time around, though, we’re looking at something much more significant than simply updating an algorithm to favor particular factors or discount others. We are looking at not only a completely new way for search engines to assign value to web pages but, more importantly, a new way for search engines to function.

Local search

A number one ranking for a particular keyword phrase was once the end-all, be-all goal, but now many searches are regionalized to show the most relevant web pages located in the area you are searching from. While this will probably reduce your traffic, the traffic that you do receive will in many cases be more targeted. Additionally, it gives smaller websites a more equal chance to compete.
[Image: local.jpg]
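
To picture how regionalization can reshuffle results, here is a minimal sketch of blending a page’s topical relevance with its distance from the searcher. Every name, weight, and number in it is invented for illustration; it is not Google’s formula.

    def regionalized_score(relevance, page_km_from_user, locality_weight=0.4):
        """Blend topical relevance with how close the page's business is to the searcher.

        A toy model only; the parameters and weights are made up for illustration.
        Nearby pages keep most of their relevance, distant ones are discounted.
        """
        proximity = 1.0 / (1.0 + page_km_from_user / 50.0)
        return (1 - locality_weight) * relevance + locality_weight * relevance * proximity

    # A moderately relevant local plumber can outrank a slightly more relevant
    # national directory for a searcher in the same city.
    print(regionalized_score(relevance=0.7, page_km_from_user=5))     # ~0.67
    print(regionalized_score(relevance=0.8, page_km_from_user=3000))  # ~0.49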

Google suggest

This August, Google Suggest was moved from Google Labs to the homepage, offering real-time suggestions based on the letters you’ve typed into the search box so far. This can be an incredibly helpful feature for users. At the same time, it can be potentially devastating to websites that rely on long-tail traffic, because once a user sees a keyword phrase that seems like at least a mediocre choice, they will usually click on it rather than continuing to type a more specific keyword phrase.
[Image: suggest.jpg]
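
A toy model of prefix-based suggestion shows why this happens: once a broad, high-volume phrase surfaces after a few keystrokes, the searcher rarely keeps typing the long query they had in mind. The query log, volumes, and ranking rule below are all hypothetical.

    def suggest(prefix, query_log, limit=10):
        """Return the most-searched phrases that start with the typed prefix.

        query_log is a dict of {phrase: search_volume}. This is only a toy model
        of prefix-based suggestion; the real system is obviously far more complex.
        """
        prefix = prefix.lower()
        matches = [(volume, phrase) for phrase, volume in query_log.items()
                   if phrase.startswith(prefix)]
        return [phrase for volume, phrase in sorted(matches, reverse=True)[:limit]]

    # A user intending to type "cheap flights from boston to reykjavik in november"
    # sees the broader, more popular phrases after a few keystrokes and often
    # settles for one of those instead of finishing the long-tail query.
    log = {"cheap flights": 50000,
           "cheap flights to europe": 8000,
           "cheap flights from boston to reykjavik in november": 12}
    print(suggest("cheap f", log, limit=3))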

Devaluation of paid links

Google’s recent attempt to eliminate paid links has scared a lot of people on both sides of the link-buying equation into implementing the “nofollow” attribute. In the midst of this hypocritical nonsense, Google has also been taking great measures to devalue links based on quantifiable criteria, such as the C class of the originating IP, similarities in anchor text and/or surrounding text, the location of the link on the page, and the authority of the domain the link comes from, to name a few. Regardless of how effective any search engine actually is at identifying and devaluing paid links, the fear of getting caught and possibly penalized is more than enough to deter a lot of people from buying or selling links.
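
To make the criteria above concrete, here is a rough sketch of how such signals could be combined to discount a link. The field names, thresholds, and multipliers are all assumptions made up for illustration, not anything a search engine has published.

    def link_weight(link, target):
        """Discount a link's value based on crude paid-link signals.

        `link` and `target` are plain dicts; every field name and number here is
        hypothetical, meant only to illustrate the kind of criteria described above.
        """
        weight = link["source_authority"]  # start from the linking domain's authority

        # Links from the same C class (first three octets of the IP) look like a
        # network, not an independent editorial vote.
        if link["source_ip"].rsplit(".", 1)[0] == target["ip"].rsplit(".", 1)[0]:
            weight *= 0.1

        # Exact-match anchor text repeated across many links looks purchased.
        if link["anchor_text"] == target["money_keyword"]:
            weight *= 0.5

        # Footer and sidebar links count for less than links inside the content.
        if link["position"] in ("footer", "sidebar"):
            weight *= 0.3

        return weight

    link = {"source_authority": 4.0, "source_ip": "192.0.2.7",
            "anchor_text": "cheap widgets", "position": "footer"}
    target = {"ip": "192.0.2.9", "money_keyword": "cheap widgets"}
    print(link_weight(link, target))  # 4.0 * 0.1 * 0.5 * 0.3 = 0.06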

Visitor usage data

Again, Google is leading the charge on this one. Between its analytics, toolbar, and web browser, it is collecting an enormous amount of data on visitor usage. When a visitor arrives at a website, Google knows how long they stayed, how many pages they accessed, which links they followed, and much more. With this data, a search engine can estimate the quality of a website, and that signal is beginning to carry more weight in ranking than some of the more easily manipulated factors such as keyword density or inbound links. This puts the focus on content quality instead of content quantity and, over time, will begin to knock many of the “me too” websites further down the SERPs, or out of the picture altogether. The websites that prosper will be those that produce relevant, original content that their visitors find useful.
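
As a rough illustration of how usage data could be folded into a single quality signal, consider the sketch below. The metrics, caps, and weights are assumptions, not a documented formula.

    def engagement_score(visits):
        """Average a few per-visit engagement signals into one quality score.

        `visits` is a list of dicts with seconds_on_site, pages_viewed and bounced;
        the caps and weights are a made-up example of the general idea.
        """
        if not visits:
            return 0.0
        total = 0.0
        for v in visits:
            dwell = min(v["seconds_on_site"] / 180.0, 1.0)   # cap credit at 3 minutes
            depth = min(v["pages_viewed"] / 5.0, 1.0)        # cap credit at 5 pages
            bounce_penalty = 0.0 if v["bounced"] else 1.0
            total += 0.5 * dwell + 0.3 * depth + 0.2 * bounce_penalty
        return total / len(visits)

    print(engagement_score([{"seconds_on_site": 240, "pages_viewed": 6, "bounced": False},
                            {"seconds_on_site": 20, "pages_viewed": 1, "bounced": True}]))  # ~0.56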

TrustRank

Simply pointing a vast number of links with a particular keyword phrase in the anchor text at a website was once a quick and easy way to assure a top ranking. The effectiveness of this approach is diminishing and will continue to diminish as a result of TrustRank. In a nutshell, a particular set of websites is chosen (by Google) based on their editorial quality and prominence on the Internet. Google then analyzes the outbound links from those sites, the outbound links from the sites they link to, and so on down the chain. The sites further up the chain carry more trust and those further down the chain, less. Links from sites with more TrustRank, those further up the chain, have a greater impact on ranking than links from websites further down the chain. On one hand, this makes it difficult for new websites to improve their position in the SERPs compared to established websites; on the other hand, it helps to eliminate many of the redundant websites out there that are just repeating what everyone else is saying.
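
The chain described above can be sketched in a few lines: hand-picked seed sites start with full trust, and each hop down the link graph inherits a reduced share. The decay rate and hop limit below are illustrative values only, not anything a search engine publishes.

    def trust_by_hops(out_links, seeds, decay=0.5, max_hops=4):
        """Assign trust by distance from a hand-reviewed seed set.

        out_links maps each site to the sites it links out to; seeds is the
        manually chosen set. A simplified sketch of the idea, not the real algorithm.
        """
        trust = {}
        frontier = set(seeds)
        level_trust = 1.0
        for _ in range(max_hops + 1):
            for site in frontier:
                trust[site] = level_trust  # sites first reached at this hop inherit this much trust
            # Follow outbound links to sites not yet reached; each hop is worth less.
            frontier = {target for site in frontier
                        for target in out_links.get(site, [])
                        if target not in trust}
            level_trust *= decay
        return trust

    graph = {"seed-site.org": ["siteA.com", "siteB.com"],
             "siteA.com": ["siteC.com"]}
    print(trust_by_hops(graph, seeds=["seed-site.org"]))
    # seed-site.org: 1.0, siteA.com and siteB.com: 0.5, siteC.com: 0.25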

Google Chrome

Utilizing a combination of visitor usage data and a not-so-gentle nudge in Google’s direction, Google Chrome is set to change the way search engines gather data and present it to users. For example, when users begin typing in the browser’s address bar, they are presented with a drop-down list of suggestions consisting of the #1 result in Google’s SERPs, related search terms, and pages they have recently visited. This gives a serious advantage to the websites that hold top rankings in Google and, at the same time, gives a serious advantage to Google by giving its Internet real estate even more exposure than ever before.
[Image: chrome.jpg]
So the question remains, is search engine optimization facing evolution or extinction? Certainly not extinction, not by a long shot, but in a short period of time it is going to be drastically different than it is today. The focus will soon be on producing a valuable and enjoyable user experience rather than just achieving top ranking, which is what it should have been all along.

Published: September 21, 2008 by Aaron Wall in seo tips

Comments

Kman1564
September 21, 2008 - 6:38pm

As long as your website actually gives your visitors added value and you're not implementing any black hat SEO techniques, I don't think changes in SERP algorithms will negatively affect your rankings in a drastic way. Sure, there will be things you need to tweak as a result of these changes, but the websites that are going to see the "hammer" are the poor-quality sites that don't add any value.

If you're staying ahead of the curve, your website should be fine.

ROW
September 21, 2008 - 7:20pm

Hi Aaron,

Since this topic came up in this post, I have a question regarding local search feature.

For one of my blogs, I see that for a particular term it ranks #1 on Google.com (and has for more than a year now), while it is nowhere to be seen in the first 10 pages of Google.co.in (Google India search). I am completely baffled by this. Moreover, it is not as if my site is de-indexed, because when I query google.co.in using the site: operator, it does return all the pages.

Is this normal behavior? Any thoughts?

Patrick
September 21, 2008 - 8:17pm

Interesting post. I sort of agree with KMan1564's opinion: anyone who has been following Aaron's advice (it is easier to create an idea worth spreading than to spread one that is not... you need an actual competitive advantage and have to add value to the SERPs, etc.) should do fine.

But then again, I'm wondering what other more experienced people (e.g. yourself Aaron ;)) think of this? Obviously that post is kind of 'scary', but should it scare anyone who has thought strategically and not just 'quick rankings', anyway?

I'm also really wondering about the Google Suggest feature. Will this really pose a problem for people doing long-tail SEO? Or will it actually be good for them?

Seems like really generic phrases will get fewer searches (many people who are lazy and unimaginative, or whatever, will type in one word, but then see a more attractive/descriptive search term and click on that one instead), but middle- to long-tail phrases would get more!

Will it really have any effect on 'long-tail' (as in the real long-tail) phrases? If I had a very descriptive search term in mind (a long-tail one), won't I type it in anyway? I guess I'd type it in and then either follow through with it or, in the middle of typing, see another long-tail search term that I might switch to. I think this would only make popular long-tail phrases more popular than unpopular long-tail phrases (because they're featured), not get people to go from long-tail to middle-tail or something, though.

EDIT:

Maybe what I failed to see is that the (very) long tail will become less popular, because people will switch to a descriptive (usually more popular) 3- or 4-word phrase instead of continuing to type in the 10-word phrase they had in mind?

EDIT EDIT: I just read your post on it and the comments; forget about my Google Suggest question!

Diablos
September 22, 2008 - 9:36am

Very interesting post Aaron, thanks.

I actually did a post on the Google Suggest thing a while ago for an affiliate program, highlighting the same concerns over the loss of traffic from longer and super-long-tail keywords.

While it will be a really important step forward for the user, I think a lot of thin/lazy affiliate sites will, quite rightly, lose their traffic. The problem is, so will a bunch of honest, harder-working ones.

I think SEO has been going through big changes these last six months anyway, and with Chrome coming along, things have accelerated. I know the industry will stay strong; I am just hoping I can keep up with it, because I really like working at what I do.

absolute-link
October 1, 2008 - 12:22am

Very interesting post.
I hear more and more people - at least here in Israel - getting frustrated with all those index sites appearing on Google's first page (usually driven by SEO guys...), which is pushing more people to switch to Firefox.

What's your opinion on the subject? Is Google going to tackle those sites the same way it tackled paid links, or do you see it as a legitimate use of SEO knowledge?

October 1, 2008 - 5:29pm

Spam takes different forms and is policed differently in different markets. Non-English content may be able to get away with being of lower quality than English content would need to be... at least for a few years.
