Is SEO Irreducibly Complex?

In his book On the Origin of Species, Charles Darwin wrote:

"If it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down."

Proponents of Intelligent Design typically use this passage to argue against evolution by invoking the principle of Irreducible Complexity, which holds that:

This applies to any system of interacting parts in which the removal of any one part destroys the function of the entire system. An irreducibly complex system, then, requires each and every component to be in place before it will function.

Essentially, the idea is that something irreducibly complex does not reach its current state through gradual change (the way evolution works), so researching how it came into being step by step is largely beside the point, unlike something that has evolved over time (like an algorithm). A man-made algorithm fits into both categories.

What could be more complex than something that is not only the product of natural evolution but also has elements of intelligent design at its core?

How does this apply to SEO? Google's algorithm evolves, and it fits neatly with how Darwin laid out the basis of his theory (numerous, successive, slight modifications), except that the modifications come by human hand and from captured data.

Of course, Google sometimes makes a big update, but generally speaking it makes lots of minor updates each year.

Consider that in 2009 alone Google claimed to have made just south of 500 updates.

So the point I'm making is that SEO is both irreducibly complex (remove the hand of man and the algorithm would no longer evolve, or even work as intended) and the product of a natural evolutionary process (the constantly adjusted algorithm), with layer upon layer of thousands of changes over time, full of small and large complexities. These two characteristics make it cumbersome and inaccurate to try to reduce it to a static formula with assigned percentages you can confidently apply to the majority of cases.

Forcing Simplicity Creates Complexity

If you read a bunch of SEO blogs you might feel a bit overwhelmed about where to start and what to do. Some blogs tend to be information-heavy, some heavy on theory and (attempted) science, some straight news-oriented, and some of the good old-fashioned boot-in-your-rear-end "get something done now" genre.

I think it's important to read blogs from each of those areas of the industry to get a well-rounded view of the SEO space. However, sometimes the simpler you try to make something, say whittling SEO down to a push-button solution, the more complex things become, because now you need sound, reliable data to back up those kinds of claims and solutions.

If the data starts reading out 50/50 or 60/40 probabilities, then that's not really sound science at all. In fact, if anything, it just shows that some things cannot be broken down into a push-button formula or a statistic with any reliability whatsoever. It probably makes for good salesmanship when you want to wow a client with your superior knowledge, but it also makes for laughable science.

The real problem is that Google claims its algorithm has more than 200 components (which we obviously don't have available for study :) ). Even if you call it an even 200, what about the different weight each factor carries? Surely each one does not account for exactly 0.5% of the algorithm.
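
To make the weighting point concrete, here is a tiny back-of-the-envelope sketch (the split below is invented purely for illustration, not Google's actual weighting): with 200 equally weighted factors each one would be worth exactly 0.5%, but as soon as the weights are skewed, a handful of factors dominate the score.

    # Toy illustration only: invented numbers, not Google's actual weighting.
    factors = 200

    # If every factor carried equal weight, each would be worth exactly 0.5%.
    print(f"Uniform weight per factor: {100 / factors}%")

    # A hypothetical skewed split: three heavy factors plus 197 minor ones.
    weights = [30.0, 20.0, 10.0] + [40.0 / (factors - 3)] * (factors - 3)
    print(f"Top 3 factors: {sum(weights[:3])}% of the score")
    print(f"Each remaining factor: {weights[3]:.3f}%")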

When you dive into mathematically and scientifically breaking down a formula of which you know, at best, some of the variables and their direct effects, you actually create more confusion, because you end up hunting for examples that prove a specific theory while ignoring the ones that point the other way.

Figuring Out the Variables

I think the annual SeoMoz Search Engine Ranking Factors is a worthy read as they pull data from lots and lots of respected folks in the industry and the presentation is top notch. I think overall it's a good representation of the factors you will need to face when conducting an SEO campaign.

Another good page to bookmark is this page from Search Engine Journal which has guesstimates of what they feel these elusive variables might be.

It can be hard to isolate really specific types of variables because of the constant Google updates, the other factors that are involved with your site and its ranking, and anything being done by the competition. You can test elements for sure, things like:

  • Does X link pass any pop?
  • Seeing if a couple of pages pass juice on a 301 before 301-ing an entire site (see the sketch after this list)
  • On-page elements like title tag changes, internal linking, and external linking
  • And so on and so on...
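
As a minimal sketch of the 301 spot-check mentioned above (the URLs and file layout are hypothetical placeholders, and this assumes the third-party requests library), the snippet below simply confirms that a few redirected pages actually return a 301 and point where you expect before you commit to migrating the whole site:

    # Minimal sketch: spot-check a handful of 301s before migrating an entire site.
    # The URLs below are hypothetical placeholders; swap in your own test pages.
    import requests

    redirect_tests = {
        "http://www.example.com/old-page-1": "http://www.example.com/new-page-1",
        "http://www.example.com/old-page-2": "http://www.example.com/new-page-2",
    }

    for old_url, expected_target in redirect_tests.items():
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        status_ok = resp.status_code == 301
        target_ok = resp.headers.get("Location") == expected_target
        print(f"{old_url}: status {resp.status_code} "
              f"({'OK' if status_ok else 'not a 301'}), "
              f"target {'matches' if target_ok else 'differs'}")

Whether those redirects actually pass juice is something only the rankings and traffic of the target pages will tell you over the following weeks; the script only verifies the redirects are in place and pointing at the right targets.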

The issues are still there, though, even with "testing". It is still really, really hard to sell a scientific breakdown of a consistent path to success beyond high-level ideas like:

  • Become a brand (brand signals, social media signals, offline branding, nice site design, etc)
  • Lots of links from unique domains (preferably good ones of course)
  • A good natural mix of anchor text (a quick way to check this and the unique-domain count is sketched after this list)
  • Great user experience and deep user engagement
  • Targeted content which gives the user exactly what they are looking for
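
For two of those ideas, links from unique domains and anchor text mix, you can at least eyeball where you stand from a backlink export. The sketch below is one rough way to do that (the file name and column names are assumptions; adjust them to whatever your link tool actually exports):

    # Rough sketch: summarize unique linking domains and anchor text mix from a link export.
    # Assumes a CSV with "source_url" and "anchor_text" columns; names are assumptions only.
    import csv
    from collections import Counter
    from urllib.parse import urlparse

    domains = set()
    anchors = Counter()

    with open("backlinks.csv", newline="") as f:
        for row in csv.DictReader(f):
            domains.add(urlparse(row["source_url"]).netloc.lower())
            anchors[row["anchor_text"].strip().lower()] += 1

    total = sum(anchors.values())
    print(f"Unique linking domains: {len(domains)}")
    if total:
        print("Top anchors:")
        for anchor, count in anchors.most_common(10):
            print(f"  {anchor or '(empty)'}: {count} ({100 * count / total:.1f}%)")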

For someone looking to move forward in their SEO career, I think it is important to let go of the idea that you can break the factors down into exact numbers, as far as the value of each individual variable goes. Anyone who practices SEO will likely tell you that you simply want to win more than you lose, and even if you are on top of your game you will still have site failures here and there.

A failure might not even stem from anything you are currently doing. You could be sailing right along when all of a sudden a Google update cleans your clock (another good reason to be involved with multiple projects).

You might spend more time agonizing over some magic formula, or avoiding a project because a tool (rather than your own knowledge) told you it was too competitive, than building out multiple web properties to weather the expected storms and the ebbs and flows of the web.

Dealing with Complex & Unknown Variables

Working within a system where the variables that hold the key to your success are unknown can seem daunting. It can also make you want to run out and buy a shiny new tool to solve all your problems and get you that elusive Google ranking you've been waiting for.

The sad truth is that if such a tool existed, the person (or people) who created it wouldn't be selling it to you for $100 or so (or even way more). They would be building sites in many verticals and making an absolute killing in the SERPs. By selling it to you they would only be creating more work and more competition for themselves.

Not all tools are bad, of course. I use the tools here at SeoBook as well as tools from Majestic, Raven, SeoMoz, and Caphyon (Advanced Web Ranking). Tools give you data points to work with and to cross-reference; they do not provide answers at the push of a button.

The best thing to do is to start launching some sites and play around with different strategies. Over time you'll find that even strategies that worked in A, B, and C markets didn't work in D or E.

Things like algorithms changing and competitors stepping up their game are part of why test results aren't always that accurate (at the really granular level) and why certain strategies work here but not there.

Keeping Track of Wins & Losses

It makes sense to keep some kind of running journal on a site (why I did this, when I did that, etc) so you can go back and evaluate real (not theorized) data.

Running weekly rank checks isn't a bad idea either, and tools like Advanced Web Ranking and Raven have built-in ways to keep notes (events, in Raven's case) on a specific campaign or on date-based events (added X links on this day).

I happen to like Evernote for these kinds of things, but most project management applications and information organizers have this kind of capability built in (and separate Word and Excel docs for your campaigns work too).
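
If you'd rather keep the journal in a plain file than in a tool, a minimal sketch like the one below can do the job (the file name and columns are just an example, not a prescribed format); it appends dated entries to a CSV you can later line up against your ranking and traffic data:

    # Minimal sketch of a dated change journal kept as a CSV.
    # The file name and columns are an example only; adapt them to your own workflow.
    import csv
    from datetime import date
    from pathlib import Path

    JOURNAL = Path("site_changelog.csv")

    def log_change(site: str, action: str, notes: str = "") -> None:
        """Append one dated entry: which site, what was done, and any extra notes."""
        new_file = not JOURNAL.exists()
        with JOURNAL.open("a", newline="") as f:
            writer = csv.writer(f)
            if new_file:
                writer.writerow(["date", "site", "action", "notes"])
            writer.writerow([date.today().isoformat(), site, action, notes])

    log_change("example.com", "Rewrote title tags on category pages")
    log_change("example.com", "Added internal links from the guide section", "testing siloing")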

So if you are involved with a handful of projects and keep track of the strategies used on each, you can get a solid handle on what is likely to work in the short to mid term and what really is working right now.

A good example of this is the folks who spent years pooh-poohing the idea that exact match domains were a golden egg of sorts. If you were (or are) running any SEO campaigns, you'll have noticed that the exact-match benefit was quite real. So while pontificators were decrying their effectiveness, practitioners were laughing all the way to the bank.

There is no substitute for real experience and real data. Which group do you want to be in?

Mental Models

As we discussed above, the algorithm has a lot of components. There is generally no single universally correct answer for each and every SERP. The gold usually lies in trying to understand where the algorithms are heading and how they have changed.

As an example, in his recent post about exact match domains losing weight, Aaron used highlights to visually segment the search results around the question of "why is XYZ ranking" (see the highlighted SERP screenshot in that post).

It is a good example of how building your own sites and collecting your own data helps you form and solidify your mental models.

The tricky part is knowing whose advice is garbage and who you should trust. Take the conclusions you have arrived at independently and tested repeatedly, and see who is offering similar advice. Those are the folks you can trust to tell you "what actually works" rather than "how to buy whatever they are selling as the solution".

For another example of a mental model in action, you should check out Charlie Munger's piece on mental models and investing.

One more piece of advice here. We recently wrote about the importance of rank checking with a tie-in to analytics. It's vital to have both in place so you can get concrete before-and-after data. Without hard data around ongoing algorithm changes, you are flying blind to the actual changes being made.
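
As a rough example of what that before-and-after data can look like (the CSV layout and the update date below are assumptions, not a standard analytics export), this sketch averages daily organic visits for the two weeks on either side of a known update date:

    # Rough sketch: compare average daily organic visits before and after an update.
    # Assumes a CSV export with "date" (YYYY-MM-DD) and "visits" columns; adjust to your data.
    import csv
    from datetime import date, timedelta

    UPDATE_DATE = date(2011, 2, 24)  # hypothetical update date; use the one you care about
    WINDOW = timedelta(days=14)

    before, after = [], []
    with open("organic_visits.csv", newline="") as f:
        for row in csv.DictReader(f):
            day = date.fromisoformat(row["date"])
            visits = int(row["visits"])
            if UPDATE_DATE - WINDOW <= day < UPDATE_DATE:
                before.append(visits)
            elif UPDATE_DATE < day <= UPDATE_DATE + WINDOW:
                after.append(visits)

    if before and after:
        print(f"Average daily visits before: {sum(before) / len(before):.0f}")
        print(f"Average daily visits after:  {sum(after) / len(after):.0f}")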

Being in the Know

The reason this community and many paid communities are successful is that there isn't a lot of noise or high-pressure selling (as there is on free chat forums and message boards), and experienced people are able to freely share ideas, thoughts, and data with like-minded people.

The more information and thinking you get from people who are in the trenches on a daily basis, the better for your efforts, knowledge, and experience, because theories will only get you so far.

I think there is a scientific element to some factors, like links, domain age, social signals, brand signals, and anchor text (at a high level, nothing overly exact), but overall I think the system is too complex to break down into a reliable scientific formula.

It's important to pay attention to trends, but your own experience and data are invaluable to your ongoing success. I believe search is going to continue to get more complex, but that's not necessarily a bad thing if you have access to good information.

A friend gave me a great quote from Michael Lewis's book, Liar's Poker:

You spend a lot of time asking yourself questions: Are munis (municipal bonds) right for me? Are govys (government bonds) right for me? Are corporates (corporate bonds) right for me?

You spend a lot of time thinking about that. And you should.

But think about this: it might be more important to choose a jungle guide than to choose your product.

When it comes to SEO, it's pretty important to choose your jungle guides correctly.

Published: June 10, 2011 by Eric Covino in marketing

Comments

GusRuss89
June 10, 2011 - 12:20pm

Even though my view is that seo shouldn't be complex and intricate, but instead should happen naturally with a good web strategy and smart content marketing, I do like to hear about the intricacies. This is a great guide to getting started with technical seo. Thanks

SEObri
June 10, 2011 - 4:22pm

I think Punctuationalism would be more apt.

dr_pete
June 10, 2011 - 6:26pm

I think it's going to get really interesting when Google starts using more machine learning (which it's been suggested that they're playing with in the Panda update). At that point, even Google may not be able to tell you the weight of their ranking factors. You'll end up with an algorithm that's a black box even to its creators.

vanillacoke
June 10, 2011 - 10:04pm

I agree, Pete, it's kind of like the insurance industry (where I used to work in a prior life)...the rating variables now are so out of control that underwriters can't even manually rate a policy anymore, and even the actuaries have issues breaking down the rating because of the black-box nature and complexity of their hundreds and hundreds of rating variables (with different weightings and counter-balances to other variables)!

Borzio
June 10, 2011 - 7:32pm

John Blake
February 24th, 2010 at 9:56 PM
"When hyper-linked IT nodes reach a certain level of complexity c. 2030, the resulting Emergent Order may not be discernible but it will exist. Whether sentient self-awareness will accompany this development, who knows… such issues, including holographic attributes, are entirely beyond mathematicians’ purview today.

Emergent Order is THE central question in AI (as the cliche has it, intelligence as such is not artificial; by definition, it transcends programmed design).

AI researchers can only start things off. Like “genetic algorithms”, no-one knows or can know where Emergent Order leads. When different central foci exist in competition, the result will be a second-order Emergent Organism, and so on down the line."

cxbrian
June 22, 2011 - 9:11pm

I think evolution is a good analogy here. There are these quantum leaps in evolution which can't be reduced to their parts (like punctuationalism) just like when new layers of algorithms are added to previous layers as opposed to making tweaks in existing algorithms (like gradualism). Novelty is introduced in new organizations of functioning.

As the Dr. Phil soundboard says, "Whattya say after this you and I go do some [vanilla] coke?"

cxbrian
June 22, 2011 - 9:19pm

This is a great article BTW. Lots of good points. I've been using Yahoo Notepad to have access to notes everywhere, but it's just text, so I installed Evernote. You can also use annotations in Google Analytics to keep track of changes and connect them to traffic, if you trust Google. I usually use code which I reference elsewhere, just in case. Normally I don't write down all the changes, but I want to get into it more to make more connections. Just keeping the changes in mind has generally worked pretty well, but with more clients I'll probably need more notes.
