Hugo Guzman: Deconstructing the Google 'Brand?' Algorithm

So here we are on Tuesday, March 3rd, and I’m still trying to fully digest the implications of Aaron’s “Heavy emphasis on Branding” post from last Wednesday, February 25. The data that was presented, the context that was provided, and the labyrinth of insightful user comments that were spawned left me reeling for days. So much so that I wouldn’t be surprised if the annals of SEO history refer to February 25, 2009 as the infamous “Aaron Wall” update.

In all seriousness, though, this really is a big deal, especially for folks like me who spend their days attempting to optimize mainstream “Big Brand” web sites for a living. I’m fortunate enough to work for an interactive agency that takes SEO seriously, and my team strives to deliver a truly comprehensive approach to SEO – blending site-side factors, link building, social media elements, and analytics. We usually do a pretty darn good job, despite the myriad obstacles and pitfalls associated with trying to implement SEO for a large, lumbering, Fortune 500 web portal. And sadly, like many big firms out there, we have occasionally chalked up our shortcomings to a lack of implementation and cooperation on the part of the client. It’s that typical “not our fault, it’s a crappy big brand site” copout that many of us have heard a thousand times before.

Then along comes Aaron with his revelations about Google’s recent algorithm shift and its ramifications for big brands, and all hell breaks loose:

  • I immediately spiral into self-doubt about my own and my team’s marketing abilities
  • I start scrambling to deconstruct this alleged algorithm shift
  • I start emailing all of my senior team members asking them to attempt deconstructing the algorithm shift
  • They roll their eyes, and one of them tells me to stop sending so many random emails at 10 o’clock at night

I’ve calmed down a bit since then, but I’m still hard at work trying to figure out exactly what levers have caused certain “Big Brand” sites to skyrocket in the SERPs while others remain mired in search engine mediocrity. As with most things in life, the best course of action is to introduce a bit of the old scientific method, systematically isolating variables in an attempt to identify predictable patterns that can be replicated.

After taking a high-level look at each of the keywords outlined in Aaron’s post, and the corresponding brand sites that made the jump onto the front page, several possible culprits become apparent. Here are a few that jumped out at me:

Social Media Signals – companies like University of Phoenix have made a concerted effort to engage users via social media channels, and those social reverberations could be a key facet in Google’s newly refined algorithm, especially if some of those reverberations include mention of the phrase “online degree.”

Increased weighting of anchor text within internal site linkage – companies like American Airlines seem to be leveraging both their own internal site pages as well as partner sites to increase the volume of anchor text occurrences for the term “airline tickets” (although they’re missing out on some seriously low-hanging fruit by failing to optimize the alt attribute on their global logo image link). If Google has decided to increase the potency of this element, then large brand portals with voluminous internal pages and partner sites (or branded microsites) could gain an upper hand for highly competitive terms.
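To make that low-hanging fruit concrete (the markup below is a hypothetical illustration, not pulled from American Airlines’ actual code), giving a linked logo image a descriptive alt attribute lets the image link pass text to crawlers much the way anchor text does:

```html
<!-- Before: the logo link gives crawlers no descriptive text -->
<a href="/"><img src="/img/logo.gif" alt=""></a>

<!-- After: the alt text serves as the image link's "anchor text" -->
<a href="/"><img src="/img/logo.gif" alt="American Airlines airline tickets and flights"></a>
```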

Increased sensitivity to offline marketing campaigns – Perhaps Google’s algorithm is getting better at recognizing site traffic associated with offline marketing campaigns. This would be extremely difficult to do without having direct access to a site’s analytics data (although Google Analytics conspiracy theorists are convinced that this is already the case for sites using GA), but perhaps Google is using signals such as the relative volume of specific search queries (e.g. branded queries like “State Farm”) and somehow tying that data back to terms that the algorithm associates with the given brand query (e.g. State Farm = Auto Insurance).
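That brand-to-keyword association idea can be sketched as a toy co-occurrence model over a hypothetical log of search sessions. Needless to say, this only illustrates the concept; the session data, the function, and the simple counting approach are all my own invention and say nothing about how Google actually does it:

```python
# Toy sketch of the speculated brand-to-topic association, using a
# hypothetical sample of search sessions (each a list of queries).
# Illustrative only; this does not reflect Google's implementation.
from collections import Counter

sessions = [
    ["state farm", "auto insurance"],
    ["auto insurance", "state farm quote"],
    ["geico", "auto insurance"],
    ["state farm agent"],
]

def brand_associations(sessions, brand):
    """Count how often non-brand queries co-occur in a session with a brand query."""
    assoc = Counter()
    for session in sessions:
        if any(brand in q for q in session):
            for q in session:
                if brand not in q:
                    assoc[q] += 1
    return assoc

print(brand_associations(sessions, "state farm").most_common(1))
# -> [('auto insurance', 2)]
```

A model like this would let an engine infer that a surge in branded queries (“state farm”) should also lift the brand’s site for the associated generic term (“auto insurance”).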

Disclaimer: I haven’t been able to actually test these hypotheses out thoroughly or with any real semblance of scientific method. After all, it’s only been five days since I read the post, and I do have other things to do besides ponder the ramifications of this alleged algorithm shift (it’s 10pm so I have to start annoying my team with random emails again).

Besides, Google’s results could roll back at any moment, rendering all of these insights (nearly) moot. Still, if you’re in any way involved in optimizing web sites for big brands (or if you just want to improve your eye for SEO) it’s probably a good idea to start doing a little scientific testing of your own.

If you liked this post (or even if you thought it was a flaming pile of dog excrement) feel free to reach out to me via my Twitter handle:

Published: March 3, 2009 by Aaron Wall in google


March 3, 2009 - 1:45pm

Thanks for giving me another opportunity to share my thoughts on your soap box, Aaron!

Much appreciated.


March 3, 2009 - 2:39pm

Patterns I notice are strong linking with only the brand as the keyword, yet they rank for otherwise hard-to-rank-for words...

March 3, 2009 - 7:18pm

Thanks for the feedback, googlemonster! You could be onto something, especially if Google has figured out a way to tie brand keyword anchor text to generic keyword ranking (e.g. State Farm anchor text = Life Insurance ranking).

Could be a bit of a stretch, though.

March 4, 2009 - 1:52am

There are many ways Google might identify brand names and their associated keywords.

With their new Chrome browser and its ability to collect your behavioral data, they might estimate how often a URL is typed in the browser - more often = a brand.

They can measure CTR, bounce rate & time on page for site-to-site links with Chrome. Data harvesting capabilities have expanded.

They might look at a company's ad spend as an indication of whether or not they're a brand for a keyword.

Social mentions are another...

Heck, they might even aggregate data from business databases like InfoUSA. We know they keep track of publicly traded companies with Google Finance.

Their timing might have a lot to do with the recession: pushing commercial products over information in an effort to help the economy? Maybe that's why Google and Obama are so buddy-buddy... maybe not.

March 4, 2009 - 2:06pm

Interesting thoughts, directdiscstores. I think that the key here, moving forward, is for folks to start actually testing specific variables to see if any clearly identifiable patterns emerge.

March 6, 2009 - 7:25pm

I agree - testability is key. Don't algorithm deconstructionists store & analyze historical ranking & website data? If so, they wouldn't necessarily need to test their theories against the latest updates using their own websites; they could ping their black box of data and let it do the talking.

Granted, one's ability to look at behavioral data that way is very limited... but I suppose one could simulate behavioral factors on one's own website. You'd have to carefully simulate natural user demographics & behavior: browser distribution, natural geographic distribution, natural IP history, ratio of IPs with iGoogle/Gmail accounts, ratio of iGoogle to Google search actions, believable CTR, believable bounce rate... that would be really hard to simulate.

March 4, 2009 - 5:48pm

Many apologies for using the comments for spelling correction. (Couldn't find a "Contact us" form to keep it private.)

In "...the annuls of SEO history...", the word should be "annals".

Otherwise, cheers for a heartfelt and articulate post!

March 4, 2009 - 7:36pm

Thanks Winooski. Fixed it :)

March 4, 2009 - 7:44pm

@Winooski - this will go down in my own personal "annals" of crappy spelling!

March 4, 2009 - 8:47pm

Ahh the good old days when Google presented a level playing field. Have to admit this hasn't been a very painful algo change for most of my clients.

March 5, 2009 - 3:00pm

So should we expect this to be an algorithm change that affects nothing but big brands?

Or one that only affects big brands now, but will probably be rolled out into all SERPs eventually? (Maybe it has a lot to do with behavioural data, but they only have that kind of data for big brands now.)

I'm really wondering what (if anything) that implies for the average webmaster/SEO/online marketer who isn't working for Fortune 500s, but on his own niche websites and for small businesses.

Any suggestions? I'd love to hear any views on this! Thanks!

March 5, 2009 - 4:20pm

Matt Cutts said he thought this update was minor...which would indicate to me that this is certainly the direction that Google wants to push the algorithm in. What that means to you is entirely up to you and your business model. :)

March 5, 2009 - 4:34pm

Thanks for the quick reply, Aaron. I understand you can't give away free consulting on your blog, but just for clarification: by "the direction that Google wants to push the algorithm in," I assume you meant that this isn't something that will affect exclusively big brands, but that it will eventually affect medium-sized and small sites, too?
