A Thought Experiment on Google Whitelisting Websites

Google has long maintained that "the algorithm" is what controls rankings, except for sites which are manually demoted for spamming, getting hacked, delivering spyware, and so on.

At the SMX conference it was revealed that Google uses whitelisting:

Google and Bing admitted publicly to having ‘exception lists’ for sites that were hit by algorithms that should not have been hit. Matt Cutts explained that there is no global whitelist but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites.

The idea that "sites rank where they deserve, with the exception of spammers" has long been pushed to help shield Google from accusations of anti-competitive behavior. Google's marketing has further leveraged the phrase "unique democratic nature of the web" to highlight how PageRank originally worked.

So let's conduct a thought experiment to think through the difference between how Google actually behaves and how Google wants to be perceived as behaving.

Let's cover the negative view first. The negative view is that either Google has a competing product, or a Google engineer dislikes you and goes out of his way to torch your stuff simply because he is holding a grudge. Given Google's current monopoly-level marketshare in most countries, picking winners and losers based on Google's own business interests would be seen as unacceptable.

The positive view is that "the algorithm handles almost everything, except some edge cases of spam." Let's break down that positive view a bit.

  • Off the start, consider that Google engineers write the algorithms with set goals and objectives in mind.
    • Google only launched universal search after Google bought YouTube. Coincidence? Not likely. If Google had rolled out universal search before buying YouTube, it likely would have driven up YouTube's acquisition price by 30% to 50%.
    • Likewise, Google trains some of their algorithms with human raters. Google seeds certain questions & desired goals in the minds of raters & then uses their input to help craft an algorithm that matches their goals. (This is like me telling you I can't say the number 3, but I can ask you to add 1 and 2 then repeat whatever you say :D)
  • At some point Google rolls out a brand-filter (or other arbitrary algorithm) which allows certain favored sites to rank based on criteria that other sites simply cannot match. It allows some sites to rank with junk doorway pages while demoting other websites.
  • To try to compete with that, some sites are forced to either live in obscurity & consistently shed marketshare in their market, or be aggressive and operate outside the guidelines (at least in spirit, if not in a technical basis).
  • If the site operates outside the guidelines there is potential that they can go unpenalized, get a short-term slap on the wrist, or get a long-term hand-issued penalty that can literally last for up to 3 years!
  • Now here is where it gets interesting...
    • Google can roll out an automated algorithm that is overly punitive and has a significant number of false positives.
    • Then Google can follow up by allowing nepotistic businesses & those that fit certain criteria to quickly rank again via whitelisting.
    • Sites doing the same things as the whitelisted sites might be crushed for it & get a cold shoulder upon review.

You can see that even though it is claimed "TheAlgorithm" handles almost everything, Google can easily inject personal biases to decide who ranks and who does not. "TheAlgorithm" is first and foremost a legal shield. Beyond that it is a marketing tool. Relevancy is likely third in line in terms of importance (how else could one explain the content farm issue getting so out of hand for so many years before Google did something about it?).

Doorway Pages Ranking in Google in 2011?

When Google did the Panda update they highlighted that not only did some "low quality" sites get hammered, but that some "high quality" sites got a boost. Matt Cutts said: "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."

Here is the problem with that sort of classification system: doorway pages.

The following Ikea page was ranking on page 1 of the search results for a fairly competitive keyword.

Once you strip away the site's navigation there are literally only 20 words on that page. And the main body "content" for that page is a link to a bizarre, confusing, and poorly functioning Flash tour which takes a while to load.

If you were trying to design the worst possible user experience & wanted to push a "minimum viable product" page into the search results, you really couldn't do much worse than that Ikea page (at least not without delivering malware and such).

I am not accusing Ikea of doing anything spammy. They just have terrible usability on that page. Their backlinks to that page are few in number & look about as organic as they could possibly come. But not that long ago companies like JC Penney and Overstock were demoted by Google for building targeted deep links (links they needed in order to rank, but which were allegedly harming search relevancy & Google user experience). Less than a month later Google arbitrarily changed their algorithm such that other branded sites simply didn't need many (or in some cases any) deep links to get in the game, even if their pages were pure crap.

We are told the recent "content farm" update was meant to demote low-quality content. If that is the case, then how does a skeleton of a page like that rank so high? How did that Ikea page go from the third page of Google's results to the first? I think Google's classifier is flashing a new set of exploits for those who know what to look for.

A basic tip? If you see Google ranking an information-less page like that on a site you own, that might be a green light to see how far you can run with it. Give GoogleBot the "quality content" it seeks. Opportunities abound!

Quick & Dirty Competitive Research for Keywords

There are so many competitive research tools on the market. We reviewed some of the larger ones here but there are quite a few more on the market today.

The truth is that you can really get a lot of good, usable data to give you an idea of what the competition is likely to be by using free tools or the free version of paid tools.

Some of the competitive research tools out there (the paid ones) really are useful if you are going to scale way up with some of your SEO or PPC plans, but many of the paid versions are overkill for a lot of webmasters.

Choosing Your Tools

Most tools come with the promise of “UNCOVERING YOUR COMPETITOR’S BEST _____”.

That blank can be links, keywords, traffic sources, and so on. As we know, most competitive research tools give rough estimates at best and almost useless ones at worst. Unless you get your hands on your competition’s analytics reports, you are still kind of best-guessing. In this example we are looking at the competitiveness of a core keyword.

Best-guessing really isn’t a bad thing so long as you realize that what you are doing is really triangulating data points and looking for patterns across different tools. Keep in mind many tools use Google’s data so you’ll want to try to reach beyond Google’s data points a bit and hit up places like:

The lure of competitive research is to get it done quickly and accurately. However, gauging the competition of a keyword or market can’t really be done with a push of the button as there are factors that come into play which a push-button tool cannot account for, such as:

  • how hard is the market to link build for?
  • is the vertical dominated by brands and thick EMDs?
  • what is your available capital?
  • are the ranking sites knowledgeable about SEO or are they mostly ranking on brand authority/domain authority? (how tight is their site structure, how targeted is their content, etc)
  • is Google giving the competing sites a brand boost?
  • is Google integrating products, images, videos, local results, etc?

Other questions might be stuff like "how is Google Instant skewing this keyword marketplace," "is Google firing a vertical search engine for these results (like local)," "is Google placing 3 AdWords ads at the top of the search results," or "is Google making inroads into the market" like they are with mortgage rates.

People don't search in an abstract mathematical world; they search with their fingers and eyes. Looking at the search results matters. Quite a few variables come into play which require some human intuition and common sense. A research tool is only as good as the person using it; you have to know what you are looking at & what to be aware of.

Getting the Job Done

In this example I decided to use the following tools:

  • SEO for Firefox
  • Open Site Explorer

Yep, just 2 free tools.... :)

So we are stipulating that you’ve already selected a keyword. In this case I picked a generic keyword for the purposes of going through how to use the tools. Plug your keyword into Google, flip on SEO for Firefox and off you go!

This is actually a good example of where a push button tool might bite the dust. You’ve got Related Search breadcrumbs at the top, Images in the #1 spot, Shopping in the #3 spot, and News (not pictured) in the #5 spot.

So wherever you thought you might rank, move yourself down 1-3 spots depending on where you would be in the SERPs. This can have a large effect on potential traffic and revenue, so you’ll want to evaluate the SERP prior to jumping in.
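
To make that arithmetic concrete, here is a toy sketch. The click-through rates below are invented assumptions for illustration, not measured data; real CTR curves vary widely by query and SERP layout.

```python
# Illustrative organic click-through rates by position. These numbers are
# invented for this sketch; real CTR curves vary by query and SERP layout.
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def expected_clicks(searches_per_month, position):
    """Rough expected monthly clicks for a listing at a given position."""
    return searches_per_month * CTR[position]

# The same listing, before and after universal results push it down two spots:
clean_serp = expected_clicks(10_000, 3)
crowded_serp = expected_clicks(10_000, 5)
```

Under these assumed numbers, sliding from #3 to #5 roughly halves expected traffic, which is why eyeballing the SERP layout belongs in any keyword evaluation.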

You might decide that you need to shoot for 1 or 2 rather than top 3 or top 5 given all the other stuff Google is integrating into this results page. Or you might decide that the top spot is locked up and the #2 position is your only opportunity, making the risk to reward ratio much less appealing.

With SEO for Firefox you can quickly see important metrics like:

  • Yahoo! links to domain/page
  • domain age
  • Open Site Explorer and Majestic SEO link data
  • presence in strong directories
  • potential, estimated traffic value from SEMrush

Close up of SEO for Firefox data:

Basically by looking at the results page you can see what other pieces of universal search you’ll be competing with, whether the home page or a sub-page is ranking, and whether you are competing with brands and/or strong EMDs.

With SEO for Firefox you’ll see all of the above plus the domain age, domain links, page links, listings in major directories, position in other search engines, and so on. This will give you a good idea of potential competitiveness of this keyword for free and in about 5 seconds.

It is typically better & easier to measure the few smaller sites that managed to rank rather than measuring the larger authoritative domains. Why? Well...

Checking Links

So now that you know how many links are pointing to that domain/page you’ll want to check how many unique domains are pointing in and what the anchor text looks like, in addition to what the quality of those links might be.

Due to its ease of use (and the data being good) I like to use Open Site Explorer from SEOmoz for this sort of quick research. I will use their free service for this example, which requires no login; they are even more generous with data when you register for a free account.

The first thing I do is head over to the anchor text distribution of the site or page to see if the site/page is attracting links specific to the keyword I am researching:

What’s great here is you can see the top 5 instances of anchor text usage, how many total links are using that term, and how many unique domains are supplying those total links.

You can also see data relative to the potential quality of the entire link profile in addition to the ratio of total/unique domains linking in.
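
If you export a raw backlink list, the same totals-versus-unique-domains summary is easy to compute yourself. Here is a minimal sketch; the function name and the sample backlink data are hypothetical, made up for this example:

```python
from collections import defaultdict

def anchor_text_profile(links, top_n=5):
    """Summarize (anchor_text, linking_domain) pairs into the top anchors
    by total links, alongside the count of unique linking domains."""
    totals = defaultdict(int)
    domains = defaultdict(set)
    for anchor, domain in links:
        totals[anchor] += 1
        domains[anchor].add(domain)
    ranked = sorted(totals, key=totals.get, reverse=True)[:top_n]
    return [(a, totals[a], len(domains[a])) for a in ranked]

# Hypothetical backlink export: (anchor text, linking domain)
sample = [
    ("blue widgets", "example.com"),
    ("blue widgets", "example.com"),
    ("blue widgets", "blog.example.org"),
    ("widgets", "news.example.net"),
]
```

Running `anchor_text_profile(sample)` shows "blue widgets" with 3 total links from only 2 unique domains, the same total/unique ratio a link tool reports.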

You probably won’t want or need to do this for every single keyword you decide to pursue. However, when looking at a new market, a potential core keyword, or if you are considering buying an exact match domain for a specific keyword you can accomplish a really good amount of competitive research on that keyword by using a couple free tools.

Types of Competitive Research

Competitive research is a broad term and can go in a bunch of different directions. As an example, when first entering a market you would likely start with some keyword research and move into analyzing the competition of those keywords before you decide to enter or fully enter the market.

As you move into bigger markets and start to do more enterprise-level competitive research specific to a domain, link profiles, or a broader market you might move into some paid tools.

Analysis paralysis is a major issue in SEO. Many times you might find that those enterprise-level tools really are overkill for what you might be trying to do initially. Gauging the competitiveness of a huge keyword or a lower volume keyword really doesn’t change based on the money you throw at a tool. The data is the data especially when you narrow down the research to a keyword, keywords, or domains.

Get the Data, Make a Decision

So with the tools we used here you are getting many of the key data points you need to decide whether pursuing the keyword or keywords you have chosen is right for you.

Some things the tools cannot tell you are questions we talked about before:

  • how much capital can you allocate to the project?
  • how hard are you willing to work?
  • do you have a network of contacts you can lean on for advice and assistance?
  • do you have enough patience to see the project through, especially if ranking will take a while... can you wait on the revenue?
  • is creativity lacking in the market and can you fill that void or at least be better than what’s out there?

Only you can answer those questions :)

The Problem With Following Prescription

You can't learn great SEO from an e-book. Or buying software tools.

Great SEO is built on an understanding.

Reducing SEO To Prescription

One of the problems with reductive, prescribed SEO approaches (i.e. step one: research keywords; step two: put the keyword in the title; etc.) can be seen in the recent "Content Farm" update.

When Google decide sites are affecting their search quality, they look for a definable, repeated footprint left by the sites they deem undesirable. They then design algorithms that flag and punish sites bearing that footprint.

This is why a lot of legitimate sites get taken out in updates. A collection of sites may not look like problem sites to a human, but the algo sees them as the same thing because their technical footprint is the same. For instance, a website with a high number of 250-word pages is an example of a footprint. Not necessarily an undesirable one, but a footprint nevertheless. Similar footprints exist amongst ecommerce sites heavy on sitewide templating but light on content unique to the page.
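
As a toy illustration of what such a footprint check might look like, here is a crude word-count classifier. The 250-word floor, the 50% threshold, and the function itself are invented for this sketch; Google's actual signals are unknown:

```python
def thin_page_footprint(page_texts, min_words=250, threshold=0.5):
    """Flag a site where most pages fall under a word-count floor, one
    crude example of a technical footprint an algorithm could key on.
    The 250-word floor and 50% threshold are invented for illustration."""
    counts = [len(text.split()) for text in page_texts]
    thin = sum(1 for c in counts if c < min_words)
    return thin / len(counts) >= threshold

# Hypothetical site: two short pages and one long page
pages = ["word " * 100, "word " * 120, "word " * 900]
```

A site of mostly 100-word pages trips this check even if every one of those pages is genuinely useful, which is exactly how legitimate sites get swept up.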

Copying successful sites is a great way to learn, but can also be a trap. If you share a similar footprint, having followed the same SEO prescription, you may go down with them if Google decides their approach is no longer flavor of the month.

The Myth Of White Hat

A lot of sites that get taken out are white hat, i.e. sites that follow Google's webmaster guidelines.

It's a reasonably safe approach, but if you understand SEO, you'll soon realize that following a white hat prescription offers no guarantees of ranking, nor does it offer any guarantees you won't be taken out.

The primary reason there aren't any guarantees comes down to numbers. Google knows that when it makes a change, many sites will lose. They also know that many sites will win i.e. replace the sites that lost. If your site drops out, Google aren't bothered. There will be plenty of other sites to take your place. Google are only concerned that their users perceive the search results to be of sufficient quality.

The exception is if your site really is a one-of-a-kind. The kind of site that would embarrass Google if users couldn't find it. BMW, for example, in response to the query "BMW".

It's not fair, but we understand that's just how life is.

An Understanding

For those readers new to SEO: in order to really grasp SEO, you need to see things from the search engine's point of view.

Firstly, understand the search engine's business case. The search engine can only make money if advertisers pay for search traffic. If it were too easy for sites likely to use PPC to rank highly in the natural results, then the search engine's business model would be undermined. Therefore, it is in the search engine's interest to "encourage" purely commercial entities to use PPC, not SEO. One way they do this is to make the natural results volatile and unpredictable. There are exceptions, covered in my second point.

Secondly, search engines must provide sufficient information quality to their users. This is an SEO opportunity, because without webmasters producing free-to-crawl, quality content, there can be no search engine business model. The search engines must nurture this ecosystem.

If you provide genuine utility to end users, the search engines have a vested interest in your survival, perhaps not as an individual, but certainly as a group i.e. "quality web publishers". Traffic is the lifeblood of the web, and if quality web publishers aren't fed traffic, they die. The problem, for webmasters, is that the search engines don't care about any one "quality publisher", as there are plenty of quality publishers. The exception is if you're the type of quality publisher who has a well recognized brand, and would therefore give the impression to users that Google was useless if you didn't appear.

Thirdly, for all their cryptic black box genius, search engines aren't all that sophisticated. Yes, the people who run them are brilliant. The problems they solve are very difficult. They have built what, only decades ago, would have been considered magic. But, at the end of the day, it's just a bit of maths trying to figure out a set of signals. If you can work out what that set of signals are, the maths will - unblinkingly - reward you. It is often said that in the search engine wars, the black hats will be the last SEOs standing.

Fourthly, the search engines don't really like you. They identified you as a business risk in their statement to investors. You can, potentially, make them look bad. You can undermine their business case. You may compete with their own channels for traffic. They tolerate you because they need publishers making their stuff easy to crawl, and not locking their content away behind paywalls. Just don't expect a Christmas card.

SEO Strategy Built On Understanding

Develop strategies based on how a search engine sees the world.

For example, if you're a known brand, your approach will be different from that of a little-known, generic publisher. There isn't much risk you won't appear, as you could embarrass Google if users can't find you. This is the reason BMW were reinstated so quickly after falling foul of Google's guidelines; the same doesn't necessarily apply to lesser-known publishers.

If you like puzzles, then testing the algorithms can give you an unfair advantage. It's a lot harder than it used to be, but where there is difficulty, there is a barrier to entry to those who come later. Avoid listening to SEO echo chambers where advice may be well-meaning, but isn't based on rigorous testing.

If you're a publisher, not much into SEO wizardry, and you create content that is very similar to content created by others, you should focus on differentiation. If there are hundreds of publishers just like you, then Google doesn't care if you disappear. Google do need to find a way to reward quality, especially in niches that aren't well covered. Be better than the rest, but if you're not, slice your niche finer and finer until you're the top dog in yours. You should focus on building brand, so you can own a search stream. For example, this site owns the search stream "SEO Book", a stream Aaron created and built up.

Remember, search engines don't care about you, unless there's something in it for them.

Google Update Panda

Google tries to wrestle index update naming back from the pundits, calling this one "Panda". Named after one of their engineers, apparently.

The official Google line - and I'm paraphrasing here - is this:

Trust us. We're putting the bad guys on one side, and the good guys on the other

I like how Wired didn't let them off the hook.

Wired persisted:

Wired.com: Some people say you should be transparent, to prove that you aren’t making those algorithms to help your advertisers, something I know that you will deny.

Singhal: I can say categorically that money does not impact our decisions.

Wired.com: But people want the proof.

This answer, from Matt Cutts, was interesting:

Cutts: If someone has a specific question about, for example, why a site dropped, I think it’s fair and justifiable and defensible to tell them why that site dropped. But for example, our most recent algorithm does contain signals that can be gamed. If that one were 100 percent transparent, the bad guys would know how to optimize their way back into the rankings

Why Not Just Tell Us What You Want, Already!

Blekko makes a big deal about being transparent and open, but Google have always been secretive. After all, if Google want us to produce quality documents their users like and trust, why not just tell us exactly what such a document looks like?

Trouble is, Google's algorithms clearly aren't that bulletproof, as Google admit they can still be gamed; hence the secrecy. Matt says he would like to think there will be a time they could open source the algorithms, but it's clear that time isn't now.

Do We Know Anything New?

So, what are we to conclude?

  • Google can be gamed. We kinda knew that....
  • Google still aren't telling us much. No change there....

Then again, there's this:

Google have filed a patent that sounds very similar to what Demand Media does, i.e. it looks for SERP areas that are under-served by content and prompts writers to write for them.

The patent basically covers a system for identifying search queries which have low quality content and then asking either publishers or the people searching for that topic to create some better content themselves. The system takes into account the volume of searches when looking at the quality of the content so for bigger keywords the content would need to be better in order for Google to not need to suggest somebody else writes something
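
A rough sketch of the mechanism that description suggests might look like the following. The volume-scaled quality bar, the thresholds, and the sample data are all invented for illustration; they are not taken from the patent itself:

```python
def underserved_queries(query_stats, base_quality=0.5):
    """From {query: (monthly_searches, avg_result_quality)} pick queries
    whose result quality is low relative to demand. The volume-scaled
    quality bar is invented for illustration, not taken from the patent."""
    flagged = []
    for query, (volume, quality) in query_stats.items():
        # bigger keywords demand better content before they pass
        required = base_quality + min(volume / 1_000_000, 0.4)
        if quality < required:
            flagged.append(query)
    return flagged

# Hypothetical query stats with quality scores on a 0-1 scale
stats = {
    "how to fix a leaky faucet": (200_000, 0.55),
    "obscure widget part 9b": (300, 0.52),
}
```

Note how the same quality score passes for the tiny keyword but fails for the big one, matching the idea that bigger keywords need better content.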

If Google do implement technology based on this patent, then it would appear they aren't down on the "Content Farm" model. They may even integrate it themselves.

Until then....

How To Avoid Getting Labelled A Content Farmer

The question remains: how do you prevent being labelled as a low-quality publisher, especially when sites like eHow remain untouched, yet Cult Of Mac gets taken out? Note: Cult Of Mac appears to have been reinstated, but one wonders if that was the result of the media attention, or an algo tweak.

Google want content their users find useful. As always, they're cagey about what "useful" means, so those who want to publish content and rank well, but do not want to be confused with a content farm, are left to guess. And do a little reverse-engineering.

Here's a stab, based on our investigations, the conference scene, Google's rhetoric, and pure conjecture thus far:

  • A useful document will pass a human inspection
  • A useful document is not ad heavy
  • A useful document is well linked externally
  • A useful document is not a copy of another document
  • A useful document is typically created by a brand or an entity which has a distribution channel outside of the search channel
  • A useful document does not have a 100% bounce rate followed by a click on a different search result for that same search query ;)

Kinda obvious. Are we off-base here? Something else? What is the difference, as far as the algo is concerned, between eHow and Suite101? Usage patterns?

Still doesn't explain YouTube, though, which brings us back to:

Wired.com: But people want the proof

YouTube, the domain, is incredibly useful, but some pages - not so much. Did YouTube get hammered by update Panda, too?

Many would say that's unlikely.

I guess "who you know" helps.

In the Panda update some websites got owned. Others are owned and operated by Google. :D


Google Penalized BeatThatQuote.com

Shortly after we found out that Google was to purchase BeatThatQuote we highlighted how Google had bought a website that was deeply engaged in numerous nefarious black hat SEO practices. A friend just pinged me to confirm that Google has penalized the domain by removing it from the search results.

From a competition & market regulation perspective that was a smart move for Google. They couldn't leave it in the search results while justifying handing out penalties to its competitors. As an added bonus, the site is building up tons of authoritative links in the background from all the buzz about being bought by Google, so when Google allows it to rank again in 30 days it will rank better than ever.

Given the widespread media buzz their web dominance generates, Google adds millions of Pounds worth of inbound links to any website they buy.

The message Google sends to the market with this purchase is that you should push to get the attention of Google's hungry biz dev folks before you get scrutiny from their search quality team. After the payday the penalty is irrelevant because you already have cash in hand & the added links from the press mentioning the transaction will more than offset any spam links you remove. Month 1 revenues might be slightly lower, but months 2 through x will be far higher.

Google Buys BeatThatQuote, a UK Comparison Site Violating Google's Guidelines

It looks like Money.co.uk was right on the money:

BeatThatQuote.com today was sold to Google for GBP37.7 million. We think this deal is a tremendous opportunity for our company to develop new and innovative options for personal finance in the UK. Our team is excited about becoming a part of Google. We look forward to working with their engineers to create new tools making it easier for consumers to choose the right financial products. We think

What is screwed up about this is that Google is engaging in *major* channel conflict. Not only is there some gray area background stuff:

BeatThatQuote.com's ad prompted 101 complaints to the Advertising Standards Authority, with 65 objecting that the commercial "trivialised, condoned or encouraged bullying in the workplace".

But now they have to consider SEO as well. I highlighted how it was a bit unjust when Google arbitrarily chose to whack one site while letting another get away with worse just because its founder was good at public relations. But how can Google police Google's guidelines when Google is the one breaking them?

Doorway Pages / Gateway Sites

Remember how Overstock.com was recently penalized for offering discounts in exchange for links? BeatThatQuote partnered with Oxfam to create CompareForGood.com. The homepage consists of a bunch of links into BeatThatQuote.com. If you look at those links using our server header checker you will see some 301 redirects. Of course doorway pages are considered spam & we know that Google has torched some other affiliate programs for using 301 redirects.

Such links are doing 301 redirects
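
Verifying this yourself needs nothing fancy: any HTTP client that reports the status line without following redirects will do. Here is a minimal sketch that parses the status code and Location header out of a raw response head; the response text below is mocked up to match the pattern described, not an actual capture:

```python
def parse_status_and_location(raw_head):
    """Pull the status code and Location header from a raw HTTP response
    head, the way a simple server header checker would report them."""
    lines = raw_head.split("\r\n")
    status = int(lines[0].split()[1])  # e.g. "HTTP/1.1 301 Moved Permanently"
    location = None
    for line in lines[1:]:
        if line.lower().startswith("location:"):
            location = line.split(":", 1)[1].strip()
    return status, location

# Mocked-up response head shaped like what a redirecting doorway link returns;
# this is an invented example, not a real capture from the site.
raw = ("HTTP/1.1 301 Moved Permanently\r\n"
       "Location: http://www.beatthatquote.com/\r\n"
       "Content-Length: 0\r\n\r\n")
```

A 301 status with a Location header pointing back at the main site is the tell-tale doorway pattern a header checker surfaces.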

Sure that is fairly vanilla...a bit gray area. Certainly not straight up black hat.

....unlike...

Buying Links That Pass PageRank

Raise your hand if you have read a post by Matt Cutts on the dangers of buying links that flow PageRank?

Ok. Now keep your hand up if you have read a dozen of them.

Remember how Matt Cutts stated that some bloggers got torched for selling a single link? Remember the long NYT article about how Google is cracking down on link buying & how they penalized JC Penney for it?

With that in mind, can anyone explain why Google's newest purchase is buying links like this:

Not so much a categorized listing with an editorial review...just a paid link for the sake of flowing PageRank.

That one is totally flagrant.

A bit off color. Like comment spamming.

Sorta like the link exchanges in German.

But some are even more outrageous. Consider that BeatThatQuote is buying links from pages with ad sections like this:

Paid Blog Reviews

Remember those "evil" paid reviews Matt Cutts wrote of? Plenty of those to go around ;)

In fact, some of the paid blog links were in place so long that BeatThatQuote got thank-yous for advertising for over a year straight.

I don't have a decade of spam-fighting experience like Google does. But is it too much to suggest that before Google buys *any* website they do a basic compliance audit to verify that the site is operating within Google's TOS? I am an independent SEO and it took me all of 2 minutes to find numerous FLAGRANT violations.

What sort of message does Google send the market with the above behavior?

How Can Google Police Anyone?

Google has on multiple occasions penalized other UK-based finance sites for SEO issues & link buying. But now that Google owns one of the horses in the race, and that horse has been found to be using anabolic steroids, can they legitimately penalize any of their new competitors?

If I had a UK finance site I would go on a link buying binge right now. Google can't penalize you for it because they are doing the same damn thing. And if they do penalize you for DOING THE EXACT SAME THING GOOGLE IS DOING, then you know you have a legitimate gripe for the media, and I have no doubt Microsoft would be willing to help pick up the legal tab.

Google Eats Microsoft's Lunch Again!

Ultimately this is a body blow to Microsoft. Microsoft started to gain momentum in search through verticalization, but has since backed off. Meanwhile Google took Microsoft's ball and ran it in for a touchdown (acquiring Metaweb, trying to buy ITA Software, and buying BeatThatQuote). And now one of MSN's portal ad partners is owned by Google:

Head of partnerships at MSN, Phil Coxon said, “At Microsoft Advertising, we’re passionate about collaborating with brands to create compelling advertising campaigns. By providing new and exclusive content that appeals to consumers, this partnership both enhances the overall MSN user experience as well as providing a great platform for BeatThatQuote to engage with their target audience on a meaningful level. This deal builds on our previous partnership with BeatThatQuote, which led to a 400% increase in revenue generated from insurance products. We’re delighted to continue to build on this relationship with this new campaign.”

Oooops.

Update

After a Googler read this blog post, it appears that Beat That Quote has been penalized.

Almost Everything is Unprofitable

Clowning Around

There is a saying in the bond trading market that if you don't know who the clown in a deal is then look in the mirror because it is probably you. Business is the same way. Almost everyone gets taken for a ride at least once.

What is Ignorance?

Ignorance is often viewed as a condescending word, but it is how we are all born. It is only through learning and experience that we are able to do much more than survive. Any time you enter a new market or use a new strategy you start out from behind. You are the sucker who is losing money. Rarely does the new guy win just by showing up, or just by copying someone else's existing strategy. There has to be some point of differentiation.

A Brutal Uphill Climb

The leader has more data, more connections, more links, more capital, higher visitor value, and the algorithms have another layer of karma built over the top of them as well. Matt Cutts described part of the Panda update as "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."

Roadblocks & Potholes Are Everywhere

Based on those sorts of disadvantages, why would anyone want to try SEO? Well in almost any other business model similar roadblocks and pain points exist, and SEO allows one to build momentum over time without it being an all or nothing risk. The slow buildup can lead you toward success in ways you may not have anticipated. And the cost of failure is often little more than time. Plus you gain knowledge even when something fails.

Is SEO Really Any Different?

I think Chris Dixon is one of the smarter entrepreneurs and angel investors out there, but I was disappointed to see him write a post titled SEO is no longer a viable marketing strategy for startups:

I talk to lots of startups and almost none that I know of post-2008 have gained significant traction through SEO (the rare exceptions tend to be focused on content areas that were previously un-monetizable). Google keeps its ranking algorithms secret, but it is widely believed that inbound links are the preeminent ranking factor. This ends up rewarding sites that are 1) older and have built up years of inbound links 2) willing to engage in aggressive link building, or what is known as black-hat SEO.

A similar blog headline flipped around might read like "Most VC funded companies fail & founders get hosed on equity dilution, so getting funded is no longer a viable company formation strategy for startups." Of course something like that would be laughable, but it is no less absurd than saying SEO is no longer viable.

Sure, coming from behind is hard, but the above misses that:

  • Google has grown more aggressive in monetizing their search results through increased verticalization and navigational aids
  • many of the most profitable SEO plays are reinvesting into growth
  • most people who are successful with SEO do not like to attribute their success to it because doing so creates additional risks & more competition

Unique Market Approaches

Even treading water in a market where competitors are reinvesting profits & the market maker is tilting the table is quite respectable. If you want to come from behind by exactly cloning someone else's business model, it won't likely be profitable. That is why people attack markets from different perspectives, and why there are many different graphs: Chris isn't trying to beat Google by creating another link graph, but is looking at different signals.

Tectonic Shifts in Relevancy

Likewise marketing strategies can be vastly different between different companies and different projects within a company. Certain types of pages & certain types of websites rise and fall as the algorithms are adjusted to close down opportunistic loopholes. But as they make certain things harder they make other things easier. The whole content farm model was only enabled by an excessive weighting on domain authority & the introduction of rel=nofollow.

That opportunity may have fallen by the wayside. Many content mills just got hit pretty hard.

Was The Pain Really That Bad?

But for all the bluster about how it was one of the biggest changes in years, most of the content farms are only down maybe 20% to 50% in terms of traffic & revenues.

Sure that is a lot of revenue to disappear, but when you are operating at 80% net margins you can do that without it destroying your company. And this doesn't even take into account that many of these sites had a clean double over the past year. So if you grow 100% then lose 50% you are still even year on year, in spite of being penalized. Not bad in an environment where tons of businesses are going bankrupt offline.
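As a quick sanity check of that arithmetic (the dollar figures below are hypothetical, chosen only to illustrate the math):

```python
# Back-of-envelope check: a 100% gain followed by a 50% loss nets out flat,
# and an 80% net margin keeps the business profitable even after the hit.
# All figures are hypothetical.
start = 100_000.0                    # annual revenue before the growth year
after_growth = start * 2.0           # +100%: a "clean double"
after_penalty = after_growth * 0.5   # -50% from the algorithm update

print(after_penalty == start)        # True: flat year on year

margin = 0.80                        # assumed 80% net margin
print(f"net profit after the hit: ${after_penalty * margin:,.0f}")
```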

And of course those sites getting whacked create opportunity for other folks, who build sites using different strategies.

A Cautionary Tale

About a half-decade ago the CEO of a start-up contacted me & had us build a few links for them. Then they had to get their VC's approval for a full in-depth strategic review because it was going to cost well into five figures. Their VC investors didn't believe in SEO!

So that killed the project.

This company had a multi-lingual site where their leading market's content was only accessible through a drop-down form where the URLs did not change. Fixing that issue to make the site crawlable would have produced more revenue in the first few months than the cost of our contract. But the VC didn't think SEO was valuable. They never got that tip. And for businesses which have network effects built in, losing $x of revenue today can easily cost $10x or $20x a few years out.
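A minimal compounding sketch of that last point; the 60% annual growth rate is my own hypothetical assumption for a young network-effects business, not a figure from the story:

```python
# Hypothetical sketch: what $1 of revenue forgone today costs a few years out,
# assuming the recovered revenue would have compounded along with a
# network-effects business growing at a steady annual rate.
def future_cost(lost_today: float, annual_growth: float, years: int) -> float:
    """Compounded value of revenue lost today after `years` of growth."""
    return lost_today * (1.0 + annual_growth) ** years

for years in (3, 5, 7):
    multiple = future_cost(1.0, 0.60, years)  # 60%/yr assumed growth rate
    print(f"after {years} years: {multiple:.1f}x today's loss")
```

At a steady 60% a year, a dollar of revenue lost today is worth roughly 10x in five years, which is the ballpark the post describes.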

Current Market Leaders Were Yesterday's Gray Area Marketers

Mr. Dixon also highlights how established TripAdvisor is, but when it was founded, TripAdvisor was the small dog just starting out. His article also fails to mention that TripAdvisor was Text-Link-Ads' largest customer. In other words, they came from behind, took a calculated risk, and won. They backed off from the risks when the risks started to exceed the opportunity.

Not long after TripAdvisor started collecting consumer reviews, eHow was sold for $100,000. That turned out to be quite profitable for the buyers! And eHow was also known for aggressive & spammy link building against Google's guidelines. In fact, one of their largest competitors highlighted the lack of this information in their S-1 filing:

The entire 250+ page document is devoid of any discussion of incoming links which is the cornerstone of search engine optimization. By reading through the lines, it appears that they have two primary sources for link development for their owned and operated sites: (1) from their “undeveloped websites” and (2) from their content partner sites. Although these two initiatives alone are generally not financially profitable, they are successful approaches to maximizing the incoming link equity in their owned and operated properties.

The point is that start-ups shouldn't avoid all risk, but they should pick and choose their spots. The above sites are billion-dollar enterprises because they worked in the gray area to catch up & build a lead, and then pulled away from risk after they had a strong market position.

As time passes the opportunities change, but they don't really disappear.

The Cloud and Your SEO Data

The cloud is all the rage these days, and with good reason: it certainly can make life easier for a roving SEO, a freelancer, or anyone really. I am a big fan of the cloud and consistently use services like:

  • Dropbox
  • Evernote
  • 37signals' suite of products
  • SEO tools like the ones here at SEO Book, as well as toolsets like Raven & SEOmoz

Most of the mission critical data for a web marketer would be stuff like:

  • Keyword information
  • Link building info
  • Rankings
  • Analytics
  • Marketing plans for a site
  • And so on....

Hosted Solutions and Ease of Use

Many hosted solutions in the SEO & PPC tool space let you conduct those areas of your business on their servers. I'm leaving out Google Apps, Gmail, and Google Docs because I don't really use them much these days and because they are kind of the angry elephants in the room.

For an increasingly mobile worker, having access to data at all times is a big part of running a smooth and effective business. Not only are workers in the space more mobile these days, but many folks also have multiple computers and devices to work from.

Some folks have a slew of Apple devices (maybe an Air, an iMac, an iPhone, and an iPad) while others have a mashed-up set of devices based on their personal preferences or their company's requirements.

Trying to sync desktop software across so many different platforms is a real PIA, as you know, so more and more folks are beginning to use cloud-based solutions for everything.

Who Has Your Data?

Sit back and think about who has your data and where it is. You probably see these folks at conferences, watch webinars, follow them on Twitter, etc. You've probably gotten recommendations from people you trust with respect to using that company's services.

It's not about trusting one person or a couple of people you know at the company; you have to be able to trust the company itself. Sometimes it can be a difficult thing to think about, because you feel a personal connection to a particular group and you might feel like you are attacking their personal character by questioning whether you can trust them with the safekeeping of your data.

In any event, you have a right to question whether or not having your mission critical data hosted by anyone (not to mention another SEO company) is a good idea.

Is There a Record of Trust?

Some of these products are newer but there's no evidence of any wrongdoing or snooping going on behind the scenes. To be fair though, how would you know :)

The value is probably in the aggregation of such data (oh wow those links work, oh look at how well these keywords convert...etc) rather than something specific to a single campaign.

You really have to ask yourself if hosting SEO or PPC data with a company that operates in that space is good business practice. Personally speaking, I have accounts at the two spots I mentioned above (Raven and SEOmoz), so I feel mostly comfortable with those accounts, but I still think it's important to revisit the thought process from time to time.

I like to peruse the privacy policy of these places prior to stockpiling data with them. Here's an excerpt from SEOmoz's privacy policy:

SEOmoz offers a variety of online tools and software. These include, but are not limited to, our free SEO tools, our paid SEO tools, our API, and our tools on OpenSiteExplorer.org. These tools require you to enter a variety of information, such as URLs, domains, keywords, or other items relevant to Internet marketing and link research. We associate this information with your account in order to provide useful features, identify and terminate accounts that violate our Terms of Service, to improve our products, and to provide customer service. We never use this information for the provision of SEO consulting services so you do not need to worry that entering your information will be used against you or your clients by SEOmoz.

We take appropriate physical, electronic, and other security measures to help safeguard personal information from unauthorized access, alteration, or disclosure.

Data Is the New Gold

Knowledge is power and as SEO becomes more and more difficult, or at least more and more gray, your data becomes exponentially more important. What do you think your competitor(s) would give to know exactly what keywords were bringing you traffic and conversions, as well as which links you noted were more valuable than others?

Some of this data can (roughly) be gleaned through competitive research tools, but the pieces of information you are storing on other companies' (sometimes other SEO businesses') servers are hard, actual data rather than estimated or scraped data.

Occasionally I'll hear about people storing all sorts of sensitive SEO data in Google products. I personally think that is a bit cavalier (yes, yes, I know the risk-reward is not in Google's favor at all there, at least on an individual basis... but the cynic in me puts nothing past them :D).

Most places use heavy-duty security to lock down your accounts and restrict access to key employees, but if you've ever worked in an office setting you know that it usually isn't terribly difficult to take a peek at a key account or two.
