SEO 2014

We’re at the start of 2014.

SEO is finished.

Well, what we had come to know as the practical execution of “whitehat SEO” is finished. Google has defined it out of existence. Research keyword. Write page targeting keyword. Place links with that keyword as the anchor text. Google cares not for this approach.

SEO, as a concept, is now an integral part of digital marketing. To do SEO in 2014 - Google-compliant, whitehat SEO - digital marketers must seamlessly integrate search strategy into other aspects of digital marketing. It’s a more complicated business than traditional SEO, but offers a more varied and interesting challenge, too.

Here are a few things to think about for 2014.

1. Focus On Brand

Big brands not only get a free pass, they can get extra promotion. By being banned. Take a look at Rap Genius. Aggressive link-building strategy leads to de-indexing. A big mea culpa follows and what happens? Not only do they get reinstated, they’ve earned themselves a wave of legitimate links.

Now that’s genius.

Google would look deficient if it didn’t show that site where visitors expect to find it: enough people know the brand that leaving it out would make Google look bad.

Expedia's link profile was similarly outed for appearing to be at odds with Google's published standards. Could a no-name site with a similarly aggressive link profile pass a manual inspection? Unlikely.

What this shows is that if your brand is important enough that Google would look deficient by excluding it, then you enjoy advantages that no-name sites don’t. You are more likely to pass manual inspections, and probably more likely to get penalties lifted.

What is a brand?

In terms of search, it’s a site that visitors can find using a brand search. Just how much search volume you require is open to debate, but you don’t need to be a big brand like Apple, Trip Advisor or Microsoft. Rap Genius aren't. Ask “would Google look deficient if this site didn’t show up?” You can usually answer that by looking for evidence of search volume on the site’s name.

In advertising, brands have been used to secure a unique identity. That identity is associated with a product or service by the customer. Search used to be about matching a keyword term. But as keyword areas become saturated, and Google returns fewer purely keyword-focused pages anyway, developing a unique identity is a good way forward.

If you haven’t already, put some work into developing a cohesive, unique brand. If you have a brand, then think about generating more awareness. This may mean higher spend on brand-related advertising than you’ve allocated in previous years. The success metric is an increase in brand searches, i.e. searches on the name of the site.

2. Be Everywhere

The idea of a stand-alone site is becoming redundant. In 2014, you need to be everywhere your potential visitors reside. If your potential visitors are spending all day in Facebook, or YouTube, that’s where you need to be, too. It’s less about them coming to you, which is the traditional search metaphor, and a lot more about you going to them.

You draw visitors back to your site, of course, but look at every platform and other site as a potential extension of your own site. Pages or content you place on those platforms are yet another front door to your site, and can be found in Google searches. If you’re not where your potential visitors are, you can be sure your competitors will be, especially if they’re investing in social media strategies.

A reminder to see all channels as potential places to be found.

Mix in cross-channel marketing with remarketing and consumers get the perception that your brand is bigger. Google ran a display ad making this very point before they broadly enabled retargeting ads. Retargeting only further increases that lift in brand searches.

3. Advertise Everywhere

Are you finding it difficult to get top ten in some areas? Consider advertising with AdWords and on the sites that already hold those positions. Do some brand advertising on them to raise awareness and generate some brand searches. An advert placed on a site that offers a complementary good or service might be cheaper than going to the expense and effort needed to rank. It might also help insulate you from Google’s whims.

The same goes for guest posts and content placement, although obviously you need to be a little careful as Google can take a dim view of it. The safest way is to make sure the content you place is unique, valuable and has utility in its own right. Ask yourself whether the content would be equally at home on your own site if someone else offered it to you to host. If so, it’s likely okay.

4. Valuable Content

Google does an okay job of identifying good content. It could do better. They’ve lost their way a bit in terms of encouraging production of good content. It’s getting harder and harder to make the numbers work in order to cover the production cost.

However, it remains Google’s mission to provide users with answers they deem relevant and useful. The utility of Google relies on it. Any strategy aligned with providing genuine visitor utility will align with Google’s long-term goals.

Review your content creation strategies. Content that is of low utility is unlikely to prosper. While it’s still a good idea to use keyword research as a guide to content creation, it’s a better idea to focus on topic areas and creating engagement through high utility. What utility is the user expecting from your chosen topic area? If it’s rap lyrics for song X, then only the rap lyrics for song X will do. If it is plans for a garden, then only plans for a garden will do. See being “relevant” as “providing utility”, not keyword matching.

Go back to the basic principles of classifying the search term as either Navigational, Informational, or Transactional. Whatever type the keyword is, make sure the content offers the utility expected of that type. Be careful when dealing with informational queries that Google could use in its Knowledge Graph. If your pages deal with established facts that anyone else can state, then you have no differentiation, and that type of information is more likely to end up as part of Google’s Knowledge Graph. Instead, go deep on informational queries. Expand the information. Associate it with other information. Incorporate opinion.
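To make the classification step concrete, here is a minimal, purely illustrative sketch of bucketing queries into those three types. The cue words and the fallback rule are assumptions for illustration only, not how Google classifies queries.

```python
# Naive heuristic for bucketing queries as navigational, transactional, or
# informational. Cue words and fallback are illustrative assumptions.
TRANSACTIONAL_CUES = {"buy", "price", "cheap", "deal", "coupon", "order"}
NAVIGATIONAL_CUES = {"login", "homepage", "signin"}
INFORMATIONAL_CUES = {"how", "what", "why", "guide", "plans", "lyrics"}

def classify_query(query: str) -> str:
    words = set(query.lower().split())
    if words & TRANSACTIONAL_CUES:
        return "transactional"
    if words & NAVIGATIONAL_CUES or query.lower().endswith(".com"):
        return "navigational"
    if words & INFORMATIONAL_CUES:
        return "informational"
    return "informational"  # default: treat ambiguous queries as informational

if __name__ == "__main__":
    for q in ["rap genius login", "buy garden plans", "rap lyrics for song x"]:
        print(q, "->", classify_query(q))
```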

BTW, Bill has some interesting reading on the methods by which Google might be identifying different types of queries.

Methods, systems, and apparatus, including computer program products, for identifying navigational resources for queries. In an aspect, a candidate query in a query sequence is selected, and a revised query subsequent to the candidate query in the query sequence is selected. If a quality score for the revised query is greater than a quality score threshold and a navigation score for the revised query is greater than a navigation score threshold, then a navigational resource for the revised query is identified and associated with the candidate query. The association specifies the navigational resource as being relevant to the candidate query in a search operation.
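Read literally, the abstract describes a simple thresholding rule over query sequences: when a later, more refined query scores highly enough, its navigational resource gets associated with the earlier, vaguer query. A rough sketch of that logic follows; all scores, thresholds, and data structures here are invented for illustration, not taken from the patent.

```python
# Loose sketch of the logic in the patent abstract above: associate a revised
# query's navigational resource with an earlier candidate query when both the
# quality and navigation scores clear their thresholds.
QUALITY_THRESHOLD = 0.7
NAVIGATION_THRESHOLD = 0.8

def associate_navigational_resources(query_sequence, quality, navigation, nav_resource):
    """query_sequence: ordered list of queries from one session.
    quality / navigation: dicts mapping query -> score in [0, 1].
    nav_resource: dict mapping query -> its navigational resource (URL)."""
    associations = {}
    for i, candidate in enumerate(query_sequence[:-1]):
        for revised in query_sequence[i + 1:]:
            if (quality.get(revised, 0) > QUALITY_THRESHOLD
                    and navigation.get(revised, 0) > NAVIGATION_THRESHOLD
                    and revised in nav_resource):
                # The revised query's navigational resource is now treated as
                # relevant to the earlier candidate query in a search operation.
                associations[candidate] = nav_resource[revised]
                break
    return associations

if __name__ == "__main__":
    seq = ["rap lyrics site", "rap genius"]
    print(associate_navigational_resources(
        seq,
        quality={"rap genius": 0.9},
        navigation={"rap genius": 0.95},
        nav_resource={"rap genius": "http://rapgenius.com/"},
    ))
```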

5. Solve Real Problems

This is a follow-on from “ensuring you provide content with utility”. Go beyond keyword and topical relevance. Ask “what problem is the user trying to solve?” Is it an entertainment problem? A “How To” problem? What would their ideal solution look like? What would a great solution look like?

There is no shortcut to determining what a user finds most useful. You must understand the user. This understanding can be gleaned from interviews, questionnaires, third party research, chat sessions, and monitoring discussion forums and social channels. Forget about the keyword for the time being. Get inside a visitor’s head. What is their problem? Write a page addressing that problem by providing a solution.

6. Maximise Engagement

Google are watching for the click-back to the SERP results, an action characteristic of a visitor who clicked through to a site and didn’t deem what they found to be relevant to the search query in terms of utility. Relevance in terms of subject match is now a given.

Big blocks of dense text, even if relevant, can be off-putting. Add images and videos to pages that have low engagement and see if this fixes the problem. Where appropriate, make sure the user takes an action of some kind. Encourage the user to click deeper into the site following an obvious, well placed link. Perhaps they watch a video. Answer a question. Click a button. Anything that isn’t an immediate click back.

If you’ve focused on utility, and genuinely solving a user’s problem, as opposed to just matching a keyword, then your engagement metrics should be better than the guy who is still just chasing keywords and matching only on relevance to a keyword term.

7. Think Like A PPCer

Treat every click like you were paying for it directly. Once that visitor has arrived, what is the one thing you want them to do next? Is it obvious what they have to do next? Always think about how to engage that visitor once they land. Get them to take an action, where possible.

8. Think Like A Conversion Optimizer

Conversion optimization tries to reduce the bounce rate by re-crafting the page to ensure it meets the user’s needs. Optimizers do this by split testing different designs, phrases, copy and other elements on the page.

It’s pretty difficult to test these things in SEO, but it’s good to keep this process in mind. What pages of your site are working well and which pages aren’t? Is it anything to do with different designs or element placement? What happens if you change things around? What do the three top ranking sites in your niche look like? If their link patterns are similar to yours, what is it about those sites that might lead to higher engagement and relevancy scores?
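If you do want to borrow the conversion optimizer’s process, a basic two-proportion z-test over two page variants is enough to get started. This is a generic statistics sketch with made-up visit and conversion counts, not tied to any particular analytics tool.

```python
# Minimal split-test sketch: compare conversion (or bounce) rates of two page
# variants with a two-proportion z-test. Counts below are invented examples.
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

if __name__ == "__main__":
    # Variant A: 120 conversions out of 2,400 visits; variant B: 168 of 2,500
    z, p = z_test_two_proportions(120, 2400, 168, 2500)
    print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference
```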

9. Rock Solid Strategic Marketing Advantage

SEO is really hard to do on generic me-too sites. It’s hard to get links. It’s hard to get anyone to talk about them. People don’t share them with their friends. These sites don’t generate brand searches. The SEO option for these sites is typically what Google would describe as blackhat, namely link buying.

Look for a marketing angle. Find a story to tell. Find something unique and remarkable about the offering. If a site doesn’t have a clearly articulated point of differentiation, it is much harder to get value from organic search whilst staying within the guidelines.

10. Links

There’s a reason Google hammers links: they work. Otherwise, Google surely wouldn’t make such a big deal about them.

Links count. It doesn’t matter if they are no-follow, scripted, within social networks, or wherever, they are still paths down which people travel. It comes back to a clear point of differentiation, genuine utility and a coherent brand. It’s a lot easier, and safer, to link build when you’ve got all the other bases covered first.

Did Matt Cutts Endorse Rap Genius Link Spam?

On TWIG Matt Cutts spoke about the importance of defunding spammers & breaking their spirits.

If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits. You want to make them frustrated and angry. There are parts of Google's algorithms specifically designed to frustrate spammers and mystify them and make them frustrated. And some of the stuff we do gives people a hint their site is going to drop and then a week or two later their site actually does drop so they get a little bit more frustrated. And so hopefully, and we've seen this happen, people step away from the dark side and say "you know, that was so much pain and anguish and frustration, let's just stay on the high road from now on" some of the stuff I like best is when people say "you know this SEO stuff is too unpredictable, I'm just going to write some apps. I'm going to go off and do something productive for society." And that's great because all that energy is channeled at something good.

What was less covered was that in the same video Matt Cutts made it sound like anything beyond information architecture, duplicate content cleanup & clean URLs was quickly approaching scamming - especially anything to do with links. So over time more and more behaviors get reclassified as black hat spam as Google gains greater control over the ecosystem.

there's the kind of SEO that is better architecture, cleaner URLs, not duplicate content ... that's just like making sure your resume doesn't have any typos on it. that's just clever stuff. and then there's the type of SEO that is sort of cheating. trying to get a lot of bad backlinks or scamming, and that's more like lying on your resume. when you get caught sometimes there's repercussions. and it definitely helps to personalize because now anywhere you search for plumbers there's local results and they are not the same across the world. we've done a diligent job of trying to crack down on black hat spam. so we had an algorithm named Penguin that launched that kind of had a really big impact. we had a more recent launch just a few months ago. and if you go and patrol the black hat SEO forums where the guys talk about the techniques that work, now it's more people trying to sell other people scams rather than just trading tips. a lot of the life has gone out of those forums. and even the smaller networks that they're trying to promote "oh buy my anglo rank or whatever" we're in the process of tackling a lot of those link networks as well. the good part is if you want to create a real site you don't have to worry as much about these bad guys jumping ahead of you. the playing ground is a lot more level now. panda was for low quality. penguin was for spam - actual cheating.

The Matt Cutts BDSM School of SEO

As part of the ongoing campaign to "break their spirits" we get increasing obfuscation, greater time delays between certain algorithmic updates, algorithmic features built explicitly with the goal of frustrating people, greater brand bias, and more outrageous selective enforcement of the guidelines.

Those who were hit by either Panda or Penguin in some cases took a year or more to recover. Far more common is no recovery — ever. How long and how much do you invest in a dying project when the recovery timeline is unknown?

You Don't Get to Fascism Without 2-Tier Enforcement

While success in and of itself may make one a "spammer" to the biased eyes of a search engineer (especially if you are not VC funded nor part of a large corporation), many who are considered "spammers" self-regulate in a way that makes them far more conservative than the alleged "clean" sites do.

Pretend you are Ask.com and watch yourself get slaughtered without warning.

Build a big brand & you will have advanced notification & free customer support inside the GooglePlex:

In my experience with large brand penalties, (ie, LARGE global brands) Google have reached out in advance of the ban every single time. - Martin Macdonald

Launching a Viral Linkspam Sitemap Campaign

When RapGenius was penalized, it was because they were broadly, openly and publicly offering to promote bloggers who would dump a list of keyword-rich deep links into their blog posts. They were basically turning boatloads of blogs into mini-sitemaps for popular new music albums.

Remember reading dozens (hundreds?) of blog posts last year about how guest posts are spam & Google should kill them? Well these posts from RapGenius were like a guest post on steroids. The post "buyer" didn't have to pay a single cent for the content, didn't care at all about relevancy, AND a sitemap full of keyword rich deep linking spam was included in EACH AND EVERY post.

Most "spammers" would never attempt such a campaign because they would view it as being far too spammy. They would have a zero percent chance of recovery as Google effectively deletes their site from the web.

And while RG is quick to distance itself from scraper sites, for almost the entirety of their history virtually none of the lyrics posted on their site were even licensed.

In the past I've mentioned Google is known to time the news cycle. It comes as no surprise that on a Saturday, barely a week after being penalized, Google restored RapGenius's rankings.

How to Gain Over 400% More Links, While Allegedly Losing

While the following graph may look scary in isolation, if you know the penalty is only a week or two then there's virtually no downside.

Since being penalized, RapGenius has gained links from over 1,000* domains:

  • December 25th: 129
  • December 26th: 85
  • December 27th: 87
  • December 28th: 54
  • December 29th: 61
  • December 30th: 105
  • December 31st: 182
  • January 1st: 142
  • January 2nd: 112
  • January 3rd: 122

The above add up to 1,079 & RapGenius has only built a total of 11,930 unique linking domains in their lifetime. They grew about 10% in 10 days!
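For anyone who wants to check the arithmetic, the daily figures above can be summed and compared against the lifetime total quoted from Ahrefs:

```python
# Sanity-checking the figures quoted above: sum the daily new referring
# domains and express them against the lifetime total reported by Ahrefs.
daily_new_domains = [129, 85, 87, 54, 61, 105, 182, 142, 112, 122]
lifetime_total = 11_930  # unique linking domains, per the article

gained = sum(daily_new_domains)              # 1,079
growth = gained / (lifetime_total - gained)  # roughly 10% over the 10 days
print(f"gained {gained} domains, ~{growth:.0%} growth over the period")
```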

On every single day the number of new referring domains VASTLY exceeded the number of referring domains that disappeared. And many of these new referring domains are the mainstream media and tech press sites, which are both vastly over-represented in importance/authority on the link graph. They not only gained far more links than they lost, but they also gained far higher quality links that will be nearly impossible for their (less spammy) competitors to duplicate.

They not only got links, but the press coverage acted as a branded advertising campaign for RapGenius.

Here's some quotes from RapGenius on their quick recovery:

  • "we owe a big thanks to Google for being fair and transparent and allowing us back onto their results pages" <-- Not the least bit true. RapGenius was not treated fairly, but rather they were given a free ride compared to the death hundreds of thousands of small businesses have been been handed over the past couple years.
  • "On guest posts, we appended lists of song links (often tracklists of popular new albums) that were sometimes completely unrelated to the music that was the subject of the post." <-- and yet others are afraid of writing relevant on topic posts due to Google's ramped fearmongering campaigns
  • "we compiled a list of 100 “potentially problematic domains”" <-- so their initial list of domains to inspect was less than 10% the number of links they gained while being penalized
  • "Generally Google doesn’t hold you responsible for unnatural inbound links outside of your control" <-- another lie
  • "of the 286 potentially problematic URLs that we manually identified, 217 (more than 75 percent!) have already had all unnatural links purged." <-- even the "all in" removal of pages was less than 25% of the number of unique linking domains generated during the penalty period

And Google allowed the above bullshit during a period when they were sending out messages telling other people WHO DID THINGS FAR LESS EGREGIOUS that they are required to remove more links & Google won't even look at their review requests for at least a couple weeks - A TIME PERIOD GREATER THAN THE ENTIRE TIME RAPGENIUS WAS PENALIZED FOR.

In Conclusion...

If you tell people what works and why, you are a spammer with no morals. But if you are VC funded, Matt Cutts has made it clear that you should spam the crap out of Google. Just make sure you hire a PR firm to trump up press coverage of the "unexpected" event & have a faux apology saved in advance. So long as you lie to others and spread Google's propaganda you are behaving in an ethical white hat manner.

Notes

* These stats are from Ahrefs. A few of these links may have been in place before the penalty and only recently crawled. However it is also worth mentioning that all third party databases of links are limited in size & refresh rate by optimizing their capital spend, so there are likely hundreds more links which have not yet been crawled by Ahrefs. One should also note that the story is still ongoing and they keep generating more links every day. By the time the story is done spreading they are likely to see roughly a 30% growth in unique linking domains in about 6 weeks.

Gray Hat Search Engineering

Almost anyone in internet marketing who has spent a couple months in the game has seen some "shocking" case study where changing the color of a button increased sales 183% or such. In many cases such changes only happen when the original site had no focus on conversion.

Google, on the other hand, has billions of daily searches and is constantly testing ways to increase yield:

The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.
...
By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.

One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands to advertisers - sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only pay more for clicks that you were getting for free, but you also pay more for the other paid clicks you were getting cheaper in the past.

That's how monopolies work - according to Eric Schmidt they are immune from market forces.

Search itself is the original "native ad." The blend confuses many searchers as the background colors fade into white.

Google tests colors & can control the flow of traffic based not only on result displacement, but also the link colors.

It was reported last month that Google tested adding ads to the knowledge graph. The advertisement link is blue, while the ad disclosure is to the far right out of view & gray.

I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.

Where Google gets paid for the link, the link is blue.

Where Google scrapes third party content & shows excerpts, the link is gray.

The primary goal of such a knowledge block is result displacement - shifting more clicks to the ads and away from the organic results.

When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it's below the fold.

What's so bad about this practice in health?

  • Context Matters: Many issues have overlapping symptoms where a quick glance at a few out-of-context symptoms causes a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google's remote rater documents discuss "your money or your life" (YMYL) topics & talk up the importance of knowing who exactly is behind content, but when they use gray font on the source link for their scrape job they are doing just the opposite.
  • Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can't find a pharmaceutical company worth $10s of billions that hasn't pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
  • Artificially Driving Up Prices: In-patent drugs often cost 100x as much as the associated generic drugs & thus the affordable solutions are priced out of the ad auctions, where the price for a click can vastly exceed the profit from selling a generic prescription drug.

Where's the business model for publishers when they have real editorial cost & must fact check and regularly update their content, their content is good enough to be featured front & center on Google, but attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?

Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?

Where do you place your chips?

Google believes it can ultimately fulfil people’s data needs by sending results directly to microchips implanted into its users’ brains.

Historical Revisionism

A stopped clock is right two times a day.

There’s some amusing historical revisionism going on in the SEO punditry world right now, which got me thinking about the history of SEO. I’d like to talk about some common themes of this historical revision, which go along the lines of “what I predicted all those years ago came true - what a visionary I am!” No naming names, as I don't mean this to be anything personal - the same theme has popped up in a number of places - just making some observations :)

See if you agree….

Divided We Fall

The SEO world has never been united. There are no industry standards and qualifications like you’d find in the professions, such as being a doctor, or lawyer or a builder. If you say you’re an SEO, then you’re an SEO.

Part of the reason for the lack of industry standard is that the search engines never came to the party. Sure, they talked at conferences, and still do. They offered webmasters helpful guidelines. They participated in search engine discussion forums. But this was mainly to do with risk management. Keep your friends close, and your enemies closer.

In all these years, you won’t find one example of a representative from a major search engine saying “Hey, let’s all get together and form an SEO standard. It will help promote and legitimize the industry!”.

No, it has always been decrees from on high. “Don’t do this, don’t do that, and here are some things we’d like you to do”. Webmasters don’t get a say in it. They either do what the search engines say, or they go against them, but make no mistake, there was never any partnership, and the search engines didn’t seek one.

This didn’t stop some SEOs seeing it as a form of quasi-partnership, however.

Hey Partner

Some SEOs chose to align themselves with search engines and do their bidding. If the search engine reps said “do this”, they did it. If the search engines said “don’t do this”, they’d wrap themselves up in convoluted rhetorical knots pretending not to do it. This still goes on, of course.

In the early 2000s, it turned, curiously, into a question of morality. There was “Ethical SEO”, although quite what it had to do with ethics remains unclear. Really, it was another way of saying “someone who follows the SEO guidelines”, presuming that whatever the search engines decree must be ethical, objectively good and have nothing to do with self-interest. It’s strange how people kid themselves, sometimes.

What was even funnier was the search engine guidelines were kept deliberately vague and open to interpretation, which, of course, led to a lot of heated debate. Some people were “good” and some people were “bad”, even though the distinction was never clear. Sometimes it came down to where on the page someone puts a link. Or how many times someone repeats a keyword. And in what color.

It got funnier still when the search engines moved the goal posts, as they are prone to do. What was previously good - using ten keywords per page - suddenly became the height of evil, but using three was “good” and so all the arguments about who was good and who wasn’t could start afresh. It was the pot calling the kettle black, and I’m sure the search engines delighted in having the enemy warring amongst themselves over such trivial concerns. As far as the search engines were concerned, none of them were desirable, unless they became paying customers, or led paying customers to their door. Then there was all that curious Google+ business.

It's hard to keep up, sometimes.

Playing By The Rules

There’s nothing wrong with playing by the rules. It would have been nice to think there was a partnership, and so long as you followed the guidelines, high rankings would naturally follow, the bad actors would be relegated, and everyone would be happy.

But this has always been a fiction. A distortion of the environment SEOs were actually operating in.

Jason Calacanis, never one to miss an opportunity for controversy, fired some heat seekers at Google during his WebmasterWorld keynote address recently…..

Calacanis proceeded to describe Cutts and Google in terms like, “liar,” “evil,” and “a bad partner.” He cautioned the PubCon audience to not trust Google, and said they cooperate with partners until they learn the business and find a way to pick off the profits for themselves. The rant lasted a good five minutes….

He accused Google of doing many of the things SEOs are familiar with, like making abrupt algorithm changes without warning. They don’t consult, they just do it, and if people’s businesses get trashed as a result, then that’s just too bad. Now, if that’s a sting for someone who is already reasonably wealthy and successful like Calacanis, just imagine what it feels like for the much smaller web players who are just trying to make a living.

The search business is not a pleasant environment where all players have an input, and then standards, terms and play are generally agreed upon. It’s war. It’s characterized by a massive imbalance of power and wealth, and one party will use it to crush those who it determines stand in its way.

Of course, the ever pleasant Matt Cutts informs us it’s all about the users, and that’s a fair enough spin of the matter, too. There was, and is, a lot of junk in the SERPs, and Mahalo was not a partner of Google, so any expectation they’d have a say in what Google does is unfounded.

The take-away is that Google will set rules that work for Google, and if they happen to work for the webmaster community too, well that’s good, but only a fool would rely on it. Google care about their bottom line and their projects, not ours. If someone goes out of business due to Google’s behaviour, then so be it. Personally, I think the big technology companies do have a responsibility beyond themselves to society, because the amount of power they are now centralising means they’re not just any old company anymore, but great vortexes that can distort entire markets. For more on this idea, and where it’s all going, check out my review of “Who Owns The Future” by Jaron Lanier.

So, if you see SEO as a matter of playing by their rules, then fine, but keep in mind "those who can give you everything can also take everything away". Those rules weren't designed for your benefit.

Opportunity Cost

There was a massive opportunity cost to following so-called ethical SEO during the 2000s.

For a long time, it was relatively easy to get high rankings by being grey. And if you got torched, you probably had many other sites with different link patterns good to go. This was against the webmaster guidelines, but given marketing could be characterized as war, one does not let the enemy define one's tactics. Some SEOs made millions doing it. Meanwhile, a lot of content-driven sites disappeared. That was, perhaps, my own "a stopped clock is right two times a day" moment. It's not like I'm going to point you to all the stuff I've been wrong about, now is it :)

These days, a lot of SEO is about content and how that content is marketed, but more specifically it’s about the stature of the site on which that content appears. That’s the bit some pundits tend to gloss over. You can have great content, but that’s no guarantee of anything. You will likely remain invisible. However, put that exact same content on a Fortune 500 site, and that content will likely prosper. Ah, the rich get richer.

So, we can say SEO is about content, but that’s only half the picture. If you’re a small player, the content needs to appear in the right place, be very tightly targeted to your audience's needs so they don’t click back, and it should be pushed through various media channels.

Content, even from many of these "ethical SEOs", used to be created for search engines in the hope of netting as many visitors as possible. These days, it’s probably a better strategy to get inside the audience's heads and target it to their specific needs, as opposed to a keyword, then get that content out to wherever your audience happens to be. Unless, of course, you’re Fortune 500 or otherwise well connected, in which case you can just publish whatever you like and it will probably do well.

Fair? Not really, but no one ever said this game was fair.

Whatever Next?

Do I know what’s going to happen next? In ten years’ time? Nope. I could make a few guesses, and like many other pundits, some guesses will prove right, and some will be wrong, but that’s the nature of the future. It will soon make fools of us all.

Having said that, will you take a punt and tell us what you think will be the future of SEO? Does it have one? What will it look like? If you’re right, then you can point back here in a few years’ time and say “Look, I told you so!”.

If you’re wrong, well, there’s always historical revisionism :)

Prospering When The Keyword Is "Not Provided"

So, Google has pulled the chair out from under the SEO industry.

Google is no longer passing (much) keyword referrer data. This has been coming for a while, although many people didn’t expect most keyword data to disappear, and not quite this quickly.

As Aaron noted just last month:

Google is not only hiding half of their own keyword referral data, but they are hiding so much more than half that even when you mix in Bing and Yahoo! you still get over 50% of the total hidden. Google's 86% of the 26,233 searches is 22,560 searches.
Keyword (not provided) being shown for 13,413 is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
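Aaron's percentages follow directly from the figures he quotes; here is a quick reproduction of the arithmetic, using only the numbers given above:

```python
# Reproducing the arithmetic in the quote: of 26,233 total searches, Google's
# share is 86%, and 13,413 of those arrived as (not provided).
total_searches = 26_233
google_share = 0.86

google_searches = total_searches * google_share   # ~22,560
hidden_fraction = 13_413 / google_searches        # ~0.59
print(f"Google searches: {google_searches:,.0f}; hidden: {hidden_fraction:.0%}")
```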

Google, citing privacy concerns, has been increasingly withholding keyword data in the form of “not provided”. In the past week, they’ve been pretty on track for 100%, and things look set to stay that way.

In the interests of user privacy.

On yesterday's "This Week in Google," a Google engineer called Matt Cutts revealed that the company started encrypting its queries in 2008 after reading my novel Little Brother

Strangely, privacy doesn’t seem to be an issue when it comes to Adwords. 100% of the keyword referral data remains available via Google’s proprietary advertising channel. I guess the lesson here is that user privacy is much less of an issue, so long as you’re paying Google to see it.

Uh-huh.

Opposite Sides

If anyone is still in any doubt about Google’s relationship with SEOs, then hopefully they’re left in no doubt now. There was no industry consultation, Google unilaterally made these changes and thus broke a search industry standard that has been in place since the search industry began. This move makes life harder for all SEOs.

In terms of privacy issues, there is some truth in it. Problem is, because privacy doesn’t extend to Adwords, the explanation isn’t particularly convincing. The message is that if you want keyword data, then you have to pay to get it via Adwords.

One of the cornerstones of SEO is optimization based on keyword terms. Since last century, SEOs have mined data for keyword terms. They have constructed pages and sites based on those terms with the aim of ranking well for those terms. In theory, everyone wins. The searcher finds what they’re looking for, the search engine looks relevant, and the webmaster receives traffic.

This model has developed some serious cracks over the years.

One problem is PPC. The search engine now has split incentives. They want the results to be relevant, so visitors return often, but, perversely, they also have an incentive for the user not to click on the results, but to click on the advertising links, instead.

This becomes a business problem when an intermediary - an SEO - runs a service that competes with the advertising. The value proposition of the SEO is to get the click on the non-advertising links. Not only is the search engine being deprived of the click, the SEO is likely dissuading the site owner from spending more on PPC, or removing the need to do so.

So, the SEO is a competitor, although potentially useful in a couple of respects.

One, they encourage sites to be more crawler friendly than they would otherwise be. There was a time when there were a lot of Flash sites, and sites designed, often unwittingly, as uncrawlable brochures. These have mostly been eliminated due to the imperatives of search. SEOs encouraged webmasters to focus more on the production of crawlable content. Secondly, SEOs acted as a defacto-sales force for adwords. If a client saw search as important, then PPC was likely to be part of the mix to help extend reach.

However, as the search engines filled with crawlable content, and a lot of it was junk, the search engines had to get better at determining relevance, and not just by matching keywords. They’ve largely achieved this, so the SEO is no longer offering the search engine much they don’t already have in abundance - crawlable content they can easily classify.

So, that just leaves the SEO as a competitor and a potential defacto-sales force for a higher Adwords spend. So, removing keyword referral data was a clever move. It will drive more search spend to Adwords and make life harder for Adwords competitors, namely SEOs. If you’re doing Adwords, you’re a customer, if you’re doing SEO, you’re a competitor.

What’s Next?

For some, it will mean a significant change in strategy.

Google don’t need pages optimized against a keyword phrase. In response, SEOs could look at broader page-level metrics, like traffic volumes and conversion rates. They could adopt publishing strategies, backed by sales funnel analytics and optimization. For example, a webmaster may sell a variety of products and spend more time watching out for the on-site links users click on the most in order to determine searcher intent. They optimize what happens after the click. In order to get the click in the first place, they might throw out a fairly wide content net of on-topic pages, and hope to scoop up a lot of fish.

Some will bite the bullet and spend more on Adwords. Adwords will reveal the search keywords linked with volume, and this data can then be fed through into SEO campaigns. We’ll likely see a return to rank checking and matching of these ranks against visitor activity on site.

SEOs could also use proxy information from other search engines, such as Bing. The problem is that other engines have low traffic volumes, meaning comparisons to Google traffic will be inaccurate due to small sample sizes. Still, better than nothing. Webmaster Tools data is available, although this isn’t persistent and is pretty clunky compared to keyword data within analytics packages. No doubt new keyword mining and tracking tools will spring up that will help approximate Google keyword traffic. It will be interesting to see what happens in this space, so if anyone spots any of these services, please add them to the comments.
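As a rough illustration of the proxy approach, you could project the keyword mix seen in Bing referrers onto your total Google organic visits. This assumes the keyword mix is broadly similar across engines, which is exactly the small-sample guesswork cautioned about above; the numbers are invented for illustration.

```python
# Hedged proxy estimate: take the keyword mix observed in Bing referrers and
# project it onto total Google organic visits (where keywords are hidden).
bing_keyword_visits = {"garden plans": 40, "raised bed plans": 25, "garden ideas": 15}
google_organic_visits = 12_000  # total Google visits with keyword hidden

bing_total = sum(bing_keyword_visits.values())
estimated = {
    kw: round(google_organic_visits * visits / bing_total)
    for kw, visits in bing_keyword_visits.items()
}
print(estimated)  # rough keyword-level estimates, not measurements
```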

However, a bigger problem for SEOs still hovers beyond the horizon. If SEOs are competitors to Adwords, then SEOs can expect ongoing changes from Google that will further reduce their ability to compete with Adwords. Another day, another inbound missile. No one should be in any doubt that Google will have a series of missiles lined up.

Vince. Panda. Penguin. Knowledge Graph. Link disavow tool. Decommissioned keyword research tool. Keyword (not provided). More to come.

Adopt A Wider Digital Strategy

One approach is to learn more about the visitor using other metrics at the page and site level.

The point of SEO is to get relevant traffic. Keyword data helped SEOs to target pages and go some way to understanding user intent. However, determining intent by the keyword alone has always been a hit and miss affair. Sometimes, the intent is obvious, particularly on long keyword strings. But the more generic the keyword term, the less you can tell about visitor intent, which then leads to the visitor clicking-back and refining their search.

We should be looking for a richer determination of visitor intent.

Of course, we can watch and measure what visitors do after they arrive on site. If they click back, we know we’re off-topic for that user. Or not attractive enough. Or not getting the message across clearly. Or perhaps we have targeted the wrong demographic. Could users be segmented much further than they already are? We could run A/B testing to learn more about the audience. We could offer multiple paths and see which are the most promising in terms of engagement. In so doing, we understand a lot more about visitors than we could by just guessing based on the keywords they use.

SEOs will likely be looking more at content strategy. Is this content really what the user wants? Is a site offering text when what users really want is video? Does the site have a strategy to test content types against one another? And the placement thereof? We can establish this by gaining a deep understanding of analytics and incorporating demographic information and other third-party research.

Engagement metrics are a big thing post-Panda. Are people clicking back straight away, or clicking further into the site? Refine content and links until bounce rates come down. These elements can also be tested on Adwords landing pages. If the engagement metrics are right on an adwords landing page, they are likely right if a similar page is used for SEO. The ranking for an individual keyword doesn’t matter so much, just as long as enough of the audience who do arrive are engaging.

Look at optimizing the user experience in terms of better usability and watching the paths they take through the site. Where are we losing people? Could the funnels be made more evident? And which users are we talking about? i.e. young visitors vs old visitors, returning visitor vs new visitors?
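A simple way to start answering “which users?” is to segment a basic engagement metric, such as bounce rate, by landing page and visitor type. The sample rows below are stand-ins for whatever your analytics export actually provides.

```python
# Sketch: bounce rate per landing page, split by new vs. returning visitors.
from collections import defaultdict

sessions = [
    # (landing_page, visitor_type, bounced)
    ("/garden-plans", "new", True),
    ("/garden-plans", "new", False),
    ("/garden-plans", "returning", False),
    ("/rap-lyrics", "new", True),
    ("/rap-lyrics", "returning", True),
]

counts = defaultdict(lambda: [0, 0])  # segment -> [bounces, sessions]
for page, visitor_type, bounced in sessions:
    seg = (page, visitor_type)
    counts[seg][0] += int(bounced)
    counts[seg][1] += 1

for (page, visitor_type), (bounces, total) in sorted(counts.items()):
    print(f"{page:15s} {visitor_type:10s} bounce rate {bounces / total:.0%}")
```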

There are some high-end tools that can help with this, such as Forrester's Technographics and Adobe Neolane; however, there are other more-than-adequate approaches, mixing readily available tools with a little best practice. Consider website surveys and polls, and third-party profiling tools, like SEMRush, to quantify your competition.

In "Digital Marketing Analytics: Making Sense Of Consumer Data In A Digital World", the authors give a lot of practical advice on mining the various channels so as to better understand your customer, and configure your website to meet their needs. Only a small fraction of this can be gleaned from keyword data.

For example, mining social media channels tells you a lot about your potential audience. How they talk, who they talk to, what their interests are, who they are connected to, where they are, who influences who, and who shares what with who. Social profile and activity analysis offers rich audience insight, often more so than keywords. You can segment and understand your audiences in ways that would be difficult to do using keywords alone.

So, losing keywords makes life difficult. But it also presents opportunities.

As Much As Things Change, They Stay The Same

The promise of search marketing is to deliver the right message to the right people at the right time. That’s the same promise for all digital marketing, keyword driven or otherwise. We should place just as much emphasis, if not more, on measuring audience behaviour over time i.e. what happens well before the click, and what happens after it, as we do on the keyword, itself.

The better we understand the audience, the better we are able to serve their needs, which likely leads to a more profitable business than those who understand less. Keywords help, but they’re not the be-all and end-all. Google still has the exact same user base. Someone still has to rank #1 against a given keyword term. So long as you're doing Adwords, your competitors have no better idea regarding keywords than you do, so the playing field is still level in that respect.

Those putting more effort into page-level metrics, site metrics, and brand in order to better understand visitors now stand to gain advantage. The fundamentals haven't changed:

  • Who is the audience?
  • Where are they located?
  • What does the audience know?
  • What are they interested in?
  • What does the audience need?

When SEO becomes harder, the barrier is raised, meaning those who jump that barrier are in a more dependable position than they were before. Remember, most of you will have archived keyword data. New entrants to the SEO field will not, and will find it very difficult, if not impossible, to acquire.

The game just got harder. For everyone.

Google Keyword (Not Provided)

Just a follow up on the prior (not provided) post, as Google has shot the moon since our last post on this. Here's a quick YouTube video.

The above video references the following:

Matt Cutts when secured search first rolled out:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.

This Week in Google (TWIG) show 211, where Matt mentioned the inspiration for encrypted search:

we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said "we've got to encrypt the web."

The integration of organic search performance data inside AdWords.

The esteemed AdWords advertiser David Whitaker.

When asked about the recent increase in (not provided), a Google representative stated the following:

We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….

The motivation here is not to drive the ads side — it’s for our search users.

What an excellent time for Google to block paid search referrals as well.

If the move is important for user safety then it should apply to the ads as well.

The Benefits Of Thinking Like Google

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google’s negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purposes of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or at very least, no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier: In response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", Ralph writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to downplay any SEO bias, i.e. what we may like to see from an SEO standpoint, and instead try to look at the world from Google’s point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of the content I show above the fold is controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google aren’t the webmaster’s friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would have been madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google’s link quality control. For free. They’re pointing out the links they see as being “bad” - links Google’s algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner's perspective, taking links down involves a time cost, so there is no benefit to the site owner in doing so, especially if they receive numerous requests. Secondly, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller, is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place, for the purposes of achieving higher rank, are going to need to be no-followed in order to be clear about intent. Extreme? What's going to be the emphasis in six months time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would have been unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at anytime whether you follow the current guidelines or not, especially when the goal-posts keep moving. So, the risk of not following the guidelines, and following the guidelines but not ranking well is pretty much the same - no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Winning Strategies to Lose Money With Infographics

Google is getting a bit absurd with suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed to have a negative ROI on each investment - until your company goes under.

Some will describe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.

If you run a smaller & lesser-known website, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if your own featured premium content - content you dropped 4 or 5 figures on - gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find it a bit disingenuous to claim that people putting something 20,000 pixels large on their site are not actively vouching for it. If something were crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No-cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.
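To see the double standard in concrete terms, here is a hypothetical embed snippet builder - the markup, URLs and function are invented for illustration, not taken from Google's generator or anyone else's:

    # Hypothetical embed-code builder: the same infographic credit link,
    # with and without rel="nofollow". Purely illustrative markup.
    def embed_code(img_url: str, source_url: str, title: str, nofollow: bool) -> str:
        rel = ' rel="nofollow"' if nofollow else ""
        return (
            f'<a href="{source_url}"{rel}><img src="{img_url}" alt="{title}"></a>\n'
            f'<p>Source: <a href="{source_url}"{rel}>{title}</a></p>'
        )

    # The "play it safe" version the guidance implies for independent publishers:
    print(embed_code("https://example.com/graphic.png",
                     "https://example.com/infographic", "Example Infographic",
                     nofollow=True))

    # The version Google's own generator effectively hands out - clean, followed links:
    print(embed_code("https://example.com/graphic.png",
                     "https://example.com/infographic", "Example Infographic",
                     nofollow=False))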

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I are spammers, but put the content on Twitter with 143,199 Tweets in a second & those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure, there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are treated as spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind, unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & many others that DennisG highlighted.

Oh, and here's an infographic for your pleasurings.

Google: Press Release Links

So, Google have updated their Webmaster Guidelines.

Here are a few common examples of unnatural links that violate our guidelines:....Links with optimized anchor text in articles or press releases distributed on other sites.

For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keyword-rich anchor text is “optimized” and therefore a violation of Google’s guidelines.

Ambiguously speaking, of course.

To put the press release change in context, Google’s guidelines state:

Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site

So, links gained for SEO purposes - intended to manipulate ranking - are against Google’s guidelines.

Google vs Webmasters

Here’s a chat...

In this chat, Google’s John Mueller says that, if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on such links.

Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.

Simple, right? If it really is a press release, and not an attempt to build links for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?

He/she wouldn't.

But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay.” Then webmasters could continue to issue pretend press releases as a link building exercise.

They're missing the point.

Am I Taking Google’s Side?

Not taking sides.

Just hoping to shine some light on a wider issue.

If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - though it is sadly lacking in much SEO punditry - that Google is not on the webmaster’s side. Google is on Google’s side. Google often say they are on the user’s side, and there is certainly some truth in that.

However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more “aggressive” webmasters, Google’s behaviour could be characterized as “keep your friends close, and your enemies closer”.

This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click-based advertising, one that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.

If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.

Why don't Google simply ignore press release links? It would be easy enough to do. Why go the route of making it public? After all, Google are typically very secretive about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess is that it is done to undermine link building, and SEOs.

Big missiles heading your way.

Guideline Followers

The problem with letting Google define the rules of engagement is that they can define you right out of the SEO game, if you let them.

If an SEO is not following the guidelines - guidelines that are always shifting - yet claims they do, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO

.....but it’s not unreasonable to expect that an easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed that risk.

SEO is not the easiest career choice, huh.

One group that is likely to be happy about this latest Google push is legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:

I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.

Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results

However, I’m not sure that will mean press releases are seen as any more credible - press releases have never enjoyed a stellar reputation pre-SEO - but it may thin the crowd somewhat, which increases an agency’s chances of getting their client seen.

Guidelines Homing In On Target

One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:

we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?

….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.

It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.

Take a look at the Boston Globe, sold along with a boatload of content at a 93% decline in value. Quality content, sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.

High quality content often has a high upfront production cost attached to it. Given measly web advertising rates, the high possibility of invisibility, and the likelihood of content being scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?

To not do so is also risky.

Even eHow, well known for cheap factory-line content, is moving toward subscription membership revenues.

The Somewhat Bigger Question

Google can move the goal-posts whenever they like. What you’re doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.

Do you have a contingency plan for such an eventuality?

Johnon puts it well:

The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic

To ask about the minutiae of Google’s policies and guidelines is, indeed, to miss the point. The real question is: how prepared are you for when Google shuts off your flow of traffic because they’ve reset the goal-posts?

This is a question of risk management. What happens if your main site, or your client’s site, falls foul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines but don’t rank, isn’t that effectively the same as a ban i.e. you’re invisible? Do you treat search traffic as a bonus, rather than the main course?

Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.

Google Keyword (Not Provided): High Double Digit Percent

Most Organic Search Data is Now Hidden

Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).

What hasn't received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse of Google's monopoly position in search.

The site which provided a count for (not provided) recently displayed over 40% of queries as (not provided), but that percentage didn't include the large share of mobile searchers who were showing no referrals at all & were showing up as direct website visitors. On July 30, Google started showing referrals for many of those mobile searchers, using keyword (not provided).

According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click prices. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers have been opted into mobile ads (and have to jump through some tricky hoops to figure out how to disable them), Google has every incentive to promote what a big growth channel mobile search is for their business.

Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).

Hiding The Value of SEO

Google is not only hiding more than half of their own keyword referral data - they are hiding so much more than half that even when you mix in Bing and Yahoo! referrals, over 50% of the total keyword data is still hidden.

Google's 86% of the 26,233 searches is 22,560 searches.

Keyword (not provided) being shown for 13,413 is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
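The percentages above are simple division. Here is a quick sketch of the same calculation, using the visitor counts quoted in this section (the script itself is only a worked example):

    # Re-deriving the "hidden keyword" figures from the numbers quoted above.
    total_search_visitors = 26_233   # search visitors over the 4-day sample
    google_share = 0.86              # Google's share of those search referrals
    not_provided = 13_413            # visits labelled keyword (not provided)

    google_visitors = total_search_visitors * google_share
    hidden_share = not_provided / google_visitors

    print(f"Google search visitors: {google_visitors:,.0f}")           # ~22,560
    print(f"Share of Google keyword data hidden: {hidden_share:.0%}")  # ~59%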

Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.

Buy AdWords ads and the data gets sent. Rank organically and most of the data is hidden.

When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the Googleplex.

Subsidizing Scammers Ripping People Off

A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over that period. Looking at the backlinks, nothing there either.

So what happened?

Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot that shows up as keyword (not provided) Google visitors & claim that they were "doing SEO."

And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.

Or something like that:

Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.

This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.

Anti-competitive Funding Blocking Competing Ad Networks

John Andrews pointed to Google's blocking (then funding) of AdBlock Plus as an example of their monopolistic inhibiting of innovation.

sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG

Turn AdBlock Plus on & search for credit cards on Google and get ads.

Do that same search over at Bing & get no ads.

How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?

They can't.

Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.

Buying Browser Marketshare

Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in security software installs.

If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.

When Google does it, it's "good for the user."

Vampire Sucking The Lifeblood Out of SEO

Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Log in to Chrome & your searches don't pass referral information. Google also promotes Firefox's blocking of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."

What do we call an entity that considers something "its lifeblood" while sucking it out of others?
