The Benefits Of Thinking Like Google

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines.” I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google’s negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purposes of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or at the very least, no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier: In response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", Ralph writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to set aside any SEO bias, i.e. what we may like to see from an SEO standpoint, and instead try to look at the world from Google’s point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of what I show above the fold is content controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google aren’t the webmaster’s friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would have been madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google’s link quality control. For free. They’re pointing out the links they see as being “bad” - links Google’s algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner’s perspective, taking links down involves a time cost and offers no benefit, especially if they receive numerous requests. Taking down links may also be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller, is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters’ lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.
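To make that concrete, here’s a minimal sketch of what such a manually reviewed disavow submission might look like - a plain text file in the format the disavow tool accepts (one URL or "domain:" entry per line, "#" for comments), assembled with a few lines of Python. The domains, dates and notes are hypothetical placeholders:

entries = [
    "# Emailed the site owner on 2013-06-01 and 2013-06-15, no response",
    "domain:spammy-directory.example.com",
    "# Owner asked for payment to remove the link",
    "http://article-farm.example.net/widgets-review.html",
]

# Write the reviewed list in the plain text format the disavow tool accepts.
with open("disavow.txt", "w") as f:
    f.write("\n".join(entries) + "\n")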

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place, for the purposes of achieving higher rank, are going to need to be no-followed to make intent clear. Extreme? What's going to be the emphasis in six months’ time? Next year? How do you know what you're doing now is not going to be frowned upon, and need to be undone, next year?

A couple of years ago it would have been unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure that what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at any time, whether you follow the current guidelines or not, especially when the goal-posts keep moving. So, the risk of not following the guidelines and the risk of following the guidelines but not ranking well are pretty much the same - no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Winning Strategies to Lose Money With Infographics

Google is getting a bit absurd with suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed to have a negative ROI on each investment - until your company goes under.

Some will characterize such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.
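For illustration, here is a minimal sketch of what a nofollow'd embed code might look like if you followed that advice; the URLs, title and dimensions are hypothetical placeholders:

def embed_code(page_url, image_url, title, width=600):
    # Build an embed snippet where both links carry rel="nofollow",
    # per the advice discussed above.
    return (
        f'<a href="{page_url}" rel="nofollow">'
        f'<img src="{image_url}" alt="{title}" width="{width}"></a>'
        f'<p>Infographic by <a href="{page_url}" rel="nofollow">{title}</a></p>'
    )

print(embed_code("http://example.com/widgets-infographic",
                 "http://example.com/widgets-infographic.png",
                 "The Widget Industry, Visualized"))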

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.

If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if your own featured premium content that you dropped 4 or 5 figures on gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find the claim that people putting something 20,000 pixels large on their site are not actively vouching for it a bit disingenuous. If something was crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I are spammers, but put the content on Twitter with 143,199 Tweets in a second & those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & many others that DennisG highlighted.

Oh, and here's an infographic for your pleasurings.

Google: Press Release Links

So, Google have updated their Webmaster Guidelines.

Here are a few common examples of unnatural links that violate our guidelines:....Links with optimized anchor text in articles or press releases distributed on other sites.

For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text are “optimized” and therefore a violation of Google’s guidelines.

Ambiguously speaking, of course.

To put the press release change in context, Google’s guidelines state:

Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site

So, links gained, for SEO purposes - intended to manipulate ranking - are against Google Guidelines.

Google vs Webmasters

Here’s a chat...

In this chat, Google’s John Mueller says that, if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on such links.

Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.
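Mechanically, that is a trivial change. Here’s a minimal sketch of adding nofollow to the links in a press release before it goes out on the wire, using Python and BeautifulSoup (assumed to be installed); the markup is a hypothetical placeholder:

from bs4 import BeautifulSoup

# Hypothetical press release snippet with an optimized anchor text link.
html = '<p>Acme launches a new range of <a href="http://example.com/rings">wedding rings</a>.</p>'

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a"):
    link["rel"] = "nofollow"  # keep the link for readers, make the intent clear to search engines

print(soup)
# <p>Acme launches a new range of <a href="http://example.com/rings" rel="nofollow">wedding rings</a>.</p>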

Simple, right? If it really is a press release, and not an attempt to link build for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?

He/she wouldn't.

But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.

They're missing the point.

Am I Taking Google’s Side?

Not taking sides.

Just hoping to shine some light on a wider issue.

If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but it is sadly lacking in much SEO punditry - that Google is not on the webmaster’s side. Google is on Google’s side. Google often say they are on the user’s side, and there is certainly some truth in that.

However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more “aggressive” webmasters, Google’s behaviour could be characterized as “keep your friends close, and your enemies closer”.

This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click based advertising that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.

If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.

Why don't Google simply ignore press release links? Easy enough to do. Why go this route of making it public? After all, Google are typically very secretive about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.

Big missiles heading your way.

Guideline Followers

The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.

If an SEO is not following the guidelines - guidelines that are always shifting - yet claims they do, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO

...but it’s not unreasonable to expect that a somewhat easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed it.

SEO is not the easiest career choice, huh.

One group that is likely to be happy about this latest Google push is legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:

I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.

Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results

However, I’m not sure that will mean press releases are seen as any more credible, as press releases have never enjoyed a stellar reputation pre-SEO, but it may thin the crowd somewhat, which increases an agency’s chances of getting their client seen.

Guidelines Homing In On Target

One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:

we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?

….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.

It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.

Take a look at the Boston Globe, sold along with a boatload of content for a 93% decline. Quality content sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.

High quality content often has a high upfront production cost attached to it, and given measly web advertising rates, the high possibility of invisibility, and the likelihood of getting content scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?

To not do so is also risky.

Even eHow, well known for cheap factory line content, is moving toward subscription membership revenues.

The Somewhat Bigger Question

Google can move the goal-posts whenever they like. What you’re doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.

Do you have a contingency plan for such an eventuality?

Johnon puts it well:

"The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic"

To ask about the minutiae of Google’s policies and guidelines is to miss the point. The real question is: how prepared are you when Google shuts off your flow of traffic because they’ve reset the goal-posts?

Focusing on the minutiae of Google's policies is, indeed, to miss the point.

This is a question of risk management. What happens if your main site, or your client’s site, runs afoul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don’t rank, isn’t that effectively the same as a ban i.e. you’re invisible? Do you treat search traffic as a bonus, rather than the main course?

Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.

Google Keyword (Not Provided): High Double Digit Percent

Most Organic Search Data is Now Hidden

Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).

What hasn't received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse from the market monopoly in search.

The site which provided a count for (not provided) recently displayed over 40% of queries as (not provided), but that percentage didn't include the large percent of mobile search users that were showing no referrals at all & were showing up as direct website visitors. On July 30, Google started showing referrals for many of those mobile searchers, using keyword (not provided).

According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click prices. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers have been opted into mobile ads (and have to go through some tricky hoops to figure out how to disable it), Google has every incentive to promote what a big growth channel mobile search is for their business.

Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).

Hiding The Value of SEO

Google is not just hiding half of their own keyword referral data - they are hiding so much more than half that even when you mix in Bing and Yahoo! traffic, over 50% of the total is still hidden.

Google's 86% of the 26,233 searches is 22,560 searches.

Keyword (not provided) being shown for 13,413 of those is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
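As a quick back-of-the-envelope check, the arithmetic above looks like this (the figures are the ones reported in this post, from a handful of sites over 4 days, not a general benchmark):

total_search_visits = 26233   # search visitors across the sites sampled
google_share = 0.86           # portion of those referred by Google
not_provided = 13413          # visits reported as keyword (not provided)

google_visits = total_search_visits * google_share   # ~22,560 visits
hidden_share = not_provided / google_visits          # ~0.59

print(f"Google organic visits: {google_visits:.0f}")
print(f"Share of Google keyword data hidden: {hidden_share:.0%}")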

Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.

Buy AdWords ads and the data gets sent. Rank organically and most of the data is hidden.

When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.

Subsidizing Scammers Ripping People Off

A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over that time period. Looking at the backlinks, nothing there either.

So what happened?

Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot that shows up as keyword (not provided) Google visitors & claim that they were "doing SEO."

And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.

Or something like that:

Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.

This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.

Anti-competitive Funding Blocking Competing Ad Networks

John Andrews pointed to Google's blocking (then funding) of AdBlock Plus as an example of their monopolistic inhibiting of innovation.

sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG

Turn AdBlock Plus on & search for credit cards on Google and get ads.

Do that same search over at Bing & get no ads.

How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?

They can't.

Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.

Buying Browser Marketshare

Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.

If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.

When Google does it, it's "good for the user."

Vampire Sucking The Lifeblood Out of SEO

Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Log in to Chrome & searchers don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."

What do we call an entity that considers something "its lifeblood" while sucking it out of others?

What Is Your SEO Strategy?

How do you determine your SEO strategy?

Actually, before you answer, let’s step back.

What Is SEO, Anyway?

“Search engine optimization” has always been an odd term as it’s somewhat misleading. After all, we’re not optimizing search engines.

SEO came about when webmasters optimized websites. Specifically, they optimized the source code of pages to appeal to search engines. The intent of SEO was to ensure websites appeared higher in search results than if the site was simply left to site designers and copywriters. Often, designers would inadvertently make sites uncrawlable, and therefore invisible in search engines.

But there was more to it than just enhancing crawlability.

SEOs examined the highest ranking page, looked at the source code, often copied it wholesale, added a few tweaks, then republished the page. In the days of Infoseek, this was all you needed to do to get an instant top ranking.

I know, because I used to do it!

At the time, I thought it was an amusing hacker trick. It also occurred to me that such positioning could be valuable. Of course, this rather obvious truth occurred to many other people, too. A similar game had been going on in the Yahoo Directory where people named sites “AAAA...whatever” because Yahoo listed sites in alphabetical order. People also used to obsessively track spiders, spotting fresh spiders (Hey Scooter!) as they appeared and....cough......guiding them through their websites in a favourable fashion.

When it comes to search engines, there’s always been gaming. The glittering prize awaits.

The new breed of search engines made things a bit more tricky. You couldn’t just focus on optimizing code in order to rank well. There was something else going on.

Backlinks.

So, SEO was no longer just about optimizing the underlying page code, SEO was also about getting links. At that point, SEO jumped from being just a technical coding exercise to a marketing exercise. Webmasters had to reach out to other webmasters and convince them to link up.

A young upstart, Google, placed heavy emphasis on links, making use of a clever algorithm that sorted “good” links from, well, “evil” links. This helped make Google’s result set more relevant than other search engines. Amusingly enough, Google once claimed it wasn’t possible to spam Google.

Webmasters responded by spamming Google.

Or, should I say, Google likely categorized what many webmasters were doing as “spam”, at least internally, and may have regretted their earlier hubris. Webmasters sought links that looked like “good” links. Sometimes, they even earned them.

And Google has been pushing back ever since.

Building links pre-dated SEO, and search engines, but, once backlinks were counted in ranking scores, link building was blended into SEO. These days, most SEOs consider link building a natural part of SEO. But, as we've seen, it wasn’t always this way.

We sometimes get comments on this blog about how marketing is different from SEO. Well, it is, but if you look at the history of SEO, there have always been marketing elements involved. Getting external links could be characterized as PR, or relationship building, or marketing, but I doubt anyone would claim getting links is not SEO.

More recently, we’ve seen a massive change in Google. It’s a change that is likely being rolled out over a number of years. It’s a change that makes a lot of old school SEO a lot less effective in the same way introducing link analysis made meta-tag optimization a lot less effective.

My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.

Google is likely applying different algorithms to different sectors, so the SEO tactics used in one sector don’t work in another. They’re also looking at engagement metrics, so they’re trying to figure out if the user really wanted the result they clicked on. When you consider Google's work on PPC landing pages, this development is obvious. It’s the same measure. If people click back often, or too quickly, then the landing page quality score drops. This is likely happening in the SERPs, too.

So, just like link building once got rolled into SEO, engagement will be rolled into SEO. Some may see that as the death of SEO, and in some ways it is, just like when meta-tag optimization, and other code optimizations, were deprecated in favour of other, more useful relevancy metrics. In other ways, it's SEO just changing like it always has done.

The objective remains the same.

Deciding On Strategy

So, how do you construct your SEO strategy? What will be your strategy going forward?

Some read Google’s Webmaster Guidelines. They'll watch every Matt Cutts video. They follow it all to the letter. There’s nothing wrong with this approach.

Others read Google’s Guidelines. They'll watch every Matt Cutts video. They read between the lines and do the complete opposite. Nothing wrong with that approach, either.

It depends on what strategy you've adopted.

One of the problems with letting Google define your game is that they can move the goalposts anytime they like. The linking that used to be acceptable, at least in practice, often no longer is. Thinking of firing off a press release? Well, think carefully before loading it with keywords:

This is one of the big changes that may have not been so clear for many webmasters. Google said, “links with optimized anchor text in articles or press releases distributed on other sites,” is an example of an unnatural link that violate their guidelines. The key are the examples given and the phrase “distributed on other sites.” If you are publishing a press release or an article on your site and distribute it through a wire or through an article site, you must make sure to nofollow the links if those links are “optimized anchor text.”

Do you now have to go back and unwind a lot of link building in order to stay in their good books? Or, perhaps you conclude that links in press releases must work a little too well, else Google wouldn’t be making a point of it. Or conclude that Google is running a cunning double-bluff hoping you’ll spend a lot more time doing things you think Google does or doesn’t like, but really Google doesn’t care about at all, as they’ve found a way to mitigate it.

Bulk guest posting was also included in Google's webmaster guidelines as a no-no, along with keyword rich anchors in article directories. Even how a site monetizes - by doing things like blocking the back button - can be considered "deceptive" and grounds for banning.

How about the simple strategy of finding the top ranking sites, doing what they do, and adding a little more? Do you avoid saturated niches, and aim for the low-hanging fruit? Do you try and guess all the metrics and make sure you cover every one? Do you churn and burn? Do you play the long game with one site? Is social media and marketing part of your game, or do you leave these aspects out of the SEO equation? Is your currency persuasion?

Think about your personal influence and the influence you can manage without dollars or gold or permission from Google. Think about how people throughout history have sought karma, invested in social credits, and injected good will into their communities, as a way to “prep” for disaster. Think about it.

We may be “search marketers” and “search engine optimizers” who work within the confines of an economy controlled (manipulated) by Google, but our currency is persuasion. Persuasion within a market niche transcends Google

It would be interesting to hear the strategies you use, and whether you plan on using a different strategy going forward.

New Local Carousel

Google announced they rolled out their local carousel results on desktops in categories like hotels, dining & nightlife for US English search queries. The ranking factors driving local rank are aligned with the same ones that were driving the old 7 pack result set.

The layout seems to be triggered when there are 5 or more listings. One upside to the new layout is that clicks within the carousel might not fall off quite as quickly as they do with vertical listings, so if you don't rank #1 you might still get plenty of traffic.

The new layout offers less useful information by default than the old layout did, while requiring user interaction with the result set to get the information they want. You get a picture, but the only way a phone number appears in the result set is if you click into that result or conduct a branded query from the start.

If you search for a general query (say "Indian restaurants") and want the phone number of a specific restaurant, you will likely need to click on that restaurant's picture in order to shift the search to that restaurant's branded search result set to pull their phone number & other information into the page. In that way Google is able to better track user engagement & enhance personalization on local search. When people repeatedly click into the same paths from logged in Google user accounts then Google can put weight on the end user behavior.

This multi-click process not only gives Google usage data to refine rankings with, but it also will push advertisers into buying branded AdWords ads.

Where this new result set is a bit of a train wreck for navigational searches is when a brand is fairly generic & aligned with a location as part of the business name. For instance, in Oakland there is a place named San Francisco Pizza. Even if you do that branded search, you still get the carousel & there might also be three AdWords ads above the organic search results.

If that company isn't buying branded AdWords ads, they best hope that their customers have large monitors, don't use Google, or are better than the average searcher at distinguishing between AdWords & organic results.

Some of Google's other verticals may appear above the organic result set too. When searching for downtown Oakland hotels they offer listings of hotels in San Francisco & Berkeley inside the hotel onebox.

Perhaps Google can patch together some new local ad units that work with the carousel to offer local businesses a flat-rate monthly ad product. A lot of advertisers would be interested in testing a subscription product that enabled them to highlight selected user reviews and include other options like ratings & coupons & advertiser control of the image. As the search result set becomes the destination some of Google's ad products can become much more like Yelp's.

In the short term the new layout is likely a boon for Yelp & some other local directory plays. Whatever segment of the search audience dislikes the new carousel will likely be shunted into many of these other local directories.

In the long run some of these local directories will be the equivalent of MapQuest. As Google gains confidence they will make their listings richer & move toward entirely displacing the result set. The following search isn't a local one, but it is a good example of where we may be headed. Even though the search is set to "web" results (rather than "video" results) the first 9 listings are from YouTube.

Update: In addition to the alarming rise of further result displacement, the 2-step clickthrough process means that local businesses will lose even more keyword referral data, as many of the generic queries are replaced by their branded keywords in analytics data.

SEO: Dirty Rotten Scoundrels

SEO is a dirty word.

PPC isn’t a dirty word.

Actually, they’re not words, they’re acronyms, but you get my drift, I’m sure :)

It must be difficult for SEO providers to stay on the “good and pure” side of SEO when the definitions are constantly shifting. Recently we’ve seen one prominent SEO tool provider rebrand as an “inbound marketing” tools provider and it’s not difficult to appreciate the reasons why.

SEO, to a lot of people, means spam. The term SEO is lumbered, rightly or wrongly, with negative connotations.

Email Optimization

Consider email marketing.

Is all email marketing spam? Many would consider it annoying, but obviously not all email marketing is spam.

There is legitimate email marketing, whereby people opt-in to receive email messages they consider valuable. It is an industry worth around $2.468 billion. There are legitimate agencies providing campaign services, reputable tools vendors providing tools, and it can achieve measurable marketing results where everyone wins.

Yet, most email marketing is spam. Most of it is annoying. Most of it is irrelevant. According to a Microsoft security report, 97% of all email circulating is spam.

So, only around 3% of all email is legitimate. 3% of email is wanted. Relevant. Requested.

One wonders how much SEO is legitimate? I guess it depends what we mean by legitimate, but if we accept the definition I’ve used - “something relevant wanted by the user” - then, at a guess, I’d say most SEO these days is legitimate, simply because being off-topic is not rewarded. Most SEOs provide on-topic content, and encourage businesses to publish it - free - on the web. If anything, SEOs could be accused of being too on-topic.

The proof can be found in the SERPs. A site is requested by the user. If a listed site matches their query, then the user probably deems it to be relevant. They might find that degree of relevance, personally, to be somewhat lacking, in which case they’ll click back, but we don’t have a situation where search results are rendered irrelevant by the presence of SEO.

Generally speaking, search appears to work well in terms of delivering relevance. SEO could be considered cleaner than email marketing in that SEOs are obsessed with being relevant to a user. The majority of email marketers, on the other hand, couldn't seem to care less about what is relevant, just so long as they get something, anything, in front of you. In search, if a site matches the search query, and the visitor likes it enough to register positive quality metrics, then what does it matter how it got there?

It probably depends on whose business case we’re talking about.

Advertorials

Matt Cutts has released a new video on Advertorials and Native Advertising.

Matt makes a good case. He reminds us of the idea on which Google was founded, namely citation. If people think a document is important, or interesting, they link to it.

This idea came from academia. The more an academic document is cited, and cited by those with authority, the more relevant that document is likely to be. Nothing wrong with that idea, however some of the time, it doesn’t work. In academic circles, citation is prone to corruption. One example is self-citation.

But really, excessive self-citation is for amateurs: the real thing is forming a “citation cartel” as Phil Davis from The Scholarly Kitchen puts it. In April this year, after receiving a “tip from a concerned scientist” Davis did some detective work using the JCR data and found that several journals published reviews citing an unusually high number of articles fitting the JIF window from other journals. In one case, the Medical Science Monitor published a 2010 review citing 490 articles, 445 of them were published in 2008-09 in the journal Cell Transplantation (44 of the other 45 were for articles from Medical Science Journal published in 2008-09 as well). Three of the authors were Cell Transplantation editors

So, even in academia, self-serving linking gets pumped and manipulated. When this idea is applied to the unregulated web, where there are vast sums of money at stake, you can see how citation very quickly changes into something else.

There is no way linking is going to stay “pure” in such an environment.

The debate around “paid links” and “paid placement” has been done over and over again, but in summary, the definition of “paid” is inherently problematic. For example, some sites invite guest posting and pay the writers nothing in monetary terms, but the payment is a link back to the writer’s site. The article is a form of paid placement; it’s just that no money changes hands. Is the article truly editorial?

It’s a bit grey.

A lot of the time, such articles pump the writer’s business interests. Is that paid content, and does it need to be disclosed? Does it need to be disclosed to both readers and search engines? I think Matt's video suggests it isn't a problem, as utility is provided, but a link from said article may need to be no-followed in order to stay within Google's guidelines.

Matt wants to see clear and conspicuous disclosure of advertorial content. Paid links, likewise. The disclosure should be made both to search engines and readers.

Which is interesting.

Why would a disclosure need to be made to a search engine spider? Granted, it makes Google’s job easier, but I’m not sure why publishers would want to make Google’s job easier, especially if there’s nothing in it for the publishers.

But here comes the stick, and not just from the web spam team.

Google News have stated that if a publication is taking money for paid content and not adequately disclosing that fact - in Google’s view - to both readers and search engines, then that publication may be kicked from Google News. In so doing, Google increase the risk to the publisher, and therefore the cost, of accepting paid links or paid placement.

So, that’s why a publisher will want to make Google’s job easier. If they don’t, they run the risk of invisibility.

Now, on one level, this sounds fair and reasonable. The most “merit worthy” content should be at the top. A ranking should not depend on how deep your pockets are i.e. the more links you can buy, the more merit you have.

However, one of the problems is that the search results already work this way. Big brands often do well in the SERPs due to reputation gained, in no small part, from massive advertising spend that has the side effect, or sometimes direct effect, of inbound links. Do these large brands therefore have “more merit” by virtue of their deeper pockets?

Google might also want to consider why a news organization would blur advertorial lines when they never used to. Could it be because their advertising is no longer paying them enough to survive?

SEO Rebalances The Game

SEO has helped level the playing field for small businesses, in particular. The little guy didn’t have deep pockets, but he could play the game smarter by figuring out what the search engines wanted, algorithmically speaking, and giving it to them.

I can understand Google’s point of view. If I were Google, I’d probably think the same way. I’d love a situation where editorial was editorial, and business was PPC. SEO, to me, would mean making a site crawlable and understandable to both visitors and bots, but that’s the end of it. Anything outside that would be search engine spam. It’s neat. It’s got nice rounded edges. It would fit my business plan.

But real life is messier.

If a publisher doesn’t have the promotion budget of a major brand, and they don’t have enough money to outbid big brands on PPC, then they risk being invisible on search engines. Google search is pervasive, and if you’re not visible in Google search, then it’s a lot harder to make a living on the web. The risk of being banned for not following the guidelines is the same as the risk of playing the game within the guidelines, but not ranking. That risk is invisibility.

Is the fact a small business plays a game that is already stacked against them, by using SEO, “bad”? If they have to play harder than the big brands just to compete, and perhaps become a big brand themselves one day, then who can really blame them? Can a result that is relevant, as far as the user is concerned, still really be labelled “spam”? Is that more to do with the search engine’s business case than actual end user dissatisfaction?

Publishers and SEOs should think carefully before buying into the construct that SEO, beyond Google’s narrow definition, is spam. Also consider that the more people who can be convinced to switch to PPC and/or stick to just making sites more crawlable, then the more spoils for those who couldn’t care less how SEO is labelled.

It would be great if quality content succeeded in the SERPs on merit, alone. This would encourage people to create quality content. But when other aspects are rewarded, then those aspects will be played.

Perhaps if the search engines could be explicit about what they want, and reward it when it’s delivered, then everyone would be happy.

I guess the algorithms just aren’t that clever yet.

LarryWorld

It’s hard to disagree with Larry Page.

In his recent speech at Google I/O, Page talked about privacy and how it impairs Google. “Why are people so focused on keeping their medical history private”? If only people would share more, then Google could do more.

Well, quite.

We look forward to Google taking the lead in this area and opening up their systems to public inspection. Perhaps they could start with the search algorithms. If Google would share more, publishers could do more.

What’s not to like? :)

But perhaps that’s comparing apples with oranges. The two areas may not be directly comparable as the consequences of opening up the algorithm would likely destroy Google’s value. Google’s argument against doing so has been that the results would suffer quality issues.

Google would not win.

TechnoUtopia

If Page's vision sounds somewhat utopian, then perhaps we should consider where Google came from.

In a paper entitled “The Politics Of Search: A Decade Retrospective”, Laura Granker points out that when Google started out, the web was a more utopian place.

A decade ago, the Internet was frequently viewed through a utopian lens, with scholars predicting that this increased ability to share, access, and produce content would reduce barriers to information access...Underlying most of this work is a desire to prevent online information from merely mimicking the power structure of the conglomerates that dominate the media landscape. The search engine, subsequently, is seen as an idealized vehicle that can differentiate the Web from the consolidation that has plagued ownership and content in traditional print and broadcast media

At the time, researchers Introna and Nissenbaum felt that online information was too important to be shaped by market forces alone. They correctly predicted this would lead to a loss of information quality, and a lack of diversity, as information would pander to popular tastes.

They advocated, perhaps somewhat naively in retrospect, public oversight of search engines and algorithm transparency to correct these weaknesses. They argued that doing so would empower site owners and users.

Fast forward to 2013, and there is now more skepticism about such utopian values. Search engines are seen as the gatekeepers of information, yet they remain secretive about how they determine what information we see. Sure, they talk about their editorial process in general terms, but the details of the algorithms remain a closely guarded secret.

In the past decade, we’ve seen a considerable shift in power away from publishers and towards the owners of big data aggregators, like Google. Information publishers are expected to be transparent - so that a crawler can easily gather information, or a social network can be, well, social - and this has advantaged Google and Facebook. It would be hard to run a search engine or a social network if publishers didn't buy into this utopian vision of transparency.

Yet, Google aren’t quite as transparent with their own operation. If you own a siren server, then you want other people to share and be open. But the same rule doesn’t apply to the siren server owner.

Opening Up Health

Larry is concerned about constraints in healthcare, particularly around access to private data.

“Why are people so focused on keeping their medical history private?” Page thinks it’s because people are worried about their insurance. This wouldn’t happen if there was universal care, he reasons.

I don’t think that’s correct.

People who live in areas where there is universal healthcare, like the UK, Australia and New Zealand, are still very concerned about the privacy of their data. People are concerned that their information might be used against them, not just by insurance companies, but by any company, not to mention government agencies and their employees.

People just don’t like the idea of surveillance, and they especially don’t like the idea of surveillance by advertising companies who operate inscrutable black boxes.

Not that good can’t come from crunching the big data linked to health. Page is correct in saying there is a lot of opportunity to do good by applying technology to the health sector. But first companies like Google need to be a lot more transparent about their own data collection and usage in order to earn trust. What data are they collecting? Why? What is it used for? How long is it kept? Who can access it? What protections are in place? Who is watching the watchers?

Google goes some way towards providing transparency with their privacy policy. A lesser known facility, called Data Liberation, allows you to move your data out of Google, if you wish.

I’d argue that for people to trust Google to the level Page demands would require a lot more rigor and transparency, including third-party audit. There are also considerable issues to overcome, in terms of government legislation, such as privacy acts. Perhaps the most important question is "how does this shift power balances?" No turkey votes for an early Christmas. If your job relies on being a gatekeeper of health information, you're hardly going to hand that responsibility over to Google.

So, it’s not a technology problem. And it’s not just that people are afraid of insurance companies. And it’s not because people aren’t on board with the whole Burning-Man-TechnoUtopia vision. It’s to do with trust. People would like to know what they’re giving up, to whom, and what they’re getting in return. And it's about power and money.

Page has answered some of these questions, but not nearly enough of them. Something might be good for Google, and it might be good for others, but people want a lot more than just his word on it.

Sean Gallagher writes in ArsTechnica:

The changes Page wants require more than money. They require a change of culture, both political and national. The massively optimistic view that technology can solve all of what ails America—and the accompanying ideas on immigration, patent reform, and privacy—are not going to be so easy to force into the brains of the masses.

The biggest reason is trust. Most people trust the government because it's the government—a 226-year old institution that behaves relatively predictably, remains accountable to its citizens, and is governed by source code (the Constitution) that is hard to change. Google, on the other hand, is a 15-year old institution that is constantly shifting in nature, is accountable to its stockholders, and is governed by source code that is updated daily. You can call your Congressman and watch what happens in Washington on C-SPAN every day. Google is, to most people, a black box that turns searches and personal data into cash.

And it may do so at their expense, not benefit.

GoogleMart

It was hard to spot, at first.

It started with one store on the outskirts of town. It was big. Monolithic. It amalgamated a lot of cheap, largely imported stuff and sold the stuff on. The workers were paid very little. The suppliers were squeezed tight on their margins.

And so it grew.

And as it grew, it hollowed out the high street. The high street could not compete with the monolith's sheer power. They couldn’t compete with the monolith's influence on markets. They couldn’t compete with the monolith's unique insights gained from clever number crunching of big data sets.

I’m talking about Walmart, of course.

Love ‘em or loathe ‘em, Walmart gave people what they wanted, but in so doing, hollowed out a chunk of America's middle class. It displaced a lot of shop keepers. It displaced small business owners on Main Street. It displaced the small family retail chain that provided a nice little middle class steady earner.

Where did all those people go?

It was not only the small, independent retail businesses and local manufacturers who were fewer in number. Their closure triggered flow-on effects. There was less demand for the services they used, such as local small business accountants, the local lawyer, small advertising companies, local finance companies, and the host of service providers that make up the middle class ecosystem.

Where did they all go?

Some would have taken up jobs at WalMart, of course. Some would become unemployed. Some would close their doors and take early retirement. Some would change occupations and some would move away to where prospects were better.

What does any of this have to do with the internet?

The same thing is happening on the internet.

And if you’re a small business owner, located on the web-equivalent of the high street, or your business relies on those same small business owners, then this post is for you.

Is Technology Gutting The Middle Class?

I’ve just read “Who Owns The Future”, by Jaron Lanier. Anyone who has anything to do with the internet - and anyone who is even remotely middle class - will find it asks some pretty compelling questions about our present and future.

Consider this.

At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When it was sold to Facebook for a billion dollars in 2012, Instagram employed only 13 people.

Great for Instagram. Bad for Kodak. And bad for the people who worked for Kodak. But, hey. That’s progress, right? Kodak had an outdated business model. Technology overtook them.

That’s true. It is progress. It’s also true that all actions have consequences. The consequence of transformative technology is that, according to Lanier, it may well end up destroying the middle class if too much of the value is retained in the large technology companies.

Lanier suggests that the advance of technology is not replacing as many jobs as it destroys, and those jobs that are destroyed are increasingly middle class.

Not Political (Kinda)

I don’t wish to make this post political, although all change is inherently political. I’m not taking political sides. This issue cuts across political boundaries. I have a lot of sympathy for technological utopian ideas and the benefits technology brings, and have little time for luddism.

However, it’s interesting to focus on the consequences of this shift in wealth and power brought about by technology, and whether enough people in the internet value chain receive adequate value for their efforts.

If the value doesn't flow through, as capitalism requires in order to function well, then few people win. Are children living at home longer than they used to? Are people working longer hours than they used to in order to have the same amount of stuff? Has the value chain been broken, Lanier asks? And, if so, what can be done to fix it?

What Made Instagram Worth One Billion Dollars?

Lanier points out that Instagram wasn’t worth a billion dollars because it had extraordinary employees doing amazing things.

The value of Instagram came from network effects.

Millions of people using Instagram gave the Instagram network value. Without the user base, Instagram is just another photo app.

Who got paid in the end? Not the people who gave the network value. The people who got paid were the small group at the top who organized the network. The owners of the "Siren Servers":

The power rests in what Lanier calls the “Siren Servers”: giant corporate repositories of information about our lives that we have given freely and often without consent, now being used for huge financial benefit by a super-rich few

The value is created by all the people who make up the network, but they only receive a small sliver of that value, in the form of a digital processing tool. To really benefit, you have to own, or get close to, a Siren Server.

Likewise, most of Google’s value resides in the network of users. These users feed value into Google simply by using it and thereby provide Google with a constant stream of data. This makes Google valuable. There isn’t much difference between Google and Bing in terms of service offering, but one is infinitely more valuable than the other purely by virtue of the size of the audience. Same goes for Facebook over Orkut.

You Provide Value

People provide Google with its raw materials. Web publishers allow Google to take their work, at no charge, and to use that work to add value to Google’s network. Google then charges advertisers to place their advertising next to the aggregated information.

Why do web publishers do this?

Publishers create and give away their work in the hope they’ll get traffic back, from which they may derive benefit. Some publishers make money, so they can then pay real-world expenses, like housing, food and clothing. The majority of internet publishers make little, or nothing, from this informal deal. A few publishers make a lot. The long tail, when it comes to internet publishing, is pretty long. The majority of wealth, and power, is centralized at the head.

Similarly, Google’s users are giving away their personal information.

Every time someone uses Google, they are giving Google personal information of value. Their search queries. Their browsing patterns. Their email conversations. Their personal network of contacts. Aggregate that information together, and it becomes valuable information, indeed. Google records this information, crunches it looking for patterns, then packages it up and sells it to advertisers.

What does Google give back in return?

Web services.

Is it a fair exchange of value?

Lanier argues it isn’t. What’s more, it’s an exchange of value so one-sided that it’s likely to destroy the very ecosystem on which companies like Google are based - the work output, and the spending choices, of the middle class. If few of the people who publish can make a reasonable living doing so, then the quality of what gets published must decrease, or cease to exist.

People could make their money in other ways, including offline. However, consider that the web is already affecting a lot of offline business. The music industry is a faint shadow of what it was even a decade ago. There are a lot fewer middle class careers in the music industry now. Small retailers are losing out to the web. Fewer jobs there. The news industry is barely making any money. Same goes for book publishers. All these industries are struggling as online aggregators carve up their value chains.

Now, factor in all the support industries of these verticals. Then think about all the industries likely to be affected in the near future - like health, or libraries, or education, for example. Many businesses that used to hire predominantly middle class people are going out of business, downsizing their operations, or soon to have chunks of their value displaced.

It’s not Google’s aim to gut the middle class, of course. This post is not an anti-Google rant, either, simply a look at action and consequence. What is the effect of technology and, in particular, the effect of big technology companies on the web, most of whom seem obsessed with keeping you in their private, proprietary environments for as long as possible?

Google’s aim is to index all the world's information and make it available. That’s a good aim. It’s a useful, free service. But Lanier argues that gutting the middle class is a side-effect of re-contextualising, and thereby devaluing, information. Information may want to be free, but the consequence of free information is that those creating the information may not get paid. Many of those who do get paid may be weaker organizations more willing to sacrifice editorial quality in order to stay in business. We already see major news sites with MFA-styled formatting on unvetted syndicated press releases. What next?

You may notice that everyone is encouraged to “share” - meaning “give away” - but sharing doesn't seem to extend to the big tech companies, themselves.

They charge per click.

Robots.txt

One argument is that if someone doesn’t like Google, or any search engine, they should simply block that search engine via robots.txt. The problem with that argument is it’s like saying if you don’t like aspects of your city, you should move to the middle of the wilderness. You could, but really you’d just like to make the city a better place to be, to see it thrive and prosper, and to be able to thrive within it yourself.
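For what it’s worth, the opt-out itself is trivial - the standard robots exclusion syntax below would block Google’s crawler from an entire site. It’s the consequences of opting out, not the mechanics, that are the hard part:

User-agent: Googlebot
Disallow: /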

Google provides useful things. I use Google, just like I use my iPhone. I know the deal. I get the utility in exchange for information, and this exchange is largely on their terms. What Lanier proposes is a solution that not only benefits the individual, and the little guy, but ultimately the big information companies, themselves.

Money Go Round

Technology improvements have created much prosperity and the development of a strong middle class. But the big difference today is that what is being commoditized is information itself. In a world increasingly controlled by software that acts as our interface to information, if we commoditize information then we commoditize everything else.

If those creating the information don’t get paid, quality must decrease, or become less available than it otherwise would be. They can buy less stuff in the real world. If they can’t buy as much stuff in the real world, then Google and Facebook’s advertisers have fewer people to talk to than they otherwise would.

It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism. But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it’s a kind of capitalism that’s totally self-defeating because it’s so narrow. It’s a winner-take-all capitalism that’s not sustaining

That isn’t a sustainable situation long-term. A winner-takes-all system centralizes wealth and power at the top, whilst everyone else occupies the long tail. Google has deals in place with large publishers, such as AP, AFP and various European agencies, but this doesn't extend to smaller publishers. It’s the same in sports. The very top get paid ridiculous amounts of money whilst those only a few levels down are unlikely to make rent on their earnings.

But doesn’t technology create new jobs? People who were employed at Kodak just go do something else?

The latest waves of high tech innovation have not created jobs like the old ones did. Iconic new ventures like Facebook employ vastly fewer people than big older companies like, say, General Motors. Put another way, the new schemes... channel much of the productivity of ordinary people into an informal economy of barter and reputation, while concentrating the extracted old-fashioned wealth for themselves. All activity that takes place over digital networks becomes subject to arbitrage, in the sense that risk is routed to whoever suffers lesser computation resources

The people who will do well in such an environment will likely be employees of those who own the big data networks, like Google. Or they will be the entrepreneurial and adaptable types who manage to get close to them - the companies that serve WalMart or Google, or Facebook, or large financial institutions, or leverage off them - but Lanier argues there simply aren't enough of those roles to sustain society in a way that gave rise to these companies in the first place.

He argues this situation disenfranchises too many people, too quickly. And when that happens, the costs spread to everyone, including the successful owners of the networks. They become poorer than they would otherwise be, because they don't return enough value to the people who create the very information those networks need to thrive. Or, to look at it another way: who’s going to buy all the stuff if only a few people have the money?

The network, whether it be a search engine, a social network, an insurance company, or an investment fund, uses information to concentrate power. Lanier argues they are all the same, as they operate in pretty much the same way. They use network effects to mine and crunch big data, and this, in turn, grows their position at the expense of smaller competitors, and the ecosystem that surrounds them.

It doesn’t really matter what the intent was. The result is that the technology can prevent the middle class from prospering and when that happens, everyone ultimately loses.

So What Does He Propose Can Be Done?

A few days ago, Matt Cutts released a video about what site owners can expect from the next round of Google changes.

Google have announced a web spam change, called Penguin 2.0. They’ll be “looking at” advertorials, and native advertising. They’ll be taking a “stronger line” on this form of publishing. They’ll also be “going upstream” to make link spammers less effective.

Of course, whenever Google release these videos, the webmaster community goes nuts. Google will be making changes, and these changes may either make your day, or send you to the wall.

The most interesting aspect of this, I think, is the power relationship. If you want to do well in Google’s search results then there is no room for negotiation. You either do what they want or you lose out. Or you may do what they want and still lose out. Does the wealth and power sit with the publisher?

Nope.

In other news, Google just zapped another link network.

Cutts warns they’ll be going after a lot more of this activity. Does wealth and power sit with the link buyer or seller?

Nope.

Now, Google are right to eliminate or devalue sites that they feel devalue their search engine. Google have made search work. Search was all but dead twelve years ago due to the ease with which publishers could manipulate the results, typically with off-topic junk. The spoils of solving this problem have flowed to Google.

The question is has too much wealth flowed to companies like Google, and is this situation going to kill off large chunks of the ecosystem on which it was built? Google isn’t just a player in this game, they’re so pervasive they may as well be the central planner. Cutts is running product quality control. The customers aren’t the publishers, they’re the advertisers.

It’s also interesting to note what these videos do not say. Cutts' video was not about how your business could be more prosperous. It was all about your business doing what Google wants in order for Google to be more prosperous. It’s irrelevant whether you agree or not, as you don’t get to dictate terms to Google.

That’s the deal.

Google’s concern lies not with webmasters, just as Walmart's concern lies not with small-town retailers. Their concern is to meet company goals and enhance shareholder value. The effects aren’t Google's or Walmart's fault. They are just that - effects.

The effect of Google pursuing those objectives might be to gouge out the value of publishing, and in so doing, gouge out a lot of the value of the middle class. The Google self-driving car project is fascinating from a technical point of view - the view Google tends to focus on - but perhaps even more fascinating when looked at from a position they seldom seem to consider, at least not in public: what happens to all those taxi drivers and delivery drivers who get their first break in society doing this work? Typically, these people are immigrants. Typically, they are poor but upwardly mobile.

That societal effect doesn't appear to be Google’s concern.

So whose concern should it be?

Well, perhaps it really should be Google’s concern, as it’s in their own long-term best interest:

Today, a guitar manufacturer might advertise through Google. But when guitars are someday spun out of 3D printers, there will be no one to buy an ad if guitar designs are “free”. Yet Google’s lifeblood is information put online for free. That is what Google’s servers organize. Thus Google’s current business model is a trap in the longterm

Lanier's suggestion is that everyone gets paid, via micro-payments, linked back to the value they helped create. These payments continue so long as people are using their stuff, be it a line of code, a photograph, a piece of music, or an article.

For example, if you wrote a blog post, and someone quoted a paragraph of it, you would receive a tiny payment. The more often you’re quoted, the more relevant you are, therefore the more payment you receive. If a search engine indexes your pages, then you receive a micro-payment in return. If people view your pages, you receive a micro-payment. Likewise, when you consume, you pay. If you conduct a search, then you run Google’s code, and Google gets paid. The payments are tiny, as far as the individual is concerned, but they all add up.

Mammoth technical issues of doing this aside, the effect would be to take money from the head and pump it back into the tail. It would be harder to build empires off the previously free content others produce. It would send money back to producers.

It also eliminates the piracy question. Producers would want people to copy, remix and redistribute their content, as the more people that use it, the more money they make. Also, with the integration of two-way linking, the mechanism Lanier proposes to keep track of ownership and credit, you’d always know who is using your content.

Information would no longer be free. It would be affordable, in the broadest sense of the word. There would also be a mechanism to reward the production, and a mechanism to reward the most relevant information the most. The more you contribute to the net, and the more people use it, the more you make. Tiny payments. Incremental. Ongoing.
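To make the idea concrete, here is a minimal sketch, in Python, of the kind of accounting Lanier is gesturing at. The participant names, the payment rate, and the central ledger are all hypothetical - Lanier describes the principle, not an implementation:

from collections import defaultdict

# Hypothetical payment per use, in dollars. The real rate, and who sets it,
# is exactly the kind of detail Lanier leaves open.
RATE = 0.0001

ledger = defaultdict(float)   # running balance per participant

def record_use(consumer, producer, uses=1):
    # Each use of someone's work moves a tiny payment from the party
    # consuming it to the party who produced it.
    amount = RATE * uses
    ledger[producer] += amount
    ledger[consumer] -= amount

record_use("search_engine", "blog_author")      # indexing a page pays its author
record_use("reader", "search_engine")           # running a search pays for the engine's code
record_use("magazine", "blog_author", uses=3)   # quoting a paragraph, three times

print(dict(ledger))

Run it and the blog author, whose work gets used most in this toy example, ends up ahead. That is the whole point: value flows back towards production, in tiny increments, for as long as the work is being used.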

Interesting Questions

So, if these questions are of interest to you, I’d encourage you to read “Who Owns The Future” by Jaron Lanier. It’s often rambling - in a good way - and heads off on wild tangents - also in a good way - and you can tell there is a very intelligent and thoughtful guy behind it all. He’s asking some pretty big, relevant questions. His answers are sketches that should be challenged, argued, debated and enlarged.

And if big tech companies want a challenge that really will change the world, perhaps they could direct all that intellect, wealth and power towards enriching the ecosystem at a pace faster than they potentially gouge it.

Link Madness

Link paranoia is off the scale. As the “unnatural link notifications” fly, the normally jittery SEO industry has moved deep into new territory, of late.

I have started to wonder if some of these links (there are hundreds since the site is large) may be hurting my site in the Google Algo. I am considering changing most of my outbound links to rel="nofollow". It is not something I want to do but ...

We’ve got site owners falling to their knees, confessing to being link spammers, and begging for forgiveness. Even when they do, many sites don’t return. Some sites have been returned, but their rankings, and traffic, haven’t recovered. Many sites carry similar links, but get a free pass.

That’s the downside of letting Google dictate the game, I guess.

Link Removal

When site owners are being told by Google that their linking is “a problem,” they tend to hit the forums and spread the message, so the effect is multiplied.

Why does Google bother with the charade of “unnatural link notifications,” anyway?

If Google has found a problem with links to a site, then they can simply ignore or discount them, rather than send reports prompting webmasters to remove them. Alternatively, they could send a report saying they’ve already discounted them.

So one assumes Google’s strategy is a PR - as in public relations - exercise to plant a bomb between link buyers and link sellers. Why do that? Well, a link is a link, and one could conclude that Google must still have problems nailing down the links they don’t like.

So they get some help.

The disavow links tool, combined with a re-inclusion request, is pretty clever. If you wanted a way to get site owners to admit to being link buyers, and to point out the places from which they buy links, or you want to build a database of low quality links, for no money down, then you could hardly imagine a better system of outsourced labour.
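The format of that outsourced labour is telling. A disavow file is just a plain text list - one URL or domain per line, with a "domain:" prefix to disavow a whole domain, and lines starting with # treated as comments. In effect, webmasters hand Google a labelled list of the web's low quality corners. A hypothetical example (the domains here are made up):

# paid directory links we could not get removed
domain:spammy-directory.example
http://low-quality-blog.example/our-old-guest-post.html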

If you’re a site owner, getting hundreds, if not thousands, of links removed is hardly straightforward. It’s difficult, takes a long time, and is ultimately futile.

Many site owners inundated with link removal requests have moved to charging removal fees, which in many cases is understandable, given it takes some time and effort to verify the true owner of a link, locate the link, remove it, and update the site.

As one rather fed-up sounding directory owner put it:

Blackmail? Google's blackmailing you, not some company you paid to be listed forever. And here's a newsflash for you. If you ask me to do work, then I demand to be paid. If the work's not worth anything to you, then screw off and quit emailing me asking me to do it for free.

Find your link, remove it, confirm it's removed, email you a confirmation, that's 5 minutes. And that's $29US. Don't like it? Then don't email me. I have no obligation to work for you for free, not even for a minute. …. I think the best email I got was someone telling me that $29 was extortion. I then had to explain that $29 wasn't extortion - but his new price of $109 to have the link removed, see, now THAT'S extortion.

if it makes you sleep at night, you might realize that you paid to get in the directory to screw with your Google rankings, now you get to pay to get out of the directory, again to screw your Google rankings. That's your decision, quit complaining about it like it's someone else's fault. Not everyone has to run around in circles because you're cleaning up the very mess that you made

Heh.

In any case, if these links really did harm a site - which is arguable - then it doesn’t take a rocket scientist to guess the next step. Site owners would be submitting their competitors' links to directories thick and fast.

Cue Matt Cutts on negative SEO....

Recovery Not Guaranteed

Many sites don’t recover from Google penalties, no matter what they do.

It’s conceivable that a site could have a permanent flag against it no matter what penance has been paid. Google takes into account your history in Adwords, so it’s not a stretch to imagine similar flags may continue to exist against domains in their organic results.

The most common reason is not what they're promoting now, it's what they've promoted in the past.
Why would Google hold that against them? It's probably because of the way affiliates used to churn and burn domains they were promoting in years gone by...

This may be the reason why some recovered sites just don’t rank like they used to after they've returned. They may carry permanent negative flags.

However, the reduced rankings and traffic when/if a site does return may have nothing to do with low-quality links or previous behaviour. There are many other factors involved in ranking and Google’s algorithm updates aren’t sitting still, so it’s always difficult to pin down.

Which is why the SEO environment can be a paranoid place.

Do Brands Escape?

Matt Cutts is on record discussing big brands, saying they get punished, too. You may recall the case of Interflora UK.

Google may well punish big brands, but the punishment might be quite different to the punishment handed out to a no-brand site. It will be easier for a big brand to return, because if Google don’t show what Google users expect to see in the SERPs then Google looks deficient.

Take, for example, this report received - amusingly - by the BBC:

I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands or agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’?

If I were the BBC webmaster, I wouldn’t bother. Google isn’t going to dump the BBC sites as Google would look deficient. If Google has problems with some of the links pointing to the BBC, then perhaps Google should sort it out.

Take It On The Chin, Move On

Many of those who engaged in aggressive link tactics knew the deal. They went looking for an artificial boost in relevancy, and as a result of link building, they achieved a boost in the SERPs.

That is playing the game that Google, a search engine that factors in backlinks, "designed". By design, Google rewards well-linked sites by ranking them above others.
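As a rough illustration of that design, here is a toy version of the published PageRank idea - not Google's actual production algorithm, and the four-page link graph is made up - showing how pages that attract links from well-linked pages end up with higher scores:

# Toy PageRank over a hypothetical four-page web: "a" links to "b" and "c",
# "b" links to "c", "c" links back to "a", and "d" links to "c".
links_to = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = list(links_to)
rank = {p: 1.0 / len(pages) for p in pages}   # start with equal scores
damping = 0.85

for _ in range(50):                           # iterate until scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links_to.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:               # each page passes its score
            new_rank[target] += share         # along its outbound links
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))   # "c", the most linked page, wins

The point of the toy: "c" comes out on top not because of anything on the page itself, but because of who links to it - which is exactly the lever aggressive link builders pulled.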

The site owners enjoyed the pay-off at the expense of their less aggressive competitors. The downside - there’s always a downside - is that Google may spot the artificial boost in relevancy, now or in the future, and may slam the domain as a result.

That’s part of the game, too.

Some cry about it, but Google doesn’t care about crying site owners, so site owners should build that risk into their business case from the get go.

Strategically, there are two main ways of looking at “the game”:

Whack A Mole: Use aggressive linking for domains you’re prepared to lose. If you get burned, then that’s a cost of playing the game. Run multiple domains using different link graphs for each and hope that a few survive at any one time, thus keeping you in the game. If some domains get taken out, then take it on the chin. Try to get reinstated, and if you can’t, then torch them and move on.

Ignore Google: If you operate like Google doesn’t exist, then it’s pretty unlikely Google will slam you, although there are no guarantees. In any case, a penalty and a low ranking are the same thing in terms of outcome.

Take one step back. If your business relies on Google rankings, then that’s a business risk. If you rely entirely on Google rankings, then that’s a big business risk. I’m not suggesting it’s not a risk worth taking, but only you can decide what risks make sense for your business.

If the whack a mole strategy is not for you, and you want to lower the business risk of Google’s whims, then it makes sense to diversify the ways in which you get traffic so that if one traffic stream fails, then all is not lost. If you’re playing for the long term, then establishing brand, diversifying traffic, and treating organic SEO traffic as a bonus should be considerations. You then don’t need to worry about what Google may or may not do as Google aren’t fueling your engine.

Some people run both these strategies simultaneously, which is an understandable way of managing risk. Most people probably sit somewhere in the middle and hope for the best.

Link Building Going Forward

The effect of Google’s fear, uncertainty and doubt strategy is that a lot of site owners are going to be running scared or confused, or both.

Just what is acceptable?

Trouble is, what is deemed acceptable today might be unacceptable next week. It’s pretty difficult, if not impossible, for a site owner to wind the clock back once they undertake a link strategy, and who knows what will be deemed unacceptable in a year's time.

Of course, Google doesn’t want site owners to think in terms of a “link strategy”, if the aim of said link strategy is to “inflate rankings”. That maxim has remained constant.

If you want to take a low-risk approach, then it pays to think of Google traffic as a bonus. Brett Tabke, founder of WebmasterWorld, used to keep a sticker on his monitor that said “Pretend The Search Engines Don’t Exist”, or words to that effect. I’m reminded of how useful that message still is today, as it's a prompt to think strategically beyond SEO. If you disappeared from Google today, would your business survive? If the answer is no, then you should revise your strategy.

Is there a middle ground?

Here are a few approaches to link building that will likely stand the test of time, and that incorporate strategy providing resilience from Google’s whims. The key is having links for reasons besides SEO, even if part of their value is higher rankings.

1. Publisher

Publish relevant, valuable content, as determined by your audience.

It’s no longer enough to publish pages of information on a topic; the information must have demonstrable utility, i.e. other people need to deem it valuable, reference it, visit it, and talk about it. Instead of putting your money into buying links, you put your money into content development and then marketing it to people. The links will likely follow. This is passive link acquisition.

It’s unlikely these types of links will ever be a problem, as the link graph is not going to look contrived. If any poor quality links slip into this link graph, then they’re not going to be the dominant feature. The other signals will likely trump them and therefore diminish their impact.

Build brand based on unique, high quality information, then market it to people via multiple channels, and the links tend to follow, which then boosts your ranking in Google. Provide a high degree of utility, first and foremost.

One problem with this model is that it’s easy for other people to steal your utility. This is a big problem and prevents investment in quality content. One way of getting around this is to use some content as a loss-leader and lock the rest away behind paywalls. You give the outside world, and Google, just enough, but if they want the rest, then they’re going to need to sign up.

Think carefully about the return on giving the whole farm away to a crawler. Think about providing utility, not "content".

2. Differentiation

There is huge first mover advantage when it comes to getting links.

If a new field opens up, and you get there first, or early, then it’s relatively easy to build a credible link graph. As a field expands, the next layer involves a lot of meta activity i.e. bloggers, media and other information curators writing about that activity. At the start of any niche, there aren’t many players to talk about, so the early movers get all the links.

As a field matures, you get a phenomenon Mike Grehan aptly characterised as “filthy linking rich”:

The law of "preferential attachment" as it is also known, wherein new links on the web are more likely to go to sites that already have many links, proves that the scheme is inherently biased against new and unknown pages. When search engines constantly return popular pages at the top of the pile, more web users discover those pages and more web users are likely to link to them

Those who got all those links early on will receive more and more links over time because they are top of the results. They just need to keep doing what they’re doing. It becomes very difficult for late entrants to beat them unless they do something extraordinary. By definition, that probably means shifting the niche to a new niche.
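A quick way to see why is to simulate it. This is a minimal sketch of preferential attachment with made-up numbers - each new page links to an existing page with probability proportional to the links that page already has:

import random

# Toy simulation of the "filthy linking rich" effect. The numbers are
# arbitrary; the shape of the outcome is the point.
links = {0: 1, 1: 1}                      # two early pages, one link each
for newcomer in range(2, 10000):
    pages = list(links)
    weights = [links[p] for p in pages]   # chance of getting the new link is
    target = random.choices(pages, weights=weights)[0]   # proportional to links held
    links[target] += 1                    # the rich get richer
    links[newcomer] = 1                   # the late entrant starts with one link

top_five = sorted(links, key=links.get, reverse=True)[:5]
print([(page, links[page]) for page in top_five])   # the earliest page IDs dominate

Run it and the first handful of pages hold a wildly disproportionate share of the links, despite being no "better" than anything that came later - preferential attachment in miniature.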

If you’re late to a crowded field, then you need to think in terms of differentiation. What can you offer that the rest do not? New content in such fields must be remarkable, i.e. worth remarking upon.

Is that field moving in a new direction? If so, can you pivot in that direction and be first mover in that direction? Look not where a niche currently is, but where it’s going, then position ahead of it.

"Same old, same old content” doesn’t get linked to, engaged with, ranked, or remarked upon - and why should it? The web is not short of content. The web has so much content that companies like Google have made billions trying to reduce it to a manageable set of ten links

3. Brand

Brand is the ultimate Google-protection tactic.

It’s not that brands don’t get hammered by Google occasionally, because they do. But what tends to differ is the sentence handed down. The bigger the brand, the lighter the sentence, or the shorter the sentence, because no matter how much WalMart or The Office Of The President Of The United States Of America spams Google, Google must show such sites. I’m not suggesting these sites engage in aggressive SEO tactics, or need to, but we know they’ll always be in Google.

You don’t have to be a big brand. You do need search volume on your distinctive brand name. If you’re well known enough in your niche i.e. you attract significant type-in search volume, Google must show you or appear deficient.

This is not to say having a brand means you can get away with poor behavior. But the more type-in traffic for your brand, the more pressure there is on Google to rank you.

Links to a brand name will almost never look forced in the same way a link in a footer to “cheap online pharmacy” looks forced. People know your name, and they link to you by name, they talk about you by name - naturally.

The more generic your site, the more vulnerable you are, as it’s very difficult to own a space when you’re aligning with generic keyword terms. The links are always going to look a little - or a lot - forced.

This is not to say you shouldn't get links with keywords in them, but build a distinctive brand, too. The link graph will appear mostly natural - because it is. A few low quality links won’t trump the good signals created by a lot of natural brand links.

4. Engagement

The web is a place.

This place is filled with people. There are relationships between people. Relationships between people on the web are almost always expressed as links. It might be a Facebook link, a Twitter link, a comment link, a blog link, but they’re all links. It doesn’t matter if they’re crawlable or not, or if they’re no-followed or not, it still indicates a relationship.

If Google is to survive, it must figure out these relationships.

That’s why all links - apart from negative SEO - are good links. The more signals of a real relationship, the better you *should* be ranked, because you are more relevant, in an objective sense.

So look for ways to foster relationships and engagement. It might be guest posting. It might be commenting on someone else's site. It might be contributing to forums. It might be interviewing people. It might be accepting interviews. It might be forging relationships with the press. It might be forging relationships with business organisations. It might be contributing to charity. It might be running competitions. It might be attending conferences. It might be linking out to influential people.

It’s all networking.

And wherever you network, you should be getting links as a byproduct.

One potential problem:

Provide long - well, longer than 400 words - unique, editorial articles. Articles also need to get linked to, and engaged with. Articles need to be placed on sites where they'll be seen, as opposed to content farms.

Ask yourself "am I providing genuine utility?"

5. Fill A Need

This is similar to differentiation, but a little more focused.

Think about the problems people have in a niche. The really hard problems to solve. “How to”, “tips”, “advice”, and so on.

Solving a genuine problem for people tends to make people feel a sense of obligation, especially if they would otherwise have to pay for that help. If you can twist that obligation towards getting a link, all the better. For example, “if this article/video/whatever helped you, no payment necessary! But it would be great if you link to us/follow us on Twitter/” and so on. It doesn’t need to be that overt, although sometimes overt is what is required. It fosters engagement. It builds your network. And it builds links.

Think about ways you can integrate a call-to-action that results in a link of some kind.

Coda

In other news, Caesars Palace bans Google :)
