But First, A Word From Our Sponsors...

May 6th

Yesterday Google shared they see greater mobile than desktop search volumes in 10 countries including Japan and the United States.

3 years ago RKG shared CTR data highlighting how mobile search ads were getting more than double the CTR of desktop search ads.

The basic formula: less screen real estate = higher proportion of user clicks on ads.

Google made a big deal of their "mobilepocalypse" update to scare webmasters into making their sites mobile friendly. Part of making a site "mobile friendly" is ensuring it isn't too ad dense (which in turn lowers accidental ad clicks & lowers monetization). Not only does Google have an "ad heavy" relevancy algorithm which demotes ad-heavy sites, but they also explicitly claim even using a moderately sized ad unit above the fold on mobile devices is against their policy guidelines:

Is placing a 300x250 ad unit on top of a high-end mobile optimized page considered a policy violation?

Yes, this would be considered a policy violation as it falls under our ad placement policies for site layout that pushes content below the fold. This implementation would take up too much space on a mobile optimized site's first view screen with ads and provides a poor experience to users. Always try to think of the users experience on your site - this will help ensure that users continue to visit.

So if you make your site mobile friendly you can't run Google ads above the fold unless you are a large enough publisher that the guidelines don't actually matter.

If you spend the extra money to make your site mobile friendly, you then must also go out of your way to lower your income.

What is the goal of the above sort of scenario? Defunding content publishers to ensure most of the ad revenues flow to Google.

If you think otherwise, consider the layout of the auto ads & hotel ads Google announced yesterday. Top of the search results, larger than 300x250.

If you do X, you are a spammer. If Google does X, they are improving the user experience.

The above sort of contrast is something noticed by non-SEOs. The WSJ article about Google's new ad units had a user response stating:

With this strategy, Google has made the mistake of an egregious use of precious mobile screen space in search results. This entails much extra fingering/scrolling to acquire useful results and bypass often not-needed coincident advertising. Perhaps a moneymaker by brute force; not a good idea for utility’s sake.

That content displacement with ads is both against Google's guidelines and algorithmically targeted for demotion - unless you are Google.

How is that working for Google partners?

According to eMarketer, by 2019 mobile will account for 72% of US digital ad spend. Almost all that growth in ad spend flows into the big ad networks while other online publishers struggle to monetize their audiences:

Facebook and Google accounted for a majority of mobile ad market growth worldwide last year. Combined, the two companies saw net mobile ad revenues increase by $6.92 billion, claiming 75.2% of the additional $9.2 billion that went toward mobile in 2013.

Back to the data RKG shared. Mobile is where the growth is...

...and the smaller the screen size the more partners are squeezed out of the ecosystem...

The high-intent, high-value search traffic is siphoned off by ads.

What does that leave for the rest of the ecosystem?

It is hard to build a sustainable business when you have to rely almost exclusively on traffic with no commercial intent.

One of the few areas that works well is perhaps with evergreen content which has little cost of maintenance, but even many of those pockets of opportunity are disappearing due to the combination of the Panda algorithm and Google's scrape-n-displace knowledge graph.

Even companies with direct ad sales teams struggle to monetize mobile:

At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.

Other news websites also get the majority of their search traffic from mobile.

Why do news sites get so much mobile search traffic? A lot of it is navigational; beyond that, most of it comes from informational search queries which are hard to monetize (and thus carry few search ads) and hard to structure into the knowledge graph (because they cover news items which only recently happened).

If you run a site which isn't a news site and look at the organic search traffic breakdown in your analytics account, you will likely see a far lower share of search traffic from mobile. This goes back to Google dominating the mobile search interface with ads.

Mobile search ecosystem breakdown

  • traffic with commercial intent = heavy ads
  • limited commercial intent but easy answer = knowledge graph
  • limited commercial intent & hard to answer = traffic flows to news sites

Not only is Google monetizing a far higher share of mobile search traffic, but they are also aggressively increasing minimum bids.

As Google continues to gut the broader web publishing ecosystem, they can afford to throw a few hundred million in "innovation" bribery kickback slush funds. That will earn them some praise in the short term with some of the bigger publishers, but it will make those publishers more beholden to Google. And it is even worse for smaller publishers. It means the smaller publishers are not only competing against algorithmic brand bias, confirmation bias expressed in the remote rater documents, & wholesale result set displacement, but some of their bigger publishing competitors are also subsidized directly by Google.

Ignore the broader ecosystem shifts.

Ignore the hypocrisy.

Focus on the user.

Until you are eating cat food.

Google Mobilepocalypse Update

Apr 22nd

A day after the alleged major update, I thought it would make sense to highlight where we are in the cycle.

Yesterday Google suggested their fear messaging caused 4.7% of webmasters to move over to mobile friendly design since the update was originally announced a few months ago.

The 4.7% of the websites Google pushed to go mobile friendly likely include some sites which would have been mobile friendly anyhow by virtue of being new sites on hosted platforms with responsive designs. But for the rest of the sites, was the shift worth it?

That is a tough question.

It is too early to tell.

  • Google hasn't put much weight on it in the rankings yet.
  • Mobile traffic is typically worth far less than desktop traffic for most websites.
  • Time which was spent on mobile friendly conversion could have been spent on other forms of marketing.
  • Some sites which became mobile friendly took a significant revenue hit in doing so by switching out long running effective ad placements with mobile responsive units which may not have performed as well.

The problem with going early is you eat the expense upfront, while the rewards are still unknown.

  • Many people who jumped on the "secured everywhere" bandwagon last year saw security certificate issues and broken plugins which were hard to fix. And the upfront cost wasn't the only expense, as many AdSense publishers saw less relevant ads, lower ad CTR, and a sharp drop in AdSense earnings after going secure.
  • Those who spent the money to integrate Google Checkout to get AdWords discounts had to spend again to remove it when Google stopped supporting it.
  • TV makers who were early to integrate Google's YouTube API (which allowed ad-free streaming) will now have to deal with a rash of customer complaints as Google sunsets the old API to make way for a paid ad-free subscription service.

If you are spending your own time & money and you believe in what you are doing and the longevity of a project then it doesn't matter too much if the rewards come slowly or never come. A sense of purpose & a sense of pride in your work is a form of payment.

However, if you are spending a client's money, ring a 5-alarm fire to rush a technical change, & then see no upside after the much hyped announcement, that erodes client trust. If there is no upside and a huge drop in revenue, then the consultant looks like a clueless idiot burning money on make-work projects.

A few years ago a Google rep stated Panda would be folded into the regular algorithms. Then recently we were told it was near realtime. Then we were told it was something where updates needed to be manually pushed out & something Google hasn't done in 4 months. If we trusted Google & conveyed any of these messages to clients, once again we looked like idiots. If we invest client money based on the cycles and advice we are given, quite often that is a money incinerator.

Imagine dropping $30,000 on a link cleanup project where you remove links which were helping your Bing rankings but the Google update "coming soon" takes over a year to show up.

Invest money to lower your current income while you're waiting for Godat.

Good times!

So after Google made a big show of this pending mobile update by pre-announcing it, speaking about it at multiple conferences, comparing it to Panda and Penguin & stating it would have a bigger impact, sending out millions of warning messages via Webmaster Tools, etc etc etc .. when the big day came, did Google make the people who trusted them & invested in their advice look good?

Not so much.

Ayima recently launched a SERP flux pulse tracker tool which shows desktop and mobile flux side-by-side.

As you can see, nothing happened.

So far, no rewards. Maybe they will come. Though here is a hypothetical example where it could be very much NOT worth it for some publishers to go mobile friendly...

  • a webmaster managing an affiliate site converts it to a mobile responsive design
  • but user conversions on mobile devices in some verticals are unlikely, due to it being a pain in the ass to enter credit card info and so on ...
  • well ... person makes their site mobile friendly
  • that leads their mobile version of their site to rank better in Google
  • that leads to a greater share of their overall organic Google search traffic coming from mobile devices
  • their engagement metrics on mobile are somewhat weak, particularly when compared against desktop users, as is the case for many websites
  • their lower aggregate engagement metrics could create a signal which leads an edge-case site into a false positive Panda penalty
  • that then lowers their desktop search rankings
  • which lowers their desktop search traffic
  • which lowers their desktop search revenues
  • ...worse yet, ...
  • those affiliate cookies they dropped on mobile devices don't count for them when the user later converts on a desktop device

Any form of penalty (even a false positive) can become self-reinforcing. And many of the things which seem like they might help could cause harm.

Did you jump the gun or wait and see?

Consensus Bias

Apr 15th

The Truth About Subjective Truths

A few months ago there was an article in New Scientist about Google's research paper on potentially ranking sites based on how factual their content is. The idea is generally and genuinely absurd.

For a search engine to be driven primarily by group think (see unity100's posts here) is the death of diversity.

Less Diversity, More Consolidation

The problem is rarely attributed to Google, but as ecosystem diversity has declined (and entire segments of the ecosystem are unprofitable to service), more people are writing things like: "The market for helping small businesses maintain a home online isn’t one with growing profits – or, for the most part, any profits. It’s one that’s heading for a bloody period of consolidation."

As companies grow in power the power gets monetized. If you can manipulate people without appearing to do so you can make a lot of money.

We Just Listen to the Data (Ish)

As Google sucks up more data, aggregates intent, and scrapes-n-displaces the ecosystem they get air cover for some of their gray area behaviors by claiming things are driven by the data & putting the user first.

Those "data" and altruism claims from Google recently fell flat on their face when the Wall Street Journal published a number of articles about a leaked FTC document.

That PDF has all sorts of goodies in it about things like blocking competition, signing a low margin deal with AOL to keep monopoly marketshare (while also noting the general philosophy outside of a few key deals was to squeeze down on partners), scraping content and ratings from competing sites, Google force inserting itself in certain verticals anytime select competitors ranked in the organic result set, etc.

As damning as the above evidence is, more will soon be brought to light as the EU ramps up their formal statement of objection, as Google is less politically connected in Europe than they are in the United States:

"On Nov. 6, 2012, the night of Mr. Obama’s re-election, Mr. Schmidt was personally overseeing a voter-turnout software system for Mr. Obama. A few weeks later, Ms. Shelton and a senior antitrust lawyer at Google went to the White House to meet with one of Mr. Obama’s technology advisers. ... By the end of the month, the FTC had decided not to file an antitrust lawsuit against the company, according to the agency’s internal emails."

What is wild about the above leaked FTC document is it goes to great lengths to show an anti-competitive pattern of conduct toward the larger players in the ecosystem. Even if you ignore the distasteful political aspects of the FTC non-decision, the other potential out was:

"The distinction between harm to competitors and harm to competition is an important one: according to the modern interpretation of antitrust law, even if a business hurts individual competitors, it isn’t seen as breaking antitrust law unless it has also hurt the competitive process—that is, that it has taken actions that, for instance, raised prices or reduced choices, over all, for consumers." - Vauhini Vara

Part of the reason the data set was incomplete on that front was that, for the most part, only larger ecosystem players were consulted. Google engineers have gone on record stating they aim to break people's spirits in a game of psychological warfare. If that doesn't hinder consumer choice, what does?

When the EU published their statement of objections Google's response showed charts with the growth of Amazon and eBay as proof of a healthy ecosystem.

The market has been consolidated down into a few big winners which are still growing, but that in and of itself does not indicate a healthy or neutral overall ecosystem.

The long tail of smaller e-commerce sites which have been scrubbed from the search results is nowhere to be seen in such charts / graphs / metrics.

The other obvious "untruth" hidden in the above Google chart is there is no way product searches on Google.com are included in Google's aggregate metrics. They are only counting some subset of them which click through a second vertical ad type while ignoring Google's broader impact via the combination of PLAs along with text-based AdWords ads and the knowledge graph, or even the recently rolled out rich product answer results.

Who could look at the following search result (during anti-trust competitive review no less) and say "yeah, that looks totally reasonable?"

Google has allegedly spent the last couple years removing "visual clutter" from the search results & yet they manage to produce SERPs looking like that - so long as the eye candy leads to clicks monetized directly by Google or on other Google hosted pages.

The Search Results Become a Closed App Store

Search was an integral piece of the web which (in the past) put small companies on a level playing field with larger players.

It no longer is.

"What kind of a system do you have when existing, large players are given a head start and other advantages over insurgents? I don’t know. But I do know it’s not the Internet." - Dave Pell

The above quote was about app stores, but it certainly parallels a rater system which enforces the broken window fallacy against smaller players while looking the other way on larger players, unless they are in a specific vertical Google itself decides to enter.

"That actually proves my point that they use Raters to rate search results. aka: it *is* operated manually in many (how high?) cases. There is a growing body of consensus that a major portion of Googles current "algo" consists of thousands of raters that score results for ranking purposes. The "algorithm" by machine, on the majority of results seen by a high percentage of people, is almost non-existent." ... "what is being implied by the FTC is that Googles criteria was: GoogleBot +10 all Yelp content (strip mine all Yelp reviews to build their database). GoogleSerps -10 all yelp content (downgrade them in the rankings and claim they aren't showing serps in serps). That is anticompetitive criteria that was manually set." - Brett Tabke

The remote rater guides were even more explicitly anti-competitive than what was detailed in the FTC report. For instance, they required hotel affiliate sites be rated as spam even if they are helpful, for no reason other than being affiliate sites.

Is Brand the Answer?

About 3 years ago I wrote a blog post about how branding plays into SEO & why it might peak. As much as I have been accused of having a cynical view, the biggest problem with my post was it was naively optimistic. I presumed Google's consolidation of markets would lead Google to alter their ranking approach when they were unable to overcome the established consensus bias which was subsidizing their competitors. The problem with my presumption is Google's reliance on "data" was a chimera. When convenient (and profitable), data is discarded on an as-needed basis.

Or, put another way, the visual layout of the search result page trumps the underlying ranking algorithms.

Google has still highly disintermediated brand value, but they did it via vertical search, larger AdWords ad units & allowing competitive bidding on trademark terms.

If Not Illegal, then Scraping is Certainly Morally Deplorable...

As Google scraped Yelp & TripAdvisor reviews & gave them an ultimatum, Google was also scraping Amazon sales rank data and using it to power Google Shopping product rankings.

Around this same time Google pushed through a black PR smear job of Bing for doing a similar, lesser offense to Google on rare, made-up longtail searches which were not used by the general public.

While Google was outright stealing third party content and putting it front & center on core keyword searches, they had to use "about 100 “synthetic queries”—queries that you would never expect a user to type" to smear Bing & even many of those queries did not show the alleged signal.

Here are some representative views of that incident:

  • "We look forward to competing with genuinely new search algorithms out there—algorithms built on core innovation, and not on recycled search results from a competitor. So to all the users out there looking for the most authentic, relevant search results, we encourage you to come directly to Google. And to those who have asked what we want out of all this, the answer is simple: we'd like for this practice to stop." - Google's Amit Singhal
  • “It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work. I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.” Amit Singhal, more explicitly.
  • "One comment that I’ve heard is that “it’s whiny for Google to complain about this.” I agree that’s a risk, but at the same time I think it’s important to go on the record about this." - Matt Cutts
  • "I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t." - Danny Sullivan

What is so crazy about the above quotes is Google engineers knew at the time what Google was doing with Google's scraping. I mentioned that contrast shortly after the above PR fiasco happened:

when popular vertical websites (that have invested a decade and millions of Dollars into building a community) complain about Google disintermediating them by scraping their reviews, Google responds by telling those webmasters to go pound sand & that if they don't want Google scraping them then they should just block Googlebot & kill their search rankings

Learning the Rules of the Road

If you get a sense "the rules" are arbitrary, hypocritical & selectively enforced - you may be on to something:

  • "The bizrate/nextag/epinions pages are decently good results. They are usually well-format[t]ed, rarely broken, load quickly and usually on-topic. Raters tend to like them" ... which is why ... "Google repeatedly changed the instructions for raters until raters assessed Google's services favorably"
  • and while clamping down on those services ("business models to avoid") ... "Google elected to show its product search OneBox “regardless of the quality” of that result and despite “pretty terribly embarrassing failures” "
  • and since Google knew their offerings were vastly inferior, “most of us on geo [Google Local] think we won't win unless we can inject a lot more of local directly into google results” ... thus they added "a 'concurring sites' signal to bias ourselves toward triggering [display of a Google local service] when a local-oriented aggregator site (i.e. Citysearch) shows up in the web results”"

Google's justification for not being transparent is "spammers" would take advantage of transparency to put inferior results front and center - the exact same thing Google does when it benefits the bottom line!

Around the same time Google hard-codes the self-promotion of its own vertical offerings, it may choose to ban competing business models through "quality" score updates and other similar changes:

The following types of websites are likely to merit low landing page quality scores and may be difficult to advertise affordably. In addition, it's important for advertisers of these types of websites to adhere to our landing page quality guidelines regarding unique content.

  • eBook sites that show frequent ads
  • 'Get rich quick' sites
  • Comparison shopping sites
  • Travel aggregators
  • Affiliates that don't comply with our affiliate guidelines

The anti-competitive conspiracy theory is no longer conspiracy, nor theory.

Key points highlighted by the European Commission:

  • Google systematically positions and prominently displays its comparison shopping service in its general search results pages, irrespective of its merits. This conduct started in 2008.
  • Google does not apply to its own comparison shopping service the system of penalties, which it applies to other comparison shopping services on the basis of defined parameters, and which can lead to the lowering of the rank in which they appear in Google's general search results pages.
  • Froogle, Google's first comparison shopping service, did not benefit from any favourable treatment, and performed poorly.
  • As a result of Google's systematic favouring of its subsequent comparison shopping services "Google Product Search" and "Google Shopping", both experienced higher rates of growth, to the detriment of rival comparison shopping services.
  • Google's conduct has a negative impact on consumers and innovation. It means that users do not necessarily see the most relevant comparison shopping results in response to their queries, and that incentives to innovate from rivals are lowered as they know that however good their product, they will not benefit from the same prominence as Google's product.

Overcoming Consensus Bias

Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.

Eventually that subsidy would be a problem for Google if the algorithm were the only thing that mattered; however, if the entire result set itself can be displaced, then the subsidy doesn't really matter, as it can be retracted overnight.

Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is a vastly inferior option to other options in the marketplace.

That is how Google reinforces, then manages to overcome consensus bias.

How do you overcome consensus bias?

Google Mobile Search Result Highlights

Mar 18th

Google recently added highlights at the bottom of various sections of their mobile search results. The highlights appear on ads, organic results, and other various vertical search insertion types. The colors vary arbitrarily by section and are patterned off the colors in the Google logo. Historically such borders have conveyed a meaning, like separating advertisements from organic search results, but now the colors have no meaning other than acting as a visual separator.

We recently surveyed users to see if they understood what the borders represented & if they felt the borders had any meaning. We ran 4 surveys total. The first 2 allowed a user to select a choice from a drop-down menu. The last 2 were open ended, where a user typed text into a box. For each of the 2 survey types, we surveyed a SERP which had an ad in it & a SERP without an ad in it.

Below are the associated survey images & user results.


Google recently added colored bars at the bottom of some mobile search results. What do they mean?

answer | no ads | with ad
none of the other options are correct | 27.7% (+2.7 / -2.5) | 29.9% (+2.8 / -2.7)
the listing is an advertisement | 25.8% (+2.8 / -2.6) | 30.1% (+2.8 / -2.7)
each color has a different meaning | 24% (+2.7 / -2.5) | 19.6% (+2.5 / -2.3)
colors separate sections but have no meaning | 15.5% (+2.4 / -2.1) | 12.5% (+2.1 / -1.9)
the listing is a free search result | 6.9% (+1.8 / -1.5) | 7.9% (+2.0 / -1.6)

Given there are 5 answers, if the distributions were random there would have been a 20% distribution on each option. The only options which skewed well below that were the perceptions that the colored highlights either had no meaning or represented free/organic search results.
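That reasoning can be sketched in a few lines of Python, using the percentages and margins reported above for the "no ads" survey as rough confidence intervals. Treating each margin as a hard bound is a simplification, not a rigorous significance test:

```python
# Survey options for the "no ads" SERP: (percentage, +margin, -margin),
# taken from the results table above.
results = {
    "none of the other options are correct": (27.7, 2.7, 2.5),
    "the listing is an advertisement": (25.8, 2.8, 2.6),
    "each color has a different meaning": (24.0, 2.7, 2.5),
    "colors separate sections but have no meaning": (15.5, 2.4, 2.1),
    "the listing is a free search result": (6.9, 1.8, 1.5),
}

BASELINE = 100.0 / 5  # 20% per option if users answered at random

def skew(pct, up, down, baseline=BASELINE):
    """Return 'below', 'above', or 'at' depending on whether the
    interval [pct - down, pct + up] sits entirely on one side of
    the random-answer baseline."""
    if pct + up < baseline:
        return "below"
    if pct - down > baseline:
        return "above"
    return "at"

for option, (pct, up, down) in results.items():
    print(f"{option}: {skew(pct, up, down)} baseline")
```

Only the "no meaning" and "free search result" options come out below the 20% baseline, matching the observation above.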

Link to survey results: without ads vs with ads.

And here are images of what users saw for the above surveys:


For the second set of surveys we used an open ended format

The open ended questions allow a user to type in whatever they want. This means the results do not end up biased by the predefined answer options in a quiz, but it also means the results will include plenty of noise like...

  • people entering a, c, d, k, 1, 2, 3, ggg, hello, jj, blah, and who cares as answer choices
  • some of the responses referencing the listing topics
  • some of the responses referencing parts of a search result listing like the headlines or hyperlinks
  • some of the responses highlighting the colors of the bars
  • etc.

Like the above surveys, on each of these I ordered 1,500 responses. As of writing this, each had over 1,400 responses completed & here are the word clouds for the SERPs without an ad vs the SERPs with an ad.

SERP without an ad

SERP with an ad

On each of the above word clouds, we used the default automated grouping. Here is an example of what the word cloud would look like if the results were grouped manually.

Summary

For a couple years Google has removed various forms of eye candy from many organic results (cutting back on video snippets, limiting rich rating snippets, removing authorship, etc.). The justification for such removals was to make the results feel "less cluttered." At the same time, Google has added a variety of the same types of "noisy" listing enhancements to their various ad programs.

What is the difference between reviews ad extensions, consumer ratings ad extensions, and seller ratings ad extensions? What is the difference between callout extensions and dynamic structured snippets?

Long ago AdWords advertisements had a border near them to separate them from the organic results. Those borders disappeared many years ago & only recently reappeared on mobile devices when they also appeared near organic listings. That in turn has left searchers confused as to what the border highlighting means.

According to the above Google survey results, the majority of users don't know what the colors signify, don't care what they signify, or think they indicate advertisements.

Responsive Design for Mobile SEO

Why Is Mobile So Important?

If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile with it accounting for about 10% of their online ad revenues.

Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google's mobile ad clicks:

While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume

Those ad networks want other publishers to make their sites mobile friendly for a couple reasons...

  • If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
  • If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relatively poor performance, particularly if they haven't spent resources optimizing user experience on the channel.

Further Elevating the Importance of Mobile

Google has hinted at the importance of having a mobile friendly design by labeling mobile-friendly sites, testing labels for slow sites & offering tools to test how mobile friendly a site design is.

Today Google put out an APB warning they are going to increase the importance of mobile friendly website design:

Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.

In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.

I wouldn't recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.

Those who ignore the warning might be in for significant pain.

Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
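The arithmetic behind that claim can be sketched with hypothetical per-position click-through rates; the figures below are illustrative assumptions, not measured data:

```python
# Hypothetical mobile CTR by organic position -- illustrative only.
# On a small screen, ads plus the #1 listing dominate the first
# viewport, so click-through falls off steeply after position 1.
hypothetical_mobile_ctr = {1: 0.35, 2: 0.12, 3: 0.05, 4: 0.03}

def traffic_decline(old_pos, new_pos, ctr=None):
    """Fraction of traffic lost when a listing drops from old_pos to new_pos."""
    ctr = ctr or hypothetical_mobile_ctr
    return 1 - ctr[new_pos] / ctr[old_pos]

drop = traffic_decline(1, 3)
print(f"Dropping from #1 to #3 loses about {drop:.0%} of mobile traffic")
```

Under these assumed figures a two-spot drop costs roughly 86% of mobile search traffic, in line with the 80% to 90% range above; real per-position CTR varies by query, vertical, and device.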

Another related issue referenced in the above post was tying in-app content to mobile search personalization:

Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.

Google also announced today they are extending AdWords-styled ads to their Google Play search results, so they now have a direct economic incentive to allow app activity to bleed into their organic ranking factors.

m. Versus Responsive Design

Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. The m. approach works like this: on the regular version of the site (say www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.seobook.com/" >

...and then on the mobile version, they would rel=canonical it back to the desktop version, like so...

<link rel="canonical" href="http://www.seobook.com/" >

With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.

3 or 4 years ago it was a toss-up as to which of these 2 options would win, but over time it appears the responsive design option is more likely to win out.

Here are a few reasons responsive is likely to win out as a better solution:

  • If people share a mobile-friendly URL on Twitter, Facebook or other social networks, then when someone on a desktop computer clicks through to the shared m. version of the page (which has fewer ad units & less content), the publisher is providing a worse user experience & losing out on the incremental monetization the additional ad units would have delivered.
  • While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
  • Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on cross over devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
  • When Bing gave their best practices for mobile, they stated: "Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all." In that post Bing shows some examples of m. versions of sites ranking in their mobile search results, however for smaller & lesser known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you optimize the m. version of the site to a great degree, that isn't the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidation of ranking signals.
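Whichever configuration you pick, responsive layouts depend on the viewport meta tag; without it, mobile browsers render the page at a desktop width and shrink it down, which defeats your media queries. A minimal sketch:

```html
<!-- Tell mobile browsers to render at the device width
     instead of a zoomed-out desktop viewport -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

This tag goes in the head section of each page alongside your existing link & style references.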

In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which detects the user-agent & uses it to drive the layout, signaling the variation to caches & crawlers with the Vary HTTP header. For large, complex & high-traffic sites (and sites with numerous staff programmers & designers) dynamic serving is perhaps the best optimization solution because you can optimize the images and code to lower bandwidth costs and response times. Most smaller sites will likely rely on responsive design rather than dynamic serving, in large part because it is quicker & cheaper to implement, and most are not running sites large enough that the incremental bandwidth represents a significant expense to their business.
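As a sketch of what dynamic serving looks like on the wire: the URL stays the same, but the server returns different HTML depending on the requesting device & flags that fact in the response headers (the headers below are illustrative, not from any particular server):

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent
```

Without the Vary: User-Agent header, caches & crawlers may assume desktop and mobile visitors receive identical HTML, which can lead to the wrong version being cached or indexed.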

Solutions for Quickly Implementing Responsive Design

New Theme / Design

If your site hasn't been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.

Child Themes

Some of the default Wordpress themes are responsive. Creating a child theme is quite easy. The popular Thesis and Studiopress frameworks also offer responsive skins.

PSD to HTML / HTML to Responsive HTML

Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive conversions of existing HTML sites in as little as a day or two, though you will likely need to make at least a few minor adjustments when you put the designs live to compensate for things like third party ad units.

If you have an existing Wordpress theme, you might want to see if you can zip it up and send it to them, or else they may build your new theme as a child theme of a default theme like Twenty Fifteen. If you are struggling to get them to convert your Wordpress theme directly, another option is to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.

For simple hand rolled designs there are a variety of grid generator tools, which can make it reasonably easy to replace some old school table-based designs with divs. Many of the themes for sale in marketplaces like ThemeForest also use a multi-column div grid based system.
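As a rough sketch of how those grid systems replace table layouts (the class name & 600px breakpoint here are illustrative, not from any particular framework): columns sit side by side on wide screens & stack to full width on narrow ones.

```css
/* Two columns side by side on wider screens */
.col {
  float: left;
  width: 50%;
}
/* Stack the columns full-width below an illustrative 600px breakpoint */
@media only screen and (max-width: 600px) {
  .col {
    float: none;
    width: 100%;
  }
}
```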

Other Things to Look Out For

Third Party Plug-ins & Ad Code Gotchas

Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units to different screen sizes & skip showing some ad units on smaller screens. An example of the AdSense code follows:

<style type="text/css">
.adslot_1 { display: inline-block; width: 320px; height: 50px; }
@media (max-width: 400px) { .adslot_1 { display: none; } }
@media (min-width: 500px) { .adslot_1 { width: 468px; height: 60px; } }
@media (min-width: 800px) { .adslot_1 { width: 728px; height: 90px; } }
</style>
<ins class="adsbygoogle adslot_1"
     data-ad-client="ca-pub-1234"
     data-ad-slot="5678"></ins>
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<script>(adsbygoogle = window.adsbygoogle || []).push({});</script>

For other ads which perhaps don't have a "mobile friendly" option you could use CSS to either set the ad unit to display: none or to let the ad unit overflow, using code like either of the following:

hide it:

@media only screen and (max-width: ___px) {
  .bannerad {
    display: none;
  }
}

overflow it:

@media only screen and (max-width: ___px) {
  .ad-unit {
    max-width: ___px;
    overflow: scroll;
  }
}

Images are another tricky point; the following CSS scales them down to fit their container without distorting the aspect ratio:

img {
  height: auto;
  max-width: 100%;
}
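If bandwidth matters, you can go a step beyond CSS scaling: the srcset attribute lets the browser pick an appropriately sized file rather than downloading a desktop-sized image & shrinking it (the file names & widths below are made up for illustration):

```html
<img src="photo-640.jpg"
     srcset="photo-320.jpg 320w, photo-640.jpg 640w, photo-1280.jpg 1280w"
     sizes="100vw"
     alt="example photo">
```

Older browsers that don't understand srcset simply fall back to the plain src attribute.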

Here are a few other general responsive CSS tricks.

Before Putting Your New Responsive Site Live...

Back up your old site before putting the new site live.

For static HTML sites or sites with PHP or SHTML includes & such...

  • Download a copy of your existing site to local.
  • Rename that folder to something like sitename.com-OLDVERSION
  • Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process you can rename the new site design to something like sitename.com-HOSED, then rename the sitename.com-OLDVERSION folder to sitename.com to quickly restore the site.
  • Download your site to local again.
  • Ensure your new site design is using a different CSS folder or CSS filename such that the old and new versions of the design can be live at the same time while you are editing the site.
  • Create a test file with the responsive design on your site & test that page until things work well enough.
  • Once that page works well enough, test changing your homepage over to the new design & then save and upload it to verify it works properly. In addition to using your cell phone you could see how it looks on a variety of devices via the mobile browser testing emulation tool in Chrome, or a wide array of third party tools like: MobileTest.me, MobileMoxy Device Emulator, iPadPeek, Mobile Phone Emulator, Browshot, Matt Kersley's responsive web design testing tool, BrowserStack, Cross Browser Testing, & the W3C mobileOK Checker. Paid services like Keynote offer manual testing rather than emulation on a wide variety of devices. There is also paid downloadable desktop emulation software like Multi-browser view.
  • Once you have the general "what needs to change in each file" down, use find & replace to bulk edit the remaining files to make them responsive.
  • Use a tool like FileZilla to quickly bulk upload the files.
  • Look through key pages and if there are only a few minor errors then fix them and re-upload them. If things are majorly screwed up then revert to the old design being live and schedule a do over on the upgrade.
  • If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
  • If you want to view your top pages you could export that data from your web analytics to verify all those pages look good. If you wanted to view every page of your site 1 at a time after the change, you could use a tool like Xenu Link Sleuth or Screaming Frog SEO Spider to crawl your site & export a list of URLs into a spreadsheet. Then you could take the URLs from that spreadsheet and put them a chunk at a time into a tool like URL Opener.

If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower stress approach, you could create a test site on another domain name for testing purposes. Just be sure to block crawlers via robots.txt or password protect access to the site while testing. When you get things worked out on it, make sure your internal links are referencing the correct domain name, and that you have removed the robots.txt block or password protection.
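For the test domain, a blanket robots.txt block might look like the following (note that Disallow prevents crawling; if any test URLs have already been indexed, a meta robots noindex tag or password protection is the surer route):

```text
User-agent: *
Disallow: /
```

Remember to remove this file (or empty the Disallow rule) before the design goes live on your real domain.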

For a site with a CMS the above process is basically the same, except you may need to create the backup differently. If you are uploading a Wordpress or Drupal theme, change the name at least slightly so you can keep the old and new designs live at the same time & quickly switch back to the old design if you need to.

If you have a mixed site with Wordpress & static files or such then it might make sense to test changing the static files first, get those to work well & then create a Wordpress theme after that.

Update: at a conference a Googler named Zineb Ait Bahajji recently stated they expect this update to impact more sites than Panda and Penguin have. And Google recently started sending out mobile usability warning messages in bulk:

Google systems have tested 2,790 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 2,790 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.

You Can't Copyright Facts

The Facts of Life

When Google introduced the knowledge graph one of their underlying messages behind it was "you can't copyright facts."

Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.

When you search for love quotes, Google pulls one into their site & then provides another "try again" link.

Since quotes mostly come from third parties they are not owned by BrainyQuotes and other similar sites. But here is the thing: if the sites which pay to organize and verify such collections have their economics sufficiently undermined, they go away & Google is no longer able to pull the quotes into the search results either.

The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn't be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.

Never trust a corporation to do a librarian's job.

What's Behind Door Number One?

Google has also done the above quote-like "action item" types of onebox listings in other areas, like software downloads.

Where there are multiple versions of the software available, Google arbitrarily selects the download page, even though a software publisher might have a parallel SaaS option or other complex funnels based on a person's location or status as a student.

Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here's a blog post from Malwarebytes referencing

  • their software being advertised against their brand term in Google via AdWords ads, infringing their trademark & bundling the software with adware
  • numerous user complaints they received about the bundleware
  • the legal action they were required to take to get the bundler taken offline

Brands are forced to buy their own brand equity before, during & after the purchase, or Google partners with parasites to monetize the brand equity:

The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
...
The ads themselves said things like “McAfee Support - Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
...
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks

Since Google requires that Chrome extensions be installed from their own website it is hard (for anyone other than Google) to monetize them, which in turn makes it appealing for people to sell the add-ons to malware bundlers. Android apps in the Google Play store are yet another "open" malware ecosystem.

FACT: This Isn't About Facts

Google started the knowledge graph & onebox listings on some utterly banal topics which were easy for a computer to get right, though their ambitions vastly exceed that starting point. They started where they did because it was low-risk and easy.

When Google's evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that...

Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”

Even as the people who routinely shill for Google parrot the "you can't copyright facts" mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.

Even if You Have Copyright...

What makes the "you can't copyright facts" line so particularly disingenuous was Google's support of piracy when they purchased YouTube:

cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)

To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn't so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: "one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search."

Altruism vs Disruption for the Sake of it

Whenever Google implements a new feature they can choose not to monetize it, so as to claim they are benevolent and doing it for users without commercial interest. But that same "unmonetized & for users" claim was also used with their shopping search vertical until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.

There was literally no transition period.

Many of the "informational" knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.

Health is Wealth

Google recently went big on the knowledge graph by jumping head first into the health vertical:

starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.


Google's links to the Mayo Clinic in their knowledge graph are, once again, in a light gray font.

In case you didn't find enough background in Google's announcement article, Greg Sterling shared more of Google's views here. A couple notable quotes from Greg...

Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.

  • Google doesn't need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
  • If this is done out of benevolence, it will appear *above* the AdWords ads on the search results — unlike almost every type of onebox or knowledge graph result Google offers.
  • If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those "cynics" to label Greg Sterling as a shill.

Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.

By defunding and displacing something they don't improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.

If their traffic drops and they don't do more with less, then...

  • their margins will fall
  • growth slows (or they may even shrink)
  • their stock price will tank
  • management will get fired & replaced, and/or the company will get taken private by private equity investors, and/or they will need to make some "bet the company" moves to find growth elsewhere (and hope Google doesn't enter that parallel area anytime soon)

When the numbers don't work, publishers need to cut back or cut corners.

Things get monetized directly, monetized indirectly, or they disappear.

Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly-in ad units, auto-play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.

Publishers who were facing an "oh crap" moment when seeing print Dollars turn into digital dimes are having round number 2 when they see those digital dimes turn into mobile pennies:

At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.

If we lose some diversity in news it isn't great, though it isn't the end of the world. But what makes health such an important area is it is literally a matter of life & death.

Its importance & the amount of money flowing through the market ensures there is heavy investment in misinforming the general population. The corruption is so bad some people (who should know better) instead fault science.

... and, only those who hate free speech, capitalism, democracy & the country could possibly have anything negative to say about it. :D

Not to worry though. Any user trust built through the health knowledge graph can be monetized through a variety of other fantastic benevolent offers.

Once again, Google puts the user first.

Peak Google? Not Even Close

Nov 16th

Search vs Native Ads

Google owns search, but are they a one-trick pony?

A couple weeks ago Ben Thompson published an interesting article suggesting Google may follow IBM and Microsoft in peaking, perhaps with native ads becoming more dominant than online search ads.

According to Forrester, in a couple years digital ad spend will overtake TV ad spend. In spite of the rise of sponsored content, native isn't even broken out as a category.

Part of the issue with native advertising is that it can be hard to break out cleanly. Some of it is obvious, but falls into multiple categories, like video ads on YouTube. Some of it is obvious, but relatively new & thus lacking in scale. Amazon is extending their payment services & Prime shipping deals to third party sites of brands like AllSaints & listing inventory from those sites on Amazon.com, selling them traffic on a CPC basis. Does that count as native advertising? What about a ticket broker or hotel booking site syndicating their inventory to a meta search site?

And while native is not broken out, Google already offers native ad management features in DoubleClick and has partnered with some of the more well known players like BuzzFeed.

The Penny Gap's Impact on Search

Time intends to test paywalls on all of its major titles next year & is working with third parties to integrate affiliate ads on sites like People.com.

The second link in the above sentence goes to an article which is behind a paywall. On Twitter I often link to WSJ articles which are behind a paywall. Any important information behind a paywall may quickly spread beyond it, but typically a competing free site which (re)reports on whatever is behind the paywall is shared more, spreads further on social, generates more additional coverage on forums and discussion sites like Hacker News, gets highlighted on aggregators like TechMeme, gets more links, ranks higher, and becomes the default/canonical source of the story.

Part of the rub of the penny gap is the cost of the friction vastly exceeds the financial cost. Those who can flow attention around the payment can typically make more by tracking and monetizing user behavior than they could by charging users incrementally a cent here and a nickel there.

Well known franchises are forced to offer a free version or they eventually cede their market position.

There are sites which do roll up subscriptions to a variety of sites at once, but some of them which had stub articles requiring payment to access, like Highbeam Research, got torched by Panda. If the barrier to entry to the content is too high, the engagement metrics are likely to be terrible & a penalty ensues. Even a general registration wall is too high a barrier to entry for some sites. Google demands whatever content is shown to them be visible to end users & if there is a mismatch that is considered cloaking - unless the mismatch is due to monetizing by using Google's content locking consumer surveys.

Who gets to the scale needed to have enough consumer demand to be able to charge an ongoing subscription for access to a variety of third party content? There are a handful of players in music (Apple, Spotify, Pandora, etc.) & a handful of players in video (Netflix, Hulu, Amazon Prime), but outside of those most paid subscription sites are about finance or niche topics with small scale. And whatever goes behind the paywalls gets seen by almost nobody compared to the broader public market at the free price point.

Even if you are in a broad niche industry where a subscription-based model works, it still may be brutally tough to compete against Google. Google's chief business officer joined the board of Spotify, which means Spotify should be safe from Google risk, except...

Google's Impact on Premium Content

I've long argued Google has leveraged piracy to secure favorable media deals (see the second bullet point at the bottom of this infographic). Some might have perceived my take as being cynical, but when Google mentioned their "continued progress on fighting piracy" the first thing they mentioned was more ad units.

There are free options, paid options & the blurry lines in between which Google & YouTube ride while they experiment with business models and monetize the flow of traffic to the paid options.

"Tech companies don’t believe in the unique value of premium content over the long term." - Jessica Lessin

There is a massive misalignment of values which causes many players to have to refine their strategy over and over again. The gray area is never static.

Many businesses only have a 10% or 15% profit margin. An online publishing company which sees 20% of its traffic disappear might thus go from sustainable to bleeding cash overnight. A company which can arbitrarily shift / redirect a large amount of traffic online might describe itself as a "kingmaker."

In Germany some publishers wanted to be paid to be in the Google index. As a result Google stopped showing snippets near their listings. Google also refined their news search integration into the regular search results to include a broader selection of sources including social sites like Reddit. As a result Axel Springer quickly found itself begging for things to go back to the way they were before as their Google search traffic declined 40% and their Google News traffic declined 80%. Axel Springer got their descriptions back, but the "in the news" change remains.

Google's Impact on Weaker Players

If Google could have that dramatic of an impact on Axel Springer, imagine what sort of influence they have on millions of other smaller and weaker online businesses.

One of the craziest examples is Demand Media.

Demand Media's market cap peaked above $1.9 billion. They spun out the domain name portion of the business into a company named Rightside Group, but the content portion of the business is valued at essentially nothing. They have about $40 million in cash & equivalents. Earlier this year they acquired Saatchi Art for $17 million & last year they acquired ecommerce marketplace Society6 for $94 million. After their last quarterly result their stock closed down 16.83% & Thursday they were down another 6.32%, giving them a market capitalization of $102 million.

On their most recent conference call, here are some of the things Demand Media executives stated:

  • By the end of 2014, we anticipate more than 50,000 articles will be substantially improved by rewrites made rich with great visuals.
  • We are well underway with this push for quality and will remove 1.8 million underperforming articles in Q4
  • as we strive to create the best experience we can we have removed two ad units with the third unit to be removed completely by January 1st
  • (on the above 2 changes) These changes are expected to have a negative impact on revenues and adjusted EBITDA of approximately $15 million on an annualized basis.
  • Through Q3 we have invested $1.1 million in content renovation costs and expect approximately another $1 million in Q4 and $2 million to $4 million in the first half of next year.
  • if you look at visits or you know the mobile mix is growing which has lower CPM associated with it and then also on desktop we're seeing compression on the pricing line as well.
  • we know that sites that have ad density that's too high, not only are they offending audiences in near term, you are penalized with respect to where you show up in search indexes as well.

Google torched eHow in April of 2011. In spite of over 3 years of pain, Demand Media is still letting Google drive their strategy, in some cases spending millions of dollars to undo past investments.

Yet when you look at Google's search results page, they are doing the opposite of the above strategy: more scraping of third party content coupled with more & larger ad units.

Originally the source links in the scrape-n-displace program were light gray. They only turned blue after a journalist started working on a story about 10 blue links.

The Blend

The search results can be designed to have some aspects blend in while others stand out. Colors can change periodically to alter response rates in desirable ways.

The banner ad got a bad rap as publishers fought declining CPMs by adding more advertisements to their pages. When it works, Google's infrastructure still delivers (and tracks) billions of daily banner ads.

Search ads have never had the performance decline banner ads have.

The closest thing Google ever faced on that front was when AdBlock Plus became popular. Since it was blocking search ads, Google banned the extension & then restored it after negotiating a deal to pay AdBlock Plus to display ads on Google while it continued to block ads on other third party sites.

Search itself *is* the ultimate native advertising platform.

Google is doing away with the local carousel in favor of a 3 pack local listing in categories like hotels. Once a person clicks on one of the hotel listings, Google then inserts an inline knowledge graph listing for that hotel with booking affiliate links inline in the search results, displacing the organic result set below the fold.

Notice in the above graphic how the "website" link uses generic text, is aligned toward the right, and is right next to an image so that it looks like an ad. It is engineered to feel like an ad and be ignored. The actual ads are left aligned and look like regular text links. They have an ad label, but that label is a couple lines up from them & there are multiple horizontal lines between the label and the actual ad units.

Not only does Google have the ability to shift the layout in such a drastic format, but then with whatever remains they also get to determine who they act against & who they don't. While the SEO industry debates the "ethics" of various marketing techniques Google has their eye on the prize & is displacing the entire ecosystem wholesale.

Users were generally unable to distinguish between ads and organic listings *before* Google started mixing the two in their knowledge graph. That is a big part of the reason search ads have never seen the engagement declines banner ads have seen.

Mobile has been redesigned with the same thinking in mind. Google action items (which can eventually be monetized) up top & everything else pushed down the page.

The blurring of "knowledge" and ads allows Google to test entering category after category (like doctor calls from the search results) & forcing advertisers to pay for the various tests while Google collects data.

And as Google is scraping information from third party sites, they can choose to show less information on their own site if doing so creates additional monetization opportunities. As far back as 2009 Google stripped phone numbers off of non-sponsored map listings. And what happened with the recent 3 pack? While 100% of the above the fold results are monetized, ...

"Go back to an original search that turns up the 3 PAC. Its completely devoid of logical information that a searcher would want:

  • No phone number
  • No address
  • No map
  • NO LINK to the restaurant website.

Anything that most users would want is deliberately hidden and/or requires more clicks." - Dave Oremland

Google justifies their scrape-n-displace programs by claiming they put users first. Then they hide some of the information to drive incremental monetization opportunities. Google may eventually re-add some of those basic features which are now hidden, but as part of sponsored local listings.

After all - ads are the solution to everything.

Do branded banner ads in the search results have a low response rate? Or are advertisers unwilling to pay a significant sum for them? If so, Google can end the test and instead shift to include a product carousel in the search results, driving traffic to Google Shopping.

"I see this as yet another money grab by Google. Our clients typically pay 400-500% more for PLA clicks than for clicks on their PPC Brand ads. We will implement exact match brand negatives in Shopping campaigns." - Barb Young

That money grab stuff has virtually no limit.

The Click Casino

From the start, keywords defaulted to broad match. Then campaigns went "enhanced" so advertisers were forced to eat lower quality clicks on mobile devices.

Then came the blurring of exact match targeting into something else, forcing advertisers to buy lower quality variations of searches & making them add tons of negative keywords (and keep eating new garbage re-interpretations of words) in order to run a finely tuned campaign specifically targeted against a term.

In the past some PPC folks cheered obfuscation of organic search, thinking "this will never happen to me."

Oops.

And of course Google not only wants to be the ad auction, but they want to be your SEM platform managing your spend & they are suggesting you can leverage the "power" of automated auction-time bidding.

Advertisers RAVE about the success of Google's automatic bidding features: "It received one click. That click cost $362.63."

The only thing better than that is banner ads in free mobile tap games targeted at children.

Adding Friction

Above I mentioned how Google arbitrarily banned the AdBlock Plus extension from the Play store. They also repeatedly banned Disconnect Mobile. If you depend on mobile phones for distribution it is hard to get around Google. What's more they also collect performance data, internally launch competing apps, and invest in third party apps. And they control the prices various apps pay for advertising across their broad network.

So maybe you say "ok, I'll skip search & mobile, I'll try to leverage email" but this gets back to the same issue again. In Gmail social or promotional emails get quarantined into a ghetto where they are rarely seen:

"online stores, if they get big enough, can act as chokepoints. And so can Google. ... Google unilaterally misfiled my daily blog into the promotions folder they created, and I have no recourse and no way (other than this post) to explain the error to them" - Seth Godin

Those friction adders have real world consequences. A year ago Groupon blamed Gmail's tabs for causing them to have poor quarterly results. The filtering impact on a startup can be even more extreme. A small shift in exposure can lower the K factor below 1 & force startups to buy exposure rather than generating it virally.
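To make the K factor point concrete, here is a toy sketch (all numbers invented for illustration) of how a viral coefficient just above versus just below 1 compounds over invite generations:

```python
# Rough sketch of the viral "K factor": K = invites sent per user * conversion rate.
# A K above 1 compounds on its own; below 1, each cohort shrinks and growth must be
# bought with paid acquisition. The numbers here are illustrative, not from any product.

def users_after(generations: int, seed_users: int,
                invites_per_user: float, conversion_rate: float) -> int:
    k = invites_per_user * conversion_rate
    total = seed_users
    cohort = seed_users
    for _ in range(generations):
        cohort = cohort * k        # each generation converts k new users per existing user
        total += cohort
    return round(total)

# Invites land in the inbox: 5 invites/user, 22% convert -> K = 1.1 (compounding)
print(users_after(10, 1000, 5, 0.22))
# Same product filtered into a promotions tab: 18% convert -> K = 0.9 (decaying)
print(users_after(10, 1000, 5, 0.18))
```

A few points of conversion rate is the difference between self-sustaining growth and an ad budget line item.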

In addition to those other tabs, there are a host of other risks like being labeled as spam or having a security warning. Few sites are as widely read inside the Googleplex as Search Engine Land, yet at one point even their newsletter was showing a warning in Gmail.

Google can also add friction to

  • websites using search rankings, vertical search result displacement, hiding local business information (as referenced above), search query suggestions, and/or leveraging their web browser to redirect consumer intent
  • video on YouTube by counting ad views as organic views, changing the relevancy metrics, and investing in competing channels & giving them preferential exposure as part of the deal. YouTube gets over half their views on mobile devices with small screens, so any shift on Google's rank preference is going to have a major shift in click distributions.
  • mobile apps using default bundling agreements which require manufacturers to set Google's apps as defaults
  • other business models by banning bundling-based business models too similar to their own (bundling is wrong UNLESS it is Google doing it)
  • etc.

The launch of Keyword (not provided) which hid organic search keyword data was friction for the sake of it in organic search. When Google announced HTTPS as a ranking signal, Netmeg wrote: "It's about ad targeting, and who gets to profile you, and who doesn't. Mark my words."

Facebook announced their relaunch of Atlas and Google immediately started cracking down on data access:

In the conversations this week, with companies like Krux, BlueKai and Lotame, Google told data management platform players that they could not use pixels in certain ads. The pixels—embedded within digital ads—help marketers target and understand how many times a given user has seen their messages online.

"Google is only allowing data management platforms to fire pixels on creative assets that they're serving, on impressions they bought, through the Google Display Network," said Mike Moreau, chief solutions officer at Krux. "So they're starting with a very narrow scope."

Around the same time Google was cracking down on data sharing, they began offering features targeting consumers across devices & announced custom affinity audiences which allow advertisers to target audiences who visit any particular website.

Google's special role is not only as an organizer (and obfuscator) of information; they also get to be the source measuring how well marketing works via their analytics, regularly launching new reports which may conveniently over-represent their own contribution while under-representing other channels, profiting from activity bias. The industry default of last click attribution driving search ad spending is one of the key issues which has driven down display ad values over the years.
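As a toy illustration of why last click attribution over-credits search, here is a sketch (channels and conversion paths invented) comparing a last-click model against a simple linear model:

```python
# Under last-click attribution, the final touch before conversion - often a branded
# search ad - gets all the credit, even when display or social introduced the buyer.
# A linear model splits credit evenly across the path. Paths below are illustrative.
from collections import defaultdict

def attribute(paths, model="last_click"):
    credit = defaultdict(float)
    for path in paths:
        if model == "last_click":
            credit[path[-1]] += 1.0                  # all credit to the final touch
        elif model == "linear":
            for channel in path:
                credit[channel] += 1.0 / len(path)   # equal split across touches
    return dict(credit)

paths = [["display", "social", "search"], ["display", "search"], ["search"]]
print(attribute(paths, "last_click"))  # search captures all 3 conversions
print(attribute(paths, "linear"))      # display & social recover partial credit
```

Same three conversions, very different budget signals, which is how last-click defaults quietly starve display.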

Investing in Competition

Google not only ranks the ecosystem, but they actively invest in it.

Google tried to buy Yelp. When Facebook took off Google invested in Zynga to get access to data, in spite of a sketchy background. When Google's $6 billion offer for Groupon didn't close the deal, Google quickly partnered with over a dozen Groupon competitors & created new offer ad units in the search results.

Inside of the YouTube ecosystem Google also holds equity stakes in leading publishers like Machinima and Vevo.

There have been a few examples of investments getting special treatment, getting benefit of the doubt, or access to non-public information.

The scary scenario for publishers might sound something like this: "in Baidu Maps you can find a hotel, check room availability, and make a booking, all inside the app." There's no need to leave the search engine.

Take a closer look & that scary version might already be here. Google's same day delivery boss moved to Uber and Google added Uber pickups and price estimates to their mobile Maps app.

Google, of course, also invested in Uber. It would be hard to argue that Uber is anything but successful. Though it is also worth mentioning that winning at any cost often creates losses elsewhere.

Google invests in disruption as though disruption is its own virtue & they leverage their internal data to drive the investments:

“If you can’t measure and quantify it, how can you hope to start working on a solution?” said Bill Maris, managing partner of Google Ventures. “We have access to the world’s largest data sets you can imagine, our cloud computer infrastructure is the biggest ever. It would be foolish to just go out and make gut investments.”

Combining usage data from their search engine, web browser, app store & mobile OS gives them unparalleled insights into almost any business.

Google is one of the few companies which can make multi-billion dollar investments in low margin areas, just for the data:

Google executives are prodding their engineers to make its public cloud as fast and powerful as the data centers that run its own apps. That push, along with other sales and technology initiatives, aren’t just about grabbing a share of growing cloud revenue. Executives increasingly believe such services could give Google insights about what to build next, what companies to buy and other consumer preferences

Google committed to spending as much as a half billion dollars promoting their shopping express delivery service.

Google's fiber push now includes offering business internet services. Elon Musk is looking into offering satellite internet services - with an ex-Googler.

The End Game

Google now spends more than any other company on federal lobbying in the US. A steady stream of Google executives have filled US government roles like deputy chief technology officer, chief technology officer, and head of the patent and trademark office. A Google software engineer went so far as to suggest President Obama:

  • Retire all government employees with full pensions.
  • Transfer administrative authority to the tech industry.
  • Appoint Eric Schmidt CEO of America.

That Googler may be crazy or a troll, but even if we don't get their nightmare scenario, if the regulators come from a particular company, that company is unlikely to end up hurt by regulations.

President Obama has stated the importance of an open internet: “We cannot allow Internet service providers to restrict the best access or to pick winners and losers in the online marketplace for services and ideas.”

If there are relevant complaints about Google, who will hear them when Googlers head key government roles?

Larry Page was recently labeled businessperson of the year by Fortune:

It’s a powerful example of how Page pushes the world around him into his vision of the future. “The breadth of things that he is taking on is staggering,” says Ben Horowitz, of Andreessen Horowitz. “We have not seen that kind of business leader since Thomas Edison at GE or David Packard at HP.”

A recent interview of Larry Page in the Financial Times echos the theme of limitless ambition:

  • "the world’s most powerful internet company is ready to trade the cash from its search engine monopoly for a slice of the next century’s technological bonanza." ... "As Page sees it, it all comes down to ambition – a commodity of which the world simply doesn’t have a large enough supply."
  • “I think people see the disruption but they don’t really see the positive,” says Page. “They don’t see it as a life-changing kind of thing . . . I think the problem has been people don’t feel they are participating in it.”
  • “Even if there’s going to be a disruption on people’s jobs, in the short term that’s likely to be made up by the decreasing cost of things we need, which I think is really important and not being talked about.”
  • "in a capitalist system, he suggests, the elimination of inefficiency through technology has to be pursued to its logical conclusion."

There are some dark layers which are apparently "incidental side effects" of the techno-utopian desires.

Mental flaws could be reinforced & monetized by hooking people on prescription pharmaceuticals:

It takes very little imagination to foresee how the kitchen mood wall could lead to advertisements for antidepressants that follow you around the Web, or trigger an alert to your employer, or show up on your Facebook page because, according to Robert Scoble and Shel Israel in Age of Context: Mobile, Sensors, Data and the Future of Privacy, Facebook “wants to build a system that anticipates your needs.”

Or perhaps...

Those business savings are crucial to Rifkin’s vision of the Third Industrial Revolution, not simply because they have the potential to bring down the price of consumer goods, but because, for the first time, a central tenet of capitalism—that increased productivity requires increased human labor—will no longer hold. And once productivity is unmoored from labor, he argues, capitalism will not be able to support itself, either ideologically or practically.

That is not to say "all will fail" due to technology. Some will succeed wildly.

Michelle Phan has been able to leverage her popularity on YouTube to launch a makeup subscription service which is at an $84 million per year revenue run rate.

Those at the top of the hierarchy will get an additional boost. Such edge case success stories will be marketed offline to pull more people onto the platform.

While a "star based" compensation system makes a few people well off, most people publishing on those platforms won't see any financial benefit from their efforts. Worse yet, a lot of the "viral" success stories are driven by a large ad budget.

Even Google has done research on income inequality in attention economies - and that was before they dialed up their brand bias stuff.

Category after category gets commoditized, platform after platform gets funded by Google, and ultimately employees working on them will long for the days where their wages were held down by illegal collusion rather than the crowdsourcing fate they face:

Workers, in turn, have more mobility and a semblance of greater control over their working lives. But is any of it worth it when we can’t afford health insurance or don’t know how much the next gig might pay, or when it might come? When an app’s terms of service agreement is the closest thing we have to an employment contract? When work orders come through a smartphone and we know that if we don’t respond immediately, we might not get such an opportunity again? When we can’t even talk to another human being about the task at hand and we must work nonstop just to make minimum wage?

Just as people get commoditized, so do other layers of value:

In SEO for a number of years many people have painted brand as the solution to everything. But consider the hotel search results which are 100% monetized above the fold - even if you have a brand, you still must pay to play. Or consider the Google Shopping ads which are now being tested on branded navigational searches.

Google even obtained a patent for targeting ads aimed at monetizing named entities.

You paid to build the brand. Then you pay Google again - "or else."

One could choose to opt out of Google ad products so as not to pay to arbitrage themselves, but Google is willing to harm their own relevancy to extract revenues.

A search in the UK for the trademark term [cheapflights] is converted into the generic search [cheap flights]. The official site is ranking #2 organically and is the 20th clickable link in the left rail of the search results.

As much as brand is an asset, it also becomes a liability if you have to pay again for every time someone looks for your brand.

Mobile apps may be a way around Google, but again it is worth noting Google owns the operating system and guarantees themselves default placement across a wide array of verticals through bundling contracts with manufacturers. Another thing worth considering with mobile is new notification features tied to the operating systems are unbundling apps & Google has apps like Google Now which tie into many verticals.

As SEOs, for a long time we had value in promoting the adoption of Google's ecosystem. As Google attempts to capture more value than they create, we may no longer gain by promoting the adoption of their ecosystem, but given their...

  • cash hoard
  • lobbyists
  • ex-employees in key government roles
  • control over video, mobile, apps, maps, email, analytics (along with search)
  • broad portfolio of investments

... it is hard to think they've come anywhere close to peaking.

Google SEO Services (BETA)

Nov 11th

When Google acquired DoubleClick Larry Page wanted to keep the Performics division offering SEM & SEO services just to see what would happen. Other Google executives realized the absurd conflict of interest and potential antitrust issues, so they overrode ambitious Larry: "He wanted to see how those things work. He wanted to experiment."

Webmasters have grown tired of Google's duplicity as the search ecosystem shifts to pay to play, or go away.

Google's webmaster guidelines can be viewed as reasonable and consistent or as an anti-competitive tool. As Google eats the ecosystem, those thrown under the bus shift their perspective.

Within some sectors larger players can repeatedly get scrutiny for the same offense with essentially no response, whereas smaller players operating in that same market are slaughtered because they are small.

Access to lawyers, politicians & media outlets = access to benefit of the doubt.

Lack those & BEST OF LUCK TO YOU ;)

Google's page asking "Do you need an SEO?" uses terms like: scam, illicit and deceptive to help frame the broader market perception of SEO.

If ranking movements appear random & non-linear then it is hard to make sense of continued ongoing investment. The less stable Google makes the search ecosystem, the worse they make SEOs look, as...

  • anytime a site ranks better, that anchors the baseline expectation of where rankings should be
  • large rank swings create friction in managing client communications
  • whenever search traffic falls drastically it creates real world impacts on margins, employment & inventory levels

Matt Cutts stated it is a waste of resources for him to be a personal lightning rod for criticism from black hat SEOs. When Matt mentioned he might not go back to his old role at Google some members of the SEO industry were glad. In response some other SEOs mentioned black hats have nobody to blame but themselves & it is their fault for automating things.

After all, it is not like Google arbitrarily shifts their guidelines overnight and drastically penalizes websites to a disproportionate degree ex-post-facto for the work of former employees, former contractors, mistaken/incorrect presumed intent, third party negative SEO efforts, etc.

Oh ... wait ... let me take that back.

Indeed Google DOES do that, which is where much of the negative sentiment Matt complained about comes from.

Recall when Google went after guest posts: a site which had a single useful guest post on it got a sitewide penalty.

Around that time it was noted Auction.com had thousands of search results for text which was in some of their guest posts.

About a month before the guest post crack down, Auction.com received a $50 million investment from Google Capital.

  • Publish a single guest post on your site = Google engineers shoot & ask questions later.
  • Publish a duplicated guest post on many websites, with Google investment = Google engineers see it as a safe, sound, measured, reasonable, effective, clean, whitehat strategy.

The point of highlighting that sort of disconnect was not to "out" someone, but rather to highlight the (il)legitimacy of the selective enforcement. After all, ...

But perhaps Google has decided to change their practices and have a more reasonable approach to the SEO industry.

An encouraging development on this front was when Auction.com was once again covered in Bloomberg. They not only benefited from leveraging Google's data and money, but Google also offered them another assist:

Closely held Auction.com, which is valued at $1.2 billion, based on Google’s stake, also is working with the Internet company to develop mobile and Web applications and improve its search-engine optimization for marketing, Sharga said.

"In a capitalist system, [Larry Page] suggests, the elimination of inefficiency through technology has to be pursued to its logical conclusion." ― Richard Waters

With that in mind, one can be certain Google didn't "miss" the guest posts by Auction.com. Enforcement is selective, as always.

“The best way to control the opposition is to lead it ourselves.” ― Vladimir Lenin

Whether you turn left or right, the road leads to the same goal.

Loah Qwality Add Werds Clix Four U

Aug 16th

Google recently announced they were doing away with exact match AdWords ad targeting this September. They will force all match types to have close variant keyword matching enabled. This means you get misspelled searches, plural versus singular overlap, and an undoing of your tight organization.

In some cases the user intent is different between singular and plural versions of a keyword. A singular version search might be looking to buy a single widget, whereas a plural search might be a user wanting to compare different options in the marketplace. In some cases people are looking for different product classes depending on word form:

For example, if you sell spectacles, the difference between users searching on ‘glass’ vs. ‘glasses’ might mean you are getting users seeing your ad interested in a building material, rather than an aid to reading.

Where segmenting improved the user experience, boosted conversion rates, made management easier, and improved margins - those benefits are now off the table.

CPC isn't the primary issue. Profit margins are what matter. Once you lose the ability to segment you lose the ability to manage your margins. And this auctioneer is known to bid in their own auctions, have random large price spikes, and not give refunds when they are wrong.

An offline analogy for this loss of segmentation ... you go to a gas station to get a bottle of water. After grabbing your water and handing the cashier a $20, they give you $3.27 back along with a six pack you didn't want and didn't ask for.

Why does a person misspell a keyword? Some common reasons include:

  • they are new to the market & don't know it well
  • they are distracted
  • they are using a mobile device or something which makes it hard to input their search query (and those same input issues make it harder to perform other conversion-oriented actions)
  • their primary language is a different language
  • they are looking for something else

In any of those cases, the average value of the expressed intent is usually going to be less than that of a person who correctly spelled the keyword.

Even if spelling errors were intentional and cultural, the ability to segment that and cater the landing page to match disappears. Or if the spelling error was a cue to send people to an introductory page earlier in the conversion funnel, that option is no more.

In many accounts the loss of the granular control won't cause too big of a difference. But some advertiser accounts in competitive markets will become less profitable and more expensive to manage:

No one who's in the know has more than about 5-10 total keywords in any one adgroup because they're using broad match modified, which eliminated the need for "excessive keyword lists" a long time ago. Now you're going to have to spend your time creating excessive negative keyword lists with possibly millions upon millions of variations so you can still show up for exactly what you want and nothing else.
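As a rough sketch of what building such a negative list involves, the snippet below (a crude illustration, not how AdWords actually generates close variants) expands one term into plural and deletion-typo variants an advertiser would have to add as exact-match negatives:

```python
# Sketch of generating close-variant negatives so an "exact match" campaign keeps
# matching only the literal term. Real accounts would mine actual search query
# reports; these transforms (plural flip + single-character deletions as a crude
# stand-in for misspellings) are purely illustrative.

def close_variants(term: str) -> set[str]:
    variants = set()
    # plural / singular flip
    variants.add(term + "s" if not term.endswith("s") else term[:-1])
    # single-character deletions approximate common typos
    for i in range(len(term)):
        variants.add(term[:i] + term[i + 1:])
    variants.discard(term)  # never negate the term itself
    return variants

negatives = close_variants("glasses")
print(sorted(negatives))  # e.g. "glases", "glasse", "lasses", ...
```

Multiply this by every keyword in every ad group and the quote's "millions upon millions of variations" stops sounding like hyperbole.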

You might not know which end of the spectrum your account is on until disaster strikes:

I added negatives to my list for 3 months before finally giving up opting out of close variants. What they viewed as a close variant was not even in the ballpark of what I sell. There have been petitions before that have gotten Google to reverse bad decisions in the past. We need to make that happen again.

Brad Geddes has held many AdWords seminars for Google. What does he think of this news?

In this particular account, close variations have much lower conversion rates and much higher CPAs than their actual match type.
...
Variation match isn’t always bad, there are times it can be good to use variation match. However, there was choice.
...
Loss of control is never good. Mobile control was lost with Enhanced Campaigns, and now you’re losing control over your match types. This will further erode your ability to control costs and conversions within AdWords.

A monopoly restricting choice to enhance their own bottom line. It isn't the first time they've done that, and it won't be the last.

Have an enhanced weekend!

Understanding The Google Penguin Algorithm

Aug 1st

Whenever Google does a major algorithm update we all rush off to our data to see what changed in terms of rankings, search traffic, and then look for the trends to try to figure out what changed.

The two people I chat most with during periods of big algorithmic changes are Joe Sinkwitz and Jim Boykin. I recently interviewed them about the Penguin algorithm.

Topics include:

  • what it is
  • its impact
  • why there hasn't been an update in a while
  • how to determine if issues are related to Penguin or something else
  • the recovery process (from Penguin and manual link penalties)
  • and much, much more

Here's a custom drawing we commissioned for this interview.
Pang Win.


To date there have been 5 Penguin updates:

  • April 24, 2012
  • May 25, 2012
  • October 5, 2012
  • May 22, 2013 (Penguin 2.0)
  • October 4, 2013

There hasn't been one in quite a while, which is frustrating many who haven't been able to recover. On to the interview...

At its core what is Google Penguin?

Jim: It is a link filter that can cause penalties.

Joe: At its core, Penguin can be viewed as an algorithmic batch filter designed to punish lower quality link profiles.

What sort of ranking and traffic declines do people typically see from Penguin?

Jim: 30-98%. Actually, I've seen some "manual partial matches" where traffic was hardly hit...but that's rare.

Joe: Near total. I should expand. Penguin 1.0 has been a different beast than its later iterations; the first one has been nearly a fixed flag whereas later iterations haven't been quite as severe.

After the initial update there was another one about a month later & then one about every 6 months for a while. There hasn't been one for about 10 months now. So why have the updates been so rare? And why hasn't there been one for a long time?

Jim: Great question. We all believed there'd be an update every 6 months, and now it's been way longer than 6 months...maybe because Matt's on vacation...or maybe he knew it would be a long time until the next update, so he took some time off...or perhaps Google wants those with an algorithmic penalty to feel the pain for longer than 6 months.

Joe: 1.0 was temporarily escapable if you were willing to 301 your site; after 1.1 the redirect began to pass on the damage. My theory on why it has been so very long on the most recent update has to do with maximizing pain - Google doesn't intend to lift its boot off the throats of webmasters just yet; no amount of groveling will do. Add to that the complexity of every idiot disavowing 90%+ of their clean link profiles, and 'dirty' vs 'clean' links become difficult to ascertain from that signal.

Jim: Most people disavow some, then the disavow some more...then next month they disavow more...wait a year and they may disavow them all :)

Joe: Agreed.

Jim: Then Google will let them out...hehe, tongue in cheek...a little.

Joe: I've seen disavow files with over 98% of links in there, including Wikipedia, the Yahoo! Directory, and other great sources - absurd.

Jim: Me too. Most of the people are clueless ... there's tons of people who are disavowing links just because their traffic has gone down, so they feel they must have been hit by penguin, so they start disavowing links.

Joe: Yes; I've seen a lot of panda hits where the person wants to immediately disavow. "whoa, slow down there Tex!"

Jim: I've seen services where they guarantee you'll get out of a penguin penalty, and we know that they're just disavowing 100% of the links. Yes, you get your manual penalty removed that way, but then you're left with nothing.

Joe: Good time to mention that any guarantee of getting out of a penalty is likely sold as a bag of smoke.

Jim: or as they are disavowing 100% of the links they can find going to the site.

OK. I think you mentioned an important point there Jim about "100% of the links they can find." What are the link sources people should use & how comprehensive is the Google Webmaster Tools data? Is WMT data enough to get you recovered?

Joe: Rarely. I've seen where the examples listed in a manual action might be discoverable on Ahrefs, Majestic SEO, or in WMT, but upon cleaning them up (and disavowing further of course) that Google will come back with a few more links that weren't initially in the WMT data dump. I'm dealing with a client on this right now that bought a premium domain as-is and has been spending about a year constantly disavowing and removing links. Google won't let them up for air and won't do the hard reset.

Jim: well first...if you're getting your backlinks from Google, be sure to pull your backlinks from the www and the non www version of your site. You can't just use one: you HAVE to pull backlinks from both, so you have to verify both your www and your non www version of your site with Google Webmaster Tools.

We often start with that. When we find big patterns that we feel are the cause, we'll then go into OSE, Majestic SEO, and Ahrefs, and pull those backlinks too, and pull out those that fit the patterns, but that's after the Google backlink analysis.
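Jim's point about pulling backlinks from both the www and non-www properties can be sketched as a simple merge-and-dedupe step (the file names are hypothetical sample exports, not real WMT output):

```python
# Sketch: union the backlink exports from both Webmaster Tools properties so links
# reported under only one host variant aren't missed. Sample data is invented.
import csv

# Hypothetical exports; in practice these come from the two verified WMT properties.
with open("links_www.csv", "w", newline="") as f:
    csv.writer(f).writerows([["http://blog.example.net/post"], ["http://dir.example.org/"]])
with open("links_non_www.csv", "w", newline="") as f:
    csv.writer(f).writerows([["http://dir.example.org/"], ["http://forum.example.com/t/1"]])

def load_links(path: str) -> set[str]:
    """Read one export, normalizing each linking URL for deduping."""
    with open(path, newline="") as f:
        return {row[0].strip().lower() for row in csv.reader(f) if row}

all_links = load_links("links_www.csv") | load_links("links_non_www.csv")
print(len(all_links))  # 3 unique linking URLs across both exports
```

Only after building the combined set would the pattern analysis (and any OSE/Majestic/Ahrefs supplements) begin.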

Joe, you mentioned people getting hit by Panda and mistakenly going off to the races to disavow links. What are the distinguishing characteristics between Penguin, Panda & manual link penalties?

Joe: Given they like to sandwich updates to make it difficult to discern, I like this question. Penguin is about links; it is the easiest to find but hardest to fix. When I first am looking at a URL I'll quickly look at anchor % breakdowns, sources of links, etc. The big difference between penguin and a manual link penalty (if you aren't looking on WMT) is the timing -- think of a bomb going off vs a sniper...everyone complaining at once? probably an algorithm; just a few? probably some manual actions. For manual actions, you'll get a note too in WMT. With panda I like to look first at the on-page to see if I can spot the egregious KW stuffing, weird infrastructure setups that result in thin/duplicated content, and look into engagement metrics and my favorite...externally supported pages - to - total indexed pages ratios.
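Joe's anchor % breakdown check can be sketched as a simple frequency table (the anchors and the 40% threshold below are invented for illustration, not a known Penguin cutoff):

```python
# Sketch of a first-pass anchor-text breakdown: a natural profile tends to be
# dominated by brand and URL anchors, while a high share of exact-match "money"
# anchors is a red flag worth investigating. All data here is illustrative.
from collections import Counter

def anchor_breakdown(anchors: list[str]) -> dict[str, float]:
    """Return each anchor's share of the profile as a percentage."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: round(100 * n / total, 1) for text, n in counts.most_common()}

anchors = ["cheap widgets"] * 6 + ["Acme Co"] * 2 + ["acme.example.com"] * 2
profile = anchor_breakdown(anchors)
print(profile)
if profile.get("cheap widgets", 0) > 40:  # arbitrary illustrative threshold
    print("commercial anchor share looks over-optimized")
```

It is only a starting signal; as Joe notes, link sources, timing, and WMT notices all have to be read alongside it.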

Jim: Manual, at least you can keep resubmitting and get a yes or no. With an algorithmic, you're screwed....because you're waiting for the next refresh...hoping you did enough to get out.

I don't mind going back and forth with Google with a manual penalty...at least I'm getting an answer.

If you see a drop in traffic, be sure to compare that to the dates of Panda and Penguin updates... if you see a drop on one of the update days, then you know whether you have Panda or Penguin... and if your traffic is just falling, it could be just that, and no penalty.

Joe: While this interview was taking place, an employee pinged me to let me know a reconsideration request was denied, with an example URL being something akin to domain.com/?var=var&var=var -- the entire domain was already disavowed. Those 20-second manual reviews by 3rd parties without much understanding of search don't generate a lot of confidence for me.

Jim: Yes, I posted this yesterday to SEOchat. Reviewers are definitely not looking at things.

You guys mentioned that anyone selling a guaranteed 100% recovery solution is likely selling a bag of smoke. What are the odds of recovery? When does it make sense to invest in recovery, when does it make sense to start a different site, and when does it make sense to do both in parallel?

Jim: Well, I'm one for trying to save a site. I haven't once said "it's over for that site, let's start fresh." Links are so important that if I can save even a few links going to a site, I'll take it. I'm not a fan of doing two sites; it causes duplicate content issues, and now your efforts are split across two sites.

Joe: It depends on the infraction. I have a lot more success getting stuff out of Panda, manual actions, and the later iterations of Penguin (theoretically including the latest one once a refresh takes place); I won't take anyone's money for those hit by Penguin 1.0, though... I give free advice and add it to my DB tracking, but the very few examples I have where a recovery took place that I can confirm were Penguin 1.0 and not something else happened due to being a beta user of the disavow tool, and likely occurred for political reasons vs tech reasons.

For churn and burn, redirects and canonicals can still work if you're clever...but that's not reinvestment so much as strategy shift I realize.

You guys mentioned the disavow process, where a person does some, does some more over time, etc. Is Google dragging out the process primarily to drive pain? Or are they leveraging the aggregate data in some way?

Joe: Oh absolutely they drag it out. Mathematically I think of triggers where a threshold to trigger down might be at X%, but the trigger for recovery might be X-10%. Further though, I think initially they looooooved all the aggregate disavow data, until the community freaked out and started disavowing everything. Let's just say I know of a group of people that have a giant network where lots of quality sites are purposefully disavowed in an attempt to screw with the signal further. :)
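Joe's X% trigger-down vs X-10% trigger-up idea is classic hysteresis: the bar to escape a penalty sits below the bar that tripped it, so a site hovering in between stays stuck. A toy sketch with invented thresholds (30% in, 20% out -- not real Google numbers):

```python
def is_penalized(bad_link_pct, currently_penalized,
                 trigger_down=30.0, trigger_up=20.0):
    """Return the new penalty state under hysteresis thresholds.

    A clean site trips the penalty at trigger_down; a penalized site
    only recovers once it falls below the stricter trigger_up bar.
    """
    if not currently_penalized:
        return bad_link_pct >= trigger_down
    return bad_link_pct >= trigger_up

# 25% bad links keeps a clean site clean, but keeps a penalized site penalized.
clean_site = is_penalized(25, currently_penalized=False)      # False
penalized_site = is_penalized(25, currently_penalized=True)   # True
```

The gap between the two thresholds is exactly the "won't let them up for air" zone Joe describes.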

Jim: pain :) ... not sure if they're leveraging the data yet, but they might be. It shouldn't be too hard for Google to see that a ton of people are disavowing links from a site like get-free-links-directory.com, for Google to say, "no one else seems to trust these links, we should just nuke that site and not count any links from there."

We can do this ourselves with the tools we have... I can see how many times I've seen a domain in my disavows, and how many times I disavowed it. I.e., if I see spamsite.com in 20 disavows I've done, and I disavowed it all 20 times I saw it, I can see this data... or if I've seen goodsite.com 20 times, and never once disavowed it, I can see that too. I'd assume Google must do something like this as well.
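Jim's "seen 20 times, disavowed 20 times" bookkeeping boils down to a per-domain frequency ratio. A hypothetical sketch (domain names and counts invented), assuming each project's "seen" and "disavowed" domains are plain lists:

```python
from collections import Counter

def disavow_ratio(seen_lists, disavowed_lists):
    """For each domain, the fraction of projects where it was disavowed
    out of the projects where it was seen at all."""
    seen = Counter(d for lst in seen_lists for d in set(lst))
    disavowed = Counter(d for lst in disavowed_lists for d in set(lst))
    return {d: disavowed[d] / seen[d] for d in seen}

# spamsite.com: seen in 3 projects, disavowed all 3 times -> ratio 1.0
# goodsite.com: seen in 3 projects, never disavowed      -> ratio 0.0
seen = [["spamsite.com", "goodsite.com"]] * 3
dis = [["spamsite.com"]] * 3
ratios = disavow_ratio(seen, dis)
```

A high ratio across many independent projects is the "no one else seems to trust these links" signal Jim speculates Google could compute at web scale.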

Given that they drag it out, on the manual penalties does it make sense to put in a restrained effort on the first request or two, in order to give the perception of a greater level of pain and effort as you scale things up on further requests? What level of resources does it make sense to devote to the initial effort vs the next one and so on? When does recovery typically happen (in terms of % of links filtered and in terms of how many reconsideration requests were filed)?

Joe: When I deliver "disavow these" and "say this" stuff, I give multiple levels, knowing full well that there might be deeper and deeper considerations of the pain. Now, there have been cases where the 1st try gets a site out, but I usually see 3 or more.

Jim: I figure it will take a few reconsideration requests...and yes, I start "big" and get "bigger."

but that's for a sitewide penalty...

We've seen sitewides get reduced to a partial penalty. And once we have a partial penalty, it's much easier to identify the issues and take care of those, while leaving the links that go to pages that were not affected.

A sitewide manual penalty kills the site... a partial match penalty usually has some stuff that still ranks well, and some stuff that no longer ranks... once we're at a partial match, I feel much more confident in getting that resolved.

Jim, I know you've mentioned the errors people make in either disavowing great links or disavowing links when they didn't need to. You also mentioned the ability to leverage your old disavow data when processing new sites. When does it make sense to DIY on recovery versus hiring a professional? Are there any handy "rule of thumb" guidelines in terms of the rough cost of a recovery process based on the size of their backlink footprint?

Joe: It comes down to education, doesn't it? Were you behind the reason it got dinged? You might try that first vs immediately hiring. Psychologically it could even look like you're more serious after the first disavow is declined, by showing you "invested" in the pain. Also, it comes down to opportunity cost: what is your personal time worth, divided by your perceived probability of fixing it?

Jim: We charge $5,000 for the analysis, and $5,000 for the link removal process... some may think that's expensive... but removing good links will screw you, and not removing bad links will screw you... it's a real science, and getting it wrong can cost you a lot more than this... of course I'd recommend seeing a professional, as I sell this service... but I can't see anyone who's not a true expert in links doing this themselves.

Oh...and once we start work for someone, we keep going at no further cost until they get out.

Joe: That's a nice touch Jim.

Jim: Thank you.

Joe, during this interview you mentioned a reconsideration request rejection where the person cited a link on a site that has already been disavowed. Given how many errors Google's reviewers make, does it make sense to aggressively push to remove links rather than using disavow? What are the best strategies to get links removed?

Joe: DDoS

Jim: hehe

Joe: Really though, be upfront and honest when using those link removal services (which I'd do vs trying to do them one-by-one-by-one)

Jim: Only 1% of the people will remove links anyway; it's more to show Google that you really tried to get the links removed.

Joe: Let the link holder know that you got hit with a penalty, you're just trying to clean it up because your business is suffering, and ask politely that they do you a solid favor.

I've been on the receiving end of a lot of different strategies given the size of my domain portfolio. I've been sued before (as a first course of action!) by someone that PAID to put a link on my site....they never even asked, just filed the case.

Jim: We send 3 removal requests... and ping the links too... so when we do a reconsideration request we can show Google the spreadsheet of who we emailed, when we emailed them, and who removed or nofollowed the links... but it's more about "show" to Google.

Joe: Yep, not a ton of compliance; webmasters have link removal fatigue by now.

This is more of a business question than an SEO question, but ... as much as budgeting for the monetary cost of recovery, an equally important form of budgeting is dealing with the reduced cashflow while the site is penalized. How many months does it typically take to recover from a manual penalty? When should business owners decide to start laying people off? Do you guys suggest people aggressively invest in other marketing channels while the SEO is being worked on in the background?

Jim: A manual penalty typically takes 2-4 months to recover from. "Recover" is a relative term. Some people get "your manual penalty has been removed" and their recovery is a tiny blip -- up 5%, but still down 90% from what it was prior. Getting a manual penalty removed is great IF there are good links left in your profile... if you've disavowed everything and your penalty is removed... so what... you've got nothing... people often ask where they'll be once they "recover" and I say "it depends on what you have left for links"... but it won't be where you were.

Joe: It depends on how exposed they are to variable costs. If the costs are fixed, then one can generally wait longer (all things being equal) before cutting. If you have a quarter-million monthly link budget *cough*, then you're going to want to trim as quickly as possible just in order to survive.

Per investing in other channels, I absolutely, wholeheartedly cannot emphasize enough how important it is to become an expert in one channel and at least a generalist in several others... even better, hire an expert in another channel to partner up with. In payday lending, one of the big players did okay in SEO but, even with a lot of turbulence, was doing great due to their TV and radio capabilities. Also, collect the damn email addresses; email is still a gold mine if you use it correctly.

One of my theories for why there hasn't been a penguin update in a long time was that as people have become more afraid of links they've started using them as a weapon & Google doesn't want a bunch of false positives caused by competitors killing sites. One reason I've thought this versus the pain first motive is that Google could always put a time delay on recoveries while still allowing new sites to get penalized on updates. Joe, you mentioned that after the second Penguin update penalties started passing forward on redirects. Do people take penalized sites and point them at competitors?

Joe: Yes, they do. They also take them and pass them into the natural links of their competitors. I've been railing on negative SEO for several years now...right about when the first manual action wave came out in Jan 2012; that was a tipping point. It is now more economical to take someone else's ranking down than it is to (with a strong degree of confidence) invest in a link strategy to leapfrog them naturally

I could speak for days straight in a congressional filibuster on link strategies used for Negative SEO. It is almost magical how pervasive it has become. I get a couple requests a week to do it even...by BIG companies. Brands being the mechanism to sort out the cesspool and all that.

Jim: Soon, everyone will be monitoring their backlinks on a monthly basis. I know one big company that submits an updated disavow list to Google every week.

That leads to a question about preemptive disavows. When does it make sense to do that? What businesses need to worry about that sort of stuff?

Joe: Are you smaller than a Fortune 500? Then the cards are stacked against you. At the very least, be aware of your link profile -- I wouldn't go so far as to preemptively disavow unless something major popped up.

Jim: I've done a preemptive disavow for my site. I'd say everyone should do a preemptive disavow to clean out the crap backlinks.

Joe: I can't wait to launch an avow service...basically go around to everyone and charge a few thousand dollars to clean up their disavows. :)

Jim: We should team up Joe and do them together :)

Joe: I'll have my spambots call your spambots.

Jim: saving the planet from penguin penalties. cleaning up the links of the web for Google.

Joe: For Google or from Google? :) The other dig, if there's time, is that not all penalties are created equal because there are several books of law in terms of how long a penalty might last. If I take an unknown site and do what RapGenius did, I'd still be waiting, even after fixing (which rapgenius really didn't do) largely because Google is not one of my direct or indirect investors.

Perhaps SEOs will soon offer a service for perfecting your pitch deck for the Google Ventures or Google Capital teams so it is easier to BeatThatPenalty? BanMeNot ;)

Joe: Or to extract money from former Googlers...there's a funding bubble right now where those guys can write their own ticket by VCs chasing the brand. Sure the engineer was responsible for changing the font color of a button, but they have friends on the inside still that might be able to reverse catastrophe.

Outside of getting a Google investment, what are some of the best ways to minimize SEO risk if one is entering a competitive market?

Jim: Don't try to rank for specific phrases anymore. It's a long slow road now.

Joe: Being less dependent on Google gives you power; think of it like a job interview. Do you need that job? The less you do, the more bargaining power you have. If you have more and more income coming in to your site from other channels, chances are you are also hitting on some important brand signals.

Jim: You must create great things, and build your brand...that has to be the focus...unless you want to do things to rank higher quicker, and take the risk of a penalty with Google.

Joe: Agreed. I do far fewer premium domaining + SEO-only plays anymore. For a while they worked; just a different landscape now.

Some (non-link builders) mention how foolish SEOs are for wasting so many thought cycles on links. Why are core content, user experience, and social media all vastly more important than link building?

Jim: Links are still the biggest part of the Google algorithm -- they cannot be ignored. People must have things going on that will get them mentions across the web, and ideally some links as well. Links are still #1 today... but yes, after links, you need great content, good user experience, and more.

Joe: CopyPress sells content (please buy some content people; I have three kids to feed here), however it is important to point out that the most incredible content doesn't mean anything in a vacuum. How are you going to get a user experience with 0 users? Link building, purchasing traffic, DRIVING attention are crucial not just to SEO but to marketing in general. Google is using links as votes; while the variability has changed and evolved over time, it is still very much there. I don't see it going away in the next year or two.

An analogy: I wrote two books of poetry in college; I think they are ok, but I never published them and tried to get any attention, so how good are they really? Without promotion and amplification, we're all just guessing.

Thanks guys for sharing your time & wisdom!


About our contributors:

Jim Boykin is the Founder and CEO of Internet Marketing Ninjas, and owner of Webmasterworld.com, SEOChat.com, Cre8asiteForums.com and other community websites. Jim specializes in creating digital assets for sites that attract natural backlinks, and in analyzing links to disavow non-natural links for penalty recoveries.

Joe Sinkwitz, known as Cygnus, is current Chief of Revenue for CopyPress.com. He enjoys long walks on the beach, getting you the content you need, and then whispering in your ear how to best get it ranking.
