There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:
no media coverage: few journalists are on the job & there is no expectation that the PR team will answer any questions. No official word beyond rumors from self-promotional marketers = no story.
many SEOs out of the office: few are watching as the algorithms tip their cards.
declining search volumes: long holiday weekends generally have less search volume associated with them. Thus anyone who is aggressively investing in SEO may wonder if their site was hit, even if it wasn't.
The communication conflicts this causes, between in-house SEOs and their bosses as well as between SEO companies and their clients, make the job of the SEO more miserable and make the client more likely to pull back on investment, all while work ruins the SEO's vacation and creates family friction back home.
fresh users: as people travel their search usage changes, thus they have fresh sets of eyes & are doing somewhat different types of searches. This in turn makes their search usage data more dynamic and useful as a feedback mechanism on any changes made to the underlying search relevancy algorithm or search result interface.
Algo Flux Testing Tools
Just about any of the algorithm volatility tools showed far more significant shifts earlier this month than over the past few days.
One issue with looking at any of these indexes is that rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so algorithm volatility scores run much higher than the actual shifts in search traffic. (The least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news.)
You can use AWR's flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
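To make the point concrete, here is a minimal sketch of how a flux score of this sort can be computed. The formula and the sample rankings are illustrative assumptions on my part, not the actual methodology of AWR or any other tracking tool:

```python
# A minimal sketch of a rank-volatility index. The weighting and
# sample data are illustrative assumptions, not any tool's formula.

def flux_score(day1, day2, depth):
    """Average absolute position change among results ranked within
    `depth` on day one (a URL that drops out counts as depth + 1)."""
    shifts = []
    for url, pos in day1.items():
        if pos <= depth:
            new_pos = day2.get(url, depth + 1)
            shifts.append(abs(new_pos - pos))
    return sum(shifts) / len(shifts)

# Hypothetical rankings for one keyword on two consecutive days.
monday  = {"a.com": 1, "b.com": 2, "c.com": 3, "d.com": 7, "e.com": 12, "f.com": 18}
tuesday = {"a.com": 1, "b.com": 2, "c.com": 4, "d.com": 10, "e.com": 19, "f.com": 30}

print(flux_score(monday, tuesday, depth=10))  # flux within the top 10
print(flux_score(monday, tuesday, depth=20))  # flux within the top 20
```

With these made-up rankings the top-20 score comes out well above the top-10 score, because the deeper results move around much more while the top few barely budge.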
Example Ranking Shifts
I shut down our membership site in April & spend most of my time reading books & news to figure out what's next after search, but a couple of legacy clients I am winding down work with still have me tracking a few keywords. One of those terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.
Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.
As a comparison, here is that chart over the past 3 months.
Notice that the big ranking moves which became common over the past month were not common in the 2 months prior.
Negative SEO Was Real
There is a weird sect of alleged SEOs which believes Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.
As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.
Part of the reason many people think there was no Penguin update, or responded to it with "that's it?", is that few sites hit in the past recovered, relative to the number of sites which had ranked well until they were clipped by this algorithm update.
When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.
Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.
Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new "better than ever" spam fighting algorithm.
They'll let some time pass before the penalized sites can recover.
Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they've accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.
What to do?
So here are some of the obvious algorithmic holes left by the new Penguin approach... though I'm not sure cataloging them would even be a valid mindset in the current market. Hell, the whole ecosystem is built on quicksand.
The trite advice is to make quality content, focus on the user, and build a strong brand.
But you can do all of those well enough that you change the political landscape yet still lose money.
“Mother Jones published groundbreaking story on prisons that contributed to change in govt policy. Cost $350k & generated $5k in ad revenue”— SEA☔☔LE SEO (@searchsleuth998) August 22, 2016
And it is getting almost impossible to win in search by focusing on search as an isolated channel.
I never understood mentality behind Penguin "recovery" people. The spam links ranked you, why do you expect to recover once they're removed?— SEOwner (@tehseowner) September 25, 2016
Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.
Obviously removing them may get you out of algorithm, but then you'll only have enough power to rank where you started before spam links.— SEOwner (@tehseowner) September 25, 2016
Anyone operating at scale chasing SEO with automation is likely to step into a trap.
When it happens, that player better have some serious savings or some non-Google revenues, because even with "instant" algorithm updates you can go months or years on reduced revenues waiting for an update.
And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.
"If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits." - Matt Cutts
Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”
What sort of strategy is helping to drive that industry transformation?
How about doorway pages?
That is in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.
Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.
These sorts of doorway pages are still live to this day.
Simply look at the footer area of lendup.com/payday-loans
But the pages existing doesn't mean they rank.
For that let's head over to SEMrush and search for LendUp.com
Today those sorts of stories are literally everywhere.
Tomorrow the story will be over.
And when it is.
Precisely zero journalists will have covered the above contrasting behaviors.
As they weren't in the press release.
Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space. Google will be able to show effectively the same ads for effectively the same service, and by the time the P2P loan bubble pops some of the payday lenders will have followed LendUp's lead in re-branding their offers as being something else in name.
A user comment on Google's announcement blog post gets right to the point...
Are you disgusted by Google's backing of LendUp, which lends money at rates of ~ 395% for short periods of time? Check it out. GV (formerly known as Google Ventures) has an investment in LendUp. They currently hold that position.
Oh, the former CIO and VP of Engineering of Google is the CEO of Zest Finance and Zest Cash. Zest Cash lends at an APR of 390%.
Meanwhile, off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech utopian PR misinformation.
Don't expect to see a link to this blog post on TechCrunch.
There you'll read some hard-hitting cutting edge tech news like:
Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.
Update: Kudos to the Google Public Relations team, as it turns out the CFPB is clamping down on payday lenders, so all the positive PR Google got on this front was simply them front running a known regulatory issue in the near future & turning it into a public relations bonanza. Further, absolutely NOBODY (other than the above post) mentioned the doorway page issue, which remains in place to this day & is driving fantastic rankings for their LendUp investment.
Update 2: Record-keeping requirements do not improve things if a company still intentionally violates the rules, knowing it will only have to pay a token slap-on-the-wrist fine if and when it is finally caught. All that really does is drive the local businesses under.
The massive record-keeping and data requirements that Mr. Cordray is foisting on the industry will have another effect: It will drive out the small, local players who have dominated the industry in favor of big firms and consolidators who can afford the regulatory overhead. It will also favor companies that can substitute big data for local knowledge like LendUp, the Google-backed venture that issued a statement Thursday applauding the CFPB rules. Google’s self-interest has become a recurrent theme in Obama policy making.
Online lending start-up LendUp, which has billed itself as a better and more affordable alternative to traditional payday lenders, will pay $6.3 million in refunds and penalties after regulators uncovered widespread rule-breaking at the company.
If you live outside of the United States it can be hard to appreciate just how ad heavy some of Google's search results have become in key ad categories.
Plenty of Room in Hotel California
When Google rolled out the 4 AdWords ads above the organic results layout they mentioned it would mostly appear on highly commercial search terms like New York Hotels. Hotels are one of the most profitable keyword themes, because:
the searches tend to be fairly late funnel
the transactions are for hundreds of dollars
OTAs and other intermediaries often take somewhere between 10% and 30% of the transaction
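Those bullet points make the keyword economics easy to sketch. With hypothetical (but plausible) figures, the break-even cost per click for an OTA falls out directly:

```python
# Back-of-the-envelope hotel keyword economics. All figures here
# are hypothetical assumptions, not data from any real OTA.
booking_value   = 300.0  # average transaction, in dollars
commission_rate = 0.20   # OTA cut, within the 10%-30% range above
conversion_rate = 0.03   # share of paid clicks that become a booking

revenue_per_booking = booking_value * commission_rate        # $60.00
revenue_per_click   = revenue_per_booking * conversion_rate  # $1.80

# revenue_per_click is roughly the break-even CPC an OTA can pay
print(revenue_per_booking, revenue_per_click)
```

Late-funnel intent pushes the conversion rate up, and the large transaction size pushes the commission up, which is why hotel CPCs can climb so high before the math stops working.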
Google search results for hotels not only contain 4 AdWords ads, but they also have price ads on the "organic" local listings. That gives Google a second bite at the apple on monetizing the user.
Click on any of those prices and you get sent to a beautiful(ly ugly) ad heavy click circus page like the following.
today we will begin phasing out the following Digital Magazines: Yahoo Food, Yahoo Health, Yahoo Parenting, Yahoo Makers, Yahoo Travel, Yahoo Autos and Yahoo Real Estate.
Direct Marketing Budgets vs Brand Ad Budgets
Google recently had another vertical search program, focused on finance, which paralleled their hotel offering. It allowed users to compare things like credit cards, home loans, auto insurance policies, and other financial offers. Google acquired BeatThatQuote, hard-coded aggressive placements for themselves near the top of the search results, increased the size of these custom ad units - and then killed them off.
Why would Google invest hundreds of millions of dollars in vertical search only to kill the offering?
It turns out the offering was too efficient from an advertiser perspective, so it didn't drive enough yield for Google.
If it is a lead-based product the ad rates are set by rational lead values. There is no brand manager insisting on paying $120 a click because "we HAVE TO be #1 in Google for auto insurance."
If Google does lead generation and sells the lead off exclusively they get paid precisely once for the consumer. Whereas if Google scrubs many aggregators from the market & allows searchers to click on one brand at a time they get to monetize the user many times over and take advantage of any irrational bidders in the ecosystem.
As long as Google is monetizing brand advertising budgets they can insert many layers of fat into the ad stack.
Google's vertical ad offerings may come and go, the biases behind the relevancy algorithms may shift, and the ecosystem constantly has some number of false positives. As search engines test out various features & shift their editorial policies some companies get disrupted and are forced to change their business models, while other companies get disrupted and outright disappear.
Google's move into auto insurance might have been part of the reason Bankrate decided to exit the business. But Google exiting the Google Compare business and adding a 4th text AdWords ad slot above the organic search results a few days before Bankrate reported results caused Bankrate's stock to slide by as much as 47%.
Brand Building to Lower Risk
Part of the SEO value of building a brand is that brand awareness helps you rank better across whatever portion of the search ecosystem Google has not yet eaten, while lowering your risk of becoming a false positive statistic. Brand-related searches should (in theory) also provide some baseline level of demand which insulates against ranking shifts on other keywords. And having a brand name rather than a generic business name allows one to go from one market to the next.
Just be Apple...
Computers.com won't magically morph into MP3player.com then CellPhone.com then Tablet.com then Watch.com, but Apple was able to move from one market to the next with ease due to consumer familiarity and loyalty toward their brand.
Investing in building brand awareness is often quite expensive & typically requires many years of losses to eventually see positive returns. Trends come and go, and with them so do associated brands.
Heavily invest in the wrong trend & die.
Wait too long to invest in an important trend & die.
Few companies are able to succeed in field after field after field.
When the financial crisis happened about 8 years ago Google saw both their revenue growth rate and their stock price crash. Direct marketers receded with the consumer, but many pre-approved brand ad campaigns continued to run. Google's preferred customer shifted away from direct marketers toward large global brands.
If you shop at big box stores in the United States you may have no awareness of the following product.
Look a bit closer at that image & you'll see it wasn't LEGO, but rather LEBQ.
Sales for Le Bao Quan are not sales for the core LEGO brand, the consumer gets acclimated to an artificially low price point, and imagine what sort of a traumatic impact it might have for a child if their first LEGO-like toy looks like a pig fresh from the butcher's shop.
The key difference between that sort of stuff and the gray areas monetized by the big online platforms is that you may have to go to the third world to find the sketchy physical products in the real world, whereas the big online platforms all have some number of sketchy, globally accessible offers at any point in time. Here are just a few examples:
Part of why Apple has such strong margins is their brand is so strong they can dictate terms and control the supply chain. Others are willing to give them the majority of the profits because carrying them completes the catalog and helps the retailers sell other, weaker goods where the retailers have higher profit margins.
Luckily when fake products use spammy titles on Amazon the reviewers will quickly highlight if they are of inferior quality. But if they look authentic & work, it can be hard for the brands to know unless they proactively track everything. And as that demand gets filled, if there is a negative experience it may lead to customer complaints about the brand, whereas if there are no complaints & the product works it still leaves less money for the brand which is being arbitraged.
"The Internet doesn't change everything. It doesn't change supply and demand." - Andy Grove
Some companies die slowly, as accountants drive strategy & they outsource their key points of differentiation and become unremarkable. When Yahoo! turned their verticals into thin "me too" outsourced plays they made it easy for Google to offer something of a similar quality, which in turn left the Yahoo! vertical properties without much distribution.
Some retailers have symbiotic relations with brands they sell, while other platforms may compete more aggressively with those whose products they sell. The same is true with affiliates. Affiliates can genuinely add value & drive new distribution for brands, or they can engage in lower value arbitrage, where they push the brand to pay for what was already owned by it through shady techniques like cookie stuffing.
One of the most one-sided and biased hate-filled perspectives I've ever seen about affiliates is Lori Weiman's guest columns at Search Engine Land.
Just the same, some merchants treat affiliates honestly and fairly, while other merchants have a pattern of scamming their affiliates through lead shaving, adjusting revenue share without telling the affiliates, and a host of other sketchy behaviors.
Monetizing Brand (Search Engine)
Search engines allow competitors or resellers to bid on branded keywords, which creates an auction bidding environment for many branded terms. Typically Google offers the official site / brand clicks at a significant discount for these terms in order to encourage them to compete in the ad marketplace & to help shift some of the organic click mix over to paid clicks.
Google has also tried a number of other initiatives to boost their monetization of branded keywords. A partial list of such efforts includes:
a test of banner ads from brands which were merged with organic listings (this effort was quickly dumped because it didn't drive revenues, as it didn't allow auction dynamics to push prices upward - similar to the reason Google Advisor was shut down)
adding other distracting eye candy to mobile results including the knowledge graph and "also searched for" links pointing at competing businesses
allowing syndicated search partners to use harder to notice ad labeling
allowing syndicated search partners to use more ads above their organic search results
Sophisticated vs Unsophisticated SEM
Many poorly managed AdWords accounts run by large ad agencies ultimately end up far more damaging to brands than the efforts of "shady" affiliates. The setup (which is far more common than most would care to believe) revolves around the ad agency arbitraging the client's existing brand, falsely claiming the revenue generated by that spend is completely incremental, and then collecting a percent-of-spend management fee on it. The phantom profits generated from those efforts are then applied to bidding irrationally high on other terms, to once again pick up more percent-of-spend management fees.
Both eBay and Google have done studies on the incrementality of paid search clicks.
eBay being a large brand found they didn't see much incrementality [PDF]. Search Google for eBay and they won't run AdWords ads. eBay still participates in product listing ads / shopping search for other products they carry.
Google (of course) found much more incrementality with paid search ads. While they conducted their internal study and suggested it would be too hard or expensive for most advertisers to conduct such a study, they also failed to mention that the reason it would be expensive for an advertiser to perform such a test is because Google intentionally & explicitly decided against offering those features inside the AdWords platform. It is the same reason Google shut down Google Advisor / Google Compare - offering it doesn't provide Google a guaranteed positive yield when compared against not offering it.
One thing Google did note about seeing higher rates of incremental clicks in their study was when there was increased space between the listings there tended to be a higher rate of incremental ad clicks. This is part of why we see AdWords ads getting larger with more extensions & there being so many features in mobile which push the organic results below the fold.
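Both the eBay and Google studies rest on the same basic idea: withhold ads in some comparable markets and see how much of the paid traffic comes back as organic clicks. A simplified sketch of that calculation, using entirely hypothetical numbers rather than figures from either study:

```python
# Estimating paid-search incrementality with a geo holdout.
# Simplified sketch with hypothetical numbers; real studies also
# control for seasonality and differences between market groups.

# Markets where brand ads kept running:
clicks_paid_on    = 10_000  # paid clicks
clicks_organic_on = 40_000  # organic clicks

# Comparable markets where ads were paused; organic listings
# absorb much of the demand the ads would have captured:
clicks_organic_off = 48_000

# Visits actually lost by turning the ads off:
incremental_visits = (clicks_paid_on + clicks_organic_on) - clicks_organic_off

# Share of paid clicks that were truly incremental:
incrementality = incremental_visits / clicks_paid_on
print(incrementality)
```

In this made-up example only a fifth of the paid clicks were incremental; the rest were clicks the site would have received organically anyway, which is the pattern eBay reported for its strong brand.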
The same Lori Weiman who hates affiliates is currently running (literally) an 8-part series on why you should bid on your brand keywords.
If anyone other than a search engine monetizes brand that might be bad, but if the search engines do it then going along with the game is always the right call.
Owning the Supply Chain
"The true victory (the true 'negation of the negation') occurs when the enemy talks your language." - Slavoj Zizek
The opposite is also true. If you are a brand who is being dictionary attacked by an ad network, the brand quickly shifts from an asset to a liability.
"The only thing that I'd rather own than Windows is English, because then I could charge you two hundred and forty-nine dollars for the right to speak it." - Scott McNealy
Google owns English and Spanish and German and ...
Google recently announced app streaming, where they can showcase & deep link into apps in the search results even if users do not have those apps installed. Rather than the user installing the app, Google runs the app on a computer in their cloud and streams an interactive video of it to the user. Click targets, ads, etc. remain the same.
Imagine if, in order to use the web, you had to download an app for each website you wanted to visit. To find news from the New York Times, you had to install an app that let you access the site through your web browser. To purchase from Amazon, you first needed to install an Amazon app for your browser. To share on Facebook, installation of the Facebook app for your browser would be required. That would be a nightmare.
The web put an end to this. More specifically, the web browser did. The web browser became a universal app that let anyone open anything on the web.
To meaningfully participate on those sorts of sites you still need an account. You are not going to be able to buy on Amazon without registration. Any popular social network which allows third party IDs to take the place of first party IDs will quickly become a den of spam until they close that loophole.
In short, you still have to register with sites to get real value out of them if you are doing much beyond reading an article. Without registration it is hard for them to personalize your experience & recommend relevant content.
Desktop Friendly Design
App indexing & deep linking of apps is a step in the opposite direction of the open web. It is supporting proprietary non-web channels which don't link out. Further, if you thought keyword (not provided) heavily obfuscated user data, how much will data be obfuscated if the user isn't even using your site or app, but rather is interacting via a Google cloud computer?
Who visited your app? Not sure. It was a Google cloud computer.
Where were they located? Not sure. It was a Google cloud computer.
Did they have problems using your app? Not sure. It was a Google cloud computer.
What did they look at? Can you retarget them? Not sure. It was a Google cloud computer.
Is an app maker too lazy to create a web equivalent version of their content? If so, let them be at a strategic disadvantage to everyone who put in the extra effort to publish their content online.
If Google has their remote quality raters consider a site as not meeting users needs because they don't publish a "mobile friendly" version of their site, how can one consider a publisher who creates "app only" content as an entity which is trying hard to meet end user needs?
The low price points for consumer apps in app stores make it hard for businesses to justify selling B2B apps at a high enough price to offset the smaller addressable audience.
It has become harder to sell consumer apps as the app stores have saturated with competition.
2008 I'll sell apps for $2.99 & make millions
2010 At $0.99 I'll make $1000s
2012 Ads might cover my rent
2014 Kickstart my app
2015 Hire me— Nick Lockwood (@nicklockwood) August 3, 2015
Exceptionally popular apps are disabled for interfering with business models of the platforms. Apps and extensions can be disabled at any time, even after the fact, due to violating guidelines or rule changes that turn what was once fine into a guideline violation. In some cases when they are disabled it is done with no option to re-enable.
We’re rapidly moving from an internet where computers are ‘peers’ (equals) to one where there are consumers and ‘data owners’, silos of end user data that work as hard as they can to stop you from communicating with other, similar silos.
If the current trend persists we’re heading straight for AOL 2.0, only now with a slick user interface, a couple more features and more users.
Katz of Gogobot says that “SEO is a dying field” as Google uses its “monopoly” power to turn the field of search into Google’s own walled garden like AOL did in the age of dial-up modems.
Almost 4 years ago a Google engineer described SEO as a bug. He suggested one shouldn't be able to rank highly without paying.
It looks like he was right. Google's aggressive ad placement on mobile SERPs "has broken the will of users who would have clicked on an organic link if they could find one at the top of the page but are instead just clicking ads because they don’t want to scroll down."
In the years since then we've learned Google's "algorithm" has concurrent ranking signals & other forms of home cooking which guarantees success for Google's vertical search offerings. The "reasonable" barrier to entry which applies to third parties does not apply to any new Google offerings.
And "bugs" keep appearing in those "algorithms," which deliver a steady stream of harm to competing businesses.
From Indy to Brand
The waves of algorithm updates have in effect increased the barrier to entry, along with the cost needed to maintain rankings. The stresses and financial impacts this puts on small businesses make many of them not worth running. Look no further than MetaFilter's founder seeing a psychologist, then quitting because he couldn't handle the process.
there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.
If you participate on the web daily, the change washes over you slowly, and the cumulative effects can be imperceptible. But if you were locked in an Iranian jail for years the change is hard to miss.
These sorts of problems not only impact search, but have an impact on all the major tech channels.
iPhone autocorrect inserted "showgirl" for "shows" and "POV" for "PPC". This crowd sourcing of autocorrect is not welcomed.— john andrews (@searchsleuth998) November 10, 2015
Eventually they might even symbolically close their websites, finishing the job they started when they all stopped paying attention to what their front pages looked like. Then, they will do a whole lot of what they already do, according to the demands of their new venues. They will report news and tell stories and post garbage and make mistakes. They will be given new metrics that are both more shallow and more urgent than ever before; they will adapt to them, all the while avoiding, as is tradition, honest discussions about the relationship between success and quality and self-respect.
If in five years I’m just watching NFL-endorsed ESPN clips through a syndication deal with a messaging app, and Vice is just an age-skewed Viacom with better audience data, and I’m looking up the same trivia on Genius instead of Wikipedia, and “publications” are just content agencies that solve temporary optimization issues for much larger platforms, what will have been point of the last twenty years of creating things for the web?
A Deal With the Devil
As ad blocking has grown more pervasive, some publishers believe the solution to the problem is through gaining distribution through the channels which are exempt from the impacts of ad blocking. However those channels have no incentive to offer exceptional payouts. They make more by showing fewer ads within featured content from partners (where they must share ad revenues) and showing more ads elsewhere (where they keep all the ad revenues).
The problem is if you don't control the publishing you don't control the monetization and you don't control the data flow.
Your website helps make the knowledge graph (and other forms of vertical search) possible. But you are paid nothing when your content appears in the knowledge graph. And the knowledge graph now has a number of ad units embedded in it.
A decade ago, when Google pushed AutoLink to automatically insert links in publishers' content, webmasters had enough leverage to "just say no." But now? Not so much. Google considers in-text ad networks spam & embeds their own search in third party apps. As the terms of deals change, and what is considered "best for users" changes, content creators quietly accept, or quit.
The most recent leaked Google rater documents suggested the justification for featured answers was to make mobile search quick, but if that were the extent of it then it still doesn't explain why they also appear on desktop search results. It also doesn't explain why the publisher credit links were originally a light gray.
With Google everything comes down to speed, speed, speed. But then they offer interstitial ad units, lock content behind surveys, and transform the user intent behind queries in a way that leads them astray.
Back in 2009 Google executives were scared of not being able to retain talent with stock options after Google's stock price cratered with the rest of the market & Google's ad revenue growth rate slid to zero. That led them to reprice employee stock options. That is as close as Google has come to a "near death" experience since their IPO. They've consistently grown & become more dominant.
In 2012 a Googler named Jon Rockway was more candid than Googlers are typically known for being: "SEO isn't good for users or the Internet at large. ... It's a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug."
"If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits." - Matt Cutts
Through a constant ex-post-facto redefinition of "what is spam" to include most anything which is profitable, predictable & accessible, Google engineers work hard to "deny people money."
Over time SEO became harder & less predictable. The exception being Google investments like Thumbtack, in which case others' headwind became your tailwind & a list of techniques declared off-limits became a strategy guidebook.
"We also like that it means alpha‑bet (Alpha is investment return above benchmark), which we strive for!" - Larry Page
From Do/Know/Go to Scrape/Displace/Monetize
It takes a lot of effort & most people are probably too lazy to do it, but if you look at the arc of Google's patents related to search quality, many of the early ones revolved around links. Then many focused on engagement related signals. Chrome & Android changed the pool of signals Google had access to. Things like Project Fi, Google Fiber, Nest, and Google's new OnHub router give them more of that juicy user data. Many of their recently approved patents revolve around expanding the knowledge graph so that they may outright displace the idea of having a neutral third party result set for an increasing share of the overall search pie.
Searchers can instead get bits of "knowledge" dressed in various flavors of ads.
This sort of displacement is having a significant impact on a variety of sites. But for most it is a slow bleed rather than an overnight sudden shift. In that sort of environment, even volunteer run sites will eventually atrophy. They will have fewer new users, and as some of the senior people leave, eventually fewer will rise through the ranks. Or perhaps a greater share of the overall ranks will be driven by money.
Jimmy Wales stated: “It is also false that ‘Wikipedia thrives on clicks,’ at least as compared to ad-revenue driven sites… The relationship between ‘clicks’ and the things we care about: community health and encyclopedia quality is not nothing, but it’s not as direct as some think.”
Most likely the relationship *is* quite direct, but there is a lagging impact. Today's major editors didn't join the site yesterday & take time to rise through the ranks.
If Google works hard enough at prioritizing "deny people money" as a primary goal, then they will eventually get an index quality that reflects that lack of payment. Plenty of good looking & well-formatted content, but a mix of content which:
is monetized indirectly & in ways which are not clearly disclosed
has interstitial ads and slideshows where the ads look like the "next" button & the "next" button is colored the same color as the site's background
There has been a general pattern in search innovation: Google introduces a new feature, pitches it as the next big thing, gets people to adopt it, collects data on the feature's impact, clamps down by only selectively allowing it, perhaps removes the feature from organic search results outright, then permanently adds the feature to their ad units.
This sort of pattern has happened so many times it is hard to count.
Google put faces in search results for authorship & to promote Google+; when Google realized Google+ was a total loser, it disconnected the two; then new ad units for local services showed faces in the search results. What was distracting noise was removed, then re-introduced as part of an ad unit.
Some sites which bundle software got penalized in organic search and are not even allowed to buy AdWords ads. At an extreme degree, sites which bundled no software, but simply didn't link to an End User License Agreement (EULA) from the download page were penalized. Which leads to uncomfortable conversations like this one:
Google Support: I looked through this, and it seemed that one of the issues was a lack of an End User Agreement (EULA)
Google Support: Hmm, They do want it on the download page itself
Simtec: How come there isn’t one here? google.co.uk/chrome/browser/desktop/
Google Support: LOL
Simtec: No really?
Google Support: That’s a great question
Of course, it goes without saying that much of the Google Chrome install base came from negative option software bundling on Adobe Flash security updates.
Google claimed helpful hotel affiliate sites should be rated as spam, then they put their own affiliate ads in hotel search results & even recommended hotel searches in the knowledge graph on city name searches.
Google search engineers have recently started complaining about interstitial ads & suggested they might create a "relevancy" signal based on users not liking those. At the same time, an increasing number of YouTube videos have unskippable pre-roll ads. And the volume of YouTube ad views is so large that it is heavily driving down Google's aggregate ad click price. On top of this, Google also offers a survey tool which publishers can lock content behind & requires users to answer a question before they can see the full article they just saw ranking in the search results.
"Everything is possible, but nothing is real." - Living Colour
Blue Ocean Opportunity
Amid the growing ecosystem instability & increasing hypocrisy, there have perhaps been only a couple "blue ocean" areas left in organic search: local search & brand.
And it appears Google might be well on their way in trying to take those away.
For years brand has been the solution to almost any SEO problem.
I wonder how many SEOs working for big brands have done absolutely nothing of value since 2012 yet still look like geniuses to executives.— Ross Hudgens (@RossHudgens) August 7, 2015
Now that the mobile search interface is literally nothing but ads above the fold, early data shows a significant increase in mobile ad clicks. Of course it doesn't matter if there are 2 or 3 ads, if Google shows ad extensions on SERPs with only 2 ads to ensure they drive the organic results "out of sight, out of mind."
Earlier this month it was also noticed Google replaced 7-pack local results with 3-pack local results for many more search queries, even on desktop search results. On some of these results they only show a call button, on others they show links to sites. It is a stark contrast to the vast array of arbitrary (and even automated) ad extensions in AdWords.
Why would they determine users want to see links to the websites & the phone numbers, then decide overnight users don't want those?
Why would Google determine for many years that 7 is a good number of results to show, and then overnight shift to showing 3?
If Google listed 7 ads in a row people might notice the absurdity of it and complain. But if Google only shows 3 results, then they can quickly convert it into an ad unit with little blowback.
You don't have to be a country music fan to know the Austin SEO limits in a search result where the local results are now payola.
Try not to hurt your back while looking down for the organic search results!
Here are two tips to ensure any SEO success isn't ephemeral: don't be nearby, and don't be a business. :D
Yesterday Google shared they see greater mobile than desktop search volumes in 10 countries including Japan and the United States.
3 years ago RKG shared CTR data which highlighted how mobile search ads were getting over double the CTR as desktop search ads.
The basic formula: less screen real estate = higher proportion of user clicks on ads.
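That formula can be sketched with a hedged back-of-envelope calculation. The CTR figures below are assumptions chosen for illustration (loosely patterned on RKG's finding that mobile search ads drew over double the CTR of desktop search ads), not measured data:

```python
# Back-of-envelope illustration of "less screen real estate = more ad clicks."
# Both CTR values and the ads-per-SERP count are hypothetical assumptions.

def ad_clicks_per_1000_searches(ad_ctr, ads_per_serp=3):
    """Expected ad clicks per 1,000 searches, assuming each SERP shows
    `ads_per_serp` ads and `ad_ctr` is the per-ad clickthrough rate."""
    return 1000 * ads_per_serp * ad_ctr

desktop = ad_clicks_per_1000_searches(ad_ctr=0.02)   # 2% per-ad CTR (assumed)
mobile = ad_clicks_per_1000_searches(ad_ctr=0.045)   # ~2.25x desktop (assumed)

print(desktop, mobile)  # → 60.0 135.0
```

On a small screen the ads also occupy a larger share of the first view, so the per-ad CTR itself rises, compounding the effect.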
Google made a big deal of their "mobilepocalypse" update to scare other webmasters into making their sites mobile friendly. Part of the goal of making sites "mobile friendly" is to ensure they aren't too ad dense (which in turn lowers accidental ad clicks & lowers monetization). Not only does Google have an "ad heavy" relevancy algorithm which demotes ad heavy sites, but they also explicitly claim even using a moderate sized ad unit on mobile devices above the fold is against their policy guidelines:
Is placing a 300x250 ad unit on top of a high-end mobile optimized page considered a policy violation?
Yes, this would be considered a policy violation as it falls under our ad placement policies for site layout that pushes content below the fold. This implementation would take up too much space on a mobile optimized site's first view screen with ads and provides a poor experience to users. Always try to think of the users experience on your site - this will help ensure that users continue to visit.
So if you make your site mobile friendly you can't run Google ads above the fold unless you are a large enough publisher that the guidelines don't actually matter.
If you spend the extra money to make your site mobile friendly, you then must also go out of your way to lower your income.
What is the goal of the above sort of scenario? Defunding content publishers to ensure most of the ad revenues flow to Google.
If you do X, you are a spammer. If Google does X, they are improving the user experience.
@aaronwall they will personally do everything they penalize others for doing; penalties are just another way to weaken the market.— Cygnus SEO (@CygnusSEO) May 5, 2015
The above sort of contrast is something noticed by non-SEOs. The WSJ article about Google's new ad units had a user response stating:
With this strategy, Google has made the mistake of an egregious use of precious mobile screen space in search results. This entails much extra fingering/scrolling to acquire useful results and bypass often not-needed coincident advertising. Perhaps a moneymaker by brute force; not a good idea for utility’s sake.
That content displacement with ads is both against Google's guidelines and algorithmically targeted for demotion - unless you are Google.
Facebook and Google accounted for a majority of mobile ad market growth worldwide last year. Combined, the two companies saw net mobile ad revenues increase by $6.92 billion, claiming 75.2% of the additional $9.2 billion that went toward mobile in 2013.
...and the smaller the screen size the more partners are squeezed out of the ecosystem...
The high-intent, high-value search traffic is siphoned off by ads.
What does that leave for the rest of the ecosystem?
It is hard to build a sustainable business when you have to rely almost exclusively on traffic with no commercial intent.
One of the few areas that works well is perhaps with evergreen content which has little cost of maintenance, but even many of those pockets of opportunity are disappearing due to the combination of the Panda algorithm and Google's scrape-n-displace knowledge graph.
Why do news sites get so much mobile search traffic? A lot of it is navigational & beyond that most of it is on informational search queries which are hard to monetize (and thus have few search ads) and hard to structure into the knowledge graph (because they are about news items which only just recently happened).
If you run a site which isn't a news site & look at the organic search traffic breakdown in your analytics account, you will likely see a far lower share of search traffic from mobile than news sites see. This goes back to Google dominating the mobile search interface with ads.
Mobile search ecosystem breakdown
traffic with commercial intent = heavy ads
limited commercial intent but easy answer = knowledge graph
limited commercial intent & hard to answer = traffic flows to news sites
The 4.7% of the websites Google pushed to go mobile friendly likely include some sites which would have been mobile friendly anyhow by virtue of being new sites on hosted platforms with responsive designs. But for the rest of the sites, was the shift worth it?
That is a tough question.
It is too early to tell.
Google still hasn't put much weight on it in the rankings yet.
Mobile traffic is typically worth far less than desktop traffic for most websites.
Time which was spent on mobile friendly conversion could have been spent on other forms of marketing.
Some sites which became mobile friendly took a significant revenue hit in doing so by switching out long running effective ad placements with mobile responsive units which may not have performed as well.
The problem with going early is you eat the expense upfront, while the rewards are still unknown.
Many people who jumped on the "secured everywhere" bandwagon last year saw broken security certificate issues and broken plugins which were hard to fix. And the upfront cost wasn't the only expense, as many AdSense publishers saw less relevant ads, lower ad CTR, and a sharp drop in AdSense earnings after going secured.
Those who spent the money to integrate Google Checkout to get AdWords discounts had to spend again to remove it when Google stopped supporting it.
TV makers who were early to integrate Google's YouTube API (which allowed ad free streaming) will now have to deal with a rash of customer complaints as Google sunsets the old API to make way for a paid ad-free subscription service.
If you are spending your own time & money and you believe in what you are doing and the longevity of a project then it doesn't matter too much if the rewards come slowly or never come. A sense of purpose & a sense of pride in your work is a form of payment.
However, if you are spending a client's money & you ring a 5 alarm fire to rush some technical change through & then see no upside after the much hyped announcement, that erodes client trust. If there is no upside and a huge drop in revenue, then the consultant looks like a clueless idiot burning money on make-work projects.
A few years ago a Google rep stated Panda would be folded into the regular algorithms. Then recently we were told it was near realtime. Then we were told it was something where updates needed to be manually pushed out & something Google hasn't done in 4 months. If we trusted Google & conveyed any of these messages to clients, once again we looked like idiots. If we choose to invest client money based on the cycles and advice we are given, quite often that is a money incinerator.
Imagine dropping $30,000 on a link cleanup project where you remove links which were helping your Bing rankings but the Google update "coming soon" takes over a year to show up.
Invest money to lower your current income while you're waiting for Godat.
So after Google made a big show of this pending mobile update by pre-announcing it, speaking about it at multiple conferences, comparing it to Panda and Penguin & stating it would have a bigger impact, sending out millions of warning messages via Webmaster Tools, etc etc etc .. when the big day came, did Google make the people who trusted them & invested in their advice look good?
Not so much.
Ayima recently launched a SERP flux pulse tracker tool which shows desktop and mobile flux side-by-side.
As you can see, nothing happened.
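Neither AWR nor Ayima publicly documents their exact methodology, but one plausible flux metric is the average absolute rank movement across tracked keyword/URL pairs between two daily snapshots. A minimal sketch under that assumption (the handling of listings that drop out of the tracked window is also an assumption):

```python
# A minimal sketch of how a SERP "flux" score might be computed: average
# absolute rank change across tracked (keyword, URL) pairs between two daily
# snapshots. This is an illustrative metric, not Ayima's or AWR's actual method.

def flux_score(yesterday, today, window=10):
    """yesterday/today map (keyword, url) -> rank. URLs that fall out of the
    tracked window are treated as ranking just past it (window + 1)."""
    moves = []
    for key, old_rank in yesterday.items():
        new_rank = today.get(key, window + 1)
        moves.append(abs(new_rank - old_rank))
    return sum(moves) / len(moves) if moves else 0.0

yesterday = {("hotels", "a.com"): 1, ("hotels", "b.com"): 2, ("flights", "c.com"): 5}
today     = {("hotels", "a.com"): 1, ("hotels", "b.com"): 4, ("flights", "c.com"): 9}

print(flux_score(yesterday, today))  # → 2.0
```

Note that computing this over the top 20 or top 50 positions will generally yield higher scores than over the top 10, since the top results carry the most usage data & ranking signals and are therefore the most stable.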
So far, no rewards. Maybe they will come. Though here is a hypothetical example where it could be very much NOT worth it for some publishers to go mobile friendly...
a webmaster managing an affiliate site converts it to a mobile responsive design
but user conversions on mobile devices in some verticals are unlikely, due to it being a pain in the ass to enter credit card info and so on ...
well ... person makes their site mobile friendly
that leads their mobile version of their site to rank better in Google
that leads to a greater share of their overall organic Google search traffic coming from mobile devices
their engagement metrics on mobile are somewhat weak, particularly when compared against desktop users, as is the case for many websites
their lower aggregate engagement metrics could create a signal which leads an edge case site into a false positive Panda penalty
that then lowers their desktop search rankings
which lowers their desktop search traffic
which lowers their desktop search revenues
...worse yet, ...
those affiliate cookies they dropped on mobile devices don't count for them when the user later converts on a desktop device
Any form of penalty (even a false positive) can become self-reinforcing. And many of the things which seem like they might help could cause harm.
You can't copyright facts, which means that if this were a primary ranking signal & people focused on it then they would be optimizing their site to be scraped-n-displaced into the knowledge graph. Some people may sugar coat the knowledge graph and rich answers as opportunity, but it is Google outsourcing the cost of editorial labor while reaping the rewards.
The previously mentioned links were governmental efforts. However such strategies are more common in the commercial market. Consider how Google has sponsored academic conferences while explicitly telling the people who put them on to hide the sponsorship as part of their lobbying efforts.
The problem is rarely attributed to Google, but as ecosystem diversity has declined (and entire segments of the ecosystem are unprofitable to service), more people are writing things like: "The market for helping small businesses maintain a home online isn’t one with growing profits – or, for the most part, any profits. It’s one that’s heading for a bloody period of consolidation."
If you don't think Google wants to disrupt you out of a job, you've been asleep at the wheel for the past decade— Michael Gray (@graywolf) March 13, 2015
We Just Listen to the Data (Ish)
As Google sucks up more data, aggregates intent, and scrapes-n-displaces the ecosystem they get air cover for some of their gray area behaviors by claiming things are driven by the data & putting the user first.
Those "data" and altruism claims from Google recently fell flat on their face when the Wall Street Journal published a number of articles about a leaked FTC document.
That PDF has all sorts of goodies in it about things like blocking competition, signing a low margin deal with AOL to keep monopoly marketshare (while also noting the general philosophy outside of a few key deals was to squeeze down on partners), scraping content and ratings from competing sites, Google forcibly inserting itself in certain verticals anytime select competitors ranked in the organic result set, etc.
"On Nov. 6, 2012, the night of Mr. Obama’s re-election, Mr. Schmidt was personally overseeing a voter-turnout software system for Mr. Obama. A few weeks later, Ms. Shelton and a senior antitrust lawyer at Google went to the White House to meet with one of Mr. Obama’s technology advisers. ... By the end of the month, the FTC had decided not to file an antitrust lawsuit against the company, according to the agency’s internal emails."
What is wild about the above leaked FTC document is it goes to great lengths to show an anti-competitive pattern of conduct toward the larger players in the ecosystem. Even if you ignore the distasteful political aspects of the FTC non-decision, the other potential out was:
"The distinction between harm to competitors and harm to competition is an important one: according to the modern interpretation of antitrust law, even if a business hurts individual competitors, it isn’t seen as breaking antitrust law unless it has also hurt the competitive process—that is, that it has taken actions that, for instance, raised prices or reduced choices, over all, for consumers." - Vauhini Vara
Part of the reason the data set was incomplete on that front was that, for the most part, only larger ecosystem players were consulted. Google engineers have gone on record stating they aim to break people's spirits in a game of psychological warfare. If that doesn't hinder consumer choice, what does?
@aaronwall rofl. Feed the dragon Honestly these G investigations need solid long term SEOs to testify as well as brands.— Rishi Lakhani (@rishil) April 2, 2015
When the EU published their statement of objections Google's response showed charts with the growth of Amazon and eBay as proof of a healthy ecosystem.
The market has been consolidated down into a few big winners which are still growing, but that in and of itself does not indicate a healthy nor neutral overall ecosystem.
The other obvious "untruth" hidden in the above Google chart is there is no way product searches on Google.com are included in Google's aggregate metrics. They are only counting some subset of them which click through a second vertical ad type while ignoring Google's broader impact via the combination of PLAs along with text-based AdWords ads and the knowledge graph, or even the recently rolled out rich product answer results.
Who could look at the following search result (during anti-trust competitive review no less) and say "yeah, that looks totally reasonable?"
Google has allegedly spent the last couple years removing "visual clutter" from the search results & yet they manage to produce SERPs looking like that - so long as the eye candy leads to clicks monetized directly by Google or other Google hosted pages.
The Search Results Become a Closed App Store
Search was an integral piece of the web which (in the past) put small companies on a level playing field with larger players.
"What kind of a system do you have when existing, large players are given a head start and other advantages over insurgents? I don’t know. But I do know it’s not the Internet." - Dave Pell
The above quote was about app stores, but it certainly parallels a rater system which enforces broken windows policing against smaller players while looking the other way on larger players, unless they are in a specific vertical Google itself decides to enter.
"That actually proves my point that they use Raters to rate search results. aka: it *is* operated manually in many (how high?) cases. There is a growing body of consensus that a major portion of Googles current "algo" consists of thousands of raters that score results for ranking purposes. The "algorithm" by machine, on the majority of results seen by a high percentage of people, is almost non-existent." ... "what is being implied by the FTC is that Googles criteria was: GoogleBot +10 all Yelp content (strip mine all Yelp reviews to build their database). GoogleSerps -10 all yelp content (downgrade them in the rankings and claim they aren't showing serps in serps). That is anticompetitive criteria that was manually set." - Brett Tabke
The remote rater guides were even more explicitly anti-competitive than what was detailed in the FTC report. For instance, requiring that hotel affiliate sites be rated as spam even if they are helpful, for no reason other than being affiliate sites.
Is Brand the Answer?
About 3 years ago I wrote a blog post about how branding plays into SEO & why it might peak. As much as I have been accused of having a cynical view, the biggest problem with my post was it was naively optimistic. I presumed Google's consolidation of markets would end up leading Google to alter their ranking approach when they were unable to overcome the established consensus bias which was subsidizing their competitors. The problem with my presumption is that Google's reliance on "data" was a chimera: when convenient (and profitable), data is discarded on an as-needed basis.
While Google was outright stealing third party content and putting it front & center on core keyword searches, they had to use "about 100 “synthetic queries”—queries that you would never expect a user to type" to smear Bing & even many of these queries did not show the alleged signal.
Here are some representative views of that incident:
"We look forward to competing with genuinely new search algorithms out there—algorithms built on core innovation, and not on recycled search results from a competitor. So to all the users out there looking for the most authentic, relevant search results, we encourage you to come directly to Google. And to those who have asked what we want out of all this, the answer is simple: we'd like for this practice to stop." - Google's Amit Singhal
“It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work. I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.” Amit Singhal, more explicitly.
"One comment that I’ve heard is that “it’s whiny for Google to complain about this.” I agree that’s a risk, but at the same time I think it’s important to go on the record about this." - Matt Cutts
"I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t." - Danny Sullivan
What is so crazy about the above quotes is Google engineers knew at the time what Google was doing with Google's scraping. I mentioned that contrast shortly after the above PR fiasco happened:
when popular vertical websites (that have invested a decade and millions of Dollars into building a community) complain about Google disintermediating them by scraping their reviews, Google responds by telling those webmasters to go pound sand & that if they don't want Google scraping them then they should just block Googlebot & kill their search rankings
"The bizrate/nextag/epinions pages are decently good results. They are usually well-format[t]ed, rarely broken, load quickly and usually on-topic. Raters tend to like them" ... which is why ... "Google repeatedly changed the instructions for raters until raters assessed Google's services favorably"
and while clamping down on those services ("business models to avoid") ... "Google elected to show its product search OneBox “regardless of the quality” of that result and despite “pretty terribly embarrassing failures” "
and since Google knew their offerings were vastly inferior, “most of us on geo [Google Local] think we won't win unless we can inject a lot more of local directly into google results” ... thus they added "a 'concurring sites' signal to bias ourselves toward triggering [display of a Google local service] when a local-oriented aggregator site (i.e. Citysearch) shows up in the web results”"
Google's justification for not being transparent is that "spammers" would take advantage of transparency to put inferior results front and center - the exact same thing Google does when it benefits the bottom line!
The following types of websites are likely to merit low landing page quality scores and may be difficult to advertise affordably. In addition, it's important for advertisers of these types of websites to adhere to our landing page quality guidelines regarding unique content.
eBook sites that show frequent ads
'Get rich quick' sites
Comparison shopping sites
Affiliates that don't comply with our affiliate guidelines
The anti-competitive conspiracy theory is no longer conspiracy, nor theory.
Google systematically positions and prominently displays its comparison shopping service in its general search results pages, irrespective of its merits. This conduct started in 2008.
Google does not apply to its own comparison shopping service the system of penalties, which it applies to other comparison shopping services on the basis of defined parameters, and which can lead to the lowering of the rank in which they appear in Google's general search results pages.
Froogle, Google's first comparison shopping service, did not benefit from any favourable treatment, and performed poorly.
As a result of Google's systematic favouring of its subsequent comparison shopping services "Google Product Search" and "Google Shopping", both experienced higher rates of growth, to the detriment of rival comparison shopping services.
Google's conduct has a negative impact on consumers and innovation. It means that users do not necessarily see the most relevant comparison shopping results in response to their queries, and that incentives to innovate from rivals are lowered as they know that however good their product, they will not benefit from the same prominence as Google's product.
Overcoming Consensus Bias
Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.
Eventually that subsidy would be a problem for Google if the algorithm were the only thing that mattered; however, if the entire result set itself can be displaced, then that subsidy doesn't really matter, as it can be retracted overnight.
Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is a vastly inferior option to other options in the marketplace.
That is how Google reinforces, then manages to overcome consensus bias.
Google recently added highlights at the bottom of various sections of their mobile search results. The highlights appear on ads, organic results, and other various vertical search insertion types. The colors vary arbitrarily by section and are patterned off the colors in the Google logo. Historically such borders have conveyed a meaning, like separating advertisements from organic search results, but now the colors have no meaning other than acting as a visual separator.
We recently surveyed users to see if they understood what the borders represented & if they felt the borders had any meaning. We did 4 surveys total. The first 2 allowed a user to select a choice from a drop down menu. The last two were open ended, where a user typed text into a box. For each of the 2 survey types, we did a survey of a SERP which had an ad in it & a survey of a SERP without an ad in it.
Below are the associated survey images & user results.
Google recently added colored bars at the bottom of some mobile search results. What do they mean?
answer option: first survey | second survey
none of the other options are correct: 27.7% (+2.7 / -2.5) | 29.9% (+2.8 / -2.7)
the listing is an advertisement: 25.8% (+2.8 / -2.6) | 30.1% (+2.8 / -2.7)
each color has a different meaning: 24% (+2.7 / -2.5) | 19.6% (+2.5 / -2.3)
colors separate sections but have no meaning: 15.5% (+2.4 / -2.1) | 12.5% (+2.1 / -1.9)
the listing is a free search result: 6.9% (+1.8 / -1.5) | 7.9% (+2.0 / -1.6)
Given there are 5 answer options, a uniform random distribution would put 20% on each. The only options which skewed well below that were the perceptions that the colored highlights either had no meaning or represented free/organic search results.
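The asymmetric margins in the survey readouts (e.g. 27.7% with +2.7 above but only -2.5 below) are the shape produced by a Wilson score interval on a binomial proportion, whose center is pulled slightly toward 50%. A sketch of that calculation; the sample size n is a hypothetical assumption, since the surveys above don't state it:

```python
# Wilson score interval for an observed proportion. Asymmetric bounds of this
# shape are consistent with the "+x / -y" margins in the survey readouts above.
# The sample size n=1000 is an assumption for illustration.
import math

def wilson_interval(p, n, z=1.96):
    """95% Wilson score interval for an observed proportion p with n responses."""
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

low, high = wilson_interval(p=0.277, n=1000)  # n is hypothetical
# For p below 50% the upper margin is slightly larger than the lower one,
# mirroring the asymmetry in the survey readouts.
print(f"-{(0.277 - low) * 100:.1f} / +{(high - 0.277) * 100:.1f}")  # → -2.7 / +2.9
```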
And here are images of what users saw for the above surveys:
For the second set of surveys we used an open ended format
The open ended questions allow a user to type in whatever they want. This means the results do not end up biased by the predefined answer options in a quiz, but it also means the results will include plenty of noise like...
people entering a, c, d, k, 1, 2, 3, ggg, hello, jj, blah, and who cares as answer choices
some of the responses referencing the listing topics
some of the responses referencing parts of a search result listing like the headlines or hyperlinks
some of the responses highlighting the colors of the bars
On each of the above word clouds, we used the default automated grouping. Here is an example of what the word cloud would look like if the results were grouped manually.
For a couple years Google has removed various forms of eye candy from many organic results (cutting back on video snippets, limiting rich rating snippets, removing authorship, etc.). The justification for such removals was to make the results feel "less cluttered." At the same time, Google has added a variety of the same types of "noisy" listing enhancements to their various ad programs.
What is the difference between reviews ad extensions, consumer ratings ad extensions, and seller ratings ad extensions? What is the difference between callout extensions and dynamic structured snippets?
Long ago AdWords advertisements had a border near them to separate them from the organic results. Those borders disappeared many years ago & only recently reappeared on mobile devices when they also appeared near organic listings. That in turn has left searchers confused as to what the border highlighting means.
According to the above Google survey results, the majority of users don't know what the colors signify, don't care what they signify, or think they indicate advertisements.
If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile with it accounting for about 10% of their online ad revenues.
Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google's mobile ad clicks:
While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume
Those ad networks want other publishers to make their sites mobile friendly for a couple reasons...
If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relative poor performance, particularly if they haven't spent resources optimizing user experience on the channel.
Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.
In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.
I wouldn't recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.
Those who ignore the warning might be in for significant pain.
Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
Another related issue referenced in the above post was tying in-app content to mobile search personalization:
Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. With the separate m. version approach, on the regular version of the site (say www.seobook.com) the webmaster adds an alternate reference in the head section of the document pointing at the mobile version.
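Using www.seobook.com as the example from the text (the m. subdomain and page path are illustrative), Google's documented annotations for separate mobile URLs pair an alternate reference on the desktop page with a canonical reference on the mobile page:

```html
<!-- On the desktop page (www.seobook.com/page): point to the mobile version -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.seobook.com/page">

<!-- On the mobile page (m.seobook.com/page): point back to the desktop version -->
<link rel="canonical" href="http://www.seobook.com/page">
```

The bidirectional annotation is what lets Google consolidate ranking signals across the two URLs rather than treating the m. version as duplicate content.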
With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.
3 or 4 years ago it was a toss-up as to which of these two options would win, but over time it appears responsive design is more likely to win out.
Here are a couple reasons responsive is likely to win out as a better solution:
If people share a mobile URL on Twitter, Facebook or other social networks, then anyone on a desktop computer who clicks on the shared m. version of the page sees a page with fewer ad units & less content. The publisher is providing a worse user experience & losing out on the incremental monetization the additional ad units would have provided.
While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on crossover devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
When Bing gave their best practices for mobile, they stated: "Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all." In that post Bing shows some examples of m. versions of sites ranking in their mobile search results. However, for smaller & lesser-known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even a heavily optimized m. version isn't the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidating ranking signals.
In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which returns different HTML on the same URL based on the User-Agent request header & uses the Vary HTTP response header to signal that to crawlers and caches. For large, complex & high-traffic sites (and sites with numerous staff programmers & designers) dynamic serving is perhaps the best optimization solution because you can optimize the images and code to lower bandwidth costs and response times. Most smaller sites will likely rely on responsive design rather than dynamic serving, in large part because it is quicker & cheaper to implement, and most are not running sites large enough for the incremental bandwidth to be a significant expense to their business.
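As an illustrative sketch of a dynamic serving exchange (the host and path are hypothetical; the header names are standard HTTP), the same URL returns a mobile layout to a phone, and the Vary header tells caches & crawlers that the response differs by user-agent:

```http
GET /article HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) ...

HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent

<!-- same URL, but the server chose the mobile layout based on the user-agent -->
```

Without the Vary header, an intermediate cache could serve the mobile HTML to a desktop visitor (or vice versa), and Googlebot may not realize it needs to crawl the page with both desktop and smartphone user-agents.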
Solutions for Quickly Implementing Responsive Design
New Theme / Design
If your site hasn't been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.
Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to make at least a few minor changes when you put the designs live to compensate for issues like third party ad units.
If you have an existing Wordpress theme, see if you can zip it up and send it to them; otherwise they may build your new theme as a child theme of 2015 or such. If you struggle to get them to convert your Wordpress theme directly, another option is to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.
For simple hand rolled designs there are a variety of grid generator tools, which can make it reasonably easy to replace some old school table-based designs with divs. Many of the themes for sale in marketplaces like Theme Forest also use a multi-column div grid based system.
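A minimal sketch of the kind of div-based grid such generators produce (class names and the 600px breakpoint are made up for illustration): two floated columns on wide screens, with a media query stacking them on narrow ones.

```html
<style>
  .row { overflow: hidden; }
  .col { float: left; width: 50%; box-sizing: border-box; padding: 10px; }
  /* below 600px, stack the columns instead of floating them side by side */
  @media (max-width: 600px) {
    .col { float: none; width: 100%; }
  }
</style>
<div class="row">
  <div class="col">Main content</div>
  <div class="col">Sidebar</div>
</div>
```

Replacing each table row with a `.row` div and each cell with a `.col` div is usually the bulk of the work when retiring an old table-based layout.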
Other Things to Look Out For
Third Party Plug-ins & Ad Code Gotchas
Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units to different screen sizes & skip showing some ad units on smaller screens. An AdSense code example is included in an expandable section at the bottom of this page.
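Google's documented pattern for manually sizing responsive AdSense units relies on CSS media queries around the ad slot; the publisher ID and slot ID below are placeholders, and the breakpoints are illustrative:

```html
<style>
  .adslot_1 { display: inline-block; width: 320px; height: 100px; }
  /* hide this unit entirely on very small screens */
  @media (max-width: 400px) { .adslot_1 { display: none; } }
  @media (min-width: 500px) { .adslot_1 { width: 468px; height: 60px; } }
  @media (min-width: 800px) { .adslot_1 { width: 728px; height: 90px; } }
</style>
<ins class="adsbygoogle adslot_1"
     data-ad-client="ca-pub-1234567890123456"
     data-ad-slot="1234567890"></ins>
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<script>(adsbygoogle = window.adsbygoogle || []).push({});</script>
```

Check AdSense program policies before hiding units, as the CSS approach above is the method Google itself documents for suppressing an ad on small screens.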
Back up your old site before putting the new site live.
For static HTML sites or sites with PHP or SHTML includes & such...
Download a copy of your existing site to local.
Rename that folder to something like sitename.com-OLDVERSION
Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process you can rename the new site design to something like sitename.com-HOSED, then rename the sitename.com-OLDVERSION folder back to sitename.com to quickly restore the site.
Download your site to local again.
Ensure your new site design is using a different CSS folder or CSS filename so the old and new versions of the design can be live at the same time while you are editing the site.
Create a test file with the responsive design on your site & test that page until things work well enough.
Once you have the general "what needs to change in each file" down, use find & replace to bulk edit the remaining files and make them responsive.
Use a tool like FileZilla to quickly bulk upload the files.
Look through key pages; if there are only a few minor errors, fix them and re-upload. If things are majorly screwed up, revert to the old design being live and schedule a do-over on the upgrade.
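The bulk find & replace step above can be sketched from the command line; the directory, file, and stylesheet names here are made up for illustration, and `sed -i` assumes GNU sed (on macOS use `sed -i ''`):

```shell
# Set up a throwaway demo site with two pages referencing the old stylesheet
rm -rf /tmp/site-demo && mkdir -p /tmp/site-demo
printf '<link rel="stylesheet" href="old-style.css">\n' > /tmp/site-demo/index.html
printf '<link rel="stylesheet" href="old-style.css">\n' > /tmp/site-demo/about.html

# Find every HTML file referencing the old stylesheet & swap in the new one
grep -rl 'old-style.css' /tmp/site-demo --include='*.html' \
  | xargs sed -i 's/old-style\.css/responsive-style\.css/g'

grep -c 'responsive-style.css' /tmp/site-demo/index.html   # → 1
```

Running the `grep -rl` half on its own first is a cheap dry run: it lists exactly which files the `sed` would touch before you commit to the edit.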
If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
If you want to view your top pages you could export that data from your web analytics to verify all those pages look good. If you wanted to view every page of your site 1 at a time after the change, you could use a tool like Xenu Link Sleuth or Screaming Frog SEO Spider to crawl your site & export a list of URLs into a spreadsheet. Then you could take the URLs from that spreadsheet and put them a chunk at a time into a tool like URL Opener.
If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower stress approach, you could create a test site on another domain name for testing purposes. Just be sure to keep it out of search engines while testing: block crawling via robots.txt, add a noindex meta robots tag, or password protect access to the site. When you get things worked out on it, make sure your internal links are referencing the correct domain name, and that you have removed any block via robots.txt or password protection.
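For the test domain, a blanket robots.txt block is the simplest way to keep crawlers away while you experiment (and the easiest thing to forget to remove at launch):

```
User-agent: *
Disallow: /
```

Note that Disallow prevents crawling rather than guaranteeing removal from the index, so password protection remains the more airtight option for a staging site.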
For a site with a CMS the above process is basically the same, aside from how you create the backup. If you are uploading a Wordpress or Drupal theme, change the name at least slightly so both the old and new designs can be live at the same time, letting you quickly switch back to the old design if you need to.
If you have a mixed site with Wordpress & static files or such then it might make sense to test changing the static files first, get those to work well & then create a Wordpress theme after that.
Google systems have tested 2,790 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 2,790 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.