Responsive Design for Mobile SEO

Why Is Mobile So Important?

If you look only at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half of its traffic from mobile, while mobile accounted for only about 10% of its online ad revenue.

Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google's mobile ad clicks:

While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume

Those ad networks want other publishers to make their sites mobile friendly for a couple reasons...

  • If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
  • If mobile is emphasized in importance, then those who are critical of the channel's value may eat some of the blame for relatively poor performance, particularly if they haven't spent resources optimizing the user experience on it.

Further Elevating the Importance of Mobile

Google has hinted at the importance of having a mobile-friendly design: labeling mobile-friendly sites, testing labels for slow sites & offering tools to test how mobile friendly a site design is.

Today Google put out an APB warning they are going to increase the importance of mobile friendly website design:

Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.

In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.

I wouldn't recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.

Those who ignore the warning might be in for significant pain.

Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.

Another related issue referenced in the above post was tying in-app content to mobile search personalization:

Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
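For sites with a corresponding app, the App Indexing markup referenced above ties a web page to its in-app equivalent via a link element in the page head. A rough sketch, with an illustrative package name & path (the step-by-step guide Google points to covers the full setup):

<link rel="alternate" href="android-app://com.example.android/http/example.com/some-page" />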

Google also announced today they are extending AdWords-styled ads to their Google Play search results, so they now have a direct economic incentive to allow app activity to bleed into their organic ranking factors.

m. Versus Responsive Design

Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. With the m. approach, on the regular version of the site (say www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.seobook.com/" >

...and then on the mobile version, they would rel=canonical it back to the desktop version, like so...

<link rel="canonical" href="http://www.seobook.com/" >

With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.

3 or 4 years ago it was a toss-up as to which of these 2 options would win, but over time it appears the responsive design option is more likely to win out.

2017 Update: Google confirmed in June of 2017 that they prefer responsive designs, as in 2018 they intend to move to a mobile-first index of the web. In the past the m. version was simply a relational pointer, but once the mobile version of a page becomes the default version, any content it lacks relative to the desktop version will not be in the search index on mobile or desktop.

Here are a couple reasons responsive is likely to win out as a better solution:

  • If people share a mobile-friendly URL on Twitter, Facebook or other social networks & the URL changes, then someone on a desktop computer who clicks the shared m. version lands on a page with fewer ad units & less content. The publisher is providing a worse user experience & losing out on the incremental monetization they would have achieved with the additional ad units.
  • While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
  • Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on crossover devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
  • When Bing gave their best practices for mobile, they stated: "Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all." In that post Bing shows some examples of m. versions of sites ranking in their mobile search results; however, for smaller & lesser-known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you heavily optimize the m. version of your site, it isn't the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidating ranking signals.

In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which detects the user-agent server-side to drive the layout & uses the Vary HTTP header to signal that the response changes by device. For large, complex & high-traffic sites (and sites with numerous staff programmers & designers), dynamic serving is perhaps the best optimization solution because you can optimize the images and code to lower bandwidth costs and response times. Most smaller sites will likely rely on responsive design rather than dynamic serving, in large part because it is quicker & cheaper to implement, and most are not running sites large enough that the incremental bandwidth is a significant expense to their business.
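As a minimal sketch of the dynamic serving setup, the server sniffs the user-agent, returns different HTML for the same URL, and includes a Vary header in the response so caches & crawlers know the output differs by device (the exact header set will depend on your server and caching configuration):

HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent

The HTML body itself is then generated server-side based on the detected device class.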

Solutions for Quickly Implementing Responsive Design

New Theme / Design

If your site hasn't been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.

Child Themes

Some of the default Wordpress themes are responsive. Creating a child theme is quite easy. The popular Thesis and Studiopress frameworks also offer responsive skins.

PSD to HTML / HTML to Responsive HTML

Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to make at least a few minor changes when you put the designs live to compensate for things like third party ad units and other minor issues.

If you have an existing Wordpress theme, you might want to see if you can zip it up and send it to them; otherwise they may build your new theme as a child theme of a default like 2015. If you are struggling to get them to convert your Wordpress theme over (e.g. they keep converting it to a child theme of 2015 or such), another option would be to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.

For simple hand rolled designs there are a variety of grid generator tools, which can make it reasonably easy to replace some old school table-based designs with divs. Many of the themes for sale in marketplaces like Theme Forest also use a multi-column div grid based system.
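As a rough sketch of the idea (hypothetical class names), a simple two-column div layout can collapse to a single column on small screens with a media query:

<style type="text/css">
/* two columns side by side on larger screens */
.col { float: left; width: 50%; }
/* stack the columns into a single full-width column on small screens */
@media only screen and (max-width: 600px) {
     .col { float: none; width: 100%; }
}
</style>
<div class="col">Main content</div>
<div class="col">Sidebar</div>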

Other Things to Look Out For

Third Party Plug-ins & Ad Code Gotchas

Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units at different screen sizes & skip showing some ad units on smaller screens. Here is an example of that sort of AdSense code:

<style type="text/css">
/* default ad size, used between the breakpoints below */
.adslot_1 { display: inline-block; width: 320px; height: 50px; }
/* hide the ad unit entirely on very small screens */
@media (max-width: 400px) { .adslot_1 { display: none; } }
/* show progressively larger banner sizes as the screen widens */
@media (min-width: 500px) { .adslot_1 { width: 468px; height: 60px; } }
@media (min-width: 800px) { .adslot_1 { width: 728px; height: 90px; } }
</style>
<ins class="adsbygoogle adslot_1"
     data-ad-client="ca-pub-1234"
     data-ad-slot="5678"></ins>
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<script>(adsbygoogle = window.adsbygoogle || []).push({});</script>

For other ads which don't have a "mobile friendly" option, you could use CSS to either hide the ad unit (display: none) or let it overflow with scrolling, using code like either of the following.

hide it:

@media only screen and (max-width: ___px) {
     .bannerad {
          display: none;
     }
}

overflow it:

@media only screen and (max-width: ___px) {
     .ad-unit {
          max-width: ___px;
          overflow: scroll;
     }
}

Images are another tricky point. The following CSS lets images shrink to fit the width of their container while keeping their aspect ratio:

img {
height:auto;
max-width:100%;
}
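Along with flexible images, responsive layouts generally depend on the viewport meta tag in the head of the page, so phones report their actual device width to your media queries instead of rendering the page zoomed out at a desktop width:

<meta name="viewport" content="width=device-width, initial-scale=1">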

Here are a few other general responsive CSS tricks.

Before Putting Your New Responsive Site Live...

Back up your old site before putting the new site live.

For static HTML sites or sites with PHP or SHTML includes & such...

  • Download a copy of your existing site to local.
  • Rename that folder to something like sitename.com-OLDVERSION
  • Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process, you can rename the new site design folder to something like sitename.com-HOSED and then rename the sitename.com-OLDVERSION folder back to sitename.com to quickly restore the site.
  • Download your site to local again.
  • Ensure your new site design is using a different CSS folder or CSS filename such that the old and new versions of the design can be live at the same time while you are editing the site.
  • Create a test file with the responsive design on your site & test that page until things work well enough.
  • Once that page works well enough, test changing your homepage over to the new design, then save and upload it to verify it works properly. In addition to using your cell phone, you could see how it looks on a variety of devices via the mobile device emulation built into Chrome's developer tools, or a wide array of third party tools like: MobileTest.me, MobileMoxy Device Emulator, iPadPeek, Mobile Phone Emulator, Browshot, Matt Kersley's responsive web design testing tool, BrowserStack, Cross Browser Testing, & the W3C mobileOK Checker. Paid services like Keynote offer manual testing rather than emulation on a wide variety of devices. There is also paid downloadable desktop emulation software like Multi-browser view.
  • Once you have a handle on what needs to change in each file, use find & replace to bulk edit the remaining files and make them responsive.
  • Use a tool like FileZilla to quickly bulk upload the files.
  • Look through key pages and if there are only a few minor errors then fix them and re-upload them. If things are majorly screwed up then revert to the old design being live and schedule a do over on the upgrade.
  • If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
  • If you want to check your top pages, you could export that list from your web analytics and verify those pages look good. If you wanted to view every page of your site one at a time after the change, you could use a tool like Xenu Link Sleuth or Screaming Frog SEO Spider to crawl your site & export a list of URLs into a spreadsheet. Then you could paste the URLs from that spreadsheet a chunk at a time into a tool like URL Opener.

If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower-stress approach, you could create a test site on another domain name for testing purposes. Just be sure to keep search engines out while testing (via a robots.txt disallow, a noindex meta tag, or password protected access). When you get things worked out, make sure your internal links reference the correct domain name, and that you have removed any robots.txt block, noindex tag, or password protection.
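For example, keeping crawlers off the test domain can be as simple as either of the following (a minimal sketch; remember to remove whichever you use before the design moves to the live domain):

# robots.txt on the test domain only
User-agent: *
Disallow: /

<!-- or, in the head of each test page -->
<meta name="robots" content="noindex">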

For a site with a CMS the above process is basically the same, except you may need to create the backup differently. If you are uploading a Wordpress or Drupal theme, then change the name at least slightly so you can keep the old and new designs live at the same time and quickly switch back to the old design if you need to.

If you have a mixed site with Wordpress & static files or such then it might make sense to test changing the static files first, get those to work well & then create a Wordpress theme after that.

Update: at a conference a Googler named Zineb Ait Bahajji recently stated they expect this update to impact more sites than Panda and Penguin have. And Google recently started sending out mobile usability warning messages in bulk:

Google systems have tested 2,790 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 2,790 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.

You Can't Copyright Facts

The Facts of Life

When Google introduced the knowledge graph one of their underlying messages behind it was "you can't copyright facts."

Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.

When you search for love quotes, Google pulls one into their site & then provides another "try again" link.

Since quotes mostly come from third parties, they are not owned by BrainyQuote and other similar sites. But here is the thing: if the sites which pay to organize and verify such collections have their economics sufficiently undermined, then they go away & Google isn't able to pull the quotes into the search results either.

The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn't be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.

Never trust a corporation to do a librarian's job.

What's Behind Door Number One?

Google has also done the above quote-like "action item" types of onebox listings in other areas like software downloads.

Where there are multiple versions of the software available, Google arbitrarily selects the download page, even though a software publisher might have a parallel SaaS option or other complex funnels based on a person's location or status as a student or such.

Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here's a blog post from Malwarebytes referencing

  • their software being advertised against their brand term in Google via AdWords ads which infringed their trademark and bundled the software with adware
  • numerous user complaints they received about the bundleware
  • required legal actions they took to take the bundler offline

Brands are forced to buy their own brand equity before, during & after the purchase, or Google partners with parasites to monetize the brand equity:

The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
...
The ads themselves said things like “McAfee Support - Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
...
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks

Since Google requires Chrome extensions to be installed from their own website it makes it hard (for anyone other than Google) to monetize them, which in turn makes it appealing for people to sell the add-ons to malware bundlers. Android apps in the Google Play store are yet another "open" malware ecosystem.

FACT: This Isn't About Facts

Google started the knowledge graph & onebox listings on some utterly banal topics which were easy for a computer to get right, though their ambitions vastly exceed the starting point. The starting point was done where it was because it was low-risk and easy.

When Google's evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that...

Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”

Even as the people who routinely shill for Google parrot the "you can't copyright facts" mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.

Even if You Have Copyright...

What makes the "you can't copyright facts" line so particularly disingenuous was Google's support of piracy when they purchased YouTube:

cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)

To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn't so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: "one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search."

Altruism vs Disruption for the Sake of it

Whenever Google implements a new feature they can choose not to monetize it so as to claim they are benevolent and doing it for users without commercial interest. But that same "unmonetized & for the users" claim was also used for their shopping search vertical, until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.

There was literally no transition period.

Many of the "informational" knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.

Health is Wealth

Google recently went big on the knowledge graph by jumping head first into the health vertical:

starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.


Google's links to the Mayo Clinic in their knowledge graph are, once again, a light gray font.

In case you didn't find enough background in Google's announcement article, Greg Sterling shared more of Google's views here. A couple notable quotes from Greg...

Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.

  • Google doesn't need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
  • If this is done out of benevolence, it will appear *above* the AdWords ads on the search results — unlike almost every type of onebox or knowledge graph result Google offers.
  • If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those "cynics" to label Greg Sterling as a shill.

Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.

By defunding and displacing something they don't improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.

If their traffic drops and they don't do more with less, then...

  • their margins will fall
  • growth slows (or they may even shrink)
  • their stock price will tank
  • management will get fired & replaced and/or they will get taken private by private equity investors and/or they will need to do some "bet the company" moves to find growth elsewhere (and hope Google doesn't enter that parallel area anytime soon)

When the numbers don't work, publishers need to cut back or cut corners.

Things get monetized directly, monetized indirectly, or they disappear.

Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly in ad units, auto play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.

Publishers who were facing an "oh crap" moment when seeing print dollars turn into digital dimes are having round two as they see those digital dimes turn into mobile pennies:

At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.

If we lose some diversity in news it isn't great, though it isn't the end of the world. But what makes health such an important area is that it is literally a matter of life & death.

Its importance & the amount of money flowing through the market ensures there is heavy investment in misinforming the general population. The corruption is so bad some people (who should know better) instead fault science.

... and, only those who hate free speech, capitalism, democracy & the country could possibly have anything negative to say about it. :D

Not to worry though. Any user trust built through the health knowledge graph can be monetized through a variety of other fantastic benevolent offers.

Once again, Google puts the user first.

Peak Google? Not Even Close

Search vs Native Ads

Google owns search, but are they a one trick pony?

A couple weeks ago Ben Thompson published an interesting article suggesting Google may follow IBM and Microsoft in peaking, perhaps with native ads becoming more dominant than online search ads.

According to Forrester, in a couple years digital ad spend will overtake TV ad spend. In spite of the rise of sponsored content, native isn't even broken out as a category.

Part of the issue with native advertising is that some of it is hard to break out cleanly. Some of it is obvious, but falls into multiple categories, like video ads on YouTube. Some of it is obvious, but relatively new & thus lacking in scale. Amazon is extending their payment services & Prime shipping deals to third party sites of brands like AllSaints & listing inventory from those sites on Amazon.com, selling them traffic on a CPC basis. Does that count as native advertising? What about a ticket broker or hotel booking site syndicating their inventory to a meta search site?

And while native is not broken out, Google already offers native ad management features in DoubleClick and has partnered with some of the more well known players like BuzzFeed.

The Penny Gap's Impact on Search

Time intends to test paywalls on all of its major titles next year & they are working with third parties to integrate affiliate ads on sites like People.com.

The second link in the above sentence goes to an article which is behind a paywall. On Twitter I often link to WSJ articles which are behind a paywall. Any important information behind a paywall may quickly spread beyond it, but typically a competing free site which (re)reports on whatever is behind the paywall is shared more, spreads further on social, generates more additional coverage on forums and discussion sites like Hacker News, gets highlighted on aggregators like TechMeme, gets more links, ranks higher, and becomes the default/canonical source of the story.

Part of the rub of the penny gap is the cost of the friction vastly exceeds the financial cost. Those who can flow attention around the payment can typically make more by tracking and monetizing user behavior than they could by charging users incrementally a cent here and a nickel there.

Well known franchises are forced to offer a free version or they eventually cede their market position.

There are sites which do roll up subscriptions to a variety of sites at once, but some of them which had stub articles requiring payment to access, like Highbeam Research, got torched by Panda. If the barrier to entry to get to the content is too high, the engagement metrics are likely to be terrible & a penalty ensues. Even a general registration wall is too high a barrier to entry for some sites. Google demands whatever content is shown to them be visible to end users & if there is a mismatch that is considered cloaking - unless the mismatch is due to monetizing by using Google's content locking consumer surveys.

Who gets to the scale needed to have enough consumer demand to be able to charge an ongoing subscription for access to a variety of third party content? There are a handful of players in music (Apple, Spotify, Pandora, etc.) & a handful of players in video (Netflix, Hulu, Amazon Prime), but outside of those most paid subscription sites are about finance or niche topics with small scale. And whatever goes behind the paywalls gets seen by almost nobody compared against the broader public market at the free price point.

Even if you are in a broad niche industry where a subscription-based model works, it still may be brutally tough to compete against Google. Google's chief business officer joined the board of Spotify, which means Spotify should be safe from Google risk, except...

Google's Impact on Premium Content

I've long argued Google has leveraged piracy to secure favorable media deals (see the second bullet point at the bottom of this infographic). Some might have perceived my take as being cynical, but when Google mentioned their "continued progress on fighting piracy" the first thing they mentioned was more ad units.

There are free options, paid options & the blurry lines in between which Google & YouTube ride while they experiment with business models and monetize the flow of traffic to the paid options.

"Tech companies don’t believe in the unique value of premium content over the long term." - Jessica Lessin

There is a massive misalignment of values which causes many players to have to refine their strategy over and over again. The gray area is never static.

Many businesses only have a 10% or 15% profit margin. An online publishing company which sees 20% of its traffic disappear might thus go from sustainable to bleeding cash overnight. A company which can arbitrarily shift / redirect a large amount of traffic online might describe itself as a "kingmaker."

In Germany some publishers wanted to be paid to be in the Google index. As a result Google stopped showing snippets near their listings. Google also refined their news search integration into the regular search results to include a broader selection of sources including social sites like Reddit. As a result Axel Springer quickly found itself begging for things to go back to the way they were before as their Google search traffic declined 40% and their Google News traffic declined 80%. Axel Springer got their descriptions back, but the "in the news" change remains.

Google's Impact on Weaker Players

If Google could have that dramatic of an impact on Axel Springer, imagine what sort of influence they have on millions of other smaller and weaker online businesses.

One of the craziest examples is Demand Media.

Demand Media's market cap peaked above $1.9 billion. They spun out the domain name portion of the business into a company named Rightside Group, but the content portion of the business is valued at essentially nothing. They have about $40 million in cash & equivalents. Earlier this year they acquired Saatchi Art for $17 million & last year they acquired ecommerce marketplace Society6 for $94 million. After their last quarterly result their stock closed down 16.83% & Thursday they were down another 6.32%, giving them a market capitalization of $102 million.

On their most recent conference call, here are some of the things Demand Media executives stated:

  • By the end of 2014, we anticipate more than 50,000 articles will be substantially improved by rewrites made rich with great visuals.
  • We are well underway with this push for quality and will remove 1.8 million underperforming articles in Q4
  • as we strive to create the best experience we can we have removed two ad units with the third unit to be removed completely by January 1st
  • (on the above 2 changes) These changes are expected to have a negative impact on revenues and adjusted EBITDA of approximately $15 million on an annualized basis.
  • Through Q3 we have invested $1.1 million in content renovation costs and expect approximately another $1 million in Q4 and $2 million to $4 million in the first half of next year.
  • if you look at visits or you know the mobile mix is growing which has lower CPM associated with it and then also on desktop we're seeing compression on the pricing line as well.
  • we know that sites that have ad density that's too high, not only are they offending audiences in near term, you are penalized with respect to where you show up in search indexes as well.

Google torched eHow in April of 2011. In spite of over 3 years of pain, Demand Media is still letting Google drive their strategy, in some cases spending millions of dollars to undo past investments.

Yet when you look at Google's search results page, they are doing the opposite of the above strategy: more scraping of third party content coupled with more & larger ad units.

Originally the source links in the scrape-n-displace program were light gray. They only turned blue after a journalist started working on a story about 10 blue links.

The Blend

The search results can be designed to have some aspects blend in while others stand out. Colors can change periodically to alter response rates in desirable ways.

The banner ad got a bad rap as publishers have fought declining CPMs by adding more advertisements to their pages. When it works, Google's infrastructure still delivers (and tracks) billions of daily banner ads.

Search ads have never had the performance decline banner ads have.

The closest thing Google ever faced on that front was when AdBlock Plus became popular. Since it was blocking search ads, Google banned it & then restored it, eventually negotiating a deal to pay AdBlock Plus to display Google's ads while it continued to block ads on other third party sites.

Search itself *is* the ultimate native advertising platform.

Google is doing away with the local carousel in favor of a 3 pack local listing in categories like hotels. Once a person clicks on one of the hotel listings, Google then inserts an inline knowledge graph listing for that hotel with booking affiliate links inline in the search results, displacing the organic result set below the fold.

Notice in the above graphic how the "website" link uses generic text, is aligned toward the right, and is right next to an image so that it looks like an ad. It is engineered to feel like an ad and be ignored. The actual ads are left aligned and look like regular text links. They have an ad label, but that label is a couple lines up from them & there are multiple horizontal lines between the label and the actual ad units.

Not only does Google have the ability to shift the layout in such a drastic format, but then with whatever remains they also get to determine who they act against & who they don't. While the SEO industry debates the "ethics" of various marketing techniques Google has their eye on the prize & is displacing the entire ecosystem wholesale.

Users were generally unable to distinguish between ads and organic listings *before* Google started mixing the two in their knowledge graph. That is a big part of the reason search ads have never seen the engagement declines banner ads have seen.

Mobile has been redesigned with the same thinking in mind. Google action items (which can eventually be monetized) up top & everything else pushed down the page.

The blurring of "knowledge" and ads allows Google to test entering category after category (like doctor calls from the search results) & forcing advertisers to pay for the various tests while Google collects data.

And as Google is scraping information from third party sites, they can choose to show less information on their own site if doing so creates additional monetization opportunities. As far back as 2009 Google stripped phone numbers off of non-sponsored map listings. And what happened with the recent 3 pack? While 100% of the above the fold results are monetized, ...

"Go back to an original search that turns up the 3 PAC. Its completely devoid of logical information that a searcher would want:

  • No phone number
  • No address
  • No map
  • NO LINK to the restaurant website.

Anything that most users would want is deliberately hidden and/or requires more clicks." - Dave Oremland

Google justifies their scrape-n-displace programs by claiming they put users first. Then they hide some of the information to drive incremental monetization opportunities. Google may eventually re-add some of those basic features which are now hidden, but as part of sponsored local listings.

After all - ads are the solution to everything.

Do branded banner ads in the search results have a low response rate? Or are advertisers unwilling to pay a significant sum for them? If so, Google can end the test and instead shift to include a product carousel in the search results, driving traffic to Google Shopping.

"I see this as yet another money grab by Google. Our clients typically pay 400-500% more for PLA clicks than for clicks on their PPC Brand ads. We will implement exact match brand negatives in Shopping campaigns." - Barb Young

That money grab stuff has virtually no limit.

The Click Casino

From the start, keywords defaulted to broad match. Then campaigns went "enhanced" so advertisers were forced to eat lower quality clicks on mobile devices.

Then there was the blurring of exact match targeting into something else, forcing advertisers to buy lower quality variations of searches & making them add tons of negative keywords (and keep eating new garbage re-interpretations of words) in order to run a fine-tuned campaign specifically targeted against a term.

In the past some PPC folks cheered obfuscation of organic search, thinking "this will never happen to me."

Oops.

And of course Google not only wants to be the ad auction, but they want to be your SEM platform managing your spend & they are suggesting you can leverage the "power" of automated auction-time bidding.

Advertisers RAVE about the success of Google's automatic bidding features: "It received one click. That click cost $362.63."

The only thing better than that is banner ads in free mobile tap games targeted at children.

Adding Friction

Above I mentioned how Google arbitrarily banned the AdBlock Plus extension from the Play store. They also repeatedly banned Disconnect Mobile. If you depend on mobile phones for distribution it is hard to get around Google. What's more they also collect performance data, internally launch competing apps, and invest in third party apps. And they control the prices various apps pay for advertising across their broad network.

So maybe you say "ok, I'll skip search & mobile, I'll try to leverage email" but this gets back to the same issue again. In Gmail social or promotional emails get quarantined into a ghetto where they are rarely seen:

"online stores, if they get big enough, can act as chokepoints. And so can Google. ... Google unilaterally misfiled my daily blog into the promotions folder they created, and I have no recourse and no way (other than this post) to explain the error to them" - Seth Godin

Those friction adders have real world consequences. A year ago Groupon blamed Gmail's tabs for causing them to have poor quarterly results. The filtering impact on a start-up can be even more extreme. A small shift in exposure can lower the K factor (viral coefficient) to something below 1 & require startups to buy exposure rather than generating it virally.

In addition to those other tabs, there are a host of other risks like being labeled as spam or having a security warning. Few sites are as widely read inside the Googleplex as Search Engine Land, yet at one point even their newsletter was showing a warning in Gmail.

Google can also add friction to

  • websites using search rankings, vertical search result displacement, hiding local business information (as referenced above), search query suggestions, and/or leveraging their web browser to redirect consumer intent
  • video on YouTube by counting ad views as organic views, changing the relevancy metrics, and investing in competing channels & giving them preferential exposure as part of the deal. YouTube gets over half their views on mobile devices with small screens, so any shift in Google's rank preferences is going to cause a major shift in click distributions.
  • mobile apps using default bundling agreements which require manufacturers to set Google's apps as defaults
  • other business models by banning bundling-based business models too similar to their own (bundling is wrong UNLESS it is Google doing it)
  • etc.

The launch of Keyword (not provided) which hid organic search keyword data was friction for the sake of it in organic search. When Google announced HTTPS as a ranking signal, Netmeg wrote: "It's about ad targeting, and who gets to profile you, and who doesn't. Mark my words."

Facebook announced their relaunch of Atlas and Google immediately started cracking down on data access:

In the conversations this week, with companies like Krux, BlueKai and Lotame, Google told data management platform players that they could not use pixels in certain ads. The pixels—embedded within digital ads—help marketers target and understand how many times a given user has seen their messages online.

"Google is only allowing data management platforms to fire pixels on creative assets that they're serving, on impressions they bought, through the Google Display Network," said Mike Moreau, chief solutions officer at Krux. "So they're starting with a very narrow scope."

Around the same time Google was cracking down on data sharing, they began offering features targeting consumers across devices & announced custom affinity audiences which allow advertisers to target audiences who visit any particular website.

Google's special role is not only as an organizer (and obfuscator) of information; they also get to be the source measuring how well marketing works via their analytics, which can regularly launch new reports that casually over-represent their own contribution while under-representing some other channels, profiting from activity bias. The industry default of last click attribution driving search ad spending is one of the key issues which has driven down display ad values over the years.

Investing in Competition

Google not only ranks the ecosystem, but they actively invest in it.

Google tried to buy Yelp. When Facebook took off Google invested in Zynga to get access to data, in spite of a sketchy background. When Google's $6 billion offer for Groupon didn't close the deal, Google quickly partnered with over a dozen Groupon competitors & created new offer ad units in the search results.

Inside of the YouTube ecosystem Google also holds equity stakes in leading publishers like Machinima and Vevo.

There have been a few examples of investments getting special treatment, getting benefit of the doubt, or access to non-public information.

The scary scenario for publishers might sound something like this: "in Baidu Maps you can find a hotel, check room availability, and make a booking, all inside the app." There's no need to leave the search engine.

Take a closer look & that scary version might already be here. Google's same day delivery boss moved to Uber and Google added Uber pickups and price estimates to their mobile Maps app.

Google, of course, also invested in Uber. It would be hard to argue that Uber is anything but successful. Though it is also worth mentioning that winning at any cost often creates losses elsewhere.

Google invests in disruption as though disruption is its own virtue & they leverage their internal data to drive the investments:

“If you can’t measure and quantify it, how can you hope to start working on a solution?” said Bill Maris, managing partner of Google Ventures. “We have access to the world’s largest data sets you can imagine, our cloud computer infrastructure is the biggest ever. It would be foolish to just go out and make gut investments.”

Combining usage data from their search engine, web browser, app store & mobile OS gives them unparalleled insights into almost any business.

Google is one of the few companies which can make multi-billion dollar investments in low margin areas, just for the data:

Google executives are prodding their engineers to make its public cloud as fast and powerful as the data centers that run its own apps. That push, along with other sales and technology initiatives, aren’t just about grabbing a share of growing cloud revenue. Executives increasingly believe such services could give Google insights about what to build next, what companies to buy and other consumer preferences

Google committed to spending as much as a half billion dollars promoting their shopping express delivery service.

Google's fiber push now includes offering business internet services. Elon Musk is looking into offering satellite internet services - with an ex-Googler.

The End Game

Google now spends more than any other company on federal lobbying in the US. A steady stream of Google executives have filled US government roles like deputy chief technology officer, chief technology officer, and head of the patent and trademark office. A Google software engineer went so far as to suggest President Obama:

  • Retire all government employees with full pensions.
  • Transfer administrative authority to the tech industry.
  • Appoint Eric Schmidt CEO of America.

That Googler may be crazy or a troll, but even if we don't get their nightmare scenario, if the regulators come from a particular company, that company is unlikely to end up hurt by regulations.

President Obama has stated the importance of an open internet: “We cannot allow Internet service providers to restrict the best access or to pick winners and losers in the online marketplace for services and ideas.”

If there are relevant complaints about Google, who will hear them when Googlers head key government roles?

Larry Page was recently labeled businessperson of the year by Fortune:

It’s a powerful example of how Page pushes the world around him into his vision of the future. “The breadth of things that he is taking on is staggering,” says Ben Horowitz, of Andreessen Horowitz. “We have not seen that kind of business leader since Thomas Edison at GE or David Packard at HP.”

A recent interview of Larry Page in the Financial Times echoes the theme of limitless ambition:

  • "the world’s most powerful internet company is ready to trade the cash from its search engine monopoly for a slice of the next century’s technological bonanza." ... "As Page sees it, it all comes down to ambition – a commodity of which the world simply doesn’t have a large enough supply."
  • “I think people see the disruption but they don’t really see the positive,” says Page. “They don’t see it as a life-changing kind of thing . . . I think the problem has been people don’t feel they are participating in it.”
  • “Even if there’s going to be a disruption on people’s jobs, in the short term that’s likely to be made up by the decreasing cost of things we need, which I think is really important and not being talked about.”
  • "in a capitalist system, he suggests, the elimination of inefficiency through technology has to be pursued to its logical conclusion."

There are some dark layers which are apparently "incidental side effects" of the techno-utopian desires.

Mental flaws could be reinforced & monetized by hooking people on prescription pharmaceuticals:

It takes very little imagination to foresee how the kitchen mood wall could lead to advertisements for antidepressants that follow you around the Web, or trigger an alert to your employer, or show up on your Facebook page because, according to Robert Scoble and Shel Israel in Age of Context: Mobile, Sensors, Data and the Future of Privacy, Facebook “wants to build a system that anticipates your needs.”

Or perhaps...

Those business savings are crucial to Rifkin’s vision of the Third Industrial Revolution, not simply because they have the potential to bring down the price of consumer goods, but because, for the first time, a central tenet of capitalism—that increased productivity requires increased human labor—will no longer hold. And once productivity is unmoored from labor, he argues, capitalism will not be able to support itself, either ideologically or practically.

That is not to say "all will fail" due to technology. Some will succeed wildly.

Michelle Phan has been able to leverage her popularity on YouTube to launch a makeup subscription service which is at an $84 million per year revenue run rate.

Those at the top of the hierarchy will get an additional boost. Such edge case success stories will be marketed offline to pull more people onto the platform.

While a "star based" compensation system makes a few people well off, most people publishing on those platforms won't see any financial benefit from their efforts. Worse yet, a lot of the "viral" success stories are driven by a large ad budget.

Even Google has done research on income inequality in attention economies - and that was before they dialed up their brand bias stuff.

Category after category gets commoditized, platform after platform gets funded by Google, and ultimately employees working on them will long for the days when their wages were held down by illegal collusion rather than the crowdsourcing fate they face:

Workers, in turn, have more mobility and a semblance of greater control over their working lives. But is any of it worth it when we can’t afford health insurance or don’t know how much the next gig might pay, or when it might come? When an app’s terms of service agreement is the closest thing we have to an employment contract? When work orders come through a smartphone and we know that if we don’t respond immediately, we might not get such an opportunity again? When we can’t even talk to another human being about the task at hand and we must work nonstop just to make minimum wage?

Just as people get commoditized, so do other layers of value:

In SEO for a number of years many people have painted brand as the solution to everything. But consider the hotel search results which are 100% monetized above the fold - even if you have a brand, you still must pay to play. Or consider the Google Shopping ads which are now being tested on branded navigational searches.

Google even obtained a patent for targeting ads aimed at monetizing named entities.

You paid to build the brand. Then you pay Google again - "or else."

One could choose to opt out of Google ad products so as not to pay to arbitrage themselves, but Google is willing to harm their own relevancy to extract revenues.

A search in the UK for the trademark term [cheapflights] is converted into the generic search [cheap flights]. The official site is ranking #2 organically and is the 20th clickable link in the left rail of the search results.

As much as brand is an asset, it also becomes a liability if you have to pay again every time someone looks for your brand.

Mobile apps may be a way around Google, but again it is worth noting Google owns the operating system and guarantees themselves default placement across a wide array of verticals through bundling contracts with manufacturers. Another thing worth considering with mobile is new notification features tied to the operating systems are unbundling apps & Google has apps like Google Now which tie into many verticals.

As SEOs for a long time we had value in promoting the adoption of Google's ecosystem. As Google attempts to capture more value than they create we may no longer gain by promoting the adoption of their ecosystem, but given their...

  • cash hoard
  • lobbyists
  • ex-employees in key government roles
  • control over video, mobile, apps, maps, email, analytics (along with search)
  • broad portfolio of investments

... it is hard to think they've come anywhere close to peaking.

Google SEO Services (BETA)

When Google acquired DoubleClick Larry Page wanted to keep the Performics division offering SEM & SEO services just to see what would happen. Other Google executives realized the absurd conflict of interest and potential anti trust issues, so they overrode ambitious Larry: "He wanted to see how those things work. He wanted to experiment."

Webmasters have grown tired of Google's duplicity as the search ecosystem shifts to pay to play, or go away.

Google's webmaster guidelines can be viewed as reasonable and consistent or as an anti-competitive tool. As Google eats the ecosystem, those thrown under the bus shift their perspective.

Within some sectors larger players can repeatedly get scrutiny for the same offense with essentially no response, whereas smaller players operating in that same market are slaughtered because they are small.

Access to lawyers, politicians & media outlets = access to benefit of the doubt.

Lack those & BEST OF LUCK TO YOU ;)

Google's page asking "Do you need an SEO?" uses terms like: scam, illicit and deceptive to help frame the broader market perception of SEO.

If ranking movements appear random & non-linear then it is hard to make sense of continued ongoing investment. The less stable Google makes the search ecosystem, the worse they make SEOs look, as...

  • anytime a site ranks better, that anchors the baseline expectation of where rankings should be
  • large rank swings create friction in managing client communications
  • whenever search traffic falls drastically it creates real world impacts on margins, employment & inventory levels

Matt Cutts stated it is a waste of resources for him to be a personal lightning rod for criticism from black hat SEOs. When Matt mentioned he might not go back to his old role at Google some members of the SEO industry were glad. In response some other SEOs mentioned black hats have nobody to blame but themselves & it is their fault for automating things.

After all, it is not like Google arbitrarily shifts their guidelines overnight and drastically penalizes websites to a disproportionate degree ex-post-facto for the work of former employees, former contractors, mistaken/incorrect presumed intent, third party negative SEO efforts, etc.

Oh ... wait ... let me take that back.

Indeed Google DOES do that, which is where much of the negative sentiment Matt complained about comes from.

Recall that when Google went after guest posts, a site which had a single useful guest post on it got a sitewide penalty.

Around that time it was noted Auction.com had thousands of search results for text which was in some of their guest posts.

About a month before the guest post crack down, Auction.com received a $50 million investment from Google Capital.

  • Publish a single guest post on your site = Google engineers shoot & ask questions later.
  • Publish a duplicated guest post on many websites, with Google investment = Google engineers see it as a safe, sound, measured, reasonable, effective, clean, whitehat strategy.

The point of highlighting that sort of disconnect was not to "out" someone, but rather to highlight the (il)legitimacy of the selective enforcement. After all, ...

But perhaps Google has decided to change their practices and have a more reasonable approach to the SEO industry.

An encouraging development on this front was when Auction.com was once again covered in Bloomberg. They not only benefited from leveraging Google's data and money, but Google also offered them another assist:

Closely held Auction.com, which is valued at $1.2 billion, based on Google’s stake, also is working with the Internet company to develop mobile and Web applications and improve its search-engine optimization for marketing, Sharga said.

"In a capitalist system, [Larry Page] suggests, the elimination of inefficiency through technology has to be pursued to its logical conclusion." ― Richard Waters

With that in mind, one can be certain Google didn't "miss" the guest posts by Auction.com. Enforcement is selective, as always.

“The best way to control the opposition is to lead it ourselves.” ― Vladimir Lenin

Whether you turn left or right, the road leads to the same goal.

Loah Qwality Add Werds Clix Four U

Google recently announced they were doing away with exact match AdWords ad targeting this September. They will force all match types to have close variant keyword matching enabled. This means you get misspelled searches, plural versus singular overlap, and an undoing of your tight organization.

In some cases the user intent is different between singular and plural versions of a keyword. A singular version search might be looking to buy a single widget, whereas a plural search might be a user wanting to compare different options in the marketplace. In some cases people are looking for different product classes depending on word form:

For example, if you sell spectacles, the difference between users searching on ‘glass’ vs. ‘glasses’ might mean you are getting users seeing your ad interested in a building material, rather than an aid to reading.

Where segmenting improved the user experience, boosted conversion rates, made management easier, and improved margins - those benefits are now off the table.

CPC isn't the primary issue. Profit margins are what matter. Once you lose the ability to segment you lose the ability to manage your margins. And this auctioneer is known to bid in their own auctions, have random large price spikes, and not give refunds when they are wrong.

An offline analogy for this loss of segmentation ... you go to a gas station to get a bottle of water. After grabbing your water and handing the cashier a $20, they give you $3.27 back along with a six pack you didn't want and didn't ask for.

Why does a person misspell a keyword? Some common reasons include:

  • they are new to the market & don't know it well
  • they are distracted
  • they are using a mobile device or something which makes it hard to input their search query (and those same input issues make it harder to perform other conversion-oriented actions)
  • their primary language is a different language
  • they are looking for something else

In any of those cases, the average value of the expressed intent is usually going to be lower than that of a person who correctly spelled the keyword.

Even if spelling errors were intentional and cultural, the ability to segment that and cater the landing page to match disappears. Or if the spelling error was a cue to send people to an introductory page earlier in the conversion funnel, that option is no more.

In many accounts the loss of the granular control won't cause too big of a difference. But some advertiser accounts in competitive markets will become less profitable and more expensive to manage:

No one who's in the know has more than about 5-10 total keywords in any one adgroup because they're using broad match modified, which eliminated the need for "excessive keyword lists" a long time ago. Now you're going to have to spend your time creating excessive negative keyword lists with possibly millions upon millions of variations so you can still show up for exactly what you want and nothing else.
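
To get a feel for how quickly those negative keyword lists balloon, here is a minimal sketch. Everything in it is hypothetical: the seed keywords are invented, and real close variant matching is decided by Google's systems, so no locally generated list can cover it exhaustively. It simply enumerates naive plural/singular toggles and adjacent-character transpositions as candidate negatives:

```python
from itertools import chain

def simple_variants(keyword):
    """Return naive singular/plural and adjacent-swap variants of a keyword.
    Purely illustrative; real close variant matching is determined by Google
    and cannot be fully enumerated locally."""
    words = keyword.split()
    last = words[-1]
    variants = set()

    # naive singular/plural toggle on the last word
    if last.endswith("s"):
        variants.add(" ".join(words[:-1] + [last[:-1]]))
    else:
        variants.add(" ".join(words[:-1] + [last + "s"]))

    # naive misspellings: swap each adjacent pair of characters in the last word
    for i in range(len(last) - 1):
        swapped = last[:i] + last[i + 1] + last[i] + last[i + 2:]
        variants.add(" ".join(words[:-1] + [swapped]))

    variants.discard(keyword)
    return variants

exact_keywords = ["reading glasses", "prescription glasses"]  # hypothetical ad group
negatives = sorted(set(chain.from_iterable(simple_variants(k) for k in exact_keywords)))
print(len(negatives), "candidate negative keywords from", len(exact_keywords), "seeds")
```

Even two seed keywords produce dozens of candidates; an account with thousands of exact match terms quickly runs into the "millions upon millions of variations" problem described above.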

You might not know which end of the spectrum your account is on until disaster strikes:

I added negatives to my list for 3 months before finally giving up opting out of close variants. What they viewed as a close variant was not even in the ballpark of what I sell. There have been petitions before that have gotten Google to reverse bad decisions in the past. We need to make that happen again.

Brad Geddes has held many AdWords seminars for Google. What does he think of this news?

In this particular account, close variations have much lower conversion rates and much higher CPAs than their actual match type.
...
Variation match isn’t always bad, there are times it can be good to use variation match. However, there was choice.
...
Loss of control is never good. Mobile control was lost with Enhanced Campaigns, and now you’re losing control over your match types. This will further erode your ability to control costs and conversions within AdWords.

A monopoly restricting choice to enhance their own bottom line. It isn't the first time they've done that, and it won't be the last.

Have an enhanced weekend!

Understanding The Google Penguin Algorithm

Whenever Google does a major algorithm update we all rush off to our data to see what changed in terms of rankings and search traffic, then look for trends to try to figure out what Google altered.

The two people I chat most with during periods of big algorithmic changes are Joe Sinkwitz and Jim Boykin. I recently interviewed them about the Penguin algorithm.

Topics include:

  • what it is
  • its impact
  • why there hasn't been an update in a while
  • how to determine if issues are related to Penguin or something else
  • the recovery process (from Penguin and manual link penalties)
  • and much, much more

Here's a custom drawing we commissioned for this interview: "Pang Win."


To date there have been 5 Penguin updates:

  • April 24, 2012
  • May 25, 2012
  • October 5, 2012
  • May 22, 2013 (Penguin 2.0)
  • October 4, 2013

There hasn't been one in quite a while, which is frustrating many who haven't been able to recover. On to the interview...

At its core what is Google Penguin?

Jim: It is a link filter that can cause penalties.

Joe: At its core, Penguin can be viewed as an algorithmic batch filter designed to punish lower quality link profiles.

What sort of ranking and traffic declines do people typically see from Penguin?

Jim: 30-98%. Actually, I've seen some "manual partial matches" where traffic was hardly hit...but that's rare.

Joe: Near total. I should expand. Penguin 1.0 has been a different beast than its later iterations; the first one has been nearly a fixed flag whereas later iterations haven't been quite as severe.

After the initial update there was another one about a month later & then one about every 6 months for a while. There hasn't been one for about 10 months now. So why have the updates been so rare? And why hasn't there been one for a long time?

Jim: Great question. We all believed there'd be an update every 6 months, and now it's been way longer than 6 months...maybe because Matt's on vacation...or maybe he knew it would be a long time until the next update, so he took some time off...or perhaps Google wants those with an algorithmic penalty to feel the pain for longer than 6 months.

Joe: 1.0 was temporarily escapable if you were willing to 301 your site; after 1.1 the redirect began to pass on the damage. My theory on why it has been so very long on the most recent update has to do with maximizing pain - Google doesn't intend to lift its boot off the throats of webmasters just yet; no amount of groveling will do. Add to that the complexity of every idiot disavowing 90%+ of their clean link profiles, and 'dirty' vs 'clean' links become difficult to ascertain from that signal.

Jim: Most people disavow some, then they disavow some more...then next month they disavow more...wait a year and they may disavow them all :)

Joe: Agreed.

Jim: Then Google will let them out...hehe, tongue in cheek...a little.

Joe: I've seen disavow files with over 98% of links in there, including Wikipedia, the Yahoo! Directory, and other great sources - absurd.

Jim: Me too. Most of the people are clueless ... there's tons of people who are disavowing links just because their traffic has gone down, so they feel they must have been hit by penguin, so they start disavowing links.

Joe: Yes; I've seen a lot of panda hits where the person wants to immediately disavow. "whoa, slow down there Tex!"

Jim: I've seen services where they guarantee you'll get out of a penguin penalty, and we know that they're just disavowing 100% of the links. Yes, you get your manual penalty removed that way, but then you're left with nothing.

Joe: Good time to mention that any guarantee of getting out of a penalty is likely sold as a bag of smoke.

Jim: Or rather, they are disavowing 100% of the links they can find going to the site.

OK. I think you mentioned an important point there Jim about "100% of the links they can find." What are the link sources people should use & how comprehensive is the Google Webmaster Tools data? Is WMT data enough to get you recovered?

Joe: Rarely. I've seen where the examples listed in a manual action might be discoverable on Ahrefs, Majestic SEO, or in WMT, but upon cleaning them up (and disavowing further of course) that Google will come back with a few more links that weren't initially in the WMT data dump. I'm dealing with a client on this right now that bought a premium domain as-is and has been spending about a year constantly disavowing and removing links. Google won't let them up for air and won't do the hard reset.

Jim: Well, first...if you're getting your backlinks from Google, be sure to pull your backlinks from both the www and the non-www version of your site. You can't just use one: you HAVE to pull backlinks from both, so you have to verify both the www and the non-www versions of your site with Google Webmaster Tools.

We often start with that. When we find big patterns that we feel are the cause, we'll then go into OSE, Majestic SEO, and Ahrefs, and pull those backlinks too, and pull out those that fit the patterns, but that's after the Google backlink analysis.
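
As a rough editorial illustration of the merge step Jim describes (the file names, column headers, and the suspicious anchor list below are all invented; each tool's real export format differs), a sketch like this could combine Webmaster Tools exports for the www and non-www properties with Ahrefs, Majestic, and OSE exports, dedupe the linking URLs, and flag domains whose anchor text matches patterns you have already identified:

```python
import csv
from urllib.parse import urlparse

# Hypothetical CSV exports, each assumed to have "source_url" and "anchor_text"
# columns. Real exports from each tool use different headers and formats.
EXPORT_FILES = [
    "gwt_links_www.csv",
    "gwt_links_non_www.csv",
    "ahrefs_backlinks.csv",
    "majestic_backlinks.csv",
    "ose_backlinks.csv",
]

# Example anchor text patterns you believe caused the problem.
SUSPICIOUS_ANCHORS = {"cheap widgets", "best widget deals", "buy widgets online"}

def load_links(path):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield row["source_url"].strip(), row.get("anchor_text", "").strip().lower()

unique_links = {}
for path in EXPORT_FILES:
    for url, anchor in load_links(path):
        unique_links.setdefault(url, anchor)  # dedupe across data sources

flagged_domains = {urlparse(url).netloc
                   for url, anchor in unique_links.items()
                   if anchor in SUSPICIOUS_ANCHORS}

print(f"{len(unique_links)} unique linking URLs, {len(flagged_domains)} domains flagged for review")
```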

Joe, you mentioned people getting hit by Panda and mistakenly going off to the races to disavow links. What are the distinguishing characteristics between Penguin, Panda & manual link penalties?

Joe: Given they like to sandwich updates to make it difficult to discern, I like this question. Penguin is about links; it is the easiest to find but hardest to fix. When I first am looking at a URL I'll quickly look at anchor % breakdowns, sources of links, etc. The big difference between penguin and a manual link penalty (if you aren't looking on WMT) is the timing -- think of a bomb going off vs a sniper...everyone complaining at once? probably an algorithm; just a few? probably some manual actions. For manual actions, you'll get a note too in WMT. With panda I like to look first at the on-page to see if I can spot the egregious KW stuffing, weird infrastructure setups that result in thin/duplicated content, and look into engagement metrics and my favorite...externally supported pages - to - total indexed pages ratios.

Jim: Manual, at least you can keep resubmitting and get a yes or no. With an algorithmic, you're screwed....because you're waiting for the next refresh...hoping you did enough to get out.

I don't mind going back and forth with Google with a manual penalty...at least I'm getting an answer.

If you see a drop in traffic, be sure to compare that to the dates of Panda and Penguin updates...if you see a drop on one of the update days, then you can know if you have Panda or Penguin....and if your traffic is just falling, it could be just that, and no penalty.

Joe: While this interview was taking place an employee pinged me to let me know about a manual action reconsideration that was denied, with an example URL being something akin to domain.com/?var=var&var=var - the entire domain was already disavowed. Those 20-second manual reviews by 3rd parties without much of an understanding of search don't generate a lot of confidence for me.

Jim: Yes, I posted this yesterday to SEOchat. Reviewers are definitely not looking at things.

You guys mentioned that anyone selling a guaranteed 100% recovery solution is likely selling a bag of smoke. What are the odds of recovery? When does it make sense to invest in recovery, when does it make sense to start a different site, and when does it make sense to do both in parallel?

Jim: Well, I'm one for trying to save a site. I haven't once said "it's over for that site, let's start fresh." Links are so important, that if I can even save a few links going to a site, I'll take it. I'm not a fan of doing two sites, causes duplicate content issues, and now your efforts are on two sites.

Joe: It depends on the infraction. I have a lot more success getting stuff out of panda, manual actions, and the later iterations of penguin (theoretically including the latest one once a refresh takes place); I won't take anyone's money for those hit on penguin 1.0 though...I give free advice and add it to my DB tracking, but the very few examples I have where a recovery took place that I can confirm were penguin 1.0 and not something else, happened due to being a beta user of the disavow tool and likely occurred for political reasons vs tech reasons.

For churn and burn, redirects and canonicals can still work if you're clever...but that's not reinvestment so much as strategy shift I realize.

You guys mentioned the disavow process, where a person does some, does some more over time, etc. Is Google dragging out the process primarily to drive pain? Or are they leveraging the aggregate data in some way?

Joe: Oh absolutely they drag it out. Mathematically I think of triggers where a threshold to trigger down might be at X%, but the trigger for recovery might be X-10%. Further though, I think initially they looooooved all the aggregate disavow data, until the community freaked out and started disavowing everything. Let's just say I know of a group of people that have a giant network where lots of quality sites are purposefully disavowed in an attempt to screw with the signal further. :)

Jim: pain :) ... not sure if they're leveraging the data yet, but they might be. It shouldn't be too hard for Google to see that a ton of people are disavowing links from a site like get-free-links-directory.com, for Google to say, "no one else seems to trust these links, we should just nuke that site and not count any links from there."

we can do this ourselves with the tools we have...I can see how many times I've seen a domain in my disavows, and how many times I disavowed it...i.e., if I see spamsite.com in 20 disavows I've done, and I disavowed it all 20 times I saw it, I can see this data... or if I've seen goodsite.com 20 times, and never once disavowed it, I can see that too. I'd assume Google must do something like this as well.
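
A rough sketch of the kind of check Jim describes (the folder name and file layout are hypothetical; it assumes you have kept copies of past disavow files you've submitted):

```python
import glob
from collections import Counter
from urllib.parse import urlparse

def domains_in_disavow(path):
    """Yield each domain referenced by one disavow file."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            if line.lower().startswith("domain:"):
                yield line.split(":", 1)[1].strip().lower()
            else:
                yield urlparse(line).netloc.lower()

counts = Counter()
for path in glob.glob("past_disavows/*.txt"):  # hypothetical archive of past files
    counts.update(set(domains_in_disavow(path)))  # count each domain once per file

for domain, seen in counts.most_common(20):
    print(f"{domain}: appeared in {seen} previous disavow files")
```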

Given that they drag it out, on the manual penalties does it make sense to do a couched effort on the first rejection or two, in order to give the perception of a greater level of pain and effort as you scale things up on further requests? What level of resources does it make sense to devote to the initial effort vs the next one and so on? When does recovery typically happen (in terms of % of links filtered and in terms of how many reconsideration requests were filed)?

Joe: When I deliver "disavow these" and "say this" stuff, I give multiple levels, knowing full well that there might be deeper and deeper considerations of the pain. Now, there have been cases where the 1st try gets a site out, but I usually see 3 or more.

Jim: I figure it will take a few reconsideration requests...and yes, I start "big" and get "bigger."

but that's for a sitewide penalty...

We've seen sitewides get reduced to a partial penalty. And once we have a partial penalty, it's much easier to identify the issues and take care of those, while leaving links that go to pages that were not affected.

A sitewide manual penalty kills the site...a partial match penalty usually has some stuff that ranks good, and some stuff that no longer ranks...once we're at a partial match, I feel much more confident in getting that resolved.

Jim, I know you've mentioned the errors people make in either disavowing great links or disavowing links when they didn't need to. You also mentioned the ability to leverage your old disavow data when processing new sites. When does it make sense to DIY on recovery versus hiring a professional? Are there any handy "rule of thumb" guidelines in terms of the rough cost of a recovery process based on the size of their backlink footprint?

Joe: It comes down to education, doesn't it? Were you behind the reason it got dinged? You might try that first vs immediately hiring. Psychologically it could even look like you're more serious after the first disavow is declined by showing you "invested" in the pain. Also, it comes down to opportunity cost: what is your personal time worth, divided by your perceived probability of fixing it?

Jim: We charge $5000 for the analysis, and $5000 for the link removal process...some may think that's expensive...but removing good links will screw you, and not removing bad links will screw you...it's a real science, and getting it wrong can cost you a lot more than this...of course I'd recommend seeing a professional, as I sell this service...but I can't see anyone who's not a true expert in links doing this themselves.

Oh...and once we start work for someone, we keep going at no further cost until they get out.

Joe: That's a nice touch Jim.

Jim: Thank you.

Joe, during this interview you mentioned a reconsideration request rejection where the person cited a link on a site that has already been disavowed. Given how many errors Google's reviewers make, does it make sense to aggressively push to remove links rather than using disavow? What are the best strategies to get links removed?

Joe: DDoS

Jim: hehe

Joe: Really though, be upfront and honest when using those link removal services (which I'd do vs trying to do them one-by-one-by-one)

Jim: Only 1% of the people will remove links anyways; it's more to show Google that you really tried to get the links removed.

Joe: Let the link holder know that you got hit with a penalty, you're just trying to clean it up because your business is suffering, and ask politely that they do you a solid favor.

I've been on the receiving end of a lot of different strategies given the size of my domain portfolio. I've been sued before (as a first course of action!) by someone that PAID to put a link on my site....they never even asked, just filed the case.

Jim: We send 3 removal requests...and ping the links too...so when we do a reconsideration request we can show Google the spreadsheet of who we emailed, when we emailed them, and who removed or nofollowed the links...but it's more about "show" to Google.

Joe: Yep, not a ton of compliance; webmasters have link removal fatigue by now.

This is more of a business question than an SEO question, but ... as much as budgeting for the monetary cost of recovery, an equally important form of budgeting is dealing with the reduced cashflow while the site is penalized. How many months does it typically take to recover from a manual penalty? When should business owners decide to start laying people off? Do you guys suggest people aggressively invest in other marketing channels while the SEO is being worked on in the background?

Jim: A manual penalty typically takes 2-4 months to recover from. Recovery is a relative term. Some people get "your manual penalty has been removed" and their recovery is a tiny blip - up 5%, but still down 90% from what it was prior. Getting a "manual penalty removed" is great IF there are good links left in your profile...if you've disavowed everything, and your penalty is removed...so what...you've got nothing....people often ask where they'll be once they "recover" and I say "it depends on what you have left for links"...but it won't be where you were.

Joe: It depends on how exposed they are per variable costs. If the costs are fixed, then one can generally wait longer (all things being equal) before cutting. If you have a quarter million monthly link budget *cough* then, you're going to want to trim as quickly as possible just in order to survive.

Per investing in other channels, I absolutely cannot emphasize enough how important it is to become an expert in one channel and at least a generalist in several others...even better, hire an expert in another channel to partner up with. In the payday loan space one of the big players did okay in SEO but even with a lot of turbulence was doing great due to their TV and radio capabilities. Also, collect the damn email addresses; email is still a gold mine if you use it correctly.

One of my theories for why there hasn't been a penguin update in a long time was that as people have become more afraid of links they've started using them as a weapon & Google doesn't want a bunch of false positives caused by competitors killing sites. One reason I've thought this versus the pain first motive is that Google could always put a time delay on recoveries while still allowing new sites to get penalized on updates. Joe, you mentioned that after the second Penguin update penalties started passing forward on redirects. Do people take penalized sites and point them at competitors?

Joe: Yes, they do. They also take them and pass them into the natural links of their competitors. I've been railing on negative SEO for several years now...right about when the first manual action wave came out in Jan 2012; that was a tipping point. It is now more economical to take someone else's ranking down than it is to (with a strong degree of confidence) invest in a link strategy to leapfrog them naturally

I could speak for days straight in a congressional filibuster on link strategies used for Negative SEO. It is almost magical how pervasive it has become. I get a couple requests a week to do it even...by BIG companies. Brands being the mechanism to sort out the cesspool and all that.

Jim: Soon, everyone will be monitoring their backlinks on a monthly basis. I know one big company that submits an updated disavow list every week to Google.

That leads to a question about preemptive disavows. When does it make sense to do that? What businesses need to worry about that sort of stuff?

Joe: Are you smaller than a Fortune 500? Then the cards are stacked against you. At the very least, be aware of your link profile -- I wouldn't go so far as to preemptively disavow unless something major popped up.

Jim: I've done a preemptive disavow for my site. I'd say everyone should do a preemptive disavow to clean out the crap backlinks.
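
For anyone who has never built one, the file Google's disavow tool accepts is just plain text: one entry per line, full URLs to disavow individual pages, domain: entries to disavow everything from a site, and # lines for comments. A minimal example (the entries are placeholders, echoing Jim's hypothetical domains above):

```
# Low quality directory; removal requests sent, no response
domain:get-free-links-directory.com
# A single spammy forum profile page
http://spamsite.com/profile/spammy-user-123
```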

Joe: I can't wait to launch an avow service...basically go around to everyone and charge a few thousand dollars to clean up their disavows. :)

Jim: We should team up Joe and do them together :)

Joe: I'll have my spambots call your spambots.

Jim: saving the planet from penguin penalties. cleaning up the links of the web for Google.

Joe: For Google or from Google? :) The other dig, if there's time, is that not all penalties are created equal because there are several books of law in terms of how long a penalty might last. If I take an unknown site and do what RapGenius did, I'd still be waiting, even after fixing (which rapgenius really didn't do) largely because Google is not one of my direct or indirect investors.

Perhaps SEOs will soon offer a service for perfecting your pitch deck for the Google Ventures or Google Capital teams so it is easier to BeatThatPenalty? BanMeNot ;)

Joe: Or to extract money from former Googlers...there's a funding bubble right now where those guys can write their own ticket by VCs chasing the brand. Sure the engineer was responsible for changing the font color of a button, but they have friends on the inside still that might be able to reverse catastrophe.

Outside of getting a Google investment, what are some of the best ways to minimize SEO risk if one is entering a competitive market?

Jim: Don't try to rank for specific phrases anymore. It's a long slow road now.

Joe: Being less dependent on Google gives you power; think of it like a job interview. Do you need that job? The less you do, the more bargaining power you have. If you have more and more income coming in to your site from other channels, chances are you are also hitting on some important brand signals.

Jim: You must create great things, and build your brand...that has to be the focus...unless you want to do things to rank higher quicker, and take the risk of a penalty with Google.

Joe: Agreed. I do far fewer premium domaining + SEO-only plays anymore. For a while they worked; just a different landscape now.

Some (non-link builders) mention how foolish SEOs are for wasting so many thought cycles on links. Why are core content, user experience, and social media all vastly more important than link building?

Jim: links are still the biggest part of the Google algorithm - they cannot be ignored. People must have things going on that will get them mentions across the web, and ideally some links as well. Links are still #1 today... but yes, after links, you need great content, good user experience, and more.

Joe: CopyPress sells content (please buy some content people; I have three kids to feed here), however it is important to point out that the most incredible content doesn't mean anything in a vacuum. How are you going to get a user experience with 0 users? Link building, purchasing traffic, DRIVING attention are crucial not just to SEO but to marketing in general. Google is using links as votes; while the variability has changed and evolved over time, it is still very much there. I don't see it going away in the next year or two.

An analogy: I wrote two books of poetry in college; I think they are ok, but I never published them and tried to get any attention, so how good are they really? Without promotion and amplification, we're all just guessing.

Thanks guys for sharing your time & wisdom!


About our contributors:

Jim Boykin is the Founder and CEO of Internet Marketing Ninjas, and owner of Webmasterworld.com, SEOChat.com, Cre8asiteForums.com and other community websites. Jim specializes in creating digital assets for sites that attract natural backlinks, and in analyzing links to disavow non-natural links for penalty recoveries.

Joe Sinkwitz, known as Cygnus, is current Chief of Revenue for CopyPress.com. He enjoys long walks on the beach, getting you the content you need, and then whispering in your ear how to best get it ranking.

Google Search Censorship for Fun and Profit

Growing Up vs Breaking Things

Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.

As Google has become more dominant, they've moved in the opposite direction. Disruption is promoted as a virtue unto itself, so long as it doesn't adversely impact the home team's business model.

There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:

  • we were REALLY wrong yesterday
  • we are REALLY wrong today

Any change or disruption is easy to justify so long as you are not the one facing the consequences:

"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun


Monopoly Marketshare in a Flash

Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.

Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).

Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.

Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.

Anytime anyone using a browser other than Chrome has a Flash security update they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.

Google engineers "research" new forms of Flash security issues to drive critical security updates.

Obviously, users love it:

Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.

In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.

In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.

Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.

Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.

Locking Down The Ecosystem

And Chrome is easily the most locked down browser out there.

Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.

The Right to Be Forgotten

This brings us back to the current snafu with the "right to be forgotten" in Europe.

Google notified publishers like the BBC & The Guardian of their links being removed due to the EU "right to be forgotten" law. Their goal was to cause a public relations uproar over "censorship" which seems to have been a bit too transparent, causing them to reverse some of the removals after they got caught with their hand in the cookie jar.

The breadth of removals is an ongoing topic of coverage. But if you are Goldman Sachs instead of a government Google finds filtering information for you far more reasonable.

Some have looked at the EU policy and compared it to state-run censorship in China.

Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."

The World's Richest Librarian

Google aims to promote themselves as a digital librarian: "It’s a bit like saying the book can stay in the library, it just cannot be included in the library’s card catalogue."

That analogy is absurd on a number of levels. Which librarian...

Sorry About That Incidental Deletion From the Web...

David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:

In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).

Yet Google sends out hundreds of thousands of warning messages in webmaster tools every single month.

Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.

Despite Google's great power they do make mistakes. And when they do, people lose their jobs.

Consider MetaFilter.

They were penalized November 17, 2012.

At a recent SMX conference Matt Cutts stated MetaFilter was a false positive.

People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years & they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.

As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.

MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.

The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.

If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.

And such stories are understated for fear of coverage creating a witch-hunt:

Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.

Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.

Then there are areas like locksmiths:

I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.

There are entire sectors of the offline economy being reshaped by Google policies.

When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."

Never does anybody ask if it is reasonable for Google to place their own inaccurate $0 editorial front and center. To even bring up that issue makes one an anti-capitalist nut or someone who wishes to infringe on free speech rights. This even after the process behind the sausage comes to light.

And while Google arbitrarily polices others, their leaked internal documents contain juicy quotes about their ad policies like:

  • “We are the only player in our industry still accepting these ads”
  • “We do not make these decisions based on revenue, but as background, [redacted].”
  • "As with all of our policies, we do not verify what these sites actually do, only what they claim to do."
  • "I understand that we should not let other companies, press, etc. influence our decision-making around policy"

Is This "Censorship" Problem New?

This problem of control to access of information is nothing new - it is only more extreme today. Read the (rarely read) preface to Animal Farm, or consider this:

John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?

When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.

"Policy is largely set by economic elites and organized groups representing business interests with little concern for public attitudes or public safety, as long as the public remains passive and obedient." ― Noam Chomsky

Many people have come to the same conclusion.

Turn on, tune in, drop out

"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page

I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."

{This | The Indicated} {Just | True} {In | Newfangled}

A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & leverage their increased level of trust to increase their profit margins by leveraging algorithmic journalism.

Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:

We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.

And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
...
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
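
The underlying technique is straightforward template filling over structured data. Here is a toy sketch in the same spirit (the field names, template wording, and sample numbers are all invented; this is not Automated Insights' actual system):

```python
EARNINGS_TEMPLATE = (
    "{company} reported {quarter} earnings of ${eps:.2f} per share on revenue of "
    "${revenue_m:.1f} million, {beat_miss} the consensus estimate of ${consensus_eps:.2f}."
)

def earnings_story(record):
    """Turn one structured earnings record into a short templated story."""
    beat_miss = "beating" if record["eps"] >= record["consensus_eps"] else "missing"
    return EARNINGS_TEMPLATE.format(beat_miss=beat_miss, **record)

# Hypothetical feed row, standing in for licensed data such as Zacks'.
sample = {
    "company": "Example Corp",
    "quarter": "second-quarter",
    "eps": 1.27,
    "revenue_m": 412.5,
    "consensus_eps": 1.19,
}
print(earnings_story(sample))
```

Run across a full quarterly feed, a loop over records like that one is how 300 manually written stories a quarter becomes 4,400 automated ones.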

In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:

you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.

One suspects these views do not apply to large politically connected media bodies like the AP, which are important enough to have a direct long-term deal with Google.

In the same announcement the AP noted they also include automated NFL player rankings. One interesting thing to note about the AP is they have syndication deals with 1,400 daily newspapers nationwide, as well as thousands of TV and radio stations.

A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.

To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:

"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.

The Automated Insights homepage lists both Yahoo! & Microsoft as clients.

The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.

Last year Google dictated press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was a greater emphasis on manual editorial review:

Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:

  • Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
  • Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
  • Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
  • Overuse of keywords and/or links within the message.

So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.

That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...

Pandas, Penguins, and Popsicles

Are you still working through your newsfeed of SEO material on the 101 ways to get out of panda 4.0 written by people that have never actually practiced SEO on their own sites? Aaron and I had concluded that what was rolling through was panda before it was announced that it was panda, but I'm not going to walk here on my treadmill and knock out yet another post on the things you should be doing if you were gut punched by that negative a priori algorithm (hat tip to Terry, another fine SEObook member, for pointing out to me those public discussions that showed the philosophical evolutionary shift towards the default assumption that sites likely deserve to be punished). I'd say 90% of those posts are thinly veiled sales pitches; I should know since I sell infographics to support my nachos habit. Speaking of infographics, there's already a great one that covers recovery strategies that still work right here.

Should I write about penguin? Analysis of that beast consumed the better part of 2 years of my waking time. Nope. Again, I think it has already been adequately covered in a previous blog post. There's nothing particularly new to report there either, since the next update may be completely different, might be just another refresh that doesn't take into account those slapped in the 1.0 incarnation of the update, or may actually be the penguin everyone hopes it is, taking into account the countless hours agencies have spent disavowing links and spamming me with fake legal threats should I not remove links they themselves placed. I wouldn't hold your breath on that last one. Outside of crowdsourcing pain for future manual penalties, I don't expect much relief on that front.

Instead, I think I'm going to talk about popsicles. That seems like the kind of tripe that an SEO blog might discuss. I bet I can make it work though. I'm a fat dude in the Phoenix area and we already had our first 100F day, so I'm thinking of frozen treats. Strap in.

Search tactics, and I'd even go so far as to say certain strategies, are like popsicles. When they are brand new they are cool and refreshing, but once exposed to the public heat they fade…fast. Really fast. Leaving a goopy, sticky mess, as users of ALN and BMR can probably tell you.

Bear with me.

If you have a tactic that works, why would you expose it to the public? Nothing good can come of that. Sure, you have a tactic that works 100% but since I'm a loyal subscriber you're willing to share it with me for $297. Seems legit. I'm not saying all services/products pitched this way are inherently ‘bad', I'm just saying you aren't going to get a magic bullet, let alone one hand-wrapped and delivered by filling out a single wufoo form…sans report.

Would you share with a really close friend? I suppose, but even still the popsicle isn't going to last as long since it is now being consumed at an accelerated rate. There's the thought of germs, contamination, and other nasty thoughts that'd prevent me from going down that route. Cue the “Two SEOs, one popsicle” reaction videos. No. There are two ways to make the best use out of that popsicle.

  • Practitioner: eat it quietly, savor it, make it last.
  • Strategist w/ resources: figure out the recipe and mass produce it as quickly as possible, knowing that after enough public heat is on, the popsicles will start melting before they can be eaten, and no one likes that weird, warm, orange sticky stuff that tastes like a glucose intolerance test.

There's another caveat to the two above scenarios. Even if you're a strategist with deep resources, unless you're willing to test on your own sites, you're just effectively selling smoke on an unproven tactic.

So there you have it, tactics are like popsicles. Disappointed? Good. I've been doing SEO since 1997, so here's a secret: try to create engaging content, supported by authoritative off-page signals. There's an ebb and flow to this of course, but it can be translated across the full black/white spectrum. Markov content in a free WordPress theme can be engaging when it is cloaked with actionable imagery, with a certain % of back-buttons disabled, or when you make the advertising more compelling than the content (just ask eHow). Similarly, well-researched interactive infographics can engage the user on the other side of the spectrum…just more expensive. Comment spam and parasitic hosting on “authority” sites can tap into those authority signals on the dark side, as can a thorough native campaign across a bunch of relevant sites backed by a PR campaign, TV commercials, and radio spots on the light side. Budget and objectives are the only difference.

Go enjoy a popsicle everyone. Summer is here; I expect a lot more heat from Google, so you might need one.  

Eric Schmidt Drawing.

About the author: Joe Sinkwitz is the Chief Revenue Officer at CopyPress. He {Tweets / posts / comments / shares his thoughts} on navigating the evolving SEO landscape on Twitter here.

Please Remove My Link. Or Else.

Getting links removed is a tedious business.

It’s just as tedious for the site owner who must remove the links. Google’s annoying practice of "suggesting" webmasters jump through hoops in order to physically remove links that the webmaster suspects are bad, rather than Google simply ignoring the links that they’ve internally flagged, is causing frustration.

Is it a punitive punishment? If so, it’s doing nothing to endear Google to webmasters. Is it a smokescreen? i.e. they don't know which links are bad, but by having webmasters declare them, this helps Google build up a more comprehensive database? Bit of both? It might also be adding costs to SEO in order to put SEO out of reach of small companies. Perhaps it’s a red herring to make people think links are more important than they actually are.

Hard to be sure.

Collateral Damage

SEOs are accustomed to search engines being coy, punitive and oblique. SEOs accept it as part of the game. However, it becomes rather interesting when webmasters who are not connected to SEO get caught up in the collateral damage:

I received an interesting email the other day from a company we linked to from one of our websites. In short, the email was a request to remove links from our site to their site. We linked to this company on our own accord, with no prior solicitation, because we felt it would be useful to our site visitors, which is generally why people link to things on the Internet.

And check out the subsequent discussion on Hacker News. Matt Cutts’ first post is somewhat disingenuous:

Situation #1 is by far the most common. If a site gets dinged for linkspam and works to clean up their links, a lot of them send out a bunch of link removal requests on their own prerogative

Webmasters who receive the notification are encouraged by Google to clean up their backlinks, because if they don’t, then their rankings suffer.

But, essentially from our point of view when it comes to unnatural links to your website we want to see that you’ve taken significant steps to actually remove it from the web but if there are some links that you can’t remove yourself or there are some that require payment to be removed then having those in the disavow file is fine as well.

(Emphasis mine)

So, of course webmasters who have received a notification from Google are going to contact websites to get links removed. Google have stated they want to see that the webmaster has gone to considerable effort to remove them, rather than simply use the disavow tool.

The inevitable result is that a webmaster who links to anyone who has received a bad links notification may receive the latest form of email spam known as the “please remove my link” email. For some webmasters, this email has become more common than the “someone has left you millions in a Nigerian bank account” gambit, and is just as persistent and annoying.

From The Webmaster’s Perspective

Webmasters could justifiably add the phrase “please remove my link” and the word "disavow" to their spam filters.

Let’s assume this webmaster isn’t in a bad neighbourhood and is simply caught in the cross-fire. The SEO assumes, perhaps incorrectly, the link is bad and requests a take-down. From the webmaster’s perspective, they incur a time cost dealing with the link removal requests. A lone request might take a few minutes to physically remove - but hang on a minute - how does the webmaster know this request is coming from the site owner and not from some dishonest competitor? Ownership takes time to verify. And why would the webmaster want to take down this link, anyway? Presumably, they put it up because they deemed it useful to their audience. Or, perhaps some bot put the link there - perhaps as a forum or blog comment link - against the webmaster’s wishes - and now, to add insult to injury, the SEO wants the webmaster to spend his time taking it down!

Even so, this might be okay if it’s only one link. It doesn't take long to remove. But, for webmasters who own large sites, it quickly becomes a chore. For large sites with thousands of outbound links built up over years, removal requests can pile up. That’s when the spam filter kicks in.

Then come the veiled threats. “Thanks for linking to us. This is no reflection on you, but if you don’t remove my link I’ll be forced to disavow you and your site will look bad in Google. I don’t want to do this, but I may have to.”

What a guy.

How does the webmaster know the SEO won’t do that anyway? Isn’t that exactly what some SEO conference speakers have been telling other SEOs to do regardless of whether the webmaster takes the link down or not?

So, for a webmaster caught in the cross-fire, there’s not much incentive to remove links, especially if s/he's read Matt's suggestion:

higherpurpose, nowhere in the original article did it say that Google said the link was bad. This was a request from a random site (we don't know which one, since the post dropped that detail), and the op can certainly ignore the link removal request.

In some cases Google does specify links:

We’ve reviewed the links to your site and we still believe that some of them are outside our quality guidelines.

Sample URLs:
ask.metafilter.com/194610/get-me-and-my-stuff-from-point-a-to-point-b-possibly-via-point-c

Please correct or remove all inorganic links, not limited to the samples provided above. This may involve contacting webmasters of the sites with the inorganic links on them.

And they make errors when they specify those links. They've flagged DMOZ & other similar links: "Every time I investigate these “unnatural link” claims, I find a comment by a longtime member of MetaFilter in good standing trying to help someone out, usually trying to identify something on Ask MetaFilter."

Changing Behaviour

Then the webmaster starts thinking.

"Hmmm...maybe linking out will hurt me! Google might penalize me or, even worse, I’ll get flooded with more and more “please remove my link” spam in future."

So what happens?

The webmaster becomes very wary about linking out. David Naylor mentioned an increasing number of sites adopting a "no linking" policy. Perhaps the webmaster nofollows everything as a precaution. Far from being the life-giving veins of the web, links are seen as potentially malignant. If all outbound links are made nofollow, perhaps the chance of being banned and flooded with “please remove my link” spam is reduced. Then again, even nofollowed links are getting removal requests.
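
The "nofollow everything" precaution is trivial to automate, which is part of why it spreads. A rough sketch using BeautifulSoup (the domain and HTML snippet are invented):

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

OWN_DOMAIN = "example-publisher.com"  # hypothetical site

def nofollow_external_links(html, own_domain=OWN_DOMAIN):
    """Add rel="nofollow" to every link that points off the publisher's own domain.
    A crude substring check is used here; a real implementation would compare
    parsed hostnames."""
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if a["href"].startswith("http") and own_domain not in a["href"]:
            a["rel"] = "nofollow"
    return str(soup)

page = '<p>See <a href="http://other-site.example/resource">this resource</a>.</p>'
print(nofollow_external_links(page))
```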

As more webmasters start to see links as problematic, fewer legitimate sites receive links. Meanwhile, the blackhat, who sees their sites occasionally getting burned as a cost of doing business, will likely see their sites rise, as they’ll be the ones getting all the links, served up from their curated link networks.

A commenter notes:

The Google webspam team seems to prefer psychology over technology to solve the problem, especially recently. Nearly everything that's come out of Matt Cutt's mouth in the last 18 months or so has been a scare tactic.
IMO all this does is further encourage the development of "churn and burn" websites from blackhats who have being penalized in their business plan. So why should I risk all the time and effort it takes to generate quality web content when it could all come crashing down because an imperfect and overzealous algorithm thinks it's spam? Or worse, some intern or non-google employee doing a manual review wrongly decides the site violates webmaster guidelines?

And what’s the point of providing great content when some competitor can just take you out with a dedicated negative SEO campaign, or if Google hits you with a false positive? If most of your traffic comes from Google, then the risk of the web publishing model increases.

Like MetaFilter:

Is Google broken? Or is your site broken? That’s the question any webmaster asks when she sees her Google click-throughs drop dramatically. It’s a question that Matt Haughey, founder of legendary Internet forum MetaFilter, has been asking himself for the last year and a half, as declining ad revenues have forced the long-running site to lay off several of its staff.

Then again, Google may just not want what MetaFilter has to offer anymore.

(Un)Intended Consequences

Could this be uncompetitive practice from Google? Are the sites getting hit with penalties predominantly commercial sites? It would be interesting to see how many of them are non-commercial. If so, is it a way to encourage commercial sites to use Adwords as it becomes harder and harder to get a link by organic means? If all it did was raise the cost of doing SEO, it would still be doing its job.

I have no idea, but you could see why people might ask that question.

Let’s say it’s benevolent and Google is simply working towards better results. The unintended consequence is that webmasters will think twice about linking out. And if that happens, then their linking behaviour will start to become more exclusive. When links become harder to get and more problematic, then PPC and social media are going to look that much more attractive.
