Loah Qwality Add Werds Clix Four U

Aug 16th

Google recently announced they were doing away with exact match AdWords ad targeting this September. They will force all match types to have close variant keyword matching enabled. This means you get misspelled searches, plural versus singular overlap, and an undoing of your tight organization.

In some cases the user intent differs between singular and plural versions of a keyword. A singular search might be looking to buy a single widget, whereas a plural search might come from a user wanting to compare options in the marketplace. In other cases people are looking for entirely different product classes depending on word form:

For example, if you sell spectacles, the difference between users searching for ‘glass’ vs. ‘glasses’ might mean your ad is being shown to people interested in a building material rather than an aid to reading.

Where segmenting improved the user experience, boosted conversion rates, made management easier, and improved margins - those benefits are now off the table.

CPC isn't the primary issue. Profit margins are what matter. Once you lose the ability to segment you lose the ability to manage your margins. And this auctioneer is known to bid in their own auctions, have random large price spikes, and not give refunds when they are wrong.

An offline analogy for this loss of segmentation ... you go to a gas station to get a bottle of water. After grabbing your water and handing the cashier a $20, they give you $3.27 back along with a six pack you didn't want and didn't ask for.

Why does a person misspell a keyword? Some common reasons include:

  • they are new to the market & don't know it well
  • they are distracted
  • they are using a mobile device or something which makes it hard to input their search query (and those same input issues make it harder to perform other conversion-oriented actions)
  • their primary language is not the language they are searching in
  • they are looking for something else

In any of those cases, the average value of the expressed intent is usually going to be lower than that of a person who correctly spelled the keyword.

Even if spelling errors were intentional and cultural, the ability to segment that and cater the landing page to match disappears. Or if the spelling error was a cue to send people to an introductory page earlier in the conversion funnel, that option is no more.

In many accounts the loss of the granular control won't cause too big of a difference. But some advertiser accounts in competitive markets will become less profitable and more expensive to manage:

No one who's in the know has more than about 5-10 total keywords in any one adgroup because they're using broad match modified, which eliminated the need for "excessive keyword lists" a long time ago. Now you're going to have to spend your time creating excessive negative keyword lists with possibly millions upon millions of variations so you can still show up for exactly what you want and nothing else.
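To get a feel for the scale of that negative keyword problem, here is a minimal Python sketch. The variant rules (naive pluralization, single-character deletions, and adjacent transpositions) are illustrative assumptions, far narrower than Google's actual close matching, but even they show how fast the list of exact-match negatives balloons:

```python
def close_variants(keyword):
    """Generate a few illustrative close variants of a keyword:
    naive pluralization plus single-character typos (deletions
    and adjacent transpositions). Real close variant matching
    is far broader than this."""
    variants = set()
    # plural / singular swap
    variants.add(keyword + "s")
    if keyword.endswith("s"):
        variants.add(keyword[:-1])
    # single-character deletions
    for i in range(len(keyword)):
        variants.add(keyword[:i] + keyword[i + 1:])
    # adjacent-character transpositions
    for i in range(len(keyword) - 1):
        variants.add(keyword[:i] + keyword[i + 1] + keyword[i] + keyword[i + 2:])
    variants.discard(keyword)
    return variants

# Even one short keyword yields a dozen-plus variants to negate;
# multiply across thousands of keywords and the negative list explodes.
variants = close_variants("widget")
```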

You might not know which end of the spectrum your account is on until disaster strikes:

I added negatives to my list for 3 months before finally giving up opting out of close variants. What they viewed as a close variant was not even in the ballpark of what I sell. There have been petitions before that have gotten Google to reverse bad decisions in the past. We need to make that happen again.

Brad Geddes has held many AdWords seminars for Google. What does he think of this news?

In this particular account, close variations have much lower conversion rates and much higher CPAs than their actual match type.
...
Variation match isn’t always bad, there are times it can be good to use variation match. However, there was choice.
...
Loss of control is never good. Mobile control was lost with Enhanced Campaigns, and now you’re losing control over your match types. This will further erode your ability to control costs and conversions within AdWords.

A monopoly restricting choice to enhance their own bottom line. It isn't the first time they've done that, and it won't be the last.

Have an enhanced weekend!

Understanding The Google Penguin Algorithm

Aug 1st

Whenever Google does a major algorithm update we all rush off to our data to see what changed in terms of rankings and search traffic, then look for trends to try to figure out what Google did.

The two people I chat most with during periods of big algorithmic changes are Joe Sinkwitz and Jim Boykin. I recently interviewed them about the Penguin algorithm.

Topics include:

  • what it is
  • its impact
  • why there hasn't been an update in a while
  • how to determine if issues are related to Penguin or something else
  • the recovery process (from Penguin and manual link penalties)
  • and much, much more

Here's a custom drawing we commissioned for this interview.
Pang Win.

To date there have been 5 Penguin updates:

  • April 24, 2012
  • May 25, 2012
  • October 5, 2012
  • May 22, 2013 (Penguin 2.0)
  • October 4, 2013

There hasn't been one in quite a while, which is frustrating many who haven't been able to recover. On to the interview...

At its core what is Google Penguin?

Jim: It is a link filter that can cause penalties.

Joe: At its core, Penguin can be viewed as an algorithmic batch filter designed to punish lower quality link profiles.

What sort of ranking and traffic declines do people typically see from Penguin?

Jim: 30-98%. Actually, I've seen some "manual partial matches" where traffic was hardly hit...but that's rare.

Joe: Near total. I should expand. Penguin 1.0 has been a different beast than its later iterations; the first one has been nearly a fixed flag whereas later iterations haven't been quite as severe.

After the initial update there was another one about a month later & then one about every 6 months for a while. There hasn't been one for about 10 months now. So why have the updates been so rare? And why hasn't there been one for a long time?

Jim: Great question. We all believed there'd be an update every 6 months, and now it's been way longer than 6 months...maybe because Matt's on vacation...or maybe he knew it would be a long time until the next update, so he took some time off...or perhaps Google wants those with an algorithmic penalty to feel the pain for longer than 6 months.

Joe: 1.0 was temporarily escapable if you were willing to 301 your site; after 1.1 the redirect began to pass on the damage. My theory on why it has been so very long on the most recent update has to do with maximizing pain - Google doesn't intend to lift its boot off the throats of webmasters just yet; no amount of groveling will do. Add to that every idiot disavowing 90%+ of their clean link profiles, and 'dirty' vs 'clean' links become difficult to ascertain from that signal.

Jim: Most people disavow some, then they disavow some more...then next month they disavow more...wait a year and they may disavow them all :)

Joe: Agreed.

Jim: Then Google will let them out...hehe, tongue in cheek...a little.

Joe: I've seen disavow files with over 98% of links in there, including Wikipedia, the Yahoo! Directory, and other great sources - absurd.

Jim: Me too. Most of the people are clueless ... there's tons of people who are disavowing links just because their traffic has gone down, so they feel they must have been hit by penguin, so they start disavowing links.

Joe: Yes; I've seen a lot of panda hits where the person wants to immediately disavow. "whoa, slow down there Tex!"

Jim: I've seen services where they guarantee you'll get out of a penguin penalty, and we know that they're just disavowing 100% of the links. Yes, you get your manual penalty removed that way, but then you're left with nothing.

Joe: Good time to mention that any guarantee of getting out of a penalty is likely sold as a bag of smoke.

Jim: or as they are disavowing 100% of the links they can find going to the site.

OK. I think you mentioned an important point there Jim about "100% of the links they can find." What are the link sources people should use & how comprehensive is the Google Webmaster Tools data? Is WMT data enough to get you recovered?

Joe: Rarely. I've seen where the examples listed in a manual action might be discoverable on Ahrefs, Majestic SEO, or in WMT, but upon cleaning them up (and disavowing further of course) that Google will come back with a few more links that weren't initially in the WMT data dump. I'm dealing with a client on this right now that bought a premium domain as-is and has been spending about a year constantly disavowing and removing links. Google won't let them up for air and won't do the hard reset.

Jim: well first...if you're getting your backlinks from Google, be sure to pull your backlinks from the www and the non-www version of your site. You can't just use one: you HAVE to pull backlinks from both, so you have to verify both the www and non-www versions of your site with Google Webmaster Tools.

We often start with that. When we find big patterns that we feel are the cause, we'll then go into OSE, Majestic SEO, and Ahrefs, and pull those backlinks too, and pull out those that fit the patterns, but that's after the Google backlink analysis.
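That merge-and-dedupe step across WMT, OSE, Majestic SEO, and Ahrefs exports can be sketched as follows. The `source_url` column name and CSV layout are hypothetical assumptions; each tool's real export format differs, so you would map columns per source:

```python
import csv

def merge_backlinks(paths, url_column="source_url"):
    """Merge backlink CSV exports from multiple tools into one
    deduplicated set of linking URLs. The column name is a
    hypothetical assumption; adjust per tool's export format."""
    links = set()
    for path in paths:
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # Normalize so the same URL exported by two tools
                # with different casing only counts once.
                links.add(row[url_column].strip().lower())
    return links
```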

Joe, you mentioned people getting hit by Panda and mistakenly going off to the races to disavow links. What are the distinguishing characteristics between Penguin, Panda & manual link penalties?

Joe: Given they like to sandwich updates to make it difficult to discern, I like this question. Penguin is about links; it is the easiest to find but hardest to fix. When I first am looking at a URL I'll quickly look at anchor % breakdowns, sources of links, etc. The big difference between penguin and a manual link penalty (if you aren't looking on WMT) is the timing -- think of a bomb going off vs a sniper...everyone complaining at once? probably an algorithm; just a few? probably some manual actions. For manual actions, you'll get a note too in WMT. With panda I like to look first at the on-page to see if I can spot the egregious KW stuffing, weird infrastructure setups that result in thin/duplicated content, and look into engagement metrics and my favorite...externally supported pages - to - total indexed pages ratios.
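Joe's first-pass anchor % breakdown can be sketched as a simple frequency count. The example profile and any threshold you would apply to the percentages are illustrative assumptions, not published Penguin values:

```python
from collections import Counter

def anchor_breakdown(anchors):
    """Percentage breakdown of anchor text across a link profile.
    A profile dominated by exact-match commercial anchors is a
    classic (though unofficial) Penguin warning sign."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1) for anchor, n in counts.items()}

# Hypothetical profile: heavy exact-match commercial anchors
# versus a few brand and generic anchors.
profile = ["cheap widgets"] * 7 + ["Example Inc"] * 2 + ["click here"]
breakdown = anchor_breakdown(profile)
```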

Jim: Manual, at least you can keep resubmitting and get a yes or no. With an algorithmic, you're screwed....because you're waiting for the next refresh...hoping you did enough to get out.

I don't mind going back and forth with Google with a manual penalty...at least I'm getting an answer.

If you see a drop in traffic, be sure to compare that to the dates of Panda and Penguin updates...if you see a drop on one of the update days, then you can know if you have Panda or Penguin...and if your traffic is just falling, it could be just that, and no penalty.
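Jim's date comparison can be sketched as follows, using the Penguin update dates listed earlier. The 30% drop threshold and 2-day window are illustrative assumptions, not official values:

```python
from datetime import date, timedelta

# Penguin update dates from the list earlier in the post.
PENGUIN_UPDATES = [
    date(2012, 4, 24), date(2012, 5, 25), date(2012, 10, 5),
    date(2013, 5, 22), date(2013, 10, 4),
]

def drops_near_updates(daily_visits, threshold=0.3, window=2):
    """daily_visits: dict mapping date -> visit count. Returns the
    update dates around which traffic fell by at least `threshold`
    within `window` days. Thresholds are illustrative only."""
    hits = []
    for update in PENGUIN_UPDATES:
        before = daily_visits.get(update - timedelta(days=window))
        after = daily_visits.get(update + timedelta(days=window))
        if before and after and (before - after) / before >= threshold:
            hits.append(update)
    return hits
```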

Joe: While this interview was taking place an employee pinged me to let me know a manual action that was denied, with an example URL being something akin to domain.com/?var=var&var=var - the entire domain was already disavowed. Those 20 second manual reviews by 3rd parties without much of an understanding of search don't generate a lot of confidence for me.

Jim: Yes, I posted this yesterday to SEOchat. Reviewers are definitely not looking at things.

You guys mentioned that anyone selling a guaranteed 100% recovery solution is likely selling a bag of smoke. What are the odds of recovery? When does it make sense to invest in recovery, when does it make sense to start a different site, and when does it make sense to do both in parallel?

Jim: Well, I'm one for trying to save a site. I haven't once said "it's over for that site, let's start fresh." Links are so important, that if I can even save a few links going to a site, I'll take it. I'm not a fan of doing two sites, causes duplicate content issues, and now your efforts are on two sites.

Joe: It depends on the infraction. I have a lot more success getting stuff out of panda, manual actions, and the later iterations of penguin (theoretically including the latest one once a refresh takes place); I won't take anyone's money for those hit on penguin 1.0 though...I give free advice and add it to my DB tracking, but the very few examples I have where a recovery took place that I can confirm were penguin 1.0 and not something else, happened due to being a beta user of the disavow tool and likely occurred for political reasons vs tech reasons.

For churn and burn, redirects and canonicals can still work if you're clever...but that's not reinvestment so much as strategy shift I realize.

You guys mentioned the disavow process, where a person does some, does some more over time, etc. Is Google dragging out the process primarily to drive pain? Or are they leveraging the aggregate data in some way?

Joe: Oh absolutely they drag it out. Mathematically I think of triggers where a threshold to trigger down might be at X%, but the trigger for recovery might be X-10%. Further though, I think initially they looooooved all the aggregate disavow data, until the community freaked out and started disavowing everything. Let's just say I know of a group of people that have a giant network where lots of quality sites are purposefully disavowed in an attempt to screw with the signal further. :)

Jim: pain :) ... not sure if they're leveraging the data yet, but they might be. It shouldn't be too hard for Google to see that a ton of people are disavowing links from a site like get-free-links-directory.com, for Google to say, "no one else seems to trust these links, we should just nuke that site and not count any links from there."

we can do this ourselves with our own tools we have..I can see how many times I've seen a domain in my disavows, and how many times I disavowed that...ie, If I see spamsite.com in 20 disavows I've done, and I'd disavowed it all 20 times I saw it, I can see this data... or if I've seen goodsite.com 20 times, and never once disavowed it, I can see that too. I'd assume Google must do something like this as well.
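The seen-versus-disavowed tally Jim describes can be sketched with a simple counter over past disavow lists; the example domains echo the hypothetical ones mentioned above:

```python
from collections import Counter

def disavow_consensus(disavow_files):
    """disavow_files: one list of disavowed domains per site you've
    cleaned up. Returns how often each domain appears across all
    files - the 'seen in 20 disavows, disavowed 20 times' idea."""
    seen = Counter()
    for domains in disavow_files:
        seen.update(set(domains))  # count each domain once per file
    return seen

# Hypothetical history: spamsite.com disavowed every time it was
# seen, goodsite.com seen but presumably kept in other profiles.
history = [
    ["spamsite.com", "goodsite.com"],
    ["spamsite.com"],
    ["spamsite.com", "goodsite.com"],
]
counts = disavow_consensus(history)
```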

Given that they drag it out, on the manual penalties does it make sense to do a couched effort on the first rejection or two, in order to give the perception of a greater level of pain and effort as you scale things up on further requests? What level of resources does it make sense to devote to the initial effort vs the next one and so on? When does recovery typically happen (in terms of % of links filtered and in terms of how many reconsideration requests were filed)?

Joe: When I deliver "disavow these" and "say this" stuff, I give multiple levels, knowing full well that there might be deeper and deeper considerations of the pain. Now, there have been cases where the 1st try gets a site out, but I usually see 3 or more.

Jim: I figure it will take a few reconsideration requests...and yes, I start "big" and get "bigger."

but that's for a sitewide penalty...

We've seen sitewides get reduced to a partial penalty. And once we have a partial penalty, it's much easier to identify this issues and take care of those, while leaving links that go to pages that were not effected.

A sitewide manual penalty kills the site...a partial match penalty usually has some stuff that ranks good, and some stuff that no longer ranks...once we're at a partial match, I feel much more confident in getting that resolved.

Jim, I know you've mentioned the errors people make in either disavowing great links or disavowing links when they didn't need to. You also mentioned the ability to leverage your old disavow data when processing new sites. When does it make sense to DIY on recovery versus hiring a professional? Are there any handy "rule of thumb" guidelines in terms of the rough cost of a recovery process based on the size of their backlink footprint?

Joe: It comes down to education, doesn't it? Were you behind the reason it got dinged? You might try that first vs immediately hiring. Psychologically it could even look like you're more serious after the first disavow is declined by showing you "invested" in the pain. Also, it comes down to opportunity cost. What is your personal time worth divided by your perceived probability of fixing it?

Jim: We charge $5000 for the analysis, and $5000 for the link removal process...some may think that's expensive...but removing good links will screw you, and not removing bad links will screw you...it's a real science, and getting it wrong can cost you a lot more than this...of course I'd recommend seeing a professional, as I sell this service...but I can't see anyone who's not a true expert in links doing this themselves.

Oh...and once we start work for someone, we keep going at no further cost until they get out.

Joe: That's a nice touch Jim.

Jim: Thank you.

Joe, during this interview you mentioned a reconsideration request rejection where the person cited a link on a site that has already been disavowed. Given how many errors Google's reviewers make, does it make sense to aggressively push to remove links rather than using disavow? What are the best strategies to get links removed?

Joe: DDoS

Jim: hehe

Joe: Really though, be upfront and honest when using those link removal services (which I'd do vs trying to do them one-by-one-by-one)

Jim: Only 1% of the people will remove links anyway; it's more to show Google that you really tried to get the links removed.

Joe: Let the link holder know that you got hit with a penalty, you're just trying to clean it up because your business is suffering, and ask politely that they do you a solid favor.

I've been on the receiving end of a lot of different strategies given the size of my domain portfolio. I've been sued before (as a first course of action!) by someone that PAID to put a link on my site....they never even asked, just filed the case.

Jim: We send 3 removal requests...and ping the links too...so when we do a reconsideration request we can show Google the spreadsheet of who we emailed, when we emailed them, and who removed or nofollowed the links...but it's more about "show" to Google.

Joe: Yep, not a ton of compliance; webmasters have link removal fatigue by now.

This is more of a business question than an SEO question, but ... as much as budgeting for the monetary cost of recovery, an equally important form of budgeting is dealing with the reduced cashflow while the site is penalized. How many months does it typically take to recover from a manual penalty? When should business owners decide to start laying people off? Do you guys suggest people aggressively invest in other marketing channels while the SEO is being worked on in the background?

Jim: A manual penalty typically takes 2-4 months to recover from. Recover is a relative term. Some people get "your manual penalty has been removed" and their recovery is a tiny blip - up 5%, but still down 90% from what it was prior. Getting a "manual penalty removed" is great IF there's good links left in your profile...if you've disavowed everything, and your penalty is removed...so what...you've got nothing...people often ask where they'll be once they "recover" and I say "it depends on what you have left for links"...but it won't be where you were.

Joe: It depends on how exposed they are per variable costs. If the costs are fixed, then one can generally wait longer (all things being equal) before cutting. If you have a quarter million monthly link budget *cough* then, you're going to want to trim as quickly as possible just in order to survive.

Per investing in other channels, I absolutely wholeheartedly cannot emphasize enough how important it is to become an expert in one channel and at least a generalist in several others...even better, hire an expert in another channel to partner up with. In the payday loan space one of the big players did okay in SEO but even with a lot of turbulence was doing great due to their TV and radio capabilities. Also, collect the damn email addresses; email is still a gold mine if you use it correctly.

One of my theories for why there hasn't been a penguin update in a long time was that as people have become more afraid of links they've started using them as a weapon & Google doesn't want a bunch of false positives caused by competitors killing sites. One reason I've thought this versus the pain first motive is that Google could always put a time delay on recoveries while still allowing new sites to get penalized on updates. Joe, you mentioned that after the second Penguin update penalties started passing forward on redirects. Do people take penalized sites and point them at competitors?

Joe: Yes, they do. They also take them and pass them into the natural links of their competitors. I've been railing on negative SEO for several years now...right about when the first manual action wave came out in Jan 2012; that was a tipping point. It is now more economical to take someone else's ranking down than it is to (with a strong degree of confidence) invest in a link strategy to leapfrog them naturally.

I could speak for days straight in a congressional filibuster on link strategies used for Negative SEO. It is almost magical how pervasive it has become. I get a couple requests a week to do it even...by BIG companies. Brands being the mechanism to sort out the cesspool and all that.

Jim: Soon, everyone will be monitoring their backlinks on a monthly basis. I know one big company that submits an updated disavow list every week to Google.

That leads to a question about preemptive disavows. When does it make sense to do that? What businesses need to worry about that sort of stuff?

Joe: Are you smaller than a Fortune 500? Then the cards are stacked against you. At the very least, be aware of your link profile -- I wouldn't go so far as to preemptively disavow unless something major popped up.

Jim: I've done a preemptive disavow for my site. I'd say everyone should do a preemptive disavow to clean out the crap backlinks.
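For reference, a disavow file is a plain text file uploaded through Google's disavow links tool, with one entry per line: either a full URL, or a `domain:` prefix to disavow every link from a domain, with `#` marking comments. A minimal sketch (the domains are hypothetical):

```
# Disavow every link from an entire domain:
domain:get-free-links-directory.com
# Or disavow a single linking URL:
http://spamsite.com/some-page.html
```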

Joe: I can't wait to launch an avow service...basically go around to everyone and charge a few thousand dollars to clean up their disavows. :)

Jim: We should team up Joe and do them together :)

Joe: I'll have my spambots call your spambots.

Jim: saving the planet from penguin penalties. cleaning up the links of the web for Google.

Joe: For Google or from Google? :) The other dig, if there's time, is that not all penalties are created equal because there are several books of law in terms of how long a penalty might last. If I take an unknown site and do what RapGenius did, I'd still be waiting, even after fixing (which rapgenius really didn't do) largely because Google is not one of my direct or indirect investors.

Perhaps SEOs will soon offer a service for perfecting your pitch deck for the Google Ventures or Google Capital teams so it is easier to BeatThatPenalty? BanMeNot ;)

Joe: Or to extract money from former Googlers...there's a funding bubble right now where those guys can write their own ticket by VCs chasing the brand. Sure the engineer was responsible for changing the font color of a button, but they have friends on the inside still that might be able to reverse catastrophe.

Outside of getting a Google investment, what are some of the best ways to minimize SEO risk if one is entering a competitive market?

Jim: Don't try to rank for specific phrases anymore. It's a long slow road now.

Joe: Being less dependent on Google gives you power; think of it like a job interview. Do you need that job? The less you do, the more bargaining power you have. If you have more and more income coming in to your site from other channels, chances are you are also hitting on some important brand signals.

Jim: You must create great things, and build your brand...that has to be the focus...unless you want to do things to rank higher quicker, and take the risk of a penalty with Google.

Joe: Agreed. I do far fewer premium domaining + SEO-only plays anymore. For a while they worked; just a different landscape now.

Some (non-link builders) mention how foolish SEOs are for wasting so many thought cycles on links. Why are core content, user experience, and social media all vastly more important than link building?

Jim: links are still the biggest part of the Google algorithm - they cannot be ignored. People must have things going on that will get them mentions across the web, and ideally some links as well. Links are still #1 today... but yes, after links, you need great content, good user experience, and more.

Joe: CopyPress sells content (please buy some content people; I have three kids to feed here), however it is important to point out that the most incredible content doesn't mean anything in a vacuum. How are you going to get a user experience with 0 users? Link building, purchasing traffic, DRIVING attention are crucial not just to SEO but to marketing in general. Google is using links as votes; while the variability has changed and evolved over time, it is still very much there. I don't see it going away in the next year or two.

An analogy: I wrote two books of poetry in college; I think they are OK, but I never published them or tried to get any attention, so how good are they really? Without promotion and amplification, we're all just guessing.

Thanks guys for sharing your time & wisdom!


About our contributors:

Jim Boykin is the Founder and CEO of Internet Marketing Ninjas, and owner of Webmasterworld.com, SEOChat.com, Cre8asiteForums.com and other community websites. Jim specializes in creating digital assets for sites that attract natural backlinks, and in analyzing links to disavow non-natural links for penalty recoveries.

Joe Sinkwitz, known as Cygnus, is current Chief of Revenue for CopyPress.com. He enjoys long walks on the beach, getting you the content you need, and then whispering in your ear how to best get it ranking.

Google Search Censorship for Fun and Profit

Growing Up vs Breaking Things

Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.

As Google has become more dominant, they've moved in the opposite direction. Disruption is promoted as a virtue unto itself, so long as it doesn't adversely impact the home team's business model.

There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:

  • we were REALLY wrong yesterday
  • we are REALLY wrong today

Any change or disruption is easy to justify so long as you are not the one facing the consequences:

"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun

Monopoly Marketshare in a Flash

Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.

Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).

Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.

Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.

Anytime anyone using a browser other than Chrome has a Flash security update they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.

Google engineers "research" new forms of Flash security issues to drive critical security updates.

Obviously, users love it:

Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.

In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.

In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.

Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.

Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.

Locking Down The Ecosystem

And Chrome is easily the most locked down browser out there.

Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.

The Right to Be Forgotten

This brings us back to the current snafu with the "right to be forgotten" in Europe.

Google notified publishers like the BBC & The Guardian of their links being removed due to the EU "right to be forgotten" law. Their goal was to cause a public relations uproar over "censorship" which seems to have been a bit too transparent, causing them to reverse some of the removals after they got caught with their hand in the cookie jar.

The breadth of removals is an ongoing topic of coverage. But if you are Goldman Sachs instead of a government Google finds filtering information for you far more reasonable.

Some have looked at the EU policy and compared it to state-run censorship in China.

Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."

The World's Richest Librarian

Google aims to promote themselves as a digital librarian: "It’s a bit like saying the book can stay in the library, it just cannot be included in the library’s card catalogue."

That analogy is absurd on a number of levels. Which librarian...

Sorry About That Incidental Deletion From the Web...

David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:

In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).

Yet Google sends out hundreds of thousands of warning messages in webmaster tools every single month.

Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.

Despite Google's great power they do make mistakes. And when they do, people lose their jobs.

Consider MetaFilter.

They were penalized November 17, 2012.

At a recent SMX conference Matt Cutts stated MetaFilter was a false positive.

People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years & they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.

As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.

MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.

The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.

If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.

And such stories are understated for fear of coverage creating a witch-hunt:

Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.

Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.

Then there are areas like locksmiths:

I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.

There are entire sectors of the offline economy being reshaped by Google policies.

When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."

Never does anybody ask if it is reasonable for Google to place their own inaccurate $0 editorial front and center. To even bring up that issue makes one an anti-capitalist nut or someone who wishes to impede free speech rights. This holds even after the sausage-making process comes to light.

And while Google arbitrarily polices others, their leaked internal documents contain juicy quotes about their ad policies like:

  • “We are the only player in our industry still accepting these ads”
  • “We do not make these decisions based on revenue, but as background, [redacted].”
  • "As with all of our policies, we do not verify what these sites actually do, only what they claim to do."
  • "I understand that we should not let other companies, press, etc. influence our decision-making around policy"

Is This "Censorship" Problem New?

This problem of control of access to information is nothing new - it is only more extreme today. Read the (rarely read) preface to Animal Farm, or consider this:

John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?

When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.

"Policy is largely set by economic elites and organized groups representing business interests with little concern for public attitudes or public safety, as long as the public remains passive and obedient." ― Noam Chomsky

Many people have come to the same conclusion.

Turn on, tune in, drop out

"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page

I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."


{This | The Indicated} {Just | True} {In | Newfangled}

A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & use their increased level of trust to expand their profit margins via algorithmic journalism.
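The heading above parodies "spintax," the brace-and-pipe syntax spun-content tools use to generate many variants of one article. Expanding it is a small exercise; here is a minimal sketch (not any particular tool's implementation):

```python
import random
import re

# Minimal spintax expander: "{Hello|Hi} world" -> "Hello world" or "Hi world".
# A sketch of how spun-content tools work, not any specific product.

def spin(text, rng=random):
    pattern = re.compile(r"\{([^{}]*)\}")  # matches an innermost {a|b|c} group
    while True:
        match = pattern.search(text)
        if not match:
            return text
        # Pick one option and splice it back into the text, then repeat
        # until no brace groups remain (handles nested groups inside-out).
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{This|The Indicated} {Just|True} {In|Newfangled}"))
```

Run enough times, one seed article becomes thousands of "unique" pages, which is exactly the economics the rest of this post is about.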

Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:

We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.

And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
...
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
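The pipeline the AP describes - structured earnings data in, short templated story out - can be sketched in a few lines. The field names and template below are hypothetical illustrations, not Automated Insights' actual system:

```python
# Sketch of template-based story generation from structured earnings data.
# Field names and template wording are hypothetical, not Automated Insights' system.

def earnings_story(company, eps, eps_expected, revenue_m, quarter):
    """Render a short earnings story from one row of structured data."""
    verdict = ("beat" if eps > eps_expected
               else "missed" if eps < eps_expected
               else "met")
    return (
        f"{company} reported {quarter} earnings of ${eps:.2f} per share, "
        f"which {verdict} Wall Street expectations of ${eps_expected:.2f}. "
        f"The company posted revenue of ${revenue_m:.1f} million for the quarter."
    )

story = earnings_story("Acme Corp", eps=1.12, eps_expected=1.05,
                       revenue_m=432.7, quarter="second-quarter")
print(story)
```

Swap in 4,400 rows of Zacks-style data per quarter and the "4,400 stories automatically" figure stops sounding remarkable.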

In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:

you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.

One suspects these views do not apply to large politically connected media bodies like the AP, which are important enough to have a direct long-term deal with Google.

In the same announcement the AP noted they will also include automated NFL player rankings. One interesting thing to note about the AP is they have syndication deals with 1,400 daily newspapers nationwide, as well as thousands of TV and radio stations.

A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.

To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:

"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.

The Automated Insights homepage lists both Yahoo! & Microsoft as clients.

The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.

Last year Google dictated press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was greater emphasis on manual editorial review:

Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:

  • Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
  • Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
  • Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
  • Overuse of keywords and/or links within the message.

So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.

That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...

Google Helpouts Twitter Spam (Beta)

May 16th

Google is desperate to promote Helpouts. I first realized this when I saw the following spam message in my email inbox.

Shortly after, a friend sent me a screenshot of a onebox promoting Helpouts in the SERPs.

That's Google's monopoly and those are Google's services. It is not like they are:

  • being anti-competitive
  • paying others to spam other websites

Let's slow down though. Maybe I am getting ahead of myself:

Google has its own remote technology support service similar to Mr. Gupta's called Google Helpouts. Mr. Gupta's complaint alleges Google may have been blocking his advertisements so Google Helpouts could get more customers.

Oh, and that first message looked like it could have been an affiliate link. Was it?

Hmm

Let me see

What do we have here?

Google Helpouts connects you to a variety of experts--from doctors, parenting experts, tutors, personal trainers, and more--over live video call. The Google Helpouts Ambassador Program is a unique opportunity to spread the word about Helpouts, earn money, and influence a new Google product--all on your own schedule.

As an Ambassador, you will:

  • Earn extra income–receive $25 for each friend you refer who takes their first paid Helpout, up to $1,000 per month for the first 4 months.
  • Give direct feedback and help shape a new Google product
  • Join a community of innovative Ambassadors around the country
  • Receive a Helpouts gift and the chance to win prizes

We all know HELPFUL hotel affiliate websites are spam, but maybe Google HELPouts affiliate marketing isn't spam.

After all, Google did promise to teach people how to do their affiliate marketing professionally: "We will provide you with an Ambassador Toolkit with tips and suggestions on creative ways you can spread the word. You are encouraged to get creative, be innovative, and utilize different networks (i.e. social media, word of mouth, groups & associations, blogs, etc.) to help you."

Of course the best way to lead is by example.

And lead they do.

They are highly inclusive in their approach.

Check out this awesome Twitter usage

They've made more Tweets in the last few months than I've made in 7 years. There are 1,440 minutes in a day, so it is quite an achievement to make over 800 Tweets in a day.

You and many many many many thousands of others, Emma.

Some minutes they are making 2 or 3 Tweets.

And with that sort of engagement & the Google brand name, surely they have built a strong following.

Uh, nope.

They are following over 500 people and have about 4,000 followers. And the 4,000 number is generous, as some of them are people who sell on that platform or are affiliates pushing it.

Let's take a look at the zero moment of truth:

Thanks for your unsolicited commercial message, but I am not interested.

You're confusing me. Some context would help.

No email support, but support "sessions"? What is this?

Oh, I get it now. Is this a spam bot promoting phone sex?

Ah, so it isn't phone sex, but you can help with iPhones. Um, did we forget that whole Steve Jobs thermonuclear war bit? And why is Google offering support for Apple products when Larry Page stated the whole idea of customer support was ridiculous?

OK, so maybe this is more of the same.

Cynical, aren't we?

And cheap?

Really cheap. :(

And angry?

And testy?

And rude?

And curt?

Didn't you already say that???

Didn't you already say that???

It seems we are having issues communicating here.

I'm not sure it is fair to call it spying a half day late.

Better late than never.

Even if automated.

Good catch Megar, as Google has a creepy patent on automating social spam.

Who are your real Google+ friends? Have they all got the bends? Is Google really sinking this low?

Every journey of a thousand miles begins with a single step.

Humorous or sad...depending on your view.

There's no wrong way to eat a Reese's.

Google has THOUSANDS of opportunities available for you to learn how to spam Twitter.

As @Helpouts repeatedly Tweets: "Use the code IFOUNDHELP for $20 off" :D

++++++++

All the above Tweets were from the last few days.

The same sort of anti-social aggro spamming campaign has been going on far longer.

When Twitter users said "no thank you"...

...Google quickly responded like a Marmaris rug salesman

Google has a magic chemistry for being able to...

...help with slow computers.

We need to fight spam messages (with MOAR spam messages).

In a recent Youtube video Matt Cutts said: "We got less spam and so it looks like people don't like the new algorithms as much." Based on that, perhaps we can presume Helpouts is engaging in a guerrilla marketing campaign to improve user satisfaction with the algorithms.

Or maybe Google is spamming Twitter so they can justify banning Twitter.

Or maybe this is Google's example of how we should market websites which don't have the luxury of hard-coding at the top of the search results.

Or maybe Google wasn't responsible for any of this & once again it was "a contractor."

Learn Local Search Marketing

Apr 21st

Last October Vedran Tomic wrote a guide for local SEO which has since become one of the more popular pages on our site, so we decided to follow up with a Q&A on some of the latest changes in local search.

Local Ants.

Q: Google appears to have settled their monopolistic abuse charges in Europe. As part of that settlement they have to list 3 competing offers in their result set from other vertical databases. If Google charges for the particular type of listing then these competitors compete in an ad auction, whereas if the vertical is free those clicks to competitors are free. How long do we have until Google's local product has a paid inclusion element to it?

A: The local advertising market is huge. It's a market that Google still hasn't mastered. It's a market still dominated by IYP platforms.

Since search in general is stagnant, Google will be looking to increase their share of the market.

That was obvious to anyone who covered Google's attempt to acquire Groupon, since social couponing is mostly a local marketing phenomenon.

Their new dashboard is not only more stable with a slicker interface, but also capable of facilitating any paid inclusion module.

I would guess that Google will not wait a long time to launch a paid inclusion product or something similar, since they want to keep their shareholders happy.

Q: In the past there have been fiascos with things like local page cross-integration with Google+. How "solved" are these problems, and how hard is it to isolate these sorts of issues from other potential issues?

A: Traditionally, Google had the most trouble with their "local" products. Over the years, they were losing listings, reviews, merging listings, duplicating them etc. Someone called their attempts "a train wreck at the junction." They were also notoriously bad with providing guidance that would help local businesses navigate the complexity of the environment Google created.

Google has also faced some branding challenges - confusing even the most seasoned local search professionals with their branding.

Having said that, things have been changing for the better. Google has introduced phone support which is, I must say, very useful. In addition, the changes they made in the way they deal with local data made things more stable.

However, I'd still say that Google's local products are their biggest challenge.

Q: Yelp just had strong quarterly results and Yahoo! has recently added a knowledge-graph like pane to their search results. How important is local search on platforms away from Google? How aligned are the various local platforms on ranking criteria?

A: Just like organic search is mostly about two functions - importance and relevance, local search is about location prominence, proximity and relevance (where location prominence is an equivalent to importance in general SEO).

All local search platforms have ranking factors that are based on these principles.

The only thing that differs is what they consider ranking signals and the weight they place on each. For example, to rank high in Yahoo! Local, one needs to be very close to the centroid of the town, have something in the title of their business that matches the search query, and have a few reviews.

Google is more sophisticated, but the principles are the same.

The less sophisticated local search platforms use fewer signals in their algorithms, and are usually geared more towards proximity as a ranking signal.

It's also important to note that local search functions as a very interconnected ecosystem, and that changes made in order to boost visibility in one platform, might hurt you in another.

Q: There was a Google patent where they mentioned using driving directions to help as a relevancy signal. And Bing recently invested in and licensed data from Foursquare. Are these the sorts of signals you see taking weight from things like proximity over time?

A: I see these signals increasing in importance over time, as they would be useful ranking signals. However, to Google, local search is also about location sensitivity, and these signals will probably not be used outside of this context.

If you read a patent named "Methods And Systems For Improving A Search Ranking Using Location Awareness" (Amit Singhal is one of the inventors), you will see that Google, in fact, is aware that people have different sensitivities to different types of services/queries. You don't necessarily care where your plumber will come from, but you do care where the pizza places are when you search for pizza in your location.

I don't see driving directions as a signal ever de-throning proximity, because proximity is closer to the nature of the offline/online interaction.

Q: There are many different local directories which are highly relevant to local, while there are also vertical specific directories which might be tied to travel reviews or listing doctors. Some of these services (say like OpenTable) also manage bookings and so on. How important is it that local businesses "spread around" their marketing efforts? When does it make sense to focus deeply on a specific platform or channel vs to promote on many of them?

A: This is a great question, Aaron! About 5 years ago, I believed that the only true game in town for any local business is Google. This was because, at that time, I wasn't invested in proper measurement of outcomes and metrics such as cost of customer acquisition, lead acquisition etc.

Local businesses, famous for their lack of budgets, should always "give" vertical platforms a try, even IYP type sites. This is why:

  • one needs to decrease dependence on Google because it's an increasingly fickle channel of traffic acquisition (Penguin and Panda didn't spare local websites),
  • sometimes, those vertical websites can produce great returns. I was positively surprised by the number of inquiries/leads one of our law firm clients got from a well known vertical platform.
  • using different marketing channels and measuring the right things can improve your marketing skills.

Keep in mind, basics need to be covered first: data aggregators, Google Places, creating a professional/usable/persuasive website, as well as developing a measurement model.

Q: What is the difference between incentivizing a reasonable number of reviews & being so aggressive that something is likely to be flagged as spam? How do you draw the line with trying to encourage customer reviews?

A: Reviews and review management have always been tricky, as well as important. We know two objective things about reviews:

  • consumers care about reviews when making a purchase and
  • reviews are important for your local search visibility.

Every local search/review platform worth its salt will have a policy in place discouraging incentivized and "bought" reviews. They will enforce this policy using algorithms or humans. We all know that.

Small and medium sized businesses make the mistake of trying to get as many reviews as humanly possible, and direct them to one or two local search platforms. Here, they make two mistakes:

1. they're driven by a belief that one needs a huge number of reviews on Google and
2. one needs to direct all their review efforts at Google.

This behavior gets them flagged algorithmically or manually. Neither Google nor Yelp wants you to solicit reviews.

However, if you change your approach from aggressively asking for reviews to a survey-based approach, you should be fine.

What do I mean by that?

A survey-based approach means you solicit your customers' opinions on different services/products to improve your operations - and then ask them to share their opinion on the web while giving them plenty of choices.

This approach will get you much further than mindlessly begging people for reviews and sending them to Google.

The problem with drawing a clear distinction between the right and wrong way to handle reviews, as far as Google goes, lies in their constant changing of guidelines regarding reviews.

Things to remember are: try to get reviews on plenty of sites, while surveying your customers and never get too aggressive. Slow and steady wins the race.

Q: On many local searches people are now getting carouseled away from generic searches toward branded searches before clicking through, and then there is keyword(not provided) on top of that. What are some of the more cost efficient ways a small business can track & improve their ranking performance when so much of the performance data is hidden/disconnected?

A: Are you referring to ranking in Maps or organic part of the results? I'm asking because Google doesn't blend anymore.

Q: I meant organic search

A: OK. My advice has always been to not obsess over rankings, but over customer acquisition numbers, leads, lifetime customer value etc.

However, rankings are objectively a very important piece of the puzzle. Here are my suggestions when it comes to more cost efficient ways to track and improve ranking performance:

  • When it comes to tracking, I'd use Advanced Web Ranking (AWR) or Authority Labs, both of which are not very expensive.
  • Improving ranking performance is another story. Local websites should be optimized based on the same principles that would work for any site (copy should be written for conversion, pages should be focused on narrow topics, titles should be written for clickthrough rates etc).
  • On the link building side of things, I'd suggest taking care of data aggregators first as a very impactful, yet cost effective strategy. Then, I would go after vertical platforms that link directly to a website, that have profiles chock-full of structured data. I would also make sure to join relevant industry and business associations, and generally go after links that only a real local business can get - or that come as a result of broader marketing initiatives. For example, one can organize events in the offline world that can result in links and citations, effectively increasing their search visibility without spending too much.

Q: If you are a local locksmith, how do you rise above the spam which people have publicly complained about for at least 5 years straight now?

A: If I were a local locksmith, I would seriously consider moving my operations close to the centroid of my town/city. I would also make sure my business data across the web is highly consistent.

In addition, I would make sure to facilitate getting reviews on many platforms. If this wouldn't be enough (as it often isn't in many markets), I would be public about Google's inability to handle locksmith spam in my town - using their forums, and any other medium.

Q: In many cities do you feel the potential ROI would be high enough to justify paying for downtown real estate then? Or would you suggest having a mailing related address or such?

A: The ROI of getting a legitimate downtown address would greatly depend on customer lifetime value. For example, if I were a personal injury attorney in a major city, I would definitely consider opening a small office near a center of my city/town.

Another thing to consider would be the search radius/location sensitivity. If the location sensitivity for a set of keywords is high, I would be more inclined to invest in a downtown office.

I wouldn't advocate PO boxes or virtual offices, since Google is getting more aggressive about weeding those out.

Q: Google recently started supporting microformats for things like hours of operation, phone numbers, and menus. How important is it for local businesses to use these sorts of features?

A: It is not a crucial ranking factor, and is unlikely to be any time in the near future. However, Google tends to reward businesses that embrace their new features - at least in local search. I would definitely recommend embracing microformats in local search.
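For concreteness, the hours/phone/menu features mentioned above are usually expressed as schema.org structured data embedded in the page. A minimal sketch of a LocalBusiness-style payload in JSON-LD form (the business details are hypothetical, and this is one common encoding, not the only one Google accepts):

```python
import json

# Sketch of schema.org structured data (JSON-LD) covering the local features
# mentioned above: hours of operation, phone number, and a menu link.
# All business details are hypothetical.

listing = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Example Pizzeria",
    "telephone": "+1-303-555-0100",
    "menu": "https://example.com/menu",
    "openingHours": ["Mo-Fr 11:00-22:00", "Sa-Su 12:00-23:00"],
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Denver",
        "addressRegion": "CO",
    },
}

# This payload would be embedded in the page inside a
# <script type="application/ld+json"> ... </script> block.
payload = json.dumps(listing, indent=2)
print(payload)
```

Keeping this markup in sync with the NAP data you push through the aggregators is the low-effort part of "embracing Google's new features."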

Q: As a blogger I've noticed an increase in comment spam with NAP information in it. Do you see Google eventually penalizing people for that? Is this likely to turn into yet another commonplace form of negative SEO?

A: This is a difficult question. Knowing how Google operates, it's possible they start penalizing that practice. However, I don't see that type of spam being particularly effective.

Most blogs cannot do a lot to enhance location prominence. But if that turned into a negative SEO avenue, I would say that Google wouldn't handle it well (based on their track record).

Q: Last year you wrote a popular guide to local search. What major changes have happened to the ecosystem since then? Would you change any of the advice you gave back then? Or has local search started to become more stable recently?

A: There weren't huge changes in the local ecosystem. Google has made a lot of progress in transferring accounts to the new dashboard, improving the Bulk upload function. They also changed their UX slightly.

Moz entered the local search space with their Moz Local product.

Q: When doing a local SEO campaign, how much of the workload tends to be upfront stuff versus ongoing maintenance work? For many campaigns is a one-off effort enough to last for a significant period of time? How do you determine the best approach for a client in terms of figuring out the mix of upfront versus maintenance and how long it will take results to show and so on?

A: This largely depends on the objective of the campaign, the market and the budget. There are verticals where local Internet marketing is extremely competitive, and tends to be a constant battle.

Some markets, on the other hand, are easy and can largely be a one-off thing. For example, if you're a plumber or an electrician in a small town with a service area limited to that town, you really don't need much maintenance, if any.

However, if you are a roofing company that wants to be a market leader in greater Houston, TX your approach has to be much different.

The upfront work tends to be more intense if the business has NAP inconsistencies, never did any Internet marketing and doesn't excel at offline marketing.

If you're a brand offline and know how to tie your offline and online marketing efforts together, you will have a much easier time getting the most out of the web.

In most smaller markets, the results can be seen in a span of just a few months. More competitive markets, in my experience, require more time and a larger investment.

Q: When does it make sense for a local business to DIY versus hiring help? What tools do you recommend they use if they do it themselves?

A: If a local business owner is in a position where doing local Internet marketing is their highest value activity, it would make sense to do it themselves.

However, more often than not, this is not the case even for the smallest of businesses. Being successful in local Internet marketing in a small market is not that difficult. But it does come with a learning curve and a cost in time.

Having said that, if the market is not that competitive, taking care of data aggregators, a few major local search platforms and acquisition of a handful of industry links would do the trick.

For data aggregators, one might go directly to them or use a tool such as UBM or Moz Local.

To dig for citations, Whitespark's citation tool is pretty good and not that expensive.

Q: The WSJ recently published a fairly unflattering article about some of the larger local search firms which primarily manage AdWords for tens of thousands of clients & rely on aggressive outbound marketing to offset high levels of churn. Should a small business consider paid search & local as being separate from one another or part of the same thing? If someone hires help on these fronts, where's the best place to find responsive help?

A: "Big box" local search companies were always better at client acquisition than at performance. It always seemed as if performance wasn't an integral part of their business model.

However, small businesses cannot take that approach when it comes to performance. Generally speaking, the more the web is connected to the business, the better off a small business is. This means that a local Internet marketing strategy should start with business objectives.

Everyone should ask themselves 2 questions:
1. What's my lifetime customer value?
2. How much can I afford to spend on acquiring a customer?

Every online marketing endeavor should be judged through this lens. This means greater integration.

Q: What are some of the best resources people can use to get the fundamentals of local search & to keep up with the changing search landscape?

A: Luckily for everyone, the blogosphere in local search is rich in useful information. I would definitely recommend Mike Blumenthal's blog, Andrew Shotland's Local SEO Guide, Linda Buquet's forum, Nyagoslav Zhekov, Mary Bowling and, of course, the Local U blog.


Vedran Tomic is a member of SEOBook and founder of Local Ants LLC, a local internet marketing agency. Please feel free to use the comments below to ask any local search questions you have, as Vedran will be checking in periodically to answer them over the next couple days.

Google's Effective 'White Hat' Marketing Case Study

Apr 15th

There's the safe way & the high risk approach. The shortcut takers & those who win through hard work & superior offering.

One is white hat and the other is black hat.

With the increasing search ecosystem instability over the past couple years, some see these labels constantly sliding, sometimes on an ex-post-facto basis, turning thousands of white hats into black hats arbitrarily overnight.

Are you a white hat SEO? or a black hat SEO?

Do you even know?

Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.

.
.
.
.
.
.
.

Seriously, go read it now.

It's fantastic journalism & an important read for anyone who considers themselves an SEO.

.
.
.
.
.
.
.
.

######

Take the offline analog of Google's search "quality" guidelines & you'll find that, in spirit, Google repeatedly violated every single one of them.

Advertorials

Creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Advertorials or native advertising where payment is received for articles that include links that pass PageRank

Advertorials are spam, except when they are not: "the staff and professors at GMU’s law center were in regular contact with Google executives, who supplied them with the company’s arguments against antitrust action and helped them get favorable op-ed pieces published"

Deception

Don't deceive your users.

Ads should be clearly labeled, except when they are not: "GMU officials later told Dellarocas they were planning to have him participate from the audience," which is just like an infomercial that must be labeled as an advertisement!

Preventing Money from Manipulating Editorial

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

Money influencing outcomes is wrong, except when it's not: "Google’s lobbying corps — now numbering more than 100 — is split equally, like its campaign donations, among Democrats and Republicans. ... Google became the second-largest corporate spender on lobbying in the United States in 2012."

Content Quality

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

Payment should be disclosed, except when it shouldn't: "The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Google’s involvement was not publicly disclosed."

Cloaking

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Cloaking is evil, except when it's not: Even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. “We will certainly limit who we announce publicly from Google”

...and on and on and on...

It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.

And while they may not approve of something, that doesn't mean they avoid the strategy when mapping out their own approach.

There's a lesson & it isn't a particularly subtle one.

Free markets aren't free. Who could have known?

Bing Lists 'Alternatives' In Search Results

Mar 30th
posted in
msn

Bing recently started testing listing 'alternatives' near their local search results.

I wasn't able to replicate these in other search verticals like flight search, or on an iPhone search, but the format of these alternatives looks similar to the format proposed in Google's ongoing monopolistic abuse case in Europe:

"In effect, competitors will have the 'choice' either to pay Google in order to remain relevant or lose visibility and become irrelevant," a European consumer watchdog, BEUC, said in a letter it sent to all 28 EU commissioners. The letter, seen by The Wall Street Journal, terms the deal "unacceptable."

Flip Guest Blogging on its Head, With Steroids

Mar 19th

Guest blogging was once considered a widely recommended white hat technique.

Today our monopoly-led marketplace arbitrarily decided this is no longer so.

Stick a fork in it. Torch it. Etc.

Now that the rules have changed ex post facto, we can expect to deal with a near endless stream of "unnatural" link penalties for doing what was seen at the time as being:

  • natural
  • widespread
  • common
  • low risk
  • best practice

Google turns your past client investments into new cost centers & penalties. This ought to be a great thing for the SEO industry. Or maybe not.

As Google scares & expunges smaller players from participating in the SEO market, larger companies keep chugging along.

Today a friend received the following unsolicited email:

Curious about their background, he looked up their past coverage: "Written then offers a number of different content licenses that help the advertiser reach this audience, either by re-branding the existing page, moving the content to the advertiser’s website and re-directing traffic there, or just re-publishing the post on the brand’s blog."

So that's basically guest blogging at scale.

And it's not only guest blogging at scale, but it is guest blogging at scale based on keyword performance:

"You give us your gold keywords. Written finds high-performing, gold content with a built-in, engaged audience. Our various license options can bring the audience to you or your brand to the audience through great content."

What's worse is how they pitch this to the people they license content from:

I'm sorry, but taking your most valuable content & turning it into duplicate content by syndicating it onto a Fortune 500 website will not increase your traffic. The Fortune 500 site will outrank you (especially if visitors/links are 301 redirected to their site!). And even when visitors are not redirected, it will still typically outrank you due to its huge domain authority (and the cross-domain rel=canonical tag), leading the content on your site to get filtered out of the search results as duplicate content & your link equity to pass on to the branded advertiser.

And if Google were to come down on anyone in the above sort of situation it would likely be the smaller independent bloggers who get hit.

This is how SEO works.

Smaller independent players innovate & prove the model.

Google punishes them for being innovative.

As they are punished, a vanilla corporate tweak of the same model rolls out and is white hat.

In SEO it's not what you do that matters - it's who your client is.

If you're not working for a big brand, you're doing it wrong.

Four legs good, two legs better.

Disavow & Link Removal: Understanding Google

Jan 26th

Fear Sells

Few SEOs took notice when Matt Cutts mentioned on TWIG that "breaking their spirits" was essential to stopping spammers. But that single piece of information adds layers of insight around things like:

  • duplicity on user privacy on organic versus AdWords
  • benefit of the doubt for big brands versus absolute apathy toward smaller entities
  • the importance of identity versus total wipeouts of those who are clipped
  • mixed messaging on how to use disavow & the general fear around links

From Growth to No Growth

Some people internalize failure when growth slows or stops. One can't raise venture capital and keep selling the dream of the growth story unless the blame is internalized. If one understands that another dominant entity (a monopoly) is intentionally subverting the market, then a feel-good belief in the story of unlimited growth flames out.

Most of the growth in the search channel is being absorbed by Google. In RKG's Q4 report they mentioned that mobile ad clicks were up over 100% for the year & mobile organic clicks were only up 28%.

Investing in Fear

There's a saying in investing that "genius is declining interest rates" but when the rates reverse the cost of that additional leverage surfaces. Risks from years ago that didn't really matter suddenly do.

The same is true with SEO. A buddy of mine mentioned getting a bad link example from Google where the link was in place longer than Google has been in existence. Risk can arbitrarily be added after the fact to any SEO activity. Over time Google can keep shifting the norms of what is acceptable. So long as they are fighting off Wordpress hackers and other major issues they are kept busy, but when they catch up on that stuff they can then focus on efforts to shift white to gray and gray to black - forcing people to abandon techniques which offered a predictable positive ROI.

Defunding SEO is an essential & virtuous goal.

Hiding data (and then giving crumbs of it back to profile webmasters) is one way of doing it, but adding layers of risk is another. What panda did to content was add a latent risk to content where the cost of that risk in many cases vastly exceeded the cost of the content itself. What penguin did to links was the same thing: make the latent risk much larger than the upfront cost.

As Google dials up their weighting on domain authority many smaller sites which competed on legacy relevancy metrics like anchor text slide down the result set. When they fall down the result set, many of those site owners think they were penalized (even if their slide was primarily driven by a reweighting of factors rather than an actual penalty). Since there is such rampant fearmongering on links, they start there. Nearly every widely used form of link building has been promoted by Google engineers as being spam.

  • Paid links? Spam.
  • Reciprocal links? Spam.
  • Blog comments? Spam.
  • Forum profile links? Spam.
  • Integrated newspaper ads? Spam.
  • Article databases? Spam.
  • Designed by credit links? Spam.
  • Press releases? Spam.
  • Web 2.0 profile & social links? Spam.
  • Web directories? Spam.
  • Widgets? Spam.
  • Infographics? Spam.
  • Guest posts? Spam.

It doesn't make things any easier when Google sends out examples of spam links which are sites the webmaster has already disavowed or sites which Google explicitly recommended in their webmaster guidelines, like DMOZ.

It is quite the contradiction where Google suggests we should be aggressive marketers everywhere EXCEPT for SEO & basically any form of link building is far too risky.

It’s a strange world where when it comes to social media, Google is all promote promote promote. Or even in paid search, buy ads, buy ads, buy ads. But when it comes to organic listings, it’s just sit back and hope it works, and really don’t actively go out and build links, even though those are so important. - Danny Sullivan

Google is in no way a passive observer of the web. Rather they actively seek to distribute fear and propaganda in order to take advantage of the experiment effect.

They can find and discredit the obvious, but most on their “spam list” done “well” are ones they can’t detect. So, it’s easier to have webmasters provide you a list (disavows), scare the ones that aren’t crap sites providing the links into submission and damn those building the links as “examples” – dragging them into town square for a public hanging to serve as a warning to anyone who dare disobey the dictatorship. - Sugarrae

This propaganda is so effective that email spammers promoting "SEO solutions" are now shifting their pitches from grow your business with SEO to recover your lost traffic.

Where Do Profits Come From?

I saw Rand tweet this out a few days ago...

... and thought "wow, that couldn't possibly be any less correct."

When ecosystems are stable you can create processes which are profitable & pay for themselves over the longer term.

I very frequently get the question: 'what’s going to change in the next 10 years?' And that is a very interesting question; it’s a very common one. I almost never get the question: 'what’s not going to change in the next 10 years?' And I submit to you that that second question is actually the more important of the two – because you can build a business strategy around the things that are stable in time….in our retail business, we know that customers want low prices and I know that’s going to be true 10 years from now. They want fast delivery, they want vast selection. It’s impossible to imagine a future 10 years from now where a customer comes up and says, 'Jeff I love Amazon, I just wish the prices were a little higher [or] I love Amazon, I just wish you’d deliver a little more slowly.' Impossible. And so the effort we put into those things, spinning those things up, we know the energy we put into it today will still be paying off dividends for our customers 10 years from now. When you have something that you know is true, even over the long-term, you can afford to put a lot of energy into it. - Jeff Bezos at re: Invent, November, 2012

When ecosystems are unstable, anything approaching boilerplate has an outsized risk added by the dominant market participant. The quicker your strategy can be done at scale or in the third world, the quicker Google shifts it from a positive to a negative ranking signal. It becomes much harder to train entry level employees on the basics when some of the starter work they did in years past now causes penalties. It becomes much harder to manage client relationships when their traffic spikes up and down, especially if Google sends out rounds of warnings they later semi-retract.

What's more, anything that is vastly beyond boilerplate tends to require a deeper integration and a higher level of investment - making it take longer to pay back. But the budgets for such engagement dry up when the ecosystem itself is less stable. Imagine the sales pitch, "I realize we are off 35% this year, but if we increase the budget 500% we should be in a good spot a half-decade from now."

All great consultants aim to do more than the bare minimum in order to give their clients a sustainable competitive advantage, but by removing things which are scalable and low risk Google basically prices out the bottom 90% to 95% of the market. Small businesses which hire an SEO are almost guaranteed to get screwed because Google has made delivering said services unprofitable, particularly on a risk-adjusted basis.

Being an entrepreneur is hard. Today Google & Amazon are giants, but it wasn't always that way. Add enough risk and those streams of investment in innovation disappear. Tomorrow's Amazon or Google of other markets may die a premature death. You can't see what isn't there until you look back from the future - just like the answering machine AT&T held back from public view for decades.

Meanwhile, the Google Venture backed companies keep on keeping on - they are protected.

When ad agencies complain about the talent gap, what they are really complaining about is paying people what they are worth. But as the barrier to entry in search increases, independent players die, leaving more SEOs to chase fewer corporate jobs at lower wages. Even companies servicing fortune 500s are struggling.

On an individual basis, creating value and being fairly compensated for the value you create are not the same thing. Look no further than companies like Google & Apple which engage in flagrantly illegal anti-employee cartel agreements. These companies "partnered" with their direct competitors to screw their own employees. Even if you are on a winning team it does not mean that you will be a winner after you back out higher living costs and such illegal employer agreements.

"This is now called the winner-take-all society. In other words, the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead?" - William K Black

Meanwhile, complaints about the above sorts of inequality or other forms of asset stripping are pitched as being aligned with Nazi Germany's treatment of Jews. Obviously we need more H-1B visas to further drive down wages even as graduates are underemployed with a mountain of debt.

A Disavow For Any (& Every) Problem

Removing links is perhaps the single biggest growth area in SEO.

Just this week I got an unsolicited email from an SEO listing directory:

We feel you may qualify for a Top position among our soon to be launched Link Cleaning Services Category and we would like to learn more about Search Marketing Info. Due to the demand for link cleaning services we're poised to launch the link cleaning category. I took a few minutes to review your profile and felt you may qualify. Do you have time to talk this Monday or Tuesday?

Most of the people I interact with tend to skew toward the more experienced end of the market. Some of the folks who join our site do so after their traffic falls off. In some cases the issues look intimately tied to Panda & the sites with hundreds of thousands of pages maybe only have a couple dozen inbound links. In spite of having few inbound links & us telling people the problem looks to be clearly aligned with Panda, some people presume that the issue is links & they still need to do a disavow file.

Why do they make that presumption? It's the fear message Google has been selling nonstop for years.

Punishing people is much different, and dramatic, from not rewarding. And it feeds into the increasing fear that people might get punished for anything. - Danny Sullivan

What happens when Google hands out free all-you-can-eat gummy bear laxatives to children at the public swimming pool? A tragedy of the commons.

Rather than questioning or countering the fear stuff, the role of the SEO industry has largely been to act as lap dogs, syndicating & amplifying the fear.

  • link tool vendors want to sell proprietary clean up data
  • SEO consultants want to tell you that they are the best and if you work with someone else there is a high risk hidden in the low price
  • marketers who crap on SEO to promote other relabeled terms want to sell you on the new term and paint the picture that SEO is a self-limiting label & a backward looking view of marketing
  • paid search consultants want to enhance the perception that SEO is unreliable and not worthy of your attention or investment

Even entities with a 9 figure valuation (and thus plenty of resources to invest in a competent consultant) may be incorrectly attributing SEO performance problems to links.

A friend recently sent me a link removal request from Buy Domains referring to a post which linked to them.

On the face of this, it's pretty absurd, no? A company which does nothing but trade in names themselves asks that their name reference be removed from a fairly credible webpage recommending them.

The big problem for Buy Domains is not backlinks. They may have had an issue with some of the backlinks from PPC park pages in the past, but now those run through a redirect and are nofollowed.

Their big issue is that they have less than great engagement metrics (as do most marketplace sites other than eBay & Amazon which are not tied to physical stores). That typically won't work if the entity has limited brand awareness coupled with having nearly 5 million pages in Google's index.

They not only have pages for each individual domain name, but they also link to their internal search results from their blog posts & those search pages are indexed. Here's part of a recent blog post:

And here are examples of the thin listing sorts of pages which Panda was designed in part to whack. These pages were among the millions indexed in Google.

A marketplace with millions of pages that doesn't have broad consumer awareness is likely to get nailed by Panda. And the websites linking to it are likely to end up in disavow files, not because they did anything wrong but because Google is excellent at nurturing fear.

What a Manual Penalty Looks Like

Expedia saw a 25% decline in search visibility due to an unnatural links penalty, causing their stock to fall 6.4%. Both Google & Expedia declined to comment. It appears that the eventual Expedia undoing stemmed from Hacker News feedback & coverage of an outing story on an SEO blog that certainly sounded like an extortion attempt. USA Today asked if the Expedia campaign was a negative SEO attack.

While Expedia's stock drop was anything but trivial, they will likely recover within a week to a month.

Smaller players can wait and wait and wait and wait ... and wait.

Manual penalties are no joke, especially if you are a small entity with no political influence. The impact of them can be absolutely devastating. Such penalties are widespread too.

In Google's busting bad advertising practices post they highlighted having zero tolerance, banning more than 270,000 advertisers, removing more than 250,000 publishers accounts, and disapproving more than 3,000,000 applications to join their ad network. All that was in 2013 & Susan Wojcicki mentioned Google having 2,000,000 sites in their display ad network. That would mean that something like 12% of their business partners were churned last year alone.
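
That "something like 12%" figure is simply the banned-publisher count divided by the size of the display network. A minimal back-of-the-envelope sketch, using only the figures cited in the paragraph above:

```python
# Rough partner-churn estimate from the figures Google published (2013 data).
banned_publishers = 250_000   # publisher accounts removed from the ad network
network_sites = 2_000_000     # display network size per Susan Wojcicki

churn_rate = banned_publishers / network_sites
print(f"Estimated partner churn: {churn_rate:.1%}")  # → 12.5%
```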

If Google's churn is that aggressive on their own partners (where Google has an economic incentive for the relationship) imagine how much broader the churn is among the broader web. In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & they get about 5,000 reconsideration request messages each week, so over 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time.
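
The "over 95%" claim follows from converting weekly reconsideration requests into a monthly figure and comparing that against manual actions; a quick sketch, assuming roughly 52/12 weeks per month:

```python
# Share of penalized sites that never file a reconsideration request,
# from the figures Matt Cutts cited.
manual_actions_per_month = 400_000
requests_per_week = 5_000
requests_per_month = requests_per_week * 52 / 12  # roughly 21,667

never_reply = 1 - requests_per_month / manual_actions_per_month
print(f"Share never replying: {never_reply:.0%}")  # → 95%
```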

The Disavow Threat

Originally when disavow was launched it was pitched as something to be used with extreme caution:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.

Recently Matt Cutts has encouraged broader usage. He has one video which discusses proactively disavowing bad links as they come in & another where he mentions how a large company disavowed 100% of the backlinks that came in for a year.

The idea of proactively monitoring your backlink profile is quickly becoming mainstream - yet another recurring fixed cost center in SEO with no upside to the client (unless you can convince the client SEO is unstable and they should be afraid - which would ultimately retard their long-term investment in SEO).

Given the harshness of manual actions & algorithms like Penguin, they drive companies to desperation, acting irrationally based on fear.

People are investing to undo past investments. It's sort of like riding a stock down 60%, locking in the losses by selling it, and then using the remaining 40% of the money to buy put options or short sell the very same stock. :D

Some companies are so desperate to get links removed that they "subscribe" sites that linked to them organically with spam email messages asking the links be removed.

Some go so far that they not only email you on and on, but they created dedicated pages on their site claiming that the email was real.

What's so risky about the above is that many webmasters will remove links sight unseen, even from an anonymous Gmail account. Mix in the above sort of "this message is real" stuff and how easy would it be for a competitor to target all your quality backlinks with a "please remove my links" message? Further, how easy would it be for a competitor aware of such a campaign to drop a few hundred dollars on Fiverr or Xrummer or other similar link sources, building up your spam links while removing your quality links?

A lot of the "remove my link" messages are based around lying to the people who are linking & telling them that the outbound link is harming them as well: "As these links are harmful to both yours and our business after penguin2.0 update, we would greatly appreciate it if you would delete these backlinks from your website."

Here's the problem though. Even if you spend your resources and remove the links, people will still likely add your site to their disavow file. I saw a YouTube video recording of an SEO conference where 4 well known SEO consultants mentioned that even if they remove the links "go ahead and disavow anyhow," so there is absolutely no upside for publishers in removing links.

How Aggregate Disavow Data Could Be Used

Recovery is by no means guaranteed. In fact, of the people who go to the trouble of removing many links & creating a disavow file, only 15% claim to have seen any benefit.

The other 85% who weren't sure of any benefit may have not only wasted their time, but also moved some of their other projects closer toward being penalized.

Let's look at the process:

  • For the disavow to work you also have to have some links removed.
    • Some of the links that are removed may not have been the ones that hurt you in Google, thus removing them could further lower your rank.
    • Some of the links you have removed may be the ones that hurt you in Google, while also being ones that helped you in Bing.
    • The Bing & Yahoo! Search traffic hit comes immediately, whereas the Google recovery only comes later (if at all).
  • Many forms of profit (from client services or running a network of sites) come from systematization. If you view everything that is systematized or scalable as spam, then you are not only disavowing to try to recover your penalized site, but you are also sending co-citation disavow data to Google, which could have them torch other sites connected to those same sources.
    • If you run a network of sites & use the same sources across your network and/or cross link around your network, you may be torching your own network.
    • If you primarily do client services & disavow the same links you previously built for past clients, what happens to the reputation of your firm when dozens or hundreds of past clients get penalized? What happens if a discussion forum thread on Google Groups or elsewhere starts up where your company gets named & then a tsunami of pile on stuff fills out in the thread? Might that be brand destroying?

The disavow and review process is not about recovery, but is about collecting data and distributing pain in a game of one-way transparency. Matt has warned that people shouldn't lie to Google...

...however Google routinely offers useless non-information in their responses.

Some Google webmaster messages leave a bit to be desired.

Recovery is uncommon. Your first response from Google might take a month or more. If you work for a week or two on clean up and then the response takes a month, the penalty has already lasted at least 6 weeks. And that first response might be something like this

Reconsideration request for site.com: Site violates Google's quality guidelines

We received a reconsideration request from a site owner for site.com/.

We've reviewed your site and we believe that site.com/ still violates our quality guidelines. In order to preserve the quality of our search engine, pages from site.com/ may not appear or may not rank as highly in Google's search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.

For more specific information about the status of your site, visit the Manual Actions page in Webmaster Tools. From there, you may request reconsideration of your site again when you believe your site no longer violates the quality guidelines.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum.

Absolutely useless.

Zero useful information whatsoever.

As people are unsuccessful in the recovery process they cut deeper and deeper. Some people have removed over 90% of their profile without recovering & been nearly a half-year into the (12-step) "recovery" process before even getting a single example of a bad link from Google. In some cases these bad links Google identified were obviously created by third party scraper sites & were not in Google's original sample of links to look at (so even if you looked at every single link they showed you & cleaned up 100% of the issues, you would still be screwed).

Another issue with aggregate disavow data is there is a lot of ignorance in the SEO industry in general, and people who try to do things cheap (essentially free) at scale have an outsized footprint in the aggregate data. For instance, our site's profile links are nofollowed & our profiles are not indexed by Google. In spite of this, examples like the one below are associated with not 1 but 3 separate profiles for a single site.

Our site only has about 20,000 to 25,000 unique linking domains. However over the years we have had well over a million registered user profiles. If only 2% of the registered user profiles were ignorant spammers who spammed our profile pages and then later added our site to a disavow file, we would have more people voting *against* our site than we have voting for it. And that wouldn't be because we did anything wrong, but rather because Google is fostering an environment of mixed messaging, fear & widespread ignorance.
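
To make that imbalance concrete, here's the same arithmetic as a short sketch (the 2% spammer share is the hypothetical from the paragraph above, not a measured figure):

```python
# Disavow-vote imbalance from the figures in the post.
unique_linking_domains = 20_000      # lower bound cited for the site
registered_profiles = 1_000_000     # "well over a million" registered users
spammer_share = 0.02                 # hypothetical share of ignorant spammers

potential_disavows = registered_profiles * spammer_share
# Even a tiny spammer share outweighs the site's entire genuine link profile.
print(potential_disavows >= unique_linking_domains)  # → True
```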

And if we are ever penalized, the hundreds of scraper sites built off scraping our RSS feed would make the recovery process absolutely brutal.

Another factor, with Google saying "you haven't cut out enough bone marrow yet" while suggesting that virtually any/every type of link is spam, is that there are going to be many other forms of false positives in the aggregate data.

I know some companies specializing in link recovery which in part base some aspects of their disavows on the site's ranking footprint. Well if you get a manual penalty, a Panda penalty, or your site gets hacked, then those sorts of sites which you are linking to may re-confirm that your site deserves to be penalized (on a nearly automated basis with little to no thought) based on the fact that it is already penalized. Good luck on recovering from that as Google folds in aggregate disavow data to justify further penalties.

Responsibility

All large ecosystems are gamed. We see it with app ratings & reviews, stealth video marketing, advertising, malware installs, and of course paid links.

Historically in search there has been the view that you are responsible for what you have done, but not the actions of others. The alternate roadmap would lead to this sort of insanity:

Our system has noticed that in the last week you received 240 spam emails. As a result, your email account was temporarily suspended. Please contact the spammers, and once you have proof they unsubscribed you from their spam databases, we will reconsider reopening your email account.

As Google has closed down their own ecosystem, they allow their own $0 editorial to rank front & center even if it is pure spam, but third parties are now held to a higher standard - you could be held liable for the actions of others.

At the extreme, one of Google's self-promotional automated email spam messages sent a guy to jail. In spite of such issues, Google remains unfazed, adding a setting which allows anyone on Google+ to email other members.

Ask Google if they should be held liable for the actions of third parties and they will tell you to go to hell. Their approach to copyright remains fuzzy, they keep hosting more third party content on their own sites, and even when that content has been deemed illegal they scream that it undermines their first amendment rights if they are made to proactively filter:

Finally, they claimed they were defending free speech. But it's the courts which said the pictures were illegal and should not be shown, so the issue is the rule of law, not freedom of speech.
...
the non-technical management, particularly in the legal department, seems to be irrational to the point of becoming adolescent. It's almost as if they refuse to do something entirely sensible, and which would save them and others time and trouble, for no better reason than that someone asked them to.

Monopolies with nearly unlimited resources shall be held liable for nothing.

Individuals with limited resources shall be liable for the behavior of third parties.

Google Duplicity (beta).

Torching a Competitor

As people have become more acclimated toward link penalties, a variety of tools have been created to help make sorting through the bad ones easier.

"There have been a few tools coming out on the market since the first Penguin - but I have to say that LinkRisk wins right now for me on ease of use and intuitive accuracy. They can cut the time it takes to analyse and root out your bad links from days to minutes..." - Dixon Jones

But as more tools have been created for sorting out bad links & more tools automate sending link removal emails, two things have happened:

  • Google is demanding more links be removed to allow for recovery
  • people are becoming less responsive to link removal requests as they get bombarded with them
    • Some of these tools keep bombarding people over and over again weekly until the link is removed or the emails go to the spam bin
    • to many people the link removal emails are the new link request emails ;)
    • one highly trusted publisher who participates in our forums stated they filtered the word "disavow" to automatically go to their trash bin
    • on WebmasterWorld a member decided it was easier to delete their site than deal with the deluge of link removal spam emails

The problem with Google rewarding negative signals is there are false positives and it is far cheaper to kill a business than it is to build one. The technically savvy teenager who created the original version of the software used in the Target PoS attack sold the code for only $2,000.

There have been some idiotic articles, like this one on The Awl, suggesting that comment spamming is now dead as spammers run for the hills, but that couldn't be further from the truth. Some (not particularly popular) blogs get hundreds to thousands of spam comments daily & WordPress can have trouble even backing up the database (unless the comment spam is regularly deleted), as the database can quickly grow to a million records.

The spam continues but the targets change. A lot of these comments are now pointed at YouTube videos rather than ordinary websites.

As Google keeps leaning into negative signals, one can expect a greater share of spam links to be created for negative SEO purposes.

Maybe this maternity jeans comment spam is tied to the site owner, but if they didn't do it, how do they prove it?

Once again, I'll reiterate Bill Black:

This is called now the winner-take-all society. In other words, the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. 'Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead? - William K Black

The cost of "an academic test" can be as low as $5. You know you might be in trouble when you see fiverr.com/conversations/theirusername in your referrers:

Our site was hit with negative SEO. We have manually collected about 24,000 bad links for our disavow file (so far). It probably cost the perp $5 on Fiverr to point these links at our site. Do you want to know how bad that sucks? I'll tell you. A LOT!! Google should be sued en masse by webmasters for wasting our time with this "bad link" nonsense. For a company with so many Ph.D.s on staff, I can't believe how utterly stupid they are

Or, worse yet, you might see SAPE in your referrers

And if the attempt to get you torched fails, they can try & try again. The cost of failure is essentially zero. They can keep pouring on the fuel until the fire erupts.

Even Matt Cutts complains about website hacking, but that doesn't mean you are free of risk if someone else links to your site from hacked blogs. I've been forwarded unnatural link messages from Google which came about after a person's site was added to a SAPE hack by a third party in an attempt to conceal who the beneficial target was. When in doubt, Google may choose to blame all parties in a scorched-earth strategy.

If you get one of those manual penalties, you're screwed.

Even if you are not responsible for such links, and even if you respond the same day, and even if Google believes you, you are still likely penalized for AT LEAST a month. More likely Google will presume you are a liar and you will spend at least a second month in the penalty box. To recover you might have to waste days (weeks?) of your life & remove some of your organic links to show that you have gone through sufficient pain to appease the abusive market monopoly.

As bad as the above is, it is just the tip of the iceberg.

  • People can redirect torched websites.
  • People can link to you from spam link networks which rotate links across sites, so you can't possibly remove or even disavow all the link sources.
  • People can order you a subscription of those rotating spam links from hacked sites, where new spam links appear daily. Google mentioned discovering 9,500 malicious sites daily & surely the number has only increased from there.
  • People can tie any/all of the above with cloaking links or rel=canonical messages to GoogleBot & then potentially chain that through further redirects cloaked to GoogleBot.
  • And on and on ... the possibilities are endless.

Extortion

Another thing this link removal fiasco subsidizes is various layers of extortion.

Not only are there the harassing emails threatening to add sites to disavow lists if they don't remove the links, but some companies quickly escalate things from there. I've seen hosting abuse reports, threat letters from lawyers, and one friend was actually sued in court (and the people who sued him were the ones who originally had the link placed!)

Google created a URL removal tool which allows webmasters to remove pages from third party websites. How long until that is coupled with DDoS attacks? Once effective with removing one page, a competitor might decide to remove another.

Another approach to get links removed is to offer payment. But payment itself might encourage the creation of further spammy links as link networks look to replace their old cashflow with new sources.

The recent Expedia fiasco started as an extortion attempt: "if I wanted him to not publish it, he would 'sell the post to the highest bidder.'"

Another nasty issue here is articles like this one on Link Research Tools, where they not only highlight client lists of particular firms, but then state which URLs have not yet been penalized followed by "most likely not yet visible." So long as that sort of "publishing" is acceptable in the SEO industry, you can bet that some people will hire the SEOs nearly guaranteeing a penalty to work on their competitor's sites, while having an employee write a "case study" for Link Research Tools. Is this the sort of bullshit we really want to promote?

Some folks are now engaging in overt extortion:

I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn't pay him £10 per month to NOT do this.

Branding / Rebranding / Starting Over

Sites that are overly literal in branding likely have no chance at redemption. That triple hyphenated domain name in a market that is seen as spammy has zero chance of recovery.

Even being a generic unbranded site in a YMYL category can make you be seen as spam. The remote rater documents stated that the following site was spam...

... even though the spammiest thing on it was the stuff advertised in the AdSense ads:

For many (most?) people who receive a manual link penalty or are hit by Penguin it is going to be cheaper to start over than to clean up.

At the very minimum it can make sense to lay groundwork for a new project immediately just in case the old site can't recover or takes nearly a year to recover. However, even if you figure out the technical bits, as soon as you have any level of success (or as soon as you connect your projects together in any way) you once again become a target.

And you can't really invest in higher level branding functions unless you think the site is going to be around for many years to earn off the sunk cost.

Succeeding at SEO is not only about building rank while managing cashflow and staying unpenalized, but it is also about participating in markets where you are not marginalized due to Google inserting their own vertical search properties.

Even companies which are large and well funded may not succeed with a rebrand if Google comes after their vertical from the top down.

Hope & Despair

If you are a large partner affiliated with Google, hope is on your side & you can monetize the link graph: "By ensuring that our clients are pointing their links to maximize their revenue, we’re not only helping them earn more money, but we’re also stimulating the link economy."

You have every reason to be Excited, as old projects like Excite or Merchant Circle can be relaunched again and again.

Even smaller players with the right employer or investor connections are exempt from these arbitrary risks.

You can even be an SEO and start a vertical directory knowing you will do well if you can get that Google Ventures investment, even as other similar vertical directories were torched by Panda.

For most other players in that same ecosystem, the above tailwind is a headwind. Don't expect much 1 on 1 help in webmaster tools.

In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & receives about 5,000 reconsideration request messages each week, so roughly 95% of the sites which receive a notification never reply. Many of those who do reply are wasting their time. How many confirmed Penguin 1.0 recoveries are you aware of?
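Those figures are easy to sanity-check with a quick sketch (the 52/12 ≈ 4.33 weeks-per-month conversion is my assumption; the counts are the ones cited from the video):

```python
# Numbers cited from the video.
manual_actions_per_month = 400_000
reconsideration_requests_per_week = 5_000

# Convert weekly requests to monthly (assuming 52/12 weeks per month).
requests_per_month = reconsideration_requests_per_week * (52 / 12)
reply_rate = requests_per_month / manual_actions_per_month

print(f"{reply_rate:.1%} of penalized sites file a reconsideration request")  # 5.4%
print(f"{1 - reply_rate:.1%} never reply")  # 94.6%
```

So the true figure is a shade under 95%, but the order of magnitude is the point: the overwhelming majority of penalized sites simply walk away.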

Even if a recovery is deserved, it does not mean one will happen, as errors do happen. And on the off chance recovery happens, recovery does not mean a full restoration of rankings.

There are many things we can learn from Google's messages, but probably the most important is this:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way - in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only. - Charles Dickens, A Tale of Two Cities
