Loah Qwality Add Werds Clix Four U

Google recently announced they were doing away with exact match AdWords ad targeting this September. They will force all match types to have close variant keyword matching enabled. This means you get misspelled searches, plural versus singular overlap, and an undoing of your tight organization.

In some cases the user intent is different between singular and plural versions of a keyword. A singular version search might be looking to buy a single widget, whereas a plural search might be a user wanting to compare different options in the marketplace. In some cases people are looking for different product classes depending on word form:

For example, if you sell spectacles, the difference between users searching on ‘glass’ vs. ‘glasses’ might mean your ad is being shown to users interested in a building material rather than an aid to reading.

Where segmenting improved the user experience, boosted conversion rates, made management easier, and improved margins - those benefits are now off the table.

CPC isn't the primary issue. Profit margins are what matter. Once you lose the ability to segment you lose the ability to manage your margins. And this auctioneer is known to bid in their own auctions, have random large price spikes, and not give refunds when they are wrong.

An offline analogy for this loss of segmentation: you go to a gas station to get a bottle of water. After grabbing your water and handing the cashier a $20, they give you $3.27 back along with a six-pack you didn't want and didn't ask for.

Why does a person misspell a keyword? Some common reasons include:

  • they are new to the market & don't know it well
  • they are distracted
  • they are using a mobile device or something which makes it hard to input their search query (and those same input issues make it harder to perform other conversion-oriented actions)
  • their primary language is a different language
  • they are looking for something else

In any of those cases, the average value of the expressed intent is usually going to be lower than that of a person who correctly spelled the keyword.

Even if spelling errors were intentional and cultural, the ability to segment that and cater the landing page to match disappears. Or if the spelling error was a cue to send people to an introductory page earlier in the conversion funnel, that option is no more.

In many accounts the loss of the granular control won't cause too big of a difference. But some advertiser accounts in competitive markets will become less profitable and more expensive to manage:

No one who's in the know has more than about 5-10 total keywords in any one adgroup because they're using broad match modified, which eliminated the need for "excessive keyword lists" a long time ago. Now you're going to have to spend your time creating excessive negative keyword lists with possibly millions upon millions of variations so you can still show up for exactly what you want and nothing else.
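To make that scale problem concrete, here is a minimal sketch of what programmatic negative generation might look like. It is Python with toy rules (naive pluralization and single-character deletions standing in for real misspelling data); an actual list would be driven by your search query reports:

```python
from itertools import product

def plural_variants(keyword):
    """Toy singular/plural swaps for each word in the phrase."""
    options = []
    for w in keyword.split():
        forms = {w, w[:-1] if w.endswith("s") else w + "s"}
        options.append(sorted(forms))
    return {" ".join(combo) for combo in product(*options)}

def deletion_typos(keyword):
    """Single-character deletions as a crude stand-in for misspellings."""
    return {keyword[:i] + keyword[i + 1:] for i in range(len(keyword))}

exact = "blue widget"
negatives = (plural_variants(exact) | deletion_typos(exact)) - {exact}

for n in sorted(negatives):
    print(f"[{n}]")  # exact-match negative keyword syntax
```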

You might not know which end of the spectrum your account is on until disaster strikes:

I added negatives to my list for 3 months before finally giving up and opting out of close variants. What they viewed as a close variant was not even in the ballpark of what I sell. There have been petitions that have gotten Google to reverse bad decisions in the past. We need to make that happen again.

Brad Geddes has held many AdWords seminars for Google. What does he think of this news?

In this particular account, close variations have much lower conversion rates and much higher CPAs than their actual match type.
...
Variation match isn’t always bad; there are times it can be good to use variation match. However, there was choice.
...
Loss of control is never good. Mobile control was lost with Enhanced Campaigns, and now you’re losing control over your match types. This will further erode your ability to control costs and conversions within AdWords.

A monopoly restricting choice to enhance their own bottom line. It isn't the first time they've done that, and it won't be the last.

Have an enhanced weekend!

Understanding The Google Penguin Algorithm

Whenever Google does a major algorithm update, we all rush off to our data to see how rankings and search traffic changed, then look for trends to figure out what Google altered.

The two people I chat most with during periods of big algorithmic changes are Joe Sinkwitz and Jim Boykin. I recently interviewed them about the Penguin algorithm.

Topics include:

  • what it is
  • its impact
  • why there hasn't been an update in a while
  • how to determine if issues are related to Penguin or something else
  • the recovery process (from Penguin and manual link penalties)
  • and much, much more

Here's a custom drawing we commissioned for this interview.
Pang Win.


To date there have been 5 Penguin updates:

  • April 24, 2012
  • May 25, 2012
  • October 5, 2012
  • May 22, 2013 (Penguin 2.0)
  • October 4, 2013

There hasn't been one in quite a while, which is frustrating many who haven't been able to recover. On to the interview...

At its core what is Google Penguin?

Jim: It is a link filter that can cause penalties.

Joe: At its core, Penguin can be viewed as an algorithmic batch filter designed to punish lower quality link profiles.

What sort of ranking and traffic declines do people typically see from Penguin?

Jim: 30-98%. Actually, I've seen some "manual partial matches" where traffic was hardly hit...but that's rare.

Joe: Near total. I should expand. Penguin 1.0 has been a different beast than its later iterations; the first one has been nearly a fixed flag whereas later iterations haven't been quite as severe.

After the initial update there was another one about a month later & then one about every 6 months for a while. There hasn't been one for about 10 months now. So why have the updates been so rare? And why hasn't there been one for a long time?

Jim: Great question. We all believed there'd be an update every 6 months, and now it's been way longer than 6 months...maybe because Matt's on vacation...or maybe he knew it would be a long time until the next update, so he took some time off...or perhaps Google wants those with an algorithmic penalty to feel the pain for longer than 6 months.

Joe: 1.0 was temporarily escapable if you were willing to 301 your site; after 1.1 the redirect began to pass on the damage. My theory on why it has been so very long on the most recent update has to do with maximizing pain - Google doesn't intend to lift its boot off the throats of webmasters just yet; no amount of groveling will do. Add to that the complexity of every idiot disavowing 90%+ of their clean link profile, and 'dirty' vs. 'clean' links become difficult to ascertain from that signal.

Jim: Most people disavow some, then they disavow some more...then next month they disavow more...wait a year and they may disavow them all :)

Joe: Agreed.

Jim: Then Google will let them out...hehe, tongue in cheek...a little.

Joe: I've seen disavow files with over 98% of links in there, including Wikipedia, the Yahoo! Directory, and other great sources - absurd.

Jim: Me too. Most of the people are clueless...there are tons of people disavowing links just because their traffic has gone down, so they figure they must have been hit by Penguin, and they start disavowing links.

Joe: Yes; I've seen a lot of panda hits where the person wants to immediately disavow. "whoa, slow down there Tex!"

Jim: I've seen services where they guarantee you'll get out of a penguin penalty, and we know that they're just disavowing 100% of the links. Yes, you get your manual penalty removed that way, but then you're left with nothing.

Joe: Good time to mention that any guarantee of getting out of a penalty is likely sold as a bag of smoke.

Jim: or as they are disavowing 100% of the links they can find going to the site.

OK. I think you mentioned an important point there Jim about "100% of the links they can find." What are the link sources people should use & how comprehensive is the Google Webmaster Tools data? Is WMT data enough to get you recovered?

Joe: Rarely. I've seen cases where the examples listed in a manual action might be discoverable on Ahrefs, Majestic SEO, or in WMT, but upon cleaning them up (and disavowing further, of course) Google will come back with a few more links that weren't initially in the WMT data dump. I'm dealing with a client on this right now that bought a premium domain as-is and has been spending about a year constantly disavowing and removing links. Google won't let them up for air and won't do the hard reset.

Jim: Well first...if you're getting your backlinks from Google, be sure to pull your backlinks from the www and the non-www versions of your site. You can't just use one: you HAVE to pull backlinks from both, so you have to verify both the www and non-www versions of your site with Google Webmaster Tools.

We often start with that. When we find big patterns that we feel are the cause, we'll then go into OSE, Majestic SEO, and Ahrefs, and pull those backlinks too, and pull out those that fit the patterns, but that's after the Google backlink analysis.
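As an illustration of the merge step Jim describes, here is a short Python sketch that combines the various exports into one deduplicated set (the file and column names are hypothetical; each tool labels its URL column differently):

```python
import csv
from urllib.parse import urlparse

def load_links(path, url_column):
    """Pull source URLs out of a backlink export CSV; column names vary by tool."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column] for row in csv.DictReader(f) if row.get(url_column)}

# Hypothetical export files: both WMT properties plus the third-party tools Jim lists
exports = [("wmt_www.csv", "Source URL"),
           ("wmt_non_www.csv", "Source URL"),
           ("ose.csv", "URL"),
           ("majestic.csv", "SourceURL"),
           ("ahrefs.csv", "Referring Page URL")]

all_links = set()
for path, column in exports:
    all_links |= load_links(path, column)

def root(host):
    """Normalize a hostname so www/non-www variants collapse together."""
    host = host.lower()
    return host[4:] if host.startswith("www.") else host

# Collapse to linking domains so recurring patterns are easier to spot
domains = {root(urlparse(u).netloc) for u in all_links}
print(f"{len(all_links)} unique link URLs across {len(domains)} linking domains")
```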

Joe, you mentioned people getting hit by Panda and mistakenly going off to the races to disavow links. What are the distinguishing characteristics between Penguin, Panda & manual link penalties?

Joe: Given they like to sandwich updates to make it difficult to discern, I like this question. Penguin is about links; it is the easiest to find but hardest to fix. When I'm first looking at a URL I'll quickly look at anchor % breakdowns, sources of links, etc. The big difference between Penguin and a manual link penalty (if you aren't looking in WMT) is the timing -- think of a bomb going off vs a sniper...everyone complaining at once? Probably an algorithm; just a few? Probably some manual actions. For manual actions, you'll get a note in WMT too. With Panda I like to look first at the on-page to see if I can spot the egregious KW stuffing, weird infrastructure setups that result in thin/duplicated content, and look into engagement metrics and my favorite...the ratio of externally supported pages to total indexed pages.
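For the anchor % breakdown Joe mentions, a quick first pass might look like the following Python sketch (the CSV layout and column name are assumptions; a heavy skew toward exact-match money terms is the red flag):

```python
import csv
from collections import Counter

def anchor_breakdown(path, anchor_column="Anchor"):
    """Print the anchor text distribution from a backlink export (column name is tool-specific)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row.get(anchor_column) or "").strip().lower() or "(empty)"] += 1
    total = sum(counts.values())
    for anchor, n in counts.most_common(15):
        print(f"{100 * n / total:5.1f}%  {anchor}")

anchor_breakdown("backlinks.csv")
```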

Jim: Manual, at least you can keep resubmitting and get a yes or no. With an algorithmic penalty, you're screwed...because you're waiting for the next refresh...hoping you did enough to get out.

I don't mind going back and forth with Google with a manual penalty...at least I'm getting an answer.

If you see a drop in traffic, be sure to compare it to the dates of Panda and Penguin updates...if you see a drop on one of the update days, then you know whether you have Panda or Penguin....and if your traffic is just gradually falling, it could be just that, and no penalty.
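A rough way to run Jim's date comparison programmatically, using the Penguin dates listed earlier in this article (the traffic dictionary, window, and threshold are placeholders you would tune):

```python
from datetime import date, timedelta

# Penguin update dates from the list earlier in this article
PENGUIN_DATES = [date(2012, 4, 24), date(2012, 5, 25), date(2012, 10, 5),
                 date(2013, 5, 22), date(2013, 10, 4)]

def drops_near_updates(daily_visits, updates=PENGUIN_DATES, window=3, threshold=0.25):
    """Return updates where traffic fell by `threshold` or more across a +/- `window` day span.

    daily_visits: dict mapping datetime.date -> visit count (e.g., from an analytics export).
    """
    flagged = []
    for day in updates:
        before = daily_visits.get(day - timedelta(days=window))
        after = daily_visits.get(day + timedelta(days=window))
        if before and after and (before - after) / before >= threshold:
            flagged.append((day, before, after))
    return flagged
```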

Joe: While this interview was taking place an employee pinged me to let me know a manual action was denied, with an example URL being something akin to domain.com/?var=var&var=var - the entire domain was already disavowed. Those 20-second manual reviews by 3rd parties without much of an understanding of search don't generate a lot of confidence for me.

Jim: Yes, I posted this yesterday to SEOchat. Reviewers are definitely not looking at things.

You guys mentioned that anyone selling a guaranteed 100% recovery solution is likely selling a bag of smoke. What are the odds of recovery? When does it make sense to invest in recovery, when does it make sense to start a different site, and when does it make sense to do both in parallel?

Jim: Well, I'm one for trying to save a site. I haven't once said "it's over for that site, let's start fresh." Links are so important that if I can even save a few links going to a site, I'll take it. I'm not a fan of doing two sites; it causes duplicate content issues, and now your efforts are split across two sites.

Joe: It depends on the infraction. I have a lot more success getting stuff out of Panda, manual actions, and the later iterations of Penguin (theoretically including the latest one once a refresh takes place); I won't take anyone's money for those hit by Penguin 1.0 though...I give free advice and add it to my DB tracking, but the very few examples I have where a recovery took place that I can confirm were Penguin 1.0 and not something else happened due to being a beta user of the disavow tool, and likely occurred for political reasons vs tech reasons.

For churn and burn, redirects and canonicals can still work if you're clever...but that's not reinvestment so much as strategy shift I realize.

You guys mentioned the disavow process, where a person does some, does some more over time, etc. Is Google dragging out the process primarily to drive pain? Or are they leveraging the aggregate data in some way?

Joe: Oh absolutely they drag it out. Mathematically I think of triggers where a threshold to trigger down might be at X%, but the trigger for recovery might be X-10%. Further though, I think initially they looooooved all the aggregate disavow data, until the community freaked out and started disavowing everything. Let's just say I know of a group of people that have a giant network where lots of quality sites are purposefully disavowed in an attempt to screw with the signal further. :)

Jim: pain :) ... not sure if they're leveraging the data yet, but they might be. It shouldn't be too hard for Google to see that a ton of people are disavowing links from a site like get-free-links-directory.com, for Google to say, "no one else seems to trust these links, we should just nuke that site and not count any links from there."

We can do this ourselves with our own tools...I can see how many times I've seen a domain in my disavows, and how many times I disavowed it...i.e., if I see spamsite.com in 20 disavows I've done, and I disavowed it all 20 times I saw it, I can see this data...or if I've seen goodsite.com 20 times, and never once disavowed it, I can see that too. I'd assume Google must do something like this as well.
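A sketch of the cross-disavow tally Jim describes, assuming a hypothetical folder layout where each past audit keeps the domains reviewed and the disavow file actually filed:

```python
from collections import Counter
from pathlib import Path

times_seen, times_disavowed = Counter(), Counter()

# Hypothetical layout: every past audit keeps a candidates.txt (all domains reviewed)
# and a disavow.txt (the 'domain:' lines actually filed with Google)
for audit in Path("audits").iterdir():
    for domain in (audit / "candidates.txt").read_text().split():
        times_seen[domain] += 1
    for line in (audit / "disavow.txt").read_text().splitlines():
        if line.startswith("domain:"):
            times_disavowed[line[len("domain:"):].strip()] += 1

for domain, seen in times_seen.most_common():
    print(f"{domain}: seen {seen}x, disavowed {times_disavowed[domain]}x")
```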

Given that they drag it out, on the manual penalties does it make sense to do a couched effort on the first rejection or two, in order to give the perception of a greater level of pain and effort as you scale things up on further requests? What level of resources does it make sense to devote to the initial effort vs the next one and so on? When does recovery typically happen (in terms of % of links filtered and in terms of how many reconsideration requests were filed)?

Joe: When I deliver "disavow these" and "say this" stuff, I give multiple levels, knowing full well that there might be deeper and deeper considerations of the pain. Now, there have been cases where the 1st try gets a site out, but I usually see 3 or more.

Jim: I figure it will take a few reconsideration requests...and yes, I start "big" and get "bigger."

but that's for a sitewide penalty...

We've seen sitewides get reduced to a partial penalty. And once we have a partial penalty, it's much easier to identify the issues and take care of those, while leaving links that go to pages that were not affected.

A sitewide manual penalty kills the site...a partial match penalty usually has some stuff that ranks well, and some stuff that no longer ranks...once we're at a partial match, I feel much more confident about getting that resolved.

Jim, I know you've mentioned the errors people make in either disavowing great links or disavowing links when they didn't need to. You also mentioned the ability to leverage your old disavow data when processing new sites. When does it make sense to DIY on recovery versus hiring a professional? Are there any handy "rule of thumb" guidelines in terms of the rough cost of a recovery process based on the size of their backlink footprint?

Joe: It comes down to education, doesn't it? Were you behind the reason it got dinged? You might try that first vs immediately hiring. Psychologically it could even look like you're more serious after the first disavow is declined, by showing you "invested" in the pain. It also comes down to opportunity cost: what is your personal time worth, divided by your perceived probability of fixing it yourself?

Jim: We charge $5000 for the analysis, and $5000 for the link removal process...some may think that's expensive...but removing good links will screw you, and not removing bad links will screw you...it's a real science, and getting it wrong can cost you a lot more than this...of course I'd recommend seeing a professional, as I sell this service...but I can't see anyone who's not a true expert in links doing this themselves.

Oh...and once we start work for someone, we keep going at no further cost until they get out.

Joe: That's a nice touch Jim.

Jim: Thank you.

Joe, during this interview you mentioned a reconsideration request rejection where the person cited a link on a site that has already been disavowed. Given how many errors Google's reviewers make, does it make sense to aggressively push to remove links rather than using disavow? What are the best strategies to get links removed?

Joe: DDoS

Jim: hehe

Joe: Really though, be upfront and honest when using those link removal services (which I'd do vs trying to do them one-by-one-by-one)

Jim: Only 1% of the people will remove links anyway; it's more to show Google that you really tried to get the links removed.

Joe: Let the link holder know that you got hit with a penalty, you're just trying to clean it up because your business is suffering, and ask politely that they do you a solid favor.

I've been on the receiving end of a lot of different strategies given the size of my domain portfolio. I've been sued before (as a first course of action!) by someone that PAID to put a link on my site....they never even asked, just filed the case.

Jim: We send 3 removal requests...and ping the links too...so when we do a reconsideration request we can show Google the spreadsheet of who we emailed, when we emailed them, and who removed or nofollowed the links...but it's more about "show" to Google.

Joe: Yep, not a ton of compliance; webmasters have link removal fatigue by now.

This is more of a business question than an SEO question, but ... as much as budgeting for the monetary cost of recovery, an equally important form of budgeting is dealing with the reduced cashflow while the site is penalized. How many months does it typically take to recover from a manual penalty? When should business owners decide to start laying people off? Do you guys suggest people aggressively invest in other marketing channels while the SEO is being worked on in the background?

Jim: Manual penalties typically take 2-4 months to recover from. "Recover" is a relative term. Some people get "your manual penalty has been removed" and their recovery is a tiny blip - up 5%, but still down 90% from what it was prior. Getting a manual penalty removed is great IF there are good links left in your profile...if you've disavowed everything and your penalty is removed...so what...you've got nothing....People often ask where they'll be once they "recover" and I say "it depends on what you have left for links"...but it won't be where you were.

Joe: It depends on how exposed they are per variable costs. If the costs are fixed, then one can generally wait longer (all things being equal) before cutting. If you have a quarter million monthly link budget *cough* then, you're going to want to trim as quickly as possible just in order to survive.

Per investing in other channels, I cannot emphasize enough how important it is to become an expert in one channel and at least a generalist in several others...even better, hire an expert in another channel to partner up with. In payday loans one of the big players did okay in SEO, but even with a lot of turbulence was doing great due to their TV and radio capabilities. Also, collect the damn email addresses; email is still a gold mine if you use it correctly.

One of my theories for why there hasn't been a penguin update in a long time was that as people have become more afraid of links they've started using them as a weapon & Google doesn't want a bunch of false positives caused by competitors killing sites. One reason I've thought this versus the pain first motive is that Google could always put a time delay on recoveries while still allowing new sites to get penalized on updates. Joe, you mentioned that after the second Penguin update penalties started passing forward on redirects. Do people take penalized sites and point them at competitors?

Joe: Yes, they do. They also take them and pass them into the natural links of their competitors. I've been railing on negative SEO for several years now...right about when the first manual action wave came out in Jan 2012; that was a tipping point. It is now more economical to take someone else's rankings down than it is to (with a strong degree of confidence) invest in a link strategy to leapfrog them naturally.

I could speak for days straight in a congressional filibuster on link strategies used for Negative SEO. It is almost magical how pervasive it has become. I get a couple requests a week to do it even...by BIG companies. Brands being the mechanism to sort out the cesspool and all that.

Jim: Soon everyone will be monitoring their backlinks on a monthly basis. I know one big company that submits an updated disavow list to Google every week.

That leads to a question about preemptive disavows. When does it make sense to do that? What businesses need to worry about that sort of stuff?

Joe: Are you smaller than a Fortune 500? Then the cards are stacked against you. At the very least, be aware of your link profile -- I wouldn't go so far as to preemptively disavow unless something major popped up.

Jim: I've done a preemptive disavow for my site. I'd say everyone should do a preemptive disavow to clean out the crap backlinks.
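For anyone who has never filed one, here is a small sketch that writes a disavow file in Google's documented format ('#' comments, full URLs, or 'domain:' lines); the flagged entries below are just the example domains mentioned earlier in this interview:

```python
# Google's documented disavow file format: '#' comment lines, full URLs,
# or whole domains prefixed with 'domain:'. The entries here are examples.
flagged_domains = ["get-free-links-directory.com", "spamsite.com"]
flagged_urls = ["http://example.com/paid-links-page"]

with open("disavow.txt", "w") as f:
    f.write("# Preemptive disavow based on our own link audit\n")
    for domain in flagged_domains:
        f.write(f"domain:{domain}\n")
    for url in flagged_urls:
        f.write(f"{url}\n")
```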

Joe: I can't wait to launch an avow service...basically go around to everyone and charge a few thousand dollars to clean up their disavows. :)

Jim: We should team up Joe and do them together :)

Joe: I'll have my spambots call your spambots.

Jim: saving the planet from penguin penalties. cleaning up the links of the web for Google.

Joe: For Google or from Google? :) The other dig, if there's time, is that not all penalties are created equal because there are several books of law in terms of how long a penalty might last. If I take an unknown site and do what RapGenius did, I'd still be waiting, even after fixing (which rapgenius really didn't do) largely because Google is not one of my direct or indirect investors.

Perhaps SEOs will soon offer a service for perfecting your pitch deck for the Google Ventures or Google Capital teams so it is easier to BeatThatPenalty? BanMeNot ;)

Joe: Or to extract money from former Googlers...there's a funding bubble right now where those guys can write their own ticket by VCs chasing the brand. Sure the engineer was responsible for changing the font color of a button, but they have friends on the inside still that might be able to reverse catastrophe.

Outside of getting a Google investment, what are some of the best ways to minimize SEO risk if one is entering a competitive market?

Jim: Don't try to rank for specific phrases anymore. It's a long slow road now.

Joe: Being less dependent on Google gives you power; think of it like a job interview. Do you need that job? The less you do, the more bargaining power you have. If you have more and more income coming in to your site from other channels, chances are you are also hitting on some important brand signals.

Jim: You must create great things, and build your brand...that has to be the focus...unless you want to do things to rank higher quicker, and take the risk of a penalty with Google.

Joe: Agreed. I do far fewer premium domaining + SEO-only plays anymore. For a while they worked; just a different landscape now.

Some (non-link builders) mention how foolish SEOs are for wasting so many thought cycles on links. Why are core content, user experience, and social media all vastly more important than link building?

Jim: Links are still the biggest part of the Google algorithm - they cannot be ignored. People must have things going on that will get them mentions across the web, and ideally some links as well. Links are still #1 today...but yes, after links, you need great content, good user experience, and more.

Joe: CopyPress sells content (please buy some content, people; I have three kids to feed here); however, it is important to point out that the most incredible content doesn't mean anything in a vacuum. How are you going to get a user experience with 0 users? Link building, purchasing traffic, DRIVING attention are crucial not just to SEO but to marketing in general. Google is using links as votes; while the variability has changed and evolved over time, it is still very much there. I don't see it going away in the next year or two.

An analogy: I wrote two books of poetry in college; I think they are ok, but I never published them or tried to get any attention, so how good are they really? Without promotion and amplification, we're all just guessing.

Thanks guys for sharing your time & wisdom!


About our contributors:

Jim Boykin is the Founder and CEO of Internet Marketing Ninjas, and owner of Webmasterworld.com, SEOChat.com, Cre8asiteForums.com and other community websites. Jim specializes in creating digital assets for sites that attract natural backlinks, and in analyzing links to disavow non-natural links for penalty recoveries.

Joe Sinkwitz, known as Cygnus, is current Chief of Revenue for CopyPress.com. He enjoys long walks on the beach, getting you the content you need, and then whispering in your ear how to best get it ranking.

Google Search Censorship for Fun and Profit

Growing Up vs Breaking Things

Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.

As Google has become more dominant, they've moved in the opposite direction. Disruption is promoted as a virtue unto itself, so long as it doesn't adversely impact the home team's business model.

There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:

  • we were REALLY wrong yesterday
  • we are REALLY wrong today

Any change or disruption is easy to justify so long as you are not the one facing the consequences:

"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun


Monopoly Marketshare in a Flash

Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.

Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).

Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.

Why doesn't that same process hit Chrome? Google not only pays Adobe to use security updates to steal marketshare from other browsers, but also pays Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.

Anytime anyone using a browser other than Chrome gets a Flash security update, they need to opt out of the bundleware or they end up with Google Chrome installed as their default web browser - which is the primary reason Firefox marketshare is in decline.

Google engineers "research" new forms of Flash security issues to drive critical security updates.

Obviously, users love it:

Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.

In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.

In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.

Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.

Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.

Locking Down The Ecosystem

And Chrome is easily the most locked down browser out there.

Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.

The Right to Be Forgotten

This brings us back to the current snafu with the "right to be forgotten" in Europe.

Google notified publishers like the BBC & The Guardian of their links being removed due to the EU "right to be forgotten" law. Their goal was to cause a public relations uproar over "censorship" which seems to have been a bit too transparent, causing them to reverse some of the removals after they got caught with their hand in the cookie jar.

The breadth of removals is an ongoing topic of coverage. But if you are Goldman Sachs instead of a government Google finds filtering information for you far more reasonable.

Some have looked at the EU policy and compared it to state-run censorship in China.

Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."

The World's Richest Librarian

Google aims to promote themselves as a digital librarian: "It’s a bit like saying the book can stay in the library, it just cannot be included in the library’s card catalogue."

That analogy is absurd on a number of levels. Which librarian...

Sorry About That Incidental Deletion From the Web...

David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:

In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).

Yet Google sends out hundreds of thousands of warning messages in webmaster tools every single month.

Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.

Despite Google's great power they do make mistakes. And when they do, people lose their jobs.

Consider MetaFilter.

They were penalized November 17, 2012.

At a recent SMX conference Matt Cutts stated MetaFilter was a false positive.

People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years & they only got a reprieve after they fired multiple employees and were able to generate publicity about what had happened.

As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.

MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.

The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.

If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.

And such stories are understated for fear of coverage creating a witch-hunt:

Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.

Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed its Google listing hours to show it closed on busy days. That misinformation was embedded directly in the search results. That business is no more.

Then there are areas like locksmiths:

I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.

There are entire sectors of the offline economy being reshaped by Google policies.

When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."

Never does anybody ask if it is reasonable for Google to place their own inaccurate $0 editorial front and center. To even bring up the issue makes one an anti-capitalist nut or someone who wishes to infringe on free speech rights. This even after the details of how the sausage gets made come to light.

And while Google arbitrarily polices others, their leaked internal documents contain juicy quotes about their ad policies like:

  • “We are the only player in our industry still accepting these ads”
  • “We do not make these decisions based on revenue, but as background, [redacted].”
  • "As with all of our policies, we do not verify what these sites actually do, only what they claim to do."
  • "I understand that we should not let other companies, press, etc. influence our decision-making around policy"

Is This "Censorship" Problem New?

This problem of controlling access to information is nothing new - it is only more extreme today. Read the (rarely read) preface to Animal Farm, or consider this:

John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?

When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.

"Policy is largely set by economic elites and organized groups representing business interests with little concern for public attitudes or public safety, as long as the public remains passive and obedient." ― Noam Chomsky

Many people have come to the same conclusion.

Turn on, tune in, drop out

"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page

I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."

{This | The Indicated} {Just | True} {In | Newfangled}

A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & use their increased level of trust to grow their profit margins via algorithmic journalism.

Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:

We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.

And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
...
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
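A toy Python illustration of the general shape of such template-driven stories; real systems like Wordsmith are vastly more sophisticated, and this template and company are invented:

```python
TEMPLATE = ("{company} reported quarterly earnings of ${eps:.2f} per share, "
            "{verdict} the consensus estimate of ${est:.2f}.")

def earnings_story(company, eps, est):
    # Pick the verb from the relationship between actual and estimated EPS
    verdict = "beating" if eps > est else "missing" if eps < est else "matching"
    return TEMPLATE.format(company=company, eps=eps, est=est, verdict=verdict)

print(earnings_story("Example Corp", 1.32, 1.25))
# Example Corp reported quarterly earnings of $1.32 per share, beating the consensus estimate of $1.25.
```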

In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:

you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.

One suspects these views do not apply to large politically connected media bodies like the AP, which are important enough to have a direct long-term deal with Google.

In the same announcement the AP noted they already include automated NFL player rankings. One interesting thing to note about the AP is that they have syndication deals with 1,400 daily newspapers nationwide, as well as thousands of TV and radio stations.

A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.

To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:

"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.

The Automated Insights homepage lists both Yahoo! & Microsoft as clients.

The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.

Last year Google dictated that press releases use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was a greater emphasis on manual editorial review:

Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:

  • Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
  • Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
  • Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
  • Overuse of keywords and/or links within the message.

So now we are in a situation where press release sites require manual human editorial oversight to try to escape penalization, while the news companies (which currently enjoy algorithmic ranking boosts) leverage those same "spammy" press releases, using software to auto-generate articles based on them.

That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...

Google Helpouts Twitter Spam (Beta)

Google is desperate to promote Helpouts. I first realized this when I saw the following spam message in my email inbox.

Shortly after, a friend sent me a screenshot of a onebox promoting Helpouts in the SERPs.

That's Google's monopoly and those are Google's services. It is not like they are:

  • being anti-competitive
  • paying others to spam other websites

Let's slow down though. Maybe I am getting ahead of myself:

Google has its own remote technology support service similar to Mr. Gupta's called Google Helpouts. Mr. Gupta's complaint alleges Google may have been blocking his advertisements so Google Helpouts could get more customers.

Oh, and that first message looked like it could have been an affiliate link. Was it?

Hmm

Let me see

What do we have here?

Google Helpouts connects you to a variety of experts--from doctors, parenting experts, tutors, personal trainers, and more--over live video call. The Google Helpouts Ambassador Program is a unique opportunity to spread the word about Helpouts, earn money, and influence a new Google product--all on your own schedule.

As an Ambassador, you will:

  • Earn extra income–receive $25 for each friend you refer who takes their first paid Helpout, up to $1,000 per month for the first 4 months.
  • Give direct feedback and help shape a new Google product
  • Join a community of innovative Ambassadors around the country
  • Receive a Helpouts gift and the chance to win prizes
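Doing the arithmetic on that offer: at $25 per referral, the $1,000 monthly cap works out to 40 paid-Helpout referrals a month, or at most $4,000 over the four-month window.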

We all know HELPFUL hotel affiliate websites are spam, but maybe Google HELPouts affiliate marketing isn't spam.

After all, Google did promise to teach people how to do their affiliate marketing professionally: "We will provide you with an Ambassador Toolkit with tips and suggestions on creative ways you can spread the word. You are encouraged to get creative, be innovative, and utilize different networks (i.e. social media, word of mouth, groups & associations, blogs, etc.) to help you."

Of course the best way to lead is by example.

And lead they do.

They are highly inclusive in their approach.

Check out this awesome Twitter usage

They've made more Tweets in the last few months than I've made in 7 years. There are 1,440 minutes in a day, so it is quite an achievement to make over 800 Tweets in a day.

You and many many many many thousands of others, Emma.

Some minutes they are making 2 or 3 Tweets.

And with that sort of engagement & the Google brand name, surely they have built a strong following.

Uh, nope.

They are following over 500 people and have about 4,000 followers. And the 4,000 number is generous, as some of them are people who sell on that platform or are affiliates pushing it.

Let's take a look at the zero moment of truth:

Thanks for your unsolicited commercial message, but I am not interested.

You're confusing me. Some context would help.

No email support, but support "sessions"? What is this?

Oh, I get it now. Is this a spam bot promoting phone sex?

Ah, so it isn't phone sex, but you can help with iPhones. Um, did we forget that whole Steve Jobs thermonuclear war bit? And why is Google offering support for Apple products when Larry Page stated the whole idea of customer support was ridiculous?

OK, so maybe this is more of the same.

Cynical, aren't we?

And cheap?

Really cheap. :(

And angry?

And testy?

And rude?

And curt?

Didn't you already say that???

Didn't you already say that???

It seems we are having issues communicating here.

I'm not sure it is fair to call it spying a half day late.

Better late than never.

Even if automated.

Good catch Megar, as Google has a creepy patent on automating social spam.

Who are your real Google+ friends? Have they all got the bends? Is Google really sinking this low?

Every journey of a thousand miles begins with a single step.

Humorous or sad...depending on your view.

There's no wrong way to eat a Reese's.

Google has THOUSANDS of opportunities available for you to learn how to spam Twitter.

As @Helpouts repeatedly Tweets: "Use the code IFOUNDHELP for $20 off" :D

++++++++

All the above Tweets were from the last few days.

The same sort of anti-social aggro spamming campaign has been going on far longer.

When Twitter users said "no thank you"...

...Google quickly responded like a Marmaris rug salesman

Google has a magic chemistry for being able to...

...help with slow computers.

We need to fight spam messages (with MOAR spam messages).

In a recent Youtube video Matt Cutts said: "We got less spam and so it looks like people don't like the new algorithms as much." Based on that, perhaps we can presume Helpouts is engaging in a guerrilla marketing campaign to improve user satisfaction with the algorithms.

Or maybe Google is spamming Twitter so they can justify banning Twitter.

Or maybe this is Google's example of how we should market websites which don't have the luxury of hard-coding at the top of the search results.

Or maybe Google wasn't responsible for any of this & once again it was "a contractor."

Update: After they stopped spamming, Google Helpouts never took off and is shutting down in April of 2015.

Learn Local Search Marketing

Last October Vendran Tomic wrote a guide to local SEO which has since become one of the more popular pages on our site, so we decided to follow up with a Q&A on some of the latest changes in local search.

Local Ants.

Q: Google appears to have settled their monopolistic abuse charges in Europe. As part of that settlement they have to list 3 competing offers in their result set from other vertical databases. If Google charges for the particular type of listing then these competitors compete in an ad auction, whereas if the vertical is free those clicks to competitors are free. How long do we have until Google's local product has a paid inclusion element to it?

A: The local advertising market is huge. It's a market that Google still hasn't mastered, and one still dominated by IYP platforms.

Since search in general is stagnant, Google will be looking to increase their share of the market.

That was obvious to anyone who covered Google's attempt to acquire Groupon, since social couponing is mostly a local marketing phenomenon.

Their new dashboard is not only more stable with a slicker interface, but also capable of facilitating any paid inclusion module.

I would guess that Google will not wait a long time to launch a paid inclusion product or something similar, since they want to keep their shareholders happy.

Q: In the past there have been fiascos with things like local page cross-integration with Google+. How "solved" are these problems, and how hard is it to isolate these sorts of issues from other potential issues?

A: Traditionally, Google had the most trouble with their "local" products. Over the years, they were losing listings, reviews, merging listings, duplicating them etc. Someone called their attempts "a train wreck at the junction." They were also notoriously bad with providing guidance that would help local businesses navigate the complexity of the environment Google created.

Google has also faced some branding challenges - confusing even the most seasoned local search professionals with their branding.

Having said that, things have been changing for the better. Google has introduced phone support which is, I must say, very useful. In addition, the changes they made in the way they deal with local data have made things more stable.

However, I'd still say that Google's local products are their biggest challenge.

Q: Yelp just had strong quarterly results and Yahoo! has recently added a knowledge-graph-like pane to their search results. How important is local search on platforms away from Google? How aligned are the various local platforms on ranking criteria?

A: Just like organic search is mostly about two functions - importance and relevance - local search is about location prominence, proximity, and relevance (where location prominence is the equivalent of importance in general SEO).

All local search platforms have ranking factors that are based on these principles.

The only thing that differs is what they consider ranking signals and the weight they place on each. For example, to rank high in Yahoo! Local, one needs to be very close to the centroid of the town, have something in the business title that matches the search query, and have a few reviews.

Google is more sophisticated, but the principles are the same.
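As a toy illustration of that prominence/proximity/relevance split, consider the following sketch; the weights and distance decay are invented purely for illustration, not taken from any platform:

```python
def local_score(prominence, distance_km, relevance, weights=(0.4, 0.3, 0.3)):
    """Blend the three signal families; weights and decay are invented for illustration."""
    proximity = 1 / (1 + distance_km)  # closer to the centroid -> higher score
    return (weights[0] * prominence +
            weights[1] * proximity +
            weights[2] * relevance)

# A prominent business a few km out vs. a weak listing right at the centroid
print(local_score(prominence=0.9, distance_km=3.0, relevance=0.8))  # ~0.68
print(local_score(prominence=0.2, distance_km=0.1, relevance=0.8))  # ~0.59
```

A proximity-heavy platform would simply shift more weight onto the middle term, which is the difference Vendran describes below.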

The less sophisticated local search platforms use fewer signals in their algorithms, and are usually geared more toward proximity as a ranking signal.

It's also important to note that local search functions as a very interconnected ecosystem, and that changes made in order to boost visibility in one platform, might hurt you in another.

Q: There was a Google patent where they mentioned using driving directions to help as a relevancy signal. And Bing recently invested in and licensed data from Foursquare. Are these the sorts of signals you see taking weight from things like proximity over time?

A: I see these signals increasing in importance over time, as they would be useful ranking signals. However, to Google, local search is also about location sensitivity, and these signals will probably not be used outside of that context.

If you read the patent named "Methods And Systems For Improving A Search Ranking Using Location Awareness" (Amit Singhal is one of the inventors), you will see that Google, in fact, is aware that people have different sensitivities for different types of services/queries. You don't necessarily care where your plumber comes from, but you do care where the pizza places are when you search for pizza in your location.

I don't see driving directions as a signal ever de-throning proximity, because proximity is closer to the nature of the offline/online interaction.

Q: There are many different local directories which are highly relevant to local, while there are also vertical specific directories which might be tied to travel reviews or listing doctors. Some of these services (say like OpenTable) also manage bookings and so on. How important is it that local businesses "spread around" their marketing efforts? When does it make sense to focus deeply on a specific platform or channel vs to promote on many of them?

A: This is a great question, Aaron! About 5 years ago, I believed that the only true game in town for any local business was Google. This was because, at that time, I wasn't invested in properly measuring outcomes and metrics such as cost of customer acquisition, lead acquisition, etc.

Local businesses, famous for their lack of budgets, should always "give" vertical platforms a try, even IYP type sites. This is why:

  • one needs to decrease dependence on Google because it's an increasingly fickle channel of traffic acquisition (Penguin and Panda didn't spare local websites),
  • sometimes, those vertical websites can produce great returns. I was positively surprised by the number of inquiries/leads one of our law firm clients got from a well known vertical platform.
  • using different marketing channels and measuring the right things can improve your marketing skills.

Keep in mind, basics need to be covered first: data aggregators, Google Places, creating a professional/usable/persuasive website, as well as developing a measurement model.

Q: What is the difference between incentivizing a reasonable number of reviews & being so aggressive that something is likely to be flagged as spam? How do you draw the line with trying to encourage customer reviews?

A: Reviews and review management have always been tricky, as well as important. We know two objective things about reviews:

  • consumers care about reviews when making a purchase and
  • reviews are important for your local search visibility.

Every local search/review platform worth its salt will have a policy in place discouraging incentivized and "bought" reviews. They will enforce this policy using algorithms or humans. We all know that.

Small and medium sized businesses try to get as many reviews as humanly possible and direct them all to one or two local search platforms. Here, they make two mistakes:

1. they're driven by a belief that one needs a huge number of reviews on Google and
2. one needs to direct all their review efforts at Google.

This behavior gets them flagged algorithmically or manually. Neither Google nor Yelp wants you to solicit reviews.

However, if you change your approach from aggressively asking for reviews to a survey-based approach, you should be fine.

What do I mean by that?

A survey-based approach means you solicit your customers' opinions on different services/products to improve your operations - and then ask them to share their opinion on the web while giving them plenty of choices.

This approach will get you much further than mindlessly begging people for reviews and sending them to Google.

The problem with drawing a clear distinction between the right and wrong ways of handling reviews, as far as Google goes, lies in their constantly changing review guidelines.

Things to remember are: try to get reviews on plenty of sites, while surveying your customers and never get too aggressive. Slow and steady wins the race.

Q: On many local searches people are now getting carouseled away from generic searches toward branded searches before clicking through, and then there is keyword(not provided) on top of that. What are some of the more cost efficient ways a small business can track & improve their ranking performance when so much of the performance data is hidden/disconnected?

A: Are you referring to rankings in Maps or in the organic part of the results? I'm asking because Google doesn't blend the two anymore.

Q: I meant organic search

A: OK. My advice has always been to obsess not over rankings, but over customer acquisition numbers, leads, lifetime customer value, etc.

However, rankings are objectively a very important piece of the puzzle. Here are my suggestions when it comes to more cost efficient ways to track and improve ranking performance:

  • When it comes to tracking, I'd use Advanced Web Ranking (AWR) or Authority Labs, both of which are not very expensive.
  • Improving ranking performance is another story. Local websites should be optimized based on the same principles that work for any site (copy written for conversion, pages focused on narrow topics, titles written for clickthrough rates, etc.).
  • On the link building side of things, I'd suggest taking care of data aggregators first as a very impactful, yet cost-effective strategy. Then I would go after vertical platforms that link directly to a website and have profiles chock-full of structured data. I would also make sure to join relevant industry and business associations, and generally go after links that only a real local business can get - or that come as a result of broader marketing initiatives. For example, one can organize events in the offline world that result in links and citations, effectively increasing search visibility without spending too much.

Q: If you are a local locksmith, how do you rise above the spam which people have publicly complained about for at least 5 years straight now?

A: If I were a local locksmith, I would seriously consider moving my operations close to the centroid of my town/city. I would also make sure my business data across the web is highly consistent.

In addition, I would make sure to facilitate getting reviews on many platforms. If that weren't enough (as it often isn't in many markets), I would be public about Google's inability to handle locksmith spam in my town - using their forums and any other medium.
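
As an aside on the consistency point: auditing your own NAP (name, address, phone) data across directories is easy to script. Here is a minimal Python sketch - the business, listings, and directory names are all hypothetical - that normalizes each record and flags any that drift from your canonical listing:

    import re

    def normalize_nap(record):
        """Crudely normalize a name/address/phone record so trivial
        formatting differences don't read as inconsistencies."""
        def clean(text):
            text = re.sub(r"[^a-z0-9 ]", " ", text.lower())
            return re.sub(r"\s+", " ", text).strip()
        phone = re.sub(r"\D", "", record["phone"])[-10:]  # keep last 10 digits
        return (clean(record["name"]), clean(record["address"]), phone)

    # Hypothetical listings pulled from a few directories.
    listings = {
        "google": {"name": "Acme Lock & Key", "address": "12 Main St",
                   "phone": "(555) 010-0199"},
        "yelp": {"name": "Acme Lock and Key", "address": "12 Main St",
                 "phone": "555-010-0199"},
        "citysearch": {"name": "Acme Lock & Key", "address": "12 Main Street",
                       "phone": "5550100199"},
    }

    canonical = normalize_nap(listings["google"])
    for source, record in listings.items():
        if normalize_nap(record) != canonical:
            print(f"Possible NAP inconsistency on {source}: {record}")

Real-world matching needs fuzzier logic (e.g. treating "St" and "Street" as equivalent), but even a naive pass like this surfaces the obvious mismatches.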

Q: In many cities do you feel the potential ROI would be high enough to justify paying for downtown real estate then? Or would you suggest having a mailing address or something similar?

A: The ROI of getting a legitimate downtown address would greatly depend on customer lifetime value. For example, if I were a personal injury attorney in a major city, I would definitely consider opening a small office near a center of my city/town.

Another thing to consider would be the search radius/location sensitivity. If the location sensitivity for a set of keywords is high, I would be more inclined to invest in a downtown office.

I wouldn't advocate PO boxes or virtual offices, since Google is getting more aggressive about weeding those out.

Q: Google recently started supporting microformats for things like hours of operation, phone numbers, and menus. How important is it for local businesses to use these sorts of features?

A: It is not a crucial ranking factor, and is unlikely to be any time in the near future. However, Google tends to reward businesses that embrace their new features - at least in local search. I would definitely recommend embracing microformats in local search.
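
To make that concrete, here is a small Python sketch that emits a schema.org LocalBusiness block with hours and phone data (rendered as JSON-LD; the same properties exist as microdata attributes). All business details below are placeholders:

    import json

    # Placeholder business details - not a real listing.
    local_business = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "Acme Lock & Key",
        "telephone": "+1-555-010-0199",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "12 Main St",
            "addressLocality": "Springfield",
            "addressRegion": "IL",
            "postalCode": "62701",
        },
        # Mon-Fri, 9am-5pm, in schema.org's openingHours shorthand.
        "openingHours": "Mo-Fr 09:00-17:00",
    }

    # Emit the <script> block to paste into the page's <head>.
    print('<script type="application/ld+json">')
    print(json.dumps(local_business, indent=2))
    print('</script>')

Whichever format you choose, it's worth running the markup through a structured data validator before shipping it.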

Q: As a blogger I've noticed an increase in comment spam with NAP information in it. Do you see Google eventually penalizing people for that? Is this likely to turn into yet another commonplace form of negative SEO?

A: This is a difficult question. Knowing how Google operates, it's possible they'll start penalizing that practice. However, I don't see that type of spam being particularly effective.

Most blogs cannot do much to enhance location prominence. But if this turned into a negative SEO avenue, I suspect Google wouldn't handle it well (based on their track record).

Q: Last year you wrote a popular guide to local search. What major changes have happened to the ecosystem since then? Would you change any of the advice you gave back then? Or has local search started to become more stable recently?

A: There haven't been huge changes in the local ecosystem. Google has made a lot of progress in transferring accounts to the new dashboard and improving the bulk upload function. They also changed the UX slightly.

Moz entered the local search space with their Moz Local product.

Q: When doing a local SEO campaign, how much of the workload tends to be upfront stuff versus ongoing maintenance work? For many campaigns is a one-off effort enough to last for a significant period of time? How do you determine the best approach for a client in terms of figuring out the mix of upfront versus maintenance and how long it will take results to show and so on?

A: This largely depends on the objective of the campaign, the market and the budget. There are verticals where local Internet marketing is extremely competitive, and tends to be a constant battle.

Some markets, on the other hand, are easy and can largely be a one-off thing. For example, if you're a plumber or an electrician in a small town with a service area limited to that town, you really don't need much maintenance, if any.

However, if you are a roofing company that wants to be a market leader in greater Houston, TX, your approach has to be much different.

The upfront work tends to be more intense if the business has NAP inconsistencies, has never done any Internet marketing, and doesn't excel at offline marketing.

If you're an established brand offline and know how to tie your offline and online marketing efforts together, you will have a much easier time getting the most out of the web.

In most smaller markets, the results can be seen in a span of just a few months. More competitive markets, in my experience, require more time and a larger investment.

Q: When does it make sense for a local business to DIY versus hiring help? What tools do you recommend they use if they do it themselves?

A: If a local business owner is in a position where doing local Internet marketing is their highest-value activity, it would make sense to do it themselves.

However, more often than not, this is not the case even for the smallest of businesses. Being successful in local Internet marketing in a small market is not that difficult. But it does come with a learning curve and a cost in time.

Having said that, if the market is not that competitive, taking care of data aggregators and a few major local search platforms, plus acquiring a handful of industry links, would do the trick.

For data aggregators, one might go directly to them or use a tool such as UBM or Moz Local.

To dig for citations, Whitespark's citation tool is pretty good and not that expensive.

Q: The WSJ recently published a fairly unflattering article about some of the larger local search firms which primarily manage AdWords for tens of thousands of clients & rely on aggressive outbound marketing to offset high levels of churn. Should a small business consider paid search & local as being separate from one another or part of the same thing? If someone hires help on these fronts, where's the best place to find responsive help?

A: "Big box" local search companies were always better about client acquisition than performance. It always seemed as if performance wasn't an integral part of their business model.

However, small businesses cannot take that approach to performance. Generally speaking, the more closely the web is tied to the business, the better off a small business is. This means that a local Internet marketing strategy should start with business objectives.

Everyone should ask themselves 2 questions:
1. What's my lifetime customer value?
2. How much can I afford to spend on acquiring a customer?

Every online marketing endeavor should be judged through this lens. This means greater integration.
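
To make that lens concrete, here's a toy calculation in Python. Every number is invented for illustration, and the 1:3 spend-to-value ratio used at the end is a rough rule of thumb, not a law:

    # Toy numbers, for illustration only.
    average_sale = 120.00       # revenue per transaction
    gross_margin = 0.40         # fraction of revenue kept after costs
    purchases_per_year = 3      # repeat purchases per customer per year
    retention_years = 4         # how long a typical customer stays

    # 1. Lifetime customer value (profit, not revenue).
    ltv = average_sale * gross_margin * purchases_per_year * retention_years
    print(f"Lifetime customer value: ${ltv:.2f}")              # $576.00

    # 2. How much can I afford to spend acquiring a customer?
    #    Spending a third of LTV is a rough rule of thumb.
    max_cac = ltv / 3
    print(f"Max affordable acquisition cost: ${max_cac:.2f}")  # $192.00

Any channel whose cost per acquired customer exceeds that ceiling is destroying margin, no matter how good its traffic numbers look.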

Q: What are some of the best resources people can use to get the fundamentals of local search & to keep up with the changing search landscape?

A: Luckily for everyone, the local search blogosphere is rich in useful information. I would definitely recommend Mike Blumenthal's blog, Andrew Shotland's Local SEO Guide, Linda Buquet's forum, Nyagoslav Zhekov, Mary Bowling and, of course, the Local U blog.


Vedran Tomic is a member of SEOBook and founder of Local Ants LLC, a local internet marketing agency. Please feel free to use the comments below to ask any local search questions you have, as Vedran will be checking in periodically to answer them over the next couple days.

Google's Effective 'White Hat' Marketing Case Study

There's the safe way & the high-risk approach. The shortcut takers & those who win through hard work & a superior offering.

One is white hat and the other is black hat.

With the increasing search ecosystem instability over the past couple of years, some see these labels constantly sliding, sometimes on an ex post facto basis, turning thousands of white hats into black hats arbitrarily overnight.

Are you a white hat SEO or a black hat SEO?

Do you even know?

Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.

Seriously, go read it now.

It's fantastic journalism & an important read for anyone who considers themselves an SEO.


Take the offline analog of Google's search "quality" guidelines, and in spirit Google has repeatedly violated every single one of them.

Advertorials

Creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines, [as can] advertorials or native advertising where payment is received for articles that include links that pass PageRank.

Advertorials are spam, except when they are not: "the staff and professors at GMU’s law center were in regular contact with Google executives, who supplied them with the company’s arguments against antitrust action and helped them get favorable op-ed pieces published"

Deception

Don't deceive your users.

Ads should be clearly labeled, except when they are not: "GMU officials later told Dellarocas they were planning to have him participate from the audience," which is just like an infomercial that must be labeled as an advertisement!

Preventing Money from Manipulating Editorial

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

Money influencing outcomes is wrong, except when it's not: "Google’s lobbying corps — now numbering more than 100 — is split equally, like its campaign donations, among Democrats and Republicans. ... Google became the second-largest corporate spender on lobbying in the United States in 2012."
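
As a technical aside, the crawl-blocking mechanism the quoted guideline describes is plain robots.txt exclusion, which you can verify with Python's standard library. The paths below are hypothetical stand-ins, not Google's actual ad-serving URLs:

    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt in the spirit of the guideline:
    # paths that serve or redirect paid links are blocked from crawlers.
    rules = [
        "User-agent: *",
        "Disallow: /adclick/",
        "Disallow: /sponsored/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    for url in ("https://example.com/adclick/partner-123",
                "https://example.com/articles/widget-buying-guide"):
        verdict = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
        print(f"{url} -> {verdict}")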

Content Quality

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

Payment should be disclosed, except when it shouldn't: "The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Google’s involvement was not publicly disclosed."

Cloaking

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Cloaking is evil, except when it's not: "Even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. “We will certainly limit who we announce publicly from Google.”"
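
For readers who haven't seen it in the wild, cloaking is mechanically trivial - which is part of why Google polices it. A deliberately minimal Python sketch of the prohibited pattern, with invented content, purely to make the quoted definition concrete:

    # The mechanics the guideline describes, reduced to a few lines:
    # one page for crawlers, another for people. This is the prohibited
    # pattern itself, shown only as illustration.
    CRAWLER_TOKENS = ("googlebot", "bingbot")

    def render_page(user_agent):
        if any(token in user_agent.lower() for token in CRAWLER_TOKENS):
            # Keyword-rich copy served only to search engine crawlers.
            return "<h1>Best widgets - cheap widgets - widget reviews</h1>"
        # Thin, ad-heavy page served to human visitors.
        return "<h1>Welcome!</h1><p>Please enjoy our sponsors...</p>"

    print(render_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
    print(render_page("Mozilla/5.0 (Windows NT 10.0) Safari/537.36"))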

...and on and on and on...

It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.

And while they may not approve of something, that doesn't mean they avoid the strategy when mapping out their own approach.

There's a lesson & it isn't a particularly subtle one.

Free markets aren't free. Who could have known?

Bing Lists 'Alternatives' In Search Results

Bing recently started testing listing 'alternatives' near their local search results.

I wasn't able to replicate these in other search verticals like flight search, or on an iPhone search, but the format of these alternatives looks similar to the format proposed in Google's ongoing monopolistic abuse case in Europe:

"In effect, competitors will have the 'choice' either to pay Google in order to remain relevant or lose visibility and become irrelevant," a European consumer watchdog, BEUC, said in a letter it sent to all 28 EU commissioners. The letter, seen by The Wall Street Journal, terms the deal "unacceptable."

Flip Guest Blogging on its Head, With Steroids

Guest blogging was once considered a widely recommended white hat technique.

Today our monopoly-led marketplace arbitrarily decided this is no longer so.

Stick a fork in it. Torch it. Etc.

Now that the rules have changed ex post facto, we can expect to deal with a near-endless stream of "unnatural" link penalties for doing what was seen at the time as being:

  • natural
  • widespread
  • common
  • low risk
  • best practice

Google turns your past client investments into new cost centers & penalties. This ought to be a great thing for the SEO industry. Or maybe not.

As Google scares & expunges smaller players from participating in the SEO market, larger companies keep chugging along.

Today a friend received the following unsolicited email:

Curious about their background, he looked up their past coverage: "Written then offers a number of different content licenses that help the advertiser reach this audience, either by re-branding the existing page, moving the content to the advertiser’s website and re-directing traffic there, or just re-publishing the post on the brand’s blog."

So that's basically guest blogging at scale.

And it's not only guest blogging at scale, but it is guest blogging at scale based on keyword performance:

"You give us your gold keywords. Written finds high-performing, gold content with a built-in, engaged audience. Our various license options can bring the audience to you or your brand to the audience through great content."

What's worse is how they pitch this to the people they license content from:

I'm sorry, but taking your most valuable content & turning it into duplicate content by syndicating it onto a Fortune 500 website will not increase your traffic. The Fortune 500 site will outrank you (especially if visitors/links are 301 redirected to their site!). And even when visitors are not redirected, the brand site will still typically outrank you due to its huge domain authority (and the cross-domain rel=canonical tag), so the content on your own site gets filtered out of the search results as duplicate content & your link equity passes on to the branded advertiser.
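
If you're weighing such a licensing deal, check what the rel=canonical on your own page would point at. A stdlib-only Python sketch (the domains and HTML are hypothetical):

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class CanonicalFinder(HTMLParser):
        """Collects the href of the page's <link rel="canonical"> tag."""
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

    # Hypothetical: your own post's HTML after a content licensing deal.
    my_page_html = """
    <html><head>
    <link rel="canonical" href="https://bigbrand.example.com/licensed/your-post">
    </head><body>...</body></html>
    """

    finder = CanonicalFinder()
    finder.feed(my_page_html)

    my_domain = "myblog.example.com"
    if finder.canonical and urlparse(finder.canonical).netloc != my_domain:
        print(f"Canonical points away from your site: {finder.canonical}")
        print("Search engines will treat that copy as the original.")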

And if Google were to come down on anyone in the above sort of situation it would likely be the smaller independent bloggers who get hit.

This is how SEO works.

Smaller independent players innovate & prove the model.

Google punishes them for being innovative.

As they are punished, a vanilla corporate tweak of the same model rolls out and is white hat.

In SEO it's not what you do that matters - it's who your client is.

If you're not working for a big brand, you're doing it wrong.

Four legs good, two legs better.

Sign Up for Yahoo! Gemini Ads

How to Get a Free $50 Yahoo! Gemini Coupon

  • Step 1: click here
  • Step 2: after clicking that link, enter the promo code YAHOOADS for a $50 credit when you sign up today

What is Yahoo! Gemini?

Yahoo! announced the launch of Gemini, which allows advertisers to buy in-content native Yahoo! Stream ads on the Yahoo! homepage and other key Yahoo! properties. In addition, Gemini allows Yahoo! to sell their own search ads on mobile devices rather than relying on Microsoft's Bing Ads.

Here is an example of a stream ad right on the Yahoo! homepage.

These ads are sold on a cost per click basis like Google AdWords and Bing Ads. They appear on both desktop and mobile versions of Yahoo!.

You can sign up for Gemini here.

Gemini's Growing Importance in the Search Landscape

Late in 2014 Yahoo! shocked the search landscape when they announced a deal with Mozilla to become the default search provider in Firefox.

In RKG's Q1 2015 digital marketing report they highlighted how Yahoo! Gemini is quickly growing and now powering a significant share of mobile search ad clicks in the Yahoo!/Bing ad network.

Yahoo! is further expanding the reach of their network by powering in-app search on thousands of apps and story recommendations on popular publishing networks like Vox Media and CBS Interactive.

Gemini will also likely power many of the desktop search ads on Yahoo! Search soon. When Marissa Mayer renewed the Yahoo! Search contract with Microsoft, she lowered the guaranteed inventory delivered to Microsoft to 51% and secured a carve-out enabling Yahoo! to deliver their own ads on desktops, laptops, and tablets along with mobile devices. Yahoo! now has the ability to use Bing algorithmic search results without Bing's ads on up to 49% of their search volume.

Get in Early & Save

Since Gemini is a relatively new ad network their clicks tend to be significantly cheaper than clicks on ad networks established long ago. Gemini can represent a significant savings over buying Google AdWords ads.

Activate your Gemini account today
