Google's Effective 'White Hat' Marketing Case Study

Apr 15th

There's the safe way & the high-risk approach. The shortcut takers & those who win through hard work & a superior offering.

One is white hat and the other is black hat.

With the increasing instability of the search ecosystem over the past couple of years, some see these labels constantly sliding, sometimes on an ex post facto basis, arbitrarily turning thousands of white hats into black hats overnight.

Are you a white hat SEO? Or a black hat SEO?

Do you even know?

Before you answer, please have a quick read of this Washington Post article highlighting how Google manipulated & undermined the US political system.

.
.
.
.
.
.
.

Seriously, go read it now.

It's fantastic journalism & an important read for anyone who considers themselves an SEO.

.
.
.
.
.
.
.
.

######

Take the offline analog of Google's search "quality" guidelines, and in spirit Google has repeatedly violated every single one of them.

Advertorials

Creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Advertorials or native advertising where payment is received for articles that include links that pass PageRank.

Advertorials are spam, except when they are not: "the staff and professors at GMU’s law center were in regular contact with Google executives, who supplied them with the company’s arguments against antitrust action and helped them get favorable op-ed pieces published"

Deception

Don't deceive your users.

Ads should be clearly labeled, except when they are not: "GMU officials later told Dellarocas they were planning to have him participate from the audience," which is just like an infomercial that must be labeled as an advertisement!

Preventing Money from Manipulating Editorial

Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.

Money influencing outcomes is wrong, except when it's not: "Google’s lobbying corps — now numbering more than 100 — is split equally, like its campaign donations, among Democrats and Republicans. ... Google became the second-largest corporate spender on lobbying in the United States in 2012."
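As an aside, the robots.txt blocking that guideline refers to can be checked with Python's standard library. The file contents and domain below are invented for illustration; they are not Google's actual rules.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt of the kind an ad server uses to keep
# paid-click redirect URLs out of the crawl (paths are made up).
robots_txt = """\
User-agent: *
Disallow: /pagead/
Disallow: /aclk
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler honoring the file may not follow the ad redirect...
print(parser.can_fetch("Googlebot", "https://ads.example.com/aclk?id=123"))  # False
# ...but ordinary pages on the same host remain crawlable.
print(parser.can_fetch("Googlebot", "https://ads.example.com/about"))  # True
```

In other words, ad clicks are kept from passing PageRank by making the redirect URLs uncrawlable, which is exactly the mechanism the guideline describes.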

Content Quality

The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.

Payment should be disclosed, except when it shouldn't: "The school and Google staffers worked to organize a second academic conference focused on search. This time, however, Google’s involvement was not publicly disclosed."

Cloaking

Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.

Cloaking is evil, except when it's not: Even as Google executives peppered the GMU staff with suggestions of speakers and guests to invite to the event, the company asked the school not to broadcast its involvement. “We will certainly limit who we announce publicly from Google.”

...and on and on and on...

It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.

And while they may not approve of something, that doesn't mean they avoid the strategy when mapping out their own approach.

There's a lesson & it isn't a particularly subtle one.

Free markets aren't free. Who could have known?

Flip Guest Blogging on its Head, With Steroids

Mar 19th

Guest blogging was once considered a widely recommended white hat technique.

Today our monopoly-led marketplace arbitrarily decided this is no longer so.

Stick a fork in it. Torch it. Etc.

Now that rules have changed ex post facto, we can expect to deal with a near endless stream of "unnatural" link penalties for doing what was seen at the time as being:

  • natural
  • widespread
  • common
  • low risk
  • best practice

Google turns your past client investments into new cost centers & penalties. This ought to be a great thing for the SEO industry. Or maybe not.

As Google scares & expunges smaller players from participating in the SEO market, larger companies keep chugging along.

Today a friend received the following unsolicited email:

Curious about their background, he looked up their past coverage: "Written then offers a number of different content licenses that help the advertiser reach this audience, either by re-branding the existing page, moving the content to the advertiser’s website and re-directing traffic there, or just re-publishing the post on the brand’s blog."

So that's basically guest blogging at scale.

And it's not only guest blogging at scale, but it is guest blogging at scale based on keyword performance:

"You give us your gold keywords. Written finds high-performing, gold content with a built-in, engaged audience. Our various license options can bring the audience to you or your brand to the audience through great content."

What's worse is how they pitch this to the people they license content from:

I'm sorry, but taking your most valuable content & turning it into duplicate content by syndicating it onto a Fortune 500 website will not increase your traffic. The Fortune 500 site will outrank you (especially if visitors & links are 301 redirected to their site!). And even when visitors are not redirected, the licensee will still typically outrank you due to their huge domain authority (and the cross-domain rel=canonical tag), causing your copy of the content to get filtered out of the search results as duplicate content while your link equity passes on to the branded advertiser.

And if Google were to come down on anyone in the above sort of situation it would likely be the smaller independent bloggers who get hit.

This is how SEO works.

Smaller independent players innovate & prove the model.

Google punishes them for being innovative.

As they are punished, a vanilla corporate tweak of the same model rolls out and is white hat.

In SEO it's not what you do that matters - it's who your client is.

If you're not working for a big brand, you're doing it wrong.

Four legs good, two legs better.

Disavow & Link Removal: Understanding Google

Jan 26th

Fear Sells

Few SEOs took notice when Matt Cutts mentioned on TWIG that "breaking their spirits" was essential to stopping spammers. But that single piece of information adds layers of insight around things like:

  • duplicity on user privacy in organic versus AdWords
  • benefit of the doubt for big brands versus absolute apathy toward smaller entities
  • the importance of identity versus total wipeouts of those who are clipped
  • mixed messaging on how to use disavow & the general fear around links

From Growth to No Growth

Some people internalize failure when growth slows or stops. One can't raise venture capital and keep selling the dream of the growth story unless the blame is internalized. If one understands that another dominant entity (monopoly) is intentionally subverting the market then a feel good belief in the story of unlimited growth flames out.

Most of the growth in the search channel is being absorbed by Google. In RKG's Q4 report they mentioned that mobile ad clicks were up over 100% for the year & mobile organic clicks were only up 28%.

Investing in Fear

There's a saying in investing that "genius is declining interest rates" but when the rates reverse the cost of that additional leverage surfaces. Risks from years ago that didn't really matter suddenly do.

The same is true with SEO. A buddy of mine mentioned getting a bad link example from Google where the link had been in place longer than Google has been in existence. Risk can arbitrarily be added after the fact to any SEO activity. Over time Google can keep shifting the norms of what is acceptable. So long as they are fighting off WordPress hackers and other major issues they are kept busy, but once they catch up on that stuff they can focus on shifting white to gray and gray to black, forcing people to abandon techniques which offered a predictable positive ROI.

Defunding SEO is an essential & virtuous goal.

Hiding data (and then giving crumbs of it back to profile webmasters) is one way of doing it, but adding layers of risk is another. What panda did to content was add a latent risk to content where the cost of that risk in many cases vastly exceeded the cost of the content itself. What penguin did to links was the same thing: make the latent risk much larger than the upfront cost.

As Google dials up their weighting on domain authority, many smaller sites which competed on legacy relevancy metrics like anchor text slide down the result set. When they fall down the result set, many of those site owners think they were penalized (even if their slide was primarily driven by a reweighting of factors rather than an actual penalty). Since there is such rampant fearmongering around links, they start there. Nearly every widely used form of link building has been promoted by Google engineers as being spam:

  • Paid links? Spam.
  • Reciprocal links? Spam.
  • Blog comments? Spam.
  • Forum profile links? Spam.
  • Integrated newspaper ads? Spam.
  • Article databases? Spam.
  • Designed by credit links? Spam.
  • Press releases? Spam.
  • Web 2.0 profile & social links? Spam.
  • Web directories? Spam.
  • Widgets? Spam.
  • Infographics? Spam.
  • Guest posts? Spam.

It doesn't make things any easier when Google sends out examples of spam links which are sites the webmaster has already disavowed or sites which Google explicitly recommended in their webmaster guidelines, like DMOZ.

It is quite the contradiction where Google suggests we should be aggressive marketers everywhere EXCEPT for SEO & basically any form of link building is far too risky.

It’s a strange world where, when it comes to social media, Google is all promote, promote, promote. Or even in paid search: buy ads, buy ads, buy ads. But when it comes to organic listings, it’s just sit back and hope it works, and really don’t actively go out and build links, even though those are so important. - Danny Sullivan

Google is in no way a passive observer of the web. Rather they actively seek to distribute fear and propaganda in order to take advantage of the experiment effect.

They can find and discredit the obvious, but most on their “spam list” done “well” are ones they can’t detect. So, it’s easier to have webmasters provide you a list (disavows), scare the ones that aren’t crap sites providing the links into submission and damn those building the links as “examples” – dragging them into town square for a public hanging to serve as a warning to anyone who dare disobey the dictatorship. - Sugarrae

This propaganda is so effective that email spammers promoting "SEO solutions" are now shifting their pitches from "grow your business with SEO" to "recover your lost traffic."

Where Do Profits Come From?

I saw Rand tweet this out a few days ago...

... and thought "wow, that couldn't possibly be any less correct."

When ecosystems are stable you can create processes which are profitable & pay for themselves over the longer term.

I very frequently get the question: 'what’s going to change in the next 10 years?' And that is a very interesting question; it’s a very common one. I almost never get the question: 'what’s not going to change in the next 10 years?' And I submit to you that that second question is actually the more important of the two – because you can build a business strategy around the things that are stable in time….in our retail business, we know that customers want low prices and I know that’s going to be true 10 years from now. They want fast delivery, they want vast selection. It’s impossible to imagine a future 10 years from now where a customer comes up and says, 'Jeff I love Amazon, I just wish the prices were a little higher [or] I love Amazon, I just wish you’d deliver a little more slowly.' Impossible. And so the effort we put into those things, spinning those things up, we know the energy we put into it today will still be paying off dividends for our customers 10 years from now. When you have something that you know is true, even over the long-term, you can afford to put a lot of energy into it. - Jeff Bezos at re: Invent, November, 2012

When ecosystems are unstable, anything approaching boilerplate has an outsized risk added by the dominant market participant. The quicker your strategy can be done at scale or in the third world, the quicker Google shifts it from a positive to a negative ranking signal. It becomes much harder to train entry level employees on the basics when some of the starter work they did in years past now causes penalties. It becomes much harder to manage client relationships when their traffic spikes up and down, especially if Google sends out rounds of warnings they later semi-retract.

What's more, anything that is vastly beyond boilerplate tends to require a deeper integration and a higher level of investment - making it take longer to pay back. But the budgets for such engagement dry up when the ecosystem itself is less stable. Imagine the sales pitch, "I realize we are off 35% this year, but if we increase the budget 500% we should be in a good spot a half-decade from now."

All great consultants aim to do more than the bare minimum in order to give their clients a sustainable competitive advantage, but by removing things which are scalable and low risk Google basically prices out the bottom 90% to 95% of the market. Small businesses which hire an SEO are almost guaranteed to get screwed because Google has made delivering said services unprofitable, particularly on a risk-adjusted basis.

Being an entrepreneur is hard. Today Google & Amazon are giants, but it wasn't always that way. Add enough risk and those streams of investment in innovation disappear. Tomorrow's Amazon or Google of other markets may die a premature death. You can't see what isn't there until you look back from the future - just like the answering machine AT&T held back from public view for decades.

Meanwhile, the Google Venture backed companies keep on keeping on - they are protected.

When ad agencies complain about the talent gap, what they are really complaining about is paying people what they are worth. But as the barrier to entry in search increases, independent players die, leaving more SEOs to chase fewer corporate jobs at lower wages. Even companies servicing fortune 500s are struggling.

On an individual basis, creating value and being fairly compensated for the value you create are not the same thing. Look no further than companies like Google & Apple which engage in flagrantly illegal anti-employee cartel agreements. These companies "partnered" with their direct competitors to screw their own employees. Even if you are on a winning team it does not mean that you will be a winner after you back out higher living costs and such illegal employer agreements.

This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead? - William K Black

Meanwhile, complaints about the above sorts of inequality or other forms of asset stripping are pitched as being aligned with Nazi Germany's treatment of Jews. Obviously we need more H-1B visas to further drive down wages even as graduates are underemployed with a mountain of debt.

A Disavow For Any (& Every) Problem

Removing links is perhaps the single biggest growth area in SEO.

Just this week I got an unsolicited email from an SEO listing directory

We feel you may qualify for a Top position among our soon to be launched Link Cleaning Services Category and we would like to learn more about Search Marketing Info. Due to the demand for link cleaning services we're poised to launch the link cleaning category. I took a few minutes to review your profile and felt you may qualify. Do you have time to talk this Monday or Tuesday?

Most of the people I interact with tend to skew toward the more experienced end of the market. Some of the folks who join our site do so after their traffic falls off. In some cases the issues look intimately tied to Panda & the sites with hundreds of thousands of pages maybe only have a couple dozen inbound links. In spite of having few inbound links & us telling people the problem looks to be clearly aligned with Panda, some people presume that the issue is links & they still need to do a disavow file.

Why do they make that presumption? It's the fear message Google has been selling nonstop for years.

Punishing people is much different, and dramatic, from not rewarding. And it feeds into the increasing fear that people might get punished for anything. - Danny Sullivan

What happens when Google hands out free all-you-can-eat gummy bear laxatives to children at the public swimming pool? A tragedy of the commons.

Rather than questioning or countering the fear stuff, the role of the SEO industry has largely been to act as lap dogs, syndicating & amplifying the fear.

  • link tool vendors want to sell proprietary clean up data
  • SEO consultants want to tell you that they are the best and if you work with someone else there is a high risk hidden in the low price
  • marketers who crap on SEO to promote other relabeled terms want to sell you on the new term and paint the picture that SEO is a self-limiting label & a backward looking view of marketing
  • paid search consultants want to enhance the perception that SEO is unreliable and not worthy of your attention or investment

Even entities with a 9 figure valuation (and thus plenty of resources to invest in a competent consultant) may be incorrectly attributing SEO performance problems to links.

A friend recently sent me a link removal request from Buy Domains referring to a post which linked to them.

On the face of this, it's pretty absurd, no? A company which does nothing but trade in names themselves asks that their name reference be removed from a fairly credible webpage recommending them.

The big problem for Buy Domains is not backlinks. They may have had an issue with some of the backlinks from PPC park pages in the past, but now those run through a redirect and are nofollowed.

Their big issue is that they have less than great engagement metrics (as do most marketplace sites other than eBay & Amazon which are not tied to physical stores). That typically won't work if the entity has limited brand awareness coupled with having nearly 5 million pages in Google's index.

They not only have pages for each individual domain name, but they link to their internal search results from their blog posts & those search pages are indexed. Here's part of a recent blog post

And here are examples of the thin listing sorts of pages which Panda was designed in part to whack. These pages were among the millions indexed in Google.

A marketplace with millions of pages that doesn't have broad consumer awareness is likely to get nailed by Panda. And the websites linking to it are likely to end up in disavow files, not because they did anything wrong but because Google is excellent at nurturing fear.

What a Manual Penalty Looks Like

Expedia saw a 25% decline in search visibility due to an unnatural links penalty, causing their stock to fall 6.4%. Both Google & Expedia declined to comment. It appears that the eventual Expedia undoing stemmed from Hacker News feedback & coverage about an outing story on an SEO blog that certainly sounded like it stemmed from an extortion attempt. USA Today asked if the Expedia campaign was a negative SEO attack.

While Expedia's stock drop was anything but trivial, they will likely recover within a week to a month.

Smaller players can wait and wait and wait and wait ... and wait.

Manual penalties are no joke, especially if you are a small entity with no political influence. The impact of them can be absolutely devastating. Such penalties are widespread too.

In Google's busting bad advertising practices post they highlighted having zero tolerance, banning more than 270,000 advertisers, removing more than 250,000 publisher accounts, and disapproving more than 3,000,000 applications to join their ad network. All that was in 2013 & Susan Wojcicki mentioned Google having 2,000,000 sites in their display ad network. That would mean that something like 12% of their business partners were churned last year alone.

If Google's churn is that aggressive with their own partners (where Google has an economic incentive in the relationship), imagine how much broader the churn is across the broader web. In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & gets about 5,000 reconsideration request messages each week, so roughly 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time.
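Both percentages follow directly from the figures quoted in this section; a quick back-of-the-envelope check (the week-to-month conversion is the only assumption added):

```python
# Churn among Google's own ad partners, per the figures above.
banned_publisher_accounts = 250_000
display_network_sites = 2_000_000
churn = banned_publisher_accounts / display_network_sites
print(f"partner churn: {churn:.1%}")  # 12.5%, i.e. "something like 12%"

# Share of manual-action recipients who never file a reconsideration request.
manual_actions_per_month = 400_000
reconsideration_requests_per_week = 5_000
requests_per_month = reconsideration_requests_per_week * (52 / 12)  # ~4.33 weeks/month
reply_rate = requests_per_month / manual_actions_per_month
print(f"never reply: {1 - reply_rate:.1%}")  # ≈ 94.6%, roughly 19 out of 20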

The Disavow Threat

Originally when disavow was launched it was pitched as something to be used with extreme caution:

This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.

Recently Matt Cutts has encouraged broader usage. He has one video which discusses proactively disavowing bad links as they come in & another where he mentioned how a large company disavowed 100% of the backlinks that came in for a year.
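For reference, the disavow file itself is just plain text in the format Google's tool accepts: `#` comment lines, `domain:` lines to disavow an entire domain, and bare URLs for individual pages. A minimal sketch of generating one (the domains and URL below are made up):

```python
# Build a disavow file in the plain-text format Google's tool accepts.
# The domains and URL here are hypothetical examples.
spam_domains = ["link-farm.example", "cheap-directory.example"]
spam_urls = ["http://blog.example/spammy-comment-page.html"]

lines = ["# Disavow request following unnatural-links notice"]
lines += [f"domain:{d}" for d in spam_domains]  # disavow whole domains
lines += spam_urls                              # disavow individual URLs

disavow_file = "\n".join(lines) + "\n"
print(disavow_file)
```

The file is uploaded through Webmaster Tools; the comments are ignored by Google, though reviewers have suggested they do read them.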

The idea of proactively monitoring your backlink profile is quickly becoming mainstream - yet another recurring fixed cost center in SEO with no upside for the client (unless you can convince the client SEO is unstable and they should be afraid - which would ultimately retard their long-term investment in SEO).

The harshness of manual actions & algorithms like Penguin drives companies to desperation, acting irrationally out of fear.

People are investing to undo past investments. It's sort of like riding a stock down 60%, locking in the losses by selling it, and then using the remaining 40% of the money to buy put options or short sell the very same stock. :D

Some companies are so desperate to get links removed that they "subscribe" sites which linked to them organically to spam email campaigns asking that the links be removed.

Some go so far that they not only email you on and on, but they create dedicated pages on their site claiming that the email was real.

What's so risky about the above is that many webmasters will remove links sight unseen, even for an anonymous Gmail account. Mix in the above sort of "this message is real" stuff, and how easy would it be for a competitor to target all your quality backlinks with a "please remove my links" message? Further, how easy would it be for a competitor aware of such a campaign to drop a few hundred dollars on Fiverr, XRumer, or other similar link sources, building up your spam links while removing your quality links?

A lot of the "remove my link" messages are based around lying to the people who are linking & telling them that the outbound link is harming them as well: "As these links are harmful to both yours and our business after penguin2.0 update, we would greatly appreciate it if you would delete these backlinks from your website."

Here's the problem though. Even if you spend your resources and remove the links, people will still likely add your site to their disavow file. I saw a YouTube video recording of an SEO conference where 4 well known SEO consultants mentioned that even if they remove the links "go ahead and disavow anyhow," so there is absolutely no upside for publishers in removing links.

How Aggregate Disavow Data Could Be Used

Recovery is by no means guaranteed. In fact, of the people who go to the trouble to remove many links & create a disavow file, only 15% claim to have seen any benefit.

The other 85% who weren't sure of any benefit may have not only wasted their time, but also moved some of their other projects closer toward being penalized.

Let's look at the process:

  • For the disavow to work you also have to have some links removed.
    • Some of the links that are removed may not have been the ones that hurt you in Google, thus removing them could further lower your rank.
    • Some of the links you have removed may be the ones that hurt you in Google, while also being ones that helped you in Bing.
    • The Bing & Yahoo! Search traffic hit comes immediately, whereas the Google recovery only comes later (if at all).
  • Many forms of profit (from client services or running a network of sites) come from systematization. If you view everything that is systematized or scalable as spam, then you are not only disavowing to try to recover your penalized site, but you are also sending co-citation disavow data to Google which could lead them to torch other sites connected to those same sources.
    • If you run a network of sites & use the same sources across your network and/or cross link around your network, you may be torching your own network.
    • If you primarily do client services & disavow the same links you previously built for past clients, what happens to the reputation of your firm when dozens or hundreds of past clients get penalized? What happens if a discussion forum thread on Google Groups or elsewhere starts up where your company gets named & then a tsunami of pile on stuff fills out in the thread? Might that be brand destroying?

The disavow and review process is not about recovery, but is about collecting data and distributing pain in a game of one-way transparency. Matt has warned that people shouldn't lie to Google...

...however Google routinely offers useless non-information in their responses.

Some Google webmaster messages leave a bit to be desired.

Recovery is uncommon. Your first response from Google might take a month or more. If you work for a week or two on cleanup and then the response takes a month, the penalty has already lasted at least six weeks. And that first response might be something like this:

Reconsideration request for site.com: Site violates Google's quality guidelines

We received a reconsideration request from a site owner for site.com/.

We've reviewed your site and we believe that site.com/ still violates our quality guidelines. In order to preserve the quality of our search engine, pages from site.com/ may not appear or may not rank as highly in Google's search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines.

For more specific information about the status of your site, visit the Manual Actions page in Webmaster Tools. From there, you may request reconsideration of your site again when you believe your site no longer violates the quality guidelines.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum.

Absolutely useless.

Zero useful information whatsoever.

As people are unsuccessful in the recovery process they cut deeper and deeper. Some people have removed over 90% of their link profile without recovering & been nearly a half-year into the (12-step) "recovery" process before even getting a single example of a bad link from Google. In some cases the bad links Google identified were links obviously created by third-party scraper sites & were not in Google's original sample of links to look at (so even if you had looked at every single link they showed you & cleaned up 100% of the issues, you would still be screwed).

Another issue with aggregate disavow data is there is a lot of ignorance in the SEO industry in general, and people who try to do things cheap (essentially free) at scale have an outsized footprint in the aggregate data. For instance, our site's profile links are nofollowed & our profiles are not indexed by Google. In spite of this, examples like the one below are associated with not 1 but 3 separate profiles for a single site.

Our site only has about 20,000 to 25,000 unique linking domains. However over the years we have had well over a million registered user profiles. If only 2% of the registered user profiles were ignorant spammers who spammed our profile pages and then later added our site to a disavow file, we would have more people voting *against* our site than we have voting for it. And that wouldn't be because we did anything wrong, but rather because Google is fostering an environment of mixed messaging, fear & widespread ignorance.
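The arithmetic behind that claim, using the figures above (the 2% spammer share is the hypothetical stated in the paragraph, not a measured number):

```python
# Figures from the paragraph above.
unique_linking_domains = 20_000   # low end of the 20k-25k range cited
registered_profiles = 1_000_000
spammer_share = 0.02              # hypothetical: "if only 2% were ignorant spammers"

# Each spammer who later disavows the site is a "vote against" it.
disavow_votes = int(registered_profiles * spammer_share)
print(disavow_votes)  # 20000

# The votes against would match or exceed the real linking domains,
# despite the profile links being nofollowed and unindexed.
print(disavow_votes >= unique_linking_domains)  # True
```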

And if we are ever penalized, the hundreds of scraper sites built off scraping our RSS feed would make the recovery process absolutely brutal.

Another factor: with Google saying "you haven't cut out enough bone marrow yet" while suggesting that virtually any & every type of link is spam, there are going to be a lot of other false positives in the aggregate data.

I know some companies specializing in link recovery which in part base some aspects of their disavows on the linking site's ranking footprint. If you get a manual penalty, a Panda penalty, or your site gets hacked, then such companies may (on a nearly automated basis, with little to no thought) disavow links from your site simply because it is already penalized, re-confirming that your site deserves to be penalized. Good luck recovering from that as Google folds aggregate disavow data into the justification for further penalties.

Responsibility

All large ecosystems are gamed. We see it with app ratings & reviews, stealth video marketing, advertising, malware installs, and of course paid links.

Historically in search there has been the view that you are responsible for what you have done, but not the actions of others. The alternate roadmap would lead to this sort of insanity:

Our system has noticed that in the last week you received 240 spam emails. As a result, your email account has been temporarily suspended. Please contact the spammers, and once you have proof they have unsubscribed you from their spam databases, we will reconsider reopening your email account.

As Google has closed down their own ecosystem, they allow their own $0 editorial to rank front & center even if it is pure spam, but third parties are now held to a higher standard - you could be held liable for the actions of others.

At the extreme, one of Google's self-promotional automated email spam messages sent a guy to jail. In spite of such issues, Google remains unfazed, adding a setting which allows anyone on Google+ to email other members.

Ask Google if they should be held liable for the actions of third parties and they will tell you to go to hell. Their approach to copyright remains fuzzy, they keep hosting more third party content on their own sites, and even when that content has been deemed illegal they scream that it undermines their first amendment rights if they are made to proactively filter:

Finally, they claimed they were defending free speech. But it's the courts which said the pictures were illegal and should not be shown, so the issue is the rule of law, not freedom of speech.
...
the non-technical management, particularly in the legal department, seems to be irrational to the point of becoming adolescent. It's almost as if they refuse to do something entirely sensible, and which would save them and others time and trouble, for no better reason than that someone asked them to.

Monopolies with nearly unlimited resources shall be held liable for nothing.

Individuals with limited resources shall be liable for the behavior of third parties.

Google Duplicity (beta).

Torching a Competitor

As people have become more acclimated toward link penalties, a variety of tools have been created to help make sorting through the bad ones easier.

"There have been a few tools coming out on the market since the first Penguin - but I have to say that LinkRisk wins right now for me on ease of use and intuitive accuracy. They can cut the time it takes to analyse and root out your bad links from days to minutes..." - Dixon Jones

But as more tools have been created for sorting out bad links & more tools have been created to automate sending link removal emails, two things have happened:

  • Google is demanding more links be removed to allow for recovery
  • people are becoming less responsive to link removal requests as they get bombarded with them
    • Some of these tools keep bombarding people over and over again weekly until the link is removed or the emails go to the spam bin
    • to many people the link removal emails are the new link request emails ;)
    • one highly trusted publisher who participates in our forums stated they filtered the word "disavow" to automatically go to their trash bin
    • on WebmasterWorld a member decided it was easier to delete their site than deal with the deluge of link removal spam emails

The problem with Google rewarding negative signals is there are false positives and it is far cheaper to kill a business than it is to build one. The technically savvy teenager who created the original version of the software used in the Target PoS attack sold the code for only $2,000.

There have been some idiotic articles like this one on The Awl suggesting that comment spamming is now dead as spammers run for the hills, but that couldn't be further from the truth. Some (not particularly popular) blogs get hundreds to thousands of spam comments daily, & WordPress can have trouble even backing up the database (unless the comment spam is regularly deleted) as it can quickly swell to a million records.

The spam continues but the targets change. A lot of these comments are now pointed at YouTube videos rather than ordinary websites.

As Google keeps leaning into negative signals, one can expect a greater share of spam links to be created for negative SEO purposes.

Maybe this maternity jeans comment spam is tied to the site owner, but if they didn't do it, how do they prove it?

Once again, I'll reiterate Bill Black

This is called now the winner-take-all society. In other words the rewards go overwhelmingly to just the thinnest crust of folks. The winner-take-all society creates incredibly perverse incentives to become a cheater-take-all society. Cause my chances of winning an honest competition are very poor. Why would I be the one guy or gal who would be the absolute best in the world? Why not cheat instead? - William K Black

The cost of "an academic test" can be as low as $5. You know you might be in trouble when you see fiverr.com/conversations/theirusername in your referrers:

Our site was hit with negative SEO. We have manually collected about 24,000 bad links for our disavow file (so far). It probably cost the perp $5 on Fiverr to point these links at our site. Do you want to know how bad that sucks? I'll tell you. A LOT!! Google should be sued enmass by web masters for wasting our time with this "bad link" nonsense. For a company with so many Ph.D's on staff, I can't believe how utterly stupid they are

Or, worse yet, you might see SAPE in your referrers

And if the attempt to get you torched fails, they can try & try again. The cost of failure is essentially zero. They can keep pouring on the fuel until the fire erupts.

Even Matt Cutts complains about website hacking, but that doesn't mean you are free of risk if someone else links to your site from hacked blogs. I've been forwarded unnatural link messages from Google which came about after a person's site was added in on a SAPE hack by a third party in an attempt to conceal who the beneficial target was. When in doubt, Google may choose to blame all parties in a scorched-earth strategy.

If you get one of those manual penalties, you're screwed.

Even if you are not responsible for such links, and even if you respond on the same day, and even if Google believes you, you are still likely penalized for AT LEAST a month. More likely, Google will presume you are a liar and you'll spend at least a second month in the penalty box. To recover you might have to waste days (weeks?) of your life & remove some of your organic links to show that you have gone through sufficient pain to appease the abusive market monopoly.

As bad as the above is, it is just the tip of the iceberg.

  • People can redirect torched websites.
  • People can link to you from spam link networks which rotate links across sites, so you can't possibly remove or even disavow all the link sources.
  • People can order you a subscription of those rotating spam links from hacked sites, where new spam links appear daily. Google mentioned discovering 9,500 malicious sites daily & surely the number has only increased from there.
  • People can tie any/all of the above with cloaking links or rel=canonical messages to GoogleBot & then potentially chain that through further redirects cloaked to GoogleBot.
  • And on and on ... the possibilities are endless.

Extortion

Another thing this link removal fiasco subsidizes is various layers of extortion.

Not only are there the harassing emails threatening to add sites to disavow lists if the links aren't removed, but some companies quickly escalate things from there. I've seen hosting abuse complaints, lawyer threat letters, and one friend was actually sued in court (by the very people who had placed the link in the first place!)

Google created a URL removal tool which allows webmasters to remove pages from third party websites. How long until that is coupled with DDoS attacks? Once effective with removing one page, a competitor might decide to remove another.

Another approach to get links removed is to offer payment. But payment itself might encourage the creation of further spammy links as link networks look to replace their old cashflow with new sources.

The recent Expedia fiasco started as an extortion attempt: if the subject didn't want the post published, the author would "sell the post to the highest bidder."

Another nasty issue here is articles like this one on Link Research Tools, where they not only highlight client lists of particular firms, but then state which URLs have not yet been penalized followed by "most likely not yet visible." So long as that sort of "publishing" is acceptable in the SEO industry, you can bet that some people will hire the SEOs nearly guaranteeing a penalty to work on their competitor's sites, while having an employee write a "case study" for Link Research Tools. Is this the sort of bullshit we really want to promote?

Some folks are now engaging in overt extortion:

I had a client phone me today and say he had a call from a guy with an Indian accent who told him that he will destroy his website rankings if he doesn't pay him £10 per month to NOT do this.

Branding / Rebranding / Starting Over

Sites that are overly literal in branding likely have no chance at redemption. That triple hyphenated domain name in a market that is seen as spammy has zero chance of recovery.

Even being a generic unbranded site in a YMYL category can make you be seen as spam. The remote rater documents stated that the following site was spam...

... even though the spammiest thing on it was the stuff advertised in the AdSense ads:

For many (most?) people who receive a manual link penalty or are hit by Penguin it is going to be cheaper to start over than to clean up.

At the very minimum it can make sense to lay groundwork for a new project immediately just in case the old site can't recover or takes nearly a year to recover. However, even if you figure out the technical bits, as soon as you have any level of success (or as soon as you connect your projects together in any way) you once again become a target.

And you can't really invest in higher level branding functions unless you think the site is going to be around for many years to earn off the sunk cost.

Succeeding at SEO is not only about building rank while managing cashflow and staying unpenalized, but it is also about participating in markets where you are not marginalized due to Google inserting their own vertical search properties.

Even companies which are large and well funded may not succeed with a rebrand if Google comes after their vertical from the top down.

Hope & Despair

If you are a large partner affiliated with Google, hope is on your side & you can monetize the link graph: "By ensuring that our clients are pointing their links to maximize their revenue, we’re not only helping them earn more money, but we’re also stimulating the link economy."

You have every reason to be Excited, as old projects like Excite or Merchant Circle can be relaunched again and again.

Even smaller players with the right employer or investor connections are exempt from these arbitrary risks.

You can even be an SEO and start a vertical directory knowing you will do well if you can get that Google Ventures investment, even as other similar vertical directories were torched by Panda.

For most other players in that same ecosystem, the above tailwind is a headwind. Don't expect much 1 on 1 help in webmaster tools.

In this video Matt Cutts mentioned that Google takes over 400,000 manual actions each month & gets about 5,000 reconsideration request messages each week, so roughly 95% of the sites which receive notification never reply. Many of those who do reply are wasting their time. How many confirmed Penguin 1.0 recoveries are you aware of?

Even if a recovery is deserved, it does not mean one will happen, as errors do happen. And on the off chance recovery happens, recovery does not mean a full restoration of rankings.

There are many things we can learn from Google's messages, but probably the most important is this:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to heaven, we were all going direct the other way - in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only. - Charles Dickens, A Tale of Two Cities

SEO 2014

Jan 11th
posted in

We’re at the start of 2014.

SEO is finished.

Well, what we had come to know as the practical execution of “whitehat SEO” is finished. Google has defined it out of existence. Research keyword. Write page targeting keyword. Place links with that keyword in the link. Google cares not for this approach.

SEO, as a concept, is now an integral part of digital marketing. To do SEO in 2014 - Google-compliant, whitehat SEO - digital marketers must seamlessly integrate search strategy into other aspects of digital marketing. It’s a more complicated business than traditional SEO, but offers a more varied and interesting challenge, too.

Here are a few things to think about for 2014.

1. Focus On Brand

Big brands not only get a free pass, they can get extra promotion. By being banned. Take a look at Rap Genius. Aggressive link-building strategy leads to de-indexing. A big mea culpa follows and what happens? Not only do they get reinstated, they’ve earned themselves a wave of legitimate links.

Now that’s genius.

Google would look deficient if they didn’t show that site as visitors would expect to find it - enough people know the brand. To not show a site that has brand awareness would make Google look bad.

Expedia's link profile was similarly outed for appearing to be at odds with Google's published standards. Could a no-name site pass a hand inspection if they use aggressive linking? Unlikely.

What this shows is that if you have a brand important enough so that Google would look deficient by excluding it, then you will have advantages that no-name sites don’t enjoy. You will more likely pass manual inspections, and you’re probably more likely to get penalties lifted.

What is a brand?

In terms of search, it’s a site that visitors can find with a brand search. Just how much search volume you require is open to debate, but you don’t need to be a big brand like Apple, Trip Advisor, or Microsoft. Rap Genius aren't. Ask “would Google look deficient if this site didn’t show up?” You can usually answer that by looking for evidence of search volume on the site's name.

In advertising, brands have been used to secure a unique identity. That identity is associated with a product or service by the customer. Search used to be about matching a keyword term. But as keyword areas become saturated, and Google returns fewer purely keyword-focused pages anyway, developing a unique identity is a good way forward.

If you haven’t already, put some work into developing a cohesive, unique brand. If you have a brand, then think about generating more awareness. This may mean higher spends on brand-related advertising than you’ve allocated in previous years. The success metric is an increase in brand searches i.e. the name of the site.

2. Be Everywhere

The idea of a stand-alone site is becoming redundant. In 2014, you need to be everywhere your potential visitors reside. If your potential visitors are spending all day in Facebook, or YouTube, that’s where you need to be, too. It’s less about them coming to you, which is the traditional search metaphor, and a lot more about you going to them.

You draw visitors back to your site, of course, but look at every platform and other site as a potential extension of your own site. Pages or content you place on those platforms are yet another front door to your site, and can be found in Google searches. If you’re not where your potential visitors are, you can be sure your competitors will be, especially if they’re investing in social media strategies.

A reminder to see all channels as potential places to be found.

Mix in cross-channel marketing with remarketing and consumers get the perception that your brand is bigger. Google ran the following display ad before they broadly enabled retargeting ads. Retargeting only further increases that lift in brand searches.

3. Advertise Everywhere

Are you finding it difficult to get top ten in some areas? Consider advertising with AdWords and on the sites that already hold those positions. Do some brand advertising on them to raise awareness and generate some brand searches. An advert placed on a site that offers a complementary good or service might be cheaper than going to the expense and effort needed to rank. It might also help insulate you from Google’s whims.

The same goes for guest posts and content placement, although obviously you need to be a little careful as Google can take a dim view of it. The safest way is to make sure the content you place is unique, valuable and has utility in its own right. Ask yourself if the content would be equally at home on your own site if you were to host it for someone else. If so, it’s likely okay.

4. Valuable Content

Google does an okay job of identifying good content. It could do better. They’ve lost their way a bit in terms of encouraging production of good content. It’s getting harder and harder to make the numbers work in order to cover the production cost.

However, it remains Google’s mission to provide the user with answers the visitor deems relevant and useful. The utility of Google relies on it. Any strategy that is aligned with providing genuine visitor utility will align with Google’s long term goals.

Review your content creation strategies. Content that is of low utility is unlikely to prosper. While it’s still a good idea to use keyword research as a guide to content creation, it’s a better idea to focus on topic areas and creating engagement through high utility. What utility is the user expecting from your chosen topic area? If it’s rap lyrics for song X, then only the rap lyrics for song X will do. If it is plans for a garden, then only plans for a garden will do. See being “relevant” as “providing utility”, not keyword matching.

Go back to the basic principles of classifying the search term as either Navigational, Informational, or Transactional. If the keyword is one of those types, make sure the content offers the utility expected of that type. Be careful when dealing with informational queries that Google could use in its Knowledge Graph. If your pages deal with established facts that anyone else can state, then you have no differentiation, and that type of information is more likely to end up as part of Google’s Knowledge Graph. Instead, go deep on information queries. Expand the information. Associate it with other information. Incorporate opinion.

BTW, Bill has some interesting reading on the methods by which Google might be identifying different types of queries.

Methods, systems, and apparatus, including computer program products, for identifying navigational resources for queries. In an aspect, a candidate query in a query sequence is selected, and a revised query subsequent to the candidate query in the query sequence is selected. If a quality score for the revised query is greater than a quality score threshold and a navigation score for the revised query is greater than a navigation score threshold, then a navigational resource for the revised query is identified and associated with the candidate query. The association specifies the navigational resource as being relevant to the candidate query in a search operation.
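Stripped of the patent-speak, the decision rule in that abstract is just a pair of threshold checks on the revised query. Here's a minimal Python sketch of that rule; the function name, the score values, and the threshold values are all hypothetical (the patent doesn't publish any numbers):

```python
# Hypothetical thresholds -- the patent describes thresholds but gives no values.
QUALITY_THRESHOLD = 0.5
NAVIGATION_THRESHOLD = 0.8

def associate_navigational_resource(candidate_query, revised_query_scores,
                                    navigational_resource):
    """Sketch of the patent's rule: if the revised query that followed the
    candidate query clears both score thresholds, its navigational resource
    is associated back with the earlier (often misspelled) candidate query."""
    quality, navigation = revised_query_scores
    if quality > QUALITY_THRESHOLD and navigation > NAVIGATION_THRESHOLD:
        return {candidate_query: navigational_resource}
    return {}

# e.g. a user types "facebok", revises it to "facebook", then clicks through:
print(associate_navigational_resource("facebok", (0.9, 0.95), "facebook.com"))
# -> {'facebok': 'facebook.com'}
```

The practical takeaway is the same as Bill's: Google can learn that a brand site is the right answer even for queries that never contained the brand name correctly.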

5. Solve Real Problems

This is a follow-on from “ensuring you provide content with utility”. Go beyond keyword and topical relevance. Ask “what problem is the user trying to solve?” Is it an entertainment problem? A “How To” problem? What would their ideal solution look like? What would a great solution look like?

There is no shortcut to determining what a user finds most useful. You must understand the user. This understanding can be gleaned from interviews, questionnaires, third party research, chat sessions, and monitoring discussion forums and social channels. Forget about the keyword for the time being. Get inside a visitor's head. What is their problem? Write a page addressing that problem by providing a solution.

6. Maximise Engagement

Google are watching for the click-back to the SERP results, an action characteristic of a visitor who clicked through to a site and didn’t deem what they found to be relevant to the search query in terms of utility. Relevance in terms of subject match is now a given.

Big blocks of dense text, even if relevant, can be off-putting. Add images and videos to pages that have low engagement and see if this fixes the problem. Where appropriate, make sure the user takes an action of some kind. Encourage the user to click deeper into the site following an obvious, well placed link. Perhaps they watch a video. Answer a question. Click a button. Anything that isn’t an immediate click back.

If you’ve focused on utility, and genuinely solving a user's problem, as opposed to just matching a keyword, then your engagement metrics should be better than the guy who is still just chasing keywords and only matching in terms of relevance to a keyword term.

7. Think Like A PPCer

Treat every click like you were paying for it directly. Once that visitor has arrived, what is the one thing you want them to do next? Is it obvious what they have to do next? Always think about how to engage that visitor once they land. Get them to take an action, where possible.

8. Think Like A Conversion Optimizer

Conversion optimization tries to reduce the bounce-rate by re-crafting the page to ensure it meets the user's needs. They do this by split testing different designs, phrases, copy and other elements on the page.

It’s pretty difficult to test these things in SEO, but it’s good to keep this process in mind. What pages of your site are working well and which pages aren’t? Is it anything to do with different designs or element placement? What happens if you change things around? What do the three top ranking sites in your niche look like? If their link patterns are similar to yours, what is it about those sites that might lead to higher engagement and relevancy scores?
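If you do run split tests, the standard way to decide whether a variant genuinely engages better (rather than just looking better by chance) is a two-proportion z-test. A self-contained sketch; the sample numbers below are made up purely for illustration:

```python
import math

def two_proportion_z(engaged_a, n_a, engaged_b, n_b):
    """Two-proportion z-test: is variant B's engagement rate significantly
    different from variant A's? Returns the z statistic."""
    p_a, p_b = engaged_a / n_a, engaged_b / n_b
    # Pooled proportion under the null hypothesis (no real difference).
    p_pool = (engaged_a + engaged_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# e.g. 100 of 1,000 visitors engaged on the old page vs 150 of 1,000 on the new:
z = two_proportion_z(100, 1000, 150, 1000)
# z comes out around 3.38; |z| > 1.96 means significant at the 5% level
```

It's a rough tool, but it keeps you from redesigning pages around differences that are just noise.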

9. Rock Solid Strategic Marketing Advantage

SEO is really hard to do on generic me-too sites. It’s hard to get links. It’s hard to get anyone to talk about them. People don’t share them with their friends. These sites don’t generate brand searches. The SEO option for these sites is typically what Google would describe as blackhat, namely link buying.

Look for a marketing angle. Find a story to tell. Find something unique and remarkable about the offering. If a site doesn’t have a clearly articulated point of differentiation, it's that much harder to get value from organic search whilst staying within the guidelines.

10. Links

There’s a reason Google hammers links. It’s because they work. Else, surely Google wouldn’t make a big deal about them.

Links count. It doesn’t matter if they are no-follow, scripted, within social networks, or wherever, they are still paths down which people travel. It comes back to a clear point of differentiation, genuine utility and a coherent brand. It’s a lot easier, and safer, to link build when you’ve got all the other bases covered first.

Did Matt Cutts Endorse Rap Genius Link Spam?

Jan 4th

On TWIG Matt Cutts spoke about the importance of defunding spammers & breaking their spirits.

If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits. You want to make them frustrated and angry. There are parts of Google's algorithms specifically designed to frustrate spammers and mystify them and make them frustrated. And some of the stuff we do gives people a hint their site is going to drop and then a week or two later their site actually does drop so they get a little bit more frustrated. And so hopefully, and we've seen this happen, people step away from the dark side and say "you know, that was so much pain and anguish and frustration, let's just stay on the high road from now on" some of the stuff I like best is when people say "you know this SEO stuff is too unpredictable, I'm just going to write some apps. I'm going to go off and do something productive for society." And that's great because all that energy is channeled at something good.

What was less covered was that in the same video Matt Cutts made it sound like anything beyond information architecture, duplicate content cleanup & clean URLs was quickly approaching scamming - especially anything to do with links. So over time more and more behaviors get reclassified as black hat spam as Google gains greater control over the ecosystem.

there's the kind of SEO that is better architecture, cleaner URLs, not duplicate content ... that's just like making sure your resume doesn't have any typos on it. that's just clever stuff. and then there's the type of SEO that is sort of cheating. trying to get a lot of bad backlinks or scamming, and that's more like lying on your resume. when you get caught sometime's there's repercussions. and it definitely helps to personalize because now anywhere you search for plumbers there's local results and they are not the same across the world. we've done a diligent job of trying to crack down on black hat spam. so we had an algorithm named Penguin that launched that kind of had a really big impact. we had a more recent launch just a few months ago. and if you go and patrole the black hat SEO forums where the guys talk about the techniques that work, now its more people trying to sell other people scams rather than just trading tips. a lot of the life has gone out of those forums. and even the smaller networks that they're trying to promote "oh buy my anglo rank or whatever" we're in the process of tackling a lot of those link networks as well. the good part is if you want to create a real site you don't have to worry as much about these bad guys jumping ahead of you. the playing ground is a lot more level now. panda was for low quality. penguin was for spam - actual cheating.

The Matt Cutts BDSM School of SEO

As part of the ongoing campaign to "break their spirits" we get increasing obfuscation, greater time delays between certain algorithmic updates, algorithmic features built explicitly with the goal of frustrating people, greater brand bias, and more outrageous selective enforcement of the guidelines.

Those who were hit by either Panda or Penguin in some cases took a year or more to recover. Far more common is no recovery — ever. How long do you invest in & how much do you invest in a dying project when the recovery timeline is unknown?

You Don't Get to Fascism Without 2-Tier Enforcement

While success in and of itself may make one a "spammer" to the biased eyes of a search engineer (especially if you are not VC funded nor part of a large corporation), many who are considered "spammers" self-regulate in a way that make them far more conservative than the alleged "clean" sites do.

Pretend you are Ask.com and watch yourself get slaughtered without warning.

Build a big brand & you will have advanced notification & free customer support inside the GooglePlex:

In my experience with large brand penalties, (ie, LARGE global brands) Google have reached out in advance of the ban every single time. - Martin Macdonald

Launching a Viral Linkspam Sitemap Campaign

When RapGenius was penalized, it was because they were broadly, openly & publicly soliciting bloggers, offering promotion in exchange for dumping lists of keyword-rich deeplinks into their blog posts. They were basically turning boatloads of blogs into mini-sitemaps for popular new music albums.

Remember reading dozens (hundreds?) of blog posts last year about how guest posts are spam & Google should kill them? Well these posts from RapGenius were like a guest post on steroids. The post "buyer" didn't have to pay a single cent for the content, didn't care at all about relevancy, AND a sitemap full of keyword rich deep linking spam was included in EACH AND EVERY post.

Most "spammers" would never attempt such a campaign because they would view it as being far too spammy. They would have a zero percent chance of recovery as Google effectively deletes their site from the web.

And while RG is quick to distance itself from scraper sites, for almost the entirety of their history virtually none of the lyrics posted on their site were even licensed.

In the past I've mentioned Google is known to time the news cycle. So it came as no surprise that on a Saturday, barely a week after the penalty, Google restored RapGenius's rankings.

How to Gain Over 400% More Links, While Allegedly Losing

While the following graph may look scary in isolation, if you know the penalty is only a week or two then there's virtually no downside.

Since being penalized, RapGenius has gained links from over 1,000* domains

  • December 25th: 129
  • December 26th: 85
  • December 27th: 87
  • December 28th: 54
  • December 29th: 61
  • December 30th: 105
  • December 31st: 182
  • January 1st: 142
  • January 2nd: 112
  • January 3rd: 122

The above add up to 1,079 & RapGenius has built a total of only 11,930 unique linking domains in their lifetime. They grew about 10% in 10 days!
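For the record, the arithmetic checks out. A quick sketch using the daily figures listed above:

```python
# Daily new referring domains reported above (source: Ahrefs).
daily_new_domains = [129, 85, 87, 54, 61, 105, 182, 142, 112, 122]

gained = sum(daily_new_domains)       # domains gained while "penalized"
lifetime_domains = 11_930             # total unique linking domains ever built
growth_pct = gained / lifetime_domains * 100

print(gained, round(growth_pct, 1))   # 1079 9.0 -- i.e. roughly 10% in 10 days
```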

On every single day the number of new referring domains VASTLY exceeded the number of referring domains that disappeared. And many of these new referring domains are the mainstream media and tech press sites, which are both vastly over-represented in importance/authority on the link graph. They not only gained far more links than they lost, but they also gained far higher quality links that will be nearly impossible for their (less spammy) competitors to duplicate.

They not only got links, but the press coverage acted as a branded advertising campaign for RapGenius.

Here's some quotes from RapGenius on their quick recovery:

  • "we owe a big thanks to Google for being fair and transparent and allowing us back onto their results pages" <-- Not the least bit true. RapGenius was not treated fairly; rather, they were given a free ride compared to the death sentence hundreds of thousands of small businesses have been handed over the past couple years.
  • "On guest posts, we appended lists of song links (often tracklists of popular new albums) that were sometimes completely unrelated to the music that was the subject of the post." <-- and yet others are afraid of writing relevant, on-topic posts due to Google's ramped-up fearmongering campaigns
  • "we compiled a list of 100 “potentially problematic domains”" <-- so their initial list of domains to inspect was less than 10% the number of links they gained while being penalized
  • "Generally Google doesn’t hold you responsible for unnatural inbound links outside of your control" <-- another lie
  • "of the 286 potentially problematic URLs that we manually identified, 217 (more than 75 percent!) have already had all unnatural links purged." <-- even the "all in" removal of pages was less than 25% of the number of unique linking domains generated during the penalty period

And Google allowed the above bullshit during a period when they were sending out messages telling other people WHO DID THINGS FAR LESS EGREGIOUS that they are required to remove more links & Google won't even look at their review requests for at least a couple weeks - A TIME PERIOD GREATER THAN THE ENTIRE TIME RAPGENIUS WAS PENALIZED FOR.

In Conclusion...

If you tell people what works and why you are a spammer with no morals. But if you are VC funded, Matt Cutts has made it clear that you should spam the crap out of Google. Just make sure you hire a PR firm to trump up press coverage of the "unexpected" event & have a faux apology saved in advance. So long as you lie to others and spread Google's propaganda you are behaving in an ethical white hat manner.

Notes

* These stats are from Ahrefs. A few of these links may have been in place before the penalty and only recently crawled. However it is also worth mentioning that all third party databases of links are limited in size & refresh rate by optimizing their capital spend, so there are likely hundreds more links which have not yet been crawled by Ahrefs. One should also note that the story is still ongoing and they keep generating more links every day. By the time the story is done spreading they are likely to see roughly a 30% growth in unique linking domains in about 6 weeks.

Gray Hat Search Engineering

Jan 3rd

Almost anyone in internet marketing who has spent a couple months in the game has seen some "shocking" case study where changing the color of a button increased sales 183% or such. In many cases such changes only happen when the original site had not had any focus on conversion at all.

Google, on the other hand, has billions of daily searches and is constantly testing ways to increase yield:

The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.
...
By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.

One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands of advertisers - sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only pay for clicks that you were getting for free, but you also pay more for the other paid clicks you were getting cheaper in the past.

That's how monopolies work - according to Eric Schmidt they are immune from market forces.

Search itself is the original "native ad." The blend confuses many searchers as the background colors fade into white.

Google tests colors & can control the flow of traffic based not only on result displacement, but also the link colors.

It was reported last month that Google tested adding ads to the knowledge graph. The advertisement link is blue, while the ad disclosure is to the far right out of view & gray.

I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.

Where Google gets paid for the link, the link is blue.

Where Google scrapes third party content & shows excerpts, the link is gray.

The primary goal of such a knowledge block is result displacement - shifting more clicks to the ads and away from the organic results.

When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it's below the fold.

What's so bad about this practice in health?

  • Context Matters: Many issues have overlapping symptoms, where a quick glance at a few out-of-context symptoms causes a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google's remote rater documents discuss "your money or your life" (YMYL) topics & talk up the importance of knowing who exactly is behind content, but when they use a gray font on the source link for their scrape job they are doing just the opposite.
  • Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can't find a pharmaceutical company worth tens of billions of dollars that hasn't pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
  • Artificially Driving Up Prices: In-patent drugs often cost 100x as much as the associated generics, so the affordable solutions are priced out of the ad auctions, where the price of a click can vastly exceed the profit from selling a generic prescription drug.

Where's the business model for publishers when they carry real editorial costs, must fact-check and regularly update their content, and their content is good enough to be featured front & center on Google, yet attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?

Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?

Where do you place your chips?

Google believes it can ultimately fulfil people’s data needs by sending results directly to microchips implanted into its users’ brains.

Historical Revisionism

Nov 1st

A stopped clock is right two times a day.

There’s some amusing historical revisionism going on in the SEO punditry world right now, which got me thinking about the history of SEO. I’d like to talk about some common themes of this historical revision, which go along the lines of “what I predicted all those years ago came true - what a visionary I am!” No naming names, as I don't mean this to be anything personal - the same theme has popped up in a number of places - just making some observations :)

See if you agree….

Divided We Fall

The SEO world has never been united. There are no industry standards and qualifications like you’d find in the professions, such as being a doctor, a lawyer, or a builder. If you say you’re an SEO, then you’re an SEO.

Part of the reason for the lack of industry standard is that the search engines never came to the party. Sure, they talked at conferences, and still do. They offered webmasters helpful guidelines. They participated in search engine discussion forums. But this was mainly to do with risk management. Keep your friends close, and your enemies closer.

In all these years, you won’t find one example of a representative from a major search engine saying “Hey, let’s all get together and form an SEO standard. It will help promote and legitimize the industry!”.

No, it has always been decrees from on high. “Don’t do this, don’t do that, and here are some things we’d like you to do”. Webmasters don’t get a say in it. They either do what the search engines say, or they go against them, but make no mistake, there was never any partnership, and the search engines didn’t seek one.

This didn’t stop some SEOs seeing it as a form of quasi-partnership, however.

Hey Partner

Some SEOs chose to align themselves with search engines and do their bidding. If the search engine reps said “do this”, they did it. If the search engines said “don’t do this”, they’d wrap themselves up in convoluted rhetorical knots pretending not to do it. This still goes on, of course.

In the early 2000s, it turned, curiously, into a question of morality. There was “Ethical SEO”, although quite what it had to do with ethics remains unclear. Really, it was another way of saying “someone who follows the SEO guidelines”, presuming that whatever the search engines decree must be ethical, objectively good, and have nothing to do with self-interest. It’s strange how people kid themselves, sometimes.

What was even funnier was the search engine guidelines were kept deliberately vague and open to interpretation, which, of course, led to a lot of heated debate. Some people were “good” and some people were “bad”, even though the distinction was never clear. Sometimes it came down to where on the page someone puts a link. Or how many times someone repeats a keyword. And in what color.

It got funnier still when the search engines moved the goal posts, as they are prone to do. What was previously good - using ten keywords per page - suddenly became the height of evil, but using three was “good” and so all the arguments about who was good and who wasn’t could start afresh. It was the pot calling the kettle black, and I’m sure the search engines delighted in having the enemy warring amongst themselves over such trivial concerns. As far as the search engines were concerned, none of them were desirable, unless they became paying customers, or led paying customers to their door. Then there was all that curious Google+ business.

It's hard to keep up, sometimes.

Playing By The Rules

There’s nothing wrong with playing by the rules. It would have been nice to think there was a partnership, and so long as you followed the guidelines, high rankings would naturally follow, the bad actors would be relegated, and everyone would be happy.

But this has always been a fiction. A distortion of the environment SEOs were actually operating in.

Jason Calacanis, never one to miss an opportunity for controversy, fired some heat seekers at Google during his WebmasterWorld keynote address recently…..

Calacanis proceeded to describe Cutts and Google in terms like, “liar,” “evil,” and “a bad partner.” He cautioned the PubCon audience to not trust Google, and said they cooperate with partners until they learn the business and find a way to pick off the profits for themselves. The rant lasted a good five minutes….

He accused Google of doing many of the things SEOs are familiar with, like making abrupt algorithm changes without warning. They don’t consult, they just do it, and if people’s businesses get trashed as a result, then that’s just too bad. Now, if that’s a sting for someone who is already reasonably wealthy and successful like Calacanis, just imagine what it feels like for the much smaller web players who are just trying to make a living.

The search business is not a pleasant environment where all players have an input, and then standards, terms and play are generally agreed upon. It’s war. It’s characterized by a massive imbalance of power and wealth, and one party will use it to crush those who it determines stands in its way.

Of course, the ever pleasant Matt Cutts informs us it’s all about the users, and that’s a fair enough spin of the matter, too. There was, and is, a lot of junk in the SERPs, and Mahalo was not a partner of Google, so any expectation they’d have a say in what Google does is unfounded.

The take-away is that Google will set rules that work for Google, and if they happen to work for the webmaster community too, well that’s good, but only a fool would rely on it. Google care about their bottom line and their projects, not ours. If someone goes out of business due to Google’s behaviour, then so be it. Personally, I think the big technology companies do have a responsibility beyond themselves to society, because the amount of power they are now centralising means they’re not just any old company anymore, but great vortexes that can distort entire markets. For more on this idea, and where it’s all going, check out my review of “Who Owns The Future” by Jaron Lanier.

So, if you see SEO as a matter of playing by their rules, then fine, but keep in mind "those who can give you everything can also take everything away". Those rules weren't designed for your benefit.

Opportunity Cost

There was a massive opportunity cost by following so called ethical SEO during the 2000s.

For a long time, it was relatively easy to get high rankings by being grey. And if you got torched, you probably had many other sites with different link patterns good to go. This was against the webmaster guidelines, but given that marketing could be characterized as war, one does not let the enemy define one's tactics. Some SEOs made millions doing it. Meanwhile, a lot of content-driven sites disappeared. That was, perhaps, my own "a stopped clock is right two times a day" moment. It's not like I'm going to point you to all the stuff I've been wrong about, now is it :)

These days, a lot of SEO is about content and how that content is marketed, but more specifically it’s about the stature of the site on which that content appears. That’s the bit some pundits tend to gloss over. You can have great content, but that’s no guarantee of anything. You will likely remain invisible. However, put that exact same content on a Fortune 500 site, and that content will likely prosper. Ah, the rich get richer.

So, we can say SEO is about content, but that’s only half the picture. If you’re a small player, the content needs to appear in the right place, be very tightly targeted to your audience’s needs so they don’t click back, and it should be pushed through various media channels.

Content, even from many of these "ethical SEOs", used to be created for search engines in the hope of netting as many visitors as possible. These days, it’s probably a better strategy to get inside the audience's heads and target it to their specific needs, as opposed to a keyword, then get that content out to wherever your audience happens to be. Unless, of course, you’re Fortune 500 or otherwise well connected, in which case you can just publish whatever you like and it will probably do well.

Fair? Not really, but no one ever said this game was fair.

Whatever Next?

Do I know what’s going to happen next? In ten years time? Nope. I could make a few guesses, and like many other pundits, some guesses will prove right, and some will be wrong, but that’s the nature of the future. It will soon make fools of us all.

Having said that, will you take a punt and tell us what you think the future of SEO will be? Does it have one? What will it look like? If you’re right, then you can point back here in a few years’ time and say “Look, I told you so!”.

If you’re wrong, well, there’s always historical revisionism :)

Google Keyword (Not Provided)

Sep 25th

Just a follow-up on the prior (not provided) post, as Google has shot the moon since then. Here's a quick YouTube video.

The above video references the following:

Matt Cutts when secured search first rolled out:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.

This Week in Google (TWIG) show 211, where Matt mentioned the inspiration for encrypted search:

we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said "we've got to encrypt the web."

The integration of organic search performance data inside AdWords.

The esteemed AdWords advertiser David Whitaker.

When asked about the recent increase in (not provided), a Google representative stated the following:

We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….

The motivation here is not to drive the ads side — it’s for our search users.

What an excellent time for Google to block paid search referrals as well.

If the move is important for user safety then it should apply to the ads as well.

The Benefits Of Thinking Like Google

Aug 27th

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google’s negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google has never encouraged activity that could manipulate rankings, which is precisely what those building links for the purposes of SEO attempt to do. Building links for the purposes of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or, at the very least, no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier, in response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to downplay any SEO bias, i.e. what we may like to see from an SEO standpoint, and instead try to look at the world from Google’s point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of what I show above the fold is content controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google aren’t the webmaster’s friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would be madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google link quality control. For free. They’re pointing out the links they see as being “bad” - links Google’s algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner's perspective, taking links down involves a time cost, and there is no benefit to the site owner in doing so, especially if they receive numerous requests. Secondly, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but that the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much that Google doesn’t already know, as there’s been no manual link review.
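For reference, the disavow file Google accepts is just a plain-text list: one URL per line, a `domain:` prefix to disavow an entire domain, and `#` for comments. A minimal example (the domains and dates below are placeholders, not from any real filing):

```text
# Contacted the site owner on 2014-02-01, no response
domain:spammy-directory.example

# Single page we could not get taken down
http://blog.example/paid-links-page.html
```

Note that the tool itself expects you to have attempted removal first, which is exactly the manual-review signal described above.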

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place, for the purposes of achieving higher rank, are going to need to be no-followed in order to be clear about intent. Extreme? What's going to be the emphasis in six months time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would be unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at anytime whether you follow the current guidelines or not, especially when the goal-posts keep moving. So, the risk of not following the guidelines, and following the guidelines but not ranking well is pretty much the same - no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Winning Strategies to Lose Money With Infographics

Aug 26th

Google is getting a bit absurd in suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed a negative ROI on each investment - until your company goes under.

Some will ascribe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.
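The arithmetic above is a simple expected-value calculation. A quick sketch, using the article's own ballpark figures ($2,000 per infographic, roughly a 1-in-10 hit rate):

```python
# Rough expected cost of one successful viral infographic,
# using the article's ballpark figures.
cost_per_infographic = 2_000   # research, creation, revision & promotion
success_rate = 1 / 10          # roughly 9 out of 10 go nowhere

# Expected spend per success = unit cost / probability of success
expected_cost_per_success = cost_per_infographic / success_rate
print(expected_cost_per_success)  # 20000.0
```

Any inexperience, poor strategy, or bad timing lowers the success rate and pushes that per-success cost even higher.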

If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.
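To make the mechanics concrete, here is a hypothetical embed-code generator of the kind this advice targets; the function name and URLs are made up for illustration. Adding rel="nofollow" to the attribution link is exactly what strips the original creator of any link equity:

```python
def infographic_embed(page_url, image_url, title, nofollow=True):
    """Build the HTML snippet a publisher pastes to syndicate an infographic.

    With nofollow=True (the 'safe' advice critiqued above), the attribution
    link passes no PageRank back to the original creator.
    """
    rel = ' rel="nofollow"' if nofollow else ""
    return (
        f'<a href="{page_url}"{rel}>'
        f'<img src="{image_url}" alt="{title}"></a>'
    )

# Hypothetical example URLs:
snippet = infographic_embed(
    "https://example.com/infographic",
    "https://example.com/infographic.png",
    "Example Infographic",
)
print(snippet)
```

The irony, as noted below, is that larger platforms set up their own embed codes with direct, followed links and carry none of this risk.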

So if your own featured premium content that you dropped 4 or 5 figures on gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find the claim a bit disingenuous that people putting something 20,000 pixels tall on their site are not actively vouching for it. If something were crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I are spammers, but put the content on Twitter with 143,199 Tweets in a second & those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure, there are a handful of token counter-examples shills can point at). By and large, it is only when the links flow out to smaller players that they are presumed to be spam - even when they point at featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind, unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & many others whom DennisG highlighted.

Oh, and here's an infographic for your pleasurings.
