How Google Destroyed the Value of Google Site Search

Mar 18th

Do You Really Want That Indexed?

On-demand indexing was a great value-added feature for Google Site Search, but now it carries more risks than ever. Why? Google decides how many of your documents make it into their primary index. And if too many of your documents are arbitrarily considered "low quality" then you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products. In response Google goes out of its way to destroy your business. Awesome!

Keep in mind that Google was directly responsible for the creation of AdSense farms. And rather than addressing them directly, Google had to roll everything through an arbitrary algorithmic approach.

<meta name="googlebot" content="noindex" />

Part of the prescribed solution to the Panda Update is to noindex content that Google deems to be of low quality. But if you tell GoogleBot to noindex some of your content while also using Google for site search, you destroy the usability of their site search feature by making that content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value destructive than the arbitrary price jack Google Site Search recently pulled.

We currently use Google Site Search on our site here, but given Google's arbitrary switcheroo-styled stuff, I would be the first person to dump it if they hit our site with their stupid "low quality" filter that somehow missed eHow & sites which wrap repurposed tweets in a page. :D

Cloaking vs rel=noindex, rel=canonical, etc. etc. etc.

Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google's algorithms are so complex that you literally have to break some of Google's products to be able to work with other Google products. How stupid! But a healthy reminder for those considering deeply integrating Google into your on-site customer experience. Who knows when their model will arbitrarily change again? But we do know that when it does they won't warn partners in advance. ;)

I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on their list of crawlers, questions about whether they share the same user agent have gone unanswered, and even a blog post like this probably won't get a response.

That is just one more layer of hypocrisy: Google states that if you don't provide great customer service then your business is awful, yet going to the dentist is more fun than trying to get any customer service from Google. :D

I was talking to a friend about this stuff and I think he summed it up perfectly: "The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will."

Is the Huffington Post Google's Favorite Content Farm?

I was looking for information about the nuclear reactor issue in Japan and am glad it did not turn out as bad as it first looked!

But in that process of searching for information I kept stumbling into garbage hollow websites. I was cautious not to click on the malware results, but of the mainstream sites covering the issue, one of the most flagrant efforts was from the Huffington Post.

AOL recently announced that they were firing 15% to 20% of their staff. No need for original stories or even staff writers when you can literally grab a third party tweet, wrap it in your site design, and rank it in Google. In line with that spirit, I took a screenshot. Rather than calling it the Huffington Post I decided a more fitting title would be plundering host. :D

plundering host.

We were told that the content farm update was to get rid of low quality web pages & yet that information-less page was ranking at the top of their search results, when it was nothing but a 3rd party tweet wrapped in brand and ads.

How does Huffington Post get away with that?

You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red. - Google's Amit Singhal

If you make it past Google's arbitrary line in the sand there is no limit to how much spamming and jamming you can do.

we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. - Matt Cutts
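To make the hyperplane talk concrete, here is a toy sketch of that kind of linear classifier. Everything in it is invented for illustration (the features, the labels, the sites); Google's real system is obviously far more complex.

    # Toy version of the "separating plane" Singhal describes: label a few
    # sites good/bad, featurize them, and fit a plane between the two groups.
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    # Hypothetical features: [words per page, ads per page, duplicate-content ratio]
    sites = [
        [1200, 1, 0.02],  # IRS-style reference page
        [900,  2, 0.05],  # newspaper article
        [250,  8, 0.60],  # thin, ad-heavy page
        [180, 10, 0.85],  # scraped content wrapped in ads
    ]
    labels = [1, 1, 0, 0]  # 1 = "high quality" side of the plane, 0 = "low quality"

    clf = make_pipeline(StandardScaler(), LinearSVC())
    clf.fit(sites, labels)

    # Every new site is judged by which side of the learned plane it lands on.
    print(clf.predict([[300, 7, 0.50]]))  # most likely the "low quality" side

The point of the sketch: once the plane is drawn, everything is binary. A page that lands just barely on the wrong side gets treated the same as pure spam.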

(G)arbitrage never really goes away, it just becomes more corporate.

The problem with Google arbitrarily picking winners and losers is the winners will mass produce doorway pages. With much of the competition (including many of the original content creators) removed from the search results, this sort of activity is simply printing money.

As bad as that sounds, it is actually even worse than that. Today Google Alerts showed our brand being mentioned on a group-piracy website built around a subscription model of selling 3rd party content without permission! As annoying as that feels, of course there are going to be some dirtbags you have to deal with from time to time. But now that the content farm update has gone through, some of the original content producers are no longer ranking for their own titles, whereas piracy sites that stole their content are now the canonical top ranked sources!

Google never used to put piracy sites on the first page of results for my books, this is a new feature on their part, and I think it goes a long way to show that their problem is cultural rather than technical. Google seems to have reached the conclusion that since many of their users are looking for pirated eBooks, quality search results means providing them with the best directory of copyright infringements available. And since Google streamlined their DMCA process with online forms, I couldn’t discover a method of telling them to remove a result like this from their search results, though I tried anyway.
... I feel like the guy who was walking across the street when Google dropped a 1000 pound bomb to take out a cockroach - Morris Rosenthal

Way to go Google! +1 +1

Too clever by half.

Google's Matt Cutts Talks Down Keyword Domain Names

Mar 11th

I have long documented Google's preference toward brands, while Google has always stated that they don't really think of brands.

While not thinking of brands, someone on the Google UI team later added navigational aids to the search results promoting popular brands - highlighting the list of brands with the label "brands" before the list of links.

Take a look at what Matt Cutts shares in the following video, where he tries to compare brand domain names vs keyword domain names. He highlights brand over and over again, and then when he talks about exact match domains getting a bonus or benefit, he highlights that Google may well dial that down soon.

Now if you are still on the fence, let me just give you a bit of color: we have looked at the rankings and the weights that we give to keyword domains, & some people have complained that we are giving a little too much weight for keywords in domains. So we have been thinking about adjusting that mix a bit and sort of turning the knob down within the algorithm, so that given 2 different domains it wouldn't necessarily help you as much to have a domain name with a bunch of keywords in it. - Matt Cutts
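To picture what "turning the knob down" means in practice, here is a toy scoring function. The signals and weights are made up for illustration; nobody outside Google knows the real mix.

    # Hypothetical ranking score as a weighted blend of signals. "Turning the
    # knob down" on exact-match domains just means shrinking one weight.
    def rank_score(signals, emd_weight=0.15):
        return (0.50 * signals["links"]
                + 0.35 * signals["brand"]
                + emd_weight * signals["exact_match_domain"])

    site = {"links": 0.6, "brand": 0.2, "exact_match_domain": 1.0}
    print(rank_score(site))                   # 0.52 with the current knob setting
    print(rank_score(site, emd_weight=0.05))  # 0.42 after the dial-down

Note that a site leaning entirely on brand signals is untouched by the change, which is rather the theme of this post.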

For years the Google algorithm moved in one direction, and that was placing increased emphasis on brand and domain authority. That created the content farm problem, but with the content farm update they figured out how to dial down a lot of junk hollow authority sites. They were able to replace "on-topic-ness" with "good-ness," according to the search quality engineer who goes by the nickname moultano. As part of that content farm update, they dialed up brands to the point where now doorway pages are ranking well (so long as they are hosted on brand websites).

Google keeps creating more signals from social media and how people interact with the search results. A lot of those types of signals are going to end up favoring established brands which have large labor forces & offline marketing + distribution channels. Google owns about 97% of the mobile search market, so more and more of that signal will eventually end up bleeding into the online world.

In addition to learning from the firehose of mobile search data, Google is also talking about selling hotel ads on a price-per-booking basis. Google can get a taste of any transaction simply by offering free traffic in exchange for giving them the data needed to make a marketplace & then requiring access to the best deals & discounts:

It is believed that Google requires participating hotels to provide Google Maps with the lowest publicly available rates, for stays of one to seven nights, double occupancy, with arrival days up to 90 days ahead.

In a world where Google has business volume data, clientele demographics, pricing data, and customer satisfaction data for most offline businesses they don't really need to place too much weight on links or domain names. Businesses can be seen as being great simply by being great.*

(*and encouraging people to stuff the ballot box for them with discounts :D)

Classical SEO signals (on-page optimization, link anchor text, domain names, etc.) have value up until a point, but if Google is going to keep mixing in more and more signals from other data sources then the value of any single signal drops. I haven't bought any great domain names in a while, and with Google's continued brand push and Google coming over the top with more ad units (in markets like credit cards and mortgages) I am seeing more and more reason to think harder about brand. It seems that is where Google is headed. The link graph is rotted out by nepotism & paid links. Domain names are seen as a tool for speculation & a shortcut. It is not surprising Google is looking for more signals.

How have you adjusted your strategies of late? What happens to the value of domain names if EMD bonus goes away & Google keeps adding other data sources?

A Thought Experiment on Google Whitelisting Websites

Mar 11th

Google has long maintained that "the algorithm" is what controls rankings, except for sites which are manually demoted for spamming, getting hacked, delivering spyware, and so on.

At the SMX conference it was revealed that Google uses white listing:

Google and Bing admitted publicly to having ‘exception lists’ for sites that were hit by algorithms that should not have been hit. Matt Cutts explained that there is no global whitelist but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites.

The idea that "sites rank where they deserve, with the exception of spammers" has long been pushed to help indemnify Google from potential anti-competitive behavior. Google's marketing has further leveraged the phrase "unique democratic nature of the web" to highlight how PageRank originally worked.

But why don't we conduct a thought experiment to think through the difference between how Google actually behaves and how Google wants to be perceived as behaving?

Let's cover the negative view first. The negative view is that Google torches your stuff either because Google has a competing product or because a Google engineer dislikes you and is holding onto a grudge. Given Google's current monopoly-level marketshare in most countries, such behavior would be seen as unacceptable if Google were just picking winners and losers based on their business interests.

The positive view is that "the algorithm handles almost everything, except some edge cases of spam." Let's break down that positive view a bit.

  • Off the start, consider that Google engineers write the algorithms with set goals and objectives in mind.
    • Google only launched universal search after Google bought YouTube. Coincidence? Not likely. If Google had rolled out universal search before buying YouTube, it likely would have increased the price of YouTube by 30% to 50%.
    • Likewise, Google trains some of their algorithms with human raters. Google seeds certain questions & desired goals in the minds of raters & then uses their input to help craft an algorithm that matches their goals. (This is like me telling you I can't say the number 3, but I can ask you to add 1 and 2 then repeat whatever you say :D)
  • At some point Google rolls out a brand-filter (or other arbitrary algorithm) which allows certain favored sites to rank based on criteria that other sites simply can not match. It allows some sites to rank with junk doorway pages while demoting other websites.
  • To try to compete with that, some sites are forced to either live in obscurity & consistently shed marketshare in their market, or be aggressive and operate outside the guidelines (at least in spirit, if not on a technical basis).
  • If the site operates outside the guidelines there is potential that they can go unpenalized, get a short-term slap on the wrist, or get a long-term hand issued penalty that can literally last for up to 3 years!
  • Now here is where it gets interesting...
    • Google can roll out an automated algorithm that is overly punitive and has a significant number of false positives.
    • Then Google can follow up by allowing nepotistic businesses & those that fit certain criteria to quickly rank again via whitelisting.
    • Sites doing the exact same things as the whitelisted sites might be crushed for it & get a cold shoulder upon review.

You can see that even though it is claimed "TheAlgorithm" handles almost everything, they can easily inject their personal biases to decide who ranks and who does not. "TheAlgorithm" is first and foremost a legal shield. Beyond that it is a marketing tool. Relevancy is likely third in line in terms of importance (how else could one explain the content farm issue getting so out of hand for so many years before Google did something about it?).

Doorway Pages Ranking in Google in 2011?

Mar 11th

When Google did the Panda update they highlighted that not only did some "low quality" sites get hammered, but that some "high quality" sites got a boost. Matt Cutts said: "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."

Here is the problem with that sort of classification system: doorway pages.

The following Ikea page was ranking page 1 in the search results for a fairly competitive keyword.

Once you strip away the site's navigation there are literally only 20 words on that page. And the main body area "content" for that page is a link to a bizarre, confusing, and poor-functioning flash tour which takes a while to load.

If you were trying to design the worst possible user experience & wanted to push the "minimum viable product" page into the search results, then you really couldn't possibly do much worse than that Ikea page (at least not without delivering malware and such).

I am not accusing Ikea of doing anything spammy. They just have terrible usability on that page. Their backlinks to that page are few in number & look just about as organic as they could possibly come. But not that long ago companies like JC Penney and Overstock were demoted by Google for building targeted deep links (links they needed in order to rank, but which were allegedly harming search relevancy & Google user experience). Less than a month later Google arbitrarily changed their algorithm to where other branded sites simply didn't need many (or in some cases any) deep links to get in the game, even if their pages were pure crap (see also: Google Handling Flash).

We are told the recent "content farm" update was to demote low quality content. If that is the case, then how does a skeleton of a page like that rank so high? How did that Ikea page go from ranking on the third page of Google's results to the first? I think Google's classifier is flashing a new set of exploits for those who know what to look for.

A basic tip? If you see Google ranking an information-less page like that on a site you own, that might be a green light to see how far you can run with it. Give GoogleBot the "quality content" it seeks. Opportunities abound!

The Problem With Following Prescription

Mar 10th

You can't learn great SEO from an e-book. Or from buying software tools.

Great SEO is built on an understanding.

Reducing SEO To Prescription

One of the problems with reductive, prescribed SEO approaches - i.e. step one: research keywords; step two: put keyword in title; etc. - can be seen in the recent "Content Farm" update.

When Google decide sites are affecting their search quality, they look for a definable, repeated footprint made by the sites they deem to be undesirable. They then design algorithms that flag and punish the sites that use such a footprint.

This is why a lot of legitimate sites get taken out in updates. A collection of sites may not look, to a human, like problem sites, but the algo sees them as being the same thing, because their technical footprint is the same. For instance, a website with a high number of 250-word pages is an example of a footprint. Not necessarily an undesirable one, but a footprint nevertheless. Similar footprints exist amongst ecommerce sites heavy in sitewide templating but light on content unique to the page.
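To see how blunt such a footprint test can be, here is a toy version (my invention, not anything Google has published):

    # Toy footprint check: flag a site if most of its pages cluster under the
    # same short word count - a uniform technical footprint, regardless of quality.
    def has_thin_page_footprint(page_word_counts, thin_limit=300, threshold=0.8):
        """True if at least `threshold` of pages fall under `thin_limit` words."""
        thin = sum(1 for count in page_word_counts if count < thin_limit)
        return thin / len(page_word_counts) >= threshold

    # A perfectly legitimate glossary shares the footprint of a content farm:
    glossary_site = [240, 250, 260, 255, 248, 252]
    print(has_thin_page_footprint(glossary_site))  # True - hello, collateral damage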

Copying successful sites is a great way to learn, but can also be a trap. If you share a similar footprint, having followed the same SEO prescription, you may go down with them if Google decides their approach is no longer flavor of the month.

The Myth Of White Hat

A lot of sites that get taken out are white hat i.e. sites that follow Google's webmaster guidelines.

It's a reasonably safe approach, but if you understand SEO, you'll soon realize that following a white hat prescription offers no guarantees of ranking, nor does it offer any guarantees you won't be taken out.

The primary reason there aren't any guarantees comes down to numbers. Google knows that when it makes a change, many sites will lose. They also know that many sites will win i.e. replace the sites that lost. If your site drops out, Google aren't bothered. There will be plenty of other sites to take your place. Google are only concerned that their users perceive the search results to be of sufficient quality.

The exception is if your site really is a one-of-a-kind. The kind of site that would embarrass Google if users couldn't find it. BMW, for example, in response to the query "BMW".

It's not fair, but we understand that's just how life is.

An Understanding

For those readers new to SEO, in order to really grasp SEO, you need to see things from the search engines' point of view.

Firstly, understand the search engine's business case. The search engine can only make money if advertisers pay for search traffic. If it were too easy for sites likely to use PPC to rank highly in the natural results, then the search engine's business model would be undermined. Therefore, it is in the search engine's interest to "encourage" purely commercial entities to use PPC, not SEO. One way they do this is to make the natural results volatile and unpredictable. There are exceptions, covered in my second point.

Secondly, search engines must provide sufficient information quality to their users. This is an SEO opportunity, because without webmasters producing free-to-crawl, quality content, there can be no search engine business model. The search engines must nurture this ecosystem.

If you provide genuine utility to end users, the search engines have a vested interest in your survival, perhaps not as an individual, but certainly as a group i.e. "quality web publishers". Traffic is the lifeblood of the web, and if quality web publishers aren't fed traffic, they die. The problem, for webmasters, is that the search engines don't care about any one "quality publisher", as there are plenty of quality publishers. The exception is if you're the type of quality publisher who has a well recognized brand, and would therefore give the impression to users that Google was useless if you didn't appear.

Thirdly, for all their cryptic black box genius, search engines aren't all that sophisticated. Yes, the people who run them are brilliant. The problems they solve are very difficult. They have built what, only decades ago, would have been considered magic. But, at the end of the day, it's just a bit of maths trying to figure out a set of signals. If you can work out what that set of signals are, the maths will - unblinkingly - reward you. It is often said that in the search engine wars, the black hats will be the last SEOs standing.

Fourthly, the search engines don't really like you. They identified you as a business risk in their statement to investors. You can, potentially, make them look bad. You can undermine their business case. You may compete with their own channels for traffic. They tolerate you because they need publishers making their stuff easy to crawl, and not locking their content away behind paywalls. Just don't expect a Christmas card.

SEO Strategy Built On Understanding

Develop strategies based on how a search engine sees the world.

For example, if you're a known brand, your approach will be different to a little known, generic publisher. There isn't really much risk you won't appear, as you could embarrass Google if users can't find you. This is the reason BMW were reinstated so quickly after falling foul of Google's guidelines, but the same doesn't necessarily apply to lesser known publishers.

If you like puzzles, then testing the algorithms can give you an unfair advantage. It's a lot harder than it used to be, but where there is difficulty, there is a barrier to entry to those who come later. Avoid listening to SEO echo chambers where advice may be well-meaning, but isn't based on rigorous testing.

If you're a publisher, not much into SEO wizardry, and you create content that is very similar to content created by others, you should focus on differentiation. If there are hundreds of publishers just like you, then Google doesn't care if you disappear. Google do need to find a way to reward quality, especially in niches that aren't well covered. Be better than the rest, but if you're not, slice your niche finer and finer, until you're the top dog in your niche. You should focus on building brand, so you can own a search stream. For example, this site owns the search stream "SEO Book", a stream Aaron created and built up.

Remember, search engines don't care about you, unless there's something in it for them.

Google Update Panda

Mar 9th

Google tries to wrestle back index update naming from the pundits, naming the update "Panda". Named after one of their engineers, apparently.

The official Google line - and I'm paraphrasing here - is this:

Trust us. We're putting the bad guys on one side, and the good guys on the other

I like how Wired didn't let them off the hook.

Wired persisted:

Wired.com: Some people say you should be transparent, to prove that you aren’t making those algorithms to help your advertisers, something I know that you will deny.

Singhal: I can say categorically that money does not impact our decisions.

Wired.com: But people want the proof.

This answer, from Matt Cutts, was interesting:

Cutts: If someone has a specific question about, for example, why a site dropped, I think it’s fair and justifiable and defensible to tell them why that site dropped. But for example, our most recent algorithm does contain signals that can be gamed. If that one were 100 percent transparent, the bad guys would know how to optimize their way back into the rankings

Why Not Just Tell Us What You Want, Already!

Blekko makes a big deal about being transparent and open, but Google have always been secretive. After all, if Google want us to produce quality documents their users like and trust, then why not just tell us exactly what a quality document their users like and trust looks like?

Trouble is, Google's algorithms clearly aren't that bulletproof, as Google admit they can still be gamed, hence the secrecy. Matt says he would like to think there would be a time they could open source the algorithms, but it's clear that time isn't now.

Do We Know Anything New?

So, what are we to conclude?

  • Google can be gamed. We kinda knew that....
  • Google still aren't telling us much. No change there....

Then again, there's this:

Google have filed a patent that sounds very similar to what Demand Media does, i.e. looks for SERP areas that are under-served by content, and prompts writers to write for them.

The patent basically covers a system for identifying search queries which have low quality content and then asking either publishers or the people searching for that topic to create some better content themselves. The system takes into account the volume of searches when looking at the quality of the content so for bigger keywords the content would need to be better in order for Google to not need to suggest somebody else writes something
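As described, the core mechanism could be sketched roughly like this. The thresholds and numbers are invented, and the real patent scales the quality bar up as search volume grows; this is just my reading of it:

    # Toy sketch: flag queries where search volume is high but the quality
    # of the existing top results is low, then prompt someone to fill the gap.
    def underserved_queries(query_stats, volume_floor=10_000, quality_ceiling=0.4):
        """query_stats maps query -> (monthly_volume, avg_result_quality in [0, 1])."""
        return [query for query, (volume, quality) in query_stats.items()
                if volume >= volume_floor and quality <= quality_ceiling]

    stats = {
        "how to fix a leaky faucet": (50_000, 0.30),  # big demand, thin results
        "quantum chromodynamics":    (800, 0.90),
        "best seo book":             (12_000, 0.80),
    }
    print(underserved_queries(stats))  # -> ['how to fix a leaky faucet']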

If Google do implement technology based on this patent, then it would appear they aren't down on the "Content Farm" model. They may even integrate it themselves.

Until then....

How To Avoid Getting Labelled A Content Farmer

The question remains: how do you prevent being labelled as a low-quality publisher, especially when sites like eHow remain untouched, yet Cult Of Mac gets taken out? Note: Cult Of Mac appears to have been reinstated, but one wonders if that was the result of the media attention, or an algo tweak.

Google want content their users find useful. As always, they're cagey about what "useful" means, so those who want to publish content, and want to rank well, but do not want to be confused with a content farm, are left to guess. And do a little reverse-engineering.

Here's a stab, based on our investigations, the conference scene, Google's rhetoric, and pure conjecture thus far (a toy scoring sketch follows the list):

  • A useful document will pass a human inspection
  • A useful document is not ad heavy
  • A useful document is well linked externally
  • A useful document is not a copy of another document
  • A useful document is typically created by a brand or an entity which has a distribution channel outside of the search channel
  • A useful document does not have a 100% bounce rate followed by a click on a different search result for that same search query ;)
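If you wanted to turn those guesses into something executable, a toy scorer might look like this. It mirrors the list above and nothing more; the weights are pulled out of thin air:

    # Conjectural "useful document" scorer based on the checklist above.
    def usefulness_score(doc):
        score = 0
        score += 2 if doc["passes_human_inspection"] else -2
        score -= 2 if doc["ad_heavy"] else 0
        score += 1 if doc["external_links_in"] > 10 else -1
        score -= 3 if doc["is_duplicate"] else 0
        score += 2 if doc["brand_distribution_channels"] else 0
        score -= 2 if doc["pogo_stick_rate"] > 0.5 else 0  # bounce back to the SERP
        return score

    article = {"passes_human_inspection": True, "ad_heavy": False,
               "external_links_in": 25, "is_duplicate": False,
               "brand_distribution_channels": True, "pogo_stick_rate": 0.2}
    print(usefulness_score(article))  # 5: lands on the "useful" side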

Kinda obvious. Are we off-base here? Something else? What is the difference, as far as the algo is concerned, between eHow and Suite101? Usage patterns?

Still doesn't explain YouTube, though, which brings us back to:

Wired.com: But people want the proof

YouTube, the domain, is incredibly useful, but some pages - not so much. Did YouTube get hammered by update Panda, too?

Many would say that's unlikely.

I guess "who you know" helps.

In the Panda update some websites got owned. Others are owned and operated by Google. :D

Google Buys BeatThatQuote, a UK Comparison Site Violating Google's Guidelines

Mar 7th

It looks like Money.co.uk was right on the money:

BeatThatQuote.com today was sold to Google for GBP37.7 million. We think this deal is a tremendous opportunity for our company to develop new and innovative options for personal finance in the UK. Our team is excited about becoming a part of Google. We look forward to working with their engineers to create new tools making it easier for consumers to choose the right financial products. We think...

What is screwed up about this is that Google is engaging in *major* channel conflict. Not only is there some gray area background stuff:

BeatThatQuote.com's ad prompted 101 complaints to the Advertising Standards Authority, with 65 objecting that the commercial "trivialised, condoned or encouraged bullying in the workplace".

But now they have to consider SEO as well. I highlighted how it was a bit unjust when Google arbitrarily chose to whack one site while letting another get away with worse just because the founder was good at public relations, but how can Google police Google's guidelines when Google is the one breaking them?

Doorway Pages / Gateway Sites

Remember how Overstock.com was recently penalized for offering discounts in exchange for links? BeatThatQuote partnered with Oxfam to create CompareForGood.com. The homepage consists of a bunch of links into BeatThatQuote.com. If you look at those links using our server header checker you will see some 301 redirects. Of course doorway pages are considered spam & we know that Google has torched some other affiliate programs for using 301 redirects.

Such links are doing 301 redirects
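You can reproduce this sort of check yourself in a couple of lines; the snippet below does roughly what a server header checker does (the URL is a placeholder, not an actual CompareForGood link):

    # Request a URL without following redirects and inspect the status line.
    import requests

    def check_redirect(url):
        response = requests.get(url, allow_redirects=False, timeout=10)
        location = response.headers.get("Location")
        if response.status_code in (301, 302):
            print(f"{url} -> {response.status_code} redirect to {location}")
        else:
            print(f"{url} -> {response.status_code} (no redirect)")

    check_redirect("http://www.example.com/some-outbound-link")

A 301 tells Google to pass link equity through to the destination, which is exactly why redirected doorway links matter here.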

Sure that is fairly vanilla...a bit gray area. Certainly not straight up black hat.

....unlike...

Buying Links That Pass PageRank

Raise your hand if you have read a post by Matt Cutts on the dangers of buying links that flow PageRank?

Ok. Now keep your hand up if you have read a dozen of them.

Remember how Matt Cutts stated that some bloggers got torched for selling a single link? Remember the long NYT article about how Google is cracking down on link buying & they penalized JC Penny for it?

With that in mind, can anyone explain why Google's newest purchase was out buying exactly those kinds of links?

Not so much a categorized listing with an editorial review...just a paid link for the sake of buying links to flow PageRank.

That one is only totally flagrant.

A bit off color. Like comment spamming.

Sorta like the link exchanges in German.

But some are even more outrageous. Consider that BeatThatQuote is buying links from pages with dedicated paid ad sections.

Paid Blog Reviews

Remember those "evil" paid reviews Matt Cutts wrote of? Plenty of those to go around ;)

In fact, some of the paid blog links were in place so long that BeatThatQuote got thank-yous for advertising for over a year straight.

I don't have a decade of spam fighting experience like Google does. But is it too much to suggest that before Google buys *any* website they should do a basic compliance audit to verify that the site is operating within Google's TOS? I am an independent SEO and it took me all of 2 minutes to find numerous FLAGRANT violations.

What sort of message does Google send the market with the above behavior?

How Can Google Police Anyone?

Google has on multiple occasions penalized other UK based finance sites for SEO issues & link buying. But now that Google owns one of the horses in the race, and that horse has been found to be using anabolic steroids, can they legitimately penalize any of their new competitors?

If I had a UK finance site I would go on a link buying binge right now. Google can't penalize you for it because they are doing the same damn thing. And if they do penalize it for DOING THE EXACT SAME THING GOOGLE IS DOING then you know you have a legitimate gripe for the media, and I have no doubt Microsoft would be willing to help pick up the legal tab.

Google Eats Microsoft's Lunch Again!

Ultimately this is a body blow to Microsoft. Microsoft started to gain momentum in search through verticalization, but has since backed off. Meanwhile Google took Microsoft's ball and ran it in for a touchdown (acquiring Metaweb, trying to buy ITA Software, and buying BeatThatQuote). And now one of MSN's portal ad partners is owned by Google:

Head of partnerships at MSN, Phil Coxon said, “At Microsoft Advertising, we’re passionate about collaborating with brands to create compelling advertising campaigns. By providing new and exclusive content that appeals to consumers, this partnership both enhances the overall MSN user experience as well as providing a great platform for BeatThatQuote to engage with their target audience on a meaningful level. This deal builds on our previous partnership with BeatThatQuote, which led to a 400% increase in revenue generated from insurance products. We’re delighted to continue to build on this relationship with this new campaign.”

Oooops.

Update

After a Googler read this blog post, it appears that Beat That Quote has been penalized.

When Best Practices Lead to Miserable Failures

Mar 2nd

How often do you ever hear the phrase worst practices? Probably never.

Everything is a best practice approach, right up until things change.

Consider AdSense websites.

Hey Look, a Case Study!

When you look at some of the biggest losers in the Google content farm update, many of them happened to be premium AdSense publishers which were even used by Google as case studies! For instance, HubPages or EzineArticles.

Everyone thinks that their content is the cream of the crop & that they will bounce back:

We are confident that over time the proven quality of our writers' content will be attractive to users. We have faith in Google's ability to tune results post major updates and are optimistic that the cream will rise back to the top in the coming weeks, which has been our experience with past updates - Paul Edmondson

The problem is that for many businesses there will be no bounce back. Some are simply over. The web has evolved & the algorithm has moved beyond them.

Where is the Much Needed Disclaimer?

What makes this worse is that when Google gives a site their premium AdSense feed & sets it up as a case study, others will see that as an explicit endorsement.

THIS IS HOW YOU SHOULD DO IT.

Even after Google torches the companies that follow Google suggested best practices those case studies live on, offering what now amounts to maps to Google hell.

Adding Insult to Injury

What makes such filters/penalties even more infuriating is that in some cases when your site is slapped with a negative karma penalty, others who steal your content & wrap it in AdSense will outrank you, since their site does not yet have a negative karma penalty against it. :)

Individually the splog sites may not live long, but collectively they can keep outranking you to ensure you are invisible for your own words, even if you poured years of your life into creating something beautiful & important.

As Cult of Mac reports:

As we noted yesterday, Cult of Mac was collateral damage in Google’s war on crappy content farms. For some inexplicable reason, we got downgraded when Google tweaked its algorithms last Thursday.

But today we’re back in. We’re on Google News (a very important source of daily traffic) as well as Google’s general search results. However, we still get outranked by some of the scraper sites that steal our content, so not everything’s perfect.

That last part, about scraper sites still outranking them, is the most outrageous part of this new "algorithmic" approach. When Google whacks your site then someone who steals your content will outrank you. And most sites stealing content *are* monetized via Google's DoubleClick & AdSense ads.

That whooshing sound you just heard was MFA sploggers making a mad dash to steal content from the list of currently penalized sites.

Cult of Mac is lucky they had enough pull with the press to get reconsidered. Most webmasters who got hit did not. Anyone who has contracts based on set traffic levels, or tight margins which just turned negative, is in a pretty crappy situation. Yet another example of the importance of not fueling growth with debt & the importance of profit margins and a cash-on-hand safety net.

Who Are the Opportunistic Maximizers?

The problem with such an approach of maximizing everything you do to suck peak revenue out of the pageview is that things can change on a whim. I have seen some of Google's 1-on-1 AdSense optimization advice they sent to a friend of mine. I told my friend that the optimization advice was at best short-term opportunism that would end up crushing them in the long run if they actually implemented it.

Google doesn't care if following their advice torches your site if it makes them a bit more money, because ultimately there is another person standing in line waiting to follow.

My friend is lucky that they realized my advice was more trustworthy than the advice they were getting direct from Google. If they listened to Google back then their business might be destroyed today.

Google likes to position SEOs as exploiters out for the quick buck, but what honest analysis shows is that it is Google which is pushing the boundaries. Consider:

Google AdSense has a "get rich quick" ad category. That is something you won't find on our website (and one of the reasons we will never put AdSense ads on this site)!

AdSense Heatmaps? Look Out Below!

One of the worst hit sites in the AdSense farm update was WiseGeek. Sure WiseGeek must have had something like a 20% ad clickthrough rate. But with traffic falling 75%, maybe they would have been better off building a cleaner experience with a 5% CTR.
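The back-of-envelope math is worth running. With assumed round numbers, a 75% traffic drop at a 20% CTR earns exactly the same clicks as full traffic at a 5% CTR:

    # All numbers assumed for illustration.
    daily_visits = 100_000

    aggressive_ctr = 0.20  # heatmap-optimized, ad-heavy layout
    clean_ctr = 0.05       # cleaner layout, fewer ads

    post_panda_visits = daily_visits * 0.25                 # after a 75% traffic drop
    clicks_aggressive = post_panda_visits * aggressive_ctr  # 5,000 clicks
    clicks_clean = daily_visits * clean_ctr                 # 5,000 clicks

    print(clicks_aggressive, clicks_clean)  # 5000.0 5000.0

Same ad clicks either way, except the cleaner site keeps four times the audience and never takes the penalty.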

That said, they were simply following Google AdSense best practices.

Collateral Damage

What was the most profitable best-practices-based approach suddenly falls short. And the results are not always predictable. When Google decided to attack content farms, who honestly knew that:

  1. somehow eHow.com would survive
  2. yet somehow Google's "algorithmic" approach would punt tens of thousands of smaller websites that have far higher content quality

In advance of the solution I was fairly certain eHow would survive, but what I underestimated was the Google engineers. Or rather the ignorance of same. I simply couldn't imagine such a content farm algorithm going live that missed eHow and decimated the lives of so many independent webmasters.

I guess we can simply view this as an extension of Google's "you can have any web you want, so long as it is corporate"™ policy. I think Brett Tabke said it best in a recent AdSense thread:

When the rules and the enforcements are made up by monopolies in a make believe world - there is no cheating.

The only "cheating" is when it gets outside the lines of the law. - Brett Tabke

After the Farmer update, layoffs are already happening. Not only at monolithic useless content mill websites, but even in organizations where the content is pure as the driven snow.

AskTheBuilder is yet another Google AdSense case study. In spite of being a niche player well regarded in his community, Sistrix data shows the site off 87% after the most recent Google update!

Who Caused the Content Farm Problem?

Everyone likes to vilify the content farms and scrapers (and they deserve it) but the real villain behind all of this is CPC/CPM based advertising.

Can you imagine a world where your attention was sold off based on how long you stayed on a page rather than how often you switched pages? If Google wants to fix their search results, they should focus on fixing AdSense. The technology to more accurately measure a viewer's exposure to an ad is there, it just needs a trustworthy player to bring it to market. Someone trusted by both users and advertisers.

Google made click/impression-based advertising appealing to both groups and it made them what they are now. It's time to get away from it. - po

Smokescreen & Misdirection

I have long highlighted that Google's algorithm-centric approach was blindingly hypocritical & that I felt the approach was nothing more than a scammy cover through which they can selectively exercise editorial discretion while claiming that "the algorithm did it."

Consider the following scenario:

  • roll in an algorithm that aggressively penalizes tons of borderline edge cases
  • see who complains to the media & has connections with the media
  • fix the rankings of those who you like & those with sway, while ignoring the rest

Can You Trust Google?

All of this leads to the obvious question: can you trust Google?

The short answer is yes.

The long answer is you can *always* expect Google to do what is in the best interest of Google. As they plow into field after field (payments, local, mobile, ecommerce, mortgage, credit cards, travel, weddings, fashion, etc.) & use their search dominance to manipulate other markets one would have to be blind to view Google as anything other than a competitor.

Maybe not today. Maybe not tomorrow. But some day they will come. And it is never fun when it happens to you. :(

Until that day comes, if you always follow their best practices, just remember ... ;)

Don't say you were not warned!

Google: The Risk And The Opportunity

Mar 2nd

It feels like old times.

Google makes a big algorithm change, and all hell breaks loose. Well, some hell, and some jumping for joy, depending on which direction a webmaster's rankings went.

As I wrote in Content Farms Vs... at the beginning of last month:

Put it this way. Any algorithm that takes out Demand Media content is going to take out a lot of SEO content, too. SEO copy-writing? What is that? That's what Demand Media do. As I outlined in the first paragraph, a lot of SEO content is not that different, and any algorithm that targets Demand Media's content isn't going to see any difference. Keyword traffic stream identical to title tag? Yep. A couple of hundred words? Yep. SEO format? Yep. Repeats keywords and keyword phrases a few times? Yep. Contributes to the betterment of mankind? Nope. SEO's need to be careful what they wish for....

There were a lot of sites following the SEO model of "writing for the keyword term" taken out, not just sites pejoratively labelled as "Content Farms". Ironically, the pinup example I used, Demand Media, got off lightly.

If you want more detail about what happened, and why, check out Aaron's post Google Kills eHow Competitors, eHow Rankings Up, and, if you're a forum member, this very detailed and insightful thread.

Collateral Damage

Some people have suggested there has been much collateral damage. Google have taken out legitimate pages, too.

What happened is that the pages that were taken out shared enough similarity to pages on Content Farms and the algorithm simply did what it was designed to do, although Google have admitted - kinda - that the change still needs work. The ultimate judgement of whether this is a good or a bad thing comes down to what Google's users think. Does Google deliver higher quality results, or doesn't it?

This Guardian article outlines the frustration experienced by many:

I'm pissed because we've worked our asses off over the last two years to make this a successful site. Cult of Mac is an independently owned small business. We're a startup. We have a small but talented team, and I'm the only full timer. We're busting our chops to produce high-quality, original content on a shoestring budget. We were just starting to see the light at the end of the tunnel. After two years of uncertainty, the site finally looks like it will be able to stand on its two feet. But this is a major setback. Anyone got Larry's cell number?

Scroll down, as there's also some very interesting comments in reply to that post.

This is nothing new, of course. It's been going on since search began. The search engines shrug, and send businesses that depend on them flying, whilst elevating others.

What can be done?

Spread The Risk

"Be less reliant on Google!", people say.

It's an easy thing to say, right, but what do you do when Google is the only search game in town? We know any business strategy that relies on an entity over which we have no control is high risk, but what choice is there? Wait for Bing to get their act together? Hope Blekko becomes the next big thing?

None of us can wait.

Sometimes, no matter how closely we stick to Google's Guidelines, Google are going to change the game. Whether it is fair or not is beside the point, it's going to happen.

So, we need to adopt web marketing strategies that help lessen this risk.

The best way to lessen this risk, of course, is to not rely on Google at all. Design your site strategy in such a way that it wouldn't grind to a halt if you blocked all spiders with a robots.txt. Treat any traffic from Google as a bonus. Such a strategy might involve PPC, brand building, offline advertising, social media, email marketing and the wealth of other channels open to you.

Try the above as an academic exercise. If you had to operate without natural traffic, does your business still stand up? Are you filling a niche with high demand, a demand you can see in other channels? Is there sufficient margin to advertise, or does your entire model rely on free search traffic? Are there viral elements which could be better exploited? Are there social elements which could be better exploited?

Academic exercises aside, we can also look to mitigate risk. Think about not putting all your eggs in one basket. Instead of running one site, run multiple sites using different SEO strategies on each. Aaron talks about running auxiliary sites in the forum.

Try to get pages (articles, advertising) on other sites in your niche. If your site is taken out, at least you still have a presence in your niche, albeit on someone else's site. A kindly webmaster may even agree to repoint links to any new site you devise.

Do you have other ideas that help mitigate the risk? Add them to the comments.

It's An Advantage Being An SEO

Finally, be pleased you're an SEO.

SEO just got that much harder, and the harder it gets, the more your services are required, and the higher the barrier to entry for new publishers. Every day search is getting more complex. At the end of the day, it's an algorithm change. It can be reverse engineered, and new strategies will be adopted to maximize the opportunity it presents.

Until such a time as Google tells us exactly what they want to see, and rewards such content, SEO's will just keep doing what they do. And thank goodness Google isn't entirely transparent. If they were, the value of your SEO knowledge as a competitive advantage would plunge. For many of us, wages would quickly follow.

Sure a short-term hit is painful, but the best SEOs will recover.

As they do, other content producers will be left scratching their heads.
