You see, tricking people is bad. Unless you are Google. In which case you have to hit the quarterly numbers.
Everyone else needs to read Google platitudes, create deep content, and pray to turn the corner before bankruptcy hits.
Matt Cutts stated that you should make your products like Apple products by packaging them nicely.
For illustrative purposes:
It was easy for Google to speak from a moral high ground when their growth was above 50% a year, but now that growth has slowed over the past couple of years, they have been willing to do things they previously wouldn't have. In November of 2009, when I saw the following, I knew the writing was on the wall.
When Google Instant launched, we got to test Google's 50% content theory. And they hit the numbers perfectly: a full 50% of web users could see 2 organic listings above the fold when Instant was extended (the other half of folks could only see one, or none).
As if the massive YouTube promotion & the magically shrinking search results for everyone else were not bad enough, with Panda they suck at determining the original content source.
This site you are reading wasn't hit by Panda, which makes us lucky, as it allows us to rank as high as #3 for our own content (while Google pays dozens of other webmasters to snag it wholesale and wrap it in AdSense).
We got lucky though. If we had been hit by Panda (like tens of thousands of other webmasters) we probably wouldn't even rank on the first page of the search results for our own content.
When Google screws up source attribution they are working counter to open culture, because they are having you bear 100% of the cost of content production, and then they are immediately paying someone else for your work. Do that long enough and the quality content disappears & we get a web full of eHow-like sites.
And yet Google tells us the secret recipe (which may or may not work at some unknown time) is to pour more money into content development.
The solution to this problem is more deep content. Keep feeding Google (and their AdSense scraper partners) and hope that after you pour $50,000 into your site that some small fraction of it ends up back in your bank account (while the larger share winds up in Google's and their AdSense partners).
As bad as all that is, I recently got selected as a lucky beta user for the next version of Google's search results. Notice the horizontal spacing that drives down the organic search results. After the top AdWords listings the organic listings start off 88 pixels lower on the screen.
I have a huge monitor. Less than 10% of people have a monitor as large as mine. Before this new search result I saw 8 organic search results above the fold on my large monitor. Now it is down to 5 (and that is with no Google video ad, no Google vertical comparison ad like the above credit card one, no browser toolbars, no browser status bar, and only 1 of the advertisers having ad sitelinks).
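The above-the-fold math is easy to sanity-check yourself. Here is a minimal sketch; every pixel value in it is an illustrative assumption of mine, not a measured Google layout constant:

```python
# Back-of-the-envelope sketch of how many organic results fit above the
# fold. All pixel values passed in are illustrative assumptions, not
# measured Google layout constants.

def organic_results_above_fold(viewport_px, header_px, ads_px,
                               ad_offset_px, result_px):
    """Count organic listings visible without scrolling."""
    # Space left after the search header, the top AdWords block,
    # and the extra vertical offset before the first organic result.
    visible = viewport_px - header_px - ads_px - ad_offset_px
    return max(0, visible // result_px)

# A tall monitor vs. a common 768px laptop viewport, assuming a
# three-ad top block, the 88px offset before the organics, and
# roughly 90px per organic listing.
large = organic_results_above_fold(1200, 120, 280, 88, 90)
small = organic_results_above_fold(768, 120, 280, 88, 90)
print(large, small)  # -> 7 3
```

On those assumed numbers a big monitor fits 7 organic listings while the laptop fits 3; add a video ad, a comparison ad, or a couple of browser toolbars and the laptop count quickly heads toward zero.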
So how does Google score now on their ad to content ratio?
When Google's new search results roll out, there are some keywords where less than 1 in 3 searchers will be able to see a single organic listing above the fold! And lest you think that spacing is about improving user experience, notice how wide the spacing in the left column is, and how narrow the right rail AdWords spacing is. This is all about juicing revenues & hitting the number.
Which leads me to the Google Panda loophole I mentioned in the headline. It is an easy (but painful) one-step process.
All Google's propaganda about the horrors of paid inclusion look absurd when compared against the search result with 0 organic listings above the fold for half of desktop computer users.
The only "exploit" here is how Google is paying people to steal other's content, then ranking the stolen stuff above the original source.
The #1 goal for any organization is self-preservation. When people feel things are fairly just & they are just getting by they are fine with squeezing out more efficiency in what they do and figuring out ways to pay the bills. But when people feel the table is tilted at some point they stop caring and do whatever it takes.
Ex Post Facto
Some longtime AdWords advertisers have recently been punished for affiliate ads they ran 8 years ago, because some of the sites they promoted at some point fell out of Google's graces. The ad system never allows you to delete your history & offers ex post facto regulations that arbitrarily turn a regular advertiser into a spammer.
In 3 weeks it will have been 3 months since Google first launched Panda. Outside of bloggers with 50,000 RSS subscribers, few (if any) reports of recovery from Panda have been seen. Some of the theories about what caused Panda attempt to tie it to AdSense, & many of Google's AdSense case studies are now highlighting best practices to follow if you want to be just like the sites Google torched.
As if that wasn't conflicting enough, some of the webmasters that were torched by Panda received automated messages that they were missing out on revenues by not using the maximum allotted number of ad units. After the huge fall off from Panda, Google has been pushing AdSense so hard that many webmasters have been receiving unsolicited emails from Google suggesting they sign up for AdSense.
I won't run AdSense on our main sections of this site because it would be tacky and destroy perceived credibility (having a "submit your site to 2000 search engines for $29" ad next to the content doesn't inspire trust on an SEO site). I could create a content farm answers section of the site that mirrors Ask's strategy, but with a higher level of quality. I won't though, because it would be viewed as spam because I am me. Once again, SEOs should be held to a higher standard than search engines. ;)
That Which You Consume, Consumes You
Where this rubs wrong is not only the overt brand push, but also that some of Google's pushes at expansion down the search funnel have looked a lot like the spam they claim to fight.
In the Wall Street Journal there was an article about the Panda update highlighting that many small businesses were laying off their employees. The same article highlighted numerous costly, desperate marketing measures the firms were taking which may or may not work. Google didn't disclose much in the article other than:
The Google spokesman says the company doesn't disclose details about changes it makes to its algorithms because doing so "would give bad actors a way to game our systems."
Nobody likes bad actors, but most of the webmasters that were hit were not bad actors. Rather, most of them were naive & simply followed the Google guidelines thinking that was in their best interests and perhaps would allow them to stay competitive. Unfortunately, it wasn't.
If you adhere to guidelines, get beat down, are not told why, and are told that generally sites need to "improve their quality," that can be a pretty infuriating message. The presumption that your stuff isn't good enough when third-grade rewrites of your content now outrank you is both smug and obnoxious. What is worse is that many scraper websites now outrank the original content sources, so the message is that your content is plenty good enough, just not when it is on your site. A large portion of those scraper sites are monetized via Google AdSense & would not even exist if it were not for AdSense.
So Google whacks your site, tells you to clean up your act (& increase your operating costs while decreasing your margins), lumps you in the bad actors group, offers no information about when the pain will (or even could) end, pays someone to steal your content, then ranks that stolen copy of your content above you in the search results.
Make Your Move
If a person has the pleasure to experience the above, it doesn't take much critical thinking to develop a different perspective on search.
Ultimately this is going to lead to a "why not" approach to search for many folks in the search space.
If Google already dinged your website why wouldn't you remove AdSense & replace it with competing ad programs? Why not test those affiliate programs you have been meaning to test? If you have to rework your content anyway, why not move past AdSense/webmaster welfare?
If your AdWords budget was marginally profitable & you were buying ads to complement your organic exposure, why wouldn't you stop buying ads with Google & test running ads on other websites? Google is fine funding an affiliate network that uses direct links, so why not use clean links on your ad buys? If you like, run it through a self-hosted affiliate program so that you are just like Google.
If your site is already whacked why wouldn't you buy links to help boost its ranking back?
If your site earns nothing from search, why wouldn't you sell links if you have to do whatever it takes to cover costs?
If your site gets penalized & someone copying your content & wrapping it in AdSense outranks you why wouldn't you create new mirror sites? Why wouldn't you create scraper websites to pollute Google with?
If rankings are unpredictable & one site is no longer enough, why wouldn't you create backup sites & projects of various levels of quality & effort? At this point diversity simply serves as a needed form of insurance.
If while running these purely scientific experiments you accidentally run into something that works really well that shouldn't, why not scale it to the moon?
I am not convinced that the search results are any cleaner today than they were a few months ago. However I am fairly certain things will soon head south. I am not advocating going out of your way to be extra spammy, but am just highlighting the cost-benefit analysis which is going through the heads of thousands of webmasters who Google just torched.
Google is betting that anonymous strangers will behave more kindly than Google has, but when an animal is backed into a corner it often acts in unpredictable (and even uncontrollable) ways.
The big problem for Google is this: "when innocence itself, is brought to the bar and condemned, especially to die, the subject will exclaim, it is immaterial to me whether I behave well or ill, for virtue itself is no security." - John Adams
Like a good neighbor, State Farm is there and there and there and there and there.
The Struggle Real Businesses Face
The big problem with this IMHO is that everyone but the spammer (who is now busy working on "local" signals) loses. Legit online-only pure plays are simply wiped off the result set. The searcher gains nothing by seeing State Farm agents 5 times in the search results. Even the local business with a new windfall of business is simply overwhelmed with leads, meaning it likely has (at least relatively) poor customer service until it hires up.
To a small business, a sharp rise in demand can be every bit as damaging as a sharp fall in demand.
But should small local businesses hire aggressively, they could be only 1 algorithmic update away from needing to prune staff. Maybe some day Google decides to limit the results to show 1 agent per parent company, and then the agents end up fighting it out with each other (much like affiliates had to fight each other on bids in AdWords to be the 1 that shows up).
Given that some of the agents ranking on page 1 have fewer than a dozen inbound links & links from only a few unique domains, it won't take long for some new "local" players to come online.
What Makes a Search Result Good?
A lot can be said for getting users where they need to be quickly. When it works it has great value. But when it doesn't work, it makes the market less efficient. Value chains exist for a reason. Sometimes a brand (or an individual agent of brand x) is not in the best position to act as an unbiased advisor.
As a consumer buying car insurance, I don't care that my agent is local. In fact, if I live in an expensive area I may want my insurance provided by someone who lives in an area with a lower cost of living, so they can provide the services (while making a comfortable living) for less. For the last decade I have been insured by a company in another state (USAA in Texas). Location had precisely 0 impact on my decision making.
What mattered to me was that they had great rates. Which is precisely what almost all insurance commercials promote.
Geico spends nearly a billion dollars a year pounding that message into the minds of consumers.
The problem is that almost all the big brands promote the exact same message. They are the cheapest. Save with them. Etc. Online pure plays that provide quote comparisons serve a valuable, value-added function in this marketplace, but they have simply disappeared from Google. They aren't local enough to hit the local signal, they aren't brand enough to hit the brand signal, and since they are not the end brands they can't justify buying $30 AdWords clicks thinking that what they don't get back in direct ROI can be written off to "brand."
Ultimately the end user loses (at least until Google creates their insurance flavor of "comparison ads").
This Stuff is Everywhere
This stuff is even happening on search queries where there is absolutely no implied local intent & no need for a local provider. General discovery & topical queries like "web designer" or even informational background searches like "SEO" now bring up service based sites with a local presence.
Leaving Off On a Positive Note
1 day doesn't make a trend, but if this stuff sticks ranking local sites for big keywords just got really easy.
If you know SEO and live near a big city, a second office location might soon be a profitable decision.
If you are a local business who thought SEO was too complex or expensive, that excuse may have just been removed from the marketplace.
If you run a bespoke consulting-style business & ran into a windfall of demand, don't forget to increase your rates & be more selective with who you work with. Working all the time leads to burnout. Trust me, I know that all too well. ;)
This is another example why it can be a great idea to mix and match your businesses...such that if one jumps out of nowhere or another one tanks you are still fine. Having multiple projects is one of the few ways you can really protect yourself from the likes of Panda & updates like this one. Running multiple businesses allows you to lean into your side gigs when your main one drops off, and push harder on your main gig when it is really humming along.
In the last post I mentioned how the US government tried to change the cost benefit analysis for some sleazy executives at pharmaceutical corporations which continue to operate as criminal enterprises that simply view repeated fines as a calculable cost of doing business.
If you think about what Google's Panda update did, it largely changed the cost-benefit analysis of many online publishing business models. Some will be frozen with fear, others will desperately throw money at folks who may or may not have solutions, while others who gained will buy additional marketshare for pennies on the dollar.
"We actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side." - Matt Cutts
Now that Google is picking winners and losers the gap between winners & losers rapidly grows as the winners reinvest.
And that word invest is key to understanding the ecosystem.
Beware of Scrapers
To those who are not yet successful with search, the idea of spending a lot of money building on a strategy becomes a bit more risky when you see companies like Demand Media that have spent hundreds of millions of dollars growing an empire, only to see 40% of their market value evaporate in a couple weeks due to a single Google update. There are literally thousands of webmasters furiously filing DMCA reports to Google after Panda, because Google decided that the content quality was fine if it was on a scraper site, but the exact same content lacked quality when on the original source site.
And even some sites that were not hit by Panda (even some which have thousands of inbound links) are still getting outranked by mirroring scrapers. Geordie spent hours sharing tips on how to boost lifetime customer value. For his efforts, Google decided to rank a couple scrapers as the original source & filter out PPCBlog as duplicate content, in spite of one of the scrapers even linking to the source site.
Outstanding work Google! Killer algo :D
Even if the thinking is misguided or the headline is out of context, Reuters articles like Is SEO DOA as a core marketing strategy? do nothing to build confidence for making large investments in the search channel. Which only further aids people trying to do it on the cheap. Which gets harder as SEO grows more complex. Which only further aids the market-for-lemons effect.
At the opposite end of the spectrum, there are currently some search results which look like this
All of the colored boxes are the same company. You need a quite large monitor to get any level of result diversity above the fold. The company that was on the right side of the classifier can keep investing to build a nearly impenetrable moat, while others who fell back will have a hard time justifying the investment. Who wants to scale up on costs while revenues are down & the odds of success are lower? Few will. But the company with the top 3 (or top 6) results is collecting the data, refining their pitch, and re-investing into locking down the market.
Much like the Gini coefficient shows increasing wealth consolidation in the United States, search results where winners and losers are chosen by search engines create a divide where doing x will be very profitable for company A, while doing the exact same thing will be a sure money loser for company B.
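The Gini coefficient mentioned above works for any distribution, including a keyword's traffic split across the ranking sites. A quick sketch (the traffic shares below are made up for illustration):

```python
# Minimal Gini coefficient sketch, to make the consolidation analogy
# concrete. Input: each site's share of search traffic (any
# non-negative numbers; the units cancel out).

def gini(values):
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Standard formula over the ascending-sorted values.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Evenly spread traffic vs. a winner-take-most result set.
print(gini([25, 25, 25, 25]))  # 0.0 -> perfectly even
print(gini([97, 1, 1, 1]))     # ~0.72 -> heavily consolidated
```

An evenly split result set scores 0; the closer the score gets to 1, the more a single winner is collecting everything.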
Thin Arbitrary Lines in the Sand
The lines between optimization & spam blur as some trusted sites are able to rank a doorway page or a recycled tweet. Once site owners know they are trusted, you can count on them green lighting endless content production.
Scraping the Scrape of the Scrape
Many mainstream media websites have topics subdomains where they use services like DayLife or Truveo to auto-generate a near endless number of "content pages." To appreciate how circular it all is, consider the following:
a reporter makes a minimally informing Tweet
Huffington Post scrapes that 3rd party Tweet and ranks it as a page
I write a blog post about how outrageous that Huffington Post "page" was
SFGate.com has an auto-generated "Huffington Post" topics page (topics.sfgate.com/topics/The_Huffington_Post) which highlighted my blog post
some of the newspaper scraper pages rank in the search results for keywords
sites like Mahalo scrape the scrape of the scrape
At some point in some such loops I am pretty certain the loops start feeding back into themselves & create a near-infinite cycle :D
An Endless Sea of "Trustworthy" Content
The OPA mentioned a billion dollar shift in revenues which favors large newspapers. But those "pure" old-school media sites now use services like DayLife or Truveo to auto-generate content pages. And it is fine when they do it.
The newspapers call others scammy agents of piracy and copyright violators for doing far less at lower scale, all while wanting to still be ranked highly (even while putting their own original content behind a paywall), and then they go out and do the exact same scraping they complain about others doing. It is the tragedy of the commons played out on an infinite web, where the cost of an additional page is under a cent & everyone is farming for attention.
And the piece of pie everyone is farming for is shrinking as:
competition increases faster than the growth of the market
Aware that consumers spend someplace between eight and 10 hours researching cars before they contact a dealer, automakers and dealers are vectoring ever-greater portions of their marketing budgets into intercepting consumers online.
As but one example, Ford is so keen about capturing online tire-kickers that its website gives side-by-side comparisons between its Fiesta and competing brands. While you are on the Ford site, you can price the car of your dreams, investigate financing options, estimate your payment, view local dealer inventories and request a quote from a dealer.
Search Ads Replacing the Organic Search Results
AdWords is eating up more of the value chain by pushing big brands
comparison ads = same brands that were in AdWords appearing again
bigger adwords ads with more extensions = less diversity above the fold
additional adwords ad formats (like product ads) = less diversity (most of the advertisers who first tried it were big box stores, and since it is priced on a CPA profit share basis the biggest brands that typically have more pricing power with manufacturers win)
Other search services like Ask.com and Yahoo! Search are even more aggressive with nepotistic self promotion.
Small Businesses Walking a Tightrope (or, the Plank)
Not only are big brands being propped up with larger ad units (and algorithmically promoted in the organic search results) but the unstable nature of Google's results further favors big business at the expense of small businesses via the following:
more verticals & more ad formats = show the same sources multiple times over
less stability = more opportunities for spammers (they typically have high margins & lots of test projects in the works...when one site drops another one is ready to pop into the game...really easy for scrapers to do...just grab content & wait for the original source to be penalized, or scrape from a source which is already penalized)
less stability = lowers multiples on site sales, making it easier for folks like WebMD, Quinstreet, BankRate, and Monster.com to buy out secondary & tertiary competing sites
If you are a small business primarily driven by organic search you either need to have big brand, big ego, big balls, or a lack of common sense to stay in the market in the years to come, as the market keeps getting consolidated. ;)
Google ignored our page title, ignored our on-page header, and then used the 'comments' count as the lead in the clickable link. Then they followed it with the site's homepage page title. The problem here is that if the eye is scanning the results for a discriminating factor to re-locate a vital piece of information, there is no such factor; nothing memorable stands out. Luckily we are not using breadcrumbs & that post at least had a somewhat memorable page URL, otherwise I would not have been able to find it.
For what it is worth, the search I was doing didn't have the word "comments" in it & Google just flat out missed on this one. Given that a huge % of the web's pages have the word "comments" on them (according to the number of search results returned for "comments," it is about 1/6th as popular online as the word "the"), one might think that they could have programmed their page title modification feature to never select 'comments' as the lead.
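The safeguard being argued for here is cheap to sketch: before promoting an alternate snippet to the title lead, check it against a document-frequency blocklist. Everything below (the frequency table, the threshold, the function name) is hypothetical, purely to illustrate the idea:

```python
# Hypothetical sketch: never promote a near-universal boilerplate term
# (like "comments") to the lead of a rewritten title. The document
# frequencies are made-up stand-ins, not real crawl data.

# Assumed fraction of crawled pages containing each term.
DOC_FREQUENCY = {
    "the": 0.95,
    "comments": 0.16,   # roughly 1/6th as common as "the", per the post
    "seobook": 0.00001,
    "link": 0.30,
}

BOILERPLATE_THRESHOLD = 0.10  # terms on >10% of pages make poor leads

def pick_title_lead(candidates):
    """Return the first candidate rare enough to be distinctive."""
    for term in candidates:
        if DOC_FREQUENCY.get(term, 0.0) < BOILERPLATE_THRESHOLD:
            return term
    return None  # fall back to the page's own title tag

print(pick_title_lead(["comments", "seobook"]))  # -> seobook
```

A term the scanning eye can use to re-locate a result has to be rare; a one-line frequency check would have caught 'comments' immediately.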
Google has also been using link anchor text sometimes with this new feature, so it may be a brutal way to Google-bomb someone. It is sure to be fun when the political bloggers give it a play. ;)
But just like the relevancy algorithms these days, it seems like this is one more feature where Google ships & then leaves it up to the SEOs to tell them what they did wrong. ;)
You can learn a lot about how search has improved over the years by reading Matt Cutts. Recently he highlighted how search was irrelevant in the past due to a lack of diversity:
Seven of the top 10 results all came from one domain, and the urls look a little… well, let’s say fishy. In 1999 and early 2000, search engines would often return 50 results from the same domain in the search results. One nice change that Google introduced in February 2000 was “host crowding,” which only showed two results from each hostname. ... Suddenly, Google’s search results were much cleaner and more diverse! It was a really nice win–we even got email fan letters.
Thanks to those kinds of improvements, in 2011 we never have to look at search results like this.*
* And by never, I mean, unless the results are linking to fraternal Google pages, in which case, game on!
Why should Google result crowding not apply to Google.com? Sure they can say those books are from different authors, but many websites are run by organizations with multiple authors. Some websites are even built through partnerships between multiple different business organizations. Who knows, maybe some searchers are uncomfortable with every other listing being an out-of-context book highlight.
In the past I have been called cynical for highlighting stuff like the following image
I saw it as part of a trend toward home cooking promotions. And I still view it that way. The above books promotion is simply further proof of concept.
other Google owned and operated sites
a branded website ranking for its own brand
Can you show me *any* occurrence of a result where a site is listed 5 times in the search result? Bonus points if you can find it where the 5 times are not grouped into 1 bunch via result crowding.
As a thought experiment, ask yourself if that Google ranking accident would happen if the content archive being served up was promoting media hosted on Microsoft servers.
A friend of mine summed it up nicely with:
well, it's not everyday you see that kind of power and the fact that other sites aren't afforded the same opportunity makes me think that they are being anti-competitive. Google literally wrote the book (ok scraped it) on anti-competitive practices.
If you live outside the United States and were unscathed by the Panda Update, a world of hurt may await you soon. Or you may be in for a pleasant surprise. It is hard to say where the chips may fall for you without looking.
Due to Google having multiple algorithms running right now, you can get a peek at the types of sites that were hit, and if your site is in English you can see if it would have been hit by comparing your Google.com rankings in the United States versus in foreign markets using the Google AdWords ad preview tool.
In most foreign markets Google is not likely to be as aggressive with this type of algorithm as they are in the United States (because foreign ad markets are less liquid and there is less of a critical mass of content in some foreign markets), but I would be willing to bet that Google will be pretty aggressive with it in the UK when it rolls out.
The keywords where you will see the most significant ranking changes will be those with a lot of competition, as keywords with less competition generally do not have as many sites ready to replace those that were whacked (since fewer people were competing for the keyword). Another way to get a glimpse of the aggregate data is to look at your Google Analytics search traffic from the US and see how it has changed relative to seasonal norms. Here is a look-out-below example, highlighting how Google traffic dropped. ;)
What is worse is that on most impacted sites, revenue declined faster than traffic, because search traffic monetizes so well & the US ad market is so much deeper than most foreign markets. Thus a site that had 50% profit margins might have just gone to break-even or losing money after this update. :D
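The margin math is worth making explicit, since a modest traffic drop can translate into a near-total profit drop. A toy model with made-up numbers:

```python
# Illustrative arithmetic (all numbers made up) for why a Panda-style
# traffic drop can erase a healthy margin: US search traffic monetizes
# far better than the traffic that remains, while costs stay fixed.

def profit(us_visits, other_visits, rpm_us, rpm_other, fixed_costs):
    """Monthly profit given visits, revenue-per-thousand, and costs."""
    revenue = (us_visits / 1000) * rpm_us + (other_visits / 1000) * rpm_other
    return revenue - fixed_costs

# Before: 100k US + 50k other visits, $40 vs $10 RPM, $2,250 in costs.
before = profit(100_000, 50_000, 40, 10, 2_250)  # -> 2250.0 (50% margin)
# After: the US traffic is cut in half; everything else unchanged.
after = profit(50_000, 50_000, 40, 10, 2_250)    # -> 250.0
print(before, after)
```

Here total traffic falls by a third but profit falls by nearly 90%, because the lost visits were the best-monetizing ones and the costs do not move.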
When Google updates the US content farmer algorithm again (likely soon, since it has already been over a month since the update happened) it will likely roll out around other large global markets, because Google does not like running (and maintaining) 2 sets of ranking algorithms for an extended period of time, as it is more cost intensive and it helps people reverse engineer the algorithm.
And the spam clean up? Google did NOTHING of the sort.
Every single example (of Google spamming Google) that was highlighted is still live.
Now Google can claim they handled the spam on their end / discounted it behind the scenes, but such claims fall short when compared to the standards Google holds other companies to.
Most sites that get manually whacked for link-based penalties are penalized for much longer than 2 weeks.
Remember the brand damage Google did to companies like JC Penney & Overstock.com by talking to the press about those penalties? In spite of THOUSANDS of media outlets writing about Google's BTQ acquisition, The Register was the most mainstream publication discussing Google's penalization of BeatThatQuote, and there were no quotes from Google in it.
When asking for forgiveness for such moral violations, you are supposed to grovel before Google, admitting all past sins & acknowledging their omniscient ability to know everything. This can lead one to over-react and actually make things even worse than the penalty did!
In an attempt to clean up their spam penalties (or at least to show they were making an effort), JC Penney did a bulk email to sites linking to them, stating that the links were unauthorized and asking that they be removed. So JC Penney not only had to spend effort dropping any ill-gotten link equity, but also lost tons of organic links in the process.
Time to coin a new SEO phrase: token penalty.
token penalty: an arbitrary short-term editorial action by Google to deflect against public relations blowback that could ultimately lead to review of anti-competitive monopolistic behaviors from a search engine with monopoly marketshare which doesn't bother to follow its own guidelines.
Your faith in your favorite politician should be challenged after you see him out on the town snorting coke and renting hookers. The same is true for Googlers preaching their guidelines as though they were law while Google is out buying links (and the sites that buy them).
You won't read about this in the mainstream press because they are scared of Google's monopolistic business practices. Luckily there are blogs. And Cyndi Lauper. ;)
Update: after reading this blog post, Google engineers once again penalized BeatThatQuote!
On-demand indexing was a great value-added feature for Google Site Search, but now it carries more risks than ever. Why? Google decides how many documents make their primary index. And if too many of your documents are arbitrarily considered "low quality" then you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products. In response Google goes out of its way to destroy your business. Awesome!
Part of the prescribed solution to the Panda Update is to noindex content that Google deems to be of low quality. But if you are telling GoogleBot to noindex some of your content, then if you are also using them for site search, you destroy the usability of their site search feature by making your content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value destructive than the arbitrary price jack Google Site Search recently did.
Cloaking vs rel=noindex, rel=canonical, etc. etc. etc.
Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google's algorithms are so complex that you literally have to break some of Google's products to be able to work with other Google products. How stupid! But a healthy reminder for those considering deeply integrating Google into your on-site customer experience. Who knows when their model will arbitrarily change again? But we do know that when it does they won't warn partners in advance. ;)
I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on their list of crawlers, questions about if they share the same user agent have gone unanswered, and even a blog post like this probably won't get a response.
That is a reflection of only one more layer of hypocrisy, in which Google states that if you don't provide great customer service then your business is awful, while going to the dentist is more fun than trying to get any customer service from Google. :D
I was talking to a friend about this stuff and I think he summed it up perfectly: "The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will."