Other Google Guy: Sorry, just shouting out "Thanks!" to Marissa. She left me a cupcake this morning. You were saying?
Google Guy: Our algo keeps returning low-quality content farm garbage.
Other Google Guy: Ah, right. We've gone all "AltaVista" a bit lately, huh. People are noticing....
Google Guy: Hey! No one mentions the AV word around here, OK!
Other Google Guy: Sorry dude. So, what shall we do?
Google Guy: We could invent a cool new algorithm, like Sergey and Larry did all those years ago
Other Google Guy: Hahahaha....you ain't Sergey or Larry, dude. Anyway, they're more concerned with self-driving cars these days, aren't they? Search is so 2001.....
Google Guy: Look, we've got to do something. The technorati are getting uppity. They're writing blog posts. Tweets. Everything. And let's not forget the JC Penney debacle. The shareholders could get angry about this. Well, they would if they understood it.....
Other Google Guy: Do they?
Google Guy: Probably not.
Other Google Guy: So, what's the problem? My data is showing most of our users couldn't give a toss about the farmer stuff. Some of them like learning about how to pour a glass of milk. It's just the valleywags getting grumpy, and no one listens to them.
Google Guy: Right, but this has the potential to filter out to the mainstream. It might get on FOX! Too many people might get the wrong end of the stick, and suddenly we're not cool anymore.
Other Google Guy: But we're not cool n.......
Google Guy: Shut it. We're still cool, OK.
Other Google Guy: Anything you say, boss
Google Guy: Hmmm.......what we could do is go "social media". So hot right now. We could crowdsource it! We'd look very cool with the hipsters.
Other Google Guy: Mmmmmm.....sauce.....
Google Guy: We'll give 'em a Chrome extension. Yes! Make them do all the work. At the very least, it's going to shut them up. They won't have to look at anything they don't want to look at. It will make them feel superior, and we can collect some data about what sites techno dudes don't like.
Other Google Guy: Brilliant! Superb! One problem - won't content farmers use this against each other in order to take each other out?
Google Guy: Nah, it's just a "ranking signal". We have hundreds of 'em we apply to every search, don't you know ;)
Other Google Guy: Hahahah..."ranking signal". Nice one, Google Guy. You can add it to the other two hundred! Or was it three hundred? Shareholders love that stuff.
Google Guy: [Laughs] Oh...kay.....almost finished this extension. I'll push it out there.....
Ten seconds pass.....
Google Guy: Hey! The first data is in already!
Other Google Guy: People use Chrome? Oops...I mean "People use Chrome!" Which sites are they blocking?
Google claims that they do not want to police low quality content by trying to judge intent, that doing so would not be scalable enough to solve the problem, & that they need to do it algorithmically. At the same time, Google is willing to manually torch some sites and basically destroy the associated businesses. Talk to enough SEOs and you will find stories of carnage - complete decimation.
Economics Drive Everything
Content farms are driven by economics. Make them unprofitable (rather than funding them) and the problem solves itself - just like Google AdWords does with quality scores. Sure you can show up on AdWords where you don't belong and/or with a crappy scam offer, but you are priced out of the market so losses are guaranteed. Hello $100 clicks!
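To make that pricing-out mechanism concrete, here is a toy sketch of the commonly described quality-score auction. The real AdWords auction is more complex and the exact formula is Google's own; every number below is made up for illustration:

```python
# Simplified model of the oft-described AdWords auction: Ad Rank is
# bid * quality score, and you pay just enough to beat the advertiser
# ranked below you. All numbers here are hypothetical.

def actual_cpc(next_ad_rank, your_quality_score):
    """Approximate cost per click under a quality-score auction."""
    return next_ad_rank / your_quality_score + 0.01

# A trusted advertiser with a quality score of 8, competing against an
# ad rank of 16, pays about $2 per click...
print(round(actual_cpc(next_ad_rank=16, your_quality_score=8), 2))    # 2.01

# ...while a scam offer with a quality score of 0.2 pays about $80 per
# click for the same exposure. Low quality prices itself out.
print(round(actual_cpc(next_ad_rank=16, your_quality_score=0.2), 2))  # 80.01
```

The same "make the junk unprofitable" lever could, in principle, be pulled on the organic side.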
How many content farms would Google need to manually torch to deter investment in the category? 5? Maybe 10? 20 tops? Does that really require a new algorithmic approach on a web with tens of millions of websites?
When Google nuked a ton of article banks a few years back the damage was fairly complete and lasted a long time. When Google nuked a ton of web directories a few years back the damage was fairly complete and lasted a long time. These were done in sweeps where in a single day you would see 50 sites lose their toolbar PageRank & take a swan dive in traffic. Yet content farms are a sacred cow that need an innovative "algorithmic" approach.
One Bad Page? TORCHED
If they feel an outright ban would be too much, they could even dial the sites down over time, deterring them without immediately killing them. Some bloggers who didn't know any better got torched based on a single blog post:
The Forrester report discusses a recent “sponsored conversation” from Kmart, but I doubt whether it mentions that even in that small test, Google found multiple bloggers that violated our quality guidelines and we took corresponding action. Those blogs are not trusted in Google’s algorithms any more.
When you look at garbage content there are hundreds of words on the page screaming "I AM EXPLOITATIVE TRASH." Yet when you look at links, they are often embedded inline with little context to tell whether the link was an organic reference or something that was paid for.
Why is it that Google is comfortable implying intent with links, but must look the other way when it comes to content?
Media is a game of numbers, and so content companies have various layers of quality they mix in to make it harder for Google to find signal from noise. Yahoo! has fairly solid content in their sports category, but then fluffs it out with top 10 lists and such from Associated Content. Now Yahoo! is hoping they can offset lower quality with a higher level of personalization:
The Yahoo platform aims to draw from a user’s declared preferences, search items, social media and other sources to find and highlight the most relevant content, according to the people familiar with the matter. It will be available on Yahoo’s Web site, but is optimized to work as an app on tablets and smartphones, and especially on Google Android and Apple devices, they said.
AOL made a big splash when they bought TechCrunch for $25 million. When AOL's editorial strategy was recently leaked it highlighted how they promoted cross-linking their channels to drive SEO strategy. And, since the acquisition, TechCrunch has only scaled up the volume of content they produce. In the last 2 days I have seen 2 advertorials on TechCrunch where the conflicting relationship was only mentioned *after* you read the post. One was a Google employee suggesting Wikipedia needs ads, and the other was some social commerce platform guy promoting the social commerce revolution occurring on Facebook.
Being at the heart of technology is a great source of link equity to funnel around their websites. TechCrunch.com already has over 25% as many unique linking domains as AOL.com does. One of the few areas that is more connected on the social graph than technology is politics. AOL just bought the Huffington Post for $315 million. The fusion of political bias, political connections, celebrity contributors, and pushing a guy who promoted an (ultimately empty) promise of hope and change quickly gave the Huffington Post even more link equity than TechCrunch has.
Ultimately this is where Google's head-in-the-sand approach to content farms backfired. When content farms were isolated websites full of trash Google could have nuked them without much risk. But now that there is a blended approach and content farms are part of public companies backed by politically powerful individuals, Google can't do anything about them. Their hands are tied.
Trends in Journalism
Much as the middle class has been gutted in the United States, Ireland (and pretty much everywhere that is not Iceland) by economic policies that squeeze the average person to protect banking criminals, we are seeing the same thing happen online to the value of any type of journalism. As we continue to ask people to do more for less, we suffer through a lower quality user experience with more half-content that leaves out the essential bits.
The silver lining there is that if you are the employer your margins may grow, but if you are an employee & are just scraping by on $10 an hour then it increases the importance of doing something on the side to lower your perceived risk & increase your influence. A few years back Marshall Kirkpatrick started out on AOL's content farms. The tips he shared to stand out would be a competitive advantage in almost any vertical outside of technology & politics:
one day Michael Arrington called and hired me at TechCrunch. "You keep beating us to stories," he told me. I was able to do that because I was getting RSS feeds from key vendors in our market delivered by IM and SMS. That's standard practice among tech bloggers now, but at the time no one else was doing it, so I was able to cover lots of news first.
Three big tips on the "becoming a well-known writer" front for new writers are...
if short-form junk content is the standard then it is easier to stand out by creating long-form, well-edited content
it is easier to be a big fish in a small pond than to try to get well known in a saturated area, so it is sometimes better to start working for niche publishers that have a strong spot in a smallish niche
if you want to target the bigger communities, the most important thing to them (and the thing they are most likely to talk about) is themselves
Spam reports are prioritized by looking at how much visibility a potentially spammy site has in our search results, in order to help us focus on high-impact sites in a timely manner. For instance, we’re likely to prioritize the investigation of a site that regularly ranks on the first or second page over that of a site that only gets a few search impressions per month.
Given the widely echoed complaints about content farms, it seems Google takes a different approach with them, especially considering that the top farms are seen by millions of searchers every month.
If end users can determine when links are paid (with limited context) then why not trust their input on judging the quality of the content as well? The Google Toolbar has a PageRank meter for assessing link authority. Why not add a meter for publisher reputation & content quality? I can hear people saying "people will use it to harm competitors" but I have also seen websites torched in Google because a competitor went on a link buying spree on behalf of their fellow webmaster. At least if someone gives you a bad rating for great content then the content still has a chance to defend its own quality.
With links, Google's opinion is final and that is it. Not only do particular techniques carry varying levels of risk, but THE prescribed analysis of intent depends on who is doing it!
A few months back I was running Advanced Web Ranking and noticed that Google and Bing were really starting to come in line on some keywords.
Of course there are still differences between Bing and Google.
Google has far more usage data built up over the years & a huge market share advantage over Bing in nearly every global market. Microsoft's poor branding in search meant they had roughly zero leverage in the marketplace until they launched the Bing brand. That longer experience in search is likely what gives Google the confidence to have a much deeper crawl.
That head start also means that Google has been working on understanding word meanings and adjusting their vocabulary far longer, which also gives them the confidence to be able to use word relationships more aggressively (when Bing came to market part of their ad campaign was built on teasing Google for this). The last big difference from an interface perspective would be that Google forces searchers down certain paths with their Google Instant search suggestions.
Who Copied Who?
But the similarities between the search engines are far greater than their differences.
At the core of Google's search relevancy algorithm is PageRank and link analysis. Bing places a lot of weight on those as well.
Google also factors in the domain name into their relevancy algorithms. So does Bing.
Google has long had universal search & Bing copied it.
Google has tried to innovate by localizing search results. Bing localizes results as well.
Bing moved the right rail ads closer to the organic search results. Google copied them.
Bing put a fourth ad above the organic search results. Google began listing vertical CPA ad units for mortgages and credit cards above the organic search results - a fourth ad unit.
Bing has a homepage background image. Google copied them by allowing you to upload a personalized homepage background image.
Bing offers left rail navigation to filter the search results. Google copied them by offering the same.
Bing innovated in travel search. Google is trying to buy the underlying data provider ITA Software.
Bing included Freebase content in their search results. Google bought Metaweb, which owns Freebase.
Bing offered infinite scroll and a unique image search experience that highlights the images. Google copied it.
Oh, The Outrage
From the start Bing was playing catch-up, but almost anything they have ever tried which truly differentiated their experience ended up copied by Google. Recently Google conducted a black PR campaign to smear Bing for using usage data across multiple search engines to improve their relevancy. The money quote:
Those results from Google are then more likely to show up on Bing. Put another way, some Bing results increasingly look like an incomplete, stale version of Google results—a cheap imitation.
Perhaps the reason Google finds this so annoying is that it allows Microsoft to refine their "crawl" & relevancy process on tail keywords, which are the hardest ones to get right (because as engines get deeper into search they have fewer signals to work with and a lot more spam). It allows Microsoft to conduct tests which compare their own internal algorithms against Google's top listings on the fly & learn from them. It takes away some of Google's economies-of-scale advantages.
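Nobody outside Microsoft knows exactly how that usage data gets folded in, but as a rough, purely hypothetical illustration, clickstream events boil down to a per-query URL preference table; a real system would add spam filtering, dwell-time weighting, position-bias correction, and much more:

```python
# Hypothetical sketch: aggregating toolbar clickstream events into a
# per-query URL preference signal. Queries and URLs are made up.
from collections import Counter, defaultdict

click_log = [
    ("rare tail query", "http://example.com/deep-page"),
    ("rare tail query", "http://example.com/deep-page"),
    ("rare tail query", "http://other.example/page"),
]

signal = defaultdict(Counter)
for query, url in click_log:
    signal[query][url] += 1

# For a tail query with few links to count, even a handful of observed
# clicks is a usable relevancy hint.
print(signal["rare tail query"].most_common(1))
# [('http://example.com/deep-page', 2)]
```

For tail queries, where link signals are thin, even a sparse table like this is worth something.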
Is Google Eating Its Own Home Cooking (And Throwing Up)?
Here is what I don't get about Google's complaints though. Google had no problem borrowing a half-dozen innovations from Bing. But this is how Google describes Bing's "nefarious" activities:
“It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work,” said Singhal. “I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.”
When a content site compiles reviews, creates editorial features to highlight the best reviews (and best reviewers), and works to create algorithms to filter out junk and spam then Google is fine with Google eating all that work for free. Google then jumps off their backs just before the finish line and throws the repurposed reviews in front of Google searchers.
This public blanket admission of Microsoft using clickstream data for relevancy purposes is helpful. But outside of the PR smear campaign from Google there wasn't much new to learn here, as this has been a bit of an open secret amongst those in the know in the search space for well over a year now.
But the idea of using existing traffic streams as a relevancy signal increases the value of having a strong, diversified flow of traffic from sources beyond search.
Recently we tested adding ads to one of our websites that had a fairly uninspired design. Even after adding the ads (which make the site feel a bit less credible), the new design fit so much better than the old one that the site now gets 26% more pageviews per visit. Anytime you can put something on your website which increases monetization and sends visitors away, yet still increases user engagement, you are making a positive change!
If You Can't Beat 'Em, Filter
I was being a bit of a joker when I created this, but the point remains: as larger search engines force-feed junk (content mills and vertical search results) down end users' throats, one of the best ways for upstart search engines to compete is to filter that stuff out. Both DuckDuckGo and Blekko have done just that.
Search can be used as a wedge in a variety of ways. Most are perhaps poorly understood by the media and market regulators.
Woot! Check Out Our Bundling Discounts
When Google Checkout rolled out, it was free. Not only was it free, but it came with a badge that appeared near AdWords ads to make the ads stand out. That boosted ad clickthrough rates, which fed into ad quality score & acted as a discount for advertisers who used Google Checkout. If you did not use Google's bundled services you were stuck paying above-market rates to compete with those who accepted Google's bundling discounts.
Companies spend billions of dollars every year building their trademarked brands. But if they don't pay Google for their existing brand equity then Google sells access to that stream of branded traffic to competitors, even though internal Google studies have shown it causes confusion in the marketplace.
The Right to Copy
Copyright protects the value of content. Google raises the cost of maintaining that value: DoubleClick and AdSense fund a lot of copy-and-paste publishing, even of the automated variety. Sure you can hide your content behind a paywall, but if Google is paying people to steal it and wrap it in ads, what legal recourse do you have if those people live in a country which doesn't respect copyright?
You can see how LOOSE Google's AdSense standards are when it comes to things like copyright and trademarks by searching for something like "bulk PageRank checker" and seeing how many sites that violate Google's TOS in multiple ways are built on cybersquatted domain names containing the word "PageRank". There are also sites dedicated to turning YouTube videos into MP3s which are monetized via AdSense.
Philosophically Google believes in (and delivers regular sermons about) an open web where companies compete on the merit of their products. And yet when Google enters a new vertical they *require* you to let them use your content against you. If you want to opt out of competing against yourself, Google says that is fine, but the only way they will let you opt out is to block them from indexing your content & kill your search traffic.
“Google has also advised that if we want to stop content from appearing on Google Places we would have to reduce/stop Google’s ability to scan the TripAdvisor site,” said Kaufer. “Needless to say, this would have a significant impact on TripAdvisor’s ranking on natural search queries through Google and, as such, we are not blocking Google from scanning our site.”
From a public relations standpoint & a legal perspective I don't think it is a good idea for Google to deliver all-or-nothing ultimatums. Ultimately that could cause people in positions of power to view their acts as a collection which have to be justified on the whole, rather than on an individual basis.
Lucky for publishers, technology does allow them to skirt Google's suggestions. If I ran an industry-leading review site and wanted to opt out of Google's all-or-nothing scrape job scam, my approach would be to selectively post certain types of content. Some of it would be behind a registration wall, some of it would be publicly accessible in iframes, and maybe just a sliver of it is fully accessible to Google. That way Google indexes your site (and you still rank for the core industry keywords), but they can't scrape the parts you don't want them to. Of course that means losing out on some longtail search traffic (as the hidden content is invisible to search engines), but it is better than the alternatives of killing all search traffic or giving away the farm.
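As a purely hypothetical sketch of that tiered setup, assuming a small Flask app (the tier names, URLs, and content store are all invented for illustration):

```python
# Hypothetical tiered content delivery: one public tier Google may
# index, one visible-but-unindexable tier, one registration-walled tier.
from flask import Flask, abort, make_response, session

app = Flask(__name__)
app.secret_key = "change-me"  # required for session access

CONTENT = {
    "widget-reviews": ("public", "Full review text..."),       # indexable sliver
    "widget-ratings": ("noindex", "Aggregate rating data..."),  # visible, not indexable
    "widget-analysis": ("registered", "Premium analysis..."),   # behind the wall
}

@app.route("/content/<slug>")
def content(slug):
    tier, body = CONTENT.get(slug, (None, None))
    if tier is None:
        abort(404)
    if tier == "registered" and not session.get("user"):
        abort(403)  # registration wall: invisible to crawlers & scrapers
    resp = make_response(body)
    if tier == "noindex":
        # X-Robots-Tag is a Google-documented header that blocks indexing
        # of an individual URL without blocking the crawl of the site.
        resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```

The split is the point: the indexable sliver keeps the core rankings, while the walled tiers stay off the Places pages.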
Over the past year or two there have been lots of changes with Google pushing vertical integration, but outside of localization and verticalization, core relevancy algorithms (especially in terms of spam fighting) haven't changed much recently. There have been a few tricky bits, but when you consider how much more powerful Google has grown, their approach to core search hasn't been as adversarial as it was a few years back (outside of pushing more self-promotion).
There has been some speculation as to why Google has toned down their manual intervention, including:
anti-trust concerns as Google steps up vertically driven self-promotion (and an endless well of funding for anyone with complaints, courtesy of Microsoft)
a desire to create more automated solutions as the web scales up
spending significant resources fighting site hacking (the "bigger fish to fry" theory)
As we’ve increased both our size and freshness in recent months, we’ve naturally indexed a lot of good content and some spam as well. To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.
It sounds like Google was mainly focused on fighting hacked sites and auto-generated & copied content. And now that hacked *GOVERNMENT* websites are available for purchase for a few hundred dollars (and perhaps millions in personal risk when a government comes after you) it seems Google's push toward fighting site hacking was a smart move! Further, there is a wide array of start-ups built around leveraging the "domain authority" bias in Google's algorithm, which means that looking more at page-by-page metrics was a needed step to evolve relevancy. Page-by-page metrics will also allow Google to filter out the cruddy parts of good sites without killing off the whole site.
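In toy form, a "document-level classifier" keyed on repeated spammy phrases might look like the sketch below. The phrase list and threshold are invented, and the real classifier is surely statistical and far subtler:

```python
# Toy document-level spam score in the spirit of the classifier Matt
# describes: repetition of junky phrases, normalized by document length.
import re

SPAMMY_PHRASES = ["buy now", "work from home", "free download"]  # hypothetical list

def spam_score(document: str) -> float:
    """Fraction-of-document measure of repeated spammy phrases."""
    words = re.findall(r"[a-z']+", document.lower())
    if not words:
        return 0.0
    hits = sum(document.lower().count(p) for p in SPAMMY_PHRASES)
    return hits / len(words)

comment = "Buy now! Work from home! Buy now! Free download, buy now!"
print(round(spam_score(comment), 2))  # ~0.45
print(spam_score(comment) > 0.05)     # True: the sort of junky, automated,
                                      # self-promoting blog comment he mentions
```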
As Google has tackled many of the hard core auto-generated spam issues it allows them to ramp up their focus on more vanilla spam. Due to a rash of complaints (typically from web publishers & SEO folks) content mills are now a front and center issue:
As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception.
But what sort of sites are the content mills that Google is going to ramp up action on?
The tricky part with vanilla spam is its subjective nature. End users (particularly those who are not web publishers & online advertisers) might not complain much about sites like eHow because they are aesthetically pleasing & well formatted for easy consumption. The content might be at a low level, but maybe Google is willing to let a few of the bigger players slide. And there is a lot of poorly formatted expert content which end users would view as worse than eHow, simply because it is not formatted for online consumption.
My guess is that sites that took a swan dive in the October 23rd timeframe might expect to fall off the cliff once more. Where subjective search relevancy gets hard is that issues rise and fall like ocean waves crashing ashore. Issues that get fixed eventually create opportunities for other problems to fester. And after an issue has been fixed for long enough it becomes a non-issue, to the point of being a promoted best practice, at least for a while.
Anyone who sees opportunity as permanently disappearing from search is looking at a glass half empty, rather than seeing opportunities that died being reborn again and again.
That said, I view Matt's blog post as a bit of a warning shot. What types of sites do you think he is coming after? What types of sites do you see benefiting from such changes? Discuss. :)
One of the fundamental keys to monetizing third party content is finding a way to do it while keeping your earnings data abstract. A huge problem that hits pure plays like Netflix is that as soon as companies see the profits the cost structures change.
Partners who license video content to Netflix want a bigger piece of the action as well: "Now many of the companies that make the shows and movies that Netflix delivers to mailboxes, computer screens and televisions — companies whose stocks have not enjoyed the same frothy rise, and whose chief executives have not won the same accolades — are pushing back, arguing that the company is overhyped, and vowing to charge much more to license their content."
Making big money on someone else's content makes the content owner look stupid. As soon as you let big media know you are making money on their content they get pissed and feel they rightfully earned that money. As they sense a shift in power any edge cases become the standard against which all other deals are compared.
How YouTube Differs From Netflix
If you study Google & listen to their quarterly conference calls you will always come away with the following: YouTube is operating at an amazing scale, YouTube's growth is accelerating, and YouTube might not be profitable. In the most recent quarterly call Google highlighted that their display network was a $2.5 billion business, but we never hear specific revenue or cost stats for YouTube. Hiding that business within the larger Google enterprise allows Google to print money and gain leverage without provoking the wrath of big media.
Remember how Google doesn't like cloaking? But they will DRM-manage your media for you & if someone views it outside of the appropriate region they will get a "screw you" page, like so:
(If you are from the US you can see how content is cloaked in various countries by using web proxies or VPN services.)
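For instance, a minimal comparison script, assuming you have access to a proxy endpoint in the target country (the proxy address and video ID below are placeholders):

```python
# Fetch the same page directly and through a foreign proxy to see the
# geo-cloaked "screw you" interstitial. Substitute a real proxy/VPN
# endpoint and a region-locked video; both values here are placeholders.
import requests

URL = "https://www.youtube.com/watch?v=VIDEO_ID"

direct = requests.get(URL, timeout=10)
via_proxy = requests.get(
    URL,
    proxies={"https": "http://proxy.example.co.uk:8080"},  # hypothetical proxy
    timeout=10,
)

# If one response carries a "not available in your country" notice and
# the other does not, you are looking at the cloaking in action.
print("not available in your country" in direct.text.lower())
print("not available in your country" in via_proxy.text.lower())
```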
Copyright is for Suckers
Is Google a more authoritative bookseller than Barnes & Noble? Other than lying & taking a few legal shortcuts, what puts Google in a superior position as a bookseller?
At least their (lack of) respect for copyright is consistent.
You Need to Disclose, but Google Does NOT
Remember back when Google claimed that anyone buying or selling links needed to do it in a way that is both machine-readable & human-readable? Well, Google invested in VigLink, which is certainly 100% counter to that spirit. Further, consider Google's recent hard-coding of ebook promotions in their search results. There is no ad label in a machine-readable or human-readable format, but they mix it right into their 'organic' search results.
Remember how paid links were bad?
"Search engine guidelines require machine-readable disclosure of paid links in the same way that consumers online and offline appreciate disclosure of paid relationships (for example, a full-page newspaper ad may be headed by the word 'Advertisement')" - Google.
If you do the same thing Google does, then you are violating their guidelines. Sorta hard to compete with them while staying inside their guidelines then, eh?
If Google expects you to label your paid ads in machine and human readable ways, then why are they fine with blending their ads directly into the organic search results with no disclaimer? Do they actually believe that manipulating end users (to promote their own business deals) is less evil than potentially manipulating a search tool?
If you want to know what's really going on in a society or ideology, follow the money. If money is flowing to advertising instead of musicians, journalists, and artists, then a society is more concerned with manipulation than truth or beauty. If the content is worthless, then people will start to become empty-headed and contentless.
The combination of hive mind and advertising has resulted in a new kind of social contract. The basic idea of this contract is that authors, journalists, musicians, and artists are encouraged to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind. Reciprocity takes the form of self-promotion. Culture is to become precisely nothing but advertising.
I mentioned this in our last post but it probably deserves a post of its own. ;)
Google has long claimed that search results inside search results are a poor user experience. They also claim their use of your content is fair use because it is only for ranking and distribution purposes.
Take a look at Google's deskbar subdomain. Google has created MILLIONS of pages on this subdomain.
These pages ARE ranking in the search results.
Google's quest to become the web is leading them to produce a lot of half-done products (is eHow's content written at a higher level than Matt Cutts writes?) & an increasing variety of bugs. These of course create opportunity for some folks, but a whole lot of pain for many folks who have done nothing wrong other than trusting Google to be competent & fair.
I understand ready, fire, aim on beta tests or things for start-ups, but should Google be doing this sort of silliness with a search service millions depend on?
So much of their originality algorithms' job is to determine the true source on the internet; the moment bugs like this appear, that trustworthiness is tarnished, and the people who poured sweat, blood, and tears into a product can be wiped out by the flip of a deskbar.google.com launch.
Search engines are powerful because they are an editorial filter which encourages relevancy.
Four Legs Good, Two Legs Better
Frequently we are told that any errors or omissions on the part of search engines are not due to bad algorithms, but rather due to unscrupulous spammers.
Webmaster guidelines are arbitrary & ever-shifting, and preached like gospel. The 'or else' fear mindset is a primary component of the algorithm.
And yet when some of the largest & most outrageous guideline violations are brought to light, they are quickly dismissed & swept under the rug.
In some cases search engineers conflate SEOs with hackers who are doing illegal activities, but if all marketers & advertisers were criminals then Google.com would top that list, given that ~99% of their revenues come from ads & fewer than 100 countries have a GDP greater than Google's revenues. :D
Are 'Spammers' Relevant?
Further claims against spammers include irrelevancy. That was true before I got into the search game (and in some edge cases might be true today), but most spammers try to be relevant. Back in the late 90's, when "any page view will do" banner advertising ruled the web, all one needed to profit was page views by any means. But as marketing has become more precise and more closely measured, it has become more relevant. With current online marketing driven by true conversion performance, relevancy is key. If you show up where you are not relevant you are simply wasting your time & money.
Search engines have a CPM higher than virtually any other type of media format precisely because their ads are so relevant.
Who Promotes Inferior Product?
Let's skip the fact that Google's ad system is set up to maximize yield, and ignore that Google AdSense has a get-rich-quick ad category. Looking beyond those, the core argument against spammers is that they pollute the organic search results & leverage Google's distribution to bring inferior product to market.
You know who else does that?
Yelp Inc. CEO Jeremy Stoppelman has complained about Google's use of Yelp content for Google Place pages and is negotiating with Google over the issue. He said Google "is trying to leverage its distribution power"—the search engine—"to take an inferior product and put it in front of the user."
In Google's ideal world they would build a media empire by scraping whoever's content they want, monetizing it however they like, and paying partners a prescribed share of the revenues, right up until Google finds another partner which is willing to accept less.
Google is no longer able to stream in reviews from TripAdvisor to Places pages after the user review giant blocked it.
TripAdvisor confirmed the move today in an email, stating that while it continues to evaluate recent changes to Google Places, it believes the user does not benefit with the “experience of selecting the right hotel”.
“As a result, we have currently limited TripAdvisor content available on those pages,” an official says.
As Google spreads into a B2C player & tries to offer up suggestions for everything, the top market leaders in many big markets (like Yelp & TripAdvisor) will tell them to screw off. However, players 2 through x will be desperate enough for exposure that they are driven by short-term thinking. Google's ebook news mentioned that software is in place to do bundled deals to sell hard copies with the electronic versions. And just look at the direct-to-consumer marketing Google is doing in Japan.
Eventually market leaders will be offered concessions for deals, or Google will partner with lower placed businesses to slowly wear down the advantage of market leaders with a slow water torture treatment. But for now TripAdvisor stands on its own.
The positive news for Google in this is that the search results offer a wide range of excellent hiking boots for Googlers to choose from :D
“Hello, My name is Stanley with DecorMyEyes.com,” the post began. “I just wanted to let you guys know that the more replies you people post, the more business and the more hits and sales I get. My goal is NEGATIVE advertisement.” It’s all part of a sales strategy, he said. Online chatter about DecorMyEyes, even furious online chatter, pushed the site higher in Google search results, which led to greater sales. He closed with a sardonic expression of gratitude: “I never had the amount of traffic I have now since my 1st complaint. I am in heaven.”
If you look at the backlinks for DecorMyEyes.com, you'll find a significant volume of inbound linking, some of which is junk, but also includes links from the likes of the New York Times. The high-profile links are a direct result of bad publicity.
Of course, this has always been the fly in Google's ointment. Google's link-oriented approach to ranking reflects the attention a site receives. This doesn't necessarily mean the site is endorsed, and in this case, the opposite is true.
Facing a PR disaster, in all senses of the word, Google were quick to act:
We were horrified to read about Ms. Rodriguez’s dreadful experience. Even though our initial analysis pointed to this being an edge case and not a widespread problem in our search results, we immediately convened a team that looked carefully at the issue. That team developed an initial algorithmic solution, implemented it, and the solution is already live.
Hmmm....was the algorithmic solution "if domain = DecorMyEyes.com, then PR=0" :)
Jokes aside, Google outlined the options they could have taken to prevent such a problem but chose not to, then cryptically hinted at the step they did eventually take:
Instead, in the last few days we developed an algorithmic solution which detects the merchant from the Times article along with hundreds of other merchants that, in our opinion, provide an extremely poor user experience. The algorithm we incorporated into our search rankings represents an initial solution to this issue, and Google users are now getting a better experience as a result.
Reading between the lines, it is clear that.......erm.......hmmmm.........I don't know about you, but I'm none the wiser! That could mean anything! Assembling a team of hand editors to babysit the results of an algo, or the beginnings of some frightfully clever semantic analysis.
Hard to tell.
Google make out that the case is an outlier, although that would only be true on the surface. The fundamental problem, for Google, is link context, and that is a far more difficult problem to solve.
Link As A Vote
When Google started, they used a clever backlink check as a form of voting. The more backlinks a site had, from sites deemed to be authoritative, the higher the rank.
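That voting idea is public - it is the PageRank paper - and in toy form it fits in a few lines. A minimal power-iteration sketch over a hypothetical four-page web:

```python
# Minimal PageRank power iteration over a tiny invented link graph.
# Each page splits its score among the pages it links to; the damping
# factor models the "random surfer" from the original Brin & Page paper.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],  # d links out, but nobody links to d
}

DAMPING, N = 0.85, len(links)
rank = {page: 1.0 / N for page in links}

for _ in range(50):
    new_rank = {page: (1 - DAMPING) / N for page in links}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# "c" collects the most votes so it ranks highest; "d", with no inbound
# links, is stuck at the baseline score.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```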
But the web has changed.
These days, we have Facebook and social media. Most people on the web aren't web publishers in the traditional sense. Most people participate on the web, but don't have their own websites. They post on other people's sites, over which they have little control. Google has to make sense of all this, because Google still wants to know what information people pay the most attention to.
At its beating heart, a link is a mark of attention.
Google collects markers of attention.
As the PR - as in public relations - problem with DecorMyEyes reveals, popularity and authority calculations are not enough. Google's black box also has to figure out context. Most SEOs would guess Google is putting a lot of work into semantic analysis.
This is why it is becoming increasingly important to treat SEO as a public relations exercise. Links can come from anywhere, and whether they are no-followed, scripted or otherwise, they are all markers of attention. Google's job will always be to collect them, and make sense of them. To the webmaster, all markers of attention are valuable.
Well, almost all.
DecorMyEyes turned it into a marketing strategy, but in terms of SEO, it was never going to last. First rule of SEOClub is that you don't publicly embarrass Google.
In a useful way.
Oscar Wilde said the only thing worse than being talked about is not being talked about. Malcolm McLaren said something similar: "bad publicity isn't as good as good publicity, it is ten times better". And Brendan Behan: "All publicity is good, except an obituary notice".