What are the incentives to publish high-value content to the web?
Search engines, like Google, say they want to index quality content, but provide little incentive to create and publish it. The reality is that the publishing environment is risky, relatively poorly paid in most instances, and is constantly being undermined.
There is little point publishing web content if the cost of publishing outweighs any profit that can be derived from it.
Many publishers, who have search engines in mind, work on an assumption that if they provide content to everyone, including Google, for free, then Google should provide traffic in return. It’s not an official deal, of course. It’s unspoken.
Rightly or wrongly, that’s the “deal” as many webmasters perceive it.
What Actually Happens
Search engines take your information and, if your information is judged sufficiently worthy that day, as the result of an ever-changing, obscure digital editorial mechanism known only to themselves, they will rank you highly, and you’ll receive traffic in return for your efforts.
That may all change tomorrow, of course.
What might also happen is that they could grab your information, amalgamate it, rank you further down the page, and use your information to keep visitors on their own properties.
Look at the case of Trip Advisor. Trip Advisor, frustrated with Google’s use of its travel and review data, filed a competition complaint against Google in 2012.
The company said: "We hope that the commission takes prompt corrective action to ensure a healthy and competitive online environment that will foster innovation across the internet."
The commission has been investigating more than a dozen complaints against Google from rivals, including Microsoft, since November 2010, looking at claims that it discriminates against other services in its search results and manipulates them to promote its own products.
TripAdvisor's hotel and restaurant review site competes with Google Places, which provides reviews and listings of local businesses. "We continue to see them putting Google Places results higher in the search results – higher on the page than other natural search results," said Adam Medros, TripAdvisor's vice president for product, in February. "What we are constantly vigilant about is that Google treats relevant content fairly."
Similarly, newspapers have taken aim at Google and other search engines for aggregating their content and deriving value from that aggregation, while the newspapers themselves claim they aren’t making enough to cover the cost of producing that content in the first place:
In 2009 Rupert Murdoch called Google and other search engines “content kleptomaniacs”. Now cash-strapped newspapers want to put legal pressure on what they see as parasitical news aggregators.
Of course, it’s not entirely the fault of search engines that newspapers are in decline. Their own aggregation model - bundling news, sport, lifestyle, and classifieds into one “place” - has been surpassed.
Search engines often change their stance without warning, or can be cryptic about their intentions, often to the detriment of content creators. For example, Google has stated they see ads as helpful, useful and informative:
In his argument, Cutts said, “We actually think our ads can be as helpful as the search results in some cases. And no, that’s not a new attitude.”
In entering the advertising market, Google tested our belief that highly relevant advertising can be as useful as search results or other forms of content.
However, business models built around the ads-as-content idea, such as Suite101.com, got hammered. Google could argue these sites went too far, and that Google is simply asserting editorial control, and that may be true, but such cases highlight the flaky and precarious nature of the search ecosystem as far as publishers are concerned. One day, what you're doing is seemingly "good"; the next day it is "evil". Punishment is swift and without trial.
Creators in other fields see the same dynamic. One widely cited interview with Radiohead's Thom Yorke captured his frustration with the commodification of content:

In the days before we meet, he has been watching a box set of Adam Curtis's BBC series, All Watched Over by Machines of Loving Grace, about the implications of our digitised future, so the arguments are fresh in his head. "We were so into the net around the time of Kid A," he says. "Really thought it might be an amazing way of connecting and communicating. And then very quickly we started having meetings where people started talking about what we did as 'content'. They would show us letters from big media companies offering us millions in some mobile phone deal or whatever it was, and they would say all they need is some content. I was like, what is this 'content' which you describe? Just a filling of time and space with stuff, emotion, so you can sell it?"
Having thought they were subverting the corporate music industry with In Rainbows, he now fears they were inadvertently playing into the hands of Apple and Google and the rest. "They have to keep commodifying things to keep the share price up, but in doing so they have made all content, including music and newspapers, worthless, in order to make their billions. And this is what we want? I still think it will be undermined in some way. It doesn't make sense to me. Anyway, All Watched Over by Machines of Loving Grace. The commodification of human relationships through social networks. Amazing!"
There is no question the value of content is being depreciated by big aggregation companies. The overhead of creating well-researched, thoughtful content is the same whether search engines value it or not. And if they do value it, a lot of the value of that content has shifted to the networks, distributors and aggregators and away from the creators.
Facebook’s value is based entirely on the network itself. Almost all of Google’s value is based on scraping and aggregating free content and placing advertising next to it. Little of this value gets distributed back to the creator, unless they take further, deliberate steps to try and capture some back.
In such a precarious environment, what incentive does the publisher have to invest and publish to the “free” web?
Google lives or dies on the relevancy of the information they provide to visitors. Without a steady supply of “free” information from third parties, they don’t have a business.
Of course, this information isn’t free to create. So if search engines do not provide you profitable traffic, then why allow search engines to crawl your pages? They cost you money in terms of bandwidth and may extract, and then re-purpose, the value you created to suit their own objectives.
Google has done content-related deals in the past. In February 2013 they did one in France whereby Google agreed to help publishers develop their digital units:
Under the deal, Google agreed to set up a fund, worth 60 million euros, or $80 million, over three years, to help publishers develop their digital units. The two sides also pledged to deepen business ties, using Google’s online tools, in an effort to generate more online revenue for the publishers, who have struggled to counteract dwindling print revenue.
This seems to fit with Google’s algorithmic emphasis on major web properties, seemingly as a means to sift the "noise in the channel". Such positioning favors big, established content providers.
It may have also been a forced move as Google would have wanted to avoid a protracted battle with European regulators. Whatever the case, Google doesn’t do content deals with small publishers and it could be said they are increasingly marginalizing them due to algorithm shifts that appear to favor larger web publishers over small players.
Don't Be Evil To Whom?
Google’s infamous catch-phrase is “Don’t Be Evil”. In the documentary “Inside Google”, Eric Schmidt said he initially thought the phrase was a joke. Soon after, he realized they took it seriously.
The problem with such a phrase is that it implies Google is a benevolent moral actor that cares about... what? You - the webmaster?
“Don’t Be Evil” is typically used by Google in reference to users, not webmasters. In practice, it’s not even a question of morality, it’s a question of who to favor. Someone is going to lose, and if you’re a small webmaster with little clout, it’s likely to be you.
For example, Google appear to be kicking a lot of people out of AdSense, and as many webmasters are reporting, Google often act as judge, jury and executioner, without recourse. That’s a very strange way of treating business “partners”, unless partnership has some new definition of which I'm unaware.
But I think Google as an organization has moved on; they're focussed now on market position, not making the world better. Which makes me sad. Google is too powerful, too arrogant, too entrenched to be worth our love. Let them defend themselves, I'd rather devote my emotional energy to the upstarts and startups. They deserve our passion.
Some may call such behavior a long way from “good” on the “good” vs “evil” spectrum.
How To Protect Value
Bottom line: if your business model involves creating valuable content, you’re going to need a strategy to protect it and claw value back from aggregators and networks in order for a content model to be sustainable.
Some argue that if you don’t like Google, then block them using robots.txt. This is one option, but there’s no doubt Google still provides some value - it’s just a matter of deciding where to draw the line on how much value to give away.
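For those who do decide to draw the line at zero, the mechanics are simple. A minimal robots.txt that blocks Google's main crawler while leaving other compliant crawlers alone might look like this (treat it as a sketch; the user-agent names are standard, but tune the rules to your own site):

```text
# Block Google's web crawler from the entire site
User-agent: Googlebot
Disallow: /

# All other crawlers may fetch everything
User-agent: *
Disallow:
```

Note that robots.txt is advisory: it stops compliant crawlers from fetching pages, but a blocked URL can still appear in search results if other sites link to it.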
What Google offers is potential visitor attention. We need to acquire and hold enough visitor attention before converting those visitors to a desired action. An obvious way to do this, of course, is to provide free, attention-grabbing content that offers some value, then lock the high-value content away behind a paywall. Be careful about page length, though. As HubPages CEO Paul Edmondson points out:
Longer, richer pages are more expensive to create, but our data shows that as the quality of a page increases, its effective revenue decreases. There will have to be a pretty significant shift in traffic to higher quality pages to make them financially viable to create.
You should also consider giving the search engines summaries or the first section of an article, but block them from the rest.
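One way to implement that split, sketched against a hypothetical URL layout where teaser pages live under /summaries/ and full articles under /articles/:

```text
# robots.txt - expose teasers, withhold full articles from crawlers
User-agent: *
Allow: /summaries/
Disallow: /articles/
```

A per-page alternative is a robots meta tag (`<meta name="robots" content="noindex">`) on the full-article templates, which keeps those pages out of the index even if they get crawled via stray links.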
I know a little bit about this because in January I was invited to a meeting at the A.P.’s headquarters with about two dozen other publishers, most of them from the print world, to discuss the formation of the consortium. TechCrunch has not joined at this time. Ironically, neither has the A.P., which has apparently decided to go its own way and fight the encroachments of the Web more aggressively (although, to my knowledge, it still uses Attributor’s technology). But at that meeting, which was organized by Attributor, a couple slides were shown that really brought home the point to everyone in the room. One showed a series of bar graphs estimating how much ad revenues splogs were making simply from the feeds of everyone in the room. (Note that this was just for sites taking extensive copies of articles, not simply quoting). The numbers ranged from $13 million (assuming a $.25 effective CPM) to $51 million (assuming a $1.00 eCPM)
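The arithmetic behind those bar graphs is straightforward eCPM math: revenue = impressions × eCPM / 1000. A quick sketch of it (the impression count here is my own assumption, back-solved from the quoted dollar figures rather than taken from the slides):

```python
def feed_revenue(impressions, ecpm_dollars):
    """Estimated ad revenue for a number of ad impressions at a
    given effective CPM (dollars per thousand impressions)."""
    return impressions * ecpm_dollars / 1000

# Back-solve: $13M at a $0.25 eCPM implies ~52 billion impressions
implied_impressions = 13_000_000 * 1000 / 0.25   # 52,000,000,000

# At a $1.00 eCPM the same volume is worth ~$52M, in line with
# the ~$51M upper estimate quoted above
print(feed_revenue(implied_impressions, 1.00))
```

The spread in the estimates comes entirely from the assumed eCPM, which is why the low and high figures differ by a factor of four.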
You still end up facing the cost of policing "content re-purposing" - just one of the many costs publishers face when publishing on the web, and just one more area where the network is sucking out value.
Use multiple channels so you’re not reliant on one traffic provider. You might segment your approach by providing some value to one channel, and some value to another, but not all of it to both. This is not to say models entirely reliant on Google won’t work, but if you do rely on a constant supply of new visitors via Google, and if you don’t have the luxury of having sufficient brand reputation, then consider running multiple sites that use different optimization strategies so that the inevitable algorithm changes won’t take you out entirely. It’s a mistake to think Google cares deeply about your business.
Treat every new visitor as gold. Look for ways to lock visitors in so you aren’t reliant on Google in future for a constant stream of new traffic. Encourage bookmarking, email sign-ups, memberships, rewards - whatever it takes to keep them. Encourage people to talk about you across other media, such as social media. Look for ways to turn visitors into broadcasters.
Adopt a business model that leverages your content. Many consultants write business books. They make some money from the books, but the books mainly serve as advertisements for their services or speaking engagements. Similarly, would you be better off creating a book and publishing it on Amazon than publishing too much content to the web?
Business models focused on getting Google traffic and then monetizing that attention using advertising only work if the advertising revenue covers production cost. Some sites make a lot of money this way, but big-money content sites are in the minority. Given the low return of a lot of web advertising, other webmasters opt for cheap content production. But cheap content isn’t likely to get the attention required these days, unless you happen to be Wikipedia.
Perhaps a better approach for those starting out is to focus on building brand / engagement / awareness / publicity / non-search distribution. As Aaron points out:
...the sorts of things that PR folks & brand managers focus on. The reason being is that if you have those things...
the incremental distribution helps subsidize the content creation & marketing costs
many of the links happen automatically (such that you don't need to spend as much on links & if/when you massage some other stuff in, it is mixed against a broader base of stuff)
that incremental distribution provides leverage in terms of upstream product suppliers (eg: pricing leverage) or who you are able to partner with & how (think about Mint.com co-marketing with someone or the WhiteHouse doing a presentation with CreditCards.com ... in addition to celebrity stuff & such ... or think of all the ways Amazon can sell things: rentals, digital, physical, discounts via sites like Woot, higher margin high fashion on sites like Zappos, etc etc etc)
as Google folds usage data & new signals in, you win
as Google tracks users more aggressively (Android + Chrome + Kansas City ISP), you win
if/when/as Google eventually puts some weight on social you win
people are more likely to buy since they already know/trust you
if anyone in your industry has a mobile app that is widely used & you are the lead site in the category you could either buy them out or be that app maker to gain further distribution
Google engineers are less likely to curb you knowing that you have an audience of rabid fans & they are more likely to consider your view if you can mobilize that audience against "unjust editorial actions"
A lot of the most valuable content on this site is locked-up. We’d love to open this content up, but there is currently no model that sufficiently rewards publishers for doing so. This is the case across the web, and it's the reason the most valuable content is not in Google.
It’s not in Google because Google, and the other search engines, don’t pay.
Fair? Unfair? Is there a better way? How can content providers - particularly newcomers - grow and prosper in such an environment?
Is using payment to influence search results unethical unless the check has Google on it?
None of those links in the content use nofollow, in spite of many of them having Google Analytics tracking URLs on them.
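For reference, the mechanics at issue: a paid or campaign-tracked link passes PageRank unless it carries rel="nofollow". The contrast looks like this (the URL and campaign parameters are hypothetical, for illustration only):

```html
<!-- Campaign-tracked link that still passes PageRank -->
<a href="https://example.com/?utm_source=promo&utm_campaign=launch">Partner site</a>

<!-- The same link with PageRank flow disclaimed -->
<a href="https://example.com/?utm_source=promo&utm_campaign=launch"
   rel="nofollow">Partner site</a>
```

The presence of Google Analytics tracking parameters signals a deliberate marketing placement, which is exactly the kind of link Google's guidelines say should be nofollowed.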
And I literally spent less than 10 minutes finding the above examples & writing this article. Surely Google insiders know more about Google's internal marketing campaigns than I do. Which leads one to ask the obvious (but uncomfortable) question: why doesn't Google police themselves when they are policing others? If their algorithmic ideals are true, shouldn't they apply to Google as well?
Clearly Google takes paid links that pass pagerank seriously, as acknowledged by their repeated use of them.
Growing search marketshare is hard work. At a recent investor conference Marissa Mayer stated that: "The key pieces are around the underpinnings of the alliance themselves. The point is, we collectively want to grow share, rather than trading share with each other."
Part of the reason Yahoo! & Bing struggle to gain marketshare is Google's default search placement payments to Mozilla and Apple. If the associated browsers have nearly 1/3 the market & Chrome is another 1/3 of the market then it requires Yahoo! or Bing to be vastly better than Google to break the Google habit + default search placement purchases.
half of those billions of queries it handles comes from Google partners, rather than searches at Google directly.
Arora also said that he expects about 50% of advertising to move online in the next three to five years.
he just said ad team looks at ways to make ads not look like ads. I think he meant that positively, like content you want.
A friend sent me a screenshot where he was surprised how similar the results looked between Bing & Google.
If Bing looks too different it feels out of place; if it looks too similar it doesn't feel memorable. And if Google is optimized for revenue generation then Bing is going to have a fairly similar look & feel to their results if they want to earn enough to bid on partnerships.
Another factor helping Google maintain their dominance in search marketshare is the shift of search query mix to mobile, where Google has a 95.8% marketshare.
In spite of losing share on browser defaults & mobile, Yahoo! managed to grow their search ad clicks 11% year over year. How was Yahoo! able to do that? In part by quietly dialing up on search arbitrage. They have long had a "trending now" box on their homepage, but over the past year they have dialed up on ads in their news, finance & sports sections that are linked to search queries. Some of these ad units are in the sidebar & some are inline with the articles.
Yahoo! also buys ads on some smaller ad networks & sends those through to a search result with almost no organic results.
“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.” - Eric Schmidt
And the risks from such data mining operations are not just in "those countries over there." The ad networks that hire lobbyists to change foreign privacy laws do so such that they can better track people the globe over and deliver higher paying ads. (No problem so long as they don't catch you on a day you are down and push ads for a mind numbing psychotropic drug with suicidal or homicidal side effects.)
top 10 ways to regurgitate top 10 lists from 10 different angles (BuzzFeed)
hatchet job that was written before manufacturing the "conforming" experience (example)
factually incorrect hate bait irrelevant article with no author name, wrapped in ads for get rich quick scams (example)
... no matter how it is created, it is fine, so long as you have political influence. Not only will it rank, but it will be given a ranking boost based on being part of a large site, even if it is carpet bombed with irrelevant ads.
Coin Operated Ideals
But then the companies that claim this transparency is vital for society pull a George Costanza & "Do The Opposite" with their own approach.
If we don't examine the faux ideals pushed to shift cultural norms we will end up with a crappier world to live in. Some Googlers (or Google fanbois) who read this will claim I am a broken record stuck in the past on this stuff. But those same people will be surprised x years down the road when something bizarre surfaces from an old deranged contact or prior life.
Anyone who has done anything meaningful has also done some things that are idiotic.
Is that sort of stuff always forever relevant or does it make sense at some point to move on?
Mr. Page, the CEO, about a year ago pushed the idea of requiring Google users to sign on to their Google+ accounts simply to view reviews of businesses, the people say. Google executives persuaded him not to pursue that strategy, fearing it would irritate Google search users, the people say.
Links to Google+ also appear in Google search-engine results involving people and brands that have set up a Google+ account.
Other websites can't hardcode their own listings into the search results. But anyone who widely attempted showing things to Googlebot while cloaking them to users would stand a good chance of being penalized for their spam. They would risk both a manual intervention & being hit by Panda based on poor engagement metrics.
Recall that a big portion of the complaint about Google's business practices was their scrape-n-displace modus operandi. As part of the FTC agreement, companies are able to opt out of being scraped into some of Google's vertical offerings, but that still doesn't prevent their content from making its way into the knowledge graph.
Now that Google is no longer free to scrape-n-displace competitors, apparently the parallel Google version of that type of content that should be "free and open to all to improve user experience" (when owned by a 3rd party) is a premium feature locked behind a registration wall (when owned by Google). There is a teaser for the cloaked information in the SERPs, & you are officially invited to sign into Google & join Google+ if you would like to view more.
Information wants to be free.
Unless it is Google's.
Then users want to be tracked and monetized.
Trademark Violations & Copyright Spam
A few years back Google gave themselves a pat on the back for ending relationships with "approximately 50,000 AdWords accounts for attempting to advertise counterfeit goods."
How the problem grew to that scale before being addressed went unasked.
Last year Google announced a relevancy signal based on DMCA complaints (while exempting YouTube) & even nuked an AdSense publisher for linking to a torrent of his own ebook. Google sees a stray link, makes a presumption. If they are wrong and you have access to media channels then the issue might get fixed. But if you lack the ability to get coverage, you're toast.
"To the extent [the study] suggests that Google ads are a major source of funds for major pirate sites, we believe it is mistaken," a Google spokesperson said. "Over the past several years, we've taken a leadership role in this fight. The complexity of online advertising has led some to conclude, incorrectly, that the mere presence of any Google code on a site means financial support from Google."
So Google intentionally avails their infrastructure to people they believe are engaged in criminal conduct (based on their own 50,000,000+ "valid" DMCA findings) and yet Google claims to have zero responsibility for those actions because Google may, in some cases, not get a direct taste in the revenues (only benefiting indirectly through increasing the operating costs of running a publishing business that is not partnered with Google).
The above linked LA Times article also had the following quote in it:
"When our ads were running unbeknownst to us on these pirate sites, we had a serious problem with that," said Gareth Hornberger, senior manager of global digital marketing for Levi's. "We reached out to our global ad agency of record, OMD, and immediately had them remove them.... We made a point, moving forward, that we really need to take steps to avoid having these problems again."
Through Google's reality warping efforts the ad network, the ad agency, the publisher, and the advertiser are all entirely unaccountable for their own efforts & revenue streams. And it is not like Google or the large ad agencies lack the resources to deal with these issues, as there is some serious cash in these types of deals: "WPP, Google's largest customer, increased its spending on Google by 25% in 2012, to about $2 billion."
These multi-billion dollar budgets are, apparently, insufficient to police the associated activities. The playbook: whenever anything is mentioned in the media, cite system complexity & other forms of plausible deniability. When that falls short, outsource the blame onto a contractor, service provider, or rogue partner. Contrasting that behavior, the common peasant webmaster must proactively monitor the rest of the web to ensure he stays in the good graces of his Lord Google.
You have to police your user generated content, or you risk your site being scored as spam. With that in mind, many big companies are now filing false DMCA takedown requests. Sites that receive DMCA complaints need to address them or risk being penalized. Businesses that send out bogus DMCA requests have no repercussions (until they are eventually hit with a class action lawsuit).
Remember how a while back Google mentioned their sophisticated duplication detection technology in YouTube?
There are over a million full movies on YouTube, according to YouTube!
The other thing that is outrageous is that if someone takes a video that is already on YouTube & re-uploads it again, Google will sometimes outrank the original video with the spam scrape-n-republish.
In the below search result you can see that our video (the one with the Excel spreadsheet open) is listed in the SERPs 3 times.
The version we uploaded has over a quarter million views, but ranks below the spam syndication version with under 100 views.
There are only a couple of ways to describe how the above can happen:
a negative ranking factor against our account
horrible relevancy algorithms
I realize I could DMCA them, but why should I have to bear that additional cost when Google allegedly automatically solved this problem years ago?
Unlike sacrosanct ad code, if someone points spam links at your site, you are responsible for cleaning it up. The absurdity of this contrast is only further highlighted by the post Google did about cleaning up spam links, where one of the examples they highlighted publicly as link spam was not a person's spam efforts, but rather a competitor's sabotage efforts that worked so well that they were even publicly cited as being outrageous link spam.
Well Mr Cutts, you have created a monster in Google now im afraid. Your video here http://www.youtube.com/watch?v=HWJUU-g5U_I says that with the new disavow tool makes negative SEO a mere nuisance.
Yet in your previous video about the disavow tool you say it can take months for links to be disavowed as google waits to crawl them???
In the meantime, the time lag makes it a little more than a "nuisance" don't you think?
Where Does This Leave Us?
As Google keeps adding more advanced filters to their search engines & folding more usage data into their relevancy algorithms, they are essentially gutting small online businesses. As Google guts them, it became important for Google to offer a counter-message of inclusion. A WSJ article mentioned that Google's "get your business online" initiative was more effective at manipulating governmental officials than their other lobbying efforts. And that opinion was sourced from Google's lobbyists:
Some Washington lobbyists, including those who have done work for Google, said that the Get Your Business Online effort has perhaps had more impact on federal lawmakers than any lobbying done on Capitol Hill.
Each of the additional junk time wasting tasks (eg: monitoring backlinks and proactively filtering them, managing inventory & cashflow while waiting for penalties tied to competitive sabotage to clear, filing DMCAs against Google properties when Google claims to have fixed the issue years ago, merging Google Places listings into Google+, etc.) Google foists onto webmasters who run small operations guarantees that a greater share of them will eventually get tripped up.
That algorithmic approach will also only feed into further "market for lemons" aspects as consultants skip the low margin, small budget, heavy lifting jobs and focus exclusively on servicing the companies which Google is biasing their "relevancy" algorithms to promote in order to taste a larger share of their ad budgets.
While chatting with a friend earlier today he had this to say:
Business is arbitrage. Any exchange not based in fraud is legitimate regardless of volume or medium. The mediums choose to delegitimize smaller players as a way to consolidate power.
Sadly most journalists are willfully ignorant of the above biases & literally nobody is comparing the above sorts of behaviors against each other. Most people inside the SEO industry also avoid the topic, because it is easier (& more profitable) to work with the elephants & attribute their success to your own efforts than it is to highlight the holes in the official propaganda.
I mean, just look at all the great work David Naylor did for a smaller client here & Google still gave him the ole "screw you" in spite of doing just about everything possible within his control.
The linkbuilding tactics used by the SEO company on datalabel.co.uk were low quality, but the links were completely removed before a Reconsideration Request was filed. The MD’s commenting and directory submissions were done in good faith as ways to spread the word about his business. Despite a lengthy explanation to Google, a well-documented clean-up process, and eventually disavowing every link to the site, the domain has never recovered and is still flagged as violating Google’s guidelines.
If you’ve removed or disavowed every link, and even rebuilt the site itself, where do you go from there?
Ahead of the Penguin update Google claimed that they wanted to "level the playing field." Now that Google shopping has converted into a pay-to-play format & Amazon.com has opted out of participation, Google once again claims that they want to "level the playing field":
“We are trying to provide a level playing field for retailers,” [Google’s VP of Shopping Sameer Samat] said, adding that there are some companies that have managed to do both tech and retail well. “How’s the rest of the retail world going to hit that bar?”
This quote is particularly disingenuous. For years you could win in search with a niche site by being more focused, having higher quality content & more in-depth reviews. But now even some fairly large sites are getting flushed down the ranking toilet while the biggest sites that syndicate their data displace them (see this graph for an example, as Pricegrabber is the primary source for Yahoo! Shopping).
Some may make the argument that a business is illegitimate if it is excessively focused on search and has few other distribution channels, but if building those other channels causes your own site to get filtered out as duplicate content, all you are doing is trading one risky relationship for another. When it comes time to re-negotiate the partnerships in a couple years look for the partner to take a pound of flesh on that deal.
How Google Drives Businesses to Amazon, eBay & Other Platforms
Google has spent much of the past couple years scrubbing smaller ecommerce sites off the web via the Panda & Penguin updates. Now if small online merchants want an opportunity to engage in Google's search ecosystem they have a couple options:
Ignore it: flat out ignore search until they build a huge brand (it's worth noting that branding is a higher level function & deep brand investment is too cost intensive for many small niche businesses)
Join The Circus: jump through an endless series of hoops, minimizing their product pages & re-configuring their shopping cart
Ignoring search isn't a lasting option, some of the PPC costs won't back out for smaller businesses that lack a broad catalog to do repeat sales against to lift lifetime customer value, & SEO is getting prohibitively expensive & uncertain. Given those options, a good number of small online merchants are now choosing a third path: selling through platforms like Amazon and eBay.
Operating an ecommerce store is hard. You have to deal with...
sourcing & managing inventory
technical / software issues
credit card fraud
Some services help to minimize the pain in many of these areas, but just like people do showrooming offline many also do it online. And one of the biggest incremental costs added to ecommerce over the past couple years has been SEO.
Google's Barrier to Entry Destroys the Diversity of Online Businesses
How are the smaller merchants to compete with larger ones? Well, for starters, there are some obvious points of influence in the market that Google could address...
time spent worrying about Penguin or Panda is time that is not spent on differentiating your offering or building new products & services
algorithmic brand emphasis will peak in the next year or two, as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets (recall how Google came up with their QDF algorithm only *after* Google Finance wasn't able to rank). At that point Google will push their own verticals more aggressively & launch aggressive public relations campaigns about helping small businesses succeed online.
Since that point in time Amazon has made so many great moves to combat Google:
All of that is on top of creating the Kindle Fire, gaining content streaming deals & their existing strong positions in books and e-commerce.
It is unsurprising to see Google mentioning the need to "level the playing field." They realize that Amazon benefits from many of the same network effects that Google does & now that Amazon is leveraging their position atop e-commerce to get into the online ads game, Google feels the need to mix things up.
Said IgnitionOne CEO Will Margiloff: “I’ve always believed that the best data is conversion data. Who has more conversion data in e-commerce than Amazon?”
“The truth is that they have a singular amount of data that nobody else can touch,” said Jonathan Adams, iCrossing’s U.S. media lead. “Search behavior is not the same as conversion data. These guys have been watching you buy things for … years.”
Amazon also has an opportunity to shift up the funnel, to go after demand-generation ad budgets (i.e. branding dollars) by using its audience data to package targeting segments. It's easy to imagine these segments as hybrids of Google’s intent-based audience pools and Facebook’s interest-based ones.
Google is in a sticky spot with product search. As they aim to increase monetization by displacing the organic result set they also lose what differentiates them from other online shopping options. If they just list big box then users will learn to pick their favorite and cut Google out of the loop. Many shoppers have been trained to start at Amazon.com even before Google began polluting their results with paid inclusion:
Research firm Forrester reported that 30 percent of U.S. online shoppers in the third quarter began researching their purchase on Amazon.com, compared with 13 percent who started on a search engine such as Google - a reversal from two years earlier when search engines were more popular starting points.
Who will Google partner with in their attempt to disrupt Amazon? Smaller businesses, larger corporations, or a mix of both? Can they succeed? Thoughts?
Well, it has been a fun year in search. Having had various sites that I thought were quality completely burnt by Google since they started with the Penguins, Pandas and other penalties, I thought I would try something that I KNEW Google would love... something, dare I say, that would be “bulletproof.” Something I could go to bed knowing would be there the next day in Google’s loving arms. Something I could focus on and be proud of.
Enter www.buymycar.com, an idea I had wanted to do for some time, where people list a car and it gets sent to a network of dealers who bid on it from a secure area. A simple idea but FAR from simple to implement.
Notes I made prior to launch to please Google and to give it a fighting chance were:
To make sure the content was of a high quality. I took this so seriously that we actually made a point of linking out to direct competition where it helped to do so. This was almost physically painful to do! But I thought I would start as I meant to go on. I remember paying the content guy that helps me, triple his normal fee to go above and beyond normal research for the articles in our "sell my car" and "value my car" sections.
To make the site socially likeable. I wanted something that people would share, and as such was willing to sacrifice profits in the short term to get it established.
To give Google the things it loves on-site. Speed testing, webmaster tools error checking (even got a little well done from Google for having no errors, bless), user testing, sitemaps for big G to find our content more easily, fast hosting, letting it have full access with analytics…
TO NEVER, EVER UNDER ANY CIRCUMSTANCES PAY FOR A LINK. Yes, I figured I would put all the investment into the site and content this time. If it went how I had hoped, perhaps I could find the holy grail where sites link to us willingly without a financial incentive! A grail I had been chasing for some years. Could people really link out without being paid? I had once heard a rumour it was possible and I wanted to investigate it...
Satisfied I had ticked all the boxes from hours of Matt Cutts videos and Google guidelines documents, I went to work and stopped SEO on all my smaller sites that were out of favour. I was enjoying building what I hoped would be a useful site and kicked myself for not having done so sooner. I also thanked Google mentally for being smart enough now to reward better sites.
Fast forward 4 months of testing and re-testing and signing up car dealers across the country, and I decided to do a cursory check to see if anyone had liked what I was building and linked to it. I put my site into ahrefs.com and, to my surprise, 13,208 sites had!! What was also nice was that all of them had used the anchor text “Buy My Car Scam” and had been so kind as to give me worldwide exposure on .ru, .br and .fr sites, in blog comments amongst others.
In seriousness, this was absolutely devastating to see.
A worried competitor had obviously decided I was a threat and moved to nip my site in the bud with Google, attacking it before it had even fully started. The live launch date was scheduled for January 7th, 2013! I was aware of negative SEO from other sites I had lost, but never in advance of a site having any traffic or rankings. Now I faced death by Google before the site had any rankings at all, on top of my site being cited as a scam across the Internet before it even launched!
My options were immediately as follows:
Go back and nuke the likely candidates in Google who had sabotaged me. Not really an option as I think it is the lowest of the low.
Start trying to contact 13,000+ link owners to ask for the links to be removed. Heavily invested in this project and with a deadline to reach, this was not an option. Also, XRumer, ScrapeBox or other automated tools could send another 13,000 just as easily within hours for me to deal with.
Disavow links with Google. Download all the links, disavow them all, and hope that Google would show me mercy within the few months Matt Cutts said it takes for them all to be processed.
Give up the project. Radical as this may sound, it did go through my mind, as organic traffic was a big part of my business plan. Thankfully I was talked out of it; it would have been “letting them win.”
I opted for number 3, the disavow method, but wondered what would happen if I kept being sent tens of thousands more links, and how a new site can actually have any protection from this. To set a site back months in its early stages is devastating to a new online business. To be in a climate where it is done prior to launch is ridiculous.
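For anyone weighing the same option, the disavow file itself is just plain text: one URL or `domain:` directive per line, with `#` lines as comments. A minimal sketch of collapsing a backlink export into a domain-level disavow file might look like this (the input URLs and the choice to disavow whole domains are illustrative assumptions):

```python
# Sketch: collapse a list of spam backlink URLs into domain-level
# disavow directives. Input URLs here are hypothetical examples.
from urllib.parse import urlparse

def build_disavow(urls):
    """Return disavow-file text: a comment header plus one
    'domain:' directive per unique linking host."""
    domains = sorted({urlparse(u).netloc for u in urls if u.strip()})
    lines = ["# Links identified as spam / negative SEO"]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines)

urls = [
    "http://spam-blog.example.ru/comments?id=1",
    "http://spam-blog.example.ru/comments?id=2",
    "http://link-farm.example.br/page.html",
]
print(build_disavow(urls))
```

Disavowing at the domain level rather than URL by URL matters when an attacker can mint thousands of new pages on the same splog overnight; one `domain:` line covers them all.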
Had I fired back at competitors as many suggested I do, there would be a knock-on effect that makes me wonder whether, in the months to come, everyone will be doing it to each other as a matter of routine. Having been in SEO for years I always knew it was possible to sabotage sites, but I never thought it would become so common, let alone happen before sites even ranked!
Robert Prime is a self employed web developer based in East Sussex, England. You can follow him on Twitter at @RobertPrime.
The above creates an interesting market dynamic...
the long established market leader can wither on the vine for being too focused on their niche market & not broadening out in ways that increase brand awareness
a larger site with loads of usage data can outsource the vertical and win based on the bleed of usage data across services & the ability to cross promote the site
the company investing in creating the architecture & baseline system that powers other sites continues to slide due to limited brand & a larger entity gets to displace the data source
Google then directly enters the market, further displacing some of the vertical players
The above puts Nextag's slide in perspective, but the problem is that they still have fixed costs to manage if they are going to maintain their editorial quality. Google can hand out badges for people willing to improve their product for free or give searchers a "Click any fact to locate it on the web. Click Wrong? to report a problem" but others who operated with such loose editorial standards would likely be labeled as a spammer of one stripe or another.
Most businesses have to earn the right to have exposure. They have to compete in the ecosystem, build awareness & so on. But Google can come in from the top of the market with an inferior product, displace the competition, economically starve them & eventually create a competitive product over time through a combination of incremental editorial improvements and gutting the traffic & cash flow to competing sites.
"The difference between life and death is remarkably small. And it’s not until you face it directly that you realize your own mortality." - Dustin Curtis
The above quote is every bit as much true for businesses as it is for people. Nothing more than a threat of a potential entry into a market can cut off the flow of investment & paralyze businesses in fear.
If you have stuff behind a paywall or pre-roll ads you might have "poor user experience metrics" that get you hit by Panda.
If you make your information semi-accessible to Googlebot you might get hit by Panda for having too much similar content.
If you are not YouTube & you have a bunch of stolen content on your site you might get hit by a copyright penalty.
If you leave your information fully accessible publicly you get to die by scrape-n-displace.
If you are more clever about information presentation, perhaps you get a hand penalty for cloaking.
None of those is a particularly desirable way to have your business die.
In addition to having a non-comprehensive database, Google Shopping also suffers from the problem of line extension (who buys video games from Staples?).
The bigger issue is that of general editorial integrity.
Are products in stock? Sometimes no.
It is also worth mentioning that some sites with "no product available" like Target or Toys R Us might also carry further Google AdSense ads.
Then there are also issues with things like ads that optimize for CTR which end up promoting things like software piracy or the academic versions of software (while lowering the perceived value of the software).
Over the past couple years Google has whacked loads of small ecommerce sites & the general justification is that they don't add enough that is unique, and that they don't deserve to rank as their inventory is unneeded duplication of Amazon & eBay. Many of these small businesses carry inventory and will be driven into insolvency by the sharp shifts in traffic. And while a small store is unneeded duplication, Google still allows syndicated press releases to rank great (and once again SEOs get blamed for Google being Google - see the quote-as-headline here).
Let's presume Google's anti-small business bias is legitimate & look at Google Shopping to see how well they performed in terms of providing a value add editorial function.
A couple days ago I was looking for a product that is somewhat hard to find due to seasonal shopping. It is often available at double or triple retail on sites like eBay, but Google Shopping helped me locate a smaller site that had it available at retail price. Good deal for me & maybe I was wrong about Google.
... then again ...
The site they sent me to had the following characteristics:
URL - not EMD & not a brand, broken English combination
logo - looks like I designed it AND like I was in a rush when I did it
about us page - no real information, no contact information (on an ecommerce site!!!), just some obscure stuff about "direct connection with China" & mention of business being 15 years old and having great success
age - domain is barely a year old & privacy registered
inbound links - none
product price - lower than everywhere else
product level page content - no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
the above repetition is to point out the absurdity of the formatting of the "content" of said page
site search - yet again an AdSense feed; searching for the product landing page that was listed in Google Shopping returned no results (so outside of paid inclusion & front/center placement, Google doesn't even feel this site is worth the resources to index)
checkout - requires account registration, includes captcha that never matches, hoping you will get frustrated & go back to earlier pages and click an ad
It actually took me a few minutes to figure it out, but the site was designed to look like a phishing site, with intent that perhaps you will click on an ad rather than trying to complete a purchase. The forced registration will eat your email & who knows what they will do with it, but you can never complete your purchase, making the site a complete waste of time.
Looking at the above spam site with some help of tools like NetComber it was apparent that this "merchant" also ran all sorts of scraper sites driven on scraping content from Yahoo! Answers & similar, with sites about Spanish + finance + health + shoes + hedge funds.
It is easy to make complaints about Nextag being a less than perfect user experience. But it is hard to argue that Google is any better. And when other companies have editorial costs that Google lacks (and would be labeled as spammers if they behaved like Google), over time many competing sites will die off due to those embedded cost structure advantages. Amazon has enough scale that people are willing to bypass Google's click circus & go directly to Amazon, but most other ecommerce players don't. The rest are largely forced to pay Google's rising rents until they can no longer afford to, then they just disappear.
Bonus Prize: Are You Up to The Google Shopping Test?
The first person who successfully solves this captcha wins a free month membership to our site.
If you haven’t received an “unnatural link” alert from Google, you don’t really need to use this tool. And even if you have received notification, Google are quick to point out that you may wish to pursue other avenues, such as approaching the site owner, first.
Webmasters have met with mixed success following this approach, of course. It's difficult to imagine many webmasters going to that trouble and expense when they can now upload a txt file to Google.
The disavow tool is a loaded gun.
If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.
Could the use of the tool be seen as an admission of guilt? Matt gives examples of "bad" webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.
Some webmasters have been victims of negative SEO. Some webmasters have had scrapers and autogen sites that steal their content, and then link back. There are legitimate reasons to disavow links. Hopefully, Google makes an effort to make such a distinction.
Not only is it difficult working out the links that may be a problem, it can be difficult getting a view of the entire link graph. There are various third party tools, including Google’s own Webmaster Central, but they aren’t exhaustive.
Matt mentioned that the link notification emails will provide examples of problem links, however this list won't be exhaustive. He also mentioned that you should pay attention to the more recent links, presumably because if you haven't received notification up until now, then older links weren't the problem. The issue with that assumption is that links that were once good can over time become bad:
That donation where you helped a good cause & were later mortified to see “online casino” and “discount cheap viagra” follow your example for purely “altruistic” reasons.
That clever comment on a well-linked PR7 page that is looking to cure erectile dysfunction 20 different ways in the comments.
Links from sources that were considered fine years ago & were later repositioned as spam (article banks anyone?)
Links from sites that were fine, but a number of other webmasters disavowed, turning a site that originally passed the sniff test into one that earns a second review revealing a sour stench.
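Spotting links like these before they pile up is mostly pattern-matching. A rough first-pass sketch, assuming a `(url, anchor text)` export from a backlink tool and a hypothetical list of risky terms:

```python
# Sketch: flag backlinks whose anchor text contains known-risky terms.
# The term list and link data are hypothetical illustrations; a real
# audit would use a much larger, maintained pattern set.
RISKY_TERMS = {"casino", "viagra", "payday", "scam"}

def flag_risky(links):
    """Return the (url, anchor) pairs whose anchor contains a risky term."""
    return [(url, anchor) for url, anchor in links
            if any(term in anchor.lower() for term in RISKY_TERMS)]

links = [
    ("http://blog.example.com/post", "useful car valuation guide"),
    ("http://spam.example.ru/x", "Buy My Car Scam"),
    ("http://forum.example.fr/y", "discount cheap viagra"),
]
for url, anchor in flag_risky(links):
    print(url, "->", anchor)
```

Anchor text is only one signal; links that were clean when placed (the donation and blog-comment cases above) need periodic re-checks of the linking page itself, not just the anchor.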
This could all get rather painful if webmasters start taking out links they perceive to be a problem, but aren’t. I imagine a few feet will get blasted off in the process.
Webmasters Asked, Google Gaveth
Webmasters have been demanding such a tool since the un-natural notifications started appearing. There is no question that removing established links can be as hard, if not harder, than getting the links in the first place. Generally speaking, the cheaper the link was to get the higher the cost of removal (relative to the original purchase price). If you are renting text link ads for $50 a month you can get them removed simply by not paying. But if you did a bulk submission to 5,000 high PR SEO friendly directories...best of luck with that!
It is time consuming. Firstly, there’s the overhead in working out which links to remove, as Google doesn’t specify them. Once a webmaster has made a list of the links she thinks might be a problem, she then needs to go through the tedious task of contacting each site and requesting that a link be taken down.
Even with the best will in the world, this is an overhead for the linking site, too. A legitimate site may wish to verify the identity of the person requesting the delink, as the delink request could come from a malicious competitor. Once identity has been established, the site owner must go to the trouble of making the change on their site.
This is not a big deal if a site owner only receives one request, but what if they receive multiple requests per day? It may not be unreasonable for a site owner to charge for the time taken to make the change, as such a change incurs a time cost. If the webmaster who has incurred a penalty has to remove many links, from multiple sites, then such costs could quickly mount. Taken to the (il)logical extremes, this link removal stuff is a big business. Not only are there a number of link removal services on the market, but one of our members was actually sued for linking to a site (when the person who was suing them paid to place the link!)
It’s hard to imagine this data not finding its way to the manual reviewers. If there are multiple instances of webmasters reporting paid links from a certain site, then Google have more than enough justification to take it out. This would be a cunning way around the “how do we know if a link is paid?” problem.
Webmasters will likely incorporate bad link checking into their daily activities. Monitoring inbound links wasn’t something you had to do in the past: good links helped, and those that weren’t good didn’t matter, as they didn’t affect ranking anyway. Now, webmasters may feel compelled to avoid an unnatural links warning by meticulously monitoring their inbound links and reporting anything that looks odd. Google haven’t been clear on whether they would take such action as a result - Matt suggests they just reclassify the link & see it as a strong suggestion to treat it as though it had a nofollow attribute - but no doubt there will be clarification as the tool beds in. Google has long used a tiered index structure & enough complaints might lower the tier of a page or site, cause its ability to pass trust to be blocked, or cause the site to be directly penalized.
This is also a way of reaffirming “the law”, as Google sees it. In many instances, it is no fault of the webmaster that rogue sites link up, yet the webmaster will feel compelled to jump through Google’s hoops. Google sets the rules of the game. If you want to play, then you play by their rules, and recognize their authority. Matt Cutts suggested:
we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business.
Left unsaid in the above is most people don't have access to aggregate link data while they surf the web, most modern systems of justice are based on the presumption of innocence rather than guilt, and most rational people don't presume that a site that is linked to is somehow shady simply for being linked to.
If the KKK links to Matt's blog tomorrow that doesn't imply anything about Matt. And when Google gets featured in an InfoWars article it doesn't mean that Google desires that link or coverage. Many sketchy sites link to Adobe (for their flash player) or sites like Disney & Google for people who are not old enough to view them or such. Those links do not indicate anything negative about the sites being linked into. However, as stated above, search is Google's monopoly to do with as they please.
On the positive side, if Google really do want sites to conform to certain patterns, and will reward them for doing so by letting them out of jail, then this is yet another way to clean up the SERPs. They get the webmaster onside, and that webmaster does link classification work for them for free.
Large companies can likely safely ignore much of the fear-first approach to search regulation. And when things blow up they can shift the blame onto a rogue anonymous contractor of sorts. Smaller webmasters, meanwhile, walk on eggshells.
When the government wanted to regulate copyright issues Google claimed it would be too expensive and would kill innovation at small start-ups. Google then drafted their own copyright policy from which they themselves are exempt. And now small businesses not only need to bear that cost but also need to police their link profiles, even as competitors can use Fiverr, ScrapeBox, splog link networks & various other sources to drip a constant stream of low cost sludge in their direction.
Now more than ever, status is important.
No doubt you’ve thought of a few. A couple thoughts - not that we advocate them, but realize they will happen:
Intentionally build spam links to yourself & then disavow them (in order to make your profile look larger than it is & to ensure that competitor who follows everything you do - but lacks access to your disavow data - walks into a penalty).
Find sites that link to competitors and leave loads of comments for the competitor on them, hoping that the competitor blocks the domain as a whole.
Find sites that link to competitors & buy links from them into a variety of other websites & then disavow from multiple accounts.
Get a competitor some link warnings & watch them push to get some of their own clean "unauthorized" links removed.
If a malicious webmaster wanted to get a target site in the bad books, they could post obvious comment spam - pointing at their site, and other sites. If this activity doesn’t result in an unnatural linking notification, then all good. It’s a test of how Google values that domain. If it does result in an unnatural link notification, the webmaster could then disavow links from that site. Other webmasters will likely do the same. Result: the target site may get taken out.
To avoid this sort of hit, pay close attention to your comment moderation.
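A crude moderation heuristic along those lines: hold for review any comment that links outside an approved set of domains. The allowlist and sample comments below are hypothetical:

```python
# Sketch: hold any comment containing links to domains outside an
# approved allowlist. Allowlist and comments are illustrative only.
import re

ALLOWED_DOMAINS = {"buymycar.com", "example.com"}
URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def needs_moderation(comment):
    """True if the comment links to any domain not on the allowlist."""
    hosts = {h.lower().removeprefix("www.") for h in URL_RE.findall(comment)}
    return any(h not in ALLOWED_DOMAINS for h in hosts)

print(needs_moderation("Great article, thanks!"))                   # no links: False
print(needs_moderation("Cheap deals at http://spam.example.ru/x"))  # off-list: True
```

This will not stop a determined spammer, but it keeps drive-by automated comment links from being published (and later disavowed against you) without a human look first.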
Please add your own to the comments! :) Gotchas, that is, not rogue links.