Google: "As We Say, NOT As We Do"

Jan 7th

Due to heavy lobbying, the FTC's investigation into Google's business practices has ended with Google taking few marks or bruises. If the EU investigation ends with similar results, you can count on Google growing more anti-competitive in their practices:

Google is flat-out lying. They’ve modified their code to break Google Maps on Windows Phones. It worked before, but with the ‘redirect,’ it no longer works.

We are only a couple days into the new year, but there have already been numerous absurdities highlighted, in addition to the FTC decision & Google blocking Windows Phones.

When is Cloaking, Cloaking?

Don't ask Larry Page:

Mr. Page, the CEO, about a year ago pushed the idea of requiring Google users to sign on to their Google+ accounts simply to view reviews of businesses, the people say. Google executives persuaded him not to pursue that strategy, fearing it would irritate Google search users, the people say.
...
Links to Google+ also appear in Google search-engine results involving people and brands that have set up a Google+ account.

Other websites can't hardcode their own listings into the search results. And anyone who widely showed one version of a page to Googlebot while cloaking it from users would stand a good chance of being penalized for spam. They would risk both a manual intervention & being hit by Panda based on poor engagement metrics.
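To make the double standard concrete, here is a minimal sketch of the kind of user-agent cloaking that earns ordinary sites a penalty. The Flask route, bot signature list, and page content are hypothetical illustrations, nothing any real site publishes:

    # Minimal sketch of user-agent cloaking: the crawler sees everything,
    # human visitors hit a registration wall. All names are hypothetical.
    from flask import Flask, request

    app = Flask(__name__)

    BOT_SIGNATURES = ("Googlebot",)  # crawlers to treat differently

    @app.route("/business-reviews")
    def reviews():
        ua = request.headers.get("User-Agent", "")
        if any(bot in ua for bot in BOT_SIGNATURES):
            # full content served to the search engine crawler...
            return "<p>Every review, fully readable by Googlebot.</p>"
        # ...while human visitors only get a teaser & a signup prompt
        return "<p>Sign in to view reviews.</p>"

    if __name__ == "__main__":
        app.run()

Do this at scale as an independent webmaster and you invite a manual action; serve a teaser in the SERPs with the full content behind a Google+ login and it is apparently a feature.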

Recall that a big portion of the complaint about Google's business practices was their scrape-n-displace modus operandi. As part of the FTC agreement, companies are able to opt out of being scraped into some of Google's vertical offerings, but that still doesn't prevent their content from making its way into the knowledge graph.

Now that Google is no longer free to scrape-n-displace competitors, the parallel Google version of that type of content, which was deemed "free and open to all to improve user experience" when owned by a 3rd party, is apparently a premium feature locked behind a registration wall when owned by Google. There is a teaser for the cloaked information in the SERPs, & you are officially invited to sign into Google & join Google+ if you would like to view more.

Information wants to be free.

Unless it is Google's.

Then users want to be tracked and monetized.

Trademark Violations & Copyright Spam

A few years back Google gave themselves a pat on the back for ending relationships with "approximately 50,000 AdWords accounts for attempting to advertise counterfeit goods."

How the problem grew to that scale before being addressed went unasked.

Last year Google announced a relevancy signal based on DMCA complaints (while exempting YouTube) & even nuked an AdSense publisher for linking to a torrent of his own ebook. Google sees a stray link, makes a presumption. If they are wrong and you have access to media channels then the issue might get fixed. But if you lack the ability to get coverage, you're toast.

Years ago a study highlighted how Google's AdSense & DoubleClick were the monetization engine for stolen content. Recently some USC researchers came to the same conclusion by looking at Google's list of domains that saw the most DMCA requests against them. Upon hearing of the recent study, Google's shady public relations team stated:

"To the extent [the study] suggests that Google ads are a major source of funds for major pirate sites, we believe it is mistaken," a Google spokesperson said. "Over the past several years, we've taken a leadership role in this fight. The complexity of online advertising has led some to conclude, incorrectly, that the mere presence of any Google code on a site means financial support from Google."

So Google intentionally makes their infrastructure available to people they believe are engaged in criminal conduct (based on Google's own 50,000,000+ "valid" DMCA findings), and yet Google claims zero responsibility for those actions because Google may, in some cases, not take a direct cut of the revenues (benefiting only indirectly, through increasing the operating costs of running a publishing business that is not partnered with Google).

A smaller company engaged in a similar operation might end up getting charged for the conduct of their partners. However, when Google's ad code is in the page you are wrong to assume any relationship.

The above linked LA Times article also had the following quote in it:

"When our ads were running unbeknownst to us on these pirate sites, we had a serious problem with that," said Gareth Hornberger, senior manager of global digital marketing for Levi's. "We reached out to our global ad agency of record, OMD, and immediately had them remove them.... We made a point, moving forward, that we really need to take steps to avoid having these problems again."

Through Google's reality-warping efforts the ad network, the ad agency, the publisher, and the advertiser are all entirely unaccountable for their own efforts & revenue streams. And it is not as though Google or the large ad agencies lack the resources to deal with these issues; there is serious cash in these types of deals: "WPP, Google's largest customer, increased its spending on Google by 25% in 2012, to about $2 billion."

These multi-billion dollar budgets are, apparently, insufficient to police the associated activities. The playbook: whenever anything is mentioned in the media, cite system complexity & other forms of plausible deniability. When that falls short, outsource the blame onto a contractor, service provider, or rogue partner. Contrasting that behavior, the common peasant webmaster must proactively monitor the rest of the web to ensure he stays in the good graces of his Lord Google.

DMCA Spam

You have to police your user generated content, or you risk your site being scored as spam. With that in mind, many big companies are now filing false DMCA takedown requests. Sites that receive DMCA complaints need to address them or risk being penalized. Businesses that send out bogus DMCA requests have no repercussions (until they are eventually hit with a class action lawsuit).

Remember how a while back Google mentioned their sophisticated duplication detection technology in YouTube?

There are over a million full movies on YouTube, according to YouTube!

The other thing that is outrageous is that if someone takes a video that is already on YouTube & re-uploads it, Google will sometimes outrank the original video with the spam scrape-n-republish.

In the below search result you can see that our video (the one with the Excel spreadsheet open) is listed in the SERPs 3 times.

The version we uploaded has over a quarter million views, but ranks below the spam syndication version with under 100 views.

There are only 3 ways to describe how the above can happen:

  • a negative ranking factor against our account
  • horrible relevancy algorithms
  • idiocy

I realize I could DMCA them, but why should I have to bear that additional cost when Google allegedly automatically solved this problem years ago?

Link Spam

Unlike sacrosanct ad code, if someone points spam links at your site, you are responsible for cleaning them up. The absurdity of this contrast was only highlighted further by Google's own post about cleaning up spam links: one of the examples they publicly held up as link spam was not the site owner's handiwork, but a competitor's sabotage effort that worked so well it got cited as outrageous link spam.

It has been less than 3 months since Google launched their disavow tool, but since its launch some webmasters are already engaging in pre-negative SEO. That post had an interesting comment on it:

Well Mr Cutts, you have created a monster in Google now I'm afraid. Your video here http://www.youtube.com/watch?v=HWJUU-g5U_I says that the new disavow tool makes negative SEO a mere nuisance.
Yet in your previous video about the disavow tool you say it can take months for links to be disavowed as Google waits to crawl them???
In the meantime, the time lag makes it a little more than a "nuisance" don't you think?

Where Does This Leave Us?

As Google keeps adding more advanced filters to their search engine & folding more usage data into their relevancy algorithms, they are essentially gutting small online businesses. As Google guts them, it becomes important to offer a counter-message of inclusion. A WSJ article mentioned that Google's "get your business online" initiative was more effective at manipulating governmental officials than their other lobbying efforts. And that opinion was sourced from Google's own lobbyists:

Some Washington lobbyists, including those who have done work for Google, said that the Get Your Business Online effort has perhaps had more impact on federal lawmakers than any lobbying done on Capitol Hill.

Each of the additional junk time-wasting tasks Google foists onto webmasters who run small operations (eg: monitoring backlinks and proactively filtering them, managing inventory & cashflow while waiting for penalties tied to competitive sabotage to clear, filing DMCAs against Google properties when Google claims to have fixed the issue years ago, merging Google Places listings into Google+, etc.) guarantees that a greater share of them will eventually get tripped up.

Not only will the algorithms be out of their reach, but so will consulting.

That algorithmic approach will also feed further "market for lemons" dynamics, as consultants skip the low-margin, small-budget, heavy-lifting jobs and focus exclusively on servicing the companies Google is already biasing their "relevancy" algorithms to promote, in order to taste a larger share of those companies' ad budgets.

While chatting with a friend earlier today he had this to say:

Business is arbitrage. Any exchange not based in fraud is legitimate regardless of volume or medium. The mediums choose to delegitimize smaller players as a way to consolidate power.

Sadly most journalists are willfully ignorant of the above biases & literally nobody is comparing the above sorts of behaviors against each other. Most people inside the SEO industry also avoid the topic, because it is easier (& more profitable) to work with the elephants & attribute their success to your own efforts than it is to highlight the holes in the official propaganda.

I mean, just look at all the great work David Naylor did for a smaller client here. Google still gave him the ol' "screw you" in spite of his doing just about everything possible within his control.

The linkbuilding tactics used by the SEO company on datalabel.co.uk were low quality, but the links were completely removed before a Reconsideration Request was filed. The MD’s commenting and directory submissions were done in good faith as ways to spread the word about his business. Despite a lengthy explanation to Google, a well-documented clean-up process, and eventually disavowing every link to the site, the domain has never recovered and still violates Google’s guidelines.

If you’ve removed or disavowed every link, and even rebuilt the site itself, where do you go from there?

Is Google Concerned About Amazon Eating Their Lunch?

Dec 26th

Leveling The Playing Field

When monopolies state that they want to "level the playing field" it should be cause for concern.

Groupon is a great example of how this works. After Groupon turned down Google's buyout offer, Google responded by...

The same deal is slowly progressing in the cell phone market: “we are using compatibility as a club to make them do things we want."

Leveling Shopping Search

Ahead of the Penguin update Google claimed that they wanted to "level the playing field." Now that Google shopping has converted into a pay-to-play format & Amazon.com has opted out of participation, Google once again claims that they want to "level the playing field":

“We are trying to provide a level playing field for retailers,” [Google’s VP of Shopping Sameer Samat] said, adding that there are some companies that have managed to do both tech and retail well. “How’s the rest of the retail world going to hit that bar?”

This quote is particularly disingenuous. For years you could win in search with a niche site by being more focused, having higher quality content & more in-depth reviews. But now even some fairly large sites are getting flushed down the ranking toilet while the biggest sites that syndicate their data displace them (see this graph for an example, as Pricegrabber is the primary source for Yahoo! Shopping).

Some may make the argument that a business is illegitimate if it is excessively focused on search and has few other distribution channels, but if building those other channels causes your own site to get filtered out as duplicate content, all you are doing is trading one risky relationship for another. When it comes time to re-negotiate the partnerships in a couple years look for the partner to take a pound of flesh on that deal.

How Google Drives Businesses to Amazon, eBay & Other Platforms

Google has spent much of the past couple years scrubbing smaller ecommerce sites off the web via the Panda & Penguin updates. Now if small online merchants want an opportunity to engage in Google's search ecosystem they have a couple options:

  • Ignore it: flat out ignore search until they build a huge brand (it's worth noting that branding is a higher level function & deep brand investment is too cost intensive for many small niche businesses)
  • Join The Circus: jump through an endless series of hoops, minimizing their product pages & re-configuring their shopping cart
  • PPC: operate at or slightly above the level of a non-functional thin phishing website & pay Google by the click via their new paid inclusion program
  • Ride on a 3rd Party Platform: sell on one of the larger platforms that Google is biasing their algorithms toward & hope that the platform doesn't cut you out of the loop.

Ignoring search isn't a lasting option; some of the PPC costs won't back out for smaller businesses that lack a broad catalog to drive repeat sales against & lift lifetime customer value; and SEO is getting prohibitively expensive & uncertain. Given those constraints, a good number of small online merchants are now choosing #4.

Operating an ecommerce store is hard. You have to deal with...

  • sourcing & managing inventory
  • managing employees
  • technical / software issues
  • content creation
  • marketing
  • credit card fraud
  • customer service
  • shipping

Some services help to minimize the pain in many of these areas, but just as people showroom offline, many also do it online. And one of the biggest incremental costs added to ecommerce over the past couple years has been SEO.

Google's Barrier to Entry Destroys the Diversity of Online Businesses

How are the smaller merchants to compete with larger ones? Well, for starters, there are some obvious points of influence in the market that Google could address...

  • time spent worrying about Penguin or Panda is time that is not spent on differentiating your offering or building new products & services
  • time spent modifying the source code of your shopping cart to minimize pagecount & consolidate products (and various other "learn PHP on the side" work) is not spent on creating more in-depth editorial
  • time switching carts to one that has the newly needed features (for GoogleBot and ONLY GoogleBot) & aligning your redirects is not spent on outreach and media relations
  • time spent disavowing links that a competitor built into your site is not spent on building new partnerships & other distribution channels outside of search

Ecosystem instability taxes small businesses more than larger ones as they...

The presumption that size = quality is false, a fact Google only recognizes when it hits their own bottom line.

Anybody Could Have Seen This Coming

About a half-year ago we had a blog post about 'Branding & The Cycle' which stated:

algorithmically brand emphasis will peak in the next year or two as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets. (Recall how Google came up with their QDF algorithm only *after* Google Finance wasn't able to rank). At that point in time Google will push their own verticals more aggressively & launch some aggressive public relations campaigns about helping small businesses succeed online.

Since that point in time Amazon has made so many great moves to combat Google:

All of that is on top of creating the Kindle Fire, gaining content streaming deals & their existing strong positions in books and e-commerce.

It is unsurprising to see Google mentioning the need to "level the playing field." They realize that Amazon benefits from many of the same network effects that Google does & now that Amazon is leveraging their position atop e-commerce to get into the online ads game, Google feels the need to mix things up.

If Google was worried about book searches happening on Amazon, how much more worried might they be about a distributed ad network built on Amazon's data?

Said IgnitionOne CEO Will Margiloff: “I’ve always believed that the best data is conversion data. Who has more conversion data in e-commerce than Amazon?”

“The truth is that they have a singular amount of data that nobody else can touch,” said Jonathan Adams, iCrossing’s U.S. media lead. “Search behavior is not the same as conversion data. These guys have been watching you buy things for … years.”
...
Amazon also has an opportunity to shift up the funnel, to go after demand-generation ad budgets (i.e. branding dollars) by using its audience data to package targeting segments. It's easy to imagine these segments as hybrids of Google’s intent-based audience pools and Facebook’s interest-based ones.

Google is in a sticky spot with product search. As they aim to increase monetization by displacing the organic result set, they also lose what differentiates them from other online shopping options. If they just list big box retailers, users will learn to pick their favorite and cut Google out of the loop. Many shoppers have been trained to start at Amazon.com even before Google began polluting their results with paid inclusion:

Research firm Forrester reported that 30 percent of U.S. online shoppers in the third quarter began researching their purchase on Amazon.com, compared with 13 percent who started on a search engine such as Google - a reversal from two years earlier when search engines were more popular starting points.

Who will Google partner with in their attempt to disrupt Amazon? Smaller businesses, larger corporations, or a mix of both? Can they succeed? Thoughts?

How to Obfuscate And Misdirect an Algo Update

Dec 16th

Sharing is caring! Please share :)

Embed code is here.

If you find the infographic a bit hard to read due to font size, a wider version is located here.

[Infographic: Google Algo Changes]

The 'Scam' Site That Never Launched

Dec 14th

(A case study in being PRE negatively seo'ed)

Well, it has been a fun year in search. Having had various sites that I thought were quality completely burnt by Google since they started with the Penguins and Pandas and other penalties, I thought I would try something that I KNEW Google would love... something, dare I say, "bulletproof." Something I could go to bed knowing would be there the next day, in Google's loving arms. Something I could focus on and be proud of.

Enter www.buymycar.com, an idea I had wanted to do for some time, where people list a car and it gets sent to a network of dealers who bid on it from a secure area. A simple idea but FAR from simple to implement.

Notes I made prior to launch to please Google and to give it a fighting chance were:

  1. To have an actual service and not to be an affiliate. Google crushed my affiliate sites and we know they are not fond of them, as they want to be the only affiliate, I think.
  2. To make sure the content was of a high quality. I took this so seriously that we actually made a point of linking out to direct competition where it helped to do so. This was almost physically painful to do! But I thought I would start as I meant to go on. I remember paying the content guy that helps me triple his normal fee to go above and beyond normal research for the articles in our "sell my car" and "value my car" sections.
  3. To make the site socially likeable. I wanted something that people would share and as such to sacrifice profits in the short term to get it established.
  4. To give Google the things it loves on-site. Speed testing, webmaster tools error checking (even got a little well done from Google for having no errors, bless), user testing, sitemaps for big G to find our content more easily, fast hosting, letting it have full access with analytics…
  5. TO NEVER, EVER UNDER ANY CIRCUMSTANCES PAY FOR A LINK. Yes, I figured I would put all the investment into the site and content this time. If it went how I had hoped, perhaps I could find the holy grail where sites link to us willingly without a financial incentive! A grail I had been chasing for some years. Could people really link out without being paid? I had once heard a rumour it was possible and I wanted to investigate it......

Satisfied I had ticked all the boxes from hours of Matt Cutts videos & Google guidelines documents, I went to work and stopped SEO on all my smaller sites that were out of favour. I was enjoying building what I hoped would be a useful site and kicked myself for not having done so sooner. I also thanked Google mentally for being smart enough now to reward better sites.

Fast forward 4 months of testing and retesting and signing up car dealers across the country, and I decided to do a cursory check to see if anyone had liked what I was building and linked to it. I put my site into ahrefs.com and, to my surprise, 13,208 sites had!! What was also nice was that all of them had used the anchor text "Buy My Car Scam" and had been so kind as to give me worldwide exposure on .ru, .br and .fr sites in blog comments, amongst others.

In seriousness, this was absolutely devastating to see.

A worried competitor had obviously decided I was a threat and set out to nip my site in the bud with Google, attacking it before it had even fully started. The live launch date was scheduled for January 7th, 2013! I was aware of negative SEO from other sites I had lost, but not in advance of actually having any traffic or rankings. Now I had death by Google to look forward to before the site had any rankings, on top of my site being cited as a scam across the Internet before it launched!

My options were immediately as follows:

  1. Go back and nuke the likely candidates in Google who had sabotaged me. Not really an option as I think it is the lowest of the low.
  2. Start trying to contact 13,000+ link owners to ask for the links to be removed. Since I was heavily invested in this project anyway and had a deadline to reach, this was not an option. Also, Xrumer, Scrapebox or other automated tools could send another 13,000 just as easily in hours for me to deal with.
  3. Disavow links with Google. To download all the links, disavow them all and hope that Google would show me mercy in the few months Matt Cutts said it takes to get to them all removed.
  4. Give up the project. Radical as this may sound, it did go through my mind as organic traffic was a big part of my business plan. Thankfully I was talked out of it and it would be "letting them win."

I opted for number 3, the disavow method, but wondered what would happen if I kept being sent tens of thousands more links, and how a new site can actually have any protection from this. To set a site back months in its early stages is devastating to a new online business. To be in a climate where it is done prior to launch is ridiculous.

Had I fired back at future competitors as many suggested I do, there would be a knock-on effect that makes me wonder if, in the months to come, everyone will be doing it to each other as routine. Having been in SEO for years, I always knew it was possible to sabotage sites, but I never thought it would become so common, let alone happen before sites even ranked!


Robert Prime is a self employed web developer based in East Sussex, England. You can follow him on Twitter at @RobertPrime.

Counterspin on Shopping Search: Shady Paid Inclusion

Nov 28th

Bing caused a big stink today when they unveiled Scroogled, a site that highlights how Google Shopping has gone paid-inclusion only. A couple weeks ago Google announced that they would be taking their controversial business model global, in spite of it being "a mess."

Nextag has long been critical of Google's shifts on the shopping search front. Are their complaints legitimate, or are they just whiners?

Data, More Reliable Than Spin

Nothing beats data, so let's start with that.

This is what Nextag's search exposure has done over the past few years, according to SearchMetrics.

If Google did that to any large & politically connected company, you can bet regulators would have already taken action against Google, rather than currently negotiating with them.

What's more telling is how some other sites in the shopping search vertical have performed.

PriceGrabber, another player in the shopping search market, has also slowly drifted downward (though at a much slower rate).

One of the few shopping search engines that has seen a big lift over this time period was Yahoo! Shopping.

What is interesting about that rise is that Yahoo! outsourced substantially all of their shopping search product to PriceGrabber.

A Self-Destructing Market Dynamic

The above creates an interesting market dynamic...

  • the long established market leader can wither on the vine for being too focused on their niche market & not broadening out in ways that increase brand awareness
  • a larger site with loads of usage data can outsource the vertical and win based on the bleed of usage data across services & the ability to cross promote the site
  • the company investing in creating the architecture & baseline system that powers other sites continues to slide due to limited brand & a larger entity gets to displace the data source
  • Google then directly enters the market, further displacing some of the vertical players

The above puts Nextag's slide in perspective, but the problem is that they still have fixed costs to manage if they are going to maintain their editorial quality. Google can hand out badges to people willing to improve Google's product for free, or give searchers a "Click any fact to locate it on the web. Click Wrong? to report a problem," but others who operated with such loose editorial standards would likely be labeled as spammers of one stripe or another.

Scrape-N-Displace

Most businesses have to earn the right to have exposure. They have to compete in the ecosystem, build awareness & so on. But Google can come in from the top of the market with an inferior product, displace the competition, economically starve them & eventually create a competitive product over time through a combination of incremental editorial improvements and gutting the traffic & cash flow of competing sites.

"The difference between life and death is remarkably small. And it’s not until you face it directly that you realize your own mortality." - Dustin Curtis

The above quote is every bit as much true for businesses as it is for people. Nothing more than a threat of a potential entry into a market can cut off the flow of investment & paralyze businesses in fear.

  • If you have stuff behind a paywall or pre-roll ads you might have "poor user experience metrics" that get you hit by Panda.
  • If you make your information semi-accessible to Googlebot you might get hit by Panda for having too much similar content.
  • If you are not YouTube & you have a bunch of stolen content on your site you might get hit by a copyright penalty.
  • If you leave your information fully accessible publicly you get to die by scrape-n-displace.
  • If you are more clever about information presentation, perhaps you get a hand penalty for cloaking.

None of those is a particularly desirable way to have your business die.

Editorial Integrity

In addition to having a non-comprehensive database, Google Shopping also suffers from the problem of line extension (who buys video games from Staples?).

The bigger issue is that of general editorial integrity.

Are products in stock? Sometimes no.

It is also worth mentioning that some sites with "no product available" like Target or Toys R Us might also carry further Google AdSense ads.

Then there are also issues with things like ads that optimize for CTR which end up promoting things like software piracy or the academic versions of software (while lowering the perceived value of the software).

Over the past couple years Google has whacked loads of small ecommerce sites, with the general justification being that they don't add enough that is unique & don't deserve to rank, as their inventory is unneeded duplication of Amazon & eBay. Many of these small businesses carry inventory and will be driven into insolvency by the sharp shifts in traffic. And while a small store is deemed unneeded duplication, Google still allows syndicated press releases to rank great (and once again SEOs get blamed for Google being Google - see the quote-as-headline here).

Let's presume Google's anti-small business bias is legitimate & look at Google Shopping to see how well they performed in terms of providing a value add editorial function.

A couple days ago I was looking for a product that is somewhat hard to find due to seasonal shopping. It is often available at double or triple retail on sites like eBay, but Google Shopping helped me locate a smaller site that had it available at retail price. Good deal for me & maybe I was wrong about Google.

... then again ...

The site they sent me to had the following characteristics:

  • URL - not EMD & not a brand, broken English combination
  • logo - looks like I designed it AND like I was in a rush when I did it
  • about us page - no real information, no contact information (on an ecommerce site!!!), just some obscure stuff about "direct connection with China" & mention of business being 15 years old and having great success
  • age - domain is barely a year old & privacy registered
  • inbound links - none
  • product price - lower than everywhere else
  • product level page content - no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
    • no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
    • no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
    • no reviews, thin scraped editorial, editorial repeats itself to fill up more space, 3 adsense blocks in the content area of the page
    • the above repetition is to point out the absurdity of the formatting of the "content" of said page
  • site search - yet again the adsense feed, searching for the product landing page that was in Google Shopping I get no results (so outside of paid inclusion & front/center placement, Google doesn't even feel this site is worth wasting the resources to index)
  • checkout - requires account registration, includes captcha that never matches, hoping you will get frustrated & go back to earlier pages and click an ad

It actually took me a few minutes to figure it out, but the site was designed to look like a phishing site, with intent that perhaps you will click on an ad rather than trying to complete a purchase. The forced registration will eat your email & who knows what they will do with it, but you can never complete your purchase, making the site a complete waste of time.

Looking at the above spam site with the help of tools like NetComber, it was apparent that this "merchant" also ran all sorts of scraper sites driven on scraping content from Yahoo! Answers & similar, with sites about Spanish + finance + health + shoes + hedge funds.

It is easy to make complaints about Nextag being a less than perfect user experience. But it is hard to argue that Google is any better. And when other companies have editorial costs that Google lacks (and would be labeled as spammers if they behaved like Google), over time many competing sites will die off due to the embedded cost-structure advantages. Amazon has enough scale that people are willing to bypass Google's click circus & go directly to Amazon, but most other ecommerce players don't. The rest are largely forced to pay Google's rising rents until they can no longer afford to, then they just disappear.

Bonus Prize: Are You Up to The Google Shopping Test?

The first person who successfully solves this captcha wins a free month membership to our site.

Google Disavow Tool

Oct 18th

Google launched a disavow links tool. Webmasters who want to tell Google which links they don’t want counted can now do so by uploading a list of links in Google Webmaster Tools.

If you haven’t received an “unnatural link” alert from Google, you don’t really need to use this tool. And even if you have received notification, Google are quick to point out that you may wish to pursue other avenues, such as approaching the site owner, first.

Webmasters have met with mixed success following this approach, of course. It's difficult to imagine many webmasters going to that trouble and expense when they can now upload a txt file to Google.

Careful, Now

The disavow tool is a loaded gun.

If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.
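For reference, the file itself is plain text, one entry per line, in the format Google documented at launch; the domains and URLs below are hypothetical examples:

    # Lines starting with "#" are comments & are ignored by Google.
    # Disavow a single page:
    http://spam.example.com/blog/comments.html
    # Disavow every link from an entire domain:
    domain:shadydirectory.example.com

Get a domain: line wrong and you have disavowed every link from that domain, which is exactly the loaded-gun scenario described above.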

Could the use of the tool be seen as an admission of guilt? Matt gives examples of "bad" webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.

Some webmasters have been victims of negative SEO. Some webmasters have had scrapers and autogen sites that steal their content, and then link back. There are legitimate reasons to disavow links. Hopefully, Google makes an effort to make such a distinction.

One wonders why Google simply don't discount the links they already deem to be “bad”? Why the need for the webmaster to jump through hoops? The webmaster is still left to guess which links are “bad”, of course.

Not only is it difficult working out the links that may be a problem, it can be difficult getting a view of the entire link graph. There are various third party tools, including Google’s own Webmaster Central, but they aren’t exhaustive.

Matt mentioned that the link notification emails will provide examples of problem links, however this list won't be exhaustive. He also mentioned that you should pay attention to the more recent links, presumably because if you haven't received notification up until now, then older links weren't the problem. The issue with that assumption is that links that were once good can become bad over time:

  • That donation where you helped a good cause & were later mortified to find "online casino" and "discount cheap viagra" had followed your lead for purely altruistic reasons.
  • That clever comment on a well-linked PR7 page that is looking to cure erectile dysfunction 20 different ways in the comments.
  • Links from sources that were considered fine years ago & were later repositioned as spam (article banks anyone?)
  • Links from sites that were fine, but a number of other webmasters disavowed, turning a site that originally passed the sniff test into one that earns a second review revealing a sour stench.

This could all get rather painful if webmasters start taking out links they perceive to be a problem, but aren’t. I imagine a few feet will get blasted off in the process.

Webmasters Asked, Google Gaveth

Webmasters have been demanding such a tool since the un-natural notifications started appearing. There is no question that removing established links can be as hard, if not harder, than getting the links in the first place. Generally speaking, the cheaper the link was to get the higher the cost of removal (relative to the original purchase price). If you are renting text link ads for $50 a month you can get them removed simply by not paying. But if you did a bulk submission to 5,000 high PR SEO friendly directories...best of luck with that!

It is time consuming. Firstly, there's the overhead of working out which links to remove, as Google doesn't specify them. Once a webmaster has made a list of the links she thinks might be a problem, she then needs to go through the tedious task of contacting each site and requesting that a link be taken down.

Even with the best will in the world, this is an overhead for the linking site, too. A legitimate site may wish to verify the identity of the person requesting the delink, as the delink request could come from a malicious competitor. Once identity has been established, the site owner must go to the trouble of making the change on their site.

This is not a big deal if a site owner only receives one request, but what if they receive multiple requests per day? It may not be unreasonable for a site owner to charge for the time taken to make the change, as such a change incurs a time cost. If the webmaster who has incurred a penalty has to remove many links, from multiple sites, then such costs could quickly mount. Taken to the (il)logical extremes, this link removal stuff is a big business. Not only are there a number of link removal services on the market, but one of our members was actually sued for linking to a site (when the person who was suing them paid to place the link!)

What’s In It For Google?

Webmasters now face the prisoner's dilemma and are doing Google’s job for them.

It's hard to imagine this data not finding its way to the manual reviewers. If there are multiple instances of webmasters reporting paid links from a certain site, then Google have more than enough justification to take it out. This would be a cunning way around the "how do we know if a link is paid?" problem.

Webmasters will likely incorporate bad-link checking into their daily activities. Monitoring inbound links wasn't something you had to do in the past: links were either good, or they didn't matter, as they didn't affect ranking anyway. Now, webmasters may feel compelled to avoid an unnatural links warning by meticulously monitoring their inbound links and reporting anything that looks odd. Google haven't been clear on whether they would take such action as a result - Matt suggests they just reclassify the link & see the report as a strong suggestion to treat the link as though it had a nofollow attribute - but no doubt there will be clarification as the tool beds in. Google has long used a tiered index structure, & enough complaints might lower the tier of a page or site, cause its ability to pass trust to be blocked, or cause the site to be directly penalized.

This is also a way of reaffirming “the law”, as Google sees it. In many instances, it is no fault of the webmaster that rogue sites link up, yet the webmaster will feel compelled to jump through Google’s hoops. Google sets the rules of the game. If you want to play, then you play by their rules, and recognize their authority. Matt Cutts suggested:

we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business.

Left unsaid in the above: most people don't have access to aggregate link data while they surf the web, most modern systems of justice are based on the presumption of innocence rather than guilt, and most rational people don't presume a site is somehow shady simply for being linked to.

If the KKK links to Matt's blog tomorrow, that doesn't imply anything about Matt. And when Google gets featured in an InfoWars article, it doesn't mean that Google desires that link or coverage. Many sketchy sites link to Adobe (for their Flash player), or to sites like Disney & Google as exits for visitors who are not old enough to view them. Those links do not indicate anything negative about the sites being linked to. However, as stated above, search is Google's monopoly to do with as they please.

On the positive side, if Google really do want sites to conform to certain patterns, and will reward them for doing so by letting them out of jail, then this is yet another way to clean up the SERPs. They get the webmaster onside, with that webmaster doing link classification work for them for free.

Who, Not What

For a decade search was driven largely by meritocracy. What you did was far more important than who you were. It was much less corrupt than the physical world. But as Google chases brand ad dollars, that view of the search landscape is no longer relevant.

Large companies can likely safely ignore much of the fear-first approach to search regulation. And when things blow up, they can cast off blame onto a rogue anonymous contractor of sorts. Smaller webmasters, meanwhile, walk on eggshells.

When the government wanted to regulate copyright issues, Google claimed it would be too expensive and would kill innovation at small start-ups. Google then drafted their own copyright policy, from which they themselves are exempt. And now small businesses not only need to bear that cost but also need to police their link profiles, even as competitors can use Fiverr, ScrapeBox, splog link networks & various other sources to drip a constant stream of low cost sludge in their direction.

Now more than ever, status is important.

Gotchas

No doubt you’ve thought of a few. A couple thoughts - not that we advocate them, but realize they will happen:

  • Intentionally build spam links to yourself & then disavow them (in order to make your profile look larger than it is & to ensure that competitor who follows everything you do - but lacks access to your disavow data - walks into a penalty).
  • Find sites that link to competitors and leave loads of comments for the competitor on them, hoping that the competitor blocks the domain as a whole.
  • Find sites that link to competitors & buy links from them into a variety of other websites & then disavow from multiple accounts.
  • Get a competitor some link warnings & watch them push to get some of their own clean "unauthorized" links removed.
  • The webmaster who parts on poor terms burning the bridge behind them, or leaving a backdoor so that they may do so at anytime.

If a malicious webmaster wanted to get a target site in the bad books, they could post obvious comment spam - pointing at their site, and other sites. If this activity doesn’t result in an unnatural linking notification, then all good. It’s a test of how Google values that domain. If it does result in an unnatural link notification, the webmaster could then disavow links from that site. Other webmasters will likely do the same. Result: the target site may get taken out.

To avoid this sort of hit, pay close attention to your comment moderation.

Please add your own to the comments! :) Gotchas, that is, not rogue links.

Further opinions @ searchengineland and seoroundtable.

How to Free Your E-Commerce Site from Google's Panda

Aug 29th

On Feb. 25, 2011, Google released Panda to wreak havoc on the web. While it may have been designed to take out content farms, it also took out scores of quality e-commerce sites. What do content farms and e-commerce sites have in common? Lots of pages. Many with zero or very few links. And on e-commerce sites with hundreds or thousands of products, the product pages may have a low quantity of content, making them appear as duplicate, low quality, or shallow to the Panda, thus a target for massive devaluation.

My e-commerce site was hit by Panda, causing a 60% drop in traffic overnight. But I was able to escape after many months of testing content and design changes. In this post, I'll explain how we beat the Panda, and what you can do to get your site out if you've been hit.

The key to freeing your e-commerce site from Panda lies at the bottom of a post Google provided as guidance to Pandalized sites:

One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

Panda doesn't like what it thinks are "low quality" pages, and that includes "shallow pages". Many larger e-commerce sites, and likely all of those that were hit by Panda, have a high number of product pages with either duplicate bits of descriptions or short descriptions, leading to the shallow pages label. In order to escape from the Panda devaluation, you'll need to do something about that. Here are a few possible solutions:

Adding Content To Product Pages

If your site has a relatively small number of products, or if each product is unique enough to support entirely different descriptions and information, you may be able to thicken up the pages with unique, useful information. Product reviews can also serve the same purpose, but if your site is already hit by Panda you may not have the customers to leave enough reviews to make a difference. Additionally, some product types are such that customers are unlikely to leave reviews.

If you can add unique and useful information to each of your product pages, you should do so both to satisfy the Panda and your customers. It's a win-win.

Using Variations To Decrease Product Pages

Some e-commerce sites have large numbers of products with slight variations. For example, if you're selling t-shirts you may have one design in 5 different sizes and 10 different colors. If you've got 20 designs, you've got 1,000 unique products. However, it would be impossible to write 1,000 unique descriptions. At best, you'll be able to write one for each design, or a total of 20. If your e-commerce site is set up so that each of the product variations has a single page, Panda isn't going to like that. You've either got near 1,000 pages that look like duplicates, or you've got near 1,000 pages that look VERY shallow.

Many shopping carts allow for products to have variations, such that in the above situation you can have 20 product pages where a user can select size and color variations for each design. Switching to such a structure will probably cause the Panda to leave you alone and make shopping easier for your customers.
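To make the arithmetic concrete, here is a minimal sketch of the two catalog structures, assuming a hypothetical t-shirt catalog; the URLs and field names are illustrative, not tied to any particular shopping cart:

    # Before: one URL per SKU. 20 designs x 5 sizes x 10 colors
    # = 1,000 thin, near-duplicate indexable pages.
    flat_skus = [
        {"url": f"/tshirts/design-{d}-{size}-color-{c}",
         "description": "same short blurb shared across the design"}
        for d in range(20)
        for size in ("S", "M", "L", "XL", "XXL")
        for c in range(10)
    ]
    print(len(flat_skus))  # 1000

    # After: one URL per design, with size & color as selectable options,
    # leaving 20 pages that can each carry a unique, substantial write-up.
    consolidated = [
        {"url": f"/tshirts/design-{d}",
         "options": {"size": ["S", "M", "L", "XL", "XXL"],
                     "color": [f"color-{c}" for c in range(10)]},
         "description": f"unique, in-depth copy for design {d}"}
        for d in range(20)
    ]
    print(len(consolidated))  # 20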

Removing Poor Performing Products

If your products aren't sufficiently unique to add substantial content to each one, and they also don't lend themselves to consolidation through selectable variations, you might consider deleting any that haven't sold well historically. Panda doesn't like too many pages. So if you've got pages that have never produced income, it's time to remove them from your site.

Getting Rid of All Product Pages

This is a bold step, but the one we were forced to take in order to recover. A great many of our products are very similar. They're variations of each other. But due to the limitations of our shopping cart, combined with shipping issues (each variation had different shipping costs that couldn't be programmed into the variations), it was the only viable choice we were left with.

In this option, you redesign your site so that products displayed on category pages are no longer clickable, removing links to all product pages. The information that was displayed on product pages gets moved to your category pages. Not only does this eliminate your product pages, which make up the vast majority of your site, but it also adds content to your category pages. Rather than having an "add to cart" or "buy now" button on the product page, it's integrated into the category page right next to the product.

Making this move reduced our page count by nearly 90%. Our category pages became thicker, and we no longer had any shallow pages. A side benefit of this method is that customers have to make fewer clicks to purchase a product. And if your customers tend to purchase multiple products with each order, they avoid having to go from category page to product page, back to the category page, and into another product page. They can simply purchase a number of products with single clicks.

Noindexing Product Pages

If you do get rid of all links to your product pages but your cart is still generating them, you'll want to add a "noindex, follow" tag to each of them. This can also be a solution for e-commerce sites where all traffic enters on category level pages rather than product pages. If you know your customers are searching for phrases that you target on your category pages, and not specifically searching for the products you sell, you can simply noindex all of your product pages with no loss in traffic.
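The tag in question is the standard robots meta tag, placed in the head of each product page you want dropped from the index while still letting crawlers follow its links:

    <!-- In the <head> of each product page: drop the page from the
         index, but keep following (& passing value through) its links -->
    <meta name="robots" content="noindex, follow">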

If all of your products are in a specific folder, I'd recommend also disallowing that folder from Googlebot in your robots.txt file, and filing a removal request in Google Webmaster Tools, in order to make sure the pages are taken out of the index.
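Assuming, for illustration, that the cart keeps all product pages under a /products/ folder, the robots.txt rule would look like this:

    # robots.txt - block Googlebot from a hypothetical /products/ folder
    User-agent: Googlebot
    Disallow: /products/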

Other Considerations: Pagination & Search Results Pages

In addition to issues with singular product pages, your e-commerce site may have duplicate content issues or a very large number of similar pages in the index due to your on-site search and sorting features. Googlebot will fill in your search form and index your search results pages, potentially leading to thousands of similar pages in the index. Make sure your search results pages have a "noindex, follow" robots meta tag or a rel="canonical" tag to take care of this. Similarly, if your pages have a variety of sorting options (price, best selling, etc.), you should make sure the rel="canonical" tag points to the default page as the canonical version. Otherwise, each page may exist in Google's index in each sorted variation.
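A sketch of both fixes, using hypothetical URLs:

    <!-- On internal search result pages: keep them out of the index -->
    <meta name="robots" content="noindex, follow">

    <!-- On sorted variants such as /widgets?sort=price, point engines
         at the default version of the listing -->
    <link rel="canonical" href="http://www.example.com/widgets">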


Maxmoritz, a long time member of our SEO Community, has been working in SEO full time since 2005. He runs a variety of sites, including Hungry Piranha, where he blogs regularly.

Rank Modifying Spammers

Aug 23rd

My good friend Bill at SEOByTheSea has unearthed a Google patent that will likely raise some eyebrows & confirm others' suspicions.

The patent is called Ranking Documents. When webmasters alter a page, or links to a page, the system may not respond immediately to those changes. Rather, the system may change rankings in unexpected ways.

A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.

Further:

During the transition from the old rank to the target rank, the transition rank might cause:
  • a time-based delay response,
  • a negative response
  • a random response, and/or
  • an unexpected response

So, Google may shift the rankings of your site, in what appears to be a random manner, before Google settles on a target rank.

Let's say that you're building links to a site, and the site moves up in the rankings. You would assume that the link building has had a positive effect. Not so if the patent code is active, as your site may have already been flagged.

Google then toys with you for a while before sending your site plummeting to the target rank. This makes it harder to determine cause and effect.
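As an illustration of the behavior the patent describes (and emphatically not Google's actual code), a rank transition function might look something like the sketch below; the modes mirror the delayed, negative & random responses listed above:

    import random

    def transition_rank(old_rank, target_rank, t, mode="random"):
        """Rank served at time t (0..1) while moving from old_rank
        to target_rank. Lower numbers mean better positions."""
        if t >= 1.0:
            return target_rank                      # settled on the target
        delta = target_rank - old_rank
        if mode == "delay":
            # no visible change for the first half of the transition
            return old_rank if t < 0.5 else old_rank + delta * (2 * t - 1)
        if mode == "negative":
            # swing the "wrong" way first, then converge on the target
            return old_rank + delta * (2 * t - 1)   # starts at old - delta
        if mode == "random":
            lo, hi = sorted((old_rank, target_rank))
            return random.uniform(lo, hi)           # unpredictable wobble
        return old_rank + delta * t                 # plain interpolation

    # A site "earning" a move from #30 toward #8 might first appear to tank:
    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(t, transition_rank(30, 8, t, mode="negative"))
        # prints ranks 52, 41, 30, 19 ... then finally 8

Under any of these modes, a link builder watching daily rankings learns nothing reliable about cause & effect until the transition period has fully run its course.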

Just because a patent exists doesn't mean Google is using it, of course. This may just be another weapon in the war-of-FUD, but it sounds plausible and it's something to keep in mind, especially if you're seeing this type of movement.

The Search Engine As Black Box

In ancient times (1990s), SEO thrived because search engines were stupid black boxes. If you added some keywords here, added a few links there, the black box would respond in a somewhat predictable, prescribed, fashion. Your rankings would rise if you guessed what the black box liked to "see", and you plummeted if you did too much of what the black box liked to see!

Ah, the good old days.

These days, the black box isn’t quite so stupid. It’s certainly a lot more cryptic. What hasn’t changed, however, is the battle line drawn between webmasters and search engines as they compete for search visitor attention.

If there are any webmasters still under the illusion that Google is the SEO's friend, that must be a very small club, indeed. Google used to maintain a - somewhat unconvincing - line that if you just followed their ambiguous guidelines (read: behaved yourself) then they would reward you. It was you and Google on the good side, and the evil spammers on the other.

Of late, Google appear to have gotten bored of maintaining any pretense, and the battle lines have been informally redrawn. If you’re a webmaster doing anything at all that might be considered an effort to improve rank, then you're a "spammer". Google would no doubt argue this has always been the case, even if you had to read between the lines to grasp it. And they’d be right.

Unconvinced?

Look at the language on the patent:

The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.

“Manipulated”? “Rank modifying spammers”? So, a spammer is someone who attempts to modify their rank?

I’ve yet to meet a webmaster who didn’t wish to modify their rank.

Google As A Competitor

Google’s business model relies on people clicking ads. In their initial IPO filing, Google identified rank manipulation as a business risk.

We are susceptible to index spammers who could harm the integrity of our web search results. There is an ongoing and increasing effort by “index spammers” to develop ways to manipulate our web search results

It’s a business risk partly because the result sets need to be relevant for people to return to Google. The largely unspoken point is Google wants webmasters to pay to run advertising, not get it for “free”, or hand their search advertising budget to an SEO shop.

Why would Google make life easy for competitors?

The counter argument has been that webmasters provide free content, which the search engines need in order to attract visitors in the first place. However, now that relevant content is plentiful, that argument has weakened. Essentially, if you don't want to be in Google, then block Google. They won't lose any sleep over it.

What has happened, however, is that the incentive to produce quality content with search engines in mind has been significantly reduced. If content can be scraped, ripped off, demoted & merely used as a means to distract the search engine user long enough to maybe click a few search engine ads, then where is the money going to come from to produce quality content? Google may be able to find relevant content, but "relevant" (on-topic) and "quality" (worth consuming) are seldom the same thing.

One content model that works in such an environment is content that is cheap to produce. Cheap content can be quality content, but like all things in life, quality tends to come with a higher price tag. Another model that works is loss-leader content, but then the really good stuff is still hidden from view, and it's still hard to do this well unless you've established considerable credibility - which is still expensive to do.

This is the same argument the newspaper publishers have been making. The advertising doesn't pay enough to cover the cost of production and make a profit - so naturally the winner in this game cuts production cost until the numbers do add up. What tends to be sacrificed in this process is quality.

NSFW Corp, a new startup by ex-TechCrunch and Guardian columnist Paul Carr, has taken the next step. They have put everything behind a paywall. There is no free content. No loss-leaders. All you see is a login screen.

Is this the future for web publishing? If so, the most valuable content will not be in Google. And if more and more valuable content lies beyond Google's reach, then will fewer people bother going to Google in the first place?

The Happy Middle

Google argue that they focus on the user. They run experiments to determine search quality, quality as determined by users.

Here’s how it works. Our engineers come up with some insight or technique and implement a change to the search ranking algorithm. They hope this will improve search results, but at this point it’s just a hypothesis. So how do we know if it’s a good change? First we have a panel of real users spread around the world try out the change, comparing it side by side against our unchanged algorithm. This is a blind test — they don’t know which is which. They rate the results, and from that we get a rough sense of whether the change is better than the original. If it isn’t, we go back to the drawing board. But if it looks good, we might next take it into our usability lab — a physical room where we can invite people in to try it out in person and give us more detailed feedback. Or we might run it live for a small percentage of actual Google users, and see whether the change is improving things for them. If all those experiments have positive results, we eventually roll out the change for everyone.
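
As a rough sketch of what that blind side-by-side step amounts to (a generic illustration, not Google's actual tooling - the rater interface, the 55% win threshold, and all function names below are invented for the example):

```python
import random

# Minimal sketch of a blind side-by-side preference test -- a generic
# illustration, not Google's tooling. The rater interface, the 55% win
# threshold, and all names below are invented for the example.

def side_by_side_trial(query, rater, ranker_a, ranker_b):
    """Show two result sets with origins hidden; record which one wins."""
    results = [("A", ranker_a(query)), ("B", ranker_b(query))]
    random.shuffle(results)  # blind: the rater can't tell control from test
    left, right = results
    choice = rater(query, left[1], right[1])  # "left", "right", or "tie"
    if choice == "tie":
        return None
    return left[0] if choice == "left" else right[0]

def ship_the_change(queries, raters, ranker_a, ranker_b, min_win_rate=0.55):
    """Aggregate rater preferences; only ship if the new ranker clearly wins."""
    wins = {"A": 0, "B": 0}
    for query in queries:
        for rater in raters:
            winner = side_by_side_trial(query, rater, ranker_a, ranker_b)
            if winner is not None:
                wins[winner] += 1
    decided = wins["A"] + wins["B"]
    return decided > 0 and wins["B"] / decided >= min_win_rate
```

Note what is being measured in a process like this: user preference between result sets, and nothing else.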

Customer focus is, of course, admirable, but you’ve got to wonder about a metric that doesn’t involve the needs of publishers. If publishing on the web is not financially worthwhile then, over time, the SERPs will surely degrade in quality as a whole, and users will likely go elsewhere.

There is evidence this is already happening. Brett at WebmasterWorld pointed out that there is a growing trend amongst consumers to skip Google altogether and head to Amazon and other sites directly. Amazon queries are up 73 percent in the last year.

There may well be a lot of very clever people at Google, but they do not appear to be clever enough to come up with a model that encourages webmasters to compete with each other in terms of information quality.

If Google doesn’t want the highest quality information increasingly locked up behind paywalls, then it needs to think of a way to nurture and incentivise the production of quality content, not just relevant content. Tell publishers exactly what content Google wants to see rank well and tell them how to achieve it. There should be enough money left on the table for publishers - i.e., less competition from ads - so that everyone can win.

I’m not going to hold my breath for this publisher nirvana, however. I suspect Google's current model just needs content to be "good enough."

Stop Questioning Negative SEO -- It Exists and It May Kill Your Niche

Jul 16th

The best part about a growing and very quickly changing industry is the diversity of viewpoints; the worst part is the exact same thing because sometimes 1 always equals 1 and doesn't need bullshit in lieu of evidence. I try my best to stay out of the limelight and just focus on making things happen. However, occasionally a topic will bother me so much that I have to chime in. The last time was over 5 years ago so I figure I'm due to speak up again. Today's topic? Negative SEO. My issue with the topic? Deniers.

There've been several posts claiming that negative SEO doesn't exist (those are the worst) or that maybe it exists but only weak sites can get hit (in other words, people with opinions that didn't do any testing). I'd like to put those topics to rest as best as a guy that keeps to himself can. I really should be able to do this in one sentence, but in the event the second half of this sentence doesn't do it for you, I have a couple of stories: if crappy SEO - over-optimized anchors and junky links - is to blame for ranking drops, how can it be said one cannot do this to someone else? And even if you were to deny this, then why the sudden rush to denounce certain links? On to some anecdotes!

While leading a training session overseas I mentioned a site I watched get hit by some negative SEO activities. I know that it was negative SEO and not a slip-up on the SEOs' part by virtue of knowing the history/team behind the site and watching it as part of my normal data routine; the site was managed by the kind of guys that get asked to speak at SEOktoberfest...the kind of people I'd go work for if my bag of tricks ever ran out. Ok, so you're asking how I know it was negative SEO. The easiest explanation is that I watched the site spike heavily with on-theme anchors from junk sites over a one-week period and get filtered shortly thereafter. It stayed filtered for just under a few months, but 2 days after I discussed the site and explained how I knew it was hit, it magically reappeared (yes, there were Googlers in the audience).

If you are skeptical then your first response had better be that I'm only loosely describing one example, so let me add that in the same industry, where I've shared my knowledge of some more sophisticated methods (first released in the SEObook community), I feel almost like an information arms dealer, since even the larger brands - themselves or through affiliated relationships - have been clubbing each other over the head. You read that right; I explained how I thought negative SEO could be employed and then watched a bunch of people actually do it, repeatedly. Unfortunately, I was hit too, but that's a different issue. In this particular industry, the only people left standing now are some poorly matched local results with fake reviews, a bunch of hacked domains, and the flotsam of macroparasites that gained popularity post-Penguin. The only one that came back? The one I publicly discussed at a conference, explaining exactly how they were a victim based on link patterns that didn't fit with the site's history over a several-year period.

I'll wrap this up with a bit of humor. As a joke, a friend of mine asked me to negative SEO him for his name. Let's say his name is John Doe and his domain is johndoe.com. The negative effect was temporary, but I was able to get him filtered for a little while on his name for maybe 120 seconds of my time and less than $50. The site did come back after a few days, but our mutual feeling on the matter is that for an extra $50 double-dose I could probably get the site filtered again. Neither of us wants negative SEO to get any more prevalent than it already is, so I'll skip the details on exactly how it was performed. There are multiple forms of negative SEO significantly scarier than someone with a copy of XRumer, and in some cases there is very little you can do to prevent it; if a jerk wants to take you down, it can happen. If your niche begins to look like the wasteland I described above, where I shared my thoughts a little too freely, then heaven help you, because it doesn't look like Google is going to.


Cygnus has been involved in search since 1997 and loves tackling new and interesting (and of course lucrative) projects. Follow @Cygnus on Twitter for his rants.

Top 10 List of Top 10 Lists for your Top 10 List Compilation

A lot of amazing technology is being created.

The Only Constant is Change

When you think of all the implications of the above video (and all the things that are going on in machine learning & search), it can be somewhat difficult to think about sustainable strategies.

Want to fund in-depth automotive reviews in part based on your organic rankings? That business model breaks down when the organic SERPs move below the fold.

When platforms are new they start off fairly open, to win attention & maximize their growth rates. Over time, as they push to monetize, they shift gears & what was once true becomes misleading. Thus a lot of people likely come off sounding like quacks, because they keep having to reinvent themselves & reassess their belief systems as the markets change.

Hello Mr. Cynic

If you write things that sound like rants & complaints, a lot of people mistake you for a crank full of gloom & nonsense. For what it is worth, in many ways I think the future of the web will still be bright, but relatively less bright than it was in the recent past for smaller players.

During the creation of any new communications network there are amazing opportunities, but over time they get arbitraged away & returns move more toward the typical norm in business as the platform gets locked down.

No Longer An Isolated Channel

The web is becoming more & more like the physical world (and is merging with it). For a long time, search & the online world were largely a meritocracy, where the best person could easily win even if they came from the most humble beginnings.

In the offline world there are many hoops one has to jump through to win, and the online market is becoming more like that at an accelerating rate, due to network effects & tracking that allow big companies to saturate channels & gain asymmetrical advantages.

From Meritocracy to Corporatocracy

In the search results of years gone by, large & complex organizations that were overly bloated and inefficient routinely had their asses handed to them by smaller & more efficient operations. But then size became a primary signal of relevancy & quality, and that all changed. As Larry Page & Sergey Brin warned, the relevancy algorithms inevitably follow the underlying business model of the search engine.

That is a big part of the disillusionment with Google. For many years they were a leveler which was concerned primarily with quality. That grew the importance of search & differentiated them from everyone else, but then they decided to be "the same" & so many who promoted them felt a bit betrayed.

If a person gives you something and then takes it away, you likely view them worse than someone who simply never offered it in the first place. As a species we are biologically wired to be averse to loss.

Vertical AND Horizontal Integration

I was chatting with a friend about the above trend & his responses were:

  • "you don't shoot the guy that didn't give you the job; you shoot the guy that gave you the job and then fired you"
  • "their public image as being a leveler becomes more grating too, given how much they no longer represent that"
  • "the biggest problem we have in search is that search engines don't view themselves as a medium. They want to be the cable operator + television show + in-show advertising + commercials...I'm not aware of another medium where it works that way"

The last of those 3 points is a big deal. Consider how popular music is, & that Machinima drives about 2/3 as many video streams as all of VEVO does, & yet Google invested directly in it. That gives Google power to rank the content (Google SERPs), host the content (YouTube), monetize the content (ads), and have an ownership stake in the content. All that is in addition to owning a browser, an operating system (make that two) & building hardware.

If Google's internal stats show someone else is catching up to a channel they invested in, Google can...

  • relay this news across to drive editorial quality, content quantity, or even ad placement
  • preferentially promote the network they are invested in (free ads, better rankings, more "you might also like" recommendations, more post-view recommendations)
  • give a higher revenue share to the network they are invested in (or offer them early access to new betas and exclusives that increase monetization)
  • slow the growth of the competing network by using more aggressive ad placement (or lower CPM ads)
  • slow upload speeds for competing channels
  • etc.

If you are batting for the home team, such advantages are great. But they blow for everyone else in the ecosystem.

Those sorts of issues don't just appear in a few isolated incidents, but appear over and over again.

Social networks should be open, unless they are Google+.

Affiliate links shouldn't count for ranking purposes, unless they are VigLink, which Google invested in. ;)

Affiliate links should be clearly labeled as such. When they are not clearly labeled & go through tracking redirects, they count as sneaky redirects under Google's remote rater guidelines. Yet on YouTube the affiliate links to Amazon & iTunes are not labeled as such & add an extra layer of tracking redirects to the sequence.
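
You can see those redirect hops for yourself. A minimal sketch using only the Python standard library (the URL below is a placeholder, and the hop limit is arbitrary):

```python
import http.client
from urllib.parse import urlparse, urljoin

def redirect_chain(url, max_hops=10):
    """Follow a link hop by hop and return every URL visited on the way."""
    hops = [url]
    for _ in range(max_hops):
        parts = urlparse(url)
        conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                    else http.client.HTTPConnection)
        conn = conn_cls(parts.netloc, timeout=10)
        path = (parts.path or "/") + ("?" + parts.query if parts.query else "")
        conn.request("HEAD", path)  # HEAD: we only care about the headers
        resp = conn.getresponse()
        location = resp.getheader("Location")
        conn.close()
        if resp.status not in (301, 302, 303, 307, 308) or not location:
            break  # reached the final destination
        url = urljoin(url, location)  # resolve relative redirects
        hops.append(url)
    return hops

# Placeholder URL -- substitute the affiliate link you want to inspect:
# print(redirect_chain("https://example.com/some-affiliate-link"))
```

Each intermediate URL in the returned list is a tracking hop the user never sees.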

Let Me See Your Backlinks!


Yesterday someone sent me an email about their reinclusion request being denied because someone else scraped their EzineArticles article & syndicated it to another third-party site.

They didn't create that link and yet they are somehow supposed to get a spammer (maybe one from another continent!) to remove it. In many cases spammers won't respond to anything other than cash, but if you do offer cash to get the job done then that spammer might keep adding more and more links over time, turning their mark into an easy source of subscription revenues.

What is Wrong With This Picture?

The above scenario is ridiculous.

If you look at *any* site closely enough there will be something wrong with it.

Just by virtue of existing & ranking, you will pick up dozens to hundreds of spammy links you don't even want, due to SERP scraper sites that are trying to rank on longtail keyword queries.

About 5 years ago I had a page get filtered out because it gained about 500 scraper links in a month. No matter what I did, that page would not rank until it was rewritten with a fresh page title. When you could change things & have the algorithms re-evaluate them automatically, there was at least a decent opportunity to get around such issues.
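
Spotting that kind of scraper-link burst in your own profile is straightforward if you have monthly new-link counts from a backlink tool. A minimal sketch (the data, the 6-month window, and the 5x threshold are all invented for illustration):

```python
# Minimal sketch: flag months where new inbound links spike far above the
# trailing average -- the kind of scraper burst described above. The data,
# the 6-month window, and the 5x threshold are invented for illustration.

def spike_months(monthly_new_links, window=6, factor=5.0):
    """Return (month, count, baseline) for months that spike past the norm."""
    flagged = []
    for i in range(window, len(monthly_new_links)):
        month, count = monthly_new_links[i]
        baseline = sum(c for _, c in monthly_new_links[i - window:i]) / window
        if baseline > 0 and count > baseline * factor:
            flagged.append((month, count, baseline))
    return flagged

history = [("2012-01", 40), ("2012-02", 55), ("2012-03", 35),
           ("2012-04", 50), ("2012-05", 45), ("2012-06", 60),
           ("2012-07", 500)]  # the scraper-link burst month
print(spike_months(history))  # -> [('2012-07', 500, 47.5)]
```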

Now that there is a manual review process holding you responsible for the actions of third-party webmasters, the market is a bit more grim.
But at least a bunch of link removal services are cropping up to profit from Google's errant logic. ;)

Ad Networks Love Big Brands

A bigger company can always shut a site down, split off into sections, & so on. Plus if you are a bigger company you are more likely to enjoy the benefit of the doubt.

But if you are a low-margin small business that has seen declining revenue AND has to jump through further hoops (rather than focusing on running your business), at some point it is easier to give up than to keep on fighting.

After this year's FUD there is zero camaraderie in the industry.

That's How Business Works

Eventually a lot of the displacement trends that are hitting the organic search market will hit the paid search market & Google will make many of the enterprise AdWords management tools obsolete via a combination of various free scripts & data obfuscation.

At that point in time some of the paid search folks will be the ones grimacing, but nobody will care, as those same people reminded us that this is just how business works. :D

Perhaps they're right:

Google appears to have a culture that condones shamelessly violating consumer privacy. How else can you explain a company that bypasses Apple's iPhone privacy settings in a reported attempt to strengthen advertising revenues?

It is hard to believe that Dave Packard or Andy Grove would ever tell a group of entrepreneurs that he did "every horrible thing in the book to just get revenues right away," or brag to trade publications that his company used behavioral psychologists to design "compulsion loops" into products to keep customers engaged. But Mark Pincus, the founder of Internet gaming giant Zynga, has done just that.

When corporate leaders pursue wealth in the winner-take-all Internet environment, companies dance on the edge of acceptable behavior. If they don't take it to the limit, a competitor will. That competitor will become the dominant supplier -- one monopoly will replace another. And when you engage in these activities you get a different set of Valley values: the values of customer exploitation.
