My good friend Bill at SEOByTheSea has unearthed a Google patent that will likely raise some eyebrows, whilst confirming the suspicions of others.
The patent is called Ranking Documents. When webmasters alter a page, or links to a page, the system may not respond immediately to those changes. Rather, the system may change rankings in unexpected ways.
A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.
During the transition from the old rank to the target rank, the transition rank might cause:
a time-based delay response,
a negative response,
a random response, and/or
an unexpected response.
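The patent doesn't publish a formula, but the behavior it describes can be sketched. Below is a minimal, purely illustrative Python model; the function name, constants, and the easing/overshoot/jitter choices are my assumptions, not anything Google has disclosed:

```python
import math
import random

def transition_rank(old_rank, target_rank, t, duration, seed=0):
    """Hypothetical rank transition: instead of moving straight from
    old_rank to target_rank, the interim rank is delayed, briefly pushed
    the *wrong* way, and jittered, so an observer can't infer cause
    and effect from their latest change."""
    if t >= duration:
        return target_rank  # transition period over: settle on target
    progress = t / duration
    rng = random.Random(seed * 1000 + t)  # deterministic per time step
    # Time-based delay response: little movement early in the transition.
    eased = progress ** 3
    base = old_rank + (target_rank - old_rank) * eased
    # Negative response: early on, drift *away* from the target rank.
    overshoot = (old_rank - target_rank) * 0.3 * math.sin(math.pi * min(progress * 2, 1))
    # Random response: small unpredictable jitter on top.
    jitter = rng.uniform(-2, 2)
    return max(1, round(base + overshoot + jitter))
```

Run over a 30-day transition from rank 50 to rank 10, a model like this first worsens the rank before letting it settle, which is exactly the "toying with you" pattern described below.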
So, Google may shift the rankings of your site, in what appears to be a random manner, before Google settles on a target rank.
Let's say that you're building links to a site, and the site moves up in the rankings. You would assume that the link building has had a positive effect. Not so if the patent code is active, as your site may have already been flagged.
Google then toys with you for a while before sending your site plummeting to the target rank. This makes it harder to determine cause and effect.
Just because a patent exists doesn't mean Google is using it, of course. This may just be another weapon in the war-of-FUD, but it sounds plausible, and it's something to keep in mind, especially if you're seeing this type of movement.
The Search Engine As Black Box
In ancient times (the 1990s), SEO thrived because search engines were stupid black boxes. Add some keywords here, a few links there, and the black box would respond in a somewhat predictable, prescribed fashion. Your rankings rose if you guessed what the black box liked to "see", and plummeted if you did too much of it!
Ah, the good old days.
These days, the black box isn’t quite so stupid. It’s certainly a lot more cryptic. What hasn’t changed, however, is the battle line drawn between webmasters and search engines as they compete for search visitor attention.
If there are any webmasters still under the illusion that Google is the SEO's friend, it must be a very small club, indeed. Google used to maintain a - somewhat unconvincing - line that if you just followed their ambiguous guidelines (read: behaved yourself), they would reward you. It was you and Google on the good side, and the evil spammers on the other.
Of late, Google appears to have grown bored of maintaining any pretense, and the battle lines have been informally redrawn. If you're a webmaster doing anything at all that might be considered an effort to improve rank, then you're a "spammer". Google would no doubt argue this has always been the case, even if you had to read between the lines to grasp it. And they'd be right.
Look at the language on the patent:
The systems and methods may also observe spammers’ reactions to rank changes caused by the rank transition function to identify documents that are actively being manipulated. This assists in the identification of rank-modifying spammers.
“Manipulated”? “Rank modifying spammers”? So, a spammer is someone who attempts to modify their rank?
I’ve yet to meet a webmaster who didn’t wish to modify their rank.
Google As A Competitor
Google’s business model relies on people clicking ads. In their initial IPO filing, Google identified rank manipulation as a business risk.
We are susceptible to index spammers who could harm the integrity of our web search results. There is an ongoing and increasing effort by “index spammers” to develop ways to manipulate our web search results.
It’s a business risk partly because the result sets need to be relevant for people to return to Google. The largely unspoken point is Google wants webmasters to pay to run advertising, not get it for “free”, or hand their search advertising budget to an SEO shop.
Why would Google make life easy for competitors?
The counter argument has been that webmasters provide free content, which the search engines need in order to attract visitors in the first place. However, now that relevant content is plentiful, that argument has weakened. Essentially, if you don't want to be in Google, then block Google. They won't lose any sleep over it.
What has happened, however, is that the incentive to produce quality content, with search engines in mind, has been significantly reduced. If content can be scraped, ripped off, demoted and merely used as a means to distract the search engine user enough to maybe click a few search engine ads, then where is the money going to come from to produce quality content? Google may be able to find relevant content, but "relevant" (on-topic) and "quality" (worth consuming) are seldom the same thing.
One content model that works in such an environment is content that is cheap to produce. Cheap content can be quality content, but like all things in life, quality tends to come with a higher price tag. Another model that works is loss-leader content, but then the really good stuff is still hidden from view, and it's still hard to do this well unless you've established considerable credibility - which is still expensive to do.
This is the same argument the newspaper publishers have been making. The advertising doesn't pay enough to cover the cost of production and make a profit, so naturally the winner in this game cuts production costs until the numbers do add up. What tends to be sacrificed in this process is quality.
NSFW Corp, a new startup by ex-TechCrunch and Guardian writer Paul Carr, has taken the next step. They have put everything behind a paywall. There is no free content. No loss-leaders. All you see is a login screen.
Is this the future for web publishing? If so, the most valuable content will not be in Google. And if more and more valuable content lies beyond Google's reach, then will fewer people bother going to Google in the first place?
Google has described how it evaluates ranking changes:
“Here’s how it works. Our engineers come up with some insight or technique and implement a change to the search ranking algorithm. They hope this will improve search results, but at this point it’s just a hypothesis. So how do we know if it’s a good change? First we have a panel of real users spread around the world try out the change, comparing it side by side against our unchanged algorithm. This is a blind test — they don’t know which is which. They rate the results, and from that we get a rough sense of whether the change is better than the original. If it isn’t, we go back to the drawing board. But if it looks good, we might next take it into our usability lab — a physical room where we can invite people in to try it out in person and give us more detailed feedback. Or we might run it live for a small percentage of actual Google users, and see whether the change is improving things for them. If all those experiments have positive results, we eventually roll out the change for everyone.”
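That blind side-by-side stage amounts to aggregating rater preferences. A minimal sketch, assuming ratings are coded +1 (prefers the change), -1 (prefers control), or 0 (no preference), and using a simple sign test as the ship criterion; both the coding and the criterion are my assumptions, not Google's published method:

```python
from statistics import NormalDist

def side_by_side_verdict(ratings, alpha=0.05):
    """Aggregate blind side-by-side ratings into a ship/no-ship verdict.

    Hypothetical criterion: promote the change only if raters prefer it
    by a statistically significant margin (normal-approximation sign
    test against the null of no preference, p = 0.5)."""
    votes = [r for r in ratings if r != 0]  # ties carry no signal
    if not votes:
        return "no signal"
    n = len(votes)
    wins = sum(1 for r in votes if r > 0)
    p_hat = wins / n
    se = (0.25 / n) ** 0.5          # std. error of a proportion at p = 0.5
    z = (p_hat - 0.5) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    if p_hat > 0.5 and p_value < alpha:
        return "promote to live experiment"
    return "back to the drawing board"
```

With 80 wins out of 100 decisive votes this promotes the change; a 50/50 split sends it back to the drawing board. Note the metric is entirely searcher-side, which is the point made below.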
Customer focus is, of course, admirable, but you’ve got to wonder about a metric that doesn’t involve the needs of publishers. If publishing on the web is not financially worthwhile, then, over time, the serps will surely degrade in terms of quality as a whole, and users will likely go elsewhere.
There is evidence this is already happening. Brett at Webmasterworld pointed out that there is a growing trend amongst consumers to skip Google altogether and just head straight for Amazon and other sites. Amazon queries are up 73 percent in the last year.
There may well be a lot of very clever people at Google, but they do not appear to be clever enough to come up with a model that encourages webmasters to compete with each other in terms of information quality.
If Google doesn’t want the highest quality information increasingly locked up behind paywalls, then it needs to think of a way to nurture and incentivise the production of quality content, not just relevant content. Tell publishers exactly what content Google wants to see rank well, and tell them how to achieve it. There should be enough money left on the table for publishers - i.e., less competition from ads - so that everyone can win.
I’m not going to hold my breath for this publisher nirvana, however. I suspect Google's current model just needs content to be "good enough."
The best part about a growing and very quickly changing industry is the diversity of viewpoints; the worst part is the exact same thing because sometimes 1 always equals 1 and doesn't need bullshit in lieu of evidence. I try my best to stay out of the limelight and just focus on making things happen. However, occasionally a topic will bother me so much that I have to chime in. The last time was over 5 years ago so I figure I'm due to speak up again. Today's topic? Negative SEO. My issue with the topic? Deniers.
There've been several posts claiming negative SEO doesn't exist (those are the worst), or that maybe it exists but only weak sites can get hit (in other words, opinions from people who didn't do any testing). I'd like to put those topics to rest as best as a guy who keeps to himself can. I really should be able to do it in one sentence: if crappy SEO - over-optimized anchors and junky links - is to blame for ranking drops, how can it be said one cannot do the same thing to someone else? And even if you were to deny that, then why the sudden rush to denounce certain links? In case that doesn't do it for you, I have a couple of stories. On to some anecdotes!
While leading a training session overseas I mentioned a site I watched get hit by some negative SEO activities. I know it was negative SEO, and not a slip-up on the SEOs' part, by virtue of knowing the history/team behind the site and watching it as part of my normal data routine; the site was managed by the kind of guys that get asked to speak at SEOktoberfest...the kind of people I'd go work for if my bag of tricks ever ran out. Ok, so you're asking how I know it was negative SEO. The easiest explanation is that I watched the site spike heavily with on-theme anchors from junk sites over a one-week period, and it was filtered shortly thereafter. It stayed filtered for just under a few months, but 2 days after I discussed the site and explained how I knew it was hit, it magically reappeared (yes, there were Googlers in the audience).
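A spike of that kind is easy to surface in backlink data. Here is a minimal sketch, assuming a backlink export of (first-seen date, anchor text) pairs; the function name and the 5x-over-median threshold are my assumptions for illustration, not anyone's production monitoring:

```python
from collections import Counter
from datetime import date, timedelta

def anchor_spike_weeks(links, threshold=5.0):
    """Flag weeks where new-link velocity jumps far above the trailing norm.

    `links` is a list of (first_seen: date, anchor_text) tuples from a
    backlink export (hypothetical format).  A sudden one-week burst of
    on-theme anchors shows up as a week whose new-link count dwarfs the
    median of all earlier weeks."""
    per_week = Counter()
    for first_seen, _anchor in links:
        per_week[first_seen.isocalendar()[:2]] += 1  # (ISO year, ISO week)
    weeks = sorted(per_week)
    flagged = []
    for i, wk in enumerate(weeks):
        history = [per_week[w] for w in weeks[:i]]
        if not history:
            continue  # no baseline yet
        baseline = sorted(history)[len(history) // 2]  # median of prior weeks
        if baseline and per_week[wk] / baseline >= threshold:
            flagged.append(wk)
    return flagged
```

For example, four quiet weeks of 2 new links each followed by a week of 30 flags only the burst week. Anchor-text concentration could be checked the same way, but velocity alone catches the pattern described above.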
If you are skeptical, your first response should be that I'm only loosely describing one example. So let me add that in the same industry, where I've shared my knowledge of some more sophisticated methods (first released in the SEObook community), I feel almost like an information arms dealer, since even the larger brands - themselves or through affiliated relationships - have been clubbing each other over the head. You read that right: I explained how I thought negative SEO could be employed and then watched a bunch of people actually do it, repeatedly. Unfortunately, I was hit too, but that's a different issue. In this particular industry, the only ones left standing are some poorly matched local results with fake reviews, a bunch of hacked domains, and the flotsam of macroparasites that gained popularity post-Penguin. The only one that came back? The one I publicly shared at a conference, explaining exactly how it was a victim based on link patterns that didn't fit the site's history over a several-year period.
I'll wrap this up with a bit of humor. As a joke a friend of mine asked me to negative SEO him for his name. Let's say his name is John Doe and his domain is johndoe.com. The negative effect was temporary, but I was able to get him filtered for a little while on his name for maybe 120 seconds of my time and less than $50. The site did come back after a few days, but our mutual feeling on the matter is that for an extra $50 double-dose I could probably get the site filtered again. Neither of us wants negative SEO to get any more prevalent than it already is, so I'll skip the details on exactly how it was performed. There are multiple forms of negative SEO significantly scarier than someone with a copy of xrumer and in some cases there is very little you can do to prevent it; if a jerk wants to take you down, it can happen. If your niches begin to look like the wasteland I described above where I shared my thoughts a little too freely, then heaven help you because it doesn't look like Google is going to.
Cygnus has been involved in search since 1997 and loves tackling new and interesting (and of course lucrative) projects. Follow @Cygnus on Twitter for his rants.
When platforms are new they start off as being fairly open to win attention & maximize their growth rates. Over time, as they push to monetize, they shift gears & what was once true becomes misleading. Thus a lot of people likely come off as sounding like quack jobs, because they keep having to reinvent themselves & reassess their belief systems as the markets change.
Hello Mr. Cynic
If you write things that sound like rants & complaints, a lot of people mistake you for a crank full of gloom & nonsense. For what it is worth, in many ways I think the future of the web will still be bright, but relatively less bright than it was in the recent past for smaller players.
The web is becoming more & more like the physical world (and is merging with it). For a long time search & online was largely a meritocracy, where the best person could easily win even if they came from the most humble beginnings.
In search of years gone by, large & complex organizations that were overly bloated and inefficient routinely had their asses handed to them by smaller & more efficient operations. But then size became a primary signal of relevancy & quality, and that all changed. As Larry Page & Sergey Brin warned, the relevancy algorithms inevitably follow the underlying business model of the search engine.
That is a big part of the disillusionment with Google. For many years they were a leveler which was concerned primarily with quality. That grew the importance of search & differentiated them from everyone else, but then they decided to be "the same" & so many who promoted them felt a bit betrayed.
If a person gives you something and then takes it away, you likely view them worse than someone who simply never offered it in the first place. As a species we are biologically wired to be averse to loss.
Vertical AND Horizontal Integration
I was chatting with a friend about the above trend & his responses were:
"you don't shoot the guy that didn't give you the job; you shoot the guy that gave you the job and then fired you"
"their public image as being a leveler becomes more grating too, given how much they no longer represent that"
"the biggest problem we have in search is that search engines don't view themselves as a medium. They want to be the cable operator + television show + in-show advertising + commercials...I'm not aware of another medium where it works that way"
Affiliate links should be clearly labeled as such. When they are not clearly labeled & go through tracking redirects they are sneaky redirects in Google's remote rater guidelines. On YouTube the affiliate links to Amazon & iTunes are not labeled as such & add an extra layer of tracking redirects to the sequence.
Let Me See Your Backlinks!
Yesterday someone sent me an email about their reinclusion request being denied because someone else scraped their eZineArticles article & syndicated it to another 3rd party site.
They didn't create that link and yet they are somehow supposed to get a spammer (maybe one from another continent!) to remove it. In many cases spammers won't respond to anything other than cash, but if you do offer cash to get the job done then that spammer might keep adding more and more links over time, turning their mark into an easy source of subscription revenues.
What is Wrong With This Picture?
The above scenario is ridiculous.
If you look at *any* site closely enough there will be something wrong with it.
Just by the virtue of existing & ranking you will pick up dozens to hundreds of spammy links you don't even want, due to SERP scraper sites that are trying to rank on longtail keyword queries.
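If you are forced to respond to such links anyway, a backlink export can at least be mechanically filtered into domain-level disavow entries. A hedged sketch follows; the URL patterns are illustrative guesses at what SERP-scraper links tend to look like, not a vetted blocklist, and the function name is made up:

```python
import re
from urllib.parse import urlparse

# Hypothetical signals: SERP-scraper pages often have search-query-style
# URL paths.  Purely illustrative, not a tested classifier.
SCRAPER_HINTS = re.compile(r"(search|query|keyword|results?)[=/_-]", re.I)

def disavow_lines(backlinks, known_bad_domains=()):
    """Turn a backlink export (list of source URLs) into domain-level
    disavow-file entries for links the site owner never built."""
    domains = set()
    for url in backlinks:
        host = urlparse(url).hostname or ""
        if host in known_bad_domains or SCRAPER_HINTS.search(url):
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]
```

The absurdity, of course, is that this busywork exists at all: the output is a list of domains you never asked to be linked from in the first place.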
About 5 years ago I had a page get filtered out because it gained about 500 scraper links in a month. No matter what I did, that page would not rank until it was rewritten with a fresh page title. When you could change things & have the algorithms re-evaluate them automatically, there was at least a decent opportunity to get around such issues.
Now that there is a manual review process holding you responsible for the actions of third party webmasters the market is a bit more grim.
But at least a bunch of link removal services are cropping up to profit from Google's errant logic. ;)
Ad Networks Love Big Brands
But if you are a low-margin small business that has seen declining revenue AND has to jump through further hoops (rather than focusing on running your business), at some point it is easier to give up than to keep on fighting.
Eventually a lot of the displacement trends that are hitting the organic search market will hit the paid search market & Google will make many of the enterprise AdWords management tools obsolete via a combination of various free scripts & data obfuscation.
At that point in time some of the paid search folks will look like the guy to the right, but nobody will care, as those same people reminded us that this is just how business works. :D
Google appears to have a culture that condones shamelessly violating consumer privacy. How else can you explain a company that bypasses Apple's iPhone privacy settings in a reported attempt to strengthen advertising revenues?
It is hard to believe that Dave Packard or Andy Grove would ever tell a group of entrepreneurs that he did "every horrible thing in the book to just get revenues right away," or brag to trade publications that his company used behavioral psychologists to design "compulsion loops" into products to keep customers engaged. But Mark Pincus, the founder of Internet gaming giant Zynga, has done just that.
When corporate leaders pursue wealth in the winner-take-all Internet environment, companies dance on the edge of acceptable behavior. If they don't take it to the limit, a competitor will. That competitor will become the dominant supplier -- one monopoly will replace another. And when you engage in these activities you get a different set of Valley values: the values of customer exploitation.
The layout of the result looks something like this
Or if you put it in Google's browser analysis tool, it looks something like this
And with that move, if you are in ecommerce & you don't rank #1 you are essentially invisible to most searchers.
As John Andrews highlighted on Twitter: "Notice Google tells us "paid relationships improve quality" and then penalizes for paid links?"
As always, it is more profitable to follow Google's biz dev team than Google's public relations pablum.
In some cases Google might include 3 or 4 different types of monetization in a search result. In the below search result Google includes:
Hotel Comparison ads
Hotel Price ads
And those are *in addition* to featuring promotional links to Google Maps & Google+ in the search results. Further, some of these vertical results consist exclusively of paid inclusion & then have yet another layer of PPC ads over the top.
As SEOs we focus a lot of energy on "how do I rank 1 spot higher" but when the organic results are displaced and appear below the fold why bother? The issue of the incredibly shrinking organic result set is something that can't be over-emphasized. For many SEOs the trend will absolutely be career ending.
AdWords, product listing ads, brand navigation, product search, local, etc. A result like this has a single organic listing above the fold, & if Google decides to rank their local results one spot higher, that turns to zero.
If you look at the new TLD announcements, Google applied for .MBA & .PhD (as well as many names around entertainment, family & software). Thus it is safe to say that education will eventually be added to local, video, media, shopping & travel as verticals where Google displaces the organic results with links to more of their fraternal listings. About the only big categories this will leave unscathed are real estate, employment & healthcare. However, the first 2 are still in contraction during our ongoing depression, & Google blew a lot of their health credibility by accepting illegal steroid ads from a person posing as a Mexican drug lord.
In addition to these fixed verticals that cover the most profitable areas of search, Google is also building a "vertical search on the fly" styled service with their knowledge graph. The knowledge graph extracts data from 3rd party websites & can then be used to funnel traffic and revenue to Google's various vertical services. To make it seem legit, Google will often start by sending some of the traffic on to 3rd party sites, but the end destination is no different from product search. While it is a "beta" product it is free, which justifies an inferior product being showcased front & center; after Google gets enough buy-in, they monetize.
There is a non-subtle difference between Google's approach and Microsoft's approach to building a search ecosystem.
Let's compare that behavior against Yahoo! or Bing.
Yahoo! has long been considered out of the search game, yet when they want to have a competitive advantage they do things like license photos from Getty. They use the content with permission on agreed terms.
Google's approach is more along the lines of "scrape it now & figure out legal later." And after a long enough period has passed they will add monetization & mix it into the core of their offering, like they recently did with books:
This launch transitions the billions of pages of scanned books to a unified serving and scoring infrastructure with web search. This is an efficiency, comprehensiveness and quality change that provides significant savings in CPU usage while improving the quality of search results.
Eric Enge interviewed Stefan Weitz about the new Bing interface. As part of that interview, Stefan described Bing's editorial philosophy on building a search ecosystem:
We partner with 3rd party services instead of trying to build or acquire them. There are probably something like a million apps out there today.
I talk to probably two dozen start-ups every week that are doing different cool things on the web. To think that we are ever going to be able to actually beat them, or out-execute them (when they are talking about 12 guys with half a million angel funding building some really interesting apps), it is just not likely.
At the start, forays into new categories might provide some value to publishers in order to get buy-in, but eventually the "first hit free" stuff shifts to paid & Google continues to displace publishers across more and more of the ecosystem, using content scraped from said publishers.
When Google or Apple drive cars around the country or fly military-grade planes over cities to create 3D maps, they are creating databases & adding new information. Outside of collecting private data (like wifi payload data), there is little to complain about with that. They are adding value to the system.
However, at the same time, Google not only scrapes themselves, but they are a revenue engine that drives a lot of third party scraping. And they design penalties in a way that allows those who scrape penalized sites to outrank them. With batch penalty updates some folks can chain redirects, expired domains & so on to keep exploiting the combination of copyright violations & Google penalties to make a mint. Google also had a long history of funding Traffic Equalizer sites, sites like Mahalo that would take a copy of a search result & auto-generate a page on it, newspaper sites that would hang auto-generated stub preview articles on subdomains, & sites like eHow which integrate humans into the process.
While many sites are still penalized from the first version of Panda, downstream referrals to eHow.com from Google in the US were up over 9% last month. They know "how to create SEO content."
Recently a start-up that launched a couple of years ago decided to take their thousands of subdomains of scraped databases & partner with authoritative websites to syndicate that content around the web. Some of those pages get double listings, & for some search queries the same page (with a different masthead logo) appears 5 different times. Those sites don't get hit by duplicate content filters or algorithms like Panda because they have enough domain authority to get a free pass. Including AdSense in the set-up probably makes it more palatable to Google as well.
If you have scale you can even auto-generate a bunch of "editorial" questions off the database.
More data = more pages = more questions and comparisons = more pages = SEO alchemy (especially if you don't have to worry about Panda).
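That "more data = more pages" alchemy is trivially mechanical: pairwise comparisons alone turn n database records into n*(n-1)/2 "editorial" pages. A sketch; the slug and title templates are hypothetical, just to show the shape of the trick:

```python
from itertools import combinations

def question_pages(entities, attribute):
    """Illustrative database-driven page generation: every pair of records
    becomes a 'Which X has a better Y?' comparison page, so page count
    grows quadratically with the size of the dataset."""
    pages = []
    for a, b in combinations(entities, 2):
        slug = f"/compare/{a}-vs-{b}".lower().replace(" ", "-")
        title = f"Which has a better {attribute}: {a} or {b}?"
        pages.append((slug, title))
    return pages
```

Three records yield 3 pages; a thousand records yield nearly half a million, with zero editorial cost, which is exactly why domain authority is the only thing standing between this and a Panda demotion.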
The parent scraper site includes links back to itself on every syndicated page, which to some degree makes it a glorified PageRank funnel. WPMU.org got smoked for syndicating a sponsored theme on one of its own sites, but the above industrial-scale set-up is somehow reasonable because it was launched by a person who sold their first start-up to Google (and will likely sell this one to Google too). The site also includes undisclosed affiliate links & hands out "awards" badges to the best casual-encounter sex dating sites, which then get syndicated around the web & earn it many inbound links from "high quality" porn sites.
I won't name the site here for obvious reasons, but they are not doing the above under a cloak of darkness that one has to look hard to find & do deep research to piece together. For some search results they are half or more of the result set, & they even put out press releases when they add new syndication partners, linking to numerous new automated subdomains or sections within sites related to various categories.
When the search results look like that, if you do original in-depth reviews that are expensive to produce, there is zero incentive to leave your content and ratings open to Google and these sorts of scraper/syndicators.
There is always a new spin on mashing up low-end content with high-trust websites and feeding it into Google. So long as Google biases their algorithms toward big brands & looks the other way when those brands exploit the ecosystem, that trend will not end.
The Independent Publishers Group, a principal distributor of about 500 small publishers, recently angered Amazon by refusing to accept the company’s peremptory demand for deeper discounts. Amazon promptly yanked nearly 5,000 digital titles. Small-press publishers were beside themselves. Bryce Milligan of Wings Press, based in Texas, spoke for most when, in a blistering broadside, he lambasted Amazon, complaining that its actions caused his sales to drop by 40 percent.
However, even when companies are brutal in some aspects they do amazing things in other areas, so one has to weigh the good with the bad.
At any point Google can fold one vertical into another or extend out a new model. The Android Marketplace feeds into Google Play, Google local feeds into Google+, Google search force feeds just about everything else & even free offerings on sites like YouTube will eventually become pay to play stores.
Where Google lacks marketshare & forced bundling isn't enough to compete they can buy the #2 or #3 player in the market & try to propel it to #1 using all those other forms of bundling.
Part of what made search competitive against other platforms was its openness & neutrality. But if the search results are Wal-Mart over and over again (or the same scraped info 5 times in a row, or a collection of internal listings) then the system becomes more closed off & the perception of choice becomes an illusion. John Andrews wrote a couple of great Tweets expressing the shift in search:
"Google SEO is no longer worth the effort for those who are not writers, artists, speakers, trainers, or promoters. What happened to Search?"
"If you want to see what Google will look like after it locks up, look at Apple. ipad users are already "managed" very tightly."
When companies try to expand the depth of their platform with more features it is a double edged sword. At some point they capture more value than they create and are no longer worth the effort. When they get to that stage it becomes a race to the bottom with scrapers trying to outscrape one another. Then in turn the company that created the ecosystem problem uses the pollution they rewarded to further justify closing off the system, guaranteeing only more of the same. Those who actually add value move on looking for greener pastures.
A label or an interest is a vector for ad targeting. There is no need to worry about de-anonymizing data for ad targeting when it is all in-network and you monitor what someone does, control which messages they see, & track which ones they respond to. Tell someone something often enough and they may believe it is true.
The Contempt Large Companies Have for their Customers
There is a sameness to customer service from a lot of big companies. They spend loads & loads to track you and market to you, but then disappear the moment things go wrong, as they are forbidden to care.
Perhaps the only thing worse than AOL's customer support is the unmoderated comments on the YouTube page.
Denise Griffin, the person in charge of Google’s small customer-support team, asked Page for a larger staff. Instead, he told her that the whole idea of customer support was ridiculous. Rather than assuming the unscalable task of answering users one by one, Page said, Google should enable users to answer one another’s questions.
Even their official blog posts announcing that they are accepting customer feedback for your applications go unmoderated.
This sort of contempt exists at essentially all large companies.
Everything seems on the up & up, but that "private listing" was maybe for a counterfeit product.
If it isn't a counterfeit & you get too good of a price, you are threatened with a lawsuit, and the branded network hides behind an "oh, we are just a marketplace and can't be bothered to give a crap about our customers" public relations angle.
The Retina MacBook is the least repairable laptop we’ve ever taken apart: unlike the previous model, the display is fused to the glass—meaning replacing the LCD requires buying an expensive display assembly. The RAM is now soldered to the logic board—making future memory upgrades impossible. And the battery is glued to the case—requiring customers to mail their laptop to Apple every so often for a $200 replacement. The design may well be comprised of “highly recyclable aluminum and glass”—but my friends in the electronics recycling industry tell me they have no way of recycling aluminum that has glass glued to it like Apple did with both this machine and the recent iPad. The design pattern has serious consequences not only for consumers and the environment, but also for the tech industry as a whole.
Every time we buy a locked down product containing a non-replaceable battery with a finite cycle count, we’re voicing our opinion on how long our things should last. But is it an informed decision? When you buy something, how often do you really step back and ask how long it should last? If we want long-lasting products that retain their value, we have to support products that do so.
One last bit of absurdity on the YouTube front. Google recently threatened to sue a site designed to convert YouTube videos into MP3s.
How does Google's "computers deserve free speech rights" & shagging 3rd party content to fill out their own vertical search services compare against their approach when someone uses YouTube content in a way Google does not desire?
There are AdWords ads promoting free unlimited MP3 downloading & song burning bundled with shady adware.
Google's AdSense for domains funds boatloads of cybersquatting. While Google threatened to sue this particular site, they could have simply taken the domain for cybersquatting on the YouTube trademark. That they chose to turn this into a press event rather than quietly fix the issue shows it was mostly posturing.
Further to the above point, while Google singled out a specific MP3 conversion site, there are other sites designed around doing the exact same thing which are PREMIUM ADSENSE PARTNERS, with the body of the page looking like this:
How Small Companies Are Taxed With Uncertainty
When Google decided to move away from direct marketing toward brand advertising, things that are often associated with size, scale & brand recognition became relevancy signals.
how much to invest in marketing, where to invest it, how to balance the need for short term cashflow with the required reinvestments to build real (or fake) brand signals
how long does the market have left before Google enters the niche and destroys the opportunity that organic SEO once represented
should you run 1 website, or many to hedge risks? and how many is optimal?
how big should your site be?
if one of your sites gets penalized, should you try to fix it up, should you start over with a new site, or should you consider SEO to be a pointless goal?
Google mentions that they want people to do what is best for the user & not worry about Google, but that advice is a recipe for pain.
If you do not run a large & authoritative website there are many landmines to trip over given the increasing complexity of SEO. And any of Google's "helpful" webmaster messages can leave a webmaster paralyzed by fear, leading them toward eventual bankruptcy.
Small companies need to jump through all sorts of canonicalization hoops & prune content just to have a hope of avoiding algorithms like Panda. Meanwhile Google changed their host-crowding preferences to let some large sites get up to 8 listings on a single search result page for their LACK OF effort. Those larger sites can then partner with glorified scraper sites that syndicate databases, feeding on domain authority with no risk of Panda.
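One of those "canonicalization hoops" is consolidating the near-duplicate URLs that parameters and session IDs create, typically with a rel=canonical tag. A generic illustration (the URLs here are made up, and nothing about this guarantees avoiding Panda):

```html
<!-- Served on a filtered/duplicated URL such as
     http://example.com/shoes?sort=price&sessionid=123 -->
<link rel="canonical" href="http://example.com/shoes" />
```

Small sites have to audit and maintain this sort of markup across every template; the large sites described above get ranked regardless.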
Due to how Google penalizes smaller sites, when such a site gets hit, the sites that rewrote its content will outrank it. These trends are so obvious that even non-SEOs like Tim Carter (who was a Google golden boy for years) highlight how the tables have tilted away from what is most relevant toward what pays Google the most.
I am already getting fake webmaster tool notification messages using the above subject line & the following message:
Hello dear managers of http://www.seobook.com/! My name is Olivia, and the issue I’m gonna to discuss is for sure not new, but really actual and complicated, otherwise your website and therefore business wouldn’t have lost their favourable positions. Yes, I want to talk about Google Panda and Penguin. These virtual beasts become more and more freakish. Don't you think it's time to pacify them? Google intends to clean its search results from poor content websites, low quality links and hype. Are you sure your website has nothing common with this stuff?
Our team has been constantly studying Google search algorithms. We have already faced the latest freaks of Google Panda 3.4 and will be happy to win back your top positions.
We will heal your website from:
poor on page optimization;
same content submission;
low quality links to your website;
absence of website moderation;
black hat SEO applied earlier.
We will make Google be proud of you with:
high quality SEO strategy;
backlinks from relevant resources;
unique content for every submission directory;
constatnt situation analysis and reporting.
Contact us and you will get a reliable website healer, strategy planner and safe guard of your top positions.
Looking forward to your answer!
And Gmail is letting this stuff slide through the spam filters. Along with garbage like this:
Our Web Site [the url] is definitely related to yours and by placing a link from your site to a Web page of ours, you may not only bring further value to your visitors but you may improve your search engine rankings potential as well. By NOT being what Google and other search engines refer to as a "dead-end" site or a site that does not link to other industry related and content sites, your rankings have a good chance of increasing for important keyword searches. We can explain this in further detail following a response from you.
Create FUD & some huckster will sell into your messaging with inbound spamming.
If you ever wonder where the "reputation problem" of the SEO industry comes from, wonder no more.
One company in particular does a great job of riding these trends on through to their logical conclusion, then riding them a bit longer. And that company is Google.
Google recently launched their webspam Penguin update. While they claim it only impacted about 3.1% of search queries, the 3.1% it impacted were largely in the "commercial transactional keywords worth a lot of money" category.
Based on the number of complaints online about it (there is even a petition!) this is likely every bit as large as Panda or the Florida update. A friend also mentioned that shortly after the update WickedFire & TrafficPlanet both had sluggish servers, yet another indication of the impact of the update.
Spam vs OOP
Leading up to the launch, the update was sold as being about over-optimization. However, when it launched it was given no pet name; it was simply called the webspam update. Thus anyone who complained about the update was, by definition, a spammer.
A day after declaring that the update didn't have a name, Google changed positions and called it the Penguin update.
Why the quick turn around on the naming?
If you smoke a bunch of webmasters & then label them all as spammers, of course they are going to express outrage and look for the edge cases that make you look bad & promote those. One of the first ones out of the gate on that front was a literally blank blogspot blog that was ranking #1 for make money online.
As I joked with Eli, if it is blank then they couldn't have done anything wrong, right? :D
Another site that got nailed by the update was Viagra.com. It has since been fixed, but it is pretty hard for Google to simultaneously state that the sites that got hit are spam, blend the search ads into the results so much that users can't tell them apart, & force Pfizer to buy their own brand keyword in order to rank. If that condition hadn't been fixed quickly I am pretty certain it would have led to lawsuits.
So Worried About Manipulation That They Manipulate Themselves
When I was a kid I used to collect baseball cards. As the price of pictures from sites like iStockphoto has gone up, I recently bought a few cards on eBay (in part for nostalgia & in part to have pictures for some of our blog posts). Yesterday I searched for baseball card holders for mini-cards & on the first page of search results was:
a big ecommerce site where the review on that product stated that the retailer described the quantity as being 10x what you actually get (the same site had other, better pages)
a user-driven aggregator site with a thin affiliate post made years ago & attributed to a site that no longer exists
a Facebook note that was auto-generated from a feed
an old blogspot splog
a broader tag page for a social site
a Yahoo! Shopping page that was completely empty
That blank Yahoo! Shopping page is also what showed up in Google's cache too. So I am not claiming that they were spamming Google in any way, rather that Google just has bad algorithms when they rank literally blank pages simply because they are on an authoritative domain name.
The SERPs lacked expert blogs, forum discussions, & niche retailers. In short, too much emphasis on domain authority yet again.
Part of the idea of the web was that it could connect supply and demand directly, but an excessive focus on domain authority leads users to have to go through another set of arbitragers. Efforts to squeeze out micro-parasites has led to the creation of macro-parasites (and micro-parasites that ride on the macro-parasite platforms).
And your business model is probably far more important than your SEO skill level. Imagine running a consulting company for a lot of small business customers at a few hundred Dollars a month each, based on stable rankings, & then dealing with a tumultuous update that hits a number of them at the same time. Then they see an older (even abandoned) competing site of lower quality with fewer links ranking, and they think you are selling them a bag of smoke. These sorts of updates harm the ability to do SEO consulting for anyone who isn't consulting for big brands. Yes, many people made it through this update unscathed, but how many of these sorts of updates can one slide through before eventually getting clipped?
The Unknowable Future
As search evolves, invariably anyone who is doing well in the ecosystem will at some point face setbacks. Those may happen due to an algorithm update or an interface change where Google inserts itself into your market. If you never get hit, it means you were only operating at a fraction of your potential. If you consistently get hit, you may be taking on too much risk. Many trends can be predicted, but the future is unknowable, so set up a safety cushion while things are going well.
This year Google has moved faster than any year in their history (massive link warnings, massive link penalties, tighter integration of Panda & now Penguin) & the rate of change is only accelerating. Go back about 125 years and a candle wick adjuster was cutting edge technology marketed as brand spanking new:
Blekko has a decently competitive search service which they manage to run for only a few million Dollars a year. As computers get cheaper & Google collects more data, think of all the different data points they will be able to layer into their relevancy algorithms. In some markets Chrome has more market share than Internet Explorer does, & Android is another deep data source. And they can know which user data to trust most by tracking things like whether the user has a credit card or verified phone number on file & how often they use various services like Gmail or YouTube. Google+ is just icing on the cake.
You have to play by their rules, which are really restrictive. The kind of environment that we developed Google in, the reason that we were able to develop a search engine, is the web was so open. Once you get too many rules, that will stifle innovation.
He was talking about Facebook, but those words are far more applicable to Google.
A Social Experiment
In the movie The Dark Knight, the Joker ran a social experiment where he offered 2 boats full of people the opportunity to save their own lives by blowing up the other boat. The boat full of "criminals" threw the detonator overboard & the other boat also decided not to push the button.
Of course taking someone's life is more extreme than taking their livelihood, but if you do the latter it might create stress and/or other issues which in effect lead to the former. Some people who see their income disappear might have a heart attack, others might have marriages that soon fall apart, leading to a spiral of depression and substance abuse & eventually suicide. Others still might have employees that get laid off & end up heading down some of the same scary paths - through no fault of their own.
Negative SEO Goes Mainstream
Anyone who outs or link bombs smaller businesses (small enough that Google punishing them destroys their livelihood rather than just giving them a bad quarter) is a _______. Anyone who advocates outing or link bombing such businesses is an even larger _______.
With all of Google's warning messages about abnormal links they have built the negative SEO industry in a big way. In some instances those who are not good enough to compete try to harm competitors. I received emails & support tickets like the following one for years and years...
...but the rate of demand increase for such "services" has been sharp this year. Every additional warning message from Google creates additional incremental demand.
And this is where outing a competitor makes one a total and complete _______ of a human being.
A Recent (& Very Public) Example of Negative SEO
Dan Thies mentioned that it was "about time" that Google started hitting some of the splog link networks.
Anyone who knows the tiniest bit about the social sciences could predict what came next.
15th March - Dan Thies posts smug tweets to Matt Cutts and pisses off the entire internet.
18th March - seofaststart.com - blog posts started - anchor text "seo" "seo service" and "seo book"
22nd March - seofaststart.com - 1 million scrapebox blast started - 100% anchor text "Dan Thies"
26th March - Dan Thies posts on Twitter that he has received an unnatural links message.
Since then Dan has installed a new template & his rankings have tanked. Is it the template or the spam links? Probably the spam links, given how many other sites have been hit for overly focused anchor text.
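The "overly focused anchor text" pattern is easy to spot statistically. As a rough illustration (the profiles, names & numbers below are my own toy data, not anything Google has published), here is how one might measure anchor-text concentration in a backlink profile:

```python
from collections import Counter

def anchor_concentration(anchors):
    """Return the most common anchor text and its share of all backlinks."""
    counts = Counter(a.strip().lower() for a in anchors)
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, top_count / len(anchors)

# Hypothetical profiles: a natural mix of brand, URL & junk anchors
# versus a scrapebox-style blast repeating one exact-match anchor.
natural = ["SEO Fast Start", "seofaststart.com", "Dan's blog", "click here",
           "http://seofaststart.com", "seo guide", "Dan Thies", "this post"]
blasted = ["Dan Thies"] * 9 + ["seo"]

for name, profile in [("natural", natural), ("blasted", blasted)]:
    anchor, share = anchor_concentration(profile)
    print(f"{name}: top anchor {anchor!r} = {share:.0%} of links")
```

A natural profile spreads its weight across many anchors, while the blast concentrates nearly everything on one phrase - which is precisely why a 100% "Dan Thies" blast stands out, whoever actually built the links.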
Will the site stay tanked? If so, now Google's approach to anchor text & link spikes allows independent websites to get torched in a few weeks for a few Dollars.
Or will the site come back stronger than ever with the help of the spam links? If it does, then how long is it before people start "accidentally" spam blasting their own websites, posting a public case study on a forum about burning a competitor, then citing that forum thread in their reconsideration request?
If the site quickly comes back, will that be due to a manual intervention by a search engineer, or from an algorithm more advanced than some people are giving it credit for being?
When asking such questions one quickly arrives at another set of questions. Is it the web that is broken? Or is it Google's editorial approach that is broken? If the observer breaks the system they observe, then the observer is the problem.
The Bigger Issue
The bigger issue isn't the short-term trends for SEO-related keywords or Dan's site (he will be fine & rankings are not that important for sites about SEO); it is that if this can happen to a decade-old website, it can happen to literally anybody.
Piss off a ...
bitter family member
insert any classification or category you like
... and risk getting torched.
When you out someone for shady links, you can't be certain they were responsible for it. They could have had a falling out with a consultant or business partner or another competitor who wanted to hose them. Or their SEO or webmaster could have been non-transparent with them.
Then you out them & they might be toast.
White Hat, Black Hat & ________ Hat SEO
Any of the ________ who promote competitor smoking or competitor outing as somehow being "ethical" or "white hat" never bother to explain what happens to YOU when someone else does that to you.
Sketchy marketers can make just about anything look good at first glance. No matter how shiny the package in concept, it is hard to appreciate the pain until you are the one undergoing it.
Building things up is typically far more profitable than tearing things down, & if SEOs go after each other then the only winner is Google. Literally every other participant in the ecosystem has higher risk, higher costs & is taxed by the additional uncertainty. Sure, some of the conscripts might get a bit of revenue and some of the "white hat" hacks might gain incremental short-term exposure, but as the marrow is scraped out of the bone, they too will fall hard.
Google is betting that the SEO industry is full of ________. If our trade is worth being in, I hope Google is wrong! If not, you will soon see most of the quality professionals in our trade go underground, while only the hacks who misinform people & act as an unofficial extension of Google's public relations team remain publicly visible.
That might be Google's goal.
Will they be successful at it?
That depends entirely on how intelligent members of the SEO industry are.
Sorry I haven't blogged as much lately, but one of our employees recently had a child, and Google sending out so many warning messages in Webmaster Central has created a ton of demand for independent SEO advice. Our growth in demand last month was higher than in any month except the period a few years ago when we announced we would be raising prices & got so many new subscribers that I had to close off signups for 3 or 4 months.
Google has been firing on all cylinders this year. They did have a few snafus in the press, but those didn't have any noticeable impact on user perception or behavior & Google recently rolled out yet another billion Dollar business in their consumer surveys.
Google is doing an excellent job of adding friction to SEO & managing its perception to make it appear less stable, less trustworthy and to discourage investment in SEO. They send out warnings for unnatural links, warnings for traffic drops, and even warnings for traffic increases.
Any SEO company which has clients sign up for their own Webmaster Tools account now has to deal with explaining why things change, when many of the changes that happen are more driven by algorithmic shifts (adding local results to the SERPs or taking them away, other forms of localization, changing ad placement on the SERP, etc.) than by the work of the SEO. This in turn adds cost to managing SEO projects while also making them seem less stable (even for those who never used paid link networks). Think through the sequence...
Google first sends a warning for traffic going up, and the SEO tells the client that this is because they did such a great job with SEO.
Then Google sends a warning for traffic dropping & the client worries that something is wrong.
The net impact on actual traffic or conversions could be zero, but the warnings amplify the perception of change.
Any SEO who doesn't use Webmaster Tools loses search referral data. It first started with logged in Google users, but apparently it is also headed to Firefox. Who's to say Google Chrome & Safari won't follow Firefox at some point?
Google has changed & obfuscated so many things that it is very hard to isolate cause and effect. They have made changes to how much data you get, changes to their analytics interface & how they report unique visitors, changes to how tightly they filter certain link behaviors, they have rolled in frequent Panda updates, and they have nailed a number of the paid link networks.
Who wins from all this fear & uncertainty? Google, since they grant themselves more editorial leeway. If everyone is a scofflaw then they can hit just about anyone they want. And the organic search results are going to be far easier to police if many market participants are held back by a fear tax.
Larger businesses which are harder to justify hitting & which can buy out smaller businesses at lower multiples based on the perception of fear.
Sites which were outranked by people using the obvious paid links, which now rank a bit better after some of those paid link buyers were removed from the search results.
And who loses?
some of the paid link networks & those who used them for years
under-priced SEO service providers who were only able to make the model work by scaling up on risk
smaller businesses who are not particularly spammy, but are so paralyzed by fear that they won't put in enough effort & investment to compete in the marketplace
One of the reasons I haven't advocated using the paid link networks is I was afraid of putting the associated keywords into a hopper of automated competition that I would then have to compete against year after year. Even if you usually win, over the course of years you can still lose a lot of money by promoting the creation of disposable, automated & scalable competing sites. If you don't mind projects getting hit & starting over the ROI on such efforts might work out, but after so many years in the industry the idea of starting over again and again as sites get hit is less appealing.
We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Google Search Quality Team
If that doesn't change then negative SEO will become a bigger issue than paid links ever were.
What is hard about Google penalizing websites for such links is that it is cheap & easy for someone else to set you up. Shortly after Dan Thies mentioned that it was "about time" to Matt Cutts on Twitter someone started throwing some of the splog links at his site. It is safe to say that Dan didn't build those links, but there are many people who will be in the same situation as Dan who did nothing wrong but had a competitor set them up.
And there is no easy way to disconnect your site from those types of links.
If you go back a few years, it was quite easy to win at SEO by doing it in a "paint by number" fashion. One rarely got hit unless they were exceptionally excessive and stuck out like a sore thumb.
But after all of Google's recent moves, a few missed steps in a drunken stupor can have the same result.
Consider what Google does with the Yahoo! Directory pages it re-hosts:
the page carries a disclaimer that it is not endorsed by Google
the page embeds a Google search box
the page strips out the Yahoo! Directory search box
the page strips out the Yahoo! Directory PPC ads (on the categories which have them)
the page strips out the Yahoo! Directory logo
Recall that when Google ran their bogus sting operation on Bing, Google engineers suggested that Bing was out of line for using user clickstreams to potentially influence its search results. That level of outrage & the smear PR campaign look ridiculous when compared against Google's behavior toward the Yahoo! Directory, which is orders of magnitude worse:
Compare Bing vs Google with Google vs the Yahoo! Directory:

Bing: uses user experience across a wide range of search engines to potentially impact a limited number of search queries in a minor way.
Google: shags expensive hand-created editorial content wholesale & hosts it on Google.com.

Bing: hosts Bing search results using Bing snippets.
Google: hosts Yahoo! Directory results using Yahoo! Directory listing content & keeps all the user data.

Bing: publicly claimed for years to be using a user-driven search signal based on query streams.
Google: removes the Yahoo! Directory logo when formatting the page. Does Google remove the Google logo from Google.com when formatting for mobile? Nope.

Bing: sells their own ads & is not scraping Google content wholesale.
Google: scrapes Yahoo! Directory content wholesale & strips out the sidebar CPC ads.

Bing: puts their own search box on their own website.
Google: puts their own search box on the content of the Yahoo! Directory.

Bing: Google claimed that Bing was using "their data" when tracking end user behavior.
Google: hosts the Yahoo! Directory page, allowing themselves to fully track user behavior, while robbing Yahoo! of the opportunity to even see their own data on how users interact with their own listings.
In the above case the publisher absorbs 100% of the editorial cost & Google absorbs nearly 100% of the benefit (while disclaiming they do not endorse the page they host, wrap in their own search ad, and track user behavior on).
As we move into a search market where the search engines give you a slightly larger listing for marking up your pages with rich snippets, you will see a short term 10% or 20% lift in traffic followed by a 50% or more decline when Google enters your market with "instant answers."
The ads remain up top & the organic results get pushed down. It isn't "scraping" if they get 10 or 20 competitors to do it & then use the aggregate data to launch a competing service ... talk to the bankrupt Yellow Pages companies & ask them how Google has helped to build their businesses.
Update: looks like this has been around for a while...though when I spoke to numerous friends nobody had ever seen it before. The only reason I came across it was seeing a referrer through a new page type from Google & not knowing what the heck it was. Clearly this search option doesn't get much traffic because Google even removes their own ads from their own search results. I am glad to know this isn't something that is widespread, though still surprised it exists at all given that it effectively removes monetization from the publisher & takes the content wholesale and re-publishes it across domain names.
Since it took me a few hours to put together my SMX presentation I figured it was worth sharing that information on the blog as well. This post will discuss examples of how Google has dialed up their brand bias over time & points to where Google may be headed in the future.
Note that I don't have anything against them promoting brands, I just think it is dishonest to claim they are not.
Against All Odds
When analyzing Google's big-brand bias the question is not "do some small sites manage to succeed against all odds" but…
What are the trends?
What are the biases?
Eric Schmidt once stated that "Brands are the solution, not the problem. Brands are how you sort out the cesspool. Brand affinity is clearly hard wired."
We have a fear of the unknown. Thus that which we have already experienced is seen as less risky than something new & different. This is a big part of why & how cumulative advantage works - it lowers perceived risk.
A significant portion of brand-related searches are driven by offline advertising. When a story becomes popular in the news, people look online to learn more. The same sort of impact can be seen with ads - from infomercials to Super Bowl ads. Geico alone spends nearly a billion Dollars per year on advertising, & Warren Buffett mentioned that 3/4 of their quotes come from the internet.
Some of the most profitable business models are built off of questionable means.
Many big brands are owned by conglomerates with many horses in the race. When one gets caught doing something illegal they close it down or sell off the assets & move to promote their parallel projects more aggressively.
If things aligned with brands become relevancy signals then to some degree those measure longevity & size of a company (and their ad budget) rather than the quality of their offering.
Companies with a high page rank are in a strong position to move into new markets. By “pointing” to this new information from their existing sites they can pass on some of their existing search engine aura, guaranteeing them more prominence.
Google’s Mr Singhal calls this the problem of “brand recognition”: where companies whose standing is based on their success in one area use this to “venture out into another class of information which they may not be as rich at”. Google uses human raters to assess the quality of individual sites in order to counter this effect, he adds.
Since Panda, Overstock has moved into offering ebooks & insurance quotes, while companies like Barnes & Noble run affiliate listings for rugs.
As an example of the above trend gone astray, my wonderful wife recently purchased me a new computer. I was trying to figure out how to move over some user databases (like our Rank Checker & Advanced Web Ranking) and in the search results were pages like this one:
The problems with the above are:
actual legitimate reviews get pushed down by such filler
the business model behind doing such actual reviews gets eroded by the automated syndicated reviews
outside of branding & navigation the content is fully syndicated
that particular page is referencing the 2005 version of the software, so the listed price is wrong & the feature set has changed a lot in the last 7 years
Such scrape-n-mash content strategies by large brands are not uncommon. Sites like Answers.com can quickly add a coupons section, sites like FindTheBest can create tens of millions of automated cross-referencing pages that load a massive net of related keywords below the fold, news sites can create auto-generated subdomains of scraped content, etc.
Eric Schmidt highlighted FindTheBest publicly as an example of a successful vertical search play. That site was launched by an ex-Googler, but if I did the same thing you can be certain that the only way Google would highlight it publicly would be as a "type of spam."
The issue with broadly measuring user experience is that I am still going to visit Yahoo! Sports repeatedly even if my experience on Yahoo! Downloads is pretty crappy. A site which is a market leader in one niche can take those signals to launch a "me too" service in other parallel markets & quickly dominate the market.
Potential Brand Signals
When attempting to debunk the concept of "brand bias" some people claim that it would be ridiculous for Google to have a list of brands that get an across-the-board boost. Of course that is knocking down a straw man position nobody ever stated publicly.
However, some of Google's old rater documents *did* have certain sites whitelisted & Google's Scott Huffman once wrote the following:
At a [search] quality level, we have something similar. On a continuous basis in every one of our data centers, a large set of queries are being run in the background, and we’re looking at the results, looking up our evaluations of them and making sure that all of our quality metrics are within tolerance.
These are queries that we have used as ongoing tests, sort of a sample of queries that we have scored results for; our evaluators have given scores to them. So we’re constantly running these across dozens of locales. Both broad query sets and navigational query sets, like “San Francisco bike shop” to the more mundane, like: Here’s every U.S. state and they have a home page and we better get that home page in the top results, and if we don’t … then literally somebody’s pager goes off.
(Outside of some fraternal Google properties) the algorithm isn't hardcoded to rank sites x & y at #1, but if some sites don't rank for certain queries it does cause an alert to be sent out.
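Huffman's description amounts to an alerting loop over a pre-scored set of navigational queries. A minimal sketch of that idea (the function names, the toy index & the pager hook below are all hypothetical illustrations, not Google's actual system):

```python
def check_navigational_queries(run_query, expected, top_n=10):
    """For each scored navigational query, verify the expected home page
    still appears in the top results; return the failures to alert on."""
    failures = []
    for query, expected_domain in expected.items():
        results = run_query(query)[:top_n]
        if not any(expected_domain in url for url in results):
            failures.append((query, expected_domain))
    return failures

# A stand-in for a live index, covering one scored query:
def fake_search(query):
    index = {"california state government": ["http://www.ca.gov/",
                                             "http://example.com/"]}
    return index.get(query, [])

expected = {"california state government": "ca.gov"}
alerts = check_navigational_queries(fake_search, expected)
if alerts:
    # in Huffman's words, this is where "somebody's pager goes off"
    print("ALERT:", alerts)
```

Note that this is monitoring, not hardcoding: nothing forces ca.gov to rank, but its absence from the top results triggers human attention - which is exactly the kind of soft brand protection the surrounding discussion is about.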
Google has a wide host of quality-based metrics they could look at and analyze when determining if something gets a brand boost, gets ignored, or gets hit by an algorithm like Panda.
If you search for "fishing gear" and then click their Bass Shop refinement link in the search results, you are directly creating that "search funnels" relevancy signal. Even if you don't click the link, the exposure to the term may make you remember it and search for it later.
When some small bloggers were selling paid links to K-Mart as part of a "sponsored conversations" outreach, Matt Cutts equated the practice to selling bogus solutions to brain cancer & stated: "Those blogs are not trusted in Google's algorithms any more."
Google also started sending webmasters automated messages for bad links pointing at their sites:
Dear site owner or webmaster of domain.com, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines.
We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
So if you run a big site & they automatically detect paid links they generally just ignore those links and leave your site alone. If you are a small site & they automatically detect paid links they may decide to automatically penalize your site.
Same offense, entirely different outcome.
Is cloaking evil?
Once again, it depends on who is doing it.
I have a Vistaprint Visa card (so I could get a credit card with our dog's picture on it) and one of the pages that was ranking for Vistaprint Visa was the Singapore Groupon website.
The page forces a pop up and you can't do anything on that page (view the content, scroll around the site, etc.) other than filling in the lead generation form or logging into an existing account. I would never try that because I know I would get smoked for it. ;)
After the first iteration of the Google Panda update Google allowed users to vote to block websites. Experts Exchange was hated among some programmers in part because they used scroll cloaking. That in turn got their site hit by the second Panda update.
Smaller webmasters who ran networks of sites in some cases got hit with "doorway page" penalties simply for owning networks of sites registered in Google Webmaster Tools, even if each site was a full-fledged ecommerce website.
Is content farming evil?
Once again, it depends on who is doing it (and where it is hosted).
Another thing that is interesting about the content farms and the alleged need for the Panda algorithm was that in spite of flagrant editorial violations by both eHow and Mahalo, Google didn't smoke them until it could be done "algorithmically."
On the flip side of the above, in some cases Google has chosen to keep smaller webmasters penalized because of content that was on their site at some point months in the past!
A couple of weeks after that aggressive promotional integration, Amit Singhal stated: "The overall takeaway that I have in my mind is that people are judging a product and an overall direction that we have in the first two weeks of a launch, where we are producing a product for the long term."
The problem with building preferential rankings first & increasing quality later is that it is the exact opposite of what Google is asking publishers to do with algorithms like Panda. Worse yet, Google not only does this integration when you are logged in, but also shows it on obscure longtail advanced queries when you are not logged in.
In Google's remote rater documents they suggested that hotel affiliate sites be marked as spam, even if they are helpful.
On Google's reconsideration request form they also stated: "In general, sites that directly profit from traffic (e.g. search engine optimizers, affiliate programs, etc.) may need to provide more evidence of good faith before a site will be reconsidered."
The broken piggy bank in the above cycle highlights the break that exists in the process of building a big brand. It is quite hard to have any level of certainty in the search ecosystem with an algorithm like Panda. Without that certainty, companies must build on low-cost structures, but that very constraint makes them more likely to get hit by an algorithm or a search engineer.
Being an entrepreneur is all about taking smart calculated bets & managing risk. However as search engines become closed off portals that compete with (& exclude) publishers, there are so many unknowns that estimating risk is exceptionally challenging.
CustomMade is a Google-funded start-up launched by an SEO who purchased an old website & created a vertical directory out of it (just like TeachStreet was trying to do, but in a different vertical). Googlers helped with the project, & the article highlighting it shared this quote: "Having Google as an investor gives you a branding piece that you can't ignore." - Christopher Ahlberg.
Penalties: How Hard Were They Hit?
Years ago, when BMW or Wordpress.org got caught spamming aggressively, they were back in good graces in a matter of days.
About the only times well known (non-affiliate) sites have been penalized for a significant duration was when JC Penney & Overstock.com were hit. But that happened around the time of the Panda fiasco & Google had incentive to show who was boss. When the flower sites were outed for massive link buying that was ignored because Google had already rolled out Panda & reasserted the perception of their brand.
In 2009 Google banned over 30,000 affiliates from the AdWords auction. In some cases the problem was not with a current ad (or even a landing page the advertiser controlled), but with ads that ran years ago promoting third-party products. In some cases Google changed their AdWords TOS after the fact, applying it retroactively. Google won't allow some of these advertisers to advertise unless they fix the landing page, but if they don't control the landing page they can never fix the problem. Making things worse, to this day Google still suggests affiliates do direct linking. But if the company they promote gets bought out by someone too aggressive, that affiliate could be facing a lifetime ban through no fault of their own.
In Australia a small travel site had a similar issue with AdSense. The only way they were able to get a reconsideration was to lodge a formal complaint with regulators. If that is how Google treats their business partners, it colors how they view non-business partners who monetize traffic without giving Google a taste of the revenues.
Why Does Google Lean Into Brand?
Minimize legal risks: if Google hits small businesses, almost nobody will see, notice, or care, but big businesses are flush with cash and political connections. When Google hits big businesses, those businesses create organizations & movements like FairSearch & Search Neutrality.
Minimize duplication: some small businesses & affiliates simply repeat offers that exist on larger merchant sites. That said, many big businesses buy out a 2nd, 3rd, 4th, or even 5th site in a vertical to have multiple placements in the search results.
Better user experience: the theory is that the larger sites have more data and capital to improve user experience, but they don't always do it.
Business partnerships: if Google wants to strike up closed door business partnerships with big business then some of those negotiations will have specific terms attached to them. It costs Google nothing to give away part of the organic results as part of some custom deals. If Google wants to sell TV ads & run a media streaming device they need to play well with brands.
CPA-based product ads: on some searches Google provides CPA-based product ads above the search results. It makes sense for Google to promote those who are buying their ads to get the best relationships possible.
Fewer people tasting the revenues: the fewer organizations an ecosystem needs to support the more of the profits from that ecosystem that can be kept by the manager.
More complete ad cycle: if Google caters to direct response advertisers, they get to monetize the fulfillment of demand, but that is only a small slice of the complete ad cycle. If Google caters to brands, they get to monetize (directly or indirectly) every piece of the ad cycle. For example, buying display ads helps build brand searches, which helps create brand signals. In such a way, improved rankings in the organic results subsidize ad buying.
Brands buying their equity: Google has created exceptionally large ad units & has convinced many brands to buy back their own pre-existing brand equity.
Lack of Diversity
The big issue with brand bias is that a lot of the same *types* of companies rank with roughly similar consumer experiences. If there is a mix of large and small businesses that rank then many of those small businesses will be able to differentiate their offering by adding services to their products, doing in-depth reviews, and so on.
Sure Zappos is a big company known for customer service, but how different is the consumer-facing experience if I click on Target.com or Walmart.com? Sure the text on the page may be slightly different, but is there any real difference beyond aesthetic? Further, a lot of the business models built around strong in-depth editorial reviews & comparisons are eroded by the current algorithms. If the consumer reviews are not good enough, then tough luck!
Do Brands Always Provide a Better User Experience?
For decades, Target has collected vast amounts of data on every person who regularly walks into one of its stores. Whenever possible, Target assigns each shopper a unique code — known internally as the Guest ID number — that keeps tabs on everything they buy. "If you use a credit card or a coupon, or fill out a survey, or mail in a refund, or call the customer help line, or open an e-mail we've sent you or visit our Web site, we'll record it and link it to your Guest ID," Pole said. "We want to know everything we can."
Many big media companies provided watered-down versions of their content online because they didn't want to cannibalize their offline channels. Likewise, some large stores may treat their website as an afterthought. When I wanted to order my wife a specific shoe directly from the brand, they didn't have customer support open for extended hours during the holidays and their shopping cart kept throwing an error. Since they *are* the brand, that brand strength allows them to get away with other issues that need fixing.
Some of those same sites carry huge AdSense ad blocks on their category pages & have funky technical issues which act like doorway pages, forcing users (on any browser) through the homepage if they land on a deep page.
Missing the Target indeed.
That "screw you" redirect error above has literally been going on for weeks now, with Target's webmaster asleep at the wheel. Perhaps they want you to navigate their site by internal search so they can track every character you type.
Riding The Waves
With SEO many aggressive techniques work for a period of time & then suddenly stop working. Every so often there are major changes like the Florida update & the Panda update, but in between these there are other smaller algorithmic updates that aim to fill in the holes until a big change comes about.
No matter what Google promotes, they will always have some gaps & relevancy issues. Some businesses that "ignore the algorithms and focus on the user" are likely to run on thinner margins than those who understand where the algorithms are headed. Those thin margins can quickly turn negative if either Google enters your niche or top competitors keep reinvesting in growth to buy more marketshare.
Given the above pattern - where trends spread until they get hit hard - those who quickly figure out where the algorithms are going & where there are opportunities have plenty of time to monetize their efforts. Whereas if you have to wait until things are widely spread on SEO blogs as common "tricks of the trade" or wait until a Google engineer explicitly confirms something then you are likely only going to be adopting techniques and strategies after most of the profit potential is sucked out of them, just before the goal posts move yet again.
People who cloned some of the most profitable eHow articles years ago had plenty of time to profit before the content farm business model got hit. Those who waited until Demand Media spelled their business model out in a Wired article had about 1.5 years until the hammer dropped. Those who waited until the content farm controversy started creating a public relations issue may have had only a couple of months of enhanced revenues before their site got hit & was worse off than before they chased the algorithm late in the game.
Ride The Brand
If Google does over-represent established branded websites in their algorithms then in many cases it will be far easier to rank a Facebook notes page or a YouTube video than to try to rank a new site from scratch. There are a ton of web 2.0 sites driven by user generated content.
In addition to those sorts of sites, also consider participating in industry websites in your niche & buying presell pages on sites that rank especially well.
Collecting (& Abusing) User Data
Google has been repeatedly branded as being a bit creepy for their interest in user tracking.
Collecting that data & using it for ad targeting can have profound personal implications. Think of serving a girl with anorexia weight-loss ads everywhere she goes online, simply because she once clicked such an ad; in that case Google reinforces a warped worldview. Then, when the person needs counseling, Google can recommend a service provider there as well. ;)
Throughout the history of the web there will be many cycles between open and closed ecosystems. Currently we are cycling toward closed silos (Apple, Amazon, Google, Facebook). As these silos become more closed off they will end up leaving gaps that create new opportunities.
While on one front Google keeps making it easier for brands to compete against non-brands, Google also keeps clawing back a bigger slice of that branded traffic through larger AdWords ad units & integration of listings from services like Google+, which can in some cases outrank the actual brand.
Google has multiple platforms (Android Marketplace, Chrome Marketplace, Enterprise Marketplace) competing against iTunes. Google recently decided to merge some of their offerings into Google Play. In addition to games, music & books, Play will soon include audiobooks, magazines & other content formats.
Having a brand & following will still be important for commanding premium rates and fatter margins, building non-search distribution (which can be used to influence the "relevancy" signals), and helping overturn manual editorial interventions. But algorithmically, brand emphasis will peak in the next year or two as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets. (Recall how Google came up with their QDF algorithm only *after* Google Finance wasn't able to rank.) At that point Google will push their own verticals more aggressively & launch some aggressive public relations campaigns about helping small businesses succeed online.
Once Google is the merchant of record, almost everyone is just an affiliate, especially in digital marketplaces with digital delivery.