Here are a few common examples of unnatural links that violate our guidelines: … links with optimized anchor text in articles or press releases distributed on other sites.
For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text are “optimized” and therefore a violation of Google’s guidelines.
Ambiguously speaking, of course.
To put the press release change in context, Google’s guidelines state:
Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.
So, links gained for SEO purposes - that is, links intended to manipulate ranking - are against Google’s guidelines.
In this chat, Google’s John Mueller says that if the webmaster initiated it, then it isn’t a natural link. If you want to be on the safe side, John suggests using nofollow on such links.
Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to stay on the safe side of Google’s guidelines, the webmaster should nofollow the links.
Simple, right? If it really is a press release, and not an attempt to build links for SEO purposes, then why would a webmaster have any issue with adding a nofollow to a link?
But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.
They're missing the point.
Am I Taking Google’s Side?
Not taking sides.
Just hoping to shine some light on a wider issue.
If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but one sadly lacking in much SEO punditry - that Google is not on the webmaster’s side. Google is on Google’s side. Google often say they are on the users’ side, and there is certainly some truth in that.
However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more “aggressive” webmasters, Google’s behaviour could be characterized as “keep your friends close, and your enemies closer”.
This is because some webmasters, namely SEOs, don’t just publish content for users; they compete with Google’s revenue stream. SEOs offer a competing service to click-based advertising, one that provides exactly the same benefit as Google’s golden goose, namely qualified click traffic.
If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.
Why don’t Google simply ignore press release links? Easy enough to do. Why go the route of making it public? After all, Google are typically very secretive about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.
Big missiles heading your way.
The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.
If an SEO is not following the guidelines - guidelines that are always shifting - yet claims to, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:
Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO
…but it’s not unreasonable to expect that a somewhat easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed it.
SEO is not the easiest career choice, huh.
One group that is likely to be happy about this latest Google push is legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:
I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.
Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results
However, I’m not sure that will mean press releases are seen as any more credible, as press releases never enjoyed a stellar reputation pre-SEO, but it may thin the crowd somewhat, which increases an agency’s chances of getting their client seen.
Guidelines Homing In On Target
One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:
we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:
Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the article provide original content or information, original reporting, original research, or original analysis?
Does the page provide substantial value when compared to other pages in search results?
How much quality control is done on content?
….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.
It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.
Take a look at the Boston Globe, sold - along with a boatload of content - at a 93% decline from its purchase price. Quality content, sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we’re pretty sure that isn’t a content play, either.
High quality content often has a high upfront production cost attached to it. Given measly web advertising rates, the high possibility of invisibility, and the risk of content being scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?
Google can move the goalposts whenever they like. What you’re doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.
Do you have a contingency plan for such an eventuality?
The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic.
To ask about the minutiae of Google’s policies and guidelines is to miss the point. The real question is: how prepared are you for when Google shuts off your flow of traffic because they’ve moved the goalposts?
Focusing on the minutiae of Google’s policies is, indeed, to miss the point.
This is a question of risk management. What happens if your main site, or your client’s site, falls foul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don’t rank, isn’t that effectively the same as a ban, i.e. you’re invisible? Do you treat search traffic as a bonus, rather than the main course?
Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.
Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).
What hasn’t received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse by the market monopoly in search.
According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile click values are only 22% of desktop click values. Until Google launched enhanced AdWords campaigns, they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers are opted into mobile ads (and have to jump through some tricky hoops to figure out how to disable them), Google has every incentive to promote what a big growth channel mobile search is for their business.
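To see what those RKG ratios imply, here is a back-of-the-envelope sketch. The two ratios are the figures quoted above; the "value per dollar" framing is my own reading of them, not RKG's:

```python
# Back-of-the-envelope math on the RKG figures quoted above.
# Both ratios are relative to desktop; the framing is illustrative only.
mobile_cpc_ratio = 0.60    # mobile click price as a fraction of desktop
mobile_value_ratio = 0.22  # mobile click value as a fraction of desktop

# Value delivered per advertising dollar on mobile, relative to desktop:
relative_efficiency = mobile_value_ratio / mobile_cpc_ratio

print(round(relative_efficiency, 2))  # 0.37 -- roughly a third of desktop's return
```

In other words, if the quoted figures hold, a mobile ad dollar returns only about 37% of what a desktop ad dollar does - which is exactly why Google would rather talk about mobile's growth than its economics.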
Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).
Hiding The Value of SEO
Google is not only hiding half of their own keyword referral data, but they are hiding so much more than half that even when you mix in Bing and Yahoo! you still get over 50% of the total hidden.
Google's 86% of the 26,233 searches is 22,560 searches.
Keyword (not provided), shown for 13,413 visits, is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
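The arithmetic is easy to reproduce from the visitor counts reported above:

```python
# Reproducing the hidden-keyword arithmetic from the figures above.
total_search_visits = 26233   # search visitors over the 4-day sample
google_share = 0.86           # Google's share of search referrals
not_provided = 13413          # visits shown as keyword (not provided)

google_visits = round(total_search_visits * google_share)
hidden_share = not_provided / google_visits

print(google_visits)               # 22560 visits from Google
print(round(hidden_share * 100))   # 59 -- percent of Google keyword data hidden
```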
Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.
When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.
Subsidizing Scammers Ripping People Off
A number of the low-end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over the time period. Looking at the backlinks, nothing there either.
So what happened?
Well, when keyword data isn’t shown, it is pretty easy for someone to run a clickbot that shows up as keyword (not provided) Google visitors & claim that they were "doing SEO."
And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.
Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.
This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.
Anti-competitive Funding Blocking Competing Ad Networks
sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG
Turn AdBlock Plus on & search for credit cards on Google and get ads.
Do that same search over at Bing & get no ads.
How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?
Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.
Buying Browser Marketshare
Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.
If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.
When Google does it, it's "good for the user."
Vampire Sucking The Lifeblood Out of SEO
Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Login to Chrome & searchers don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."
What do we call an entity that considers something "its lifeblood" while sucking it out of others?
“Search engine optimization” has always been an odd term as it’s somewhat misleading. After all, we’re not optimizing search engines.
SEO came about when webmasters optimized websites. Specifically, they optimized the source code of pages to appeal to search engines. The intent of SEO was to ensure websites appeared higher in search results than if the site was simply left to site designers and copywriters. Often, designers would inadvertently make sites uncrawlable, and therefore invisible in search engines.
But there was more to it than just enhancing crawlability.
SEOs examined the highest ranking page, looked at the source code, often copied it wholesale, added a few tweaks, then republished the page. In the days of Infoseek, this was all you needed to do to get an instant top ranking.
I know, because I used to do it!
At the time, I thought it was an amusing hacker trick. It also occurred to me that such positioning could be valuable. Of course, this rather obvious truth occurred to many other people, too. A similar game had been going on in the Yahoo Directory, where people named sites “AAAA...whatever” because Yahoo listed sites in alphabetical order. People also used to obsessively track spiders, spotting fresh spiders (Hey Scooter!) as they appeared and... cough... guiding them through their websites in a favourable fashion.
When it comes to search engines, there’s always been gaming. The glittering prize awaits.
The new breed of search engines made things a bit more tricky. You couldn’t just focus on optimizing code in order to rank well. There was something else going on.
So, SEO was no longer just about optimizing the underlying page code, SEO was also about getting links. At that point, SEO jumped from being just a technical coding exercise to a marketing exercise. Webmasters had to reach out to other webmasters and convince them to link up.
A young upstart, Google, placed heavy emphasis on links, making use of a clever algorithm that sorted “good” links from, well, “evil” links. This helped make Google’s result set more relevant than other search engines. Amusingly enough, Google once claimed it wasn’t possible to spam Google.
Webmasters responded by spamming Google.
Or, should I say, Google likely categorized what many webmasters were doing as “spam”, at least internally, and may have regretted their earlier hubris. Webmasters sought links that looked like “good” links. Sometimes, they even earned them.
And Google has been pushing back ever since.
Building links pre-dated SEO, and search engines, but once backlinks were counted in ranking scores, link building was blended into SEO. These days, most SEOs consider link building a natural part of SEO. But, as we’ve seen, it wasn’t always this way.
We sometimes get comments on this blog about how marketing is different from SEO. Well, it is, but if you look at the history of SEO, there has always been marketing elements involved. Getting external links could be characterized as PR, or relationship building, or marketing, but I doubt anyone would claim getting links is not SEO.
More recently, we’ve seen a massive change in Google. It’s a change that is likely being rolled out over a number of years. It’s a change that makes a lot of old school SEO a lot less effective in the same way introducing link analysis made meta-tag optimization a lot less effective.
My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.
Google is likely applying different algorithms to different sectors, so the SEO tactics used in one sector don’t work in another. They’re also looking at engagement metrics, so they’re trying to figure out if the user really wanted the result they clicked on. When you consider Google’s work on PPC landing pages, this development is obvious. It’s the same measure. If people click back often, or too quickly, then the landing page quality score drops. This is likely happening in the SERPs, too.
So, just like link building once got rolled into SEO, engagement will be rolled into SEO. Some may see that as the death of SEO, and in some ways it is, just like when meta-tag optimization, and other code optimizations, were deprecated in favour of other, more useful relevancy metrics. In other ways, it’s SEO just changing like it always has done.
The objective remains the same.
Deciding On Strategy
So, how do you construct your SEO strategy? What will be your strategy going forward?
Some read Google’s Webmaster Guidelines. They'll watch every Matt Cutts video. They follow it all to the letter. There’s nothing wrong with this approach.
Others read Google’s Guidelines. They'll watch every Matt Cutts video. They read between the lines and do the complete opposite. Nothing wrong with that approach, either.
It depends on what strategy you've adopted.
One of the problems with letting Google define your game is that they can move the goalposts anytime they like. The linking that used to be acceptable, at least in practice, often no longer is. Thinking of firing off a press release? Well, think carefully before loading it with keywords:
This is one of the big changes that may have not been so clear for many webmasters. Google said, “links with optimized anchor text in articles or press releases distributed on other sites,” is an example of an unnatural link that violates their guidelines. The key are the examples given and the phrase “distributed on other sites.” If you are publishing a press release or an article on your site and distribute it through a wire or through an article site, you must make sure to nofollow the links if those links are “optimized anchor text.”
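In markup terms, the fix the quote describes is a one-attribute change to the anchor tag. A naive sketch (the URL and anchor text are hypothetical, and real HTML should be edited with a proper parser rather than a regex):

```python
import re

def add_nofollow(anchor_html: str) -> str:
    """Naively add rel="nofollow" to a single <a> tag if it has no rel
    attribute. Illustrative only -- use an HTML parser in real code."""
    if 'rel=' in anchor_html:
        return anchor_html
    return re.sub(r'<a\s', '<a rel="nofollow" ', anchor_html, count=1)

# A keyword-optimized press release link, before and after:
link = '<a href="https://example.com/rings">best wedding rings</a>'
print(add_nofollow(link))
# <a rel="nofollow" href="https://example.com/rings">best wedding rings</a>
```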
Do you now have to go back and unwind a lot of link building in order to stay in their good books? Or perhaps you conclude that links in press releases must work a little too well, else Google wouldn’t be making a point of it. Or that Google is running a cunning double-bluff, hoping you’ll spend a lot more time on things you think Google does or doesn’t like, but which Google doesn’t actually care about at all, as they’ve found a way to mitigate it.
Bulk guest posting was also called out in Google’s webmaster guidelines as a no-no, along with keyword-rich anchors in article directories. Even the way a site monetizes - doing things like blocking the back button - can be considered “deceptive” and grounds for banning.
How about the simple strategy of finding the top ranking sites, doing what they do, and adding a little more? Do you avoid saturated niches, and aim for the low-hanging fruit? Do you try to guess all the metrics and make sure you cover every one? Do you churn and burn? Do you play the long game with one site? Are social media and marketing part of your game, or do you leave these aspects out of the SEO equation? Is your currency persuasion?
Think about your personal influence and the influence you can manage without dollars or gold or permission from Google. Think about how people throughout history have sought karma, invested in social credits, and injected good will into their communities, as a way to “prep” for disaster. Think about it.
We may be “search marketers” and “search engine optimizers” who work within the confines of an economy controlled (manipulated) by Google, but our currency is persuasion. Persuasion within a market niche transcends Google.
It would be interesting to hear the strategies you use, and whether you plan on using a different strategy going forward.
The default amount of useful information offered by the new layout is less than the old layout provided, while requiring user interaction with the result set to get the information users want. You get a picture, but the only way to get the phone number into the result set is to click into that result set or conduct a branded query from the start.
If you search for a general query (say "Indian restaurants") and want the phone number of a specific restaurant, you will likely need to click on that restaurant's picture in order to shift the search to that restaurant's branded search result set, pulling their phone number & other information into the page. In that way Google is able to better track user engagement & enhance personalization on local search. When people repeatedly click into the same paths from logged-in Google user accounts, Google can put weight on the end user behavior.
This multi-click process not only gives Google usage data to refine rankings with, but it also will push advertisers into buying branded AdWords ads.
Where this new result set is a bit of a train wreck for navigational searches is when a brand is fairly generic & aligned with a location as part of the business name. For instance, in Oakland there is a place named San Francisco Pizza. Even if you do that branded search, you still get the carousel & there might also be three AdWords ads above the organic search results.
Some of Google's other verticals may appear above the organic result set too. When searching for downtown Oakland hotels they offer listings of hotels in San Francisco & Berkeley inside the hotel onebox.
Perhaps Google can patch together some new local ad units that work with the carousel to offer local businesses a flat-rate monthly ad product. A lot of advertisers would be interested in testing a subscription product that enabled them to highlight selected user reviews and include other options like ratings & coupons & advertiser control of the image. As the search result set becomes the destination some of Google's ad products can become much more like Yelp's.
In the short term the new layout is likely a boon for Yelp & some other local directory plays. Whatever segment of the search audience dislikes the new carousel will likely be shunted into many of these other local directories.
In the long run some of these local directories will be the equivalent of MapQuest. As Google gains confidence, they will make their listings richer & eventually displace the result set entirely. The following search isn't a local one, but is a good example of where we may be headed. Even though the search is set to "web" results (rather than "video" results), the first 9 listings are from YouTube.
Actually, they’re not words, they’re acronyms, but you get my drift, I’m sure :)
It must be difficult for SEO providers to stay on the “good and pure” side of SEO when the definitions are constantly shifting. Recently we’ve seen one prominent SEO tool provider rebrand as an “inbound marketing” tools provider and it’s not difficult to appreciate the reasons why.
SEO, to a lot of people, means spam. The term SEO is lumbered, rightly or wrongly, with negative connotations.
Consider email marketing.
Is all email marketing spam? Many would consider it annoying, but obviously not all email marketing is spam.
There is legitimate email marketing, whereby people opt-in to receive email messages they consider valuable. It is an industry worth around $2.468 billion. There are legitimate agencies providing campaign services, reputable tools vendors providing tools, and it can achieve measurable marketing results where everyone wins.
Yet, most email marketing is spam. Most of it is annoying. Most of it is irrelevant. According to a Microsoft security report, 97% of all email circulating is spam.
So, only around 3% of all email is legitimate. 3% of email is wanted. Relevant. Requested.
One wonders how much SEO is legitimate? I guess it depends what we mean by legitimate, but if we accept the definition I’ve used - “something relevant wanted by the user” - then, at a guess, I’d say most SEO these days is legitimate, simply because being off-topic is not rewarded. Most SEOs provide on-topic content, and encourage businesses to publish it - free - on the web. If anything, SEOs could be accused of being too on-topic.
The proof can be found in the SERPs. A site is requested by the user. If a listed site matches their query, then the user probably deems it to be relevant. They might find that degree of relevance, personally, to be somewhat lacking, in which case they’ll click back, but we don’t have a situation where search results are rendered irrelevant by the presence of SEO.
Generally speaking, search appears to work well in terms of delivering relevance. SEO could be considered cleaner than email marketing in that SEOs are obsessed with being relevant to a user. The majority of email marketers, on the other hand, couldn't seem to care less about what is relevant, just so long as they get something, anything, in front of you. In search, if a site matches the search query, and the visitor likes it enough to register positive quality metrics, then what does it matter how it got there?
It probably depends on whose business case we’re talking about.
Matt Cutts has released a new video on Advertorials and Native Advertising.
Matt makes a good case. He reminds us of the idea on which Google was founded, namely citation. If people think a document is important, or interesting, they link to it.
This idea came from academia. The more an academic document is cited, and cited by those with authority, the more relevant that document is likely to be. Nothing wrong with that idea, however some of the time, it doesn’t work. In academic circles, citation is prone to corruption. One example is self-citation.
But really, excessive self-citation is for amateurs: the real thing is forming a “citation cartel”, as Phil Davis from The Scholarly Kitchen puts it. In April this year, after receiving a “tip from a concerned scientist”, Davis did some detective work using the JCR data and found that several journals published reviews citing an unusually high number of articles fitting the JIF window from other journals. In one case, the Medical Science Monitor published a 2010 review citing 490 articles, 445 of them published in 2008-09 in the journal Cell Transplantation (44 of the other 45 were for articles from Medical Science Monitor published in 2008-09 as well). Three of the authors were Cell Transplantation editors.
So, even in academia, self-serving linking gets pumped and manipulated. When this idea is applied to the unregulated web, where there are vast sums of money at stake, you can see how citation very quickly changes into something else.
There is no way linking is going to stay “pure” in such an environment.
The debate around “paid links” and “paid placement” has been done over and over again, but in summary, the definition of “paid” is inherently problematic. For example, some sites invite guest posting and pay the writers nothing in monetary terms, but the payment is a link back to the writer’s site. The article is a form of paid placement; it’s just that no money changes hands. Is the article truly editorial?
It’s a bit grey.
A lot of the time, such articles pump the writers business interests. Is that paid content, and does it need to be disclosed? Does it need to be disclosed to both readers and search engines? I think Matt's video suggests it isn't a problem, as utility is provided, but a link from said article may need to be no-followed in order to stay within Google's guidelines.
Matt wants to see clear and conspicuous disclosure of advertorial content. Paid links, likewise. The disclosure should be made both to search engines and readers.
Which is interesting.
Why would a disclosure need to be made to a search engine spider? Granted, it makes Google’s job easier, but I’m not sure why publishers would want to make Google’s job easier, especially if there’s nothing in it for the publishers.
But here comes the stick, and not just from the web spam team.
Google News have stated that if a publication is taking money for paid content and not adequately disclosing that fact - in Google’s view - to both readers and search engines, then that publication may be removed from Google News. In so doing, Google increase the risk to the publisher, and therefore the cost, of accepting paid links or paid placement.
So, that’s why a publisher will want to make Google’s job easier. If they don’t, they run the risk of invisibility.
Now, on one level, this sounds fair and reasonable. The most “merit worthy” content should be at the top. A ranking should not depend on how deep your pockets are i.e. the more links you can buy, the more merit you have.
However, one of the problems is that the search results already work this way. Big brands often do well in the SERPs due to reputation gained, in no small part, from massive advertising spend that has the side effect, or sometimes direct effect, of inbound links. Do these large brands therefore have “more merit” by virtue of their deeper pockets?
SEO has helped level the playing field for small businesses, in particular. The little guy didn’t have deep pockets, but he could play the game smarter by figuring out what the search engines wanted, algorithmically speaking, and giving it to them.
I can understand Google’s point of view. If I were Google, I’d probably think the same way. I’d love a situation where editorial was editorial, and business was PPC. SEO, to me, would mean making a site crawlable and understandable to both visitors and bots, but that’s the end of it. Anything outside that would be search engine spam. It’s neat. It’s got nice rounded edges. It would fit my business plan.
But real life is messier.
If a publisher doesn’t have the promotion budget of a major brand, and they don’t have enough money to outbid big brands on PPC, then they risk being invisible on search engines. Google search is pervasive, and if you’re not visible in Google search, then it’s a lot harder to make a living on the web. The risk of being banned for not following the guidelines is the same as the risk of playing the game within the guidelines, but not ranking. That risk is invisibility.
Is the fact a small business plays a game that is already stacked against them, by using SEO, “bad”? If they have to play harder than the big brands just to compete, and perhaps become a big brand themselves one day, then who can really blame them? Can a result that is relevant, as far as the user is concerned, still really be labelled “spam”? Is that more to do with the search engine's business case than actual end-user dissatisfaction?
Publishers and SEOs should think carefully before buying into the construct that SEO, beyond Google’s narrow definition, is spam. Also consider that the more people who can be convinced to switch to PPC and/or stick to just making sites more crawlable, then the more spoils for those who couldn’t care less how SEO is labelled.
It would be great if quality content succeeded in the SERPs on merit, alone. This would encourage people to create quality content. But when other aspects are rewarded, then those aspects will be played.
Perhaps if the search engines could be explicit about what they want, and reward it when it's delivered, then everyone’s happy.
I guess the algorithms just aren’t that clever yet.
In his recent speech at Google I/O, Page talked about privacy and how it impairs Google. “Why are people so focused on keeping their medical history private?” If only people would share more, then Google could do more.
We look forward to Google taking the lead in this area and opening up their systems to public inspection. Perhaps they could start with the search algorithms. If Google would share more, publishers could do more.
What’s not to like? :)
But perhaps that’s comparing apples with oranges. The two areas may not be directly comparable as the consequences of opening up the algorithm would likely destroy Google’s value. Google’s argument against doing so has been that the results would suffer quality issues.
Google would not win.
If Page's vision sounds somewhat utopian, then perhaps we should consider where Google came from.
A decade ago, the Internet was frequently viewed through a utopian lens, with scholars predicting that this increased ability to share, access, and produce content would reduce barriers to information access...Underlying most of this work is a desire to prevent online information from merely mimicking the power structure of the conglomerates that dominate the media landscape. The search engine, subsequently, is seen as an idealized vehicle that can differentiate the Web from the consolidation that has plagued ownership and content in traditional print and broadcast media
At the time, researchers Introna and Nissenbaum felt that online information was too important to be shaped by market forces alone. They correctly predicted this would lead to a loss of information quality, and a lack of diversity, as information would pander to popular tastes.
They advocated, perhaps somewhat naively in retrospect, public oversight of search engines and algorithm transparency to correct these weaknesses. They argued that doing so would empower site owners and users.
Fast forward to 2013, and there is now more skepticism about such utopian values. Search engines are seen as the gatekeepers of information, yet they remain secretive about how they determine what information we see. Sure, they talk about their editorial process in general terms, but the details of the algorithms remain a closely guarded secret.
In the past decade, we’ve seen a considerable shift in power away from publishers and towards the owners of big data aggregators, like Google. Information publishers are expected to be transparent - so that a crawler can easily gather information, or a social network can be, well, social - and this has advantaged Google and Facebook. It would be hard to run a search engine or a social network if publishers didn't buy into this utopian vision of transparency.
Yet, Google aren’t quite as transparent with their own operation. If you own a siren server, then you want other people to share and be open. But the same rule doesn’t apply to the siren server owner.
Opening Up Health
Larry is concerned about constraints in healthcare, particularly around access to private data.
“Why are people so focused on keeping their medical history private?” Page thinks it’s because people are worried about their insurance. This wouldn’t happen if there was universal care, he reasons.
I don’t think that’s correct.
People who live in areas where there is universal healthcare, like the UK, Australia and New Zealand, are still very concerned about the privacy of their data. People are concerned that their information might be used against them, not just by insurance companies, but by any company, not to mention government agencies and their employees.
People just don’t like the idea of surveillance, and they especially don’t like the idea of surveillance by advertising companies who operate inscrutable black boxes.
Not that good can’t come from crunching the big data linked to health. Page is correct in saying there is a lot of opportunity to do good by applying technology to the health sector. But first companies like Google need to be a lot more transparent about their own data collection and usage in order to earn trust. What data are they collecting? Why? What is it used for? How long is it kept? Who can access it? What protections are in place? Who is watching the watchers?
I’d argue that in order for people to trust Google to a level Page demands would require a lot more rigor and transparency, including third party audit. There are also considerable issues to overcome, in terms of government legislation, such as privacy acts. Perhaps the most important question is "how does this shift power balances"? No turkey votes for an early Christmas. If your job relies on being a gatekeeper of health information, you're hardly going to hand that responsibility over to Google.
So, it’s not a technology problem. And it's not just because people are afraid of insurance companies. And it’s not because people aren’t on board with the whole Burning-Man-TechnoUtopia vision. It’s to do with trust. People would like to know what they’re giving up, to whom, and what they’re getting in return. And it's about power and money.
Page has answered some of the question, but not nearly enough of it. Something might be good for Google, and it might be good for others, but people want a lot more than just his word on it.
The changes Page wants require more than money. They require a change of culture, both political and national. The massively optimistic view that technology can solve all of what ails America—and the accompanying ideas on immigration, patent reform, and privacy—are not going to be so easy to force into the brains of the masses.
The biggest reason is trust. Most people trust the government because it's the government—a 226-year-old institution that behaves relatively predictably, remains accountable to its citizens, and is governed by source code (the Constitution) that is hard to change. Google, on the other hand, is a 15-year-old institution that is constantly shifting in nature, is accountable to its stockholders, and is governed by source code that is updated daily. You can call your Congressman and watch what happens in Washington on C-SPAN every day. Google is, to most people, a black box that turns searches and personal data into cash
It started with one store on the outskirts of town. It was big. Monolithic. It amalgamated a lot of cheap, largely imported stuff and sold the stuff on. The workers were paid very little. The suppliers were squeezed tight on their margins.
And so it grew.
And as it grew, it hollowed out the high street. The high street could not compete with the monolith's sheer power. They couldn’t compete with the monolith's influence on markets. They couldn’t compete with the monolith's unique insights gained from clever number crunching of big data sets.
I’m talking about Walmart, of course.
Love ‘em or loathe ‘em, Walmart gave people what they wanted, but in so doing, hollowed out a chunk of America's middle class. It displaced a lot of shop keepers. It displaced small business owners on Main Street. It displaced the small family retail chain that provided a nice little middle class steady earner.
Where did all those people go?
It was not only the small, independent retail businesses and local manufacturers who were fewer in number. Their closure triggered flow-on effects. There was less demand for the services they used, such as local small business accountants, the local lawyer, small advertising companies, local finance companies, and the host of service providers that make up the middle class ecosystem.
Where did they all go?
Some would have taken up jobs at Walmart, of course. Some would become unemployed. Some would close their doors and take early retirement. Some would change occupations and some would move away to where prospects were better.
What does any of this have to do with the internet?
The same thing is happening on the internet.
And if you’re a small business owner, located on the web-equivalent of the high street, or your business relies on those same small business owners, then this post is for you.
Is Technology Gutting The Middle Class?
I’ve just read “Who Owns The Future”, by Jaron Lanier. Anyone who has anything to do with the internet - and anyone who is even remotely middle class - will find it asks some pretty compelling questions about our present and future.
At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When it was sold to Facebook for a billion dollars in 2012, Instagram only employed 13 people
Great for Instagram. Bad for Kodak. And bad for the people who worked for Kodak. But, hey. That’s progress, right? Kodak had an outdated business model. Technology overtook them.
That’s true. It is progress. It’s also true that all actions have consequences. The consequence of transformative technology is that, according to Lanier, it may well end up destroying the middle class if too much of the value is retained in the large technology companies.
Lanier suggests that the advance of technology is not creating as many jobs as it destroys, and the jobs that are destroyed are increasingly middle class.
Not Political (Kinda)
I don’t wish to make this post political, although all change is inherently political. I’m not taking political sides. This issue cuts across political boundaries. I have a lot of sympathy for technological utopian ideas and the benefits technology brings, and have little time for luddism.
However, it’s interesting to focus on the consequences of this shift in wealth and power brought about by technology, and whether enough people in the internet value chain receive adequate value for their efforts.
If the value doesn't flow through, as capitalism requires in order to function well, then few people win. Are children living at home longer than they used to? Are people working longer hours than they used to in order to have the same amount of stuff? Has the value chain been broken, Lanier asks? And, if so, what can be done to fix it?
What Made Instagram Worth One Billion Dollars?
Lanier points out that Instagram wasn’t worth a billion dollars because it had extraordinary employees doing amazing things.
The value of Instagram came from network effects.
Millions of people using Instagram gave the Instagram network value. Without the user base, Instagram is just another photo app.
Who got paid in the end? Not the people who gave the network value. The people who got paid were the small group at the top who organized the network. The owners of the "Siren Servers":
The power rests in what Lanier calls the “Siren Servers”: giant corporate repositories of information about our lives that we have given freely and often without consent, now being used for huge financial benefit by a super-rich few
The value is created by all the people who make up the network, but they only receive a small sliver of that value in the form of a digital processing tool. To really benefit, you have to own, or get close to, a Siren Server.
Likewise, most of Google’s value resides in the network of users. These users feed value into Google simply by using it and thereby provide Google with a constant stream of data. This makes Google valuable. There isn’t much difference between Google and Bing in terms of service offering, but one is infinitely more valuable than the other purely by virtue of the size of the audience. Same goes for Facebook over Orkut.
You Provide Value
Google are provided raw materials by people. Web publishers allow Google to take their work, at no charge, and to use that work to add value to Google’s network. Google then charges advertisers to place their advertising next to the aggregated information.
Why do web publishers do this?
Publishers create and give away their work in the hope they’ll get traffic back, from which they may derive benefit. Some publishers make money, so they can then pay real-world expenses, like housing, food and clothing. The majority of internet publishers make little, or nothing, from this informal deal. A few publishers make a lot. The long tail, when it comes to internet publishing, is pretty long. The majority of wealth, and power, is centralized at the head.
Similarly, Google’s users are giving away their personal information.
Every time someone uses Google, they are giving Google personal information of value. Their search queries. Their browsing patterns. Their email conversations. Their personal network of contacts. Aggregate that information together, and it becomes valuable information, indeed. Google records this information, crunches it looking for patterns, then packages it up and sells it to advertisers.
What does Google give back in return?
Is it a fair exchange of value?
Lanier argues it isn’t. What’s more, it’s an exchange of value so one-sided that it’s likely to destroy the very ecosystem on which companies like Google are based - the work output, and the spending choices, of the middle class. If few of the people who publish can make a reasonable living doing so, then the quality of what gets published must decrease, or cease to exist.
People could make their money in other ways, including offline. However, consider that the web is affecting a lot of offline business already. The music industry is a faint shadow of what it once was, even as recently as a decade ago. There are a lot fewer middle class careers in the music industry now. Small retailers are losing out to the web. Fewer jobs there. The news industry is barely making any money. Same goes for book publishers. All these industries are struggling as online aggregators carve up their value chains.
Now, factor in all the support industries of these verticals. Then think about all the industries likely to be affected in the near future - like health, or libraries, or education, for example. Many businesses that used to hire predominantly middle class people are going out of business, downsizing their operations, or soon to have chunks of their value displaced.
It’s not Google’s aim to gut the middle class, of course. This post is not an anti-Google rant, either, simply a look at action and consequence. What is the effect of technology and, in particular, the effect of big technology companies on the web, most of whom seem obsessed with keeping you in their private, proprietary environments for as long as possible?
Google’s aim is to index all the world’s information and make it available. That’s a good aim. It’s a useful, free service. But Lanier argues that gutting the middle class is a side-effect of re-contextualising, and thereby devaluing, information. Information may want to be free, but the consequence of free information is that those creating the information may not get paid. Many of those who do get paid may be weaker organizations more willing to sacrifice editorial quality in order to stay in business. We already see major news sites with MFA-styled formatting on unvetted syndicated press releases. What next?
You may notice that everyone is encouraged to “share” - meaning “give away” - but sharing doesn't seem to extend to the big tech companies, themselves.
They charge per click.
One argument is that if someone doesn’t like Google, or any search engine, they should simply block that search engine via robots.txt. The problem with that argument is it’s like saying if you don’t like aspects of your city, you should move to the middle of the wilderness. You could, but really you’d just like to make the city a better place to be, and to see it thrive and prosper, and be able to thrive within it.
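For what it’s worth, the mechanics of that "simply block them" argument really are trivial - which is part of why it rings hollow. Blocking a single crawler takes two lines of standard robots exclusion syntax in a robots.txt file at the site root (the directives below use Googlebot, Google's documented crawler name; the wildcard rule is a common companion):

```txt
# robots.txt - block Google's crawler from the entire site
User-agent: Googlebot
Disallow: /

# Allow all other crawlers to access everything
User-agent: *
Disallow:
```

The cost isn’t technical; it’s the invisibility that follows, which is exactly the point being made above.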
Google provides useful things. I use Google, just like I use my iPhone. I know the deal. I get the utility in exchange for information, and this exchange is largely on their terms. What Lanier proposes is a solution that not only benefits the individual, and the little guy, but ultimately the big information companies, themselves.
Money Go Round
Technology improvements have created much prosperity and the development of a strong middle class. But the big difference today is that what is being commoditized is information itself. In a world increasingly controlled by software that acts as our interface to information, if we commoditize information then we commoditize everything else.
If those creating the information don’t get paid, quality must decrease, or become less available than it otherwise would be. They can buy less stuff in the real world. If they can’t buy as much stuff in the real world, then Google and Facebook’s advertisers have fewer people to talk to than they otherwise would.
It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism. But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it’s a kind of capitalism that’s totally self-defeating because it’s so narrow. It’s a winner-take-all capitalism that’s not sustaining
That isn’t a sustainable situation long-term. A winner-takes-all system centralizes wealth and power at the top, whilst everyone else occupies the long tail. Google has deals in place with large publishers, such as AP, AFP and various European agencies, but this doesn't extend to smaller publishers. It’s the same in sports. The very top get paid ridiculous amounts of money whilst those only a few levels down are unlikely to make rent on their earnings.
But doesn’t technology create new jobs? People who were employed at Kodak just go do something else?
The latest waves of high tech innovation have not created jobs like the old ones did. Iconic new ventures like Facebook employ vastly fewer people than big older companies like, say, General Motors. Put another way, the new schemes.....channel much of the productivity of ordinary people into an informal economy of barter and reputation, while concentrating the extracted old-fashioned wealth for themselves. All activity that takes place over digital networks becomes subject to arbitrage, in the sense that risk is routed to whoever suffers lesser computation resources
The people who will do well in such an environment will likely be employees of those who own the big data networks, like Google. Or they will be the entrepreneurial and adaptable types who manage to get close to them - the companies that serve Walmart, or Google, or Facebook, or large financial institutions, or leverage off them - but Lanier argues there simply aren't enough of those roles to sustain society in a way that gave rise to these companies in the first place.
He argues this situation disenfranchises too many people, too quickly. And when that happens, the costs spread to everyone, including the successful owners of the networks. They become poorer than they would otherwise be by not returning enough of the value that enables the very information they need to thrive. Or another way of looking at it - who’s going to buy all the stuff if only a few people have the money?
The network, whether it be a search engine, a social network, an insurance company, or an investment fund, uses information to concentrate power. Lanier argues they are all the same, as they operate in pretty much the same way. They use network effects to mine and crunch big data, and this, in turn, grows their position at the expense of smaller competitors, and the ecosystem that surrounds them.
It doesn’t really matter what the intent was. The result is that the technology can prevent the middle class from prospering and when that happens, everyone ultimately loses.
Google have announced a web spam change, called Penguin 2.0. They’ll be “looking at” advertorials, and native advertising. They’ll be taking a “stronger line” on this form of publishing. They’ll also be “going upstream” to make link spammers less effective.
Of course, whenever Google release these videos, the webmaster community goes nuts. Google will be making changes, and these changes may either make your day, or send you to the wall.
The most interesting aspect of this, I think, is the power relationship. If you want to do well in Google’s search results then there is no room for negotiation. You either do what they want or you lose out. Or you may do what they want and still lose out. Does the wealth and power sit with the publisher?
Cutts warns they’ll be going after a lot of this activity. Does wealth and power sit with the link buyer or the seller?
Now, Google are right to eliminate or devalue sites that they feel devalue their search engine. Google have made search work. Search was all but dead twelve years ago due to the ease with which publishers could manipulate the results, typically with off-topic junk. The spoils of solving this problem have flowed to Google.
The question is: has too much wealth flowed to companies like Google, and is this situation going to kill off large chunks of the ecosystem on which it was built? Google isn’t just a player in this game; they’re so pervasive they may as well be the central planner. Cutts is running product quality control. The customers aren’t the publishers; they’re the advertisers.
It’s also interesting to note what these videos do not say. Cutts' video was not about how your business could be more prosperous. It was all about your business doing what Google wants in order for Google to be more prosperous. It’s irrelevant whether you agree or not, as you don’t get to dictate terms to Google.
That’s the deal.
Google’s concern lies not with webmasters, just as Walmart's concern lies not with small-town retailers. Their concern is to meet company goals and enhance shareholder value. The effects aren’t Google's or Walmart's fault. They are just that - effects.
The effect of Google pursuing those objectives might be to gouge out the value of publishing, and in so doing, gouge out a lot of the value of the middle class. The Google self-driving car project is fascinating from a technical point of view - the view Google tends to focus on - but perhaps even more fascinating when looked at from a position they seldom seem to consider, at least not in public: what happens to all those taxi drivers, and delivery drivers, who get their first break in society doing this work? Typically, these people are immigrants. Typically, they are poor but upwardly mobile.
That societal effect doesn't appear to be Google’s concern.
So whose concern should it be?
Well, perhaps it really should be Google’s concern, as it’s in their own long-term best interest:
Today, a guitar manufacturer might advertise through Google. But when guitars are someday spun out of 3D printers, there will be no one to buy an ad if guitar designs are “free”. Yet Google’s lifeblood is information put online for free. That is what Google’s servers organize. Thus Google’s current business model is a trap in the long term
Lanier's suggestion is that everyone gets paid, via micro-payments, linked back to the value they helped create. These payments continue so long as people are using their stuff, be it a line of code, a photograph, a piece of music, or an article.
For example, if you wrote a blog post, and someone quoted a paragraph of it, you would receive a tiny payment. The more often you’re quoted, the more relevant you are, therefore the more payment you receive. If a search engine indexes your pages, then you receive a micro-payment in return. If people view your pages, you receive a micro-payment. Likewise, when you consume, you pay. If you conduct a search, then you run Google’s code, and Google gets paid. The payments are tiny, as far as the individual is concerned, but they all add up.
Mammoth technical issues of doing this aside, the effect would be to take money from the head and pump it back into the tail. It would be harder to build empires off the previously free content others produce. It would send money back to producers.
It also eliminates the piracy question. Producers would want people to copy, remix and redistribute their content, as the more people that use it, the more money they make. Also, with the integration of two-way linking, the mechanism Lanier proposes to keep track of ownership and credit, you’d always know who is using your content.
Information would no longer be free. It would be affordable, in the broadest sense of the word. There would also be a mechanism to reward the production, and a mechanism to reward the most relevant information the most. The more you contribute to the net, and the more people use it, the more you make. Tiny payments. Incremental. Ongoing.
So, if these questions are of interest to you, I’d encourage you to read “Who Owns The Future” by Jaron Lanier. It’s often rambling - in a good way - and heads off on wild tangents - also in a good way - and you can tell there is a very intelligent and thoughtful guy behind it all. He’s asking some pretty big, relevant questions. His answers are sketches that should be challenged, argued, debated and enlarged.
And if big tech companies want a challenge that really will change the world, perhaps they could direct all that intellect, wealth and power towards enriching the ecosystem at a pace faster than they potentially gouge it.
I have started to wonder if some of these links (there are hundreds since the site is large) may be hurting my site in the Google Algo. I am considering changing most of my outbound links to rel="nofollow". It is not something I want to do but ... “
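Mechanically, the change this site owner is weighing is a single attribute on each outbound anchor tag - a hypothetical before and after (the URL is a placeholder):

```html
<!-- Before: a normal link, which passes PageRank -->
<a href="http://example.com/">Example Widgets</a>

<!-- After: rel="nofollow" asks search engines not to count the link as an endorsement -->
<a href="http://example.com/" rel="nofollow">Example Widgets</a>
```

The link still works identically for visitors; only the signal to search engines changes. Which is what makes the reluctance telling.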
We’ve got site owners falling to their knees, confessing to be link spammers, and begging for forgiveness. Even when they do, many sites don’t return. Some sites have been returned, but their rankings, and traffic, haven’t recovered. Many sites carry similar links, but get a free pass.
That’s the downside of letting Google dictate the game, I guess.
When site owners are being told by Google that their linking is “a problem,” they tend to hit the forums and spread the message, so the effect is multiplied.
Why does Google bother with the charade of “unnatural link notifications,” anyway?
If Google has found a problem with links to a site, then they can simply ignore or discount them, rather than send reports prompting webmasters to remove them. Alternatively, they could send a report saying they’ve already discounted them.
So one assumes Google’s strategy is a PR - as in public relations - exercise to plant a bomb between link buyers and link sellers. Why do that? Well, a link is a link, and one could conclude that Google must still have problems nailing down the links they don’t like.
So they get some help.
The disavow links tool, combined with a re-inclusion request, is pretty clever. If you wanted a way to get site owners to admit to being link buyers, and to point out the places from which they buy links, or you want to build a database of low quality links, for no money down, then you could hardly imagine a better system of outsourced labour.
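For context, the disavow tool takes a plain text file uploaded per-site, one entry per line - a sketch of the documented format, with placeholder domains:

```txt
# Lines starting with # are comments, ignored by Google.
# Disavow a single page that links to the site:
http://spam.example.com/paid-links/page1.html

# Disavow every link from an entire domain:
domain:shadyseodirectory.example.com
```

Every such file is, in effect, a webmaster-curated list of links the webmaster believes Google considers low quality - exactly the outsourced labour described above.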
If you’re a site owner, getting hundreds, if not thousands, of links removed is hardly straightforward. It’s difficult, takes a long time, and is ultimately futile.
Many site owners inundated with link removal requests have moved to charging removal fees, which in many cases is understandable, given it takes some time and effort to verify the true owner of a link, locate the link, remove it, and update the site.
As one rather fed-up sounding directory owner put it:
Blackmail? Google's blackmailing you, not some company you paid to be listed forever. And here's a newsflash for you. If you ask me to do work, then I demand to be paid. If the work's not worth anything to you, then screw off and quit emailing me asking me to do it for free.
Find your link, remove it, confirm it's removed, email you a confirmation, that's 5 minutes. And that's $29US. Don't like it? Then don't email me. I have no obligation to work for you for free, not even for a minute. …. I think the best email I got was someone telling me that $29 was extortion. I then had to explain that $29 wasn't extortion - but his new price of $109 to have the link removed, see, now THAT'S extortion.
if it makes you sleep at night, you might realize that you paid to get in the directory to screw with your Google rankings, now you get to pay to get out of the directory, again to screw your Google rankings. That's your decision, quit complaining about it like it's someone else's fault. Not everyone has to run around in circles because you're cleaning up the very mess that you made
In any case, if these links really did harm a site - which is arguable - then it doesn’t take a rocket scientist to guess the next step. Site owners would be submitting their competitors' links to directories thick and fast.
Cue Matt Cutts on negative SEO....
Recovery Not Guaranteed
Many sites don’t recover from Google penalties, no matter what they do.
It’s conceivable that a site could have a permanent flag against it no matter what penance has been paid. Google takes into account your history in AdWords, so it’s not a stretch to imagine similar flags may continue to exist against domains in their organic results.
The most common reason is not what they're promoting now, it's what they've promoted in the past.
Why would Google hold that against them? It's probably because of the way affiliates used to churn and burn domains they were promoting in years gone by...
This may be the reason why some recovered sites just don’t rank like they used to after they've returned. They may carry permanent negative flags.
However, the reduced rankings and traffic when/if a site does return may have nothing to do with low-quality links or previous behaviour. There are many other factors involved in ranking and Google’s algorithm updates aren’t sitting still, so it’s always difficult to pin down.
Which is why the SEO environment can be a paranoid place.
Do Brands Escape?
Matt Cutts is on record discussing big brands, saying they get punished, too. You may recall the case of Interflora UK.
Google may well punish big brands, but the punishment might be quite different to the punishment handed out to a no-brand site. It will be easier for a big brand to return, because if Google don’t show what Google users expect to see in the SERPs then Google looks deficient.
Take, for example, this report received - amusingly - by the BBC:
I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’?
If I was the BBC webmaster, I wouldn’t bother. Google isn’t going to dump the BBC sites as Google would look deficient. If Google has problems with some of the links pointing to the BBC, then perhaps Google should sort it out.
Take It On The Chin, Move On
Many of those who engaged in aggressive link tactics knew the deal. They went looking for an artificial boost in relevancy, and as a result of link building, they achieved a boost in the SERPs.
That is playing the game that Google, a search engine that factors in backlinks, "designed". By design, Google rewards well-linked sites by ranking them above others.
The site owners enjoyed the pay-off at the expense of their less aggressive competitors. The downside - there’s always a downside - is that Google may spot the artificial boost in relevancy, now or in the future, and may slam the domain as a result.
That’s part of the game, too.
Some cry about it, but Google doesn’t care about crying site owners, so site owners should build that risk into their business case from the get go.
Strategically, there are two main ways of looking at “the game”:
Whack A Mole: Use aggressive linking for domains you’re prepared to lose. If you get burned, then that’s a cost of playing the game. Run multiple domains using different link graphs for each and hope that a few survive at any one time, thus keeping you in the game. If some domains get taken out, then take it on the chin. Try to get reinstated, and if you can’t, then torch them and move on.
Ignore Google: If you operate like Google doesn’t exist, then it’s pretty unlikely Google will slam you, although there are no guarantees. In any case, a penalty and a low ranking are the same thing in terms of outcome.
Take one step back. If your business relies on Google rankings, then that’s a business risk. If you rely entirely on Google rankings, then that’s a big business risk. I’m not suggesting it’s not a risk worth taking, but only you can decide what risks make sense for your business.
If the whack a mole strategy is not for you, and you want to lower the business risk of Google’s whims, then it makes sense to diversify the ways in which you get traffic so that if one traffic stream fails, then all is not lost. If you’re playing for the long term, then establishing brand, diversifying traffic, and treating organic SEO traffic as a bonus should be considerations. You then don’t need to worry about what Google may or may not do as Google aren’t fueling your engine.
Some people run both these strategies simultaneously, which is an understandable way of managing risk. Most people probably sit somewhere in the middle and hope for the best.
Link Building Going Forward
The effect of Google’s fear, uncertainty and doubt strategy is that a lot of site owners are going to be running scared or confused, or both.
Just what is acceptable?
Trouble is, what is deemed acceptable today might be unacceptable next week. It’s pretty difficult, if not impossible, for a site owner to wind the clock back once they undertake a link strategy, and who knows what will be deemed unacceptable in a year's time.
Of course, Google doesn’t want site owners to think in terms of a “link strategy”, if the aim of said link strategy is to “inflate rankings”. That maxim has remained constant.
If you want to take a low-risk approach, then it pays to think of Google traffic as a bonus. Brett Tabke, founder of WebmasterWorld, used to keep a sticker on his monitor that said “Pretend The Search Engines Don’t Exist”, or words to that effect. I’m reminded of how useful that message still is today, as it's a prompt to think strategically beyond SEO. If you disappeared from Google today, would your business survive? If the answer is no, then you should revise your strategy.
Is there a middle ground?
Here are a few approaches to link building that will likely stand the test of time, and that incorporate strategy resilient to Google’s whims. The key is having links for reasons besides SEO, even if part of their value is higher rankings.
Publish relevant, valuable content, as determined by your audience.
It’s no longer enough to publish pages of information on a topic; the information must have demonstrable utility, i.e. other people need to deem it valuable, reference it, visit it, and talk about it. Instead of putting your money into buying links, you put your money into content development and then marketing it to people. The links will likely follow. This is passive link acquisition.
It’s unlikely these types of links will ever be a problem, as the link graph is not going to look contrived. If any poor quality links slip into this link graph, then they’re not going to be the dominant feature. The other signals will likely trump them and therefore diminish their impact.
Build brand based on unique, high quality information, and then market it to people via multiple channels, and the links tend to follow, which then boosts your ranking in Google. Provide a high degree of utility, first and foremost.
One problem with this model is that it’s easy for other people to steal your utility. This is a big problem and prevents investment in quality content. One way of getting around this is to use some content as a loss-leader and lock the rest away behind paywalls. You give the outside world, and Google, just enough, but if they want the rest, then they’re going to need to sign up.
Think carefully about the return on giving the whole farm away to a crawler. Think about providing utility, not "content".
There is huge first mover advantage when it comes to getting links.
If a new field opens up, and you get there first, or early, then it’s relatively easy to build a credible link graph. As a field expands, the next layer involves a lot of meta activity i.e. bloggers, media and other information curators writing about that activity. At the start of any niche, there aren’t many players to talk about, so the early movers get all the links.
As a field matures, you get a phenomenon Mike Grehan aptly characterised as “filthy linking rich”:
The law of "preferential attachment" as it is also known, wherein new links on the web are more likely to go to sites that already have many links, proves that the scheme is inherently biased against new and unknown pages. When search engines constantly return popular pages at the top of the pile, more web users discover those pages and more web users are likely to link to them
Those who got all those links early on will receive more and more links over time because they are top of the results. They just need to keep doing what they’re doing. It becomes very difficult for late entrants to beat them unless they do something extraordinary. By definition, that probably means shifting the niche to a new niche.
If you’re late to a crowded field, then you need to think in terms of differentiation. What can you offer that the rest do not? New content in such fields must be remarkable, i.e. worth remarking upon.
Is that field moving in a new direction? If so, can you pivot in that direction and be first mover in that direction? Look not where a niche currently is, but where it’s going, then position ahead of it.
"Same old, same old content” doesn’t get linked to, engaged with, ranked, or remarked upon - and why should it? The web is not short of content. The web has so much content that companies like Google have made billions trying to reduce it to a manageable set of ten links
Brand is the ultimate Google-protection tactic.
It’s not that brands don’t get hammered by Google occasionally, because they do. But what tends to differ is the sentence handed down. The bigger the brand, the lighter the sentence, or the shorter the sentence, because no matter how much WalMart or The Office Of The President Of The United States Of America spams Google, Google must show such sites. I’m not suggesting these sites engage in aggressive SEO tactics, or need to, but we know they’ll always be in Google.
You don’t have to be a big brand. You do need search volume on your distinctive brand name. If you’re well known enough in your niche i.e. you attract significant type-in search volume, Google must show you or appear deficient.
This is not to say having a brand means you can get away with poor behavior. But the more type-in traffic for your brand, the more pressure there is on Google to rank you.
Links to a brand name will almost never look forced in the same way a link in a footer to “cheap online pharmacy” looks forced. People know your name, and they link to you by name, talk about you by name - naturally.
The more generic your site, the more vulnerable you are, as it’s very difficult to own a space when you’re aligning with generic keyword terms. The links are always going to look a little - or a lot - forced.
This is not to say you shouldn't get links with keywords in them, but build a distinctive brand, too. The link graph will appear mostly natural - because it is. A few low quality links won’t trump the good signals created by a lot of natural brand links.
The web is a place.
This place is filled with people, and there are relationships between people. Relationships between people on the web are almost always expressed as a link. It might be a Facebook link, a Twitter link, a comment link, or a blog link, but they’re all links. It doesn’t matter if they’re crawlable or not, or no-followed or not - they still indicate a relationship.
If Google is to survive, it must figure out these relationships.
That’s why all links - apart from negative SEO - are good links. The more signals of a real relationship, the better you *should* be ranked, because you are more relevant, in an objective sense.
So look for ways to foster relationships and engagement. It might be guest posting. It might be commenting on someone else's site. It might be contributing to forums. It might be interviewing people. It might be accepting interviews. It might be forging relationships with the press. It might be forging relationships with business organisations. It might be contributing to charity. It might be running competitions. It might be attending conferences. It might be linking out to influential people.
It’s all networking.
And wherever you network, you should be getting links as a byproduct.
Provide long - well, longer than 400 words - unique, editorial articles. Articles also need to get linked to and engaged with, and they need to be placed on sites where they’ll be seen, as opposed to content farms.
Ask yourself "am I providing genuine utility?"
Fill A Need
This is similar to differentiation, but a little more focused.
Think about the problems people have in a niche. The really hard problems to solve. “How to”, “tips”, “advice”, and so on.
Solving a genuine problem for people tends to make them feel a sense of obligation, especially if they would otherwise have to pay for that help. If you can twist that obligation towards getting a link, all the better. For example: “if this article/video/whatever helped you, no payment necessary! But it would be great if you linked to us or followed us on Twitter”, and so on. It doesn’t need to be that overt, although sometimes overt is what is required. It fosters engagement. It builds your network. And it builds links.
Think about ways you can integrate a call-to-action that results in a link of some kind.
What are the incentives to publish high-value content to the web?
Search engines, like Google, say they want to index quality content, but provide little incentive to create and publish it. The reality is that the publishing environment is risky, relatively poorly paid in most instances, and is constantly being undermined.
There is little point publishing web content if the cost of publishing outweighs any profit that can be derived from it.
Many publishers, who have search engines in mind, work on an assumption that if they provide content to everyone, including Google, for free, then Google should provide traffic in return. It’s not an official deal, of course. It’s unspoken.
Rightly or wrongly, that’s the “deal” as many webmasters perceive it.
What Actually Happens
Search engines take your information and, if your information is judged sufficiently worthy that day, as the result of an ever-changing, obscure digital editorial mechanism known only to themselves, they will rank you highly, and you’ll receive traffic in return for your efforts.
That may all change tomorrow, of course.
What might also happen is that they could grab your information, amalgamate it, rank you further down the page, and use your information to keep visitors on their own properties.
Look at the case of TripAdvisor. TripAdvisor, frustrated with Google’s use of its travel and review data, filed a competition complaint against Google in 2012.
The company said: "We hope that the commission takes prompt corrective action to ensure a healthy and competitive online environment that will foster innovation across the internet."
The commission has been investigating more than a dozen complaints against Google from rivals, including Microsoft, since November 2010, looking at claims that it discriminates against other services in its search results and manipulates them to promote its own products.
TripAdvisor's hotel and restaurant review site competes with Google Places, which provides reviews and listings of local businesses. "We continue to see them putting Google Places results higher in the search results – higher on the page than other natural search results," said Adam Medros, TripAdvisor's vice president for product, in February. "What we are constantly vigilant about is that Google treats relevant content fairly."
Similarly, newspapers have taken aim at Google and other search engines for aggregating their content, and deriving value from that aggregation, but the newspapers claim they aren’t making enough to cover the cost of producing that content in the first place:
In 2009 Rupert Murdoch called Google and other search engines “content kleptomaniacs”. Now cash-strapped newspapers want to put legal pressure on what they see as parasitical news aggregators.
Of course, it’s not entirely the fault of search engines that newspapers are in decline. Their own aggregation model - bundling news, sport, lifestyle, and classifieds into one “place” - has been surpassed.
Search engines often change their stance without warning, or can be cryptic about their intentions, often to the detriment of content creators. For example, Google has stated they see ads as helpful, useful and informative:
In his argument, Cutts said, “We actually think our ads can be as helpful as the search results in some cases. And no, that’s not a new attitude.”
In entering the advertising market, Google tested our belief that highly relevant advertising can be as useful as search results or other forms of content
However, business models built around the ads as content idea, such as Suite101.com, got hammered. Google could argue these sites went too far, and that they are asserting editorial control, and that may be true, but such cases highlight the flaky and precarious nature of the search ecosystem as far as publishers are concerned. One day, what you're doing is seemingly "good", the next day it is "evil". Punishment is swift and without trial.
In the days before we meet, he has been watching a box set of Adam Curtis's BBC series, All Watched Over by Machines of Loving Grace, about the implications of our digitised future, so the arguments are fresh in his head. "We were so into the net around the time of Kid A," he says. "Really thought it might be an amazing way of connecting and communicating. And then very quickly we started having meetings where people started talking about what we did as 'content'. They would show us letters from big media companies offering us millions in some mobile phone deal or whatever it was, and they would say all they need is some content. I was like, what is this 'content' which you describe? Just a filling of time and space with stuff, emotion, so you can sell it?"
Having thought they were subverting the corporate music industry with In Rainbows, he now fears they were inadvertently playing into the hands of Apple and Google and the rest. "They have to keep commodifying things to keep the share price up, but in doing so they have made all content, including music and newspapers, worthless, in order to make their billions. And this is what we want? I still think it will be undermined in some way. It doesn't make sense to me. Anyway, All Watched Over by Machines of Loving Grace. The commodification of human relationships through social networks. Amazing!"
There is no question the value of content is being deprecated by big aggregation companies. The overhead of creating well-researched, thoughtful content is the same whether search engines value it or not. And if they do value it, a lot of the value of that content has shifted to the networks, distributors and aggregators and away from the creators.
Facebook’s value is based entirely on the network itself. Almost all of Google’s value is based on scraping and aggregating free content and placing advertising next to it. Little of this value gets distributed back to the creator, unless they take further, deliberate steps to try and capture some back.
In such a precarious environment, what incentive does the publisher have to invest and publish to the “free” web?
Google lives or dies on the relevancy of the information they provide to visitors. Without a steady supply of “free” information from third parties, they don’t have a business.
Of course, this information isn’t free to create. So if search engines do not provide you profitable traffic, then why allow search engines to crawl your pages? They cost you money in terms of bandwidth and may extract, and then re-purpose, the value you created to suit their own objectives.
Google has done content-related deals in the past. They did one in France in February whereby Google agreed to help publishers develop their digital units:
Under the deal, Google agreed to set up a fund, worth 60 million euros, or $80 million, over three years, to help publishers develop their digital units. The two sides also pledged to deepen business ties, using Google’s online tools, in an effort to generate more online revenue for the publishers, who have struggled to counteract dwindling print revenue.
This seems to fit with Google’s algorithmic emphasis on major web properties, seemingly as a means to sift the "noise in the channel". Such positioning favors big, established content providers.
It may have also been a forced move as Google would have wanted to avoid a protracted battle with European regulators. Whatever the case, Google doesn’t do content deals with small publishers and it could be said they are increasingly marginalizing them due to algorithm shifts that appear to favor larger web publishers over small players.
Don't Be Evil To Whom?
Google’s infamous catch-phrase is “Don’t Be Evil”. In the documentary “Inside Google”, Eric Schmidt initially thought the phrase was a joke. Soon after, he realized they took it seriously.
The problem with such a phrase is that it implies Google is a benevolent moral actor that cares about... what? You - the webmaster?
“Don’t Be Evil” is typically used by Google in reference to users, not webmasters. In practice, it’s not even a question of morality, it’s a question of who to favor. Someone is going to lose, and if you’re a small webmaster with little clout, it’s likely to be you.
For example, Google appear to be kicking a lot of people out of Adsense, and as many webmasters are reporting, Google often act as judge, jury and executioner, without recourse. That’s a very strange way of treating business “partners”, unless partnership has some new definition of which I'm unaware.
But I think Google as an organization has moved on; they're focussed now on market position, not making the world better. Which makes me sad. Google is too powerful, too arrogant, too entrenched to be worth our love. Let them defend themselves, I'd rather devote my emotional energy to the upstarts and startups. They deserve our passion.
Some may call such behavior a long way from “good” on the “good” vs “evil” spectrum.
How To Protect Value
Bottom line: if your business model involves creating valuable content, you’re going to need a strategy to protect it and claw value back from aggregators and networks in order for a content model to be sustainable.
Some argue that if you don’t like Google, then block them using robots.txt. This is one option, but there’s no doubt Google still provides some value - it’s just a matter of deciding where to draw the line on how much value to give away.
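The mechanics of drawing that line are simple. A robots.txt file can shut out Google's crawler while leaving others alone - a minimal sketch, verified here with Python's standard-library robots.txt parser (the site URL and paths are hypothetical):

```python
# A minimal robots.txt that blocks Googlebot entirely while allowing other
# crawlers. Python's stdlib parser confirms how the rules are interpreted.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is blocked everywhere; other crawlers are unaffected.
print(parser.can_fetch("Googlebot", "https://example.com/articles/guide.html"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/articles/guide.html"))    # True
```

Blocking is all-or-nothing per path pattern, which is exactly why the next question is where, short of a total block, to draw the line.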
What Google offers is potential visitor attention. We need to acquire and hold enough visitor attention before we switch visitors to the desired action. An obvious way to do this, of course, is to provide free, attention-grabbing content that offers some value, then lock the high-value content away behind a paywall. Be careful about page length. As HubPages CEO Paul Edmonds points out:
Longer, richer pages are more expensive to create, but our data shows that as the quality of a page increases, its effective revenue decreases. There will have to be a pretty significant shift in traffic to higher quality pages to make them financially viable to create.
You should also consider giving the search engines summaries or the first section of an article, but block them from the rest.
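As a rough illustration of that summary-only approach, here is a hedged sketch in Python: known crawler user-agents receive only the opening paragraph, while human visitors get the full article. The user-agent tokens, the paragraph split, and the example text are all illustrative assumptions, not a production paywall:

```python
# Sketch of the "teaser for crawlers" model: serve the first paragraph(s)
# to crawler user-agents and the full article to everyone else.
# CRAWLER_TOKENS and the splitting logic are assumptions for illustration.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def visible_text(article: str, user_agent: str, teaser_paragraphs: int = 1) -> str:
    """Return the portion of the article this user-agent is allowed to see."""
    paragraphs = [p for p in article.split("\n\n") if p.strip()]
    if any(token in user_agent.lower() for token in CRAWLER_TOKENS):
        return "\n\n".join(paragraphs[:teaser_paragraphs])
    return "\n\n".join(paragraphs)

article = "Free teaser paragraph.\n\nPremium analysis, for members only."
print(visible_text(article, "Googlebot/2.1"))  # teaser only
print(visible_text(article, "Mozilla/5.0"))    # full article
```

In practice the same split is usually implemented at the template or CMS level, with the locked portion behind a login rather than a user-agent check.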
I know a little bit about this because in January I was invited to a meeting at the A.P.’s headquarters with about two dozen other publishers, most of them from the print world, to discuss the formation of the consortium. TechCrunch has not joined at this time. Ironically, neither has the A.P., which has apparently decided to go its own way and fight the encroachments of the Web more aggressively (although, to my knowledge, it still uses Attributor’s technology). But at that meeting, which was organized by Attributor, a couple slides were shown that really brought home the point to everyone in the room. One showed a series of bar graphs estimating how much ad revenues splogs were making simply from the feeds of everyone in the room. (Note that this was just for sites taking extensive copies of articles, not simply quoting). The numbers ranged from $13 million (assuming a $.25 effective CPM) to $51 million (assuming a $1.00 eCPM)
You still end up facing the cost of policing "content re-purposing" - just one of the many costs publishers face when publishing on the web, and just one more area where the network is sucking out value.
Use multiple channels so you’re not reliant on one traffic provider. You might segment your approach by providing some value to one channel, and some value to another, but not all of it to both. This is not to say models entirely reliant on Google won’t work, but if you do rely on a constant supply of new visitors via Google, and if you don’t have the luxury of having sufficient brand reputation, then consider running multiple sites that use different optimization strategies so that the inevitable algorithm changes won’t take you out entirely. It’s a mistake to think Google cares deeply about your business.
Treat every new visitor as gold. Look for ways to lock visitors in so you aren’t reliant on Google in future for a constant stream of new traffic. Encourage bookmarking, email sign-ups, memberships, rewards - whatever it takes to keep them. Encourage people to talk about you across other media, such as social media. Look for ways to turn visitors into broadcasters.
Adopt a business model that leverages off your content. Many consultants write business books. They make some money from the books, but the books mainly serve as advertisements for their services or speaking engagements. Similarly, would you be better creating a book and publishing it on Amazon than publishing too much content to the web?
Business models focused on getting Google traffic and then monetizing that attention using advertising only work if the advertising revenue covers production cost. Some sites make a lot of money this way, but big money content sites are in the minority. Given the low return of a lot of web advertising, other webmasters opt for cheap content production. But cheap content isn’t likely to get the attention required these days, unless you happen to be Wikipedia.
Perhaps a better approach for those starting out is to focus on building brand / engagement / awareness / publicity / non-search distribution. As Aaron points out:
...the sorts of things that PR folks & brand managers focus on. The reason being is that if you have those things...
the incremental distribution helps subsidize the content creation & marketing costs
many of the links happen automatically (such that you don't need to spend as much on links & if/when you massage some other stuff in, it is mixed against a broader base of stuff)
that incremental distribution provides leverage in terms of upstream product suppliers (eg: pricing leverage) or who you are able to partner with & how (think about Mint.com co-marketing with someone or the WhiteHouse doing a presentation with CreditCards.com ... in addition to celebrity stuff & such ... or think of all the ways Amazon can sell things: rentals, digital, physical, discounts via sites like Woot, higher margin high fashion on sites like Zappos, etc etc etc)
as Google folds usage data & new signals in, you win
as Google tracks users more aggressively (Android + Chrome + Kansas City ISP), you win
if/when/as Google eventually puts some weight on social you win
people are more likely to buy since they already know/trust you
if anyone in your industry has a mobile app that is widely used & you are the lead site in the category you could either buy them out or be that app maker to gain further distribution
Google engineers are less likely to curb you knowing that you have an audience of rabid fans & they are more likely to consider your view if you can mobilize that audience against "unjust editorial actions"
A lot of the most valuable content on this site is locked-up. We’d love to open this content up, but there is currently no model that sufficiently rewards publishers for doing so. This is the case across the web, and it's the reason the most valuable content is not in Google.
It’s not in Google because Google, and the other search engines, don’t pay.
Fair? Unfair? Is there a better way? How can content providers - particularly newcomers - grow and prosper in such an environment?
Is using payment to influence search results unethical unless the check has Google on it?
None of those links in the content use nofollow, in spite of many of them having Google Analytics tracking URLs on them.
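For contrast, this is what a guideline-compliant campaign link would look like: the Google Analytics (UTM) tracking parameters stay, but the link carries rel="nofollow" so it passes no PageRank. The URL and parameter values are illustrative only:

```html
<!-- A marketing link with Google Analytics (UTM) tracking parameters.
     Adding rel="nofollow" stops it passing PageRank, which is what
     Google's guidelines ask of paid or campaign-placed links. -->
<a href="https://example.com/product?utm_source=partner-blog&amp;utm_medium=referral&amp;utm_campaign=launch"
   rel="nofollow">Our product</a>
```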
And I literally spent less than 10 minutes finding the above examples & writing this article. Surely Google insiders know more about Google's internal marketing campaigns than I do. Which leads one to ask the obvious (but uncomfortable) question: why doesn't Google police themselves when they are policing others? If their algorithmic ideals are true, shouldn't they apply to Google as well?
Clearly Google takes paid links that pass PageRank seriously, as acknowledged by their repeated use of them.