Almost anyone who has spent a couple of months in internet marketing has seen some "shocking" case study where changing the color of a button increased sales 183% or the like. In many cases such gains only happen because the original site had no focus on conversion at all.
The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.
By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.
One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands of advertisers - sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only pay for clicks that you were getting for free, but you also pay more for the other paid clicks that used to cost less.
Google tests colors & can control the flow of traffic based not only on result displacement, but also on link colors.
It was reported last month that Google tested adding ads to the Knowledge Graph. The advertisement link is blue, while the ad disclosure is far to the right, out of view, and gray.
I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.
Where Google gets paid for the link, the link is blue.
Where Google scrapes third party content & shows excerpts, the link is gray.
The primary goal of such a knowledge block is result displacement - shifting more clicks to the ads and away from the organic results.
When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it's below the fold.
What's so bad about this practice in health?
Context Matters: Many conditions have overlapping symptoms, and a quick glance at a few out-of-context symptoms can cause a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google's remote rater documents discuss Your Money or Your Life (YMYL) topics & talk up the importance of knowing who exactly is behind content, but when they use gray font on the source link for their scrape job they are doing just the opposite.
Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can't find a pharmaceutical company worth tens of billions of dollars that hasn't pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
Artificially Driving Up Prices: in-patent drugs often cost 100x as much as the associated generics, so the affordable solutions are priced out of the ad auctions, where the price of a click can vastly exceed the profit from selling a generic prescription drug.
Where's the business model for publishers when they carry real editorial costs & must fact-check and regularly update their content, their content is good enough to be featured front & center on Google, but attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?
Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?
There’s some amusing historical revisionism going on in the SEO punditry world right now, which got me thinking about the history of SEO. I’d like to talk about some common themes of this historical revision, which goes along the lines of “what I predicted all those years ago came true - what a visionary I am!” No naming names, as I don't mean this to be anything personal - the same theme has popped up in a number of places - just making some observations :)
See if you agree….
Divided We Fall
The SEO world has never been united. There are no industry standards and qualifications like you’d find in the professions, such as being a doctor, lawyer, or builder. If you say you’re an SEO, then you’re an SEO.
Part of the reason for the lack of industry standard is that the search engines never came to the party. Sure, they talked at conferences, and still do. They offered webmasters helpful guidelines. They participated in search engine discussion forums. But this was mainly to do with risk management. Keep your friends close, and your enemies closer.
In all these years, you won’t find one example of a representative from a major search engine saying “Hey, let’s all get together and form an SEO standard. It will help promote and legitimize the industry!”.
No, it has always been decrees from on high. “Don’t do this, don’t do that, and here are some things we’d like you to do”. Webmasters don’t get a say in it. They either do what the search engines say, or they go against them, but make no mistake, there was never any partnership, and the search engines didn’t seek one.
This didn’t stop some SEOs from seeing it as a form of quasi-partnership, however.
Some SEOs chose to align themselves with search engines and do their bidding. If the search engine reps said “do this”, they did it. If the search engines said “don’t do this”, they’d wrap themselves up in convoluted rhetorical knots pretending not to do it. This still goes on, of course.
In the early 2000’s, it turned, curiously, into a question of morality. There was “Ethical SEO”, although quite what it had to do with ethics remains unclear. Really, it was another way of saying “someone who follows the SEO guidelines”, presuming that whatever the search engines decree must be ethical, objectively good, and have nothing to do with self-interest. It’s strange how people kid themselves, sometimes.
What was even funnier was the search engine guidelines were kept deliberately vague and open to interpretation, which, of course, led to a lot of heated debate. Some people were “good” and some people were “bad”, even though the distinction was never clear. Sometimes it came down to where on the page someone puts a link. Or how many times someone repeats a keyword. And in what color.
It got funnier still when the search engines moved the goal posts, as they are prone to do. What was previously good - using ten keywords per page - suddenly became the height of evil, but using three was “good” and so all the arguments about who was good and who wasn’t could start afresh. It was the pot calling the kettle black, and I’m sure the search engines delighted in having the enemy warring amongst themselves over such trivial concerns. As far as the search engines were concerned, none of them were desirable, unless they became paying customers, or led paying customers to their door. Then there was all that curious Google+ business.
It's hard to keep up, sometimes.
Playing By The Rules
There’s nothing wrong with playing by the rules. It would have been nice to think there was a partnership, and so long as you followed the guidelines, high rankings would naturally follow, the bad actors would be relegated, and everyone would be happy.
But this has always been a fiction. A distortion of the environment SEOs were actually operating in.
Jason Calacanis, never one to miss an opportunity for controversy, fired some heat seekers at Google during his WebmasterWorld keynote address recently…..
Calacanis proceeded to describe Cutts and Google in terms like, “liar,” “evil,” and “a bad partner.” He cautioned the PubCon audience to not trust Google, and said they cooperate with partners until they learn the business and find a way to pick off the profits for themselves. The rant lasted a good five minutes….
He accused Google of doing many of the things SEOs are familiar with, like making abrupt algorithm changes without warning. They don’t consult, they just do it, and if people’s businesses get trashed as a result, then that’s just too bad. Now, if that’s a sting for someone who is already reasonably wealthy and successful like Calacanis, just imagine what it feels like for the much smaller web players who are just trying to make a living.
The search business is not a pleasant environment where all players have an input, and then standards, terms and play are generally agreed upon. It’s war. It’s characterized by a massive imbalance of power and wealth, and one party will use it to crush those who it determines stand in its way.
Of course, the ever pleasant Matt Cutts informs us it’s all about the users, and that’s a fair enough spin of the matter, too. There was, and is, a lot of junk in the SERPs, and Mahalo was not a partner of Google, so any expectation they’d have a say in what Google does is unfounded.
The take-away is that Google will set rules that work for Google, and if they happen to work for the webmaster community too, well that’s good, but only a fool would rely on it. Google care about their bottom line and their projects, not ours. If someone goes out of business due to Google’s behaviour, then so be it. Personally, I think the big technology companies do have a responsibility beyond themselves to society, because the amount of power they are now centralising means they’re not just any old company anymore, but great vortexes that can distort entire markets. For more on this idea, and where it’s all going, check out my review of “Who Owns The Future” by Jaron Lanier.
So, if you see SEO as a matter of playing by their rules, then fine, but keep in mind "those who can give you everything can also take everything away". Those rules weren't designed for your benefit.
There was a massive opportunity cost in following so-called ethical SEO during the 2000s.
For a long time, it was relatively easy to get high rankings by being grey. And if you got torched, you probably had many other sites with different link patterns good to go. This was against the webmaster guidelines, but given that marketing could be characterized as war, one does not let the enemy define one's tactics. Some SEOs made millions doing it. Meanwhile, a lot of content-driven sites disappeared. That was, perhaps, my own "a stopped clock is right twice a day" moment. It's not like I'm going to point you to all the stuff I've been wrong about, now, is it? :)
These days, a lot of SEO is about content and how that content is marketed, but more specifically it’s about the stature of the site on which that content appears. That’s the bit some pundits tend to gloss over. You can have great content, but that’s no guarantee of anything. You will likely remain invisible. However, put that exact same content on a Fortune 500 site, and that content will likely prosper. Ah, the rich get richer.
So, we can say SEO is about content, but that’s only half the picture. If you’re a small player, the content needs to appear in the right place, be very tightly targeted to your audience's needs so they don’t click back, and it should be pushed through various media channels.
Content, even from many of these "ethical SEOs", used to be created for search engines in the hope of netting as many visitors as possible. These days, it’s probably a better strategy to get inside the audience's heads and target it to their specific needs, as opposed to a keyword, then get that content out to wherever your audience happens to be. Unless, of course, you’re Fortune 500 or otherwise well connected, in which case you can just publish whatever you like and it will probably do well.
Fair? Not really, but no one ever said this game was fair.
Do I know what’s going to happen next? In ten years' time? Nope. I could make a few guesses, and like many other pundits, some guesses will prove right, and some will be wrong, but that’s the nature of the future. It will soon make fools of us all.
Having said that, will you take a punt and tell us what you think the future of SEO will be? Does it have one? What will it look like? If you’re right, then you can point back here in a few years' time and say “Look, I told you so!”.
If you’re wrong, well, there’s always historical revisionism :)
Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.
we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said "we've got to encrypt the web."
When asked about the recent increase in (not provided), a Google representative stated the following:
We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….
The motivation here is not to drive the ads side — it’s for our search users.
What an excellent time for Google to block paid search referrals as well.
It’s one of those truths that is difficult to see, until you look closely at what’s really there.
To see something as it really is, we should try to identify our own bias, and then let it go.
This article tries to make sense of Google's latest moves regarding links.
It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.
Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.
Google’s negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purpose of higher rank AND staying within Google's guidelines will not be easy.
Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or at the very least, no-followed, so there could be no confusion about intent.
How many do?
Think Like Google
Ralph Tegtmeier: In response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", Ralph writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"
In order to understand what Google is doing, it can be useful to set aside any SEO bias - i.e. what we may like to see from an SEO standpoint - and instead try to look at the world from Google’s point of view.
I ask myself “if I were Google, what would I do?”
Clearly I'm not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.
I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.
Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of the content I show above the fold is controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.
It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google aren’t the webmaster's friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.
So why are so many SEOs listening to Google’s directives?
A year or two ago, it would be madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google link quality control. For free. They’re pointing out the links they see as being “bad” - links Google’s algorithms may have missed.
Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.
It is understandable that site owners don't spend much time removing links. First, from a site owner's perspective, taking links down involves a time cost with no corresponding benefit, especially if they receive numerous requests. Second, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?
The answer to this problem, from Google's John Mueller is telling.
A shrug of the shoulders.
It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.
Does anyone really think their rankings are going to return once they’ve been flagged?
Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.
If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.
No, you need to show you’ve made some effort to remove them.
If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.
So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?
It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?
A Better Use Of Your Time
Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?
But I think it’s the wrong question.
It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place, for the purposes of achieving higher rank, are going to need to be no-followed in order to be clear about intent. Extreme? What's going to be the emphasis in six months time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?
A couple of years ago it would be unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?
You could re-examine the relationship and figure that what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?
It strikes me you can lose your rankings at any time whether you follow the current guidelines or not, especially when the goal-posts keep moving. So, the risk of not following the guidelines and the risk of following the guidelines but not ranking well are pretty much the same - no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?
Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.
What's your long term "traffic from Google" strategy?
If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.
Google is getting a bit absurd in suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed a negative ROI on each investment - until your company goes under.
Some will describe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.
The advice on infographics in the above video suggests that embed code by default should include nofollow links.
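To make that advice concrete, here is a minimal sketch (all names and URLs are hypothetical, not from any real tool) of what a default-nofollow embed-code generator looks like - the attribution link back to the source carries rel="nofollow", so syndicated copies pass no PageRank:

```python
# Hypothetical sketch of an infographic embed-code generator.
# Function name, parameters, and URLs are illustrative assumptions.

def make_embed_code(image_url, source_url, title, nofollow=True):
    """Build the HTML snippet a visitor pastes to syndicate an infographic.

    With nofollow=True (the default Google's advice asks for), the
    attribution link back to the original source passes no PageRank.
    """
    rel = ' rel="nofollow"' if nofollow else ""
    return (
        f'<a href="{source_url}"{rel}>'
        f'<img src="{image_url}" alt="{title}"></a>'
    )

snippet = make_embed_code(
    "https://example.com/infographic.png",
    "https://example.com/research",
    "Example infographic",
)
print(snippet)
```

Note the asymmetry the article goes on to describe: the original creator is told to default to the nofollow branch, while large syndicators face no such expectation.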
Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.
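The arithmetic behind that estimate can be made explicit. The inputs below are the article's own rough assumptions ($2,000 per piece, 1-in-10 success rate), not measured data:

```python
# Back-of-envelope expected cost per successful viral infographic,
# using the article's rough assumptions as inputs.
cost_per_infographic = 2_000   # research, creation, revision & promotion
success_rate = 1 / 10          # roughly 9 out of 10 go nowhere

# Expected spend per infographic that actually takes off:
cost_per_success = cost_per_infographic / success_rate
print(cost_per_success)  # 20000.0
```

Any worsening of the success rate - inexperience, poor strategy, poor market fit, poor timing - pushes the denominator down and the cost per success up.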
If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.
So if your own featured premium content - content you dropped 4 or 5 figures on - gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?
Not a snowball's chance in hell.
An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.
I also find a bit disingenuous the claim that people putting something 20,000 pixels tall on their site are not actively vouching for it. If something was crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.
Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.
No-cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.
If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!
Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.
It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.
With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.
A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.
@phillian Like all empires, ultimately Google will be the root of its own demise. — Cygnus SEO (@CygnusSEO) August 13, 2013
CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.
Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.
Typically there is no presumed intent to spam so long as the links go into a large site (sure, there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of dollars. You better use nofollow, just to play it safe!
Here are a few common examples of unnatural links that violate our guidelines: ... Links with optimized anchor text in articles or press releases distributed on other sites.
For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text is “optimized” and therefore a violation of Google’s guidelines.
Ambiguously speaking, of course.
To put the press release change in context, Google’s guidelines state:
Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site
So, links gained, for SEO purposes - intended to manipulate ranking - are against Google Guidelines.
In this chat, Google’s John Mueller says that if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on such links.
Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.
Simple, right? If it really is a press release, and not an attempt to build links for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?
But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.
They're missing the point.
Am I Taking Google’s Side?
Not taking sides.
Just hoping to shine some light on a wider issue.
If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but sadly lacking in much SEO punditry - that Google is not on the webmasters side. Google is on Google’s side. Google often say they are on the users side, and there is certainly some truth in that.
However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more “aggressive” webmasters, Google’s behaviour could be characterized as “keep your friends close, and your enemies closer”.
This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click based advertising that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.
If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.
Why don't Google simply ignore press release links? Easy enough to do. Why go this route of making it public? After all, Google are typically very secretive about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.
Big missiles heading your way.
The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.
If an SEO is not following the guidelines - guidelines that are always shifting - yet claim they do, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:
Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO
.....but it’s not unreasonable to expect a somewhat easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed it.
SEO is not the easiest career choice, huh.
Groups likely to be happy about this latest Google push are legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:
I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.
Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results
However, I’m not sure that will mean press releases are seen as any more credible, as press releases never enjoyed a stellar reputation even pre-SEO, but it may thin the crowd somewhat, which increases an agency’s chances of getting its clients seen.
Guidelines Homing In On Target
One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:
we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:
Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the article provide original content or information, original reporting, original research, or original analysis?
Does the page provide substantial value when compared to other pages in search results?
How much quality control is done on content?
….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.
It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.
Take a look at the Boston Globe, sold along with a boatload of content at a 93% decline from its purchase price. Quality content, sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.
High quality content often has a high upfront production cost attached to it. Given measly web advertising rates, the high possibility of invisibility, and the likelihood of getting content scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?
Google can move the goalposts whenever they like. What you’re doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.
Do you have a contingency plan for such an eventuality?
The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic.
To ask about the minutiae of Google’s policies and guidelines is to miss the point. The real question is: how prepared are you when Google shuts off your flow of traffic because they’ve reset the goalposts?
Focusing on the minutiae of Google's policies is, indeed, to miss the point.
This is a question of risk management. What happens if your main site, or your client’s site, runs afoul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don’t rank, isn’t that effectively the same as a ban, i.e. you’re invisible? Do you treat search traffic as a bonus, rather than the main course?
Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.
Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).
What hasn't received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse of Google's monopoly position in search.
According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click values. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers are opted into mobile ads (and have to go through some tricky hoops to figure out how to disable it), Google has every incentive to promote what a big growth channel mobile search is for their business.
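To see the squeeze those RKG ratios imply, run them through some hypothetical round numbers (these dollar figures are illustrative assumptions, not RKG's raw data):

```python
# Hypothetical baseline: a desktop click costs $1.00 and is worth $1.00.
desktop_cpc, desktop_value = 1.00, 1.00

# Apply the RKG ratios quoted above (illustrative, not RKG's raw data).
mobile_cpc = desktop_cpc * 0.60      # mobile clicks cost ~60% of desktop
mobile_value = desktop_value * 0.22  # but are worth only ~22% as much

# Cost paid per dollar of click value, for each channel.
desktop_cost_per_value = desktop_cpc / desktop_value  # $1.00
mobile_cost_per_value = mobile_cpc / mobile_value     # ~$2.73

print(f"mobile costs {mobile_cost_per_value / desktop_cost_per_value:.1f}x "
      "as much per dollar of click value")  # roughly 2.7x
```

Under those assumptions, an advertiser opted into mobile pays roughly 2.7 times as much per dollar of click value, which is why the forced opt-in mattered.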
Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).
Hiding The Value of SEO
Google is not just hiding half of their own keyword referral data - they are hiding so much more than half that even when you mix in Bing and Yahoo! traffic, over 50% of the total is still hidden.
Google's 86% of the 26,233 searches is 22,560 searches.
Keyword (not provided) being shown for 13,413 is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
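The arithmetic above is easy to verify:

```python
# Figures from the analytics sample described above.
total_search_visitors = 26233
google_share = 0.86
not_provided = 13413

google_visitors = total_search_visitors * google_share   # ~22,560 searches
hidden_pct = not_provided / google_visitors * 100        # ~59% hidden

print(f"Google organic visitors: {google_visitors:,.0f}")
print(f"shown as (not provided): {hidden_pct:.0f}%")
```

So at least 59% of the Google organic keyword data in that sample was already dark, before the later move to near-100% (not provided).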
Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.
When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.
Subsidizing Scammers Ripping People Off
A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had showed absolutely zero sales growth after a year of 15% monthly increase in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over the time period. Looking at the backlinks, nothing there either.
So what happened?
Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot to show keyword (not provided) Google visitors & claim that they were "doing SEO."
And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.
Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.
This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.
Anti-competitive Funding Blocking Competing Ad Networks
sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG
Turn AdBlock Plus on & search for credit cards on Google and get ads.
Do that same search over at Bing & get no ads.
How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?
Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.
Buying Browser Marketshare
Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.
If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.
When Google does it, it's "good for the user."
Vampire Sucking The Lifeblood Out of SEO
Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Login to Chrome & searchers don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."
What do we call an entity that considers something "its lifeblood" while sucking it out of others?
“Search engine optimization” has always been an odd term as it’s somewhat misleading. After all, we’re not optimizing search engines.
SEO came about when webmasters optimized websites. Specifically, they optimized the source code of pages to appeal to search engines. The intent of SEO was to ensure websites appeared higher in search results than if the site was simply left to site designers and copywriters. Often, designers would inadvertently make sites uncrawlable, and therefore invisible in search engines.
But there was more to it than just enhancing crawlability.
SEOs examined the highest ranking page, looked at the source code, often copied it wholesale, added a few tweaks, then republished the page. In the days of Infoseek, this was all you needed to do to get an instant top ranking.
I know, because I used to do it!
At the time, I thought it was an amusing hacker trick. It also occurred to me that such positioning could be valuable. Of course, this rather obvious truth occurred to many other people, too. A similar game had been going on in the Yahoo Directory where people named sites “AAAA...whatever” because Yahoo listed sites in alphabetical order. People also used to obsessively track spiders, spotting fresh spiders (Hey Scooter!) as they appeared and....cough......guiding them through their websites in a favourable fashion.
When it comes to search engines, there’s always been gaming. The glittering prize awaits.
The new breed of search engines made things a bit more tricky. You couldn’t just focus on optimizing code in order to rank well. There was something else going on.
So, SEO was no longer just about optimizing the underlying page code, SEO was also about getting links. At that point, SEO jumped from being just a technical coding exercise to a marketing exercise. Webmasters had to reach out to other webmasters and convince them to link up.
A young upstart, Google, placed heavy emphasis on links, making use of a clever algorithm that sorted “good” links from, well, “evil” links. This helped make Google’s result set more relevant than other search engines. Amusingly enough, Google once claimed it wasn’t possible to spam Google.
Webmasters responded by spamming Google.
Or, should I say, Google likely categorized what many webmasters were doing as “spam”, at least internally, and may have regretted their earlier hubris. Webmasters sought links that looked like “good” links. Sometimes, they even earned them.
And Google has been pushing back ever since.
Building links pre-dated SEO, and search engines, but, once backlinks were counted in ranking scores, link building was blended into SEO. These days, most SEOs consider link building a natural part of SEO. But, as we've seen, it wasn’t always this way.
We sometimes get comments on this blog about how marketing is different from SEO. Well, it is, but if you look at the history of SEO, there has always been marketing elements involved. Getting external links could be characterized as PR, or relationship building, or marketing, but I doubt anyone would claim getting links is not SEO.
More recently, we’ve seen a massive change in Google. It’s a change that is likely being rolled out over a number of years. It’s a change that makes a lot of old school SEO a lot less effective in the same way introducing link analysis made meta-tag optimization a lot less effective.
My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.
Google is likely applying different algorithms to different sectors, so the SEO tactics used in one sector don’t work in another. They’re also looking at engagement metrics, trying to figure out if the user really wanted the result they clicked on. When you consider Google's work on PPC landing pages, this development is obvious. It’s the same measure: if people click back often, and too quickly, then the landing page quality score drops. This is likely happening in the SERPs, too.
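As a rough illustration of how such an engagement signal could be computed - this is a toy heuristic with an assumed 30-second threshold, not Google's actual algorithm - a "pogo-stick" can be flagged whenever a searcher bounces back to the results page quickly:

```python
# Toy heuristic, NOT Google's actual algorithm: flag SERP clicks where
# the searcher returned to the results page within a short window.
POGO_THRESHOLD_SECONDS = 30  # assumed cutoff for illustration

def pogo_stick_rate(clicks):
    """clicks: list of dicts with 'dwell_seconds' (None = never returned)."""
    if not clicks:
        return 0.0
    pogos = [c for c in clicks
             if c["dwell_seconds"] is not None
             and c["dwell_seconds"] < POGO_THRESHOLD_SECONDS]
    return len(pogos) / len(clicks)

clicks = [
    {"dwell_seconds": 5},     # quick bounce back to the SERP
    {"dwell_seconds": 240},   # long, engaged visit
    {"dwell_seconds": None},  # never came back to the SERP
    {"dwell_seconds": 12},    # another quick bounce
]
print(f"pogo-stick rate: {pogo_stick_rate(clicks):.0%}")  # 50%
```

A page whose rate creeps up relative to competing results would, under a scheme like this, look like a result users didn't really want.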
So, just like link building once got rolled into SEO, engagement will be rolled into SEO. Some may see that as the death of SEO, and in some ways it is, just like when meta-tag optimization, and other code optimizations, were deprecated in favour of other, more useful relevancy metrics. In other ways, it's SEO just changing like it always has done.
The objective remains the same.
Deciding On Strategy
So, how do you construct your SEO strategy? What will be your strategy going forward?
Some read Google’s Webmaster Guidelines. They'll watch every Matt Cutts video. They follow it all to the letter. There’s nothing wrong with this approach.
Others read Google’s Guidelines. They'll watch every Matt Cutts video. They read between the lines and do the complete opposite. Nothing wrong with that approach, either.
It depends on what strategy you've adopted.
One of the problems with letting Google define your game is that they can move the goalposts anytime they like. The linking that used to be acceptable, at least in practice, often no longer is. Thinking of firing off a press release? Well, think carefully before loading it with keywords:
This is one of the big changes that may have not been so clear for many webmasters. Google said, “links with optimized anchor text in articles or press releases distributed on other sites,” is an example of an unnatural link that violate their guidelines. The key are the examples given and the phrase “distributed on other sites.” If you are publishing a press release or an article on your site and distribute it through a wire or through an article site, you must make sure to nofollow the links if those links are “optimized anchor text.”
Do you now have to go back and unwind a lot of link building in order to stay in their good books? Or, perhaps you conclude that links in press releases must work a little too well, else Google wouldn’t be making a point of it. Or conclude that Google is running a cunning double-bluff hoping you’ll spend a lot more time doing things you think Google does or doesn’t like, but really Google doesn’t care about at all, as they’ve found a way to mitigate it.
Bulk guest posting was also included in Google's webmaster guidelines as a no-no, along with keyword-rich anchors in article directories. Even how a site monetizes - by doing things like blocking the back button - can be considered "deceptive" and grounds for banning.
How about the simple strategy of finding the top ranking sites, doing what they do, and adding a little more? Do you avoid saturated niches, and aim for the low-hanging fruit? Do you try to guess all the metrics and make sure you cover every one? Do you churn and burn? Do you play the long game with one site? Is social media and marketing part of your game, or do you leave these aspects out of the SEO equation? Is your currency persuasion?
Think about your personal influence and the influence you can manage without dollars or gold or permission from Google. Think about how people throughout history have sought karma, invested in social credits, and injected good will into their communities, as a way to “prep” for disaster. Think about it.
We may be “search marketers” and “search engine optimizers” who work within the confines of an economy controlled (manipulated) by Google, but our currency is persuasion. Persuasion within a market niche transcends Google
It would be interesting to hear the strategies you use, and whether you plan on using a different strategy going forward.
We have reviewed a number of contextual ad networks & Media.net scored as the best network outside of AdSense. Many smaller ad networks have a huge fall off, to where if you earned 50 cents or a dollar a click with Google AdSense, you'd see nickel and penny clicks. Thankfully Media.net is nothing like that & they are perhaps the best network at competing with AdSense on a CPM basis. Their interface is quite easy to use, both in terms of creating & customizing new ad units and in tracking performance reports.
The application only takes a couple of minutes. Account approval may take 4 or 5 business days to about a week. Once your account is approved, each additional site you submit must also be approved, but your account representative can help with that, and getting additional sites approved should take a day or less.
They have high traffic quality standards and manually review all sites to help maintain network quality. They require English as your primary language & that your site receives the majority of its traffic from the United States, Canada, and the United Kingdom. Other publisher requirements are posted online. Their terms of service are published at media.net/legal/tos and their program guidelines are published at media.net/legal/programguidelines.
Media.net pays on a Net-30 basis and has a $100 minimum earning threshold.
You can select Paypal or bank wire transfer as your payment method.
RPM / CPM Rate
The earnings potential for any ad network is driven by
the depth of the ad network
the relevancy of the ads
how tightly ads can be integrated to fit the theme of the site
the commercial appeal of the publisher's topic
Ad Network Depth
Since Media.net leverages the Yahoo Bing Network, it has significant ad depth inside the United States. Shortly after its launch in 2012, Media.net CEO Divyank Turakhia stated: "Media.net has contextually optimized over $200 million worth of internet traffic." 6 months later their ad network already had over 2.5 billion pageviews.
While the earnings from Media.net are typically not vastly better than AdSense, they may be quite close to par and tend to outperform networks like Chitika, particularly when the published content is tied to a high value topic where pay per click (ppc) prices are significant. The cost per click (cpc) will vary across networks and topics, but in my experience the gap between AdSense and Media.net is far less than the gap between Media.net and networks like Chitika or the in-text ad networks like Infolinks, Kontera & Vibrant Media IntelliTXT. I've even seen some cases where Media.net outperformed AdSense on some topics. You don't have to choose one or the other though, as Media.net ads can be used in conjunction with AdSense ads on the same site.
Publishers who have had experience with the (now defunct) Yahoo! Publisher Network may recall the ads in Yahoo!'s old network were not particularly relevant. Ads in the Yahoo! Publisher Network lacked relevancy in part because Yahoo! placed excessive weight on the CPC which the advertiser was willing to pay. That in turn led to substantially lower ad clickthrough rates (CTR). And when some of the top paying advertisers like Vonage lowered their bids, ultimately that led to drastically lower RPM.
The good news with Media.net is it puts ad relevancy front and center. This leads to a high level of user engagement with the ads, which in turn drives a much better yield for publishers at a better RPM rate. Their ads have a 100% fill rate and use page level precision targeting.
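The relevancy-CTR-earnings chain described above is simple arithmetic. A publisher's RPM (revenue per thousand impressions) is just click-through rate times cost per click, scaled to 1,000 impressions - so the illustrative numbers below show why halving CTR through irrelevant ads halves earnings even when advertisers pay the same per click:

```python
def rpm(ctr, cpc):
    """Revenue per 1,000 impressions: click-through rate x cost per click x 1000."""
    return ctr * cpc * 1000

# Illustrative numbers only (not actual network rates): same $0.50 CPC,
# but irrelevant ads that cut CTR from 2% to 1% cut RPM in half.
print(rpm(ctr=0.02, cpc=0.50))  # relevant ads:   $10.00 RPM
print(rpm(ctr=0.01, cpc=0.50))  # irrelevant ads: $5.00 RPM
```

This is why Yahoo!'s old CPC-heavy ad weighting hurt publishers: a higher-paying but less relevant ad can still yield a lower RPM once the CTR collapse is factored in.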
Media.net is primarily a contextual ad network. Select publishers may be invited to sign up for the premium display advertising partnership Media.net has with Google, to complement the contextual ad performance with display ads. By leveraging ad retargeting features, display ads can help put a floor under the earning potential of pages covering topics of limited commercial appeal. Media.net also has mobile-specific ad units.
When a person sets up AdSense ads or other contextual ads on their site, there's a bit of a sense of "you're on your own." Worse yet, there is often a bit of a conflict between the recommendations from the AdSense team and the search quality team at Google.
One of Media.net's big points of differentiation is they have a team of over 450 employees who work on the product and help publishers better integrate the ads into their websites, including making the ad units really match the look and feel of the site. On some higher revenue sites Media.net will help create custom ad units. For instance, on TheStreet.com here's an example of an ad unit.
Even smaller sites will see a significant amount of effort spent on testing and optimizing ad colors & ensuring the ads match the look and feel of the site. Customer service is really one of the areas where Media.net shines brightest.
Media.net offers a variety of ad unit sizes.
most popular sizes: 336x280, 300x250, 728x90, 600x250, 160x600
horizontal sizes: 728x20, 600x120, 468x60
vertical sizes: 120x600, 120x300, 300x600, 160x90
square: 200x200, 250x250
Media.net offers a variety of pre-set ad unit templates to choose from and the ability to customize the colors further.
Usage samples / examples
The colors can be adjusted on a per-unit basis, so you can test having some ad units blend in to the design & use higher contrasting colors on other ad units. If your site has enough scale the Media.net team can also help you split test different colors. Another useful ad integration strategy Media.net allows & recommends is the creation of jQuery sticky ads which help keep ads in view as a person scrolls around a page, helping the ad units stand out.
Publisher Interface & Reporting
Media.net has put a lot of thought into usability and detailed reporting. Creating new ad units only takes a minute or two and posting the ad code into your site is just as quick.
Publishers can login to their accounts at the Media.net homepage and view stats 24 hours a day. Currently the dashboard does not offer CPC or click reporting, but it does report impressions, RPM, and estimated revenue. They report live impression traffic stats in real-time on the welcome screen, but earnings stats are typically updated early the next morning. In addition to account-wide reporting, their interface allows you to drill down into reporting on a per-site or per-unit basis.
minimum traffic: none, but they tend to be more likely to approve sites which are already approved in other tier 1 networks and/or obviously have a strong traffic footprint
prohibited topics: illegal drugs, pornography, violence, other illegal activities
Competitive eCPM when compared against AdSense in many categories.
Can be used in conjunction with AdSense.
Has some standard ad unit sizes & some that are custom, which gives you flexibility: the standard units fit typical ad spots, while the custom units look different from common formats and thus have greater eye appeal than a standard 468x60 or 728x90 banner.
Leverages the Yahoo! Bing Network, which gives it a fairly decent advertiser base & network scale to tap into to ensure there are relevant ads for most topics. I believe one thing that has helped them do so well is that Microsoft has done a much better job of pricing click quality than many ad networks did in years past.
Since they are a smaller company than Google, their partner communications are much clearer. You don't have to pull down millions of dollars a year to be considered a valued partner.
Their customer support team not only communicates clearly with publishers, but also works to help improve ad integration.
Once your account has been established and they see strong traffic quality they are generally quite quick at approving any additional sites you add to your account.
In addition to offering contextual ads, Media.net has a partnership to serve Google display ads on their network (though publishers have to sign up with Google).
While earning statistics are not real-time, they provide them the following day.
Fast Net-30 payouts.
The main drawbacks would be:
They require English as your primary language & that your site receives the majority of its traffic from the United States, Canada, and the United Kingdom. If you operate outside those markets, then they wouldn't be a great fit at the moment (though who knows where they may be in a couple years as Bing gets more aggressive with international expansion of their ad network).
It can take a while to get a new account approved, so it is worth applying early to have some experience with their network and to have a backup in place in case anything should happen to your AdSense account.
Inability to split test units. While you can use a PHP rotation script to compare 2 ad units against each other, there isn't a core split test feature baked into the ad platform by default - though if you are doing enough volume your customer support person will help set up and implement a split test for you.
While they do offer statistics on a per-site, per-day & per-ad unit basis (along with impression stats), they currently do not offer data down to the individual page or keyword level. They provide data on earnings, pageviews & eCPM; but they currently do not provide click or CPC data. (I believe they will be adding more granular metrics fairly soon).
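The rotation workaround mentioned in the drawbacks above is easy to sketch. The idea (shown here in Python rather than the PHP the text mentions; the logic is identical) is to serve each ad unit to a random half of pageviews and tally impressions per variant, so the per-unit reporting lets you compare earnings afterwards:

```python
import random

# Hypothetical placeholder snippets -- substitute your real ad unit codes.
AD_UNITS = {
    "A": "<!-- ad unit A code -->",
    "B": "<!-- ad unit B code -->",
}

tallies = {name: {"impressions": 0} for name in AD_UNITS}

def serve_ad():
    """Pick a variant at random (a 50/50 split) and record the impression."""
    name = random.choice(list(AD_UNITS))
    tallies[name]["impressions"] += 1
    return AD_UNITS[name]

# Simulate 1,000 pageviews; each variant gets roughly half the traffic.
for _ in range(1000):
    serve_ad()
print(tallies)
```

With each variant in its own reportable ad unit, the dashboard's per-unit RPM figures then serve as the comparison - no built-in split-test feature required.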
The default amount of useful information offered by the new layout is less than the old layout provided, while requiring users to interact with the result set to get the information they want. You get a picture, but the only way to see the phone number is to click into that result set or conduct a branded query from the start.
If you search for a general query (say "Indian restaurants") and want the phone number of a specific restaurant, you will likely need to click on that restaurant's picture in order to shift the search to that restaurant's branded search result set to pull their phone number & other information into the page. In that way Google is able to better track user engagement & enhance personalization on local search. When people repeatedly click into the same paths from logged in Google user accounts then Google can put weight on the end user behavior.
This multi-click process not only gives Google usage data to refine rankings with, but it also will push advertisers into buying branded AdWords ads.
Where this new result set is a bit of a train wreck for navigational searches is when a brand is fairly generic & aligned with a location as part of the business name. For instance, in Oakland there is a place named San Francisco Pizza. Even if you do that branded search, you still get the carousel & there might also be three AdWords ads above the organic search results.
Some of Google's other verticals may appear above the organic result set too. When searching for downtown Oakland hotels they offer listings of hotels in San Francisco & Berkeley inside the hotel onebox.
Perhaps Google can patch together some new local ad units that work with the carousel to offer local businesses a flat-rate monthly ad product. A lot of advertisers would be interested in testing a subscription product that enabled them to highlight selected user reviews and include other options like ratings & coupons & advertiser control of the image. As the search result set becomes the destination some of Google's ad products can become much more like Yelp's.
In the short term the new layout is likely a boon for Yelp & some other local directory plays. Whatever segment of the search audience dislikes the new carousel will likely be shunted into many of these other local directories.
In the long run some of these local directories will be the equivalent of MapQuest. As Google gains confidence they will make their listings richer & have more confidence in entirely displacing the result set. The following search isn't a local one, but it is a good example of where we may be headed. Even though the search is set to "web" results (rather than "video" results) the first 9 listings are from YouTube.