Gray Hat Search Engineering

Jan 3rd

Almost anyone who has spent a couple of months in internet marketing has seen some "shocking" case study where changing the color of a button increased sales by 183% or the like. In many cases such gains only happen because the original site had no focus on conversion at all.

Google, on the other hand, has billions of daily searches and is constantly testing ways to increase yield:

The company was considering adding another sponsored link to its search results, and they were going to do a 30-day A/B test to see what the resulting change would be. As it turns out, the change brought massive returns. Advertising revenues from those users who saw more ads doubled in the first 30 days.
...
By the end of the second month, 80 percent of the people in the cohort that was being served an extra ad had started using search engines other than Google as their primary search engine.

One of the reasons traditional media outlets struggle with the web is the perception that ads and content must be separated. When they had regional monopolies they could make large demands of advertisers - sort of like how Google may increase branded CPCs on AdWords by 500% if you add sitelinks. You not only pay for clicks that you were getting for free, but you also pay more for the other paid clicks that you were getting cheaper in the past.

That's how monopolies work - according to Eric Schmidt they are immune from market forces.

Search itself is the original "native ad." The blend confuses many searchers as the background colors fade into white.

Google tests colors & can control the flow of traffic based not only on result displacement, but also the link colors.

It was reported last month that Google tested adding ads to the Knowledge Graph. The advertisement link is blue, while the ad disclosure is far to the right, out of view, & gray.

I was searching for a video game yesterday & noticed that now the entire Knowledge Graph unit itself is becoming an ad unit. Once again, gray disclosure & blue ad links.

Where Google gets paid for the link, the link is blue.

Where Google scrapes third party content & shows excerpts, the link is gray.

The primary goal of such a knowledge block is result displacement - shifting more clicks to the ads and away from the organic results.

When those blocks appear in the search results, even when Google manages to rank the Mayo Clinic highly, it's below the fold.

What's so bad about this practice in health?

  • Context Matters: Many issues have overlapping symptoms, where a quick glance at a few out-of-context symptoms causes a person to misdiagnose themselves. Flu-like symptoms from a few months ago turned out to be an indication of a kidney stone. That level of nuance will *never* be in the knowledge graph. Google's remote rater documents discuss Your Money Your Life (YMYL) topics & talk up the importance of knowing who exactly is behind content, but when they use gray font on the source link for their scrape job they are doing just the opposite.
  • Hidden Costs: Many of the heavily advertised solutions appearing above the knowledge graph have hidden costs yet to be discovered. You can't find a pharmaceutical company worth tens of billions of dollars that hasn't pleaded guilty to numerous felonies associated with deceptive marketing and/or massaging research.
  • Artificially Driving Up Prices: In-patent drugs often cost 100x as much as the associated generic drugs, & thus the affordable solutions are priced out of the ad auctions, where the price of a click can vastly exceed the profit from selling a generic prescription drug.

Where's the business model for publishers when they carry real editorial costs, must fact-check and regularly update their content, and their content is good enough to be featured front & center on Google, yet attribution is nearly invisible (and thus traffic flow is cut off)? As the knowledge graph expands, what does that publishing business model look like in the future?

Does the knowledge graph eventually contain sponsored self-assessment medical quizzes? How far does this cancer spread?

Where do you place your chips?

Google believes it can ultimately fulfil people's data needs by sending results directly to microchips implanted in its users' brains.

Historical Revisionism

Nov 1st

A stopped clock is right two times a day.

There's some amusing historical revisionism going on in the SEO punditry world right now, which got me thinking about the history of SEO. I'd like to talk about some common themes of this historical revision, which goes along the lines of "what I predicted all those years ago came true - what a visionary I am!" No naming names, as I don't mean this to be anything personal - the same theme has popped up in a number of places - just making some observations :)

See if you agree….

Divided We Fall

The SEO world has never been united. There are no industry standards and qualifications like you'd find in the professions, such as doctor, lawyer, or builder. If you say you're an SEO, then you're an SEO.

Part of the reason for the lack of industry standard is that the search engines never came to the party. Sure, they talked at conferences, and still do. They offered webmasters helpful guidelines. They participated in search engine discussion forums. But this was mainly to do with risk management. Keep your friends close, and your enemies closer.

In all these years, you won’t find one example of a representative from a major search engine saying “Hey, let’s all get together and form an SEO standard. It will help promote and legitimize the industry!”.

No, it has always been decrees from on high. “Don’t do this, don’t do that, and here are some things we’d like you to do”. Webmasters don’t get a say in it. They either do what the search engines say, or they go against them, but make no mistake, there was never any partnership, and the search engines didn’t seek one.

This didn’t stop some SEOs seeing it as a form of quasi-partnership, however.

Hey Partner

Some SEOs chose to align themselves with search engines and do their bidding. If the search engine reps said “do this”, they did it. If the search engines said “don’t do this”, they’d wrap themselves up in convoluted rhetorical knots pretending not to do it. This still goes on, of course.

In the early 2000s, it turned, curiously, into a question of morality. There was "Ethical SEO", although quite what it had to do with ethics remains unclear. Really, it was another way of saying "someone who follows the SEO guidelines", presuming that whatever the search engines decree must be ethical, objectively good, and have nothing to do with self-interest. It's strange how people kid themselves, sometimes.

What was even funnier was the search engine guidelines were kept deliberately vague and open to interpretation, which, of course, led to a lot of heated debate. Some people were “good” and some people were “bad”, even though the distinction was never clear. Sometimes it came down to where on the page someone puts a link. Or how many times someone repeats a keyword. And in what color.

It got funnier still when the search engines moved the goal posts, as they are prone to do. What was previously good - using ten keywords per page - suddenly became the height of evil, but using three was “good” and so all the arguments about who was good and who wasn’t could start afresh. It was the pot calling the kettle black, and I’m sure the search engines delighted in having the enemy warring amongst themselves over such trivial concerns. As far as the search engines were concerned, none of them were desirable, unless they became paying customers, or led paying customers to their door. Then there was all that curious Google+ business.

It's hard to keep up, sometimes.

Playing By The Rules

There’s nothing wrong with playing by the rules. It would have been nice to think there was a partnership, and so long as you followed the guidelines, high rankings would naturally follow, the bad actors would be relegated, and everyone would be happy.

But this has always been a fiction. A distortion of the environment SEOs were actually operating in.

Jason Calacanis, never one to miss an opportunity for controversy, fired some heat seekers at Google during his WebmasterWorld keynote address recently…..

Calacanis proceeded to describe Cutts and Google in terms like, “liar,” “evil,” and “a bad partner.” He cautioned the PubCon audience to not trust Google, and said they cooperate with partners until they learn the business and find a way to pick off the profits for themselves. The rant lasted a good five minutes….

He accused Google of doing many of the things SEOs are familiar with, like making abrupt algorithm changes without warning. They don’t consult, they just do it, and if people’s businesses get trashed as a result, then that’s just too bad. Now, if that’s a sting for someone who is already reasonably wealthy and successful like Calacanis, just imagine what it feels like for the much smaller web players who are just trying to make a living.

The search business is not a pleasant environment where all players have an input, and then standards, terms and play are generally agreed upon. It's war. It's characterized by a massive imbalance of power and wealth, and one party will use it to crush those it determines stand in its way.

Of course, the ever pleasant Matt Cutts informs us it’s all about the users, and that’s a fair enough spin of the matter, too. There was, and is, a lot of junk in the SERPs, and Mahalo was not a partner of Google, so any expectation they’d have a say in what Google does is unfounded.

The take-away is that Google will set rules that work for Google, and if they happen to work for the webmaster community too, well that’s good, but only a fool would rely on it. Google care about their bottom line and their projects, not ours. If someone goes out of business due to Google’s behaviour, then so be it. Personally, I think the big technology companies do have a responsibility beyond themselves to society, because the amount of power they are now centralising means they’re not just any old company anymore, but great vortexes that can distort entire markets. For more on this idea, and where it’s all going, check out my review of “Who Owns The Future” by Jaron Lanier.

So, if you see SEO as a matter of playing by their rules, then fine, but keep in mind "those who can give you everything can also take everything away". Those rules weren't designed for your benefit.

Opportunity Cost

There was a massive opportunity cost in following so-called ethical SEO during the 2000s.

For a long time, it was relatively easy to get high rankings by being grey. And if you got torched, you probably had many other sites with different link patterns good to go. This was against the webmaster guidelines, but given that marketing could be characterized as war, one does not let the enemy define one's tactics. Some SEOs made millions doing it. Meanwhile, a lot of content-driven sites disappeared. That was, perhaps, my own "a stopped clock is right two times a day" moment. It's not like I'm going to point you to all the stuff I've been wrong about, now is it :)

These days, a lot of SEO is about content and how that content is marketed, but more specifically it’s about the stature of the site on which that content appears. That’s the bit some pundits tend to gloss over. You can have great content, but that’s no guarantee of anything. You will likely remain invisible. However, put that exact same content on a Fortune 500 site, and that content will likely prosper. Ah, the rich get richer.

So, we can say SEO is about content, but that's only half the picture. If you're a small player, the content needs to appear in the right place, be very tightly targeted to your audience's needs so they don't click back, and it should be pushed through various media channels.

Content, even from many of these "ethical SEOs", used to be created for search engines in the hope of netting as many visitors as possible. These days, it’s probably a better strategy to get inside the audience's heads and target it to their specific needs, as opposed to a keyword, then get that content out to wherever your audience happens to be. Unless, of course, you’re Fortune 500 or otherwise well connected, in which case you can just publish whatever you like and it will probably do well.

Fair? Not really, but no one ever said this game was fair.

Whatever Next?

Do I know what’s going to happen next? In ten years time? Nope. I could make a few guesses, and like many other pundits, some guesses will prove right, and some will be wrong, but that’s the nature of the future. It will soon make fools of us all.

Having said that, will you take a punt and tell us what you think will be the future of SEO? Does it have one? What will it look like? If you're right, then you can point back here in a few years' time and say "Look, I told you so!".

If you’re wrong, well, there’s always historical revisionism :)

Google Keyword (Not Provided)

Sep 25th

Just a follow up on the prior (not provided) post, as Google has shot the moon since our last post on this. Here's a quick YouTube video.

The above video references the following:

Matt Cutts when secured search first rolled out:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.

This Week in Google (TWIG) show 211, where Matt mentioned the inspiration for encrypted search:

we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said "we've got to encrypt the web."

The integration of organic search performance data inside AdWords.

The esteemed AdWords advertiser David Whitaker.

When asked about the recent increase in (not provided), a Google representative stated the following:

We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….

The motivation here is not to drive the ads side — it’s for our search users.

What an excellent time for Google to block paid search referrals as well.

If the move is important for user safety then it should apply to the ads as well.

The Benefits Of Thinking Like Google

Aug 27th

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google's negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purpose of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building "for the traffic", but if that were the case, they'd have no problem insisting those links were scripted so they could monitor traffic statistics, or at the very least, no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier, in response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to set aside any SEO bias - i.e. what we may like to see from an SEO standpoint - and instead try to look at the world from Google's point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I'd see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can't be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don't. If anything, there is too much content, and a lot of it is junk. In fact, I'm getting more and more selective about the content I do show. So selective, in fact, that a lot of the content I show above the fold is controlled and "published", in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else's position to understand their truth. If you do, you'll soon realise that Google aren't the webmaster's friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would have been madness to suggest webmasters would pay to remove links, but that's exactly what's happening. Not only that, webmasters are doing Google's link quality control. For free. They're pointing out the links they see as being "bad" - links Google's algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner's perspective, taking links down involves a time cost with no corresponding benefit, especially if they receive numerous requests. Taking down links may also be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller, is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.
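For context, the disavow file itself is just a plain text list - which is part of why an unreviewed dump of it tells Google little. Its format (the URLs below are hypothetical) is one entry per line, with # for comments and a domain: prefix to disavow an entire domain:

```
# Contacted the site owner twice, no response.
http://spammy-directory.example.com/widgets/links.html

# Whole domain is a link network; removal requests ignored.
domain:spammy-links.example.net
```

The comments are where the manual review shows up: they document the outreach effort Google asks webmasters to make before disavowing.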

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It's the wrong question because it's just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place for the purposes of achieving higher rank are going to need to be no-followed in order to be clear about intent. Extreme? What's going to be the emphasis in six months' time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would have been unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that's exactly what's happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure that what you do on your site is absolutely none of Google's business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you're not compliant with their guidelines? Sure, they can. It's their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at any time whether you follow the current guidelines or not, especially when the goal-posts keep moving. So the risk of not following the guidelines, and of following the guidelines but not ranking well, is pretty much the same - no traffic. Do you have a plan to address the "no traffic from Google" risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Winning Strategies to Lose Money With Infographics

Aug 26th

Google is getting a bit absurd in suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed to have a negative ROI on each investment - until your company goes under.

Some will describe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.
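The arithmetic behind that figure is simple expected-value math; both inputs below are the post's own estimates, not measured data:

```python
# Back-of-the-envelope cost per *successful* infographic,
# using the estimates above (not measured data).
cost_per_infographic = 2_000  # research, creation, revision & promotion
success_rate = 1 / 10         # "something like 9 out of 10 go nowhere"

cost_per_success = cost_per_infographic / success_rate
print(f"${cost_per_success:,.0f} per successful viral infographic")
# prints "$20,000 per successful viral infographic"
```

Every extra failure, revision cycle, or promotional push raises the numerator, which is why inexperience, poor strategy, or poor timing only pushes that cost higher.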

If you run smaller & lesser-known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if your own featured premium content, which you dropped 4 or 5 figures on, gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find it a bit disingenuous to claim that people putting something 20,000 pixels large on their site are not actively vouching for it. If something was crap and people still felt like burning 20,000 pixels syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No-cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I are spammers, but put the content on Twitter, which handles 143,199 Tweets in a second, & those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure, there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are considered spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & many others that DennisG highlighted.

Oh, and here's an infographic for your pleasurings.

Google: Press Release Links

Aug 7th

So, Google have updated their Webmaster Guidelines.

Here are a few common examples of unnatural links that violate our guidelines:....Links with optimized anchor text in articles or press releases distributed on other sites.

For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text is “optimized” and therefore a violation of Google’s guidelines.

Ambiguously speaking, of course.

To put the press release change in context, Google’s guidelines state:

Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site

So, links gained, for SEO purposes - intended to manipulate ranking - are against Google Guidelines.

Google vs Webmasters

Here’s a chat...

In this chat, Google's John Mueller says that if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on such links.

Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.

Simple, right? If it really is a press release, and not an attempt to link build for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?

He/she wouldn't.
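Mechanically, the "safe side" John describes is a one-attribute change to the link markup. A sketch, with hypothetical URL and anchor text:

```html
<!-- Link as it might appear in an optimized press release: -->
<a href="http://example.com/rings">best wedding rings</a>

<!-- The same link, no-followed so it passes no PageRank
     and the question of intent goes away: -->
<a href="http://example.com/rings" rel="nofollow">best wedding rings</a>
```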

But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.

They're missing the point.

Am I Taking Google’s Side?

Not taking sides.

Just hoping to shine some light on a wider issue.

If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but sadly lacking in much SEO punditry - that Google is not on the webmasters side. Google is on Google’s side. Google often say they are on the users side, and there is certainly some truth in that.

However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more "aggressive" webmasters, Google's behaviour could be characterized as "keep your friends close, and your enemies closer".

This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click based advertising that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.

If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.

Why don't Google simply ignore press release links? Easy enough to do. Why go the route of making it public? After all, Google are typically very secretive about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.

Big missiles heading your way.

Guideline Followers

The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.

If an SEO is not following the guidelines - guidelines that are always shifting - yet claims to, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO

...but it's not unreasonable to expect that a somewhat easier route for litigants in the future might be "not complying with Google's guidelines", unless the SEO agency disclosed it.

SEO is not the easiest career choice, huh.

One group likely to be happy about this latest Google push is made up of legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:

I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.

Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results

However, I'm not sure that will make press releases any more credible, as press releases never enjoyed a stellar reputation even pre-SEO, but it may thin the crowd somewhat, which increases an agency's chances of getting their client seen.

Guidelines Honing In On Target

One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:

we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?

….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.

It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.

Take a look at the Boston Globe, sold along with a boatload of content at a 93% decline. Quality content, sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.

High quality content often has a high upfront production cost attached to it, and given measly web advertising rates, the high possibility of invisibility, and the likelihood of content being scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?

To not do so is also risky.

Even eHow, well known for cheap factory line content, is moving toward subscription membership revenues.

The Somewhat Bigger Question

Google can move the goalposts whenever they like. What you're doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.

Do you have a contingency plan for such an eventuality?

Johnon puts it well:

The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic.

To ask about the minutiae of Google's policies and guidelines is to miss the point. The real question is: how prepared are you for when Google shuts off your flow of traffic because they've moved the goalposts?

Focusing on the minutiae of Google's policies is, indeed, to miss the point.

This is a question of risk management. What happens if your main site, or your client's site, falls foul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don't rank, isn't that effectively the same as a ban, i.e. you're invisible? Do you treat search traffic as a bonus, rather than the main course?

Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.

Google Keyword (Not Provided): High Double Digit Percent

Aug 3rd

Most Organic Search Data is Now Hidden

Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).

What hasn't received anywhere near enough exposure (and should be discussed daily) is that the change was an anti-competitive abuse of Google's monopoly position in search.

The site which provided a count for (not provided) recently displayed over 40% of queries as (not provided), but that percentage didn't include the large percentage of mobile searchers who were passing no referrals at all & were showing up as direct website visitors. On July 30, Google started showing referrals for many of those mobile searchers, using keyword (not provided).

According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click values. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers have been opted into mobile ads (and have to jump through some tricky hoops to figure out how to disable them), Google has every incentive to promote what a big growth channel mobile search is for their business.

Looking at the analytics data for some non-SEO websites over the past 4 days, I see Google referring an average of 86% of the 26,233 search visitors, with 13,413 of those visits displayed as keyword (not provided).

Hiding The Value of SEO

Google is not only hiding half of their own keyword referral data, but they are hiding so much more than half that even when you mix in Bing and Yahoo! you still get over 50% of the total hidden.

Google's 86% of the 26,233 searches is 22,560 searches.

The 13,413 (not provided) visits represent 59% of those 22,560 searches. That means Google is hiding at least 59% of the keyword data for organic search. While they are now passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
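The arithmetic above can be sketched in a few lines. The visit counts are the sample figures quoted from the four-day data set, not a general benchmark:

```python
# Sample figures from the four-day analytics sample described above.
total_search_visits = 26_233   # all search-engine visits
google_share = 0.86            # Google's share of those visits
not_provided = 13_413          # visits labeled keyword (not provided)

# Google's portion of the search visits (~22,560).
google_visits = round(total_search_visits * google_share)

# Share of Google's keyword data that is hidden (~59%).
hidden_pct = not_provided / google_visits * 100

print(f"Google visits: {google_visits}")
print(f"Hidden keywords: {hidden_pct:.0f}%")
```

Mixing in the (fully visible) Bing and Yahoo! keyword data still leaves over half of all organic keywords hidden, since Google's 86% share dominates the total.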

Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.

Buy AdWords ads and the data gets sent. Rank organically and most of the data is hidden.

When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.

Subsidizing Scammers Ripping People Off

A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over that time period. Looking at the backlinks, nothing there either.

So what happened?

Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot that shows up in analytics as keyword (not provided) Google visitors & claim they were "doing SEO."

And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.

Or something like that:

Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.

This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.

Anti-competitive Funding Blocking Competing Ad Networks

John Andrews pointed to Google's blocking (then funding) of AdBlock Plus as an example of their monopolistic inhibiting of innovation.

sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG

Turn AdBlock Plus on & search for credit cards on Google and get ads.

Do that same search over at Bing & get no ads.

How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?

They can't.

Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.

Buying Browser Marketshare

Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.

If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.

When Google does it, it's "good for the user."

Vampire Sucking The Lifeblood Out of SEO

Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Log in to Chrome & searches don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."

What do we call an entity that considers something "its lifeblood" while sucking it out of others?

What Is Your SEO Strategy?

Jul 31st

How do you determine your SEO strategy?

Actually, before you answer, let’s step back.

What Is SEO, Anyway?

“Search engine optimization” has always been an odd term as it’s somewhat misleading. After all, we’re not optimizing search engines.

SEO came about when webmasters optimized websites. Specifically, they optimized the source code of pages to appeal to search engines. The intent of SEO was to ensure websites appeared higher in search results than if the site was simply left to site designers and copywriters. Often, designers would inadvertently make sites uncrawlable, and therefore invisible in search engines.

But there was more to it than just enhancing crawlability.

SEOs examined the highest-ranking page, looked at its source code, often copied it wholesale, added a few tweaks, then republished the page. In the days of Infoseek, this was all you needed to do to get an instant top ranking.

I know, because I used to do it!

At the time, I thought it was an amusing hacker trick. It also occurred to me that such positioning could be valuable. Of course, this rather obvious truth occurred to many other people, too. A similar game had been going on in the Yahoo Directory, where people named sites "AAAA...whatever" because Yahoo listed sites in alphabetical order. People also used to obsessively track spiders, spotting fresh spiders (Hey Scooter!) as they appeared and...cough...guiding them through their websites in a favourable fashion.

When it comes to search engines, there’s always been gaming. The glittering prize awaits.

The new breed of search engines made things a bit more tricky. You couldn’t just focus on optimizing code in order to rank well. There was something else going on.

Backlinks.

So, SEO was no longer just about optimizing the underlying page code, SEO was also about getting links. At that point, SEO jumped from being just a technical coding exercise to a marketing exercise. Webmasters had to reach out to other webmasters and convince them to link up.

A young upstart, Google, placed heavy emphasis on links, making use of a clever algorithm that sorted “good” links from, well, “evil” links. This helped make Google’s result set more relevant than other search engines. Amusingly enough, Google once claimed it wasn’t possible to spam Google.

Webmasters responded by spamming Google.

Or, should I say, Google likely categorized what many webmasters were doing as “spam”, at least internally, and may have regretted their earlier hubris. Webmasters sought links that looked like “good” links. Sometimes, they even earned them.

And Google has been pushing back ever since.

Building links pre-dated SEO, and search engines, but once backlinks were counted in ranking scores, link building was blended into SEO. These days, most SEOs consider link building a natural part of SEO. But, as we've seen, it wasn't always this way.

We sometimes get comments on this blog about how marketing is different from SEO. Well, it is, but if you look at the history of SEO, there has always been marketing elements involved. Getting external links could be characterized as PR, or relationship building, or marketing, but I doubt anyone would claim getting links is not SEO.

More recently, we’ve seen a massive change in Google. It’s a change that is likely being rolled out over a number of years. It’s a change that makes a lot of old school SEO a lot less effective in the same way introducing link analysis made meta-tag optimization a lot less effective.

My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.

Google is likely applying different algorithms to different sectors, so the SEO tactics used in one sector don't work in another. They're also looking at engagement metrics, so they're trying to figure out if the user really wanted the result they clicked on. When you consider Google's work on PPC landing pages, this development is obvious. It's the same measure. If people click back often, too quickly, then the landing page quality score drops. This is likely happening in the SERPs, too.

So, just like link building once got rolled into SEO, engagement will be rolled into SEO. Some may see that as a death of SEO, and in some ways it is, just like when meta-tag optimization, and other code optimizations, were deprecated in favour of other, more useful relevancy metrics. In other ways, it's SEO just changing like it always has done.

The objective remains the same.

Deciding On Strategy

So, how do you construct your SEO strategy? What will be your strategy going forward?

Some read Google’s Webmaster Guidelines. They'll watch every Matt Cutts video. They follow it all to the letter. There’s nothing wrong with this approach.

Others read Google’s Guidelines. They'll watch every Matt Cutts video. They read between the lines and do the complete opposite. Nothing wrong with that approach, either.

It depends on what strategy you've adopted.

One of the problems with letting Google define your game is that they can move the goalposts anytime they like. The linking that used to be acceptable, at least in practice, often no longer is. Thinking of firing off a press release? Well, think carefully before loading it with keywords:

This is one of the big changes that may not have been so clear to many webmasters. Google said, "links with optimized anchor text in articles or press releases distributed on other sites," is an example of an unnatural link that violates their guidelines. The key is the examples given and the phrase "distributed on other sites." If you are publishing a press release or an article on your site and distribute it through a wire or through an article site, you must make sure to nofollow the links if those links use "optimized anchor text."

Do you now have to go back and unwind a lot of link building in order to stay in their good books? Or, perhaps you conclude that links in press releases must work a little too well, else Google wouldn’t be making a point of it. Or conclude that Google is running a cunning double-bluff hoping you’ll spend a lot more time doing things you think Google does or doesn’t like, but really Google doesn’t care about at all, as they’ve found a way to mitigate it.

Bulk guest posting was also included in Google's webmaster guidelines as a no-no, along with keyword-rich anchors in article directories. Even how a site monetizes - by doing things like blocking the back button - can be considered "deceptive" and grounds for banning.

How about the simple strategy of finding the top ranking sites, do what they do, and add a little more? Do you avoid saturated niches, and aim for the low-hanging fruit? Do you try and guess all the metrics and make sure you cover every one? Do you churn and burn? Do you play the long game with one site? Is social media and marketing part of your game, or do you leave these aspects out of the SEO equation? Is your currency persuasion?

Think about your personal influence and the influence you can manage without dollars or gold or permission from Google. Think about how people throughout history have sought karma, invested in social credits, and injected good will into their communities, as a way to “prep” for disaster. Think about it.

We may be “search marketers” and “search engine optimizers” who work within the confines of an economy controlled (manipulated) by Google, but our currency is persuasion. Persuasion within a market niche transcends Google

It would be interesting to hear the strategies you use, and whether you plan on using a different strategy going forward.

New Local Carousel

Jun 24th

Google announced they rolled out their local carousel results on desktops in categories like hotels, dining & nightlife for US English search queries. The ranking factors driving local rank are aligned with the same ones that were driving the old 7 pack result set.

The layout seems to be triggered when there are 5 or more listings. One upside to the new layout is that clicks within the carousel might not fall off quite as quickly as they do with vertical listings, so if you don't rank #1 you might still get plenty of traffic.

The default amount of useful information offered by the new layout is less than the old layout provided, while requiring user interaction with the result set to get the information they want. You get a picture, but the only way the phone number appears in the result set is if you click into the result or conduct a branded query from the start.

If you search for a general query (say "Indian restaurants") and want the phone number of a specific restaurant, you will likely need to click on that restaurant's picture in order to shift the search to that restaurant's branded search result set to pull their phone number & other information into the page. In that way Google is able to better track user engagement & enhance personalization on local search. When people repeatedly click into the same paths from logged in Google user accounts then Google can put weight on the end user behavior.

This multi-click process not only gives Google usage data to refine rankings with, but it also will push advertisers into buying branded AdWords ads.

Where this new result set is a bit of a train wreck for navigational searches is when a brand is fairly generic & aligned with a location as part of the business name. For instance, in Oakland there is a place named San Francisco Pizza. Even if you do that branded search, you still get the carousel & there might also be three AdWords ads above the organic search results.

If that company isn't buying branded AdWords ads, they had best hope that their customers have large monitors, don't use Google, or are better than the average searcher at distinguishing between AdWords & organic results.

Some of Google's other verticals may appear above the organic result set too. When searching for downtown Oakland hotels they offer listings of hotels in San Francisco & Berkeley inside the hotel onebox.

Perhaps Google can patch together some new local ad units that work with the carousel to offer local businesses a flat-rate monthly ad product. A lot of advertisers would be interested in testing a subscription product that enabled them to highlight selected user reviews and include other options like ratings & coupons & advertiser control of the image. As the search result set becomes the destination some of Google's ad products can become much more like Yelp's.

In the short term the new layout is likely a boon for Yelp & some other local directory plays. Whatever segment of the search audience dislikes the new carousel will likely be shunted into many of these other local directories.

In the long run some of these local directories will be the equivalent of MapQuest. As Google gains confidence they will make their listings richer & have more confidence in entirely displacing the result set. The following search isn't a local one, but it is a good example of where we may be headed. Even though the search is set to "web" results (rather than "video" results), the first 9 listings are from YouTube.

Update: In addition to the alarming rise of further result displacement, the 2-step clickthrough process means that local businesses will lose even more keyword referral data, as many of the generic queries are replaced by their branded keywords in analytics data.

SEO: Dirty Rotten Scoundrels

Jun 12th

SEO is a dirty word.

PPC isn’t a dirty word.

Actually, they're not words, they're acronyms, but you get my drift, I'm sure :)

It must be difficult for SEO providers to stay on the “good and pure” side of SEO when the definitions are constantly shifting. Recently we’ve seen one prominent SEO tool provider rebrand as an “inbound marketing” tools provider and it’s not difficult to appreciate the reasons why.

SEO, to a lot of people, means spam. The term SEO is lumbered, rightly or wrongly, with negative connotations.

Email Optimization

Consider email marketing.

Is all email marketing spam? Many would consider it annoying, but obviously not all email marketing is spam.

There is legitimate email marketing, whereby people opt in to receive email messages they consider valuable. It is an industry worth around $2.468 billion. There are legitimate agencies providing campaign services, reputable vendors providing tools, and it can achieve measurable marketing results where everyone wins.

Yet, most email marketing is spam. Most of it is annoying. Most of it is irrelevant. According to a Microsoft security report, 97% of all email circulating is spam.

So, only around 3% of all email is legitimate. 3% of email is wanted. Relevant. Requested.

One wonders how much SEO is legitimate? I guess it depends what we mean by legitimate, but if we accept the definition I’ve used - “something relevant wanted by the user” - then, at a guess, I’d say most SEO these days is legitimate, simply because being off-topic is not rewarded. Most SEOs provide on-topic content, and encourage businesses to publish it - free - on the web. If anything, SEOs could be accused of being too on-topic.

The proof can be found in the SERPs. A site is requested by the user. If a listed site matches their query, then the user probably deems it relevant. They might find that degree of relevance, personally, to be somewhat lacking, in which case they'll click back, but we don't have a situation where search results are rendered irrelevant by the presence of SEO.

Generally speaking, search appears to work well in terms of delivering relevance. SEO could be considered cleaner than email marketing in that SEOs are obsessed with being relevant to a user. The majority of email marketers, on the other hand, couldn't seem to care less about what is relevant, just so long as they get something, anything, in front of you. In search, if a site matches the search query, and the visitor likes it enough to register positive quality metrics, then what does it matter how it got there?

It probably depends on whose business case we're talking about.

Advertorials

Matt Cutts has released a new video on Advertorials and Native Advertising.

Matt makes a good case. He reminds us of the idea on which Google was founded, namely citation. If people think a document is important, or interesting, they link to it.

This idea came from academia. The more an academic document is cited, and cited by those with authority, the more relevant that document is likely to be. Nothing wrong with that idea, however some of the time, it doesn’t work. In academic circles, citation is prone to corruption. One example is self-citation.

But really, excessive self-citation is for amateurs: the real thing is forming a "citation cartel", as Phil Davis from The Scholarly Kitchen puts it. In April this year, after receiving a "tip from a concerned scientist", Davis did some detective work using the JCR data and found that several journals published reviews citing an unusually high number of articles fitting the JIF window from other journals. In one case, the Medical Science Monitor published a 2010 review citing 490 articles; 445 of them were published in 2008-09 in the journal Cell Transplantation (44 of the other 45 were for articles from Medical Science Monitor published in 2008-09 as well). Three of the authors were Cell Transplantation editors.

So, even in academia, self-serving linking gets pumped and manipulated. When this idea is applied to the unregulated web, where there are vast sums of money at stake, you can see how citation very quickly changes into something else.

There is no way linking is going to stay “pure” in such an environment.

The debate around “paid links” and “paid placement” has been done over and over again, but in summary, the definition of “paid” is inherently problematic. For example, some sites invite guest posting, pay the writers nothing in monetary terms, but the payment is a link back to the writers site. The article is a form of paid placement, it’s just that no money changes hands. Is the article truly editorial?

It’s a bit grey.

A lot of the time, such articles pump the writer's business interests. Is that paid content, and does it need to be disclosed? Does it need to be disclosed to both readers and search engines? I think Matt's video suggests it isn't a problem, as utility is provided, but a link from said article may need to be nofollowed in order to stay within Google's guidelines.

Matt wants to see clear and conspicuous disclosure of advertorial content. Paid links, likewise. The disclosure should be made both to search engines and readers.

Which is interesting.

Why would a disclosure need to be made to a search engine spider? Granted, it makes Google’s job easier, but I’m not sure why publishers would want to make Google’s job easier, especially if there’s nothing in it for the publishers.

But here comes the stick, and not just from the web spam team.

Google News have stated that if a publication is taking money for paid content and not adequately disclosing that fact - in Google's view - to both readers and search engines, then that publication may be kicked from Google News. In so doing, Google increase the risk to the publisher, and therefore the cost, of accepting paid links or paid placement.

So, that’s why a publisher will want to make Google’s job easier. If they don’t, they run the risk of invisibility.

Now, on one level, this sounds fair and reasonable. The most “merit worthy” content should be at the top. A ranking should not depend on how deep your pockets are i.e. the more links you can buy, the more merit you have.

However, one of the problems is that the search results already work this way. Big brands often do well in the SERPs due to reputation gained, in no small part, from massive advertising spend that has the side effect, or sometimes direct effect, of inbound links. Do these large brands therefore have “more merit” by virtue of their deeper pockets?

Google might also want to consider why a news organization would blur advertorial lines when they never used to. Could it be because their advertising is no longer paying them enough to survive?

SEO Rebalances The Game

SEO has helped level the playing field for small businesses, in particular. The little guy didn't have deep pockets, but he could play the game smarter by figuring out what the search engines wanted, algorithmically speaking, and giving it to them.

I can understand Google’s point of view. If I were Google, I’d probably think the same way. I’d love a situation where editorial was editorial, and business was PPC. SEO, to me, would mean making a site crawlable and understandable to both visitors and bots, but that’s the end of it. Anything outside that would be search engine spam. It’s neat. It’s got nice rounded edges. It would fit my business plan.

But real life is messier.

If a publisher doesn’t have the promotion budget of a major brand, and they don’t have enough money to outbid big brands on PPC, then they risk being invisible on search engines. Google search is pervasive, and if you’re not visible in Google search, then it’s a lot harder to make a living on the web. The risk of being banned for not following the guidelines is the same as the risk of playing the game within the guidelines, but not ranking. That risk is invisibility.

Is the fact that a small business plays a game already stacked against them, by using SEO, "bad"? If they have to play harder than the big brands just to compete, and perhaps become a big brand themselves one day, then who can really blame them? Can a result that is relevant, as far as the user is concerned, really still be labelled "spam"? Is that label more to do with the search engine's business case than actual end-user dissatisfaction?

Publishers and SEOs should think carefully before buying into the construct that SEO, beyond Google’s narrow definition, is spam. Also consider that the more people who can be convinced to switch to PPC and/or stick to just making sites more crawlable, then the more spoils for those who couldn’t care less how SEO is labelled.

It would be great if quality content succeeded in the SERPs on merit, alone. This would encourage people to create quality content. But when other aspects are rewarded, then those aspects will be played.

Perhaps if the search engines could be explicit about what they want, and reward it when it's delivered, then everyone would be happy.

I guess the algorithms just aren’t that clever yet.
