Some sites have seen pretty drastic drops in Google search traffic recently, related to indexing issues. Google maintains that it is a glitch:
Just to be clear, the issues from this thread, which I have reviewed in detail, are not due to changes in our policies or changes in our algorithms; they are due to a technical issue on our side that will be visibly resolved as soon as possible (it may take up to a few days to be visible for all sites though). You do not need to change anything on your side and we will continue to crawl and index your content (perhaps not as quickly at the moment, but we hope that will be resolved for all sites soon). I would not recommend changing anything significantly at this moment (unless you spot obvious problems on your side), as these may result in other issues once this problem is resolved on our side.
For an example of one site's search traffic that was butchered by this glitch, see the images below. Note that before, Google traffic is ~ 10x what Yahoo! or Bing drive, and after the bug the traffic is ~ even.
Not that long ago I saw another site with over 500 unique linking domains which simply disappeared from the index for a few days, then came right back 3 days later. Google's push to become faster and more comprehensive has perhaps made them less stable, as digging into social media highlights a lot of false signals & often promotes a copy over the original. Add in any sort of indexing issues and things get really ugly really fast.
Now this may just be a glitch, but as Tedster points out, many such "glitches" often precede or coincide with major index updates. For as long as I have been in the SEO field, Google has made a major algorithmic change just before the holidays every year except last year.
I think the reasons they do it are likely 3 or 4 fold:
they want to make SEO unpredictable & unreliable (which ultimately means less resources are spent on SEO & the results are overall less manipulated)
they want to force businesses (who just stocked up on inventory) to enter the AdWords game in a big way
by making changes to the core relevancy algorithms (and having the market discuss those) they can slide in more self promotion via their vertical search services without it drawing much anti-trust scrutiny
the holidays are when conversion rates are the highest, so if they want to make changes to seek additional yield it is the best time to do it, and the holidays give them an excuse to offer specials or beta tests of various sorts
As an SEO with clients, the unpredictability is a bad thing, because it makes it harder to manage expectations. Sharp drops in rankings from Google "glitches" erode customer trust in the SEO provider. Sometimes Google will admit to major issues happening, and other times they won't until well *after* the fact. Being proven right after the fact still doesn't take back 100% of the uncertainty unleashed into the marketplace weeks later.
Even if half your clients double their business while 1/3 lose half their search traffic, as an SEO business you typically don't get to capture much of the additional upside...whereas you certainly capture the complaints from those who just fell behind. Ultimately this is one of the reasons why I think being a diversified web publisher is better than being an SEO consultant... if something takes off & something else drops then you can just pour additional resources into whatever is taking off and capture the lift from those changes.
If you haven't been tracking rankings now would be a great time to get on it. It is worth tracking a variety of keywords (at various levels of competition) daily while there is major flux going on, because that gives you another lens through which to view the relevancy algorithms, and where they might be headed.
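To make that daily tracking habit concrete, here is a minimal sketch that appends each day's observed positions to a CSV so the flux is easy to chart later. The keywords, the `ranks.csv` filename, and the rank numbers are all hypothetical; the positions themselves would come from whatever rank checker you already use.

```python
import csv
import datetime

def log_ranks(path, ranks):
    """Append today's (date, keyword, position) rows to a CSV file.
    `ranks` maps keyword -> observed ranking position."""
    today = datetime.date.today().isoformat()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for keyword, position in sorted(ranks.items()):
            writer.writerow([today, keyword, position])

# Hypothetical daily observations for keywords at different
# levels of competition:
log_ranks("ranks.csv", {"seo tools": 7, "link building tips": 12})
```

Run it once a day (a cron job works fine) and you build the time series that lets you see which direction the flux is pushing each keyword class.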
Google Instant launched. It is a new always-on search experience where Google tries to complete your keyword search by predicting what keyword you are searching for. As you type more letters the search results change.
Short intro here:
Long view here:
Not seeing it yet? You can probably turn it on here (though in some countries you may also need to be logged into a Google account). In time Google intends to make this a default feature, turned on for almost everyone (other than those with slow ISPs and older web browsers). And if you don't like it, the feature is easy to turn off at the right of the search box, but turning it off relies on a cookie. If you clear cookies the feature turns right back on.
Here is an image using Google's browser size tool, showing that when Google includes 4 AdWords ads only 50% of web browsers get to see the full 2nd organic listing, while only 20% get to see the full 4th organic listing.
Its implications for SEO are easy to understate. However, they can also be overstated: I already saw one public relations hack stating that it "makes SEO irrelevant."
Nothing could be further from the truth. If anything, Google instant only increases the value of a well thought out SEO strategy. Why? Well...
it consolidates search volume into a smaller basket of keywords
it further promotes the localization of results
it makes it easier to change between queries, so it's easier to type one more letter than to scroll down the page
it further pollutes AdWords impression testing, which was a great source of data
"More and more searches are done on your behalf without you needing to type. I actually think most people don't want Google to answer their questions," he elaborates. "They want Google to tell them what they should be doing next. ... serendipity—can be calculated now. We can actually produce it electronically."
What are some of the most bland and most well worn paths in the world? Established brands:
The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.
"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."
"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."
If Google is so smart then why the lazy reliance on brand? Why not show me something unique & original & world-changing?
While Google is collecting your data and selling it off to marketers, they have also thought of other ways to monetize that data and deliver serendipity:
"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."
Any guess how that product might have added value to the world? On down days (or days when you search for "debt help") would Google deliver more negatively biased ads & play off fears more, while on up days selling more euphoric ads? Might that serendipity put you on the wrong side of almost every trade you make? After all, that is how the big names in that space make money - telling you to take the losing side of a trade with bogus "research."
“All this information that you have about us: where does it go? Who has access to that?” (Google servers and Google employees, under careful rules, Schmidt said.) “Does that scare everyone in this room?” The questioner asked, to applause. “Would you prefer someone else?” Schmidt shot back – to laughter and even greater applause. “Is there a government that you would prefer to be in charge of this?”
That exchange helped John Gruber give Eric Schmidt the label Creep Executive Officer, while asking: "Maybe the question isn’t who should hold this information, but rather should anyone hold this information."
If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt
Here is the thing I don't get about Google's rhetorical position on serendipity & moral authority: if they are to be trusted to recommend what you do, then why do they recommend illegal activities like pirating copyright works via warez, keygens, cracks & torrents?
Years ago Google introduced rel=nofollow, claiming it as a cure-all for comment spam. Once in place, it was quickly promoted as a tool to use on any paid link. Google scared webmasters about selling links so much that many webmasters simply became afraid to link out to anyone for fear of falling out of favor with Google.
Our affiliate program on this site stopped passing link juice after a fellow SEO blogger outed it quite publicly. Other affiliate programs continue to pass PageRank. Highlighting Google's double standards invites more scrutiny and more selective arbitrary enforcement. Whereas promoting Google products earns free links. ;)
No Disclosure Required: WOOT!
Reading the news today I found out that VigLink bought out Driving Revenue. Both are networks to help publishers monetize their outbound links. The claim about VigLink is the one of no-effort money:
"Quite simply, if you're a Web publisher who hasn't recognized the value of your outbound traffic, you are leaving money on the table," said Raymond Lyle, CEO and Co-Founder of Driving Revenue. "Dozens of our publishers make six figure incomes for a one-time investment of one minute of work. Who isn't interested in that?"
The page loads fast. And your site looks exactly the same. Even your links look and behave the same way. The only difference is that now when your visitors buy products or services you'll earn a commission. ... Once you have set up VigLink you can sign in to view reports about your site. You can see how much money you are making every day and compare that with last week. You can see which merchants are the most profitable, and make decisions on who to link to in the future.
So basically VigLink is suggesting controlling who you link to based on whatever makes you the most money, and not providing any disclosure of the financial relationship.
AKA: paid links.
Here is where it really gets screwed up: Google is an investor in VigLink.
Selectively allowing some links to pass link juice while arbitrarily blocking others indeed controls the shape of the web graph. It gives anyone who works with Google a strong competitive advantage in the organic search results over those who are not using Google endorsed technology.
As Google reached the limits of returns in direct marketing they started pushing the value of branding (because, hey, if you can chalk it up to latent branding value there is no cap on your max bid). Surprisingly, they even got many big brands to buy their own brands AND buy sitelinks on the AdWords ads. Some went so far as providing case studies for how much of their own brand traffic they were now willing to pay for, which they previously got free. :D
Sure that can make sense for seasonal promotions, but you could do the same thing by having subdomains and sister websites. Dell.com can be the main site, Dell.net (or deals.dell.com) can be the deals & promotions website, and Dell.org can be the good karma charity site. No paying someone else for brand you already spent to build. Beautiful. But I digress...
Companies with a high page rank are in a strong position to move into new markets. By “pointing” to this new information from their existing sites they can pass on some of their existing search engine aura, guaranteeing them more prominence.
Google’s Mr Singhal calls this the problem of “brand recognition”: where companies whose standing is based on their success in one area use this to “venture out into another class of information which they may not be as rich at”. Google uses human raters to assess the quality of individual sites in order to counter this effect, he adds.
Those are all irrelevant details, just beyond Google's omniscient view. :D
The other thing which is absurd, is that if you listen to Google's SEO tips, they will tell you to dominate a small niche then expand. Quoting Matt Cutts: "In general, I’ve found that starting with a small niche and building your way up is great practice."
And now brand extension is somehow a big deal worth another layer of arbitrary manual inspection and intervention?
If sites which expand in scope deserve more scrutiny then why is there so much scrape & mash flotsam in the search results? What makes remixed chunks of content better than the original source? A premium AdSense feed? Brand?
This image might need updating in the years to come, but it does a great job laying out how Google works when you type a query into their search engine. Search is so easy to do that it is hard to appreciate how complex it is unless you take a look under the hood. Which is exactly what this graphic does :D
Click the image to get the full sized beefy image :D
A side benefit of this graphic is that it should help prospective clients realize how complex SEO & PPC campaigns can be. So if anyone is trying to be an el cheapo with their budget you can use this to remind them how complex search is, and thus how time consuming and expensive a proper search marketing campaign is.
In the past when I claimed that the Google Maps insertion in organic search results wasn't more organic search but rather a Google promotion, I was met with skepticism by some, who argued that Google Maps was just another flavor of organic search and visitors would still be able to go to the end ranked website.
If you search for something on Google and click on one of the end URLs you can still visit them, but Google took one step in the opposite direction today. If you click on the map now, the Google Maps section lists a bunch of places on the map rather than giving you the URLs. You then have to click on one of the locations to see it on the map and open a pop-up area which contains information, including the URL. More clicks to do the same thing.
How long until Google replaces the URL listings in the search results with links to locations on the Google Maps or links to Google Places pages? It is the next obvious step in this transition.
Originally Google wanted to send you to the destination as quickly as possible, hoping that in doing so they would encourage you to come back again. This year Google's strategy has changed to one that wants you to stay for a while. There is no better example of that shift than YouTube Leanback:
Jamie Davidson, a YouTube product manager, says that the 15 minutes of daily viewing by a user typically involves six videos, with the conclusion of each presenting "a decision point, and every decision point is an opportunity to leave. We’re looking at how to push users into passive-consumption mode, a lean-back experience."
Generally I have not been a huge fan of registering all your websites with Google (profiling risks, etc.), but they keep using the carrot nicely to lead me astray. :D ... So much so that I want to find a Googler and give them a hug.
Google recently decided to share some more data in their webmaster tools. And for many webmasters the data is enough to make it worth registering (at least 1 website)!
AOL Click Data
When speaking of keyword search volume breakdown data, people have typically shared information from the leaked AOL search data.
The big problem with that data is it is in aggregate. It is a nice free tool, and a good starting point, but it is fuzzy.
In general, for navigational searches people click the top result more often than they would on an informational search.
In general, for informational searches people tend to click throughout the full set of search results at a more even distribution than they would for navigational or transactional searches.
The only solid, recently shared public data on those breakdowns is from Dogpile [PDF], a meta search engine. But given how polluted meta search services tend to be (with ads mixed in their search results), those numbers were quite a bit off from what one might expect. And once more, they are aggregate numbers.
Pretty solid looking estimates can get pretty rough pretty fast. ;)
The Value of Data
If there is one critical piece of marketing worth learning above all others it is that context is important.
My suggestions as to what works, another person's opinions or advice on what you should do, and empirical truth collected by a marketer who likes to use numbers to prove his point ... well all 3 data sets fall flat on their face when compared against the data and insights and interactions that come from running your own business. As teachers and marketers we try to share tips to guide people toward success, but your data is one of the most valuable things you own.
A Hack to Collect Search Volume Data & Estimated CTR Data
In their Excel plug-in Microsoft shares the same search data they use internally, but it's not certain that Microsoft will keep sharing as much data once the Yahoo! Search deal is integrated.
Google offers numerous keyword research tools, but getting them to agree with each other can be quite a challenge.
There have been some hacks to collect organic search clickthrough rate data on Google. One of the more popular strategies was to run an AdWords ad for the exact match version of a keyword and bid low onto the first page of results. Keep the ad running for a while and then run an AdWords impression share report. With that data in hand you can estimate how many actual searches there were, and then compare your organic search clicks against that to get an effective clickthrough rate.
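The arithmetic behind that hack is simple; here is a minimal sketch where the impression count, impression share, and organic click numbers are all hypothetical, just to show the calculation:

```python
def estimated_searches(impressions, impression_share):
    """Estimate total exact-match searches from an AdWords impression
    share report: impressions seen / share of impressions received."""
    return impressions / impression_share

def organic_ctr(organic_clicks, impressions, impression_share):
    """Effective organic clickthrough rate measured against the
    estimated total search volume."""
    return organic_clicks / estimated_searches(impressions, impression_share)

# Hypothetical report: the ad logged 4,100 impressions at a 41%
# impression share, implying ~10,000 searches; 800 organic clicks
# against that volume works out to roughly an 8% effective CTR.
print(round(organic_ctr(800, 4100, 0.41), 2))
```

The weak point, as noted below, is that personalization and constant result-set testing make both inputs noisy.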
The New Solution
Given search personalization and localization and the ever-changing result sets with all the tests Google runs, even the above can be rough. So what is a webmaster to do?
Well Google upgraded the data they share inside their webmaster tools, which includes (on a per keyword level)
keyword clickthrough rank
clickthrough rate at various ranking positions
URL that was clicked onto
Trophy Keywords vs Brand Keywords
Even if your site is rather well known going after some of the big keywords can be a bit self-defeating in terms of the value delivered. Imagine ranking #6 or #7 for SEO. Wouldn't that send a lot of search traffic? Nope.
When you strip away the ego searches, the rank checkers, etc., it turns out that there isn't a ton of search volume to be had ranking on page 1 of Google for SEO.
With only a 2% CTR the core keyword SEO is driving less than 1/2 the traffic driven by our 2 most common brand search keywords. Our brand might not seem like it is getting lots of traffic with only a few thousand searches a month, but when you have a > 70% CTR that can still add up to a lot of traffic. More importantly, that is the kind of traffic which is more likely to buy from you than someone searching for a broad discovery or curiosity type of keyword.
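A rough back-of-the-envelope comparison shows why the small brand terms win; the search volumes below are hypothetical, picked only to illustrate the shape of the math:

```python
def monthly_clicks(searches, ctr):
    """Clicks driven per month = search volume x clickthrough rate."""
    return searches * ctr

# Hypothetical: a core keyword with ~90,000 monthly searches, where a
# mid-page ranking captures only ~2% of clicks...
core = monthly_clicks(90_000, 0.02)    # ~1,800 clicks

# ...versus a brand keyword with only ~4,000 monthly searches but a
# 70% clickthrough rate on the top listing.
brand = monthly_clicks(4_000, 0.70)    # ~2,800 clicks

print(core, brand)
```

The low-volume brand term drives more clicks, and those clicks convert far better than broad discovery traffic.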
The lessons for SEOs in that data?
Core keywords & raw mechanical SEO are both quite frequently heavily over-rated in terms of value.
Rather than sweating over trying to rank well for the hardest keywords, focus first on more niche keywords that are easy to rank for.
Search is becoming the default navigational tool for the web. People go to Google and then type in "yahoo." If you don't have a branded keyword as one of your top keywords that might indicate long-term risk to your business. If a competitor can clone most of what you are doing and then bake in a viral component you are toast.
Going After the Wrong Brand Keywords
Arbitraging 3rd party brands is an easy way to build up distribution quickly. This is why there are 4,982 Britney Spears fan blogs (well 2 people are actually fans, but the other 4,980 are marketers).
But if you want to pull in traffic you have to go after a keyword that is an extension of the brand. Ranking for "eBay" probably won't send you much traffic (as their clickthrough rate on their first result is probably even higher than the 70% I had above). Though if you have tips on how to buy or sell on eBay those kinds of keywords might pull in a much higher clickthrough rate for you.
To confirm the above I grabbed data for a couple SEO tool brands we rank well for. A number 3 ranking (behind a double listing) and virtually no traffic!
Different keyword, same result
Link building is still a bit of a discovery keyword, but I think it is perhaps a bit later staged than just the acronym "SEO." Here the click volume distribution is much flatter / less consolidated than it was on the above brand-oriented examples.
If when Google lowers your rank you still pull in a fairly high CTR that might be a signal to them that your site should rank a bit higher.
When the internal Google remote quality rater guidelines leaked online there was a core quote inside it that defined the essence of spam:
Final Notes on Spam
When trying to decide if a page is Spam, it is helpful to ask yourself this question: if I remove the scraped (copied) content, the ads, and the links to other pages, is there anything of value left? If the answer is no, the page is probably Spam.
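The rater heuristic in that quote can be sketched as a toy filter. The block types and page structure here are entirely hypothetical; raters judge rendered pages by hand rather than running code, so this is just the quote's logic restated:

```python
def likely_spam(page):
    """Toy version of the rater test: drop scraped copy, ads, and bare
    link blocks, then ask whether any original content remains."""
    remaining = [
        block for block in page["blocks"]
        if block["type"] not in {"scraped", "ad", "link_list"}
        and block.get("text", "").strip()
    ]
    return len(remaining) == 0

# A page made of nothing but scraped text, ads, and link lists:
hollow_page = {"blocks": [
    {"type": "scraped", "text": "copied paragraph"},
    {"type": "ad", "text": "sponsored unit"},
    {"type": "link_list", "text": "more pages"},
]}
print(likely_spam(hollow_page))  # True: nothing of value is left
```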
With the above quote in mind, please review the typical Mahalo page.
Adding a bit more context, the following 25 minute video from 2008 starts off with Matt Cutts talking about how he penalized a website for using deceptive marketing. Later into the video (~ 21 minutes in) the topic of search results within search results and then Mahalo come up.
Here is a transcription of relevant bits...
Matt Cutts: Would a user be annoyed if they land on this page, right. Because if users get annoyed, if users complain, then that is when we start to take action.
And so it is definitely the case where we have seen search results where a search engine didn't robots.txt something out, or somebody takes a cookie cutter affiliate feed, they just warm it up and slap it out, there is no value add, there is no original content there and they say search results or some comparison shopping sites don't put a lot of work into making it a useful site. They don't add value.
Though we mainly wanted to get on record and say that hey we are willing to take these out, because we try to document everything as much as we can, because if we came and said oh removed some stuff but it wasn't in our guidelines to do that then that would be sub-optimal.
So there are 2 parts to Google's guidelines. There are technical guidelines and quality guidelines. The quality guidelines are things where if you put hidden text we'll consider that spam and we can remove your page. The technical guidelines are more like just suggestions.
So we said don't have search results in search results. And if we find those then we may end up pruning those out.
We just want to make sure that searchers get good search results and that they don't just say oh well I clicked on this and I am supposed to find the answer, and now I have to click somewhere else and I am lost, and I didn't find what I wanted. Now I am angry and I am going to complain to Google.
Danny Sullivan: "Mahalo is nothing but search results. I mean that is explicitly what he says he is doing. I will let you qualify it, but if you ask him what it is still to this day he will say its a search engine. And then all the SEOs go 'well if it is a search engine, shouldn't you be blocking all your search results from Google' and his response is 'yeah well IF we ever see them do anything then we might do it'."
Matt Cutts: It's kinda interesting because I think Jason...he is a smart guy. He's a savvy guy, and he threaded the needle where whenever he talked to some people he called it a search service or search engine, and whenever he talked to other people he would say oh it is more of a content play.
And in my opinion, I talked to him, and so I said what software do you use to power your search engine? And he said we use TWiki or MediaWiki. You know, wiki software, not C++ not Perl not Python. And at that point it really does move more into a content play. And so it is closer to an About.com than to a Powerset or a Microsoft or Yahoo! Search.
And if you think about it he has even moved more recently to say 'you know, you need to have this much content on the page.' So I think various people have stated how skilled he is at baiting people, but I don't think anybody is going to make a strong claim that it is pure search or that even he seems to be moving away from ok we are nothing but a search engine and moving more toward we have got a lot of people who are paid editors to add a lot of value.
One quick thing to note about the above video was how the site mentioned off the start got penalized for lying for links, and yet Jason Calacanis apologized for getting a reporter fired after lying about having early access to the iPad. Further notice how Matt considered that the first person was lying and deserved to be penalized for it, whereas when he spoke of Jason he used the words savvy, smart, and the line threaded the needle. To the layperson, what is the difference between being a savvy person threading the needle and a habitual liar?
Further, let's look at some other surrounding facts from 2010, shall we?
How does Jason stating "Mahalo sold $250k+ in Amazon product in 2009 without trying" square with Matt Cutts saying "somebody takes a cookie cutter affiliate feed, they just warm it up and slap it out, there is no value add, there is no original content there" ... Does the phrase without trying sound like value add to you? Doesn't to me.
Matt stated "and if you think about it he has even moved more recently to say 'you know, you need to have this much content on the page,'" but in reality, that was a response to when I highlighted how Mahalo was scraping content. Jason dismissed the incident as an "experimental" page that they would nofollow. Years later, of course, it turned out he was (once again) lying and still doing the same thing, only with far greater scale. Jason once again made Matt Cutts look bad for trusting him.
Matt stated "I don't think anybody is going to make a strong claim that it is pure search" ... and no, it's not pure search. If anything it is IMPURE search, where they use 3rd party content *without permission* and put most of it below the fold, while the Google AdSense ads are displayed front and center.
If you want to opt out of Mahalo scraping your content you can't, because he scrapes it from 3rd party sites and provides NO WAY for you to opt out of him displaying scraped content from your site as content on his page.
Jason offers an "embed this" option for their content, so you can embed their "content" on your site. But if you use that code the content is in an iframe so it doesn't harm them on the duplicate content front AND the code gives Jason multiple direct clean backlinks. Whereas when Jason automatically embeds millions of scraped listings of your content he puts it right in the page as content on his page AND slaps nofollow on the link. If you use his content he gets credit...when he uses your content you get a lump of coal. NICE!
And, if you were giving Jason the benefit of the doubt and thought the above was accidental, check out how, when he scrapes the content, all external links have a nofollow added, but any internal link *does not*.
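That selective-nofollow pattern is easy to check for yourself. Here is a small sketch using Python's standard `html.parser`; the `mahalo.com` domain and the sample markup are made up for illustration:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class NofollowAudit(HTMLParser):
    """Bucket a page's links by internal/external and by nofollow, to
    spot sites that nofollow outbound links while keeping their own
    internal links clean."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.internal_follow = []
        self.internal_nofollow = []
        self.external_follow = []
        self.external_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        nofollow = "nofollow" in (attrs.get("rel") or "")
        # Relative links or links to the site's own domain count as internal.
        internal = urlparse(href).netloc in ("", self.own_domain)
        bucket = {
            (True, False): self.internal_follow,
            (True, True): self.internal_nofollow,
            (False, False): self.external_follow,
            (False, True): self.external_nofollow,
        }[(internal, nofollow)]
        bucket.append(href)

# Hypothetical markup mimicking the pattern described above:
page = '''<a href="/guide">clean internal link</a>
<a href="http://example.com/source" rel="nofollow">nofollowed source</a>'''
audit = NofollowAudit("mahalo.com")
audit.feed(page)
print(audit.external_nofollow)  # only the scraped source got the nofollow
```

If a site's scraped-source links all land in the external-nofollow bucket while its internal links stay clean, the asymmetry is deliberate.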
Matt stated "[Jason is] moving more toward we have got a lot of people who are paid editors to add a lot of value" ... and, in reality, Jason used the recession as an excuse to can the in house editorial team and outsource that to freelancers (which are paid FAR LESS than the amounts he hypes publicly). Given that many of the pages that have original content on them only have 2 sentences surrounded by large swaths of scraped content, I am not sure there is an attempt to "add a lot of value." Do you find this page on Shake and Bake meth to be a high quality editorial page?
What is EVEN MORE OUTRAGEOUS when they claim to have some editorial control over the content is that not only do they wrap outbound links to the sites they are scraping content from in nofollow, but they publish articles on topics like 13 YEAR OLD RAPE. Either they have no editorial oversight, or some of the editorial is done by pedophiles.
Here Jason is creating a new auto-generated page about me! And if I want to opt out of being scraped I CAN'T. What other source automatically scrapes content, republishes it wrapped in ads and calls it fair use, and then does not allow you to opt out? What is worse in the below example, is that on that page Jason stole the meta description from my site and used it as his page's meta description (without my permission, and without a way for me to opt out of it).
So basically Matt...until you do something, Jason is going to keep spamming the crap out of Google. Each day you ignore him another entrepreneur will follow suit trying to build another company that scrapes off the backs of original content creators. Should Google be paying people to *borrow* 3rd party content without permission (and with no option of opting out)?
I think Jason has pressed his luck and made Matt look naive and stupid. Matt Cutts has got to be pissed. But unfortunately for Matt, Mahalo is too powerful for him to do anything about it. In that spirit, David Naylor recently linked to this page on Twitter.
What is the moral of the story for Jason Calacanis & other SEOs?
If you have venture capital and have media access and lie to the media for years it is fine. If you are branded as an SEO and you are caught lying once then no soup for you.
If you are going to steal third party content and use it as content on your site and try to claim it is fair use make sure you provide a way of opting out (doing otherwise is at best classless, but likely illegal as well).
If you have venture capital and are good at public relations then Google's quality guidelines simply do not apply to you. Follow Jason's lead as long as Google permits mass autogenerated spam wrapped in AdSense to rank well in their search results.
The Google Webmaster Guidelines are an arbitrary device used to oppress the small and weak, but do not apply to large Google ad partners.
Don't waste any of your time reporting search spam or link buying. The above FLAGRANT massive violation of Google's guidelines was reported on SearchEngineLand, and yet the issue continues without remedy - showing what a waste of time it is to highlight such issues to Google.