Marin Software manages about 5% of Google AdWords spend for its clients, and it noticed that AdWords ad clicks are up 5% since Google Instant was unveiled. Since the launch Google's Jonathan Rosenberg has repeatedly said that the impact on AdWords was "not material."
I found the repeated use of those exact words suspicious and diversionary, and, as it turned out, with good reason! When Google Instant launched I highlighted what Google was doing to screen real estate & predicted this shift.
Turns out that the "tin foil hat wearing SEOs" were right once again.
And that 5% lift in AdWords clicks comes on top of lifts Google has already seen from:
creating a 4th ad slot for comparison ads (in high-paying verticals like "credit cards" and "mortgage")
sitelinks, merchant ratings, and other ad extensions

On the last quarterly call Jonathan Rosenberg stated: "These ads appear on more than 10% of the queries where we show ads and people like them. We see this because click-through rates are up for some formats as much as 10% and up more than 30% on some others."
Some sites have seen pretty drastic drops in Google search traffic recently, related to indexing issues. Google maintains that it is a glitch:
Just to be clear, the issues from this thread, which I have reviewed in detail, are not due to changes in our policies or changes in our algorithms; they are due to a technical issue on our side that will be visibly resolved as soon as possible (it may take up to a few days to be visible for all sites though). You do not need to change anything on your side and we will continue to crawl and index your content (perhaps not as quickly at the moment, but we hope that will be resolved for all sites soon). I would not recommend changing anything significantly at this moment (unless you spot obvious problems on your side), as these may result in other issues once this problem is resolved on our side.
For an example of one site's search traffic that was butchered by this glitch, see the images below. Note that before the bug, Google traffic is ~ 10x what Yahoo! or Bing drive, and after it the traffic is ~ even.
Not that long ago I saw another site with over 500 unique linking domains which simply disappeared from the index for a few days, then came right back 3 days later. Google's push to become faster and more comprehensive has perhaps made them less stable, as digging into social media highlights a lot of false signals & often promotes a copy over the original. Add in any sort of indexing issues and things get really ugly really fast.
Now this may just be a glitch, but as Tedster points out, many such "glitches" often precede or coincide with major index updates. For as long as I have been in the SEO field, Google has made a major algorithmic change just before the holidays every year except last year.
I think their reasons for doing it are likely fourfold:
they want to make SEO unpredictable & unreliable (which ultimately means less resources are spent on SEO & the results are overall less manipulated)
they want to force businesses (who just stocked up on inventory) to enter the AdWords game in a big way
by making changes to the core relevancy algorithms (and having the market discuss those) they can slide in more self promotion via their vertical search services without it drawing much anti-trust scrutiny
the holidays are when conversion rates are the highest, so if they want to make changes to seek additional yield it is the best time to do it, and the holidays give them an excuse to offer specials or beta tests of various sorts
As an SEO with clients, the unpredictability is a bad thing, because it makes it harder to manage expectations. Sharp drops in rankings from Google "glitches" erode customer trust in the SEO provider. Sometimes Google will admit to major issues happening, and other times they won't until well *after* the fact. Being proven right after the fact still doesn't undo the uncertainty unleashed into the marketplace in the intervening weeks.
Even if half your clients double their business while 1/3 lose half their search traffic, as an SEO business you typically don't get to capture much of the additional upside...whereas you certainly capture the complaints from those who just fell behind. Ultimately this is one of the reasons why I think being a diversified web publisher is better than being an SEO consultant... if something takes off & something else drops then you can just pour additional resources into whatever is taking off and capture the lift from those changes.
If you haven't been tracking rankings now would be a great time to get on it. It is worth tracking a variety of keywords (at various levels of competition) daily while there is major flux going on, because that gives you another lens through which to view the relevancy algorithms, and where they might be headed.
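If you want a bare-bones way to start that tracking, here is a minimal sketch in Python. The `get_rank` callable and the `ranks.csv` filename are hypothetical placeholders: you would wire `get_rank` to whatever rank-checking source you actually use.

```python
import csv
from datetime import date

def log_ranks(keywords, get_rank, path="ranks.csv"):
    """Append today's rank for each keyword to a CSV log.

    get_rank is a placeholder callable (keyword -> position) that you
    would connect to your own rank-checking source.
    """
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for kw in keywords:
            writer.writerow([date.today().isoformat(), kw, get_rank(kw)])

# Usage with a stubbed rank source (always returns position 7):
log_ranks(["seo tools", "link building"], lambda kw: 7)
```

Run it once a day (from cron, say) and you accumulate exactly the day-by-day lens on algorithm flux described above.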
Google Instant launched. It is a new always-on search experience where Google tries to complete your keyword search by predicting what keyword you are searching for. As you type more letters the search results change.
Short intro here:
Long view here:
Not seeing it yet? You can probably turn it on here (though in some countries you may also need to be logged into a Google account). In time Google intends to make this a default feature, turned on for almost everyone (other than those with slow ISPs and older web browsers). And if you don't like it, the feature is easy to turn off at the right of the search box, but the off setting relies on a cookie: if you clear your cookies the feature turns right back on.
Here is an image using Google's browser size tool, showing that when Google includes 4 AdWords ads only 50% of web browsers get to see the full 2nd organic listing, while only 20% get to see the full 4th organic listing.
Its implications for SEO are easy to understate. However, they can also be overstated: I already saw one public relations hack claiming that it "makes SEO irrelevant."
Nothing could be further from the truth. If anything, Google Instant only increases the value of a well-thought-out SEO strategy. Why? Well...
it consolidates search volume into a smaller basket of keywords
it further promotes the localization of results
it makes it easier to change between queries; it's now easier to type one more letter than to scroll down the page
it further pollutes AdWords impression testing as a great source of data
"More and more searches are done on your behalf without you needing to type. I actually think most people don't want Google to answer their questions," he elaborates. "They want Google to tell them what they should be doing next. ... serendipity—can be calculated now. We can actually produce it electronically."
What are some of the most bland and well-worn paths in the world? Established brands:
The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.
"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."
"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."
If Google is so smart then why the lazy reliance on brand? Why not show me something unique & original & world-changing?
While Google is collecting your data and selling it off to marketers, they have also thought of other ways to monetize that data and deliver serendipity:
"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."
Any guess how that product might have added value to the world? On down days (or days when you search for "debt help") would Google deliver more negatively biased ads & play off fears more, while on up days selling more euphoric ads? Might that serendipity put you on the wrong side of almost every trade you make? After all, that is how the big names in that space make money - telling you to take the losing side of a trade with bogus "research."
“All this information that you have about us: where does it go? Who has access to that?” (Google servers and Google employees, under careful rules, Schmidt said.) “Does that scare everyone in this room?” The questioner asked, to applause. “Would you prefer someone else?” Schmidt shot back – to laughter and even greater applause. “Is there a government that you would prefer to be in charge of this?”
That exchange helped John Gruber give Eric Schmidt the label Creep Executive Officer, while asking: "Maybe the question isn’t who should hold this information, but rather should anyone hold this information."
If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt
Here is the thing I don't get about Google's rhetorical position on serendipity & moral authority: if they are to be trusted to recommend what you do, then why do they recommend illegal activities like pirating copyright works via warez, keygens, cracks & torrents?
Years ago Google introduced rel=nofollow, claiming it as a cure-all for comment spam. Once in place, it was quickly promoted as a tool to use on any paid link. Google scared webmasters about selling links so much that many webmasters simply became afraid to link out to anyone for fear of falling out of favor with Google.
Our affiliate program on this site stopped passing link juice after a fellow SEO blogger outed it quite publicly. Other affiliate programs continue to pass PageRank. Highlighting Google's double standards invites more scrutiny and more selective arbitrary enforcement. Whereas promoting Google products earns free links. ;)
No Disclosure Required: WOOT!
Reading the news today I found out that VigLink bought out Driving Revenue. Both are networks that help publishers monetize their outbound links. The pitch for VigLink is one of no-effort money:
"Quite simply, if you're a Web publisher who hasn't recognized the value of your outbound traffic, you are leaving money on the table," said Raymond Lyle, CEO and Co-Founder of Driving Revenue. "Dozens of our publishers make six figure incomes for a one-time investment of one minute of work. Who isn't interested in that?"
The page loads fast. And your site looks exactly the same. Even your links look and behave the same way. The only difference is that now when your visitors buy products or services you'll earn a commission. ... Once you have set up VigLink you can sign in to view reports about your site. You can see how much money you are making every day and compare that with last week. You can see which merchants are the most profitable, and make decisions on who to link to in the future.
So basically VigLink is suggesting you control who you link to based on whatever makes you the most money, without providing any disclosure of the financial relationship.
AKA: paid links.
Here is where it really gets screwed up: Google is an investor in VigLink.
Selectively allowing some links to pass link juice while arbitrarily blocking others indeed controls the shape of the web graph. It gives anyone who works with Google a strong competitive advantage in the organic search results over those who are not using Google endorsed technology.
As Google reached the limits of returns in direct marketing they started pushing the value of branding (because, hey, if you can chalk it up to latent branding value there is no cap on your max bid). Surprisingly, they even got many big brands to buy their own brands AND buy sitelinks on the AdWords ads. Some went so far as providing case studies for how much of their own brand traffic they were now willing to pay for, which they previously got free. :D
Sure that can make sense for seasonal promotions, but you could do the same thing with subdomains and sister websites. Dell.com can be the main site, Dell.net (or deals.dell.com) can be the deals & promotions website, and Dell.org can be the good-karma charity site. No paying someone else for a brand you already spent to build. Beautiful. But I digress...
Companies with a high page rank are in a strong position to move into new markets. By “pointing” to this new information from their existing sites they can pass on some of their existing search engine aura, guaranteeing them more prominence.
Google’s Mr Singhal calls this the problem of “brand recognition”: where companies whose standing is based on their success in one area use this to “venture out into another class of information which they may not be as rich at”. Google uses human raters to assess the quality of individual sites in order to counter this effect, he adds.
Those are all irrelevant details, just beyond Google's omniscient view. :D
The other thing which is absurd, is that if you listen to Google's SEO tips, they will tell you to dominate a small niche then expand. Quoting Matt Cutts: "In general, I’ve found that starting with a small niche and building your way up is great practice."
And now brand extension is somehow a big deal worth another layer of arbitrary manual inspection and intervention?
If sites which expand in scope deserve more scrutiny then why is there so much scrape & mash flotsam in the search results? What makes remixed chunks of content better than the original source? A premium AdSense feed? Brand?
This image might need updating in the years to come, but it does a great job laying out how Google works when you type a query into their search engine. Search is so easy to do that it is hard to appreciate how complex it is unless you take a look under the hood. Which is exactly what this graphic does :D
Click the image to get the full sized beefy image :D
A side benefit of this graphic is that it should help prospective clients realize how complex SEO & PPC campaigns can be. So if anyone is trying to be an el cheapo with their budget you can use this to remind them how complex search is, and thus how time consuming and expensive a proper search marketing campaign is.
In the past when I claimed that the Google Maps insertion in organic search results wasn't more organic search but rather a Google promotion, I was met with skepticism by some, who argued that Google Maps was just another flavor of organic search and visitors would still be able to go to the end ranked website.
If you search for something on Google and click on one of the listed URLs you can still visit the site, but today Google took a step in the opposite direction. If you click on the map, the Google Maps section now lists a bunch of places on the map rather than giving you the URLs. You then have to click one of the locations to see it on the map and open a pop-up area which contains information including the URL. More clicks to do the same thing.
How long until Google replaces the URL listings in the search results with links to locations on the Google Maps or links to Google Places pages? It is the next obvious step in this transition.
Originally Google wanted to send you to the destination as quickly as possible, hoping that in doing so they would encourage you to come back again. This year Google's strategy has changed to one that wants you to stay for a while. There is no better example of that shift than Youtube Leanback:
Jamie Davidson, a YouTube product manager, says that the 15 minutes of daily viewing by a user typically involves six videos, with the conclusion of each presenting "a decision point, and every decision point is an opportunity to leave. We’re looking at how to push users into passive-consumption mode, a lean-back experience."
Generally I have not been a huge fan of registering all your websites with Google (profiling risks, etc.), but they keep using the carrot nicely to lead me astray. :D ... So much so that I want to find a Googler and give them a hug.
Google recently decided to share some more data in their webmaster tools. And for many webmasters the data is enough to make it worth registering (at least 1 website)!
AOL Click Data
When speaking of keyword search volume breakdown data, people have typically shared information from the leaked AOL search data.
The big problem with that data is that it is in aggregate. It is a nice free tool, and a good starting point, but it is fuzzy.
In general, for navigational searches people click the top result more often than they would on an informational search.
In general, for informational searches people tend to click throughout the full set of search results at a more even distribution than they would for navigational or transactional searches.
The only solid, recently shared public data on those breakdowns is from Dogpile [PDF], a meta search engine. But given how polluted meta search services tend to be (with ads mixed into their search results), those numbers were quite a bit off from what one might expect. And once more, they are aggregate numbers.
Pretty solid looking estimates can get pretty rough pretty fast. ;)
The Value of Data
If there is one critical piece of marketing worth learning above all others it is that context is important.
My suggestions as to what works, another person's opinions or advice on what you should do, and empirical truth collected by a marketer who likes to use numbers to prove his point ... well all 3 data sets fall flat on their face when compared against the data and insights and interactions that come from running your own business. As teachers and marketers we try to share tips to guide people toward success, but your data is one of the most valuable things you own.
A Hack to Collect Search Volume Data & Estimated CTR Data
In their Excel plug-in Microsoft shares the same search data they use internally, but it's not certain that Microsoft will keep sharing as much data as they do now once they integrate the Yahoo! Search deal.
Google offers numerous keyword research tools, but getting them to agree with each other can be quite a challenge.
There have been some hacks to collect organic search clickthrough rate data on Google. One of the more popular strategies was to run an AdWords ad for the exact match version of a keyword and bid low onto the first page of results. Keep the ad running for a while and then run an AdWords impression share report. With that data in hand you can estimate how many actual searches there were, and then compare your organic search clicks against that to get an effective clickthrough rate.
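The arithmetic behind that hack is straightforward: divide your ad impressions by the impression share to estimate total query volume, then divide organic clicks by that estimate. A quick sketch, with all figures hypothetical:

```python
def estimate_organic_ctr(ad_impressions, impression_share, organic_clicks):
    """Estimate total query volume from AdWords impression share,
    then derive an effective organic clickthrough rate."""
    if not 0 < impression_share <= 1:
        raise ValueError("impression_share must be a fraction in (0, 1]")
    estimated_searches = ad_impressions / impression_share
    return organic_clicks / estimated_searches

# Hypothetical numbers: the exact-match ad showed 4,200 times at a 35%
# impression share (~12,000 estimated searches), and analytics logged
# 1,500 organic clicks -- an effective organic CTR of roughly 12.5%.
print(f"{estimate_organic_ctr(4200, 0.35, 1500):.1%}")
```

The estimate inherits all the noise in the impression share report, so treat the output as a ballpark figure rather than a measurement.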
The New Solution
Given search personalization and localization, and the ever-changing result sets from all the tests Google runs, even the above can be rough. So what is a webmaster to do?
Well, Google upgraded the data they share inside their webmaster tools, which now includes (on a per-keyword level):
keyword clickthrough rate
clickthrough rate at various ranking positions
the URL that was clicked on
Trophy Keywords vs Brand Keywords
Even if your site is rather well known, going after some of the big keywords can be a bit self-defeating in terms of the value delivered. Imagine ranking #6 or #7 for "SEO." Wouldn't that send a lot of search traffic? Nope.
When you back out the ego searches, the rank checkers, etc., it turns out that there isn't a ton of search volume to be had ranking on page 1 of Google for "SEO."
With only a 2% CTR, the core keyword "SEO" drives less than half the traffic of our 2 most common brand search keywords. Our brand might not seem like it gets lots of traffic with only a few thousand searches a month, but when you have a > 70% CTR that can still add up to a lot of traffic. More importantly, that is the kind of traffic which is more likely to buy from you than someone searching on a broad discovery or curiosity keyword.
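The point generalizes: expected traffic is just search volume times CTR, so a small brand term with a high CTR can beat a big discovery term with a low one. A sketch with hypothetical volumes in the same ballpark as the figures above:

```python
def monthly_traffic(searches, ctr):
    """Expected monthly visits: search volume times clickthrough rate."""
    return round(searches * ctr)

core = monthly_traffic(30000, 0.02)   # big discovery keyword, 2% CTR
brand = monthly_traffic(3000, 0.70)   # small brand keyword, 70% CTR
print(core, brand)  # the small brand term drives several times more visits
```

Ten times the search volume, and the discovery keyword still loses by more than 3 to 1.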
The lessons for SEOs in that data?
Core keywords & raw mechanical SEO are both frequently heavily overrated in terms of value.
Rather than sweating over trying to rank well for the hardest keywords, first focus on more niche keywords that are easy to rank for.
Search is becoming the default navigational tool for the web. People go to Google and then type in "yahoo." If you don't have a branded keyword as one of your top keywords that might indicate long-term risk to your business. If a competitor can clone most of what you are doing and then bake in a viral component you are toast.
Going After the Wrong Brand Keywords
Arbitraging 3rd party brands is an easy way to build up distribution quickly. This is why there are 4,982 Britney Spears fan blogs (well 2 people are actually fans, but the other 4,980 are marketers).
But if you want to pull in traffic you have to go after a keyword that is an extension of the brand. Ranking for "eBay" probably won't send you much traffic (as their clickthrough rate on their first result is probably even higher than the 70% I had above). Though if you have tips on how to buy or sell on eBay those kinds of keywords might pull in a much higher clickthrough rate for you.
To confirm the above I grabbed data for a couple of SEO tool brands we rank well for. A number 3 ranking (behind a double listing) and virtually no traffic!
Different keyword, same result
Link building is still a bit of a discovery keyword, but I think it is perhaps a bit later-stage than just the acronym "SEO." Here the click volume distribution is much flatter / less consolidated than it was in the brand-oriented examples above.
If, when Google lowers your rank, you still pull in a fairly high CTR, that might be a signal to them that your site should rank a bit higher.