If Google is smarter than humans, we must accept that it should be able to help us answer the difficult questions about life that are vital to helping humans reach their full potential, so that we may help computers become smarter, so that we may reach the singularity.
"The development of search as a technology has become commoditized. To continue to invest our own resources to do web search doesn't make sense because that development is expensive and doesn't give you a differentiated product," Ask President Doug Leeds said by telephone.
My contention is that there is no value in spending the engineering resources to fight auto-generated spam if Google is paying you to create it. At some point one stack of money becomes much larger than the other.
Then again, speaking of differentiation, I wonder if Doug Leeds would care to comment on whether answers content "has been commoditized" at all by their skirting the intent of fair use laws (much like YouTube did to video content). Are they offering a "differentiated" service by turning tons of their sites into giant answer farms?
Ultimately this is much like Mahalo, but on a grand scale. At least they are not pointing expired redirects into their site (like eHow did) but if this trend continues look for thin answer sites wrapped in AdSense to become the equivalent of the auto-generated affiliate feed powered website of years gone by. The model is infinitely more scalable than content mills since the companies doing it don't actually have content costs: throw a keyword list in the hopper, send your scrapers out to "add value" & watch the money come in. Wherever something is working simply throw more related keywords in the hopper.
The lack of cost to the model means you can build thousands of pages around misspellings and yet still have it be profitable...the cost of creating each page is under a cent.
Who funds the creation of all this garbage? Google, via their AdSense program. It's a bit of Southern Hospitality from Google, if you will.
Own a forum website or answers website & are sick of seeing Ask outrank you by leveraging their domain authority + "fair use" of your content? Here is how to block their bot in robots.txt:
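Ask's crawler has historically identified itself as "Teoma" (Ask acquired the Teoma search engine years ago), though you should verify the current user-agent string in your own log files before relying on it. A minimal robots.txt:

```
User-agent: Teoma
Disallow: /
```

Place that file at the root of your domain. Keep in mind this blocks Ask's crawler entirely, so you'll also drop out of their regular web search results, not just their answer pages.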
With great power comes great responsibility, however working on the Google spam team must feel a bit like the movie Brazil when watching this stuff unfold.
Remember how all kinds of affiliates were given the boot by Google for not "adding value"? How are lander pages like this one adding any value? 10 of 10 above the fold links are monetized. And it looks like their sites are using content spinning too!
The promise of the web was that it could directly connect supply and demand to make markets more efficient, and yet leading search engines are paying to create a layer of (g)arbitrage that lowers the utility and value of the network for everyone else, while pushing even more publishers into bankruptcy as the leeches grow in size & number.
My guess is that unless this short term opportunism changes, some of the star search engineers will leave in disgust within 12 to 18 months. Mark 2012 on your calendars, it will be a good year for clean search plays like Blekko and DuckDuckGo. ;)
"Matt recommends SEOs do not “chase the algorithm” and instead try to predict where Google will be going in the future". Matt was addressing PubCon.
Good advice, methinks.
Trying to predict where Google is going is something we do a lot of at SEOBook.com. Whilst no one has a crystal ball, it's good practice to keep one eye on the search horizon.
So, where do we think Google might be heading?
Google Will Continue To Dominate Search
Easy one, huh.
Their biggest competitors appear clueless when it comes to search. Bing may make some inroads. Maybe. It's hard to imagine anyone eating Google's lunch when it comes to search, for many years to come.
Is Facebook a threat? I doubt it. Search is difficult, and I can see no reason why Facebook - which has a media focus - could own the search channel any more than Yahoo could.
Search is, after all, an infrastructure problem. Google's infrastructure would be very difficult to replicate.
Google Won't Be Doing All That Much About Blackhat Sites
A search result set only really contains spam if the Google users think it contains spam i.e. they don't see the answer they were expecting.
The fact a website may fall outside Google's guidelines might get competing webmasters' knickers in a knot, but it probably doesn't matter that much to Google, or anyone else.
Even though Matt Cutts says Google will devote more resources to this, I suspect Google's efforts will largely remain focused on outright deception i.e. misrepresentation, hijacking and malware.
The Web Reflects Power Structures
We can forget the San Fran techno-hippy ethos of the web. It will not be a free-for-all democracy, if it ever was. History shows us that power tries to centralize control in order to maintain it.
Google may try to keep users on Google for longer. They do this by owning more and more verticals, and extracting data and reformatting it. When they send visitors away from Google, they'll try to do so more and more on their own terms. Watch very carefully what type of sites Google rewards, as opposed to what they may say they reward.
Expect less competition in the market as a result. Some people are already getting angry about it.
Be Where Your Users Are
Google follows users. So does Facebook. Anywhere your users are, you've got to be there, too. On Google Maps. On YouTube. Wherever and whenever. Think beyond your website. Think in terms of getting your data out there.
Social media can drive tons of attention, awareness and traffic. But the search box is the best way to navigate to stuff you want. Now what will drive those results - if I type in "pizza", what should I get? The answer can be very different depending on whether the results are coming from the web, Yelp, or Facebook. So I guess my answer is that I still see search being the core way to navigate, but I think what gets searched is going to get a lot more structured and move away from simple keyword matches against unstructured web pages
A Shift To Localization
Microsoft Research found that people tend to organize their memories in geographic terms i.e. where they were when something happened.
If you want to know where Google is heading, then watch Marissa Mayer. Marissa has been responsible for much of what you see in Google in terms of how it is organized. Marissa has just moved to head of Geographic and Location Services.
Google Earth. Google Maps. Google Local. Google Street View. Mobile location data and targeting. Expect more data to be organized around locality.
SEO hasn't changed all that much in years. We still find an audience (keyword research), we publish content, we build links to the content, and then we repeat it all over again.
The changes come around the edges, especially for big companies like Google. There is a lot of risk to Google in making radical changes. Shareholders don't like it. Why risk breaking something that makes so much money, and is so popular?
The biggest changes in the way we do things on the web are probably going to come from the upstarts. They're probably hard at work in their garage right now.
[When] we roll[ed] out Google Finance, we did put the Google link first. It seems only fair right, we do all the work for the search page and all these other things, so we do put it first... That has actually been our policy, since then, because of Finance. So for Google Maps again, it's the first link. - Marissa Mayer
If they gain certain privileges in the marketplace by claiming to not abuse their power and that their algorithmic results are neutral, but those algorithmic results may be pushed below the fold, then is it "only fair" for them to put themselves in a default market leading position in any category they feel they can make money from by selling ads in? Or is that an abuse of power?
To give a visual of how dire this situation was for some webmasters, consider the following graphic.
The blue line is Google search traffic and the gray is total unique visitors. And since search visitors tend to monetize better than most other website visitors, the actual impact on revenues was greater than the impact on visitors. And, if you figure that sites have fixed costs (hosting, maintenance, new content creation, data licensing, marketing, etc.) then the impact on profits is even more extreme than the impact on revenues.
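To make that leverage concrete, here's a toy calculation (all numbers are hypothetical, just to illustrate the operating-leverage effect described above):

```python
# Hypothetical numbers illustrating how fixed costs amplify a traffic drop.
search_visitors = 100_000   # monthly Google search visitors
other_visitors = 50_000     # all other traffic sources
rpm_search = 40.0           # revenue per 1,000 search visitors (monetizes better)
rpm_other = 15.0            # revenue per 1,000 other visitors
fixed_costs = 3_500.0       # hosting, maintenance, content, licensing, etc.

def revenue_and_profit(search, other):
    revenue = search / 1000 * rpm_search + other / 1000 * rpm_other
    return revenue, revenue - fixed_costs

before_rev, before_profit = revenue_and_profit(search_visitors, other_visitors)
# Suppose an algorithm change wipes out 60% of search traffic.
after_rev, after_profit = revenue_and_profit(search_visitors * 0.4, other_visitors)

visitor_drop = 1 - (search_visitors * 0.4 + other_visitors) / (search_visitors + other_visitors)
revenue_drop = 1 - after_rev / before_rev
profit_drop = 1 - after_profit / before_profit

print(f"visitors down {visitor_drop:.0%}, revenue down {revenue_drop:.0%}, "
      f"profit down {profit_drop:.0%}")
# → visitors down 40%, revenue down 51%, profit down 192%
```

A 40% drop in visitors becomes a 51% drop in revenue (because search visitors monetize better) and a 192% drop in profit (because the fixed costs don't go anywhere), pushing the site into the red.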
Hence in the search game you can go from hero to zero fast!
Search is one of the highest leverage business functions around today.
When stuff heads south like that, what do you do? Do you consider it game over and try to lower costs further?
My approach to such events is to take it as a warning shot. To take it as a challenge. In the above example the traffic came back...
...but algorithms sometimes get rolled out in phases. Sometimes stuff that gets tripped up and later restored is being set up for a second fall when they refine their relevancy algorithms again. Sites that get caught in snags are sites which are fairly weak. So if you take any setback as motivation to create something better and work hard, then at least you know that if you failed, you tried and it just didn't work. Most likely, if you try hard, you will be able to make the site much better and not only reach your old traffic levels, but exceed them.
Even though the traffic came back for the above site, it has been getting a lot more effort. And it will continue to for months and months. The fear of loss is a great motivator to push people to create something better. Sometimes I think Google should mix up the results a bit more often just to drive people to create better stuff. :)
When you get to *really* competitive keywords the results typically tend to be fairly stable because the cost of entry into the game is so high & many of the top players keep building additional signals of quality. You might get minor fluctuations from time to time, but large fluctuations on highly competitive keywords are fairly rare.
Over the last day or 2 Google has done yet another algorithm change (the 3rd or 4th noticeable one in 2 weeks), where on some searches they are ranking an internal page over the homepage. It is almost as if the best mental model for the algorithm that is doing this is...
find the top SITES that deserve to rank well & rank them based on that criteria
however, rather than ranking THOSE PAGES, instead do internal site searches & back in other relevancy factors to look for other popular & relevant pages on those sites
test to see how well searchers respond to them
Here is a pretty overt example, where Google changed 2 of the listings for "SEO" to internal pages.
I have seen other examples, including some where Google highlighted a new information-free blog post with only a couple of automated backlinks pointing at it. I don't want to "out" that site, but this is the type of image Google was showing beneath that entry
I have no problems keeping up with the increasing complexity of search, but Google is setting up some serious barriers to entry for new players. It is hard to explain in a straightforward manner that page A might be ranking due to relevancy signals pointing into page B, but these are the SERPs through which we make a living. And it is only going to keep growing more complex. ;)
Depending on how far Google pushes with this, it can have major implications in terms of rank tracking, SEO strategy, site architecture & conversion optimization. More on that stuff in the community forums.
When Bing launched, one of the interesting things they did to make the organic search results appear more relevant was to use link anchor text to augment page titles (where relevant). This would mean if people searched for a phrase that was mostly in your title (but maybe your page title was missing a word or 2 from the search) then Bing might insert those words into the page title area of your listing if they were in some of the link anchor text pointing into your page.
Before being switched over to Bing, Yahoo! would sometimes display the H1 heading as the clickable link to your site (rather than the page title). Bing also uses on page headings to augment page titles.
Historically, if Google thought it would appear more relevant to searchers, they have sometimes shown a machine-generated excerpt of your page displaying the keywords in context rather than the meta description in the snippet, but they have typically been far more conservative with page titles. Sometimes Google would list the ODP title as the title of a page, but until recently they have generally just listed the page title as the clickable link to your site.
Recently Google has grown more experimental on this front, being willing to use link anchor text and on-page headings as part of the listing. In addition, if the page title is short, Google may add the site's name at the end of the title.
Here is an example in Google of the page title being replaced by part of an on-page heading & also showing the site's name being added to the end of the link
And here is the associated on-page heading for the above
I have also seen a few examples of the link anchor text being added to the page title in Google, however it was on a client project & the client would prefer that I didn't share his new site on an SEO blog with 10's of thousands of readers. :D
Last November Matt Cutts did a video on the topic of Google editing page titles for relevancy & how it was a fairly new thing for Google. Even back then Google was quite conservative in editing the clickable link ... I think they have only grown more aggressive on that front in the past month or so.
Google's US ad revenue is roughly 15 billion & the size of the US Yellow Pages market is roughly 14 billion. Most of that money is still in print, but that shift is only accelerating with Google's push into local.
Further, cell phones are location aware, can incorporate location into search suggest, and on the last quarterly conference call Google's Jonathan Rosenberg highlighted that mobile ads were already a billion-dollar market for Google.
Google has been working on localization for years, and as a top priority. When asked "Anything you’ve focused on more recently than freshness?" Amit Singhal stated:
Localization. We were not local enough in multiple countries, especially in countries where there are multiple languages or in countries whose language is the same as the majority country.
So in Austria, where they speak German, they were getting many more German results because the German Web is bigger, the German linkage is bigger. Or in the U.K., they were getting American results, or in India or New Zealand. So we built a team around it and we have made great strides in localization. And we have had a lot of success internationally.
Some of the localized results not only appear for things like Chicago pizza but also for single word searches in some cases, like pizza or flowers.
Promoting local businesses via the new formats has many strategic business benefits for Google
assuming they track user interactions, then eventually the relevancy is better for the end users
allows local businesses to begin to see more value from search, so they are more likely to invest into a search strategy
creates a direct relationship with business owners which can later be leveraged (in the past Google has marketed AdWords coupons to Google Analytics users)
if a nationwide brand can't dominate everywhere just because they are the brand, it means that they will have to pony up on the AdWords front if they want to keep 100% exposure
if Google manages to put more diversity into the local results then they can put more weight on domain authority on the global results (for instance, they have: looked at query chains, recommended brands in the search results, shown many results from the lead brand on a branded search query, listed the official site for searches for a brand + a location where that brand has no office, etc.)
it puts eye candy in the right rail that can make searchers more inclined to look over there
it puts in place an infrastructure which can be used in other markets outside of local
Data Data Data
Off the start it is hard to know what to make of this unless one draws historical parallels. At first one might be inclined to say the yellow page directories are screwed, but the transition could be a bit more subtle. The important thing to remember is that now that the results are in place, Google can test and collect data.
There are 2 strong ways to build a competitive advantage on the data front:
make your data better
starve competing business models to make them worse
Off the start yellow page sites might get a fair shake, but ultimately the direction they are headed in is being increasingly squeezed. In a mobile connected world with Google owning 97% search marketshare, while offering localized search auto-complete, ads that map to physical locations, and creating a mobile coupon offers network, the yellow page companies are a man without a country. Or perhaps a country without a plot of land. ;)
Last December I cringed when I read David Swanson, the CEO of R.H. Donnelley, state: "People relate to us as a product company -- the yellow-pages -- but we don't get paid by people who use the yellow-pages, we get paid by small businesses for helping them create ad messages, build websites, and show up in search engine results. ... Most of the time today, you are not even realizing that you are interacting with us."
How does a business maximize yield? Externalize costs & internalize profits. Pretty straightforward. To do this effectively, Google wants to cut as many middlemen out of the game as possible. This means Google might decide to feed off your data while driving no traffic to your business, but rather driving you into bankruptcy.
Ultimately, what is being commoditized? Labor. More specifically:
the affiliate who took the risk to connect keywords and products
the labor that went into collecting & verifying local data
the labor that went into creating the editorial content on the web graph and the links which search engines rely on as their backbone.
the labor that went into manually creating local AdWords accounts, tracking their results, & optimizing them (which Google tracks & uses as the basis for their automated campaigns)
the labor that went into structuring content with the likes of micro-formats
the labor that went into policing and formatting user reviews
many other pieces of labor that the above labor ties into
Of course Google squirms out of any complaints by highlighting the seedy ends of the market and/or by highlighting how they only use such data "in aggregate" ... but if you are the one losing your job & having your labor used against you, "the aggregate" still blows as an excuse.
But if Google drives a business they are relying on into bankruptcy, won't that make their own search results worse?
Not necessarily, for 2 big reasons:
you are only judged on your *relative* performance against existing competitors
after Google drives some other players out of the marketplace and/or makes their data sets less complete, the end result is Google having the direct relationships with the advertisers and the most complete data set
The reason many Google changes come with limited monetization off the start is so that people won't question their motives.
Basically I think they look at it this way: "We don't care if we kill off a signal of relevancy because we will always be able to create more. If we poison the well for everyone else while giving ourselves a unique competitive advantage it is a double win. It is just like the murky gray area book deal which makes start up innovation prohibitively expensive while locking in a lasting competitive advantage for Google."
You would never hear Google state that sort of stuff publicly, but when you look at their private internal slides you see those sorts of thoughts are part of their strategy.
What is Spam?
The real Google guidelines should read something like this:
After typing a query, the search engine user sees a result page. You can think of the results on the result page as a list. Sometimes, the best results for "queries that ask for a list" are the best individual examples from that list. The page of search results itself is a nice list for the user.
...But This is Only Local...
After reading the above some SEOs might have a sigh of relief thinking "well at least this is only local."
To me that mindset is folly though.
Think back to the unveiling of Universal search. At first it was a limited beta test with some news sites, then Google bought YouTube, and then the search landscape changed...everyone wanted videos and all the other stuff all the time. :D
Anyone who thinks this rich content SERP which promotes Google is only about local is going to be sorely disappointed as it moves to:
travel search (Google doesn't need to sell airline tickets so long as they can show you who is cheapest & then book you on a high margin hotel)
any form of paid media (ebooks, music, magazines, newspapers, videos, anything taking micro-payments)
large lead generation markets (like insurance, mortgage, credit cards, .edu)
perhaps eventually even markets like live ticketing for events
Google does query classification and can shape search traffic in ways that most people do not understand. If enough publishers provide the same sorts of data and use the same types of tags, they are creating new sets of navigation for Google to offer end users.
No need to navigate through a publisher's website until *after* you have passed the click toll booth.
Try #3 at Reviews
Google SearchWiki failed in large part because it confused users. Google launched SideWiki about a year ago, but my guess is it isn't faring much better. When SideWiki launched Danny Sullivan wrote:
Sidewiki feels like another swing at something Google seems to desperately desire — a community of experts offering high quality comments. Google says that’s something that its cofounders Larry Page and Sergey Brin wanted more than a system for ranking web pages. They really wanted a system to annotate pages across the web.
The only way they are going to get that critical mass is by putting that stuff right in the search results. It starts with local (& scrape + mash in other areas like ecommerce), but you know what they want & they are nothing if not determined to get what they want! ;)
Ultimately it is webmasters, web designers & web developers who market and promote search engines. If at some point it becomes consensus that Google is capturing more value than they create, or that perhaps Google search results have too much miscellaneous junk in them, they could push a lot more searchers over to search services which are more minimalistic + publisher friendly. Blekko launches Monday, and their approach to search is much like Google's early approach was. :)
Marin Software manages about 5% of Google AdWords spend for clients, and they noticed that since Google Instant was unveiled, AdWords ad clicks are up 5%. Since the launch Google's Jonathan Rosenberg has mentioned that the impact on AdWords was "not material."
I found the repeated use of those exact words suspicious and diversionary, and, as it turned out, with good reason! When Google Instant launched I highlighted what Google was doing to screen real estate & predicted this shift.
Turns out that the "tin foil hat wearing SEOs" were right once again.
And that 5% lift in AdWords clicks is on top of the lift Google has seen from
creating a 4th ad slot for comparison ads (in high paying verticals like "credit cards" and "mortgage")
sitelinks, merchant ratings, and other ad extensions. On the last quarterly call Jonathan Rosenberg stated: "These ads appear on more than 10% of the queries where we show ads and people like them. We see this because click-through rates are up for some formats as much as 10% and up more than 30% on some others."
Some sites have seen pretty drastic drops in Google search traffic recently, related to indexing issues. Google maintains that it is a glitch:
Just to be clear, the issues from this thread, which I have reviewed in detail, are not due to changes in our policies or changes in our algorithms; they are due to a technical issue on our side that will be visibly resolved as soon as possible (it may take up to a few days to be visible for all sites though). You do not need to change anything on your side and we will continue to crawl and index your content (perhaps not as quickly at the moment, but we hope that will be resolved for all sites soon). I would not recommend changing anything significantly at this moment (unless you spot obvious problems on your side), as these may result in other issues once this problem is resolved on our side.
For an example of one site's search traffic that was butchered by this glitch, see the images below. Note that before, Google traffic was ~10x what Yahoo! or Bing drove, and after the bug the traffic is roughly even.
Not that long ago I saw another site with over 500 unique linking domains which simply disappeared from the index for a few days, then came right back 3 days later. Google's push to become faster and more comprehensive has perhaps made them less stable, as digging into social media highlights a lot of false signals & often promotes a copy over the original. Add in any sort of indexing issues and things get really ugly really fast.
Now this may just be a glitch, but as Tedster points out, many such "glitches" often precede or coincide with major index updates. Ever since I have been in the SEO field I think Google has done a major algorithmic change just before the holidays every year except last year.
I think their reasons for doing it are likely fourfold:
they want to make SEO unpredictable & unreliable (which ultimately means less resources are spent on SEO & the results are overall less manipulated)
they want to force businesses (who just stocked up on inventory) to enter the AdWords game in a big way
by making changes to the core relevancy algorithms (and having the market discuss those) they can slide in more self promotion via their vertical search services without it drawing much anti-trust scrutiny
the holidays are when conversion rates are the highest, so if they want to make changes to seek additional yield it is the best time to do it, and the holidays give them an excuse to offer specials or beta tests of various sorts
As an SEO with clients, the unpredictability is a bad thing, because it makes it harder to manage expectations. Sharp drops in rankings from Google "glitches" erode customer trust in the SEO provider. Sometimes Google will admit to major issues happening, and other times they won't until well *after* the fact. Being proven right after the fact still doesn't take back 100% of the uncertainty unleashed into the marketplace weeks later.
Even if half your clients double their business while a third lose half their search traffic, as an SEO business you typically don't get to capture much of the additional upside...whereas you certainly capture the complaints from those who just fell behind. Ultimately this is one of the reasons why I think being a diversified web publisher is better than being an SEO consultant... if something takes off & something else drops, you can pour additional resources into whatever is taking off and capture the lift from those changes.
If you haven't been tracking rankings now would be a great time to get on it. It is worth tracking a variety of keywords (at various levels of competition) daily while there is major flux going on, because that gives you another lens through which to view the relevancy algorithms, and where they might be headed.
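Even a minimal daily-snapshot diff gives you that lens. Here's a sketch (the keywords, positions, and data structures are all hypothetical; in practice you'd populate the snapshots from whatever rank source you use):

```python
# Compare two daily rank snapshots and surface the biggest movers.
# Maps keyword -> position in Google's results (None = outside tracked range).
yesterday = {"seo tools": 3, "keyword research": 7, "link building": 12, "seo book": 1}
today     = {"seo tools": 5, "keyword research": 4, "link building": None, "seo book": 1}

def rank_changes(old, new, out_of_range=100):
    """Return (keyword, delta) pairs sorted by absolute movement.

    A positive delta means the keyword dropped; keywords that fall out of
    the tracked range are treated as position `out_of_range`.
    """
    changes = []
    for kw in old:
        before = old[kw] or out_of_range
        after = new.get(kw) or out_of_range
        if before != after:
            changes.append((kw, after - before))
    return sorted(changes, key=lambda c: abs(c[1]), reverse=True)

for kw, delta in rank_changes(yesterday, today):
    direction = "dropped" if delta > 0 else "rose"
    print(f"{kw}: {direction} {abs(delta)} positions")
```

Run daily against keywords at various levels of competition and the biggest movers float to the top, which is exactly the flux you want to be watching while a major update rolls through.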
Google Instant launched. It is a new always-on search experience where Google tries to complete your keyword search by predicting what keyword you are searching for. As you type more letters the search results change.
Short intro here:
Long view here:
Not seeing it yet? You can probably turn it on here (though in some countries you may also need to be logged into a Google account). In time Google intends to make it a default feature, turned on for almost everyone (other than those with slow ISPs and older web browsers). And if you don't like it, the feature is easy to turn off at the right of the search box, but turning it off relies on a cookie. If you clear cookies the feature turns right back on.
Here is an image using Google's browser size tool, showing that when Google includes 4 AdWords ads only 50% of web browsers get to see the full 2nd organic listing, while only 20% get to see the full 4th organic listing.
Its implications on SEO are easy to understate. However, they can also be overstated: I already saw one public relations hack stating that it "makes SEO irrelevant."
Nothing could be further from the truth. If anything, Google instant only increases the value of a well thought out SEO strategy. Why? Well...
it consolidates search volume into a smaller basket of keywords
it further promotes the localization of results
it makes it easier to change between queries, since it's easier to type one more letter than to scroll down the page
it further pollutes AdWords impression testing as a great source of data