Google - Now With More Google in Your Google™

Nov 16th

Ben Edelman did it again :)

This time he highlighted how Google hard codes their search results:

[When] we roll[ed] out Google Finance, we did put the Google link first. It seems only fair right, we do all the work for the search page and all these other things, so we do put it first... That has actually been our policy, since then, because of Finance. So for Google Maps again, it's the first link. - Marissa Mayer

If they gain certain privileges in the marketplace by claiming not to abuse their power and insisting their algorithmic results are neutral, yet those algorithmic results can be pushed below the fold, is it really "only fair" for them to hand themselves a default market-leading position in any category where they feel they can make money selling ads? Or is that an abuse of power?

As Google adds features, collects more data, puts ads everywhere, and pushes into being a publisher on more fronts, at some point there will be a straw that breaks the camel's back. Big money is paying attention and the list of "evidence" grows weekly. Sometimes they still think like a start up. And that will lead to their demise.

It might not be anytime soon, but eventually they will hit a whammy.

A Warning Shot or an Accident? Does it Matter?

Nov 12th

On the 22nd of October Google had an indexing issue and a separate algorithmic change. Some of the sites associated with the indexing glitch quickly came back, whereas others seemed like they were hosed for weeks and headed toward the path of perpetual obscurity.

To give a visual of how dire this situation was for some webmasters, consider the following graphic.

The blue line is Google search traffic and the gray is total unique visitors. And since search visitors tend to monetize better than most other website visitors, the actual impact on revenues was greater than the impact on visitors. And, if you figure that sites have fixed costs (hosting, maintenance, new content creation, data licensing, marketing, etc.) then the impact on profits is even more extreme than the impact on revenues.
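To make that leverage concrete, here is a toy calculation (my own made-up numbers, not figures from the chart): because most costs are fixed, a given percentage drop in search traffic produces a much larger percentage drop in profit.

```python
# Toy numbers only: how fixed costs turn a 50% traffic drop into an ~83% profit drop.
visitors_before, visitors_after = 100_000, 50_000   # 50% drop in search visitors
revenue_per_visitor = 0.10                          # search visitors monetize well
fixed_costs = 4_000                                 # hosting, content, licensing, marketing...

for label, visitors in (("before", visitors_before), ("after", visitors_after)):
    revenue = visitors * revenue_per_visitor
    profit = revenue - fixed_costs
    print(f"{label}: revenue ${revenue:,.0f}, profit ${profit:,.0f}")
# before: revenue $10,000, profit $6,000
# after:  revenue $5,000, profit $1,000 -> revenue halved, profit down ~83%
```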

Hence in the search game you can go from hero to zero fast!

Search is one of the highest leverage business functions around today.

But it is also volatile. And it is a winner take most market.

When stuff heads south like that, what do you do? Do you consider it game over and try to lower costs further?

My approach to such events is to take it as a warning shot. To take it as a challenge. In the above example the traffic came back...

...but algorithms sometimes get rolled out in phases. Sometimes stuff that gets tripped up and later restored is being set up for a second fall when the relevancy algorithms are refined again. Sites that get caught in those snags tend to be fairly weak. So if you take any setback as motivation to create something better and work hard, then at least you know that if you failed, you tried and it just didn't work. Most likely, if you try hard, you will be able to make the site much better and not only reach your old traffic levels, but exceed them.

Even though the traffic came back for the above site, it has been getting a lot more effort. And it will continue to for months and months. The fear of loss is a great motivator to push people to create something better. Sometimes I think Google should mix up the results a bit more often just to drive people to create better stuff. :)

Google Ranking Internal Pages Rather Than Home Pages

Nov 3rd

When you get to *really* competitive keywords the results tend to be fairly stable, because the cost of entry into the game is so high & many of the top players keep building additional signals of quality. You might get minor fluctuations from time to time, but large fluctuations on highly competitive keywords are fairly rare.

Over the last day or two Google has made yet another algorithm change (the 3rd or 4th noticeable one in 2 weeks), where on some searches they are ranking an internal page over the homepage. The best mental model for the algorithm that is doing this seems to be something like the following (roughly sketched in code below the list)...

  • find the top SITES that deserve to rank well & rank them based on that criteria
  • however, rather than ranking THOSE PAGES, instead do internal site searches & back in other relevancy factors to look for other popular & relevant pages on those sites
  • test to see how well searchers respond to them
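Below is a minimal sketch of that mental model. It is purely speculative on my part (not anything Google has published), and the site data, scoring, and "internal site search" are all made-up stand-ins.

```python
# Speculative sketch of the two-stage model described above; all data & scoring are made up.
def score_site(site, query):
    # stage 1 stand-in: site-level authority / relevancy signals
    return site["authority"]

def best_internal_page(site, query):
    # stage 2 stand-in: "internal site search" backed in with other relevancy factors,
    # falling back to the home page if the site has no other pages
    return max(site["pages"],
               key=lambda page: page["relevance"].get(query, 0),
               default=site["home_page"])

def rerank(query, candidate_sites, top_n=10):
    # rank the SITES first, then surface each winner's best internal page
    winners = sorted(candidate_sites, key=lambda s: score_site(s, query), reverse=True)[:top_n]
    return [best_internal_page(site, query) for site in winners]
    # the third bullet (testing how searchers respond) would then decide
    # whether the internal page keeps the slot
```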

Here is a pretty overt example, where Google changed 2 of the listings for "SEO" to internal pages.

I have seen other examples, including one where Google highlighted a new, information-less blog post with only a couple of automated backlinks pointing at it. I don't want to "out" that site, but this is the type of image Google was showing beneath that entry:

Google could conceivably use this sort of process to further adjust the search results based on demographics, searcher location, recent searches, searcher interests, and so on. Add in the ability to send searchers down a known path optimized for profitability, to select vertical databases on the fly, and to change titles on the fly, and whoever has the most search market share can keep refining the results to make them more appealing to users at an ever-increasing level of granularity & ever-greater profitability.

I have no problems keeping up with the increasing complexity of search, but Google is setting up some serious barriers to entry for new players. It is hard to explain in a straightforward manner that page A might be ranking due to relevancy signals pointing into page B, but these are the SERPs through which we make a living. And it is only going to keep growing more complex. ;)

Depending on how far Google pushes with this, it can have major implications in terms of rank tracking, SEO strategy, site architecture & conversion optimization. More on that stuff in the community forums.

Google Replacing Page Titles in Search Results With On-Page Headings

Nov 1st

When Bing launched, one of the interesting things they did to make the organic search results appear more relevant was to use link anchor text to augment page titles (where relevant). That meant if people searched for a phrase that was mostly in your title (but your page title was missing a word or two from the search), Bing might insert those words into the page title area of your listing, provided they appeared in some of the link anchor text pointing into your page.

Before being switched over to Bing, Yahoo! would sometimes display the H1 heading as the clickable link to your site (rather than the page title). Bing also uses on page headings to augment page titles.

Historically, if Google thought it would appear more relevant to searchers, they have sometimes shown a machine-generated excerpt of your page displaying the keywords in context rather than the meta description as the snippet, but they have typically been far more conservative with page titles. Sometimes Google would list the ODP title as the title of a page, but until recently they have generally just listed the page title as the clickable link to your site.

Recently Google has grown more experimental on this front, being willing to use link anchor text and on-page headings as part of the listing. In addition, if the page title is short, Google may add the site's name at the end of the title.
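Here is a loose illustration of that sort of title rewriting. The heuristics, thresholds, and example data below are my own guesses for demonstration, not a documented Google algorithm.

```python
# Illustrative only: prefer the on-page heading when it matches the query better
# than the page title, and append the site name when the chosen title is short.
def display_title(query, page_title, h1, site_name, max_len=65):
    def overlap(text):
        return len(set(query.lower().split()) & set(text.lower().split()))

    title = h1 if overlap(h1) > overlap(page_title) else page_title
    if len(title) < 30 and site_name.lower() not in title.lower():
        title = f"{title} - {site_name}"     # pad short titles with the site's name
    return title[:max_len]

print(display_title(query="seo book blog",
                    page_title="Blog",
                    h1="SEO Book Blog Archives",
                    site_name="SEO Book"))
# -> "SEO Book Blog Archives"
```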

Here is an example in Google of the page title being replaced by part of an on-page heading & also showing the site's name being added to the end of the link

And here is the associated on-page heading for the above

I have also seen a few examples of the link anchor text being added to the page title in Google, however it was on a client project & the client would prefer that I didn't share his new site on an SEO blog with 10's of thousands of readers. :D

Last November Matt Cutts did a video on the topic of Google editing page titles for relevancy & how it was a fairly new thing for Google at that point. Even then Google was quite conservative in editing the clickable link ... I think they have only grown more aggressive on that front in the past month or so.

Localization, Unique Data Sets & the Future of Search

Oct 30th

Local is Huge

Google's US ad revenue is roughly $15 billion & the size of the US Yellow Pages market is roughly $14 billion. Most of that money is still in print, but the shift online is only accelerating with Google's push into local.

Further, cell phones are location aware, can incorporate location into search suggest, and on the last quarterly conference call Google's Jonathan Rosenberg highlighted that mobile ads were already a billion-dollar market for Google.

Google has been working on localization for years, treating it as a top priority. When asked "Anything you’ve focused on more recently than freshness?" Amit Singhal stated:

Localization. We were not local enough in multiple countries, especially in countries where there are multiple languages or in countries whose language is the same as the majority country.

So in Austria, where they speak German, they were getting many more German results because the German Web is bigger, the German linkage is bigger. Or in the U.K., they were getting American results, or in India or New Zealand. So we built a team around it and we have made great strides in localization. And we have had a lot of success internationally.

The Big Shift

I have been saving some notes on the push toward local for a while now, and with Google's launch of the new localized search results it is about time to do an overview. First here is Google's official announcement, and some great reviews from many top blogs.

Some of the localized results not only appear for things like Chicago pizza but also for single word searches in some cases, like pizza or flowers.

Promoting local businesses via the new formats has many strategic business benefits for Google:

  • assuming they track user interactions, the relevancy eventually gets better for end users
  • allows local businesses to begin to see more value from search, so they are more likely to invest into a search strategy
  • creates a direct relationship with business owners which can later be leveraged (in the past Google has marketed AdWords coupons to Google Analytics users)
  • if a nationwide brand can't dominate everywhere just because they are the brand, it means that they will have to pony up on the AdWords front if they want to keep 100% exposure
  • if Google manages to put more diversity into the local results then they can put more weight on domain authority on the global results (for instance, they have: looked at query chains, recommended brands in the search results, shown many results from the lead brand on a branded search query, listed the official site for searches for a brand + a location where that brand has no office, etc.)
  • it puts eye candy in the right rail that can make searchers more inclined to look over there
  • it makes SEO more complex & expensive
  • it allows Google to begin monetizing the organic results (rather than hiding them)
  • it puts in place an infrastructure which can be used in other markets outside of local

Data Data Data

Off the start it is hard to know what to make of this unless one draws historical parallels. At first one might be inclined to say the yellow page directories are screwed, but the transition could be a bit more subtle. The important thing to remember is that now that the results are in place, Google can test and collect data.

More data typically beats better algorithms, and Google has highlighted that one of their richest sources of data is tracking searcher behavior on their own websites.

Pardon Me, While I Steal Your Lunch

There are 2 strong ways to build a competitive advantage on the data front:

  • make your data better
  • starve competing business models to make them worse

Off the start yellow page sites might get a fair shake, but ultimately they are headed in the direction of being increasingly squeezed. In a mobile-connected world where Google owns roughly 97% of mobile search marketshare, offers localized search auto-complete and ads that map to physical locations, and is creating a mobile coupon offers network, the yellow page companies are a man without a country. Or perhaps a country without a plot of land. ;)

They are so desperate that they are cross licensing amongst leading competitors. But that just turns their data into more of a commodity.

Last December I cringed when I read David Swanson, the CEO of R.H. Donnelley, state: "People relate to us as a product company -- the yellow-pages -- but we don't get paid by people who use the yellow-pages, we get paid by small businesses for helping them create ad messages, build websites, and show up in search engine results. ... Most of the time today, you are not even realizing that you are interacting with us."

After seeing their high level of churn & reading the above comment, I felt someone should have sent him the memo about the fate of thin affiliates on AdWords. Not to worry, the truth would come out in time. ;)

Making things worse, not only is local heavily integrated into core search (with search suggest being localized), but Google is also dialing for dollars, offering flat-rate map ads (with a free trial) and again testing fully automated flat-rate local AdWords ads.

Basic Economics

How does a business maximize yield? Externalize costs & internalize profits. Pretty straightforward. To do this effectively, Google wants to cut as many middlemen out of the game as possible. This means Google might decide to feed off your data while driving no traffic to your business, but rather driving you into bankruptcy.

Ultimately, what is being commoditized? Labor. More specifically:

  • the affiliate who took the risk to connect keywords and products
  • the labor that went into collecting & verifying local data
  • the labor that went into creating the editorial content on the web graph and the links which search engines rely on as their backbone.
  • the labor that went into manually creating local AdWords accounts, tracking their results, & optimizing them (which Google tracks & uses as the basis for their automated campaigns)
  • the labor that went into structuring content with the likes of micro-formats
  • the labor that went into policing and formatting user reviews
  • many other pieces of labor that the above labor ties into

Of course Google squirms out of any complaints by highlighting the seedy ends of the market and/or by highlighting how they only use such data "in aggregate" ... but if you are the one losing your job & having your labor used against you, "the aggregate" still blows as an excuse.

But if Google drives a business they are relying on into bankruptcy, won't that make their own search results worse?

Nope.

For 2 big reasons:

  • you are only judged on your *relative* performance against existing competitors
  • after Google drives some other players out of the marketplace and/or makes their data sets less complete, the end result is Google having the direct relationships with the advertisers and the most complete data set

The reason many Google changes come with limited monetization off the start is so that people won't question their motives.

Basically I think they look at it this way: "We don't care if we kill off a signal of relevancy because we will always be able to create more. If we poison the well for everyone else while giving ourselves a unique competitive advantage it is a double win. It is just like the murky gray area book deal which makes start up innovation prohibitively expensive while locking in a lasting competitive advantage for Google."

You would never hear Google state that sort of stuff publicly, but when you look at their private internal slides you see those sorts of thoughts are part of their strategy.

What is Spam?

The real Google guidelines should read something like this:

Fundamentally, the way to think about Google's perception of spam is that if Google can offer a similar quality service without much cost & without much effort then your site is spam.

Google doesn't come right out and say that (for anti-trust reasons), but they have mentioned the problem of search results in search results. And their remote rater documents did state this:

After typing a query, the search engine user sees a result page. You can think of the results on the result page as a list. Sometimes, the best results for "queries that ask for a list" are the best individual examples from that list. The page of search results itself is a nice list for the user.

...But This is Only Local...

After reading the above, some SEOs might breathe a sigh of relief, thinking "well at least this is only local."

To me that mindset is folly though.

Think back to the unveiling of Universal search. At first it was a limited beta test with some news sites, then Google bought YouTube, and then the search landscape changed...everyone wanted videos and all the other stuff all the time. :D

Anyone who thinks this rich content SERP which promotes Google is only about local is going to be sorely disappointed as it moves to:

  • travel search (Google doesn't need to sell airline tickets so long as they can show you who is cheapest & then book you on a high margin hotel)
  • any form of paid media (ebooks, music, magazines, newspapers, videos, anything taking micro-payments)
  • real estate
  • large lead generation markets (like insurance, mortgage, credit cards, .edu)
  • ecommerce search
  • perhaps eventually even markets like live ticketing for events

Google does query classification and can shape search traffic in ways that most people do not understand. If enough publishers provide the same sorts of data and use the same types of tags, they are creating new sets of navigation for Google to offer end users.

No need to navigate through a publisher's website until *after* you have passed the click toll booth.

Try #3 at Reviews

Google SearchWiki failed in large part because it confused users. Google launched SideWiki about a year ago, but my guess is it isn't faring much better. When SideWiki launched, Danny Sullivan wrote:

Sidewiki feels like another swing at something Google seems to desperately desire — a community of experts offering high quality comments. Google says that’s something that its cofounders Larry Page and Sergey Brin wanted more than a system for ranking web pages. They really wanted a system to annotate pages across the web.

The only way they are going to get that critical mass is by putting that stuff right in the search results. It starts with local (& scrape + mash in other areas like ecommerce), but you know what they want & they are nothing if not determined to get what they want! ;)

Long Term Implications

Scrape / mash / redirect may be within the legal limits of fair use, but it falls short in spirit. At some point publishers who recognize what is going on will align with better partners. We are already seeing an angry reaction to Google from within the travel vertical and from companies in the TV market.

Ultimately it is webmasters, web designers & web developers who market and promote search engines. If at some point the consensus becomes that Google is capturing more value than they create, or that Google search results have too much miscellaneous junk in them, those webmasters could push a lot more searchers over to search services which are more minimalistic + publisher friendly. Blekko launches Monday, and their approach to search is much like Google's early approach was. :)

Google's Profits: to Infinity & Beyond

Oct 26th

Marin Software manages about 5% of Google AdWords spend for clients, and they noticed that since Google Instant was unveiled, AdWords ad clicks are up 5%. Since the launch Google's Jonathan Rosenberg has mentioned that the impact on AdWords was "not material."

I found the repeated use of those exact words suspicious and diversionary, and, as it turned out, with good reason! When Google Instant launched I highlighted what Google was doing to screen real estate & predicted this shift.

Turns out that the "tin foil hat wearing SEOs" were right once again.

And that 5% lift in AdWords clicks is on top of the lifts Google has already seen from the following (a rough compounding example follows the list):

  • creating a 4th ad slot for comparison ads (in high paying verticals like "credit cards" and "mortgage")
  • sitelinks, merchant ratings, and other ad extensions. On the last quarterly call Jonathan Rosenberg stated: "These ads appear on more than 10% of the queries where we show ads and people like them. We see this because click-through rates are up for some formats as much as 10% and up more than 30% on some others."
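Those lifts multiply rather than add. As a purely illustrative example (my own toy numbers, not Google's reporting):

```python
# Toy compounding example: a +5% lift stacked on a +10% lift is ~+15.5%, not +15%.
lifts = [0.05, 0.10]
total = 1.0
for lift in lifts:
    total *= 1 + lift
print(f"combined lift: {total - 1:.1%}")   # -> combined lift: 15.5%
```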

It is thus no surprise that Google's move into other verticals is met with resistance. The travel industry recently put together the Fair Search site to oppose Google's purchase of ITA Software.

The Google as Monopoly meme continues to grow.

Is Google a Monopoly? (Graphic by Scores.org)

As Google continues to make enemies this is a great time for the launch of a back to the basics approach to core algorithmic search. Blekko is launching publicly on November 1st.

Ho Ho Ho, Go Google Go

Oct 24th

Some sites have seen pretty drastic drops in Google search traffic recently, related to indexing issues. Google maintains that it is a glitch:

Just to be clear, the issues from this thread, which I have reviewed in detail, are not due to changes in our policies or changes in our algorithms; they are due to a technical issue on our side that will be visibly resolved as soon as possible (it may take up to a few days to be visible for all sites though). You do not need to change anything on your side and we will continue to crawl and index your content (perhaps not as quickly at the moment, but we hope that will be resolved for all sites soon). I would not recommend changing anything significantly at this moment (unless you spot obvious problems on your side), as these may result in other issues once this problem is resolved on our side.

For an example of one site's search traffic that was butchered by this glitch, see the images below. Note that before the glitch, Google traffic was ~10x what Yahoo! or Bing drive, and after the bug the traffic is ~even.
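Some rough back-of-the-envelope math on what that shift implies, assuming Bing's traffic stayed flat through the glitch (arbitrary units):

```python
# Going from ~10x Bing to ~even with Bing is roughly a 90% drop in Google referrals.
bing = 1_000                                   # assumed steady baseline
google_before, google_after = 10 * bing, 1 * bing
drop = 1 - google_after / google_before
print(f"Google referrals down ~{drop:.0%}")    # -> Google referrals down ~90%
```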

Not that long ago I saw another site with over 500 unique linking domains which simply disappeared from the index for a few days, then came right back 3 days later. Google's push to become faster and more comprehensive has perhaps made them less stable, as digging into social media highlights a lot of false signals & often promotes a copy over the original. Add in any sort of indexing issues and things get really ugly really fast.

Now this may just be a glitch, but as Tedster points out, many such "glitches" often precede or coincide with major index updates. For as long as I have been in the SEO field, Google has done a major algorithmic change just before the holidays every year except last year.

I think the reasons they do it are likely three or four fold:

  • they want to make SEO unpredictable & unreliable (which ultimately means less resources are spent on SEO & the results are overall less manipulated)
  • they want to force businesses (who just stocked up on inventory) to enter the AdWords game in a big way
  • by making changes to the core relevancy algorithms (and having the market discuss those) they can slide in more self promotion via their vertical search services without it drawing much anti-trust scrutiny
  • the holidays are when conversion rates are the highest, so if they want to make changes to seek additional yield it is the best time to do it, and the holidays give them an excuse to offer specials or beta tests of various sorts

As an SEO with clients, the unpredictability is a bad thing, because it makes it harder to manage expectations. Sharp drops in rankings from Google "glitches" erode customer trust in the SEO provider. Sometimes Google will admit to major issues happening, and other times they won't until well *after* the fact. Being proven right after the fact still doesn't take back 100% of the uncertainty unleashed into the marketplace weeks later.

Even if half your clients double their business while 1/3 lose half their search traffic, as an SEO business you typically don't get to capture much of the additional upside...whereas you certainly capture the complaints from those who just fell behind. Ultimately this is one of the reasons why I think being a diversified web publisher is better than being an SEO consultant... if something takes off & something else drops then you can just pour additional resources into whatever is taking off and capture the lift from those changes.

If you haven't been tracking rankings, now would be a great time to get on it. It is worth tracking a variety of keywords (at various levels of competition) daily while there is major flux going on, because that gives you another lens through which to view the relevancy algorithms, and where they might be headed.
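A minimal daily logging sketch is below. The get_rank function is a hypothetical placeholder for whatever rank-checking tool or API you already use; the only point is to capture date/keyword/position rows you can chart later.

```python
import csv
import datetime

# Track a mix of keywords at various levels of competition, once per day.
KEYWORDS = ["seo", "seo tools", "learn seo basics"]

def get_rank(keyword, domain):
    """Hypothetical placeholder: return the domain's position for this keyword
    using whatever rank checker you rely on."""
    raise NotImplementedError("plug in your own rank-checking tool or API here")

def log_daily_ranks(domain, path="ranks.csv"):
    today = datetime.date.today().isoformat()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for keyword in KEYWORDS:
            writer.writerow([today, keyword, get_rank(keyword, domain)])

# log_daily_ranks("example.com")   # e.g. run from a daily cron job
```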

How Google Instant Changes the SEO Landscape

Sep 9th

Google Instant launched. It is a new always-on search experience where Google tries to complete your keyword search by predicting what keyword you are searching for. As you type more letters the search results change.


Not seeing it yet? You can probably turn it on here (though in some countries you may also need to be logged into a Google account). In time Google intends to make this a default feature turned on for almost everyone (other than those with slow ISPs and older web browsers). If you don't like it, the feature is easy to turn off at the right of the search box, but turning it off relies on a cookie: if you clear your cookies, the feature turns right back on.

Here is an image using Google's browser size tool, showing that when Google includes 4 AdWords ads only 50% of web browsers get to see the full 2nd organic listing, while only 20% get to see the full 4th organic listing.

Its implications for SEO are easy to understate. However, they can also be overstated: I already saw one public relations hack stating that it "makes SEO irrelevant."

Nothing could be further from the truth. If anything, Google Instant only increases the value of a well-thought-out SEO strategy. Why? Well...

  • it consolidates search volume into a smaller basket of keywords
  • it further promotes the localization of results
  • it makes it easier to change between queries, so it's easier to type one more letter than to scroll down the page
  • it further pollutes AdWords impression testing as a great source of data

Let's dig into these, shall we?

How Many Companies Has Google Bought?

Aug 25th

One of the best ways to track Google's strategies is through visualizing & analyzing their acquisitions, which is what the following image helps you do. Click on it for the full enlarged version :)


via Scores

Your Favorite Eric Schmidt Quotes?

Aug 22nd

Do you want Google to tell you what you should be doing? Mr. Schmidt thinks so:

"More and more searches are done on your behalf without you needing to type. I actually think most people don't want Google to answer their questions," he elaborates. "They want Google to tell them what they should be doing next. ... serendipity—can be calculated now. We can actually produce it electronically."

Of course the problem with algorithms is they rely on prior experience to guide you. They won't tell you to do something unique & original that can change the world; rather, they will lead you down a well-worn path.

What are some of the most bland and most well worn paths in the world? Established brands:

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.

"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."

"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."

If Google is so smart then why the lazy reliance on brand? Why not show me something unique & original & world-changing?

Does brand affinity actually have a hard wired genetic component? Or is it that computers are stupid & brands have many obvious signals associated with them: one of which typically being a large ad budget. And why has Google's leading search engineer complained about the problem of "brand recognition" recently?

While Google is collecting your data and selling it off to marketers, they have also thought of other ways to monetize that data and deliver serendipity:

"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."

Any guess how that product might have added value to the world? On down days (or days when you search for "debt help") would Google deliver more negatively biased ads & play off fears more, while on up days selling more euphoric ads? Might that serendipity put you on the wrong side of almost every trade you make? After all, that is how the big names in that space make money - telling you to take the losing side of a trade with bogus "research."

Eric Schmidt asks who you would rather give access to this data:

“All this information that you have about us: where does it go? Who has access to that?” (Google servers and Google employees, under careful rules, Schmidt said.) “Does that scare everyone in this room?” The questioner asked, to applause. “Would you prefer someone else?” Schmidt shot back – to laughter and even greater applause. “Is there a government that you would prefer to be in charge of this?”

That exchange helped John Gruber give Eric Schmidt the label Creep Executive Officer, while asking: "Maybe the question isn’t who should hold this information, but rather should anyone hold this information."

But Google has a moral conscience. They think quality score (AKA bid rigging) is illegal, except for when they are the ones doing it!

"I think judgement matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place," - Eric Schmidt

Which is why the blog of a certain mistress disappeared from the web. And, of course, since this post is on a blog, it doesn't matter:

If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt

Here is the thing I don't get about Google's rhetorical position on serendipity & moral authority: if they are to be trusted to recommend what you do, then why do they recommend illegal activities like pirating copyright works via warez, keygens, cracks & torrents?

Serendipity ho!
