Phorm/Google Behavioral Ad Targeting - Based on Your Browsing Data

Phorm, a UK company that partnered with BT to run secret trials to target ads based on usage data, was roasted by the media with article titles like Phorm’s All-Seeing Parasite Cookie.

Google, which has long stayed away from behavioral targeting due to privacy (and negative publicity) concerns, announced they are jumping into the behavioral ad targeting market:

Google will use data it collects about what Web sites users visit and what it knows about the content of those sites to sort its massive audience of users into groups such as hockey fans or travel enthusiasts. The data won't be drawn from users' search queries, but from text files known as cookies that Google installs on the Web browsers of users who visit pages where it serves ads.

DoubleClick, AdSense, Google Toolbar, Gmail, YouTube, Blogger, Google Groups, Google Checkout, Google Chrome, Google Analytics...there are lots of ways to track you, even if you do not want to be tracked. Google will allow users to opt out of such targeting with yet another cookie, but if you clear your cookies then you are back in the matrix again.

And while Google claims they are not using search queries in their current behavioral targeting, Danny Sullivan wrote:

Google confirmed in a session I moderated at the Omniture Summit last month that they have tested behaviorally targeted ads using past search history data. Again, that doesn’t seem to be part of this release, but it could come in the future.

As noted in the early Google research paper titled The Anatomy of a Large-Scale Hypertextual Web Search Engine:

we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.

Indeed. Tim Berners-Lee, creator of the world wide web, spoke out against behavioral targeting:

"People use the web in a crisis, when wondering whether they have a sexually transmitted disease, or cancer, when wondering if they are homosexual and whether to talk about it … to discuss political views."
...
"The power of this information is so great that the commercial incentive for companies or individuals to misuse it will be huge," he said. "It is absolutely essential to have absolute clarity that it is illegal."

If Google continues down this path unquestioned, then in due time you may not be able to get health insurance because of a web page you viewed or a keyword you trusted Google enough to search for. Better luck next life!

Download SEO Book Torrent: Should Google Recommend That?

In the following video Matt Cutts highlighted that he did not feel that the update was driven by brand, but was based more on concepts of trust, PageRank, and authority:

RankPulse, a tool I used in my analysis of the algorithm change, is powered by the Google SOAP API, which Google will soon stop supporting. Matt played down the size of the algorithm update made by a Googler named Vince. But John Andrews takes a contrarian view, looking at Google's behavior after the algorithm update was analyzed:

You might say that Google’s API,via custom third-party innovations like RankPulse.com, enabled us to “organize the world’s information and make it universally accessible and useful” (which is Google’s corporate mission statement, by the way).

It sure seems contradictory for Google, a company based on the collection and permanent storage of others’ web page content, to forbid others from doing the same. It is also quite egregious for Google to expect to operate secretly, with no accountability (such as might be obtained through archiving of Google results), when Google exerts so much influence over Internet commerce.

One of Google's initial complaints, as mentioned by Joshua Sciarrino, was that search information was too secretive:

At the same time, search engines have migrated from the academic domain to the commercial. Up until now most search engine development has gone on at companies with little publication of technical details. This causes search engine technology to remain largely a black art and to be advertising oriented (see Appendix A). With Google, we have a strong goal to push more development and understanding into the academic realm. ... However, it is very difficult to get this (academic) data, mainly because it is considered commercially valuable.

As Google gobbles up your content while shielding its results from unauthorized access, it creates a weakness which a new search service could exploit...by being far more open.

While Google doesn't want anyone to access their proprietary business secrets, if you search for my brand they recommend you look for a torrent to go download an old copy of my ebook.

Sounds like a fair trade, eh? No big deal. Google is a common carrier, and intends to use that to their business advantage whenever and wherever possible.

I hope you (and your business model) are not allergic to peanuts!

Hugo Guzman: Deconstructing the Google 'Brand?' Algorithm

So here we are on Tuesday, March 3rd, and I’m still trying to fully digest the implications of Aaron’s “Heavy Emphasis on Branding” post from last Wednesday, February 25. The data that was presented, the context that was provided, and the labyrinth of insightful user comments that were spawned left me reeling for days. So much so that I wouldn’t be surprised if the annals of SEO history remember February 25, 2009 as the infamous “Aaron Wall” update.

In all seriousness, though, this really is a big deal, especially for folks like me who spend their days attempting to optimize mainstream “Big Brand” web sites for a living. I’m fortunate enough to work for an interactive agency that takes SEO seriously, and my team strives to deliver a truly comprehensive approach to SEO – blending site-side factors, link building, social media elements, and analytics. We usually do a pretty darn good job, despite the myriad of obstacles and pitfalls associated with trying to implement SEO for a large, lumbering, Fortune 500 web portal. And sadly, like many big firms out there, we have occasionally chalked up our shortcomings to a lack of implementation and cooperation on the part of the client. It’s that typical “not our fault, it’s a crappy big brand site” copout that many of us have heard a thousand times before.

Then along comes Aaron with his revelations about Google’s recent algorithm shift and its ramifications for big brands, and all hell breaks loose:

  • I immediately spiral into self-doubt regarding my own and my team’s marketing abilities
  • I start scrambling to deconstruct this alleged algorithm shift
  • I start emailing all of my senior team members asking them to attempt deconstructing the algorithm shift
  • they roll their eyes and one of them tells me to stop sending so many random emails at 10 o’clock at night

I’ve calmed down a bit since then, but I’m still hard at work trying to figure out exactly what levers have caused certain “Big Brand” sites to skyrocket in the SERPs while others remain mired in search engine mediocrity. As with most things in life, the best course of action is to introduce a bit of the old scientific method, systematically isolating variables in an attempt to identify predictable patterns that can be replicated.

After taking a high-level look at each of the keywords outlined in Aaron’s post, and the corresponding brand sites that made the jump onto the front page, several possible culprits become apparent. Here are a few that jumped out at me:

Social Media Signals – companies like University of Phoenix have made a concerted effort to engage users via social media channels, and those social reverberations could be a key facet in Google’s newly refined algorithm, especially if some of those reverberations include mention of the phrase “online degree.”

Increased weighting of anchor text within internal site linkage – companies like American Airlines seem to be leveraging both their own internal site pages as well as partner sites to increase the volume of anchor text occurrences for the term “airline tickets” (although they’re missing out on some seriously low-hanging fruit by failing to optimize the alt attribute on their global logo image link). If Google has decided to increase the potency of this element, then large brand portals with voluminous amounts of internal pages and partner sites (or branded micro sites) could gain an upper hand for highly competitive terms.

Increased sensitivity to offline marketing campaigns – Perhaps Google’s algorithm is getting better at recognizing site traffic associated with offline marketing campaigns. This would be extremely difficult to do without having direct access to a site’s analytics data (although Google Analytics conspiracy theorists are convinced that this is already the case for sites using GA), but perhaps Google is using signals such as the relative volume of specific search queries (e.g. branded queries like “State Farm”) and somehow tying that data back to terms that the algorithm associates with the given brand query (e.g. State Farm = Auto Insurance).

Disclaimer: I haven’t been able to actually test these hypotheses out thoroughly or with any real semblance of scientific method. After all, it’s only been five days since I read the post, and I do have other things to do besides ponder the ramifications of this alleged algorithm shift (it’s 10pm so I have to start annoying my team with random emails again).

Besides, Google’s results could roll back at any moment, rendering all of these insights (nearly) moot. Still, if you’re in any way involved in optimizing web sites for big brands (or if you just want to improve your eye for SEO) it’s probably a good idea to start doing a little scientific testing of your own.

If you liked this post (or even if you thought it was a flaming pile of dog excrement) feel free to reach out to me via my Twitter handle: http://twitter.com/hugoguzman11

Big Brands? Google Brand Promotion: New Search Engine Rankings Place Heavy Emphasis on Branding

Originally when we published this we were going to make it subscriber only content, but the change is so important that I thought we should share some of it with the entire SEO industry. This post starts off with a brief history of recent algorithm updates, and shows the enormous weight Google is placing on branded search results.

The Google Florida Update

I got started in the search field in 2003, and one of the things that helped get my name on the map was when I wrote about the November 14th Google Florida update in a cheeky article titled Google Sells Christmas [1]. To this day many are not certain exactly what Google changed back then, but the algorithm update seemed to hit a lot of low level SEO techniques. Many pages that exhibited the following characteristics simply disappeared from the search results:

  • repetitive inbound anchor text with little diversity
  • heavy repetition of the keyword phrase in the page title and on the page
  • words in a phrase exhibiting close proximity, with few occurrences of the keywords spread apart
  • a lack of related/supporting vocabulary in the page copy

The Google Florida update was the first update that made SEO complicated enough to where most people could not figure out how to do it. Before that update all you needed to do was buy and/or trade links with your target keyword in the link anchor text, and after enough repetition you stood a good chance of ranking.

Google Austin, Other Filters/Penalties/Updates/etc.

In the years since, Google has worked on creating other filters and penalties. At one point they tried so hard to stop artificial anchor text manipulation that they accidentally filtered out some brands for their official names [2].

The algorithms have grown so complex on some fronts that Google engineers do not even know about some of the filters/penalties/bugs (the difference between the 3 labels often being a matter of semantics). In December 2007, a lot of pages that ranked #1 suddenly ended up ranking no better than position #6 [3] for their core target keyword (and many related keywords). When questioned about this, Matt Cutts denied the problem existed, only acknowledging it after a fix was already underway: [4]

When Barry asked me about "position 6" in late December, I said that I didn't know of anything that would cause that. But about a week or so after that, my attention was brought to something that could exhibit that behavior. We're in the process of changing the behavior; I think the change is live at some datacenters already and will be live at most data centers in the next few weeks.

Recent Structural Changes to the Search Results

Google helped change the structure of the web in January 2005 when they proposed a link rel=nofollow tag [5]. Originally it was said to stop blog spam, but by September of the same year, Matt Cutts changed his tune to where you were considered a spammer if you were buying links without using rel=nofollow on them. Matt Cutts documented some of his repeated warnings on the Google Webmaster Central blog. [6]

A bunch of allegedly "social" websites have adopted the use of the nofollow tag, [7] turning their users into digital share-croppers [8] and eroding the link value [9] that came as a part of being a well known publisher who created link-worthy content.

In May of 2007 Google rolled out Universal search [10], which mixes in select content from vertical search databases directly into the organic search results. This promoted

  • Google News
  • Youtube videos (and other video content)
  • Google Product Search
  • Google Maps/Local
  • select other Google verticals, like Google Books

These 3 moves (rel=nofollow, social media, and universal search), coupled with over 10,000 remote quality raters [11], have made it much harder to manipulate the search results quickly and cheaply unless you have a legitimate, well trusted site that many people vouch for. (And it does not hurt to have spent a couple hours reading their 2003, 2005, and 2007 remote quality rater guidelines that were leaked into the SEO industry. [12])

Tracking Users Limits Need for "Random" Walk

The PageRank model is an algorithm built on a random walk of links on the web graph. But if you have enough usage data, you may not need to base your view of the web on that perspective since you can use actual surfing data to help influence the search results. Microsoft has done research on this concept, under the name of BrowseRank. [13] In Internet Explorer 8 usage data is sent to Microsoft by default.
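The random-walk model PageRank is built on is easy to sketch. The following Python snippet is illustrative only - the toy link graph, damping factor, and iteration count are assumptions, not Google's production values - but it shows how power iteration simulates a surfer who mostly follows links and occasionally jumps to a random page:

```python
# Minimal power-iteration PageRank over a toy link graph.
# This is a sketch of the "random surfer" model, not Google's algorithm.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # the (1 - damping) term models the surfer jumping to a random page
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # distribute this page's rank evenly across its outlinks
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for target in pages:
                    new[target] += damping * ranks[page] / n
        ranks = new
    return ranks

# Hypothetical three-page web: a links to b and c, b links to c, c links to a.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The point of the BrowseRank idea is that with enough real browsing data you can replace this simulated surfer with the transition behavior of actual users.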

Google's Chrome browser phones home [14] and Google also has the ability to track people (and how they interact with content) through Google Accounts, Google Analytics, Google AdSense, DoubleClick, Google AdWords, Google Reader, iGoogle, Feedburner, and YouTube.

Yesterday we launched a well received linkbait, and the same day our rankings for our most valuable keywords were lifted in both Live and Google. Part of that may have been the new links, but I would be willing to bet some of it was caused by tens of thousands of users finding their way to our site.

Google's Eric Schmidt Offers Great SEO Advice

If you ask Matt Cutts what big SEO changes are coming up he will tell you "make great content" and so on...never wanting to reveal the weaknesses of their search algorithms. Eric Schmidt, on the other hand, is frequently talking to media and investors with the intent of pushing Google's agenda and all the exciting stuff that is coming out. In the last 6 months Mr. Schmidt has made a couple of statements that smart SEOs should incorporate into their optimization strategies - one on brands [15], and another on word relationships [16].

Here is Mr. Schmidt's take on brands from last October:

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.

"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."

"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."

And here is his take on word relationships from the most recent earnings call:

“Wouldn’t it be nice if Google understood the meaning of your phrase rather than just the words that are in that phrase? We have a lot of discoveries in that area that are going to roll out in the next little while.”

The January 18th Google Update Was Bigger Than Florida, but Few People Noticed it

Tools like RankPulse [17] allow you to track the day to day Google ranking changes for many keywords.

4 airlines recently began ranking for "airline tickets"

At least 90% of the first page of search results for auto insurance is owned by large national brands.

3 boot brands / manufacturers rose from nowhere to ranking at the top of the search results.

3 of the most well recognized diet programs began ranking for diets.

4 multi-billion dollar health insurance providers just began ranking, with Aetna bouncing between positions #1 and 2.

3 of the largest online education providers began ranking for online degree.

5 watch brands jumped onto the first page of search results for watches. To be honest I have never heard of Nixon Now.

The above images are just some examples. Radioshack.com recently started ranking for electronics and Hallmark.com just recently started ranking for gifts. The illustrations do not list all brands that are ranking, but brands that just started ranking. Add in other brands that were already ranking, and in some cases brands have 80% or 90% of the first page search results for some of the most valuable keywords. There are thousands of other such examples across all industries if you take the time to do the research, but the trend is clear - Google is promoting brands for big money core category keywords.

Want to read the rest of our analysis? If you are a subscriber you can access it here.

Mahalo Caught Spamming Google With PageRank Funneling Link Scheme

Jason "SEO is dead" Calacanis, founder of Mahalo, used the "SEO is dead" line as a publicity stunt to help launch his made-for-AdSense scraper website. In the past we have noted how he was caught ranking pages without any original content - in clear violation of Google's guidelines. And now he has taken his spam strategy one step further, by creating a widget that bloggers can embed on their blogs.

The following link list looks like something you would find on an autogenerated spam website, but was actually on Hack A Day, a well respected technology blog with lots of PageRank.

  • Note that the links are not delivered in Javascript and do not use nofollow.
  • The links are repetitive and spammy.
  • The links have no contextual relevance.

This activity is in stark contrast to Google's webmaster guidelines:

Your site's ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links count towards your rating. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity. However, some webmasters engage in link exchange schemes and build partner pages exclusively for the sake of cross-linking, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. This is in violation of Google's webmaster guidelines and can negatively impact your site's ranking in search results. Examples of link schemes can include:

  • Links intended to manipulate PageRank
  • Links to web spammers or bad neighborhoods on the web
  • Excessive reciprocal links or excessive link exchanging ("Link to me and I'll link to you.")
  • Buying or selling links that pass PageRank


The above links not only appear on Hack A Day; Mahalo is actually creating a "Mahalo Blog Network" that cross-links to other Mahalo-promoting blogs and exists for the purpose of flowing PageRank into high-paying Mahalo pages.

Back around the last time Jason was calling SEO spam, he was promoting Weblogs Inc., and his blog revenues relied heavily on selling PageRank from his blogs to casino websites.

Do the venture capitalists that invested in Mahalo support such Google gaming and PageRank selling strategies? When will Google act on this blatant violation of their guidelines? Jason has a clear history of operating outside the spirit of their guidelines, and if Google lets this slide then many other people are going to start spamming them too. Google has an obligation to protect searchers from such devious behavior, lest they let it slide and promote the creation of more spam.

Update: This Looks Worse Than I Originally Thought!

While leveraging blog sidebars to pump PageRank and anchor text is pretty bad, at least it was not in the editorial content of blog posts. But it looks like many Mahalo employees not only put links in their sidebars, but they publish posts that consist of little but a link laundry list pointing at various seasonally hot parts of the Mahalo site.

The above is just a small sample of such posts promoting Mahalo. There are probably hundreds or thousands of such posts floating around the web. What makes that strategy any better than the "evil" Pay Per Post strategy that Jason Calacanis was allegedly against? I guess it is only bad when someone else is profiting from it.

Did Google Actually Penalize Google Japan?

After Google Japan got caught buying paid blog reviews it was claimed that Google penalized their own site. Sure, their toolbar PageRank score took a hit, but did the penalty do anything to their actual rankings? Not as far as I can tell.

Search Google for John Chow or Text Link Ads and try to find the official branded sites...that is what a real penalty looks like. It looks like The SEO Commandments don't apply equally to everyone.

Thou shalt bear witness against all thy competitors, spying and snitching and ratting on them whenever thou perceivest a purported spam causing grief to Mine index and My corporate ego. And My profits. For thus shalt thou spare Me labor and the expense of attending to Mine Own job. And if thou wilt not lay it to heart to give glory to My name in this manner, behold, I will corrupt thy ranking, and spread dung upon thy name, and castigate thee as unethical, and thine SEO agency shall be damned and misranked in all eternity. For verily, I am a jealous Search Engine.

Ad Networks - "Partners" Hoarding Publisher Data For Profit

Are the big networks trying to lock up their data?

It would appear that some big players are trying to muscle in between the user and the webmaster by limiting the webmaster's access to valuable statistical data.

The excellent SmackDown blog has a post about Google reportedly testing Ajax results in the main SERPs.

Sounds innocuous enough, right?

Trouble is, what happens to existing tools? Plugins? Rank checkers? Stats and other referral tracking packages? All tools that rely on Google passing data in order to work.

Many tool vendors would likely adapt, but as Michael points out, what happens if all the referral data shows as coming from Google.com, i.e. no keyword data is passed?

Browsers do not include that data in the referrer string, and it is never sent to the server. Therefore, all referrals from a Google AJAX driven search currently make it look as if you are getting traffic from Google’s homepage itself. Now, while this kind of information showing up in your tracking programs might be quite a boost to the ego if you don’t know any better, and will work wonders for picking up women in bars (”guess who links to me from their homepage, baby!”), for actual keyword tracking it is of course utterly useless.
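The keyword-loss problem described above is easy to demonstrate. In this Python sketch (the URLs are hypothetical examples, not captured referrers), a classic result page passes the query in the `q` parameter of the referrer URL, while an AJAX-style result page carries it after a `#` fragment, which browsers strip before sending the Referer header:

```python
# Sketch: extracting the search keyword from a Google referrer URL.
# Classic result pages put the query in ?q=...; AJAX-style pages put it
# after '#', which is never sent to the server, so the referrer collapses
# to the bare homepage and the keyword is lost to tracking tools.
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    parsed = urlparse(referrer)
    query = parse_qs(parsed.query)
    return query.get("q", [None])[0]  # None when no keyword survives

# Hypothetical referrers for the same search:
classic = "http://www.google.com/search?q=airline+tickets&hl=en"
ajax = "http://www.google.com/"  # '#q=airline+tickets' stripped client-side

keyword_from_referrer(classic)  # "airline tickets"
keyword_from_referrer(ajax)     # None
```

Every analytics package that reports search keywords relies on some variant of this parsing, which is why an AJAX-only results page would blind them all at once.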

Perhaps the only place you'll be able to get this data is Google Analytics? Is this the next step - a lock-in?

It has happened before.

Remember the changes to AdSense? Google introduced a new form of tracking code that can't be read by third party tools. However, that data is available within Google Analytics.

This obviously puts other tracking vendors at a competitive disadvantage, and signals to the webmaster community just where the ownership of that data lies.

Data Lock In

There appears to be an emerging trend, of late, whereby networks are leveraging their power against the interests of individual webmasters in terms of data ownership. Having been locked out themselves for a few years, the middle men are trying to squeeze their way back in again.

Take a look at the new contracts of GroupM, the world's largest buyer of online media, as detailed in GroupM Revises Terms For All Online Ad Buys, Claims Data Is 'Confidential' on MediaPost:

The wording in GroupM's new T&Cs, which are attached to all the insertion orders and contracts it submits to online publishers beginning this year, amends the current industry standard by adding, the following: "Notwithstanding the foregoing or any other provision herein to the contrary, it is expressly agreed that all data generated or collected by Media Company in performing under this Agreement shall be deemed 'Confidential Information' of Agency/Advertiser......Experts familiar with online advertising contracts say the term is a smoking gun, because it raises a broader industry debate over who actually owns the data generated when an advertiser serves an ad on a publisher's page. Is it the advertiser's data? Is it the agency's data? Is it the publisher's data? Under the current industry standard, the data is considered "co-owned" by all sides of the process, but some believe the new GroupM wording seeks to shift the rights over data ownership exclusively to the advertiser and the agency.

The article also suggests that other ad providers may follow suit. What this may mean is that you can't leverage the data in other ways. You might not even be able to collect it.

Whilst this issue has popped up again of late, it is nothing new. There has long been a battle for consumer data because it is so valuable. The ad networks can create a lot of valuable data as a by-product of their advertising placement, because they can leverage network effects and scale in the way the individual webmaster cannot. Naturally the next step is to lock it up and protect it.

The cost of protecting that data may come at the webmaster's expense. As the MediaPost article asks, who does the data belong to? The publisher or the ad network? Both?

Traditionally, it's been both. But that might be about to change, if the above contract is anything to go by.

Forced Partnerships

Incidentally, other contracts really push the boat out when it comes to depriving webmasters of control. Techcrunch reported that the Glam Network, a large ad provider made up of advertising affiliates, includes this little clause in their contract:

10. Right of First Refusal
a. Notice. If at any time Affiliate proposes to sell, license, lease or otherwise transfer all or any portion of its interest in any of the Affiliate Websites, then Affiliate shall promptly give Glam written notice of Affiliate’s intention to sell....

Essentially, if you want to sell your website, and you've agreed to these terms, then Glam has the right of first refusal on the sale! Nice.

What this all might lead to is less ownership, less control, and less flexibility for the individual webmaster when dealing with big networks.

Or perhaps, in the case of Google, they're going to find other ways to pass data and just haven't outlined how yet.

One to keep a close eye on, methinks...

Google's .edu Domain Love: Department of Economics ≠ Mortgage, or Does It?

Some recent Google shifts have caused a lot of .edu websites to rank for competitive keywords like mortgage and credit card. Here is a screenshot of the top 100 search results for "mortgage" with 57 .edu results and 15 .gov results. And here is a similar credit card screenshot.

Note that few of these pages have any relevant on-page content. Is this a case of Google-bombing? Or did Google dial up the .edu bonus too far?

Does Google want to return all the irrelevant pages? Or does it not matter if they are deep enough in the result set? Will having mystery meat results on pages 2 through 100 hurt Google's brand? Or does everyone just click on the first page?

We discussed this a bit more in the forums: new Google results

Google: Closing the Loop on Content, Advertising, & Commerce

Every listing site or review site has to start off from scratch at some point. Over the past 3 or 4 years it has got much harder to rank thin affiliate database sites, and now that is only going to get harder, with Matt Cutts asking for spam reports on empty review sites.

Of course if Amazon.com or TripAdvisor or Craigslist open new sections they can probably get away with using duplicate or thin content based on the strength of their brands. Branded networks can always throw out a new related niche site and have it be seen as being above board:

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. "Brands are how you sort out the cesspool."

But new competitors are going to have a hard time building the budget and funding the brand exposure needed to rank, because SEO is getting more complex. If you don't have enough brand or enough AdWords spend you pretty much are not going to get the exposure needed to get consumer reviews and rank organically, unless you license/steal/borrow/mix/re-mix content to build an opening "reviews" database. Some software tools, like Web Data Parser, make the process easier, but you still need to wrap everything in some type of value add (good design, mash ups, etc.). Or have great public relations. Or start your site off as an editorial-only play, where you review what interests you, and then move the brand into the reviews space after you get some momentum and an organic traffic flow.

Matt Cutts explained how thin listing pages may be against their guidelines:

Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.
….
Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
….
Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches…

and yet Google crawls form boxes to generate new URLs.
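For publishers who do want to follow the first guideline quoted above, the usual mechanism is a robots.txt rule blocking auto-generated internal search pages. The fragment below is a hypothetical sketch - the paths are illustrative, not drawn from any real site:

```
# Hypothetical robots.txt fragment blocking auto-generated internal
# search result pages from being crawled (paths are illustrative only)
User-agent: *
Disallow: /search
Disallow: /results
```

The irony the paragraph above points at is that Google asks publishers to block such pages while its own crawler submits forms to generate new URLs of exactly that kind.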

Search is growing more subjective, becoming more about competition and expanding the ad channel. Think like a black hat. You have to stay ahead of Google's internal products & services if you want to avoid the spam label.

The shopping search engines/price comparison sites spend enough on AdWords to be considered a value add user experience (they give AdWords a broad backfill baseline inventory which other merchants have to compete against), but if Google can evolve their Product Search into a revenue stream and encourage reviews then many shopping search engines will soon run out of steam.

A Microsoft engineer notes:

I believe that the locus of advertising will gradually shift towards the creation of valuable and compelling content. There is, however, a relative dearth of professionals or companies that can provide such content creation services. Perhaps advertising agencies might evolve in this direction, or perhaps this may be an opportunity for forward-thinking individuals?

Eventually Google will need to become more of a content play if they want to keep growing revenues. This is why...

And if Google co-opts the media, that makes it hard to give them serious negative press. Eric Schmidt thinks the press needs to be more tightly integrated into Google:

I think the solution is tighter integration. In other words, we can do this without making an acquisition. The term I've been using is 'merge without merging.' The Web allows you to do that, where you can get the Web systems of both organizations fairly well integrated, and you don't have to do it on an exclusive basis.

Google's growing depth gives it a huge network advantage. More advertisers = more relevant ads = higher monetization with better user experience & more user loyalty. Microsoft is trying to buy marketshare and will likely push search harder in Windows 7, but it might be too little too late.

Yahoo! screwed up their US advertiser terms of service AND gave up on their international contextual ad service, giving Google yet another competitive advantage.

After reading John Andrews write a great review of Affiliate Summit I got thinking about some of Google's potential moves...

  • give consumers discounts for reviewing merchants and products to quickly build up a leading reviews database
  • broaden the AdWords ad system to allow room for more CPA deals / lead gen inside the SERP
  • offer free hosting and CMS for Google AdWords customers (& track inventory)
  • offer credit cards, or perhaps their own “goog” currency system, pegged to a basket of currencies
  • start buying out leading players in large verticals (Expedia - $2.5B, Bankrate - $600M, Monster.com - $1.2B, and/or WebMD - $1.2B) to strengthen their network advantage

Google Chrome Adds Search Pre-roll Ads

Google Chrome's biggest changes from an SEO perspective are that it...

  • combines the address bar and search box into 1 box
  • shows search suggestion results in that box before searchers see the organic search results

The combination of those 2 means that if Google gains significant browser share, SEOs will need to optimize for search suggest.

Chrome recently came out of beta, so Google can begin buying marketshare from OEMs. With the product only a few months removed from its announcement, they are already adding advertising to it, as highlighted by Danny Sullivan:

If Google can buy significant browser marketshare then such ad positions might add a lot of curiosity clicks for the top ranked advertiser, lowering the ROI for that advertiser.

If this gains traction most brand advertisers should not bid on their brand (unless perhaps they do so using an embedded match - which allows them to block advertising on the exact match but show up for other brand searches). Bidding on your brand's exact match and appearing in such ad positions for brand related search queries would be paying Google for adding zero value to the process.
