I recently got put in a test bucket for Google's new layout with a "search results" bar near the top of the page. Generally this impacts the search results in a couple ways:
First off, it is a much better-looking design. In the past, when the search results would move up and down with Google Instant, it really felt like a hack rather than something you would expect on the leading internet company's main website. Now, with the results fixed in place, the page feels much cleaner & better put together.
The more stable basic layout of the SERP will allow Google to integrate yet more vertical data into it while keeping it looking & feeling decent. Google may have offered localized search suggestions & localized organic results for a significant period of time, but combining them with this new layout, where the search results don't move, feels much more cohesive.
To get the white space right on the new layout, Google shifted from offering 5 Instant suggestions to 4. The Google Instant results don't disappear unless you hit enter, but because the interface doesn't change & move, there isn't as much need to hit enter. The search experience feels more fluid.
The horizontal line above the search results and the word "Search" in red in the upper left of the page is likely to pull some additional attention toward Google's vertical search features, helping Google to collect more feedback on them (and further use that user behavior to create a signal to drive further integration of the verticals into the regular organic search results).
On the flip side of this, in the past the center column would move up & down while the right column would remain stationary, so I would expect this to slightly diminish right column ad clicks (that appeared at the top even when the organic results moved downward) while boosting center column clicks to offset that.
In the past, when Google Instant would disappear from view, that would pull the center column organic results up a bit.
This always-on bar shifts the pixels above the first search result from about 123 to 184...so roughly 60 pixels downward.
As a baseline, a standard organic listing with no extensions is about 90 pixels tall, so this moves the search results down roughly 2/3 of a listing, which should drive more traffic to the top paid search ads & less to the organic results below them (offset by any diminished clicks on the right column ads).
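The arithmetic above can be sketched quickly. The figures are the approximate pixel measurements from the text, so treat the output as a rough estimate rather than an exact rendering measurement:

```python
# Rough math behind the layout shift described above.
old_offset_px = 123     # approx. pixels above the first organic result, old layout
new_offset_px = 184     # approx. pixels above the first organic result, new layout
listing_height_px = 90  # approx. height of a plain organic listing, no extensions

shift_px = new_offset_px - old_offset_px
shift_in_listings = shift_px / listing_height_px

print(shift_px)                      # 61 pixels, i.e. roughly 60
print(round(shift_in_listings, 2))   # 0.68, i.e. roughly 2/3 of a listing
```

In other words, the always-on bar pushes everything below it down by about two-thirds of one plain organic listing, which is why the top paid ads stand to gain at the expense of the organic results beneath them.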
I tried to line up the new test results pretty closely to show what they look like with the Google Instant results showing & after you hit enter. Mouse over the image below to see how the result layout barely changes whether Google Instant is hidden or extended.
And here is an example image showing how the location is sometimes inserted directly into both the organic search results and the search suggestions.
Here is an image using Google's browser size tool to show how end users see the new search results. Note that in this example I used a keyword where Google has comparison/advisor ads, so in markets where they do not yet have those you would move all the organic results one spot up from what is shown below.
In the past doorway pages could be loosely defined as "low-quality pages designed to rank for highly targeted search queries, typically designed to redirect searchers to a page with other advertisements."
The reason they are disliked is the "click circus" effect they have on web users, who end up clicking through what feels like an infinite loop of ads.
This would be a perfect example of that type of website:
A friend of mine told me that the reason CSN stores had to merge into a "brand" was not just because that was the direction of the algorithm, but also because they were hit with the "doorway page" penalty. I don't know if that is 100% accurate, but it sure sounds plausible, given that... (UPDATE: SEE COMMENTS BELOW)
recently multiple friends have told me they were hit with the "doorway page" issue
on WebmasterWorld there are multiple threads from small ecommerce players suggesting they were hit with the doorway page issue
"Today we received messages in our webmaster tools account, for all but 1 of our 20 domains, indicating that Google considers them doorway pages. We have also lost all of our SERP's for those sites." - Uncle_DK
"I was rather disappointed to see that before banning the site the rater visited a very drab and ordinary page on my site. Not a smoking gun of some incriminating evidence of a hacker break-in or some such I was looking for. Also disappointing is the fact that they visited one page only." - 1script
another friend today told me that one of their clients runs numerous websites & that ALL of the sites in the Google webmaster tools account blew up, getting hit with the "doorway page" label (and ALL the sites that were not in that webmaster tools account were missed by the Google engineers)
Like almost anything else Google offers, their webmaster tools are free, right up until Google changes their business objectives and one of their engineers decides that he should put you out of business.
I *knew* the point of the Panda update was not to kill content farms, but to use content farms as a convenient excuse to thin the herd of webmasters & consolidate markets. A couple obvious tells on that front were:
Young remembers begging Wal-Mart for relief. "They said, 'No way,' " says Young. "We said we'll increase the price"--even $3.49 would have helped tremendously--"and they said, 'If you do that, all the other products of yours we buy, we'll stop buying.' It was a clear threat."
Finally, Wal-Mart let Vlasic up for air. "The Wal-Mart guy's response was classic," Young recalls. "He said, 'Well, we've done to pickles what we did to orange juice. We've killed it. We can back off.' " Vlasic got to take it down to just over half a gallon of pickles, for $2.79. Not long after that, in January 2001, Vlasic filed for bankruptcy.
P&G's roll out of Gain dish soap says a lot about the health of the American middle class: The world's largest maker of consumer products is now betting that the squeeze on middle America will be long lasting.
As far as publishing business models go, if Google starts calling ecommerce sites that are part of a network "doorway sites" then Google isn't really allowing that sort of testing, unless the content comes from a Fortune 500 or is content conveniently hosted on Google.com. As a publisher or merchant, how do you ever grow to scale if you are not allowed to test running multiple projects & products in parallel & keep reinvesting in whatever works best?
Even the biggest publishers are breaking some of their core brands into multiple sites (eg: Boston.com vs BostonGlobe.com) to test different business models. If you have scale that is fine, but if you are smaller that same strategy might soon be considered a black hat doorway strategy.
"The future is already here — it's just not very evenly distributed." - William Gibson
Not only do they monetize via AdWords, but Google has 6 listings in the "organic" search results.
Any Google search engineer care to have a public debate as to the legitimacy of that search result set?
If an SEO gets half of the search results (for anything other than his own brand) he is an overt spammer. If Google eats half of the search results with duplicating nepotism across their own house "brands" then it is legitimate.
Making the above even worse, smaller niche brands are regularly disappeared from Google's index. Google has the ability to redirect search intent to one that is easier to monetize & more along a path they approve of. I was searching for a post John Andrews (webmaster of johnon.com) wrote about Google censorship & what did Google do? They used their power over the dictionary to change the words I searched for on the fly & then promoted their ebooks offering yet again.
Note that listings 1 & 2 promote the exact same book. Google just lists the content they scraped into multiple categories that deserve to be showcased multiple times. How many ways did Google screw up the above search result?
they auto-corrected the search query to an unwanted alternate search
in spite of auto-correction, they still allowed their other verticals to be inserted in the results inline right near the top (when rare longtail searches are auto-corrected, one would expect them to be more adverse to embedding such an aggressive self-promotion in the search results)
they associate content hosted by them as being about their brand simply because they host it (even though that piece of content has no relation to them outside of them scraping it)
they list it not once but twice, right at the top of the results (even though it is duplicate content available elsewhere & both pages are the same on Google, with the exception of one promoting a recent version of the book & the other page promoting a decade older version of the exact same book)
As a publisher you are *required* to keep spending more money on deeper editorial to avoid being labeled as spam or tripping some arbitrary "algorithmic" threshold. And as you do so, Google is humping you from the backside to ensure your profit margins stay low, scraping whatever they can within the limits of the law & operating the types of websites that would be considered spam if anyone else ran them. Once regulatory pressures or public opinion catch on to Google's parasitic behavior, they buy a brand & leverage its content to legitimize their (often) illegitimate enterprise. :)
Oh, and how about a quote from the Censored Screams book: "censorship, like charity, should begin at home, but, unlike charity, it should end there."
Smart SEOs have been preaching brand for years and years now (& so has Google if you read between the lines).
For some time Google has appended prior search queries in AdWords. In some cases they also show ads for related search queries, append your location to the search query for localization, spell-correct search results based on common search trends, and (as the Vince update showed) they can also use search query chains & brand related searches as a signal.
As far back as 2008 Google suggested previous query refinements for organic search, but they haven't been very prevalent thus far IMHO.
While researching another article, I was searching for some browsers to see how search engines were advertising on various keywords, and after searching for Firefox I later searched for SEO & saw the following:
This is yet another way brand familiarity can boost rankings. Not only are you likely to score higher on generic search queries (due to the Vince & Panda updates), but having a well-known brand also makes Google more likely to recommend your brand as a keyword to suggest in Google Instant, makes people more likely to +1 your site, and it now also can impact the related organic search results further if people search for that brand shortly before searching for broader industry keywords.
Yet another problem with Google's brand first approach to search: parasitic hosting.
The .co.cc subdomain was removed from the Google index due to excessive malware and spam. Since .co.cc wasn't a brand, the spam on the domain was enough to get it removed. But as Google keeps dialing up the "brand" piece of the algorithm, there is a lot of stuff on sites like Facebook or even my.Opera that is really flat out junk.
And it is dominating the search results category after category. Spun text remixed together with pages uploaded by the thousand (or million, depending on your scale). Throw a couple links at the pages and watch the rankings take off!
Here is where it gets tricky for Google though...Youtube is auto-generating garbage pages & getting that junk indexed in Google.
While under regulatory review for abuse of power, how exactly does Google go after Facebook for pumping Google's index with spam when Google is pumping Google's index with spam? With a lot of the spam on Facebook at least Facebook could claim they didn't know about it, whereas Google can't claim innocence on the Youtube stuff. They are intentionally poisoning the well.
There is no economic incentive for Facebook to demote the spammers as they are boosting user account stats, visits, pageviews, repeat visits, ad views, inbound link authority, brand awareness & exposure, etc. Basically anything that can juice momentum and share value is reflected in the spam. And since spammers tend to target lucrative keywords, this is a great way for Facebook to arbitrage Google's high-value search traffic at no expense. And since it pollutes Google's search results, it is no different than Google's Panda-hit sites that still rank well in Bing. The enemy of my enemy is my friend. ;)
Even if Facebook wanted to stop the spam, it isn't particularly easy to block all of it. eBay has numerous layers of data they collect about users in their marketplace, they charge for listings, & yet stuff like this sometimes slides through.
And then there are even listings that warn against the scams as an angle to sell information.
But even some of that is suspect, as you can't really "fix" fake Flash memory to make the stick larger than it actually is. It doesn't matter what the bootleg packaging states...it's what is on the inside that counts. ;)
When people can buy Facebook followers for next to nothing & generate tons of accounts on the fly there isn't much Facebook could do to stop them (even if they actually wanted to). Further, anything that makes the sign up process more cumbersome slows growth & risks a collapse in share prices. If the stock loses momentum then their ability to attract talent also drops.
Since some of these social services have turned to mass emailing their users to increase engagement, their URLs are being used to get around email spam filters.
Stage 2 of this parasitic hosting problem is when the large platforms move from turning a blind eye to the parasitic hosting to engaging in it directly themselves. In fact, some of them have already started.
According to Compete.com, Youtube referrals from Google were up over 18% in May & over 30% in July! And Facebook is beginning to follow suit.
Want a good example of Google's brand-bias stuff being a bunch of bs?
Niche expert value-add affiliate websites may now lack the brand signal to rank as the branded sites rise up above them, so what comes next?
Off-topic brands flex their brand & bolt on thin affiliate sections.
Overstock.com was penalized for having a spammy link profile (in spite of being a brand they were so spammy that they were actually penalized, counter to Google's cultural norm) but a few months later the penalty was dropped, even though some of the spam stuff is still in place.
Those who were hit by Panda are of course still penalized nearly a half-year later, but Overstock is back in the game after a shorter duration of pain & now they are an insurance affiliate.
While most of the content farms were decimated, that left a big hole in the search results that will allow the Huffington Post to double or triple the yield of their content through additional incremental reach.
Before I go on, let me stop and say a couple of more important things: Aol, Aol Acquires Huffington Post, Aol Buys Huffington Post, Aol Buys Huffpo, Aol Huffington Post, Huffington Post, Huffington Post Aol, Huffington Post Aol Merger, Huffington Post Media Group, Huffington Post Sold, Huffpo Aol, Huffpost Aol, Media News.
See what I did there? That's what you call search-engine optimization, or SEO. If I worked at the Huffington Post, I'd likely be commended for the subtle way in which I inserted all those search keywords into the lede of my article.
I was given eight to ten article assignments a night, writing about television shows that I had never seen before. AOL would send me short video clips, ranging from one-to-two minutes in length — clips from “Law & Order,” “Family Guy,” “Dancing With the Stars,” the Grammys, and so on and so forth… My job was then to write about them. But really, my job was to lie. My job was to write about random, out-of-context video clips, while pretending to the reader that I had watched the actual show in question. AOL knew I hadn’t watched the show. The rate at which they would send me clips and then expect articles about them made it impossible to watch all the shows — or to watch any of them, really.
Doing fake reviews? Scraping content? Putting off-topic crap on a site to monetize it?
Those are the sorts of things Google claims the spammy affiliates & SEOs do, but the truth is they have never been able to do them to the scale the big brands have. And from here out it is only going to get worse.
We highlighted how Google was responsible for creating the content farm business model. Whatever comes next is going to be bigger, more pervasive, and spammier, but coated in a layer of "brand" that magically turns spam into not-spam.
Imagine where this crap leads in say 2 or 3 years?
It won't be long before Google is forced to see the error of their ways.
What Google rewards they encourage. What they encourage becomes a profitable trend. If that trend is scalable then it becomes a problem shortly after investors get involved. When that trend spirals out of control and blows up they have to try something else, often without admitting that they were responsible for causing the trend. Once again, it will be the SEO who takes the blame for bad algorithms that were designed divorced from human behaviors.
I am surprised Google hasn't hired someone like a Greg Boser or David Naylor as staff to explain how people will react to the new algorithms. It would save them a lot of work in the long run.
Disclosure: I hold no position in AOL's stock, but I am seriously considering buying some. When you see me personally writing articles on Huffington Post you will know it's "game on" for GoogleBot & I indeed am a shareholder. And if I am writing 30 or 40 articles a day over there that means I bought some call options as well. :D
My personal preference on subdomains vs. subdirectories is that I usually prefer the convenience of subdirectories for most of my content. A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such news.google.com or maps.google.com, for example. If you’re a newer webmaster or SEO, I’d recommend using subdirectories until you start to feel pretty confident with the architecture of your site. At that point, you’ll be better equipped to make the right decision for your own site.
Even though subdirectories were the "preferred" default strategy, they are now the wrong strategy. What was once a "best practice" is now part of the problem, rather than part of the solution.
Not too far before Panda came out we were also told that we can leave it to GoogleBot to sort out duplicate content. A couple examples here and here. In those videos (from as recent as March 2010) are quotes like:
"What we would typically do is pick what we think is the best copy of that page and keep it, and we would have the rest in our index but we wouldn't typically show it, so it is not the case that these other pages are penalized."
"Typically, even if it is considered duplicate content, because the pages can be essentially completely identical, you normally don't need to worry about it, because it is not like we cause a penalty to happen on those other pages. It is just that we don't try to show them."
I believe if you were to talk to our crawl and indexing team, they would normally say "look, let us crawl all the content & we will figure out what parts of the site are dupe (so which sub-trees are dupe) and we will combine that together."
I would really try to let Google crawl the pages and see if we could figure out the dupes on our own.
Now people are furiously rewriting content, noindexing, blocking with robots.txt, using subdomains, etc.
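For reference, the two blocking mechanisms mentioned above look something like this. The paths and section names are illustrative only, not taken from any particular site:

```
# robots.txt -- tells crawlers not to fetch a section at all
User-agent: *
Disallow: /thin-duplicate-section/

<!-- meta robots noindex -- placed in the <head> of an individual page;
     the page is still crawled, but is kept out of the index -->
<meta name="robots" content="noindex, follow">
```

Note the difference: a robots.txt Disallow stops crawling but a blocked URL can still appear in the index based on links pointing at it, while a noindex meta tag requires the page to be crawled before it can take effect.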
Google's advice is equally self-contradicting and self-serving. Worse yet, it is both reactive and backwards looking.
You follow best practices. You get torched for it. You are deciding how many employees to fire & if you should simply file bankruptcy and be done with it. In spite of constantly being led astray by Google, you look to them for further guidance and you are either told to sit & spin, or are given abstract pablum about "quality."
Everything that is now "the right solution" is the exact opposite of the "best practices" from last year.
And the truth is, this sort of shift is common, because as soon as Google openly recommends something people take it to the Nth degree & find ways to exploit it, which forces Google to change. So the big problem here is not just that Google gives precise answers where broader context would be helpful, but also that they drastically and sharply change their algorithmic approach *without* updating their old suggestions (that are simply bad advice in the current marketplace).
It is why the distinction between a subdirectory and subdomain is both 100% arbitrary AND life changing.
Meanwhile select companies have direct access to top Google engineers to sort out problems, whereas the average webmaster is told to "sit and spin" and "increase quality."
The only ways to get clarity from Google on issues of importance are to:
ignore what Google suggests & test what actually works, OR
publicly position Google as a monopolist abusing their market position
How valuable is that email? How about "not at all" for $500 Alex?
I think Fantomaster is brilliant, but I would much rather read one of his blog posts than dozens or hundreds of Tweets. Sure, knowing that 9,000 people might have seen a message can be comforting, but 1 blog post will get you way more views (and with far deeper context & meaning).
How to Test the Value of Social Media
Want to see big numbers get small quickly? Try charging anything...as little as $1 & you will quickly see that social media is mostly garbage. Alternatively, try giving away $5 or paying for the in-stream ads that directly manipulate relevancy & once again you will see how worthless social media is as a signal...something that anyone can quickly buy.
Online reputation is important to most researchers, and about 10% of respondents to our survey complained that they or their work have been misrepresented on the Internet. The web has a long memory, and rumours, lies and bad information can spiral out of control to be remembered by posterity.
For a lot of businesses the social media stuff will be nothing but blood & tears: a resource drain into which money, time & hope get poured with nothing coming out the other end.
That said, I don't think ignoring it is a wise decision at this point. The best tip I have for most people is to try to set up automated systems that help your social signal grow automatically. That can mean onsite integration & perhaps a small token amount of advertising. Beyond that it is probably only participating as much as you enjoy it. And if you are more of a huckster/PR type, pay attention to how folks like Jason Calacanis leverage these channels.
The second best tip would be measure it with stats that actually matter. Revenue and profit are important. Time spent tracking the number of retweets is probably better spent building more content or improving your business in other ways. If you have something that works the rabbit hole goes deep, but if it isn't working then it is likely better spending your time being a bigger fish in a smaller pond.
Inspired by Barry's implementation, we recently added the social buttons to the left rail of the site. That is probably one of the best types of integration you can do, because it is out of the way for those who don't know what it is and/or want to ignore it, but it stays right in the same spot (always visible) for anyone who is interested in those types of buttons.
What Makes the Web Great (for Small Businesses)
Two things that make the web great are the ability to fail fast (and cheaply) & the ability to focus deeply. If social media increases the operating cost (being yet another hoop you have to jump through) & robs valuable attention that could go into your website then it is 0-for-2.
It remains to be seen if an author will be able to carry his or her trust with them to their next gig, but if they can then that would make the media ecosystem more fluid & pull some amount of power away from traditional publishers. Some publishers are suggesting putting their book content online as HTML pages...well if they are doing that then why doesn't the author just install Wordpress and keep more of the value chain themselves (like J.K. Rowling just did)?
Their answer to that question is generally "no" but that they would even ask themselves that question is fallacious Orwellian duplicity.
Would you trust the local plumber to work on your house if he was posting "exciting viral content" online about how some projects went astray? Now every plumber needs to become a marketing expert to not get driven off the web by Roto-Rooter & other chain-styled companies that can collect +1 signals from all their employees & some of their customers across the country or around the world.
Google knows they are tilting the search game toward those who have money. They even flaunt it in their display ads!
Google itself explains that not allowing its device maker partners to ship Skyhook's software was just, the way Google describes it, a necessary measure to prevent damage (Google says "detriment", which is the Anglicized version of the Latin word for "damage") from being caused to the whole ecosystem.
But Google does not want to allow Oracle to control Java the way Google controls Android.
Google today is saying that "social media is important." Just look at their wave of product announcements & their bonus structure.
I loathe the approach (and the message), but I accept it. ;)
The idea behind such Cassandra calls is that the web should be graded based on merit, rather than who has the largest ad budget. The Google founders harped on this in their early research:
we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.
Google is not the only search engine in town, and they have been less forthcoming with their own behavior than what they demand of others.
Ads as Content
Both SugarRae and I have highlighted how Google's investment in VigLink is (at best) sketchy given Google's approach to non-Google affiliate links. And now Google's "ads as content" program has spread to Youtube, where Google counts ad views as video views. The problem with this is that any external search service has no way to separate out which views were organic & which were driven by paid exposure.
(Google has access to that data since they charge the advertisers for each view, but there is no way for any external party to access that data, or know how Google is using it other than what Google states publicly).
That is the *exact* type of pollution Google claimed would undermine the web. But it is only bad when someone is doing it to Google (rather than the other way around).
Youtube = Wikipedia + Wikipedia + Wikipedia
As independent webmasters it can be annoying seeing Wikipedia rank for everything under the sun, but after Google's "universal search" push Youtube is far more dominant than Wikipedia. When the Panda update happened Youtube was around 4% of Google's downstream traffic. Youtube has grown their Google-referred traffic by about 4% a month since Panda, up until last month, in which it grew by 18.3% according to Compete.com. That now puts Youtube at over 5% of Google's downstream traffic (over 3x as much traffic as Wikipedia gets from Google)!
1 in 20 downstream clicks is landing onto a nepotistic property where Google has blurred the lines between ads and content, making it essentially impossible for competing search services to score relevancy (in addition to making features inaccessible, the data that is accessible is polluted). It is unsurprising that Youtube is a significant anti-trust issue:
Google acquired YouTube—and since then it has put in place a growing number of technical measures to restrict competing search engines from properly accessing it for their search results. Without proper access to YouTube, Bing and other search engines cannot stand with Google on an equal footing in returning search results with links to YouTube videos and that, of course, drives more users away from competitors and to Google.
Google promotes "openness" wherever they are weak, and then they erect proprietary barriers to erode competitive threat wherever they are strong.
At some point it is hard to operate as a monopoly without being blindingly hypocritical. And this is at the core of why Google's leading engineers feel the need to write guest articles in Politico & Eric Schmidt is working directly with governments to prevent regulatory action. They understand that if they curry favor they can better limit the damage and have more control of what sacrificial anodes die in the eventual anti-trust proceedings.
Is Google Lying Again?
As a marketer & a publisher you can go bankrupt before governments react to monopolies. Thus you need to decide what risks are worthwhile & what suggestions carry any weight.
Here is the litmus test for "is this piece of information from Google more self-serving garbage" ... does Google apply the same principles to itself in markets it is easily winning AND markets it is losing badly?
If their suggestion doesn't apply to Google across-the-board then you can safely ignore it as more self-serving drivel from a monopolist.
As long as end users get their Angry Birds they really don't care how it comes to them. But they should!
Right now, an aggregated link to this entry places higher in a search of the title than my own site, which is a Page Rank 5 site (i.e. it has a lot of "strong" links in and a lot of content). That is a snapshot of what is pushing the Media to spewing ever louder and more meaningless sounds and furies.
search engine offshoots have failed the nation, profoundly, deeply and irrevocably. - Charles Hugh Smith
If you are the first person in your vertical to leverage these new formats, they can help your listing look more appealing & help you capture a bit more of the traffic (for a while). But after a half-dozen sites in your vertical use them, it is no longer a competitive advantage, rather just an added cost of doing business (just like Google Checkout or +1).
Then eventually it becomes much worse. Rather than being a "top resource" you get to become a "top reference" (unlinked, of course).
Your content ends up in the search result and you are an unneeded artifact from the quaint & early days of the web.
"Many answers to search queries can be computed, rather than simply returning a list of links from an index." - Eric Schmidt
Google Places is at it again, brazenly borrowing reviews from Yelp. But this time it’s in their iPhone app and they are not even bothering to link back to Yelp or attribute where they are getting the reviews.
Apparently the issue is also happening with other sources of reviews and local data such as TripAdvisor. Google says it is a mistake and it is fixing it.
If you go outside Google's guidelines & they try to penalize you for it, simply remind them that it was a technical glitch, a misinterpretation, an accident. No need to worry, as you will fix it on your end at your leisure.
It is almost universally far more profitable to do what Google does, rather than to do what they tell you to do. A fact many webmasters are waking up to 100 days after Panda torched their websites.
On a related note, JC Penney (which flagrantly violated Google's guidelines with bulk link buying) was allowed to rank again after 90 days.
"You don’t want to be vindictive or punitive, so after three months the penalty was lifted." - Matt Cutts
Those that were hit by Panda are still left in the lurch over 100 days later.
If penalizing for greater than 90 days for flagrant guideline violations would be considered "vindictive" or "punitive" then how would one describe a 100-day penalty for not breaking the guidelines?
As the original content sources disappear from the web, the aggregators eat more clicks & get fatter on the no-cost, no-effort profits (in some cases their duplication not only replaces the original source, but drives the original source into bankruptcy, making the duplicate become "unique" content). Youtube's traffic from Google has grown over 4% a month for a few months in a row. Ask grew their Google search referral traffic by roughly 25% in a couple months (while starting from a rather large base).
Keep working on adding quality and value. Then mark up your work. Google will keep working on sucking profits out of the ecosystem.