Then more ads below it. Then a single organic listing with huge sublinks too. And unless you have a huge monitor, at that point you are "below the fold."
Negative advertising in AdWords is not allowed. So long as you build enough brand signals & pay the Google toll booth, public relations issues & reputation issues won't be accessible to searchers unless they learn to skip over the first screen of search results.
While it is generally against Google's TOS for advertisers to double dip in AdWords (outside of the official prescribed oversize ad units highlighted above), Google is doing exactly that with their multitude of brands.
Another friend sent me a message today: "just got a whole swathe of non-interlinked microsites torched today. Bastard! Just watching the rank reports coming in..."
I haven't seen his sites, but based on how he described them ("whole swathe") I wouldn't guess the quality was super high. One thing you could say for them was that they were unique.
Where putting in the effort to create original content falls flat on its face is when search engines choose to rank aggregators (or later copies) above the original source. The issue has gotten so out of hand that Google has come right out & asked for help with it.
Some Google+ SEO factors now trump linking as the prime algo ingredient. Google+ is already and clearly influencing rankings. I watched a presentation last night that definitely showed that rankings can occur from Google+ postings and photos with no other means of support.
As Google+ grows - so will Google's understanding of how to use it as rankings signals.
We are not playing Google+ because we want to - we are playing Google+ because we have to.
I read that sorta half hoping he was wrong, but know he rarely is.
And then today Google hit me across the head with a 2x4, proving he was right again.
I recently got put in a test bucket for Google's new layout with a "search results" bar near the top of the page. Generally this impacts the search results in a couple ways:
First off, it is a much better looking design. In the past, when the search results would move up and down with Google Instant, it really felt like a hack rather than something you would see on the leading internet company's main website. Now, with the results fixed in place, it feels much cleaner & much better put together.
The more stable basic layout of the SERP will allow Google to integrate yet more vertical data into it while still making it look & feel decent. Google may have had localized search suggestions & organic results for a significant period of time, but the combination of them with this new layout, where the search results don't move, feels much more cohesive.
To get the white space right on the new layout Google shifted from offering 5 Instant suggestions to 4. The Google Instant results don't disappear unless you hit enter, but because the interface doesn't change & move there isn't as much need to click enter. The search experience feels more fluid.
The horizontal line above the search results and the word "Search" in red in the upper left of the page is likely to pull some additional attention toward Google's vertical search features, helping Google to collect more feedback on them (and further use that user behavior to create a signal to drive further integration of the verticals into the regular organic search results).
On the flip side of this, in the past the center column would move up & down while the right column would remain stationary, so I would expect this to slightly diminish right column ad clicks (that appeared at the top even when the organic results moved downward) while boosting center column clicks to offset that.
In the past, when Google Instant would disappear from view, that would pull the center column organic results up a bit.
This always-on bar shifts the pixels above the first search result from about 123 to 184...so roughly 60 pixels downward.
As a baseline, a standard organic listing with no extensions is about 90 pixels tall, so this moves the search results down roughly 2/3 of a listing, which should drive more traffic to the top paid search ads & less to the organic results below them (offset by any diminished clicks on the right column ads).
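The back-of-the-envelope arithmetic above can be sketched out explicitly (the pixel values are the approximate measurements cited in this post, not exact figures):

```python
# Rough arithmetic behind the layout shift described above.
# Pixel values are the approximate measurements from the post.
pixels_above_before = 123  # space above first organic result, old layout
pixels_above_after = 184   # space above first organic result, new layout
listing_height = 90        # approx. height of a plain organic listing

shift = pixels_above_after - pixels_above_before  # ~61px downward
fraction_of_listing = shift / listing_height      # ~0.68, i.e. roughly 2/3

print(f"results pushed down {shift}px, about {fraction_of_listing:.2f} of a listing")
```

In other words, the first organic result loses about two thirds of a listing's worth of screen real estate before a single competing ad unit is even counted.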
I tried to line up the results pretty closely on the new test results to show what they look like with the Google Instant results showing & after you hit enter. Scroll over the image below to see how the result layout doesn't really change with Google Instant hidden or extended.
And here is an example image showing how the location is sometimes inserted directly into both the organic search results and the search suggestions.
Here is an image using Google's browser size tool to show how end users see the new search results. Note that in this example I used a keyword where Google has comparison/advisor ads, so in markets where they do not yet have those you would move all the organic results one spot up from what is shown below.
In the past doorway pages could be loosely defined as "low-quality pages designed to rank for highly targeted search queries, typically designed to redirect searchers to a page with other advertisements."
The reason they are disliked is the "click circus" impact they have on web users, who keep clicking through an infinite loop of ads.
This would be a perfect example of that type of website:
A friend of mine told me that the reason CSN stores had to merge into a "brand" was not just because that was the direction of the algorithm, but also because they were hit with the "doorway page" penalty. I don't know if that is 100% accurate, but it sure sounds plausible, given that... (UPDATE: SEE COMMENTS BELOW)
recently multiple friends have told me they were hit with the "doorway page" issue
on WebmasterWorld there are multiple threads from small ecommerce players suggesting they were hit with the doorway page issue
"Today we received messages in our webmaster tools account, for all but 1 of our 20 domains, indicating that Google considers them doorway pages. We have also lost all of our SERP's for those sites." - Uncle_DK
"I was rather disappointed to see that before banning the site the rater visited a very drab and ordinary page on my site. Not a smoking gun of some incriminating evidence of a hacker break-in or some such I was looking for. Also disappointing is the fact that they visited one page only." - 1script
another friend today told me that one of their clients runs numerous websites & that ALL of the sites in the Google webmaster tools account blew up, getting hit with the "doorway page" label (and ALL the sites that were not in that webmaster tools account were missed by the Google engineers)
Like almost anything else Google offers, their webmaster tools are free, right up until Google changes their business objectives and one of their engineers decides that he should put you out of business.
I *knew* the point of the Panda update was not to kill content farms, but to use content farms as a convenient excuse to thin the herd of webmasters & consolidate markets. A couple obvious tells on that front were:
Young remembers begging Wal-Mart for relief. "They said, 'No way,' " says Young. "We said we'll increase the price"--even $3.49 would have helped tremendously--"and they said, 'If you do that, all the other products of yours we buy, we'll stop buying.' It was a clear threat."
Finally, Wal-Mart let Vlasic up for air. "The Wal-Mart guy's response was classic," Young recalls. "He said, 'Well, we've done to pickles what we did to orange juice. We've killed it. We can back off.' " Vlasic got to take it down to just over half a gallon of pickles, for $2.79. Not long after that, in January 2001, Vlasic filed for bankruptcy.
P&G's roll out of Gain dish soap says a lot about the health of the American middle class: The world's largest maker of consumer products is now betting that the squeeze on middle America will be long lasting.
As far as publishing business models go, if Google starts calling ecommerce sites that are part of a network "doorway sites" then Google isn't really allowing that sort of testing, unless the content comes from a Fortune 500 or is conveniently hosted on Google.com. As a publisher or merchant, how do you ever grow to scale if you are not allowed to test running multiple projects & products in parallel & keep reinvesting in whatever works best?
Even the biggest publishers are breaking some of their core brands into multiple sites (eg: Boston.com vs BostonGlobe.com) to test different business models. If you have scale that is fine, but if you are smaller that same strategy might soon be considered a black hat doorway strategy.
"The future is already here — it's just not very evenly distributed." - William Gibson
Not only do they monetize via AdWords, but Google has 6 listings in the "organic" search results.
Any Google search engineer care to have a public debate as to the legitimacy of that search result set?
If an SEO gets half of the search results (for anything other than his own brand) he is an overt spammer. If Google eats half of the search results with duplicating nepotism across their own house "brands" then it is legitimate.
Making the above even worse, smaller niche brands are regularly disappeared from Google's index. Google has the ability to redirect search intent to a query that is easier to monetize & more along a path they approve of. I was searching for a post John Andrews (webmaster of johnon.com) wrote about Google censorship & what did Google do? They used their power over the dictionary to change the words I searched for on the fly & then promoted their ebooks offering yet again.
Note that listings 1 & 2 promote the exact same book. Google just lists the content they scraped into multiple categories that deserve to be showcased multiple times. How many ways did Google screw up the above search result?
they auto-corrected the search query to an unwanted alternate search
in spite of auto-correction, they still allowed their other verticals to be inserted in the results inline right near the top (when rare longtail searches are auto-corrected, one would expect them to be more averse to embedding such an aggressive self-promotion in the search results)
they associate content hosted by them as being about their brand simply because they host it (even though that piece of content has no relation to them outside of them scraping it)
they list it not once but twice, right at the top of the results (even though it is duplicate content available elsewhere & both pages are the same on Google, with the exception of one promoting a recent version of the book & the other page promoting a decade older version of the exact same book)
As a publisher you are *required* to keep spending more money on deeper editorial to avoid being labeled as spam or tripping some arbitrary "algorithmic" threshold. And as you do so, Google is humping you from the backside to ensure your profit margins stay low, scraping whatever they can within the limits of the law & operating the types of websites that would be considered spam if anyone else ran them. Once regulatory pressures or public opinion catch on to Google's parasitic behavior, they buy a brand & leverage its content to legitimize their (often) illegitimate enterprise. :)
Oh, and how about a quote from the Censored Screams book: "censorship, like charity, should begin at home, but, unlike charity, it should end there."
Smart SEOs have been preaching brand for years and years now (& so has Google if you read between the lines).
For some time Google has appended prior search queries in AdWords. In some cases they also show ads for related search queries, append your location to the search query for localization, spell-correct search results based on common search trends, and (as the Vince update showed) they can also use search query chains & brand related searches as a signal.
As far back as 2008 Google suggested previous query refinements for organic search, but they haven't been very prevalent thus far IMHO.
While researching another article, I was searching for some browsers to see how search engines were advertising on various keywords, and after searching for Firefox I later searched for SEO & saw the following:
This is yet another way brand familiarity can boost rankings. Not only are you likely to score higher on generic search queries (due to the Vince & Panda updates), but having a well-known brand also makes Google more likely to recommend your brand as a keyword to suggest in Google Instant, makes people more likely to +1 your site, and it now also can impact the related organic search results further if people search for that brand shortly before searching for broader industry keywords.
Yet another problem with Google's brand first approach to search: parasitic hosting.
The .co.cc subdomain was removed from the Google index due to excessive malware and spam. Since .co.cc wasn't a brand, the spam on the domain was too much. But as Google keeps dialing up the "brand" piece of the algorithm, there is a lot of stuff on sites like Facebook or even my.Opera that is really flat out junk.
And it is dominating the search results category after category. Spun text remixed together with pages uploaded by the thousand (or million, depending on your scale). Throw a couple links at the pages and watch the rankings take off!
Here is where it gets tricky for Google though...Youtube is auto-generating garbage pages & getting that junk indexed in Google.
While under regulatory review for abuse of power, how exactly does Google go after Facebook for pumping Google's index with spam when Google is pumping Google's index with spam? With a lot of the spam on Facebook at least Facebook could claim they didn't know about it, whereas Google can't claim innocence on the Youtube stuff. They are intentionally poisoning the well.
There is no economic incentive for Facebook to demote the spammers as they are boosting user account stats, visits, pageviews, repeat visits, ad views, inbound link authority, brand awareness & exposure, etc. Basically anything that can juice momentum and share value is reflected in the spam. And since spammers tend to target lucrative keywords, this is a great way for Facebook to arbitrage Google's high-value search traffic at no expense. And since it pollutes Google's search results, it is no different than Google's Panda-hit sites that still rank well in Bing. The enemy of my enemy is my friend. ;)
Even if Facebook wanted to stop the spam, it isn't particularly easy to block all of it. eBay has numerous layers of data they collect about users in their marketplace, they charge for listings, & yet stuff like this sometimes slides through.
And then there are even listings that warn against the scams as an angle to sell information.
But even some of that is suspect, as you can't really "fix" fake Flash memory to make the stick larger than it actually is. It doesn't matter what the bootleg packaging states...it's what is on the inside that counts. ;)
When people can buy Facebook followers for next to nothing & generate tons of accounts on the fly there isn't much Facebook could do to stop them (even if they actually wanted to). Further, anything that makes the sign up process more cumbersome slows growth & risks a collapse in share prices. If the stock loses momentum then their ability to attract talent also drops.
Since some of these social services have turned to mass-emailing their users to increase engagement, their URLs are being used to get around email spam filters.
Stage 2 of this parasitic hosting problem is when the large platforms move from turning a blind eye to the parasitic hosting to engaging in it directly themselves. In fact, some of them have already started.
According to Compete.com, Youtube referrals from Google were up over 18% in May & over 30% in July! And Facebook is beginning to follow suit.
Want a good example of Google's brand-bias stuff being a bunch of bs?
Niche expert value-add affiliate websites may now lack the brand signal to rank as the branded sites rise up above them, so what comes next?
Off-topic brands flex their brand & bolt on thin affiliate sections.
Overstock.com was penalized for having a spammy link profile (in spite of being a brand they were so spammy that they were actually penalized, counter to Google's cultural norm) but a few months later the penalty was dropped, even though some of the spam stuff is still in place.
Those who were hit by Panda are of course still penalized nearly a half-year later, but Overstock is back in the game after a shorter duration of pain & now they are an insurance affiliate.
While most of the content farms were decimated, that left a big hole in the search results which will allow the Huffington Post to double or triple the yield of their content through additional incremental reach.
Before I go on, let me stop and say a couple of more important things: Aol, Aol Acquires Huffington Post, Aol Buys Huffington Post, Aol Buys Huffpo, Aol Huffington Post, Huffington Post, Huffington Post Aol, Huffington Post Aol Merger, Huffington Post Media Group, Huffington Post Sold, Huffpo Aol, Huffpost Aol, Media News.
See what I did there? That's what you call search-engine optimization, or SEO. If I worked at the Huffington Post, I'd likely be commended for the subtle way in which I inserted all those search keywords into the lede of my article.
I was given eight to ten article assignments a night, writing about television shows that I had never seen before. AOL would send me short video clips, ranging from one-to-two minutes in length — clips from “Law & Order,” “Family Guy,” “Dancing With the Stars,” the Grammys, and so on and so forth… My job was then to write about them. But really, my job was to lie. My job was to write about random, out-of-context video clips, while pretending to the reader that I had watched the actual show in question. AOL knew I hadn’t watched the show. The rate at which they would send me clips and then expect articles about them made it impossible to watch all the shows — or to watch any of them, really.
Doing fake reviews? Scraping content? Putting off-topic crap on a site to monetize it?
Those are the sorts of things Google claims the spammy affiliates & SEOs do, but the truth is they have never been able to do them to the scale the big brands have. And from here out it is only going to get worse.
We highlighted how Google was responsible for creating the content farm business model. Whatever comes next is going to be bigger, more pervasive, and spammier, but coated in a layer of "brand" that magically turns spam into not-spam.
Imagine where this crap leads in, say, 2 or 3 years.
It won't be long before Google is forced to see the error of their ways.
What Google rewards they encourage. What they encourage becomes a profitable trend. If that trend is scalable then it becomes a problem shortly after investors get involved. When that trend spirals out of control and blows up they have to try something else, often without admitting that they were responsible for causing the trend. Once again, it will be the SEO who takes the blame for bad algorithms that were designed divorced from human behaviors.
I am surprised Google hasn't hired someone like Greg Boser or David Naylor on staff to explain how people will react to the new algorithms. It would save them a lot of work in the long run.
Disclosure: I hold no position in AOL's stock, but I am seriously considering buying some. When you see me personally writing articles on Huffington Post you will know it's "game on" for GoogleBot & I indeed am a shareholder. And if I am writing 30 or 40 articles a day over there that means I bought some call options as well. :D
My personal preference on subdomains vs. subdirectories is that I usually prefer the convenience of subdirectories for most of my content. A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example. If you’re a newer webmaster or SEO, I’d recommend using subdirectories until you start to feel pretty confident with the architecture of your site. At that point, you’ll be better equipped to make the right decision for your own site.
Even though subdirectories were the "preferred" default strategy, they are now the wrong strategy. What was once a "best practice" is now part of the problem, rather than part of the solution.
Not too far before Panda came out we were also told that we can leave it to GoogleBot to sort out duplicate content. A couple examples here and here. In those videos (from as recent as March 2010) are quotes like:
"What we would typically do is pick what we think is the best copy of that page and keep it, and we would have the rest in our index but we wouldn't typically show it, so it is not the case that these other pages are penalized."
"Typically, even if it is considered duplicate content, because the pages can be essentially completely identical, you normally don't need to worry about it, because it is not like we cause a penalty to happen on those other pages. It is just that we don't try to show them."
I believe if you were to talk to our crawl and indexing team, they would normally say "look, let us crawl all the content & we will figure out what parts of the site are dupe (so which sub-trees are dupe) and we will combine that together."
I would really try to let Google crawl the pages and see if we could figure out the dupes on our own.
Now people are furiously rewriting content, noindexing, blocking with robots.txt, using subdomains, etc.
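For anyone untangling those dedupe countermeasures, the robots.txt route can at least be sanity-checked mechanically with Python's stdlib parser. A minimal sketch, where the rules and example.com URLs are invented purely for illustration (none of them come from this post):

```python
# Hypothetical sketch: checking which URLs a robots.txt rule blocks
# from Googlebot, using Python's built-in robots.txt parser.
# The rules and URLs below are made up for illustration.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /duplicate-section/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in [
    "http://example.com/original-article",
    "http://example.com/duplicate-section/reprint",
]:
    status = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```

Worth remembering that a meta robots noindex tag behaves differently from a robots.txt block: a blocked page can't even be fetched, so any noindex tag on it will never be seen by the crawler.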
Google's advice is equally self-contradicting and self-serving. Worse yet, it is both reactive and backwards looking.
You follow best practices. You get torched for it. You are deciding how many employees to fire & if you should simply file bankruptcy and be done with it. In spite of constantly being led astray by Google, you look to them for further guidance and you are either told to sit & spin, or are given abstract pablum about "quality."
Everything that is now "the right solution" is the exact opposite of the "best practices" from last year.
And the truth is, this sort of shift is common, because as soon as Google openly recommends something people take it to the Nth degree & find ways to exploit it, which forces Google to change. So the big problem here is not just that Google gives precise answers where broader context would be helpful, but also that they drastically and sharply change their algorithmic approach *without* updating their old suggestions (that are simply bad advice in the current marketplace).
It is why the distinction between a subdirectory and subdomain is both 100% arbitrary AND life changing.
Meanwhile select companies have direct access to top Google engineers to sort out problems, whereas the average webmaster is told to "sit and spin" and "increase quality."
The only ways to get clarity from Google on issues of importance are to:
ignore what Google suggests & test what actually works, OR
publicly position Google as a monopolist abusing their market position
How valuable is that email? How about "not at all" for $500, Alex?
I think Fantomaster is brilliant, but I would much rather read one of his blog posts than dozens or hundreds of Tweets. Sure, knowing that 9,000 people might have seen a message can be comforting, but 1 blog post will get you way more views (and with far deeper context & meaning).
How to Test the Value of Social Media
Want to see big numbers get small quickly? Try charging anything...as little as $1 & you will quickly see that social media is mostly garbage. Alternatively, try giving away $5 or paying for the in-stream ads that directly manipulate relevancy & once again you will see how worthless social media is as a signal...something that anyone can quickly buy.
Online reputation is important to most researchers, and about 10% of respondents to our survey complained that they or their work have been misrepresented on the Internet. The web has a long memory, and rumours, lies and bad information can spiral out of control to be remembered by posterity.
For a lot of businesses the social media stuff will be nothing but blood & tears. A resource drain that money, time & hope gets poured into with nothing coming out the other end.
That said, I don't think ignoring it is a wise decision at this point. The best tip I have for most people is to try to set up automated systems that help your social signal grow automatically. That can mean onsite integration & perhaps a small token amount of advertising. Beyond that it is probably only participating as much as you enjoy it. And if you are more of a huckster/PR type, pay attention to how folks like Jason Calacanis leverage these channels.
The second best tip would be measure it with stats that actually matter. Revenue and profit are important. Time spent tracking the number of retweets is probably better spent building more content or improving your business in other ways. If you have something that works the rabbit hole goes deep, but if it isn't working then it is likely better spending your time being a bigger fish in a smaller pond.
Inspired by Barry's implementation, we recently added the social buttons to the left rail of the site. That is probably one of the best types of integration you can do, because it is out of the way for those who don't know what it is and/or want to ignore it, but it stays right in the same spot (always visible) for anyone who is interested in those types of buttons.
What Makes the Web Great (for Small Businesses)
Two things that make the web great are the ability to fail fast (and cheaply) & the ability to focus deeply. If social media increases the operating cost (being yet another hoop you have to jump through) & robs valuable attention that could go into your website then it is 0-for-2.
It remains to be seen if an author will be able to carry his or her trust with them to their next gig, but if they can then that would make the media ecosystem more fluid & pull some amount of power away from traditional publishers. Some publishers are suggesting putting their book content online as HTML pages...well if they are doing that then why doesn't the author just install Wordpress and keep more of the value chain themselves (like J.K. Rowling just did)?
Their answer to that question is generally "no" but that they would even ask themselves that question is fallacious Orwellian duplicity.
Would you trust the local plumber to work on your house if he was posting "exciting viral content" online about how some projects went astray? Now every plumber needs to become a marketing expert to not get driven off the web by Roto-Rooter & other chain-styled companies that can collect +1 signals from all their employees & some of their customers across the country or around the world.
Google knows they are tilting the search game toward those who have money. They even flaunt it in their display ads!
Google itself explains that not allowing its device maker partners to ship Skyhook's software was just, the way Google describes it, a necessary measure to prevent damage (Google says "detriment", which is the Anglicized version of the Latin word for "damage") from being caused to the whole ecosystem.
But Google does not want to allow Oracle to control Java the way Google controls Android.
Google today is saying that "social media is important." Just look at their wave of product announcements & their bonus structure.
I loathe the approach (and the message), but I accept it. ;)