How Google Destroyed the Value of Google Site Search

Do You Really Want That Indexed?

On-demand indexing was a great value-added feature for Google Site Search, but now it carries more risk than ever. Why? Google decides how many of your documents make it into their primary index, and if too many of them are arbitrarily deemed "low quality" then you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products, and in response Google goes out of its way to destroy your business. Awesome!

Keep in mind that Google was directly responsible for the creation of AdSense farms. And rather than addressing them directly, Google chose to roll everything through an arbitrary algorithmic approach.

<meta name="googlebot" content="noindex" />

Part of the prescribed solution to the Panda Update is to noindex content that Google deems to be of low quality. But if you tell Googlebot to noindex some of your content while also using Google for site search, you destroy the usability of their site search feature by making that content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value-destructive than the arbitrary price hike Google Site Search recently pushed through.
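To make the bind concrete, here is a minimal sketch of such a page, assuming (Google documents nothing either way) that Site Search works off the same index Googlebot builds; the page title is just a placeholder:

<!-- "thin" page, noindexed for Googlebot per the Panda advice -->
<head>
  <title>Product FAQ</title>
  <meta name="googlebot" content="noindex" />
</head>
<!-- there is no documented directive for "drop from web search, keep for site search" -->

The moment that tag goes in, the page drops out of everything Googlebot feeds, including the site search results you are paying for.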

We currently use Google Site Search on our site here, but given Google's arbitrary switcheroo-style moves, I would be the first person to dump it if they hit our site with their stupid "low quality" label that somehow missed eHow & sites which wrap repurposed tweets in a page. :D

Cloaking vs meta noindex, rel=canonical, etc. etc. etc.

Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google's algorithms are so complex that you literally have to break some of Google's products to be able to work with others. How stupid! But it is a healthy reminder for anyone considering deeply integrating Google into their on-site customer experience. Who knows when the model will arbitrarily change again? We do know that when it does, Google won't warn partners in advance. ;)
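For illustration, here is what that directive soup can look like in a single page head (example.com and the URL are placeholders); each line is an officially sanctioned way to tell Google something, and keeping them consistent across Google's own products is entirely your problem:

<head>
  <link rel="canonical" href="http://example.com/page" />
  <meta name="robots" content="noindex, follow" />
  <meta name="googlebot" content="noindex" />
</head>

Note that the canonical hint and the noindex directives pull in opposite directions: one says "treat this URL as the real copy," the others say "drop it." That is exactly the sort of conflict where following one piece of Google advice breaks another.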

I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on Google's list of crawlers, questions about whether it shares Googlebot's user agent have gone unanswered, and even a blog post like this probably won't get a response.
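If a dedicated site-search crawler did exist under its own user-agent token (purely hypothetical; no such token appears on Google's published crawler list, and /thin-content/ below just stands in for whatever you would demote), robots.txt could at least scope rules per bot:

User-agent: Googlebot
Disallow: /thin-content/

# hypothetical token; Google documents no such crawler
User-agent: google-sitesearch
Allow: /

Since no such token is documented, and nobody will say whether site search shares Googlebot's user agent, there is no supported way to write that second rule.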

That is just one more layer of hypocrisy: Google states that if you don't provide great customer service then your business is awful, yet going to the dentist is more fun than trying to get any customer service out of Google. :D

I was talking to a friend about this stuff and I think he summed it up perfectly: "The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will."

Published: March 18, 2011 by Aaron Wall in google

Comments

jeronco
March 18, 2011 - 11:25am

A classic monopoly. I'm just wondering - with your level of knowledge - why don't you exploit Google's complexity? Even better - let's all reverse engineer and exploit it until Google becomes so confused by its own complexity that it ends up being completely useless. Honestly - that would make me kinda happy :)

March 18, 2011 - 1:56pm

It is Google's fear of being manipulated that is causing Google to manipulate themselves ... boxing themselves into a corner and breaking things over and over.

insightseo
March 21, 2011 - 7:02pm

I couldn't agree more with this statement, and I think it applies perfectly in other areas of Google as well, such as what Matt is talking about with exact match domains. Google is so concerned about being manipulated that they are reacting too hastily to every complaint and in the process are breaking themselves. Google lacks a vision for search and someone to pull everything together, so instead they listen to complaints from across the web and run around changing things. What they should do is spend more time looking at new ways to solve the problem instead of messing around with older signals that aren't broken, like exact match domains. I believe there's probably much more innovation left in search, but they are trying to find it in a reactionary manner.

March 21, 2011 - 7:50pm

They want to keep changing things so people keep talking about how the algorithms are changing, while Google slides in more stealth monetization here and there. Keep people focused on the importance of brand rather than on anti-competitive behaviors and such.

NeonDog
March 18, 2011 - 3:46pm

Their prime motivator these days seems to be plausible deniability.

They simply want to be sure they can manipulate their results (organic and paid) at any moment in time without worrying about any sort of repercussions.

"The great Oz has spoken!" (don't look behind the curtain where earnings keep going up and Google-owned properties are dominating the results)

MyContent
March 18, 2011 - 4:11pm

Plausible deniability has been their prime motivator for years. It's not just about manipulating the search results. It also drives their methods for extracting, controlling, and monetizing content that is not theirs. It provides them cover for their pre-designed failure to address the scraper sites and content thieves that still dominate the search results, even after their supposed updates.

IntelliBraden
March 22, 2011 - 12:44pm

...we don't install Google Site Search on our clients' sites.

servercraft
March 23, 2011 - 1:42am

Aaron, since the site is on Drupal, the brilliantly effective Apache Solr module might be an option to consider for site search.

March 23, 2011 - 2:42am

Thanks for the recommendation :)
