Google Robots.txt Wildcard

Not sure if I have seen this mentioned before, but Dan Thies noticed Googlebot's wildcard robots.txt support:

Google's URL removal page contains a handy bit of information that's not found on their webmaster info pages, where it should be.

Google supports the use of 'wildcards' in robots.txt files. This isn't part of the original 1994 robots.txt protocol, and as far as I know, is not supported by other search engines. To make it work, you need to add a separate section for Googlebot in your robots.txt file. An example:

User-agent: Googlebot
Disallow: /*sort=

This would stop Googlebot from reading any URL that includes the string sort= (whether it appears as ?sort= or &sort=), no matter where that string occurs in the URL.
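To make the matching behavior concrete, here is a rough Python sketch of how a wildcard Disallow pattern can be tested against a URL. This is only an illustration of the semantics described above, not Google's actual implementation:

import re

def wildcard_to_regex(pattern):
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'
    escaped = re.escape(pattern).replace(r"\*", ".*")
    # A Disallow rule is a prefix match, so anchor only at the start
    return re.compile("^" + escaped)

def is_blocked(url_path, disallow_pattern):
    # True if the URL path (including query string) falls under the pattern
    return wildcard_to_regex(disallow_pattern).match(url_path) is not None

# The example above: any URL containing 'sort=' is blocked
print(is_blocked("/widgets?color=red&sort=price", "/*sort="))  # True
print(is_blocked("/widgets?color=red", "/*sort="))             # False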

Good information to know if your site has recently suffered in Google due to duplicate content issues.
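For instance, a site that generates duplicate URLs through session IDs or sort parameters could extend its Googlebot section with rules like these (the parameter names here are placeholders; substitute whatever your own URLs actually append):

User-agent: Googlebot
Disallow: /*PHPSESSID=
Disallow: /*sessionid=
Disallow: /*sort=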

Dan also recently launched an SEO coach blog on his SEO Research Labs site.

Published: November 7, 2005 by Aaron Wall in technology

Comments

Daniel Aleksandersen
April 14, 2007 - 6:57pm

It is supported by Yahoo! Slurp too!

November 17, 2005 - 7:18pm

Tried to submit this for a client:

User-agent: Googlebot
Disallow: /*PHPSESSID=

Through the Removal site

But I get the following message: URLs cannot have wild cards in them (e.g. "*"). The following line contains a wild card: DISALLOW /*PHPSESSID=

So?

Mike
January 1, 2007 - 4:59am

I am also getting this message from Google when I try to remove:

URLs cannot have wild cards in them (e.g. "*"). The following line contains a wild card:
DISALLOW /thread*-0.html

November 7, 2005 - 7:50pm

I think that made its rounds before I got on the scene.

world
August 8, 2008 - 6:59am

MSNbot supports robots.txt wildcards too.

August 8, 2008 - 8:58am

In 2005 (when I wrote this post) I don't think Microsoft did.
