Search is political.
Google has maintained that there were no exceptions to Panda and that they couldn't provide personalized advice on it, but it turns out that if you can publicly position their "algorithm" as an abuse of power by a monopoly, you will soon find 1:1 support coming your way.
The WSJ's Amir Efrati recently wrote:
In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.
We know what will happen from that first bit of advice, in terms of new subdomains:
billions trillions served.
What are the "among other things"?
We have no idea.
All we know is that it has been close to half a year since Panda was implemented, and in spite of massive capital investments virtually nobody has recovered.
A few years back, Matt Cutts stated that Google treats subdomains more like subfolders. Except, apparently, that only applies to some parts of "the algorithm" and not others:
"My personal preference on subdomains vs. subdirectories is that I usually prefer the convenience of subdirectories for most of my content. A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such news.google.com or maps.google.com, for example. If you’re a newer webmaster or SEO, I’d recommend using subdirectories until you start to feel pretty confident with the architecture of your site. At that point, you’ll be better equipped to make the right decision for your own site."
Even though subdirectories were the "preferred" default strategy, they are now the wrong strategy. What was once a "best practice" is now part of the problem, rather than part of the solution.
Not too far before Panda came out we were also told that we could leave it to GoogleBot to sort out duplicate content. A couple of examples here and here. In those videos (from as recently as March 2010) are quotes like:
- "What we would typically do is pick what we think is the best copy of that page and keep it, and we would have the rest in our index but we wouldn't typically show it, so it is not the case that these other pages are penalized."
- "Typically, even if it is considered duplicate content, because the pages can be essentially completely identical, you normally don't need to worry about it, because it is not like we cause a penalty to happen on those other pages. It is just that we don't try to show them."
- I believe if you were to talk to our crawl and indexing team, they would normally say "look, let us crawl all the content & we will figure out what parts of the site are dupe (so which sub-trees are dupe) and we will combine that together."
- I would really try to let Google crawl the pages and see if we could figure out the dupes on our own.
Now people are furiously rewriting content, noindexing, blocking with robots.txt, using subdomains, etc.
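For reference, the blocking and noindexing approaches mentioned above look roughly like the sketch below (the `/tag/` path is a hypothetical example of a thin-content section; whether any of this actually helps with Panda is exactly the open question):

```
# robots.txt — block crawling of a hypothetical thin-content section
# (note: this stops crawling, not necessarily indexing of already-known URLs)
User-agent: *
Disallow: /tag/

<!-- per-page alternative: let the page be crawled, but keep it out of the index -->
<meta name="robots" content="noindex">

# or the equivalent HTTP response header, useful for non-HTML files
X-Robots-Tag: noindex
```

Note that these mechanisms are not interchangeable: a page blocked in robots.txt can't be crawled at all, so Google never sees a noindex tag placed on it.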
Google's advice is equally self-contradictory and self-serving. Worse yet, it is both reactive and backwards-looking.
You follow best practices. You get torched for it. You are deciding how many employees to fire & whether you should simply file for bankruptcy and be done with it. In spite of constantly being led astray by Google, you look to them for further guidance, and you are either told to sit & spin, or are given abstract pablum about "quality."
Everything that is now "the right solution" is the exact opposite of the "best practices" from last year.
And the truth is, this sort of shift is common, because as soon as Google openly recommends something people take it to the Nth degree & find ways to exploit it, which forces Google to change. So the big problem here is not just that Google gives precise answers where broader context would be helpful, but also that they drastically and sharply change their algorithmic approach *without* updating their old suggestions (that are simply bad advice in the current marketplace).
It is why the distinction between a subdirectory and a subdomain is both 100% arbitrary AND life-changing.
Meanwhile select companies have direct access to top Google engineers to sort out problems, whereas the average webmaster is told to "sit and spin" and "increase quality."
The only ways to get clarity from Google on issues of importance are to:
- ignore what Google suggests & test what actually works, OR
- publicly position Google as a monopolist abusing their market position
Good to know!