Google will gladly figure out, for you, whether or not your search has local intent. :)
Google's Investment into Local
Late last year Google moved one of its prized executives, Marissa Mayer, over to local services. Moving Mayer, fresh off Google Instant and a variety of other high-profile areas of Google's search development, to head up local is a strong signal of how much attention Google is putting on local and local result quality (or perceived quality).
If you are a business owner who operates locally, say a real estate agent or insurance agent or really any other consumer-based service, then this presents a huge opportunity for you if you can harness the targeting and tracking ability available online.
Merging Offline Marketing with Online Marketing
A lot of small businesses, and larger businesses that operate locally, still rely quite a bit on offline advertising. It used to be that business owners had to rely on staff nailing down exactly how a lead came to them (newspaper ad? radio ad? special discount ad? and so on).
While it is still good practice to do that, relying solely on that to help gauge the ROI of your advertising campaign introduces a good amount of slippage and is not all that accurate (especially if you sell something online).
As local businesses start to see the light with SEO and PPC campaigns versus dropping 5 figures on phonebook advertising, a big selling point as a service provider or an in-house marketing staff member will be to sell the targeting of online campaigns as well as the tracking of those results.
If you're a business owner, it's equally important that you understand what's available to you as an online marketer.
Types of Offline Advertising to Track
Locally, you are essentially looking at a few different types of advertising options to work into your newfound zest for tracking results:
Print is probably the most wide-ranging in terms of branches of advertising collateral because you can get into newspapers, magazines, flyers, brochures, banners, yellow pages, and so on.
While your approach may differ for each marketing type, the core tracking options are basically the same. You can track in your analytics program via:
Custom URLs
Separate Domains
Custom Phone Numbers
The beauty of web analytics, specifically a free service like Google Analytics, is that it puts the power of tracking into the hands of a business owner at no cost, outside of perhaps a custom setup and implementation by a competent webmaster. All of these tracking methods can be handled in Google Analytics as well as other robust analytics packages (Clicky.com, as an example, is a reasonably priced product which can do this as well, save for maybe the phone tracking).
Structuring Your Campaigns
With the amount of offline advertising many businesses do, it is easy to get carried away with separate domains, custom URLs, custom phone numbers, and the like.
What I usually like to do is use a good old fashioned spreadsheet to track the specific advertisements that are running, the dates they are running, and the advertising medium they are using. I also include a column or three for the tracking method(s) used (custom URL, separate domain, special phone number).
In addition to this, Google Analytics offers annotations which you can use to note those advertising dates in your traffic graph area to help get an even better idea of the net traffic effect of a particular ad campaign.
How to Track It
Armed with your spreadsheet of ads to track and notes on how you are going to track them, you're ready to set up the technical side of things.
The setup is designed to track the hits on your site via the methods mentioned. Once visitors get there, you'll want that traffic assigned to a campaign or a conversion funnel to determine how many of those people actually convert (if you are able to sell to or convert the visitor online).
A custom URL is going to be something like:
yoursite.com/save20 for an ad offering 20% savings
yoursite.com/summer for an ad promoting a summer special
You may or may not want to use redirection. A redirect method makes sense if you are using something like a static site. With a CMS like WordPress, you could instead create those URLs as specific pages, noindex them, and ensure they are not linked to internally, so you keep them out of the search engines and the normal flow of navigation. That way you know any visit to that page is clearly related to the offline campaign.
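As a minimal sketch of the noindex approach (the page URL and title here are hypothetical), the campaign landing page's head would carry a robots meta tag:

```html
<!-- Hypothetical landing page served at yoursite.com/save20 -->
<head>
  <!-- Keep this offline-campaign page out of search results -->
  <meta name="robots" content="noindex">
  <title>Save 20% - Special Offer</title>
</head>
```

Because the page is never linked internally and never indexed, any pageview it records in your analytics can be attributed to the offline ad carrying that URL.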
A redirect would be helpful where the above is not possible and you need to use Google's URL builder to help track the campaign and not lose referral parameters on the 301.
So you could use the URL builder to get the following parameters if you were promoting a custom URL like yoursite.com/save20:
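To illustrate (the source and campaign names below are made up for the example), the utm_* tagging parameters the URL builder produces can also be assembled by hand:

```python
from urllib.parse import urlencode

def tagged_url(base, source, medium, campaign):
    """Build a campaign-tagged destination URL in the style of
    Google's URL builder (utm_* parameters are what Google
    Analytics reads to attribute the visit to a campaign)."""
    params = {
        "utm_source": source,      # where the ad ran, e.g. a newspaper
        "utm_medium": medium,      # the ad type, e.g. print
        "utm_campaign": campaign,  # the specific promotion
    }
    return base + "?" + urlencode(params)

# Hypothetical destination for the yoursite.com/save20 vanity URL:
print(tagged_url("http://yoursite.com/save20-landing",
                 "daily-herald", "print", "save20"))
```

The tagged URL is what the /save20 vanity URL would 301 to, so Analytics credits the visit to the print campaign rather than lumping it in with direct traffic.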
Some companies use separate domains to track different campaigns. The idea is the same, as is the basic implementation, except that you apply the redirect to the domain itself rather than to a sub-page or directory off the domain as we did in the prior example.
Say you sell snapping turtles (snappingturtles.com), and maybe you also sell turtle insurance, so you buy turtleinsurance.com and want to use it as part of a large campaign to promote this new and innovative product. You could get this from the URL builder:
This would redirect to the home page of your main site, and you can update your .htaccess to point at a sub-page if you have one catering to that specific market.
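A rough .htaccess sketch for that setup, assuming turtleinsurance.com is parked on the same server (the domains and utm values are the article's hypothetical example):

```apache
# In turtleinsurance.com's .htaccess: 301 every request to the
# main site with campaign tagging appended, so Analytics
# attributes the visit to the vanity-domain campaign.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?turtleinsurance\.com$ [NC]
RewriteRule ^ http://snappingturtles.com/?utm_source=turtleinsurance&utm_medium=offline&utm_campaign=turtle-insurance [L,R=301]
```

Swap the destination for a dedicated sub-page if you build one for that market.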
Custom Phone Numbers
There are quite a few ways to get cheap virtual numbers these days, and Phone.com is a reliable service where you can get a number for roughly $4.88 per month.
I know companies that implemented custom numbers for a bunch of print ads, and it was pretty eye-opening in terms of which ads performed better than others and how much money was being wasted on untargeted print campaigns.
There certainly is a somewhat intangible brand equity building component to offline ads but it is still interesting to see ads which carry their weight with traffic and response rates, as well as being really helpful when it comes time to reshape the budget.
Here are a couple handfuls of providers which offer phone tracking inside of Google Analytics. Most of these providers require you to purchase a number from them to tie into a specific URL on your site (or directly into the domain) and will help track those calls alongside the pageviews generated.
Some campaigns are wide-ranging enough that you may want to target them with a custom number or two plus a custom URL or domain. Using a spreadsheet to track these measures, along with Google Analytics annotations to gauge traffic spikes and drops, offers business owners a deep view into the use of their marketing dollars.
If you are a business owner who thinks "wow this is awesome, how the heck do I do it?", well here is some advice. If the field of web analytics is mostly foreign to you I would suggest finding a certified Google Analytics provider or ask if your current web company can do this for you. Certainly there are plenty of competent people and companies that are not part of the Google Analytics partner program.
If you are interested in a Google Analytics partner you can search for them here. There is also quite a bit of information in the self-education section of Google Analytics.
I would recommend learning how to do this over time so that you can eventually make minor or major changes yourself. Also, it helps to establish a business relationship with someone competent and trustworthy for future tasks that come up which you cannot handle on your own.
If you are a service provider, start implementing this for some of your local clients and you'll likely be well on your way to establishing yourself as a sought-after marketer in your area.
When I graduated high school one of my teachers gave me a card stating how they appreciated my humility. My mom read that and was proud, and I felt a bit embarrassed because I didn't know what the word meant and had to ask. It turns out it is easy to confuse ignorance for humility! :D
Marketing is chock-full of humbling teachable moments. One of the most important concepts is the importance of humility. When Lady Gaga spoke at Google she came off as being totally humble. I don't care for her music, but I have a lot more respect for that sort of marketing than the "rapper who has more money than Bernanke" sort of approach.
A lot of well-known online marketers are the exact opposite of Lady Gaga's approach: anything to be heard, "I am the best," etc. Some people are into that approach, while others find it distasteful. Of course in any market there will be competition with winners and losers, as SEO is a zero-sum game. But some folks work to build value & monetize, while others aim to exploit & scam.
Because there are exploiter dirtbags working the market, you have to pay attention to what the market is saying about your stuff in real time. Even if you pour 20 hours into creating something that is useful, relevant, engaging, and interesting, some people will think it is spam just because it uses a format that spammers have exploited. Notice how, in spite of our collateral damage piece being fairly well received across the web, some spots that referenced it immediately raised the "spam" concern simply because it is an infographic!
Every Profitable Company is in the Gray Area
Part of the reason you need to track viral stuff in real-time is because people are tuned to think that anything in online marketing that is successful has some layer of deception to it.
In spite of how many marketers love to wear the white hat label, the truth is that almost anyone who is profitable operates somewhere in the gray area. Google has something like $40 billion in the bank, and yet they still have an AdSense category for "get rich quick." They proudly claim how they took down the get rich quick scammers that were trading on the Google brand, but they still have an AdSense category for "get rich quick."
White Hat? Probably a Liar
The problem I have with those who love the white hat label is that many people who claim to adhere to algorithmic best practices are often willing to crap on real people to get ahead. Jason Calacanis claimed to be white hat precisely because he was aware of how dirty and exploitative his Mahalo junk was. You can't really get away with flagrant spamming if you call it what it actually is, so you have to preach righteous virtues while doing it.
That "scammer exploiting a loophole" approach can work for your thin affiliate site that isn't tied to your name & brand, but if you are using that sort of stuff on your client projects or on your own main brand site you build contempt in the marketplace. Which is precisely why so many SEOs were happy to see Mahalo get torched in the content farm update.
The recent "advanced" link building conference brought about 2 teachable moments on that front.
How to Breed Hate & Animosity in Your Marketplace!
Before the conference Will (from Distilled) asked me if I would be ok with him writing a post here. We have tried to be fairly neutral in the marketplace (reviewing tons of competing sites and products and whatnot), so I said sure. He handed me something that I found to be pretty offensive. To which, when he asked for a follow up, I replied:
Generally I felt that suggesting that post was sorta a smack in the face. It was like an ad inside an ad inside an ad. Ad for seminar + ad for your seo services + laundry list of links to client sites.
That you would suggest that made me feel like you think I am stupid or that you were trying to disrespect me. I didn't reply right away because I was a bit angry at the time & didn't want to respond that way. And then all that tech crap happened. Anyhow I think you are savvy and are a great SEO, but a post like that (ad in ad in ad) is better fit for say like John Chow's blog than ours ;)
He responded with how much of a fan he was of mine & that if he knew of anything cool on the news front down the road he would try to help us break it & such.
At the conference he highlighted his "appear authentic, but be driven by a script" type of approach.
But what he *failed* to disclose (until his brother disclosed it during the conference) was...
that they had been hired by a competing SEO site to try to outrank us for SEO tools
that they suggested the site they were working on to use 301 redirects to game Google
that the site they were working for outed our site for using 301 redirects & got it toasted in Google
In other words, when a person who is truly ahead of the market does something to create a competitive advantage, it must be black hat spam and you should complain to Google to get it torched. Then, years later, when the people who claimed the technique was spam do the same damn thing, it suddenly becomes clean and innovative (cutting-edge advanced stuff, even).
Then they want the person who was ahead of the curve to be a free conduit for spreading this trash! It is so bad that you couldn't even make this stuff up.
Consider the brand damage they did to themselves & the bad karma they earned in the marketplace with the above stupidity. If they are willing to do that sort of stuff to their own brand, would you want them working on your brand? I wouldn't.
Those who claim to be algorithmically white hat, but are fine with lying, being deceptive, failing to disclose conflicts, etc. are saying that they put the algorithm ahead of how they treat real human beings in their marketplace. It is fine to be exploitative if that is your approach, but be honest about it ... because it is dumb to do it in a way that causes damage to your brand.
Spam vs Junk Trash Garbage That People Hate
Some people also complain that domain names (a clear signal of relevancy) shouldn't count, and yet some of the same folks create software to automate spamming up public communities. Any competitive disadvantage they have is spam; any competitive advantage they have is not spam. ;)
Enter Russ Jones!
On the Virante about page it highlights that "Russ has assisted in the creation of new search marketing technologies. This includes the venerable LinkSleeve Spam Link Verification system, which currently blocks thousands of link spam messages across the web." Yet at the "advanced" link building conference he gave away software to help people spam the crap out of Reddit.
Once the people on Reddit highlighted it he was quick to backpedal, stating: "I don't openly promote spamming. If you think creating highly viral content and submitting it to a social network to let them decide if it is good is spamming, then you have been seriously misled."
Here is the deal, though: if the goal of that software was to do ANYTHING other than spamming, then it would be promoted to the core audience (so it could reach more people) rather than hunting out old subreddits & spamming them up with links.
Yet again, advanced!
There is nothing new or advanced about that link building "technique." It is just an extension of guestbook or comment spam. The above image links to a Vimeo video which highlights what Matthew Haughey thinks of the SEO industry after he found out someone was selling an info-product on how to spam up Metafilter by dropping links in old posts. Slapping the label "advanced" on old spam techniques makes them neither new nor advanced. The clock moves in one direction. Unfortunately it is not 2003 anymore.
There are a bunch of exploitative douchebags who paint themselves as white hats while destroying the ecosystems we all must work in, undermining basic human decency principles & trust in the marketplace. I don't care if someone wants to be a spammer, but to do so and claim that you are white hat and ethical (and thus that others are somehow inferior) is garbage.
Even Ghetto Rappers Stand for Something
The most important lesson in marketing is consistency. Make promises that you can consistently deliver on.
Rappers are successful. So are folks like Thom Yorke. But they pick their markets & their approach and stick to it. Bouncing back and forth just makes a person look like a dishonest douchebag who stands for nothing.
It's not standing for much, but at least the rappers have their drugs, booze, and hoes.
What do these internet marketers stand for?
It seems the folks teaching "advanced" internet marketing still need a bit of work on "basic" social interactions & common sense. But I guess those are harder to sell. ;)
If you live outside the United States and were unscathed by the Panda Update, a world of hurt may await you soon. Or you may be in for a pleasant surprise. It is hard to say where the chips may lie for you without looking.
Because Google has multiple algorithms running right now, you can get a peek at the types of sites that were hit, and if your site is in English you can see whether it would have been hit by comparing your Google.com rankings in the United States versus foreign markets, using the Google AdWords ad preview tool.
In most foreign markets Google is not likely to be as aggressive with this type of algorithm as they are in the United States (because foreign ad markets are less liquid and there is less of a critical mass of content in some foreign markets), but I would be willing to bet that Google will be pretty aggressive with it in the UK when it rolls out.
The keywords where you will see the most significant ranking changes will be those with a lot of competition, as keywords with less competition generally do not have as many sites waiting to replace those that get whacked (since fewer people were competing for the keyword). Another way to get a glimpse of the aggregate data is to look at your Google Analytics search traffic from the US and see how it has changed relative to seasonal norms. Here is a look-out-below example, highlighting how Google traffic dropped. ;)
What is worse, on most sites impacted, revenue declined faster than traffic, because search traffic monetizes so well & the US ad market is so much deeper than most foreign markets. Thus a site that had 50% profit margins might have just gone to breaking even or losing money after this update. :D
When Google updates the US content farmer algorithm again (likely soon, since it has already been over a month since the update happened) it will likely roll out around other large global markets, because Google does not like running (and maintaining) 2 sets of ranking algorithms for an extended period of time, as it is more cost intensive and it helps people reverse engineer the algorithm.
It works by extending the search interface to include a layer before the results come up. The layer typically includes a left column of related keywords & a right box that can be anything from:
3 top websites for that query
a weather forecast
the profile of a celebrity
other unique data sets
Here is an example of how the search box flies out
Here is an overview video from Yahoo!
Arbitrage or Helpful?
It is easy to laugh at Ask.com when thinking about the spammy end of the "answers engines" (or even Yahoo! Answers, for that matter), but Search Direct could range from highly useful to pretty weak depending on what Yahoo! decides to do with it. Its impact on various markets can range from trivial to significant.
What Powers Search Direct?
The ranking algorithm for Yahoo! Search Direct is different from their core results: it is powered off a smaller index with its own algorithm and a rapid refresh rate. Greg Sterling asked Yahoo!'s Shashi Seth about what drove the algorithm:
Seth told me that right now the links and content being shown in the right part of the box are the URLs that are the “most clicked” throughout the Yahoo network. He also implied that it might get more nuanced over time. And he added that rankings can change moment to moment because it’s dynamic.
That click bias has a natural preference toward promoting Yahoo! properties (since Yahoo! users like Yahoo! stuff) and promoting those who are featured on the Yahoo! network through editorial partnerships.
Greater Integration of Self Promotion
One of the benefits of Yahoo! outsourcing search is that they can now claim that they are not a search engine, which gets them around a ton of conflict issues, and allows them to aggressively self-promote without the type of scrutiny Google has come under for hard-coding their search results. Currently Yahoo! Search Direct is not yet running ads, but it is full of self-promotion. It is not a great sign for the longevity of Yahoo! Search that when you start typing almost every letter of the alphabet leads to a downstream Yahoo! product. In the past, search engines which have over-monetized have seen marketshare erode to Google. Hopefully this stuff pushes people to Bing though!
In key verticals where Yahoo! is well established the entire preview box is consumed by content from their vertical databases. See, for example, a search for LeBron James
If you are ESPN it becomes much harder to get traffic from Yahoo! Search directly given that sort of layout. If the model proves profitable enough, Yahoo! can close off a lot of verticals. The key for web publishers is that Yahoo! has traditionally been horrible at integration, so the odds of them doing this in a way that monetizes more aggressively without harming Yahoo!'s search marketshare are pretty low. That said, last year Yahoo! bought Associated Content and has been pushing hard at growing their news, sports & finance verticals. If they are able to instantly tap a large share of the search market & can throw up a featured promotion for some of their key content, that will lead to lots of usage data (Microsoft has already mentioned using clickstream data to create a search signal) & social signals (like Facebook likes) that can bleed into improving the ranking of Yahoo! content in other search engines.
Custom Ad Units
The showing of a mini-search box not only gives them the potential for further self-promotion, but it also allows them to run more custom ad units that are in full focus of the end user. When you display a full search result you are offering a list of options, but premium placement ads in the preview box can allow for tighter integration of video, audio, or other custom ad units within search.
Yahoo! has also taken branded search ads one step further, with a wrap around on certain keyword queries, like eBay.
Where Yahoo! Search Direct falls short, especially when compared against Google Instant, is its forcing of a single vertical for keywords that can have many meanings. Take, for example, a search on cars. If you don't want the DVD, you are still forced to view information about the cartoon movie because a Yahoo! vertical has a match.
Another thing Yahoo! seems to be doing is force feeding a local option as the last suggested keyword, even where it is totally irrelevant. In the long run I think this would harm Yahoo! local as a true destination, but it can drive short term volume. Of course this only just launched, so it will likely become more relevant as they track how users interact with it. Currently someone is likely registering a Yahoo! local profile with Viagra in it somewhere. :D
Firefox 4 was just released. It is much smoother & faster than prior versions of the browser. And the persistent memory leaking issue seems to have been tamed, even with many extensions installed. Overall an awesome upgrade. I can see this once again becoming my main web browser while also remaining my primary SEO research browser.
In time we will likely think about moving the icons for Rank Checker and SEO for Firefox out of the status bar & into the upper menu, as it is not great for us to create extensions that are reliant on another extension which is then reliant on a browser that changes too ... too many moving parts.
We also just updated the documentation on the plug-in download & upgrade pages for our extensions, so that those who do not read our blog still know what they need to do to keep everything running smoothly. It also prevents us from having to read too many support tickets like these gems a crazy person gave us today, which helps us maintain at least a bit of hope for humanity. :)
In moderation such messages are humorous...but you just hope the person isn't crazy enough to hunt you down and shoot you because they think Yahoo! is a superior browser to Firefox. Not least because Yahoo! isn't a web browser! :D
And the spam clean up? Google did NOTHING of the sort.
Every single example (of Google spamming Google) that was highlighted is still live.
Now Google can claim they handled the spam on their end / discounted it behind the scenes, but such claims fall short when compared to the standards Google holds other companies to.
Most sites that get manually whacked for link-based penalties are penalized for much longer than 2 weeks.
Remember the brand damage Google did to companies like JC Penney & Overstock.com by talking to the press about those penalties? In spite of THOUSANDS of media outlets writing about Google's BTQ acquisition, The Register was the most mainstream publication discussing Google's penalization of BeatThatQuote, and there were no quotes from Google in it.
When asking for forgiveness for such moral violations, you are supposed to grovel before Google, admitting all past sins & their omniscient ability to know everything. This can lead one to over-react and actually make things even worse than the penalty itself!
In an attempt to clean up their spam penalties (or at least to show they were making an effort), JC Penney sent a bulk email to sites linking to them, stating that the links were unauthorized and asking for their removal. So JC Penney not only had to spend effort dropping any ill-gotten link equity, but also lost tons of organic links in the process.
Time to coin a new SEO phrase: token penalty.
token penalty: an arbitrary short-term editorial action by Google to deflect against public relations blowback that could ultimately lead to review of anti-competitive monopolistic behaviors from a search engine with monopoly marketshare which doesn't bother to follow its own guidelines.
Your faith in your favorite politician should be challenged after you see him out on the town snorting coke and renting hookers. The same is true for Googlers preaching their guidelines as though they were law while Google is out buying links (and the sites that buy them).
You won't read about this in the mainstream press because they are scared of Google's monopolistic business practices. Luckily there are blogs. And Cyndi Lauper. ;)
Update: after reading this blog post, Google engineers once again penalized BeatThatQuote!
Conversion optimization is an ongoing concern for serious businesses. When viewed in the shadow of a big change like Google's recent Panda Farmer update, optimizing your site's existing traffic streams becomes even more attractive - or necessary - to remain competitive.
For developing a better understanding of the foundations and lucrative potential of conversion optimization, it is a book I would highly recommend.
Tone and Style
Conversion optimization is a dense subject, with lots of bits and bytes of small, related information tied together to make a complete picture. Saleh and Shukairy tackle this book with the intention of making this dense, complicated subject easier to understand conceptually, and therefore to profit from.
The first two chapters offer simple foundational ideas for the novice, covering the general concepts, analytics and formulas typically used in measuring and improving conversions. However, in the introduction the authors make this clear - and suggest those with a bit of experience may want to skip right to chapter three. I appreciated the suggestion, but read from the beginning anyway. :)
I like the way the book progresses in this manner: each successive chapter builds on the ideas posited in previous chapters. While the early material was not new to me, it was nice to reaffirm where my thinking aligns with conversion experts and to identify places where our opinions diverge.
Once chapter three starts, the simple ideas presented in the beginning of the book are built on slowly, which encourages you to see how smaller ideas nourish the roots of larger results. This is an example of a well-considered and deftly executed book idea - and it makes reading and learning easier.
A simple tone from industry experts is common with O'Reilly books - it is part of what makes them solid study materials, especially for introducing you to new subjects. What I think Saleh and Shukairy do uniquely well is inject just enough warmth into their tone to keep the flow engaging without overdoing it and diluting the impact of the subject matter. It is a careful balancing act - they obviously have a ton of information to share and don't want to overwhelm the reader, but at the same time need to keep it on a level that most anyone can embrace.
Cold facts often need warming-up before serving them. Saleh and Shukairy say: "Conversion optimization is a blend of science and art. It is the intersection of creative, marketing, and analytical disciplines." I would add that creating an easily digestible tome on a genuinely dry subject matter is an art of its own. It requires an intersecting of knowledge, warmth, experience and understanding, and the writing skills to blend these seamlessly. Saleh and Shukairy use a simple tone and style to layer their ideas upon each other and leave the reader with a sense of foundation and conceptual understanding.
The meat of this book concentrates on presenting eight principles that combine into what Saleh and Shukairy call the "Conversion Framework." They believe that understanding this framework correctly allows you to apply it judiciously and continue to benefit from conversion optimizing efforts both online and offline. They want to teach a man to fish, not simply feed him.
Here again is where the reader benefits from the approach of these specific authors. Rather than using ideas that are rooted in topical or fleeting "what is working now" type of thinking, Saleh and Shukairy want you to avoid the simple path, and learn something deeper - something that will continue to offer you value.
These concepts are explained and well supported by examples, numbers and facts. For example, when discussing the creation of personas, they are adamant to warn against getting lost in this effort and provide realistic numbers for you to use to keep your own efforts in-check. While they are encouraging the implementation of conceptual information, they offer guidelines and warnings that are much more concrete. They walk you slowly to the intersection of art and science.
After the Conversion Framework concepts are presented and supported in chapters 3-8, in chapter nine Saleh and Shukairy present you with 49 specific things to consider in optimizing your website. This part of the book is something very concrete that you can return to for any new project. While you may not want to do all 49 of these things to every effort, it is a safe bet that your best moves for most optimization projects are clearly detailed within them. I'd recommend a bookmark here.
One thing I like about these 49 specific things to address is that Saleh and Shukairy are candid about what to expect. If a change is not likely to produce much of a lift, they say so. This helps you approach your own efforts with additional perspective on potential results. You can save time by following the authors' expert advice.
Ultimately, the informational depth of a book like this should work to save you time and efforts - bringing an understood focus and purpose to your next move. By establishing a conceptual framework and then offering concrete, actionable items Saleh and Shukairy present a well-balanced and useful resource that achieves this purpose.
While I feel most people who work in selling products would benefit from the ideas presented in this book, the authors themselves offer a warning in the preface to answer the question, "Who Should Not Read This Book?"
We cast a wide net when we wrote this book, but there are a few people who might not enjoy it. Developers whose work stays far from the actual user of their application (i.e., developers of backend applications) aren't likely to enjoy this book. Those who believe that conversion optimization is only about testing may not like our approach to optimization. Finally, those who are looking for pure tactics and are not concerned with the theory behind conversion optimization might find some of the chapters in the book boring.
Personally, I believe that with the simple tone and structured logic in the way the concepts are presented, this is a quick read that offers a lot to gain. Having the 49 items to optimize as a reference-ready checklist simply adds to the overall value.
Consider this: brand new, this book (offered bundled in both print and e-format) retails for less than $40 US, and you can buy it as just an e-book for even less. This is a very small investment if even one idea in it pays off for you somewhere. If more of these ideas resonate, you may implement new strategies that increase your returns by thousands, or even hundreds of thousands, of dollars. The potential effect of conversion optimization cannot be overstated.
On-demand indexing was a great value-added feature for Google Site Search, but now it carries more risks than ever. Why? Google decides how many documents make their primary index, and if too many of your documents are arbitrarily considered "low quality" then you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products. In response Google goes out of its way to destroy your business. Awesome!
Part of the prescribed solution to the Panda update is to noindex content that Google deems to be of low quality. But if you tell GoogleBot to noindex some of your content while also using Google for site search, you destroy the usability of their site search feature by making that content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value-destructive than the arbitrary price jack Google Site Search recently did.
Cloaking vs. noindex, rel=canonical, etc. etc. etc.
Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google's algorithms are so complex that you literally have to break some of Google's products to be able to work with other Google products. How stupid! But a healthy reminder for those considering deeply integrating Google into your on-site customer experience. Who knows when their model will arbitrarily change again? But we do know that when it does they won't warn partners in advance. ;)
I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on their list of crawlers, questions about if they share the same user agent have gone unanswered, and even a blog post like this probably won't get a response.
That reflects yet another layer of hypocrisy: Google states that if you don't provide great customer service then your business is awful, while going to the dentist is more fun than trying to get any customer service from Google. :D
I was talking to a friend about this stuff and I think he summed it up perfectly: "The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will."
The book is called Poke The Box. It's about making a start. Seth encourages us to just jump in and do things. It doesn't matter if they go wrong, the important thing is to make the start. To break out of conservative patterns. It's a scatter-shot rant about the death of the industrial revolution, with Godin inciting us, over and over again, to take action.
Gotta say, I was a little disappointed by the book. It skates over the surface, doesn't really hang together, and recycles some pretty tired themes. This review amused me.
Or maybe this book is the start of something else Seth has in mind. I don't know. Having said that, I think the central point of the book is valuable, and that is to.....
Do you ever regret not buying a particular domain name? Or a particular site? Do you regret not having started a site in that niche that is now taking off? Do you ever feel you've missed the boat on affiliate marketing? Do you regret not going harder at SEO in the days when it was just that much easier?
I think a lot of us can relate. There are always regrets and missed opportunities.
We *could* have done some of these things. But, for whatever reason, we didn't. And we probably still find reasons not to make a start on things today. Chances are, we're going to regret not having started them when we look back five years from now, too.
Take Seth's advice, and just make the start on that thing you are thinking of doing.
Fail At Something
Often we don't start something because we're scared of failing. However, as we know, failure is a part of life. The old cliche, that the only way never to fail is never to try anything, rings true.
In SEO, one thing that might be good to start, if you're not doing so already, is some simple testing. Buy a few cheap domain names, add a little content, and try to get the site ranking for some obscure keyword term. As you don't really care about the keyword term, you can remain focused on pure SEO. If it fails to work, it doesn't matter; in fact, that tells you something about whatever technique you were using. Throw a few links at it. What happens? Does this fail to produce rankings? At least you know who not to get links from in the future!
This is something I've let slip lately, so I'm going to make a new start on it, too.
Do Something Worth Doing
Seth mentions Tom Peters, who wrote "In Search Of Excellence". Seth sees that Peters is frustrated, because people are hearing his message without embracing the thinking behind it. Being excellent isn't about working extra hard at doing what you're told; it's about making the leap and doing work you decide is worth doing.
Sometimes, the thing that enables us to keep going with a site is simply that we believe in it. Nobody else might be paying attention. The rankings are mediocre. No one is linking to it. But if we feel what we're doing is worthwhile, we're more likely to work through the rough patches when there is no other reward on offer. If we don't really believe in a project, it's hard to find the will to work through the inevitable challenges.
If Microsoft used their primary product to bundle other free products they were giving away to gain market leverage, Google would hoot and holler. Google demanded that Chrome be shown as an option in Europe when Microsoft was required to market their competitors via BrowserChoice.eu.
Yet if you visit YouTube with an old browser you can see that Google claims it isn't an advertisement, and yet somehow Internet Explorer didn't make the short list.
A new version of Microsoft Corp.'s Internet Explorer to be released Tuesday will be the first major Web browser to include a do-not-track tool that helps people keep their online habits from being monitored.
Microsoft's decision to include the tool in Internet Explorer 9 means Google Inc. and Apple Inc. are the only big providers of browsers that haven't yet declared their support for a do-not-track system in their products.
I have long been a fan of using multiple web browsers for different tasks. Perhaps the single best reason to use IE9 is that a large segment of your customer base will be using it. Check out how search is integrated into the browser and use it as a keyword research tool.
The second best reason to use it is that sending some usage data to Microsoft will allow them to improve their search relevancy to better compete with Google. As a publisher I don't care who wins in search, so much as I want the marketshare to be split more evenly, such that if Panda II comes through there is less risk to webmasters. Stable ecosystems allow aggressive investment in growth, whereas unstable ones retard it.
Speaking of Google, Michael Gray recently wrote: "They are the virtual drug dealers of the 21st century, selling ads wrapped around other people’s content, creating information polluted ghettos, and they will become the advertising equivalent of a drug lord poised to rule the web."
In the following video, Matt winces, as though he might have an issue with what he is saying. "We take our advertising business very seriously as well. Both our commitment to delivering the best possible audience for advertisers, and to only show ads that you really want to see." - Matt Cutts
How does this relate to Internet Explorer 9? Well let's look at what sort of ads Google is running:
I am not sure if that is legal. But even if it is, it is low brow & sleazier than Google tries to portray their brand as being.
If Microsoft did the same thing you know Google would cry. Ultimately I think Google's downfall will be giving Microsoft carte blanche to duplicate their efforts. Microsoft has deep pockets, fat margins, and is rapidly buying search marketshare. If Microsoft uses their browser as a storefront (like Google does), Internet Explorer has much greater marketshare than Chrome does.
I was looking for information about the nuclear reactor issue in Japan and am glad it did not turn out as bad as it first looked!
But in that process of searching for information I kept stumbling into garbage hollow websites. I was cautious not to click on the malware results, but of the mainstream sites covering the issue, one of the most flagrant efforts was from the Huffington Post.
AOL recently announced that they were firing 15% to 20% of their staff. No need for original stories or even staff writers when you can literally grab a third-party tweet, wrap it in your site design, and rank it in Google. In line with that spirit, I took a screenshot. Rather than calling it the Huffington Post I decided a more fitting title would be the plundering host. :D
We were told that the content farm update was to get rid of low quality web pages & yet that information-less page was ranking at the top of their search results, when it was nothing but a 3rd party tweet wrapped in brand and ads.
You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red. - Google's Amit Singhal
If you make it past Google's arbitrary line in the sand there is no limit to how much spamming and jamming you can do.
we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. - Matt Cutts
(G)arbitrage never really goes away, it just becomes more corporate.
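Singhal's hyperplane description is, at bottom, a plain linear classifier. Here is a minimal pure-Python sketch of the idea, using a perceptron; the 2D points and their "high quality" / "low quality" labels are invented purely for illustration:

```python
# Toy linear classifier: learn a line (a hyperplane in 2D) with one
# class of points on each side, in the spirit of Singhal's description.
# The sample points below are made up for illustration.

def train_perceptron(points, labels, epochs=100, lr=0.1):
    """Learn weights (w1, w2, bias) so sign(w.x + b) matches labels of +1/-1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            if (w1 * x1 + w2 * x2 + b) * y <= 0:  # misclassified: nudge the plane
                w1 += lr * y * x1
                w2 += lr * y * x2
                b += lr * y
    return w1, w2, b

def classify(weights, point):
    """Return +1 for one side of the learned plane, -1 for the other."""
    w1, w2, b = weights
    return 1 if w1 * point[0] + w2 * point[1] + b > 0 else -1

# "high quality" sites cluster on one side (+1), "low quality" on the other (-1)
points = [(2, 3), (3, 3), (2, 2), (-1, -2), (-2, -1), (-3, -2)]
labels = [1, 1, 1, -1, -1, -1]
w = train_perceptron(points, labels)
```

The catch the post describes is exactly what this sketch shows: once the plane is drawn, everything on the "good" side is treated as good, no matter what it does afterward.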
As bad as that sounds, it is actually even worse than that. Today Google Alerts showed our brand being mentioned on a group-piracy website built around a subscription model of selling third-party content without permission! As annoying as that feels, of course there are going to be some dirtbags along the way that you have to deal with from time to time. But now that the content farm update has gone through, some of the original content producers are no longer ranking for their own titles, whereas piracy sites that stole their content are now the canonical top-ranked sources!
Google never used to put piracy sites on the first page of results for my books, this is a new feature on their part, and I think it goes a long way to show that their problem is cultural rather than technical. Google seems to have reached the conclusion that since many of their users are looking for pirated eBooks, quality search results means providing them with the best directory of copyright infringements available. And since Google streamlined their DMCA process with online forms, I couldn’t discover a method of telling them to remove a result like this from their search results, though I tried anyway.
... I feel like the guy who was walking across the street when Google dropped a 1000 pound bomb to take out a cockroach - MorrisRosenthal
Take a look at what Matt Cutts shares in the following video, where he tries to compare brand domain names vs keyword domain names. He highlights brand over and over again, and then when he talks about exact match domains getting a bonus or benefit, he highlights that Google may well dial that down soon.
Now if you are still on the fence, let me just give you a bit of color: we have looked at the rankings and the weights that we give to keyword domains, & some people have complained that we are giving a little too much weight for keywords in domains. So we have been thinking about adjusting that mix a bit and sort of turning the knob down within the algorithm, so that given 2 different domains it wouldn't necessarily help you as much to have a domain name with a bunch of keywords in it. - Matt Cutts
It is believed that Google requires participating hotels to provide Google Maps with the lowest publicly available rates, for stays of one to seven nights, double occupancy, with arrival days up to 90 days ahead.
In a world where Google has business volume data, clientele demographics, pricing data, and customer satisfaction data for most offline businesses they don't really need to place too much weight on links or domain names. Businesses can be seen as being great simply by being great.*
(*and encouraging people to stuff the ballot box for them with discounts :D)
Classical SEO signals (on-page optimization, link anchor text, domain names, etc.) have value up until a point, but if Google is going to keep mixing in more and more signals from other data sources then the value of any single signal drops. I haven't bought any great domain names in a while, and with Google's continued brand push and Google coming over the top with more ad units (in markets like credit cards and mortgage) I am seeing more and more reason to think harder about brand. It seems that is where Google is headed. The link graph is rotted out by nepotism & paid links. Domain names are seen as a tool for speculation & a short cut. It is not surprising Google is looking for more signals.
How have you adjusted your strategies of late? What happens to the value of domain names if EMD bonus goes away & Google keeps adding other data sources?
Google and Bing admitted publicly to having ‘exception lists’ for sites that were hit by algorithms when they should not have been. Matt Cutts explained that there is no global whitelist, but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites.
The idea that "sites rank where they deserve, with the exception of spammers" has long been pushed to help indemnify Google from potential anti-competitive behavior. Google's marketing has further leveraged the phrase "unique democratic nature of the web" to highlight how PageRank originally worked.
But why don't we conduct a thought experiment for the purpose of thinking through the differences between how Google behaves and how Google doesn't want to be perceived as behaving.
Let's cover the negative view first. The negative view is that either Google has a competing product, or a Google engineer dislikes you and goes out of his way to torch your stuff simply because he is holding onto a grudge. Given Google's current monopoly-level marketshare in most countries, such behavior would be seen as unacceptable if Google were just picking winners and losers based on their business interests.
The positive view is that "the algorithm handles almost everything, except some edge cases of spam." Let's break down that positive view a bit.
Off the start, consider that Google engineers write the algorithms with set goals and objectives in mind.
Google only launched universal search after Google bought Youtube. Coincidence? Not likely. If Google had rolled out universal search before buying Youtube then they likely would have increased the price of Youtube by 30% to 50%.
Likewise, Google trains some of their algorithms with human raters. Google seeds certain questions & desired goals in the minds of raters & then uses their input to help craft an algorithm that matches their goals. (This is like me telling you I can't say the number 3, but I can ask you to add 1 and 2 then repeat whatever you say :D)
At some point Google rolls out a brand-filter (or other arbitrary algorithm) which allows certain favored sites to rank based on criteria that other sites simply can not match. It allows some sites to rank with junk doorway pages while demoting other websites.
To try to compete with that, some sites are forced to either live in obscurity & consistently shed marketshare in their market, or be aggressive and operate outside the guidelines (at least in spirit, if not on a technical basis).
If the site operates outside the guidelines there is potential that they can go unpenalized, get a short-term slap on the wrist, or get a long-term hand issued penalty that can literally last for up to 3 years!
Now here is where it gets interesting...
Google can roll out an automated algorithm that is overly punitive and has a significant number of false positives.
Then Google can follow up by allowing nepotistic businesses & those that fit certain criteria to quickly rank again via whitelisting.
Sites doing the same things as the whitelisted sites might be crushed for doing the exact same thing & get a cold shoulder upon review.
You can see that even though it is claimed "TheAlgorithm" handles almost everything, they can easily interject their personal biases to decide who ranks and who does not. "TheAlgorithm" is first and foremost a legal shield. Beyond that it is a marketing tool. Relevancy is likely third in line in terms of importance (how else could one explain the content farm issue getting so out of hand for so many years before Google did something about it?).
When Google did the Panda update they highlighted that not only did some "low quality" sites get hammered, but that some "high quality" sites got a boost. Matt Cutts said: "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."
Here is the problem with that sort of classification system: doorway pages.
The following Ikea page was ranking page 1 in the search results for a fairly competitive keyword.
Once you strip away the site's navigation there are literally only 20 words on that page. And the main body area "content" for that page is a link to a bizarre, confusing, and poor-functioning flash tour which takes a while to load.
If you were trying to design the worst possible user experience & wanted to push a "minimum viable product" page into the search results, then you really couldn't do much worse than that Ikea page (at least not without delivering malware and such).
I am not accusing Ikea of doing anything spammy. They just have terrible usability on that page. Their backlinks to that page are few in number & look just about as organic as they could possibly come. But not that long ago companies like JC Penney and Overstock were demoted by Google for building targeted deep links (links they needed in order to rank, but which were allegedly harming search relevancy & the Google user experience). Less than a month later Google arbitrarily changed their algorithm to where other branded sites simply didn't need many (or in some cases any) deep links to get in the game, even if their pages were pure crap.
We are told the recent "content farm" update was to demote low quality content. If that is the case, then how does a skeleton of a page like that rank so high? How did that Ikea page go from ranking on the third page of Google's results to the first one? I think Google's classifier is flashing a new set of exploits for those who know what to look for.
A basic tip? If you see Google ranking an information-less page like that on a site you own, that might be a green light to see how far you can run with it. Give GoogleBot the "quality content" it seeks. Opportunity abound!
There are so many competitive research tools on the market. We reviewed some of the larger ones here but there are quite a few more on the market today.
The truth is that you can really get a lot of good, usable data to give you an idea of what the competition is likely to be by using free tools or the free version of paid tools.
Some of the competitive research tools out there (the paid ones) really are useful if you are going to scale way up with some of your SEO or PPC plans but many of the paid versions are overkill for a lot of webmasters.
Choosing Your Tools
Most tools come with the promise of “UNCOVERING YOUR COMPETITOR'S BEST _____”.
That blank can be links, keywords, traffic sources, and so on. As we know, most competitive research tools are rough estimates at best and almost useless estimates at worst. Unless you get your hands on your competition’s analytics reports, you are still kind of best-guessing. In this example we are looking for the competitiveness of a core keyword.
Best-guessing really isn’t a bad thing so long as you realize that what you are doing is really triangulating data points and looking for patterns across different tools. Keep in mind many tools use Google’s data so you’ll want to try to reach beyond Google’s data points a bit and hit up places like:
The lure of competitive research is to get it done quickly and accurately. However, gauging the competition of a keyword or market can’t really be done with a push of the button as there are factors that come into play which a push-button tool cannot account for, such as:
how hard is the market to link build for?
is the vertical dominated by brands and thick EMD’s?
what is your available capital?
are the ranking sites knowledgeable about SEO or are they mostly ranking on brand authority/domain authority? (how tight is their site structure, how targeted is their content, etc)
is Google giving the competing sites a brand boost?
is Google integrating products, images, videos, local results, etc?
Other questions might be stuff like "how is Google Instant skewing this keyword marketplace?" or "is Google firing a vertical search engine (like local) for these results?" or "is Google placing 3 AdWords ads at the top of the search results?" or "is Google making inroads into the market?" like they are with mortgage rates.
People don't search in an abstract mathematical world, but with their fingers and eyes. Looking at the search results matters. Quite a few variables come into play which require some human intuition and common sense. A research tool is only as good as the person using it; you have to know what you are looking at & what to be aware of.
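Since every tool is only a rough estimate, the triangulation idea above can be sketched in a few lines. The tool names and difficulty scores below are invented for illustration; the point is to compare several rough estimates rather than trust any single one:

```python
import statistics

# Hypothetical difficulty scores (0-100) for one keyword, as reported by
# several tools. Names and numbers are made up for illustration.
estimates = {"tool_a": 62, "tool_b": 55, "tool_c": 71, "tool_d": 58}

def triangulate(scores):
    """Return (median, spread, verdict) across several rough estimates."""
    values = sorted(scores.values())
    mid = statistics.median(values)
    spread = values[-1] - values[0]
    # A wide spread means the tools disagree: lean on a manual SERP review.
    verdict = "agreement" if spread <= 20 else "disagreement - eyeball the SERP"
    return mid, spread, verdict

mid, spread, verdict = triangulate(estimates)
```

When the spread is tight you can trust the consensus figure a bit more; when it is wide, that is precisely the case where the human judgment described above has to take over.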
Getting the Job Done
In this example I decided to use the following tools:
So we are stipulating that you’ve already selected a keyword. In this case I picked a generic keyword for the purposes of going through how to use the tools. Plug your keyword into Google, flip on SEO for Firefox and off you go!
This is actually a good example of where a push button tool might bite the dust. You’ve got Related Search breadcrumbs at the top, Images in the #1 spot, Shopping in the #3 spot, and News (not pictured) in the #5 spot.
So wherever you thought you might rank, just move yourself down 1-3 spots depending on where you would be in the SERPs. This can have a large effect on potential traffic and revenue, so you'll want to evaluate the SERP prior to jumping in.
You might decide that you need to shoot for 1 or 2 rather than top 3 or top 5 given all the other stuff Google is integrating into this results page. Or you might decide that the top spot is locked up and the #2 position is your only opportunity, making the risk to reward ratio much less appealing.
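As a rough sketch of why that 1-3 spot shift matters so much, assume an illustrative click-through-rate curve (the percentages below are assumptions for the sake of the example, not published figures):

```python
# Assumed CTR by effective SERP position: illustrative numbers only.
CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_visits(searches_per_month, organic_position, serp_shift=0):
    """Estimate monthly visits when universal results (images, shopping,
    news) push the organic slot down by `serp_shift` positions."""
    effective = organic_position + serp_shift
    return int(searches_per_month * CTR.get(effective, 0.03))

# 10,000 monthly searches: "ranking #2" is worth far less when images
# and shopping results push you to an effective #4.
plain = estimated_visits(10000, 2)               # clean SERP
pushed = estimated_visits(10000, 2, serp_shift=2)  # two universal inserts above you
```

Whatever CTR curve you believe in, the shape of the conclusion is the same: two universal-search inserts above you can cut the traffic behind the "same" ranking roughly in half.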
With SEO for Firefox you can quickly see important metrics like:
Yahoo! links to domain/page
Open Site Explorer and Majestic SEO link data
presence in strong directories
potential, estimated traffic value from SEM Rush
Close up of SEO for Firefox data:
Basically by looking at the results page you can see what other pieces of universal search you’ll be competing with, whether the home page or a sub-page is ranking, and whether you are competing with brands and/or strong EMD’s.
With SEO for Firefox you’ll see all of the above plus the domain age, domain links, page links, listings in major directories, position in other search engines, and so on. This will give you a good idea of potential competitiveness of this keyword for free and in about 5 seconds.
It is typically better & easier to measure the few smaller sites that managed to rank rather than measuring the larger authoritative domains. Why? Well...
Google's brand boost isn't something you can replicate if you are just starting out
analyzing smaller chunks of data is easier than analyzing huge sets of data
So now that you know how many links are pointing to that domain/page you’ll want to check how many unique domains are pointing in and what the anchor text looks like, in addition to what the quality of those links might be.
Due to its ease of use (in addition to the data being good) I like to use Open Site Explorer from SEOmoz for these cases of quick research. I will use their free service for this example, which requires no log-in, and they are even more generous with data when you register for a free account.
The first thing I do is head over to the anchor text distribution of the site or page to see if the site/page is attracting links specific to the keyword I am researching:
What’s great here is you can see the top 5 instances of anchor text usage, how many total links are using that term, and how many unique domains are supplying those total links.
You can also see data relative to the potential quality of the entire link profile in addition to the ratio of total/unique domains linking in.
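The total-links vs. unique-domains view described above can be computed from any backlink export. Here is a sketch with hypothetical rows; real data would come from a tool like Open Site Explorer or Majestic SEO:

```python
from collections import Counter, defaultdict

# Hypothetical backlink export: (anchor_text, linking_domain) pairs.
# The anchors and domains below are invented sample data.
backlinks = [
    ("blue widgets", "siteA.com"), ("blue widgets", "siteA.com"),
    ("blue widgets", "siteB.com"), ("widgets", "siteC.com"),
    ("click here", "siteD.com"), ("blue widgets", "siteB.com"),
]

def anchor_profile(links):
    """Map each anchor text to (total links, unique linking domains)."""
    totals = Counter()
    domains = defaultdict(set)
    for anchor, domain in links:
        totals[anchor] += 1
        domains[anchor].add(domain)
    return {a: (totals[a], len(domains[a])) for a in totals}

profile = anchor_profile(backlinks)
```

A big gap between total links and unique domains for one anchor (here, 4 links from only 2 domains for "blue widgets") is exactly the kind of pattern the ratio check above is meant to surface.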
You probably won’t want or need to do this for every single keyword you decide to pursue. However, when looking at a new market, a potential core keyword, or if you are considering buying an exact match domain for a specific keyword you can accomplish a really good amount of competitive research on that keyword by using a couple free tools.
Types of Competitive Research
Competitive research is a broad term and can go in a bunch of different directions. As an example, when first entering a market you would likely start with some keyword research and move into analyzing the competition of those keywords before you decide to enter or fully enter the market.
As you move into bigger markets and start to do more enterprise-level competitive research specific to a domain, link profiles, or a broader market you might move into some paid tools.
Analysis paralysis is a major issue in SEO. Many times you might find that those enterprise-level tools really are overkill for what you are trying to do initially. Gauging the competitiveness of a huge keyword or a lower-volume keyword really doesn't change based on the money you throw at a tool. The data is the data, especially when you narrow the research down to a keyword, a few keywords, or domains.
Get the Data, Make a Decision
So with the tools we used here you are getting many of the key data points you need to decide whether pursuing the keyword or keywords you have chosen is right for you.
Some things the tools cannot tell you are questions we talked about before:
how much capital can you allocate to the project?
how hard are you willing to work?
do you have a network of contacts you can lean on for advice and assistance?
do you have enough patience to see the project through, especially if ranking will take a while... can you wait on the revenue?
is creativity lacking in the market and can you fill that void or at least be better than what’s out there?
You can't learn great SEO from an e-book, or from buying software tools.
Great SEO is built on an understanding.
Reducing SEO To Prescription
One of the problems with reductive, prescribed SEO approaches (i.e. step one: research keywords; step two: put keyword in title; etc.) can be seen in the recent "Content Farm" update.
When Google decides sites are affecting their search quality, they look for a definable, repeated footprint made by the sites they deem undesirable. They then design algorithms that flag and punish the sites that use such a footprint.
This is why a lot of legitimate sites get taken out in updates. A collection of sites may not look, to a human, like problem sites, but the algo sees them as being the same thing, because their technical footprint is the same. For instance, a website with a high number of 250-word pages is an example of a footprint. Not necessarily an undesirable one, but a footprint nevertheless. Similar footprints exist amongst ecommerce sites heavy in sitewide templating but light on content unique to the page.
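As a toy illustration of how a footprint like "lots of 250-word pages" might get flagged algorithmically, consider checking what share of a site's pages fall under a word-count threshold (the page paths and word counts below are invented):

```python
# Invented sample data: page path -> word count for one site.
pages = {
    "/about": 480,
    "/product-1": 230,
    "/product-2": 245,
    "/product-3": 251,
    "/blog/guide": 1900,
}

def thin_content_share(word_counts, threshold=260):
    """Fraction of pages shorter than `threshold` words."""
    thin = sum(1 for words in word_counts.values() if words < threshold)
    return thin / len(word_counts)

share = thin_content_share(pages)
flagged = share > 0.5  # more than half the site is thin: a footprint
```

The threshold and cutoff are arbitrary here, which is the point of the passage: a crude rule like this catches legitimate sites and problem sites alike, because it only sees the footprint, not the intent.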
Copying successful sites is a great way to learn, but can also be a trap. If you share a similar footprint, having followed the same SEO prescription, you may go down with them if Google decides their approach is no longer flavor of the month.
The Myth Of White Hat
A lot of sites that get taken out are white hat i.e. sites that follow Google's webmaster guidelines.
It's a reasonably safe approach, but if you understand SEO, you'll soon realize that following a white hat prescription offers no guarantees of ranking, nor does it offer any guarantees you won't be taken out.
The primary reason there aren't any guarantees comes down to numbers. Google knows that when it makes a change, many sites will lose. They also know that many sites will win i.e. replace the sites that lost. If your site drops out, Google aren't bothered. There will be plenty of other sites to take your place. Google are only concerned that their users perceive the search results to be of sufficient quality.
The exception is if your site really is a one-of-a-kind. The kind of site that would embarrass Google if users couldn't find it. BMW, for example, in response to the query "BMW".
It's not fair, but we understand that's just how life is.
For those readers new to SEO, in order to really grasp SEO, you need to see things from the search engines point of view.
Firstly, understand the search engine's business case. The search engine can only make money if advertisers pay for search traffic. If it were too easy for sites that are likely to use PPC to rank highly in the natural results, then the search engine's business model would be undermined. Therefore, it is in the search engine's interest to "encourage" purely commercial entities to use PPC, not SEO. One way they do this is to make the natural results volatile and unpredictable. There are exceptions, covered in my second point.
Secondly, search engines must provide sufficient information quality to their users. This is an SEO opportunity, because without webmasters producing free-to-crawl, quality content, there can be no search engine business model. The search engines must nurture this ecosystem.
If you provide genuine utility to end users, the search engines have a vested interest in your survival, perhaps not as an individual, but certainly as a group i.e. "quality web publishers". Traffic is the lifeblood of the web, and if quality web publishers aren't fed traffic, they die. The problem, for webmasters, is that the search engines don't care about any one "quality publisher", as there are plenty of quality publishers. The exception is if you're the type of quality publisher who has a well recognized brand, and would therefore give the impression to users that Google was useless if you didn't appear.
Thirdly, for all their cryptic black box genius, search engines aren't all that sophisticated. Yes, the people who run them are brilliant. The problems they solve are very difficult. They have built what, only decades ago, would have been considered magic. But, at the end of the day, it's just a bit of maths trying to figure out a set of signals. If you can work out what that set of signals are, the maths will - unblinkingly - reward you. It is often said that in the search engine wars, the black hats will be the last SEOs standing.
Fourthly, the search engines don't really like you. They identified you as a business risk in their statement to investors. You can, potentially, make them look bad. You can undermine their business case. You may compete with their own channels for traffic. They tolerate you because they need publishers making their stuff easy to crawl, and not locking their content away behind paywalls. Just don't expect a Christmas card.
SEO Strategy Built On Understanding
Develop strategies based on how a search engine sees the world.
For example, if you're a known brand, your approach will be different from that of a little-known, generic publisher. There isn't really much risk you won't appear, as you could embarrass Google if users can't find you. This is the reason BMW were reinstated so quickly after falling foul of Google's guidelines, but the same doesn't necessarily apply to lesser known publishers.
If you like puzzles, then testing the algorithms can give you an unfair advantage. It's a lot harder than it used to be, but where there is difficulty, there is a barrier to entry to those who come later. Avoid listening to SEO echo chambers where advice may be well-meaning, but isn't based on rigorous testing.
If you're a publisher, not much into SEO wizardry, and you create content that is very similar to content created by others, you should focus on differentiation. If there are hundreds of publishers just like you, then Google don't care if you disappear. Google do need to find a way to reward quality, especially in niches that aren't well covered. Be better than the rest, but if you're not, slice your niche finer and finer, until you're the top dog in your niche. You should focus on building brand, so you can own a search stream. For example, this site owns the search stream "SEO Book", a stream Aaron created and built up.
Remember, search engines don't care about you, unless there's something in it for them.
Google tries to wrestle back index update naming from the pundits, naming the update "Panda". Named after one of their engineers, apparently.
The official Google line - and I'm paraphrasing here - is this:
Trust us. We're putting the bad guys on one side, and the good guys on the other
I like how Wired didn't let them off the hook.
Wired.com: Some people say you should be transparent, to prove that you aren’t making those algorithms to help your advertisers, something I know that you will deny.
Singhal: I can say categorically that money does not impact our decisions.
Wired.com: But people want the proof.
This answer, from Matt Cutts, was interesting:
Cutts: If someone has a specific question about, for example, why a site dropped, I think it’s fair and justifiable and defensible to tell them why that site dropped. But for example, our most recent algorithm does contain signals that can be gamed. If that one were 100 percent transparent, the bad guys would know how to optimize their way back into the rankings
Why Not Just Tell Us What You Want, Already!
Blekko makes a big deal about being transparent and open, but Google have always been secretive. After all, if Google want us to produce quality documents their users like and trust, then why not just tell us exactly what a quality document their users like and trust looks like?
Trouble is, Google's algorithms clearly aren't that bulletproof, as Google admit they can still be gamed, hence the secrecy. Matt says he would like to think there will come a time when they could open source the algorithms, but it's clear that time isn't now.
Do We Know Anything New?
So, what are we to conclude?
Google can be gamed. We kinda knew that....
Google still aren't telling us much. No change there....
Google have filed a patent that sounds very similar to what Demand Media does, i.e. looks for SERP areas that are under-served by content, and prompts writers to write for them.
The patent basically covers a system for identifying search queries which have low-quality content, and then asking either publishers, or the people searching for that topic, to create some better content themselves. The system takes search volume into account when assessing content quality, so for bigger keywords the content would need to be better before Google stops suggesting somebody else write something.
If Google do implement technology based on this patent, then it would appear they aren't down on the "Content Farm" model. They may even integrate it themselves.
How To Avoid Getting Labelled A Content Farmer
The question remains: how do you prevent being labelled as a low-quality publisher, especially when sites like eHow remain untouched, yet Cult Of Mac gets taken out? Note: Cult Of Mac appears to have been reinstated, but one wonders if that was the result of the media attention, or an algo tweak.
Google want content their users find useful. As always, they're cagey about what "useful" means, so those who want to publish content and rank well, but do not want to be confused with a content farm, are left to guess. And do a little reverse-engineering.
From a competition & market regulation perspective, that was a smart move for Google. They couldn't leave the site in the search results while handing out penalties to its competitors for the same behavior. As an added bonus, the site is building up tons of authoritative links in the background from all the buzz about being bought by Google. Thus when Google allows it to rank again in 30 days, it will rank better than ever.
Based on their web dominance, which generates such widespread media buzz, Google adds millions of pounds worth of inbound links to any website they buy.
The message Google sends to the market with this purchase is that you should push to get the attention of Google's hungry biz dev folks before you get scrutiny from their search quality team. After the payday the penalty is irrelevant because you already have cash in hand & the added links from the press mentioning the transaction will more than offset any spam links you remove. Month 1 revenues might be slightly lower, but months 2 through x will be far higher.
BeatThatQuote.com today was sold to Google for GBP37.7 million. We think this deal is a tremendous opportunity for our company to develop new and innovative options for personal finance in the UK. Our team is excited about becoming a part of Google. We look forward to working with their engineers to create new tools making it easier for consumers to choose the right financial products. We think
Remember how Overstock.com was recently penalized for offering discounts in exchange for links? BeatThatQuote partnered with Oxfam to create CompareForGood.com. The homepage consists of a bunch of links into BeatThatQuote.com. If you look at those links using our server header checker you will see some 301 redirects. Of course doorway pages are considered spam & we know that Google has torched some other affiliate programs for using 301 redirects.
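For readers who want to reproduce that kind of check themselves, here's a minimal header-checker sketch in Python (standard library only). This is not the tool mentioned above, just an illustration of the same idea; the throwaway local server exists only to make the demo self-contained, and you would point `check_status` at a real URL instead.

```python
# Check a URL's status line and Location header for a single request,
# WITHOUT following redirects -- the same check a server header checker does.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse

def check_status(url):
    """Return (status_code, Location header) for one request, no redirect following."""
    parsed = urlparse(url)
    conn = http.client.HTTPConnection(parsed.netloc, timeout=5)
    conn.request("HEAD", parsed.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

# --- demo only: a throwaway local server that 301-redirects everything ---
class RedirectHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(301)
        self.send_header("Location", "http://example.com/target")
        self.end_headers()
    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, location = check_status(f"http://127.0.0.1:{server.server_port}/")
print(status, location)  # 301 http://example.com/target
server.shutdown()
```

A 301 on a page whose only job is to host outbound links is exactly the doorway-page pattern described above.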
With that in mind, can anyone explain why Google's newest purchase is buying links like these?
Not so much a categorized listing with an editorial review...just a paid link for the sake of buying links to flow PageRank.
That one is only totally flagrant.
A bit off color. Like comment spamming.
Sorta like the link exchanges in German.
But some are even more outrageous. Consider that BeatThatQuote is buying links from pages with ad sections like
Paid Blog Reviews
Remember those "evil" paid reviews Matt Cutts wrote of? Plenty of those to go around ;)
In fact, some of the paid blog links were in place so long that BeatThatQuote got thank you's for advertising for over a year straight.
I don't have a decade of spam fighting experience like Google does. But is it too much to suggest that before Google buys *any* website they should do a basic compliance audit to verify that the site is operating within Google's TOS. I am an independent SEO and it took me all of 2 minutes to find numerous FLAGRANT violations.
What sort of message does Google send the market with the above behavior?
How Can Google Police Anyone?
Google has on multiple occasions penalized other UK based finance sites for SEO issues & link buying. But now that Google owns one of the horses in the race, and that horse has been found to be using anabolic steroids, can they legitimately penalize any of their new competitors?
If I had a UK finance site I would go on a link buying binge right now. Google can't penalize you for it because they are doing the same damn thing. And if they do penalize it for DOING THE EXACT SAME THING GOOGLE IS DOING then you know you have a legitimate gripe for the media, and I have no doubt Microsoft would be willing to help pick up the legal tab.
Google Eats Microsoft's Lunch Again!
Ultimately this is a body blow to Microsoft. Microsoft started to gain momentum in search through verticalization, but has since backed off. Meanwhile Google took Microsoft's ball and ran it in for a touchdown (acquiring Metaweb, trying to buy ITA Software, and buying BeatThatQuote). And now one of MSN's portal ad partners is owned by Google:
Head of partnerships at MSN, Phil Coxon said, “At Microsoft Advertising, we’re passionate about collaborating with brands to create compelling advertising campaigns. By providing new and exclusive content that appeals to consumers, this partnership both enhances the overall MSN user experience as well as providing a great platform for BeatThatQuote to engage with their target audience on a meaningful level. This deal builds on our previous partnership with BeatThatQuote, which led to a 400% increase in revenue generated from insurance products. We’re delighted to continue to build on this relationship with this new campaign.”
There is a saying in the bond trading market that if you don't know who the clown in a deal is then look in the mirror because it is probably you. Business is the same way. Almost everyone gets taken for a ride at least once.
What is Ignorance?
Ignorance is often viewed as a condescending word, but it is how we are all born. It is only through learning and experience we are able to do much more than survive. Any time you enter a new market or use a new strategy you start out from behind. You are the sucker who is losing money. Rarely does the new guy win just by showing up, or just by copying someone else's existing strategy. There has to be some point of differentiation.
A Brutal Uphill Climb
The leader has more data, more connections, more links, more capital, higher visitor value, and the algorithms have another layer of karma built over the top of them as well. Matt Cutts described part of the Panda update as "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."
Roadblocks & Pot Holes Are Everywhere
Based on those sorts of disadvantages, why would anyone want to try SEO? Well in almost any other business model similar roadblocks and pain points exist, and SEO allows one to build momentum over time without it being an all or nothing risk. The slow buildup can lead you toward success in ways you may not have anticipated. And the cost of failure is often little more than time. Plus you gain knowledge even when something fails.
I talk to lots of startups and almost none that I know of post-2008 have gained significant traction through SEO (the rare exceptions tend to be focused on content areas that were previously un-monetizable). Google keeps its ranking algorithms secret, but it is widely believed that inbound links are the preeminent ranking factor. This ends up rewarding sites that are 1) older and have built up years of inbound links 2) willing to engage in aggressive link building, or what is known as black-hat SEO.
A similar blog headline flipped around might read like "Most VC funded companies fail & founders get hosed on equity dilution, so getting funded is no longer a viable company formation strategy for startups." Of course something like that would be laughable, but it is no less absurd than saying SEO is no longer viable.
Sure coming from behind is hard, but the above misses that
many of the most profitable SEO plays are reinvesting into growth
most people who are successful with SEO do not like to attribute their success to it because doing so creates additional risks & more competition
Unique Market Approaches
Even treading water in a market where competitors are reinvesting profits & the market maker is tilting the table is quite respectable. If you want to come from behind and exactly clone someone else's business model, it won't likely be profitable. But that is why people attack markets from different perspectives. This is no different from why there are many different graphs. Chris isn't trying to beat Google by creating another link graph, but is looking at different signals.
Tectonic Shifts in Relevancy
Likewise marketing strategies can be vastly different between different companies and different projects within a company. Certain types of pages & certain types of websites rise and fall as the algorithms are adjusted to close down opportunistic loopholes. But as they make certain things harder they make other things easier. The whole content farm model was only enabled by an excessive weighting on domain authority & the introduction of rel=nofollow.
That opportunity may have fallen by the wayside. Many content mills just got hit pretty hard.
Was The Pain Really That Bad?
But for all the bluster about how it was one of the biggest changes in years, most of the content farms are only down maybe 20% to 50% in terms of traffic & revenues.
Sure that is a lot of revenue to disappear, but when you are operating at 80% net margins you can do that without it destroying your company. And this doesn't even take into account that many of these sites had a clean double over the past year. So if you grow 100% then lose 50% you are still even year on year, in spite of being penalized. Not bad in an environment where tons of businesses are going bankrupt offline.
And of course those sites getting whacked create opportunity for other folks, who build sites using different strategies.
A Cautionary Tale
About a half-decade ago the CEO of a startup contacted me & had us build a few links for them. Then they had to get their VCs' approval for a full in-depth strategic review, because it was going to cost well into 5 figures. Their VC investors didn't believe in SEO!
So that killed the project.
This company had a multi-lingual site where their leading market's content was only accessible through a drop down form where the URLs did not change. Fixing that issue to make the site crawlable would have produced more revenues in the first few months than the cost of our contract. But the VC didn't think SEO was valuable. They never got that tip. And for businesses which have network effects built in, losing $x today can easily be $10x or $20x a few years out.
Current Market Leaders Were Yesterday's Gray Area Marketers
Mr. Dixon also highlights how established TripAdvisor is, but when they were founded they were the small dog just starting out. His article also fails to mention that TripAdvisor was Text-Link-Ads' largest customer. In other words, they came from behind, took a calculated risk, and won. They backed off from the risks when the risks started to exceed the opportunity.
The entire 250+ page document is devoid of any discussion of incoming links which is the cornerstone of search engine optimization. By reading through the lines, it appears that they have two primary sources for link development for their owned and operated sites: (1) from their “undeveloped websites” and (2) from their content partner sites. Although these two initiatives alone are generally not financially profitable, they are successful approaches to maximizing the incoming link equity in their owned and operated properties.
The point is that startups shouldn't avoid all risk, but they should pick and choose their spots. The above sites are billion-dollar enterprises because they worked in the gray area to catch up & build a lead, and then pulled away from risk after they had a strong market position.
As time passes the opportunities change, but they don't really disappear.
If the Google Farmer update doesn't show you the unfortunate amount of low-quality noise in the SEO industry then there is no hope for you young jedi. :)
It's not unlike the unbelievable noise that surrounds an upcoming Apple product launch. In the interest of full disclosure I happen to be an Apple-ite but the coverage is even nauseating to me.
My poor RSS reader and my Twitter stream came under siege these last few days with the ramp up to the iPad 2 launch and the Google algo update.
This inspired me, after hitting the delete button about 432 times in my RSS and scrubbing the Twitter list, to sit back and review how I consume information, where I consume it from, and who is really worth "my time".
Repeat, Re-tweet, Rinse
Technology blogs and SEO blogs are much different in terms of the availability of content that can be churned out on a daily basis, as you know. There is so much more to choose from with tech but there still is this herd mentality which leads to someone saying "The iPad 2 will have a camera" 15 different ways.
With SEO, it is pretty tough to churn out daily content that is:
without a lot of conjecture
worthy of your time
Sure, SEO changes like any other industry, but sometimes you read some of these blogs and you have to wonder: how much factual, data-driven information went into the content? Or is the point stretched to a level where any independent analysis would torch the theory in a matter of minutes?
Show Me The Money!
Something I started doing a bit before this wake-up call, which is now helping me whittle down what I consume, was to make notes of techniques or tips that were mentioned (noting the source), then implement those tips while watching to see whether they made any difference (positive or negative).
Also, try and pay attention to trend predictions and industry predictions.
The ones that are usually spot on are probably worth more of your time.
One thing I noticed while doing that was some of the information was simply being either re-tweeted, or republished with thin commentary, or referenced with essentially the same content but spun a different way with different industry language.
The problem was that many of the blogs or sites occasionally had a good point or three but the vast majority were just kind of "meh". I don't mean that in a disparaging way but I think if the goal of the writer is to publish frequently then so be it, but it isn't a necessity in my opinion and it can actually hurt the quality of the content if the writer feels like daily or semi-hourly publishing is required of them.
I figure that if you are going to spend time reading or paying attention to someone, you ought to pay attention to how often you skim over their stuff versus how often you actually read it and benefit from it.
Authors That Branch Out
As SEO becomes more and more a part of a holistic view of marketing your business or site, it might be a good move to look at people who can write intelligently about SEO as well as what else goes into web marketing. Things like:
web design and/or development
using popular cms frameworks
domain buying, selling, and domain names
and the many other things a typical SEO or webmaster might be interested in
I'll give you one of my favorite blogs to read (outside of SeoBook of course :D ), Michael Gray AKA Graywolf over at Wolf-Howl.Com. His blog covers many aspects of the web marketing industry and has provided me with some extremely useful advice and tips.
Looking at the homepage of the site today he's covering Raven SEO Tools, How to Choose a Domain Name, a review of a Social Media tool, some Facebook tips for small and local businesses, and a couple of posts on SEO factors.
It's a solid example of a really well-rounded blog which gives actionable information, tips, and strong opinions.
A site that I like as sort of an all in one solution is Search Engine Land. Solid news round ups, excellent guest writers, and a group in tune to what's going on in the world of search marketing.
Many of you might subscribe to these ones already, but if not you should take a peek. :)
Do They Have Something (of value) to Say?
Twitter is probably the worst in terms of noise if you don't engage in some strategic filtering or unfollowing. A stream can quickly get littered with a bunch of RT's with posts about how nice the weather is outside.
Don't get me wrong, I don't mind the personal or non-work tweets (in fact sometimes they are a nice break from the monotony of the day as a webmaster) but if you notice that the person you are following is basically a re-tweet machine then it might be time to move on.
The web and social media let you interact with other folks in your industry in a way that makes it seem like you are bosom buddies with your (fill in a number) followers on Twitter, or the people you interact with in a community.
The hard, sobering fact is that quite a few people have nothing to say professionally that really is of any true business value to you (and why would you care what they are doing over the weekend?).
There are thought leaders in every space who consistently put out good stuff, but thought leaders are few and far between. We live in a superficial, ME ME ME, celebrity world.
People want to be heard, seen, adored, revered, etc. It's really easy to spot thought leaders but you also have to be able to weed through people who look like thought leaders just because they have a high Twitter follower count.
It's easy to separate out noise though. Pay attention to who you are reading and following and really look at how much you are learning from that person or group.
A Cleansed List & a Productive Day
I ended up cutting my RSS feeds by quite a bit, probably around 70% if I quickly look at the numbers. I follow a few SEO-centric blogs as well as some PPC blogs, a few Local SEO blogs, Google & Bing blogs, blogs specific to tools that I use, and some general business blogs/feeds.
I'm not a big Twitter user, because after the celebs/corporations/internet marketers/bots there is little left. Diversity is good, overwhelming noise is not.
You could spend all day reading theories or re-spun posts instead of getting the information from the cream of the crop and putting that data into action for your business. Some of the spots I no longer read weren't re-publishing houses but they simply didn't bring enough to the table consistently to warrant an investment of *my* time.
What about your time? Are you giving it away to places that do not deserve it?
How often do you ever hear the phrase worst practices? Probably never.
Everything is a best practice approach, right up until things change.
Consider AdSense websites.
Hey Look, a Case Study!
When you look at some of the biggest losers in the Google content farm update, many of them happened to be premium AdSense publishers which were even used by Google as case studies! For instance, Hub Pages or EzineArticles.
We are confident that over time the proven quality of our writers' content will be attractive to users. We have faith in Google's ability to tune results post major updates and are optimistic that the cream will rise back to the top in the coming weeks, which has been our experience with past updates - Paul Edmondson
The problem is that for many businesses there will be no bounce back. Some are simply over. The web has evolved & the algorithm has moved beyond them.
Where is the Much Needed Disclaimer?
What makes this worse is that when Google gives a site their premium AdSense feed & sets something up as a case study others will see that as an explicit endorsement.
THIS IS HOW YOU SHOULD DO IT.
Even after Google torches the companies that follow Google suggested best practices those case studies live on, offering what now amounts to maps to Google hell.
Adding Insult to Injury
What makes such filters/penalties even more infuriating is that in some cases when your site is slapped with a negative karma penalty, others who steal your content & wrap it in AdSense will outrank you, since their site does not yet have a negative karma penalty against it. :)
Individually the splog sites may not live long, but collectively they can keep outranking you to ensure you are invisible for your own words, even if you poured years of your life into creating something beautiful & important.
As we noted yesterday, Cult of Mac was collateral damage in Google’s war on crappy content farms. For some inexplicable reason, we got downgraded when Google tweaked its algorithms last Thursday.
But today we’re back in. We’re on Google News (a very important source of daily traffic) as well as Google’s general search results. However, we still get outranked by some of the scraper sites that steal our content, so not everything’s perfect.
That whooshing sound you just heard was MFA sploggers making a mad dash to steal content from the list of currently penalized sites.
Cult of Mac is lucky they had enough pull with the press to get reconsidered. Most webmasters who got hit did not, & anyone who has contracts based on set traffic levels, or tight margins which just turned negative, is in a pretty crappy situation. Yet another example of the importance of not fueling growth with debt & the importance of profit margins and a cash-on-hand safety net.
Who Are the Opportunistic Maximizers?
The problem with such an approach of maximizing everything you do to suck peak revenue out of the pageview is that things can change on a whim. I have seen some of Google's 1-on-1 AdSense optimization advice they sent to a friend of mine. I told my friend that the optimization advice was at best short-term opportunism that would end up crushing them in the long run if they actually implemented it.
Google doesn't care if following their advice torches your site if it makes them a bit more money, because ultimately there is another person standing in line waiting to follow.
My friend is lucky that they realized my advice was more trustworthy than the advice they were getting direct from Google. If they listened to Google back then their business might be destroyed today.
Google likes to position SEOs as exploiters out for the quick buck, but what honest analysis shows is that it is Google which is pushing the boundaries.
One of the worst hit sites in the AdSense farm update was WiseGeek. Sure WiseGeek must have had something like a 20% ad clickthrough rate. But with traffic falling 75%, maybe they would have been better off building a cleaner experience with a 5% CTR.
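The back-of-the-envelope arithmetic is worth making explicit. The 20% and 5% CTR figures and the 75% traffic drop come from the paragraph above; the 1,000-visit base is purely illustrative:

```python
# Compare ad clicks under the aggressive layout (post-penalty) vs. a
# hypothetical cleaner layout that kept the traffic. Visit count is illustrative.
visits = 1000

# Aggressive layout: 20% CTR, but the update cut traffic by 75%.
aggressive_clicks = (visits * 0.25) * 0.20

# Cleaner layout: 5% CTR on the full, unpenalized traffic.
cleaner_clicks = visits * 0.05

print(aggressive_clicks, cleaner_clicks)  # 50.0 50.0
```

In other words, on these figures the cleaner experience would have delivered the same click volume while keeping the audience, the brand equity, and the algorithmic goodwill.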
What was the most profitable best practices based approach suddenly falls short. And the results are not always predictable. When Google decided to attack content farms who honestly knew that:
somehow eHow.com would survive
yet somehow Google's "algorithmic" approach would punt 10,000's of smaller websites that have far higher content quality
In advance of the update I was fairly certain eHow would survive, but what I underestimated was the Google engineers. Or rather the ignorance of same. I simply couldn't imagine a content farm algorithm going live that missed eHow yet decimated the lives of so many independent webmasters.
I guess we can simply view this as an extension of Google's "you can have any web you want, so long as it is corporate"™ policy. I think Brett Tabke said it best in a recent AdSense thread:
When the rules and the enforcements are made up by monopolies in a make believe world - there is no cheating.
The only "cheatings" is when it gets outside the lines of the law. - Brett Tabke
AskTheBuilder is yet another Google AdSense case study. In spite of being a niche player well regarded in his community, Sistrix data shows the site off 87% after the most recent Google update!
Who Caused the Content Farm Problem?
Everyone likes to vilify the content farms and scrapers (and they deserve it) but the real villain behind all of this is CPC/CPM based advertising.
Can you imagine a world where your attention was sold off based on how long you stayed on a page rather than how often you switched pages? If Google wants to fix their search results, they should focus on fixing AdSense. The technology to more accurately measure a viewer's exposure to an ad is there; it just needs a trustworthy player to bring it to market. Someone trusted by both users and advertisers.
Google made click/impression-based advertising appealing to both groups and it made them what they are now. It's time to get away from it. - po
Google's apparent approach is to:
roll in an algorithm that aggressively penalizes tons of borderline edge cases
see who complains to the media & has connections with the media
fix the rankings of those who you like & those with sway, while ignoring the rest
Can You Trust Google?
All of this leads to the obvious question: can you trust Google?
The short answer is yes.
The long answer is you can *always* expect Google to do what is in the best interest of Google. As they plow into field after field (payments, local, mobile, ecommerce, mortgage, credit cards, travel, weddings, fashion, etc.) & use their search dominance to manipulate other markets one would have to be blind to view Google as anything other than a competitor.
Maybe not today. Maybe not tomorrow. But some day they will come. And it is never fun when it happens to you. :(
Until that day comes, if you always follow their best practices, just remember ... ;)
Put it this way: any algorithm that takes out Demand Media content is going to take out a lot of SEO content, too. SEO copywriting? What is that? That's what Demand Media do. As I outlined in the first paragraph, a lot of SEO content is not that different, and any algorithm that targets Demand Media's content isn't going to see any difference. Keyword traffic stream identical to title tag? Yep. A couple of hundred words? Yep. SEO format? Yep. Repeats keywords and keyword phrases a few times? Yep. Contributes to the betterment of mankind? Nope. SEOs need to be careful what they wish for....
There were a lot of sites following the SEO model of "writing for the keyword term" taken out, not just sites pejoratively labelled as "Content Farms". Ironically, the pinup example I used, Demand Media, got off lightly.
Some people have suggested there has been much collateral damage. Google have taken out legitimate pages, too.
What happened is that the pages that were taken out shared enough similarity to pages on Content Farms and the algorithm simply did what it was designed to do, although Google have admitted - kinda - that the change still needs work. The ultimate judgement of whether this is a good or a bad thing comes down to what Google's users think. Does Google deliver higher quality results, or doesn't it?
I'm pissed because we've worked our asses off over the last two years to make this a successful site. Cult of Mac is an independently owned small business. We're a startup. We have a small but talented team, and I'm the only full timer. We're busting our chops to produce high-quality, original content on a shoestring budget. We were just starting to see the light at the end of the tunnel. After two years of uncertainty, the site finally looks like it will be able to stand on its two feet. But this is a major setback. Anyone got Larry's cell number?
Scroll down, as there's also some very interesting comments in reply to that post.
This is nothing new, of course. It's been going on since search began. The search engines shrug, and send businesses that depend on them flying, whilst elevating others.
What can be done?
Spread The Risk
"Be less reliant on Google!", people say.
It's an easy thing to say, right, but what do you do when Google is the only search game in town? We know any business strategy that relies on an entity over which we have no control is high risk, but what choice is there? Wait for Bing to get their act together? Hope Blekko becomes the next big thing?
None of us can wait.
Sometimes, no matter how closely we stick to Google's Guidelines, Google are going to change the game. Whether it is fair or not is beside the point, it's going to happen.
So, we need to adopt web marketing strategies that help lessen this risk.
The best way to lessen this risk, of course, is to not rely on Google at all. Design your site strategy in such a way that it wouldn't grind to a halt if you blocked all spiders with a robots.txt. Treat any traffic from Google as a bonus. Such a strategy might involve PPC, brand building, offline advertising, social media, email marketing and the wealth of other channels open to you.
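For the thought experiment, "blocking all spiders" amounts to a two-line robots.txt (standard Robots Exclusion Protocol syntax); the point is to build a business that would survive it:

```
User-agent: *
Disallow: /
```

Nobody is suggesting you actually deploy this — it tells every well-behaved crawler to stay out of the entire site — but asking whether your business model could survive it is a useful stress test.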
Try the above as an academic exercise. If you had to operate without natural traffic, does your business still stand up? Are you filling a niche with high demand, a demand you can see in other channels? Is there sufficient margin to advertise, or does your entire model rely on free search traffic? Are there viral elements which could be better exploited? Are there social elements which could be better exploited?
Academic exercises aside, we can also look to mitigate risk. Think about not putting all your eggs in one basket. Instead of running one site, run multiple sites using different SEO strategies on each. Aaron talks about running auxiliary sites in the forum.
Try to get pages (articles, advertising) on other sites in your niche. If your site is taken out, at least you still have a presence in your niche, albeit on someone else's site. A kindly webmaster may even agree to repoint links to any new site you devise.
Do you have other ideas that help mitigate the risk? Add them to the comments.
It's An Advantage Being An SEO
Finally, be pleased you're an SEO.
SEO just got that much harder, and the harder it gets, the more your services are required, and the higher the barrier to entry for new publishers. Every day search is getting more complex. At the end of the day, it's an algorithm change. It can be reverse engineered, and new strategies will be adopted to maximize the opportunity it presents.
Until such a time as Google tells us exactly what they want to see, and rewards such content, SEOs will just keep doing what they do. And thank goodness Google isn't entirely transparent. If they were, the value of your SEO knowledge as a competitive advantage would plunge. For many of us, wages would quickly follow.
Sure, a short-term hit is painful, but the best SEOs will recover.
As they do, other content producers will be left scratching their heads.
I’m going to tell you why an SEO Book subscription, for many small businesses, is a much better investment than just hiring a firm or a freelancer.
We, as business owners, all realize that we need an online presence and the backbone of that presence is a top-notch SEO campaign.
Whether it be straight out SEO services, or help with Google Places, or help with reputation management, most small business owners realize they need to be “there” but aren’t quite sure how to do that properly.
You’re a small business owner, so am I and so are many members of our community & industry. Our work lives as small business owners are typically filled with parts of various roles like:
customer service representative
The problem is that SEO can be an abstract thing or idea for small business owners outside of the web marketing industry to grasp, learn, and implement correctly.
This problem leads to small businesses getting taken to the cleaners by either woefully inadequate (and expensive!) SEO firms, competing business models (like YellowPages & YellowBook) selling their version of SEO services due to the significant decrease in revenue from the phonebook model, or just plain snake oil salespeople.
There are many qualified SEO providers out there, tons actually. There's a lot of noise as well and when you don't have a clear understanding of the business it can be hard to discern one from the other.
Finding a Worthy SEO Provider
So if a small business owner is able to carefully avoid those situations and find a reputable SEO firm, chances are that the price for those services will be out of reach or just not economical from an ROI standpoint for some small businesses (unless the firm is hurting for business or it's a new firm starting out).
There’s nothing wrong with that, it’s just simple economics. If a service provider can sell their services for 6 or 5 figure contracts consistently, then it doesn’t behoove their business interest to sell services for 4 or 3 figure contracts.
A Better Option
Even if we stipulate that a business can afford to hire a firm to handle their SEO campaign, where it makes sense for both the provider and the buyer, we’d like to present another option.
That option would be an SEO Book subscription :) Compared to hiring an SEO company, your SEO Book subscription:
is less expensive, resulting in a much higher ROI for your business
is more direct and hands-on; you get unbiased feedback from hundreds of SEO professionals
gives the owner the ability to learn the ins and outs so they can manage things themselves
trains the business owner about the industry and best practices so they can intelligently outsource services themselves if they so choose
SEO Book Subscription Options
There are 2 types of SEO Book subscriptions, with different levels of access. The first option is for access to our (over 100) training modules and our premium SEO tool set. The cost of that option is just $69 per month.
The second option is for access to those same training modules and tools, in addition to our community forums. Our community is the cornerstone of our subscription-based membership service.
Inside the forums you have instant access to the most up-to-date, cutting edge information where you will learn from some of the best minds in SEO.
For the purposes of this post I’m going to focus on the option which includes everything.
How Much Would You Invest in You?
When you started your small business you probably thought (correctly) that it was a good idea to at least have a solid understanding of the key concepts related to your business prior to hiring staff to handle day to day tasks.
You probably learned how to operate and troubleshoot equipment, customer service software, the phone system, the coffee pot :) and so on. You likely know who your target market is and you know what type of message you want to convey via print and web design as well as sales copy.
Those are all things that you had to learn in order to grow your business and for your business to function properly.
You Are Your Business
By investing the time in yourself, and by extension your business, you were able to confidently hire and train staff as well as put together a traditional marketing campaign with the help of local print vendors and maybe your local web design person.
When it comes to something like SEO, where there is no formal education or “certification” (thank goodness), you might have a tough time hiring someone to do something you know very little about.
If you don’t know what works and what doesn’t, how will you know if the provider is selling you a bag of smoke versus providing an actual quality service? You won’t know, and given what a good SEO campaign from a reputable provider can cost, that can cause significant damage to your business.
You Are No Stranger to Hard Work
Investing time, practicing patience, and being willing to learn will reward you and your business many, many, many times over when it comes to the SEO industry. The fact is many people fail because they are lazy and unwilling to learn in addition to having a poor attitude.
You have probably persevered through that and are running a solid business, so why not get even further ahead of your competition, lazy or otherwise?
Breaking Down the Costs
Most SEO campaigns can expect to see results in or around 6 months, so we’ll look at the 12 month costs because you should consider SEO (just like traditional marketing) as an ongoing effort to produce results for your business.
From experience I can tell you that a full-on SEO campaign from an experienced SEO or SEO firm for small businesses will likely start at $5,000 per month here in the States. Probably higher for a firm, and that amount can fluctuate depending on your needs, but anything less than thousands per month is unlikely.
When I say full-on I mean the whole deal:
analytics reviews and implementation for testing, tracking, tweaking
on-page SEO (title tags, page copy, and so on)
off-page SEO, like link building
adjusting tactics based on rankings growth or decline (and competitor watching)
If you are new to the SEO space you may not know what some of that means, but you know it's important (or else you wouldn't be reading this). You know that your visibility on the web is probably a crucial component of your small business’s long-term success.
Would you really want to outsource that for what it might cost you for an employee or two, without knowing exactly what it is the provider is/should be doing?
Don’t Pay 17x More Than You Need To!
So even being conservative in my estimate, you are talking about around $60k per year and that probably doesn’t include additional money you may need for getting links to your site via branding and such.
Meanwhile, you could be investing just $3,600 per *year* in yourself and your business while learning from quite a few of the thought leaders in the SEO space. Perhaps not the biggest self-promoters in the space but certainly some of the best minds.
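As a quick sanity check on the "17x" figure, here is the arithmetic spelled out. The $5,000/month firm retainer and $300/month subscription price are this post's own estimates, not fixed market prices:

```python
# Yearly cost comparison using the figures cited in this post.
FIRM_MONTHLY = 5_000          # conservative full-service firm retainer
SUBSCRIPTION_MONTHLY = 300    # full-access subscription rate

firm_yearly = FIRM_MONTHLY * 12           # $60,000 per year
subscription_yearly = SUBSCRIPTION_MONTHLY * 12  # $3,600 per year

ratio = firm_yearly / subscription_yearly
print(f"Firm: ${firm_yearly:,}/yr vs subscription: ${subscription_yearly:,}/yr")
print(f"The firm route costs roughly {ratio:.0f}x more")
```

The ratio works out to about 16.7, which rounds to the 17x in the heading above.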
My dad always told me to be very wary of someone constantly telling you they are the best at XYZ, usually they aren’t. The ones who are the best are doing the job everyday and doing it well, not telling YOU how great THEY are.
It’s important to keep in mind that an SEO book subscription is going to give you the tools you need, the training you need, and more importantly the knowledge you need to be successful. Have a question?
Just ask it in the community forums and we’ll answer it. In fact, many people will answer it and you’ll get a wide range of tips from folks with loads of experience and success.
Now, the membership doesn’t mean that we’ll execute the plan for you but you’ll have a step by step guide on what to do, how to do it, why you’re doing it, and the tools you need to do it.
Plus, you have 24/7 access to the community forum which has hundreds of members and is quite active at all hours as we have members from all over the world.
But I Can Do it For Le$$....
There’s probably someone out there that will say “hey I can do that and do it well for like $2k a month”...ok, but even at that price point it’s $24k versus $3,600 per year!!
If you know what to do with your campaign you can easily outsource the “grunt” work at a much lower cost and become educated in a field that is very important now, and will be for the foreseeable future. I not only write this as an employee of SEO Book, but also as a person who was a customer for about a year before joining the site. During that time I helped get our company website squared away and learned how to automate or outsource many aspects of our business: from content, to promotion, to additional link development. And if you need help with any of that stuff, there is a requests forum where you can work with some of our members.
Heck, let's even say someone would run a full-service SEO campaign for you at the absurdly low price point of $500/mo! (not likely, given that some quality links cost $299 per year each). Even before link development that's still approaching *double* the cost of an SEO Book subscription, while giving you insight from only one person versus hundreds, no premium tools or training modules, no access to the latest information in the field, and no chance to learn SEO from independent, unbiased sources.
In our community you can not only find out what is working right now, but you can also find someone who can help you get the job done without paying for the markup associated with high pressure salesmen or large bureaucratic firms where 50 folks are taking home weekly paychecks for the work done by 5 people.
Discounts on SEO-Related Products
Your SEO Book subscription also comes with tons of discounts on everything from link management software, rank checking applications, SEO conferences, Pay Per Click communities like PpcBlog.Com, web directories which can help with getting exposure/links to your site, social media monitoring services, and many more solid services.
There’s literally *thousands* of dollars in discounts available to our members.
Time is Money, Money is Time
The benefit in outsourcing anything is the time saved and/or the low cost. However, there are typically significant costs (and sometimes irreparable harm) associated with outsourcing any important part of your business to unqualified providers.
Without having the knowledge of what it is you are actually hiring for, you cannot be certain what exactly you are paying for.
Save yourself a lot of money and headaches, learn from the best, and beat your competition in the search engines.
When and if the time comes to hire an SEO firm, you will be fully prepared to make the right decision for your small business. What else could you ask for?
We are now fully open. Our membership area of the site has 2 separate tiers. The first gives you access to our tools, training modules, and newsletters at $69 per month. The second tier gives you access to everything in the first tier plus our private member forums at $300 per month.