Clicky Web Analytics Interview

The field of web analytics is filled with free options, self-hosted options, open-source products, expensive options, and affordable paid solutions. If you are looking for an affordable, feature-rich, and easy-to-use web analytics package, you may want to check out Clicky.

Clicky also offers real-time reporting, a feature even some of the more popular services lack. You can find a comparison between Clicky and their competitors right on their home page. Currently you can only go back 6 months in the interface, so you'll want to make copies of your data every few months or so.

Recently we interviewed Sean and Noah over at GetClicky.com. Clicky is pretty popular with the members here, and it's a great alternative to Google Analytics.

Sean and Noah were kind enough to answer some questions about their business model, future plans, and the rich feature set within their product.

1. Is selling the company in your future plans? If so, how would data be protected in such a case? As an example, Tracking202 sold out to Bloosky, and that concerned many affiliates. Do you plan on selling a version of the software which can be hosted locally on the user's own server to get around worries associated with you possibly selling the service someday?

Selling the company is never out of the question; however, it would be inane and arrogant to plan solely for such an exit. We enjoy building Clicky and interacting with the Clicky community, and new owners usually have new agendas. Therefore, we prefer to keep Clicky rather than sell it. But if we did sell it, we would only do so under the condition that nothing changes for existing users. We do not have any plans to offer a self-hosted option.

2. Do you sell the data at all? How secure is the data? Some of our members pointed out that Clicky doesn't have an about us page and Roxr's site is thin on the "who you are" details. In dealing with certain search engines, a few folks in the SEO field like to carry around a tinfoil hat or three. Could you tell us a bit more about your company, infrastructure, etc.?

Under no circumstances do we sell our customers' data. Data is stored locally and only accessed by its respective owners. Our privacy policy states this, and we abide wholly by this unsigned "contract". We have never had any unauthorized access to customer data. We provide SSL login and encourage customers to use this feature.

Your members are correct; we don't mention the "people" behind Clicky. However, once a user registers for Clicky, he will quickly discover we are at his disposal. We usually respond to emails the same day; we collaborate with our users through our forums and blog; phone support is offered to our white label clients; and Sean and I are always a tweet away.

  • Sean - @schammy
  • Noah - @roxr
  • Clicky - @getclicky

We build, buy, and host our own servers. We chose this route from the beginning because it was cheaper in the long run and gave us more control. Processing hundreds of millions of clicks and billions of database queries daily is inherently too costly to lease out. Many people ask us why we don't move to the cloud. Cloud computing hosts are, in our opinion, new to the market and unproven. We have a system that works and is cost-effective.

And if there's any doubt about the quality or "trustworthiness" of our service or our company, just search Google for "getclicky" - you will find thousands of positive reviews and other things about us, and almost nothing negative. I think I've only ever seen 2, or maybe 3, "negative" articles about our service, and all of them were over something pretty silly - but people love to rant when they're mad.

3. Will (or can) Clicky get into intricate analytics tracking to the degree of being able to be relied on for multi-channel attribution analysis? Being able to track vanity URLs, special coupons, offer codes, etc.? Essentially, being able to track multiple offline and online campaigns?

We have full compatibility with Google Analytics campaign tags, which makes it all the easier for existing GA users to move to our platform. These campaign tags (as you probably know) allow you to easily track and segment visitors arriving at your site from any of your online campaigns. For offline campaigns, we also have a "manual" campaign feature, so you can enter a landing page URL, e.g. mysite.com/tv1, and we'll automatically flag all users who land on that page and report the campaign data together with your "dynamic" (GA) campaigns. We also have a custom data tracking feature so you can attach any data you want to any visitor session (e.g. if they used an offer code when submitting payment). And you can filter/segment your visitors based on this custom data too.
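For readers unfamiliar with the tags being referred to, here is a minimal sketch of what a campaign-tagged landing URL looks like. The utm_* parameter names are the standard Google Analytics campaign tags; the base URL and values are invented for illustration.

```python
from urllib.parse import urlencode

def tag_landing_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append the standard GA campaign tags to a landing page URL."""
    params = urlencode({
        "utm_source": source,      # where the traffic comes from, e.g. a newsletter
        "utm_medium": medium,      # the channel, e.g. "email", "cpc", "banner"
        "utm_campaign": campaign,  # your own name for the campaign
    })
    return f"{base_url}?{params}"

# Example: a tagged URL for an email promotion (values are made up)
print(tag_landing_url("http://mysite.com/landing", "newsletter", "email", "spring-sale"))
# http://mysite.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
```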

4. Do you see yourselves becoming an acquisition target for Google? What is to stop all the data currently collected by Clicky from ending up in the hands of Google (as an acquisition target maybe)?

It's certainly a possibility that Google may buy us, but we don't really expect it to happen. We believe strongly in privacy, so we would try to ensure the data is treated as private and not used to "improve" search results, as they do with Google Analytics. Of course, the trade-off with GA is that they let you use it for free in exchange for that. (They don't tell you that up front, but it's common knowledge they use GA data for all types of optimization, particularly search related.) If Google wanted to use the same model with Clicky, we would be against it on principle, but it would really depend on the specifics. And if Google insisted on it, then we'd insist on letting our users know about that kind of change so they could cancel their account if they wanted to.

5. Sort of a piggyback to question 3, but with Clicky's customization abilities, how far can one push the limit on segmentation, custom variables, and so on? Seems like lots of possibilities there, but to the non-techie folks it can be a daunting task. Do you plan to offer paid support, paid campaign setup, or maybe a "Clicky Authorized Provider" program to help people set up intricate analytics accounts?

There are really no limits on segmentation, other than that at this time you can only do it for a maximum date range of one month at a time. But other than that, you can segment your visitors on a theoretically unlimited number of variables.

Segmentation is actually one of our strong points, because you don't have to fill out crazy forms or anything to find the data you want. In almost every report, the items in the report are clickable (e.g. when viewing your top countries, you could click the US and immediately see only visitors from the US, including a summary of their activity at the top). And once this filtering is invoked, it's very simple to add additional variables via the blue drop-down menu at the top, e.g. referring domain = google.com, and then you would see all of your US visitors who arrived via Google.

We help users for free through email, our forums, and Twitter. We don't have paid support, but then again we never tell someone we won't help them, no matter what the problem is. We give higher priority to complex problems and questions from our paying users, but we still answer all support requests, whether or not the user is paying. Adding paid support may be something we do in the future if there's demand for it. We would have to expand our headcount first, though. Currently it's still just the two of us running this operation.

Thanks for the time guys!

Well, there you have it. Clicky has some pretty deep segmentation and tracking options, both of which are vital to a successful web analytics setup. We hope you learned a bit more about the company and the product via this Q&A. Clicky has a great support forum as well, for any questions you have as you start to get familiar with their product.

Interview With Anita Campbell in ~ 1 Hour :)

Hi everyone,
Anita Campbell will be interviewing me on her Small Business Radio program in ~ 1 hour and 15 minutes, at 1:30 PM Eastern.

I was on Small Business Trends Radio

Paid Content: The New Paid Link

Paid Links Are Spam

Buying links is considered spammy by Google because it is a ranking shortcut which subverts search relevancy algorithms.

And so Google considers it a black hat SEO practice.

Links are somewhat hard to scale because (outside of those who create a network of spam) it is time intensive to find the right sites, negotiate a price, and then ensure appropriate placement. It requires interacting with many webmasters and going through a lot of rejections to get a few yes responses. Due to these scale limitations, paid links typically exert only a slight influence on core industry keywords and common variations, limiting any potential relevancy damage.

Further, when a person buys a link, the relevancy is almost always guaranteed (as one would go broke fast renting links targeting irrelevant keywords).

Even so, Google hates paid links because they can lower result diversity & bias the organic search results away from being informational and towards being commercial (which in turn means that Google AdWords ads get fewer clicks).

Policing Paid Links

To make link building efforts easier to police, Google created nofollow, which aimed to disrupt the flow of link equity across certain links. Initially the alleged purpose was blocking comment spam. Comment spam never went away, but once nofollow was in place, the role of rel=nofollow quickly expanded into a cure-all to be placed on any paid link.

Google encouraged spam reports that highlight paid links. SEO blogs highlighted people that were buying links. Firms like Text Link Ads were eradicated from the Google index. And all was well in GoogleLand.

...Until...

The Rise of Content Farms

Over the past few years people realized that Google had dialed up the weight on domain authority and that links were now much harder to get. So companies started placing lead generation forms on trusted sites, and firms like Demand Media purchased highly trusted websites like eHow (which already had a ton of links in place from back when links were easier to obtain).

Demand Media then automated and streamlined the content production process and poured content into eHow until the returns on new content and the growth rate started to slow.

This type of strategy attacks the longtail of search, and given how many unique search queries there are each day, that amounts to a lot of opportunity!

Corporate Content Farming: The Art of Informationless Information

Anyone who has watched The Meatrix is likely afraid of factory farms. The content created by these content farms isn't much better. When I highlighted how bad one of the pieces was, their solution was to delete it and hide it from sight, then write a memo about how they do "quality" content at scale.

That scale part is no joke - Demand Media brought in over $200 million last year. And I suppose if they put the word "low" in front of quality, it wouldn't be a joke either.

Abusing Nofollow

These same authoritative websites, which managed to create content for $10 to $15 a page (or sometimes $0 auto-generated pages), then leveraged nofollow on *all* outbound links, so that they would not vote for anyone, even if their content was only a thin, watered-down rewrite of third-party content sources:

eHow is a content publisher known for “How To..” articles. Lately, it seems eHow visits other websites, scrapes their instructional content (on whatever topic), and republishes it as a How To article on eHow. Sometimes the entire step-by-step process is “copied” for the eHow article. I’ve noticed a few times this week, how eHow articles are basically copies of existing content from other sites, worse than Wikipedia rewrites. That’s pretty much “scraping”, even if done by poorly-paid human workers.
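To show how trivially such a blanket policy can be applied, here is a minimal sketch (assuming Python with BeautifulSoup) that stamps rel="nofollow" on every off-site link. It is illustrative only, not eHow's actual code, and the domain comparison is deliberately simplistic.

```python
from urllib.parse import urlparse
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def nofollow_outbound_links(html: str, own_domain: str) -> str:
    """Rewrite every link pointing off-site to carry rel="nofollow"."""
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        # Naive check: anything not on our own host passes no link equity.
        if host and host != own_domain:
            a["rel"] = "nofollow"
    return str(soup)

page = '<p><a href="http://example.com/source">original how-to</a></p>'
print(nofollow_outbound_links(page, "ehow.com"))
# <p><a href="http://example.com/source" rel="nofollow">original how-to</a></p>
```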

So now companies are building a wide range of "content" business models ranging from auto-generated content to semi-autogenerated mash-ups to poorly crafted manual rewrites (as mentioned above).

Content Scraping & Recycling as a Legitimate Business?

Even search engines are becoming general purpose scrapers, snagging third party content, mixing it together, wrapping it in ads, and pushing it into the index of other search engines.

The result?

Ask.com's share of search traffic rose 21% last month alone!

The Information Age

We are no longer in an “Information Age.” We are in the Age of Noise. Falsehoods, half-truths, talking points, out-of-context video edits, plagiarism, rewriting of history (U.S. was founded as a Christian nation, for example), flip-flops, ignoring facts (Cheney and torture for example), neatly packaged code words and phrases, media ratings focus, dysfunctional government (filibusters have more than doubled, but most don’t realize Republicans are blocking everything), mainstreaming fringe causes….I could go on and on. Is it any wonder why so many who are struggling with kids, jobs, rising medical costs, etcetera have such a tough time wading through all the crap? - source

Paranoid About Links

As building up your own profile has grown harder (since links are harder to get), many new web 2.0 websites provide free outbound links to help encourage participation and get links back into their websites. But then, after they reach a critical mass, they claim that spam is an issue and strip away the links using nofollow, stealing the hard work people did to build up the network and offering nothing in return!

Google's fear of links is *so out of hand* that an SEO simply mentioning that a person can get a link from their own profile page on a social site is enough to have Matt Cutts go out of his way to push the social media site to remove the opportunity. If you put a lot of work into building up a social profile, Google doesn't want you to benefit from that work, but it is fine if that network does:

If Google is the one who wants that web link nofollowed because some twitter profile pages may be automated bots or spammers, then it is time they realize that THEY are responsible for determining which of those individual pages is authoritative, trusted and legitimate enough to pass link popularity, by a method other than demanding that other websites and social networks change the ways they do business to help Google stop links being used as a form of currency and to manipulate their algorithm – an issue Google and Google alone created and profited from.

Any Form of Payment = Not Trustworthy

A few years back a well known SEO joined our training program, read our tip about using self-hosted affiliate programs as a link building tool, and then promptly outed us directly to Matt Cutts, in a video, and on their blog. Google quickly blocked our affiliate program from passing link juice. Later a Google engineer publicly stated affiliate links should count.

Since then affiliate links have been a gray area (it works for some companies and doesn't work for others, based on 100% arbitrary choices inside Google). Looking for clarification on the issue, Eric Enge recently asked Matt Cutts: "If Googlebot sees an affiliate link out there, does it treat that link as an endorsement or an ad?"

Matt Cutts responded with: "Typically, we want to handle those sorts of links appropriately. A lot of the time, that means that the link is essentially driving people for money, so we usually would not count those as an endorsement."

So links which are driven by payment should not count as endorsements, even if the affiliate does endorse and believe in the product. The fact that there is a monetary relationship means the link *should not count*.

The Elephant in the Room at the GooglePlex

Ignoring links for a moment, let's get back to the content mill business model. It was fine that Demand Media bought trusted (well-linked) sites like eHow to pour low-end content into, even though those pre-existing links were effectively bought by the new owner.

And here is where the content mill business model gets really shady, in terms of "what is good for the user" ... Demand Media is now licensing backfill content to be hosted on USAToday.com on a revenue share basis. Describing the relationship, Dave Panos, Demand Media's CMO, said "It's an opportunity for us to get in front of the audience that's already congregating around very well-known brands."

But you won't find that content on the USAToday.com homepage.

When he said "already congregating around very well-known brands" what he meant was "will rank well on Google." And so, what we have is a paid content partnership which subverts search relevancy algorithms.

If affiliate links shouldn't count, then why would affiliate content?

If Google doesn't stop it from day 1, then the media companies are going to quickly become addicted to the risk-free money like crack. And if Google tries to stop it *after* it is in place, then they are going to find themselves lambasted in the media amid talk of anti-trust concerns.

Something to think about before heading too far down that path.

Two Roads Diverged in a Wood...

How is a content exchange network any different than a link exchange network? The intent is exactly the same, even if the mechanics and payment terms differ slightly.

If a paid link that subverts search relevancy algorithms shouldn't count on the web graph, then why should Google trust paid content that subverts search relevancy algorithms?

Will the search results start filling up with similar sounding misinformed content ranking for 1, then 3, then 8 of the top 10 search results? Do the search results slowly get dumbed down one article and one topic at a time?

This trend *will* harm both the accuracy and diversity of content ranking in the search results. And it will grow progressively worse as people begin to quote the misinformed garbage on other websites (because hey, if it ranks in Google and is on USA Today it is *probably* true). Or is it?

Some questions worth thinking about:

  • Google is willing to truth police SEOs. Will they do the same for media outlets publishing backfill "content"?
  • How will Google be able to filter out the Demand Media content without filtering out the rest of the media sites?
  • Does Google care if the quality & diversity of the search results is diminished, even if/when most searchers will not be savvy enough to recognize it? I guess it depends on who has the last word on the issues inside Google, because most garbitrage content is wrapped in AdSense ads.

Beauty is Rare & Elusive, so it Can be Easily Sold

A couple years ago my wife and I had our big wedding in the Philippines (we even had the mayor of Manila show up). She was so beautiful that day. And lucky for me she is just as beautiful when she wakes up each day. :D

But she can be hard on herself and if she gains a single pound she worries. Truth is I am the chubby one who needs to drop weight.

Beauty (and the perception of it) is a wonderful commodity to sell because there is no limit: almost everyone could be in better shape, or be stronger, or eat healthier, or wish they didn't have this or that birthmark or the odd finger that bends backwards.

We are imperfect beings by our very nature.

We get sick.

We break.

And we all fight the battle of aging one day at a time - every single day!

But no matter where you go, whatever is rare is typically considered desirable and beautiful. This is not an accident, but a way to generate profits. If the human condition is flawed (and can't be fixed), then the person selling a bogus solution to that problem is going to make a lot more money than a person who sells something which is actually attainable.

And so we live in a world where we treat symptoms, rather than problems. Anything to make the numbers look good and make the sale. From there you are on your own! If you feel bad, we can give you more drugs!

Spending too much time at the computer and eating unhealthily has made me a bit too chubby. No good in obese America! But did you know that in certain times and cultures being fat was considered a sign of beauty, back when few people could afford to be fat! ;)

There is too much high fructose corn syrup in the typical American diet for obesity to be considered beautiful:

"Our findings lend support to the theory that the excessive consumption of high-fructose corn syrup found in many beverages may be an important factor in the obesity epidemic," Avena said.

The new research complements previous work led by Hoebel and Avena demonstrating that sucrose can be addictive, having effects on the brain similar to some drugs of abuse.

In the United States many girls not only label anorexia as beauty; some also go to tanning salons to darken their skin to look beautiful, at least until they get older:

Long-term exposure to artificial sources of ultraviolet rays like tanning beds (or to the sun's natural rays) increases both men and women's risk of developing skin cancer. In addition, exposure to tanning salon rays increases damage caused by sunlight because ultraviolet light actually thins the skin, making it less able to heal. Women who use tanning beds more than once a month are 55 percent more likely to develop malignant melanoma, the most deadly form of skin cancer.

A service which has no lasting positive tangible value AND certainly has a lasting negative tangible risk can grow to become a multi-billion dollar industry.

Anything to be beautiful! This is what beautiful people do. I want to be beautiful.

The above never really made sense to me and always felt a wee bit scammy. There was an odd odor to it, but it was hard to appreciate how scammy it was, until...

It really hit home for me when my wife and I were in the Philippines. Many of the department stores sell skin whitening soap! Having a lighter skin tone is supposed to be a sign that you are from a wealthier family. And since wealth is concentrated, lighter skin is rare. And so that is what is considered beautiful. :D

Are Content Mills the Future of Online Publishing? What Comes Next?

Aaron discussed content mills in his interview with Tedster yesterday.

What is a content mill?

A content mill is a site that publishes cheap content. The content is either user-contributed, paid, or a mix of the two. The term content mill is obviously pejorative, the implication being that the content is published only to pump pages into search engines and is typically of low quality.

The problem is that some sites that publish cheap content may well provide value, but it depends on who is reading it. For example, a forum might be considered a content mill, as it contains cheap, user-generated content of little value to a casual visitor, or it might be a valuable, regularly updated resource provided by a community of enthusiasts!

Depends who you ask.

As Aaron says, content mills are all the rage in 2010. Let's take a closer look.

Why Are SEOs Interested In Content Mills?

This idea is nothing new. It's actually a white-hat SEO strategy that has been used for years:

  • Research keywords
  • Write content about those keywords
  • Publish content and attempt to rank that content in search engine results
  • Repeat

If you can publish a page at a lower cost than your advertising return, then you simply repeat the process over and over, and you're golden. Think AdSense, affiliate offers, and similar means to monetize pages. Take a look at Demand Media.
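Here is a back-of-the-envelope sketch of that break-even logic; every number is invented for illustration, not Demand Media's actual economics.

```python
# All figures below are assumptions, not real content mill numbers.
cost_per_article = 15.00    # what a mill pays a freelancer per page
monthly_visits = 300        # search visits one article attracts per month
rpm = 8.00                  # ad revenue per 1,000 pageviews

monthly_revenue = monthly_visits / 1000 * rpm        # $2.40 per month
payback_months = cost_per_article / monthly_revenue  # ~6.3 months
print(f"Article pays for itself in about {payback_months:.1f} months, "
      f"then earns ${monthly_revenue:.2f}/month indefinitely.")
```

With assumptions like these, every article is pure margin after roughly half a year, which is why the model scales as long as production cost stays below the lifetime ad return.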

The Problem With Content Mills

One of the problems with content mills is that, in an attempt to drive the production cost of content below the predicted return, some site owners are producing garbage content, usually by facilitating free contributions from users.

At the low end, Q&A sites proliferate wherein people ask questions and a community of people with opinions, informed or otherwise, provide their two cents' worth. Unfortunately, many of the answers are worth somewhat less than two cents, resulting in pages of little or no value to an end reader. I'm sure you've seen such pages; they often rank well in search engines if they are published on a domain with sufficient authority.

Some sites, like Mahalo, not only automate their page creation, but use those automated pages to generate automated related-question pages as well. The rabbit hole has no bottom!

At the other end of the spectrum, we have sites that publish higher-cost, well researched content sourced from paid writers. A traditional publishing model, in other words. Generally speaking, such pages are of higher value to the end user, but the problem is that the search engines don't appear to be able to tell the difference between these pages and the junk opinion pages. If the content mill has sufficient authority, then the junk gets promoted.

And there are many examples in between, of course.

As Tedster mentioned, "the problem here is that every provider of freelance content is NOT providing junk - though some are. As far as I know, there is no current semantic processing that can sort out the two. It's tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn't for me. So I also assume it must be on Google's radar."

The Future Of Content Mills

I think Tedster is right - such sites will surely appear on Google's radar, because junk, low value content doesn't help their end users.

It must be a difficult problem to solve, else Google would have done so by now, but I think it's reasonable to assume Google will try to relegate the lowest of the low-value content sites at some point. If you are following a content mill strategy, or considering starting one, it would be wise to prepare for such an eventuality.

The future, I suspect, is not to be a content mill, in the pejorative sense of the word. Aim for quality.

Arbitrary definitions of quality are difficult enough, as we've discussed above. Objective measurement is impossible, because what is relevant to one person may be irrelevant to the next. The field of IQ (information quality), a research area in information systems management that deals specifically with information quality, may provide some clues regarding Google's approach.

Here are some of the metrics they use:

  • Authority - Authority refers to the expertise or recognized official status of a source. Consider the reputation of the author and publisher. When working with legal or government information, consider whether the source is the official provider of the information.
  • Scope of coverage - Scope of coverage refers to the extent to which a source explores a topic. Consider time periods, geography or jurisdiction, and coverage of related or narrower topics.
  • Composition and organization - Composition and organization has to do with the ability of the information source to present its particular message in a coherent, logically sequential manner.
  • Objectivity - Objectivity is the bias or opinion expressed when a writer interprets or analyzes facts. Consider the use of persuasive language, the source's presentation of other viewpoints, its reason for providing the information, and advertising.
  • Validity - Validity has to do with the degree of obvious truthfulness which the information carries.
  • Uniqueness - As much as 'uniqueness' of a given piece of information is intuitive in meaning, it also implies not only the originating point of the information but also the manner in which it is presented, and thus the perception which it conjures. The essence of any piece of information we process consists to a large extent of those two elements.
  • Timeliness - Timeliness refers to information that is current at the time of publication. Consider publication, creation, and revision dates.
  • Reproducibility

Any of this sound familiar? It should, as the search landscape is rife with this terminology. This is not to say Google looks at all these aspects, but they have used similar concepts, starting with PageRank.

As conventional SEO wisdom goes, Google may have tried to solve the relevancy problem partly by focusing on authority, on the premise that a trusted authority must publish trusted content, so the pages of a domain with a high degree of authority receive a boost over those with lower authority levels. But this situation may not last, as some trusted sources, in terms of having authority, do, at times, publish auto-gen garbage content. Google may well start looking at composition metrics, if they aren't doing so already.

This is speculation, of course.

I think a good rule of thumb, for the time being, is "will this page pass human inspection?" If it looks like junk to a human reviewer in terms of organization, and reads like junk in terms of composition, it probably is junk, and Google will likely feed such information back into their algorithms. Check out Google's Quality Rater Document from 2007, which should give you a feel for Google's editorial policy.

DIY SEO Software Reviews

Is DIY SEO any good? Does it work?

When I got to look at DIY SEO, my first thought was: good structure and layout, let's see what is under the hood. But after opening up the hood, I found a car with no engine.

For usability I would give the site a 9 or a 10, but in terms of utility it would be lucky to score as high as a 2 or a 3.

Google AdWords: The Cheapest SEM Strategy for Small & Local Businesses

Maybe there are some small businesses out there who are content being obscure, or who only want to rank for their own business name plus maybe 1 or 2 longtail keywords. But for those businesses I suggest bypassing SEO and buying a few Google AdWords ads.

  • Low traffic keywords are typically cheap to buy search ads on - because you only pay by the click. If few people are searching for something then there will be few clicks to buy.
  • Not only are such markets small, but due to their small size they are also heavily fragmented, making the AdWords traffic even cheaper.
  • If few people are searching for your brand then you can likely spend $25 a month on AdWords and skip learning SEO entirely; the rough math sketched below shows why.
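Here is the rough arithmetic behind that $25 figure, as a sketch; every number is an invented assumption you would replace with your own.

```python
# Invented example numbers for a low-volume local/brand keyword set.
monthly_searches = 400  # total searches across your brand + a few longtails
ad_ctr = 0.05           # share of searchers who click your ad
max_cpc = 1.00          # assumed average cost per click

expected_clicks = monthly_searches * ad_ctr   # 20 clicks
monthly_spend = expected_clicks * max_cpc     # about $20/month
print(f"~{expected_clicks:.0f} clicks for roughly ${monthly_spend:.0f} a month")
```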

A Legitimate SEO Strategy Requires Investment

With a paid search campaign, you can use Google AdWords to instantly buy search traffic and gain new customers. SEO is a drawn out strategy & typically requires a much deeper initial investment.

There is little value in investing in SEO unless your goal is to dominate your market and there is sufficient market scale to justify investing thousands of dollars (and far more when you consider the value of your time). After all, a single link from Business.com or the Yahoo! Directory will run you $299, and two links hardly make for an effective SEO strategy - but they will still set you back $600 a year.

And you don't get those links any cheaper just because your business is small. ;)

Why Does DIY SEO Offer Such a Weak SEO Solution?

When looking at DIY SEO it took me a while to figure out what bothered me, because I kept thinking "something is missing." Why did they raise funding to build THAT? But then it clicked: DIY SEO was designed by marketers looking to sell something that would be easy to sell at scale - it was not created out of passion to solve a real problem with the desire to help make a difference in people's lives.

The difference is not subtle.

After all, Andy is the guy who had time to build out hundreds of thin affiliate sites, while being too lazy (and lacking the concern needed) to fix his SEO blog for months while it installed malware on anyone who visited his site. That blog had the tagline Livin' the dream - but the dream is his alone; you get to live with the malware. He doesn't care.

Could you imagine a reputable SEO site like Search Engine Land, SEO Book, SEOmoz, or Search Engine Journal delivering malware for months without any care or concern? I can't.

SEO Consulting is Expensive

SEO is both time consuming and expensive. Neither Andy nor Patrick offers consulting services, because they value their time too much to actually dig into client websites and provide useful, relevant, honest, and effective feedback. Patrick states this on his blog.

And as Andy's site states: he no longer sells consulting, and he does not want you to email him.

DIY SEO was designed as a high-margin solution, so automated that it wouldn't require much feedback or interaction with customers.

But there is one big problem with that strategy...

SEO is *NOT* a Mechanical Process

For anyone looking to seriously compete on the web the DIY SEO tool/system is inadequate, and potentially even harmful. Why?

In SEO, a lot of the potential profit comes from knowing your market well, leveraging new technologies and distribution channels to gain market share, and putting a new spin on old marketing ideas. But they tried to make SEO too black and white... far too mechanical. Anywhere critical thought and analysis could add value to your SEO strategy, you can count on none of it being done with DIY SEO - just a predetermined path which doesn't account for everything that makes your business and your market unique.

In an age where the algorithms keep advancing faster and subjective things like branding start playing a role in the search results, mechanical doesn't cut it.

DIY SEO is too prescriptive and limited in nature, and it is a backward looking product. What the phrase "for the rest of us" actually means is "good enough to rank on page 5 of the search results, where you will get virtually no search traffic and make no money."

Paint by Number SEO: An SEO Failure Case Study

Google doesn't always respond to marketing efforts in a predictable way. Consider what happened when Patrick purchased SearchEngineOptimization.net for over $60,000.

At the end of last year he tried to do a 301 redirect to get it to rank, but when it didn't work he asked Matt Cutts about it:

Google's spam czar Matt Cutts never responded (of course), but Patrick ended up having to remove that redirect. Months later that $60,000+ domain name was a "coming soon" page.

And now that the redirect has been removed, the original redirected site does not rank as well as it did in the past. So that was certainly a lose/lose scenario.

And the worst part is, when he mentioned the strategy, people warned him right up front about what would happen.

He would have been better off donating that money to charity!

Advanced SEO? Or Simpleton SEO?

I don't want to share too many examples of how/why/where their program falls short, but to pick a rather glaring one...

Links are the backbone of an effective SEO strategy. Patrick Gavin built the #1 link broker on the web - Text Link Ads - from scratch. A few years back he sold that company for over $30 million. Since then links have only increased in importance while becoming harder to get. Yet when I recently checked out the advice on links in DIY SEO, one of the "advanced" SEO tips was to ensure that you are not engaging in any link buying or selling.

The advanced tips were not sharing safe & effective link buying techniques. Nope.

The advice was to ensure you were not engaging in link buying or selling.

And, of course, on their own websites they don't follow their own advice.

What Do Effective SEO Campaigns Consist of?

I have no desire to out any of their specific websites (hey I have some crappy ones too), but when you look at the EFFECTIVE strategies that you see Andy and Patrick use in their own publishing efforts, at a minimum they contain strategies like:

  • buying old websites
  • buying strong domain names
  • selectively buying links
  • providing a bit of grease to certain About.com guides for coverage of new one-page sites
  • nepotistically cross linking sites
  • launching top 100 linkbait lists about trending popular topics
  • building social media accounts to promote those lists
  • buying out some blogs to further seed giving legitimate looking coverage to those lists
  • launching egobait lists of topics like the top 100 ambidextrous hermaphrodite bloggers (complete with running an automated email script to alert people of the "award" they have won, with some people winning multiple awards in the same day - congrats again Nancy P. from Texas on your multiple meaningless awards + thanks for the links...your email address is now in the database, and you will win many more awards as they build out their portfolio of websites!!!!)

... all the clever bits of marketing that go into REAL SEO campaigns that compete on the competitive commercial web ... well that stuff is NOT part of the DIY SEO program.

And it likely won't EVER be, because it isn't paint by number.

Better Small Business SEO Solutions

Want an effective guide to small business SEO? Check out Matt McGee's small business SEO guide. It will give you more than the above program while only setting you back $25.

There are numerous free guides worth recommending as well. Both Bing and Google offer SEO starter guides. We created this one for non-profits, this one for bloggers, and this one for general business websites. SEOmoz offers a pretty good one too.

Interview of Tedster from WebmasterWorld

If you have been in the SEO field for any serious length of time you have probably come across (and benefited from) some of Tedster's work - either directly, or indirectly from others who have repackaged his contributions as their own. He is perhaps a bit modest, but there are few people in the industry as universally well respected as he is. I have been meaning to interview him for a while now, and he is going to be speaking at Pubcon South on April 14th in Dallas, so I figured now was as good a time as any :)

How long have you been an SEO, and how did you get into the field?

I started building websites in 1995, before the word SEO had been invented. I came from a background in retail marketing, rather than technology or graphic design. So my orientation wasn't just "have I built a good site?", but also "are enough people finding my site?"

The best method for bringing in traffic seemed to be the search engines, so I began discussing this kind of marketing with other people I found who had the same focus. Ah, the good old days, right? We were so basic and innocently focused, you know?

If you could list a few key documents that helped you develop your understanding of search, which would be the most important ones?

Here are a few documents that acted as major watersheds for me:

Is PageRank still crucial? Or have other things replaced it in terms of importance?

What PageRank is measuring (or attempting to measure) is still very critical — both the quality and number of other web pages that link to the given page. We don't need to worship those public PR numbers, but we definitely do need quality back-links (and quality internal linking) to rank well on competitive queries.

There appears to be something parallel to PR that is emerging from social media — some metric that uses the model of influencers or thought leaders. Even with that in the mix, ranking would still depend on links, but they would be modified a bit by "followers" and "friends", since many social sites are cautious with do-follow links.
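For readers who want the mechanics behind that point, here is a toy power-iteration sketch of the classic PageRank recurrence from the original Brin/Page paper. The three-page link graph and iteration count are invented for illustration; 0.85 is the conventional damping factor.

```python
# Toy PageRank power iteration over an invented three-page graph.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}  # page -> pages it links to
d = 0.85                                           # conventional damping factor
n = len(links)
pr = {page: 1.0 / n for page in links}             # start from a uniform score

for _ in range(50):  # iterate until the scores settle
    pr = {
        page: (1 - d) / n + d * sum(
            pr[src] / len(outs) for src, outs in links.items() if page in outs
        )
        for page in links
    }

print(pr)  # "c", with two inlinks, ends up with the highest score
```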

Let's play "I've got a penalty" - SEO edition. Position 6, -30, 999, etc. Are these just bogus excuses from poor SEOs who have no business calling themselves SEOs, or are they legitimate filters and penalties?

If the page never ranked well, then yes - it could well be a bogus excuse by someone whose only claim to being an SEO is that they read an e-book and bought some rank tracking software. However, Google definitely has used very obvious numeric demotions for pages that used to rank at the top.

The original -30 penalty is an example that nailed even domain name "navigational" searches. It affected some sites that did very aggressive link and 301 redirect manipulation.

What was originally called the -950 (end of results) penalty, while never an exact number, most definitely sent some very well ranked pages deep into the results. Those websites were often optimized by very solid SEO people, but then Google came along and decided that the methods were no longer OK.

In recent months, those exact number penalties seem to have slipped away, replaced by something a bit more "floating" and less transparent. My guess is that a negative percentage is applied in the final re-ranking run, rather than subtracting a fixed number of positions. Google's patent for Phrase-based Indexing does mention both possible approaches.

But even using percentages rather than a fixed number, when a top-ranked page runs afoul of some spam prevention filter, it can still tank pretty far. We just can't read the exact problem from the exact number of positions lost anymore.

Do you see Google as having many false positives when they whack websites?

Yes, unfortunately I do. From what I see, Google tends to build an algorithm or heuristic that gathers up all the URLs that seem to follow their "spam pattern du jour" — and then they all get whacked in one big sweep. Then the reconsideration requests and the forum and blog complaints start flying, and soon Google changes some factor in that filter. Voila! Some of the dolphins get released from the tuna net.

One very public case was discussed on Google Groups, where an innocent page lost its ranking because of a "too much white space" filter that misread the effect of an iframe!

Google's John Mueller fixed the issue manually by placing a flag on that one site to trigger a human inspection if it ever got whacked in the future. I'd assume that the particular filter was tweaked soon after, although there was no official word.

How many false positives does it take to add up to "many"? I'd guess that collateral damage is a single digit percentage at most — probably well under 5% of all filtered pages, and possibly less than 1%. It still hurts in a big way when it hits YOUR meticulously clean website. And even a penalty that incorrectly nails one site out of 300 can still affect quite a lot over the entire web.

When rankings tank, how often do you think it is due to an algorithmic issue versus an editorial decision by search engine employees?

When there are lots of similar complaints at the same time, then it's often a change in some algorithm factor. But if it's just one site, and that site hasn't done something radically new and different in recent times, then it's more likely the ranking change came from a human editorial review.

Human editors are continually doing quality review on the high volume, big money search results. It can easily happen that something gets noticed that wasn't seen before and that slipped through the machine part of the algorithm for a long time.

That said, it is scary how often sites DO make drastic errors and don't realize it. You see things like:

  • nofollow robots meta tags getting imported from the development server
  • robots.txt and .htaccess configurations gone way wrong
  • hacked servers that are hosting cloaked parasite content

Google did a big favor for honest webmasters with their "Fetch as googlebot" tool. Sometimes it's the easiest way to catch what those hacker criminals are doing.

When does it make sense for an SEO to decide to grovel to Google for forgiveness, and when should they try to fix it themselves and wait out an algorithmic response?

If you know what you've been doing that tripped the penalty, fix it and submit the Reconsideration Request. If you don't know, then work on it — and if you can't find a danged thing wrong, try the Google Webmaster Forums first, then a Request. When income depends on it, I say "grovel".

I don't really consider it groveling, in fact. The Reconsideration Request is one way Google acknowledges that their ranking system can do bad things to good websites.

I've never seen a case where a request created a problem for the website involved. It may not do any good, but I've never seen it do harm. I even know of a case where the first response was essentially "your site will never rank again" — but later on, it still did. There's always hope, unless your sites are really worthless spam.

Many SEOs theorize that sometimes Google has a bit of a 2-tier justice system where bigger sites get away with murder and smaller sites get the oppressive thumb. Do you agree with that? If no, please explain why you think it is an inaccurate view. If yes, do you see it as something Google will eventually address?

I'd say there is something like that going on — it comes mostly because Google's primary focus is on the end user experience. Even-handed fairness to all websites is on the table, but it's a secondary concern.

The end user often expects to see such and such an authority in the results, especially when it's been there in the past. So Google itself looks broken to a lot of people if that site gets penalized. They are between a rock and a hard place now.

What may happen goes something like this: an A-list website gets penalized, but they can repair their spam tactics and get released from their penalty a lot faster than some less prominent website would. It does seem that some penalties get released only on a certain time frame, but you don't see those time frames applied to an A-list.

This may even be an effect of some algorithm factor. If you watch the flow of data between the various Google IP addresses, you may see this: There are times when the domain roots from certain high value websites go missing and then come back. Several data center watchers I know feel that this is evidence for some kind of white-list.

If there is a white-list, then it requires a history of trust plus a strong business presence to get included. So it might also make sense that forgiveness can come quickly.

As a practical matter, for major sites there can easily be no one person who knows everything that is going on in all the business units who touch the website.

Someone down the org chart may hire an "SEO company" that pulls some funny business and Google may seem to turn a blind eye to it, because the site is so strong and so important to Google's end user. They may also just ignore those spam signals rather than penalize them.

Large authority site content mills are all the rage in early 2010. Will they still be an effective business model in 2013?

It's tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn't for me. So I also assume it must be on Google's radar.

I'd say there's a certain parallel to the paid links war, and Google's first skirmishes in that arena gave them a few black eyes. So I expect any response to the cheap content mills to be taken slowly, and mostly by human editorial review.

The problem here is that every provider of freelance content is NOT providing junk - though some are. As far as I know, there is no current semantic processing that can sort out the two.

Given that free forums have a fairly low barrier to entry, there are perhaps false alarms every day ringing in the next major update or some such. How do you know when a change is the real deal? Do you passively track a lot of data? And what makes you so good at taking a sea of tidbits and meshing them into a working theme?

I do watch a lot of data, although not nearly to the degree that I used to. Trying to reverse engineer the rankings is not as fruitful as it used to be — especially now that certain positions below the top three seem to be "audition spots" rather than actually earned rankings.

It helps to have a lot of private communications — both with other trusted SEOs and also with people who post on the forums. When I combine that kind of input with my study of the patents and other Google communications, usually patterns start to stand out.

When you say "audition spots", how does that differ from "actually earned rankings"? Should webmasters worry if their rankings bounce around a bit? How long does it typically take to stabilize? Are there any early signs of an audition going well or badly? Should webmasters try to adjust mid-stream, and if so, what precautions should they take?

At least in some verticals, Google seems to be using the bottom of page 1 to give promising pages a "trial" to see how they perform. The criteria for passing these trials or "auditions" are not very clear, but something about the page looks good to Google, and so they give it a shot.

So if a page suddenly pops to a first page ranking from somewhere deep, that's certainly a good sign. But it doesn't mean that the new ranking is stable. If a page has recently jumped way up, it may also go back down. I wouldn't suggest doing anything drastic in such situations, and I wouldn't overestimate that new ranking, either. It may only be shown to certain users and not others. As always, solid new backlinks can help - especially if they are coming from an area of the web that was previously not heard from in the backlink profile. But I wouldn't play around with on-page or on-site factors at a time like that.

There's also a situation where a page seems to have earned a lot of recent backlinks but there's something about those links that smells a bit unnatural. In cases like that, I've seen the page get a page one position for just certain hours out of the day. But again, it's the total backlink profile and its diversity that I think is in play. If you've done some recent "link building" but it's all one type, or the anchor text is too obviously manipulated, then look around for some other kinds of places to attract some diversity in future backlinks.

On large & open forums lots of people tend to have vastly different experience sets, knowledge sets, and even perhaps motives. How important is your background knowledge of individuals in determining how to add their input into your working theme? Who are some of the people you trust the most in the search space?

I try never to be prejudiced by someone's recent entry into the field. Sometimes a very new person makes a key observation, even if they can't interpret it correctly.

There is a kind of "soft SEO" knowledge that is rampant today and it isn't going to go away. It's a mythology mill and it's important not to base a business decision on SEO mythology. So, I trust hands on people more than manager types and front people for businesses. If you don't walk the walk, then for me your talk is highly suspect.

I pay attention to how people use technical vocabulary — do they say URL when they mean domain name? Do they say tag when they mean element or attribute? Not that we don't all use verbal shortcuts, but when a pattern of technical precision becomes clear, then I listen more closely.

I have long trusted people who do not have prominent "names" as well as some who do. But I also trust people more within their area of focus, and not necessarily when they offer opinions in some other area.

I hate to make a list, because I know someone is going to get left out accidentally. Let's just say "the usual suspects." But as an example, if Bruce Clay says he's tested something and discovered "X", you can be pretty sure that he's not blowing sunshine.

Someone who doesn't have huge name recognition, but who I appreciate very much is Dave Harry (thegypsy). That's partly because he pays attention to Phrase-based Indexing and other information retrieval topics that I also watch. I used to feel like a lone explorer in those areas before I discovered Dave's contributions.

What is the biggest thing about Google where you later found out you were a bit off, but were pretty certain you were right?

That's easy! Using the rel="nofollow" attribute for PR sculpting. Google made that method ineffective long before I stopped advocating it. I think I actually blushed when I read the comment from Matt Cutts that the change had been in place for over a year.

What is the biggest thing about Google where you were right on it, but people didn't believe until months or years later?

The reality of the poorly named "minus 950" penalty. I didn't name it, by the way. It just sort of evolved from the greater community, even though I kept trying for "EOR" or "End Of Results."

At PubCon South I believe you are speaking on information architecture. How important is site structure to an effective SEO strategy? Do you see it gaining or losing importance going forward?

It is hugely important - both for search engines and for human visitors.

Information Architecture (IA) has also been one of the least well understood areas in website development. IA actually begins BEFORE the technical site structure is set up. Once you know the marketing purpose of the site, precisely and in granular detail, then IA is next.

IA involves taking all the planned content and putting it into buckets. There are many different ways to bucket any pile of content. Some approaches are built on rather personal idiosyncrasies, and other types can be more universally approachable. Even if you are planning a very elaborate, user tagged "faceted navigation" system, you still need to decide on a default set of content buckets.

That initial bucketing process then flows into deciding the main menu structure. Next you choose the menu labels; this is the stage where you fix the actual wording and fold in keyword research. But if a site is built on inflexible keyword targets from the start, then it can often be a confusing mess for a visitor to navigate.
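As a sketch of that flow (bucket first, label later), with invented category names:

```python
# Invented example: default content buckets decided before menu labels.
buckets = {
    "Learn":     ["seo-basics", "link-building", "keyword-research"],
    "Tools":     ["rank-checker", "keyword-suggester"],
    "Community": ["forum", "blog"],
}

# The main menu falls straight out of the default buckets; keyword
# research then refines the visible wording, not the structure itself.
main_menu = list(buckets)           # ['Learn', 'Tools', 'Community']
labels = {"Learn": "SEO Training"}  # a label tweak folded in after research
print([labels.get(item, item) for item in main_menu])
```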

As traffic data grows in importance for search ranking, I do see Information Architecture finally coming into its own. However, the value for the human visitor has always been clearly visible on the bottom line.

What are some of the biggest issues & errors you see people make when setting up their IA?

There are two big pitfalls I run into all the time:

  • Throwing too many choices at the visitor. Macy's doesn’t put everything they sell in their display windows, and neither should a website.
  • Using the internal organization of the business as the way to organize the website. That includes merely exporting a catalog to a web interface.

How would you compare PubCon South against other conferences you have attended in the past?

PubCon South is a more intimate venue than, say, Vegas. That means less distraction and more in-depth networking. Even though people do attend from all over the world, there is a strong regional attendance that gives the conference a different flavor — one that I find a very healthy change of pace.

In addition, PubCon has introduced a new format — the Spotlight Session. One entire track is made completely of Spotlight Sessions with just one or two presenters, rather than an entire panel. These are much more interactive and allow us to really stretch out on key topics.

---

Thanks Tedster! If you want to see Tedster speak he will be at Pubcon Dallas on the 14th, and if you want to learn about working with him please check out Converseon. You can also read his latest musings on search and SEO by looking into the Google forums on WebmasterWorld. A few months back Tedster also did an interview with Stuntdubl.

Managing Business Opportunity Overload

Do Something...Now!

In a land of opportunity there is typically lots of distraction; oddly enough, those distractions are usually other opportunities. How many times have you:

  • Stared at a domain you wanted to buy, but didn't pull the trigger
  • Stared at a domain you bought, but left it parked for another year
  • Negotiated down to what you wanted to pay for a site or domain, yet didn't move forward due to (fill in the blank)

Typical reasons for procrastination tend to be "not enough time" or "this will never work". Well, how many of your "can't miss" ideas missed, and how many of your "probably will miss" ideas actually hit?

Win More, Lose Less

In my experience, as long as you win more than you lose you're doing OK. This sounds easier than it is, though. In many professions (take sports, for example) success worth millions in contracts can be had while "succeeding" less than 50% of the time. A couple of examples:

  • Hitters in baseball strive to get a .300 average, which is failing 7 times out of 10
  • Basketball players are considered great shooters if they make 45%-48% of their shots

Imagine succeeding at those rates. If so, you had better hope the ones you hit on were big money makers and the ones you lost on required minimal investment. If you take a similar approach to finding and operating in new markets, most of the initial costs are fairly similar. Basic costs like:

  • Design
  • Content
  • Site Promotion
  • PPC Testing

tend to be somewhat similar on your average new site. Purchasing a domain or an existing site can skew the numbers a bit, but overall these things tend to average out. So at the very least, if you are succeeding 6 out of 10 times and you don't get carried away on a new site launch, you should be doing pretty well, as the sketch below shows. The more you do, the better your ratio gets and the better your long term profits are, and you should expect to raise that ratio a bit as you gain more experience in researching and launching new ventures.
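To make that arithmetic concrete, here is a minimal sketch in Python. Every number in it is a hypothetical assumption for illustration (the hit rate, costs, and returns are not figures from any real portfolio):

    # Hypothetical numbers for illustration only -- plug in your own.
    launches = 10              # new sites launched
    hit_rate = 0.6             # succeed 6 out of 10 times
    cost_per_launch = 2000     # design + content + promotion + PPC testing
    avg_profit_per_hit = 6000  # average return from a successful site

    hits = launches * hit_rate                 # 6 winners
    total_cost = launches * cost_per_launch    # $20,000 spent
    total_return = hits * avg_profit_per_hit   # $36,000 returned

    print(f"Net result: ${total_return - total_cost:,.0f}")  # Net result: $16,000

Under those assumptions you come out $16,000 ahead even though 4 of the 10 launches went nowhere, which is the whole point of keeping per-launch costs consistent and modest.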

Dueling Fears

Most of us have a fear of failure, and some of us have a fear of success: a fear that if we become successful it might alienate some of our closest friends and family members, that it might turn us into workaholics working day and night to sustain that success and lifestyle, and so on. Fear of failure is something I think even the most successful entrepreneurs face from time to time.

Of course, we all know the old hockey saying: "You miss 100% of the shots you don't take."

Fear of failing and fear of succeeding are things one has to overcome on one's own, but in terms of trying to overcome procrastination it is usually advisable to set less rigid and more reasonable deadlines for yourself and your work, as outlined in this post at harvard.edu http://www.iq.harvard.edu/blog/sss/archives/2006/10/procrastination.shtml (which references a study co-authored by Dan Ariely, who wrote the must-read "Predictably Irrational").


Psychology Today has a research piece on the fear of failure here.

The Cost of No Action

It's kind of difficult to lay out pretty graphs and charts showing what the "cost of procrastination" really is. We can assign some arbitrary number to whatever benchmark profit exists per site in an imaginary portfolio. However, I think it's best if you play with your own numbers a bit and figure out what the cost of doing nothing is to you.

Factor in the hours you might spend checking your email every 5 minutes, cluttering up Facebook with Farmville posts and annoying your friends with suggestions, wondering if the latest SEO tool suite will be the answer to your prayers, and, last but not least, wondering if your idea will work. There are more variables of course, but those are some of the commonplace ones. A rough starting point is sketched below.
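Here is a minimal back-of-the-envelope sketch in Python; every figure is a placeholder assumption, so substitute your own numbers:

    # Rough opportunity-cost estimate -- every figure here is a placeholder.
    distracted_hours_per_day = 2    # email refreshing, Farmville, tool shopping
    working_days_per_year = 250
    value_per_productive_hour = 50  # what an hour of real work is worth to you

    yearly_cost = (distracted_hours_per_day
                   * working_days_per_year
                   * value_per_productive_hour)
    print(f"Estimated yearly cost of doing nothing: ${yearly_cost:,}")
    # With these placeholder numbers: $25,000 a year of foregone work.

Even if your real numbers are half of those, the cost of inaction compounds year after year.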

Dealing with Competitors

The bottom line is that the web gets more and more competitive every day, and if you are just sitting on the sidelines waiting and waiting and waiting, then your competition is going to sprint by you on their way to the end zone, over and over again.

Even if you don't have any fear of failure or success, or you are extremely self-confident in your abilities, you should consider getting a bit more into the game if you want to make any significant headway in your efforts at world domination. You want to avoid doing a bunch of things "average". Try to nail down an effective process which you can replicate, somewhat, from site to site.

It's Up to You

Project management is an essential skill you'll need if you want to run multiple sites, create multiple products, or run a web business at any scale. I like to work in different markets so I can get a sense of what others are doing to be successful, gather more consumer data to evaluate, establish connections with people I otherwise would never have been able to build a business relationship with, and so on. Keeping track of the different things I'm doing can be a chore. Enter...the cloud.

With so many moving parts to a site these days (SEO, PPC, social media, monetization, domain buying, market research) you'll find yourself with quite a list of to-dos and contacts piling up all over the place. One thing that has helped me tremendously is being able to put most of my business in the cloud.

Being pretty much 100% mobile really has its advantages. I like a change of scenery every once in a while, so having all my stuff readily accessible at a moment's notice is fantastic.

So take advantage of the opportunities out there, don't over-extend yourself, and establish flexible (yet reasonable) due dates and goals for you and your business. In the end, I think you'll thank yourself for it.

TopSEOs.com - A Review of the Top SEOs Paid Rating Service

Who is going to pay to tell people that they are good enough and their lives are fine as they are? A fundamental truth of advertising is that advertising the truth usually isn't very profitable - which is why there is lead generation, affiliate programs, public relations, negative billing options, small print, bogus medical research, and so on... ;)

Ever wonder how an SEO professional can charge first world rates to do third rate, third world work and still get a top rating from a heavily advertised SEO rating website? Edward Lewis has the lowdown on Top SEOs, including TopSEOs complaints.

[edit: above links removed, as Edward sold his site at some point & then the person who bought it later sold it to TopSEOs, so the above links would have led to lead generation forms for some unsavory SEO folks.]

A big part of the problem with the affiliate business model is when people offer fake rankings / ratings and only promote whoever pays them the most. The person/company which can afford to pay the most for leads often can only afford to because there is hidden risk or hidden cost in the service, or because they don't deliver on their promises. An analogy here is those AAA rated mortgage backed securities where an S&P employee explained, "We rate every deal. It could be structured by cows and we would rate it."

The biggest brands don't pay as much per lead because they don't have to. They invest in brand and in quality of customer service. The best service-based companies don't need to pay cut-rate lead brokers to advertise. The best SEO companies have far more demand for their time than they have time to spend hunting for customers.

I remember back in 2006 when one of the currently "top rated SEOs" did work for my wife's website (before she met me). That SEO firm did nothing but outsource irrelevant reciprocal link exchanges overseas, and her website *would not rank* for any semi-competitive keywords until *after* the reciprocal links page was removed from her site. Once we took down those reciprocal links and built some quality links, the site started to rank. We changed the FTP details as well, because that guy's services were not only not worth paying for...the reciprocal links proved to be actively damaging, and we didn't want him to put them back up. And in spite of not doing any services for months (and certainly none worth paying for), this person wanted to ensure they got paid for 12 months of "service." They didn't want to let the contract end when it was supposed to, either. They were all sales, all the time. It didn't matter that they were selling ineffective garbage.

What eventually stopped the credit card charges was when I wrote him via email "If her credit card is charged again we will be doing a reverse charge and a full writeup on the service."

He responded to that with the following:

I would watch your comments and threats my friend as you have no idea of what I am capable of or who I am - this is a small industry and if you are trying to be a an up an coming player in it this is not the way to do it by bashing your competition. A simple email professionally stating that you were unhappy with the service would have sufficed and I would have looked into to make sure Giovanna got what she paid for.

I have run 2 optimization companies and have been in this business for 12 years now. With my contacts at Google and the other main engines I can get your ebook website banned within 1-2 days if this is how you do business - with threats and slander - keep it up.

The funny thing is all I said was that if he tried charging again (past the contract) that we would reverse charges. And yet the sleazeball told me to "watch your comments and threats" and that he could use "contacts at Google and other main engines" to get my website banned.

What a jerk.

I have always had contempt for blowhards, and for pure hard-sell salesmen who put sales first and are willfully ignorant of their trade, or who are willing to sell a garbage product without any concern for the customer's welfare.

I am grateful that the above mentioned person sucked at what they did & ripped people off back then. If they were not out scamming people and actually provided a useful service then my wife wouldn't have had a reason to contact me and meet me and marry me. ;)

I let it go for over 3 years, but if they are still scamming people then that needs to stop. I figure it's only right that I write this post as a fair warning. All good things must come to an end. And so should bad things. Hopefully these clowns quit scamming people. Enough is enough.

Update: 3 years later the fake ratings continue. BigMouthMedia was rated a top SEO agency by Top SEOs, even when it no longer existed as a distinct company after a merger years earlier. Top SEOs is so bogus with their ratings that they even put out a press release announcing the above rating of the above non-company!

Is Alexa Relevant in 2010?

We recently reviewed a bunch of competitive research tools, and in that spirit I thought it would be a good idea to review Alexa. Alexa is not the #1 service available, but they do provide one of the better free services. Every few years it seems they fall behind and become a bit of a relic, and then they catch back up.

Recently when using Alexa I saw they added a good number of features, so I thought it would be worth doing a run down.

Traffic

Alexa is best known for its traffic rank, which is popular because it has been around for a long time and is widely referenced. I don't consider it a high-accuracy tool, though. All of these traffic estimation tools have a big margin of error, and it's easy to read too much into the base/core number. Having mentioned that, you can pull traffic data from Alexa, Google Website Trends / DoubleClick Ad Planner, Compete.com, and Quantcast to see how well they agree on the traffic volume of a site, or on the relative volume between multiple sites in the same vertical.
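As a rough illustration of that cross-checking idea, here is a minimal sketch in Python. The numbers are invented, and since these services don't share a common free API, assume you have copied the estimates in by hand:

    # Invented monthly-visit estimates for two sites in the same vertical.
    estimates = {
        "alexa":     {"site-a.com": 120_000, "site-b.com": 40_000},
        "compete":   {"site-a.com": 90_000,  "site-b.com": 35_000},
        "quantcast": {"site-a.com": 150_000, "site-b.com": 55_000},
    }

    for tool, data in estimates.items():
        ratio = data["site-a.com"] / data["site-b.com"]
        print(f"{tool:>10}: site-a has {ratio:.1f}x the traffic of site-b")

The absolute numbers disagree wildly, but the ratios land in a fairly tight 2.6x-3.0x band, so the relative sizing is the part worth trusting.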

In spite of my lack of faith in the Alexa rank numbers, some people do put weight on it. Some investors use it. And when Markus Frind was launching PlentyOfFish he redirected Alexa users away from his site to stay below the radar until his site was strong.

Pageviews Per Visit

This is a good hint at how compelling people find a particular website. Sites driven by arbitrage efforts are typically not very engaging; they aim to either sell something right away or move people off the site. Alexa also offers a time-on-site feature which you can use to compare how sticky sites are.

Bounce Rate

This is basically the flip side of the above: people who see 1 page and then are gone. You can see in the yellow area where we tested using a pop up. While the pop up did get more people to register on our site, we dropped it because it was somewhat inconsistent with the rest of our marketing (our core audience of customers tends to tilt toward the expert end), and the types of people who were receptive to pop ups were not as good a long-term fit for our site as customers.

Downstream Traffic Sources

Who are they sending traffic to? Where do their visitors go after leaving the site?

Upstream Traffic Sources

Who is sending them traffic? In many ways this can be unsurprising, but certain sites end up being more or less dependent on social media based on factors like whether they appeal to younger or older customers, what they are doing offline, and whether they are producing linkbait relevant to a specific audience that is heavily integrated into social media.

This can also help you locate some advertising locations, figure out how reliant they are on search, and help you see which sites in the vertical they are closely aligned with. DoubleClick Ad Planner also has a pretty awesome traffic affinity feature.

Search Traffic Percent

This shows the percentage of their traffic which comes from search engines. If it is abnormally high, that might mean the site has a search-heavy focus and needs some thickening out in terms of community participation and other traffic streams. If it is abnormally low, and you have a similar link profile to other sites that score higher, then it might mean you are missing some important keywords you should target. This is where digging in for more data with a tool like SEM Rush or Compete.com shines.
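If you want to formalize that heuristic, here is a minimal sketch in Python; the thresholds are invented for illustration, so calibrate them against sites in your own vertical:

    # Hypothetical thresholds -- tune against your own vertical.
    def interpret_search_share(search_pct, vertical_avg=40.0, band=15.0):
        """Flag a site whose search-traffic share is far from the vertical norm."""
        if search_pct > vertical_avg + band:
            return "search-heavy: build out community and other traffic streams"
        if search_pct < vertical_avg - band:
            return "search-light: you may be missing important keywords"
        return "within the normal range for this vertical"

    print(interpret_search_share(72))  # search-heavy: ...
    print(interpret_search_share(18))  # search-light: ...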

Subdomains

Do they have a membership area on their site? If they host it on a subdomain you can see how active it is. Having anywhere near 5% or 10% of your traffic in the private members' area is quite good if you have a well connected, high traffic website. You can also see that our tools subdomain is quite a popular section of our site.

Top Search Queries

You can use this to find some of the most important keywords for a competing site. If some of your best keywords are being revealed it might make sense to publish some filler content on a popular topic that is hard to monetize so that it better shields some of your best keywords from free public viewing.

If you install the Alexa toolbar they will also show you a bit more query data and list some opportunities for that site on the paid search front.

Demographics

You can see what countries a particular site is popular in.

And you can get more detailed demographic data on a per site basis.

Many sites within the same field will have fairly similar demographic targets, but even things like at-work vs at-home usage can indicate whether a site is primarily targeting independent types or corporate types. When compared against the above, notice how (generally) SEOmoz has a fairly similar audience composition:

They skew a bit younger (I think sometimes my cynical nature turns off some young people), a bit more college educated (they go to something like 10x as many SEO conferences as I do), and we are perhaps a bit more popular with self employed people. And then for Search Engine Land you can see that they have a similar profile to SEOmoz, but with even more people at work and more college educated people.

And then you have sites which are extreme demographic outliers. Ever wonder who the customers are for those websites primarily marketed through hyped up email launch sequences by affiliates?

Well, throw some of those sites into Alexa and you will find that for many of them the apparent target market is: old, desperate, and gullible men from the US who failed at life, still don't have a good b/s meter, and want to believe there is a silver bullet they can use to automatically generate wealth. There isn't one, of course, but there is a crew that will sell them that story and get rich by working over the remaining crumbs in their retirement accounts.

I am betting that part of why our age distribution is a bit flatter than most other SEO sites' is that we offer free tools which get recommended to some of the audiences that buy the launch-product stuff (or that is my theory, based on the fact that some of them left their members' areas without password protection and sent a bunch of traffic at our site).

How do your demographic profiles compare to other sites in your space? Have you checked out all the features Alexa has added? What do you think of them?
