Google Engineers Offering Free Course in Black PR

Has a competitor launched a new feature that concerns you? If so, how do you react?

Google, well known for their public relations expertise, does not like the idea of Facebook creating an (eventual) distributed ad network based on demographic data. In spite of Google personalizing search by default (without asking), opting you into behavioral targeting (without asking), & automatically opting you into Google Buzz (without asking), suddenly they are a company concerned with the privacy of people on *other* networks.

An effective attack typically should not look like it comes from corporate; it should sound more like a list of alarmed concerns raised by individuals just like you. And so we get alarmed stories from the likes of Ka-Ping Yee, a software engineer for the charitable arm of Google:

Facebook's new system for connecting together the web seems to have a serious privacy hole, a web developer has discovered.
...
"It seemed that anyone could get this list. Today, I spent a while checking to make sure I wasn't crazy," he wrote on his blog. "I didn't opt in for this. I even tried setting all my privacy settings for maximum privacy. But Facebook is still exposing the list of events I've attended, and maybe your event."

The best thing to do is disable your Facebook account and wait it out. It is easy to do, and you can always enable it later! :D

Who Are The Top 10 SEOs in the World?

A lot of people who are well known as SEOs spend too much time on self-promotion and not enough time on business development. BTW I would classify myself as being in that camp, though I have been slowly migrating out of it since meeting my wife ;)

So much of SEO is ego in place of performance, IMHO. And the problem when you hire top SEOs is that even if they have a strong brand and do great work on their own sites, the market pricing for services tends to be so dysfunctionally low that...

  • it is mostly an exercise in back patting to even do any client services after you have a good amount of capital, cashflow, and leverage online
  • even if you think you are hiring one of the best SEOs, you still rarely get to work with them, because the people who are out there being really well known are by and large lead generation tools for the company - and the bigger the company is, the more likely you are to have an intern servicing your account

Getting serious cashflow out of servicing the SEO market is akin to squeezing water out of a rock, especially when compared against running your own websites.

To me, the measure of an SEO's success is not in their knowledge, but in their ability to leverage their knowledge to build cashflow. I know money isn't everything, but we live in a world where the algorithms grow more complex every day. So each day you are working for less than your market value is a day closer you are to being broke!

Spamming and jamming can get you some paydays, but it's not easy to *consistently* pull down 7 or 8 figures a year in profit if you are not building at least 1 or 2 properties with serious staying power and momentum behind them.

Given the complexity of SEO and the lack of liquidity in the SEO market, I think that by and large the best SEOs who generate the greatest profits derive most of those profits from publishing. With that in mind, I thought I would highlight some of the people I view as top SEOs (and why).

Danny Sullivan

Few people have Danny's knowledge about the history of and trends in search. Even fewer have that type of knowledge while being accessible. And even fewer yet would have been able to put a decade into building up momentum for a brand and website in the industry, stop, start over from scratch, and compete against what they had built for a decade.

Imagine the strongest site you have, giving it a decade of effort, and then one day trying to start from scratch competing directly against it with a similar business model. And yet he pulled it off.

Greg Boser & David Naylor

Greg is probably the first name that comes to mind when someone says "old SEO" (yes even before Bruce Clay). His knowledge is much like Danny's in being rich with historical context. The thing that Greg has done to make consulting actually worth doing is tie payment to performance. Doing SEO in that manner is like becoming an affiliate, but one with few competitors and a huge advantage in the marketplace.

Dave is the UK version of Greg (or maybe Greg is the US version of Dave?), and they have done some successful projects together for some of the biggest brands in the world.

Stephan Spencer

Stephan Spencer branded himself as being an expert at ecommerce SEO. And, rare amongst SEOs, he has the technical chops *and* the marketing skills to sell to big companies (speaking their language & touring the world speaking at dozens of conferences each year).

Stephan's company NetConcepts built a software program which is almost as sweet as cloaking would be (if you could get away with doing it constantly with no risk). They partnered with the right kinds of (big brand) companies and branded their GravityStream solution appropriately, such that it was never viewed by Google through a negative lens. This created a business model where they could get paid based on performance (like many affiliates do), but be paid for the performance of the core brand website! :D

NetConcepts was sold to the SEM company Covario, which will be able to benefit from tying the GravityStream technology to their predictive analytics and Google's quick-indexing Caffeine search results.

Patrick Gavin & Andy Hagans

(UPDATED: I like Patrick Gavin, but at the time of writing this he was partnered with Andy on some stuff and Andy went out of his way to screw me multiple times. It was perhaps unfair for me to lump them together as Patrick has been nothing but good to me. Plus he collects sports cards & has all sorts of funny sports-related stories.)

As a person, at this point I don't really trust or respect Andy (and feel that those who do might be in for some eventual bad news). But as far as being efficient at running businesses, few can compare.

Patrick took a gamble and built the Text Link Ads link brokerage into a company he was able to sell for mid 8 figures. And his latest venture in the SEO space was so bold as to call "ensure you are not buying any links" an advanced SEO tip. Meanwhile, on Andy's personal site he recommends iAcquire for your link buying needs :D

Not content with sitting on the results from TLA, they invested the proceeds (and other investor funds) into building a domain portfolio that even Kevin Ham or Frank Schilling would admire. But they also turned those domain names into functional websites, and have kept cost structures low, while creating blogs with more top x lists than the rest of the web combined and sending out millions of "congrats" emails at potential link sources. The net result? They have built a lead generation business that has been rumored to be pulling in 8 figures a year.

Wherever there is a distortion in the economy leading to a large bubble you can bet these guys have at least a half dozen to a few hundred sites, chipping away at the markets 24/7/365. And the only thing increasing faster than their scale is their efficiency!

At some point I believe Andy was bought out from the projects and Patrick dialed up the quality of the stuff he was building.

Matt Cutts

I always hate when I see Matt Cutts listed on top SEO lists and think "hey he is not even an SEO"

...but...

how many SEOs have seen Google's source code? How many have written a good chunk of it? As one of the top few search engineers at Google, Matt not only has a pulse on what is changing with the web, but he constantly tracks & battles the evolution of spam. His knowledge and experience set allows him to just look at a search result and be able to spot the algorithmic weaknesses & exploits at a glance.

Further, Matt Cutts is better at public relations than 99% of public relations experts are. He is able to constantly promote Google products and engage in issue shaping while rarely being called out for it. And he rarely makes *any* mistakes on the public relations front, even when defending some of Google's most bogus & hypocritical policies.

Imagine if your company had a b/s slogan of "don't be evil" while operating with the above strategy. And yet he somehow manages to make it work.

Jason Callus Anus

Imagine entering an industry and pulling in attention by calling everyone in it a bunch of scumbags - stating that you will clean things up through the use of manual intervention. Then imagine using the economic downturn to fire almost all of your editorial employees and leveraging your built-up domain authority to create a low quality automated general purpose web scraper, which stuffs Google's index with your own search results (heavily wrapped in ads). And then imagine link farming to build authority, then using that platform to start selling SEO services to corporate clients & selling links!

When Matt Cutts described scraper sites a few years back he said they were "shoot-on-sight". And yet Jason's crappy site keeps gaining traffic while almost never adding any value anywhere.

Whenever I think of Mr. Anus, I picture a used car salesman who moved to a state which doesn't have a lemon law just so he could get the enjoyment of duping people with broken cars. And yet somehow he manages to pull it off. For public relations brilliance he gets a +1. And the same goes for claiming ignorance of SEO and claiming to be anti-spam so he can get away with passing his spam garbage off onto everyone else while rendering Google's spam team flaccid.

Richard Rosenblatt

In 1999 Richard Rosenblatt was able to sell iMall (have you ever heard of it?) for over a half-billion Dollars. He then sold MySpace near the top for $580 million. Trying to strike gold once more, he formed Demand Media and bought eHow.com to build a search-arbitrage content farm. Once growth rates began to slow, he created a controversy by trying to legitimize his model in the media, building tons more links to his site. He then used that platform as a success story to get other publishing websites to engage in profit-sharing partnerships where he posts articles on huge trusted authoritative domains like USAToday.com.

Now Demand Media is rumored to be gearing up for an IPO or sale:

Demand Media, a closely watched startup that mines online search engine data to generate thousands of videos and web stories a day, has hired Goldman Sachs to explore an initial public offering.

People familiar with the plans say the company could file for an IPO as early as August. Details have yet to be finalised, but the discussions involve pricing shares around November in an offering valuing the company at about $1.5bn.

A little-known fact in the SEO industry is that Richard is also the chairman of iCrossing, which is currently rumored to be selling to Hearst Publishing for ~ $400 million:

Under the deal, which is in the final stages of negotiations, iCrossing, one of the nation's biggest independent digital-marketing shops, is likely to fetch about $375 million, plus bonus payments if it reaches certain targets, these people said.
...
One person familiar with the matter cautioned that iCrossing, which is based in Scottsdale, Ariz., could still decide to remain independent if it doesn't attract the right price.

Nice side gig!

That guy flat out prints money. If he keeps it up, in a few years he might put Ben Bernanke to shame. :D

Honorable Mentions

Over the past few years certainly Jeremy Shoemaker, Brian Clark, and SugarRae have built up some nice empires - each with a vastly different approach. The Caveman is great at tying SEO metrics into real world marketing advice, and has the cashflow to prove it. In terms of being great at building on the consulting model, Bruce Clay comes to mind. Tim Armstrong is tasked with turning around AOL, and if he is successful with it he would deserve a mention. I would also put Cygnus high on any SEO list, but he tends to be a bit shy, and is not very boastful in terms of what he has accomplished. John Andrews would make the list too, but then he doesn't like lists! :D

Does Marketing Make You Cynical?

A common practice in the marketing space is for people to diminish what you do, state that it is below them, help rebrand your stuff in a negative light, and then at some point in the future basically clone the idea (maybe with a few new features, maybe not) and then push their clone job aggressively as though it is revolutionary.

Another shady practice is when you ask people for advice and they say "no don't do that" and then as soon as they hang up the phone they send off emails to their workers telling them to do that which they told you was a bad idea.

I don't think that the average person or the average marketer is inherently sleazy. But when you look at the people who are the most successful, certainly a larger than average percentage of them engaged in shady behavior at some point.

To keep building yield and returns, at some point short cuts start to look appealing. And so you get...

None of the above is a cynical take or an opinion at this point. That was simply a list of 3 stated facts.

Create a large enough organization with enough people and you can always make something shady seem like it was due to the efforts of a rogue individual, rather than as company policy. A key to doing this effectively within a large organization is to publish public thoughts that are the exact opposite of your internal business practices.

The word "propaganda" was a bad word, as that is what the Germans were using, so Edward Bernays had to give it another name - public relations.

Recently the Google public policy blog published a post titled Celebrating Copyright. Around the same time Viacom leaked the following internal Google document:

You can't get any clearer than that!

In the past when I claimed Google operated as per the above I was accused of being cynical or having sour grapes. But when you tie together a lot of experiences and observations others lack, and you are not conflicted by corporate business interests, you have the ability to speak truth. You are not always going to be right, but not needing to cater to advertiser interests and filter yourself means you will typically catch a lot of the emerging trends before they show up in the media - whatever that is worth.

If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt

Speaking of the media, have you heard about the Middle American Information Bureau?

The Century of the Self is an amazing documentary, well worth buying.

Google SERP CTR Data by Search Rank

Generally I have not been a huge fan of registering all your websites with Google (profiling risks, etc.), but they keep using the carrot nicely to lead me astray. :D ... So much so that I want to find a Googler and give them a hug.

Google recently decided to share some more data in their webmaster tools. And for many webmasters the data is enough to make it worth registering (at least 1 website)!

AOL Click Data

When speaking of keyword search volume breakdown data, people have typically shared information from the leaked AOL search data.

The big problem with that data is it is in aggregate. It is a nice free tool, and a good starting point, but it is fuzzy.

Types of Searches

There are 3 well known search classifications: navigational, transactional, and informational. Each type of query has a different traffic breakdown profile.

  • In general, for navigational searches people click the top result more often than they would on an informational search.
  • In general, for informational searches people tend to click throughout the full set of search results at a more even distribution than they would for navigational or transactional searches.
  • The only solid, recently shared public data on those breakdowns is from Dogpile [PDF], a meta search engine. But given how polluted meta search services tend to be (with ads mixed into their search results), those numbers were quite a bit off from what one might expect. And once again, they are aggregate numbers.

Other Stuff in the Search Results

Further, anecdotal evidence suggests that the appearance of vertical / universal results within the search results set can impact search click distribution. Google shows maps on 1 in 13 search results, and they have many other verticals they are pushing - video, updates, news, product search, etc. And then there are AdWords ads - which many searchers confuse as being the organic search results.

Pretty solid looking estimates can get pretty rough pretty fast. ;)

The Value of Data

If there is one critical piece of marketing worth learning above all others it is that context is important.

My suggestions as to what works, another person's opinions or advice on what you should do, and empirical truth collected by a marketer who likes to use numbers to prove his point ... well all 3 data sets fall flat on their face when compared against the data and insights and interactions that come from running your own business. As teachers and marketers we try to share tips to guide people toward success, but your data is one of the most valuable things you own.

A Hack to Collect Search Volume Data & Estimated CTR Data

In their Excel plug-in Microsoft shares the same search data they use internally, but it's not certain that Microsoft will keep sharing as much data as they do now once they integrate the Yahoo! Search deal.

Google offers numerous keyword research tools, but getting them to agree with each other can be quite a challenge.

There have been some hacks to collect organic search clickthrough rate data on Google. One of the more popular strategies was to run an AdWords ad for the exact match version of a keyword and bid low onto the first page of results. Keep the ad running for a while and then run an AdWords impression share report. With that data in hand you can estimate how many actual searches there were, and then compare your organic search clicks against that to get an effective clickthrough rate.
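
Here is a rough sketch of that arithmetic in Python. Every number below is invented purely for illustration - plug in your own report figures:

    # Sketch of the impression share hack. All inputs are illustrative assumptions.
    ad_impressions = 1200      # impressions your exact match ad received (AdWords report)
    impression_share = 0.80    # impression share from the AdWords impression share report
    organic_clicks = 450       # organic clicks on that keyword over the same period

    # If the ad showed on 80% of eligible queries, total searches ~= impressions / share
    estimated_searches = ad_impressions / impression_share

    # Effective organic clickthrough rate at your current ranking position
    effective_ctr = organic_clicks / estimated_searches

    print("Estimated searches: %.0f" % estimated_searches)          # 1500
    print("Effective organic CTR: %.1f%%" % (effective_ctr * 100))  # 30.0%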

The New Solution

Given search personalization and localization, and the ever-changing result sets with all the tests Google runs, even the above can be rough. So what is a webmaster to do?

Well Google upgraded the data they share inside their webmaster tools, which includes (on a per keyword level)

  • keyword clickthrough rate
  • clickthrough rate at various ranking positions
  • the URL that was clicked on (a rough sketch of rolling this export up into a CTR-by-position table follows this list)
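
Assuming you export that keyword report to a CSV with columns along the lines of query, impressions, clicks, and position (the file name and column names here are assumptions - adjust them to whatever the actual export looks like), a few lines of Python can roll it up into a CTR-by-position table:

    # Sketch: roll a per-keyword report up into CTR by ranking position.
    import csv
    from collections import defaultdict

    impressions_by_pos = defaultdict(int)
    clicks_by_pos = defaultdict(int)

    with open("webmaster_tools_queries.csv") as f:      # hypothetical export file
        for row in csv.DictReader(f):
            pos = int(round(float(row["position"])))    # bucket by rounded average position
            impressions_by_pos[pos] += int(row["impressions"])
            clicks_by_pos[pos] += int(row["clicks"])

    for pos in sorted(impressions_by_pos):
        ctr = float(clicks_by_pos[pos]) / impressions_by_pos[pos]
        print("position %d: %.1f%% CTR on %d impressions" % (pos, ctr * 100, impressions_by_pos[pos]))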

Trophy Keywords vs Brand Keywords

Even if your site is rather well known, going after some of the big keywords can be a bit self-defeating in terms of the value delivered. Imagine ranking #6 or #7 for "SEO." Wouldn't that send a lot of search traffic? Nope.

When you back out the ego searches, the rank checkers, etc., it turns out that there isn't a ton of search volume to be had ranking on page 1 of Google for "SEO."

With only a 2% CTR the core keyword SEO is driving less than 1/2 the traffic driven by our 2 most common brand search keywords. Our brand might not seem like it is getting lots of traffic with only a few thousand searches a month, but when you have a > 70% CTR that can still add up to a lot of traffic. More importantly, that is the kind of traffic which is more likely to buy from you than someone searching for a broad discovery or curiosity type of keyword.
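
To put rough numbers on it (these are purely illustrative, not our actual stats):

    # Purely illustrative numbers - the point is the shape of the arithmetic.
    core_visits = 100000 * 0.02            # a big "trophy" keyword at ~2% CTR
    brand_visits = (3500 + 3000) * 0.70    # two modest brand keywords at > 70% CTR

    print(core_visits, brand_visits)       # 2000.0 4550.0 - the trophy keyword sends less than half the brand traffic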

The lessons for SEOs in that data?

  • Core keywords & raw mechanical SEO are both frequently heavily over-rated in terms of value.
  • Rather than sweating trying to rank well for the hardest keywords, first focus on more niche keywords that are easy to rank for.
  • If your rankings come easily and there is little work left to do, then there is lots of time to focus on giving people reasons to talk about you and reference you.
  • Work on building up brand & relationships. This not only gives your link profile more karma, but it sends you a steady stream of leads for if/when you fall out of favor a bit with the search engines.

Those who perceive you well will seek you out and buy from you. But it is much harder to sell to someone who sees you as just another choice amongst many results.

Search is becoming the default navigational tool for the web. People go to Google and then type in "yahoo." If you don't have a branded keyword as one of your top keywords that might indicate long-term risk to your business. If a competitor can clone most of what you are doing and then bake in a viral component you are toast.

Going After the Wrong Brand Keywords

Arbitraging 3rd party brands is an easy way to build up distribution quickly. This is why there are 4,982 Britney Spears fan blogs (well 2 people are actually fans, but the other 4,980 are marketers).

But if you want to pull in traffic you have to go after a keyword that is an extension of the brand. Ranking for "eBay" probably won't send you much traffic (as their clickthrough rate on their first result is probably even higher than the 70% I had above). Though if you have tips on how to buy or sell on eBay those kinds of keywords might pull in a much higher clickthrough rate for you.

To confirm the above I grabbed data for a couple SEO tool brands we rank well for. A number 3 ranking (behind a double listing) and virtually no traffic!

Different keyword, same result

Informational Keywords

"Link building" is still a bit of a discovery keyword, but I think it is perhaps a bit later-stage than just the acronym "SEO." Here the click volume distribution is much flatter / less consolidated than it was in the above brand-oriented examples.

If when Google lowers your rank you still pull in a fairly high CTR that might be a signal to them that your site should rank a bit higher.

Enough Already!

Enough about our keywords, what does your keyword data tell you? How can you better integrate it to grow your business?

Clicky Web Analytics Interview

The field of web analytics is filled with free options, self hosted options, open-source products, expensive options, and affordable paid solutions. If you are looking for an affordable, feature-rich, and easy to use web analytics package you may want to check out Clicky.

Clicky is real time as well, which is a feature even some of the more popular services do not have. You can find a comparison between Clicky and their competitors right on their home page. Currently you can go back 6 months in the interface, so you'll want to make copies of your data every few months or so.

Recently we interviewed Sean and Noah over at GetClicky.Com. Clicky is pretty popular with the members here and it's a great alternative to Google Analytics.

Sean and Noah were kind enough to answer some questions about their business model, future plans, and the rich feature set within their product.

1. Is selling the company in your future plans? If so, how would data be protected in such a case? As an example, Tracking202 sold out to Bloosky and that concerned many affiliates. Do you plan on selling a version of the software which can be hosted locally on the user's own server, to get around worries associated with you possibly selling the service someday?

Selling the company is never out of the question; however, it would be inane and arrogant to plan solely for such an exit. We enjoy building Clicky and interacting with the Clicky community, and new owners usually have new agendas. Therefore, we prefer to keep Clicky rather than sell it. But if we did sell it, we would only do so under the condition that nothing changes for existing users. We do not have any plans to offer a self hosted option.

2. Do you sell the data at all? How secure is the data? Some of our members pointed out that Clicky doesn't have an about us page and Roxr's site is thin on the "who you are" details. In dealing with certain search engines, a few folks in the SEO field like to carry around a tinfoil hat or 3. Could you tell us a bit more about your company, infrastructure, etc.?

Under no circumstances do we sell our customers' data. Data is stored locally and only accessed by its respective owners. Our privacy policy states this and we abide wholly by this unsigned "contract". We have never had any unauthorized access to customer data. We provide SSL login and encourage customers to use this feature.

Your members are correct; we don't mention the "people" behind Clicky. However, once a user registers for Clicky, he will shortly discover we are at his disposal. We usually respond to emails in the same day; we collaborate with our users through our forums and blog; phone support is offered to our white label clients; and Sean and I are always a tweet away.

  • Sean - @schammy
  • Noah - @roxr
  • Clicky - @getclicky

We build, buy, and host our servers. We chose this route from the beginning because it was cheaper in the long run and gave us more control. The processing of hundreds of millions of clicks daily and billions of database queries is inherently too costly to lease out. Many people ask us why we don't move to the cloud. Cloud computing hosts are new to the market and unproven in our opinion. We have a system that works and is cost-effective.

And if there's any doubt about the quality or "trustworthiness" of our service or our company, just search Google for "getclicky" - you will find thousands of positive reviews and other things about us, and almost nothing negative. I think I've only ever seen 2, or maybe 3, "negative" articles about our service, and all of them were over something pretty silly - but people love to rant when they're mad.

3. Will (or can) Clicky get into intricate analytics tracking to the degree of being able to be relied on for multi-channel attribution analysis? Being able to track vanity URLs, special coupons, offer codes, etc. Essentially being able to track multiple offline and online campaigns?

We have full compatibility with Google Analytics campaign tags, which makes it all the easier for existing GA users to move to our platform. These campaign tags (as you probably know) allow you to easily track and segment visitors arriving at your site from any of your online campaigns. For offline campaigns, we also have a "manual" campaign feature, so you can enter in a landing page URL, e.g. mysite.com/tv1, and we'll automatically flag all users who land on that page and report the campaign data together with your "dynamic" (GA) campaigns. We also have a custom data tracking feature so you can attach any data you want to any visitor session (e.g. if they used an offer code when submitting payment). And you can filter/segment your visitors based on this custom data too.
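
For anyone who has not used campaign tags before, they are just the standard utm_* query parameters appended to a landing page URL. A quick example (the domain and tag values here are made up):

    # Building a campaign-tagged landing page URL with the standard utm_* parameters.
    from urllib.parse import urlencode

    base = "http://mysite.com/landing"                # hypothetical landing page
    tags = {
        "utm_source": "newsletter",                   # where the visitor came from
        "utm_medium": "email",                        # the channel
        "utm_campaign": "spring-sale",                # the campaign you will segment on
    }
    print(base + "?" + urlencode(tags))
    # http://mysite.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale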

4. Do you see yourselves becoming an acquisition target for Google? What is to stop all the data currently collected by Clicky from ending up in the hands of Google (as an acquisition target maybe)?

It's certainly a possibility that Google may buy us, but we don't really expect it to happen. We believe strongly in privacy, so we would try to ensure the data is treated as private and not used to "improve" search results, as they do with Google Analytics. Of course, the trade-off there with GA is that they let you use it for free, in exchange for that. (They don't tell you that up front, but it's common knowledge they use GA data for all types of optimization stuff, particularly search related). If Google wanted to use the same model with Clicky, we would be against it on principle, but it would really depend on the specifics. And if Google insisted on it, then we'd insist on letting our users know about that kind of change so they could cancel their account if they wanted to.

5. Sort of a piggyback to question 3 but with Clicky's customization abilities how far can one push the limit on segmentation, custom variables, and so on? Seems like lots of possibilities there but to the non-techie folks it can be a daunting task. Do you plan to offer paid support, paid campaign set up, or maybe a "Clicky Authorized Provider" program to help people set up intricate analytics accounts?

There are really no limits on segmentation, other than at this time you can only do it for a maximum date range of one month at a time. But other than that, you can segment your visitors on a theoretically unlimited amount of data.

Segmentation is actually one of our strong points, because you don't have to fill out crazy forms or anything to find the data you want. In almost every report, the items in the report are clickable (e.g. viewing your top countries, you could click the US and then you would immediately be seeing only visitors from the US, including a summary of their activity at the top). And once this filtering is invoked, it's very simple to add additional variables via the blue drop-down menu at the top, e.g. referring domain = google.com, then you would see all of your US visitors who arrived via Google.

We help users for free through email, our forums, and Twitter. We don't have paid support but then again we don't tell someone we won't help them, no matter what the problem is. We give higher priority to complex problems or questions to our paying users, but we still answer all support requests, no matter if the user is paying or not. Adding paid support may be something we do in the future if there's demand for it. We would have to expand our headcount first though. Currently it's still just the two of us running this operation.

Thanks for the time guys!

Well there you have it. Clicky has some pretty deep segmentation and tracking options which are both vital to the success of web analytics set up. We hope you learned a bit more about the company and the product via this Q&A. Clicky has a great support forum as well, for any questions you have as you start to get familiar with their product.

Interview With Anita Campbell in ~ 1 Hour :)

Hi Everyone
Anita Campbell will be interviewing me on her Small Business Radio program in ~ 1 hour & 15 minutes, at 1:30 PM Eastern.

I was on Small Business Trends Radio

Paid Content: the New Paid Link

Paid Links Are Spam

Buying links is considered spammy by Google because it is a ranking short cut which subverts search relevancy algorithms.

And so Google considers it a black hat SEO practice.

Links are somewhat hard to scale because (outside of those who create a network of spam) it is time intensive to find the right sites, negotiate a price, and then ensure appropriate placement. It requires interacting with many webmasters & going through a lot of rejections to get a few yes responses. Due to scale limitations, paid links typically only exert a slight influence on core industry keywords and common variations, limiting any potential relevancy damage.

Further, when a person buys a link, the relevancy is almost always guaranteed (as one would go broke fast if they rented links targeting irrelevant keywords).

Even still, Google hates paid links because they can lower result diversity & bias the organic search results away from being informational and towards being commercial (which in turn means that Google AdWords ads get fewer clicks).

Policing Paid Links

To make link building efforts easier to police, Google created nofollow, which aimed to disrupt the flow of link equity across certain links. Initially the alleged purpose was blocking comment spam. And then after it was in place, comment spam never went away, but the role of rel=nofollow quickly expanded into a cure-all to be placed on any paid link.

Google encouraged spam reports that highlight paid links. SEO blogs highlighted people that were buying links. Firms like Text Link Ads were eradicated from the Google index. And all was well in GoogleLand.

...Until...

The Rise of Content Farms

Over the past few years people realized that Google had dialed up the weight on domain authority & that links were now much harder to get. So companies started placing lead generation forms on trusted sites & firms like Demand Media purchased highly trusted websites like eHow (which already had a ton of links in place from back when links were easier to obtain).

Demand Media then automated and streamlined the content production process and poured content into eHow until the rate of returns on new content and growth rate started to slow.

This type of strategy attacks the longtail of search, and given how many unique search queries there are each day, that amounts to a lot of opportunity!

Corporate Content Farming: The Art of Informationless Information

Anyone who has watched The Meatrix is likely afraid of factory farms. The content created by these content farms isn't much better. When I highlighted how bad one of the pieces was, their solution was to delete it and hide it from sight, then write a memo about how they do "quality" content at scale.

That scale part is no joke - Demand Media brought in over $200 million last year. And I suppose if they put the word "low" in front of quality, it wouldn't be a joke either.

Abusing Nofollow

These same authoritative websites which managed to create content for $10 to $15 a page (or sometimes $0 auto-generated pages) then leveraged nofollow on *all* outbound links, so that they would not vote for anyone, even if their content was only a thin watered down rewrite of 3rd party content sources:

eHow is a content publisher known for “How To..” articles. Lately, it seems eHow visits other websites, scrapes their instructional content (on whatever topic), and republishes it as a How To article on eHow. Sometimes the entire step-by-step process is “copied” for the eHow article. I’ve noticed a few times this week, how eHow articles are basically copies of existing content from other sites, worse than Wikipedia rewrites. That’s pretty much “scraping”, even if done by poorly-paid human workers.

So now companies are building a wide range of "content" business models ranging from auto-generated content to semi-autogenerated mash-ups to poorly crafted manual rewrites (as mentioned above).

Content Scraping & Recycling as a Legitimate Business?

Even search engines are becoming general purpose scrapers, snagging third party content, mixing it together, wrapping it in ads, and pushing it into the index of other search engines.

The result?

Ask.com's share of search traffic rose 21% last month alone!

The Information Age

We are no longer in an “Information Age.” We are in the Age of Noise. Falsehoods, half-truths, talking points, out-of-context video edits, plagiarism, rewriting of history (U.S. was founded as a Christian nation, for example), flip-flops, ignoring facts (Cheney and torture for example), neatly packaged code words and phrases, media ratings focus, dysfunctional government (fillibusters have more than doubled, but most don’t realize Republicans are blocking everything), mainstreaming fringe causes….I could go on and on. Is it any wonder why so many who are struggling with kids, jobs, rising medical costs, etcetera have such a tough time wading through all the crap? - source

Paranoid About Links

As building up your own profile has grown harder (since links are harder to get) many new web 2.0 websites provide free outbound links to help encourage participation and get links back into their websites. But then after they reach a critical mass they claim that spam is an issue and strip away the links by using nofollow, stealing that hard work people did to build up the network, offering nothing in return for it!

Google's fear of links is *so out of hand* that an SEO simply mentioning that a person can get a link from their own profile page on a social site is enough to have Matt Cutts go out of his way to push the social media site to remove the opportunity. If you put a lot of work building up a social profile Google doesn't want you to benefit from that work, but it is fine if that network does:

If Google is the one who wants that web link nofollowed because some twitter profile pages may be automated bots or spammers, then it is time they realize that THEY are responsible for determining which of those individual pages is authoritative, trusted and legitimate enough to pass link popularity, by a method other than demanding that other websites and social networks change the ways they do business to help Google stop links being used as a form of currency and to manipulate their algorithm – an issue Google and Google alone created and profited from.

Any Form of Payment = Not Trustworthy

A few years back a well known SEO joined our training program, read our tip about using self-hosted affiliate programs as a link building tool, and then promptly outed us directly to Matt Cutts, in a video, and on their blog. Google quickly blocked our affiliate program from passing link juice. Later a Google engineer publicly stated affiliate links should count.

Since then affiliate links have been a gray area (it works for some companies and doesn't work for others, based on 100% arbitrary choices inside Google). Looking for clarification on the issue, Eric Enge recently asked Matt Cutts: "If Googlebot sees an affiliate link out there, does it treat that link as an endorsement or an ad?"

Matt Cutts responded with: "Typically, we want to handle those sorts of links appropriately. A lot of the time, that means that the link is essentially driving people for money, so we usually would not count those as an endorsement."

So links which are driven by payment should not count as endorsements, even if the affiliate does endorse & believe in the product. The fact that there is a monetary relationship there means the link *should not count*.

The Elephant in the Room at the GooglePlex

Ignoring links for a moment, let's get back to the content mill business model. It was fine that Demand Media bought trusted (well linked) sites like eHow for their trust to pour low-end content into, even though those pre-existing links were bought by the new owner.

And here is where the content mill business model gets really shady, in terms of "what is good for the user" ... Demand Media is now licensing backfill content to be hosted on USAToday.com on a revenue share basis. Describing the relationship, Dave Panos, Demand Media's CMO said "It's an opportunity for us to get in front of the audience that's already congregating around very well-known brands."

But you won't find that content on the USAToday.com homepage.

When he said "already congregating around very well-known brands" what he meant was "will rank well on Google." And so, what we have is a paid content partnership which subverts search relevancy algorithms.

If affiliate links shouldn't count, then why would affiliate content?

If Google doesn't stop it from day 1 then the media companies are going to quickly become addicted to the risk-free money like crack. And if Google tries to stop it *after* it is in place then they are going to find themselves lambasted in the media with talks of anti-trust concerns.

Something to think about before heading too far down that path.

Two Roads Diverged in a Wood...

How is a content exchange network any different than a link exchange network? The intent is exactly the same, even if the mechanics and payment terms differ slightly.

If a paid link that subverts search relevancy algorithms shouldn't count on the web graph, then why should Google trust paid content that subverts search relevancy algorithms?

Will the search results start filling up with similar-sounding misinformed content, occupying 1, then 3, then 8 of the top 10 search results? Do the search results slowly get dumbed down 1 article and 1 topic at a time?

This trend *will* harm both the accuracy and diversity of content ranking in the search results. And it will grow progressively worse as people begin to quote the misinformed garbage on other websites (because hey, if it ranks in Google and is on USA Today it is *probably* true). Or is it?

Some questions worth thinking about:

  • Google is willing to truth police SEOs. Will they do the same for media outlets publishing backfill "content"?
  • How will Google be able to filter out the Demand Media content without filtering out the rest of the media sites?
  • Does Google care if the quality & diversity of the search results is diminished, even if/when most searchers will not be savvy enough to recognize it? I guess it depends on who has the last word on the issues inside Google, because most garbitrage content is wrapped in AdSense ads.

Beauty is Rare - Elusive so it Can be Easily Sold

A couple years ago my wife and I had our big wedding in the Philippines (we even had the mayor of Manila show up). She was so beautiful that day. And lucky for me she is just as beautiful when she wakes up each day. :D

But she can be hard on herself and if she gains a single pound she worries. Truth is I am the chubby one who needs to drop weight.

Beauty (and the perception of it) is a wonderful commodity to sell because there is no limit. Almost everyone could be in better shape or be stronger or eat healthier or not have this or that birthmark or the odd finger that bends backwards.

We are imperfect beings by our very nature.

We get sick.

We break.

And we all fight the battle of aging one day at a time - every single day!

But no matter where you go, whatever is rare is typically considered desirable & beautiful. This is not done as an accident, but as a way to generate profits. If the human condition is flawed (and can't be fixed) then the person selling a bogus solution to that problem is going to make a lot more money than a person who sells something which is actually attainable.

And so we live in a world where we treat symptoms, rather than problems. Anything to make the numbers look good and make the sale. From there you are on your own! If you feel bad, we can give you more drugs!

Spending too much time at the computer and eating unhealthy food has made me a bit too chubby. No good in obese America! But did you know that in certain times & cultures being fat was considered a sign of beauty, like when few people could afford to be fat! ;)

There is too much high fructose corn syrup in the typical American diet for obesity to be considered beautiful:

"Our findings lend support to the theory that the excessive consumption of high-fructose corn syrup found in many beverages may be an important factor in the obesity epidemic," Avena said.

The new research complements previous work led by Hoebel and Avena demonstrating that sucrose can be addictive, having effects on the brain similar to some drugs of abuse.

In the United States many girls not only label anorexia as beauty, but some go to tanning salons so they can darken their skin to look beautiful, at least until they get older:

Long-term exposure to artificial sources of ultraviolet rays like tanning beds (or to the sun's natural rays) increases both men and women's risk of developing skin cancer. In addition, exposure to tanning salon rays increases damage caused by sunlight because ultraviolet light actually thins the skin, making it less able to heal. Women who use tanning beds more than once a month are 55 percent more likely to develop malignant melanoma, the most deadly form of skin cancer.

A service which has no lasting positive tangible value AND certainly has a lasting negative tangible risk can grow to become a multi-billion Dollar industry.

Anything to be beautiful! This is what beautiful people do. I want to be beautiful.

The above never really made sense to me and always felt a wee bit scammy. There was an odd odor to it, but it was hard to appreciate how scammy it was, until...

It really hit home for me when my wife and I were in the Philippines. Many of the department stores sell skin whitening soap! Having a lighter skin tone is supposed to be a sign that you are from a wealthier family. And since wealth is concentrated, that is rare. And so that is what is considered beautiful. :D

Are Content Mills the Future of Online Publishing? What Comes Next?

Aaron discussed content mills in his interview with Tedster yesterday.

What is a content mill?

A content mill is a site that publishes cheap content. The content is either user-contributed, paid, or a mix of the two. The term content mill is obviously pejorative, the implication being that the content is only published to pump content into search engines, and is typically of low value in terms of quality.

The problem is that some sites that publish cheap content may well provide value, but it depends who is reading it. For example, a forum might be considered a content mill, as it contains cheap, user-generated content of little value to a disinterested visitor, or a forum might be a valuable, regularly updated resource provided by a community of enthusiasts!

Depends who you ask.

As Aaron says, content mills are all the rage in 2010. Let's take a closer look.

Why Are SEOs Interested In Content Mills?

This idea is nothing new. It's actually a white-hat SEO strategy, and has been used for years.

  • Research keywords
  • Write content about those keywords
  • Publish content and attempt to rank that content in search engine results
  • Repeat

If you can publish a page at a lower cost than your advertising return, then you simply repeat the process over and over, and you're golden. Think AdSense, affiliate programs, and similar means to monetize pages. Take a look at Demand Media.
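
The underlying math is blunt. A back-of-the-envelope sketch (every number here is invented for illustration):

    # Back-of-the-envelope content mill economics. All numbers are invented.
    cost_per_page = 15.00      # what a cheap freelance article costs to produce
    revenue_per_page = 1.50    # ad / affiliate earnings per page, per month

    payback_months = cost_per_page / revenue_per_page
    print("Each page pays for itself in about %.0f months" % payback_months)   # 10
    # After payback everything is margin, so the model scales for as long as
    # production cost stays below the lifetime advertising return of a page.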

The Problem With Content Mills

One of the problems with content mills is that in an attempt to drive the production cost of content below the predicted return, some site owners are producing garbage content, usually by facilitating free contributions from users.

At the low end, Q&A sites proliferate wherein people ask questions and a community of people with opinions, informed or otherwise, provide their two cents worth. Unfortunately, many of the answers are worth somewhat less than two cents, resulting in pages of little or no value to an end reader. I'm sure you've seen such pages, as such pages often rank well in search engines if they are published on a domain with sufficient authority.

Some sites, like Mahalo, not only automate their page creation, but use those automated pages to generate automated related-question pages as well. The rabbit hole has no bottom!

At the other end of the spectrum, we have sites that publish higher-cost, well researched content sourced from paid writers. A traditional publishing model, in other words. Generally speaking, such pages are of higher value to the end user, but the problem is that the search engines don't appear to be able to tell the difference between these pages and the junk opinion pages. If the content mill has sufficient authority, then the junk gets promoted.

And there are many examples in between, of course.

As Tedster mentioned, "the problem here is that every provider of freelance content is NOT providing junk - though some are. As far as I know, there is no current semantic processing that can sort out the two. It's tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn't for me. So I also assume it must be on Google's radar."

The Future Of Content Mills

I think Tedster is right - such sites will surely appear on Google's radar, because junk, low value content doesn't help their end users.

It must be a difficult problem to solve, else Google would have done so by now, but I think it's reasonable to assume Google will try to relegate the lowest of the low-value content sites at some point. If you are following a content mill strategy, or considering starting one, it's reasonable to prepare for such an eventuality.

The future, I suspect, is not to be a content mill, in the pejorative sense of the word. Aim for quality.

Arbitrary definitions of quality are difficult enough, as we've discussed above. Objective measurement is impossible, because what is relevant to one person may be irrelevant to the next. The field of IQ (information quality) may provide us some clues regarding Google's approach. IQ is a form of research in systems information management that deals specifically with information quality.

Here are some of the metrics they use:

  • Authority - Authority refers to the expertise or recognized official status of a source. Consider the reputation of the author and publisher. When working with legal or government information, consider whether the source is the official provider of the information.
  • Scope of coverage - Scope of coverage refers to the extent to which a source explores a topic. Consider time periods, geography or jurisdiction, and coverage of related or narrower topics.
  • Composition and Organization - Composition and Organization has to do with the ability of the information source to present its particular message in a coherent, logically sequential manner.
  • Objectivity - Objectivity is the bias or opinion expressed when a writer interprets or analyzes facts. Consider the use of persuasive language, the source's presentation of other viewpoints, its reason for providing the information, and advertising.
  • Validity - Validity of some information has to do with the degree of obvious truthfulness which the information carries
  • Uniqueness - As much as ‘uniqueness’ of a given piece of information is intuitive in meaning, it also significantly implies not only the originating point of the information but also the manner in which it is presented and thus the perception which it conjures. The essence of any piece of information we process consists to a large extent of those two elements.
  • Timeliness - Timeliness refers to information that is current at the time of publication. Consider publication, creation and revision dates.
  • Reproducibility

Any of this sound familiar? It should, as the search landscape is rife with this terminology. This is not to say Google looks at all these aspects, but they have used similar concepts, starting with PageRank.

As conventional SEO wisdom goes, Google may have tried to solve the relevancy problem partly by focusing on authority, on the premise that a trusted authority must publish trusted content, so the pages of a domain with a high degree of authority receive a boost over those with lower authority levels. But this situation may not last, as some trusted sources, in terms of having authority, do, at times, publish auto-gen garbage content. Google may well start looking at composition metrics, if they aren't doing so already.

This is speculation, of course.

I think a good rule of thumb, for the time being, should be "will this page pass human inspection?" If it looks like junk to a human reviewer in terms of organization, and reads like junk in terms of composition, it probably is junk, and Google will likely feed such information back into their algorithms. Check out Google's Quality Rater Document from 2007, which should give you a feel for Google's editorial policy.

DIY SEO Software Reviews

Is DIY SEO any good? Does it work?

When I got to look at DIY SEO my first thought was: good structure & layout, let's see what is under the hood. But then after opening up the hood I found a car with no engine.

On a score of usability I would give the site a 9 or a 10, but in terms of utility it would be lucky to score as high as a 2 or a 3.

Google AdWords: The Cheapest SEM Strategy for Small & Local Businesses

Maybe there are some small businesses out there who are content with being obscure, or who only want to rank for their own business name plus maybe 1 or 2 longtail keywords. But for those businesses I suggest bypassing SEO and buying a few Google AdWords ads.

  • Low traffic keywords are typically cheap to buy search ads on - because you only pay by the click. If few people are searching for something then there will be few clicks to buy.
  • Not only are such markets small, but due to their small size they are also heavily fragmented, making the AdWords traffic even cheaper.
  • If few people are searching for your brand then you can likely spend $25 a month on AdWords and ignore learning SEO (see the back-of-the-envelope math after this list).
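
Back-of-the-envelope math on that $25 figure (all numbers invented for illustration):

    # Why low-volume keywords are cheap to buy. All numbers are invented.
    monthly_searches = 300    # searches for your business name + a couple of longtail terms
    ad_click_rate = 0.20      # share of those searchers who click your ad
    avg_cpc = 0.40            # cost per click in a small, fragmented niche

    monthly_spend = monthly_searches * ad_click_rate * avg_cpc
    print("~$%.2f per month" % monthly_spend)    # ~$24.00 per month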

A Legitimate SEO Strategy Requires Investment

With a paid search campaign, you can use Google AdWords to instantly buy search traffic and gain new customers. SEO is a drawn out strategy & typically requires a much deeper initial investment.

There is little value in investing in SEO unless your goal is to dominate your market, and there is sufficient market scale to justify investing thousands of Dollars (and far more when you consider the value of your time). After all, a single link from Business.com or the Yahoo! Directory will run you $299, and 2 links hardly make for an effective SEO strategy - but they will set you back $600 a year.

And you don't get those links any cheaper just because your business is small. ;)

Why Does DIY SEO Offer Such a Weak SEO Solution?

When looking at DIY SEO it took me a while to think it through, because I kept thinking "something is missing." Why did they raise funding to build THAT? But then I thought it through. DIY SEO was designed by marketers looking to sell something that would be easy to sell at scale - it was not created out of passion to solve a real problem with the desire to help make a difference in people's lives.

The difference is not subtle.

After all, Andy is the guy who had time to build out hundreds of thin affiliate sites, while being too lazy (and lacking the concern needed) to fix his SEO blog for months while it installed malware on anyone who visited his site. That blog had the tagline Livin' the dream, but that's for him; you get to live with the malware. He doesn't care.

Could you imagine a reputable SEO site like Search Engine Land, SEO Book, SEOmoz, or Search Engine Journal delivering malware for months without any care or concern? I can't.

SEO Consulting is Expensive

SEO is both time consuming and expensive. Neither Andy nor Patrick offers consulting services, because they value their time too much to actually dig into client websites and provide useful, relevant, honest, and effective feedback. Patrick states this on his blog.

And as Andy's site states: he no longer sells consulting, and he does not want you to email him

DIY SEO was designed as a high margin automated solution which is so automated that it wouldn't require much feedback or interaction with customers.

But there is one big problem with that strategy...

SEO is *NOT* a Mechanical Process

For anyone looking to seriously compete on the web the DIY SEO tool/system is inadequate, and potentially even harmful. Why?

In SEO, a lot of the potential profit comes from knowing your market well, leveraging new technologies & distribution channels to gain market share, and putting a new spin on old marketing ideas. But they tried to make SEO too black and white...far too mechanical. Anywhere critical thought & analysis can add value to your SEO strategy, you can count on none of it being done with DIY SEO - just some predetermined path which doesn't really account for everything that makes your business and your market unique.

In an age where the algorithms keep advancing faster and subjective things like branding start playing a role in the search results, mechanical doesn't cut it.

DIY SEO is too prescriptive and limited in nature, and it is a backward looking product. What the phrase "for the rest of us" actually means is "good enough to rank on page 5 of the search results, where you will get virtually no search traffic and make no money."

Paint by Number SEO: An SEO Failure Case Study

Google doesn't always respond to marketing efforts in a predictable way. Consider what happened when Patrick purchased SearchEngineOptimization.net for over $60,000.

At the end of last year he tried to do a 301 redirect to get it to rank, but when it didn't work he asked Matt Cutts about it.

Google's spam czar Matt Cutts never responded (of course), but Patrick ended up having to remove that redirect. Months later that $60,000+ domain name was a "coming soon" page.

And now that the redirect has been removed the original redirected site does not rank as well as it did in the past. So that was certainly a lose / lose scenario.

And the worst part is, when he mentioned the strategy, people warned him right up front about what would happen.

He would have been better off donating that money to charity!

Advanced SEO? Or Simpleton SEO?

I don't want to share too many examples of how/why/where their program falls short, but to pick a rather glaring one...

Links are the backbone of an effective SEO strategy. Patrick Gavin built from scratch the #1 link broker on the web - Text Link Ads. A few years back he sold that company for over $30 million. Since then links have only increased in importance while becoming harder to get, but if you check out the advice on links in DIY SEO (or at least the advice that was there when I recently checked), one of the "advanced" SEO tips was to ensure that you are not engaging in any link buying or selling.

The advanced tips were not sharing safe & effective link buying techniques. Nope.

The advice was to ensure you were not engaging in link buying or selling.

And, of course, on their own websites they don't follow their own advice.

What Do Effective SEO Campaigns Consist of?

I have no desire to out any of their specific websites (hey I have some crappy ones too), but when you look at the EFFECTIVE strategies that you see Andy and Patrick use in their own publishing efforts, at a minimum they contain strategies like:

  • buying old websites
  • buying strong domain names
  • selectively buying links
  • providing a bit of grease to certain About.com guides for coverage of new 1 page sites
  • nepotistically cross linking sites
  • launching top 100 linkbait lists about trending popular topics
  • building social media accounts to promote those lists
  • buying out some blogs to further seed giving legitimate looking coverage to those lists
  • launching egobait lists of topics like the top 100 ambidextrous hermaphrodite bloggers (complete with running an automated email script to alert people of the "award" they have won, with some people winning multiple awards in the same day - congrats again Nancy P. from Texas on your multiple meaningless awards + thanks for the links...your email address is now in the database, and you will win many more awards as they build out their portfolio of websites!!!!)

... all the clever bits of marketing that go into REAL SEO campaigns that compete on the competitive commercial web ... well that stuff is NOT part of the DIY SEO program.

And it likely won't EVER be, because it isn't paint by number.

Better Small Business SEO Solutions

Want an effective guide to small business SEO? Check out Matt McGee's small business SEO guide. It will give you more than the above program while only setting you back $25.

There are numerous free guides worth recommending as well. Both Bing and Google offer SEO starter guides. We created this one for non-profits, this one for bloggers, and this one for general business websites. SEOmoz offers a pretty good one too.

Interview of Tedster from WebmasterWorld

If you have been in the SEO field for any serious length of time you have probably come across (and benefited from) some of Tedster's work - either directly, or indirectly from others who have repackaged his contributions as their own. He is perhaps a bit modest, but there are few people in the industry as universally well respected as he is. I have been meaning to interview him for a while now, and he is going to be speaking at Pubcon South on April 14th in Dallas, so I figured now was as good a time as any :)

How long have you been an SEO, and how did you get into the field?

I started building websites in 1995 and the word SEO hadn't been invented. I came from a background in retail marketing, rather than technology or graphic design. So my orientation wasn't just "have I built a good site?", but also "are enough people finding my site?"

The best method for bringing in traffic seemed to be the search engines, so I began discussing this kind of marketing with other people I found who had the same focus. Ah, the good old days, right? We were so basic and innocently focused, you know?

If you could list a few key documents that helped you develop your understanding of search, which would be the most important ones?

Here are a few documents that acted as major watersheds for me:

Is PageRank still crucial? Or have other things replaced it in terms of importance?

What PageRank is measuring (or attempting to measure) is still very critical — both the quality and number of other web pages that link to the given page. We don't need to worship those public PR numbers, but we definitely do need quality back-links (and quality internal linking) to rank well on competitive queries.

There appears to be something parallel to PR that is emerging from social media - some metric that uses the model of influencers or thought leaders. But even with that in the mix, ranking would still depend on links; they would just be modified a bit by "followers" and "friends", since many social sites are cautious with do-follow links.
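
For anyone who has never looked at what "quality and number of links" means in math terms, here is a minimal sketch of the classic PageRank power iteration. The tiny link graph and the 0.85 damping factor are the standard textbook setup, not anything Google-specific.

    # Toy link graph: page -> pages it links out to (all names invented).
    links = {
        "home": ["about", "blog"],
        "about": ["home"],
        "blog": ["home", "about"],
    }

    def pagerank(links, damping=0.85, iterations=50):
        """Classic power-iteration PageRank over a graph with no dangling pages."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outlinks in links.items():
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank
        return rank

    print(pagerank(links))  # "home" ends up with the most link equity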

Let's play "I've got a penalty" - SEO edition. Position 6, -30, 999, etc. Are these just bogus excuses from poor SEOs who have no business calling themselves SEOs, or are they legitimate filters & penalties?

If the page never ranked well, then yes - it could well be a bogus excuse by someone whose only claim to being an SEO is that they read an e-book and bought some rank tracking software. However, Google definitely has used very obvious numeric demotions for pages that used to rank at the top.

The original -30 penalty is an example that nailed even domain name "navigational" searches. It affected some sites that did very aggressive link and 301 redirect manipulation.

What was originally called the -950 (end of results) penalty, while never an exact number, most definitely sent some very well ranked pages down into the very deep pages. Those websites were often optimized by very solid SEO people, but then Google came along and decided that the methods were no longer OK.

In recent months, those exact number penalties seem to have slipped away, replaced by something a bit more "floating" and less transparent. My guess is that a negative percentage is applied during the final re-ranking run, rather than subtracting a fixed number of positions. Google's patent for Phrase-based Indexing does mention both possible approaches.

But even using percentages rather than a fixed number, when a top-ranked page runs afoul of some spam prevention filter, it can still tank pretty far. We just can't read the exact problem from the exact number of positions lost anymore.
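
To make that distinction concrete, here is a toy Python comparison of the two demotion models: sliding a page down a fixed number of positions versus shaving a percentage off its re-ranking score. The page names, scores, and percentages are all invented for illustration.

    # Invented result set: (page, final re-ranking score), best score first.
    results = [("page-a", 9.0), ("page-b", 8.5), ("page-c", 8.4), ("page-d", 6.0)]

    def demote_fixed(results, url, drop=30):
        """Old-style penalty: slide one page down a fixed number of positions."""
        ranked = [u for u, _ in sorted(results, key=lambda r: -r[1])]
        i = ranked.index(url)
        ranked.remove(url)
        ranked.insert(min(i + drop, len(ranked)), url)
        return ranked

    def demote_percentage(results, url, cut=0.10):
        """Floating penalty guess: cut a percentage from the final score."""
        adjusted = [(u, s * (1 - cut)) if u == url else (u, s) for u, s in results]
        return [u for u, _ in sorted(adjusted, key=lambda r: -r[1])]

    print(demote_fixed(results, "page-a", drop=30))        # clamps to the end, like the old "end of results" effect
    print(demote_percentage(results, "page-a", cut=0.10))  # lands wherever the reduced score happens to fall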

Do you see Google as having many false positives when they whack websites?

Yes, unfortunately I do. From what I see, Google tends to build an algorithm or heuristic that gathers up all the URLs that seem to follow their "spam pattern du jour" - and then they all get whacked in one big sweep. Then the reconsideration requests and the forum or blog complaints start flying and soon Google changes some factor in that filter. Voila! Some of the dolphins get released from the tuna net.

One very public case was discussed on Google Groups, where an innocent page lost its ranking because a "too much white space" filter misread the effect of an iframe!

Google's John Mueller fixed the issue manually by placing a flag on that one site to trigger a human inspection if it ever got whacked in the future. I'd assume that the particular filter was tweaked soon after, although there was no official word.

How many false positives does it take to add up to "many"? I'd guess that collateral damage is a single digit percentage at most — probably well under 5% of all filtered pages, and possibly less than 1%. It still hurts in a big way when it hits YOUR meticulously clean website. And even a penalty that incorrectly nails one site out of 300 can still affect quite a lot over the entire web.

How often when rankings tank do you think it is due to an algorithmic issue versus an editorial issue with search employees?

When there are lots of similar complaints at the same time, then it's often a change in some algorithm factor. But if it's just one site, and that site hasn't done something radically new and different in recent times, then it's more likely the ranking change came from a human editorial review.

Human editors are continually doing quality review on the high volume, big money search results. It can easily happen that something gets noticed that wasn't seen before and that slipped through the machine part of the algorithm for a long time.

That said, it is scary how often sites DO make drastic errors and don't realize it. You see things like:

  • nofollow robots meta tags getting imported from the development server
  • robots.txt and .htaccess configurations gone way wrong
  • hacked servers that are hosting cloaked parasite content

Google did a big favor for honest webmasters with their "Fetch as googlebot" tool. Sometimes it's the easiest way to catch what those hacker criminals are doing.
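
In that spirit, here is a rough self-serve sanity check - a minimal Python sketch, not a substitute for the Fetch as googlebot tool - that pulls a page the way a crawler would and flags the first two mistakes on the list above. The URL and user-agent string are placeholders.

    import re
    import urllib.parse
    import urllib.request
    import urllib.robotparser

    def crawl_check(url, user_agent="example-crawler"):
        # 1. Does robots.txt block this URL for the given user-agent?
        root = "{0.scheme}://{0.netloc}".format(urllib.parse.urlsplit(url))
        rp = urllib.robotparser.RobotFileParser(root + "/robots.txt")
        rp.read()
        print("robots.txt allows crawling:", rp.can_fetch(user_agent, url))

        # 2. Did a noindex/nofollow robots meta tag slip in from the dev server?
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "ignore")
        metas = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
        print("robots meta tags found:", metas or "none")

    crawl_check("http://www.example.com/")  # placeholder URL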

When does it make sense for an SEO to decide to grovel to Google for forgiveness, and when should they try to fix it themselves and wait out an algorithmic response?

If you know what you've been doing that tripped the penalty, fix it and submit the Reconsideration Request. If you don't know, then work on it — and if you can't find a danged thing wrong, try the Google Webmaster Forums first, then a Request. When income depends on it, I say "grovel".

I don't really consider it groveling, in fact. The Reconsideration Request is one way Google acknowledges that their ranking system can do bad things to good websites.

I've never seen a case where a request created a problem for the website involved. It may not do any good, but I've never seen it do harm. I even know of a case where the first response was essentially "your site will never rank again" — but later on, it still did. There's always hope, unless your sites are really worthless spam.

Many SEOs theorize that sometimes Google has a bit of a 2-tier justice system where bigger sites get away with murder and smaller sites get the oppressive thumb. Do you agree with that? If no, please explain why you think it is an inaccurate view. If yes, do you see it as something Google will eventually address?

I'd say there is something like that going on — it comes mostly because Google's primary focus is on the end user experience. Even-handed fairness to all websites is on the table, but it's a secondary concern.

The end user often expects to see such and such an authority in the results, especially when it's been there in the past. So Google itself looks broken to a lot of people if that site gets penalized. They are between a rock and a hard place now.

What may happen goes something like this: an A-list website gets penalized, but they can repair their spam tactics and get released from their penalty a lot faster than some less prominent website would. It does seem that some penalties get released only on a certain time frame, but you don't see those time frames applied to an A-list.

This may even be an effect of some algorithm factor. If you watch the flow of data between the various Google IP addresses, you may see this: There are times when the domain roots from certain high value websites go missing and then come back. Several data center watchers I know feel that this is evidence for some kind of white-list.

If there is a white-list, then it requires a history of trust plus a strong business presence to get included. So it might also make sense that forgiveness can come quickly.

As a practical matter, for major sites there can easily be no one person who knows everything that is going on in all the business units that touch the website.

Someone down the org chart may hire an "SEO company" that pulls some funny business and Google may seem to turn a blind eye to it, because the site is so strong and so important to Google's end user. They may also just ignore those spam signals rather than penalize them.

Large authority site content mills are all the rage in early 2010. Will they still be an effective business model in 2013?

It's tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn't for me. So I also assume it must be on Google's radar.

I'd say there's a certain parallel to the paid links war, and Google's first skirmishes in that arena gave them a few black eyes. So I expect any move against the cheap content mills to be taken slowly, and mostly by human editorial review.

The problem here is that not every provider of freelance content is producing junk - though some certainly are. As far as I know, there is no current semantic processing that can sort the two apart.

Given that free forums have a fairly low barrier to entry, there are false alarms nearly every day ringing in the next major update or some such. How do you know when a change is the real deal? Do you passively track a lot of data? And what makes you so good at taking a sea of tidbits and meshing them into a working theme?

I do watch a lot of data, although not nearly to the degree that I used to. Trying to reverse engineer the rankings is not as fruitful as it used to be - especially now that certain positions below the top three seem to be "audition spots" rather than actually earned rankings.

It helps to have a lot of private communications — both with other trusted SEOs and also with people who post on the forums. When I combine that kind of input with my study of the patents and other Google communications, usually patterns start to stand out.

When you say "audition spots" how does that differ from "actually earned rankings"? Should webmasters worry if their rankings bounce around a bit? How long does it typically take to stabilize? Are there any early signs of an audition going good or bad? Should webmasters try to adjust mid-stream, and if so, what precautions should they take?

At least in some verticals, Google seems to be using the bottom of page 1 to give promising pages a "trial" to see how they perform. The criteria for passing these trials or "auditions" are not very clear, but something about the page looks good to Google, and so they give it a shot.

So if a page suddenly pops to a first page ranking from somewhere deep, that's certainly a good sign. But it doesn't mean that the new ranking is stable. If a page has recently jumped way up, it may also go back down. I wouldn't suggest doing anything drastic in such situations, and I wouldn't overestimate that new ranking, either. It may only be shown to certain users and not others. As always, solid new backlinks can help - especially if they are coming from an area of the web that was previously not heard from in the backlink profile. But I wouldn't play around with on-page or on-site factors at a time like that.

There's also a situation where a page seems to have earned a lot of recent backlinks but there's something about those links that smells a bit unnatural. In cases like that, I've seen the page get a page one position for just certain hours out of the day. But again, it's the total backlink profile and its diversity that I think is in play. If you've done some recent "link building" but it's all one type, or the anchor text is too obviously manipulated, then look around for some other kinds of places to attract some diversity in future backlinks.

On large & open forums lots of people tend to have vastly different experience sets, knowledge sets, and even perhaps motives. How important is your background knowledge of individuals in determining how to add their input into your working theme? Who are some of the people you trust the most in the search space?

I try never to be prejudiced by someone's recent entry into the field. Sometimes a very new person makes a key observation, even if they can't interpret it correctly.

There is a kind of "soft SEO" knowledge that is rampant today and it isn't going to go away. It's a mythology mill and it's important not to base a business decision on SEO mythology. So, I trust hands on people more than manager types and front people for businesses. If you don't walk the walk, then for me your talk is highly suspect.

I pay attention to how people use technical vocabulary — do they say URL when they mean domain name? Do they say tag when they mean element or attribute? Not that we don't all use verbal shortcuts, but when a pattern of technical precision becomes clear, then I listen more closely.

I have long trusted people who do not have prominent "names" as well as some who do. But I also trust people more within their area of focus, and not necessarily when they offer opinions in some other area.

I hate to make a list, because I know someone is going to get left out accidentally. Let's just say "the usual suspects." But as an example, if Bruce Clay says he's tested something and discovered "X", you can be pretty sure that he's not blowing sunshine.

Someone who doesn't have huge name recognition, but who I appreciate very much is Dave Harry (thegypsy). That's partly because he pays attention to Phrase-based Indexing and other information retrieval topics that I also watch. I used to feel like a lone explorer in those areas before I discovered Dave's contributions.

What is the biggest thing about Google where you later found out you were a bit off, but were pretty certain you were right?

That's easy! Using the rel="nofollow" attribute for PR sculpting. Google made that method ineffective long before I stopped advocating it. I think I actually blushed when I read the comment from Matt Cutts that the change had been in place for over a year.

What is the biggest thing about Google where you were right on it, but people didn't believe until months or years later?

The reality of the poorly named "minus 950" penalty. I didn't name it, by the way. It just sort of evolved from the greater community, even though I kept trying for "EOR" or "End Of Results".

At PubCon South I believe you are speaking on information architecture. How important is site structure to an effective SEO strategy? Do you see it gaining or losing importance going forward?

It is hugely important - both for search engines and for human visitors.

Information Architecture (IA) has also been one of the least well understood areas in website development. IA actually begins BEFORE the technical site structure is set up. Once you know the marketing purpose of the site, precisely and in granular detail, then IA is next.

IA involves taking all the planned content and putting it into buckets. There are many different ways to bucket any pile of content. Some approaches are built on rather personal idiosyncrasies, and other types can be more universally approachable. Even if you are planning a very elaborate, user tagged "faceted navigation" system, you still need to decide on a default set of content buckets.

That initial bucketing process then flows into deciding the main menu structure. Next you choose the menu labels - this is the stage where you fix the actual wording and fold in keyword research. But if a site is built on inflexible keyword targets from the start, then it can often be a confusing mess for a visitor to navigate.
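
As a toy illustration of that bucketing step (every title and bucket name below is made up), the content inventory gets sorted into default buckets first, and the bucket names become the first rough cut at a main menu:

    # Invented content inventory: (page title, default bucket).
    content = [
        ("How to choose a running shoe", "guides"),
        ("Trail runners vs. road shoes", "guides"),
        ("Spring clearance", "deals"),
        ("About our store", "company"),
    ]

    def build_buckets(content):
        """Group the planned content into its default buckets."""
        buckets = {}
        for title, bucket in content:
            buckets.setdefault(bucket, []).append(title)
        return buckets

    buckets = build_buckets(content)
    main_menu = list(buckets)   # first draft of the menu labels, refined with keyword research later
    print(main_menu)            # ['guides', 'deals', 'company']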

As traffic data grows in importance for search ranking, I do see Information Architecture finally coming into its own. However, the value for the human visitor has always been clearly visible on the bottom line.

What are some of the biggest issues & errors you see people make when setting up their IA?

There are two big pitfalls I run into all the time:

  • Throwing too many choices at the visitor. Macy's doesn’t put everything they sell in their display windows, and neither should a website.
  • Using the internal organization of the business as the way to organize the website. That includes merely exporting a catalog to a web interface.

How would you compare PubCon South against other conferences you have attended in the past?

PubCon South is a more intimate venue than, say Vegas. That means less distraction and more in-depth networking. Even though people do attend from all over the world, there is a strong regional attendance that also gives the conference a different flavor — one that I find a very healthy change of pace.

In addition, PubCon has introduced a new format — the Spotlight Session. One entire track is made completely of Spotlight Sessions with just one or two presenters, rather than an entire panel. These are much more interactive and allow us to really stretch out on key topics.

---

Thanks Tedster! If you want to see Tedster speak he will be at Pubcon Dallas on the 14th, and if you want to learn about working with him please check out Converseon. You can also read his latest musings on search and SEO by looking into the Google forums on WebmasterWorld. A few months back Tedster also did an interview with Stuntdubl.

Managing Business Opportunity Overload

Do Something...Now!

In a land of opportunity there is typically lots of distraction; oddly enough, those distractions are usually other opportunities. How many times have you:

  • Stared at a domain you wanted to buy, but didn't pull the trigger
  • Stared at a domain you bought, but left it parked for another year
  • Negotiated down to what you wanted to pay for a site or domain, yet didn't move forward due to (fill in the blank)

Typical reasons surrounding procrastination tend to be "not enough time" or "this will never work". Well, how many of your "can't miss" ideas missed and how many of your "probably will miss" ideas actually hit?

Win More, Lose Less

In my experience, as long as you win more than you lose you're doing ok. That is a bit easier said than done, though. In many professions - take sports, for example - success (worth millions in contracts) can be had while "succeeding" less than 50% of the time. A couple of examples:

  • Hitters in baseball strive to get a .300 average, which is failing 7 times out of 10
  • Basketball players are considered great shooters if they are successful making 45%-48% of their shots

Imagine if you succeeded at those clips. If you did, you had better hope the ones you hit on were big money makers and the ones you missed on required minimal investment. If you take a similar approach to finding and operating in new markets, most of the initial costs are fairly similar. Basic costs like:

  • Design
  • Content
  • Site Promotion
  • PPC Testing

tend to be somewhat similar for your average new site. Purchasing a domain or an existing site can skew the numbers a bit, but overall these things tend to average out. So at the very least, if you are succeeding 6 out of 10 times and you don't get carried away on a new site launch, you should be doing pretty well. The more you do, the better your ratio gets and the better your long term profits are, and you should expect that ratio to rise as you gain more experience researching and launching new ventures.
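
To put rough numbers on that idea (every figure below is invented), a quick expected-value sketch:

    # Back-of-the-envelope math on win rates; all numbers are assumptions.
    cost_per_launch = 2000     # design + content + promotion + PPC testing
    winner_profit = 10000      # assumed average return on a site that works
    win_rate = 0.6             # succeeding 6 out of 10 times
    launches = 10

    expected_profit = launches * win_rate * winner_profit - launches * cost_per_launch
    print(expected_profit)     # 10 * 0.6 * 10000 - 10 * 2000 = 40000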

Dueling Fears

Most of us have a fear of failure, and some of us have a fear of success - a fear that if we become successful it might alienate some of our closest friends and family members, or turn us into workaholics working day and night to sustain that success and lifestyle, and so on. Fear of failure is something I think even the most successful entrepreneurs face from time to time.

Of course, we all know the old hockey saying: "You miss 100% of the shots you don't take".

Fear of failing and of succeeding is something one has to overcome on their own, but in terms of trying to overcome procrastination it is usually advisable to set less rigid and more reasonable deadlines for yourself and your work, as outlined in this post over at harvard.edu http://www.iq.harvard.edu/blog/sss/archives/2006/10/procrastination.shtml (which references a study co-authored by Dan Ariely, who wrote the must-read "Predictably Irrational").


Psychology Today has a research piece on the fear of failure here.

The Cost of No Action

It's kind of difficult to lay out pretty graphs and charts showing what the "cost of procrastination" really is. We can assign some arbitrary number to whatever benchmark profit exists per site in an imaginary portfolio. However, I think it's best if you play with your own numbers a bit and figure out what the cost of doing nothing is to you.

Factor in the hours you might be spending on things like checking your email every 5 minutes, cluttering up Facebook with Farmville posts and annoying your friends with suggestions, wondering if this latest SEO tool suite will be the answer to your prayers, and last but not least wondering if your idea will work. There are more variables of course, but those are some of the things that tend to be commonplace.
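
Here is one way to "play with your own numbers" - every figure below is a placeholder you would swap out for your own:

    # Crude cost-of-inaction estimate; replace the assumptions with your own.
    hours_lost_per_week = 10        # email checking, Farmville, tool shopping...
    your_hourly_value = 50          # what an hour of focused work is worth to you
    site_profit_per_month = 300     # benchmark profit per site in an imaginary portfolio
    sites_not_launched = 3          # ideas still sitting on the shelf
    months_delayed = 6

    time_cost = hours_lost_per_week * 52 * your_hourly_value
    missed_profit = sites_not_launched * site_profit_per_month * months_delayed
    print("yearly cost of the lost hours:", time_cost)      # 26000
    print("profit left on the table:", missed_profit)       # 5400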

Dealing with Competitors

The bottom line is that the web gets more and more competitive every day, and if you are just sitting on the sidelines waiting and waiting and waiting, then your competition is going to sprint by you on their way to the end zone, over and over again.

Even if you don't have any fears of failure or success, or maybe you are extremely self-confident in your abilities, you should consider getting a bit more into the game if you want to make any significant headway in your efforts for world domination. You want to try and avoid doing a bunch of things "average". Try and nail down an effective process which you can replicate somewhat, site to site.

It's Up to You

Project management is an essential skill you'll need if you want to run multiple sites, create multiple products, or run a web business at any scale. I like to work in different markets so I can get a sense of what others are doing to be successful, have more consumer data to evaluate, establish connections with people I otherwise would never have been able to build a business relationship with, and so on. Keeping track of the different things I'm doing can be a chore. Enter.....the cloud.

With so many moving parts to a site these days (SEO, PPC, social media, monetization, domain buying, market research) you'll find yourself with quite a list of to-do's and contacts piling up all over the place. One thing that has helped me tremendously is being able to put most of my business in the cloud with services like:

Being pretty much 100% mobile really has its advantages. I like a change of scenery every once in a while, so having all my stuff readily accessible at a moment's notice is fantastic.

So take advantage of the opportunities out there, don't over-extend yourself, and establish flexible (yet reasonable) due dates and goals for you and your business. In the end, I think you'll thank yourself for it.