Content Locking Ads

Consumer Search Insights.

Google recently launched a consumer insights survey product, which quizzes users in exchange for access to premium content.

How do users get access to these poll questions? Google locks premium content behind them, like so:

Google has long stated that "cloaking is bad," that it is deceptive & that users don't like it. Earlier this year Google also rolled out an algorithm to penalize sites that were too ad-heavy:

We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.

Also recall that the second version of the Panda update encouraged users to block sites, & many programmers blocked Experts-Exchange because they disliked its scroll cloaking. That in turn caused Experts-Exchange to get hit & see a nosedive in traffic.

Between the above & seeing how the implementation of this quiz technology works, I had to ask:
How do you feel about ads that lock content behind poll questions like this one?

Response Vote
Hate them. A total waste of time 63.7% (+3.3 / -3.4)
I am indifferent 30.8% (+3.3 / -3.1)
I love them. These are fun 5.5% (+2.5 / -1.7)

There isn't a huge split between men & women. Men hate them a bit more, but they also like them a bit more...they are just less indifferent.

Vote Men (811) Women (409)
Hate them. A total waste of time 66.1% (+3.4 / -3.6) 61.5% (+5.4 / -5.7)
I am indifferent 27.2% (+3.4 / -3.2) 34.2% (+5.6 / -5.2)
I love them. These are fun 6.7% (+2.3 / -1.7) 4.3% (+5.1 / -2.4)

Young people & old people tend to like such quizzes more than people in the middle. My guess is this is because older people are a bit lonely & younger people do not value their time as much and presume it is more important that they voice their opinions on trivial matters. People just before retirement (who have recently been hosed by the financial markets) tend not to like these polls as much, & the same goes for people in their mid-30s to mid-40s, who are likely short on time trying to balance career, family & finances.

Vote 18-24 year-olds (359) 25-34 year-olds (267) 35-44 year-olds (151) 45-54 year-olds (200) 55-64 year-olds (158) 65+ year-olds (83)
Hate them. A total waste of time 62.1% (+4.9 / -5.2) 62.6% (+6.0 / -6.4) 69.4% (+6.9 / -7.9) 64.5% (+6.5 / -7.1) 68.3% (+6.3 / -7.1) 62.3% (+10.2 / -11.4)
I am indifferent 28.9% (+4.9 / -4.5) 32.1% (+6.2 / -5.6) 24.0% (+7.6 / -6.2) 30.8% (+7.0 / -6.2) 28.4% (+6.9 / -6.0) 28.7% (+11.3 / -9.1)
I love them. These are fun 8.9% (+3.4 / -2.5) 5.3% (+3.7 / -2.2) 6.6% (+5.3 / -3.0) 4.7% (+3.7 / -2.1) 3.3% (+4.4 / -1.9) 9.0% (+9.7 / -4.9)

People out west tend to be more indifferent. Like, whatever man. This may or may not have something to do with California's marijuana laws. ;)

vote The US Midwest (280) The US Northeast (331) The US South (363) The US West (246)
Hate them. A total waste of time 65.2% (+5.6 / -6.0) 69.0% (+6.2 / -7.0) 65.6% (+5.9 / -6.4) 55.6% (+7.2 / -7.5)
I am indifferent 29.7% (+5.9 / -5.3) 25.6% (+6.8 / -5.8) 28.7% (+6.2 / -5.5) 38.7% (+7.4 / -6.9)
I love them. These are fun 5.1% (+4.5 / -2.4) 5.4% (+5.9 / -2.9) 5.7% (+4.8 / -2.7) 5.6% (+7.4 / -3.3)

Rural people tend to like such polls more than others. Perhaps it has to do with a greater longing for connection due to being more isolated?

vote Urban areas (608) Rural areas (117) Suburban areas (477)
Hate them. A total waste of time 62.6% (+4.6 / -4.9) 53.6% (+10.1 / -10.4) 63.8% (+4.8 / -5.1)
I am indifferent 32.2% (+4.8 / -4.4) 37.5% (+10.4 / -9.3) 29.1% (+5.0 / -4.6)
I love them. These are fun 5.2% (+4.4 / -2.5) 8.9% (+9.5 / -4.8) 7.2% (+5.2 / -3.1)

There aren't any conclusive trends based on income. Wealthier people appear to be more indifferent, though the sampling error on that is huge due to the small sample size.

vote People earning $0-24K (151) People earning $25-49K (670) People earning $50-74K (303) People earning $75-99K (77) People earning $100-149K (20) People earning $150K+
Hate them. A total waste of time 69.0% (+7.7 / -8.9) 62.1% (+4.4 / -4.6) 69.7% (+5.5 / -6.1) 69.7% (+9.1 / -10.9) 53.8% (+19.3 / -20.5) Insufficient data
I am indifferent 26.0% (+8.5 / -7.0) 32.6% (+4.6 / -4.3) 23.6% (+5.8 / -5.0) 26.0% (+11.1 / -8.7) 41.7% (+20.6 / -18.1) Insufficient data
I love them. These are fun 5.0% (+6.8 / -3.0) 5.3% (+4.0 / -2.4) 6.7% (+5.7 / -3.2) 4.3% (+11.8 / -3.3) 4.4% (+27.1 / -4.0) Insufficient data
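
As an aside, those plus/minus figures are mostly a function of sample size. Google's survey product reports asymmetric intervals, but even a plain normal-approximation sketch (a simplifying assumption that ignores the survey's weighting) shows why the $100-149K column is so much noisier than the $25-49K column. The shares & sample sizes below are taken from the income table above:

    import math

    def margin_of_error(p, n, z=1.96):
        """Half-width of a ~95% normal-approximation confidence interval
        for a proportion p observed in a sample of size n."""
        return z * math.sqrt(p * (1 - p) / n)

    # "I am indifferent" shares & sample sizes from the income table above
    for label, p, n in [("$100-149K", 0.417, 20), ("$25-49K", 0.326, 670)]:
        print(f"{label}: {p:.1%} +/- {margin_of_error(p, n):.1%}")
    # -> roughly +/- 21.6% at n=20 versus +/- 3.5% at n=670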

So, ultimately, Google was right that users hate excessive ads & cloaking. But the one thing users hate more than either of those is paying for content. ;)

Some of the traditional publishing businesses are dying on the vine & this is certainly a great experiment to try to generate incremental revenues.

...but...

How does Google's definition of cloaking square with the above? If publishers (or a competing ad network) do the same thing without Google, would it be considered spam?

Ad Retargeting

Consumer Search Insights.

How do you feel about companies tracking your online behavior to target ads?

Surprisingly, nearly 1 in 11 people like ad retargeting. However, over 3 in 5 people dislike it.

response All (1250) 
I dislike it because it feels creepy 62.3% (+3.1 / -3.3)
I don't care either way 29.3% (+3.1 / -2.9)
I like more relevant ads 8.3% (+2.3 / -1.9)

Women tend to think being stalked by ads is creepier than men do.

vote Men (822)  Women (428) 
I dislike it because it feels creepy 60.6% (+3.7 / -3.8) 64.1% (+5.0 / -5.3)
I don't care either way 30.0% (+3.6 / -3.4) 28.7% (+5.1 / -4.6)
I like more relevant ads 9.5% (+2.6 / -2.1) 7.2% (+4.2 / -2.7)

Younger people who are old enough to be starting families tend to be more financially stressed than most other age groups, so they are likely more appreciative of relevant ads tied to discounts & such. Younger people have also used the web for so much of their lives that they are not as creeped out by tracking & privacy issues as older people are. People in retirement also like relevant ads, perhaps in part because they are feeling the Ben "printing press gone wild but no inflation" Bernanke pinch & are seeing their fixed-income retirements collapse under artificially low interest rates tied to the money printing game.

age 18-24 year-olds (372)  25-34 year-olds (270)  35-44 year-olds (150)  45-54 year-olds (217)  55-64 year-olds (164)  65+ year-olds (77) 
I dislike it because it feels creepy 60.2% (+4.8 / -5.0) 52.3% (+6.3 / -6.4) 65.1% (+7.2 / -8.0) 66.0% (+6.1 / -6.6) 66.6% (+6.9 / -7.7) 55.7% (+11.2 / -11.8)
I don't care either way 33.6% (+4.9 / -4.6) 35.0% (+6.4 / -5.9) 25.5% (+7.6 / -6.3) 27.9% (+6.4 / -5.6) 26.9% (+7.5 / -6.3) 33.5% (+11.9 / -10.1)
I like more relevant ads 6.2% (+2.9 / -2.0) 12.7% (+5.1 / -3.8) 9.5% (+5.9 / -3.8) 6.1% (+3.9 / -2.5) 6.4% (+5.2 / -2.9) 10.7% (+9.1 / -5.2)

People from the west coast are perhaps slightly more aware of the risks of online tracking. People from the south don't seem to care either way. In the midwest the stereotype of the coupon-clipping mom shows up in the data (though the sample size is small).

vote The US Midwest (259)  The US Northeast (340)  The US South (404)  The US West (247) 
I dislike it because it feels creepy 58.5% (+6.5 / -6.9) 61.8% (+5.9 / -6.3) 61.6% (+5.7 / -6.0) 67.2% (+6.2 / -6.8)
I don't care either way 29.9% (+6.6 / -5.9) 29.1% (+5.8 / -5.2) 32.4% (+5.9 / -5.4) 24.6% (+6.7 / -5.6)
I like more relevant ads 11.6% (+5.6 / -4.0) 9.1% (+5.0 / -3.3) 6.0% (+4.6 / -2.7) 8.2% (+5.7 / -3.5)

On everything outside of disliking online tracking, the margin of error is wide enough that it is hard to spot any strong patterns based on population density.

vote Urban areas (636)  Rural areas (108)  Suburban areas (480) 
I dislike it because it feels creepy 58.9% (+5.0 / -5.1) 61.1% (+9.0 / -9.8) 62.6% (+4.5 / -4.7)
I don't care either way 32.3% (+5.1 / -4.7) 33.9% (+9.9 / -8.6) 27.6% (+4.5 / -4.1)
I like more relevant ads 8.8% (+4.4 / -3.0) 5.0% (+8.7 / -3.3) 9.8% (+3.6 / -2.7)

It is also hard to see much of a broad pattern based on income levels.

vote People earning $0-24K (150)  People earning $25-49K (691)  People earning $50-74K (304)  People earning $75-99K (88) 
I dislike it because it feels creepy 62.2% (+8.4 / -9.1) 60.2% (+4.2 / -4.4) 66.5% (+5.8 / -6.4) 55.1% (+10.2 / -10.6)
I don't care either way 30.0% (+9.2 / -7.8) 30.8% (+4.3 / -4.0) 25.9% (+6.1 / -5.3) 35.8% (+10.4 / -9.2)
I like more relevant ads 7.9% (+8.6 / -4.3) 9.0% (+3.7 / -2.7) 7.5% (+5.5 / -3.3) 9.2% (+9.0 / -4.8)

Google+ Integration

Consumer Search Insights.
As publishers we tend to be quite concerned with the over-promotion of Google+ because it carves up the search landscape, is potentially another hoop that we have to jump through, and in some cases, the Google+ hosted version of a page will outrank the legitimate original source - which screws up the economics of online publishing.

But do users care about how Google+ was integrated directly into the search results? Generally no.

How do you feel Google+ integration has impacted Google's relevancy?

Under 1 in 5 people said it made the search results better, under 1 in 5 said it made the search results worse & over 3 in 5 didn't notice any material impact.

vote All (1260) 
no noticeable impact 64.7% (+3.3 / -3.5)
made it better 17.4% (+2.9 / -2.6)
made it worse 17.9% (+3.0 / -2.7)

Men liked it slightly more than women, though that difference was within the estimated range of error. If the difference were more significant one might guesstimate that women are better at socializing offline & have less need for artificial web relationships, given their relatively larger corpus callosum. ;)

vote Men (875)  Women (385) 
no noticeable impact 64.1% (+3.4 / -3.6) 65.3% (+5.5 / -5.9)
made it better 18.7% (+3.0 / -2.6) 16.2% (+5.2 / -4.1)
made it worse 17.2% (+2.9 / -2.6) 18.5% (+5.3 / -4.4)

Older people are less likely to have loads of online friends & relationships (as they spent most of their lives building relationships in the physical world, before the web or online social networks were popular). Older people also tend to be more set in their ways. Thus many older people won't be signed up for Google+ & won't notice as much of an impact from it.

Younger people are more likely to want to try out new technology, thus they are more likely to notice an impact from it. Some generations tend to be more isolated & individualistic (like the baby boomers) while millennials tend to like to work in groups & network more (it isn't an accident that Facebook started on a college campus & targeted college students), thus younger people are not only more likely to notice something like Google+, but they are also more likely to like its impact.

vote 18-24 year-olds (334)  25-34 year-olds (322)  35-44 year-olds (141)  45-54 year-olds (204)  55-64 year-olds (167)  65+ year-olds (93) 
no noticeable impact 59.8% (+5.1 / -5.4) 64.0% (+5.4 / -5.7) 66.6% (+7.3 / -8.2) 59.3% (+6.6 / -7.0) 65.7% (+6.9 / -7.7) 73.9% (+8.1 / -10.1)
made it better 26.6% (+5.0 / -4.4) 18.8% (+5.0 / -4.1) 16.3% (+7.2 / -5.3) 19.1% (+6.2 / -4.9) 16.4% (+6.7 / -5.0) 7.9% (+8.7 / -4.3)
made it worse 13.6% (+4.1 / -3.3) 17.2% (+4.8 / -3.9) 17.1% (+7.4 / -5.5) 21.6% (+6.0 / -5.0) 17.9% (+6.5 / -5.0) 18.2% (+9.9 / -7.0)

I didn't notice any obvious trends or patterns aligned with locations across the country.

vote The US Midwest (267)  The US Northeast (360)  The US South (378)  The US West (255) 
no noticeable impact 65.5% (+6.7 / -7.3) 61.3% (+7.3 / -7.8) 67.6% (+5.6 / -6.1) 62.4% (+6.6 / -7.1)
made it better 16.2% (+6.2 / -4.7) 20.5% (+7.8 / -6.1) 17.2% (+5.0 / -4.1) 16.5% (+6.3 / -4.8)
made it worse 18.4% (+6.9 / -5.3) 18.2% (+6.3 / -4.9) 15.1% (+5.6 / -4.3) 21.1% (+6.6 / -5.3)

Suburban people were more likely to notice an impact, though they were not heavily skewed one way or the other.

vote Urban areas (669)  Rural areas (124)  Suburban areas (450) 
no noticeable impact 65.9% (+4.1 / -4.4) 66.8% (+9.0 / -10.4) 62.0% (+4.7 / -5.0)
made it better 16.4% (+3.7 / -3.1) 14.3% (+8.5 / -5.7) 20.4% (+4.4 / -3.8)
made it worse 17.6% (+3.9 / -3.3) 18.9% (+9.8 / -7.0) 17.6% (+4.2 / -3.6)

People who earned less were less likely to notice positive or negative impact from Google+ integration (somewhat surprising since younger people tend to skew toward lower incomes & younger people were more likely to notice & like Google+ integration). Outside of that, the data is too bunched up to see any other significant patterns based on income.

vote People earning $0-24K (162)  People earning $25-49K (698)  People earning $50-74K (312)  People earning $75-99K (71) 
no noticeable impact 71.1% (+7.8 / -9.2) 62.8% (+4.4 / -4.6) 61.9% (+6.3 / -6.8) 61.3% (+10.6 / -11.9)
made it better 14.8% (+8.8 / -5.9) 17.5% (+4.0 / -3.4) 18.9% (+5.9 / -4.8) 17.1% (+11.5 / -7.5)
made it worse 14.1% (+9.5 / -6.1) 19.7% (+4.3 / -3.7) 19.2% (+6.4 / -5.1) 21.6% (+11.2 / -8.1)

Editorial Objectivity

Consumer Search Insights.

Should search engines be able to preferentially promote their own services in their search results?

Nearly 3 in 4 people think that search engines should not be able to preferentially promote their own services.

vote All (1226)
no, results should be objective 74.1% (+3.1 / -3.4)
yes, it is their search results 25.9% (+3.4 / -3.1)

There was essentially no split between men & women.

vote Men (827) Women (399)
no, results should be objective 73.7% (+3.1 / -3.4) 74.4% (+5.2 / -6.0)
yes, it is their search results 26.3% (+3.4 / -3.1) 25.6% (+6.0 / -5.2)

Older people tend to prefer more editorial objectivity, whereas younger people are more comfortable with search engines preferentially promoting their own services. Older people tend to be more fixed in their ways & younger people are much less so.

vote 18-24 year-olds (338) 25-34 year-olds (269) 35-44 year-olds (158) 45-54 year-olds (209) 55-64 year-olds (169) 65+ year-olds (83)
no, results should be objective 65.0% (+4.9 / -5.2) 76.0% (+5.1 / -6.0) 74.0% (+6.5 / -7.7) 71.2% (+5.7 / -6.5) 71.4% (+6.5 / -7.5) 87.2% (+6.1 / -10.4)
yes, it is their search results 35.0% (+5.2 / -4.9) 24.0% (+6.0 / -5.1) 26.0% (+7.7 / -6.5) 28.8% (+6.5 / -5.7) 28.6% (+7.5 / -6.5) 12.8% (+10.4 / -6.1)

Geographically, people in the south & midwest tend to be slightly more trusting, perhaps due to the lower cost of living & less competitive markets. However, any differences here are fairly minor & are within the margin of error.

vote The US Midwest (244) The US Northeast (367) The US South (352) The US West (263)
no, results should be objective 72.2% (+6.4 / -7.4) 77.7% (+4.5 / -5.3) 72.1% (+6.0 / -6.9) 75.9% (+5.7 / -6.9)
yes, it is their search results 27.8% (+7.4 / -6.4) 22.3% (+5.3 / -4.5) 27.9% (+6.9 / -6.0) 24.1% (+6.9 / -5.7)

People who are rural tend to be slightly more accepting of Google doing as it wishes, though this is also a small sample size & well within the margin of error.

vote Urban areas (647) Rural areas (106) Suburban areas (453)
no, results should be objective 74.3% (+4.3 / -4.9) 71.9% (+8.5 / -10.5) 74.4% (+4.2 / -4.7)
yes, it is their search results 25.7% (+4.9 / -4.3) 28.1% (+10.5 / -8.5) 25.6% (+4.7 / -4.2)

There isn't a strong correlation with income on this issue either. People cared a bit more at higher income levels, but there was also a wider margin of error due to the small sample size.

vote People earning $0-24K (142) People earning $25-49K (677) People earning $50-74K (316) People earning $75-99K (75) People earning $100-149K People earning $150K+
no, results should be objective 72.0% (+7.8 / -9.4) 76.8% (+3.7 / -4.1) 68.7% (+6.1 / -6.8) 83.1% (+6.9 / -10.2) Insufficient data Insufficient data
yes, it is their search results 28.0% (+9.4 / -7.8) 23.2% (+4.1 / -3.7) 31.3% (+6.8 / -6.1) 16.9% (+10.2 / -6.9) Insufficient data Insufficient data

Jim Boykin Interview

Internet Marketing Ninja Jim Boykin has promoted link building since before I even knew what SEO was. Nearly a decade later so many things have changed in SEO (including renaming We Build Pages to Internet Marketing Ninjas), but he still sees links as a key SEO driver (as do I). I recently interviewed him about links & the changing face of SEO.

So, links, links, links... these were the backbone of ranking in Google for years and years. Are they still? Is social a huge signal, or something that has been over-hyped?

Yes, I do see backlinks as the backbone of rankings in Google. Every day I see sites that trump the rankings with links and no social signals...but I've never seen a site that had "poor" backlinks compared to others, but a strong social signal, be ranked great.

There are other signals that I feel are more important than social, like content and user behavior, but after those I'd put social signals. Even though I don't think they're more important than links by any stretch, I do feel that social has a place, in areas like branding, community building, and in assisting organic search results. I always recommend that people have a strong social presence, even if only for sending additional signals to Google to assist in higher rankings.

Google recently nailed a bunch of lower quality bulk link networks. Were you surprised these lasted as long as they did? Was the fact that they worked at all an indication of the sustained importance of links?

Well...surprised...no... filtering out networks is something that's always going to happen....once something gets too big, or too popular, or too talked about...then it's in danger of being burned... the popular "short cuts" of today are the popular penalized networks of tomorrow... there will always be someone who will create a network (of other sites they control, or their new friends control, or of near-expired domains, or blogger groups, etc. etc.) and that someone will start selling links, and advertising, and it will catch on, and they will sell to everyone and it will become so interconnected that it will cause its own algorithmic penalty, or it will get popular, and get the eyes of Google on it, and then it will get filtered, or there will be exact match penalties, or entire site penalties.

If that's the game you play, just understand the risks...or don't play that game and give other reasons for people to link to you, and get permanent non-paid links, but that takes a lot of time and effort and marketing. That's the price you have to pay...because, yes, ranking in Google still comes down to #1: links.

After such networks get hit, how hard is it for such sites to recover? Does it create a "flight to quality" impact on link building? Are many of them better off starting from scratch rather than trying to recover the sites?

We've worked with several people who have come to us after being penalized by Google to some degree (either a phrase-based penalty or an entire-site penalty). Probably the low-budget people who got hit just started other sites and tossed their penalized site, but most of the people who come to us can't afford to toss their branded site away.

In almost all of those cases it takes someone removing all the paid and un-natural links that they can. They must understand then that their days of buying links are Over, and they Must create great things on their site that get natural links....and they must forever give up the chase of being #1 for the big short tail phrases...unless you own the exact .com, or your brand name includes that phrase...In order to recover, they must purge the backlinks of the paid links and the networks, do a reinclusion request, and then start doing "natural things", and then wait and wait and wait...90 days is typical...it's the window Google gave to themselves after you pointed out that Google themselves were buying blog links.

Over time it has become easier to hit various trip wires when link building. You mentioned some penalties being phrase-based or entire-site & so on...how does a person determine the difference between these? Some of Google's automated penalties and manual penalties have quite similar footprints; are there easy ways to tell which is which?

A phrase-based penalty works like this...let's say you've been targeting "green widgets" and "red widgets" for years...you have lots of backlinks with those exact anchor texts....and you were in the top 10 for both phrases....then one day, you rank somewhere on page 3 or deeper for those phrases.. you may still rank #3 for "cheap red widgets" or #7 for "widgets green" (reversed phrases)...but for the few exact phrases...it's page 3+ of the SERPs for you....nothing else changes, just those exact phrases.. on the other hand, a sitewide penalty is where pretty much nothing ranks on page 1 or page 2 in the SERPs, when the prior day you had lots of keywords ranking in there. I have no way of knowing which were automatic and which were hand done....sometimes I have a feeling in my gut...but it doesn't really matter...the solution is always the same...clean up the backlinks, and change your methods.

Earlier you mentioned forgoing the head phrase; in spite of things like Google Instant guiding searchers down a path, is there still plenty of tail to be had? Are tail keywords significantly under-rated by the market? How does one square going after tail keywords with algorithms like Panda?

I'm a big believer in the long tail. When we analyze content on a site we tend to grab ranking data from ahrefs for the client, as well as for several of their competitors, and we end up merging all the phrases and showing the search volume and the average cost per click for each phrase...we can always find a huge long tail; even if the client's site currently doesn't have that content (they have to add new original content), there is always a huge long tail to be had.

In 98% of the cases, there are no one or two or three main phrases that account for more than 2% of the total potential search traffic. Even with a site's existing content and existing traffic, the short tail tends not to be more than 5% of traffic for any sites I've been seeing. We often find that a site may have 5,000 pages, but only 500 pages of that site are of value via ranking for anything that has a decent search volume and a decent worth in CPC value in Google. If you look at those 500 URLs, and you optimize each URL for say 5 phrases on average, then you're looking at 2,500 phrases...of those, 50 phrases might be the short tail, and 2,450 I would consider the middle tail. If you also add words like "shop", "store", "online", "sale", "cheap", "discount", etc. to all those pages, you'll pick up tons more phrases. And from there, the more original content you can add, the more long tail you can get.... but..be careful...no one wants a site to be hit by a Google Panda update...make sure the content is original, of value, and that it's of use to the viewers of the page.
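
To make the arithmetic above a bit more concrete, here is a toy sketch of that kind of middle-tail expansion. The phrases, modifiers & counts are hypothetical; Jim's team presumably works from ahrefs exports & internal tooling rather than a script like this:

    # Toy illustration of the middle-tail math described above:
    # ~500 worthwhile URLs x ~5 target phrases each, plus commercial modifiers.
    base_phrases = ["red widgets", "green widgets", "blue widgets"]       # hypothetical targets
    modifiers = ["shop", "store", "online", "sale", "cheap", "discount"]  # modifiers mentioned above

    expanded = set(base_phrases)
    for phrase in base_phrases:
        for mod in modifiers:
            expanded.add(f"{mod} {phrase}")
            expanded.add(f"{phrase} {mod}")

    urls_of_value = 500
    phrases_per_url = 5
    print(f"core targets: {urls_of_value * phrases_per_url} phrases")  # the 2,500 in the example above
    print(f"{len(base_phrases)} seed phrases expand to {len(expanded)} variants with modifiers")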

When going after head or tail keywords...with one or the other do you feel that link quality is more important than link quantity?

Link quality always trumps. Otherwise, I'd buy those 10,000 backlinks for $100 packages that I see in Google AdWords... and my job would be a lot easier :)

With Google it is getting easier to hit tripwires with anchor text or building links too fast, does this also play into the bias toward quality & away from quantity?

I think it is easier to hit tripwires...but it's nice that Google sent out 700,000 "be careful" emails a few weeks ago... those were automatic....I think the "over optimization update" that Google has been speaking of will trip a lot of wires and people will have to mimic the natural web more and not focus on exact short tail phrases.

Those scammy AdWords ads promising link riches for nothing in part shape the perception of the cost & value of links. How do you get prospective clients to see & appreciate the value of higher quality links (while in some cases some of them will be competing with some of the bulk stuff that ranks today & is gone tomorrow)?

Well, luckily I'm not in sales calls anymore so I don't have to do the convincing :) but I'd say that if you can get links that you just can't buy (i.e., a link from NASA.gov or harvard.edu/library/) then they're priceless. Each update Google will filter out some of the links from sites that it feels are artificial. If you can build things that stick and stand the test of time, and if you don't need to be #1 tomorrow, and are willing to invest in the site's content and the site's future, then think long tail and long term. If you're all about today, then do what you have to do today, but those cheap links won't move you much anyway & you'll just have a spammy backlink profile.

Building quality links that last isn't particularly cheap or easy. Even harder to do it in volume. What has allowed/enabled you to succeed where so many others have failed on this front? Is it that you care more about the client's well being, or is it that you have to tie together a bunch of clever bits to make it all back out?

Well, I have an army here....nearly 100 ninjas..the biggest group is the link builders, so I have a lot of link ninjas. We also have a lot of tools...tools that suggest the things we should write that have the highest probability of getting trusted backlinks. We have a content team that knows how to write to get links from professors and orgs and government agencies, etc.

We have tools that help us to know who to write to after we've written the content..and we have tools that help us send out a lot of personal emails...between the tools and the people and the content, we manage to make it work. If we had to do all the work by hand, and by human guesses, it would never work, but with the tools (and human intervention along the way), we're able to get the links and scale it, while keeping the high quality.

When you talk about getting quality links that are priceless, those have that sort of value precisely because they are so hard to get. How big of a role does content play in the process? Is this something anyone can do?

Content is Key to getting links. There's different types of links....there's the low hanging fruit..then there's the fruit that's way on the top of the tree....the things that tend to be harder to get also tend to be the most valued and the most trusted. If I wrote to a college professor at Harvard and said, "Hey, Professor Bob, I just wrote a great paper on "The History Of Widgets", you should add it to your article in the Harvard library" then if the article isn't Great, they'll never link to it. It starts with a great idea that morphs into great content, and then we promote it to those we're targeting. Anyone can write this content, guess at what a gov page or a college professor would link to..see what they currently link out to...write them an email that's been personalized...and with enough emails, you can get the links if your content is good enough. It's a long slow process, but anyone can do it. Thank goodness I have tools that make that process much easier and more accurate at getting links.

You mentioned thinking long term, how long does it usually take to start seeing results from quality link building? Do you ever work on new sites, or do you mostly try to work on older websites that tend to respond quicker? Also have you noticed newer sites being able to rank much quicker if they do a quality-first approach to link building?

With getting the trusted links we tend to see an increase in traffic during the first 3 months. I do the 3 month review phone calls here, and my goal is to show them the ROI via overall rankings increase of the long tail, and an increase in google's organic traffic. Sites tend to see much better increases in these if they also follow our internal linking strategies, and our on page optimization strategies. If someone does link building, on page optimization, and internal linking, after 3 months there's almost no way someone can not increase the traffic to their site.

----

Thanks Jim!

Jim Boykin is the founder and CEO of Internet Marketing Ninjas (formerly We Build Pages, since 1999). Jim's team of marketing ninjas offers a full range of internet marketing services including link building services and social media branding, and they employ an in-house team of website designers. Follow Jim and the Ninjas on their blog, Facebook, Google, Twitter, Foursquare, and LinkedIn.

Patience is a Virtue

Sorry I haven't blogged as much lately, but one of our employees recently had a child, and Google sending out so many warning messages in Webmaster Central has created a ton of demand for independent SEO advice. Our growth in demand last month was higher than in any month outside of the period a few years ago when we announced we would be raising prices & got so many new subscribers that I had to close down the ability to sign up for about 3 or 4 months.

Google has been firing on all cylinders this year. They did have a few snafus in the press, but those didn't have any noticeable impact on user perception or behavior, & Google recently rolled out yet another billion-dollar business in their consumer surveys.

Google is doing an excellent job of adding friction to SEO & managing its perception to make it appear less stable, less trustworthy and to discourage investment in SEO. They send out warnings for unnatural links, warnings for traffic drops, and even warnings for traffic increases.

Webmaster Tools is a bit of a strange bird...

  • Any SEO consultant who has client sites tied into their Webmaster Tools account makes it easy for Google to connect those sites together (making any black swan editorial decision far riskier).
  • Any SEO company which has clients sign up for their own Webmaster Tools account now has to deal with explaining why things change, when many of the changes that happen are more driven by algorithmic shifts (adding local results to the SERPs or taking them away, other forms of localization, changing of ad placement on the SERP, etc.) than by the work of the SEO. This in turn adds costs to managing SEO projects while also making them seem less stable (even outside of those who were using paid link networks). Think through the sequence...
    • Google first sends a warning for traffic going up, and the SEO tells the client that this is because they did such a great job with SEO.
    • Then Google sends a warning for traffic dropping & the client worries that something is wrong.
    • The net impact on actual traffic or conversions could be a 0, but the warnings amplify the perception of changes.
  • Any SEO who doesn't use Webmaster Tools loses search referral data. It first started with logged in Google users, but apparently it is also headed to Firefox. Who's to say Google Chrome & Safari won't follow Firefox at some point?

Google has changed & obfuscated so many things that it is very hard to isolate cause and effect. They have made changes to how much data you get, changes to their analytics interface & how they report unique visitors, changes to how tightly they filter certain link behaviors, they have rolled in frequent Panda updates, and they have nailed a number of the paid link networks.

BuildMyRank shut down after leaving a self-destructive footprint that made it easy for Google to nuke their network, and some of the remaining paid link networks are getting nailed. Some of their customers are at this point driven primarily by fear, counting down their remaining days as the sky is falling. Fear is an important emotion designed to protect us, but when it is a primary driver we risk self-destruction.

The big winners in these moves by Google are:

  • Google, since they grant themselves more editorial leeway. If everyone is a scofflaw then they can hit just about anyone they want. And the organic search results are going to be far easier to police if many market participants are held back by a fear tax.
  • Larger businesses which are harder to justify hitting & which can buy out smaller businesses at lower multiples based on the perception of fear.
  • Sites which were outranked by people using the obvious paid links, which now rank a bit better after some of those paid link buyers were removed from the search results.
  • SEOs who out others & market themselves by using polarizing commentary (at least in the short run, whereas in the long run that may backfire).
  • Those engaging in negative SEO, who sell services to smoke competitors.

The big losers from these Google moves are:

  • some of the paid link networks & those who used them for years
  • under-priced SEO service providers who were only able to make the model work by scaling up on risk
  • smaller businesses who are not particularly spammy, but are so paralyzed by fear that they won't put in enough effort & investment to compete in the marketplace

One of the reasons I haven't advocated using the paid link networks is I was afraid of putting the associated keywords into a hopper of automated competition that I would then have to compete against year after year. Even if you usually win, over the course of years you can still lose a lot of money by promoting the creation of disposable, automated & scalable competing sites. If you don't mind projects getting hit & starting over the ROI on such efforts might work out, but after so many years in the industry the idea of starting over again and again as sites get hit is less appealing.

It is not just that the links are not trusted, but now they stand a far greater chance of causing penalties:

Dear site owner or webmaster of ….

We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.

Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.

We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.

If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.

If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

Sincerely,

Google Search Quality Team

If that doesn't change then negative SEO will become a bigger issue than paid links ever were.

What is hard about Google penalizing websites for such links is that it is cheap & easy for someone else to set you up. Shortly after Dan Thies mentioned that it was "about time" to Matt Cutts on Twitter someone started throwing some of the splog links at his site. It is safe to say that Dan didn't build those links, but there are many people who will be in the same situation as Dan who did nothing wrong but had a competitor set them up.

And there is no easy way to disconnect your site from those types of links.

If you go back a few years, it was quite easy to win at SEO by doing it in a "paint by number" fashion. One rarely got hit unless they were exceptionally excessive and stuck out like a sore thumb.

But after all of Google's recent moves, a few missed steps in a drunken stupor can have the same result.

Now more than ever, patience is a virtue!

Google Instant Answers: Rich Snippets & Poor Webmasters

This is a pretty powerful & instructive image in terms of "where search is headed."

It's a Yahoo! Directory page that was ranking in the Google search results on a Google Android mobile device.

Note the following:

  • the page is hosted on Google.com
  • the page disclaims that it is not endorsed by Google
  • the page embeds a Google search box
  • the page strips out the Yahoo! Directory search box
  • the page strips out the Yahoo! Directory PPC ads (on the categories which have them)
  • the page strips out the Yahoo! Directory logo

Recall that when Google ran their bogus sting operation on Bing, Google engineers suggested that Bing was not above board for using user clickstreams to potentially influence their search results. That level of outrage & the smear PR campaign look ridiculous when compared against Google's behavior toward the Yahoo! Directory, which is orders of magnitude worse:

Bing (vs Google) compared with Google (vs the Yahoo! Directory):

  • Editorial: Bing uses user experience across a wide range of search engines to potentially impact a limited number of search queries in a minor way. Google shags expensive hand-created editorial content wholesale & hosts it on Google.com.
  • Hosting: Bing hosts Bing search results using Bing snippets. Google hosts Yahoo! Directory results using Yahoo! Directory listing content & keeps all the user data.
  • Attribution: Bing publicly claimed for years to be using a user-driven search signal based on query streams. Google removes the Yahoo! Directory logo when it formats the page. (Does Google remove the Google logo from Google.com when formatting for mobile? Nope.)
  • Ads: Bing sells their own ads & is not scraping Google content wholesale. Google scrapes Yahoo! Directory content wholesale & strips out the sidebar CPC ads.
  • Search box: Bing puts their own search box on their own website. Google puts their own search box on top of the Yahoo! Directory's content.
  • User behavior: Google claimed that Bing was using "their data" when tracking end user behavior. Meanwhile Google hosts the Yahoo! Directory page, allowing themselves to fully track user behavior, while robbing Yahoo! of the opportunity to even see how users interact with their own listings.

In the above case the publisher absorbs 100% of the editorial cost & Google absorbs nearly 100% of the benefit (while disclaiming they do not endorse the page they host, wrap in their own search ad, and track user behavior on).

As we move into a search market where the search engines give you a slightly larger listing for marking up your pages with rich snippets, you will see a short term 10% or 20% lift in traffic followed by a 50% or more decline when Google enters your market with "instant answers."

The ads remain up top & the organic results get pushed down. It isn't scraping if they get 10 or 20 competitors to do it & then use the aggregate data to launch a competing service ... talk to the bankrupt Yellow Pages companies & ask them how Google has helped to build their businesses.

Update: it looks like this has been around for a while...though when I spoke to numerous friends nobody had ever seen it before. The only reason I came across it was seeing a referrer through a new page type from Google & not knowing what the heck it was. Clearly this search option doesn't get much traffic, because Google even removes their own ads from their own search results. I am glad to know this isn't something that is widespread, though I am still surprised it exists at all given that it effectively removes monetization from the publisher & takes the content wholesale and re-publishes it across domain names.

Interview with Jonah Stein

I was recently chatting with Jonah Stein about Panda & we decided it probably made sense to do a full on interview.

You mentioned that you had a couple customers that were hit by Panda. What sort of impact did that have on those websites?

Both of these sites saw an immediate hit of about 35% of Google traffic. Rankings dropped 3-7 spots. The traffic hit was across the board, especially in the case of GreatSchools, which saw all content types hit (school profile pages, editorial content, UGC).

GreatSchools was hit in the 4/9 (Panda 2.0) update and called out in the Sistrix analysis.

How hard has GreatSchools been hit? Sistrix data suggested that GreatSchools was losing about 56% of Google traffic. The real answer is that organic Google-referred traffic to the site fell 30% on April 11 (week over week) and overall site entries are down 16%. Total page views are down 13%. The penalty, of course, is a "site wide" penalty, but not all entry page types are being affected equally.

Google suggested that there were perhaps some false positives but that they were generally pretty satisfied with the algorithms. For sites that were hit, how did clients respond to their SEOs? I mean, did the SEO get a lot of the blame, or did the clients get that the change was sort of a massive black swan?

I think I actually took it harder than they did. Sure, it hit their bottom line pretty hard, but it hit my ego. Getting paid is important but the real rush for me is ranking #1.

Fortunately none of my clients think they are inherently entitled to Google traffic, so I didn't get blamed. They were happy that I was on top of it (telling them before they noticed) and primarily wanted to know what Panda was about.

Once you get over the initial shock and the grieving, responding to Panda was a Rorschach test; everyone saw something different. But it is also an interesting self-reflection, especially when the initial advice coming from Greg Boser and a few others was to start to de-index content.

For clients who are not ad driven, the other interesting aspect is that generally speaking conversions were not hurt as much as traffic, so once you start focusing on the bottom line you discover the pain is a little less severe than it seemed initially.

So you mentioned that not all pages were impacted equally. I think pages where there was more competition were generally hit harder than pages that had less competition. Is that sort of inline with what you saw?

Originally I thought that was maybe the case, but as I looked at the data during the recovery process I became convinced that Panda is really the public face of a much deeper switch towards user engagement. While the Panda score is sitewide, the engagement "penalty" or weighting effect also occurs at the individual page level. The pages or content areas that were hurt less by Panda seem to be the ones that were not also being hurt by the engagement issue.

For one of my clients we moved a couple of sections to sub-domains, following the HubPages example and the experience of some members of your community. The interesting thing is that we moved the blog from /blog to blog.domain.com and we moved one vertical niche from /vertical-kw to vertical-kw.example.com. The vertical almost immediately recovered to pre-Panda levels while the traffic to the blog stayed flat.

So the vertical was suddenly getting 2x the traffic. On the next panda push the vertical dropped 20% but that was still a huge improvement over before we moved to the subdomain. The blog didn't budge.

The primary domain also seemed to improve some, but it was hard to isolate that from the impact of all of the other changes, improvements and content consolidation we were doing.

After the next panda data push did not kill the vertical sub domain, we elected to move a second one. On the next data push, everything recovered - a clean bill of health - no pandalization at all.

but....

GreatSchools completely recovered the same day and that was November 11th, so Panda 3.0. I cannot isolate the impact of any particular change versus Google tweaking the algorithm and I think both sites were potentially edge cases for Panda anyway.

Now that we are in 3.3 or whatever the numbering calls it, I can say with confidence that moving "bad" content to a sub-domain carries the Panda score with it and you won't get any significant recovery.

You mentioned Greg Boser suggesting deindexing & doing some consolidation. Outside of canonicalization, did you test doing massive deindexing (or were subdomains your main means of testing isolation)?

We definitely collapsed a lot of content, mostly 301s, but maybe 25% of it was just de-indexing. That was the first response. We took 1,150 category/keyword focused landing pages and reduced them to maybe 300. We did see some gains but nothing that resembled the huge boost when Panda was lifted.

Back to the Rorschach test: we did a lot of improvements that yielded incremental gains but were still weighed down. It reminds me of when I used to work on cars. I had this old Audi 100 that was running poorly so I did a complete tune up, new wires, plugs, etc., but it was still running badly. Then I noticed the jet in the carburetor was misaligned. As soon as I fixed that, boom, the car was running great. Everything else we fixed may have been the right thing to do for SEO and/or users, but it didn't solve the problem we were experiencing.

The other interesting thing is that I had a 3rd client who appeared to get hit by Panda or at least suffer from Panda like symptoms after their host went down for about 9 hours. Rankings tanked across the board, traffic down 50% for 10 days. They fully recovered on the next panda push. My theory is that this outage pushed their engagement metrics over the edge somehow. Of course, it may not have really been Panda at all but the ranking reports and traffic drops felt like Panda. The timing was after November 11th, so it was a more recent version of the Panda infrastructure.

Panda 1.0 was clearly a rush job and 2.0 seemed to be a response to the issues it created and the fact that Demand Media got a free pass. I think it took 6-8 months for them to really get the infrastructure robust.

My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.

I don't know if it was intentional or not, but engagement as a relevancy factor winds up punishing sites that have built links and traffic through link bait and infographics, because by definition those users have a very high bounce rate and a relatively low time on site. Look at the behavioral metrics in GA; if your content has 50% of people spending less than 10 seconds, that may be a problem or that may be normal. The key is to look below that top graph and see if you have a bell curve or if the next largest segment is the 11-30 second crowd.

I also think Panda is rewarding sites that have a diversified traffic stream. The higher percentage of your users who are coming direct or searching for you by name (brand) or visiting you from social the more likely Google is to see your content as high quality. Think of this from the engine's point of view instead of the site owner. Algorithmic relevancy was enough until we all learned to game that, then came links as a vote of trust. While everyone was looking at social and talking about likes as the new links they jumped ahead to the big data solution and baked an algorithm that tries to measure interaction of users as a whole with your site. The more time people spend on your site, the more ways they find it aside from organic search, the more they search for you by name, the more Google is confident you are a good site.

Based on that, are there some sites that you think have absolutely no chance of recovery? In some cases did getting hit by Panda cause sites to show even worse user metrics? (There was a guy named walkman on WebmasterWorld who suggested that some sites that had "size 13 shoe out of stock" pages might no longer rank for the head keywords but would rank for the "size 13" related queries.)

I certainly think that if you have an IYP and you have been hit with Panda, you're toast unless you find a way to get huge amounts of fresh content (Yelp). I don't think the size 13 shoe site has a chance, but it is not about Panda. Google is about to roll out lots of semantic search changes and the only way ecommerce sites (outside of the 10 or so brands that dominate Google Products) will have a chance is with schema.org markup and Google's next generation search. The truth is the results for a search for shoes by size is a miserable experience at the moment. I wear size 16 EEEE, so I have a certain amount of expertise on this topic. :)
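
For readers who haven't looked at it, schema.org product markup is just structured data embedded in the page. A minimal JSON-LD sketch for the size-13-shoe example might look something like the following; the values are hypothetical, and Google's supported properties & preferred formats change over time, so treat this as illustrative rather than a recipe:

    import json

    # Hypothetical product data for a size 13 shoe listing page
    product = {
        "@context": "http://schema.org",
        "@type": "Product",
        "name": "Example Running Shoe, Size 13",
        "sku": "EX-RUN-13",
        "offers": {
            "@type": "Offer",
            "price": "79.99",
            "priceCurrency": "USD",
            "availability": "http://schema.org/InStock",
        },
    }

    # The JSON-LD block a page template would embed in the product page
    print('<script type="application/ld+json">')
    print(json.dumps(product, indent=2))
    print("</script>")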

Do you see Schema as a real chance for small players? Or something that is a short term carrot before they get beaten with the stick? I look at hotel search results like the one below & I fear that spreading as more people format their content in a way that is easy to scrape & displace. (For illustration purposes, in the below image, the areas in red are clicks that Google is paid for or clicks to fraternal Google pages.)

I doubt small players will be able to use Schema as a lifeline but it may keep you in the game long enough to transition into being a brand. The reason I have taken your advice about brands to heart and preach it to my clients is that it is short sighted to believe that any of the SEO niche strategies are going to survive if they are not supported with PR, social, PPC and display.

More importantly, however, is that they are going to focus on meeting the needs of the user as opposed to simply converting them during that visit. To use a baseball analogy, we have spent 15 years keeping score of home runs while the companies that are winning the game have been tracking walks, singles, doubles and outs. Schema may deliver some short term opportunities for traffic but I don't think size13shoes.com will be saved by the magic of semantic markup.

On the other hand, if I were running an ecommerce store, particularly if I was competing with Amazon, Best Buy, Walmart and the handful of giant brands that dominate the product listings in the SERP, I wouldn't bury my head in the sand and pretend that everyone else wasn't moving in that direction anyway. Maybe if you can do it right you can emerge as a winner, at least over the short and medium term.

In that sense SEO is a moving target, where "best practices" depend on the timing in the marketplace, the site you are applying the strategy to, and the cost of implementation.

Absolutely...but that is only half the story. If you are an entrepreneur who likes to build sites based on a monetization strategy, then it is a moving target where you always have to keep your eyes on the horizon. For most of my clients the name of the game is actually to focus on trying to own your keyword space and take advantage of inertia. That is to say that if you understand the keywords you want to target, develop a strategy for them and then go out and be a solid brand, you will eventually win. Most of my clients rank in the top couple of spots for the key terms for their industry with a fairly conservative slow and steady strategy, but I wouldn't accept a new client who comes to me and says they want to rank a new site #1 for credit cards or debt consolidation and they have $200,000 to spend...or even $2,000,000. We may be able to get there for the short term, but not with strategies that will stand the test of time.

Of course, as I illustrated with the Nuts.com example on SearchEngineLand last month, the same strategy that works on a 14 year old domain may not be as effective for a newer site, even if you 301 that old domain. SEO is an art, not a science. As practitioners we need to constantly be following the latest developments but the real skill is in knowing when to apply them and how much; even then occasionally the results are surprising, disappointing or both.

I think there is a bit of a chicken vs egg problem there then if a company can't access a strong SEO without already having both significant capital & a bit of traction in the marketplace. As Google keeps making SEO more complex & more expensive do you think that will drive a lot of small players out of the market?

I think it has already happened. It isn't about the inability to access a strong SEO; it is that anyone with integrity is going to lay out the obstacles they face. Time and time again we see opportunity for creativity to triumph, but the odds are really stacked against you if you are an underfunded retailer.

Just last year I helped a client with 450 domains who had been hit with Panda and then with a landing page penalty. It took a few months to sort out and get the reconsideration granted (by instituting cross-domain rel=canonical and eliminating all the duplicate content across their network). They are gradually recovering to maybe 80% of where they were before Panda 2.0, but I can't provide them an organic link building strategy that will lift 450 niche ecommerce sites. I can't tell them how they are going to get any placement in a shrinking organic SERP dominated by Google's dogfood, shopping results from big box retailers and enormous AdWords Product Listings with images.
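
For anyone unfamiliar with the mechanics Jonah mentions, a cross-domain rel=canonical is just a link element on the duplicate page pointing at the copy you want Google to credit. A minimal sketch with hypothetical domains & a hand-made mapping (a real network of 450 sites would presumably drive this from the product catalog):

    # Hypothetical mapping from duplicate pages on niche domains to the canonical copy
    canonical_map = {
        "http://niche-widgets-1.example.com/blue-widget": "http://main-store.example.com/blue-widget",
        "http://niche-widgets-2.example.com/blue-widget": "http://main-store.example.com/blue-widget",
    }

    def canonical_tag(url: str) -> str:
        """Return the <link rel="canonical"> element for the <head> of a duplicate page,
        or an empty string if the page has no canonical mapping."""
        target = canonical_map.get(url)
        return f'<link rel="canonical" href="{target}" />' if target else ""

    for dup in canonical_map:
        print(dup, "->", canonical_tag(dup))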

From that perspective, if your funding is limited, do you think you are better off attacking a market from an editorial perspective & bolting on commerce after you build momentum (rather than starting with ecommerce and then trying to bolt on editorial)?

Absolutely. Clearly the path is to have built Pinterest, but seriously...

If you are passionate about something or have a disruptive idea you will succeed (or maybe fail), but if you think you can copy what others are doing and carve out a niche based on exploits, I disagree. Of course, autoinsurancequoteeasy.com seems to be saying you can still make a ton of money in the quick flip world, but even with a big bankroll you need to be disruptive or innovative.

On the other hand, if you have some success in your niche you can use creativity to grow, but it has to be something new. Widget bait launched @oatmeal's online dating site but it is more likely to bury you now than help you rank #1, or at least prevent you from ranking on the matching anchor text.

When a company starts off small & editorially focused, how do you know when it is time to scale up on monetization? Like if I had a successful 200-page site & wanted to add a 20,000-page database to it...would you advise against that, or how would you suggest doing that in a post-Panda world?

This is a tough call. I actually have a client in exactly this position. I guess it depends on the nature of the 20,000 pages. If you are running a niche directory (like my client), my advice to them was to add the pages to the site but noindex the individual listings until they can get some unique content. This is still likely to run afoul of the engagement issue presented by Panda, so we kept the expanded pages on geo-oriented sub-domains.

Earlier you mentioned that Panda challenged some of your assumptions. Could you describe how it changed your views on search?

I always tell prospects that 10-15 years ago my job was to trick search engines into delivering traffic but over the last 5-6 years it has evolved and now my job is to trick clients into developing content that users want. Panda just changed the definition of "good content" from relevant, well linked content to relevant, well linked, sticky content.

It has also made me more of a believer in diversifying traffic.

Last year Google made a huge stink about MSN "stealing" results because they were sniffing traffic streams and crawling queries on Google. The truth is that Google has so many data sources and so many signals to analyze that they don't need to crawl Facebook or index links on Twitter. They know where traffic is coming from and where it is going, and if you are getting traffic from social, they know it.

As Google folds more data into their mix do you worry that SEO will one day become too complex to analyze (or move the needle)? Would that push SEOs to mostly work in house at bigger companies, or would being an SEO become more akin to being a public relations & media relations expert?

I think it may already be too complex to analyze in the sense that it is almost impossible to get repeatable results for every client or tell them how much traffic they are going to achieve. On the other hand, moving the needle is still reasonably easy, as long as everyone is in agreement about what direction we are going. SEO for me is about Website Optimization, about asking everyone about the search intent of the query that brings visitors to the site and making sure we have actions that match this intent. Most of my engagements wind up being a combination of technical SEO/problem solving, analytics, strategy and company-wide or at least team-wide education. All of these elements are driven by keyword research and are geared towards delivering traffic, so it is an SEO-based methodology, but the requirements for the job have morphed.

As for moving in house, I have been there and I doubt I will ever go back. Likewise, I am not really a PR or media relations expert but if the client doesn't have those skills in house I strongly suggest they invest in getting them.

Ironically, many companies still fail to get the basics right. They don't empower their team, they don't leverage their real world relationships and most importantly they don't invest enough in developing high quality content. Writing sales copy is not something you should outsource to college students!

It still amazes me how hard it is to get content from clients and how often this task is delegated to whoever is at the bottom of the org chart. Changing a few words on a page can pay huge dividends but the highest paid people in the room are rarely involved enough.

In the enterprise, SEO success is largely driven by getting everyone on board. Being a successful SEO consultant (as opposed to running your own sites) is actually one quarter about being a subject matter expert on everything related to Google, one quarter about social, PR, link building, conversion, etc. and half about being a project manager. You need to get buy-in from all the stakeholders, strive to educate the whole team and hit deliverables.

Given the increased complexity of SEO (in needing to understand user intent, fixing a variety of symptoms to dig to the core of a problem, understanding web analytics data, faster algorithm changes, etc.) is there still a sweet spot for independent consultants who do not want to get bogged down by those who won't fully take on their advice? And what are some of your best strategies for building buy in from various stakeholders at larger companies?

The key is to charge enough and to work on a monthly retainer instead of hourly. This sounds flippant but the bottom line is to balance how many engagements you can manage at one time versus how much you want to earn every month. You can't do justice to the needs of a client and bill hourly. That creates an artificial barrier between you and their team. All of my clients know I am always available to answer any SEO related question from anyone on the team at almost any time.

The increased complexity is really job security. Most of my clients are long term relationships and the ones I enjoy the most are more or less permanent partnerships. We have been very successful together and they value having me around for strategic advice, to keep them abreast of changes and to be available when changes happen. Both of the clients who got hit by Panda have been with me for more than four years.

No one can be an expert in everything. I definitely enjoy analytics and data but I have very strong partnerships with a few other agencies that I bring in when I need them. I am very happy with the work that AnalyticsPros has done for my clients. Likewise David Rodnitzky (PPC Associates) and I have partnered on a number of clients. Both allow me to be involved in the strategy and know that the execution will be very high quality. I only wish I had some link builders I felt as passionate about (given that Deborah Mastaler is always too busy to take my clients.)

You mentioned that you thought user engagement metrics were a big part of Panda based on analytics data & such...how would a person look through analytics data to uncover such trends?

I would focus on the behavioral metrics tab in GA. It is pretty normal to have a large percentage of visitors leave before 10 seconds, but after that you should see a bell curve. Low quality content will actually have 60-70% abandonment in less than 10 seconds, but the trick is that for some searches 10 seconds is a good result: weather, what is your address, hours of operation. Lots of users get what they need from searches, sometimes even from the SERP, so look for outliers. Compare different sections of your site, say the blog or those infographics, against the weaker page types.

It's hard to say until you get your hands in the data, but if you assume that individual pages can be weighed down by poor engagement and that this trend is maybe 1 year old and evolving, you can find some clues. Learn to use those advanced segments and build out meaningful segmentation on your dashboard and you will be surprised how much of this will jump out at you. It is like over-optimization: until you believe in it you never notice it, & once you do you can spot it within a few seconds of looking at a page. I won't pretend engagement issues jump out that fast, but it is possible to find them, especially if you are an in-house SEO who really knows your site.
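
As a rough sketch of that kind of segmentation (assuming you have exported per-session engagement data to a CSV; the file name and column names below are made up for illustration, not a Google Analytics API call):

    # Rough sketch: flag site sections with unusually high sub-10-second abandonment.
    # Assumes a hypothetical CSV export with columns: page_path, section, time_on_page_seconds.
    import pandas as pd

    df = pd.read_csv("engagement_export.csv")
    df["abandoned_fast"] = df["time_on_page_seconds"] < 10

    by_section = (
        df.groupby("section")["abandoned_fast"]
          .mean()
          .sort_values(ascending=False)
          .rename("share_under_10s")
    )

    # A large share of fast exits is normal; look for sections well above the site-wide norm.
    site_wide = df["abandoned_fast"].mean()
    outliers = by_section[by_section > site_wide + 0.15]  # illustrative threshold

    print(f"Site-wide share of <10s visits: {site_wide:.1%}")
    print("Sections to investigate:")
    print(outliers.to_string())

The point is not the exact threshold but comparing page types against each other, which is exactly what advanced segments let you do inside GA as well.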

The other important consideration is that improving engagement for any given page is a win regardless of whether it impacts your rankings or your Panda situation. The mantra about doing what is right for the users, not the search engine, may sound cliché, but the reality is that most of your decisions and priorities should be driven by giving the user what they want. I won't pretend that this is the short road to SERP dominance, but my philosophy is to target the user with 80% of your efforts and feed the engines with the other 20.

Thanks Jonah :)

~~~~~~~~~~

Jonah Stein has 15 years of online marketing experience and is the founder of ItsTheROI, a San Francisco search engine marketing company that specializes in ROI-driven SEO and PPC initiatives. Jonah has spoken at numerous industry conferences including Search Engine Strategies, Search Marketing Expo (SMX), SMX Advanced, SIIA On Demand, the Kelsey Group's Ultimate Search Workshop and LT Pact. He also developed the Virtual Blight panels for the Web 2.0 Summit and the Web 2.0 Expo. He has written for Context Web, Search Engine Land and SEO Book.

Jonah is also the cofounder of two SaaS companies: CodeGuard.com, a cloud-based backup service that provides a time machine for websites, and Hubkick.com, an online collaboration and task management tool that provides a simple way for groups to work together, instantly.

Branding & The Cycle

Since it took me a few hours to put together my SMX presentation I figured it was worth sharing that information on the blog as well. This post will discuss examples of how Google has dialed up their brand bias over time & points to where Google may be headed in the future.

Note that I don't have anything against them promoting brands, I just think it is dishonest to claim they are not.

Against All Odds

When analyzing Google's big-brand bias the question is not "do some small sites manage to succeed against all odds" but…

  • What are the trends?
  • What are the biases?

Quotable Quotes

Eric Schmidt once stated that "Brands are the solution, not the problem. Brands are how you sort out the cesspool. Brand affinity is clearly hard wired."

We have a fear of the unknown. Thus that which we have already experienced is seen as less risky than something new & different. This is a big part of why & how cumulative advantage works - it lowers perceived risk.

A significant portion of brand-related searches are driven by offline advertising. When a story becomes popular in the news people look online to learn more. The same sort of impact can be seen with ads - from infomercials to Super Bowl ads. Geico alone spends nearly a billion dollars per year on advertising, & Warren Buffett mentioned that 3/4 of their quotes come from the internet.

Some of the most profitable business models are built off of questionable means.

Many big brands are owned by conglomerates with many horses in the race. When one gets caught doing something illegal they close it down or sell off the assets & move to promote their parallel projects more aggressively.

If things aligned with brands become relevancy signals then to some degree those signals measure the longevity & size of a company (and its ad budget) rather than the quality of its offering.

Even before the Panda update Google's Amit Singhal highlighted the problem with this:

Companies with a high page rank are in a strong position to move into new markets. By “pointing” to this new information from their existing sites they can pass on some of their existing search engine aura, guaranteeing them more prominence.
...
Google’s Mr Singhal calls this the problem of “brand recognition”: where companies whose standing is based on their success in one area use this to “venture out into another class of information which they may not be as rich at”. Google uses human raters to assess the quality of individual sites in order to counter this effect, he adds.

Since Panda, Overstock has moved into offering ebooks & insurance quotes while companies like Barnes & Noble run affiliate listings for rugs.

As an example of the above trend gone astray, my wonderful wife recently purchased me a new computer. I was trying to figure out how to move over some user databases (like our Rank Checker & Advanced Web Ranking) and in the search results were pages like this one:

The problems with the above are:

  • actual legitimate reviews get pushed down by such filler
  • the business model behind doing such actual reviews gets eroded by the automated syndicated reviews
  • outside of branding & navigation the content is fully syndicated
  • that particular page is referencing the 2005 version of the software, so the listed price is wrong & the feature set has changed a lot in the last 7 years

Such scrape-n-mash content strategies by large brands are not uncommon. Sites like Answers.com can quickly add a coupons section, sites like FindTheBest can create tens of millions of automated cross-referencing pages that load a massive net of related keywords below the fold, news sites can create auto-generated subdomains of scraped content, etc.

Eric Schmidt highlighted FindTheBest publicly as an example of a successful vertical search play. That site was launched by an ex-Googler, but if I did the same thing you can be certain that the only way Google would highlight it publicly would be as a "type of spam."

The issue with broadly measuring user experience is that I am still going to visit Yahoo! Sports repeatedly even if my experience on Yahoo! Downloads is pretty crappy. A site which is a market leader in one niche can take those signals to launch a "me too" service in other parallel markets & quickly dominate the market.

Potential Brand Signals

When attempting to debunk the concept of "brand bias" some people claim that it would be ridiculous for Google to have a list of brands that get an across-the-board boost. Of course that argument knocks down a straw man that was never stated publicly (outside of the debunking itself).

However, some of Google's old rater documents *did* have certain sites whitelisted & Google's Scott Huffman once wrote the following:

At a [search] quality level, we have something similar. On a continuous basis in every one of our data centers, a large set of queries are being run in the background, and we’re looking at the results, looking up our evaluations of them and making sure that all of our quality metrics are within tolerance.

These are queries that we have used as ongoing tests, sort of a sample of queries that we have scored results for; our evaluators have given scores to them. So we’re constantly running these across dozens of locales. Both broad query sets and navigational query sets, like “San Francisco bike shop” to the more mundane, like: Here’s every U.S. state and they have a home page and we better get that home page in the top results, and if we don’t … then literally somebody’s pager goes off.

(Outside of some fraternal Google properties) the algorithm isn't hardcoded to rank sites x & y at #1, but if some sites don't rank for certain queries it does cause an alert to be sent out.

Google has a wide host of quality-based metrics they could look at and analyze when determining if something gets a brand boost, gets ignored, or gets hit by an algorithm like Panda.

A while back we wrote a post on potential brand signals, but a short list of examples would be (a toy scoring sketch follows the list):

  • Classical relevancy signals
    • domain name
    • website age
    • anchor text
    • link diversity
    • keyword co-citation
    • inclusion in trusted databases
  • Search behavior
    • keyword search volume trends
    • CTR of users on search results (including how users respond to changes in rank)
    • URL-based searches & other branded searches (the most popular keyword on Google is Facebook)
    • back button clicks (did the user find what they were looking for? or did they look somewhere else?)
    • repeat visitors (if someone repeatedly visits a website that is generally a pretty strong indication they had a positive user experience)
    • search query chains (Google suggested this was a big driver in the Vince update)
  • Passive user monitoring
    • search has become the primary mode of navigation online
    • Google has long offered a search toolbar & paid to have it installed on new computers
    • Google paid Mozilla about a billion dollars for default search placement in Firefox
    • Google owns Chrome & Android
    • Google offers the most widely used analytics program
    • Google can also use AdSense ads and YouTube data to track users
    • Google was recently caught in privacy-related snafus with tracking Safari & Internet Explorer users
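
Purely as a toy illustration of how a handful of such signals could be blended into a single score, here is a sketch with invented signal names and weights; nothing below reflects a published Google formula:

    # Toy illustration only: invented weights over hypothetical, pre-normalized (0..1) signals.
    SIGNAL_WEIGHTS = {
        "branded_search_volume": 0.35,   # searches for the site/brand name itself
        "repeat_visit_rate": 0.25,       # share of visitors who come back
        "serp_ctr_vs_expected": 0.25,    # CTR relative to the norm for that ranking position
        "quick_back_rate": -0.15,        # users bouncing straight back to the results page
    }

    def toy_brand_score(signals: dict) -> float:
        """Weighted sum of normalized signals; higher suggests stronger brand affinity."""
        return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0) for name in SIGNAL_WEIGHTS)

    print(toy_brand_score({
        "branded_search_volume": 0.9,
        "repeat_visit_rate": 0.7,
        "serp_ctr_vs_expected": 0.8,
        "quick_back_rate": 0.1,
    }))

The takeaway is simply that many of these signals scale with company size and ad budget, so any blend of them tilts toward established brands.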

Brand-focused Editorial

In 2008 Rhea Drysdale created the following image, which highlighted how the same activity could be viewed as a legitimate marketing strategy or spam based on nothing other than who was doing it.

The Vince Update

In 2009 Google rolled some of their brand bias directly into the relevancy algorithms. A bunch of branded sites all jumped up in rankings out of nowhere for core industry keywords.

Around that time Microsoft offered a search funnels tool, which showed what people searched for after searching for a particular keyword.

The above screenshots (from Rankpulse and the Microsoft Search Funnels) are both from now defunct tools, but Yahoo! has since launched a tool called Yahoo! Clues which shows similar relationships.

Amit Singhal told the Telegraph that Google is "the biggest kingmaker on this Earth."

A Google engineer admitted that the Vince update was largely driven by search funnels. Google then rolled out a search results interface change which promoted brands & stores directly in the search results.

If you search for "fishing gear" and then click their Bass Shop refinement link in the search results, you are thus directly creating that search funnels relevancy "signal." Even if you don't click on that link the exposure to the term may make you remember it and search for it later.

Paid Links

Are paid links evil?

Once again, it depends on who is doing it.

When the largest flower websites were caught buying massive quantities of links, a Google spokesperson told the New York Times: "None of the links … had a significant impact on our rankings, due to automated systems we have in place to assess the relevance of links."

When some small bloggers were selling paid links to K-Mart as part of a "sponsored conversations" outreach, Matt Cutts equated the practice to selling bogus solutions to brain cancer & stated: "Those blogs are not trusted in Google's algorithms any more."

Google also started sending webmasters automated messages for bad links pointing at their sites:

Dear site owner or webmaster of domain.com, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines.
...
We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.

So if you run a big site & they automatically detect paid links they generally just ignore those links and leave your site alone. If you are a small site & they automatically detect paid links they may decide to automatically penalize your site.

Same offense, entirely different outcome.

Cloaking

Is cloaking evil?

Once again, it depends on who is doing it.

I have a Vistaprint Visa card (so I could get a credit card with our dog's picture on it) and one of the pages that was ranking for Vistaprint Visa was the Singapore Groupon website.

The page forces a pop up and you can't do anything on that page (view the content, scroll around the site, etc.) other than filling in the lead generation form or logging into an existing account. I would never try that because I know I would get smoked for it. ;)

Groupon has also run AdWords accounts where the only option was to fill in the lead generation form or click into the TOS, which is in another language!

After the first iteration of the Google Panda update Google allowed users to vote to block websites. Experts Exchange was hated among some programmers in part because they used scroll cloaking. That in turn got their site hit by the second Panda update.

Google then later rolled out a new ad unit where you pay for viewing content by taking a Google survey & some YouTube videos use preroll ads.

Doorway Pages

Are doorway pages evil?

Once again, it depends on who is doing it.

After the Panda update Ikea's thin, content-free pages started ranking on page 1 for some pretty competitive keywords.

Huffington Post later wrapped 3rd party Tweets in their site's template & ranked those in Google.

Smaller webmasters who ran networks of sites in some cases got hit with "doorway page" penalties for owning networks of sites registered in Google Webmaster Tools, even if each site was a full-fledged ecommerce website.

Content Farming

Is content farming evil?

Once again, it depends on who is doing it (and where it is hosted).

Long before the Panda update I highlighted some of the informationless videos Demand Media was uploading to YouTube.

In spite of Google's Panda hitting eHow, Google still decided to pre-pay Demand Media to keep uploading YouTube videos.

Another thing that is interesting about the content farms and the alleged need for the Panda algorithm was that in spite of flagrant editorial violations by both eHow and Mahalo, Google didn't smoke them until it could be done "algorithmically."

On the flip side of the above, in some cases Google has chosen to keep smaller webmasters penalized because of content that was on their site at one point months in the past!

Google+

When Google+ launched I highlighted how it was acting as a scraper site by outranking original publisher content. About a half-year later some tech blogger noticed that issue & caused a big stink over it. A Google engineer then suggested that it was childish to place any of the blame on Google. Shortly after that Google integrated Google+ in the search results far more aggressively.

A couple weeks after that aggressive promotional integration Amit Singhal stated: "The overall takeaway that I have in my mind is that people are judging a product and an overall direction that we have in the first two weeks of a launch, where we are producing a product for the long term."

The problem with building preferential rankings first & increasing quality later is that it is the exact opposite of what Google is asking publishers to do with algorithms like Panda. Worse yet, Google not only does this integration when you are logged in, but also shows it on obscure longtail advanced queries when you are not logged in.

Affiliates

When Google's ad ecosystem was young they loved affiliates, but that changed over time.

In Google's remote rater documents they suggested that hotel affiliate sites be marked as spam, even if they are helpful.

On Google's reconsideration request form they also stated: "In general, sites that directly profit from traffic (e.g. search engine optimizers, affiliate programs, etc.) may need to provide more evidence of good faith before a site will be reconsidered."

And while Google has biased their editorial philosophies away from affiliates, some of the trusted brands like Barnes & Noble added affiliate listings to their websites, selling things like rugs.


The Business Cycle

Most businesses tend to grow in a cycle...

  • Bootstrap / self-funded
  • Raise funds / take out a loan
  • Build exposure
  • Monetize attention
  • Re-invest in increased quality
  • Build a brand
  • Build further exposure
  • Monetize more attention
  • Re-invest in increased quality

The broken piggy bank in the above cycle highlights the break that exists in the process of building a big brand. It is quite hard to have any level of certainty in the search ecosystem with an algorithm like Panda. Without that level of certainty companies must build from low cost structures, but that very constraint makes them more likely to get hit by an algorithm or a search engineer.

Pricing Risk

Being an entrepreneur is all about taking smart calculated bets & managing risk. However as search engines become closed off portals that compete with (& exclude) publishers, there are so many unknowns that estimating risk is exceptionally challenging.

Penalties: How Hard Were They Hit?

  • Years ago when BMW or Wordpress.org got caught spamming aggressively they were back in good graces in a matter of days.
  • About the only times well known (non-affiliate) sites have been penalized for a significant duration were when JC Penney & Overstock.com were hit. But that happened around the time of the Panda fiasco & Google had incentive to show who was boss. When the flower sites were outed for massive link buying that was ignored, because Google had already rolled out Panda & reasserted the perception of their brand.
  • When Google was caught buying links (again) to promote Google's Chrome browser & that story spread widely throughout the mainstream press, Googlers lied & claimed there was only 1 paid link in 1 single page & penalized a single page of their site. Small website owners that have been caught in similar link buying (or selling) campaigns have been hit much harder. Remember the above story about the bloggers blogging about K-Mart? So far this year Google has sent webmasters over 700,000 messages in Google Webmaster Central.

1 Strike - You're Out

In 2009 Google banned over 30,000 affiliates from the AdWords auction. In some cases the problem was not with a current ad (or even a landing page the advertiser controlled), but rather ads that ran years ago promoting 3rd party products. In some cases Google changed their AdWords TOS after the fact & applied it retroactively. Google won't allow some of these advertisers to advertise unless they fix the landing page, but if they don't control the landing page they can't ever fix the problem. Making things worse, to this day Google still suggests affiliates do direct linking. But if the company they promote gets bought out by someone too aggressive then that affiliate could end up with a lifetime ban through no fault of their own.

A popular programmer who has been an AdSense publisher for 8 years had their AdSense account arbitrarily suspended without warning. After an ex-Googler expressed outrage over the issue he was able to get his AdSense account reactivated. A publisher without those friendships would have been done.

In Australia a small travel site had a similar issue with AdSense. The only way they were able to get a reconsideration was to lodge a formal complaint with regulators. If that is how Google treats their business partners, it colors how they view non-business partners who monetize traffic without giving Google a taste of the revenues.

Why Does Google Lean Into Brand?

  • Minimize legal risks: if they hit small businesses almost nobody will see/notice/care, but big businesses are flush with cash and political connections. When Google hits big businesses they create organizations & movements like Fair Search & Search Neutrality.
  • Minimize duplication: some small businesses & affiliates simply repeat offers that exist on larger merchant sites. That said, many big businesses buy out a 2nd, 3rd, 4th, or even 5th site in a vertical to have multiple placements in the search results.
  • Better user experience: the theory is that the larger sites have more data and capital to improve user experience, but they don't always do it.
  • Business partnerships: if Google wants to strike up closed door business partnerships with big business then some of those negotiations will have specific terms attached to them. It costs Google nothing to give away part of the organic results as part of some custom deals. If Google wants to sell TV ads & run a media streaming device they need to play well with brands.
  • CPA-based product ads: on some searches Google provides CPA-based product ads above the search results. It makes sense for Google to promote those who are buying their ads to get the best relationships possible.
  • Fewer people tasting the revenues: the fewer organizations an ecosystem needs to support, the more of the profits from that ecosystem can be kept by the manager.
  • More complete ad cycle: if Google caters to direct response advertisers they only get to monetize demand fulfillment, which is just a small slice of the complete ad cycle. If Google caters to brands they get to monetize (directly or indirectly) every piece of the ad cycle. For example, buying display ads helps build brand searches, which helps create brand signals. In such a way, improved rankings in the organic results subsidize ad buying.
    • Attention
    • Interest
    • Desire
    • Action
    • Satisfaction
  • Brands buying their equity: Google has created exceptionally large ad units & has convinced many brands to buy their own pre-existing brand equity.

Lack of Diversity

The big issue with brand bias is that a lot of the same *types* of companies rank with roughly similar consumer experiences. If there is a mix of large and small businesses that rank then many of those small businesses will be able to differentiate their offering by adding services to their products, doing in-depth reviews, and so on.

Sure Zappos is a big company known for customer service, but how different is the consumer-facing experience if I click on Target.com or Walmart.com? Sure the text on the page may be slightly different, but is there any real difference beyond aesthetic? Further, a lot of the business models built around strong in-depth editorial reviews & comparisons are eroded by the current algorithms. If the consumer reviews are not good enough, then tough luck!

Do Brands Always Provide a Better User Experience?

Some larger retailers track people in ways that are creepy:

For decades, Target has collected vast amounts of data on every person who regularly walks into one of its stores. Whenever possible, Target assigns each shopper a unique code — known internally as the Guest ID number — that keeps tabs on everything they buy. "If you use a credit card or a coupon, or fill out a survey, or mail in a refund, or call the customer help line, or open an e-mail we've sent you or visit our Web site, we'll record it and link it to your Guest ID," Pole said. "We want to know everything we can."

Many big media companies provide watered-down versions of their content online because they don't want to cannibalize their offline channels. Likewise some large stores may consider their website an afterthought. When I wanted to order my wife a specific shoe directly from the brand they didn't have customer support open for extended hours during the holidays and their shopping cart kept kicking an error. Since they *are* the brand, that brand strength allows them to get away with other issues that need to be fixed.

Some of those same sites carry huge AdSense ad blocks on their category pages & have funky technical issues which act like doorway pages & force users, in any browser, to go through the homepage if they land on a deep page.

Missing the Target indeed.

The above "screw you" redirect error has literally been going on for weeks now, with Target's webmaster asleep at the wheel. Perhaps they want you to navigate their site by internal search so they can track every character you type.
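
A quick way to spot that kind of behavior on your own site is to request a handful of deep pages and check whether they all bounce back to the homepage. The sketch below uses hypothetical placeholder URLs; it is a diagnostic idea, not a description of Target's actual setup:

    # Diagnostic sketch: detect deep pages that redirect visitors to the homepage.
    # The URLs below are hypothetical placeholders.
    import requests

    HOMEPAGE = "https://www.example-store.com/"
    DEEP_PAGES = [
        "https://www.example-store.com/category/shoes/item-123",
        "https://www.example-store.com/category/toys/item-456",
    ]

    for url in DEEP_PAGES:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        if resp.url.rstrip("/") == HOMEPAGE.rstrip("/"):
            print(f"{url} -> redirected to homepage ({resp.status_code})")
        else:
            print(f"{url} -> served at {resp.url} ({resp.status_code})")

Running something like this on a sample of deep URLs after every release would catch the "everyone gets dumped on the homepage" failure long before it lingers for weeks.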

Riding The Waves

With SEO many aggressive techniques work for a period of time & then suddenly stop working. Every so often there are major changes like the Florida update & the Panda update, but in between these there are other smaller algorithmic updates that aim to fill in the holes until a big change comes about.

No matter what Google promotes, they will always have some gaps & relevancy issues. Some businesses that "ignore the algorithms and focus on the user" are likely to run on thinner margins than those who understand where the algorithms are headed. Those thin margins can quickly turn negative if either Google enters your niche or top competitors keep reinvesting in growth to buy more marketshare.

Profit Potential

Given the above pattern - where trends spread until they get hit hard - those who quickly figure out where the algorithms are going & where there are opportunities have plenty of time to monetize their efforts. Whereas if you have to wait until things are widely spread on SEO blogs as common "tricks of the trade" or wait until a Google engineer explicitly confirms something then you are likely only going to be adopting techniques and strategies after most of the profit potential is sucked out of them, just before the goal posts move yet again.

People who cloned some of the most profitable eHow articles years ago had plenty of time to profit before the content farm business model got hit. Those who waited until Demand Media spelled their business model out in a Wired article had about 1.5 years until the hammer fell. Those who waited until the content farm controversy started creating a public relations issue to clone the model may have only had a couple months of enhanced revenues before their site got hit & was left worse off than before they chased the algorithm late in the game.

Ride The Brand

If Google does over-represent established branded websites in their algorithms then in many cases it will be far easier to rank a Facebook notes page or a YouTube video than to try to rank a new site from scratch. There are a ton of web 2.0 sites driven by user generated content.

In addition to those sorts of sites, also consider participating in industry websites in your niche & buying presell pages on sites that rank especially well.

Collecting (& Abusing) User Data

Google has been repeatedly branded as being a bit creepy for their interest in user tracking.

Their latest privacy policy change was rolled out in spite of EU warnings that it might not comply with the law.

Collecting that data & using it for ad targeting can have profound personal implications (think of serving a girl with anorexia ads about losing weight everywhere she goes online, simply because she clicked such an ad; in that case Google reinforces a warped worldview). Then when the person needs counseling Google can recommend a service provider there as well. ;)

Trust in Google's ability to do the right thing would be greater if they had not been caught in that drug sting selling ads to fake Mexican pharmacies selling illicit products, a practice they were involved in before going public.

They also take aggregate collected data and sell it off to banksters.

Google as Content Host (& Merchant)

Throughout the history of the web there will be many cycles between open and closed ecosystems. Currently we are cycling toward closed silos (Apple, Amazon, Google, Facebook). As these silos become more closed off they will end up leaving gaps that create new opportunities.

Google has been pushing aggressively for years to host content & crowd out the organic search results.

While on one front Google keeps making it easier for brands to compete against non-brands, Google also keeps clawing back a bigger slice of that branded traffic through larger AdWords ad units & integration of listings from services like Google+, which can in some cases outrank the actual brand.

Google has multiple platforms (Android Marketplace, Chrome Marketplace, Enterprise Marketplace) competing against iTunes. Google recently decided to merge some of their offerings into Google Play. In addition to games, music & books, Play will soon include audiobooks, magazines & other content formats.

Google also wants to compete against Amazon.com to launch an Amazon Prime-like delivery service.

Having a brand & following will still be important for commanding premium rates, earning fatter margins, building non-search distribution (which can be used to influence the "relevancy" signals), and helping overturn manual editorial interventions. But algorithmically brand emphasis will peak in the next year or two as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets. (Recall how Google came up with their QDF algorithm only *after* Google Finance wasn't able to rank.) At that point Google will push their own verticals more aggressively & launch aggressive public relations campaigns about helping small businesses succeed online.

Once Google is the merchant of record, almost everyone is just an affiliate, especially in digital marketplaces with digital delivery.

Is Bryson Meunier Full Of Manure? Learn Why SEO Consultants Push Brand

At SMX I gave a presentation on brand & how Google has biased the algorithms toward brands. Having already seen the bulk of my argument months prior, Bryson Meunier spoke after me and put together a presentation that used bogus statistics & was basically a smear of me. He was so over the top with his obnoxious behavior that when Danny Sullivan mentioned the next speaker after him he jokingly said "up next, Ron Paul."

I honestly thought the point of the discussion was to highlight how Google has (or hasn't) biased the algorithms, editorial policies & search interface toward brands. However, if a person speaks after you and uses bogus statistics to reach junk conclusions, you can't debunk their aggregate information until after you have looked into it some. An honest person can put what they know out there & share it publicly in advance; a dishonest person hides behind junk research and the label of science to ram through poorly thought out trash, collecting whatever "data" confirms their own bias while ignoring the pieces of reality that don't.

  • As an example, he suggested that based on the number of employees and revenues Wikipedia is a small business. He then went on to say that since Wikipedia wasn't in Interbrand's "scientific" study they were not a top brand. Never mind that no countries, religions, sports, celebrities, or non-profits make the list of top "companies."
  • After IAC figured out that they were able to get away with running Ask.com as a thin scraper site, they outsourced "the algorithm" and fired many of their employees. Because they have fewer employees, Bryson considers Ask "a mid-sized business" even though they are part of a multi-billion-dollar company and IAC is Google's #1 advertiser!
  • According to Compete's downstream traffic stats, YouTube receives about 1 in 13 search clicks from Google, but since it wasn't on Interbrand's list, "who cares?" Incidentally, the folks at Interbrand do mention YouTube on their top 100 brands page, but only as a suggestion that you watch their videos on YouTube. Their methodology is so suspect that Goldman Sachs and Yahoo! made the cut while YouTube didn't, even though YouTube is one of the few offsite promotional channels they promote on that very page. Their list also puts Microsoft's brand value at about double Apple's (and the list came out when Steve Jobs was still alive).
  • Bryson also claimed that since big brands are inefficient and slow moving they already have a big disadvantage, so it makes sense for search engines to compensate for that. That is at best an illegitimate line of reasoning because those companies have plenty of solutions available to them & the capital needed to buy out competitors. Even when the SERPs look independent, a lot of the listed sites are owned by large conglomerates. As an example, here is a random search from earlier today:

    Meanwhile the same idiotic logic ignores the lack of resources at small businesses. Nowhere in his presentation did he highlight how Google favored affiliates & direct marketers until the profit margins of the direct response marketing model started to peak, at which point Google transitioned to promoting brands because they wanted to keep increasing revenues and monetize more clicks.
  • Bryson also shared an example where he got a photo-sharing site 40,000 unique visitors a month as a case study of the power of white hat SEO. 40,000 monthly visits to a photo-sharing site might fund a light Starbucks addiction (assuming you value your time at nothing, have no employees, ignore hosting costs and the SEO is free), but not much beyond that. If that is a success case study, it shows how much harder the ecosystem is getting to operate in as a small business.
  • He also put out a painfully fluffy "white paper" / sales letter which stated that since Wal-Mart has a page about SEO they should outrank seobook on "SEO" related queries if my theories of brand bias are correct. That misses the point entirely. I never stated that garbage content on branded sites always outperforms quality content on niche sites, but rather that a lot of smaller websites were intentionally being squeezed out of the ecosystem. Sure some small sites manage to compete, but the odds of them succeeding today are much lower than they were 3 or 4 years ago.
  • At SMX near the end of our session a question was asked about the audience composition & most attendees either were from big brands or worked for big brands. If you go back to when I first got into SEO in 2003 the audience was almost entirely small publishers and independent SEOs. This squeezing out of small players is not something new to search or the web. If you look at the history of any modern communications network this cycle has repeated itself in every single medium - phone, radio, television, and the web.

To be fair, I can understand why a no-name also-ran SEO consultant would want to pitch himself as being up for doing SEO work for large brands. Brands generally have fatter margins, economies of scale, and large budgets. As Google tilts the algorithm toward the big brands (to where they can fall over the finish line in first place) they are the best clients to work for, since you are swimming downstream.

Why push huge boulders up the side of the mountain for crumbs when you can get paid far more to blow on a snowflake at the top of the mountain?

That is why so many SEOs fawn over trying to get brand clients. The work is high-paying, low risk, and relatively easy.

If we were ever to close up our membership site & focus primarily on SEO consulting work in more structured arrangements then absolutely we would aim at brands & help them fall over the finish line in first place. ;)

Back when I worked with Clientside SEM we did a good number of big brand projects with some of the largest online portals & retailers. Understanding the business objectives & communicating things in a way that builds buy in from other departments is of course challenging. You need simplicity & directness without oversimplifying. But (if you work for great clients - like we did), then that is nowhere near as challenging as building a site from scratch into something that can compete for lucrative keywords. I recently stepped back from the client consulting model for a bit simply because I was pulling myself in too many directions & working too long, but Scott is still flourishing & delivering excellent results for clients.

I have nothing against the concept of branding (think of how many years I slaved building up this site & the capital I have poured into it), but I like to share the trends in the ecosystem as they are, rather than as a hack warping my view to try to pick up consulting clients. Our site would likely make far more income if we kept using the words "enterprise," "brand," & "Fortune 500" and then sold consulting to that target audience. In fact, a large % of our members here are Fortune 500s, conglomerates, newspaper chains, magazine publishers, and so on.

It is not that brand counts for nothing (or that it should count for nothing) but anyone who claims the table isn't tilted is either ignorant, a liar, or both.

Truth has to count for something.

Disclaimer: I am not saying enterprise SEO is always easy (there are real challenges, especially with internal politics that add arbitrary constraints). And I am not saying that everyone who targets the enterprise market is a hack (there are some super talented folks out there). But the challenge of being a profitable small webmaster is much more of a struggle than ranking a site that Google is intentionally biasing their algorithms toward promoting.

Disclaimer 2: I realize refuting a douchebag like Bryson Meunier is batting below my league, however as a matter of principle I won't let sleazeballs get away with taking a swipe using junk science. The word science deserves better than that.
