Paid Placement in Search Engines

Apr 10th

Consumer Search Insights.
In the poll below we didn't make any distinction between AdWords & organic SEO investments. If we had, I am not sure how it would have impacted the voting.

How do you feel about people paying for placement in search engines?

Nearly 2 in 3 people dislike money manipulating search results.

response All (1201) 
I think it is deceptive 65.4% (+3.3 / -3.5)
It is good if it is relevant 34.6% (+3.5 / -3.3)

Women tend to dislike it slightly more than men.

answer Men (813)  Women (388) 
I think it is deceptive 63.6% (+3.6 / -3.8) 67.2% (+5.4 / -5.9)
It is good if it is relevant 36.4% (+3.8 / -3.6) 32.8% (+5.9 / -5.4)

Older people tend to think money influencing search is manipulative, as do younger people who have not had their idealism beaten out of them by the harshness of the world. However, people in the 25-to-34 range, who grew up with the web, tend to like paid search far more than other groups do.

response 18-24 year-olds (350)  25-34 year-olds (266)  35-44 year-olds (164)  45-54 year-olds (194)  55-64 year-olds (148)  65+ year-olds (80) 
I think it is deceptive 61.3% (+5.0 / -5.2) 47.9% (+6.6 / -6.6) 63.8% (+7.0 / -7.7) 72.5% (+5.8 / -6.7) 72.8% (+6.9 / -8.1) 70.6% (+9.9 / -12.3)
It is good if it is relevant 38.7% (+5.2 / -5.0) 52.1% (+6.6 / -6.6) 36.2% (+7.7 / -7.0) 27.5% (+6.7 / -5.8) 27.2% (+8.1 / -6.9) 29.4% (+12.3 / -9.9)

People in the south tend to dislike money influencing search more than any other region & people out west are more accepting of it. Perhaps the audience from California is more likely to understand how search impacts the local economy?

answer The US Midwest (267)  The US Northeast (333)  The US South (355)  The US West (246) 
I think it is deceptive 64.3% (+6.9 / -7.5) 66.4% (+5.9 / -6.4) 69.5% (+5.6 / -6.2) 59.8% (+7.4 / -7.8)
It is good if it is relevant 35.7% (+7.5 / -6.9) 33.6% (+6.4 / -5.9) 30.5% (+6.2 / -5.6) 40.2% (+7.8 / -7.4)

Rural people dislike money influencing search more than urban people do.

response Urban areas (620)  Rural areas (109)  Suburban areas (460) 
I think it is deceptive 63.2% (+4.4 / -4.6) 70.9% (+8.9 / -10.8) 65.3% (+4.9 / -5.2)
It is good if it is relevant 36.8% (+4.6 / -4.4) 29.1% (+10.8 / -8.9) 34.7% (+5.2 / -4.9)

Income has essentially no impact on the perception of the influence of money in search (though there was insufficient data at the upper end of the income range).

response People earning $0-24K (135)  People earning $25-49K (675)  People earning $50-74K (307)  People earning $75-99K (71)  People earning $100-149K  People earning $150K+ 
I think it is deceptive 65.1% (+7.4 / -8.2) 65.8% (+4.3 / -4.6) 65.4% (+6.1 / -6.7) 66.5% (+9.2 / -10.7) Insufficient data Insufficient data
It is good if it is relevant 34.9% (+8.2 / -7.4) 34.2% (+4.6 / -4.3) 34.6% (+6.7 / -6.1) 33.5% (+10.7 / -9.2) Insufficient data Insufficient data

Content Locking Ads

Apr 10th

Consumer Search Insights.

Google recently launched a consumer insights survey product, which quizzes users in exchange for access to premium content.

How do users get access to these poll questions? Google locks premium content behind them, like so:

Google has long stated that "cloaking is bad" and that it was deceptive & users didn't like it. Earlier this year Google also rolled out an algorithm to penalize sites that were too ad heavy:

We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.

Also recall that the second version of the Panda update encouraged users to block sites, & many programmers blocked Experts-Exchange due to disliking their scroll cloaking. That in turn caused Experts-Exchange to get hit & see a nosedive in traffic.

Between the above & seeing how implementation of this quiz technology works, I had to ask:
How do you feel about ads that lock content behind poll questions like this one?

response All
Hate them. A total waste of time 63.7% (+3.3 / -3.4)
I am indifferent 30.8% (+3.3 / -3.1)
I love them. These are fun 5.5% (+2.5 / -1.7)

There isn't a huge split between men & women. Men hate them a bit more, but they also like them a bit more...they are just less indifferent.

Vote Men (811) Women (409)
Hate them. A total waste of time 66.1% (+3.4 / -3.6) 61.5% (+5.4 / -5.7)
I am indifferent 27.2% (+3.4 / -3.2) 34.2% (+5.6 / -5.2)
I love them. These are fun 6.7% (+2.3 / -1.7) 4.3% (+5.1 / -2.4)

Young people & old people tend to like such quizzes more than people in the middle. My guess is this is because older people are a bit lonely & younger people do not value their time as much and presume it is more important that they voice their opinions on trivial matters. People just before retirement (who have recently been hosed by the financial markets) tend not to like these polls as much, & the same goes for people in their mid-30s to mid-40s, who are likely short on time trying to balance career, family & finances.

Vote 18-24 year-olds (359) 25-34 year-olds (267) 35-44 year-olds (151) 45-54 year-olds (200) 55-64 year-olds (158) 65+ year-olds (83)
Hate them. A total waste of time 62.1% (+4.9 / -5.2) 62.6% (+6.0 / -6.4) 69.4% (+6.9 / -7.9) 64.5% (+6.5 / -7.1) 68.3% (+6.3 / -7.1) 62.3% (+10.2 / -11.4)
I am indifferent 28.9% (+4.9 / -4.5) 32.1% (+6.2 / -5.6) 24.0% (+7.6 / -6.2) 30.8% (+7.0 / -6.2) 28.4% (+6.9 / -6.0) 28.7% (+11.3 / -9.1)
I love them. These are fun 8.9% (+3.4 / -2.5) 5.3% (+3.7 / -2.2) 6.6% (+5.3 / -3.0) 4.7% (+3.7 / -2.1) 3.3% (+4.4 / -1.9) 9.0% (+9.7 / -4.9)

People out west tend to be more indifferent. Like, whatever man. This may or may not have something to do with California's marijuana laws. ;)

vote The US Midwest (280) The US Northeast (331) The US South (363) The US West (246)
Hate them. A total waste of time 65.2% (+5.6 / -6.0) 69.0% (+6.2 / -7.0) 65.6% (+5.9 / -6.4) 55.6% (+7.2 / -7.5)
I am indifferent 29.7% (+5.9 / -5.3) 25.6% (+6.8 / -5.8) 28.7% (+6.2 / -5.5) 38.7% (+7.4 / -6.9)
I love them. These are fun 5.1% (+4.5 / -2.4) 5.4% (+5.9 / -2.9) 5.7% (+4.8 / -2.7) 5.6% (+7.4 / -3.3)

Rural people tend to like such polls more than others. Perhaps it has to do with a greater longing for connection due to being more isolated?

vote Urban areas (608) Rural areas (117) Suburban areas (477)
Hate them. A total waste of time 62.6% (+4.6 / -4.9) 53.6% (+10.1 / -10.4) 63.8% (+4.8 / -5.1)
I am indifferent 32.2% (+4.8 / -4.4) 37.5% (+10.4 / -9.3) 29.1% (+5.0 / -4.6)
I love them. These are fun 5.2% (+4.4 / -2.5) 8.9% (+9.5 / -4.8) 7.2% (+5.2 / -3.1)

There aren't any conclusive patterns based on income. Wealthier people appear to be more indifferent; however, the sampling error on that is huge due to the small sample size.

vote People earning $0-24K (151) People earning $25-49K (670) People earning $50-74K (303) People earning $75-99K (77) People earning $100-149K (20) People earning $150K+
Hate them. A total waste of time 69.0% (+7.7 / -8.9) 62.1% (+4.4 / -4.6) 69.7% (+5.5 / -6.1) 69.7% (+9.1 / -10.9) 53.8% (+19.3 / -20.5) Insufficient data
I am indifferent 26.0% (+8.5 / -7.0) 32.6% (+4.6 / -4.3) 23.6% (+5.8 / -5.0) 26.0% (+11.1 / -8.7) 41.7% (+20.6 / -18.1) Insufficient data
I love them. These are fun 5.0% (+6.8 / -3.0) 5.3% (+4.0 / -2.4) 6.7% (+5.7 / -3.2) 4.3% (+11.8 / -3.3) 4.4% (+27.1 / -4.0) Insufficient data
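The "+" and "-" ranges quoted throughout these tables behave like score-based confidence intervals for a proportion: asymmetric, and widening sharply as the sample shrinks (hence the enormous ranges on the n=20 cell above). As a rough illustration (the interval type is my assumption, not Google's published methodology, which also involves response weighting), a 95% Wilson score interval can be computed like this:

```python
import math

def wilson_interval(p_hat, n, z=1.96):
    """Approximate 95% Wilson score interval for a sample proportion.

    p_hat: observed proportion (e.g. 0.538 for 53.8%)
    n: sample size
    Returns (lower, upper). Illustrative only; the survey's published
    ranges also reflect response weighting we can't reproduce here.
    """
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return center - half, center + half

# The same ~54% answer share looks very different at n=20 vs n=670:
lo_small, hi_small = wilson_interval(0.538, 20)
lo_large, hi_large = wilson_interval(0.538, 670)
print(f"n=20:  {lo_small:.1%} - {hi_small:.1%}")
print(f"n=670: {lo_large:.1%} - {hi_large:.1%}")
```

At n=20 the interval spans roughly 40 percentage points, which is why such cells are close to unusable, while at n=670 it narrows to under 8 points.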

So, ultimately, Google was right that users hate excessive ads & cloaking. But the one thing users hate more than either of those is paying for content. ;)

Some of the traditional publishing businesses are dying on the vine & this is certainly a great experiment to try to generate incremental revenues.

...but...

How does Google's definition of cloaking square with the above? If publishers (or a competing ad network) do the same thing without Google, would it be considered spam?

Ad Retargeting

Apr 10th

Consumer Search Insights.

How do you feel about companies tracking your online behavior to target ads?

Surprisingly, about 1 in 12 people like ad retargeting. However, over 3 in 5 people dislike it.

response All (1250) 
I dislike it because it feels creepy 62.3% (+3.1 / -3.3)
I don't care either way 29.3% (+3.1 / -2.9)
I like more relevant ads 8.3% (+2.3 / -1.9)

Women tend to think being stalked by ads is creepier than men do.

vote Men (822)  Women (428) 
I dislike it because it feels creepy 60.6% (+3.7 / -3.8) 64.1% (+5.0 / -5.3)
I don't care either way 30.0% (+3.6 / -3.4) 28.7% (+5.1 / -4.6)
I like more relevant ads 9.5% (+2.6 / -2.1) 7.2% (+4.2 / -2.7)

Younger people who are old enough to be starting families tend to be more financially stressed than most other age groups, so they are likely more appreciative of relevant ads tied to discounts & such. Younger people have also used the web for so much of their lives that they are not as creeped out by tracking & privacy issues as older people are. People in retirement also like relevant ads, perhaps in part because they are feeling the Ben "printing press gone wild but no inflation" Bernanke pinch & are seeing their fixed-income retirements collapse under artificially low interest rates tied to the money-printing game.

age 18-24 year-olds (372)  25-34 year-olds (270)  35-44 year-olds (150)  45-54 year-olds (217)  55-64 year-olds (164)  65+ year-olds (77) 
I dislike it because it feels creepy 60.2% (+4.8 / -5.0) 52.3% (+6.3 / -6.4) 65.1% (+7.2 / -8.0) 66.0% (+6.1 / -6.6) 66.6% (+6.9 / -7.7) 55.7% (+11.2 / -11.8)
I don't care either way 33.6% (+4.9 / -4.6) 35.0% (+6.4 / -5.9) 25.5% (+7.6 / -6.3) 27.9% (+6.4 / -5.6) 26.9% (+7.5 / -6.3) 33.5% (+11.9 / -10.1)
I like more relevant ads 6.2% (+2.9 / -2.0) 12.7% (+5.1 / -3.8) 9.5% (+5.9 / -3.8) 6.1% (+3.9 / -2.5) 6.4% (+5.2 / -2.9) 10.7% (+9.1 / -5.2)

People from the west coast are perhaps slightly more aware of the risks of online tracking. People from the south don't care either way. In the midwest the stereotype of the mom who clips coupons shows up in the data (though the sample size is small).

vote The US Midwest (259)  The US Northeast (340)  The US South (404)  The US West (247) 
I dislike it because it feels creepy 58.5% (+6.5 / -6.9) 61.8% (+5.9 / -6.3) 61.6% (+5.7 / -6.0) 67.2% (+6.2 / -6.8)
I don't care either way 29.9% (+6.6 / -5.9) 29.1% (+5.8 / -5.2) 32.4% (+5.9 / -5.4) 24.6% (+6.7 / -5.6)
I like more relevant ads 11.6% (+5.6 / -4.0) 9.1% (+5.0 / -3.3) 6.0% (+4.6 / -2.7) 8.2% (+5.7 / -3.5)

On everything outside of disliking online tracking, the margin of error is wide enough that it is hard to spot any strong patterns based on population density.

vote Urban areas (636)  Rural areas (108)  Suburban areas (480) 
I dislike it because it feels creepy 58.9% (+5.0 / -5.1) 61.1% (+9.0 / -9.8) 62.6% (+4.5 / -4.7)
I don't care either way 32.3% (+5.1 / -4.7) 33.9% (+9.9 / -8.6) 27.6% (+4.5 / -4.1)
I like more relevant ads 8.8% (+4.4 / -3.0) 5.0% (+8.7 / -3.3) 9.8% (+3.6 / -2.7)

It is also hard to see much of a broad pattern based on income levels.

vote People earning $0-24K (150)  People earning $25-49K (691)  People earning $50-74K (304)  People earning $75-99K (88) 
I dislike it because it feels creepy 62.2% (+8.4 / -9.1) 60.2% (+4.2 / -4.4) 66.5% (+5.8 / -6.4) 55.1% (+10.2 / -10.6)
I don't care either way 30.0% (+9.2 / -7.8) 30.8% (+4.3 / -4.0) 25.9% (+6.1 / -5.3) 35.8% (+10.4 / -9.2)
I like more relevant ads 7.9% (+8.6 / -4.3) 9.0% (+3.7 / -2.7) 7.5% (+5.5 / -3.3) 9.2% (+9.0 / -4.8)

Google+ Integration

Apr 9th

Consumer Search Insights.
As publishers we tend to be quite concerned with the over-promotion of Google+ because it carves up the search landscape, is potentially another hoop that we have to jump through, and in some cases, the Google+ hosted version of a page will outrank the legitimate original source - which screws up the economics of online publishing.

But do users care about how Google+ was integrated directly into the search results? Generally no.

How do you feel Google+ integration has impacted Google's relevancy?

Under 1 in 5 people said it made the search results better, under 1 in 5 said it made the search results worse & over 3 in 5 didn't notice any material impact.

vote All (1260) 
no noticeable impact 64.7% (+3.3 / -3.5)
made it better 17.4% (+2.9 / -2.6)
made it worse 17.9% (+3.0 / -2.7)

Men liked it slightly more than women. However, that difference was within the estimated range of error. If the difference were more significant, one might guesstimate that women are better at socializing offline & have less need for artificial web relationships, given their relatively larger corpus callosum. ;)

vote Men (875)  Women (385) 
no noticeable impact 64.1% (+3.4 / -3.6) 65.3% (+5.5 / -5.9)
made it better 18.7% (+3.0 / -2.6) 16.2% (+5.2 / -4.1)
made it worse 17.2% (+2.9 / -2.6) 18.5% (+5.3 / -4.4)

Older people are less likely to have loads of online friends & relationships (as they spent most of their lives building relationships in the physical world, before the web or online social networks were popular). Older people also tend to be more set in their ways. Thus many older people won't be signed up for Google+ & won't notice as much of an impact from it.

Younger people are more likely to want to try out new technology, thus they are more likely to notice an impact from it. Some generations tend to be more isolated & individualistic (like the baby boomers) while millennials tend to like to work in groups & network more (it isn't an accident that Facebook started on a college campus & targeted college students), thus younger people are not only more likely to notice something like Google+, but they are also more likely to like its impact.

vote 18-24 year-olds (334)  25-34 year-olds (322)  35-44 year-olds (141)  45-54 year-olds (204)  55-64 year-olds (167)  65+ year-olds (93) 
no noticeable impact 59.8% (+5.1 / -5.4) 64.0% (+5.4 / -5.7) 66.6% (+7.3 / -8.2) 59.3% (+6.6 / -7.0) 65.7% (+6.9 / -7.7) 73.9% (+8.1 / -10.1)
made it better 26.6% (+5.0 / -4.4) 18.8% (+5.0 / -4.1) 16.3% (+7.2 / -5.3) 19.1% (+6.2 / -4.9) 16.4% (+6.7 / -5.0) 7.9% (+8.7 / -4.3)
made it worse 13.6% (+4.1 / -3.3) 17.2% (+4.8 / -3.9) 17.1% (+7.4 / -5.5) 21.6% (+6.0 / -5.0) 17.9% (+6.5 / -5.0) 18.2% (+9.9 / -7.0)

I didn't notice any obvious trends or patterns aligned with locations across the country.

vote The US Midwest (267)  The US Northeast (360)  The US South (378)  The US West (255) 
no noticeable impact 65.5% (+6.7 / -7.3) 61.3% (+7.3 / -7.8) 67.6% (+5.6 / -6.1) 62.4% (+6.6 / -7.1)
made it better 16.2% (+6.2 / -4.7) 20.5% (+7.8 / -6.1) 17.2% (+5.0 / -4.1) 16.5% (+6.3 / -4.8)
made it worse 18.4% (+6.9 / -5.3) 18.2% (+6.3 / -4.9) 15.1% (+5.6 / -4.3) 21.1% (+6.6 / -5.3)

Suburban people were more likely to notice an impact, though they were not heavily skewed one way or the other.

vote Urban areas (669)  Rural areas (124)  Suburban areas (450) 
no noticeable impact 65.9% (+4.1 / -4.4) 66.8% (+9.0 / -10.4) 62.0% (+4.7 / -5.0)
made it better 16.4% (+3.7 / -3.1) 14.3% (+8.5 / -5.7) 20.4% (+4.4 / -3.8)
made it worse 17.6% (+3.9 / -3.3) 18.9% (+9.8 / -7.0) 17.6% (+4.2 / -3.6)

People who earned less were less likely to notice positive or negative impact from Google+ integration (somewhat surprising since younger people tend to skew toward lower incomes & younger people were more likely to notice & like Google+ integration). Outside of that, the data is too bunched up to see any other significant patterns based on income.

vote People earning $0-24K (162)  People earning $25-49K (698)  People earning $50-74K (312)  People earning $75-99K (71) 
no noticeable impact 71.1% (+7.8 / -9.2) 62.8% (+4.4 / -4.6) 61.9% (+6.3 / -6.8) 61.3% (+10.6 / -11.9)
made it better 14.8% (+8.8 / -5.9) 17.5% (+4.0 / -3.4) 18.9% (+5.9 / -4.8) 17.1% (+11.5 / -7.5)
made it worse 14.1% (+9.5 / -6.1) 19.7% (+4.3 / -3.7) 19.2% (+6.4 / -5.1) 21.6% (+11.2 / -8.1)

Editorial Objectivity

Apr 8th

Consumer Search Insights.

Should search engines be able to preferentially promote their own services in their search results?

Nearly 3 in 4 people think that search engines should not be able to preferentially promote their own services.

vote All (1226)
no, results should be objective 74.1% (+3.1 / -3.4)
yes, it is their search results 25.9% (+3.4 / -3.1)

There was essentially no split between men & women.

vote Men (827) Women (399)
no, results should be objective 73.7% (+3.1 / -3.4) 74.4% (+5.2 / -6.0)
yes, it is their search results 26.3% (+3.4 / -3.1) 25.6% (+6.0 / -5.2)

Older people tend to prefer/want more editorial objectivity, whereas younger people are more fine with search engines preferentially promoting their own services. Older people tend to be more fixed in their ways & younger people are much less so.

vote 18-24 year-olds (338) 25-34 year-olds (269) 35-44 year-olds (158) 45-54 year-olds (209) 55-64 year-olds (169) 65+ year-olds (83)
no, results should be objective 65.0% (+4.9 / -5.2) 76.0% (+5.1 / -6.0) 74.0% (+6.5 / -7.7) 71.2% (+5.7 / -6.5) 71.4% (+6.5 / -7.5) 87.2% (+6.1 / -10.4)
yes, it is their search results 35.0% (+5.2 / -4.9) 24.0% (+6.0 / -5.1) 26.0% (+7.7 / -6.5) 28.8% (+6.5 / -5.7) 28.6% (+7.5 / -6.5) 12.8% (+10.4 / -6.1)

Geographically, people in the south & midwest tend to be slightly more trusting, perhaps due to the lower cost of living & less competitive markets. However, any differences here are fairly minor & are within the margin of error.

vote The US Midwest (244) The US Northeast (367) The US South (352) The US West (263)
no, results should be objective 72.2% (+6.4 / -7.4) 77.7% (+4.5 / -5.3) 72.1% (+6.0 / -6.9) 75.9% (+5.7 / -6.9)
yes, it is their search results 27.8% (+7.4 / -6.4) 22.3% (+5.3 / -4.5) 27.9% (+6.9 / -6.0) 24.1% (+6.9 / -5.7)

People who are rural tend to be slightly more accepting of Google doing as it wishes, though this is also a small sample size & well within the margin of error.

vote Urban areas (647) Rural areas (106) Suburban areas (453)
no, results should be objective 74.3% (+4.3 / -4.9) 71.9% (+8.5 / -10.5) 74.4% (+4.2 / -4.7)
yes, it is their search results 25.7% (+4.9 / -4.3) 28.1% (+10.5 / -8.5) 25.6% (+4.7 / -4.2)

There isn't a strong correlation with income on this issue either. People cared a bit more at higher income levels, but there was also a wider margin of error due to the small sample size.

vote People earning $0-24K (142) People earning $25-49K (677) People earning $50-74K (316) People earning $75-99K (75) People earning $100-149K People earning $150K+
no, results should be objective 72.0% (+7.8 / -9.4) 76.8% (+3.7 / -4.1) 68.7% (+6.1 / -6.8) 83.1% (+6.9 / -10.2) Insufficient data Insufficient data
yes, it is their search results 28.0% (+9.4 / -7.8) 23.2% (+4.1 / -3.7) 31.3% (+6.8 / -6.1) 16.9% (+10.2 / -6.9) Insufficient data Insufficient data

Jim Boykin Interview

Apr 4th

Internet Marketing Ninja Jim Boykin has promoted link building since before I even knew what SEO was. Nearly a decade later so many things have changed in SEO (including renaming We Build Pages to Internet Marketing Ninjas), but he still sees links as a key SEO driver (as do I). I recently interviewed him about links & the changing face of SEO.

So, links, links, links... these were the backbone of ranking in Google for years and years. Are they still? Is social a huge signal, or something that has been over-hyped?

Yes, I do see backlinks as the backbone of rankings in Google. Every day I see sites that trump the rankings with links and no social signals...but I've never seen a site that had "poor" backlinks compared to others, but a strong social signal, be ranked great.

There are other signals that I feel are more important than social, like content and user behavior, but then after those, I'd put social signals. Even though I don't think they're more important than links by any stretch, I do feel that social has a place, in areas like branding, community building, and in assisting organic search results. I always recommend that people have a strong social presence, even if only for sending additional signals to Google to assist in higher rankings.

Google recently nailed a bunch of lower quality bulk link networks. Were you surprised these lasted as long as they did? Was the fact that they worked at all an indication of the sustained importance of links?

Well...surprised...no... filtering out networks is something that's always going to happen....once something gets too big, or too popular, or too talked about...then it's in danger of being burned... the popular "short cuts" of today are the popular penalized networks of tomorrow... there will always be someone who will create a network (of other sites they control, or their new friends control, or of near-expired domains, or blogger groups, etc. etc.) and that someone will start selling links, and advertising, and it will catch on, and they will sell to everyone and it will become so interconnected that it will cause its own algorithmic penalty, or it will get popular, and get the eyes of Google on it, and then it will get filtered, or there will be exact match penalties, or entire site penalties.

If that's the game you play, just understand the risks...or, don't play that game and give other reasons for people to link to you, and get permanent non-paid links, but that takes a lot of time and effort and marketing. That's the price you have to pay...because, yes, rankings in Google still comes down to #1, Links.

After such networks get hit, how hard is it for such sites to recover? Does it create a "flight to quality" impact on link building? Are many of them better off starting from scratch rather than trying to recover the sites?

We've worked with several people who have come to us after being penalized by Google to some degree (either a phrase-based penalty or an entire-site penalty). Probably the low-budget people who got hit just started other sites and tossed their penalized site, but most of the people who come to us can't afford to toss their branded site away.

In almost all of those cases it takes someone removing all the paid and un-natural links that they can. They must understand then that their days of buying links are Over, and they Must create great things on their site that gets natural links....and they must forever give up the chase of being #1 for the big short tail phrases..unless you own the exact .com, or your brand name includes that phrase...In order to recover, they must purge the backlinks of the paid links and the networks, do a reinclusion request, and then start doing "natural things", and then wait and wait and wait...90 days is typical...it's the one Google gave to themselves after you pointed out that Google themselves were buying blog links.

Over time it has become easier to hit various trip wires when link building. You mentioned some penalties being phrase-based or entire-site & so on...how does a person determine the difference between these? Some of Google's automated penalties and manual penalties have quite similar footprints; are there easy ways to tell which is which?

A phrase-based penalty works like this...let's say you've been targeting "green widgets" and "red widgets" for years...you have lots of backlinks with those exact anchor texts...and you were in the top 10 for both phrases...then one day, you rank somewhere on page 3 or beyond for those phrases.. you may still rank #3 for "cheap red widgets" or #7 for "widgets green" (reversed phrases)...but for the few exact phrases...it's page 3+ of the SERPs for you....nothing else changes, just those exact phrases.. on the other hand, a sitewide penalty is where pretty much nothing ranks on page 1 or page 2 of the SERPs, when the prior day you had lots of keywords ranking in there. I have no way of knowing which were automatic and which were hand done....sometimes I have a feeling in my gut...but it doesn't really matter...the solution is always the same...clean up the backlinks, and change your methods.
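As a toy illustration, the diagnostic Jim describes can be sketched as a heuristic over before/after rank-tracking snapshots. Everything here (function name, thresholds, sample data) is hypothetical and not Jim's actual tooling: a handful of exact phrases tanking while their variants hold looks phrase-based, while nearly everything falling off pages 1-2 at once looks sitewide.

```python
def classify_penalty(before, after, drop_threshold=20, sitewide_share=0.8):
    """Rough heuristic: label a ranking collapse as 'phrase-based' or
    'sitewide' from rank snapshots taken before/after the drop.

    before/after: dicts mapping keyword -> Google rank (1 = top result).
    Thresholds are illustrative guesses, not anything Google publishes.
    """
    # A keyword "dropped" if its rank got worse by drop_threshold or more
    # (missing keywords are treated as rank 100, i.e. off the radar).
    dropped = [kw for kw in before
               if after.get(kw, 100) - before[kw] >= drop_threshold]
    if not dropped:
        return "no penalty detected"
    share = len(dropped) / len(before)
    # Most tracked phrases cratering at once looks sitewide; a few exact
    # phrases cratering while the rest hold looks phrase-based.
    return "sitewide" if share >= sitewide_share else "phrase-based"

before = {"green widgets": 4, "red widgets": 7,
          "cheap red widgets": 3, "widgets green": 7}
after  = {"green widgets": 34, "red widgets": 41,
          "cheap red widgets": 3, "widgets green": 7}
print(classify_penalty(before, after))  # phrase-based: only exact phrases fell
```

In this sample only the two exact-match phrases dropped from page 1 to page 3+, matching the phrase-based pattern in the answer above.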

Earlier you mentioned foregoing the head phrase, in spite of things like Google Instant guiding searchers down a path, is there still plenty of tail to be had? Are tail keywords significantly under-rated by the market? How does one square going after tail keywords with algorithms like Panda?

I'm a big believer in the long tail. When we analyze content on a site we tend to grab ranking data from ahrefs for the client, as well as for several of their competitors, and we end up merging all the phrases and showing the search volume and the average cost per click for each phrase...we can always find a huge long tail, even if the clients site currently doesn't have that content (they have to add new original content), there is always a huge long tail to be had.

In 98% of the cases, there are no one, two, or three main phrases that account for more than 2% of the total potential search traffic. Even with a site's existing content and existing traffic, the short tail tends not to be more than 5% of traffic for any sites I've been seeing. We often find that a site may have 5,000 pages, but only 500 pages on that site are of value, ranking for anything that has a decent search volume and a decent CPC worth in Google. If you look at those 500 URLs, and you optimize each URL for say 5 phrases on average, then you're looking at 2,500 phrases...of those, 50 phrases might be the short tail, and 2,450 I would consider the middle tail. If you also add words like "shop" "store" "online" "sale" "cheap" "discount" etc. to all those pages, you'll pick up tons more phrases. And from there, the more original content you can add, the more long tail you can get....but...be careful...no one wants a site to be hit by a Google Panda update...make sure the content is original, of value, and that it's of use to the viewers of the page.
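The modifier expansion described above can be sketched in a few lines. The modifier list comes straight from the answer; the prefix/suffix combination logic is my guess at how such a tool might work, not Jim's actual implementation:

```python
from itertools import product

def expand_keywords(core_phrases, modifiers):
    """Generate middle/long-tail variants by prefixing and suffixing
    modifier words onto core phrases (a sketch of the expansion
    described in the interview, not an actual Ninjas tool)."""
    variants = set(core_phrases)  # keep the originals too
    for phrase, mod in product(core_phrases, modifiers):
        variants.add(f"{mod} {phrase}")   # e.g. "cheap red widgets"
        variants.add(f"{phrase} {mod}")   # e.g. "red widgets sale"
    return sorted(variants)

cores = ["red widgets", "green widgets"]
mods = ["shop", "store", "online", "sale", "cheap", "discount"]
tail = expand_keywords(cores, mods)
print(len(tail))  # 2 cores + 2 phrases x 6 modifiers x 2 positions = 26
```

Scaled up to 500 pages at 5 phrases each, the same combinatorics is where the 2,500-phrase middle tail in the answer comes from.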

When going after head or tail keywords...with one or the other do you feel that link quality is more important than link quantity?

Link quality always trumps. Otherwise, I'd buy those 10,000 backlinks for $100 packages that I see in Google AdWords... and my job would be a lot easier :)

With Google it is getting easier to hit tripwires with anchor text or building links too fast, does this also play into the bias toward quality & away from quantity?

I think it is easier to hit tripwires...but it's nice that Google sent out 700,000 "be careful" emails a few weeks ago... those were automatic....I think the "over optimization update" that Google has been speaking of will trip a lot of wires and people will have to mimic the natural web more and not focus on exact short tail phrases.

Those scammy AdWords ads promising link riches for nothing in part shape the perception of the cost & value of links. How do you get prospective clients to see & appreciate the value of higher quality links (while in some cases some of them will be competing with some of the bulk stuff that ranks today & is gone tomorrow)?

Well, luckily I'm not in sales calls anymore so I don't have to do the convincing :) but I'd say that if you can get links that you just can't buy (ie, a link from NASA.gov or harvard.edu/library/) then they're priceless. Each update Google will filter out some of the links from sites that it feels are artificial. If you can build things that stick and stand the test of time, and if you don't need to be #1 tomorrow, and are willing to invest in the sites content and the sites future, then think long tail and long term. If you're all about today, then do what you have to do today, but those cheap links won't move you much anyways & you'll just have a spammy backlink profile.

Building quality links that last isn't particularly cheap or easy. Even harder to do it in volume. What has allowed/enabled you to succeed where so many others have failed on this front? Is it that you care more about the client's well being, or is it that you have to tie together a bunch of clever bits to make it all back out?

Well, I have an army here....nearly 100 ninjas..the biggest group is the link builders, so I have a lot of link ninjas. We also have a lot of tools...tools that suggest the things we should write that have the highest probability of getting trusted backlinks, and we have a content team that knows how to write to get links from professors and orgs and government agencies, etc.

We have tools that help us to know who to write to after we've written the content..and we have tools that help us send out a lot of personal emails...between the tools and the people and the content, we manage to make it work. If we had to do all the work by hand, and by human guesses, it would never work, but with the tools (and human intervention along the way), we're able to get the links and scale it, while keeping the high quality.

When you talk about getting quality links that are priceless, those have that sort of value precisely because they are so hard to get. How big of a role does content play in the process? Is this something anyone can do?

Content is Key to getting links. There are different types of links....there's the low-hanging fruit..then there's the fruit that's way on the top of the tree....the things that tend to be harder to get also tend to be the most valued and the most trusted. If I wrote to a college professor at Harvard and said, "Hey, Professor Bob, I just wrote a great paper on "The History Of Widgets", you should add it to your article in the Harvard library" then if the article isn't Great, they'll never link to it. It starts with a great idea that morphs into great content, and then we promote it to those we're targeting. Anyone can write this content, guess at what a gov page or a college professor would link to...see what they currently link out to...write them a personalized email...and with enough emails, you can get the links if your content is good enough. It's a long slow process, but anyone can do it. Thank goodness I have tools that make that process much easier and more accurate at getting links.

You mentioned thinking long term, how long does it usually take to start seeing results from quality link building? Do you ever work on new sites, or do you mostly try to work on older websites that tend to respond quicker? Also have you noticed newer sites being able to rank much quicker if they do a quality-first approach to link building?

With getting the trusted links we tend to see an increase in traffic during the first 3 months. I do the 3 month review phone calls here, and my goal is to show them the ROI via overall rankings increase of the long tail, and an increase in google's organic traffic. Sites tend to see much better increases in these if they also follow our internal linking strategies, and our on page optimization strategies. If someone does link building, on page optimization, and internal linking, after 3 months there's almost no way someone can not increase the traffic to their site.

----

Thanks Jim!

Jim Boykin is the founder and CEO of Internet Marketing Ninjas (formerly We Build Pages, since 1999). Jim's team of marketing ninjas offers a full range of internet marketing services, including link building services and social media branding, and they employ an in-house team of website designers. Follow Jim and the Ninjas on their blog, Facebook, Google, Twitter, Foursquare, and LinkedIn.

Patience is a Virtue

Apr 3rd

Sorry I haven't blogged as much lately, but one of our employees recently had a child, and Google sending out so many warning messages in Webmaster Central has created a ton of demand for independent SEO advice. Our growth in demand last month was higher than in any month outside of the period a few years ago when we announced we would be raising prices and got so many new subscribers that I had to close down the ability to sign up for about 3 or 4 months.

Google has been firing on all cylinders this year. They did have a few snafus in the press, but those didn't have any noticeable impact on user perception or behavior & Google recently rolled out yet another billion-dollar business in their consumer surveys.

Google is doing an excellent job of adding friction to SEO & managing its perception to make it appear less stable, less trustworthy and to discourage investment in SEO. They send out warnings for unnatural links, warnings for traffic drops, and even warnings for traffic increases.

Webmaster Tools is a bit of a strange bird...

  • Any SEO consultant who has client sites tied into their own Webmaster Tools account makes it easy for Google to connect those clients together (making any black swan editorial decision far riskier).
  • Any SEO company which has clients sign up for their own Webmaster Tools accounts now has to deal with explaining why things change, when many of the changes that happen are driven more by algorithmic shifts (adding local results to the SERPs or taking them away, other forms of localization, changing of ad placement on the SERP, etc.) than by the work of the SEO. This in turn adds costs to managing SEO projects while also making them seem less stable (even outside of those who used paid link networks). Think through the sequence...
    • Google first sends a warning for traffic going up, and the SEO tells the client that this is because they did such a great job with SEO.
    • Then Google sends a warning for traffic dropping & the client worries that something is wrong.
    • The net impact on actual traffic or conversions could be zero, but the warnings amplify the perception of change.
  • Any SEO who doesn't use Webmaster Tools loses search referral data. It first started with logged in Google users, but apparently it is also headed to Firefox. Who's to say Google Chrome & Safari won't follow Firefox at some point?

Google has changed & obfuscated so many things that it is very hard to isolate cause and effect. They have made changes to how much data you get, changes to their analytics interface & how they report unique visitors, changes to how tightly they filter certain link behaviors, they have rolled in frequent Panda updates, and they have nailed a number of the paid link networks.

BuildMyRank shut down after leaving a self-destructive footprint that made it easy for Google to nuke their network, and some of the remaining paid link networks are getting nailed. Some of their customers are at this point driven primarily by fear, counting down their remaining days as the sky is falling. Fear is an important emotion designed to protect us, but when it is a primary driver we risk self-destruction.

The big winners in these moves by Google are:

  • Google, since they grant themselves more editorial leeway. If everyone is a scofflaw then they can hit just about anyone they want. And the organic search results are going to be far easier to police if many market participants are held back by a fear tax.
  • Larger businesses which are harder to justify hitting & which can buy out smaller businesses at lower multiples based on the perception of fear.
  • Sites which were outranked by people using the obvious paid links, which now rank a bit better after some of those paid link buyers were removed from the search results.
  • SEOs who out others & market themselves by using polarizing commentary (at least in the short run, whereas in the long run that may backfire).
  • Those engaging in negative SEO, who sell services to smoke competitors.

The big losers from these Google moves are:

  • some of the paid link networks & those who used them for years
  • under-priced SEO service providers who were only able to make the model work by scaling up on risk
  • smaller businesses who are not particularly spammy, but are so paralyzed by fear that they won't put in enough effort & investment to compete in the marketplace

One of the reasons I haven't advocated using the paid link networks is I was afraid of putting the associated keywords into a hopper of automated competition that I would then have to compete against year after year. Even if you usually win, over the course of years you can still lose a lot of money by promoting the creation of disposable, automated & scalable competing sites. If you don't mind projects getting hit & starting over the ROI on such efforts might work out, but after so many years in the industry the idea of starting over again and again as sites get hit is less appealing.

It is not just that the links are not trusted, but now they stand a far greater chance of causing penalties:

Dear site owner or webmaster of ….

We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.

Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.

We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.

If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.

If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

Sincerely,

Google Search Quality Team

If that doesn't change then negative SEO will become a bigger issue than paid links ever were.

What is hard about Google penalizing websites for such links is that it is cheap & easy for someone else to set you up. Shortly after Dan Thies mentioned that it was "about time" to Matt Cutts on Twitter someone started throwing some of the splog links at his site. It is safe to say that Dan didn't build those links, but there are many people who will be in the same situation as Dan who did nothing wrong but had a competitor set them up.

And there is no easy way to disconnect your site from those types of links.

If you go back a few years, it was quite easy to win at SEO by doing it in a "paint by number" fashion. One rarely got hit unless they were exceptionally excessive and stuck out like a sore thumb.

But after all of Google's recent moves, a few missed steps in a drunken stupor can have the same result.

Now more than ever, patience is a virtue!

Review of Jim Boykin's Free Broken Link Tool

Mar 21st

Jim Boykin recently released a free but powerful tool that can help you check for broken links and redirects, in addition to helping you generate a Google Sitemap.

Being a free, web-based tool you might think it's a bit lightweight but you'd be wrong :) It can crawl up to 10,000 internal pages, up to 5 runs per day per user.

In addition to the features mentioned above, the tool offers other helpful data points as well as the ability to export the data to CSV/Excel, HTML, and the ability to generate a Google XML Sitemap.
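At its core, a crawler like this just fetches pages, resolves the links it finds, and records HTTP status codes. The Ninja tool's actual implementation isn't public, so the following is only a minimal stdlib-only Python sketch of the idea (all names here are my own, hypothetical ones):

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href target of every anchor tag, resolved
    against the URL of the page being parsed."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative links against the current page
                self.links.append(urljoin(self.base_url, href))

def check_links(start_url, max_pages=100):
    """Breadth-first crawl of internal pages, recording each
    crawled URL's HTTP status (None = connection failure)."""
    host = urlparse(start_url).netloc
    queue, seen, statuses = [start_url], {start_url}, {}
    while queue and len(statuses) < max_pages:
        url = queue.pop(0)
        try:
            resp = urlopen(url, timeout=10)
            statuses[url] = resp.status
            body = resp.read().decode("utf-8", errors="replace")
        except HTTPError as err:
            statuses[url] = err.code   # 404, 500, ... = broken page
            continue
        except URLError:
            statuses[url] = None       # DNS failure, timeout, etc.
            continue
        parser = LinkExtractor(url)
        parser.feed(body)
        for link in parser.links:
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return statuses
```

Any URL whose recorded status is 404/500 (or None) is a broken internal page; a real tool would also issue requests for external links, throttle itself, and respect robots.txt.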

The other data points available to you are:

  • URL of the page spidered
  • Link to an On-Page SEO report for that URL
  • Link depth from the home page
  • HTTP status code
  • Internal links to the page (with the ability to get a report off the in-links themselves)
  • External links on the page (a one-click report is available to see the outlinks)
  • Overall size of the page with a link to the Google page speed tool (cool!)
  • Link to their Image check tool (size, alt text, header check of the page)
  • Rows for Title Tag, Meta Description, and Meta Keywords
  • Canonical tag field

Using the Tool

The tool is really easy to use, just enter the domain, the crawl depth, and your email if you don't care to watch the magic happen live :)

For larger crawls entering your email makes a lot of sense as it can take a bit on big crawls:

Click Ninja Check and off you go!

Working With The Data

The top of the results page auto-updates and shows you:

  • Status of the report
  • Internal pages crawled
  • External links found
  • Internal redirects found
  • External redirects found
  • Internal and External errors

When you click any of the yellow text links you are brought to that specific report table (these tables sit below the main results shown below).

This is also where you can export the XML sitemap and download results to Excel/HTML.
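The sitemap side of that export is simple to reproduce. Here is a sketch using Python's standard library that emits minimal sitemap-protocol XML (my own illustration, not the tool's code):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Serialize crawled URLs into a minimal XML sitemap
    (sitemap protocol 0.9: <urlset>/<url>/<loc> only)."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")
```

A production sitemap would typically also carry `<lastmod>` and stay under the protocol's 50,000-URL-per-file limit, splitting into a sitemap index beyond that.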

The results pane (broken up into 2 images given the horizontal length of the table) looks like:

More to the right is:

The On Page Report

If you click on the On Page Report link in the first table you are brought to their free On-Page Optimization Analysis tool. Enter the URL and 5 targeted phrases:

Their tool does the following:

  • Metadata tool: Displays text in title tags and meta elements
  • Keyword density tool: Reveals statistics for linked and unlinked content
  • Keyword optimization tool: Shows the number of words used in the content, including anchor text of internal and external links
  • Link Accounting tool: Displays the number and types of links used
  • Header check tool: Shows HTTP Status Response codes for links
  • Source code tool: Provides quick access to on-page HTML source code

The data is presented in the same table form as the original crawl. This first section shows the selected domain and keywords in addition to on-page items like your title tag, meta description, meta keywords, external links on the page, and words on the page (linked and non-linked text).

You can also see the density of all words on the page in addition to the density of words that are not links, on the page.
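The linked vs. non-linked split is easy to approximate with the standard library's HTML parser. A sketch (my own hypothetical names, not the actual tool's code) that separates anchor-text words from the rest and computes a keyword density:

```python
from html.parser import HTMLParser

class WordCounter(HTMLParser):
    """Splits page text into words that appear inside anchor
    tags ("linked") and words that do not ("non-linked")."""
    def __init__(self):
        super().__init__()
        self.in_link = 0
        self.linked, self.unlinked = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1

    def handle_data(self, data):
        words = data.lower().split()
        (self.linked if self.in_link else self.unlinked).extend(words)

def density(counter, word):
    """Occurrences of `word` as a share of all words on the page."""
    total = counter.linked + counter.unlinked
    return total.count(word) / len(total) if total else 0.0
```

Running `density` over just `counter.unlinked` gives the "words that are not links" figure the report shows alongside the overall density.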

Next up is a word breakdown as well as the internal links on the page (with titles, link text, and response codes).

The word cloud displays targeted keywords in red, linked words underlined, and non-linked words as regular text.

You'll see a total word count, non-linked word count, linked word count, and total unique words on the page.

This can be helpful in digging into deep on-page optimization factors as well as your internal link layout on a per page basis:

Next, you'll get a nice breakdown of internal links and the text of those links, the titles, and the words in the url.

Also, you can see any links to sub-domains as well as external links (with anchor text and response codes):

Each section has a show/hide option where you can see all the data or just a snippet.

Another report you get access to is the image checker (accessible from the main report "Check Image Info" option):

Here you'll get a report that shows a breakdown of the files and redirects on the page in addition to the image link, image dimensions, file size, alt text, and a spot to click to view the image:

After that section is the link section, which shows the actual link, the file type (html, css, etc.), status code, and a link check (broken, redirect, ok, and so on).

Additional Reports

The main report referenced at the beginning of this post is the Internal Page Report. There are five additional reports:

  • External Links
  • Internal Redirects
  • External Redirects
  • Internal Errors
  • External Errors

External Links

This report will show you:

  • HTTP Status
  • Internal links to the external link
  • Actual link URL
  • Link anchor text
  • Where the link was first found on the domain

Internal and External Redirects

  • HTTP Status
  • Internal links to the external link
  • Actual link URL
  • Link anchor text
  • Page the URL redirects to

Internal and External Errors

  • HTTP Status
  • Internal links to the external link
  • Actual link URL
  • Link anchor text

Give it a Spin

    It's free but more importantly it's quite useful. I find a lot of value in this tool in a variety of ways but mostly with the ability to hone in on your (or your competitor's) internal site and linking structure.

    There are certainly a few on-page tools on the market but I found this tool easy to use and full of helpful information, especially with internal structure and link data.

    Try it. :)

    Google Instant Answers: Rich Snippets & Poor Webmasters

    This is a pretty powerful & instructive image in terms of "where search is headed."

    It's a Yahoo! Directory page that was ranking in the Google search results on a Google Android mobile device.

    Note the following:

    • the page is hosted on Google.com
    • the page disclaims that it is not endorsed by Google
    • the page embeds a Google search box
    • the page strips out the Yahoo! Directory search box
    • the page strips out the Yahoo! Directory PPC ads (on the categories which have them)
    • the page strips out the Yahoo! Directory logo
    Recall that when Google ran their bogus sting operation on Bing, Google engineers suggested that Bing was out of line for using user clickstreams to potentially influence their search results. That level of outrage & the smear PR campaign look ridiculous when compared against Google's behavior toward the Yahoo! Directory, which is orders of magnitude worse:

     

    • Editorial: Bing uses user experience across a wide range of search engines to potentially impact a limited number of search queries in a minor way. Google shags expensive hand-created editorial content wholesale & hosts it on Google.com.
    • Hosting: Bing hosts Bing search results using Bing snippets. Google hosts Yahoo! Directory results using Yahoo! Directory listing content & keeps all the user data.
    • Attribution: Bing publicly claimed for years to be using a user-driven search signal based on query streams. Google removes the Yahoo! Directory logo when formatting the page. Does Google remove the Google logo from Google.com when formatting for mobile? Nope.
    • Ads: Bing sells their own ads & is not scraping Google content wholesale. Google scrapes Yahoo! Directory content wholesale & strips out the sidebar CPC ads.
    • Search box: Bing puts their own search box on their own website. Google puts their own search box on the content of the Yahoo! Directory.
    • User behavior: Google claimed that Bing was using "their data" when tracking end user behavior. Google hosts the Yahoo! Directory page, allowing themselves to fully track user behavior, while robbing Yahoo! of the opportunity to even see how users interact with their own listings.

     

    In the above case the publisher absorbs 100% of the editorial cost & Google absorbs nearly 100% of the benefit (while disclaiming they do not endorse the page they host, wrap in their own search ad, and track user behavior on).

    As we move into a search market where the search engines give you a slightly larger listing for marking up your pages with rich snippets, you will see a short term 10% or 20% lift in traffic followed by a 50% or more decline when Google enters your market with "instant answers."

    The ads remain up top & the organic results get pushed down. It isn't scraping if they get 10 or 20 competitors to do it & then use the aggregate data to launch a competing service ... talk to the bankrupt Yellow Pages companies & ask them how Google has helped to build their businesses.

    Update: looks like this has been around for a while...though when I spoke to numerous friends nobody had ever seen it before. The only reason I came across it was seeing a referrer through a new page type from Google & not knowing what the heck it was. Clearly this search option doesn't get much traffic because Google even removes their own ads from their own search results. I am glad to know this isn't something that is widespread, though still surprised it exists at all given that it effectively removes monetization from the publisher & takes the content wholesale and re-publishes it across domain names.

    Interview of Jonah Stein

    Mar 16th

    I was recently chatting with Jonah Stein about Panda & we decided it probably made sense to do a full on interview.

    You mentioned that you had a couple customers that were hit by Panda. What sort of impact did that have on those websites?

    Both of these sites saw an immediate hit of about 35% of their Google traffic. Rankings dropped 3-7 spots. The traffic hit was across the board, especially in the case of GreatSchools, who saw all content types hit (school profile pages, editorial content, UGC).

    GreatSchools was hit in the 4/9 (Panda 2.0) update and called out in the Sistrix analysis.

    How hard has GreatSchools been hit? Sistrix data suggested that GreatSchools was losing about 56% of Google traffic. The real answer is that organic Google-referred traffic to the site fell 30% on April 11 (week over week) and overall site entries are down 16%. Total page views are down 13%. The penalty, of course, is a “site wide” penalty but not all entry page types are being affected equally

    Google suggested that there were perhaps some false positives but that they were generally pretty satisfied with the algorithms. For sites that were hit, how did clients respond to their SEOs? I mean, did the SEO get a lot of the blame, or did the clients get that the change was sort of a massive black swan?

    I think I actually took it harder than they did. Sure, it hit their bottom line pretty hard, but it hit my ego. Getting paid is important but the real rush for me is ranking #1.

    Fortunately none of my clients think they are inherently entitled to Google traffic, so I didn't get blamed. They were happy that I was on top of it (telling them before they noticed) and primarily wanted to know what Panda was about.

    Once you get over the initial shock and the grieving, responding to Panda was a Rorschach test; everyone saw something different. But it is also an interesting exercise in self-reflection, especially when the initial advice coming from Greg Boser and a few others was to start to de-index content.

    For clients who are not ad driven, the other interesting aspect is that generally speaking conversions were not hurt as much as traffic, so once you start focusing on the bottom line you discover the pain is a little less severe than it seemed initially.

    So you mentioned that not all pages were impacted equally. I think pages where there was more competition were generally hit harder than pages that had less competition. Is that sort of in line with what you saw?

    Originally I thought that was maybe the case, but as I looked at the data during the recovery process I became convinced that Panda is really the public face of a much deeper shift towards user engagement. While the Panda score is sitewide, the engagement "penalty" or weighting effect also occurs at the individual page level. The pages or content areas that were hurt less by Panda seem to be the ones that were not also being hurt by the engagement issue.

    On one of my clients we moved a couple sections to sub-domains, following the HubPages example and the experience of some members of your community. The interesting thing is that we moved the blog from /blog to blog.domain.com and we moved one vertical niche from /vertical-kw to vertical-kw.example.com. The vertical almost immediately recovered to pre-panda levels while the traffic to the blog stayed flat.

    So the vertical was suddenly getting 2x the traffic. On the next panda push the vertical dropped 20% but that was still a huge improvement over before we moved to the subdomain. The blog didn't budge.

    The primary domain also seemed to improve some, but it was hard to isolate that from the impact of all of the other changes, improvements and content consolidation we were doing.

    After the next panda data push did not kill the vertical sub domain, we elected to move a second one. On the next data push, everything recovered - a clean bill of health - no pandalization at all.

    but....

    GreatSchools completely recovered the same day and that was November 11th, so Panda 3.0. I cannot isolate the impact of any particular change versus Google tweaking the algorithm and I think both sites were potentially edge cases for Panda anyway.

    Now that we are in 3.3 or whatever the numbering calls it, I can say with confidence that moving "bad" content to a sub-domain carries the Panda score with it and you won't get any significant recovery.

    You mentioned Greg Boser suggesting deindexing & doing some consolidation. Outside of canonicalization, did you test doing massive deindexing (or were subdomains your main means of testing isolation)?

    We definitely collapsed a lot of content, mostly via 301s, but maybe 25% of it was just de-indexed. That was the first response. We took 1,150 categories/keyword-focused landing pages and reduced them to maybe 300. We did see some gains but nothing that resembled the huge boost when Panda was lifted.

    Back to the Rorschach test: we did a lot of improvements that yielded incremental gains but were still weighed down. It reminds me of when I used to work on cars. I had this old Audi 100 that was running poorly so I did a complete tune-up, new wires, plugs, etc., but it was still running badly. Then I noticed the jet in the carburetor was misaligned. As soon as I fixed that, boom, the car was running great. Everything else we fixed may have been the right thing to do for SEO and/or users but it didn't solve the problem we were experiencing.

    The other interesting thing is that I had a 3rd client who appeared to get hit by Panda or at least suffer from Panda like symptoms after their host went down for about 9 hours. Rankings tanked across the board, traffic down 50% for 10 days. They fully recovered on the next panda push. My theory is that this outage pushed their engagement metrics over the edge somehow. Of course, it may not have really been Panda at all but the ranking reports and traffic drops felt like Panda. The timing was after November 11th, so it was a more recent version of the Panda infrastructure.

    Panda 1.0 was clearly a rush job and 2.0 seemed to be a response to the issues it created and the fact that Demand Media got a free pass. I think it took 6-8 months for them to really get the infrastructure robust.

    My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.

    I don't know if it was intentional or not but engagement as a relevancy factor winds up punishing sites that have built links and traffic through link bait and infographics, because by definition those users have a very high bounce rate and a relatively low time on site. Look at the behavioral metrics in GA; if your content has 50% of people spending less than 10 seconds, that may be a problem or that may be normal. The key is to look below that top graph and see if you have a bell curve or if the next largest segment is the 11-30 second crowd.
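The distribution check Jonah describes can be scripted against exported visit durations. A minimal Python sketch (the bucket edges approximate GA's engagement report and should be treated as illustrative; `ga_time_buckets` is a hypothetical name):

```python
def ga_time_buckets(seconds_on_page):
    """Group visit durations (in whole seconds) into engagement
    buckets similar to Google Analytics' report, returning each
    bucket's share of all visits."""
    bins = [(0, 10), (11, 30), (31, 60), (61, 180), (181, 600), (601, 1800)]
    labels = ["0-10s", "11-30s", "31-60s", "1-3m", "3-10m", "10-30m"]
    counts = dict.fromkeys(labels, 0)
    for s in seconds_on_page:
        for (lo, hi), label in zip(bins, labels):
            if lo <= s <= hi:
                counts[label] += 1
                break
    n = len(seconds_on_page) or 1   # avoid division by zero
    return {label: counts[label] / n for label in labels}
```

If the 0-10s bucket holds half your visits, the next thing to check is whether the 11-30s bucket is the runner-up (the bell curve Jonah mentions) or whether engagement falls off a cliff.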

    I also think Panda is rewarding sites that have a diversified traffic stream. The higher percentage of your users who are coming direct or searching for you by name (brand) or visiting you from social the more likely Google is to see your content as high quality. Think of this from the engine's point of view instead of the site owner. Algorithmic relevancy was enough until we all learned to game that, then came links as a vote of trust. While everyone was looking at social and talking about likes as the new links they jumped ahead to the big data solution and baked an algorithm that tries to measure interaction of users as a whole with your site. The more time people spend on your site, the more ways they find it aside from organic search, the more they search for you by name, the more Google is confident you are a good site.

    Based on that, are there some sites that you think have absolutely no chance of recovery? In some cases did getting hit by Panda cause sites to show even worse user metrics? (There was a guy named walkman on WebmasterWorld who suggested that sites with a 'size 13 shoe out of stock' page might no longer rank for the head keywords but would still rank for the 'size 13' related queries.)

    I certainly think that if you have an IYP and you have been hit with Panda you're toast unless you find a way to get huge amounts of fresh content (like Yelp). I don't think the size 13 shoe site has a chance, but it is not about Panda. Google is about to roll out lots of semantic search changes and the only way ecommerce sites (outside of the 10 or so brands that dominate Google Products) will have a chance is with schema.org markup and Google's next generation search. The truth is the results for a search for shoes by size are a miserable experience at the moment. I wear size 16 EEEE, so I have a certain amount of expertise on this topic. :)

    Do you see Schema as a real chance for small players? Or something that is a short term carrot before they get beat with the stick? I look at hotel search results & I fear that spreading as more people format their content in a format that is easy to scrape & displace. (For illustration purposes, in the below image, the areas in red are clicks that Google is paid for or clicks to fraternal Google pages.)

    I doubt small players will be able to use Schema as a lifeline but it may keep you in the game long enough to transition into being a brand. The reason I have taken your advice about brands to heart and preach it to my clients is that it is short sighted to believe that any of the SEO niche strategies are going to survive if they are not supported with PR, social, PPC and display.

    More importantly, however, is that they are going to focus on meeting the needs of the user as opposed to simply converting them during that visit. To use a baseball analogy, we have spent 15 years keeping score of home runs while the companies that are winning the game have been tracking walks, singles, doubles and outs. Schema may deliver some short term opportunities for traffic but I don't think size13shoes.com will be saved by the magic of semantic markup.

    On the other hand, if I were running an ecommerce store, particularly if I were competing with Amazon, Best Buy, Walmart and the handful of giant brands that dominate the product listings in the SERP, I wouldn't bury my head in the sand and pretend that everyone else wasn't moving in that direction anyway. Maybe if you can do it right you can emerge as a winner, at least over the short and medium term.

    In that sense SEO is a moving target, where "best practices" depend on the timing in the marketplace, the site you are applying the strategy to, and the cost of implementation.

    Absolutely...but that is only half the story. If you are an entrepreneur who likes to build sites based on a monetization strategy, then it is a moving target where you always have to keep your eyes on the horizon. For most of my clients the name of the game is actually to focus on trying to own your keyword space and take advantage of inertia. That is to say that if you understand the keywords you want to target, develop a strategy for them and then go out and be a solid brand, you will eventually win. Most of my clients rank in the top couple of spots for the key terms for their industry with a fairly conservative slow and steady strategy, but I wouldn't accept a new client who comes to me and says they want to rank a new site #1 for credit cards or debt consolidation and they have $200,000 to spend... or even $2,000,000. We may be able to get there for the short term but not with strategies that will stand the test of time.

    Of course, as I illustrated with the Nuts.com example on SearchEngineLand last month, the same strategy that works on a 14 year old domain may not be as effective for a newer site, even if you 301 that old domain. SEO is an art, not a science. As practitioners we need to constantly be following the latest developments but the real skill is in knowing when to apply them and how much; even then occasionally the results are surprising, disappointing or both.

    I think there is a bit of a chicken vs egg problem there then if a company can't access a strong SEO without already having both significant capital & a bit of traction in the marketplace. As Google keeps making SEO more complex & more expensive do you think that will drive a lot of small players out of the market?

    I think it has already happened. It isn't about the inability to access a strong SEO; it is that anyone with integrity is going to lay out the obstacles they face. Time and time again we see opportunity for creativity to triumph, but the odds are really stacked against you if you are an underfunded retailer.

    Just last year I helped a client with 450 domains who had been hit with Panda and then with a landing page penalty. It took a few months to sort out and get the reconsideration granted (by instituting cross-domain rel=canonical and eliminating all the duplicate content across their network). They are gradually recovering to maybe 80% of where they were before Panda 2.0, but I can't provide them an organic link building strategy that will lift 450 niche ecommerce sites. I can't tell them how they are going to get any placement in a shrinking organic SERP dominated by Google's dogfood, shopping results from big box retailers, and enormous AdWords Product Listings with images.

    From that perspective, if your funding is limited, do you think you are better off attacking a market from an editorial perspective & bolting on commerce after you build momentum (rather than starting with ecommerce and then trying to bolt on editorial)?

    Absolutely. Clearly the path is to have built Pinterest, but seriously...

    If you are passionate about something or have a disruptive idea you will succeed (or maybe fail), but if you think you can copy what others are doing and carve out a niche based on exploits, I disagree. Of course, autoinsurancequoteeasy.com seems to be saying you can still make a ton of money in the quick flip world; even with a big bank roll, you need to be disruptive or innovative.

    On the other hand, if you have some success in your niche you can use creativity to grow, but it has to be something new. Widget bait launched @oatmeal's online dating site but it is more likely to bury you now than help you rank #1, or at least prevent you from ranking on the matching anchor text.

    When a company starts off small & editorially focused how do you know that it is time to scale up on monetization? Like if I had a successful 200 page site & wanted to add a 20,000 page database to it...would you advise against that, or how would you suggest doing that in a post-Panda world?

    This is a tough call. I actually have a client in exactly this position. I guess it depends on the nature of the 20,000 pages. If you are running a niche directory (like my client), my advice to them was to add the pages to the site but noindex the individual listings until they can get some unique content. This is still likely to run afoul of the engagement issue presented by Panda, so we kept the expanded pages on geo-oriented sub-domains.

    Earlier you mentioned that Panda challenged some of your assumptions. Could you describe how it changed your views on search?

    I always tell prospects that 10-15 years ago my job was to trick search engines into delivering traffic but over the last 5-6 years it has evolved and now my job is to trick clients into developing content that users want. Panda just changed the definition of "good content" from relevant, well linked content to relevant, well linked, sticky content.

    It has also made me more of a believer in diversifying traffic.

    Last year Google made a huge stink about Bing "stealing" results because they were sniffing traffic streams and crawling queries on Google. The truth is that Google has so many data sources and so many signals to analyze that they don't need to crawl Facebook or index links on Twitter. They know where traffic is coming from and where it is going, and if you are getting traffic from social, they know it.

    As Google folds more data into their mix do you worry that SEO will one day become too complex to analyze (or move the needle)? Would that push SEOs to mostly work in house at bigger companies, or would being an SEO become more akin to being a public relations & media relations expert?

    I think it may already be too complex to analyze in the sense that it is almost impossible to get repeatable results for every client or tell them how much traffic they are going to achieve. On the other hand, moving the needle is still reasonably easy—as long as you are in agreement about what direction everyone is going. SEO for me is about website optimization: asking everyone about the search intent of the queries that bring visitors to the site and making sure we have actions that match that intent. Most of my engagements wind up being a combination of technical SEO/problem solving, analytics, strategy and company-wide (or at least team-wide) education. All of these elements are driven by keyword research and are geared towards delivering traffic, so it is an SEO-based methodology, but the requirements for the job have morphed.

    As for moving in house, I have been there and I doubt I will ever go back. Likewise, I am not really a PR or media relations expert but if the client doesn't have those skills in house I strongly suggest they invest in getting them.

    Ironically, many companies still fail to get the basics right. They don't empower their team, they don't leverage their real world relationships and most importantly they don't invest enough in developing high quality content. Writing sales copy is not something you should outsource to college students!

    It still amazes me how hard it is to get content from clients and how often this task is delegated to whoever is at the bottom of the org chart. Changing a few words on a page can pay huge dividends but the highest paid people in the room are rarely involved enough.

    In the enterprise, SEO success is largely driven by getting everyone on board. Being a successful SEO consultant (as opposed to running your own sites) is actually one quarter about being a subject matter expert on everything related to Google, one quarter about social, PR, link building, conversion, etc. and half about being a project manager. You need to get buy-in from all the stakeholders, strive to educate the whole team and hit deliverables.

    Given the increased complexity of SEO (in needing to understand user intent, fixing a variety of symptoms to dig to the core of a problem, understanding web analytics data, faster algorithm changes, etc.) is there still a sweet spot for independent consultants who do not want to get bogged down by those who won't fully take on their advice? And what are some of your best strategies for building buy-in from various stakeholders at larger companies?

    The key is to charge enough and to work on a monthly retainer instead of hourly. This sounds flippant but the bottom line is to balance how many engagements you can manage at one time versus how much you want to earn every month. You can't do justice to the needs of a client and bill hourly. That creates an artificial barrier between you and their team. All of my clients know I am always available to answer any SEO related question from anyone on the team at almost any time.

    The increased complexity is really job security. Most of my clients are long term relationships and the ones I enjoy the most are more or less permanent partnerships. We have been very successful together and they value having me around for strategic advice, to keep them abreast of changes and to be available when changes happen. Both of the clients who got hit by Panda have been with me for more than four years.

    No one can be an expert in everything. I definitely enjoy analytics and data, but I have very strong partnerships with a few other agencies that I bring in when I need them. I am very happy with the work that AnalyticsPros has done for my clients. Likewise, David Rodnitzky (PPC Associates) and I have partnered on a number of clients. Both allow me to be involved in the strategy and know that the execution will be very high quality. I only wish I had some link builders I felt as passionate about (given that Debra Mastaler is always too busy to take my clients).

    You mentioned that you thought user engagement metrics were a big part of Panda based on analytics data & such...how would a person look through analytics data to uncover such trends?

    I would focus on the behavioral metrics tab in GA. It is pretty normal to have a large percentage of visitors leave before 10 seconds, but after that you should see a bell curve. Low quality content will actually have 60-70% abandonment in less than 10 seconds, but the trick is that for some searches 10 seconds is a good result: weather, what is your address, hours of operation. Lots of users get what they need from searches, sometimes even from the SERP, so look for outliers. Compare different sections of your site, say the blog or those infographics, against your weaker page types.
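    The outlier hunt described above can be sketched in a few lines of Python, assuming you have exported per-visit time-on-page data (in seconds) from your analytics tool. The 10-second cutoff and 60% threshold mirror the rough rules of thumb above; the function names and the sample data are hypothetical, and this is not a GA API call.

```python
# Sketch: flag pages whose engagement looks Panda-risky, given a dict
# mapping each URL to a list of visit durations in seconds.

def abandonment_rate(visit_durations, cutoff=10):
    """Fraction of visits that leave before `cutoff` seconds."""
    if not visit_durations:
        return 0.0
    quick_exits = sum(1 for d in visit_durations if d < cutoff)
    return quick_exits / len(visit_durations)

def flag_risky_pages(pages, threshold=0.6):
    """Return URLs whose sub-10-second abandonment exceeds the threshold,
    worst offenders first."""
    risky = [url for url, durations in pages.items()
             if abandonment_rate(durations) > threshold]
    return sorted(risky, key=lambda url: -abandonment_rate(pages[url]))

# Made-up example: blog content holds visitors, a thin listing does not.
pages = {
    "/blog/how-to": [45, 120, 8, 90, 60],
    "/listing/123": [3, 5, 2, 40, 6, 4, 7, 9],
}
print(flag_risky_pages(pages))  # ['/listing/123']
```

    The same comparison-across-sections idea falls out naturally: group URLs by site section before flagging, and an entire weak page type will surface at once.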

    It's hard to say until you get your hands in the data, but if you assume that individual pages can be weighed down by poor engagement, and that this trend is maybe a year old and evolving, you can find some clues. Learn to use those advanced segments and build out meaningful segmentation on your dashboard, and you will be surprised how much of this jumps out at you. It is like over-optimization: until you believe in it you never notice it, and then you can spot it within a few seconds of looking at a page. I won't pretend engagement issues jump out that fast, but it is possible to find them, especially if you are an in-house SEO who really knows your site.

    The other important consideration is that improving engagement for a given page is a win regardless of whether it impacts your rankings or your Panda situation. The mantra about doing what is right for the users, not the search engine, may sound cliche, but the reality is that most of your decisions and priorities should be driven by giving the user what they want. I won't pretend that this is the short road to SERP dominance, but my philosophy is to target the user with 80% of your efforts and feed the engines with the other 20%.

    Thanks Jonah :)

    ~~~~~~~~~~

    Jonah Stein has 15 years of online marketing experience and is the founder of ItsTheROI, a San Francisco search engine marketing company that specializes in ROI-driven SEO and PPC initiatives. Jonah has spoken at numerous industry conferences, including Search Engine Strategies, Search Marketing Expo (SMX), SMX Advanced, SIIA On Demand, the Kelsey Group's Ultimate Search Workshop and LT Pact. He also developed the Virtual Blight panels for the Web 2.0 Summit and the Web 2.0 Expo. He has written for Context Web, Search Engine Land and SEO Book.

    Jonah is also the cofounder of two SaaS companies: CodeGuard.com, a cloud-based backup service that provides a time machine for websites, and Hubkick.com, an online collaboration and task management tool that provides a simple way for groups to work together instantly.
