Google Keyword (Not Provided)

Sep 25th

Just a follow-up on the prior (not provided) post, as Google has shot the moon since our last post on this. Here's a quick YouTube video.

The above video references the following:

Matt Cutts when secured search first rolled out:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.

This Week in Google (TWIG) show 211, where Matt mentioned the inspiration for encrypted search:

we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said "we've got to encrypt the web."

The integration of organic search performance data inside AdWords.

The esteemed AdWords advertiser David Whitaker.

When asked about the recent increase in (not provided), a Google representative stated the following:

We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….

The motivation here is not to drive the ads side — it’s for our search users.

What an excellent time for Google to block paid search referrals as well.

If the move is important for user safety then it should apply to the ads as well.

Jim Boykin Interview

Sep 4th

Jim Boykin has been a longtime friend & was one of the early SEOs who was ahead of the game back in the day. While many people have come and gone, Jim remains as relevant as ever today. We interviewed him about SEO, including scaling his company, disavows & how Google has changed the landscape over the past couple of years.

Aaron: How did you get into the field of SEO?

Jim: In 1999 I started We Build Pages as a one man show designing and marketing websites...I never really became much of a designer, but luckily I had much more success in the marketing side. Somehow that little one man show grew to about 100 ninjas, and includes some communities and forums I grew up on (WebmasterWorld, SEOChat, Cre8asiteForums), and I get to work with people like Kris Jones, Ann Smarty, Chris Boggs, Joe Hall, Kim Krause Berg, and so many others at Ninjas who aren't as famous but are just as valuable to me, and Ninjas has really become a family over the years. I still wonder at times how this all happened, but I feel lucky with where we're at.

Aaron: When I got started in SEO some folks considered all link building to be spam. I looked at what worked, and it appeared to be link building. Whenever I thought I had come up with a clever new way to hunt for links & went looking around, most of the time it seemed you had gotten there first. Who were some of the people you looked to for ideas when you first got into SEO?

Jim: Well, I remember going to my first SEO conference in 2002 and meeting people like Danny Sullivan, Jill Whalen, and Bruce Clay. I also remember Bob Massa being the first person "dinged" by google for selling links...that was back in 2002 I think...I grew up on Webmasterworld and I learned a ton from the people in there like: Tedster, Todd Friesen, Greg Boser, Brett Tabke, Shak, Bill, Rae Hoffman, Roger Montti, and so many others in there over the years...they were some of my first influencers....I also used to hang around with Morgan Carey, and Patrick Gavin a lot too. Then this guy selling an SEO Book kept showing up on all my high PR pages where I was getting my links....hehe...

Aaron: One of the phrases in search that engineers may use is "in an ideal world...". There is always some amount of gap between what is advocated & what actually works. With all the algorithmic changes that have happened in the past few years, how would you describe that "gap" between what works & what is advocated?

Jim: I feel there's really been a tipping point with the Google Penguin updates. Maybe it should be "What works best short term" and "What works best long term"....anything that is not natural may work great in the short term, but your odds of getting zinged by Google go way up. If you're doing "natural things" to get citations and links, then it may tend to take a bit longer to see results (in conjunction with all you're doing), but at least you can sleep at night doing natural things (and not worrying about Google penalties). It's not like years ago, when getting exact targeted anchor text for the phrases you want to rank on was the way to go if you wanted to compete for search rankings. Today it's much more involved to send natural signals to a client's website. To send in natural signals you must do things like work up the brand signals, trusted citations, return visitors, good user experience, community, authors, social, yada yada....SEO is becoming less a "link thing"...and more "great signals from many trusted people", as well as a branding game now. I really like how SEO is evolving....for years Google used to say things like "Think of the users" when talking of the algorithm, but we all laughed and said "Yea, yea, we all know that it's all about the backlinks"....but today, I think Google has crossed a tipping point where yes, to do great SEO, you must focus on the users, and not the links....the best SEO is getting more citations and trusted signals to your site than your competitors...and there's a lot of trusted signals which we, as internet marketers, can be working on....it's more complicated, and some SEOs won't survive this game...they'll continue to aim for short term gains on short tail keyword phrases...and they'll do things in bulk....and their network will be filtered, and possibly penalized.

Every website owner has to measure the risks, and the time involved, and the expected ROI....it's not a cheap game any more....doing real marketing involves brains and not buttons...if you can't invest in really building something "special" (ideally many special things), on your site to get signals (links/social), then you're going to find it pretty hard to get links that look natural and don't run a risk of getting penalized.  The SEO game has really matured, the other option is to take a high risk of penalization.

Aaron: In terms of disavow, how deep does one have to cut there?

Jim: as deep as it needs to be to remove every unnatural link. If you have 1000 backlinks and 900 are on pages that were created for "unnatural purposes" (to give links), then all 900 have to be disavowed...if you have 1000 backlinks, and only 100 are not "natural", then only 100 need to be disavowed... what percent has to be disavowed to untrip an algorithmic filter? I'm not sure...but almost always the links which I disavow have zero value (in my opinion) anyways. Rip the band-aid off, get over it, take your marketing department and start doing real things to attract attention, and to keep it.
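For anyone who has never actually built one, the disavow file Jim is describing is just a plain-text list: one URL or `domain:` line per entry, with `#` for comments. Here's a minimal sketch of generating one from an audited backlink list — the site names and the `flagged_unnatural` set are hypothetical examples, not real data:

```python
# Build a Google disavow file from a backlink audit.
# Format: one URL or "domain:example.com" per line, "#" starts a comment.
backlinks = [
    "http://spammy-directory.example/widgets.html",
    "http://article-farm.example/green-widgets",
    "http://realnews.example/story-about-us",
]
# Links the audit flagged as unnatural (created purely to pass link equity).
flagged_unnatural = {
    "http://spammy-directory.example/widgets.html",
    "http://article-farm.example/green-widgets",
}

lines = ["# Disavow file generated from backlink audit"]
for url in backlinks:
    if url in flagged_unnatural:
        lines.append(url)

disavow_file = "\n".join(lines)
print(disavow_file)
```

The natural link to `realnews.example` stays out of the file — per Jim's point, you only disavow the unnatural portion, however large that portion turns out to be.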

Aaron: In terms of recoveries, are most penalized sites "recoverable"? What does the typical recovery period look like in terms of duration & restoration?

Jim: oh...this is a bee's nest you're asking me..... are sites recoverable....yes, most....if a site has 1000 domains that link to it, and 900 of those are artificial and I disavow them, there might not be much of a recovery depending on what those 100 remaining links are....ie, if I disavow all link text of "green widgets" that goes to your site, and you used to rank #1 for "green widgets" prior to being hit by a Penguin update, then I wouldn't expect to "recover" on the first page for that phrase..... where you recover seems to depend on "what do you have for natural links that are left after the disavow?"....the time period....well.... we've seen some partial recoveries in as soon as 1 month, and some 3 months after the disavow...and some we're still waiting on....

To explain, Google says that when you add links to the disavow document, the way it works is that the next time Google crawls any page that links to you, they will assign a "nofollow" to the link at that time.....so you have to wait until enough of the links have been recrawled, and now assigned the nofollow, to untrip the filter....but one of the big problems I see is that many of the pages Google shows as linking to you, well, they're not cached in Google!....I see some really spammy pages where Google was there (they record your link), but it's like Google has tossed the page out of the index even though they show the page as linking to you...so I have to ask myself, when will Google return to those pages?...will Google ever return to those pages??? It looks like if you had a ton of backlinks that were on pages that were so bad in the eyes of Google that they don't even show those pages in their index anymore...we might be waiting a long, long time for Google to return to those pages to crawl them again....unless you do something to get Google to go back to those pages sooner (I won't elaborate on that one).

Aaron: I notice you launched a link disavow tool & earlier tonight you were showing me a few other cool private tools you have for working on disavow analysis, are you going to make any of those other tools live to the public?

Jim: Well, we have about 12 internal private disavow analysis tools, and only 1 public disavow tool....we are looking to have a few more public tools for analyzing links for disavow analysis in the coming weeks, and in a few months we'll release our Ultimate Disavow Tool...but for the moment, they're not ready for the public, some of those are fairly expensive to run and very database intensive...but I'm pretty sure I'm looking at more link patterns than anyone else in the world when I'm analyzing backlinks for doing disavows. When I'm tired of doing disavows maybe I'll sell access to some of these.

Aaron: Do you see Google folding in the aggregate disavow data at some point? How might they use it?

Jim: um.....I guess if 50,000 disavow documents have spammywebsite.com listed in their disavows, then Google could consider that spammywebsite.com might be a spammy website.....but then again, with people disavowing links who don't know what they're doing, I'm sure there's a ton of great sites getting listed in disavow documents in Webmaster Tools.

Aaron: When approaching link building after recovering from a penalty, how does the approach differ from link building for a site that has never been penalized?

Jim: it doesn't really matter....unless you were getting unnatural/artificial links or things in bulk in the past, then, yes, you have to stop doing that now...that game is over if you've been hit...that game is over even if you haven't been hit....Stop doing the artificial link building stuff. Get real citations from real people (and often "by accident") and you should be ok.

Aaron: You mentioned "natural" links. Recently Google has hinted that infographics, press releases & other sorts of links should use nofollow by default. Does Google aim to take some "natural" link sources off the table after they are widely used? Or are those links they never really wanted to count anyhow (and perhaps sometimes didn't) & they are just now reflecting that.

Jim: I think ~most of these didn't count for years anyways....but it's been impossible for Google to nail every directory, or every article syndication site, or every press release site, or everything that people can do in bulk...and it's harder to get all occurrences of widgets and mentions of infographics...so it's probably just a "Google scare"....ie, Google says, "Don't do it, nofollow them" (and I think they say that because it often works), and the less of a pattern there is, the harder it is for Google to catch it (ie, widgets and infographics)...I think too much of any 1 thing (be it a "type of link") can be a bad signal....as well as things like "too many links from pages that get no traffic", or "no clicks from links to your site". In most cases, because of keyword abuse, Google doesn't want to count them...links like this may be fine (and ok to follow) in moderation...but if you have 1000 widget links, and they all have commercial keywords as link text, then you're treading on what could certainly turn into a negative signal, and so then you might want to consider nofollowing those.

Aaron: There is a bit of a paradox in terms of scaling effective quality SEO services for clients while doing things that are not seen as scalable (and thus future friendly & effective). Can you discuss some of the biggest challenges you faced when scaling IMN? How were you able to scale to your current size without watering things down the way that most larger SEO companies do?

Jim: Scaling while keeping quality has certainly been a challenge in the past. I know that scaling content was an issue for us for a while....how can you scale quality content?....Well, we've found that by connecting real people, the real writers, the people with real social influence...and by taking these people and connecting them to the brands we work with.....so these real people then become "Brand Evangelists"...and by getting these real people who know what they're talking about to then write for our clients, well, when we did that we found that we could solve the content issue at scale. We can scale things like link building by merging with the other "mentions", and specifically targeting industries and people and working on building up associations and relations with others has helped to scale...plus we're always building tools to help us scale while keeping quality. It's always a challenge, but we've been pretty good at solving many of those issues.

I think we've been really good at scaling in house....many content marketers are now more like community managers and content managers....we've been close to 100 employees for a few years now...so it's more about how we can do more with the existing people we have...and we've been able to do that by connecting real people to the clients so we can actually have better content and better marketing around that content....I'm really happy that the # of employees has been roughly the same for the past few years, but we're doing more business, and the quality keeps getting better....there's not as many content marketers today as there were a few years ago, but there's many more people working on helping authors build up their authorship value and produce more "great marketing" campaigns where, as a by-product, we happen to get some links and social citations.

Aaron: One of the things I noticed with your site over the past couple years is the sales copy has promoted the fusion of branding and SEO. I looked at your old site in Archive.org over the years & have seen quite an amazing shift in terms of sales approach. Has Google squeezed out most of the smaller players for good & does effective sustainable SEO typically require working for larger trusted entities? When I first got into SEO about 80%+ of the hands in the audiences at conferences were smaller independent players. At the last conference I was at it seemed that about 80% of the hands in the audience worked for big companies (or provided services to big companies). Is this shift in the market irreversible? How would you compare/contrast approach in working with smaller & larger clients?

Jim: Today it's down to "Who really can afford to invest in their Brand?" and "Who can do real things to get real citations from the web?"....and who can think way beyond "links"...if you can't do those things, then you can't have an effective sustainable online marketing program.... we were a "link building company" for many, many years.... but for the past 3 years we've moved into full service, offering way more than "link building services".... yea, SEO was about "links" for years, and it still is to a large degree....but unless you want to get penalized, you have to take the "it's way more than links" approach... in order for SEO to work (w/o fear of getting penalized) today, you have to look at sending in natural signals...so thus, you must do "natural" things...things that will get others "talking" about it, and about you....SEO has evolved a lot over the years....Google used to recommend 1 thing (create a great site and create great things), but for years we all knew that SEO was about links and anchor text....today, I think Google has caught up (to some degree) with the user, and with "real signals"...yesterday it was "gaming" the system....today it's about doing real things...real marketing...and getting your name out to the community by creating great things that spread, and that get people to come back to your site....those SEOs and businesses who don't realize that the game has changed will probably be doing a lot of disavowing at some time in the future, and many SEOs will be out of business if they think it's a game where you can do "fake things" to "get links" in bulk....in a few years we'll see which internet marketing companies are still around...those who are still around will be those who do real marketing using real people and promoting to other real people...the link game itself has changed...in the past we looked at link graphs...today we look at people graphs....who is talking about you, what are they saying....it's way more than "who links to me, and how do they link to me"....Google is turning it into "everyone gets a vote" and "everyone has a value"...and in order to rank, you'll need real people of value talking about your site...and you'll need a great user experience when they get there, and you'll need loyal people who continue to return to your site, and you'll need to continue to do great things that get mentions....

SEO is no longer a game of some linking algorithm, it's now really a game of "how can you create a great user experience and get a buzz around your pages and brand".

Aaron: With as much as SEO has changed over the years, it is easy to get tripped up at some point, particularly if one is primarily focused on the short term. One of the more impressive bits about you is that I don't think I've ever seen you unhappy. The "I'm feeling lucky" bit seems to be more than just a motto. How do you manage to maintain that worldview no matter what's changing & how things are going?

Jim: Well, I don't always feel lucky...I know in 2008 when Google hit a few of our clients because we were buying links for them I didn't feel lucky (though the day before, when they ranked #1, I felt lucky)....but I'm in this industry for the long term...I've been doing this for almost 15 years....and yes, we've had to constantly change over the years, and continue to grow, and growing isn't always easy...but it is exciting to me, and I do feel lucky for what I have...I have a job I love, I get to work with people whom I love, in an industry I love, I get to travel around the world and meet wonderful people and see cool places...and employ 100 people and win "Best Places to Work" awards, and I'm able to give back to the community and to society, and to the earth...those things make me feel lucky...SEO has always been like a fun game of chess to me...I'm trying to do the best I can with any move, but I'm also trying to think a few steps ahead, and trying to think what Google is thinking on the other side of the table.....ok...yea, I do feel lucky....maybe it's the old hippy in me...I always see the glass half full, and I'm always dreaming of a better tomorrow....

If I can have lots of happy clients, and happy employees, and do things to make the world a little better along the way, then I'm happy...sometimes I'm a little stressed, but that comes with life....in the end, there's nothing I'd rather be doing than what I currently do....and I always have big dreams of tomorrow that always make the trials of today seem worth it for the goals of what I want to achieve for tomorrow.

Aaron: Thanks Jim!


Jim Boykin is the CEO of Internet Marketing Ninjas, a blogger, and a public speaker. You can find Jim on Twitter, Facebook, and Google Plus.

Winning Strategies to Lose Money With Infographics

Aug 26th

Google is getting a bit absurd with suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed to have a negative ROI on each investment - until your company goes under.

Some will describe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.

If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if you get to count as duplicate content for your own featured premium content that you dropped 4 or 5 figures on AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find the claim that people putting something 20,000 pixels tall on their site are not actively vouching for it a bit disingenuous. If something was crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I are spammers, but put the content on Twitter with 143,199 Tweets in a second & those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of Dollars. You better use nofollow, just to play it safe!

That duality is what makes blind unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & the many others DennisG highlighted.

Oh, and here's an infographic for your pleasurings.


Google Keyword (Not Provided): High Double Digit Percent

Aug 3rd

Most Organic Search Data is Now Hidden

Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).

What hasn't received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse of Google's market monopoly in search.

The site which provided a count for (not provided) recently displayed over 40% of queries as (not provided), but that percentage didn't include the large percent of mobile search users that were showing no referrals at all & were showing up as direct website visitors. On July 30, Google started showing referrals for many of those mobile searchers, using keyword (not provided).

According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click values. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers are opted into mobile ads (and have to jump through some tricky hoops to figure out how to disable them), Google has every incentive to promote what a big growth channel mobile search is for their business.

Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).

Hiding The Value of SEO

Google is not only hiding half of their own keyword referral data, but they are hiding so much more than half that even when you mix in Bing and Yahoo! you still get over 50% of the total hidden.

Google's 86% of the 26,233 searches is 22,560 searches.

Keyword (not provided) being shown for 13,413 is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
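The arithmetic above is easy to replicate against your own analytics export. A minimal sketch using the visitor counts from this post (the variable names are mine):

```python
# Back-of-the-envelope math on how much organic keyword data Google hides.
total_search_visitors = 26233   # organic search visitors over the 4-day sample
google_share = 0.86             # Google's share of those search referrals
not_provided = 13413            # visitors reported as keyword (not provided)

google_visitors = total_search_visitors * google_share
hidden_pct = not_provided / google_visitors * 100

print(f"Google search visitors: {google_visitors:.0f}")  # ~22,560
print(f"Hidden keyword share:   {hidden_pct:.0f}%")      # ~59%
```

Swap in your own totals and the same two lines tell you what share of your Google organic keyword data is being withheld.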

Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.

Buy AdWords ads and the data gets sent. Rank organically and most of the data is hidden.

When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.

Subsidizing Scammers Ripping People Off

A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over that time period. Looking at the backlinks, nothing there either.

So what happened?

Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot sending keyword (not provided) Google visitors & claim that they were "doing SEO."

And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.

Or something like that:

Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.

This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.

Anti-competitive Funding Blocking Competing Ad Networks

John Andrews pointed to Google's blocking (then funding) of AdBlock Plus as an example of their monopolistic inhibiting of innovation.

sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG

Turn AdBlock Plus on & search for credit cards on Google and get ads.

Do that same search over at Bing & get no ads.

How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?

They can't.

Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.

Buying Browser Marketshare

Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.

If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.

When Google does it, it's "good for the user."

Vampire Sucking The Lifeblood Out of SEO

Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Log in to Chrome & searchers don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when its own cookies are at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."

What do we call an entity that considers something "its lifeblood" while sucking it out of others?

New Local Carousel

Jun 24th

Google announced they rolled out their local carousel results on desktops in categories like hotels, dining & nightlife for US English search queries. The ranking factors driving local rank are aligned with the same ones that were driving the old 7 pack result set.

The layout seems to be triggered when there are 5 or more listings. One upside to the new layout is that clicks within the carousel might not fall off quite as quickly as they do with vertical listings, so if you don't rank #1 you might still get plenty of traffic.

The new layout offers less useful information by default than the old layout provided, while requiring user interaction with the result set to get the information they want. You get a picture, but the only way the phone number appears in the result set is if you click into that result set or conduct a branded query from the start.

If you search for a general query (say "Indian restaurants") and want the phone number of a specific restaurant, you will likely need to click on that restaurant's picture in order to shift the search to that restaurant's branded search result set to pull their phone number & other information into the page. In that way Google is able to better track user engagement & enhance personalization on local search. When people repeatedly click into the same paths from logged in Google user accounts then Google can put weight on the end user behavior.

This multi-click process not only gives Google usage data to refine rankings with, but it also will push advertisers into buying branded AdWords ads.

Where this new result set is a bit of a train wreck for navigational searches is when a brand is fairly generic & aligned with a location as part of the business name. For instance, in Oakland there is a place named San Francisco Pizza. Even if you do that branded search, you still get the carousel & there might also be three AdWords ads above the organic search results.

If that company isn't buying branded AdWords ads, they had best hope that their customers have large monitors, don't use Google, or are better than the average searcher at distinguishing between AdWords & organic results.

Some of Google's other verticals may appear above the organic result set too. When searching for downtown Oakland hotels they offer listings of hotels in San Francisco & Berkeley inside the hotel onebox.

Perhaps Google can patch together some new local ad units that work with the carousel to offer local businesses a flat-rate monthly ad product. A lot of advertisers would be interested in testing a subscription product that enabled them to highlight selected user reviews and include other options like ratings & coupons & advertiser control of the image. As the search result set becomes the destination some of Google's ad products can become much more like Yelp's.

In the short term the new layout is likely a boon for Yelp & some other local directory plays. Whatever segment of the search audience dislikes the new carousel will likely be shunted into many of these other local directories.

In the long run some of these local directories will be the equivalent of MapQuest. As Google gains confidence they will make their listings richer & grow more comfortable entirely displacing the result set. The following search isn't a local one, but it is a good example of where we may be headed: even though the search is set to "web" results (rather than "video" results), the first 9 listings are from YouTube.

Update: In addition to the alarming rise of further result displacement, the 2-step clickthrough process means that local businesses will lose even more keyword referral data, as many of the generic queries are replaced by their branded keywords in analytics data.

Inbound, Outbound, Outhouse

Jun 1st

Jon Henshaw put the hammer down on inbound marketing, highlighting how the purveyors of "the message" often do the opposite of what they preach. So much of the marketing I see around that phrase is either of the "clueless newb" variety or paid push marketing of some stripe.

One of the clueless newb examples smacked me in the face last week on Twitter, where some "HubSpot certified partner" (according to his Twitter profile) complained to me about not following enough of our followers, then sent a follow-up spam asking if I had seen his article about SEO.

The SEO article was worse than useless. It suggested that you shouldn't be "obvious" & that you should "naturally attract links." Yet the article itself was a thin guest post containing the anchor text search engine optimization deep linking to his own site. The same guy has a "book" titled Findability: Why Search Engine Optimization is Dying.

Why not promote the word findability with the deep link if he wants to claim that SEO is dying? Who writes about how something is dying, yet still targets it instead of the alleged solution they have in hand?

If a person wants to claim that anchor text is effective, or that push marketing is key to success, it is hard to refute those assertions. But if you are pushy & aggressive with anchor text, then the message of "being natural" and "just let things flow" is at best inauthentic, which is why sites like Shitbound.org exist. ;)

Some of the people who wanted to lose the SEO label suggested their reasoning was that the acronym SEO was stigmatized. And yet, only a day after rebranding, these same folks that claim they will hold SEO near and dear forever are already outing SEOs.

The people who want to promote the view that "traditional" SEO is black hat and/or ineffective have no problems with dumping on & spamming real people. It takes an alleged "black hat" to display any concern with how actual human beings are treated.

If the above wasn't bad enough, SEO is getting a bad name due to the behavior of inbound tool vendors. Look at the summary on a blog post from today titled Lies The SEO Publicity Machine Tells About PPC (When It Thinks No One’s Looking):

Then he told me he wasn’t seeing any results from following all the high-flown rhetoric of the “inbound marketing, content marketing” tool vendor. “Last month, I was around 520 visitors. This month, we’re at 587.” Want to get to 1,000? Work and wait and believe for another year or two. Want to get to 10,000? Forget it. ... You could grow old waiting for the inbound marketing fairy tale to come true.

Of course I commented on the above post & asked Andrew if he could put "inbound marketer" in the post title, since that's who was apparently selling hammed up SEO solutions.

In response to Henshaw's post (& some critical comments) calling inbound marketing incomplete marketing Dharmesh Shah wrote:

When we talk about marketing, we position classical outbound techniques as generally being less effective (and more expensive) over time. Not that they’re completely useless — just that they don’t work as well as they once did, and that this trend would continue.

Hugh MacLeod is brilliant with words. He doesn't lose things in translation. His job is distilling messages to their core. And what did his commissions for HubSpot state?

  • thankfully consigning traditional marketing to the dustbin of history since 2006
  • traditional marketing is easy. all you have to do is pretend it works
  • the good news is, your customers are just as sick of traditional marketing as you are
  • hey, remember when traditional marketing used to work? neither do we
  • traditional marketing doesn't work. it never did

Claiming that "traditional marketing" doesn't work, and never did, would indeed be claiming that classical marketing techniques are ineffective / useless.

If something "doesn't work" it is thus "useless."

You never hear a person say "my hammer works great, it's useless!"

As always, watch what people do rather than what they say.

When prescription and behavior are not aligned, it is the behavior that is worth emulating.

That's equally true for a keyword-rich deep link in a post telling you to let SEO happen naturally, and for people who relabel things while telling you not to do what they are doing.

If "traditional marketing" doesn't work AND they are preaching against it, why do they keep doing it?

Follow the money.

Why the Yahoo! Search Revenue Gap Won't Close

In spite of Yahoo! accepting revenue guarantees for another year from Microsoft, recently there has been speculation that Yahoo! might want to get out of their search ad deal with Microsoft. I am uncertain if the back channeled story is used as leverage to secure ongoing minimum revenue agreements, or if Yahoo! is trying to set the pretext narrative to later be able to push through a Google deal that might otherwise get blocked by regulators.

When mentioning Yahoo!'s relative under-performance on search, it would be helpful to point out the absurd share of their "search" traffic from the golden years that was various forms of arbitrage. Part of the reason (likely the primary reason) Yahoo! took such a sharp nose dive in search revenues (from $551 million per quarter to as low as $357 million per quarter) was that Microsoft used quality scores to price down the non-search arbitrage traffic streams, & a lot of that incremental "search" volume went away.

There were all sorts of issues in place that are rarely discussed: exit traffic, unclosable windows, forced repeat clicks, iframes in email spam, raw bot clicks, etc. ... and some of this was tied to valuable keyword lists or specific juicy keywords. I am not saying that Google has outright avoided all arbitrage (Ask does boatloads of it in paid + organic & Google at one point tested doing some themselves on credit card keywords), but it has generally been a sideshow at Google, whereas it was the main attraction at Yahoo!.

And that is what drove down Yahoo!'s click prices.

Yahoo! went from almost an "anything goes" approach to their ad feed syndication, to the point where they made a single syndication partner Cyberplex's Tsavo Media pay them $4.8 million for low quality traffic. There were a number of other clawbacks that were not made public.

Given that we are talking $4.8 million for a single partner & this alleged overall revenue gap between Google AdWords & Bing Ads is somewhere in the $100 million or so range, these traffic quality issues & Microsoft cleaning up the whoring of the ad feed that Yahoo! partners were doing is a big deal. It had a big enough impact that it caused some of the biggest domain portfolios to shift from Yahoo! to Google. I am a bit surprised to see it so rarely mentioned in these discussions.

Few appreciate how absurd the abuses were. For years Yahoo! not only required you to buy syndication (they didn't have a Yahoo!-only targeting option until 2010 & that only came about as a result of a lawsuit) but even when you blocked a scammy source of traffic, if that scammy source was redirecting through another URL you would have no way of blocking the actual source, as mentioned by Sean Turner:

To break it down, yahoo gives you a feed for seobook.com & you give me a feed for turner.com. But all links that are clicked on turner.com redirect through seobook.com so that it shows up in customer logs as seobook.com If you block seobook.com, it will block ads from seobook.com, but not turner.com. The blocked domain tool works on what domains display, not on where the feed is redirected through. So if you are a customer, there is no way to know that turner.com is sending traffic (since it’s redirecting through seobook.com) and no way to block it through seobook.com since that tool only works on the domain that is actually displaying it.

I found it because we kept getting traffic from gogogo.com. We had blocked it over and over and couldn’t figure out why they kept sending us traffic. We couldn’t find our ad on their site. I went to live.com and ran a site:gogogo.com search and found that it indexed some of those landing pages that use gogogo.com as a monetization service.
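The blind spot Turner describes can be sketched in a few lines: the blocked-domain tool matches the domain that displays the ad, while the advertiser's logs only ever record the domain the click redirects through, so the real source can neither be identified nor blocked. This is a minimal illustration with made-up domain names, not any real API:

```python
def simulate_click(display_domain, redirect_domain, blocked_domains):
    """Return (traffic_delivered, domain_seen_in_logs).

    The blocking tool matches only the *displaying* domain; the
    advertiser's logs see only the *redirect* domain.
    """
    if display_domain in blocked_domains:
        return (False, None)
    return (True, redirect_domain)

# The advertiser blocks the only domain their logs ever show them:
blocked = {"partner-feed.example"}

# A junk site displays the ad but redirects through partner-feed.example:
delivered, logged = simulate_click("junk-site.example", "partner-feed.example", blocked)

assert delivered is True                 # traffic still arrives...
assert logged == "partner-feed.example"  # ...and the real source never appears in logs
```

Blocking the logged domain only stops ads that are actually displayed on it, which is why the traffic in Turner's example kept flowing no matter how many times the source was "blocked."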

The other thing that isn't mentioned is the long-term impact of a Yahoo! tie-up with Google. Microsoft pays Yahoo! an 88% revenue share (and further guarantees on top of that), provides the organic listings free, manages all the technology, and allows Yahoo! to insert their own ads in the organic results.

If Bing were to exit the online ad market, maybe Yahoo! could make an extra $100 million in the first year of an ad deal with Google, but if there is little to no competition a few years down the road, then when it comes time for Yahoo! to negotiate revenue share rates with Google, you know Google would cram down a bigger rake.

This isn't blind speculation or theory, but aligned with Google's current practices. Look no further than Google's current practices with YouTube, where "partners" are paid different rates & are forbidden to mention their rates publicly: "The Partner Program forbids participants to reveal specifics about their ad-share revenue."

Transparency is a one way street.

Google further dips into leveraging that "home team always wins" mode of negotiating rates by directly investing in some of the aggregators/networks which offer sketchy confidential contracts soaking the original content creators (http://obviouslybenhughes.com/post/13933948148/before-you-sign-that-machinima-contract-updated):

As I said, the three images were posted on yfrog. They were screenshots of an apparently confidential conversation had between MrWonAnother and a partner support representative from Machinima, in which the representative explained that the partner was locked indefinitely into being a Machinima partner for the rest of eternity, as per signed contract. I found this relevant, informative and honestly shocking information and decided to repost the images to obviouslybenhughes.com in hopes that more people would become aware of the darker side of YouTube partnership networks.

Negotiating with a monopoly that controls the supply chain isn't often a winning proposition over the long run.

Competition (or at least the credible risk of it) is required to shift the balance of power.

The flip side of the above situation - where competition does help market participants get a better revenue share - can be seen in the performance of AOL in their ad negotiation in 2005. AOL's credible threat to switch to Microsoft had Google invest a billion dollars into AOL, of which Google later had to write down $726 million. If there had been no competition from Microsoft, AOL wouldn't have received that $726 million (and likely would have had a lower revenue-sharing rate and missed out on some of the promotional AdWords credits they received).

The same sort of "shifted balance of power" was seen in the Mozilla search renewal with Google, where Google paid Mozilla 3X as much due to a strong bid from Microsoft.

The iPad search results are becoming more like phone search results, where ads dominate the interface & a single organic result is above the fold. And Google pushed their "enhanced" ad campaigns to try to push advertisers into paying higher ad rates on those clicks. It would be a boon for Google if they could force advertisers to pay the same CPC as desktop & couple it with that high mobile ad CTR.

Google owning Chrome + Android & doing deals with Apple + Mozilla means that it will be hard for either Microsoft or Yahoo! to substantially grow search marketshare. But if they partner with Google it will be a short term lift in revenues and dark clouds on the horizon.

I am not claiming that Microsoft is great for Yahoo!, or that they are somehow far better than Google, only that Yahoo! is in a far better position when they have multiple entities competing for their business (as highlighted in the above Mozilla & AOL examples).

Getting Granular With User Generated Content

Apr 24th

The stock market had a flash crash today after someone hacked the AP account & made a fake announcement about bombs going off at the White House. Recently Twitter's search functionality has grown so inundated with spam that I don't even look at the brand related searches much anymore. While you can block individual users, it doesn't block them from showing up in search results, so there are various affiliate bots that spam just about any semi-branded search.

Of course, for as spammy as the service is now, it was worse during the explosive growth period, when Twitter had fewer than 10 employees fighting spam:

Twitter says its "spammy" tweet rate of 1.5% in 2010 was down from 11% in 2009.

If you want to show growth by any means necessary, engagement by a spam bot is still engagement & still lifts the valuation of the company.

Many of the social sites make no effort to police spam & only combat it after users flag it. Consider Eric Schmidt's interview with Julian Assange, where Schmidt stated:

  • "We [YouTube] can't review every submission, so basically the crowd marks it if it is a problem post publication."
  • "You have a different model, right. You require human editors." on Wikileaks vs YouTube

We would post editorial content more often, but we are sort of debating opening up a social platform so that we can focus on the user without having to bear any editorial costs until after the fact. Profit margins are apparently better that way.

As Google drives smaller sites out of the index & ranks junk content based on no factor other than it being on a trusted site, they create the incentive for spammers to ride on the social platforms.

All aboard. And try not to step on any toes!

When I do some product related searches (eg: brand name & shoe model) almost the whole result set for the first 5 or 10 pages is garbage.

  • Blogspot.com subdomains
  • Appspot.com subdomains
  • YouTube accounts
  • Google+ accounts
  • sites.google.com
  • Wordpress.com subdomains
  • Facebook Notes & pages
  • Tweets
  • Slideshare
  • LinkedIn
  • blog.yahoo.com
  • subdomains off of various other free hosts

It comes as no surprise that Eric Schmidt fundamentally believes that "disinformation becomes so easy to generate because of, because complexity overwhelms knowledge, that it is in the people's interest, if you will over the next decade, to build disinformation generating systems, this is true for corporations, for marketing, for governments and so on."

Of course he made no mention of Google's role in the above problem. When they are not issuing threats & penalties to smaller independent webmasters, they are just a passive omniscient observer.

With all these business models, there is a core model of building up a solid stream of usage data & then tricking users or looking the other way when things get out of hand. Consider Google's Lane Shackleton's tips on YouTube:

  • "Search is a way for a user to explicitly call out the content that they want. If a friend told me about an Audi ad, then I might go seek that out through search. It’s a strong signal of intent, and it’s a strong signal that someone found out about that content in some way."
  • "you blur the lines between advertising and content. That’s really what we’ve been advocating our advertisers to do."
  • "you’re making thoughtful content for a purpose. So if you want something to get shared a lot, you may skew towards doing something like a prank"

Harlem Shake & Idiocracy: the innovative way forward to improve humanity.

Life is a prank.

This "spam is fine, so long as it is user generated" stuff has gotten so out of hand that Google is now implementing granular page-level penalties. When those granular penalties hit major sites Google suggests that those sites may receive clear advice on what to fix, just by contacting Google:

Hubert said that if people file a reconsideration request, they should “get a clear answer” about what’s wrong. There’s a bit of a Catch-22 there. How can you file a reconsideration request showing you’ve removed the bad stuff, if the only way you can get a clear answer about the bad stuff to remove is to file a reconsideration request?

The answer is that technically, you can request reconsideration without removing anything. The form doesn’t actually require you to remove bad stuff. That’s just the general advice you’ll often hear Google say, when it comes to making such a request. That’s also good advice if you do know what’s wrong.

But if you’re confused and need more advice, you can file the form asking for specifics about what needs to be removed. Then have patience

In the past I referenced that there is no difference between a formal white list & overly-aggressive penalties coupled with loose exemptions for select parties.

The moral of the story is that if you are going to spam, you should make it look like a user of your site did it, that way you

  • are above judgement
  • receive only a limited granular penalty
  • get explicit & direct feedback on what to fix

What Types of Sites Actually Remove Links?

Apr 9th

Since the disavow tool came out SEOs have been sending thousands of "remove my link" requests daily. Some come off as polite, some lie & claim that the person linking is at huge risk of having their own rankings tank, some lie with faux legal risks, some come with "extortionisty" threats that if the recipient doesn't comply the sender will report the site to Google or try to get the web host to take down the site, and some come with payment/bribery offers.

If you want results from Google's jackassery game you either pay heavily with your time, pay with cash, or risk your reputation by threatening or lying broadly to others.

At the same time, Google has suggested that anyone who would want payment to remove links is operating below board. But if you receive these inbound emails (often from anonymous Gmail accounts) you not only have to account for the time it would take to find the links & edit your HTML, but you also have to determine if the person sending the link removal request represents the actual site, or if it is someone trying to screw over one of their competitors. Then, if you confirm that the request is legitimate, you either need to further expand your page's content to make up for the loss of that resource or find a suitable replacement for the link that was removed. All this takes time. And if that time is from an employee that means money.

There have been hints that if a website is disavowed some number of times that data can be used to further go out & manually penalize more websites, or create link classifications for spam.

... oh no ...

Social engineering is the most profitable form of engineering going on in the 'Plex.

The last rub is this: if you do value your own time at nothing in a misguided effort to help third parties (who may have spammed up your site for links & then often follow it up with lying to you to achieve their own selfish goals), how does that reflect on your priorities and the (lack of) quality in your website?

If you contacted the large branded websites that Google is biasing their algorithms toward promoting, do you think those websites would actually waste their time & resources removing links to third party websites? For free?

Color me skeptical.

As a thought experiment, look through your backlinks for a few spam links that you know are hosted by Google (eg: Google Groups, YouTube, Blogspot, etc.) and try to get Google's webmaster to help remove those links for you & let us know how well that works out for you.
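As a rough illustration of that thought experiment, here is a minimal Python sketch that scans a backlink export for links hosted on Google-owned platforms. The URLs and the (partial) list of Google-hosted domains are purely hypothetical examples:

```python
from urllib.parse import urlparse

# A partial, illustrative list of Google-owned free hosting properties.
GOOGLE_HOSTED = ("blogspot.com", "youtube.com", "groups.google.com", "sites.google.com")

def google_hosted_links(backlinks):
    """Return the backlink URLs hosted on a Google-owned free platform."""
    hits = []
    for url in backlinks:
        host = urlparse(url).netloc.lower()
        # Match the domain itself or any subdomain of it.
        if any(host == d or host.endswith("." + d) for d in GOOGLE_HOSTED):
            hits.append(url)
    return hits

links = [
    "http://spamblog.blogspot.com/cheap-widgets.html",
    "http://example.com/resources.html",
    "http://sites.google.com/site/freebacklinks/",
]
print(google_hosted_links(links))  # the blogspot & sites.google.com entries
```

From there the experiment is manual: try asking Google's own webmaster to remove those links, and see how far you get.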

Some of the larger monopolies & oligopolies don't offer particularly useful customer service to their paying customers. For example, track how long it takes you to get a person on the other end of the phone with a telecom giant, a cable company, or a mega bank. Better yet, look at how long it took AdWords to openly offer phone support & the non-support they offer AdSense publishers (remember the bit about Larry Page believing that "the whole idea of customer support was ridiculous"?)

For the non-customer Google may simply recommend that the best strategy is to "start over."

When Google aggregates Webmaster Tools link data from penalized websites they can easily make 2 lists:

  • sites frequently disavowed
  • sites with links frequently removed

If both lists are equally bad, then you are best off ignoring the removal requests & spending your time & resources improving your site.

If I had to guess, I would imagine that being on the list of "these are the spam links I was able to remove" is worse than being on the list of "these are the links I am unsure about & want to disavow just in case."

What say you?

Why is Great SEO so Expensive?

Mar 21st

Sharing is caring!

Please share :)

Embed code is here.

The below image has a somewhat small font size on it. You can see the full sized version by clicking here.

Why SEO is Expensive.
