Jim Boykin Interview

Jim Boykin has been a longtime friend & was one of the early SEOs who was ahead of the game back in the day. While many people have come and gone, Jim remains as relevant as ever today. We interviewed him about SEO, including scaling his company, disavows & how Google has changed the landscape over the past couple of years.

Aaron: How did you get into the field of SEO?

Jim: In 1999 I started We Build Pages as a one-man show designing and marketing websites...I never really became much of a designer, but luckily I had much more success on the marketing side. Somehow that little one-man show grew to about 100 ninjas, and includes some communities and forums I grew up on (WebmasterWorld, SEOChat, Cre8asiteForums), and I get to work with people like Kris Jones, Ann Smarty, Chris Boggs, Joe Hall, Kim Krause Berg, and so many others at Ninjas who aren't as famous but are just as valuable to me, and Ninjas has really become a family over the years. I still wonder at times how this all happened, but I feel lucky with where we're at.

Aaron: When I got started in SEO some folks considered all link building to be spam. I looked at what worked, and it appeared to be link building. Whenever I thought I came up with a clever new way to hunt for links & went hunting around, most of the time it seemed you had got there first. Who were some of the people you looked to for ideas when you first got into SEO?

Jim: Well, I remember going to my first SEO conference in 2002 and meeting people like Danny Sullivan, Jill Whalen, and Bruce Clay. I also remember Bob Massa being the first person "dinged" by Google for selling links...that was back in 2002 I think...I grew up on WebmasterWorld and I learned a ton from the people in there like: Tedster, Todd Friesen, Greg Boser, Brett Tabke, Shak, Bill, Rae Hoffman, Roger Montti, and so many others in there over the years...they were some of my first influencers....I also used to hang around with Morgan Carey and Patrick Gavin a lot too. Then this guy selling an SEO Book kept showing up on all my high PR pages where I was getting my links....hehe...

Aaron: One of the phrases in search that engineers may use is "in an ideal world...". There is always some amount of gap between what is advocated & what actually works. With all the algorithmic changes that have happened in the past few years, how would you describe that "gap" between what works & what is advocated?

Jim: I feel there's really been a tipping point with the Google Penguin updates. Maybe it should be "What works best short term" and "What works best long term"....anything that is not natural may work great in the short term, but your odds of getting zinged by Google go way up. If you're doing "natural things" to get citations and links, then it may tend to take a bit longer to see results (in conjunction with all you're doing), but at least you can sleep at night doing natural things (and not worrying about Google penalties). It's not like years ago, when getting exact targeted anchor text for the phrases you wanted to rank on was the way to go if you wanted to compete for search rankings. Today it's much more involved to send natural signals to a client's website. To send in natural signals you must do things like work up the brand signals, trusted citations, return visitors, good user experience, community, authors, social, yada yada....SEO is becoming less a "link thing"...and more a "great signals from many trusted people" thing, as well as a branding game now. I really like how SEO is evolving....for years Google used to say things like "Think of the users" when talking about the algorithm, but we all laughed and said "Yea, yea, we all know that it's all about the backlinks"....but today, I think Google has crossed a tipping point where yes, to do great SEO, you must focus on the users, and not the links....the best SEO is getting more citations and trusted signals to your site than your competitors have...and there are a lot of trusted signals which we, as internet marketers, can be working on....it's more complicated, and some SEOs won't survive this game...they'll continue to aim for short term gains on short tail keyword phrases...and they'll do things in bulk....and their networks will be filtered, and possibly penalized.

Every website owner has to measure the risks, the time involved, and the expected ROI....it's not a cheap game any more....doing real marketing involves brains and not buttons...if you can't invest in really building something "special" (ideally many special things) on your site to get signals (links/social), then you're going to find it pretty hard to get links that look natural and don't run a risk of getting penalized. The SEO game has really matured; the only other option is to take a high risk of penalization.

Aaron: In terms of disavow, how deep does one have to cut there?

Jim: As deep as it needs to be to remove every unnatural link. If you have 1000 backlinks and 900 are on pages that were created for unnatural purposes (to give links), then all 900 have to be disavowed...if you have 1000 backlinks, and only 100 are not natural, then only 100 need to be disavowed... what percent has to be disavowed to untrip an algorithmic filter? I'm not sure...but almost always the links which I disavow have zero value (in my opinion) anyways. Rip the band-aid off, get over it, take your marketing department and start doing real things to attract attention, and to keep it.
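For reference, the disavow file Jim is describing is just a plain text file uploaded through Google Webmaster Tools: one URL or domain per line, with `#` marking comments. The domains below are placeholders, not real examples:

```text
# Disavow file for example-client.com
# Paid directory links that could not be removed after outreach
http://spammy-directory-example.com/green-widgets-page.html
# The domain: prefix disavows every page on that site at once
domain:low-quality-network-example.com
```

The `domain:` form covers an entire linking site rather than one page, which fits the "rip the band-aid off" approach better than disavowing spammy URLs one by one.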

Aaron: In terms of recoveries, are most penalized sites "recoverable"? What does the typical recovery period look like in terms of duration & restoration?

Jim: Oh...this is a bee's nest you're asking me about..... Are sites recoverable? Yes, most....if a site has 1000 domains that link to it, and 900 of those are artificial and I disavow them, there might not be much of a recovery, depending on what those 100 remaining links are....ie, if I disavow all links with anchor text of "green widgets" that point to your site, and you used to rank #1 for "green widgets" prior to being hit by a Penguin update, then I wouldn't expect to "recover" on the first page for that phrase..... where you recover seems to depend on "what do you have for natural links that are left after the disavow?"....the time period....well.... we've seen some partial recoveries in as soon as 1 month, and some 3 months after the disavow...and some we're still waiting on....

To explain, Google says that when you add links to the disavow document, the way it works is that the next time Google crawls any page that links to you, they will assign a "nofollow" to the link at that time.....so you have to wait until enough of the links have been recrawled, and assigned the nofollow, to untrip the filter....but one of the big problems I see is that many of the pages Google shows as linking to you, well, they're not cached in Google!....I see some really spammy pages where Google was there (they recorded your link), but it's like Google has tossed the page out of the index even though they show the page as linking to you...so I have to ask myself, when will Google return to those pages?...will Google ever return to those pages??? It looks like if you had a ton of backlinks on pages that were so bad in the eyes of Google that they don't even show those pages in their index anymore...we might be waiting a long, long time for Google to return to those pages to crawl them again....unless you do something to get Google to go back to those pages sooner (I won't elaborate on that one).

Aaron: I noticed you launched a link disavow tool & earlier tonight you were showing me a few other cool private tools you have for disavow analysis; are you going to make any of those other tools available to the public?

Jim: Well, we have about 12 internal private disavow analysis tools, and only 1 public disavow tool....we are looking to have a few more public tools for analyzing links for disavow analysis in the coming weeks, and in a few months we'll release our Ultimate Disavow Tool...but for the moment, they're not ready for the public, some of those are fairly expensive to run and very database intensive...but I'm pretty sure I'm looking at more link patterns than anyone else in the world when I'm analyzing backlinks for doing disavows. When I'm tired of doing disavows maybe I'll sell access to some of these.

Aaron: Do you see Google folding in the aggregate disavow data at some point? How might they use it?

Jim: Um.....I guess if 50,000 disavow documents have spammywebsite.com listed in their disavows, then Google could consider that spammywebsite.com might be a spammy website.....but then again, with people who don't know what they're doing disavowing links, I'm sure there's a ton of great sites getting listed in disavow documents in Webmaster Tools.

Aaron: When approaching link building after recovering from a penalty, how does the approach differ from link building for a site that has never been penalized?

Jim: It doesn't really matter....unless you were getting unnatural/artificial links or things in bulk in the past; then, yes, you have to stop doing that now...that game is over if you've been hit...that game is over even if you haven't been hit....Stop doing the artificial link building stuff. Get real citations from real people (and often "by accident") and you should be ok.

Aaron: You mentioned "natural" links. Recently Google has hinted that infographics, press releases & other sorts of links should use nofollow by default. Does Google aim to take some "natural" link sources off the table after they are widely used? Or are those links they never really wanted to count anyhow (and perhaps sometimes didn't), and are they just now reflecting that?

Jim: I think most of these didn't count for years anyways....but it's been impossible for Google to nail every directory, or every article syndication site, or every press release site, or everything that people can do in bulk...and it's harder to get all occurrences of widgets and mentions of infographics...so it's probably just a "Google scare"....ie, Google says, "Don't do it, nofollow them" (and I think they say that because it often works), and the less of a pattern there is, the harder it is for Google to catch it (ie, widgets and infographics)...I think too much of any 1 thing (be it a "type of link") can be a bad signal....as can things like "too many links from pages that get no traffic", or "no clicks from links to your site". In most cases, because of keyword abuse, Google doesn't want to count them...links like this may be fine (and ok to follow) in moderation...but if you have 1000 widget links, and they all have commercial keywords as link text, then you're treading on what could certainly turn into a negative signal, and so you might want to consider nofollowing those.
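For anyone unfamiliar with the mechanics, nofollowing a widget or infographic credit link just means adding `rel="nofollow"` to the embed markup so the link passes no PageRank; the URL and anchor text here are placeholders:

```html
<!-- Widget embed credit link, nofollowed so it carries no ranking signal -->
<a href="http://example-widget-maker.com/" rel="nofollow">Free Green Widgets</a>
```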

Aaron: There is a bit of a paradox in terms of scaling effective quality SEO services for clients while doing things that are not seen as scalable (and thus future friendly & effective). Can you discuss some of the biggest challenges you faced when scaling IMN? How were you able to scale to your current size without watering things down the way that most larger SEO companies do?

Jim: Scaling while keeping quality has certainly been a challenge in the past. I know that scaling content was an issue for us for a while....how can you scale quality content?....Well, we've found that by connecting real people, the real writers, the people with real social influence...and by taking these people and connecting them to the brands we work with.....these real people then become "Brand Evangelists"...and by getting these real people who know what they're talking about to write for our clients, well, when we did that we found that we could scale the content issue. We can scale things like link building by merging with the other "mentions", and specifically targeting industries and people and working on building up associations and relations with others has helped us scale...plus we're always building tools to help us scale while keeping quality. It's always a challenge, but we've been pretty good at solving many of those issues.

I think we've been really good at scaling in house....many content marketers are now more like community managers and content managers....we've been close to 100 employees for a few years now...so it's more about how we can do more with the existing people we have...and we've been able to do that by connecting real people to the clients so we can actually have better content and better marketing around that content....I'm really happy that the # of employees has been roughly the same for the past few years, but we're doing more business, and the quality keeps getting better....there aren't as many content marketers today as there were a few years ago, but there are many more people working on helping authors build up their authorship value and produce more "great marketing" campaigns where, as a by-product, we happen to get some links and social citations.

Aaron: One of the things I noticed with your site over the past couple years is the sales copy has promoted the fusion of branding and SEO. I looked at your old site in Archive.org over the years & have seen quite an amazing shift in terms of sales approach. Has Google squeezed out most of the smaller players for good & does effective sustainable SEO typically require working for larger trusted entities? When I first got into SEO about 80%+ of the hands in the audiences at conferences were smaller independent players. At the last conference I was at it seemed that about 80% of the hands in the audience worked for big companies (or provided services to big companies). Is this shift in the market irreversible? How would you compare/contrast approach in working with smaller & larger clients?

Jim: Today it's down to "Who really can afford to invest in their brand?" and "Who can do real things to get real citations from the web?"....and who can think way beyond "links"...if you can't do those things, then you can't have an effective, sustainable online marketing program.... we were a "link building company" for many, many years.... but for the past 3 years we've moved into full service, offering way more than "link building services".... yea, SEO was about "links" for years, and it still is to a large degree....but unless you want to get penalized, you have to take the "it's way more than links" approach... in order for SEO to work (w/o fear of getting penalized) today, you have to look at sending in natural signals...so thus, you must do "natural" things...things that will get others "talking" about it, and about you....SEO has evolved a lot over the years....Google used to recommend 1 thing (create a great site and create great things), but for years we all knew that SEO was about links and anchor text....today, I think Google has caught up (to some degree) with the user, and with "real signals"...yesterday it was "gaming" the system....today it's about doing real things...real marketing...and getting your name out to the community by creating great things that spread, and that get people to come back to your site....those SEOs and businesses who don't realize that the game has changed will probably be doing a lot of disavowing at some time in the future, and many SEOs will be out of business if they think it's a game where you can do "fake things" to "get links" in bulk....in a few years we'll see which internet marketing companies are still around...those who are still around will be those who do real marketing using real people and promoting to other real people...the link game itself has changed...in the past we looked at link graphs...today we look at people graphs....who is talking about you, what are they saying....it's way more than "who links to me, and how do they link to me"....Google is turning it into "everyone gets a vote", and "everyone has a value"...and in order to rank, you'll need real people of value talking about your site...and you'll need a great user experience when they get there, and you'll need loyal people who continue to return to your site, and you'll need to continue to do great things that get mentions....

SEO is no longer a game of some linking algorithm, it's now really a game of "how can you create a great user experience and get a buzz around your pages and brand".

Aaron: With as much as SEO has changed over the years, it is easy to get tripped up at some point, particularly if one is primarily focused on the short term. One of the more impressive bits about you is that I don't think I've ever seen you unhappy. The "I'm feeling lucky" bit seems to be more than just a motto. How do you manage to maintain that worldview no matter what's changing & how things are going?

Jim: Well, I don't always feel lucky...I know in 2008, when Google hit a few of our clients because we were buying links for them, I didn't feel lucky (though the day before, when they ranked #1, I felt lucky)....but I'm in this industry for the long term...I've been doing this for almost 15 years....and yes, we've had to constantly change over the years, and continue to grow, and growing isn't always easy...but it is exciting to me, and I do feel lucky for what I have...I have a job I love, I get to work with people whom I love, in an industry I love, I get to travel around the world and meet wonderful people and see cool places...and employ 100 people and win "Best Places to Work" awards, and I'm able to give back to the community and to society, and to the earth...those things make me feel lucky...SEO has always been like a fun game of chess to me...I'm trying to do the best I can with any move, but I'm also trying to think a few steps ahead, and trying to think what Google is thinking on the other side of the table.....ok...yea, I do feel lucky....maybe it's the old hippy in me...I always see the glass half full, and I'm always dreaming of a better tomorrow....

If I can have lots of happy clients, and happy employees, and do things to make the world a little better along the way, then I'm happy...sometimes I'm a little stressed, but that comes with life....in the end, there's nothing I'd rather be doing than what I currently do....and I always have big dreams of tomorrow that always make the trials of today seem worth it for the goals of what I want to achieve for tomorrow.

Aaron: Thanks Jim!


Jim Boykin is the CEO of Internet Marketing Ninjas, a blogger, and a public speaker. You can find Jim on Twitter, Facebook, and Google Plus.

Finding the Perfect Project Management & CRM Tools


Picking the right tools for project management and CRM functions can feel like an impossible task. I've gone through a number of applications in recent years (just about all of them actually). What makes choosing (or building) the right systems so difficult are the variables we all deal with in our respective workflows.

At some point in the SEO process a checklist doesn't suffice; intuition and experience come into play, and those traits require some intellectual flexibility.

You can build tasks and sub-tasks up to a certain level, but at some point you have to replace the task checklist with a free-form area to capture thoughts and ideas. Those thoughts and ideas can drive the future of the project, yet it's hard to foresee at the beginning of a project what tasks will result from this approach.

How to Determine What You Need

This is hard. You should have an idea of current needs and possible future needs. It sounds a bit easier than it is. You have to take a number of things into consideration:

  • Your current email, calendar, and document storage setups
  • Your and your staff's mobile devices and OSes
  • Features that you need
  • Features you might need
  • Reporting
  • Scalability of the platforms
  • Desire to rely on 3rd party integrations
  • Ability to manage multiple 3rd party integrations

Inside each of those items are more variables but for the most part these are the key areas to think about.

The hardest part for me was thinking about where I wanted to go. At one point or another I fell into the following categories:

  • Freelancer wanting to grow into an agency owner
  • Freelancer wanting to stay a freelancer
  • Wanting to exclusively work on my own properties
  • Wanting to exclusively focus on client work
  • Mixing client work and self-managed properties
  • Providing clients with more services vs focusing on a core service or two

When you run through those scenarios there are all sorts of tools that make sense, tools that don't make sense, and tools that kind of make sense. In addition to the categories I mentioned, there are also questions about how big you want to grow an agency.

Do you want a large staff? A small staff? Do you want to be more of an external marketer, or do you want to be more involved in day-to-day operations? Inside of those questions are lots of other intersections that can have a significant effect on how your workflow will be structured.

I'll give you some insight into how I determined my set up.

Putting Tools Through Their Paces

I do a mix of things for "work". I run some of my own properties, I have some clients, and I love SeoBook. In addition to this I've also been (slowly) developing a passive real estate investment company for a year (very slowly and pragmatically).

I spent quite a bit of time determining what I wanted to do and where I wanted to go and what made me the "happiest". I've been fortunate enough to be able to take the proper amount of time to determine these things without having to rush into a decision simply based on the need to make a buck.

So, I decided the following was best for me:

  • Work with select clients only
  • Have a small, focused team of great people
  • Continue developing our own web properties and products

Invariably, when you make these decisions you leave money on the table somewhere, and that's hard. Whether it's abandoning some short-term strategies that have a nice ROI, turning away clients based on a gut feeling, or just being maxed out on client work, it's still hard to leave the $ there.

What Are Your Core Products?

After deciding what I was going to do and the direction I was going to go, it was a relief to eliminate some potential solutions from the mix. Overly complicated CRMs like Zoho and Microsoft Dynamics were out (fine products, but overkill for me).

Determining the products and services that we would sell also helped narrow down the email, calendar, and document storage issue.

Sometimes a product is so core to your service that it has a significant influence on your choice of tools. I've been using Google Apps for Business for a while, and our use of Buzzstream cemented that choice. We've also used Exchange in the past, but it doesn't seem to play as nicely with Buzzstream as Google Apps does. Outreach is key for us, and no one does it better than Buzzstream.

Our other "products and services" are fairly platform independent so the next big thing to deal with was document and information management. However, before we chose a provider for this service we needed to determine what CRM/PM system fit our workflow the best.

In my opinion, document integration is a nice add-on but not 100% necessary if you keep things in one place and have a tight file structure. In a larger organization this might be different but a proper client/project folder set up is easy enough to reference without having to compromise on a CRM/PM solution.

CRM and PM Systems

A post covering everything I went through would be about 10,000 words long, but suffice it to say the most important things to me in these system evaluations were:

  • Ease of Use
  • Speed
  • Reliability
  • Task and Project Template functionality
  • Solid reporting features without overkill
  • Backup functionality
  • Scalability
  • OS agnostic

Compromises will be made when you place any amount of criteria against pre-built solutions. There was a period of time when we might have scaled up agency work, so I'll mention the tools that would have made that cut as well. We ended up settling on:

Using Asana


Asana accomplishes about 90% of what I need. It doesn't work in IE, which means it doesn't work very well on my Windows Phone, but in 5 years of dealing with 50+ clients and many internal projects I have yet to encounter a situation where I needed to check in from my phone or where it couldn't wait until I got in front of my computer. I have an iPhone for iOS testing, so in a pinch I could use that. Plus, you can have activity data emailed to your inbox, so you can see if the sky is falling either way.

Asana doesn't do backups really well; you have to export as JSON, but it's better than nothing. I have a project manager whom I trust, so I don't need to monitor everything, but I can quickly see the tasks assigned to her in case things are falling behind.
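One upside of a JSON export is that it's easy to sanity-check programmatically. A minimal Python sketch, where the structure below is a simplified assumption rather than Asana's exact export schema:

```python
import json

# Hypothetical, simplified stand-in for an Asana JSON export file.
# A real export would have many more fields per task.
sample_export = """
{
  "data": [
    {"name": "Draft outreach emails", "completed": false},
    {"name": "Publish client report", "completed": true}
  ]
}
"""

tasks = json.loads(sample_export)["data"]

# Pull out the tasks still open in the backup snapshot.
open_tasks = [t["name"] for t in tasks if not t["completed"]]
print(open_tasks)  # -> ['Draft outreach emails']
```

In practice you'd read the export with `json.load(open("backup.json"))` and could diff two snapshots to see what changed between backups.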

We don't assign tasks to other folks (outreachers, designers, programmers, etc.); we just let them do their thing. Asana also integrates with Dropbox and Google Drive if you need that kind of integration. Asana is also task/project only; there's no information storage like there is in something like Basecamp or TeamworkPM (for us, that's ok).

Alternative to Asana = TeamworkPM


The alternative I would recommend, if you have a larger team or just want more granular control over things (and also more reporting functions), would be TeamworkPM. It meets all my requirements and then some. I find it just as easy to use as Basecamp but far more robust, and it even makes using Gantt charts easy and fun :)

For us, it's too much, but it really is a nice product that makes scaling your work far easier than Basecamp does. In Basecamp you cannot see all the tasks assigned to everyone and their statuses; you have to click on each person to see their individual tasks. This makes multi-employee management cumbersome. TeamworkPM also has project and task templates, while Basecamp only has project templates.

I like the ability to create task list templates because many of our project requirements involve specific tasks not necessarily present on every single project, so having just project templates is far too broad to be effective.

In addition, Basecamp's file handling is poor and messy for our usage because:

  • There's no file versioning
  • You can't delete a file without deleting the conversation attached to a file (so you have to rename them)
  • No integration with any document service

TeamworkPM integrates with various services and also does file versioning in case you use a service they do not integrate with.

Using Pipeline Deals


PipelineDeals is dead simple to use. It meets just about all my requirements, and it has the most important integration a CRM can have: contact integration with my email application (Google Apps). It also has a nice Gmail widget that makes email and contact management between Gmail and PipelineDeals really slick.

We use RightSignature for document signing, and Pipeline integrates with that as well. It doesn't integrate with BidSketch, which is what we use for proposals, but that's ok. We don't do 20 quotes a week, so that level of automation is nice but not necessary.

PipelineDeals doesn't integrate with Asana either. Again, that's fine in our case; we don't need the CRM to speak to the PM system. It also does task templates, which are a big deal to me and our workflows. Reporting and mobile access are excellent as well, without being overly complicated.

Documents and Information

Before I get to what could be an all-in-one solution for CRM/PM, let's talk about documents and information.

I love the idea of easy information retrieval and not having to think about where to put things or where to look for things. There are a few core choices for document and information management to consider.

For more robust, enterprise-level needs we also considered SharePoint. It's pretty complex but very robust, and overkill for us.

Dropbox is excellent, except for collaboration. Conflicted file versions are a pain in the butt, but if you don't need any collaborative features it's a good solution. It syncs locally, stores native file types, and integrates with a lot of services.

Evernote is a solid tool for sharing text-based information, but I don't like it for files because it can't be your only file solution, and I'm interested in a file solution that handles all files.

Google Docs is a wonderfully collaborative document management solution and could probably handle 60-75% of our files. However, we do some custom stuff with Excel, Word, and videos, and not having the native file available for quick editing is a hassle.

Also, while emailing from Google Docs is a cool feature, it doesn't work if you are emailing inside of an existing conversation. If you email from inside Gmail you'll share a link to the file rather than the file itself, and many times we have to send a Word doc or Excel file, so we have to export from Google Docs to the proper file format and then email it.

Choosing SkyDrive


SkyDrive does what Dropbox does and what Google Docs does while maintaining the more widely accepted Office formats. We chose SkyDrive for this reason. It's OS agnostic and works across iOS, Android, and of course Windows Phone. For iOS and Android you need an active Office 365 subscription. On iPads you would still access it via the browser, though I believe an iPad version of what's on the iPhone now isn't far off at all.

We use SkyDrive for project files, reference files, and collaborative files for site/project strategy. This leaves email correspondence with clients as the remaining piece of the information puzzle. CRM email storage is great for pre-sales, up-sells, and billing correspondence, but what about project-related email?

Project-Related Emails

Most PM solutions allow you to email a message to a client from your PM interface and continue the correspondence there. This is great until someone starts adding other bits of information to an email (not everyone sticks to the subject line :D), and it quickly becomes unruly.

Probably the most tried and true solution is to either keep all email correspondence (and notes from calls) in a CRM and label the notes appropriately, or try to document project-related stuff in a project notebook or message. Asana doesn't have this option, but TeamworkPM does.

My preference is to just keep that stuff in the CRM for easy reference, but for larger teams I'd go with keeping it in the CRM + summarizing it in the PM system.

There's another solution though. There's a product out there that combines CRM/PM into one app and makes keeping information together fairly simple.

Considering Insightly.com


Insightly is a pretty robust and affordable CRM/PM solution. Its email dropbox allows you to keep emails stored for quick reference across projects and contacts.

The reason it can handle emails in this way is its unique linking relationships. You can link a contact and/or an organization to a project (and to multiple projects).

You can easily see all projects associated with X, but what's even more powerful is that you can link vendors to projects too. When you BCC your project dropbox, it will link the email to the participants on the project, and there's an "Email" tab in the project interface so you can see all the relevant emails for that project, whether they're with a client, vendor, staff member, etc.

If we were to move into a more client-facing company Insightly would merit strong consideration for its unique ability to easily keep all related information together.

Is Automation Overrated Sometimes?

I like automation, to an extent. I like syncing 2 apps together directly. There's a service out there called Zapier which does a great job of linking otherwise incompatible services together. My hesitation here is relying on too many "parties" to accomplish tasks.

Automation is wonderful, really, but I would recommend sitting down and thinking about what automation you really need, how helpful it will really be, and what happens if a 3rd, 4th, or 5th party goes under.

For me, an example would be when I was considering Highrise.

  • Contacts sync provided by a third-party
  • Task templates provided by another third-party
  • Document integration provided by another third-party

I'm hesitant to rely on these extra services for core functionality because these functions are crucial for my business. There could be situations where those services get abandoned, an API changes and you're waiting for a fix, and so on.

There are plenty of services that offer integration between core apps like contacts, billing, time tracking, quoting, and so on. I just think it's wise to consider very carefully what you are relying on for core functionality; if you have to go outside of your chosen application too much, it might be time to consider a new one.

Compromises and Moving Forward

If you choose any pre-built solution, you're probably going to have to accept some compromises. I have found that structure really matters, and that easy access to information, data, and task progress is more important than features and "options".

I think having too many services inside of your operation is a hindrance to being as productive and efficient as possible. Knowing where to look and why to look is half the battle. If you're running multiple project management solutions, multiple document management solutions, and so on then you might want to consider more efficient ways to handle your operation.

Without going through this process multiple times over the years there is no way I would have been able to stay as lean as possible while being as efficient as possible. Doing both of those things correctly usually leads to:

  • Happier clients
  • A more productive work environment
  • A more profitable business

The Benefits Of Thinking Like Google

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google’s negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purpose of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or at the very least no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier: In response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", Ralph writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to set aside any SEO bias, i.e. what we may like to see from an SEO standpoint, and instead try to look at the world from Google’s point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of what I show above the fold is content controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google aren’t the webmaster's friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would be madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google link quality control. For free. They’re pointing out the links they see as being “bad” - links Google’s algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner's perspective, taking links down involves a time cost, so there is no benefit to the site owner in doing so, especially if they receive numerous requests. Secondly, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place, for the purposes of achieving higher rank, are going to need to be no-followed in order to be clear about intent. Extreme? What's going to be the emphasis in six months' time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would be unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure that what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at any time, whether you follow the current guidelines or not, especially when the goal-posts keep moving. So, the risk of not following the guidelines and the risk of following the guidelines but not ranking well are pretty much the same - no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Winning Strategies to Lose Money With Infographics

Google is getting a bit absurd with suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed to have a negative ROI on each investment - until your company goes under.

Some will describe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.
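To make that back-of-the-envelope arithmetic explicit, here's a minimal sketch (the $2,000 cost and roughly 1-in-10 hit rate are the ballpark figures from the paragraph above, not measured data):

```python
# Expected cost per successful (viral) infographic,
# using the rough figures cited above.

cost_per_infographic = 2_000  # research, creation, revision & promotion
success_rate = 1 / 10         # roughly 1 in 10 goes anywhere

# On average you pay for 1/success_rate attempts per hit.
cost_per_success = cost_per_infographic / success_rate

print(f"~${cost_per_success:,.0f} per successful infographic")  # ~$20,000
```

And that figure only climbs once inexperience, poor strategy, or bad timing lowers the hit rate further.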

If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if your own featured premium content that you dropped 4 or 5 figures on gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find a bit disingenuous the claim that people putting something that is 20,000 pixels large on their site are not actively vouching for it. If something were crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I are spammers, but put the content on Twitter with 143,199 Tweets in a second & those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure, there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are deemed spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & many others, as DennisG highlighted.

Oh, and here's an infographic for your pleasurings.


Google: Press Release Links

So, Google have updated their Webmaster Guidelines.

Here are a few common examples of unnatural links that violate our guidelines:....Links with optimized anchor text in articles or press releases distributed on other sites.

For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text is “optimized” and therefore a violation of Google’s guidelines.

Ambiguously speaking, of course.

To put the press release change in context, Google’s guidelines state:

Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site

So, links gained, for SEO purposes - intended to manipulate ranking - are against Google Guidelines.

Google vs Webmasters

Here’s a chat...

In this chat, Google’s John Mueller says that, if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on such links.

Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.

Simple, right? If it really is a press release, and not an attempt to build links for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?

He/she wouldn't.

But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.

They're missing the point.

Am I Taking Google’s Side?

Not taking sides.

Just hoping to shine some light on a wider issue.

If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but it is sadly lacking in much SEO punditry - that Google is not on the webmaster's side. Google is on Google's side. Google often say they are on the user's side, and there is certainly some truth in that.

However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more “aggressive” webmasters, Google’s behaviour could be characterized as “keep your friends close, and your enemies closer”.

This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click based advertising that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.

If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.

Why don't Google simply ignore press release links? Easy enough to do. Why go this route of making it public? After all, Google are typically very secretive about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.

Big missiles heading your way.

Guideline Followers

The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.

If an SEO is not following the guidelines - guidelines that are always shifting - yet claim they do, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO

.....but it’s not unreasonable to expect a somewhat easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed it.

SEO is not the easiest career choice, huh.

One group that is likely to be happy about this latest Google push is legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:

I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.

Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results

However, I’m not sure that will mean press releases are seen as any more credible, as press releases never enjoyed a stellar reputation pre-SEO, but it may thin the crowd somewhat, which increases an agency's chances of getting their client seen.

Guidelines Honing In On Target

One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:

we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?

….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.

It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.

Take a look at the Boston Globe, sold along with a boatload of content for a 93% decline. Quality content sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.

High quality content often has a high upfront production cost attached to it, and given measly web advertising rates, the high possibility of invisibility, and the likelihood of getting content scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?

To not do so is also risky.

Even eHow, well known for cheap factory line content, is moving toward subscription membership revenues.

The Somewhat Bigger Question

Google can move the goal-posts whenever they like. What you’re doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.

Do you have a contingency plan for such an eventuality?

Johnon puts it well:

"The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic."

To ask about the minutiae of Google’s policies and guidelines is to miss the point. The real question is how prepared you are for when Google shuts off your flow of traffic because they’ve reset the goal-posts.

Focusing on the minutiae of Google's policies is, indeed, to miss the point.

This is a question of risk management. What happens if your main site, or your client's site, runs afoul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don’t rank, isn’t that effectively the same as a ban, i.e. you’re invisible? Do you treat search traffic as a bonus, rather than the main course?

Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.

Google Keyword(Not Provided): High Double Digit Percent

Most Organic Search Data is Now Hidden

Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).

What hasn't received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse of their market monopoly in search.

The site which provided a count for (not provided) recently displayed over 40% of queries as (not provided), but that percentage didn't include the large percent of mobile search users that were showing no referrals at all & were showing up as direct website visitors. On July 30, Google started showing referrals for many of those mobile searchers, using keyword (not provided).

According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click prices. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers were opted into mobile ads (and have to go through some tricky hoops to figure out how to disable it), Google has every incentive to promote what a big growth channel mobile search is for their business.

Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).

Hiding The Value of SEO

Google is not only hiding half of their own keyword referral data, but they are hiding so much more than half that even when you mix in Bing and Yahoo! you still get over 50% of the total hidden.

Google's 86% of the 26,233 searches is 22,560 searches.

Keyword (not provided) being shown for 13,413 is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
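Walking through those figures as a quick sketch (these are the four-day sample numbers quoted above from one set of sites, not a general benchmark):

```python
# Share of organic Google keyword data hidden as (not provided),
# using the four-day sample figures quoted above.

total_search_visitors = 26_233
google_share = 0.86        # Google's share of the search referrals
not_provided = 13_413      # visits shown as keyword (not provided)

google_visits = round(total_search_visitors * google_share)  # ~22,560
hidden_share = not_provided / google_visits                  # ~59%

print(f"Google search visits: {google_visits:,}")
print(f"Hidden keyword share: {hidden_share:.0%}")
```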

Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.

Buy AdWords ads and the data gets sent. Rank organically and most of the data is hidden.

When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.

Subsidizing Scammers Ripping People Off

A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over the time period. Looking at the backlinks, nothing there either.

So what happened?

Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot to show keyword (not provided) Google visitors & claim that they were "doing SEO."

And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.

Or something like that:

Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.

This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.

Anti-competitive Funding Blocking Competing Ad Networks

John Andrews pointed to Google's blocking (then funding) of AdBlock Plus as an example of their monopolistic inhibiting of innovation.

sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG

Turn AdBlock Plus on & search for credit cards on Google and get ads.

Do that same search over at Bing & get no ads.

How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?

They can't.

Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.

Buying Browser Marketshare

Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.

If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.

When Google does it, it's "good for the user."

Vampire Sucking The Lifeblood Out of SEO

Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Login to Chrome & searchers don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."

What do we call an entity that considers something "its lifeblood" while sucking it out of others?

What Is Your SEO Strategy?

How do you determine your SEO strategy?

Actually, before you answer, let’s step back.

What Is SEO, Anyway?

“Search engine optimization” has always been an odd term as it’s somewhat misleading. After all, we’re not optimizing search engines.

SEO came about when webmasters optimized websites. Specifically, they optimized the source code of pages to appeal to search engines. The intent of SEO was to ensure websites appeared higher in search results than if the site was simply left to site designers and copywriters. Often, designers would inadvertently make sites uncrawlable, and therefore invisible in search engines.

But there was more to it than just enhancing crawlability.

SEOs examined the highest ranking page, looked at the source code, often copied it wholesale, added a few tweaks, then republished the page. In the days of Infoseek, this was all you needed to do to get an instant top ranking.

I know, because I used to do it!

At the time, I thought it was an amusing hacker trick. It also occurred to me that such positioning could be valuable. Of course, this rather obvious truth occurred to many other people, too. A similar game had been going on in the Yahoo Directory where people named sites “AAAA...whatever” because Yahoo listed sites in alphabetical order. People also used to obsessively track spiders, spotting fresh spiders (Hey Scooter!) as they appeared and....cough......guiding them through their websites in a favourable fashion.

When it comes to search engines, there’s always been gaming. The glittering prize awaits.

The new breed of search engines made things a bit more tricky. You couldn’t just focus on optimizing code in order to rank well. There was something else going on.

Backlinks.

So, SEO was no longer just about optimizing the underlying page code, SEO was also about getting links. At that point, SEO jumped from being just a technical coding exercise to a marketing exercise. Webmasters had to reach out to other webmasters and convince them to link up.

A young upstart, Google, placed heavy emphasis on links, making use of a clever algorithm that sorted “good” links from, well, “evil” links. This helped make Google’s result set more relevant than other search engines. Amusingly enough, Google once claimed it wasn’t possible to spam Google.

Webmasters responded by spamming Google.

Or, should I say, Google likely categorized what many webmasters were doing as “spam”, at least internally, and may have regretted their earlier hubris. Webmasters sought links that looked like “good” links. Sometimes, they even earned them.

And Google has been pushing back ever since.

Building links pre-dated SEO, and search engines, but once backlinks were counted in ranking scores, link building was blended into SEO. These days, most SEOs consider link building a natural part of SEO. But, as we've seen, it wasn’t always this way.

We sometimes get comments on this blog about how marketing is different from SEO. Well, it is, but if you look at the history of SEO, there have always been marketing elements involved. Getting external links could be characterized as PR, or relationship building, or marketing, but I doubt anyone would claim getting links is not SEO.

More recently, we’ve seen a massive change in Google. It’s a change that is likely being rolled out over a number of years. It’s a change that makes a lot of old school SEO a lot less effective in the same way introducing link analysis made meta-tag optimization a lot less effective.

My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.

Google is likely applying different algorithms to different sectors, so the SEO tactics used in one sector don’t work in another. They’re also looking at engagement metrics, so they’re trying to figure out if the user really wanted the result they clicked on. When you consider Google's work on PPC landing pages, this development is obvious. It’s the same measure. If people click back often, too quickly, then the landing page quality score drops. This is likely happening in the SERPs, too.
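The click-back idea can be made concrete with a toy metric. This is purely a conceptual sketch; Google's actual engagement signals are not public, and the ten-second cutoff below is an arbitrary assumption:

```python
# Conceptual sketch only: Google's real engagement signals are not public.
def pogo_stick_rate(dwell_times, threshold=10):
    """Share of clicks that bounced back to the SERP quickly.

    dwell_times: seconds each visitor stayed before returning.
    threshold:   assumed cutoff for a "too quick" return (hypothetical).
    """
    quick_returns = sum(1 for t in dwell_times if t < threshold)
    return quick_returns / len(dwell_times)

# Two of four visitors bail almost immediately.
print(pogo_stick_rate([3, 45, 120, 5]))  # 0.5
```

A page where half the visitors pogo-stick back to the results within seconds is, under this toy model, a page whose "quality score" should fall.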

So, just like link building once got rolled into SEO, engagement will be rolled into SEO. Some may see that as a death of SEO, and in some ways it is, just like when meta-tag optimization, and other code optimizations, were deprecated in favour of other, more useful relevancy metrics. In other ways, it's SEO just changing like it always has done.

The objective remains the same.

Deciding On Strategy

So, how do you construct your SEO strategy? What will be your strategy going forward?

Some read Google’s Webmaster Guidelines. They'll watch every Matt Cutts video. They follow it all to the letter. There’s nothing wrong with this approach.

Others read Google’s Guidelines. They'll watch every Matt Cutts video. They read between the lines and do the complete opposite. Nothing wrong with that approach, either.

It depends on what strategy you've adopted.

One of the problems with letting Google define your game is that they can move the goalposts anytime they like. The linking that used to be acceptable, at least in practice, often no longer is. Thinking of firing off a press release? Well, think carefully before loading it with keywords:

This is one of the big changes that may not have been so clear for many webmasters. Google said, “links with optimized anchor text in articles or press releases distributed on other sites,” is an example of an unnatural link that violates their guidelines. The key is the examples given and the phrase “distributed on other sites.” If you are publishing a press release or an article on your site and distribute it through a wire or through an article site, you must make sure to nofollow the links if those links are “optimized anchor text.”

Do you now have to go back and unwind a lot of link building in order to stay in their good books? Or, perhaps you conclude that links in press releases must work a little too well, else Google wouldn’t be making a point of it. Or conclude that Google is running a cunning double-bluff hoping you’ll spend a lot more time doing things you think Google does or doesn’t like, but really Google doesn’t care about at all, as they’ve found a way to mitigate it.

Bulk guest posting was also included in Google's webmaster guidelines as a no-no, along with keyword-rich anchors in article directories. Even how a site monetizes, by doing things like blocking the back button, can be considered "deceptive" and grounds for banning.

How about the simple strategy of finding the top-ranking sites, doing what they do, and adding a little more? Do you avoid saturated niches, and aim for the low-hanging fruit? Do you try and guess all the metrics and make sure you cover every one? Do you churn and burn? Do you play the long game with one site? Is social media and marketing part of your game, or do you leave these aspects out of the SEO equation? Is your currency persuasion?

Think about your personal influence and the influence you can manage without dollars or gold or permission from Google. Think about how people throughout history have sought karma, invested in social credits, and injected good will into their communities, as a way to “prep” for disaster. Think about it.

We may be “search marketers” and “search engine optimizers” who work within the confines of an economy controlled (manipulated) by Google, but our currency is persuasion. Persuasion within a market niche transcends Google.

It would be interesting to hear the strategies you use, and whether you plan on using a different strategy going forward.

Authority Labs Review

There are quite a few rank tracking options on the market today and selecting one (or two) can be difficult. Some have lots of integrations, some have no integrations. Some are trustworthy, some are not.

Deciding on the feature set is tough enough but you also need to take into account who is storing your data. Can you trust that person or company? Will they use your aggregate data in a blog post (which is a signal that they are using your data for their own gains) or use your data to out a client of yours? Decisions, decisions...

What I Use

I use and recommend 2 services; one is web-based and one is software-based (where I have full control over the data). The software version is quite robust and has many integrations and options (that you may not need). This review covers my recommended web-based platform, Authority Labs.

I use Authority Labs for most rank checking reports and I find it to be a wonderfully powerful web-based tool that is super easy to use. My recommended software package is Advanced Web Ranking. AWR is what I use for really in-depth analysis of pretty much everything (rankings, analytics, links, competitive analysis, etc.). If you are interested in learning a bit more, check out our Advanced Web Ranking review.

In-depth analysis doesn't need to occur every day, but an overview of overall ranking health does. Daily, aggregate spot checks will help you spot large-scale changes quickly. Being consistently proactive with your clients and your own sites is quite a bit better than always being reactive.

Benefits of the Two Tool Approach

The benefits of this approach are that I get a locally-owned copy of my data and all the options I'd ever need while getting a reliable, hassle-free web-based copy that updates daily and is really easy to report on and/or give clients access to ranking reports if needed.

Some clients require more in-depth reporting as a whole and you should strive to make yourself way more valuable than just a ranking report hand-off company, but if you are rolling your own reports and mashing data together then Authority Labs can really make your life quite a bit simpler.

With Authority Labs and Advanced Web Ranking I get the best of both worlds and redundancy. It's a beautiful thing.

What Does Authority Labs Do?

It's a rank tracking application, plain and simple. Some of the main features I use most of the time are:

  • Tracking keywords daily
  • Tracking Google, Bing, and Yahoo SERPs
  • Viewing estimated search volume (via a bar graph) for your tracked keywords
  • Selecting a location all the way down to the zip code
  • Viewing daily ranking charts for a selected keyword
  • Exporting PDF reports for monthly, weekly, quarterly, *since added* date, and/or daily comparison reports
  • Comparing rankings against a competitor
  • Sharing a public URL with a client for their project
  • Exporting one domain or an entire account history in CSV format as part of a backup process
  • Producing white label reports

The local feature is quite nice as well. It will track as if the search is occurring in that particular location (obviously really, really helpful for locally based keywords).

Another feature that I really like is the "results type" column:

authority-labs-resultstype-column

In this column, which appears next to the keyword, it will show you if any of the following items appeared in that SERP:

  • Image results
  • News results
  • Video results
  • Shopping results
  • Snippets
  • Google Places results

There are some other solid features as well but the ones mentioned above are some of my favorites.

Working with Domains

Authority Labs gives us the ability to do a few nifty things with domains. We can:

  • Group domains
  • Sync domains
  • Tag domains

In order to understand how best to use the domain categorization features we have to understand how domain tracking works in the application. You can utilize specific URL, subdomain, or root domain tracking and also introduce wildcards to track more in-depth site structures.

Some general rules of thumb:

  • If you choose a subdomain it will not track the root and beyond, only what's housed under the sub-domain structure
  • If you choose a root domain, it will track sub-domains and sub-pages across the root and any sub-domains
  • If you add just a site.com/folder it will only track that folder and down
  • If you add just a site.com/folder/page it will track just that page
  • If you use a wild card like site.com/wildcard/something it will track anything on the root and on any sub-domains that have "something" as a folder or page name preceded by a category or folder
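Read literally, those rules define a scoping function. The sketch below is a hypothetical re-implementation (the real matching happens inside Authority Labs, wildcard entries are omitted, and the "one dot means root domain" heuristic is a simplification that ignores TLDs like .co.uk):

```python
from urllib.parse import urlsplit

def in_tracking_scope(tracked: str, result_url: str) -> bool:
    """Hypothetical sketch of the scoping rules above (wildcards omitted).

    `tracked` is the entry as typed: "site.com", "blog.site.com",
    "site.com/folder", or "site.com/folder/page".
    """
    t_host, _, t_path = tracked.lower().partition("/")
    parts = urlsplit(result_url if "://" in result_url else "//" + result_url)
    r_host, r_path = parts.netloc.lower(), parts.path.strip("/")
    t_path = t_path.rstrip("/")

    # One dot = root domain (simplification): a root entry also matches
    # subdomains; an entry with extra labels matches only that exact host.
    if t_host.count(".") == 1:
        host_ok = r_host == t_host or r_host.endswith("." + t_host)
    else:
        host_ok = r_host == t_host
    if not host_ok:
        return False

    if not t_path:                  # bare host: everything on it is in scope
        return True
    if "/" in t_path:               # "folder/page": that exact page only
        return r_path == t_path
    # "folder": the folder and anything beneath it
    return r_path == t_path or r_path.startswith(t_path + "/")
```

So, per the rules, `in_tracking_scope("site.com", "blog.site.com/page")` holds, while `in_tracking_scope("blog.site.com", "site.com/page")` does not.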

You can tag to your heart's content but it can get a bit unwieldy, so I'd recommend using the solution that works best for your set up.

Personally I like the ability to use grouping to group my sites/client sites and competing sites while relying on tagging to determine whether it's a client site, a site I own, and the market it is in (finance, ecommerce, SEO, whatever). This way I can quickly see a client-specific group, my own sites separated out, and then drill down into a particular market/core keyword to see the competing sites and such. Remember that when you sync domains together to track against each other, they reside in their own group.

I also like the ability to group domains that might not be a direct competitor and tag them as "watch" just to track their growth and then try and reverse engineer the strategy.

These options offer a lot of flexibility and there's no real wrong way to use them. I'd just recommend really thinking through how you want to organize things prior to moving things around in the interface.

Adding a Domain

Adding a domain is simple. Click on domains and then add a domain in the sidebar on the left:

authority-labs-add-domain-dialog-sidebar

From here, you add the domain (or subdomain, page, domain with wildcard, etc) and select what engines to track, what options to show, and what location (if any) to search from:

authority-labs-add-new-domain

Next up is adding the keywords to the domain, up to 25 at a time (otherwise you should use the import function):

authority-labs-adding-keywords-to-domain

After I add a site my workflow usually is to add tags to the domain, add competing sites, then group them. So here is the ranking interface of a specific domain:

authority-labs-domain-interface

You can see in the upper left where you can add tags, the link in the upper right is for the publicly shareable link, the paper icon is for a PDF report of what you see on the screen + time frame selected, and you can see where you can filter keywords by tag or name.

The time frames available:

  • Compared to previous day
  • Compared to previous week
  • Compared to previous month
  • Compared to 3 months ago (quarterly)
  • Compared to date added

So then I'd add the competing sites in the same way, except with different tags. Keep in mind that when you want to add competing sites with yours they have to have the same keyword sets (no more, no less).

If you want to do a lot of specific competitor tracking across the entire breadth of your site's keywords then you can utilize the grouping and tagging features mentioned above to split them off into relevant buckets. Keep in mind that any synced domains will also belong to their own group (the domains that are synced are grouped together).

When you are in the domain interface you can see average rank based on time period selected (same as time periods above) and filter by domain name and tag:

authority-labs-domain-landing-page-top

Here's what a domain looks like in this overview area:

authority-labs-average-rank

Overall average rank is 7 for all the keywords and +6 since last week (as that is the time frame selected).
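That overview number is just the mean of the individual keyword ranks, and the weekly change is the difference between the two means. A quick illustration with made-up keywords and ranks that happen to reproduce the numbers above:

```python
from statistics import mean

# Made-up ranks for illustration only.
last_week = {"blue widgets": 12, "red widgets": 14, "green widgets": 13}
this_week = {"blue widgets": 4,  "red widgets": 9,  "green widgets": 8}

avg_now = mean(this_week.values())            # 7
change = mean(last_week.values()) - avg_now   # +6: up six spots on average

print(avg_now, change)
```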

Grouping and Syncing Domains

After adding the domains, their keywords, and tagging them you can then group them as needed. Back on the domain overview page you can see my ungrouped domains for this particular review:

authoritylabs-ungrouped-domains

To group or sync them just check off the boxes and click group in the left sidebar.

authority-labs-sync-or-group-domains

Once they are synced you just go back to the domain overview, click on the group name where the domains are synced, and you get the keywords side by side with the synced domains.

Working with Keywords

To add up to 25 keywords to a project just get into the domain and click add keywords on the left. If you need to bulk upload keywords you can click on the bulk upload button and the instructions are there for you:

authority-labs-keyword-mgmt

If you click on a keyword the tag dialog comes up on the left. If you have a large keyword list and you aren't using the domain strategies mentioned above, tagging keywords certainly makes sense.

You can also filter keywords by tags and keyword names (just the keyword itself).

Another thing you can do with keywords is to click on the graph to the left of the keyword to see a daily history over the course of a month, 3 months, 6 months, and 1 year.

You can click on multiple keywords to graph them together. This is helpful when diagnosing ranking nosedives (or upticks of significance). If you are tracking multiple engines you can switch between them, too.

Reporting

There are 3 types of reporting options available:

  • PDF
  • Excel
  • Shareable URL

PDFs are available in the upper right of the domain landing page and the report will show the changes relative to the time frame selected on the screen. Again, the time frames available are:

  • Compared to previous day
  • Compared to previous week
  • Compared to previous month
  • Compared to 3 months ago (quarterly)
  • Compared to date added

If you want to compare a specific date range outside of the above, you'll get an excel download. This is something I hope they can update in the future to be a bit more robust with PDF reporting.

The excel download is really just an export (as described in the next section) for a specific time period with day by day numbers. So if you exported for a 30 day period you'd get the rank for each keyword on each day in Excel format.
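If you want numbers out of that export without eyeballing a spreadsheet, a few lines of Python will do it. The column layout below (one row per keyword, one column per day) is an assumption for illustration, so check it against your actual export:

```python
import csv
import io

# Assumed export layout; verify against a real Authority Labs CSV.
raw = """keyword,2013-06-01,2013-06-02,2013-06-03
blue widgets,12,10,9
red widgets,7,7,6
"""

def rank_changes(csv_text):
    """Map each keyword to (latest rank, positions gained since day one)."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    changes = {}
    for row in rows[1:]:
        ranks = [int(cell) for cell in row[1:]]
        # Positive change = the keyword moved up (its rank number fell).
        changes[row[0]] = (ranks[-1], ranks[0] - ranks[-1])
    return changes

print(rank_changes(raw))
```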

You can also white label reports, which is standard in just about all rank tracking/reporting applications.

Importing and Exporting

Currently you can only import keywords, as described above; you cannot yet import historical data from another application (though they did offer a Raven import back when Raven shut down its rank tracking).

Exporting is easy, you can choose 1 domain or all domains and a specific time frame:

auth-labs-export

Whatever date range you select here will result in day by day ranking positions (the excel report mentioned above). This is one (kind of clunky) way to compare specific dates. In fairness, the date ranges they give you for onsite viewing and PDF downloads really do cover a good percentage of the date ranges you'd need to figure out what was going on. Still, it would be nice to have more granular comparison options.

Access Levels

You can do the following with access levels:

  • Add someone to your team (they get access to selected domains in your account)
  • If you add them as an Admin they can manage the entire account
  • Create a new "Team" and give that team access to specific domains only and add people to that team only (great for clients)

Wrapping Up

Another great feature is that 1 keyword only counts once even if you are tracking competitors with those keywords and using the 3 engines. This really makes it cost-effective to track pretty much everything you want to track.

There are some improvements that I'd like to see (analytics integration, link integration, and some more granular reporting options) but for a web-based rank tracker Authority Labs is my tool of choice.

Give them a try, they have a generous 30 day free trial with rather solid pricing.
