My Must Have Tools of 2014

There are a lot of tools in the SEO space (sorry, couldn't resist :D) and over the years we've seen tools fall into two broad categories: tools that aim to do just about everything, and tools that focus on one discipline of online marketing.

As we continue to lose more and more data (not provided) and the data we do have access to becomes less reliable (rankings, competitive research data, data given to us by search engines, etc.), one has to wonder: at what point does access to a variety of tools start producing diminishing returns?

In other words, if you are starting with unreliable or very, very inexact data does layering more and more extrapolations on top make you worse off than you were before? Probably.

I do think that a fair number of tool usage scenarios have become less effective (or less necessary) at this point. Consider what were once the cornerstones of industry research and data:

  • Rankings
  • SERP difficulty analysis
  • Link prospecting
  • Competitive link research
  • Analytics

Each one of these areas of data has really taken a beating over the last 2-3 years thanks to collateral damage from broad-reaching, unforgiving Google updates, the loss of actual keyword data, the less obvious relationship between links and rankings, personalized search, various SERP layout changes, and on and on.

I believe the best way forward for evaluating which tools you should be using is to determine which tool does X best, to the point where supplementing it with data from a similar provider is overkill and not worth the extra monthly subscription cost or the cognitive overhead.

Which Ones to Choose?

Well, this certainly depends on what you do. I'm going to focus on the small to mid-size agency market (which also includes freelancers and folks who just operate their own properties), but for those teetering on mid-to-large size I'll make 2 recommendations based on personal experience:

If I were operating a bigger agency I'd strongly consider both of those. They both do a really solid job of providing customized reporting and research modules.

For the rest of us, I'll share what I'm using as a basis for my recommendations with reasons why I selected them.

These tools are focused on what I do on a daily basis and are the ones I simply cannot live without. They cover:

  • Reporting
  • Competitive Link & Keyword Research
  • Keyword Research
  • PR and Outreach

    Advanced Web Ranking

    This is the tool I rely on the most. It does just about everything with the only drawbacks being the learning curve and that it is desktop software. The learning curve payoff is very much worth it though. This tool does the following for me:

    • Reporting for pretty much every aspect of a campaign
    • Interfaces with Majestic SEO for link data as well as data from Moz for link research and tracking
    • Connects to social accounts for reporting
    • Site audit crawls
    • Interfaces with Google Analytics
    • Keyword research
    • Competitor analysis
    • Rankings
    • On-page analysis

    They have a cloud version for reporting and I believe that in the near future a good amount of this functionality will go to its cloud service. This tool is highly recommended.

    Advanced Web Ranking - here's a basic overview of the software from a few years ago, though it has been updated a number of times since then

    Ahrefs

    I remember when this was for sale on Flippa! I find Ahrefs to be very reliable and easy to use. They have added quite a few features over the past year and, in my opinion, they are right up there with Majestic SEO when it comes to relevant, deep link data.

    Their interface has improved dramatically over time and the constant addition of helpful, new features has left other tools playing catchup. I'm hoping to see more integration with them in 2014 via services like Raven and Advanced Web Ranking.

    Ahrefs.com - here's a review from last year (though they no longer offer the SERP tracking feature they offered back then)

    Authority Labs

    The most accurate and stable online rankings provider I've used thus far. The interface has improved recently as has the speed of exports. I would still like to see a bulk PDF export of each individual site in the near future but overall my experience with Authority Labs has been great.

    I use it as a stable, online, automated rank checker to supplement my data from Advanced Web Ranking. It also has some nice features, like tracking rankings from a given zip code and showing what else it encounters in the SERP (videos, snippets, etc.).

    Authority Labs - here's a review from 5 months ago

    Buzzstream

    Buzzstream is an absolute must have for anyone doing PR-based and social outreach. The email integration is fantastic and the folks that help me with outreach routinely rave about using Buzzstream.

    The UI has improved considerably recently and the customer support has been excellent for us. I'm positive that our outreach would not be nearly as effective without Buzzstream, and there really isn't a competing product out there that I've seen.

    This is a good example of a really niche product that excels at its intended purpose.

    Buzzstream - here's a review from a couple years ago

    Citation Labs Suite

    We use the Contact Finder, Link Prospector, and Broken Link Building tool inside our prospecting process. Much like Buzzstream this is a suite of tools that focuses on a core area and does it very well.

    You have to spend some time with the prospector to get the best queries possible for your searches but the payoff is really relevant, quality link prospects.

    Citation Labs - here's a review from a couple years ago

    Link Research Tools

    While LRT is primarily known for its Link Detox tool, this set of tools covers quite a bit of the SEO landscape. I do not use all the tools in the suite but the areas that I utilize LRT for are:

    • Link cleanup
    • Link prospecting
    • SERP competition analysis
    • Competitive domain comparisons

    It's missing a few pieces but it is similar to Advanced Web Ranking in terms of power and data. LRT hooks into many third party tools (Majestic, SemRush, Moz, etc) so you get a pretty solid overview, in one place, of what you need to see or want to see.

    The prospecting is similar, to an extent, to what Citation Labs offers, but you can apply specific SEO metrics to prospect filtering as well as base it on links that appear across multiple sites in a given SERP.
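
    To make that co-occurrence idea concrete, here is a rough, hypothetical sketch (not LRT's actual implementation; the data, names, and threshold below are made up for illustration): given the linking domains for several sites ranking in a SERP, keep only prospects that link to at least a minimum number of those competitors.

    ```python
    # Hypothetical sketch of co-occurrence based prospect filtering.
    # Not LRT's implementation; the data and threshold are made up.
    from collections import Counter

    def co_occurring_prospects(backlinks, min_overlap=2):
        """backlinks maps each ranking site to the set of domains linking to it.
        Returns domains that link to at least `min_overlap` of those sites."""
        counts = Counter(domain
                         for linking_domains in backlinks.values()
                         for domain in linking_domains)
        return sorted(domain for domain, n in counts.items() if n >= min_overlap)

    # Made-up example data:
    serp_backlinks = {
        "competitor-a.com": {"hub1.com", "blogx.net", "newsy.org"},
        "competitor-b.com": {"hub1.com", "newsy.org", "forumz.io"},
        "competitor-c.com": {"hub1.com", "randomsite.biz"},
    }

    print(co_occurring_prospects(serp_backlinks))  # ['hub1.com', 'newsy.org']
    ```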

    LinkResearchTools

    Majestic SEO

    Majestic is still the de facto standard for deep link data (both fresh and historical). They recently launched a new feature called Search Explorer, which is designed to be a specialized search engine devoid of personalization and whatnot, showing rankings based on its interpretation of the web graph and how influential a site is for a given term.

    As of this writing, Search Explorer is in alpha, but it does appear to be a really solid innovation from Majestic. The other reason for having a Majestic subscription is to get access to its API so you can integrate the data however you choose. I use the API inside of LRT and Advanced Web Ranking.

    Majestic SEO - here's a review from a couple years ago by Julie Joyce

    Moz

    I use Moz mainly for access to its link data via Advanced Web Ranking. Compared to the other tools I use, I do not see a ton of value in the rest of its tool suite, and I also get data from it via my Raven subscription (which is where I tend to do a fair bit of research).

    If you are on a tight budget it's worthy of consideration for the breadth of tools the subscription offers but I think you could get better options elsewhere if you have some budget to spread around.

    Moz

    Raven Tools

    I don't use every single feature in Raven but I find Raven to be one of the most well-executed, stable tool suites on the market. I use Raven to:

    • Manage keyword lists
    • Research competitors
    • Manage and report on Twitter/Facebook profiles and campaigns
    • Track social mentions
    • Automate site crawls
    • Compare various, customizable metrics between sites
    • Google Analytics and Google/Bing Webmaster tools integration

    In 2014 I'm looking to do more with Raven in the content management area and in the reporting area. I still prefer to supplement GWT rankings data with rankings data from another source (Advanced Web Ranking, Authority Labs, etc) but a goal for 2014, for me, is to fit more reporting into Raven's already excellent reporting engine.

    Raven - here's a review from a few years ago

    SemRush

    In terms of keyword, ranking, and PPC competitive research tools SemRush really has moved ahead of the competition in the past year or so. I use most of the features in the suite:

    • Organic SEO Research
    • PPC keyword and strategy research
    • Multiple domain comparisons covering organic and paid search strategies
    • Yearly historical data feature on a specific domain

    I also like the filtering feature(s) that really help me whittle down keyword data to exactly what I'm looking for without worrying about export limits and such.

    SemRush - here's a review from a few years ago

    SeoBook Community and Tools

    Knowledge is power, naturally. All the tools in the world will not overcome a lack of knowledge. All of the specific, unbiased, actionable business & marketing knowledge that I've received over the last handful of years (and the relationships made) is the single most direct reason for any success I've had in this space.

    The SeoBook Toolbar is still one of my most utilized tools. It is data-source agnostic: you get data from a variety of sources quickly and reliably. Seo For Firefox takes most of the info in the toolbar and assigns it to each individual listing in a given SERP. Both tools are indispensable to me on the research front.

    We also have some premium tools that I like quite a bit:

    • Local Rank - It scans up to 1,000 Google results and then cross-references links pointing from those sites to the top 10, 20, or 50 results for that same query. The tool operates on the premise that sites that are well linked to from other top ranked results might get an additional ranking boost on some search queries. You can read more about this in a Google patent here.
    • HubFinder - HubFinder looks for sites that have co-occurring links across up to 10 sites on a given topic. This is useful in finding authoritative sites that link to competing sites in a given SERP.
    • Duplicate Content Checker - This Firefox extension scans Google for a given block of text to see if others are using the same content. The Duplicate Content Checker searches Google for each sentence wrapped in quotes and links to the results of the search.
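
    As a rough illustration of the quoted-sentence approach the Duplicate Content Checker takes (a sketch of the idea only, not the extension's code; it just builds the search URLs rather than querying Google automatically), you could split a block of text into sentences and construct a quoted Google search URL for each one:

    ```python
    # Sketch of the quoted-sentence duplicate content check idea.
    # Not the Firefox extension's code; this only builds the search URLs.
    import re
    from urllib.parse import quote_plus

    def quoted_search_urls(text):
        """Split text into rough sentences and return a Google search URL
        for each sentence wrapped in quotes."""
        sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
        urls = []
        for sentence in sentences:
            query = quote_plus('"' + sentence + '"')
            urls.append('https://www.google.com/search?q=' + query)
        return urls

    sample = ("This is the first sentence of my article. "
              "Here is a second, rather distinctive one.")
    for url in quoted_search_urls(sample):
        print(url)
    ```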

    Screaming Frog SEO Spider

    This is my desktop crawler of choice for most sites; in my usage it's complemented by Raven's Site Auditor (which can be run automatically) and Advanced Web Ranking's site audit tool.

    Just about anything you can think of from a site architecture and on-page standpoint can be done with this tool.

    Screaming Frog - a few years ago Branko did a great review

    TermExplorer

    A cloud-based tool that processes large numbers of keywords pretty quickly and does a good job of bringing in terms from multiple sources.

    It also offers a competitive analysis feature that I don't use very much as well as white-label reports. It has pretty slick filtering options for keywords and scans for exact match domains (.com and .net) in addition to CPC and keyword volume data.

    Term Explorer

    Avoid Tool Fatigue

    There is going to be overlap across some of these tools and while the idea of all-in-one sounds nice it rarely works in practice. Clients are different, deliverables are different, and business models are different.

    The trick is to avoid as much overlap as possible between the tools that you use, otherwise you end up wasting time, money, and resources by overthinking whatever it is that you are doing.

    I have fewer than 20 or so toolsets that I use on an almost daily basis. Some of these are not used daily, but definitely monthly. At one point I had access to over 40 different tools. The tools mentioned in this post are the ones I've found the most value in and gained the most success from.

    Should Venture Backed Startups Engage in Spammy SEO?

    Here's a recent video of the founders of RapGenius talking at TechCrunch Disrupt.

    Oops, wrong video. Here's the right one. Same difference.

    Recently a thread on Hacker News highlighted a blog post which pointed out how RapGenius was engaging in reciprocal promotional arrangements, where they would promote blogs on their Facebook or Twitter accounts if those bloggers would post a laundry list of keyword-rich deep links to RapGenius.

    Matt Cutts quickly chimed in on Hacker News: "we're investigating this now."

    A friend of mine and I were chatting yesterday about what would happen. My prediction was that absolutely nothing would happen to RapGenius: they would issue a faux apology, they would put no effort into cleaning up the existing links, and the apology alone would be sufficient evidence of good faith that the issue dies there.

    Today RapGenius published a mea culpa where ultimately they defended their own spam by complaining about how spammy other lyrics websites are. The self-serving jackasses went so far as including this in their post: "With limited tools (Open Site Explorer), we found some suspicious backlinks to some of our competitors"

    It's one thing to complain in private about dealing in a frustrating area, but it's another thing to publicly throw your direct competitors under the bus with a table of link types and paint them as black hat spammers.

    Google can't afford to penalize Rap Genius, because if they do, Google Ventures will lose deal flow on the startups Google co-invests in.

    In the past some of Google's other investments were into companies that were pretty overtly spamming. RetailMeNot held multiple giveaways where if you embedded a spammy sidebar set of deeplinks to their various pages they gave you a free t-shirt:

    Google's behavior on such arrangements has usually been to hit the smaller players while looking the other way on the bigger site on the other end of the transaction.

    That free t-shirt for links post was from 2010 - the same year that Google invested in RetailMeNot. They ran those promotions multiple times & long enough that they ran out of t-shirts! The widgets didn't link to the homepage of RetailMeNot or to pages relevant to that particular blog; rather, they used (in some cases dozens of different) keyword-rich deep links in each widget - arbitraging search queries tied to various third party brands. Now that RTM is a publicly traded billion-dollar company which Google already endorsed by investing in, there's a zero percent chance of them getting penalized.

    To recap, if you are VC-backed you can: spam away, wait until you are outed, and when outed reply with a combined "we didn't know" and "our competitors are spammers" deflective response.

    For the sake of clarity, let's compare that string of events (spam, warning but no penalty, no effort needed to clean up, insincere mea culpa) to how websites are treated when not VC-backed. For smaller sites it is "shoot on sight" and ask questions later, perhaps coupled with a friendly recommendation to start over.

    Here's a post from today highlighting a quote from Google's John Mueller:

    My personal & direct recommendation here would be to treat this site as a learning experience from a technical point of view, and then to find something that you're absolutely passionate & knowledgeable about and create a website for that instead.

    Growth hack inbound content marketing, but just don't call it SEO.

    What's worse is that, with the new fearmongering disavow promotional stuff, not only are some folks being penalized for the efforts of others, but some are being penalized for links that were in place BEFORE Google even launched as a company.

    Given that money allegedly shouldn't impact rankings, it's sad to note that, as everything that is effective gets labeled as spam, capital and connections are the key SEO "innovations" in the current Google ecosystem.

    Beware Of SEO Truthiness

    When SEO started, many people routinely used black-box testing to try and figure out which pages the search engines rewarded.

    Black box testing is terminology used in IT. It’s a style of testing that doesn’t assume knowledge of the internal workings of a machine or computer program. Rather, you can only test how the system responds to inputs.
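
    For readers who haven't run into the term, here is a tiny, hypothetical illustration of that black-box mindset: the test below knows nothing about how the function works internally and only checks that given inputs produce outputs in an expected range. Everything in it is made up purely so the example runs.

    ```python
    # Minimal black-box style test: treat check_ranking() as an opaque system
    # and assert only on its observable input/output behaviour.
    # The function body is a made-up stand-in so the example is runnable.

    def check_ranking(query, domain):
        """Stand-in for an opaque system (e.g. where a domain ranks for a query)."""
        pretend_serp = ["example.org", "example.com", "example.net"]
        return pretend_serp.index(domain) + 1 if domain in pretend_serp else -1

    def test_black_box():
        position = check_ranking("blue widgets", "example.com")
        # We never inspect internals; we only check the observed output.
        assert position == -1 or 1 <= position <= 10

    if __name__ == "__main__":
        test_black_box()
        print("black-box check passed")
    ```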

    So, for many years, SEO was about trying things out and watching how the search engine responded. If rankings went up, SEOs assumed correlation meant causation, so they did a lot more of whatever it was they thought was responsible for the boost. If the trick was repeatable, they could draw some firmer conclusions about causation, at least until the search engine introduced some new algorithmic code and sent everyone back to their black-box testing again.

    Well, it sent some people back to testing. Some SEOs don’t do much, if any, testing of their own, and so rely on the strategies articulated by other people. As a result, the SEO echo chamber can be a pretty misleading place as “truthiness” - and a lot of false information - gets repeated far and wide, until it’s considered gospel. One example of truthiness is that paid placement will hurt you. Well, it may, but not having it may hurt you more, because it all really... depends.

    Another problem is that SEO testing can seldom be conclusive, because you can’t be sure of the state of the thing you’re testing. The thing you're testing may not be constant. For example, you throw up some more links and your rankings rise, but the rise could be due to other factors, such as a new engagement algorithm that Google implemented in the middle of your testing that you just didn’t know about.

    It used to be a lot easier to conduct this testing. Updates were periodic. Between updates, you could reasonably assume the algorithms were static, so cause and effect were more obvious than they are today. Danny Sullivan gave a good overview of search history at Moz earlier in the year:

    That history shows why SEO testing is getting harder. There are a lot more variables to isolate than there used to be. The search engines have also been clever. A good way to thwart SEO black-box testing is to keep moving the target. Continuously roll out code changes and don’t tell people you’re doing it. Or send people on a wild goose chase by arm-waving about a subtle code change made over here, when the real change has been made over there.

    That’s the state of play in 2013.

    However… (Ranting Time :)

    Some SEO punditry is bordering on the ridiculous!

    I’m not going to link to one particular article I’ve seen recently, as, ironically, that would mean rewarding them for spreading FUD. Also, calling out people isn't really the point. Suffice to say, the advice was about specifics, such as how many links you can “safely” get from one type of site, that sort of thing....

    The problem comes when we can easily find evidence to the contrary. In this case, a quick look through the SERPs and you'll find evidence of top ranking sites that have more than X links from Site Type Y, so this suggests….what? Perhaps these sites are being “unsafe”, whatever that means. A lot of SEO punditry is well meaning, and often a rewording of Google's official recommendations, but can lead people up the garden path if evidence in the wild suggests otherwise.

    If one term defined SEO in 2013, it is surely “link paranoia”.

    What's Happening In The Wild

    When it comes to what actually works, there are few hard and fast rules regarding links. Look at the backlink profiles for top ranked sites across various categories and you’ll see one thing that is constant....

    Nothing is constant.

    Some sites have links coming from obviously automated campaigns, and it seemingly doesn’t affect their rankings. Other sites have credible link patterns, and rank nowhere. What counts? What doesn’t? What other factors are in play? We can only really get a better picture by asking questions.

    Google allegedly took out a few major link networks over the weekend. Anglo Rank came in for special mention from Matt Cutts.

    So, why are Google making a point of taking out link networks if link networks don’t work? Well, it’s because link networks work. How do we know? Look at the backlink profiles in any SERP area where there is a lot of money to be made and the area isn’t overly corporate, i.e. not dominated by major brands, and it won’t be long before you spot aggressive link networks, and few "legitimate" links, in the backlink profiles.

    Sure, you wouldn't want aggressive link networks pointing at brand sites, as there are better approaches brand sites can take when it comes to digital marketing, but such evidence makes a mockery of the tips some people are freely handing out. Are such tips the result of conjecture, repeating Google's recommendations, or actual testing in the wild? Either the link networks work, or they don’t work but don’t affect rankings, or these sites shouldn't be ranking.

    There’s a good reason some of those tips are free, I guess.

    Risk Management

    Really, it’s a question of risk.

    Could these sites get hit eventually? Maybe. However, those using a “disposable domain” approach will do anything that works as far as linking goes, as their main risk is not being ranked. Being penalised is an occupational hazard, not game-over. These sites will continue so long as Google's algorithmic treatment rewards them with higher ranking.

    If your domain is crucial to your brand, then you might choose to stay away from SEO entirely, depending on how you define “SEO”. A lot of digital marketing isn’t really SEO in the traditional sense, i.e. optimizing hard against an algorithm in order to gain higher rankings; a lot of digital marketing is based on optimization for people, treating SEO as a side benefit. There’s nothing wrong with this, of course, and it’s a great approach for many sites, and something we advocate. Most sites end up somewhere along that continuum, but no matter where you are on that scale, there’s always a marketing risk to be managed, with perhaps "non-performance" being a risk that is often glossed over.

    So, if there's a take-away, it's this: check out what actually happens in the wild, and then evaluate your risk before emulating it. When pundits suggest a rule, check to see if you can spot times it appears to work, and perhaps more interestingly, when it doesn't. It's in those areas of personal inquiry and testing where gems of SEO insight are found.

    SEO has always been a mix of art and science. You can test, but only so far. The art part is dealing with the unknown past the testing point. Performing that art well is to know how to pick truthiness from reality.

    And that takes experience.

    But mainly a little fact checking :)

    SEO Discussions That Need to Die

    Sometimes the SEO industry feels like one huge Groundhog Day. No matter how many times you have discussions with people on the same old topics, these issues seem to pop back into blogs/social media streams with almost regular periodicity. And every time they do, only the authors are new; the arguments and the counter-arguments are all the same.

    Due to this sad situation, I have decided to make a short list of such issues/discussions; if it inspires even one of you and prevents you from starting or engaging in such a debate, then it was worth writing.

    So here are SEO's most annoying discussion topics, in no particular order:

    Blackhat vs. Whitehat

    This topic has been chewed over and over again so many times, yet people still jump into it with both feet, with the righteous feeling that their argument, and no one else's, is going to change someone's mind. This discussion becomes particularly tiresome when people start claiming the moral high ground because they are using one over the other. Let's face it once and for all: there are no generally moral (white) and generally immoral (black) SEO tactics.

    This is where people usually pull out the argument about harming clients' sites, an argument which is usually moot. Firstly, there is a heated debate about what is even considered whitehat and what blackhat. The definition of these two concepts is highly fluid and changes over time. One of the main reasons for this fluidity is Google moving the goal posts all the time. What was once considered a purely whitehat technique, highly recommended by all the SEOs (PR submissions, directories, guest posts, etc.), may as of tomorrow become “blackhat”, “immoral” and what not. Also, some people consider “blackhat” anything that dares not adhere to the Google Webmaster Guidelines, as if they were carved on stone tablets by some angry deity.

    Just to illustrate how absurd this concept is, imagine some other company, eBay say, creates a list of rules, one of which is that anyone who wants to sell an item on their site is prohibited from also trying to sell it on Gumtree or Craigslist. How many of you would, in practice, reduce the number of people your product is effectively reaching because some other commercial entity is trying to prevent competition? If you are not making money off search, Google is, and vice versa.

    It is not about morals, and it is not about criminal negligence of your clients. It is about taking risks, and as long as you are being truthful with your clients and yourself, and aware of all the risks involved in undertaking this or some other activity, no one has the right to pontificate about the “morality” of a competing marketing strategy. If it is not for you, don't do it, but you can't decide that the risk is too high for you while also pseudo-criminalizing those who are willing to take that risk.

    The same goes for “blackhatters” pointing and laughing at “whitehatters”. Some people do not enjoy rebuilding their business every 2 million comment spam links. That is OK. Maybe they will not climb the ranks as fast as your sites do, but maybe when they get there, they will stay there longer? These are two different and completely legitimate strategies. Actually, every ecosystem has representatives of these two strategies: one is called the “r strategy”, which prefers quantity over quality, while the “K strategy” puts more investment into a smaller number of offspring.

    You don't see elephants calling mice immoral, do you?

    Rank Checking is Useless/Wrong/Misleading

    This one has been going around for years and keeps raising its ugly head every once in a while, particularly after Google forces another SaaS provider to give up part of its services because of either checking rankings themselves or buying ranking data from a third party provider. Then we get all the holier-than-thou folks mounting their soap boxes and preaching fire and brimstone on SEOs who report rankings as the main or even only KPI. So firstly, again, just like with blackhat vs. whitehat: horses for courses. If you think your way of reporting to clients is the best, stick with it, preach it positively, as in “this is what I do and the clients like it”, but stop telling other people what to do!

    More importantly, the vast majority of these arguments are based on a totally imaginary situation in which SEOs use rankings as their only or main KPI. In all of my 12 years in SEO, I have never seen any marketer worth their salt report an “increase in rankings for 1000s of keywords”. As far back as 2002, I remember people writing reports to clients which had a separate chapter for keywords that were defined as optimization targets and for which the client's site reached top rankings, but no significant increase in traffic/conversions was achieved. Those keywords were then dropped from the marketing plan altogether.

    It really isn't a big leap to understand that ranking isn't important if it doesn't result in increased conversions in the end. I am not going to argue here why I do think reporting and monitoring rankings is important. The point is that if you need to make your argument against a straw man, you should probably rethink whether you have a good argument at all.

    PageRank is Dead/it Doesn't Matter

    Another strawman argument. Show me a linkbuilder who today thinks that getting links based solely on toolbar PageRank is going to get them to rank, and I will show you a guy who has probably not engaged in active SEO since 2002. And no small amount of irony can be found in the fact that the same people who decry the use of PageRank, the closest thing to an actual Google ranking factor they can see, are freely using proprietary metrics created by other marketing companies and treating them as a perfectly reliable proxy for esoteric concepts which even Google finds hard to define, such as relevance and authority. Furthermore, all other things being equal, show me the SEO who will take a pass on a PR6 link for the sake of a PR3 one.

    Blogging on “How Does XXX Google Update Change Your SEO” - 5 Seconds After it is Announced

    Matt hasn't yet turned off his video camera to switch his t-shirt for the next Webmaster Central video and there are already dozens of blog posts discussing, in the most intricate detail, how the new algorithm update/penalty/infrastructure change/random-monochromatic-animal will impact everyone's daily routine and how we should all run for the hills.

    Best-case scenario, these prolific writers only know the name of the update and they are already suggesting strategies on how to avoid being slapped or, even better, how to get out of the doghouse. This was painfully obvious in the early days of Panda, when people were writing up their “experiences” of how to recover from the algorithm update even before the second update was rolled out, making any testimony of recovery, in the worst case, a lie or (given a massive benefit of the doubt) a misinterpretation of ranking changes (rank checking, anyone?).

    Put down your feather and your ink bottle, skippy, wait for the dust to settle, and unless you have a human source who was involved in the development or implementation of the algorithm, just sit tight and observe for the first week or two. After that you can write up those observations and it will be considered legitimate, even interesting, reporting on the new algorithm, but anything earlier than that will paint you as a clueless pageview chaser, looking to ride the wave of interest with blog posts that often close with “we will probably not even know what the XXX update is all about until we give it some time to get implemented”. Captain Obvious to the rescue.

    Adwords Can Help Your Organic Rankings

    This one is like the mythological Hydra – you cut one head off, two new ones spring out. This question has been answered so many times by so many people, both from within the search engines and from the SEO community, that if you are addressing it today, I suspect you are actually trying to avoid talking about something else and are using this topic as a smoke screen. Yes, I am looking at you, Google Webmaster Central videos. Is that *really* the most interesting question you found in your pile? What, no one asked about <not provided>, or about social signals, or about the role authorship plays in non-personalized rankings, or whether it flows through links, or a million other questions that are much more relevant, interesting and, more importantly, still unanswered?

    Infographics/Directories/Commenting/Forum Profile Links Don't Work

    This is very similar to the blackhat/whitehat argument and it is usually supported by a statement that looks something like “do you think that Google, with hundreds of PhDs, hasn't already discounted that in their algorithm?”. This is a typical “argument from incredulity” by a person who glorifies postgraduate degrees as a litmus test of intelligence and ingenuity. My claim is that these people have neither looked at the backlink profiles of many sites in many competitive niches nor do they know a lot of people doing or holding a PhD. They highly underrate the former and overrate the latter.

    A link is a link is a link, and the only difference is between link profiles and the percentage each type of link occupies in a specific link profile. Funnily enough, the people who claim that X type of links don't work are the same people who will ask for link removal from totally legitimate, authoritative sources that gave them a totally organic, earned link. Go figure.

    “But Matt/John/Moultano/anyone with a brother-in-law who has once visited Mountain View” said…

    Hello. Did you order “not provided will be maximum 10% of your referral data”? Or did you have “I would be surprised if there was a PR update this year”? How about “You should never use nofollow on-site links that you don't want crawled. But it won't hurt you. Unless something.”?

    People keep thinking that the folks at Google sit around all day long, thinking about how they can help SEOs do their job. How can you build your business based on advice given out by an entity that is actively trying to keep visitors from coming to your site? Can you imagine that happening in any other business environment? Can you imagine the Nike marketing department going to a one-day training session at Adidas HQ to help them sell their sneakers better?

    Repeat after me: THEY ARE NOT YOUR FRIENDS. Use your own head. Even better, use your own experience. Test. Believe your own eyes.

    We Didn't Need Keyword Data Anyway

    This is my absolute favourite. People who were, as of yesterday, basing their reporting, link building, landing page optimization, ranking reports, conversion rate optimization and about every other aspect of their online campaigns on referring keywords all of a sudden feel the need to tell the world how they never thought keywords were an important metric. That's right, buster, we are so much better off flying blind, doing iteration upon iteration of a derivation of data based on past trends, future trends, landing pages, third party data, etc.

    It is OK every once in a while to say “crap, Google has really shafted us with this one; this is seriously going to affect the way I track progress”. Nothing bad will happen if you do. You will not lose face over it. Yes, there were other metrics that were ALSO useful for different aspects of SEO, but it is not as if, when you are driving a car and your brakes die on you, you say “pffft, stopping is for losers anyway, who wants to stop the car when you can enjoy the ride; I never really used those brakes in the past anyway. What really matters in the car is that your headlights are working”.

    Does this mean we can't do SEO anymore? Of course not. Adaptability is one of the top required traits of an SEO and we will adapt to this situation as we did to all the others in the past. But don't bullshit yourself and everyone else that 100% <not provided> didn't hurt you.

    Responses to SEO is Dead Stories

    It is crystal clear why the “SEO is dead” stories themselves deserve to die a slow and painful death. I am talking here about the hordes of SEOs who rise to the occasion every freaking time some fifth-rate journalist decides to poke the SEO industry through the cage bars, and convince them, nay, prove to them, that SEO is not only not dying but is alive and kicking and bigger than ever. And I am not innocent of this myself; I have also dignified this idiotic topic with a response (albeit a short one). But how many times can we rise to the same occasion and repeat the same points? What original angle can you give to this story after 16 years of responding to the same old claims? And if you can't give an original angle, how in the world are you increasing our collective knowledge by re-warming and serving the same old dish that wasn't very good the first time it was served? Don't you have rankings to check instead?

    There is No #10.

    But that's what everyone does: write a “Top 10 ways…” article, where they force the examples until they get to a linkbaity number. No one wants to read a “Top 13…” or a “Top 23…” article. This needs to die too. Write what you have to say, not what you think will get the most traction. Marketing is makeup, but the face needs to be pretty before you apply it. Unless you like putting lipstick on pigs.


    Branko Rihtman has been optimizing sites for search engines since 2001, for clients and his own web properties in a variety of competitive niches. Over that time, Branko realized the importance of properly done research and experimentation and started publishing findings and experiments at SEO Scientist, with some additional updates at @neyne. He currently consults for a number of international clients, helping them improve their organic traffic and conversions while questioning old approaches to SEO and trying some new ones.

    Value Based SEO Strategy

    One approach to search marketing is to treat the search traffic as a side-effect of a digital marketing strategy. I’m sure Google would love SEOs to think this way, although possibly not when it comes to PPC! Even if you’re taking a more direct, rankings-driven approach, the engagement and relevancy scores that come from delivering what the customer values should serve you well, too.

    In this article, we’ll look at a content strategy based on value based marketing. Many of these concepts may be familiar, but bundled together, they provide an alternative search provider model to one based on technical quick fixes and rank. If you want to broaden the value of your SEO offering beyond that first click, and get a few ideas on talking about value, then this post is for you.

    In any case, the days of being able to rank well without providing value beyond the click are numbered. Search is becoming more about providing meaning to visitors and less about providing keyword relevance to search engines.

    What Is Value Based Marketing?

    Value based marketing is customer, as opposed to search engine, centric. In Values Based Marketing For Bottom Line Success, the authors focus on five areas:

    • Discover and quantify your customers' wants and needs
    • Commit to the most important things that will impact your customers
    • Create customer value that is meaningful and understandable
    • Assess how you did at creating true customer value
    • Improve your value package to keep your customers coming back

    Customers compare your offer against those of competitors, and divide the benefits by the cost to arrive at value. Marketing determines and communicates that value.

    This is the step beyond keyword matching. When we use keyword matching, we’re trying to determine intent. We’re doing a little demographic breakdown. This next step is to find out what the customer values. If we give the customer what they value, they’re more likely to engage and less likely to click back.

    What Does The Customer Value?

    A key question of marketing is “which customers does this business serve”? Seems like an obvious question, but it can be difficult to answer. Does a gym serve people who want to get fit? Yes, but then all gyms do that, so how would they be differentiated?

    Obviously, a gym serves people who live in a certain area. So, if our gym is in Manhattan, our customer becomes “someone who wants to get fit in Manhattan”. Perhaps our gym is upmarket and expensive. So, our customer becomes “people who want to get fit in Manhattan and be pampered and are prepared to pay more for it”. And so on, and so on. They’re really questions and statements about the value proposition as perceived by the customer, and then delivered by the business.

    So, value based marketing is about delivering value to a customer. This syncs with Google’s proclaimed goal in search, which is to put users first by delivering results they deem to have value, and not just pages that match a keyword term. Keywords need to be seen in a wider context, and that context is pretty difficult to establish if you’re standing outside the search engine looking in, so thinking in terms of concepts related to the value proposition might be a good way to go.

    Value Based SEO Strategy

    The common SEO approach, for many years, has started with keywords. It should start with customers and the business.

    The first question is “who is the target market?” Then ask what they value.

    Relate what they value to the business. What is the value proposition of the business? Is it aligned? What would make a customer value this business offering over those of competitors? It might be price. It might be convenience. It’s probably a mix of various things, but be sure to nail down the specific value propositions.

    Then think of some customer questions around these value propositions. What would be the likely customer objections to buying this product? What would be points that need clarifying? How does this offer differ from other similar offers? What is better about this product or service? What are the perceived problems in this industry? What are the perceived problems with this product or service? What is difficult or confusing about it? What could go wrong with it? What risks are involved? What aspects have turned off previous customers? What complaints did they make?

    Make a list of such questions. These are your article topics.

    You can glean this information by interviewing either customers or the business owner. Each of these questions, and its accompanying answer, becomes an article topic on your site, although not necessarily in Q&A format. The idea is to create a list of topics as a basis for articles that address specific points, and objections, relating to the value proposition.

    For example, buying SEO services is a risk. Customers want to know if the money they spend is going to give them a return. So, a valuable article might be a case study on how the company provided return on spend in the past, and the process by which it will achieve similar results in future. Another example might be a buyer concerned about the reliability of a make of car. A page dedicated to reliability comparisons, and another page outlining the customer care after-sale plan would provide value. Note how these articles aren’t keyword driven, but value driven.

    Ever come across a FAQ that isn’t really a FAQ? Dreamed-up questions? They’re frustrating, and of little value if the information doesn’t directly relate to the value we seek. Information should be relevant and specific so when people land on the site, there’s more chance they will perceive value, at least in terms of addressing the questions already on their mind.

    Compare this approach with generic copy around a keyword term. A page talking about “SEO” in response to the keyword term “SEO” might closely match a keyword term, so that's a relevance match, but unless it's tied into providing a customer the value they seek, it's probably not of much use. Finding relevance matches is no longer a problem for users. Finding value matches often is. Even if you're keyword focused, adding these articles provides semantic variation that may capture keyword searches that aren't appearing in keyword tools.

    Keyword relevance was a strategy devised at a time when information was less readily available and search engines weren't as powerful. Finding something relevant was more hit and miss than it is today. These days, there are likely thousands, if not millions, of pages that will meet relevance criteria in terms of keyword matching, so the next step is to meet value criteria. Providing value is less likely to earn a click back and more likely to create engagement than mere on-topic matching.

    The Value Chain

    Deliver value. Once people perceive value, then we have to deliver it. Marketing, and SEO in particular, used to be about getting people over the threshold. Today, businesses have to work harder to differentiate themselves and a sound way of doing this is to deliver on promises made.

    So the value is in the experience. Why do we return to Amazon? It’s likely due to the end-to-end experience in terms of delivering value. Any online e-commerce store can deliver relevance. Where competition is fierce, Google is selective.

    In the long term, delivering value should drive down the cost of marketing as the site is more likely to enjoy repeat custom. As Google pushes more and more results beneath the fold, the cost of acquisition is increasing, so we need to treat each click like gold.

    Monitor value. Does the firm keep delivering value? To the same level? Because people talk. They talk on Twitter and Facebook and the rest. We want them talking in a good way, but even if they talk in a negative way, it can still be useful. Their complaints can be used as topics for articles. They can be used to monitor value, refine the offer and correct problems as they arise. Those social signals, whilst not a guaranteed ranking boost, are still signals. We need to adopt strategies whereby we listen to all the signals, so as to better understand our customers, in order to provide more value, and hopefully enjoy a search traffic boost as a welcome side-effect, so long as Google is also trying to determine what users value.

    Not sounding like SEO? Well, it’s not optimizing for search engines, but for people. If Google is to provide value, then it needs to ensure results are not just relevant, but offer genuine value to end users. Do Google do this? In many cases, not yet, but all their rhetoric and technical changes suggest that providing value is at the ideological heart of what they do. So the search results will most likely, in time, reflect the value people seek, and not just relevance.

    In technical terms, this provides some interesting further reading:

    Today, signals such as keyword co-occurrence, user behavior, and previous searches do in fact inform context around search queries, which impact the SERP landscape. Note I didn’t say the signals “impact rankings,” even though rank changes can, in some cases, be involved. That’s because there’s a difference. Google can make a change to the SERP landscape to impact 90 percent of queries and not actually cause any noticeable impact on rankings.

    The way to get the context right, and get positive user behaviour signals, and align with their previous searches, is to first understand what people value.