My Must Have Tools of 2014

Dec 30th

There are a lot of tools in the SEO space (sorry, couldn't resist :D) and over the years we've seen tools fall into 2 broad categories: tools that aim to do just about everything, and tools that focus on one discipline of online marketing.

As we continue to lose more and more data (not provided) and the data we have access to becomes less reliable (rankings, competitive research data, data given to us by search engines, etc.), one has to wonder: at what point does access to a variety of tools start producing diminishing returns?

In other words, if you are starting with unreliable or very, very inexact data does layering more and more extrapolations on top make you worse off than you were before? Probably.

I do think that a fair number of tool-usage scenarios have become less effective (or less necessary) at this point. Consider what were once the cornerstones of industry research and data:

  • Rankings
  • SERP difficulty analysis
  • Link prospecting
  • Competitive link research
  • Analytics

Each one of these areas of data has really taken a beating over the last 2-3 years thanks to collateral damage from broad-reaching, unforgiving Google updates, the loss of actual keyword data, the less obvious relationship between links and rankings, personalized search, various SERP layout changes, and on and on.

I believe the best way to evaluate which tools you should be using is to determine which one does X best, to the point where supplementing it with data from a similar provider is overkill: not worth the extra monthly subscription cost nor the cognitive overhead.

Which Ones to Choose?

Well, this certainly depends on what you do. I'm going to focus on the small to mid-size agency market (which also includes freelancers and folks who just operate their own properties) but for those teetering on mid-to-large size I'll make 2 recommendations based on personal experience:

If I were operating a bigger agency I'd strongly consider both of those. They both do a really solid job of providing customized reporting and research modules.

For the rest of us, I'll share what I'm using as a basis for my recommendations with reasons why I selected them.

These tools are focused on what I do on a daily basis and are the ones I simply cannot live without. They cover:

  • Reporting
  • Competitive Link & Keyword Research
  • Keyword Research
  • PR and Outreach

Advanced Web Ranking

    This is the tool I rely on the most. It does just about everything with the only drawbacks being the learning curve and that it is desktop software. The learning curve payoff is very much worth it though. This tool does the following for me:

    • Reporting for pretty much every aspect of a campaign
    • Interfaces with Majestic SEO for link data as well as data from Moz for link research and tracking
    • Connects to social accounts for reporting
    • Site audit crawls
    • Interfaces with Google Analytics
    • Keyword research
    • Competitor analysis
    • Rankings
    • On-page analysis

    They have a cloud version for reporting and I believe that in the near future a good amount of this functionality will go to its cloud service. This tool is highly recommended.

    Advanced Web Ranking - here's a basic overview of the software from a few years ago, though it has been updated a number of times since then

    Ahrefs

    I remember when this was for sale on Flippa! I find Ahrefs to be very reliable and easy to use. They have added quite a few features over the past year and, in my opinion, they are right up there with Majestic SEO when it comes to relevant, deep link data.

Their interface has improved dramatically over time and the constant addition of helpful, new features has left other tools playing catch-up. I'm hoping to see more integration with them in 2014 via services like Raven and Advanced Web Ranking.

    Ahrefs.com - here's a review from last year (though they no longer offer the SERP tracking feature they offered back then)

    Authority Labs

    The most accurate and stable online rankings provider I've used thus far. The interface has improved recently as has the speed of exports. I would still like to see a bulk PDF export of each individual site in the near future but overall my experience with Authority Labs has been great.

I use it as a stable, online, automated rank checker to supplement my data from Advanced Web Ranking. It also has some nice features like being able to track rankings from a zip code and showing what else it encounters in the SERP (videos, snippets, etc).

    Authority Labs - here's a review from 5 months ago

    Buzzstream

    Buzzstream is an absolute must have for anyone doing PR-based and social outreach. The email integration is fantastic and the folks that help me with outreach routinely rave about using Buzzstream.

The UI has really been polished up recently and the customer support has been excellent for us. I'm positive that our outreach would not be nearly as effective without Buzzstream and there really isn't a competing product out there that I've seen.

    This is a good example of a really niche product that excels at its intended purpose.

    Buzzstream - here's a review from a couple years ago

    Citation Labs Suite

    We use the Contact Finder, Link Prospector, and Broken Link Building tool inside our prospecting process. Much like Buzzstream this is a suite of tools that focuses on a core area and does it very well.

    You have to spend some time with the prospector to get the best queries possible for your searches but the payoff is really relevant, quality link prospects.

    Citation Labs - here's a review from a couple years ago

    Link Research Tools

    While LRT is primarily known for its Link Detox tool, this set of tools covers quite a bit of the SEO landscape. I do not use all the tools in the suite but the areas that I utilize LRT for are:

    • Link cleanup
    • Link prospecting
    • SERP competition analysis
    • Competitive domain comparisons

    It's missing a few pieces but it is similar to Advanced Web Ranking in terms of power and data. LRT hooks into many third party tools (Majestic, SemRush, Moz, etc) so you get a pretty solid overview, in one place, of what you need to see or want to see.

The prospecting is similar, to an extent, to Citation Labs, but you can filter prospects by specific SEO metrics as well as base prospecting on links that appear across multiple sites in a given SERP.
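
To make the cross-SERP idea concrete, here's a minimal Python sketch (the function and data shapes are my own, not LRT's actual implementation) of surfacing domains that link to several of the sites ranking in one SERP:

    from collections import Counter

    def cross_serp_link_sources(backlinks_by_ranking_site, min_sites=3):
        # backlinks_by_ranking_site: {ranking_domain: set of domains linking
        # to it}, e.g. gathered for the top 10 results of a SERP.
        counts = Counter()
        for linking_domains in backlinks_by_ranking_site.values():
            counts.update(set(linking_domains))
        # Keep linking domains that point at several of the ranking sites.
        return sorted(d for d, n in counts.items() if n >= min_sites)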

    LinkResearchTools

    Majestic SEO

Majestic is still the de facto standard for deep link data (both fresh and historical). They recently launched a new feature called Search Explorer, which is designed to be a specialized search engine devoid of personalization and whatnot, showing rankings based on its interpretation of the web graph and how influential a site is for a given term.

As of this writing, Search Explorer is in Alpha but it does appear to be a really solid innovation from Majestic. The other reason for having a Majestic subscription is to get access to its API so you can integrate the data however you choose. I use the API inside of LRT and Advanced Web Ranking.

    Majestic SEO - here's a review from a couple years ago by Julie Joyce

    Moz

I use Moz mainly for access to its link data via Advanced Web Ranking. Compared to the other tools I use I do not see a ton of value in the rest of its tool suite, and I also get data from it via my Raven subscription (which is where I tend to do a fair bit of research).

    If you are on a tight budget it's worthy of consideration for the breadth of tools the subscription offers but I think you could get better options elsewhere if you have some budget to spread around.

    Moz

    Raven Tools

    I don't use every single feature in Raven but I find Raven to be one of the most well-executed, stable tool suites on the market. I use Raven to:

    • Manage keyword lists
    • Research competitors
    • Manage and report on Twitter/Facebook profiles and campaigns
    • Track social mentions
    • Automate site crawls
    • Compare various, customizable metrics between sites
    • Google Analytics and Google/Bing Webmaster tools integration

    In 2014 I'm looking to do more with Raven in the content management area and in the reporting area. I still prefer to supplement GWT rankings data with rankings data from another source (Advanced Web Ranking, Authority Labs, etc) but a goal for 2014, for me, is to fit more reporting into Raven's already excellent reporting engine.

    Raven - here's a review from a few years ago

    SemRush

    In terms of keyword, ranking, and PPC competitive research tools SemRush really has moved ahead of the competition in the past year or so. I use most of the features in the suite:

    • Organic SEO Research
    • PPC keyword and strategy research
    • Multiple domain comparisons covering organic and paid search strategies
    • Yearly historical data feature on a specific domain

    I also like the filtering feature(s) that really help me whittle down keyword data to exactly what I'm looking for without worrying about export limits and such.

    SemRush - here's a review from a few years ago

    SeoBook Community and Tools

Knowledge is power, naturally. All the tools in the world will not overcome a lack of knowledge. All of the specific, unbiased, actionable business & marketing knowledge that I've received over the last handful of years (and the relationships made) is the single most direct reason for any success I've had in this space.

The SeoBook Toolbar is still one of my most utilized tools. It is data-source agnostic: you get data from a variety of sources quickly and reliably. Seo For Firefox takes most of the info in the toolbar and assigns it to each individual listing in a given SERP. Both tools are indispensable to me on the research front.

    We also have some premium tools that I like quite a bit:

    • Local Rank - It scans up to 1,000 Google results and then cross-references links pointing from those sites to the top 10, 20, or 50 results for that same query. The tool operates on the premise that sites that are well linked to from other top ranked results might get an additional ranking boost on some search queries. You can read more about this in a Google patent here.
• HubFinder - HubFinder looks for sites that have co-occurring links across up to 10 sites on a given topic. This is useful in finding authoritative sites that link to competing sites in a given SERP.
• Duplicate Content Checker - This Firefox extension scans Google for a given block of text to see if others are using the same content. The Duplicate Content Checker searches Google for each sentence wrapped in quotes and links to the results of the search (see the sketch below).
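
To illustrate, here's a minimal Python sketch of the Duplicate Content Checker's described approach; the function name and the naive sentence splitter are my own:

    import re
    from urllib.parse import quote_plus

    def duplicate_content_queries(text):
        # Naive sentence split: break on ., ! or ? followed by whitespace.
        sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
        # One Google query per sentence, wrapped in quotes so only exact
        # matches are returned.
        return ["https://www.google.com/search?q=" + quote_plus('"%s"' % s)
                for s in sentences]

    for url in duplicate_content_queries("First sentence here. And a second one."):
        print(url)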

    Screaming Frog SEO Spider

This is my desktop crawler of choice for most sites; in my usage it's complemented by Raven's Site Auditor (which can be run automatically) and Advanced Web Ranking's site audit tool.

    Just about anything you can think of from a site architecture and on-page standpoint can be done with this tool.

    Screaming Frog - a few years ago Branko did a great review

    TermExplorer

    A cloud-based tool that processes large amounts of keywords pretty quickly and does a good job of bringing in terms from multiple sources.

    It also offers a competitive analysis feature that I don't use very much as well as white-label reports. It has pretty slick filtering options for keywords and scans for exact match domains (.com and .net) in addition to CPC and keyword volume data.

    Term Explorer

    Avoid Tool Fatigue

    There is going to be overlap across some of these tools and while the idea of all-in-one sounds nice it rarely works in practice. Clients are different, deliverables are different, and business models are different.

    The trick is to avoid as much overlap as possible between the tools that you use, otherwise you end up wasting time, money, and resources by overthinking whatever it is that you are doing.

I have fewer than 20 toolsets that I use on an almost daily basis. Some of these are not used daily but definitely monthly. At one point I had access to over 40 different tools. The tools mentioned in this post are the ones that I've found the most value in and gained the most success from.

    Advanced Web Ranking Review - Website Auditor

    May 30th

    Advanced Web Ranking (AWR) is one of my favorite pieces of SEO software on the market today. It has been indispensable to me over the years. The software does it all and then some.

I reviewed it a few years ago; you can read that here. Most of it is still relevant and I'll be updating it in the near future. In this post I want to highlight their Website Auditor tool.

    Combining On and Off Page Factors

The beauty of this feature is the simple integration of on-page and off-page elements. There are other tools on the market that focus solely on the on-page stuff (and do a fantastic job of it); AWR does a fantastic job there as well.

    The all-in-one nature of Advanced Web Ranking allows you to deftly move between the on and off (links, social, etc) page factors for a site (and its competition) inside of the Website Auditor feature. AWR has other tools built-in to go even deeper on competitive analysis as well.

    A quick FYI on some general settings and features:

    • You can crawl up to 10,000 pages on-demand
    • All results are exportable
    • Audits are saved so you can look at historical data trends
    • Complete white-label reporting is available
    • Because it's software it's all you can eat :) (save for the page limit)

    You can also set the tool to crawl only certain sections of a site as well as completely ignore certain sections or parameters so you can make the best use of your 10,000 page-crawl limit. This is a nice way to crawl a specific section of a site to find the most "social" content (limit the crawl to /blog as an example).
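
Conceptually, those include/exclude settings behave like a pair of URL filters. A rough Python sketch (the patterns here are hypothetical, not AWR's actual syntax):

    import re

    INCLUDE = re.compile(r"^/blog(/|$)")            # crawl only the /blog section
    EXCLUDE = re.compile(r"[?&](sessionid|sort)=")  # skip noisy URL parameters

    def should_crawl(path):
        # A URL is fetched only if it matches the include pattern and
        # none of the exclude patterns, preserving the crawl budget.
        return bool(INCLUDE.match(path)) and not EXCLUDE.search(path)

    print(should_crawl("/blog/some-post"))       # True
    print(should_crawl("/shop/item?sort=asc"))   # False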

    Interface Overview

    Here's what the initial interface looks like:

    awr-site-audit-interface-overview

    It's a thick tool for sure, on the whole, but just focus on the Auditor piece. It's fairly self-explanatory but the top toolbar (left to right) shows:

    • Current site being viewed
    • Update date history for historical comparison
• Filtering options (all pages, or only specific pages: 200s, 404s, missing title tags; basically all the data points are available for slicing and dicing)
    • Button for on-page issues to show in the view area
    • Button for page-level external link data to show in the view area
    • Button for page-level social metrics (Twitter, Facebook, G+) to show in the view area
    • Update Project button (to update the Audit :D )
    • Text box where you can filter the results manually
    • Auditor settings (see below)
• Link data source: Open Site Explorer for now (Majestic is available in other areas of AWR and I'm told it will be available in Website Auditor as another option in the next release, 9.6, due out very soon)

The tool settings button allows you to configure many areas of the Auditor tool to help get the exact data you want:

    awr-site-audit-tool-settings

    On-Page and Off-Page Data Points

    The on-page overview gives you all of what is listed in the viewport shown previously and if you click on the Filter icon you'll be able to look at whatever piece of on-page data you'd like to:

    awr-site-audit-page-filters

    I did just a short crawl here in order to show you how your data will look inside the tool. The view of the initial on-page report shows your traditional items such as:

    • Title tag info
    • Meta descriptions
    • Duplicate content
    • Robots and indexing information
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    Each page can be clicked on to show specific information about that page:

    • Links from the page to other sites
    • Internal links to the page
    • Broken links
• External links pointing into the page with anchor text data, Page Authority, and MozRank. Whether the link is no-follow or an image is shown as well
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    The on-page overview is also referred to as the Issues Layout:

    awr-site-audit-on-page-view

    The other 2 views are more of a mix of on-page and off-page factors.

    The Links Layout shows the following (for the root domain and for the sub-pages individually):

    • Levels deep from the homepage
    • Page Authority
    • MozRank
    • Linking Root Domains
    • Total Inbound Links
    • Outbound Links
    • No-follows
    • Inbound and Outbound Internal Links

    awr-audit-links-overview

    In this view you can click on any of the crawled pages and see links to the page internally and externally as well as broken links.

    The Social Layout shows the following information:

    • Facebook Shares, Twitter Shares, and Google +1's for a given URL
    • Internal and external links to the page
    • Indexed or not
    • HTTP Status
    • Meta information
    • Broken Links

    awr-audit-social-layot

This data is helpful for finding content ideas, understanding a competitor's content/social strategy, and finding possible influencers to target in a link building/social awareness campaign for your site.

    Reporting and Scheduling

    Currently you can provide white label PDF/interactive HTML reports for the following:

    • Issues Layout
    • Link Layout
    • Social Layout

    You can also do a quick export from the viewport window inside the Website Auditor tab to get either an HTML/PDF/CSV export of the data you are looking at (list of link issues, social stats, on-page issues, and so on).

    Reports can be scheduled to run automatically so long as the computer AWR resides on is on and functional. You could also remote in with a service like LogMeIn to run an update remotely or use the AWR server plan where you host the AWR application on one machine and remote client machines (staff as an example) can connect to the shared database and make an update or run a report if needed.

    Advanced Web Ranking's Website Auditor is one of the most robust audit tools on the market and soon it will have integration with Majestic SEO (currently it ties into OpenSiteExplorer/Linkscape). It already pulls in social metrics from Twitter, Facebook, and G+ to give you a more comprehensive view of your site and your content.

    If you conduct technical audits or do competitive analysis you should give AWR a try, I think you'll like it :)

    Soft Launching SEOTools.net

    Nov 2nd

    Last month we soft launched SEOTools.net. Here are a few entries as a sample of things to come...

... do subscribe to the RSS feed if you like what you see thus far.

    Why create yet another site about SEO?

    Good question, glad you asked. ;)

Our customer base on this site consists primarily of the top of the market pyramid. I can say without doubt that some of our customers know more about SEO than I do & that generally makes them bleeding edge. ;)

And then some people specialize in local or video or ecommerce or other such verticals where there are bits of knowledge one can only gain via first-hand experience (eg: importing from China, doing loads of testing of YouTube variables, or testing various upsells). There is now so much to know that nobody can really know everything, so the goal of our site here is to sorta bring together a lot of the best folks.

    Some people newer to the field & a bit lower down on the pyramid are lucky/smart enough to join our community too & those who do so and participate likely save anywhere from 1 to 3 years on their learning curve...leveling up quickly in the game/sport of SEO. But by and large our customers are mostly the expert end of the market.

    We could try to water down the community & site to try to make it more mass market, but I think that would take the site's leading strength and flush it down the toilet. In the short run it would mean growth, but it would also make the community less enjoyable ... and this site is as much a labor of love as it is a business. I think I would burn myself out & no longer love it if the site became noisy & every third post was about the keyword density of meta tags.

    What Drives You?

    When SEOBook.com was originally created SEO was much less complex & back in 2003 I was still new to the field, so I was writing at a level that was largely aligned with the bulk of the market. However, over the past decade SEO has become much more complex & many of our posts tend to be at a pretty high level, pondering long-term implications of various changes.

    When there are big changes in the industry we are usually early in discussing them. We were writing about exact match domains back in 2006 and when Google's algorithm hinted at a future of strong brand preference we mentioned that back in 2009. With that being said, many people are not nimble enough to take advantage of some of the shifts & many people still need solid foundational SEO 101 in place before the exceptions & more advanced topics make sense.

    The following images either make sense almost instantly, or they look like they are in Greek...depending on one's experience in the field of SEO.

    My mom and I chat frequently, but she tells me some of the posts here tend to be pretty deep / complex / hard to understand. Some of them take 20 hours to write & likely read like college dissertations. They are valuable for those who live & breathe SEO, but are maybe not a great fit for those who casually operate in the market.

    My guess is my mom is a pretty good reflection of most of the market in understanding page titles, keywords, and so on...but maybe not knowing a lot about anchor text filters, link velocity, extrapolating where algorithm updates might create future problems & how Google might then respond to those, etc. And most people who only incidentally touch the SEO market don't need to get a PhD in the topic in order to reach the point of diminishing returns.

    Making Unknowable SEO More Knowable

    SEO has many pieces that are knowable (rank, traffic, rate of change, etc.), but over time Google has pulled back more and more data. As Google gets greedier with their data, that makes SEO harder & increases the value of some 3rd party tools that provide competitive intelligence information.

    • Being able to look up the performance of a section of a site is valuable.
    • Tracking how a site has done over time (to identify major ranking shifts & how they align with algorithm updates) is also quite valuable.
    • Seeing link spikes & comparing those with penalties is also valuable.

    These data sets help offer clues to drive strategy to try to recover from penalties, & how to mimic top performing sites to make a site less likely to get penalized.

    The Difference Between These 2 Sites

    Our goal with SEO Book is to...

    • try to cover important trends & topics deeper than anyone else (while not just parroting Google's view)
    • offer a contrary view to lifestyle image / slogan-based SEO lacking in substance or real-world experience
    • maintain the strongest community of SEO experts, such that we create a community I enjoy participating in & learning from

    Our goal with SEO tools is to...

    • create a site that is a solid fit for the beginner to intermediate portions of the market
    • review & compare various industry tools & highlight where they have unique features
    • offer how to guides on specific tasks that help people across a diverse range of backgrounds & skill levels save time and become more efficient SEOs
    • provide introduction overviews of various SEO-related topics

    Comparing Backlink Data Providers

    Sep 12th

Since Ayima launched in 2007, we've been crawling the web and building our own independent backlink data. Starting off with just a few servers running in our Director of Technology's bedroom cupboard, we now have over 130 high-spec servers hosted across 2 in-house server rooms and 1 datacenter, using a similar storage platform to Yahoo's former index.

    Crawling the entire web still isn't easy (or cheap) though, which is why very few data providers exist even today. Each provider makes compromises (even Google does in some ways), in order to keep their data as accurate and useful as possible for their users. The compromises differ between providers though, some go for sheer index size whilst others aim for freshness and accuracy. Which is best for you?

This article explores the differences between SEOmoz's Mozscape, MajesticSEO's Fresh Index, Ahrefs' link data and our own humble index. This analysis has been attempted before at Stone Temple and SEOGadget, but our Tech Team has used Ayima's crawling technology to validate the data even further.

We need a website to analyze first of all, something that we can't accidentally "out". Search Engine Land is the first that came to mind: very unlikely to have many spam links or paid link activity.

    So let's start off with the easy bit - who has the biggest result set for SEL?

    The chart above shows MajesticSEO as the clear winner, followed by a very respectable result for Ahrefs. Does size matter though? Certainly not at this stage, as we only really care about links which actually exist. The SEOGadget post tried to clean the results using a basic desktop crawler, to see which results returned a "200" (OK) HTTP Status Code. Here's what we get back after checking for live linking pages:

Ouch! So MajesticSEO's "Fresh" index has the distinct smell of decay, whilst Mozscape and Ayima V2 show the freshest data (by percentage). Ahrefs has a sizeable decay like MajesticSEO, but still shows the most links overall in terms of live linking pages. Now the problem with stopping at this level is that it's much more likely for a link to disappear from a page than for the page itself to disappear. Think about short-term event sponsors, 404 pages that return a 200, blog posts falling off the homepage, spam comments being moderated, etc. So our "Tenacious Tim" got his crawler out, to check which links actually exist on the live pages:

Less decay this time, but at least we're now dealing with accurate data. We can also see that Ayima V2 has a live link accuracy of 82.37%, Mozscape comes in at 79.61%, Ahrefs at 72.88% and MajesticSEO is just 53.73% accurate. From Ayima's post-crawl analysis, our techies concluded that MajesticSEO's crawler was counting URLs (references) and not actual HTML links in a page. So simply mentioning http://www.example.com/ somewhere on a web page was counting as an actual link. Their results also included URL references in JavaScript files, which won't offer any SEO value. That doesn't mean that MajesticSEO is completely useless though; I'd personally use it more for "mention" detection outside of the social sphere. You can then find potential link targets who mention you somewhere, but do not properly link to your site.
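
For anyone wanting to replicate a small-scale version of that verification step, here's a hedged Python sketch using only the standard library. It treats a link as "live" only when the linking page returns a 200 and still contains an actual anchor pointing at the target, which is exactly the distinction that tripped up the URL-reference counting described above:

    from html.parser import HTMLParser
    from urllib.request import Request, urlopen

    class AnchorCollector(HTMLParser):
        # Collects the href of every real <a> tag on a page.
        def __init__(self):
            super().__init__()
            self.hrefs = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.hrefs.append(value)

    def link_is_live(linking_page, target_domain):
        # "Live" means the page returns HTTP 200 AND still contains an
        # actual anchor to the target; a bare URL mention in text or
        # JavaScript does not count as a link.
        req = Request(linking_page, headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=10) as resp:
            if resp.getcode() != 200:
                return False
            html = resp.read().decode("utf-8", errors="replace")
        parser = AnchorCollector()
        parser.feed(html)
        return any(target_domain in href for href in parser.hrefs)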

Ahrefs wins the live links contest, finding 84,496 more live links than MajesticSEO and 513,733 more live links than SEOmoz's Mozscape! I still wouldn't use Ahrefs for comparing competitors or estimating the link authority needed to compete in a sector though. Not all links are created equal, with Ahrefs showing both the rank-improving links and the crappy spam. I would definitely use Ahrefs as my main data source for "Link Cleanup" tasks, giving me a good balance of accuracy and crawl depth. Mozscape and Ayima V2 filter out the bad pages and unnecessarily deep sites by design, in order to improve their data accuracy and show the links that count. But when you need to know where the bad PageRank zero/null links are, Ahrefs wins the game.

    So we've covered the best data for "mentions", the best data for "link cleanup", now how about the best for competitor comparison and market analysis? The chart below shows an even more granular filter, removing dead links, filtering by unique Class C IP blocks and removing anything below a PageRank 1. By using Google's PageRank data, we can filter the links from pages that hold no value or that have been penalized in the past. Whilst some link data providers do offer their own alternative to PageRank scores (most likely based on the original Google patent), these cannot tell whether Google has hit a site for selling links or for other naughty tactics.
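
A rough Python sketch of that granular filter (it assumes you've already gathered toolbar PageRank values separately, since there's no official PR API):

    import socket
    from urllib.parse import urlparse

    def filter_links(linking_urls, pagerank):
        # pagerank: an assumed {url: toolbar PR} lookup gathered beforehand.
        seen_blocks, kept = set(), []
        for url in linking_urls:
            if pagerank.get(url, 0) < 1:
                continue  # drop PR zero/unknown pages
            host = urlparse(url).hostname
            try:
                ip = socket.gethostbyname(host)
            except (socket.gaierror, TypeError):
                continue  # unresolvable host
            block = ".".join(ip.split(".")[:3])  # Class C, e.g. "93.184.216"
            if block not in seen_blocks:  # count one link per /24 block
                seen_blocks.add(block)
                kept.append(url)
        return kept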

Whilst Ahrefs and MajesticSEO hit the top spots, the amount of processing power needed to clean their data to the point of being useful makes them untenable for most people. I would therefore personally only use Ayima V2 or Mozscape for comparing websites and analyzing market potential. Ayima V2 isn't available to the public quite yet, so let's give this win to Mozscape.

    So in summary

    • Ahrefs - Use for link cleanup
    • MajesticSEO - Use for mentions monitoring
    • Mozscape - Use for accurate competitor/market analysis

    Juicy Data Giveaway

    One of the best parts of having your own index, is being able to create cool custom reports. For example, here's how the big SEO websites compare against each other:

    "Index Rank" is a ranking based on who has the most value-passing Unique Class C IP links across our entire index. The league table is quite similar to HitWise's list of the top traffic websites, but we're looking at the top link authorities.

    Want to do something cool with the data? Here's an Excel spreadsheet with the Top 10,000 websites in our index, sorted by authority: Top 10,000 Authority Websites.


    Rob Kerry is the co-founder of Ayima, a global SEO Consultancy started in 2007 by the former in-house team of an online gaming company. Ayima now employs over 100 people on 3 continents and Rob has recently founded the new Ayima Labs division as Director of R&D.

    Bing Offers Up a Free Link Graph

    Bing refreshed their webmaster tools offering & now allows you to look up link data for 3rd party sites.

    We recently interviewed Bing's Duane Forrester about the new SEO tools & their product roadmap.

    Here is a screenshot of their new link explorer, but I highly recommend setting up an account and checking it out firsthand.

    For a long time Yahoo! provided great link data, but most other search engines were more reserved with sharing link data for competing sites. What were some of the driving forces behind Bing opening up on this front?

    Bing values the power of strong partnerships as one way to spur innovation and deliver compelling experiences for our users. For any partnership to be effective, remaining as transparent as possible is critical, including those we forge with agency and publisher partners. Sharing link information was something very clearly asked for by tool users, so after doing the internal work to see if we could provide the information, it was an easy decision to build this tool when the answer came back positive. You wanted it, we had it and could share it. Done.

    As a search engine your web index is much much larger than most SEO tools. On Twitter Rand mentioned that the index size of Bing's new Link Explorer was fairly comparable to Open Site Explorer. Is the link data offered in the tool a select slice of the index? Were you trying to highlight the highest quality link sources for each site?

    We see the entire index, or at least "can" see the entire index and link ecosystem. We’re limited to the actual number we can show at any given time, however.

    Currently it appears as though the tool lists link source URLs & page titles. Will the tool also add anchor text listings at some point?

On the list – sometimes we run into data sourcing issues, so when we hit those walls, it takes us longer to add features. Bing WMT pulls data from all the sources available within Bing Search, and sometimes those have limits imposed for other reasons. In those cases, we must abide by those rules or seek to influence changes to increase our own access/capacities. A search engine is a complex thing it turns out… :)

    There are filters for "anchor text" and "additional query." What are the differences between these filters?

    Anchor Text is pretty clear to most SEOs. "Additional Query" allows you to look for, as an example, a page with "N" text appearing on it. So text not just as "anchor text", but simply appearing on the page.

    Currently if I search for "car" I believe it will match pages that have something like "carson" on it. In the future will there be a way to search for an exact word without extra characters?

I’m going to split this answer. Users can enable “Strict” filtering to only see “cars” data by selecting the “Strict” box. To your point, however, this is why some of our tools are Beta. We will continually refine them as time goes on, adding features folks find useful.

    Will you guys also offer TLD-based filters at some point?

    First time anyone's mentioned it, so I’ll add this to our list for consideration.

    A few years ago my wife was at a PPC seminar where a Bing representative stated that the keyword search data provided in the tools matched your internal data. Is this still the case?

    Bing Advertising is completely separate from Webmaster Tools. I’m not sure if that rep was meaning data within the adCenter tools matches data or what. Bing WMT does import CPC data to showcase alongside keywords which sent traffic to your site. That data matches as we pull direct from adCenter. The data we show through our tools comes direct from Bing Search, so that’s a match if this is what you’re referring to.

    Bing's Webmaster tools offers an API with keyword research & link data. Bing's Ad Intelligence is easily one of my 3 favorite SEO tools. Will Bing eventually offer a similar SEO-oriented plugin for Excel?

    No plans on the roadmap for an Excel plugin.

    At SMX Derrick Connell suggested that there was a relevancy perception gap perhaps due to branding. What are some of the features people should try or things they should search for that really highlight where Bing is much stronger than competing services?

    Without doubt people should be logging in and using the Facebook integration when searching. This feature is tremendously helpful when you’re researching something, for example, as you can reach out directly to friends for input during your research process. While searching, keep your eyes open for the caret that indicates there is more data about a specific result. Hovering over that activates the “snapshot” showing the richer experience we have for that result. Businesses need to make sure they focus on social and managing it properly. It’s not going away and those who lag will find themselves facing stiff, new competition from those getting social right. Businesses also need to get moving adopting rich snippets on their sites. This data helps us provide the deeper experiences the new consumer interface is capable of in some cases.

You have written a couple of books & done a significant amount of offline marketing. One big trend that has been highlighted for years and years is everything moving online, but as search advances do you see offline marketing becoming an important point of differentiation for many online plays?

    In a way yes. In fact, with the simplification of SEO via tools like our own and many others, more and more businesses can get things done to a level on their own. SEO will eventually become a common marketing tactic, and when that hits, we’re right back to a more traditional view of marketing: where all tactics are brought to bear to sell a product or service. Think of this…email marketing is still one of the single best converting forms of marketing in existence. Yet so many businesses focus on SEO (drive new traffic!) instead of email (work with current, proven shoppers!).

    In fact, neither alone is the "best" strategy for most online businesses. It’s a blend of everything. Social happens either with you or without you. You can influence it, and by participating, the signals the engines see change. We can see those changes and it helps us understand if a searcher might or might not have a good experience with you. That can influence (when combined with a ton of other factors, obviously) how we rank you. Everything is connected today. Complex? Sure, but back in the day marketers faced similar complexity with their own programs. Just a new "complex" for us today. More in the mix to manage.

    What is the best part about being an SEO who also works for a search engine?

    On Wednesday, June 6th at 10AM PST, I was part of the team that brought a new level of tools forward, resetting expectations around what Webmaster Tools should deliver to users. Easily one of the proudest moments of my life was that release. While I’m an SEO and I work for the engine, the PM and Lead Engineer on the WMT product are also SEOs. ;) To say Bing is investing in building the partnership with SEOs is no mere boast. Great tools like this happen because the people building them live the life of the user.

    What is the hardest part about being an SEO who also works for a search engine?

Still so few people around me that speak this language. The main difficulty is in trying to understand the sheer scope of search. Because everything you thought you knew as an SEO takes on an entirely different dimension when you’re inside the engine. Imagine taking every SEO conversation and viewing it through a prism. So many more things to consider.

    And, finally, nothing against Matt here, but why are dogs so much better than cats?

    1 – they listen to you and execute commands like a soldier
    2 – generally, they don’t crap in your house
    3 – you can have a genuine conversation with a dog
    4 – one of my dogs drives
    5 – when was the last time your cat fetched anything for you?
    6 – your dog might look at you funny, but won’t hiss at you
    7 – guard cat? Hardly… you’d be better off with peacocks in the yard.
    8 – dogs make great alarm clocks
    9 – even YOU know you look strange walking your cat on a leash…
    10 – dogs inspire you to be a better person

    -----

    Thanks for the interview Duane & the great new tools. :)

    Duane also did a video review of their new tools on SEOmoz, which highlights how they show rank & traffic data on a per keyword & per page basis. To learn more about Bing, subscribe to their search blog & their webmaster central blog. Duane also shares SEO information on Twitter @DuaneForrester & via his personal blog.

    Citation Labs Review - Here's Why I Use it

    Apr 12th

So what are we calling it today? Link building, link prospecting, content marketing, linkbait, socialbait, PR? Whatever it is, and whatever sub-definitions exist for it, the process of finding quality, related websites to link back to yours is difficult and time-consuming work.

    As with most processes associated with SEO campaigns, or website marketing campaigns in general, enterprising folks have built tools to make our lives a little easier and our time more fruitful and productive. A couple of those enterprising fellows are Garrett French and Darren Shaw (from Whitespark.Ca) over at Citation Labs.

Garrett has a suite of link building tools available; many of them complement his flagship tool, the Link Prospector.

    Getting Started

    So let's assume I've been contracted to embark on a link building campaign for SeoBook :) It's very easy to create a campaign and get up and running:

    Create your campaign:

    clabs-1

    Move right into the prospects section:

    clabs-2

    Start prospecting :)

    clabs-3

    Selecting a Report

The nice thing about this tool is that it's designed for a specific purpose: link prospecting. It's not bloated with a bunch of other stuff you may not need, and it's easy to use yet powerful because it focuses on doing one thing and doing it very well.

    The UI of this tool is right on the money, in my opinion. Garrett has built in his own queries to find specific types of links for you (preset Reports). Here you can see the reports available to you, which are built to help you find common link types:

    clabs-4

    Customizing Your Prospecting

    As you can see, there are a variety of built in queries available which run the gamut of most of the link outreach goals you might have (interviews, resource pages, guest posts, directories, and so on). Once you settle on the report type it's time to select additional parameters like:

    • Region
    • Web or Blog, or Web AND Blog results
    • Search Depth (You can go up to 1,000 deep here, but if you make use of your exclusion lists you shouldn't have to dive that deep)
    • TLD Options
• Date Range (Google's "past hour, day, week, month, year, or anytime" options)

Try to make your queries as relevant but broad as possible to get the best results. Searches that are too specific will either net too few results or many of your direct competitors. Here, you can see my report parameters for interviews I may want to do in specific areas of SEO (Garrett includes a helpful video on that page, which I highly recommend watching):

    clabs-5

    Using Exclusions

    The use of exclusions is an often overlooked feature of this toolset. Brands are all over the SERPs these days so when you have the Link Prospector go out to crawl potential link sources based on keywords/queries, you'll want to make sure you exclude sites you are fairly certain you won't get a link from.

    You may want to exclude such sites as Ebay, Amazon, NewEgg, and so on if you are running a site about computer parts. You can put your exclusions into 2 categories:

    • Global Exclusions
    • Campaign Exclusions

Global exclusions apply to each campaign automatically. You might want to go out and download top 100 (or top 1,000) site lists to stick in the Global Exclusions area, or simply add specific sites you know are irrelevant to your prospecting on the whole. To access Exclusion lists, just click on the exclusion option. From there, it's just a matter of entering your domains:

    clabs-6

    Campaign exclusions only apply to a specific campaign. This is good news if you provide link building services and work with a variety of clients; you are not constrained to one draconian exclusion list. In speaking with Garrett, he does mention that this is an often overlooked feature of the toolset but one of the most effective features (both Global and Campaign exclusions).
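
Mechanically, the two exclusion tiers amount to a simple set union before filtering. A minimal Python sketch with illustrative domains:

    GLOBAL_EXCLUSIONS = {"google.com", "ebay.com", "amazon.com"}

    def _bare(domain):
        # Normalize "www.domain.com" and "domain.com" to one form.
        d = domain.lower().strip()
        return d[4:] if d.startswith("www.") else d

    def filter_prospects(prospect_domains, campaign_exclusions=frozenset()):
        # Global exclusions apply to every campaign; campaign
        # exclusions apply only to the current one.
        excluded = ({_bare(d) for d in GLOBAL_EXCLUSIONS} |
                    {_bare(d) for d in campaign_exclusions})
        return [d for d in prospect_domains if _bare(d) not in excluded]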

    Working With the Data

So I ran my report, which was designed to find interviewees within certain broader areas of the SEO landscape. The tool will confirm submission of your request and email you when it's complete. At any time you can go in and check the status of your reports by going to Prospects -> View Prospects. Here's what the queue looks like:

    clabs-7

The results are presented in a web interface but can be easily exported to Excel. From the web interface, you can see:

    • Total Domains
    • Total Paths (pages on the domain where relevancy exists, maybe we would find a relevant video channel on YouTube where it makes sense to reach out)
    • TLD
    • LTS - Link Target Score
    • PR of Domain
    • Export Options

    LTS is a proprietary score provided by Citation Labs (essentially a measure of domain frequency and position within the SERPs pulled back for a given report).
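
Since the exact formula is proprietary, here is only an assumed stand-in, in Python, showing the general shape of a frequency-and-position score:

    from collections import defaultdict

    def link_target_scores(serp_rows):
        # serp_rows: (query, position, domain) tuples from the report's
        # underlying searches. A domain scores higher the more often it
        # appears and the better it ranks.
        scores = defaultdict(float)
        for _query, position, domain in serp_rows:
            scores[domain] += 1.0 / position  # pos 1 adds 1.0, pos 10 adds 0.1
        return dict(scores)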

    If we expand the domain to see the paths, using Search Engine Land as an example, we can see pages where targets outside of the main domain might exist for our interviewing needs:

    clabs-9

    This is where Citation Labs really shines. Rather than just spitting back a bunch of domains for you to pursue at a broad level, it breaks down authoritative domains into specific prospecting opportunities which are super-relevant to your query/keyword relationship.

If you are on Windows (or run Windows via a virtual machine) you can use SEO Tools for Excel to take all these URLs, or the ones you want to target, and pull in social metrics, backlink data, and many other data points to further refine your list.

    You can also import this data right into Buzzstream (export from Citation Labs to a CSV or Excel, then import into Buzzstream) and Buzzstream will go off and look up relevant social and contact details for outreach purposes.

    We recently did a Buzzstream Review that you might find helpful.

    You can also utilize Garrett's Contact Finder for contact research.

    Creating Your Own Queries

Another nice thing about Citation Labs' Link Prospector is that you can enter your own query parameters. You are not locked in to any specific type of data output (even though the built-in ones are solid). You can do this by selecting "Custom" in the report selection field.

    In the Custom Report area you can create your own search operators along with the following options:

    • Region
    • Web or Blog, or Web AND Blog results
    • Search Depth (You can go up to 1,000 deep here, but if you make use of your exclusion lists you shouldn't have to dive that deep)
    • TLD Options
• Date Range (Google's "past hour, day, week, month, year, or anytime" options)

    One of the tools we mention quite a bit inside the forums is the Solo SEO Link Search Tool. You can grab a lot of search operators from that tool for your own use inside the Citation Labs tool.

    Garrett's Pro Tips

    Can you give us some tips on using the right phrases?

    One objection I hear from folks who test the link prospector is "my results are full of competitors." This is typically because the research phrases they've selected don't line up with the type of prospects they're seeking. And more often than not it's because they've added their target SEO keywords rather than "category keywords" that define their area of practice.

    The solution is simple though - you just need to experiment with some "bigger head" phrases. Instead of using "Atlanta Divorce Lawyer" for guest post prospecting, try just "Divorce Lawyer," or even "Divorce."

    And I'd definitely recommend experimenting with the tilde "~Divorce" as it will help with synonyms that you may not have thought of. So if you're looking for guest posting opportunities for a divorce lawyer your five research phrases could look like this:

    divorce
    ~divorce
    ~divorce -divorce
    Divorce ~Lawyer
    "family law"

    The link prospector tool will take these five phrases and combine them with 20+ guest posting footprints so we end up doing 100+ queries for you. And there WILL be domain repetitions due to the close semantic clustering of these phrases. This overlap can help "float up" the best opportunities based on our LTS score (which is essentially a measurement of relevance).
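
The expansion itself is just a cross product of research phrases and footprints. A quick Python sketch (the footprints listed are illustrative; the tool's own list of 20+ is not public):

    from itertools import product

    research_phrases = ['divorce', '~divorce', '~divorce -divorce',
                        'Divorce ~Lawyer', '"family law"']

    footprints = ['"guest post"', '"write for us"',
                  '"guest article"', '"submit a guest post"']

    queries = ['%s %s' % (p, f) for p, f in product(research_phrases, footprints)]
    print(len(queries))  # 5 phrases x 4 footprints = 20 queries
    print(queries[0])    # divorce "guest post"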

    All this said there are PLENTY of situations where using your SEO keywords can be productive... For example in guest posting it's common for people to use competitive keywords as anchor text. You could (and yes I'm completely contradicting my example) use "Atlanta Divorce Lawyer" as a guest posting research phrase along with your other target SEO KWs. The prospects that come back will probably have been placed by competitors.

    How do you fine-tune your research phrases?

    I often test my research phrases before throwing them in the tool. Let's go back to the divorce guest posting example above. To test I simply head to Google and search [divorce "guest post"]. If I see 4 or more results in the top 10 that look like "maybes" I consider that a good keyword to run with. The test footprint you should use will vary from report-type to report-type.

    A good links page test is to take a potential research phrase and add intitle:links. For content promoters you could combine a potential research phrase with intitle:"round up".

    I find that this testing does two things. For one it helps me drop research phrases that are only going to clog my reports with junk.

    Secondarily I often discover new phrases that are likely to be productive. Look back at the list of divorce research phrases above - the last one, "family law," is there because I spotted it while testing [~divorce "guest post"]. Spending time in Google is always, always productive and I highly advise it.

    What tips can you give us regarding proper Search Depth usage?

    Depth is a measure of how many results the link prospector brings back from Google. How often do you find useful results on the third page of Google? How about the tenth page? There's a gem now and again, but I find that if I've carefully selected 5 awesome research phrases I save time by just analyzing the results in the top 20.

    Your mileage may vary, and the tool DOES enable users to scrape all the way down to 1000 for those rare cases where you have discovered a mega-productive footprint. Test it once for sure, don't just take my word for it - my guess is you'll end up with tons of junk that actually kills the efficiency that the tool creates.

    Any more expert tips on how to best use phrases and search operators?

You can add advanced search operators in all your research phrases. Combine them with your research phrases and try them out in Google first (see tip 2) and then use them as you see fit. I use the heck out of the tilde now, as it saves me time and aids in research phrase discovery when I vet my phrases in Google. The tilde even works in conjunction with the wildcard operator (*).

So if you're looking for law links pages you could test [~law* intitle:links] and then add ~law* as one of your research phrases if it seems productive. It's not super productive by the way, because the word "code" is a law synonym... but I wouldn't have known if I didn't test, and if I didn't test I'd end up with Link Prospector results that don't have anything to do with the targets I'm seeking.

    Any tips on how to best leverage Exclusions (beyond putting in sites like google.com into your Global Exclusions :D )

If you have junk non-opportunities that keep turning up in your reports, add the domain as domain.com and www.domain.com to the exclusions file. Poof. It's gone from future reports you run.

    You can even add the domains you've already viewed so they won't show up anymore. Be careful though - make sure you're adding them to your campaign-level excludes rather than Global.

    How often do you update the tool and what is coming down the pike?

    If you sign up and you find yourself asking "I wonder what would happen if I..." please write me an email. If I don't have an answer for you I will send you credits for you to do some testing. I will end up learning from you. I have users continually pushing the limits with the tool and finding new ways to use it.

    We've added PR for domains, titles and snippets for each URL, blog-only search, and fixed numerous bugs and inefficiencies based on requests from our users. We're also bringing in DA, MozRank and an API because of user requests.

    Thanks Garrett!!

    Free Trial and Pricing

    Citation Labs is currently offering a free trial. They have monthly and per credit (love that!) pricing as well. You can find their pricing structure here.

    AHREFS Review: An In-Depth Look at a New Link Research Tool

    Feb 1st

    Ahrefs is the newest entry into the link research tool space. They use their own bot and their own index (which they state is based on information from a trillion website connections).

    They claim their index is updated every 30 minutes and the fresh data is available to their users within 30 minutes of the actual index refresh.

Ahrefs also has a ranking database of roughly 45 million keywords from 9 different countries (US, GB, FR, RU, DE, ES, IT, AU, BR). The tools within their membership are:

    • Site Explorer
    • SERPs Analysis
    • Reports
    • Labs/Tools

Their pricing is very straightforward and only increases or decreases based on the volume of data you have access to. You can check out the easy-to-understand pricing on their pricing page (and they offer SEO Book readers a 50% discount on the first month).

    Site Explorer

Ahrefs' Site Explorer functions in a similar way to Majestic's Site Explorer and SEOmoz's Site Explorer. You can choose a specific URL, the domain without subdomains, or the domain with all its subdomains:

    ahrefs-site-explorer

If we look at the Site Explorer results, you'll see an overview of the last 45 days or so from Ahrefs' crawl history:

    ahrefs-site-explorer-overview

On the left you can see some interesting stats like the total number of backlinks, different referring IPs and subnets (Class C blocks and such), unique domains, and the types of backlinks the site has (text, image, redirects, and so on).

In addition to the overview report, you have other research options to choose from:

    • New Links
    • Lost Links (great opportunity for you to swoop in and alert the linker + sell them on linking to you and your resources)
    • Anchor Text Profile
    • Pages Crawled on the Site
    • Referring Domain Breakdown
    • SERP Positions (organic ranking report)
    • Raw Export of the Data (up to your limits based on your pricing plan)

    New Links

    In the New Link tab you can go back to a previous month, or work inside the current month, and find newly discovered links by the day. Here is what that looks like:

    site-explorer-new-links

    Click on whatever day you want and you'll get a list of linking urls, the target link page, and the anchor text used for the link:

    site-explorer-new-links-results

    This report can help you reverse engineer, down to the day, a link building campaign that your competitor is running (always good to be out in front of a big link push by a competitor) and can also help you evaluate your own link campaign or even help you spot a link growth issue that may have resulted in some kind of penalty or over-optimization filter.

    Now keep in mind that, based on their stated crawling guidelines, the stronger links from stronger sites tend to get crawled more frequently so the spammiest of the spammy link approaches might not get picked up on. For that level of deep research a historical report from Majestic SEO and a link status checker, like Advanced Link Manager, is likely a better bet.

    You can export this report to Excel or .CSV format.

    Lost Links

    The Lost Links tab has the same interface as the New Links report does. For your own domain you might want to consider tracking your own links in something like Raven or Buzzstream but this tool does report dropped links down to the day. Combine that with their crawling preferences (better links = quicker attention) and you can spot drops of substance quickly.

You can use this report to find links that a competitor has lost, then contact the webmaster and see if you can promote your site or similar content to earn the link your competitor was previously getting.

    You can export this report to Excel or .CSV format.

    Anchor Text

    The anchor text report is exactly what you expect it to be. It lists the anchor texts of external links, the number of occurrences, as well as an expandable dropdown menu to see the pages being linked from and the pages being linked to on the site you are researching.

    site-explorer-anchor-text-report

    You can export to Excel or .CSV and choose to export everything, up to your limit, or just the current page.

    Crawled Pages

    This report will show you all the pages crawled by Ahrefs with the following stats:

    • Page URL and Title
    • Crawl Date
    • Page Size
    • Internal Links
    • External Links

    site-explorer-crawled-pages

I would likely use this report (on competitors) for checking some of their more popular internally linked-to pages as well as checking out how they structure their site. You can also jump right to a Site Explorer report for any of the URLs listed on that report, as well as check the SERP positions for any of them.

    Referring Domains

    One thing I like about Ahrefs is that it's straight and to the point. It's very easy to get in, get your data, and get out. Each report does pretty much what you expect it to. This report shows the referring domains plus the number of links coming from each domain. You can access the links from each domain by clicking the Expand button next to the referring domain:

    [Screenshot: Ahrefs referring domains]

    SERP Positioning

    Similar to SEM Rush, Ahrefs provides estimated ranking data for keyword sets on both Google and Bing/Yahoo in multiple countries (US, UK, AU, DE, FR, ES, IT, BR, RU). The tool shows the:

    • Position
    • Keyword
    • CPC
    • Estimated cost
    • Ranking URL
    • Global search volume
    • Advertiser competition
    • Last date checked
    • Rating (estimated visitors per month based on an assumed traffic distribution by position; see the sketch below)

    [Screenshot: Ahrefs SERP history]

    The other cool thing about this report is that it shows the change since the last time they checked the ranking.
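
    That Rating metric looks like the standard CTR-curve approach: multiply a keyword's search volume by an assumed click-through rate for its position. A rough sketch; the CTR values below are my own illustrative assumptions, not Ahrefs' actual curve:

        # Illustrative CTR assumptions by organic position -- not Ahrefs' curve.
        ASSUMED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

        def estimated_visits(monthly_volume, position):
            """Estimate monthly visitors from search volume and ranking position."""
            return monthly_volume * ASSUMED_CTR.get(position, 0.02)

        print(estimated_visits(10000, 2))  # 1500.0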

    SERPs Analysis

    This is similar to the SERP positioning report. Essentially, you enter a URL and you get the Google and Bing/Yahoo ranking data with the same metrics listed above.

    In addition to that, you also have the following reports:

    • Daily Stats
    • History of Changes
    • Ads Analysis

    Daily Stats

    [Screenshot: Ahrefs daily stats]

    This report shows you, on a daily basis, the following data points:

    • New Keywords
    • Lost Keywords
    • Total Keywords that moved up
    • Total Keywords that moved down
    • Total Positions up
    • Total Positions down
    • Rating Change (estimated percentage of traffic gained or lost)
    • Cost Change (rating change * CPC)
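
    To make those last two concrete, here is the arithmetic as I read it, with made-up figures rather than output from the tool:

        # Made-up figures for illustration only.
        visits_before = 4000   # rating: estimated visitors per month
        visits_after = 5200
        avg_cpc = 1.50         # average CPC across the keyword set

        rating_change = visits_after - visits_before  # +1,200 estimated visits
        cost_change = rating_change * avg_cpc         # +$1,800/month in traffic value

        print(rating_change, cost_change)  # 1200 1800.0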

    There are graphical charts for:

    • Search Engine Traffic (shown above)
    • Keyword Trend (total keywords ranking)
    • Traffic Cost
    • Bar Graph for New and Lost Keywords
    • Estimated Traffic Changes
    • Estimated Traffic Cost Changes

    History of Changes

    This report breaks down the keyword changes by day and how much the specific keyword moved up/down (and the corresponding page that is ranking).

    You can look at a daily report, a 7-day report, a 30-day report, or a custom range.

    [Screenshot: Ahrefs history of changes]

    Ads Analysis

    Ahrefs also incorporates PPC data from Google (and Bing/Yahoo, though I had a hard time getting figures for Bing/Yahoo). You can pull in the ranking of the ad, the ad text, volume & CPC data, as well as the last updated date & competition levels.

    [Screenshot: Ahrefs ad history]

    You can look at just the keyword/ranking data or choose from their other 2 reports: keywords/ranking + ad text (Table + Ads) or just the PPC ad text itself (Ads Preview).

    Reports

    You can create reports for your own domain for free, or for any other site as a part of your subscription. Each domain counts as a separate report, so you can enter as many as you are entitled to in this interface, but they do count against your monthly allowance.

    [Screenshot: Ahrefs reporting]

    The report overview looks like this:

    [Screenshot: Ahrefs report overview]

    Each tab represents a data point you can review. In any tab you can choose to export the visible page or the entire report.

    There are quite a few filtering options here, as you can see below:

    [Screenshot: Ahrefs filtering]

    Your filtering options, report-wide, are:

    • URLs from - you can include or exclude based on user-defined data (exclude by word(s), domain extension, and so on)
    • Backlinks Type - you can choose to show, specifically, different backlink types (nofollow, image, frame, redirect, form, deleted)
    • Pages - show links only to a specific page
    • Subdomain - show links only to a specific subdomain
    • Countries - show links from specific countries
    • Anchors - show or exclude specific anchor text links
    • Referring Domains - show links from a specific domain, or set of domains only
    • IP - show links from a specific IP or range of IPs only
    • Subnets - show backlinks only from specific subnets
    • TLD - show links from specific TLDs only
    • Date - show links based on specific crawl period

    The cool thing here is that you can layer on the filters as you wish. The following screenshot shows all filters selected and available:

    [Screenshot: Ahrefs report filtering]

    The reporting is really quite powerful and provides numerous ways to quickly filter out junk links so you can focus on the good stuff.
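
    The same layered-filter idea works on the raw export too, if you prefer a spreadsheet or script. A minimal pandas sketch, with assumed column names:

        import pandas as pd

        links = pd.read_csv("backlinks_export.csv")  # assumed export file name

        # Layer filters the same way the web interface does:
        filtered = links[
            (links["Type"] != "nofollow")                 # followed links only
            & (links["TLD"].isin(["com", "net", "org"]))  # specific TLDs only
            & (links["Anchor"].notna())                   # drop rows with no anchor
        ]

        print(len(links), "->", len(filtered), "links after filtering")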

    Labs

    There are 3 additional tools in their Labs section.

    • Ahrefs Top - Top 1 million domains by number of backlinks, completely searchable
    • Domain Comparison - compare up to 5 domains for different link metrics (see below)
    • Batch Domains - (see below) dump in a bunch of URLs and get a total count of backlinks, referring domains, and IPs. Unsure of the limit here, but I did about 25 with no problem.

    Here is a screenshot of the domain comparison feature:

    [Screenshots: Ahrefs domain comparison]

    The Batch Domains feature looks like this (and is completely exportable!):

    [Screenshot: Ahrefs Batch Domains]

    Ahrefs is Worth a Spin

    I was impressed with the speed of this tool, the exportability of the data, and the report filtering capabilities. It never hurts to have another link database to pull from, especially one that is updated every 30 minutes.

    The tool is quite easy to use and it does pretty much what you expect it to. If you are into link research you should give this tool a try. The database appears to be a fairly good size for a new entrant, and the ability to slice and dice that data from right within the web interface is a solid feature. If you do try it out, let us know what you think! We are also adding their link data to SEO for Firefox & the SEO Toolbar today.

    SEO Spyglass Review: A Brand New Link Source

    Dec 22nd

    SEO Spyglass is one of the 4 tools Link-Assistant sells, both individually and as a part of their SEO Power Suite.

    We did a review of their Rank Tracker application a few months ago and we plan to review their other 2 tools in upcoming blog posts.

    Update: Please note that in spite of us doing free non-affiliate reviews of their software, someone spammed the crap out of our blog promoting this company's tools, which is at best uninspiring.

    Key Features of SEO Spyglass

    The core features of SEO Spyglass are:

    • Link Research
    • White Label Reporting
    • Historical Link Tracking

    As with most software tools there are features you can and cannot access, or limits you'll hit, depending on the version you choose. You can see the comparison here.

    Perhaps the biggest feature is their newest one: they recently launched their own link database (a couple of months early, in beta), as the tool had been largely dependent on the now-dead Yahoo! Site Explorer.

    The launch of a third or fourth-ish link database (Majestic SEO, Open Site Explorer, and Ahrefs rounding out the others) is a win for link researchers. It still needs a bit of work, as we'll discuss below, but hopefully they plan on taking some of the better features of the other tools and incorporating them into their own.

    After all, good artists copy and great artists steal :)

    Setting Up a Project for a Specific Keyword

    One of my pet peeves with software is feature bloat which in turn creates a rough user experience. Link-Assistant's tools are incredibly easy to use in my experience.

    Once you fire up SEO Spyglass you can choose to research links from a competing website or links based off of a keyword.

    Most of the time I use a competitor's URL when doing link research, but SEO Spyglass doubles as a link prospecting tool as well, so here I'll pick a keyword I might want to target: "SEO training".

    The next screen is where you'll choose the search engine that is most relevant to where you want to compete. They have support for a bunch of different countries and search engines, and you can see the breakdown on their site.

    So if you are competing in the US, you can pull data for the top ranking site from the following engines (only one at a time):

    • Google
    • Google Blog Search
    • Google Groups
    • Google Images
    • Google Mobile
    • YouTube
    • Bing
    • Yahoo! (similar to Bing of course)
    • AOL
    • Alexa
    • Blekko
    • And some other smaller web properties

    I'll select Google. The next screen is where you select the sources you want SEO Spyglass to use to grab the links of the competing site it finds from the preceding screen:

    So SEO Spyglass will grab the top competitor from your chosen SERP and will run multiple link sources against that site (I would love to see some API integration with Majestic and Open Site Explorer here).

    This is where you'll see their own Backlink Explorer for the first time.

    Next you can choose unlimited backlinks (Enterprise Edition only) or you can limit it by Project or Search Engine. For the sake of speed I'm going to limit it to 100 links per search engine (the ones we selected in a previous screen) and exclude duplicates (links found in more than one engine) just to get the most accurate, usable data possible:

    When you start pinging engines, specifically Google in this example, you'll routinely get captchas like this:

    On this small project I entered about 8 of them and the project found 442 backlinks (here is what you'll see after the project is completed):

    One way around captchas is to pay someone to run this tool for you and enter them manually, but for large projects that is not ideal, as captchas will pile up and your IP could get temporarily banned.

    Link-Assistant offers an Anti-Captcha plan to combat this issue; you can see the pricing here.

    Given the size of the results pane it is hard to see everything but you are initially returned with:

    • an icon of what search engine the link was found in
    • the backlinking page
    • the backlinking domain

    Spyglass will then ask you if you want to update the factors associated with these links.

    Your options by default are:

    • domain age
    • domain IP
    • domain PR
    • Alexa Rank
    • Dmoz Listing
    • Yahoo! Directory Listing
    • On-page info (title, meta description, meta keywords)
    • Total links to the page
    • External links to other sites from the page
    • PageRank of the page itself

    You can add more factors by clicking the Add More button, which takes you to the Spyglass Preferences pane:

    You can add a ton of social media stuff here, including popularity on Facebook and Google+, page-level Twitter mentions, and so on.

    You can also pick up bookmarking data and various cache dates. Keep in mind that the more you select, especially stuff like cache date, the more likely you are to run into captchas.

    SEO Spyglass also offers Search Safety Settings (inside of the preferences pane, middle of the left column in the above screenshot) where you can update human emulation settings and proxies to both speed up the application and to help avoid search engine bans.

    I've used Trusted Proxies with Link-Assistant and they have worked quite well.

    You can't control the factors globally; you have to set them for each project. You can, however, configure Spyglass to only offer you specific backlink sources.

    I'm going to deselect PageRank here to speed up the project (you can always update later or use other tools for PageRank scrapes).

    Working With the Results

    When the data comes back you can do a number of things with it. You can:

    • Build a custom report
    • Rebuild it if you want to add link sources or backlink factors
    • Update the saved project later on
    • Analyze the links within the application
    • Update and add custom workspaces

    These options are all available within the results screen (again, this application is incredibly easy to use):

    I've blurred out the site information as I see little reason to highlight the site here. But you can see where the data has populated for the factors I selected.

    In the upper left hand corner of the application is where you can build the report, analyze the data from within the application, update the project, or rebuild it with new factors:

    All the way to the right is where you can filter the data inside the application and create a new workspace:

    Your filtering options are seen to the left of the workspaces here. It's not full-blown filtering and sorting, but if you are looking for some quick information on specific link queries, it can be helpful.

    Each item listed there is a Workspace. You can create your own or edit one of the existing ones. Whatever factors you include in the Workspace are what will show in the results pane.

    So think of Workspaces as your filtering options. Your available metrics/columns are:

    • Domain Name
    • Search Engine (where the link was found)
    • Last Found Date (for updates)
    • Status of Backlink (active, inactive, etc)
    • Country
    • Page Title
    • Links Back (does the link found by the search engine actually link to the site? This is a good way of identifying short term, spammy link bursts)
    • Anchor Text
    • Link Value (essentially based on the original PageRank formula)
    • Notes (notes you've left on the particular link). This is very limited and is essentially a single Excel-type row
    • Domain Age/IP/PR
    • Alexa Rank
    • Dmoz
    • Yahoo! Directory Listing
    • Total Links to page/domain
    • External links
    • Page-level PR

    Most of the data is useful, though I think the link value metric is a bit overvalued: in my experience, links that showed 0 link value in the tool often clearly benefited the sites they pointed to.
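
    For reference, the original PageRank formula that link value is presumably modeled on looks like this. A toy implementation for illustration, not Link-Assistant's actual calculation:

        # Toy PageRank: PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over pages T linking to A.
        def pagerank(graph, damping=0.85, iterations=20):
            """graph maps each page to the list of pages it links out to."""
            ranks = {page: 1.0 for page in graph}
            for _ in range(iterations):
                ranks = {
                    page: (1 - damping) + damping * sum(
                        ranks[other] / len(graph[other])
                        for other in graph if page in graph[other]
                    )
                    for page in graph
                }
            return ranks

        print(pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))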

    Bulk PageRank queries will cause lots of captchas, and given how out of date PR can be, it isn't a metric I typically include on large reports.

    Analyzing the Data

    When you click on the Analyze tab in the upper left you can analyze in multiple ways:

    • All backlinks found for the project
    • Only backlinks you highlight inside the application
    • Only backlinks in the selected Workspace

    The Analyze tab is a separate window overlaying the report:

    You can't export from this window, but if you just do a Ctrl/Cmd-A you can copy and paste into a spreadsheet.

    Your options here:

    • Keywords - keywords and ratios of specific keywords in the title and anchor text of backlinks
    • Anchor Text - anchor text distribution of links
    • Anchor URL - pages being linked to on the site and the percentages of link distribution (good for evaluating deep link distribution and pages targeted by the competing site as well as popular pages on the site...content ideas :) )
    • Webpage PR
    • Domain PR
    • Domains linking to the competing site and the percentage
    • TLD - percentage of links coming from .com, .net, .org, .info, .uk, and so on
    • IP address - links coming from specific IPs and the percentages
    • Country breakdown
    • Dmoz - backlinks that are in Dmoz and ones that are not
    • Yahoo! - same as Dmoz
    • Links Back - percentages of links found that actually link to the site in question

    Updating and Rebuilding

    Updating is pretty self-explanatory. Click the Update tab and select whether to update all the links, just the selected links, or the Workspace-specific links:

    (It's the same dialog box as when you actually set up the project)

    Rebuilding the report is similar to updating except updating doesn't allow you to change the specified search engine.

    When you Rebuild the report you can select a new search engine. This is helpful when comparing what is ranking in Google versus Bing.

    Click Rebuild and update the search engine plus add/remove backlink factors.

    Reporting

    There are 2 ways to get to the reporting data inside of Spyglass:

    There is a quick SEO Report Tab and the Custom Report Builder:

    Much like the Workspaces in the prior example, there are reporting template options on the right side of the navigation:

    It functions the same way as Workspaces do in terms of being able to completely customize the report and data. You can access your Company Profile (your company's information and logo), Publishing Profiles (delivery methods like email, FTP, and so on), as well as Report Templates in the settings option:

    You can't edit the templates that ship with the tool except by playing around with the code used to generate the report. It's kind of an arcane way to do reporting, as you can really hose up the code (below the variables in red is all the HTML):

    You can create your own template with the following reporting options:

    • Custom introduction
    • All the stats described earlier on this report as available backlink factors
    • Top 30 anchor URLs
    • Top 30 anchor texts
    • Top 30 links by "link value"
    • Top 30 domains by "link value"
    • Conclusion (where you can add your own text and images)

    Overall the reporting options are solid and offer lots of data. It's a little more work to customize the reports but you do have lots of granular customization options and once they are set up you can save them as global preferences.

    As with other software tools you can set up scheduled checks and report generation.

    Researching a URL

    The process for researching a URL is the same as described above, except you already know the URL rather than having SEO Spyglass find the top competing site for it.

    You have the same deep reporting and data options as you do with a keyword search. It will be interesting to watch how their database grows because, for now, you can (with the Enterprise version) research an unlimited number of backlinks.

    SEO Spyglass in Practice

    Overall, I would recommend trying this tool out. If nothing else, it is another source of backlinks which pulls from other search engines as well (Google, Blekko, Bing, etc).

    The reporting is good and you have a lot of options with respect to customizing specific link data parameters for your reports.

    I would like to see more exclusionary options when researching a domain, like the ability to filter out redirects and subdomain links. A quick competitive report doesn't do much good if a quarter or more of it comes from something like a subdomain of the site you are researching.

    SEO Spyglass's pricing is as follows:

    • Purchase a professional option or an enterprise option (comparison)
    • 6 months of their Live Plan for free
    • Purchase of a Live Plan required after 6 months to continue using the tool's link research functionality.
    • Pricing for all editions and Live Plans can be found here

    In running a couple of comparisons against Open Site Explorer and Majestic SEO, it was clear that Spyglass has a decent database but needs more filtering options (sub-domains mainly). It's not as robust as OSE or Majestic yet, but that's to be expected. I still found a variety of unique links in its database that I did not see in the other tools.

    You can get a pretty big discount if you purchase their suite of tools as a bundle rather than individually.

    SEM Rush Review & Free Trial SEMRush Coupons

    Oct 20th

    SEM Rush has long been one of my favorite SEO tools. We wrote a review of SEM Rush years ago. They were best of breed back then & they have only added more features since, including competitive research data for many local versions of Google outside of the core US results: UK, Russia, Germany, France, Spain, Italy, Brazil.

    Recently they let me know that they started offering a free 2-week trial to new users: try SEM Rush for free.

    Set up a free account on their website & enter the promotional code "89MW-YR43-HFNJ-K94M"

    For full disclosure, SEM Rush has been an SEO Book partner for years, as we have licensed their API to use in our competitive research tool. They also have an affiliate program & we are paid if you become a paying customer. However, we do not get paid for recommending their free trial & their free trial doesn't even require giving them a credit card, so it literally is a no-risk free trial.

    What is SEM Rush?

    SEM Rush is a competitive research tool which helps you spy on how competing sites are performing in search. The big value add that SEM Rush has over a tool like Compete.com is that SEM Rush offers CPC estimates (from Google's Traffic Estimator tool) & estimated traffic volumes (from the Google AdWords keyword tool) near each keyword. Thus, rather than showing the traffic distribution to each site, this tool can list keyword value distribution for the sites (keyword value * estimated traffic).

    As Google has started blocking some referral data, the value of using these 3rd party tools has increased.

    Normalizing Data

    Using these estimates generally does not provide overall traffic totals that are as accurate as Compete.com's data licensing strategy, but if you own a site and know what it earns, you can set up a ratio to normalize the differences (at least to some extent, within the same vertical, for sites of similar size, using a similar business model).

    One of our sites that earns about $5,000 a month shows a Google traffic value of close to $20,000 a month:
    $5,000 / $20,000 = 1/4 = 0.25

    A similar site in the same vertical shows a $10,000 traffic value:
    $10,000 * 0.25 = $2,500 estimated actual earnings
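
    As a reusable snippet, the normalization is just a ratio you apply across sites in the same vertical. A sketch using the numbers above:

        def normalize_estimate(known_earnings, known_estimate, competitor_estimate):
            """Scale a competitor's traffic-value estimate by the ratio
            observed on a site you own in the same vertical."""
            ratio = known_earnings / known_estimate  # 5,000 / 20,000 = 0.25
            return competitor_estimate * ratio

        print(normalize_estimate(5000, 20000, 10000))  # 2500.0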

    A couple of big advantages SEM Rush has over Compete.com and services like QuantCast:

    • they focus exclusively on estimating search traffic
    • you get click volume estimates and click value estimates right next to each other
    • they help you spot valuable up-and-coming keywords where you might not yet get much traffic because you rank on page 2 or 3

    Disclaimers With Normalizing Data

    It is hard to monetize traffic as well as Google does, so in virtually every competitive market your profit per visitor (after expenses) will generally be less than Google's. Some reasons why:

    1. In some markets people are losing money to buy market share, while in other markets people may overbid just to block out competition.
    2. Some merchants simply have fatter profit margins and can afford to outbid affiliates.
    3. It is hard to integrate advertising in your site anywhere near as aggressively as Google does while still creating a site that will be able to gather enough links (and other signals of quality) to take a #1 organic ranking in competitive markets...so by default there will typically be some amount of slippage.
    4. A site that offers editorial content wrapped in light ads will not convert eyeballs into cash anywhere near as well as a lead generation oriented affiliate site would.

    SEM Rush Features

    Keyword Values & Volumes

    As mentioned above, this data is scraped from the Google Traffic Estimator and the Google Keyword Tool. More recently Google combined their search-based keyword tool features into their regular keyword tool & this data has become much harder to scrape (unless you are already sitting on a lot of it like SEM Rush is).

    Top Search Traffic Domains

    A list of the top 100 domain names that are estimated to be the highest value downstream traffic sources from Google.

    You could get a similar list from Compete.com's Referral Analytics by running a downstream report on Google.com, although I think that might also include traffic from some of Google's non-search properties like Reader. Since SEM Rush looks at both traffic volume and traffic value it gives you a better idea of the potential profits in any market than looking at raw traffic stats alone would.

    Top Competitors

    Here is a list of sites that rank for many of the same keywords that SEO Book ranks for:

    Most competitors are quite obvious; however, sometimes they will highlight competitors you didn't know about, and in some cases those competitors are also working in other fertile keyword themes that you may have missed.

    Overlapping Keywords

    Here is a list of a few words where SEO Book and SEOmoz compete in the rankings:

    These sorts of charts are great for showing clients how site X performs against site Y in order to help allocate more resources.

    Compare AdWords to Organic Search

    These are sites that rank for keywords that SEO Book is buying through AdWords:

    And these are sites that buy AdWords ads for keywords that this site ranks for:

    Before SEM Rush came out there were not many (or perhaps any?) tools that made it easy to compare AdWords against organic search.

    Start Your Free Trial Today: try SEM Rush for free.

    SEM Rush Pro costs $79 per month (or $69 if you sign up recurring), so this free trial is worth about $35 to $40.

    Take advantage of SEMRush's free 2-week trial today.

    Set up a free account on their website & enter the promotional code "89MW-YR43-HFNJ-K94M"

    If you have any questions about getting the most out of SEM Rush feel free to ask in the comments below. We have used their service for years & can answer just about any question you may have & offer a wide variety of tips to help you get the most out of this powerful tool.

    Link Assistant's Rank Tracker - A Complete Review

    Oct 14th

    Link Assistant's Rank Tracker Reviewed

    Link Assistant offers SEOs a suite of tools, under an umbrella aptly named SEO Power Suite, which covers many aspects of an SEO campaign.

    Link Assistant provides the following tools inside of their Power Suite:

    • Rank Tracker - rank tracking software
    • WebSite Auditor - on-page optimization tool
    • SEO Spy Glass - competitive link research tool
    • Link Assistant - their flagship link prospecting, management, and tracking tool

    We'll be reviewing their popular Rank Tracker tool in this post. I've used their tools for a while now and have no issue in recommending them. They also claim to have the following companies as clients:

    • Disney
    • Microsoft
    • Audi
    • HP

    Rank Tracker is one of the more robust, fast, and reliable rank checking tools out there.

    Update: Please note that in spite of us doing free non-affiliate reviews of their software, someone spammed the crap out of our blog promoting this company's tools, which is at best uninspiring.

    Is Rank Tracker a Worthy Investment?

    Rank Tracker offers a few different pricing options:

    • Free
    • Pro
    • Enterprise

    All of the editions have the following features:

    • Unlimited sites
    • Unlimited keywords
    • Customizable reports (you can only save and print with the Enterprise level, however, which is kind of a drawback in my opinion; Pro accounts should have this functionality)
    • API keys
    • Human search emulation built in
    • User agent rotation
    • Proxy support
    • Proxy rotation
    • Google Analytics integration
    • Multiple language support (English, German, Russian, French, Dutch, Spanish, Slovak)
    • Runs on Windows, Mac, Linux

    All editions offer access to their keyword research features, with all the features included; the only difference is that the free edition doesn't allow KEI updates.

    Rank Tracker Feature Set

    Rank Tracker offers a keyword research tool and a rank checking component within the application. A more thorough breakdown of the feature set is as follows:

    Keyword Research

    I prefer to do my keyword research outside of tools like this. Generally, specific tools seem to excel at their chosen discipline, in this case rank checking, but fall short in the areas they try to add on. I like to use a variety of tools when doing keyword research, and it's easier for me, personally, to create and merge various spreadsheets and data points rather than doing research inside of an application.

    However, Rank Tracker does offer a one-stop shop for cumbersome research options like various search suggest methods and unique offerings like estimated traffic based on ranking #1 for that specified term.

    Overall, a nice set of keyword research features if you want to add on to the research you've already done.

    Rank Tracker also gives you the option to factor in data from Google Trends as well as through Google Analytics (see current ranking for each keyword and actual traffic).

    Rank Checking

    As this is the core piece of the tool, it's really no surprise that this part of Rank Tracker shines. Some of the interesting options here are the ability to track multiple Google search areas like images, videos, and places.

    In addition to the interesting features I mentioned above, Rank Tracker also includes a wide array of charting and design options to help you work with your data more directly and in a clearer way:

    Usability is Top Notch

    While the interfaces aren't the prettiest, this is one of the most user-friendly rank tracking tools that I've come across.

    First you simply enter the URL you wish to track. Rank Tracker will automatically find the page AND sub-domain on the domain ranking for the keywords chosen, so you don't have to enter these separately.

    You enter the site you want to check (remember, subpages and subdomains are automatically included):

    Choose from a whole host of engines and select universal search if you wish to factor in places taken up by Google insertions into the SERPs:

    Enter your keywords:

    Let Rank Tracker go to work: (you can choose to display the running tasks as line views or tree views, a minor visual preference)

    That's all there is to it. It is extremely easy to get a project up and running inside of this tool.

    Working with Rank Tracker

    Inside of Rank Tracker the data is displayed clearly, in an easy to understand format:

    In the top part you'll get to see:

    • the keywords you selected
    • current rank compared to last rank
    • overall visibility (top rankings) in each search engine selected
    • custom tags you can apply to your keywords for tracking purposes

    On the bottom chart you'll see three options for the selected search engine (bottom) and keyword (top):

    • ranking information for each search engine for the selected keyword
    • historical records (last check date and position)
    • progress graph (visual representation of rankings, customizable with sliders as shown in the picture)

    The ranking chart shows the chart for the chosen keyword and search engine:

    Within the ranking results page, you can select from these options to get a broader view of how your site is performing on the whole:

    Customizing Rank Tracker

    Inside of Rank Tracker's preferences you'll see the following options, most of which are self-explanatory:

    This is where you can take advantage of some of their cooler features like:

    • adding competitors to track
    • adding in your Google Analytics account
    • customizing your reporting templates
    • changing up human emulation settings
    • adding in a captcha service
    • scheduling reports
    • adding in multiple proxies to help with the speed of the tool as well as to prevent blocks

    You can track up to 5 competitors per Rank Tracker profile (meaning, 5 competitors per one of your sites).

    Key Configuration Options

    Rank Tracker has a ton of options, as you can see from the screenshot above. Some of the more important ones to pay attention to are the reporting options.

    You'll want to set up your company information as shown here: (this is what will show on your reports)

    On a per profile basis you can customize client-specific details like so:

    You can create new and modify existing templates for multiple report types here as well:

    Emulation settings are important: you want to make sure your requests look as normal and human as possible. It makes sense to check the "visit search engine home page" option to help the tool appear more natural, in addition to having delays between queries (again, to promote a natural approach to checking rankings/searching).

    One thing that irks me about Rank Tracker is that emulation is turned off by default. If you don't adjust your settings and you try to run a moderately sized report, you'll get a Google automated ban in short order, so be careful!
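
    The human-emulation idea mostly boils down to pacing and varying your requests so they don't fire at machine-regular intervals. A rough sketch of the pattern (illustrative only; the tool handles this internally):

        import random
        import time

        def humanized_delay(min_s=5.0, max_s=20.0):
            """Sleep a random interval so queries are irregularly spaced."""
            time.sleep(random.uniform(min_s, max_s))

        for query in ["seo training", "rank tracker review"]:
            # ... fetch and parse the SERP for this query here ...
            humanized_delay()  # pause between queries to look less bot-like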

    In addition to emulation, the search approach is also worth a bit of tinkering. Given how often Google inserts things like images, products, and videos into search results, you might want to consider using universal search when checking rankings.

    Also, the result depth is important. Going deep here can help identify sites that have been torched rather than sites that simply fell outside the top 20 or 50. 100 is a good default baseline.

    Successive search gives you a more accurate view, as it manually goes page by page rather than grabbing 100 results at a time (double listings, for example, can throw off the count when not using successive search).

    Finally, another important option is scheduling. You can schedule emails, FTP uploads, and so on (as well as rank checks) from this options panel. Your machine does have to be on for this to work (not in sleep mode, for instance). In my experience Rank Tracker has been pretty solid on this front, consistently executing the tasks you tell it to execute.

    Software versus Cloud

    There are some strong, cloud-based competitors to Rank Tracker. Our Rank Checker is a great solution for quick checks, and for ongoing tracking if you do not need graphical charts and such (though you can easily make those in Excel if you need to).

    Competitors and Options

    Raven offers rank tracking as a part of their package and there are other cloud based services like Authority Labs (who actually power Raven's tools) you can look into if you want to avoid using software tools for rank checking.

    There are some drawbacks to cloud-based rank tracking though. Some of them do not have granular date-based comparisons as they typically run on the provider's schedule rather than yours.

    Also, most cloud rank checking solutions limit how many keywords you can track. So if you are doing enterprise level rank checking, it makes sense to use a software tool plus a proxy service like Trusted Proxies.

    Pricing and Final Thoughts

    Rank Tracker offers a generous discount if you grab all their tools in one bundle. If you want to customize, schedule, and print reports you'll need the enterprise edition.

    I think requiring the purchase of your top tier for the basic functionality of printing reports is a mistake. I can see having that limitation on the free edition, but if you pay you should get access to reports.

    You can find their bundle prices here and Rank Tracker's specific pricing here. Also, similar to competitors, they have an ongoing service plan which is required if you plan to continue to receive updates after the initial 6 months.

    Despite my pricing concern regarding the reporting options, I think this is one of the top rank checkers out there. It has a ton of features and is very simple to use. I would recommend that you give this tool a shot if you are in the market for a robust rank checking solution. Oh I almost forgot, rank checking is still useful :)

    One More Note of Caution

    Be sure to read the below complaints about how unclear & sneaky the maintenance plan pricing is. This is something they should fix ASAP.
