SEMrush IPO (SEMR)

On Wednesday SEMrush priced their IPO at $14 a share & listed Thursday.

There have been many marketing and online advertising companies which are publicly traded, but few that were so focused specifically on SEO while having a sizeable market cap. According to this SeekingAlpha post, at the IPO price SEMrush had a valuation of about $1.95 to $1.99 billion. For comparison's sake, here are some other companies & valuations.

  • Facebook acquired Instagram for $1 billion.
  • Google acquired YouTube for $1.65 billion.
  • Yelp trades at around a $2.9 billion market cap.
  • Yahoo! was acquired by Verizon for $4.48 billion.
  • Hubspot has a market cap of around $20.4 billion.

A couple years ago Gannett bought AdWords reseller WordStream. A few years before that they bought ReachLocal. The Hearst publishing empire also bought iCrossing long ago. Marin Software remains publicly traded, but they are only valued at about $20 million.

Newspapers reselling Google AdWords ads isn't really SEO though. Beyond those sorts of deals, much of the publicly traded SEO stuff has been only tangentially relevant to SEO, or crap.

There are some quality category-leading publishers which use SEO as a means of distribution but are not necessarily an SEO service provider like TripAdvisor, BankRate, and WebMD. Over time many of these sorts of companies have been gobbled up by Red Ventures or various private equity firms. Zillow, Yelp and TripAdvisor are some of the few examples which still exist as independent companies.

So that puts most of the publicly traded SEO stuff in one of the following categories...

  • small scale - does anyone other than Andy Beal & Mike Grehan still remember KeywordRanking / WebSourced / Think Interactive / MarketSmart Interactive?
  • hope and nope - sites like Business.com were repeatedly acquired but never really gained lasting relevance.
  • affiliate networks - which rely on partners with SEO traffic, like Quinstreet & Commission Junction. Many affiliate networks were hit hard as the barrier to entry in SEO increased over the years. Quinstreet is doing well in some verticals but sold their education division to Education Dynamics for $20 million. CJ was part of the Publicis Groupe acquisition of Epsilon.
  • pump and dump scams - Demand Media, owner of eHow, which later rebranded as Leaf Group & still trades at a small fraction of their IPO price.

[Editorial note: 8 days after writing this post LEAF announced a $304.3 million all-cash buyout offer from Graham Holdings at 21% above current market prices and was trading at $8.63 a share. If you bought shares at $40 or $30 or $20 and hoped they would at some point come back - nope - the losses are crystallized on a take out. Graham Holdings formerly owned the Washington Post but sold it to Jeff Bezos 8 years ago for $250 million.]

The one lasting counter-example to the above is Barry Diller's IAC. His innovation ecosystem is surreal. Across time & across markets he is the best creator of vertical leading properties later spun off as their own companies. He's owned Expedia, TripAdvisor, LendingTree, HomeAdvisor, Match.com, TicketMaster and so many other category leaders. His purchase of Ask.com did not pan out as well as hoped: web browsers turned the address bar into a search box, his ability to differentiate the service went away after they shut down their own engine in 2010, and he was locked out of mobile search marketshare by default placement contracts while Google pushed back against extension bundling. But just about everything else he touched turned to gold. A lot of IAC's current market cap is their ownership of Vimeo, which by itself is valued at $6 billion.

What is the most recent big bet for Barry Diller? MGM. Last August he bet $1 billion on the growth of online gambling. And he was willing to bet another billion to help them acquire Entain:

IAC has to date invested approximately US$1 billion in MGM with an initial investment thesis of accelerating MGM’s penetration of the $450 billion global gaming market. IAC notes in its letter of intent that IAC continues to strongly support this objective for MGM whether or not a transaction with Entain is consummated.

Barry Diller not only accurately projects future trends, but he also has the ability to rehab broken companies past their due dates.

The New York Times bought About.com for $410 million in 2005 & did little with it; its relevance declined over time as its content got stale, Wikipedia grew and search engines kept putting more scraped content in the search results. The relentless growth of Wikipedia and Google launching "universal search" in 2007 diminished the value of About.com even as web usage was exploding.

IAC bought About.com from the New York Times for $300 million in August of 2012. They tried to grow it through improving usability, content depth and content quality but ultimately decided to blow it up.

They were bold enough to break it into vertical category branded sites. They've done amazingly well with it and in many cases they rank 2, 3, 4 times in the SERPs with different properties like TheSpruce, TheBalance, Investopedia, etc. As newspaper chains keep consolidating or going under, IAC is one of the few constant "always wins" online publishers.

At its peak TheBalance was getting roughly 2/3 the traffic About.com generated.

Part of the decline in the chart there was perhaps a Panda hit, but the reason traffic never fully recovered is they broke some of these category sites into niche sites using sub-brands.

All the above search traffic estimate trend charts are from SEMrush. :)

I could do a blog post titled 1001 ways to use SEMrush if you would like me to, though I haven't yet as I already have affiliate ads for them here and don't want to come across as a shill by overpromoting a tool I love & use regularly.

I tend to sort of "not get" a lot of SaaS stocks in terms of prices and multiples, though they seem to go to infinity and beyond more often than not. I actually like SEMrush more than most though & think they'll do well for years to come. I get the sense with both them and Ahrefs that they were started by programmers who learned marketing rather than by marketers who cobbled together offerings which they thought would sell. If you ever have feedback on ways to improve SEMrush they are fast at integrating it, or at least were in the past whenever I had feedback.

When SEMrush released their S-1 Dan Barker did a quick analysis on Twitter.

Some stats from the S-1: $144 million in annual recurring revenues @ 50% compound annual growth rate, 76% gross margins, nearly 1,000 employees and over 67,000 paying customers.

At some point a lot of tool suites tend to overlap because much of their data comes either from scraping Google or crawling the open web. If a point of differentiation is strong enough that it is widely talked about or marketed, competitors will try to clone it. Thus spending a bit extra on marketing to ensure you have the brand awareness to be the first tool people try is wise. Years ago when I ran a membership site here I paid to license the ability to syndicate some SEMrush data for our members & I have promoted them as an affiliate for what seems like a decade now.

When Dan Barker did his analysis of the S-1 it made me think SEMrush likely has brighter prospects than many would consider. A few of the reasons I could think of off the top of my head:

  • each day their archive of historical data is larger, especially when you consider they crawl many foreign markets which some other competitive research tools ignore
  • increasing ad prices promote SEO by making it relatively cheaper
  • keyword not provided on organic search means third party competitive analysis tools are valuable not only for measuring competitors but also measuring your own site
  • Google Ads has recently started broadening ad targeting further and hiding some keyword data so advertisers are paying for clicks where they are not even aware what the keyword was

That last point speaks to Google's dominance over the search ecosystem. But it is also so absurd that even people who ran AdWords training workshops point out the absurdity.

As Google maximizes their income, some nuance is lost for the advertiser, who must dig into n-gram analysis or look at historical data to find patterns to adjust:

The account overall has a CPA in the $450 range. If the word ‘how’ is in the query, our CPA is over double. If someone searches for ‘quote,’ our CPA is under $300. If they ask a question about cost, the CPA is over $1000. Obviously, looking for quotes versus cost data is very different in the eyes of a user, but not in the matching search terms of Google.
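
To make that concrete, here is a minimal sketch of the kind of n-gram pattern hunt described in the quote above, assuming a hypothetical search terms report exported to CSV with query, cost and conversion columns (the column names are placeholders, not any particular platform's export format):

```python
# Minimal sketch: compare CPA for queries containing a given token.
# Column names ("query", "cost", "conversions") are hypothetical.
import pandas as pd

df = pd.read_csv("search_terms_report.csv")

for token in ["how", "quote", "cost"]:
    subset = df[df["query"].str.contains(rf"\b{token}\b", case=False)]
    conversions = subset["conversions"].sum()
    cpa = subset["cost"].sum() / conversions if conversions else float("nan")
    print(f"queries containing '{token}': CPA ${cpa:,.2f}")
```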

Every ad network has incentive to overstate its contribution to awareness and conversions so that more ad budget is allocated to them.

  • Facebook kept having to restate their ad stats around video impressions, user reach, etc.
  • Facebook gave themselves a 28 day window for credit for some app installs.
  • Google AMP accidentally double counted unique users on Google Analytics (drives adoption = good).
  • Google Analytics came with last click attribution, which over-credits the search channel you use near the end of a conversion journey.

There are a lot of Google water carriers who suggest any and all of their actions are at worst benevolent, but when I hear about hiding keyword data I am reminded of the following quote from the Texas AG Google lawsuit.

"Google employees agreed that, in the future, they should not directly lie to publishers, but instead find ways to convince publishers to act against their interest and remove header bidding on their own."

That lawsuit details the great lengths Google went to in order to leverage their search monopoly to keep monopoly profit margins on their display ad serving business.

AMP was created with the explicit intent to kill header bidding as header bidding shifted power and profit margins to publishers. Some publishers saw a 50% rise in ad revenues from header bidding.

Remember how Google made companywide bonuses depend on the performance of the Google Facebook clone named Google+? Google later literally partnered with Facebook on a secret ad deal to prevent Facebook from launching a header bidding solution. The partnership agreement with Facebook explicitly mentioned antitrust repeatedly.

When a company partners with its biggest direct competitor on a bid rigging scheme, you can be sure the intent is to screw others.

So when you see Google talk about benevolence, remember that they promise to no longer lie in the future & only deceive others into working against themselves via other coercive measures.

We went from the observation that you can't copyright facts to promoting opinion instead:

to where, after many thousands of journalists have been laid off, the "newspaper of record" is promoting Ponzi scheme garbage as a performance art piece:

Is it any wonder people have lost trust in institutions?

The decline of About.com was going to be terminal without the work of Barry Diller to revive it. That slide reflected how, over time, a greater share of searches never actually leaves Google:

Of those 5.1T searches, 33.59% resulted in clicks on organic search results. 1.59% resulted in clicks on paid search results. The remaining 64.82% completed a search without a direct, follow-up click to another web property. Searches resulting in a click are much higher on desktop devices (50.75% organic CTR, 2.78% paid CTR). Zero-click searches are much higher on mobile devices (77.22%).

The data from the above study came from SimilarWeb, which is another online marketing competitive research tool planning on going public soon.

Google "debunked" Rand's take by focusing on absolute numbers instead of relative numbers. But if you keep buying default placements in a monopoly ecosystem where everyday more people have access to a computer in their pocket you would expect your marketshare and absolute numbers to increase even if the section of pie other publishers becomes a smaller slice of a bigger pie.

Google's take there is disingenuous at the core. It reminds me of the time they put out a study claiming brand bidding was beneficial and that it was too complex and expensive for advertisers to set up a scientific study, without any mention of the fact that the reason such a study would be complex and expensive is that Google chooses not to provide those features in their ad offering. That parallels the way they now hide keyword data even from paying advertisers, in much the same way they hide ad fees and lie to publishers to protect their ad income.

Google suggests they don't make money from news searches, but if they control most of the display ad technology stack & used search to ram AMP down publishers' throats as a technologically forced sunk cost while screwing third party ad networks and news publishers, Google can be technically true in their statement and still lying in spirit.

"Google employees agreed that, in the future, they should not directly lie to publishers, but instead find ways to convince publishers to act against their interest and remove header bidding on their own."

There are many more treats in store for publishers.

Google Chrome stopped sending full referrals for most web site visitors late last year. Google will stop supporting third party cookies in Chrome next year. They've even floated the idea of hiding user IP addresses from websites (good luck to those who need to prevent fraud!).

Google claims they are also going to stop selling ads where targeting is based on tracking user data across websites:

"Google plans to stop selling ads based on individuals’ browsing across multiple websites, a change that could hasten upheaval in the digital advertising industry. The Alphabet Inc. company said Wednesday that it plans next year to stop using or investing in tracking technologies that uniquely identify web users as they move from site to site across the internet. ... Google had already announced last year that it would remove the most widely used such tracking technology, called third-party cookies, in 2022. But now the company is saying it won’t build alternative tracking technologies, or use those being developed by other entities, to replace third-party cookies for its own ad-buying tools. ... Google says its announcement on Wednesday doesn’t cover its ad tools and unique identifiers for mobile apps, just for websites."

Google stated they would make no replacement for the equivalent of the third party cookie tracking of individual users:

"we continue to get questions about whether Google will join others in the ad tech industry who plan to replace third-party cookies with alternative user-level identifiers. Today, we’re making explicit that once third-party cookies are phased out, we will not build alternate identifiers to track individuals as they browse across the web, nor will we use them in our products. We realize this means other providers may offer a level of user identity for ad tracking across the web that we will not — like PII graphs based on people’s email addresses. We don’t believe these solutions will meet rising consumer expectations for privacy, nor will they stand up to rapidly evolving regulatory restrictions, and therefore aren’t a sustainable long term investment."

On the above announcement, shares of other ad tech companies tanked, with The Trade Desk falling 20% in two days.

Competing ad networks wonder if Google will play by their own rules:

“One clarification I’d like to hear from them is whether or not it means there’ll be no login for DBM [a historic name for Google’s DSP], no login for YouTube and no login for Google properties. I’m looking for them to play by the same rules that they so generously foisted upon the rest of the industry,” Magnite CTO Tom Kershaw said.

Regulators are looking into antitrust implications:

"Google’s plan to block a popular web tracking tool called “cookies” is a source of concern for U.S. Justice Department investigators who have been asking advertising industry executives whether the move by the search giant will hobble its smaller rivals, people familiar with the situation said."

The web will continue to grow more complicated, but it isn't going to get any more transparent anytime soon.

"Google employees agreed that, in the future, they should not directly lie to publishers, but instead find ways to convince publishers to act against their interest and remove header bidding on their own."

As the Attention Merchants blur the ecosystem while shifting free clicks over to paid and charging higher ad rates on their owned and operated properties it increases the value of neutral third party measurement services.

The trend is not too hard to notice if you are remotely awake.

While I was writing this post Google announced the launch of a "best things" scraper website featuring their scraped re-representations of hot selling items. And they are cross-promoting competitors in "knowledge" panels to dilute brand values & force the brand ad buy.

Shortly after Google launched their thin affiliate scraper site full of product ads they announced an update to demote other product review sites.

Where Google can get away with it, they will rig things in their favor to rip off other players in the ecosystem:

Google for years operated a secret program that used data from past bids in the company’s digital advertising exchange to allegedly give its own ad-buying system an advantage over competitors, according to court documents filed in a Texas antitrust lawsuit. The program, known as “Project Bernanke,” wasn’t disclosed to publishers who sold ads through Google’s ad-buying systems.

If I could give you one key takeaway here, it would be this:

"Google employees agreed that, in the future, they should not directly lie to publishers, but instead find ways to convince publishers to act against their interest and remove header bidding on their own."

New Version of SEO Toolbar

Our programmer recently updated our SEO toolbar to work with the most recent version of Firefox.

You can install it from here. After you install it, the toolbar should automatically update going forward.

It is easy to toggle on or off simply by clicking the O: if the O is gray it is off & if it is green it is on.

The toolbar shows site & page level link data from sources like SEMrush, Ahrefs & Majestic, along with estimated Google search traffic from SEMrush and some social media metrics.

At the right edge of the toolbar there is a [Tools] menu which lets you pull in the age of a site from the Internet Archive Wayback Machine and the IP address hosting a site, cross links into search engine cached copies of pages, and offers access to our SEO Xray on-page analyzer.

SEO today is much more complex than it was back when we first launched this toolbar, as back then almost everything was just links, links, links. All metrics in isolation are somewhat useless, but being able to see estimated search traffic stats right next to link data & being able to click into your favorite data sources to dig deeper can help save a lot of time.

For now the toolbar is still only available on Firefox, though we could theoretically have it work on Chrome *if* at some point we trusted Google.

Internet Wayback Machine Adds Historical TextDiff

The Wayback Machine has a cool new feature for looking at the historical changes of a web page.

The color scale shows how much a page has changed since it was last cached.

You can then select between any two documents to see a side-by-side comparison of how the page has changed over time.

That quickly gives you an at-a-glance view of how they've changed their:

  • web design
  • on-page SEO strategy
  • marketing copy & sales strategy

For sites that conduct seasonal sales & rely heavily on holiday themed ads you can also look up the new & historical ad copy used by large advertisers using tools like Moat, WhatRunsWhere & Adbeat.

New Keyword Tool

Our keyword tool is updated periodically. We recently updated it once more.

For comparison sake, the old keyword tool looked like this

Whereas the new keyword tool looks like this

The upsides of the new keyword tool are:

  • fresher data from this year
  • more granular data on ad bids vs click prices
  • lists ad clickthrough rate
  • more granular estimates of Google AdWords advertiser ad bids
  • more emphasis on commercial oriented keywords

With the new columns of [ad spend] and [traffic value], here is how we estimate those.

  • paid search ad spend: search ad clicks * CPC
  • organic search traffic value: ad impressions * 0.5 * (100% - ad CTR) * CPC

The first of those two is rather self-explanatory. The second is a bit more complex: it starts with the assumption that about half of all searches do not get any clicks, then it subtracts the paid share from the remaining pool of clicks & multiplies that by the cost per click.
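
Expressed as code, the two estimates above reduce to a couple of one-line functions (a direct translation of the formulas as listed, nothing more):

```python
def ad_spend(ad_clicks: float, cpc: float) -> float:
    """Paid search ad spend: search ad clicks * CPC."""
    return ad_clicks * cpc

def organic_traffic_value(ad_impressions: float, ad_ctr: float, cpc: float) -> float:
    """Organic traffic value: impressions * 0.5 * (100% - ad CTR) * CPC.

    Roughly half of searches are assumed to get no click at all; the paid
    share is then removed from the remaining pool before valuing it at CPC.
    """
    return ad_impressions * 0.5 * (1.0 - ad_ctr) * cpc

# Example: 10,000 impressions, 4% ad CTR, $1.50 CPC
print(organic_traffic_value(10_000, 0.04, 1.50))  # 7200.0
```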

The new data also has some drawbacks:

  • Rather than listing search counts specifically it lists relative ranges like low, very high, etc.
  • Since it tends to tilt more toward keywords with ad impressions, it may not have coverage for some longer tail informational keywords.

For any keyword where there is insufficient coverage we re-query the old keyword database & merge the data across. You will know the data came from the new database if the first column says something like low or high, & from the older database if there are specific search counts in the first column.
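
In pseudocode terms the merge is just a fallback lookup, something along these lines (a sketch; the dict lookups stand in for whatever storage the tools actually use):

```python
def merged_lookup(keyword: str, new_db: dict, old_db: dict):
    """Prefer the new database; fall back to the old one when coverage is missing."""
    row = new_db.get(keyword)
    if row:
        return {"source": "new", **row}   # first column shows ranges like "low"/"high"
    row = old_db.get(keyword)
    if row:
        return {"source": "old", **row}   # first column shows specific search counts
    return None
```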

For a limited time we are still allowing access to both keyword tools, though we anticipate removing access to the old keyword tool in the future once we have collected plenty of feedback on the new one. Please feel free to leave your feedback in the comments below.

One of the cool features of the new keyword tool worth highlighting further is the difference between estimated bid prices & estimated click prices. In the following screenshot you can see how Amazon is estimated as having a much higher bid price than actual click price, largely because, due to low keyword relevancy, advertisers other than the official brand must bid much higher to appear on popular competing trademark terms.

Historically, this difference between bid price & click price was a big source of noise on lists of the most valuable keywords.

Recently some advertisers have started complaining about the "Google shakedown": many brand-driven searches are simply users leaving the .com off a web address in Chrome, and brands are then forced to pay Google for their own pre-existing brand equity.

Rank Checker Update

Recently rank checker started hanging on some search queries & the button on the SEO Toolbar which launched rank checker stopped working. Both of these issues should now be fixed if you update your Firefox extensions.

If the toolbar button ever doesn't work, you can enable the Menu bar in Firefox, then go under the Tools menu to the rank checker section to open it.

Years ago we created a new logo for rank checker, which we finally got around to rolling out today. :)

Free Google AdWords Keyword Suggestion Tool Alternative

Google recently made it much harder to receive accurate keyword data from the AdWords keyword tool.

They have not only grouped similar terms, but also broadened the data ranges to absurdly wide ranges like 10,000 to 100,000 searches a month. Only active AdWords advertisers receive (somewhat?) decent keyword data. And even then there are limitations. Try to view too many terms and you get:

"You’ve reached the maximum number of page views for this day. This page now shows ranges for search volumes. For a more detailed view, check back in 24 hours."

Jennifer Slegg shared a quote from an AdWords advertiser who spoke with a representative:

"I have just spoken to a customer service manger from the Australia support help desk. They have advised me that there must be continuous activity in your google ad-words campaign (clicks and campaigns running) for a minimum of 3-4 months continuous in order to gain focused keyword results. If you are seeing a range 10-100 or 100-1k or 1k -10k its likely your adwords account does not have an active campaign or has not had continuous campaigns or clicks."

So you not only need to be an advertiser, but you need to stay active for a quarter-year to a third of a year to get decent data.

Part of the sales pitch of AdWords/PPC was that you can see performance data right away, whereas SEO investments can take months or years to pay back.

But with Google outright hiding keyword data even from active advertisers, it is probably easier and more productive for those advertisers to start elsewhere.

There are many other keyword data providers (Wordtracker, SEMrush, Wordze, Spyfu, KeywordSpy, Keyword Discovery, Moz, Compete.com, SimilarWeb, Xedant, Ubersuggest, KeywordTool.io, etc.). And there are newer entrants like the Keyword Keg Firefox extension & the brilliantly named KeywordShitter.

In light of Google's push to help make the web more closed-off & further tilt the web away from the interests of searchers toward the interest of big advertisers, we decided to do the opposite & recently upgraded our keyword tool to add the following features...

  • expanded the results per search to 500
  • we added negative match and modified broad match to the keyword export spreadsheet (along with the phrase, broad & exact match it already had; the formats are sketched below)
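
For reference, these are the classic AdWords match type formats the export columns follow; the helper below is just an illustration of the syntax:

```python
def match_type_formats(keyword: str) -> dict:
    """Format one keyword in each of the five classic AdWords match types."""
    return {
        "broad": keyword,                         # seo tools
        "phrase": f'"{keyword}"',                 # "seo tools"
        "exact": f"[{keyword}]",                  # [seo tools]
        "negative": f"-{keyword}",                # -seo tools
        "modified broad": " ".join("+" + w for w in keyword.split()),  # +seo +tools
    }

print(match_type_formats("seo tools"))
```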

Our keyword tool lists estimated search volumes, bid prices, cross links to SERPs, etc. Using it does require registering a free account, but it is a one-time registration and the tool is free. And we don't collect phone numbers, hard sell over the phone, etc. We even shut down our paid members area, so you are not likely to receive any marketing messages from us anytime soon.

Export is lightning quick AND, more importantly, we have a panda in our logo!

Here is what the web interface looks like

And here is a screenshot of data in Excel with the various keyword match types

If the tool looks like it is getting decent usage, we may upgrade it further to refresh the data more frequently, consider adding more languages, add a few more reference links to related niche sites in the footer cross-reference section, and maybe add a few other features.

"Every market has some rules and boundaries that restrict freedom of choice. A market looks free only because we so unconditionally accept its underlying restrictions that we fail to see them."Ha-Joon Chang

Restoring Firefox Extensions After The Firefox 43 Update

Update: Our extensions are now signed, so you should be able to download the most recent version of them & use them with the most recent version of Firefox without having to mess with the Firefox plugin option security settings.

Firefox recently updated to version 43 & with that, they automatically disabled all extensions which are not signed, even if they were previously installed by a user and used for years.

If you go to the add ons screen after the update (by typing about:addons in the address bar) you will see a screen like this

Extensions which are submitted to the Mozilla Firefox add ons directory are automatically signed when approved, but other extensions are not by default:

Only Mozilla can sign your add-on so that Firefox will install it by default. Add-ons are signed by submitting them to AMO or using the API and passing either an automated or manual code review. Note that you are not required to list or distribute your add-on through AMO. If you are distributing the add-on on your own, you can choose the Unlisted option and AMO will only serve as the way to get your package signed.

In a couple days we will do that submission to get the add ons signed, but if you recently had the extensions go away it is fairly easy to override this signing feature to get the extensions back working right away.

If you recently saw rank checker, SEO for Firefox or the SEO toolbar disabled after a recent Mozilla Firefox update, here is how to restore them...

Step 1: go to the Firefox settings configuration section

Type about:config into the address bar & hit enter. Once that page loads click on the "I'll be careful, I promise" button.

Step 2: edit the signing configuration

Once the configuration box loads you'll see a bunch of different listed variables in it & a search box at the top. In that search box, enter
xpinstall.signatures.required

By default xpinstall.signatures.required is set to TRUE to force add ons to be signed. Double-click it to toggle the value to FALSE; the entry turns bold to indicate it has been changed from the default.

Step 3: restart Firefox

After changing the add on signature settings, restart Firefox to apply the setting & your Firefox extensions will be restored.
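
If you prefer editing files to clicking through about:config, the same change can be made with a one-line user.js file in your Firefox profile directory (Firefox applies user.js preferences on every startup; this is an alternative to the steps above, not part of them):

```js
// user.js in your Firefox profile directory
user_pref("xpinstall.signatures.required", false);
```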

Installing These Extensions On a New Computer

If you are having trouble setting up your extensions on a new computer, start with the above 3 steps & then go here to download & install the extensions.

My Must Have Tools of 2014

There are a lot of tools in the SEO space (sorry, couldn't resist :D) and over the years we've seen tools fall into 2 broad categories: tools that aim to do just about everything and tools that focus on one discipline of online marketing.

As we continue to lose more and more data (not provided) and the data we have access to becomes a bit more unreliable (rankings, competitive research data, data given to us by search engines, etc) one has to wonder at what point does access to a variety of tools start producing diminishing returns?

In other words, if you are starting with unreliable or very, very inexact data does layering more and more extrapolations on top make you worse off than you were before? Probably.

I do think that a fair amount of tool usage scenarios have become less effective (or less necessary) at this point. Consider what were once the cornerstones of industry research and data:

  • Rankings
  • SERP difficulty analysis
  • Link prospecting
  • Competitive link research
  • Analytics

Each one of these areas of data has really taken a beating over the last 2-3 years thanks to collateral damage from broad-reaching, unforgiving Google updates, the loss of actual keyword data, the less obvious relationship between links and rankings, personalized search, various SERP layout changes, and on and on.

I believe the best way forward for evaluating which tools you should be using is to determine which one does X best, to the point where supplementing it with data from a similar provider is overkill and not worth the extra monthly subscription cost nor the cognitive overhead.

Which Ones to Choose?

Well, this certainly depends on what you do. I'm going to focus on the small to mid-size agency market (which also includes freelancers and folks who just operate their own properties), but for those teetering on mid-large size I'll make 2 recommendations based on personal experience:

If I were operating a bigger agency I'd strongly consider both of those. They both do a really solid job of providing customized reporting and research modules.

For the rest of us, I'll share what I'm using as a basis for my recommendations with reasons why I selected them.

These tools are focused on what I do on a daily basis and are the ones I simply cannot live without. They cover:

  • Reporting
  • Competitive Link & Keyword Research
  • Keyword Research
  • PR and Outreach

    Advanced Web Ranking

    This is the tool I rely on the most. It does just about everything with the only drawbacks being the learning curve and that it is desktop software. The learning curve payoff is very much worth it though. This tool does the following for me:

    • Reporting for pretty much every aspect of a campaign
    • Interfaces with Majestic SEO for link data as well as data from Moz for link research and tracking
    • Connects to social accounts for reporting
    • Site audit crawls
    • Interfaces with Google Analytics
    • Keyword research
    • Competitor analysis
    • Rankings
    • On-page analysis

    They have a cloud version for reporting and I believe that in the near future a good amount of this functionality will go to its cloud service. This tool is highly recommended.

    Advanced Web Ranking - here's a basic overview of the software from a few years ago, though it has been updated a number of times since then

    Ahrefs

    I remember when this was for sale on Flippa! I find Ahrefs to be very reliable and easy to use. They have added quite a few features over the past year and, in my opinion, they are right up there with Majestic SEO when it comes to relevant, deep link data.

    Their interface has improved dramatically over time and the constant addition of helpful, new features has left other tools playing catchup. I'm hoping to see more integration with them in 2014 via services like Raven and Advanced Web Ranking.

    Ahrefs.com - here's a review from last year (though they no longer offer the SERP tracking feature they offered back then)

    Authority Labs

    The most accurate and stable online rankings provider I've used thus far. The interface has improved recently as has the speed of exports. I would still like to see a bulk PDF export of each individual site in the near future but overall my experience with Authority Labs has been great.

    I use it as a stable, online, automated rank checker to supplement my data from Advanced Web Ranking. It also has some nice features like being able to track rankings from a zip code and showing what else is in the SERP it encounters (videos, snippets, etc).

    Authority Labs - here's a review from 5 months ago

    Buzzstream

    Buzzstream is an absolute must have for anyone doing PR-based and social outreach. The email integration is fantastic and the folks that help me with outreach routinely rave about using Buzzstream.

    The UI has really been turned up recently and the customer support has been excellent for us. I'm positive that our outreach would not be nearly as effective without Buzzstream and there really isn't a competing product out there that I've seen.

    This is a good example of a really niche product that excels at its intended purpose.

    Buzzstream - here's a review from a couple years ago

    Citation Labs Suite

    We use the Contact Finder, Link Prospector, and Broken Link Building tool inside our prospecting process. Much like Buzzstream this is a suite of tools that focuses on a core area and does it very well.

    You have to spend some time with the prospector to get the best queries possible for your searches but the payoff is really relevant, quality link prospects.

    Citation Labs - here's a review from a couple years ago

    Link Research Tools

    While LRT is primarily known for its Link Detox tool, this set of tools covers quite a bit of the SEO landscape. I do not use all the tools in the suite but the areas that I utilize LRT for are:

    • Link cleanup
    • Link prospecting
    • SERP competition analysis
    • Competitive domain comparisons

    It's missing a few pieces but it is similar to Advanced Web Ranking in terms of power and data. LRT hooks into many third party tools (Majestic, SemRush, Moz, etc) so you get a pretty solid overview, in one place, of what you need to see or want to see.

    The prospecting is similar, to an extent, to Citation Labs, but you can apply specific SEO metrics to prospect filtering as well as base it off of links that appear across multiple sites in a given SERP.

    LinkResearchTools

    Majestic SEO

    Majestic is still the de facto standard for deep link data (both fresh and historical). They recently launched a new feature called Search Explorer, which is designed to be a specialized search engine devoid of personalization and whatnot, showing rankings based on its interpretation of the web graph and how influential a site is for a given term.

    As of this writing, Search Explorer is in alpha but it does appear to be a really solid innovation from Majestic. The other reason for having a Majestic subscription is to get access to its API so you can integrate the data however you choose. I use the API access inside of LRT and Advanced Web Ranking.

    Majestic SEO - here's a review from a couple years ago by Julie Joyce

    Moz

    I use Moz mainly for access to its link data via Advanced Web Ranking. Compared to the other tools I use, I do not see a ton of value in the rest of its tool suite, and I also get data from it via my Raven subscription (which is where I tend to do a fair bit of research).

    If you are on a tight budget it's worthy of consideration for the breadth of tools the subscription offers, but I think you could find better options elsewhere if you have some budget to spread around.

    Moz

    Raven Tools

    I don't use every single feature in Raven but I find Raven to be one of the most well-executed, stable tool suites on the market. I use Raven to:

    • Manage keyword lists
    • Research competitors
    • Manage and report on Twitter/Facebook profiles and campaigns
    • Track social mentions
    • Automate site crawls
    • Compare various, customizable metrics between sites
    • Google Analytics and Google/Bing Webmaster tools integration

    In 2014 I'm looking to do more with Raven in the content management area and in the reporting area. I still prefer to supplement GWT rankings data with rankings data from another source (Advanced Web Ranking, Authority Labs, etc) but a goal for 2014, for me, is to fit more reporting into Raven's already excellent reporting engine.

    Raven - here's a review from a few years ago

    SemRush

    In terms of keyword, ranking, and PPC competitive research tools SemRush really has moved ahead of the competition in the past year or so. I use most of the features in the suite:

    • Organic SEO Research
    • PPC keyword and strategy research
    • Multiple domain comparisons covering organic and paid search strategies
    • Yearly historical data feature on a specific domain

    I also like the filtering feature(s) that really help me whittle down keyword data to exactly what I'm looking for without worrying about export limits and such.

    SemRush - here's a review from a few years ago

    SeoBook Community and Tools

    Knowledge is power, naturally. All the tools in the world will not overcome a lack of knowledge. All of the specific, unbiased, actionable business & marketing knowledge that I've received over the last handful of years (and the relationships made) is the single most direct reason for any success I've had in this space.

    The SeoBook Toolbar is still one of my most utilized tools. It is data source agnostic: you get data from a variety of sources quickly and reliably. Seo For Firefox takes most of the info in the toolbar and assigns it to each individual listing in a given SERP. Both tools are indispensable to me on the research front.

    We also have some premium tools that I like quite a bit:

    • Local Rank - It scans up to 1,000 Google results and then cross-references links pointing from those sites to the top 10, 20, or 50 results for that same query. The tool operates on the premise that sites that are well linked to from other top ranked results might get an additional ranking boost on some search queries. You can read more about this in a Google patent here.
    • HubFinder - HubFinder looks for sites that have co-occurring links across up to 10 sites on a given topic. This is useful in finding authoritative sites that link to competing sites in a given SERP.
    • Duplicate Content Checker - This Firefox extension scans Google for a given block of text to see if others are using the same content. It searches Google for each sentence wrapped in quotes and links to the results of each search (a rough sketch of the approach follows below).
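
    As a rough illustration of that approach (not the extension's actual code), quoting each sentence for a Google search looks something like this:

    ```python
    import re
    from urllib.parse import quote_plus

    def duplicate_check_urls(text: str) -> list:
        """Build a quoted Google search URL for each sentence in a block of text."""
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        return [
            "https://www.google.com/search?q=" + quote_plus(f'"{s}"')
            for s in sentences
            if len(s.split()) > 4  # skip short fragments unlikely to be unique
        ]

    for url in duplicate_check_urls("Some page copy goes here word for word. Another distinctive sentence to check."):
        print(url)
    ```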

    Screaming Frog SEO Spider

    This is my desktop crawler of choice for most sites; in my usage it's complemented by Raven's Site Auditor (which can be run automatically) and Advanced Web Ranking's site audit tool.

    Just about anything you can think of from a site architecture and on-page standpoint can be done with this tool.

    Screaming Frog - a few years ago Branko did a great review

    TermExplorer

    A cloud-based tool that processes large amounts of keywords pretty quickly and does a good job of bringing in terms from multiple sources.

    It also offers a competitive analysis feature that I don't use very much as well as white-label reports. It has pretty slick filtering options for keywords and scans for exact match domains (.com and .net) in addition to CPC and keyword volume data.

    Term Explorer

    Avoid Tool Fatigue

    There is going to be overlap across some of these tools and while the idea of all-in-one sounds nice it rarely works in practice. Clients are different, deliverables are different, and business models are different.

    The trick is to avoid as much overlap as possible between the tools that you use, otherwise you end up wasting time, money, and resources by overthinking whatever it is that you are doing.

    I have somewhere under 20 toolsets that I use on an almost daily basis. Some of these are not used daily but definitely monthly. At one point I had access to over 40 different tools. The tools mentioned in this post are the ones I've found the most value in and gained the most success from.

    Advanced Web Ranking Review - Website Auditor

    Advanced Web Ranking (AWR) is one of my favorite pieces of SEO software on the market today. It has been indispensable to me over the years. The software does it all and then some.

    I reviewed it a few years ago; you can read that here. Most of it is still relevant and I'll be updating it in the near future. In this post I want to highlight their Website Auditor tool.

    Combining On and Off Page Factors

    The beauty of this feature is the simple integration of on and off-page elements. There are other tools on the market that focus solely on the on-page stuff (and do a fantastic job of it) and AWR does as well.

    The all-in-one nature of Advanced Web Ranking allows you to deftly move between the on and off (links, social, etc) page factors for a site (and its competition) inside of the Website Auditor feature. AWR has other tools built-in to go even deeper on competitive analysis as well.

    A quick FYI on some general settings and features:

    • You can crawl up to 10,000 pages on-demand
    • All results are exportable
    • Audits are saved so you can look at historical data trends
    • Complete white-label reporting is available
    • Because it's software it's all you can eat :) (save for the page limit)

    You can also set the tool to crawl only certain sections of a site as well as completely ignore certain sections or parameters so you can make the best use of your 10,000 page-crawl limit. This is a nice way to crawl a specific section of a site to find the most "social" content (limit the crawl to /blog as an example).

    Interface Overview

    Here's what the initial interface looks like:


    It's a thick tool for sure, but for now just focus on the Auditor piece. It's fairly self-explanatory; the top toolbar shows (left to right):

    • Current site being viewed
    • Update date history for historical comparison
    • Filtering options (all pages or only specific pages: 200s, 404s, missing title tags, etc.; basically all the data points are available for slicing and dicing)
    • Button for on-page issues to show in the view area
    • Button for page-level external link data to show in the view area
    • Button for page-level social metrics (Twitter, Facebook, G+) to show in the view area
    • Update Project button (to update the Audit :D )
    • Text box where you can filter the results manually
    • Auditor settings (see below)
    • Link data source: Open Site Explorer for now (Majestic is available in other areas of AWR and I'm told it will be available in Website Auditor as another option in the next release, 9.6, due out very soon)

    The tool settings button allows you to configure many areas of the Auditor tool to help get the exact data you want:


    On-Page and Off-Page Data Points

    The on-page overview gives you all of what is listed in the viewport shown previously, and if you click on the Filter icon you'll be able to look at whatever piece of on-page data you'd like:


    I did just a short crawl here in order to show you how your data will look inside the tool. The view of the initial on-page report shows your traditional items such as:

    • Title tag info
    • Meta descriptions
    • Duplicate content
    • Robots and indexing information
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    Each page can be clicked on to show specific information about that page:

    • Links from the page to other sites
    • Internal links to the page
    • Broken links
    • External links pointing into the page with anchor text data, Page Authority, and MozRank. Whether the link is no-follow or an image is shown as well
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    The on-page overview is also referred to as the Issues Layout:


    The other 2 views are more of a mix of on-page and off-page factors.

    The Links Layout shows the following (for the root domain and for the sub-pages individually):

    • Levels deep from the homepage
    • Page Authority
    • MozRank
    • Linking Root Domains
    • Total Inbound Links
    • Outbound Links
    • No-follows
    • Inbound and Outbound Internal Links


    In this view you can click on any of the crawled pages and see links to the page internally and externally as well as broken links.

    The Social Layout shows the following information:

    • Facebook Shares, Twitter Shares, and Google +1's for a given URL
    • Internal and external links to the page
    • Indexed or not
    • HTTP Status
    • Meta information
    • Broken Links


    This data is helpful in finding content ideas, understanding a competitor's content/social strategy, and finding possible influencers to target in a link building/social awareness campaign for your site.

    Reporting and Scheduling

    Currently you can provide white label PDF/interactive HTML reports for the following:

    • Issues Layout
    • Link Layout
    • Social Layout

    You can also do a quick export from the viewport window inside the Website Auditor tab to get either an HTML/PDF/CSV export of the data you are looking at (list of link issues, social stats, on-page issues, and so on).

    Reports can be scheduled to run automatically so long as the computer AWR resides on is on and functional. You could also remote in with a service like LogMeIn to run an update remotely or use the AWR server plan where you host the AWR application on one machine and remote client machines (staff as an example) can connect to the shared database and make an update or run a report if needed.

    Advanced Web Ranking's Website Auditor is one of the most robust audit tools on the market and soon it will have integration with Majestic SEO (currently it ties into OpenSiteExplorer/Linkscape). It already pulls in social metrics from Twitter, Facebook, and G+ to give you a more comprehensive view of your site and your content.

    If you conduct technical audits or do competitive analysis you should give AWR a try; I think you'll like it :)

    Soft Launching SEOTools.net

    Last month we soft launched SEOTools.net. Here are a few entries as a sample of things to come...

    ... do subscribe to the RSS feed if you like what you see thus far.

    Why create yet another site about SEO?

    Good question, glad you asked. ;)

    Our customer base on this site consists primarily of the top of this pyramid. I can say without doubt that some of our customers know more about SEO than I do & that generally makes them bleeding edge. ;)

    And then some people specialize in local or video or ecommerce or other such verticals where there are bits of knowledge one can only gain via first hand experience (eg: importing from China, doing loads of testing of YouTube variables, or testing various upsells). There is now so much to know that nobody can really know everything, so the goal of our site here is to sorta bring together a lot of the best folks.

    Some people newer to the field & a bit lower down on the pyramid are lucky/smart enough to join our community too & those who do so and participate likely save anywhere from 1 to 3 years on their learning curve...leveling up quickly in the game/sport of SEO. But by and large our customers are mostly the expert end of the market.

    We could try to water down the community & site to try to make it more mass market, but I think that would take the site's leading strength and flush it down the toilet. In the short run it would mean growth, but it would also make the community less enjoyable ... and this site is as much a labor of love as it is a business. I think I would burn myself out & no longer love it if the site became noisy & every third post was about the keyword density of meta tags.

    What Drives You?

    When SEOBook.com was originally created SEO was much less complex & back in 2003 I was still new to the field, so I was writing at a level that was largely aligned with the bulk of the market. However, over the past decade SEO has become much more complex & many of our posts tend to be at a pretty high level, pondering long-term implications of various changes.

    When there are big changes in the industry we are usually early in discussing them. We were writing about exact match domains back in 2006 and when Google's algorithm hinted at a future of strong brand preference we mentioned that back in 2009. With that being said, many people are not nimble enough to take advantage of some of the shifts & many people still need solid foundational SEO 101 in place before the exceptions & more advanced topics make sense.

    The following images either make sense almost instantly, or they look like they are in Greek...depending on one's experience in the field of SEO.

    My mom and I chat frequently, but she tells me some of the posts here tend to be pretty deep / complex / hard to understand. Some of them take 20 hours to write & likely read like college dissertations. They are valuable for those who live & breathe SEO, but are maybe not a great fit for those who casually operate in the market.

    My guess is my mom is a pretty good reflection of most of the market in understanding page titles, keywords, and so on...but maybe not knowing a lot about anchor text filters, link velocity, extrapolating where algorithm updates might create future problems & how Google might then respond to those, etc. And most people who only incidentally touch the SEO market don't need to get a PhD in the topic in order to reach the point of diminishing returns.

    Making Unknowable SEO More Knowable

    SEO has many pieces that are knowable (rank, traffic, rate of change, etc.), but over time Google has pulled back more and more data. As Google gets greedier with their data, that makes SEO harder & increases the value of some 3rd party tools that provide competitive intelligence information.

    • Being able to look up the performance of a section of a site is valuable.
    • Tracking how a site has done over time (to identify major ranking shifts & how they align with algorithm updates) is also quite valuable.
    • Seeing link spikes & comparing those with penalties is also valuable.

    These data sets help offer clues to drive strategy: how to try to recover from penalties, & how to mimic top performing sites to make a site less likely to get penalized.

    The Difference Between These 2 Sites

    Our goal with SEO Book is to...

    • try to cover important trends & topics deeper than anyone else (while not just parroting Google's view)
    • offer a contrary view to lifestyle image / slogan-based SEO lacking in substance or real-world experience
    • maintain the strongest community of SEO experts, such that we create a community I enjoy participating in & learning from

    Our goal with SEOTools.net is to...

    • create a site that is a solid fit for the beginner to intermediate portions of the market
    • review & compare various industry tools & highlight where they have unique features
    • offer how to guides on specific tasks that help people across a diverse range of backgrounds & skill levels save time and become more efficient SEOs
    • provide introduction overviews of various SEO-related topics
