Increase Your Profits with MixRank's New Competitive Research Tool

Aug 2nd

Not many spy tools out there do what MixRank does. MixRank is a tool that gives you the ability to peek into the contextual and display ad campaigns of sites advertising with Google AdSense.

Uncovering successful advertising on the AdSense network can give you all sorts of ideas on how to increase your site's profitability.

Not only can you uncover profitable AdSense ad campaigns but you can pick off AdSense publisher sites and leverage competitive research data off of those domains to help with your SEO campaign.

With MixRank you can own your competitors in the following ways:

  • Obtain the domains your competitor's ads are served on
  • Swipe your competitor's ad copy
  • Watch ad trends to target your competition's most profitable campaigns and combinations of ads

Another great thing about MixRank is how easy to use it is. Let's go step by step and see how powerful MixRank really is!

Step 1: Pick a Competitor to Research

MixRank makes it super easy to get started. Just start typing in a domain name and you'll see a suggested list of names along with the number of ads available:

Here we are going to take a look at Groupon as we consider building a niche deals site. Keep in mind that MixRank is currently accepting free accounts while in beta, so over time we can expect their portfolio to grow and grow.

MixRank breaks their tool down into 2 core parts:

  • Ads (text and display)
  • Traffic Sources

We'll cover all the options for both parts of the MixRank tool in the following sections.

Step 2: Working with Ad Data (Text and Display)

Let's start with text ad options. With text ads you have 3 areas to look at:

  • Active Ads
  • Ad Reach
  • Best Performers

Here's a look at the interface:

As you can see, it is really simple to switch between different ad research options. Also, you can export all the results at any time.

The image above is for "Active Ads". In the active ads tab you'll get the following data points (all sortable):

  • Publishers - maximum number of AdSense publishers running that particular ad
  • Last Seen - last known date the ad was seen by MixRank
  • Frequency - number of publisher sites on which the ad appeared
  • Avg. Position - average position of the ad inside AdSense blocks

Here you can export the data to manipulate in Excel or do some sorting inside of MixRank to find the ads earning the lion's share of the traffic.
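If you take the export route, a few lines of code can do the sorting for you. A minimal sketch, assuming the exported CSV uses column headers matching the data points above ("Publishers", "Avg. Position") — adjust the names to whatever your export actually contains:

```python
import csv

def top_ads(csv_path, n=10):
    """Return the n ads running on the most publisher sites.

    Assumes the MixRank export has "Publishers" and "Avg. Position"
    columns; rename the keys below if your export differs.
    """
    with open(csv_path, newline="") as f:
        ads = list(csv.DictReader(f))
    # Ads with wide publisher reach and a good (low) average position
    # are usually the ones earning the lion's share of the traffic.
    ads.sort(key=lambda row: (-int(row["Publishers"]),
                              float(row["Avg. Position"])))
    return ads[:n]
```

The same idea works with any of the sortable columns — swap in "Frequency" or "Last Seen" as the sort key to slice the data a different way.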

The Ad Reach tab shows up to 4 ads at a time and compares the publisher trends for those ads. To spread the love around, let's look at a couple of ads from LivingSocial.com:

Here you can see that one ad crashed and fell more in line with an existing ad. You can compare up to 4 ads at once to get an idea of what kind of ad copy is or might be working best for this advertiser.

The Best Performers section compares, again, up to 4 ads at a time (use the arrows to move on to the next set) which have recently taken off across the network.

Needless to say, this report can give you ideas for new ad approaches and maybe even new products/markets to consider advertising on.

If the advertiser is running Banner Ads you can see those as well:

With Banner Ads, MixRank groups them by size and you can see all of them by clicking on the appropriate size link.

When you click on a banner ad you'll see this:

This is a good way to get ideas on which banner ads are sticking for your competitors. Also, it's a great way to get ideas of how to design your ads too. A little inspiration goes a long way :)

So that's how you work with the Ads option inside of MixRank. One thing I dig about MixRank is that it's so easy to use, the data is easy to understand and work with, and it does its intended job very well (ok, ok so 3 things!)

Step 3: Traffic Sources

Now that you have an idea of what type of text ads and banner ads are effective for your competition, it's time to move into what sites are likely the most profitable to advertise on.

MixRank gives you the following options with traffic sources:

  • Traffic Sources - domains being advertised on, last date when the ad was seen, average ad position and number of days seen over the last month
  • Reach - total number of publishers the advertiser is running ads on

The traffic sources tab shows:

  • Uniques - estimated number of unique visitors based on search traffic estimates
  • Last Seen - last date MixRank saw the ad
  • Days Seen - number of days over the last month MixRank saw the ad
  • Average Position - average position in the AdSense Block

A winning combination here would be recent last seen dates and a high number under the Days Seen category. This would mean the advertiser has been, and still is, running ads on the domain, indicating that it may be a profitable spot for them to be in.

You can also pull these domains into a competitive research tool like our Competitive Research Tool, SemRush, SpyFu, or KeywordSpy and find potential keywords you can add to your own SEO campaign.

Another tip here would be to target these domains as possible link acquisition targets for your link building campaign.

The Reach option is pretty self-explanatory; it shows the total number of publishers the advertiser is showing up on:

Another good way to evaluate traffic sources is to view the average position (remember, all the metrics are sortable). A high average position will confirm that the ads are pretty well targeted to the content of that particular domain.

Combine the high average position with Days Seen/Last Seen and you've got some well-targeted publishers. You can export all the data to Excel and apply multiple filters to bring the cream of the crop to the top of your ad campaign planning.

MixRank is Looking Good

It's early on for MixRank but so far I like what I see. The tool can do so many things for your content network advertising, media buy planning, link building campaigns, and SEO campaigns that I feel it's an absolute no-brainer to sign up for right now.

For now it's *free* during their beta testing. Currently they are tracking about 90,000 sites so it's still fairly robust for being a new tool.

The Ultimate Guide to Using Bing's Webmaster Tools

Aug 1st

Bing's Webmaster Tools recently got a nice refresh and update. There is a lot you can do inside of the tools so we figured you'd want to know all about it :)

Also, we've included some free advertising coupons at the end of this guide to help get you started.

Account Dashboard

Bing's webmaster tools are fairly easy to use and the interface is quite clean. On the main account dashboard page you can select any site in your account and see quick stats on:

  • Clicks
  • Impressions
  • Pages Indexed
  • Pages Crawled

The percentages account for the net gain or loss from the week. For more specific site data, and more historical numbers, you would want to get into the site's dashboard which we will cover in the next section.

This initial account dashboard shows all the sites you have in your account and the associated metrics. The data is from a test site I created a while back and kind of forgot about until they updated the tools over at Bing.

From this page you can:

  • Add sites
  • Remove Sites
  • Export data
  • Click on a site to get to its dashboard
  • See any account specific messages from Bing

A snapshot of all your sites in one place is a good way to immediately spot any recent issues with ranking, indexing, or crawling on your sites.

Once you are ready to move on into a specific site, just click on the site name under the heading "Site". When you click the site's name, you'll be brought to the site's dashboard.

Site Dashboard

Each site you have in Bing's webmaster tools has its own dashboard (not to be confused with the account dashboard). Once you get into a site's dashboard you see the data we talked about above at the top of the dashboard and then a 30-day glimpse of the following metrics for the selected site:

  • Traffic summary
  • Index summary
  • Crawl summary (and a separate chart for crawl errors)

Here is what my test site's dashboard looks like:



For established sites with steady traffic (or for tracking ongoing campaigns), these 30-day snapshots are a good way to get a read on recent site activity and/or issues with traffic, crawling, and indexing.

These types of reports can also be very helpful to watch when you are doing site re-structuring or complete site overhauls (changing CMS, url structure, and so on).

Each section has its own place within your site's webmaster tool profile. You can get more information on traffic, indexing, and crawling just by clicking the appropriate link, and we'll discuss each of these sections below.

Traffic Summary

Inside the Traffic Summary tab you have 2 options:

  • Traffic Summary - 6 month history of traffic and search query performance
  • Page Summary - Same as Traffic Summary except the data is broken out by page with the option to click through to the page's search query report

On this page the second chart listed is one that you can slide back and forth to shorten or lengthen the history of the data you are looking at.

The lines are color coded to show overall impressions versus clicks. Bing does present the data in a clean and easy to understand way inside of their webmaster tool reports.

The second chart on the traffic summary page shows search query performance. You'll see keywords you received traffic for as well as ones that you gained impressions (but no clicks) for:

This report works in conjunction with the first report of overall traffic/impressions over time. If you shorten the time frame on the first report, this report will adjust as well.

You'll see the following data points in this report (all sortable and exportable):

  • Keyword
  • Impressions
  • Clicks
  • CTR
  • The Average Position your listing was in when the impression was gained
  • Average Position of your listing when a click was earned

This is a good way to evaluate how you might be able to increase your CTR. By showing you impressions versus clicks (the average positions) you can guesstimate on which keywords could use a bit of freshening up on the title tag and meta description front.

Page Traffic Report

The Page Traffic report shows the same charts as the Traffic Summary page with the exception of the bottom chart, which shows page level metrics. Here's a snippet from yesterday:

You can click whatever page you want and get the following keyword summary, similar to the initial chart on the Traffic Summary page but on a per page level on whatever time frame you selected (the above was a day so when you click through, that date carries into this report):

You can do the same thing here with average impression and average click position (and CTR) to evaluate pages which can use a refresh on title tags and meta descriptions for possible CTR upswings.

Another tip here would be to export the queries and see if there is potential to build out the page's category further with content targeted to specific queries.

So if a query is "chocolate truffles" and you are seeing some data for "white chocolate truffles", you might want to consider building out this section to include content specifically for those related but separate queries (if you haven't already).

Index Summary

The index summary page shows the index rate of your selected site, in Bing, over (roughly) the last 6 months.

The index summary chart is similar to the other charts in Bing's webmaster tools, with all the interactive sliding parameters that let you expand the report out over 6 months or drill down into a really tight, specific time frame.

Index Explorer

Bing's index explorer is a helpful tool that can alert you to HTTP code problems or confirm correct implementation of things like 301 redirects.

The interface is easy to use:

With the index explorer you can check the following HTTP status codes that Bing has discovered over your selected time period (all time, last week, last month) and over your selected crawl range (all time, last week, last 2 weeks, last 3 weeks):

  • All HTTP codes
  • HTTP codes 200-299
  • HTTP codes 300
  • HTTP code 301
  • HTTP code 302
  • HTTP codes 400-499
  • HTTP codes 500-599
  • All other HTTP codes
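If you keep your own crawl logs (from a desktop crawler, say), it can be handy to group status codes into the same buckets Bing filters on so the two views line up. A minimal sketch:

```python
def bing_bucket(status):
    """Map an HTTP status code to the index explorer filter buckets
    listed above. The bucket labels mirror Bing's filters; the
    function itself is just an illustrative helper."""
    if 200 <= status <= 299:
        return "200-299"
    if status == 301:          # permanent redirect, filtered separately
        return "301"
    if status == 302:          # temporary redirect, filtered separately
        return "302"
    if 300 <= status <= 399:   # remaining 3xx codes
        return "300"
    if 400 <= status <= 499:
        return "400-499"
    if 500 <= status <= 599:
        return "500-599"
    return "other"
```

Running your crawler's output through a counter keyed on `bing_bucket` gives you a quick side-by-side check against what the index explorer reports.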

You can also search for pages where the Bing bot has identified malware as being present as well as choose to show pages that you've excluded in your robots.txt file:

Below the options listed above are the pages that meet your filter requirements. The tool breaks the site down into categories and pages. When you hover over a page you'll see the following details:

If you click on a page you can also see a couple of additional data points:

  • Document size
  • Inbound links to the page
  • Block cache and block URL options for that particular page

Using this in conjunction with internal link checking tools like Xenu Link Sleuth (win) or Integrity (mac) can really help you get a good peek into the potential on-page technical issues of your site.

A couple of tools that give you valuable data about your on-page optimization are our Website Health Check tool (web based) and Screaming Frog's SEO Spider (mac/win).

I hope Bing adds some export functionality here, as they do in other areas of their webmaster tools, but the filtering options are solid enough to drill down into key issues for now.

Submit URLs

This is a pretty straightforward option. Bing lets you submit URLs (whether or not they are currently in the index) for which you would like to request a recrawl or an initial crawl.

The URL allowance is pretty limited so it's best to save these requests for more important pages on your site (their crawl section has a spot for sitemaps).

Block URLs

You can also select pages, directories, or an entire site to block from indexing and/or Bing's cache:

One area for improvement here, I think, is to be able to input or upload individual pages. As of now, you can only input 1 page per click, or select a directory to block (site.com/directory/), or block the entire site.

They do offer export functionality, which is helpful when doing site audits, but a way to mass upload or input URLs would be nice (though you can tackle some of this with their URL normalization feature, which we will cover below).

Inbound Links

Bing will also show you the links they know about (in their index) that point to specific pages on your site.

Much like the charts above, you are presented with a historical chart which you can adjust with the slider below it (just like the Rank and Traffic stats shown prior).

Below those charts Bing will show you the pages on your site which have external inlinks and how many links they know of per page.

Once you click on a page, you'll see the linking URLs and the corresponding anchor text:

You can export page-specific links as well as the overall breakdown of pages with links and how many links those pages have. The export functions offer a nice way to get a high-level view of the overall link depth of your site.

While it's still a recommended practice to invest in a paid link research tool, supplementing your paid research by getting free link data from search engines is a no-brainer :)

Deep Links

Bing's Deep links are basically the same as Google Sitelinks. If you have been blessed by Bing, you'll see them in the Deep Links section of your site.

Bing's official statement on Deep Links is:

These Deep Links are assigned to websites which are seen by Bing to be “authoritative” on the topic which they target. The best way to influence whether you are chosen to have Deep Links displayed for your website is to create unique, compelling content that pleases searchers. Sites receiving this feature do an excellent job of delivering what visitors want, and keep visitors coming back time and again.

URL Normalization

If your URLs encounter parameter issues that can lead to duplicate content (e-commerce sites, CMS functionality, etc) then you might want to take a look at Bing's URL normalization feature.

Google offers a similar tool called Parameter Handling (great write up on this from Vanessa Fox)

This is a section where you need to be really careful as to not unintentionally boot out relevant URLs and content from the site.

Combining this with use of the canonical tag (which Bing treats as a hint) is your best bet to ensure there are as few duplicate content and link juice splitting issues as possible on your site (with Bing).

Again, make sure you or your programmer(s) know what you or they are doing so you do not do more harm than good.

With Bing, you basically just add whatever parameter you want to ignore so make sure that parameter or parameters do not crossover to other areas of your site that you would rather not have Bing ignore:

You can export all your inputted parameters as well.
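Before adding a parameter, it can help to preview which of your URLs would collapse to the same page once that parameter is ignored — that's exactly the check that catches a parameter crossing over into areas you'd rather keep distinct. A rough sketch (the parameter names are hypothetical; substitute the ones your site actually generates):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical parameters to ignore -- e.g. session trackers and sort
# orders that don't change the content of the page.
IGNORED = {"sessionid", "sort"}

def normalize(url, ignored=IGNORED):
    """Preview how a URL collapses once the listed parameters are ignored,
    roughly mirroring what a parameter-ignore rule would do."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

Run your URL list through this and group by the normalized form: any group mixing genuinely different pages is a parameter you should not hand over to Bing.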

Crawl Summary

The Crawl Summary section shows similar charts to other category charts inside Bing's Webmaster Tools on the landing page (6 month charting with interactive timeframe filtering).

You can check total number of pages crawled as well as pages with crawl errors off the landing page for this category (no exporting unfortunately) and dig into specific sections like:

  • Crawl Settings
  • Crawl Details
  • Sitemaps

Crawl Settings

Bing lets you set up custom crawl rates on a per-site basis:

You may have situations where a custom crawl rate might make sense:

  • You want the bot to visit off-peak hours rather than when customers are visiting
  • You might be running special promotions or seasonal promotions at specific times on an e-commerce site and want to limit bandwidth usage to visitors rather than Bing's bot
  • You might be doing a live stream or interview of some sort, and are expecting large amounts of traffic
  • Maybe you are doing some heavy content promotion across the web and social media and you want to avoid having any site load issues

You can use the timeframes given to line up with your server's location to make sure you are hitting the hours correctly (base time on the chart is GMT time).

You can also allow for crawling of AJAX crawlable URLs if you so choose. They recently rolled this out and their help section is weak on this topic, so it's unclear exactly how they'll handle it (outside of #!), but it's an option nonetheless.

Crawl Details

Bing's Crawl Details page gives you an updated overview of what's covered in the Crawl Summary. This feature doesn't require you to do any filtering to find issues, you can simply see if any of your pages have notable HTTP information, might be infected with Malware, and which ones are excluded by robots.txt.

If you have any pages pop up, just click on the corresponding link to the left and a list of exportable pages will pop up.

Another helpful, exportable report for site auditing purposes.

Sitemaps (XML, Atom, RSS)

This is where you'd submit your sitemap to Bing. For XML sitemaps, double check your submission with the Sitemaps.Org protocol

For a site that's going to be fairly static (like this one), I'd pay more attention to proper site architecture than to relying on a sitemap. I might even skip the sitemap unless I was using WordPress, where you can just have it auto-generate and update with new posts and such.

You can add, remove, and resubmit site maps as well as see the last date crawled, last date submitted, and URLs submitted.
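If you're generating a sitemap by hand rather than via a CMS plugin, the Sitemaps.org protocol is small enough to emit directly. A minimal sketch (the example URLs are placeholders):

```python
from datetime import date

def sitemap(urls):
    """Emit a minimal XML sitemap per the Sitemaps.org protocol:
    a <urlset> of <url> entries, each with a required <loc> and an
    optional <lastmod> (here, today's date)."""
    entries = "\n".join(
        f"  <url>\n"
        f"    <loc>{u}</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        f"  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```

Note that `<loc>` values containing `&` or other special characters must be XML-escaped before being dropped in; the sketch above assumes clean URLs.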

Bing Webmaster Resources

Bing's recent update to their Webmaster Tools added a good amount of value to their reporting. Here are some additional resources to help you get acquainted with Bing.

Free Microsoft Advertising Coupon

While you are over at Bing, signing up for Webmaster Tools, feel free to use these Microsoft AdCenter coupons for your advertising account :)

A Complete Review of Wordtracker's Link Builder

Jul 20th

You need links to rank, period. We can talk all we want about great content, social signals, brand signals, and all that jazz but quite a bit of that is subjective.

If you practice SEO, and have success with it, then you are well aware that a claim of "you need links to rank" is an objective, true statement without a bunch of false positives.

The gray areas come in to play when we talk about things like anchor text, quality, volume, and so on but the overarching truth is without links you are largely invisible in the SERPS.

Ok, enough of what you already know. Wordtracker recently updated one of their core tools with some cool new features and functionality.

What is Link Builder from Wordtracker?

Link Builder is designed to address most of the core functions of a link building and prospecting campaign:

  • Locate potential link partners via competitor backlinks or specific keywords
  • Set up a link building campaign and sort your links properly (blogs, directories, social media, etc.)
  • Track the status of your link campaign's efforts

Wordtracker uses Majestic SEO's Fresh Index by default but you can use the Historic Index as well.

I might opt for the Fresh Index initially, because Majestic tends to have dead links in the historic index (thanks to the significant churn on the web) but if you can't find enough decent prospects in the Fresh Index, using the Historic one isn't a bad option.

There is a lot I like about this tool and a few things I'd like to see them add to or improve on.

Step 1: Setting Up a Campaign

I'm a fan of clean, easy to use interfaces and Wordtracker definitely scores well here. Here is the first screen you are presented with when starting up a fresh campaign:

Researching competing link profiles is not enough with respect to link prospecting, in my opinion. I really like the option to not only research multiple URLs at once but also to research keyword-specific prospects.

You can research lots of countries as well. Below is a snapshot of the countries available to you in Link Builder:

Step 2: Prospecting With Competitor URLs

I am craving some chocolate at the moment, as you can tell from my selected URLs :)

Here's a good example of my decision making process when it comes to using the Historic Index and the Fresh Index. My thought process usually involves the following information:

  • The bigger/older the link profiles of the URLs, the more likely I am to use the Fresh Index to avoid lots of dead links
  • If the site is a well known brand I will be more likely to use the Fresh Index given the likelihood that the link profile is quite large
  • Smaller link profiles, newer link profiles will probably benefit from using the Historic Index more

In this example the sites I'm researching have big link profiles and have been around for quite awhile in addition to being large brands, so I will use the Fresh Index to cut down on potential dead-ends.

I selected the "Edit Sources" box because I want to make sure I pick the URL with the most links (or you can just go with both) but I wanted to show you the options:

I'll leave all selected just to maximize the opportunities. Sometimes you'll find pages ranking for specific keywords you might be targeting, rather than just the homepage ranking, so you can use both or one or the other if that's the case.

In this scenario I'm looking at the URLs ranking for "chocolate", and they all happened to be homepages anyway.

Wordtracker is pretty quick with getting the data in, but while you're waiting you'll see the following screen:

Step 3: Working with the Analysis Tab

In order to keep the results as targeted as possible, Wordtracker automatically removes the following links from the results:

  • Image Links
  • Redirects
  • No-follow links

One thing I'd like to see them do is let no-follow links through. Even though they might not pass any juice, they can certainly be decent traffic sources, and link building isn't just about passing juice; it's also about brand building and traffic generation.

I'd even say let image links through. I understand they don't want to be a pure link research tool but image links can be valuable for some of what I just mentioned as well. I would say, give us the data and the ability to filter it rather than just taking it away completely.

Here is a snippet of the result page and a description on what it represents:

On the left are pre-designed buckets that Link Builder groups your links into. This is helpful but I'd like to see more flexibility here.

They also offer a tagging feature to help you group links in another way. The tagging can be helpful for things like assigning links to specific people within your group or really any other custom setup you have going on (maybe stuff like grouping keywords into priority buckets or whatever.)

The prospect tab gives you the domain (chow.com in the below example) the link sits on, the page it links to on a competing site or sites, and the page the link is actually on from the linking site:

All you have to do is click that "links to" button to see where the link is pointing to (in this case chow.com is only linking to 1 of the sites I inputted).

The column to the right shows the page on the domain where the link is originating from and the number in the middle is a measure of how important that particular prospect might be.

The far-right column tells you whether the domain is also linking to you and how many of the sites you inputted that domain links to. The idea is that a domain might be more likely to link to you if it is already linking out to multiple competing sites:

The grayed out button to the right of the co-link count is the "target" button. This is the button you'd click to let the tool know that this is a prospect you'd like to target.

You have the following toolbar available to you in the Analysis tab:

These are generally self-explanatory:

  • Delete - removes selected prospects from the campaign
  • Export - export your results to a CSV file
  • Copy to - copies prospects to another campaign within your account
  • Tag - allows you to tag selected prospects to help create custom grouping fields
  • Filter - filters Top Link by "contains" or "does not contain". An example might be if you wanted to target a link prospect or prospects which contained the word "chocolate" somewhere in the URL

You can also click on any of the groupings on the left to view those specific groups only. I find that the groupings are fairly accurate but I personally prefer the ability to customize fields like that rather than being boxed in.

I created a sample tag titled "for eric" that contains 2 links I want a team member named Eric to work on:

Step 4: Working with the Contact Tab

The Contact tab has most of the same toolbar options as the Analysis tab with one exception:

  • Find Contact and About Links - click on the links you want to find contact information on and/or find the about page on

Link Builder works in the background to find this information and you can continue working in the application. There is a notes option as well. There's no specific way to leave multiple, time-stamped notes (for team environments) but the input box is expandable so you can leave an ongoing contact history.

You have the same contact flag on the right, and to the left of that is an email icon that turns yellow when you click it, designed to let you know contact is in progress or has been initiated.

When the contact request comes back (just refresh the contact tab) you'll see the following, new fields within the Contact tab that denote the contact/about pages for the prospect:

Step 5: Reporting

The Reporting piece of Link Builder has the following reports:

  • History - options for the Fresh/Historic Index of Majestic SEO via cumulative and non-cumulative views for the chosen domains
  • Spider Profile - the link category breakdown (the aforementioned pre-defined link sources Wordtracker assigns your prospects to) of each domain
  • Target Summary - number of targets, number/% of targets contacted, number/% of targets not contacted, number/% of targets linking to you

This gives you a quick overview of the growth of competing link profiles, current link building rate, types of links they have, and your own Prospect metrics. All the reports are exportable to PDF.

Here's the History report:

Here's the Spider Report:

Here's the Target Summary:

Additional Campaign Options

As we discussed earlier, you can either input a list of domains or search on a specific keyword.

If you search on a specific keyword to start, you are able to select URLs to include in your prospecting search. Everything else, in terms of options after the URL selection, is the same as if you had started with domains.

Having a keyword search to start a campaign is helpful in case you are looking to go beyond competitors you already know of and get a really deep look into link prospects across that keyword's market as a whole.

Also, right next to your campaign name you can sign up to be automatically notified of new links and prospects for your campaign:

Firefox Extension

Link Builder also has a Firefox extension that allows you to grab all the external links from a page and save them in your Link Builder account.

I find this is helpful on directory sites (for gathering a list of topic-specific URLs), as an example. The extension is really easy to use. You can install it here. Once you arrive at a page you want to use it on you just click on the LB logo in your toolbar:

Then once you click on the option to gather the links, you get the following interface:

You can save the chosen links right into your Link Builder account.

What I Like

The features that I like in Wordtracker's Link Builder tool are:

  • Ability to prospect by multiple URLs or by choosing a specific keyword
  • Option to use Fresh or Historic Index via Majestic SEO
  • Simple ways to keep notes and contact information
  • Ability to search for contact and about information on selected prospects
  • Robust selection of countries
  • Initial, intelligent link grouping
  • Exporting capabilities
  • Fast results and a really clean, easy to use interface

What Could Be Improved On

I think Wordtracker could do some things to make this tool even more functional and useful:

  • More flexibility with the naming and assigning of link types
  • Have profile-wide settings to include all links (no-follow, image, etc) or exclude some rather than excluding without a choice to include
  • More filtering options around the data points they offer and whether a prospect has been targeted or not
  • More robust link tracking (if the status of links change send me an alert). Though I realize that is getting into link tracking versus link building, it's still a nice option
  • A bit more flexibility with notes and timestamps for a more defined contact history (especially if teams use this)

A Solid Link Building Product

Overall I think this tool does a good job with its intended use, link building. I think some users would like to see more done to make it more team friendly but I think you can accomplish a lot with their tagging feature.

As stated above, I'd like to see some more done with notes and such but as a link prospecting and building tool Wordtracker's Link Builder is worth your time to try out.

You can grab a free trial over at Wordtracker.


How To Make Awesome Landing Pages for Local PPC

Jul 12th

Am I the only one who gets a warm, fuzzy feeling from a well-crafted, super-targeted landing page? Right, I didn't think so :)

Landing pages tend to suck more often than they inspire.

Local landing pages are even worse in many cases; with hapless advertisers throwing Google AdWords coupons away by simply sending you to their home page for every single ad :(

Why Local PPC Matters

I firmly believe that local PPC (and SEO) is still an untapped resource for those looking to make client work a part of their business portfolio.

It's hard enough for a local business owner, especially one with little experience in web marketing, to be handed a $75 AdWords coupon and be expected to magically turn it into a quality PPC campaign that lasts.

Google tried that mass approach to marketing and failed. The result of that failure has brought about things like:

Google recognizes the market for helping small businesses reach customers on the web as do Groupon, Restaurant.Com, and all their clones.

Local PPC, especially when used in conjunction with local SEO, can really make significant differences at the local business level and many of those businesses need help to do it.

Landing Page Quality Matters

I really dislike hitting a generic landing page after I make a really specific query. It's kind of like going to Disney and asking where Space Mountain is, only to be told that "we have lots of attractions sir, here is a map of the entire resort".

Generally speaking, I believe most people like being led around by the nose. People typically want things yesterday, so it's your job to give them exactly what they are looking for; after all, isn't that the point of search?

I think anyone who's worked with PPC campaigns can attest to the fact that targeted landing pages are quite high on the importance totem pole. Tailoring your landing pages to your target market matters a lot.

Solid Local PPC Landing Pages

Designing a good landing page for local queries is not hard at all. There are many different layouts you can use and you should test as many as is practicable, relative to your traffic levels, to understand which ones will work for you.

One area where local PPC is ripe for local business owners is insurance. I'm going to share a good example of a local lander below, but if you are doing local PPC, before you even get to landing page design, use Google's address links like this advertiser did (green arrow mine):

The above can help you stand out from the crowd when you are one of the few local advertisers, and it helps create that local experience right from the start.

So I came across a couple of examples of good ways to tie in local content with your landing page design.

Here's one from the insurance industry targeting terms around "wisconsin car insurance" followed by some tips on why I feel it's a good example (green arrows are mine):


Why is this a good example?

  • Use of the local modifier in key spots (doesn't appear stuffed)
  • The Wisconsin Badger college football team's main color is red (not sure if that factored in but it helps to tie stuff like that in)
  • Icon of the state in the main header
  • Good use of badges to display authority in the insurance niche
  • Lack of other navigation options, focused on the offer and the benefits of using their service
  • I might have bolded "we only do business in Wisconsin" though

The above example also highlights a common problem with local insurance agents: quite a few can't offer live quotes, so they have to fall back on a contact form. On a web of instant gratification, that can be an issue.

Another good example is in an area where local customization works well: travel!

This was for a search around the keyword "boston hotels". The imagery is great here. A couple things I would have done would have been to eliminate the left navigation and make the main content area more bullet-point oriented rather than a set of paragraphs.

Overall, they have a set up here where they can do the same approach across a bunch of different locations.

Not So Solid Local PPC Landing Pages

While searching for the above examples I also found some really untargeted approaches to local keywords. Here's an example of a brand just throwing out a really basic lander:

Absolutely no local customization at all. Good landing page basics though (clear CTA, clear benefits). Perhaps bigger brands don't need to, or fail to see the value in, making landing pages local-specific on local queries.

Liberty has no excuse, either. They have local offices in every state and could easily make their pages more local, but for whatever reason they choose not to.

In keeping with the same theme, I found this landing page for "boston hotels" to be underwhelming at best:

It's a list of information in an otherwise coldly designed table. Perhaps this works well enough, just give people the info I suppose.

As a user, especially if I'm traveling, I'd like to see pictures, brief info about the area, why choose here over the hundreds of other providers, etc.

Quality Landing Page Foundations

Typically, I would recommend starting out with a base layout and designing the page according to your market and then layering on local criteria. If you look at examples of good landing pages the layouts themselves don't change all that much.

Some local elements you can include are:

  • Local imagery
  • Locations and hours
  • Integrated map with directions
  • Proximity to local landmarks (good for things like hotels, bed and breakfasts, etc)
  • Local phone number and contact information
  • Membership in any local group (rotary club logo, Better Business Bureau, chamber of commerce logo, logos of local charities or events you are involved with, etc)

As discussed before, design should also speak to your audience (more tech savvy or less tech savvy, age, gender, market, and so on).

Consider these 2 examples of landing pages for online invoicing. This is a market where design should be fresh, modern, "web X.X" if you will (like market leader Freshbooks).

Here's a win for good landing page design:

I really like the free sign-up bar at the bottom: your call to action is always available, whether you have to scroll or not. Good use of headlines, a solid list of benefits, and super-easy sign-up.

Compare that to something like Quickbooks which requires quite a bit of info to get started:

Then you have another example of, usually, what not to do: too many navigation options, run-on paragraphs, a lack of bullet points, and (in my opinion) an outdated design for this market:

So the layouts don't change drastically and I'd recommend coming up with a layout first, a base design, and base copy. Then you can easily turn any landing page into a targeted, local page pretty quickly with small design and copy tweaks.

Landing Page Resources

A few places I have bookmarked for landing page references are:

A couple of tools to help you with cranking out solid landing pages would be:

  • Unbounce (hosted)
  • Premise (a WordPress plugin from Copyblogger that comes with a ton of custom graphics and built-in copywriting advice and tips)

It's not that difficult to create awesome, locally targeted landing pages. It's a really simple process:

  • Check out the resources linked to above and make a swipe file of nicely designed landing pages (design and layout)
  • Incorporate the base layout and copy layout (headings, graphics, CTA's, etc) into a wireframe
  • Minimize distractions (focus on getting the clicker to complete the desired task)
  • Get the UI and graphics in order
  • Think about all the ways you can sprinkle in a local feel to the page, like we talked about above (colors, locations, hours, local connections, imagery, and so on)
  • Add in the local components to your base page

What are some of your best practices when putting together landing pages for local PPC campaigns or landing page tips in general?

What's In Your SEO Toolbox?

Jul 7th

The SEO tool space is a pretty crowded one (and growing one!). Tools are helpful, there is no doubt about that. However, tools are generally only as good as the person using them. We'd love to know what tools you use and why, so please let us know in the comments after the post :)

I am not "house" handy by any means, I can barely hang a picture frame straight. So if you gave me the best construction tools in the world I'd still make extra holes and screw something up.

Even if I managed to get the picture hung correctly, it certainly would not look professional.

You can buy as many guides, tools, and accessories as you like, but in the end it is your skill that determines the success or failure of a project, whether that's building a deck or building a website. Skills can be sharpened, but tools do not make up for a lack of skill.

SEO Tool Fatigue

SEO tool fatigue is a real issue for some folks. Some people spend a good chunk of their productive time testing or trying out new tools, or use so many tools that their implementation and interpretation of the data suffers. One tool says this, another says that, and yet another says one or the other, or both, or neither :) .

The first thing to realize is that most of the data from tools (excluding analytics and the like) is an estimate of estimated data: it either comes straight from Google's various estimation tools (the Keyword Tool, Trends, Insights, and so on) or is driven by whatever metrics the tool builder thinks are important or reliable (there tend to be some swings and misses with that approach).

You are not going to fail miserably if you decide not to spend days and days doing keyword research with multiple tools and then more days comparing the different datasets. Research is important, but there is a limit.

Picking a Core Set of Tools

From a cost and time standpoint I've found it really helpful to pick a core set of tools and stick with them rather than bouncing around to get an extra feature or two.

It's good to peek around from time to time, but juggling a bunch of mostly similar tools can lead to a "needle in the haystack" approach, where you spend most of your time digging a time-suck hole rather than building websites and adjusting strategies based on analytics and/or AdWords data.

Again, research is important but there is a sweet spot and it's a good idea to get some kind of system down so you can focus on doing "enough" research without doing harm to the time it takes you to get sites up and running.

Evaluating Tools

I'm going to highlight some of the tools I've used below, most of which are considered to be market leaders. I'll point out why I use certain tools, why I don't use others (yet) and I encourage anyone who's dealing with tool overload to do the same for the tools you use.

The areas I'll be focusing on are:

  • Keyword Research
  • On Page Criteria
  • Rank Checkers
  • Competitive Link Research Tools
  • Link Monitoring

Keyword Research

There are many keyword research tools that pull data from the sources listed below (like our free keyword research tool, which pulls from Wordtracker).

These tools use their own databases (although in Wordtracker you can ping Google's tool as well).

I use all the Google tools, plus Microsoft's Ad Intelligence, Wordtracker, and the SeoBook Keyword Tool. Sometimes I use Wordtracker just via our keyword research tool, and sometimes I use Wordtracker's web interface (I like being able to store things there).

Our keyword tool also links in to most of the sources listed above. A big reason why I like our keyword research tool is that it's super easy to hit the major data points I want to hit on a particular keyword from one location.

Ad Intelligence is solid as (Microsoft claims) they incorporate actual search data into their results, rather than estimating like Google does.

I should also note that I mainly use Trends and Insights for comparing similar keywords and looking at locality (in addition to the history of keywords). Sometimes you run across really similar keywords (car, auto) and it can help to know which one is most relevant to your campaign.

On-Page Optimization

For the on page stuff I'm mainly concerned with large scale, high level overviews.

I use our toolbar for specific on-page checks. But when I'm diagnosing internal linking problems (wasted internal link flow, broken links, HTTP status codes, and so on), or title tags and meta descriptions that are missing, too short, too long, or duplicated, I use a couple of different tools.

Since I'm on a Mac and I don't care to run Windows for anything other than testing, I use the three listed which work on Mac (though I don't use them in every situation).

I use Screaming Frog's SEO Spider pretty frequently as well as Peacock's Integrity. Integrity is a broken link checker while SEO Spider incorporates other SEO related features (title tags, H1/H2's, anchor text, and a ton of other important elements).

WebSite Auditor offers most, if not all, of what SEO Spider does, but also incorporates white-label reporting, Google PageRank, Yahoo! & Google link popularity, cache dates, and so on.

For some of those features in Website Auditor you might want to either outsource the captcha entry or use their Anti-Captcha service so you don't have to sit there for hours typing in captchas.

In my regular workflow, SEO Spider and Integrity get used a lot, and Website Auditor comes into play for some of those other metrics and for white-label reporting.

Rank Checking

Here's a crowded space! The right choice really depends on your needs: are you a solo SEO who runs multiple sites, do you run your own sites plus client sites, or are you a client-only shop?

Here are some of the main players in this space:

Even if you have reporting needs, you can still do a lot for free with our free rank checking tool (scheduled reports, stored reports, multiple search engines, and so on) and Excel or another spreadsheet program like OpenOffice.Org or Google Docs. Some good tips on creating ranking charts with Excel can be found here.

There are a couple differences with the software players, Advanced Web Ranking and Link Assistant's Rank Tracker (both have multiple levels so it's wise to check the features of both to see if you need the higher end version or if the lower priced versions will work for you). Some of the key differences are:

  • Rank Tracker integrates with Google Analytics
  • Advanced Web Ranking has a variety of ways to track local rankings, including maps and a local preview engine
  • Advanced Web Ranking has more, easier to customize reporting options
  • I find that the interface with Rank Tracker is much easier to work with
  • If all you are looking for is rank checking, then Link Assistant is a bit cheaper overall (comparing the enterprise versions of both), though AWR has more local options at its higher price point. You can see AWR's pricing here and Link Assistant's here. It's also worthwhile to check out maintenance pricing (Link Assistant and AWR)
  • AWR lets you assign a proxy per project, which can be really helpful if you have clients all over the map.
  • AWR automatically pulls in the top ten sites for a keyword, along with their last position compared to the current one, and lets you add any of those sites to your tracking (at any point) with all the historical data saved and updated within your account.

One tip with software tools is to run them on a different machine, perhaps even behind an IP off of a private VPN service like WiTopia, and think about utilizing multiple proxies from a service like Trusted Proxies and/or using an anti-captcha service with Link Assistant's tools.

The idea is to not get your IP banned and to let you continue to work as normal on your main machine while another machine is handling the automated queries. If you don't want to fuss with that, you might want to try a cloud app.

The Cloud and Scalability

The three main services, that I've used anyway, come from Raven, SeoMoz, and Authority Labs (Authority Labs now powers Raven's SERP tracker, too). My biggest concern with cloud-based rank checkers is that the keyword volume can be (understandably) limited. Authority Labs has unlimited checking at $450/month, but the other two have limits.

Let's just look at the highest plans for a second: Moz allows 30 campaigns and a total of 3,500 keywords, while Raven's highest plan allows unlimited domains and 2,500 keywords total (and 200 competitors).

If scalability is a concern for you then you might be better off with software solutions. Once you start running multiple sites or are responsible for reporting on multiple sites (and you are working the long tail and your analytics) then you can see how restrictive this could become.

Of course, comparing just the rank checking options of a tool set like Raven and Moz (which both have other useful tools, Raven more so for full on campaign management) doesn't do the pricing justice. So what you could do is still use the many other tools available from each company and use a software solution once your rank checking scales beyond what they offer.

Both Moz and Raven integrate with Google Analytics, and Raven's campaign integration with GA is quite nice too (beyond just rankings).

Link Research

Free tools like Yahoo!'s Site Explorer, search-query tools like Solo SEO's link search tool, and Blekko's link data are nice, but at some point in your SEO career you might have to get on board with a more advanced link research tool (or tools) to get the data you need to compete in competitive SERPs.

A good chunk of software-based solutions pull link data from search engines but if you want a more, way more, comprehensive view of a competing site's link profile (and link history) you do have a few options.

Majestic was originally known for having a much deeper database, with the caveat that they keep a lot of decayed links, and their UI wasn't overly impressive. Well, as noted in a recent blog post (which includes 20% off coupons) on Majestic's new tools, most of that isn't the case anymore. Though, I still feel Open Site Explorer has a better and smoother UI.

Advanced Link Manager's strength lies in their ongoing link management and reporting but they also have some decent link research tools built in and they can connect to SeoMoz's API to gather link data, so that kind of sets them apart from those other software-based solutions.

Again, Moz offers other tools as well so it's hard to really compare price points. What I like about OSE is that you can get a really solid, quick overview of the anchor text profile of a competing site. Also, you get unlimited look ups and up to 10k links per query on their pro plan (in addition to other Moz tools). You can get a 30 day free trial of all the Moz tools as of this writing.

Majestic's New Tools

Majestic, now with their new site explorer and fresh index, rival OSE's UI and freshness a bit but there still are limits on usage. You can check out Majestic's pricing here and don't forget about the 20% off coupon mentioned here.

Typically I like to use both Majestic and OSE. I like the new tools Majestic has come out with and their historical data is solid. OSE, for me, is great for getting some of a site's top metrics quickly (anchor text, top pages, etc).

If I had to pick one, I'd go with Majestic mostly because Moz gives a decent amount of data away for free (being a registered user) and because Majestic has really good historical + deeper data.

Link Management

Building links, especially if you have a team, can be a cumbersome process unless you have collaborative tools to work with. Even if you operate mostly on your own, you might want to track links you've earned or built directly.

Every once in a while I like to download a report from Majestic SEO and add any links that are not yet in my tracking program. Some people prefer to track only paid or exchanged links and let the natural ones come and go on their own.

There are a couple of tools out there that I've used, and one I haven't but I've heard good things about it from reputable sources so I'll include it here.

Raven's Link Manager is probably their flagship tool. It has received really high praise from experienced SEO's and is easy to use. You can easily add links, assign them to employees, and let Raven worry about the automatic checking and reporting in case something changes with a link.

Advanced Link Manager has many features built in but you can use it just for tracking links you want to track by uploading the links into the program. It's software based and you can set it to run whenever you'd like, automatically.

I personally haven't used Buzzstream, but reputable people have told me it is a solid program, and they have a free 14 day trial here. It's a dedicated link building and management tool (and also has a PR and social media tool) so chances are if you are looking for a specific tool to fill that need, this one might be worth a shot.

If you don't have a ton of links to manage or a team to manage, you might be just fine with an Excel spreadsheet or a Google Doc. To me, it's just one more thing to think about and Raven and Buzzstream have low priced plans if you don't need enterprise-level storage.

What's in Your Toolbox?

So there's an overview of what I feel are the best SEO tools out there, including the ones I use frequently (or infrequently).

I'd love to know what you are using and why (or why not?) :)

Is SEO Irreducibly Complex?

Jun 10th

In his book On the Origin of Species, Charles Darwin says:

"If it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down."

This is typically used by proponents of Intelligent Design to state their case against evolution by invoking the principle of Irreducible Complexity, which is to say that:

This applies to any system of interacting parts in which the removal of any one part destroys the function of the entire system. An irreducibly complex system, then, requires each and every component to be in place before it will function

Essentially, the idea is that something irreducibly complex does not evolve to its state gradually (as in evolution), so the scientific process of researching how X came into being is mostly irrelevant compared with something that has evolved over time (like an algorithm). A man-made algorithm fits into both categories.

What creature could be more complex than a creature which not only was part of natural evolution but also has elements of intelligent design within its core?

How does this apply to SEO? Google's algorithm evolves, and it fits precisely with how Darwin laid out the basis of his theory of evolution (numerous, successive, slight modifications), except by human hand and captured data.

Of course, sometimes Google makes a big update but generally speaking they make lots of minor updates per year.

Consider that a couple years back in 2009 they claim to have made just south of 500 updates in that year alone.

So the point I'm making is that SEO is both irreducibly complex (remove the hand of man and it would no longer evolve or even work as intended) and a product of a natural evolutionary process (the constantly adjusted algorithm) with layers and layers of thousands of changes over time, full of small and large complexities. These two characteristics make the process of trying to break it down to a stagnant formula with assigned percentages you can attribute to a majority of examples (with confidence) cumbersome and inaccurate.

Forcing Simplicity Creates Complexity

If you read a bunch of SEO blogs you might feel a bit overwhelmed with where to start and what to do. Some blogs tend to be information-heavy, some heavy in theory and (attempted) science, some straight news oriented, and some that are of the good old fashioned boot in your rear end "get something done now" genre.

I think it's important to pick blogs from those aforementioned areas of the industry to get a well-rounded view of the SEO space. However, sometimes the simpler you try to make something, say by whittling SEO down to a push-button solution, the more complex you make things, because then you need sound, reliable data to back up those kinds of claims and solutions.

If data starts reading out 50/50 or 60/40 probabilities, then that's not really sound science at all. If anything, it just shows that some things cannot be broken down into a push-button formula or a statistic with any reliability whatsoever. It probably makes for good salesmanship when you want to wow a client with your superior knowledge, but it also makes for laughable science, kind of like this kind of science:

The real problem is that Google claims to have more than 200 parts to its algorithm (which we obviously don't have available for study :) ). Even if you call it an even 200, what about the different weight each factor carries? Surely each does not represent exactly 0.5% of the algorithm.

When you try to mathematically and scientifically break down a formula in which you know, at best, a fraction of the variables and their direct effects, you actually create more confusion, because you have to go out and find examples proving a specific theory while ignoring ones that point in the other direction.
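To see why observed rankings alone can't pin down factor weights, here's a toy sketch. The "factors," pages, and weightings are all made up for illustration; the point is simply that two very different weightings can produce identical orderings, so you can't reverse engineer the weights from the results:

```python
# Toy illustration: two different (hypothetical) weightings of three
# made-up "ranking factors" produce the exact same ordering of three
# pages, so observed rankings alone can't tell the weightings apart.
pages = {
    "page-a": {"links": 0.9, "anchors": 0.2, "onpage": 0.5},
    "page-b": {"links": 0.5, "anchors": 0.6, "onpage": 0.4},
    "page-c": {"links": 0.2, "anchors": 0.3, "onpage": 0.9},
}

def rank(weights):
    """Order pages by a weighted sum of their factor scores."""
    score = lambda factors: sum(weights[k] * factors[k] for k in weights)
    return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

guess_1 = {"links": 0.7, "anchors": 0.2, "onpage": 0.1}
guess_2 = {"links": 0.5, "anchors": 0.4, "onpage": 0.1}

print(rank(guess_1) == rank(guess_2))  # True: both give the same ordering
```

And that's with three factors and three pages; with 200+ factors of unknown weight, the space of weightings consistent with any set of observed SERPs is enormous.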

Figuring Out the Variables

I think the annual SeoMoz Search Engine Ranking Factors is a worthy read as they pull data from lots and lots of respected folks in the industry and the presentation is top notch. I think overall it's a good representation of the factors you will need to face when conducting an SEO campaign.

Another good page to bookmark is this page from Search Engine Journal which has guesstimates of what they feel these elusive variables might be.

It can be hard to isolate really specific types of variables because of the constant Google updates, the other factors that are involved with your site and its ranking, and anything being done by the competition. You can test elements for sure, things like:

  • Does X link pass any pop?
  • Seeing if a couple pages pass juice on a 301 before 301-ing an entire site
  • On-page elements like title tag changes, internal linking, and external linking
  • And so on...

The issues are still there though, even with "testing". It is still really, really hard to sell off a scientific breakdown of a consistent path to success beyond high-level ideas like:

  • Become a brand (brand signals, social media signals, offline branding, nice site design, etc)
  • Lots of links from unique domains (preferably good ones of course)
  • A good natural mix of anchor text
  • Great user experience and deep user engagement
  • Targeted content which gives the user exactly what they are looking for

I think that for someone looking to move forward in their SEO career it is important to try and remove the idea that you can break down the factors into exact numbers, as far as value of each individual variable goes. Anyone who practices SEO will likely tell you that you simply want to win more than you lose and even if you are on top of your game you still will have site failures here and there.

The issue of failing might not even be because of some current practice. You could be sailing right along and all of a sudden a Google update cleans your clock (another good reason to be involved with multiple projects).

You might spend more time agonizing over some magic formula or avoiding a project because some tool told you it was too competitive (rather than your knowledge) than building out multiple web properties to weather the expected storms and the ebbs and flows of the web.

Dealing with Complex & Unknown Variables

When faced with the prospect of working within a system where the variables that hold the key to your success are unknown, it can seem daunting. It can also make you want to run out and buy a shiny new tool to solve all your problems and get you that elusive Google ranking you've been waiting for.

The sad truth is if there was such a tool the person(s) who created it wouldn't be selling it to you for less than $100 or slightly higher (or even way higher!). They would be building sites in many verticals and making an absolute killing in the SERPS. By selling it to you they would just be creating more work for themselves and competition.

Not all tools are bad of course. I use the tools here at SeoBook as well as tools from Majestic, Raven, SeoMoz, and Caphyon (Advanced Web Ranking). The tools give you data and data points to work with as well as to cross reference. They do not provide answers for you at the push of a button.

The best thing to do is to start launching some sites and play around with different strategies. Over time you'll find that even strategies that worked in A, B, and C markets didn't work in D or E.

Things like the algorithm changing and competitors stepping up their game can be reasons why test results aren't always accurate (at the really granular level) and why certain strategies worked here but not there.

Keeping Track of Wins & Losses

It makes sense to keep some kind of running journal on a site (why I did this, when I did that, etc) so you can go back and evaluate real (not theorized) data.

Running weekly rank checks isn't a bad idea, and tools like Advanced Web Ranking and Raven have built-in ways to keep notes ("events" in Raven) on a specific campaign, or to log date-based events (added X links this day).

I happen to like Evernote for these kinds of things but most project management applications and information organizer tools have this kind of capability built in (as does having separate Word and Excel docs for your campaigns).
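If you'd rather script it than use Evernote or a spreadsheet, a change journal is just an append-only log. Here's a minimal sketch; the site name, columns, and entries are all made-up examples, and the log is kept in memory here rather than in a file just to keep it self-contained:

```python
import csv
import io
from datetime import date

# A minimal site-change journal: date, site, what changed, why.
# In practice you'd open("journal.csv", "a"); StringIO keeps this runnable.
log = io.StringIO()
writer = csv.writer(log)
writer.writerow(["date", "site", "change", "reason"])

def note(site, change, reason, when=None):
    """Append one timestamped journal entry."""
    writer.writerow([(when or date.today()).isoformat(), site, change, reason])

note("example-site.com", "rewrote title tags on category pages",
     "titles were truncating in the SERPs", date(2011, 7, 5))
note("example-site.com", "added 10 links from niche directories",
     "testing anchor text mix", date(2011, 7, 6))

# Later, read it back to line entries up against rank-check dates.
log.seek(0)
entries = list(csv.DictReader(log))
print(len(entries), entries[0]["change"])
# 2 rewrote title tags on category pages
```

The payoff comes when you join these dates against your weekly rank checks and analytics: you get real (not theorized) before-and-after data for every change you made.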

So if you are involved with a handful of projects, in addition to keeping track of the strategies used, you can really get a solid handle on what is likely to work in the short-to-mid term and what is actually working right now.

A good example of this is folks pooh-poohing, over the years, the idea of exact match domains being a golden egg of sorts. If you were or are running SEO campaigns, you'll have noticed that the exact-match benefit was quite real. So while pontificators were decrying their effectiveness, practitioners were laughing all the way to the bank.

There is no substitute for real experience and real data. Which group do you want to be in?

Mental Models

As we discussed above, the algorithm has a lot of components. There is generally no single universally correct answer for each and every SERP. The gold usually lies in trying to understand where algorithms are heading and how they have changed.

As an example, in his recent post about exact match domains losing weight, Aaron used highlights to visually segment the search results in regards to "why is XYZ ranking". I'll include the image here:

This is a good example of how building your own sites and collecting your own data helps you form and solidify your mental models.

The tricky part is knowing whose advice is garbage and whom you should trust. Take the conclusions you have independently arrived at and repeatedly tested, and see who is offering similar advice. Those are the folks you can trust to tell you "what actually works" rather than "how to buy the thing they are selling as a solution".

For another example of a mental model in action, you should check out Charlie Munger's piece on mental models and investing.

One more piece of advice here. Recently we wrote about the importance of rank checking with a tie-in to analytics. It's vital to have both in place so you can get concrete before-and-after data. Without hard data relative to ongoing algorithm changes, you are flying blind to the actual changes being made.

Being in the Know

The reason this community and many paid communities are successful is because there isn't a lot of noise or high pressure sales (like there are on free chat forums or message boards) and because experienced people are able to freely share ideas, thoughts, and data with like-minded people.

The more information and thoughts you get from people who are in the trenches on a daily basis, the better off your efforts, knowledge, and experience will be, because theories will only get you so far.

I think there is a scientific element to some factors like links, domain age, social signals, brand signals, anchor text (but at a high level, nothing overly exact) but overall I think it's too complex to break down into a reliable scientific formula.

It's important to pay attention to trends, but your own experience and data are invaluable to your ongoing success. I believe that search is going to continue to get more complex, but that's not necessarily a bad thing if you have access to good information.

A friend gave me a great quote from Michael Lewis's book, Liar's Poker:

You spend a lot of time asking yourself questions: Are munis (municipal bonds) right for me? Are govys (government bonds) right for me? Are corporates (corporate bonds) right for me?

You spend a lot of time thinking about that. And you should.

But think about this: it might be more important to choose a jungle guide than to choose your product.

When it comes to SEO, it's pretty important to choose your jungle guides correctly.

Set Up a Local PPC Campaign in 8 Easy Steps

Local PPC Tips

Recently, we did a post on the benefits of using PPC for Local SEO. In this post, we are going to go through how to structure a local PPC campaign for a small business.

This post is designed to show you how easy it really is to get a local PPC campaign up and running. If you want to streamline the campaign-building process, choosing the right tools upfront can be a big help. Landing page creation can be a time-consuming and expensive task, but it doesn't have to be.

Step 1: Choosing a Landing Page Tool

For the quick generation of nicely designed landing pages, Unbounce is tough to beat. Unbounce has a variety of paid plans available. Unbounce hosts your pages on their site. You can use a CNAME record to have these pages look like they are on your domain via sub-pages or via a sub-domain. This is done very easily through your host and you can get the instructions as well as track the status right from the dashboard:
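As a sketch of what the CNAME setup looks like (the subdomain and target here are hypothetical; use the exact record values Unbounce's dashboard gives you for your account), the record you add at your DNS host is along these lines:

```
; Point a subdomain at Unbounce so landing pages appear on your own domain.
offers.example.com.    3600    IN    CNAME    unbouncepages.com.
```

Once the record propagates, pages served by Unbounce will show under your subdomain instead of theirs.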

Unbounce Dashboard

I just set up this new account so it will be a little bit before this propagates. Once you arrive at the landing page dashboard, you just click the big shiny green button to create a new page:

Create New Page

You can choose from any of Unbounce's pre-designed templates (which are quite nice) or you can start with a blank template.

There are quite a few templates we can use, but since this is largely a lead generation campaign (for a local insurance agent) I'm going to grab one of the spiffy lead generation templates.

Customizing Unbounce

Unbounce has a Photoshop-like interface which makes changing text and imagery a breeze:

Unbounce Editor 1

Unbounce Editor 2

For a local insurance agent who sells maybe 3 or 4 different products to a small handful of towns, setting up his or her landing pages is really quite painless.

Alternative to Unbounce (for Wordpress Sites)

Once you start generating thousands of unique visits a month (hopefully!) Unbounce can get pricey. At that point, where you are presumably making lots of sales off of PPC, you might want to invest in having a designer start creating some custom landing pages rather than paying a hundred or hundreds of dollars per month on a service like Unbounce.

If you are using Wordpress and don't mind getting your hands into some basic coding, then you should check out Premise (a relatively new Wordpress plugin from Copyblogger Media).

After doing some basic customizations on the look and feel of your template, you can quickly generate good looking and solid landing pages while getting copywriting advice built right in. Also, Premise comes with a ton of custom, well-designed graphics from their in-house graphic artist. The plugin works with any Wordpress theme.

Geordie did a review of Premise over at PPC Blog.

Step 2: Keyword List Generation

With local keyword research you'll almost certainly run into a lack of data from keyword tools. There are a few tips you can use when doing local keyword research to help get a better handle on a keyword list suitable for a PPC campaign.

Throughout this post we will be referring to a local insurance agency but this can apply to any campaign that is pursuing local PPC as an option.

Competing Sites

One of the first things I would do would be to check out the local sections of bigger websites. In the example of insurance there are plenty of large sites which act as lead aggregators and target local keywords (usually down to the state level).

You can very easily visit one of these sites and take a quick peek at their title tag and on-page copy to determine what keywords they are targeting. In your market, as with insurance, there are usually related keywords that overlap. Car insurance and auto insurance are classic examples in the insurance industry.

Looking at a few competing sites might give you an idea of whether or not most sites are pursuing keyword x versus keyword y. This can be a helpful data point to use when constructing your PPC campaign.

Google Trends

Google Trends is a nice tool to use when you are trying to discern between similar keywords and it offers trends by region, state, and city. For example, here are some charts for home insurance versus homeowners insurance versus homeowner insurance:

Google Trends

Here is the section where it shows trends by location (you can filter further, down to a city level)

Google Trends

I usually like to keep multiple variants in the PPC campaign at the start, just to test things out, but using Trends to find which variant has appreciably higher volume can help you choose which keywords to research further.

Your Own Knowledge

You know how your customers speak about products and you know the lingo of your industry best. That is how I usually would start a local keyword list. Then I would move into using Google Trends and looking at competing sites to see what the tools tell me about my initial take on a possible keyword list.

I would then further expand that list by entering those core terms into the Google AdWords Keyword Tool and the SeoBook Keyword Tool (our tool is powered by Wordtracker) and see what keywords I may be overlooking.

You can search in those tools for local terms like "boston car insurance", but once you try to dig past a core term like that, the data for local terms gets almost non-existent.

I would recommend taking the non-geo-modified keywords you find with the processes mentioned above and entering them into a keyword list generator, along with the names of the towns, cities, and states you service. You can use our free keyword list generator :)

SeoBook Keyword List Generator

This campaign happens to be in a small, rural area. So the strategy here is to get feedback ASAP to see if we are dealing with any meaningful volume which would warrant further investment into a PPC campaign (outside of just testing search volume for SEO purposes).

I'm going to say that this agency sells car, home, life, and business insurance in 5 towns. There are some local PPC tools out there but for a campaign this size it won't be all that time consuming to set up.

What you can use is our free keyword list generator and export to either broad, phrase, or exact match after grouping everything together :)

Keyword Generator

Once you click "Generate" you get the results displayed like this:

Export that to CSV and you are good to go!

So basically, because the volume is small, we want to try and hit as many variations as possible. With local keywords you can have the following modifiers:

  • state spelled out or abbreviated
  • zip code
  • town or city
  • various combinations of the above 3 elements
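Generating these combinations by hand gets tedious fast. Here is a minimal Python sketch of what a keyword list generator does under the hood (the towns and products are made up for illustration), including formatting for AdWords broad, phrase, and exact match:

```python
from itertools import product

# Hypothetical towns and product keywords for a local insurance agent.
towns = ["Springfield", "Chicopee", "Holyoke"]
products = ["car insurance", "auto insurance", "home insurance"]

# Combine every town with every product, in both word orders.
keywords = []
for town, prod in product(towns, products):
    keywords.append(f"{town} {prod}".lower())
    keywords.append(f"{prod} {town}".lower())

# AdWords match-type formatting: broad is bare, phrase is quoted,
# exact is wrapped in square brackets.
broad = keywords
phrase = [f'"{kw}"' for kw in keywords]
exact = [f"[{kw}]" for kw in keywords]

print(broad[0])   # springfield car insurance
print(phrase[0])  # "springfield car insurance"
print(exact[0])   # [springfield car insurance]
```

Swap in state names, abbreviations, and zip codes as extra modifier lists, then export each match-type list to CSV for upload.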

You can choose to do your ad groups by a particular variable like city/town or product. Since geo-modifiers play a huge role in this campaign, and to keep things cleaner, I will do the ad groups by town, plus one for the state.

Step 3: Setting Up the Campaign

There are some initial steps you'll have to go through when you set up your AdWords account.

Choosing Local Settings

There are 2 areas for location targeting:

  • Location and Languages - where you would select specific language and location settings (country, state, town, custom map, etc)
  • Advanced Settings - where you can target by physical location and search intent, just physical location, or just search intent. You can also use Exclusion settings where you can exclude by physical location.

In the next section I'll show you why I chose certain settings for this specific campaign and when other options might be appropriate.

Location and Languages

With Location and Languages you get the following options:

Location Languages

In this campaign I am targeting one area of a particular state for a local business. There is no set rule here; the settings should be chosen based on the client.

For instance, this is a local insurance agent in one part of the state with 2 offices in this particular area. It generally isn't wise to attempt to go after towns which are not within driving distance to the offices.

The selling point of a local agent is local service, a place you can drive to in order to talk with your agent about your policy, and so on. Most local agents do not have the technological ability to compete with direct writers like Geico and Progressive with respect to being able to adequately serve customers across the country. If the agency had multiple offices across the state I would reconsider my position on location targeting.

In any case, we are using broad matched geo-modified keywords and the surrounding states may have one or two overlapping town names so I'm going to use a custom map for location. The custom map works here because I can generally cover most of the areas where people who live in the area of the client either live or work.

Alternatively, I could choose (as states) Connecticut, Massachusetts, and Rhode Island, but then you could run into overlapping town issues (which is also the concern with going country-wide). You could solve that by introducing the state modifier (CT or Connecticut in this case) into your keywords, but that defeats the goal of starting with a low-effort campaign to see if there's any volume in the first place (just using city/town and keyword as a broad match).

Just click on "Select one or more locations" and click the Custom tab in the dialog box that opens up:

Custom Map Dialog Box

You can click or drag to create your custom map. I like to click (must be 3 or more clicks) because I find it to be more precise when trying to isolate a location.

I've created a custom map targeting the locations where customers likely live and work:

Custom Map Results

By their very nature most local businesses service a specific, geographical segment of a particular market. In sticking with the insurance example, let's say you have an agency in Boston, Cape Cod, Springfield, and Worcester (hitting most of the major counties in Massachusetts).

In the case of multiple locations you'd want to run multiple campaigns targeting those specific locations and perhaps an extra campaign which didn't use geo-modifiers but used just the state for targeting. This way, even if you hit on a broad keyword from another area of the state it still is likely that you service that location within driving distance.

You could easily highlight the multiple office locations on a landing page whereas a competing agency that just has a location down on the Cape would generally benefit very little, if at all, from doing any sort of broad-based geographic targeting (targeting the whole state with non-geo keywords as an example).

Advanced Location Options

You have five options here:

Advanced Settings Locations

You can read more about the options here and I'm going to leave them as defaults based on my geo-only targeting.

I may come back to these options if I find the search volume is not high enough to warrant continued PPC investment. In this market, non-geo modified keywords are brutally priced for a small, local agent so I think starting a bit more cautiously is a good idea.

Google makes a good point regarding the usage of these options:

These targeting methods might conflict with location terms in keywords. You should only use either the advanced targeting options or use keywords to accomplish your campaign goals, but not both methods.

So if you've got a campaign which exclusively uses geo-modified keywords and a specific locale, you could really mess things up if you add another layer of targeting on top of that (which would be unnecessary anyway).

Businesses that service a community wouldn't benefit as much, if at all, from using these options, but a small or local business which does business all over the country, promotes travel to the local area, or ships products to different locations can benefit from these kinds of options, as described in the example Google gives on Napa Valley Wine.

Say you are a chocolatier in Vermont and you want to run multiple campaigns for Vermont (exclusively with location matching and geo-modified keywords), but then you want to run other campaigns targeting Canada, and maybe one for the US as well, while excluding Vermont and/or geo-modified searches. You can do these sorts of things with the variety of location options AdWords provides.

Network and Devices

By default Google opts you in to the search network and display network. I would avoid the display network at this point and just focus on the search network for these particular keywords:

Networks Devices Ppc

Bidding and Budget

For a basic starter campaign like this, and for many campaigns quite frankly, you can skip some of the advanced bidding options and just use manual bidding and "show ads evenly over time".

Bidding And Budget

Generally, you should be comfortable losing somewhere from the high hundreds to a couple thousand dollars of AdWords spend to testing (depending on your cost per click). Even though it's a much more targeted way of advertising than, say, a local newspaper ad, you still have to go in expecting to lose a little upfront in order to find the sweet spot in your approach within your market.

I would say, if you could, budget $100.00 per day for a month and take a peek at it each day just to make sure things are running smoothly. You'll have to pay a bit more upfront on a brand new account while your account gains trust in the eyes of Google. The beauty of a daily budget is that you can change it at any time. After a month or so you should have a pretty good idea of what is going on unless your business is in its offseason.

Location Extensions

I would use both the address extension (from Google Places; if you're a local small business you absolutely should be in Google Places) and the phone extension. This helps your ad stand out against the other insurance ads from large companies with no local presence and insurance affiliates with no location at all.

Location Extensions

Advanced Settings

Starting off, I would recommend leaving these at their defaults and coming back to adjust them later if you start to see things like conversions being heavily weighted to weekends versus weekdays, or days versus nights. With a local campaign, though, chances are you won't have enough data for a while to make those kinds of data-driven decisions.

Some of the options are N/A anyway because we are not using the content network.

Advanced Settings Ppc Campaign

Step 4: Ad Targeting Options

The examples given above were for keywords being geo-modified because the advertiser is mainly a local business servicing a very specific area of the state and the non-geo modified keywords for this market are brutally priced.

For most local businesses, especially to start, I recommend using a geo-modified keyword campaign and then moving into setting up another campaign which is more of a keyword driven campaign rather than a geographically driven campaign (from a keyword standpoint). Why? Mainly for the examples mentioned above and because starting out with a huge AdWords campaign can be overwhelming to a new user, which can lead to poor management and a poor account quality score right off the bat.

Once you get comfortable with AdWords and you are seeing some early success (or failure, like obvious lack of volume for instance) I would begin to consider moving into setting up that second campaign. By failure I specifically mean a lack of volume early on. If you are consistently showing in the top 1-5 spots in AdWords but are getting very, very little traffic then that means you need to broaden your campaign. If you are getting traffic but aren't converting then you need to tweak and test elements on your landing page and maybe consider other keywords you haven't bid on yet.

I would suggest launching the geo-targeted campaign first and do the initial steps for a broader, non-geo campaign in the background (keyword research, building landing pages, thinking about ad copy and ad group structure, and so on). Obviously this is quite a bit different than a company that happens to be located in Anytown, USA but mainly sells nationally with little or no local presence.

Step 5: Setting Up the Ad Groups

Eventually you may find yourself adding campaigns for non-geo-modified keywords while utilizing the targeting options mentioned above, or maybe you'll want to target just the Display Network. For now, especially for small business owners working with a campaign for the first time, simplicity is preferred in the face of the many options provided within the AdWords system.

The idea with ad groups is to align them as tightly as possible with specific keywords. Using the town or city as the main grouping variable, followed by the product, we can have very tight ad groups. This also allows us to create a landing page which is super targeted to the keyword being bid on. For example, let's take the idea of two towns and two keywords (auto insurance, home insurance).

I would set up the ad groups as follows:

  • (ad group) Town 1, (keywords) Town 1 auto insurance | Town 1 car insurance
  • (ad group) Town 1, (keywords) Town 1 home insurance | Town 1 homeowner insurance *as well as other closely related terms like home owner, home owners, condo, renters, and tenants
  • (ad group) Town 2, (keywords) Town 2 auto insurance | Town 2 car insurance
  • (ad group) Town 2, (keywords) Town 2 home insurance | Town 2 homeowner insurance *as well as other closely related terms like home owner, home owners, condo, renters, and tenants
  • (ad group) State/State Abbreviation, (keywords) State (Massachusetts) auto insurance | State Abbreviation (MA) auto insurance (and so on)
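The structure above can be sketched as a simple mapping of ad group names to keyword lists. This is only an illustration (the town and group names are placeholders, not real campaign data), but it shows the one-theme-per-group principle:

```python
# Hypothetical ad group structure for a local insurance campaign:
# one ad group per town per product line, plus a state-level group.
ad_groups = {
    "Town 1 - Auto": ["town 1 auto insurance", "town 1 car insurance"],
    "Town 1 - Home": ["town 1 home insurance", "town 1 homeowner insurance"],
    "Town 2 - Auto": ["town 2 auto insurance", "town 2 car insurance"],
    "Town 2 - Home": ["town 2 home insurance", "town 2 homeowner insurance"],
    "MA - Auto": ["massachusetts auto insurance", "ma auto insurance"],
}

# Each group stays tightly themed: every keyword in a group can share
# one ad and one dedicated landing page.
for group, kws in ad_groups.items():
    print(group, "->", len(kws), "keywords")
```

Keeping each group this tight is what makes the ad copy and landing page feel hand-written for the search query.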

I would repeat this process for as many towns, states, and product variations as needed. You can mix in copy and imagery to speak to similar words like auto and car, as well as to similar products like homeowners, condo, and renters insurance.

Since we targeted a specific area on the map, we can target just state level searches without worrying about someone searching from an area we cannot service.

Step 6: AdWords Copy

When you first set up the campaign you get the page where you can name the first ad group and set up the sales copy, with helpful ad previews on the right:

AdWords Preview Copy

Here, you will enter:

  • Headline
  • Description line 1
  • Description line 2
  • Display URL
  • Destination URL

These are fairly self-explanatory and ideally you want your headline to contain your targeted (or most of your targeted) keyword as it will be bolded when the ad shows to the user.

I usually use the first description line to describe the features or benefits of what I (or the client) am selling, and the second description line as a strong call to action.

Your display URL is what the user sees and can be customized to speak more to your offer, while the destination URL is the URL Google actually sends the user to (this would be your landing page URL).

So for example, if you were selling insurance in Boston and your domain was massachusettsinsurance.com, you could add /Boston to your display URL so it appears more relevant to the user (they do not see the destination URL).
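A quick sanity check against the classic text-ad limits (25-character headline and 35-character description lines and display URL, the limits as of this writing) can be sketched in a few lines. The ad copy below is hypothetical, riffing on the Boston example:

```python
# Classic AdWords text-ad character limits (as of this writing):
# 25-char headline, two 35-char description lines, 35-char display URL.
LIMITS = {"headline": 25, "desc1": 35, "desc2": 35, "display_url": 35}

def over_limit(ad):
    """Return the fields whose copy exceeds its character limit."""
    return [field for field, cap in LIMITS.items()
            if len(ad.get(field, "")) > cap]

# Hypothetical ad copy for the Boston example above.
ad = {
    "headline": "Boston Car Insurance",
    "desc1": "Local Agents. Low MA Rates.",
    "desc2": "Get Your Free Quote Today!",
    "display_url": "massachusettsinsurance.com/Boston",
}
print(over_limit(ad))  # [] -- everything fits
```

Running copy through a check like this before upload saves a round of AdWords validation errors.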

Step 7: Install Analytics

AdWords offers PPC-specific reports, but installing an analytics package (if you don't already have one) is a must if you want to really track your campaigns at a deep level. You'll want to be able to track conversions, which typically requires a multi-step process (even if you are just collecting emails), and having an analytics package can really take your data analysis to the next level.

You can use the AdWords Conversion Tracker for some conversion metrics, but a full-featured analytics package gives you more options and data points to utilize. There are a number of popular and affordable analytics packages available.

AdWords offers built-in integration with Google Analytics, so for simplicity you might want to give that a shot upfront. Even though it's free, Google Analytics is a fully featured analytics provider suitable for large and small sites.

Step 8: Test, Tweak, and Adjust

Local campaigns can sometimes take a bit of time to return an appropriate amount of data needed to analyze and adjust to trends in your account. So, be careful not to make wholesale changes to a campaign off of a small amount of data. Try to be consistent for a bit and see what the data tells you over time.

Some tips on account maintenance would be:

  • If you get low relevancy messages or "ads not showing" messages on a keyword, isolate it in its own ad group with a super-targeted ad and landing page
  • Never let your credit card expire :) You can also prepay AdWords if you'd prefer
  • Try different ad text copies from time to time, pointing out different benefits and using different calls to action
  • Try different elements on your landing page (maybe a click-thru button instead of an opt-in form, maybe a brief video, etc)
  • When adding new keywords keep the same tight ad group structure that you started out with
  • Use the Search Terms report to find exact keywords that triggered an ad click on your broad or phrase matched keywords
  • Use the keywords found in your Search Terms report as additions to your current PPC program and SEO planning, rinse and repeat

The more targeted your keywords (and landing pages) are, the fewer clicks you should need to get an appropriate level of feedback from your data. Over time, pay attention to your conversion rates as they stabilize and read the feedback that way. In other words, if you typically convert at 30% and all of a sudden you go 0-for-100 on your next 100 clicks, something might be up. Whereas if you convert at 5%, you'll need more clicks before you can determine anything from that data.

I'd peg the click amount to be somewhere in the hundreds when first setting up the account (assuming the keyword/landing page is super targeted). On a really targeted local campaign, I'd like to see a couple hundred clicks or so before I made any decisions on that particular keyword. Since local keywords are usually more trial and error upfront, pay close attention to the aforementioned Search Term report to find those really targeted keywords.
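The intuition above can be made concrete with a quick binomial calculation (the baseline rates here are illustrative, not real campaign numbers):

```python
# Probability of seeing ZERO conversions in n clicks at a baseline
# conversion rate p. If this probability is tiny, a zero-conversion
# streak is a real signal that something changed, not just noise.
def prob_zero_conversions(p, n):
    return (1 - p) ** n

# At a 30% baseline, going 0-for-100 is essentially impossible by
# chance, so it's a strong signal that something broke.
print(prob_zero_conversions(0.30, 100))

# At a 5% baseline, going 0-for-20 happens by pure luck over a third
# of the time, so you need far more clicks before drawing conclusions.
print(prob_zero_conversions(0.05, 20))
```

This is why higher-converting, tightly targeted campaigns need fewer clicks before the data means anything.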

A local campaign is generally smaller in nature so it's a bit easier to have really tight ad group structuring (one or a few keywords per ad group). This makes it a bit easier on the local business owner because upfront set up is quicker and maintenance and tracking are both a bit more streamlined.

A local business that combines SEO and PPC can really clean up in the SERPs. To help get you started, you can pick up a $75 AdWords credit below, and if you are an agency you should check out the Engage program which gives you a generous amount of coupons for your clients.

Free Google AdWords Coupons

Google is advertising a free $75 coupon for new AdWords advertisers, and offers SEM firms up to $2,000 in free AdWords credits via their Engage program.

Free Competitive Research on Domains

May 31st

We love free stuff, especially when it comes to SEO tools and SEO data. Recently, we published a post on how to do a good bit of competitive research with free tools and now we are going to do that for competitive research on domains.

There are a number of tools we can use here. We are going to focus on using these tools to help evaluate a domain from a competitive research point of view:

  • SeoBook Toolbar
  • SemRush
  • Compete
  • AdWords Keyword Tool
  • Open Site Explorer
  • Alexa
  • Quantcast
  • Google Ad Planner

It is worth noting that we reviewed the paid elements of most of the prominent spy tools about a year ago.

Getting Started with a Domain

Researching a competitive domain can have many benefits. Beyond evaluating the strength of a domain with respect to age, links, and engagement statistics you can find things like:

  • High traffic keywords
  • Profitable keywords
  • Low hanging keyword fruit (keywords they are ranking for mostly off domain/brand authority)
  • Site structure
  • Competing domains and overlapping keywords
  • Keywords being purchased for PPC

So you can do a few different things with domains. You might want to evaluate the strength of the domain as a whole if you are beyond the keyword research phase or perhaps you want to do that in addition to checking out potential keywords you can add to your campaign.

There are a few different tools you can use for this and I like to start with the SeoBook Toolbar because it's quick, easy, and incorporates the tools I want to use in one spot.

Using the SeoBook Toolbar

The toolbar links through to a ton of external tools and most of the tools listed above. It also provides a way to quickly review a bunch of the most relevant data with a simple click. Turn the toolbar on, visit the domain you want to research, and click the blue "I" icon shown below, next to the SeoBook icon:

Once you click on the blue info ball you get all this nice data immediately:

So in what really amounts to a quick, 3 step process you are able to instantly see helpful information about:

  • High-level site data about age, PageRank, indexed pages, and recent cache date
  • Link data from Yahoo! Site Explorer, Open Site Explorer, and Majestic SEO
  • Rough traffic estimates from sources like Compete.com, Alexa, and SemRush
  • Social stats
  • Important directory links

It will be somewhat clear just by looking at the chart how strong the domain is. In this case, the domain is one of the stronger ones on the web.

You can link through to each tool/statistic from this chart and also from the icons on the toolbar itself.

As you continue down the toolbar you can see link-thru icons for Open Site Explorer, Majestic SEO, and Blekko. The "Dir" dropdown will show you the site's presence in the more important directories on the web.

Then you can also link thru to the Archive, Compete.com, SemRush, the free SeoBook Rank Checker (to quickly check a keyword's rankings on a particular domain you might be researching), and the X-Ray Tool.

The "Competition" drop down will show you the following:

So here you can link through to a variety of sites to check out all sorts of data points about a domain including, but not limited to, domain registration, demographic data, and keyword data.

If that weren't enough, the toolbar also offers more tools:

The first link gives you the following drop down, which links through to a bunch of keyword tools based on the keyword you enter in the form field to the left of the book:

The highlighter highlights the typed in keyword on the current page and then you've got a link to SeoBook archives, recommended RSS feeds, no-follow highlighting, and a button which allows you to compare up to 5 domains at once.

Typically, I use the SeoBook toolbar as my research assistant of sorts when researching different aspects of a domain. It links through to the relevant tools I need to properly evaluate and research a particular domain.

SemRush

An appropriate disclaimer: data can be limited on these free accounts, but it can help establish a rough baseline to start from. From the SeoBook toolbar you can easily link through to a SemRush report which gives you limited data on:

  • Organic keywords a site is ranking for
  • Keywords a site is buying in AdWords
  • Domain competition in organic SERPS
  • Domain competition in AdWords
  • Actual AdWords ad copy
  • Potential traffic/ad buyers/sellers based on the AdWords and Organic competitive data

A comprehensive review on SemRush can be found here.

Focusing on the organic keywords, you can get the top ten keywords driving traffic to a site (disclaimer: Spy tools should be taken as rough data points rather than data that is 100% accurate. In order to achieve 100% accuracy you'd need access to a site's analytics :D )

This can be helpful if you are trying to research whether traffic is heavily branded traffic or if it's more keyword centric traffic as well as the overall rankings of a site across a wide spectrum of keywords.

In the above example you can see that many of the top keywords are brands, but the site also ranks highly for really competitive, core keywords. This coincides with our initial findings, via the SeoBook Toolbar, that this is a very strong site.

If you want to dig deeper you can subscribe to one of SemRush's paid accounts. We also offer up to 1,000 results per query (organic data) with our Competitive Research Tool (which pulls data from SemRush) in both our membership options. We also have our own custom data calculations inside the Competitive Research Tool which are pretty sweet :)

Compete

Compete is a more expensive competitive research tool but they do give you a fair amount of data for free on a domain.

So here is an example of the free data they give on a "Site Profile" report:

Some of the key data points missing on a free account (besides full access to the teaser data) are demographics and some deeper engagement metrics.

We can get some semblance of demographic data from Google Ad Planner and Quantcast for free.

This report can give you some, albeit limited, keyword data outside of a Google tool, in addition to traffic history (searching for victims of Panda, as an example) and some high-level signals about how many sites the domain is getting traffic from.

I would use a free Compete site profile to get a really high-level overview of traffic size, top keywords outside of a Google tool, and traffic/visitor trends and history.

This report certainly lines up with the site being an extremely competitive one, a large brand with lots of traffic sources, and a site unaffected by the latest Google update.

AdWords Keyword Tool

So once you move away from looking at some keyword and traffic sampling numbers, as well as the solid high level overview provided by the SeoBook Toolbar, you might want to consider site structure and keyword structure.

A neat feature in the AdWords Keyword Tool is you can enter a domain and Google will list the keywords and the page assigned to that particular keyword (in their eyes):

*Other columns were removed to show this feature specifically:

This can be helpful in terms of breaking down the site structure of a competing site, finding profitable keywords they are ranking for but not necessarily targeting, and helping you plan your site structure.

Open Site Explorer

Since this post is on free tools, I would go with Open Site Explorer here (you could also use Yahoo! Site Explorer and Blekko for more data points but OSE offers a really quick, easy to use interface and has tons of link data).

Using this tool you can find things like the anchor text distribution of a site (see if they are targeting keywords that you might be considering, or if lots of their anchor text is brand related).

Inside of OSE you can find other key data points like:

  • Top linked to pages on the site
  • List of linking domains
  • External linking pages
  • % of no-follow to followed links
  • % of internal versus external links
  • 301 redirected domains/links

I do like using Yahoo and Blekko as well but I find that when looking at the free data options, OSE provides the deepest data out of the three and it's very easy/quick to use. On the paid side it competes with Majestic SEO which is a solid paid option as well.

Alexa

I think Alexa can be somewhat useful when doing quick and free competitive research, but it's also a tool that gets a bad rap due to internet hype marketers promoting it as the BEST THING EVER.

We did an in-depth review of Alexa here and a review of their paid tool here. Alexa gives out a few different data points:

  • Traffic Stats
  • Search Analytics
  • Audience Profile
  • Clickstream

Within those sections Alexa offers a lot of data points (based mainly on their toolbar data). Here we have data similar to Compete's:

You can also see things like global traffic ranks (where the site ranks in Alexa's Top Sites in each country):

Helpful information on where folks are navigating on the site (if you are in the same market are there site features you could be missing out on?)

Similar to SemRush stats but based on a smaller sample:

Trailing data on traffic being received from search engines:

Keywords that they are growing and keywords where they are slipping:

Potentially profitable keywords they are ranking for (factoring in advertising competition):

They also offer some demographic data compared to a relative baseline figure for each demo section:

Find out what sites people are coming to the site from (possible ad partners or related domains you can target in the same way you are targeting the current one from a competitive research perspective):

Where people are going when they leave:

Again, Alexa's data (like most spy tools) should be taken as rough figures rather than exact data. It's helpful to compare data from multiple sources as you can start to see patterns emerge or you can prove or disprove theories you may have about the site and your proposed method of attack.
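One way to make that comparison concrete is to drop each tool's estimate into a tiny script and look at the spread. A rough sketch (every number below is made up; plug in what the tools actually report for your target domain):

```python
# Hypothetical monthly unique-visitor estimates for one domain, pulled
# by hand from each tool's free report (all numbers are invented).
estimates = {
    "Compete": 45_000_000,
    "Alexa": 52_000_000,
    "Quantcast": 38_000_000,
    "Google Ad Planner": 41_000_000,
}

low, high = min(estimates.values()), max(estimates.values())
midpoint = sum(estimates.values()) / len(estimates)

# The spread between tools tells you how much to trust any single number.
spread_pct = (high - low) / midpoint * 100

for tool, visitors in sorted(estimates.items(), key=lambda kv: kv[1]):
    print(f"{tool:18} {visitors:>12,}")
print(f"Spread: {spread_pct:.0f}% of the midpoint estimate")
```

A spread of 30% or more between tools is common, which is exactly why these figures should be treated as directional rather than exact.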

Quantcast

Most sites I run across are not "quantified" so the data is a rough estimate (again).

So with Quantcast you can get more of that same traffic data along with some deeper demographic data:

This is on the overview page; there are separate sections for traffic and demographic data which break the information down a bit further:

You can also see data about what other sites are used/liked by visitors of the site you are doing research on:

This is on the demographics page and can give you an idea of what type of customer you'll be encountering which can help in determining how to present your offer and what to offer:

I like to use Quantcast mostly for demographic research on competing or similar websites (similar to the products or services I am offering) to help shape those offers and the presentation of my site.

Google Ad Planner

Ad Planner offers similar demographic data to Quantcast and similar traffic data to Alexa and Compete.

The big difference is the data is obtained from various Google products so it's probably somewhat safer to assume that the data might be a bit more relevant or accurate since Google has lots more data than any of the tools mentioned above (at least in terms of traffic data).

Ad Planner will show you "Google-ized" data for traffic patterns:

Unique visitor data in addition to Google Analytics data (for those who like to share)

You can also see top search queries:

As well as demographic data and audience interest data:

When to Go Paid

As you can see, free tools can give you lots of data but at some point you might have to scale up to use some paid tools. Paid tools certainly give you more data to work with but you can accomplish a lot of competitive research and background research on a domain with free tools.

Insurance For SEOs

May 2nd

Insurance is a popular, profitable area for some SEOs. Trying to find reputable insurance for an SEO business is not so popular, because many insurance agents do not have the experience to distinguish between what a web design shop does and what an SEO or PPC business does.

Prior to entering this business I was an insurance agent, and before that I was an underwriter, and I still have my agent license (hey, you never know!). There are policies out there which SEOs, as well as any web design or development shop, should consider purchasing.

There are a few different policies you might want to consider in this industry:

  • General Liability
  • Professional Liability (Errors and Omissions)
  • Workers Compensation (if you have employees)
  • Short-Term and Long-Term Disability

As a business owner, you may or may not have the following:

  • employees
  • office space
  • equipment
  • office space where you conduct business with clients and vendors

Workers Comp and STD/LTD

Workers Compensation and STD/LTD are fairly general insurance policies in that they are not really specific to the SEO business. Here in the US, Workers Comp is administered at the state level. Workers Comp is required in certain situations, depending on your state, so it is wise to check with your attorney and insurance agency regarding what is appropriate for you.

While you only have to worry about Workers Comp if you have employees (actual employees hired and placed on payroll, not contracted labor or freelance arrangements), you should consider Short Term and/or Long Term Disability insurance even if you're a solo SEO.

Short Term and Long Term Disability

If you are coming from corporate America you likely had these policies under a group plan (which is why they're so cheap). Essentially, it breaks down as follows:

  • Coverage responds if you are injured and unable to work (this doesn't cover sick time and generally excludes maternity coverage)
  • Short Term Disability will cover you for a certain period of time (usually under 90 days) at 100% of your pre-tax (sometimes you can choose post-tax) income.
  • Long Term Disability kicks in either after Short Term has expired or if you decided not to purchase Short Term at all.
  • Long Term will typically cover you at 60% or so of pre-tax (or post-tax if you are given the choice) for an extended period of time.

Wages are usually determined based on your prior year's tax return in conjunction with a current Profit/Loss statement (another reason why you should report all your income!).

Sometimes it makes more sense to get a quote on LTD (as it's cheaper) and just build up a reserve of your own to cover what Short Term Disability would have covered (rather than spending money on premiums and losing it if you never file a claim).
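Running the numbers makes the trade-off clearer. A back-of-the-envelope sketch (all figures below are hypothetical; get real quotes from an agent):

```python
# Question: pay STD premiums every year, or self-fund the first ~90
# days and only buy the cheaper LTD policy? All numbers are invented.
monthly_income = 5_000        # hypothetical pre-tax monthly income
std_premium_per_year = 1_200  # hypothetical annual STD premium
years = 10

# Total premiums paid over a decade if you never file a claim:
premiums_paid = std_premium_per_year * years

# Reserve needed to replace 100% of income for the ~90-day STD window:
reserve_needed = monthly_income * 3

# If you never file a claim, the premiums are gone for good --
# but the self-funded reserve is still yours.
print(f"STD premiums over {years} years: ${premiums_paid:,}")
print(f"Self-funded 90-day reserve:      ${reserve_needed:,}")
```

With these made-up numbers the reserve is only a few thousand more than a decade of premiums, and you keep it if you stay healthy, which is the case for skipping STD that the paragraph above describes.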

If you file a claim for either, there usually is a waiting period of a few weeks to a few months while a case manager is assigned to investigate the claim.

Be prepared to get put under the microscope much more than you would if you were part of a large group plan (if you work for a large company, as an example), since you are no longer part of a protected herd. This is where using an independent agent can help in fending off an overzealous claims adjuster who might see you as an easy case :)

Different policies have different exclusions so it's wise to discuss all your extracurricular activities with your agent as things like sky-diving are usually not covered causes of injury.

General Liability

This is one of the most common forms of business insurance. The meat of what this type of policy provides is:

  • Bodily Injury and Property Damage
  • Defense Costs while defending a suit
  • Personal Injury
  • Medical Expenses
  • Operations Liability

Most of this coverage is applicable when/if some of the following conditions occur:

  • Client is injured on your premises
  • Damages to property you are renting to other businesses
  • Advertising mishaps (slander, libel, copyright infringement, etc)
  • Injuries sustained by others on your defined premises due to the activities and operations of your business

Combining GL with Property Insurance (BOP Policies)

Many times, a GL policy is combined with Property Insurance to make what is called a BOP or business owners policy (BOP package).

Property Insurance is fairly standard and covers things like:

  • Inventory
  • Equipment (probably computers for most of us)
  • Records and Documents
  • Buildings
  • Other Real and Personal Property
  • Lost income due to a covered loss

A BOP is really geared towards businesses which have an office building, equipment in that office, meet clients on the premises, or rent out space to others.

Neither of these policies, nor the BOP package, covers Workers Comp. A BOP is a good solution for a shop which has a physical location, clients on site, and inventory/equipment on premises.

Professional Liability & Specialized Insurance

These types of policies are where a chunk of the specialized coverage for an SEO would come from. A BOP policy is meant to cover "products" with respect to liability, so as an SEO, web designer, or web developer you'll need more specialized coverage which covers things like:

  • Data Storage
  • Malicious Code (say your Wordpress site gets hacked and distributes malware or keyloggers)
  • Hosting
  • Loss of business income (say your host goes down and your client's e-commerce site goes offline, or it goes offline due to an error on your end)
  • Data security

Depending on your level of involvement with servers, software, and application development you may want to scale up and get a more specialized policy. Lots of larger insurers sell specialized, broad policies under the name of Technology Insurance or Information Technology Insurance.

Over the years these policies have developed to cover more and more specialized areas of tech insurance. In the beginning they were mostly geared towards straight IT companies but now cover all sorts of tech groups like:

  • Web Designers
  • Web Developers
  • Consultants
  • Outsourced Applications
  • Hosting Services

Be Up-front

Our industry is no different than any other: lots of snakes. If you are engaging in some kind of off-the-wall activity you really need to tell the agent. Tell them exactly what you do. If you are doing things that your state or other states consider illegal (fake reviews, for instance), then don't expect your policy to respond to such things.

It's no different than getting a homeowners policy while saying you don't have a Pit Bull, even though you have a Pit Bull (which are excluded by all standard insurers), then expecting coverage when your Pit Bull bites the neighbor.

While the agent may not understand the nuances of the business, you are typically good to go if you take the time to fill out the application completely and accurately.

Why Use a Local, Independent Agent?

Most folks you get in the call centers of GEICO or Progressive are salaried or hourly employees, they don't live in your community, and they really don't care to get you the best deal (they can only give you theirs).

Having a local agent gives you access to more markets, someone close by to help you fight any injustices the carrier may try to perpetrate on you, someone who is making a living selling policies in a local market (less likely to burn you as they care about their reputation), and someone who can help insure all your personal and business needs.

A call center rep, if they are on commission, generally wants to churn and burn through the calls and probably won't meet with you face to face to go over anything and answer all your questions (or get answers). Plus, most local agencies need help with web marketing so it could be an easy in for you!

What Do You Need?

Scenario 1 (Work from home, no employees)

Get a business endorsement on your personal homeowners policy and schedule your equipment if you need to (you might not need to if you have warranties in place already). This will also make your home office deduction look more official to the IRS.

Get a Professional Liability (E&O) policy and look into a specialized Technology policy depending on your business model. Consider STD/LTD insurance for lost wages if you can't work.

Scenario 2 (Home office, external office, no employees)

Same as above but add in a BOP (General Liability and Property Package) which is probably required if you are renting office space anyway.

Scenario 3 (Home office, external office, employees)

You should check with your attorney about how you are defining employees, and whether that means they are actual employees versus freelancers or contractors.

You would want to look at a BOP, Professional Liability and/or specialized Tech Insurance, and Workers Comp if required as well as employee benefit/insurance packages for things like STD/LTD.

Insurance is Boring

Yes, it's boring, but it's worthwhile for most of us from a cost/benefit standpoint (or legal liability standpoint). Solid, legal contracts reviewed by attorneys are also HIGHLY recommended.

Make time to sit with some local agents to really go over your business and your business activities. It's a really soft market right now for business insurance, especially small business, so agents are going to be more than happy to sit with you and go over what they have to offer.

Why Contracts are Important

There are many contracts available online, for free and for a fee. These contracts, even ones from places like LegalZoom, have not been reviewed for your specific business by attorneys in your state.

A contract is no good if it's not enforceable. You can probably expect a fee for a good attorney to review your contracts, and make any necessary changes, to be in the high hundreds of dollars or low four figures.

There is cost certainty in paying an attorney to give you the green light on a set of contracts (though, for really big deals you might still be wise to get a contract specific to that deal), but there is no cost certainty to the legal liability you could face if your contracts are essentially worthless in a court of law.

Another thing you'll want to watch out for is a client who tries to give you unlimited downside from a liability standpoint (in the contract) but severely limits the upside to your fees. You might be willing to take on the risk of downside so long as you are getting a decent %, or a few % points, of the upside on the deal. This is another case where an attorney reviewing the contracts can be well worth the cost.

Choosing the Right Business Entity

Your insurance policy is separate from your business entity. If you are a sole proprietor your personal assets are at risk even though your insurance policy covers defense costs. Choosing the right entity is another way to insulate yourself from liability.

In most states a single member LLC up to a full-blown corporation (and everything in between like a multi-member LLC, S-corp, and so on) will insulate your personal assets (home, savings, future personal earnings) from legal liability, whereas a sole proprietorship will leave your personal assets exposed.

The combination of a good insurance policy and the right business entity will cover defense costs (on a covered event) and protect your personal assets. Choosing the wrong set up can be financially disastrous.

The policy will cover defense costs if a claim is covered but if it is a frivolous lawsuit, or something personal, you might have to defend yourself. This is where having the right entity is key because your personal assets are not at risk even though you are probably going to incur your own legal costs.

The best way to protect yourself is to insure and to pick either an LLC or a corporation of some kind. Choosing neither, or one but not the other, can leave you and your business significantly exposed to liability.

Free Keyword Competition Research

Apr 29th

The term "competitive research" conjures up all sorts of imagery like expensive tools, shiny buttons, cute charts, and fancy (sometimes foolish) language about precise insight into a particular site or marketplace.

In reality we know that such claims are usually best taken with a large grain of salt. Most competitive research data is scraped from search engines and then has custom filters applied to it. Such filters can actually be a detriment to the data because, in desperate attempts at differentiation, tool-sets routinely pile on convoluted custom metrics and values until the final product becomes overhyped and underwhelming.

Some tools make up for their sampling errors by allowing you to upload your keywords & data directly into their database. The problem with this is that you are putting keywords which were "below the radar" into a database that your competitors may be using. Why just give away your data to the competition like that? Talk about working against yourself!

Let's remember that these custom metrics and estimates are typically extrapolated from scraped data, data purchased from ISPs, or data from custom toolbars, all of which are data samples. So it is kind of like: scraped data +/- data extrapolations + in-house data + custom metrics = final product.

It is reasonable to assume that the more custom or guesstimated layers you build on top of occasionally unreliable data (waves at Google's keyword tool and SERPs), the less targeted that data becomes. The moral of the story: "choose wisely, young Jedi."

Getting Useful Data for Free

Now that we've set expectations (don't expect tools to be a push-button, slot-machine win), you might feel like paying hundreds or thousands a month for these kinds of tools is a bit much. Sometimes it is, but multiple data points certainly have their advantages, and it's not that the data is junk by any means; it's just that the data shouldn't be relied upon as if it were scientific. The data can most certainly be helpful, but it comes down to ROI for you and your specific project(s).

There are many tools you can use to get lots of decent keyword competition data for free. We aren't going to cover free trials, just tools that give you what they have for free, or tools that give enough useful data inside the free version of their product.

If you are in the competitive research stage, you've probably already got a topic in mind. So we'll assume that you are doing competitive research on the keyword "camping equipment".

Accessing Free Tools

You could create a specific bookmark folder containing links to your free tools for easy access. The first things you will probably look at are:

  • the SERP for your keyword
  • age of ranking sites
  • links (total links, links to domain, links to page, edu/gov links)
  • domain age
  • domain name (brand, exact match, both, none?)
  • signals of trust (key directories, dmoz, twitter)

Those data points can easily be accessed with SEO For Firefox.

SEO For Firefox

SEO for Firefox is a free Firefox extension which will give you important SEO metrics quickly, from a variety of reputable data sources. Typically, you might want to aim for at least the top 3, given all the stuff that could be included in a SERP like:

  • a map
  • Google products
  • Google images
  • Google shopping results
  • YouTube videos
  • News results
  • Real-time results
  • and so on...

You can get a pretty good initial glimpse of the competition metrics within a few seconds. This is clearly a brand-heavy SERP and it is reflected in the SEO metrics. Here's a screen shot of what you'd see for a particular domain:


All things considered this is a pretty strong domain. It's a brand, has lots of links, .edu links, also ranks highly in Bing-powered Yahoo!, and has a PR 6 to boot.

Another cool thing about SEO For Firefox is that you can export the results into a .csv file for further research, processing and comparison. If you don't have a copy of Office for Mac or Windows already then, in keeping with the "free nature" of this post, you can use Open Office.
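If you'd rather skip the spreadsheet, a short script can rank the exported rows for you. A minimal sketch (the column headers "URL" and "Links" are assumptions; match them to whatever headers appear in your actual export, and in practice you'd read the real .csv file rather than an inline sample):

```python
import csv
import io

# Inline sample standing in for an exported .csv; in practice you'd do:
#   rows = list(csv.DictReader(open("export.csv", newline="")))
sample = io.StringIO(
    "URL,Links\n"
    "example.com,1200\n"
    "competitor.com,45000\n"
    "smallsite.com,300\n"
)
rows = list(csv.DictReader(sample))

def links(row):
    # Treat blank or non-numeric cells as zero so sorting doesn't blow up.
    value = row.get("Links", "").replace(",", "")
    return int(value) if value.isdigit() else 0

# Rank the SERP competitors by raw link count, strongest first.
ranked = sorted(rows, key=links, reverse=True)
for row in ranked:
    print(f'{links(row):>8,}  {row["URL"]}')
```

Sorting the export by a single metric like this is a quick way to see how steep the gap is between the top result and the rest of the page.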

SEO Toolbar

Maybe you already know the sites you want to research, or maybe you want a graphical, side-by-side comparison of up to 5 sites in your market. You can use our SEO Toolbar to accomplish this quickly and efficiently. If you click on the green arrows on the right side of the toolbar, you are presented with a GUI for processing up to 5 sites at once (screenshot below):


The comparison feature gives you access to key, relevant SEO metrics side by side for up to 5 sites.

So by now you should have a spreadsheet or three containing relevant data for the top sites on a particular keyword.

Link Tools

Now that you've gotten some of the higher-level metrics out of the way, you can dive into examining the link profile of a competing site.

You can use free tools (or free versions of paid tools) to look at the links from a competing site, tools like:

  • Yahoo's Site Explorer
  • Blekko's SEO Tools
  • Open Site Explorer
  • Majestic SEO

While it's a good idea to get data from a variety of sources, and run them through a tool like Advanced Link Manager to get a full(er) picture of things, you can get some juicy data for free.

When doing competitive research for a keyword I want to know what the anchor text profile looks like. When I am doing competitive research on a domain there are other relevant data points like top pages, most linked to pages, and total number of unique domains linking at the domain or page (whichever is ranking).

Blekko and Open Site Explorer are the ones I use for targeted and quick anchor text distribution views. Yahoo! generally ranks the best links first and allows for a CSV export, while Majestic's free account gives limited data on referring domains, top back-links, and top pages. So for the purposes of looking at anchor text, I prefer Blekko and Open Site Explorer.

Blekko

Blekko has a link to SEO data and Links data, as shown below:


The Links selection will bring up a Yahoo! Explorer-like list of links, the SEO link option brings up a bunch of SEO data like:

  • links to the domain
  • links to the page
  • anchor text information
  • links broken down by geography
  • external links
  • pie-chart, graphical representation of link data points
  • and other non-link related, but helpful, data (crawl data, site pages, etc)

The data is free; you get the data they offer without any registration requirements.

Open Site Explorer

Open Site Explorer is a quick and easy way to get the type of data we are looking for in this example (anchor text profile).

They currently have a 30 day trial and offer 3 plans (CSV export is available for all of them):

  • Free, No Registration - limited to 3 reports per day, shows up to 200 links and top 5 link metrics for a given criteria
  • Free, Registration Required - no limit on reports, 1,000 links returned, top 20 link metrics for a given criteria (anchor text, top pages, etc)
  • PRO - part of a subscription to SeoMoz, up to 10k links, no limit on metrics

If you know the sites you want to look at, and the keyword(s), you can likely get away with just using it as a guest. However, the free but registered plan does give you a bunch more data. What I like in this example is that you basically type the domain name in, hit enter, then click on the anchor text distribution tab and the anchor text data is right there:


You'll see the actual anchor text, the number of domains linking with that anchor text, and the total links with that anchor text in them (good way to spot site-wides from one domain). In this example, our target keyword is not in the top 5 (or 20) with respect to anchor text occurrences. This domain is a large brand though, so you'd likely want to make sure you could build an authoritative and useful site about the topic in order to overcome Google's love affair with brands.
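Once you have an anchor text export in hand, it only takes a few lines to quantify how brand-heavy the profile is. A rough sketch (all anchors, counts, and the brand name below are invented examples of what you might copy out of an export):

```python
# Hypothetical anchor text counts for a competing camping-gear site.
anchors = {
    "acme outdoors": 4200,      # brand anchors (invented)
    "acme": 1100,
    "camping equipment": 90,    # our target keyword
    "click here": 350,
    "homepage": 600,
}
brand_terms = ("acme",)  # substrings that identify brand anchors

total = sum(anchors.values())
brand = sum(
    count for anchor, count in anchors.items()
    if any(term in anchor for term in brand_terms)
)

print(f"Brand anchor share:   {brand / total:.0%}")
print(f"Target keyword share: {anchors['camping equipment'] / total:.0%}")
```

A profile where brand anchors dwarf the target keyword, like this made-up one, matches the "big brand ranking on authority" pattern described above, and hints the keyword may be winnable with focused anchor text.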

Checking the On-Page Optimization

Though I believe the link data and domain data are the most important, the on-page criteria follow closely behind.

This is pretty self-explanatory and you don't really need a full blown tool for this. Basically you'll want to look at things like the title tag, meta description tag, and the on-page copy itself.

You can do that pretty easily with just your eyeballs, but the SEO Toolbar also has a feature where you type the keyword into the box in the upper-right, and click the highlighter:

In this case I used 2 words, and they are highlighted in different colors:

This can give you an idea of how the site is using the copy to say, scream, or shout what the page is about. Sometimes you'll find that sites might just be ranking for a keyword or phrase based on the authority of their domain. If they are ignoring the on-page and off-page (links) for a keyword, it could signal to you that this might be a keyword worth pursuing and a keyword you can reasonably expect to rank for.

Making it a Process

Competitive research is just one piece of the puzzle, as you know. I find that breaking the entire process down into manageable chunks can help each process be more productive and efficient. This would be my process when researching the competitiveness of a keyword. While there are other pieces to your SEO research you should note that you do not need to spend hundreds or thousands of dollars on fancy competitive research tools off the start.

Save the money you might spend on tools on link development, content development, and content promotion.
