Selling SEO Services: A Consultative Approach

Does the thought of selling fill you with dread?

If you see yourself as a technologist or a marketer, then selling may not come easily to you. But we all need to sell something, even if it is just our opinion! If you're a consultant of any description, it comes with the territory.

So it pays to know a few techniques. Luckily, sales isn't something you have to be born to do - it does not require supernatural charm, charisma, a hide as thick as an elephant's, or superhuman drive.

Selling can be like a doctor's consultation.

A Visit To The Doctor

When you go to the doctor, do you expect the doctor to just guess what is wrong with you?

A doctor's consultation involves the doctor asking you a series of questions. This questioning helps determine what the problem is and how it can best be solved. At the end of the process, the feeling is probably one of relief and assurance: the doctor has your best interests at heart, and will cure what ails you.

It's the same in business.

Any client you encounter has a problem. Like a specialist doctor, it is your job to ask a series of questions to help nail down the problem and find a solution. The very act of questioning - known as consultative selling - helps build trust and rapport with the client in the same way you may experience with a doctor. This works especially well in the field of consulting, which is based on information sharing.

The emphasis is on the client's needs, as opposed to getting a signature on the dotted line. You first establish a client's needs, then you provide a solution, if you have one. You're building a relationship, based on trust, by asking a series of questions.

Not so hard, really.

The Mechanics Of Consultative Selling

Ok, so how do you do it?

First, you need to understand the buyer's buying process. You then match your selling process to their buying process.

All buyers go through a specific process. For example, if a company needs internet marketing services, do they go to their established provider - possibly the web design company who built their site - or do they go direct to the SEO market? Do they attend conferences? If so, which ones? Hint: they may not be SEO conferences. Do they ask other business people in their business network? Do they go with a known brand?

It's pretty simple to determine the buying process if the buyer comes straight to your website, fills out the contact form, and requests a call-back. But life often doesn't work that way.

A prospective client may ask their web design company. Their web design company might not have had a clue, had you not been in to see them a week earlier. You asked the web design people a few questions about whether they had an SEO capability in house, found out they didn't, and found out they had a lot of clients who quite possibly needed SEO. You proposed a joint deal whereby they would refer their clients to you for a 10% commission.

Try to find out how your prospective clients buy SEO services, and position yourself accordingly. Think business associations and clubs, their existing providers in related areas, and the other companies they have an association with.

You need to get yourself positioned correctly in their buying process.

If you've managed to get in front of them, you then need to think about the questions you are going to ask. You should be asking about their business, where they see it going, what problems they are having, their place in the market, and their competitors. Business owners typically like doing this, and will welcome your interest, so long as you're seen as a "doctor", i.e. someone they trust to help. You'll also need to make a presentation, which, depending on the context, need not be formal. It could consist of showing them case studies of how you've helped solve this problem before. Let's face it, most SEO/SEM problems and solutions are going to look pretty much the same.

It's all about trust relationships. It's a fact of life that people buy more readily from people they trust.

But how do you know if you can trust your prospective buyer?

Screening Buyers

Consultative selling is also a great way to screen out tire kickers. A person who is just pumping you for information will reveal very little about themselves. The conversation will be one-sided.

If they are genuinely interested in your service, they are more likely to answer questions. They do have to trust you first in order to do this, so try to think like a doctor if you encounter resistance, e.g. "I want to help you get more traffic, but I can't devise an appropriate solution without knowing more about your business".

Be prepared to walk if they don't volunteer the information you need. Even if you did land the job, you may end up providing a substandard solution to their problem, which will likely end in tears. Better to find clients who you can work with, rather than against.

Another method of screening is to pre-close the sale. When you are gathering needs, ask that if you can solve their problems to their complete satisfaction as a result of this discussion, they will buy your services.

This will sound to them like a fairly safe bet, i.e. you have to propose something that solves their problem. However, it also creates an implied obligation on their part to buy. There is no risk on your side: either you can solve the problem, in which case you'll likely get the business, or you can't, in which case you'll walk anyway.

If they are hesitant, it is either an opportunity to walk, and thus stop wasting your time, or an opportunity to find out something more about their buying process.

In short, when thinking about sales:

  • You are not a salesperson. You are a "doctor"
  • Focus on the needs of the client, not landing the job. Sales hucksters typically focus on the close too soon, which can destroy trust
  • It's ok to walk away. You won't be able to help some clients
  • Insist that the client engage in conversation. A client who asks you questions, and volunteers little information, might be pumping you for information

These consultative sales techniques are covered in various sales theory books. Check out "Consultative Selling" by Mack Hanan, Jay Abraham's "The Sticking Point Solution", and "Stop Telling, Start Selling: How to Use Customer-Focused Dialogue to Close Sales" by Linda Richardson.

How Many Companies Has Google Bought?

One of the best ways to track Google's strategies is through visualizing & analyzing their acquisitions. Which is what the following image helps you do. Click on it for the full enlarged version :)


via Scores

Alexa Site Audit Review

Alexa Logo

Alexa, a free and well-known website information tool, recently released a paid service.

For $199 per site Alexa will audit your site (up to 10,000 pages) and return a variety of different on-page reports relating to your SEO efforts.

It has a few off-page data points but it focuses mostly on your on-page optimization.

Alexa Site Audit Review Homepage

You can access Alexa's Site Audit Report here:

http://www.alexa.com/siteaudit

Report Sections

Alexa's Site Audit Report breaks the information down into 6 different sections (some of which have additional sub-sections):

  • Overview
  • Crawl Coverage
  • Reputation
  • Page Optimization
  • Keywords
  • Stats

The sections break down as follows:

Site Audit sections and subsections

So we ran Seobook.com through the tool to test it out :)

Generally these reports take about a day or two; ours hit some type of processing error, so it took about a week.

Overview

The first section you'll see is the number of pages crawled, followed by 3 "critical" aspects of the site (Crawl Coverage, Reputation, and Page Optimization). All three have their own report sections as well. Looks like we got an 88. Excuse me, but shouldn't that be a B+? :)

So it looks like we did just fine on Crawl Coverage and Reputation, but have some work to do with Page Optimization.

Alexa Site Audit Overview

The next section on the overview page is 5 recommendations on how to improve your site, with links to those specific report sections as well. At the bottom you can scroll to the next page or use the side navigation. We'll investigate these report sections individually but I think the overview page is helpful in getting a high-level overview of what's going on with the site.

Alexa Site Audit Overview

Crawl Coverage

This measures the "crawl-ability" of the site: internal links, your robots.txt file, and any redirects or server errors.

Reachability

The Reachability report shows you a breakdown of which HTML pages were easy to reach versus which ones were not so easy to reach. Essentially, for our site, the breakdown is:

  • Easy to find - 4 or fewer links a crawler must follow to get to a page
  • Hard to find - more than 4 links a crawler must follow to get to a page

The calculation is based on the following method, which Alexa uses to determine the optimal path length specific to your site:

Our calculation of the optimal path length is based on the total number of pages on your site and a consideration of the number of clicks required to reach each page. Because optimally available sites tend to have a fan-out factor of at least ten unique links per page, our calculation is based on that model. When your site falls short of that minimum fan-out factor, crawlers will be less likely to index all of the pages on your site.
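
Taken literally, that description is just a logarithm: with a fan-out of ten unique links per page, roughly 10^d pages are reachable within d clicks, so the optimal depth grows with the log of the page count. Here's a rough sketch of that model in Python (my reading of Alexa's description, not their actual code):

    import math

    def optimal_path_length(total_pages, fan_out=10):
        # With a fan-out of f, roughly f**d pages are reachable within
        # d clicks, so the required depth is about log base f of the count.
        return max(1, math.ceil(math.log(total_pages, fan_out)))

    # A 10,000-page site: 10**4 = 10,000, so ~4 clicks should reach any page.
    print(optimal_path_length(10000))  # 4

That lines up with the "easy to find" threshold of 4 links on an audit capped at 10,000 crawled pages.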

Alexa Site Audit Reachability Report

A neat feature in this report is the ability to download your URLs, plus the number of links the crawler had to follow to find each page, in CSV format:

Alexa Site Audit Reachability Report Download Links

This is a useful feature for mid-to-large-scale sites. You can get a decent handle on internal linking issues that could be affecting how relevant a search engine considers a particular page to be. This report can also spot weaknesses in your site's linking architecture from a usability standpoint.
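
If you want to act on the export, a few lines of Python are enough to pull out the "hard to find" pages. The column names here are hypothetical; adjust them to whatever headers the downloaded CSV actually uses:

    import csv

    with open("reachability.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Alexa's "hard to find" bucket: more than 4 links deep.
            if int(row["path_length"]) > 4:
                print(row["url"])

Pages that surface here are good candidates for extra internal links from high-traffic pages.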

On-Site Links

While getting external links from unique domains is typically a stronger component of ranking a site, it is important to have a strong internal linking plan as well. Internal links are important in a few ways:

  • They are the only links where you can 100% control the anchor text (outside of your own sites of course, or sites owned by your friends)
  • They can help you flow link equity to pages on your site that need an extra bit of juice to rank
  • Users will appreciate a logical, clear internal navigation structure and you can use internal linking to get them to where you want them to go

Alexa will show you your most internally linked-to pages:

Onsite Links Alexa Site Audit

You can also click the link to the right to expand and see the top ten pages that link to that page:

Expanded Onsite Links Report

So if you are having problems trying to rank some sub-pages for core or long-tail keywords, you can check the internal link counts (and see the top 10 linking pages) to see if something is amiss with your internal linking structure for a particular page.

Robots.txt

Here you'll see if you've restricted access to these search engine crawlers:

  • ia_archiver (Alexa)
  • googlebot (Google)
  • teoma (Ask)
  • msnbot (Bing)
  • slurp (Yahoo)
  • baiduspider (Baidu)

Site Audit Robots.Txt

If you block out registration areas or other areas that are normally restricted, the report will say that you are not blocking major crawlers, but will show you the URLs you are blocking under that part of the report.

There is not much that is groundbreaking about robots.txt checks, but it's another part of a site that you should check when doing an SEO review, so it is a helpful piece of information.
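
You can run the same check yourself with Python's standard-library robots.txt parser. A minimal sketch, with example.com standing in for your own site:

    from urllib.robotparser import RobotFileParser

    CRAWLERS = ["ia_archiver", "googlebot", "teoma",
                "msnbot", "slurp", "baiduspider"]

    rp = RobotFileParser("http://www.example.com/robots.txt")
    rp.read()  # fetch and parse the live robots.txt

    for agent in CRAWLERS:
        ok = rp.can_fetch(agent, "http://www.example.com/")
        print(agent, "allowed" if ok else "blocked")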

Redirects

We all know what happens when redirects go bad on a mid-large sized site :)

Redirects Gone Bad

This report will show you what percentage of your crawled pages are being redirected to other pages with temporary redirects.

The thing with temporary redirects, like 302s, is that unlike 301s they do not pass any link juice, so you should pay attention to this part of the report and see if any key pages are being redirected improperly.

Redirect Report Alexa Site Audit
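
Spot-checking a handful of key URLs by hand is straightforward with the requests library. A minimal sketch (example.com is a stand-in for your own pages):

    import requests  # third-party: pip install requests

    resp = requests.get("http://www.example.com/old-page", allow_redirects=True)

    # resp.history holds each redirect hop in order; a 302 on a page
    # that has moved permanently should normally be swapped for a 301.
    for hop in resp.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print("final:", resp.status_code, resp.url)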

Server Errors

This section of the report will show you any pages which have server errors.

Alexa Site Audit Server Errors

Making sure your server is handling errors correctly (such as a 404) is certainly worthy of your attention.

Reputation

The only part of this module is external links from authoritative sites, plus where your site ranks against "similar sites" with respect to the number of sites linking to each.

Links from Top Sites

The analysis is based on the formula shown below:

Alexa Reputation

Then you are shown a chart covering your site and related sites (according to Alexa), plus the total links pointing at each site, which places the sites in a specific percentile based on links and Alexa Rank.

Since Alexa's user base is heavily biased towards webmaster-type sites, these Alexa Ranks are probably higher than they should be, but it's all relative since every site is being judged on the same measure.

Alexa Site Audit Link Chart

The Related Sites area is located below the chart:

Related Sites Link Module Alexa Audit

Followed by the Top Ranked sites linking to your site:

Alexa Site Audit Top Ranked Sites

I do not find this incredibly useful as a standalone measure of reputation. As mentioned, Alexa Rank can be off and I'd rather know where competing sites (and my site or sites) are ranking in terms of co-occurring keywords, unique domains linking, strength of the overall link profile, and so on as a measure of true relevance.

It is, however, another data point you can use in conjunction with other tools and methods to get a broader idea of how your site and related sites compare.

Page Optimization

Checking the on-page aspects of a mid-to-large-sized site can be pretty time-consuming. Our Website Health Check Tool covers some of the major components (like duplicate/missing title tags, duplicate/missing meta descriptions, canonical issues, error handling responses, and multiple index page issues), but this module does some other things too.

Link Text

The Link Text report shows a breakdown of your internal anchor text:

Link Text Report Alexa

Click on the pages link to see the top pages using that anchor text to link to a page (it shows the page the text is on as well as the page it links to):

Link Expansion Site Audit Report

The report is based on the pages Alexa crawled, so if you have a very large site or lots and lots of blog posts you might find this report a bit lacking in breadth of coverage of your internal anchor text counts.

Broken Links

Checks broken links (internal and external) and groups them by page, which is an expandable option similar to the other reports:

Alexa Broken Links Report

Xenu is more comprehensive as a standalone tool for this kind of report (and for some of their other link reports as well).

Duplicate Content

The Duplicate Content report groups all the pages that have the same content together and gives you some recommendations on things you can do to address duplicate content, such as:

  • Working with robots.txt
  • How to use canonical tags
  • Using HTTP headers to thwart duplicate content issues

Alexa Duplicate Content Overview

Here is how they group items together:

Alexa Duplicate Content Grouped Links

Anything that can give you some decent insight into potential duplicate content issues (especially if you use a CMS) is a useful tool.
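
If you want a quick first pass yourself, hashing the normalized text of each page and grouping identical digests gets you most of the way there. A minimal sketch, assuming you've already fetched the page bodies:

    import hashlib
    from collections import defaultdict

    def group_duplicates(pages):
        # pages: dict mapping URL -> page body text
        groups = defaultdict(list)
        for url, body in pages.items():
            # Normalize whitespace so trivial formatting differences don't
            # mask otherwise identical content, then hash the result.
            digest = hashlib.sha1(" ".join(body.split()).encode("utf-8")).hexdigest()
            groups[digest].append(url)
        return [urls for urls in groups.values() if len(urls) > 1]

    pages = {
        "/a": "same content here",
        "/b": "same   content  here",  # collapses to match /a
        "/c": "different content",
    }
    print(group_duplicates(pages))  # [['/a', '/b']]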

Duplicate Meta Descriptions

No duplicate meta descriptions here!

Alexa Site Audit Duplicate Meta Descriptions

Fairly self-explanatory. While a meta description isn't incredibly powerful as a standalone metric, it does pay to make sure you have unique ones for your pages, as every little bit helps!

Duplicate Title Tags

You'll want to make sure you are using your title tags properly and not attacking the same keyword or keywords in multiple title tags on separate pages. Much like the other reports here, Alexa will group the duplicates together:

Alexa Site Audit Duplicate Title Tags

Low Word Count

Having a good amount of text on a page is a good way to work in your core keywords, as well as to help rank for longer-tail keywords (which tend to drive lots of traffic to most sites). Judging by the stats, this report kicks out pages which have fewer than 150 words or so on the page:

Alexa Site Audit Low Word Count

There's no real magic bullet for the number of words you "should" have on a page. You want the right balance of word count, images, and overall presentation components to make your site:

  • Linkable
  • Textually relevant for your core and related keywords
  • Readable for humans
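
If you want to spot-check the ~150-word threshold mentioned above on your own pages, a short script with BeautifulSoup will do it. A minimal sketch, assuming the page is saved locally as page.html:

    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    def word_count(html):
        soup = BeautifulSoup(html, "html.parser")
        for tag in soup(["script", "style"]):
            tag.decompose()  # drop code/CSS so only visible text is counted
        return len(soup.get_text(separator=" ").split())

    html = open("page.html").read()
    if word_count(html) < 150:  # the rough threshold the report appears to use
        print("thin page: consider adding body copy")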

Image Descriptions

Continuing on with the "every little bit helps" mantra, you can see pages that have images with missing ALT attributes:

Alexa Site Audit ALT Attribute Overview

Alexa groups the images on a per-page basis, so just click the link to the right to expand the list:

Alexa Site Audit ALT Attribute Groupings

Like meta descriptions, this is not a mega-important item as a standalone metric, but it helps a bit, particularly with image search.
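
This is another check that's easy to replicate. A minimal BeautifulSoup sketch, again assuming a locally saved page.html:

    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    soup = BeautifulSoup(open("page.html").read(), "html.parser")

    # Flag images with no alt attribute at all, or an empty/whitespace one.
    for img in soup.find_all("img"):
        if not img.get("alt", "").strip():
            print("missing ALT:", img.get("src"))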

Session IDs

This report will show you any issues your site is having due to the use of session IDs.

Alexa Site Audit Session ID

If you have issues with session IDs and/or other URL parameters, you should take a look at using canonical tags or Google's parameter handling (mostly to increase the efficiency of your site's crawl by Googlebot, as Google will typically skip crawling pages based on your parameter list).
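
The same idea can be applied in your own code when collapsing duplicate URLs in a crawl or a log file. The parameter names below are hypothetical; substitute whatever session parameters your platform actually appends:

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

    def canonical_url(url):
        # Drop session-style query parameters so duplicate URLs collapse
        # into one canonical form (the same idea behind rel=canonical).
        parts = urlparse(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query)
                if k.lower() not in SESSION_PARAMS]
        return urlunparse(parts._replace(query=urlencode(kept)))

    print(canonical_url("http://example.com/page?id=7&sid=abc123"))
    # http://example.com/page?id=7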

Heading Recommendations

Usually I cringe when I see automated SEO solutions. The headings section contains "recommended" headings for your pages. You can download the entire list in CSV format:

Automated Headings Alexa

The second one listed, "interface seo", is on a page which talks about Google adding breadcrumbs to the search results. I do not think that is a good heading tag for this blog post. I suspect most of the automated tags are going to be average to less than average.

Keywords

Alexa's Keyword module offers recommended keywords to pursue as well as on-site recommendations in the following sub-categories:

  • Search Engine Marketing (keywords)
  • Link Recommendations (on-site link recommendations)

Search Engine Marketing

Based on your site's content Alexa offers up some keyword recommendations:

Alexa Site Audit Keyword Recommendations

The metrics are defined as:

  • Query - the proposed keyword
  • Opportunity - (scales up to 1.0) based on expected search traffic to your site from keywords which have a low CPC. A higher value here typically means higher query popularity and a low QCI. Essentially, the higher the number, the better the relationship between search volume, low CPC, and low ad competition.
  • Query Popularity - (scales up to 100) based on the frequency of searches for that keyword
  • QCI - (scales up to 100) based on how many ads are showing across major search engines for the keyword

For me, it's another keyword source. The custom metrics are ok to look at but what disappoints me about this report is that they do not align the keywords to relevant pages. It would be nice to see "XYZ keywords might be good plays for page ABC based on ABC's content".

Link Recommendations

This is kind of an interesting report. You've got 3 sets of data here. The first is the "source page" list: pages that, according to Alexa's crawl, appear to be important to search engines and are easily crawled:

Alexa Site Audit Link Recommendations

These are the pages Alexa feels you should link from. The next 2 data sets are in the same table. They are "target pages" and keywords:

Alexa Site Audit Link Recommendations Target

Some of the pages are similar but the attempt is to match up pages and predict the anchor text that should be used from the source page to the target page. It's a good idea but there's a bit of page overlap which detracts from the overall usefulness of the report IMO.

Stats

The Stats section offers 3 different reports:

  • Report Stats - an overview of crawled pages
  • Crawler Errors - errors Alexa encountered in crawling your site
  • Unique Hosts Crawled - number of unique hosts (your domain and internal/external domains and sub-domains) Alexa encountered in crawling your site

Report Stats

An overview of crawl statistics:

Alexa Site Audit Report Stats

Crawler Errors

This is where Alexa would show what errors, if any, they encountered when crawling the site:

Alexa Site Audit Crawl Errors

Unique Hosts Crawled

A report showing which sites you are linking to (as well as your own domain/subdomains):

Alexa Site Audit Unique Hosts

Is it Worth $199?

Some of the report functionality is handled by free (in some cases) tools that are available to you. Xenu does a lot of what Alexa's link modules do and if you are a member here the Website Health Check Tool does some of the on-page stuff as well.

I would also like to see more export functionality, especially for white label reporting. The crawling features are kind of interesting and the price point is fairly affordable as a one-time fee.

The Alexa Site Audit Report does offer some benefit IMO and the price point isn't overly cost-prohibitive but I wasn't really wowed by the report. If you are ok with spending $199 to get a broad overview of things then I think it's an ok investment. For larger sites sometimes finding (and fixing) only 1 or 2 major issues can be worth thousands in additional traffic.

It left me wanting a bit more though, so I might prefer to spend that $199 on links, since most of the tool's functionality is available to me without paying the fee. Further, the new SEOmoz app also covers a lot of these features & is available at a monthly $99 price-point, while allowing you to run reports on up to 5 sites at a time. The other big thing that would improve the value of the Alexa application is if they allowed you to run a before-and-after report as part of their package. That way in-house SEOs can not only show their boss what was wrong, but can also use that same 3rd party tool as verification that it has been fixed.

Your Favorite Eric Schmidt Quotes?

Do you want Google to tell you what you should be doing? Mr. Schmidt thinks so:

"More and more searches are done on your behalf without you needing to type. I actually think most people don't want Google to answer their questions," he elaborates. "They want Google to tell them what they should be doing next. ... serendipity—can be calculated now. We can actually produce it electronically."

Of course the problem with algorithms is they rely on prior experience to guide you. They won't tell you to do something unique & original that can change the world; rather, they will lead you down a well-worn path.

What are some of the blandest, most well-worn paths in the world? Established brands:

The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus here as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.

"Brands are the solution, not the problem," Mr. Schmidt said. "Brands are how you sort out the cesspool."

"Brand affinity is clearly hard wired," he said. "It is so fundamental to human existence that it's not going away. It must have a genetic component."

If Google is so smart then why the lazy reliance on brand? Why not show me something unique & original & world-changing?

Does brand affinity actually have a hard wired genetic component? Or is it that computers are stupid & brands have many obvious signals associated with them: one of which typically being a large ad budget. And why has Google's leading search engineer complained about the problem of "brand recognition" recently?

While Google is collecting your data and selling it off to marketers, they have also thought of other ways to monetize that data and deliver serendipity:

"One day we had a conversation where we figured we could just try and predict the stock market..." Eric Schmidt continues, "and then we decided it was illegal. So we stopped doing that."

Any guess how that product might have added value to the world? On down days (or days when you search for "debt help") would Google deliver more negatively biased ads & play off fears more, while on up days selling more euphoric ads? Might that serendipity put you on the wrong side of almost every trade you make? After all, that is how the big names in that space make money - telling you to take the losing side of a trade with bogus "research."

Eric Schmidt asks who you would rather give access to this data:

“All this information that you have about us: where does it go? Who has access to that?” (Google servers and Google employees, under careful rules, Schmidt said.) “Does that scare everyone in this room?” The questioner asked, to applause. “Would you prefer someone else?” Schmidt shot back – to laughter and even greater applause. “Is there a government that you would prefer to be in charge of this?”

That exchange helped John Gruber give Eric Schmidt the label Creep Executive Officer, while asking: "Maybe the question isn’t who should hold this information, but rather should anyone hold this information."

But Google has a moral conscience. They think quality score (AKA bid rigging) is illegal, except for when they are the ones doing it!

"I think judgement matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place," - Eric Schmidt

Which is why the blog of a certain mistress disappeared from the web. And, of course, since this post is on a blog, it doesn't matter:

If you're ever confused as to the value of newspaper editors, look at the blog world. That's all you need to see. - Eric Schmidt

Here is the thing I don't get about Google's rhetorical position on serendipity & moral authority: if they are to be trusted to recommend what you do, then why do they recommend illegal activities like pirating copyright works via warez, keygens, cracks & torrents?

Serendipity ho!

SeoMoz's WebApp Reviewed

SeoMoz recently released a beta product called, simply, "WebApp".

Raven SEO Tools Review

Raven SEO Logo.

Every day, it seems, a new SEO tool or toolset is launching.

I've been quite impressed with the improvements and enhancements to Raven's SEO Tools since they launched. There are so many features in Raven but I want to focus on some of the really unique ones which make Raven a must have for me.

Link Research Tools

Raven has 2 powerful, time-saving tools in their Link Research toolset: Site Finder and Backlink Explorer really help me quickly assess and work through link profiles and the link landscape of a particular keyword.

Site Finder

Site Finder is keyword driven and the reports are saved under the website profile you are working on in Raven. While the tool is fast (my auto insurance quotes example took about 6 seconds!) one of the workflow features that I really like is that I can run a bunch of these and go off to do other things within Raven rather than waiting for the reports to come back.

On to Site Finder!

To use Site Finder, just navigate to it under the Links tab, enter your keyword, and hit "Run":

Site Finder Start page

Here are the results returned for my query on auto insurance quotes:

Site Finder Results

Site Finder gives you quite a bit of data and options in an easy-to-use interface. Here's how it breaks down:

  • Search Box - search for a specific domain or reset the results post-search
  • Display Settings - show anywhere from 25 - 1k results on the page, show links that are "hidden" (links you "hid" via the options column), or show all links with no filters
  • Display Settings Option Box - click "Display Settings" and you'll get a box where you can toggle ACRank, MozRank, Page Authority, and/or Connections off and on
  • Site Finder Settings

  • Domain - the name of a domain which is linking to at least 1 site in the top ten Google results. Click on the domain link to get a slick drop-down of the sites that domain is linking to
  • Site Finder Domain Options

  • Link Icon - click the icon to display the domain in a new window
  • Connections - number of sites in the top 10 for your keyword that have a link from that domain
  • ACRank - a quick, simple data point which aims to show how important a specific page is (0-15, 15 is the highest) based on referring domains. A more in-depth definition can be found here
  • MozRank - SeoMoz's global link popularity score. It mirrors PageRank but SeoMoz says it updates it more frequently and is more precise (scaled 0-10, 10 being the highest). A more in-depth overview can be found here
  • Page Authority - a predictor of how likely a page is to rank based on a 100 point, logarithmic scale independent of the page's content. The higher the better :)
  • Backlinks - total number of links the domain has going into the top 10 Google results
  • Options Tab - if you want to hide a domain from the report (maybe not a link you want to go after), you or your team members can click "hide" and the link will be hidden from the report. If "add" is clicked then the link is added to the link queue in the Link Manager (more on this shortly)
  • Export Options - export your report to PDF or CSV (really helpful, especially when running reports on hidden links to gauge how well a link builder is assessing which links to hide)

So that's Site Finder. The flexibility, power, speed, and collaborative features of Site Finder make it one of my favorite tools to use.

Backlink Explorer

Researching competitors' link profiles is usually a time-consuming piece of the SEO puzzle. While it still takes time, especially on larger link profiles, Backlink Explorer delivers some pretty impressive results quickly and efficiently via a 3rd party tie-in to Majestic SEO.

Another nice thing with Raven is a consistent, clean user interface across the toolset. Here's the spot where you enter the domain you want to research:

Backlink Explorer Start Page

Just like Site Finder, it will save the report in the history of whatever website profile you are working in. You can explore or delete it at any time:

Backlink Explorer History

Continuing on with the auto insurance theme, I ran a quick report on GEICO:

Backlink Explorer Results

Backlink Explorer gives you the following data points and options:

  • Search Box - search for a particular domain or words within a domain
  • Display Settings - group domains (this is really helpful for cutting down duplicate results from domains with more than one link to the site), show/hide hidden or already linked from domains, filter by ACRank, and display up to 1,000 results on the page
  • Display Settings Box - display or hide no-follow, image, or date data fields
  • Source URL - the site the link is from
  • Link Icon - open page in a new window
  • ACRank - as discussed in Site Finder's review, more info here
  • Anchor Text - the anchor text of the link
  • No-follow - whether it's no-follow or not
  • Image - whether it's an image link or not
  • Options Box - hide the domain or add it to your link queue
  • Export - export results, filtered or non-filtered to CSV

What's really great about this tool is that you can do some pretty heavy filtering to get rid of the noisy links and quickly add the good ones to your link queue. On its face it may seem like it's not that big of a time-saver, but it really is if you are combing through a large profile or multiple link profiles.

You could really buzz through some fairly thick link profiles with the filtering options and put them right into your link queue for you to work on later or for a team member to work on. Once you start working with it you'll quickly see how efficient it is for you or for you and your staff.

Link Management

This is probably my favorite tool in the toolset. Prior to utilizing this tool, I was using lots and lots of spreadsheets to track link building campaigns, which got to be pretty time-consuming and tough to collaborate on.

It's built into the Raven SEO Toolbar, which allows you to quickly add a link to your link queue right from your browser, rather than hand-copying the website's data to a spreadsheet for further processing. This is a slick feature for a one-person show and really sings when used in a collaborative link building environment. The last 2 spots are where your site would be listed and your account profile name:

Raven SEO Toolbar

When you are researching link partners, simply click that Add Link button and you are presented with this screen:

Raven SEO Toolbar Add a Link

The link manager in and of itself is worth the price of admission in my opinion. So here you can:

  • Set the status to queued, requested, active, inactive, ignore, or declined. Most of the time it will be "queued" if you are saving it for further handling
  • Input the date the record was created
  • Select the type of link (organic, paid, blog, exchange, and so on). You can even define custom types in Raven and it will show as an option in this application
  • Note the desired anchor text of the link (great for collaboration with link building staff members)
  • Include the URL of where you'd like the link to point to
  • Add more links if you might be getting more than one link from the page
  • Tag the link for sorting within the link manager application
  • Set it to be monitored automatically from within Raven
  • Add it as a task for you or a staff member
  • Raven pulls in the URL, domain name of the site, and PageRank of the page
  • If available you can list the contact name and email as well as the type of site it is and even leave a note attached to the record

Try doing all that in a spreadsheet and a bunch of Word or text documents for notes :)

Once again, another solid way to save loads of time doing what is probably the most time-consuming part of an SEO campaign: link building.

So that was just the toolbar portion of the Link Manager. Within your Raven account you have access to the same "add link" application that you do from the toolbar. Perhaps you have link opportunities that you or a staff member cultivated outside of Raven. You can use this form to plug them right in.

You can also import links into your Raven account.

Raven Link Manager Import Links

You can upload a CSV file with custom data; Raven will recognize up to 20 columns of data points. These data points relate to Raven's Link Manager application, so you're able to define all of these (Raven gives you a handy sample CSV to work from):

  1. Status
  2. Link Type
  3. Link Text
  4. Link URL
  5. Website Name
  6. Website URL
  7. Website Type
  8. PR
  9. Contact Name
  10. Contact Email
  11. Contact ID
  12. Cost Type
  13. Cost
  14. Payment Method
  15. Payment Reference
  16. Start Date
  17. End Date
  18. Creation Date
  19. Comment
  20. Owner Name

Currently the currencies supported are USD, GBP, EUR, and AUD.

When you upload you can automatically add link monitoring by clicking the link monitoring box.
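
To generate the import file programmatically, Python's csv module is all you need. A minimal sketch; the header names follow the 20 fields listed above, but Raven's sample CSV is the authoritative reference for exact spelling:

    import csv

    COLUMNS = ["Status", "Link Type", "Link Text", "Link URL", "Website Name",
               "Website URL", "Website Type", "PR", "Contact Name",
               "Contact Email", "Contact ID", "Cost Type", "Cost",
               "Payment Method", "Payment Reference", "Start Date", "End Date",
               "Creation Date", "Comment", "Owner Name"]

    with open("raven_links.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(COLUMNS)
        # One example row; empty strings for fields you aren't using.
        writer.writerow(["queued", "organic", "seo tools",
                         "http://example.com/tools", "Example Site",
                         "http://example.com", "blog", "5", "", "", "", "", "",
                         "", "", "", "", "2010-08-01",
                         "prospect from Site Finder", ""])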

You can also import up to 1,000 backlinks from Yahoo! via your domain or your competitor's domains (ones you've defined in Raven).

Raven's link monitoring service will alert you if any changes occur to a link or a page the link is on. For example, you would be notified if:

  • PageRank changes
  • Anchor text changes
  • Another link gets added to the page
  • They add no-follow to your link
  • The location of your link changes

I believe Raven now has about 21 different tools within their toolset. This one tool, for me, is well worth the subscription cost. It really does save quite a bit of time and there's really nothing else like it on the market that I've seen (in terms of functionality, collaboration, and ease of use).

Facebook

There are a growing number of applications out there for managing your social media accounts (mainly Twitter and Facebook; Facebook is the focus in this example). If you want the most bang for your buck, Raven offers a state of the art Facebook application within its toolset.

Raven Facebook Entry Page

In addition to the deep reporting Raven gives you from within Facebook, you can now integrate with Google Analytics from within Raven.

Facebook and Google Analytics with Raven

Here are some of the features offered within Raven's Facebook Tool:

  • Deep Google Analytics integration
  • White label reporting of Facebook metrics
  • Automatic wall post scheduling
  • Fan tracking, customizable by date range
  • Monitor posts, comments, and likes

What I really like about the Facebook tool in Raven is that you can sync up your analytics information and truly get a handle on what's working and what's not working over defined periods of time.

I'm a big fan of the integration here because you are likely going to be using Twitter or Facebook (or both) in your internet marketing campaign(s). Having this data in one place and integrated, along with the deep metrics the tools provide, amounts to a set of game-changing features for Facebook campaign management.

Sometimes with all-in-one toolsets you see features like this get added in a kind of watered-down form. That is not the case here; it's one of the stronger Facebook management tools out there. If you are going to allocate resources to search and social then you need a way to accurately track the ROI of your campaigns, and that's exactly what you get with this tool.

Twitter

Social media campaigns can sometimes be tough to quantify in terms of ROI and overall effectiveness. Much like the Facebook Monitor, Raven offers a tool for Twitter users which is a real gem.

Twitter Entry Page Raven

Raven's Twitter Tool

One feature within the Twitter tool is the ability to post a new tweet right away or schedule it for later, integrate with 4 URL shortener services (bit.ly, is.gd, j.mp, and tinyurl), and set custom Google Analytics campaign variables. Raven also gives you the ability to work with bit.ly and j.mp's APIs.

Twitter Tweet Posting Raven

Monitor Twitter Activity and Engagement

If you are allocating resources to Twitter, or being paid by a company to run their Twitter account, then you'll want the ability to see some pretty juicy stats related to your Twitter campaign. With Raven's new Twitter tool you'll be able to see the following:

  • Posts
  • Followers
  • Friends
  • Friend to Follower Ratio
  • Mentions
  • Google Analytics referral data
  • Reply and Retweet reach (a great way to see how many readers are seeing the message)

Here's a screenshot of the statistical overlay:

Twitter Insight Metrics Raven

What's really nice about this is the date range comparisons. It's a huge time-saver to manage this data mostly in one place, and you can truly get a handle on what's working, what's not, and why. The level of detail and integration is really unique to Raven's suite of tools.

Monitor Tweets Related to Your Account

In addition to viewing tweets from your public timeline you can also see all mentions associated with your account, as well as tweets posted from your account:

Raven Timeline Image Twitter

A great feature here is that if there is a thread associated with a tweet you can click on the "view thread" link and see the entire thread from within the Twitter tool.

You can also access this via Raven's slick iPhone/iPad app.

Campaign Reporting

Much like the link tools are worth the full subscription for me, if you have a need for custom reporting then Raven's Campaign Reporting features are probably worth the price of admission for you.

In lockstep with their other tools, the Campaign Reporting feature set is super easy to use:

Campaign Reporting Image

You can quickly create white-labeled, customized reports for the following modules within Raven:

  • Link Building
  • Twitter
  • Rankings
  • Facebook
  • Keyword Research
  • Competitor Research
  • Social Media Monitoring (track mentions of your brand and/or keywords related to your service. It also allows you to manage overall sentiment and track daily buzz)
  • Google Analytics

The reporting options include the ability for you to use customized descriptions to explain different parts of the report, summary pages for different sections, and Raven will even generate a table of contents for you.

Brand Templates

Here you can quickly create a completely customized brand template for use with your reports, just click New Brand Template in the campaign home screen.

Give the template a name:

Name Brand Template

Assign it to a website, a profile or an account:

Brand Template Assignment

Pick a custom logo or text header:

Customize Header

Customize the colors and the footer text:

Color and Footer Customization

Customize the appearance of your ranking results (keyword and rank alignment, numbers/+/-/arrows):

Custom Ranking Result Display

Report Templates

Report Templates allow you to configure specific aspects of each report, saving you from having to create them over and over again for each client or each report:

Similar to a Brand Template, you start by clicking "New Report Template" in the Campaign Report screen. What I like about these reports is that they are fully customizable. Maybe you have clients that just hire you for keyword research, or just links, or both of those plus social media (and so on). With the customization flexibility of these reports, you can set up a custom template for just about any reporting need you may come across.

So name your report (I did Test 1) and you'll see the creation options on the left side:

Order Report Template

To give you an idea of how deep your customization and reporting options are, here is that left bar fully extended:

Custom Ranking Result Display

Every single one of those tabs is a customizable report :) So you just click on the ones you want to add and they are added to the report template.

Customizing Reporting Fields

When you add the fields to a template, or when you are creating the report, you can expand each section and customize it (the summary page and title are report-wide options, but each section has other options depending on the piece you are reporting on). Here are the customization options you get with the link detail module:

Expanded Reporting Options

Once you add more than one, you can collapse them and reorder them in a drag and drop fashion:

Report Order Customization

Scheduling and Auto Delivery

Maybe you want to auto-deliver reports to employees for further customization or presentation work, or maybe you want to set and forget the delivery of reports to your clients. You can send reports as attached PDFs or as trackable download links.

Scheduling Options

You can run monthly, daily, weekly, or quarterly reports, select a delivery day between 1 and 28, and define a custom date range.

Create the Report

It's really easy to create a detailed, customized report within Raven. Name your report, select your brand and report templates, set your scheduling and delivery options, and create! It is really that simple. As mentioned in the Report Template section, you can add, customize, and arrange all those reporting areas to suit your reporting needs.

Additional Features

While I focused on the key areas that sold me on Raven, I also utilize their other tools. In addition to the tools mentioned above, Raven's toolset also includes:

  • Blog Manager - manage unlimited WordPress blogs (or any blog that supports XML-RPC)
  • Competitor Manager - track competitors and see key metrics like PageRank, pages in Google's index, and links.
  • Contact Manager - this is where Raven stores (via this feature and via the Link Manager) contact information (mailing address, email, phone number, username, company, etc) which you can assign to different links, websites, and tasks
  • Content Manager - a place where you can manage articles, website content, and posts. You can add keyword analyzer features to check frequency, density, and relevance. You can also list where the article or post was used (quite handy for link building campaigns)
  • Design Analyzer - what I really like about this tool is the ability to look at your website in a Lynx browser
  • Event Manager - similar to GA annotations, the event manager can help you track any type of event related to your site. You can even include these in your reports, which is great for in-house record-keeping and/or client reports.
  • Firefox Toolbar - a killer link building assistant as discussed in the link section of this review. You can easily switch between your site profiles in the toolbar, use the analyzer features, and use logins for different social media personas.
  • Keyword Manager - a place to store potential and active keywords. A handy tagging system can be used to group keywords and you can add them to your rank tracker in one click.
  • Persona Manager - store multiple social network profiles and logins. In addition, you can also share these with staff members. This functionality is also available in the Toolbar.
  • Quality Analyzer - you can use this in your Raven account and from the Toolbar (which is a nice feature when scouring the web for links). It measures the site's indexed page count in Google and Yahoo, links from Yahoo, .edu links, .gov links, domain age, domain expiration, Google PageRank, Alexa Traffic Rank, and whether or not the site is in DMOZ. It assigns a numerical score based on this data.
  • Research Assistant - enter a domain to see data regarding the site's paid keywords, organic keywords, and competitors in both. You can one-click add a keyword or a competing URL to either the keyword/competition manager or to your SERP tracker (rank checker). Enter a keyword to see matching and related keywords with data from SEM Rush, Google, and Wordtracker. View a page to see semantic data powered by OpenCalais.com and keywords (related to the page's content) from AlchemyAPI.com.
  • SERP Tracker - Raven's rank checker, runs once per week automatically, has historical chart and data viewing capabilities, and supports a bunch of international versions of Google, Yahoo, and Bing.
  • Google Analytics Integration - tie in your Google Analytics account for easy viewing and slick reporting.
  • Social Media - in addition to Facebook and Twitter Raven also offers brand/keyword monitoring services, integration with KnowEm and Omgili.
  • Website Directory - records of all the websites used in your campaign with filtering options to sort out different site and link types.
  • iPhone and iPad apps

Give Raven a Try

Raven's integration is slick and powerful:

  • Google, SEM Rush, and Wordtracker for keyword research
  • Majestic SEO & SeoMoz for link building and research
  • Google Analytics integration
  • Twitter & Facebook integration with lots of engagement goodies

Raven currently offers a free 30-day trial, no credit card required, on all their plans. The combination of SEO tools, link building tools, social media integration, and custom reporting options were strong selling points for me, especially at the price points Raven offers. I think you can also see the significant time-saving benefits Raven provides, especially in the reporting module.

There isn't much to lose: a free 30-day trial that doesn't require you to enter any payment information. So give Raven's SEO Tools a try.

Pricing and Free Trial Info

Yahoo! Search Now Powered by Bing

Pretty exciting day in search seeing Bing results live on Yahoo! Search results.

Yahoo! Search Powered by Bing.

There were some questions as to what might transfer and what might stay. Algorithmically, it seems there was roughly a 1:1 transfer.

Same Rankings.

Yahoo! is still showing fewer characters in their page titles than Bing does. Site links (listed below some sites) may also use different anchor text. But the core results are the same. The big exceptions to the concept of the 1:1 representation would be vertical search results, left rail navigation customizations & the inline search suggestions Bing does in their search results for popular search queries.

The vertical search results & left rail navigation being home grown is no surprise, as many of the features aim to keep you on the parent portal, and that is Yahoo!'s bread and butter. Here is an example of the inline suggestions Bing does (in this example, for "loans"):

Inline Suggest.

Instead of inline suggestions like that, you might see the following kinds of navigational cues from Yahoo!

Also Try.

There has been some speculation as to whether any Yahoo! penalties will get rolled into Bing (or Yahoo!'s version of Bing) & so far it seems like that is generally a no. Of course, that could change over time. There has also been speculation about Yahoo! Site Explorer going away, but it seems it will remain through early 2012.

The Yahoo! Site Explorer team is planning tighter integration between Site Explorer and Bing Webmaster Center to make the transition as smooth as possible for webmasters. At this stage in the transition, it is important for webmasters to continue using Yahoo! Site Explorer to inform us about your website and its structure so you keep getting high quality traffic from searches originating on Yahoo! and our partner sites – even from markets outside the US and Canada that haven’t yet transitioned to Microsoft systems. To keep things simple, we will share site information you provide on Site Explorer with Microsoft during this transition period.

When Microsoft fully powers the Yahoo! Search back-end globally, expected in 2012, it will be important for webmasters to use Bing Webmaster Center as well. The Bing tool will manage site, webpage and feed submissions. Yahoo! Site Explorer will shift to focus on new features for webmasters that provide richer analysis of the organic search traffic you get from the Yahoo! network and our partner sites.

Unfortunately some of Yahoo!'s advanced link query operators seem to no longer work (say you wanted to find links to a domain from .gov pages). But you can get such link data (or at least a piece of it) from Majestic SEO or SEOmoz's Linkscape (also in OSE's export feature & eventually their online interface).

Some smaller search companies, like Exalead, still offer advanced filters while performing link searches. The ability to search a full web index allows you to do cool stuff you can't do with just a link graph. I haven't looked at it yet, but I have heard good things & owe the folks at InfluenceFinder a review soon. When Blekko launches they will have a boatload of free SEO features to share as well. Members of our community have been giving it rave reviews for the past month or so.

Why 'Spam' is Everywhere & Why That Means Nothing!

Sigh, not this again. ;)

Recently Rand highlighted his surprise at how prevalent search spam is. But the big issue with search today is not the existence of spam, but how it is dealt with. For a long period of time Google spent much of their resources fighting spam manually. That worked when spammers were sorta poor one-hit wonders fighting on the edge of the web & few people knew how search worked. But as technology advances & "spammers" keep building a bigger stock of capital, eventually Google loses the manual game.

Search engines concede the importance of SEO. It is now officially mainstream.

  • Both Google and Microsoft offer SEO guides.
  • Microsoft and Yahoo! have in-house SEO teams.
  • Yahoo! purchased a content mill.
  • Microsoft's update email about powering Yahoo! search results later this week contained "After this organic transition is complete, Bing will power 5.2 billion monthly searches, which is 31.6 percent of the search market share in the United States and 8.6 percent share in Canada. You can take advantage of this traffic by using search engine optimization (SEO) to complement your search campaigns and boost the visibility of your business."

Sure you will still see some media reports about the "dark arts" of SEO, but that is mainly because they prefer publishing ignorant pablum to drive more page views, as self-survival is their first objective. Some of the same media companies alerting us of the horrors of SEOs have in-house SEO teams that call me for SEO consultations.

A Google engineer highlighted this piece by submitting it to Hacker News, using this as the title "sufficiently advanced spam is indistinguishable from content." We tend to over-estimate end users. If most people don't realize something is spam then to them it isn't. If the search engineers have a hard time telling if a blog is ESL or auto-generated, how is a typical web user going to distinguish the difference?

Some SEO professionals have huge networks of websites and are 8 or 9 figures flush in capital. They can afford to simply buy marketshare in any market they want to enter. Burn one of their sites and they get better at covering their tracks as they buy 5 more. At the same time the media companies are partnering with content mills & the leading content mill filed for an IPO where they are hoping for a $1.5 billion valuation.

Why does one form of garbage deserve to rank when another doesn't? If link buying is bad, then why did Google invest in Viglink? If link buying is so bad then is lying for links any better? If so, how?

How exactly can Google stop the move toward spam in a capitalistic market where domains can be registered with privacy and marketers can always rent an expert to speak for the brand? Is a celebrity endorsement which yields publicity spam? How can Google speak out against spam when they beta test search results that are 100% Google ads?

Wherever possible, Google is trying to replace part of the "organic" search results with another set of Google vertical results. If Google can roughly match relevancy while gaining further control over the traffic they will. Just look at how hard it is to get to the publisher site if you use Google image search. And Google is rumored to be buying Like.com, which will make image search far more profitable for Google.

As Google continues to try to suck additional yield out of the search results, I believe they are moving away from demoting spam (due to the point of diminishing returns & risks associated with demoting what they themselves do creating anti-trust issues). Instead of looking for what to demote, they are now shifting toward trying to find more data/signals to promote quality from.

The issue with manual intervention (rather than algorithmic advancements) is that it warps the web to promote large bureaucratic enterprises that are highly inefficient. That is ok in the short run, but in the long run it leaves search a watered-down experience. One lacking in flavor and variety. One which is boring.

Google is going to get x% of online ad revenues and y% of GDP. In the long run, promoting inefficient organizations doesn't make the web (or search) any more stable. They need to push toward the creation of more efficient and more profitable media enterprises. Purchases of ITA Software and Metaweb allow Google to attack some of the broader queries and gain more influence over the second click in the traffic stream. Business models which are efficient grow, whereas inefficient ones are driven into bankruptcy.

As Paul Graham has highlighted, we might be moving away from a society dominated by large organizations to ones where more individuals are self-employed (or who work for smaller organizations). We hire about a dozen people, but they are sorta bucketed into separate clusters. Some work on SEO Book, some blog, some help create featured content, some help with marketing, etc. etc. etc. The net result of our efficient little enterprise is pushing terabytes of web traffic each month. Would you describe the site you are currently reading as being "spam" simply because it is efficient & profitable? Would a site that took VC capital and was less efficient be any more proper? How much less interesting is the average big media article on the field of SEO?

If a search engine gets too aggressive with penalizing "spam" then tanking competitors becomes a quite profitable business model. If they are to focus on what to demote, search engineers need to figure out who is doing what AND who did it. Thus the role of SEO today is not to remain "spam free" (whatever that is) but to create enough signals of quality that you earn the benefit of the doubt. This protects you from the whims of search engineers, algorithmic updates, and attempts at competitive sabotage.

You can future-proof your SEO strategy to the point where your site never loses traffic because it never ranked! Or you can get in the game and keep building in terms of quantity and quality. If lower quality stuff is all that is typically profitable in a particular market then it isn't hard to stand out by starting out with a small high-quality website. That attempt to stand out might not be profitable, but it might give you a platform to test from. After all, Demand Media purchased eHow.com to throw up their "quality content" on.

Online, the concept of meritocracy is largely a farce. Which is precisely why large search companies are willing to buy content mills. If search engines want to promote meritocracy they should focus more on rewarding individual efforts, even though that might have a lower yield, and even though some people prefer to stay anonymous given competitive threats from outing AND some of the creepy ways online ad networks harvest their data to target them.

What does the lack of meritocracy mean for marketers? If you are a marketer you need to be aggressive at marketing your wares, or someone with an inferior product will out-market you and steal market share from you.

Will someone consider your site spam?

Sure.

But they will have worse rankings than you do!

Jon Glick Interview

Jon Glick.
Jon Glick is one of the leading experts on search, having literally written the code at leading search engines and later become an SEO professional. I remember speaking with him in 2004 at the Ghost Bar in Las Vegas, and it was perhaps the most fascinating conversation about search I have ever been part of. I have wanted to interview him for years & just recently was able to. :)

In some past interviews (like this one) you have highlighted how Google's key strength is perhaps brand rather than relevancy. After seeing Yahoo! bow out of the search game do you still hold that same opinion? What do you think of the Bing brand?

Brand is still Google’s strongest competitive asset in search. It means that to get someone to switch you have to be significantly better than they are, which is a tall order. Bing is the first search offering from MSFT that is in the same league with Google, so it’s more about branding and positioning than objective quality at this point. If Bing was a standalone brand they wouldn’t have a chance, but it has the advantage of default positioning in IE, so for now it just has to be close enough that people won’t swap it out. Over time Bing may evolve some interesting differentiation from Google, but that’s not really the case right now (at least it seems to be pressuring Google to experiment/innovate a bit more). It’s been quite a while since using a MSFT product was “cool” and Bing has that drag on its brand.

Some of the new upstarts entering the search game believe the thinning of the herd is creating an entry opportunity. Have you checked out Blekko yet? Any other new general search projects interest you?

Google rose to prominence during the dot-com bust, when the existing players were quite uninterested in search since at the time (pre-PPC) it was a money loser. Search is so ridiculously lucrative right now that any promising technology that starts to get traction or buzz is likely to be quickly acquired by one of the major players as a blocking measure. Google’s rumored attempt to acquire Cuil for $80MM pre-launch is an example. There is an opportunity, but it’s more about getting bought out for a sweet price than taking down the SEs.

There is also so much manual tuning in search these days that even a great system will take a lot of effort to return great results. “Plumber OR Pipefitter” is a Boolean query, “Portland OR Plumber” is not, and someone’s got to build code to recognize that. This is where the existing players have a huge legacy advantage.
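As a toy illustration of the kind of manual tuning Jon describes (my own sketch, not any engine's actual code - the city list and the rule itself are invented for the example):

```python
# Deciding whether "OR" in a query is a Boolean operator or Oregon.
# Real engines rely on huge, manually tuned lexicons; this is a stand-in.
OREGON_CITIES = {"portland", "salem", "eugene", "bend"}

def interpret_or(query: str) -> str:
    tokens = query.split()
    for i, tok in enumerate(tokens):
        if tok == "OR" and 0 < i < len(tokens) - 1:
            # "Portland OR Plumber": a city name right before OR suggests
            # the state abbreviation (local intent), not a Boolean operator.
            if tokens[i - 1].lower() in OREGON_CITIES:
                return "state abbreviation"
            return "boolean operator"
    return "no OR present"

print(interpret_or("Plumber OR Pipefitter"))  # boolean operator
print(interpret_or("Portland OR Plumber"))    # state abbreviation
```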

Looking at new search technologies I’m very cautious about those that ask users to do more work in return for better results. Search is a low-intensity activity that people don’t really want to learn or spend time on. This is where an approach like WA (that Bing is also aiming towards) looks interesting. We’d all like search to be like the computer from Star Trek that gives you back exactly the answer/data you ask for. The complication with this, beyond the technical issues, is what benefit it has for the webmasters (i.e. why should I let you crawl/index my site). Current SEs take your data for their use, but provide traffic in return, which an answering system would not.

You are one of the few guys who literally wrote the relevancy algorithms & then later worked in the SEO space. Do you consider the roles to be primarily complementary or adversarial?

So is SEO good or bad for SEs? On the whole I think it’s a benefit for them. From an algo perspective it’s a lot easier to determine the intent of a well SEO’d page. The SEs give webmasters a lot of tools and encourage them to use them because it makes search better. 301 your pages so we know where the content went, let us know what parameters don’t impact page content so we don’t get caught in robot traps, tell us what language your page is in using the metatags so we don’t have to guess, etc. If one of these tools ends up being a net negative, SEs can always change how they treat it (NoFollow), or just start ignoring it altogether (Keywords MetaTag). This is not to say that a lot of work doesn’t have to be put into removing spam and factoring out overly aggressive optimization, but it’s a lot less than what they’d need to do if no one SEO’d.

Given your experience on both sides of the table, do you feel that ranking great in other search engines is like stealing candy from a baby, or is it still hard? What aspects of the SEO process do you find most challenging?

For SEO-ing established businesses it’s not a slam dunk, but it is still possible to generate very strong returns. At Become.com we have dozens of people working on SEO in a very organized manner, and the payback on invested effort is better than almost any other aspect of our business. The challenging part is the innate volatility of SEO and the fact that ultimately the SEs control our destiny. You can put together a great growth plan, and then watch an algo update like MayDay shred it.

For the spammers, it’s like stealing candy from a sleeping Doberman. It’s easy until the Doberman wakes up.

Does your experience allow you to just look at a search result and almost instantly know why something is ranked? If so, what are the key things SEOs should study / work on to help gain that level of understanding?

I wish. There is always some pattern recognition that comes from experience (i.e. this is a collage site), but there are so many nuances in the code and off-page stuff that it’s not always instant; you just get better at knowing what to look for. The real learning comes from looking at pages that are ranking well for no obvious reason and seeing what they are doing. It’s no secret why Apple is #1 for “ipod nano,” but what is that site I haven’t heard of doing right to get the #5 position? Also, if we see a competitor suddenly get a step-function traffic lift, we look to see what they changed/added that the SEs seem to be liking.

Back in 2006 you highlighted the rise of some of the MFA collage websites. In 2010 content mills are featured in the press almost every week. Are you surprised at how far it has gone & how long it has lasted?

I think Google actually likes folks like Demand Media. What they are doing is seeing where GG’s users are looking for something and not finding it, then plugging that hole. It may not be Pulitzer Prize-winning content, but it allows users to find something and thus makes Google more useful and universal. When better content comes along those pages will slip down, but they serve a purpose in Google’s ecosystem.

Collage websites (stitch sites in Yahoo! parlance) are another story entirely. They add virtually no value and are pretty much spam IMO. The difficulty is in detecting and eradicating them as fast as they can be robo-created.

You mentioned looking at the aboutness of a site for Become.com when judging links. Do you think broad general search engines care about link relevancy?

Personally, I have not seen it have much of an impact, which is a shame. I think the main reason is that it is quite difficult for general SEs to judge which site relationships are meaningful, and which are not. For example, a golf course might get links from a real estate site; golf and real estate might be classified as very different verticals, but the links are quite relevant because the real estate agent is pointing out one of the benefits of the community. As a result link relevancy has become more about avoiding bad neighborhoods (3Ps, link farms, etc.) than finding good ones.

How important do you think temporal analysis is in judging the quality and authenticity of a link profile?

It’s certainly a red flag if a site gains too many links too quickly. The same is true if the profile of the links looks unnatural. If all your new links are coming from PR3-PR4 blog sites, something’s off. If bloggers were suddenly that interested in you, wouldn’t there also be a lot of PR0 comments, FB mentions, tweets, and a few higher-PR press mentions? At Yahoo! sites that got a sudden upsurge in inlinks were classed as “spike” sites. Legit spike sites (ex. the website of some unknown who wins an Olympic medal) have typical hallmarks, like temporally-linked mentions in media sites that you can’t buy access to (AP, NYT, Time, etc.). The spikes that are blackhatted look totally different.
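To make the "spike" idea concrete, here is a minimal sketch - the windowing and threshold are my own illustrative assumptions, not Yahoo!'s actual classifier:

```python
def is_spike_site(weekly_new_inlinks: list[int], factor: float = 5.0) -> bool:
    """Flag a site whose latest week of new inlinks dwarfs its trailing average."""
    *history, latest = weekly_new_inlinks
    if not history:
        return False
    baseline = sum(history) / len(history)
    return latest > factor * max(baseline, 1.0)

steady_site = [12, 9, 15, 11, 14]   # natural, steady link profile
medal_site  = [3, 2, 4, 3, 250]     # sudden surge in inlinks

print(is_spike_site(steady_site))   # False
print(is_spike_site(medal_site))    # True: now check for legit hallmarks
                                    # (AP/NYT/Time mentions) vs. blackhat ones
```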

In an interview a couple years ago Priyank Garg mentioned Yahoo! looked at the link's location on a page. Do you feel other search engines take this into account?

All of the major SEs have been doing boilerplate stripping for a while. They recognize footers, rail nav., etc. and look at those links differently. Also, SEs will only follow a limited number of links per page. They typically collect all the links, remove the checksum dups (note: if your links vary by even one parameter they will not be deduped at this phase), and follow the first N links from the code. None of the SEs will say exactly what N is, but it’s probably somewhere between 75 and 300 links (Google recommends you have <100). Put your important links high up in the code and save the header/footer stuff for further down.
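In rough pseudocode, the per-page link handling Jon describes might look like the sketch below (the value of N, the md5 checksum, and the function name are illustrative assumptions):

```python
import hashlib

def links_that_get_followed(links_in_code_order: list[str], n: int = 100) -> list[str]:
    """Collect links in source-code order, drop checksum duplicates, keep the first N."""
    seen, kept = set(), []
    for url in links_in_code_order:
        checksum = hashlib.md5(url.encode()).hexdigest()
        # "page?a=1" and "page?a=1&b=2" differ by one parameter, so they hash
        # differently and are NOT deduped at this phase.
        if checksum in seen:
            continue
        seen.add(checksum)
        kept.append(url)
        if len(kept) == n:
            break  # links further down in the code never get followed
    return kept
```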

What are some of the biggest advantages vertical search engines have over general search engines? As Google adds verticals, will they be able to build meaningful services that people prefer to use over leading vertical plays?

The big advantage of being a vertical search engine is the ability to limit the scope of the problem we’re trying to tackle. You can use a more focused taxonomy to provide a better experience, and present data in a way that is much more relevant than the 10 blue links. Sidestep is going to help me find the plane flight I want a lot easier than a Google search. The challenge is that the experience that you offer has to be dramatically better than Google. Google is easy, people know how to use it and it works for almost everything. Being 5% better at one thing won’t get anyone to switch behavior.

As Google adds verticals, it’s ironic that they are in a position in the browser similar to how I think of Microsoft historically on the desktop (link and leverage): they don’t need to win by being the best, they win by being the default. Google Product Search doesn’t have to provide a better user experience than say Shopping.com; it will get used because it gets placed prominently on the Google SERP.

At the upcoming SES you are speaking about meaningful SEO metrics. What are some of the least valuable metrics people still track heavily?

The one that jumps to mind is pages indexed. Depending on which GG servers you are hitting, that number is going to fluctuate, and I see people stress over those fluctuations when there is often no actual change. Also, getting indexed is virtually worthless; it’s getting ranked that’s valuable. It’s easy to get your “iPod” page indexed, getting a top 10 ranking is another story. What’s the point of having 300,000 pages indexed if all your traffic is coming from 30 that have decent rankings? If you have pages that are indexed but not ranking, either do some SEO for those pages (internal links, extra content, etc.) or NoIndex them and take them out of your sitemaps so other pages on your site get a chance.

Another is pageload time. Google has mentioned this as a ranking factor, but we really have not seen an impact. We focus on reducing latency, and loading search relevant content first (vs. headers or banner media), but that’s because it reduces abandonment rate, not because it helps SEO.

What are some of the most valuable metrics which are not generally appreciated enough in the market?

The big one is revenue. Everything else is a means to this end; never lose sight of that.

The other is crawl rate (esp. from Google). This is a great leading indicator.

----

Thanks Jon! To hear more of Jon's insights on search check out his panel at San Francisco's SES conference next week.

How To Lie With Statistics

There are three kinds of lies: lies, damned lies, and statistics - Disraeli

We get presented with graphs and statistics every day. "Most SEOs think keywords in the title tag are an important ranking factor." "Spending on search to rise by $10b." Ever get that feeling that what you're being presented with sounds plausible, but the conclusion just doesn't make sense?

Here are a few common ways people try to pull the wool over your eyes with statistics. Some you'll be familiar with. If you've got more, add 'em to the comments :)

1. Built In Bias

The sample data supports an obvious agenda. For example, a company is hardly likely to show a graph that shows their product has produced negative results. Try to determine the bias of the person or organisation presenting the data - "What would they want me to hear?" Then ask yourself: "What data are they not showing me?"

2. The Average

The media loves to state "the average", then neglect to tell you which average they are talking about.

For example, the average house price for an area could be both $500K and $200K, depending on which type of average is being used. They could be referring to the mean, the median, or the mode. These often get mixed up, depending on what conclusion the writer wants you to reach.
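A quick sketch of how both numbers can honestly be "the average" (hypothetical prices, using Python's statistics module):

```python
from statistics import mean, median, mode

# Seven hypothetical sales: mostly modest homes, plus two mansions.
prices = [200_000, 200_000, 220_000, 260_000, 300_000, 1_100_000, 1_220_000]

print(f"mean:   ${mean(prices):,.0f}")    # $500,000 - dragged up by the mansions
print(f"median: ${median(prices):,.0f}")  # $260,000 - the middle sale
print(f"mode:   ${mode(prices):,.0f}")    # $200,000 - the most common price
```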

3. Inadequate Sample Size

20% of web designers make over $1M. That may be true if the sample consisted of the ten highest-earning people in the industry, and two of them just happened to have had a great year. But what if the sample were everyone who practices web design for a living? The outcome may be somewhat different.

4. Meaningless Differences

A difference is only a difference if it makes a difference. Potential employee Jill may have an IQ of 120, and potential employee Jack may have an IQ of 118, but does that really mean anything? What if Jill has an attitude problem, and Jack is a great conversationalist? Who would be the better hire?

5. Oh My God!

Al Gore loves this one. The graph that shows some astonishing change in the status quo. The impression is one of significant movement and is meant to shock an audience.

However, if the chart appears in a different context - say, over a longer time period - the rise may not look all that unusual. You often see this in stock price quotes. You could also change the measurement into smaller units, thus making any movement in the graph look even more impressive.
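Here is a small sketch of the trick (made-up numbers, assuming matplotlib is available): the same data, charted twice.

```python
import matplotlib.pyplot as plt

months = list(range(12))
price = [102, 103, 103, 104, 105, 104, 106, 107, 107, 108, 109, 110]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.plot(months, price)
ax1.set_ylim(101, 111)   # axis clipped to the data: looks like a dramatic surge
ax1.set_title("Oh my god!")

ax2.plot(months, price)
ax2.set_ylim(0, 120)     # axis from zero: the same ~8% drift, barely visible
ax2.set_title("Same data, honest axis")

plt.show()
```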

6. What You Infer Is Up To You

If you can't prove what you want to prove, prove something else and pretend they are the same thing. Often used in the alternative medicine industry. They may not be able to prove that their natural products cure cancer, but they can say that the plant extract has been used by some remote tribe, and they have a proven historical low incidence of cancer.

7. Post Hoc

A study found students who smoked got lower grades. The post hoc fallacy is assuming that because one thing accompanies or follows another, it must have been caused by it - smoking doesn't necessarily cause bad grades. Frequently, other factors are left out, e.g. the students who smoked also tended to be party animals. Look out for correlations that happen by chance.

8. Data Precision

Quoting specific numbers, especially ones including decimal points, can look authoritative. "Real estate values up 4.95%." Why would someone be so precise if they didn't know their stuff? The numbers can be wild guesses, but the precision gives an air of authority.

General Tips For Spotting The Lies

  • Ask "who says so?" Are they likely to be biased? If experts are cited, check to see if those experts actually agree with the conclusions. Often, they do not.
  • Ask "how do they know?" Is the sample size really large enough, or relevant enough, to draw conclusions?
  • Look to see if they change the subject. Look for a change between the raw data and the conclusion. Does one follow from the other? For example, more reported incidences of crime do not necessarily mean more crime is occurring.
  • Ask "does this make sense?" Are they trying to blind you with numbers? If the conclusion just sounds wrong, look for a disconnect between the data and the conclusion.

If you want to delve deeper into How To Lie With Statistics, grab the little book of the same name. It's getting a bit dated now - it was written in 1954 - but the advice and examples are great :)

Financial Steroids

One of Wordnet's definitions for slave is "someone entirely dominated by some influence or person; 'a slave to fashion'; 'a slave to cocaine'; 'his mother was his abject slave.'"

Given that definition, it is no stretch to say many Americans (and indeed the United States itself) are debt slaves. We encourage it in virtually every aspect of our lives: consumerism, taking on debt to buy a new car or house, an education which requires a decade or more of solid employment to pay off - even when that education sometimes prohibits employment:

Jordan Hueseman, 25, accrued roughly $100,000 in student loans at the University of Denver earning a bachelor's degree in international business and a master's in business administration. On the job hunt, he found his graduate degree sometimes hindered more than it helped.

“At one point, I applied to Whole Foods, hoping they might see some potential for me to move to some type of management position,” Hueseman said. “The e-mail I received from them said I was far too overqualified for any of their hourly positions and as such would not be considered for a position.”

Hueseman said that after one job application, he was told he should leave his degrees off his resume.

As bad as that is, student loan debt typically can't be discharged via bankruptcy. Introducing the for-profit element to the federally guaranteed loans also gives you major price distortions:

A student interested in a massage therapy certificate costing $14,000 at a for-profit college was told that the program was a good value. However the same certificate from a local community college cost $520.

Imagine buying an iPod for $6,703.84. That is how much a $249 iPod would cost at the above ratio ($14,000 is roughly 27 times $520). Even die hard Apple fans wouldn't be buyers at that price. And yet the availability of credit (which only has to be paid back later), tied with the words of a recruiter/salesman, closes such a deal every single day of the year.

You have to love marketing!

Many try their hardest to pay their debts. Some can't. The debts are then bought up for pennies on the dollar & the debtors are harassed to pay them. Some who can't make the payments end up being put in jail:

It's not a crime to owe money, and debtors' prisons were abolished in the United States in the 19th century. But people are routinely being thrown in jail for failing to pay debts.

The debts -- often five or six years old -- are purchased from companies like cellphone providers and credit card issuers, and cost a few cents on the dollar. Using automated dialing equipment and teams of lawyers, the debt-buyer firms try to collect the debt, plus interest and fees. A firm aims to collect at least twice what it paid for the debt to cover costs. Anything beyond that is profit.

Bail is often being set at exactly how much debt you have.

The banking class put teeth into the consumer bankruptcy laws under an Orwellian bill called the "Bankruptcy Abuse Prevention and Consumer Protection Act of 2005." Only a few months after it was passed an article titled Newly Bankrupt Raking In Piles of Credit Offers was published in the New York Times.

Of course, a few years later, when it was the bankrupt banks' turn to go out of business due to widespread intentional mortgage fraud and accounting control fraud, they pushed a bill through Congress offering them a bailout - threatening martial law and tanks in the streets if they didn't get it.

The bailouts and the legalization of accounting fraud (allowing banks to claim bogusly inflated asset values) were done with the alleged purpose of helping the banks restore their balance sheets. However, those banks have started paying record bonuses again, & a more cynical look at the sequence describes it as:

In effect, it's a Third World/colonial scam on a gigantic scale: plunder the public treasury, then buy the debt which was borrowed and transferred to your pockets. You are buying the country with money you borrowed from its taxpayers. No despot could do better.

The new president claimed to be in favor of transparency, yet as part of a bill promoting it we got this:

The law, signed last week by President Obama, exempts the SEC from disclosing records or information derived from "surveillance, risk assessments, or other regulatory and oversight activities." Given that the SEC is a regulatory body, the provision covers almost every action by the agency, lawyers say. Congress and federal agencies can request information, but the public cannot.

Here is the thing about business and personal investment. So often what we think we need is to invest money when what we really need to invest is time and effort. If you work twice as long as most people do, learn furiously, are willing to put yourself out there, and you know your market then you can overcome a lack of capital to build momentum.

Are there short cuts? Absolutely. But the most obvious ones which seem like they have the least upfront risk are typically not the best ones. There was a thread recently in our forums about forging a certain type of partnership, and John Andrews shared a great take on how that can work out. I shared a similar story as well. A $50,000+ life lesson without having to experience the pain.

About a month ago there was a thread where someone thought they *had* to have something which cost $100,000. Members of the forums dug up a great alternative which was only $1,700. Now he is in an incredible position without all that debt!

It is easy to think that debt is the key to growth, but "When the Student is Ready, the Teacher will appear" is a better way to think about growth. If you have to take on a lot of debt to do something then it might not be a great idea.

Debt works to limit you. It consumes your thought cycles, adds uncertainty, and pulls attention away from what you do best. It raises your stress and is a major cause of divorce. Rand's story of building up half a million dollars of debt is a good example of why it should be avoided. And he didn't start getting very successful until the debts were being paid off so he could focus on growing his business.

Given open source content management systems like WordPress, free themes, 99Designs, cheap web hosting, tons of market research data from keyword tools, etc., a person can get started for only a few hundred dollars. Presuming you start by attacking your market from an informational angle, there is no need to take on huge leverage to get a project started.

Money can be a great lever. And if you have a lot of it certainly it makes sense to use it to your advantage. But the compounding interest on debt is also a lever working against you. It is what forces us to have recessions.
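To put rough numbers on that lever (the rates below are illustrative assumptions, not quotes from any lender):

```python
def compound(principal: float, rate: float, years: int) -> float:
    """Future value with annual compounding."""
    return principal * (1 + rate) ** years

# The same math, working against the borrower and for the saver.
print(f"$10,000 of debt at 12% for 10 years:   ${compound(10_000, 0.12, 10):,.0f}")  # ~$31,058
print(f"$10,000 of savings at 3% for 10 years: ${compound(10_000, 0.03, 10):,.0f}")  # ~$13,439
```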

Can you succeed with the use of debt? Sure. But debt is a claim on future labor (with interest). The net impact on most people is probably more harmful than good. Particularly because if you spend more than you are making today, then tomorrow you need to

  • cut your expenses to within your income
  • cut your expenses below your income to have money for interest on the loans
  • cut your expenses further to have capital to pay off the principal of the loan

And you have to do that in an increasingly gamed market where the rug can be pulled out from under you at any time. You don't control international balance of payments issues, but you certainly feel their impact in job security & the unemployment numbers. At any time forces beyond your control can pull the plug, rewrite the terms, or impact your market in ways that put you in a sour situation. If you have no debt and a bit of savings, they can only screw you a bit. If you are loaded up on debt, there are some risks you can't take. They own you.

"Compound interest is the eighth wonder of the world. He who understands it, earns it ... he who doesn't ... pays it." — Albert Einstein

Am I trying to say there is such a thing as a perfectly secure market position? Not at all. Market makers are often market manipulators. But when I read this quote:

"There was 5 exabytes of information created between the dawn of civilization through 2003," Schmidt said, "but that much information is now created every 2 days, and the pace is increasing...People aren't ready for the technology revolution that's going to happen to them."

the last thing I want to do is load up on debt.

How about you?

Ryan Deiss Perpetual Traffic Formula Review

Marketing generally has 2 core strategies in terms of customers: finding new customers & keeping your current/old customers happy. The best businesses tend to keep the interest of their customers for months and years by consistently improving their products and services to deliver more value. The other sort of business tends to be hard-close / hype driven & always promoting a new product / software / scheme. It is never a complete system being sold, but some "insider secret" shortcut that unearths millions automatically while you sleep - perpetually. ;)

One of the problems with false-scarcity hype launches is that they attract the type of customer who can't succeed. The people who are receptive to that sort of marketing want to be sold a dream; they are not the type of people who want to put in the time and effort to become successful. They are at stage 2 in this video: "my life sucks" ... so sell me a story that will instantly make everything better without requiring any change from me at all. ;)

Another problem with the hype launch business model is that it requires you to keep repeating the sales process like a traveling salesman. Each day you need to think up a new scheme or angle to sell a new set of crap, and you have to hope that the web has a short enough memory that the scammy angles used to pitch past hyped-up product launches don't come back to bite you in the ass.

I don't mind when the get-rich-quick crowd works their core market, as there is a group of weak-minded individuals who are addicted to buying that stuff. But I always get pissed off when someone claims that your field is trash or a scam (as an angle to sell something else), and then later starts trying to paint themselves as an expert in your field.

Here is a video snippet of Ryan Deiss proclaiming his ignorance of the SEO field & explaining how he got ripped off thrice because he knew so little he couldn't tell a bad service provider from a good one.

"If you want to get free traffic you have to get good at the cut-throat game of SEO (which I for one am not). ... SEO for most of us isn't the right answer." - Ryan Deiss

And his latest info-product (perhaps one in a series of dozens?) is called Perpetual Traffic Formula. On the squeeze page he highlights that it offers you the opportunity of... "Discovering a crack in Google algorithm so big it simply can't be patched. Being able repeat the process for similar results in UNLIMITED niches."

You don't have to be an expert to create an info-product!

The Droid has a pretty good review of how awful his sites are doing in terms of "perpetual traffic." :D

If you want to buy from a person who *always* has another new product with a secret short cut to sell, Ryan is THE guy. If you want to learn how to evaluate the quality of products being sold, here are some good tips on that front. And if you want to get a good overview of the internet marketing world for free you will love this.

Infographic: History of Search Timeline

PPC Blog has another cool infographic out. This one is called The History of Search: How Finding Stuff Online Became a $20 Billion Business.

Click on the image below to see the full version. And if you like it, feel free to use the embed code to add it to your website :)

History of Search.

Google Shows You How to Talk Out of Both Sides of Your Mouth (BETA)

Rel=nofollow to the Rescue

Years ago Google introduced rel=nofollow, claiming it as a cure-all for comment spam. Once in place, it was quickly promoted as a tool to use on any paid link. Google scared webmasters about selling links so much that many webmasters simply became afraid to link out to anyone for fear of falling out of favor with Google.

If You Don't Disclose, You Are a Spammer

As the pool of links dried up due to the launch & spread of nofollow, any ad network which used direct links was supposed to adopt nofollow or feel the wrath. Just ask Pay Per Post what Google can do to you if you sell links (to/through someone other than Google).

Google demanded that any form of paid link contain a machine readable and user readable disclaimer that it is paid for (even though in Google's marketing they highlight how some of their users are unaware the search results contain paid links).

What it came down to is this: if there was a monetary relationship associated with a link and you didn't disclose it, then you were operating outside of Google's guidelines and might be considered a "spammer."

Selective Search Guideline Enforcement

I am one of many who have highlighted how, by and large, Google was responsible for killing off the link graph through their paranoia about "paid links" and their willingness to fund companies that operate outside those guidelines while syndicating Google ads.

Our affiliate program on this site stopped passing link juice after a fellow SEO blogger outed it quite publicly. Other affiliate programs continue to pass PageRank. Highlighting Google's double standards invites more scrutiny and more selective, arbitrary enforcement, whereas promoting Google products earns free links. ;)

No Disclosure Required: WOOT!

Reading the news today I found out that VigLink bought out Driving Revenue. Both are networks that help publishers monetize their outbound links. The pitch is one of no-effort money:

"Quite simply, if you're a Web publisher who hasn't recognized the value of your outbound traffic, you are leaving money on the table," said Raymond Lyle, CEO and Co-Founder of Driving Revenue. "Dozens of our publishers make six figure incomes for a one-time investment of one minute of work. Who isn't interested in that?"

Note that "1 minute of work" doesn't really leave much time for disclosure. As stated in this video, the intent is to not offer any:

The page loads fast. And your site looks exactly the same. Even your links look and behave the same way. The only difference is that now when your visitors buy products or services you'll earn a commission. ... Once you have set up viglink you can sign in to view reports about your site. You can see how much money you are making every day and compare that with last week. You can see which merchants are the most profitable, and make decisions on who to link to in the future.

So basically VigLink is suggesting you control who you link to based on whatever makes you the most money, while not providing any disclosure of the financial relationship.

AKA: paid links.

Presumably these VigLink links will still pass PageRank, but the affiliate stuff will be layered on top of the regular links using JavaScript. Pay affiliates using VigLink a slightly higher percentage for the exposure, and you have bought a ton of valuable inbound links for pennies on the dollar.
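Conceptually, the monetization layer amounts to something like this sketch (my illustration in Python; the real implementation is client-side JavaScript, and the parameter name is invented):

```python
def monetize_link(href: str, affiliate_id: str = "AFF123") -> str:
    """Rewrite an outbound link to carry an affiliate tag: no visible change,
    no disclosure, but a commission when the visitor buys."""
    separator = "&" if "?" in href else "?"
    return f"{href}{separator}ref={affiliate_id}"

print(monetize_link("https://merchant.example.com/product"))
# https://merchant.example.com/product?ref=AFF123
```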

Here is where it really gets screwed up: Google is an investor in VigLink.

Selectively allowing some links to pass link juice while arbitrarily blocking others indeed controls the shape of the web graph. It gives anyone who works with Google a strong competitive advantage in the organic search results over those who are not using Google endorsed technology.

Google also has a patent on automatically adding inline links inside content. Since they can't legally do it without permission of the webmaster, one presumes any implementation would be as part of a distributed ad network.

Makes you wonder about how evil undisclosed paid links are, no?