Thanks for the Feedback & All You Need is Love

Early feedback on the SEO Toolbar has been quite positive.

You know people like a Firefox extension when an official Microsoft blog makes an entry promoting it!

Have a good weekend everyone!

Actually I just wanted an excuse to embed this Beatles video in a blog post :)

Time to Update the SEO Toolbar

Earlier today I uploaded an older version of the SEO Toolbar that does not have the update option built into it. If you downloaded it earlier today, please download again from

The current version should be 1.0.1 (rather than 0.1). Sorry about the error with the updating part...but you won't have to download it again after this time. The update feature will work, and it is safe to download the new version now, as it will overwrite the earlier version of the extension.

The SEO Toolbar

What would happen if you smooshed together many of the best parts of Rank Checker, SEO for Firefox, the best keyword research tools across the web, a feed reader (pre-populated with many SEO feeds), a ton of competitive research tools, the ability to compare up to 5 competing sites against each other, easy data export, and boatloads of other features into 1 handy Firefox extension? Well, you would have the SEO Toolbar.

Yes, Keyword Ranking Reports are Still Valuable

In Mike Grehan's New Signals to Search Engines he highlights how personalization, social media, and universal search may help move search beyond text and links.

Mike also contended that ranking reports are dead. While clients should see the end effect of optimization in their analytics and sales data, ranking reports still have good value to professional SEOs. Below are a couple examples of why and how ranking reports are still important, even as Google crowds the organic search results with universal search stuff.

New Sites

Track Your Growth

When you build a new site from scratch you get to see how effective your link building strategies are as the site's rankings improve. You have to get in the game before you compete...ranking improvements give you an idea of how your site's trust is growing even before you rank well enough to receive much stable traffic.

This early feedback data can be used to guide further investment in link building efforts, and prioritize which websites get the most effort and investment.

Show Clients Baseline Rankings & Growth

If you sell services to clients and they have a brand new site with limited traction then a ranking report shows baseline rankings and proof of growth, even before top rankings yield lots of traffic. This helps customers have confidence in their SEO provider, even if their SEO investment loses money before making it back.

Page 2/3 Rankings

If you rank on page 2 or 3 for some high value keywords you might not see much traffic from them. But if your keyword rankings let you know that you are close to the top you can consider working on link building and altering your site structure to improve the rankings of those pages.

Services like SEMRush also help give insights into such ranking improvement opportunities.

Algorithm Changes & Penalties

How Are Search Algorithms Shifting?

Is Google putting more weight on authority sites? How much does the domain name count (if at all)? Is anchor text becoming more important or less important? How aggressive should you be with anchor text?

When major algorithm updates happen, tracking a wide array of sites and keywords can help you hypothesize what might be gaining importance and what might be losing importance.

What Happened to My Google Traffic?

Sometimes sites get filtered out of the search results due to manual penalties, automated penalties, automated filters, algorithm changes, or getting hacked. Sometimes the issues are related to particular pages, particular folders, whole sites, or keywords closely related to (or containing) another word.

Seeing a traffic drop gives you some clues that something may be wrong, but one of the easiest ways to isolate the issue and further investigate is to look at ranking reports to see what keywords and what pages were affected...then you can start thinking about if it was a glitch, something you can fix, or something you can't.

Download Alexa Top 1,000,000 Websites for Free

Quantcast was the first web traffic analytics company that allowed users to download a list of their top 1,000,000 websites. Recently Alexa followed suit, giving away a daily updated index of their top 1,000,000 websites.

Such lists should be taken with a grain of salt, but it is hard to complain about the price of free. As time passes, free and good enough will force those selling tools and information to offer something with a sustainable advantage over free.
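If you want to work with the list programmatically, here is a minimal sketch. The S3 URL and the zipped rank,domain CSV layout are assumptions based on how Alexa published the file at the time:

```python
import csv
import io
import urllib.request
import zipfile

# Assumed location of Alexa's daily snapshot; the file is a zipped CSV
# with one "rank,domain" row per line.
TOP_1M_URL = "http://s3.amazonaws.com/alexa-static/top-1m.csv.zip"

def parse_top_sites(csv_text, limit=None):
    """Parse 'rank,domain' rows into a list of (rank, domain) tuples."""
    rows = []
    for rank, domain in csv.reader(io.StringIO(csv_text)):
        rows.append((int(rank), domain))
        if limit and len(rows) >= limit:
            break
    return rows

def download_top_sites(limit=1000):
    """Fetch the zip, extract the CSV, and return the first `limit` rows."""
    with urllib.request.urlopen(TOP_1M_URL) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))
    csv_text = archive.read("top-1m.csv").decode("utf-8")
    return parse_top_sites(csv_text, limit=limit)
```

From there it is easy to filter the list down to a niche, or diff two daily snapshots to see which sites are rising and falling.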

At the same time...

  • the sea of information will become increasingly hard to navigate, increasing the value of filters (particularly those built around a shared perspective or bias)
  • hyped up salesmen will be able to build many business models out of selling such recycled information to the uninitiated, forcing others who sell information to add even more differentiators between themselves and the competition

Thanks to Jamey for the Alexa tip. :)

Google Launches a Sweet Competitive Research & Keyword Research Tool

The Inside AdWords blog announced the beta launch of Google's Search-based Keyword Tool. To some degree the tool is a knock off, but with a number of exceptions:

  • this tool is free
  • Google has more search data than does
  • this shows bid prices and search volume estimates next to keywords (like the Google Traffic Estimator)
  • this shows your current page titles and keywords
  • this shows the % of organic and paid traffic going to a URL

For any keyword, the Google Search-based Keyword Tool will show up to 800 related keywords with cost and search volume estimates. This tool also works to show you 100 keywords related to a site, and if you own a website they will show you thousands of keywords that they think you could bid on which are not already in your account. In addition they show your search share of voice (via ads and organic search results) for keywords. This data is easy to export using a handy export button.

There are a variety of cool extra filters that can be applied on this tool, including...

  • minimum or maximum search volumes
  • bid price range
  • low, medium, or high competition
  • keyword in URL
  • combining URL and keywords as filters
  • keyword + general category
  • negative keywords

Using a variety of different combinations for these filters you can see many different sets of 800 keywords even within the same subset. Export these different lists a variety of times and you can quickly build a list of thousands of high value keywords.
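Stitching those exports back together is mostly a de-duplication job. A quick sketch, assuming each export is a CSV with a Keyword header column (a hypothetical column name; match it to whatever header the tool actually writes):

```python
import csv
import io

def merge_keyword_exports(streams, key="Keyword"):
    """Merge several exported keyword CSVs (file objects or StringIO),
    de-duplicating case-insensitively on the keyword column."""
    seen = {}
    for stream in streams:
        for row in csv.DictReader(stream):
            kw = (row.get(key) or "").strip().lower()
            if kw and kw not in seen:
                seen[kw] = row
    return list(seen.values())

# Example with two in-memory exports (in practice: open("export1.csv"))
a = io.StringIO("Keyword,Volume\nseo tools,1000\nrank checker,400\n")
b = io.StringIO("Keyword,Volume\nSEO Tools,1000\nkeyword research,900\n")
merged = merge_keyword_exports([a, b])  # 3 unique keywords
```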

If you are a paying subscriber, this thread has a few more tips for how to get the most out of this tool.

SEM Rush Search Marketing Research - Review of

What is SEM Rush?

A sweet new competitive research tool by the name SEMRush has hit the market. It can be seen as a deeper extension of the SEO Digger project (adding PPC data and tracking AdWords keywords), and a competitor to services like and SpyFu (which recently launched SpyFu Kombat).

Brief Tool Overview

Competitive research tools can help you find a baseline for what to do & where to enter a market. Before spending a dime on SEO (or even buying a domain name for a project), it is always worth putting in the time to get a quick lay of the land & learn from your existing competitors.

  • Seeing which keywords are most valuable can help you figure out which areas to invest the most in.
  • Seeing where existing competitors are strong can help you find strategies worth emulating. While researching their performance, it may help you find new pockets of opportunities & keyword themes which didn't show up in your initial keyword research.
  • Seeing where competitors are weak can help you build a strategy to differentiate your approach.

Enter a competing URL in the above search box & you will quickly see where your competitors are succeeding, where they are failing & get insights on how to beat them. SEMrush offers:

  • granular data across the global Bing & Google databases, along with over 2-dozen regional localized country-specific Google databases (Argentina, Australia, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, Hungary, Japan, Hong Kong, India, Ireland, Israel, Italy, Mexico, Netherlands, Norway, Poland, Russia, Singapore, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States)
  • search volume & ad bid price estimates by keyword (which, when combined, function as an estimate of keyword value) for over 120,000,000 words
  • keyword data by site or by page across 74,000,000 domain names
  • the ability to look up related keywords
  • the ability to directly compare domains against one another to see relative strength
  • the ability to compare organic search results versus paid search ads to leverage data from one source into the other channel
  • the ability to look up sites which have a similar ranking footprint as an existing competitor to uncover new areas & opportunities
  • historical performance data, which can be helpful in determining if the site has had manual penalties or algorithmic ranking filters applied against it
  • a broad array of new features like tracking video ads, display ads, PLAs, backlinks, etc.

While their tool is a paid service, the above search box still allows you to get a great sampling of their data for free. SEMrush is easily our favorite competitive research tool. We like their tool so much we also license their data to offer our paying subscribers a competitive research tool powered by their database.

In-Depth Review

SEM Rush vs

The big value add that SEM Rush has over a tool like is that SEM Rush adds cost per click estimates (scraped from Google's Traffic Estimator tool) and estimated traffic volumes (from the Google AdWords keyword tool) near each keyword. Thus, rather than showing the traffic distribution to each site, this tool can list keyword value distribution for the sites (keyword value * estimated traffic).

Normalizing Data

Using these estimates does not provide results that are as accurate as's data licensing strategy, but if you own a site and know what it earns, you can set up a ratio to normalize the differences (at least to some extent, within the same vertical, for sites of similar size, using a similar business model).

One of our sites that earns about $5,000 a month shows a Google traffic value of close to $20,000 a month.
5,000/20,000 = 1/4 = 0.25

A similar site in the same vertical shows $10,000
$10,000 * 0.25 = $2,500
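The same arithmetic as a tiny helper, using the numbers above:

```python
def normalized_estimate(my_earnings, my_tool_value, competitor_tool_value):
    """Scale a competitor's tool-reported traffic value by the ratio
    between your known earnings and the tool's value for your own site."""
    ratio = my_earnings / my_tool_value
    return competitor_tool_value * ratio

# A site earning $5,000/month that the tool values at $20,000/month
# gives a 0.25 ratio; a similar site the tool values at $10,000/month
# is then estimated to earn about $2,500/month.
print(normalized_estimate(5000, 20000, 10000))  # → 2500.0
```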

Disclaimers With Normalizing Data

It is hard to monetize traffic as well as Google does, so in virtually every competitive market your profit per visitor (after expenses) will generally be less than Google's. Some reasons why...

  1. In some markets people are losing money to buy marketshare, while in other markets people may overbid just to block out competition.
  2. Some merchants simply have fatter profit margins and can afford to outbid affiliates.
  3. It is hard to integrate advertising in your site anywhere near as aggressively as Google does while still creating a site that will gather enough links (and other signals of quality) to take a #1 organic ranking in competitive markets...by default there will typically be some amount of slippage.
  4. A site that offers editorial content wrapped in light ads will not convert eyeballs into cash anywhere near as well as a lead generation oriented affiliate site would.

SEM Rush Features

Keyword Values & Volumes

As mentioned above, this data is scraped from the Google Traffic Estimator and the Google Keyword Tool.

Top Search Traffic Domains

A list of the top 100 domain names that are estimated to be the highest value downstream traffic sources from Google.

You could get a similar list from's Referral Analytics by running a downstream report on, although I think that might also include traffic from some of Google's non-search properties like Reader.

Top Competitors

Here is a list of sites that rank for many of the same keywords that SEO Book ranks for

Overlapping Keywords

Here is a list of a few words where Seo Book and SEOmoz compete in the rankings

Compare AdWords to Organic Search

These are sites that rank for keywords that SEO Book is buying through AdWords

And these are sites that buy AdWords ads for keywords that this site ranks for

Once Upon a Time...

I was going to create a tool similar to this one about a year ago, until I hired a programmer that was EPIC FAIL. The guy who managed that project is no longer selling programming services - and that makes the world a better place.

I actually had 3 attempts at such a tool. I bought a GREAT domain name, spec'd out the project, then planned on doing it...

  • investor backed, who decided to back out
  • self funded, but I hired... 1.) a programmer who mid-project decided he needed to make double what I make working part-time, then 2.) the worst programmers ever.
  • combination of heavily self funded with the guidance of a bad ass VC, but I backed out due to a need to focus on this site

I spent most of this year focusing on trying to build our community and raise our editorial quality (both goals are going well, but require significant maintenance). We have had 4 strong hires in a row, so it seems like our luck has changed on that front. Recently I started working with a programmer who really clicks with me, often taking my ideas and making them way better than I intended.

If these guys had not made this tool I was going to take another run at something like this early next year...which brings up a good point that a friend (and wickedly intelligent open source programmer) named François Planque told me. He said all he had to do was think up a good idea and sit on it; within 6 to 12 months someone else would have launched it.

Entry cost is so low that a lot of great tools are going to get made in short order, but it is hard to win by sitting on a good idea. ;)

Lots of Marketing Goodies

PPC Stuff

My wife recently put together a PPC strategy flowchart. Check it out and please give her feedback.

Search Engine Land has a good post with interview snippets of Nick Fox about some of the recent Google AdWords changes.

Google announced they are ending the proposed partnership with Yahoo!

The FCC approved the wireless broadband whitespace plan, which in time should make for more online searchers.

SEO Stuff

Wordtracker released a new keyword tool based around keyword questions. The information is quick and easy to export. Ken McGaffin said, "This is a fun tool that is a great source of inspiration for web content writers. You need never be short of creative ideas again." And it is a cool idea - good job Wordtracker!

Majestic SEO did a major update, claiming to have crawled about 52 billion URLs and has nearly 350 billion unique URLs in their anchor index. Here is a list of their top URLs with inbound links.

They also did a comparison between their link counts and those found by Yahoo! Site Explorer and LinkScape. They claim to have more links in their database than Yahoo! is showing, but I have to wonder how they could do that economically, if they are counting more duplicates, and why they haven't bought a site design that reflects how much they must be spending on data.

A few years back search engines were in an ego based contest about who has the biggest index, and I find it a bit ironic that a couple SEO companies will likely be engaged in such a data war...but the marketplace competition should be good for all SEOs.

I recently did an interview with Patrick Altoft about link building for affiliates.

Jim Boykin started the WeBuildPages SEO blog.

In the weird department, have you heard the We Like SEO song yet?

Conversion Stuff

Conversion Rate Experts highlighted 14 cool conversion enhancing tools.

Avinash Kaushik showed how powerful Google Analytics segmentation is.

Content Stuff

The NYT is getting close to where it will be hard for them to service their debt. Who should buy them out?

Funny blog post about economic blog posts

Political Marketing

Obama won the election and gave one of his best speeches. But Seth Godin didn't even wait for the vote to happen before he deconstructed the campaigns from a marketer's perspective. XMCP also discussed the backchannel negative PR campaigns.

SEO for Firefox - Now With SEO X-ray

We recently added an SEO X-ray feature to SEO for Firefox. You must use Firefox 3.0 or above to see these features, but if you want to see...

  • how the on page optimization of any page looks (headings, meta description, page title)
  • the keyword density of the page and popular phrases on the page
  • how many links point into a page (total links, or links from external resources)
  • how many links point out of a page (as well as the anchor text of these links, nofollow vs follow, internal vs external - all exportable in CSV format)

then this new feature makes it quick and easy to do all of that. Simply right click on the page you are viewing, scroll down to SEO for Firefox, and click on SEO X-ray.

That will show you an overlay on the screen like this.
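The keyword density numbers in a report like this can be approximated in a few lines. This is a rough sketch of the general idea, not SEO X-ray's actual implementation:

```python
import re
from collections import Counter

def keyword_density(text, top_n=10):
    """Return the most frequent words in `text` with their counts and
    density (occurrences / total words)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words) or 1
    counts = Counter(words)
    return [(w, c, c / total) for w, c in counts.most_common(top_n)]
```

A real report would also strip stop words and look at two- and three-word phrases, but the core computation is the same.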

We are planning on doing another update in the next couple days, and may add...

  • the IP address of the site (and links to other sites on the same IP address)
  • character and word counts for the page title, meta description, and body content
  • a link to the domain tools overview page for the associated site

If you are using Firefox 3 and SEO for Firefox please give this a try and let us know what you think.

SEOmoz's Linkscape: Why the Backlash is Overblown

Right after I finished writing a post about how being likeable is a great business strategy, I went back to Sphinn and saw it erupted with controversy and negative feedback about SEOmoz's Linkscape. Since then threads have been opened, closed, and re-opened. People are worried about everything from the index size to how to remove your site to why you shouldn't label your site with an obvious SEO footprint.

So my timing on that last post was a bit off, but I still think the general thesis is valid. But now that there has been so much negative feedback I figure it is my job to play devil's advocate and highlight reasons why most SEOs do not need to be too worried about Linkscape.

Cool Features

Unique Linking Domains

One of the coolest features of this tool is knowing the number of unique linking domains pointing links at a specific site, but that feature is for paying members only.

A competing tool by the name of Majestic SEO allows you to see that data as part of their free overview. Click on the image below for an example.

If your competitor has high authority links then you need more than just quantity to compete, but if most of their backlinks are garbage then this is a good stat to have, along with many other stats you can get from tools like SEO for Firefox.

Spam Reporting

Not that I advocate spam reporting (the official guidelines have departed so far from reality that almost everyone who ranks is spamming and/or spammed in the past to get to their current market position). But for professional SEOs who own dozens of sites and like filing spam reports with Google, this might be a good tool for outing competitors, since it makes it easy to find noscript links, links positioned off the page, and inbound 301 redirects. The average webmaster probably does not need to worry about that.

A Bit Top Heavy

One of the biggest limitations in Linkscape is that you can only go 500 results deep unless you want to buy a custom report. They allow you to see various lenses of 500 at a time through search features and filters, but a big recommendation I can make on this front is for them to show all of that data, even if it requires exporting to CSV...they already spent the money to collect the data, so if you're a customer they may as well give it to you. The data helps nobody if nobody sees it.

Majestic SEO appears to have a similar sized database as Linkscape, and they allow you to do a full data export for your own domain free of charge. For other domains they charge a sliding price depending on the number of links to the domain.

More Cool Features?

Nick Gerner promised more features in the next version of Linkscape, but unless they start buying usage data and become more like I am not sure if it will be a game changer. On to explaining why...

1. Editorial Rules

When Linkscape was announced Danny Sullivan said:

Personally, I'm not too worried. You want to compete with me and get links in places where I'm listed? We get listed in places where editorial rules. So just knowing where we're at doesn't get you in the door -- you have to be good enough to walk in. And if you are good enough, well, good I guess.

The highest quality links typically tend to be editorial in nature, with many of those being driven by social relationships. No matter how much one decides to analyze link patterns, they can't re-create most of the link relationships if they don't already have the content quality, market exposure, and awareness. And if you copy someone's idea after they already did it you need to greatly improve upon it to get credit for it.

2. Tons of Alternative Data Sources

Common link analysis questions...

How do I Get a Basic Competitive Overview of the Search Results?

Search Google with SEO for Firefox turned on. Make sure you are pulling data in the automatic mode while searching.

I Want to do Anchor Text Analysis. How do I Analyze Links?

Some options include...

  • SEO Link Analysis - a free Firefox extension that adds anchor text to Google Webmaster Central and Yahoo! Site Explorer.
  • Link Diagnosis - another useful Firefox extension.
  • Link Analysis Tool - shows the PageRank and number of inlinks to each page on a site, though it requires you to set up a MySQL database.
  • Both Google Webmaster Central and Majestic SEO allow you to download backlink profiles for your own sites after you authenticate your sites.
  • Backlink Analyzer - a free desktop based tool I had created a few years ago that pulls data from the Yahoo! API. Make sure to watch the video on the download page before using it.

I Want to Find New Links to Competing Sites

If you want to find what someone's best ideas are all you have to do is subscribe to the Google Blogsearch feed for links to their site, like so. That should list many of the people who are talking about this site.
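If you want to process such a feed programmatically rather than read it in a feed reader, a minimal sketch using only the standard library (assuming a standard RSS 2.0 feed; the sample items are made up):

```python
import xml.etree.ElementTree as ET

def extract_linking_posts(rss_xml):
    """Pull (title, link) pairs out of an RSS 2.0 feed, such as a
    blog-search feed for a link: query. Assumes standard <item> markup."""
    root = ET.fromstring(rss_xml)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

sample = """<rss version="2.0"><channel>
<item><title>Great tools post</title><link>http://example.com/a</link></item>
<item><title>Weekly links</title><link>http://example.com/b</link></item>
</channel></rss>"""
```

Run on a schedule and diffed against yesterday's results, this gives a crude "new links" alert for any competitor.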

A paid option on this front is Advanced Link Manager. It costs $199 (or $299 if you package it with Advanced Web Ranking) and scrapes data from Yahoo!, keeping track of the date when the link was found.

I Want to Find New Links to My Site

This is the same as competing sites, but you can also use your web analytics and server logs to dig up additional information. You can also look inside Google Webmaster Central to download backlink reports.

I Want to Find The Most Authoritative Links Pointing at a Site

Yahoo! Site Explorer generally orders backlinks roughly in terms of authority, with some of the most authoritative backlinks showing up at the top of their results.

I Want to Find .edu Links

Yahoo! Search offers a wide array of advanced link operators. Here are .edu & .gov links pointing at

I Want to Get an Estimate of Unique Linking Domains

Majestic SEO offers a free estimate...though, like LinkScape, their crawl is not as comprehensive as Yahoo!'s.

I Want to Find Hub Links

What Sites Drive the Most Traffic to My Competitors?

The best way I have found to get this data is from Referral Analytics, though it requires a $500 a month subscription...which is a nice chunk of change, unless you are already doing quite well!

Do I Have Any Broken Links?
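One way to answer this yourself is to request each outbound URL and treat any failed request or 4xx/5xx status as broken. A minimal standard-library sketch:

```python
import urllib.error
import urllib.request

def check_link(url, timeout=10):
    """Return the HTTP status code for a URL, or None if the request
    fails entirely (DNS error, timeout, refused connection)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
    except (urllib.error.URLError, OSError):
        return None

def is_broken(status):
    """Anything >= 400, or an unreachable host, counts as broken."""
    return status is None or status >= 400
```

Feed it the list of outbound links exported from a crawl (or from SEO X-ray) and you have a basic broken-link report.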

3. All Link Graphs Are Unique

Each search engine has its own crawling priorities and own web graph. Google has probably spent hundreds of millions of dollars building and refining their crawling sequence. No two crawls are the same.

Image from Google Touchgraph.

4. Yahoo! Search Counts Link Weight Differently Based on Page Segmentation

Google's PageRank was designed based on a random walk theory, where browsers click a random link on the page. But search engines are looking to move beyond the random walk model.
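The uniform random-walk model is easy to sketch: a toy power-iteration PageRank where every outlink on a page gets equal weight, which is exactly the assumption the engines are moving away from:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a {page: [outlinks]} graph.
    Illustrates the uniform random-walk model only; real engines
    weight links by position, segment, and usage data."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] = new.get(q, 0.0) + share
            else:  # dangling page: redistribute its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

In this toy graph "a" ends up with the most rank because both "b" and "c" link to it, while "c" only gets half of "b"'s vote.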

Yahoo! Search's Priyank Garg stated:

The irrelevant links at the bottom of a page, which will not be as valuable for a user, don’t add to the quality of the user experience, so we don’t account for those in our ranking. All of those links might still be useful for crawl discovery, but they won’t support the ranking.

5. Microsoft May be Looking to Heavily Incorporate Usage Data

Microsoft did research on BrowseRank, which aims to use actual usage data to augment (or perhaps replace) their link graph. By default, Internet Explorer 8 sends usage data to Microsoft...when you know what 80% of web users are doing you do not need to rely on a random walk.

Think of having access to the majority of the web's usage data like this:

  • If Google's algorithms are more relevant than Microsoft's, then putting weight on usage data allows Microsoft to quickly catch up by weighting whatever Google is weighting
  • Microsoft could theoretically be better than Google at filtering out paid links, as most paid links in a sidebar or footer do not send much traffic...and thus could easily be weighted less than links in content - though with Google owning so many products they could improve significantly on this front as well, if they decided to use their AdSense data, analytics data, Chrome browser data, Feedburner data, and toolbar data.

6. Google Does a Lot of Hand Editing

Google hires 10,000 remote quality raters.

Beyond those editors there are many search engineers inside the webspam team offering a variety of techniques to throw off SEOs, including

  • stripping all PageRank from a site and killing all its rankings
  • stripping some portion of a site's PageRank and ranking abilities
  • stripping PageRank from the toolbar but still allowing sites to rank
  • showing full PageRank in the toolbar, but killing the ability of a link to pass PageRank

Without working inside of Google and/or buying and testing lots of links across a wide array of sites and verticals it would be hard to know if any particular site passes PageRank, and how much it might pass. For instance, a link from's website is one of my highest MozRank links, but I doubt Google places much weight on that link since Google does not let Text Link Ads rank for their own brand.

Read Eric Schmidt's perspective on brands to consider how Google holds different sites to different standards.

7. Search Engine Editorial Policies are Selective, & Constantly Changing

According to Udi Manber, Google did 450 search algorithm updates last year. Even if you could somehow catch up with all the editorial changes search engines make to manipulate their version of the link based web graph, you would have a hard time keeping up with it - let alone accounting for the hordes of usage data the search engines have.

The status of a link (and its ability to pass PageRank) may arbitrarily change based on media exposure. In the past many websites were hijacked by 302 affiliate links (this even happened to Google's site, and this is still happening today to corporate sites as big as Snapnames).

At an SEO conference about 3 or 4 months back someone highlighted that some large sites use 301 redirects on affiliate links. This topic came up once again at SMX East, where it was deemed an acceptable marketing practice:

Shockingly, when asked point blank if affiliate programs that employed juice-passing links (those not using nofollow) were against guidelines or if they would be discounted, the engineers all agreed with the position taken by Sean Suchter of Yahoo!. He said, in no uncertain terms, that if affiliate links came from valuable, relevant, trust-worthy sources - bloggers endorsing a product, affiliates of high quality, etc. - they would be counted in link algorithms. Aaron from Google and Nathan from Microsoft both agreed that good affiliate links would be counted by their engines and that it was not necessary to mark these with a nofollow or other method of blocking link value.

A few years ago I set up my affiliate program to use 301 redirects to prevent hijacking, and to get any link benefits I could. But right after I changed my business model to a membership site my affiliate program was featured/outed in this interview, and it no longer passes PageRank.

Watch the above video and see how at 2 minutes and 15 seconds in my site was put up for review to any Google engineer that happened to watch it.

The same set of links, to the same site, using the same format, under similar circumstances...

  • counts for most major corporations (and is allegedly an approved and legitimate strategy)
  • counted for this site for years
  • stopped counting around the time they were outed by a popular SEO blogger

8. Temporal Algorithms + Domains Expire, & May Lose PageRank

Search engines may place weight not only on the number of links pointing at a page, but also on the rate at which links are accumulated. Even if you know the raw number of links and the site age it still does not tell you how many links were built last month or in the last year.

Not only are links born, but some of them rot. The web graph as a whole is over a decade old. Linkrot was a big issue in 1998, and it is still a big issue today. In 1998 6% of links were broken, and the DotBot crawl shows 7% of links being broken.

To appreciate how bad linkrot is...

Some domains that expire may keep their PageRank, but many expiring domains lose it. With how hard it is to build links today, and with roughly 1 in 7 links broken, there are SEO tools designed around trying to capture this link equity.

The domains that die off may later be re-registered and re-purposed. And keep in mind that the 1 in 7 broken links figure understates the problem when you consider how many people buy expired domain names and build them out.

By creating an index of the web in 2008 a person would have no idea if...

  • the links occurred recently
  • the links are old
  • the site expired and potentially lost much of its link weight

And Matt Cutts generally hates re-purposing expired domain names. Why? The very first spam site he found was a high PageRank expired domain linked from the W3C. That site was converted to a porn site, and ever since then (before Matt was the head of the webspam group - before Google even had a webspam group) Matt has not liked expired domains.

Matt offers background on that story 30 seconds into this video:

9. Advancing Algorithms That Move Away From PageRank & Anchor Text

Paid links have been an obvious weak spot in the relevancy algorithms for years. PageRank and anchor text are still both important, but Google also considers other factors like...

  • domain age / link age
  • domain name (and extension)
  • domain history (ie: spam infractions/penalties, etc.)
  • site authority
  • signals of locality (hosting location, TLD, link sources, etc.)
  • searcher intent (Google's Amit Singhal stated "the same query can mean entirely different things in different countries. For example, [Côte d'Or] is a geographic region in France - but it is a large chocolate manufacturer in neighboring French-speaking Belgium")
  • other forms of search personalization (past searches, user subscriptions, frequently visited sites, etc.)
  • editorial partnerships with news companies & other universal search categories (like Google Shopping Search and the maps local onebox)
  • usage data (especially with sites they host, like YouTube)
  • content age (read up on the Query Deserves Freshness algorithm)

Look at some of the search results from Google's 2001 index and compare them to current search results to see how much Google has moved away from a raw PageRank model. Yahoo! Search's Priyank Garg also stated that they have moved away from placing so much weight on links:

All of those links might still be useful for crawl discovery, but they won’t support the ranking. That’s what we are constantly looking at in algorithms. I can tell you one thing, that over the last few years as we have been building out our search engine and incorporating lots of data, the absolute percentage contribution of links and anchor text to the natural ranking of algorithms or to the importance in our ranking algorithms has gone down somewhat.

Final Thoughts

It is not that Linkscape is a bad tool; it is just aiming to do something incredibly complex. As long as Yahoo! Site Explorer gives us a decent free sample (and other tools let us layer data on top of Yahoo!), we can get a good idea of the approximate level of competition for free. But with Yahoo! at $12 a share, if Yahoo! gets bought out and Site Explorer goes away, then Linkscape (or Majestic SEO, depending on who does a better job of innovating) might be one of the best SEO investments one can make.