Reminder: New Free Tools:
About two weeks ago I added a bunch of SEO tools to my free SEO tool list. I just made all of the new public tools open source, so if you like any of them you can do whatever you want with them.
Cool Keyword Research Tool: Needs Feedback:
I would love feedback on the SEO Book keyword research tool. Give it a try and see if you think it is useful or what features you would like added to it.
The tool does not try to do everything itself; instead it provides relevant links for digging through keyword data from lots of other tools, and makes it easy to cross reference Overture results and Google Suggest results, as well as many other keyword tools.
At the bottom it also provides links to well known web directories, news sites, shopping sites, tagging sites, encyclopedias, blog search engines, and blog trend graphs. Those links are intended to help you find information that others found useful, and maybe also help you come up with linkbait ideas.
Occasionally the tool breaks if it gets queried too heavily, and sometimes Overture intermittently fails to provide data, but whenever I re-query it the tool usually works fairly quickly.
Link Harvester Updated:
Some people recently notified me that the xls sheet output in Link Harvester was putting the word "array" where the individual links were supposed to be. I just got that fixed too.
Backlink Analyzer Update to Come:
The last update to Backlink Analyzer (about two months ago) broke some of the key features, and I did not realize it until recently. I just spoke with the lead developer; although a new version was intended to be out in December, I am hoping to have it out by the end of this month.
I think I have updated Link Harvester twice since I last posted new source code. It now allows you to grab link data via Yahoo! or MSN.
On top of allowing you to search for links to a specific page or links to anywhere in a domain, it also has a third function, called deep links, which lets you get a sample of deep link data without grabbing links pointing at the home page. The theory is that many good sites get deep links. Looking through the deep links may give you a better view of how they were acquired, or whether they are all garbage scraper links, etc.
By looking through the deep links you can:
check the quality of links pointing at inner pages.
know what URLs you really need to redirect if you are changing your content management system.
know what URLs are important to redirect if you buy a site and want to modify the content or gut out pieces that were causing duplicate content or other problems.
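The deep links idea can be sketched in a few lines of code: given backlink pairs, keep only the ones pointing somewhere other than the home page and inspect those. The URLs and function names below are hypothetical illustrations, not Link Harvester's actual code.

```python
from urllib.parse import urlparse

def deep_links(backlinks, site):
    """Filter a list of (source_url, target_url) backlink pairs,
    keeping only links that point at inner pages of `site`."""
    deep = []
    for source, target in backlinks:
        parsed = urlparse(target)
        # A link to "/" (or an empty path) points at the home page; skip it.
        if parsed.netloc == site and parsed.path not in ("", "/"):
            deep.append((source, target))
    return deep

# Hypothetical sample data for illustration.
backlinks = [
    ("http://blog.example.org/post", "http://example.com/"),
    ("http://news.example.net/story", "http://example.com/tools/link-harvester"),
    ("http://forum.example.info/thread", "http://example.com/articles/seo-guide"),
]
print(deep_links(backlinks, "example.com"))
```

Scanning the filtered list by hand (or counting referring domains) is then enough to see whether the inner pages earned real editorial links or just scraper noise.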
Looking at the deep link profiles of sites that were not actively marketed via SEO techniques can also help you see what natural link profiles look like.
MSN sometimes gives weird backlink counts and typically shows fewer backlinks than Yahoo!, so by default when Link Harvester gives link counts like
Showing 421 unique domains from the first 250 results of 1129 total results
it means that between Yahoo! and MSN there were 421 unique domains returned in the query. The "first 250" means that the link search depth was set to 250 per engine. The 1129 is the number of links in the Yahoo! database (although they don't return 100% of what they know of, they return most of it). If Yahoo! is turned off the third number should come from MSN's database.
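The way those three numbers fit together can be reproduced roughly as follows: merge the per-engine result lists (each capped at the search depth), then count unique referring domains. The sample URLs and the helper name are made up for illustration; this is not Link Harvester's actual code.

```python
from urllib.parse import urlparse

def summarize(yahoo_results, msn_results, depth, total_in_db):
    """Merge per-engine backlink URL lists (each capped at `depth`)
    and count unique referring domains, Link Harvester style."""
    merged = yahoo_results[:depth] + msn_results[:depth]
    domains = {urlparse(url).netloc for url in merged}
    return ("Showing %d unique domains from the first %d results "
            "of %d total results" % (len(domains), depth, total_in_db))

# Hypothetical result lists; note the overlap between engines is deduplicated.
yahoo = ["http://a.example/page1", "http://b.example/page2"]
msn = ["http://a.example/page1", "http://c.example/page3"]
print(summarize(yahoo, msn, 250, 1129))
# → Showing 3 unique domains from the first 250 results of 1129 total results
```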
Some of the tools could be quite a bit better, but I bought a bunch of them cheap, almost just for the heck of it. Let me know if you find any of them useful. I already know some tweaks I want to make when time becomes available.
The next build of Backlink Analyzer is coming along slowly but surely. I am hoping to have a cross platform version with a few more features before the month is out.
Originally when Threadwatch was created NickW was going to track forums, but he quickly found them a bit too repetitive & later switched to finding other news sources.
A friend of mine named Chris Ridings created a site called Resource Rate, which aimed to use a variety of editors to track SEO forums. It seems to have quickly faded in popularity.
Another one of my friends, named Eaden, recently launched SEO Bytes. It is a concept similar to Resource Rate, with a few exceptions:
No central editors: instead of having central editors the threads are ranked by freshness, number of replies, and recent activity.
Adjustable scoring: you can choose to place more weight on freshness or recent activity to get the newest threads first. You can also rate up good forums and place less weight on forums you do not like as much.
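A weighted scheme like the one described might look something like the sketch below. The exact fields, weights, and formula are my guesses for illustration, not SEO Bytes' actual algorithm.

```python
import time

def thread_score(thread, w_fresh=1.0, w_replies=1.0, w_activity=1.0,
                 forum_rating=1.0):
    """Score a forum thread from freshness, reply count, and recent
    activity, with user-adjustable weights. `forum_rating` lets a user
    boost forums they like (or downweight ones they don't)."""
    now = time.time()
    hours_since_post = (now - thread["posted"]) / 3600.0
    hours_since_reply = (now - thread["last_reply"]) / 3600.0
    score = (w_fresh / (1.0 + hours_since_post)      # newer threads score higher
             + w_replies * thread["replies"]         # busier threads score higher
             + w_activity / (1.0 + hours_since_reply))  # recent replies score higher
    return score * forum_rating

# Raising w_fresh pushes the newest threads to the top of the list.
thread = {"posted": time.time() - 7200, "last_reply": time.time() - 600,
          "replies": 12}
print(thread_score(thread, w_fresh=5.0))
```

Storing the weight values per user (in a cookie or an account) is all it takes to make the ranking personal.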
I believe SEO Bytes stores your settings in a cookie, but some SEOs travel a good bit. A few features I would like to see:
allow people to log in so their settings work on different computers
allow me to block all sub forums from a specific forum
add a few more forums to the list of forums
if he really wants to put a ton of effort into it ;) allow users to place more weight on thread ratings from friends and allow friends to submit threads for their friends to see
I am fairly certain Eaden will read this post, so please post what you like and what you would like to see at SEO Bytes.
In many industries it is likely that tools such as SEO Bytes will spring up. Sometimes they will have the best value as public research tools, and often if left private they can help some webmasters get the scoop.
For the longest time it seemed as though Google was uncomfortable sharing some of their search data, afraid to give competitors the inside scoop, but that is no more. Google recently launched a new Google AdWords Keyword Suggestion Tool[you have to be logged in for the link to work & it may not be available in all accounts yet], which is much more usable than their older Google AdWords Keyword Sandbox.
Keyword list sorting. Sort the results of your keyword search by popularity, performance history within the AdWords system, cost, and predicted ad position.
Easy keyword manipulation. Select a few keywords here and there, or add them all at once. Keywords already present in your Ad Group will be marked so that you don't have to worry about them. You can also download your keyword list as a .csv file.
Search for keywords in three ways. Use keywords you enter, your existing high clickthrough rate keywords, or any webpage URL for your search. You can also expand your keyword search even further to include pages linked to from the original URL. (Note: Site-related keyword searching is currently only available for English language users.)
More keyword results based on regularly updated statistics. Our advanced search engine technology allows us to provide you with the latest information on potential keywords for your campaigns.
Google also has tips on how to use the tool. I am not sure how well this tool interfaces with their API, but automating keyword selection based on Google's usage data and extracting meaning from page content makes the market a hell of a lot more efficient, especially for large advertisers willing to pay a bit extra for branding. The new tool also makes it easy for newbies to quickly build out targeted keyword lists. Google also has put Google Suggest in their toolbar, which allows them to sell better targeted ads than searches on broad generic terms would, and also helps consolidate the less common search queries (misspellings, etc.) into fewer overall phrases and more predictable patterns (since the search term suggestions are going to be based off of past popular searches). All of these will lead to Google being able to increase their profit margin per search. Combine that with the recent toolbar bundling and the numbers are looking up for Google.
This new keyword tool allows you to:
tap Google's userbase to select keywords based on past searches
create keywords from a URL, site, keywords you enter, or the most relevant terms in your account (as determined by CTR)
RankAttack technology does not submit your site to the search engines... rather it creates a persona of "popularity" around your site in the eyes of the search engines. The purpose in RankAttack technology is simple: get the search engines attention and make them want to list your website under the keywords you desire.
I can't see search term co-citation being a trusted source of data unless it is from well established search history accounts and/or there are also a number of news stories about the topic and/or new web pages on trusted sites about the topic.
If temporal effects of increased search volume are used to allow sites to gain link popularity at a quicker rate, then odds are pretty good search engines would also look at the number of news stories and unique sites posting about the topic.
I suppose you can write a number of press releases and the like, but it is going to be hard to get mainstream news coverage for most websites, and without it I can't see any value in poisoning the keyword research tools in your keyword space (unless you are doing it to screw with competitors' keyword research ability, or marketing your site through spamming keyword suggestion tools, as many SEO companies have done).
Andy states that this type of search spam is poisoning the keyword databases, but WordTracker has worked hard to filter out most of it:
Unfortunately, this approach is skewing the popular keyword databases such as (our own) Wordtracker, KeywordDiscovery, Overture suggestion tool and the Google keyword tool.
However, we have improved our spam filter and 99% of these skewed terms have now been removed from the Wordtracker database.