I'm expecting some creative answers here. I'll phrase it more generally: Forget XML files or even what Sitemaps looks like currently. What info would you want as a webmaster?
If you could design your dream webmaster or site owner console on Google, what would it look like?
Rand announced the launch of his page strength tool, which aims to be more accurate than Google's PageRank. The one downside to the tool is that there is a delay in most of the data sources (for example, I think my SEO for Firefox page is ranking at #10 in Google for SEO right now, but the page strength tool shows it at 3.5), but it is probably quite a bit more accurate than PageRank alone.
Also interesting that on one front Google is requesting to look for ways to share as much data as they can with you while on other fronts they make external tools and ideas necessary and valuable because they are unwilling to share data they once shared. Thus markets which were once fairly open are getting more and more abstract. It happened with PageRank and SEO and now it is happening with AdWords too.
It now works on international versions of Google and Yahoo! (previously it only worked on the global Google.com and Yahoo.com sites), and it pulls a ton of marketing data right into the search results to help you analyze how competitive a marketplace is. Learn more about SEO for Firefox.
There is a new version of Backlink Analyzer. I added PageRank to it (hopefully that doesn't piss off Matt too bad), made the search term feature more reliable for deep backlink analysis (ie: it shouldn't crash if you are doing an analysis of the keywords in thousands of backlinks), and it also shows what URL extension the links are coming from, as well as the page the links are pointing at if you do a linkdomain search.
If you find any problems with it please leave a comment on this post.
If it is not working when you try it, here are some things to take note of:
off the start, if it does not work you may need to click the preferences button (the one with two check marks on it) and uncheck the "use proxy server" option if you do not want to use a proxy
Norton (or equivalent) may block it
make sure you set the result limit count to a reasonable number (like 100 or 1000)
you have to select at least 1 engine to pull link data from
if you want to enable PageRank, site age, etc. you have to check those features
you can't get the keyword summary until the tool is done pulling in all the links
This thread will probably be up most of the rest of the weekend, as I am reading a killer book and my internet access has been spotty recently. I have replaced the wall jack and all cords, and a new router will be in next week. If my internet access is still unreliable, that will speed up my decision of when and where to move.
So my friend might take another week or two to get it done, but I am having him make an extension for adding data to Google's SERPs on the fly. A mock up might look something like this. Notice the links under each organic search result showing things like PageRank, site age, site size, and linkage data. Of course if this extension was made you would be able to actively pull in the data automatically or click a button to have the data selection pulled in on an as needed per URL basis.
Is SEO for Firefox an extension worth making? What marketing data would you like to see in Google's SERPs? What data points should be page specific? Which should be site specific? Which should be both? When gathering site data should it gather subdomain specific data? Or domain specific data?
Bob Mutch at SEO Company created an inbound link quality extension for Firefox. You can download the extension from his home page, or access the tool online (again on his home page, but the web based tool has been slow). The tool checks to see if a site is listed in the Yahoo! Directory or DMOZ. In addition it searches Yahoo! for the number of .edu and .gov links pointing at a website.
The extension looks like this
While in some cases there are .edu and .gov sites that offer up spammy links, the theory behind the tool is that most .edu or .gov links are going to be harder to get / more pure / of higher quality than the average link from most commercial sites. In that sense, the raw number of .edu and .gov links can be seen as a rough indicator of whether a site has quality natural editorial inbound links, and an estimate of the depth of quality citations a site has received.
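The counting idea behind the tool is simple enough to sketch. Assuming you already have a list of backlink URLs (pulled from Yahoo! or wherever), a few lines can bucket them by .edu / .gov vs. everything else — this is just an illustration of the proxy metric, not Bob's actual implementation:

```python
from urllib.parse import urlparse

def edu_gov_link_counts(backlink_urls):
    """Count .edu and .gov linking hosts as a rough link-quality proxy."""
    counts = {"edu": 0, "gov": 0, "other": 0}
    for url in backlink_urls:
        host = urlparse(url).hostname or ""
        if host.endswith(".edu"):
            counts["edu"] += 1
        elif host.endswith(".gov"):
            counts["gov"] += 1
        else:
            counts["other"] += 1
    return counts

# Example with made-up backlinks:
sample = [
    "http://seo.example.edu/resources",
    "http://library.example.gov/links",
    "http://blog.example.com/post",
]
print(edu_gov_link_counts(sample))
```

A higher .edu/.gov share doesn't prove anything on its own, but as the post says, it is a useful first-pass signal of editorial citation depth.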
Similar to the content categorization engine, but for keywords. In addition to the uses described above this tool can also show you how well your page is aligned with your core keywords. Try Microsoft's Keyword Categorization Engine
Demographics Prediction Tool:
Shows the age groups and gender of searchers for a particular query or visitors to a specific URL. Useful for:
showing the most common markets for a search query or domain.
showing you how well your site audience is aligned with your core keywords (for example, if a site lacks corporate bullshitspeak™, it would be unsurprising that the viewers of that site would be younger than the demographic averages for a field which is typically targeted toward older people who can't get enough corporate bullshitspeak™)
the most common groups of visitors to a site (and their mindset), or for a query, might be obvious, but some of the secondary and tertiary markets may be less well defined. This tool can help you find some of those other markets.
Shows seasonal search spikes. It is like a hybrid between Google Trends and Google Suggest, but it will also show you relevant keyword phrases that have your keyword in the middle of them. This tool does not seem to have as much depth as Google Trends (ie: surprisingly few searches show results). They also seem to have stripped out many gambling and porn related keywords. Unlike Google, MSN places search volume numbers on their trends. Useful for:
Shows you Microsoft's opinion of the probability of a query or a page being informational, commercial-informational, or commercial-transactional in nature. Works well in conjunction with Yahoo! Mindset. Useful for:
seeing how commercial they think a term or page is, which is important because it is believed that some search engines, such as Google, have a heavy informational bias to their search results.
A friend of mine recently quit his job to work as a full time content developer for me, where we share the revenues generated. He started off with about a 20 page website about a month ago and right now it has about 40 pages indexed in Google. Yesterday the site brought in about $40 in AdSense earnings (which is not a lot, but the site is young). Within 2 months I expect the site to be able to make at least $200 a day.
Create a Wide Net:
The ability to grow revenues quickly is largely due to throwing out many fishing poles at the same time and then accepting the feedback the market provides. As you learn what feedback the market offers, you can create more content in those areas.
Use Software to Track Your AdSense Clicks & Earnings Granularly:
Here are some ad click stats by URL from the last 3 days. Off to the left of this image are URLs which I did not want to include in the image, but you can see that a few of the pages get the bulk of the pageviews and the bulk of the clicks.
In the last 3 days 137 clicks came from 17 new pages that my friend recently created. That is not bad considering that the rest of the site has exceptionally amazing link authority pointing at its well established pages and only had 117 clicks over that same time period.
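The per-URL comparison above (137 clicks from 17 new pages vs. 117 from the established pages) is exactly the kind of rollup a click tracker produces. A minimal sketch of that aggregation, assuming hypothetical log rows of (url, pageviews, ad_clicks) rather than any specific tracker's actual output format:

```python
from collections import defaultdict

def summarize_by_url(rows):
    """Aggregate (url, pageviews, ad_clicks) rows into a per-URL report,
    sorted so the pages earning the bulk of the clicks surface first."""
    totals = defaultdict(lambda: [0, 0])  # url -> [pageviews, ad_clicks]
    for url, views, clicks in rows:
        totals[url][0] += views
        totals[url][1] += clicks
    report = []
    for url, (views, clicks) in sorted(
        totals.items(), key=lambda kv: kv[1][1], reverse=True
    ):
        ctr = clicks / views if views else 0.0
        report.append((url, views, clicks, round(ctr, 3)))
    return report

# Made-up example rows spanning a few days:
rows = [("/widgets", 100, 10), ("/about", 50, 1), ("/widgets", 100, 10)]
print(summarize_by_url(rows))
```

Even this crude view makes the post's point visible: a few pages usually get the bulk of the pageviews and the bulk of the clicks, and that tells you where to add content next.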
Keys to Understanding how to Profit from Content:
not all content is created equal
if your site already has plenty of authority that may carry new pages added to the site
If content production is nothing but a game of margins (and it is) then it makes sense to focus on the areas that are easy to get exposure in and are conducive to click happy site visitors.
Track Your Earnings at the Keyword Phrase Level:
So that is kinda how tracking your site performance on a per URL basis can help, but you can also track your ad clicks on a per search query basis. I had to scroll down through 9 screens to look at all the search queries that wound up leading to ad clicks on that site this month.
While there were hundreds of search queries that resulted in ad clicks, many of them contained one of about a half dozen common modifiers. The modifiers that were useful on certain pages of the site were also likely relevant to other sections of the site, or perhaps the entire site.
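Spotting that half dozen of common modifiers across hundreds of queries is a straightforward word-frequency exercise. A sketch, using invented query and modifier lists as stand-ins for a real referral log:

```python
from collections import Counter

def modifier_counts(queries, modifiers):
    """Count how many queries contain each candidate modifier word."""
    counts = Counter()
    for q in queries:
        words = set(q.lower().split())
        for m in modifiers:
            if m in words:
                counts[m] += 1
    return counts

# Made-up referral queries and candidate modifiers:
queries = ["cheap blue widgets", "buy widgets online", "cheap red widgets"]
print(modifier_counts(queries, ["cheap", "buy", "free"]))
```

In practice you might skip the predefined modifier list and just count every non-keyword word across the converting queries, then eyeball the top of the list for modifiers worth weaving into sitewide navigation and teaser copy.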
Manufacturing Sitewide Relevancy:
I recently modified the site by:
integrating relevant keyword-rich navigation, breaking large sections into smaller sections, and using subheaders near the navigational categories/options, and
integrating a descriptive bit of teaser content into the content on each page
Doing that makes it easy for every page on your site to naturally match many modifier rich queries that were previously only relevant to a few pages that may have only accidentally contained those modifiers that resulted in heavy traffic.
If it is done well it could double the productivity of your site without adding much to your risk profile. Good spam does not look like spam. The intent is hard to question if content development is done in a manner where the additional content and navigation elements look like they are useful and targeted at human consumption.
How Do I Track My AdSense Clickthrough Statistics? What are the Best Scripts to Use?
If you want to start tracking your AdSense clickthroughs I offer a free script to track AdSense via Google Analytics. If you don't mind splashing out $97 you may want to give AdSenseGold.com's AdSense tracker a try as it has a few more features and is well worth the price if you intend on making big bank from AdSense. As a warning though, I found the associated newsletter to be so hyped up and spammy that I cursed him out for it, but I think AdSense Tracker itself is an exceptionally useful tool well worth its price.
Microsoft AdCenter Labs offers a tool for Detecting Online Commercial Intention. It estimates the probability of a web page or search query being informational, commercial-informational, or commercial-transactional in nature. I think you have to use IE to use Microsoft's tool. Well, at least they are consistent with the stupidity of trying to make it hard for their good ideas to spread.
You can also use Yahoo! Mindset to see how page relevancy scores change as algorithms move from commercial to informational in nature. Google's current search algorithms are heavily biased toward older and informational resources.
Sufyan created a free tool which checks page similarity. You can set it to check sitewide on small sites, or enter in a couple URLs manually to cross check them for how similar the pages are to one another.
This tool doesn't test if a site has canonicalization issues, but it is plenty cool for free.
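Page similarity checks of this sort typically boil down to comparing word-frequency vectors. A minimal sketch of one common approach (cosine similarity over word counts) — an illustration of the general technique, not Sufyan's actual scoring method:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Score how similar two pages' text is, from 0.0 (disjoint) to 1.0 (identical word mix)."""
    a = Counter(text_a.lower().split())
    b = Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if not norm_a or not norm_b:
        return 0.0
    return dot / (norm_a * norm_b)

# Two made-up page snippets:
page_1 = "blue widgets for sale cheap blue widgets"
page_2 = "cheap blue widgets shipped fast"
print(round(cosine_similarity(page_1, page_2), 3))
```

Pages scoring very high against each other are candidates for consolidation or rewriting, since near-duplicate content tends to get filtered out of search results.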
update: link to seojunkie.com/2006/05/24/site-wide-duplicate-content-analyzer/ removed as it is now a domain lander page