URL Canonicalization: The Missing Manual

Oct 13th

Canonicalization can be a confusing area for webmasters, so let's take a look at what it is, and ways to avoid it causing problems.

What Is Canonicalization?

Canonicalization is the process by which URLs are standardized. For example, www.acme.com and www.acme.com/ are treated as the same page, even though the syntax of the URL is different.

Why Is Canonicalization An Issue For SEO?

Problems can occur when the search engine doesn't normalize URLs properly.

For example, a search engine might see http://www.acme.com and http://acme.com as different pages. In this instance, the search engine has the host names confused.

Why Is This a Problem?

If the search engine sees a page as being published at many separate URLs, it may rank your pages lower than they would otherwise, or not rank them at all.

Canonicalization issues can split link juice between pages if people link to variants of the URL. Not only does this affect rank (less PageRank = lower rank), but it can also affect crawl depth (if PageRank is spent on duplicate content it is not being spent getting other unique content indexed).

To appreciate what a dramatic effect canonicalization issues can have on search traffic, look at the following example. Notice that proper canonicalization increased traffic for that keyword by 300%:

                 Link Equity   Google Ranking Position   % of Search Traffic   Daily Traffic Volume   Traffic Increase
  Split URL 1    60%           8                         3%                    50                     -
  Split URL 2    40%           15 (filtered)             0%                    0                      -
  Canonical URL  100%          2                         12%                   200                    300%

What Conditions Can Cause This Problem?

There are various conditions, but the following are amongst the most common (a short code sketch follows the list):

  • Different host names, e.g. www.acme.com vs acme.com
  • Redirects pointing to different URLs, e.g. a 302 used inappropriately
  • Forwarding multiple URLs to the same content, and/or publishing the same content on multiple domains
  • Improperly configured dynamic URLs, e.g. any URL rewriting based on changing conditions
  • Two index pages appearing in the same location, e.g. index.htm vs index.html
  • Different protocols, e.g. https://www vs http://www
  • Multiple slashes in the file path, e.g. www.acme.com/ vs www.acme.com//
  • Scripts that generate alternate URLs for the same content, e.g. some blogging and forum software, or ecommerce software that adds tracking parameters
  • Port numbers in the host name, e.g. acme.com:4430 (can sometimes be seen in virtual hosting environments)
  • Capitalization, e.g. www.acme.com/Index.html vs www.acme.com/index.html
  • URLs "built" from the path you take to reach a page, e.g. tracking software may incorporate the click path in the URL for statistical purposes
  • Trailing question marks, with or without parameters, e.g. www.acme.com/? or www.acme.com/?source=cnn (a common tagging strategy amongst ad buys)
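To make this concrete, here is a minimal Python sketch that normalizes a few of the variations above (host case, the default port, the www form, and duplicate slashes). It is an illustration, not a complete policy - real canonicalization rules are site-specific:

import re
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    # Illustrative policy: lowercase the host, strip the default port,
    # standardize on the www form, and collapse duplicate slashes.
    scheme, netloc, path, query, fragment = urlsplit(url)
    netloc = netloc.lower()
    if scheme == "http" and netloc.endswith(":80"):
        netloc = netloc[:-3]                       # :80 is implied for http
    if not netloc.startswith("www."):
        netloc = "www." + netloc                   # pick one host form, stick to it
    path = re.sub(r"/{2,}", "/", path) or "/"      # collapse multiple slashes
    return urlunsplit((scheme, netloc, path, query, fragment))

print(normalize("HTTP://Acme.com:80//index.html?"))
# -> http://www.acme.com/index.html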

How Can I Tell If Canonicalization Issues Are Affecting My Site?

Besides working through the checklist above and performing manual checks, you can also use Google's cache date.

Previously, you would have been able to use Google's supplemental index marker, although Google have recently done away with this feature.

The supplemental index is a secondary index, separate from Google's main index. It is a graveyard, of sorts, containing outdated pages, pages with low trust scores, duplicate content, and other erroneous pages. As duplicate pages often reside in the supplemental index, appearing there can be an indicator that you have canonicalization issues, all else being equal.

Before Google removed the supplemental index label, many SEOs noticed that supplemental pages had an old cache date and that cache date is a good proxy for trust. If your page is not indexed frequently, and you think it should be, chances are the page is residing in the supplemental index.

Michael Gray at Wolf-Howl outlines a method to easily check for this data. In summary, you add a date and a unique field to each page, wait a couple of months, then search on this term.

How Can I Avoid Canonicalization Issues?

Good Site Planning

Using good site planning and architecture, from the start, can save you a lot of problems later on. Pick a convention for linking, and stick with it.

Maintain Consistent Linking Conventions

It's an important point, so I'll repeat it ;) Always link to www.acme.com, rather than sometimes linking to acme.com/index.htm, and sometimes linking to www.acme.com.

301 Redirect Non-www To www, Or Vice Versa

You can force resolution to one URL only. To do this, you create a 301 redirect.

Here's a typical 301 redirect script:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^seobook.com [NC]
RewriteRule ^(.*)$ http://www.seobook.com/$1 [L,R=301]

For a more detailed analysis on how to use redirects, see .htaccess, 301 Redirects & SEO.
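If you want to confirm the redirect behaves as expected, you can fetch the bare-host URL and inspect the response. Here is a quick sketch in Python, using the requests library and the same domain as the example above:

import requests

# Fetch the bare-host URL without following redirects, so we can see the
# exact status code and Location header the server returns.
resp = requests.get("http://seobook.com/", allow_redirects=False)
print(resp.status_code)               # expect 301
print(resp.headers.get("Location"))   # expect http://www.seobook.com/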

Use The Website Health Check Tool

This tool, and its accompanying video, show you how to spot a number of site architecture problems, including canonicalization issues.

Download the tool, check the www vs non-www option box, and hit the Analyze button.

If you have a large site, you may not be able to surface all the canonicalization issues using the default tool settings. You may need to use the date-based filter options to get a deep view of recently indexed pages. Many canonicalization issues occur sitewide, so looking deeply at new pages should help you detect problems.

Another free, but far more time-consuming, option is to use the date-based filters on Google's advanced search page.

Workaround For Https://

Sometimes Google will index both the http:// and the https:// versions of a site.

One way around this is to tell the bots not to index the https:// version.

Tony Spencer outlines two ways to do this in .htaccess, 301 Redirects & SEO. One is to cloak the robots.txt file, the other is to create a conditional php script.
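Tony's examples use .htaccess and PHP. Purely to illustrate the same idea, here is a rough equivalent sketched in Python with Flask, serving a blocking robots.txt only when the request arrives over https://. Treat it as a sketch, not a drop-in solution:

from flask import Flask, Response, request

app = Flask(__name__)

@app.route("/robots.txt")
def robots():
    # Behind a proxy or load balancer you may need to check the
    # X-Forwarded-Proto header instead of request.scheme.
    if request.scheme == "https":
        body = "User-agent: *\nDisallow: /\n"   # keep bots off the https duplicate
    else:
        body = "User-agent: *\nDisallow:\n"     # allow crawling of the canonical site
    return Response(body, mimetype="text/plain")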

Use Absolute, As Opposed To Relative Links

An absolute link specifies the exact location of a file on a webserver. For example, http://www.acme.com/filename.html

A relative link is, as the name suggests, relative to a page's location on the server.

A relative link looks like this:
"/directory/filename.htm"

There are various issues to consider, not related to canonicalization, when deciding which format to use. These include page download speed, server access times, and design conventions. The point to remember is to remain consistent. Absolute links tend to make this easier, as there is only ever one URL format for a file, regardless of context.
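To see why inconsistency matters, note that the same relative link resolves against whichever URL variant the page was reached under. A quick Python illustration:

from urllib.parse import urljoin

# The same relative href resolves against whichever URL variant the page
# was fetched under, so inconsistent entry URLs propagate sitewide.
for base in ("http://www.acme.com/shop/", "http://acme.com/shop/"):
    print(urljoin(base, "filename.html"))
# http://www.acme.com/shop/filename.html
# http://acme.com/shop/filename.html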

Don't Link To Multiple Versions Of The Page

In some cases, you may intend to have duplicate content on your site.

For example, some software, such as blog and forum software, aggregates posts into archives. Always link to the original version of the post, as opposed to the archive or any other location, i.e. www.acme.com/todays-post.htm, not www.acme.com/archive/december/todays-post.htm.

If your software program links to a duplicate version of the content (like an individual post from a forum thread) consider adding rel=nofollow to those links.

Use 301s, not 302s On Internal Affiliate Redirects

A 301 redirect is a permanent redirect, which indicates a page has been moved permanently. 301s typically pass PageRank, and do not cause canonicalization issues.

A 302 redirect is a temporary redirect. If you use 302s the wrong page may rank. Google's Matt Cutts claims they are trying to fix the problem:

we’ve changed our heuristics to make showing the source url for 302 redirects much more rare. We are moving to a framework for handling redirects in which we will almost always show the destination url. Yahoo handles 302 redirects by usually showing the destination url, and we are in the middle of transitioning to a similar set of heuristics. Note that Yahoo reserves the right to have exceptions on redirect handling, and Google does too. Based on our analysis, we will show the source url for a 302 redirect less than half a percent of the time (basically, when we have strong reason to think the source url is correct)

But if you use 302s on affiliate links, the affiliate page may rank in the search results, as shown in the below SnapNames search. This, in turn, would credit the affiliate with a commission anytime someone buys through that link in the search results, effectively cutting the margins of the end merchant.
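If you want to audit your own internal redirects, you can request each one without following it and flag anything that is not a 301. A minimal sketch in Python (the URLs are placeholders):

import requests

# Hypothetical internal redirect URLs to audit; substitute your own.
redirect_urls = [
    "http://www.acme.com/go/partner-1",
    "http://www.acme.com/go/partner-2",
]

for url in redirect_urls:
    resp = requests.get(url, allow_redirects=False)
    if resp.status_code == 302:
        print(f"{url}: 302 (temporary) - consider a 301 instead")
    else:
        print(f"{url}: {resp.status_code}")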

Specify Preferred URLs In Google Webmaster Tools

Google Webmaster Tools provides an area where you can specify which version of a URL, i.e. http://www.acme.com or http://acme.com, Google should use.

Note: It is important not to use the remove URL tool to try and fix these domain issues. Doing so may result in your entire domain, as opposed to one page, being removed from the index.


Social Interaction & Advertising Are The Modern Day Search Engine Submission & Link Building

Oct 8th

Years ago (well before I was an SEO, or knew what SEO was) search engine submission was a huge phrase. Only recently has search engine marketing replaced search engine submission in popularity.

Search engine submission was a big part of the optimization game when search relevancy algorithms were heavily reliant on meta tags and on-the-page content. As search got polluted with on-the-page spam, you needed to do more than submit to compete for coveted, valuable phrases; you had to build signals of trust from other sites. Link building was a requirement.

Many of the links that you could easily "build" have effectively disappeared from the web, through the use of nofollow and Google editing the PageRank of many (perhaps most) web directories. Recently Google removed their recommendations for directory submission and link building when these two points disappeared from their guidelines:

  • Have other relevant sites link to yours.
  • Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.

Might their reliance on directories be waning?

Absolutely.

Each additional link created and each additional web page published make Google smarter.

The web is a social network and search engines follow people. Once you think of the web from that perspective you have a HUGE advantage over competitors who are "building" one link at a time.

Google wants those who are well connected (and those who can afford to advertise) to succeed. Thus the evolution of SEO looks like...

  • search engine submission
  • on page optimization
  • link "building"
  • advertising, branding, viral marketing, public relations, & social interaction

Getting the basics right (keyword research, site structure, on-page optimization) helps make everything else you do more effective. But with each day that passes you need a bit more (economic and/or social) capital to compete. What social interactions are built into your site? Why should bloggers write about your business?

Align Your SEO Strategy With Site Structure

Sep 30th

I'd like to take a look at an area often overlooked in SEO.

Site architecture.

Site architecture is important for SEO for three main reasons:

  • To focus on the most important keyword terms
  • To control the flow of link equity around the site
  • To ensure spiders can crawl the site

Simple, eh? Yet many webmasters get it wrong.

Let's take a look at how to do it properly.

Evaluate The Competition

Once you've decided on your message and your plan, the next step is to lay out your site structure.

Start by evaluating your competition. Grab your list of keyword terms, and search for the most popular sites listed under those terms. Take a look at their navigation. What topic areas do they use for their main navigation scheme? Do they use secondary navigation? Are there similarities in topic areas across competitor sites?

Open a spreadsheet, list their categories and title tags, and look for keyword patterns. You'll soon see similarities. By evaluating the navigation used by your competition, you'll get a good feel for the tried-and-true "money" topics.

You can then run these sites through metrics sites like Compete.com.

Use the most common, heavily trafficked areas as your core navigation sections.

The Home Page Advantage

Those who know how PageRank functions can skip this section.

Your home page will almost certainly have the highest level of authority.

While there are a lot of debates about the merits of PageRank when it comes to ranking, it is fair to say that PageRank is a rough indicator of a page's level of authority. Pages with more authority are spidered more frequently and enjoy higher rankings than pages with lower authority. The home page is often the page with the most links pointing to it, so it typically has the highest level of authority. Authority passes from one page to the next.

For each link off a page, the authority level will be split.

For example - and I'm simplifying* greatly for the purposes of illustration - if you have a home page with ten units of link juice, two links to two sub-pages would see each sub-page receive five units of link juice. If each sub-page has two links, each sub-sub-page would receive 2.5 units, and so on.

The important point to understand is that the further your pages are away from the home page, generally the less link juice those pages will have, unless they are linked from external pages. This is why you need to think carefully about site structure.

For SEO purposes, try to keep your money areas close to the home page.

*Note: Those who know how PageRank functions will realise my explanation above is not technically correct. The way PageRank splits is more sophisticated than that given in my illustration. For those who want a more technical breakdown of the PageRank calculations, check out Phil's post at WebWorkshop.
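For what it's worth, the toy arithmetic of my simplified illustration looks like this in Python (again, this is not the real PageRank formula - no damping factor, no iteration):

# Toy arithmetic for the simplified model above.
def split_juice(juice, outbound_links):
    return juice / outbound_links

home = 10.0
sub = split_juice(home, 2)       # each of two sub-pages gets 5.0
sub_sub = split_juice(sub, 2)    # each sub-sub-page gets 2.5
print(home, sub, sub_sub)        # 10.0 5.0 2.5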

How Deep Do I Go?

Keeping your site structure shallow is a good rule of thumb. So long as your main page is linked well, all your internal pages will have sufficient authority to be crawled regularly. You also achieve clarity and focus.

A shallow site structure is not just about facilitating crawling. After all, you could just create a Google Sitemap and achieve the same goal. Site structure is also about selectively passing authority to your money pages, and not wasting it on pages less deserving. This is straightforward with a small site, but the problem gets more challenging as your site grows.

One way to manage scale is by grouping your keyword terms into primary and secondary navigation.

Main & Secondary Navigation

Main navigation is where you place your core topics, i.e. the most common, highly trafficked topics you found when you performed your competitive analysis. Typically, people use tabs across the top, or a list down the left-hand side of the screen. Main navigation appears on every page.

Secondary navigation consists of all other links, such as latest post, related articles, etc. Secondary navigation does not appear on every page, but is related to the core page upon which it appears.

One way to split navigation is to organize your core areas into the main navigation tabs across the top, and provide secondary navigation down the side.

For example, let's say your main navigation layout looked like this:

Each time I click a main navigation term, the secondary navigation down the left-hand side changes. The secondary navigation consists of keywords related to the core area.

For those of you who are members, Aaron has an in-depth video demonstration on Site Architecture And Internal Linking, as well as instruction on how to integrate and manage keywords.

Make Navigation Usable

Various studies indicate that humans are easily confused when presented with more than seven choices. Keep this in mind when creating your core navigation areas.

If you offer more than seven choices, find ways to break things down further. For example, by year, manufacturer, model, classification, etc.

You can also break these areas down with an "eye break" between each. Here's a good example of this technique on Chocolate.com:

Search spiders, on the other hand, aren't confused by multiple choices. Secondary navigation, which includes links within the body copy, provides plenty of opportunity to place keywords in links. Good for usability, too.

As your site grows, new content is linked to by secondary navigation. The key is to continually monitor which content produces the most money/visitor response. Elevate successful topics higher up your navigation tree, and relegate loss-making topics.

Use your analytics package to do this. In most packages, you can get breakdowns of the most popular, and least popular, pages. Organise this list by "most popular". Your most popular pages should be at the top of your navigation tree. You also need to consider your business objectives. Your money pages might not be the same pages as your most popular pages, so it's also a good idea to set up funnel tracking to ensure the pages you're elevating also align with your business goals.
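As a back-of-the-envelope sketch in Python (the page names and counts are invented; in practice they come from your analytics export):

# Made-up page view counts; export these from your analytics package.
# Pick the top performers for main navigation, capped at seven choices.
page_views = {
    "/widgets": 5400, "/gadgets": 4100, "/deals": 3900,
    "/blog": 2500, "/faq": 900, "/about": 300, "/contact": 150,
}
main_nav = sorted(page_views, key=page_views.get, reverse=True)[:7]
print(main_nav)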

If a page is ranking well for a term, and that page is getting good results, you might want to consider adding a second page targeting the same term. Google may then group the pages together, effectively giving you listings #1 and #2.

Subject Theming

A variant on main and secondary navigation is subject theming.

Theming is a controversial topic in SEO. The assumption is that the search engines will try to determine the general theme of your site; therefore you should keep all your pages based around a central theme.

The theory goes that you can find out which words Google places in the same "theme" by using the tilde (~) operator in Google. For example, if you search on ~cars, you'll see "automobile", "auto", "bmw" and other related terms highlighted in the SERP results. You use these terms as headings for pages in your site.

However, many people feel that themes do not work, because search engines return individual pages, not sites. Therefore, it follows that the topic of other pages on the site aren't directly attributable to the ranking of an individual page.

Without getting into a debate about the existence or non-existence of theme evaluation in the algorithm, theming is a great way to conceptually organize your site and research keywords.

Establish a central theme, then create a list of sub-topics made up of related (~) terms. Make sub-topics of sub-topics. Eventually, your site resembles a pyramid structure. Each sub-topic is organized into a directory folder, which naturally "loads" keywords into URL strings, breadcrumb trails, etc. The entire site is made up of keywords related to the main theme.

Bruce Clay provides a good overview of subject theming.

Bleeding PageRank?

You might also wish to balance the number of outgoing links with the number of internal links. Some people are concerned about this aspect, i.e. so-called "bleeding PageRank". A page doesn't lose PageRank because you link out, but linking out does affect the amount of PageRank available to pass to other pages. This is also known as link equity.

It is good to be aware of this, but don't let it dictate your course of action too much. Remember, outbound linking is a potential advertisement for your site, in the form of referral data in someone else's logs. A good rule of thumb is to balance the number of internal links with the number of external links. Personally, I ignore this aspect of SEO site construction and instead focus on providing visitor value.

Link Equity & No Follow

Another way to control the link equity that flows around your site is to use the nofollow attribute. For example, check out the navigational links at the bottom of the page:

As these target pages aren't important in terms of ranking, you could nofollow these links to ensure your main links have more link equity to pass to other pages.

Re-Focus On The Most Important Content

This might sound like sacrilege, but it can often pay not to let search engines display all the pages in your site.

Let's say you have twenty pages, all titled "Acme". Links containing the keyword term "Acme" point to various pages. What does the algorithm do when faced with these pages? It doesn't display all of them for the keyword term "Acme". It chooses the one page it considers most worthy, and displays that.

Rather than leave it all to the algorithm, it often pays to pick the single most relevant page you want to rank, and 301 all the other similarly-themed pages to point to it. Here are some instructions on how to 301 pages.

By doing this, you focus link equity on the most important page, rather than splitting it across multiple pages.
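Here is a sketch of the idea with illustrative paths; how you actually serve the 301 (.htaccess, your web framework) depends on your stack:

# Illustrative 301 map: each similarly-themed URL points to the one page
# you want to rank.
REDIRECTS = {
    "/acme-review.htm": "/acme.htm",
    "/old/acme.html": "/acme.htm",
    "/acme2.htm": "/acme.htm",
}

def handle(path):
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect to the canonical page
    return 200, path

print(handle("/acme-review.htm"))     # (301, '/acme.htm')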

Create Cross Referenced Navigational Structures

Aaron has a good tip regarding cross-referencing within the secondary page body text. I'll repeat it here for good measure:

This idea may sound a bit complex until you visualize it as a keyword chart with an x and y axis.

Imagine that a, b, c, ... z are all good keywords.
Imagine that 1, 2, 3, ... 10 are all good keywords.

If you have a page on each subject, consider placing the navigation for a through z in the sidebar while using links and brief descriptions for 1 through 10 as the content of the page. If people search for d7 or b9, that cross-referencing page will be relevant for it, and if it is done well it does not look too spammy. Since these types of pages can spread link equity across so many pages of different categories, make sure they are linked to well, high up in the site's structure. These pages work especially well for categorized content cross-referenced by locations.
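A tiny Python sketch of how such cross-referenced pages might be generated (the categories and locations are made up):

# Every (category, location) pair gets its own page, so a search like
# "plumbers boston" has an exact-match page.
categories = ["plumbers", "electricians", "builders"]
locations = ["boston", "austin", "denver"]

for category in categories:
    for location in locations:
        url = f"/{category}/{location}/"
        title = f"{category.title()} in {location.title()}"
        print(url, "->", title)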


Search Engine Optimization - Evolution or Extinction?

Sep 21st

The following is a guest blog post by Jeremy L. Knauff from Wildfire Marketing Group, highlighting many of the recent changes to the field of SEO.

Marketing is constantly evolving and no form of marketing has evolved more over the last ten years than search engine optimization. That fact isn’t going to change anytime soon. In fact, the entire search engine optimization industry is headed for a major paradigm shift over the next twelve months. Like many of the major algorithm updates in the past, some people will be prepared while some will sit teary-eyed amongst their devastation wondering what happened and scrambling to pick up the pieces. Unlike the major algorithm updates of the past, you won’t be able to simply fix the flaws in your search engine optimization and jump back to the top of the SERPs.

Why is this change going to be so different? In the past, the search engines have incrementally updated certain aspects of their algorithms to improve the quality of their SERPs, for example, eliminating the positive effect of Meta tag keyword stuffing which was being abused by spammers. Anyone who has been in the SEO industry for more than a few years probably remembers the chaos and panic when the major search engines stopped ranking websites based on this approach. This time around though, we’re looking at something much more significant than simply updating an algorithm to favor particular factors or discount others. We are looking at not only a completely new way for search engines to assign value to web pages, but more importantly, a new way for search engines to function.

Local search

A number one ranking for a particular keyword phrase was once the end-all, be-all goal, but now many searches are regionalized to show the most relevant web pages located in the area you are searching from. While this will probably reduce your traffic, the traffic that you now receive will be more targeted in many cases. Additionally, it gives smaller websites a more equal chance to compete.

Google suggest

This August, Google Suggest was moved from Google Labs to the homepage, offering real-time suggestions based on the letters you’ve typed into the search box so far. This can be an incredibly helpful feature for users. At the same time, it can be potentially devastating to websites that rely on long-tail traffic, because once a user sees a keyword phrase that seems like at least a mediocre choice they will usually click on it rather than continuing to type a more specific keyword phrase.

Devaluation of paid links

Google’s recent attempt to eliminate paid links has scared a lot of people on both sides of the link buying equation into implementing the “nofollow” tag. In the midst of this hypocritical nonsense, Google has also been taking great measures to devalue links based on quantifiable criteria, such as the “C” class of the originating IP, similarities in anchor text and/or surrounding text, the location of the link on the page, and the authority of the domain the link is from, to name a few. Regardless of the effectiveness of any search engine’s ability to evaluate and subsequently devalue paid links, the fear of getting caught and possibly penalized is more than enough to deter a lot of people from buying or selling links.

Visitor usage data

Again, Google is leading the charge on this one. Between their analytics, toolbar and web browser, they are collecting an enormous amount of data on visitor usage. When a visitor arrives at a website, Google knows how long they stayed there, how many pages they accessed, which links they followed and much more. With this data, a search engine can determine the quality of a website, which is beginning to carry more weight in regards to ranking than some of the more easily manipulated factors such as keyword density or inbound links. This puts the focus on content quality instead of content quantity and, over time, will begin to knock many of the “me too” websites further down the SERPs, or out of the picture altogether. The websites that will prosper will be those that produce relevant, original content that their visitors find useful.

TrustRank

Simply pointing a vast number of links with a particular keyword phrase in the anchor text at a website was once a quick and easy way to assure top ranking. The effectiveness of this approach is diminishing and will continue in that direction as a result of TrustRank. In a nutshell, a particular set of websites are chosen (by Google) based on their editorial quality and prominence on the Internet. Then Google analyzes the outbound links from these sites, the outbound links from the sites linked to by these sites, and so on down the chain. The sites that are further up the chain carry more trust and those further down the chain, less trust. Links from sites with more TrustRank, those further up the chain, have a greater impact on ranking than links from websites further down the chain. On one hand, this makes it difficult for new websites to improve their position in the SERPs compared to established websites; on the other hand, it helps to eliminate many of the redundant websites out there that are just repeating what everyone else is saying.

Google Chrome

Utilizing a combination of visitor usage data and a not so gentle nudge in Google’s direction, Google Chrome is set to change the way search engines gather data and present it to users. For example, when a user begins typing in the address bar of the browser, they are presented with a dropdown list of suggestions containing choices consisting of the #1 result in Google’s SERPs, related search terms and other pages you’ve recently visited. This gives a serious advantage to the websites that hold top ranking in Google and at the same time, gives a serious advantage to Google by giving their Internet real estate even more exposure than ever before.
So the question remains, is search engine optimization facing evolution or extinction? Certainly not extinction, not by a long shot, but in a short period of time it is going to be drastically different than it is today. The focus will soon be on producing a valuable and enjoyable user experience rather than just achieving top ranking, which is what it should have been all along.

Is Your Search Result Sexy?

Sep 18th

Title Tags As Ads

Do your tags scream "Click Me"?

Following on from my post yesterday, How To Craft Kick-Ass Title Tags & Headlines, let's look at meta tags as advertisements, and why you need to think carefully about your offer, and the offers of your competition, when you craft your tags.

Why Are Title Tags Important?

Ranking debates aside, the main reason title tags are important is that they are displayed, in bold, in the SERPs.

A SERP is a list of 20+ links, all clamoring for the visitor's click. It is therefore important to entice visitors to click on your listing, rather than everyone else's. Sometimes you achieve this by rank placement alone, but with well-crafted tags, you stand a better chance of receiving that click.

What Is The Optimal Length For A Title Tag?

The W3C recommends the title tag should be less than 64 characters long.

Some SEOs think that long, keyword-loaded tags are the best approach. Others think short, punchy tags are best, as long tags may dilute the weight of the keyword phrase, and there is less risk of Google cutting off your message midstream.

Because other factors play a more significant role in terms of rank, I ignore prescriptive tag lengths. Instead, I look to optimize the message in line with the business goals of a site.

Know Your Enemy

This is a proven Adwords strategy which also dovetails nicely into SEO.

The first step is to evaluate your surrounding competition.

Look at the wording of the most successful AdWords ad for your chosen keyword term. Your aim is to replicate success. Run an AdWords campaign and experiment with the wording to find the combination that receives the most clicks and subsequent desired actions. You then craft your title tags and description tags to match. What works for AdWords works in the main SERPs, too.

Another way to approach title tags is to constantly rotate the tags using a script, and monitor the results. This is a split-run approach known as keyword spinning. You keep the winners and cut the losers. This approach is described in my post "Tested Advertising Strategies Respun For SEO".
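A bare-bones sketch of the rotation idea in Python (the variants are invented; a real implementation would also log clicks so you can compute click-through rates):

import random
from collections import Counter

# Candidate title variants to rotate; wording is purely illustrative.
variants = [
    "High Speed Routers - Half Price Sale",
    "High Speed Routers - Free Overnight Shipping",
    "Buy High Speed Routers Online",
]
impressions = Counter()

def pick_title():
    # Serve a random variant and tally it, so click-through rates can be
    # compared against impressions later.
    title = random.choice(variants)
    impressions[title] += 1
    return title

print(pick_title())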

What Are The Ideal Lengths For Meta Description Tags?

Common SEO wisdom dictates the description tag should be around 160 characters long.

Again, my approach is to take prescriptive lengths with a grain of salt. Instead, focus on marketing and business goals.

The description tag doesn't have any ranking benefit, but it can be used to encourage people to click on your listing. Evaluate the surrounding competition, run tests using phrase variations, and make your description tag enticing. Also keep in mind that Google is more likely to use your description tag when the exact search query appears within it.
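If you want a quick sanity check on your tags, something like this Python snippet will do; the 64 and 160 character figures are just the rules of thumb quoted above, not hard limits:

def check_tags(title, description):
    # Flag tags that exceed the rule-of-thumb display lengths.
    warnings = []
    if len(title) > 64:
        warnings.append(f"title is {len(title)} chars; may be cut off in the SERPs")
    if len(description) > 160:
        warnings.append(f"description is {len(description)} chars; may be truncated")
    return warnings

print(check_tags(
    "High Speed Routers - How To Get Routers At Half Price",
    "Compare high speed routers and save up to 50%. Free overnight delivery.",
))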

Examples Of Title And Description Tags

This is how it should be done:

The title and description are clear and descriptive. There is a call to action and an appeal to self-interest.

This is a jumble:

The title and descriptions are confused. It is not clear what the benefit is to the visitor.

Google's Quirks

One problem is that Google sometimes uses a snippet of its own choosing; Google may also use a DMOZ description.

Google will use the snippet when it finds no description tag, or determines the description tag you provided is inappropriate. To improve the chances your meta description tag will be used, see Google's guide, "Improve Snippets With A Meta Description Makeover". Essentially, you need to make your meta description tag descriptive, as opposed to a series of keywords.

You can prevent search engines from using the DMOZ description using the following meta tag:

<META NAME="ROBOTS" CONTENT="NOODP">

See Google's Webmaster Guideline: "Changing your site's title and description in search results".

Get Into The Mind Of The Searcher

An important part of positioning an offer is to know what's on the searcher's mind.

In some cases, the keyword query will contain this information. For example, the intent behind "Buy X Online Overnight Delivery" is self-evident; however, the majority of searches are not transactional.

According to a Penn State research study, the breakdown of searches is as follows:

  • 80% Of Searches Are Informational
  • 10% Of Searches Are Navigational
  • 10% Of Searches Are Transactional

Definitions:

  • Informational queries are meant to obtain data or information in order to address an informational need, desire, or curiosity.
  • Navigational queries are looking for a specific URL.
  • Transactional queries are looking for resources that require another step to be useful.

Query classifications can be broken down further into the following sub-categories:

  • Directed: A specific question, e.g. "Registering a domain name".
  • Undirected: Tell me everything about a topic, e.g. "Singers in the 80s".
  • List of candidates: e.g. "Things to do in Hollywood".
  • Find: Locate where some real-world service or product can be obtained, e.g. "PVC suit".
  • Advice: Advice, ideas, suggestions, instructions, e.g. "What to serve with roast pork tenderloin".
  • Navigation to transactional: The URL the user wants is a transactional site, e.g. "match.com".
  • Navigation to informational: The URL the user wants is informational, e.g. "google.com".
  • Obtain: Obtain a specific resource or object, e.g. "Music lyrics".
  • Download: Find a file to download, e.g. "mp3 downloads".
  • Results page: Obtain a resource that one can print, save, or read from the search engine results page, i.e. the user enters a query with the expectation that the answer will be on the results page itself, without browsing to another website.
  • Interact: Interact with a program or resource on another website, e.g. "buy table clock".

And further by sub-category type:

  • Closed: Deals with one topic; a question with one, unambiguous answer, e.g. "Nine supreme court justices".
  • Open: Deals with two or more topics, e.g. "excretory system of arachnids".
  • Online: The resource will be obtained online, e.g. "Things to do in Hollywood".
  • Off-line: The resource will be obtained off-line and may require additional actions by the user, e.g. "Airline seat map".
  • Free: The downloadable file is free, e.g. "Full metal alchemist wallpapers free".
  • Not free: The downloadable file is not necessarily free, e.g. "family guy episode".
  • Links: The resource appears in the title, summary, or URL of one or more of the results on the search engine results page.
  • Other: The resource does not appear in one of the results, but somewhere else on the search engine results page.

Source: "Determining the informational, navigational,and transactional intent of Web queries" Bernard J. Jansen, Danielle L. Booth, Amanda Spink; Pennsylvania State University

Google have teams devoted to this very function, and this type of classification will feed through into their algorithms.

When crafting your tags, think about what classification of query the searcher is undertaking. How would they structure it? What terms would they use? Would they phrase their query as a question? What words would they include? What words would they omit? Dig deep into your keyword research tools and web logs to find this data.

Think about their mindset. Words like research and compare help you tap into people in research mode, whereas words like buy, save, coupons, and free shipping attract people ready to buy.

A Call To Action

The title tag and description provide opportunities to include calls to action. A call to action is a phrase that prompts the visitor to take the next step along the sales process.

The keyword term you've selected might give you a clue as to what point of the sales process the visitor is at. Obviously, "Buy X Online Overnight Delivery" tends to indicate a visitor is about to hand over the cash, so you draft your title tag and description accordingly in order to help close the deal.

However, most keyword terms aren't this overt. This is where you need to think about the type of offer you present.

How To Decide Between A Hard Offer And A Soft Offer

Some of the most effective offers are seldom "reasons to buy", but rather "reasons to respond." This is the difference between a hard and soft offer.

The vast majority of searchers are not ready to buy, so by using a soft offer, you stand to capture a greater number of leads than you would if you just made a hard "buy right now!" offer. If all you've got is a hard offer, then visitors who aren't ready to buy will click back, or won't select your SERP result at all.

Opportunity lost.

Instead, encourage the visitor to take a relatively painless action, such as joining a mailing list, or downloading a free case study.

You can take this a step further by using the case study title to find out more about your visitors. For example, a case study entitled "Real Estate" won't tell you much about the problem your visitor is trying to solve, but a descriptive title, such as "Seven Ways To Sell Your Own Home", will. If they download the latter, and your service solves this problem for people, you're one step closer to making the sale.

Benefits Of The Soft Offer

  • You'll generate more leads
  • You have the opportunity to enter a dialogue with the visitor, thus moving them through the process

Only you'll know whether a hard offer or a soft offer is most appropriate. But think carefully about the nature of your offer when crafting your titles and descriptions. Is your offer exactly the same as every other offer in the SERP? Or could you tweak your offer to make it stand out from the rest? Your offer should be more enticing than every other offer on the page. Try to get this across in your title and description.

Related Reading & Tools

How To Craft Kick-Ass Title Tags & Headlines

Headlines

One old-skool marketing technique that will always hold true is the value of the catchy headline.

The headline, given its power to convey meaning quickly, is more important than ever. Attention spans are limited. Media messages flood the channels. We're busy. The function of the headline is to grab our attention and pull us deeper into the message.

Many books have been written on how to craft great headlines. I'm going to quote from the advertisers' bible on the topic, Tested Advertising Methods by John Caples. Caples identifies three main classes of successful headlines.

The Three Main Classes Of Successful Headlines

  • Self-Interest: The best headlines are those that appeal to self-interest. They offer readers benefits that they want, and can get from you. For example, RETIRE AT 30.
  • News: Humans are predisposed to seek out what is new and different in their environment. For example, NEW, CHEAPER IPHONE CALL PLANS RELEASED.
  • Curiosity: Appeal to our curious nature. For example, LOST: $1 BILLION DOLLARS.

Of the three, by far the most effective headline in advertising is the self-interest headline. Our self-interest usually trumps our curiosity and our appetite for news, especially when time is short.

Compare these two headlines:

PUT UP OR SHUT UP

FIVE TOTALLY NEW WAYS TO GET TOP RANKING IN GOOGLE

The first says nothing that appeals to our self interest. We don't even know what it is about. But you'd be hard pressed not to click on the second headline. The self-interest is just too strong. This is why the second form is used so often in link-baiting and social media. It screams for attention, and then makes a strong appeal to self-interest.

There is a downside to such headlines, however. Modern audiences have become jaded and cynical, especially where marketing messages are concerned. Overplay the benefit, and you'll come off as a shark. Link-baiting, a useful SEO tactic, has developed a bad reputation through overuse of this approach.

Eventually, people tune out.

Get Your Tone Right

We can twist the overused appeal-to-self-interest headline strategy slightly to make it work for us. The key to getting the appeal to self-interest right is to get the tone right. Understand both the audience's desires and the tone of "voice" they respond to.

For example, look at Digg. A cynic might argue that a surefire way to get on the front page of Digg is to write a headline that includes the following subject matter, and do so using an irreverent tone:

  • Criticism of Bush
  • Anything about Digg itself
  • Pumping Linux
  • Dumping DRM
  • Some crazy-weird activity from a country no-one has ever heard of :)

By the way, if anyone can come up with a headline that includes one of those elements, feel free to add it to the comments :)

The headline needs to be crafted in such a way as to appeal to Digg's demographic, which is mostly young, tech-savvy males. This demographic tends to respond to a tone that is cynical, flippant and irreverent. Get that tone wrong - i.e. play it too straight, or too advertorial - and it doesn't matter how strong the self-interest angle is; it's unlikely to work.

How To Use Headline Strategy In SEO

SEO has an additional challenge.

For SEO to work well, the headline, which is often also used as the title tag, should include a keyword term. Many studies have shown that a SERP listing or AdWords ad that includes the keyword term receives more clicks. In order to get the headline strategy to work for SEO, try amalgamating the keyword term with one of the three formats.

For example, where the keyword term is "high speed routers", try:

  • High Speed Routers - How To Get Routers At Half Price (appeal to self-interest)
  • High Speed Routers - Latest Features To Insist On (news, with a hint of self-interest)
  • High Speed Routers - How We Blew Our Budget (curiosity)

Even if you're not #1 in the SERPs for that term, you're more likely to attract a click than the guy who simply uses "High Speed Routers" by itself.

Your headline (i.e. the title tag) competes with at least ten other SERP listings on the page, along with various AdWords listings along the top and down the side. The top three SERP positions are gold, but if you can add a touch of appeal to self-interest, or news, or curiosity, you'll up your chances of getting the click.

If you want to go one step further with this tactic, use it as a way to segment visitors. The first example I gave is likely to attract those people who are ready to buy, and who are buying on price.

You then need to include your title as a heading on the page, which confirms to the visitor that their click has got them where they wanted to be. They're now far more likely to read beyond the headline.


Black Hole SEO

Sep 11th


There is a black hole forming.

A few of them, actually.

These black holes aren't the result of the Large Hadron Collider at CERN. They are forming for two reasons: the desire to keep people on site longer, and the desire to hoard link juice in order to dominate the SERPs.

Increasingly, top-tier sites are becoming cagey about linking out. They are more than happy to be linked to, of course, but often the favor is not reciprocated. Check out this post by SEOBlackhat.

What Does A Black Hole Look Like?

  • Uber-black hole, The New York Times, seems reluctant to link to anyone but themselves. This is especially annoying when they write about websites.
  • Wikipedia no-followed their links some time ago, thus forming a PageRank variant of the black hole.
  • The mini-me black hole, as practiced by TechCrunch. Rather than directing you to a site mentioned in an article, TechCrunch would direct you to their own CrunchBase entry instead, thereby keeping you on-site longer, and passing link authority to their own web pages. As a result, a search on Google for a sites' name may well bring up the CrunchBase entry. To be fair, TechCrunch does also link out, and there is an explanation as to why TechCrunch aren't as bad as the New York Times here.

The result is a link-love black hole. Sites using such a strategy can dominate the rankings, if they are big enough.

So if you wanted to create a blackhole, what would you do?

  • Don't link to anyone
  • If you must link out, then No-Follow the links, or wrap them in scripts
  • Direct PageRank around your own site, especially to pages featuring your competitors' names
  • Buy a motherlode of links
  • Become a newspaper magnate :)

Now, if you're an SEO, you might be feeling a tad conflicted about now. Why wouldn't every SEO do this? What if you owned a black hole? Isn't that the ultimate SEO end game?

In the long term, I doubt it.

If this problem becomes too widespread, Google will move to counter it. If Google's results aren't sufficiently diversified, then their index will look stale. If you search for a site, and get third party information about that site, rather than the site itself, then this will annoy users. Once confidence is lost in the search results, then users will start to migrate to Google's competitors.

I'm not certain such a move will be entirely altruistic, however. After all, what is the point of Knol? No, really - what is the point of Knol? ;)

The Advantages Of Sharing The Love

Consider what you gain by linking out.

  • Webmasters look at their referrals, and may follow the link back to check out your site
  • Outbounds may count for more in future, if they don't already
  • Your users expect it. Don't fight against their expectations, or you'll devalue your brand equity
  • Any site that looks "too-SEO'd" risks standing out on a link graph
  • There is social value in doing so. Black hole sites start to look like bad actors, can receive bad press, and risk damaging their relationships with partners, suppliers, and communities.

Create More Value Than You Capture

Tim O'Reilly put it well:

"..... The web is a great example of a system that works because most sites create more value than they capture. Maybe the tragedy of the commons in its future can be averted. Maybe not. It's up to each of us".

Update:
The phrase Black Hole SEO was used by Eli on BlueHatSEO.com over a year ago to describe various aggressive SEO techniques.

SEO For Regional Domains

Sep 8th

Webmasters are often faced with the problem of how to approach SEO on websites which have a country-specific focus. As you may have noticed, the search engine results pages on Google's geo-targeted search services frequently display different rankings than those you experience on Google.com. 

If you run a few queries on, say, Google.com.au, you'll soon notice distinct regionalization patterns. In order to make search results more relevant to local audiences, Google uses different sorting methodologies than those used on Google.com.

Here is a guide to optimizing sites for the different regional flavors of Google.

Country Specific Local SEO Tips

  1. Get a local domain extension: Google places a lot of weight on the domain name, so it is important to get the appropriate country-code domain extension. If you compare results across the different geo-targeted flavors of Google, you'll notice the weight given to the local TLDs. There are exceptions, but the local TLD tends to trump .com when it comes to local result sets. Different countries have different criteria for domain registration. It is fairly easy to register a .co.uk or a .co.nz, whilst a .com.au can involve setting up a business entity in Australia.
  2. Specify your country association in Google Webmaster Tools: Google Webmaster Tools offers a facility whereby you can specify a country association for your content. You can do this at the domain, sub-domain, and directory level. More detailed instructions can be found on Google's Webmaster Tools Blog.
  3. Include local contact information: Specify a local address, business name, and local contact phone numbers. Whilst not critical in terms of ranking, every little bit helps, and by including local information, the site becomes more credible to a local audience. 
  4. Local hosting: Depending on who you ask, you'll get different answers as to whether the geographic location of the web host makes a difference in terms of ranking. I have .com.au, .co.nz, and .co.uk sites, hosted on US servers, and they rank well on the appropriate local versions of Google. Other people feel that location-based hosting is a must. Still others say the location of the name server is most important! It's fair to say that if you have a choice between hosting locally and hosting offshore, then it might pay to host locally. It certainly can't hurt, and there might be additional benefits, such as increased download speed. If you go this route, one thing to check is the server's physical location. Often, web hosts have a local office, but their servers are located in a different country. Use an IP lookup tool to determine the exact location of a server.
  5. Spelling & Language: Ensure you use the appropriate spelling for your chosen region. There is a difference between "optimization" and "optimisation". Keep in mind that searchers will use the local vernacular. For example, if you are optimizing a travel site in the US, you might use the term "vacation". However, searchers in Australia, the UK and New Zealand, amongst others, tend to use the term "holiday". 
  6. Tone: Copy that works well in one geographic location may not work in another.  For example, the sales language used in the US is usually more direct than that typically used in the UK, Australia or New Zealand. Familiarize yourself with local approaches to marketing, or engage local copywriters.     
  7. Inbound links: Seek out local links. All links are good, but inbound links from local TLDs are even better. Approach your local chamber of commerce, friends, suppliers, government agencies, business partners, and local industry groups and ask them for links.
  8. Local directories: Get your site listed in local directories. Local directories still feature well in geo-targeted search results as the depth of content, in terms of sheer volume, isn't as great in the local TLD space as that published on .com. Obviously, you stand to gain from the local traffic that the directories send your way, and any local link juice the directory may pass on.  Here are some top local directories:
  • The local Yellow Pages, i.e. Yellow Pages Australia, Yellow Pages New Zealand, and Yell (UK). Keep in mind that some of these directories may not pass link juice; however, you can weigh this factor against their value in terms of local reach. You could also seek listings in the regional sections of the following global directories: DMOZ, Yahoo, and BestOfTheWeb.
  • Recommended regional directories:

  • Scoot.co.uk is a prominent UK business directory.
  • Webwombat.com.au is a comprehensive Australian directory.
  • Te Puna is a government-run New Zealand directory.
  9. Press releases: Try to come up with a local angle for your press releases, and submit them to local news and information channels. Small, local news outlets are highly likely to run local interest stories, which in turn may help your brand exposure and get you more local links.
  10. Avoid duplicate content: If your market is in one country, then it makes sense to use the country-code TLD for that country. However, if you target multiple countries, consider creating different content for each domain. Placing the same content on multiple domains may risk duplicate content penalties.
  11. Off-line marketing: Don't forget to get your name out locally. If people search for you by your brand or business name, you'll always be well positioned in the SERPs.

Have Your Say

If you have some additional ideas that have worked well for you, please feel free to add them to the comments.

Robots.txt vs Rel=Nofollow vs Meta Robots Nofollow

Aug 6th

I was just fixing up our Robots.txt tutorial today, and figured that I should blog this as well. From Eric Enge's interview of Matt Cutts I created the following chart. Please note that Matt did not say they are more likely to ban you for using rel=nofollow, but they have on multiple occasions stated that they treat issues differently depending on whether they think it was an accident by an ignorant person or a malicious attempt to spam their search engine by a known SEO (in language that is more rosy than what I just wrote).

robots.txt
  • Crawled by Googlebot? No.
  • Appears in index? If the document is linked to, it may appear as a URL-only listing, or with data from links or trusted third-party data sources such as the ODP.
  • Consumes PageRank? Yes.
  • Risks/waste: People can look at your robots.txt file to see what content you do not want indexed, and many new launches are discovered by people watching for changes in a robots.txt file. Using wildcards incorrectly can be expensive!

robots meta noindex tag
  • Crawled by Googlebot? Yes.
  • Appears in index? No.
  • Consumes PageRank? Yes, but the page can pass on much of its PageRank by linking to other pages.
  • Risks/waste: Links on a noindex page are still crawled by search spiders, even though the page does not appear in the search results (unless they are used in conjunction with nofollow on that page). A page using robots meta nofollow (next entry) in conjunction with noindex does accumulate PageRank, but does not pass it on to other pages.

robots meta nofollow tag
  • Crawled by Googlebot? The destination page is only crawled if linked to from other documents.
  • Appears in index? The destination page only appears if linked to from other documents.
  • Consumes PageRank? No; PageRank is not passed to the destination.
  • Risks/waste: If you are pushing significant PageRank into a page and do not allow PageRank to flow out from that page, you may waste significant link equity.

link rel=nofollow
  • Crawled by Googlebot? The destination page is only crawled if linked to from other documents.
  • Appears in index? The destination page only appears if linked to from other documents.
  • Consumes PageRank? No; PageRank is not passed to the destination.
  • Risks/waste: If you are doing something borderline spammy and are using nofollow on internal links to sculpt PageRank, then you look more like an SEO and are more likely to be penalized by a Google engineer for "search spam".

If you want to download the chart as an image, here you go: http://www.seobook.com/images/robotstxtgrid.png

And The Winner Is...

Jul 31st

I decided to pick David Lubertazzi and Elisabeth Sowerbutts as the winners for their SEO Knol improvement comments.

I added a few pictures, fixed up some writing errors, and incorporated a bunch of the feedback (like making the introduction better - thanks Andrew). There are many things (like domain names, duplicate content, blogging, social media, conversion, and the history and background of SEO) that I could have discussed, but I was unsure of how long I should let the Knol get while still claiming that it was a basic introduction. Thanks for the feedback everyone!
