Seven Ways To Be More Persuasive

Oct 20th
posted in

We spend a lot of time thinking about how to get visitors to our sites, but how much time do we spend thinking about better ways to persuade people once they've arrived?

Such topics are often talked about in terms of conversion and split/run testing. However, I'm like to talk talking about something a little more subtle.

The gentle art of persuasion.

Is Your Web Site Persuasive?

Every site "sells" something. It might be a product, a service, an opinion, or an increased level of engagement. You might wish to sign up members. You might want someone to bookmark your site, and return at a later date. You might want someone comment on your blog post. How can we best achieve these aims?

In "Yes, 50 Secrets From The Science of Persuasion", there is a great example of how to inconvenience your customers in order to make a sale. Colleen Szot, a leading infomercial writer, changed three words in an infomercial line which resulted in a significant increase in the number of people who purchased her product. What is remarkable is she seemingly made ordering more of a hassle.

What were those three words?

Instead of saying "Operators are waiting, please call now", she said "If operators are busy, please call back".

It seems odd that informing the customers they might have to wait would work, After all, the revised line implies the customer might have to redial a few times. However, the change worked because it used the principle of social proof. i.e. if people are uncertain about looking to perform a social action, then they'll look beyond their own judgment for a guide on what action to take.

"Operators are waiting to take your call" conjures up a mental image of rows of bored operators waiting for the phone to ring. Nobody is buying. However, by suggesting the operators might be busy, we imagine that many people are buying the product. If other people are buying it, it must be good. Of course, this isn't logical, but it is how people act. They perceive there to be safety in numbers.

We can take these ideas and apply them to websites, too. Here are seven. These ideas are all documented in "Yes, 50 Secrets From The Science of Persuasion".

I'm not getting any kickbacks for mentioning it. I just really enjoyed the book :)

1. Establish Social Proof

Look at ways in which you can demonstrate other people have taken this course of action.

Typical examples on the web include personal recommendations and endorsements. More subtle indicators include a running total of the number of comments made, indicators as to the size of the community, and the number of people who have visited the site. RSS counters. Social network plug-ins, such as MyBlogLog . All indicators that other people congregate here.

Think of the web as a place.

This is another reason the brochure web site is dying a death compared to interactive sites. There are few social markers on brochure sites, and there is seldom a sense of place. People want to be where are other people are.

No one wants to eat in an empty restaurant.

2. Don't Give People Too Many Options.

In a study of over 800,000 workers, behavioral scientist Sheena Lyengar studied company sponsored retirement programs. The study found that the more choices that were offered, the less likely employees were to enroll in the program. Giving people too many options forces people to differentiate. This can lead to confusion and disengagement from the task at hand.

When you consider that an exit on the web is only one click away, it becomes vitally important that people do not become disengaged. Decide on a limited number of desired actions you want visitors to take, and focus people's attention on those few options.

3. The Middle Option

I've covered this tactic before in Predictable Irrational Marketing Strategies, but it's such a great persuasive technique, it can't hurt to revisit it :)

If you want to people to take a desired action, frame it alongside two less desirable options.

For example, let's say you're offering TVs for sale. If you offer a cheap TV and an expensive TV, you're forcing people to make a choice based on price. People will tend to pick the lowest price option if forced into a decision based solely on price. However, if you offer a third option the decision becomes less focused on price. It becomes a compromise choice based on both price and features.

Given this option, people tend to pick the middle option. Consider that the middle option was the expensive option in the first either/or offer :)

4. Scare 'Em

A persuasive technique favored by politicians. "Terrorists!". "Your Savings Will Be Wiped Out!". "Your Jobs Will Go Offshore!".

These threat messages work because humans are conditioned to look out for threats. It's a survival mechanism. You can incorporate this persuasive technique in a more subtle way, however.

People experience fear on a number of different levels, i.e. they may simply fear that by not having your product, service or blog in their feed reader, they may be missing out. Describe the threats your product or service can alleviate, and provide a clear, concise course of action the visitor must take.

This technique must be used carefully however, as fear can also lead to inaction. Hence the phrase "paralyzed by fear", which can also occur if you offer too many options. People are afraid they'll pick the wrong one.

5. Give Forward

Reciprocity is a strong human driver. We want to give back to those people who give to us as we feel obligated. Curiously, studies show that we don't even have to like the person to feel indebted.

One of the most ridiculous pitches in web marketing is the link swap email. Someone asks you for a link, and once that link is in place, they'll link back to you. Typically they want a prominent link from your site, in return for a link on a page buried deep in their site, alongside thousands of other links.

Not much of an offer, really.

Some people try and twist the idea by giving a link first, but will retract it if you don't reciprocate. Once again, this isn't really giving anything away.

A much better approach is to simply link out to the target site. Webmasters tend to follow links back to see who is linking to them. Your link becomes a subtle form of advertising. If you then praise that website, and offer great content, you're significantly raising your chances of getting a link back.

Ask not the question "who can help me", but "whom can I help?".

6. The Post It Note

Research shows that a post-it note attached to a document tends to increase response rates. Why? Partly it has to do with the bright post-it note acts as a highlighter, and partly it has to do with the fact someone has added a personal touch

You can see the post-it note technique creeping into web design. People use a post-it note graphics, like this one on CopyBlogger. There's also a design trend to add "hand writing" as a form of personalization. Check out a few examples on Smashing Magazine.

The more personalized a request, the more likely people are to agree to it.

7. Labeling

When Luke persuaded Darth Vader to turn against the dark side, he said "I know there is still good in you! There's good in you. I can sense it". This is known as labeling. BTW: That link has little to do with this point, but it is funny :)

I digress...

The technique is to assign a trait, attitude or belief to another person and then make a request of that person consistent with that label.

For example, if you were selling accounting books, you could suggest that people who buy accounting books are also big consumers of your finance titles. Then offer them a finance title. This also works in terms of social proof.

Your Turn

What are your favorite persuasion techniques?

Web Publishing: Strategies To Help You Stand Out From The Crowd

Oct 17th
posted in

Web publishing has a low barrier to entry. This is great, because it enables anyone to be a publisher, and to reach a world wide audience.

The downside is that because there is a low barrier to entry, the web is saturated with content!

So, how do you choose topics to write about that stand out from the crowd? How do you stay ahead of everyone else? How do you stay ahead of those who have more time/money/energy to publish than you do? One way, of course, is to work smarter.

In this article, we'll look at strategies and tools that will help you do just that. But before we do, let's take a look at the state of the web..

The Evolution Of Personal Publishing

Personal publishing is in a constant state of evolution.

Take blogs, for example.

At one time, is was good enough simply to link to topics. The first blog, Robot Wisdom, took this approach. However, with the rise of social media, like Digg & Twitter, this approach - apart from a few, long-established exceptions aside - is a dead duck.

Next came the "rewriting news stories" approach. This approach still works, but in crowded niches, every blog ends up publishing the same thing. If you're a late follower in a niche, it's unlikely you'll make much headway using this technique, because it doesn't offer anything people can't get - and aren't already getting - elsewhere.

Next came providing opinion, analysis and context to news stories. This works well if the opinions on offer are new, insightful, and unique. This is the current state of the blogshere, and chances are the top blogs you read take this approach. They address a need in the market - i.e. a need for depth and analysis . I suspect you're already reading less and less of the blogs that either just point to sources or rewrite news stories.

It's not quite as linear as I'm making out, but the point is wish to make is that as content more plentiful, the bar gets raised on the quality level of content you need to produce in order to stand out.

Plenty of new opportunities lie in synergising information to provide readers with the new angles and editorial depth they crave. If you aggregate from different sources, and can spot trends before others do, you stand a good chance of standing out from the crowd.

But how do you do this?

Tools & Strategies

1. RSS Reader

Chances are you already use one. But if you don't, an RSS reader is possibly the single most important tool for article and information discovery. An RSS reader brings information to you. It brings the information to you soon after it is published. It's like having your own personal newspaper which auto-updates every few minutes.

The main advantage of an RSS reader is that you can scan a huge number of sources in very little time. Aim to monitor a lot of sources, across related industry verticals.

There are plenty of RSS readers to choose from. Here are a few to get you started: Google Reader, Bloglines, and NewsFox.

2. Have A Point Of View About Future Direction

Try to form opinions about the way your market or niche is heading, rather than where it is now, then analyse information through this filter. If asked, could you say where internet marketing is now, and where it will be in five years time? What will it look like? What are the stages it will move through to get there?

If you use such a mental filter, you should be able to spot the nuances in sources more easily. The aim is to weed out the tired, repetitive and redundant. Specifically, try to look for the points where people's behaviors start to deviate from an established norm.

Services like Compete and Google Trends are great for spotting these types of changes. There are a variety of sources data can be pulled from, including government, industry bodies, and free secondary research.

Here's a graphical comparison of various Google services. I'm sure there's an article topic in there somewhere ;)

Of course, you need to watch out for bias. One famous example of the problems of biased data was the 1948 election:

On Election night, the Chicago Tribune printed the headline DEWEY DEFEATS TRUMAN, which turned out to be mistaken. In the morning the grinning President-Elect, Harry S. Truman, was photographed holding a newspaper bearing this headline. The reason the Tribune was mistaken is that their editor trusted the results of a phone survey. Survey research was then in its infancy, and few academics realized that a sample of telephone users was not representative of the general population. Telephones were not yet widespread, and those who had them tended to be prosperous and have stable addresses

This is why cross-checking is often a good idea. One example, in the field of SEO, is keyword data. Some keyword research tools pull data from small, third party search engines, whilst Adwords data might be a more reliable indicator of the numbers of searches on Google for a specified keyword term, if that's what you're aiming to measure.

TrendWatching.com offers a good definition of trends:

A (new trend) is a manifestation of something that has unlocked or newly serviced an existing (and hardly ever changing) consumer need,* desire, want, or value.

"At the core of this statement is the assumption that human beings, and thus consumers, don't change that much. Their deep needs remain the same, yet can be unlocked or newly serviced. The 'unlockers' can be anything from changes in societal norms and values, to a breakthrough in technology, to a rise in prosperity."

Can you spot anything people have recently started doing differently?

One example was PPC advertising. Before PPC advertising came about, SEOs wouldn't dream of paying for clicks. Why would they when they could get them for free?

So, the established norm was a group of marketers who operated on the principle of getting clicks for free.

PPC emerged because there were a group of advertsiers that were prepared to pay per click, rather than spend time, money and effort in the hit and miss field of SEO. PPC addressed a deep need. PPC, of course, quickly grew into a multi-billion dollar industry.

3. Monitor Cross-Industry

Monitor not just your own vertical, but also look across related industries. What's hot and emerging in one market may not have hit your market yet. See if there is a natural synergy between the two. If there is, and no one is writing about it yet - great - you've just discovered a ground breaking content idea.

4. Aggregators

There are a wide range of aggregators available, with new options popping up all the time. Aggregators are particularly good for finding new sources. Try Techmeme, FriendFeed, StumbleUpon, Popurls, Topix, and, of course, the recently updated Google Blog Search.

5. Set Up A tips@ Email Address

Your readers might be a rich source of ideas. Some may also have some insider information that they might not feel comfortable publishing yourself.

Set up a tips@ email address, and encourage people to email you with information. Make it easy for them to do so.

BTW, if anyone does have some insider information they want to share, or answers you need, or article suggestions, please email us at seobook@gmail.com. :)

6. Cultivate News Stories Using Social Media

Start a Digg-style news community for your niche. Try to create communities of people who enjoy mining for information on a given topic. One search-oriented example of such a community is Sphinn.com.

If you don't have the inclination to set-up a community yourself, find existing communities and monitor them.

Check out:

Pligg.com
Sphinn.com
Mixx.com

7. RSS Remixing

RSS remixing is agrregating different RSS feeds into one feed. You can remix each industry vertical, rather than have multiple feeds, which can make it easier to scan.

Add each feed to your reader, aggregate them into the one big feed, the same folder or view, and viola - you have your own niche news mining engine.
Also check out remixers such as FeedRinse, FeedDigest, and BlastFeed.

8. URL Monitoring On Digg

In the Digg search option, choose "URL only" and "upcoming stories". Type in the domain name of any site you want to watch. You should see an orange RSS button in the right hand corner. Click on it and save the results as an Rss feed.

9. Google Alerts

Why search for news when Google can do it for you? For those who don't know, Google Alerts is an email service that monitors Google result sets for the keyword of your choice.

For example, you can monitor when people talk about you or your site, you can keep track of your competitors or industry, and stay on top of breaking news.

Also check out Track Engine. Similar to Google Alerts, Track Engine can be used to identify when websites update, without you having to visit them. You can also set tracking perameters so customise the information you receive.

10. Google Insights

Insights For Search is a hugely useful tool.

You can use it in a number of ways. For example, you can track seasonal trends. This chart shows when interest is highest in basketball. The pattern of interest is a consistent, shape year after year. You could use this information to dictate the timing of your stories on certain topics.

11. Random Stumbling & Association

Sometimes stumbling about in unknown territory can be a great way to get the creative juices flowing.

Another fun option are Oblique Strategies cards.

Try famous quotes. Quotes contain universal truths, which you might be able to apply to your area of interest, in order to view things in a different way.

Image collections are another. Search on various themes, and see what image comes back. Does the image prompt a fresh way of thinking?

Hold multiple, disconnected ideas in your head and see if you can discover a synergy. For example, a famous example is:

  • A Red Traffic Light
  • A cigarette

This led to the little red mark on cigarettes encouraging smokers to stop smoking when the cigarette burned down to that point, and thereby they could control their habit. More likely, it was a ruse to get smokers to go through a pack faster.

Got any strategies on how to generate story ideas? Add 'em to the comments below.

Further Reading

Tracking the Evolution of Search Spam

Oct 16th
posted in

As part of their 10th birthday celebrations, Google recently released a 2001 index, to show us how much things have changed.

It is fascinating to look into the past, especially from an SEO point of view. Has the nature of spam changed since 2001? How has Google changed in order to nullify the affects of spam?

When Google filed their registration statement prior to IPO, Google identified a number of risk factors.

One of these risks was:

We are susceptible to index spammers who could harm the integrity of our web search results

There is an ongoing and increasing effort by “index spammers” to develop ways to manipulate our web search results. For example, because our web search technology ranks a web page’s relevance based in part on the importance of the web sites that link to it, people have attempted to link a group of web sites together to manipulate web search results. We take this problem very seriously because providing relevant information to users is critical to our success. If our efforts to combat these and other types of index spamming are unsuccessful, our reputation for delivering relevant information could be diminished. This could result in a decline in user traffic, which would damage our business."

Curious how Google conflates spamming with relevance, eh. While it could be true that manipulating rank could lead to lower relevance, that isn't a given. The manipulation could, after all, produce relevant results. "Relevant" being a subjective judgement made by the user.

I digress...

What Google are really getting at is the type of manipulation that leads to less relevant results, commonly referred to as search engine spam. In this respect, what has changed since 2001?

Has Search Spam Been Defeated?

Or, to put another way, what changes have Google made to reduce the business risk of non-relevant search results?

Compare the following examples with the results we see today:

Buy Viagra
Viagra

Now try searching on those two phrases in today's index. How many differences can you spot? How have the result sets changes? Are they less "spammy"?

Here are a few aspects I noticed:

  1. The search results are much tighter and much more well policed. You wouldn't find the penis-envy.com site's link exchange page ranking in Google's 2008 search results for Paxil search queries.
  2. Google used to match keyword strings a lot more than it does today. This is the reason why a lot of on-page optimization techniques have become redundant, and the reason why effective on page optimization in 2008 is more about diversity than repeating words.
  3. Blogs have came from an obscure force to category leaders in many markets.
  4. If you happen to be searching outside the US, Google now incorporates, and boosts, regional results.
  5. Google now incorporates YouTube, news, and other related informational sources, thus forcing results from smaller sites further down the page
  6. There used to be a lot more hyphenated domain names showing up top ten. Not so much these days.
  7. Wikipedia, then called Nupedia, had only just started in 2001, so wasn't yet appearing in every single search result ;)

When Google first emerged, algorithmic search was in real danger of becoming unusable. Engines like Alta Vista were losing the war against spammers, and result sets were becoming increasingly irrelevant. Sergey Brin once declared that it wasn't possible to spam Google. When Google came along, they had defeated spam forever using a clever link analysis algorithm. No more spam!

Well, not really.

Spam hasn't gone away. But it is fair to say that Google is doing a pretty good job of maintaining relevance, and in many cases, eliminating the worst forms of spam. For example, it is now uncommon to see the type of deceptive redirects that were common in 1997, whereby if you clicked on a link, you were led you to a site that was unrelated to the link text.

We've seen the rise of the authoritative domain, and the relegation of the influence of many smaller sites. Pages hosted on authoritative domains are more likely to rank higher than pages on sites that haven't established authority. This has, in turn, led to a different type of spam. People hack into authoritative sites in order to place their links, or entire pages, on these domains. Wikipedia has an ongoing battle to keep their pages free from "commercial imperatives".

The target has, in many ways, shifted down a level.

Big Changes

Since 2001, Google has incorporated verticals.

In this article, Danny Sullivan outlined the use of "invisible tags" in the delivery of search results.

"The solution I see coming is something I call "invisible tabs." Quietly, behind the scenes, search engines will automatically push the correct tab for your query and retrieve specialized search results. This should ultimately prove an improvement over the situation now, where you're handed 10 or 20 matching web pages."

Result sets have increasingly become query dependent, as if you'd pre-selected a topic tab. For example, if your query is determined to have an informational intent, you're unlikely to receive a commercially oriented result set. It is has become a lot more difficult to get off-topic listings - which in this specific case would be commercial pages - into such result sets.

We've also seen the structure of search results pages change markedly. We see images, videos, news, related searches, sub pages, onebox results boxes, personalized results, desktop results, and Adwords. This leaves less and less room for other types of pages, as the search results orient more heavily around a wider variety of data types.

However, in the end, the SERP is still just a list, that looks much like the old list. What will search, and search spam, look like in another tens years?

The Future

Over $10 billion dollars are chasing paid search each year, and that figure will surely grow as media spend increasingly shifts online. There is still a strong incentive to use all means necessary to get to the top of the list.

Google will, of course, continue to try and counter this threat to their business model. The PageRank has likely been changed considerably to when it was first published. Google is likely to continue to incorporate usage metrics, making it more and more difficult for less relevant pages to gain a foothold.

On the flip side, will search be important as it is now? There appears to be a trend for more information to be pushed our way, rather than going out and finding it ourselves. RSS, recommendation engines (Amazon, YouTube, et al), community models (Facebook), and more. Will our surfing habits be (voluntarily) monitored, and answers provided before we we're even aware of the question? We're already seeing the early stages of this with contextual Adwords in Gmail. These changes will, in turn, give rise to a new breed of spam. While the commercial incentive remains, there will always be a level of spam.

The game of cat and mouse continues...

The Google 2001 Search Index is a Great SEO Tool

Having a glimpse of the past reminds us of how things changes, which might help us think of why they changed and how they may change going forward.

The 2001 index provides for a great tool to show past popular SEO techniques that have become irrelevant, which is useful when the boss uncovers an old spammy strategy that they feel you must follow to succeed. It not only helps us inform employers, but also allows us to talk about and highlight overt forms of spam without the worry of "outing" a page that is currently ranking.

  • Over 100 training modules, covering topics like: keyword research, link building, site architecture, website monetization, pay per click ads, tracking results, and more.
  • An exclusive interactive community forum
  • Members only videos and tools
  • Additional bonuses - like data spreadsheets, and money saving tips
We love our customers, but more importantly

Our customers love us!

URL Canonicalization: The Missing Manual

Oct 13th
posted in

Canonicalization can be a confusing area for webmasters, so let's take a look at what it is, and ways to avoid it causing problems.

What Is Canonicalization?

Canonicalization is the process by which URLs are standardized. For example, www.acme.com and www.acme.com/ are treated as the same page, even though the syntax of the URL is different.

Why Is Canonicalization An Issue For SEO?

Problems can occur when the search engine doesn't normalize URLs properly.

For example, a search engine might see http://www.acme.com and http://acme.com as different pages. In this instance, the search engine has the host names confused.

Why Is This a Problem?

If the search engines sees a page as being published at many separate URLs, the search engine may rank your pages lower than they would otherwise, or not rank them at all.

Canonicalization issues can split link juice between pages if people link to variants of the URL. Not only does this affect rank (less PageRank = lower rank), but it can also affect crawl depth (if PageRank is spent on duplicate content it is not being spent getting other unique content indexed).

To appreciate what a dramatic effect canonicalization issues can have on search traffic look at the following example, and notice that for the given example proper canonicalization increased traffic for that keyword by 300%

  Link Equity Google Ranking Position % of Search Traffic Daily Traffic Volume Traffic Increase
split 1 60% 8 3% 50 -
split 2 40% 15, filtered = 0 0% 0 -
canonical 100% 2 12% 200 300%

What Conditions Can Cause This Problem?

There are various conditions, but the following are amongst the most common:

  • Different host names i.e. www.acme.com vs acme.com
  • Redirects pointing to different URLs i.e. 302 used inappropriately
  • Forwarding multiple URLs to the same content, and/or publishing the same content on multiple domains
  • Improperly configured dynamic URLs i.e. any url rewriting based on changing conditions
  • Two index pages appearing in the same location i.e. Index.htm vs Index.html
  • Different protocols i.e. https://www vs http://www
  • Multiple slashes in the filepath i.e. www.acme.com/ vs www.acme.com//
  • Scripts that generate alternate URLs for the same content i.e. some blogging and forum software, ecommerce software that adds tracking URLs
  • Port numbers in the domain name i.e. acme.com/4430 : can sometimes be seen in virtual hosting environments.
  • Capitalization - i.e. www.acme.com/Index.html vs www.acme.com/index.html
  • URLs "built" from the path you take to reach a page i.e. tracking software may incorporate the click path in the URL for statistical purposes.
  • Trailing questions marks, with or without parameters i.e. www.acme.com/? or www.acme.com/?source=cnn (a common tagging strategy amongst ad buys)

How Can I Tell If Canonicalization Issues Are Affecting My Site?

Besides working through the checklist performing a manual check, you can also use Google's cache date.

Previously, you would have been able to use Google's supplemental index marker, although Google have recently done away with this feature.

The supplemental index is a secondary index, seperate from Google's main index. It is a graveyard, of sorts, containing outdated pages, pages with low trust scores, duplicate content, and other erroneous pages. As duplicate pages often reside in the supplemental index, appearing in the supplemental index can be an indicator you may have canonicalization issues, all else being equal.

Before Google removed the supplemental index label, many SEOs noticed that supplemental pages had an old cache date and that cache date is a good proxy for trust. If your page is not indexed frequently, and you think it should be, chances are the page is residing in the supplemental index.

Michael Gray at Wolf-Howl" outlines a method to easily check for this data. In summary, you add a date and unique field to each page, wait a couple of months, then search on this term.

How Can I Avoid Canonicalization Issues?

Good Site Planning

Using good site planning and architecture, from the start, can save you a lot of problems later on. Pick a convention for linking, and stick with it.

Maintain Consistent Linking Conventions

It's an important point, so I'll repeat it ;) Always link to www.acme.com, rather than sometimes linking to acme.com/index.htm, and sometimes linking to www.acme.com.

301 Redirect Non-www to www , Or Vice Versa

You can force resolution to one URL only. To do this, you create a 301 redirect.

Here's a typical 301 redirect script:

RewriteEngine On RewriteCond %{HTTP_HOST} ^seobook.com [NC] RewriteRule ^(.*)$ http://www.seobook.com/$1 [L,R=301]

For a more detailed analysis on how to use redirects, see .htaccess, 301 Redirects & SEO.

Use The Website Health Check Tool

This tool, and accompanying video, shows you how to spot a number of site architecture problems, including canonicalization issues.

Download the tool, check the www vs non-www option box, and hit the Analyze button.

If you have a large site you may not be able to surface all the canonicalization issues using the default tool settings. You may need to use the date based filter options to get a deep view of recently indexed pages...many canonicalization issues occur sitewide, so looking deeply at new pages should help you detect problems.

Another free, but far more time consuming option, is to use the date based filters on Google's advanced search page.

Workaround For Https://

Sometimes Google will index both the http:// and the https:// versions of a site.

One way around this is to tell the bots not to index the https:// version.

Tony Spencer outlines two ways to do this in .htaccess, 301 Redirects & SEO. One is to cloak the robots.txt file, the other is to create a conditional php script.

Use Absolute, As Opposed To Relative Links

An absolute link specifies the exact location of a file on a webserver. For example, http://www.acme.com/filename.html

A relative link is, as the name suggests, relative to a pages' location on the server.

A relative link looks like this:
"/directory/filename.htm"

There are various issues to consider, not related to canonicalization issues, when deciding to using either format. These issues include page download speed, server access times, and design conventions. The point to remember is to remain consistent. Absolute links tend to make doing so easier, as there is only ever one URL format for a file, regardless of context.

Don't Link To Multiple Versions Of The Page

In some cases, you may intend to have duplicate content on your site.

For example, some software, such as blog and forum software, aggregates posts into archives. Always link to the original version of the post, as opposed to the archive, or any other, location i.e. www.acme.com/todays-post.htm , not www.acme.com/archive/december/todays-post.htm.

If your software program links to a duplicate version of the content (like an individual post from a forum thread) consider adding rel=nofollow to those links.

Use 301s, not 302s On Internal Affiliate Redirects

A 301 redirect is a permanent redirect, which indicates a page has been moved permanently. 301s typically pass PageRank, and do not cause canonicalization issues.

A 302 redirect is a temporary redirect. If you use 302s the wrong page may rank. Google's Matt Cutts claims they are trying to fix the problem:

we’ve changed our heuristics to make showing the source url for 302 redirects much more rare. We are moving to a framework for handling redirects in which we will almost always show the destination url. Yahoo handles 302 redirects by usually showing the destination url, and we are in the middle of transitioning to a similar set of heuristics. Note that Yahoo reserves the right to have exceptions on redirect handling, and Google does too. Based on our analysis, we will show the source url for a 302 redirect less than half a percent of the time (basically, when we have strong reason to think the source url is correct)

but if you use 302s on affiliate links the affiliate page may rank in the search results, as shown in the below SnapNames search. This, in turn, would credit the affiliate with a commission anytime someone buys through that link in the search results...effectively cutting the margins of the end merchant.

Specify preferred urls in Google Webmaster Tools

Google Webmaster Tools provides an area where you can specify which version of URL i.e. http://www.acme or http//acme Google should use.

Note: It is important not to use the remove URL tool to try and fix these domain issues. Doing so may result in your entire domain, as opposed to one page, being removed from the index.

Further Reading

Why Bloggers Need To Think About Marketing Strategy

Oct 9th

I started a blog on search engines in 2002.

In those days, the idea of blogging about anything other than politics, or blogging, or what your cat had for breakfast, was new. In fact, the idea of blogs was new. Most people's reaction to the word blog was "huh"?

I quickly built up an audience, and links, mostly because I had first mover advantage, and I threw in a few social media basics. It certainly wasn't rocket science. But, at the time, I was doing something unique and "remarkable", in the Seth Godin sense of the word.

Fast forward to today, and the landscape is very different.

There are thousands - perhaps tens of thousands - of blogs on search, and most of those go unread. A blog on search is no longer remarkable.

Unless you have first-class insider information, and can produce it on a regular basis, I wouldn't advise anyone start a generalist search engine blog these days. The low hanging fruit is gone, but there are still easy pickings in other areas, it's simply a matter of finding them, identifying your strengths, and exploiting them.

How Many Blogs Are Out There?

This years "State Of The Blogsphere" report indicates there are around 133 million blogs, and they are only the blogs indexed by Technorati since 2002.

Even if we assume that half of those are spam blogs, or cobweb blogs, that's still a lot of "personal journals". Are there 133 million readers?

ComScore MediaMetrix (August 2008)
Blogs: 77.7 million unique visitors in the US
Facebook: 41.0 million | MySpace 75.1 million
Total internet audience 188.9 million
eMarketer (May 2008)
94.1 million US blog readers in 2007 (50% of Internet users)
22.6 million US bloggers in 2007 (12%)
Universal McCann (March 2008)
184 million WW have started a blog | 26.4 US
346 million WW read blogs | 60.3 US
77% of active Internet users read blogs

Would a generalist blog do well in such a market? It could, but it's highly unlikely. Such deep markets tend to favor a niche approach.

So, instead of a blog on search, one strategy might be simply to go deep on one aspect of that market. How about a blog on the mathematics of search engine algorithms? Or search marketing for a specific region? Or search marketing in one industry vertical, such as travel?

How To Find And Test A Niche

First up, read these posts:

Once you've decided on a niche, you can further test the validity of your idea, and your approach, by asking questions.

One formalized way of doing this is called a SWOT analysis. It's a high-brow marketing term, but the idea is simple in practice. Swot stands for Strengths, Weaknesses, Opportunities and Threats.

Make a list:

  • Strengths - why do I do well?
  • Weaknesses - What do I do poorly?
  • Opportunities - What upcoming trends fit with my strengths? What am I doing now that could be leveraged?
  • Threats - What internal problems do I face? What external problems do I face?

You then detail how you can use each strength, how you can improve each weakness, how you exploit each opportunity, and how you mitigate each risk.

Simply going through such exercises can open a world of possibilities. It is important to write it down. I find the simple act of writing something down seems to make an idea less abstract and more concrete.

One of the big threats in the blog world is the low barrier to entry. Anyone can start a blog within minutes.

Ask yourself how will you stay ahead of the person who starts in the next hour? The ten people who have started by tomorrow? The hundreds of people who have started by next week, not to mention the big, established names who already have a dedicated share of an audience that isn't really growing.

Tough call. There are no easy answers to such a question, as it really depends on your individual strengths and weaknesses, which is why asking questions like these can provide valuable insight.

Philip Kotler, a renowned marketing guru, suggests asking the following questions of any new business plan or idea:

  • Does this strategy contain exciting new opportunities?
  • Is the plan clear at defining a target market?
  • Will the customer in each target market see our offering as superior?
  • Do the strategies see, coherent? Are the right tools being used?
  • What is the probability that the plan will achieve its stated objectives?
  • What would you eliminate from the plan if you only had 80% of your budget?
  • What would you add to the plan if you only had 120% of your budget?

Those last two might seem a little odd in this context, but they certainly are applicable. What would you do if you had more of a budget to promote your blog? Would you spend it on advertising? If so, where, specifically, would you spend it?

Asking these questions can suggest all manner of options. By pretending you have more of a budget, you might identify great advertising partners, but because, in reality, you might not have this budget, you could instead suggest you write guest articles for them, and thus achieve much the same result.

SEO For Blogs

The latest shift in SEO, as Aaron details in Social Interaction & Advertising Are The Modern Day Search Engine Submission & Link Building, is towards relationship marketing, which is why SEOs are increasingly adopting marketing and PR strategies in order to operate more effectively.

Let's face it - SEO for blogs is a cakewalk. Blog software, such as Wordpress, is already search friendly, right out of the box. If you want to tweak it further, there are a wealth of available tools and instruction. Anyone can do it, and that's a problem.

But it's not really about the tools. It's how you use them. The key part to success in doing SEO on blogs is the way you interact.

Specific Strategies To Consider

Quote And Link To Popular Bloggers

Apart from the obvious potential that a blogger will follow inbound links back to their source (you!), meme aggregators, such as Techmeme and Google Blog News, are becoming more prevalent.

These sites aggregate similar conversations together. Simply by talking about what others are talking about, and adding to the conversation, you might get a link and/or attention.

Leave Valuable Useful Comments On Popular Related Blogs

Go where the crowd already is.

For example, I follow most comments in these blog posts back to the authors, and if they have left a site name, I check it out.

Most are then added to my RSS feed reader.

Write Articles For Other Popular Blogs

Think of this as advertising. Advertising costs, and in this case, that cost is your time. The benefits of contributing editorial can be fantastic, however, as you can reach a large, established market quickly.

Create Community Based Ideas, Ask For Feedback Before Launching

This is cheap and cheerful market research. You also give your audience an opportunity for buy-in on the outcome. If the audience feels they are part of the process, they are more likely to accept it, and even promote it.

Add Value To Ideas So People Reference You When Talking About Them

Besides the obvious link benefit involved, it is also great for your brand. Your name becomes your brand, and the more people mention your name, the further your brand spreads. Seth Godin is a master at this, and if you aren't reading his blog already, you should be.

See! It just happened. Twice, in this post, in fact.

Actively Solicit Comments And Reply To Them

One over-looked value of comments is that people are providing crawlable, unique content. Usually I find the more contentious the post, the more comments you receive. So don't be afraid to stir the hornets nest every one in a while ;)

Encouraging Contribution From Others And Highlighting Their Contribution Builds Community

The best situation is win-win. Are you giving your readers and community members a chance to do so?

This is one of the reasons I think black hole SEO is short-sighted, especially for community sites and blogs. It doesn't allow others to win, too.

Network Offline At Industry Trade Shows

I once worked with a guy who had been a very successful investment banker on Wall Street. He says he ignores the University qualifications and information in the public domain, as the real business world works on inside information and who you know. There's no doubt that the best place to get insider search information, and great contacts, is in the bars between conferences.

Every community has an epicenter - a group of people who most others take a lead from - and that epicenter might be as small as three or four highly influential people. Those are the people you need to talk to.

Don’t Be Afraid Of Controversy

If you gain mindshare and authority, some people will hate you for it.

This is related to my "stir-the-hornets-nest" point above. Once you start getting attention, you also become a target. You have little choice but to go with the flow, and keep in mind you cannot please all the people, all the time. Sometimes, it even pays not to please them. People are more likely to engage if they feel passionate, and especially if they passionately believe you are wrong!

Reminds me of a great quote by Oscar Wilde: "The only thing worse than being talked about is not being talked about!"

Further Reading

Relationship Marketing Via Consumer Interaction

Oct 7th
posted in

There was a time when people bought from those people who knew them.

You went to the local butcher or baker, and he knew your name, and your kids names. Personal interaction was a valuable sales and marketing tool.

We can apply this strategy to the web, too.

Interaction Marketing

Interaction marketing, as the name suggests, is about the marketing benefit that can be had from engaging with a visitor in a more personalized way.

It works well in an environment of anonymous, mee-too sameness, because people still crave uniqueness and personal attention. This startegy isn't limited to commerce, either. It applies to all kinds of sites, including blogs.

What Are The Benefits Of Encouraging Interaction?

Encouraging interaction can result in more repeat visits, more sales, more loyalty, and more attention. In many cases, it's quite a simple thing to do, and the pay-offs can be enormous.

A well-known example is: "Do you want fries with that?". McDonald's upsell is an example of interaction marketing. They're asking the right question at just the right time, and they're personalizing the service. And it works, to the tune of billions in extra revenue per year.

The Blogs Squeaky Wheel

One of the problems with blogs, this one included, is that the audience isn't just one audience. There are many audiences.

Some people are experienced SEOs and have been reading here for a long time. Others might have only just learned what the phrase SEO means. Most people are spread across the continuum.

How do you deliver an experience that works well for everyone?

Distinguish Between New And Returning Visitors

Seth Godin advocates distinguishing between new and returning visitors to your site, and targeting them with slightly different messages.

For example, new commentors could be delivered to a special welcome page, informing them about various, important areas of your site. You wouldn't necessarily want to do this for long-time users of the site, because it would slow them down, and might be seen as condesending rather than helpful.


One opportunity that's underused is the idea of using cookies to treat returning visitors differently than newbies. It's more work at first, but it can offer two experiences to two different sorts of people.

Here is a Wordpress plugin that will do just that: Wordpress Commentor PlugIn.

By default, new visitors to your blog will see a small box above each post containing the words "If you're new here, you may want to subscribe to my RSS feed. Thanks for visiting!" After 5 visits the message disappears. You can customize this message, its lifespan, and its location."

You could also try this one: Comment Redirect PlugIn

Another way to achieve the same thing is to send an email to new commentors upon registration, outlining the top posts and welcoming them.

You're customizing the experience only slightly, but the payoffs in terms of relationship building could be considerable. The user are more likely to perceive the interaction as helpful and personalized.

Ask For A Link In The Order Confirmation E-mail

That is certainly one of those "why-didn't-I-think-of-that" moments.

You could ask customers, or new sign ups, to link to you. Your customers are prime candidates to approach for links, because they are already familiar with you, presumably like you, and the relationship has already been established.

Amazon-Style Feedback Reminder

Amazon, and their partners, ask for a review a few days after you buy something.

Not only is this a great way to get feedback, customers may also provide you with content. Make it easy for them to do so.

Selective Advertising

Advertising can annoy visitors, and compromise your brand. You can give people added value by removing advertising for those who join up.

Similarly, you could leave advertising off new content. Create a different template, that includes ads, for your archived content. By doing so, you can monetarize most of your content without annoying your regular readers.

Further Reading

Is Buying Links Stupid?

Oct 3rd
posted in

This old chestnut.

There is a post over at Search Engine Land by Danny Sullivan entitled "Conversation With An Idiot Link Broker". To cut a long story short, some guy tries to broker a link deal with Danny, seemingly not knowing who Danny is, and Danny plays him along. Danny reports him to the Google spam team.

For the sake of furthering discussion, I'll play devils advocate :)

Regardless of anyone's views on link buying, it is wrong to mislead people. Danny clearly felt this guy was being misleading, and gave him a number of chances to clarify his position. But is buying and selling links really as "risky" a behavior as is being made out?

It might be considered a risky behavior if you spend a lot of time obsessing about Google, as SEOs tend to do. However, links are the glue that binds the web. Link buying and selling started long before Google existed. It will always happen.

It's called advertising.

But it would be disingenuous not to see what Danny is really talking about here. He's talking about buying links for the sole purpose of gaining link juice. I can understand why Google takes a dim view of this practice. . Paid links compromises Google's business model.

Fair enough. If I worked for Google, I'd take the same stance.

For Danny Sullivan, given the level of exposure of his site in the search world, the risks presented by link trading would be significant. Regardless of Danny's personal opinion on such practices, such a deal would clearly be a non-starter. The link seller is a fool for, above all else, failing to identify his customer.

However, for most sites, the reality is that the risk of link buying and selling is probably negligible.

Google taking out the occasional site amidst a storm of publicity doesn't mean much when there are tens of thousands of sites that clearly do not receive the exact same treatment. If one site in two got hammered, it would be a different story, but it is likely the figures run into one site in thousands. It then becomes a matter of weighing one's chances of being detected and punished by Google against the potential rewards on offer.

For example, there are credible, Fortune 500 companies engaged in buying and selling links. The risk of big names being taken out for any longer than a day or two is near zero. If you run the sort of big name site searchers expect to see in the results, Google probably aren't going to leave you out on a technicality. This would compromise their business model, because Google must deliver relevant results.

Is it up to the link seller to outline all the potential risks involved? Apart from the comical farce of a link seller failing to identify Danny Sullivan, how big a moral crime has the guy really committed? Do Google outline all the risks associated with using their products and services? Or is Danny cunningly implying that Google's algorithm cannot determine which links are paid, and in fact relies on people filing reports? ;)

A moral tone runs through such discussions, and I'm not sure it is entirely consistent.

Google are a business and their pronouncements must be considered in this context. They will act in their own interest, and those interest may or may not align with your own. Are we at risk of ceding the assumption of moral superiority to Google when they may not deserve it? Google, like you, are trying to earn a crust, and any organization may not be entirely transparent and morally consistent in all they do. Who do you call out, and who gets a free pass?

Google certainly holds the power, and if being in the SERPs matters a lot to you, then you should stay within Google's guidelines. It's also fair to say that, these days, even this approach offers no guarantees.

Tread wisely :)

Further Reading

How To Choose Domain Names For SEO

Oct 1st

Domaining.

It has been a hot topic for a while now, yet many domineers aren't overly active in the SEO space. Yet.

Domaining is when you register a domain, or buy a domain on the seondary market, with the intention of deriving traffic, and turning that traffic into revenue. Traffic comes from type in traffic. i.e. people type a keyword into the address bar and add .com on the end. Domains can be valuable internet real estate, because, unlike a search engine, there is no middleman between you and the visitor. A lucrative pursuit, if you choose the right names.

Let's take a look at how domineering strategy can be applied to SEO.

Background

Aaron has a great interview with Frank Schilling. Frank is one of the biggest domaineers on the planet, and an articulate advocate of this strategy.

Add this lot to your feed reader:

http://www.sevenmile.com/
http://rickschwartz.typepad.com/
http://www.whizzbangsblog.com/
http://www.domainnews.com/

If anyone has other suggestions for great domaining blogs, please add them to the comments.

How To Select A Domain Name

Google tends to give weight to keywords in the domain name. This increases the importance of selecting a good name.

When choosing a domain name for SEO purposes, there are three main factors to consider:

  • Brand
  • Rankability
  • Linkability

Brand

Should you use hyphenated, multi-term domain like search-engine-marketing-services.com?

I'd avoid such names like the plague.

Why?

They have no branding value. They have limited SEO value. Even if you do manage to get such a domain top ten, you're probably going to need to sell on the first visit, as few people are going to remember it once they leave. It is too generic, and it lacks credibility.

In a crowded market, brand offers a point of distinction.

It is easier to build links to branded domain names. People take these name more seriously that keyword-keyword-keyword-keyword.com, which looks spammy and isn't fooling anyone. Would you link to such a name? By doing so, it devalues your own content .

It can even difficult to get such domain names linked to when you pay for the privilege! Directory editors often reject these names on sight, because such names are often associated with low-quality content. Imagine how many free links you might be losing by choosing such a name.

Is there a downside to using branded names?

Yes.

Unless you have a huge marketing budget, no one is going to search for perseefgxcbtrfy.com, which is a new killer, brand I just made up ;)

Thankfully, there is a happy medium between brand and SEO strategy.

Rankability

SEOs release the value of keywords. When naming your site, and deciding on a domain name, try combining the lessons of SEO, branding and domaining.

Genric + term is a good approach to use. Take your chosen keyword, and simply add another word on the end. SeoBook, Travelocity, FlightsCity, CarHub, etc. These words have SEO value built into them, because people are forced to use your keywords in the link. Also, Google (currently) values a keyword within the domain name for ranking purposes. Finally, such a name retains an element of unique branding.

These types of domain names score high on the rank-ability and link-ability meter. They are generic enough to rank well for the keyword term, yet contain just enough branding difference to be memorable.

The SEO Advantage

There is another advantage for SEOs in the domain space.

Dot com's can sell for 5-20 times as much as a .org or .net. Keyword + .com can sell for millions of dollars, depending on the domain name.

Expensive, huh.

But...

By registering or buying the cheaper .net or .org equivilent, building out the site, and ranking well for the keyword + net, or +org, you increase the value of the domain name markedly. Sure, you're one step away from pure domaineering and you still have Google to contend with, but you'll be head and shoulders above those who are undervaluing these names.

A lot of domaineers aren't operating in this space.

Yet.

Other Tips And Ideas

Leave The Keyword Out Entirely

Used the related search function on Google ~ + keyword and see if any of the related keyword terms fit. This can be a good strategy to use if all the good generic keyword names are gone. It might get you close enough to the action, without the enormous price tag. Might be more memorable, too.

How To Test A Domain Name For Penalties Before Buying It

  • Verify the site is not blocking GoogleBot in their robots.txt file
  • Point a link at the domain from a trusted site and see if Google indexes it
  • Within a couple weeks (at most a month) Google should list the site when you search for it in Google using site:domainname.com

Further Reading:

Align Your SEO Strategy With Site Structure

Sep 30th
posted in

I'd like to take a look at an area often overlooked in SEO.

Site architecture.

Site architecture is important for SEO for three main reasons:

  • To focus on the most important keyword terms
  • To control the flow of link equity around the site
  • To ensure spiders can crawl the site

Simple, eh. Yet many webmasters get it wrong.

Let's take a look at how to do it properly.

Evaluate The Competition

Once you've decided on your message and your plan, the next step is to lay out your site structure.

Start by evaluating your competition. Grab your list of keyword terms, and search for the most popular sites listed under those terms. Take a look at their navigation. What topic areas do they use for their main navigation scheme? Do they use secondary navigation? Are there similarities in topic areas across competitor sites?

Open a spreadsheet, list their categories and title tags, and look for keyword patterns. You'll soon see similarities. By evaluating the navigation used by your competition, you'll get a good feel for the tried-n-true "money" topics.
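If you'd rather not copy title tags by hand, here's a rough Python sketch of the idea. The competitor URLs are placeholders, and the regex-based title extraction is deliberately crude:

    import re
    import urllib.request
    from collections import Counter

    # Placeholder URLs - use the top-ranking sites from your keyword searches.
    competitors = [
        "http://www.example-competitor-one.com/",
        "http://www.example-competitor-two.com/",
    ]

    words = Counter()
    for url in competitors:
        html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        if title:
            words.update(w.lower() for w in re.findall(r"[A-Za-z]+", title.group(1)))

    # The most frequent title words hint at the tried-n-true "money" topics.
    print(words.most_common(10))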

You can then run these sites through metrics sites like Compete.com.

Use the most common, heavily trafficked areas as your core navigation sections.

The Home Page Advantage

Those who know how Page Rank functions can skip this section.

Your home page will almost certainly have the highest level of authority.

While there are a lot of debates about the merits of PageRank when it comes to ranking, it is fair to say that PageRank is a rough indicator of a page's level of authority. Pages with more authority are spidered more frequently and enjoy higher rankings than pages with lower authority. The home page is often the page with the most links pointing to it, so the home page typically has the highest level of authority. Authority passes from one page to the next.

For each link off a page, the authority level will be split.

For example - and I'm simplifying* greatly for the purposes of illustration - if you have a home page with ten units of link juice and two links to two sub-pages, each sub-page receives five units of link juice. If each sub-page has two links, each sub-sub-page receives 2.5 units of link juice, and so on.

The important point to understand is that the further your pages are away from the home page, generally the less link juice those pages will have, unless they are linked from external pages. This is why you need to think carefully about site structure.

For SEO purposes, try to keep your money areas close to the home page.

*Note: Those who know how Page Rank functions will realise my explanation above is not technically correct. The way Page Rank splits is more sophisticated than my illustration suggests. For those who want a more technical breakdown of the Page Rank calculations, check out Phil's post at WebWorkshop.
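To make the simplified splitting model concrete, here's a toy Python sketch. The site structure is made up, and, per the note above, real PageRank does not split this naively:

    # Toy site: each page maps to the pages it links to.
    site = {
        "home": ["products", "articles"],
        "products": ["widget-a", "widget-b"],
        "articles": [],
        "widget-a": [],
        "widget-b": [],
    }

    def spread_juice(page, juice, links, totals):
        """Recursively split a page's link juice equally across its links."""
        totals[page] = totals.get(page, 0) + juice
        children = links.get(page, [])
        for child in children:
            spread_juice(child, juice / len(children), links, totals)
        return totals

    print(spread_juice("home", 10, site, {}))
    # home: 10, products and articles: 5 each, the widgets: 2.5 each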

How Deep Do I Go?

Keeping your site structure shallow is a good rule of thumb. So long as your main page is well linked, all your internal pages will have sufficient authority to be crawled regularly. You also achieve clarity and focus.

A shallow site structure is not just about facilitating crawling. After all, you could just create a Google Sitemap and achieve the same goal. Site structure is also about selectively passing authority to your money pages, and not wasting it on pages less deserving. This is straightforward with a small site, but the problem gets more challenging as your site grows.

One way to manage scale is by grouping your keyword terms into primary and secondary navigation.

Main & Secondary Navigation

Main navigation is where you place your core topics, i.e. the most common, highly trafficked topics you found when you performed your competitive analysis. Typically, people use tabs across the top, or a list down the left-hand side of the screen. Main navigation appears on every page.

Secondary navigation consists of all other links, such as latest posts, related articles, etc. Secondary navigation does not appear on every page, but is related to the core page on which it appears.

One way to split navigation is to organize your core areas into the main navigation tabs across the top, and provide secondary navigation down the side.

For example, let's say your main navigation layout looked like this:

Each time I click a main navigation term, the secondary navigation down the left-hand side changes. The secondary navigation consists of keywords related to the core area.

For those of you who are members, Aaron has an in-depth video demonstration on Site Architecture And Internal Linking, as well as instruction on how to integrate and manage keywords.

Make Navigation Usable

Various studies indicate that humans are easily confused when presented with more than seven choices. Keep this in mind when creating your core navigation areas.

If you offer more than seven choices, find ways to break things down further. For example, by year, manufacturer, model, classification, etc.

You can also break these areas down with an "eye break" between each. Here's a good example of this technique on Chocolate.com:

Search spiders, on the other hand, aren't confused by multiple choices. Secondary navigation, which includes links within the body copy, provides plenty of opportunity to place keywords in links. Good for usability, too.

As your site grows, new content is linked to by secondary navigation. The key is to continually monitor which content produces the most money/visitor response. Elevate successful topics higher up your navigation tree, and relegate loss-making topics.

Use your analytics package to do this. In most packages, you can get breakdowns of the most popular, and least popular, pages. Organise this list by "most popular". Your most popular pages should be at the top of your navigation tree. You also need to consider your business objectives. Your money pages might not be the same pages as your most popular pages, so it's also a good idea to set up funnel tracking to ensure the pages you're elevating also align with your business goals.
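Here's a hypothetical sketch of that triage in Python. The analytics numbers are made up, and ranking by conversions ahead of raw traffic is just one defensible weighting:

    # Placeholder analytics export: (page, visits, conversions).
    pages = [
        ("/blue-widgets/", 12000, 240),
        ("/red-widgets/", 9500, 310),
        ("/widget-history/", 15000, 12),
        ("/green-widgets/", 4000, 95),
    ]

    # Money pages first: sort by conversions, then by raw traffic.
    ranked = sorted(pages, key=lambda p: (p[2], p[1]), reverse=True)

    # Keep the seven-choice limit discussed above for main navigation.
    main_nav = [page for page, visits, conversions in ranked[:7]]
    print("Promote to main navigation:", main_nav)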

If a page is ranking well for a term, and that page is getting good results, you might want to consider adding a second page targeting the same term. Google may then group the pages together, effectively giving you listings #1 and #2.

Subject Themeing

A variant on Main & Secondary Navigation is subject themeing.

Themeing is a controversial topic in SEO. The assumption is that the search engines will try to determine the general theme of your site, so you should keep all your pages based around a central theme.

The theory goes that you can find out what words Google places in the same "theme" by using the tilde ~ command in Google. For example, if you search on ~ cars, you'll see "automobile", "auto", "bmw" and other related terms highlighted in the SERP results. You use these terms as headings for pages in your site.

However, many people feel that themes do not work, because search engines return individual pages, not sites. Therefore, it follows that the topic of other pages on the site aren't directly attributable to the ranking of an individual page.

Without getting into a debate about the existence or non-existence of theme evaluation in the algorithm, themeing is a great way to conceptually organize your site and research keywords.

Establish a central theme, then create a list of sub-topics made up of related (~) terms. Make sub-topics of sub-topics. Eventually, your site resembles a pyramid structure. Each sub-topic is organized into a directory folder, which naturally "loads" keywords into URL strings, breadcrumb trails, etc. The entire site is made up of keywords related to the main theme.
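As a rough illustration, here's a Python sketch that walks a hypothetical theme pyramid and prints the keyword-loaded URLs and breadcrumb trails that fall out of it:

    # Hypothetical pyramid: central theme -> sub-topics -> sub-sub-topics.
    pyramid = {
        "cars": {
            "auto-insurance": {},
            "auto-parts": {
                "bmw-parts": {},
                "ford-parts": {},
            },
        }
    }

    def list_urls(tree, path=""):
        """Walk the pyramid, emitting URL strings and breadcrumb trails."""
        for topic, children in tree.items():
            url = f"{path}/{topic}"
            breadcrumb = " > ".join(url.strip("/").split("/"))
            print(f"{url}/   (breadcrumb: {breadcrumb})")
            list_urls(children, url)

    list_urls(pyramid)
    # /cars/, /cars/auto-insurance/, /cars/auto-parts/, /cars/auto-parts/bmw-parts/ ...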

Bruce Clay provides a good overview of Subject Themeing.

Bleeding Page Rank?

You might also wish to balance the number of outgoing links with the number of internal links. Some people are concerned about this aspect, i.e. so-called "bleeding page rank". A page doesn't lose page rank because you link out, but linking does affect the level of page rank available to pass to other pages. This is also known as link equity.

It is good to be aware of this, but don't let it dictate your course of action too much. Remember, an outbound link is a potential advertisement for your site, in the form of referral data in someone else's logs. A good rule of thumb is to balance the number of internal links with the number of external links. Personally, I ignore this aspect of SEO site construction and instead focus on providing visitor value.

Link Equity & No Follow

Another way to control the link equity that flows around your site is to use the no-follow tag. For example, check out the navigational links at the bottom of the page:

As these target pages aren't important in terms of ranking, you could no-follow those links to ensure your main links have more link equity to pass to other pages.

Re-Focus On The Most Important Content

This might sound like sacrilege, but it can often pay not to let search engines display all the pages in your site.

Let's say you have twenty pages, all titled "Acme". Links containing the keyword term "Acme" point to various pages. What does the algorithm do when faced with these pages? It doesn't display all of them for the keyword term "Acme". It chooses the one page it considers most worthy, and displays that.

Rather than leave it all to the algorithm, it often pays to pick the single most relevant page you want to rank, and 301 all the other similarly-themed pages to point to it. Here are some instructions on how to 301 pages.

By doing this, you focus link equity on the most important page, rather than splitting it across multiple pages.
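The mechanics depend on your server (Apache users typically do it with .htaccess rules), but as a minimal self-contained illustration, here's a toy Python redirect server. The page mappings are hypothetical:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical retired "Acme" pages, all pointing at the one page we want to rank.
    REDIRECTS = {
        "/acme-old.html": "/acme.html",
        "/acme-2007.html": "/acme.html",
    }

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            target = REDIRECTS.get(self.path)
            if target:
                # 301 marks the move as permanent, so engines consolidate
                # link equity on the target page.
                self.send_response(301)
                self.send_header("Location", target)
                self.end_headers()
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), RedirectHandler).serve_forever()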

Create Cross Referenced Navigational Structures

Aaron has a good tip regarding cross-referencing within the secondary page body text. I'll repeat it here for good measure:

This idea may sound a bit complex until you visualize it as a keyword chart with an x and y axis.

Imagine that a, b, c, ... z are all good keywords.
Imagine that 1, 2, 3, ... 10 are all good keywords.

If you have a page on each subject, consider placing the navigation for a through z in the sidebar while using links and brief descriptions for 1 through 10 as the content of the page. If people search for d7, or b9, that cross-referencing page will be relevant for it, and if it is done well it does not look too spammy. Since these types of pages can spread link equity across so many pages of different categories, make sure they are linked to well, high up in the site's structure. These pages work especially well for categorized content cross-referenced by locations.
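Here's a minimal Python sketch of that x/y idea, using made-up category and location keywords:

    # Made-up keyword axes: categories (the a..z axis) and locations (the 1..10 axis).
    categories = ["plumbers", "electricians", "builders"]
    locations = ["austin", "boston", "chicago"]

    # One cross-reference page per category: sidebar links cover every category,
    # while the body links out to each category + location combination, so a
    # search like "plumbers boston" has a relevant, well-linked page to land on.
    for category in categories:
        print(f"Page: /{category}/")
        print("  sidebar:", ", ".join(f"/{c}/" for c in categories))
        for loc in locations:
            print(f'  body link: <a href="/{category}/{loc}/">{category} in {loc}</a>')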


Where Are You Placed On The Quality Curve?

Sep 28th
posted in

Techcrunch is publishing a rumour that Yahoo might be looking to sell off Yahoo Answers.

"Yahoo Answers, which was launched in late 2005, is a staggeringly huge site. Recent Comscore stats say the service attracts nearly 150 million monthly visitors worldwide and generates 1.3 billion monthly page views. That's 67% unique visitor growth in the last year. Yahoo as a whole, though, has nearly 100 billion monthly page views, so it isn't a material percentage of total Yahoo traffic"

Nice traffic; however, Yahoo Answers is full of junk content, and there are now numerous competitors in the Q&A space.

If you're the first mover, as Yahoo was, you can get away with low-quality content, but as competition increases, the quality must also increase in order to keep people hooked. Whilst hugely successful in terms of traffic numbers, Yahoo Answers must now respond to increasing competition. With rumours of a sale, it looks like Yahoo may instead be refocusing its efforts on its core business.

This is an example of the "curve to quality" pattern. First movers can get away with junk content for a while, but eventually competitors will up the quality and gain audience share as a result. This reinforces the need to adapt business models in light of competition, and the need to avoid commodity status.

We can see the same curve to quality pattern in the blog world.

Jakob Nielsen was advising a world leader in his field on what to do about his website. The guy wanted to know if he should start a blog.

Nielsen's answer was no, and here's why:

"Blog postings will always be commodity content: there's a limit to the value you can provide with a short comment on somebody else's work. Such postings are good for generating controversy and short-term traffic, and they're definitely easy to write. But they don't build sustainable value. Think of how disappointing it feels when you're searching for something and get directed to short postings in the middle of a debate that occurred years before, and is thus irrelevant."

Also check out the graph "variability of posting quality" in Nielsen's post.

I suspect Nielsen is on the right track. Blog traffic is reportedly at an all-time high, but blogs still account for only 0.73% of US traffic. Perhaps as the quality of the average blog increases, so too will the audience share.

Due to the pressure of competition, low-quality content eventually becomes a commodity.

Do you read me-too search blogs? Not many people do. Most people gravitate towards the blogs that offer the highest perceived level of quality, as opposed to those that repeat the same news found elsewhere. Me-too content is no longer an effective strategy in the blog world, or the newspaper world, as syndicated news services are finding out. There is simply too much competition.

There are other reasons why you might want to focus on quality as a strategy.

Google will always try to filter out low-quality, commodity content in order to improve the user experience. Google approaches this problem in a number of ways.

In the remote quality rater document, Google lists a range of categories raters can attribute to web content. One category is "Not Relevant". This category applies to "news items that appear outdated" and "lower quality pages about the topic". Obviously, "lower quality" is a relative term and the comparison would be made between competing SERP results. Pages categorised as "Not Relevant" will receive lower SERP placement.

Also consider the notion of poison words. Poison words are words the search engines equate with content of low quality. If, just for example, forum content is found to frequently be of low quality, then it is reasonable to assume Google will look for markers that a site is a forum and mark its content down as a result. Markers might include a link back to a popular forum software script, for example.

This metric would not be taken in isolation as there are various other quality markers Google use. However, if the content is low quality and appears in a low quality format, you stand less chance of ranking for competitive queries.

The same might apply to commercial content, especially such content that appears in non-commercial query results.

Google's business model involves advertisers paying for clicks in the form of Adwords. The main SERPs are essentially a loss leader that facilitate people clicking on text advertisements. The main SERPs are the reason people use Google.

Such a business model would be supported by an algorithm that rewarded quality, informative content in the main SERPs. It could operate by downgrading any content deemed purely commercial, and this would involve looking for commercially-oriented poison words. Poison words in this context might include "Buy Now", "Business Address", and other variants unique to commercial content. This would "encourage" those with commercial messages to list with Adwords, because they would have trouble appearing in the main SERPs. It is unlikely such an algorithm would apply to commercial queries, however.
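To make the idea concrete, here's a purely speculative Python sketch; the word list and the scoring are illustrative only, not anything Google has confirmed:

    # Speculative list of commercially-oriented "poison words".
    COMMERCIAL_POISON_WORDS = ["buy now", "business address", "free shipping"]

    def commercial_score(text):
        """Count poison-word hits as a crude commercial-intent signal."""
        text = text.lower()
        return sum(text.count(word) for word in COMMERCIAL_POISON_WORDS)

    page = "A history of the widget. Buy now! Free shipping on all orders."
    print(commercial_score(page))  # 2 -> reads as commercial content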

Google filters in this way because there is much competition for keyword queries. Google looks to find the best answer. The answer of highest quality, both in terms of relevance and searcher satisfaction. As competition increases, the answers will get better, which is why you must aim to stay high on the quality curve.
