I'd like to take a look at an area often overlooked in SEO.
Site architecture is important for SEO for three main reasons:
Focus on the most important keyword terms
Control the flow of link equity around the site
Ensure spiders can crawl the site
Simple, eh. Yet many webmasters get it wrong.
Let's take a look at how to do it properly.
Evaluate The Competition
Once you've decided on your message and your plan, the next step is to lay out your site structure.
Start by evaluating your competition. Grab your list of keyword terms, and search for the most popular sites listed under those terms. Take a look at their navigation. What topic areas do they use for their main navigation scheme? Do they use secondary navigation? Are there similarities in topic areas across competitor sites?
Open a spreadsheet, and list their categories, and title tags, and look for keyword patterns. You'll soon see similarities. By evaluating the navigation used by your competition, you'll get a good feel for the tried-n-true "money" topics.
You can then run these sites through metrics sites like Compete.com.
Use the most common, heavily trafficked areas as your core navigation sections.
The Home Page Advantage
Those who know how PageRank functions can skip this section.
Your home page will almost certainly have the highest level of authority.
While there is plenty of debate about the merits of PageRank when it comes to ranking, it is fair to say that PageRank is a rough indicator of a page's level of authority. Pages with more authority are spidered more frequently and enjoy higher rankings than pages with lower authority. The home page is often the page with the most links pointing to it, so the home page typically has the highest level of authority. Authority passes from one page to the next.
For each link off a page, the authority level will be split.
For example - and I'm simplifying* greatly for the purposes of illustration - if you have a home page with ten units of link juice, two links to two sub-pages would see each sub-page receive five units of link juice. If each sub-page in turn has two links, each sub-sub-page would receive two and a half units of link juice, and so on.
The important point to understand is that the further your pages are away from the home page, generally the less link juice those pages will have, unless they are linked from external pages. This is why you need to think carefully about site structure.
For SEO purposes, try to keep your money areas close to the home page.
*Note: Those who know how PageRank functions will realise my explanation above is not technically correct. The way PageRank splits is more sophisticated than that given in my illustration. For those who want a more technical breakdown of the PageRank calculations, check out Phil's post at WebWorkshop.
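The simplified even-split model above can be sketched in a few lines of code. To be clear, this illustrates only the simplification, not the real PageRank formula, and the page names are invented:

```python
# Toy model: a page divides its link juice evenly among the pages it links to.
# This mirrors the simplified illustration above, not the real PageRank maths.

def distribute_juice(juice, links):
    """Split a page's juice evenly across its outbound links."""
    if not links:
        return {}
    share = juice / len(links)
    return {page: share for page in links}

# Home page with ten units of juice and two links: five units each.
home_split = distribute_juice(10, ["products", "articles"])

# A sub-page splits its five units across its own two links: 2.5 units each.
# Juice keeps halving at every level, which is why deep pages end up weak.
sub_split = distribute_juice(home_split["products"], ["widgets", "gadgets"])
```

Run the split for a few levels and you can see why pages buried deep in the structure end up with very little juice unless external links prop them up.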
How Deep Do I Go?
Keeping your site structure shallow is a good rule of thumb. So long as your main page is well linked, all your internal pages will have sufficient authority to be crawled regularly. You also achieve clarity and focus.
A shallow site structure is not just about facilitating crawling. After all, you could just create a Google Sitemap and achieve the same goal. Site structure is also about selectively passing authority to your money pages, and not wasting it on pages less deserving. This is straightforward with a small site, but the problem gets more challenging as your site grows.
One way to manage scale is by grouping your keyword terms into primary and secondary navigation.
Main & Secondary Navigation
Main navigation is where you place your core topics i.e. the most common, highly trafficked topics you found when you performed your competitive analysis. Typically, people use tabs across the top, or a list down the left-hand side of the screen. Main navigation appears on every page.
Secondary navigation consists of all other links, such as latest post, related articles, etc. Secondary navigation does not appear on every page, but is related to the core page upon which it appears.
One way to split navigation is to organize your core areas into the main navigation tabs across the top, and provide secondary navigation down the side.
For example, let's say your main navigation layout looked like this:
Each time I click a main navigation term, the secondary navigation down the left-hand side changes. The secondary navigation consists of keywords related to the core area.
Various studies indicate that humans are easily confused when presented with more than seven choices. Keep this in mind when creating your core navigation areas.
If you offer more than seven choices, find ways to break things down further. For example, by year, manufacturer, model, classification, etc.
You can also break these areas down with an "eye break" between each. Here's a good example of this technique on Chocolate.com:
Search spiders, on the other hand, aren't confused by multiple choices. Secondary navigation, which includes links within the body copy, provides plenty of opportunity to place keywords in links. Good for usability, too.
As your site grows, new content is linked to by secondary navigation. The key is to continually monitor what content produces the most money/visitor response. Elevate successful topics higher up your navigation tree, and relegate loss-making topics.
Use your analytics package to do this. In most packages, you can get breakdowns of the most popular, and least popular, pages. Organise this list by "most popular". Your most popular pages should be at the top of your navigation tree. You also need to consider your business objectives. Your money pages might not be the same pages as your most popular pages, so it's also a good idea to set up funnel tracking to ensure the pages you're elevating also align with your business goals.
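As a sketch of that monitoring loop, assuming you can export pageviews and conversions per URL from your analytics package (the URLs and figures below are invented):

```python
# Rank pages by raw popularity, then re-rank by business value.
# All URLs and numbers here are hypothetical illustrations.

pages = [
    {"url": "/widgets", "pageviews": 5200, "conversions": 130},
    {"url": "/about", "pageviews": 4100, "conversions": 2},
    {"url": "/gadgets", "pageviews": 900, "conversions": 85},
]

# Most popular first, as you'd see in an analytics report.
by_popularity = sorted(pages, key=lambda p: p["pageviews"], reverse=True)

# But elevate by conversion rate, not raw traffic: your money pages
# might not be your most popular pages.
def conversion_rate(page):
    return page["conversions"] / page["pageviews"]

by_value = sorted(pages, key=conversion_rate, reverse=True)
top_money_page = by_value[0]["url"]
```

In this invented data set, /widgets wins on traffic but /gadgets wins on conversion rate, which is exactly the tension between "most popular" and "money pages" described above.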
If a page is ranking well for a term, and that page is getting good results, you might want to consider adding a second page targeting the same term. Google may then group the pages together, effectively giving you listings #1 and #2.
A variant on Main & Secondary Navigation is subject theming.
Theming is a controversial topic in SEO. The assumption is that the search engines will try to determine the general theme of your site, therefore you should keep all your pages based around a central theme.
The theory goes that you can find out what words Google places in the same "theme" by using the tilde ~ command in Google. For example, if you search on ~cars, you'll see "automobile", "auto", "bmw" and other related terms highlighted in the SERP results. You use these terms as headings for pages in your site.
However, many people feel that themes do not work, because search engines return individual pages, not sites. Therefore, it follows that the topic of other pages on the site aren't directly attributable to the ranking of an individual page.
Without getting into a debate about the existence or non-existence of theme evaluation in the algorithm, theming is a great way to conceptually organize your site and research keywords.
Establish a central theme, then create a list of sub-topics made up of related (~) terms. Make sub-topics of sub-topics. Eventually, your site resembles a pyramid structure. Each sub-topic is organized into a directory folder, which naturally "loads" keywords into URL strings, breadcrumb trails, etc. The entire site is made up of keywords related to the main theme.
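To see how the pyramid "loads" keywords into URL strings and breadcrumb trails, you can generate both from a nested theme outline. A minimal sketch, using an invented "cars" theme:

```python
# Derive URL paths and breadcrumb trails from a nested theme pyramid.
# The theme and sub-topics below are invented for illustration.

theme = {
    "cars": {
        "automobile-reviews": {"sedan-reviews": {}, "suv-reviews": {}},
        "auto-parts": {},
    }
}

def walk(tree, trail=()):
    """Yield (url, breadcrumb) pairs for every node in the pyramid."""
    for name, children in tree.items():
        path = trail + (name,)
        url = "/" + "/".join(path) + "/"
        breadcrumb = " > ".join(path)
        yield url, breadcrumb
        yield from walk(children, path)

pages = dict(walk(theme))
```

Every directory level adds a related keyword to the URL and the breadcrumb, so the structure itself does part of the keyword work.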
You might also wish to balance the number of outgoing links with the number of internal links. Some people are concerned about this aspect, i.e. so-called "bleeding page rank". A page doesn't lose page rank because you link out, but linking does affect the level of page rank available to pass to other pages. This is also known as link equity.
It is good to be aware of this, but not let it dictate your course of action too much. Remember, outbound linking is a potential advertisement for your site, in the form of referral data in someone else's logs. A good rule of thumb is to balance the number of internal links with the number of external links. Personally, I ignore this aspect of SEO site construction and instead focus on providing visitor value.
Link Equity & No Follow
Another way to control the link equity that flows around your site is to use the no-follow tag. For example, check out the navigational links at the bottom of the page:
As these target pages aren't important in terms of ranking, you could no-follow these links to ensure your main links have more link equity to pass to other pages.
Re-Focus On The Most Important Content
This might sound like sacrilege, but it can often pay not to let search engines display all the pages in your site.
Let's say you have twenty pages, all titled "Acme". Links containing the keyword term "Acme" point to various pages. What does the algorithm do when faced with these pages? It doesn't display all of them for the keyword term "Acme". It chooses the one page it considers most worthy, and displays that.
Rather than leave it all to the algorithm, it often pays to pick the single most relevant page you want to rank, and 301 all the other similarly-themed pages to point to it. Here are some instructions on how to 301 pages.
By doing this, you focus link equity on the most important page, rather than splitting it across multiple pages.
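The consolidation itself is just a mapping from duplicate URLs to the one canonical page. In practice you'd implement it with your web server's redirect facility; a sketch of the same idea as a lookup, with all URLs invented:

```python
# Map similarly-themed pages onto the single page you want to rank.
# All URLs below are hypothetical examples.

CANONICAL = "/acme/"
DUPLICATES = ["/acme-old/", "/acme-2006/", "/products/acme-info/"]
REDIRECTS = {src: CANONICAL for src in DUPLICATES}

def resolve(path):
    """Return (status, location) for a requested path."""
    if path in REDIRECTS:
        # 301 is a permanent redirect, which passes the link equity along.
        return 301, REDIRECTS[path]
    return 200, path
```

Every link pointing at any of the old "Acme" pages now resolves, permanently, to the one page you've chosen to rank.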
This idea may sound a bit complex until you visualize it as a keyword chart with an x and y axis.
Imagine that a, b, c, ... z are all good keywords.
Imagine that 1, 2, 3, ... 10 are all good keywords.
If you have a page on each subject, consider placing the navigation for a through z in the sidebar, while using links and brief descriptions for 1 through 10 as the content of the page. If people search for d7, or b9, that cross-referencing page will be relevant for it, and if it is done well it does not look too spammy. Since these types of pages can spread link equity across so many pages of different categories, make sure they are linked to well, high up in the site's structure. These pages work especially well for categorized content cross-referenced by location.
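The x/y cross-referencing above can be sketched by generating a page for every keyword pair. The letters and numbers stand in for real keyword lists, just as in the chart:

```python
# Generate cross-reference pages for every (x, y) keyword pair.
# In practice, x_terms might be categories and y_terms locations.
import itertools

x_terms = [chr(c) for c in range(ord("a"), ord("z") + 1)]  # a .. z
y_terms = [str(n) for n in range(1, 11)]                   # 1 .. 10

# Each pair gets its own page, e.g. the "d7" page lives at /d/7/.
cross_pages = {
    f"{x}{y}": f"/{x}/{y}/"
    for x, y in itertools.product(x_terms, y_terms)
}
```

Twenty-six terms crossed with ten produces 260 pages from two short keyword lists, which is why these pages need strong links from high in the site structure to keep them all crawled.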
The following is a guest blog post by Jeremy L. Knauff from Wildfire Marketing Group, highlighting many of the recent changes to the field of SEO.
Marketing is constantly evolving and no form of marketing has evolved more over the last ten years than search engine optimization. That fact isn’t going to change anytime soon. In fact, the entire search engine optimization industry is headed for a major paradigm shift over the next twelve months. Like many of the major algorithm updates in the past, some people will be prepared while some will sit teary-eyed amongst their devastation wondering what happened and scrambling to pick up the pieces. Unlike the major algorithm updates of the past, you won’t be able to simply fix the flaws in your search engine optimization and jump back to the top of the SERPs.
Why is this change going to be so different? In the past, the search engines have incrementally updated certain aspects of their algorithms to improve the quality of their SERPs, for example, eliminating the positive effect of Meta tag keyword stuffing which was being abused by spammers. Anyone who has been in the SEO industry for more than a few years probably remembers the chaos and panic when the major search engines stopped ranking websites based on this approach. This time around though, we’re looking at something much more significant than simply updating an algorithm to favor particular factors or discount others. We are looking at not only a completely new way for search engines to assign value to web pages, but more importantly, a new way for search engines to function.
A number one ranking for a particular keyword phrase was once the end-all, be-all goal but now many searches are regionalized to show the most relevant web pages that are located in the area that you are searching from. While this will probably reduce your traffic, the traffic that you now receive will be more targeted in many cases. Additionally, it gives smaller websites a more equal chance to compete.
This August, Google Suggest was moved from Google Labs to the homepage, offering real-time suggestions based on the letters you’ve typed into the search box so far. This can be an incredibly helpful feature for users. At the same time, it can be potentially devastating to websites that rely on long-tail traffic because once a user sees a keyword phrase that seems like at least a mediocre choice they will usually click on it rather than continuing to type a more specific keyword phrase.
Devaluation of paid links
Google’s recent attempt to eliminate paid links has scared a lot of people on both sides of the link buying equation into implementing the “nofollow” tag. In the midst of this hypocritical nonsense, Google has also been taking great measures to devalue links based on quantifiable criteria, such as the “C” class of the originating IP, similarities in anchor text and/or surrounding text, location of the link on the page and the authority of the domain the link is from, to name a few. Regardless of the effectiveness of any search engine’s ability to evaluate and subsequently devalue paid links, the fear of getting caught and possibly penalized is more than enough to deter a lot of people from buying or selling links.
Visitor usage data
Again, Google is leading the charge on this one. Between their analytics, toolbar and web browser, they are collecting an enormous amount of data on visitor usage. When a visitor arrives at a website, Google knows how long they stayed there, how many pages they accessed, which links they followed and much more. With this data, a search engine can determine the quality of a website, which is beginning to carry more weight in regards to ranking than some of the more easily manipulated factors such as keyword density or inbound links. This puts the focus on content quality instead of content quantity and over time, will begin to knock many of the “me too” websites further down the SERPs, or out of the picture altogether. The websites that will prosper will be those that produce relevant, original content that their visitors find useful.
Simply pointing a vast number of links with a particular keyword phrase in the anchor text to a website was once a quick and easy way to assure top ranking. The effectiveness of this approach is diminishing and will continue in that direction as a result of TrustRank. In a nutshell, a particular set of websites are chosen (by Google) based on their editorial quality and prominence on the Internet. Then Google analyzes the outbound links from these sites, the outbound links from the sites linked to by these sites, and so on down the chain. The sites that are further up the chain carry more trust and those further down the chain, less trust. Links from sites with more TrustRank, those further up the chain, have a greater impact on ranking than links from websites further down the chain. On one hand, this makes it difficult for new websites to improve their position in the SERPs compared to established websites; on the other hand, it helps to eliminate many of the redundant websites out there that are just repeating what everyone else is saying.
Utilizing a combination of visitor usage data and a not-so-gentle nudge in Google’s direction, Google Chrome is set to change the way search engines gather data and present it to users. For example, when a user begins typing in the address bar of the browser, they are presented with a dropdown list of suggestions containing choices consisting of the #1 result in Google’s SERPs, related search terms and other pages they’ve recently visited. This gives a serious advantage to the websites that hold top ranking in Google and at the same time, gives a serious advantage to Google by giving their Internet real estate even more exposure than ever before.
So the question remains, is search engine optimization facing evolution or extinction? Certainly not extinction, not by a long shot, but in a short period of time it is going to be drastically different than it is today. The focus will soon be on producing a valuable and enjoyable user experience rather than just achieving top ranking, which is what it should have been all along.
Following on from my post yesterday, How To Craft Kick-Ass Title Tags & Headlines, let's look at meta tags as an advertisement, and why you need to think carefully about your offer, and the offers of your competition, when you craft your tags.
Why Are Title Tags Important?
Ranking debates aside, the main reason Title tags are important is because they are displayed, in bold, in the SERPs.
A SERP is a list of 20+ links, all clamoring for the visitor's click. It is therefore important to entice visitors to click on your listing, rather than everyone else's. Sometimes you achieve this by rank placement alone, but with well-crafted tags, you stand a better chance of receiving that click.
What Is The Optimal Length For A Title Tag?
The W3C recommends the title tag should be less than 64 characters long.
Some SEOs think that long, keyword-loaded tags are the best approach. Some SEOs think short, punchy tags are best, as long tags may dilute the weight of the keyword phrase, and there is less risk of Google cutting off your message midstream.
Because other factors play a more significant role in terms of rank, I ignore prescriptive tag lengths. Instead, I look to optimize the message in line with the business goals of a site.
Know Your Enemy
This is a proven AdWords strategy which also dovetails nicely into SEO.
The first step is to evaluate your surrounding competition.
Look at the wording of the most successful AdWords ad for your chosen keyword term. Your aim is to replicate success. Run an AdWords campaign and experiment with the wording to find the combination that receives the most clicks and subsequent desired action. You then craft your title tags and description tags to match. What works for AdWords works in the main SERPs, too.
Another way to approach title tags is to constantly rotate the tags using a script, and monitor the results. This is a split-run approach known as Keyword Spinning. You keep the winners and cut the losers. This approach is described in my post "Tested Advertising Strategies Respun For SEO".
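A minimal sketch of that split-run rotation: serve candidate title tags in turn, record clicks per impression, and keep the winner. The titles and figures below are invented:

```python
# Rotate title tag variants and keep the one with the best click-through rate.
# The variant titles here are hypothetical examples.
import random

variants = {
    "Acme Widgets - Buy Online": {"impressions": 0, "clicks": 0},
    "Acme Widgets - Half Price This Week": {"impressions": 0, "clicks": 0},
}

def serve_title():
    """Pick a variant at random for this page view and log the impression."""
    title = random.choice(list(variants))
    variants[title]["impressions"] += 1
    return title

def record_click(title):
    variants[title]["clicks"] += 1

def winner():
    """Return the variant with the highest click-through rate so far."""
    def ctr(stats):
        return stats["clicks"] / stats["impressions"] if stats["impressions"] else 0.0
    return max(variants, key=lambda t: ctr(variants[t]))
```

Once one variant has a clearly better click-through rate, you cut the loser and spin up a new challenger against the winner.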
What Are The Ideal Lengths For Meta Description Tags?
Common SEO wisdom dictates the description tag should be around 160 characters long.
Again, my approach is to take prescriptive lengths with a grain of salt. Instead, focus on marketing and business goals.
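If you do want a quick sanity check against the commonly cited figures (64 characters for titles, roughly 160 for descriptions, as mentioned above), a small sketch:

```python
# Flag title and description tags that risk being cut off in the SERPs.
# The limits are the rough figures cited above, not hard rules.
TITLE_LIMIT = 64
DESCRIPTION_LIMIT = 160

def check_tags(title, description):
    """Return a list of warnings for tags exceeding the rough limits."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"title is {len(title)} chars (> {TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(
            f"description is {len(description)} chars (> {DESCRIPTION_LIMIT})"
        )
    return warnings
```

Treat a warning as a prompt to check how the listing actually displays, not as an order to truncate a message that works.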
The title and description are clear and descriptive. There is a call to action and an appeal to self-interest.
This is a jumble:
The title and descriptions are confused. It is not clear what the benefit is to the visitor.
One problem is that Google sometimes uses a snippet of text from the page instead of your description tag. Google may also use a DMOZ description.
Google will use the snippet when it finds no description tag, or determines the description tag you provided is inappropriate. To improve the chances your meta description tag will be used, see Google's guide: "Improve Snippets With A Meta Description Makeover". Essentially, you need to make your meta description tag descriptive, as opposed to a series of keywords.
You can prevent search engines from using the DMOZ description by adding the following meta tag: <meta name="robots" content="noodp">
Informational queries are meant to obtain data or information in order to address an informational need, desire, or curiosity.
Navigational queries are looking for a specific URL.
Transactional queries are looking for resources that require another step to be useful.
Query classifications can be broken down further into the following sub-categories:
Directed: A specific question, e.g. "Registering a domain name".
Undirected: Tell me everything about a topic, e.g. "Singers in the 80s".
List of candidates: A list of candidate items, e.g. "Things to do in Hollywood".
Find: Locate where some real-world service or product can be obtained, e.g. "PVC suit".
Advice: Advice, ideas, suggestions, instructions, e.g. "What to serve with roast pork tenderloin".
Navigation to transactional: The URL the user wants is a transactional site, e.g. "match.com".
Navigation to informational: The URL the user wants is informational, e.g. "google.com".
Obtain: Obtain a specific resource or object, e.g. "Music lyrics".
Download: Find a file to download, e.g. "mp3 downloads".
Results page: Obtain a resource that can be printed, saved, or read directly from the search engine results page (the user enters a query expecting the answer to appear on the results page itself, without browsing to another website).
Interact: Interact with a program or resource on another website, e.g. "buy table clock".
And further by sub-category type:
Closed: Deals with one topic; a question with one, unambiguous answer, e.g. "Nine supreme court justices".
Open: Deals with two or more topics, e.g. "excretory system of arachnids".
Online: The resource will be obtained online, e.g. "Things to do in Hollywood".
Off-line: The resource will be obtained off-line and may require additional actions by the user, e.g. "Airline seat map".
Free: The downloadable file is free, e.g. "Full metal alchemist wallpapers free".
Not free: The downloadable file is not necessarily free, e.g. "family guy episode".
Links: The resource appears in the title, summary, or URL of one or more of the results on the search engine results page.
Other: The resource does not appear in one of the results, but somewhere else on the search engine results page.
Source: "Determining the informational, navigational, and transactional intent of Web queries", Bernard J. Jansen, Danielle L. Booth, Amanda Spink; Pennsylvania State University.
When crafting your tags, think about what classification of query the searcher is undertaking. How would they structure it? What terms would they use? Would they phrase their query as a question? What words would they include? What words would they omit? Dig deep into your keyword research tools and web logs to find this data.
Think about their mindset. Using words like research and compare help you tap into people in the research mode, whereas words like buy, save, coupons, and free shipping attract people ready to buy.
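As a rough sketch of using those mindset words, here is a naive classifier that buckets a query by the words it contains. The word lists are invented and far from complete, and real intent classification (per the Jansen et al. paper cited above) is far more involved:

```python
# Naive query-mindset bucketing based on trigger words.
# The trigger lists are illustrative assumptions, not a tested vocabulary.

RESEARCH_WORDS = {"research", "compare", "review", "vs"}
BUY_WORDS = {"buy", "save", "coupons", "coupon", "shipping", "cheap"}

def classify_mindset(query):
    """Bucket a query as 'researching', 'ready to buy', or 'unknown'."""
    words = set(query.lower().split())
    if words & BUY_WORDS:
        return "ready to buy"
    if words & RESEARCH_WORDS:
        return "researching"
    return "unknown"
```

Running your keyword research list through even a crude bucketing like this helps you decide which pages should lead with a research-mode message and which should lead with a buying-mode message.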
A Call To Action
The title tag and description provide opportunities to include calls to action. A call to action is a phrase that prompts a visitor to take a step along the sales process.
The keyword term you've selected might give you a clue as to what point of the sales process the visitor is at. Obviously, "Buy X Online Overnight Delivery" tends to indicate a visitor is about to hand over the cash, so you draft your title tag and description accordingly in order to help close the deal.
However, most keyword terms aren't this overt. This is where you need to think about the type of offer you present.
How To Decide Between A Hard Offer And A Soft Offer
Some of the most effective offers are seldom "reasons to buy", but rather "reasons to respond." This is the difference between a hard and soft offer.
The vast majority of searchers are not ready to buy, so by using a soft offer, you stand to capture a greater number of leads than you would if you just made a hard "buy right now!" offer. If all you've got is a hard offer, then visitors who aren't ready to buy will click back, or won't select your SERP result at all.
Instead, encourage the visitor to take a relatively painless action, such as joining a mailing list, or downloading a free case study.
You can take this a step further by using the case study title to find out more about your visitors. For example, a case study entitled "Real Estate" won't tell you much about the problem your visitor is trying to solve, but a descriptive title, such as "Seven Ways To Sell Your Own Home" will. If they download the latter, and your service solves this problem for people, you're one step closer to making the sale.
Benefits Of The Soft Offer
You'll generate more leads
You have the opportunity to enter a dialogue with the visitor, thus moving them through the process
Only you'll know if a hard offer or a soft offer is most appropriate. But think carefully about the nature of your offer when crafting your titles and descriptions. Is your offer exactly the same as every other offer in the SERP? Or could you tweak your offer to make it stand out from the rest? Your offer should be more enticing than every other offer on the page. Try to get this across in your title and description.
One old-skool marketing technique that will always hold true is the value of the catchy headline.
The headline, given its power to convey meaning quickly, is more important than ever. Attention spans are limited. Media messages flood the channels. We're busy. The function of the headline is to grab our attention and pull us deeper into the message.
Many books have been written on how to craft great headlines. I'm going to quote from the advertiser's bible on the topic, Tested Advertising Methods by John Caples. Caples identifies three main classes of successful headlines.
The Three Main Classes Of Successful Headlines
Self-Interest: The best headlines are those that appeal to self-interest. They offer the reader benefits that they want, and can get from you. For example, RETIRE AT 30
News: Humans are predisposed to seek out what is new and different in their environment. For example, NEW, CHEAPER IPHONE CALL PLANS RELEASED
Curiosity: An appeal to our curious nature. For example, LOST: $1 BILLION DOLLARS
Of the three, by far the most effective headline in advertising is the self-interest headline. Our self-interest usually trumps our curiosity, and news, especially when time is short.
Compare these two headlines:
PUT UP OR SHUT UP
FIVE TOTALLY NEW WAYS TO GET TOP RANKING IN GOOGLE
The first says nothing that appeals to our self interest. We don't even know what it is about. But you'd be hard pressed not to click on the second headline. The self-interest is just too strong. This is why the second form is used so often in link-baiting and social media. It screams for attention, and then makes a strong appeal to self-interest.
There is a downside to such headlines, however. Modern audiences have become jaded and cynical, especially where marketing messages are concerned. Overplay the benefit, and you'll come off as a shark. Link-baiting, a useful SEO tactic, has developed a bad reputation through overuse of this approach.
Eventually, people tune out.
Get Your Tone Right
We can twist the overused appeal-to-self-interest headline strategy slightly to make it work for us. The key to getting the appeal to self-interest right is to get the tone right. Understand both the audience's desires and the tone of "voice" they respond to.
For example, look at Digg. A cynic might argue that a surefire way to get top page on Digg is to write a headline that includes the following subject matter, and do so using an irreverent tone:
Criticism of Bush
Anything about Digg itself
Some crazy-weird activity from a country no-one has ever heard of :)
By the way, if anyone can come up with a headline that includes one of those elements, feel free to add it to the comments :)
The headline needs to be crafted in such a way as to appeal to Digg's demographic, which is mostly young, tech-savvy males. This demographic tends to respond to a tone that is cynical, flippant and irreverent. Get that tone wrong - i.e. play it too straight, or too advertorial - and it doesn't matter how strong the self-interest angle, it's unlikely to work.
How To Use Headline Strategy In SEO
SEO has an additional challenge.
For SEO to work well, the headline, which is often also used as the title tag, should include a keyword term. Many studies have shown that a SERP listing or AdWords ad that includes the keyword term receives more clicks. In order to get the headline strategy to work for SEO, try amalgamating the keyword term with one of the three formats.
For example, where the keyword term is "high speed routers", try:
High Speed Routers - How To Get Routers At Half Price (appeal to self-interest)
High Speed Routers - Latest Features To Insist On (news, with a hint of self-interest)
High Speed Routers - How We Blew Our Budget (curiosity)
Even if you're not #1 in the SERPs for that term, you're more likely to attract a click than the guy who simply uses "High Speed Routers" by itself.
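Those three formats can be templated, so every money keyword gets a candidate title in each headline class. The hook wording below is placeholder copy, not tested headlines:

```python
# Generate title-tag candidates: keyword term first, then a hook from
# each of Caples' three headline classes. Hooks are placeholder copy.

HOOKS = {
    "self-interest": "How To Get Yours At Half Price",
    "news": "Latest Features To Insist On",
    "curiosity": "How We Blew Our Budget",
}

def title_candidates(keyword):
    """Return one candidate title per headline class for a keyword term."""
    kw = keyword.title()
    return {style: f"{kw} - {hook}" for style, hook in HOOKS.items()}

titles = title_candidates("high speed routers")
```

You could feed these candidates straight into the keyword-spinning rotation described earlier and let the click data pick the winning class for each term.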
Your headline (i.e. the title tag) competes with at least ten other SERP listings on the page, along with various AdWords listings along the top and down the side. The top three SERP positions are gold, but if you can add a touch of appeal-to-self-interest, or news, or curiosity, you'll up your chances of getting the click.
If you want to go one step further with this tactic, use it as a way to segment visitors. The first example I gave is likely to attract those people who are ready to buy, and who are buying on price.
You then need to include your title as a heading on the page, which confirms to the visitor that their click has got them where they wanted to be. They're now far more likely to read beyond the headline.
These black holes aren't the result of the CERN Large Hadron Collider. They are forming for two reasons: the desire to keep people on site longer; and to hoard link juice, in order to dominate the SERPs.
Increasingly, top-tier sites are becoming cagey about linking out. They are more than happy to be linked to, of course, but often the favor is not reciprocated. Check out this post by SEOBlackhat.
What Does A Black Hole Look Like?
Uber-black hole, The New York Times, seems reluctant to link to anyone but themselves. This is especially annoying when they write about websites.
Wikipedia no-followed their links some time ago, thus forming a PageRank variant of the black hole.
The mini-me black hole, as practiced by TechCrunch. Rather than directing you to a site mentioned in an article, TechCrunch would direct you to their own CrunchBase entry instead, thereby keeping you on-site longer, and passing link authority to their own web pages. As a result, a search on Google for a site's name may well bring up the CrunchBase entry. To be fair, TechCrunch does also link out, and there is an explanation as to why TechCrunch aren't as bad as the New York Times here.
The result is a link-love black hole. Sites using such a strategy can dominate the rankings, if they are big enough.
So if you wanted to create a blackhole, what would you do?
Don't link to anyone
If you must link out, then No-Follow the links, or wrap them in scripts
Direct page rank around your own site, especially to pages featuring your competitors names
Buy a motherlode of links
Become a newspaper magnate :)
Now, if you're an SEO, you might be feeling a tad conflicted about now. Why wouldn't every SEO do this? What if you owned a black hole? Isn't that the ultimate SEO end game?
In the long term, I doubt it.
If this problem becomes too widespread, Google will move to counter it. If Google's results aren't sufficiently diversified, then their index will look stale. If you search for a site, and get third party information about that site, rather than the site itself, then this will annoy users. Once confidence is lost in the search results, then users will start to migrate to Google's competitors.
I'm not certain such a move will be entirely altruistic, however. After all, what is the point of Knol? No, really - what is the point of Knol? ;)
The Advantages Of Sharing The Love
Consider what you gain by linking out.
Webmasters look at their referrals, and may follow the link back to check out your site
Outbounds may count for more in future, if they don't already
Your users expect it. Don't fight against their expectations, or you'll devalue your brand equity
Any site that looks "too-SEO'd" risks standing out on a link graph
There is social value in doing so. Black hole sites start to look like bad actors, can receive bad press, and risk damaging their relationships with partners, suppliers, and communities.
"..... The web is a great example of a system that works because most sites create more value than they capture. Maybe the tragedy of the commons in its future can be averted. Maybe not. It's up to each of us".
The phrase Black Hole SEO was used by Eli on BlueHatSEO.com over a year ago to describe various aggressive SEO techniques.
Webmasters are often faced with the problem of how to approach SEO on websites which have a country-specific focus. As you may have noticed, the search engine results pages on Google's geo-targeted search services frequently display different rankings than those you experience on Google.com.
If you run a few queries on, say, Google.com.au, you'll soon notice distinct regionalization patterns. In order to make search results more relevant to local audiences, Google uses different sorting methodologies than those used on Google.com.
Here is a guide to optimizing sites for the different regional flavors of Google.
Country Specific Local SEO Tips
Get a local domain extension: Google places a lot of weight on the domain name, so it is important to get the appropriate country-code domain extension. If you compare results across the different geo-targeted flavors of Google, you'll notice the weight given to the local TLDs. There are exceptions, but the local TLD tends to trump .com when it comes to local result sets. Note that different countries have different criteria for domain registration: it is fairly easy to register a .co.uk or a .co.nz, whilst a .com.au can involve setting up a business entity in Australia.
Specify your country association in Google Webmaster Tools: Google Webmaster Tools offers a facility whereby you can specify a country association for your content. You can do this on a domain, sub-domain and directory level. More detailed instructions can be found on Google's Webmaster Tools Blog.
Include local contact information: Specify a local address, business name, and local contact phone numbers. Whilst not critical in terms of ranking, every little bit helps, and by including local information, the site becomes more credible to a local audience.
Local hosting: Depending on who you ask, you'll get different answers as to whether the geographic location of the web host makes a difference in terms of ranking. I have .com.au, .co.nz, and .co.uk sites, hosted on US servers, and they rank well on the appropriate local versions of Google. Other people feel that location-based hosting is a must. Still others say the location of the name server is most important! It's fair to say that if you have a choice between hosting locally and hosting offshore, then it might pay to host locally. It certainly can't hurt, and there might be additional benefits, such as increased download speed. If you go this route, one thing to check is the server's physical location. Often, web hosts have a local office, but their servers are located in a different country. Use an IP lookup tool to determine the exact location of a server.
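Getting a server's IP address is a one-line DNS lookup; you can then feed that IP to any geo-IP database or lookup service to see which country the box actually sits in. A minimal sketch in Python (the hostname you'd query is a placeholder; the geo-IP step is left to whichever service you prefer):

```python
import socket

def server_ip(hostname):
    """Resolve a hostname to the IPv4 address its DNS currently points at."""
    return socket.gethostbyname(hostname)

# Feed the returned IP into a geo-IP lookup service of your choice to
# confirm the country the server is physically located in.
print(server_ip("localhost"))  # 127.0.0.1
```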
Spelling & Language: Ensure you use the appropriate spelling for your chosen region. There is a difference between "optimization" and "optimisation". Keep in mind that searchers will use the local vernacular. For example, if you are optimizing a travel site in the US, you might use the term "vacation". However, searchers in Australia, the UK and New Zealand, amongst others, tend to use the term "holiday".
Tone: Copy that works well in one geographic location may not work in another. For example, the sales language used in the US is usually more direct than that typically used in the UK, Australia or New Zealand. Familiarize yourself with local approaches to marketing, or engage local copywriters.
Inbound links: Seek out local links. All links are good, but inbound links from local TLDs are even better. Approach your local chamber of commerce, friends, suppliers, government agencies, business partners, and local industry groups and ask them for links.
Local directories: Get your site listed in local directories. Local directories still feature well in geo-targeted search results, as the sheer volume of content in the local TLD space isn't as great as that published on .com. Obviously, you stand to gain from the local traffic that the directories send your way, and any local link juice the directory may pass on. Here are some top local directories:
Te Puna is a government run New Zealand directory.
Press releases: Try to come up with a local angle for your press releases, and submit them to local news and information channels. Small, local news outlets are highly likely to run local interest stories, which in turn may help your brand exposure and get you more local links.
Avoid duplicate content: If your market is in one country, then it makes sense to use the country-code TLD for that country. However, if you target multiple countries, consider creating different content on each domain. Placing the same content on multiple domains may risk duplicate content penalties.
Off-line marketing: Don't forget to get your name out locally. If people search for you by your brand or business name, you'll always be well positioned in the SERPs.
Have Your Say
If you have some additional ideas that have worked well for you, please feel free to add them to the comments.
I was just fixing up our robots.txt tutorial today, and figured I should blog this as well. From Eric Enge's interview of Matt Cutts I created the following chart. Please note that Matt did not say Google is more likely to ban you for using rel=nofollow, but Google has stated on multiple occasions that they treat issues differently depending on whether they think it was an accident by an ignorant person or a malicious attempt by a known SEO to spam their search engine (in language rosier than what I just wrote).
robots.txt
Crawled by Googlebot? No.
Appears in index? If the document is linked to, it may appear URL-only, or with data from links or trusted third-party data sources like the ODP.
Note: People can look at your robots.txt file to see what content you do not want indexed. Many new launches are discovered by people watching for changes in a robots.txt file.

robots meta noindex tag
Crawled by Googlebot? Yes.
Appears in index? No.
Accumulates PageRank? Yes, but it can pass on much of its PageRank by linking to other pages.
Note: Links on a noindex page are still crawled by search spiders even if the page does not appear in the search results (unless they are used in conjunction with nofollow on that page). A page using robots meta nofollow (next row) in conjunction with noindex does accumulate PageRank, but does not pass it on to other pages.

robots meta nofollow tag
Destination page crawled? Only if linked to from other documents.
Destination page appears in index? Only if linked to from other documents.
PageRank passed to destination? No.
Note: If you are pushing significant PageRank into a page and do not allow PageRank to flow out from that page, you may waste significant link equity.

link rel=nofollow
Destination page crawled? Only if linked to from other documents.
Destination page appears in index? Only if linked to from other documents.
PageRank passed to destination? No.
Note: If you are doing something borderline spammy and are using nofollow on internal links to sculpt PageRank, then you look more like an SEO and are more likely to be penalized by a Google engineer for "search spam".
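For reference, here is what each of the four mechanisms in the chart looks like in practice. Paths and URLs are placeholders; note that robots.txt is a plain-text file at the domain root, not HTML, and is shown alongside the markup only for comparison:

```html
<!-- robots.txt (a plain-text file at your domain root):
     User-agent: *
     Disallow: /private/
-->

<!-- robots meta noindex tag, placed in a page's <head>: -->
<meta name="robots" content="noindex">

<!-- robots meta nofollow tag, placed in a page's <head>: -->
<meta name="robots" content="nofollow">

<!-- the two combined (accumulates PageRank, passes none on): -->
<meta name="robots" content="noindex, nofollow">

<!-- rel=nofollow on an individual link: -->
<a href="http://example.com/" rel="nofollow">Example</a>
```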
I decided to pick David Lubertazzi and Elisabeth Sowerbutts as the winners for their SEO Knol improvement comments.
I added a few pictures and fixed up some writing errors and incorporated a bunch of the feedback (like making the introduction better - thanks Andrew). There are many things (like domain names, duplicate content, blogging, social media, conversion, history and background of SEO) that I could have discussed, but I was unsure of how long I should let the Knol get, while still claiming that it was a basic introduction. Thanks for the feedback everyone!
I try to teach my mom SEO stuff from time to time, and often do so through the use of analogies. Some analogies perhaps oversimplify the SEO process, but are good for helping get the basic concepts across.
On Page Content
fish and a fishing pole - when explaining how text heavy sites often outrank thin ecommerce sites, I like to call searchers fish and each word on the page an additional fishing pole in the water. This is really powerful when used in combination with analytics data, showing her the hundreds of phrases that people searched for to find a given page on her site...helping her see the long tail as schools of fish. :)
Don't Make Me Think - people scan more than they read. Large blocks of text are imposing. People are more likely to read well formatted content that uses many headings, subheadings, and inline links. Expect people to ignore your global navigation, and do whatever you ask them to do (via inline links).
I have worked with some large multi-national brands who had multi-lingual sites, but they typically hired us for English optimization, and never really asked for much more than general advice and strategies (internal link flow, subdomains vs unique domains, etc.) when it came to other languages and cultures. I noticed a few differences between Google.com & International Google results while traveling, but I still only analyzed stuff that was published in English.
What are the best informational sources for SEO in Japanese? SEO in Chinese? SEO in Spanish? SEO in your language or region? How do you feel SEO in your area differs from the SEO advice you read from those of us who operate in the English US marketplace? I also would love to publish a guest article for each language.