I'd like to take a look at an area often overlooked in SEO.
Site architecture is important for SEO for three main reasons:
To focus on the most important keyword terms
Control the flow of link equity around the site
Ensure spiders can crawl the site
Simple, eh. Yet many webmasters get it wrong.
Let's take a look at how to do it properly.
Evaluate The Competition
Once you've decided on your message and your plan, the next step is to lay out your site structure.
Start by evaluating your competition. Grab your list of keyword terms, and search for the most popular sites listed under those terms. Take a look at their navigation. What topic areas do they use for their main navigation scheme? Do they use secondary navigation? Are there similarities in topic areas across competitor sites?
Open a spreadsheet, and list their categories, and title tags, and look for keyword patterns. You'll soon see similarities. By evaluating the navigation used by your competition, you'll get a good feel for the tried-n-true "money" topics.
You can then run these sites through metrics sites like Compete.com.
Use the most common, heavily trafficked areas as your core navigation sections.
The Home Page Advantage
Those who know how Page Rank functions can skip this section.
Your home page will almost certainly have the highest level of authority.
While there are a lot of debates about the merits of PageRank when it comes to ranking, it is fair to say that PageRank is a rough indicator of a page's level of authority. Pages with more authority are spidered more frequently and enjoy higher ranking than pages with lower authority. The home page is often the page with the most links pointing to it, so the home page typically has the highest level of authority. Authority passes from one page to the next.
For each link off a page, the authority level will be split.
For example - and I'm simplifying* greatly for the purposes of illustration - if you have a home page with ten units of link juice, two links to two sub-pages would see each sub-page receive five units of link juice. If a sub-page has two links, each sub-sub-page would receive 2.5 units of link juice, and so on.
The important point to understand is that the further your pages are away from the home page, generally the less link juice those pages will have, unless they are linked from external pages. This is why you need to think carefully about site structure.
For SEO purposes, try to keep your money areas close to the home page.
*Note: Those who know how Page Rank functions will realise my explanation above is not technically correct. The way Page Rank splits is more sophisticated than that given in my illustration. For those who want a more technical breakdown of the Page Rank calculations, check out Phil's post at WebWorkshop.
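To make the arithmetic concrete, here's a toy sketch in Python of the even-split model described above. This is only the simplified illustration, not the real PageRank calculation (no damping factor, no iteration), and the page names are invented:

```python
# Toy model of link-juice flow: each page splits its juice evenly
# across its outgoing links. This deliberately ignores PageRank's
# damping factor and iterative calculation -- it only shows why
# deeper pages end up with less juice.

def distribute_juice(juice, links):
    """Split a page's juice evenly across its outgoing links."""
    return {page: juice / len(links) for page in links}

home_juice = 10.0
sub_pages = distribute_juice(home_juice, ["products", "services"])
# each sub-page gets 5.0 units
sub_sub_pages = distribute_juice(sub_pages["products"], ["widgets", "gadgets"])
# each sub-sub-page gets 2.5 units -- two levels down,
# only a quarter of the home page's juice remains per page
```

Notice how quickly the juice dilutes with depth, which is the whole argument for keeping money pages close to the home page.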
How Deep Do I Go?
Keeping your site structure shallow is a good rule of thumb. So long as your main page is well linked, all your internal pages will have sufficient authority to be crawled regularly. You also achieve clarity and focus.
A shallow site structure is not just about facilitating crawling. After all, you could just create a Google Site Map and achieve the same goal. Site structure is also about selectively passing authority to your money pages, and not wasting it on pages less deserving. This is straightforward with a small site, but the problem gets more challenging as your site grows.
One way to manage scale is by grouping your keyword terms into primary and secondary navigation.
Main & Secondary Navigation
Main navigation is where you place your core topics i.e. the most common, highly trafficked topics you found when you performed your competitive analysis. Typically, people use tabs across the top, or a list down the left hand side of the screen. Main navigation appears on all other pages.
Secondary navigation consists of all other links, such as latest post, related articles, etc. Secondary navigation does not appear on every page, but is related to the core page upon which it appears.
One way to split navigation is to organize your core areas into the main navigation tabs across the top, and provide secondary navigation down the side.
For example, let's say your main navigation layout looked like this:
Each time I click a main navigation term, the secondary navigation down the left hand side changes. The secondary navigation consists of keywords related to the core area.
Various studies indicate that humans are easily confused when presented with more than seven choices. Keep this in mind when creating your core navigation areas.
If you offer more than seven choices, find ways to break things down further. For example, by year, manufacturer, model, classification, etc.
You can also break these areas down with an "eye break" between each. Here's a good example of this technique on Chocolate.com:
Search spiders, on the other hand, aren't confused by multiple choices. Secondary navigation, which includes links within the body copy, provides plenty of opportunity to place keywords in links. Good for usability, too.
As your site grows, new content is linked to by secondary navigation. The key is to continually monitor what content produces the most money/visitor response. Elevate successful topics higher up your navigation tree, and relegate loss-making topics.
Use your analytics package to do this. In most packages, you can get breakdowns of the most popular, and least popular, pages. Organise this list by "most popular". Your most popular pages should be at the top of your navigation tree. You also need to consider your business objectives. Your money pages might not be the same pages as your most popular pages, so it's also a good idea to set up funnel tracking to ensure the pages you're elevating also align with your business goals.
If a page is ranking well for a term, and that page is getting good results, you might want to consider adding a second page targeting the same term. Google may then group the pages together, effectively giving you listings #1 and #2.
A variant on Main & Secondary Navigation is subject theming.
Theming is a controversial topic in SEO. The assumption is that the search engines will try to determine the general theme of your site, therefore you should keep all your pages based around a central theme.
The theory goes that you can find out what words Google places in the same "theme" by using the tilde ~ command in Google. For example, if you search on ~cars, you'll see "automobile", "auto", "bmw" and other related terms highlighted in the SERP results. You use these terms as headings for pages in your site.
However, many people feel that themes do not work, because search engines return individual pages, not sites. Therefore, it follows that the topics of other pages on the site don't directly contribute to the ranking of an individual page.
Without getting into a debate about the existence or non-existence of theme evaluation in the algorithm, theming is a great way to conceptually organize your site and research keywords.
Establish a central theme, then create a list of sub-topics made up of related (~) terms. Make sub-topics of sub-topics. Eventually, your site resembles a pyramid structure. Each sub-topic is organized into a directory folder, which naturally "loads" keywords into URL strings, breadcrumb trails, etc. The entire site is made up of keywords related to the main theme.
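As a sketch, a themed pyramid built around a hypothetical "cars" theme might map onto directory folders like this (the topic names are invented for illustration):

```
/cars/                        <- central theme (home page)
/cars/bmw/                    <- sub-topic
/cars/bmw/3-series/           <- sub-topic of a sub-topic
/cars/classic-cars/
/cars/classic-cars/mustang/
```

Each folder level loads a related keyword into the URL string and, if you use breadcrumbs, into the breadcrumb trail as well.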
You might also wish to balance the number of outgoing links with the number of internal links. Some people are concerned about this aspect, i.e. so-called "bleeding page rank". A page doesn't lose page rank because you link out, but linking does affect the level of page rank available to pass to other pages. This is also known as link equity.
It is good to be aware of this, but not let it dictate your course of action too much. Remember, outbound linking is a potential advertisement for your site, in the form of referral data in someone else's logs. A good rule of thumb is to balance the number of internal links with the number of external links. Personally, I ignore this aspect of SEO site construction and instead focus on providing visitor value.
Link Equity & No Follow
Another way to control the link equity that flows around your site is to use the nofollow attribute. For example, check out the navigational links at the bottom of the page:
As these target pages aren't important in terms of ranking, you could nofollow the links pointing to them to ensure your main links have more link equity to pass to other pages.
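In the markup, a nofollowed link simply carries the rel="nofollow" attribute. A hypothetical footer might look like this:

```html
<!-- Hypothetical footer links: crawlers can still see them,
     but they are flagged to pass no link equity -->
<a href="/terms" rel="nofollow">Terms of Service</a>
<a href="/privacy" rel="nofollow">Privacy Policy</a>
```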
Re-Focus On The Most Important Content
This might sound like sacrilege, but it can often pay not to let search engines display all the pages in your site.
Let's say you have twenty pages, all titled "Acme". Links containing the keyword term "Acme" point to various pages. What does the algorithm do when faced with these pages? It doesn't display all of them for the keyword term "Acme". It chooses the one page it considers most worthy, and displays that.
Rather than leave it all to the algorithm, it often pays to pick the single most relevant page you want to rank, and 301 all the other similarly-themed pages to it. Here are some instructions on how to 301 pages.
By doing this, you focus link equity on the most important page, rather than splitting it across multiple pages.
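On an Apache server, a 301 can be as simple as a couple of lines in your .htaccess file. A hypothetical example (the file names are invented, and this assumes mod_alias is enabled):

```apache
# Permanently redirect two duplicate "Acme" pages
# to the single page we want to rank
Redirect 301 /acme-old.html http://www.example.com/acme.html
Redirect 301 /acme-info.html http://www.example.com/acme.html
```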
This idea may sound a bit complex until you visualize it as a keyword chart with an x and y axis.
Imagine that a, b, c, ... z are all good keywords.
Imagine that 1, 2, 3, ... 10 are all good keywords.
If you have a page on each subject, consider placing the navigation for a through z in the sidebar, while using links and brief descriptions for 1 through 10 as the content of the page. If people search for d7, or b9, that cross-referencing page will be relevant, and if it is done well it does not look too spammy. Since these types of pages can spread link equity across so many pages of different categories, make sure they are linked to well, high up in the site's structure. These pages work especially well for categorized content cross-referenced by locations.
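The grid idea above can be sketched in a few lines of Python. The axes here are hypothetical stand-ins (say, categories by locations), and the URL pattern is invented for illustration:

```python
# Sketch of keyword cross-referencing: one axis of keywords in the
# sidebar, the other axis as the page body. Every combination such
# as "d2" is reachable from a single hub structure.
# (Hypothetical keyword axes and URL pattern.)

sidebar_terms = ["a", "b", "c", "d"]   # e.g. categories
body_terms = ["1", "2", "3"]           # e.g. locations

# Each combination of category and location gets its own page,
# so the site covers the whole keyword grid.
combination_pages = [
    f"/{cat}/{loc}/" for cat in sidebar_terms for loc in body_terms
]
# 4 categories x 3 locations = 12 cross-referenced pages
```

The page count grows multiplicatively with each axis, which is exactly why these hub pages need strong links from high up in the site's structure.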
"Yahoo Answers, which was launched in late 2005, is a staggeringly huge site. Recent Comscore stats say the service attracts nearly 150 million monthly visitors worldwide and generates 1.3 billion monthly page views. That's 67% unique visitor growth in the last year. Yahoo as a whole, though, has nearly 100 billion monthly page views, so it isn't a material percentage of total Yahoo traffic"
Nice traffic; however, Yahoo Answers is full of junk content. There are now numerous competitors in the Q&A space.
If you're the first mover, as Yahoo was, you can get away with low quality content, but as competition increases, the quality must also increase in order to keep people hooked. Whilst hugely successful in terms of traffic numbers, Yahoo Answers must now respond to increasing competition. With rumours of a sale, it looks like Yahoo may instead be refocusing their efforts on their core business.
This is an example of the "curve to quality" pattern. First movers can get away with junk content for a while, but eventually competitors will up the quality and gain audience share as a result. This reinforces the need to adapt business models in light of competition, and the need to avoid commodity status.
We can see the same curve to quality pattern in the blog world.
Jakob Nielsen was advising a world leader in his field on what to do about his website. The guy wanted to know if he should start a blog.
"Blog postings will always be commodity content: there's a limit to the value you can provide with a short comment on somebody else's work. Such postings are good for generating controversy and short-term traffic, and they're definitely easy to write. But they don't build sustainable value. Think of how disappointing it feels when you're searching for something and get directed to short postings in the middle of a debate that occurred years before, and is thus irrelevant."
Also check out the graph "variability of posting quality" in Nielsen's post.
I suspect Nielsen is on the right track. Blog traffic is reportedly at an all-time high, but blogs still account for only 0.73% of US traffic. Perhaps as the quality of the average blog increases, so too will the audience share.
Due to the pressure of competition, low quality content eventually becomes a commodity.
Do you read me-too search blogs? Not many people do. Most people gravitate towards the blogs that offer the highest perceived level of quality, as opposed to those that repeat the same news found elsewhere. Me-too content is no longer an effective strategy in the blog world, or the newspaper world, as syndicated news services are finding out. There is simply too much competition.
There are other reasons why you might want to focus on quality as a strategy.
Google will always try to filter out low quality, commodity content in order to heighten user experience. Google approaches this problem in a number of ways.
In the remote quality rater document, Google lists a range of categories raters can attribute to web content. One category is "Not Relevant". This category applies to "news items that appear outdated" and "lower quality pages about the topic". Obviously, "lower quality" is a relative term and the comparison would be made between competing SERP results. Pages categorised as "Not Relevant" will receive lower SERP placement.
Also consider the notion of poison words. Poison words are words the search engines equate with content of low quality. If, just for example, forum content is found to frequently be of low quality, then it is reasonable to assume Google will look for markers that the site is a forum and mark this content down as a result. Markers might include a link back to a popular forum software script, for example.
This metric would not be taken in isolation as there are various other quality markers Google use. However, if the content is low quality and appears in a low quality format, you stand less chance of ranking for competitive queries.
The same might apply to commercial content, especially such content that appears in non-commercial query results.
Google's business model involves advertisers paying for clicks in the form of Adwords. The main SERPs are essentially a loss leader that facilitate people clicking on text advertisements. The main SERPs are the reason people use Google.
Such a business model would be supported by an algorithm that rewarded quality, informative content in the main SERPs. It could operate by downgrading any content deemed as purely commercial, and this would involve looking for commercially-oriented poison words. Poison words in this context might include "Buy Now", "Business Address", and other variants unique to commercial content. This would "encourage" those with commercial messages to list with Adwords because they would have trouble appearing in the main SERPs. It is unlikely such an algorithm would apply to commercial queries, however.
Google filters in this way because there is much competition for keyword queries. Google looks for the best answer: the answer of highest quality, in terms of both relevance and searcher satisfaction. As competition increases, the answers will get better, which is why you must aim to stay high on the quality curve.
Since we have a number of popular Firefox extensions, I frequently get asked how to update them. Rather than writing three emails a week, I figured it was quicker to jot down a quick blog post. To update or uninstall an extension, you first have to click into the add-ons panel.
When you get inside the extensions area (by following the path highlighted above) you will see an Add-ons window with a Find Updates button at the bottom of it. That is an easy way to update many extensions at once.
The other way to update or uninstall is to scroll to an extension and click on it.
If you left click, Disable and Uninstall buttons will appear.
If you right click on an extension you will see a menu pop up with the option to Uninstall the extension. This menu also gives an option for you to Find Update.
Any time you do an update or uninstall you have to restart Firefox for it to take effect. If you uninstall an extension that you want to reinstall, go to the source where you downloaded it from to be able to reinstall it again. Instructions for installing an extension are well laid out on the SEO for Firefox page.
I am not sure if safe harbor covers companies that index content, cache/host content, and suggest searches for downloading pirated works...but if it does, I think the law needs to be changed. It seems Google could have thought about the torrent-related keyword suggestions before launching search suggest as a default.
Part of the reason why I had to change my business model was the need for a more interactive higher value service, but another big part of it was also that I saw this sort of activity coming. It is too hard to create valuable information and sell it in a digital format unless it is broken up into pieces, is time sensitive, and/or has interactive elements added to it.
If you think Google respects copyright you are wrong. All content wants to be free, and, preferably hosted by Google, wrapped in AdSense.
"A good plan today is better than a perfect plan tomorrow" - Proverb
Contrary to what many business books will tell you, unless you're looking to raise capital, you don't need an extensive business plan before you start. However, having no plan at all is often a recipe for disaster. When writing your plan, aim for a concise, one page explanation that clearly states where you're going and how you'll get there.
When I write my plans, the plan also includes the message - more on the message soon - and then, at the very bottom of the page, I leave myself a reminder: "Change Everything!". I write "Change Everything" because I know my plan will change and adapt as I go along. The best business plans are fluid, because the tides of the market will forever change beneath you. Rigid planning can easily put you off-course when the winds inevitably change.
Developing The Message
The message is a simple outline of who you are and what you do. It is also referred to as the elevator pitch. It is used to communicate, quickly and concisely, what you're about, and to help you make a myriad of decisions on design, to SEO, to marketing.
It can be difficult to reduce your message to a clear simple paragraph, so here are a few tips on how to do it. One useful technique is to think of it in terms of questions and answers.
Ask, and answer, the following questions:
What value do I add for my customers?
What problem do I solve?
What outcome resolves this problem?
What do I do differently from my competitors?
What adjectives and nouns best illustrate the above points?
Then blend the answers into a tight, focused two-paragraph explanation of what you do and the benefit your product or service provides someone else.
"We are Acme.com. We provide online human resources programs for small companies that lack a dedicated human resources division. Our products and services help companies meet their human resources objectives at low cost, and the service is available to our customers 24 hours a day, seven days a week via our easy-to-use web site. Some of our clients have reduced staff-turnover by up to 50% after using our services."
Needs work, but that's a start.
Next, test your message out on friends and colleagues. Are they crystal clear about what you do and the benefits you provide? Your message flows through everything you do, from domain name selection, to site design, to marketing.
Domain names are easy to register. The hard part is finding the right name.
As I'm sure you're aware, the domain name market is fiercely competitive, so finding the ideal name can be difficult, not to mention expensive if you need to go to the resale market.
When selecting a name, which will likely also be the name of your product or service, consider the search value of names. Google places a lot of emphasis on keywords within the domain name, and the link text pointing to a site. This may change in the future, but it has held true for the past few years.
Try combining your main search keyword term with another descriptive term: "SeoBook", "CarWarehouse", "RealEstateGold", etc. The plus side is that you'll get keywords in the links pointing to your site. Directories, link partners, and most forms of text advertising tend to place your domain/company name in the link text by default. If your domain/company name doesn't include keywords, you may find it more difficult to get keyword terms in the links.
The downside of this approach is that the brand tends towards the generic, and can therefore be less memorable. Another approach is to ignore the search aspect, and make up a completely unique name. This is the traditional approach to branding. One advantage of such an approach is that you'll "own" any keyword searches for this term.
Your web design needs to be consistent with your message.
While anyone can knock up a web design, I'd advise you not to take this approach unless you're an accomplished designer. Hire a professional instead. First impressions count, and when an exit is only a click away, you must make a good one, else all your other marketing efforts could be wasted.
I use the message as a key part of the design brief. Web designers appreciate this detail, and will be able to design a look and feel that incorporates your message into the design. For example, if your brand is a luxury brand, then the website should look glossy in order to stay consistent with your message. The same glossy design will not work for a more accessible, utilitarian brand like, say, Google. The message would be mixed, which could lead to visitor confusion. The story you're telling wouldn't ring true.
Your message helps govern design questions.
I'll post more in depth about site construction and architecture, but for the meantime, keep it simple, functional, fast-loading, and ensure your design supports and reinforces your message. As I mentioned in my post on Brand Building Tips On A Budget, everything you do on your site must tell a consistent story. Everything you do is your brand - your message. Great design is of little use if the copywriting is sub-standard, and vice-versa. Get all those little, but important, details right. Broken links, 404s, slow load times, confusing navigation, unexpected surprises - they're all part of your brand experience.
As you're reading this site, you already know the value of internet marketing, specifically search marketing. So, I won't go over that aspect. I'm sure you've read the book ;)
But what other cheap promotional options are open to you?
Here are a few ideas that work well, and corresponding links telling you the hows and the whys:
The most important aspect of site marketing is to measure performance. You want to run with the winners and cut the losers.
Are you getting sales from the search terms you rank for? If not, why not? Is your message inconsistent with the search terms you are targeting? Refine your message, or target different keyword terms. This is why it is important to test drive your SEO keyword terms using Adwords before you engage in SEO. You can test to see if your keyword terms and your message sync-up to create the desired result.
You need good analytics to track the value of each channel you use. The important point is to be able to identify where the traffic is coming from and, most importantly, what this traffic does when it gets to your site. There is no point ranking for the high traffic terms if none of that traffic converts to desired action.
You've probably heard the phrase "content is king"?
Conversion is king.
Content might help you get a visitor to convert to desired action, or it might lead them astray. Once again, ask yourself if your content is on-message. Is your content consistent with your business goals? Is your content helping you achieve your business goals?
"Buffett once told me there are three 'I's in every cycle. The 'innovator,' that's the first 'I.' After the innovator comes the 'imitator.' And after the imitator in the cycle comes the idiot."
-Theodore Forstmann, quoting Warren Buffett
Great quote, huh.
It applies everywhere, including online. Who wants to start a blog network in 2008? How about becoming a ring-tone affiliate? Or starting a web 2.0 news blog?
The problem with those ideas is that they are well past the first and second "I" stage, and probably sit deep in the "idiot" zone. These markets are heavily saturated, so it would take serious investment of time and resources in order for a newcomer to compete with the established operators. It is questionable whether such an investment would be worthwhile, unless someone can put a new spin on the existing model in order to put it back in the innovator zone.
In my working life, I've spent plenty of time in all three zones.
Real World Examples
When SEOBook.com started, it was a little late to the table.
The "Book-On-SEO" market was not new. Not innovative. However, the market wasn't heavily saturated, as books on SEO were beginning to fall out of favor, mainly because by the time they were published, they were already out of date. This probably placed "books on seo" in the imitator zone. However, SEOBook was combined with a blog and regular updates - a new page a day - which was Aaron's way of re-spinning the idea back into the innovator zone.
Could someone release an SEO book today? Sure they could, but without a new angle, they're facing a lot of entrenched competition. A me-too product at this point won't get much traction, because it isn't remarkable, and the market is mature. In any case, training on SEO has morphed into a service.
The often-copied Weblogs Inc, which was one of the first blog networks, sold to AOL for $25M.
It came out at a time when only uber-geeks knew about blogs. There was no money in it. There were no directly-applicable proven revenue models. But this is exactly what new emerging markets look like. It is only easy to see them in hindsight. Fast forward to 2008, and the dead pool features numerous well-funded blog networks that simply arrived too late. The ship had sailed. In 2008, the blog network is in the idiot zone.
An example of a fast rising market is the environmental market.
In August 2007, TreeHugger, which was a blog about environmental news, sold to Discovery for $10 million. There are now a raft of imitators, but it is questionable if many will make any real money. The real money in the environmental space will likely come through innovation and change. Got any innovative ideas for that space?
There is a ton of - excuse the pun - blue sky in that market.
How To Stay Out Of The Idiot Zone
I'm going to start by qualifying this notion a little.
People can, and do, make money in the idiot zone. They come late to the table, yet still manage to prosper. But anyone who has done this will tell you that the work level, time and money investment, and smarts required are significant.
Contrast this with getting in at the innovator level or imitator level in new, rising markets. It is relatively easy, and cheap, to make a big splash in new markets due to the lack of entrenched competition.
Is It Better To Be An Innovator Or Imitator?
Microsoft was a fast-follower. As was Google.
Whilst the innovator gets the fame, they can often fail to sustain the pace. The fast-follower is often the guy that makes the most money. It can be a bit simplistic to frame success in this way, but this frame of reference can help to clarify potentially confusing business problems. I think we all agree that being in the idiot zone is a problem, and best avoided.
If you suspect your business might be in this zone, think about how you can re-spin it to put it back in the innovator or imitator zone. Can you get a better business model? Google built a better business model by extending and refining the auction advertising model. Is there a way to out-manage your competitors? Are they heading off in the wrong direction? Are they neglecting the very audience that made them successful?
So How Do You Identify Rising Markets?
If you're starting out, how do you ensure you don't dive straight into the idiot zone?
1. Establish Which Stage The Market Is At
You need to try to establish which stage the market is at: innovator, imitator, or idiot. Measurement is more art than science, but with some market research you should be able to get a good feel for it.
2. Learn To Recognize A Consolidated Market - And Avoid It
A consolidated market occurs when the business cycle peaks in a crowded field. A few mega players swallow up the minions.
An example of this is the PC market, which started off with a huge number of brands, and has now been largely consolidated by Dell & Gateway. The rest of the market is commodity no-name brands. Would you try and launch a PC brand in this market? You'd need to have something truly remarkable, and it would take a lot of effort.
3. Don't Listen To Bloggers
Ever heard popular bloggers sharing a little "secret" with tens of thousands of anonymous readers? "I made my money easily - just get into X".
By the time anyone is sharing that sort of information, the market has peaked. The horse has bolted, run across the field, got on a plane, and sent back the picture postcard.
Why would someone create more competition for themselves? They wouldn't.
In most cases, they recognize there is a lot of competition in their market niche, and the only way to maintain their revenue is to get scale - you guessed it - by signing up an army of sub-affiliates.
Whenever I read a story about Google losing its competitive edge or spreading itself too thin, I think that the author just does not get the network effects baked into web distribution when a company is the leader in search and advertising, and how solidly Google competes where it allegedly failed.
Sideline projects, like their book scanning project, turn into a treasure for librarians and researchers who guide others to trust Google. Syndicated products and services like their book API nearly create themselves as an off-shoot of creating indexable searchable content.
To sum up Google's lasting competitive advantage (including brand, marketshare, price control, distribution, undermining copyright, strategic partnerships, etc.) I turn to telecom lobbyist Scott Cleland's Googleopoly:
Google arguably enjoys more multi-dimensional dominating efficiencies and network effects than any company ever - obviously greater than Standard Oil, IBM, AT&T, or Microsoft were ever able to achieve in their day.
The five main anti-competitive strategies in Google's predatory playbook to foreclose competition are:
Cartelize most search competitors into financially-dependent 'partnerships;'
Pay website traffic leaders predatory supra competitive fees to lock up traffic share;
Buy/co-opt any potential first-mover product/service that could obsolete category's boundaries;
Commoditize search complements to neutralize potential competition; and
Leverage information asymmetry to create entry barriers for competitive platforms.
If you have a spare hour to read, you may want to check out Mr. Cleland's Googleopoly 2 [PDF]. I don't agree with everything in it, but it sums up Google's competitive advantages and business strategies nicely. Anyone can learn a lot about marketing just by watching and analyzing what Google does.
I'm a complete idiot, and if I can do it, anyone can!
If you've ever researched making money online, no doubt you've heard the above pitch. We all know the pitch is nonsense, of course. If these guys really were hitting the numbers they claim, you've got to wonder why they are selling their "secrets" for $97.
Perhaps it is true.
Perhaps they really are idiots :)
However, the reality is that making money online is the same as making money offline. You need to find a market opportunity and fill it.
And that takes work.
I'd like to share a few ideas on researching potential markets, and how you can use search engines to help you.
Definition Of Market Research
Market research is the study of groups of people in order to determine if there is a market for your product or service.
One of the biggest mistakes entrepreneurs often make is to attempt to solve a non-problem. The TechCrunch dead-pool is littered with examples of solutions to non-problems. An idea might sound good. Your family and friends might agree it is good. But is it really? How can you really know?
By spending a little time finding out if your idea solves a real problem, as opposed to an imaginary one, you can save yourself a lot of time, effort, pain and money later on.
But how do you undertake market research on a limited budget?
Use Search As A Market Research Tool
Search marketers have an ace up their sleeves that most people just don't see. One of the most valuable market research tools available costs very little: Google Adwords.
Google Adwords provides you with a wealth of data. You can measure actual visitor interest - real search numbers, not just estimates - and you can quickly and easily test your ideas in the live marketplace. You can test your product or service offer, even before you're ready to provide it!
Once you've gathered this valuable data, and found that your idea works, you can then design your time-consuming SEO strategy.
Sounds easy, right?
Well, it is. But there is a little work involved.
What we need to do is take a few important measurements.
1. Set Up An Adwords Campaign
Run a small Adwords campaign targeting your chosen keyword terms.
2. Construct A Small Site Consisting Of Landing Pages
You can test the effectiveness of each landing page using a/b testing, but this would probably over-complicate matters at this stage. What you want to know are three key pieces of information: actual search volume, response to offer, and competition levels.
Search volume is the number of people who search on a given term - the actual volume, as opposed to estimates.
Response to offer is the number of people who take a desired action, not those who click back.
Competition level is how many advertisers are bidding on the term, and how aggressively they bid.
If the search volume is sufficient to achieve your goals, then you're part way there. If not, you might need to rework your idea, but at least you haven't undertaken a time-consuming SEO campaign only to find out there is no real traffic.
Once a visitor lands on your page, you want to measure their level of interest in your offer. How many buyers are you likely to get vs tire kickers? Prompt the visitor to take an action that would indicate that they would buy your service or product. For example, you could send them to an affiliate program offering a similar service or product and measure your success rates, or collect the visitor's email address as an expression of interest. Those who click back are telling you your offer isn't quite right.
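The measurements described above boil down to a few simple ratios. A minimal sketch, assuming hypothetical campaign numbers (the figures below are invented for illustration, not real data):

```python
# A minimal sketch of the measurements above. The impression, click,
# and action counts are hypothetical example numbers.

def campaign_metrics(impressions, clicks, actions):
    """Return actual search volume, click-through rate, and response rate."""
    ctr = clicks / impressions if impressions else 0.0
    response_rate = actions / clicks if clicks else 0.0
    return {"search_volume": impressions, "ctr": ctr, "response_rate": response_rate}

# 12,000 impressions, 360 clicks, 18 desired actions
metrics = campaign_metrics(impressions=12000, clicks=360, actions=18)
print(metrics)  # CTR of 3%, response rate of 5%
```

If the response rate is too low to make the economics work, that is exactly the signal you want before committing to a long SEO campaign.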
3. Evaluate Competition Levels
You can gain an understanding of the competition levels by looking at the bid price. Obviously, the higher the bid prices, the higher the level of competition. If you're failing to get on the front page with reasonable relevancy and bids, you're in a fairly competitive area, and the organic SERPs will likely be just as competitive.
Let's say you've got all three ducks lined up. Great. You now have some fantastic market research data that you can build into your site and into your SEO campaign. Most offline market researchers would kill to be able to get this lucrative data so easily and cheaply.
Google Traffic Estimator helps you see how often your ads would appear for keywords, and gives you approximate prices. It works for various match types, including broad match, phrase match and exact match. Here's some information on why understanding match types is important.
Microsoft Ad Intelligence provides various metrics tools, including a cool Excel plug-in, and gives you actual search data, not rough estimates. To see a demonstration of how to use it, check out Aaron's video: Top Paying AdSense Keyword Lists Video.
Google Insights will show you where search activity is taking place at different periods of time. This is especially useful for honing local and regional offers. It is also useful for time-based research, such as Christmas, Thanksgiving, and other holiday periods.
Google have just updated their guidelines regarding rewriting URLs.
Previously, the guideline stated:
"Don't use "&id=" as a parameter in your URLs, as we don't include these pages in our index"
Google have now removed this guideline, saying they can now index URLs that contain that parameter. Google have also posted a blog entry explaining the difference between dynamic URLs and static URLs, and encourage you to let Google handle the problem.
Should You Avoid Rewriting Dynamic URLs?
In most cases, yes. The translation can be messy, and if not handled correctly can lead to indexing problems.
However, for SEO purposes you might want to consider the following points.
Sometimes Static URLs Do Make For Better SEO
The URLs look nicer and will likely get clicked on more often
The URLs will provide better anchor text if people use the URLs as the link anchor text
If you later change CMS programs, having clean core URLs associated with content makes it easier to mesh that content with the new CMS
The benefit Google espouses for dynamic URLs (Googlebot being able to stab more random search attempts into a search box) is only beneficial if your site structure is poor and/or you have way more PageRank than content (like a Wikipedia or TechCrunch)
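To make the trade-off concrete, here is a toy sketch of mapping a dynamic URL to a clean, static-looking one. The parameter names (`category`, `id`), the product slug, and the example domain are all hypothetical; a real rewrite would live in your web server or CMS, and this only illustrates the idea:

```python
# A toy sketch: translate a dynamic URL into a clean, static-looking path.
# The query parameter names and the slug table are invented for illustration.
from urllib.parse import urlparse, parse_qs

def to_static(url, slug_lookup):
    """Rewrite e.g. /product.php?category=shoes&id=42 -> /shoes/red-sneakers/"""
    query = parse_qs(urlparse(url).query)
    category = query["category"][0]
    slug = slug_lookup[query["id"][0]]  # map the opaque id to a readable slug
    return f"/{category}/{slug}/"

slugs = {"42": "red-sneakers"}
print(to_static("http://example.com/product.php?category=shoes&id=42", slugs))
# -> /shoes/red-sneakers/
```

The clean version reads better in the SERPs and makes for stronger anchor text when people link with the bare URL, which is exactly the SEO upside listed above.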
Have you ever tried to get people to link to your pure commerce/commercial brochure-web site? You know how tough it is out there. The link economy has become so established, we've even got strategies built around the idea of never linking out. Once people perceive something to be valuable, they'll think twice about just handing it over for nothing.
So what is an SEO supposed to do?
The key to linking in an environment where there is high value placed on links is to think of linking less as a process, and more in terms of building relationships.
Here are a few linking ideas designed to reduce the pain and increase the effectiveness of your link building campaign.
Relationship Link Building 101
The first step in your link building strategy occurs before your site hits the web.
If you're thinking of launching a static brochure-ware site, and link building is part of your marketing strategy, think again.
There is less chance for relationship building.
Preferably, you want a site with plenty of potential for on-going community involvement and interaction.
News Sites. Social sites. Blogs. Frequently-updated information sites. Teaching sites. Advice sites. Q&A. Wikipedia-style sites. The static brochure website will still have a place, but those sites with higher levels of user engagement will trump it.
Produce Really, Really Interesting Content
Posting what everyone else is posting is not interesting.
Look at what everyone else is posting and take a new angle on the topic. Don't just go one better, go ten better. Learn the lessons of The Purple Cow. Be worth remarking upon. People are hungry for unique, quality content.
They'll link to you if you have it.
If your competitors are spending ten minutes on their posts, you spend a day. Spend a whole week. Cover areas no one else is covering. Make your posts game-changing posts. You're going to need not one, but a consistent body of such posts. Think about the sites you link to. You need to aim to be better than those sites.
At very least, you need to offer a point of difference in order to be linkworthy.
If you're new, you're going to need friends. You're going to need influential friends.
A link out to sites run by influential people becomes an advertisement for your site in their referral logs. People will follow the links back to see who is talking about them, and if you've got an impressive set of articles/posts, you'll be on their radar in no time.
Most modern marketing is based on the idea of reciprocation. If you do something for others, without requesting something in return, most people feel they should reciprocate.
Give something valuable. Give wide. Give freely. Some of it will eventually come back.
Give nothing, and you're guaranteed that nothing will come back.
Lose The Ads
The less commercial you appear, the more likely you'll get linked to, especially from .edu and other authority information hubs. Few people want to link to sites plastered with advertising unless that site already has established authority.
You can introduce advertising once you've built up link authority.
Flattery Gets You Everywhere
Make people feel important. Make them look good. If you make them look good, they'll want to point that fact out to others. They'll do your marketing for you.
Look For Companies With "In The News" Pages
This tip flows on from flattery. Write about companies in a good light. To find companies that have "in the news" style pages, do a Google search for [your industry + "in the news"].
Write stories about fast-breaking events that have little competition but high interest levels. If the meme gets big enough, news sites will look around for content to quote, and, given a lack of competition, hopefully they'll quote yours.
Get Seen In The Community
Participate in answer sites, forums, article sites, Wikipedia, Squidoo, Amazon et al. Contribute something of real value. You'll get direct links in some cases, but at very least you'll raise awareness, which can translate into links down the line.
The Designer Angle
Get your site re-designed by a high profile designer who has a history of showcasing his/her work.
The cost of the design might be more than covered by the value of the inbound links and attention you receive, especially if the design is mentioned in trade bibles, like Smashing Magazine.
Less about relationships, but good tools to have in the box.
Trade links, ask for links, beg for links. Hey, it still works, although it's probably the least effective method, and most time consuming. Outsource this task, if you can.
List With Local Business Services
List with your Chamber of Commerce, Business Bureaus, government advisories, libraries, and other appropriate institutions.
Link baiting is when you write content with the specific aim of attracting links. It works, but you've got to be careful with your pitch. Get the tone wrong for your audience, and you'll put people off.
Top Ten Lists
How To Do Something Exceptional With (Seemingly) No Effort
Be The First To Do Something
Almost all press releases end up in the web equivalent of the wastepaper bin, but if you can provide a fresh, newsy angle, there is significant potential for links.
Try combining link bait strategies with press release strategies. A local angle works well for local news services, who are often starved of local news.
Keep the following criteria in mind when evaluating which web directories are worth your time.
They appear in the SERPs
Offer direct links - i.e. they aren't routed through a script, or no-followed.
High crawl frequency - check out the latest crawl date in Google cache. If the directory pages haven't been cached in months, chances are Google may regard them as low quality.
Look for quality standards - Matt Cutts outlined Google's view of a good directory. Directories that stay closest to these guidelines are more likely to be around for the long haul.
If you've read this far, and thought "I know this stuff!" - great :)
How about sharing your single best link acquisition strategy with the community :)
The Future Of Linking
Links have been so important for so long now, but are things about to change?
In the dark, distant past - 1997 - the web was about publishing.
However, the web ecosystem is evolving into more of an interactive space, based on platforms.
As a result, we're seeing a different kind of website emerge - it is more "place" than "brochure". Think Facebook, YouTube, Wikipedia, Blogs, et al. We're seeing more applications. We're seeing more cloud computing. The web is becoming a place where we truly interact, as opposed to simply publish.
Google's ranking models have, in the past, been based on publishing models - specifically, an academic citation model in the form of PageRank. This approach will become less effective at determining relevance as people move away from the publishing model and towards interaction and engagement.
Google realize this, of course. This is why I think Google will be adapting their model to monitor and gauge interaction. Interaction will become a valuable new metric of a site's worth, which will flow into ranking.
In a recent post on The Official Googleblog, Google talked of how interaction will change how systems "think and react":
"As we're already seeing, people will interact with the cloud using a plethora of devices: PCs, mobile phones and PDAs, and games. But we'll also see a rush of new devices customized to particular applications, and more environmental sensors and actuators, all sending and receiving data via the cloud. The increasing number and diversity of interactions will not only direct more information to the cloud, they will also provide valuable information on how people and systems think and react..... As systems are allowed to learn from interactions at an individual level, they can provide results customized to an individuals situational needs: where they are located, what time of day it is, what they are doing. And translation and multi-modal systems will also be feasible, so people speaking one language can seamlessly interact with people and information in other languages."
Notice the frequency with which Google use the term "interact".
I think this hints at the future direction of search and ranking. Google will increasingly shift from measuring external popularity metrics, such as linking, to measuring the level of interaction, if they are not already doing so.
There have been some recent developments that search marketers should be aware of:
Microsoft have published research on BrowseRank, a system which determines relevancy by tracking usage data
This all points to the increasing role of engagement metrics.
In order to be positioned well in the future, you'll need to think as much about the level and type of interaction on your site as you will about link authority. This comes all the way back to my first point above - build a site with plenty of potential for relationship building.
The following is a guest blog post by Jeremy L. Knauff from Wildfire Marketing Group, highlighting many of the recent changes to the field of SEO.
Marketing is constantly evolving and no form of marketing has evolved more over the last ten years than search engine optimization. That fact isn’t going to change anytime soon. In fact, the entire search engine optimization industry is headed for a major paradigm shift over the next twelve months. Like many of the major algorithm updates in the past, some people will be prepared while some will sit teary-eyed amongst their devastation wondering what happened and scrambling to pick up the pieces. Unlike the major algorithm updates of the past, you won’t be able to simply fix the flaws in your search engine optimization and jump back to the top of the SERPs.
Why is this change going to be so different? In the past, the search engines have incrementally updated certain aspects of their algorithms to improve the quality of their SERPs, for example, eliminating the positive effect of Meta tag keyword stuffing which was being abused by spammers. Anyone who has been in the SEO industry for more than a few years probably remembers the chaos and panic when the major search engines stopped ranking websites based on this approach. This time around though, we’re looking at something much more significant than simply updating an algorithm to favor particular factors or discount others. We are looking at not only a completely new way for search engines to assign value to web pages, but more importantly, a new way for search engines to function.
A number one ranking for a particular keyword phrase was once the end-all, be-all goal but now many searches are regionalized to show the most relevant web pages that are located in the area that you are searching from. While this will probably reduce your traffic, the traffic that you now receive will be more targeted in many cases. Additionally, it gives smaller websites a more equal chance to compete.
This August, Google Suggest was moved from Google Labs to the homepage, offering real-time suggestions based on the letters you’ve typed into the search box so far. This can be an incredibly helpful feature for users. At the same time, it can be potentially devastating to websites that rely on long-tail traffic because once a user sees a keyword phrase that seems like at least a mediocre choice they will usually click on it rather than continuing to type a more specific keyword phrase.
Devaluation of paid links
Google’s recent attempt to eliminate paid links has scared a lot of people on both sides of the link buying equation into implementing the “nofollow” tag. In the midst of this hypocritical nonsense, Google has also been taking great measures to devalue links based on quantifiable criteria, such as the “C” class of the originating IP, similarities in anchor text and/or surrounding text, location of the link on the page and the authority of the domain the link is from, to name a few. Regardless of the effectiveness of any search engine's ability to evaluate and subsequently devalue paid links, the fear of getting caught and possibly penalized is more than enough to deter a lot of people from buying or selling links.
Visitor usage data
Again, Google is leading the charge on this one. Between their analytics, toolbar and web browser, they are collecting an enormous amount of data on visitor usage. When a visitor arrives at a website, Google knows how long they stayed there, how many pages they accessed, which links they followed and much more. With this data, a search engine can determine the quality of a website, which is beginning to carry more weight in regards to ranking than some of the more manipulable factors such as keyword density or inbound links. This puts the focus on content quality instead of content quantity and over time, will begin to knock many of the “me too” websites further down the SERPs, or out of the picture altogether. The websites that will prosper will be those that produce relevant, original content that their visitors find useful.
Simply pointing a vast number of links with a particular keyword phrase in the anchor text to a website was once a quick and easy way to assure top ranking. The effectiveness of this approach is diminishing and will continue in that direction as a result of TrustRank. In a nutshell, a particular set of websites are chosen (by Google) based on their editorial quality and prominence on the Internet. Then Google analyzes the outbound links from these sites, the outbound links from the sites linked to by these sites, and so on down the chain. The sites that are further up the chain carry more trust and those further down the chain, less trust. Links from sites with more TrustRank, those further up the chain, have a greater impact on ranking than links from websites further down the chain. On one hand, this makes it difficult for new websites to improve their position in the SERPs compared to established websites; on the other hand, it helps to eliminate many of the redundant websites out there that are just repeating what everyone else is saying.
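The "trust decays down the chain" idea can be illustrated with a highly simplified sketch. This is not Google's actual algorithm - the graph, the seed set, the damping factor, and the number of rounds are all invented assumptions purely to show the mechanism:

```python
# A toy illustration of trust flowing down a link chain and decaying
# at each hop. Graph, seed set, and damping factor are made up.

def propagate_trust(graph, seeds, damping=0.85, rounds=3):
    """Seed pages hold full trust; each hop passes on a damped share."""
    trust = {page: (1.0 if page in seeds else 0.0) for page in graph}
    for _ in range(rounds):
        incoming = {page: 0.0 for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = trust[page] * damping / len(outlinks)  # split among outlinks
            for target in outlinks:
                incoming[target] += share
        for page in graph:
            trust[page] = (1.0 if page in seeds else 0.0) + incoming[page]
    return trust

# seed.edu links to a.com, which links to b.com
graph = {"seed.edu": ["a.com"], "a.com": ["b.com"], "b.com": []}
trust = propagate_trust(graph, seeds={"seed.edu"})
# Each step down the chain holds less trust than the one above it.
```

The takeaway matches the prose above: a link from a page one hop from a trusted seed is worth more than a link from a page several hops away.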
Utilizing a combination of visitor usage data and a not so gentle nudge in Google’s direction, Google Chrome is set to change the way search engines gather data and present it to users. For example, when a user begins typing in the address bar of the browser, they are presented with a dropdown list of suggestions containing choices consisting of the #1 result in Google’s SERPs, related search terms and other pages you’ve recently visited. This gives a serious advantage to the websites that hold top ranking in Google and at the same time, gives a serious advantage to Google by giving their Internet real estate even more exposure than ever before.
So the question remains, is search engine optimization facing evolution or extinction? Certainly not extinction, not by a long shot, but in a short period of time it is going to be drastically different than it is today. The focus will soon be on producing a valuable and enjoyable user experience rather than just achieving top ranking, which is what it should have been all along.
Compete.com quietly launched a referral analytics product as part of their advanced package ($499/month). Even as a free user you can see the top 3 results for any site, which can be used to see how reliant a site is on search. Why is % of search traffic an important statistic?
If search traffic (as a % of total traffic) is low (relative to other competing sites) then it could indicate that there are organic optimization opportunities that are currently being missed and/or that site has a large organic traffic stream that can be marketed to in order to help it improve any search related weakness.
If search traffic (as a % of total traffic) is high (relative to other competing sites) then it could indicate that the site is near its full search potential, that the site is not very engaging, and/or does not have many loyal users.
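The reliance measure above is a quick calculation. A minimal sketch, where the visit counts and the 70% threshold are made-up illustrations rather than any standard benchmark:

```python
# A small sketch of measuring search reliance. The visit numbers and
# the 70% threshold are invented for illustration.

def search_reliance(search_visits, total_visits):
    """Return search traffic as a share of total, with a rough label."""
    share = search_visits / total_visits
    if share > 0.7:
        return share, "heavily reliant on search"
    return share, "diversified traffic"

# A site getting 9,000 of its 10,000 visits from search engines
share, label = search_reliance(search_visits=9000, total_visits=10000)
# A site at ~90% search traffic is easy for a single engine to cripple.
```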
Here are search stats for SEO Book. Note that Google controls a minority of the traffic to this site, which means they have limited direct influence on the revenue of this site. Some sites are closer to 90% Google, which makes it easy for Google to effectively remove them from the web!
This sort of data is important for considering the viability of a business model, the stability of a site, and what multiple a site should sell for. It can also be used when considering the CPM of an ad unit - search traffic is much more targeted and goal oriented than a person browsing a forum is.
Until everyone and their dog started looking at PageRank (and how to manipulate it) it was a rather sound way of finding the most valuable backlinks. But with the pollution of endless bought links, nepotistic links, and PageRank only being updated quarterly it is tough to glean much market data from only looking at PageRank. Tools like SEO for Firefox (especially when used on a Yahoo! backlink search) allow you to gather more data about the quality of link sources. But they all try to measure proxies for value rather than how people actually surf the web.
Microsoft BrowseRank research would use browsing data to supplement PageRank on determining relevancy. In Internet Explorer 8 (currently in beta) a person's browsing details are sent to Microsoft by default. With ~ 80% of the browser market, Microsoft does not need to use a random walk for the core of their relevancy algorithm - they know what people are actually doing, and can use usage data as a big part of their relevancy algorithms.
Using a tool like Compete.com Referral Analytics makes it far easier to poach top affiliates, discover the best ad buying locations, and replicate a competitor's best backlinks. Be forewarned that the tool only works at the domain level, so it is much better at analyzing Yahoo.com than shopping.yahoo.com.
Along with referral analytics Compete offers destination analytics, which let you know what websites people visit AFTER visiting a particular site...which should help you glean information about how sites are monetizing, what offers are working well, what sites are well referenced by another site, and what sites people go to if they can't get what they want on the current site.
At $500 a month, this tool is probably only going to be used by those who are already fairly successful rather than as an entry level tool.
In years past Consumer Reports WebWatch studies showed that consumers struggled to differentiate ads from organic search results and that "more than 60 percent of respondents were unaware that search engines accept fees to list some sites more prominently than others in search results."
Since those studies Google has changed the background color on top ads from blue to a light yellow color that is hard to notice on some monitors. Changing my contrast setting from 50% to 55% it is hard for me to see the edge of the sponsored box...it simply bleeds into the organic search results. Google interviewed German searchers to ask if they noticed the yellow background on sponsored links and got a negative answer:
INT [interviewer]: “Why do the results on top have a yellow background, did you notice?”
TP [tester]: “I didn’t notice this.”
INT: “What does it mean?”
TP: “It definitely means they’re the most relevant.”
...it is going to get much harder to compete for attention in big verticals unless you have the best visitor value and can afford PPC, or you build a formal partnership with the search engines.
To see where this is headed check out the Yahoo! Search results for a popular band, and see how Yahoo! turned their search results into a useful interaction AND an advertisement for Rhapsody - allowing searchers to play songs directly in the search results. Large portions of the search stream (lyrics, music, entertainment, sports) are going to be directly controlled by the search engines that keep users on their network longer and control the second click.
* at least in the mind of searchers tested by Google and used in Google promotions to promote paid search advertising.
Following on from my post yesterday, How To Craft Kick-Ass Title Tags & Headlines, let's look at meta tags as advertisements, and why you need to think carefully about your offer, and the offers of your competition, when you craft your tags.
Why Are Title Tags Important?
Ranking debates aside, the main reason Title tags are important is because they are displayed, in bold, in the SERPs.
A SERP is a list of 20+ links, all clamoring for the visitor's click. It is therefore important to entice visitors to click on your listing, rather than everyone else's. Sometimes you achieve this by rank placement alone, but with well-crafted tags, you stand a better chance of receiving that click.
What Is The Optimal Length For A Title Tag?
The W3C recommends the title tag should be less than 64 characters long.
Some SEOs think that long, keyword-loaded tags are the best approach. Some SEOs think short, punchy tags are best, as long tags may dilute the weight of the keyword phrase, and there is less risk of Google cutting off your message midstream.
Because other factors play a more significant role in terms of rank, I ignore prescriptive tag lengths. Instead, I look to optimize the message in line with the business goals of a site.
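That said, if you do want to sanity-check your titles against a display cutoff, a minimal sketch follows. The 65-character limit is just a rough convention for what SERPs tend to display, not a fixed rule:

```python
# A sketch of checking and trimming title tags against a display limit.
# The 65-character cutoff is a rough convention, not an official rule.

def fits_serp(title, limit=65):
    return len(title) <= limit

def truncate_title(title, limit=65):
    """Trim an overlong title at a word boundary and add an ellipsis."""
    if fits_serp(title, limit):
        return title
    return title[:limit].rsplit(" ", 1)[0] + "..."

print(truncate_title("Short Title"))  # unchanged: already under the limit
```

A trimmed preview like this lets you see whether the cutoff lands mid-message, which is the real concern, rather than chasing an exact character count.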
Know Your Enemy
This is a proven Adwords strategy which also dovetails nicely into SEO.
The first step is to evaluate your surrounding competition.
Look at the wording of the most successful Adwords ad for your chosen keyword term. Your aim is to replicate success. Run an Adwords campaign and experiment with the wording to find the combination that receives the most clicks and subsequent desired actions. You then craft your title tags and description tags to match. What works for Adwords works in the main SERPs, too.
Another way to approach title tags is to constantly rotate the tags using a script, and monitor the results. This is a split-run approach known as keyword spinning. You keep the winners and cut the losers. This approach is described in my post "Tested Advertising Strategies Respun For SEO".
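The rotate-and-monitor approach can be sketched in a few lines. The variant wordings below are hypothetical, and a real deployment would persist counts and run long enough for statistically meaningful results:

```python
# A sketch of a keyword-spinning split test: rotate title variants,
# record clicks, keep the winner. The variant text is hypothetical.
import random

variants = {
    "Buy Widgets Online - Free Shipping": {"shown": 0, "clicks": 0},
    "Widgets: Compare Prices & Reviews": {"shown": 0, "clicks": 0},
}

def pick_title():
    """Serve a random variant and count the impression."""
    title = random.choice(list(variants))
    variants[title]["shown"] += 1
    return title

def record_click(title):
    variants[title]["clicks"] += 1

def winner():
    """The variant with the best click-through rate so far."""
    return max(variants, key=lambda t: variants[t]["clicks"] / max(variants[t]["shown"], 1))
```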
What Are The Ideal Lengths For Meta Description Tags?
Common SEO wisdom dictates the description tag should be around 160 characters long.
Again, my approach is to take prescriptive lengths with a grain of salt. Instead, focus on marketing and business goals.
The title and description are clear and descriptive. There is a call to action and an appeal to self-interest.
This is a jumble:
The title and descriptions are confused. It is not clear what the benefit is to the visitor.
One problem is that Google sometimes uses a snippet from the page instead of your description tag. Google may also use a DMOZ description.
Google will use the snippet when it finds no description tag, or determines that the description tag you provided is inappropriate. To improve the chances your meta description tag will be used, see Google's guide: "Improve Snippets With A Meta Description Makeover". Essentially, you need to make your meta description tag descriptive, as opposed to a series of keywords.
You can prevent search engines from using the DMOZ description with the following meta tag: <meta name="robots" content="noodp">
Informational queries are meant to obtain data or information in order to address an informational need, desire, or curiosity.
Navigational queries are looking for a specific URL.
Transactional queries are looking for resources that require another step to be useful.
Query classifications can be broken down further into the following sub-categories:
Directed: A specific question, i.e. "Registering a domain name".
Undirected: Tell me everything about a topic, i.e. "Singers in the 80s".
List of candidates: A list of possible options, i.e. "Things to do in Hollywood".
Find: Locate where some real-world service or product can be obtained, i.e. "PVC suit".
Advice: Advice, ideas, suggestions, instructions, i.e. "What to serve with roast pork tenderloin".
Navigation to transactional: The URL the user wants is a transactional site, i.e. "match.com".
Navigation to informational: The URL the user wants is informational, i.e. "google.com".
Obtain: Obtain a specific resource or object, i.e. "Music lyrics".
Download: Find a file to download, i.e. "mp3 downloads".
Results page: Obtain a resource that can be printed, saved, or read directly from the search engine results page, i.e. the user enters a query expecting the 'answer' to appear on the results page itself, without browsing to another website.
Interact: Interact with a program/resource on another website, i.e. "buy table clock".
And further by sub-category type:
Closed: Deals with one topic; a question with one, unambiguous answer, i.e. "Nine supreme court justices".
Open: Deals with two or more topics, i.e. "excretory system of arachnids".
Online: The resource will be obtained online, i.e. "Things to do in Hollywood".
Off-line: The resource will be obtained off-line and may require additional actions by the user, i.e. "Airline seat map".
Free: The downloadable file is free, i.e. "Full metal alchemist wallpapers free".
Not free: The downloadable file is not necessarily free, i.e. "family guy episode".
Links: The resource appears in the title, summary, or URL of one or more of the results on the search engine results page.
Other: The resource does not appear in one of the results, but somewhere else on the search engine results page.
Source: "Determining the informational, navigational, and transactional intent of Web queries", Bernard J. Jansen, Danielle L. Booth, Amanda Spink; Pennsylvania State University.
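As a toy illustration of how queries might be sorted into the top-level classes above, here is a crude keyword heuristic. The word lists and suffix checks are invented assumptions, far simpler than the classifiers studied in the cited paper:

```python
# A toy heuristic classifier for the query taxonomy above. The keyword
# lists are illustrative assumptions, not a real production classifier.

TRANSACTIONAL_WORDS = {"buy", "download", "coupon", "cheap", "free shipping"}
NAVIGATIONAL_SUFFIXES = (".com", ".org", ".net")

def classify_query(query):
    """Guess whether a query is navigational, transactional, or informational."""
    q = query.lower().strip()
    if any(q.endswith(sfx) for sfx in NAVIGATIONAL_SUFFIXES):
        return "navigational"   # the user is looking for a specific site
    if any(word in q for word in TRANSACTIONAL_WORDS):
        return "transactional"  # the user wants to take an action
    return "informational"      # default: the user wants information

print(classify_query("match.com"))        # navigational
print(classify_query("buy table clock"))  # transactional
```

Even a rough split like this helps you decide which queries deserve a hard, action-oriented tag and which need an informational pitch.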
When crafting your tags, think about what classification of query the searcher is undertaking. How would they structure it? What terms would they use? Would they phrase their query as a question? What words would they include? What words would they omit? Dig deep into your keyword research tools and web logs to find this data.
Think about their mindset. Using words like research and compare help you tap into people in the research mode, whereas words like buy, save, coupons, and free shipping attract people ready to buy.
A Call To Action
The title tag and description provide opportunities to include calls to action. A call to action is a phrase that prompts a visitor to take the next step along the sales process.
The keyword term you've selected might give you a clue as to what point of the sales process the visitor is at. Obviously, "Buy X Online Overnight Delivery" tends to indicate a visitor is about to hand over the cash, so you draft your title tag and description accordingly in order to help close the deal.
However, most keyword terms aren't this overt. This is where you need to think about the type of offer you present.
How To Decide Between A Hard Offer And A Soft Offer
Some of the most effective offers are seldom "reasons to buy", but rather "reasons to respond." This is the difference between a hard and soft offer.
The vast majority of searchers are not ready to buy, so by using a soft offer, you stand to capture a greater number of leads than you would if you just made a hard "buy right now!" offer. If all you've got is a hard offer, then visitors who aren't ready to buy will click back, or won't select your SERP result at all.
Instead, encourage the visitor to take a relatively painless action, such as joining a mailing list, or downloading a free case study.
You can take this a step further by using the case study title to find out more about your visitors. For example, a case study entitled "Real Estate" won't tell you much about the problem your visitor is trying to solve, but a descriptive title, such as "Seven Ways To Sell Your Own Home", will. If they download the latter, and your service solves this problem for people, you're one step closer to making the sale.
Benefits Of The Soft Offer
You'll generate more leads
You have the opportunity to enter a dialogue with the visitor, thus moving them through the process
Only you'll know whether a hard offer or a soft offer is most appropriate. But think carefully about the nature of your offer when crafting your titles and descriptions. Is your offer exactly the same as every other offer in the SERP? Or could you tweak your offer to make it stand out from the rest? Your offer should be more enticing than every other offer on the page. Try to get this across in your title and description.
ChrisG has put together a special pre-launch offer on his new AuthorityBlogger course. At first glance, it looks like he has put a lot of work into creating a great service, well worth the outlay if you want to become a kick-ass blogger and/or get the attention of other bloggers. Nice job, Chris.
IMDB is offering lots of free shows and movies online, which may lead to people becoming more acclimated with watching videos online, but if it does people might start expecting more in terms of production value. I am long on the value of video content, but this article shares some of my hesitation with creating tons of video in a complex rapidly changing field.
Despite the rise of amateur video and the new modes of distribution and discussion, Internet technologies have not been able to change the fundamental character of video. Whether someone watches video on a television screen, or plays it on YouTube, video is a linear, passive experience, designed to be watched from beginning to end without alterations or input from the audience. In this sense, video is still following the model set by film in the late 19th century.
Many things I said in the past later turned out to be incorrect after the market changed. Only with years of experience did I learn how important the phrase "it depends" is. With text, an edit might take 30 seconds, but with video it might take 30 minutes. One way to de-linearize video is to create many small targeted videos rather than one large video.
One old-skool marketing technique that will always hold true is the value of the catchy headline.
The headline, given its power to convey meaning quickly, is more important than ever. Attention spans are limited. Media messages flood the channels. We're busy. The function of the headline is to grab our attention and pull us deeper into the message.
Many books have been written on how to craft great headlines. I'm going to quote from the advertiser's bible on the topic, Tested Advertising Methods by John Caples. Caples identifies three main classes of successful headlines.
The Three Main Classes Of Successful Headlines
Self-Interest: The best headlines are those that appeal to self-interest. They offer readers benefits that they want, and that they can get from you. For example, RETIRE AT 30.
News: Humans are predisposed to seek out what is new and different in their environment. For example, NEW, CHEAPER IPHONE CALL PLANS RELEASED.
Curiosity: Appeal to our curious nature. For example, LOST: $1 BILLION DOLLARS.
Of the three, by far the most effective headline in advertising is the self-interest headline. Self-interest usually trumps curiosity and news, especially when time is short.
Compare these two headlines:
PUT UP OR SHUT UP
FIVE TOTALLY NEW WAYS TO GET TOP RANKING IN GOOGLE
The first says nothing that appeals to our self interest. We don't even know what it is about. But you'd be hard pressed not to click on the second headline. The self-interest is just too strong. This is why the second form is used so often in link-baiting and social media. It screams for attention, and then makes a strong appeal to self-interest.
There is a downside to such headlines, however. Modern audiences have become jaded and cynical, especially where marketing messages are concerned. Overplay the benefit, and you'll come off as a shark. Link-baiting, a useful SEO tactic, has developed a bad reputation through overuse of this approach.
Eventually, people tune out.
Get Your Tone Right
We can twist the overused appeal-to-self-interest headline strategy slightly to make it work for us. The key to getting the appeal to self-interest right is to get the tone right. Understand both the audience's desires and the tone of "voice" they respond to.
For example, look at Digg. A cynic might argue that a surefire way to get top page on Digg is to write a headline that includes the following subject matter, and do so using an irreverent tone:
Criticism of Bush
Anything about Digg itself
Some crazy-weird activity from a country no-one has ever heard of :)
By the way, if anyone can come up with a headline that includes one of those elements, feel free to add it to the comments :)
The headline needs to be crafted in such a way as to appeal to Digg's demographic, which is mostly young, tech-savvy males. This demographic tends to respond to a tone that is cynical, flippant, and irreverent. Get that tone wrong - i.e. play it too straight, or too advertorial - and it doesn't matter how strong the self-interest angle is, it's unlikely to work.
How To Use Headline Strategy In SEO
SEO has an additional challenge.
For SEO to work well, the headline, which often doubles as the title tag, should include a keyword term. Many studies have shown that a SERP listing or AdWords ad that includes the keyword term gets more clicks. To get the headline strategy to work for SEO, try combining the keyword term with one of the three formats.
For example, where the keyword term is "high speed routers", try:
High Speed Routers - How To Get Routers At Half Price (appeal to self-interest)
High Speed Routers - Latest Features To Insist On (news, with a hint of self-interest)
High Speed Routers - How We Blew Our Budget (curiosity)
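The variants above could be stamped out mechanically for any keyword term. A minimal Python sketch, where the template wording is illustrative only:

```python
def headline_variants(keyword):
    """Pair a keyword term with each of Caples' three headline
    classes. The template phrasings are examples only; you would
    tailor them to the actual offer."""
    return {
        "self_interest": f"{keyword} - How To Get Them At Half Price",
        "news": f"{keyword} - Latest Features To Insist On",
        "curiosity": f"{keyword} - How We Blew Our Budget",
    }

for style, title in headline_variants("High Speed Routers").items():
    print(f"{style}: {title}")
```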
Even if you're not #1 in the SERPs for that term, you're more likely to attract a click than the guy who simply uses "High Speed Routers" by itself.
Your headline (i.e. the title tag) competes with at least ten other SERP listings on the page, along with various AdWords listings along the top and down the side. The top three SERP positions are gold, but if you can add a touch of appeal-to-self-interest, or news, or curiosity, you'll up your chances of getting the click.
If you want to go one step further with this tactic, use it as a way to segment visitors. The first example I gave is likely to attract those people who are ready to buy, and who are buying on price.
You then need to include your title as a heading on the page, which confirms to the visitor that their click has got them where they wanted to be. They're now far more likely to read beyond the headline.
The common wisdom is that linking out will result in the following:
People will not link back to your site
A page that sends people away has low engagement
It boosts the competition at your expense
However, it appears that the top news site in terms of session use, two months running, is the Drudge Report, a site that does nothing but send people away. I believe Google got rather popular for doing much the same thing :)
And look at the numbers:
"Page view statistics
500 million page views monthly
1.95 billion ad impressions monthly
12 million unique visitors monthly
1.75 million daily unique visitors (weekday)
1 million daily unique visitors (weekend day)
Assuming 60% sell-through at $4 CPM… that’s $56 million annual revenue.
One guy. Linking."
If you provide something people really want, they'll keep coming back.
I just came across one of the funnier SEO emails I have ever read. When I shared it with my wife we both laughed out loud, so I thought I would share it with you. Personally identifiable information has been removed to protect the guilty.
___________ are looking for sites that would be interested in publishing content on behalf of a number of the UK's major brands, including the likes of ________ and _________ and ___.
For a site such as ___________ we'd be prepared to pay up to £30 per article a month, every month, depending on the nature of the agreement.
Naturally, you would have a say in what content is placed on your site, we would simply provide you with useful, accurate and well written content.
To see how the links might look on your home page please visit _____________ (the articles are near the bottom of the page under the title ‘___________’).
The reason we are looking to pursue this relationship with you is because there are a number of sub-standard websites that are ranking higher in the search engines than our clients for their own products by using unethical techniques. It is our intention to address this imbalance and is why we are willing to compensate you on a monthly basis for the publishing of our content and links to our client's sites on your site.
If you feel this is an opportunity that you are willing to discuss further or if you have any questions about this proposition then please feel free to contact me.
________ _______, Media Buyer
Generally, by the time an SEO is experienced enough to be working with Fortune 500s and big brands, they are smart enough not to buy into the bogus ethics debate. But what was funny is that the ethical links they were buying on the example site were not even for brand-related queries...some of the anchor text was for core category keywords like life insurance and loans. :)
It was pretty stupid of them to name their clients (and a live site with link-buying examples) in that email. Had I published it in full without redacting information, that probably would have made their rankings a bit worse ;)
That SEO firm claims to be award winning...I shall send them an email asking if they seek nomination for the worst link request email award.
They are a world-leading enterprise, employing over 22,000 people. Fortune named them "America's Most Innovative Company". They also run various online marketplace services, through which a vast amount of money flows. They are a trusted name in households across the country. It is the year 2000, and that company is Enron.
Less than a year later, Enron would collapse under the weight of institutionalized fraud. And hubris.
The lessons learned from the Enron collapse were the dangers of monopolistic power and lack of transparency.
Google In 2008
Google is the darling of the tech world. In fact, they're pretty much the darling of every world, given their massive market reach and the usefulness of their services. Google occupies a position of enormous power. It is fair to say Google has nothing in common with Enron, other than the fact they are a big company, and for the most part, Google has done a good job in terms of gaining and maintaining trust with a wide range of stakeholders.
Take, for example, the recent case of United Airlines stock. An old story about the airline's bankruptcy was published online, resulting in $1B being wiped off the value of the stock within minutes. The finger pointing started soon after, with Google blaming the originator of the piece, The South Florida Sun-Sentinel, whilst the Tribune Company, which publishes The Sun-Sentinel, pointed the finger right back.
To be fair, the mistake was largely due to a chain of human errors, and most of the mistakes made were outside of the control of Google. Questions of blame aside, this issue comes down to a matter of trust. Clearly, people trusted the information they saw on an automated news service, and acted accordingly. The lesson learned is that we should not be so quick to place trust in the machine.
From Trust To Anti-Trust
There is another trust - actually, anti-trust - issue of late, and this issue goes to the heart of Google's business model - online advertising.
Google's proposed Yahoo partnership is raising fresh antitrust woes. Regulators are starting to look more closely at Google's role in the world of online advertising. Will this deal give Google too much control of the online advertising space? Yahoo claims this partnership will create more market access, and provide better ROI, to advertisers. Advertisers fear that Google could use market dominance to set higher prices for search ads.
In a related example, Aaron reported on a feature in The New York Times about how Google refused to tell the owner of a directory why his bid prices had skyrocketed.
"When I pressed Mr. Fox about Sourcetool, he refused to tell me why the algorithm had problems with the site. When I asked him why the business.com site was in the algorithm’s good graces but Sourcetool’s wasn’t, he wouldn’t tell me that, either. All I got were platitudes about the user experience. It wasn’t long before I was almost as exasperated as Mr. Savage. How can you adapt your business model to Google’s specs if Google won’t tell you what the specs are?"
"I wouldn’t hesitate because I understand that if a search engine happens to stumble upon what it considers improper SEO techniques all on their own, they will more than likely contact us directly to discuss the matter. Getting kicked out of the database won’t even be a consideration. If our improper SEO tactics happens to get outed publicly by some gung-ho blogger, or one of the many competitors competing for our terms, I know that all we’ll get is a tiny slap on the wrist to show the world that the particular search engine is serious about web spam. And once our public scolding is completed, we will instantly be allowed to cut to the front of the confessional line".
Google may well enjoy a significant trust level, but they couldn't exactly be described as transparent, or consistent. The Adwords and Adsense systems have become a hall of smoke and mirrors, where some players get a free ride, whilst others get hammered. There is often little or no explanation given as to why. With transparency comes trust, and the often secretive Google could do a lot more to provide clarity.
Cases of this nature are always complicated and it is unlikely much will change in the short term. Many of us simply wish that Google would be a lot more transparent about how webmasters can use, and build upon, their platform.
I suspect that, going forward, saying "Trust Us!" won't be good enough.
"I have seen one advertisement sell 19-1/2 times as much goods as another" - John Caples
I've been browsing through a pile of my old marketing books looking for tried-n-true techniques that could be applied to SEO in 2008. Here are some examples of ads that worked last century:
"They laughed when I sat down at the piano - but when I started to play!". And "When thin film covers teeth, smiles lose fascination". A personal favorite: "How a strange accident saved me from baldness".
Trouble is, in 2008, these ads come across as hokey.
However, much of the underlying psychology of these time-tested techniques is pure gold, and can be directly applied to search marketing and social media. Over the next few days, I'll look at strategies that can help you grab and hold visitor attention. Some of this will be old-hat to SEO pros, but hopefully it will help those new to the game :)
First up - the value of testing.
The Key To Success Lies In Perpetual Testing Of All Variables
John Caples, author of "Tested Advertising Methods", outlined two classes of advertising:
1. The Testers
2. The Non-Testers
The idea is simple: decide on a desired action you wish the user to take i.e. making a purchase; then link this action back to the advertising spend/keyword term. You also test the wording of one page against another. Run with the winners, and cut the losers. Repeat.
This is known as split-run testing. This process was developed by direct marketers, and has a natural fit with search marketing. An ocean of material has been written about how to do this, so I won't reinvent the wheel by repeating it.
Here are three great resources regarding split-run, and more accurate, but complex, multivariate testing:
The payoff in split-run testing is in the big swings in user action. If you're not seeing big improvements, then you've probably got your landing pages about right already, and split-run testing will offer incremental value, at best.
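For those who want to see the arithmetic behind "run with the winners, cut the losers", here is a minimal sketch of a two-proportion z-test, a standard way to check whether the difference between two page variants is statistically meaningful. The visit and conversion numbers are invented for illustration:

```python
import math

def two_proportion_z(conv_a, visits_a, conv_b, visits_b):
    """Return the z-score for the difference between two conversion
    rates. As a rule of thumb, |z| > 1.96 is significant at roughly
    the 95% level."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    return (p_a - p_b) / se

# Variant A: 30 sales from 1000 visits; Variant B: 50 sales from 1000 visits
z = two_proportion_z(30, 1000, 50, 1000)
print(round(z, 2))  # -2.28, so B's lift over A is unlikely to be chance
```

With low-traffic pages, run the test longer before declaring a winner; small samples make these swings look bigger than they are.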
SEO: What Data Do I Test?
We're spoilt for data.
The speed at which we can test and obtain data on visitor behaviour patterns is unprecedented. Before the internet, direct marketers used to run a series of trial campaigns in print. Can you imagine how difficult it was to measure response? And how much time it all took? These days, we've got detailed, automated analytics in the form of server stats, and we can build, test, and analyse campaigns, often within hours.
One trap those new to SEO often fall into is using the wrong data and metrics. One terrible, but often-cited metric, is ranking, as ranking doesn't tell you anything about utility. For example, how much traffic will the ranking generate? If the ranking does result in traffic, is it the type of traffic I need to fulfill my objectives?
We can find this information out by running a few simple tests.
How Can Testing Be Applied To SEO?
Keyword Testing With PPC
How do you know if the keyword you intend to target is really worth targeting? A keyword term may have considerable volume and little competition, but if those searchers are intent on research, and you are trying to sell something, then your effort is wasted.
One way you can test user intent is with a short PPC campaign.
PPC offers you some valuable data points. Firstly, you can test actual search volume, as opposed to estimated volume, simply by running an ad. The Google data provides these numbers.
Secondly, you can measure the intent of the query by measuring click activity.
Determining searcher intent is important. If your aim is to sell via your site, then you want to target people who buy. Often, this information is contained within the keyword query itself. For example, the intent behind "buy x online" is clear. However, the intent behind "San Francisco Houses" is much less so. How do you measure intent if it is unclear from the search phrase?
You can do this by crafting different AdWords ads - i.e. some commercial in nature, some informational - and testing them against one another. You can further test visitor intent by crafting landing pages that demand the visitor commit to an action that causes them some level of "pain", i.e. filling out a form. If they aren't prepared to engage after clicking a PPC ad, they're unlikely to do so just because they found your pages in the organic listings, either. If you find a PPC term with a high level of user engagement, chances are that keyword term is gold on the organic side, too, and therefore a great keyword to target.
After you've validated keyword phrases in this way, you can then set off on your SEO campaign, armed with the knowledge that your keyword terms should underpin your business objectives.
One flaw in this approach is that the searcher has clicked on a PPC ad vs an organic listing. This very action tends to indicate a commercial intent as people who do not have a commercial intent tend to ignore search advertising altogether. However, it will give you a ballpark idea and could save valuable time and effort, especially if you're targeting generic, non-specific keyword terms that don't clearly convey intent.
I spotted this technique a while back on BlueHatSEO. It is a fantastic technique for testing and refining SEO on big sites.
It can be difficult to know what keyword variant attracts the most visitors. For example, does "Myspace Pimps" get all the traffic, or does the variant "Pimps On Myspace"? Keyword tools often aren't sensitive enough to reveal this data, and it can be time consuming to monitor, test, and change thousands of individual web pages.
Instead, try adding a counter to each page and decide on a threshold - say, five visits per page per month. If the page views fall below this figure, use an automated script to scramble the title tag and the headings to produce a different keyword order. Reset the counter, and see whether the new keyword order receives more page views than the previous order. Your site will self-optimise, based on the results of each test.
This is an excellent application of an established advertising method - an automated split-run test - to SEO.
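Here is a rough Python sketch of the idea, assuming you already log monthly page views per page. The threshold, function name, and data shapes are my own invention, not Eli's actual script:

```python
import random

PAGE_VIEW_THRESHOLD = 5  # minimum visits per page per month (assumed)

def maybe_rescramble(title_words, monthly_views, best_views, rng=random):
    """If this month's views beat the threshold and the best result so
    far, keep the current keyword order; otherwise shuffle the title
    words to produce a new order to test next month. A sketch of the
    BlueHatSEO self-optimising technique, not a drop-in script."""
    if monthly_views >= PAGE_VIEW_THRESHOLD and monthly_views > best_views:
        return title_words, monthly_views  # keep the winning order
    shuffled = title_words[:]
    rng.shuffle(shuffled)  # new keyword order to try
    return shuffled, max(best_views, monthly_views)

words, best = ["Pimps", "On", "Myspace"], 0
words, best = maybe_rescramble(words, monthly_views=2, best_views=best)
print(" ".join(words))  # a new keyword order to test next month
```

On a real site you would persist the counter and best-performing order per URL, and rewrite the title tag and headings from a template at render time.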
Great article in the NYT over the weekend about an ad arbitrage directory named Sourcetool, which Google punted from the AdWords program. A couple quotes:
When I pressed Mr. Fox about Sourcetool, he refused to tell me why the algorithm had problems with the site. When I asked him why the business.com site was in the algorithm’s good graces but Sourcetool’s wasn’t, he wouldn’t tell me that, either. All I got were platitudes about the user experience. It wasn’t long before I was almost as exasperated as Mr. Savage. How can you adapt your business model to Google’s specs if Google won’t tell you what the specs are?
sells links (yes they have editors, but when they were interviewed about a year ago by Aviva Directory they only had 6 editors managing 65,000+ categories...many of the listings not only included aggressive anchor text, but also allowed the use of up to 5 spammy sub-links with each listing)
used nofollow on many of the free editorial links (while passing link juice out on the paid links)...this was corrected after we gave them a proper roasting on Threadwatch :)
uses a funky ajax set up to hide work.com content in a pop up (but makes it accessible to the Google crawler)
scrapes Google search results as "web listings" and in some cases Google ranks these pages! (Google is ranking a Google search result surrounded with Google AdSense ads, branded as Business.com)
Any one of those four would be enough to kill most websites, but because of Business.com's large scale, strong domain name, and brand, they can do things that most webmasters cannot. They are given the benefit of the doubt because Google cannot clean up all arbitrage without hurting their own revenues - and Google's job is easier if they have to police a few thousand companies rather than millions of individuals.
Google also told me that it never made judgments of what was “good” and “bad” because it was all in the hands of the algorithm. But that turns out not to be completely true. Mr. Savage shared with me an e-mail message from a Google account executive to someone at another company who had run into the same kind of landing page problem as Sourcetool. The Google account executive wrote back to say that she had looked at the site and found that “there seems to be a wealth of valuable information on the site.” Consequently, her team overruled the algorithm.
Algorithms (and under-waged third world employees labeled as the algorithm) often make mistakes. If a mistake is made when Google passes judgement against your site, is your site good enough to recover? If your site was deleted from the Google index would anyone other than you notice and care?
"I've got the cheapest cars. I got the best deals in town! You won't get better!"
What is wrong with this picture?
This is how sales used to work. The salesperson worked to a script. The complex desires and concerns of the customer mattered a whole lot less than the need to push a generic solution. There is little in the way of relationship development or needs assessment.
Fast forward to 2008, and we live in a very different world. Thanks to rich, deep product and service markets, the customer has near-infinite choice. The power has shifted to the consumer, although the downside is often confusion and inaction. This is why it is important to listen.
What Does The Customer Really Want?
In my opening paragraph, the salesman hasn't really bothered to find out what the customer really wants. All he knows is they probably want a car. He has made assumptions about the rest, and launched straight into benefits.
Many websites make the same mistake.
Here is a market research question format you can incorporate into your SEO and copywriting. The aim is to find out whether the benefits you are selling are the benefits the customers actually want. This is known as the SPIN selling method, devised by Neil Rackham.
Situation Questions: These ask about facts or explore the buyer's present situation. For example, "How big is your family?"
Problem Questions: These deal with the problems, difficulties, and dissatisfactions that the buyer is experiencing with the present situation and that the supplier can solve with its products and services. For example, "What mileage is your old car getting?"
Implication Questions: These ask about the consequences or effects of a buyer's problems, difficulties, or dissatisfactions. For example, "How much is it costing you to run your car each week?"
Need-Payoff Questions: These ask about the value or usefulness of a proposed solution. For example, "How much extra money would you have for other things if we could reduce your weekly transport costs?"
Can you imagine how focused your pitch would be if you had the answers to these questions? You'd know exactly what your visitor wanted, and you'd have a good chance of closing the sale. However, it can be difficult to get this level of engagement from web visitors.
There are a few strategies we can adopt to get closer to those answers. It can help our SEO, too.
1) Path Your Visitors
On your landing page, write some copy, then ask the visitor a few questions. Keep the SPIN methodology in mind. Make each question a link to another page. Depending on how the visitors answer, they will be taken through a series of different pages that help address their needs. This will lead them closer to desired action. This has a great payoff for SEO, too. You can incorporate hundreds of pages into your site, all asking slightly different questions about pretty much the same thing. These pages become natural, interlinked keyword variations on a theme.
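A minimal sketch of how such a question path might be modelled as data, with invented page names and questions:

```python
# Each page asks one question, and each answer links to a follow-up
# page. Page names, questions, and answers are hypothetical examples,
# loosely following the SPIN progression described above.
PATHS = {
    "landing": {
        "question": "Are you buying or selling a home?",
        "answers": {"buying": "buyer-guide", "selling": "seller-guide"},
    },
    "seller-guide": {
        "question": "Do you want to sell privately or via an agent?",
        "answers": {"privately": "private-sale-offer", "agent": "agent-offer"},
    },
}

def next_page(current, answer):
    """Follow the visitor's answer to the next page in the path."""
    return PATHS[current]["answers"][answer]

print(next_page("landing", "selling"))         # seller-guide
print(next_page("seller-guide", "privately"))  # private-sale-offer
```

Each node in a structure like this becomes a real, interlinked page, which is where the SEO payoff of keyword variations on a theme comes from.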
2) Overload With Answers
Sometimes pathing isn't appropriate.
One of the potential risks is the visitor may tire of the questions, and drop out of the sales process. Always be sure to make it easy for the visitor to complete the desired action (e.g. make a purchase ) at any step.
Another approach you can use is to overload your sales pages with keyword copy in an attempt to answer most buyer needs on the one page. In the direct marketing world, it is an established fact that long copy produces more sales than short copy. Part of the reason for this is because people are at different stages in their buy cycle, and their needs and desires will vary. You see this approach on sites such as Amazon.com, and some pretty horrid examples on hard-sell sites where this technique is taken to the extreme.
Try making long copy from a series of independent short copy units. One obvious example of this is a FAQ. A person doesn't need to read all the copy to get their questions answered, but can jump to the appropriate place to find their answer. Another can be seen in the Amazon page structure. Those who like customer reviews know to scroll down to the bottom of the page. Those who want a description look in the middle section. Those who want to price compare can do so against second hand copies. And so on.
If you're thinking this point is obvious, you're right. It is. But it underscores the need to always consider your visitors needs and how they may vary. Orient your strategy around the idea that there will be multiple questions and answers, needs and wants, and work this into your copy and site structure.
AIDA stands for Attention, Interest, Desire, and Action.
These are the things you must do, in terms of web information structure, in order to get your visitor to the close.
Capture attention and interest. Create desire by sweetening the deal (one-day offer, bonus gift, discount today only, etc.). Help visitors complete the action by overcoming objections (money-back guarantee, etc.). You lead the dance, but you're always listening out for the visitor's wants and needs. Create funnels in your analytics programs. See where people are dropping out, and ask why. Sales used to be about talking. These days, it's about listening.
Eventually, your sales copy will meet the varied desires and needs of your visitors.
I'll be making further posts about AIDA, but the important thing for now is to think about how listening can be used in terms of information structure, and the ways in which you need to respond. SEOs are already good at listening. They "listen" for keyword queries, and orient their copy accordingly.
Think of the process less as a sales funnel, and more of a buy path. The buy path is a relationship, consisting of questions and answers. That process starts on the search engine, and ends when you provide a visitor with the answers, and the solutions, they need.
These black holes aren't the result of the CERN Hadron Collider. They are forming for two reasons: the desire to keep people on site longer; and to hoard link juice, in order to dominate the SERPs.
Increasingly, top-tier sites are becoming cagey about linking out. They are more than happy to be linked to, of course, but often the favor is not reciprocated. Check out this post by SEOBlackhat.
What Does A Black Hole Look Like?
Uber-black hole, The New York Times, seems reluctant to link to anyone but themselves. This is especially annoying when they write about websites.
Wikipedia no-followed their links some time ago, thus forming a PageRank variant of the black hole.
The mini-me black hole, as practiced by TechCrunch. Rather than directing you to a site mentioned in an article, TechCrunch would direct you to their own CrunchBase entry instead, thereby keeping you on-site longer, and passing link authority to their own web pages. As a result, a search on Google for a sites' name may well bring up the CrunchBase entry. To be fair, TechCrunch does also link out, and there is an explanation as to why TechCrunch aren't as bad as the New York Times here.
The result is a link-love black hole. Sites using such a strategy can dominate the rankings, if they are big enough.
So if you wanted to create a blackhole, what would you do?
Don't link to anyone
If you must link out, then No-Follow the links, or wrap them in scripts
Direct page rank around your own site, especially to pages featuring your competitors names
Buy a motherlode of links
Become a newspaper magnate :)
Now, if you're an SEO, you might be feeling a tad conflicted about now. Why wouldn't every SEO do this? What if you owned a black hole? Isn't that the ultimate SEO end game?
In the long term, I doubt it.
If this problem becomes too widespread, Google will move to counter it. If Google's results aren't sufficiently diversified, then their index will look stale. If you search for a site, and get third party information about that site, rather than the site itself, then this will annoy users. Once confidence is lost in the search results, then users will start to migrate to Google's competitors.
I'm not certain such a move will be entirely altruistic, however. After all, what is the point of Knol? No, really - what is the point of Knol? ;)
The Advantages Of Sharing The Love
Consider what you gain by linking out.
Webmasters look at their referrals, and may follow the link back to check out your site
Outbounds may count for more in future, if they don't already
Your users expect it. Don't fight their expectations, or you'll devalue your brand equity
Any site that looks "too-SEO'd" risks standing out on a link graph
There is social value in doing so. Black hole sites start to look like bad actors, can receive bad press, and risk damaging their relationships with partners, suppliers, and communities.
"..... The web is a great example of a system that works because most sites create more value than they capture. Maybe the tragedy of the commons in its future can be averted. Maybe not. It's up to each of us".
The phrase Black Hole SEO was used by Eli on BlueHatSEO.com over a year ago to describe various aggressive SEO techniques.
When was Quintura launched? What gave you the idea to launch it? What problems were you trying to solve by launching it?
Quintura was founded in August 2005 and released its first search application in November of that year. One year later, we launched a web-based search. It was based on visual context-based search concepts that the founders had been developing since the 1990s. Quintura was founded to solve several fundamental problems inherent in today's search engines: too many irrelevant results are returned, and no one reads past the first page; there is no way to manage or tune results by defining context or adding search scope; and there is no means for users to graphically visualize search terms or manage their relationship/relevance. Quintura is designed to make it visually simple for searchers to find what they are looking for, and to make it easy for web publishers to expose the content their visitors are looking for.
You guys have got a lot of great press from tech bloggers. On the marketing front what are some of the biggest and most successful surprises you have encountered? What have you found to be hardest when marketing your search service?
The simple fact that there is a tremendous amount of interest in our technology and service, in spite of the large field of alternative search engines on the market. We've invested most of our time and effort in research and development. Our biggest challenge has been getting our first marketing message out, which we're now in the process of expanding to mainstream media.
How do you guys generate the keyword clouds?
That's part of the magic behind the Quintura technology. At the heart of our technology is a semantic-based 'neural network' algorithm. The cloud is literally a depiction of those search terms laid out to show their contextual relationship. Since the graphic depiction is dynamic - (you are interacting with the search in real time) one of our design goals has been to develop the widget to be extremely responsive. Through the past year, we think we've reached that point.
Quintura is popular as a keyword research tool amongst many SEOs (I use it all the time). Have you thought about combining your service with search volume data and/or competitive research data to create a formal premium keyword research (or competitive research) service/tool?
We've been asked that several times, but for now, our goals are to provide the best consumer site search services to the market and to provide our search widget to as broad an end-user audience as possible.
Quintura makes boolean search easy to visualize. Do you think searchers will eventually start using advanced search operators more on general web search engines, or will most only use it when it is presented in an aesthetically friendly way like Quintura does?
The question is whether users want to become adept at boolean logic or would they prefer to have that hidden in the background. From our experience, users would prefer to focus not on the math but on the search itself - finding the most relevant results in the least amount of time. By laying out search terms contextually and graphically, Quintura helps users manage their search and be in control of their search.
When partners sign up for Quintura you guys create a custom index from a crawl of their sites. How many domains can be part of the same index? What sort of sites does Quintura's visual search work great on? Which ones are not as strong of a match?
There is no limit. We're glad to work with large web publishers directly to assure that we are indexing all important content as part of our site search solution. The publisher of several web-sites can create a “vertical” search engine based on the Quintura search cloud. Quintura works well with all web-sites that we have worked with to date, including numerous blogs. That said, our first major site search clients were lifestyle portals and lifestyle magazine web-sites.
Do you see the face/interface of general web search changing drastically in the coming years? How might it change?
The web is getting more visual. So is the search interface. That’s the trend. We are enabling our content-publisher customers to be more creative through customization of the widget itself. We're also looking at ways to make the search results even easier to see through the use of even more graphics.
Does Google have general search locked up? What competitive positions might allow people to build out a strong competitor that can take marketshare from Google?
General search is mostly locked up with Google. In my opinion, the best way of taking marketshare from Google is not by building a better search destination site, but by changing the paradigm – giving users reasons not to go to a search engine at all. Because when they think search engine, they think Google. Essentially, what Quintura site search does is create environments where users keep exploring their passions, their interests, and their information needs from where they are on the Web. People go to search engines when they can’t find what they want where they are.
Chitika has created a fairly large sized behaviorally targeted ad network by targeting ads to the search query prior to people landing on a page. Your site search strategy seems like it could be a rather powerful strategy for building a strong network. How has growth been going? Do you have any interesting success stories from the publisher or advertiser standpoints?
Quintura currently powers site search for a monthly audience of 8 million site users. Tests are underway on various U.S. sites, including two major men’s lifestyle sites and an educational publisher. We plan to reach an audience of 50 million in 2009. You can see the Quintura search widget on lifestyle sites Maxim.com, Passion.ru and Cosmo.ru; technology news sites ReadWriteWeb.com and Compulenta.ru; business community portal E-xecutive.ru; web-sites of consumer magazines Hilary Magazine, Russian Newsweek, ComputerBild; luxury news site LeLuxe.ru; in addition to hundreds of smaller web-sites and blogs that joined our affiliate program for site search. We have also approached several online advertisers including security software vendor Kaspersky Lab to advertise on our search widget network of sites.
What types of ads work especially well with a service like Quintura? Which ones are less strong?
We have tested both contextual search ads and display ads. We are going to blend search ads with display ads for more visual appeal. Plus, we can target those contextual graphic search ads with much greater precision because of our context-based algorithm. Ads from companies with established brand logos benefit from our ability to graphically display their logos in the search cloud.
What areas does Quintura have a lot of inventory in?
It is in lifestyle and technology areas.
Many search engines (Google, Yahoo! Search, Live), large content & commerce sites (Amazon.com, eBay, Wikipedia), and browsers (IE8 Beta 2, Google Chrome, Firefox 3) are now adding search suggestions in the browser via the search box and/or address bar. Do you see this eventually evolving into a Quintura-like service?
It’s a helpful feature that is mostly based on search statistics. We go a step further by offering contextual suggestions. One of the greatest aspects of our display cloud is that it shows contextually-related results, and depicts them with a graphical element. Can you imagine a shopping experience that lets you see related items in real time?
Quintura is currently powered from the Yahoo! index. Do you guys ever plan on creating your own web-wide search index?
As a matter of fact, we are already creating our own web index from the individual indexes of web-sites where Quintura powers site search. Quintura site search on those web-sites is powered by search results from Quintura's index of those sites.
How many regular users does Quintura.com have as a search destination? Do you guys intend to become a consumer search destination, or are you more focused on providing search for third party sites?
We focus on providing site search, analytics and monetization platform for web publishers and content owners. As a search destination, Quintura has less than 1 million users per month. We will continue operating and developing our search sites to provide the benefits of our search technology to users. For example, Quintura.com will evolve into an online research tool where registered users will be able to save and share their searches online with the other registered members.
You guys have a vertical search service for kids. Is that seeing good adoption? Do you plan on coming out with any other vertical search engines?
Children are far more graphically oriented and can grasp contextual depictions easily. It was a natural extension for us to offer a search engine designed specifically for children - Quintura for Kids. It's also a great test bed for us to further evolve search technologies while giving kids a hand. The search engine is used mostly in elementary schools and public libraries in the U.S., Canada, Australia, and New Zealand. Since its first launch in March 2007, several hundred school and teacher web-sites have linked to Quintura for Kids. According to site statistics, the search engine has 70 percent returning visitors, and 75 percent of visitors come to the site directly from a browser. In June 2007, Quintura for Kids was ranked the highest among search engines for kids by Search Engine Watch.
We're also evaluating additional opportunities, including licensing our technology to intranets and major search engines.
For now, our hands are full with upcoming site search product enhancements and monetization as well as with our growing site search customer base.
Everyone already knows who they are. They have an established, iconic presence. They have mega-bucks to spend. They hire very expensive people to make very expensive noises in every market-place in the world.
But what do you do if you're a web entrepreneur trying to build a brand, from scratch, from your couch?
I've put together a list of brand building ideas, strategies and resources that can help you enhance and establish your brand on a limited budget.
1. Own Your Keyword Name
An obvious example of this strategy is SEOBook.com.
I recall Aaron describing how there was no search volume for "seo book" when he started, although there was a market for books on SEO. By building up that brand name, Aaron sparked brand searches, and forever owns the search term.
SEOs will be aware of the power of incorporating keywords into your brand name itself. The trick is not to be too generic, else you'll forever compete with everyone else who targets generic keyword terms.
2. Tell A Consistent Story
You walk into a luxury hotel. The street frontage and reception are first class, but as you explore, you notice the hallways are shabby. The rooms are top notch, but the bathrooms are dated and there are cracks in the bath.
The brand is not telling a consistent story - well, not a story that says "luxury" - and will suffer as a result.
Everything you do on your site must tell a consistent story. Everything you do is your brand. Great design is of little use if the copywriting is sub-standard, and vice-versa. Get all those little, but important, details right. Broken links, 404s, slow load times, confusing navigation, unexpected surprises - they're all part of your brand experience.
3. Tell A Great Story
You'll hear this a lot in modern marketing. Businesses often say "we have a great story to tell".
Stories can be very powerful brand building exercises because people like being told stories. Stories are easy to remember, they capture the imagination, and they engage people.
Learn how stories are constructed. In a nutshell, stories move from a point of equilibrium, into chaos. The central character faces a series of challenges, which s/he overcomes. A new status quo is established.
How could this be used for a brand?
Apple started in a garage. Two misfits overcame the might of the corporate world to produce one of the world's most successful technology brands.
That's a David vs Goliath story.
But what if you don't have such a glorious story yet?
Tell a series of small stories.
"I was always getting frustrated because I often had to yell over a crowd when there was no PA available. So I started using and selling cheap, mobile PA systems. Now everyone can hear me, whether they like it or not!"
Not a great story, but it illustrates a benefit.
What is your story?
4. User Experience Is Your Brand
Site structure and usability are as much part of branding as site design. Learn the lessons of Google. The user experience is the brand i.e. fast, simple, uncluttered. Brand recognition is largely created by the accumulation of experiences and associations the user makes with your company.
There is often no need to hit people over the head with convoluted mission statements. People don't care about you. They care about them. If you make their experience a good one, they'll reward you.
5. Brand Partnership
Partner with someone who has an existing brand.
An example of this strategy was mentioned on CopyBlogger.com recently. Approach authors of well-known how-to books and provide an online learning resource. The author puts his/her name to it, and receives a share of the revenue. You run the online learning resource. You have the benefit of starting with a pre-established brand and audience.
Also consider licensing brands and product marks.
6. Let Your Customers Tell You What Your Brand Is
In The 4-Hour Workweek, Timothy Ferriss outlined a strategy using AdWords to decide the title of his book. He placed AdWords text ads, varied the titles, and chose the title with the highest click-thru rate. His potential audience decided his title, which is also his brand: "The 4-Hour Workweek".
This strategy is useful in that it can help identify untapped niches in markets.
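Ferriss's approach boils down to comparing click-through rates across ad variants. A toy sketch of the arithmetic, in Python (the alternative titles and all the numbers here are made up for illustration, not Ferriss's actual data):

```python
# Hypothetical ad-variant results: (impressions, clicks) per candidate title.
variants = {
    "The 4-Hour Workweek": (10000, 320),
    "Broadband and White Sand": (10000, 110),
    "Millionaire Chameleon": (10000, 95),
}

def ctr(impressions, clicks):
    """Click-through rate as a fraction of impressions."""
    return clicks / impressions

# The title your audience clicks most is the title (and brand) you run with.
winner = max(variants, key=lambda title: ctr(*variants[title]))
print(winner)
```

The same comparison works for taglines, headlines, and product names; the point is that the market, not the founder, picks the brand.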
7. Have A Unique Selling Proposition
Why is your product better than the others?
Answer that question, and you have a brand.
Without it, you don't have a brand.
8. Guard Your Reputation
Move heaven and earth to maintain your good standing.
9. Become The Brand
Be your brand. Live your brand. Tell everyone, and tell them often.
It seems obvious, but I've seen many a presentation where I couldn't recall the names of most of the companies by the end of the day, mostly because people didn't do the simple thing of repeating their brand name often enough. Chances are that you need to repeat this information five times before most people will remember it.
As an aside, Jason Calacanis had a piece of advice in one of his recent newsletters. "If you don't *really* believe in your product on a deep, intrinsic level, it's going to come across *immediately* to the bloggers and press you're pitching".
The simple, most powerful thing you can do is to believe in your brand. Everything else flows from there.
10. Viral Baby
If you're reading this site, chances are you're already ahead of the curve when it comes to the huge potential the Internet offers the little guy. Multi-national businesses can now be run from a bedroom.
Look at Digg. YouTube. Facebook. Flickr. They all started from relatively humble beginnings, then went supernova very quickly. Why? There are many reasons, but they all have one thing in common.
They built viral into the brand.
They rely on one person telling another person. They facilitate it. They encourage it. They make it almost impossible not to do it.
Can your brand be made viral? Can you twist it so that people will engage with it and pass it on to their friends?
11. Time To Advertise
Once you've got your messages down, then it is time to advertise. You'd be surprised how many people do this the other way around!
Some corporates are especially bad at this, possibly because the marketing department isn't talking to the sales department, but therein lies the opportunity for the nimble entrepreneur.
One tip is to use banner ads where you pay per click. Click-thru rates on banner ads are notoriously low, but they do generate brand awareness. Also seek out sites that aren't in direct competition with you, but have a similar, established audience. You can leverage off their brand by association.
Whatever channels you choose, the key is to repeat a single, simple, compelling message, over-and-over again.
I'm reading a book called "Predictably Irrational", by Dan Ariely. It's about the hidden, irrational forces that shape our decisions, and it's a great read.
There are a few interesting case studies in this book that can be applied to web marketing. I'd like to look at two aspects which might help those of you involved in e-commerce.
Relative Pricing Structures
The first experiment looks at relative pricing structures. How do you structure your prices in order to achieve higher returns? Often, it can be a simple case of making an offer no one in their right mind will accept.
Here is an example.
The Economist presented readers with the following subscription offer:
1. Internet only subscription for $59
2. Print only subscription for $125
3. Print and Internet subscription for $125
Notice something odd about option 2 and 3? Why would anyone take up option 2?
They wouldn't. And that's the point.
It turns out humans rarely choose things in absolute terms. We work out how much things are worth based on what other things are worth, and compare them. In the above example, the "Print and Internet" offer is better than just the print offer. The "Internet Only" offer might be better than both, however there is no point of relative comparison for that offer. The relative comparison is made between offers 2 and 3, which makes option 2 look poor, and option 3 look like a steal. Ariely ran real tests to measure take-up, and sure enough, most people took option 3.
To illustrate how powerful this pricing method is, let's remove option 2.
1. Internet only subscription for $59
2. Print and Internet subscription for $125
In this example, people are faced only with a cheap option and an expensive option. The point of comparison is largely about price. You can guess what option most people chose. They selected the cheapest option, as price becomes the key differentiator.
So, try splitting your offers. Create offers that are valuable compared to other - deliberately substandard - offers.
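The decoy is easy to characterize: it's an offer that is dominated by another offer, one that costs as much or more yet delivers strictly less. A toy sketch in Python, using the Economist numbers above, that flags the dominated option:

```python
# Offers as (name, price, set of things you get) - the Economist example.
offers = [
    ("Internet only", 59, {"web"}),
    ("Print only", 125, {"print"}),
    ("Print and Internet", 125, {"print", "web"}),
]

def dominated(offer, others):
    """True if some other offer costs no more yet includes strictly more."""
    _, price, features = offer
    return any(
        o is not offer and o[1] <= price and features < o[2]
        for o in others
    )

# The decoy: nobody should pick it, yet it makes option 3 look like a steal.
decoys = [o[0] for o in offers if dominated(o, offers)]
print(decoys)
```

The check is the inverse of how you'd normally design a product line: here you add the dominated option on purpose, so the premium offer has an obvious point of comparison.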
The Effect Of Expectations
In another chapter entitled "The Effect of Expectations", Ariely asks why the mind gets what it expects, and not necessarily the reality of a situation.
For example, Ariely conducts an experiment whereby researchers offer students a free cup of coffee, along with some rather unusual condiments, such as cloves, nutmeg, orange peel, anise and sweet paprika. Not the sort of thing you'd likely put in your cup of coffee! The students were asked to rate the taste of the coffee, and specify the maximum price they were prepared to pay for a brew.
From time to time, the researchers made one subtle change. They placed the condiments in a range of containers, from rough styrofoam cups, through to beautiful glass-and-metal containers. The condiments were never actually used, however the mere appearance of the serving bowls had a curious effect. When the condiments were placed in luxury containers, the coffee drinkers were more likely to say they liked the coffee, and what's more, they were prepared to pay a lot more for it.
If people thought the coffee was upmarket, they convinced themselves the coffee was upmarket. The reality was that the coffee never changed. The coffee was of the same quality throughout the experiment.
Self-evident, right? So, can this theory be directly applied to web marketing?
Essentially, we're talking about branding. There is the logical first step of using upmarket web design in order to help convince people your product or service is more desirable. There is a trap, however, and this is the reason I think this case study needs to be adapted for the web environment. When it comes to e-commerce, upmarket, glossy sites do not necessarily result in higher sales. There are various reasons for this, but I think mainly it has to do with the level of trust. A slick website can sometimes feel impersonal, and people crave a personal feel on the web.
Trust, not slick graphical design, is the equivalent of the elaborate serving bowls.
In order to raise expectations, consider raising the level of design, but only if you do so without losing trust. Achieving a fine balance between excellent usability, trust metrics and excellent graphic design is a great target to aim for.
Consider the converse. Have you bought from sites that are unusable? Plastered with over-the-top Adsense? Such sites are less desirable as expectations are set low, primarily because of the low level of branding. The buyer is expecting "cheap". That's probably the only reason people buy from such a site.
Such sites are the web equivalent of broken styrofoam cups, compared to the elegant serving bowls.
Have your say
What do you think? Have you got any "irrational strategies" to share?
I recently asked Matt Mullenweg if he would be up for doing an interview via email. He said sure, and here are his answers to the best questions I could come up with. Thanks again for doing the interview Matt!
How did you get into web programming? What made you decide to start working on WordPress?
I had started off pretty badly with Frontpage and Dreamweaver. Later I started to use things like guestbooks and forum scripts and light modifications of those for sites I was working on. The breakthrough for me personally, though, was a book called Mastering Regular Expressions from O'Reilly which inspired me to start writing my first code from scratch.
I think my first code contribution to any Open Source project was a set of regular expressions that would "curl" quotes to make them typographically correct, and it was accepted into the b2 system.
Did any early setbacks make you want to quit the WordPress project? If so, how did you work through them?
Since I was just doing it for fun and my own personal usage there were never any problems that were *that* big a deal. There were plenty of times that were tough around security problems, spam links, or community splits, but most ended up being learning opportunities.
When did you know that WordPress was going to work out?
When Zeldman switched.
How did you get beyond wanting to do everything yourself?
That's a tough one - I'm a perfectionist. I think it was that I eventually met folks who were as passionate as I was about the product and were clearly more competent. I think you have to know someone is better than you at something before you can truly let go.
One of the things that blew me away at Elite Retreat was how deeply you grasped the web. Who were some of the major influences in shaping how you perceive the web? What are some key articles and books that you think programmers and marketers should read?
Do you think the strategy of "I'm happy to ship a crude version 1.0 and iterate. I find my time is more effective post-launch than pre-launch." applies to bloggers and content producers as well as software producers?
Not as much - for an individual atom of content you don't have ongoing usage, you have a single chance to make an impression on someone. For a site as a whole the iterate approach is good, but for a given post or article give it your all.
At Elite Retreat you mentioned the concept of a "personal newspaper." What does that phrase mean to you, and do you see that concept spreading far and wide as the web ages?
I think Google Reader has the best chance of doing this. Basically there is a ton of interaction data I produce every day about what I read, how long I spend on different types of content, what I buy and gadgets I own, what topics I'm actually interested in, what topics I aspire to be interested in... There's no reason all of this couldn't be used as a filter on the torrent of news and information available every single day.
Blogging has become perhaps the leading information distribution format online. Have you been surprised by the growth of blogging? Do you envision blogs leading online publishing for a long time? What other formats could gain significant traction?
I was pretty surprised by the growth of blogging, so I'm not going to attempt to make predictions about other formats I know even less about. :)
During past interviews you mentioned that you liked to "stay small while creating a lot of value." With powerful open source software tools & large community sites that may be possible, but what lessons should traditional niche service based business models and publishers take from successful open source software programs like Wordpress.org and communities like Wordpress.com and apply to their businesses?
I think one of the most important lessons is that you have to let go and let the community or your customers guide your direction, be it around development, pricing, or direction. The extent to which WordPress has been successful thus far is directly correlated with our responsiveness to our users.
At Elite Retreat you mentioned a meta tag change that dipped the traffic to Wordpress.com. What happened and how long did it take to figure out what happened? How long did it take traffic to recover?
We had changed the meta description on permalink pages to basically be an excerpt from the post. This was less effective in SERPs than Google's auto-generated excerpt and so traffic dipped as a result. It probably took a month or so to figure it out, but traffic came back pretty quickly after we reverted the change.
Wordpress.com is one of the leading user generated content sites on the web. What are some of the leading strategies you have used to entice quality content creation? What strategies are key to detering the creation of spam?
Well, one thing that has certainly helped is the lack of user ads, which removes people's direct financial incentive to create content purely for Adsense. Second, I would say we take a very proactive approach in watching out for people trying to take advantage of the system to spam or drive traffic back to other sites inorganically.
Akismet says that 89% of comments are spam. Have you been surprised by the growth of comment spam? What seems to be driving the exponential growth of comment spam?
I think comment spam growth has mirrored what happened in the email world, and will probably continue to. The growth seems to be related to the low cost to spammers of just flooding everyone.
Someone used an automated bot to register an account on my site and post a contextually relevant comment about splogs being a problem. They then referenced a post on their own blog - itself stolen content, since the blog was a splog. That splog had 60 subscribers on Feedburner! I have also caught a comment bot sequence that conversed with itself on one of my blogs. As spam gets more sophisticated, will central systems like Akismet become more powerful?
I sure hope so. :)
I imagine that comment spamming on MA.TT is a quick way to get into Akismet. As online marketing gets harder, some people are willing to do negative marketing for competitors. What steps can brands take to help prevent being listed as a spammer if someone else tries to ruin their reputation?
Akismet is pretty sophisticated and can usually detect that type of bowling, but of course if there is ever a persistent problem you can contact Akismet support 24/7 on the site.
At points in time I think many bloggers hated SEOs (probably for associating the field of SEO with all the comment spam they got every day). What do you think of the field of SEO? Does Wordpress employ key SEO strategies by default, and what modifications, if any, do you recommend?
I'm conflicted - on one hand there are certain things you can do to make your site more accessible to search engines that should be a baseline that everyone does, but on the other hand search engines are just trying to deliver the best results to their users, so if you just focus on users and their experience the search engine should be able to figure out you're the canonical resource for a given topic over time.
WordPress' SEO I think is largely the result of focusing on other goals that also happen to have SEO benefits, like well-structured semantic markup, sane URL structures, meaningful title tags, and such. That said, people far more experienced with SEO than me have lots of suggestions of things we could do better and we listen to those closely. Ideally I think it's something WordPress users should never need to think about.
I imagine that many of the comment spammers have to be targeting high value keywords and niches. Have you ever thought about opening up some of the Akismet spam data to create a great keyword research tool? Doing that adds some opportunity cost and might dis-incentivize some of the comment spamming.
You'll probably be disappointed in me for this, but I have a number of WordPress blogs where I haven't updated the CMS in years. About a week ago one of my blogs got hacked: someone added spammy credit card links to it. I was surprised by how easy it was to upgrade WordPress. Do you foresee WordPress.org ever doing automated updates? If someone gets hacked and temporarily removed from Google, what are the quickest ways they can find out what went wrong and where the spam is?
We're working on making updates easier than they are today, and a number of web hosts have already integrated tools that make upgrading a one-click procedure just like installs are.
I've heard from people that were removed from Google that contacting their support or webmaster tools describing what happened is a pretty good way to get re-included in the index. They understand that this new wave of SEO hackers is pretty malicious and it's not your fault.
If there was no Wordpress and you were starting from scratch on the web today what areas would you focus on? Where would you start?
An email client or a cloud-synced desktop text editor.
Thanks Matt! Check out Ma.tt to read more of Matt's stories, see his photo galleries, and keep up with Matt's latest travels.
I've been trying out Google's Chrome browser. I like it. I really do.
I like Chrome mainly because it is fast. Faster than Firefox, anyway. However, I'll be alternating between the two browsers, because Firefox has a plethora of useful plug-ins that Chrome lacks.
Like many Firefox converts, I haven't looked at Internet Explorer for some time now.
Microsoft have recently released IE8, so I thought I'd evaluate it in terms of search, and contrast it with the functionality and positioning of Chrome. Many in the internet community have speculated that Chrome is going to eat Microsoft's lunch, and not just in the browser space, but with the ushering in of cloud computing. Is this plausible?
Let's take a look.
You can download IE8 Beta from here. As usual, you'll have to sign your soul, and those of your yet unborn children, etc, etc over to Microsoft, and then reboot.
Goodbye Google Toolbar
You run through the inevitable setup screens. The first search-related issue I noticed was that Google's toolbar isn't compatible with IE8 beta; the browser asked me if I wanted to disable it. Is that a bug, a feature, or a market position? ;)
Next up, IE8 asks you if you want to use "Express Settings", which keeps your existing default search provider but sets just about everything else to Microsoft products or services. Internet Explorer also wants to become your default browser. At this point, you can opt for Custom Settings and modify each setting individually.
Pretty flexible, really. If you want to opt out of Microsoft services, you can do so easily.
The Search Wars
My main reason for looking at IE8 is in terms of search. What functionalities do you get, and how is this browser positioned against Google?
One feature, called Search Suggestions, offers, naturally enough, search suggestions. Like the equivalent Google feature, IE8 will try to guess which keyword you are searching for and prompt you with suggestions as you type. This feature works with many different search providers (Google, Yahoo!, Live) and large ecommerce and content sites (Amazon.com, eBay, Wikipedia), which makes the search box a nice keyword research tool, but nothing new to most of us, I'm sure.
Note that this type-ahead feature, like type-ahead suggestions in any browser, will send your search queries to your search provider even if you never submit the search. Matt Cutts, perhaps sensitive to the privacy concerns aimed at Google, makes the point in a comment he posted on GoogleBlogoscoped that if "Suggested Sites" is on, "your web browsing history is sent to Microsoft ... the addresses of websites you visit are sent to Microsoft, together with some standard information from your computer such as IP address, browser type, regional and language settings ...".
How Will This Affect SEO?
An aspect SEOs need to consider is how the widespread implementation of search suggest is going to affect SEO. In this post, Aaron talks about how search suggest is likely to force a consolidation around the most popular terms. This has implications for those going after the long tail, but also provides new SEO opportunities, especially if you have a brand that incorporates popular search terms.
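To see why type-ahead nudges searchers toward the head terms, here is a toy sketch (all query volumes are hypothetical, purely for illustration): if suggestions are simply the most popular queries matching the typed prefix, the popular terms fill every suggestion slot and absorb an ever larger share of the query stream.

```python
# Toy model of type-ahead search suggestions (hypothetical query volumes).
# Suggestions are the highest-volume queries sharing the typed prefix,
# which is why type-ahead consolidates traffic around popular terms.

query_volumes = {
    "seo": 90_000,
    "seo book": 40_000,
    "seo tools": 25_000,
    "seo consultant auckland": 300,  # a long-tail query
}

def suggest(prefix, volumes, limit=3):
    """Return up to `limit` queries starting with `prefix`, most popular first."""
    matches = [q for q in volumes if q.startswith(prefix.lower())]
    return sorted(matches, key=lambda q: volumes[q], reverse=True)[:limit]

print(suggest("seo", query_volumes))
# The head terms fill all three suggestion slots; the long-tail query
# only surfaces once the searcher types far enough to exclude them.
```

Under this model the long-tail query never appears until the searcher has typed most of it, which is exactly the consolidation effect described above.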
Explorer also allows search suggestions from any provider, which can be a useful SEO tool in itself.
IE8 also offers Visual Search, which provides pictures to help you select a result. This didn't seem to work for me, but I did notice that a search on "Seattle Weather", the search term suggested by Microsoft, brought up a page featuring advertisements for Australian outdoor sportswear suppliers. It reminds me how far other providers have to go in the text-ad space in order to catch up with Google. It wasn't until I dug around a bit further that I discovered that you need to install Visual Search providers. Even then, it wasn't playing well, giving me a string of error messages.
Still, problems are to be expected in a beta release.
Other improvements include search history matching, a useful "Find On Page" button added to the instant search box, and the ability to drag the search box in order to change the width. A few nice touches.
Forced Search Provider?
On the Microsoft global-domination conspiracy front, far from locking you in, Microsoft have made it rather easy to configure IE8 to incorporate your choice of search provider. It wants to default to Live Search, but you can easily select Google, or other services. The pull-down search box provides options to add more. So, good marks in terms of flexibility.
There are various other features, including InPrivate browsing, which supposedly blocks ads and prevents people tracking you across the web. As it isn't search related, I won't review it, other than to say it is good that the user has to jump through a few hoops to enable it. Love 'em or hate 'em, web ads enable the production of a lot of "free" web content. If ads were turned off by default, many sites would simply cease to exist, or start charging for content. Full marks to Microsoft for leaving this option to the power users.
IE8 Vs Chrome
Now, contrast these features with Google's Chrome.
Did you find Chrome noticeably faster than your existing browser, be it Firefox or IE?
Speed was the deciding factor for me. On the internet, speed is (nearly) everything. IE8 didn't strike me as being any faster than Firefox, and certainly a lot slower than Chrome.
In this respect, IE8 feels like an update to an existing product, as opposed to a game changer. Chrome feels like a game-changer, even though, when pushed, I can't put my finger on exactly why this is. I think it may come down to the usability gains of extra speed, especially if your day-to-day use centers on search. IE8 is adding functions, desktop-application style, while Google is busy taking features out in order to simplify and minimize.
If cloud computing is to take off, then the browser is going to need the speed of an application, and it is going to need to be simple and transparent in order for people to bother migrating.
Application-Centric Vs Web Centric
Chrome explains itself better. The Google information pages tell a cohesive story, whilst Microsoft's story appears scattered and a little confused. I'd liken Chrome to an iPod. It lacks features some users might demand, but it works right out of the box for most people. Microsoft IE8 is, well....Microsoft. It feels more application centric.
Perhaps that says something about the web strategy of the respective companies. Google wants to pull users out of their existing habits, and into the Google web, whilst Microsoft needs to integrate existing application users with the web.
A subtle difference, but there nonetheless.
Have Your Say
What are your thoughts? Have you tried both new browsers?
Chrome, Google's new web browser, has made a huge splash everywhere this past week. User response has generally been favorable; however, GoogleBlogoscoped is reporting that the German "Federal Office Of Information Security" may not be particularly happy with it:
"The Federal Office for Information Security warned internet users of the new browser Chrome. The application by the company Google should not be used for surfing the internet, as a spokesperson for the office told the Berliner Zeitung. It was said to be problematic that Chrome was distributed as an unfinished advance version. Furthermore it was said to be risky that user data is hoarded with a single vendor. With its search engine, email program and the new browser, Google now covers all important areas on the internet."
However, there appears to be no formal warning published on the Federal Office for Information Security's website. As various commentators point out, such an announcement would be odd, given that there has been no reported announcement about the IE8 Beta, which has also been released as an "unfinished advance version".
Meanwhile, Matt Cutts is busy fighting "conspiracy theorists" regarding Google Chrome's Terms of Service. Some people were less than happy with the wording, which appeared to imply Google may assert rights to any content you submit, post or display on or through "the Services". Check out all the updates Matt makes as Google struggles to find the right words.
Google subsequently changed their Terms Of Service to read:
"11.1 You retain copyright and any other rights you already hold in Content which you submit, post or display on or through, the Services."
Webmasters are often faced with the problem of how to approach SEO on websites which have a country-specific focus. As you may have noticed, the search engine results pages on Google's geo-targeted search services frequently display different rankings than those you experience on Google.com.
If you run a few queries on, say, Google.com.au, you'll soon notice distinct regionalization patterns. In order to make search results more relevant to local audiences, Google uses different sorting methodologies than those used on Google.com.
Here is a guide to optimizing sites for the different regional flavors of Google.
Country Specific Local SEO Tips
Get a local domain extension: Google places a lot of weight on the domain name, so it is important to get the appropriate country-code domain extension. If you compare results across the different geo-targeted flavors of Google, you'll notice the weight given to the local TLDs. There are exceptions, but the local TLD tends to trump .com when it comes to local result sets. Different countries have different criteria for domain registration. It is fairly easy to register a .co.uk or a .co.nz, whilst a .com.au can involve setting up a business entity in Australia.
Specify your country association in Google Webmaster Tools: Google Webmaster Tools offers a facility whereby you can specify a country association for your content. You can do this on a domain, sub-domain and directory level. More detailed instructions can be found on Google's Webmaster Tools Blog.
Include local contact information: Specify a local address, business name, and local contact phone numbers. Whilst not critical in terms of ranking, every little bit helps, and by including local information, the site becomes more credible to a local audience.
Local hosting: Depending on who you ask, you'll get different answers as to whether the geographic location of the web host makes a difference in terms of ranking. I have .com.au, .co.nz, and .co.uk sites, hosted on US servers, and they rank well on the appropriate local versions of Google. Other people feel that location-based hosting is a must. Still others say the location of the name server is most important! It's fair to say that if you have a choice between hosting locally and hosting offshore, then it might pay to host locally. It certainly can't hurt, and there might be additional benefits, such as increased download speed. If you go this route, one thing to check is the server's physical location. Often, web hosts have a local office, but their servers are located in a different country. Use an IP lookup tool to determine the exact location of a server.
Spelling & Language: Ensure you use the appropriate spelling for your chosen region. There is a difference between "optimization" and "optimisation". Keep in mind that searchers will use the local vernacular. For example, if you are optimizing a travel site in the US, you might use the term "vacation". However, searchers in Australia, the UK and New Zealand, amongst others, tend to use the term "holiday".
Tone: Copy that works well in one geographic location may not work in another. For example, the sales language used in the US is usually more direct than that typically used in the UK, Australia or New Zealand. Familiarize yourself with local approaches to marketing, or engage local copywriters.
Inbound links: Seek out local links. All links are good, but inbound links from local TLDs are even better. Approach your local chamber of commerce, friends, suppliers, government agencies, business partners, and local industry groups and ask them for links.
Local directories: Get your site listed in local directories. Local directories still feature well in geo-targeted search results as the depth of content, in terms of sheer volume, isn't as great in the local TLD space as that published on .com. Obviously, you stand to gain from the local traffic that the directories send your way, and any local link juice the directory may pass on. Here are some top local directories:
Te Puna is a government-run New Zealand directory.
Press releases: Try to come up with a local angle for your press releases, and submit them to local news and information channels. Small, local news outlets are highly likely to run local interest stories, which in turn may help your brand exposure and get you more local links.
Avoid duplicate content: If your market is in one country, then it makes sense to use the country-code TLD for that country. However, if you target multiple countries, consider creating different content on each domain. Placing the same content on multiple domains may risk duplicate content penalties.
Off-line marketing: Don't forget to get your name out locally. If people search for you by your brand or business name, you'll always be well positioned in the SERPs.
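On the hosting tip above, the first step of any server-location check is resolving the host name to an IP address; here is a minimal sketch using Python's standard library. Mapping the IP to a country then requires a separate GeoIP database or lookup service, which the standard library doesn't provide, and the hostname shown is just a placeholder.

```python
# Resolve a host name to its IP address - the first step of checking
# where a site is physically hosted. Mapping the IP to a country
# requires a separate GeoIP database or lookup service.
import socket

def server_ip(hostname):
    """Return the IPv4 address that `hostname` currently resolves to."""
    return socket.gethostbyname(hostname)

# Example (hypothetical hostname):
# print(server_ip("www.example.co.nz"))
```

If the address comes back in a netblock assigned to a US data center, your "locally hosted" site may not be as local as the web host's sales page suggests.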
Have Your Say
If you have some additional ideas that have worked well for you, please feel free to add them to the comments.
Now that Peter Da Vanzo has joined the site, we have another writer and can spend a bit more time on the blog. In the past some of my most popular blog posts came out of feedback from readers. What topics would you love to see us cover?
Nearly any SEO/PPC/blogging/internet marketing questions are fair game (although we won't do site reviews, or explain specifically why site X is ranking or why site Y does not rank).
When I first started blogging I tried to learn from and emulate 3 of my favorite bloggers: Seth Godin, Peter Da Vanzo, and Steven Berlin Johnson. A large part of the success of this site was learning from those guys. Recently I was lucky enough to hire on Peter Da Vanzo to help do some of the writing on this site. He has been blogging about search since 2002 on Search Engine Blog, which officially makes him old school.
An SEO Book reader has ported over SEO for Firefox to make an Opera SEO extension. I have not tried it yet, but if you are an Opera fan let me know what you think of it.
Dan Thies recently launched his Stomping the Search Engines 2 video series with an interesting business model...the buyer only has to pay $10 for shipping and handling, and is added to a continuity product where they buy a print magazine about marketing each month.
In Macropayments Cory Doctorow highlights the incremental costs (and flaws) of an artist acting like a publisher selling information at a low price point. Truth be told I stretched my limits under my old business model and the new one is much better...proving his theory correct.
In this video David Heinemeier Hansson highlights why most start-ups that charge money for their products do better than the start-ups that aim to make everything free and make everyone happy.
To appreciate how bad free can be (in the wrong context), think of how good many of Amazon.com's reviews are, and then read the drivel in their political forums. Same site, same audience, and yet one is remarkably intelligent while the other is equally banal and belligerent.
People far smarter than I have talked about the web becoming an operating system, and search being at the center of how we access the cloud. What better way for Google to position themselves as the C prompt than to turn the address bar into a search box?
I think operating systems are kind of an old way to think of the world. They have become kind of bulky, they have to do lots and lots of different (legacy) things. - Sergey Brin
Some have dismissed Google Chrome as being unoriginal, but it is "a step that needed to wait until the company had, essentially, come of age. It is an explicit attempt to accelerate the movement of computing off the desktop and into the cloud"
Google is Serious About Marketing Chrome
Sergey Brin stated that they did not intend to lower Firefox's marketshare, but a day after launch Google was already marketing Chrome on their homepage, internationally, to users of Internet Explorer, Firefox, Opera, and even Chrome itself!
How the Omnibox Shapes SEO
Recently I mentioned how Google Suggest could change SEO, and the Omnibox drastically extends those effects. Google did not pull any dirty tricks to force their search service into being the browser default, but they did try to turn the address bar into a search box - which will increase how often we search. The Omnibox offers shortcuts as you type:
The parts that are in black are related search queries, and the parts that are in green are typically one or more of the following:
the #1 ranking organic Google search result
pages you recently visited that are relevant to the search query
The "search results before the search results" have major SEO implications:
Raising the awareness of your brand and getting many people to search for your brand will help your brand related queries show up when people search for broader related brands.
The value of a #1 Google ranking goes up, as the top ranked site has another opportunity to capture the searcher BEFORE they see the SERPs, and will be more likely to get clicked on when searchers see the search results (since they just saw the URL a second earlier).
The value of awareness advertising, website interactivity, and consumer generated content go up as they make you more likely to show up in the list of previously viewed pages.
For heavily advertised and/or frequently viewed pages I can see an advantage to adding tomes of relevant text below the fold so that your site shows up for many related search queries. :)
Given Google's large ad network and their network advantages in search monetization, they will easily be able to buy marketshare through advertising on their own ad network and bundling this browser with hardware providers.
If the feature is widely adopted by other browsers it could lower the value of type in domain names (by making people more likely to search rather than type in a domain name). This could force some domainers to sell or develop, which could lower domain prices (and the .com premium)...this trend may already be underway given the pending Yahoo!/Google ad deal.
IAC recently broke up into 5 separate companies - LendingTree, Interval International, Ticketmaster, Home Shopping Network, and new IAC. Barry Diller thought that splitting up the company would lower uncertainty associated with the company and allow the core company to trade at a richer multiple, but that has not been the case, as noted in this WSJ article:
Stripping out $1.3 billion of net cash on the balance sheet, Wall Street is valuing the operating businesses at barely $1.1 billion, or an undemanding multiple of 5.5 times Ebitda. Google enjoys a multiple of 11.6; Amazon.com, 18.7; and slower-growing eBay, 7.4, says Cowen & Co.
For the entire remaining company (Ask.com, Dictionary.com, Thesaurus.com, Reference.com, Citysearch, Service Magic, Evite, Iwon, Gifts.com, Match.com, CollegeHumor) to be valued at only $1.1 billion seems a tad crazy. They paid $1.85 billion for Ask and roughly another $100 million for Lexico (Dictionary.com, Thesaurus.com, Reference.com).
If Microsoft could afford aQuantive for $6 billion, they should be able to afford all the above brands for a couple billion. And if Microsoft doesn't buy Ask, I wouldn't be surprised to see private capital move to take IAC private.
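The quoted multiple is easy to sanity-check: enterprise value is market value minus net cash, so dividing the $1.1 billion operating value by the 5.5x multiple implies roughly $200 million of Ebitda, and at Google's 11.6x multiple the same Ebitda would be worth more than twice as much.

```python
# Sanity-check of the multiples quoted above (figures from the WSJ quote).
# Enterprise value = market value minus net cash;
# implied Ebitda = enterprise value / Ebitda multiple.

net_cash = 1.3e9          # net cash on the balance sheet
operating_value = 1.1e9   # value Wall Street puts on the operating businesses
multiple = 5.5            # Ebitda multiple for IAC's operating businesses

implied_market_value = operating_value + net_cash
implied_ebitda = operating_value / multiple

print(f"Implied market value: ${implied_market_value / 1e9:.1f}B")  # $2.4B
print(f"Implied Ebitda: ${implied_ebitda / 1e6:.0f}M")              # $200M
# What the same Ebitda would fetch at Google's 11.6x multiple:
print(f"At 11.6x: ${implied_ebitda * 11.6 / 1e9:.2f}B")             # $2.32B
```

Seen that way, the market is pricing IAC's operating businesses at less than half what an identical Ebitda stream commands inside Google.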
Sugarrae has a great post on how Google's policing of the web and pushing nofollow are undermining the social network and links that their relevancy algorithms are based upon. Worth reading from start to finish twice, then blogging about it. I would quote it, but a quote wouldn't do it justice.
The core issue is that Google places too much weight on domain authority and PageRank. Is The Wall Street Journal easy to trust? Sure...if they print garbage, investors will stop buying their paper. But even they publish garbage sometimes. Maybe Google could find a way to tune down domain trust and place more weight on other factors.
If Google decides that large networks should be trusted but that individuals should not be trusted much, they are doing a bad job of encouraging web innovation. You only have to look at the entire history of mankind to realize that most innovation comes from individuals and small groups...not from large established organizations.
As strongly as Google is integrated into the web, if they are ever to fail, their failure is more likely to come from an internal misperception than an external force. Great ideas are ignored, then shunned, then proven, then accepted. If Google doesn't make things accessible until step 3 or 4, they leave a big hole for competition in the search marketplace.
The path of least resistance and least trouble is a mental rut already made. It requires troublesome work to undertake the alternation of old beliefs. Self-conceit often regards it as a sign of weakness to admit that a belief to which we have once committed ourselves is wrong. We get so identified with an idea that it is literally a “pet” notion and we rise to its defense and stop our eyes and ears to anything different. — John Dewey
Last week Google announced a 3 year extension to their Firefox search distribution deal. This week Google announced Google Chrome, their new open source web browser, by sending an offline comic to Philipp Lenssen.
If you are not at Google's scale you probably do not have blogs focused on your company to pitch products to, but this is the sort of marketing big brands should be using to take advantage of their brand.
A few posts back I mentioned that Amazon's Mechanical Turk would be good for some SEO processes. It looks like people are already using it to spam social media. As sock puppets rise up, many of the broad/general social media sites will get polluted by increasing amounts of spam.