Who Do You Recommend for Web Design?

Good design makes quality content look and feel better. Design can help improve conversion rates, make a site more linkable, and sometimes a site generates additional links and mentions simply for having a great aesthetic.

I frequently get asked how we can run a wide array of websites with only a few high-quality part-time employees. One of our secrets is staying away from the stuff we are no good at - like web design. I could show you my attempts at design, but you would think less of me if I did. ;)

Rather than going the DIY route, I have been getting quality custom website designs from Wildfire Marketing Group for many of our newer sites, and they look great. I liked their services enough to work a deal with them to get SEO Book training subscribers $100 off their designs, which start at $765 for a basic design and $975 for a design + a WordPress theme. Their services page is here, and the coupon code is here.

A couple of other people I would also recommend for design work without hesitation are Sophie Wegat and Chris Pearson. Chris Pearson is now working on Thesis, however, and is no longer available for hire. Luckily, we have a 20% off Thesis coupon too.

Should You Have Multiple Websites?

Or just one?

Let's take a look at a web strategy that has a number of SEO and business benefits: the hub and spoke strategy. A hub and spoke strategy is when you create one authoritative domain (the hub), and then hang various related websites off that domain (the spokes).

If you don't yet have an authority site, it's probably best to focus on that one site. However, once you've built an authority hub, it can be a good idea to specialize in a number of niches using multiple, smaller sites.

Let's look at a few reasons why, in the context of dominating a niche.

Economics

Economic theory holds that division of labor increases profitability.

During the early days of the web, it was easy to make money by being a generalist. However, as the web got deeper and richer, it became difficult to maintain a generalist position unless you had significant resources.

Specialization, by way of niches, allows for greater targeting, and this targeting can increase value. Leads and advertising become more valuable, because the target audience can be reached more efficiently.

The hub and spoke approach is this theory in microcosm. The hub is the generalist authority, whilst the spokes allow for niche specialization.

We'll see how this dove-tails with SEO shortly.

Domain Knowledge

If you were to create a series of sites on different topics, it might take a significant period of time to know each area well. However, if you create niche topics within your own area of expertise, you should be able to create new sites very quickly.

Why would you create new sites? Why not just stick with one?

Let's say your main site is fairly broad in its appeal. However, you've discovered some lucrative niche keyword areas within that broad topic area. By creating spoke sites, you can focus on these keyword areas, and dig deeper, without compromising the general appeal of your main site.

An example might be a hub site that is aimed at community education, whilst spoke sites might cover private tuition, corporate learning materials, and education facility hire.

This segmentation can be done in a number of ways. You could aggressively target one search engine's algorithm and/or audience (MSN) with one spoke, whilst targeting another search engine with another spoke. One site might be aimed at do-it-yourself people, whilst another site is aimed at a person looking to hire a professional. Both sites cover the same topic, but require a different approach in terms of language, structure, offer, and tone.

Likewise, you may use spoke sites for brand reasons. When Google bought YouTube they wisely kept the YouTube name, as the brand appealed to users. Google Video - not so much. There is a general perception that YouTube does video, and Google is a search company, and never the twain shall meet.

Google knew better than to force the issue.

Legitimate Links

A hub site on education that links out to pharmaceutical affiliates could easily get hit by Google. The relationship between the two areas is questionable. However, if you link out to your spoke sites, that cover related niches, your link pattern will be much more acceptable.

From an SEO standpoint, it can be difficult to get links to purely commercial sites. If you have a hub site that already has link authority - or is created specifically to attract links - then you can pass this authority to your more specialized spokes. Once the spokes become more popular, you can either pass that authority along to yet more specialized sites (one way), or even promote your hub site (reciprocal). Either way, the link graph makes sense.

Each site doesn't need to be directly profitable. You can use one site to attract links, and pass this authority on to your monetarized domains. One can subsidize the production of the other.

Fame

If you've already built up name recognition in your niche, you'll find it easier to get links and press attention for your new projects.

Status is important because if no one knows who you are, they probably won't care much about your content. If Danny Sullivan or Matt Cutts writes something, it instantly gets attention because of who they are and the trust relationship they have with their audience. If you're new to the SEO space, no matter how profound your content is, it could easily get overlooked.

This is why it can be more difficult to build multiple sites across unrelated niches. You may need to establish yourself in each new area, which can be a lot more difficult than leveraging your name recognition in your existing niche, then going granular.

Enhanced Monetarization Opportunities

We've looked at how you can target the most profitable areas aggressively using a hub and spoke strategy, without affecting the main brand.

Other advantages include economies of scale. As your network grows, you have more ad inventory to sell people. The inventory can be segmented, as opposed to the advertiser having to accept a one-size-fits-all approach of a generalist site. Similarly, you may be able to demand higher affiliate payouts, because you can precisely target offers.

Aaron covers this topic in greater depth in the video "Why You Should Dominate A Niche".

Google's .edu Domain Love: Department of Economics ≠ Mortgage, or Does It?

Some recent Google shifts have caused a lot of .edu websites to rank for competitive keywords like mortgage and credit card. Here is a screenshot of the top 100 search results for "mortgage" with 57 .edu results and 15 .gov results. And here is a similar credit card screenshot.

Note that few of these pages have any relevant on-page content. Is this a case of Google-bombing? Or did Google dial up the .edu bonus too far?

Does Google want to return all the irrelevant pages? Or does it not matter if they are deep enough in the result set? Will having mystery meat results on pages 2 through 100 hurt Google's brand? Or does everyone just click on the first page?

We discussed this a bit more in the forums: new Google results

Where Do New SEOs Go Wrong When They Set Learning Priorities?

Another question we received recently from the SEOBook.com community was:

What qualities are common in Aaron Wall, DaveN, Bob Massa, Jason Duke, SugarRae, et al, that new SEOs can adopt, to come closer to people like these in expertise. Where do most new SEOs go wrong when they set learning priorities?

I've asked these people to provide their views, which I'll get to shortly.

It's a great question, because the avalanche of SEO information that confronts the beginner can be overwhelming. How do you know what information is important? What aspects do you really need to spend your time on, and what information do you need to reject? What are the qualities that make for a good SEO?

Let's take a look...

Learning SEO

Most people stumble into being an SEO.

An awareness of SEO usually comes about when a person launches a site, only to find that the site doesn't magically appear #1.

Soon after, the webmaster will likely find themselves knee deep in SEO forums and blogs, where everyone has a viewpoint, and often those viewpoints contradict each other. Contradiction is rife in SEO. To understand why, we need to understand the history of search engines.

The first step in setting learning priorities for SEO is to...

1. Understand The History & Context Of SEO

My own foray into SEO began with Infoseek.

Infoseek was one of the early search engines. Around 1996, Infoseek introduced a feature whereby they would crawl a site and update their index immediately. This feature made it easy for webmasters to game the algorithm.

I had just launched a small, commercial site. I thought all I had to do was publish a site, and the search engine would do its job and put me at number one! Unsurprisingly, that didn't happen.

So, I tried to figure out why Infoseek didn't think my site was great. I could see that there were sites ranking above mine, so there was clearly something about those sites that Infoseek did like. I looked at the code of the high-ranking sites. Did that have something to do with it? To test that idea, I cut and pasted their code into my own pages and republished my site. Voila, I was at number 2!

So far, so good.

But why wasn't I number one? The sites that were ranking highly tended to have long pages on the same topic, so I added more text to my pages. Soon enough, with a little trial and error, I was number one. Predictably, Infoseek soon pulled this feature when they saw what was happening.

I was clearly not alone in my underhanded trickery.

At the time, I thought my cut n paste trick was an amusing hack, but I wasn't earning my bread and butter from the internet. I was working in the computer industry, and unaware of "SEO". I soon forgot about it.

A few years later, a whole cottage industry had sprung up around SEO. The search technology had become a lot more sophisticated. My dubious copy n' paste hack no longer worked, and the search engines were locked in a war against webmasters who were trying to game their ranking criteria.

There is an inherent conflict between the business model of the search engine, and that of the SEO. The SEO wants their site to rank, the search engine wants to rank a page a searcher will find useful.

That isn't necessarily the same thing.

Therefore, the search engines are notoriously secretive about their ranking formulas. SEOs try to reverse engineer the formulas, or simply guess at the factors involved, which is why you'll see so many contradictory viewpoints.

So who do you listen to? What information is relevant?

2. Technical Know-How

Dave Naylor had this to say about doing too much at once:

Common qualities? That's simple: we notice the little things and understand the larger impact that they will have in the long term.

And where do most new SEOs go wrong when they set learning priorities?

From the new SEOs on the block that I chat to, they seem to run at a million miles an hour trying 100 different things at once. They need to slow down, get a decent data set of information, slowly pick through it, test small things one at a time, and work out things like: why is it that when I search for The FT in Google it returns Grand Theft Auto?

Most people new to SEO place a lot of emphasis on the technical aspects. It's natural to seek out the secret recipe of high rankings. Whilst most forums obsess over these issues, much of what you'll read is irrelevant fluff. These days, SEO is more about a holistic process, rather than an end unto itself.

Start with a solid, credible source - like SEOBook's course for example ;) The cost of a well researched course is nothing compared to the time you may spend heading in the wrong direction.

Most people will benefit by applying the 80/20 rule. To rank in Google, you need to be on-topic, you need to be crawlable, and you need to have inbound links.

You could spend a lifetime trying to figure out the other 20%. Unfortunately, the formula is in Google's hands, and even then, only known to a few. It is reasonable to assume Google tweaks the dials often, especially once a common exploit makes the rounds. Take Dave's advice and take it one step at a time. Focus on the key aspects first - relevance, crawlability and linking - then methodically test and evaluate in order to expand your knowledge.
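Of the three fundamentals, crawlability is the easiest to sanity-check mechanically. As a minimal sketch (the robots.txt content and URLs below are hypothetical examples, not any real site's rules), Python's standard library can tell you whether a page is open to a given crawler:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: everything is crawlable except /private/
robots_txt = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask whether Googlebot may fetch each URL
print(parser.can_fetch("Googlebot", "http://example.com/articles/seo-tips"))  # True
print(parser.can_fetch("Googlebot", "http://example.com/private/drafts"))     # False
```

Running the same check across your important URLs is a quick way to catch an accidental Disallow rule before it costs you rankings.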

Bob Massa on not sweating the small stuff:

I honestly think the only way anyone can go wrong, new to online promotion or a seasoned veteran, is to look too hard for tricks and magic beans from those who make their names posting those so-called tricks in forums.

I believe anyone can be successful at online marketing or even traffic generation and search engine placement specifically, if they just stop looking for ways to trick machines and instead look for ways to connect with humans.

Search engines are just computer programs and algorithms written by humans. The engine is only a tool intended to aid humans in doing, faster and easier, the things that are important to their lives. I think machines can help with connecting humans, BUT the humans are the target, the goal, the end that machines can provide the means to.

I think one thing that is common among the list of people you mentioned is that they all realize, understand and accept that concept.

3. Strategy & Goals

The opportunity in SEO lies in the fact that Google must have content, around which it places advertising. If you rank high, you get "free" clicks.

Of course, nothing in this world is free, and SEO is no exception. There is significant time cost involved in getting lucrative rankings. And that cost comes with a reasonable degree of risk. Google has no obligation to show you at position x, and your competitors will always try and eat your lunch.

Strategy is the most important aspect, and the one you should spend most of your time on. Why are you trying to rank? Are there better things you could be doing, e.g. building up a community? Do you have an ongoing publishing model? How is your brochure-style website ever going to attract links? Are you building enough link juice to ensure your entire 500K-page affiliate site gets indexed?

Check out my post on strategy and goal setting. The key is to take a holistic approach.

I think some of the general principles that apply to most of them are that they are: smart, curious, hard working, blunt, honest, and sharing. They also view SEO as a tool to help them achieve other goals, rather than having SEO be the end goal.

Where a lot of people go wrong with SEO is that they try to think in concrete numbers based on a limited perspective built off a limited set of data. Some things may happen sometimes, but there are very few universal truths to the shifting field of SEO beyond preparing for change. And the certain lasting truths do not provide much competitive advantage...that is built through curiosity, testing, hard work, and creativity - Aaron Wall.

4. Measurement

It's surprising how little time is spent talking about measurement, because without it, SEOs are flying blind.

One common metric is rank. It's not a very good metric, because it doesn't tell you much, other than that you've won the ranking game.

But so what?

What if that rank doesn't help you achieve your goals? What if every person who clicks on your link ends up buying from the guy who is advertising on Adwords instead?

This is why measurement, aligned with your goals, is important. If you track SEO efforts through to a goal - and most of those goals tend to involve making money - then you'll be head and shoulders above most of the forum hacks and pretenders. It doesn't matter what tracking software you use. Become an expert in tracking and metrics.
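As a minimal sketch of what goal-aligned tracking means in practice (all keywords and figures below are hypothetical), the point is to attach conversions and revenue to each traffic source rather than stopping at rank:

```python
# Hypothetical per-keyword tracking data: visits, conversions, revenue
keyword_stats = {
    "buy blue widgets": {"visits": 400, "conversions": 20, "revenue": 900.0},
    "what is a widget": {"visits": 5000, "conversions": 5, "revenue": 225.0},
}

for keyword, s in keyword_stats.items():
    conversion_rate = s["conversions"] / s["visits"]
    earnings_per_visit = s["revenue"] / s["visits"]
    print(f"{keyword}: {conversion_rate:.1%} conversion, ${earnings_per_visit:.2f}/visit")
```

In this made-up data the low-traffic commercial term earns far more per visit than the high-traffic informational one, which is exactly the kind of insight a rank report alone will never show you.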

Summary

  • 1. Understand the history and context of SEO
  • 2. Learn your chops from a reputable source
  • 3. Clearly define your strategy and goals
  • 4. Become a metrics and measurement guru

How Your Competitors Can Help You

Are you thinking of building a new site?

Before you do, it pays to take a look at your competitors. By choosing the right sites to compete against, you can gain significant advantage.

Firstly, you need to position your offering relative to your competitors.

1. What Problem Do You Solve?

Making money is mostly about solving problems. Write down the problem you're going to solve. Be specific.

For example:

  • Provide auto repair training to amateurs
  • Sell bomb detectors to airlines
  • Sell ice to Eskimos

For this article, we'll use the idea "sell ice to Eskimos". No doubt you've already spotted the problem with this rather lousy business model, but let's have a look at what a bad idea looks like within this evaluation process.

2. Who Is Your Audience?

You may have noticed I included the prospective audience in my examples above. Know what you're selling, and to whom.

Demographics, in other words.

Who are your customers? What do they want? What type of language do they use? Build up a profile.

In our example, our customers are Eskimos. Eskimos live around the North Pole region, mainly in Siberia, Alaska, Canada, and Greenland. Internet access is obviously going to be an issue, not to mention language barriers, which is about the point the idea should die.

Yet, surprisingly, many prospective web businesses never address this simple question. Various Web 2.0 businesses clearly didn't ask questions 1 & 2. Presumably they jumped straight to the "how can we get a few million dollars in VC?" question instead.

3. Where Are Your Customers Hiding?

You need to get in front of your audience.

SEOs know a lot about keyword research, so have a huge advantage over others when it comes to finding out who their competitors are, and where the opportunity lies.

You're probably familiar with keyword research tools and competitive research tools such as:

Find out the keyword terms your potential audience uses, conduct searches, and make a note of the big players for those keyword terms. Keep in mind that the language people use on search engines is always changing, with queries becoming longer and more specific. These queries give you valuable insights into how to position your offering.

What questions are people asking? What problems are they trying to solve? What are the many different ways they describe that problem? What keyword areas are your competitors missing? What value can you provide that they do not? Have your competitors missed lucrative keyword areas?

4. What Is The Nature Of The Market?

You should look for rapidly growing markets. You want to avoid established, declining markets, unless you can provide a new layer of value that is difficult for competitors to emulate.

Take a look at the type of sites you intend to compete against. Are they big companies? Are they hobby blogs and thin affiliate sites? It's going to be much easier for your new site to compete against the thin affiliates and hobby projects than it is to compete against the establishment.

One of the common stumbling blocks at this point is solving a non-problem. The "ice to Eskimos" market is not dominated by established players, or hobby blogs for that matter, but there's a good reason for that - Eskimos don't have a "lack of ice" problem. Beware of the Web 2.0 trap - solving the non-problem.

5. What Related Markets Exist?

If the market you were thinking of entering is competitive, are there any closely related markets you can enter? You can find these areas by looking for patterns in the keyword research results.

Let's try "fitness". Notice any patterns here?

You might notice there are numerous searches for fitness locations, i.e. a gym, a center, a club. So, instead of targeting fitness in terms of health, which would pit you against established health organizations and generalist publications, you might want to target the fitness center section of the market, e.g. a comparison of gyms and centers. Such a niche could well be more lucrative, as there is a clear money-making opportunity: people need to pay to join these facilities.
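Pattern-spotting of this kind can be sketched as a simple modifier tally. The keyword list below is hypothetical (real research tools export far longer lists):

```python
from collections import Counter

# Hypothetical keyword research export for the head term "fitness"
keywords = [
    "fitness center", "fitness club", "fitness gym", "fitness center prices",
    "fitness tips", "fitness club membership", "gym fitness classes",
]

# Count the modifiers that appear alongside the head term
modifiers = Counter()
for phrase in keywords:
    for word in phrase.split():
        if word != "fitness":
            modifiers[word] += 1

print(modifiers.most_common(3))
```

Location-style modifiers (center, club, gym) surfacing at the top of the tally is the signal that a "fitness locations" niche exists within the broader market.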

Which brings us onto...

6. Is there Potential To Make Money?

Just because a lot of people are doing something, doesn't mean it is worth doing.

Some areas are difficult to monetarize. Science, for example. And social discussions. Some areas are saturated, making it very difficult to find a new value layer to add. SEO, for example.

How easy would it be for one of the major players to copy your value proposition? Look for areas that have a clear path to monetarization and that aren't dominated by major players, or saturated by sites with little to distinguish them.

Good luck in your hunt for a lucrative niche :)

If you want to see a video presentation on how to evaluate competitors, Aaron has more in the members section.

Majestic SEO - Interview of Alex Chudnovsky

Majestic SEO.

I have been fond of the depth of Majestic SEO's data and the speed with which you can download millions of backlinks for a website. While not as hyped as similar offerings, Majestic SEO is a cool SEO service worth trying out, and their credit based system allows you to try it out pretty cheaply (unless you are trying to get all the backlinks for a site as big as Wikipedia)...as the credits depend on the number of inbound linking domains.

They give you data on your own domain for free, and share a nice amount of data about third-party sites for free. For instance, anyone can look up the most well linked-to pages on SEO Book free of charge.

What made you decide to create Majestic SEO?

We arrived at it naturally - our main aim with the Majestic-12 Distributed Search Engine project is to create a viable competitor to Google. We use volunteers around the world to help us crawl and index web data. The project started in late 2004, and about 2 years later it became clear that we needed to be as relevant as Google, and that in order to do that we had to master the power of backlinks and anchor text. As time went on and many hundreds of terabytes of data were crawled, it also became clear that we needed to earn money in order to sustain our project. It took well over a year to reach a level we felt confident enough with to release publicly, in early 2008.

What were the hardest parts about getting it up and running?

The most difficult part was avoiding the temptation to simplify the problem by focusing on a subset of data much smaller than that indexed by Google. We felt that would be a mistake, as you can't really be sure you have the same view of the web unless you are close to Google's scale.

Once we decided to follow the hard path, a lot of technical scalability problems had to be solved, and then we had to deal with the financial aspect of storing an insane amount of data using a sane amount of hardware.

You use a distributed crawl, much like Grub did. What were some of the key points to get people to want to contribute to the project? How many servers are you running?

The people who joined our project did so because they felt that Google was quickly becoming a monopoly (this was back in 2004) and a viable alternative was necessary. We have over 100 regulars in our project who run our distributed crawler and analyser on well over 150 distributed clients; all this allows us to crawl at a sustained rate of around 500 Mbit/s.

Since we recently moved closer to the commercial world with Majestic-SEO, it was decided that our project participants would benefit from our success by virtue of share ownership - essentially, project members are partners. It needs to be stressed that our members did not join the project for financial reasons.

How often do you crawl? How often do you crawl pages that have not been updated recently?

We crawl around 200 million URLs every day. At the moment our main focus is growing our database in order to catch up with Google (see analysis here), but we have dedicated some of our capacity to recrawls. In February we should release a new version of automatic recrawls of important (high-ACRank) pages, which will let you see competitor backlink building activity pretty quickly. Our beta daily updates feature shows new backlinks found the previous day for registered or purchased domains, giving you a chance to see new backlinks before we do a full index update (around every 2 months).

What is AC Rank? How does it compare to Google's PageRank?

ACRank is a very simple measure of how important a web page is, based on the number of unique domains linking to it. More information can be found here: http://www.majesticseo.com/glossary.php#ACRank

This measure is not as good as PageRank because it does not yet "flow" between pages. We are going to release a much improved version of ACRank soon.
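As described, ACRank counts unique referring domains rather than raw backlinks. A minimal sketch of a metric in that spirit (the backlink URLs are hypothetical, and this is not Majestic's actual implementation):

```python
from urllib.parse import urlparse

# Hypothetical backlinks pointing at one target page
backlinks = [
    "http://blog.example.org/post-1",
    "http://blog.example.org/post-2",   # same domain, counted once
    "http://news.example.net/story",
    "http://forum.example.com/thread",
]

# Score the page by unique referring domains, not raw link count
referring_domains = {urlparse(url).netloc for url in backlinks}
ac_rank_style_score = len(referring_domains)
print(ac_rank_style_score)  # 3
```

Four backlinks collapse to a score of 3 because two of them come from the same domain, which is what makes this kind of measure harder to inflate with sitewide links.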

Do you have any new features planned?

Can't stop thinking about them ;)

You allow people to export an amazing amount of data, but mostly as spreadsheet exports on a per-site basis. Have you thought about creating a web-based or desktop interface where people can do advanced analysis?

We offer a web-based interface to all this data, with the ability to quickly export it in CSV format.

For example, what if I wanted to know the pages (or sites) that link to SearchEngineWatch.com AND SearchEngineLand.com but NOT to SeoBook.com, AND have a minimum ACRank of 3, AND are not using nofollow? Doing something like that would be quite powerful, and given that you have already done the complex crawl, I imagine adding a couple more filters on top should be doable. Another feature that would be cool is an Optilink-like anchor text analysis that lets users break down the anchor text percentages.

We do have powerful options that enable our customers to slice and dice data in many ways, such as excluding backlinks marked as nofollow, or showing only such backlinks. This applies to single-domain analysis, however; something like the interdomain linking you describe in your example will be possible soon.
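The interdomain query described above reduces to set algebra over backlink records. A sketch under hypothetical data (the records, ranks, and record layout below are illustrative only, not Majestic's data format):

```python
# Hypothetical backlink records: (source_page, target_site, ac_rank, nofollow)
backlinks = [
    ("http://a.example/page1", "searchenginewatch.com", 5, False),
    ("http://a.example/page1", "searchengineland.com", 5, False),
    ("http://b.example/page2", "searchenginewatch.com", 4, False),
    ("http://b.example/page2", "searchengineland.com", 4, False),
    ("http://b.example/page2", "seobook.com", 4, False),
    ("http://c.example/page3", "searchenginewatch.com", 2, False),
]

def sources_linking_to(target, min_rank=3):
    """Source pages linking to `target` with rank >= min_rank and no nofollow."""
    return {src for src, tgt, rank, nofollow in backlinks
            if tgt == target and rank >= min_rank and not nofollow}

# Links to both SEW and SEL, but not to SEO Book
result = (sources_linking_to("searchenginewatch.com")
          & sources_linking_to("searchengineland.com")
          - sources_linking_to("seobook.com"))
print(result)  # {'http://a.example/page1'}
```

Set intersection and difference express the whole AND/AND-NOT filter in one line, which is why a feature like this sits naturally on top of an existing crawl.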

Have customers shared with you creative ways to use Majestic SEO that you have not yet thought of?

We get good customer feedback and often implement customer-requested features to make data analysis easier. As for new creative ways, our customers prefer to keep those to themselves, but once you look at the data you might see one or two good strategies for how to use it. ;)

How big of an issue is duplicate content while crawling the web? How do you detect and filter duplicate content?

It is a very big issue for search engines (and thus us), as many pages are duplicates or near-duplicates of each other, with very small changes that make them hard to detect. We do not currently detect such pages (we crawl pretty much everything), though we have a good idea how to do it and will implement it soon. Our reports tend to show data ordered by the importance of the backlink, so often it is not an issue, though that depends on the backlink profile of a particular site.

A lot of links are rented/bought, and many of these sources get filtered by Google. Does your link graph take into account any such editorial actions? If not, what advice would you give Majestic SEO users when describing desirable links vs undesirable ones?

At the moment our tools report factual information on where backlinks were found; we do not currently flag links as paid or not. This is something that humans are good at and computer algorithms aren't - that's why Google hates such paid links so much. We do have some ideas, however, on how to detect topically relevant backlinks (paid links would usually come from irrelevant sites) - it's coming soon and might actually turn out to be a ground-breaking feature!

Microsoft has done research on BrowseRank, which is a system of using usage data to augment or replace link data. Do you feel such a system is viable? If search engines incorporate usage data will links still be the backbone of relevancy algorithms?

BrowseRank is a very interesting concept, though we have yet to see a practical implementation in a large-scale web engine. I don't think such a system obsoletes link data at all; in fact, it is based on link data just like PageRank, only it allows the most relevant outgoing links on a page to be detected - essentially, such votes should be given more weight in PageRank-like analysis. For example, imagine that this very interview page is analysed using BrowseRank and it finds that the following cleverly crafted link to the Majestic-SEO homepage is clicked a lot; such a link could then be judged as the real vote that this page gives out!

This approach would also help identify the more important parts of on-page content, so that keyword matches within those content blocks could get a higher score in ranking algorithms. So I actually think there is a lot of mileage in the BrowseRank concept, but it would be a mistake to think it will completely replace the need for link data analysis. I am pretty sure Google uses something like this already - Google Toolbar stats would give them all they need to know.

The great irony, in my view, is that Microsoft lacks good web graph data to apply their browsing concept; this is probably why they are so desperate to buy Yahoo!'s search operations, who are much better when it comes to backlink analysis, though Google are the real masters. Majestic-SEO is trying to slot itself just behind Google, and who knows what happens after that ;)

Suppose I look up a competing site and see that a competitor has 150,000 more links than I do, and I feel it would take years to catch up. Would you suggest I look into other keywords and markets? What tricks and ideas do you have for competing using fewer links, and what strategies do you find effective for building bulk links?

First of all: don't panic! :)

Secondly, use the SEO Toolbar, which will query our Majestic-SEO database to show the number of referring domains - it may well be very few.

Thirdly, consider investing in the detailed stats we have on the domain: these will tell you the anchor text used and the actual backlinks, which you can analyse by their importance (we measure it using ACRank). Once you see real data, a lot of things become clear: for example, you can see whether your competitor has lots of backlinks pointing just to the homepage or spread around the site. Seeing actual anchor text is a real eye opener - it can show which keywords a site was optimised for, which will allow you to make a good decision about whether you can catch up or not. Chances are you may find that your competitor is weak for some keywords; this is where a keyword research tool like Wordtracker is invaluable.
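The anchor text analysis mentioned above boils down to a percentage breakdown of anchor phrases. A sketch with a hypothetical anchor list:

```python
from collections import Counter

# Hypothetical anchor text from a competitor's backlinks
anchors = ["blue widgets", "blue widgets", "widgets", "example.com",
           "blue widgets", "click here"]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    print(f"{anchor}: {n / total:.0%}")
```

A distribution heavily skewed toward one phrase suggests which keywords the site was optimised for, and where it may be weak.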

And finally, consider that a few good, relevant backlinks are likely to be worth more than many irrelevant ones: those are the backlinks you want to get, and knowing where your competitor got them should help you create a well-targeted strategy.

You allow people to download a free link report for their site. How does this report compare to other link sources (Yahoo! Site Explorer, Alexa links, Google link:, and links in Google Webmaster Central)?

We give free access to verified websites. This is a great way to try our system, and you might see backlinks that you won't find elsewhere, because our index is so large and we show you all the backlinks we've got (rather than the top 1,000). This will include backlinks from "bad neighbourhoods" (not yet automatically marked by our system, but visual human analysis wins the day here) that you may not be shown in other sources.

We believe that our free access reports are best in class - since they're free, why not find out for yourself?

For analyzing third party sites you have a credit based system. How much does it cost to analyze an average site?

The price depends on how large (in terms of external referring domains) a particular website is. We have some sites with hundreds of millions of backlinks, so the average would be very different depending on what you're really after. The best option is just to run searches for the domains you're interested in on our website; this will give you very interesting free information, as well as the price for full data access.

For a domain like Wikipedia I might only want the links to a specific page. Are you thinking about offering page level reports?

Yep, I am thinking of it - I have actually had requests like this; funnily enough, Wikipedia is the main object of interest.

What is the maximum number of links we can download in one report?

Our web reporting system tries to focus on the most valuable backlinks to avoid information overload; however, we allow a complete dataset download that includes all backlinks - some of our clients have retrieved data on domains with well over 100 million backlinks! Using our powerful analysis options, you can focus on backlinks to particular URLs coming from particular pages and retrieve all qualifying data.

------

Thanks Alex. For more information on Majestic SEO please visit their site and look up your domain.

Five Dreaded SEO Topics We Hope We Never See Again

1. Search Engine Spamming Should Be Outlawed

Out in the wild west, moral confusion abounds.

There have been SEOs who have argued - with a straight face - that whilst it's ok for them to game search engine algorithms, it's not ok for others to do so. This is usually because the other guy isn't following "the rules".

What are the rules?

The rules are decided - and vaguely defined - by the search engines, and then interpreted to mean whatever an SEO decides they mean. Far be it from a search engine to create rules that serve its own business interests, which may not align with the interests of the webmaster.

SEO is built on shifting sands. What do you do when what you were doing was "within the guidelines" but no longer is, because the rules changed? Do you willfully decide to rank lower?

Conclusion: Spam is what the other guy does. Also an acronym for "Sites Positioned Above Mine".

2. How To Create Meta Tags

Hard to believe now, but forum wars were fought over how many times a webmaster could repeat a keyword in a meta keyword tag. Twice was often deemed ok, but any more than that and you were almost certainly an "evil spammer" (see #1).

Meta tag manipulation doesn't count for anything these days. The tags are mainly used to describe the content of pages, which the search engines may display as snippet text.
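For reference, a minimal sketch of the two tags in question (the attribute values here are placeholders, not a recommendation):

```html
<head>
  <!-- Largely ignored for ranking; repeating keywords here gains nothing. -->
  <meta name="keywords" content="example keyword, another keyword">
  <!-- Also not a ranking lever, but search engines may use it as snippet text. -->
  <meta name="description" content="A short, accurate summary of what this page is about.">
</head>
```

The description tag is the only one still worth writing carefully, since it can shape the snippet searchers see.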

Conclusion: Deader than AltaVista

3. Is SEO Ethical?

A curious framing of SEO in terms of ethics and morality.

Is it good and proper to try to get a higher rank than the search engine would bestow otherwise? The point of SEO is, of course, to get a higher rank than the search engines would bestow otherwise.

These people were usually in the wrong game. Many went on to join Seminaries.

Conclusion: Welcome to the jungle

4. Should I Buy Links?

Yes.

No.

Depends.

Paranoia runs rampant in SEO, especially when search engines make an example of someone. Like SearchKing.

Almost all SEOs once advocated buying links in, say, Yahoo, as a listing in Yahoo would lead to better rankings in Google. However, Matt Cutts put the cat amongst the pigeons back in 2005 when he declared that "Google does consider buying text links for PageRank purposes to be outside our quality guidelines".

The argument quickly descended into a semantic war i.e. define "paid". Money changing hands? Favors? Nepotism? Erm...Yahoo Directory? One of the more interesting conclusions often got buried: "Hey, perhaps if Google dislike them so much, paid links really do work!"

Conclusion: Yeah, they work

5. Should There Be SEO Standards?

A natural progression of the ethical debate. It was proposed that SEOs should all conform to a common code of practice, as other professions often do.

The problem was that the relationship between search engines and SEOs has always been grey. Only the search engine can really define what the search engine wants, and what the search engine wants might not align with what the SEO, or their client, wants. In any case, the search engine isn't going to publicly define exactly what they want, as they are worried that people, like SEOs, will game their systems.

So, you got a few self-appointed search police officers, who would suggest that everyone follow their particular code of practice, based on their interpretation of the search engines' guidelines. The self-appointed cops usually outnumbered those who followed them, and invariably disagreed amongst themselves anyway.

Conclusion: Impossible to get buy-in

Barack Obama Earns the 2009 Domainer of the Year Award

It is no secret that Obama is great at public speaking, building a fan base, working the press, and leveraging new distribution channels, but one of his most overlooked marketing achievements is...domain name selection. Anyone who has watched Idiocracy should appreciate the domain names that government programs are now launched under.

Before Obama was sworn in, he launched Change.gov under The Office of The President-elect. Change is an easy concept to grasp after 8 years of crony capitalism and fraudulent wars built on lies from international war criminals. But "change" in and of itself is a tool, not a destination...where is it going?

While launching a plan to increase government spending by nearly a trillion dollars to "create" millions of jobs, upgrade infrastructure, and computerize the national health care system (what hidden costs might come to individuals from that "change"?), Barack announced that his plan to spend this money will be tracked under Recovery.gov.

Part of Obama's spending plan is "expanding broadband access to millions of Americans so businesses can compete on a level playing field, wherever they are located." But the playing field has never been level (just look at how the bankers rewrote the consumer bankruptcy laws to shaft consumers a few years before the banks were begging the government for trillions of dollars in handouts).

Investments in the web will increase the value of web assets, but the increased competition will make it harder to gain attention and exposure unless you have capital to invest, and invest it wisely. Just this week a corporation worth tens of billions of dollars put one of our core keywords in the page title of their home page! We still outrank them, but for how long?

I thought it would be at least a decade before the United States government started domaining. Many large corporations are sure to catch on soon, increasing domain prices and closing off a great investment opportunity for smaller players. I have been busy over at BuyDomains looking for good names to hoard and build, picking up another 3 yesterday. SEO Book members have access to a coupon code to get 15% off BuyDomains domain names on our member discounts & coupons page.

The 100+ Ranking Variables Google Uses, And Why You Shouldn't Care

Continuing on with our community questions, here are a few requests for specific ranking information:

"What are the 100+ variables Google considers in their ranking algorithm?"

Cheeky :)

Easy to say, hard to do. Take a job at Google, work your way up the ranks and join the inner circle.

Another question we received is along the same lines:

"How do you outrank a super established website in your niche, one where Google is giving site links and their domain is older?"

Again, easy to say, hard to do. Either forget outranking the domain and buy it, or spend time doing exactly what they have done, and hope they also stop their SEO efforts in order to let you catch up.

These types of questions arise often. "If I could just learn a few quick-fix insider secrets, I can outrank everyone!"

If there were a quick and easy secret formula that guaranteed high rankings, why would those who know it reveal it?

The reality is that quick-fix secret formulas don't exist.

Sure, there are quirks in the algorithms that can be exploited, but they are often trumped by historical factors, like authority metrics, that are difficult to fake. One common blackhat technique is to hack an established domain and place "money" pages on it. That's an admission, if ever there was one, that technical trickery on your own domain is either too time-consuming or doesn't work so well.

I know some of the world's top SEOs, and I can't recall them spending much time talking about secret sauce. What they do talk about is making money and growing empires. They're more focused on the business strategy of SEO.

The effectiveness of many SEO techniques will be dead soon, anyway.

What you need to think about for the future is user interaction.

The Future Of SEO

Have a read of this document, by my good friend and Merlot drinker, Mike Grehan. Mike outlines his view on the future of search, and he makes a number of important points:

  • The web crawler model is nearing the end of its useful life
  • Signals from users, not content creators, will become more important
  • Universal Search changed the ranking game forever
  • Forget rank, think engagement

If you want to future proof your SEO strategy, take heed of Mike's words.

The crawler model is failing because the crawler was designed for structured text, not multimedia. The crawler can't see behind paywalls. It has trouble navigating databases in which the data isn't interlinked or marked up. The search engines will need to look for other ways of finding and making sense of data.

Social networks, blogs, Twitter, etc. indicate a move away from the webmaster as the signaler of importance (i.e., whom you choose to link out to). The search engines will need to mine the social signals from those networks. Users will signal where their attention is focused through their interactions and paths.

Universal search, in many cases, has pushed results listings down below the fold. For example, getting a client seen high up on the results page may involve making sure they are featured on Google Maps. Similarly, if they have video content, it should be placed on YouTube. Google have shown they are increasingly looking to the aggregators for results and featuring their content in prominent positions.

That list of search results is becoming more and more personalized, and this will continue. Who knows, we may not have a list before too long. More and more "search" data - meaning "answers to questions" - might be pushed to us, rather than us having to go hunt for it.

The future of SEO, therefore, will be increasingly about engaging people. The search engines will be measuring the signals users send. In the past, it's all been about the signals webmasters send i.e. links and marked up content.

For now, you still need to cover the obvious bases - create crawlable, on-topic content, backed by quality linking. But you'll also need to think about the users - and the signals they send - in order to future proof your site. Google has long placed the user at the center of the web. Their algorithms are surely heading towards measuring them, too.

What are these signals? Ah, now there's a question.....

Spying on Customers & SEO Data Aggregation

We Do Not Spy on Our Customers

I have had a very well known SEO company dust one of my best link building strategies (by outing it directly to a Google engineer) because I was trusting enough to mention how effective it was inside our training program. I thought a competitor would not out it, but I was wrong! At least I know what to expect now, and can use that knowledge to mitigate future risks.

One of the common concerns about the SEO Toolbar is something along the lines of "does it phone home", "are you spying on us", or "what data is it sending you". Some SEO companies offer a huge EULA and do spy on the people who use their toolbars, but we do not do that, for a number of reasons:

  • I felt rather angry when that well known SEO company outed my site (and haven't really trusted them since then)
  • I never really liked the idea of spying on customers, and going down that path could harm our perceived brand value
  • knowing that information is kept private adds value and builds trust
  • we are already under-staffed (running quite lean) and have more projects to work on than time, so we are not in need of new projects
  • With all the great competitive research tools available now (like Microsoft Ad Intelligence, Google Search-based Keyword Tool, Compete.com, SEM Rush, and many others) it is easy to get a lot of keyword data quickly, and I see little value add in spying on our users.

Why Give Away so Much Value?

It is pretty obvious that the trend in software (since the day I got on the web) is that open source software is commoditizing the value of most software products and tools. Providing tools that require limited maintenance costs and provide access to a best of breed collection of SEO tools makes it easy for us to evolve with the space and help our customers do so, without building up a huge cost sink that requires raising capital and having to listen to some icky investors. :)

The reason we can (and do) provide so many free SEO tools is because I feel doing so...

  • makes the web a better place (Tim O'Reilly says you should create more value than you capture)
  • offers value to the community
  • extends opportunity to more people around the globe (anyone who is just starting out, like I was ~6 years ago, could use the help)
  • commoditizes the value of some bloated all-in-one SEO software (many of those products generally lack value and misguide people)
  • makes it hard for con-artists to sell hyped up junk (by commoditizing the value of their offerings to all but the most desperate of get rich quick folks)
  • helps to educate potential future customers (a recent survey showed about 80% of our customers have been practicing SEO for over a year)
  • is an affordable distribution strategy for brand awareness
  • builds trust by delivering value for free (rather than trying to squeeze every penny out of potential customers)
  • is a big differentiator between us and most SEO websites

In addition to all the above points, most of the tools we create are tools I want to use. So the cost of building them would still be there even if we did not share them. Sharing them gets us lots of great user feedback to improve them, and does not cost us much relative to the potential upside.

Small Industry, Lightweight Strategy

Rather than centralizing things, we like to rely on a distributed software strategy which has a much lower cost structure.

That strategy allows this site (with a popular blog, an array of tools, some videos, training modules, and an active community) to run on 1 server. We find the Plenty of Fish story inspiring, though we doubt we will need his distributed computing skills anytime soon, given how small our industry is. After 5 years we are still millions of visitors and over a billion monthly pageviews behind Plenty of Fish :)

Though we are doing ok in our little corner of the web :)

We have analytics on our website to help us see where we are getting coverage, and to measure and improve conversions (an area ripe for opportunity given our brand exposure and site traffic). We may add relevant affiliate links and offers to some of our SEO tools to help pay for the tens or hundreds of thousands of dollars we spent developing them (for example, see how we integrated a link to our Wordtracker keyword guide and the Wordtracker keyword research service in our keyword tool). But we have no need or desire to spy on users who download our tools. Spying and outing are poor strategies for professional SEOs to employ... they erode trust and value.
