An Interview with Branko Rihtman (AKA: SEO Scientist)

We recently interviewed Branko Rihtman. He started working in the industry in 2001, doing SEO for clients and properties in a variety of competitive niches. Over that time, he realized the importance of properly done research and experimentation and started publishing findings and experiments at http://www.seo-scientist.com.

How did you get into the SEO space?

Completely by accident. When I was done with my compulsory army service, I knew I would rather work in an internet-based company than, say, dig ditches. So I went to a local internet portal and searched for “internet companies in Jerusalem”. One of the replies was from an SEO company. They offered me a job with flexible hours and a possibility of working from home. Since I was about to start university, working from home looked particularly interesting. I ended up spending 8 years in that company.

When did you know you were going to "make it" in SEO?

Ummm, never? I don’t think any of us ever “makes it” in SEO. Yes, some people are more popular than others and some get invited to speak at more conferences than others, but that is most certainly not a measure of “making it”. The SEOBook forum is full of people that are more successful and savvy than the majority of SEOs out there, yet very few of them are well known in general marketing circles. One of the things I like about SEO is that it is constantly “making” you and “breaking” you. If it wasn’t like that, we wouldn’t be constantly learning and adapting.

What is the most exciting thing that has happened to you while in the SEO field? Do you still get a rush of excitement when a new project takes off?

Getting a site into the top 5 for [mesothelioma] on Google. Kidding. One of the more appealing qualities of the SEO field is the puzzle cracking. You are constantly presented with puzzles – why did Google penalize this site, why is this site ranking above me, what are the parameters considered in the new update… For me, cracking those puzzles is the most exciting part of my work. I really have to remind myself sometimes that I should be thinking about the potential profitability of these conundrums, because to me a puzzle is there to be solved and that is all that matters. Once I crack it, I kinda lose interest in it, so I have to make sure that 1) solving the current SEO puzzle is worth my time in terms of profitability and 2) I can get action items from possible solutions. I think the best example of these puzzles is Google's overoptimization filter. I kinda developed a knack for getting sites out of it (which landed me my current job as well). Another exciting thing would be implementing extensive structural changes to large sites and seeing the positive effect in the SERPs. As for new projects, I have seen so many of them die off miserably that I find it hard to get excited at the beginning. First jolts of traffic and first rankings get me excited and then I turn the engines on.

How would you compare biology to SEO?

Oh dear, this could be a whole blog post. There are several aspects that are very similar. Mainly, and this is especially true in molecular biology, we are making changes to a system that is a black box. We have a whole bunch of (presumed) parameters to tinker with and a very limited list of observable outputs. So we make deductions which can, but don’t have to, be true. So if I am changing a certain ingredient in my bacterial culture and observe a change in growth rate, I cannot be sure what exactly caused the increase. Maybe the element I have added is actually poison and my bacteria are trying to multiply on reserves of food, hoping that one mutant will be able to overcome the adverse effects of the element I have added. Similarly, when we add a link pointing to a website, we don’t know whether it was that link that caused an increase in ranking or whether someone in Bangladesh created a valuable link pointing to one of the pages that links to our new linking page and we enjoyed some of that juice.

Another important similarity (and then I will shut up about it) is the arms race between the search engines and SEOs, and among SEOs themselves. Evolutionary theory and the ecological sciences are full of very important lessons that can be applied to the world of SEO. I have written on my blog in the past about how some evolutionary theories can be applied to understand and foresee the relationship between Google and link buying. Another metaphor from evolutionary theory I like to use is the Red Queen Principle – in evolution, competing organisms have to invest all their efforts in improving and adapting just to remain at the same competitive point relative to their enemies. Like the Red Queen from Through the Looking-Glass, they have to run their fastest to remain in the same place. The same can be said about websites competing in lucrative niches – it is not enough to get to the first spot. Your competitors are constantly aiming for that place too, and you have to put in maximal effort (linking, site speed, trimming indexing fat, QDF hunting etc.) to remain in the same place.

You are a big proponent of applying the scientific method to SEO. What parts of tests are easy to do? What parts are hard?

SEO tests can be easy from beginning to end if done right. The hard part is asking the question in a “testable” way. You have to keep in mind the limits of your testing system and constantly be aware of what you can measure and what you can’t. You have to make sure you have taken into account all the possible outcomes of your test and what each of those outcomes is telling you. Otherwise you can find yourself spending valuable time just to end up with a highly ambiguous result that teaches you nothing about the issue you are researching. Deciding what controlling factors you are going to implement, and doing it in a way that doesn’t interfere with your test, can also be challenging.
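To make the idea of controlling factors concrete, here is a minimal sketch of a test/control comparison in Python. All the URLs and rank positions are hypothetical; the point is only that a control group absorbs site-wide and algorithm-wide noise, so the difference between the groups isolates the change you made.

```python
from statistics import mean

def mean_rank_change(before, after):
    """Average movement in rank position (positive = improved)."""
    return mean(before[url] - after[url] for url in before)

# Hypothetical rank positions for a test group (change applied)
# and a comparable control group (left alone) over the same period.
test_before = {"/page-a": 18, "/page-b": 22, "/page-c": 25}
test_after  = {"/page-a": 11, "/page-b": 16, "/page-c": 21}
ctrl_before = {"/page-x": 19, "/page-y": 23, "/page-z": 26}
ctrl_after  = {"/page-x": 18, "/page-y": 24, "/page-z": 25}

# Difference-in-differences: test movement minus control movement.
effect = (mean_rank_change(test_before, test_after)
          - mean_rank_change(ctrl_before, ctrl_after))
print(f"Estimated effect of the change: {effect:+.1f} positions")
```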

What do public SEO "studies" often get wrong?

Mostly, people get the order of the steps that make up the test wrong. They usually start with a pre-made conclusion and then build the test (and, I suspect, not rarely the results themselves) around it. They want to show that, for example, text surrounding a link passes relevancy to the target page, so they go out to prove that. That is the exact opposite of the scientific process. Now many people say that trying to approach SEO questions with a scientific process is overkill, but science is more a state of mind than a set of tools. It exists so that minimal bias enters your decision and conclusion process; therefore people should not approach it as something that involves a lab coat and chemicals, but rather change their mindset from “what do I want the results to be” to “what is the reality”.

What percent of well-known fundamental "truths" in SEO would you describe as being wrong?

I would say that 100% of absolute, definitive statements about SEO are false. Recently, Joe Hall wrote about becoming a “postmodern SEO” upon realizing that every conventional truth in SEO can be 100% right and 100% wrong, depending on the context. I very much identify with this sentiment. It rubs me the wrong way when people in the industry come out against a certain SEO technique (and it rubs me even more when I know they were the biggest abusers of it until yesterday) or when they make strategy X an “absolute prerogative and whoever doesn’t do X should be fired by their clients and sued for dishonest practices”. Keyword tags can be useful in some cases, rank reporting can be useful in some cases, forum signature spamming can be useful in some cases and increasing keyword density can be useful in some cases. It all depends on the context.

In the forums, sometimes I read your contributions and think "classic whitehat consultant view," and then on other entries I think "aggressive affiliate in gaming." What allowed you to develop such a diverse range?

I am very flattered that people think this when they read my ramblings or talk to me about SEO. What allowed me to develop a diverse range of experiences in SEO is not being judgmental towards SEO techniques. Continuing from the previous question, understanding that they are all tools that should be put in the right context and used responsibly enabled me to see all the advantages and disadvantages of SEO techniques and apply them accordingly. Had I taken a “holier than thou” approach towards either end of the SEO spectrum, I would have been a worse SEO. I also consider myself lucky to have had the opportunity to work in a wide range of niches - from legal, ecommerce, travel and financial, all the way to porn, pharmaceutical and gaming, with a lot of niches and business sizes in between those extremes. Once you look at the link profiles of sites that have been ranking for years in some of those extreme industries, you understand how preposterous the division into hats of different colours really is.

As a second part to that question, how do you decide what techniques are good for some of your own websites & which are good for client websites?

Again, it is all in the context. I make a big differentiation between our sites and clients’ sites, in that whenever I want to use a riskier SEO technique on a client site, I make sure to educate the client about all the risks and benefits of going down that road. I make sure the client understands the possible repercussions and I try to offer a cleaner alternative. There are clients that are not interested at all in organic promotion and there are clients that enter the project knowing that the site we work on can be burnt in a matter of minutes. When it comes to our sites, it depends on the profitability of the site, obviously. Then there are sites I test stuff on that I wouldn’t click on without wearing my lab gloves.

Do you believe Google is intentionally tilting the search game toward brands, or do you think there are many other signals they are looking for that brands just happen to frequently score high on?

I don’t think we need to speculate much about that – they have openly said in the past that brands are the solution to the cesspool of the internet. They are rewarding brands with SERP enhancements. They are creating algorithmic changes in which brands are apparently being treated less harshly than run-of-the-mill sites. On the other hand, they are making sure to stress in their PR announcements that brands are not treated differently than anyone else. As I don’t believe they openly lie about these things, it seems to me they are just doing doublespeak and being intentionally obscure about it. I can say that I don’t discriminate against tall people on buses and I will be factually correct, since no one goes down the bus line, takes out the people over 180 cm tall, and sends them back home. However, by making the legroom very uncomfortable for these people, I may as well be kicking them out of the line and saving everyone the trouble. So while there is probably no checkbox next to certain websites marking them as brands, the ranking algorithms can theoretically be tweaked so that the brands surface to the top of a lot of the money queries, and I think that is what we are seeing here. Possible signals for this could be the percentage of links with a URL for anchor text, a certain number of searches for the brand name, and others. By the way, reliance on these signals can be used to explain the relative advantage that exact match domains have for their keyword.
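One of those signals – the percentage of links with a URL for anchor text – is easy to approximate from a backlink export. A rough Python sketch, with an entirely made-up anchor sample and a deliberately crude notion of what counts as a URL-style anchor:

```python
import re

# Matches anchors that are bare URLs or domain names (crude heuristic).
URL_LIKE = re.compile(r"^(https?://|www\.)|\.(com|net|org)(/|$)", re.I)

def url_anchor_ratio(anchors):
    """Share of anchor texts that look like raw URLs -- a rough proxy
    for 'natural'/brand-style linking rather than keyword targeting."""
    hits = sum(1 for a in anchors if URL_LIKE.search(a.strip()))
    return hits / len(anchors)

# Hypothetical backlink anchor sample for one site
anchors = ["cheap blue widgets", "http://example.com", "example.com",
           "Example", "buy widgets online", "www.example.com/widgets"]
print(f"URL-style anchors: {url_anchor_ratio(anchors):.0%}")
```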

Both the relevancy algorithms & webmasters are in some ways reactive. I believe that frequently causes the relevancy algorithms to ebb and flow toward & away from different types of sites. Do you generally have 1 sorta go-to-market plan at any given time, or do you suggest creating multiple SEO driven strategies in parallel?

It all depends on the client responsiveness levels. If I see that the client is willing and allows us to become part of their marketing team, then we both aim for harnessing every marketing activity for SEO benefits, while also trying to diversify and reduce the dependency on any single traffic source. In cases when, for a whole lot of different reasons, we cannot establish a network of sites that will use different strategies, we try to work with a whole lot of subdomains, trusting how Google treated subdomains historically. I have to admit that in the majority of cases, the responsiveness of the deciding ranks (or the lack thereof), together with a constantly growing list of more basic, day-to-day tasks, prevents us from making these strategic marketing decisions for the client – it is hard to talk about a holistic approach to marketing when their homepage doesn’t appear on the first 3 pages of the site: query or when their IT department decides to 302 every product page to the homepage while they are moving servers for 3 months.

When major algorithm changes happen they destroy certain business models & eventually create other ones. How many steps ahead / how far ahead do you feel you generally are from where the algorithm is at any given time?

We are all over the map with this. Luckily (or unluckily) none of our clients were affected by Panda. I say “unluckily” because the scientist in me would want nothing more than to test different theories about Panda on an affected site. The marketer in me is stabbing the scientist in the back with a long sword for having such blasphemous thoughts. I would say that we usually “hang around” where the algorithm is at any given moment and if we stay behind, we manage to close the gap in a reasonable period of time. At least that has been the case so far. In some other cases, we have benefited from sites getting hit by algorithmic changes. This only means we are lucky, because I don’t think there is any single strategy that is 100% working all of the time in every level of niche competitiveness. Had such a strategy existed, someone would have cracked it (Dave Naylor most probably), used it to their own benefit and Google would have changed the rules again, rendering the “perfect strategy” less than perfect.

How far behind that point would you put a.) the general broader SEO industry b.) SEO advice in the mainstream media?

One of the major revelations I had in the SEOBook forum is that the public SEO community is really just the small tip of the iceberg that is this industry. There are so many skilled people working on their own sites, as affiliates, or as in-house professionals who do not participate in the SEO Agora that any attempt to characterize “the general broader SEO industry” would be wrong. There is no way of judging where the industry is, other than by what people write and talk about in social media, and I don’t think that is a fair judgement. This is an industry of marketers, and people do not write to dispense knowledge most of the time. The vast majority of the content put out there is created for self-promotion and/or to follow some invented rule that “you must write X posts per week to keep your audience engaged”. It is very similar to the whole “Top X” lists format, in which it is obvious that a significant percentage of items on the numbered list were forced in there so that the number X would be round or fit some theory of “most read top X articles”. While I do believe that someone will find value in anything, when looking across the board, there is very little you can tell about the actual knowledge of the people in this industry from what they write. I hope. I will tell you that I do see a general difference between the European and the US SEO crowd – I have seen (percentage-wise) a seemingly larger proportion of UK, Dutch and German SEOs who are more daring and questioning in their writing than the US SEOs. Don’t ask me why this is so; that is beyond my scope of expertise (or interest).

As for the mainstream media, living in the Middle East, I have learned to automatically distrust the mainstream media on issues much more important than SEO, therefore I usually treat mainstream SEO articles as a comic relief. Or a tragic one.

Many times when the media covers SEO they do it from the "lone ranger black hat lawbreaker" angle to drum up pageviews. Do you ever see that ending?

Nope. Nor do I ever see people in our industry not taking the bait and responding to that kind of coverage, thus contributing significantly to the mentioned drumming up of traffic. Even if the advertising industry moves away from impression-based pricing, more attention will always mean more links and that is just a different kind of opiate.

From a scientific standpoint, do you ever feel that pushing average to below-average quality sites is bad because it is information pollution (not saying that you particularly do it or do it often...but just in general), or do you view Google as being somewhat adversarial in their approach to search & thus deserving of anything they get from publishers?

I consider as below average anything Google would not allow AdSense on. Maybe someone really doesn’t know how to drink water from a glass, and for that person an eHow article is the best fit. On a serious note, just like with hats, I try not to be judgmental when it comes to content. If lower quality content that does not rank anywhere is used to push high quality content in very popular SERPs, I think it all levels out in the end. The bigger problem for me is rehashed, bland content which you can tell was written from a mold: start with a question, present some existing views on the issue and end by asking your readers the initial question so you encourage comments. Or numbered list articles. Or using totally unrelated current events AND numbered lists in combination with a tech topic. I have just seen an article titled “5 things Amy Winehouse’s death teaches us about small business”. Spamming forums is Pulitzer-worthy material compared to this garbage. Yet Google constantly ranks this crap and rewards it with a cut of their advertising revenue. And what is even worse, the crap ranks for head terms (ok, maybe a bit less after Panda) while forum or comment spam does not appear in my SERPs. So who is polluting the web again?

I don’t think a scientific approach is relevant here. One thing that exists in the world of science and doesn’t in SEO is peer review. So if something gets published in a scientific journal, it was reviewed critically by the experts in that field and was deemed worthy in every possible aspect by some rigorous standards. Had this kind of system existed in the world of SEO, we wouldn’t have a below-average-quality content problem.

Can Bing or anyone else (outside of say Naver, Yandex & Baidu) challenge Google & win a significant slice of the search marketshare?

Only if Google does it for them and drops the ball completely. I don’t believe in homicide in the world of hi-tech companies (Facebook killer, Google killer, iPad killer) but I definitely believe in suicide (Myspace). The ball is constantly in Google’s court since they are the biggest kid on the playground, and they have managed it fine so far. It is ironic how they have to deal with bad press on so many issues, almost making MS the underdog and the company people turn to when they want to boycott Google. Right now Google is the innovator and a trendsetter in many fields besides search (Documents, Analytics, G+, Adwords), so having all those eyeballs and improving the integration of all those products into search and vice versa will make them an impossible act to follow for any foreseeable future. Which is something that was said about ancient Rome too.

A lot of SEOs are driven by gut feeling. With your focus on the scientific method, how much do you have to test something before you are confident in it? How often does your strategy revolve around gut feeling?

There are things that I know work without everyday testing. Keywords in anchor text will pass relevancy in the majority of cases and I don’t need to test that every time I place a link somewhere. I am also aware of the exceptions to that rule (the second link doesn’t count, for example), so when I see an unusual or unexpected response from a search engine, it gets my attention and I start testing. I also like to test extraordinary claims by people in the SEO industry, because they usually go against common knowledge and that is always informative. I will usually not let the testing process stand in the way of work. If there are several possible outcomes to a test that takes a long time to perform, I try to run with the project for as long as I can without making the decision, leaving all future direction possibilities open.

Gut feeling is something I usually use to assess the trustworthiness of the people I listen to. I rely a lot (maybe more than I should) on other people’s knowledge. As I mentioned, I haven’t had the chance to test how a pandalized site responds to different changes, so I had to trust other people’s reports. Gut feeling is very helpful here to save time reading mile-long posts by people that I suspect do not even practice SEO on a daily basis.

If a friend of yours said they wanted to get into SEO, what would you tell them to do in order to get up to speed?

To read the free guides from Google and SEOMoz. To pick a niche and create a site from scratch. To learn how to code, how to delegate, how to measure and how to hire and fire people. To read at least one SEO article every day. To read no more than one SEO article every day. To invest their first profits into the SEOBook Training Section and to submit their site for review in the forum. The value they get from the advice there is going to be the best investment they made at that early stage. After their site is making money, to repeat that process in a different niche with a different strategy. Diversification is the best insurance policy in the ever-changing algorithm world.

If you had to start from scratch today with no money but your knowledge would you still be able to compete in 2011?

Yeah. Competing is about picking the battles you can win with what you have at the moment. There are still niches that can be monetized with relatively low effort (especially in non-English markets) and I think I would be able to monetize the knowledge I have and leverage it to create revenue in a reasonable amount of time. Luckily, I don’t have to test that claim.

If you had $50,000 to start, but lacked your current knowledge, what do you think your chances of success in SEO are?

Very low. Part of the knowledge is knowing what to spend the money on. Without prior knowledge, I would probably think that I can take on this SEO thing all by myself, and $50K would be gone before I realized my mistakes. I would probably fall into the trap of buying links from some link network or torching my new site with 200,000 forum signature links all created in 2 hours.

And, saving a tough one for last, in what areas of SEO (if any) do you feel science falls flat on its face?

First, I would like to reiterate: science is not a tool, it is a way of thinking and approaching problems. So under those definitions, I don’t think that science can fall flat on its face at all. I do see a problem with the abuse of the word “science” for marketing goals, and a lot of those “approaches” fail because they lack the scientific way of thinking. Mostly they lack self-criticism and are so blinded by tagging their work as “science” that they will not adopt some of the humility and self-doubt that is present in the majority of scientific work. The lust for hitting that Publish button, especially if there is potential financial benefit in publishing a certain kind of results, is the most unscientific drive in our industry.

There are some areas of SEO where scientific thinking should take a back seat to other approaches. One that instantly springs to mind is link building. To me, link building is the true art of marketing – recognizing what drives the potential linkers and leading them to link to you while all along they think they came up with that decision themselves. There are some measurements involved, and any testing should be planned and executed with scientific rigour, but the creative part of it is something where science is of little use.

---

Thanks Branko! You can find him rambling at @neyne on Twitter or the SEOBook Forum & publishing findings and experiments at http://www.seo-scientist.com.

Currently, he is responsible for SEO R&D at Whiteweb, an agency that provides SEO services to a small number of large clients in highly profitable niches. His responsibilities at Whiteweb are to gather, organize and expand the company's know-how through research, experimentation and cooperation with other SEO professionals. In addition to being an SEO, he is currently writing his MSc thesis in environmental microbiology at the Hebrew University in Jerusalem.

Longer Google AdWords Ad Copy

I was just checking out the ongoing strategic meltdown in the value of the Dollar & noticed an AdWords ad with an extended headline & a 150 character ad description.

Currently I believe the above extended description is a limited beta test, but if Google starts mixing that in with Google Advisor ads & ad sitelinks there might not be a single organic result above the fold on commercial keywords.

The above image is even uglier when Google Instant is extended.

Using the 150 character ad descriptions would drive everything down one more row per ad. Adding another line to each of the AdWords ads would push the "organic" search results down another listing.

Of course one response is to operate in the tail of search, but just look at DMD to see how well that worked for them.

They are so desperate that they sent legal threats to a site flaming them. Humorously, that site also runs AdSense ads.

And that desperation is *before* Google has finalized a legal agreement on the book front & started aggressively pushing those ebooks in their search results with full force. In 12 months ebooks will be the new Youtube...a service that magically keeps growing over 10% a month "organically" in Google's search results.

Your content isn't good enough to compete, unless you post it to Youtube.

In addition to uploading spammy videos in bulk to Youtube, maybe SEOs should create a collective to invest in "an oversized monitor" in every home and on every desk. :D

Alternatively, switching the default search provider on every computer you touch to Bing doesn't seem like a bad idea.

Google Brand Bias Reinvigorates Parasitic Hosting Strategy

Yet another problem with Google's brand first approach to search: parasitic hosting.

The .co.cc subdomain was removed from the Google index due to excessive malware and spam. Since .co.cc wasn't a brand, the spam on the domain was enough to get it removed. But as Google keeps dialing up the "brand" piece of the algorithm, there is a lot of stuff on sites like Facebook or even my.Opera that is really flat out junk.

And it is dominating the search results category after category. Spun text remixed together with pages uploaded by the thousand (or million, depending on your scale). Throw a couple links at the pages and watch the rankings take off!

Here is where it gets tricky for Google though...Youtube is auto-generating garbage pages & getting that junk indexed in Google.

While Google is under regulatory review for abuse of power, how exactly does it go after Facebook for pumping Google's index with spam when Google is pumping Google's index with spam? With a lot of the spam on Facebook, at least Facebook could claim they didn't know about it, whereas Google can't claim innocence on the Youtube stuff. They are intentionally poisoning the well.

There is no economic incentive for Facebook to demote the spammers as they are boosting user account stats, visits, pageviews, repeat visits, ad views, inbound link authority, brand awareness & exposure, etc. Basically anything that can juice momentum and share value is reflected in the spam. And since spammers tend to target lucrative keywords, this is a great way for Facebook to arbitrage Google's high-value search traffic at no expense. And since it pollutes Google's search results, it is no different than Google's Panda-hit sites that still rank well in Bing. The enemy of my enemy is my friend. ;)

Even if Facebook wanted to stop the spam, it isn't particularly easy to block all of it. eBay collects numerous layers of data about users in their marketplace and charges for listings, & yet stuff like this sometimes slides through.

And then there are even listings that warn against the scams as an angle to sell information.

But even some of that is suspect, as you can't really "fix" fake Flash memory to make the stick larger than it actually is. It doesn't matter what the bootleg packaging states...it's what is on the inside that counts. ;)

When people can buy Facebook followers for next to nothing & generate tons of accounts on the fly there isn't much Facebook could do to stop them (even if they actually wanted to). Further, anything that makes the sign up process more cumbersome slows growth & risks a collapse in share prices. If the stock loses momentum then their ability to attract talent also drops.

Since some of these social services have turned to mass emailing their users to increase engagement, their URLs are being used to get around email spam filters.

Stage 2 of this parasitic hosting problem is when the large platforms move from turning a blind eye to the parasitic hosting to engaging in it directly themselves. In fact, some of them have already started.

According to Compete.com, Youtube referrals from Google were up over 18% in May & over 30% in July! And Facebook is beginning to follow suit.

Job Crusher Review - a New Type of Spam

I was going through my inbox and noticed that someone sent my Paypal account 10 cents. I believe Paypal eats anything under a quarter, so the only reason for the payment is to improve the odds that the unsolicited spam email is read.

If the payment served any purpose (other than to insult the person on the receiving end of it) they could at least have sent a few hundred, or done something classier like donating money to a charity they know you like. But no, they wanted to go lowbrow with their spam.

That is where you can one day end up if you want to be a big baller like the Job Crusher folks...you can spam people with offensive 10 cent payments. Isn't sending someone a friggen dime sorta counter to the marketing message of allegedly being rich & wealthy? Any way you slice it, the act is, at best, classless.

If you see someone promoting Job Crusher 2 with glowing reviews endorsing it, be sure to look for affiliate links & affiliate redirects. If they are singing praises for it & using affiliate links then they might make up to $10 per click for marketing it to desperate internet marketing newbies who hate their jobs/lives so bad that they keep paying people for tips on how to get rich quick.

If you want to make serious money using the Job Crusher system, the way to do so is to promote it via email to internet marketing newbies. The secret to most get rich quick systems is the buyer's ignorance. The newbies *are* the product/system.

Unfortunately, I don't hurt for money bad enough to stoop to promote it, so I just refunded their dime and asked them not to spam me again. ;)

If I sent them an invoice for the time it took to write this "no thanks spammer" blog post, do you think they would actually pay it?

The homepage copy on the Job Crusher site states:

What If You Just Did 10% Of A Million Dollars?? That’s Still $100,000 Per Year!
...
We are sharing this because we are sick and tired of the scam-offers out there being offered and all the BS garbage pounding our community.

What if their classless hack spam marketing angle sent more than 10% of a single Dollar?

What if...

What if they just shot themselves and stopped spamming?

Well, since it is rude to say "just shoot yourself" all I can ask of the Job Crusher team is this: if you want to get rid of "all the garbage pounding" you can start by leaving my inbox alone.

Thanks!

A Complete Review of Wordtracker's Link Builder

You need links to rank, period. We can talk all we want about great content, social signals, brand signals, and all that jazz but quite a bit of that is subjective.

If you practice SEO, and have success with it, then you are well aware that a claim of "you need links to rank" is an objective, true statement without a bunch of false positives.

The gray areas come into play when we talk about things like anchor text, quality, volume, and so on, but the overarching truth is without links you are largely invisible in the SERPs.

Ok, enough of what you already know. Wordtracker recently updated one of their core tools with some cool new features and functionality.

What is Link Builder from Wordtracker?

Link Builder is designed to address most of the core functions of a link building and prospecting campaign:

  • Locating potential link partners via competitor backlinks or specific keywords
  • Setting up a link building campaign and sorting your links properly (blogs, directories, social media, etc.)
  • Tracking the status of your link campaign's efforts

Wordtracker uses Majestic SEO's Fresh Index by default but you can use the Historic Index as well.

I might opt for the Fresh Index initially, because Majestic tends to have dead links in the Historic Index (thanks to the significant churn on the web), but if you can't find enough decent prospects in the Fresh Index, using the Historic one isn't a bad option.

There is a lot I like about this tool and a few things I'd like to see them add to or improve on.

Step 1: Setting Up a Campaign

I'm a fan of clean, easy to use interfaces and Wordtracker definitely scores well here. Here is the first screen you are presented with when starting up a fresh campaign:

Researching competing link profiles is not enough with respect to link prospecting, in my opinion. I really like the option to not only research multiple URLs at once but also to research keyword-specific prospects.

You can research lots of countries as well. Below is a snapshot of the countries available to you in Link Builder:

Step 2: Prospecting With Competitor URLs

I am craving some chocolate at the moment, as you can tell from my selected URLs :)

Here's a good example of my decision-making process when it comes to using the Historic Index and the Fresh Index. My thought process usually involves the following information:

  • The bigger/older the link profiles of the URLs, the more likely I am to use the Fresh Index to avoid lots of dead links
  • If the site is a well known brand, I will be more likely to use the Fresh Index, given the likelihood that the link profile is quite large
  • Smaller or newer link profiles will probably benefit more from the Historic Index

In this example the sites I'm researching have big link profiles and have been around for quite a while, in addition to being large brands, so I will use the Fresh Index to cut down on potential dead ends.
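That rule of thumb is simple enough to write down as code. A sketch in Python; the thresholds are arbitrary illustrations of my reasoning, not anything Majestic or Wordtracker publishes:

```python
def pick_majestic_index(backlinks, profile_age_years, is_big_brand):
    """Rule of thumb from the list above: big, old, or branded link
    profiles lean toward the Fresh Index (fewer dead links); smaller,
    newer profiles lean toward the Historic Index (more raw prospects).
    The thresholds here are illustrative guesses only."""
    if is_big_brand or backlinks > 100_000 or profile_age_years > 5:
        return "fresh"
    return "historic"

print(pick_majestic_index(250_000, 8, True))   # -> fresh
print(pick_majestic_index(3_000, 1, False))    # -> historic
```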

I selected the "Edit Sources" box because I want to make sure I pick the URL with the most links (or you can just go with both) but I wanted to show you the options:

I'll leave all selected just to maximize the opportunities. Sometimes you'll find pages ranking for specific keywords you might be targeting, rather than just the homepage, so you can use one or the other, or both, in that case.

In this scenario I'm looking at the URLs ranking for "chocolate", and they all happened to be homepages anyway.

Wordtracker is pretty quick with getting the data in, but while you're waiting you'll see the following screen:

Step 3: Working with the Analysis Tab

In order to keep the results as targeted as possible, Wordtracker automatically removes the following links from the results:

  • Image Links
  • Redirects
  • No-follow links

One thing I'd like to see them do is let no-follows through because, even though they might not pass any juice, they certainly can be decent traffic sources, and link building isn't just about passing juice; it's also about brand building and traffic generation.

I'd even say let image links through. I understand they don't want to be a pure link research tool but image links can be valuable for some of what I just mentioned as well. I would say, give us the data and the ability to filter it rather than just taking it away completely.

Here is a snippet of the result page and a description on what it represents:

On the left are pre-designed buckets that Link Builder groups your links into. This is helpful but I'd like to see more flexibility here.

They also offer a tagging feature to help you group links in another way. The tagging can be helpful for things like assigning links to specific people within your group or really any other custom setup you have going on (maybe stuff like grouping keywords into priority buckets or whatever.)

The prospect tab gives you the domain the link sits on (chow.com in the example below), the page or pages it links to on the competing sites, and the page the link actually appears on:

All you have to do is click that "links to" button to see where the link is pointing to (in this case chow.com is only linking to 1 of the sites I inputted).

The column to the right shows the page on the domain where the link is originating from and the number in the middle is a measure of how important that particular prospect might be.

The right-most columns tell you whether the domain is also linking to you and how many of the sites you inputted that domain is linking to. The idea being that the domain might be more likely to link to you if it is linking out to multiple competing sites as well:

The grayed out button to the right of the co-link count is the "target" button. This is the button you'd click to let the tool know that this is a prospect you'd like to target.

You have the following toolbar available to you in the Analysis tab:

These are generally self-explanatory:

  • Delete - removes selected prospects from the campaign
  • Export - export your results to a CSV file
  • Copy to - copies prospects to another campaign within your account
  • Tag - allows you to tag selected prospects to help create custom grouping fields
  • Filter - filters Top Link by "contains" or "does not contain". An example might be if you wanted to target a link prospect or prospects which contained the word "chocolate" somewhere in the URL (see the sketch after this list)
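Since the Export option hands you a CSV, you can also replicate or extend the built-in "contains" filter offline. A minimal Python sketch; the "Top Link" column name is an assumption, so check it against the header row of your own export:

```python
import csv

def filter_prospects(csv_path, must_contain):
    """Yield exported prospect rows whose Top Link contains a substring,
    mirroring the built-in 'contains' filter. The 'Top Link' column
    name is an assumption -- adjust it to match your actual export."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if must_contain.lower() in row.get("Top Link", "").lower():
                yield row

for row in filter_prospects("prospects.csv", "chocolate"):
    print(row["Top Link"])
```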

You can also click on any of the groupings on the left to view those specific groups only. I find that the groupings are fairly accurate but I personally prefer the ability to customize fields like that rather than being boxed in.

I created a sample tag titled "for eric" that contains 2 links I want a team member named Eric to work on:

Step 4: Working with the Contact Tab

The Contact tab has most of the same toolbar options as the Analysis tab with one exception:

  • Find Contact and About Links - click on the links you want to find contact information on and/or find the about page on

Link Builder works in the background to find this information and you can continue working in the application. There is a notes option as well. There's no specific way to leave multiple, time-stamped notes (for team environments) but the input box is expandable so you can leave an ongoing contact history.
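I don't know how Wordtracker implements this lookup, but a naive version of contact/about discovery is straightforward: fetch the prospect page and keep any links whose text or href suggests a contact or about page. A rough Python sketch (a real implementation should use a proper HTML parser rather than a regex):

```python
import re
from urllib.parse import urljoin
from urllib.request import urlopen

def find_contact_links(url):
    """Naive stand-in for 'Find Contact and About Links': scan a page's
    HTML for anchors whose href or text mentions contact/about.
    Regex-parsing HTML is fragile; this is illustrative only."""
    html = urlopen(url).read().decode("utf-8", "ignore")
    anchors = re.findall(r'<a[^>]+href="([^"]+)"[^>]*>(.*?)</a>',
                         html, re.I | re.S)
    return [urljoin(url, href) for href, text in anchors
            if re.search(r"contact|about", href + " " + text, re.I)]

print(find_contact_links("http://example.com/"))
```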

You have the same contact flag on the right, and to the left of that is an email icon that turns yellow when you click it, designed to let you know contact is in progress or has been initiated.

When the contact request comes back (just refresh the contact tab) you'll see the following, new fields within the Contact tab that denote the contact/about pages for the prospect:

Step 5: Reporting

The Reporting piece of Link Builder has the following reports:

  • History - options for the Fresh/Historic Index of Majestic SEO via cumulative and non-cumulative views for the chosen domains
  • Spider Profile - the link category breakdown (the aforementioned pre-defined link sources Wordtracker assigns your prospects to) of each domain
  • Target Summary - number of targets, number/% of targets contacted, number/% of targets not contacted, number/% of targets linking to you

This gives you a quick overview of the growth of competing link profiles, current link building rate, types of links they have, and your own Prospect metrics. All the reports are exportable to PDF.

Here's the History report:

Here's the Spider Report:

Here's the Target Summary:

Additional Campaign Options

As we discussed earlier, you can either input a list of domains or search on a specific keyword.

If you search on a specific keyword to start, you are able to select URLs to include in your prospecting search. Everything else, in terms of options after the URL selection, is the same as if you had started with domains.

Having a keyword search to start a campaign is helpful in case you are looking to go beyond competitors you already know of and get a really deep look into link prospects across that keyword's market as a whole.

Also, right next to your campaign name you can sign up to be automatically notified of new links and prospects for your campaign:

Firefox Extension

Link Builder also has a Firefox extension that allows you to grab all the external links from a page and save them in your Link Builder account.

I find this is helpful on directory sites (for gathering a list of topic-specific URLs), as an example. The extension is really easy to use. You can install it here. Once you arrive at a page you want to use it on you just click on the LB logo in your toolbar:

Then once you click on the option to gather the links, you get the following interface:

You can save the chosen links right into your Link Builder account.

What I Like

The features that I like in Wordtracker's Link Builder tool are:

  • Ability to prospect by multiple URLs or by choosing a specific keyword
  • Option to use Fresh or Historic Index via Majestic SEO
  • Simple ways to keep notes and contact information
  • Ability to search for contact and about information on selected prospects
  • Robust selection of countries
  • Initial, intelligent link grouping
  • Exporting capabilities
  • Fast results and a really clean, easy to use interface

What Could Be Improved On

I think Wordtracker could do some things to make this tool even more functional and useful:

  • More flexibility with the naming and assigning of link types
  • Have profile-wide settings to include all links (no-follow, image, etc) or exclude some rather than excluding without a choice to include
  • More filtering options around the data points they offer and whether a prospect has been targeted or not
  • More robust link tracking (if the status of links changes, send me an alert). Though I realize that is getting into link tracking versus link building, it's still a nice option
  • A bit more flexibility with notes and timestamps for a more defined contact history (especially if teams use this)

A Solid Link Building Product

Overall I think this tool does a good job with its intended use: link building. I think some users would like to see more done to make it more team-friendly, but I think you can accomplish a lot with their tagging feature.

As stated above, I'd like to see some more done with notes and such but as a link prospecting and building tool Wordtracker's Link Builder is worth your time to try out.

You can grab a free trial over at Wordtracker.

The God Complex in SEO

Authoritative, but Often Wrong

Trusting a powerful authority is easy. It allows us to have a quick shorthand for how things work without having to go through the pain, effort, & expense to figure things out. But it often leads to bogus solutions.

This video does a great job of explaining how nothing replaces experience in the SEO industry.

A combination of numerous parallel projects, years of trial and error experience & a deep study of analytics data is far superior to having the God complex & feeling 100% certain you are right.

Change is the only constant in SEO.

Grand Plans

Big plans often get subverted before they pan out & the more obvious something is, the shorter its shelf life. By the time everyone notices a trend, jumping on it probably isn't much of a competitive advantage. You might still be able to make some money for a limited time (or for a longer time if you apply it to new markets), but...

It is the contrarian investors who are taking (what is generally perceived to be) big risks who are allowed to ride a trend for years and years.

Options & Opportunities

When Panda happened a lot of theories were thrown out as to what happened & how to fix it. Anyone who only runs 1 website is working from a limited data set and a limited set of experience. They could of course decide to do everything, but there is an opportunity cost to doing anything.

Making things worse, if they have limited savings & no other revenue producing websites there are some risks they simply can't take. They can still sorta infer some stuff from looking at the search results, but those who have multiple sites where some were hit and others were not know intimately well the differences between the sites. They also have cashflow to fund additional trial and error campaigns & to double down on the pieces that are working to offset the losses.

Success Requires Failure

A lot of times people want to enter a market with a grand plan that they can follow without changing it once the map is made, but almost anyone who creates something that is successful is forced to change. Every year in the United States 10% of companies go under! And due to the increased level of competition online it likely separates winners from losers even faster than in the offline world. Those who stick to a grand plan are less able to keep up with innovation than those who have an allegiance to the data. Sometimes having a backup plan is far more important than having a grand plan.

Incremental Investing, Small & Large

Almost anything that I have done that has been successful has started ugly & improved over time. This site was an $8 domain & I couldn't even afford a $99 logo for it until I was a couple months into building it. Most of our other successes have been that way as well. If something works keep reinvesting until the margins drop. But when the margins do drop off, it is helpful to have another project you can invest in, such that you are not 1 and done.

The earliest Google research highlighted how ad-based search business models were bad & the now bankrupt Excite.com turned down buying Google for under $1 million. It turns out everyone was wrong there. One company adjusted & the other is bankrupt.

Overcoming the God Complex

We don't control Google. We can only influence variables that they have decided to count. As their business interests and business models change (along with the structure of the web) so must we.

The God complex always looks a bit interesting from afar, no matter how reasonable it sounds to the true believer.

Our "Brand" Stands for 'Anything That Will Make Money'

Want a good example of Google's brand-bias stuff being a bunch of bs?

Niche expert value-add affiliate websites may now lack the brand signal to rank as the branded sites rise up above them, so what comes next?

Off-topic brands flex their brand & bolt on thin affiliate sections.

Overstock.com was penalized for having a spammy link profile (in spite of being a brand they were so spammy that they were actually penalized, counter to Google's cultural norm) but a few months later the penalty was dropped, even though some of the spam stuff is still in place.

Those who were hit by Panda are of course still penalized nearly a half-year later, but Overstock is back in the game after a shorter duration of pain & now they are an insurance affiliate.

prnewswire.com/news-releases/oco-launches-insurance-tab-125739128.html

And this "fold the weak & expand the brand" game is something the content farm owners are on to. Observe:

While most of the content farms were decimated, that left a big hole in the search results that will allow the Huffington Post to double or triple the yield of their content through additional incremental reach.

And, yes, this is *the* same Huffington Post that is famous for aggregating 3rd party content (sans attribution), wrapping a Tweet in a page & ranking it, and getting mocked by other journalists for writing 90's-styled blocks of keyword spam:

Before I go on, let me stop and say a couple of more important things: Aol, Aol Acquires Huffington Post, Aol Buys Huffington Post, Aol Buys Huffpo, Aol Huffington Post, Huffington Post, Huffington Post Aol, Huffington Post Aol Merger, Huffington Post Media Group, Huffington Post Sold, Huffpo Aol, Huffpost Aol, Media News.

See what I did there? That's what you call search-engine optimization, or SEO. If I worked at the Huffington Post, I'd likely be commended for the subtle way in which I inserted all those search keywords into the lede of my article.

And, of course, AOL is a company with the highest journalistic standards:

I was given eight to ten article assignments a night, writing about television shows that I had never seen before. AOL would send me short video clips, ranging from one-to-two minutes in length — clips from “Law & Order,” “Family Guy,” “Dancing With the Stars,” the Grammys, and so on and so forth… My job was then to write about them. But really, my job was to lie. My job was to write about random, out-of-context video clips, while pretending to the reader that I had watched the actual show in question. AOL knew I hadn’t watched the show. The rate at which they would send me clips and then expect articles about them made it impossible to watch all the shows — or to watch any of them, really.

Doing fake reviews? Scraping content? Putting off-topic crap on a site to monetize it?

Those are the sorts of things Google claims the spammy affiliates & SEOs do, but the truth is they have never been able to do them at the scale the big brands have. And from here on out it is only going to get worse.

We highlighted how Google was responsible for creating the content farm business model. Whatever comes next is going to be bigger, more pervasive, and spammier, but coated in a layer of "brand" that magically turns spam into not-spam.

Imagine where this crap leads in, say, 2 or 3 years.

It won't be long before Google is forced to see the error of their ways.

What Google rewards they encourage. What they encourage becomes a profitable trend. If that trend is scalable then it becomes a problem shortly after investors get involved. When that trend spirals out of control and blows up they have to try something else, often without admitting that they were responsible for causing the trend. Once again, it will be the SEO who takes the blame for bad algorithms that were designed divorced from human behaviors.

I am surprised Google hasn't hired someone like a Greg Boser or David Naylor as staff to explain how people will react to the new algorithms. It would save them a lot of work in the long run.

Disclosure: I hold no position in AOL's stock, but I am seriously considering buying some. When you see me personally writing articles on Huffington Post you will know it's "game on" for GoogleBot & I indeed am a shareholder. And if I am writing 30 or 40 articles a day over there that means I bought some call options as well. :D

Update to Firefox 5

I am not sure how many people were holding off on updating to Firefox 5 because of our SEO extensions; however, we made Firefox 5 versions of SEO for Firefox, Rank Checker & the SEO Toolbar quite a while ago. When you first go to update there might be a message that the extensions are not compatible. If that is the case, upgrade to Firefox 5 anyway & after it is installed it will check for updated versions of the extensions.

Our newest extensions no longer support Firefox 3 (we get some complaints from people using 3.6) and some early versions of Firefox 4 (like 4.0.1) may not be supported either. If you have an older browser & try to install our extensions you will get an incompatibility message, like so:

If you like the extensions as they are then there is no need to upgrade, however if you are having any issues with them (not being able to install them, not being able to pull Bing rankings, blank CSV export, etc.) then an upgrade should fix the problem.

Firefox stated that the version 5 update is a security one, so I did it right away. If your Firefox version is high enough you should see an "allow" message box, like so:

Shout out to Brad McMillen, who had a support request & donated $20 to charity: water to receive a response. He was the first person to do so after months of us making the suggestion on the help desk, even as 10 or so freetards a day (too lazy to read the installation instructions) sent us support tickets flaming us because they "paid" for Firefox years ago & such. ;)

I have been losing weight recently and working out a decent amount every single day & working a bit less. I even had time to go see my mom, see my sister, and visit my favorite childhood park.

As an added bonus we dusted off the Nintendo & found a store selling vintage games that had my favorite pinball machine ever - Medieval Madness. I felt like a genuine eccentric when trying to explain to my wife how buying a pinball machine for the house was reasonable. Even more eccentric, she didn't counter the idea. Who knows where that will lead...but it could add extra incentive to buy vs rent, if only California real estate didn't start at 7 figures on up. :D

Extra time for reading, exercising & playing has led to a higher level of personal happiness, even as my fear of crushing state debts & banker fraud leading to a new wave of fascism the world over grow daily.

Probably the single best business move I made over the past couple years was deciding that freetards were worth less than nothing and just deleting them. Part of what helped me do that was I actually had an employee answer tickets & after less than a week of doing it he was miserable & had a health issue. Since discarding freetards entirely I have seen 0 business impact and a huge lift in quality of life. If you are trying to please too many people and are showing signs of an unbalanced life for it (things like lacking sleep, high stress level, gaining weight, etc.) then a change is in order. I am still pretty chubby, but have already lost about 30 pounds.

Sometimes I think it makes sense to lean into living a somewhat unbalanced lifestyle to build leverage, but after you are doing well for a while at some point it makes sense to live a bit more balanced life & enjoy it a bit more (or else the hidden health issues will become unhidden in short order). :D

I think sometimes if you just read the blog posts things can be perceived to be more cynical and negative than they actually are. One of the bigger things I struggle with is having inspiration to keep making new posts after having published thousands of them. As I read more about the history of communications & how monopolies come to control information it is easy for me to write about some of the parallels between that and the current market. It is much harder to have something new to write about marketing though, as so much of it is just a repeat of history.

Sure we can say everything is changing and hype everything new to try to pick up some links from people who want to cite quasi-research, but beyond understanding broad stroke philosophical stuff, a lot of what is new is either just hyping what is new for the sake of it or a regurgitation of what was old.

The Google <3's brands theme is something that has been playing out for about a half-decade now. And if you look at every other major established ad driven media model, brand is there as well. Other big components of the ad ecosystem?

  • Classifieds = local/mobile/deals
  • Retail = ecommerce/deals/payment processing
  • Channel segmentation = ad personalization & social media platforms on which you reveal your tastes & interests

What areas are Google pushing into? Those exact same areas. Just look at this 2007 slide from Hal Varian...

...or see what Larry Page is pushing on Google+

I think about our products in three separate categories.

First, there is search and our ads products, the core driver of revenue for the company. Nikesh and Susan are going to talk more about ads later in the call.

Next, we have products that are enjoying high consumer success--YouTube, Android and Chrome. We are investing in these in order to optimize their long-term success.

Then we have our new products--Google+ and Commerce and Local. We are investing in them to drive innovation and adoption.

The other hard bit with blogging is that of course sometimes there are some really delicious bits to SEO that most of the market is unaware of. If you blog about them there is a good chance the idea dies. Sometimes valuable tips are shared though, like in Rae's latest link building group interview.

Google Says "Let a TRILLION Subdomains Bloom"

Search is political.

Google has maintained that there were no exceptions to Panda & they couldn't provide personalized advice on it, but it turns out that if you can publicly position their "algorithm" as an abuse of power by a monopoly you will soon find 1:1 support coming to you.

The WSJ's Amir Efrati recently wrote:

In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.

We know what will happen from that first bit of advice, in terms of new subdomains: billions (make that trillions) served.

What Subdomains Will Soon Look Like. From Jonathunder on Wikipedia's McDonald's Page.

What are the "among other things"?

We have no idea.

All we know is that it has been close to half a year since Panda was implemented, and in spite of massive capital investments virtually nobody has recovered.

A few years back Matt Cutts stated Google treats subdomains more like subfolders. Except, apparently that only applies to some parts of "the algorithm" and not others.

My personal preference on subdomains vs. subdirectories is that I usually prefer the convenience of subdirectories for most of my content. A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example. If you’re a newer webmaster or SEO, I’d recommend using subdirectories until you start to feel pretty confident with the architecture of your site. At that point, you’ll be better equipped to make the right decision for your own site.

Even though subdirectories were the "preferred" default strategy, they are now the wrong strategy. What was once a "best practice" is now part of the problem, rather than part of the solution.
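As an aside, the mechanics of such a move are simple even if the strategy whiplash is maddening. Here is a minimal Python sketch (the domain, section name, and URLs are all hypothetical) of mapping a subdirectory's URLs onto a new subdomain, which you would then pair with 301 redirects from the old locations:

  from urllib.parse import urlparse

  def to_subdomain(url, section):
      # Rewrite /section/... URLs onto a section.example.com subdomain.
      p = urlparse(url)
      prefix = "/" + section + "/"
      if not p.path.startswith(prefix):
          return url  # not part of the moved section, leave it alone
      return p.scheme + "://" + section + "." + p.netloc + p.path[len(prefix) - 1:]

  # Hypothetical example: a Q&A section moved onto its own subdomain.
  print(to_subdomain("http://example.com/answers/some-question", "answers"))
  # -> http://answers.example.com/some-question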

Not long before Panda came out we were also told that we could leave it to GoogleBot to sort out duplicate content. A couple of examples here and here. In those videos (from as recently as March 2010) are quotes like:

  • "What we would typically do is pick what we think is the best copy of that page and keep it, and we would have the rest in our index but we wouldn't typically show it, so it is not the case that these other pages are penalized."
  • "Typically, even if it is consider duplicate content, because the pages can be essentially completely identical, you normally don't need to worry about it, because it is not like we cause a penalty to happen on those other pages. It is just that we don't try to show them."
  • I believe if you were to talk to our crawl and indexing team, they would normally say "look, let us crawl all the content & we will figure out what parts of the site are dupe (so which sub-trees are dupe) and we will combine that together."
  • I would really try to let Google crawl the pages and see if we could figure out the dupes on our own.

Now people are furiously rewriting content, noindexing, blocking with robots.txt, using subdomains, etc.
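If you want to sanity-check that kind of cleanup across your own pages, a rough sketch like the following works (standard-library Python only; the URL is a placeholder, and this is not how any particular tool does it). It fetches a page and reports whether it carries a meta robots noindex or a rel=canonical target:

  from html.parser import HTMLParser
  from urllib.request import urlopen

  class DirectiveParser(HTMLParser):
      """Collects the meta robots noindex flag and the rel=canonical URL."""
      def __init__(self):
          super().__init__()
          self.noindex = False
          self.canonical = None

      def handle_starttag(self, tag, attrs):
          a = {k: (v or "") for k, v in attrs}
          if tag == "meta" and a.get("name", "").lower() == "robots":
              if "noindex" in a.get("content", "").lower():
                  self.noindex = True
          elif tag == "link" and "canonical" in a.get("rel", "").lower():
              self.canonical = a.get("href")

  def audit(url):
      html = urlopen(url).read().decode("utf-8", errors="replace")
      parser = DirectiveParser()
      parser.feed(html)
      print(url, "| noindex:", parser.noindex, "| canonical:", parser.canonical)

  # audit("http://example.com/some-page/")  # hypothetical URL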

Google's advice is equally self-contradicting and self-serving. Worse yet, it is both reactive and backwards looking.

You follow best practices. You get torched for it. You are deciding how many employees to fire & if you should simply file bankruptcy and be done with it. In spite of constantly being led astray by Google, you look to them for further guidance and you are either told to sit & spin, or are given abstract pablum about "quality."

Everything that is now "the right solution" is the exact opposite of the "best practices" from last year.

And the truth is, this sort of shift is common: as soon as Google openly recommends something, people take it to the Nth degree & find ways to exploit it, which forces Google to change. So the big problem here is not just that Google gives precise answers where broader context would be helpful, but that they drastically change their algorithmic approach *without* updating their old suggestions (which are simply bad advice in the current marketplace).

It is why the distinction between a subdirectory and subdomain is both 100% arbitrary AND life changing.

Meanwhile select companies have direct access to top Google engineers to sort out problems, whereas the average webmaster is told to "sit and spin" and "increase quality."

The only ways to get clarity from Google on issues of importance are to:

  • ignore what Google suggests & test what actually works, OR
  • publicly position Google as a monopolist abusing their market position

Good to know!

How To Make Awesome Landing Pages for Local PPC

Am I the only one who gets a warm, fuzzy feeling from a well-crafted, super-targeted landing page? Right, I didn't think so :)

Landing pages tend to suck more often than they inspire.

Local landing pages are even worse in many cases, with hapless advertisers throwing Google AdWords coupons away by simply sending visitors to their home page for every single ad :(

Why Local PPC Matters

I firmly believe that local PPC (and SEO) is still an untapped resource for those looking to make client work a part of their business portfolio.

It's hard enough for a local business owner, especially one who has little experience in web marketing, to be expected to take a $75 AdWords coupon and magically turn it into a quality PPC campaign that lasts.

Google tried that mass approach to marketing and failed, and that failure has brought about a number of newer offerings.

Google recognizes the market for helping small businesses reach customers on the web, as do Groupon, Restaurant.com, and all their clones.

Local PPC, especially when used in conjunction with local SEO, can really make significant differences at the local business level and many of those businesses need help to do it.

Landing Page Quality Matters

I really dislike hitting a generic landing page after I make a really specific query. It's kind of like going to Disney and asking where Space Mountain is, only to be told that "we have lots of attractions sir, here is a map of the entire resort".

Generally speaking, I believe most people like being led around by the nose. People typically want things yesterday, so it's your job to give them exactly what they are looking for; after all, isn't that the point of search?

I think anyone who's worked with PPC campaigns can attest that targeted landing pages are high on the importance totem pole. Tailoring your landing pages to your target market matters a lot.

Solid Local PPC Landing Pages

Designing a good landing page for local queries is not hard at all. There are many different layouts you can use and you should test as many as is practicable, relative to your traffic levels, to understand which ones will work for you.
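If you want repeat visitors to see a consistent variation while you test, a deterministic split works well. Here is a minimal Python sketch; the visitor ID and template names are assumptions, not anyone's production setup:

  import hashlib

  LAYOUTS = ["layout_a.html", "layout_b.html"]  # hypothetical templates

  def assign_layout(visitor_id):
      # Hash the visitor ID so the same visitor always gets the same layout.
      digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
      return LAYOUTS[int(digest, 16) % len(LAYOUTS)]

  print(assign_layout("visitor-12345"))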

One area where local PPC is ripe for local business owners is insurance. I'm going to share a good example of a local lander below, but if you are doing local PPC, before you get to the landing page design, utilize Google's address links like this advertiser did (green arrow mine):

The above can help you stand out from the crowd where you are one of the few local advertisers, and it helps create that local experience right from the start.

I came across a couple of examples of good ways to tie local content into your landing page design.

Here's one from the insurance industry targeting terms around "wisconsin car insurance" followed by some tips on why I feel it's a good example (green arrows are mine):


Why is this a good example?

  • Use of the local modifier in key spots (doesn't appear stuffed)
  • The Wisconsin Badgers college football team's main color is red (not sure if that factored in, but it helps to tie stuff like that in)
  • Icon of the state in the main header
  • Good use of badges to display authority in the insurance niche
  • Lack of other navigation options, focused on the offer and the benefits of using their service
  • I might have bolded "we only do business in Wisconsin" though

The above example also shows a problem with many local insurance agents, though: quite a few do not have the ability to offer live quotes, so they have to use a contact form. In a web of instant gratification, that can be an issue.

Another good example comes from an area where local customization works well: travel!

This was for a search around the keyword "boston hotels". The imagery is great here. A couple of things I would have done: eliminate the left navigation and make the main content area more bullet-point oriented rather than a set of paragraphs.

Overall, they have a setup here where they can replicate the same approach across a bunch of different locations.

Not So Solid Local PPC Landing Pages

While searching for the above examples I also found some really untargeted approaches to local keywords. Here's an example of a brand just throwing out a really basic lander:

Absolutely no local customization at all. Good landing page basics though (clear CTA, clear benefits). Perhaps bigger brands don't need to, or fail to see the value in, making landing pages local-specific on local queries.

Liberty has no excuse not to, either. They have local offices in every state and could easily make their pages more local, but for whatever reason they choose not to.

In keeping with the same theme, I found this landing page for "boston hotels" to be underwhelming at best:

It's a list of information in an otherwise coldly designed table. Perhaps this works well enough; just give people the info, I suppose.

As a user, especially if I'm traveling, I'd like to see pictures, brief info about the area, why choose here over the hundreds of other providers, etc.

Quality Landing Page Foundations

Typically, I would recommend starting with a base layout, designing the page according to your market, and then layering on local criteria. If you look at examples of good landing pages, the layouts themselves don't change all that much.

Some local elements you can include are:

  • Local imagery
  • Locations and hours
  • Integrated map with directions
  • Proximity to local landmarks (good for things like hotels, bed and breakfasts, etc)
  • Local phone number and contact information
  • Membership in any local group (rotary club logo, Better Business Bureau, chamber of commerce logo, logos of local charities or events you are involved with, etc)

As discussed before, design should also speak to your audience (more tech savvy or less tech savvy, age, gender, market, and so on).

Consider these 2 examples of landing pages for online invoicing. This is a market where design should be fresh, modern, "web X.X" if you will (like market leader Freshbooks).

Here's a win for good landing page design:

I really like the free sign-up bar at the bottom. Your call to action is always available whether you have to scroll or not. Good use of headlines, solid list of benefits, and super-easy sign up.

Compare that to something like Quickbooks which requires quite a bit of info to get started:

Then you have another example of (usually) what not to do. Too many navigation options, run-on paragraphs, a lack of bullet points, and an outdated design for this market, in my opinion:

So the layouts don't change drastically and I'd recommend coming up with a layout first, a base design, and base copy. Then you can easily turn any landing page into a targeted, local page pretty quickly with small design and copy tweaks.
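To make that base-plus-local-layer idea concrete, here is a minimal Python sketch; every name, value, and line of copy below is hypothetical:

  # One shared base layout, with slots for the local components.
  BASE_TEMPLATE = (
      "<h1>{headline}</h1>"
      "<p>Call us: {phone}</p>"
      "<img src='{hero_image}' alt='{city} skyline'>"
      "<p>Minutes from {landmark}.</p>"
  )

  # Per-location components; add a dict per city you target.
  LOCAL_COMPONENTS = {
      "boston": {
          "city": "Boston",
          "headline": "Boston Hotels from $99/night",
          "phone": "(617) 555-0123",
          "hero_image": "/img/boston-skyline.jpg",
          "landmark": "Fenway Park",
      },
  }

  def render_local_page(location):
      # Merge the local components into the shared base layout.
      return BASE_TEMPLATE.format(**LOCAL_COMPONENTS[location])

  print(render_local_page("boston"))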

Landing Page Resources

A few places I have bookmarked for landing page references are:

A couple of tools to help you with cranking out solid landing pages would be:

  • Unbounce (hosted)
  • Premise (WordPress plugin from Copyblogger which comes with a ton of custom graphics and built-in copywriting advice + tips)

It's not that difficult to create awesome, locally targeted landing pages. It's a really simple process:

  • Check out the resources linked to above and make a swipe file of nicely designed landing pages (design and layout)
  • Incorporate the base layout and copy layout (headings, graphics, CTA's, etc) into a wireframe
  • Minimize distractions (focus on getting the clicker to complete the desired task)
  • Get the UI and graphics in order
  • Think about all the ways you can sprinkle in a local feel to the page, like we talked about above (colors, locations, hours, local connections, imagery, and so on)
  • Add in the local components to your base page

What are some of your best practices when putting together landing pages for local PPC campaigns or landing page tips in general?

What's In Your SEO Toolbox?

The SEO tool space is a pretty crowded (and growing!) one. Tools are helpful, there is no doubt about that. However, tools are generally only as good as the person using them. We'd love to know what tools you use and why, so please let us know in the comments after the post :)

I am not "house" handy by any means, I can barely hang a picture frame straight. So if you gave me the best construction tools in the world I'd still make extra holes and screw something up.

Even if I managed to get the picture hung correctly, it certainly would not look professional.

You can buy as many guides, tools, and accessories as you like, but in the end it is your skill that determines the success or failure of a project (building a deck or building a website). Skills can be honed, but tools do not overcome a lack of skill.

SEO Tool Fatigue

SEO tool fatigue is a real issue for some folks. Some people spend a good chunk of their productive time testing or trying out new tools, or even using so many tools that their implementation and interpretation of the data suffers a great deal. One tool says this, another says that, and yet another says one or the other, or both, or neither :)

The first thing to realize is that most of the data from tools (excluding analytics and such) are basically estimates of estimates, or come directly from Google's various estimation-type tools (Keyword Tool, Trends, Insights, and so on), or are driven off what the tool builder thinks are important or reliable metrics to build your research on (there tend to be some swings and misses with that type of approach).

You are not going to fail miserably if you decide not to do days and days of keyword research with multiple tools and then spend more days comparing different datasets. Research is important, but there is a limit.

Picking a Core Set of Tools

From a cost and time standpoint I've found it really helpful to pick a core set of tools and stick with them rather than bouncing around to get an extra feature or two.

It's good to peek around from time to time, but bouncing between mostly similar tools can lead to a "needle in the haystack" approach, where you spend most of your time digging a time-suck hole rather than building websites and adjusting strategies based on analytics and/or AdWords data.

Again, research is important, but there is a sweet spot, and it's a good idea to get some kind of system down so you can focus on doing "enough" research without harming the time it takes you to get sites up and running.

Evaluating Tools

I'm going to highlight some of the tools I've used below, most of which are considered to be market leaders. I'll point out why I use certain tools and why I don't use others (yet), and I encourage anyone who's dealing with tool overload to do the same for the tools they use.

The areas I'll be focusing on are:

  • Keyword Research
  • On Page Criteria
  • Rank Checkers
  • Competitive Link Research Tools
  • Link Monitoring

Keyword Research

There are many keyword research tools that pull data from the sources listed below (like our free keyword research tool, which pulls from Wordtracker).

These tools use their own databases (although in Wordtracker you can ping Google's tool as well).

I use all the Google tools, Microsoft's Ad Intelligence, and Wordtracker, as well as the SeoBook Keyword Tool. Sometimes I use Wordtracker just via our keyword research tool and sometimes I use Wordtracker's web interface (I like being able to store stuff in there).

Our keyword tool also links into most of the sources listed above. A big reason I like our keyword research tool is that it makes it super easy to hit the major data points I want on a particular keyword from one location.

Ad Intelligence is solid as (Microsoft claims) they incorporate actual search data into their results, rather than estimating like Google does.

I should also note that I mainly use Trends and Insights for comparing similar keywords and looking at locality (in addition to the history of keywords). Sometimes you run across really similar keywords (car, auto) and it can help to know which one is most relevant to your campaign.

On-Page Optimization

For the on page stuff I'm mainly concerned with large scale, high level overviews.

I use our toolbar for specific on-page stuff, but when I'm looking to diagnose internal linking problems (not maximizing internal link flow, broken links, HTTP status codes, and so on) or issues with title tags and meta descriptions (missing, too short, too long, or duplicated), I use a couple of different tools.

Since I'm on a Mac and I don't care to run Windows for anything other than testing, I use the three tools listed below, which all work on a Mac (though I don't use them in every situation).

I use Screaming Frog's SEO Spider pretty frequently as well as Peacock's Integrity. Integrity is a broken link checker while SEO Spider incorporates other SEO related features (title tags, H1/H2's, anchor text, and a ton of other important elements).

WebSite Auditor offers most, if not all, of what SEO Spider does, but also incorporates white-label reporting, Google PageRank, Yahoo! & Google link popularity, cache dates, and so on.

For some of those features in WebSite Auditor you might want to either outsource the captcha inputting or use their anti-captcha service so you don't have to sit there for hours entering captchas.

In my regular workflow, SEO Spider and Integrity get used a lot, and WebSite Auditor comes into play for some of those other metrics and for white-label reporting.
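For flavor, here is a rough Python sketch (standard library only; not how any of those tools work internally) of the kind of title tag check they automate. The 30/65 character thresholds are rough rules of thumb, not gospel:

  from html.parser import HTMLParser
  from urllib.request import urlopen

  class TitleParser(HTMLParser):
      """Accumulates the text inside the page's <title> tag."""
      def __init__(self):
          super().__init__()
          self.in_title = False
          self.title = ""

      def handle_starttag(self, tag, attrs):
          if tag == "title":
              self.in_title = True

      def handle_endtag(self, tag):
          if tag == "title":
              self.in_title = False

      def handle_data(self, data):
          if self.in_title:
              self.title += data

  def audit_title(url):
      resp = urlopen(url)
      parser = TitleParser()
      parser.feed(resp.read().decode("utf-8", errors="replace"))
      title = parser.title.strip()
      if not title:
          issue = "missing title"
      elif len(title) < 30:
          issue = "title may be too short"
      elif len(title) > 65:
          issue = "title may be too long"
      else:
          issue = "ok"
      print(resp.getcode(), url, "->", repr(title), "|", issue)

  # audit_title("http://example.com/")  # hypothetical URL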

Rank Checking

Here's a crowded space! The right choice really depends on your needs. Are you a solo SEO who runs multiple sites? Do you run your own sites as well as client sites? Or are you a client-only shop?

Here are some of the main players in this space:

Even if you have reporting needs, you can still do a lot for free with our free rank checking tool (scheduled reports, stored reports, multiple search engines, and so on) and Excel or another spreadsheet program like OpenOffice.org or Google Docs. Some good tips on creating ranking charts with Excel can be found here.
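If you would rather script the charts than build them by hand in a spreadsheet, here is a minimal Python sketch. It assumes matplotlib is installed and a hypothetical CSV export with date, keyword, and position columns:

  import csv
  from collections import defaultdict
  import matplotlib.pyplot as plt

  # Group (date, position) points by keyword; ISO dates sort correctly.
  series = defaultdict(list)
  with open("rankings.csv", newline="") as f:
      for row in csv.DictReader(f):
          series[row["keyword"]].append((row["date"], int(row["position"])))

  for keyword, points in series.items():
      dates, positions = zip(*sorted(points))
      plt.plot(dates, positions, marker="o", label=keyword)

  plt.gca().invert_yaxis()  # rank #1 belongs at the top of the chart
  plt.ylabel("Ranking position")
  plt.legend()
  plt.savefig("rankings.png")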

There are a couple of differences between the software players, Advanced Web Ranking and Link Assistant's Rank Tracker (both have multiple levels, so it's wise to check the features of both to see if you need the higher end version or if the lower priced versions will work for you). Some of the key differences are:

  • Rank Tracker integrates with Google Analytics
  • Advanced Web Ranking has a variety of ways to track local rankings, including maps and a local preview engine
  • Advanced Web Ranking has more, easier to customize reporting options
  • I find that the interface with Rank Tracker is much easier to work with
  • If all you are looking for is rank checking, then Link Assistant is a bit cheaper overall (comparing enterprise versions of both), though AWR has more local options at its higher price point. You can see AWR's pricing here and Link Assistant's here. Note, it's worthwhile to check out maintenance pricing as well (Link Assistant and AWR)
  • AWR lets you assign a proxy per project, which can be really helpful if you have clients all over the map.
  • AWR automatically pulls in the top ten sites for a keyword, along with their last position compared to their current one, and lets you add any of those sites to your tracking (at any point) with all the historical data saved and updated within your account.

One tip with software tools is to run them on a different machine, perhaps even behind an IP from a private VPN service like WiTopia, and to think about utilizing multiple proxies from a service like Trusted Proxies and/or using an anti-captcha service with Link Assistant's tools.

The idea is to not get your IP banned and to let you continue to work as normal on your main machine while another machine is handling the automated queries. If you don't want to fuss with that, you might want to try a cloud app.
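As a bare-bones sketch of that setup (Python standard library; the proxy addresses and URL are placeholders), rotating each automated query through a different proxy looks like this:

  import itertools
  import urllib.request

  PROXIES = itertools.cycle([
      "http://user:pass@proxy1.example.com:8080",
      "http://user:pass@proxy2.example.com:8080",
  ])

  def fetch(url):
      proxy = next(PROXIES)  # use the next proxy in the rotation
      opener = urllib.request.build_opener(
          urllib.request.ProxyHandler({"http": proxy, "https": proxy})
      )
      with opener.open(url, timeout=15) as resp:
          return resp.read().decode("utf-8", errors="replace")

  # fetch("http://example.com/")  # hypothetical URL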

The Cloud and Scalability

The 3 main services, that I've used anyway, come from Raven, SeoMoz, and Authority Labs. Authority Labs now powers Raven's SERP tracker too. My biggest concern with cloud-based rank checkers is that the keyword volume can be (understandably) limited. Authority Labs has unlimited checking at $450/month, but the other two have limits.

Let's just look at the highest plans for a second: Moz allows 30 campaigns and a total of 3,500 keywords, while Raven's highest plan allows unlimited domains and 2,500 keywords total (and 200 competitors).

If scalability is a concern for you then you might be better off with software solutions. Once you start running multiple sites or are responsible for reporting on multiple sites (and you are working the long tail and your analytics) then you can see how restrictive this could become.

Of course, comparing just the rank checking options of a tool set like Raven and Moz (which both have other useful tools, Raven more so for full on campaign management) doesn't do the pricing justice. So what you could do is still use the many other tools available from each company and use a software solution once your rank checking scales beyond what they offer.

Both Moz and Raven integrate with Google Analytics, and Raven's campaign integration with GA is quite nice too (beyond just rankings).

Link Research

Free tools like Yahoo!'s Site Explorer, search query tools like Solo SEO's link search tool, and Blekko's link data are nice, but at some point in your SEO career you might have to get on board with a more advanced link research tool (or tools) to get the data you need to compete in competitive SERPs.

A good chunk of software-based solutions pull link data from search engines, but if you want a more (way more) comprehensive view of a competing site's link profile and link history, you do have a few options.

Majestic was originally known for having a much deeper database, with the caveat that they keep a lot of decayed links, and their UI wasn't overly impressive. Well, as noted in a recent blog post (which includes 20% off coupons) on Majestic's new tools, most of that isn't the case anymore. Though, I still feel Open Site Explorer has a better and smoother UI.

Advanced Link Manager's strength lies in their ongoing link management and reporting but they also have some decent link research tools built in and they can connect to SeoMoz's API to gather link data, so that kind of sets them apart from those other software-based solutions.

Again, Moz offers other tools as well so it's hard to really compare price points. What I like about OSE is that you can get a really solid, quick overview of the anchor text profile of a competing site. Also, you get unlimited look ups and up to 10k links per query on their pro plan (in addition to other Moz tools). You can get a 30 day free trial of all the Moz tools as of this writing.

Majestic's New Tools

Majestic, now with their new site explorer and fresh index, rival OSE's UI and freshness a bit but there still are limits on usage. You can check out Majestic's pricing here and don't forget about the 20% off coupon mentioned here.

Typically I like to use both Majestic and OSE. I like the new tools Majestic has come out with and their historical data is solid. OSE, for me, is great for getting some of a site's top metrics quickly (anchor text, top pages, etc).

If I had to pick one, I'd go with Majestic mostly because Moz gives a decent amount of data away for free (being a registered user) and because Majestic has really good historical + deeper data.

Link Management

Building links, especially if you have a team, can be a cumbersome process unless you have collaborative tools to work with. Even if you operate mostly on your own, you might want to track links you've earned or built directly.

Every once in a while I like to download a report from Majestic SEO and add any links that are not yet in my tracking program. Some people like to just track paid or exchanged links and let the natural ones come and go.
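The merge itself is easy to script. Here is a minimal Python sketch, assuming two hypothetical CSV files that each have a url column:

  import csv

  def load_urls(path, column="url"):
      # Pull one column out of a CSV export into a set for fast diffing.
      with open(path, newline="") as f:
          return {row[column] for row in csv.DictReader(f)}

  tracked = load_urls("tracked_links.csv")      # what's already in the program
  exported = load_urls("majestic_export.csv")   # the freshly downloaded report

  for url in sorted(exported - tracked):        # backlinks not yet tracked
      print("new backlink:", url)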

There are a couple of tools out there that I've used, and one I haven't used but have heard good things about from reputable sources, so I'll include it here.

Raven's Link Manager is probably their flagship tool. It has received really high praise from experienced SEOs and is easy to use. You can easily add links, assign them to employees, and let Raven worry about the automatic checking and reporting in case something changes with a link.

Advanced Link Manager has many features built in but you can use it just for tracking links you want to track by uploading the links into the program. It's software based and you can set it to run whenever you'd like, automatically.

I personally haven't used Buzzstream, but reputable people have told me it is a solid program, and they have a free 14-day trial here. It's a dedicated link building and management tool (it also has PR and social media tools), so if you are looking for a specific tool to fill that need, this one might be worth a shot.

If you don't have a ton of links to manage or a team to manage, you might be just fine with an Excel spreadsheet or a Google Doc. To me, it's just one more thing to think about and Raven and Buzzstream have low priced plans if you don't need enterprise-level storage.

What's in Your Toolbox?

So there's an overview of what I feel are the best SEO tools out there, including the ones I use frequently (or infrequently).

I'd love to know what you are using and why (or why not?) :)