eCommerce SEO? Google AdWords or No Soup for You

Affiliates Are a Dying Breed

Being an ecommerce affiliate keeps getting harder & harder unless you have a strong brand and/or are selling things with a complex sales cycle.

Portable air conditioners are a pretty niche category, but when I look at it I simply don't see any opportunity on the SEO front unless you take on the significant risk of carrying inventory & drop hundreds of thousands to millions of Dollars on branding.

The Corporate, The Bad & The Ugly

Head keyword: note the brand navigation, the extended AdWords ads & the product search results that drive the traditional search results below the fold

Tail keywords are every bit as ugly, with Google product ads sometimes coming in inline, further driving down the organic search results.

And it is even nastier when Google Instant is extended: in 10% of browsers only a single organic search result is visible!

Corporate, Corporate, Corporate

As ugly as that looks, not only do the larger merchants have an advantage in AdWords (running their product ads on a CPA basis while smaller merchants have to pay on a CPC basis), Google Product Search (more reviews), and the inline search navigation options (featuring the same brands yet again), but most of the organic results (which are generally below the fold) are also the same big brands, after the Panda update gave them a boost while torching their smaller competitors.

The Chicken vs Egg Problem of Scale

For online pure-plays (outside of Amazon.com, eBay & a few others) the "no opportunity anywhere" problem in search also harms the ability to be competitive on pricing, because without the ability to rank you don't have the leverage over the supply chain that the big box stores get from winning everywhere in the SERP & having offline distribution. There is little opportunity to organically grow to scale over time unless you enter the market with some point of leverage (like creating the product yourself right on through to marketing it to consumers), sell something totally different from what is already available in the market (and hope it doesn't get cloned), buy out an existing company that went bankrupt, and/or build significant non-search distribution channels first.

I suppose the last option on that front would be to promote your stuff on a large platform that is already doing well in Google (say eBay, Amazon, or Facebook), but doing that gives you limited control over the customer experience & forces you to keep chasing new one-off sales rather than building & deepening relationships with customers.

Killing Off Diversity

As Google collects more usage data (mobile is already 12% of search) these big box stores will have an even bigger moat between them & smaller competitors.

The "big box stores only" search results also create an experience that is bland & uniform. At first glance things may look different, but it is the same type of sites again and again: a lot of the brands cross hire, have similar "politically correct" cultures & have roughly similar customer experience sets. When you buy from Walmart you are not going to get that caring email from a founder offering hands on tips & advice. Scale requires homoginization, which generally kills of personality & differentiation.

Killing Off Innovation

The problem with the "be huge or die" approach to search is that most legitimate economic innovation comes from smaller players that challenge the existing power structure. Set the barrier to entry too high and you might have less spam to fight, but you certainly will have less economic innovation & more of the would-be innovators will be stuck working dead-end jobs at dysfunctional corporations.

Now You See it, Now You Don't

Most people can't see what they are missing out on so they won't know, but (as Tim Wu put it so eloquently in The Master Switch) the same was true for AT&T when it held back innovations like the answering machine & what ultimately came to be the WWW. What sort of price do you put on email taking a decade longer to launch? How many other disruptive changes built off of incremental improvements will never appear because their creators simply weren't large & corporate enough to compete on Google's web?

The web was great because it offered something different. Unfortunately you have to search using something other than Google to find it.

Sustainability

"When plunder becomes a way of life for a group of men living together in society, they create for themselves in the course of time a legal system that authorizes it and a moral code that glorifies it." - Frederic Bastiat

Business Ethics vs Sustainability

The concept of business ethics is usually a self-serving approach to marketing.

Some people would rather make money dishonestly than honestly, getting satisfaction out of screwing people over (hi Andy), but gray is a broad spectrum. To focus on "ethics" is to allow relative justifications & miss the broader issues.

Likewise, additional layers of complexity promoting more rules & incentives over wisdom ultimately destroy the skills and will to develop legitimate leadership.

Rather than adding additional layers of complexity, what we should ask is:

"Is what we are doing sustainable, or is it not?"

That which can not be sustained won't.

Eradicating Poverty Opportunity

As the world has grown richer poverty should eventually disappear; however, it isn't disappearing. The public claims from the US of supporting & spreading democracy are inconsistent with blocking the ability of a Haitian to earn a livable wage. "The Obama administration pressured Haiti not to raise its minimum wage to 61 cents an hour."

How is THAT for left-leaning altruistic behavior?

While funding a large imperial army that blows trillions of Dollars on fraudulent wars (and jails those who tell the truth) the government also helps bankers grift a large % of GDP through outright malicious fraud (which is never prosecuted). With that much corruption embedded into the core of the political system, you can only really have "democracy" over here so long as you fund exploitation elsewhere.

However, even that system of imperialism seems to be falling apart: Dollar depreciation not only caused the overthrow of many Middle East governments, but inequality is also sharply rising in the US. The "recovery" happened for the investor class while we collectively as a society throw more people under the bus daily:

The structural problem is that way too much of our productivity (and all of our increased productivity in the past 3 decades) is going to the investor class. Any solution to our structural problems must, therefore, contain an element of redistributing wealth from rich to poor and middle class people.

Wearing Political Blinders?

One thing I like about SEO is that it forces you to view systems & attempt to understand ecosystems. The online market is so competitive that if you don't learn many valuable lessons about society it can & will wipe you out.

Some market purists might cringe at the word "redistribution" but ultimately covering the losses of the banking class was exactly that. Except it worked counter to the free-market ideology those same bankers preach to everyone else.

I have been called right leaning & I have been called left leaning. But I see both "labels" as missing the broader point of what happens in reality. When a right wing president was in charge the bankers were allowed to commit endless crime without punishment. Based on disgust toward that fraud a left wing president was elected on the promise of hope & change. He then proceeded to screw the general public while fellating the banking class. He is a sell out.

Money > Human Life

One thing I question the legitimacy of is why capital gains are taxed at a lower rate than normal wages. Is a person's blood, sweat & tears worth less than compounding interest on money? There is some theory that the capital markets are more efficient if they are taxed lower & that makes the rest of society more efficient, but that theory fell flat on its face when the US government shoveled trillions of Dollars into the banks & none of the bankers went to jail (even though some of them have stated before congress that they knew 80% of their "product" was defective).

Go Long Fraud, The Government Is!

AIG was nearly bankrupted from insuring mortgage-backed securities that were misrepresented (which was, of course, yet again, fraud & thus illegal). AIG, facing bankruptcy, was not allowed access to a taxpayer-funded bailout unless it waived its rights to sue the banking criminals who destroyed the company. What dirtbag in government decided that AIG (and taxpayers) should have their rights waived so that money can be funneled into criminal banking enterprises? Does it matter that many government "regulators" were former employees of AIG's counterparties?

If a certain class of individual is inherently destabilizing, then why (other than fraud) do we keep "doubling down" on hoping that group of people will eventually change their approach (even as we shield them from the negative consequences of their own actions by making everyone else foot the bill)?

We block them off from the market feedback mechanisms we pride ourselves on pushing on everyone else. Yet if their failure causes anyone else to fail, we claim that it was a combination of "free market forces" and an individual's lack of performance that is to blame for their own plight.

Sure some industries get disrupted, but when one industry goes down another rises up. Not with the banking fraud though, it is just an across-the-board kick in the nuts to almost every non-banker at the exact same time. Since our monetary system is a debt-based system, one person's savings is backed by another person's debt, thus most people are *required* to live pretty close to paycheck to paycheck. Yet it is considered totally reasonable for the Federal Reserve to destroy their store of wealth AND their ability to earn at the exact same time:

Deflation and outsourced jobs are creating a loss of purchasing power at the bottom and middle income strata. Housing wealth destruction and job instability are exacerbating this.

MEANWHILE in monetary land….inflation is being injected into the system by the Fed who lend money to rich people who then buy commodities, driving up prices, who buy bonds which lower savings rates and borrowing costs, and stocks which drives down dividend yields and increases cap gains.

That hand is *anything* but invisible. Adam Smith wouldn't call that a "free" market & anyone who does is one or more of the following:

  • ignorant
  • a liar

Since those who lost trillions of Dollars through intentional malicious fraud can't accept the "free market" forces that they suggest everyone else should be subject to, the only solution is "more debt, please."

The same banking criminals who caused the mess are paid interest to loan the government its own money, so that the banks may "earn" their way out of the bankruptcies they earned through their fraudulent activities.

2 Groups of Citizens, 2 Books of Law

Some bankers were caught bid-rigging to rip off the government. Those with whom they committed bribery are now in jail. The bankers, once again, have nobody in jail, even though they were engaged in the exact same crimes!

Those same bankers got a slap on the wrist for laundering hundreds of billions of Dollars in drug money & then committed perjury before the courts by manufacturing robosigned documents to replace the ones they had to destroy to hide their mortgage fraud.

If we are intellectually honest, how is it that a person can go to jail for using a fairly soft drug like weed when the people who push hundreds of billions of Dollars of drug money (tied to much harder drugs & global violence) go unpunished?

But the drug money isn't enough.

Now they are trying to figure out how to steal the Social Security funds that were already withheld from your paycheck into the abyss. The nationwide decline and austerity are already so obvious that the Oakland police publicly issued criminals a laundry list of crimes that they won't even bother responding to:

  • burglary
  • theft
  • embezzlement
  • grand theft
  • identity theft
  • required to register as sex or arson offender
  • dump waste or offensive matter
  • pass fictitious check
  • stolen license plate
  • extortion
  • vandalism
  • etc etc etc

If Markets Are Efficient...

There is no commodity in the world that is as commoditized as money is. Most of it doesn't even require actual printing, but is just digits on a computer screen...much like the digits I am typing right now. And yet the banking criminals have captured so much of governance that it is not uncommon to see them take 30% of corporate profits (and then shift the subsequent mirroring losses onto the public at large).

To claim that those huge profit margins (during fraud-driven bubbles) are reasonable AND that they deserved to be bailed out whenever they get things wrong is intellectually dishonest. If the taxpayer is capturing 100% of the downside risk, then why (other than fraud) are these banking criminals paid a single cent more than a person enlisted in the military?

Global Instability

Internationally we are at least as screwed up as we are domestically. Does outsourcing make the world more efficient? In many cases absolutely. But the global imbalances create unsustainable current account issues.

Can we look to technology to solve the problems?

Maybe not!

Even The White Horses Have Bloody Hooves

Apple is one of the most profitable companies in the world. At the other end of that supply chain is a factory where harsh chemicals give employees nerve damage & people keep committing suicide. And as soon as those people get any aspirations at all, suddenly even the 3rd world slave wage labor is too expensive & over the next few years a million people working for Foxconn will lose their jobs to robots.

If even slave wage labor isn't competitive with robots then what does that mean for labor in countries with a higher standard of living, where the labor is far more expensive?

Eric Schmidt famously stated that laws are written by lobbyists & then Google quickly hired a dozen lobbying firms while significantly increasing their political donations. The lobbyist comment was a sneer at the system, until Google followed suit and started buying the political process. They make the same claims toward patents, yet they are investing billions of Dollars there as well.

Where Legitimate Innovation Comes From

Historically most valuable economic innovation (the legitimate kind, not the fraudulent financial engineering of the past decade) comes from individuals & smaller groups who challenge existing industries & powerful companies with creative destruction by making markets more efficient. The AT&T monopoly did a lot of great research, but anything that they thought could cannibalize their current business model did not see the light of day - sometimes for decades.

If we are moving into an ideas & technology driven economy (and that is where most of our growth will come from) then diversity should be cherished & encouraged. Unfortunately, the software patent wars lead to smaller developers being driven out of the ecosystem & innovation is slowed while the 800 pound gorillas & patent trolls engage in an arms race.

These sorts of "legislate away the future" efforts that aim to keep existing powers at the top of the food chain are literally everywhere - from the food supply right on through to a proposed law that would harm the ability of whistleblowers to even highlight the pervasive, epidemic corruption.

And Now On the Auction Block...

"US congressional parties now post prices for key slots in the lawmaking process."

The system of fraud is so unsustainable that the US got a credit downgrade. The criminal mortgage probes are fizzling out, but Fannie Mae needs more money! Those responsible for the downgrade (who committed fraud & then sold their junk to the government at above market values) are warning the US about the risk of it losing its worldwide reserve currency status. To make the bankers whole, everyone else must lose, at least up until that no longer works.

Exponential Growth in a Finite World?

We have a debt-based monetary system which requires exponential growth to prevent collapse, an oligarchy-based political system that punishes innovation & success while rewarding failure, and it exists in a finite world where we are reaching the peak production in key base-level economic inputs: like energy.

The same criminal banking organizations that forced you to eat their debts are now hoarding the physical commodities you need to live. And yet, our worldwide finance-first economy (where we put derivatives & abstractions before reality) is so dysfunctional that even the banks are engaging in massive layoffs.

The parasite ate the host & now the host has nothing left to give.

Manipulation vs Fixing The Problems

It is thus no surprise that the government invests in manipulating the perception of reality (rather than fixing actual problems) and that society has an issue with mass psychosis.

That which can not be sustained won't.

Dylan Ratigan's Mad as Hell on msnbc

Google Rips Rip Off Report From The Search Results

We live in a culture where it is far more profitable to solve symptoms than it is to solve problems. As such, the disappearance of ripoffreport.com from Google's index probably has retainer-based reputation management firms like reputation.com singing the blues.

Ed Magedson, the owner of Rip Off Report, has been charged with RICO in the past and managed to come through unscathed, but he has never tackled an opponent as media savvy or powerful as Google.

He is pretty savvy with the legal system & the media, so it will be fun to watch how he responds to this one, as his business model relied on top Google rankings:

Attorney: So what I've gathered from all of your testimony, Dickson, is that Ed Magedson has indirectly told you that he is responsible for making posts about companies. He will make these posts.

Mr. Woodard: Yes.

Attorney: And then he will manipulate the search engines; is that true?

Mr. Woodard: No question about the search engines. That's where the money is made.

A new take on Will it Blend: can a vampire suck blood from another vampire?

Vampires have often found it advantageous to maintain a hidden presence in humanity’s most powerful institutions. In the 1600s, it was the Catholic church, and today, as you all know, it’s Google, Fox News.

Update: adding intrigue to the situation, it looks like the site was removed due to a request inside Google Webmaster Tools, but the folks from ROR claimed they didn't make the removal request: "Ripoff Report did not intentionally request Google to delist the website, and we are still investigating what occurred."

Update 2: Looks like they are back ranking in Google again. Perhaps someone found yet another loophole with Google's URL removal feature.

Increase Your Profits with MixRank's New Competitive Research Tool

Not many spy tools out there do what MixRank does. MixRank is a tool that gives you the ability to peek into the contextual and display ad campaigns of sites advertising with Google AdSense.

Uncovering successful advertising on the AdSense network can give you all sorts of ideas on how to increase your site's profitability.

Not only can you uncover profitable AdSense ad campaigns but you can pick off AdSense publisher sites and leverage competitive research data off of those domains to help with your SEO campaign.

With MixRank you can own your competitors in the following ways:

  • Obtain the domains your competitor's ads are served on
  • Swipe your competitor's ad copy
  • Watch ad trends to target your competition's most profitable campaigns and combinations of ads

Another great thing about MixRank is how easy to use it is. Let's go step by step and see how powerful MixRank really is!

Step 1: Pick a Competitor to Research

MixRank makes it super easy to get started. Just start typing in a domain name and you'll see a suggested list of names along with the number of ads available:

Here we are going to take a look at Groupon as we consider building a niche deals site. Keep in mind that MixRank is currently accepting free accounts while in beta, so over time we can expect their portfolio to grow and grow.

MixRank breaks their tool down into 2 core parts:

  • Ads (text and display)
  • Traffic Sources

We'll cover all the options for both parts of the MixRank tool in the following sections.

Step 2: Working with Ad Data (Text and Display)

Let's start with text ad options. So with text ads you have 3 areas to look at:

  • Active Ads
  • Ad Reach
  • Best Performers

Here's a look at the interface:

As you can see, it is really simple to switch between different ad research options. Also, you can export all the results at any time.

The image above is for "Active Ads". In the active ads tab you'll get the following data points (all sortable):

  • Publishers - maximum number of AdSense publishers running that particular ad
  • Last Seen - last known date the ad was seen by MixRank
  • Frequency - number of publisher sites on which the ad appeared
  • Avg. Position - average position of the ad inside AdSense blocks

Here you can export the data to manipulate in Excel or do some sorting inside of MixRank to find the ads earning the lion's share of the traffic.
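
If you'd rather slice the exported data outside of the MixRank interface, a small pandas sketch like this works too. The file name and column names below are assumptions based on the data points listed above (including a hypothetical "Ad Copy" column), so adjust them to whatever the actual export contains:

```python
# Sketch only: sorts a hypothetical CSV export of the Active Ads report to
# surface the ads an advertiser runs most widely, in the best positions.
import pandas as pd

ads = pd.read_csv("mixrank_active_ads.csv")  # hypothetical export file name

top_ads = ads.sort_values(
    by=["Publishers", "Avg. Position"],
    ascending=[False, True],  # more publishers is better, a lower position is better
).head(20)

print(top_ads[["Ad Copy", "Publishers", "Frequency", "Avg. Position"]])
```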

The Ad Reach tab shows up to 4 ads at a time and compares the publisher trends for those ads. To spread the love around let's look at a couple ads from LivingSocial.Com:

Here you can see that one ad crashed and fell more in line with an existing ad. You can compare up to 4 ads at once to get an idea of what kind of ad copy is or might be working best for this advertiser.

The Best Performers section compares, again, up to 4 ads at a time (use the arrows to move on to the next set) which have recently taken off across the network.

Needless to say, this report can give you ideas for new ad approaches and maybe even new products/markets to consider advertising on.

If the advertiser is running Banner Ads you can see those as well:

With Banner Ads, MixRank groups them by size and you can see all of them by clicking on the appropriate size link.

When you click on a banner ad you'll see this:

This is a good way to get ideas on which banner ads are sticking for your competitors. Also, it's a great way to get ideas of how to design your ads too. A little inspiration goes a long way :)

So that's how you work with the Ads option inside of MixRank. One thing I dig about MixRank is that it's so easy to use, the data is easy to understand and work with, and it does its intended job very well (ok, ok so 3 things!)

Step 3: Traffic Sources

Now that you have an idea of what type of text ads and banner ads are effective for your competition, it's time to move into what sites are likely the most profitable to advertise on.

MixRank gives you the following options with traffic sources:

  • Traffic Sources - domains being advertised on, last date when the ad was seen, average ad position and number of days seen over the last month
  • Reach - total number of publishers the advertiser is running ads on

The traffic sources tab shows:

  • Uniques - estimated number of unique visitors based on search traffic estimates
  • Last Seen - last date MixRank saw the ad
  • Days Seen - number of days over the last month MixRank saw the ad
  • Average Position - average position in the AdSense Block

A winning combination here would be recent last seen dates and a high number under the Days Seen category. This would mean the advertiser has been and still is running ads on the domain, indicating that it may be a profitable spot for them to be in.

You can also pull these domains into a competitive research tool like our Competitive Research Tool, SemRush, SpyFu, or KeywordSpy and find potential keywords you can add to your own SEO campaign.

Another tip here would be to target these domains as possible link acquisition targets for your link building campaign.

The Reach option is pretty self-explanatory; it shows the total number of publishers the advertiser is showing up on:

Another good way to evaluate traffic sources is to view the average position (remember, all the metrics are sortable). A high average position will confirm that the ads are pretty well targeted to the content of that particular domain.

Combine the high average position with Days Seen/Last Seen and you've got some well-targeted publishers. You can export all the data to Excel and apply multiple filters to bring the cream of the crop to the top of your ad campaign planning, as in the sketch below.
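
Here is a minimal version of that kind of filtering, again assuming a CSV export with hypothetical column names (Domain, Last Seen, Days Seen, Avg. Position):

```python
# Sketch only: shortlists publisher domains where the advertiser's ads have run
# recently, persistently, and near the top of the AdSense block.
import pandas as pd

sources = pd.read_csv("mixrank_traffic_sources.csv")  # hypothetical export file
sources["Last Seen"] = pd.to_datetime(sources["Last Seen"])

recent = sources["Last Seen"] >= pd.Timestamp.now() - pd.Timedelta(days=7)
sticky = sources["Days Seen"] >= 20            # ran most of the last month
well_targeted = sources["Avg. Position"] <= 2  # near the top of the ad block

shortlist = sources[recent & sticky & well_targeted].sort_values("Days Seen", ascending=False)
print(shortlist[["Domain", "Days Seen", "Last Seen", "Avg. Position"]])
```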

MixRank is Looking Good

It's early on for MixRank but so far I like what I see. The tool can do so many things for your content network advertising, media buy planning, link building campaigns, and SEO campaigns that I feel it's an absolute no-brainer to sign up for right now.

For now it's *free* during their beta testing. Currently they are tracking about 90,000 sites so it's still fairly robust for being a new tool.

The Ultimate Guide to Using Bing's Webmaster Tools

Bing's Webmaster Tools recently got a nice refresh and update. There is a lot you can do inside of the tools so we figured you'd want to know all about it :)

Also, we've included some free advertising coupons at the end of this guide to help get you started.

Account Dashboard

Bing's webmaster tools are fairly easy to use and the interface is quite clean. On the main account dashboard page you can select whatever site you want, in your account, and see quick stats on:

  • Clicks
  • Impressions
  • Pages Indexed
  • Pages Crawled

The percentages account for the net gain or loss from the week. For more specific site data, and more historical numbers, you would want to get into the site's dashboard which we will cover in the next section.

This initial account dashboard shows all the sites you have in your account and the associated metrics. The data is from a test site I created a while back and kind of forgot about until they updated the tools over at Bing.

From this page you can:

  • Add sites
  • Remove Sites
  • Export data
  • Click on a site to get to its dashboard
  • See any account specific messages from Bing

A snapshot of all your sites in one place is a good way to immediately spot any recent issues with ranking, indexing, or crawling on your sites.

Once you are ready to move on into a specific site, just click on the site name under the heading "Site". When you click the site's name, you'll be brought to the site's dashboard.

Site Dashboard

Each site you have in Bing's webmaster tools has its own dashboard (not to be confused with the account dashboard). Once you get into a site's dashboard you see the data we talked about above at the top of the dashboard and then a 30-day glimpse of the following metrics for the selected site:

  • Traffic summary
  • Index summary
  • Crawl summary (and a separate chart for crawl errors)

Here is what my test site's dashboard looks like:



For established sites with steady traffic (or for tracking ongoing campaigns) these 30 day snapshots are a good way for you to get a read on recent site activity and/or issues with traffic, crawling, or indexing.

These types of reports can also be very helpful to watch when you are doing site re-structuring or complete site overhauls (changing CMS, url structure, and so on).

Each section has its own place within your site's webmaster tool profile. You can get more information on traffic, indexing, and crawling just by clicking the appropriate link, and we'll discuss each of these sections below.

Traffic Summary

Inside the Traffic Summary tab you have 2 options:

  • Traffic Summary - 6 month history of traffic and search query performance
  • Page Summary - Same as Traffic Summary except the data is broken out by page with the option to click through to the page's search query report

On this page the second chart listed is one that you can slide back and forth to shorten or lengthen the history of the data you are looking at.

The lines are color coded to show overall impressions versus clicks. Bing does present the data in a clean and easy to understand way inside of their webmaster tool reports.

The second chart on the traffic summary page shows search query performance. You'll see keywords you received traffic for as well as ones that you gained impressions (but no clicks) for:

This report works in conjunction with the first report of overall traffic/impressions over time. If you shorten the timeframe, this report will adjust as well.

You'll see the following data points in this report (all sortable and exportable):

  • Keyword
  • Impressions
  • Clicks
  • CTR
  • The Average Position your listing was in when the impression was gained
  • Average Position of your listing when a click was earned

This is a good way to evaluate how you might be able to increase your CTR. By showing you impressions versus clicks (and the average positions for each) you can guesstimate which keywords could use a bit of freshening up on the title tag and meta description front.
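
If you export the query report, a rough sketch like this will surface keywords with plenty of impressions and decent positions but weak CTR. The column names are assumptions, so rename them to match whatever the export actually contains:

```python
# Sketch only: flags keywords that rank well and get seen often but earn few
# clicks -- the best candidates for rewriting titles & meta descriptions.
import pandas as pd

queries = pd.read_csv("bing_query_report.csv")  # hypothetical export file
queries["CTR"] = queries["Clicks"] / queries["Impressions"]

opportunities = queries[
    (queries["Impressions"] >= 100)
    & (queries["Avg. Impression Position"] <= 5)
    & (queries["CTR"] < 0.02)
].sort_values("Impressions", ascending=False)

print(opportunities[["Keyword", "Impressions", "Clicks", "CTR"]])
```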

Page Traffic Report

The Page Traffic report shows the same charts as the Traffic Summary page with the exception of the bottom chart, which shows page level metrics. Here's a snippet from yesterday:

You can click whatever page you want and get the following keyword summary, similar to the initial chart on the Traffic Summary page but at a per-page level for whatever time frame you selected (the above was a day, so when you click through, that date carries into this report):

You can do the same thing here with average impression and average click position (and CTR) to evaluate pages which can use a refresh on title tags and meta descriptions for possible CTR upswings.

Another tip here would be to export the queries and see if there is potential to build out the page's category further with content targeted to specific queries.

So if a query is "chocolate truffles" and you are seeing some data for "white chocolate truffles" you might want to consider building out this section to include content specifically for those related but separate queries (if you haven't already).

Index Summary

The index summary page shows the index rate of your selected site, in Bing, over (roughly) the last 6 months.

The index summary chart is similar to the other charts in Bing's webmaster tools, with all the interactive sliding parameters that let you expand the report out over 6 months or drill down into a really tight, specific time frame.

Index Explorer

Bing's index explorer is a helpful tool that can alert you to HTTP code problems or confirm correct implementation of things like 301 redirects.
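
Before digging into the interface, you can also spot check a few URLs yourself and compare against what Index Explorer reports. A quick sketch along these lines (the URLs are placeholders) prints the status code and any redirect hops:

```python
# Sketch only: fetches a few placeholder URLs and prints each redirect hop
# (e.g. a 301) plus the final status code, to verify redirects resolve as intended.
import requests

urls = [
    "http://example.com/old-page",
    "http://example.com/current-page",
]

for url in urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds every intermediate hop of the redirect chain
    chain = [(r.status_code, r.url) for r in response.history]
    print(url, "->", chain, "final:", response.status_code, response.url)
```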

The interface is easy to use:

With the index explorer you can check the following HTTP status codes that Bing has discovered over your selected time period (all time, last week, last month) and over your selected crawl range (all time, last week, last 2 weeks, last 3 weeks):

  • All HTTP codes
  • HTTP codes 200-299
  • HTTP codes 300
  • HTTP code 301
  • HTTP code 302
  • HTTP codes 400-499
  • HTTP codes 500-599
  • All other HTTP codes

You can also search for pages where the Bing bot has identified malware as being present as well as choose to show pages that you've excluded in your robots.txt file:

Below the options listed above are the pages that meet your filter requirements. The report breaks the site down into categories and pages. When you hover over a page you'll see the following details:

If you click on a page you can also see a couple of additional data points:

  • Document size
  • Inbound links to the page
  • Block cache and block URL options for that particular page

Using this in conjunction with internal link checking tools like Xenu Link Sleuth (win) or Integrity (mac) can really help you get a good peek into the potential on-page technical issues of your site.

A couple of tools that give you valuable data about your on-page optimization are our Website Health Check tool (web based) and Screaming Frog's SEO Spider (mac/win).

I hope Bing adds some export functionality here, as they do in other areas of their webmaster tools, but the filtering options are solid enough to drill down into key issues for now.

Submit URLs

So this is a pretty straightforward option. Bing gives you the option to submit URLs (these can be pages that are or are not in their index now) for which you would like to request a recrawl or an initial crawl.

The URL allowance is pretty limited so it's best to save these requests for more important pages on your site (their crawl section has a spot for sitemaps).

Block URLs

You can also select pages, directories, or an entire site to block from indexing and/or Bing's cache:

One area for improvement here, I think, is to be able to input or upload individual pages. As of now, you can only input 1 page per click, or select a directory to block (site.com/directory/), or block the entire site.

They do offer export functionality which is helpful when doing site audits, but a way to mass upload or input URLs would be nice (though you can tackle some of this with their URL normalization feature that we will cover below).

Inbound Links

Bing will also show you the links they know about (in their index) that point to specific pages on your site.

Much like the charts above, you are presented with a historical chart which you can adjust with the slider below it (just like the Rank and Traffic stats shown prior).

Below those charts Bing will show you the pages on your site which have external inlinks and how many links they know of per page.

Once you click on a page, you'll see the linking URLs and the corresponding anchor text:

You can export page-specific links as well as the overall breakdown of pages with links and how many links those pages have. The export functions offer a nice way to get a high-level view of the overall link depth of your site.

While it's still a recommended practice to invest in a paid link research tool, supplementing your paid research by getting free link data from search engines is a no-brainer :)

Deep Links

Bing's Deep links are basically the same as Google Sitelinks. If you have been blessed by Bing, you'll see them in the Deep Links section of your site.

Bing's official statement on Deep Links is:

These Deep Links are assigned to websites which are seen by Bing to be “authoritative” on the topic which they target. The best way to influence whether you are chosen to have Deep Links displayed for your website is to create unique, compelling content that pleases searchers. Sites receiving this feature do an excellent job of delivering what visitors want, and keep visitors coming back time and again.

URL Normalization

If your URLs encounter parameter issues that can lead to duplicate content (e-commerce sites, CMS functionality, etc) then you might want to take a look at Bing's URL normalization feature.

Google offers a similar tool called Parameter Handling (there's a great write up on this from Vanessa Fox).

This is a section where you need to be really careful as to not unintentionally boot out relevant URLs and content from the site.

Combining this with use of the canonical tag (which Bing uses as a hint) is your best bet to ensure there are as few duplicate content and link juice splitting issues as possible on your site (with Bing).

Again, make sure you or your programmer(s) know what you or they are doing so you do not do more harm than good.

With Bing, you basically just add whatever parameter you want to ignore, so make sure that parameter (or parameters) does not cross over to other areas of your site that you would rather not have Bing ignore:

You can export all your inputted parameters as well.
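
To illustrate the idea, here is a small sketch of the same kind of parameter stripping done on your own side, with a canonical link generated from the cleaned URL. The parameter names are hypothetical; use whatever actually creates duplicate URLs on your site:

```python
# Sketch only: strips hypothetical junk parameters so duplicate URLs collapse
# to one canonical form, then emits the canonical link hint for that form.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

IGNORED_PARAMS = {"sessionid", "sort", "utm_source"}  # assumed parameter names

def normalize(url: str) -> str:
    """Drop the ignored query parameters and rebuild the URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

canonical = normalize("http://example.com/widgets?sort=price&sessionid=abc123&color=red")
print(canonical)  # http://example.com/widgets?color=red
print(f'<link rel="canonical" href="{canonical}" />')  # the hint Bing & Google read
```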

Crawl Summary

The Crawl Summary section shows similar charts to other category charts inside Bing's Webmaster Tools on the landing page (6 month charting with interactive timeframe filtering).

You can check total number of pages crawled as well as pages with crawl errors off the landing page for this category (no exporting unfortunately) and dig into specific sections like:

  • Crawl Settings
  • Crawl Details
  • Sitemaps

Crawl Settings

Bing lets you set up custom crawl rates on a per site basis:

You may have situations where a custom crawl rate might make sense:

  • You want the bot to visit off-peak hours rather than when customers are visiting
  • You might be running special promotions or seasonal promotions at specific times on an e-commerce site and want to limit bandwidth usage to visitors rather than Bing's bot
  • You might be doing a live stream or interview of some sort, and are expecting large amounts of traffic
  • Maybe you are doing some heavy content promotion across the web and social media and you want to avoid having any site load issues

You can use the timeframes given to line up with your server's location to make sure you are hitting the hours correctly (the base time on the chart is GMT).
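
As a quick sanity check, a few lines like these (the server timezone is an assumption) will translate your local off-peak window into GMT for the chart:

```python
# Sketch only: converts a local off-peak crawl window into GMT so it can be
# lined up with Bing's crawl-rate chart. The server timezone is a placeholder.
from datetime import datetime
from zoneinfo import ZoneInfo

server_tz = ZoneInfo("America/Chicago")  # hypothetical server location

# Say you want the bot busiest between 1am and 5am local time.
for hour in (1, 5):
    local = datetime(2011, 8, 1, hour, tzinfo=server_tz)
    gmt = local.astimezone(ZoneInfo("UTC"))
    print(f"{local:%H:%M} local = {gmt:%H:%M} GMT")
```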

You can also allow for crawling of AJAX crawlable URLs if you so choose. They recently rolled this out and their help section is weak on this topic, so it's unclear exactly how they'll handle it (outside of #!), but it's an option nonetheless.

Crawl Details

Bing's Crawl Details page gives you an updated overview of what's covered in the Crawl Summary. This feature doesn't require you to do any filtering to find issues; you can simply see if any of your pages have notable HTTP information, might be infected with malware, or are excluded by robots.txt.

If you have any pages pop up, just click on the corresponding link to the left and a list of exportable pages will pop up.

Another helpful, exportable report for site auditing purposes.

Sitemaps (XML, Atom, RSS)

This is where you'd submit your sitemap to Bing. For XML sitemaps, double check your submission against the Sitemaps.Org protocol.

For a site that's going to be fairly static (like this one) I'd pay more attention to proper site architecture rather than relying on a sitemap. I might even skip the sitemap unless I was using Wordpress, where you could just have it auto-generate and update with new posts and such.

You can add, remove, and resubmit sitemaps as well as see the last date crawled, last date submitted, and URLs submitted.
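
If your CMS doesn't generate a sitemap for you, a bare-bones one following the Sitemaps.Org protocol is easy to produce. Here is a minimal sketch (the URLs and dates are placeholders; a real site would pull these from its database):

```python
# Sketch only: writes a minimal sitemap.xml per the Sitemaps.org protocol.
from xml.sax.saxutils import escape

pages = [
    ("http://example.com/", "2011-08-01"),
    ("http://example.com/about/", "2011-07-15"),
]

entries = "\n".join(
    f"  <url>\n    <loc>{escape(loc)}</loc>\n    <lastmod>{lastmod}</lastmod>\n  </url>"
    for loc, lastmod in pages
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```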

Bing Webmaster Resources

Bing's recent update to their Webmaster Tools added a good amount of value to their reporting. Here are some additional resources to help you get acquainted with Bing.

Free Microsoft Advertising Coupon

While you are over at Bing, signing up for Webmaster Tools, feel free to use these Microsoft AdCenter coupons for your advertising account :)

An Interview of Branko Rihtman (AKA: SEO Scientist)

We recently interviewed Branko Rihtman. He started working in the industry in 2001, doing SEO for clients and properties in a variety of competitive niches. Over that time, he realized the importance of properly done research and experimentation and started publishing findings and experiments at http://www.seo-scientist.com.

How did you get into the SEO space?

Completely by accident. When I was done with my compulsory army service, I knew I would rather work in an internet based company than, say, dig ditches. So I went into a local internet portal and searched for “internet companies in Jerusalem”. One of the replies was from an SEO company. They offered me a job with flexible hours and a possibility of working from home. Since I was about to start university, working from home looked particularly interesting. I ended up spending 8 years in that company.

When did you know you were going to "make it" in SEO?

Ummm never? I don’t think any of us ever “makes it” in SEO. Yes, some people are more popular than others and some get invited to speak at more conferences than others, but that is most certainly not a measurement of “making it”. The SEOBook forum is full of people that are more successful and savvy than the majority of SEOs out there, yet very few of them are well known in the general marketing circles. One of the things I like about SEO is that it is constantly “making” you and “breaking” you. If it wasn’t like that, we wouldn’t be constantly learning and adapting.

What is the most exciting thing that has happened to you while in the SEO field? Do you still get a rush of excitement when a new project takes off?

Getting a site into a top 5 for [mesothelioma] on Google. Kidding. One of the more appealing qualities of the SEO field is the puzzle cracking. You are constantly presented with puzzles – why did Google penalize this site, why is this site ranking above me, what are the parameters considered in the new update… For me, cracking those puzzles is the most exciting part of my work. I really have to remind myself sometimes that I should be thinking about potential profitability of these conundrums because to me a puzzle is there to be solved and that is all that matters. Once I crack it, I kinda lose interest in it so I have to make sure that 1) solving the current SEO puzzle is worth my time in terms of profitability and 2) I can get action items from possible solutions. I think the best example of these puzzles is Google overoptimization filter. I kinda developed a knack of getting sites out of it (which landed me my current job as well). Another exciting thing would be implementing extensive structural changes to large sites and seeing the positive effect in SERPs. As for new projects, I have seen so many of them die off miserably that I find it hard to get excited at the beginning. First jolts of traffic and first rankings get me excited and then I turn the engines on.

How would you compare biology to SEO?

Oh dear, this could be a whole blog post. There are several aspects that are very similar. Mainly, and this is especially true in molecular biology, we are making changes on a system that is a black box. We have a whole bunch of (presumed) parameters to tinker with and very limited list of observable outputs. So we make deductions which can, but don’t have to, be true. So if I am changing a certain ingredient in my bacterial culture and observe a change in growth rate, I cannot be sure what exactly the base cause of the increase was. Maybe the element I have added is actually poison and my bacteria are trying to multiply on reserves of food, hoping that one mutant will be able to overcome the adverse effects of the element I have added. Similarly, when we add a link pointing to a website, we don’t know whether it was that link that caused an increase in ranking or someone in Bangladesh created a valuable link that is pointing to one of the pages that is linking to our new linking page and we enjoyed some of that juice.

Another important similarity (and then I will shut up about it) is the arms race between the search engines and SEOs and SEOs among themselves. Evolutionary theory and ecological sciences are full of very important lessons that can be applied to the world of SEO. I have written on my blog in the past how some evolutionary theories can be applied to understand and foresee the relationship between Google and link buying. Another metaphor from the evolutionary theory I like to use is the Red Queen Principle – in evolution, competing organisms have to invest all their efforts in improving and adapting so they can remain at the same competitive point relative to their enemies. Like with the Red Queen from Alice in Wonderland, they have to run their fastest to remain in the same place. The same can be said about websites competing in lucrative niches – it is not enough to get to the first spot. Your competitors are constantly aiming for that place too and you have to put in maximal efforts (linking, site speed, trimming indexing fat, QDF hunting etc.) to remain in the same place.

You are a big proponent of applying the scientific method to SEO. What parts of tests are easy to do? What parts are hard?

SEO tests can be easy from the beginning till the end if done right. The hard part is asking the question in a “testable” way. You have to keep in mind the limits of your testing system and constantly be aware of what you can measure and what you can’t. You have to make sure you have taken into account all the possible outcomes of your test and what each of those outcomes is telling you. Otherwise you can find yourself spending valuable time, just to end up with a highly ambivalent result that is not teaching you anything about the issue you are researching. Deciding what controlling factors you are going to implement and doing it in a way that doesn’t interfere with your test can also be challenging.

What do public SEO "studies" often get wrong?

Mostly, people get the order of steps that make up the test wrong. They usually start with a pre-made conclusion and then build the test (and, I suspect, not rarely the results themselves) around it. They want to show that, for example, text surrounding the link will pass the relevancy to the target page, so they go out to prove that. That is the exact opposite of the scientific process. Now many people say that trying to approach SEO questions with a scientific process is overkill, but science is more a state of mind than a set of tools. It exists so that minimal bias enters your decision and conclusion process, therefore people should not approach it as something that involves a lab coat and chemicals, but rather change their mindset from “what do I want the results to be” to “what the reality is”.

What percent of well-known fundamental "truths" in SEO would you describe as being wrong?

I would say that 100% of absolute, definitive statements about SEO are false. Recently, Joe Hall has written about becoming a “postmodern SEO” when realizing that every conventional truth in SEO can be 100% right and 100% wrong, depending on the context. I very much identify with this sentiment. It very much rubs me the wrong way when people in the industry come out against a certain SEO technique (and it rubs me even more when I know they were the biggest abusers of it until yesterday) or when they make a strategy X an “absolute prerogative and whoever doesn’t do X should be fired by their clients and sued for dishonest practices”. Keyword tags can be useful in some cases, rank reporting can be useful in some cases, forum signature spamming can be useful in some cases and increasing keyword density can be useful in some cases. It all depends on the context.

In the forums, sometimes when I read your contributions I think "classic whitehat consultant view" and then on other entries I think "aggressive affiliate in gaming." What allowed you to develop such a diverse range?

I am very flattered that people think this when they read my ramblings or talk to me about SEO. What allowed me to develop a diverse range of experiences in SEO is not being judgemental towards SEO techniques. Continuing from the previous question, understanding that they are all tools that should be put in the right context and used responsibly, enabled me to try and see all the advantages and disadvantages of all SEO techniques and apply them accordingly. Had I taken “holier than thou” approach towards any end of the SEO spectrum, I would have been a worse SEO. I also consider myself lucky to have had an opportunity to work in a wide range of niches - from legal, ecommerce, travel and financial, all the way to porn, pharmaceutical and gaming with a lot of niches and business sizes in between those extremes. Once you look at link profiles of sites that have been ranking for years in some of those extreme industries, you understand how preposterous divisions to hats of different colours really are.

As a second part to that question, how do you decide what techniques are good for some of your own websites & which are good for client websites?

Again, it is all in the context. I make a big differentiation between our sites and clients’ sites in a way that whenever I want to use a riskier SEO technique on a client site, I make sure to educate the client to all the risks and benefits of going down that road. I make sure the client understands the possible repercussions and I try to offer a cleaner alternative. There are clients that are not interested at all in organic promotion and there are clients that enter the project knowing that the site we work on can be burnt in a matter of minutes. When it comes to our sites, it depends on the profitability of the site, obviously. Then there are sites I test stuff on that I wouldn’t click on without wearing my lab gloves.

Do you believe Google is intentionally tilting the search game toward brands, or do you think there are many other signals they are looking for that brands just happen to frequently score high on?

I don’t think we need to speculate about that much – they have openly said in the past that the brands are the solution to the cesspool of the internet. They are rewarding brands with SERP enhancements. They are creating algorithmic changes in which brands are apparently being treated less harshly than run-of-the-mill sites. On the other hand they are making sure to stress in their PR announcements that brands are not treated differently than anyone else. As I don’t believe they openly lie about these things, it seems to me they are just doing doublespeak and being intentionally obscure about it. I can say that I don’t discriminate against tall people on busses and I will be factually correct since no one goes over the bus line and takes out people over 180 cm tall and sends them back home. However, by making the legspace very uncomfortable for these people, I may as well kick them out of line and save everyone the trouble. So while there is probably no checkbox next to certain websites marking them as brands, the ranking algorithms can theoretically be tweaked so that the brands surface to the top of a lot of the money queries and I think that is what we are seeing here. Possible signals for this can be percentage of links with URL for anchor, certain number of searches for the brand name and others. By the way, reliance on these signals can be used to explain the relative advantage that exact match domains have for their keyword.

Both the relevancy algorithms & webmasters are in some ways reactive. I believe that frequently causes the relevancy algorithms to ebb and flow toward & away from different types of sites. Do you generally have 1 sorta go-to-market plan at any given time, or do you suggest creating multiple SEO driven strategies in parallel?

It all depends on the client responsiveness levels. If I see that the client is willing and allows us to become part of their marketing team, then we both aim for harnessing every marketing activity for SEO benefits, while also trying to diversify and reduce the dependency on any single traffic source. In cases when, for a whole lot of different reasons, we cannot establish a network of sites that will use different strategies, we try to work with a whole lot of subdomains, trusting how Google treated subdomains historically. I have to admit that in the majority of cases, the responsiveness of the deciding ranks (or the lack thereof), together with a constantly growing list of more basic, day-to-day tasks, prevents us from making these strategic marketing decisions for the client – it is hard to talk about a holistic approach to marketing when their homepage doesn’t appear on the first 3 pages of the site: query or when their IT department decides to 302 every product page to homepage while they are moving servers for 3 months.

When major algorithm changes happen they destroy certain business models & eventually create other ones. How many steps ahead / how far ahead do you feel you generally are from where the algorithm is at any given time?

We are all over the board with this. Luckily (or unluckily) none of our clients were affected by Panda. I say “unluckily” because the scientist in me would want nothing more than to test different theories about Panda on an affected site. The marketer in me is stabbing the scientist in the back with a long sword for having such blasphemous thoughts. I would say that we usually “hang around” where the algorithm is at any given moment and if we stay behind, we manage to close the gap in a reasonable period of time. At least that has been the case so far. In some other cases, we have benefited from sites getting hit by algorithmic changes. This only means we are lucky, because I don’t think there is any single strategy that is 100% working all of the time in every level of niche competitiveness. Had such strategy existed, someone would have cracked it (Dave Naylor most probably), used it to their own benefit and Google would have changed the rules again, rendering the “perfect strategy” less than perfect.

How far behind that point would you put a.) the general broader SEO industry b.) SEO advice in the mainstream media?

One of the major revelations I discovered in the SEOBook forum is that the public SEO community is really just a small tip of the iceberg that is this industry. There are so many skilled people working on their own sites, being affiliates or working as in-house professionals that do not participate in the SEO Agora that any attempt to characterize “the general broader SEO industry” would be wrong. There is no way of judging where the industry is, other than by what they write about and talk about in social media and I don’t think that is a fair judgement. This is the industry of marketers and people do not write to dispense knowledge most of the time. The vast majority of the content put out there is created with the purpose of self-promotion and/or following some invented rule that “you must write X posts per week to keep your audience engaged”. It is very similar to the whole “Top X” lists format in which it is obvious that a significant percentage of items on the numbered list were forced in there so that the number X would be round or fit some theory of “most read top X articles”. While I do believe that someone will find value in anything, when looking across the board, there is very little you can tell about the actual knowledge of the people in this industry from what they write. I hope. I will tell you that I do see a general difference between the European and the US SEO crowd – I have seen (percentagewise) a seemingly larger amount of UK, Dutch and German SEOs that are more daring and questioning in their writing than the US SEOs. Don’t ask me why this is so, that is beyond my scope of expertise (or interest).

As for the mainstream media, living in the Middle East, I have learned to automatically distrust the mainstream media on issues much more important than SEO, therefore I usually treat mainstream SEO articles as a comic relief. Or a tragic one.

Many times when the media covers SEO they do it from the "lone ranger black hat lawbreaker" angle to drum up pageviews. Do you ever see that ending?

Nope. Nor do I ever see people in our industry not taking the bait and responding to that kind of coverage, thus contributing significantly to the mentioned drumming up of traffic. Even if the advertising industry moves away from impression-based pricing, more attention will always mean more links and that is just a different kind of opiate.

From a scientific standpoint, do you ever feel that pushing average to below-average quality sites is bad because it is information pollution (not saying that you particularly do it or do it often...but just in general), or do you view Google as being somewhat adversarial in their approach to search & thus deserving of anything they get from publishers?

I consider as below average anything Google would not allow Adsense on. Maybe someone really doesn’t know how to drink water from a glass and for that person eHow article is the best fit. On a serious note, just like with hats, I try not to be judgmental when it comes to content. If lower quality content that does not rank anywhere is used to push high quality content in very popular SERPs, I think it all levels out at the end. The bigger problem for me is rehashed, bland content, which you can see that was written according to a mold: Start with a question, present some existing views on the issue and end with asking your readers the initial question so you encourage comments. Or numbered list articles. Or using totally unrelated current events AND numbered lists in combination with a tech topic. I have just seen an article titled “5 things Amy Winehouse’s death teaches us about small business”. Spamming forums is Pulitzer worthy material compared to this garbage. Yet Google constantly ranks this crap and rewards it with a cut from their advertising revenue. And what is even worse, the crap ranks for head terms (ok maybe a bit less after Panda) while forum or comment spam does not appear in my SERPs. So who is polluting the web again?

I don’t think a scientific approach is relevant here. One thing that exists in the world of science and doesn’t in SEO is peer review. So if something gets published in a scientific journal, it was reviewed critically by the experts in that field and was deemed worthy in every possible aspect by some rigorous standards. Had this kind of system existed in the world of SEO, we wouldn’t have a below-average-quality content problem.

Can Bing or anyone else (outside of say Naver, Yandex & Baidu) challenge Google & win a significant slice of the search marketshare?

Only if Google does it for them and drops the ball completely. I don’t believe in homicide in the world of hi-tech companies (Facebook killer, Google killer, iPad killer) but I definitely believe in suicide (Myspace). The ball is constantly in Google’s court since they are the biggest kid on the playground, and they have managed it fine so far. It is ironic how they have to deal with bad press on so many issues, almost making MS the underdog and the company people turn to when they want to boycott Google. Right now Google is the innovator and a trendsetter in many fields besides search (Documents, Analytics, G+, AdWords), so having all those eyeballs and improving the integration of all those products into search and vice versa will make them an impossible act to follow for any foreseeable future. Which is something that was said about ancient Rome too.

A lot of SEOs are driven by gut feeling. With your focus on the scientific method, how much do you have to test something before you are confident in it? How often does your strategy revolve around gut feeling?

There are things that I know work without everyday testing. Keywords in anchor text will pass relevancy in the majority of cases, and I don’t need to test that every time I place a link somewhere. I am also aware of the exceptions to that rule (the second link doesn’t count, for example), so when I see an unusual or unexpected response from a search engine, it gets my attention and I start testing. I also like to test extraordinary claims by people in the SEO industry, because they usually go against common knowledge and that is always informative. I will usually not let the testing process stand in the way of work. If there are several possible outcomes to a test that takes a long time to perform, I try to run with the project for as long as I can without making the decision, leaving all future direction possibilities open.

Gut feeling is something I usually use to assess the trustworthiness of the people I listen to. I rely a lot (maybe more than I should) on other people’s knowledge. As I mentioned, I haven’t had the chance to test how a pandalized site responds to different changes, so I had to trust other people’s reports. Gut feeling is very helpful here to save time reading mile-long posts by people who, I suspect, do not even practice SEO on a daily basis.

If a friend of yours said they wanted to get into SEO, what would you tell them to do in order to get up to speed?

To read the free guides from Google and SEOMoz. To pick a niche and create a site from scratch. To learn how to code, how to delegate, how to measure and how to hire and fire people. To read at least one SEO article every day. To read no more than one SEO article every day. To invest their first profits into the SEOBook Training Section and to submit their site for review in the forum. The value they get from the advice there is going to be the best investment they made at that early stage. After their site is making money, to repeat the process in a different niche with a different strategy. Diversification is the best insurance policy in the ever-changing algorithm world.

If you had to start from scratch today with no money but your knowledge would you still be able to compete in 2011?

Yeah. Competing is about picking the battles you can win with what you have at the moment. There are still niches that can be monetized with relatively low effort (especially in non-English markets) and I think I would be able to monetize the knowledge I have and leverage it to create revenue in a reasonable amount of time. Luckily, I don’t have to test that claim.

If you had $50,000 to start, but lacked your current knowledge, what do you think your chances of success in SEO are?

Very low. Part of the knowledge is knowing what to spend the money on. Without prior knowledge, I would probably think that I could take on this SEO thing all by myself, and the $50K would be gone before I realized my mistakes. I would probably fall into the trap of buying links from some link network or torching my new site with 200,000 forum signature links all created in 2 hours.

And, saving a tough one for last, in what areas of SEO (if any) do you feel science falls flat on its face?

First, I would like to reiterate: science is not a tool, it is a way of thinking and approaching problems. Under that definition, I don’t think that science can fall flat on its face at all. I do see a problem with the abuse of the word “science” for marketing goals, and a lot of those “approaches” fail because they lack the scientific way of thinking. Mostly they lack self-criticism and are so blinded by tagging their work as “science” that they will not adopt some of the humility and self-doubt that is present in the majority of scientific work. The lust for hitting that Publish button, especially if there is potential financial benefit in publishing a certain kind of result, is the most unscientific drive in our industry.

There are some areas of SEO where scientific thinking should take a back seat to other approaches. One that instantly springs to mind is link building. To me, link building is the true art of marketing – recognizing what drives the potential linkers and leading them to link to you while all along they are thinking that they came up with that decision themselves. There are some measurements involved, and any testing should be planned and executed with scientific rigour, but the creative part of it is something where science is of little use.

---

Thanks Branko! You can find him rambling at @neyne on Twitter or the SEOBook Forum & publishing findings and experiments at http://www.seo-scientist.com.

Currently, he is responsible for SEO R&D at Whiteweb, an agency that provides SEO services to a small number of large clients in highly profitable niches. His responsibilities at Whiteweb are to gather, organize and expand the company's know-how through research, experimentation and cooperation with other SEO professionals. In addition to being an SEO, he is writing his MSc thesis in environmental microbiology at HebrewU in Jerusalem.

Longer Google AdWords Ad Copy

I was just checking out the ongoing strategic meltdown in the value of the Dollar & noticed an AdWords ad with an extended headline & a 150 character ad description.

Currently I believe the above extended description is a limited beta test, but if Google starts mixing that in with Google Advisor ads & ad sitelinks there might not be a single organic result above the fold on commercial keywords.

The above image is even uglier when Google Instant is extended.

Using the 150 character ad descriptions adds another line to each of the AdWords ads, driving everything down one more row per ad & pushing the "organic" search results down another listing.

Of course one response is to operate in the tail of search, but just look at DMD to see how well that worked for them.

They are so desperate that they sent legal threats to a site flaming them. Humorously, that site also runs AdSense ads.

And that desperation is *before* Google has finalized a legal agreement on the book front & started pushing those ebooks in their search results with full force. In 12 months ebooks will be the new Youtube...a service that magically keeps growing over 10% a month "organically" in Google's search results.

Your content isn't good enough to compete, unless you post it to Youtube.

In addition to uploading spammy videos in bulk to Youtube, maybe SEOs should create a collective to invest in "an oversized monitor" in every home and on every desk. :D

Alternatively, switching the default search provider on every computer you touch to Bing doesn't seem like a bad idea.

Google Brand Bias Reinvigorates Parasitic Hosting Strategy

Yet another problem with Google's brand first approach to search: parasitic hosting.

The .co.cc subdomain was removed from the Google index due to excessive malware and spam. Since .co.cc wasn't a brand, the spam on the domain was deemed too much to tolerate. But as Google keeps dialing up the "brand" piece of the algorithm, there is a lot of stuff on sites like Facebook or even my.Opera that is really flat out junk.

And it is dominating the search results category after category. Spun text remixed together with pages uploaded by the thousand (or million, depending on your scale). Throw a couple links at the pages and watch the rankings take off!

Here is where it gets tricky for Google though...Youtube is auto-generating garbage pages & getting that junk indexed in Google.

While under regulatory review for abuse of power, how exactly does Google go after Facebook for pumping Google's index with spam when Google is pumping Google's index with spam? With a lot of the spam on Facebook at least Facebook could claim they didn't know about it, whereas Google can't claim innocence on the Youtube stuff. They are intentionally poisoning the well.

There is no economic incentive for Facebook to demote the spammers as they are boosting user account stats, visits, pageviews, repeat visits, ad views, inbound link authority, brand awareness & exposure, etc. Basically anything that can juice momentum and share value is reflected in the spam. And since spammers tend to target lucrative keywords, this is a great way for Facebook to arbitrage Google's high-value search traffic at no expense. And since it pollutes Google's search results, it is no different than Google's Panda-hit sites that still rank well in Bing. The enemy of my enemy is my friend. ;)

Even if Facebook wanted to stop the spam, it isn't particularly easy to block all of it. eBay has numerous layers of data they collect about users in their marketplace, they charge for listings, & yet stuff like this sometimes slides through.

And then there are even listings that warn against the scams as an angle to sell information.

But even some of that is suspect, as you can't really "fix" fake Flash memory to make the stick larger than it actually is. It doesn't matter what the bootleg packaging states...it's what is on the inside that counts. ;)

When people can buy Facebook followers for next to nothing & generate tons of accounts on the fly there isn't much Facebook could do to stop them (even if they actually wanted to). Further, anything that makes the sign up process more cumbersome slows growth & risks a collapse in share prices. If the stock loses momentum then their ability to attract talent also drops.

Since some of these social services have turned to mass emailing their users to increase engagement, their URLs are being used to get around email spam filters.

Stage 2 of this parasitic hosting problem is when the large platforms move from turning a blind eye to the parasitic hosting to engaging in it directly themselves. In fact, some of them have already started.

According to Compete.com, Youtube referrals from Google were up over 18% in May & over 30% in July! And Facebook is beginning to follow suit.

A Complete Review of Wordtracker's Link Builder

You need links to rank, period. We can talk all we want about great content, social signals, brand signals, and all that jazz but quite a bit of that is subjective.

If you practice SEO, and have success with it, then you are well aware that a claim of "you need links to rank" is an objective, true statement without a bunch of false positives.

The gray areas come into play when we talk about things like anchor text, quality, volume, and so on, but the overarching truth is that without links you are largely invisible in the SERPs.

Ok, enough of what you already know. Wordtracker recently updated one of their core tools with some cool new features and functionality.

What is Link Builder from Wordtracker?

Link Builder is designed to address most of the core functions of a link building and prospecting campaign:

  • Locating potential link partners via competitor backlinks or specific keywords
  • Setting up a link building campaign and sorting your links properly (blogs, directories, social media, etc.)
  • Tracking the status of your link campaign's efforts

Wordtracker uses Majestic SEO's Fresh Index by default but you can use the Historic Index as well.

I might opt for the Fresh Index initially, because Majestic tends to have dead links in the historic index (thanks to the significant churn on the web) but if you can't find enough decent prospects in the Fresh Index, using the Historic one isn't a bad option.
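
Either way, it pays to sanity-check prospects for rot before starting outreach. If you export your prospect list to a CSV, a short script can flag URLs that no longer resolve. Here is a minimal sketch in Python; the prospects.csv file name and the "url" column are hypothetical, so adjust them to match whatever your actual export looks like.

```python
# check_prospects.py -- a minimal sketch for flagging dead prospect URLs.
# "prospects.csv" and its "url" column are hypothetical; rename to match your export.
import csv
import requests

INPUT_FILE = "prospects.csv"

def is_alive(url, timeout=10):
    """Return True if the URL still resolves without an error status."""
    try:
        resp = requests.head(url, timeout=timeout, allow_redirects=True)
        # Some servers reject HEAD requests; fall back to a lightweight GET.
        if resp.status_code >= 400:
            resp = requests.get(url, timeout=timeout, stream=True)
        return resp.status_code < 400
    except requests.RequestException:
        return False

with open(INPUT_FILE, newline="") as f:
    for row in csv.DictReader(f):
        url = row["url"]
        status = "ok" if is_alive(url) else "DEAD"
        print(f"{status}\t{url}")
```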

There is a lot I like about this tool and a few things I'd like to see them add to or improve on.

Step 1: Setting Up a Campaign

I'm a fan of clean, easy to use interfaces and Wordtracker definitely scores well here. Here is the first screen you are presented with when starting up a fresh campaign:

In my opinion, researching competing link profiles alone is not enough when it comes to link prospecting. I really like the option to not only research multiple URLs at once but also to research keyword-specific prospects.

You can research lots of countries as well. Below is a snapshot of the countries available to you in Link Builder:

Step 2: Prospecting With Competitor URLs

I am craving some chocolate at the moment, as you can tell from my selected URLs :)

Here's a good example of my decision making process when it comes to using the Historic Index and the Fresh Index. My thought process usually involves the following information:

  • The bigger/older the link profiles of the URLs, the more likely I am to use the Fresh Index to avoid lots of dead links
  • If the site is a well known brand I will be more likely to use the Fresh Index, given the likelihood that the link profile is quite large
  • Smaller or newer link profiles will probably benefit more from the Historic Index

In this example the sites I'm researching have big link profiles and have been around for quite a while in addition to being large brands, so I will use the Fresh Index to cut down on potential dead ends.

I selected the "Edit Sources" box because I want to make sure I pick the URL with the most links (or you can just go with both) but I wanted to show you the options:

I'll leave all selected just to maximize the opportunities. Sometimes you'll find pages ranking for specific keywords you might be targeting, rather than just the homepage ranking, so you can use both or one or the other if that's the case.

In this scenario I'm looking at the URLs ranking for "chocolate", and they all happened to be homepages anyway.

Wordtracker is pretty quick with getting the data in, but while you're waiting you'll see the following screen:

Step 3: Working with the Analysis Tab

In order to keep the results as targeted as possible, Wordtracker automatically removes the following links from the results:

  • Image Links
  • Redirects
  • No-follow links

One thing I'd like to see them do is let no-follow links through. Even though they might not pass any juice, they can certainly be decent traffic sources, and link building isn't just about passing juice; it's also about brand building and traffic generation.

I'd even say let image links through. I understand they don't want to be a pure link research tool, but image links can be valuable for some of what I just mentioned as well. I would say: give us the data and the ability to filter it, rather than just taking it away completely.
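
In the meantime, one workaround is to pull a raw backlink export from your link index of choice and do the bucketing yourself, so nothing gets thrown away before you see it. Below is a minimal sketch of that idea; the "raw_backlinks.csv" file and the "nofollow" / "link_type" columns are hypothetical names, so map them onto whatever fields your actual export provides.

```python
# filter_links.py -- a minimal sketch of DIY link bucketing from a raw export.
# File and column names ("raw_backlinks.csv", "nofollow", "link_type") are hypothetical;
# this assumes "nofollow" is a boolean column and "link_type" is a text column.
import pandas as pd

links = pd.read_csv("raw_backlinks.csv")

# Label everything instead of silently discarding it.
links["bucket"] = "followed text link"
links.loc[links["nofollow"] == True, "bucket"] = "no-follow (still useful for traffic & branding)"
links.loc[links["link_type"] == "image", "bucket"] = "image link"

# Filter on demand rather than upfront.
print(links["bucket"].value_counts())
followed_only = links[links["bucket"] == "followed text link"]
followed_only.to_csv("followed_links.csv", index=False)
```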

Here is a snippet of the result page and a description on what it represents:

On the left are pre-designed buckets that Link Builder groups your links into. This is helpful but I'd like to see more flexibility here.

They also offer a tagging feature to help you group links in another way. The tagging can be helpful for things like assigning links to specific people within your group or really any other custom setup you have going on (maybe stuff like grouping keywords into priority buckets, or whatever).

The prospect tab gives you the domain (chow.com in the below example) the link sits on, the page it links to on a competing site or sites, and the page the link is actually on from the linking site:

All you have to do is click that "links to" button to see where the link is pointing to (in this case chow.com is only linking to 1 of the sites I inputted).

The column to the right shows the page on the domain where the link is originating from and the number in the middle is a measure of how important that particular prospect might be.

The right-most column shows whether the domain is also linking to you and how many of the sites you entered that domain is linking to. The idea is that the domain might be more likely to link to you if it is already linking out to multiple competing sites:

The grayed out button to the right of the co-link count is the "target" button. This is the button you'd click to let the tool know that this is a prospect you'd like to target.

You have the following toolbar available to you in the Analysis tab:

These are generally self-explanatory:

  • Delete - removes selected prospects from the campaign
  • Export - export your results to a CSV file
  • Copy to - copies prospects to another campaign within your account
  • Tag - allows you to tag selected prospects to help create custom grouping fields
  • Filter - filters Top Link by "contains" or "does not contain". An example might be if you wanted to target only prospects which contain the word "chocolate" somewhere in the URL

You can also click on any of the groupings on the left to view those specific groups only. I find that the groupings are fairly accurate but I personally prefer the ability to customize fields like that rather than being boxed in.

I created a sample tag titled "for eric" that contains 2 links I want a team member named Eric to work on:

Step 4: Working with the Contact Tab

The Contact tab has most of the same toolbar options as the Analysis tab with one exception:

  • Find Contact and About Links - click on the links you want to find contact information on and/or find the about page on

Link Builder works in the background to find this information and you can continue working in the application. There is a notes option as well. There's no specific way to leave multiple, time-stamped notes (for team environments) but the input box is expandable so you can leave an ongoing contact history.

You have the same contact flag on the right, and to the left of that is an email icon that turns yellow when you click it, letting you know that contact is in progress or has been initiated.

When the contact request comes back (just refresh the contact tab) you'll see the following, new fields within the Contact tab that denote the contact/about pages for the prospect:

Step 5: Reporting

The Reporting piece of Link Builder has the following reports:

  • History - options for the Fresh/Historic Index of Majestic SEO via cumulative and non-cumulative views for the chosen domains
  • Spider Profile - the link category breakdown (the aforementioned pre-defined link sources Wordtracker assigns your prospects to) of each domain
  • Target Summary - number of targets, number/% of targets contacted, number/% of targets not contacted, number/% of targets linking to you

This gives you a quick overview of the growth of competing link profiles, current link building rate, types of links they have, and your own Prospect metrics. All the reports are exportable to PDF.

Here's the History report:

Here's the Spider Report:

Here's the Target Summary:

Additional Campaign Options

As we discussed earlier, you can either input a list of domains or search on a specific keyword.

If you start by searching on a specific keyword, you are able to select URLs to include in your prospecting search. Everything else, in terms of options after the URL selection, is the same as if you had started with domains.

Having a keyword search to start a campaign is helpful in case you are looking to go beyond competitors you already know of and get a really deep look into link prospects across that keyword's market as a whole.

Also, right next to your campaign name you can sign up to be automatically notified of new links and prospects for your campaign:

Firefox Extension

Link Builder also has a Firefox extension that allows you to grab all the external links from a page and save them in your Link Builder account.

I find this helpful on directory sites (for gathering a list of topic-specific URLs), for example. The extension is really easy to use. You can install it here. Once you arrive at a page you want to use it on you just click on the LB logo in your toolbar:

Then once you click on the option to gather the links, you get the following interface:

You can save the chosen links right into your Link Builder account.
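
If you ever need that same "grab every external link on a page" step outside the browser (say on a server, or across a big batch of directory pages), it is easy to approximate with a short script. This is just a rough stand-in for the extension, not part of Wordtracker's tooling, and the page URL below is a placeholder.

```python
# external_links.py -- a rough stand-in for the "grab external links" step.
# Not part of Wordtracker's tooling; the PAGE URL is a placeholder.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/some-directory-page/"  # placeholder URL

resp = requests.get(PAGE, timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

page_host = urlparse(PAGE).netloc
external = set()
for a in soup.find_all("a", href=True):
    href = urljoin(PAGE, a["href"])      # resolve relative links
    host = urlparse(href).netloc
    if host and host != page_host:       # keep only off-site links
        external.add(href)

for url in sorted(external):
    print(url)
```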

What I Like

The features that I like in Wordtracker's Link Builder tool are:

  • Ability to prospect by multiple URLs or by choosing a specific keyword
  • Option to use Fresh or Historic Index via Majestic SEO
  • Simple ways to keep notes and contact information
  • Ability to search for contact and about information on selected prospects
  • Robust selection of countries
  • Initial, intelligent link grouping
  • Exporting capabilities
  • Fast results and a really clean, easy to use interface

What Could Be Improved On

I think Wordtracker could do some things to make this tool even more functional and useful:

  • More flexibility with the naming and assigning of link types
  • Profile-wide settings to include all links (no-follow, image, etc.) or exclude some, rather than excluding them without a choice to include
  • More filtering options around the data points they offer and whether a prospect has been targeted or not
  • More robust link tracking (if the status of a link changes, send me an alert). Though I realize that is getting into link tracking versus link building, it's still a nice option; a rough DIY version is sketched just after this list
  • A bit more flexibility with notes and timestamps for a more defined contact history (especially if teams use this)
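
Until something like that alert exists in the tool, you can approximate it by periodically re-crawling the pages where you won links and flagging any page where your link has disappeared. A minimal sketch, assuming a hypothetical won_links.csv with "linking_page" and "my_url" columns; schedule it however you like and pipe the output to email.

```python
# watch_links.py -- a crude sketch of the "alert me when a link's status changes" idea.
# Assumes a hypothetical "won_links.csv" with "linking_page" and "my_url" columns.
import csv

import requests
from bs4 import BeautifulSoup

def link_still_present(linking_page, my_url, timeout=10):
    """Return True if linking_page still contains an anchor pointing at my_url."""
    try:
        resp = requests.get(linking_page, timeout=timeout)
        resp.raise_for_status()
    except requests.RequestException:
        return False  # an unreachable page counts as a lost link
    soup = BeautifulSoup(resp.text, "html.parser")
    return any(my_url in a["href"] for a in soup.find_all("a", href=True))

with open("won_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        if not link_still_present(row["linking_page"], row["my_url"]):
            print(f"LOST: {row['my_url']} no longer linked from {row['linking_page']}")
```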

A Solid Link Building Product

Overall I think this tool does a good job with its intended use, link building. I think some users would like to see more done to make it more team-friendly, but I think you can accomplish a lot with their tagging feature.

As stated above, I'd like to see some more done with notes and such but as a link prospecting and building tool Wordtracker's Link Builder is worth your time to try out.

You can grab a free trial over at Wordtracker.

The God Complex in SEO

Authoritative, but Often Wrong

Trusting a powerful authority is easy. It allows us to have a quick shorthand for how things work without having to go through the pain, effort, & expense to figure things out. But it often leads to bogus solutions.

This video does a great job of explaining how nothing replaces experience in the SEO industry.

A combination of numerous parallel projects, years of trial and error experience & a deep study of analytics data is far superior to having the God complex & feeling 100% certain you are right.

Change is the only constant in SEO.

Grand Plans

Big plans often get subverted before they pan out & the more obvious something is, the shorter its shelf life. By the time everyone notices a trend, jumping on it probably isn't much of a competitive advantage. You might still be able to make some money for a limited time (or for a longer time if you apply it to new markets), but...

It is the contrarian investors who are taking (what is generally perceived to be) big risks who are allowed to ride a trend for years and years.

Options & Opportunities

When Panda happened a lot of theories were thrown out as to what happened & how to fix it. Anyone who only runs 1 website is working from a limited data set and a limited set of experience. They could of course decide to do everything, but there is an opportunity cost to doing anything.

Making things worse, if they have limited savings & no other revenue producing websites there are some risks they simply can't take. They can still sorta infer some stuff from looking at the search results, but those who have multiple sites where some were hit and others were not know intimately well the differences between the sites. They also have cashflow to fund additional trial and error campaigns & to double down on the pieces that are working to offset the losses.

Success Requires Failure

A lot of times people want to enter a market with a grand plan that they can follow without changing it once the map is made, but almost anyone who creates something successful is forced to change. Every year in the United States 10% of companies go under! And due to the increased level of competition online, the web likely separates winners from losers even faster than the offline world does. Those who stick to a grand plan are less able to keep up with innovation than those who have an allegiance to the data. Sometimes having a backup plan is far more important than having a grand plan.

Incremental Investing, Small & Large

Almost anything that I have done that has been successful has started ugly & improved over time. This site was an $8 domain & I couldn't even afford a $99 logo for it until I was a couple months into building it. Most of our other successes have been that way as well. If something works keep reinvesting until the margins drop. But when the margins do drop off, it is helpful to have another project you can invest in, such that you are not 1 and done.

The earliest Google research highlighted how ad-based search business models were bad & the now bankrupt Excite.com turned down buying Google for under $1 million. It turns out everyone was wrong there. One company adjusted & the other is bankrupt.

Overcoming the God Complex

We don't control Google. We can only influence variables that they have decided to count. As their business interests and business models change (along with the structure of the web) so must we.

The God complex always looks a bit interesting from afar, no matter how reasonable it sounds to the true believer.
