Celebrities Killed The SEO Star

As the co-founder of an SEO consultancy, my biggest hurdle in business is finding more staff. Clients are lining up at our door; the trouble is finding the staff to work with them. This may not sound like the worst dilemma for a business to face, especially during the current global economic decline, but the cause is a matter of great concern to me as both an SEO and a businessman.

Ayima's company structure is such that only highly skilled SEOs make it through to our interview stage, and yet even then, fewer than 5% meet our skill requirements. This isn't me being picky, misjudging characters or sourcing bad candidates - this is a knowledge pandemic that is spreading through our industry. We've started apprenticeship programs to teach eager candidates from the ground up, but it can take several years to produce the finished article.

After looking back at our past 30 interview candidates, my opinion on the reason behind this issue may not be a popular one. I believe that celebrity SEOs, brands and blogs are feeding a generation of untested and poorly trained search marketers, who pass themselves off as SEO experts. I will of course explain my position…

The Pander Update

Some high profile SEO bloggers recently ceased client work and personal projects in order to appear impartial and trustworthy to their community. This makes sense at first; after all, who wants to use a link building tool operated by someone working for one of your client's competitors? It does, however, bring to light two much larger issues:

1) a reliance on tertiary information for SEO analysis, and
2) a reliance on search engineers to provide fresh and exclusive information/data.

Some SEO information sites may argue that they have access to the Web Analytics accounts of their partners and that they do study index changes, but nothing replaces the value of following a handful of websites every single day of the year. An absence of "boots on the ground" leads to misinformation and a distancing from the SEO practices and concerns that really matter. This in turn results in an information churn which newbies to the industry naturally perceive as important.

Moving away from servicing clients or running in-house/affiliate projects also changes the financial dynamic. Revenue no longer relies on problem solving, but on juicy search engine spoilers and interviews. Search engines are businesses too, though, and it's in their best interest to only reward and recommend the publishers/communities that toe their line. A once edgy and eager SEO consultancy must therefore transition into a best practice, almost vanilla, publisher in order to pander to the whims of over-eager search reps.

How do we expect the next generation of SEO consultants to analyse a website and its industry competitors, when all they've read about is how evil paid links are and how to tweak Google Analytics?

I could directly link the viewpoints and understandings of some recent SEO candidates back to a single SEO community, word for word. They would be horrified to see the kind of broken and malformed SEOs that their community has produced.

OMG, Check Out My Klout

It's true that social media metrics will become important factors for SEO in the future, but this certainly does not negate the need for a solid technical understanding of SEO. Getting 50 retweets and 20 +1's for a cute cat viral is the work of a 12-year-old schoolgirl, not an SEO. If you can't understand the HTML mark-up of a page and how on-page elements influence a search engine, pick up an HTML/SEO book from 2001 and get reading. If you don't know how to optimise site structure and internal linking, read a book on how the web works or even a "UNIX for Dummies" manual. If you're unable to completely map out a competitor website's linking practices, placement and sources, set up a test site and start finding out how people buy/sell/barter/blag/bait for links.

You may be thinking at this point, "Rob, I already know this - why are you telling me?". Well, the sad fact is that many SEOs, with several years of experience at major and minor agencies, fail to show any understanding of these basic SEO building blocks. There are SEOs who can't identify the H1 on a page and who seriously consider "Wordle" and "Link Diagnosis" to be business-class SEO tools. It used to be the case that candidates would read Aaron Wall's SEO Book or Dan Thies' big fat Search Engine Marketing Kit from cover to cover before even contemplating applying for an entry level SEO role. These days, major agencies are hiring people who simply say that "Content is King" and "Paid Links are Evil" - though they have at least 50 Twitter followers, of course.
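
To make those building blocks concrete, here is a minimal sketch (Python standard library only; the URL is just a placeholder) of the kind of basic on-page check being described: pulling the title and H1 out of a page.

```python
# A minimal on-page check: extract the <title> and <h1> from a page.
# Stdlib only; the URL below is a hypothetical example.
from html.parser import HTMLParser
from urllib.request import urlopen

class OnPageCheck(HTMLParser):
    """Capture the text inside the <title> and <h1> elements."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

html = urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
checker = OnPageCheck()
checker.feed(html)
print("Title:", checker.title.strip())
print("H1:", checker.h1.strip())
```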

"Certified SEO" is NOT the answer

In most other professional industries, the answer would be simple - regulate and certify. This simply does not work for SEO though. I die a little each time I see a "Certified SEO" proclamation on a résumé, with the examining board consisting of a dusty old SEO company, an online questionnaire or a snake-oil salesman. A complete SEO knowledgebase cannot be taught or controlled by a single company or organisation. No one in their right mind would use Google's guide to SEO as their only source of knowledge, for instance, just as no self-respecting Paid Search padawan would allow Google to set up their PPC campaigns. Google's only interest is Google, not you. Popular SEO communities and training providers have their own agendas and opinions too.

I do however concede that some learning should be standardised, such as scientifically proven or verified ranking factors. Just the facts, no opinions, persuasions or ethical stances.

My Plea To You, The Industry

I plead with you, my fellow SEOs, to help fix this mess that we're in. Mentor young marketers, but let them make up their own minds. Put pressure on SEO communities to concentrate on facts/data and not to be scared off by controversy or those with hidden agendas. Promote apprenticeship schemes in your company, so that SEOs learn on the job and not via a website. Encourage people to test ideas, rather than blindly believing the SEO teachings of industry celebs and strangers.

An experienced SEO with what I perceive to be basic skills isn't too much to ask for, is it?

SEM Rush Review & Free Trial SEMRush Coupons

SEM Rush has long been one of my favorite SEO tools. We wrote a review of SEM Rush years ago. They were best of breed back then & they have only added more features since, including competitive research data for Bing and for many local versions of Google outside of the core US results: Argentina, Australia, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, Hungary, Japan, Hong Kong, India, Ireland, Israel, Italy, Mexico, Netherlands, Norway, Poland, Russia, Singapore, Spain, Sweden, Switzerland, Turkey, United Kingdom.

Recently they let me know that they started offering a free 2-week trial to new users. Try SEM Rush for free.

Set up a free account on their website & enter the promotional code located in the image to the right.

For full disclosure, SEM Rush has been an SEO Book partner for years, as we have licensed their API to use in our competitive research tool. They also have an affiliate program & we are paid if you become a paying customer. However, we do not get paid for recommending their free trial & their free trial doesn't even require giving them a credit card, so it literally is a no-risk free trial. In fact, here is a search box you can use to instantly view a sampling of their data.

Quick Review

Competitive research tools can help you find a baseline for what to do & where to enter a market. Before spending a dime on SEO (or even buying a domain name for a project), it is always worth putting in the time to get a quick lay of the land & learn from your existing competitors.

  • Seeing which keywords are most valuable can help you figure out which areas to invest the most in.
  • Seeing where existing competitors are strong can help you find strategies worth emulating. While researching their performance, it may help you find new pockets of opportunities & keyword themes which didn't show up in your initial keyword research.
  • Seeing where competitors are weak can help you build a strategy to differentiate your approach.

Enter a competing URL in the above search box & you will quickly see where your competitors are succeeding, see where they are failing, & get insights on how to beat them. SEMrush offers:

  • granular data across the global Bing & Google databases, along with over 2-dozen regional localized country-specific Google databases (Argentina, Australia, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, Hungary, Japan, Hong Kong, India, Ireland, Israel, Italy, Mexico, Netherlands, Norway, Poland, Russia, Singapore, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States)
  • search volume & ad bid price estimates by keyword (which, when combined, function as an estimate of keyword value) for over 120,000,000 words
  • keyword data by site or by page across 74,000,000 domain names
  • the ability to look up related keywords
  • the ability to directly compare domains against one another to see relative strength
  • the ability to compare organic search results versus paid search ads to leverage data from one source into the other channel
  • the ability to look up sites which have a similar ranking footprint as an existing competitor to uncover new areas & opportunities
  • historical performance data, which can be helpful in determining if the site has had manual penalties or algorithmic ranking filters applied against it
  • a broad array of new features like tracking video ads, display ads, PLAs, backlinks, etc.

Longer, In-Depth Review

What is SEM Rush?

SEM Rush is a competitive research tool which helps you spy on how competing sites are performing in search. The big value add that SEM Rush has over a tool like Compete.com is that SEM Rush offers CPC estimates (from Google's Traffic Estimator tool) & estimated traffic volumes (from the Google AdWords keyword tool) near each keyword. Thus, rather than showing the traffic distribution to each site, this tool can list keyword value distribution for the sites (keyword value * estimated traffic).

As Google has started blocking the display of some referral data, the value of using these third-party tools has increased.

Normalizing Data

Using these estimates generally does not provide overall traffic totals that are as accurate as Compete.com's data licensing strategy, but if you own a site and know what it earns, you can set up a ratio to normalize the differences (at least to some extent, within the same vertical, for sites of similar size, using a similar business model).

One of our sites that earns about $5,000 a month shows a Google traffic value of close to $20,000 a month:
$5,000 / $20,000 = 1/4 = 0.25

A similar site in the same vertical shows a traffic value of $10,000 a month:
$10,000 * 0.25 = $2,500 estimated actual earnings
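
The same back-of-the-envelope normalization as a small sketch; the figures are the hypothetical ones from the example above:

```python
# A minimal sketch of the normalization described above. The numbers are
# the hypothetical ones from the example; the ratio only holds within
# the same vertical, for sites of similar size and business model.

def normalization_ratio(actual_monthly_earnings, estimated_traffic_value):
    """Ratio of what a site you own actually earns to what the tool
    estimates its search traffic is worth."""
    return actual_monthly_earnings / estimated_traffic_value

ratio = normalization_ratio(5000, 20000)    # 0.25
competitor_estimated_value = 10000
print(ratio * competitor_estimated_value)   # 2500.0 -> rough earnings guess
```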

A couple of big advantages SEM Rush has over Compete.com and services like QuantCast are that:

  • they focus exclusively on estimating search traffic
  • you get click volume estimates and click value estimates right next to each other
  • they help you spot valuable up-and-coming keywords where you might not yet get much traffic because you rank on page 2 or 3

Disclaimers With Normalizing Data

It is hard to monetize traffic as well as Google does, so in virtually every competitive market your profit per visitor (after expenses) will generally be less than Google's. Some reasons why:

  1. In some markets people are losing money to buy marketshare, while in other markets people may overbid just to block out competition.
  2. Some merchants simply have fatter profit margins and can afford to outbid affiliates.
  3. It is hard to integrate advertising in your site anywhere near as aggressively as Google does while still creating a site that will be able to gather enough links (and other signals of quality) to take a #1 organic ranking in competitive markets...so by default there will typically be some amount of slippage.
  4. A site that offers editorial content wrapped in light ads will not convert eyeballs into cash anywhere near as well as a lead generation oriented affiliate site would.

SEM Rush Features

Keyword Values & Volumes

As mentioned above, this data is scraped from the Google Traffic Estimator and the Google Keyword Tool. More recently Google combined their search-based keyword tool features into their regular keyword tool & this data has become much harder to scrape (unless you are already sitting on a lot of it like SEM Rush is).

Top Search Traffic Domains

A list of the top 100 domain names that are estimated to receive the highest value downstream traffic from Google.

You could get a similar list from Compete.com's Referral Analytics by running a downstream report on Google.com, although I think that might also include traffic from some of Google's non-search properties like Reader. Since SEM Rush looks at both traffic volume and traffic value it gives you a better idea of the potential profits in any market than looking at raw traffic stats alone would.

Top Competitors

Here is a list of sites that rank for many of the same keywords that SEO Book ranks for:

Most competitors are quite obvious; however, sometimes they will highlight competitors that you didn't know about, and in some cases those competitors are also working in other fertile keyword themes that you may have missed.

Overlapping Keywords

Here is a list of a few words where SEO Book and SEOmoz compete in the rankings:

These sorts of charts are great for trying to show clients how site x performs against site y in order to help allocate more resources.

Compare AdWords to Organic Search

These are sites that rank for keywords that SEO Book is buying through AdWords:

And these are sites that buy AdWords ads for keywords that this site ranks for:

Before SEM Rush came out there were not many (or perhaps any?) tools that made it easy to compare AdWords against organic search.

Start Your Free Trial Today

SEM Rush Pro costs $79 per month (or $69 if you sign up recurring), so this free trial is worth about $35 to $40.

Take advantage of SEMRush's free 2-week trial today.

Set up a free account on their website & enter the promotional code in the image located to the right.

If you have any questions about getting the most out of SEM Rush feel free to ask in the comments below. We have used their service for years & can answer just about any question you may have & offer a wide variety of tips to help you get the most out of this powerful tool.

Google Aggressively Enters Make Money Online Niche

Even if you are in a seedy vertical that you think Google wouldn't touch with a 10-foot pole, Google may still be gunning for you!

Recall that when Google bought DoubleClick, Larry Page wanted to keep running the Performics SEO & SEM shop:

Google would spin Performics out of DoubleClick, and sell it to holding firm Publicis. Only one major force inside of Google hated the plan. Guess who? Larry Page.

According to our source, Larry tried to sell the rest of Google's executive team on keeping Performics. "He wanted to see how those things work. He wanted to experiment."

A search engine selling SEO services? Yep.

And now they are aggressively entering the make money online niche. Both Prizes.org & YouTube are in the top 3 ad slots for "make money online".

And I am seeing some of those across portions of the content/display network as well. I just saw this in Gmail today.

How does this align with the Google AdWords TOS?

To protect the value and diversity of the ads running on Google, we don't generally permit advertisers to manage multiple accounts featuring the same business or keywords except in certain limited exceptions. Furthermore, Google doesn't permit multiple ads from the same or an affiliated company or person to appear on the same results page. We've found that pages with multiple text ads from the same company provide less relevant results and a lower quality experience for users. Over time, multiple ads from the same source also reduce overall advertiser performance and lower their return on investment.

Google doesn't allow advertisers or affiliates to have any of the following:

  • Ads across multiple accounts for the same or similar businesses
  • Ads across multiple accounts triggered by the same or similar keywords

Well, as it turns out, the Google AdWords TOS doesn't actually apply to Google.

Search is a zero sum game.

Google is just getting started with breakfast. I am afraid to see what the last meal looks like!

Mutated Search Queries

Google has recently begun refining search queries far more aggressively. In the past they would refine search queries if they thought there was a misspelling, but new refinements have taken to changing keywords that are spelled correctly to align them with more common (and thus more profitable) keywords.

As one example, the search result [weight loss estimator] is now highly influenced by [weight loss calculator]. The below chart compares the old weight loss estimator SERP, the current weight loss estimator SERP & the current weight loss calculator SERP. Click on the image for a larger view.

Google keyword mutation.

There are 2 serious issues with this change:

  • disclosure: in the past, refinement disclosures appeared at the top of the search results, but now they often end up at the bottom
  • awful errors: a couple months after I was born my wife was born in Manila. When I was doing some searches about visiting & things like that, sometimes Google would take the word "Manila" out of the search query. (My guess is because the word "Manila" is also a type of envelope?)

Here is an example of an "awful error" in action. Let's say while traveling you find a great gift & want to send it to extended family. Search for [shipping from las vegas to manila] and you get the following:

The search results contain irrelevant garbage like an Urban Spoon page for Las Vegas delivery restaurants.

How USELESS is that?

And now, with disclosure of changes at the bottom of the search results, there isn't even a strong & clean signal to let end users tell Google "hey you are screwing this up badly."

In some ways I am inspired by Google's willingness to test and tweak, but in others I wonder if their new model for search is to care less about testing and hope that SEOs will highlight where Google is punting it bad. In that case, they just roped me into offering free advice. ;)

Panda 2.5...and YouTube Wins Again

On September 28th, Google rolled out Panda 2.5. Yet again YouTube is the #1 site on the leader board, while even some branded sites like MotorTrend were clipped, and sites that had previously recovered from Panda (like Daniweb) were hit once more. In the zero sum game of search, Google's Android.com joins YouTube on the leader board.

It doesn't matter what "signals" Google chooses to use when Google also gets to score themselves however they like. And even if Google were not trying to bias the promotion of their own content, any signals they do collect on Google properties will be over-represented among regular Google users.

Google can put out something fairly average, promote it, then iterate to improve it as they collect end user data. Publishers as big as MotorTrend can't have that business model though. And smaller publishers simply get effectively removed from the web when something like Panda or a hand penalty hits them. Worse yet, upon "review" search engineers may choose to review an older version of the site rather than the current site!

With that level of uncertainty, how do you aggressively invest in improving your website?

Over a half-year after Panda launched there are few case studies of recoveries & worse yet, some of the few sites that recovered just relapsed!

If you look at search using a pragmatic & holistic view, then this year the only thing that really changed with "content" farms is that you can now insert the word "video" for "content" & almost all that video is hosted on YouTube.

To highlight the absurdity, I created another XtraNormal video. :)


SMX East 2011 Recap for SEOBook

Useful Links:

SMX Facebook: http://www.facebook.com/searchmarketingexpo
Twitter # activity for the conference: http://twitter.com/#!/search/%23smx

Another successful SMX East is in the books. From all accounts, the event went off without a hitch. Kudos to Danny Sullivan, Claire Schoen, and crew, as the caliber of speakers, sessions, and attendees was top notch, as always. Judging from the event, search marketing is alive and thriving more than ever before. There was a healthy mix of industry experts, consultants, large corporations, agencies, and small businesses. The sessions covered a broad range of topics, from beginner link building fundamentals to more advanced technical SEO sessions covering site architecture, technical coding optimization and everything in between. A huge thank you goes out to the organizers for a job well done.

It seemed there were two themes that surfaced regularly - Panda and Google Plus/+1. Clearly, there are still many webmasters struggling with Panda and how to properly handle content in the new post-Panda world. The search engines are addressing this and giving webmasters and SEOs more tools and information to organize their websites correctly. After some of the presentations, it seems Google is very dedicated to their Plus and +1 initiatives, which will have a large effect on SEO should end user usage continue to increase.

Below are tidbits and takeaways from the conference, from an SEO perspective. Enjoy!



Schema.org, Rel=Author & Meta Tagging For 2012

Panelists:
Janet Driscoll Miller, Search Mojo http://twitter.com/#!/janetdmiller
Topher Kohan, CNN https://twitter.com/#!/Topheratl
Jack Menzel, Google http://twitter.com/#!/jackm

Microformats were the original snippet format; however, they have been replaced by the new and evolving standard, microdata (which Schema.org/Google/Bing are developing for and placing resources towards). A quick way to sanity-check your own markup is sketched after the list below. Some notes from the presentations:

  • General consensus is rich snippets can greatly help in getting your content noticed.
  • In one example given, Eatocracy added the hRecipe tag to their pages, and immediately saw a 47% increase in their recipes being picked up and indexed into Google (which does support this in their recipe search). Additionally, they saw a 22% increase in their recipe traffic.
  • CNN started using Yahoo SearchMonkey / RDFa, and saw a 35% increase in their video content on Google Video search, and a 22% increase in overall search traffic. However, they removed the additional code from their site as it increased their page load time. The takeaway is that you should plan to integrate this into your own dev cycle, your CMS, or your template.
  • Per Google, their studies show that sites w/ rich snippets have a better CTR as well. Rich Snippets Engineer at Google, RV Guha noted, “From our experiments, it seemed that giving the user a better idea of what to expect on the page increases the click-through rate on the search results. So if the webmasters do this, it’s really good for them. They get more traffic. It’s good for users because they have a better idea of what to expect on the page. And, overall, it’s good for the web.”
  • Rich snippets only work for one site (no cross site references).
  • Sites like LinkedIn and Google Profiles still use microformats. Google has also provided a tool in WMT, but it is a bit buggy and may throw false errors. If you don't see your snippets show up in the SERPs, it's likely caused by longer-than-preferred page load times, errors in your code, or a random Google bug (per Google).
  • The current types of rich snippets: reviews, people, products, businesses & organizations, recipes*, events, music
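
Since the WMT validator can throw false errors, it can pay to sanity-check your own markup directly. Below is a minimal stdlib sketch (the sample HTML is hypothetical) that lists the itemtype and itemprop attributes a crawler would see:

```python
# A minimal, self-contained sketch for sanity-checking schema.org
# microdata markup. Stdlib only; the sample HTML is hypothetical.
from html.parser import HTMLParser

class MicrodataAudit(HTMLParser):
    """Collect itemtype and itemprop attributes from a page."""
    def __init__(self):
        super().__init__()
        self.itemtypes = []
        self.itemprops = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "itemtype" in attrs:
            self.itemtypes.append(attrs["itemtype"])
        if "itemprop" in attrs:
            self.itemprops.append(attrs["itemprop"])

sample = """
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Grandma's Pickle Pasta</h1>
  <span itemprop="author">Jane Doe</span>
</div>
"""

audit = MicrodataAudit()
audit.feed(sample)
print(audit.itemtypes)  # ['http://schema.org/Recipe']
print(audit.itemprops)  # ['name', 'author']
```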


Session - “Ask the Search Engines”

Panelists:
Tiffany Oberoi - Google http://twitter.com/#!/tiffanyoberoi
Duane Forrester - Bing http://twitter.com/#!/duaneforrester
Rich Skrenta - Blekko http://twitter.com/#!/skrenta

  • One audience member asked how to handle 'subcategory' pages that are often created in ecommerce sites, such as "Sort Prices $0-$5", "Prices $5-$25", etc. The question was whether or not to use the "rel=canonical" tag and point the pages back to the main page. The panelists agreed that those pages should be blocked completely and should not use the canonical tag. The Google representative said not only do these pages not add value to the engine's index, but they also eat up the site's crawl budget.
  • If you see the warning "we're seeing a high # of URLs" in Webmaster Tools, most times it's a duplicate content issue.
  • One audience member asked: do you look at subdomain as part of the main domain?
    • Blekko - no inheritance from main domain
    • Google - "it depends". Sometimes it is inherited, sometimes not.
    • Bing - we look and try to determine if subdomain is a standalone business/website and will get treated differently based on that determination
  • One question touched on removing URL’s from Google’s index. Google advised that a removed URL may or may not stay in the index for a period of time, and that to expedite removal of a URL one should use Webmaster Tools remove-url tool
  • Duane from Bing was adamant about keeping your submitted sitemap clean. The threshold is 1%. If more than 1% of the URLs in your submitted sitemap have issues, Bing will "lose trust" in your website
  • Panelists advised to make your 404 pages useful to the user
  • It may not be breaking news, but Bing and Google both said unequivocally - duplicate content does hurt you
  • Google commented they are big fans of HTML 5 technology
  • At this point it seems Google will crawl a page if +1 is present, regardless of the robots.txt. This could possibly create issues with trying to not crawl certain pages to avoid dup content. More information found here: http://www.webmasterworld.com/google/4358033.htm
  • Panelists advised to spend a lot of energy “containing urls” on your website and to be thoughtful about which URLs you are getting out there
  • Bing and Google confirmed that "PageRank sculpting" is misunderstood and not effective. For example, if a page has 5 outgoing links and link juice is spread 20% to each of the 5 links, and you nofollow one of the links, the link juice distribution will not become 25% to each of the remaining 4 links. It will remain 4 x 20%. In essence, you have just evaporated potential link juice (see the sketch below)
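
Here is the sketch referenced above: a toy model of the "evaporation" arithmetic the panelists described. Real PageRank flow is far more complex; this just shows why nofollow no longer concentrates juice.

```python
# A toy sketch of the nofollow behavior described by the panelists:
# juice is divided across ALL outgoing links, and the share assigned
# to nofollowed links simply evaporates rather than redistributing.
def juice_per_followed_link(total_links, nofollowed):
    share = 1.0 / total_links
    followed = total_links - nofollowed
    return share, followed * share  # per-link share, total passed on

share, passed = juice_per_followed_link(5, 1)
print(share)   # 0.2 -> each followed link still passes 20%
print(passed)  # 0.8 -> the other 20% evaporates, it is not redistributed
```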

Google Plus and +1

These were hot topics at this year's SMX East. Multiple sessions covered Google Plus and +1 in depth.

  • Speaker Benjamin Vigneron from Esearchvision covered the basics of Google Plus and +1. He noted a +1 on a search result will +1 the PPC ad/landing page, too.
  • With PPC, +1 could have a significant effect on Ad Rank by affecting each of the Quality Score factors, including quality of the landing page, CTR, and the ad's past performance.
  • It is interesting that AdWords could conceivably add segmenting on all information in Google Plus (similar to Facebook), e.g. gender, age, etc.
  • Christian Oestlien, the Google Product Manager for Google Plus, spoke about Google Plus features and fielded questions. He mentioned Google is testing and experimenting with celebrity endorsements +1'ing and showed an example SERP with a +1 annotation under the search result (for example “Kim Kardashian has +1’ed” Brand X or search result X). He noted Google is seeing much higher CTR with the +1 annotation and that usage for the “Circles” feature is relatively high.
  • Google software engineer Tiffany Oberoi was also present on the panel. She noted +1 is NOT a ranking factor, but social search is still of course implemented in search results. She confirmed Facebook likes have no impact on rankings, but also noted regarding social signals, "explicit user feedback is like gold for us". She also touched on spam with +1 and said she is currently working with the spam team. Regarding +1's and spamming, she said to think of +1's similarly to links. The same guidelines could apply. Google wants to use them as a real signal. Using them in an unnatural way will not be good for you.

Hardcore Local Search Tactics

Panelists:
Matt McGee - Search Engine Land
Mike Ramsey - Nifty Marketing
Will Scott - Search Influence

Panelists here gave an encore presentation of the session these folks put on at SMX Advanced in Seattle. The content was excellent and definitely deserved another run through. Here are the notes:

  • On July 21st, Google removed citations from their Places listings. While they have been removed from public view, they are still used. Sources like Whitespark (link: http://www.whitespark.ca/) can be very helpful in uncovering citation building opportunities.
  • Citation accuracy is among the most important factors in getting your business to rank in the OneBox or 7-Pack. Doing a custom Google search of "business name"+"address"+"phone number" will help determine what other sources Google sees as citation sources.
  • The average number of IYP reviews showed a large gap between ranked and non-ranked listings, indicating that IYP reviews do in fact provide quite a bit of listing weight.
  • Offsite citations/data appear to be the no. 1 ranking factor in Places listings
  • Linking Root Domains appear to be the no. 2 ranking factor in Places listings
  • Exact match anchor links appear to be the no. 3 ranking factor in Places listings
  • Links are the new citations for local in 2011-12
  • Building a custom landing page to link your Places Listing to appears to be a huge success factor. Include your Name, Address, Phone (NAP) in the title tag
    • Design that landing page to mirror a Places listing on their site w/ a map, business hours, contact data, etc.
    • If needed, submit your contact/location page as your Places URL/Landing Page which will create a stronger geo scent
  • When trying to understand how users are searching for your client, Insights for Search is a great tool, as you can find geo-targeted data w/ keyword differentiation (i.e. Lawyer vs Attorney, whichever is used more in that area)
  • Local requires a different mindset from traditional SEO
    • Optimize location (local SEO) vs Optimize websites (traditional SEO)
    • Blended search is about matching them up
  • PageRank of Places URL does NOT seem to affect Local ranking -(source: David Mihm)
  • Multi-Location Tips
    • Flat site architecture beginning w/ a “Store Locator” page
      • Great Example, lakeland.co.uk/StoreLocator.action
    • Give each location its own page
      • Great Example, lakeland.co.uk/stores/aberdeen
    • Cross link nearby locations w/ geo anchor text
  • Ensure the use of a KML sitemap in Google WMT (a minimal example is sketched after this list)
  • Encourage Community Edits - Make Use of Google’s Map Maker
  • Include Geo data in Facebook pages and article engines
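
As referenced in the list above, here is a minimal sketch of generating a KML file for store locations (stdlib only; the store data is hypothetical). The resulting file is what you would reference from Webmaster Tools:

```python
# A minimal sketch of generating a KML file for store locations.
# Stdlib only; the store data below is hypothetical.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def build_kml(stores):
    ET.register_namespace("", KML_NS)
    kml = ET.Element("{%s}kml" % KML_NS)
    doc = ET.SubElement(kml, "{%s}Document" % KML_NS)
    for store in stores:
        pm = ET.SubElement(doc, "{%s}Placemark" % KML_NS)
        ET.SubElement(pm, "{%s}name" % KML_NS).text = store["name"]
        ET.SubElement(pm, "{%s}address" % KML_NS).text = store["address"]
        point = ET.SubElement(pm, "{%s}Point" % KML_NS)
        # KML coordinates are longitude,latitude
        ET.SubElement(point, "{%s}coordinates" % KML_NS).text = (
            "%f,%f" % (store["lon"], store["lat"]))
    return ET.tostring(kml, encoding="unicode")

stores = [{"name": "Aberdeen Store",
           "address": "1 Union Street, Aberdeen, UK",
           "lon": -2.0943, "lat": 57.1497}]
print(build_kml(stores))
```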

Panda Recovery Case Study - High Gear Media

Speaker Matt Heist from High Gear Media covered their experiences over the past 8 months with recovering from Panda. High Gear Media is an online publisher of auto news and reviews.

Heist walked through the company's strategy pre-Panda and explained their contrasting new post-Panda strategy. The original strategy was many auto review niche sites across a broad range of auto makes, models and manufacturers. The company originally had 107 sites and 20+ writers, and dispersed content amongst all the sites. The content was "splashing" everywhere, unfocused. The "large network of microsites" strategy was working and traffic was climbing each month. Then Panda hit - hard. Traffic plummeted beginning this past spring. Leaders at High Gear were forced to reevaluate their strategy and concluded that a more focused approach was better for users and consequently would help search traffic recover.

High Gear took the following actions:

  • Eliminated most of their properties completely (301'ed, as sketched below this list) and pared them down to 7 total sites, with 4 being 'core': FamilyCarGuide, Motorauthority, GreenCarReports, TheCarConnection.
  • Properly canonicalized duplicate content
  • Aggregated content with strong user engagement was KEPT, but not indexed
  • They made the hard decision to eliminate content that could be making money but was not good for the long term
  • Dedicated significant resources to redesigning each of the 7 remaining sites
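
A minimal sketch of the kind of permanent redirect used when folding retired microsites into a core site (stdlib only; the target domain is hypothetical, and in practice this would be a couple of lines in your web server config rather than a Python process):

```python
# A minimal sketch of 301-redirecting a retired microsite to a core
# site. The target domain is hypothetical; in production this would
# normally live in your web server (Apache/nginx) configuration.
from http.server import BaseHTTPRequestHandler, HTTPServer

CORE_SITE = "https://www.example-core-site.com"  # hypothetical target

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 (permanent) tells search engines to transfer the old
        # URL's equity to the new location.
        self.send_response(301)
        self.send_header("Location", CORE_SITE + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()
```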

Their strategy seems to be working. Heist noted traffic has "flipped, plus some". According to Heist, here are the learnings:

  • High Gear Media believes that premium content will prevail and that Panda will help that
  • Advertisers like bigger brands - it is now easier to sell ads and for more $ with fewer, more powerful sites
  • With the evolution of Social (joining Search from a distribution perspective), premium content that is authoritative AND fresh will flourish

Raven Tools

We were able to meet up with the friendly staff over at Raven Tools, sit down with them, and learn a bit more about their product. We personally have been using Raven for about a year now, and highly recommend it. There are several features in the works that will make this even more of an incredible product. If you haven't used them, we would HIGHLY suggest giving the tools a run. They are partnering with new companies constantly, and as such, are building out a best-in-class SEO management product.

Upcoming Features:

  • A new feature they are working on is a Chrome toolbar to complement the current Firefox toolbar
  • Another feature coming is "templated messaging" for link requests and manual link building, which will include BCCs back to records. Templated messaging will be built into their Contact Manager, but they are working on making that functionality available in the toolbar.
  • Another upcoming feature is file management. Raven Tools engineers are looking at integrating Dropbox into the system to allow files to be associated with other data and records.
  • Co-Founder Jon Henshaw alluded many times to the idea that link building, and consequently their toolset, will continue to become more and more based on relationships in the future. He also alluded to the idea that traffic can, or in some cases should, be associated with PEOPLE as the referrer, rather than a website (i.e. X amount of traffic came from person A, whether via their Facebook, Twitter, blog, or website). In other words, a relationship management system looks to be an integral part of the future of Raven Tools.
  • For future updates, Raven Tools takes explicit user feedback greatly into account. If you have a feature request or a software integration request, please contact: http://raventools.com/feature-requests/
  • Regarding MajesticSEO and OSE/Linkscape, they will be more fully integrating them into the Research section of Raven. That means they'll be adding as much functionality into Raven as those APIs will allow. In addition to getting more full access to that data, users will be able to easily add that data to other tools, like the Keyword and Competitor Managers, Rank Tracker, etc.
  • Speed is the number one priority right now. They have full-time staff that are solely dedicated to speeding up the system. The goal is to make it run as fast as a desktop app.
  • Long term - 3rd party integration will be a constant (and should accelerate) for the platform for the foreseeable future.
  • Screenshot of "Social Stream" prototype design http://cl.ly/1b2h0u3P3U441w000o1K/o
  • AdWords Insights: Flagged Pages: http://cloud.raven.im/9v8d
  • Link Clips link checker results with historical results: http://cloud.raven.im/9zgK/o

Other Notes

  • Regarding Panda, one panelist referenced what he called a website's "Content Performance Ratio", referring to the % of content on a site that is good versus bad, or 'performing vs non-performing', and using that as a gauge of the health of a website.
  • One panelist also noted that in his experience it takes 3-4 requests on a 404 before a search engine believes you and removes the URL from the index.
  • Panelists in the “Ask the SEO” session said to pay close attention to anchor text diversity and human engagement signals

Author bio:
Jake Puhl is the Co-Founder/Co-Owner of Firegang Digital Marketing, a local search marketing company specializing in all aspects of "Local", including custom web design, SEO, Google Places, and local PPC advertising. Jake has personally consulted for businesses from Hawaii to New York and everywhere in between. Jake can be contacted at jacobpuhl at firegang.com.

Endless AdWords Profits

"To thine own self be true"

In a word?

Prescient!

10 links in a single AdWords ad unit!

Then more ads below it. Then a single organic listing with huge sublinks too. And unless you have a huge monitor at that point you are "below the fold."

Negative advertising in AdWords is not allowed. So long as you build enough brand signals & pay the Google toll booth, public relations issues & reputation issues won't be accessible to searchers unless they learn to skip over the first screen of search results.

While it is generally against Google's TOS for advertisers to double dip in AdWords (outside of the official prescribed oversize ad units highlighted above), Google is doing exactly that with their multitude of brands.

BeatThatQuote is back yet again.

The line between ads & content is getting blurry. Mighty blurry.

Is it time yet for a new slogan?

Google: the sales engine!

Google+ Doorway Pages / Scraper Site

Another friend sent me a message today: "just got a whole swathe of non-interlinked microsites torched today. Bastard! Just watching the rank reports coming in..."

I haven't seen his sites, but based on how he described them ("whole swathe") I wouldn't guess the quality to be super high. One thing you could say for them was that they were unique.

Where putting in the effort to create original content falls flat on its face is when search engines choose to rank aggregators (or later copies) above the original source. The issue has gotten so out of hand that Google has come right out & asked for help with it.

The big issue is that Google is often the culprit. Either indirectly through their ads programs & algorithmic biases or more directly through the launch of new features.

When Google launched Knol I was quick to flame them after I saw them ranking recycled content on Knol ahead of the original source. The Knol even highlighted similar works, showing that Google allowed Knol to outrank earlier sources of the same work.

In a recent WebmasterWorld thread Brett Tabke stated that Google is putting serious weight on Google+:

Some Google+ SEO factors now trump linking as prime algo ingredient. Google+ is already and clearly influencing rankings. I watched a presentation last night that definitely showed that rankings can occur from Google+ postings and photo's with no other means of support.

As Google+ grows - so will Google's understanding of how to use it as rankings signals.

We are not playing Google+ because we want to - we are playing Google+ because we have to.

I read that sorta half hoping he was wrong, but knowing he rarely is.

And then today Google hit me across the head with a 2x4, proving he was right again.

Business Insider is not some small niche site that Google can justify accidentally deleting from the web with 2 clicks of a mouse, yet when I was doing a *navigational* search, trying to find a piece of their content I had already read, guess what popped up in the search results.

Yup. Google+

What's worse is that it isn't from a friend, isn't from the original source, is the full article wholesale from Google Reader, and the source URL has Google's feedproxy in it.

If Google wants to add value to the ecosystem & insert themselves as a new layer of value, then how can we do anything but welcome it? However, when they want to take 3rd party content & "wrap it in Google", it is absolutely unacceptable for them to outrank the original source with their copy of it, even if they feel they deserve to outrank it & have made multiple copies of it.

With large complex systems, I get that some advice will be self-serving and that progress often comes with bumps and bruises.

But Google's dominance in search coupled with their dominance in display (from owning DoubleClick & YouTube) has led competing portals to team up to try to compete against Google with display ads.

And, if the big portals are struggling that much, then at the individual publisher level, how do you profitably produce content when Google takes your content & ranks their copy ahead of yours?

New Google "Search Results" Bar

I recently got put in a test bucket for Google's new layout with a "search results" bar near the top of the page. Generally this impacts the search results in a couple ways:

  • First off, it is a much better looking design. In the past when the search results would move up and down with Google Instant it really felt like a hack rather than something you would see on the leading internet company's main website. Now with the results fixed it feels much cleaner & much more well put together.
  • The more stable basic layout of the SERP will allow Google to integrate yet more vertical data into it while still making it look & feel decent. Google may have had localized search suggestions & localized organic results for a significant period of time, but the combination of them with this new layout, where the search results don't move, feels much more cohesive.
  • To get the white space right on the new layout, Google shifted from offering 5 Instant suggestions to 4. The Google Instant results don't disappear unless you hit enter, but because the interface doesn't change & move, there isn't as much need to hit enter. The search experience feels more fluid.
  • The horizontal line above the search results and the word "Search" in red in the upper left of the page is likely to pull some additional attention toward Google's vertical search features, helping Google to collect more feedback on them (and further use that user behavior to create a signal to drive further integration of the verticals into the regular organic search results).
  • On the flip side of this, in the past the center column would move up & down while the right column would remain stationary, so I would expect this to slightly diminish right column ad clicks (that appeared at the top even when the organic results moved downward) while boosting center column clicks to offset that.
  • In the past, when Google Instant would disappear from view, that would pull the center column organic results up a bit.
    • This always-on bar shifts the pixels above the first search result from about 123 to 184...so roughly 60 pixels downward.
    • As a baseline, a standard organic listing with no extensions is about 90 pixels tall, so this moves the search results down roughly 2/3 of a listing, which should drive more traffic to the top paid search ads & less to the organic results below them (offset by any diminished clicks on the right column ads). The arithmetic is sketched below this list.
    • This is a much cleaner way of taking advantage of white space than some of the cheesy & ugly-looking stuff they recently tested.
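
The fold arithmetic from the bullets above, as a quick sketch (the pixel figures are the rough estimates from the text, not exact measurements):

```python
# The fold arithmetic from the bullets above. Pixel figures are the
# article's rough estimates.
old_offset = 123     # pixels above the first organic result, old layout
new_offset = 184     # pixels above the first organic result, new layout
listing_height = 90  # approximate height of a standard organic listing

shift = new_offset - old_offset
print(shift)                             # 61 -> "roughly 60 pixels downward"
print(round(shift / listing_height, 2))  # 0.68 -> about 2/3 of a listing
```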

I tried to line up the results pretty closely on the new test results to show what they look like with Google Instant results showing & after you hit enter. Scroll over the below image to see how the result layout doesn't really change with Google Instant hidden or extended.

And here is an example image showing how the location is sometimes inserted directly into both the organic search results and the search suggestions.

Here is an image using Google's browser size tool to show how end users see the new search results. Note that in this example I used a keyword where Google has comparison/advisor ads, so in markets where they do not yet have those you would move all the organic results one spot up from what is shown below.

Google Offers a New Definition for Doorway Pages?

In the past doorway pages could be loosely defined as "low-quality pages designed to rank for highly targeted search queries, typically designed to redirect searchers to a page with other advertisements."

The reason they are disliked is the click-circus impact they have on web users, who keep clicking in an infinite loop of ads.

This would be a perfect example of that type of website:

However, ever since Google started to eat their "organic" search results, the definition of doorway pages has changed significantly.

A friend of mine told me that the reason CSN stores had to merge into a "brand" was not just because that was the direction of the algorithm, but also because they were hit with the "doorway page" penalty. I don't know if that is 100% accurate, but it sure sounds plausible, given that... (UPDATE: SEE COMMENTS BELOW)

  • recently multiple friends have told me they were hit with the "doorway page" issue
  • on WebmasterWorld there are multiple threads from small ecommerce players suggesting they were hit with the doorway page issue
    • "Today we received messages in our webmaster tools account, for all but 1 of our 20 domains, indicating that Google considers them doorway pages. We have also lost all of our SERP's for those sites." - Uncle_DK
    • "I was rather disappointed to see that before banning the site the rater visited a very drab and ordinary page on my site. Not a smoking gun of some incriminating evidence of a hacker break-in or some such I was looking for. Also disappointing is the fact that they visited one page only." - 1script
  • another friend today told me that one of their clients runs numerous websites & that ALL of the sites in the Google webmaster tools account blew up, getting hit with the "doorway page" label (and ALL the sites that were not in that webmaster tools account were missed by the Google engineers)


Like almost anything else Google offers, their webmaster tools are free, right up until Google changes their business objectives and one of their engineers decides that he should put you out of business.

I *knew* the point of the Panda update was not to kill content farms, but to use content farms as a convenient excuse to thin the herd of webmasters & consolidate markets. A couple obvious tells on that front were:

  • the update taking so long to happen
  • the first version of the Panda update embarrassingly missing eHow
  • the update hitting so many small ecommerce websites, even as it somehow missed eHow

Part of the brand bias in Google Panda allowed corporate branded doorway pages to rank higher than ever. Google's solution to this problem is, once again, to punish the victim - wiping independent webmasters off the web.

What is the new definition of doorway pages?

Pages on non-brand websites, not owned by a Fortune 500 company, which aggressively monetize web traffic without giving Google a piece of the action.

If you are not a brand you can be wiped out at any time with absolutely 0 recourse unless you can damage Google's brand or harm their standing before market regulators.

If you want to be an independent webmaster you better study public relations. Start here, with Edward Bernays.

Wal-Mart has received a bad reputation for how their dominant control of the supply chain sucked most of the profits out of some markets & drove some of their suppliers into bankruptcy:

Young remembers begging Wal-Mart for relief. "They said, 'No way,' " says Young. "We said we'll increase the price"--even $3.49 would have helped tremendously--"and they said, 'If you do that, all the other products of yours we buy, we'll stop buying.' It was a clear threat."
...
Finally, Wal-Mart let Vlasic up for air. "The Wal-Mart guy's response was classic," Young recalls. "He said, 'Well, we've done to pickles what we did to orange juice. We've killed it. We can back off.' " Vlasic got to take it down to just over half a gallon of pickles, for $2.79. Not long after that, in January 2001, Vlasic filed for bankruptcy.

Such strong-arm business negotiation tactics might be sleazy, but you know one thing Wal-Mart does do? They tolerate multiple brands from a single manufacturer. In fact, many leading manufacturers are creating down market brands to compensate for the economic malaise we are going through:

P&G's roll out of Gain dish soap says a lot about the health of the American middle class: The world's largest maker of consumer products is now betting that the squeeze on middle America will be long lasting.

As far as publishing business models go, if Google starts calling ecommerce sites that are part of a network "doorway sites", then Google isn't really allowing that sort of testing, unless the content comes from a Fortune 500 or is content conveniently hosted on Google.com. As a publisher or merchant, how do you ever grow to scale if you are not allowed to test running multiple projects & products in parallel & keep reinvesting in whatever works best?

Even the biggest publishers are breaking some of their core brands into multiple sites (eg: Boston.com vs BostonGlobe.com) to test different business models. If you have scale that is fine, but if you are smaller that same strategy might soon be considered a black hat doorway strategy.

