Favicon SEO

Google recently copied their mobile result layout over to desktop search results. The three big pieces which changed as part of that update were:

  • URLs: In many cases Google now shows breadcrumbs in the search results rather than the full URL. The layout no longer differentiates between HTTP and HTTPS, and the URLs shifted from an easily visible green to a much easier-to-miss black.
  • Favicons: All listings now show a favicon next to them.
  • Ad labeling: the ad label sits in the same spot as the favicons do for organic search results, but it is rendered in a black that blends into the URL line. Over time, expect the ad label to become lighter, paralleling how Google lightened ad background colors over the years.

One could expect this change to boost the CTR on ads while lowering the CTR on organic search results, at least until users get used to seeing favicons and stop mistaking the organic listings for ads.

The Verge panned the SERP layout update. Some folks on Reddit hate this new layout as it is visually distracting, the contrast on the URLs is worse, and many people think the organic results are ads.

I suspect a lot of phishing sites will use subdomains patterned off the brand they are arbitraging coupled with bogus favicons to try to look authentic. I wouldn't reconstruct an existing site's structure based on the current search result layout, but if I were building a brand new site I might prefer to put it at the root instead of on www so the words were that much closer to the logo.

Google provides the following guidelines for favicons:

  • Both the favicon file and the home page must be crawlable by Google (that is, they cannot be blocked to Google).
  • Your favicon should be a visual representation of your website's brand, in order to help users quickly identify your site when they scan through search results.
  • Your favicon should be a multiple of 48px square, for example: 48x48px, 96x96px, 144x144px and so on. SVG files, of course, do not have a specific size. Any valid favicon format is supported. Google will rescale your image to 16x16px for use in search results, so make sure that it looks good at that resolution. Note: do not provide a 16x16px favicon.
  • The favicon URL should be stable (don’t change the URL frequently).
  • Google will not show any favicon that it deems inappropriate, including pornography or hate symbols (for example, swastikas). If this type of imagery is discovered within a favicon, Google will replace it with a default icon.
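The size rule above is mechanical enough to check in code. A minimal sketch in Python (the function name is mine; it assumes you have already read the pixel dimensions out of the image file):

```python
def valid_favicon_size(width: int, height: int, is_svg: bool = False) -> bool:
    """Check icon dimensions against Google's guideline: a square that is
    a multiple of 48px (SVG has no fixed size, so it always passes).
    Google rescales the image to 16x16 for display, but asks you not to
    provide a bare 16x16 file."""
    if is_svg:
        return True
    if width != height:      # must be square
        return False
    if width == 16:          # explicitly discouraged by the guidelines
        return False
    return width > 0 and width % 48 == 0   # 48, 96, 144, ...

print(valid_favicon_size(96, 96), valid_favicon_size(50, 50))  # True False
```

Anything that fails this check will either be ignored or rescale poorly, so it is worth validating icons before deploying them.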

In addition to the above, I thought it would make sense to provide a few other tips for optimizing favicons.

  • Keep your favicons consistent across sections of your site if you are trying to offer a consistent brand perception.
  • In general, less is more. 16x16 is a tiny space, so if you try to convey a lot of information inside of it, you'll likely end up creating a blob that almost nobody but you recognizes.
  • It can make sense to include the first letter from a site's name or a simplified logo widget as the favicon, but it is hard to include both in a single favicon without it looking overdone & cluttered.
  • A colored favicon on a white background generally looks better than a white icon on a colored background, as having a colored background means you are eating into some of the scarce pixel space for a border.
  • Using a square shape versus a circle gives you more surface area to work with.
  • Even if your logo has italics on it, it might make sense to avoid using italics in the favicon to make the letter look cleaner.

Here are a few favicons I like & why I like them:

  • Citigroup - manages to get the word Citi in there while looking memorable & distinctive without looking overly cluttered
  • Nerdwallet - the N makes a great use of space, the colors are sharp, and it almost feels like an arrow that is pointing right
  • Inc - the bold I with a period is strong.
  • LinkedIn - very memorable using a small part of the word from their logo & good color usage.

Some of the other memorable ones that I like include: Twitter, Amazon, eBay, Paypal, Google Play & CNBC.

Here are a few favicons I dislike & why:

  • Wikipedia - the W is hard to read.
  • USAA - they included both the logo widget and the 4 letters in a tiny space.
  • Yahoo! - they used inconsistent favicons across their sites & used italics on them. Some of the favicons have the whole word Yahoo in them while others are the Y! in italics.

If you do not have a favicon Google will show a dull globe next to your listing. Real Favicon Generator is a good tool for creating favicons in various sizes.

What favicons do you really like? Which big sites do you see that are doing it wrong?

Reinventing SEO

Back in the Day...

If you are new to SEO it is hard to appreciate how easy SEO was, say, 6 to 8 years ago.

Almost everything worked quickly, cheaply, and predictably.

Go back a few years earlier and you could rank a site without even looking at it. :D

Links, links, links.

Meritocracy to Something Different

Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.

These days most of the best minds in SEO don't blog often. And some of the authors who frequently publish literally everywhere are a series of ghostwriters.

Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.

Yet if you share something which causes search engineers to change their relevancy algorithms in response the half-life of that algorithm shift can last years or maybe even decades.

Investing Big

These days breaking in can be much harder. I see some 3- or 4-month-old sites with over 1,000 high-quality links, which have clearly invested deep into six figures, getting about 80 organic search visitors a month.

From a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.

Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.

Most of the types of people who have the confidence and knowledge to invest deep into six figures on a brand new project aren't creating "how to" SEO information and giving it away for free. Doing so would only harm their earnings and lower their competitive advantage.

Derivatives, Amplifications & Omissions

Most of the info created about SEO today is derivative (written by people who write about SEO but don't practice it) or comes from people overstating the risks and claiming x and y and z don't work, can't work, and will never work.

And then from there you get the derivative amplifications of don't, can't, won't.

And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.

Measuring the Risks

If you are using lagging knowledge from derivative "experts" to drive strategy you are most likely going to lose money.

  • First, if you are investing in conventional wisdom then there is little competitive advantage to that investment.
  • Secondly, as techniques become more widespread and widely advocated Google is more likely to step in and punish those who use those strategies.
  • It is when a strategy is most widely used and seems safest that the risk is at its peak while the rewards are de minimis.

With all the misinformation, how do you find out what works?

Testing

You can pay for good advice. But most people don't want to do that, they'd rather lose. ;)

The other option is to do your own testing. Then when you find somewhere conventional wisdom is wrong, invest aggressively.

"To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right." - Jeff Bezos

That doesn't mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn't consider doing. That is how you stand out & differentiate.

But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you're hosed.

False Positives

And, even if you do nothing wrong, if you don't build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.

Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”

“How did you go bankrupt?”
“Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises

True Positives

A lot of SEMrush charts look like the following

What happened there?

Well, obviously that site stopped ranking.

But why?

You can't be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.

That said, there are constant shifts in the algorithms across regions and across time.

Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested...

He also explained the hole Google has in their Arabic index: spam is much more effective there because there is little useful content to index and rank, and Google modeled their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is also less of a priority because they view evolving with mobile friendliness, AMP, etc. as a higher priority. They algorithmically ignore many localized issues & try to clean up some aspects of that manually. But whoever is winning with the spam stuff at the moment might not only lose to an algorithm update or manual clean up; once Google has something great to rank there, it will eventually win, displacing some of the older spam on a near-permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.

Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways it doesn't mean that a site which once ranked

  • deserved to rank
  • will keep on ranking

In fact, sites which don't get a constant stream of effort & investment are more likely to slide than have their rankings sustained.

The above SEMrush chart is for a site which uses the following as their header graphic

When there is literally no competition and the algorithms are weak, something like that can rank.

But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.

Further, a site like that would struggle to get any quality inbound links or shares.

If nobody reads it then nobody will share it.

The content on the page could be Pulitzer prize level writing and few would take it seriously.

With that design, death is certain in many markets.

Many Ways to Become Outmoded

The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.

Excessive keyword repetition, like a footer with the same phrase repeated 100 times.

Excessive focus on monetization to where most visitors quickly bounce back to the search results to click on a different listing.

Ignoring the growing impact of mobile.

Blowing out the content footprint with pagination and tons of lower quality backfill content.

Stale content full of outdated information and broken links.

A lack of investment in new content creation AND promotion.

Aggressive link anchor text combined with low quality links.

Investing in Other Channels

The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.

Why is Facebook doing so well? In part because Google did the search equivalent to what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well worn paths. If Google doesn't want to rank smaller sites, their associated algorithmic biases mean Facebook and Amazon.com rank better, thus perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.

Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.

Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.

People can't explicitly look for you in a differentiated way unless they are already aware you exist.

Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However if you are selling a product the customer already bought or you are marketing to marketers there is a good chance such investments will be money wasted while you alienate past customers.

Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads to where you can't scroll down the page enough to have their ad disappear before seeing their ad once again.

If you don't have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.

And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you'll likely stay penalized for a long, long time.

While waiting for an update, you may find you are Waiting for Godot.

Guide To Optimizing Client Sites 2014

For those new to optimizing client sites, or those seeking a refresher, we thought we'd put together a guide to step you through it, along with some selected deeper reading on each topic area.

Every SEO has different ways of doing things, but we’ll cover the aspects that you’ll find common to most client projects.

Few Rules

The best rule I know about SEO is there are few absolutes in SEO. Google is a black box, so complete data sets will never be available to you. Therefore, it can be difficult to pin down cause and effect, so there will always be a lot of experimentation and guesswork involved. If it works, keep doing it. If it doesn't, try something else until it does.

Many opportunities tend to present themselves in ways not covered by “the rules”. Many opportunities will be unique and specific to the client and market sector you happen to be working with, so it's a good idea to remain flexible and alert to new relationship and networking opportunities. SEO exists on the back of relationships between sites (links) and the ability to get your content remarked upon (networking).

When you work on a client site, you will most likely be dealing with a site that is already established, so it’s likely to have legacy issues. The other main challenge you’ll face is that you’re unlikely to have full control over the site, like you would if it were your own. You’ll need to convince other people of the merit of your ideas before you can implement them. Some of these people will be open to them, some will not, and some can be rather obstructive. So, the more solid data and sound business reasoning you provide, the better chance you have of convincing people.

The most important aspect of doing SEO for clients is not blinding them with technical alchemy, but helping them see how SEO provides genuine business value.

1. Strategy

The first step in optimizing a client site is to create a high-level strategy.

"Study the past if you would define the future." - Confucius

You’re in discovery mode. Seek to understand everything you can about the client’s business and their current position in the market. What is their history? Where are they now and where do they want to be? Interview your client. They know their business better than you do and they will likely be delighted when you take a deep interest in them.

  • What are they good at?
  • What are their top products or services?
  • What is the full range of their products or services?
  • Are they weak in any areas, especially against competitors?
  • Who are their competitors?
  • Who are their partners?
  • Is their market sector changing? If so, how? Can they think of ways in which this presents opportunities for them?
  • What keyword areas have worked well for them in the past? Performed poorly?
  • What are their aims? More traffic? More conversions? More reach? What would success look like to them?
  • Do they have other online advertising campaigns running? If so, what areas are these targeting? Can they be aligned with SEO?
  • Do they have offline presence and advertising campaigns? Again, what areas are these targeting and can they be aligned with SEO?

Some SEO consultants see their task as gaining more rankings under an ever-growing list of keywords. Ranking for more keywords, or getting more traffic, may not result in measurable business returns, as it depends on the business and the marketing goals. Some businesses will benefit from homing in on specific opportunities that are already being targeted, others will seek wider reach. This is why it’s important to understand the business goals and market sector, then design the SEO campaign to support the goals and the environment.

This type of analysis also provides you with leverage when it comes to discussing specific rankings and competitor rankings. The SEO can’t be expected to wave a magic wand and place a client top of a category in which they enjoy no competitive advantage. Even if the SEO did manage to achieve this feat, the client may not see much in the way of return as it’s easy for visitors to click other listings and compare offers.

Understand all you can about their market niche. Look for areas of opportunity, such as changing demand not being met by your client or competitors. Put yourself in their customers’ shoes. Try to find customers and interview them. Listen to the language of customers. Go to places where their customers hang out online. From the customers’ language and needs, combined with the knowledge gleaned from interviewing the client, you can determine effective keywords and themes.

Document. Get it down in writing. The strategy will change over time, but you’ll have a baseline point of agreement outlining where the site is at now, and where you intend to take it. Getting buy-in early smooths the way for later on. Ensure that whatever strategy you adopt, it adds real, measurable value by being aligned with, and serving, the business goals. It’s on this basis the client will judge you, and maintain or expand your services in future.

Further reading:

- 4 Principles Of Marketing Strategy In The Digital Age
- Product Positioning In Five Easy Steps [pdf]
- Technology Marketers Need To Document Their Marketing Strategy

2. Site Audit

Sites can be poorly organized, have various technical issues, and miss keyword opportunities. We need to quantify what is already there, and what’s not there.

  • Use a site crawler, such as Xenu Link Sleuth, Screaming Frog or other tools that will give you a list of URLs, title information, link information and other data.
  • Make a list of all broken links.
  • Make a list of all orphaned pages.
  • Make a list of all pages without titles.
  • Make a list of all pages with duplicate titles.
  • Make a list of pages with weak keyword alignment.
  • Crawl robots.txt and hand-check it. It’s amazing how easy it is to disrupt crawling with a robots.txt file.

Broken links are a low-quality signal. It's debatable if they are a low quality signal to Google, but certainly to users. If the client doesn't have one already, implement a system whereby broken links are checked on a regular basis. Orphaned pages are pages that have no links pointing to them. Those pages may be redundant, in which case they should be removed, or you need to point inbound links at them, so they can be crawled and have more chance of gaining rank. Page titles should be unique, aligned with keyword terms, and made attractive in order to gain a click. A link is more attractive if it speaks to a customer need. Carefully check robots.txt to ensure it’s not blocking areas of the site that need to be crawled.
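The audit lists above lend themselves to a simple script. A sketch, assuming you have already exported crawl data (the `pages` structure here is hypothetical; in practice you would build it from a Screaming Frog or Xenu export):

```python
from collections import defaultdict

def audit_pages(pages):
    """Flag common on-site issues from crawl data.
    `pages` maps each URL to {"title": str or None, "links": [internal URLs]}."""
    by_title = defaultdict(list)
    linked_to = set()
    for url, page in pages.items():
        if page["title"]:
            by_title[page["title"]].append(url)
        linked_to.update(page["links"])
    return {
        "missing_titles": [u for u, p in pages.items() if not p["title"]],
        "duplicate_titles": {t: us for t, us in by_title.items() if len(us) > 1},
        # orphans: known pages that no other page links to
        "orphans": [u for u in pages if u not in linked_to],
    }

pages = {
    "/":    {"title": "Home",    "links": ["/a", "/b"]},
    "/a":   {"title": "Widgets", "links": []},
    "/b":   {"title": "Widgets", "links": []},
    "/old": {"title": None,      "links": []},
}
report = audit_pages(pages)
print(report["duplicate_titles"])  # {'Widgets': ['/a', '/b']}
```

Even a rough report like this gives you concrete, prioritized items to discuss with the client.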

As part of the initial site audit, it might make sense to include the site in Google Webmaster Tools to see if it has any existing issues there and to look up its historical performance on competitive research tools to see if the site has seen sharp traffic declines. If they've had sharp ranking and traffic declines, pull up that time period in their web analytics to isolate the date at which it happened, then look up what penalties might be associated with that date.

Further Reading:

- Broken Links, Pages, Images Hurt SEO
- Three Easy Ways To Fix Broken Links And Stop Unnecessary Visitor Loss
- 55 Ways To Use Screaming Frog
- Robots.txt Tutorial

3. Competitive Analysis

Some people roll this into a site audit, but I’ll split it out as we’re not looking at technical issues on competitor sites, we’re looking at how they are positioned, and how they’re doing it. In common with a site audit, there’s some technical reverse engineering involved.

There are various tools that can help you do this. I use SpyFu. One reporting aspect that is especially useful is estimating the value of the SEO positions vs the Adwords positions. A client can then translate the ranks into dollar terms, and justify this back against your fee.

When you run these competitive reports, you can see what content of theirs is working well, and what content is gaining ground. Make a list of all competitor content that is doing well. Examine where their links are coming from, and make a list. Examine where they’re mentioned in the media, and make a list. You can then use a fast-follow strategy to emulate their success, then expand upon it.

Sometimes, “competitors”, meaning ranking competitors, can actually be potential partners. They may not be in the same industry as your client, just happen to rank in a cross-over area. They may be good for a link, become a supplier, welcome advertising on their site, or be willing to place your content on their site. Make a note of the sites that are ranking well within your niche, but aren’t direct competitors.

Using tools that estimate the value of ranks by comparing Adwords keywords prices, you can estimate the value of your competitors positions. If your client appears lower than the competition, you can demonstrate the estimated dollar value of putting time and effort into increasing rank. You can also evaluate their rate of improvement over time vs your client, and use this as a competitive benchmark. If your client is not putting in the same effort as your competitor, they’ll be left behind. If their competitors are spending on ongoing-SEO and seeing tangible results, there is some validation for your client to do likewise.

Further reading:

- Competitor Analysis [pdf]
- Illustrated SEO Competitive Workflow
- Competitive Analysis: How To Become A SEO Hero In 4 Steps

4. Site Architecture

A well organised site is both useful from a usability standpoint and an SEO standpoint. If it’s clear to a user where they need to go next, then this will flow through into better engagement scores. If your client has a usability consultant on staff, this person is a likely ally.

It’s a good idea to organise a site around themes. Anecdotal evidence suggests that Google likes pages grouped around similar topics, rather than disparate topics (see from 1.25 onwards).

  • Create spreadsheet based on a crawl after any errors have been tidied up
  • Identify best-selling products and services. These deserve the most exposure and should be placed high up the site hierarchy. Items and categories that do not sell well, and are less strategically important, should be lower in the hierarchy.
  • Pages that are already getting a lot of traffic, as indicated by your analytics, might deserve more exposure by moving them up the hierarchy.
  • Seasonal products might deserve more exposure just before that shopping season, and less exposure when the offer is less relevant.
  • Group pages into similar topics, where possible. For example, acme.com/blue-widgets/ , acme.com/green-widgets/.
  • Determine if internal anchor text is aligned with keyword titles and page content by looking at a backlink analysis

A spreadsheet of all pages helps you group pages thematically, preferably into directories with similar content. Your strategy document will guide you as to which pages you need to work on, and which pages you need to relegate. Some people spend a lot of time sculpting internal PageRank, i.e. flowing PageRank to some pages while using nofollow on other links so they pass no link equity. Google may have deprecated that approach, but you can still link to important products or categories sitewide to flow them more link equity, while putting less important pages lower in the site's architecture. Favour your money pages, and relegate your less important pages.
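The thematic grouping step can be sketched in a few lines of Python; the first path segment is a rough proxy for a theme or directory (my own simplification):

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_theme(urls):
    """Group URLs by their first path segment, e.g.
    /blue-widgets/... vs /green-widgets/..."""
    groups = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        segment = path.split("/")[0] if path else "(root)"
        groups[segment].append(url)
    return dict(groups)

urls = [
    "http://acme.com/blue-widgets/model-a",
    "http://acme.com/blue-widgets/model-b",
    "http://acme.com/green-widgets/model-x",
    "http://acme.com/",
]
themes = group_by_theme(urls)
print(sorted(themes))  # ['(root)', 'blue-widgets', 'green-widgets']
```

Running this over a full crawl quickly shows whether related pages actually live together in directories or are scattered across the site.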

Think mobile. If your content doesn't work on mobile, then getting to the top of search results won't do you much good.

Further Reading:

- Site Architecture & Search Engine Success Factors
- Optimizing Your Website's Architecture For SEO (Slide Presentation)
- The SEO Guide To Information Architecture

5. Enable Crawling & Redirects

Ensure your site is deep crawled. To check if all your URLs are included in Google’s index, sign up with Webmaster Tools and/or other index reporting tools.

  • Include a site map
  • Check the existing robots.txt. Keep robots out of non-essential areas, such as script repositories and other admin-related directories.
  • If you need to move pages, or you have links to pages that no longer exist, use page redirects to tidy them up
  • Make a list of 404 errors. Make sure the 404 page has useful navigation into the site so visitors don’t click back.
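The robots.txt check in the list above can be automated with Python's standard library; here the rules are parsed from a string for illustration, though normally you would fetch the live /robots.txt:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse("""
User-agent: *
Disallow: /admin/
Disallow: /scripts/
""".splitlines())

# Verify that product pages are crawlable and admin areas are not
print(rp.can_fetch("Googlebot", "http://acme.com/products/widget"))  # True
print(rp.can_fetch("Googlebot", "http://acme.com/admin/login"))      # False
```

Running a list of your money pages through `can_fetch` is a fast way to catch an accidental Disallow before it costs you rankings.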

The accepted method to redirect a page is to use a 301. The 301 indicates a page has permanently moved location. A redirect is also useful if you change domains, or if you have links pointing to different versions of the site. For example, Google sees http://www.acme.com and http://acme.com as different sites. Pick one and redirect the other to it.
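Picking a canonical host can be expressed as a small redirect-mapping function. A sketch (the host names are placeholders):

```python
from urllib.parse import urlparse, urlunparse

CANONICAL_HOST = "www.acme.com"  # hypothetical: pick one host and stick to it

def canonical_url(url):
    """Return the 301 target for a URL on the non-canonical host,
    or None if the URL is already canonical."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host in ("acme.com", "www.acme.com") and host != CANONICAL_HOST:
        return urlunparse(parts._replace(netloc=CANONICAL_HOST))
    return None

print(canonical_url("http://acme.com/page"))      # http://www.acme.com/page
print(canonical_url("http://www.acme.com/page"))  # None
```

In production the actual redirect would live in the web server config; this kind of helper is mainly useful for auditing whether existing URLs and inbound links resolve to the chosen host.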

Here’s a video explaining how:

If you don’t redirect pages, then you won’t be making full use of any link juice allocated to those pages.

Further Reading:

- What Are Google Site Maps?
- The Ultimate Guide To 301 Redirects
- Crawling And Indexing Metrics

6. Backlink Analysis

Backlinks remain a major ranking factor. Generally, the more high quality links you have pointing to your site, the better you’ll do in the results. Of late, links can also harm you. However, if your overall link profile is strong, then a subset of bad links is unlikely to cause you problems. A good rule of thumb is the Matt Cutts test: would you be happy to show the majority of your links to Matt Cutts? :) If not, you're likely taking a high-risk strategy when it comes to penalties. These can be manageable when you own the site, but they can be difficult to deal with on client sites, especially if the client was not aware of the risks involved in aggressive SEO.

  • Establish a list of existing backlinks. Consider trying to remove any that look low quality.
  • Ensure all links resolve to appropriate pages
  • Draw up a list of sites from which your main competitors have gained links
  • Draw up a list of sites where you’d like to get links from
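The competitor-link lists above reduce to a simple set difference: sites that link to a competitor but not to you are your outreach targets. A sketch with made-up domains:

```python
# Hypothetical link-source lists, e.g. exported from a backlink tool
our_links = {"blog.example.com", "chamber-of-commerce.org"}
competitor_links = {"blog.example.com", "industry-news.com", "trade-directory.com"}

# Sites linking to the competitor but not to us: the outreach "link gap"
link_gap = sorted(competitor_links - our_links)
print(link_gap)  # ['industry-news.com', 'trade-directory.com']
```

Intersecting the gaps across several competitors surfaces the sites that link to everyone in the niche except your client, usually the easiest wins.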

Getting links involves either direct placement or being linkworthy. On some sites, like industry directories, you can pay to appear. In other cases, it’s making your site into an attractive linking target.

Getting links to purely commercial sites can be a challenge. Consider sponsoring charities aligned with your line of business. Get links from local chambers of commerce. Connect with education establishments who are doing relevant research and consider sponsoring or become involved in some way.

Look at the sites that point to your competitors. How were these links obtained? Follow the same path. If they successfully used white papers, then copy that approach. If they successfully used news, do that, too. Do whatever seems to work for others. Evaluate the result. Do more/less of it, depending on the results.

You also need links from sites that your competitors don’t have. Make a list of desired links. Figure out a strategy to get them. It may involve supplying them with content. It might involve participating in their discussions. It may involve giving them industry news. It might involve interviewing them or profiling them in some way, so they link to you. Ask “what do they need?” Then give it to them.

Of course, linking is an ongoing strategy. As a site grows, many links will come naturally, and that in itself is a link acquisition strategy: grow in importance and consumer interest relative to the competition. This involves your content strategy. Do you have content that your industry likes to link to? If not, create it. If your site is not something that your industry links to, like a brochure site, you may look at spinning off a second site that is information focused, and less commercially focused. You sometimes see blogs on separate domains where employees talk about general industry topics, like Signal vs. Noise, Basecamp’s blog. These are much more likely to receive links than sites that are purely commercial in nature.

Before chasing links, you should be aware of what type of site typically receives links, and make sure you’re it.

Further Reading:

- Interview Of Debra Mastaler, the Link Guru
- Scalable Link Building Techniques
- Creative Link Building Ideas

7. Content Assessment

Once you have a list of keywords, an idea of where competitors rank, and what the most valuable terms are from a business point of view, you can set about examining and building out content.

Do you have content to cover your keyword terms? If not, add it to the list of content that needs to be created. If you have content that matches terms, see if it compares well with competing content on the same topic. Can the pages be expanded or made more detailed? Can more/better links be added internally? Will the content benefit from amalgamating different content types, e.g. videos, audio, images et al?

You’ll need to create content for any keyword areas you’re missing. Rather than copy what is already available in the niche, look at the best ranking/most valuable content for that term and ask how it could be made better. Are there new industry analyses or reports that you can incorporate and/or expand on? People love the new. They like learning things they don’t already know. Me-too content can work, but it’s not making the most of the opportunity. Aim to produce considerably more valuable content than already exists, as you’ll have more chance of getting links and more chance of higher levels of engagement when people flip between sites. If visitors can get the same information elsewhere, they probably will.

Consider keyword co-occurrence. What terms are readily associated with the keywords you’re chasing? Various tools provide this analysis, but you can do it yourself using the AdWords research tool. See what keywords it associates with your keywords. The Google co-occurrence algorithm is likely the same for both AdWords and organic search.
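To make the idea concrete, co-occurrence counting can be sketched in a few lines of Python. This is a toy illustration of the general technique, not Google's actual algorithm; the sample documents and the bag-of-words treatment are assumptions for the example.

```python
from collections import Counter
from itertools import combinations

def co_occurrence(documents):
    """Count how often each pair of words appears in the same document."""
    pairs = Counter()
    for doc in documents:
        words = sorted(set(doc.lower().split()))  # deduplicated bag of words
        for a, b in combinations(words, 2):
            pairs[(a, b)] += 1
    return pairs

docs = [
    "red widgets for sale",
    "buy red widgets online",
    "blue widgets for sale",
]
counts = co_occurrence(docs)
print(counts[("red", "widgets")])  # "red" and "widgets" co-occur in 2 documents
```

In practice you'd run this over search results or keyword tool exports rather than three toy strings, but the principle is the same: terms that frequently appear together are topically associated.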

Also, think about how people will engage with your page. Is it obvious what the page is about? Is it obvious what the user must do next? Dense text and distracting advertising can reduce engagement, so make sure the usability is up to scratch. Text should be a reasonable size so the average person isn’t squinting. It should be broken up with headings and paragraphs. People tend to scan when reading online, searching for immediate confirmation they’ve found the right information. This was written a long time ago, but it’s interesting how relevant it remains.

Further Reading:

- Content Marketing Vs SEO
- Content Analysis Using Google Analytics
- Content Based SEO Strategy Will Eventually Fail

8. Link Out

Sites that don’t link out appear unnatural. Matt Cutts noted:

Of course, folks never know when we're going to adjust our scoring. It's pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.

  • Make a list of all outbound links
  • Determine if these links are complementary i.e. similar topic/theme, or related to the business in some way
  • Make a list of pages with no links out
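The audit steps above can be partly automated with the standard library. A minimal sketch, assuming you already have each page's HTML on hand; the `pages` dict and the `example.com` domain are invented for the example:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def outbound_links(pages, own_domain):
    """Map each page to its external links; pages is {url: html}."""
    report = {}
    for url, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        # Relative links have an empty netloc, so they count as internal.
        report[url] = [h for h in parser.links
                       if urlparse(h).netloc not in ("", own_domain)]
    return report

pages = {
    "/about": '<a href="https://example.org/guide">guide</a>',
    "/contact": '<a href="/home">home</a>',
}
report = outbound_links(pages, "example.com")
print([url for url, ext in report.items() if not ext])  # pages with no links out
```

From the report you can eyeball whether each external link is complementary to the business, and which pages hoard their link equity entirely.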

Links out are both a quality signal and good PR practice. Webmasters look at their inbound links, and will likely follow them back to see what is being said about them. That’s a great way to foster relationships, especially if your client’s site is relatively new. If you put other companies and people in a good light, you can expect many to reciprocate in kind.

Links, the good kind, are about human relationships.

It’s also good for your users. Your users are going to leave your site, one way or another, so you can pick up some kudos if you help them on their way by pointing them to some good authorities. If you’re wary about linking to direct competitors, then look for information resources, such as industry blogs or news sites, or anyone else you want to build a relationship with. Link to suppliers and related companies in close, but non-competing niches. Link to authoritative sites. Be very wary about pointing to low value sites, or sites that are part of link schemes. Low value sites are obvious. Sites that are part of link schemes are harder to spot, but typically feature link swapping schemes or obvious paid links unlikely to be read by visitors. Avoid link trading schemes. It’s too easy to be seen as a part of a link network, and it’s no longer 2002.

Further Resources:

- Five Reasons You Should Link Out
- The Domino Effects Of Links And Relationships
- Link Building 101: Utilizing Past Relationships

9. Ongoing

It’s not set and forget.

Clients can’t do a one-off optimisation campaign and expect it to keep working forever. It may be self-serving for SEOs to say it, but it’s also the truth. SEO is ongoing because search keeps changing and competitors and markets move. Few companies would dream of only having one marketing campaign. The challenge for the SEO, like any marketer, is to prove the ongoing spend produces a return in value.

  • Competition monitoring i.e. scan for changes in competitors rank, new competitors, and change of tactics. Determine what is working, and emulate it.
  • Sector monitoring - monitor Google trends, keywords trends, discussion groups, and news releases. This will give you ideas for new campaign angles.
  • Reporting - the client needs to be able to see the work you’ve done is paying off.
  • Availability - clients will change things on their site, or bring in other marketers, so will want your advice going forward

Further Reading

Whole books can be written about SEO for clients. And they have. We've skimmed across the surface but, thankfully, there is a wealth of great information out there on the specifics of how to tackle each of these topic areas.

Perhaps you can weigh in? :) What would your advice be to those new to optimizing client sites? What do you wish someone had told you when you started?

Building a Legal Moat for Your SEO Castle

About a month ago, a year-old case of an SEO firm being sued by its client resurfaced via a tweet from Matt Cutts.

I'd like to add something to this conversation that will be helpful for you as a service provider seeking to avoid that really, really scary issue.

Some quick background information, if I may? No specifics are allowed, but I've been a party, on both sides, to actual litigation pertaining to SEO contracts (not services rendered, just contractual issues with a third party).

I've been the plaintiff and the defendant in cases involving contractual disputes and legal obligations so I, much to my dismay, speak from experience.

Suffice it to say I'm not a lawyer; don't act on any of this advice without talking it over with your counsel so they can tailor it to your specific needs and state law.

Basic Protections

There are essentially 3 ways to legally protect yourself and/or your company objectively. I say objectively because anyone can sue you for anything and "service" is a subjective term as are "results" unless they are specifically spelled out in your contract.

Objectively speaking, the law gives you 3 broad arenas for protective measures:

  1. Contracts
  2. Entity Selection
  3. Insurance

Contracts

Get a real lawyer, do not use internet "templates" and do not modify any piece of the contract yourself. Make sure your attorney completely understands what you do. A good lawyer will listen to you. Heck, mine now knows who Matt Cutts is and where the Webmaster Guidelines are located and what "anchor text" is :)

Your contracts need to cover the following scenarios:

  • Client services
  • Vendor relationships
  • Employee/Contractor relationships

For standard client agreements you'll want to cover some basic areas:

  • Names of the legal entities partaking in the agreement
  • Duties and nature of services
  • Term and termination (who can cancel and when, what are the ramifications, etc)
  • Fees
  • No exclusive duty (a clause that says you can work with other clients and such)
  • Disclaimer, Limitation of Liability
  • Confidentiality
  • Notices (what is considered legal notice? a letter? certified mail? email?)
  • Governing law
  • Attorney's fees (if you need to enforce the contract make sure you can also collect fees)
  • Relationship of Parties (spell out the relationship: independent entities? partners? joint ventures? Spell out exactly what you are and what you are not)
  • Scope of Work
  • Signatures (you should sign as you are in your entity; member, president, CEO, etc)

Some important notes are needed to discuss a couple of core areas of the contract:

For Governing law go with your home state if possible. Ideally, I try to get an arbitration clause in there rather than state law so in case there is a dispute it goes to a much less expensive form of resolution.

However, you can make an argument that if your contract is signed with your home state as governing law and your language is strong you are better off doing that instead of arbitration where one person makes a decision and no appeal is available.

For Limit of Liability go broad, real broad. You want to spell out that organic search (or just about any service) is not guaranteed to produce results, no promises were made, and Google does not fully publish the algorithm, thus you can't be held liable for XYZ that happens.

Also, if your client is asking you to do things against webmaster guidelines, and you decide to do them, you NEED to get that documented. Have them email it to you, record the call, something. Here is the liability clause in my contract:

Client agrees and acknowledges that the internet is an organic, constantly shifting entity, and that Client’s ranking and/or performance in a search engine may change for many reasons and be affected by many factors, including but not limited to any actual or alleged non-compliance by Provider to guidelines set forth by Google related to search engine optimization.

Client agrees that no representation, express or implied, and no warranty or guaranty is provided by Provider with respect to the services to be provided by Provider under this Agreement. Provider’s services may be in the form of rendering consultation which Client may or may not choose to act on. To the maximum extent permitted by law, Client agrees to limit the liability of Provider and its officers, owners, agents, and employees to the sum of Provider’s fees actually received from Client.

This limitation will apply regardless of the cause of action or legal theory pled or asserted. In no event shall Provider be liable for any special, incidental, indirect, or consequential damages arising from or related to this Agreement or the Project. Client agrees, as a material inducement for Provider to enter into this Agreement that the success and/or profitability of Client’s business depends on a variety of factors and conditions beyond the control of Provider and the scope of this Agreement. Provider makes no representations or warranties of any kind regarding the success and/or profitability of Client’s business, or lack thereof, and Provider will not be liable in any manner respecting the same.

Client agrees to indemnify and hold harmless Provider and its officers, owners, agents, and employees from and against any damages, claims, awards, and reasonable legal fees and costs arising from or related to any services provided by Provider, excepting only those directly arising from Provider’s gross negligence or willful misconduct.

For vendor and independent contractor agreements you'll want most of the aforementioned clauses (especially the relationship of parties) in addition to a few more things (for employee stuff, get with your lawyer because states are quite different and a lot of us use remote workers in different states)

  1. Non-Competition and non-interference
  2. Non-Solicitation and non-contact

These clauses essentially prohibit the pursuit of your clientele and employees by a vendor/contractor for a specified period of time.

Legal Entity

Don't be a sole proprietor, ever. If you're a smaller shop you might consider being a single member LLC (just you), an LLC (you and employees), or an S Corp. If you're a larger operation you might want to incorporate and go Inc.

The benefits of the LLC set up are:

  • Your personal assets are generally untouchable (providing you are not co-mingling funds in a bank account)
  • Very easy to administer compared to other options
  • Your liability is limited to company assets (pro tip: clear out your business bank account each month minus some operating margin, move it to personal savings)

Benefits of an S Corp are:

  • Same protections as LLC
  • You save a fair amount on self employment taxes (more below)

With the S Corp there's more paperwork and filings, but if you are earning a fair bit of money it may be worth it to you. Here's a good article breaking this all down, and an excerpt:

"If you operate your business as a sole proprietorship or partnership/LLC, you will pay roughly 15.3% in self-employment taxes on your $100,000 of profits. The calculations get a little tricky if you want to be really super-precise but you can think about self-employment tax as roughly a 15% tax. So 15% on $100,000 equals $15,000. Roughly."

"With an S corporation, you split your business profits into two categories: "shareholder wages" and "distributive share." Only the "shareholder wages" get subjected to the 15.3% tax. The leftover "distributive share" is not subject to 15.3% tax."

Be careful here (and I'm not a CPA, so don't do anything without consulting with your accountant) not to be absurd with your wages. So, if your net income is 1 million, don't take 25k in wages and 975k as a distribution.
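The arithmetic in the quoted excerpt can be sketched as follows. This is a rough illustration only, not tax advice: it ignores wage bases, the employer/employee split of payroll tax, and state rules, and the $60k "reasonable salary" figure is purely hypothetical.

```python
SE_TAX = 0.153  # the roughly 15.3% self-employment rate from the excerpt

profit = 100_000

# Sole proprietorship / partnership LLC: the whole profit is subject to it.
sole_prop_tax = profit * SE_TAX        # 15,300

# S Corp: only the wage portion is subject to the 15.3% tax;
# the leftover "distributive share" is not.
wages = 60_000                         # hypothetical reasonable salary
distribution = profit - wages
s_corp_tax = wages * SE_TAX            # 9,180

savings = sole_prop_tax - s_corp_tax
print(round(savings))                  # rough annual saving from the S Corp split
```

The lower you push the wage figure, the bigger the apparent saving, which is exactly why the IRS scrutinises unreasonably low salaries.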

Some final thoughts on entities:

  • Most of you will probably fall into the LLC/S Corp category; get with your attorney and accountant
  • Keep everything separate because if you don't (credit cards, bank accounts, etc) your personal assets might be at risk due to the "piercing of the corporate veil"

Insurance

As you would imagine, insurance policies are few and far between for our industry. You can get general liability for your office, workers comp for your employees, disability for yourself, and so on. However, what you might want to look into is a professional liability policy.

You'll probably end up looking at a miscellaneous one like the one here (marketing consultant?) offered by Travelers. You'll probably have to educate your agent on your business practices to ensure proper coverage.

This might be worth it just due to the legal protection clause, meaning they will pay for a lawyer to defend you. Having the proper entity classification might protect your assets, but paying lawyers to defend even frivolous lawsuits is expensive.

Record Keeping

This is a bit out of the "contract" topic but good record keeping is essential. If you use a project management and/or a CRM system you really should make sure you can export when you need it.

Many online CRM and project management applications have limited export capabilities, especially when it comes to exporting comments and notes on things like tasks and records. Most have an API that a developer can use to custom-code an export of your data. I'd look into this as well.

Final Thoughts

Get with your attorney and CPA to get your specific situations up to legal snuff if you haven't already. Don't act on my advice as I'm not a lawyer nor a CPA. Contracts and agreements are not fun to negotiate and can be even harder when you work with people you generally trust.

However, when it comes to business dealings and contracts I would save my trust for my lawyer :)

The Positive Negative SEO Strategy

There’s a case study on Moz on how to get your site back following a link penalty. An SEO working on a client's site describes what happened when their client got hit with a link penalty. Even though the link penalty didn't appear to be their fault, it still took months to get their rankings back.

Some sites aren't that lucky. Some sites don’t get their rankings back at all.

The penalty was due to a false positive. A dubious site linked out to a number of credible sites in order to help disguise its true link target. The client's site was one of the credible sites, mistaken by Google for a bad actor. It just goes to show how easily credible sites can get hit by negative SEO, and variations thereof.

There’s a tactic in there, of course.

Take Out Your Competitors

Tired of trying to rank better? Need a quicker way? Have we got a deal for you!

Simply build a dubious link site, point some rogue links at sites positioned above yours and wait for Google’s algorithm to do the rest. If you want to get a bit tricky, link out to other legitimate sites, too. Like Wikipedia. Google, even. This will likely confuse the algorithm for a sufficient length of time, giving your tactic time to work.

Those competitors who get hit, and who are smart enough to work out what’s going on, may report your link site, but, hey, there are plenty more link sites where that came from. Roll another one out, and repeat. So long as your link site can’t be connected with you - different PC, different IP address, etc - then what have you got to lose? Nothing much. What have your competitors got to lose? Rank, a lot of time, effort, and the very real risk they won’t get back into Google’s good books. And that’s assuming they work out why they lost rankings.

I’m not advocating this tactic, of course. But we all know it’s out there. It is being used. And the real-world example above shows how easy it is to do. One day, it might be used against you, or your clients.

Grossly unfair, but what can you do about it?

Defensive Traffic Strategy

Pleading to Google is not much of a strategy. Apart from anything else, it’s an acknowledgement that the power is not in your hands, but in the hands of an unregulated arbiter who likely views you as a bit of an annoyance. It’s no wonder SEO has become so neurotic.

It used to be the case that competitors could not take you out by pointing unwanted links at you. No longer. So even more control has been taken away from the webmaster.

The way to manage this risk is the same way risk is managed in finance. Risk can be reduced using diversification. You could invest all your money in one company, or you could split it between multiple companies, banks, bonds and other investment classes. If you’re invested in one company, and they go belly up, you lose everything. If you invest in multiple companies and investment classes, then you’re not as affected if one company gets taken out. In other words, don’t put all your eggs in one basket.

It’s the same with web traffic.

1. Multiple Traffic Streams

If you only run one site, try to ensure your traffic is balanced. Some traffic from organic search, some from PPC, some from other sites, some from advertisements, some from offline advertising, some from email lists, some from social media, and so on. If you get taken out in organic search, it won’t kill you. Alternative traffic streams buy you time to get your rankings back.

2. Multiple Pages And Sites

A “web site” is a construct. Is it a construct applicable to a web that mostly orients around individual pages? If you think in terms of pages, as opposed to a site, then it opens up more opportunities for diversification.

Pages can, of course, be located anywhere, not just on your site. These may take the form of well written, evergreen articles published on other popular sites. Take a look at the top sites in closely related niches and see if there are any opportunities to publish your content on them. Not only does this make your link graph look good, so long as it’s not overt, you’ll also achieve more diversity.

Consider Barnacle SEO.

Will creatively defines the concept of barnacle SEO as follows:

“Attaching oneself to a large fixed object and waiting for the customers to float by in the current. Directly applied to local search, this means optimizing your profiles or business pages on a well-trusted, high-ranking directory and working to promote those profiles instead of — or in tandem with — your own website.”

You could also build multiple sites. Why have just one site when you can have five? Sure, there’s more overhead, and it won’t be appropriate in all cases, but again, the multiple site strategy is making a comeback due to Google escalating the risk of having only one site. This strategy also helps get your eggs into multiple baskets.

3. Prepare For the Worst

If you've got most of your traffic coming from organic search, then you’re taking a high risk approach. You should manage that risk down with diversification strategies first. Part of the strategy for dealing with negative SEO is not to make yourself so vulnerable to it in the first place.

If you do get hit, have a plan ready to go to limit the time you’re out of the game. The cynical might suggest you have a name big enough to make Google look bad if they don’t show you.

Lyrics site Rap Genius says that it is no longer penalized within Google after taking action to correct “unnatural links” that it helped create. The site was hit with a penalty for 10 days, which meant people seeking it by name couldn’t find it.

For everyone else, here’s a pretty thorough guide about how to get back in.

Have your “plead with Google” gambit ready to go at a moment's notice. The lead time to get back into Google can be long, so the sooner you get onto it, the better. Of course, this is really the last course of action. It’s preferable not to make yourself that vulnerable in the first place.

By diversifying.

Handling Objections From SEO Clients

If Google's current war on SEOs isn't bad enough when you own the site you work on, it is doubly so for the SEO working for a client. When the SEO doesn’t have sufficient control over the strategy and technology, it can be difficult to get and maintain rankings.

In this post, we'll take a look at the challenges and common objections the SEO faces when working on a client site, particularly a client who is engaging an SEO for the first time. The SEO will need to fit in with developers, designers and managers who may not understand the role of SEOs. Here are common objections you can expect, and some ideas on how to counter them.

1. Forget About SEO

The objection is that SEO gets in the way. It’s too hard.

It’s true. SEO is complicated. It can often compromise design and site architecture. To managers and other web technicians, SEO can look like a dark art. Or possibly a con. There are no fixed rules as there are in, say, coding, and results are unpredictable.

So why spend time and money on SEO?

One appropriate response is “because your competitors are”.

Building a website is the equivalent of taking the starting line in a race. Some site owners think that’s all they need to do. However, the real race starts after the site is built. Every other competitor has a web site, and they’re already off and running in terms of site awareness. Without SEO, visitors may find a site, but if the site owner is not using the SEO channel, and their competitors are, then their competitors have an advantage in terms of reach.

2. Can’t SEOs Do Their Thing After The Site Is Built?

SEOs can do their thing after the site is built, but it’s more difficult and, as a result, likely to be more expensive. Baking SEO into the mix when the site is conceived and built is an easier route.

Just as copywriters require space to display their copy, SEOs require room to manoeuvre. They’ll likely contribute to information architecture, copy, copy markup and internal linking structures. So start talking about SEO as early as possible, and particularly during information architecture.

There are three key areas where SEO needs to integrate with design. First, text must be machine-readable. Search engines "think" mostly in terms of words, so topics and copy need to relate to search terms visitors may use.

Second, linking architecture and information hierarchies. If pages are buried deep in the site but deemed important in terms of search, they will likely be elevated in the hierarchy to a position closer to the home page.

Third, crawlability. A search engine sends out a spider, which grabs the source code of your website and dumps it back in the search engine's database. The spider skips from page to page, following links. If a page doesn't have a crawlable link pointing to it, it will be invisible to search engines. There are various means of making a site easy to crawl, but one straightforward way is to use a site map, linked to from each page on the site. The SEO may also want to ensure the site navigation is crawlable.
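The crawlability point can be made concrete by modelling the site as a link graph and checking which pages are reachable from the home page; anything unreachable is invisible to a spider. A minimal Python sketch, with an invented site structure for the example:

```python
from collections import deque

def reachable(link_graph, start):
    """Breadth-first search over {page: [pages it links to]}."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for link in link_graph.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

site = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets"],
    "/about": [],
    "/old-landing-page": [],  # nothing links here: an orphan page
}
orphans = set(site) - reachable(site, "/")
print(orphans)  # {'/old-landing-page'}
```

A site map linked from every page collapses this problem, since every page then sits one hop from anywhere the spider lands.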

3. We Don’t Want The SEO To Interfere With Code

SEOs do need to tweak code; however, the mark-up changes are largely inconsequential to the rest of the build.

SEOs need to specify title tags and some meta tags. These tags need to be unique for each page on the site, as each page is a possible entry page. A search visitor will not necessarily arrive at the home page first.

The title tag appears in search results as a clickable link, so serves a valuable marketing function. When search visitors consider which link on a search results page to click, the title tag and snippet will influence their decision. The title tag should, therefore, closely match the content of each page.
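A quick way to audit the unique-title-per-page rule is to group pages by their title text. A minimal sketch, assuming the HTML is already fetched; the regex is a rough heuristic rather than a full HTML parser, and the sample pages are hypothetical:

```python
import re
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group pages by <title> text; pages is {url: html}."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        match = re.search(r"<title>(.*?)</title>", html,
                          re.IGNORECASE | re.DOTALL)
        title = match.group(1).strip() if match else "(missing)"
        by_title[title].append(url)
    # Keep only titles shared by more than one page.
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = {
    "/red-widgets": "<title>Red Widgets | Acme</title>",
    "/blue-widgets": "<title>Acme</title>",
    "/contact": "<title>Acme</title>",
}
print(find_duplicate_titles(pages))  # {'Acme': ['/blue-widgets', '/contact']}
```

Any title appearing in the output is wasting its marketing function on the results page, since it can't closely match more than one page's content.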

The second aspect concerns URLs. Ideally, a URL should contain descriptive words, as opposed to numbers and random letters. For example, acme.com/widgets/red-widgets.htm is good, whilst acme.com/w/12345678&tnr.php is less so.

The more often the keyword appears, the more likely it is to be bolded on a search results page, and therefore the more likely to attract a click. It's also easier for the search engine to determine meaning if a URL is descriptive as opposed to cryptic.

4. I’ve Got An SEO PlugIn. That’s All I Need

SEO Plugins cover the on-site basics. But ranking well involves more than covering the basics.

In order to rank well, a page needs to have links from external sites. The higher quality those sites, the more chances your pages have of ranking well. The SEO will look to identify linking possibilities, and point these links to various internal pages on the site.

It can be difficult, near impossible, to get high quality links to brochure-style advertising pages. Links tend to be directed at pages that have unique value.

So, the type and quality of content has more to do with SEO than the way that content is marked up by a generic plugin. The content must attract links and generate engagement. The visitor needs to see a title on a search result, click through, not click back, and, preferably take some action on that page. That action may be a click deeper into the site, a bookmark, a tweet, or some other measurable form of response.

Content that lends itself to this type of interaction includes blog posts, news feeds, and content intended for social network engagement. In this way, SEO-friendly content can be functionally separated from other types of content. Not every page needs to be SEO’d, so SEO can be sectioned off, if necessary.

5. The SEO Is Just Another Technician

If your aim, or your client's aim, is to attract as much targeted traffic as possible, then SEO integration must be taken just as seriously as design, development, copy and other media. SEO is more than a technical exercise; it’s a strategic marketing exercise, much like public relations.

SEO considerations may influence your choice of CMS. It may influence your strategic approach in terms of what type of information you publish. It may change the way you engage visitors. Whilst SEO can be bolted-on afterwards, this is a costly and less-effective way of doing SEO, much like re-designing a site is costly and less effective than getting it right in the planning stage.

6. Why Have Our Rankings Disappeared?

The reality of any marketing endeavour is that it will have a shelf-life. Sometimes, that shelf life is short. Other times, it can run for years.

SEO is vulnerable to the changes made by search engines. These changes aren’t advertised in advance, nor are they easily pinned down even after they have occurred. This is why SEO is strategic, just as Public Relations is strategic. The Public Relations campaign you were using a few years ago may not be the same one you use now, and the same goes for SEO.

The core of SEO hasn’t changed much. If you produce content visitors find relevant, and that content is linked to, and people engage with that content, then it has a good chance of doing well in search engines. However, the search engines constantly tweak their settings, and when they do, a lot of previous work - especially if that work was at the margins of the algorithms - can come undone.

So, ranking should never be taken for granted. The value the SEO brings is that they stay on top of underlying changes in the way the search engines work and can adapt your strategy, and site, to the new changes.

Remember, whatever problems you may have with the search engines, the same goes for your competitors. They may have dropped rankings, too. Or they may do so soon. The SEO will try to figure out why the new top ranking sites are ranked well, then adapt your site and strategy so that it matches those criteria.

7. Why Don’t We Just Use PPC Instead?

PPC has many advantages. The biggest advantage is that you can get top positioning, and immediate traffic, almost instantly. The downside is, of course, you pay per click. Whilst this might be affordable today, keep in mind that the search engine has a business objective that demands they reward the top bidders who are most relevant. Their auction model forces prices higher and higher, and only those sites with deep pockets will remain in the game. If you don’t have deep pockets, or don't want to be beholden to the PPC channel, a long-term SEO strategy works well in tandem.

SEO and PPC complement one another, and lulls and challenges in one channel can be made up for by the other. Also, you can feed the keyword data from PPC to SEO to gain a deeper understanding of search visitor behaviour.

8. Does SEO Provide Value For Money?

This is the reason for undertaking any marketing strategy.

An SEO should be able to demonstrate value. One way is to measure the visits from search engines before the SEO strategy starts, and see if these increase significantly post implementation. The value of each search click changes depending on your business case, but can be approximated using the PPC bid prices. Keep in mind the visits from an SEO campaign may be maintained, and increased, over considerable time, thus driving down their cost relative to PPC and other channels.
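That approximation can be put into numbers. The keywords, visit counts and bid prices below are invented for illustration; the point is only that PPC cost-per-click gives a rough proxy value for each organic visit:

```python
def estimated_organic_value(monthly_visits, cpc_bids):
    """Value organic visits at the price you'd pay for the same clicks via PPC."""
    return sum(visits * cpc_bids.get(kw, 0.0)
               for kw, visits in monthly_visits.items())

visits_before = {"red widgets": 200, "blue widgets": 50}
visits_after = {"red widgets": 900, "blue widgets": 400}   # post-implementation
bids = {"red widgets": 1.50, "blue widgets": 2.00}         # hypothetical CPCs

gain = (estimated_organic_value(visits_after, bids)
        - estimated_organic_value(visits_before, bids))
print(round(gain, 2))  # approximate monthly value added by the SEO campaign
```

Because the organic visits persist after the work is done, the same gain recurs month after month, which is what drives the channel's cost per visit down relative to PPC.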

Should Venture Backed Startups Engage in Spammy SEO?

Here's a recent video of the founders of RapGenius talking at TechCrunch disrupt.

Oops, wrong video. Here's the right one. Same difference.

Recently a thread on Hacker News highlighted a blog post which pointed out how RapGenius was engaging in reciprocal promotional arrangements: they would promote blogs on their Facebook or Twitter accounts if those bloggers would post a laundry list of keyword-rich deeplinks to RapGenius.

Matt Cutts quickly chimed in on Hacker News "we're investigating this now."

A friend of mine and I were chatting yesterday about what would happen. My prediction was that absolutely nothing would happen to RapGenius: they would issue a faux apology, put no effort into cleaning up the existing links, and the apology alone would be sufficient evidence of good faith that the issue would die there.

Today RapGenius published a mea culpa where, ultimately, they defended their own spam by complaining about how spammy other lyrics websites are. The self-serving jackasses went so far as to include this in their post: "With limited tools (Open Site Explorer), we found some suspicious backlinks to some of our competitors"

It's one thing to complain in private about dealing in a frustrating area, but it's another to publicly throw your direct competitors under the bus with a table of link types and paint them as black-hat spammers.

Google can't afford to penalize Rap Genius, because if they do, Google Ventures will lose deal flow on the start-ups Google co-invests in.

In the past, some of Google's other investments were in companies that were pretty overtly spamming. RetailMeNot held multiple giveaways where, if you embedded a spammy sidebar set of deeplinks to their various pages, they gave you a free t-shirt:

Google's behavior in such arrangements has usually been to hit the smaller players while looking the other way on the bigger site at the other end of the transaction.

That free-t-shirt-for-links post was from 2010 - the same year that Google invested in RetailMeNot. They ran those promotions multiple times & long enough that they ran out of t-shirts! The widgets didn't link to the homepage of RetailMeNot or to pages relevant to that particular blog; rather they used (in some cases dozens of) keyword-rich deep links in each widget - arbitraging search queries tied to various third-party brands. Now that RTM is a publicly traded billion-dollar company which Google already endorsed by investing in it, there's a zero percent chance of them getting penalized.

To recap, if you are VC-backed you can: spam away, wait until you are outed, when outed reply with a combined "we didn't know" and a "our competitors are spammers" deflective response.

For the sake of clarity, let's compare that string of events (spam, warning but no penalty, no effort needed to clean up, insincere mea culpa) to how websites are treated when not VC-backed. For smaller sites it is "shoot on sight" first and ask questions later, perhaps coupled with a friendly recommendation to start over.

Here's a post from today highlighting a quote from Google's John Mueller:

My personal & direct recommendation here would be to treat this site as a learning experience from a technical point of view, and then to find something that you're absolutely passionate & knowledgeable about and create a website for that instead.

Growth hack inbound content marketing, but just don't call it SEO.

What's worse is that with the new fearmongering disavow promotional stuff, not only are some folks being penalized for the efforts of others, but some are being penalized for links that were in place BEFORE Google even launched as a company.

Given that money allegedly shouldn't impact rankings, it's sad to note that as everything that is effective gets labeled as spam, capital and connections are the key SEO "innovations" in the current Google ecosystem.

Beware Of SEO Truthiness

When SEO started, many people routinely used black-box testing to try to figure out what pages the search engines rewarded.

Black box testing is terminology used in IT. It’s a style of testing that doesn’t assume knowledge of the internal workings of a machine or computer program. Rather, you can only test how the system responds to inputs.

So, for many years, SEO was about trying things out and watching how the search engine responded. If rankings went up, SEOs assumed correlation meant causation, so they did a lot more of whatever it was they thought was responsible for the boost. If the trick was repeatable, they could draw some firmer conclusions about causation, at least until the search engine introduced some new algorithmic code and sent everyone back to their black-box testing again.

Well, it sent some people back to testing. Some SEOs don’t do much, if any, testing of their own, and so rely on the strategies articulated by other people. As a result, the SEO echo chamber can be a pretty misleading place as “truthiness” - and a lot of false information - gets repeated far and wide, until it’s considered gospel. One example of truthiness is that paid placement will hurt you. Well, it may do, but not having it may hurt you more, because it all really… depends.

Another problem is that SEO testing can seldom be conclusive, because you can’t be sure of the state of the thing you’re testing. The thing you're testing may not be constant. For example, you throw up some more links and your rankings rise, but the rise could be due to other factors, such as a new engagement algorithm that Google implemented in the middle of your testing that you just didn’t know about.
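The naive before/after comparison at the heart of this kind of black-box test is trivial to write down, which is part of the problem: the arithmetic says nothing about confounds. A minimal sketch, with entirely hypothetical keywords and positions:

```python
# A naive before/after rank comparison -- the black-box test described in
# the text. It can only show correlation: if the search engine changed its
# algorithm mid-test, the delta tells you nothing about your intervention.
# All keywords and positions here are hypothetical.

def rank_delta(before, after):
    """Positive delta = improvement (moved up the SERP)."""
    return {kw: before[kw] - after[kw] for kw in before if kw in after}

ranks_before = {"blue widgets": 18, "cheap widgets": 25, "widget store": 9}
# ... intervention happens here (e.g. new links built) ...
ranks_after = {"blue widgets": 12, "cheap widgets": 26, "widget store": 9}

for kw, delta in rank_delta(ranks_before, ranks_after).items():
    print(f"{kw}: {delta:+d}")
```

Only repeating the intervention across many keywords and time windows, while the target stays still, lets you say anything firmer about causation; continuous algorithm updates take that stability away.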

It used to be a lot easier to conduct this testing. Updates were periodic. Up until that point, you could reasonably assume the algorithms were static, so cause and effect were more obvious than they are today. Danny Sullivan gave a good overview of search history at Moz earlier in the year:

That history shows why SEO testing is getting harder. There are a lot more variables to isolate than there used to be. The search engines have also been clever. A good way to thwart SEO black-box testing is to keep moving the target: continuously roll out code changes and don’t tell people you’re doing it. Or send people on a wild goose chase by arm-waving about a subtle code change made over here, when the real change has been made over there.

That’s the state of play in 2013.

However….(Ranting Time :)

Some SEO punditry is bordering on the ridiculous!

I’m not going to link to one particular article I’ve seen recently, as, ironically, that would mean rewarding them for spreading FUD. Also, calling out people isn't really the point. Suffice to say, the advice was about specifics, such as how many links you can “safely” get from one type of site, that sort of thing....

The problem comes when we can easily find evidence to the contrary. In this case, a quick look through the SERPs and you'll find evidence of top ranking sites that have more than X links from Site Type Y, so this suggests….what? Perhaps these sites are being “unsafe”, whatever that means. A lot of SEO punditry is well meaning, and often a rewording of Google's official recommendations, but can lead people up the garden path if evidence in the wild suggests otherwise.

If one term defined SEO in 2013, it is surely “link paranoia”.

What's Happening In The Wild

When it comes to what actually works, there are few hard and fast rules regarding links. Look at the backlink profiles for top ranked sites across various categories and you’ll see one thing that is constant....

Nothing is constant.

Some sites have links coming from obviously automated campaigns, and it seemingly doesn’t affect their rankings. Other sites have credible link patterns, and rank nowhere. What counts? What doesn’t? What other factors are in play? We can only really get a better picture by asking questions.

Google allegedly took out a few major link networks over the weekend. Anglo Rank came in for special mention from Matt Cutts.

So, why are Google making a point of taking out link networks if link networks don’t work? Well, it’s because link networks work. How do we know? Look at the back link profiles in any SERP area where there is a lot of money to be made, and the area isn’t overly corporate i.e. not dominated by major brands, and it won’t be long before you spot aggressive link networks, and few "legitimate" links, in the backlink profiles.

Sure, you wouldn't want aggressive link networks pointing at brand sites, as there are better approaches brand sites can take when it comes to digital marketing, but such evidence makes a mockery of the tips some people are freely handing out. Are such tips the result of conjecture, repeating Google's recommendations, or actual testing in the wild? Either the link networks work, or they don’t work but don’t affect rankings, or these sites shouldn't be ranking.

There’s a good reason some of those tips are free, I guess.

Risk Management

Really, it’s a question of risk.

Could these sites get hit eventually? Maybe. However, those using a “disposable domain” approach will do anything that works as far as linking goes, as their main risk is not being ranked. Being penalised is an occupational hazard, not game-over. These sites will continue so long as Google's algorithmic treatment rewards them with higher ranking.

If your domain is crucial to your brand, then you might choose to stay away from SEO entirely, depending on how you define “SEO”. A lot of digital marketing isn’t really SEO in the traditional sense i.e. optimizing hard against an algorithm in order to gain higher rankings, a lot of digital marketing is based on optimization for people, treating SEO as a side benefit. There’s nothing wrong with this, of course, and it’s a great approach for many sites, and something we advocate. Most sites end up somewhere along that continuum, but no matter where you are on that scale, there’s always a marketing risk to be managed, with perhaps "non-performance" being a risk that is often glossed over.

So, if there's a take-away, it's this: check out what actually happens in the wild, and then evaluate your risk before emulating it. When pundits suggest a rule, check to see if you can spot times it appears to work, and perhaps more interestingly, when it doesn't. It's in those areas of personal inquiry and testing where gems of SEO insight are found.

SEO has always been a mix of art and science. You can test, but only so far. The art part is dealing with the unknown past the testing point. Performing that art well is to know how to pick truthiness from reality.

And that takes experience.

But mainly a little fact checking :)

SEO Discussions That Need to Die

Sometimes the SEO industry feels like one huge Groundhog Day. No matter how many times you have discussions with people on the same old topics, these issues seem to pop back into blogs and social media streams with almost regular periodicity. And every time they do, only the authors are new; the arguments and the counterarguments are all the same.

Due to this sad situation, I have decided to make a short list of such issues and discussions. If it prevents even one of you from starting or engaging in such a debate, it was worth writing.

So here are SEO's most annoying discussion topics, in no particular order:

Blackhat vs. Whitehat

This topic has been chewed over and over again so many times, yet people still jump into it with both feet, with the righteous feeling that their argument, and no one else's, is going to change someone's mind. This discussion becomes particularly tiresome when people start claiming the moral high ground because they are using one over the other. Let's face it once and for all time: there are no generally moral (white) and generally immoral (black) SEO tactics.

This is where people usually pull out the argument about harming clients' sites, an argument which is usually moot. Firstly, there is a heated debate about what is even considered whitehat and what blackhat. The definitions of these two concepts are highly fluid and change over time. One of the main reasons for this fluidity is Google moving the goalposts all the time. What was once considered a purely whitehat technique, highly recommended by all SEOs (PR submissions, directories, guest posts, etc.), may as of tomorrow become “blackhat”, “immoral” and what not. Also, some people consider “blackhat” anything that dares not adhere to Google's Webmaster Guidelines, as if they were carved on stone tablets by some angry deity.

Just to illustrate how absurd this concept is, imagine some other company, eBay say, creates a list of rules, one of which is that anyone who wants to sell an item on their site is prohibited from also trying to sell it on Gumtree or Craigslist. How many of you would voluntarily reduce the number of people your product is effectively reaching because some other commercial entity is trying to prevent competition? If you are not making money off search, Google is, and vice versa.

It is not about the morals, it is not about criminal negligence of your clients. It is about taking risks and as long as you are being truthful with your clients and yourself and aware of all the risks involved in undertaking this or some other activity, no one has the right to pontificate about “morality” of a competing marketing strategy. If it is not for you, don't do it, but you can't both decide that the risk is too high for you while pseudo-criminalizing those who are willing to take that risk.

The same goes for “blackhatters” pointing and laughing at “whitehatters”. Some people do not enjoy rebuilding their business every 2 million comment spam links. That is OK. Maybe they will not climb the ranks as fast as your sites do, but maybe when they get there, they will stay there longer? These are two different and completely legitimate strategies. Actually, every ecosystem has representatives of these two strategies: the r strategy prefers quantity over quality, while the K strategy puts more investment into a smaller number of offspring.

You don't see elephants calling mice immoral, do you?

Rank Checking is Useless/Wrong/Misleading

This one has been going around for years and keeps rearing its ugly head every once in a while, particularly after Google forces another SaaS provider to give up part of its services for either checking rankings itself or buying ranking data from a third-party provider. Then we get all the holier-than-thou folks mounting their soap boxes and preaching fire and brimstone on SEOs who report rankings as the main or even only KPI. So firstly, again, just like with black vs. white hat: horses for courses. If you think your way of reporting to clients is the best, stick with it, preach it positively, as in “this is what I do and the clients like it”, but stop telling other people what to do!

More importantly, the vast majority of these arguments are based on a totally imaginary situation in which SEOs use rankings as their only or main KPI. In all of my 12 years in SEO, I have never seen any marketer worth their salt report an “increase in rankings for 1000s of keywords”. As far back as 2002, I remember people writing reports to clients with a separate chapter for keywords which were defined as optimization targets and reached top rankings, but produced no significant increase in traffic or conversions. Those keywords were then dropped from the marketing plan altogether.

It really isn't a big leap to understand that ranking isn't important if it doesn't result in increased conversions in the end. I am not going to argue here why I do think reporting and monitoring rankings is important. The point is that if you need to make your argument against a straw man, you should probably rethink whether you have a good argument at all.

PageRank is Dead/it Doesn't Matter

Another strawman argument. Show me a linkbuilder who today thinks that getting links based solely on toolbar PageRank is going to get them to rank and I will show you a guy who has probably not engaged in active SEO since 2002. And no small amount of irony can be found in the fact that the same people who decry the use of PageRank, the closest thing to an actual Google ranking factor they can see, are freely using proprietary metrics created by other marketing companies and treating them as a perfectly reliable proxy for esoteric concepts which even Google finds hard to define, such as relevance and authority. Furthermore, all other things being equal, show me the SEO who will take a pass on a PR6 link for the sake of a PR3 one.

Blogging on “How Does XXX Google Update Change Your SEO” - 5 Seconds After it is Announced

Matt hasn't turned off his video camera to switch his t-shirt for the next Webmaster Central video and there are already dozens of blog posts discussing in the most intricate detail how the new algorithm update/penalty/infrastructure change/random-monochromatic-animal will impact everyone's daily routine and how we should all run for the hills.

Best-case scenario, these prolific writers only know the name of the update and they are already suggesting strategies on how to avoid being slapped or, even better, get out of the doghouse. This was painfully obvious in the early days of Panda, when people were writing up their “experiences” of recovering from the algorithm update even before the second update was rolled out, making any testimony of recovery, in the worst case, a lie or (given a massive benefit of the doubt) a misinterpretation of ranking changes (rank checking, anyone?).

Put down your quill and your ink bottle, skippy; wait for the dust to settle, and unless you have a human source who was involved in the development or implementation of the algorithm, just sit tight and observe for the first week or two. After that you can write up those observations and it will be considered legitimate, even interesting reporting on the new algorithm. Anything earlier than that will paint you as a clueless pageview chaser, looking to ride the wave of interest with blog posts that often close with “we will probably not even know what the XXX update is all about until we give it some time to get implemented”. Captain Obvious to the rescue.

Adwords Can Help Your Organic Rankings

This one is like the mythological Hydra – you cut one head off, two new ones spring out. This question has been answered so many times by so many people, both from within the search engines and from the SEO community, that if you are addressing it today, I suspect you are actually trying to refrain from talking about something else and are using this topic as a smoke screen. Yes, I am looking at you, Google Webmaster Central videos. Is that *really* the most interesting question you found in your pile? What, no one asked about <not provided>, or about social signals, or about the role authorship plays in non-personalized rankings, or whether it flows through links, or a million other questions that are much more relevant, interesting and, more importantly, still unanswered?

Infographics/Directories/Commenting/Forum Profile Links Don't Work

This is very similar to the blackhat/whitehat argument and it is usually supported by a statement that looks something like “do you think that Google, with hundreds of PhDs, hasn't already discounted that in their algorithm?”. This is a typical “argument from incredulity” by people who glorify postgraduate degrees as a litmus test of intelligence and ingenuity. My claim is that these people have neither looked at the backlink profiles of many sites in many competitive niches nor do they know a lot of people doing or holding a PhD. They highly underrate the former and overrate the latter.

A link is a link is a link and the only difference is between link profiles and percentages that each type of link occupies in a specific link profile. Funnily enough, the same people who claim that X type of links don't work are the same people who will ask for link removal from totally legitimate, authoritative sources who gave them a totally organic, earned link. Go figure.

“But Matt/John/Moultano/anyone-with-a-brother-in-law-who-has-once-visited-Mountain-View said…”

Hello. Did you order “not provided will be maximum 10% of your referral data”? Or did you have “I would be surprised if there was a PR update this year”? How about “You should never use nofollow on-site links that you don't want crawled. But it won't hurt you. Unless something.”?

People keep thinking that folks at Google sit around all day long, thinking about how they can help SEOs do their job. How can you build your business on advice given out by an entity that is actively trying to keep visitors from coming to your site? Can you imagine that happening in any other business environment? Can you imagine Nike's marketing department going for a one-day training session at Adidas HQ to help them sell their sneakers better?

Repeat after me: THEY ARE NOT YOUR FRIENDS. Use your own head. Even better, use your own experience. Test. Believe your own eyes.

We Didn't Need Keyword Data Anyway

This is my absolute favourite. People who were as of yesterday basing their reporting, link building, landing page optimization, ranking reports, conversion rate optimization and about every other aspect of their online campaigns on referring keywords all of a sudden feel the need to tell the world how they never thought keywords were an important metric. That's right, buster: we are so much better off flying blind, doing iteration upon iteration of a derivation of data based on past trends, future trends, landing pages, third-party data, etc.

It is OK every once in a while to say “crap, Google has really shafted us with this one, this is seriously going to affect the way I track progress”. Nothing bad will happen if you do. You will not lose face over it. Yes, there were other metrics that were ALSO useful for different aspects of SEO, but it is not as if, when driving a car and your brakes die on you, you say “pffft, stopping is for losers anyway, who wants to stop the car when you can enjoy the ride, I never really used those brakes in the past anyway. What really matters in the car is that your headlights are working”.

Does this mean we can't do SEO anymore? Of course not. Adaptability is one of the top required traits of an SEO and we will adapt to this situation as we did to all the others in the past. But don't bullshit yourself and everyone else that 100% <not provided> didn't hurt you.

Responses to SEO is Dead Stories

It is crystal clear why the “SEO is dead” stories themselves deserve to die a slow and painful death. I am talking here about the hordes of SEOs who rise to the occasion every freaking time some fifth-rate journalist decides to poke the SEO industry through the cage bars, and convince them, nay, prove to them, that SEO is not only not dying but is alive and kicking and bigger than ever. And I am not innocent of this myself; I have also dignified this idiotic topic with a response (albeit a short one). But how many times can we rise to the same occasion and repeat the same points? What original angle can you give to this story after 16 years of responding to the same old claims? And if you can't give an original angle, how in the world are you increasing our collective knowledge by rewarming and serving the same old dish that wasn't very good the first time it was served? Don't you have rankings to check instead?

There is No #10.

But that's what everyone does: writes a “Top 10 ways…” article, forcing the examples until they get to a linkbaity number. No one wants to read a “Top 13…” or a “Top 23…” article. This needs to die too. Write what you have to say, not what you think will get the most traction. Marketing is makeup, but the face needs to be pretty before you apply it. Unless you like putting lipstick on pigs.


Branko Rihtman has been optimizing sites for search engines since 2001, for clients and his own web properties in a variety of competitive niches. Over that time, Branko realized the importance of properly done research and experimentation and started publishing findings and experiments at SEO Scientist, with some additional updates at @neyne. He currently consults for a number of international clients, helping them improve their organic traffic and conversions while questioning old approaches to SEO and trying some new ones.

What Types of Sites Actually Remove Links?

Since the disavow tool came out, SEOs have been sending thousands of "remove my link" requests daily. Some of them come off as polite, some lie & claim that the person linking is at huge risk of having their own rankings tank, some lie with faux legal risks, some come with "extortionisty" threats that if they don't comply the sender will report the site to Google or try to get the web host to take down the site, and some come with payment/bribery offers.

If you want results from Google's jackassery game you either pay heavily with your time, pay with cash, or risk your reputation by threatening or lying broadly to others.

At the same time, Google has suggested that anyone who would want payment to remove links is operating below board. But if you receive these inbound emails (often from anonymous Gmail accounts) you not only have to account for the time it would take to find the links & edit your HTML, but you also have to determine if the person sending the link removal request represents the actual site, or if it is someone trying to screw over one of their competitors. Then, if you confirm that the request is legitimate, you either need to further expand your page's content to make up for the loss of that resource or find a suitable replacement for the link that was removed. All this takes time. And if that time is from an employee that means money.
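The time-equals-money point above is easy to make concrete with a back-of-the-envelope cost model. The steps, minutes, and hourly rate below are all hypothetical illustrations, not measured figures:

```python
# Rough cost of honoring a single link removal request: verification,
# editing, and patching up the content all consume employee time.
# The steps, minutes, and hourly rate below are hypothetical.

def removal_cost(minutes_per_step, hourly_rate):
    """Total dollar cost of one removal request."""
    total_minutes = sum(minutes_per_step.values())
    return total_minutes / 60 * hourly_rate

steps = {
    "verify the requester actually represents the linked site": 15,
    "locate the link and edit the HTML": 10,
    "patch the content that cited the removed resource": 20,
}
print(f"Cost per request: ${removal_cost(steps, hourly_rate=40):.2f}")
```

Multiply that by the volume of anonymous Gmail requests a well-linked site receives and the "just remove it for free" expectation stops looking reasonable.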

There have been hints that if a website is disavowed some number of times that data can be used to further go out & manually penalize more websites, or create link classifications for spam.

... oh no ...

Social engineering is the most profitable form of engineering going on in the 'Plex.

The last rub is this: if you do value your own life at nothing in a misguided effort to help third parties (who may have spammed up your site for links & then often follow it up with lying to you to achieve their own selfish goals), how does that reflect on your priorities and the (lack of) quality in your website?

If you contacted the large branded websites that Google is biasing their algorithms toward promoting, do you think those websites would actually waste their time & resources removing links to third party websites? For free?

Color me skeptical.

As a thought experiment, look through your backlinks for a few spam links that you know are hosted by Google (eg: Google Groups, YouTube, Blogspot, etc.) and try to get Google's webmaster to help remove those links for you & let us know how well that works out for you.

Some of the larger monopoly & oligopolies don't offer particularly useful customer service to their paying customers. For example, track how long it takes you to get a person on the other end of the phone with a telecom giant, a cable company, or a mega bank. Better yet, look at how long it took AdWords to openly offer phone support & the non-support they offer AdSense publishers (remember the bit about Larry Page believing that "the whole idea of customer support was ridiculous?")

For the non-customer Google may simply recommend that the best strategy is to "start over."

When Google aggregates Webmaster Tools link data from penalized websites they can easily make 2 lists:

  • sites frequently disavowed
  • sites with links frequently removed
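A minimal sketch of how such an aggregation could work, counting how often each domain is flagged across reports. The domains, counts, and flagging threshold are all hypothetical:

```python
# Sketch of the two-list aggregation described above: tally how often each
# domain appears in disavow files vs. confirmed link removals, and flag
# domains that cross a threshold. All inputs here are hypothetical.
from collections import Counter

disavow_reports = ["spamdir.example", "spamdir.example", "blog.example",
                   "spamdir.example", "widgetlinks.example"]
removal_reports = ["widgetlinks.example", "widgetlinks.example", "blog.example"]

def frequently_flagged(reports, threshold=2):
    """Return the set of domains flagged at least `threshold` times."""
    counts = Counter(reports)
    return {domain for domain, n in counts.items() if n >= threshold}

frequently_disavowed = frequently_flagged(disavow_reports)
frequently_removed = frequently_flagged(removal_reports)
print("Frequently disavowed:", sorted(frequently_disavowed))
print("Frequently removed:  ", sorted(frequently_removed))
```

Whether Google actually builds such lists, or weighs one more heavily than the other, is speculation; the point is only that the raw counts are trivial to aggregate once the disavow data exists.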

If both lists are equally bad, then you are best off ignoring the removal requests & spending your time & resources improving your site.

If I had to guess, I would imagine that being on the list of "these are the spam links I was able to remove" is worse than being on the list of "these are the links I am unsure about & want to disavow just in case."

What say you?
