The Only Thing Certain In SEO Is Change

Oct 15th

SEO is subject to frequent change, but in the last year or two, the changes feel both more frequent and more significant than in the past. Florida hit in 2003. Since then, it’s like we get a Florida every six months.

Whenever Google updates the underlying landscape, the strategies need to change in order to deal with it. No fair warning. That’s not the game.

From Tweaks To Strategy

There used to be a time when SEOs followed a standard prescription. Many of us remember a piece of software called Web Position Gold.

Web Position Gold emerged when SEO could be reduced to a series of repeatable - largely technical - steps. Those steps involved adding keywords to a page, repeating those keywords in sufficient density, checking a few pieces of markup, then scoring against an “ideal” page. Upload to web. Add a few links. Wait a bit. Run a web ranking report. Voila! You’re an SEO. In all but the most competitive areas, this actually worked.

Seems rather quaint these days.

These days, you could do all of the above and get nowhere. Or you might get somewhere, but with so many more factors in play, success can’t be isolated to an individual page score. If the page is published on a site with sufficient authority, it will do well almost immediately. If it appears on a little-known site, it may remain invisible for a long time.

Before Google floated in 2004, they released an investor statement signalling SEO - well, “index spammers” - as a business risk. If you ever want to know what Google really feels about people who “manipulate” their results, it’s right here:

We are susceptible to index spammers who could harm the integrity of our web search results.

There is an ongoing and increasing effort by “index spammers” to develop ways to manipulate our web search results. For example, because our web search technology ranks a web page’s relevance based in part on the importance of the web sites that link to it, people have attempted to link a group of web sites together to manipulate web search results. We take this problem very seriously because providing relevant information to users is critical to our success. If our efforts to combat these and other types of index spamming are unsuccessful, our reputation for delivering relevant information could be diminished. This could result in a decline in user traffic, which would damage our business.

SEO competes with the Adwords business model. So, Google “take very seriously” the activities of those who seek to figure out the algorithms, reverse engineer them, and create push-button tools like Web Position Gold. We’ve had Florida, and Panda, and Penguin, and Hummingbird, all aimed at making the search experience better for users, whilst having the pleasant side effect, as far as Google is concerned, of making life more difficult for SEOs.

I think the key part of Google’s statement was “delivering relevant information”.

From Technical Exercise To PR

SEO will always involve technical aspects. You get down to code level and mark pages up. The SEO needs to be aware of development and design, and how those activities can affect SEO. The SEO needs to know how web servers work, and how spiders can sometimes fail to deal with their quirks.

But in the years since Florida, marketing aspects have become more important. An SEO can perform the technical aspects of SEO and still get nowhere. More recent algorithms, such as Panda and Penguin, gauge the behaviour of users, as Google tries to determine the information quality of pages. Hummingbird attempts to discover the intent that lies behind keywords.

As a result, keyword-based SEO is in the process of being killed off. Google withholds keyword referrer data, and their various algorithms attempt to deliver pages based on a user’s intent and activity - both prior and present - in order to deliver relevant information. Understanding the user, having a unique and desirable offering, and holding a defensible market position are more important than any keyword markup. The keyword match, on which much SEO is based, is not an approach that is likely to endure.

The emphasis has also shifted away from the smaller operators and now appears to favour brands. This occurs not because brands are categorized as “brands”, but due to the side effects of significant PR activities. Bigger companies tend to run multiple advertising and PR campaigns, so they produce signals Google finds favorable, e.g. search volume on the company name, semantic associations with products and services, frequent links from reputable media, and so on. This flows through into rank. It also earns them leeway when operating in the gray area where manual penalties are handed out to smaller, weaker entities for the same activities.

Rankings

Apparently, Google killed off toolbar PageRank.

We will probably not going to be updating it [PageRank] going forward, at least in the Toolbar PageRank.

A few people noted it, but the news won't raise many eyebrows as toolbar PR has long since become meaningless. Are there any SEOs altering what they do based on toolbar PR? It’s hard to imagine why. The reality is that an external PR value might indicate an approximate popularity level, but this isn’t an indicator of the subsequent ranking a link from such a page will deliver. There are too many other factors involved. If Google are still using an internal PR metric, it’s likely to be a significantly more complicated beast than was revealed in 1997.
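
For reference, the formula Brin and Page originally published was strikingly simple. Here, d is a damping factor (usually set around 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

$$PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)$$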

A PageRank score is a proxy for authority. I’m quite sure Google kept it going as an inside joke.

A much more useful proxy for authority is the top ten pages in any niche. Google has determined all well-ranking pages have sufficient authority, and no matter what the toolbar, or any other third-party proxy, says, it’s Google’s output that counts. A link from any one of the top ten pages will likely confer a useful degree of authority, all else being equal. It’s good marketing practice to be linked from, and engage with, known leaders in your niche. That’s PR, as in public relations thinking, vs PR, as in PageRank thinking.

The next to go will likely be keyword-driven SEO. Withholding keyword referral data was the beginning of the end. Hummingbird is hammering in the nails. Keywords are still great for research purposes - to determine if there’s an audience and what the size of that audience may be - but SEO is increasingly driven by semantic associations and site categorizations. It’s not enough to feature a keyword on a page. A page, and site, needs to be about that keyword, and keywords like it, and be externally recognized as such. In the majority of cases, a page needs to match user intent, rather than just a search term. There are many exceptions, of course, but given what we know about Hummingbird, this appears to be the trend.

People will still look at rank, and lust after prize keywords, but really, rankings have been a distraction all along. Reach and specificity are more important i.e. where’s the most value coming from? The more specific the keyword, typically the lower the bounce rate and the higher the conversion rate. The lower the bounce rate, and the higher the conversion rate, the more positive signals the site will generate, which will flow back into a ranking algorithm increasingly being tuned for engagement. Ranking for any keyword that isn’t delivering business value makes no sense.

There are always exceptions. But that’s the trend. Google are looking for pages that match user intent, not just pages that match a keyword term. In terms of reach, you want to be everywhere your customers are.

Search Is The Same, But Different

To adapt to change, SEOs should think about search in the widest possible terms. A search is a quest for information. It may be an active, self-directed search, in the form of a search engine query. Or a more passive search, delivered via social media subscriptions and the act of following. How will all these activities feed into your search strategy?

Sure, it’s not a traditional definition of SEO, as I'm not limiting it to search engines. Rather, my point is about the wider quest for information. People want to find things. Eric Schmidt recently claimed Amazon is Google's biggest competitor in search. The mechanisms and channels may change, but the quest remains the same. Take, for example, the changing strategy of BuzzFeed:

Soon after Peretti had turned his attention to BuzzFeed full-time in 2011, after leaving the Huffington Post, BuzzFeed took a hit from Google. The site had been trying to focus on building traffic from both social media marketing and through SEO. But the SEO traffic — the free traffic driven from Google’s search results — dried up.

Reach is important. Topicality is important. Freshness, in most cases, is important. Engagement is important. Finding information is not just about a technical match of a keyword, it’s about an intellectual match of an idea. BuzzFeed didn’t take their eye off the ball. They know helping users find information is the point of the game they are in.

And the internet has only just begun.

In terms of the internet, nothing has happened yet. The internet is still at the beginning of its beginning. If we could climb into a time machine and journey 30 years into the future, and from that vantage look back to today, we’d realize that most of the greatest products running the lives of citizens in 2044 were not invented until after 2014. People in the future will look at their holodecks, and wearable virtual reality contact lenses, and downloadable avatars, and AI interfaces, and say, oh, you didn’t really have the internet (or whatever they’ll call it) back then.

In 30 years’ time, people will still be on the exact same quest for information. The point of SEO has always been to get your information in front of visitors, and that’s why SEO will endure. SEO was always a bit of a silly name, and it often distracts people from the point, which is to get your stuff seen ahead of the rest.

Some SEOs have given up in despair because it’s not like the old days. It’s becoming more expensive to do effective SEO, and the reward may not be there, especially for smaller sites. However, this might be to miss the point, somewhat.

The audience is still there. Their needs haven’t changed. They still want to find stuff. If SEO is all about helping users find stuff, then that’s the important thing. Remember the “why”. Adapt the “how”.

In the next few articles, we’ll look at the specifics of how.

Measuring SEO Performance After "Not Provided"

Aug 24th

In recent years, the biggest change to the search landscape happened when Google chose to withhold keyword data from webmasters. At SEOBook, Aaron noticed and wrote about the change, as ever more keyword data disappeared.

The motivation to withhold this data, according to Google, was privacy concerns:

SSL encryption on the web has been growing by leaps and bounds. As part of our commitment to provide a more secure online experience, today we announced that SSL Search on https://www.google.com will become the default experience for signed in users on google.com.

At first, Google suggested it would only affect a single-digit percentage of search referral data:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com

...which didn't turn out to be the case. It now affects almost all keyword referral data from Google.

Was it all about privacy? Another rocket over the SEO bows? A bit of both? Probably. In any case, the search landscape was irrevocably changed. Instead of being shown the keyword term the searcher had used to find a page, webmasters were given the less-than-helpful “not provided”. This change rocked SEO. The SEO world, up until that point, had been built on keywords. SEOs chose a keyword. They ranked for the keyword. They tracked click-thrus against this keyword. This is how many SEOs proved their worth to clients.

These days, very little keyword data is available from Google. There certainly isn’t enough keyword data to use as a primary form of measurement.

Rethinking Measurement

This change forced a rethink about measurement, and SEO in general. Whilst there is still some keyword data available from the likes of Webmaster Tools & the AdWords paid versus organic report, keyword-based SEO tracking approaches are unlikely to align with Google’s future plans. As we saw with the Hummingbird algorithm, Google is moving towards searcher-intent based search, as opposed to keyword-matched results.

Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words

The search bar is still keyword based, but Google is also trying to figure out what user intent lies behind the keyword. To do this, they’re relying on context data. For example, they look at the previous searches the user has made, their location, the makeup of the query itself, and so on, all of which can change the search results the user sees.

When SEO started, it was in an environment where the keyword the user typed into a search bar was matched against a keyword that appeared on a page. This is what relevance meant. SEO continued with this model, but it’s fast becoming redundant, because Google is increasingly relying on context in order to determine searcher intent, while filtering out many results that were too closely aligned with the old strategy. As a result, much SEO has shifted from keywords to wider digital marketing considerations, such as what the visitor does next.

We’ve Still Got Great Data

Okay, if SEOs don’t have keywords, what can they use?

If we step back a bit, what we’re really trying to do with measurement is demonstrate value. Value of search vs other channels, and value of specific search campaigns. Did our search campaigns meet our marketing goals and thus provide value?

Do we have enough data to demonstrate value? Yes, we do. Here are a few ways SEOs have devised to look at the organic search data they are getting, and use it to demonstrate value.

1. Organic Search Vs Other Activity

Is our organic search tracking well when compared with other digital marketing channels, such as social or email? About the same? Falling?

In many ways, the withholding of keyword data can be a blessing, especially to those SEOs who have a few ranking-obsessed clients. A ranking, in itself, is worthless, especially if it’s generating no traffic.

Instead, if we look at the total amount of organic traffic, and see that it is rising, then we shouldn’t really care too much about what keywords it is coming from. We can also track organic searches across device, such as desktop vs mobile, and get some insight into how best to optimize those channels for search as a whole, rather than by keyword. It’s important that the traffic came from organic search, rather than from other campaigns. It’s important that the visitors saw your site. And it’s important what that traffic does next.

2. Bounce Rate

If a visitor comes in, doesn’t like what is on offer, and clicks back, then that won’t help rankings. Google have been a little oblique on this point, saying they aren’t measuring bounce rate, but I suspect it’s a little more nuanced, in practice. If people are failing to engage, then anecdotal evidence suggests this does affect rankings.

Look at the behavioral metrics in GA; if your content has 50% of people spending less than 10 seconds, that may be a problem or that may be normal. The key is to look below that top graph and see if you have a bell curve or if the next largest segment is the 11-30 second crowd.
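
That check is easy to script. Here is a minimal sketch, assuming you’ve exported per-session time-on-page figures (in seconds) from GA into a plain list; it buckets them into GA-style segments so you can see the shape of the distribution:

```python
from collections import Counter

# GA-style engagement buckets, in seconds: (upper bound, label)
BUCKETS = [(10, "0-10s"), (30, "11-30s"), (60, "31-60s"),
           (180, "61-180s"), (600, "181-600s"), (float("inf"), "600s+")]

def bucket(seconds):
    # Return the label of the first bucket whose upper bound fits
    for upper, label in BUCKETS:
        if seconds <= upper:
            return label

def engagement_profile(durations):
    # durations: per-session time-on-page figures, in seconds
    counts = Counter(bucket(s) for s in durations)
    total = len(durations)
    return {label: round(100 * counts[label] / total, 1)
            for _, label in BUCKETS}

# Example: a distribution with a worrying spike of sub-10-second visits
print(engagement_profile([3, 5, 8, 12, 25, 45, 90, 200, 4, 7]))
```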

Either way, we must encourage visitor engagement. Even small improvements in terms of engagement can mean big changes in the bottom line. Getting visitors to a site was only ever the first step in a long chain. It’s what they do next that really makes or breaks a web business, unless the entire goal was that the visitor should only view the landing page. Few sites, these days, would get much return on non-engagement.

PPCers are naturally obsessed with this metric, because each click is costing them money, but when you think about it, it’s costing SEOs money, too. Clicks are getting harder and harder to get, and each click does have a cost associated with it i.e. the total cost of the SEO campaign divided by the number of clicks, so each click needs to be treated as a cost.
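
To make the arithmetic concrete, a minimal sketch (the figures are invented for illustration):

```python
def cost_per_organic_click(campaign_cost, organic_clicks):
    # Total SEO campaign spend divided by the organic clicks it produced
    return campaign_cost / organic_clicks

# e.g. a $3,000/month campaign driving 6,000 organic visits a month
print(cost_per_organic_click(3000, 6000))  # 0.5 -> each click "cost" 50 cents
```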

3. Landing Pages

We can still do landing page analysis. We can see the pages where visitors are entering the website. We can also see which pages are most popular, and we can tell from the topic of the page what type of keywords people are using to find it.

We could add more related keywords to these pages and see how they do, or create more pages on similar themes, using different keyword terms, and then monitor the response. Similarly, we can look at poorly performing pages and make the assumption these are not ranking against intended keywords, and mark them for improvement or deletion.

We can see how old pages vs new pages are performing in organic search. How quickly do new pages get traffic?
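
A minimal sketch of this kind of landing-page analysis, assuming you’ve exported organic landing-page data to a CSV with landing_page, entrances, and first_published columns (the column names are placeholders; adjust to your export):

```python
import pandas as pd

df = pd.read_csv("organic_landing_pages.csv",
                 parse_dates=["first_published"])

# Top entry pages for organic search
top = df.sort_values("entrances", ascending=False).head(20)
print(top[["landing_page", "entrances"]])

# Old vs new: how quickly are recently published pages picking up traffic?
cutoff = pd.Timestamp.now() - pd.DateOffset(months=6)
new_pages = df[df["first_published"] >= cutoff]
old_pages = df[df["first_published"] < cutoff]
print("median entrances, pages < 6 months old:", new_pages["entrances"].median())
print("median entrances, older pages:", old_pages["entrances"].median())
```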

We’re still getting a lot of actionable data, and still not one keyword in sight.

4. Visitor And Customer Acquisition Value

We can still calculate the value to the business of an organic visitor.

We can also look at what step in the process organic visitors are converting. Early? Late? Why? Is there some content on the site that is leading them to convert better than other content? We can still determine if organic search provided a last-click conversion, or a conversion as the result of a mix of channels, where organic played a part. We can do all of this from aggregated organic search data, with no need to look at keywords.

5. Contrast With PPC

We can contrast AdWords data against organic search. Trends we see in PPC might also be working in organic search.

For AdWords our life is made infinitely easier because by linking your AdWords account to your Analytics account rich AdWords data shows up automagically allowing you to have an end-to-end view of campaign performance.

Even PPC-ers are having to change their game around keywords:

The silver lining in all this? With voice and mobile search, you’ll likely catch those conversions that you hadn’t before. While you may think that you have everything figured out and that your campaigns are optimal, this matching will force you into deeper dives that hopefully uncover profitable PPC pockets.

6. Benchmark Against Everything

In the above section I highlighted comparing organic search to AdWords performance, but you can benchmark against almost any form of data.

Is 90% of your keyword data (not provided)? Then you can look at the 10% which is provided to estimate performance on the other 90% of the traffic. If you get 1,000 monthly keyword visits for [widgets], then as a rough rule of thumb you might get roughly 9,000 monthly visits for that same keyword shown as (not provided).
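
The extrapolation is just a ratio. A minimal sketch, assuming the visible 10% is representative of the hidden 90% (a rough rule of thumb, not a guarantee):

```python
def estimate_total_visits(provided_visits, provided_share):
    # If only 10% of keyword referrals are visible, provided_share = 0.10
    return provided_visits / provided_share

visible = 1000          # monthly visits reported for [widgets]
total = estimate_total_visits(visible, 0.10)
print(total - visible)  # ~9000 visits likely hiding in (not provided)
```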

Has your search traffic gone up or down over the past few years? Are there seasonal patterns that drive user behavior? How important is the mobile shift in your market? What landing pages have performed the best over time and which have fallen hardest?

How does your site’s aggregate keyword ranking profile compare to top competitors? Even if you don’t have all the individual keyword referral data from search engines, seeing the aggregate footprints, and how they change over time, indicates who is doing better, and who is gaining exposure vs losing it.

Numerous competitive research tools like SEM Rush, SpyFu & SearchMetrics provide access to that type of data.

You can also go further with other competitive research tools which look beyond the search channel. Is most of your traffic driven from organic search? Do your competitors do more with other channels? A number of sites like Compete.com and Alexa have provided estimates for this sort of data. Another newer entrant into this market is SimilarWeb.

And, finally, rank checking still has some value. While rank tracking may seem futile in the age of search personalization and Hummingbird, it can still help you isolate performance issues during algorithm updates. There are a wide variety of options from browser plugins to desktop software to hosted solutions.

By now, I hope I’ve convinced you that specific keyword data isn’t necessary and, in some cases, may have only served to distract some SEOs from seeing other valuable marketing metrics, such as what happens after the click and where visitors go next.

So long as the organic search traffic is doing what we want it to, we know which pages it is coming in on, and can track what it does next, there is plenty of data there to keep us busy. Lack of keyword data is a pain, but in response, many SEOs are optimizing for a lot more than keywords, and focusing more on broader marketing concerns.


Guide To Optimizing Client Sites 2014

Jul 16th

For those new to optimizing client sites, or those seeking a refresher, we thought we’d put together a guide to step you through it, along with some selected deeper reading on each topic area.

Every SEO has different ways of doing things, but we’ll cover the aspects that you’ll find common to most client projects.

Few Rules

The best rule I know about SEO is that there are few absolutes. Google is a black box, so complete data sets will never be available to you. It can therefore be difficult to pin down cause and effect, and there will always be a lot of experimentation and guesswork involved. If something works, keep doing it. If it doesn’t, try something else until it does.

Many opportunities tend to present themselves in ways not covered by “the rules”. Many opportunities will be unique and specific to the client and market sector you happen to be working with, so it's a good idea to remain flexible and alert to new relationship and networking opportunities. SEO exists on the back of relationships between sites (links) and the ability to get your content remarked upon (networking).

When you work on a client site, you will most likely be dealing with a site that is already established, so it’s likely to have legacy issues. The other main challenge you’ll face is that you’re unlikely to have full control over the site, like you would if it were your own. You’ll need to convince other people of the merit of your ideas before you can implement them. Some of these people will be open to them, some will not, and some can be rather obstructive. So, the more solid data and sound business reasoning you provide, the better chance you have of convincing people.

The most important aspect of doing SEO for clients is not blinding them with technical alchemy, but helping them see how SEO provides genuine business value.

1. Strategy

The first step in optimizing a client site is to create a high-level strategy.

"Study the past if you would define the future.” - Confucius

You’re in discovery mode. Seek to understand everything you can about the client’s business and their current position in the market. What is their history? Where are they now, and where do they want to be? Interview your client. They know their business better than you do, and they will likely be delighted when you take a deep interest in them.

  • What are they good at?
  • What are their top products or services?
  • What is the full range of their products or services?
  • Are they weak in any areas, especially against competitors?
  • Who are their competitors?
  • Who are their partners?
  • Is their market sector changing? If so, how? Can they think of ways in which this presents opportunities for them?
  • What keyword areas have worked well for them in the past? Performed poorly?
  • What are their aims? More traffic? More conversions? More reach? What would success look like to them?
  • Do they have other online advertising campaigns running? If so, what areas are these targeting? Can they be aligned with SEO?
  • Do they have offline presence and advertising campaigns? Again, what areas are these targeting and can they be aligned with SEO?

Some SEO consultants see their task as being to gain more rankings under an ever-growing list of keywords. Ranking for more keywords, or getting more traffic, may not result in measurable business returns; it depends on the business and the marketing goals. Some businesses will benefit from honing in on specific opportunities that are already being targeted; others will seek wider reach. This is why it’s important to understand the business goals and market sector, then design the SEO campaign to support the goals and the environment.

This type of analysis also provides you with leverage when it comes to discussing specific rankings and competitor rankings. The SEO can’t be expected to wave a magic wand and place a client top of a category in which they enjoy no competitive advantage. Even if the SEO did manage to achieve this feat, the client may not see much in the way of return as it’s easy for visitors to click other listings and compare offers.

Understand all you can about their market niche. Look for areas of opportunity, such as changing demand not being met by your client or competitors. Put yourself in their customers’ shoes. Try and find customers and interview them. Listen to the language of customers. Go to places where their customers hang out online. From the customers’ language and needs, combined with the knowledge gleaned from interviewing the client, you can determine effective keywords and themes.

Document. Get it down in writing. The strategy will change over time, but you’ll have a baseline point of agreement outlining where the site is at now, and where you intend to take it. Getting buy-in early smooths the way for later on. Ensure that whatever strategy you adopt, it adds real, measurable value by being aligned with, and serving, the business goals. It’s on this basis the client will judge you, and maintain or expand your services in future.

Further reading:

- 4 Principles Of Marketing Strategy In The Digital Age
- Product Positioning In Five Easy Steps [pdf]
- Technology Marketers Need To Document Their Marketing Strategy

2. Site Audit

Sites can be poorly organized, have various technical issues, and miss keyword opportunities. We need to quantify what is already there, and what’s not there.

  • Use a site crawler, such as Xenu Link Sleuth, Screaming Frog or other tools that will give you a list of URLs, title information, link information and other data.
  • Make a list of all broken links.
  • Make a list of all orphaned pages
  • Make a list of all pages without titles
  • Make a list of all pages with duplicate titles
  • Make a list of pages with weak keyword alignment
  • Crawl robots.txt and hand-check it. It’s amazing how easy it is to disrupt crawling with a robots.txt file

Broken links are a low-quality signal. It’s debatable whether they are a low-quality signal to Google, but they certainly are to users. If the client doesn't have one already, implement a system whereby broken links are checked on a regular basis. Orphaned pages are pages that have no links pointing to them. Those pages may be redundant, in which case they should be removed, or you need to point inbound links at them, so they can be crawled and have more chance of gaining rank. Page titles should be unique, aligned with keyword terms, and made attractive in order to gain a click. A link is more attractive if it speaks to a customer need. Carefully check robots.txt to ensure it’s not blocking areas of the site that need to be crawled.
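
Most crawlers will export to CSV, so several of these checks reduce to a few lines of scripting. A minimal sketch, assuming a Screaming Frog-style export with url, status_code, and title columns (exact column names vary by tool):

```python
import csv
from collections import defaultdict

titles = defaultdict(list)
broken, untitled = [], []

with open("crawl_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["status_code"].startswith(("4", "5")):
            broken.append(row["url"])          # broken links/pages
        title = row["title"].strip()
        if not title:
            untitled.append(row["url"])        # pages without titles
        else:
            titles[title].append(row["url"])   # group for duplicate check

duplicates = {t: urls for t, urls in titles.items() if len(urls) > 1}
print(f"{len(broken)} broken, {len(untitled)} untitled, "
      f"{len(duplicates)} duplicated titles")
```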

As part of the initial site audit, it might make sense to include the site in Google Webmaster Tools to see if it has any existing issues there and to look up its historical performance on competitive research tools to see if the site has seen sharp traffic declines. If they've had sharp ranking and traffic declines, pull up that time period in their web analytics to isolate the date at which it happened, then look up what penalties might be associated with that date.

Further Reading:

- Broken Links, Pages, Images Hurt SEO
- Three Easy Ways To Fix Broken Links And Stop Unnecessary Visitor Loss
- 55 Ways To Use Screaming Frog
- Robots.txt Tutorial

3. Competitive Analysis

Some people roll this into a site audit, but I’ll split it out as we’re not looking at technical issues on competitor sites, we’re looking at how they are positioned, and how they’re doing it. In common with a site audit, there’s some technical reverse engineering involved.

There are various tools that can help you do this. I use SpyFu. One reporting aspect that is especially useful is estimating the value of the SEO positions vs the Adwords positions. A client can then translate the ranks into dollar terms, and justify this back against your fee.

When you run these competitive reports, you can see what content of theirs is working well, and what content is gaining ground. Make a list of all competitor content that is doing well. Examine where their links are coming from, and make a list. Examine where they’re mentioned in the media, and make a list. You can then use a fast-follow strategy to emulate their success, then expand upon it.

Sometimes, “competitors”, meaning ranking competitors, can actually be potential partners. They may not be in the same industry as your client; they may just happen to rank in a cross-over area. They may be good for a link, become a supplier, welcome advertising on their site, or be willing to place your content on their site. Make a note of the sites that are ranking well within your niche, but aren’t direct competitors.

Using tools that estimate the value of ranks by comparing AdWords keyword prices, you can estimate the value of your competitors’ positions. If your client appears lower than the competition, you can demonstrate the estimated dollar value of putting time and effort into increasing rank. You can also evaluate their rate of improvement over time vs your client, and use this as a competitive benchmark. If your client is not putting in the same effort as a competitor, they’ll be left behind. If competitors are spending on ongoing SEO and seeing tangible results, there is some validation for your client to do likewise.

Further reading:

- Competitor Analysis [pdf]
- Illustrated SEO Competitive Workflow
- Competitive Analysis: How To Become A SEO Hero In 4 Steps

4. Site Architecture

A well-organised site is useful from both a usability standpoint and an SEO standpoint. If it’s clear to a user where they need to go next, then this will flow through into better engagement scores. If your client has a usability consultant on staff, this person is a likely ally.

It’s a good idea to organise a site around themes. Anecdotal evidence suggests that Google likes pages grouped around similar topics, rather than disparate topics.

  • Create a spreadsheet based on a crawl after any errors have been tidied up
  • Identify best-selling products and services. These deserve the most exposure and should be placed high up the site hierarchy. Items and categories that do not sell well, and are less strategically important, should be lower in the hierarchy
  • Pages that are already getting a lot of traffic, as indicated by your analytics, might deserve more exposure by moving them up the hierarchy.
  • Seasonal products might deserve more exposure just before that shopping season, and less exposure when the offer is less relevant.
  • Group pages into similar topics, where possible. For example, acme.com/blue-widgets/ , acme.com/green-widgets/.
  • Determine if internal anchor text is aligned with keyword titles and page content by looking at a backlink analysis

A spreadsheet of all pages helps you group pages thematically, preferably into directories with similar content. Your strategy document will guide you as to which pages you need to work on, and which pages you need to relegate. Some people spend a lot of time sculpting internal PageRank i.e. flowing PageRank to some pages, while using nofollow on other links so as not to pass link equity to others. Google may have deprecated that approach, but you can still link to important products or categories sitewide to flow them more link equity, while putting less important pages lower in the site’s architecture. Favour your money pages, and relegate your less important pages.
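
A minimal sketch of the thematic grouping, assuming you have a flat list of crawled URLs: bucket them by first path segment and see which themes are thin, which are overloaded, and what doesn’t fit anywhere:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_theme(urls):
    # Bucket URLs by their first path segment, e.g. /blue-widgets/...
    themes = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        theme = path.split("/")[0] if path else "(root)"
        themes[theme].append(url)
    return themes

urls = ["http://acme.com/blue-widgets/small",
        "http://acme.com/blue-widgets/large",
        "http://acme.com/green-widgets/eco",
        "http://acme.com/about"]
for theme, pages in group_by_theme(urls).items():
    print(theme, len(pages))
```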

Think mobile. If your content doesn't work on mobile, then getting to the top of search results won't do you much good.

Further Reading:

- Site Architecture & Search Engine Success Factors
- Optimizing Your Website’s Architecture For SEO (Slide Presentation)
- The SEO Guide To Information Architecture

5. Enable Crawling & Redirects

Ensure your site is deep crawled. To check if all your URLs are included in Google’s index, sign up with Webmaster Tools and/or other index reporting tools.

  • Include a site map
  • Check the existing robots.txt. Keep robots out of non-essential areas, such as script repositories and other admin-related directories.
  • If you need to move pages, or you have links to pages that no longer exist, use page redirects to tidy them up
  • Make a list of 404 errors. Make sure the 404 page has useful navigation into the site so visitors don’t click back.

The accepted method to redirect a page is to use a 301. The 301 indicates a page has permanently moved location. A redirect is also useful if you change domains, or if you have links pointing to different versions of the site. For example, Google sees http://www.acme.com and http://acme.com as different sites. Pick one and redirect to it.
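
A minimal sketch for verifying the canonical redirect, using Python’s requests library (acme.com stands in for the client’s domain):

```python
import requests

def check_canonical_redirect(from_url, to_prefix):
    # Fetch without following redirects so we can inspect the first hop
    resp = requests.get(from_url, allow_redirects=False, timeout=10)
    is_301 = resp.status_code == 301
    target = resp.headers.get("Location", "")
    return is_301 and target.startswith(to_prefix)

# e.g. the non-www host should 301 to the www version (or vice versa)
print(check_canonical_redirect("http://acme.com/", "http://www.acme.com/"))
```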


If you don’t redirect pages, then you won’t be making full use of any link juice allocated to those pages.

Further Reading:

- What Are Google Site Maps?
- The Ultimate Guide To 301 Redirects
- Crawling And Indexing Metrics

6. Backlink Analysis

Backlinks remain a major ranking factor. Generally, the more high quality links you have pointing to your site, the better you’ll do in the results. Of late, links can also harm you. However, if your overall link profile is strong, then a subset of bad links is unlikely to cause you problems. A good rule of thumb is the Matt Cutts test. Would you be happy to show the majority of your links to Matt Cutts? :) If not, you're likely taking a high risk strategy when it comes to penalties. These can be manageable when you own the site, but they can be difficult to deal with on client sites, especially if the client was not aware of the risks involved in aggressive SEO.

  • Establish a list of existing backlinks. Consider trying to remove any that look low quality.
  • Ensure all links resolve to appropriate pages
  • Draw up a list of sites from which your main competitors have gained links
  • Draw up a list of sites where you’d like to get links from

Getting links involves either direct placement or being linkworthy. On some sites, like industry directories, you can pay to appear. In other cases, it’s making your site into an attractive linking target.

Getting links to purely commercial sites can be a challenge. Consider sponsoring charities aligned with your line of business. Get links from local chambers of commerce. Connect with education establishments who are doing relevant research, and consider sponsoring them or becoming involved in some way.

Look at the sites that point to your competitors. How were these links obtained? Follow the same path. If they successfully used white papers, then copy that approach. If they successfully used news, do that, too. Do whatever seems to work for others. Evaluate the result. Do more/less of it, depending on the results.
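
A minimal sketch of that gap analysis, assuming you’ve exported linking domains for your client and for a competitor from a backlink tool into two plain-text files, one domain per line (the filenames are placeholders):

```python
def load_domains(path):
    # One linking domain per line, as exported from a backlink tool
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

ours = load_domains("client_linking_domains.txt")
theirs = load_domains("competitor_linking_domains.txt")

# Domains linking to the competitor but not to the client: outreach candidates
gap = sorted(theirs - ours)
print(f"{len(gap)} link sources to chase")
for domain in gap[:20]:
    print(domain)
```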

You also need links from sites that your competitors don’t have. Make a list of desired links. Figure out a strategy to get them. It may involve supplying them with content. It might involve participating in their discussions. It may involve giving them industry news. It might involve interviewing them or profiling them in some way, so they link to you. Ask “what do they need?” Then give it to them.

Of course, linking is an ongoing strategy. As a site grows, many links will come naturally, and that, in itself, is a link acquisition strategy: grow in importance and consumer interest relative to the competition. This involves your content strategy. Do you have content that your industry likes to link to? If not, create it. If your site is not something that your industry links to, like a brochure site, you may look at spinning off a second site that is information-focused, and less commercially focused. You sometimes see blogs on separate domains where employees talk about general industry topics, like Signal vs Noise, Basecamp’s blog. These are much more likely to receive links than sites that are purely commercial in nature.

Before chasing links, you should be aware of what type of site typically receives links, and make sure you’re it.

Further Reading:

- Interview Of Debra Mastaler, the Link Guru
- Scaleable Link Building Techniques
- Creative Link Building Ideas

7. Content Assessment

Once you have a list of keywords, an idea of where competitors rank, and what the most valuable terms are from a business point of view, you can set about examining and building out content.

Do you have content to cover your keyword terms? If not, add it to the list of content that needs to be created. If you have content that matches terms, see if it compares well with competing content on the same topic. Can the pages be expanded or made more detailed? Can more/better links be added internally? Will the content benefit from amalgamating different content types, e.g. videos, audio, images et al?

You’ll need to create content for any keyword areas you’re missing. Rather than copy what is already available in the niche, look at the best ranking/most valuable content for that term and ask how it could be made better. Is there new industry analysis or reports you can incorporate and/or expand on? People love the new. They like learning things they don’t already know. Me-too content can work, but it’s not making the most of the opportunity. Aim to produce considerably more valuable content than already exists, as you’ll have more chance of getting links, and more chance of higher levels of engagement when people flip between sites. If visitors can get the same information elsewhere, they probably will.

Consider keyword co-occurrence. What terms are readily associated with the keywords you’re chasing? Various tools provide this analysis, but you can do it yourself using the Adwords research tool. See what keywords it associates with your keywords. The Google co-occurrence algorithm is likely the same for both Adwords and organic search.
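
A minimal sketch of a DIY version, assuming you’ve exported a list of suggested keyword phrases for a seed term (e.g. from the AdWords tool): count which other words most often appear alongside it:

```python
from collections import Counter

def co_occurring_terms(phrases, seed):
    # Count words appearing in the same phrase as the seed term
    counts = Counter()
    for phrase in phrases:
        words = phrase.lower().split()
        if seed in words:
            counts.update(w for w in words if w != seed)
    return counts

phrases = ["blue widgets", "cheap blue widgets", "widgets for sale",
           "buy blue widgets online", "widget repair"]
print(co_occurring_terms(phrases, "widgets").most_common(5))
```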

Also, think about how people will engage with your page. Is it obvious what the page is about? Is it obvious what the user must do next? Dense text and distracting advertising can reduce engagement, so make sure the usability is up to scratch. Text should be a reasonable size so the average person isn’t squinting. It should be broken up with headings and paragraphs. People tend to scan when reading online, searching for immediate confirmation they’ve found the right information. This was written a long time ago, but it’s interesting how relevant it remains.

Further Reading:

- Content Marketing Vs SEO
- Content Analysis Using Google Analytics
- Content Based SEO Strategy Will Eventually Fail

8. Link Out

Sites that don’t link out appear unnatural. Matt Cutts noted:

Of course, folks never know when we're going to adjust our scoring. It's pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.

  • Make a list of all outbound links
  • Determine if these links are complementary i.e. similar topic/theme, or related to the business in some way
  • Make a list of pages with no links out

Links out are both a quality signal and good PR practice. Webmasters look at their inbound links, and will likely follow them back to see what is being said about them. That’s a great way to foster relationships, especially if your client’s site is relatively new. If you put other companies and people in a good light, you can expect many to reciprocate in kind.

Links, the good kind, are about human relationships.

It’s also good for your users. Your users are going to leave your site, one way or another, so you can pick up some kudos if you help them on their way by pointing them to some good authorities. If you’re wary about linking to direct competitors, then look for information resources, such as industry blogs or news sites, or anyone else you want to build a relationship with. Link to suppliers and related companies in close, but non-competing niches. Link to authoritative sites. Be very wary about pointing to low value sites, or sites that are part of link schemes. Low value sites are obvious. Sites that are part of link schemes are harder to spot, but typically feature link swapping schemes or obvious paid links unlikely to be read by visitors. Avoid link trading schemes. It’s too easy to be seen as a part of a link network, and it’s no longer 2002.

Further Resources:

- Five Reasons You Should Link Out
- The Domino Effects Of Links And Relationships
- Link Building 101: Utilizing Past Relationships

9. Ongoing

It’s not set and forget.

Clients can’t do a one-off optimisation campaign and expect it to keep working forever. It may be self-serving for SEOs to say it, but it’s also the truth. SEO is ongoing because search keeps changing, and competitors and markets move. Few companies would dream of only having one marketing campaign. The challenge for the SEO, like any marketer, is to prove the ongoing spend produces a return in value.

  • Competition monitoring i.e. scan for changes in competitors’ rank, new competitors, and changes of tactics. Determine what is working, and emulate it.
  • Sector monitoring - monitor Google trends, keywords trends, discussion groups, and news releases. This will give you ideas for new campaign angles.
  • Reporting - the client needs to be able to see the work you’ve done is paying off.
  • Availability - clients will change things on their site, or bring in other marketers, so will want your advice going forward

Further Reading

Whole books can be written about SEO for clients. And they have. We've skimmed across the surface but, thankfully, there is a wealth of great information out there on the specifics of how to tackle each of these topic areas.

Perhaps you can weigh in? :) What would your advice be to those new to optimizing client sites? What do you wish someone had told you when you started?


Have We Reached Peak Advertising?

Jun 19th

The internet runs on advertising. Google is funded almost entirely by advertising. Facebook, likewise. Digital marketing spend continues to rise:

Internet advertising revenues in the United States totaled $12.1 billion in the fourth quarter of 2013, an increase of 14% from the 2013 third-quarter total of $10.6 billion and an increase of 17% from the 2012 fourth-quarter total of $10.3 billion. 2013 full year internet advertising revenues totaled $42.78 billion, up 17% from the $36.57 billion reported in 2012.

Search advertising spend comes out on top, but that’s starting to change:

Search accounted for 41% of Q4 2013 revenues, down from 44% in Q4 2012, as mobile devices have shifted search-related revenues away from the desktop computer. Search revenues totaled $5.0 billion in Q4 2013, up 10% from Q4 2012, when Search totaled $4.6 billion

The growth area for digital advertising lies in mobile:

Mobile revenues totaled 19% of Q4 2013 revenues, or $2.3 billion, up 92% from the $1.2 billion (11% of total) reported in Q4 2012

Prominent venture capitalist Mary Meeker recently produced an analysis that also highlights this trend.

So, internet advertising is growing, but desktop internet adoption is slowing down. Meanwhile, mobile and tablet adoption is increasing fast, yet advertising spend on those mediums is comparatively low. That’s a nice opportunity for mobile. However, mobile advertising is proving hard to crack. Not many people are clicking on paid links on mobile. And many mobile ad clicks are accidental, driving down advertiser bids.

This is not just a problem for mobile. There may be a problem with advertising in general. It’s about trust, and lack thereof. This situation also presents a great opportunity for selling SEO.

But first, a little background....

People Know More

Advertising’s golden age was in the ’50s and ’60s.

Most consumers were information poor. At least, they were information poor when it came to getting timely information. This information asymmetry played into the hands of the advertising industry. The advertising agency provided the information that helped match the problems people had with a solution. Of course, they were framing the problem in a way that benefited the advertiser. If there wasn’t a problem, they made one up.

Today, the internet puts real time information about everything in the hands of the consumer. It is easy for people to compare offers, so the basis for advertising - which is essentially biased information provision - is being eroded. Most people see advertising as an intrusion. Just because an advertiser can get in front of a consumer at “the right time” does not necessarily mean people will buy what the advertiser has to offer with great frequency.

Your mobile phone pings. “You’re passing Gordon’s Steak House… come in and enjoy our Mega Feast!” You can compare that offer against a wide range of other offers, and you can do so in real time. More than likely, you’ll just resent the intrusion. After all, you may be a happy regular at Susan’s Sushi.

“Knowing things” is not exclusive. Being able to “know things” is a click away. If information is freely available, then people are less likely to opt for whatever is pushed at them by advertisers at that moment. If it’s easy to research, people will do so.

This raises a problem when it comes to the economics of content creation. If advertising becomes less effective, then advertisers are going to reduce spend, or shift it elsewhere. If they do, then what becomes of the predominant web content model, which is based on advertising?

Free Content Driven By Ads May Be An Unsustainable Model

We’re seeing it in broadcast television, and we’ll see it on the web.

Television is dying and being replaced by the Netflix model. There is a lot of content. There are not enough advertisers paying top dollar as the audience is now highly fragmented. As a result, a lot of broadcast television advertising can be ineffective. However, as we’ve seen with Netflix and Spotify, people are prepared to pay directly for the content they consume in the form of a monthly fee.

The long term trend for advertising engagement on the web is not favourable.

The very first banner advertisement appeared in 1994. The clickthru rate of that banner ad was a staggering 44%. It had novelty value, certainly. The first banner ad also existed in an environment where there wasn’t much information. The web was almost entirely about navigation.

Today, there is no shortage of content. The average Facebook advertisement clickthrough rate is around 0.04%. Advertisers get rather excited if they manage to squeeze 2% or 3% click-thru rates out of Facebook ads.

Digital advertising is no longer novel, so the click-thru rate has plummeted. Not only do people feel that the advertising isn’t relevant to them, they have learned to ignore advertising even if the ad is talking directly to their needs. 97-98% of the time, people will not click on the ad.

And why should they? Information isn’t hard to come by. So what is the advertiser providing the prospective customer?

Even brand engagement is plummeting on Facebook as the novelty wears off, and Facebook changes policy:

According to a new report from Simply Measured, the total engagement for the top 10 most-followed brands on Facebook has declined 40 percent year-over-year—even as brands have increased the amount of content they’re posting by 20.1 percent.

Is Advertising Already Failing?

Our industry runs on advertising. Much of web publishing runs on advertising.

However, Eric Clemons makes the point that the traditional method of advertising was always bound to fail, mainly because after the novelty wears off, it’s all about interruption, and nobody likes to be interrupted.

But wait! Isn’t the advantage of search that it isn’t interruption advertising? In search, the user requests something. Clemons feels that search results can still be a form of misdirection:

Misdirection, or sending customers to web locations other than the ones for which they are searching. This is Google’s business model. Monetization of misdirection frequently takes the form of charging companies for keywords and threatening to divert their customers to a competitor if they fail to pay adequately for keywords that the customer is likely to use in searches for the companies’ products; that is, misdirection works best when it is threatened rather than actually imposed, and when companies actually do pay the fees demanded for their keywords. Misdirection most frequently takes the form of diverting customers to companies that they do not wish to find, simply because the customer’s preferred company underbid.

He who pays becomes “relevant”:

it is not scalable; it is not possible for every website to earn its revenue from sponsored search and ultimately at least some of them will need to find an alternative revenue model.

The companies that appear high on PPC are the companies who pay. Not every company can be on top, because not every company can pay top dollar. So, what the user sees is not necessarily what the user wants, but the company that has paid the most - along with their quality score - to be there.
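
In the simplified model Google has described publicly, ad position is auctioned on Ad Rank, roughly the bid multiplied by Quality Score. A quick worked sketch of why a better ad can outrank a bigger budget, up to a point (figures invented):

```python
def ad_rank(max_cpc_bid, quality_score):
    # Simplified classic model: Ad Rank = bid x Quality Score
    return max_cpc_bid * quality_score

# Advertiser A: smaller bid, better ad; Advertiser B: deeper pockets
print(ad_rank(2.00, 8))  # 16.0
print(ad_rank(3.00, 4))  # 12.0 -> A outranks B despite bidding less
```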

But nowadays, the metrics of this channel have changed dramatically, making it impossible or nearly impossible for small and mid-sized business to turn a profit using AdWords. In fact, most small businesses can’t break even using AdWords. This goes for many large businesses as well, but they don’t care. And that is the key difference, and precisely why small brands using AdWords nowadays are being bludgeoned out of existence

Similarly, the organic search results are often dominated by large companies and entities. This is a direct or side-effect of the algorithms. Big entities create a favourable footprint of awareness, engagement and links as a result of PR, existing momentum, brand recognition, and advertising campaigns. It’s a lot harder for small companies to dominate lucrative competitive niches as they can’t create those same footprints.

Certainly when it comes to PPC, the search visitor may be presented with various big-player links at the expense of smaller players. Google, like every other advertising-driven medium, is beholden to its big advertisers. Jakob Nielsen noted in 1998:

Ultimately, those who pay for something control it. Currently, most websites that don't sell things are funded by advertising. Thus, they will be controlled by advertisers and will become less and less useful to the users”

If Interruption Advertising Is Failing, Is Advertising Scalable?

Being informed has changed customer behaviour.

The problem is not the medium, the problem is the message, and the fact that it is not trusted, not wanted, and not needed.

  • People don’t trust ads. There is a vast literature to support this. Is it all wrong?
  • People don’t want ads. Again, there is a vast literature to support this. Think about your own behavior, your own channel surfing and fast forwarding, and the timing of when you leave the TV to get a snack. Is it during the content or the commercials?
  • People don’t need ads. There is a vast amount of trusted content on the net. Again, there is literature on this. But think about how you form your opinion of a product: from online ads or online reviews?
  • There is no shortage of places to put ads. Competition among them will be brutal. Prices will be driven lower and lower, for everyone but Google.

If the advertising is not scalable, then a lot of content based on advertising will die. Advertising may not be able to support the net:

Now reality is reasserting itself once more, with familiar results. The number of companies that can be sustained by revenues from internet advertising turns out to be much smaller than many people thought, and Silicon Valley seems to be entering another “nuclear winter”

A lot of AdSense publishers are being kicked from the program. Many are terminated without being given a reason. Google appear to be systematically culling the publisher herd. Why? Shouldn’t web publishing, supported by advertising, be growing?

The continuing plunge in AdSense is in sharp contrast to robust 20% revenue growth in 2012, which outpaced AdWords' growth of 19%.....There are serious issues with online advertising affecting the entire industry. Google has reported declining value from clicks on its ads. And the shift to mobile ads is accelerating the decline, because it produces a fraction of the revenue of desktop ads.
Matt Sanchez, CEO of San Francisco based ad network Say Media, recently warned that, "Mobile Is Killing Media."
Digital publishing is headed off a cliff … There's a five fold gap between mobile revenue and desktop revenue… What makes that gap even starker is how quickly it’s happening… On the industry’s current course, that’s a recipe for disaster.

Prices tumble when consumers have near-perfect real time information. Travel. Consumer goods. Anything generic that can be readily compared is experiencing falling prices and shrinking margins. Sales growth in many consumer categories is coming from the premium offerings. For example, beer consumption is falling across the board except in one area: boutique, specialist brews. That market sector is growing as customers become a lot more aware of options that are not just good enough, but great. Boutique breweries offer a more personal relationship, and they offer something the customer perceives as being great, not just “good enough”.

Mass marketing is expensive. Most of the money spent on it is wasted. Products and services that are “just good enough” will be beaten by products and services that are a precise fit for consumers’ needs. Good enough is no longer good enough; products and services need to be great and precisely targeted, unless you’ve got advertising money to burn.

How Do We Get To These Consumers If They No Longer Trust Paid Advertising?

Consumers will go to information suppliers they trust. There is always demand for a trusted source.

Trip Advisor is a great travel sales channel. It’s a high trust layer over a commodity product. People don’t trust Trip Advisor, per se; they trust the process. Customers talk to each other about the merits, or otherwise, of holiday destinations. It’s transparent. It’s not interruption, misleading or distracting. Consumers seek it out.

Trust models will be one way around the advertising problem. This suits SEOs. If you provide trusted information, especially in a transparent, high-trust form, like Trip Advisor, you will likely win out over those using more direct sales methods. Consumers are getting a lot better at tuning those out.

The trick is to remove the negative experience of advertising by not appearing to be advertising at all. Long term, it’s about developing relationships built on trust, not on interruption and misdirection. It’s a good idea to think about advertising as a relationship process, as opposed to the direct marketing model on which the web is built - which is all about capturing the customer just before point of sale.

Rand Fishkin explained the web purchase process well in this presentation. The process whereby someone becomes a customer, particularly on the web, isn’t all about the late stages of the transaction. We have to think of it in terms of a slow burning relationship developed over time. The consumer comes to us at the end of an information comparison process. Really, it’s an exercise in establishing consumer trust.

Amazon doesn’t rely on advertising. Amazon is a trusted destination. If someone wants to buy something, they often just go direct to Amazon. Amazon’s strategy involves what it calls “the flywheel”, whereby the more things people buy from Amazon, the more they’ll buy from Amazon in future. Amazon builds up a relationship rather than relying on a lot of advertising. Amazon cuts out the middle man and sells direct to customers.

Going viral with content, like Buzzfeed, may be one answer, but it’s likely temporary. It, too, suffers from a trust problem and the novelty will wear off:

Saying “I’m going to make this ad go viral” ignores the fact that the vast majority of viral content is ridiculously stupid. The second strategy, then, is the high-volume approach, same as it ever was. When communications systems wither, more and more of what’s left is the advertising dust. Junk mail at your house, in your email; crappy banner ads on MySpace. Platforms make advertising cheaper and cheaper in a scramble to make up revenue through volume.

It’s not just about supplying content. It could be said newspapers are suffering because bundled news is just another form of interruption and misdirection, mainly because it isn't specifically targeted:

Following The New York Times on Twitter is just like paging through a print newspaper. Each tweet is about something completely unrelated to the tweets before it. And this is the opposite of why people usually follow people and brands online. It's not surprising that The New York Times have a huge problem with engagement. They have nothing that people can connect and engage with.

Eventually, the social networks will likely suffer from a trust problem, if they don’t already. Their reliance on advertising makes them spies. There is a growing awareness of data privacy and users are unlikely to tolerate invasions of privacy, especially if they are offered an alternative. Or perhaps the answer is to give users a cut themselves. Lady Gaga might be onto something.

Friends “selling” (recommending) to friends is a high trust environment.

A Good Approach To SEO Involves Building Consumer Trust

The SERP is low trust. PPC is low trust. A search keyword leading to a site littered with ads is low trust. So, one good long term strategy is to move from low trust to high trust advertising.

A high trust environment doesn’t really look like advertising. It could be characterised as a transparent platform. Amazon and Trip Advisor are good examples. They are honest about what they are, and they provide the good along with the bad. It could be something like Wikipedia. Or an advisory site. There are many examples, but it's fair to say we know it when we see it.

A search on a keyword that finds a specific, relevant site that isn’t an obvious advertisement is high trust. The first visit is the start of a relationship. This is not the time to bombard visitors with your needs. Instead, give the visitor something they can trust. Trip Advisor even spells it out: "Find hotels travelers trust".

Tesla understands the trust relationship. Recently, they’ve made their patents open-source, which, apart from anything else, is a great form of reputation marketing. It’s clear Tesla is more interested in long term relationships and goodwill than pushing their latest model on you at a special price. Their transparency is endearing.

First, you earn trust. Then you sell them something later. If you don’t earn their trust, then you’re just like any other advertiser. People will compare you. People will seek out information. You’re one of many options, unless you have formed a prior relationship. SEO is a brilliant channel to develop a relationship based on trust. If you're selling SEO to clients, think about discussing the trust building potential - and value proposition - of SEO with them.

It's a nice side benefit of SEO. And it's a hedge against the problems associated with other forms of advertising.

The Rigged Search Game

Jun 5th
posted in

SEO was all about being clever. Still is, really. However, SEO used to reward the clever, too. The little guy could take on the big guys and munch their lunch by outsmarting them.

It was such an appealing idea.

The promise of the internet was that the old power structures would be swept aside, the playing field would be made level again, and those who played the smartest game would prosper.

Sadly, this promise didn’t last long.

Power

The names may have changed, but traditional power structures were soon reasserted. The old gatekeepers were replaced with the new gatekeepers. The new gatekeepers, like Google, grew fat, rich and powerful. They controlled the game and the game was, once again, rigged in favor of those with the most power. That's not a Google-specific criticism, it's just the way commerce works. You get big, you move markets simply by being big and present. In search, we see the power imbalance as a side-effect, namely the way big players are treated in the SERPs compared to small players.

SEO for big, established companies, in terms of strategy, is simple. Make sure the site is crawlable. Run PR campaigns that frequently mention the name of the big, established company - which PR campaigns do anyway - and ensure those mentions include a backlink. Talk to a lot of friendly reporters. Publish content, do so often, and make sure the important content is somewhere near the top.

That’s it.

The market reputation of the entity does most of the grunt work when it comes to ranking. So long as their ship is pointed in the right direction, they’re golden.

The main aim of the SEO who works for a big, established entity is to stop the big, established entity doing something stupid. So long as the SEO can prevent the entity doing stupid things - often a difficult task, granted - the big, established entity will likely dominate their niche simply by virtue of established market power.

That didn’t used to be the case.

When SEO started, and for a number of years after, the little guy could dominate niches by being the most relevant. The little guy could become the most relevant by carefully deconstructing the algorithm and giving the search engine what the search engine wanted. If the search engines weren’t careful, they were in very real danger of getting exactly what they asked for!

That temporary inversion of the traditional power structure made SEO a lot of fun. You did some clever stuff. You rose to the top. You collected the rewards. I think it’s grown less fun now because being clever isn’t enough. SEO works, but not quite as well as it used to for small players as the cost/reward equation favors big players.

Do A Lot Of Clever SEO Stuff, Get Nowhere

These days, a glass ceiling exists. SEOBook members can read a detailed post by Aaron outlining the glass ceiling here.

Here’s how it often plays out...

About 8 months ago we launched one of the most viral pieces of content that we have ever done (particularly for a small site that doesn't have a huge following) ... it was done so well that it was organically referenced/hardcoded into Wikipedia. In addition it was cited on news sites, dozens and dozens of blogs (likely north of 100), a number of colleges, etc. It got like a couple hundred unique linking domains ... which effectively doubled the unique linking domains that linked into the parent site. What impact did that have on rankings? Nada.

For link building to work well, the right signals need to exist, too. There need to be high levels of reach and engagement. Big companies tend to have high levels of reach and engagement due to their market position and wider PR and advertising campaigns. This creates search keyword volume, keyword associations, engagement, and frequent mentions in important places, and all of this is difficult to compete with if you have a small budget. The exception appears to be in relatively new niches, and in the regions, where the underlying data concerning engagement, reach and interest is unlikely to be particularly deep and rich.

Yet.

So, the little guy is often fighting a losing battle when it comes to search. Even if they choose a new, fast growing niche, as soon as that niche becomes lucrative enough to attract big players, traditional power will reassert itself. The only long term option for the little guy is to become a big guy, or get bought up by one, or go work for one.

Slavery

Abraham Lincoln thought wage labour was a stage workers pass through, typically in their 20s or early 30s. Eventually, they become self-employed and keep all the profits of their labour.

Adam Smith maintained markets only work as intended if everyone had enough to participate. They also must have sufficient control over their own means of production. Adam Smith, father of modern capitalism, was not a big fan of corporate capitalism:

Merchants and master manufacturers are . . . the two classes of people who commonly employ the largest capitals, and who by their wealth draw to themselves the greatest share of the public consideration.

A side-effect of big players is they can distort markets. They have more purchasing power and that purchasing power sends a signal about what’s important. To big companies. The result is less diversity.

It’s self evident that power changes the search game. The search results become more about whoever is the most powerful. It seems ironic that Google started as an upstart outsider. The search results are difficult to conquer if you’re an upstart outsider, but pretty easy to do if you’re already a major player. Adwords, quality score being equal, favours those with deep pockets.

What’s happening is the little guy is getting squeezed out of this landscape and many of them will become slaves.

Huh? “Slaves”? By Aristotle's and Lincoln's definition, quite possibly:

If we want to have markets, we have to give everybody an equal chance to get into them, or else they don’t work as a means of social liberation; they operate as a means of enslavement.

Enslavement in the sense that the people with enough power, who can get the market to work on their behalf…

Right — bribing politicians to set up the system so that they accumulate more, and other people end up spending all their time working for them. The difference between selling yourself into slavery and renting yourself into slavery in the ancient world was basically none at all, you know. If Aristotle were here, he’d think most people in a country like England or America were slaves.

What’s happening in search is a microcosm of what is happening elsewhere in society. Markets are dominated and distorted by corporations at the direct expense of the small players. Yes, it’s nothing new, but it hasn’t always been this way in search.

So what is my point?

My point is that if you’re not getting the same business benefits from search as you used to, and the game seems that much harder, then it’s not because you’re not clever. It’s because the game is rigged.

Of course, small companies can prosper. You’ll find many examples of them in the SERPs. But their path to getting there via the search channel is now much longer and doesn't pay as well as it used to. This means fewer SEOs will be hired by small companies because the cost of effective SEO is rising fast whilst the rewards are shrinking. Meanwhile, the big companies are increasing their digital budgets.

Knowing all this, the small operator can change their approach. The small operator has one advantage. They can be nimble, flexible, and change direction quickly. So, looking forward over the not-too-distant horizon, we either need a plan to take advantage of fast emerging markets before the big guys enter them, or we need a plan to scale, or we need to fight differently, such as taking brand/USP centric approaches.

Or go work for one of the big guys.

As a "slave". Dude :) (Just kidding)

What’s In A Name?

Jun 2nd
posted in

Many SEOs love keyword-loaded domain names. The theory is that domains featuring a keyword will enjoy a boost in ranking. It’s still a contentious topic:

I've seen bloggers, webmasters and search aficionados argue the case around the death of EMDs time and time again, despite the evidence staring them in the face: EMDs are still all over the place. What's more, do a simple bulk backlink analysis via Majestic, and you will find tons which rank in the top 10 while surrounded by far more authoritative domains.

Whatever the truth about the ranking value of EMDs, most would agree that finding the right language for describing and profiling our business is important.

Terminology Changes

Consider the term “startup”.

This term, which describes a new small business, feels like it has been around forever. Not so. Conduct a search on the time period 1995-1998 and you won’t find results for “startup”:

It's a word that has grown up with the web and sounds sexier than just business. Just like the word "consultant" or "boutique" sounds better than "mom and pop" or "1 person business". (You must remember of course when "sanitation engineer" replaced "trash man".) I just did a search to see the use of the word startup from the period 1995 to 1998 and came up with zilch in terms of relation to business

“Startup” does sound sexier than “mom n pop”, “one person business”, or “a few stoner mates avoiding getting a job”. A pitch to a VC that described the business as a “mom n pop” may not be taken seriously, whereas calling it a startup will.

If we want to be taken seriously by our audience, then finding the audience's language is important.

SEO Or Digital Marketing Or...?

Has SEO become a dirty word? Has it always been a dirty word?

SEOs don’t tend to see it that way, even if they are aware of the negative connotations. They see SEO as a description of what they do. It’s always been a bit of a misnomer, as we don’t optimize search engines, but for whatever reason, it stuck.

The term SEO is often associated with spam. The ever-amiable Matt Cutts videos could be accompanied by a stern, animated wagging finger and a "tut tut tut" subtext. The search engines frown on a lot when it comes to SEO. SEO is permanent frown territory. Contrast this with PPC. PPC does not have that negative connotation. There is no reputation issue in saying you’re a PPC provider.

Over the years, the propaganda exercise that produced the "SEO questionable/PPC credible" narrative has been pretty effective. The spammer label, borrowed from the world of email spam, has not been a term SEOs have managed to shrug off. The search engines have even managed to get SEOs to use the term “spammer” as a point of differentiation. “Spam is what the other SEOs do. Not me, of course.” This just goes to show how effective the propaganda has been. Once SEOs used spam to describe their own industry, the fate of the term SEO was sealed. After all, you seldom hear doctors, lawyers and retailers defining what they do against the bad actors in their sector.

As traffic acquisition gets broader, encompassing PR and social media, new titles like Digital Marketer have emerged. These terms have the advantage of not being weighed down by historical baggage. I’m not suggesting people should name themselves one thing or the other. Rather, consider these terms in a strategic sense. What terms best describe who you are and what you do, and cast you in the best possible light to those you wish to serve, at this point in time?

The language moves.

Generic Name Or Brandable?

Keyword loaded names, like business.com, are both valuable and costly. The downside of such names, besides being costly, is they severely limit branding opportunities. The better search engines get, and the more people use social media and other referral channels, the less these generic names will matter.

What matters most in crowded markets is being memorable.

A memorable, unique name is a valuable search commodity. If that name is always associated with you and no one else, then you’ll always be found in the search results. SEMRush, MajesticSEO, and Moz are unlikely to be confused with other companies. “Search Engine Tools”, not so much.

Will the generic name become less valuable because generic names are perhaps only useful at the start of an industry? How mature is your industry? How can you best get differentiation in a crowded market through language alone?

The Strategy Behind Naming

Here are a few points to consider.

1. Start Early

Names are often an afterthought. People construct business plans. They think about how their website looks. They think about their target market. They don’t yet have a name. Try starting with a name and designing everything else around it. The name can set the tone of every other decision you make.

2. Positioning

In mature markets, differentiation is strategically important. Is your proposed name similar to your competitors’ names? Is it unique enough? If you’re in at the start of a new industry, would a generic, keyword-loaded name work best? Is it time for a name change because you’ve got lost in the crowd? Has your business focus changed?

Does your name go beyond mere description and create an emotional connection with your audience? Names that take on their own meaning, like Amazon, are more likely to grow with the business, rather than have the business outgrow the name. Imagine if Amazon.com had called itself Books.com.

3. What Are You All About?

Are you a high-touch consultative company? Or a product based, functional company? Are you on the cutting edge? Or are you catering to a market that likes things just the way they are?

Writing down a short paragraph about how you see yourself, how the customers see you, and your position in the market, will help you come up with suitable names. Better yet, write a story.

4. Descriptive Vs Differentiation

Descriptive can be safe. “Internet Search Engine” or “Web Crawler”. There’s no confusing what those businesses do. Compare them with the name Google. Google gives you no idea what the company does, but it’s more iconic, quirky and memorable. There’s no doubt it has grown with the company and become a natural part of their identity in ways that “Internet Search Engine” never could.

Sometimes, mixing descriptions to create something quirky works well. Airbnb is a good example. The juxtaposition of those two words creates something new, whilst at the same time having a ring of the familiar. It’s also nice to know if the domain name is available, and if the name can be trademarked. The more generic the name, the harder it is to trademark, and the less likely the domain name is available.

5. Does Your Name Travel Well?

Hopefully, your name isn’t a swear word in another culture, nor does it have negative connotations. Here are a few comical examples where it went wrong:

Nokia’s new smartphone translates in Spanish slang to prostitute, which is unfortunate, but at least the cell phone giant is in good company. The name of international car manufacturer Peugeot translates in southern China to Biao zhi, which means the same thing.

This is not such an issue if your market is local, but if you plan to expand into other markets in future, then it pays to consider this angle.

6. There’s No Right Answer

There is probably no universally good name. At least, when you first come up with a name, you can be assured some people will hate it, some will be indifferent, and some will like it - no matter what name you choose.

This is why it’s important to ground the subjective name-choosing process in something concrete, like your business strategy, or positioning in the market. Your name could have come before the business plan. Or it could reflect it. You then test your name with people who will likely buy your product or service. It doesn’t matter what your Mom or your friends think of the name; it’s what you think of the name and what your potential customers think of it that counts.

7. Diluting Your Name

Does each service line and product in your company need a distinctive name? Maybe, but the risk is that it could dilute the brand. Consider Virgin. They put the exact same name on completely different service lines. That same brand name carries the values and spirit of Virgin to whatever new enterprise they undertake. This also reduces the potential for customer confusion.

Creating a different name for some of your offerings might be a good idea. Say you’re predominantly a service-based company, yet you also have one product that you may spin off at some point in future. You may want to clearly differentiate the product from the service so as not to dilute the focus of the service side. Again, this is where strategy comes in. If you’re clear about what your company does, and your position in the market, then it becomes easier to decide how to name new aspects of your business. Or whether you should give them a name at all.

8. Is Your Name Still Relevant?

Brands evolve. They can appear outdated if the market moves on. On the other hand, they can build equity through longevity. It seems especially difficult to change internet company names, as the inbound linking might be compromised as a result. Transferring the equity of a brand is typically expensive and difficult. All the more reason to place sufficient importance on naming to begin with.

9. More Than A Name

The branding process is more than just a name and identity. It's the language of your company. It’s the language of your customers. It becomes a keyword on which people search. Your customers have got to remember it. You, and your employees, need to be proud of it. It sets you apart.

The language is important. And strategic.

Please Remove My Link. Or Else.

May 23rd
posted in

Getting links removed is a tedious business.

It’s just as tedious for the site owner who must remove the links. Google’s annoying practice of "suggesting" webmasters jump through hoops in order to physically remove links that the webmaster suspects are bad, rather than Google simply ignoring the links that they’ve internally flagged, is causing frustration.

Is it a punitive punishment? If so, it’s doing nothing to endear Google to webmasters. Is it a smokescreen? i.e. they don't know which links are bad, but by having webmasters declare them, this helps Google build up a more comprehensive database? Bit of both? It might also be adding costs to SEO in order to put SEO out of reach of small companies. Perhaps it’s a red herring to make people think links are more important than they actually are.

Hard to be sure.

Collateral Damage

SEOs are accustomed to search engines being coy, punitive and oblique. SEOs accept it as part of the game. However, it becomes rather interesting when webmasters who are not connected to SEO get caught up in the collateral damage:

I received an interesting email the other day from a company we linked to from one of our websites. In short, the email was a request to remove links from our site to their site. We linked to this company on our own accord, with no prior solicitation, because we felt it would be useful to our site visitors, which is generally why people link to things on the Internet.

And check out the subsequent discussion on Hacker News. Matt Cutts’ first post is somewhat disingenuous:

Situation #1 is by far the most common. If a site gets dinged for linkspam and works to clean up their links, a lot of them send out a bunch of link removal requests on their own prerogative

Webmasters who receive the notification are encouraged by Google to clean up their backlinks, because if they don’t, then their rankings suffer.

But, essentially from our point of view when it comes to unnatural links to your website we want to see that you’ve taken significant steps to actually remove it from the web but if there are some links that you can’t remove yourself or there are some that require payment to be removed then having those in the disavow file is fine as well.

(Emphasis mine)

So, of course webmasters who have received a notification from Google are going to contact websites to get links removed. Google have stated they want to see that the webmaster has gone to considerable effort to remove them, rather than simply use the disavow tool.

The inevitable result is that a webmaster who links to anyone who has received a bad links notification may receive the latest form of email spam, known as the “please remove my link” email. For some webmasters, this email has become more common than the “someone has left you millions in a Nigerian bank account” gambit, and is just as persistent and annoying.

From The Webmaster’s Perspective

Webmasters could justifiably add the phrase “please remove my link” and the word "disavow" to their spam filters.

Let’s assume this webmaster isn’t in a bad neighbourhood and is simply caught in the cross-fire. The SEO assumes, perhaps incorrectly, the link is bad and requests a take-down. From the webmaster’s perspective, they incur a time cost dealing with the link removal requests. A lone request might take a few minutes to physically remove - but hang on a minute - how does the webmaster know this request is coming from the site owner and not from some dishonest competitor? Ownership takes time to verify. And why would the webmaster want to take down this link, anyway? Presumably, they put it up because they deemed it useful to their audience. Or perhaps some bot put the link there - as a forum or blog comment link - against the webmaster’s wishes, and now, to add insult to injury, the SEO wants the webmaster to spend his time taking it down!

Even so, this might be okay if it’s only one link. It doesn't take long to remove. But, for webmasters who own large sites, it quickly becomes a chore. For large sites with thousands of outbound links built up over years, removal requests can pile up. That’s when the spam filter kicks in.

Then come the veiled threats. “Thanks for linking to us. This is no reflection on you, but if you don’t remove my link I’ll be forced to disavow you and your site will look bad in Google. I don’t want to do this, but I may have to.”

What a guy.

How does the webmaster know the SEO won’t do that anyway? Isn’t that exactly what some SEO conference speakers have been telling other SEOs to do regardless of whether the webmaster takes the link down or not?

So, for a webmaster caught in the cross-fire, there’s not much incentive to remove links, especially if s/he's read Matt's suggestion:

higherpurpose, nowhere in the original article did it say that Google said the link was bad. This was a request from a random site (we don't know which one, since the post dropped that detail), and the op can certainly ignore the link removal request.

In some cases Google does specify links:

We’ve reviewed the links to your site and we still believe that some of them are outside our quality guidelines.

Sample URLs:
ask.metafilter.com/194610/get-me-and-my-stuff-from-point-a-to-point-b-possibly-via-point-c

Please correct or remove all inorganic links, not limited to the samples provided above. This may involve contacting webmasters of the sites with the inorganic links on them.

And they make errors when they specify those links. They've flagged DMOZ & other similar links: "Every time I investigate these “unnatural link” claims, I find a comment by a longtime member of MetaFilter in good standing trying to help someone out, usually trying to identify something on Ask MetaFilter."

Changing Behaviour

Then the webmaster starts thinking.

"Hmmm...maybe linking out will hurt me! Google might penalize me or, even worse, I’ll get flooded with more and more “please remove my link” spam in future."

So what happens?

The webmaster becomes very wary about linking out. David Naylor mentioned an increasing number of sites adopting a "no linking" policy. Perhaps the webmaster no-follows everything as a precaution. Far from being the life-giving veins of the web, links are seen as potentially malignant. If all outbound links are made no-follow, perhaps the chance of being banned and flooded with “please remove my link” spam is reduced. Then again, even nofollowed links are getting removal requests.

As more webmasters start to see links as problematic, fewer legitimate sites receive links. Meanwhile, the blackhat, who sees their sites occasionally getting burned as a cost of doing business, will likely see their site rise as they’ll be the sites getting all the links, served up from their curated link networks.

A commenter notes:

The Google webspam team seems to prefer psychology over technology to solve the problem, especially recently. Nearly everything that's come out of Matt Cutts' mouth in the last 18 months or so has been a scare tactic.
IMO all this does is further encourage the development of "churn and burn" websites from blackhats who build being penalized into their business plan. So why should I risk all the time and effort it takes to generate quality web content when it could all come crashing down because an imperfect and overzealous algorithm thinks it's spam? Or worse, some intern or non-google employee doing a manual review wrongly decides the site violates webmaster guidelines?

And what’s the point of providing great content when some competitor can just take you out with a dedicated negative SEO campaign, or if Google hits you with a false positive? If most of your traffic comes from Google, then the risk of the web publishing model increases.

Like MetaFilter:

Is Google broken? Or is your site broken? That’s the question any webmaster asks when she sees her Google click-throughs drop dramatically. It’s a question that Matt Haughey, founder of legendary Internet forum MetaFilter, has been asking himself for the last year and a half, as declining ad revenues have forced the long-running site to lay off several of its staff.

Then again, Google may just not want what MetaFilter has to offer anymore.

(Un)intended Consequences

Could this be uncompetitive practice from Google? Are the sites getting hit with penalties predominantly commercial sites? It would be interesting to see how many of them are non-commercial. If so, is it a way to encourage commercial sites to use Adwords as it becomes harder and harder to get a link by organic means? If all it did was raise the cost of doing SEO, it would still be doing its job.

I have no idea, but you could see why people might ask that question.

Let’s say it’s benevolent and Google is simply working towards better results. The unintended consequence is that webmasters will think twice about linking out. And if that happens, then their linking behaviour will start to become more exclusive. When links become harder to get and more problematic, then PPC and social media are going to look that much more attractive.

What’s Wrong With A/B Testing

Apr 22nd

A/B testing is an internet marketing standard. In order to optimize response rates, you compare one page against another. You run with the page that gives you the best response rates.

But anyone who has tried A/B testing will know that whilst it sounds simple in concept, it can be problematic in execution. For example, it can be difficult to determine if what you’re seeing is a tangible difference in customer behaviour or simply a result of chance. Is A/B testing an appropriate choice in all cases? Or is it best suited to specific applications? Does A/B testing obscure what customers really want?

In this article, we’ll look at some of the gotchas for those new to A/B testing.

1. Insufficient Sample Size

You set up a test. You’ve got one page featuring call to action A and one page featuring call to action B. You enable your PPC campaign and leave it running for a day.

When you stop the test, you find call-to-action A converted at twice the rate of call-to-action B. So call-to-action A is the winner, and we should run with it and eliminate option B.

But this would be a mistake.

The sample size may be insufficient. If we only tested one hundred clicks, we might see a large difference in results between the two pages, yet that difference may not hold once we get to 1,000 clicks. In fact, the result may even be reversed!

So, how do we determine a sample size that is statistically significant? This excellent article explains the maths. However, there are various online sample size calculators that will do the calculations for you, including Evan Miller’s. Most A/B tracking tools will include sample size calculators, but it’s a good idea to understand what they’re calculating, and how, to ensure the accuracy of your tests.

In short, make sure you've tested enough of the audience to determine a trend.
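For illustration, here's a minimal sketch of that calculation, using the standard two-proportion sample size formula at 95% confidence and 80% power. The baseline conversion rate and the lift you hope to detect are placeholder assumptions you'd swap for your own numbers:

```python
import math

def sample_size_per_variant(p1, p2, z_alpha=1.96, z_power=0.84):
    """Visitors needed per variant; z-scores hardcoded for 95% confidence, 80% power."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Hypothetical example: 5% baseline conversion, hoping to detect a lift to 6%
print(sample_size_per_variant(0.05, 0.06))  # roughly 8,150 visitors per variant
```

Note how a one-point lift on a 5% baseline needs thousands of visitors per page, which is why the one-day, hundred-click test described above is so misleading.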

2. Collateral Damage

We might want to test a call to action metric. Say we want to test the number of people who click on the “find out more” link on a landing page. We find that a lot more people click on this link when we use the term “find out more” than when we use the term “buy now”.

Great, right?

But what if the conversion rate for those who actually make a purchase falls as a result? We achieved higher click-thrus on one landing page at the expense of actual sales.

This is why it’s important to be clear about the end goal when designing and executing tests. Also, ensure we look at the process as a whole, especially when we’re chopping the process up into bits for testing purposes. Does a change in one place affect something else further down the line?

In this example, you might A/B test the landing page whilst keeping an eye on your total customer numbers, deeming the change effective only if customer numbers also rise. If your aim was only to increase click-thru, say to boost quality scores, then the change was effective.
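Here's a minimal sketch of tracking the whole funnel per variant, so a lift in click-through can't hide a drop in sales. All the counts are hypothetical placeholders:

```python
variants = {
    "A (find out more)": {"impressions": 10000, "clicks": 900, "purchases": 18},
    "B (buy now)":       {"impressions": 10000, "clicks": 450, "purchases": 27},
}

for name, v in variants.items():
    ctr = v["clicks"] / v["impressions"]
    sales_rate = v["purchases"] / v["impressions"]  # end-to-end rate, not per click
    print(f"{name}: click-thru {ctr:.1%}, purchase rate {sales_rate:.2%}")

# Variant A wins on clicks; variant B wins on the metric that pays the bills.
```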

3. What, Not Why

In the example above, we know the “what”. We changed the wording of a call-to-action link, and we achieved higher click-thrus, although we’re still in the dark as to why. We’re also in the dark as to why the change of wording resulted in fewer sales.

Was it because we attracted more people who were information seekers? Were buyers confused about the nature of the site? Did visitors think they couldn’t buy from us? Were they price shoppers who wanted to compare price information up front?

We don’t really know.

But that’s good, so long as we keep asking questions. These types of questions lead to more ideas for A/B tests. By turning testing into an ongoing process, supported by asking more and hopefully better questions, we’re more likely to discover a whole range of “why’s”.

4. Small Might Be A Problem

If you’re a small company competing directly with big companies, you may already be on the back foot when it comes to A/B testing.

It’s clear that its very modularity can cause problems. But what about in cases where the number of tests that can be run at once is low? While A/B testing makes sense on big websites where you can run hundreds of tests per day and have hundreds of thousands of hits, only a few offers can be tested at one time in cases like direct mail. The variance that these tests reveal is often so low that any meaningful statistical analysis is impossible.

Put simply, you might not have the traffic to generate statistically significant results. There’s no easy way around this problem, but the answer may lie in getting tricky with the maths.

Experimental design massively and deliberately increases the amount of variance in direct marketing campaigns. It lets marketers project the impact of many variables by testing just a few of them. Mathematical formulas use a subset of combinations of variables to represent the complexity of all the original variables. That allows the marketing organization to more quickly adjust messages and offers and, based on the responses, to improve marketing effectiveness and the company’s overall economics

Another thing to consider is that if you’re certain the bigger company is running A/B tests, and achieving good results, then “steal” their landing page*. Take their ideas for landing pages and use that as a test against your existing pages. *Of course, you can’t really steal their landing page, but you can be "influenced by” their approach.

What your competitors do is often a good starting point for your own tests. Try taking their approach and refine it.

5. Might There Be A Better Way?

Are there alternatives to A/B testing?

Some swear by the multi-armed bandit methodology:

The multi-armed bandit problem takes its terminology from a casino. You are faced with a wall of slot machines, each with its own lever. You suspect that some slot machines pay out more frequently than others. How can you learn which machine is the best, and get the most coins in the fewest trials?
Like many techniques in machine learning, the simplest strategy is hard to beat. More complicated techniques are worth considering, but they may eke out only a few hundredths of a percentage point of performance.

Then again...

What multi-armed bandit algorithm does is that it aggressively (and greedily) optimizes for currently best performing variation, so the actual worse performing versions end up receiving very little traffic (mostly in the explorative 10% phase). This little traffic means when you try to calculate statistical significance, there’s still a lot of uncertainty whether the variation is “really” worse performing or the current worse performance is due to random chance. So, in a multi-armed bandit algorithm, it takes a lot more traffic to declare statistical significance as compared to simple randomization of A/B testing. (But, of course, in a multi-armed bandit campaign, the average conversion rate is higher).
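For the curious, here's a minimal epsilon-greedy sketch of the idea: send roughly 10% of traffic to explore, and the rest to whichever page is converting best so far. The conversion rates are purely hypothetical, there to drive the simulation:

```python
import random

true_rates = {"A": 0.05, "B": 0.06}  # unknown in real life; assumed for the demo
shows = dict.fromkeys(true_rates, 0)
wins = dict.fromkeys(true_rates, 0)

def observed_rate(k):
    return wins[k] / shows[k] if shows[k] else 0.0

def pick_variant(epsilon=0.1):
    if random.random() < epsilon:
        return random.choice(list(true_rates))  # explore: ~10% of traffic
    return max(true_rates, key=observed_rate)   # exploit: best performer so far

for _ in range(10000):
    v = pick_variant()
    shows[v] += 1
    if random.random() < true_rates[v]:         # simulate a conversion
        wins[v] += 1

for k in true_rates:
    print(k, "shown", shows[k], "times, converting at", f"{observed_rate(k):.2%}")
```

Run it and the losing arm ends up with far fewer impressions, which is exactly the statistical-significance trade-off the quote above describes.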

Multivariate testing may be suitable if you’re testing a combination of variables, as opposed to just one, e.g.:

  • Product Image: Big vs. Medium vs. Small
  • Price Text Style: Bold vs. Normal
  • Price Text Color: Blue vs. Black vs. Red

There would be 3 x 2 x 3 = 18 different versions to test.
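Enumerating those combinations is trivial with the standard library; the variable values below simply mirror the list above:

```python
from itertools import product

images = ["big", "medium", "small"]
price_styles = ["bold", "normal"]
price_colors = ["blue", "black", "red"]

versions = list(product(images, price_styles, price_colors))
print(len(versions))  # 18 versions, each needing enough traffic of its own
for image, style, color in versions[:3]:
    print(image, style, color)  # big bold blue, big bold black, big bold red
```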

The problem with multivariate tests is they can get complicated pretty quickly and require a lot of traffic to produce statistically significant results. One advantage of multivariate testing over A/B testing is that it can tell you which part of the page is most influential. Was it a graphic? A headline? A video? If you're testing a page using an A/B test, you won't know. Multivariate testing will tell you which page sections influence the conversion rate and which don’t.

6. Methodology Is Only One Part Of The Puzzle

So is A/B testing worthwhile? Are the alternatives better?

The methodology we choose will only be as good as the test design. If tests are poorly designed, then the maths, the tests, the data and the software tools won’t be much use.

To construct good tests, you should first take a high level view:

Start the test by first asking yourself a question. Something on the lines of, “Why is the engagement rate of my site lower than that of the competitors?” ... Collect information about your product from customers before setting up any big test. If you plan to test your tagline, run a quick survey among your customers asking how they would define your product.

Secondly, consider the limits of testing. Testing can be a bit of a heartless exercise. It’s cold. We can’t really test how memorable and how liked one design is over the other, and typically have to go by instinct on some questions. Sometimes, certain designs just work for our audience, and other designs don’t. How do we test if we're winning not just business, but also hearts and minds?

Does it mean we really understand our customers if they click this version over that one? We might see how they react to an offer, but that doesn’t mean we understand their desires and needs. If we’re getting click-backs most of the time, then it’s pretty clear we don’t understand the visitors. Changing a graphic here, and wording there, isn’t going to help if the underlying offer is not what potential customers want. No amount of testing ad copy will sell a pink train.

The understanding of customers is gained in part by tests, and in part by direct experience with customers and the market we’re in. Understanding comes from empathy. From asking questions. From listening to, and understanding, the answers. From knowing what’s good, and bad, about your competitors. From providing options. From open communication channels. From reassuring people. You're probably armed with this information already, and that information is highly useful when it comes to constructing effective tests.

Do you really need A/B testing? Used well, it can markedly improve and hone offers. It isn't a magic bullet. Understanding your audience is the most important thing. Google, a company that uses testing extensively, seem to be most vulnerable when it comes to areas that require a more intuitive understanding of people. Google Glass is a prime example of failing to understand social context. Apple, on the other hand, were driven more by an intuitive approach. Jobs: "We built [the Mac] for ourselves. We were the group of people who were going to judge whether it was great or not. We weren’t going to go out and do market research"

A/B testing can work wonders, just so long as it isn’t used as a substitute for understanding people.

The Positive Negative SEO Strategy

Apr 1st
posted in

There’s a case study on Moz on how to get your site back following a link penalty. An SEO working on a client’s site describes what happened when the client got hit with a link penalty. Even though the penalty didn't appear to be their fault, it still took months to get their rankings back.

Some sites aren't that lucky. Some sites don’t get their rankings back at all.

The penalty was due to a false positive. A dubious site links out to a number of credible sites in order to help disguise their true link target. The client site was one of the credible sites, mistaken by Google for a bad actor. Just goes to show how easily credible sites can get hit by negative SEO, and variations thereof.

There’s a tactic in there, of course.

Take Out Your Competitors

Tired of trying to rank better? Need a quicker way? Have we got a deal for you!

Simply build a dubious link site, point some rogue links at sites positioned above yours and wait for Google’s algorithm to do the rest. If you want to get a bit tricky, link out to other legitimate sites, too. Like Wikipedia. Google, even. This will likely confuse the algorithm for a sufficient length of time, giving your tactic time to work.

Those competitors who get hit, and who are smart enough to work out what’s going on, may report your link site, but, hey, there are plenty more link sites where that came from. Roll another one out, and repeat. So long as your link site can’t be connected with you - different PC, different IP address, etc - then what have you got to lose? Nothing much. What have your competitors got to lose? Rank, a lot of time, effort, and the very real risk they won’t get back into Google’s good books. And that’s assuming they work out why they lost rankings.

I’m not advocating this tactic, of course. But we all know it’s out there. It is being used. And the real-world example above shows how easy it is to do. One day, it might be used against you, or your clients.

Grossly unfair, but what can you do about it?

Defensive Traffic Strategy

Pleading to Google is not much of a strategy. Apart from anything else, it’s an acknowledgement that the power is not in your hands, but in the hands of an unregulated arbiter who likely views you as a bit of an annoyance. It’s no wonder SEO has become so neurotic.

It used to be the case that competitors could not take you out by pointing unwanted links at you. No longer. So even more control has been taken away from the webmaster.

The way to manage this risk is the same way risk is managed in finance. Risk can be reduced using diversification. You could invest all your money in one company, or you could split it between multiple companies, banks, bonds and other investment classes. If you’re invested in one company, and they go belly up, you lose everything. If you invest in multiple companies and investment classes, then you’re not as affected if one company gets taken out. In other words, don’t put all your eggs in one basket.

It’s the same with web traffic.

1. Multiple Traffic Streams

If you only run one site, try to ensure your traffic is balanced. Some traffic from organic search, some from PPC, some from other sites, some from advertisements, some from offline advertising, some from email lists, some from social media, and so on. If you get taken out in organic search, it won’t kill you. Alternative traffic streams buy you time to get your rankings back.
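As a rough sketch, you can sanity-check your own balance by looking at each channel's share of total visits. The numbers below are hypothetical placeholders:

```python
traffic = {
    "organic search": 5200,
    "ppc": 1400,
    "referral": 900,
    "email list": 700,
    "social media": 600,
    "direct": 1200,
}

total = sum(traffic.values())
for channel, visits in sorted(traffic.items(), key=lambda kv: -kv[1]):
    share = visits / total
    flag = "  <- risky concentration" if share > 0.5 else ""
    print(f"{channel:15} {share:5.1%}{flag}")
```

If any one channel is carrying more than half your visits, a single algorithm update can take out most of your business overnight.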

2. Multiple Pages And Sites

A “web site” is a construct. Is it a construct applicable to a web that mostly orients around individual pages? If you think in terms of pages, as opposed to a site, then it opens up more opportunities for diversification.

Pages can, of course, be located anywhere, not just on your site. These may take the form of well written, evergreen articles published on other popular sites. Take a look at the top sites in closely related niches and see if there are any opportunities to publish your content on them. Not only does this make your link graph look good, so long as it’s not overt, you’ll also achieve more diversity.

Consider Barnacle SEO.

Will creatively defines the concept of barnacle SEO as follows:
"Attaching oneself to a large fixed object and waiting for the customers to float by in the current."
Directly applied to local search, this means optimizing your profiles or business pages on a well-trusted, high-ranking directory and working to promote those profiles instead of - or in tandem with - your own website.

You could also build multiple sites. Why have just one site when you can have five? Sure, there’s more overhead, and it won’t be appropriate in all cases, but again, the multiple site strategy is making a comeback due to Google escalating the risk of having only one site. This strategy also helps get your eggs into multiple baskets.

3. Prepare For The Worst

If you've got most of your traffic coming from organic search, then you’re taking a high risk approach. You should manage that risk down with diversification strategies first. Part of the strategy for dealing with negative SEO is not to make yourself so vulnerable to it in the first place.

If you do get hit, have a plan ready to go to limit the time you’re out of the game. The cynical might suggest you have a name big enough to make Google look bad if they don’t show you.

Lyrics site Rap Genius says that it is no longer penalized within Google after taking action to correct “unnatural links” that it helped create. The site was hit with a penalty for 10 days, which meant people seeking it by name couldn’t find it.

For everyone else, here’s a pretty thorough guide about how to get back in.

Have your “plead with Google” gambit ready to go at a moment’s notice. The lead time to get back into Google can be long, so the sooner you get onto it, the better. Of course, this is really the last course of action. It’s preferable not to make yourself that vulnerable in the first place.

By diversifying.

Handling Objections From SEO Clients

Mar 18th
posted in

If Google’s current war on SEOs wasn’t bad enough for those who own the sites they work on, it is doubly bad for the SEO working for a client. When the SEO doesn’t have sufficient control over the strategy and technology, it can be difficult to get and maintain rankings.

In this post, we'll take a look at the challenges and common objections the SEO faces when working on a client site, particularly a client who is engaging an SEO for the first time. The SEO will need to fit in with developers, designers and managers who may not understand the role of SEOs. Here are common objections you can expect, and some ideas on how to counter them.

1. Forget About SEO

The objection is that SEO gets in the way. It’s too hard.

It’s true. SEO is complicated. It can often compromise design and site architecture. To managers and other web technicians, SEO can look like a dark art. Or possibly a con. There are no fixed rules as there are in, say, coding, and results are unpredictable.

So why spend time and money on SEO?

One appropriate response is “because your competitors are”.

Building a website is the equivalent of taking the starting line in a race. Some site owners think that’s all they need do. However, the real race starts after the site is built. Every other competitor has a web site, and they’re already off and running in terms of site awareness. Without SEO, visitors may find a site, but if the site owner is not using the SEO channel, and their competitors are, then their competitors have an advantage in terms of reach.

2. Can’t SEOs Do Their Thing After The Site Is Built?

SEOs can do their thing after the site is built, but it’s more difficult. As a result, it’s likely to be more expensive. Baking SEO into the mix when the site is conceived and built is an easier route.

Just as copywriters require space to display their copy, SEOs require room to manoeuvre. They’ll likely contribute to information architecture, copy, copy markup and internal linking structures. So start talking about SEO as early as possible, and particularly during information architecture.

There are three key areas where SEO needs to integrate with design. First, the requirement that text is machine readable. Search engines "think" mostly in terms of words, so topics and copy need to relate to search terms visitors may use.

Second, linking architecture and information hierarchies. If pages are buried deep in the site, but deemed important in terms of search, they will likely be elevated in the hierarchy to a position closer to the home page.

Third, crawlability. A search engine sends out a spider, which grabs the source code of your website and dumps it back in the search engine's database. The spider skips from page to page, following links. If a page doesn't have a crawlable link pointing to it, it will be invisible to search engines. There are various means of making a site easy to crawl, but one straightforward way is to use a site map, linked to from each page on the site. The SEO may also want to ensure the site navigation is crawlable.
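To make crawlability concrete, here's a minimal sketch of how a spider discovers pages: a breadth-first walk of internal links starting from the home page. Pages it never reaches are, for practical purposes, invisible. The start URL is a placeholder:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start, limit=50):
    """Breadth-first crawl of same-domain pages, the way a spider sees them."""
    domain = urlparse(start).netloc
    seen, queue = set(), [start]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue  # pages that fail to fetch stay invisible
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == domain:
                queue.append(absolute)
    return seen

print(crawl("https://example.com/"))  # everything a spider can actually reach
```

Comparing that reachable set against your full page inventory (or your site map) shows which pages have no crawlable path to them.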

3. We Don’t Want The SEO To Interfere With Code

SEOs do need to tweak code; however, the mark-up changes required are minor and largely inconsequential to the rest of the build.

SEOs need to specify title tags and some meta tags. These tags need to be unique for each page on the site, as each page is a possible entry page. A search visitor will not necessarily arrive at the home page first.

The title tag appears in search results as a clickable link, so serves a valuable marketing function. When search visitors consider which link on a search results page to click, the title tag and snippet will influence their decision. The title tag should, therefore, closely match the content of each page.

The second aspect concerns URLs. Ideally, a URL should contain descriptive words, as opposed to numbers and random letters. For example, acme.com/widgets/red-widgets.htm is good, whilst acme.com/w/12345678&tnr.php, less so.

The more often the keyword appears, the more likely it will be bolded on a search results page, and is therefore more likely to attract a click. It's also easier for the search engine to determine meaning if a URL is descriptive as opposed to cryptic.
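A minimal audit sketch along those lines, assuming the pages have already been fetched (for instance by the crawler above): flag duplicate title tags and cryptic URLs. The sample data is made up:

```python
import re
from collections import Counter

pages = {
    "https://acme.com/widgets/red-widgets.htm": "Red Widgets | Acme",
    "https://acme.com/widgets/blue-widgets.htm": "Red Widgets | Acme",  # duplicate
    "https://acme.com/w/12345678": "Widget 12345678",
}

title_counts = Counter(pages.values())
for url, title in pages.items():
    if title_counts[title] > 1:
        print(f"Duplicate title tag: {url} -> {title!r}")
    path = url.split("//", 1)[-1].split("/", 1)[-1]
    if re.search(r"\d{5,}", path) or "?" in path or "&" in path:
        print(f"Cryptic URL worth a rewrite: {url}")
```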

4. I’ve Got An SEO Plugin. That’s All I Need

SEO Plugins cover the on-site basics. But ranking well involves more than covering the basics.

In order to rank well, a page needs to have links from external sites. The higher quality those sites, the more chances your pages have of ranking well. The SEO will look to identify linking possibilities, and point these links to various internal pages on the site.

It can be difficult, near impossible, to get high quality links to brochure-style advertising pages. Links tend to be directed at pages that have unique value.

So, the type and quality of content has more to do with SEO than the way that content is marked up by a generic plugin. The content must attract links and generate engagement. The visitor needs to see a title on a search result, click through, not click back, and, preferably take some action on that page. That action may be a click deeper into the site, a bookmark, a tweet, or some other measurable form of response.

Content that lends itself to this type of interaction includes blog posts, news feeds, and content intended for social network engagement. In this way, SEO-friendly content can be functionally separated from other types of content. Not every page needs to be SEO’d, so SEO can be sectioned off, if necessary.

5. The SEO Is Just Another Technician

If your aim, or your client’s aim, is to attract as much targeted traffic as possible, then SEO integration must be taken just as seriously as design, development, copy and other media. SEO is more than a technical exercise; it’s a strategic marketing exercise, much like Public Relations.

SEO considerations may influence your choice of CMS. It may influence your strategic approach in terms of what type of information you publish. It may change the way you engage visitors. Whilst SEO can be bolted-on afterwards, this is a costly and less-effective way of doing SEO, much like re-designing a site is costly and less effective than getting it right in the planning stage.

6. Why Have Our Rankings Disappeared?

The reality of any marketing endeavour is that it will have a shelf-life. Sometimes, that shelf life is short. Other times, it can run for years.

SEO is vulnerable to the changes made by search engines. These changes aren’t advertised in advance, nor are they easily pinned down even after they have occurred. This is why SEO is strategic, just as Public Relations is strategic. The Public Relations campaign you were using a few years ago may not be the same one you use now, and the same goes for SEO.

The core of SEO hasn’t changed much. If you produce content visitors find relevant, and that content is linked to, and people engage with that content, then it has a good chance of doing well in search engines. However, the search engines constantly tweak their settings, and when they do, a lot of previous work - especially if that work was at the margins of the algorithms - can come undone.

So, ranking should never be taken for granted. The value the SEO brings is that they are across underlying changes in the way the search engines work and can adapt your strategy, and site, to the new changes.

Remember, whatever problems you may have with the search engines, the same goes for your competitors. They may have dropped rankings, too. Or they may do so soon. The SEO will try to figure out why the new top ranking sites are ranked well, then adapt your site and strategy so that it matches those criteria.

7. Why Don’t We Just Use PPC Instead?

PPC has many advantages. The biggest advantage is that you can get top positioning, and immediate traffic, almost instantly. The downside is, of course, that you pay per click. Whilst this might be affordable today, keep in mind that the search engine has a business objective that demands they reward the top bidders who are most relevant. Their auction model forces prices higher and higher, and only those sites with deep pockets will remain in the game. If you don’t have deep pockets, or don’t want to be beholden to the PPC channel, a long term SEO strategy works well in tandem.

SEO and PPC complement one another, and lulls and challenges in one channel can be made up for by the other. Also, you can feed the keyword data from PPC to SEO to gain a deeper understanding of search visitor behaviour.

8. Does SEO Provide Value For Money?

This is the reason for undertaking any marketing strategy.

An SEO should be able to demonstrate value. One way is to measure the visits from search engines before the SEO strategy starts, and see if these increase significantly post implementation. The value of each search click changes depending on your business case, but can be approximated using the PPC bid prices. Keep in mind the visits from an SEO campaign may be maintained, and increased, over considerable time, thus driving down their cost relative to PPC and other channels.
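As a back-of-the-envelope sketch of that calculation, with every number a hypothetical placeholder:

```python
organic_visits_before = 2000   # monthly organic visits before the SEO campaign
organic_visits_after = 3500    # monthly organic visits after implementation
avg_ppc_bid = 1.20             # what an equivalent click would cost via PPC, in dollars

incremental_visits = organic_visits_after - organic_visits_before
monthly_value = incremental_visits * avg_ppc_bid
print(f"~${monthly_value:,.0f}/month in PPC-equivalent traffic value")
# 1,500 extra visits x $1.20 = $1,800/month, recurring for as long as rankings hold
```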
