Measure For Business Benefit

Jan 28th

Matt Cutts is just toying with SEOs these days.

Going by some comments, many SEOs still miss the big picture. Google is not in the business of enabling SEOs. So he may as well have a little fun - Matt has “called it” on guest posting.

Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company.

The hen-house erupted.

The hens should know better by now. If a guest post is good for the audience and site, then do it. If it’s being done for no other reason than to boost rank in Google, then that’s a sign a publishing strategy is weak, high risk, and vulnerable to Google's whims. Change the publishing strategy.

Measuring What Is Important

Although far from perfect, Google is geared towards recognizing utility. If Google doesn’t recognize utility, then Google will become weaker and someone else will take their place. Only a few people remember AltaVista. They didn’t provide much in the way of utility, and Google ate their lunch.

Which brings me onto the importance of measurement.

It’s important we measure the right things. If people get upset because guest posting is called out, are they upset because they are counting the number of inbound links as if that were the only benefit? Why are they counting inbound links? To get a ranking boost? So, why are some people getting upset? They know Google doesn’t like marketing practices that serve no other purpose than to boost rank. Or are people concerned Google might confuse a post of genuine utility with link spam?

A publishing strategy based on nothing more than Google rankings is not a publishing strategy, it’s a tactic. Given the changes Google has made recently, it’s not a good tactic, because if Google can isolate and eliminate SEO tactics, it will. Those who guest post on other sites, and offer guest post placement, in order to provide utility, should continue to do so. Google is unlikely to eliminate genuine utility, regardless of links, and at worst will likely just ignore the site it appears on.

Interesting

To prosper, we need to be more interesting than the next guy. We need to focus on delivering “interestingness”.

The buzzword term is “visitor engagement”, but that really means “be interesting”. If we provide interesting material, people will read it, and if we provide it on a regular basis, they might come back, or remember our brand name, and then search on that brand name, and then they might link to it, and all this activity combined helps us rank. Ranking is a side effect of being genuinely interesting.

This is not to say measuring links, or page views, is unimportant. But such metrics can be an oversimplification when taken in isolation.

Demand Media's eHow focused on pageviews rather than engagement. Which is a big part of the reason why the guys who sold them eHow were able to beat them with wikiHow.

Success depends on achieving the underlying business goal. Perhaps high page views are not important if a site is targeting a very specific audience. Perhaps rankings aren't all that important if most of the audience is on social media or repeat business. Sometimes, focusing on the wrong metrics leads to the wrong marketing tactics.

What else can we measure? Some common stuff....

  • Page views
  • Subscriptions
  • Comments
  • Quality of comments
  • Syndication
  • Time on site
  • Videos watched
  • Unique visitors
  • Traffic sent to partner sites
  • Bookmarking activity
  • Search engine exposure
  • Brand searches
  • Offline mentions
  • Online mentions
  • Customer satisfaction
  • Conversion rates
  • Number of inquiries
  • Relationships
  • Sales
  • Reduced costs
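The metrics above only become useful once they're rolled up against a business goal. As a minimal sketch of that idea, here's a simple weighted scorecard, where the metric names, weights and numbers are all illustrative assumptions rather than benchmarks:

```python
# A minimal scorecard sketch: combine a few of the metrics above into one
# engagement score. All weights and figures are illustrative assumptions.

WEIGHTS = {
    "page_views": 0.1,
    "subscriptions": 3.0,
    "comments": 1.0,
    "brand_searches": 2.0,
    "inquiries": 4.0,
    "sales": 10.0,
}

def engagement_score(metrics):
    """Weighted sum of whichever metrics we happen to track."""
    return sum(WEIGHTS.get(name, 0.0) * value for name, value in metrics.items())

this_month = {"page_views": 12000, "subscriptions": 40, "comments": 85,
              "brand_searches": 300, "inquiries": 12, "sales": 5}
last_month = {"page_views": 11000, "subscriptions": 25, "comments": 60,
              "brand_searches": 180, "inquiries": 8, "sales": 3}

# Did the bottom-line-weighted picture improve, regardless of raw page views?
print(engagement_score(this_month) > engagement_score(last_month))
```

The weights are the important part: sales and inquiries count for far more than raw page views, which is the whole argument of this section.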

The choice of what we measure depends on what we’re trying to achieve. The SEO may say they are trying to achieve a high rank, but why? To get more traffic, perhaps. Why do we want more traffic? In the hope more people will buy our widget.

So, if selling more widgets is the goal, then perhaps more energy needs to be placed into converting the traffic we already have, as opposed to spending the same energy getting more. Perhaps more time needs to be spent on conversion optimization. Perhaps more time needs to be spent refining the offer. Or listening to customers. Hearing their objections. Writing Q&A that addresses those objections. Guest posting somewhere else and addressing industry-wide objections. Thinking up products to sell to previous customers. Making them aware of changes via an email list. Optimizing the interest factor of your site to make it more interesting than your competitors', then treating the rankings as a bonus. Link building starts with “being interesting”.

When it comes to the guest post, if you’re only doing it to get a link, then you’re almost certainly selling yourself short. A guest post should serve a number of functions, such as building awareness, increasing reach and building brand, and should be based on serving your underlying marketing objective. Pick where you post carefully. Deliver real value. If you do guest post, always try to extract far more benefit than just the link.

There was a time when people could put low-quality posts on low-quality sites and enjoy a benefit. But that practice is really just selling a serious web business short.

How Do We Know If We're Interesting?

There are a couple of different types of measurement marketers use. One is an emotional response, where the visitor becomes “positively interested”. This is measured by recall studies, association techniques, customer surveys and questionnaires. However, the type of response online marketers focus on, which is somewhat easier to measure, is behavioural interest. When people are really interested, they do something in response.

So, to measure the effectiveness of a guest post, we might look for increased name or brand searches. More LinkedIn views. We might look at how many people travel down the links. We look at what they do when they land on the site, and - the most important bit - whether they do whatever that thing is that translates to the bottom line. Was it subscribing? Commenting? Downloading a white paper? Watching a video? Getting in contact? Tweeting? Bookmarking? What was that thing you wanted them to do in order to serve your bottom line?
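The brand-search part of that measurement can be as simple as comparing averages before and after the post went live. A rough sketch, where the dates and daily counts are invented for illustration and would come from your analytics export:

```python
from datetime import date

# Hypothetical daily brand-search counts exported from an analytics tool.
# The guest post went live on the 15th; compare the averages either side.

post_date = date(2014, 1, 15)
daily_brand_searches = {
    date(2014, 1, 12): 40, date(2014, 1, 13): 38, date(2014, 1, 14): 42,
    date(2014, 1, 16): 55, date(2014, 1, 17): 61, date(2014, 1, 18): 58,
}

def average(values):
    return sum(values) / len(values)

before = average([n for d, n in daily_brand_searches.items() if d < post_date])
after = average([n for d, n in daily_brand_searches.items() if d > post_date])

lift = (after - before) / before  # relative lift in brand searches
print(f"Brand-search lift after the guest post: {lift:.0%}")
```

A three-day window is far too short for a real conclusion; the point is only that the measurement is a before/after comparison on a business signal, not a link count.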

Measurement should be flexible and will be geared towards achieving business goals. SEOs may worry that if they don’t show rankings and links, then the customer will be dissatisfied. I’d wager the customer will be a lot more dissatisfied if they do get a lot of links and a rankings boost, yet no improvement in the bottom line. We could liken this to companies that have a lot of meetings. There is an air of busyness, but are they achieving anything worthwhile? Maybe. Maybe not. We should be careful not to mistake frenzy for productivity.

Measuring links, like measuring the number of meetings, is reductive. So is measuring engagement just by looking at clicks. The picture needs to be broad and strategic. So, if guest posts help you build your business, measured by business metrics, keep doing them. Don’t worry about what Google may or may not do, because it’s beyond your control, regardless.

Control what you can. Control the quality of information you provide.

SEO 2014

Jan 11th

We’re at the start of 2014.

SEO is finished.

Well, what we had come to know as the practical execution of “whitehat SEO” is finished. Google has defined it out of existence. Research keyword. Write page targeting keyword. Place links with that keyword in the link. Google cares not for this approach.

SEO, as a concept, is now an integral part of digital marketing. To do SEO in 2014 - Google-compliant, whitehat SEO - digital marketers must seamlessly integrate search strategy into other aspects of digital marketing. It’s a more complicated business than traditional SEO, but offers a more varied and interesting challenge, too.

Here are a few things to think about for 2014.

1. Focus On Brand

Big brands not only get a free pass, they can get extra promotion. By being banned. Take a look at Rap Genius. Aggressive link-building strategy leads to de-indexing. A big mea culpa follows and what happens? Not only do they get reinstated, they’ve earned themselves a wave of legitimate links.

Now that’s genius.

Google would look deficient if they didn’t show that site as visitors would expect to find it - enough people know the brand. To not show a site that has brand awareness would make Google look bad.

Expedia's link profile was similarly outed for appearing to be at odds with Google's published standards. Could a no-name site pass a hand inspection if they use aggressive linking? Unlikely.

What this shows is that if you have a brand important enough so that Google would look deficient by excluding it, then you will have advantages that no-name sites don’t enjoy. You will more likely pass manual inspections, and you’re probably more likely to get penalties lifted.

What is a brand?

In terms of search, it’s a site that visitors can use a brand search to find. Just how much search volume you require is open to debate, but you don’t need to be a big brand like Apple, Trip Advisor or Microsoft. Rap Genius aren't. Ask “would Google look deficient if this site didn’t show up?” - you can usually tell by looking for evidence of search volume on a site’s name.

In advertising, brands have been used to secure a unique identity. That identity is associated with a product or service by the customer. Search used to be about matching a keyword term. But as keyword areas become saturated, and Google returns fewer purely keyword-focused pages anyway, developing a unique identity is a good way forward.

If you haven’t already, put some work into developing a cohesive, unique brand. If you have a brand, then think about generating more awareness. This may mean higher spends on brand-related advertising than you’ve allocated in previous years. The success metric is an increase in brand searches i.e. the name of the site.

2. Be Everywhere

The idea of a stand-alone site is becoming redundant. In 2014, you need to be everywhere your potential visitors reside. If your potential visitors are spending all day in Facebook, or YouTube, that’s where you need to be, too. It’s less about them coming to you, which is the traditional search metaphor, and a lot more about you going to them.

You draw visitors back to your site, of course, but look at every platform and other site as a potential extension of your own site. Pages or content you place on those platforms are yet another front door to your site, and can be found in Google searches. If you’re not where your potential visitors are, you can be sure your competitors will be, especially if they’re investing in social media strategies.

A reminder to see all channels as potential places to be found.

Mix in cross-channel marketing with remarketing and consumers get the perception that your brand is bigger. Google ran display ads of their own before they broadly enabled retargeting. Retargeting only further increases that lift in brand searches.

3. Advertise Everywhere

Are you finding it difficult to get top ten in some areas? Consider advertising with AdWords and on the sites that already hold those positions. Do some brand advertising on them to raise awareness and generate some brand searches. An advert placed on a site that offers a complementary good or service might be cheaper than going to the expense and effort needed to rank. It might also help insulate you from Google’s whims.

The same goes for guest posts and content placement, although obviously you need to be a little careful as Google can take a dim view of it. The safest way is to make sure the content you place is unique, valuable and has utility in its own right. Ask yourself if the content would be equally at home on your own site if you were to host it for someone else. If so, it’s likely okay.

4. Valuable Content

Google does an okay job of identifying good content. It could do better. They’ve lost their way a bit in terms of encouraging production of good content. It’s getting harder and harder to make the numbers work in order to cover the production cost.

However, it remains Google’s mission to provide the user with answers the visitor deems relevant and useful. The utility of Google relies on it. Any strategy that is aligned with providing genuine visitor utility will align with Google’s long term goals.

Review your content creation strategies. Content that is of low utility is unlikely to prosper. While it’s still a good idea to use keyword research as a guide to content creation, it’s a better idea to focus on topic areas and creating engagement through high utility. What utility is the user expecting from your chosen topic area? If it’s rap lyrics for song X, then only the rap lyrics for song X will do. If it is plans for a garden, then only plans for a garden will do. See being “relevant” as “providing utility”, not keyword matching.

Go back to the basic principles of classifying the search term as either Navigational, Informational, or Transactional. If the keyword is one of those types, make sure the content offers the utility expected of that type. Be careful when dealing with informational queries that Google could use in its Knowledge Graph. If your pages deal with established facts that anyone else can state, then you have no differentiation, and that type of information is more likely to end up as part of Google’s Knowledge Graph. Instead, go deep on information queries. Expand the information. Associate it with other information. Incorporate opinion.

BTW, Bill has some interesting reading on the methods by which Google might be identifying different types of queries.

Methods, systems, and apparatus, including computer program products, for identifying navigational resources for queries. In an aspect, a candidate query in a query sequence is selected, and a revised query subsequent to the candidate query in the query sequence is selected. If a quality score for the revised query is greater than a quality score threshold and a navigation score for the revised query is greater than a navigation score threshold, then a navigational resource for the revised query is identified and associated with the candidate query. The association specifies the navigational resource as being relevant to the candidate query in a search operation.
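Taken literally, the decision rule in that abstract is just two threshold checks followed by an association. A sketch of it, where the scores, thresholds and example queries are all hypothetical (the patent doesn't say how the scores are computed):

```python
# A literal sketch of the patent's decision rule above. Scores, thresholds
# and the example data are invented for illustration.

QUALITY_THRESHOLD = 0.6
NAVIGATION_THRESHOLD = 0.8

def associate_navigational_resource(candidate_query, revised_query, scores, resources):
    """Return the navigational resource to associate with candidate_query, or None."""
    quality, navigation = scores[revised_query]
    if quality > QUALITY_THRESHOLD and navigation > NAVIGATION_THRESHOLD:
        return resources.get(revised_query)
    return None

scores = {"facebook login": (0.9, 0.95)}
resources = {"facebook login": "facebook.com"}

# "face book" was revised to "facebook login" later in the query sequence,
# so facebook.com gets associated with the earlier, rougher query too.
print(associate_navigational_resource("face book", "facebook login", scores, resources))
```

The interesting implication for brands: if users routinely refine rough queries into your brand name, your site can become the navigational answer for the rough query as well.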

5. Solve Real Problems

This is a follow-on from “ensuring you provide content with utility”. Go beyond keyword and topical relevance. Ask “what problem is the user trying to solve?” Is it an entertainment problem? A “How To” problem? What would their ideal solution look like? What would a great solution look like?

There is no shortcut to determining what a user finds most useful. You must understand the user. This understanding can be gleaned from interviews, questionnaires, third party research, chat sessions, and monitoring discussion forums and social channels. Forget about the keyword for the time being. Get inside a visitor's head. What is their problem? Write a page addressing that problem by providing a solution.

6. Maximise Engagement

Google are watching for the click-back to the SERP results, an action characteristic of a visitor who clicked through to a site and didn’t deem what they found to be relevant to the search query in terms of utility. Relevance in terms of subject match is now a given.

Big blocks of dense text, even if relevant, can be off-putting. Add images and videos to pages that have low engagement and see if this fixes the problem. Where appropriate, make sure the user takes an action of some kind. Encourage the user to click deeper into the site following an obvious, well placed link. Perhaps they watch a video. Answer a question. Click a button. Anything that isn’t an immediate click back.
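The signal described here - visitors who take no action and bounce straight back to the SERP - is easy to approximate from your own session data. A rough sketch, where the session log format is invented:

```python
# A rough sketch of the click-back signal described above: the share of
# search visitors who do nothing and return to the SERP. The session log
# format here is invented for illustration.

sessions = [
    {"landing": "/guide", "actions": ["scroll", "click:/guide/part-2"]},
    {"landing": "/guide", "actions": []},              # immediate click-back
    {"landing": "/guide", "actions": ["play_video"]},
    {"landing": "/guide", "actions": []},              # immediate click-back
]

def click_back_rate(sessions):
    """Fraction of sessions with no recorded on-page action."""
    bounces = sum(1 for s in sessions if not s["actions"])
    return bounces / len(sessions)

print(f"Click-back rate: {click_back_rate(sessions):.0%}")
```

Tracking this per landing page tells you which pages to add video, images or deeper links to, per the advice above.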

If you’ve focused on utility, and genuinely solving a user's problem, as opposed to just matching a keyword, then your engagement metrics should be better than those of the guy who is still just chasing keywords and only matching in terms of relevance to a keyword term.

7. Think Like A PPCer

Treat every click like you were paying for it directly. Once that visitor has arrived, what is the one thing you want them to do next? Is it obvious what they have to do next? Always think about how to engage that visitor once they land. Get them to take an action, where possible.

8. Think Like A Conversion Optimizer

Conversion optimization tries to reduce the bounce rate by re-crafting the page to ensure it meets the user's needs. Conversion optimizers do this by split testing different designs, phrases, copy and other elements on the page.

It’s pretty difficult to test these things in SEO, but it’s good to keep this process in mind. What pages of your site are working well and which pages aren’t? Is it anything to do with different designs or element placement? What happens if you change things around? What do the three top ranking sites in your niche look like? If their link patterns are similar to yours, what is it about those sites that might lead to higher engagement and relevancy scores?
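When you can split test (on conversions, say, rather than rankings), the standard tool is a two-proportion z-test. A minimal stdlib-only sketch, with made-up traffic and conversion numbers:

```python
import math

# A minimal two-proportion z-test sketch for the split testing described
# above: did variant B convert better than variant A? Numbers are made up.

def z_test(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # one-sided p-value from the normal approximation
    p_value = 0.5 * math.erfc(z / math.sqrt(2))
    return z, p_value

z, p = z_test(conv_a=40, n_a=1000, conv_b=65, n_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests B's lift is real
```

This is exactly the discipline that's hard to transplant into SEO, because you can't hold the algorithm constant between variants; but keeping the statistical mindset stops you declaring winners off noise.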

9. Rock Solid Strategic Marketing Advantage

SEO is really hard to do on generic me-too sites. It’s hard to get links. It’s hard to get anyone to talk about them. People don’t share them with their friends. These sites don’t generate brand searches. The SEO option for these sites is typically what Google would describe as blackhat, namely link buying.

Look for a marketing angle. Find a story to tell. Find something unique and remarkable about the offering. If a site doesn’t have a clearly-articulated point of differentiation, it’s much harder to get value from organic search whilst staying within the guidelines.

10. Links

There’s a reason Google hammers links. It’s because they work. Else, surely Google wouldn’t make a big deal about them.

Links count. It doesn’t matter if they are no-follow, scripted, within social networks, or wherever, they are still paths down which people travel. It comes back to a clear point of differentiation, genuine utility and a coherent brand. It’s a lot easier, and safer, to link build when you’ve got all the other bases covered first.

Beware Of SEO Truthiness

Dec 13th

When SEO started, many people routinely used black-box testing to try to figure out what pages the search engines rewarded.

Black box testing is terminology used in IT. It’s a style of testing that doesn’t assume knowledge of the internal workings of a machine or computer program. Rather, you can only test how the system responds to inputs.

So, for many years, SEO was about trying things out and watching how the search engine responded. If rankings went up, SEOs assumed correlation meant causation, so they did a lot more of whatever it was they thought was responsible for the boost. If the trick was repeatable, they could draw some firmer conclusions about causation, at least until the search engine introduced some new algorithmic code and sent everyone back to their black-box testing again.

Well, it sent some people back to testing. Some SEOs don’t do much, if any, testing of their own, and so rely on the strategies articulated by other people. As a result, the SEO echo chamber can be a pretty misleading place as “truthiness” - and a lot of false information - gets repeated far and wide, until it’s considered gospel. One example of truthiness is that paid placement will hurt you. Well, it may do, but not having it may hurt you more, because it all really... depends.

Another problem is that SEO testing can seldom be conclusive, because you can’t be sure of the state of the thing you’re testing. The thing you're testing may not be constant. For example, you throw up some more links and your rankings rise, but the rise could be due to other factors, such as a new engagement algorithm Google implemented in the middle of your testing that you simply didn’t know about.

It used to be a lot easier to conduct this testing. Updates were periodic. Up until that point, you could reasonably assume the algorithms were static, so cause and effect were more obvious than they are today. Danny Sullivan gave a good overview of search history at Moz earlier in the year:

That history shows why SEO testing is getting harder. There are a lot more variables to isolate than there used to be. The search engines have also been clever. A good way to thwart SEO black box testing is to keep moving the target. Continuously roll out code changes and don’t tell people you’re doing it. Or send people on a wild goose chase by arm-waving about a subtle code change made over here, when the real change has been made over there.

That’s the state of play in 2013.

However….(Ranting Time :)

Some SEO punditry is bordering on the ridiculous!

I’m not going to link to one particular article I’ve seen recently, as, ironically, that would mean rewarding them for spreading FUD. Also, calling out people isn't really the point. Suffice to say, the advice was about specifics, such as how many links you can “safely” get from one type of site, that sort of thing....

The problem comes when we can easily find evidence to the contrary. In this case, a quick look through the SERPs and you'll find evidence of top ranking sites that have more than X links from Site Type Y, so this suggests….what? Perhaps these sites are being “unsafe”, whatever that means. A lot of SEO punditry is well meaning, and often a rewording of Google's official recommendations, but can lead people up the garden path if evidence in the wild suggests otherwise.

If one term defined SEO in 2013, it is surely “link paranoia”.

What's Happening In The Wild

When it comes to what actually works, there are few hard and fast rules regarding links. Look at the backlink profiles for top ranked sites across various categories and you’ll see one thing that is constant....

Nothing is constant.

Some sites have links coming from obviously automated campaigns, and it seemingly doesn’t affect their rankings. Other sites have credible link patterns, and rank nowhere. What counts? What doesn’t? What other factors are in play? We can only really get a better picture by asking questions.

Google allegedly took out a few major link networks over the weekend. Anglo Rank came in for special mention from Matt Cutts.

So, why are Google making a point of taking out link networks if link networks don’t work? Well, it’s because link networks work. How do we know? Look at the back link profiles in any SERP area where there is a lot of money to be made, and the area isn’t overly corporate i.e. not dominated by major brands, and it won’t be long before you spot aggressive link networks, and few "legitimate" links, in the backlink profiles.

Sure, you wouldn't want aggressive link networks pointing at brand sites, as there are better approaches brand sites can take when it comes to digital marketing, but such evidence makes a mockery of the tips some people are freely handing out. Are such tips the result of conjecture, repeating Google's recommendations, or actual testing in the wild? Either the link networks work, or they don’t work but don’t affect rankings, or these sites shouldn't be ranking.

There’s a good reason some of those tips are free, I guess.

Risk Management

Really, it’s a question of risk.

Could these sites get hit eventually? Maybe. However, those using a “disposable domain” approach will do anything that works as far as linking goes, as their main risk is not being ranked. Being penalised is an occupational hazard, not game-over. These sites will continue so long as Google's algorithmic treatment rewards them with higher ranking.

If your domain is crucial to your brand, then you might choose to stay away from SEO entirely, depending on how you define “SEO”. A lot of digital marketing isn’t really SEO in the traditional sense, i.e. optimizing hard against an algorithm in order to gain higher rankings. Rather, a lot of digital marketing is based on optimization for people, treating SEO as a side benefit. There’s nothing wrong with this, of course - it’s a great approach for many sites, and something we advocate. Most sites end up somewhere along that continuum, but no matter where you are on that scale, there’s always a marketing risk to be managed, with “non-performance” perhaps being the risk most often glossed over.

So, if there's a take-away, it's this: check out what actually happens in the wild, and then evaluate your risk before emulating it. When pundits suggest a rule, check to see if you can spot times it appears to work, and perhaps more interestingly, when it doesn't. It's in those areas of personal inquiry and testing where gems of SEO insight are found.

SEO has always been a mix of art and science. You can test, but only so far. The art part is dealing with the unknown past the testing point. Performing that art well is to know how to pick truthiness from reality.

And that takes experience.

But mainly a little fact checking :)


Value Based SEO Strategy

Dec 1st

One approach to search marketing is to treat the search traffic as a side-effect of a digital marketing strategy. I’m sure Google would love SEOs to think this way, although possibly not when it comes to PPC! Even if you’re taking a more direct, rankings-driven approach, the engagement and relevancy scores that come from delivering what the customer values should serve you well, too.

In this article, we’ll look at a content strategy based on value based marketing. Many of these concepts may be familiar, but bundled together, they provide an alternative search provider model to one based on technical quick fixes and rank. If you want to broaden the value of your SEO offering beyond that first click, and get a few ideas on talking about value, then this post is for you.

In any case, the days of being able to rank well without providing value beyond the click are numbered. Search is becoming more about providing meaning to visitors and less about providing keyword relevance to search engines.

What Is Value Based Marketing?

Value based marketing is customer, as opposed to search engine, centric. In Values Based Marketing For Bottom Line Success, the authors focus on five areas:

  • Discover and quantify your customers' wants and needs
  • Commit to the most important things that will impact your customers
  • Create customer value that is meaningful and understandable
  • Assess how you did at creating true customer value
  • Improve your value package to keep your customers coming back

Customers compare your offer against those of competitors, and divide the benefits by the cost to arrive at value. Marketing determines and communicates that value.
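That value calculation is simple enough to state directly. A toy sketch, where the benefit scores and costs are subjective, illustrative numbers:

```python
# The value calculation described above, reduced to its simplest form:
# value = perceived benefits / cost, compared across competing offers.
# Benefit scores and costs here are illustrative, subjective numbers.

offers = {
    "us":           {"benefits": 80, "cost": 50},
    "competitor_a": {"benefits": 70, "cost": 60},
    "competitor_b": {"benefits": 90, "cost": 120},
}

def perceived_value(offer):
    return offer["benefits"] / offer["cost"]

# Which offer does the customer perceive as best value?
best = max(offers, key=lambda name: perceived_value(offers[name]))
print(best)
```

Note competitor_b has the highest raw benefits but loses on value, which is the point: marketing's job is to shift the perceived numerator, not just discount the denominator.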

This is the step beyond keyword matching. When we use keyword matching, we’re trying to determine intent. We’re doing a little demographic breakdown. This next step is to find out what the customer values. If we give the customer what they value, they’re more likely to engage and less likely to click back.

What Does The Customer Value?

A key question of marketing is “which customers does this business serve?” Seems like an obvious question, but it can be difficult to answer. Does a gym serve people who want to get fit? Yes, but then all gyms do that, so how would this one be differentiated?

Obviously, a gym serves people who live in a certain area. So, if our gym is in Manhattan, our customer becomes “someone who wants to get fit in Manhattan”. Perhaps our gym is upmarket and expensive. So, our customer becomes “people who want to get fit in Manhattan and be pampered and are prepared to pay more for it”. And so on, and so on. They’re really questions and statements about the value proposition as perceived by the customer, and then delivered by the business.

So, value based marketing is about delivering value to a customer. This syncs with Google’s proclaimed goal in search, which is to put users first by delivering results they deem to have value, and not just pages that match a keyword term. Keywords need to be seen in a wider context, and that context is pretty difficult to establish if you’re standing outside the search engine looking in, so thinking in terms of concepts related to the value proposition might be a good way to go.

Value Based SEO Strategy

The common SEO approach, for many years, has started with keywords. It should start with customers and the business.

The first question is “who is the target market” and then ask what they value.

Relate what they value to the business. What is the value proposition of the business? Is it aligned? What would make a customer value this business offering over those of competitors? It might be price. It might be convenience. It’s probably a mix of various things, but be sure to nail down the specific value propositions.

Then think of some customer questions around these value propositions. What would be the likely customer objections to buying this product? What would be points that need clarifying? How does this offer differ from other similar offers? What is better about this product or service? What are the perceived problems in this industry? What are the perceived problems with this product or service? What is difficult or confusing about it? What could go wrong with it? What risks are involved? What aspects have turned off previous customers? What complaints did they make?

Make a list of such questions. These are your article topics.

You can glean this information by either interviewing customers or the business owner. Each of these questions, and accompanying answer, becomes an article topic on your site, although not necessarily in Q&A format. The idea is to create a list of topics as a basis for articles that address specific points, and objections, relating to the value proposition.

For example, buying SEO services is a risk. Customers want to know if the money they spend is going to give them a return. So, a valuable article might be a case study on how the company provided return on spend in the past, and the process by which it will achieve similar results in future. Another example might be a buyer concerned about the reliability of a make of car. A page dedicated to reliability comparisons, and another page outlining the customer care after-sale plan would provide value. Note how these articles aren’t keyword driven, but value driven.

Ever come across a FAQ that isn’t really a FAQ? Dreamed-up questions? They’re frustrating, and of little value if the information doesn’t directly relate to the value we seek. Information should be relevant and specific so when people land on the site, there’s more chance they will perceive value, at least in terms of addressing the questions already on their mind.

Compare this approach with generic copy around a keyword term. A page talking about “SEO” in response to the keyword term “SEO” might closely match that term, so it’s a relevance match, but unless it’s tied into providing a customer the value they seek, it’s probably not of much use. Finding relevance matches is no longer a problem for users. Finding value matches often is. Even if you’re keyword focused, adding these articles provides semantic variation that may capture keyword searches that aren't appearing in keyword tools.

Keyword relevance was a strategy devised at a time when information was less readily available and search engines weren't as powerful. Finding something relevant was more hit and miss than it is today. These days, there’s likely thousands, if not millions, of pages that will meet relevance criteria in terms of keyword matching, so the next step is to meet value criteria. Providing value is less likely to earn a click back and more likely to create engagement than mere on-topic matching.

The Value Chain

Deliver value. Once people perceive value, then we have to deliver it. Marketing, and SEO in particular, used to be about getting people over the threshold. Today, businesses have to work harder to differentiate themselves and a sound way of doing this is to deliver on promises made.

So the value is in the experience. Why do we return to Amazon? It’s likely due to the end-to-end experience in terms of delivering value. Any e-commerce store can deliver relevance. Where competition is fierce, Google is selective.

In the long term, delivering value should drive down the cost of marketing as the site is more likely to enjoy repeat custom. As Google pushes more and more results beneath the fold, the cost of acquisition is increasing, so we need to treat each click like gold.

Monitor value. Does the firm keep delivering value? To the same level? Because people talk. They talk on Twitter and Facebook and the rest. We want them talking in a good way, but even if they talk in a negative way, it can still be useful. Their complaints can be used as topics for articles. They can be used to monitor value, refine the offer and correct problems as they arise. Those social signals, whilst not a guaranteed ranking boost, are still signals. We need to adopt strategies whereby we listen to all the signals, the better to understand our customers, in order to provide more value, and hopefully enjoy a search traffic boost as a welcome side-effect, so long as Google is also trying to determine what users value.

Not sounding like SEO? Well, it’s not optimizing for search engines, but for people. If Google is to provide value, then it needs to ensure results are not just relevant, but offer genuine value to end users. Do Google do this? In many cases, not yet, but all their rhetoric and technical changes suggest that providing value is at the ideological heart of what they do. So the search results will most likely, in time, reflect the value people seek, and not just relevance.

In technical terms, this provides some interesting further reading:

Today, signals such as keyword co-occurrence, user behavior, and previous searches do in fact inform context around search queries, which impact the SERP landscape. Note I didn’t say the signals “impact rankings,” even though rank changes can, in some cases, be involved. That’s because there’s a difference. Google can make a change to the SERP landscape to impact 90 percent of queries and not actually cause any noticeable impact on rankings.

The way to get the context right, and get positive user behaviour signals, and align with their previous searches, is to first understand what people value.

Historical Revisionism

Nov 1st
posted in

A stopped clock is right two times a day.

There’s some amusing historical revisionism going on in the SEO punditry world right now, which got me thinking about the history of SEO. I’d like to talk about some common themes of this historical revision, which goes along the lines of “what I predicted all those years ago came true - what a visionary I am!”. No naming names, as I don't mean this to be anything personal - the same theme has popped up in a number of places - just making some observations :)

See if you agree….

Divided We Fall

The SEO world has never been united. There are no industry standards and qualifications like you’d find in the professions, such as doctor, lawyer, or builder. If you say you’re an SEO, then you’re an SEO.

Part of the reason for the lack of industry standard is that the search engines never came to the party. Sure, they talked at conferences, and still do. They offered webmasters helpful guidelines. They participated in search engine discussion forums. But this was mainly to do with risk management. Keep your friends close, and your enemies closer.

In all these years, you won’t find one example of a representative from a major search engine saying “Hey, let’s all get together and form an SEO standard. It will help promote and legitimize the industry!”.

No, it has always been decrees from on high. “Don’t do this, don’t do that, and here are some things we’d like you to do”. Webmasters don’t get a say in it. They either do what the search engines say, or they go against them, but make no mistake, there was never any partnership, and the search engines didn’t seek one.

This didn’t stop some SEOs seeing it as a form of quasi-partnership, however.

Hey Partner

Some SEOs chose to align themselves with search engines and do their bidding. If the search engine reps said “do this”, they did it. If the search engines said “don’t do this”, they’d wrap themselves up in convoluted rhetorical knots pretending not to do it. This still goes on, of course.

In the early 2000’s, it turned, curiously, into a question of morality. There was “Ethical SEO”, although quite what it had to do with ethics remains unclear. Really, it was another way of saying “someone who follows the SEO guidelines”, presuming that whatever the search engines decree must be ethical, objectively good, and have nothing to do with self-interest. It’s strange how people kid themselves, sometimes.

What was even funnier was the search engine guidelines were kept deliberately vague and open to interpretation, which, of course, led to a lot of heated debate. Some people were “good” and some people were “bad”, even though the distinction was never clear. Sometimes it came down to where on the page someone puts a link. Or how many times someone repeats a keyword. And in what color.

It got funnier still when the search engines moved the goal posts, as they are prone to do. What was previously good - using ten keywords per page - suddenly became the height of evil, but using three was “good” and so all the arguments about who was good and who wasn’t could start afresh. It was the pot calling the kettle black, and I’m sure the search engines delighted in having the enemy warring amongst themselves over such trivial concerns. As far as the search engines were concerned, none of them were desirable, unless they became paying customers, or led paying customers to their door. Then there was all that curious Google+ business.

It's hard to keep up, sometimes.

Playing By The Rules

There’s nothing wrong with playing by the rules. It would have been nice to think there was a partnership, and so long as you followed the guidelines, high rankings would naturally follow, the bad actors would be relegated, and everyone would be happy.

But this has always been a fiction. A distortion of the environment SEOs were actually operating in.

Jason Calacanis, never one to miss an opportunity for controversy, fired some heat seekers at Google during his WebmasterWorld keynote address recently…..

Calacanis proceeded to describe Cutts and Google in terms like, “liar,” “evil,” and “a bad partner.” He cautioned the PubCon audience to not trust Google, and said they cooperate with partners until they learn the business and find a way to pick off the profits for themselves. The rant lasted a good five minutes….

He accused Google of doing many of the things SEOs are familiar with, like making abrupt algorithm changes without warning. They don’t consult, they just do it, and if people’s businesses get trashed as a result, then that’s just too bad. Now, if that’s a sting for someone who is already reasonably wealthy and successful like Calacanis, just imagine what it feels like for the much smaller web players who are just trying to make a living.

The search business is not a pleasant environment where all players have an input, and then standards, terms and play are generally agreed upon. It’s war. It’s characterized by a massive imbalance of power and wealth, and one party will use it to crush those who it determines stand in its way.

Of course, the ever pleasant Matt Cutts informs us it’s all about the users, and that’s a fair enough spin of the matter, too. There was, and is, a lot of junk in the SERPs, and Mahalo was not a partner of Google, so any expectation they’d have a say in what Google does is unfounded.

The take-away is that Google will set rules that work for Google, and if they happen to work for the webmaster community too, well that’s good, but only a fool would rely on it. Google care about their bottom line and their projects, not ours. If someone goes out of business due to Google’s behaviour, then so be it. Personally, I think the big technology companies do have a responsibility beyond themselves to society, because the amount of power they are now centralising means they’re not just any old company anymore, but great vortexes that can distort entire markets. For more on this idea, and where it’s all going, check out my review of “Who Owns The Future” by Jaron Lanier.

So, if you see SEO as a matter of playing by their rules, then fine, but keep in mind "those who can give you everything can also take everything away". Those rules weren't designed for your benefit.

Opportunity Cost

There was a massive opportunity cost by following so called ethical SEO during the 2000s.

For a long time, it was relatively easy to get high rankings by being grey. And if you got torched, you probably had many other sites with different link patterns good to go. This was against the webmaster guidelines, but given marketing could be characterized as war, one does not let the enemy define one's tactics. Some SEOs made millions doing it. Meanwhile, a lot of content-driven sites disappeared. That was, perhaps, my own "a stopped clock is right two times a day" moment. It's not like I'm going to point you to all the stuff I've been wrong about, now is it :)

These days, a lot of SEO is about content and how that content is marketed, but more specifically it’s about the stature of the site on which that content appears. That’s the bit some pundits tend to gloss over. You can have great content, but that’s no guarantee of anything. You will likely remain invisible. However, put that exact same content on a Fortune 500 site, and that content will likely prosper. Ah, the rich get richer.

So, we can say SEO is about content, but that’s only half the picture. If you’re a small player, the content needs to appear in the right place, be very tightly targeted to your audience's needs so they don’t click back, and it should be pushed through various media channels.

Content, even from many of these "ethical SEOs", used to be created for search engines in the hope of netting as many visitors as possible. These days, it’s probably a better strategy to get inside the audience's heads and target it to their specific needs, as opposed to a keyword, then get that content out to wherever your audience happens to be. Unless, of course, you’re Fortune 500 or otherwise well connected, in which case you can just publish whatever you like and it will probably do well.

Fair? Not really, but no one ever said this game was fair.

Whatever Next?

Do I know what’s going to happen next? In ten years time? Nope. I could make a few guesses, and like many other pundits, some guesses will prove right, and some will be wrong, but that’s the nature of the future. It will soon make fools of us all.

Having said that, will you take a punt and tell us what you think will be the future of SEO? Does it have one? What will it look like? If you’re right, then you can point back here in a few years time and say “Look, I told you so!”.

If you’re wrong, well, there’s always historical revisionism :)

Optimizing The SEO Model

Oct 29th
posted in

SEO has always been focused on acquisition.

The marketing strategy, based on high rankings against keyword terms, is about gaining a steady flow of new visitors. If a site ranks better than competing sites, this steady stream of new visitors will advantage the top sites to the disadvantage of those sites beneath it.

The selling point of SEO is a strong one. The client gets a constant flow of new visitors and enjoys competitive advantage, just so long as they maintain rank.

A close partner of SEO is PPC. Like SEO, PPC delivers a stream of new visitors, and if you bid well, and have relevant advertisements, then you enjoy a competitive advantage. Unlike PPC, SEO does not cost per click, or, to be more accurate, it should cost a lot less per click once the SEO's fees are taken into account, so SEO has enjoyed a stronger selling point. Also, the organic search results typically have a higher level of trust from search engine users.

“91% prefer using natural search results when looking to buy a product or service online”. [Source: Tamar Search Attitudes Report, Tamar, July 2010]

Rain On The Parade

Either by coincidence or design, Google’s algorithm shifts have made SEO less of a sure proposition.

If you rank well, the upside is still there, but because the result is less certain than it used to be, and the work more involved than ever, the risk, and costs in general, have increased. The more risky SEO becomes in terms of getting results, the more Adwords looks attractive, as at least results are assured, so long as spend is sufficient.

Adwords is a brilliant system. For Google. It’s also a brilliant system for those advertisers who can find a niche that doesn’t suffer high levels of competition. The trouble is competition levels are typically high.

Because competition is high, and Adwords is an auction model, bid prices must rise. As bid prices rise, only those companies that can achieve ROI at high costs per click will be left bidding. The higher their ROI, the higher the bid prices can conceivably go. Their competitors, if they are to keep up, will do likewise.
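The auction arithmetic above can be sketched in a few lines. This is a hedged illustration only: the conversion rate and profit-per-sale figures are made-up assumptions, not data from any real campaign.

```python
def max_profitable_cpc(conversion_rate, profit_per_sale, target_margin=0.0):
    """Highest cost-per-click at which a click still meets the target margin.

    Expected profit per click is conversion_rate * profit_per_sale; an
    advertiser bidding above that (adjusted for margin) loses money on
    the channel, so sustained competition pushes bids towards this ceiling.
    """
    return conversion_rate * profit_per_sale * (1 - target_margin)

# A hypothetical advertiser converting 2% of clicks at $150 profit per sale
# can bid up to about $3.00 per click before the channel runs at a loss.
print(max_profitable_cpc(0.02, 150))

# A competitor requiring a 50% margin can only sustain roughly half that bid,
# which is how higher-ROI businesses gradually price rivals out of the auction.
print(max_profitable_cpc(0.02, 150, target_margin=0.5))
```

The point of the sketch: whoever extracts the most value per conversion can sustain the highest bid, so in a competitive auction the surplus tends to flow to Google.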

So, the PPC advertiser focused on customer acquisition as a means of growing the company will be passing more and more of their profits to Google in the form of higher and higher click prices. If a company wants to grow by customer acquisition, via the search channel, then they’ll face higher and higher costs. It can be difficult to maintain ROI via PPC over time, which is why SEO is appealing. It’s little wonder Google has their guns pointed at SEO.

A fundamental problem with Adwords, and SEO in general, is that basing marketing success around customer acquisition alone is a poor long term strategy.

More on that point soon….

White-Hat SEO Is Dead

It’s surprising a term such as “white hat SEO” was ever taken seriously.

Any attempt to game a search engine’s algorithm, as far as the search engine is concerned, is going to be frowned upon by the search engine. What is gaming if it’s not reverse engineering the search engine's ranking criteria and looking to gain a higher rank than a site would otherwise merit? Acquiring links, and writing keyword-focused articles, for the purpose of gaining a higher rank in a search engine is an attempt at rank manipulation. The only thing that varies is the degree.

Not that there’s anything wrong with that, as far as marketers are concerned.

The search marketing industry line has been that so long as you avoided “bad behaviour”, your site stood a high chance of ranking well. Ask people for links. Find keywords with traffic. Publish pages focused on those topics. There used to be more certainty of outcome.

If the outcome is not assured, then so long as a site is crawlable, why would you need an SEO? You just need to publish and see where Google ranks you. Unless the SEO is manipulating rank, then where is the value proposition over and above simply publishing crawlable content? Really, SEO is a polite way of saying “gaming the system”.

Those who let themselves be defined by Google can now be seen scrambling to redefine themselves. “Inbound marketers” is one term being used a lot. There’s nothing wrong with this, of course, although you’d be hard pressed to call it Search Engine Optimization. It’s PR. It’s marketing. It’s content production. The side effect of such activity might be a high ranking in the search engines (wink, wink). It’s like Fight Club. The first rule of Fight Club is…...

A few years back, we predicted that the last SEOs standing would be blackhat, and that’s turned out to be true. The term SEO has been successfully co-opted and marginalized. You can still successfully game the system with disposable domains, by aggressively targeting keywords, and by buying lots of links and/or building link networks, but there’s no way that’s compliant with Google’s definitions of acceptable use. It would be very difficult to sell that to a client without full disclosure. Even with full disclosure, I’m sure it’s a hard sell.

But I digress….

Optimization In The New Environment

The blackhats will continue on as usual. They never took direction from search engines, anyway.

Many SEOs are looking to blend a number of initiatives together to take the emphasis off search. Some call it inbound. In practice, it blends marketing, content production and PR. It's a lot less about algo hacking.

For it to work well, and to get great results in search, the SEO model needs to be turned on its head. It’s still about getting people to a site, but because the cost of getting people to a site has increased, every visitor must count. For this channel to maintain value, then more focus will go on what happens after the click.

If the offer is not right, and the path to that offer isn’t right, then it’s like having people turn up for a concert when the band hasn’t rehearsed. At the point the audience turns up, the band must deliver what the audience wants, or the audience isn’t coming back. The band's popularity will quickly fade.

This didn’t really matter too much in the past when it was relatively cheap to position in the SERPs. If you received a lot of slightly off-topic traffic, big deal, it’s not like it cost anything. Or much. These days, because it’s growing ever more costly to position, we’re increasingly challenged by the “growth by acquisition” problem.

Consider optimizing in two areas, if you haven’t already.

1. Offer Optimization

We know that if searchers don’t find what they want, they click back. The click back presents two problems. First, you just wasted time and money getting that visitor to your site. Second, it’s likely that Google is measuring click-backs in order to help determine relevancy.

How do you know if your offer is relevant to users?

The time-tested way is to examine the 4 Ps: product, price, promotion, and place. Place doesn’t matter so much, as we’re talking about the internet, although if you’ve got some local-centric product or service, then it’s a good idea to focus on it. Promotion is what SEOs do. They get people over the threshold.

However, two areas worth paying attention to are product and price. In order to optimize product, we need to ask some fundamental questions:

  • Does the customer want this product or service?
  • What needs does it satisfy? Is this obvious within a few seconds of viewing the page?
  • What features does it have to meet these needs? Are these explained?
  • Are there any features you've missed out? Have you explained all the features that meet the need?
  • Are you including costly features that the customer won't actually use?
  • How and where will the customer use it?
  • What does it look like? How will customers experience it?
  • What size(s), color(s) should it be?
  • What is it to be called?
  • How is it branded?
  • How is it differentiated versus your competitors?
  • What is the most it can cost to provide, and still be sold sufficiently profitably?

SEOs are only going to have so much control over these aspects, especially if they’re working for a client. However, it still pays to ask these questions, regardless. If the client can’t answer them, then you may be dealing with a client who has no strategic advantage over competitors. They are likely running a me-too site. Such sites are difficult to position from scratch.

Even older sites that were at one point highly differentiated have slid into an unprofitable me-too status as large sites like Amazon & eBay offer a catalog which grows deeper by the day.

Unless you're pretty aggressive, taking on me-too sites will make your life difficult in terms of SEO, so thinking about strategic advantage can be a good way to screen clients. If they have no underlying business advantage, ask yourself if you really want to be doing SEO for these people?

In terms of price:

  • What is the value of the product or service to the buyer?
  • Are there established price points for products or services in this area?
  • Is the customer price sensitive? Will a small decrease in price gain you extra market share? Or will a small increase be indiscernible, and so gain you extra profit margin?
  • What discounts should be offered to trade customers, or to other specific segments of your market?
  • How will your price compare with your competitors?

Again, even if you have little or no control over these aspects, then it still pays to ask the questions. You're looking for underlying business advantage that you can leverage.

Once we’ve optimized the offer, we then look at conversion.

2. Conversion Optimization

There’s the obvious conversion most search marketers know about. People arrive at a landing page. Some people buy what’s on offer, and some leave. So, total conversions/number of views x 100 equals the conversion rate.
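The formula above is trivial to express in code. A minimal sketch, with made-up numbers purely for illustration:

```python
def conversion_rate(conversions, views):
    """Conversion rate as a percentage: (conversions / views) * 100."""
    if views == 0:
        return 0.0  # avoid dividing by zero for pages with no traffic yet
    return conversions / views * 100

# A hypothetical landing page: 1200 views, 30 purchases -> 2.5% conversion rate.
print(conversion_rate(30, 1200))
```

In practice you would compute this per entry page, not just for the designated landing page, which is the point made in the next paragraph.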

However, when it comes to SEO, it’s not just about the conversion rate of a landing page. Unlike PPC, you don’t have precise control over the entry page. So, optimizing for conversion is about looking at every single page on which people enter your site, and optimizing each page as if it were an entry point.

What do you want people to do when they land on your page?

Have a desired action in mind for every page. It might be a sign-up. It might be to encourage a bookmark. It might be to buy something. It might be to tweet. Whatever it is, we need to make the terms of engagement, for the visitor, clear for each page - with a big, yellow highlight on the term “engagement”! Remember, Google are likely looking at bounce-back rates. So, there is a conversion rate for every single page on your site, and they’re likely all different.

Think about the shopping cart process. Is a buyer, particularly a mobile buyer, going to wade through multiple forms? Or could the sale be made in as few clicks as possible? Would integrating Paypal or Amazon payments lift your conversion rates? What’s your site speed like? The faster, the better, obviously. A lot of conversion is about streamlining things - from processes, to navigation to site speed.

At this point, a lot of people will be wondering how to measure and quantify all this. How to track conversion funnels across a big site. It’s true, it’s difficult. In many cases, it’s pretty much impossible to get adequate sample sizes.

However, that’s not a good reason to avoid conversion optimization. You can measure it in broad terms, and get more incremental as time goes on. Changes across pages and paths may be small, and individually difficult to spot, but there is sufficient evidence that companies who employ conversion optimization can enjoy significant gains, especially if they haven't focused on these areas in the past.

While you could quantify every step of the way, and some companies certainly do, there’s probably a lot of easy wins to be gained merely by following these two general concepts - optimizing the offer, then optimizing (streamlining) the pages and paths that lead to that offer. If something is obscure, make it obvious. If you want the visitor to do something, make sure the desired action is writ large. If something is slow, make it faster.

Do it across every offer, page and path in your site and watch the results.

Time For A Content Audit

Oct 2nd
posted in

"Content is king" is one of those “truthy” things some marketers preach. However, in most businesses the bottom line is king, attention is queen, and content can be used as a means to get both, but it depends.

The problem is that content is easy to produce. Machines can produce content. They can tirelessly churn out screeds of content every second. Even if they didn’t, billions of people on the internet are perfectly capable of adding to the monolithic content pile at similar rates.

Low barriers to content production and distribution mean the internet has turned a lot of content into near worthless commodity. Getting and maintaining attention is the tricky part, and once a business has that, then the benefits can flow through to the bottom line.

Some content is valuable, of course. Producing valuable content can earn attention. The content that gets the most attention is typically something for which an audience has a strong need, yet can’t easily get elsewhere, and is published in a place they're likely to see. Or someone they know is likely to see. An article on title tags will likely get buried. An article on the secret code to cracking Google's Hummingbird algorithms will likely crash your server.

Up until the point everyone else has worked out how to crack them, too, of course.

What Content Does The User Want?

Content can become King if the audience bestows favor upon it. Content producers need to figure out what content the audience wants. Perversely, Google have chosen to make this task even more difficult than it was before by withholding keyword data. Between Google’s supposed “privacy” drive, Hummingbird supposedly using semantic analysis, and Penguin/Panda supposedly using engagement metrics, page level and path level optimization are worth focusing upon going forward.

If you haven’t done one for a while, now is probably a good time to take stock and undertake a content audit.

You Have Valuable Historical Information

If you’ve got historical keyword data, archive it now. It will give you an advantage over those who follow you from this point on. Going forward, it will be much more expensive to acquire this data.
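Archiving needn't be complicated: even dumping your historical keyword reports to a flat file preserves the data. A minimal sketch, where the field names and rows are illustrative assumptions, not a prescribed schema:

```python
import csv

# Hypothetical rows exported from an analytics keyword report before the
# data disappears behind "(not provided)".
rows = [
    {"keyword": "seo audit", "visits": 420, "landing_page": "/audit"},
    {"keyword": "link building tips", "visits": 180, "landing_page": "/links"},
]

# Write the archive to CSV so it can be matched against content later.
with open("keyword_archive.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["keyword", "visits", "landing_page"])
    writer.writeheader()
    writer.writerows(rows)
```

Once archived, these rows can be joined against your content audit to see which pages historically captured which queries.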

Run an audit on your existing content. What content works best? What type of content is it? Video? Text? What’s the content about? What keywords did people use to find it previously? Match content against your historical keyword data.

Here’s a useful list of site and content audit tools and resources.

If keywords can no longer suggest content demand, then how do we know what the visitor wants in terms of content? We must seek to understand the audience at a deeper level. Take a more fuzzy approach.

Watch Activity Signals

Analytics can get pretty addictive and many tools let you watch what visitors do in real time. Monitor engagement levels on your pages. What is a user doing on that page? Are they reading? Contributing? Clicking back and forward looking for something else?

Ensure pages with high engagement are featured prominently in your information architecture. Relegate or fix low-engagement pages. Segment out your content so you know which is the most popular, in terms of landings, and link that information back to ranking reports. This way, you can approximate keywords and stay focused on the content users find most relevant and engaging. Segment out your audience, too. Different visitors respond to different things. Do you know which group favours what? What do older people go for? What do younger people go for? Here are a few ideas on how to segment users.
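The feature-or-relegate decision above can be sketched as a simple bucketing rule. The thresholds and page data here are invented for illustration; real values would come from your analytics and your own judgment:

```python
# Hypothetical per-page stats pulled from analytics.
pages = [
    {"url": "/guide", "landings": 900, "bounce_rate": 0.35},
    {"url": "/old-news", "landings": 40, "bounce_rate": 0.85},
    {"url": "/pricing", "landings": 500, "bounce_rate": 0.55},
]

def bucket(page, bounce_threshold=0.6, landing_threshold=100):
    """Classify a page: feature high-engagement pages prominently,
    relegate or fix the rest."""
    if page["bounce_rate"] <= bounce_threshold and page["landings"] >= landing_threshold:
        return "feature"
    return "relegate-or-fix"

for page in pages:
    print(page["url"], bucket(page))
```

The two thresholds are deliberately crude; the point is to make the triage repeatable, then refine the cut-offs per audience segment as the data accumulates.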

User behavior is getting increasingly complex. It takes multiple visits to purchase, from multiple channels/influences. Hence the addition of user segmentation allows us to focus on people. (For these exact reasons multi-channel funnels analysis and attribution modeling are so important!)
At the moment in web analytics solutions, people are defined by the first party cookie stored on their browser. Less than ideal, but 100x better than what we had previously. Over time, as we all expand to Universal Analytics, perhaps we will have more options to track the same person, after explicitly asking for permission, across browsers, channels and devices.

In-Site Search

If Google won’t give you keywords, build your own keyword database. Think about ways you can encourage people to use your in-site search. Watch the content they search for and consume the most. Another way of looking at site search is to provide navigation links that emphasize different keyword terms. For example, you could place these high up on your page, with each offering a different option relating to related keyword terms. Take note of which keyword terms visitors favour over others.
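Building that keyword database can start as simply as tallying in-site search queries. A hedged sketch: the log format here (one raw query per line) is a hypothetical assumption, and real logs would need the same kind of normalisation before counting.

```python
from collections import Counter

# Hypothetical raw queries captured from an in-site search box.
raw_queries = [
    "seo audit checklist",
    "SEO Audit Checklist",
    "link building",
    "seo audit checklist",
]

def normalise(query):
    """Lowercase and collapse whitespace so variants count as one term."""
    return " ".join(query.lower().split())

keyword_counts = Counter(normalise(q) for q in raw_queries)

# Most-searched terms first: these are the topics your visitors value.
for term, count in keyword_counts.most_common():
    print(term, count)
```

Over time, the top of this list substitutes for the referral keyword data Google no longer provides.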

In the good old days, people dutifully used site navigation at the left, right, or top of a website. But, two websites have fundamentally altered how we navigate the web: Amazon, because the site is so big, sells so many things, and is so complicated that many of us go directly to the site search box on arrival. And Google, which has trained us to show up, type what we want, and hit the search button. Now when people show up at a website, many of them ignore our lovingly crafted navigational elements and jump to the site search box. The increased use of site search as a core navigation method makes it very important to understand the data that site search generates.

Distribution

Where does attention flow from? Social media? A mention is great, but if no attention flows over that link to your content, then it might be a misleading metric. Are people sharing your content? What topics and content get shared the most?

Again, this comes back to understanding the audience, both what they’re talking about and what actions they take as a result. In “Digital Marketing Analytics: Making Sense Of Consumer Data”, the authors recommend creating a “learning agenda”. Rather than just looking for mentions and volume of mentions, focus on specific brand or service attributes. Think about the specific questions you want answered by visitors, as if those visitors were sitting in front of you.

For example, how are consumers reacting to prices in your niche? What are their complaints? What do they wish would happen? Are people talking negatively about something? Are they talking positively about something? Who are the new competitors in this space?

Those are pretty rich signals. We can then link this back to content by addressing those issues within our content.

Design Thinking

Sep 16th
posted in

One of the problems with analysing data is the potential to get trapped in the past, when we could be imagining the future. Past performance may be no indication of future success, especially when it comes to Google’s shifting whims.

We see problems, we devise a solution. But projecting forward by measuring the past, and coming up with “the best solution” may lead to missing some obvious opportunities.

Design Thinking

In 1972, psychologist, architect and design researcher Bryan Lawson created an empirical study to understand the difference between problem-based solvers and solution-based solvers. He took two groups of students – final year students in architecture and post-graduate science students – and asked them to create one-story structures from a set of colored blocks. The perimeter of the building was to optimize either the red or the blue color; however, there were unspecified rules governing the placement and relationship of some of the blocks.

Lawson found that:

The scientists adopted a technique of trying out a series of designs which used as many different blocks and combinations of blocks as possible as quickly as possible. Thus they tried to maximize the information available to them about the allowed combinations. If they could discover the rule governing which combinations of blocks were allowed they could then search for an arrangement which would optimize the required color around the design. By contrast, the architects selected their blocks in order to achieve the appropriately colored perimeter. If this proved not to be an acceptable combination, then the next most favorably colored block combination would be substituted and so on until an acceptable solution was discovered.

Nigel Cross concludes from Lawson's studies that "scientific problem solving is done by analysis, while designers problem solve through synthesis".

Design thinking tends to start with the solution, rather than the problem. A lot of problem-based thinking focuses on finding the one correct solution to a problem, whereas design thinking tends to offer a variety of solutions around a common theme. It’s a different mindset.

One of the criticisms of Google, made by Google’s former design leader Douglas Bowman, was that Google were too data-centric in their decision making:

When a company is filled with engineers, it turns to engineering to solve problems. Reduce each decision to a simple logic problem. Remove all subjectivity and just look at the data...that data eventually becomes a crutch for every decision, paralyzing the company and preventing it from making any daring design decisions…

There’s nothing wrong with being data-driven, of course. It’s essential. However, if companies only think in those terms, then they may be missing opportunities. If we imagine “what could be”, rather than looking at “what was”, opportunities present themselves. Google realise this, too, which is why they have Google X, a division devoted to imagining the future.

What search terms might people use that don’t necessarily show up on keyword mining tools? What search terms will people use six months from now in our vertical? Will customers contact us more often if we target them this way, rather than that way? Does our copy connect with our customers, or just search engines? Given Google is withholding more search referral data, making it harder to target keywords, adding some design thinking to the mix, if you don’t do so already, might prove useful.

Tools For Design Thinking

In the book, Designing For Growth, authors Jeanne Liedtka and Tim Ogilvie outline some tools for thinking about opportunities and business in ways that aren’t data-driven. One famous proponent of the intuitive, design-led approach was, of course, Steve Jobs.

It's really hard to design products by focus groups. A lot of times, people don't know what they want until you show it to them

The iPhone or iPad couldn’t have been designed by looking solely at the past. They mostly came about because Jobs had an innate understanding of what people wanted. He was proven right by the resulting sales volume.

Design starts with empathy. It forces you to put yourself in the customer's shoes. It means identifying real people with real problems.

In order to do this, we need to put past data aside and watch people, listen to people, and talk with people. The simple act of doing this is a rich source of keyword and business ideas because people often frame a problem in ways you may not expect.

For example, a lot of people see stopping smoking as a goal-setting issue, like a fitness regime, rather than a medical issue. Advertising copy based around medical terminology and keywords might not work as well as copy oriented around goal setting and achieving physical fitness. This shift in the frame of reference certainly conjures up an entirely different world of ad copy, and possibly keywords, too. That different frame might be difficult to determine from analytics and keyword trends alone, but might be relatively easy to spot simply by talking to potential customers.

Four Questions

Designing For Growth is worth a read if you’re feeling bogged down in data and looking for new ways to tackle problems and develop new opportunities. I don’t think there’s anything particularly new in it, and it can come across as "the shiny new buzzword" at times, but the fundamental ideas are strong. I think there is value in applying some of these ideas directly to current SEO issues.

Designing For Growth recommends asking the following questions.

What is?

What is the current reality? What is the problem your customers are trying to solve? Xerox solved a problem customers didn’t even know they had when Xerox invented the fax machine. Same goes for the Polaroid camera. And the microwave oven. Customers probably couldn’t have described those things until they saw and understood them, but the problem would have been evident had someone looked closely at the problems they faced i.e. people really wanted faster, easier ways of completing common tasks.

What do your customers most dislike about the current state of affairs? About your industry? How often do you ask them?

One way of representing this information is with a flowchart. Map the current user experience from when they have a problem, to imagining keywords, to searching, to seeing the results, to clicking on one of those results, to finding your site, interacting with your site, to taking the desired action. Could any of the results or steps be better?

Usability tests use the same method. It’s good to watch actual customers as they do this, if possible. Conduct a few interviews. Ask questions. Listen to the language people use. We can glean some of this information from data mining, but there’s a lot more we can get by direct observation, especially when people don’t click on something, as non-activity seldom registers in a meaningful way in analytics.

What if?

What would “something better” look like?

Rather than think in terms of what is practical and the constraints that might prevent you from doing something, imagine what an ideal solution would look like if it weren’t for those practicalities and constraints.

Perhaps draw pictures. Make mock-ups. Tell a story. Anything that fires the imagination. Use emotion. Intuition. Feeling. Just going through such a process will lead to making connections that are difficult to make by staring at a spreadsheet.

A lot of usability testers create personas. These are fictional characters based on real or potential customers and are used to try to gain an understanding of what they might search for, what problems they are trying to solve, and what they expect to see on our site. Is this persona a busy person? Well educated? Do they use the internet a lot? Are they buying for themselves, or on behalf of others? Do they tend to react emotionally, or are they logical? What incentives would this persona respond to?

Personas tend to work best when they’re based on actual people. Watch and observe. Read up on relevant case studies. Trawl back through your emails from customers. Make use of story-boards to capture their potential actions and thoughts. Stories are great ways to understand motivations and thoughts.

What are those things your competition does, and how could they be better? What would those things look like in the best possible world, a world free of constraints?

What wows?

“What wows” is especially important for social media and SEO going forward.

Consider Matt Cutts' statement about frogs:

Those other sites are not bringing additional value. While they’re not duplicates they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank.
Google would seek to detect that there is no real differentiation between these results and show only one of them so we could offer users different types of sites in the other search results

Cutts talks about the creation of new value. If one site is saying pretty much the same as another site, then those sites may not be duplicates, but one is not adding much in the way of value, either. The new site may be relegated simply for being “too samey”.

It's the opposite of the Zynga path:

"I don't fucking want innovation," an anonymous ex-employee recalls Pincus saying in 2010, according to the SF Weekly. "You're not smarter than your competitor. Just copy what they do and do it until you get their numbers."

Generally speaking, up-and-coming sites should focus on wowing their audience with added depth and/or a new perspective. This, in turn, means having something worth remarking upon, which then attracts mentions across social media, and generates more links.

Is this certain to happen? Nothing is certain as far as Google is concerned. They could still bury you on a whim, but wowing an audience is a better bet than simply imitating long-established players using similar content and link structures. At some point, those long-established players had to wow their audience to get the attention and rankings they enjoy today. They did something remarkably different at some point. Instead of digging the same hole deeper, dig a new hole.

In SEO, change tends to be experimental. It’s iterative. We’re not quite sure what works ahead of time, and no amount of measuring the past tells us all we want to know, but we try a few things and see what works. If a site is not ranking well, we try something else, until it does.

Which leads us to….

What works?

Do searchers go for it? Do they do that thing we want them to do, which is click on an ad, or sign up, or buy something?

SEOs are pretty accomplished at this step. Experimentation in areas that are difficult to quantify - the algorithms - has been an intrinsic part of SEO.
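To make "what works" concrete, here's a minimal sketch of evaluating an on-page change with a two-proportion z-test, assuming hypothetical sign-up counts for an old and a new page variant (the numbers and function name are illustrative, not from the article):

```python
import math

def two_proportion_z(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled proportion under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the normal approximation
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_a, p_b, z, p_value

# Hypothetical numbers: 40/1000 sign-ups on the old page, 65/1000 on the new one
p_a, p_b, z, p = two_proportion_z(40, 1000, 65, 1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.4f}")
```

A small p-value suggests the change itself, not random noise, moved the needle - which is exactly the kind of directly measurable user-behaviour signal discussed below.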

The tricky part is that not everything works the same way everywhere and, much like modern health pathologies, Google builds clever delays into their algorithms:

Many modern public health pathologies – obesity, substance abuse, smoking – share a common trait: the people affected by them are failing to manage something whose cause and effect are separated by a huge amount of time and space. If every drag on a cigarette brought up a tumour, it would be much harder to start smoking and much easier to quit.

One site's rankings are more stable because another person can't get around the sandbox or their links get them penalized. The same strategy and those same links might work great for another site.

Changes in user behavior are more directly & immediately measurable than SEO.

Consider using change experiments as an opportunity to open up a conversation with potential users. “Do you like our changes? Tell us”. Perhaps use a prompt asking people to initiate a chat, or to participate in a poll. This kind of engagement has many benefits. It will likely prevent a fast click back, you get to see the words people use and how they frame their problems, and you learn more about them. You become more responsive and sympathetic to their needs.

Beyond Design Thinking

There’s more detail to design thinking, but, really, it’s mostly just common sense. Another framework to add, especially if you feel you’re getting stuck in faceless data.

Design thinking is not a panacea. It is a process, just as Six Sigma is a process. Both have their place in the modern enterprise. The quest for efficiency hasn't gone away and, in fact, in our economically straitened times, it's sensible to search for ever more rigorous savings anywhere you can.

What's best about it, I feel, is that this type of thinking helps break strategy and data problems down and gives them a human face.

In this world, designers can continue to create extraordinary value. They are the people who have, or could have, the laterality needed to solve problems, the sensing skills needed to hear what the world wants, and the databases required to build for the long haul and the big trajectories. Designers can be definers, making the world more intelligible, more habitable

The Benefits Of Thinking Like Google

Aug 27th

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google’s negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for the purpose of SEO attempt to do. Building links for the purposes of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building “for the traffic”, but if that were the case, they’d have no problem insisting those links were scripted so they could monitor traffic statistics, or at very least, no-followed, so there could be no confusion about intent.

How many do?

Think Like Google

Ralph Tegtmeier, in response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to downplay any SEO bias i.e. what we may like to see from an SEO standpoint, and instead try to look at the world from Google’s point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I’d see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can’t be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don’t. If anything, there is too much content, and a lot of it is junk. In fact, I’m getting more and more selective about the content I do show. So selective, in fact, that a lot of what I show above the fold is controlled and “published”, in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else’s position to understand their truth. If you do, you’ll soon realise that Google isn't the webmaster's friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would have seemed madness to suggest webmasters would pay to remove links, but that’s exactly what’s happening. Not only that, webmasters are doing Google's link quality control. For free. They’re pointing out the links they see as being “bad” - links Google’s algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner's perspective, taking links down involves a time cost, and there is no benefit to the site owner in doing so, especially if they receive numerous requests. Secondly, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I’d see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it’s probably not telling Google much Google doesn’t already know. There’s been no manual link review.
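For what it's worth, the disavow file itself is just a plain text list, one domain or URL per line, with `#` comments; a minimal example (domains and URLs hypothetical) looks something like this:

```text
# Contacted the owner of spamdomain1.example on 2013-07-01, no reply
domain:spamdomain1.example
# Owner of this page asked for payment to remove the link
http://spamdomain2.example/page-with-bad-link.html
```

The comments documenting removal attempts are precisely the evidence of "manual review" described above - which is why a raw dump from a link tool, with no such history, tells Google very little.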

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It’s the wrong question because it’s just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place, for the purposes of achieving higher rank, are going to need to be no-followed in order to be clear about intent. Extreme? What's going to be the emphasis in six months time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would have been unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure what you do on your site is absolutely none of Google’s business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you’re not compliant with their guidelines? Sure, they can. It’s their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at anytime whether you follow the current guidelines or not, especially when the goal-posts keep moving. So, the risk of not following the guidelines, and following the guidelines but not ranking well is pretty much the same - no traffic. Do you have a plan to address the “no traffic from Google” risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Google: Press Release Links

Aug 7th

So, Google have updated their Webmaster Guidelines.

Here are a few common examples of unnatural links that violate our guidelines:....Links with optimized anchor text in articles or press releases distributed on other sites.

For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text are “optimized” and therefore a violation of Google’s guidelines.

Ambiguously speaking, of course.

To put the press release change in context, Google’s guidelines state:

Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site

So, links gained, for SEO purposes - intended to manipulate ranking - are against Google Guidelines.

Google vs Webmasters

Here’s a chat...

In this chat, Google’s John Mueller says that, if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on links.

Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.

Simple, right? If it really is a press release, and not an attempt to link build for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?

He/she wouldn't.
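In markup terms, the fix is a single attribute; a press-release link that doesn't pass PageRank looks something like this (URL and anchor text hypothetical):

```html
<!-- A followed link: passes PageRank, so Google may read it as a ranking signal -->
<a href="http://example.com/wedding-rings">cheap wedding rings</a>

<!-- The same link no-followed: tells Google not to count it for ranking -->
<a href="http://example.com/wedding-rings" rel="nofollow">cheap wedding rings</a>
```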

But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.

They're missing the point.

Am I Taking Google’s Side?

Not taking sides.

Just hoping to shine some light on a wider issue.

If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but sadly lacking in much SEO punditry - that Google is not on the webmasters side. Google is on Google’s side. Google often say they are on the users side, and there is certainly some truth in that.

However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more “aggressive” webmasters, Google’s behaviour could be characterized as “keep your friends close, and your enemies closer”.

This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click based advertising that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.

If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.

Why don't Google simply ignore press release links? Easy enough to do. Why go this route of making it public? After all, Google are typically very secret about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.

Big missiles heading your way.

Guideline Followers

The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.

If an SEO is not following the guidelines - guidelines that are always shifting - yet claim they do, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO

.....but it’s not unreasonable to expect a somewhat easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed it.

SEO is not the easiest career choice, huh.

One group that is likely to be happy about this latest Google push is legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:

I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.

Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results

However, I’m not sure that will mean press releases are seen as any more credible, as press releases have never enjoyed a stellar reputation pre-SEO, but it may thin the crowd somewhat, which increases an agency's chances of getting their client seen.

Guidelines Honing In On Target

One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:

we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?

….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.

It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.

Take a look at the Boston Globe, sold along with a boatload of content for a 93% decline. Quality content sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.

High quality content often has a high upfront production cost attached to it, and given measly web advertising rates, the high possibility of invisibility, and the likelihood of content getting scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?

To not do so is also risky.

Even eHow, well known for cheap factory line content, is moving toward subscription membership revenues.

The Somewhat Bigger Question

Google can move the goal-posts whenever they like. What you’re doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.

Do you have a contingency plan for such an eventuality?

Johnon puts it well:

The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic.

To ask about the minutiae of Google’s policies and guidelines is to miss the point. The real question is: how prepared are you for when Google shuts off your flow of traffic because they’ve reset the goal-posts?

Focusing on the minutiae of Google's policies is, indeed, to miss the point.

This is a question of risk management. What happens if your main site, or your client's site, falls foul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don’t rank, isn’t that effectively the same as a ban i.e. you’re invisible? Do you treat search traffic as a bonus, rather than the main course?

Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.
