Media.net Review (2014 Update)

Jun 1st

General Review

Many smaller ad networks have a huge fall off, where if you earned 50 cents or a dollar a click with AdSense, you'd see penny and nickel clicks. Thankfully Media.net is nothing like that & they are perhaps the best network at competing with AdSense on an eCPM basis. Their interface is quite easy to use, both in terms of creating & customizing new ad units and in tracking performance reports.
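For reference, eCPM is just earnings normalized per 1,000 impressions. A minimal sketch with hypothetical numbers:

```python
# eCPM (effective cost per mille) = earnings per 1,000 impressions; the
# standard yardstick for comparing ad networks. All numbers are hypothetical.
earnings = 12.50    # dollars earned by an ad unit
pageviews = 4000    # impressions that unit received
ecpm = earnings / pageviews * 1000
print(f"eCPM: ${ecpm:.2f}")  # eCPM: $3.12
```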

In most markets Media.net won't vastly outperform AdSense, but it is certainly worth testing & may do far better than one would expect, particularly in light of how weakly the Yahoo! Publisher Network performed against AdSense many years ago. I've read some Media.net reviews which were a bit negative, but many of those were from people earning under $1 a day or so. Since Media.net lacks Google's advertising network depth & scale, they try to offset that with tighter ad integration, using a manually intensive work process to help the ads match the look and feel of your sites. It is worth noting that unless you have a decent amount of scale they probably won't be able to justify spending a lot of resources on custom ad integration for your website.

Great Features

  • Competitive eCPM when compared against AdSense in many categories.
  • Can be used in conjunction with AdSense.
  • Has some standard ad unit sizes & some that are custom, which gives you flexibility in terms of integrating them in typical ad spots and in terms of having units which look different than common ones and thus have greater eye appeal than a standard 468x60 or 728x90 banner.
  • Leverages the Yahoo! Bing Network, which gives it a fairly decent advertiser base & network scale to tap into to ensure there are relevant ads for most topics. I believe one thing that has helped them do so well is Microsoft has done a much better job on pricing click quality than many ad networks did in years past.
  • Since they are a smaller company than Google, their partner communications are much clearer. You don't have to pull down millions of dollars a year to be considered a valued partner.
  • Their customer support team not only communicates clearly with publishers, but also works to help improve ad integration.
  • Once your account has been established and they see strong traffic quality they are generally quite quick at approving any additional sites you add to your account.
  • In addition to offering contextual ads, Media.net has a partnership to serve Google display ads on their network (though publishers have to sign up with Google).
  • While earning statistics are not real-time, they provide them the following day.
  • Fast Net-30 payouts.

Drawbacks

The main drawbacks would be:

  • They require English as your primary language & that your site receives the majority of its traffic from the United States, Canada, and the United Kingdom. If you operate outside those markets, then they wouldn't be a great fit at the moment (though who knows where they may be in a couple years as Bing gets more aggressive with international expansion of their ad network).
  • It can take a while to get a new account approved, so it is worth applying early to have some experience with their network and to have a backup in place in case anything should happen to your AdSense account.
  • Inability to split test units (though if you are doing enough volume your customer support person will help set up and implement a split test for you).
  • While they do offer statistics on a per-site, per-day & per-ad unit basis (along with pageview stats), they currently do not offer data down to the page or keyword level. They provide data on earnings, pageviews & eCPM; but they currently do not provide click or CPC data. (I believe they will be adding more granular metrics fairly soon).

>>> Sign up to activate your Media.net publisher account today.

Growing An SEO Business By Removing Constraints

May 31st

If you run an SEO business, or any service business, you’ll know how hard it can be to scale up operations. There are many constraints that need to be overcome in order to progress.

We’ll take a look at a way to remove barriers to growth and optimize service provision using the Theory Of Constraints. This approach proposes a method to identify the key constraints to performance which hinder growth and expansion.

The Theory Of Constraints has long been used for optimizing manufacturing...

We had no legs to stand on to maintain our current customer base let alone acquire and keep new business. This was not an ideal position to be in, particularly in a down economy when we couldn’t afford to have sales reduce further

... but more recently, it’s been applied to services, too.

The results were striking. The number of days to decide food stamp eligibility dropped from 15 to 11; phone wait times were reduced from 23 minutes to nine minutes. Budgetary savings have exceeded the $9 million originally cut

It’s one way of thinking about how to improve performance by focusing on bottlenecks. If you’re experiencing problems such as being overworked and not having enough time, it could offer a solution.

First we’ll take a look at the theory, then apply it to an SEO agency. It can be applied to any type of business, of course.

Theory Of Constraints

Any manageable system is limited in achieving more of its goals by a very small number of constraints. There is always at least one constraint, and TOC uses a focusing process to identify the constraint and restructure the rest of the organization around it

If there weren’t constraints, you could grow your business as large and as fast as you wanted.

You can probably think of numerous constraints that prevent you from growing your business. However, the theory holds that most constraints are really side issues, and that organizations are constrained by only one constraint at any one time.

A constraint is characterized as the “weakest link”.

The weakest link holds everything else up. Once this constraint has been eliminated or managed, another “weakest link” may well emerge, so the process is repeated until the business is optimized. Constraints can be people, procedures, supplies, management, and systems.

In Dr. Eli Goldratt’s book, "The Goal", Goldratt lays out five steps to identify and address the constraint (a toy code sketch follows the list):

  • Identify the constraint
  • Exploit the constraint
  • Subordinate everything else to the constraint
  • Elevate the constraint
  • Go back to step 1
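As a minimal illustration of why this focusing process works, here is a toy pipeline model (a sketch only; the stage names and capacities are hypothetical). Throughput is capped by the slowest stage, so raising capacity anywhere other than the constraint changes nothing:

```python
# Toy model of the five focusing steps: a pipeline's throughput equals its
# slowest stage (the constraint). All stage capacities are hypothetical.
stages = {"sales": 12, "seo_work": 5, "reporting": 9}  # projects/month per stage

for cycle in range(3):
    bottleneck = min(stages, key=stages.get)      # 1. identify the constraint
    print(f"cycle {cycle}: constraint={bottleneck}, "
          f"throughput={min(stages.values())} projects/month")
    stages[bottleneck] += 3   # 2-4. exploit, subordinate & elevate that stage
# 5. repeat: notice the constraint eventually shifts to a different stage
```

Run it and the constraint moves from seo_work to reporting after a couple of cycles, which is exactly why step five sends you back to step one.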
1. Identify The Constraint

    What is the biggest bottleneck that holds back company performance? What activity always seems to fall behind schedule, or take the most time? This activity might not be the main activity of the company. It could be administrative. It could be managerial.

    If you’re not sure, try the “Five Whys” technique to help determine the root cause:

    By repeatedly asking the question “Why” (five is a good rule of thumb), you can peel away the layers of symptoms which can lead to the root cause of a problem. Very often the ostensible reason for a problem will lead you to another question. Although this technique is called “5 Whys,” you may find that you will need to ask the question fewer or more times than five before you find the issue related to a problem

    2. Exploit The Constraint

Once the constraint is identified, you then utilize the constraint to its fullest, i.e. you make sure the constraint is working at maximum performance. What is preventing the constraint from working at maximum performance?

    If the constraint is staff, you might look at ways for people to produce more work, perhaps by automating some of their workload, or allocating less-essential work to someone else. It could involve more training. It could involve adopting different processes.

    3. Subordinate Everything Else To The Constraint

    Identify all the non-constraints that may prevent the constraint from working at maximum performance. These might be activities or processes the constraint has to undertake but aren’t directly related to the constraint.

For example, a staff member who is identified as a constraint might have a billing task that could either be automated or allocated to someone else.

The constraint should not be limited by anything outside their control. The constraint can’t do any more than it possibly can, i.e. if your constraint is time, you can’t have someone work more than 24 hours in a day! More practically, 8 hours a day.

    Avoid focusing on non-constraints. Optimizing non-constraints might feel good, but they won’t do much to affect overall productivity.

    4. Elevate The Constraint

Improve the productivity of the whole system by lifting the performance of the constraint. Once you’ve identified the constraint, and what is limiting its performance, you typically find spare capacity emerges. You then increase the workload. The productivity of the entire company is now lifted. Only then would you hire an additional person, if necessary.

    5. Repeat

    The final step is to repeat the process.

    The process is repeated because the weakest link may now move to another area of the business. For example, if more key workers have been hired to maximize throughput, then the constraint may have shifted to a management level, because the supervisory workload has increased.

    If so, this new constraint gets addressed via the same process.

    Applying The Theory Of Constraints To An SEO Agency

    Imagine Acme SEO Inc.

Acme SEO are steadily growing their client base and have been meeting their clients’ demands. However, they’ve noticed projects are taking longer and longer to finish. They’re reluctant to take on new work, as it appears they’re operating at full capacity.

    When they sit down to look at the business in terms of constraints, they find that they’re getting the work, they’re starting the work on time, but the projects slow down after that point. They frequently rush to meet deadlines, or miss them. SEO staff appear overworked. If the agency can’t get through more projects, then they can’t grow. Everything else in the business, from the reception to sales, depends on it. Do they just hire more people?

    They apply the five steps to define the bottleneck and figure out ways to optimize performance.

    Step One

Identify the constraint. What is the weakest link? What limits the SEO business from doing more work? Is it the employees? Are they skilled enough? How about the systems they are using? Is there anything getting in the way of them completing their job?

    Try asking the Five Whys to get to the root of the problem:

    1. Why is this process taking so long? Because there is a lot of work involved.
    2. Why is there a lot of work involved? Because it’s complex.
    3. Why is it complex? Because there is a lot of interaction with the client.
4. Why is there a lot of interaction with the client? Because they keep changing their minds.
5. Why do they keep changing their minds? Because they’re not clear about what they want.

    Step Two

    Exploiting the constraint. How can the SEO work at maximum load?

    If an SEO isn’t doing as much as they could be, is it due to project management and training issues? Do people need more direct management? More detailed processes? More training?

    It sounds a bit ruthless, especially when talking about people, but really it’s about constructively dealing with the identified bottlenecks, as opposed to apportioning blame.

In our example, the SEOs had the skills necessary, and worked hard, but the clients kept changing scope, which led to a lot of rework and administrative overhead.

    Once that constraint had been identified, changes were made to project management, eliminating the potential for scope creep after the project had been signed off, thus helping maximize the throughput of the worker.

    Step Three

Subordinate everything else to the constraint. So, the process has been identified as the cause of the constraint. By redesigning the process to control scope creep before the SEO starts, say at a sales level, they free up more time. When the SEO works on the project, they’re not having to deal with administrative overhead that has a high time cost, therefore their utility is maximized.

    The SEO is now delivering more forward momentum.

    Step Four

    Elevate the performance of the constraint. They monitor the performance of the SEO. Does the SEO now have spare capacity? Is the throughput increasing? Have they done everything possible to maximize the output? Are there any other processes holding up the SEO? Should the SEO be handling billing when someone else could be doing that work? Is the SEO engaged in pre-sales when that work could be handled by sales people?

    Look for work being done that takes a long time, but doesn’t contribute to output. Can these tasks be handed to someone else - someone who isn’t a constraint?

    If the worker is working at maximum utility, then adding another worker might solve the bottleneck. Once the bottleneck is removed, performance improves.

Adding bodies is the common way a service-based industry, like SEO, scales up. A consultancy bills hours, and the more bodies, the more hours they can bill. However, if the SEO role is optimized to start with, then they might find they have spare capacity opening up, so they don’t need as many new hires.

    Step Five

    Repeat.

Goldratt stressed that using the Theory Of Constraints to optimize a business is an ongoing task. You identify and address the constraint - which may not necessarily be the most important aspect of the business, i.e. it could be office space - and doing so likely shifts the weakest link to another point. You then optimize that point, and so on. Fixing the bottleneck is just the beginning of a process.

    It’s also about getting down to the root of the problem, which is why the Five Whys technique can be so useful. Eliminating a bottleneck sounds simple, and a quick fix, but the root of the problem might not be immediately obvious.

In our example, it appeared as though the staff were the problem, so the root cause could be misdiagnosed as “we need more staff”. In reality, the root cause of the bottleneck was a process problem.

Likewise, some problems that appear tied to an employee on a specific project might actually be tied to the specific client rather than anything internal to your company. Some people are never happy & will never be satisfied no matter what you do. Probably the best way to deal with people who are never satisfied is to end those engagements early, before they have much of an impact on your business. The best way to avoid such relationships in the first place is to have some friction upfront so that those who contact you are serious about working with you.

It can also be beneficial to have some of your own internal sites to fall back on, such that when consulting inquiries are light you do not chase revenue at the expense of lower margins from clients who are not a good fit. These internal projects also give you the flexibility to deal with large updates by being able to push some of your sites off into the background while putting out any fires that emerge from the update. And those sorts of sites give you a testing platform to further inform your strategy with client sites.

    How have you addressed business optimization problems? What techniques have you found useful, and how well did they work?

    Further Resources:

I’ve skimmed across the surface, but there’s a lot more to it. Here are some references used in the article, and further reading...

    Advanced Web Ranking Review - Website Auditor

    May 30th

    Advanced Web Ranking (AWR) is one of my favorite pieces of SEO software on the market today. It has been indispensable to me over the years. The software does it all and then some.

I reviewed it a few years ago; you can read that here. Most of it is still relevant and I'll be updating it in the near future. In this post I want to highlight their Website Auditor tool.

    Combining On and Off Page Factors

The beauty of this feature is the simple integration of on and off-page elements. There are other tools on the market that focus solely on the on-page stuff (and do a fantastic job of it), and AWR covers that ground as well.

    The all-in-one nature of Advanced Web Ranking allows you to deftly move between the on and off (links, social, etc) page factors for a site (and its competition) inside of the Website Auditor feature. AWR has other tools built-in to go even deeper on competitive analysis as well.

    A quick FYI on some general settings and features:

    • You can crawl up to 10,000 pages on-demand
    • All results are exportable
    • Audits are saved so you can look at historical data trends
    • Complete white-label reporting is available
    • Because it's software it's all you can eat :) (save for the page limit)

    You can also set the tool to crawl only certain sections of a site as well as completely ignore certain sections or parameters so you can make the best use of your 10,000 page-crawl limit. This is a nice way to crawl a specific section of a site to find the most "social" content (limit the crawl to /blog as an example).

    Interface Overview

    Here's what the initial interface looks like:

[Screenshot: AWR Website Auditor interface overview]

    It's a thick tool for sure, on the whole, but just focus on the Auditor piece. It's fairly self-explanatory but the top toolbar (left to right) shows:

    • Current site being viewed
    • Update date history for historical comparison
• Filtering options (all pages or only specific pages: 200s, 404s, missing title tags; basically all the data points are available for slicing and dicing)
    • Button for on-page issues to show in the view area
    • Button for page-level external link data to show in the view area
    • Button for page-level social metrics (Twitter, Facebook, G+) to show in the view area
    • Update Project button (to update the Audit :D )
    • Text box where you can filter the results manually
    • Auditor settings (see below)
• Link data source; Open Site Explorer for now (Majestic is available in other areas of AWR and I'm told it will be available in Website Auditor as another option in the next release, 9.6, due out very soon)

The tool settings button allows you to configure many areas of the Auditor tool to help get the exact data you want:

[Screenshot: AWR Website Auditor tool settings]

    On-Page and Off-Page Data Points

The on-page overview gives you all of what is listed in the viewport shown previously, and if you click on the Filter icon you'll be able to look at whatever piece of on-page data you'd like:

[Screenshot: AWR Website Auditor page filters]

    I did just a short crawl here in order to show you how your data will look inside the tool. The view of the initial on-page report shows your traditional items such as:

    • Title tag info
    • Meta descriptions
    • Duplicate content
    • Robots and indexing information
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    Each page can be clicked on to show specific information about that page:

    • Links from the page to other sites
    • Internal links to the page
    • Broken links
• External links pointing into the page with anchor text data, Page Authority, and MozRank. Whether the link is no-follow or an image is also shown
    • Broken link and external link counts
    • Levels deep from the root
    • HTTP Status Code

    The on-page overview is also referred to as the Issues Layout:

[Screenshot: AWR Website Auditor Issues Layout]

    The other 2 views are more of a mix of on-page and off-page factors.

    The Links Layout shows the following (for the root domain and for the sub-pages individually):

    • Levels deep from the homepage
    • Page Authority
    • MozRank
    • Linking Root Domains
    • Total Inbound Links
    • Outbound Links
    • No-follows
    • Inbound and Outbound Internal Links

[Screenshot: AWR Website Auditor Links Layout]

    In this view you can click on any of the crawled pages and see links to the page internally and externally as well as broken links.

    The Social Layout shows the following information:

    • Facebook Shares, Twitter Shares, and Google +1's for a given URL
    • Internal and external links to the page
    • Indexed or not
    • HTTP Status
    • Meta information
    • Broken Links

[Screenshot: AWR Website Auditor Social Layout]

This data is helpful for finding content ideas, analyzing a competitor's content/social strategy, and finding possible influencers to target in a link building/social awareness campaign for your site.

    Reporting and Scheduling

    Currently you can provide white label PDF/interactive HTML reports for the following:

    • Issues Layout
    • Link Layout
    • Social Layout

    You can also do a quick export from the viewport window inside the Website Auditor tab to get either an HTML/PDF/CSV export of the data you are looking at (list of link issues, social stats, on-page issues, and so on).

    Reports can be scheduled to run automatically so long as the computer AWR resides on is on and functional. You could also remote in with a service like LogMeIn to run an update remotely or use the AWR server plan where you host the AWR application on one machine and remote client machines (staff as an example) can connect to the shared database and make an update or run a report if needed.

    Advanced Web Ranking's Website Auditor is one of the most robust audit tools on the market and soon it will have integration with Majestic SEO (currently it ties into OpenSiteExplorer/Linkscape). It already pulls in social metrics from Twitter, Facebook, and G+ to give you a more comprehensive view of your site and your content.

    If you conduct technical audits or do competitive analysis you should give AWR a try, I think you'll like it :)


    LarryWorld

    May 29th

    It’s hard to disagree with Larry Page.

In his recent speech at Google I/O, Page talked about privacy and how it impairs Google. “Why are people so focused on keeping their medical history private?” If only people would share more, then Google could do more.

    Well, quite.

    We look forward to Google taking the lead in this area and opening up their systems to public inspection. Perhaps they could start with the search algorithms. If Google would share more, publishers could do more.

    What’s not to like? :)

    But perhaps that’s comparing apples with oranges. The two areas may not be directly comparable as the consequences of opening up the algorithm would likely destroy Google’s value. Google’s argument against doing so has been that the results would suffer quality issues.

    Google would not win.

    TechnoUtopia

    If Page's vision sounds somewhat utopian, then perhaps we should consider where Google came from.

In a paper entitled “The Politics Of Search: A Decade Retrospective”, Laura Granka points out that when Google started out, the web was a more utopian place.

A decade ago, the Internet was frequently viewed through a utopian lens, with scholars predicting that this increased ability to share, access, and produce content would reduce barriers to information access...Underlying most of this work is a desire to prevent online information from merely mimicking the power structure of the conglomerates that dominate the media landscape. The search engine, subsequently, is seen as an idealized vehicle that can differentiate the Web from the consolidation that has plagued ownership and content in traditional print and broadcast media

    At the time, researchers Introna and Nissenbaum felt that online information was too important to be shaped by market forces alone. They correctly predicted this would lead to a loss of information quality, and a lack of diversity, as information would pander to popular tastes.

    They advocated, perhaps somewhat naively in retrospect, public oversight of search engines and algorithm transparency to correct these weaknesses. They argued that doing so would empower site owners and users.

    Fast forward to 2013, and there is now more skepticism about such utopian values. Search engines are seen as the gatekeepers of information, yet they remain secretive about how they determine what information we see. Sure, they talk about their editorial process in general terms, but the details of the algorithms remain a closely guarded secret.

In the past decade, we’ve seen a considerable shift in power away from publishers and towards the owners of big data aggregators, like Google. Information publishers are expected to be transparent - so that a crawler can easily gather information, or a social network can be, well, social - and this has advantaged Google and Facebook. It would be hard to run a search engine or a social network if publishers didn't buy into this utopian vision of transparency.

    Yet, Google aren’t quite as transparent with their own operation. If you own a siren server, then you want other people to share and be open. But the same rule doesn’t apply to the siren server owner.

    Opening Up Health

    Larry is concerned about constraints in healthcare, particularly around access to private data.

    “Why are people so focused on keeping their medical history private?” Page thinks it’s because people are worried about their insurance. This wouldn’t happen if there was universal care, he reasons.

    I don’t think that’s correct.

    People who live in areas where there is universal healthcare, like the UK, Australia and New Zealand, are still very concerned about the privacy of their data. People are concerned that their information might be used against them, not just by insurance companies, but by any company, not to mention government agencies and their employees.

    People just don’t like the idea of surveillance, and they especially don’t like the idea of surveillance by advertising companies who operate inscrutable black boxes.

    Not that good can’t come from crunching the big data linked to health. Page is correct in saying there is a lot of opportunity to do good by applying technology to the health sector. But first companies like Google need to be a lot more transparent about their own data collection and usage in order to earn trust. What data are they collecting? Why? What is it used for? How long is it kept? Who can access it? What protections are in place? Who is watching the watchers?

Google goes some way towards providing transparency with their privacy policy. A lesser-known facility, called Data Liberation, allows you to move data out of Google, if you wish.

I’d argue that getting people to trust Google to the level Page demands would require a lot more rigor and transparency, including third party audit. There are also considerable issues to overcome, in terms of government legislation, such as privacy acts. Perhaps the most important question is “how does this shift power balances?” No turkey votes for an early Christmas. If your job relies on being a gatekeeper of health information, you're hardly going to hand that responsibility over to Google.

So, it’s not a technology problem. And it’s not just that people are afraid of insurance companies. And it’s not because people aren’t on board with the whole Burning-Man-TechnoUtopia vision. It’s to do with trust. People would like to know what they’re giving up, to whom, and what they’re getting in return. And it's about power and money.

    Page has answered some of the question, but not nearly enough of it. Something might be good for Google, and it might be good for others, but people want a lot more than just his word on it.

    Sean Gallagher writes in ArsTechnica:

    The changes Page wants require more than money. They require a change of culture, both political and national. The massively optimistic view that technology can solve all of what ails America—and the accompanying ideas on immigration, patent reform, and privacy—are not going to be so easy to force into the brains of the masses.

The biggest reason is trust. Most people trust the government because it's the government—a 226-year old institution that behaves relatively predictably, remains accountable to its citizens, and is governed by source code (the Constitution) that is hard to change. Google, on the other hand, is a 15-year old institution that is constantly shifting in nature, is accountable to its stockholders, and is governed by source code that is updated daily. You can call your Congressman and watch what happens in Washington on C-SPAN every day. Google is, to most people, a black box that turns searches and personal data into cash.

    And it may do so at their expense, not benefit.

    GoogleMart

    May 27th

    It was hard to spot, at first.

    It started with one store on the outskirts of town. It was big. Monolithic. It amalgamated a lot of cheap, largely imported stuff and sold the stuff on. The workers were paid very little. The suppliers were squeezed tight on their margins.

    And so it grew.

And as it grew, it hollowed out the high street. The high street could not compete with the monolith’s sheer power. They couldn’t compete with the monolith’s influence on markets. They couldn’t compete with the monolith’s unique insights gained from clever number crunching of big data sets.

I’m talking about Walmart, of course.

Love ‘em or loathe ‘em, Walmart gave people what they wanted, but in so doing, hollowed out a chunk of America's middle class. It displaced a lot of shopkeepers. It displaced small business owners on Main Street. It displaced the small family retail chain that provided a nice little middle class steady earner.

    Where did all those people go?

    It was not only the small, independent retail businesses and local manufacturers who were fewer in number. Their closure triggered flow-on effects. There was less demand for the services they used, such as local small business accountants, the local lawyer, small advertising companies, local finance companies, and the host of service providers that make up the middle class ecosystem.

    Where did they all go?

    Some would have taken up jobs at WalMart, of course. Some would become unemployed. Some would close their doors and take early retirement. Some would change occupations and some would move away to where prospects were better.

    What does any of this have to do with the internet?

    The same thing is happening on the internet.

    And if you’re a small business owner, located on the web-equivalent of the high street, or your business relies on those same small business owners, then this post is for you.

    Is Technology Gutting The Middle Class?

    I’ve just read “Who Owns The Future”, by Jaron Lanier. Anyone who has anything to do with the internet - and anyone who is even remotely middle class - will find it asks some pretty compelling questions about our present and future.

    Consider this.

At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When it was sold to Facebook for a billion dollars in 2012, Instagram only employed 13 people

    Great for Instagram. Bad for Kodak. And bad for the people who worked for Kodak. But, hey. That’s progress, right? Kodak had an outdated business model. Technology overtook them.

    That’s true. It is progress. It’s also true that all actions have consequences. The consequence of transformative technology is that, according to Lanier, it may well end up destroying the middle class if too much of the value is retained in the large technology companies.

Lanier suggests that the advance of technology is not creating as many jobs as it destroys, and those jobs that are destroyed are increasingly middle class.

    Not Political (Kinda)

    I don’t wish to make this post political, although all change is inherently political. I’m not taking political sides. This issue cuts across political boundaries. I have a lot of sympathy for technological utopian ideas and the benefits technology brings, and have little time for luddism.

However, it’s interesting to focus on the consequences of this shift in wealth and power brought about by technology, and whether enough people in the internet value chain receive adequate value for their efforts.

    If the value doesn't flow through, as capitalism requires in order to function well, then few people win. Are children living at home longer than they used to? Are people working longer hours than they used to in order to have the same amount of stuff? Has the value chain been broken, Lanier asks? And, if so, what can be done to fix it?

    What Made Instagram Worth One Billion Dollars?

    Lanier points out that Instagram wasn’t worth a billion dollars because it had extraordinary employees doing amazing things.

    The value of Instagram came from network effects.

    Millions of people using Instagram gave the Instagram network value. Without the user base, Instagram is just another photo app.

    Who got paid in the end? Not the people who gave the network value. The people who got paid were the small group at the top who organized the network. The owners of the "Siren Servers":

    The power rests in what Lanier calls the “Siren Servers”: giant corporate repositories of information about our lives that we have given freely and often without consent, now being used for huge financial benefit by a super-rich few

The value is created by all the people who make up the network, but they only receive a small sliver of that value in the form of a digital processing tool. To really benefit, you have to own, or get close to, a Siren Server.

    Likewise, most of Google’s value resides in the network of users. These users feed value into Google simply by using it and thereby provide Google with a constant stream of data. This makes Google valuable. There isn’t much difference between Google and Bing in terms of service offering, but one is infinitely more valuable than the other purely by virtue of the size of the audience. Same goes for Facebook over Orkut.

    You Provide Value

    Google are provided raw materials by people. Web publishers allow Google to take their work, at no charge, and for Google to use that work and add value to Google’s network. Google then charges advertisers to place their advertising next to the aggregated information.

    Why do web publishers do this?

    Publishers create and give away their work in the hope they’ll get traffic back, from which they may derive benefit. Some publishers make money, so they can then pay real-world expenses, like housing, food and clothing. The majority of internet publishers make little, or nothing, from this informal deal. A few publishers make a lot. The long tail, when it comes to internet publishing, is pretty long. The majority of wealth, and power, is centralized at the head.

    Similarly, Google’s users are giving away their personal information.

Every time someone uses Google, they are giving Google personal information of value. Their search queries. Their browsing patterns. Their email conversations. Their personal network of contacts. Aggregate that information together, and it becomes valuable information, indeed. Google records this information, crunches it looking for patterns, then packages it up and sells it to advertisers.

    What does Google give back in return?

    Web services.

    Is it a fair exchange of value?

    Lanier argues it isn’t. What’s more, it’s an exchange of value so one-sided that it’s likely to destroy the very ecosystem on which companies like Google are based - the work output, and the spending choices, of the middle class. If few of the people who publish can make a reasonable living doing so, then the quality of what gets published must decrease, or cease to exist.

People could make their money in other ways, including offline. However, consider that the web is affecting a lot of offline business already. The music industry is a faint shadow of what it once was, even as recently as a decade ago. There are a lot fewer middle class careers in the music industry now. Small retailers are losing out to the web. Fewer jobs there. The news industry is barely making any money. Same goes for book publishers. All these industries are struggling as online aggregators carve up their value chains.

    Now, factor in all the support industries of these verticals. Then think about all the industries likely to be affected in the near future - like health, or libraries, or education, for example. Many businesses that used to hire predominantly middle class people are going out of business, downsizing their operations, or soon to have chunks of their value displaced.

    It’s not Google’s aim to gut the middle class, of course. This post is not an anti-Google rant, either, simply a look at action and consequence. What is the effect of technology and, in particular, the effect of big technology companies on the web, most of whom seem obsessed with keeping you in their private, proprietary environments for as long as possible?

Google’s aim is to index all the world’s information and make it available. That’s a good aim. It’s a useful, free service. But Lanier argues that gutting the middle class is a side-effect of re-contextualising, and thereby devaluing, information. Information may want to be free, but the consequence of free information is that those creating the information may not get paid. Many of those who do get paid may be weaker organizations more willing to sacrifice editorial quality in order to stay in business. We already see major news sites with MFA-styled formatting on unvetted syndicated press releases. What next?

    You may notice that everyone is encouraged to “share” - meaning “give away” - but sharing doesn't seem to extend to the big tech companies, themselves.

    They charge per click.

    Robots.txt

    One argument is that if someone doesn’t like Google, or any search engine, they should simply block that search engine via robots.txt. The problem with that argument is it’s like saying if you don’t like aspects of your city, you should move to the middle of the wilderness. You could, but really you’d just like to make the city a better place to be, and to see it thrive and prosper, and be able to thrive within it.
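As an aside, the blocking mechanism itself is trivial. The sketch below uses Python's standard-library robots.txt parser to show the effect of a Googlebot-only block; the rules string is real robots.txt syntax, and everything else is just a demonstration harness:

```python
# Minimal sketch: what a Googlebot-only block in robots.txt actually does.
# The rules string is genuine robots.txt syntax; the point is that blocking
# is technically easy, while the argument above is about why it misses the point.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch("Googlebot", "/any-page"))  # False: Google is locked out
print(parser.can_fetch("Bingbot", "/any-page"))    # True: other crawlers are fine
```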

    Google provides useful things. I use Google, just like I use my iPhone. I know the deal. I get the utility in exchange for information, and this exchange is largely on their terms. What Lanier proposes is a solution that not only benefits the individual, and the little guy, but ultimately the big information companies, themselves.

    Money Go Round

    Technology improvements have created much prosperity and the development of a strong middle class. But the big difference today is that what is being commoditized is information itself. In a world increasingly controlled by software that acts as our interface to information, if we commoditize information then we commoditize everything else.

If those creating the information don’t get paid, quality must decrease, or become less available than it otherwise would be. They can buy less stuff in the real world. If they can’t buy as much stuff in the real world, then Google and Facebook’s advertisers have fewer people to talk to than they otherwise would.

    It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism. But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it’s a kind of capitalism that’s totally self-defeating because it’s so narrow. It’s a winner-take-all capitalism that’s not sustaining

    That isn’t a sustainable situation long-term. A winner-takes-all system centralizes wealth and power at the top, whilst everyone else occupies the long tail. Google has deals in place with large publishers, such as AP, AFP and various European agencies, but this doesn't extend to smaller publishers. It’s the same in sports. The very top get paid ridiculous amounts of money whilst those only a few levels down are unlikely to make rent on their earnings.

    But doesn’t technology create new jobs? People who were employed at Kodak just go do something else?

The latest waves of high tech innovation have not created jobs like the old ones did. Iconic new ventures like Facebook employ vastly fewer people than big older companies like, say, General Motors. Put another way, the new schemes ... channel much of the productivity of ordinary people into an informal economy of barter and reputation, while concentrating the extracted old-fashioned wealth for themselves. All activity that takes place over digital networks becomes subject to arbitrage, in the sense that risk is routed to whoever suffers lesser computation resources

    The people who will do well in such an environment will likely be employees of those who own the big data networks, like Google. Or they will be the entrepreneurial and adaptable types who manage to get close to them - the companies that serve WalMart or Google, or Facebook, or large financial institutions, or leverage off them - but Lanier argues there simply aren't enough of those roles to sustain society in a way that gave rise to these companies in the first place.

    He argues this situation disenfranchises too many people, too quickly. And when that happens, the costs spread to everyone, including the successful owners of the networks. They become poorer than they would otherwise be by not returning enough of the value that enables the very information they need to thrive. Or another way of looking at it - who’s going to buy all the stuff if only a few people have the money?

The network, whether it be a search engine, a social network, an insurance company, or an investment fund, uses information to concentrate power. Lanier argues they are all the same, as they operate in pretty much the same way. They use network effects to mine and crunch big data, and this, in turn, grows their position at the expense of smaller competitors, and the ecosystem that surrounds them.

    It doesn’t really matter what the intent was. The result is that the technology can prevent the middle class from prospering and when that happens, everyone ultimately loses.

    So What Does He Propose Can Be Done?

    A few days ago, Matt Cutts released a video about what site owners can expect from the next round of Google changes.

    Google have announced a web spam change, called Penguin 2.0. They’ll be “looking at” advertorials, and native advertising. They’ll be taking a “stronger line” on this form of publishing. They’ll also be “going upstream” to make link spammers less effective.

    Of course, whenever Google release these videos, the webmaster community goes nuts. Google will be making changes, and these changes may either make your day, or send you to the wall.

    The most interesting aspect of this, I think, is the power relationship. If you want to do well in Google’s search results then there is no room for negotiation. You either do what they want or you lose out. Or you may do what they want and still lose out. Does the wealth and power sit with the publisher?

    Nope.

    In other news, Google just zapped another link network.

Cutts warns they’ll be going after a lot more of this. Does wealth and power sit with the link buyer or seller?

    Nope.

Now, Google are right to eliminate or devalue sites that they feel devalue their search engine. Google have made search work. Search was all but dead twelve years ago due to the ease with which publishers could manipulate the results, typically with off-topic junk. The spoils of solving this problem have flowed to Google.

The question is: has too much wealth flowed to companies like Google, and is this situation going to kill off large chunks of the ecosystem on which it was built? Google isn’t just a player in this game, they’re so pervasive they may as well be the central planner. Cutts is running product quality control. The customers aren’t the publishers, they’re the advertisers.

It’s also interesting to note what these videos do not say. Cutts’ video was not about how your business could be more prosperous. It was all about your business doing what Google wants in order for Google to be more prosperous. It’s irrelevant whether you agree or not, as you don’t get to dictate terms to Google.

    That’s the deal.

Google’s concern lies not with webmasters, just as Walmart’s concern lies not with small town retailers. Their concern is to meet company goals and enhance shareholder value. The effects aren’t Google’s or Walmart’s fault. They are just that - effects.

    The effect of Google pursuing those objectives might be to gouge out the value of publishing, and in so doing, gouge out a lot of the value of the middle class. The Google self-drive cars project is fascinating from a technical point of view - the view Google tends to focus on - but perhaps even more fascinating when looked at from a position they seldom seem to consider, at least, not in public, namely what happens to all those taxi drivers, and delivery drivers, who get their first break in society doing this work? Typically, these people are immigrants. Typically, they are poor but upwardly mobile.

    That societal effect doesn't appear to be Google’s concern.

    So who’s concern should it be?

    Well, perhaps it really should be Google’s concern, as it’s in their own long-term best interest:

Today, a guitar manufacturer might advertise through Google. But when guitars are someday spun out of 3D printers, there will be no one to buy an ad if guitar designs are “free”. Yet Google’s lifeblood is information put online for free. That is what Google’s servers organize. Thus Google’s current business model is a trap in the long term

Lanier’s suggestion is that everyone gets paid, via micro-payments, linked back to the value they helped create. These payments continue so long as people are using their stuff, be it a line of code, a photograph, a piece of music, or an article.

    For example, if you wrote a blog post, and someone quoted a paragraph of it, you would receive a tiny payment. The more often you’re quoted, the more relevant you are, therefore the more payment you receive. If a search engine indexes your pages, then you receive a micro-payment in return. If people view your pages, you receive a micro-payment. Likewise, when you consume, you pay. If you conduct a search, then you run Google’s code, and Google gets paid. The payments are tiny, as far as the individual is concerned, but they all add up.
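As a toy illustration of the two-sided accounting Lanier describes (the participants and the per-use rate below are invented for the example):

```python
# Toy two-way micro-payment ledger in the spirit of Lanier's proposal: every
# use of someone's work credits its author and debits the consumer. The
# participants and the per-use rate are hypothetical.
from collections import defaultdict

ledger = defaultdict(float)

def use_content(consumer, author, rate=0.001):
    ledger[consumer] -= rate   # the consumer pays a tiny amount...
    ledger[author] += rate     # ...and the credited author accrues it

use_content("reader", "blogger")   # someone quotes your blog post
use_content("blogger", "google")   # you run a search, so Google's code gets paid
print(dict(ledger))                # tiny, incremental payments that add up
```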

    Mammoth technical issues of doing this aside, the effect would be to take money from the head and pump it back into the tail. It would be harder to build empires off the previously free content others produce. It would send money back to producers.

    It also eliminates the piracy question. Producers would want people to copy, remix and redistribute their content, as the more people that use it, the more money they make. Also, with the integration of two-way linking, the mechanism Lanier proposes to keep track of ownership and credit, you’d always know who is using your content.

    Information would no longer be free. It would be affordable, in the broadest sense of the word. There would also be a mechanism to reward the production, and a mechanism to reward the most relevant information the most. The more you contribute to the net, and the more people use it, the more you make. Tiny payments. Incremental. Ongoing.

    Interesting Questions

So, if these questions are of interest to you, I’d encourage you to read “Who Owns The Future” by Jaron Lanier. It’s often rambling - in a good way - and heads off on wild tangents - also in a good way - and you can tell there is a very intelligent and thoughtful guy behind it all. He’s asking some pretty big, relevant questions. His answers are sketches that should be challenged, argued, debated and enlarged.

    And if big tech companies want a challenge that really will change the world, perhaps they could direct all that intellect, wealth and power towards enriching the ecosystem at a pace faster than they potentially gouge it.

    Why the Yahoo! Search Revenue Gap Won't Close

In spite of Yahoo! accepting revenue guarantees for another year from Microsoft, recently there has been speculation that Yahoo! might want to get out of their search ad deal with Microsoft. I am uncertain if the back-channeled story is being used as leverage to secure ongoing minimum revenue agreements, or if Yahoo! is trying to set the pretext narrative to later be able to push through a Google deal that might otherwise get blocked by regulators.

When mentioning Yahoo!'s relative under-performance on search, it would be helpful to point out the absurd amount of their "search" traffic from the golden years that was various forms of arbitrage. Part of the reason (likely the primary reason) Yahoo! took such a sharp nose dive in terms of search revenues (from $551 million per quarter to as low as $357 million per quarter) was that Microsoft used quality scores to price down the non-search arbitrage traffic streams & a lot of that incremental "search" volume went away.

There were all sorts of issues in place that are rarely discussed. Exit traffic, unclosable windows, forcing numerous clicks, iframes in email spam, raw bot clicks, etc. ... and some of this was tied to valuable keyword lists or specific juicy keywords. I am not saying that Google has outright avoided all arbitrage (Ask does boatloads of it in paid + organic & Google at one point tested doing some themselves on credit card keywords) but it has generally been a sideshow at Google, whereas it was the main attraction at Yahoo!.

    And that is what drove down Yahoo!'s click prices.

Yahoo! went from almost an "anything goes" approach to their ad feed syndication, to the point where they made a single syndication partner, Cyberplex's Tsavo Media, pay them $4.8 million for low quality traffic. There were a number of other clawbacks that were not made public.

Given that we are talking about $4.8 million for a single partner, & this alleged overall revenue gap between Google AdWords & Bing Ads is somewhere in the $100 million or so range, these traffic quality issues & Microsoft cleaning up the whoring of the ad feed by Yahoo! partners are a big deal. It had a big enough impact that it caused some of the biggest domain portfolios to shift from Yahoo! to Google. I am a bit surprised to see it so rarely mentioned in these discussions.

    Few appreciate how absurd the abuses were. For years Yahoo! not only required you to buy syndication (they didn't have a Yahoo!-only targeting option until 2010 & that only came about as a result of a lawsuit) but even when you blocked a scammy source of traffic, if that scammy source was redirecting through another URL you would have no way of blocking the actual source, as mentioned by Sean Turner:

To break it down, yahoo gives you a feed for seobook.com & you give me a feed for turner.com. But all links that are clicked on turner.com redirect through seobook.com so that it shows up in customer logs as seobook.com. If you block seobook.com, it will block ads from seobook.com, but not turner.com. The blocked domain tool works on what domains display, not on where the feed is redirected through. So if you are a customer, there is no way to know that turner.com is sending traffic (since it’s redirecting through seobook.com) and no way to block it through seobook.com since that tool only works on the domain that is actually displaying it.

    I found it because we kept getting traffic from gogogo.com. We had blocked it over and over and couldn’t figure out why they kept sending us traffic. We couldn’t find our ad on their site. I went to live.com and ran a site:gogogo.com search and found that it indexed some of those landing pages that use gogogo.com as a monetization service.
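Outside of the ad network's own reporting, tracing a redirect chain hop by hop is straightforward; here is a hedged sketch using Python's requests library (the URL is hypothetical):

```python
# Sketch: follow a click-through redirect chain to reveal each intermediate
# domain, not just the one a blocked-domain tool displays. URL is hypothetical.
import requests

resp = requests.get("http://example.com/click?id=123", allow_redirects=True)
for hop in resp.history:              # each intermediate 30x response
    print(hop.status_code, hop.url)
print(resp.status_code, resp.url)     # the final landing page
```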

    The other thing that isn't mentioned is the longterm impact of a Yahoo! tie up with Google. Microsoft pays Yahoo! an 88% revenue share (and further guarantees on top of that), provides the organic listings free, manages all the technology, and allows Yahoo! to insert their own ads in the organic results.

    If Bing were to exit the online ad market, maybe Yahoo! could make an extra $100 million in the first year of an ad deal with Google, but if there is little to no competition a few years down the road, then when it comes time for Yahoo! to negotiate revenue share rates with Google, you know Google would cram down a bigger rake.
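To put rough numbers on that leverage argument (only the 88% Microsoft revenue share is an actual figure from the deal; every other number below is invented for illustration):

```python
# Back-of-envelope sketch of the negotiating-leverage argument. Only the 88%
# Microsoft revenue share is a real figure; all other numbers are hypothetical.
gross = 1000.0                     # $M of annual gross search ad revenue
with_bing = gross * 0.88           # current guaranteed share from Microsoft
google_year_one = with_bing + 100  # the speculated first-year lift from Google
google_later = gross * 0.70        # a crammed-down share once competition exits
print(with_bing, google_year_one, google_later)  # 880.0 980.0 700.0
```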

    This isn't blind speculation or theory, but aligned with Google's current practices. Look no further than Google's current practices with YouTube, where "partners" are paid different rates & are forbidden to mention their rates publicly: "The Partner Program forbids participants to reveal specifics about their ad-share revenue."

    Transparency is a one way street.

Google further dips into leveraging that "home team always wins" mode of negotiating rates by directly investing in some of the aggregators/networks which offer sketchy confidential contracts soaking the original content creators (http://obviouslybenhughes.com/post/13933948148/before-you-sign-that-machinima-contract-updated):

    As I said, the three images were posted on yfrog. They were screenshots of an apparently confidential conversation had between MrWonAnother and a partner support representative from Machinima, in which the representative explained that the partner was locked indefinitely into being a Machinima partner for the rest of eternity, as per signed contract. I found this relevant, informative and honestly shocking information and decided to repost the images to obviouslybenhughes.com in hopes that more people would become aware of the darker side of YouTube partnership networks.

    Negotiating with a monopoly that controls the supply chain isn't often a winning proposition over the long run.

    Competition (or at least the credible risk of it) is required to shift the balance of power.

The flip side of the above situation - where competition does help market participants get a better revenue share - can be seen in the performance of AOL in their ad negotiation in 2005. AOL's credible threat to switch to Microsoft had Google invest a billion dollars in AOL, where Google later had to write down $726 million of that investment. If there was no competition from Microsoft, AOL wouldn't have received that $726 million (and likely would have had a lower revenue sharing rate and missed out on some of the promotional AdWords credits they received).

    The same sort of "shifted balance of power" was seen in the Mozilla search renewal with Google, where Google paid Mozilla 3X as much due to a strong bid from Microsoft.

The iPad search results are becoming more like phone search results, where ads dominate the interface & a single organic result is above the fold. And Google pushed their "enhanced" ad campaigns to try to push advertisers into paying higher ad rates on those clicks. It would be a boon for Google if they can force advertisers to pay the same CPC as on desktop & couple it with that high mobile ad CTR.

    Google owning Chrome + Android & doing deals with Apple + Mozilla means that it will be hard for either Microsoft or Yahoo! to substantially grow search marketshare. But if they partner with Google it will be a short term lift in revenues and dark clouds on the horizon.

    I am not claiming that Microsoft is great for Yahoo!, or that they are somehow far better than Google, only that Yahoo! is in a far better position when they have multiple entities competing for their business (as highlighted in the above Mozilla & AOL examples).

    Link Madness

    May 14th

    Link paranoia is off the scale. As the “unnatural link notifications” fly, the normally jittery SEO industry has moved deep into new territory, of late.

    “I have started to wonder if some of these links (there are hundreds since the site is large) may be hurting my site in the Google Algo. I am considering changing most of my outbound links to rel="nofollow". It is not something I want to do but ...”

    We’ve got site owners falling to their knees, confessing to be link spammers, and begging for forgiveness. Even when they do, many sites don’t return. Some sites have been returned, but their rankings, and traffic, haven’t recovered. Many sites carry similar links, but get a free pass.

    That’s the downside of letting Google dictate the game, I guess.

    Link Removal

    When site owners are being told by Google that their linking is “a problem,” they tend to hit the forums and spread the message, so the effect is multiplied.

    Why does Google bother with the charade of “unnatural link notifications,” anyway?

    If Google has found a problem with links to a site, then they can simply ignore or discount them, rather than send reports prompting webmasters to remove them. Alternatively, they could send a report saying they’ve already discounted them.

    So one assumes Google’s strategy is a PR - as in public relations - exercise to plant a bomb between link buyers and link sellers. Why do that? Well, a link is a link, and one could conclude that Google must still have problems nailing down the links they don’t like.

    So they get some help.

    The disavow links tool, combined with a re-inclusion request, is pretty clever. If you wanted a way to get site owners to admit to being link buyers, and to point out the places from which they buy links, or you wanted to build a database of low quality links for no money down, you could hardly imagine a better system of outsourced labour.
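    For what it's worth, the disavow file itself is just a plain text list: one URL per line, a "domain:" prefix to disavow an entire domain, and "#" for comment lines. A minimal sketch of the format (the domains below are placeholders, not real examples):

```text
# Paid directory links; owner did not respond to removal requests
domain:spammy-directory-example.com

# Individual pages we could not get taken down
http://link-network-example.net/old-link-exchange/page1.html
```

    Every line a site owner submits is, in effect, a free label on the link graph - which is exactly the outsourced-labour point above.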

    If you’re a site owner, getting hundreds, if not thousands, of links removed is hardly straightforward. It’s difficult, takes a long time, and is ultimately futile.

    Many site owners inundated with link removal requests have moved to charging removal fees, which in many cases is understandable, given it takes some time and effort to verify the true owner of a link, locate the link, remove it, and update the site.

    As one rather fed-up sounding directory owner put it:

    Blackmail? Google's blackmailing you, not some company you paid to be listed forever. And here's a newsflash for you. If you ask me to do work, then I demand to be paid. If the work's not worth anything to you, then screw off and quit emailing me asking me to do it for free.

    Find your link, remove it, confirm it's removed, email you a confirmation, that's 5 minutes. And that's $29US. Don't like it? Then don't email me. I have no obligation to work for you for free, not even for a minute. …. I think the best email I got was someone telling me that $29 was extortion. I then had to explain that $29 wasn't extortion - but his new price of $109 to have the link removed, see, now THAT'S extortion.

    if it makes you sleep at night, you might realize that you paid to get in the directory to screw with your Google rankings, now you get to pay to get out of the directory, again to screw your Google rankings. That's your decision, quit complaining about it like it's someone else's fault. Not everyone has to run around in circles because you're cleaning up the very mess that you made

    Heh.

    In any case, if these links really did harm a site - which is arguable - then it doesn’t take a rocket scientist to guess the next step. Site owners would be submitting their competitors' links to directories thick and fast.

    Cue Matt Cutts on negative SEO....

    Recovery Not Guaranteed

    Many sites don’t recover from Google penalties, no matter what they do.

    It’s conceivable that a site could have a permanent flag against it no matter what penance has been paid. Google takes into account your history in AdWords, so it’s not a stretch to imagine similar flags may continue to exist against domains in their organic results.

    The most common reason is not what they're promoting now, it's what they've promoted in the past.
    Why would Google hold that against them? It's probably because of the way affiliates used to churn and burn domains they were promoting in years gone by...

    This may be the reason why some recovered sites just don’t rank like they used to after they've returned. They may carry permanent negative flags.

    However, the reduced rankings and traffic when/if a site does return may have nothing to do with low-quality links or previous behaviour. There are many other factors involved in ranking and Google’s algorithm updates aren’t sitting still, so it’s always difficult to pin down.

    Which is why the SEO environment can be a paranoid place.

    Do Brands Escape?

    Matt Cutts is on record discussing big brands, saying they get punished, too. You may recall the case of Interflora UK.

    Google may well punish big brands, but the punishment might be quite different from the punishment handed out to a no-name site. It will be easier for a big brand to return, because if Google doesn't show what users expect to see in the SERPs, then Google looks deficient.

    Take, for example, this report received - amusingly - by the BBC:

    I am a representative of the BBC site and on Saturday we got a ‘notice of detected unnatural links’. Given the BBC site is so huge, with so many independently run sub sections, with literally thousands of agents and authors, can you give us a little clue as to where we might look for these ‘unnatural links’?

    If I was the BBC webmaster, I wouldn’t bother. Google isn’t going to dump the BBC sites as Google would look deficient. If Google has problems with some of the links pointing to the BBC, then perhaps Google should sort it out.

    Take It On The Chin, Move On

    Many of those who engaged in aggressive link tactics knew the deal. They went looking for an artificial boost in relevancy, and as a result of link building, they achieved a boost in the SERPs.

    That is playing the game that Google, a search engine that factors in backlinks, "designed". By design, Google rewards well-linked sites by ranking them above others.

    The site owners enjoyed the pay-off at the expense of their less aggressive competitors. The downside - there’s always a downside - is that Google may spot the artificial boost in relevancy, now or in the future, and may slam the domain as a result.

    That’s part of the game, too.

    Some cry about it, but Google doesn’t care about crying site owners, so site owners should build that risk into their business case from the get-go.

    Strategically, there are two main ways of looking at “the game”:

    Whack A Mole: Use aggressive linking for domains you’re prepared to lose. If you get burned, then that’s a cost of playing the game. Run multiple domains using different link graphs for each and hope that a few survive at any one time, thus keeping you in the game. If some domains get taken out, then take it on the chin. Try to get reinstated, and if you can’t, then torch them and move on.

    Ignore Google: If you operate like Google doesn’t exist, then it’s pretty unlikely Google will slam you, although there are no guarantees. In any case, a penalty and a low ranking are the same thing in terms of outcome.

    Take one step back. If your business relies on Google rankings, then that’s a business risk. If you rely entirely on Google rankings, then that’s a big business risk. I’m not suggesting it’s not a risk worth taking, but only you can decide what risks make sense for your business.

    If the whack a mole strategy is not for you, and you want to lower the business risk of Google’s whims, then it makes sense to diversify the ways in which you get traffic so that if one traffic stream fails, then all is not lost. If you’re playing for the long term, then establishing brand, diversifying traffic, and treating organic SEO traffic as a bonus should be considerations. You then don’t need to worry about what Google may or may not do as Google aren’t fueling your engine.

    Some people run both these strategies simultaneously, which is an understandable way of managing risk. Most people probably sit somewhere in the middle and hope for the best.

    Link Building Going Forward

    The effect of Google’s fear, uncertainty and doubt strategy is that a lot of site owners are going to be running scared or confused, or both.

    Just what is acceptable?

    Trouble is, what is deemed acceptable today might be unacceptable next week. It’s pretty difficult, if not impossible, for a site owner to wind the clock back once they undertake a link strategy, and who knows what will be deemed unacceptable in a year's time.

    Of course, Google doesn’t want site owners to think in terms of a “link strategy”, if the aim of said link strategy is to “inflate rankings”. That maxim has remained constant.

    If you want to take a low-risk approach, then it pays to think of Google traffic as a bonus. Brett Tabke, founder of WebmasterWorld, used to keep a sticker on his monitor that said “Pretend The Search Engines Don’t Exist”, or words to that effect. I’m reminded of how useful that message still is today, as it's a prompt to think strategically beyond SEO. If you disappeared from Google today, would your business survive? If the answer is no, then you should revise your strategy.

    Is there a middle ground?

    Here are a few approaches to link building that will likely stand the test of time, and incorporate strategy that provides resilience against Google’s whims. The key is having links for reasons besides SEO, even if part of their value is higher rankings.

    1. Publisher

    Publish relevant, valuable content, as determined by your audience.

    It’s no longer enough to publish pages of information on a topic, the information must have demonstrable utility i.e. other people need to deem it valuable, reference it, visit it, and talk about it. Instead of putting your money into buying links, you put your money into content development and then marketing it to people. The links will likely follow. This is passive link acquisition.

    It’s unlikely these types of links will ever be a problem, as the link graph is not going to look contrived. If any poor quality links slip into this link graph, then they’re not going to be the dominant feature. The other signals will likely trump them and therefore diminish their impact.

    Build brand based on unique, high quality information, then market it to people via multiple channels, and the links tend to follow, which then boosts your ranking in Google. Provide a high degree of utility, first and foremost.

    One problem with this model is that it’s easy for other people to steal your utility. This is a big problem and deters investment in quality content. One way of getting around this is to use some content as a loss-leader and lock the rest away behind paywalls. You give the outside world, and Google, just enough, but if they want the rest, then they’re going to need to sign up.

    Think carefully about the return on giving the whole farm away to a crawler. Think about providing utility, not "content".

    2. Differentiation

    There is huge first mover advantage when it comes to getting links.

    If a new field opens up, and you get there first, or early, then it’s relatively easy to build a credible link graph. As a field expands, the next layer involves a lot of meta activity i.e. bloggers, media and other information curators writing about that activity. At the start of any niche, there aren’t many players to talk about, so the early movers get all the links.

    As a field matures, you get a phenomenon Mike Grehan aptly characterised as “filthy linking rich”:

    The law of "preferential attachment" as it is also known, wherein new links on the web are more likely to go to sites that already have many links, proves that the scheme is inherently biased against new and unknown pages. When search engines constantly return popular pages at the top of the pile, more web users discover those pages and more web users are likely to link to them

    Those who got all those links early on will receive more and more links over time because they are top of the results. They just need to keep doing what they’re doing. It becomes very difficult for late entrants to beat them unless they do something extraordinary. By definition, that probably means shifting the niche to a new niche.
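    To see how lopsided this gets, here's a minimal simulation sketch of preferential attachment (the starting counts and link volumes are invented for illustration): each new link picks a target with probability proportional to the links that target already has.

```python
import random

# Ten sites start with one link each; 10,000 new links then arrive one by one.
links = [1] * 10
for _ in range(10_000):
    # Each new link picks its target with probability proportional to the
    # target's current link count -- the "rich get richer" dynamic.
    target = random.choices(range(len(links)), weights=links)[0]
    links[target] += 1

print(sorted(links, reverse=True))
# Typical run: one or two sites end up hoarding most of the links.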

    If you’re late to a crowded field, then you need to think in terms of differentiation. What can you offer that the rest do not? New content in such fields must be remarkable i.e. worth remarking upon.

    Is that field moving in a new direction? If so, can you pivot in that direction and be first mover in that direction? Look not where a niche currently is, but where it’s going, then position ahead of it.

    "Same old, same old content” doesn’t get linked to, engaged with, ranked, or remarked upon - and why should it? The web is not short of content. The web has so much content that companies like Google have made billions trying to reduce it to a manageable set of ten links

    3. Brand

    Brand is the ultimate Google-protection tactic.

    It’s not that brands don’t get hammered by Google occasionally, because they do. But what tends to differ is the sentence handed down. The bigger the brand, the lighter the sentence, or the shorter the sentence, because no matter how much WalMart or The Office Of The President Of The United States Of America spams Google, Google must show such sites. I’m not suggesting these sites engage in aggressive SEO tactics, or need to, but we know they’ll always be in Google.

    You don’t have to be a big brand. You do need search volume on your distinctive brand name. If you’re well known enough in your niche i.e. you attract significant type-in search volume, Google must show you or appear deficient.

    This is not to say having a brand means you can get away with poor behavior. But the more type-in traffic for your brand, the more pressure there is on Google to rank you.

    Links to a brand name will almost never look forced in the same way a link in a footer to “cheap online pharmacy” looks forced. People know your name, they link to you by name, and they talk about you by name - naturally.

    The more generic your site, the more vulnerable you are, as it’s very difficult to own a space when you’re aligning with generic keyword terms. The links are always going to look a little - or a lot - forced.

    This is not to say you shouldn't get links with keywords in them, but build a distinctive brand, too. The link graph will appear mostly natural - because it is. A few low quality links won’t trump the good signals created by a lot of natural brand links.

    4. Engagement

    The web is a place.

    This place is filled with people. There are relationships between people. Relationships between people on the web are almost always expressed as a link. It might be a Facebook link, a Twitter link, a comment link, a blog link, but they’re all links. It doesn’t matter if they’re crawlable or not, or if they’re no-followed or not, it still indicates a relationship.

    If Google is to survive, it must figure out these relationships.

    That’s why all links - apart from negative SEO - are good links. The more signals of a real relationship, the better you *should* be ranked, because you are more relevant, in an objective sense.

    So look for ways to foster relationships and engagement. It might be guest posting. It might be commenting on someone else's site. It might be contributing to forums. It might be interviewing people. It might be accepting interviews. It might be forging relationships with the press. It might be forging relationships with business organisations. It might be contributing to charity. It might be running competitions. It might be attending conferences. It might be linking out to influential people.

    It’s all networking.

    And wherever you network, you should be getting links as a byproduct.

    One potential problem:

    Provide long - well, longer than 400 words - unique, editorial articles. Articles also need to get linked to, and engaged with. Articles need to be placed on sites where they’ll be seen, as opposed to content farms.

    Ask yourself "am I providing genuine utility?"

    5. Fill A Need

    This is similar to differentiation, but a little more focused.

    Think about the problems people have in a niche. The really hard problems to solve. “How to”, “tips”, “advice”, and so on.

    Solving a genuine problem for people tends to make people feel a sense of obligation, especially if they would otherwise have to pay for that help. If you can twist that obligation towards getting a link, all the better. For example, “if this article/video/whatever helped you, no payment necessary! But it would be great if you link to us/follow us on Twitter/” and so on. It doesn’t need to be that overt, although sometimes overt is what is required. It fosters engagement. It builds your network. And it builds links.

    Think about ways you can integrate a call-to-action that results in a link of some kind.

    Coda

    In other news, Caesars Palace bans Google :)

    Measuring Social Media

    May 2nd

    Measuring PPC and SEO is relatively straightforward. But how do we go about credibly measuring social media campaigns, and wider public relations and audience awareness campaigns?

    As the hype level of social media starts to fall, more questions are asked about return on investment. During the early days of anything, the hype of the new is enough to sustain an endeavor. People don't want to miss out. If their competitors are doing it, that's often seen as good enough reason to do it, too.

    You may be familiar with this graph. It's called the hype cycle and is typically used to demonstrate the maturity, adoption and social application of specific technologies:

    Where would social media marketing be on this graph?

    I think a reasonable guess, if we're seeing more and more discussion about ROI, is somewhere on the "slope of enlightenment". In this article, we’ll look at ways to measure social media performance by grounding it in the only criteria that truly matter - business fundamentals.

    Public Relations

    We’ve talked about the Cluetrain Manifesto and how the world changed when corporations could no longer control the message. If the message can no longer be controlled, then measuring the effectiveness of public relations becomes even more problematic.

    PR used to be about crafting a message and placing it, and nurturing the relationships that allowed that to happen. With the advent of social media, that’s still true, but the scope has expanded exponentially - everyone can now repeat, run with, distort, reconfigure and reinvent the messages. Controlling the message was always difficult, but now it’s impossible.

    On the plus side, it’s now much easier to measure and quantify the effectiveness of public relations activity due to the wealth of web data and tools to track what people are saying, to whom, and when.

    The Same, Only Different

    The more things change, the more they stay the same. PR and social media are still about relationships. And getting relationships right pays off:

    Today, I want to write about something I’d like to call the “Tim Ferriss Effect.” It’s not exclusive to Tim Ferriss, but he is I believe the marquee example of a major shift that has happened in the last 5 years within the world of book promotion. Here’s the basic idea: When trying to promote a book, the main place you want coverage is on a popular single-author blog or site related to your topic... The post opened with Tim briefly explaining how he knew me, endorsing me as a person, and describing the book (with a link to my book.) It then went directly into my guest post - there was not even an explicit call to action to buy my book or even any positive statements about my book. An hour later, I was #45 on Amazon’s best seller list.

    Public relations is more than about selling, of course. It’s also about managing reputation. It’s about getting audiences to maintain a certain point of view. Social media provides the opportunity to talk to customers and the public directly by using technology to dis-intermediate the traditional gatekeepers.

    Can We Really Measure PR & Social Media Performance?

    How do you measure the value of a relationship?

    Difficult.

    How can you really tell if people feel good enough about your product or service to buy it, and that “feeling good” was the direct result of editorial placement by a well-connected public relations professional?

    Debatable, certainly.

    Can you imagine another marketing discipline that used dozens of methods for measuring results? Take search engine marketing for example. The standards are pretty cut and dry: visitors, page views, time on site, cost per click, etc. For email marketing, we have delivery, open rates, click thru, unsubscribes, opt-ins, etc.

    In previous articles, we’ve looked at how data-driven marketing can save time and be more effective. The same is true of social media, but given it’s not an exact science, it’s a question of finding an appropriate framework.

    There are a lot of people asking questions about social media's worth.

    No Industry Standard

    Does sending out weekly press releases result in more income? How about tweeting 20 times a day? How much are 5,000 followers on Facebook worth? Without a framework to measure performance, there’s no way of knowing.

    Furthermore, there’s no agreed industry standard.

    In direct marketing channels, such as SEO and PPC, measurement is fairly straightforward. We count cost per click, number of visitors, conversion rate, time on site, and so on. But how do we measure public relations? How do we measure influence and awareness?

    PR firms have often developed their own in-house terms of measurement. The problem is that without industry standards, success criteria can become arbitrary, often chosen simply to show the agency in a good light and thus validate its fees.

    Some agencies use publicity results, such as the number of mentions in the press, or the type of mention i.e. prestigious placement. Some use advertising value equivalent i.e. what the editorial coverage would cost if it were bought as advertising space. Some use public opinion measures, such as polls, focus groups and surveys, whilst others compare mentions and placement vs competitors i.e. whoever has more or better mentions wins. Most use a combination, depending on the nature of the campaign.

    Most business people would agree that measurement is a good thing. If we’re spending money, we need to know what we’re getting for that money. If we provide social media services to clients, we need to demonstrate what we’re doing works, so they’ll devote more budget to it in future. If the competition is using this channel, then we need to know if we’re using it better, or worse, than they are.

    Perhaps the most significant reason why we measure is to know if we’ve met a desired outcome. To do that we must ignore gut feelings and focus on whether an outcome was achieved.

    Why wouldn’t we measure?

    Some people don’t like the accountability. Some feel more comfortable with an intuitive approach. It can be difficult for some to accept that their pet theories have little substance when put to the test. It seems like more work. It seems like more expense. It’s just too hard. When it comes to social media, some question whether it can be done much at all.

    For proof, look no further than The Atlantic, which shook the social media realm recently with its expose of “dark social” – the idea that the channels we fret over measuring like Facebook and Twitter represent only a small fraction of the social activity that’s really going on. The article shares evidence that reveals that the vast majority of sharing is still done through channels like email and IM that are nearly impossible to measure (and thus, dark).

    And it's not like a lot of organizations are falling over themselves to get measurement done:

    According to a Hypatia Research report, "Benchmarking Social Community Platform Investments & ROI," only 40% of companies measure social media performance on a quarterly or annual basis, while almost 13% of the organizations surveyed do not measure ROI from social media at all, and another 18% said they do so only on an ad hoc basis. (Hypatia didn't specify what response the other 29% gave.)

    If we agree that measurement is a good thing that can lead to greater efficiency and better decision making, then the fact that your competition may not be measuring well, or at all, presents a great opportunity. We should strive to measure social media ROI, as providers or consumers, or it becomes difficult to justify the spend. The argument that we can't measure because we don’t know all the effects of our actions isn’t a reason not to measure what we can.

    Marketing has never been an exact science.

    What Should We Measure?

    Measurement should be linked back to business objectives.

    In “Measure What Matters”, Katie Delahaye Paine outlines seven steps to social media measurement. I liked these seven steps, because they aren’t exclusive to social media. They’re the basis for measuring any business strategy and similar measures have been used in marketing for a long time.

    It’s all about proving something works, and then using the results to enhance future performance. The book is a great source for those interested in reading further on this topic; I’ll outline the steps here.

    1. What Are Your Objectives?

    Any marketing objective should serve a business objective. For example, “increase sales by X by October 31st”.

    Having specific, business driven objectives gets rid of conjecture and focuses campaigns. Someone could claim that spending 30 days tweeting a new message a day is a great thing to do, but if, at the end of it, a business objective wasn’t met, then what was the point?

    Let’s say an objective is “increase sales of shoes compared to last December’s figures”. What might the social strategy look like? It might consist of time-limited offers, as opposed to more general awareness messages. What if the objective was to “get 5,000 New Yorkers to mention the brand before Christmas”? This would lend itself to viral campaigns, targeted locally. Linking the campaign to specific business objectives will likely change the approach.

    If you have multiple objectives, you can always split them up into different campaigns so you can measure the effectiveness of each separately. Objectives typically fall into sales, positioning, or education categories.

    2. Who Is The Audience?

    Who are you talking to? And how will you know if you’ve reached them? Once you have reached them, what is it you want them to do? How will this help your business?

    Your target audience is likely varied. Different audiences could be industry people, customers, supplier organizations, media outlets, and so on. Whilst the message may be seen by all audiences, you should be clear about which messages are intended for whom, and what you want them to do next. The messages will be different for each group as each group likely picks up on different things.

    Attach a value to each group. Is a media organization picking up on a message more valuable than a non-customer doing so? Again, this should be anchored to a business requirement. “We need media outlets following us so they may run more of our stories in future. Our research shows more stories have led to increased sales volume in the past.” Then a measure might be to count the number of media industry followers, and to monitor the number of stories they produce.

    3. Know Your Costs

    What does it cost you to run social media campaigns? How much time will it take? How does this compare to other types of campaigns? What is your opportunity cost? How much does it cost to measure the campaign?

    As Delahaye Paine puts it, it’s the “I” in ROI.

    4. Benchmark

    Testing is comparative, so have something to compare against.

    You can compare yourself against competitors, and/or your own past performance. You can compare social media campaigns against other marketing campaigns. What do those campaigns usually achieve? Do social media campaigns work better, or worse, in terms of achieving business goals?

    In terms of ROI, what’s a social media “page view” worth? You could compare this against the cost of a click in PPC.

    5. Define KPIs

    Once you’ve determined objectives, defined the audience, and established benchmarks, you should establish criteria for success.

    For example, the objective might be to increase media industry followers. The audience is the media industry and the benchmark is the current number of media industry followers. The KPI would be the number of new media industry followers signed up, as measured by classifying followers into subgroups and conducting a headcount.
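    As a trivial sketch of that headcount, assuming followers have already been classified into subgroups (the handles and group labels below are invented):

```python
from collections import Counter

# Hypothetical follower list, each already classified into a subgroup.
followers = [
    {"handle": "@daily_press_desk", "group": "media"},
    {"handle": "@shoe_fan_42", "group": "customer"},
    {"handle": "@trade_weekly", "group": "media"},
    {"handle": "@rubber_supplier_co", "group": "supplier"},
]

# KPI: headcount per audience subgroup.
headcount = Counter(f["group"] for f in followers)
print(headcount["media"])  # e.g. 2 media industry followers
```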

    Measuring the KPI will differ depending on objective, of course. If you’re measuring the number of mentions in the press vs your competitor, that’s pretty easy to quantify.

    “Raising awareness” is somewhat more difficult. However, once you have a measurement system in place, you can start to break down the concept of “awareness” into measurable components. Awareness of what? By whom? What constitutes awareness? How do people signal they’re aware of you? And so on.

    6. Data Collection Tools

    How will you collect measurement data?

    • Content analysis of social or traditional media
    • Primary research via online, mail or phone survey
    • Web analytics

    There are an overwhelming number of tools available; a survey of them is outside the scope of this article. No tool can measure “reputation” or “awareness” or “credibility” by itself, but tools can produce usable data if we break those areas down into suitable metrics. For example, “awareness” could be quantified by “page views + a survey of a statistically valid sample”.
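    As an aside on what a “statistically valid sample” means in practice: the standard sample-size formula for estimating a proportion is n = z² · p(1−p) / e². A quick sketch with conventional defaults (95% confidence, ±5% margin of error, worst-case p = 0.5):

```python
import math

def sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Respondents needed to estimate a proportion p within margin e,
    at the confidence level implied by z (1.96 ~ 95%)."""
    return math.ceil(z * z * p * (1 - p) / (e * e))

print(sample_size())  # 385 respondents for +/-5% at 95% confidence
```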

    Half the battle is asking the right questions.

    7. Take Action

    A measurement process is about iteration. You do something, get the results back, act on them and make changes, and arrive at a new status quo. You then do something starting from that new point, and so on. It’s an ongoing process of optimization.

    Were objectives met? What conclusions can you draw?

    Those seven steps will be familiar to anyone who has measured marketing campaigns and business performance. They’re grounded in the fundamentals. Without relating social media metrics back to the underlying fundamentals, we can never be sure if what we’re doing is making a difference, or is worthwhile. Is 5,000 Twitter followers a good thing?

    It depends.

    What business problem does it address?

    Did You Make A Return?

    You invested time and money. Did you get a return?

    If you’ve linked your social media campaigns back to business objectives you should have a much clearer idea. Your return will depend on the nature of your business, of course, but it could be quantified in terms of sales, cost savings, avoiding costs or building an audience.

    In terms of SEO, we’ve long advocated building brand. Having people conduct brand searches is a form of insurance against Google demoting your site. If you have brand search volume, and Google don’t return you for brand searches, then Google looks deficient.

    So, one goal of social media that gels with SEO is to increase brand awareness. You establish a benchmark of branded searches based on current activity. You run your social media campaigns, and then see if branded searches increase.

    Granted, this is a fuzzy measure, especially if you have other awareness campaigns running, as you can’t be certain of cause and effect. However, it’s a good start. You could give it a bit more depth by integrating a short poll for visitors i.e. “did you hear about us on Twitter/Facebook/Other?”.

    Mechanics Of Measurement

    Measuring social media isn’t that difficult. In fact, we could just as easily use search metrics in many cases. What is the cost per view? What is the cost per click? Did the click from a social media campaign convert to desired action? What was your business objective for the social media campaign? To get more leads? If so, then count the leads. How much did each lead cost to acquire? How does that cost compare to other channels, like PPC? What is the cost of customer acquisition via social media?

    In this way, we could split social media into a customer service side and a marketing side. Engaging with your customers on Facebook may not be all that measurable in terms of direct marketing effects; it’s more of a customer service function. As such, budget for the soft side of social media need not come out of marketing budgets, but customer service budgets. This could still be measured, of course, by running customer satisfaction surveys.
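    The arithmetic itself is no different from any other channel. A throwaway sketch, with entirely made-up numbers, comparing cost per lead across social and PPC:

```python
# Hypothetical monthly figures -- substitute your own.
channels = {
    "social": {"spend": 2000.0, "leads": 80},   # staff time + tools
    "ppc": {"spend": 3500.0, "leads": 100},     # ad spend + management fees
}

# Cost per lead is just channel spend divided by leads acquired.
for name, c in channels.items():
    print(f"{name}: ${c['spend'] / c['leads']:.2f} per lead")
# social: $25.00 per lead
# ppc: $35.00 per lead
```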

    Is Social Media Marketing Public Relations?

    Look around the web for definitions of the differences between PR and social media, and you’ll find a lot of vague definitions.

    Social media is a tool often used for the purpose of public relations. The purpose is to create awareness and to nurture and guide relationships.

    Public relations is sometimes viewed as a bit of a scam. It’s an area that sucks money, yet can often struggle to prove its worth, often relying on fuzzy, feel-good proclamations of success and vague metrics. It doesn’t help that clients can have unrealistic expectations of PR, and that some PR firms are only too happy to promise the moon:

    PR is nothing like the dark, scary world that people make it out to be—but it is a new one for most. And knowing the ropes ahead of time can save you from setting impossibly high expectations or getting overpromised and oversold by the firm you hire. I’ve seen more than my fair share of clients bringing in a PR firm with the hopes that it’ll save their company or propel a small, just-launched start-up into an insta-Facebook. And unfortunately, I’ve also seen PR firms make these types of promises. Guess what? They’re never kept.

    Internet marketing, in general, has a credibility problem when it doesn’t link activity back to business objectives.

    Part of that perception, in relation to social media, comes from the fact public relations is difficult to control:

    The main conduit to mass publics, particularly with a consumer issue such as rail travel or policing, are the mainstream media. Unlike advertising, which has total control of its message, PR cannot convey information without the influence of opinion, much of it editorial. How does the consumer know what is fact, and what has influenced the presentation of that fact?

    But lack of control of the message, as the Cluetrain Manifesto points out, is the nature of the environment in which we exist. Our only choice, if we are to prosper in the digital environment, is to embrace the chaos.

    Shouldn’t PR just happen? If you’re good, people just know? Well, even Google, that well known, engineering-driven advertising company has PR deeply embedded from almost day one:

    David Krane was there from day one as Google's first public relations official. He's had a hand in almost every single public launch of a Google product since the debut of Google.com in 1999.

    Good PR is nurtured. It’s a process. The way to find out if it’s good PR or ineffective PR is to measure it. PR isn’t a scam, any more than any other marketing activity is a scam. We can find out if it’s worthwhile only by tracking and measuring and linking that measurement back to a business case. Scams lack transparency.

    The way to get transparency is to measure and quantify.

    Getting Granular With User Generated Content

    Apr 24th

    The stock market had a flash crash today after someone hacked the AP account & made a fake announcement about bombs going off at the White House. Recently Twitter's search functionality has grown so inundated with spam that I don't even look at the brand related searches much anymore. While you can block individual users, it doesn't block them from showing up in search results, so there are various affiliate bots that spam just about any semi-branded search.

    Of course, for as spammy as the service is now, it was worse during the explosive growth period, when Twitter had fewer than 10 employees fighting spam:

    Twitter says its "spammy" tweet rate of 1.5% in 2010 was down from 11% in 2009.

    If you want to show growth by any means necessary, engagement by a spam bot is still engagement & still lifts the valuation of the company.

    Many of the social sites make no effort to police spam & only combat it after users flag it. Consider Eric Schmidt's interview with Julian Assange, where Eric Schmidt stated:

    • "We [YouTube] can't review every submission, so basically the crowd marks it if it is a problem post publication."
    • "You have a different model, right. You require human editors." on Wikileaks vs YouTube

    We would post editorial content more often, but we are sort of debating opening up a social platform so that we can focus on the user without having to bear any editorial costs until after the fact. Profit margins are apparently better that way.

    As Google drives smaller sites out of the index & ranks junk content based on no factor other than it being on a trusted site, they create the incentive for spammers to ride on the social platforms.

    All aboard. And try not to step on any toes!

    When I do some product related searches (eg: brand name & shoe model) almost the whole result set for the first 5 or 10 pages is garbage.

    • Blogspot.com subdomains
    • Appspot.com subdomains
    • YouTube accounts
    • Google+ accounts
    • sites.google.com
    • Wordpress.com subdomains
    • Facebook Notes & pages
    • Tweets
    • Slideshare
    • LinkedIn
    • blog.yahoo.com
    • subdomains off of various other free hosts

    It comes without surprise that Eric Schmidt fundamentally believes that "disinformation becomes so easy to generate because of, because complexity overwhelms knowledge, that it is in the people's interest, if you will over the next decade, to build disinformation generating systems, this is true for corporations, for marketing, for governments and so on."

    Of course he made no mention of Google's role in the above problem. When they are not issuing threats & penalties to smaller independent webmasters, they are just a passive omniscient observer.

    With all these business models, there is a core model of building up a solid stream of usage data & then tricking users or looking the other way when things get out of hand. Consider Google's Lane Shackleton's tips on YouTube:

    • "Search is a way for a user to explicitly call out the content that they want. If a friend told me about an Audi ad, then I might go seek that out through search. It’s a strong signal of intent, and it’s a strong signal that someone found out about that content in some way."
    • "you blur the lines between advertising and content. That’s really what we’ve been advocating our advertisers to do."
    • "you’re making thoughtful content for a purpose. So if you want something to get shared a lot, you may skew towards doing something like a prank"

    Harlem Shake & Idiocracy: the innovative way forward to improve humanity.

    Life is a prank.

    This "spam is fine, so long as it is user generated" stuff has gotten so out of hand that Google is now implementing granular page-level penalties. When those granular penalties hit major sites Google suggests that those sites may receive clear advice on what to fix, just by contacting Google:

    Hubert said that if people file a reconsideration request, they should “get a clear answer” about what’s wrong. There’s a bit of a Catch-22 there. How can you file a reconsideration request showing you’ve removed the bad stuff, if the only way you can get a clear answer about the bad stuff to remove is to file a reconsideration request?

    The answer is that technically, you can request reconsideration without removing anything. The form doesn’t actually require you to remove bad stuff. That’s just the general advice you’ll often hear Google say, when it comes to making such a request. That’s also good advice if you do know what’s wrong.

    But if you’re confused and need more advice, you can file the form asking for specifics about what needs to be removed. Then have patience

    In the past I referenced that there is no difference between a formal white list & overly-aggressive penalties coupled with loose exemptions for select parties.

    The moral of the story is that if you are going to spam, you should make it look like a user of your site did it, that way you

    • are above judgement
    • receive only a limited granular penalty
    • get explicit & direct feedback on what to fix

    Experiment Driven Web Publishing

    Apr 9th

    Do users find big headlines more relevant? Does using long text lead to more, or less, visitor engagement? Is that latest change to the shopping cart going to make things worse? Are your links just the right shade of blue?

    If you want to put an end to tiresome subjective arguments about page length, or the merits of your client's latest idea, which is to turn their website pink, then adopting an experimental process for web publishing can be a good option.

    If you don’t currently use an experiment-driven publishing approach, then this article is for you. We’ll look at ways to bake experiments into your web site, the myriad of opportunities testing creates, how it can help your SEO, and ways to mitigate cultural problems.

    Controlled Experiments

    The merits of any change should be derived from the results of the change under a controlled test. This process is common in PPC; however, many SEOs will no doubt wonder how such an approach will affect their SEO.

    Well, Google encourages it.

    We’ve gotten several questions recently about whether website testing—such as A/B or multivariate testing—affects a site’s performance in search results. We’re glad you’re asking, because we’re glad you’re testing! A/B and multivariate testing are great ways of making sure that what you’re offering really appeals to your users

    Post-Panda, being relevant to visitors, not just machines, is important, and user engagement matters more than ever. If you don’t closely align your site with user expectations and optimize for engagement, then it will likely suffer.

    The new SEO, at least as far as Panda is concerned, is about pushing your best quality stuff and the complete removal of low-quality or overhead pages from the indexes. Which means it’s not as easy anymore to compete by simply producing pages at scale, unless they’re created with quality in mind. Which means for some sites, SEO just got a whole lot harder.

    Experiments can help us achieve greater relevance.

    If It Ain’t Broke, Fix It

    One reason for resisting experiment-driven decisions is to not mess with success. However, I’m sure we all suspect most pages and processes can be made better.

    If we implement data-driven experiments, we’re more likely to spot the winners and losers in the first place. What pages lead to the most sales? Why? What keywords are leading to the best outcomes? We identify these pages, and we nurture them. Perhaps you already experiment in some areas on your site, but what would happen if you treated most aspects of your site as controlled experiments?

    We also need to cut losers.

    If pages aren’t getting much engagement, we need to identify them, improve them, or cut them. The Panda update was about levels of engagement, and too many poorly performing pages will drag your site down. Run with the winners, cut the losers, and have a methodology in place that enables you to spot them, optimize them, and cut them if they aren’t performing.

    Testing Methodology For Marketers

    Tests are based on the same principles used to conduct scientific experiments. The process involves data gathering, designing experiments, running experiments, analyzing the results, and making changes.

    1. Set A Goal

    A goal should be simple i.e. “increase the signup rate of the newsletter”.

    We could fail in this goal (decreased signups), succeed (increased signups), or stay the same. The goal should also deliver genuine business value.

    There can often be multiple goals. For example, “increase email signups AND Facebook likes OR ensure signups don’t decrease by more than 5%”. However, if you can get it down to one goal, you’ll make life easier, especially when starting out. You can always break down multiple goals into separate experiments.

    2. Create A Hypothesis

    What do you suspect will happen as a result of your test? i.e. “if we strip all other distractions from the email sign up page, then sign-ups will increase”.

    The hypothesis can be stated as an improvement, or preventing a negative, or finding something that is wrong. Mostly, we’re concerned with improving things - extracting more positive performance out of the same pages, or set of pages.

    “Will the new video on the email sign-up page result in more email signups?” Only one way to find out. And once you have found out, you can run with it or replace it safe in the knowledge it's not just someone's opinion. The question will move from “just how cool is this video!” (subjective) to “does this video result in more email sign-ups?”. A strategy based on experiments eliminates most subjective questions, or shifts them to areas that don’t really affect the business case.

    The video sales page significantly increased the number of visitors who clicked to the price/guarantee page by 46.15%....Video converts! It did so when mentioned in a “call to action” (a 14.18% increase) and also when used to sell (35% and 46.15% increases in two different tests)

    When crafting a hypothesis, you should keep business value clearly in mind. If the hypothesis suggests a change that doesn’t add real value, then testing it is likely a waste of time and money. It creates an opportunity cost for other tests that do matter.

    When selecting areas to test, you should start by looking at the areas which matter most to the business, and the majority of users. For example, an e-commerce site would likely focus on product search, product descriptions, and the shopping cart. The About Page - not so much.

    Order areas to test in terms of importance and go for the low-hanging fruit first. If you can demonstrate significant gains early on, then it will boost your confidence and validate your approach. As experimental testing becomes part of your process, you can move on to more granular testing. Ideally, you want to end up with a culture whereby most site changes have some sort of test associated with them, even if it’s just to compare performance against the previous version.

    Look through your stats to find pages or paths with high abandonment rates or high bounce rates. If these pages are important in terms of business value, then prioritize these for testing. It’s important to order these pages in terms of business value, because high abandonment rates or bounce rates on pages that don’t deliver value isn’t a significant issue. It’s probably more a case of “should these pages exist at all”?

    3. Run An A/B or Multivariate Test

    Two of the most common testing methodologies in direct response marketing are A/B testing and multivariate testing.

    A/B Testing, otherwise known as split testing, is when you compare one version of a page against another. You collect data on how each page performs, relative to the other.

    Version A is typically the current, or favored version of a page, whilst page B differs slightly, and is used as a test against page A. Any aspect of the page can be tested, from headline, to copy, to images, to color, all with the aim of improving a desired outcome. The data regarding performance of each page is tested, the winner is adopted, and the loser rejected.
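    The testing tools handle the mechanics, but the core assignment logic is simple enough to sketch. The idea below (the visitor ID and 50/50 split are invented for illustration) is to hash a visitor ID so each visitor lands in the same bucket on every visit:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically split visitors 50/50 between page A and page B.
    Hashing keeps each visitor in the same bucket across visits."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("visitor-12345"))  # same answer every time for this ID
```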

    Multivariate testing is more complicated. Multivariate testing is when more than one element is tested at any one time. It’s like performing multiple A/B tests on the same page, at the same time. Multivariate testing can test the effectiveness of many different combinations of elements.

    Which method should you use?

    In most cases, in my experience, A/B testing is sufficient, but it depends. In the interest of time, value and sanity, it’s more important and productive to select the right things to test i.e. the changes that lead to the most business value.

    As your test culture develops, you can go more and more granular. The slightly different shade of blue might be important to Google, but it’s probably not that important to sites with less traffic. But, keep in mind, assumptions should be tested ;) Your mileage may vary.

    There are various tools available to help you run these tests. I have no association with any of these, but here's a few to check out:

    4. Ensure Statistical Significance

    Tests need to show statistical significance. What does statistically significant mean?

    For those who are comfortable with statistics:

    Statistical significance is used to refer to two separate notions: the p-value, the probability that observations as extreme as the data would occur by chance in a given single null hypothesis; or the Type I error rate α (false positive rate) of a statistical hypothesis test, the probability of incorrectly rejecting a given null hypothesis in favor of a second alternative hypothesis

    For those of you who, like me, prefer a more straightforward explanation, there’s also a good explanation in relation to PPC, and a video explaining statistical significance in reference to A/B testing.

    In short, you need enough visitors taking an action to decide it is not likely to have occurred randomly, but is most likely attributable to a specific cause i.e. the change you made.
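    To make that concrete, here is a back-of-the-envelope sketch of a two-proportion z-test on invented signup numbers; by convention, a p-value below 0.05 is called statistically significant:

```python
import math

def ab_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal-distribution p-value: erfc(|z|/sqrt(2)) equals 2*(1 - Phi(|z|)).
    return math.erfc(abs(z) / math.sqrt(2))

# Made-up test: 120/2400 signups on page A vs 156/2400 on page B.
print(ab_p_value(120, 2400, 156, 2400))  # ~0.026 -> significant at the 5% level
```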

    5. Run With The Winners

    Run with the winners, cut the losers, rinse and repeat. Keep in mind that you may need to retest at different times, as the audience can change, or their motivations change, depending on underlying changes in your industry. Testing, like great SEO, is best seen as an ongoing process.

    Make the most of every visitor who arrives on your site, because they’re only ever going to get more expensive.

    Here’s an interesting seminar where the results of hundreds of experiments were reduced down to three fundamental lessons:

    • a) How can I increase specificity? Use quantifiable, specific information as it relates to the value proposition
    • b) How can I increase continuity? Always carry across the key message using repetition
    • c) How can I increase relevance? Use metrics to ask “why”

    Tests Fail

    Often, tests will fail.

    Changing content can sometimes make little, if any, difference. Other times, the difference will be significant. But even when tests fail to show a difference, it still gives you information you can use. These might be areas in which designers, and other vested interests, can stretch their wings, and you know that it won’t necessarily affect business value in terms of conversion.

    Sometimes, the test itself wasn't designed well. It might not have been given enough time to run. It might not have been linked to a business case. Tests tend to get better as we gain more experience, but having a process in place is the important thing.

    You might also find that your existing page works just great and doesn’t need changing. Again, it’s good to know. You can then try replicating these successes in areas where the site isn’t performing so well.

    Enjoy Failing

    “Fail fast, early and fail often.”

    Failure and mistakes are inevitable. Knowing this, we put mechanisms in place to spot failures and mistakes early, rather than later. Structured failure is a badge of honor!

    Thomas Edison performed 9,000 experiments before coming up with a successful version of the light bulb. Students of entrepreneurship talk about the J-curve of returns: the failures come early and often and the successes take time. America has proved to be more entrepreneurial than Europe in large part because it has embraced a culture of “failing forward” as a common tech-industry phrase puts it: in Germany bankruptcy can end your business career whereas in Silicon Valley it is almost a badge of honour

    Silicon Valley even comes up with euphemisms, like “pivot”, which weaves failure into the fabric of success.

    Or perhaps it’s because some of the best ideas in tech today have come from those that weren’t so good. (Remember, Apple's first tablet device was called the Newton.)
    There’s a word used to describe this get-over-it mentality that I heard over and over on my trip through Silicon Valley and San Francisco this week: “Pivot“

    Experimentation, and measuring results, will highlight failure. This can be a hard thing to take, and especially hard to take when our beloved, pet theories turn out to be more myth than reality. In this respect, testing can seem harsh and unkind. But failure should be seen for what it is - one step in a process leading towards success. It’s about trying stuff out in the knowledge some of it isn’t going to work, and some of it will, but we can’t be expected to know which until we try it.

    In The Lean Startup, Eric Ries talks about the benefits of using lean methodologies to take a product from not-so-good to great, using systematic testing:

    If your first product sucks, at least not too many people will know about it. But that is the best time to make mistakes, as long as you learn from them to make the product better. “It is inevitable that the first product is going to be bad in some ways,” he says. The Lean Startup methodology is a way to systematically test a company’s product ideas.
    Fail early and fail often. “Our goal is to learn as quickly as possible,” he says

    Given testing can be incremental, we don’t have to fail big. Swapping one graphic position for another could barely be considered a failure, and that’s what a testing process is about. It’s incremental, and iterative, and one failure or success doesn’t matter much, so long as it’s all heading in the direction of achieving a business goal.

    It’s about turning the dogs into winners, and making the winners even bigger winners.

    Feel Vs Experimentation

    Web publishing decisions are often based on intuition, historical precedence - “we’ve always done it this way” - or by copying the competition. Graphic designers know about colour psychology, typography and layout. There is plenty of room for conflict.

    Douglas Bowman, a graphic designer at Google, left the company because he felt it relied too much on data-driven decisions, and not enough on the opinions of designers:

    Yes, it’s true that a team at Google couldn’t decide between two blues, so they’re testing 41 shades between each blue to see which one performs better. I had a recent debate over whether a border should be 3, 4 or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that. I’ve grown tired of debating such minuscule design decisions. There are more exciting design problems in this world to tackle.

    That probably doesn’t come as a surprise to any Google watchers. Google is driven by engineers. In Google’s defense, they have such a massive user base that minor changes can have significant impact, so their approach is understandable.

    Integrate Design

    Putting emotion, and habit, aside is not easy.

    However, experimentation doesn’t need to exclude visual designers. Visual design is valuable. It helps visitors identify and remember brands. It can convey professionalism and status. It helps people make positive associations.

    But being relevant is also part of design.

    Adopting an experimentation methodology means designers can work on a number of different designs and get to see how the public really does react to their work. Design X converted better than design Y, layout Q works best for form design, buttons A, B and C work better than buttons J, K and L, and so on. It’s a further opportunity to validate creative ideas.

    Cultural Shift

    Part of getting experimentation right has to do with an organization's culture. Obviously, it's much easier if everyone is working towards a common goal, i.e. "all work, and all decisions made, should serve a business goal, as opposed to serving personal ego".

    All aspects of web publishing can be tested, although asking the right questions about what to test is important. Some page elements may not make a measurable difference in terms of conversion - a logo, for example - whilst the conversion process might rely heavily on the layout of a form. The visual designer can focus on the former while the conversion expert tests the latter. Both get to win, yet neither stamps on the other's toes.

    One of the great aspects of data-driven decision making is that common, long-held assumptions get challenged, often with surprising results. How long does it take to film a fight scene? The movie industry says 30 days.

    Mark Wahlberg challenged that assumption and did it in three:

    Experts go with what they know. And they'll often insist something needs to take a long time. But when you don't have tons of resources, you need to ask if there's a simpler, judo way to get the impact you desire. Sometimes there's a better way than the "best" way. I thought of this while watching "The Fighter" over the weekend. There's a making-of extra on the DVD where Mark Wahlberg, who starred in and produced the film, talks about how all the fight scenes were filmed with an actual HBO fight crew. He mentions that going this route allowed them to shoot these scenes in a fraction of the time it usually takes.

    How many aspects of your site are based on assumption? Could those assumptions be masking opportunities or failures?

    Winning Experiments

    Poorly designed experiments don't lead to more business success. If an experiment isn't focused on improving a business case, then it's probably just wasted time - time that could have been better spent devising and running better experiments.

    In Agile software development methodologies, the question is always asked: "How does this change/feature provide value to the customer?" The underlying motive is "How does this change/feature provide value to the business?" This is a good way to prioritize test cases: those that potentially provide the most value, such as landing page optimization on PPC campaigns, are likely to take a higher priority than, say, features available to forum users.
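    One simple way to encode that kind of prioritization - a sketch using an impact/confidence/effort scoring heuristic, which the article itself doesn't prescribe - is shown below. The candidate tests and scores are hypothetical.

    ```python
    # Hypothetical ICE-style scoring: rank candidate experiments by
    # expected impact and confidence, relative to the effort required.
    candidates = [
        {"test": "PPC landing page headline", "impact": 8, "confidence": 7, "effort": 2},
        {"test": "Checkout form layout",      "impact": 7, "confidence": 6, "effort": 3},
        {"test": "Forum user badge colours",  "impact": 2, "confidence": 5, "effort": 2},
    ]

    for c in candidates:
        c["score"] = c["impact"] * c["confidence"] / c["effort"]

    # Highest score first: the PPC landing page test wins priority.
    for c in sorted(candidates, key=lambda c: c["score"], reverse=True):
        print(f"{c['test']}: {c['score']:.1f}")
    ```

    The exact weights matter far less than the discipline: every proposed test has to justify itself against the business value it could create.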

    Further Reading

    I hope this article has given you some food for thought and that you'll consider adding some experiment-based processes to your mix. Here are some of the sources used in this article, and further reading:

    - The Lean Startup, by Eric Ries
    - Douglas Bowman's post on his departure from Google
    - The 37signals post on how The Fighter's fight scenes were shot with an HBO fight crew
