Time For A Content Audit

Oct 2nd

"Content is king" is one of those “truthy” things some marketers preach. However, in most businesses the bottom line is king, attention is queen, and content can be used as a means to get both, but it depends.

The problem is that content is easy to produce. Machines can produce content. They can tirelessly churn out screeds of content every second. Even if they didn’t, billions of people on the internet are perfectly capable of adding to the monolithic content pile at similar rates.

Low barriers to content production and distribution mean the internet has turned a lot of content into a near-worthless commodity. Getting and maintaining attention is the tricky part; once a business has that, the benefits can flow through to the bottom line.

Some content is valuable, of course. Producing valuable content can earn attention. The content that gets the most attention is typically something an audience has a strong need for, can't easily get elsewhere, and is published in a place they, or someone they know, are likely to see. An article on title tags will likely get buried. An article on the secret code to cracking Google's Hummingbird algorithm will likely crash your server.

Up until the point everyone else has worked out how to crack them, too, of course.

What Content Does The User Want?

Content can become King if the audience bestows favor upon it. Content producers need to figure out what content the audience wants. Perversely, Google have chosen to make this task even more difficult than it was before by withholding keyword data. Between Google’s supposed “privacy” drive, Hummingbird supposedly using semantic analysis, and Penguin/Panda supposedly using engagement metrics, page level and path level optimization are worth focusing upon going forward.

If you haven’t done one for a while, now is probably a good time to take stock and undertake a content audit.

You Have Valuable Historical Information

If you’ve got historical keyword data, archive it now. It will give you an advantage over those who follow you from this point on. Going forward, it will be much more expensive to acquire this data.

Run an audit on your existing content. What content works best? What type of content is it? Video? Text? What’s the content about? What keywords did people use to find it previously? Match content against your historical keyword data.
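If you want to script part of that matching, here's a minimal sketch, assuming Python with pandas and an analytics export saved as a CSV with landing_page, keyword, and visits columns. The file and column names are placeholders for whatever your tool actually exports.

```python
import pandas as pd

# Hypothetical analytics export: landing_page, keyword, visits
keywords = pd.read_csv("historical_keywords.csv")

# Top five keywords per landing page, ranked by the visits they drove
top_keywords = (
    keywords.groupby(["landing_page", "keyword"], as_index=False)["visits"].sum()
    .sort_values(["landing_page", "visits"], ascending=[True, False])
    .groupby("landing_page")
    .head(5)
)

# Archive the result so the data survives (not provided)
top_keywords.to_csv("keyword_archive_by_page.csv", index=False)
```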

Here’s a useful list of site and content audit tools and resources.

If keywords can no longer suggest content demand, how do we know what the visitor wants? We must seek to understand the audience at a deeper level and take a fuzzier approach.

Watch Activity Signals

Analytics can get pretty addictive and many tools let you watch what visitors do in real time. Monitor engagement levels on your pages. What is a user doing on that page? Are they reading? Contributing? Clicking back and forward looking for something else?

Ensure pages with high engagement are featured prominently in your information architecture. Relegate or fix low-engagement pages. Segment out your content so you know which is the most popular, in terms of landings, and link that information back to ranking reports. This way, you can approximate keywords and stay focused on the content users find most relevant and engaging. Segment out your audience, too. Different visitors respond to different things. Do you know which group favours what? What do older people go for? What do younger people go for? Here are a few ideas on how to segment users.
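As a rough sketch of that segmentation step, the snippet below joins a landing-page export with a ranking report so the most popular, engaged pages can be mapped back to the keywords you still track. The file names, column names, and thresholds are assumptions; swap in whatever your own reports use.

```python
import pandas as pd

# Hypothetical exports: adjust file and column names to your own reports
landings = pd.read_csv("landing_pages.csv")    # page, entrances, bounce_rate
rankings = pd.read_csv("ranking_report.csv")   # keyword, ranking_url, position

merged = rankings.merge(landings, left_on="ranking_url", right_on="page", how="left")

# Pages that rank on page one and keep visitors engaged (bounce_rate assumed as a fraction):
# candidates to feature prominently in your information architecture
engaged = merged[(merged["position"] <= 10) & (merged["bounce_rate"] < 0.5)]
print(engaged.sort_values("entrances", ascending=False).head(20))
```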

User behavior is getting increasingly complex. It takes multiple visits to purchase, across multiple channels and influences. Hence, adding user segmentation allows us to focus on people. (For these exact reasons, multi-channel funnels analysis and attribution modeling are so important!)
At the moment, web analytics solutions define people by the first-party cookie stored on their browser. That's less than ideal, but 100x better than what we had previously. Over time, as we all move to Universal Analytics, perhaps we will have more options to track the same person, after explicitly asking for permission, across browsers, channels, and devices.

In-Site Search

If Google won’t give you keywords, build your own keyword database. Think about ways you can encourage people to use your in-site search, and watch which content they search for and consume the most. Another approach is to provide navigation links that emphasize different keyword terms. For example, you could place these high up on your page, with each link offering a different option built around a related keyword term. Take note of which terms visitors favour over others.
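One low-tech way to seed that keyword database, assuming your site search uses a query parameter such as q= and you have ordinary web server access logs, is to tally the search terms straight out of the logs. The parameter name and log path below are assumptions, not a prescription.

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

queries = Counter()

# Assumed log location and a site-search parameter named "q"
with open("access.log") as log:
    for line in log:
        match = re.search(r'"GET (\S+) HTTP', line)
        if not match:
            continue
        params = parse_qs(urlparse(match.group(1)).query)
        for q in params.get("q", []):
            queries[q.strip().lower()] += 1

# The start of an in-site search keyword database
for term, count in queries.most_common(25):
    print(f"{count:5d}  {term}")
```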

In the good old days, people dutifully used site navigation at the left, right, or top of a website. But two websites have fundamentally altered how we navigate the web: Amazon, because the site is so big, sells so many things, and is so complicated that many of us go directly to the site search box on arrival. And Google, which has trained us to show up, type what we want, and hit the search button. Now when people show up at a website, many of them ignore our lovingly crafted navigational elements and jump to the site search box. The increased use of site search as a core navigation method makes it very important to understand the data that site search generates.

Distribution

Where does attention flow from? Social media? A mention is great, but if no attention flows over that link to your content, then it might be a misleading metric. Are people sharing your content? What topics and content get shared the most?

Again, this comes back to understanding the audience: both what they’re talking about and what actions they take as a result. In “Digital Marketing Analytics: Making Sense Of Consumer Data”, the authors recommend creating a “learning agenda”. Rather than just looking for mentions and volume of mentions, focus on specific brand or service attributes. Think about the specific questions you want answered by visitors as if those visitors were sitting in front of you.

For example, how are consumers reacting to prices in your niche? What are their complaints? What do they wish would happen? Are people talking negatively about something? Are they talking positively about something? Who are the new competitors in this space?

Those are pretty rich signals. We can then link this back to content by addressing those issues within our content.

Google Keyword (Not Provided)

Sep 25th

Just a follow-up on the prior (not provided) post, as Google has shot the moon since our last post on this. Here's a quick YouTube video.

The above video references the following:

Matt Cutts, when secure search first rolled out:

Google software engineer Matt Cutts, who’s been involved with the privacy changes, wouldn’t give an exact figure but told me he estimated even at full roll-out, this would still be in the single-digit percentages of all Google searchers on Google.com.

This Week in Google (TWIG) show 211, where Matt mentioned the inspiration for encrypted search:

we actually started doing encrypted.google.com in 2008 and one of the guys who did a lot of heavy lifting on that, his name is Evan, and he actually reports to me. And we started that after I read Little Brother, and we said "we've got to encrypt the web."

The integration of organic search performance data inside AdWords.

The esteemed AdWords advertiser David Whitaker.

When asked about the recent increase in (not provided), a Google representative stated the following:

We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….

The motivation here is not to drive the ads side — it’s for our search users.

What an excellent time for Google to block paid search referrals as well.

If the move is important for user safety then it should apply to the ads as well.

How To Think About Your Next SEO Project

Sep 23rd

The independent webmaster has taken a beating over the last couple of years. Risk has become harder to spread, labor costs have gone up, outreach has become more difficult and more expensive as Google's webspam team and the growing ranks of the Search Police spread the FUD far and wide.

The web is still a great place to be, and it still offers incredible opportunity that is largely unavailable in the offline world without taking on far more capital-intensive risk.

There's still plenty of success to be had in the web-based business model but like any strategy we have to refine it from time to time. I thought I'd share the core processes I go through when starting a new site.

Look for Signal, Look Past the Noise

Online marketers, celebrities, and brands pretty much power the Twittersphere, and the 140-character limit invariably leads to statements full of bluster (and shallowness) like:

  • Links are dead
  • Forget links, get social likes, +1's, RT's, and so on
  • Guest posting is dead
  • Infographics are dead
  • SEO is dead

None of that is true but when folks try to become prognosticators they will just keep saying the same thing over and over, with some slight re-framing, until they finally get it right.

All you have to do is look at the really ridiculous statements over the years about how ranking "doesn't matter". Those statements go back to at least 2006, which is craziness.

Or look at the past couple years where we get "social shares are the new link" shoved down our throats despite the data that flies in the face of that statement, at least as it pertains to organic search growth.

Yet, years later both of these "industry trends" would have cost you significant amounts of revenue and search share. We don't have to debate the spam links vs non-spam links here either. No one here is advocating for you to build crappy links and you don't need to.

Establishing Your Portfolio

It's quite likely, as an independent webmaster, that you will have sites that serve different purposes. I have sites that:

  • are actively being built into online brands (or trying at least :D )
  • exist as pure, longer-standing SEO plays that are cash cows used to fund more sustainable long-term projects
  • are built to initially live off of paid traffic, direct outreach, and/or social campaigns with organic search as a tertiary method of traffic acquisition
  • exist solely to test new ideas or new products before building an actual site/brand

I also work with a select type of client. One thing I found helpful was to set up a spreadsheet with some very basic information to help me keep track of things at a 10,000-foot view.

So I have a column for:

  • Domain
  • Purpose Tag (one of the areas I described above)
  • Net Monthly Revenue (multiple columns)
  • Rolling 12 month Net Revenue
  • Same monthly/rolling numbers for costs

From there, I do a quick chart to show which areas most of the revenue is coming from and where the investment is going. Over time, I try to make sure the online brand area (where we are getting traffic and revenue from a healthy mix of multiple sources) is outpacing the pure SEO plays in both areas, and we try to shy away from making too many expensive pure SEO plays where no mid-to-long-term "brandability" exists.
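For anyone who prefers code to a spreadsheet, here's a small sketch of that chart logic: rolling 12-month net revenue broken out by purpose tag. It assumes a CSV with one row per domain per month, and the file and column names are made up for illustration.

```python
import pandas as pd

# Hypothetical portfolio file: month, domain, purpose, revenue, costs
df = pd.read_csv("portfolio.csv", parse_dates=["month"])
df["net"] = df["revenue"] - df["costs"]

# Monthly net by purpose tag, then a rolling 12-month sum
monthly = df.groupby(["month", "purpose"])["net"].sum().unstack("purpose").sort_index()
rolling_12m = monthly.rolling(12, min_periods=1).sum()

# Latest rolling 12-month net revenue per purpose tag
print(rolling_12m.tail(1).T)
```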

We also like to see growth in client areas as well, but only for the right kind of client. The wrong kind of client can have a really destructive effect on a small team.

Staying small, lean, and profitable is also key to this strategy. If you are up against it on debt and overhead, you will probably be less likely to make the proper decisions for your long-term viability on the 'net.

Considerations When Starting a New Site

I think most small teams or individual publishers can probably handle 2-3 branded sites at a time (stipulating that a branded site is one that involves just about every element of online marketing). The first step I take is to determine which bucket the site will go in.

A testing site is easy enough to decide on. I might have an idea for a new product, so I'll just throw up a small WordPress site and a landing page and test it out via PPC. Part of the initial research here is to determine whether there is any existing "search" demand or whether you'll be tasked with creating demand on your own.

You can certainly build an online product that will be driven, initially, mostly by offline demand if you have the right networking in place. For the most part we try to stick to stuff where there is some initial demand online as the offline networking component tends to involve, in my experience, a lot more initial work, more stakeholders, etc.

When we look at a "product" we consider the following as "stuff" we could sell:

  • knowledge
  • physical product
  • digital products

Certainly a site can have any combination of those elements but generally those are the three basic types of things we'd consider selling. From there we would want to figure out:

  • brand name and domain (I prefer one- or two-word domains here, keyword not required)
  • search volume estimates and the length of the tail for each core keyword
  • whether conversations are taking place across the web for the broader topic, or lateral topics where we can insert ourselves/product
  • whether our product can be a niche of an already successful, broader product offering
  • whether the product has a reasonable chance of success in the social media realm
  • whether we can make it better than what exists now

Example of a Product Idea

One example I'll give, as I also dabble in real estate a bit, is a CRM/PM solution for real estate investors. Most of the products out there aren't what I would consider "good"; they're either just not very good or require some hook into a complex solution like Microsoft Dynamics CRM.

There's demand for the product on the web and there's a lot that could be done, more elegantly, with technologies that are available today to help connect all the things that go into an investment decision and investment management.

This is something I'm kicking around and it's a good example of our strategy of trying to find a successful, broad market where opportunity exists for niches to be served in a more direct, elegant manner.

We could do 2 of 3 product types here, but would likely start with just the online product itself and maybe hang training or courses off of it later.

You Need a Product

If you want to stick around online I believe you need at least 1 product and brand that can sustain the up and down nature of search cycles. You could argue that client work is your product and I'd buy that.

However, I think client work is still an area where you are more beholden to the decisions of others, in a more abrupt fashion (internal client spend decisions, taking things in-house, etc), than you are if you have your own product or service especially at the price points charged to clients.

I could also make the case that if you are selling direct to consumers you are beholden to them as well. Yet I think the risk is better spread out over a SaaS model, subscription model, or direct product model than it is selling to either a handful of large clients or handfuls of large clients that require a large team of people and all that goes into managing a team like that.

Opportunity Abounds

There still is a ton of opportunity on the web, there is no doubt about that. The practice of finding a broad market and picking a niche in there has worked out well for us in the last year or so.

In some areas we start off with no connections at all. So in areas where we are behind the 8-ball on relationships, we will often hire writers from boards like ProBlogger.net, where we will specifically ask for folks who are in that industry, with an existing site and active social following, to write for us.

We will also ask them to promote what they write for us on their social channels and site while hooking their authorship profile into the posts they do for us. This helps us, in certain industries anyway, really grow an audience for short money and establish relationships with established, trusted people in the space.

Sell Something

Finding that balance between passion and monetary potential is difficult and there's often some level of tradeoff. If you use the items I listed earlier as a guide to determine how to move forward with an idea, or if moving forward even makes sense for the idea, then I think you'll be starting off in a solid position.

The last couple of years have been really turbulent, but that has also created more opportunities in different areas. And while it's nice to throw out the word "diversify", it's also good to take a more boots-on-the-ground approach than a theoretical one.

The core hallmarks of a traditional SEO campaign are still largely the same but there's no reason why you can't stick around and take advantage of these opportunities, especially with all the experience you have in multiple areas of online marketing from being an independent webmaster in the golden age of SEO.

Design Thinking

Sep 16th

One of the problems with analysing data is the potential to get trapped in the past when we could be imagining the future. Past performance is no guarantee of future success, especially when it comes to Google’s shifting whims.

We see problems, we devise a solution. But projecting forward by measuring the past, and coming up with “the best solution” may lead to missing some obvious opportunities.

Design Thinking

In 1972, psychologist, architect and design researcher Bryan Lawson devised an empirical study to understand the difference between problem-based solvers and solution-based solvers. He took two groups of students – final year students in architecture and post-graduate science students – and asked them to create one-story structures from a set of colored blocks. The perimeter of the building was to optimize either the red or the blue color; however, there were unspecified rules governing the placement and relationship of some of the blocks.
Lawson found that:

The scientists adopted a technique of trying out a series of designs which used as many different blocks and combinations of blocks as possible as quickly as possible. Thus they tried to maximize the information available to them about the allowed combinations. If they could discover the rule governing which combinations of blocks were allowed they could then search for an arrangement which would optimize the required color around the design. By contrast, the architects selected their blocks in order to achieve the appropriately colored perimeter. If this proved not to be an acceptable combination, then the next most favorably colored block combination would be substituted and so on until an acceptable solution was discovered.

Nigel Cross concludes from Lawson's studies that "scientific problem solving is done by analysis, while designers problem solve through synthesis”.

Design thinking tends to start with the solution, rather than the problem. A lot of problem based-thinking focuses on finding the one correct solution to a problem, whereas design thinking tends to offer a variety of solutions around a common theme. It’s a different mindset.

One of the criticisms of Google, made by Google’s former design leader Douglas Bowman, was that Google were too data centric in their decision making:

When a company is filled with engineers, it turns to engineering to solve problems. Reduce each decision to a simple logic problem. Remove all subjectivity and just look at the data...that data eventually becomes a crutch for every decision, paralyzing the company and preventing it from making any daring design decisions…

There’s nothing wrong with being data-driven, of course. It’s essential. However, if companies only think in those terms, then they may be missing opportunities. If we imagine “what could be”, rather than looking at “what was”, opportunities present themselves. Google realise this, too, which is why they have Google X, a division devoted to imagining the future.

What search terms might people use that don’t necessarily show up in keyword mining tools? What search terms will people use six months from now in our vertical? Will customers contact us more often if we target them this way rather than that way? Does our copy connect with our customers, or just search engines? Given that Google is withholding more search referral data, which makes it harder to target keywords, adding some design thinking to the mix, if you don’t already, might prove useful.

Tools For Design Thinking

In the book, Designing For Growth, authors Jeanne Liedtka and Tim Ogilvie outline some tools for thinking about opportunities and business in ways that aren’t data-driven. One famous proponent of the intuitive, design-led approach was, of course, Steve Jobs.

It's really hard to design products by focus groups. A lot of times, people don't know what they want until you show it to them

The iPhone and iPad couldn’t have been designed by looking solely at the past. They mostly came about because Jobs had an innate understanding of what people wanted. He was proven right by the resulting sales volume.

Design starts with empathy. It forces you to put yourself in the customer's shoes. It means identifying real people with real problems.

In order to do this, we need to put past data aside and watch people, listen to people, and talk with people. The simple act of doing this is a rich source of keyword and business ideas because people often frame a problem in ways you may not expect.

For example, a lot of people see stopping smoking as a goal-setting issue, like a fitness regime, rather than a medical issue. Advertising copy based around medical terminology and keywords might not work as well as copy oriented around goal setting and achieving physical fitness. This shift in the frame of reference certainly conjures up an entirely different world of ad copy, and possibly keywords, too. That different frame might be difficult to determine from analytics and keyword trends alone, but might be relatively easy to spot simply by talking to potential customers.

Four Questions

Designing For Growth is worth a read if you’re feeling bogged down in data and looking for new ways to tackle problems and develop new opportunities. I don’t think there’s anything particularly new in it, and it can come across as "the shiny new buzzword" at times, but the fundamental ideas are strong. I think there is value in applying some of these ideas directly to current SEO issues.

Designing For Growth recommends asking the following questions.

What is?

What is the current reality? What is the problem your customers are trying to solve? Xerox solved a problem customers didn’t even know they had when it brought the fax machine to market. Same goes for the Polaroid camera. And the microwave oven. Customers probably couldn’t describe those things until they saw and understood them, but the problem would have been evident had someone looked closely at the problems they faced, i.e. people really wanted faster, easier ways of completing common tasks.

What do your customers most dislike about the current state of affairs? About your industry? How often do you ask them?

One way of representing this information is with a flowchart. Map the current user experience from when they have a problem, to imagining keywords, to searching, to seeing the results, to clicking on one of those results, to finding your site, to interacting with your site, to taking the desired action. Could any of those results or steps be better?

Usability tests use the same method. It’s good to watch actual customers as they do this, if possible. Conduct a few interviews. Ask questions. Listen to the language people use. We can glean some of this information from data mining, but there’s a lot more we can get by direct observation, especially when people don’t click on something, as non-activity seldom registers in a meaningful way in analytics.

What if?

What would “something better” look like?

Rather than think in terms of what is practical and the constraints that might prevent you from doing something, imagine what an ideal solution would look like if it weren’t for those practicalities and constraints.

Perhaps draw pictures. Make mock-ups. Tell a story. Anything that fires the imagination. Use emotion. Intuition. Feeling. Just going through such a process will lead to making connections that are difficult to make by staring at a spreadsheet.

A lot of usability testers create personas. These are fictional characters based on real or potential customers, used to try to gain an understanding of what they might search for, what problems they are trying to solve, and what they expect to see on our site. Is this persona a busy person? Well educated? Do they use the internet a lot? Are they buying for themselves, or on behalf of others? Do they tend to react emotionally, or are they logical? What incentives would this persona respond to?

Personas tend to work best when they’re based on actual people. Watch and observe. Read up on relevant case studies. Trawl back through your emails from customers. Make use of story-boards to capture their potential actions and thoughts. Stories are great ways to understand motivations and thoughts.

What are those things your competition does, and how could they be better? What would those things look like in the best possible world, a world free of constraints?

What wows?

“What wows” is especially important for social media and SEO going forward.

Consider Matt Cutts' statement about frogs:

Those other sites are not bringing additional value. While they’re not duplicates they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank.
Google would seek to detect that there is no real differentiation between these results and show only one of them so we could offer users different types of sites in the other search results

Cutts talks about the creation of new value. If one site is saying pretty much the same as another site, then those sites may not be duplicates, but one is not adding much in the way of value, either. The new site may be relegated simply for being “too samey”.

It's the opposite of the Zynga path:

"I don't fucking want innovation," an anonymous ex-employee recalls Pincus saying in 2010, according to the SF Weekly. "You're not smarter than your competitor. Just copy what they do and do it until you get their numbers."

Generally speaking, up-and-coming sites should focus on wowing their audience with added depth and/or a new perspective. This, in turn, means having something worth remarking upon, which then attracts mentions across social media, and generates more links.

Is this certain to happen? Nothing is certain as far as Google is concerned. They could still bury you on a whim, but wowing an audience is a better bet than simply imitating long-established players using similar content and link structures. At some point, those long-established players had to wow their audience to get the attention and rankings they enjoy today. They did something remarkably different at some point. Instead of digging the same hole deeper, dig a new hole.

In SEO, change tends to be experimental. It’s iterative. We’re not quite sure what works ahead of time, and no amount of measuring the past tells us all we want to know, but we try a few things and see what works. If a site is not ranking well, we try something else, until it does.

Which leads us to….

What works?

Do searchers go for it? Do they do that thing we want them to do, which is click on an ad, or sign up, or buy something?

SEOs are pretty accomplished at this step. Experimentation in areas that are difficult to quantify - the algorithms - has been an intrinsic part of SEO.

The tricky part is that not everything works the same everywhere and, much like modern public health pathologies, Google has clever delays in its algorithms:

Many modern public health pathologies – obesity, substance abuse, smoking – share a common trait: the people affected by them are failing to manage something whose cause and effect are separated by a huge amount of time and space. If every drag on a cigarette brought up a tumour, it would be much harder to start smoking and much easier to quit.

One site's rankings stay stable while another can't get around the sandbox, or its links get it penalized. The same strategy and those same links might work great for a different site.

Changes in user behavior are more directly & immediately measurable than SEO.

Consider using change experiments as an opportunity to open up a conversation with potential users: “Do you like our changes? Tell us.” Perhaps use a prompt asking people to initiate a chat or participate in a poll. That kind of engagement has many benefits. It will likely prevent a fast click back, you get to see the words people use and how they frame their problems, and you learn more about them. You become more responsive and sympathetic to their needs.

Beyond Design Thinking

There’s more detail to design thinking, but, really, it’s mostly just common sense. Another framework to add, especially if you feel you’re getting stuck in faceless data.

Design thinking is not a panacea. It is a process, just as Six Sigma is a process. Both have their place in the modern enterprise. The quest for efficiency hasn't gone away and in fact, in our economically straitened times, it's sensible to search for ever more rigorous savings anywhere you can

What's best about it, I feel, is that this type of thinking helps break strategy and data problems down and gives them a human face.

In this world, designers can continue to create extraordinary value. They are the people who have, or could have, the laterality needed to solve problems, the sensing skills needed to hear what the world wants, and the databases required to build for the long haul and the big trajectories. Designers can be definers, making the world more intelligible, more habitable

Jim Boykin Interview

Sep 4th

Jim Boykin has been a longtime friend & was one of the early SEOs who was ahead of the game back in the day. While many people have come and gone, Jim remains as relevant as ever today. We interviewed him about SEO, including scaling his company, disavows & how Google has changed the landscape over the past couple of years.

Aaron: How did you get into the field of SEO?

Jim: In 1999 I started We Build Pages as a one man show designing and marketing websites...I never really became much of a designer, but luckily I had much more success in the marketing side. Somehow that little one man show grew to about 100 ninjas, and includes some communities and forums I grew up on (WebmasterWorld, SEOChat, Cre8asiteForums), and I get to work with people like Kris Jones, Ann Smarty, Chris Boggs, Joe Hall, Kim Krause Berg, and so many others at Ninjas who aren't as famous but are just as valuable to me, and Ninjas has really become a family over the years. I still wonder at times how this all happened, but I feel lucky with where we're at.

Aaron: When I got started in SEO some folks considered all link building to be spam. I looked at what worked, and it appeared to be link building. Whenever I thought I had come up with a clever new way to hunt for links & hunted around, most of the time it seemed you got there first. Who were some of the people you looked to for ideas when you first got into SEO?

Jim: Well, I remember going to my first SEO conference in 2002 and meeting people like Danny Sullivan, Jill Whalen, and Bruce Clay. I also remember Bob Massa being the first person "dinged" by google for selling links...that was back in 2002 I think...I grew up on Webmasterworld and I learned a ton from the people in there like: Tedster, Todd Friesen, Greg Boser, Brett Tabke, Shak, Bill, Rae Hoffman, Roger Montti, and so many others in there over the years...they were some of my first influencers....I also used to hang around with Morgan Carey, and Patrick Gavin a lot too. Then this guy selling an SEO Book kept showing up on all my high PR pages where I was getting my links....hehe...

Aaron: One of the phrases in search that engineers may use is "in an ideal world...". There is always some amount of gap between what is advocated & what actually works. With all the algorithmic changes that have happened in the past few years, how would you describe that "gap" between what works & what is advocated?

Jim: I feel there's really been a tipping point with the Google Penguin updates. Maybe it should be "What works best short term" and "What works best long term"....anything that is not natural may work great in the short term, but your odds of getting zinged by Google go way up. If you're doing "natural things" to get citations and links, then it may tend to take a bit longer to see results (in conjunction with all you're doing), but at least you can sleep at night doing natural things (and not worrying about Google Penalties).  It's not like years ago when getting exact targeted anchor text for the phrases you want to rank on was the way to go if you wanted to compete for search rankings. Today it's much more involved to send natural signals to a client's website.  To send in natural signals you must do things like work up the brand signals, trusted citations, return visitors, good user experience, community, authors, social, yada yada....SEO is becoming less a "link thing"...and more a "great signals from many trusted people" thing, as well as a branding game now. I really like how SEO is evolving....for years Google used to say things like "Think of the users" when talking of the algorithm, but we all laughed and said "Yea, yea, we all know that it's all about the Backlinks"....but today, I think Google has crossed a tipping point where yes, to do great SEO, you must focus on the users, and not the links....the best SEO is getting more citations and trusted signals to your site than your competitors...and there's a lot of trusted signals which we, as internet marketers, can be working on....it's more complicated, and some SEO's won't survive this game...they'll continue to aim for short term gains on short tail keyword phrases...and they'll do things in bulk....and their network will be filtered, and possibly penalized.

Every website owner has to measure the risks, and the time involved, and the expected ROI....it's not a cheap game any more....doing real marketing involves brains and not buttons...if you can't invest in really building something "special" (ideally many special things) on your site to get signals (links/social), then you're going to find it pretty hard to get links that look natural and don't run a risk of getting penalized.  The SEO game has really matured; the other option is to take a high risk of penalization.

Aaron: In terms of disavow, how deep does one have to cut there?

Jim: as deep as it needs to be to remove every unnatural link. If you have 1000 backlinks and 900 are on pages that were created for "unnatural purposes (to give links)" then all 900 have to be disavowed...if you have 1000 backlinks, and only 100 are not "natural" then only 100 need to be disavowed... what percent has to be disavowed to untrip an algorithmic filter? I'm not sure...but almost always the links which I disavow have zero value (in my opinion) anyways.  Rip the band-aid off, get over it, take your marketing department and start doing real things to attract attention, and to keep it.

Aaron: In terms of recoveries, are most penalized sites "recoverable"? What does the typical recovery period look like in terms of duration & restoration?

Jim: oh...this is a bee's nest you're asking me..... are sites recoverable....yes, most....if a site has 1000 domains that link to it, and 900 of those are artificial and I disavow them, there might not be much of a recovery depending on what that 100 links left are....ie, if I disavow all link text of "green widgets" that goes to your site, and you used to rank #1 for "green widgets" prior to being hit by a Penguin update, then I wouldn't expect to "recover" on the first page for that phrase..... where you recover seems to depend on "what do you have for natural links that are left after the disavow?"....the time period....well.... we've seen some partial recoveries in as soon as 1 month, and some 3 months after the disavow...and some we're still waiting on....

To explain, Google says that when you add links to the disavow document, the way it works is that the next time Google crawls any page that links to you, they will assign a "no follow" to the link at that time.....so you have to wait until enough of the links have been recrawled, and now assigned the no follow, to untrip the filter....but one of the big problems I see is that many of the pages Google shows as linking to you, well, they're not cached in Google!....I see some really spammy pages where Google was there (they record your link), but it's like Google has tossed the page out of the index even though they show the page as linking to you...so I have to ask myself, when will Google return to those pages?...will Google ever return to those pages???  It looks like if you had a ton of backlinks that were on pages that were so bad in the eyes of Google that they don't even show those pages in their index anymore...we might be waiting a long long time for Google to return to those pages to crawl them again....unless you do something to get Google to go back to those pages sooner (I won't elaborate on that one).

Aaron: I notice you launched a link disavow tool & earlier tonight you were showing me a few other cool private tools you have for working on disavow analysis, are you going to make any of those other tools live to the public?

Jim: Well, we have about 12 internal private disavow analysis tools, and only 1 public disavow tool....we are looking to have a few more public tools for analyzing links for disavow analysis in the coming weeks, and in a few months we'll release our Ultimate Disavow Tool...but for the moment, they're not ready for the public, some of those are fairly expensive to run and very database intensive...but I'm pretty sure I'm looking at more link patterns than anyone else in the world when I'm analyzing backlinks for doing disavows. When I'm tired of doing disavows maybe I'll sell access to some of these.

Aaron: Do you see Google folding in the aggregate disavow data at some point? How might they use it?

Jim: um.....I guess if 50,000 disavow documents have spammywebsite.com listed in their disavows, then Google could consider that spammywebsite.com might be a spammy website.....but then again, with people disavowing links who don't know what they're doing, I'm sure there's a ton of great sites getting listed in Disavow documents in Webmaster Tools.

Aaron: When approaching link building after recovering from a penalty, how does the approach differ from link building for a site that has never been penalized?

Jim: it doesn't really matter....unless you were getting unnatural/artificial links or things in bulk in the past, then, yes, you have to stop doing that now...that game is over if you've been hit...that game is over even if you haven't been hit....Stop doing the artificial link building stuff. Get real citations from real people (and often "by accident") and you should be ok.

Aaron: You mentioned "natural" links. Recently Google has hinted that infographics, press releases & other sorts of links should use nofollow by default. Does Google aim to take some "natural" link sources off the table after they are widely used? Or are those links they never really wanted to count anyhow (and perhaps sometimes didn't) & they are just now reflecting that.

Jim: I think ~most of these didn't count for years anyways....but it's been impossible for Google to nail every directory, or every article syndication site, or every Press Release site, or everything that people can do in bulk..and it's harder to get all occurrences of widgets and mentions of infographics...so it's probably just a "Google scare"....ie, Google says, "Don't do it, No Follow them" (and I think they say that because it often works), and the less of a pattern there is, the harder for Google to catch it (ie, widgets and infographics) ...I think too much of any 1 thing (be it a "type of link") can be a bad signal....as well as things like "too many links from pages that get no traffic", or "no clicks from links to your site". In most cases, because of keyword abuse, Google doesn't want to count them...links like this may be fine (and ok to follow) in moderation...but if you have 1000 widget links, and they all have commercial keywords as link text, then you're treading on what could certainly turn into a negative signal, and so then you might want to consider no following those.

Aaron: There is a bit of a paradox in terms of scaling effective quality SEO services for clients while doing things that are not seen as scalable (and thus future friendly & effective). Can you discuss some of the biggest challenges you faced when scaling IMN? How were you able to scale to your current size without watering things down the way that most larger SEO companies do?

Jim: Scaling while keeping quality has certainly been a challenge in the past. I know that scaling content was an issue for us for a while....how can you scale quality content?....Well, we've found that by connecting real people, the real writers, the people with real social influence...and by taking these people and connecting them to the brands we work with.....so these real people then become "Brand Evangelists"...and getting these real people who know what they're talking about to then write for our clients, well, when we did that we found that we could scale the content issue. We can scale things like link building by merging with the other "mentions", and specifically targeting industries and people and working on building up associations and relations with others has helped to scale...plus we're always building tools to help us scale while keeping quality. It's always a challenge, but we've been pretty good at solving many of those issues.

I think we've been really good at scaling in house....many content marketers are now more like community managers and content managers....we've been close to 100 employees for a few years now..so it's more how can we do more with the existing people we have...and we've been able to do that by connecting real people to the clients so we can actually have better content and better marketing around that content....I'm really happy that the # of employees has been roughly the same for the past few years, but we're doing more business, and the quality keeps getting better....there aren't as many content marketers today as there were a few years ago, but there's many more people working on helping authors build up their authorship value and produce more "great marketing" campaigns where, as a by-product, we happen to get some links and social citations.

Aaron: One of the things I noticed with your site over the past couple years is the sales copy has promoted the fusion of branding and SEO. I looked at your old site in Archive.org over the years & have seen quite an amazing shift in terms of sales approach. Has Google squeezed out most of the smaller players for good & does effective sustainable SEO typically require working for larger trusted entities? When I first got into SEO about 80%+ of the hands in the audiences at conferences were smaller independent players. At the last conference I was at it seemed that about 80% of the hands in the audience worked for big companies (or provided services to big companies). Is this shift in the market irreversible? How would you compare/contrast approach in working with smaller & larger clients?

Jim: Today it's down to "Who really can afford to invest in their Brand?" and "Who can do real things to get real citations from the web?"....and who can think way beyond "links"...if you can't do those things, then you can't have an effective sustainable online marketing program.... we once were a "link building company" for many, many years.... but for the past 3 years we've moved into full service, offering way more than what was "link building services".... yea, SEO was about "links" for years, and it still is to a large degree....but unless you want to get penalized, you have to take the "it's way more than links" approach... in order for SEO to work (w/o fear of getting penalized) today, you have to look at sending in natural signals...so thus, you must do "natural" things...things that will get others "talking" about it, and about you....SEO has evolved a lot over the years....Google used to recommend 1 thing (create a great site and create great things), but for years we all knew that SEO was about links and anchor text....today, I think Google has caught up (to some degree) with the user, and with "real signals"...yesterday it was "gaming" the system....today it's about doing real things...real marketing...and getting your name out to the community via creating great things that spread, and that get people to come back to your site....those SEO's and businesses who don't realize that the game has changed will probably be doing a lot of disavowing at some time in the future, and many SEO's will be out of business if they think it's a game where you can do "fake things" to "get links" in bulk....in a few years we'll see who's still around for internet marketing companies...those who are still around will be those who do real marketing using real people and promoting to other real people...the link game itself has changed...in the past we looked at link graphs...today we look at people graphs....who is talking about you, what are they saying....it's way more than "who links to me, and how do they link to me"....Google is turning it into an "everyone gets a vote", and "everyone has a value"...and in order to rank, you'll need real people of value talking about your site...and you'll need a great user experience when they get there, and you'll need loyal people who continue to return to your site, and you'll need to continue to do great things that get mentions....

SEO is no longer a game of some linking algorithm, it's now really a game of "how can you create a great user experience and get a buzz around your pages and brand".

Aaron: With as much as SEO has changed over the years, it is easy to get tripped up at some point, particularly if one is primarily focused on the short term. One of the more impressive bits about you is that I don't think I've ever seen you unhappy. The "I'm feeling lucky" bit seems to be more than just a motto. How do you manage to maintain that worldview no matter what's changing & how things are going?

Jim: Well, I don't always feel lucky...I know in 2008 when Google hit a few of our clients because we were buying links for them I didn't feel lucky (though the day before, when they ranked #1, I felt lucky)....but I'm in this industry for the long term...I've been doing this for almost 15 years....and yes, we've had to constantly change over the years, and continue to grow, and growing isn't always easy...but it is exciting to me, and I do feel lucky for what I have...I have a job I love, I get to work with people whom I love, in an industry I love, I get to travel around the world and meet wonderful people and see cool places...and employ 100 people and win "Best Places to Work" awards, and I'm able to give back to the community and to society, and to the earth...those things make me feel lucky...SEO has always been like a fun game of chess to me...I'm trying to do the best I can with any move, but I'm also trying to think a few steps ahead, and trying to think what Google is thinking on the other side of the table.....ok...yea, I do feel lucky....maybe it's the old hippy in me...I always see the glass half full, and I'm always dreaming of a better tomorrow....

If I can have lots of happy clients, and happy employees, and do things to make the world a little better along the way, then I'm happy...sometimes I'm a little stressed, but that comes with life....in the end, there's nothing I'd rather be doing than what I currently do....and I always have big dreams of tomorrow that always make the trials of today seem worth it for the goals of what I want to achieve for tomorrow.

Aaron: Thanks Jim!


Jim Boykin is the CEO of the Internet Marketing Ninjas company, and a Blogger and public speaker. You can find Jim on Twitter, Facebook, and Google Plus.

Finding the Perfect Project Management & CRM Tools

Aug 27th

Picking the right tools for project management and CRM functions can feel like an impossible task. I've gone through a number of applications in recent years (just about all of them actually). What makes choosing (or building) the right systems so difficult are the variables we all deal with in our respective workflows.

At some point in the SEO process a checklist doesn't suffice; intuition and experience come into play, and those traits require some intellectual flexibility.

You can build tasks and sub-tasks up to a certain level, but at some point you have to replace the task checklist with a free-form area to capture thoughts and ideas. Those thoughts and ideas can drive the future of the project, yet it's hard to foresee at the beginning of a project which tasks will result from this approach.

How to Determine What You Need

This is hard. You should have an idea of current needs and possible future needs. It really sounds a bit easier than it is. You have to take a number of things into consideration:

  • Your current email, calendar, and document storage set ups
  • You and your staff's mobile devices and OS's
  • Features that you need
  • Features you might need
  • Reporting
  • Scalability of the platforms
  • Desire to rely on 3rd party integrations
  • Ability to manage multiple 3rd party integrations

Inside each of those items are more variables but for the most part these are the key areas to think about.

The hardest part for me was thinking about where I wanted to go. At one point or another I fell into the following categories:

  • Freelancer wanting to grow into an agency owner
  • Freelancer wanting to stay a freelancer
  • Wanting to exclusively work on my own properties
  • Wanting to exclusively focus on client work
  • Mixing client work and self-managed properties
  • Providing clients with more services vs focusing on a core service or two

When you run through those scenarios there are all sorts of tools that make sense, then don't make sense, and tools that kind of make sense. In addition to the categories I mentioned there are also questions about how big do you want to grow an agency.

Do you want a large staff? A small staff? Do you want to be more of an external marketer, or do you want to be more involved in day-to-day operations? Inside those questions are lots of other intersections that can have a significant effect on how your workflow will be structured.

I'll give you some insight into how I determined my set up.

Putting Tools Through Their Paces

I do a mix of things for "work". I run some of my own properties, I have some clients, and I love SeoBook. In addition to this I've also been (slowly) developing a passive real estate investment company for a year (very slowly and pragmatically).

I spent quite a bit of time determining what I wanted to do and where I wanted to go and what made me the "happiest". I've been fortunate enough to be able to take the proper amount of time to determine these things without having to rush into a decision simply based on the need to make a buck.

So, I decided the following was best for me:

  • Work with select clients only
  • Have a small, focused team of great people
  • Continue developing our own web properties and products

Invariably when you make these decisions you leave money on the table somewhere and that's hard. Whether it's abandoning some short-term strategies that have a nice ROI or turning away clients based on a gut feeling or just being maxed out on client work, it's still hard to leave the $ there.

What Are Your Core Products

After deciding what I was going to do and the direction I was going to go it was a relief to eliminate some potential solutions from the mix. Overly complicated CRM's like Zoho and Microsoft Dynamics were out (fine products but overkill for me).

Determining the products and services that we would sell also helped narrow down the email, calendar, and document storage issue.

Sometimes a product is so core to your service that it has a significant influence on your choice of tools. I've been using Google Apps for Business for a while, and our use of Buzzstream cemented that choice. We've also used Exchange in the past, but it doesn't seem to play as nice with Buzzstream as Google Apps does. Outreach is key for us and no one does it better than Buzzstream.

Our other "products and services" are fairly platform independent so the next big thing to deal with was document and information management. However, before we chose a provider for this service we needed to determine what CRM/PM system fit our workflow the best.

In my opinion, document integration is a nice add-on but not 100% necessary if you keep things in one place and have a tight file structure. In a larger organization this might be different but a proper client/project folder set up is easy enough to reference without having to compromise on a CRM/PM solution.

CRM and PM Systems

A post covering everything I went through would be like 10,000 words long, but suffice it to say the most important things to me in these system evaluations were:

  • Ease of Use
  • Speed
  • Reliability
  • Task and Project Template functionality
  • Solid reporting features without overkill
  • Backup functionality
  • Scalability
  • OS agnostic

Compromises will be made when you place any amount of criteria against pre-built solutions. There was a period of time where we might have scaled agency work so I'll mention tools that would have made that cut as well. We ended up settling on:

Using Asana

Asana accomplishes about 90% of what I need. It doesn't work in IE, which means it doesn't work very well on my Windows phone, but in 5 years of dealing with 50+ clients and many internal projects I have yet to encounter a situation where I needed to check in on my phone or where it couldn't wait until I got in front of my computer. I have an iPhone for iOS testing, so in a pinch I could use that. Plus, you can have activity data emailed to your inbox so you can see if the sky is falling either way.

Asana doesn't do backups really well (you have to export as JSON), but it's better than nothing. I have a project manager whom I trust, so I don't need to monitor everything, but I can quickly see the tasks assigned to her in case things are falling behind.

We don't assign tasks to other folks (outreachers, designers, programmers, etc); we just let them do their thing. Asana also integrates with Dropbox and Google Drive if you need that kind of integration. Asana is also task/project only; there's no information storage like there is in something like Basecamp or TeamworkPM (for us, that's ok).

Alternative to Asana = TeamworkPM

The alternative I would recommend, if you have a larger team or just want to have more granular control over things (and also more reporting functions), would be TeamworkPM. It meets all my requirements and then some. I find it just as easy to use as Basecamp but far more robust and it even makes using Gantt Charts easy and fun :)

For us, it's too much, but it really is a nice product that makes scaling your work far easier than Basecamp does. In Basecamp you cannot see all tasks assigned to everyone and their statuses; you have to click on each person to see their individual tasks. This makes multi-employee management cumbersome. TeamworkPM also has project and task templates, while Basecamp only has project templates.

I like the ability to create task list templates only because many of our project requirements involve specific tasks not necessarily present on every single project, so having just project templates is far too broad to be effective.

In addition, Basecamp's file handling is poor and messy for our usage because:

  • There's no file versioning
  • You can't delete a file without deleting the conversation attached to a file (so you have to rename them)
  • No integration with any document service

TeamworkPM integrates with various services and also does file versioning in case you use a service they do not integrate with.

Using Pipeline Deals

PipelineDeals is dead simple to use. It meets just about all my requirements and it has the most important integration a CRM can have: contact integration with my email application (Google Apps). It also has a nice Gmail widget that makes email and contact management between Gmail and Pipeline Deals really slick.

We use Right Signature for document signing, and Pipeline integrates with that as well. It doesn't integrate with BidSketch, which is what we use for proposals, but that's ok. We don't do 20 quotes a week, so that level of automation is nice but not necessary.

PipelineDeals doesn't integrate with Asana either. Again, that's fine in our case. We don't need the CRM to speak to the PM. It also does task templates, which are a big deal to me and our workflows. Reporting and mobile access are excellent as well, without being overly complicated.

Documents and Information

Before I get to what could be an all-in-one solution for CRM/PM, let's talk about documents and information.

I love the idea of easy information retrieval and not having to think about where to put things or where to look for things. There are a few core choices of document and information management to consider, which I'll run through below.

For a more robust, enterprise-level solution we also considered SharePoint. It's pretty complex and very robust, but overkill for us.

Dropbox is excellent except for collaboration. Conflicted file versions are a pain in the butt, but if you don't need any collaborative features it's a good solution. It syncs locally, stores native file types, and integrates with a lot of services.

Evernote is a solid tool for sharing text-based information, but I don't like it for files because it can't be your only file solution, and I'm interested in a file solution that handles all file types.

Google Docs is a wonderfully collaborative document management solution and could probably handle 60-75% of our files. However, we do some custom work with Excel and Word, and some work with videos, and not having the native file available for quick editing is a hassle.

Also, while emailing from Google Docs is a cool feature, it doesn't work if you are emailing inside an existing conversation. If you email from inside Gmail you'll share a link to the file rather than the file itself, and many times we have to send a Word doc or Excel file, so we have to export from Google Docs to the proper file format and then email it.
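If you end up doing that export step often, it can be scripted. Here's a minimal sketch using the Drive v3 files.export endpoint; the OAuth token and file ID are placeholders, and obtaining credentials is out of scope here.

```python
# Minimal sketch: download a Google Doc converted to .docx before emailing it.
# ACCESS_TOKEN and FILE_ID are placeholders.
import requests

ACCESS_TOKEN = "ya29.your-oauth-access-token"  # placeholder
FILE_ID = "your-google-doc-file-id"            # placeholder

DOCX_MIME = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"

def export_doc_as_docx(file_id: str, out_path: str) -> None:
    """Fetch a Google Doc from the Drive v3 API, converted to Word format."""
    resp = requests.get(
        f"https://www.googleapis.com/drive/v3/files/{file_id}/export",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"mimeType": DOCX_MIME},
        timeout=60,
    )
    resp.raise_for_status()
    with open(out_path, "wb") as fh:
        fh.write(resp.content)

export_doc_as_docx(FILE_ID, "proposal.docx")
```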

Choosing Skydrive

Skydrive does what Dropbox does and what Google Docs does while maintaining the more widely accepted Office formats. We chose Skydrive for this reason. It's OS agnostic and works across iOS, Android, and of course Windows Phone. For iOS and Android you need an active Office 365 subscription. On iPads you would still access it via the browser, though I believe an iPad version of what's on the iPhone now isn't far off at all.

We use Skydrive for project files, reference files, and collaborative files for site/project strategy. This leaves email correspondence with clients as the remaining piece of the information puzzle. CRM email storage is great for pre-sales, up-sells, and billing correspondence but what about project related email?

Project Related Emails

Most PM solutions allow you to email a message to a client from your PM interface and continue the correspondence there. This is great until someone starts adding other bits of information to an email (not everyone sticks to the subject line :D) and it quickly becomes unruly.

Probably the most tried and true solution is either to keep all email correspondence (and notes from calls) in a CRM and label the note appropriately, or to document project-related material in a project notebook or message. Asana doesn't have this option, but TeamworkPM does.

My preference is to just keep that stuff in a CRM for easy reference but for larger teams I'd go with keeping it in the CRM + summarizing in the PM system.

There's another solution though. There's a product out there that combines CRM/PM into one app and makes keeping information together fairly simple.

Considering Insightly.Com

Insightly is a pretty robust and affordable CRM/PM solution. Its email dropbox allows you to keep emails stored for quick reference across projects and contacts.

The reason it can handle emails in this way is due to its unique linking relationships. You can link a contact and/or an organization to a project (and multiple projects).

You can easily see all projects associated with X, but what's even more powerful is that you can link vendors to projects too. When you BCC your project dropbox it will also link the email to the participants on the project, and there's an "Email" tab in the project interface so you can see all the relevant emails for that project, whether they're with a client, vendor, staff member, etc.

If we were to move into a more client-facing company Insightly would merit strong consideration for its unique ability to easily keep all related information together.

Is Automation Overrated Sometimes?

I like automation, to an extent. I like syncing 2 apps together directly. There's a service out there called Zapier, which does a great job of linking otherwise incompatible services together. My hesitation here is relying on too many "parties" to accomplish tasks.

Automation is wonderful, really, but I would recommend sitting down and thinking about what automation you really need, how helpful it will really be, and what happens if a 3rd, 4th, or 5th party goes under.

For me, an example would be when I was considering Highrise.

  • Contacts sync provided by a third-party
  • Task templates provided by another third-party
  • Document integration provided by another third-party

I'm hesitant to rely on these extra services for core functionality because these functions are crucial for my business. There could be situations where those services get abandoned, an API changes and you're waiting for a fix, and so on.

There are plenty of services that offer integration between core apps like contacts, billing, time tracking, quoting, and so on. I just think it's wise to consider very carefully what you are relying on for core functionality; if you have to go outside of your chosen application too much, it might be time to consider a new one.

Compromises and Moving Forward

If you choose any pre-built solution you're probably going to have some compromises. I have found that structure is really important, and that easy access to information, data, and task progress is more important than features and "options".

I think having too many services inside of your operation is a hindrance to being as productive and efficient as possible. Knowing where to look and why to look is half the battle. If you're running multiple project management solutions, multiple document management solutions, and so on then you might want to consider more efficient ways to handle your operation.

Without going through this process multiple times over the years, there is no way I would have been able to stay as lean as possible while being as efficient as possible. Doing both of those things correctly usually leads to:

  • Happier clients
  • A more productive work environment
  • A more profitable business

The Benefits Of Thinking Like Google

Aug 27th
posted in

Shadows are the color of the sky.

It’s one of those truths that is difficult to see, until you look closely at what’s really there.

To see something as it really is, we should try to identify our own bias, and then let it go.

“Unnatural” Links

This article tries to make sense of Google's latest moves regarding links.

It’s a reaction to Google’s update of their Link Schemes policy. Google’s policy states “Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines." I wrote on this topic, too.

Those with a vested interest in the link building industry - which is pretty much all of us - might spot the problem.

Google's negative emphasis, of late, has been about links. Their message is not new, just the emphasis. The new emphasis could pretty much be summarized thus: "any link you build for the purpose of manipulating rank is outside the guidelines." Google have never encouraged activity that could manipulate rankings, which is precisely what those building links for SEO purposes attempt to do. Building links for the purposes of higher rank AND staying within Google's guidelines will not be easy.

Some SEOs may kid themselves that they are link building "for the traffic", but if that were the case, they'd have no problem insisting those links were scripted so they could monitor traffic statistics, or, at the very least, no-followed, so there could be no confusion about intent.

How many do?
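To make that concrete, here's a minimal sketch of what a link built purely "for the traffic" could look like: campaign-tagged so the clicks can be measured, and no-followed so the intent is unambiguous. The URL, anchor text, and parameter values are all placeholders.

```python
# Minimal sketch of a tracked, no-followed outbound link. All URLs and
# parameter values are placeholders; the UTM names follow the common
# Google Analytics campaign-tagging convention.
from urllib.parse import urlencode, urlparse, urlunparse

def tracked_nofollow_link(url: str, anchor_text: str, campaign: str) -> str:
    """Return an HTML anchor with UTM tagging and rel="nofollow"."""
    parts = urlparse(url)
    query = urlencode({
        "utm_source": "guest-post",   # placeholder source label
        "utm_medium": "referral",
        "utm_campaign": campaign,
    })
    tagged = urlunparse(parts._replace(query=query))
    return f'<a href="{tagged}" rel="nofollow">{anchor_text}</a>'

print(tracked_nofollow_link("https://example.com/guide", "our guide", "spring-outreach"))
```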

Think Like Google

Ralph Tegtmeier: In response to Eric's assertion "I applaud Google for being more and more transparent with their guidelines", Ralph writes: "man, Eric: isn't the whole point of your piece that this is exactly what they're NOT doing, becoming 'more transparent'?"

Indeed.

In order to understand what Google is doing, it can be useful to downplay any SEO bias, i.e. what we may like to see from an SEO standpoint, and instead try to look at the world from Google's point of view.

I ask myself “if I were Google, what would I do?”

Clearly I'm not Google, so these are just my guesses, but if I were Google, I'd see all SEO as a potential competitive threat to my click advertising business. The more effective the SEO, the more of a threat it is. SEOs can't be eliminated, but they can be corralled and managed in order to reduce the level of competitive threat. Partly, this is achieved by algorithmic means. Partly, this is achieved using public relations. If I were Google, I would think SEOs are potentially useful if they could be encouraged to provide high quality content and make sites easier to crawl, as this suits my business case.

I’d want commercial webmasters paying me for click traffic. I’d want users to be happy with the results they are getting, so they keep using my search engine. I’d consider webmasters to be unpaid content providers.

Do I (Google) need content? Yes, I do. Do I need any content? No, I don't. If anything, there is too much content, and a lot of it is junk. In fact, I'm getting more and more selective about the content I do show. So selective, in fact, that a lot of what I show above the fold is controlled and "published", in the broadest sense of the word, by me (Google) in the form of the Knowledge Graph.

It is useful to put ourselves in someone else's position to understand their truth. If you do, you'll soon realise that Google aren't the webmaster's friend if your aim, as a webmaster, is to do anything that "artificially" enhances your rank.

So why are so many SEOs listening to Google’s directives?

Rewind

A year or two ago, it would have been madness to suggest webmasters would pay to remove links, but that's exactly what's happening. Not only that, webmasters are doing Google link quality control. For free. They're pointing out the links they see as being "bad" - links Google's algorithms may have missed.

Check out this discussion. One exasperated SEO tells Google that she tries hard to get links removed, but doesn’t hear back from site owners. The few who do respond want money to take the links down.

It is understandable that site owners don't spend much time removing links. From a site owner's perspective, taking links down involves a time cost, so there is no benefit to the site owner in doing so, especially if they receive numerous requests. Secondly, taking down links may be perceived as an admission of guilt. Why would a webmaster admit their links are "bad"?

The answer to this problem, from Google's John Mueller is telling.

A shrug of the shoulders.

It’s a non-problem. For Google. If you were Google, would you care if a site you may have relegated for ranking manipulation gets to rank again in future? Plenty more where they came from, as there are thousands more sites just like it, and many of them owned by people who don’t engage in ranking manipulation.

Does anyone really think their rankings are going to return once they’ve been flagged?

Jenny Halasz then hinted at the root of the problem. Why can’t Google simply not count the links they don’t like? Why make webmasters jump through arbitrary hoops? The question was side-stepped.

If you were Google, why would you make webmasters jump through hoops? Is it because you want to make webmasters lives easier? Well, that obviously isn’t the case. Removing links is a tedious, futile process. Google suggest using the disavow links tool, but the twist is you can’t just put up a list of links you want to disavow.

Say what?

No, you need to show you’ve made some effort to remove them.

Why?

If I were Google, I'd see this information supplied by webmasters as being potentially useful. They provide me with a list of links that the algorithm missed, or considered borderline, but the webmaster has reviewed and thinks look bad enough to affect their ranking. If the webmaster simply provided a list of links dumped from a link tool, it's probably not telling Google much that Google doesn't already know. There's been no manual link review.

So, what webmasters are doing is helping Google by manually reviewing links and reporting bad links. How does this help webmasters?

It doesn’t.

It just increases the temperature of the water in the pot. Is the SEO frog just going to stay there, or is he going to jump?

A Better Use Of Your Time

Does anyone believe rankings are going to return to their previous positions after such an exercise? A lot of webmasters aren’t seeing changes. Will you?

Maybe.

But I think it’s the wrong question.

It's the wrong question because it's just another example of letting Google define the game. What are you going to do when Google define you right out of the game? If your service or strategy involves links right now, then in order to remain snow white, any links you place for the purpose of achieving higher rank are going to need to be no-followed to be clear about intent. Extreme? What's going to be the emphasis in six months' time? Next year? How do you know what you're doing now is not going to be frowned upon, then need to be undone, next year?

A couple of years ago it would be unthinkable that webmasters would report and remove their own links, even paying for them to be removed, but that’s exactly what’s happening. So, what is next year's unthinkable scenario?

You could re-examine the relationship and figure that what you do on your site is absolutely none of Google's business. They can issue as many guidelines as they like, but they do not own your website, or the link graph, and therefore don't have authority over you unless you allow it. Can they ban your site because you're not compliant with their guidelines? Sure, they can. It's their index. That is the risk. How do you choose to manage this risk?

It strikes me you can lose your rankings at any time whether you follow the current guidelines or not, especially when the goal-posts keep moving. So the risk of not following the guidelines and the risk of following the guidelines but not ranking well are pretty much the same - no traffic. Do you have a plan to address the "no traffic from Google" risk, however that may come about?

Your plan might involve advertising on other sites that do rank well. It might involve, in part, a move to PPC. It might be to run multiple domains, some well within the guidelines, and some way outside them. Test, see what happens. It might involve beefing up other marketing channels. It might be to buy competitor sites. Your plan could be to jump through Google’s hoops if you do receive a penalty, see if your site returns, and if it does - great - until next time, that is.

What's your long term "traffic from Google" strategy?

If all you do is "follow Google's Guidelines", I'd say that's now a high risk SEO strategy.

Winning Strategies to Lose Money With Infographics

Aug 26th

Google is getting a bit absurd with suggesting that any form of content creation that drives links should include rel=nofollow. Certainly some techniques may be abused, but if you follow the suggested advice, you are almost guaranteed to have a negative ROI on each investment - until your company goes under.

Some will describe such advice as taking a "sustainable" and "low-risk" approach, but such strategies are only "sustainable" and "low-risk" so long as ROI doesn't matter & you are spending someone else's money.

The advice on infographics in the above video suggests that embed code by default should include nofollow links.
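For concreteness, here's a minimal sketch of the kind of embed code that advice implies: the image gets syndicated, but the attribution links carry rel=nofollow. The image URL, source URL, title, and width are placeholders.

```python
# Minimal sketch of "nofollow by default" infographic embed code.
# All URLs, the title, and the width are placeholders.
def infographic_embed(img_url: str, source_url: str, title: str, width: int = 600) -> str:
    """Build copy-and-paste embed HTML whose attribution links are no-followed."""
    return (
        f'<a href="{source_url}" rel="nofollow">'
        f'<img src="{img_url}" alt="{title}" width="{width}"></a><br>'
        f'Infographic by <a href="{source_url}" rel="nofollow">{title}</a>'
    )

print(infographic_embed(
    "https://example.com/images/infographic.png",
    "https://example.com/infographic",
    "Example Infographic",
))
```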

Companies can easily spend at least $2,000 to research, create, revise & promote an infographic. And something like 9 out of 10 infographics will go nowhere. That means you are spending about $20,000 for each successful viral infographic. And this presumes that you know what you are doing. Mix in a lack of experience, poor strategy, poor market fit, or poor timing and that cost only goes up from there.

If you run smaller & lesser known websites, quite often Google will rank a larger site that syndicates the infographic above the original source. They do that even when the links are followed. Mix in nofollow on the links and it is virtually guaranteed that you will get outranked by someone syndicating your infographic.

So if your own featured premium content - content that you dropped 4 or 5 figures on - gets counted as duplicate content AND you don't get links out of it, how exactly does the investment ever have any chance of backing out?

Sales?

Not a snowball's chance in hell.

An infographic created around "the 10 best ways you can give me your money" won't spread. And if it does spread, it will be people laughing at you.

I also find the claim that people putting something 20,000 pixels tall on their site are not actively vouching for it a bit disingenuous. If something was crap and people still felt like burning 20,000 pixels on syndicating it, surely they could add nofollow on their end to express their dissatisfaction and disgust with the piece.

Many dullards in the SEO industry give Google a free pass on any & all of their advice, as though it is always reasonable & should never be questioned. And each time it goes unquestioned, the ability to exist in the ecosystem as an independent player diminishes as the entire industry moves toward being classified as some form of spam & getting hit or not depends far more on who than what.

Does Google's recent automated infographic generator give users embed codes with nofollow on the links? Not at all. Instead they give you the URL without nofollow & those URLs are canonicalized behind the scenes to flow the link equity into the associated core page.

No cost cut-n-paste mix-n-match = direct links. Expensive custom research & artwork = better use nofollow, just to be safe.

If Google actively adds arbitrary risks to some players while subsidizing others then they shift the behaviors of markets. And shift the markets they do!

Years ago Twitter allowed people who built their platform to receive credit links in their bio. Matt Cutts tipped off Ev Williams that the profile links should be nofollowed & that flow of link equity was blocked.

It was revealed in the WSJ that in 2009 Twitter's internal metrics showed an 11% spammy Tweet rate & Twitter had a grand total of 2 "spam science" programmers on staff in 2012.

With smaller sites, they need to default everything to nofollow just in case anything could potentially be construed (or misconstrued) to have the intent to perhaps maybe sorta be aligned potentially with the intent to maybe sorta be something that could maybe have some risk of potentially maybe being spammy or maybe potentially have some small risk that it could potentially have the potential to impact rank in some search engine at some point in time, potentially.

A larger site can have over 10% of their site be spam (based on their own internal metrics) & set up their embed code so that the embeds directly link - and they can do so with zero risk.

I just linked to Twitter twice in the above embed. If those links were directly to Cygnus it may have been presumed that either he or I was a spammer, but put the content on Twitter, with 143,199 Tweets in a second, and those links are legit & clean. Meanwhile, fake Twitter accounts have grown to such a scale that even Twitter is now buying them to try to stop them. Twitter's spam problem was so large that once they started to deal with spam their growth estimates dropped dramatically:

CEO Dick Costolo told employees he expected to get to 400 million users by the end of 2013, according to people familiar with the company.

Sources said that Twitter now has around 240 million users, which means it has been adding fewer than 4.5 million users a month in 2013. If it continues to grow at that rate, it would end this year around the 260 million mark — meaning that its user base would have grown by about 30 percent, instead of Costolo’s 100 percent goal.

Typically there is no presumed intent to spam so long as the links are going into a large site (sure, there are a handful of token counter-examples shills can point at). By and large it is only when the links flow out to smaller players that they are spam. And when they do, they are presumed to be spam even if they point into featured content that cost thousands of dollars. You better use nofollow, just to play it safe!

That duality is what makes blind, unquestioning adherence to Google scripture so unpalatable. A number of people are getting disgusted enough by it that they can't help but comment on it: David Naylor, Martin Macdonald & many others that DennisG highlighted.

Oh, and here's an infographic for your pleasurings.

Google: Press Release Links

Aug 7th
posted in

So, Google have updated their Webmaster Guidelines.

Here are a few common examples of unnatural links that violate our guidelines:....Links with optimized anchor text in articles or press releases distributed on other sites.

For example: There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

In particular, they have focused on links with optimized anchor text in articles or press releases distributed on other sites. Google being Google, these rules are somewhat ambiguous. “Optimized anchor text”? The example they provide includes keywords in the anchor text, so keywords in the anchor text is “optimized” and therefore a violation of Google’s guidelines.

Ambiguously speaking, of course.

To put the press release change in context, Google’s guidelines state:

Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site

So, links gained, for SEO purposes - intended to manipulate ranking - are against Google Guidelines.

Google vs Webmasters

Here’s a chat...

In this chat, Google's John Mueller says that, if the webmaster initiated it, then it isn't a natural link. If you want to be on the safe side, John suggests using no-follow on links.

Google are being consistent, but what’s amusing is the complete disconnect on display from a few of the webmasters. Google have no problem with press releases, but if a webmaster wants to be on the safe side in terms of Google’s guidelines, the webmaster should no-follow the link.

Simple, right? If it really is a press release, and not an attempt to link build for SEO purposes, then why would a webmaster have any issue with adding a no-follow to a link?

He/she wouldn't.

But because some webmasters appear to lack self-awareness about what it is they are actually doing, they persist with their line of questioning. I suspect what they really want to hear is “keyword links in press releases are okay." Then, webmasters can continue to issue pretend press releases as a link building exercise.

They're missing the point.

Am I Taking Google’s Side?

Not taking sides.

Just hoping to shine some light on a wider issue.

If webmasters continue to let themselves be defined by Google, they are going to get defined out of the game entirely. It should be an obvious truth - but it is sadly lacking in much SEO punditry - that Google is not on the webmaster's side. Google is on Google's side. Google often say they are on the user's side, and there is certainly some truth in that.

However, when it comes to the webmaster, the webmaster is a dime-a-dozen content supplier who must be managed, weeded out, sorted and categorized. When it comes to the more "aggressive" webmasters, Google's behaviour could be characterized as "keep your friends close, and your enemies closer".

This is because some webmasters, namely SEOs, don’t just publish content for users, they compete with Google’s revenue stream. SEOs offer a competing service to click based advertising that provides exactly the same benefit as Google's golden goose, namely qualified click traffic.

If SEOs get too good at what they do, then why would people pay Google so much money per click? They wouldn’t - they would pay it to SEOs, instead. So, if I were Google, I would see SEO as a business threat, and manage it - down - accordingly. In practice, I’d be trying to redefine SEO as “quality content provision”.

Why don't Google simply ignore press release links? Easy enough to do. Why go this route of making it public? After all, Google are typically very secret about algorithmic topics, unless the topic is something they want you to hear. And why do they want you to hear this? An obvious guess would be that it is done to undermine link building, and SEOs.

Big missiles heading your way.

Guideline Followers

The problem in letting Google define the rules of engagement is they can define you out of the SEO game, if you let them.

If an SEO is not following the guidelines - guidelines that are always shifting - yet claims they do, then they may be opening themselves up to legal liability. In one recent example, a case is underway alleging lack of performance:

Last week, the legal marketing industry was aTwitter (and aFacebook and even aPlus) with news that law firm Seikaly & Stewart had filed a lawsuit against The Rainmaker Institute seeking a return of their $49,000 in SEO fees and punitive damages under civil RICO

.....but it’s not unreasonable to expect a somewhat easier route for litigants in the future might be “not complying with Google’s guidelines”, unless the SEO agency disclosed it.

SEO is not the easiest career choice, huh.

One group that is likely to be happy about this latest Google push is legitimate PR agencies, media-relations departments, and publicists. As a commenter on WMW pointed out:

I suspect that most legitimate PR agencies, media-relations departments, and publicists will be happy to comply with Google's guidelines. Why? Because, if the term "press release" becomes a synonym for "SEO spam," one of the important tools in their toolboxes will become useless.

Just as real advertisers don't expect their ads to pass PageRank, real PR people don't expect their press releases to pass PageRank. Public relations is about planting a message in the media, not about manipulating search results

However, I'm not sure that will mean press releases are seen as any more credible, as press releases have never enjoyed a stellar reputation pre-SEO, but it may thin the crowd somewhat, which increases an agency's chances of getting their client seen.

Guidelines Honing In On Target

One resource referred to in the video above was this article, written by Amit Singhal, who is head of Google’s core ranking team. Note that it was written in 2011, so it’s nothing new. Here’s how Google say they determine quality:

we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?

….and so on. Google’s rhetoric is almost always about “producing high quality content”, because this is what Google’s users want, and what Google’s users want, Google’s shareholders want.

It’s not a bad thing to want, of course. Who would want poor quality content? But as most of us know, producing high quality content is no guarantee of anything. Great for Google, great for users, but often not so good for publishers as the publisher carries all the risk.

Take a look at the Boston Globe, sold along with a boatload of content for a 93% decline. Quality content sure, but is it a profitable business? Emphasis on content without adequate marketing is not a sure-fire strategy. Bezos has just bought the Washington Post, of course, and we're pretty sure that isn't a content play, either.

High quality content often has a high upfront production cost attached to it, and given measly web advertising rates, the high possibility of invisibility, and the chance of getting content scraped and ripped off, it is no wonder webmasters also push their high quality content in order to ensure it ranks. What other choice have they got?

To not do so is also risky.

Even eHow, well known for cheap factory line content, is moving toward subscription membership revenues.

The Somewhat Bigger Question

Google can move the goal-posts whenever they like. What you're doing today might be frowned upon tomorrow. One day, your content may be made invisible, and there will be nothing you can do about it, other than start again.

Do you have a contingency plan for such an eventuality?

Johnon puts it well:

The only thing that matters is how much traffic you are getting from search engines today, and how prepared you are for when some (insert adjective here) Googler shuts off that flow of traffic.

To ask about the minutiae of Google's policies and guidelines is to miss the point. The real question is how prepared you are when Google shuts off your flow of traffic because they've reset the goal-posts.

Focusing on the minutiae of Google's policies is, indeed, to miss the point.

This is a question of risk management. What happens if your main site, or your client's site, runs afoul of a Google policy change and gets trashed? Do you run multiple sites? Run one site with no SEO strategy at all, whilst you run other sites that push hard? Do you stay well within the guidelines and trust that will always be good enough? If you stay well within the guidelines, but don't rank, isn't that effectively the same as a ban i.e. you're invisible? Do you treat search traffic as a bonus, rather than the main course?

Be careful about putting Google’s needs before your own. And manage your risk, on your own terms.

Google Keyword(Not Provided): High Double Digit Percent

Aug 3rd

Most Organic Search Data is Now Hidden

Over the past couple years since its launch, Google's keyword (not provided) has received quite a bit of exposure, with people discussing all sorts of tips on estimating its impact & finding alternate sources of data (like competitive research tools & webmaster tools).

What hasn't received anywhere near enough exposure (and should be discussed daily) is that the sole purpose of the change was anti-competitive abuse by the market monopoly in search.

The site which provided a count for (not provided) recently displayed over 40% of queries as (not provided), but that percentage didn't include the large percent of mobile search users that were showing no referrals at all & were showing up as direct website visitors. On July 30, Google started showing referrals for many of those mobile searchers, using keyword (not provided).

According to research by RKG, mobile click prices are nearly 60% of desktop click prices, while mobile search click values are only 22% of desktop click prices. Until Google launched enhanced AdWords campaigns they understated the size of mobile search by showing many mobile searchers as direct visitors. But now that AdWords advertisers have been opted into mobile ads (and have to go through some tricky hoops to figure out how to disable them), Google has every incentive to promote what a big growth channel mobile search is for their business.

Looking at the analytics data for some non-SEO websites over the past 4 days I get Google referring an average of 86% of the 26,233 search visitors, with 13,413 being displayed as keyword (not provided).

Hiding The Value of SEO

Google is not only hiding half of their own keyword referral data, but they are hiding so much more than half that even when you mix in Bing and Yahoo! you still get over 50% of the total hidden.

Google's 86% of the 26,233 searches is 22,560 searches.

Keyword (not provided) being shown for 13,413 is 59% of 22,560. That means Google is hiding at least 59% of the keyword data for organic search. While they are passing a significant share of mobile search referrers, there is still a decent chunk that is not accounted for in the change this past week.
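Here's the same arithmetic, worked through with the visit counts quoted above, so you can plug in your own analytics numbers.

```python
# Worked example of the (not provided) share calculation using the figures
# quoted above; swap in your own analytics numbers.
total_search_visits = 26_233   # all organic search visits over the period
google_share = 0.86            # portion of those visits referred by Google
not_provided = 13_413          # visits reported as keyword (not provided)

google_visits = round(total_search_visits * google_share)   # ~22,560
hidden_share = not_provided / google_visits                 # ~0.59

print(f"Google organic visits: {google_visits}")
print(f"(not provided) share of Google organic: {hidden_share:.0%}")
```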

Not passing keywords is just another way for Google to increase the perceived risk & friction of SEO, while making SEO seem less necessary, which has been part of "the plan" for years now.

Buy AdWords ads and the data gets sent. Rank organically and most of the data is hidden.

When one digs into keyword referral data & ad blocking, there is a bad odor emanating from the GooglePlex.

Subsidizing Scammers Ripping People Off

A number of the low end "solutions" providers scamming small businesses looking for SEO are taking advantage of the opportunity that keyword (not provided) offers them. A buddy of mine took over SEO for a site that had shown absolutely zero sales growth after a year of 15% monthly increases in search traffic. Looking at the on-site changes, the prior "optimizers" did nothing over the time period. Looking at the backlinks, nothing there either.

So what happened?

Well, when keyword data isn't shown, it is pretty easy for someone to run a clickbot to show keyword (not provided) Google visitors & claim that they were "doing SEO."

And searchers looking for SEO will see those same scammers selling bogus solutions in AdWords. Since they are selling a non-product / non-service, their margins are pretty high. Endorsed by Google as the best, they must be good.

Or something like that:

Google does prefer some types of SEO over others, but their preference isn’t cast along the black/white divide you imagine. It has nothing to do with spam or the integrity of their search results. Google simply prefers ineffective SEO over SEO that works. No question about it. They abhor any strategies that allow guys like you and me to walk into a business and offer a significantly better ROI than AdWords.

This is no different than the YouTube videos "recommended for you" that teach you how to make money on AdWords by promoting Clickbank products which are likely to get your account flagged and banned. Ooops.

Anti-competitive Funding Blocking Competing Ad Networks

John Andrews pointed to Google's blocking (then funding) of AdBlock Plus as an example of their monopolistic inhibiting of innovation.

sponsoring Adblock is changing the market conditions. Adblock can use the money provided by Google to make sure any non-Google ad is blocked more efficiently. They can also advertise their addon better, provide better support, etc. Google sponsoring Adblock directly affects Adblock's ability to block the adverts of other companies around the world. - RyanZAG

Turn AdBlock Plus on & search for credit cards on Google and get ads.

Do that same search over at Bing & get no ads.

How does a smaller search engine or a smaller ad network compete with Google on buying awareness, building a network AND paying the other kickback expenses Google forces into the marketplace?

They can't.

Which is part of the reason a monopoly in search can be used to control the rest of the online ecosystem.

Buying Browser Marketshare

Already the #1 web browser, Google Chrome buys marketshare with shady one-click bundling in software security installs.

If you do that stuff in organic search or AdWords, you might be called a spammer employing deceptive business practices.

When Google does it, it's "good for the user."

Vampire Sucking The Lifeblood Out of SEO

Google tells Chrome users "not signed in to Chrome (You're missing out - sign in)." Login to Chrome & searchers don't pass referral information. Google also promotes Firefox blocking the passage of keyword referral data in search, but when it comes to their own cookies being at risk, that is unacceptable: "Google is pulling out all the stops in its campaign to drive Chrome installs, which is understandable given Microsoft and Mozilla's stance on third-party cookies, the lifeblood of Google's display-ad business."

What do we call an entity that considers something "its lifeblood" while sucking it out of others?
