Google collects a lot of information on individuals & can establish some level of confidence as to whether a person is real based on things like their history of email usage, whether they have a credit card on file, how they interact with other high-confidence real accounts, how many people are friends with them on Google+, usage of an Android cell phone, their search history, etc.
Google doesn't need all those signals on any individual, just some blend of them.
From there they can create a lot of usage-based brand signals.
Query Volume + Click Distribution
For any keyword Google can see the search volume & the click distribution on the search results.
If a lot of people click on the top result & very few people click on the second or third result, there is a strong chance the keyword is a brand. If the click distribution is spread more evenly across the search results, then it is less likely to be a brand keyword.
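As a rough sketch of the idea, the share of clicks going to the top result can be used to score how "brand-like" a keyword looks. The thresholds below are invented for illustration; this is not any engine's actual formula.

```python
def brand_likelihood(clicks):
    """Toy classifier: estimate how 'brand-like' a keyword is from the
    share of clicks captured by the #1 search result.

    `clicks` is a list of click counts by result position (1st, 2nd, ...).
    The 0.7 / 0.3 cutoffs are made-up illustration values."""
    total = sum(clicks)
    if total == 0:
        return "unknown"
    top_share = clicks[0] / total
    if top_share > 0.7:
        return "likely brand"    # clicks pile onto result #1
    if top_share < 0.3:
        return "likely generic"  # clicks spread across many results
    return "ambiguous"

# A navigational/brand-style query vs. a generic research query:
concentrated = brand_likelihood([900, 40, 30, 20, 10])
spread = brand_likelihood([250, 220, 200, 180, 150])
```

With 90% of clicks on the first result the keyword scores as "likely brand"; with clicks spread nearly evenly it scores as "likely generic."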
The above was a hypothetical example, but the following image shows how lower volume branded navigational keywords can drive far more traffic than broader industry keywords. We get twice as much traffic for seobook & seo book as we do for seo.
When people search for a generic keyword they may (immediately or later) modify their search query to search for related keywords. In the past Microsoft offered a search funnels tool that would show common searches before & after a keyword. If someone searched for credit cards they might soon search for visa or mastercard.
Of course getting the user to click is just the first step. From there you must satisfy them. ;)
If you visit a page briefly & then jump right back to the search results, Google asks users for an explicit vote against that site.
And if you visit a page for a significant period of time Google asks users for an explicit vote for that site.
That Google measures the time until you return to the search results to determine which explicit vote to request also implies that they can use the same aggregate data to create an implicit signal.
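A minimal sketch of what such an implicit signal could look like: a quick bounce back to the results counts against a page, a long visit counts for it, and the per-session votes are aggregated into a score. The 30s/120s cutoffs are invented for illustration, not anything Google has published.

```python
def dwell_signal(seconds_on_page, returned_to_serp):
    """Hypothetical implicit engagement vote for one search session.

    A 'short click' (fast return to the results page) is a vote against;
    a 'long click' is a vote for. Thresholds are illustrative only."""
    if returned_to_serp and seconds_on_page < 30:
        return -1  # short click: implicit vote against
    if seconds_on_page > 120:
        return +1  # long click: implicit vote for
    return 0       # neutral / inconclusive

# Aggregate across many (dwell time, returned-to-SERP) sessions
# to get a per-page engagement score:
sessions = [(5, True), (300, False), (200, True), (10, True)]
score = sum(dwell_signal(t, r) for t, r in sessions)
```

In this made-up sample the two quick bounces cancel out the two long visits, leaving a neutral score of 0.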
Where this measurement can get a bit fuzzy is that Panda can create a self-reinforcing impact (good or bad).
Self-Reinforcing Positive Impacts
Let's say your site got a ranking boost from Panda. It will rank higher across broader industry keywords, such that people may enter your site at the category level (say shoes or Nike shoes) and then surf around your site quite a bit. This equates to longer time on site & a better user experience.
2 more factors on this front are branded navigation & familiarity.
On some search results Google shows branded search options.
If clicking those brand & store links feeds into the Vince relevancy signal, then any brand featured there has a huge wind at their back, building further brand signals. Eventually such suggestions can work their way into Google Instant keyword suggestions as well. Even if people do not click on those particular options, the various highlights in the search results act as advertisements for the brands, which drive incremental demand and search volume for those brands.
Amazon.com is responsible for roughly 1/3 of ecommerce spend in the United States (outside of travel), so many people might research product options generally & then conclude those search sessions by seeing if they can buy the product off Amazon.com (due to free shipping & the high level of user trust Amazon has). As this becomes part of search relevancy algorithms, it is the online equivalent of going to your local Borders store to find something to buy & then buying it on Amazon. In the short run you save a few Dollars, but in the long run stores like Borders go out of business.
Self-Reinforcing Negative Impacts
There are 2 bad ways a business can be impacted by Panda. One is missing out on the above promotional options that a large competitor may enjoy, which over time build more brand signals for them & leave your site stranded in no man's land until it is finally clipped by Panda for lacking "quality."
A second issue is a self-reinforcing issue with Panda itself. On WMW a user nicknamed Walkman described it as the "size 13 shoe problem." After you have been hit by Panda you are not likely to rank for broader category-level searches. However, you might still rank for some really obscure longtail keyword that is uneconomic to address directly (and thus your page only has a glancing mention of the user's intent). Your page might say "we do not carry size 13" or "size 13 out of stock," and your Panda-hit site ranks for "Nike Carmelo Anthony size 13." Thus the user bounces, creating a self-reinforcing negative user experience signal.
The above examples of +1 votes and blocks can be used (along with time on site & repeat visits) to gauge user satisfaction. However, if Google can't get enough engagement, then it will be very easy for big brands to buy that signal for pennies on the Dollar, as some social signals are easily bought by brands.
Not only does Amazon directly integrate promoting your wishlist on social media ...
... but they also have done interesting promotions like a "Tweet & get" ...
Imagine if/when a new local Wal-Mart store launches offering a free $10 coupon to everyone who Tweets their savings at the checkout counter!
One big issue I have with the +1 votes & blocks is that they apply across the board. I may dislike some craptastic videos hosted on YouTube, but there is also a lot of great content there. I love eBay for vintage video games, but it does not mean I love them for books.
Likewise some of the friend of friend stuff can be a bit off.
At some point Google should make +1 votes & blocks more granular.
Near the end of this article I will also further discuss some issues with ad votes.
Does Google measure repeat visitors? Yes.
They use that user interaction to ask for an explicit vote...
...and they can use it as an implicit vote as well.
They not only track how many times you visit a page or site, but also when you last visited it.
Once it is obvious Google is counting certain types of user metrics (just like they count links), there will be a race to the bottom to provide those signals. That race to the bottom will lead to such signals being sold by accounts that have sketchy trust metrics associated with them (if done through automation) and/or by workers in markets with lower living costs.
And they can also track where the votes come from.
If your domain name matches your keyword that may be a brand signal. However, Google may also look at some other signals (like user engagement, repeat visits, relative CTR, etc.) as confirmation signals on this front.
Sometimes when a spammer builds links they trap themselves by using the same anchor text too much. Whereas when a branded website pulls in organic citations the anchor text tends to be mixed up, like...
Diversity in any sense (anchor text, linking sources, pages being linked to, links built across time, etc.) is generally considered a good thing.
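One illustrative way to quantify anchor-text diversity is the Shannon entropy of the anchor distribution: near zero means one phrase dominates (a classic over-optimization footprint), while a higher value means a more natural mix. This is purely a sketch, not any engine's actual measure.

```python
import math
from collections import Counter

def anchor_entropy(anchors):
    """Shannon entropy (in bits) of an anchor-text distribution.

    Illustrative diversity measure: 0 when a single phrase is used for
    every link, and up to log2(n) when n distinct anchors appear equally."""
    counts = Counter(anchors)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# One dominant phrase -> entropy near 0 (spammy footprint)
spammy = ["cheap shoes"] * 10

# Organic citations mix brand names, URLs, and incidental phrases
natural = ["SEO Book", "seobook.com", "Aaron Wall's blog", "click here",
           "http://www.seobook.com", "this SEO guide"]
```

Running `anchor_entropy` on the repeated-phrase profile returns 0 bits, while the mixed profile returns about 2.6 bits; the gap, not the absolute number, is the point.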
Other types of links might also be seen as potential brand signals. For instance, frequent exposure in trusted news sites, other trusted seed sites, or other known brand sites could pass additional karma. Some link spikes that are also associated with strong direct traffic spikes, strong referral traffic from the links, and strong brand searches might also boost the weight given to links.
In local search Google has long used the sites they displaced in the organic results as citations (even if they were in some cases unlinked).
In addition to offering branded filters in their internal navigation, many merchants submitting their products to Google product search may also be giving Google signals about which brands matter.
Google will be able to lean on Zagat ratings for businesses, & other data sources (Google Wallet, Google Offers, etc.) will provide additional signals to Google.
Any type of non-search distribution you have (RSS subscribers, email newsletters, mobile applications, physical stores, membership loyalty programs, etc.) makes it easier to influence search engines.
If advertising with Google had a negative impact on search relevancy, you can be sure that the relevancy algorithms would change. Whereas if there is a convenient positive spillover then Google won't complain. In fact, they will even go out of their way to advertise that spillover. Any sort of advertising you do increases brand awareness, and that leads to additional incremental brand searches (and thus brand signal).
More exposure also leads to more user experiences, which in turn leads to more opportunities for people to leave signals behind (be it links, social mentions, additional brand searches, and/or repeat visits). Here is State Farm buying *irrelevant* brand signal for pennies on the Dollar.
And of course there are all sorts of corporate advocacy ads as well.
Even if those votes don't influence rank directly, they still influence user perception. What is so bad about that is that users are voting only on the content of the ad. This is basically the equivalent of cloaking.
If the landing page doesn't match the ad (free iPad, anyone?) then people are going to see their friends vouching for scams & get duped via Google.
That is worse than a press release being advertised as though it were news.
You can also be certain that some clever spammers are integrating +1 buttons in display ads on other ad networks in ways that may automatically collect user clicks & so on, or have users pay for viewing their next porn video by clicking a +1 button (much like some old school email spammers used porn viewers as manual captcha breakers).
Google does offer the ability to vote against an ad as well, but if an ad looks great up front & it's the landing page that scams you, then how exactly do you vote against it if you don't see the site until after you click the ad?
If you read any of Google's older guidelines that leaked over the years, you would see a consistent disdain toward affiliate sites. This was also reflected in official advice at search engine conferences & whatnot.
A friend of mine went to Google's campus & Google offered to "optimize" their AdWords account. As soon as the word affiliate came up it was like spoiled meat. Replacing the word "affiliate" with some other idiotic made up phrase (I think it was "regional online distributor") suddenly made everything O.K. again. Other friends had similar stories.
Note that the difference between "affiliate" and "regional online distributor" is for all intents and purposes linguistic crap, however it can be the difference between life and death for an online business.
To be fair, the ready availability of feeds to quickly generate sites means that most affiliate sites will be garbage. At some point Google gets sick of fighting the same battles over and over again. Then again, most websites are garbage & only the top x% of anything is going to be great.
It is worth noting that Google doesn't consider itself "just an unnecessary step in the sales funnel" when they insert themselves as an affiliate.
Should information empires be allowed to discriminate based on nothing more than the business model of competitors?
Spam vs Not Spam
The most recently leaked Google rater document stated
Spammers create spam pages to make money. Sometimes, they make money directly, by placing moneymaking links on the spam page. Here are two types of moneymaking links:
Pay-Per-Click (PPC) ads: Spammers get paid each time ads are clicked on their webpages. Another term for PPC ads is “sponsored links”.
Thin Affiliates: Spammers make money when a transaction is completed after the user has clicked through to the merchant’s site from their webpages
PPC ads appear on many, many webpages. Some pages with PPC ads are spam, but many pages with PPC ads are not. Pages should not be assigned a Spam flag if they are created to provide information or help to users. Pages are spam if they exist only to make money and not to help users.
Sometimes, spam pages do not have moneymaking links. These spam pages are created to change search engine rankings or even to do harm to users’ computers with sneaky downloads.
So in essence, the difference between spam & not spam is if the page is helpful to users.
The rating document takes 130 pages to clearly articulate the difference between what is spam and what is not spam.
But the core ethos in categorization is if it is original & helpful it is not spam unless it is doing something deceptive.
A Minor Exception*
Google's rater guides also arbitrarily sneaked in the "what the hell, if it is affiliate, it is spam" card:
Note: Major cosmopolitan cities are preferred targets for spammers, especially hotel affiliates. Such results should be flagged as Spam, even if they are related to the query and helpful to users. For example, a hotel affiliate page with a list of Chicago hotels may be assigned a rating Relevant, but also receive a Spam flag.
Google is directly going out of its way to attack competing business models.
Even if the site is quality - any way you slice it - they still tell raters to label it as spam if it is a hotel affiliate.
Once again it is worth pointing out that the label "affiliate" is just an arbitrary label. It could just as well be a "commissioned salesperson."
An Example Market: Books
In our forums one of our members quoted a brilliant book by Karl Polanyi from 1944 which was full of gems like "A so-called self-regulating market economy may evolve into Mafia capitalism — and a Mafia political system"
I searched for that quote & guess what ranked #1?
Google Books of course.
Google's owned & operated affiliate offering in the niche.
The stolen version hosted on Google.com ranks #1... everything else is either spam, unneeded duplication in the marketplace, or conjecture that can float up and down as they tweak the algorithms.
To say that the book publishing industry is undergoing pains would be an understatement. But maybe in some weird way Google promoting Google helps the book industry by giving it more avenues to be seen? Maybe they are trying to help out book authors?
The structure of the book industry prevents the book author from getting anything but a small slice of the book's revenues (unless the author is well known and/or they self publish). Markets being what they are, most authors live in obscurity on the long tail. To help supplement their low cut of the revenue pie, some book authors use affiliate links to link to Amazon.com as a purchasing option on their official book websites.
Recently in our forums a member created a thread about a client site being blocked from AdWords because there was an affiliate link on the page for their own book!
Google is The Biggest Online Affiliate
So the author is not allowed to advertise her own work to give you multiple buying options & highlight options which offer her additional compensation, however...
Google is free to steal the copyrighted work & promote their looted version first.
The word affiliate is arbitrarily tarnished in the same way that SEO is.
Use another label & if you do the exact same thing it is clean. Craigslist or eBay are not affiliates as they are marketplaces. Wal-Mart & Amazon.com might do drop shipping & have some affiliate promotions on their sites, but they are retailers.
These arbitrary label differences make a big difference to the stability of an online business.
Machine Learning vs a Small Business Killing Machine
Google can claim that they use artificial intelligence and machine learning and are unbiased, but their ranking systems need training sets. And if in this allegedly independent rating affiliates come up as "spam," then how can an affiliate build a sustainable business model?
I know what you are thinking: "Well, Aaron, they can stop being affiliates and move up the value chain."
The problem with that is that as an affiliate I can compare a lot of products in a condensed space, but if I accept payments for products then I likely need to have a page for each product. The issue there is that if you do not have a strong brand and you have lots of pages on your site there is a great chance that the Panda algorithm will torch your website.
"The new items on the website will mostly get to consumers through third-party sellers, which means B&N won’t have to carry the expense of inventory. The bookseller will just take a sales commission of 8% to 15% on each item."
What's worse, when brands come under review for spamming, Google says that they already ranked #1 so there is no reason to penalize them. Which is precisely why you can now buy rugs on Barnes & Noble. And it is precisely why you can find dating offers, education offers, jobs, and automotive sections on Excite.com. There is no SEO risk in brand extension for large brands that can do no wrong.
Google puts weight on domain names then suggests that domains can be a spam tool. So in a sense, if you invest in whatever Google trusts and are small you are a spammer. Whereas if you invest in whatever Google trusts and are large you deserve the benefit of the doubt & further promotion.
Google put the +1 button in display ads & claims that if you click on it you are recommending the site in the search results (in spite of your having only seen an ad & not yet the landing page! How hard is it to advertise "free money" and then offer up a landing page which says "oh, but there's a catch"?).
So if you have brand & money you can just flat out buy the "relevancy" signals. Yet if you try to create similar signals without paying Google & without owning a billion Dollar brand you are shunned & labeled as a spammer.
This subjective circular nonsense is getting a bit out of hand.
In summary, we are not SEOs and we are not affiliates.
We are a brand & we will buy retargeting AdWords ads + up our AdWords budget appropriately.
If we rebrand to remove "SEO" from the domain name can we please be added to Google's whitelist? ;)
As the co-founder of an SEO consultancy, my biggest hurdle in business is finding more staff. Clients are lining up at our door, we have no trouble there; it's finding the staff to work with them that becomes the issue. This may not sound like the worst dilemma for a business to face, especially during the current global economic decline, but the cause is a matter of great concern to me as both an SEO and a businessman.
Ayima's company structure is such that only highly skilled SEOs make it through to our interview stage and yet even then, less than 5% meet our skill requirements. This isn't me being picky, misjudging characters or sourcing bad candidates - this is a knowledge pandemic that is spreading through our industry. We've started apprenticeship programs to teach eager candidates from the ground up, but this can take several years to generate the finished article.
After looking back at our past 30 interview candidates, my opinion for the reason behind this issue may not be a popular one. I believe that celebrity SEOs, brands and blogs are feeding a generation of untested and poorly trained search marketers, who pass themselves off as SEO experts. I will of course explain my positioning…
The Pander Update
Some high profile SEO bloggers recently ceased client work and personal projects in order to appear impartial and trustworthy to their community. This makes sense at first; after all, who wants to use a link building tool operated by someone working for one of your client's competitors? It does however bring to light 2 much larger issues:
1) a reliance on tertiary information for SEO analysis, and
2) a reliance on search engineers to provide fresh and exclusive information/data.
Some SEO information sites may argue that they have access to the Web Analytics accounts of their partners and that they do study index changes, but nothing replaces the value of following a handful of websites every single day of the year. An absence of "boots on the ground" leads to misinformation and a distancing from the SEO practices and concerns that really matter. This in turn results in an information churn which newbies to the industry naturally perceive as important.
Moving away from servicing clients or running in-house/affiliate projects also causes a financial flux. Revenue no longer relies on problem solving, but on juicy search engine spoilers and interviews. Search engines are businesses too, though, and it's in their best interest to only reward and recommend the publishers/communities that toe their line. A once edgy and eager SEO consultancy must therefore transition into a best-practice, almost vanilla, publisher in order to pander to the whims of over-eager search reps.
How do we expect the next generation of SEO consultants to analyse a website and its industry competitors, when all they've read about is how evil paid links are and how to tweak Google Analytics?
I could directly link the viewpoints and understandings of some recent SEO candidates back to a single SEO community, word for word. They would be horrified to see the kind of broken and malformed SEOs that their community has produced.
OMG, Check Out My Klout
It's true that social media metrics will become important factors for SEO in the future, but this certainly does not negate the need for a solid technical understanding of SEO. Getting 50 retweets and 20 +1's for a cute cat viral is the work of a 12-year-old schoolgirl, not an SEO. If you can't understand the HTML mark-up of a page and how on-page elements influence a search engine, pick up an HTML/SEO book from 2001 and get reading. If you don't know how to optimise site structure and internal linking, read a book on how the web works or even a "UNIX for Dummies" manual. If you're unable to completely map out a competitor website's linking practices, placement and sources, set up a test site and start finding out how people buy/sell/barter/blag/bait for links.
You may be thinking at this point, "Rob, I already know this - why are you telling me?". Well, the sad fact is that many SEOs, with several years of experience at major and minor agencies, fail to show any understanding of these basic SEO building blocks. There are SEOs who can't identify the H1 on a page and who seriously consider "Wordle" and "Link Diagnosis" to be business-class SEO tools. It used to be the case that candidates would read Aaron Wall's SEO Book or Dan Thies' big fat Search Engine Marketing Kit from cover to cover before even contemplating applying for an entry-level SEO role. These days, major agencies are hiring people who simply say that "Content is King" and "Paid Links are Evil" (though they have at least 50 Twitter followers, of course).
"Certified SEO" is NOT the answer
In most other professional industries, the answer would be simple - regulate and certify. This simply does not work for SEO though. I die a little each time I see a "Certified SEO" proclamation on a résumé, with the examining board consisting of a dusty old SEO company, an online questionnaire or a snake-oil salesman. A complete SEO knowledge base cannot be taught or controlled by a single company or organisation. No one in their right mind would use Google's guide to SEO as their only source of knowledge, for instance, just as no self-respecting Paid Search padawan would allow Google to set up their PPC campaigns. Google's only interest is Google, not you. Popular SEO communities and training providers have their own agendas and opinions too.
I do however concede that some learning should be standardised, such as scientifically proven or verified ranking factors. Just the facts, no opinions, persuasions or ethical stances.
My Plea To You, The Industry
I plead with you, my fellow SEOs, to help fix this mess that we're in. Mentor young marketers, but let them make up their own minds. Put pressure on SEO communities to concentrate on facts/data and not to be scared off by controversy or those with hidden agendas. Promote apprenticeship schemes in your company, so that SEOs learn on the job and not via a website. Encourage people to test ideas, rather than blindly believing the SEO teachings of industry celebs and strangers.
An experienced SEO with what I perceive to be basic skills isn't too much to ask for, is it?
Recently we had the pleasure of interviewing one of my favorite link building experts, Melanie Nathan. Melanie has been involved in online marketing since 2003 and is a wonderful writer on all things link building in addition to being a well-respected link builder by her peers.
Melanie runs CanadianSEO, an internet marketing company based in Canada. You can check out some of her posts from the web here, follow her on Twitter here, and follow her on Google Plus here.
We hope you enjoy the interview!
So I see you started your career by running a successful e-commerce store, which you then sold off to a US company and then you moved into the client side of things. When did this all start and how did you decide to get into online, e-commerce stuff?
The e-commerce stuff started in 2003. My husband and I were operating a successful brick-and-mortar auto repair/aftermarket accessory store in Edmonton, where my husband’s dad (a skilled mechanic) would fix the vehicles and we would bling them up with cool accessories like euro tail lights and HID lighting kits. When we found out that our main manufacturer would be willing to drop ship their products directly to our North American customers, starting an online store seemed like a no-brainer.
I fell in love with SEO shortly after that, mostly through experimentation with various e-commerce shopping carts and my frustration at not being able to find a decent one (at the time).
Some SEOs love the idea of running their own sites rather than working on client sites, based on the difference in the ratio of profits to labor on your own sites versus client sites (relatively speaking). Some SEOs like doing both to help diversify their income streams, and some like pure client work. What led you to decide to get into the client side of things?
I’m happy working for clients because I have a genuine interest in helping people and it’s extremely gratifying being able to impact someone’s life in such a way. On top of that, the work is constantly changing and I can pick and choose my projects, so it never gets boring.
If there’s a downside, it’s that I don’t get many opportunities to experiment with different techniques or work on personal projects. This is why I’ve been slowly making time for the leap into the ‘other’ side of SEO (tool creation, affiliate marketing and yes, even some BHT) with some domains I own.
I figure, if I’m offering professional services, it’s best to be as experienced as possible in order to best serve my clients. If this leads to me eventually moving away from the client side of SEO though, then I might be open to the possibility.
If you’re interested in co-developing a link building tool or an affiliate site, ping me and we’ll talk ;)
You're well-known as a link building expert and you've written extensively on the subject. Can you walk us through how you approach/plan out a new client's plan (generally speaking) and talk about which tools you use and why?
Site owners mainly hire me in order to see measurable movement in the SERPs for their top keyphrases. This means, to help my clients stand out (where Google is concerned), I first need to see what they’re up against. I therefore always start with competitive research.
Among the tools I use are SEOmoz Open Site Explorer & the Competitive Link Research Tool. I’ve also been using the SEOProfiler Competitive Backlink Intelligence tool lately. I also use Yahoo Site Explorer (I’ll sure miss this when it’s gone!) and, of course, Google itself.
I look for such things as rankings of the site, number of root domains linking, quality of backlinks, backlink velocity and social media mentions. Once I chart out what each competitor’s link profile looks like, what I need to do in order to differentiate my client becomes pretty apparent.
After that, it’s all about looking for prospects and then developing realistic ways to acquire links from them.
I read, and actually have Evernoted (is that the new word for bookmarking?), your Search Engine Journal post on "6 Super Tips For Creating a Natural Link Profile," and some of the things you talk about there (back in 2010) might have helped sites weather parts of this latest Panda parade of updates.
Those tips are logical, solid, but require a good amount of work. Do you find that link building failures are a result of trying to look for shortcuts too often or just not being willing to really put a lot of natural effort into link building?
Thank you for Evernoting (love this) and mentioning that post.
In my experience, the majority of link building failures happen simply because the linkee was too busy thinking about THEIR needs rather than the needs of the linker. They also take shortcuts that often decrease their own chances, such as sending bad email pitches, using generic email subject lines, and/or using poor grammar.
Link building offers awesome rewards, but it can be an incredible amount of effort. If you’re unwilling or unable to put in that effort, I guarantee you’ll be disappointed with the results.
Of course, in some areas these kinds of natural links can be harder (sometimes much harder) to come by. Do you think enough link building opportunities exist in each market irrespective of the competition (big brands, strong sites, etc.)?
Or, is it more of a budget issue on the client side when it comes to being unable to compete for really competitive stuff?
I’m always up for a challenge and I have yet to encounter a niche or market where links weren’t readily obtainable. Unfortunately, sometimes the techniques required to attract those links, just don’t fit within the client’s budget. In these cases, I recommend starting out small and, as the client sees more and more ROI, they’re happy to increase their budget. After all, some link building is better than no link building.
As far as eventually competing on a large scale, I’ll just say that most people grossly underestimate the power that high-quality links can have.
What are the key points you look for when identifying link opportunities? Do you consider pure link value to rankings and/or consider links that might be no-follow if they have the potential to bring targeted traffic to the site?
The main thing I focus on when selecting link prospects is relevance. The link absolutely has to make sense or I won’t waste my client’s time on it.
After that, I look at the overall quality (How many links are on the page? Is there any PR? Does it rank for anything?) and, to save a bit of time, I like to run it through the Raven Quality Analyzer (which tells me the number of backlinks, indexed pages, age of the domain, etc.). I do all of this in order to determine how much Google trusts the site and the likelihood of a link from the site directly affecting my client’s rankings.
As for nofollow links, let’s face it, clients don’t pay me to get them links that aren’t heavy hitting so I generally don’t pursue them (unless there’s a specific reason for doing so such as trying to help a paid link profile appear more natural). I don’t build links in humongous quantities though, so it all evens out.
If you’re building links for your own site though, I would never recommend turning down a link that makes sense…. even if it was nofollow.
As a provider of services, I see that you also offer a full suite of services. Has that evolved over the years from being mostly a link building company to now being a full service company?
Do you find this differentiates you from other providers and is that well-rounded approach one you'd recommend for someone starting a link building company today?
CanadianSEO has always offered a full line of SEO services; however, over the years I’ve learned from experience that it’s the LINKS that get you where you need to be in Google, which is why I’ve made link building my main focus. I now look at web design/site optimization and content creation as necessary steps in making sure your link efforts will have the desired effect.
Not sure if this sets me apart, but my clients are happy, therefore I would probably recommend this approach to anyone running an SEO company. You absolutely have to be capable of attracting/acquiring/sourcing valuable links though, and this is something that apparently not every SEO is willing (or able) to do.
So let's say you are advising me on how to become a better link builder or a better manager of link building teams. What would your top 3 points be and what are maybe the top 3 myths or over-hyped points I should avoid?
Become a better link builder/manager by a) developing a system for tracking progress b) learning how to be persuasive to get what you want and c) never sacrificing quality in order to meet a deadline or fill a quota.
As far as myths, it may surprise many people to learn that both paid and reciprocal links are still effective as part of an overall link building strategy. I’m always trying to emphasize that Google doesn’t know as much about your links as you think it does. Especially when it comes to how your links are obtained. Yes, they do watch for certain obvious things (rate of links acquired, unnatural use of anchor text etc) but it’s totally ok to be creative. In fact it’s best. As long as you’re being logical, you’ll get the results you’re looking for.
Other than that, I still roll my eyes at people who say PageRank doesn’t matter when it comes to links. Hi, um, have you heard that Google still uses PR as a metric of quality? I’d like to offer those same peeps a link from a relevant PR0 or a link from a relevant PR7 and see which one they jump at.
Not that PR should be the ONLY metric you use when determining the value of a link prospect, but if you’re interested in making any impact on your rankings, it should definitely be taken into consideration.
For tracking link building efforts and for tracking the links you secure, do you use tools for that (like Raven or Buzzstream) or do you do that internally?
I still do it all internally/manually via custom Excel reports. Guess I’m still old-school in that regard.
A typical link report includes such data as link URL, link anchor text, Google cache date, Raven Quality Score, relevancy info, link type, PR and link status. It has everything my clients need in order to see the progress of their link campaigns and it’s also great for keeping me organized as I’m often building links for many sites at once.
Please tell us what you think are going to be the most important aspects of link building going forward in this age of rapid algorithm changes and social signals?
Many people assume that link development is decreasing in importance, but this is far from the case. Links are still the simplest way for search engine spiders to judge the reliability of a webpage. However, the way that search engines view links is changing.
I’ve definitely seen (what I consider to be) evidence that Google is using social media mentions as a measure of quality. In an age where Facebook ‘likes’, Tweets and Google +1’s can be readily bought and sold though, one has to wonder about the longevity of such a system.
I almost feel sorry for Google in that no matter what they try to use as a measure of quality, there will always be ways to game it. I think this is precisely why they’re trying to move away from organic SERPs by diversifying them so much. It’s an imperfect system and I seriously don’t envy the position they’ve put themselves in.
As always, those that can keep up and adapt, will ultimately have the most success.
SEM Rush has long been one of my favorite SEO tools. We wrote a review of SEM Rush years ago. They were best of breed back then & they have only added more features since, including competitive research data for many local versions of Google outside of the core US results: UK, Russia, Germany, France, Spain, Italy, Brazil.
Recently they let me know that they started offering a free 2-week trial to new users.
For full disclosure, SEM Rush has been an SEO Book partner for years, as we have licensed their API to use in our competitive research tool. They also have an affiliate program & we are paid if you become a paying customer, however we do not get paid for recommending their free trial & their free trial doesn't even require giving them a credit card, so it literally is a no-risk free trial.
What is SEM Rush?
SEM Rush is a competitive research tool which helps you spy on how competing sites are performing in search. The big value add that SEM Rush has over a tool like Compete.com is that SEM Rush offers CPC estimates (from Google's Traffic Estimator tool) & estimated traffic volumes (from the Google AdWords keyword tool) near each keyword. Thus, rather than showing the traffic distribution to each site, this tool can list keyword value distribution for the sites (keyword value * estimated traffic).
As Google has started blocking showing some referral data the value of using these 3rd party tools has increased.
Using these estimates generally does not provide overall traffic totals that are as accurate as Compete.com's data licensing strategy, but if you own a site and know what it earns, you can set up a ratio to normalize the differences (at least to some extent, within the same vertical, for sites of similar size, using a similar business model).
One of our sites that earns about $5,000 a month shows a Google traffic value of close to $20,000 a month:
$5,000 / $20,000 = 1/4 = 0.25
A similar site in the same vertical shows a Google traffic value of $10,000, so its earnings can be estimated as:
$10,000 * 0.25 = $2,500
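The normalization above can be sketched as a couple of tiny helpers. This is just an illustration of the arithmetic using the hypothetical numbers from the example; the function names are my own, and the approach only holds (roughly) within the same vertical, for sites of similar size and business model:

```python
def earnings_ratio(actual_monthly_earnings, estimated_traffic_value):
    """Ratio of what your own site actually earns to its SEM Rush-style
    estimated traffic value. You need a site you own to calibrate this."""
    return actual_monthly_earnings / estimated_traffic_value

def estimate_competitor_earnings(ratio, competitor_traffic_value):
    """Apply your calibration ratio to a similar site's traffic value."""
    return ratio * competitor_traffic_value

# Our site earns ~$5,000/month but shows a ~$20,000/month traffic value.
ratio = earnings_ratio(5000, 20000)   # 0.25

# A similar site in the same vertical shows a $10,000 traffic value.
print(estimate_competitor_earnings(ratio, 10000))  # 2500.0
```

The same caveat from the text applies: the ratio drifts quickly once the sites differ in monetization model, so treat the output as a rough order-of-magnitude estimate rather than a precise figure.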
A couple of big advantages SEM Rush has over Compete.com and services like QuantCast:
they focus exclusively on estimating search traffic
you get click volume estimates and click value estimates right next to each other
they help you spot valuable up-and-coming keywords where you might not yet get much traffic because you rank on page 2 or 3
Disclaimers With Normalizing Data
It is hard to monetize traffic as well as Google does, so in virtually every competitive market your profit per visitor (after expenses) will generally be less than Google's. Some reasons why:
In some markets people are losing money to buy marketshare, while in other markets people may overbid just to block out competition.
Some merchants simply have fatter profit margins and can afford to outbid affiliates.
It is hard to integrate advertising in your site anywhere near as aggressively as Google does while still creating a site that will be able to gather enough links (and other signals of quality) to take a #1 organic ranking in competitive markets...so by default there will typically be some amount of slippage.
A site that offers editorial content wrapped in light ads will not convert eyeballs into cash anywhere near as well as a lead generation oriented affiliate site would.
SEM Rush Features
Keyword Values & Volumes
As mentioned above, this data is scraped from the Google Traffic Estimator and the Google Keyword Tool. More recently Google combined their search-based keyword tool features into their regular keyword tool & this data has become much harder to scrape (unless you are already sitting on a lot of it like SEM Rush is).
Top Search Traffic Domains
A list of the top 100 domain names that are estimated to be the highest value downstream traffic sources from Google.
You could get a similar list from Compete.com's Referral Analytics by running a downstream report on Google.com, although I think that might also include traffic from some of Google's non-search properties like Reader. Since SEM Rush looks at both traffic volume and traffic value it gives you a better idea of the potential profits in any market than looking at raw traffic stats alone would.
Here is a list of sites that rank for many of the same keywords that SEO Book ranks for
Most competitors are quite obvious, however sometimes they will highlight competitors that you didn't realize, and in some cases those competitors are also working in other fertile keyword themes that you may have missed.
Here is a list of a few words where SEO Book and SEOmoz compete in the rankings
These sorts of charts are great for trying to show clients how site x performs against site y in order to help allocate more resources.
Compare AdWords to Organic Search
These are sites that rank for keywords that SEO Book is buying through AdWords
And these are sites that buy AdWords ads for keywords that this site ranks for
Before SEM Rush came out there were not many (or perhaps any?) tools that made it easy to compare AdWords against organic search.
Start Your Free Trial Today
SEM Rush Pro costs $79 per month (or $69 if you sign up for recurring billing), so this free trial is worth about $35 to $40.
Take advantage of SEMRush's free 2-week trial today.
If you have any questions about getting the most out of SEM Rush feel free to ask in the comments below. We have used their service for years & can answer just about any question you may have & offer a wide variety of tips to help you get the most out of this powerful tool.
So today Google announced that they have turned on SSL by default for logged in users, a feature that has been available for a while on encrypted.google.com. The way they set it up, as explained in this post, means that your search query will not be forwarded to the website you're visiting and that they can only see that you've come from an organic Google result. If you're buying AdWords however, you still get the query data.
This is what I call hypocrisy at work. Google cares about your privacy, unless they make money on you, then they don't. The fact is that due to this change, AdWords gets favored over organic results. Once again, Google gets to claim that it cares about your privacy and pulls a major public "stunt". The issue is, they don't care about your privacy enough to not give that data to their advertisers.
That might also enlighten you to the real issue: Google still has all your search data. It's just not allowing website owners to see it anymore. It's giving website owners aggregated data through Google Webmaster Tools, which would be nice if it hadn't been shown to be so incredibly useless and inaccurate.
If Google really cared about your privacy, (delayed) retargeting wouldn't be available for advertisers. They wouldn't use your query data to serve you AdSense ads on pages, but I doubt they'll stop doing that; if they did, they would have probably said so and made a big fuss out of it.
If Google really cared, the keyword data that site owners now no longer receive from organic queries would no longer be available for advertisers either. But that would hit their bottom line, because it makes it harder to show ROI from AdWords, so they won't do that.
The Real Reason for killing organic referral data
So I think "privacy" is a mere pretext, a "convenient" side effect that's used for PR. The real reason that Google might have decided to stop sending referral data is different. I think it is that its competitors in the online advertising space, like Chitika and Chango, are using search referral data to refine their (retargeted) ads and they're getting some astonishing results. In some ways, you could therefore describe this as mostly an anti-competitive move.
In my eyes, there's only one way out. We've now determined that your search data is private information. If Google truly believes that, it will stop sharing it with everyone, including their advertisers. Not sharing vital data like that with third parties but using it solely for your own profit is evil and anti-competitive. In a country such as the Netherlands, where I live and where Google has a 99%+ market share (in other words, a monopoly), I'm hoping that'll result in a bit of action from the European Union.
Joost is a freelance SEO consultant and WordPress developer. He blogs on yoast.com about both topics and maintains some of the most popular WordPress plugins for SEO and Google Analytics in existence.
Google would spin Performics out of DoubleClick, and sell it to holding firm Publicis. Only one major force inside of Google hated the plan. Guess who? Larry Page.
According to our source, Larry tried to sell the rest of Google's executive team on keeping Performics. "He wanted to see how those things work. He wanted to experiment."
A search engine selling SEO services? Yep.
And now they are aggressively entering the make money online niche. Both Prizes.org & YouTube are in the top 3 ad slots for "make money online"
And I am seeing some of those across portions of the content/display network as well. I just saw this in Gmail today.
How does this align with the Google AdWords TOS?
To protect the value and diversity of the ads running on Google, we don't generally permit advertisers to manage multiple accounts featuring the same business or keywords except in certain limited exceptions. Furthermore, Google doesn't permit multiple ads from the same or an affiliated company or person to appear on the same results page. We've found that pages with multiple text ads from the same company provide less relevant results and a lower quality experience for users. Over time, multiple ads from the same source also reduce overall advertiser performance and lower their return on investment.
Google doesn't allow advertisers or affiliates to have any of the following:
Ads across multiple accounts for the same or similar businesses
Ads across multiple accounts triggered by the same or similar keywords
Google has recently begun refining search queries far more aggressively. In the past they would refine search queries if they thought there was a misspelling, but new refinements have taken to changing keywords that are spelled correctly to align them with more common (and thus profitable) keywords.
disclosure: in the past refinement disclosures appeared at the top of the search results, but now they often end up at the bottom
awful errors: a couple months after I was born my wife was born in Manila. When I was doing some searches about visiting & things like that, sometimes Google would take the word "Manila" out of the search query. (My guess is because the word "Manila" is also a type of envelope?)
Here is an example of an "awful error" in action. Let's say while traveling you find a great gift & want to send it to extended family. Search for [shipping from las vegas to manila] and you get the following
The search results contain irrelevant garbage like an Urban Spoon page for Las Vegas delivery restaurants.
How USELESS is that?
And now, with disclosure of changes at the bottom of the search results, there isn't even a strong & clean signal to let end users tell Google "hey you are screwing this up badly."
In some ways I am inspired by Google's willingness to test and tweak, but in others I wonder if their new model for search is to care less about testing and hope that SEOs will highlight where Google is punting it bad. In that case, they just roped me into offering free advice. ;)
Link Assistant offers SEOs a suite of tools, under an umbrella aptly named SEO Power Suite, which covers many aspects of an SEO campaign.
Link Assistant provides the following tools inside of their Power Suite:
Rank Tracker - rank tracking software
WebSite Auditor - on-page optimization tool
SEO Spy Glass - competitive link research tool
Link Assistant - their flagship link prospecting, management, and tracking tool
We'll be reviewing their popular Rank Tracking tool in this post. I've used their tools for a while now and have no issue in recommending them. They also claim to have the following companies as clients:
Rank Tracker is one of the more robust, fast, and reliable rank checking tools out there.
Is Rank Tracker a Worthy Investment?
Rank Tracker offers a few different pricing options:
All of the editions have the following features:
Customizable reports (you can only save and print with the Enterprise level however, which is kind of a drawback in my opinion; Pro accounts should have this functionality)
Human search emulation built in
User agent rotation
Google analytics integration
Multiple language support (English, German, Russian, French, Dutch, Spanish, Slovak)
Runs on Windows, Mac, Linux
All editions offer full access to their keyword research features; the only difference is that the free edition doesn't allow KEI updates.
Rank Tracker Feature Set
Rank Tracker offers a keyword research tool and a rank checking component within the application. A more thorough breakdown of the feature set is as follows:
I prefer to do my keyword research outside of tools like this. Generally specific tools seem to excel at their chosen discipline, in this case rank checking, but fall kind of short in areas they try to add-on. I like to use a variety of tools when doing keyword research and it's easier for me, personally, to create and merge various spreadsheets and various data points rather than doing research inside of an application.
However, Rank Tracker does offer a one-stop shop for cumbersome research options like various search suggest methods and unique offerings like estimated traffic based on ranking #1 for that specified term.
Overall, a nice set of keyword research features if you want to add on to the research you've already done.
Rank Tracker also gives you the option to factor in data from Google Trends as well as through Google Analytics (see current ranking for each keyword and actual traffic).
As this is the core tool, it's really no surprise that this part of Rank Tracker shines. Some of the interesting options here are in the ability to track multiple Google search areas like images, videos, and places.
In addition to the interesting features I mentioned above, Rank Tracker also includes a wide array of charting and design options to help you work with your data more directly and in a clearer way:
Usability is Top Notch
While the interfaces aren't the prettiest, this is one of the most user-friendly rank tracking tools that I've come across.
First you simply enter the URL you wish to track. Rank Tracker will automatically find the page AND sub-domain on the domain ranking for the keywords chosen, so you don't have to enter these separately.
You enter the site you want to check (remember, subpages and subdomains are automatically included)
Choose from a whole host of engines and select universal search if you wish to factor in places taken up by Google insertions into the SERPS:
Enter your keywords:
Let Rank Tracker go to work: (you can choose to display the running tasks as line views or tree views, a minor visual preference)
That's all there is to it. It is extremely easy to get a project up and running inside of this tool.
Working with Rank Tracker
Inside of Rank Tracker the data is displayed clearly, in an easy to understand format:
In the top part you'll get to see:
the keywords you selected
current rank compared to last rank
overall visibility (top rankings) in each search engine selected
custom tags you might choose to tag your keywords with for tracking purposes
On the bottom chart you'll see three options for the selected search engine (bottom) and keyword (top):
ranking information for each search engine for the selected keyword
historical records (last check date and position)
progress graph (visual representation of rankings, customizable with sliders as shown in the picture)
The ranking chart shows the chart for the chosen keyword and search engine:
Within the ranking results page, you can select from these options to get a broader view of how your site is performing on the whole:
Customizing Rank Tracker
Inside of Rank Tracker's preferences you'll see the following options, most of which are self-explanatory:
This is where you can take advantage of some of their cooler features like:
adding competitors to track
adding in your Google Analytics account
customizing your reporting templates
changing up human emulation settings
adding in a captcha service
adding in multiple proxies to help with the speed of the tool as well as to prevent blocks
You can track up to 5 competitors per Rank Tracker profile (meaning, 5 competitors per one of your sites).
Key Configuration Options
Rank Tracker has a ton of options, as you can see from the screenshot above. Some of the more important ones to pay attention to are the reporting options.
You'll want to set up your company information as shown here: (this is what will show on your reports)
On a per-profile basis you can customize client-specific details like so:
You can create new and modify existing templates for multiple report types here as well:
Emulation settings are important; you want to make sure your requests look as normal and human as possible. It makes sense to check off the "visit search engine home page" option to help it appear more natural, in addition to having delays between queries (again, to promote a natural approach to checking rankings/searching).
One thing that irks me about Rank Tracker is that they have emulation turned off by default. If you don't adjust your settings and you try and run a moderately sized report you'll get a Google automated ban in short order, so be careful!
In addition to emulation, search approach is also worthy of a bit of tinkering as well. Given how often Google inserts things like images, products, and videos into search results you might want to consider using universal search when checking rankings.
Also, the result depth is important. Going deep here can help identify sites that have been torched rather than sites that simply fell outside the top 20 or 50. A depth of 100 is a good default baseline.
Successive search gives you a more accurate view as it manually goes page by page rather than grabbing 100 results at a time (double listings, for example, can throw off the count when not using successive search).
Finally, another important option is scheduling. You can schedule emails, FTP uploads, and so on (as well as rank checks) from this options panel. Your machine does have to be on for this to work (not in sleep mode for instance). In my experience Rank Tracker has been pretty solid on this front, with respect to executing the tasks you tell it to execute (consistently).
Software versus Cloud
There are some strong, cloud-based competitors to Rank Tracker. Our Rank Checker is a great solution for quick checks and for ongoing tracking if you do not need graphical charts and such (though you can easily make those in Excel if you need to).
Competitors and Options
Raven offers rank tracking as a part of their package and there are other cloud based services like Authority Labs (who actually power Raven's tools) you can look into if you want to avoid using software tools for rank checking.
There are some drawbacks to cloud-based rank tracking though. Some of them do not have granular date-based comparisons as they typically run on the provider's schedule rather than yours.
Also, most cloud rank checking solutions limit how many keywords you can track. So if you are doing enterprise-level rank checking it makes sense to use a software tool plus a proxy service like Trusted Proxies.
Pricing and Final Thoughts
Rank Tracker offers a generous discount if you grab all their tools in one bundle. If you want to customize, schedule, and print reports you'll need the enterprise edition.
I think requiring the purchase of your top tier for the basic functionality of printing reports is a mistake. I can see having that limitation on the free edition, but if you pay you should get access to reports.
You can find their bundle prices here and Rank Tracker's specific pricing here. Also, similar to competitors, they have an ongoing service plan which is required if you plan to continue to receive updates after the initial 6 months.
Despite my pricing concern regarding the reporting options, I think this is one of the top rank checkers out there. It has a ton of features and is very simple to use. I would recommend that you give this tool a shot if you are in the market for a robust rank checking solution. Oh I almost forgot, rank checking is still useful :)
One More Note of Caution
Be sure to read the below complaints about how unclear & sneaky the maintenance plan pricing is. This is something they should fix ASAP.
It doesn't matter what "signals" Google chooses to use when Google also gets to score themselves however they like. And even if Google were not trying to bias the promotion of their own content then any signals they do collect on Google properties will be over-represented by regular Google users.
Google can put out something fairly average, promote it, then iterate to improve it as they collect end user data. Publishers as big as MotorTrend can't have that business model though. And smaller publishers simply get effectively removed from the web when something like Panda or a hand penalty hits them. Worse yet, upon "review" search engineers may choose to review an older version of the site rather than the current site!
With that level of uncertainty, how do you aggressively invest in improving your website?
Over a half-year after Panda launched there are few case studies of recoveries & worse yet, some of the few sites that recovered just relapsed!
If you look at search using a pragmatic & holistic view, then this year the only thing that really changed with "content" farms is you can now insert the word video for content & almost all that video is hosted on Youtube.
To highlight the absurdity, I created another XtraNormal video. :)
Compete.com's Google downstream search traffic stats are available with a premium membership to their site, & they do a good job of showing the actual traffic impact of the aggregate algorithmic changes. YouTube's growth is also well reflected in numbers from firms like SearchMetrics