SEM Rush has long been one of my favorite SEO tools. We wrote a review of SEM Rush years ago. They were best of breed back then & they have only added more features since, including competitive research data for Bing and for many local versions of Google outside of the core US results: Argentina, Australia, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, Hungary, Japan, Hong Kong, India, Ireland, Israel, Italy, Mexico, Netherlands, Norway, Poland, Russia, Singapore, Spain, Sweden, Switzerland, Turkey, United Kingdom.
Recently they let me know that they started offering a free 2-week trial to new users.
Set up a free account on their website & enter the promotional code located in the image to the right.
For full disclosure, SEM Rush has been an SEO Book partner for years, as we have licensed their API to use in our competitive research tool. They also have an affiliate program & we are paid if you become a paying customer; however, we do not get paid for recommending their free trial, and the trial doesn't even require giving them a credit card, so it is literally a no-risk free trial. In fact, here is a search box you can use to instantly view a sampling of their data.
Quick Review
Competitive research tools can help you find a baseline for what to do & where to enter a market. Before spending a dime on SEO (or even buying a domain name for a project), it is always worth putting in the time to get a quick lay of the land & learn from your existing competitors.
Seeing which keywords are most valuable can help you figure out which areas to invest the most in.
Seeing where existing competitors are strong can help you find strategies worth emulating. While researching their performance, it may help you find new pockets of opportunities & keyword themes which didn't show up in your initial keyword research.
Seeing where competitors are weak can help you build a strategy to differentiate your approach.
Enter a competing URL in the above search box & you will quickly see where your competitors are succeeding, where they are failing & get insights on how to beat them. SEMrush offers:
granular data across the global Bing & Google databases, along with over 2-dozen regional localized country-specific Google databases (Argentina, Australia, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, Hungary, Japan, Hong Kong, India, Ireland, Israel, Italy, Mexico, Netherlands, Norway, Poland, Russia, Singapore, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States)
search volume & ad bid price estimates by keyword (which, when combined, function as an estimate of keyword value) for over 120,000,000 words
keyword data by site or by page across 74,000,000 domain names
the ability to look up related keywords
the ability to directly compare domains against one another to see relative strength
the ability to compare organic search results versus paid search ads to leverage data from one source into the other channel
the ability to look up sites which have a similar ranking footprint as an existing competitor to uncover new areas & opportunities
historical performance data, which can be helpful in determining if the site has had manual penalties or algorithmic ranking filters applied against it
a broad array of new features like tracking video ads, display ads, PLAs, backlinks, etc.
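The "keyword value" estimate mentioned in the list above is essentially search volume times ad bid price. Here is a minimal sketch of the idea; the keywords and numbers below are made up for illustration, and in practice the real figures would come from SEM Rush's data:

```python
# Hypothetical (monthly searches, estimated CPC) pairs for illustration only.
keywords = {
    "seo tools": (27_000, 4.50),
    "rank checker": (9_900, 2.10),
}

# keyword value ~= search volume * ad bid price
values = {kw: vol * cpc for kw, (vol, cpc) in keywords.items()}

# List the most valuable keywords first.
for kw, value in sorted(values.items(), key=lambda kv: -kv[1]):
    print(f"{kw}: ~${value:,.0f}/month")
```

Sorting by this combined value, rather than by raw volume alone, is what surfaces the areas worth investing in most.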
Longer, In-Depth Review
What is SEM Rush?
SEM Rush is a competitive research tool which helps you spy on how competing sites are performing in search. The big value add that SEM Rush has over a tool like Compete.com is that SEM Rush offers CPC estimates (from Google's Traffic Estimator tool) & estimated traffic volumes (from the Google AdWords keyword tool) near each keyword. Thus, rather than showing the traffic distribution to each site, this tool can list keyword value distribution for the sites (keyword value * estimated traffic).
As Google has started blocking some referral data from reaching site owners, the value of using these 3rd party tools has increased.
Normalizing Data
Using these estimates generally does not provide overall traffic totals that are as accurate as Compete.com's data licensing strategy, but if you own a site and know what it earns, you can set up a ratio to normalize the differences (at least to some extent, within the same vertical, for sites of similar size, using a similar business model).
One of our sites that earns about $5,000 a month shows a Google traffic value of close to $20,000 a month.
5,000/20,000 = 1/4 = 0.25
A similar site in the same vertical shows a traffic value of $10,000 a month, so its estimated earnings would be:
$10,000 * 0.25 = $2,500
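That normalization can be sketched as a quick calculation. The dollar figures are the article's own examples; the helper function and its name are ours:

```python
def normalize_estimate(known_earnings, known_estimate, competitor_estimate):
    """Scale a competitor's estimated Google traffic value by the ratio
    observed on a site whose real monthly earnings are known."""
    ratio = known_earnings / known_estimate  # e.g. 5,000 / 20,000 = 0.25
    return competitor_estimate * ratio

# Our site earns ~$5,000/month against a ~$20,000/month traffic value;
# a similar site in the same vertical shows a $10,000/month traffic value.
print(normalize_estimate(5_000, 20_000, 10_000))  # 2500.0
```

As the disclaimers below note, this only holds to a rough extent, for sites of similar size in the same vertical using a similar business model.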
A couple of big advantages SEM Rush has over Compete.com and services like QuantCast are that:
they focus exclusively on estimating search traffic
you get click volume estimates and click value estimates right next to each other
they help you spot valuable up-and-coming keywords where you might not yet get much traffic because you rank on page 2 or 3
Disclaimers With Normalizing Data
It is hard to monetize traffic as well as Google does, so in virtually every competitive market your profit per visitor (after expenses) will generally be less than Google's. Some reasons why:
In some markets people are losing money to buy marketshare, while in other markets people may overbid just to block out competition.
Some merchants simply have fatter profit margins and can afford to outbid affiliates.
It is hard to integrate advertising in your site anywhere near as aggressively as Google does while still creating a site that will be able to gather enough links (and other signals of quality) to take a #1 organic ranking in competitive markets...so by default there will typically be some amount of slippage.
A site that offers editorial content wrapped in light ads will not convert eyeballs into cash anywhere near as well as a lead generation oriented affiliate site would.
SEM Rush Features
Keyword Values & Volumes
As mentioned above, this data is scraped from the Google Traffic Estimator and the Google Keyword Tool. More recently Google combined their search-based keyword tool features into their regular keyword tool & this data has become much harder to scrape (unless you are already sitting on a lot of it like SEM Rush is).
Top Search Traffic Domains
A list of the top 100 domain names that are estimated to be the highest value downstream traffic sources from Google.
You could get a similar list from Compete.com's Referral Analytics by running a downstream report on Google.com, although I think that might also include traffic from some of Google's non-search properties like Reader. Since SEM Rush looks at both traffic volume and traffic value it gives you a better idea of the potential profits in any market than looking at raw traffic stats alone would.
Top Competitors
Here is a list of sites that rank for many of the same keywords that SEO Book ranks for
Most competitors are quite obvious, however sometimes they will highlight competitors you didn't realize you had, and in some cases those competitors are also working in other fertile keyword themes that you may have missed.
Overlapping Keywords
Here is a list of a few words where SEO Book and SEOmoz compete in the rankings
These sorts of charts are great for trying to show clients how site x performs against site y in order to help allocate more resources.
Compare AdWords to Organic Search
These are sites that rank for keywords that SEO Book is buying through AdWords
And these are sites that buy AdWords ads for keywords that this site ranks for
Before SEM Rush came out there were not many (or perhaps any?) tools that made it easy to compare AdWords against organic search.
Start Your Free Trial Today
SEM Rush Pro costs $79 per month (or $69 if you sign up recurring), so this free trial is worth about $35 to $40.
Take advantage of SEMRush's free 2-week trial today.
Set up a free account on their website & enter the promotional code in the image located to the right.
If you have any questions about getting the most out of SEM Rush feel free to ask in the comments below. We have used their service for years & can answer just about any question you may have & offer a wide variety of tips to help you get the most out of this powerful tool.
So today Google announced that they have turned on SSL by default for logged in users, a feature that has been available for a while on encrypted.google.com. The way they set it up, as explained in this post, means that your search query will not be forwarded to the website you're visiting and that they can only see that you've come from an organic Google result. If you're buying AdWords however, you still get the query data.
This is what I call hypocrisy at work. Google cares about your privacy, unless they make money on you, then they don't. The fact is that due to this change, AdWords gets favored over organic results. Once again, Google gets to claim that it cares about your privacy and pulls a major public "stunt". The issue is, they don't care about your privacy enough to not give that data to their advertisers.
That might also enlighten you to the real issue: Google still has all your search data. It's just not allowing website owners to see it anymore. It's giving website owners aggregated data through Google Webmaster Tools, which would be nice if that data hadn't been shown to be so incredibly useless and inaccurate.
If Google really cared about your privacy, (delayed) retargeting wouldn't be available for advertisers. They wouldn't use your query data to serve you AdSense ads on pages, but I doubt they'll stop doing that; if they did, they would probably have said so and made a big fuss out of it.
If Google really cared, the keyword data that site owners now no longer receive from organic queries would no longer be available for advertisers either. But that would hit their bottom line, because it makes it harder to show ROI from AdWords, so they won't do that.
The Real Reason for killing organic referral data
So I think "privacy" is a mere pretext, a "convenient" side effect that's used for PR. The real reason Google might have decided to stop sending referral data is different. I think its competitors in the online advertising space, like Chitika and Chango, are using search referral data to refine their (retargeted) ads, and they're getting some astonishing results. In some ways, you could therefore describe this as mostly an anti-competitive move.
In my eyes, there's only one way out. We've now determined that your search data is private information. If Google truly believes that, it will stop sharing it with everyone, including its advertisers. Not sharing vital data like that with third parties while using it solely for your own profit is evil and anti-competitive. In a country like the Netherlands, where I live, Google has a 99%+ market share (in other words, a monopoly), so I'm hoping this will result in a bit of action from the European Union.
---
Joost is a freelance SEO consultant and WordPress developer. He blogs on yoast.com about both topics and maintains some of the most popular WordPress plugins for SEO and Google Analytics in existence.
Google would spin Performics out of DoubleClick, and sell it to holding firm Publicis. Only one major force inside of Google hated the plan. Guess who? Larry Page.
According to our source, Larry tried to sell the rest of Google's executive team on keeping Performics. "He wanted to see how those things work. He wanted to experiment."
A search engine selling SEO services? Yep.
And now they are aggressively entering the make money online niche. Both Prizes.org & YouTube are in the top 3 ad slots for "make money online"
And I am seeing some of those across portions of the content/display network as well. I just saw this in Gmail today.
How does this align with the Google AdWords TOS?
To protect the value and diversity of the ads running on Google, we don't generally permit advertisers to manage multiple accounts featuring the same business or keywords except in certain limited exceptions. Furthermore, Google doesn't permit multiple ads from the same or an affiliated company or person to appear on the same results page. We've found that pages with multiple text ads from the same company provide less relevant results and a lower quality experience for users. Over time, multiple ads from the same source also reduce overall advertiser performance and lower their return on investment.
Google doesn't allow advertisers or affiliates to have any of the following:
Ads across multiple accounts for the same or similar businesses
Ads across multiple accounts triggered by the same or similar keywords
Google has recently begun refining search queries far more aggressively. In the past they would refine a query if they thought there was a misspelling, but the new refinements change keywords that are spelled correctly to align them with more common (and thus more profitable) keywords.
As one example, the search result for [weight loss estimator] is now highly influenced by [weight loss calculator]. The chart below compares the old weight loss estimator SERP, the current weight loss estimator SERP & the current weight loss calculator SERP. Click on the image for a larger view.
There are 2 serious issues with this change:
disclosure: in the past refinement disclosures appeared at the top of the search results, but now they often end up at the bottom
awful errors: a couple of months after I was born, my wife was born in Manila. When I was doing some searches about visiting & things like that, sometimes Google would take the word "Manila" out of the search query. (My guess is because the word "Manila" is also a type of envelope?)
Here is an example of an "awful error" in action. Let's say while traveling you find a great gift & want to send it to extended family. Search for [shipping from las vegas to manila] and you get the following
The search results contain irrelevant garbage like an Urban Spoon page for Las Vegas delivery restaurants.
How USELESS is that?
And now, with disclosure of changes at the bottom of the search results, there isn't even a strong & clean signal to let end users tell Google "hey you are screwing this up badly."
In some ways I am inspired by Google's willingness to test and tweak, but in others I wonder if their new model for search is to care less about testing and hope that SEOs will highlight where Google is punting it bad. In that case, they just roped me into offering free advice. ;)
Link Assistant offers SEOs a suite of tools, under an umbrella aptly named SEO Power Suite, which covers many aspects of an SEO campaign.
Link Assistant provides the following tools inside of their Power Suite:
Rank Tracker - rank tracking software
WebSite Auditor - on-page optimization tool
SEO Spy Glass - competitive link research tool
Link Assistant - their flagship link prospecting, management, and tracking tool
We'll be reviewing their popular Rank Tracker tool in this post. I've used their tools for a while now and have no issue in recommending them. They also claim to have the following companies as clients:
Disney
Microsoft
Audi
HP
Rank Tracker is one of the more robust, fast, and reliable rank checking tools out there.
Update: Please note that in spite of us doing free non-affiliate reviews of their software, someone spammed the crap out of our blog promoting this company's tools, which is at best uninspiring.
Is Rank Tracker a Worthy Investment?
Rank Tracker offers a few different pricing options:
Free
Pro
Enterprise
All of the editions have the following features:
Unlimited sites
Unlimited keywords
Customizable reports (you can only save and print with the Enterprise level, however, which is kind of a drawback in my opinion; Pro accounts should have this functionality)
API keys
Human search emulation built in
User agent rotation
Proxy support
Proxy rotation
Google Analytics integration
Multiple language support (English, German, Russian, French, Dutch, Spanish, Slovak)
Runs on Windows, Mac, Linux
All editions offer access to their keyword research features with all features included; the only difference is that the free edition doesn't allow KEI updates.
Rank Tracker Feature Set
Rank Tracker offers a keyword research tool and a rank checking component within the application. A more thorough breakdown of the feature set is as follows:
Keyword Research
I prefer to do my keyword research outside of tools like this. Specialized tools generally seem to excel at their chosen discipline, in this case rank checking, but fall kind of short in the areas they try to add on. I like to use a variety of tools when doing keyword research, and it's easier for me, personally, to create and merge various spreadsheets and data points than to do research inside of an application.
However, Rank Tracker does offer a one-stop shop for cumbersome research options like various search suggest methods and unique offerings like estimated traffic based on ranking #1 for that specified term.
Overall, a nice set of keyword research features if you want to add on to the research you've already done.
Rank Tracker also gives you the option to factor in data from Google Trends as well as through Google Analytics (see current ranking for each keyword and actual traffic).
Rank Checking
As this is the core of the tool, it's really no surprise that this part of Rank Tracker shines. Some of the interesting options here are the ability to track multiple Google search verticals like images, videos, and places.
In addition to the interesting features I mentioned above, Rank Tracker also includes a wide array of charting and design options to help you work with your data more directly and in a clearer way:
Usability is Top Notch
While the interfaces aren't the prettiest, this is one of the most user-friendly rank tracking tools that I've come across.
First you simply enter the URL you wish to track. Rank Tracker will automatically find the page AND sub-domain on the domain ranking for the keywords chosen, so you don't have to enter these separately.
You enter the site you want to check (remember, subpages and subdomains are automatically included)
Choose from a whole host of engines and select universal search if you wish to factor in places taken up by Google insertions into the SERPS:
Enter your keywords:
Let Rank Tracker go to work: (you can choose to display the running tasks as line views or tree views, a minor visual preference)
That's all there is to it. It is extremely easy to get a project up and running inside of this tool.
Working with Rank Tracker
Inside of Rank Tracker the data is displayed clearly, in an easy to understand format:
In the top part you'll get to see:
the keywords you selected
current rank compared to last rank
overall visibility (top rankings) in each search engine selected
custom tags you might choose to apply to your keywords for tracking purposes
On the bottom chart you'll see three options for the selected search engine (bottom) and keyword (top):
ranking information for each search engine for the selected keyword
historical records (last check date and position)
progress graph (visual representation of rankings, customizable with sliders as shown in the picture)
The ranking chart shows the chart for the chosen keyword and search engine:
Within the ranking results page, you can select from these options to get a broader view of how your site is performing on the whole:
Customizing Rank Tracker
Inside of Rank Tracker's preferences you'll see the following options, most of which are self-explanatory:
This is where you can take advantage of some of their cooler features like:
adding competitors to track
adding in your Google Analytics account
customizing your reporting templates
changing up human emulation settings
adding in a captcha service
scheduling reports
adding in multiple proxies to help with the speed of the tool as well as to prevent blocks
You can track up to 5 competitors per Rank Tracker profile (meaning, 5 competitors per one of your sites).
Key Configuration Options
Rank Tracker has a ton of options, as you can see from the screenshot above. Some of the more important ones to pay attention to begin with the reporting options.
You'll want to set up your company information as shown here: (this is what will show on your reports)
On a per-profile basis you can customize client-specific details like so:
You can create new and modify existing templates for multiple report types here as well:
Emulation settings are important, you want to make sure you are set up so your requests look as normal and human as possible. It makes sense to check off the "visit search engine home page" option to help it appear more natural in addition to having delays between queries (again, to promote a natural approach to checking rankings/searching).
One thing that irks me about Rank Tracker is that they have emulation turned off by default. If you don't adjust your settings and you try to run a moderately sized report, you'll get an automated ban from Google in short order, so be careful!
In addition to emulation, search approach is also worthy of a bit of tinkering as well. Given how often Google inserts things like images, products, and videos into search results you might want to consider using universal search when checking rankings.
Also, the result depth is important. Going deep here can help identify sites that have been torched rather than sites that simply fell outside the top 20 or 50. 100 is a good default baseline.
Successive search gives you a more accurate view, as it manually goes page by page rather than grabbing 100 results at a time (double listings, for example, can throw off the count when not using successive search).
Finally, another important option is scheduling. You can schedule emails, FTP uploads, and so on (as well as rank checks) from this options panel. Your machine does have to be on for this to work (not in sleep mode for instance). In my experience Rank Tracker has been pretty solid on this front, with respect to executing the tasks you tell it to execute (consistently).
Software versus Cloud
There are some strong, cloud-based competitors to Rank Tracker. Our Rank Checker is a great solution for quick checks and for ongoing tracking if you do not need graphical charts and such (though you can easily make those in Excel if you need to).
Competitors and Options
Raven offers rank tracking as a part of their package and there are other cloud based services like Authority Labs (who actually power Raven's tools) you can look into if you want to avoid using software tools for rank checking.
There are some drawbacks to cloud-based rank tracking though. Some of them do not have granular date-based comparisons as they typically run on the provider's schedule rather than yours.
Also, most cloud rank checking solutions limit how many keywords you can track. So if you are doing enterprise-level rank checking, it makes sense to use a software tool plus a proxy service like Trusted Proxies.
Pricing and Final Thoughts
Rank Tracker offers a generous discount if you grab all their tools in one bundle. If you want to customize, schedule, and print reports you'll need the enterprise edition.
I think requiring the purchase of your top tier for the basic functionality of printing reports is a mistake. I can see having that limitation on the free edition, but if you pay you should get access to reports.
You can find their bundle prices here and Rank Tracker's specific pricing here. Also, similar to competitors, they have an ongoing service plan which is required if you plan to continue to receive updates after the initial 6 months.
Despite my pricing concern regarding the reporting options, I think this is one of the top rank checkers out there. It has a ton of features and is very simple to use. I would recommend that you give this tool a shot if you are in the market for a robust rank checking solution. Oh I almost forgot, rank checking is still useful :)
One More Note of Caution
Be sure to read the below complaints about how unclear & sneaky the maintenance plan pricing is. This is something they should fix ASAP.
On September 28th, Google rolled out Panda 2.5. Yet again YouTube is the #1 site on the leader board, while even some branded sites like MotorTrend were clipped, and sites that had previously recovered from Panda (like Daniweb) were hit once more. In the zero-sum game of search, Google's Android.com joins YouTube on the leader board.
It doesn't matter what "signals" Google chooses to use when Google also gets to score themselves however they like. And even if Google were not trying to bias the promotion of their own content then any signals they do collect on Google properties will be over-represented by regular Google users.
Google can put out something fairly average, promote it, then iterate to improve it as they collect end user data. Publishers as big as MotorTrend can't have that business model though. And smaller publishers simply get effectively removed from the web when something like Panda or a hand penalty hits them. Worse yet, upon "review" search engineers may choose to review an older version of the site rather than the current site!
With that level of uncertainty, how do you aggressively invest in improving your website?
Over a half-year after Panda launched there are few case studies of recoveries & worse yet, some of the few sites that recovered just relapsed!
If you look at search with a pragmatic & holistic view, then this year the only thing that really changed with "content" farms is that you can now substitute the word video for content & almost all of that video is hosted on YouTube.
To highlight the absurdity, I created another XtraNormal video. :)
Compete.com's Google downstream search traffic stats are available with a premium membership to their site, & they do a good job of showing the actual traffic impact of the aggregate algorithmic changes. YouTube's growth is also well reflected in numbers from firms like SearchMetrics.
So here we are, aren't we? It's 2011, SEO is still not dead (despite a decade of claims to the contrary), but the landscape is very, very different in this post-Panda world. Most sites that have been hit by Panda (inclusive of all iterations) are still on ice some 7 months after the initial roll out.
Businesses have been destroyed, livelihoods ruined, and the future of a once thriving business is seemingly on the ropes for newcomers and seasoned veterans alike.
Seems like a good time to dial this up:
This all appears to be just fine with Google. As Eric Schmidt once said, "Brands are how you sort out the cesspool." How very elitist of you, Mr. Schmidt.
What exactly is a brand anyway, to you? Is it content factories ranking for medical queries like "How to survive a heart attack" and other assorted medical terms?
Or maybe you think an article that is in the running for queries around avoiding heart attacks, written by a guy with an English degree, is something that isn't part of a cesspool?
Matt Cutts has highlighted health issues as a great example of why selling links was "evil." What his post didn't disclose at the time was that Google had built a half-billion-dollar enterprise selling illegal drug ads!
I don't know about you, but I sure don't want to read an article on a medical topic that could have life or death implications which is written by a guy with an English degree! The point is that the lines continue to become extremely blurred and the algorithm "adjustments" continue to become more and more severe.
The combination of those two attributes must give an SEO pause when thinking about short, mid, and long term strategies for their business model. One mistake or one algorithm update (completely out of your hands) can have devastating consequences for your business.
Talk is Cheap
Now we can cue the white hats (whatever the heck that means), who will wax poetic about building "brands" the right way (whatever the heck that means) and begin to play the "I told you so" game as you struggle to survive. Keep in mind that salespeople will use your uncertainty against you, and try to calm your fears by telling you "everything is ok if you do things the right way".
Problem is, what is the "right" way and why aren't "they" doing it? There is no "right" way, rather, just all sorts of shades of gray.
Don't buy into the hype and save yourself a bit of sanity. The same people who will whip out their white hats at the first sign of algorithmic shifting are the same people who want to sell you something that, at its core, whether it's a tool or a product, is designed to give you information on how to manipulate search results (irrespective of how they frame the language).
Bottom line is that folks in the industry are confused, scared, nervous and it's easy for salespeople to prey on the scared and the informationally-poor to enhance their bottom line.
Keep this quote from Voltaire in mind when you are searching for answers or guidance in these times of uncertainty:
The comfort of the rich depends upon an abundant supply of the poor.
The best defense is education, experience, and information.
The Shrinking Google SERP
It's getting harder to breathe in the SERPs. We routinely point this out in various blog posts, but I thought now would be a good time to revisit the problem. As it increasingly appears that Panda was less about content farms and more about something a bit more sinister, the incredibly shrinking organic SERP is cause for concern as well:
Here you see one site with extended AdWords and organic sitelinks:
If you're not in the top 3, well then you're pretty much not in the game:
So much for SERP diversity:
A few key takeaways when looking at these results are that:
Competing and monetizing just on search traffic is probably not a good long term strategy (but can work short-mid term)
As Google continues to layer on Google "stuff", it becomes another competitor that is almost impossible to beat
You might want to explore PPC a bit more than you have in the past for more visibility, if the margins are available
It might make sense to start evaluating the cost of your SEO efforts and figuring out how they could translate into getting your foot into other areas of online traffic acquisition: targeted advertising, media buys, and monitoring blogs and forums for discussions about your market, keywords, or products. Spread the funds out to get maximum exposure in multiple areas (for both short term and long term positioning)
As you can see from the images, the long term viability of just relying on search engine traffic is likely to be a losing proposition.
Leveraging Your SEO Skills
SEO has long been more about marketing than making sure your title tags are perfect. A good SEO is a good marketer, and it's been said on this blog over the years that SEO really should be part of a more holistic approach to an overall marketing strategy. However, many of you reading this might be in the affiliate or AdSense camp rather than at a full-service SEO agency.
The good news for the SEO agency is that you have all sorts of ways to leverage your SEO skills. You can get into things like:
conversion optimization
email marketing
online media buys and adverts
analytics services
social media services
the venerable "design and development" market
offline advertising and tracking
local SEO and Google Places SEO as well as Yahoo! and Bing local
The options listed above are all items that can quite easily come up within the context of an SEO proposal or discussion and should make for fairly doable cross-sales or up-sells.
The problem with just selling rankings or traffic is that it's all too easy for the client to dismiss you after you've achieved rankings. What's worse, even if you achieve rankings there are no guarantees of results and going back to the client 4 months in to up-sell conversion optimization is usually a non-starter if the stuff you've delivered thus far is of little value ROI-wise.
No matter how effective your performance is, as an SEO you are working in someone else's ecosystem. Google may extend the AdWords ads or insert their own product search or local search or video search results right at the top and push your work down.
Part of your SEO career planning, if you are in it for the long haul, should involve you starting to take a serious look at some level of client work and/or refine your product offering to a more holistic one rather than one with a singular focus.
Affiliates Feeling the Squeeze
Since Google has clearly shown its true colors with respect to how they view affiliates on the AdWords side, is it that hard to believe they view affiliates the same way on the organic side? In fact, one of our members sent us this email after applying their AdWords credit:
Hello Aaron Wall,
I just signed up for the Get $75 of Free AdWords with Google Adwords. After receiving an e-mail stating that I was to call an 877 number of Google Adwords, I was told in my phone call that affiliate marketing accounts were not accepted. I guess I confused by this statement. Is this in error? Or am I not understanding the Tip #3 for setting up an account for Google Adwords for promoting a website?
Thank you in advance for your time.
Sincerely,
Carole
Do you remember this video where the body language suggests AdSense is ok but OMG YOU'RE AN AFFILIATE (at approximately 0:38)!
Diversity, Diversity, Diversity
To counteract being viewed as a "thin affiliate", I'd suggest reading up on SugarRae's blog, specifically her affiliate marketing section.
Clearly you can build a quality affiliate site that is quite profitable, but how many can you reasonably expect to build out into thick, market-leading sites before internal costs scale to the point where margins become an issue, or until Google monopolizes your SERPs?
Diversity is still key with respect to revenue streams, but what you should be aiming for is diversity across revenue types (affiliate, AdSense, client, product) rather than garden-variety diversity within one revenue type (different sites all using the same monetization method).
Where Do You Go From Here
The best thing you can do for your business is to stay out of debt. This is much easier said than done, especially if you live in the US where debt slavery is the norm and gets pretty ugly before you even have a chance to earn real money.
Being mostly debt free with some savings put away not only puts you in a better spot than most consumers, it also leaves you less subject to the whimsical nature of Google. You can afford to be more patient, invest in new opportunities, and be less stressed out if some of your projects turn down for a bit.
I'd venture to say that debt is probably a major reason why some folks went out of business after the Panda update and being debt free with some backup savings and income diversity helped keep some folks in the game.
Taking the First Steps
I would suggest that you take stock of your personal financial situation, your current revenue streams, your skill sets, and your feeling on the overall landscape of the industry and then start to make some decisions on the future of your career. With any update or change there are usually new opportunities that arise from the ashes of Google's scorched earth policy (or policies).
Now that Google is overtly spamming their own "organic" search results to try to capture the second click, riding along as a parasite by posting content on Google's own parasitical platforms is likely to be an extremely profitable strategy in the coming years.
You might not make as much money posting content to YouTube as you made posting it to your own site, but you NEVER have to worry about YouTube disappearing from the search results.
The barrier to entry is getting much higher and rising fast. You need patience, capital, reliable/trusted information sources, and a bit of luck to succeed going forward. Within the span of a couple years it's gone from (mostly) the wild west to survival of the fittest. How do you plan on surviving?
Another successful SMX East is in the books. From all accounts, the event went off without a hitch. Kudos to Danny Sullivan, Claire Schoen, and crew, as the caliber of speakers, sessions, and attendees was top notch, as always. Judging from the event, search marketing is alive and thriving more than ever before. There was a healthy mix of industry experts, consultants, large corporations, agencies, and small businesses. The sessions covered a broad range of topics, from beginner link building fundamentals to more advanced technical SEO sessions covering site architecture, technical coding optimization, and everything in between. A huge thank you goes out to the organizers for a job well done.
Two themes surfaced regularly: Panda and Google Plus/+1. Clearly, there are still many webmasters struggling with Panda and how to properly handle content in the new post-Panda world. The search engines are addressing this and giving webmasters and SEOs more tools and information to organize their websites correctly. Judging from some of the presentations, Google is very dedicated to their Plus and +1 initiatives, which will have a large effect on SEO should end user usage continue to increase.
Below are tidbits and takeaways from the conference, from an SEO perspective. Enjoy!
Rich Snippets & Microdata
Microformats were the original snippet format; however, they are being replaced by the new and evolving microdata standard (which Schema.org, Google, and Bing are developing and placing resources toward). Some notes from the presentations:
General consensus is rich snippets can greatly help in getting your content noticed.
In one example given, Eatocracy added the hRecipe tag to their pages, and immediately saw a 47% increase in their recipes being picked up and indexed into Google (which does support this in their recipe search). Additionally, they saw a 22% increase in their recipe traffic.
CNN started using Yahoo SearchMonkey / RDFa, and saw a 35% increase in their video content on Google Video search and a 22% increase in overall search traffic. However, they removed the additional code from their site as it increased their page load time. The takeaway is that you should plan to integrate this into your own dev cycle, your CMS, or your templates.
Per Google, their studies show that sites w/ rich snippets have a better CTR as well. Rich Snippets Engineer at Google, RV Guha noted, “From our experiments, it seemed that giving the user a better idea of what to expect on the page increases the click-through rate on the search results. So if the webmasters do this, it’s really good for them. They get more traffic. It’s good for users because they have a better idea of what to expect on the page. And, overall, it’s good for the web.”
Rich snippets only work for one site (no cross site references).
Sites like LinkedIn and Google Profiles still use microformats. Google has also provided a testing tool in WMT, but it is a bit buggy and may throw false errors. If you don't see your snippets show up in the SERPs, it's likely caused by longer-than-preferred latency/load times, errors in your code, or a random Google bug (per Google).
The current types of rich snippets: reviews, people, products, businesses & organizations, recipes, events, music
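To make the above concrete, here is a minimal sketch of what recipe markup might look like using the newer schema.org microdata vocabulary. The recipe name, rating, and property values below are hypothetical, chosen only for illustration:

```html
<!-- Hypothetical recipe page marked up with schema.org microdata.
     itemprop attributes expose the name, prep time, and rating
     that power the rich snippet in the SERP. -->
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Classic Pancakes</h1>
  <span itemprop="author">Jane Example</span>
  <time itemprop="prepTime" datetime="PT15M">15 minutes</time>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 by
    <span itemprop="reviewCount">128</span> reviewers
  </div>
</div>
```

The hRecipe microformat mentioned in the Eatocracy example achieves the same end with class names instead of itemprop attributes; Google's Rich Snippets Testing Tool can preview how either version renders.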
One audience member asked how to handle ‘subcategory’ pages that are often created on ecommerce sites, such as “Sort Prices $0-$5”, “Prices $5-$25”, etc. The question was whether or not to use the “rel=canonical” tag and point the pages back to the main page. The panelists agreed that those pages should be blocked completely and should not use the canonical tag. The Google representative said not only do these pages not add value to the engine’s index, but they also eat up the site’s crawl budget.
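Following the panel's advice, blocking those sort/price pages outright would typically be done in robots.txt rather than with a canonical tag, since a Disallow rule stops the crawler from fetching the pages at all and so preserves crawl budget. The URL patterns below are hypothetical:

```
# robots.txt -- block hypothetical price-sort subcategory pages
# (a Disallow prevents crawling entirely, unlike rel=canonical)
User-agent: *
Disallow: /products/sort-price-0-5/
Disallow: /products/sort-price-5-25/
Disallow: /*?sort=price
```

Note the trade-off: pages blocked this way are never crawled, so any link equity they hold is not consolidated the way a canonical tag would consolidate it.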
If you see the warning "we're seeing a high # of URLs" in Webmaster Tools, most times it's a duplicate content issue.
One audience member asked: do you look at subdomain as part of the main domain?
Blekko - no inheritance from main domain
Google - "it depends". Sometimes it is inherited, sometimes not.
Bing - we look and try to determine if subdomain is a standalone business/website and will get treated differently based on that determination
One question touched on removing URLs from Google’s index. Google advised that a removed URL may or may not stay in the index for a period of time, and that to expedite removal of a URL one should use the remove-URL tool in Webmaster Tools.
Duane from Bing was adamant about keeping your submitted sitemap clean. The threshold is 1%: if more than 1% of the URLs in your submitted sitemap have issues, Bing will “lose trust” in your website.
Panelists advised to make your 404 pages useful to the user
It may not be breaking news, but Bing and Google both said unequivocally: duplicate content does hurt you
Google commented they are big fans of HTML 5 technology
At this point it seems Google will crawl a page if +1 is present, regardless of the robots.txt. This could possibly create issues with trying to not crawl certain pages to avoid dup content. More information found here: http://www.webmasterworld.com/google/4358033.htm
Panelists advised to spend a lot of energy “containing urls” on your website and to be thoughtful about which URLs you are getting out there
Bing and Google confirmed that “PageRank sculpting” is misunderstood and not effective. For example, if a page has 5 outgoing links and link juice is spread 20% to each of the 5 links, then when you nofollow one of the links, the link juice distribution will not become 25% to each of the remaining 4 links. It remains 4 x 20%. In essence, you have just evaporated potential link juice.
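As a sketch of the arithmetic the engines described (the 20% figures are the panel's illustration, not a published formula, and the link targets are hypothetical):

```html
<!-- A page with 5 outbound links: each passes roughly 1/5 (20%)
     of whatever link equity the page has to give. -->
<a href="/a">A</a> <a href="/b">B</a>
<a href="/c">C</a> <a href="/d">D</a>
<!-- Adding rel="nofollow" here does NOT bump the other four links
     to 25% each. They still pass ~20% each, and this link's 20%
     simply evaporates -- so sculpting only wastes equity. -->
<a href="/e" rel="nofollow">E</a>
```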
Google Plus and +1
These were hot topics at this year’s SMX East. Multiple sessions covered Google Plus and +1 in depth.
Speaker Benjamin Vigneron from Esearchvision covered the basics of Google Plus and +1. He noted that a +1 on a search result will also +1 the corresponding PPC ad/landing page.
With PPC, +1 could have a significant effect on Ad Rank by affecting each of the Quality Score factors, including quality of the landing page, CTR, and the ad’s past performance.
It is interesting that AdWords could conceivably add segmenting on all the information in Google Plus (similar to Facebook), i.e. gender, age, etc.
Christian Oestlien, the Google Product Manager for Google Plus, spoke about Google Plus features and fielded questions. He mentioned Google is testing and experimenting with celebrity endorsements +1'ing and showed an example SERP with a +1 annotation under the search result (for example “Kim Kardashian has +1’ed” Brand X or search result X). He noted Google is seeing much higher CTR with the +1 annotation and that usage for the “Circles” feature is relatively high.
Google software engineer Tiffany Oberoi was also present on the panel. She noted +1 is NOT a ranking factor, but social search is of course still implemented in search results. She confirmed Facebook likes have no impact on rankings, but also noted regarding social signals, “explicit user feedback is like gold for us". She also touched on spam with +1 and said she is currently working with the spam team. Regarding +1’s and spamming, she said to think of +1’s similarly to links: the same guidelines could apply. Google wants to use them as a real signal, and using them in an unnatural way will not be good for you.
Hardcore Local Search Tactics
Panelists:
Matt McGee - Search Engine Land
Mike Ramsey - Nifty Marketing
Will Scott - Search Influence
Panelists here gave an encore presentation of the session these folks put on at SMX Advanced in Seattle. The content was excellent and definitely deserved another run through. Here are the notes:
On July 21st, Google removed citations from their Places listings. While they have been removed from public view, they are still used. Sources like Whitespark (http://www.whitespark.ca/) can be very helpful in uncovering citation building opportunities.
Citation accuracy is among the most important factors in getting your business to rank in the OneBox or 7-Pack. Doing a custom Google search of “business name”+”address”+”phone number” will help you determine which other sources Google sees as citation sources.
There was a large gap between the average number of IYP reviews on ranked listings vs. non-ranked listings, indicating that IYP reviews do in fact carry quite a bit of listing weight.
Offsite citations/data appear to be the no. 1 ranking factor in Places listings
Linking Root Domains appear to be the no. 2 ranking factor in Places listings
Exact match anchor links appear to be the no. 3 ranking factor in Places listings
Links are the new citations for local in 2011-12
Building a custom landing page to link your Places Listing to appears to be a huge success factor. Include your Name, Address, Phone (NAP) in the title tag
Design that landing page to mirror a Places listing on their site w/ a map, business hours, contact data, etc.
If needed, submit your contact/location page as your Places URL/Landing Page which will create a stronger geo scent
When trying to understand how users are searching for your client, Insights for Search is a great tool as you can find Geo targeted data w/ KW differentiation (ie Lawyer vs Attorney, which is used more in that area)
Local requires a different mindset from traditional SEO
Optimize location (local SEO) vs Optimize websites (traditional SEO)
Blended search is about matching them up
PageRank of Places URL does NOT seem to affect Local ranking -(source: David Mihm)
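A minimal sketch of the kind of Places landing page described in the notes above, with NAP in the title tag and on-page details mirroring a Places listing. The business name, address, and phone number are hypothetical:

```html
<!-- Hypothetical local landing page: Name, Address, Phone (NAP)
     in the title tag, with listing-style details on the page. -->
<html>
<head>
  <title>Acme Plumbing | 123 Main St, Spokane, WA | (509) 555-0100</title>
</head>
<body>
  <h1>Acme Plumbing</h1>
  <p>123 Main St, Spokane, WA 99201 &middot; (509) 555-0100</p>
  <p>Hours: Mon&ndash;Fri 8am&ndash;6pm</p>
  <!-- An embedded map and contact form would go here to mirror
       the Places listing and strengthen the geo scent. -->
</body>
</html>
```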
Multi-Location Tips
Flat site architecture beginning w/ a “Store Locator” page
Great Example, lakeland.co.uk/StoreLocator.action
Give each location its own page
Great Example, lakeland.co.uk/stores/aberdeen
Cross link nearby locations w/ geo anchor text
Ensure the use of KML Sitemap in Google WMT
Encourage Community Edits - Make Use of Google’s Map Maker
Include Geo data in Facebook pages and article engines
Panda Recovery Case Study - High Gear Media
Speaker Matt Heist from High Gear Media covered their experiences over the past 8 months with recovering from Panda. High Gear Media is an online publisher of auto news and reviews.
Heist walked through the company’s pre-Panda strategy and explained their contrasting new post-Panda strategy. The original strategy was many auto review niche sites across a broad range of auto makes, models, and manufacturers. The company originally had 107 sites and 20+ writers, and dispersed content amongst all the sites. The content was "splashing" everywhere, unfocused. The “large network of microsites” strategy was working, and traffic was climbing each month. Then Panda hit - hard. Traffic plummeted beginning this past spring. Leaders at High Gear were forced to reevaluate their strategy and concluded that a more focused approach was better for users and consequently would help search traffic recover.
High Gear took the following actions:
Eliminated most of their properties completely (301'ed) and pared them down to 7 total sites with 4 being ‘core’: FamilyCarGuide, Motorauthority, GreenCarReports, TheCarConnection.
Properly canonicalized duplicate content
Aggregated content with strong user engagement was KEPT, but not indexed
They made the hard decision to eliminate content that could be making money but was not good for the long term
Dedicated significant resources to redesigning each of the 7 remaining sites
Their strategy seems to be working. Heist noted traffic has “flipped, plus some”. According to Heist, here are the learnings:
High Gear Media believes that premium content will prevail and that Panda will help that
Advertisers like bigger brands - it is now easier to sell ads and for more $ with fewer, more powerful sites
With the evolution of Social (joining Search from a distribution perspective), premium content that is authoritative AND fresh will flourish
Raven Tools
We were able to meet up with the friendly staff over at Raven Tools, sit down with them, and learn a bit more about their product. We personally have been using Raven for about a year now, and highly recommend it. There are several features in the works that will make this an even more incredible product. If you haven't used them, we would HIGHLY suggest giving the tools a run. They are partnering with new companies constantly, and as such are building out a best-in-class SEO management product.
Upcoming Features:
A new feature they are working on is a Chrome toolbar to complement the current Firefox toolbar
Another feature coming is “templated messaging” for link requests and manual link building, which will include BCCs back to records. Templated messaging will be built into their Contact Manager, but they are working on making that functionality available in the toolbar as well.
Another upcoming feature is file management. Raven Tools engineers are looking at integrating Dropbox into the system to allow files to be associated with other data and records.
Co-founder Jon Henshaw alluded many times to the idea that link building, and consequently their toolset, will become more and more based on relationships in the future. He also suggested that traffic can, or in some cases should, be associated with PEOPLE as the referrer rather than a website (i.e. x amount of traffic came from person A, whether via their Facebook, Twitter, blog, or website). In other words, a relationship management system looks to be an integral part of the future of Raven Tools.
For future updates, Raventools takes explicit user feedback greatly into account. If you have a feature request or a software integration request, please contact: http://raventools.com/feature-requests/
Regarding MajesticSEO and OSE/Linkscape, they will be more fully integrating both into the Research section of Raven. That means they’ll be adding as much functionality into Raven as those APIs will allow. In addition to getting more full access to that data, users will be able to easily add that data to other tools, like the Keyword and Competitor Managers, Rank Tracker, etc.
Speed is the number one priority right now. They have full-time staff that are solely dedicated to speeding up the system. The goal is to make it run as fast as a desktop app.
Long term - 3rd party integration will be a constant (and should accelerate) for the platform for the foreseeable future.
Regarding Panda, one panelist referenced what he called a website’s “Content Performance Ratio”, referring to the percentage of content on a site that is good versus bad, or ‘performing vs. non-performing’, and using that ratio as a gauge of the health of a website.
One panelist also noted that in his experience it takes 3-4 requests on a 404 before the search engine believes you and removes the URL from the index
Panelists in the “Ask the SEO” session said to pay close attention to anchor text diversity and human engagement signals
Author bio:
Jake Puhl is the Co-Founder/Co-Owner of Firegang Digital Marketing, a local search marketing company specializing in all aspects of "Local", including custom web design, SEO, Google Places, and local PPC advertising. Jake has personally consulted for businesses from Hawaii to New York and everywhere in between. Jake can be contacted at jacobpuhl at firegang.com.
Then more ads below it. Then a single organic listing with huge sublinks too. And unless you have a huge monitor at that point you are "below the fold."
Negative advertising in AdWords is not allowed. So long as you build enough brand signals & pay the Google toll booth, public relations issues & reputation issues won't be accessible to searchers unless they learn to skip over the first screen of search results.
While it is generally against Google's TOS for advertisers to double dip in AdWords (outside of the official prescribed oversize ad units highlighted above), Google is doing exactly that with their multitude of brands.
Another friend sent me a message today: "just got a whole swathe of non-interlinked microsites torched today. Bastard! Just watching the rank reports coming in..."
I haven't seen his sites, but based on how he described them "whole swathe" I wouldn't guess the quality to be super high. One thing you could say for them was that they were unique.
Where putting in the effort to create original content falls flat on its face is when search engines choose to rank aggregators (or later copies) above the original source. The issue has gotten so out of hand that Google has come right out & asked for help with it.
When Google launched Knol I was quick to flame them after I saw them ranking recycled content on Knol ahead of the original source. The Knol even highlighted similar works, showing that Google allowed Knol to outrank earlier sources of the same work.
Some Google+ SEO factors now trump linking as prime algo ingredient. Google+ is already and clearly influencing rankings. I watched a presentation last night that definitely showed that rankings can occur from Google+ postings and photos with no other means of support.
As Google+ grows - so will Google's understanding of how to use it as rankings signals.
We are not playing Google+ because we want to - we are playing Google+ because we have to.
I read that sorta half hoping he was wrong, but know he rarely is.
And then today Google hit me across the head with a 2x4, proving he was right again.
What's worse is that isn't from a friend, isn't from the original source, is the full article wholesale, from Google Reader, and the source URL has Google's feedproxy in it.
But Google's dominance in search coupled with their dominance in display (from owning DoubleClick & YouTube) has led competing portals to team up to try to compete against Google with display ads.
And, if the big portals are struggling that much, then at the individual publisher level, how do you profitably produce content when Google takes your content & ranks their copy ahead of yours?