Jim Boykin recently released a free but powerful tool that checks for broken links and redirects and can also generate a Google Sitemap.
Being a free, web-based tool, you might think it's a bit lightweight, but you'd be wrong :) It can crawl up to 10,000 internal pages, up to 5 runs per day per user.
In addition to the features mentioned above, the tool offers other helpful data points, the ability to export the data to CSV/Excel or HTML, and a Google XML Sitemap generator.
The other data points available to you are:
URL of the page spidered
Link to an On-Page SEO report for that URL
Link depth from the home page
HTTP status code
Internal links to the page (with the ability to get a report on the in-links themselves)
External links on the page (a one-click report is available to see the outlinks)
Overall size of the page with a link to the Google page speed tool (cool!)
Link to their Image check tool (size, alt text, header check of the page)
Rows for Title Tag, Meta Description, and Meta Keywords
Canonical tag field
Using the Tool
The tool is really easy to use: just enter the domain, the crawl depth, and your email if you don't care to watch the magic happen live :)
For larger crawls entering your email makes a lot of sense as it can take a bit on big crawls:
Click Ninja Check and off you go!
Working With The Data
The top of the results page auto-updates and shows you:
Status of the report
Internal pages crawled
External links found
Internal redirects found
External redirects found
Internal and External errors
Clicking any of the yellow text brings you to that specific report table (these tables appear below the main results shown next).
This is also where you can export the XML sitemap or download results to Excel/HTML.
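The XML sitemap the tool exports follows the standard sitemap protocol, which has a very simple structure. As a minimal sketch (with hypothetical URLs, and not the tool's actual code), here is how such a file can be generated:

```python
# Minimal sketch of a Google XML sitemap generator (hypothetical URLs).
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # The sitemap protocol wraps each URL in a <url><loc> pair
    # inside a namespaced <urlset> root element.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://example.com/", "http://example.com/about"]))
```

Optional fields like `<lastmod>` and `<changefreq>` can be added per URL the same way.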
The results pane (broken up into 2 images given the horizontal length of the table) looks like:
More to the right is:
The On Page Report
If you click on the On Page Report link in the first table you are brought to their free On-Page Optimization Analysis tool. Enter the URL and 5 targeted phrases:
Their tool does the following:
Metadata tool: Displays text in title tags and meta elements
Keyword density tool: Reveals statistics for linked and unlinked content
Keyword optimization tool: Shows the number of words used in the content, including anchor text of internal and external links
Link Accounting tool: Displays the number and types of links used
Header check tool: Shows HTTP Status Response codes for links
Source code tool: Provides quick access to on-page HTML source code
The data is presented in the same table form as the original crawl. This first section shows the selected domain and keywords in addition to on-page items like your title tag, meta description, meta keywords, external links on the page, and words on the page (linked and non-linked text).
You can also see the density of all words on the page, in addition to the density of words that are not links.
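Keyword density here is simply each word's share of the total word count. A toy illustration (my own simplification, not the tool's implementation):

```python
# Toy keyword-density calculation: each word's share of total words.
from collections import Counter

def keyword_density(text):
    # Normalize: strip trailing punctuation and lowercase each word.
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(words)
    total = len(words)
    return {w: c / total for w, c in counts.items()}

density = keyword_density("seo tools help with seo audits")
print(density["seo"])  # "seo" is 2 of 6 words
```

The "non-linked density" variant would simply exclude words that appear inside anchor text before counting.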
Next up is a word breakdown as well as the internal links on the page (with titles, link text, and response codes).
The word cloud displays targeted keywords in red, linked words underlined, and non-linked words as regular text.
You'll see a total word count, non-linked word count, linked word count, and total unique words on the page.
This can be helpful in digging into deep on-page optimization factors as well as your internal link layout on a per page basis:
Next, you'll get a nice breakdown of internal links and the text of those links, the titles, and the words in the URL.
Also, you can see any links to sub-domains as well as external links (with anchor text and response codes):
Each section has a show/hide option where you can see all the data or just a snippet.
Another report you get access to is the image checker (accessible from the main report "Check Image Info" option):
Here you'll get a report that shows a breakdown of the files and redirects on the page in addition to the image link, image dimensions, file size, alt text, and a spot to click to view the image:
After that section is the link section, which shows the actual link, the file type (HTML, CSS, etc.), status code, and a link check (broken, redirect, ok, and so on).
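The broken/redirect/ok classification is essentially a mapping from HTTP status code ranges. A rough sketch of how a link checker might bucket them (these buckets are my own guess, not the tool's exact logic):

```python
# Rough sketch: classify an HTTP status code the way a link checker might.
# These buckets are my own approximation, not the tool's exact logic.
def classify_status(code):
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"      # 301/302 etc. -- worth consolidating
    if 400 <= code < 500:
        return "broken"        # 404s are the classic broken link
    return "server error"      # 5xx -- may be transient

for code in (200, 301, 404, 500):
    print(code, classify_status(code))
```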
The main report referenced at the beginning of this post is the Internal Page Report. There are five additional reports:
This report will show you:
Internal links to the external link
Actual link URL
Link anchor text
Where the link was first found on the domain
Internal and External Redirects
Internal links to the external link
Actual link URL
Link anchor text
Page the URL redirects to
Internal and External Errors
Internal links to the external link
Actual link URL
Link anchor text
Give it a Spin
It's free, but more importantly it's quite useful. I find a lot of value in this tool in a variety of ways, but mostly in the ability to home in on your (or your competitor's) internal site and linking structure.
There are certainly a few on-page tools on the market, but I found this tool easy to use and full of helpful information, especially with internal structure and link data.
the page disclaims that it is not endorsed by Google
the page embeds a Google search box
the page strips out the Yahoo! Directory search box
the page strips out the Yahoo! Directory PPC ads (on the categories which have them)
the page strips out the Yahoo! Directory logo
Recall that when Google ran their bogus sting operation on Bing, Google engineers suggested that Bing was underhanded for using user clickstreams to potentially influence their search results. That level of outrage & the smear PR campaign look ridiculous when compared against Google's behavior toward the Yahoo! Directory, which is orders of magnitude worse:
Bing vs Google
Google vs Yahoo! Directory
Uses user-experience across a wide range of search engines to potentially impact a limited number of search queries in a minor way.
Shags expensive hand-created editorial content wholesale & hosts it on Google.com.
Bing hosts Bing search results using Bing snippets.
Google hosts Yahoo! Directory results using Yahoo! Directory listing content & keeps all the user data.
Bing publicly claimed for years to be using a user-driven search signal based on query streams.
Google removes the Yahoo! Directory logo to format the page. Does Google remove the Google logo from Google.com when formatting for mobile? Nope.
Bing sells their own ads & is not scraping Google content wholesale.
Google scrapes Yahoo! Directory content wholesale & strips out the sidebar CPC ads.
Bing puts their own search box on their own website.
Google puts their own search box on the content of the Yahoo! Directory.
Google claimed that Bing was using "their data" when tracking end user behavior.
Google hosts the Yahoo! Directory page, allowing themselves to fully track user behavior, while robbing Yahoo! of the opportunity to even see their own data with how users interact with their own listings.
In the above case the publisher absorbs 100% of the editorial cost & Google absorbs nearly 100% of the benefit (while disclaiming they do not endorse the page they host, wrap in their own search ad, and track user behavior on).
As we move into a search market where the search engines give you a slightly larger listing for marking up your pages with rich snippets, you will see a short term 10% or 20% lift in traffic followed by a 50% or more decline when Google enters your market with "instant answers."
The ads remain up top & the organic results get pushed down. It isn't scraping if they get 10 or 20 competitors to do it & then use the aggregate data to launch a competing service ... talk to the bankrupt Yellow Pages companies & ask them how Google has helped to build their businesses.
Update: looks like this has been around for a while...though when I spoke to numerous friends nobody had ever seen it before. The only reason I came across it was seeing a referrer through a new page type from Google & not knowing what the heck it was. Clearly this search option doesn't get much traffic because Google even removes their own ads from their own search results. I am glad to know this isn't something that is widespread, though still surprised it exists at all given that it effectively removes monetization from the publisher & takes the content wholesale and re-publishes it across domain names.
I was recently chatting with Jonah Stein about Panda & we decided it probably made sense to do a full on interview.
You mentioned that you had a couple customers that were hit by Panda. What sort of impact did that have on those websites?
Both of these sites saw an immediate hit of about 35% of Google traffic. Rankings dropped 3-7 spots. The traffic hit was across the board, especially in the case of GreatSchools, which saw all content types hit (school profile pages, editorial content, UGC).
GreatSchools was hit in the 4-9 (Panda 2.0) update and called out in the Sistrix analysis.
How hard has GreatSchools been hit? Sistrix data suggested that GreatSchools was losing about 56% of Google traffic. The real answer is that organic Google-referred traffic to the site fell 30% on April 11 (week over week) and overall site entries are down 16%. Total page views are down 13%. The penalty, of course, is a "site wide" penalty, but not all entry page types are being affected equally.
Google suggested that there were perhaps some false positives, but that they were generally pretty satisfied with the algorithms. For sites that were hit, how did clients respond to their SEOs? I mean, did the SEO get a lot of the blame, or did the clients get that the change was sort of a massive black swan?
I think I actually took it harder than they did. Sure, it hit their bottom line pretty hard, but it hit my ego. Getting paid is important but the real rush for me is ranking #1.
Fortunately none of my clients think they are inherently entitled to Google traffic, so I didn't get blamed. They were happy that I was on top of it (telling them before they noticed) and primarily wanted to know what Panda was about.
Once you get over the initial shock and the grieving, responding to Panda was a Rorschach test: everyone saw something different. But it is also an interesting self-reflection, especially when the initial advice coming from Greg Boser and a few others was to start to de-index content.
For clients who are not ad driven, the other interesting aspect is that generally speaking conversions were not hurt as much as traffic, so once you start focusing on the bottom line you discover the pain is a little less severe than it seemed initially.
So you mentioned that not all pages were impacted equally. I think pages where there was more competition were generally hit harder than pages that had less competition. Is that sort of inline with what you saw?
Originally I thought that was maybe the case, but as I looked at the data during the recovery process I became convinced that Panda is really the public face of a much deeper switch towards user engagement. While the Panda score is sitewide, the engagement "penalty" or weighting effect also occurs at the individual page level. The pages or content areas that were hurt less by Panda seem to be the ones that were not also being hurt by the engagement issue.
On one of my clients we moved a couple sections to sub-domains, following the HubPages example and the experience of some members of your community. The interesting thing is that we moved the blog from /blog to blog.domain.com and we moved one vertical niche from /vertical-kw to vertical-kw.example.com. The vertical almost immediately recovered to pre-panda levels while the traffic to the blog stayed flat.
So the vertical was suddenly getting 2x the traffic. On the next panda push the vertical dropped 20% but that was still a huge improvement over before we moved to the subdomain. The blog didn't budge.
The primary domain also seemed to improve some, but it was hard to isolate that from the impact of all of the other changes, improvements and content consolidation we were doing.
After the next panda data push did not kill the vertical sub domain, we elected to move a second one. On the next data push, everything recovered - a clean bill of health - no pandalization at all.
GreatSchools completely recovered the same day and that was November 11th, so Panda 3.0. I cannot isolate the impact of any particular change versus Google tweaking the algorithm and I think both sites were potentially edge cases for Panda anyway.
Now that we are in 3.3 or whatever the numbering calls it, I can say with confidence that moving "bad" content to a sub-domain carries the Panda score with it and you won't get any significant recovery.
You mentioned Greg Boser suggesting deindexing & doing some consolidation. Outside of canonicalization, did you test doing massive deindexing (or were subdomains your main means of testing isolation)?
We definitely collapsed a lot of content, mostly 301s, but maybe 25% of it was just de-indexing. That was the first response. We took 1,150 category/keyword-focused landing pages and reduced them to maybe 300. We did see some gains but nothing that resembled the huge boost when Panda was lifted.
Back to the Rorschach test: we did a lot of improvements that yielded incremental gains but were still weighed down. It reminds me of when I used to work on cars. I had this old Audi 100 that was running poorly, so I did a complete tune-up: new wires, plugs, etc., but it was still running badly. Then I noticed the jet in the carburetor was misaligned. As soon as I fixed that, boom, the car was running great. Everything else we fixed may have been the right thing to do for SEO and/or users, but it didn't solve the problem we were experiencing.
The other interesting thing is that I had a 3rd client who appeared to get hit by Panda or at least suffer from Panda like symptoms after their host went down for about 9 hours. Rankings tanked across the board, traffic down 50% for 10 days. They fully recovered on the next panda push. My theory is that this outage pushed their engagement metrics over the edge somehow. Of course, it may not have really been Panda at all but the ranking reports and traffic drops felt like Panda. The timing was after November 11th, so it was a more recent version of the Panda infrastructure.
Panda 1.0 was clearly a rush job, and 2.0 seemed to be a response to the issues it created and the fact that Demand Media got a free pass. I think it took 6-8 months for them to really get the infrastructure robust.
My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the "best content" based on your user engagement and try to improve that.
I don't know if it was intentional or not but engagement as a relevancy factor winds up punishing sites who have built links and traffic through link bait and infographics because by definition these users have a very high bounce rate and a relatively low time on site. Look at the behavioral metrics in GA; if your content has 50% of people spending less than 10 seconds, that may be a problem or that may be normal. The key is to look below that top graph and see if you have a bell curve or if the next largest segment is the 11-30 second crowd.
I also think Panda is rewarding sites that have a diversified traffic stream. The higher the percentage of your users who come direct, search for you by name (brand), or visit you from social, the more likely Google is to see your content as high quality. Think of this from the engine's point of view instead of the site owner's. Algorithmic relevancy was enough until we all learned to game that; then came links as a vote of trust. While everyone was looking at social and talking about likes as the new links, they jumped ahead to the big-data solution and baked an algorithm that tries to measure the interaction of users as a whole with your site. The more time people spend on your site, the more ways they find it aside from organic search, and the more they search for you by name, the more confident Google is that you are a good site.
Based on that, are there some sites that you think have absolutely no chance of recovery? In some cases did getting hit by Panda cause sites to show even worse user metrics? (There was a guy named walkman on WebmasterWorld who suggested that some sites that had "size 13 shoe out of stock" might no longer rank for the head keywords but would rank for the "size 13" related queries.)
I certainly think that if you have an IYP and you have been hit with Panda you're toast unless you find a way to get huge amounts of fresh content (Yelp). I don't think the size 13 shoe site has a chance, but that is not about Panda. Google is about to roll out lots of semantic search changes, and the only way ecommerce sites (outside of the 10 or so brands that dominate Google Products) will have a chance is with schema.org markup and Google's next-generation search. The truth is the results for a search for shoes by size are a miserable experience at the moment. I wear size 16 EEEE, so I have a certain amount of expertise on this topic. :)
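For concreteness, schema.org product markup is typically embedded in the page as JSON-LD. A toy example of what such markup might look like for a shoe listing (all field values here are hypothetical):

```python
# Toy example of schema.org Product markup serialized as JSON-LD.
# All field values are hypothetical illustration, not a real listing.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Size 16 EEEE running shoe",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# In practice this JSON is embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(product, indent=2))
```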
Do you see Schema as a real chance for small players? Or something that is a short-term carrot before they get beat with the stick? I look at hotel search results & I fear that pattern spreading as more people format their content in a format that is easy to scrape & displace. (For illustration purposes, in the below image, the areas in red are clicks that Google is paid for or clicks to fraternal Google pages.)
I doubt small players will be able to use Schema as a lifeline but it may keep you in the game long enough to transition into being a brand. The reason I have taken your advice about brands to heart and preach it to my clients is that it is short sighted to believe that any of the SEO niche strategies are going to survive if they are not supported with PR, social, PPC and display.
More importantly, however, is that they are going to focus on meeting the needs of the user as opposed to simply converting them during that visit. To use a baseball analogy, we have spent 15 years keeping score of home runs while the companies that are winning the game have been tracking walks, singles, doubles and outs. Schema may deliver some short term opportunities for traffic but I don't think size13shoes.com will be saved by the magic of semantic markup.
On the other hand, if I were running an ecommerce store, particularly if I were competing with Amazon, Best Buy, Walmart and the handful of giant brands that dominate the product listings in the SERP, I wouldn't bury my head in the sand and pretend that everyone else wasn't moving in that direction anyway. Maybe if you can do it right you can emerge as a winner, at least over the short and medium term.
In that sense SEO is a moving target, where "best practices" depend on the timing in the marketplace, the site you are applying the strategy to, and the cost of implementation.
Absolutely...but that is only half the story. If you are an entrepreneur who likes to build sites based on a monetization strategy, then it is a moving target where you always have to keep your eyes on the horizon. For most of my clients the name of the game is actually to focus on trying to own your keyword space and take advantage of inertia. That is to say that if you understand the keywords you want to target, develop a strategy for them and then go out and be a solid brand, you will eventually win. Most of my clients rank in the top couple of spots for the key terms for their industry with a fairly conservative slow and steady strategy, but I wouldn't accept a new client who comes to me and says they want to rank a new site #1 for credit cards or debt consolidation and they have $200,000 to spend... or even $2,000,000. We may be able to get there for the short term but not with strategies that will stand the test of time.
Of course, as I illustrated with the Nuts.com example on SearchEngineLand last month, the same strategy that works on a 14 year old domain may not be as effective for a newer site, even if you 301 that old domain. SEO is an art, not a science. As practitioners we need to constantly be following the latest developments but the real skill is in knowing when to apply them and how much; even then occasionally the results are surprising, disappointing or both.
I think there is a bit of a chicken vs egg problem there then if a company can't access a strong SEO without already having both significant capital & a bit of traction in the marketplace. As Google keeps making SEO more complex & more expensive do you think that will drive a lot of small players out of the market?
I think it has already happened. It isn't about the inability to access a strong SEO; it is that anyone with integrity is going to lay out the obstacles they face. Time and time again we see opportunity for creativity to triumph, but the odds are really stacked against you if you are an underfunded retailer.
Just last year I helped a client with 450 domains who had been hit with Panda and then with a landing page penalty. It took a few months to sort out and get the reconsideration granted (by instituting cross-domain rel=canonical and eliminating all the duplicate content across their network). They are gradually recovering to maybe 80% of where they were before Panda 2.0, but I can't provide them an organic link building strategy that will lift 450 niche ecommerce sites. I can't tell them how they are going to get any placement in a shrinking organic SERP dominated by Google's dogfood, shopping results from big box retailers, and enormous AdWords Product Listings with images.
From that perspective, if your funding is limited, do you think you are better off attacking a market from an editorial perspective & bolting on commerce after you build momentum (rather than starting with ecommerce and then trying to bolt on editorial)?
Absolutely. Clearly the path is to have built Pinterest, but seriously...
If you are passionate about something or have a disruptive idea you will succeed (or maybe fail), but if you think you can copy what others are doing and carve out a niche based on exploits, I disagree. Of course, autoinsurancequoteeasy.com seems to be saying you can still make a ton of money in the quick-flip world, but even with a big bankroll, you need to be disruptive or innovative.
On the other hand, if you have some success in your niche you can use creativity to grow, but it has to be something new. Widget bait launched @oatmeal's online dating site but it is more likely to bury you now than help you rank #1, or at least prevent you from ranking on the matching anchor text.
When a company starts off small & editorially focused how do you know that it is time to scale up on monetization? Like if I had a successful 200 page site & wanted to add a 20,000 page database to it...would you advise against that, or how would you suggest doing that in a post-Panda world?
This is a tough call. I actually have a client in exactly this position. I guess it depends on the nature of the 20,000 pages. If you are running a niche directory (like my client), my advice to them was to add the pages to the site but noindex the individual listings until they can get some unique content. This is still likely to run afoul of the engagement issue presented by Panda, so we kept the expanded pages on geo-oriented sub-domains.
Earlier you mentioned that Panda challenged some of your assumptions. Could you describe how it changed your views on search?
I always tell prospects that 10-15 years ago my job was to trick search engines into delivering traffic but over the last 5-6 years it has evolved and now my job is to trick clients into developing content that users want. Panda just changed the definition of "good content" from relevant, well linked content to relevant, well linked, sticky content.
It has also made me more of a believer in diversifying traffic.
Last year Google made a huge stink about MSN "stealing" results because they were sniffing traffic streams and crawling queries on Google. The truth is that Google has so many data sources and so many signals to analyze that they don't need to crawl Facebook or index links on Twitter. They know where traffic is coming from and where it is going, and if you are getting traffic from social, they know it.
As Google folds more data into their mix do you worry that SEO will one day become too complex to analyze (or move the needle)? Would that push SEOs to mostly work in house at bigger companies, or would being an SEO become more akin to being a public relations & media relations expert?
I think it may already be too complex to analyze in the sense that it is almost impossible to get repeatable results for every client or tell them how much traffic they are going to achieve. On the other hand, moving the needle is still reasonably easy, as long as you are in agreement about what direction everyone is going. SEO for me is about website optimization: asking everyone about the search intent of the query that brings visitors to the site and making sure we have actions that match this intent. Most of my engagements wind up being a combination of technical SEO/problem solving, analytics, strategy, and company-wide or at least team-wide education. All of these elements are driven by keyword research and are geared towards delivering traffic, so it is an SEO-based methodology, but the requirements for the job have morphed.
As for moving in house, I have been there and I doubt I will ever go back. Likewise, I am not really a PR or media relations expert but if the client doesn't have those skills in house I strongly suggest they invest in getting them.
Ironically, many companies still fail to get the basics right. They don't empower their team, they don't leverage their real world relationships and most importantly they don't invest enough in developing high quality content. Writing sales copy is not something you should outsource to college students!
It still amazes me how hard it is to get content from clients and how often this task is delegated to whoever is at the bottom of the org chart. Changing a few words on a page can pay huge dividends but the highest paid people in the room are rarely involved enough.
In the enterprise, SEO success is largely driven by getting everyone on board. Being a successful SEO consultant (as opposed to running your own sites) is actually one quarter about being a subject matter expert on everything related to Google, one quarter about social, PR, link building, conversion, etc., and half about being a project manager. You need to get buy-in from all the stakeholders, strive to educate the whole team, and hit deliverables.
Given the increased complexity of SEO (in needing to understand user intent, fixing a variety of symptoms to dig to the core of a problem, understanding web analytics data, faster algorithm changes, etc.) is there still a sweet spot for independent consultants who do not want to get bogged down by those who won't fully take on their advice? And what are some of your best strategies for building buy in from various stakeholders at larger companies?
The key is to charge enough and to work on a monthly retainer instead of hourly. This sounds flippant but the bottom line is to balance how many engagements you can manage at one time versus how much you want to earn every month. You can't do justice to the needs of a client and bill hourly. That creates an artificial barrier between you and their team. All of my clients know I am always available to answer any SEO related question from anyone on the team at almost any time.
The increased complexity is really job security. Most of my clients are long term relationships and the ones I enjoy the most are more or less permanent partnerships. We have been very successful together and they value having me around for strategic advice, to keep them abreast of changes and to be available when changes happen. Both of the clients who got hit by Panda have been with me for more than four years.
No one can be an expert in everything. I definitely enjoy analytics and data but I have very strong partnerships with a few other agencies that I bring in when I need them. I am very happy with the work that AnalyticsPros has done for my clients. Likewise David Rodnitzky (PPC Associates) and I have partnered on a number of clients. Both allow me to be involved in the strategy and know that the execution will be very high quality. I only wish I had some link builders I felt as passionate about (given that Deborah Mastaler is always too busy to take my clients.)
You mentioned that you thought user engagement metrics were a big part of Panda based on analytics data & such...how would a person look through analytics data to uncover such trends?
I would focus on the behavioral metrics tab in GA. It is pretty normal to have a large percentage of visitors leave before 10 seconds, but after that you should see a bell curve. Low quality content will actually have 60-70% abandonment in less than 10 seconds, but the trick is that for some searches 10 seconds is a good result: weather, what is your address, hours of operation. Lots of users get what they need from searches, sometimes even from the SERP, so look for outliers. Compare different sections of your site, say the blog or those infographics, against bad page types.
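To make the bell curve idea concrete, here is a small sketch that buckets visit durations roughly the way GA's engagement report does (the bucket edges are my approximation, not GA's exact ones):

```python
# Sketch: bucket time-on-page values (seconds) into GA-style
# engagement buckets. Bucket edges approximate GA's report.
def bucket_durations(durations):
    buckets = {"0-10s": 0, "11-30s": 0, "31-60s": 0, "61s+": 0}
    for d in durations:
        if d <= 10:
            buckets["0-10s"] += 1
        elif d <= 30:
            buckets["11-30s"] += 1
        elif d <= 60:
            buckets["31-60s"] += 1
        else:
            buckets["61s+"] += 1
    return buckets

# A healthy page: a big under-10s group, then a hump in the middle
# rather than a cliff straight to zero.
print(bucket_durations([3, 8, 15, 22, 25, 40, 45, 70]))
```

If the 11-30s bucket is your second-largest segment, that is the bell-curve shape described above; if nearly everything lands in 0-10s, that page may have an engagement problem.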
It's hard to say until you get your hands on the data, but if you assume that individual pages can be weighed down by poor engagement and that this trend is maybe a year old and evolving, you can find some clues. Learn to use those advanced segments and build out meaningful segmentation on your dashboard, and you will be surprised how much of this jumps out at you. It is like over-optimization: until you believed in it you never noticed it, & now you can spot it within a few seconds of looking at a page. I won't pretend engagement issues jump out that fast, but it is possible to find them, especially if you are an in-house SEO who really knows your site.
The other important consideration is that improving engagement for any given page is a win regardless of whether it impacts your rankings or your Panda situation. The mantra about doing what is right for the users, not the search engine, may sound cliche, but the reality is that most of your decisions and priorities should be driven by giving the user what they want. I won't pretend that this is the short road to SERP dominance, but my philosophy is to target the user with 80% of your efforts and feed the engines with the other 20.
Thanks Jonah :)
Jonah Stein has 15 years of online marketing experience and is the founder of ItsTheROI, a San Francisco search engine marketing company that specializes in ROI-driven SEO and PPC initiatives. Jonah has spoken at numerous industry conferences including Search Engine Strategies, Search Marketing Expo (SMX), SMX Advanced, SIIA On Demand, the Kelsey Group's Ultimate Search Workshop and LT Pact. He also developed the Virtual Blight panels for the Web 2.0 Summit and the Web 2.0 Expo. He has written for Context Web, Search Engine Land and SEO Book.
Jonah is also the cofounder of two SaaS companies: CodeGuard.com, a cloud-based backup service that provides a time machine for websites, and Hubkick.com, an online collaboration and task management tool that provides a simple way for groups to work together, instantly.
Since it took me a few hours to put together my SMX presentation I figured it was worth sharing that information on the blog as well. This post will discuss examples of how Google has dialed up their brand bias over time & points to where Google may be headed in the future.
Note that I don't have anything against them promoting brands, I just think it is dishonest to claim they are not.
Against All Odds
When analyzing Google's big-brand bias the question is not "do some small sites manage to succeed against all odds" but…
What are the trends?
What are the biases?
Eric Schmidt once stated that "Brands are the solution, not the problem. Brands are how you sort out the cesspool. Brand affinity is clearly hard wired."
We have a fear of the unknown. Thus that which we have already experienced is seen as less risky than something new & different. This is a big part of why & how cumulative advantage works - it lowers perceived risk.
A significant portion of brand-related searches are driven by offline advertising. When a story becomes popular in the news people look online to learn more. The same sort of impact can be seen with ads - from infomercials to Superbowl ads. Geico alone spends nearly a billion dollars per year on advertising, & Warren Buffett mentioned that 3/4 of their quotes come from the internet.
Some of the most profitable business models are built off of questionable means.
Many big brands are owned by conglomerates with many horses in the race. When one gets caught doing something illegal they close it down or sell off the assets & move to promote their parallel projects more aggressively.
If things aligned with brands become relevancy signals then to some degree those measure longevity & size of a company (and their ad budget) rather than the quality of their offering.
Companies with a high page rank are in a strong position to move into new markets. By “pointing” to this new information from their existing sites they can pass on some of their existing search engine aura, guaranteeing them more prominence.
Google’s Mr Singhal calls this the problem of “brand recognition”: where companies whose standing is based on their success in one area use this to “venture out into another class of information which they may not be as rich at”. Google uses human raters to assess the quality of individual sites in order to counter this effect, he adds.
Since Panda Overstock has moved into offering ebooks & insurance quotes while companies like Barnes & Noble run affiliate listings for rugs.
As an example of the above trend gone astray, my wonderful wife recently purchased me a new computer. I was trying to figure out how to move over some user databases (like our Rank Checker & Advanced Web Ranking) and in the search results were pages like this one:
The problems with the above are:
actual legitimate reviews get pushed down by such filler
the business model behind doing such actual reviews gets eroded by the automated syndicated reviews
outside of branding & navigation the content is fully syndicated
that particular page is referencing the 2005 version of the software, so the listed price is wrong & the feature set has changed a lot in the last 7 years
Such scrape-n-mash content strategies by large brands are not uncommon. Sites like Answers.com can quickly add a coupons section, sites like FindTheBest can create 10s of millions of automated cross-referencing pages that load a massive keyword net of related keywords below the fold, news sites can create auto-generated subdomains of scraped content, etc.
Eric Schmidt highlighted FindTheBest publicly as an example of a successful vertical search play. That site was launched by an ex-Googler, but if I did the same thing you can be certain that the only way Google would highlight it publicly would be as a "type of spam."
The issue with broadly measuring user experience is that I am still going to visit Yahoo! Sports repeatedly even if my experience on Yahoo! Downloads is pretty crappy. A site which is a market leader in one niche can take those signals to launch a "me too" service in other parallel markets & quickly dominate the market.
Potential Brand Signals
When attempting to debunk the concept of "brand bias" some people claim that it would be ridiculous for Google to maintain a list of brands that get an across-the-board boost. Of course that is debunking a straw man which nobody ever actually claimed.
However, some of Google's old rater documents *did* have certain sites whitelisted & Google's Scott Huffman once wrote the following:
At a [search] quality level, we have something similar. On a continuous basis in every one of our data centers, a large set of queries are being run in the background, and we’re looking at the results, looking up our evaluations of them and making sure that all of our quality metrics are within tolerance.
These are queries that we have used as ongoing tests, sort of a sample of queries that we have scored results for; our evaluators have given scores to them. So we’re constantly running these across dozens of locales. Both broad query sets and navigational query sets, like “San Francisco bike shop” to the more mundane, like: Here’s every U.S. state and they have a home page and we better get that home page in the top results, and if we don’t … then literally somebody’s pager goes off.
(Outside of some fraternal Google properties) the algorithm isn't hardcoded to rank sites x & y at #1, but if some sites don't rank for certain queries it does cause an alert to be sent out.
Google has a wide host of quality-based metrics they could look at and analyze when determining if something gets a brand boost, gets ignored, or gets hit by an algorithm like Panda.
If you search for "fishing gear" and then click their Bass Shop refinement link in the search results, you are directly creating that search funnel relevancy "signal." Even if you don't click on that link, the exposure to the term may make you remember it and search for it later.
When some small bloggers were selling paid links to K-Mart as part of a "sponsored conversations" outreach, Matt Cutts equated the practice to selling bogus solutions to brain cancer & stated: "Those blogs are not trusted in Google's algorithms any more."
Google also started sending webmasters automated messages for bad links pointing at their sites:
Dear site owner or webmaster of domain.com, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines.
We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
So if you run a big site & they automatically detect paid links they generally just ignore those links and leave your site alone. If you are a small site & they automatically detect paid links they may decide to automatically penalize your site.
Same offense, entirely different outcome.
Is cloaking evil?
Once again, it depends on who is doing it.
I have a Vistaprint Visa card (so I could get a credit card with our dog's picture on it) and one of the pages that was ranking for Vistaprint Visa was the Singapore Groupon website.
The page forces a pop up and you can't do anything on that page (view the content, scroll around the site, etc.) other than filling in the lead generation form or logging into an existing account. I would never try that because I know I would get smoked for it. ;)
After the first iteration of the Google Panda update Google allowed users to vote to block websites. Experts Exchange was hated among some programmers in part because they used scroll cloaking. That in turn got their site hit by the second Panda update.
Smaller webmasters who ran networks of sites in some cases got hit with "doorway page" penalties for owning networks of sites registered in Google Webmaster Tools, even if each site was a full-fledged ecommerce website.
Is content farming evil?
Once again, it depends on who is doing it (and where it is hosted).
Another thing that is interesting about the content farms and the alleged need for the Panda algorithm was that in spite of flagrant editorial violations by both eHow and Mahalo, Google didn't smoke them until it could be done "algorithmically."
On the flip side of the above, in some cases Google has chosen to keep smaller webmasters penalized because of content that was on their site at some point months in the past!
A couple weeks after that aggressive promotional integration Amit Singhal stated: "The overall takeaway that I have in my mind is that people are judging a product and an overall direction that we have in the first two weeks of a launch, where we are producing a product for the long term."
The problem with building preferential rankings first & increasing quality later is that it is the exact opposite of what Google asks publishers to do with algorithms like Panda. Worse yet, Google not only does this integration when you are logged in, but also shows it on obscure longtail advanced queries when you are not logged in.
In Google's remote rater documents they suggested that hotel affiliate sites be marked as spam, even if they are helpful.
On Google's reconsideration request form they also stated: "In general, sites that directly profit from traffic (e.g. search engine optimizers, affiliate programs, etc.) may need to provide more evidence of good faith before a site will be reconsidered."
The broken piggy bank in the above cycle highlights the break that exists in the process of building a big brand. It is quite hard to have any level of certainty in the search ecosystem with an algorithm like Panda. Without that level of certainty companies must build from low cost structures, but that very constraint makes them more likely to get hit by an algorithm or search engineer.
Being an entrepreneur is all about taking smart calculated bets & managing risk. However as search engines become closed off portals that compete with (& exclude) publishers, there are so many unknowns that estimating risk is exceptionally challenging.
CustomMade is a Google-funded start up launched by an SEO who purchased an old website & created a vertical directory out of it (just like TeachStreet was trying to do, but in a different vertical). Googlers helped with the project & the article highlighting that shared this quote: "Having Google as an investor gives you a branding piece that you can't ignore." - Christopher Ahlberg.
Penalties: How Hard Were They Hit?
Years ago when BMW or Wordpress.org got caught spamming aggressively they were back in good graces in a matter of days.
About the only times well known (non-affiliate) sites have been penalized for a significant duration was when JC Penney & Overstock.com were hit. But that happened around the time of the Panda fiasco & Google had incentive to show who was boss. When the flower sites were outed for massive link buying that was ignored because Google had already rolled out Panda & reasserted the perception of their brand.
In 2009 Google banned over 30,000 affiliates from the AdWords auction. In some cases the problem was not with a current ad (or even a landing page the advertiser controlled), but rather with ads that ran years ago promoting 3rd party products. In some cases Google changed their AdWords TOS after the fact, applying the new rules retroactively. Google won't allow some of these advertisers to advertise unless they fix the landing page, but if they don't control the landing page they can't ever fix the problem. Making things worse, to this day Google still suggests affiliates do direct linking. But if the company they promote gets bought out by someone too aggressive then that affiliate could be waiting for a lifetime ban through no fault of their own.
In Australia a small travel site had a similar issue with AdSense. The only way they were able to get a reconsideration was to lodge a formal complaint with regulators. If that is how Google treats their business partners, it colors how they view non-business partners who monetize traffic without giving Google a taste of the revenues.
Why Does Google Lean Into Brand?
Minimize legal risks: if they hit small businesses almost nobody will see/notice/care, but big businesses are flush with cash and political connections. When Google hits big businesses they create organizations & movements like Fair Search & Search Neutrality.
Minimize duplication: some small businesses & affiliates simply repeat offers that exist on larger merchant sites. That said, many big businesses buy out a 2nd, 3rd, 4th, or even 5th site in a vertical to have multiple placements in the search results.
Better user experience: the theory is that the larger sites have more data and capital to improve user experience, but they don't always do it.
Business partnerships: if Google wants to strike up closed door business partnerships with big business then some of those negotiations will have specific terms attached to them. It costs Google nothing to give away part of the organic results as part of some custom deals. If Google wants to sell TV ads & run a media streaming device they need to play well with brands.
CPA-based product ads: on some searches Google provides CPA-based product ads above the search results. It makes sense for Google to promote those who are buying their ads to get the best relationships possible.
Fewer people tasting the revenues: the fewer organizations an ecosystem needs to support the more of the profits from that ecosystem that can be kept by the manager.
More complete ad cycle: if Google caters to direct response advertisers they get to monetize the fulfillment of demand, but that is only a small slice of the complete ad cycle. If Google caters to brands they get to monetize (directly or indirectly) every piece of the ad cycle. For example, buying display ads helps build brand searches, which helps create brand signals. In such a way, improved rankings in the organic results subsidize ad buying.
Brands buying their equity: Google has created exceptionally large ad units & has convinced many brands to buy their own pre-existing brand equity.
Lack of Diversity
The big issue with brand bias is that a lot of the same *types* of companies rank with roughly similar consumer experiences. If there is a mix of large and small businesses that rank then many of those small businesses will be able to differentiate their offering by adding services to their products, doing in-depth reviews, and so on.
Sure Zappos is a big company known for customer service, but how different is the consumer-facing experience if I click on Target.com or Walmart.com? Sure the text on the page may be slightly different, but is there any real difference beyond aesthetic? Further, a lot of the business models built around strong in-depth editorial reviews & comparisons are eroded by the current algorithms. If the consumer reviews are not good enough, then tough luck!
Do Brands Always Provide a Better User Experience?
For decades, Target has collected vast amounts of data on every person who regularly walks into one of its stores. Whenever possible, Target assigns each shopper a unique code — known internally as the Guest ID number — that keeps tabs on everything they buy. "If you use a credit card or a coupon, or fill out a survey, or mail in a refund, or call the customer help line, or open an e-mail we've sent you or visit our Web site, we'll record it and link it to your Guest ID," Pole said. "We want to know everything we can."
Many big media companies provide watered-down versions of their content online because they don't want to cannibalize their offline channels. Likewise some large stores may consider their website an afterthought. When I wanted to order my wife a specific shoe directly from the brand, they didn't have customer support open for extended hours during the holidays and their shopping cart kept kicking an error. Since they *are* the brand, that brand strength allows them to get away with other issues that need fixing.
Some of those same sites carry huge AdSense ad blocks on the category pages & have funky technical issues which act like doorway pages & force users who are using any browser to go through their homepage if they land on a deep page.
Missing the Target indeed.
That above "screw you" redirect error has been going on literally for weeks now, with Target's webmaster asleep at the wheel. Perhaps they want you to navigate their site by internal search so they can track every character you type.
Riding The Waves
With SEO many aggressive techniques work for a period of time & then suddenly stop working. Every so often there are major changes like the Florida update & the Panda update, but in between these there are other smaller algorithmic updates that aim to fill in the holes until a big change comes about.
No matter what Google promotes, they will always have some gaps & relevancy issues. Some businesses that "ignore the algorithms and focus on the user" are likely to run on thinner margins than those who understand where the algorithms are headed. Those thin margins can quickly turn negative if either Google enters your niche or top competitors keep reinvesting in growth to buy more marketshare.
Given the above pattern - where trends spread until they get hit hard - those who quickly figure out where the algorithms are going & where there are opportunities have plenty of time to monetize their efforts. Whereas if you have to wait until things are widely spread on SEO blogs as common "tricks of the trade" or wait until a Google engineer explicitly confirms something then you are likely only going to be adopting techniques and strategies after most of the profit potential is sucked out of them, just before the goal posts move yet again.
People who cloned some of the most profitable eHow articles years ago had plenty of time to profit before the content farm business model got hit. Those who waited until Demand Media spelled their business model out in a Wired article had about 1.5 years until the hammer. Those who waited until the content farm controversy started creating a public relations issue to clone the model may have only had a couple months of enhanced revenues before their site got hit & was worse off than before they chased the algorithm late in the game.
Ride The Brand
If Google does over-represent established branded websites in their algorithms then in many cases it will be far easier to rank a Facebook notes page or a YouTube video than to try to rank a new site from scratch. There are a ton of web 2.0 sites driven by user generated content.
In addition to those sorts of sites, also consider participating in industry websites in your niche & buying presell pages on sites that rank especially well.
Collecting (& Abusing) User Data
Google has been repeatedly branded as being a bit creepy for their interest in user tracking.
Collecting that data & using it for ad targeting can have profound personal implications (think of serving a girl with anorexia weight-loss ads everywhere she goes online simply because she once clicked such an ad; in that case Google reinforces a warped worldview). Then when the person needs counseling Google can recommend a service provider there as well. ;)
Throughout the history of the web there will be many cycles between open and closed ecosystems. Currently we are cycling toward closed silos (Apple, Amazon, Google, Facebook). As these silos become more closed off they will end up leaving gaps that create new opportunities.
While on one front Google keeps making it easier for brands to compete against non-brands, Google also keeps clawing back a bigger slice of that branded traffic through larger AdWords ad units & integration of listings from services like Google+, which can in some cases outrank the actual brand.
Google has multiple platforms (Android Marketplace, Chrome Marketplace, Enterprise Marketplace) competing against iTunes. Google recently decided to merge some of their offerings into Google Play. In addition to games, music & books, Play will soon include audiobooks, magazines & other content formats.
Having a brand & following will still be important for allowing premium rates, fatter margins, building non-search distribution (which can be used to influence the "relevancy" signals), and to help overturn manual editorial interventions. But algorithmically brand emphasis will peak in the next year or two as Google comes to appreciate that they have excessively consolidated some markets and made it too hard for themselves to break into those markets. (Recall how Google came up with their QDF algorithm only *after* Google Finance wasn't able to rank). At that point in time Google will push their own verticals more aggressively & launch some aggressive public relations campaigns about helping small businesses succeed online.
Once Google is the merchant of record, almost everyone is just an affiliate, especially in digital marketplaces with digital delivery.
At SMX I gave a presentation on brand & how Google has biased the algorithms toward brands. Having already seen the bulk of my argument months prior, Bryson Meunier spoke after me and put together a presentation that used bogus statistics & was basically a smear of me. He was so over the top with his obnoxious behavior that when Danny Sullivan introduced the next speaker he jokingly said "up next, Ron Paul."
I honestly thought the point of the discussion was to highlight how Google has (or hasn't) biased the algorithms, editorial policies & search interface toward brands. However, if a person speaks after you and uses bogus statistics to reach junk conclusions, you can't debunk their aggregate information until after you have looked into it some. An honest person puts what they know out there & shares it publicly in advance; a dishonest person hides behind junk research and the label of science to ram through poorly thought out trash, collecting whatever "data" confirms their own bias while ignoring the pieces of reality that don't.
As an example, he suggested that based on the number of employees and revenues Wikipedia is a small business. He then went on to say that since Wikipedia wasn't on Interbrand's "scientific" study that they were not a top brand. Nevermind that no countries, religions, sports, celebrities, or non-profits make the list of top "companies."
After IAC figured out that they were able to get away with running Ask.com as a thin scraper site, they outsourced "the algorithm" and fired many of their employees. Because they have fewer employees, Bryson considers Ask "a mid-sized business" even though they are part of a multi-billion-dollar company and IAC is Google's #1 advertiser!
According to Compete's downstream traffic stats, YouTube receives about 1 in 13 search clicks from Google, but since it wasn't on Interbrand's list "who cares?" Incidentally, the folks at Interbrand do have a mention of YouTube on their top 100 brands page, but it was a suggestion that you watch their videos on YouTube. Their methodology is so suspect that Goldman Sachs and Yahoo! made the cut while YouTube didn't, even though YouTube is one of their few offsite promotional channels they promote on that very page. Their list also puts Microsoft's brand value at about double Apple's (and the list came out when Steve Jobs was still alive).
Bryson also claimed that since big brands are inefficient and slow moving they already have a big disadvantage, so it makes sense for search engines to compensate for that. That is at best an illegitimate line of reasoning because those companies have plenty of solutions available to them & have the capital needed to buy out competitors. Even when the SERPs look independent, a lot of the listed sites are owned by large conglomerates. As an example, here is a random search from earlier today:
Meanwhile the same idiotic logic ignores the lack of resources at small businesses. Nowhere in his presentation was a highlight of how Google favored affiliates & direct marketers until the profit margins of the direct response marketing model started to peak & then Google transitioned to promoting brands, as they wanted to keep increasing revenues and monetize more clicks.
Bryson also shared an example of where he got a photo sharing site 40,000 unique visitors a month as a case study of the power of white hat SEO. 40,000 monthly visits to a photo sharing site might fund a light Starbucks addiction (assuming you value your time at nothing, have no employees, ignore hosting costs and the SEO is free), but not much beyond that. If that is a success case study, that shows how much harder the ecosystem is getting to operate in as a small business.
He also put out a painfully fluffy "white paper" / sales letter which stated that since Wal-Mart has a page about SEO they should outrank seobook on "SEO" related queries if my theories of brand bias are correct. That misses the point entirely. I never stated that garbage content on branded sites always outperforms quality content on niche sites, but rather that a lot of smaller websites were intentionally being squeezed out of the ecosystem. Sure some small sites manage to compete, but the odds of them succeeding today are much lower than they were 3 or 4 years ago.
At SMX near the end of our session a question was asked about the audience composition & most people were either big brands or people working for big brands. If you go back to when I first got into SEO in 2003 the audience composition was almost entirely small publishers and independent SEOs. This squeezing out of small players is not something new to search or the web. If you look at the history of any modern communications network this cycle has repeated itself in every single medium - phone, radio, television, and the web.
To be fair, I can understand why a no-name also-ran SEO consultant would want to pitch himself as being up for doing SEO work for large brands. Brands generally have fatter margins, economies of scale, and large budgets. As Google tilts the algorithm toward the big brands (to where they can fall over the finish line in first place) they are the best clients to work for, since you are swimming downstream.
Why push huge boulders up the side of the mountain for crumbs when you can get paid far more to blow on a snowflake at the top of the mountain?
That is why so many SEOs fawn over trying to get brand clients. The work is high-paying, low risk, and relatively easy.
If we were ever to close up our membership site & focus primarily on SEO consulting work in more structured arrangements then absolutely we would aim at brands & help them fall over the finish line in first place. ;)
Back when I worked with Clientside SEM we did a good number of big brand projects with some of the largest online portals & retailers. Understanding the business objectives & communicating things in a way that builds buy in from other departments is of course challenging. You need simplicity & directness without oversimplifying. But (if you work for great clients - like we did), then that is nowhere near as challenging as building a site from scratch into something that can compete for lucrative keywords. I recently stepped back from the client consulting model for a bit simply because I was pulling myself in too many directions & working too long, but Scott is still flourishing & delivering excellent results for clients.
I have nothing against the concept of branding (think of how many years I slaved building up this site & the capital I have poured into it), but I like to share the trends in the ecosystem as they are, rather than as a hack warping my view to try to pick up consulting clients. Our site would likely make far more income if we kept using the words "enterprise" "brand" "fortune 500" and then sold consulting to that target audience. In fact, a large % of our members here are fortune 500s, conglomerates, newspaper chains, magazine publishers, and so on.
It is not that brand counts for nothing (or that it should count for nothing) but anyone who claims the table isn't tilted is either ignorant, a liar, or both.
Truth has to count for something.
Disclaimer: I am not saying enterprise SEO is always easy (there are real challenges, especially with internal politics that add arbitrary constraints). And I am not saying that everyone who targets the enterprise market is a hack (there are some super talented folks out there). But the challenge of being a profitable small webmaster is much more of a struggle than ranking a site that Google is intentionally biasing their algorithms toward promoting.
Disclaimer2: I realize refuting a douchebag like Bryson Meunier is batting below my league, however as a matter of principle I won't let sleazeballs get away with taking a swipe using junk science. The word science deserves better than that.
Google Analytics is one of the most powerful tools for any SEO, assuming you know how to get the data you need from it. One of my favorite things about Google Analytics is how many tools it puts at your disposal for quickly analyzing the data you care most about. But again, that all assumes you know how to get it.
A custom report in Google Analytics is similar to their custom dashboard feature in a lot of ways. Remember, though: dashboards are meant as snapshots of what's going on with your campaign, while custom reports are what you should be using to fully analyze the results.
To start, you should consider setting up Custom Report categories to organize your reports by subject. You will find this to be the most aggravating/irritating/infuriating part of the process as you attempt to drag your first custom report into your new category folder. The secret is to drag your report slightly to the right while hovering over the category you want to place it in. Then let go and hope for the best. Once you have one report in there it gets much easier.
Creating a Custom Report
There are two key components to a custom report:
Metric: a numeric measurement (like number of visits).
Dimension: a description of visits, visitors, pages, products and events.
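The metric/dimension split is easy to picture as rows of data. Here's a rough illustration in plain Python (the records and field names are hypothetical, not the Analytics API): each report row aggregates a numeric metric across one or more descriptive dimensions.

```python
# Hypothetical visit records: dimensions are descriptive attributes
# ("city", "medium"); metrics are numeric measurements ("visits").
visits = [
    {"city": "Austin", "medium": "organic", "visits": 120},
    {"city": "Austin", "medium": "cpc", "visits": 40},
    {"city": "Boston", "medium": "organic", "visits": 75},
]

def report(rows, dimension, metric):
    """Aggregate one metric by one dimension, like a single-level custom report."""
    table = {}
    for row in rows:
        key = row[dimension]
        table[key] = table.get(key, 0) + row[metric]
    return table

print(report(visits, "city", "visits"))    # {'Austin': 160, 'Boston': 75}
print(report(visits, "medium", "visits"))  # {'organic': 195, 'cpc': 40}
```

An Explorer report is essentially this aggregation applied repeatedly as you drill from one dimension down into a sub-dimension, while a Flat Table shows several dimensions side by side in one pass.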
There are also two types of Custom Reports you can create:
Explorer: Allows you to drill down into sub-dimensions and includes a timeline where you can compare metrics in the same graph.
Table: Allows you to compare dimensions side by side, with metrics also populated within the table. There is no timeline in this report.
Creating the custom report is easy. You choose from a drop-down menu of metrics and dimensions that you're interested in segmenting your report by.
You can also create tabs in your report to keep it organized. Any filters you set up on one tab will automatically apply to any other tab you set up (there isn't a way to turn them off for the other tabs).
Another great feature of custom reports is your ability to use them cross-profile and to share them. To share a report, all you need to do is click the Actions drop-down menu from the Custom Reports overview page, and click share. You will then be able to share the configuration (not the data) of the custom report you just created.
SEO Custom Report Examples
If you'd like to save time in your SEO analysis, consider creating custom reports similar to the ones outlined below. I've included the share link for each custom report so you don't have to rebuild it yourself. I tried to mix up when I'd tailor the report to look at e-commerce data, and when it would only look at goal data. You'll need to customize those aspects of the report to best meet your needs.
Also, don't forget to modify the keyword filters I've added. You want to make sure to replace our branded keyword (book) with your own.
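If you want to sanity-check a branded-keyword filter outside the UI, the same idea is a simple regular-expression exclude over your keyword list. A minimal sketch, assuming an exported keyword list (the keywords below are made up; "book" stands in for your branded term, as in the filters above):

```python
import re

# Hypothetical keyword list exported from a report.
keywords = ["seo book", "link building tips", "book review", "keyword research"]

# \b word boundaries so "book" matches as a whole word, mirroring an
# "Exclude: Keyword matching regex" filter in the report.
brand = re.compile(r"\bbook\b")

non_branded = [kw for kw in keywords if not brand.search(kw)]
print(non_branded)  # ['link building tips', 'keyword research']
```

Swapping the pattern for your own brand term (including common misspellings, e.g. `r"\b(book|bok)\b"`) gives you the non-branded slice of your organic traffic.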
Audience Custom Report
Understanding your audience's demographics is an often overlooked SEO practice, but it can go a long way in making certain aspects of SEO (like link building) that much easier.
There are two components to this custom report:
City and Language Overview - this part of the report looks at which cities and languages you receive the most visits from and make the most money off of. You may be surprised to see very profitable languages your site isn't even translated into yet.
Keyword Targeting - this part of the report lets you drill all the way down to the keywords used by each country and language demographic, and calls out how profitable they are for you. This is a great way to refine your keyword targeting.
On the link building front, this can help you see which foreign languages your blog/linkbait content is most popular in so you can translate it. You could then distribute the translated content to popular industry blogs in that language in exchange for links.
Content Custom Report
The purpose of the Content Custom Report is to identify which content is performing the best with organic traffic. I've set this report up as an Explorer Custom Report so you can drill down and see which keywords are sending traffic to a specific Landing Page. This is a great way to make sure you're targeting the right keywords on the right pages in your SEO campaign.
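Conceptually, the Explorer drill-down is just grouping keyword rows by landing page and then inspecting one page's keywords. A rough sketch with hypothetical landing pages, keywords, and visit counts:

```python
from collections import defaultdict

# Hypothetical (landing_page, keyword, visits) rows from an organic report.
rows = [
    ("/sci-fi", "space opera novels", 300),
    ("/sci-fi", "best sci-fi books", 120),
    ("/mystery", "whodunit paperbacks", 90),
]

# Group keyword rows under each landing page.
by_page = defaultdict(list)
for page, keyword, visits in rows:
    by_page[page].append((keyword, visits))

# "Drill down" into one landing page, highest-traffic keywords first.
print(sorted(by_page["/sci-fi"], key=lambda kv: -kv[1]))
```

The report does this for you interactively; the sketch just shows the shape of the data you're navigating.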
I have this report looking at a number of engagement metrics. One in particular I think is important to include is the Social Actions metric. This is a great way to see if the number of social actions correlates with increases in traffic and conversions.
You might consider adding an additional filter (or creating a new custom report) that only looks at your blog content. I'd keep similar metrics in the report so you can quickly identify which blog posts perform the best and try to duplicate the results in future content. You may also want to add any event goals you've created to the report, especially if you've set up an event to track comments on your posts.
I think this is one of the most valuable custom reports you can run, and it's one of the bigger custom reports that I like to create in my accounts. There are three components to the report: targeting, engagement and revenue.
This part of the report is pretty straightforward. It's a Flat Table report that places the Page Title and the Keyword that is sending it traffic side by side. From there I've added a handful of metrics to determine if I am targeting the right keyword on the right page. Perhaps I'm getting a lot of traffic for a particular keyword, but the majority of people are going elsewhere and/or not converting. This may lead me to do some testing around changing which page I'm optimizing for that keyword.
Similar to the Content Custom Report, this component focuses on how engaged visitors are when they arrive at the site via a specific keyword. I love traffic just as much as the next guy, but if that traffic isn't doing anything on my site - what good is it? This report will help you identify problems and opportunities for keywords with low/high engagement rates.
Just how much money is a keyword making you? This component of the report looks at the number of transactions, the revenue generated and the per visit value of organic traffic for each keyword.
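The per visit value metric in this component is simply revenue divided by visits. A minimal sketch with hypothetical keyword rows (the keywords and numbers are invented for illustration):

```python
def per_visit_value(revenue, visits):
    """Per visit value = total revenue / total visits (0 if no visits yet)."""
    return revenue / visits if visits else 0.0

# Hypothetical rows: (keyword, visits, transactions, revenue)
rows = [
    ("used books", 1200, 30, 1500.00),
    ("rare first editions", 80, 8, 2400.00),
]

for keyword, visits, transactions, revenue in rows:
    print(keyword, round(per_visit_value(revenue, visits), 2))
```

Note how the low-traffic keyword can be worth far more per visit than the high-traffic one, which is exactly the kind of opportunity this report surfaces.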
Which of the inbound links that you've built are sending you the most quality traffic? Don't forget, there's much more to links than rankings: they're also opportunities to send high-quality traffic to your site that may even convert.
This custom report looks at which of your referrals are sending you the most engaged traffic. Knowing which links are sending you the most quality traffic will help you determine whether you should be going back for more, or whether you can find other similar sites to get links on.
I'm a big fan of using paid search as a way to test which landing pages you want to target your keywords on for relevance. The goal of the test is to determine if you were to target a specific keyword on that page, would the visitor find what they are looking for and convert? This is a great way to minimize the risk of focusing on the wrong keyword on the wrong page and investing months of SEO work to get it traffic.
You can use this custom report to look at just that: which keyword/landing page combinations are the most effective from a revenue perspective. Even if you don't run a test like the one I just described, you can still get a pretty good grasp on this just by pulling the report and looking for these opportunities.
PPC Keywords Custom Report
Continuing with our holistic custom reports, the goal of the PPC Keywords Custom Report is simple: identify high-performing keywords from your paid search campaigns that you could consider targeting in your SEO campaign.
The report calls out a couple of qualifying metrics, including how much bidding on the keyword is costing you and what your cost per conversion is. This is a great way to decide whether, if you can't afford to keep targeting a keyword via PPC, you can make up the lost traffic via SEO.
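For reference, cost per conversion is just total spend divided by conversions. A quick sketch with hypothetical paid-search rows:

```python
def cost_per_conversion(cost, conversions):
    """Cost per conversion = total spend / conversions (None if no conversions yet)."""
    return cost / conversions if conversions else None

# Hypothetical rows: (keyword, cost, conversions)
keywords = [
    ("rare books online", 500.00, 25),  # $20.00 per conversion
    ("antique maps", 120.00, 0),        # spend with no conversions yet
]

for keyword, cost, conversions in keywords:
    print(keyword, cost_per_conversion(cost, conversions))
```

A keyword like the second one, costing money without converting, is exactly the kind of candidate to shift from paid bidding to organic targeting.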
We've seen the influence social media has on SEO, and now it's time to make sure we're well-informed of any social media data that can be leveraged to improve our campaigns.
This report uses a filter created by Site Visibility to look at all referring traffic from a variety of top social sources. With this filter applied you can look at which social traffic is most engaged with your content.
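A simplified stand-in for that kind of source filter is a regex over the referring domain. The networks listed here are just examples; Site Visibility's actual filter covers many more sources:

```python
import re

# Example social referrer pattern -- the real filter is far more exhaustive.
SOCIAL_SOURCES = re.compile(
    r"(twitter\.com|t\.co|facebook\.com|linkedin\.com|reddit\.com)",
    re.IGNORECASE,
)

def is_social(source):
    """True if a referring source would pass the social-traffic filter."""
    return bool(SOCIAL_SOURCES.search(source))

# Hypothetical referrer check: social sources pass, search engines don't.
print([s for s in ["facebook.com", "google", "t.co"] if is_social(s)])
```

With a filter like this applied to a report, only social referrals remain, so every engagement metric you look at describes social traffic specifically.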
If you're tracking social actions, you can quickly see which of the content you've created is being shared the most, so you can figure out what people like about it and duplicate the results.
I also like to see which social network is converting the best, so I can determine whether we should increase our participation efforts there, or even start experimenting with advertising on that network.