In an SES panel yesterday Matt Cutts claimed that paid links pollute the web, while advocating off-topic link bait as a useful search marketing strategy. Michael Gray and Greg Boser are a bit more honest:
Link baiting, what Google suggests as a link building strategy, is as egregious if not worse for relevancy than paid links - viral content of such an off-topic nature should not help your rankings and is more "polluting" than relevant paid links.
Linkbaiting is Expensive, Time Consuming, and Unpredictable
The reasons search engineers advocate link baiting are:
it is expensive
it is time consuming
the results are hard to predict
it requires social connections
it provides off topic low value traffic
it typically creates content of limited commercial value (other than the ability to pull in links to rank other pages for stuff they did not have enough relevancy or authority to merit ranking for)
the valuable results can take a while to show
it often undermines the credibility of the source doing it (by allowing people to think of information from certain sources as link bait, which is a derogatory classification term)
many companies have restrictions that prevent them from doing it
Because of the above reasons, link baiting is outside the reach of most webmasters. It is highly unpredictable, time consuming, and expensive, and few people can do it - which, of course, is why it is the only way search engineers recommend you build links. They might even like you to believe that almost all links are acquired that way. The more brutally tough it is to build your SEO strategy, the more appealing AdWords ads look.
Shopping Search? Try AdWords!!!
If you can't buy links to rank, then some irrelevant old sites and marginally relevant articles on authoritative domains (that typically gained their link based authority before Google polluted the link graph with AdSense and NoFollow) get to clog up the organic search results, and the only way people can find commercially relevant results is by looking at Google's AdWords ads.
A mainstream media magazine did a spread on one of my friend's websites, where my friend gave them virtually all the content for the article, and they refused to link to my friend's site in the article because they felt it would be too promotional. Sorry, you already sent out 100,000 magazines with the article in it. You already were too promotional. Sadly, that is just one more example of the death of organic links caused by Google's fearmongering.
If I have a blind bid that is too high, would it tell me to lower that bid? Nope. A search marketing campaign is only considered properly optimized if it sends more money to Google, which is the problem with the field of SEO: Google doesn't get a cut of the action, so the organic results have yet to be properly optimized.
Why Waste a Breath Scaring People Unless the Intent Is to Lie or Deceive?
Matt also says that it's very difficult to buy paid links effectively as a business or as a search marketer because Google does such a good job detecting and eliminating the value of those links.
How often do you hear Matt Cutts droning on about duplicate page titles or stuffing your meta keywords tag? You don't, because they are no longer effective.
Google would not be trying to brainwash webmasters about links so often if paid links didn't work. The problem with paid links is they work too well.
Who is Getting Paid?
To properly understand search marketing you have to understand that the fight over search spam has NOTHING to do with result relevancy. The label of spam is only applied if the wrong company gets paid.
Wikipedia can cross link just about everything and look legitimate doing it because they are a non-profit. Independent webmasters have to be more focused if they are trying to create profitable websites. Navigation can be nearly useless and spammy looking, or with a few minor tweaks it can look legitimate and well categorized. Compare the following two examples:
Seen On a Farm
The first navigational scheme is something you might see on a typical AdSense website. Each page is not connected to any of the others by any trait other than carrying AdSense ads.
The second navigational structure looks less spammy and more useful. In addition to looking more credible and being easier to use, it also has headings focused on relevant keywords, which can link to related category pages. This allows the site to focus link weight on core topical phrases and pick up on mid tier keywords not covered by a more haphazard navigational scheme that uses generic words unrelated to the way searchers search.
If you think ahead when planning out your navigation it also makes site expansion a breeze. For example, if you later add turkeys to the farm animals category it can be grouped with chicken under a poultry category.
Good internal navigation should be logical, easy to follow, and reflect your keyword theme.
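As a rough sketch of the idea (the categories, keywords, and URL paths below are hypothetical, not from any real site), a keyword-themed navigation can be modeled as a simple tree, which makes expansions like grouping turkeys with chickens under poultry trivial:

```python
# A minimal sketch of a keyword-themed category tree.
# All category and page names here are invented examples.
site_nav = {
    "farm-animals": {
        "poultry": ["chickens", "turkeys"],  # turkeys slot in beside chickens
        "cattle": ["dairy-cows", "beef-cattle"],
    },
    "farm-equipment": {
        "tractors": ["compact-tractors", "utility-tractors"],
    },
}

def category_paths(tree, prefix=""):
    """Flatten the tree into URL-style paths, one per category or page."""
    paths = []
    for name, children in tree.items():
        path = f"{prefix}/{name}"
        paths.append(path)
        if isinstance(children, dict):
            paths.extend(category_paths(children, path))
        else:
            paths.extend(f"{path}/{page}" for page in children)
    return paths

for p in category_paths(site_nav):
    print(p)
```

Each category page in the flattened list becomes a place to focus link weight on a core topical phrase, and new pages simply attach to the node they belong under.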
The types of link buys that Google has a distaste for are the links that are exchanged directly for cash. Modify your way of thinking just a little and there are a wide array of easy to buy high value links awaiting your purchase. The key to having a low risk profile is to make the link appear indirect.
Most links occur because of a value exchange of some sort. People link because
they find a resource to be valuable
they get paid directly for linking
they get paid indirectly for linking
Here are 18 indirect ways to buy links without looking like you are on a link buying binge.
Guest Blogging: Have a lot to share but little budget for exposure? Consider saving some of your best content for other websites that have the attention of your target market & offer to guest post for them. If you are looking for more general exposure and can't get onto the A list websites start by submitting to some of the B & C list sites that accept guest posts and work your way up. Services like MyBlogGuest make it easy to find relevant opportunities.
Create other featured resource content & promote it to those who link at quality resources. Internet Marketing Ninjas is great at this type of content creation & promotion.
Testimonials: Best thing ever. Buy now! ;)
Testimonials help increase sales because they are a sign of social trust. Many content management systems, web designers, programmers, and web hosts offer links to featured clients. Some keep full directories of sites using their services, while other sites, such as Pligg, also allow people using their software to buy an ad on the official software site.
Association Memberships: Trade organizations tend to have significant global authority and topical authority. In order to push the agenda of the organization, many of these groups list their members to show proof of social value. These links are often priced far below their value, and contributing directly to associations is a way to also get significant exposure in front of the type of people who are likely to buy from you and/or link at your site.
Contests: People are competitive animals. Contests like the Mahalo Follow refer a friend program also move the spamming activity away from the source and onto other people, thus allowing the central sites to profit from spamming without being called spammers.
Awards: Even if winning an award has absolutely no value people still like recognition. Winners like to talk about what they have won. In some cases you can even give award winners your product to get them to talk about it.
Donations: Support causes you believe in. Money is the fuel upon which charities can fund themselves and spread their messages. It is hard to call you a spammer for donating money to a good cause. If you get a bit of link equity out of it as a bonus, why not enjoy the benefits of good karma? Better yet, you might be able to donate software or services to charities at little to no expense to you. How much is an "SEO services by your company" link on a PR8 charity site worth in branding and distribution?
Free Samples: This acts similar to donations, except it is easier to spread to a wider audience without appearing spammy, and if people like what you offer they may review it on their sites.
Widgets: Many embeddable tools (like analytics products, "what is my PageRank" tools, etc.) provide static links back to the original source site. Some companies also provide emblems showing that their site is hosted on a green host or that they support some other cause.
Sponsorships: Many email newsletters are archived online. If you target a compelling offer to the right audience this may lead to additional links. Services like ReviewMe also allow you to put targeted offers in front of audiences who may help spread the word.
Affiliate Programs: Even if affiliate links do not provide direct link juice, good affiliates still send a relevant stream of traffic to your site. Some affiliate programs also 301 redirect the affiliate links to the end merchant site. Affiliate programs allow clean companies to profit from the dirty parts of the web (think AdSense or Mahalo Follow).
Social Media: Partner with someone who enjoys writing junk for sites like Digg. If you are too lazy for that, StumbleUpon ads allow you to target ads to specific groups on StumbleUpon, and there are a number of Digg spamming services on the market. Here are some tips for link baiting.
Google AdWords or Other Ad Buys: You can buy ads and send targeted traffic streams to your linkworthy content. You can do it one keyword at a time, or target ads to specific websites. In some cases businesses get organic links just because people are talking about how often they see their ads, plus top of mind awareness leads to more usage and more links.
Link Out to Egomaniac Bloggers: This is a way of buying links by paying with your attention and distribution. People like getting mentioned, and are more likely to link to people who agree with them. Seth Godin linked to my blog again a few weeks ago and when I saw he mentioned my site (even if only in passing) for some reason that made me happy. Insightful blog comments are also likely to make a blogger want to talk about you.
Blog Carnivals: Blog carnivals are where a group of bloggers all talk about a topic and mention everyone else in the ring. These amount to a big circlejerk. If your site is legit and a market leader there is no need for this sort of stuff, but if your site is new in a saturated field doing this might be helpful. Plus others in the blog carnival may end up adding your site to their blogroll or talking about you again on their blog.
Press Releases: Do it too often and it looks cheesy, but some mainstream media outlets like CNN syndicate press releases, while others may choose to interview you based on your press release.
Hire Them / Buy Their Brand & Site: If someone already has a large following but is not monetizing it to the full potential consider hiring them and letting them help you build a more profitable business. You can also look for under-performing sites to buy. If someone is outside of your financial reach you may still be able to leverage their brand by interviewing them.
Of course, not all proxies are being run by innocent people for innocent reasons. Some of them are actually designed to hijack content - to deliver ads, etc. Some people want to steal your content, and they want the search engines to index it. In fact, I would not be surprised if a large part of the overall problem isn't caused by such people firing links at their own proxies.
I have seen numerous sites die from proxy hacking, and this is an issue Google has known about for over a year.
Your name cannot be stripped, and no one else can claim credit for it. That is credit; reputation is a non-renewable resource. It cannot be replicated. It cannot be copied. To the degree that someone takes credit for your work, that is the degree to which you lose credit. It is always proportional.
When Google goes so far as trying to police link exchange and link buying why don't they do a better job policing AdSense? If they want to clean up their search index the easiest, most scalable, and most robust way to do so would be for them to worry about their own network, and stop paying content thieves via AdSense.
One of the comments on the article I wrote for Wordtracker mentioned WordsFinder, which allows you to create a list of keywords from a piece of content. Their tool uses the Yahoo! Term Extraction Tool, and also provides a few additional keywords next to the results. Three other easy ways to get similar information are
Enter a URL into the Google AdWords keyword suggestion tool. Note this tool has two options, one for grabbing keywords for a page, and one for grabbing keywords for the page and other pages that the page links at.
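As a hedged, minimal stand-in for this kind of term extraction (the stopword list and sample text below are made up, and real tools like the Yahoo! Term Extraction API are far more sophisticated), you can rank the non-stopword terms in a piece of content by frequency:

```python
import re
from collections import Counter

# A tiny illustrative stopword list; a real tool uses a much larger one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to",
             "in", "is", "are", "for", "on"}

def extract_keywords(text, top_n=5):
    """Very rough term extraction: rank non-stopword words by frequency."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

sample = ("Link building and link baiting are common search marketing "
          "strategies. Link building takes time.")
print(extract_keywords(sample))
```

This is only a frequency count; the dedicated tools also weight phrases and filter by term significance, but the sketch shows the general shape of turning a page into a keyword list.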
I recently changed one of my robots.txt files, pruning duplicate content pages to help more of the internal PageRank flow to the higher quality and better earning pages. In the process, I forgot that one of the most well linked to pages on the site had a URL similar to the noisy pages. About a week ago the site's search traffic halved (right after Google was unable to crawl and index the powerful URL). I fixed the error pretty quickly, but the site now has hundreds of pages stuck in Google's supplemental index, and I am out about $10,000 in profit for that one line of code! Both Google and Yahoo support wildcards, but you really have to be careful when changing a robots.txt file, because a single wildcard directive can also block similarly named files you never intended to keep out of Google's index. Unless you are thinking of that in advance it is easy to make a mistake.
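As a hypothetical illustration of the pitfall (these paths are invented, not the actual URLs involved), a wildcard rule meant to prune noisy duplicates can also match a valuable page that shares the same prefix:

```
User-agent: *
# Intended to block noisy sort-order duplicates like /widgets-sort-price
Disallow: /widgets-sort*

# ...but the same rule also blocks a well linked page such as
# /widgets-sortable-guide.html, because it matches the /widgets-sort prefix
```

Before deploying a wildcard directive, it is worth grepping your URL list for every path the pattern would match.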
If you are trying to prune duplicate content for Google and are fine with it ranking in other search engines, you may want to make those directives specific for GoogleBot. If you make a directive for a specific robot, that bot will ignore your general robots directives in favor of following the more specific directives you created for it.
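A minimal sketch of what that might look like (the blocked path is hypothetical):

```
# General rule: all other bots may crawl everything
User-agent: *
Disallow:

# GoogleBot ignores the section above and follows only this one,
# so the duplicate print pages drop out of Google alone
User-agent: Googlebot
Disallow: /print/
```

Because the more specific group wins, anything you still want GoogleBot to honor from the general section has to be repeated inside its own section.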
Google also offers a free robots.txt test tool, which allows you to see how robots will respond to your robots.txt file, notifying you of any files that are blocked.
You can use Xenu Link Sleuth to generate a list of URLs from your site. Upload that URL list to the Google robots.txt test tool (currently in 5,000 character chunks... an arbitrary limit I am sure they will eventually lift).
Inside the webmaster console Google will also show you what pages are currently blocked by your robots.txt file, and let you view when Google tried to crawl the page and noticed it was blocked. Google also shows you what pages are 404 errors, which might be a good way to see if you have any internal broken links or external links pointing at pages that no longer exist.
Jim Boykin recently offered tips to help webmasters understand how to audit a site to see what pages are the most link rich, how internal link equity flows around websites, and how to optimize your internal link architecture. In addition to Jim's tips, you can also improve your internal link structure by using some of the following tips.
Create Promotional Content Sections
The following ideas display social acceptance (which helps improve conversion) while also funneling PageRank at important pages without looking spammy.
heavily promote seasonal stuff in advance (internally and externally)
use sales data or other metrics to create a "what's hot in this category" and a "what's hot on our site" section to flow more link equity to best sellers (these can be called anything - what's hot, top rated, etc.)
create pages high in the site structure to support high value keywords that were only tangentially covered on lower level nodes
over-represent new content in your link structure to help it get indexed quickly, see how well it will rank, and learn how profitable it is
Internal to External Link Ratio
Doing these sorts of things will still give you all the good karma and benefit that linking out does, while minimizing any downside caused by funneling a significant portion of your PageRank out of your site.
if you have a blog cross reference old posts where and when it makes sense
if you link out heavily on a page ensure you also place numerous internal links on the page
use breadcrumb navigation or other navigational schemes to help structure the site and improve the internal to external link ratio
if you have a ton of outbound site-wide links, change some of them so they are only listed on a single page or section of your site
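As a rough way to audit this (a sketch using Python's standard library; the markup and hostname below are made up), you can count internal versus external links on a page:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Count internal vs. external links on a page of HTML."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        if host and host != self.site_host:
            self.external += 1
        else:
            self.internal += 1  # relative links and same-host links

# Hypothetical page markup for illustration
html = ('<a href="/about">About</a>'
        '<a href="/category/poultry">Poultry</a>'
        '<a href="http://example.org/">Partner</a>')
counter = LinkCounter("example.com")
counter.feed(html)
print(counter.internal, counter.external)
```

Run against your real templates, a ratio heavily skewed toward external links on key pages is a signal that more internal cross references would help.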
Keep the Noise Out of the Index
demote an entire section of the site in the link structure if it has a lower ROI than other sections
use robots.txt and meta robots exclusion tags to prevent duplicate content and other low information or noisy pages from getting indexed
instead of using pagination try to display more content on each page
check your server logs for 404 errors. fix any broken links and redirect old linked to pages to their new locations
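A minimal sketch of that log check (the log lines below are fabricated Apache-style samples, not real traffic):

```python
import re

# Hypothetical Apache-style access log lines
log_lines = [
    '1.2.3.4 - - [01/Jan/2007:10:00:00] "GET /old-page.html HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jan/2007:10:00:05] "GET /index.html HTTP/1.1" 200 4096',
    '5.6.7.8 - - [01/Jan/2007:10:01:00] "GET /old-page.html HTTP/1.1" 404 512',
]

def missing_pages(lines):
    """Return each 404'd path with its hit count, most requested first."""
    pattern = re.compile(r'"GET (\S+) HTTP/[\d.]+" 404 ')
    counts = {}
    for line in lines:
        m = pattern.search(line)
        if m:
            counts[m.group(1)] = counts.get(m.group(1), 0) + 1
    return sorted(counts.items(), key=lambda kv: -kv[1])

print(missing_pages(log_lines))
```

The most frequently 404'd paths are the first candidates for a 301 redirect to their new locations, since they are the ones still attracting visitors or links.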
This idea may sound a bit complex until you visualize it as a keyword chart with an x and y axis.
Imagine that a, b, c, ... z are all good keywords.
Imagine that 1, 2, 3, ... 10 are all good keywords.
If you have a page on each subject, consider placing the navigation for a through z in the sidebar while using links and brief descriptions for 1 through 10 as the content of the page. If people search for "a 7" or "b 9", that cross referencing page will be relevant for it, and if it is done well it does not look too spammy.
Since these types of pages can spread link equity across so many pages of different categories, make sure they are well linked to from high up in the site's structure. These pages work especially well for categorized content cross referenced by locations.
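A sketch of how such cross-referencing pages multiply (the categories and locations below are hypothetical stand-ins for the a-z and 1-10 axes):

```python
from itertools import product

# Hypothetical keyword axes: one axis of categories, one of locations
categories = ["plumbers", "electricians"]
locations = ["austin", "boston"]

# One cross-referencing page per (category, location) pair; each page
# targets combined searches like "plumbers austin" without hand-building
# every page individually
pages = [f"/{cat}/{loc}/" for cat, loc in product(categories, locations)]
print(pages)
```

With 26 categories and 10 locations you would get 260 such pages from the same two small lists, which is why these pages need strong internal links to share link equity across so many nodes.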
SEO Question: Do domain names play a role in SEO? Do search engines understand that the words are in the URL even if they are run together without hyphens in between them? What techniques are best for registering a domain name that search engines like Google will like?
Answer: Over time the role of the domain name as an SEO tool has changed, but currently I think they carry a lot of weight for the associated exact match search. Depending on how they are leveraged going forward they may or may not continue to be a strong signal of quality to search engines.
Domain Names & Link Anchor Text
When I first got in the SEO game a good domain name was valuable because if you got the exact keywords you wanted to rank for in your name it made it easier to get anchor text related to what you wanted to rank for. For example, being seobook.com made it easier for me to rank for seo book and seo.
That relationship still exists, but it is nowhere near as strong or broad as it once was.
The Fall of Anchor Text & the Rise of Filters
Anchor text as an SEO technique is no secret. To make up for the long ongoing abuse of it, Google started placing less weight on anchor text AND creating more aggressive filters that would filter out sites with a link profile that looked too spammy - too many inbound links containing the exact same anchor text. If everyone who links to me uses seo book as the anchor text, it is much harder to consistently rank for that term than it would be with a more natural mixture, such as the site name, the bare URL, and generic phrases alongside the target term.
Natural link profiles also contain deep links to internal pages, whereas spammy sites tend to point almost all of their links at their home page.
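One rough way to check your own profile for this (the anchor list below is invented for illustration):

```python
from collections import Counter

# Hypothetical inbound link profile: the anchor text of each link
anchors = ["seo book", "seo book", "seobook.com", "Aaron's blog",
           "click here", "seo book", "SEO Book blog"]

def anchor_share(anchors, phrase):
    """Fraction of inbound links using exactly the given anchor text."""
    counts = Counter(a.lower() for a in anchors)
    return counts[phrase.lower()] / len(anchors)

print(round(anchor_share(anchors, "seo book"), 2))
```

There is no published threshold, but if one exact-match phrase dominates the profile while brand, URL, and generic anchors are nearly absent, the profile looks less natural than one with a healthy mix.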
Domain Names in Action
As Google started getting more aggressive at filtering anchor text, they started placing more weight on the domain name if the domain name exactly matched the keyword search query. They had to do this because they were filtering out too many brands for the search query attached to their brand. Some examples of how this works:
At one point, about 2 years back, SeoBook.com stopped ranking for seo book due to a wonky filter that also prevented Paypal from ranking for their own name for a little bit.
A friend recently 301 redirected an education site on a bad URL to a stronger domain name. The site's ranking for the exact phrase went from 100+ to top 20 in Google. But, it still is a long way from #1, and it still is at 100+ for the singular version. In competitive industries you need a lot of links to compete, and the redirect also caused the site to slip a bit for some of the other target keyword phrases that the site used to rank for.
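If you manage such a redirect yourself on Apache, a sitewide 301 might look like this sketch (the domain is hypothetical, and it assumes mod_rewrite is enabled):

```
# Hypothetical .htaccess on the old domain: permanently redirect
# every URL to the same path on the stronger domain
RewriteEngine On
RewriteRule ^(.*)$ http://stronger-domain.com/$1 [R=301,L]
```

Redirecting each old URL to its matching new path, rather than dumping everything on the new home page, preserves the deep link equity the old pages had earned.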
When you launch a new site on a domain name like mykeywordphrase.com and get it a few trusted links, it should almost immediately rank for mykeywordphrase. A friend launched a 3-word education site about a week ago. That site ranks #1 in Google right now for those keywords run together. That site also just ranked #118 in Google for the phrase with the words spread apart. As the site ages and gets more links it should be easier to rank for that exact phrase (but that domain probably wouldn't help its rankings much for stuff like the root sub-phrase).
My domain name Search Engine History.com ranked better than it should have for the query search engine history when its only real signs of trust were age and domain name. It was nowhere in the rankings for just about any other query.
Things Will Change Over Time
A few other caveats worth noting
From my experience this exact match domain bonus works with all domain extensions (even .info), but that could change over time. And if the content isn't any good it is still going to be hard to get traction in any market worth developing content for. This exact match domain bonus also works well in local markets for regional domains like .ca.
This post is about the current market, and is highly focused on Google's relevancy algorithms (rather than other search engines). I expect the weight on domain names to be lowered significantly (especially for competitive queries) as Google moves toward incorporating more usage data into their relevancy algorithms. This is especially true if many domainers put up low quality to average quality websites on premium domain names. Moves like creating 100,000 keyword laden sites in one massive push (as Marchex recently did) don't bode well for the future of domain names as a signal of quality.
The search traffic trends are moving toward consolidating traffic onto the largest high authority sites, so it probably is not a good idea to have 100 deep niche domain names like OnlineHealthcareDegrees.org, OnlineNurseDegrees.net, OnlineNursingSchools.com, OnlineLawDegrees.com, OnlineParalegalDegrees.net etc when you can cover a lot of those topics with a singular broad domain like Online Degrees.org.
Any advantage exact match domains seem to have for ranking is much smaller for related phrases that do not exactly match the keyword string or phrases within the anchor text of most of the inbound links.
For local businesses a keyword matching domain might be a way around paying to list in all the regional directories and other related arbitrage plays.
Domains that use familiar language and sound credible also have a resonance that helps build trust, makes the information seem more credible, and makes the site easier to link at, easier to syndicate, and easier to do business with. It is hard to estimate the value of that since much of it is indirect, and few have measured the effect of domain name on the linkability or clickability of a listing outside of paid search arbitrage.
As an SEO one of our primary goals is to get more search traffic for targeted search terms. Search traffic is typically far more valuable than other traffic sources because it is so targeted. But non-search traffic is perhaps the single most reliable sign of quality. As Google controls a larger portion of the overall traffic flow across the web, they risk creating self fulfilling prophecies where low quality sites continue to rank only because they already rank.
If you were Google, and discovered that 98% of a site's traffic comes from Google.com might you want to give that site a bit less exposure? I would. Maybe those algorithms do not exist now, but eventually they could.
If you have a site that earns far beyond your living costs, and it is almost entirely reliant on search for income, then one of the best moves you can make for the sustainability of that site is to lower the percentage of traffic that comes from search by creating other traffic sources. The other traffic sources may not be as profitable on a CPM basis, but as you diversify you lower your risks. It doesn't matter how the algorithms shift if your site is strong in every signal of trust they could possibly measure.