How Google AdWords Ads Manipulate Google's Organic Search Results

SEO Question: I was thinking about buying Google AdWords and AdSense ads or placing AdSense on my site. Will doing any of these increase my link count, Google rankings, or rankings in other search engines?

Answer: PPC ads go through redirects, so they do not count toward your link popularity, but there are other ways to tie together PPC ads and organic search placement. Search engines claim there is no direct linkage between buying ads and ranking, but they only talk in ideals because it reinforces their worldview and helps them make more money.

Buying AdWords Ads

What They Won't Tell You:
Highly commercial keywords may have the associated editorial results go through more relevancy filters and/or be editorially reviewed for relevancy more frequently. Also, because they want people to click on the AdWords ads, there is a heavy informational bias in the organic search results.

I know some people with large ad spends who get notified of new ad system changes ahead of time, and others who get to give direct feedback that helps clean up search results and minimize unfair competitive actions in the ad systems. So that is one type of crossover / feedback that exists, but I think it tends to be rare; the more important crossover / feedback is an indirect one.

Just by Being Real
You can't really explain why and how everyone does what they do. Some people who find your product and enjoy it enough to leave glowing testimonials will even tell you that they don't know how they found it.

In the same way that targeted ads can lead to purchases, they can also lead to an increase in mindshare, brand, reach, usage data, and linkage data. Just by being real and being seen you are going to pick up quality signals. If you try to factor all of those into your ad buys most markets are still under-priced.

Cross Over Due to Buying AdWords:
A well thought out pay per click campaign can feed into your SEO campaign in more ways than I can count. Here are a few examples.

Integrating Offline & Online:
In a TV commercial Pontiac told people to search Google, and got a ton of press.

Big Controversial Ads:
Mazda quickly bid on Pontiac.

Many companies also have strong ties between the legal and marketing departments. If buying or selling an ad gets you sued and gets you in the news the value of the news coverage can far exceed the cost of the ads and legal fees.

Small Controversial Ads:
When I was newer to the field one friend called me the original link spammer. He meant it as a compliment, and I still take it as one. In much the same way I was an aggressive link builder, I was also quite aggressive at ad buying.

I caused controversy by buying the names of other people in the industry as AdWords ads. I was pretty much a total unknown when I did that, but some of the top names in the industry elevated my status by placing my name in heated discussions about what was and was not fair and reasonable.

You can always consider placing controversial / risky ideas or ads against your brand or competing brands as a way to generate discussion (but of course consider legal ahead of time).

Drafting Off New Words & Industry News:
When the nigritude ultramarine SEO contest started I bid on AdWords. Some people discussing the contest mentioned that I bid on that word. If an event bubbles up to gain mainstream coverage and you make it easy to identify your name as being associated with it then you might pick up some press coverage.

Industry buzz words that are discussed often have significant mindshare, get searched for frequently, and larger / bureaucratic competitors are going to struggle to be as plugged into the market as you are or react as quickly to the changing language.

Snagging a Market Position Early:
When a friend recommended I read the TrustRank research paper in February of last year I knew it was going to become an important idea (especially because that same friend is brilliant, has helped me in more ways and times than I can count, and was the guy who came up with the idea of naming the Google Dances).

I read it and posted a TrustRank synopsis. In addition to trying to build a bit of linkage for that idea I also ensured that I bought that keyword on AdWords. Today I rank #1 in Google for TrustRank, and I still think I am the only person buying that keyword, which I find fascinating given how many people use that word and how saturated this market is.

Buying AdSense Ads

Buying Ads Creates Content:
If your ads are seen on forums people may ask about your product or brands. I know I have seen a number of threads on SEO forums that were started with something like "I saw this SEO Book ad and I was wondering what everyone thought of it." Some people who start talking about you might not even click your ads.

Each month my brand gets millions and millions of ad impressions at an exceptionally reasonable price, especially when you factor in the indirect values.

Appealing to an Important Individual:
I have seen many people advertise on AdSense targeting one site at a time, placing the webmaster's name in the ad copy. It may seem a bit curt to some, but it is probably more likely to get the attention of and a response from a person than if you request a link from them.

Ads are another type / means of communication.

Appealing to a Group of People:
I get a ton of email relating to blogs and blogging. And in Gmail I keep seeing Pew Internet ads over and over and over again. Their ads range from Portrait of a Blogger to Who are Bloggers?

Going forward they will have added mindshare, link equity, and a report branded with that group of people. When people report on blogging or do research about blogging in the future the Pew report is likely to be referenced many times.

Selling AdSense Ads

Don't Sell Yourself Short:
Given the self reinforcing nature of links (see Filthy Linking Rich) anything that undermines your authority will cause you to get less free exposure going forward. So you really have to be careful with monetization. If you monetize for maximal clickthrough rate it will end up costing you a lot of trust and link equity.

There are other ways to improve your AdSense CTR and earnings without costing you credibility and authority.

More tips:

Don't Monetize Too Early:
Given the lack of monetization ability of a new site with few visitors and the importance of repeat visits in building trust and mindshare you don't want to monetize a new site too aggressively unless it is an ecommerce type site. It is hard to build authority if people view your site as just enough content to wrap around the AdSense.

Spam, Footprints, & Smart Pricing:
In the past search engines may have discounted pages that had poison words on them. Search is all about math / communication / patterns.

If your site fits the footprints of many spammy sites then your site might be flagged for review or reduced in authority. MSN did research on detecting spam via footprints, and link spam detection based on mass estimation shows how power laws could make it easy to detect such footprints.

Graywolf recently noted that landing page slippage may be an input into landing page and site quality scores for AdWords ad buyers. Google could also use AdSense account earnings or AdSense CTR data to flag sites for editorial reviews, organic search demotion, ad payout reduction, or smart pricing.

How to Redirect Outbound Affiliate Links

Question: I wanted to link to your book from my site using an affiliate link, but I am not able to put affiliate links directly on my site. How do I link to pages on my own site and have them jump to the affiliate links?

Answer: Many affiliates link to theirdomain.com/recommended/product-name/ and then redirect that location either using .htaccess or a PHP jump script. Some affiliates also block the directory of affiliate links using a robots.txt file.

The advantages of doing this are:

  • getting around publishing requirements that prevent you from posting affiliate links to your site

  • potentially shielding some of your affiliate footprints from some information retrieval systems (although many of them will likely be able to understand your link relationships to some degree based on the surfing habits of your visitors). Some affiliates also cloak their links to show engines links to well trusted sites, but that could be considered shady by some search engines
  • making it harder for newbies to see how to access the affiliate program you are recommending, or to see that the link is an affiliate link (some people also use a JavaScript mouseover event that shows the end site URL to further cloak the affiliate relationship)
  • easily changing which merchant or merchant offer is associated with affiliate links throughout a site by changing the one .htaccess or PHP redirect file

A while ago NotSleepy guest posted about using .htaccess and redirects. You may want to use 302 instead of 301 redirects if you are using .htaccess for your redirects. Here is an easy to use PHP jump script if you would prefer to use that over .htaccess.
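
As an illustration, here is a minimal sketch of that setup; the /recommended/ path, the merchant URL, the affiliate ID, and the jump.php file name are all made up for the example:

    # robots.txt - keep search engines out of the redirect directory
    User-agent: *
    Disallow: /recommended/

    # .htaccess - 302 redirect the local path to the merchant's affiliate URL
    RewriteEngine On
    RewriteRule ^recommended/product-name/?$ http://www.merchant-example.com/?affiliate=12345 [R=302,L]

    <?php
    // jump.php - a bare bones PHP jump script; change the URL here once and every
    // affiliate link on the site follows it
    $offers = array(
        'product-name' => 'http://www.merchant-example.com/?affiliate=12345',
    );
    $slug = isset($_GET['to']) ? $_GET['to'] : '';
    if (isset($offers[$slug])) {
        header('Location: ' . $offers[$slug], true, 302); // 302 so engines treat it as temporary
    } else {
        header('HTTP/1.1 404 Not Found');
    }
    exit;
    ?>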

Finding Link Sources & Building Topical Authority Links

SEO Question: Many people tell me to get authoritative links. How do I find authoritative links?

SEO Answer: It helps to get links directly from sources that would be considered trusted seed sites in algorithms like TrustRank or topical hub and authority sites in Topic Sensitive PageRank. As TrustRank, Topic Sensitive PageRank [PDF], and other similar trust / topical trust related algorithms flow around the web it also helps to get links from sites that are linked to from seed sites.

Sites like DMOZ, the Yahoo! Directory, and Wikipedia might be considered obvious authorities and trust seed sites, and there are numerous other ways you could find potential trusted seed sites.

One example of a way to find general high authority / high trust domains might be to look for sites that link to multiple trusted related resources in one field and also link to multiple trusted related resources in other fields. For example, you could do something like a Yahoo! Search for (linkdomain: a couple sites in field 1) AND (linkdomain: a couple sites in field 2).

Sites whose brands you know even if they are outside your industry, or sites you see ranking across a wide range of queries, are also well trusted authoritative domains.

Some algorithms might transfer a lot of trust to anything listed in multiple seed sites. So if you wanted to find what sites were listed in DMOZ and the Yahoo! Directory that link at a competing site you could do a Live search for something like linkdomain:seobook.com linkfromdomain:dmoz.org linkfromdomain:yahoo.com.

Some algorithms may take the top x% of sites from each category of trusted seed sites and consider those as trusted sites as well. The Yahoo! Directory lists sites roughly in terms of authority, so viewing the top sites in a specific category is a good way to find the most authoritative sites in that category.

Yahoo! also paginates results in each category. If you are in need of co-citation in the Yahoo! Directory and your domain lacks adequate authority to be listed on the first page of your category you can buy a category sponsorship for about $100 a month without worrying about Google calling you a link buying spammer or removing your site from the results (even though you are buying an ad for distribution, link equity, and co-citation - typically with more indirect value than direct value).

The Google Directory is powered from DMOZ data, and sorts listings in order of PageRank, so that is another good way to find the top authorities in a specific category. Also when you search the Google Directory for a domain like seobook.com it will show pages listed in the Open Directory that mention that domain.

You could also create a Google Custom Search Engine which was seeded by a seed site such as the Open Directory Project's RDF dump, and then search that for domain mentions.
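
If you wanted to script that, here is a rough sketch (it assumes the content.rdf.u8 dump format published by the Open Directory, where each listed site appears in an ExternalPage element; the file names are placeholders):

    <?php
    // extract-odp-urls.php - pull listed URLs out of the Open Directory RDF dump so
    // they can be used to seed a custom search engine or grepped for domain mentions
    $in  = fopen('content.rdf.u8', 'r');
    $out = fopen('odp-urls.txt', 'w');
    while (($line = fgets($in)) !== false) {
        // listings look like <ExternalPage about="http://www.example.com/">
        if (preg_match('/<ExternalPage about="([^"]+)"/', $line, $m)) {
            fwrite($out, $m[1] . "\n");
        }
    }
    fclose($in);
    fclose($out);
    ?>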

When looking for topical hubs you could also look at:

  • sites which link to many top ranked authority sites using a tool like hub finder.

  • top ranked sites for related fields broader than yours...for example, if you had an SEO site you can look for top ranked pages and sites about search
  • who links at industry standards and other important documents in your field
  • top ranked sites for your keywords + blog (helps if your topic is somewhat tech or web related in nature)
  • track mentions of competing sites using Google Blogsearch or Technorati
  • topical authority blogs in Technorati
  • once you find a few hubs or authorities use the Google related sites feature to find related sites (e.g. related:searchenginewatch.com)

Another way to get authoritative links is to see what social sites and people outside of your industry are talking about and linking to that is related to your industry or related industries. Think of ways to create related ideas and industry standards.

Yahoo! tends to sort backlinks roughly in terms of authority. In addition, Yahoo! allows you to search for .edu, .gov, .mil, .ac.uk or things with .k12 in the URL. Combine those types of ideas with a specific topic or a link search function to find a targeted link opportunity.

And, if you are into looking at competitive linkage data right in the search results SEO for Firefox is the extension for you.

Can New Domains Outrank Old Websites in Google?

SEO Question: I have the same content as a top ranked competing site. Are they outranking me because of their domain age? What can I do to outrank them?

SEO Answer: Competing requires more than just replicating what a competing site has done. Back when search was less sophisticated people had to follow links to get where they wanted to go. Thus directories were more relevant and many sites listed any halfway decent sites in their vertical based on the fact that they were even in the same vertical. With search replacing links as the default navigational scheme you have to do more to be linkworthy.

A site like SeoToday would not get to the top of the search results if it were launched today, but because it was launched many years back and was easy to link at back then it has many authoritative industry related links that help keep it ranked well in Google.

Also think of the search business model as though you are a search engine. To them, being the first person to do something is a sign of quality because to be the first person in a market requires some market timing / knowledge / investment / luck. The people who bet on new markets are in essence rewarded if/when their market takes off, both by self-reinforcing market effects (people being more likely to find / experience / link to top ranked results) and by algorithmic weighting on domain age.

The biggest issue facing search engines is the quality of their results. By relying on old / stale results they require new content producers to do better things than old websites did to steal marketshare. Thus you have to be innovative / offer a better customer experience / be more remarkable to rise to the top of a marketplace.

If you want to outrank established websites you can't just replicate what they have done, you also have to do unique and linkworthy things that will help you overcome their early market lead and the self-reinforcing effects of search.

How Do I Get Large Websites Indexed by Google & Other Search Engines?

SEO Question: I have a 100,000+ page website. Is there any easy way to ensure all major search engines completely index my website?

SEO Answer: Search engines are constantly changing their crawl priorities. Crawl too deeply and they pick up many low quality pages while increasing indexing time and costs. Crawl too shallow and they don't get down to all the relevant pages. Crawl depth is a balancing act.

There is no way to ensure all pages get and stay indexed...they change their crawl priorities constantly. Having said that, you can set your site up to make it as crawler friendly as possible.

Five big things to look at are

  • content duplication - are your page titles or meta description tags nearly duplicate (for example thin content pages that are cross referenced by topic and location)? or do other sites publish the same content (for example an affiliate feed or a wikipedia article)? are search engines indexing many pages with similar content (for example different model color or splitting feedback for one item across many pages)?

  • link authority - does your site have real high quality links? how does your link profile compare with leading competing sites? what features or interactive elements are on your site that would make people desire to link to you instead of an older and more established competing site?
  • site growth rate - does your site grow at a rate consistent with its own history? how does your growth rate compare with the growth rate of competing sites in the same vertical?
  • internal link structure - is every valuable page on your site linked to from other pages on your site? do you force search engines to go through long loops rather than providing parallel navigation to similar priority pages? do you link to low value noisy pages (sometimes a search engine indexing fewer pages is better than more)?
  • technical issues - don't feed the search engines cookies or session IDs, and try to use clean descriptive URLs (see the rewrite sketch below)
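
On the technical issues point above, a small mod_rewrite sketch like the following (the paths and parameter names are invented for the example) lets you link to clean descriptive URLs while the underlying script still receives its parameters, so crawlers never have to deal with session IDs or messy query strings:

    # .htaccess - serve a clean descriptive URL from a dynamic script
    RewriteEngine On
    # /widgets/blue-widget/ is what gets linked to and indexed;
    # product.php?item=blue-widget is what actually builds the page
    RewriteRule ^widgets/([a-z0-9-]+)/?$ /product.php?item=$1 [L,QSA]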

Some signs of health are

  • you don't have pages you don't want getting indexed - wasting link equity on low quality pages means you have less authority to spread across your higher quality pages

  • most of the pages you want indexed are getting indexed, actively crawled, and are not stuck in Google's supplemental index - supplemental problems and / or reduced indexing or crawl priority are common on sites with heavy content duplication, wonky link profiles, or many dead URLs
  • your site is building natural link equity over time and people are actively talking about your brand - if you have to request every link you get then you are losing market share to competitors who get free high quality editorial links
  • you see a growing traffic trend from search engines for relevant search queries - this is really what matters. this includes getting more traffic, higher quality traffic, and searchers landing on the appropriate page for their query.

Some things you can do if conditions are less than ideal are

  • focus internal link equity at important high value pages (for example, on your internal sitemap consider featuring new product categories, new and seasonal items, or link to your most important categories sitewide)

  • trim the site depth (by placing multiple options on a single page instead of offering many near duplicate pages) or come up with ways to make the page level content more unique (such as user feedback)
  • cut out the fat - if many low value pages are getting indexed block their indexing by doing something like nuking them / not linking to them / integrating their information into other higher value pages
  • use descriptive page relevant URLs / page titles / meta descriptions - this helps ensure the right page ranks for the right query and that search engines will be more inclined to deeply crawl and index your site
  • restructure site to be more top / mid / bottom heavy - if a certain section of your site is overrepresented in the search results consider changing your internal link structure to place more weight on other sections. in addition you can add features or ideas which make the under-represented pages more attractive to link at
  • use Sitemaps - while you should link to all quality pages of your site from your site and use internal link structure to help them understand what pages are important, you can also help search engines understand page relationships using the open sitemap standard (a minimal example follows below)
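
For reference, a minimal sitemap file following the sitemaps.org protocol looks like this (the URLs and dates are placeholders); you can point engines at it with a Sitemap: line in robots.txt or submit it through their webmaster consoles:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/widgets/blue-widget/</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>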

Should I do SEO for Niche Market Clients?

SEO Question: Where would you begin with a client that markets exceptionally niche products? Or would you simply pass on the opportunity and just work with clients with more mainstream products/services?

SEO Answer: Certain markets, like insurance, are brutal to jump into, no matter how much you are willing to spend. Other markets are easy to dominate with little effort. Traditionally niche markets are less competitive than established competitive markets, and thus it is easier for a small amount of SEO to go a long way.

There are a few major considerations when deciding if it is worth it to do SEO on a niche site:

  • Is there market demand? Is there trending demand? It is hard to get people to change the way they use language or to create demand where none exists unless the marketing goes beyond SEO. If the customer ranks #1 for their core keywords but there is no traffic then SEO is a moot point. Do keyword research before considering SEO. You may also want to search to see if people are talking about similar ideas. Don't forget to use something like Google Trends if the market is seasonal. If possible set up an AdWords campaign. If they are in a deep niche there probably is not going to be much competition for their core target keywords, and they may even do well by spending $100 a month on AdWords.

  • Is there an overshadowing important market? Another thing to consider is how established and authoritative are the sites currently ranking for your target keywords. In some cases they may be prohibitively authoritative and not worth the expenses required to outrank.
  • How relevant are the organic search results? If the organic search results are irrelevant and AdWords has no competition then it might be hard to justify an expensive SEO campaign if AdWords, Overture, and AdCenter are working well enough.
  • Does the site have any trust? If the site has no trust at all it may be easier to rank pages on other sites. For example, you could put up a Squidoo lens (or similar), point a link or two at that, and try to rank it.
  • Are people talking about the site? If the site has been mentioned in the past by people without much of a push marketing campaign then odds are it is remarkable enough to be really citation worthy, especially if you are working with a Purple Cow. If the site is interesting or you can relate it to something of great interest then viral marketing as SEO is great stuff.
  • Budget & Fun: The other things to consider are how fun the client would be to work with, how badly you need the work, and whether they are willing to give you enough budget to adequately value your time and provide long-term value. In some niches there might be huge upside potential. If they are starting from scratch (or nearly from scratch) it might make sense to gain an equity position in the website. You also have to evaluate how scalable your model is. Ongoing quality SEO services require a low client to service provider ratio since each client has exceptionally unique needs and SEO service business models are hard to scale.

Why Do Keyword Tool Search Estimates Vary so Much?

SEO Question:

I am using Overture, Wordtracker, and KeywordDiscovery to do keyword research, but I want to know why the search volume numbers vary so much, and which numbers I should trust. How do I do keyword research?

SEO Answer:

Each keyword data source has flaws inherent to its model. Rather than looking at keyword suggestion tools as something which offers an exact quantitative measure of traffic, look at them as more qualitative (i.e., rather than looking for exact numbers, treat them more as a yes/no tool).

Also look at keyword depth, related words, and reasonable modifiers. Depth and related words matter far more than just the sheer volume for a generic term because the longer queries are more targeted and thus easier to monetize, and longer search phrases are typically easier to rank for than shorter keywords.

Overture:

Overture is owned by Yahoo!. Since Yahoo! is a major search provider and has a fairly open ad inventory system they have a ton of automated search queries from things like

  • rank checkers

  • search result scrapers
  • people doing competitive research
  • arbitrage players
  • bid checkers

When I searched for "seo book" as a keyword Overture returned 1,579 monthly queries. This number is low because those people searching for my brand (or seo products in general) are typically more likely to search on Google.

In that same report Overture also showed 33 monthly queries for book engine optimization search seo seo seo software. Notice how they returned the words in alphabetical order; the words in the actual search queries were most likely in a vastly different order. Also note that it is an absurd search phrase. Who would search for something like seo book seo search engine optimization seo software? Probably nobody, so it is most likely an automated query. You can also back up the lack of legitimacy of that keyword phrase by the fact that Overture did not show any search volume for other similar but shorter and more reasonable queries.

Another problem worth noting with Overture data is that they lump singular and plural search terms together. Some queries have far different meanings and/or far different search volumes between the singular and plural versions.

Although I mentioned this above, it is worth noting again: keyword depth matters far more than the sheer volume for a generic term. Longer queries are more targeted and thus easier to monetize. Plus those queries are easier to rank for. Since seo book only had 3 returned queries, with one of them being brand related and another being likely a fake query, it probably would not be a great keyword to target for traffic (but the lack of competition and limited market depth might make it an easy term to brand...which was part of the thought process when I created this site).

To test keyword depth and find related keywords you can use a variety of tools, like Overture, Google Suggest, the AdWords Keyword Tool, and Google Sets.

I also have ownership in one of the top ranked Forex websites. Because of the highly commercial nature of the search term (forex means foreign exchange, which is typically searched for by people trading money) Yahoo!'s search estimates (93,240 searches a month) are absurdly overblown for that core term.

But on a positive note, that market has amazing depth. People are looking for courses, books, news, tips, and strategies on the topic. Forex also has synergistic related keywords like currency trading, forex trading, currency exchange, and modifiers galore including types of trades, country names, and currencies. While the market is exceptionally competitive the depth still makes it somewhat appealing.

Wordtracker:

Wordtracker has a much smaller data set than Overture (Yahoo!), Google, or even MSN, so Wordtracker numbers are going to be more easily skewed by the small sample size. If I use one of Wordtracker's partner search engines and search for an uncommon query it is going to make that query seem far more important than it is.

Digital Point offers a free tool which compares Wordtracker and Overture values side by side.

Keyword Discovery:

I view Keyword Discovery as being somewhat similar to Wordtracker. Keyword Discovery may have a larger keyword index than Wordtracker, but don't expect either of them to offer precise quantitative data.

If you want to test the quantity of traffic for a specific keyword, set up a search targeted campaign on Google AdWords, and ensure you bid enough to show up somewhere on the lower half of the ads on the first page. Also ensure that you target your ads to the appropriate regions, and use a large enough budget that your ad shows up on almost all of the searches for that particular keyword.

In the past I did a more comprehensive review of keyword research tools. I also offer a free keyword research tool which is powered from Overture data and cross references most of the useful keyword research tools on the market.

Determining Domain Link & Age Related Trust

SEO Question: What's the best way to determine whether Google has history of a domain - and considers it an old domain?

SEO Answer: Andy Hagans once posted that a site which is getting crawled fairly regularly has at least some trust greater than nothing. The Google Toolbar gives you an outdated, exceptionally rough estimate of authority, but beyond just having many pages indexed and crawled regularly it is a bit more abstract to determine how well a domain is trusted, especially if you do not yet own it and are not able to manipulate its contents to perform testing.

What you can do is back solve for some clues of trust. For example, a domain that does not rank #1 (or near the top of the search results) for the keywords "mydomain" is probably not as well trusted as one that does.

If a domain name ranks for its core unique string then you can see if it ranks well for the keyword phrase "my domain". If it does then you can assume that it would have more trust than one that does not.

Beyond that you can see how well the domain ranks for unique text phrases on its pages or more general keywords. Essentially as you modify your searches (testing shorter or longer word phrase sets and/or wrapping them in quotes or dropping the quotes) you are just testing what chance the domain may have of ranking for various different phrases (and thus its potential ability to rank for other phrases).

Another way to look at a domain is to see how old the domain is in Archive.org. Google crawls the web more efficiently (and likely more aggressively) than smaller search services that are not based on running / being supported by such a large ad network. Thus if a domain has been indexed for a while in Archive.org then it most likely has also been indexed in Google for a while.

Another thing you want to look at in Archive.org is whether the domain has had a period of inactivity, or whether a porn webmaster or a pay per click domainer owned the domain for a while. If the domain was inactive for a while, or spent a period of time being abused, then it may have had some of its authority stripped at some point.

You can also query Yahoo! for linkdomain: or link: to see what some of the most important backlinks are for a domain. See if those links point at documents that still exist. See if those links point at documents for the same purpose that they originally did. See if the site still serves the same purpose it originally did.

In one of his SEO videos Matt Cutts stated that it was legitimate to redirect a site to a new location so long as the purpose of the site is the same as the original site was. If redirecting a site's authority is legitimate then one could assume that buying or selling a site to use it for its same original purpose is also legitimate as well. If you are going to leverage a site for off topic purposes though you increase the risk that many of the people linking at the site may pull some of their links, and you also increase your risk profile such that a competitor may out you or a search editor may want to remove your site from the search results.

Back to the backlinks...

With the current Google, anchor text no longer matters anywhere near as much as it once did, although variation is still a plus.

When looking at the backlinks of a site you can see how long some of those links have been in place by looking through the Archive.org history of the page linking into your site. Links that have developed naturally over time or that have been in place for a long time may carry greater weight than brand new links.

One of the nice features of SEO for Firefox is that it can give you a quick glimpse of the link profile of a page or domain. While some people discount the theory that extra weight is placed on .edu and .gov links relative to other links, I would not be so quick to dismiss it. Generally links that are harder to influence are going to be sources that search engines would want to trust more, either by directly trusting them more on a per link basis, or by creating a version of the web graph which starts near (and places more emphasis on) some of the core trusted authority sites. Since the web started largely in .edu and .gov type environments, and those types of pages are often fairly pure in nature and easy to link at, it makes sense that their link voting power would be highly represented in the search results.

When valuing a domain you have to look beyond just the link and age related equity it already has built up. How self reinforcing are its key attributes? Would it be easy for someone else to steamroll over your key widget by throwing a bit of ajax on a similar tool? Does a newer competitor have a richer community driven environment that is picking up steam? Are your links next to impossible for others to get? How official or legitimate does the name sound? Will you be able to build it into something that can continue to gain traction and authority? Or is it going to be a site surviving on past popularity until it withers away?

There are lots of things to consider when valuing a domain. Small changes in ad positioning or monetization method can lead to doubling or tripling earnings. And you can also drastically increase the earnings and traffic potential of any site owned by a person who is not savvy to marketing, SEO, or business. Earnings is one important factor, but do not forget to consider the value you can add to a site when trying to determine what you can afford to pay for it.

Search Engine Friendly Copywriting - What Does 'Write Naturally' Mean for SEO?

SEO Question: Many people say write naturally for SEO, but what does that mean?

SEO Answer: About a month and a half ago the New York Times published an article by Steve Lohr titled This Boring Headline Is Written for Google. The article flirts with the idea of writing newspaper articles with Google in mind. That story got a decent amount of buzz because newspapers usually do not put much consideration into search engine marketing.

Old School Search Engine Optimization:

A few years ago you could do SEO like this:

  • start your page title with your keyword or keyword phrase
  • include that keyword phrase on most every heading or subheading on that page
  • link to the page sitewide with that same keyword in the anchor text
  • build a ton of links from external locations, with most (or all) of them containing that keyword phrase

Does Old School Still Work?

For MSN (and, to some extent, Yahoo!) you could still use a somewhat similar keyword stuffing philosophy and see outstanding results, but the problem with the "stick my core phrase everywhere!" method of SEO is that Google does not want to show the most optimized content. They want to show the most relevant content.

As noted above in the New York Times article, most news articles (and likely most quality web documents) are not heavily focused on optimizing for a keyword. Instead they use the natural language associated with that topic.

If too many of your signals are focused on just one word or phrase and you lack the supporting vocabulary in your document you may get filtered out of the search results for your primary keyword targets. It has happened to me several times, and it is a pretty common occurrence, especially for websites that have few authoritative trustworthy votes and try to make up for it by aggressive use of a phrase in the page content.

Here is an example of a snapshot of a spam page I saw ranking for a long tail keyword

The problem is, that page was ranking for Michigan Smoker's Life Insurance when it was targeting a far different phrase. The page will never rank for the main phrase it was targeting, so unless they redirect searchers to a more relevant page it is going to be hard for them to convert any visitors that land on a page like that.

Read a bunch of SEO forums and you eventually come across threads with titles like "Non optimized pages higher in SERPS than optimized ones???"

How to Optimize for Google:

So if old hat optimization is considered overoptimization and/or is potentially detrimental to your rankings what do you do?

You could

  • say screw Google they will eventually rank me if I get this keyword on the page 1 more time ;)
  • say screw Google I am pulling in plenty of money from Yahoo! and MSN
  • not worry about SEO at all
  • evolve SEO to a more productive state

Onward and upward I say. How to mix it up to become Google friendly:

  • Start the page title with a modifier or couple non keywords instead of placing your primary keyword phrase as the first word of the page title. Example... instead of search engine marketing company start your title with professional search engine marketing...
  • Stemming is your friend. Use plural, singular, and -ing versions of your keywords. I have seen pages that used a bunch of the plural version get filtered out of Google for the plural version but still rank for the singular version. If you mix it up you can catch both.
  • Mix up the anchor text, subheaders and page content. Use semantically related phrases, and, in some cases, write subheaders that are useful for humans even if some of them do not have any keyword phrases in them.
  • Make sure each page is somewhat unique and focused in nature.

Semantically related phrases:

If you think of words as having relationships to one another and you visualize optimizing for a keyword as optimizing for a basket of relevant related keywords it will help you draw in relevant related search traffic while also making your page more relevant for its core keywords.

For example, the acronym SEO has a number of semantically related phrases, such as search engine optimization, search engine marketing, link building, and keyword research.

Now you wouldn't necessarily need to get all of those in your page copy, but if a person was writing naturally about the topic of SEO it would be common for many of those kinds of words to appear on the page.

Where do I Find Semantically Related Phrases?

GoRank offers a free semantic research tool. You can also find semantically related phrases by using a Google ~ search, the Google Keyword Tool, clustering engines, or concept pairing tools like Google Sets.

I link to all those tools on my keyword suggestion tool, and here is a background post on latent semantic indexing.

An Overabundance of Modifiers:

In addition to using words that are semantically related it makes sense to use words that are common modifiers. For example common buying / shopping searches might include words like

  • Free shipping
  • Coupons
  • Coupon
  • Deals
  • Deal
  • Cheap
  • Expensive
  • Budget
  • Bargain
  • Bargains
  • Affordable
  • Low Cost
  • Free
  • Find
  • Get
  • Buy
  • Purchase
  • Locate
  • Compare
  • Shop
  • Shopping
  • Search

I created a keyword modifiers spreadsheet with free keyword modifier ideas for a few different search, transaction, and classification types. I might try to expand it a bit if people find it useful.

If it All Sounds Like a Bit Much...

If it seems complex or complicated then don't focus too heavily on the modifiers or semantically related phrases, or even on your core keyword all that much.

First write your article about your topic without even thinking about the search engines. Then go back and tweak it to include relevant modifiers and semantically related phrases. Make sure that you use multiple versions of your primary keyword phrase if it has multiple versions.

To make the page easy to read, and to make it easy to add related phrases and alternate versions of your keywords, break up the page using many subheaders. Also add leading questions that carry people from one section to the next. For example, I could say: did you find this search engine marketing article helpful in your website promotion quest? Do you think it will help you become a better search engine optimizer and a more holistic internet marketer?

I am a bit tired and I think this was a bit verbose, but hopefully it helps somebody. If not, arggggg... hehehe.

How do I Tell Where My Website Ranks in Google?

SEO Question: So here is my question. I have followed a good deal of your advice and am thankful for it as I see myself sitting pretty for some of my keyword phrases. However, my friend in Idaho sees different results and my friend in Saudi sees different results - and I wanna know - does Google index differently according to geographical locations?

SEO Answer: Assuming that you mean where you rank in the SERPs (also known as the search engine results pages) and not PageRank - which is a rough estimate of global link popularity - there are a number of factors which may show you different search results than what your friends see when they search Google. The 3 major factors are:

  • data center

  • location
  • personalization

Google's Data Centers:

Google has a boatload of data centers around the world. In fact, some of them may even be running in shipping containers. They usually route search queries to the data center that is nearest you. In a recent interview Google's Matt Cutts said:

In fact, even at different data centers we have different binaries, different algorithms, different types of data always being tested.

If they roll new filters or make large changes to their algorithms you might notice different results as you hit one data center or another.

Location:

A friend of mine owns a site in a hyper competitive market that used to be owned by a person from Australia. While the site does not yet rank as well as my friend would like on Google.com it ranks for amazingly competitive single word queries in Google Australia (Google.com.au) due to having many links from websites that are located in Australia.

Within Google's local search they allow people to search for all websites or just local ones. You can appear in local databases by hosting your site there, using a domain with a local extension, or having many links from sites that are deemed local in nature.

If you are in Canada even if you search on Google.com those search results will be biased toward Canadian websites. If you are located in another country but want to see what Google's search results look like in the US you can search Google from a proxy.

Search Personalization:

If you are logged into a Google Account they will bias your search results based on websites you have visited, especially those you have clicked through to from search results.

If you visit a site or page frequently they will improve the positioning of that page in your personalized search results. If you only visit a page occasionally to rank check, sometimes clicking onto your result and then clicking back nearly immediately, Google will demote those pages in your personalized SERPs.

You can turn off Google personalized results by clicking the link on the results page that says something like "Turn Off Personalized Results."

Free Google Keyword Rank Checking Tools to Use:

There are a number of free tools that make it easy to track where you rank.

I like tracking some core keywords using Rank Checker. It is free and stores historical results, offering data refreshed however often you would like it refreshed. And since it sits on your desktop you don't have to worry about others spying on and aggregating your data to compete against you.

If you just want to check where you rank in Google I have a few rank checkers in the tool section on my website as well.

Keep in mind that many of the rank checkers will set the number of results per page to be a different number than the default 10, and that will cause a slight ranking skew. Also if you wanted to check your rankings on different data centers McDar has a free tool which makes it easy to check your rankings across a number of data centers all at once.

You could also manually go to any of the data centers directly and do a search query from their IP address, like 66.249.93.104 or 216.239.51.104
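
If you just want a rough scripted check rather than a dedicated tool, something like the sketch below works. It is only a sketch: the keyword and domain are example values, the parsing depends on Google's result markup of the moment, and automated querying can run afoul of their guidelines, which is part of why the tools above exist.

    <?php
    // rough-rank-check.php - approximate where a domain sits in Google's top 100 results
    $keyword = 'seo book';    // example keyword
    $domain  = 'seobook.com'; // example domain to look for
    $html = file_get_contents('http://www.google.com/search?q=' . urlencode($keyword) . '&num=100');
    preg_match_all('#href="(https?://[^"]+)"#i', $html, $matches);
    $position = 0;
    foreach ($matches[1] as $url) {
        if (strpos($url, 'google.') !== false) { continue; } // skip Google's own links
        $position++;
        if (strpos($url, $domain) !== false) {
            echo "$domain is roughly position $position for '$keyword'\n";
            exit;
        }
    }
    echo "$domain was not seen in the first 100 results for '$keyword'\n";
    ?>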

Paid Rank Checkers

If you need to track rankings in search engines outside of the core global ones (Google, Microsoft Bing, & Yahoo! Search), say Baidu or Yandex, then you might want to give Advanced Web Ranking a look. In addition to storing your rank data they store the entire top 50, allow you to create graphs with it, and provide many different useful reports that give you various looks at ranking trends amongst competing sites.

Server Logs & Tracking:

It is easy to get hung up on where you rank for a specific term, but it is far more important to try to rank for a diverse set of terms. See what terms are driving searchers to your site and track which of those terms are converting. From that data you can create more content around what converts or spruce up the top performing pages by adding a few modifiers or making the content more compelling to human visitors, and thus further increasing conversions.
