Don't Be Content Ranking #1 in Google

Some top ranking sites do not deserve their positions. Being lucky enough to be in that situation lets you get away with being lazy, because a site does not have to be very efficient to make money if it ranks well for targeted search queries that send free traffic. But every website has upside potential, even if it already ranks #1.

Improve Internal Navigation & Usability

One client of mine ranks just below the official manufacturer for that manufacturer's name. His site had inadequate internal navigation. I took a day to improve the navigation, and the result was a 150% increase in sales: the last 8 days of last month sold nearly as much as the rest of the month combined. His business model looked like it was about to die, but that one day of work made it viable for at least a couple more years.

High Profit Parallel Markets

I had a site that was well established in its market and made a couple hundred dollars a day, but did not dominate. Taking the path of least resistance, I branched the site into two parallel markets of greater commercial interest where the competition was weaker. For an investment of less than what the site now earns in a month, I was able to increase its income 5x without doing much link building.

An Undeserved Ranking

One of my friends is in a high profit market where the competition is absolutely clueless. Basic SEO brought his site to a #1 ranking in Google. The site is highly conversion oriented and makes great income, but now that it already ranks it probably makes sense to reinvest some of the profit into improving content quality and reinforcing that market position. Businesses that do not reinvest eventually fall, especially if they are winning only because the competition is clueless. Competitors spending a couple thousand dollars a day on AdWords will eventually start to look into SEO.

The Value of Branding

If a site ranks #1 and is monetized via PPC ads, it still might make only a fraction of what it should, because AdSense is not as efficient as some people would lead you to believe. If a site is strong enough to attract brand advertisers, they will pay a premium just to get their brand seen. Scraper sites and thin content sites don't attract brand advertisers, even if they convert. I have seen a site that was making $80 a month on AdSense make over $10,000 a month selling brand advertisements.

Automation & the Effectiveness Timeline of a Search Spam Technique

By the time people are looking to automate a no cost SEO technique, it is already dead as a competitive strategy. Blog spamming was once highly effective, but by the time commercial blog comment spam software became available, the practice had already stopped working in Google.

Automated Article Submission Software

At SMX Advanced a Yahoo! engineer noted that if they detect content as duplicate, they are less likely to trust it to seed crawling of other documents. People are pushing article submission software to submit articles to article directories, but if most of the content on an article directory is duplicate, marketers are spamming those directories with automated systems, and the content networks accept automated submissions, then obviously this is not going to be a clean and trusted part of the web that you can go back to again and again. It may be worth trying here or there for a bit of anchor text or other market testing, but it is probably not worth automating on a mass scale, especially if the site lacks important signs of quality.

Hundreds of Engineers Work to Kill Spamming Techniques

The spam detection and anti-spam algorithms are driven by people. If something is commonplace in a market then the search engines try their best to stop it. If they can automate it they will. If they have to demote it manually they will.

In the second video here Matt Cutts talked about how spam prevention methods may be different based on language, country, or even market...noting that many real estate sites rely too much on reciprocal link spam.

The less your site's marketing methods look like spam and the harder it is to duplicate what you have done the less likely you are to get hurt by the next update. By the time there is a mass market automated spamming solution the technique is already dead.

The Faults of Human Review

Danny Sullivan recently made a post highlighting the downside of human review for search engines:

[Tim] Mayer reminded that what's relevant for a query can often change over time. Google's Udi Manber, vice president of engineering, made similar remarks when I spoke with him about human-crafted results when I was visiting at Google yesterday.

One example he pointed out was how Google's human quality reviewers -- people that Google pays to provide a human double-check on the quality of its results, so they can then better tune the search algorithm -- started to downgrade results for [cars] when information about the movie Cars started turning up. The algorithm had picked up that the movie was important to that term before some of the human reviewers were aware of it.

Obviously human review is used at all major search engines, but even when reviews are outsourced, humans have limits, just as they do with producing content. Even if Google has 10,000 quality raters, those people can only be trained to find and rate certain things.

Get Paid in .edu Links to Post Help Wanted Ads

Joe Whyte offers tips on how to get free .edu links - just post ads in the help wanted sections of campus websites asking students to work for you.

Students = under-priced workers.
Free or cheap .edu links = under-priced links.
Nice

Starting From Scratch, on Under $100

SEOish offers 7 different views on how to become successful in the search game starting with next to nothing.

Information Architecture is the Most Underrated Component of Effective Search Marketing

Information architecture is probably the single most important and most under-rated aspect of the search marketing strategy for large websites.

A Recurring Error

I have been reviewing some client sites that could use work on the information architecture front. Some of them want to rank for keywords they do not actively target. The key to ranking is to create meaningful navigation schemes that reinforce the goals of your site and your target keyword phrases. In addition, a site that is improperly categorized due to poor internal navigation does not flow PageRank properly through the site, which means your ranking data and market feedback will be misleading and will not show your true potential.
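
To make the PageRank point concrete, here is a minimal sketch in Python using a simplified PageRank calculation over a hypothetical five-page site. The page names, link structures, and damping factor are made up for illustration, not taken from any real client.

    # Simplified PageRank sketch over a hypothetical five-page site.
    # Page names, link structures, and the damping factor are illustrative only.
    DAMPING = 0.85

    def pagerank(links, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
            for page, outlinks in links.items():
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += DAMPING * share
            rank = new_rank
        return rank

    # Poorly categorized: every page links only back to the home page, so the
    # "widgets" category page never accumulates authority.
    flat_site = {
        "home": ["widgets", "gallery", "about", "product-a"],
        "widgets": ["home"],
        "gallery": ["home"],
        "about": ["home"],
        "product-a": ["home"],
    }

    # Categorized: the product and gallery pages link up to their category page,
    # concentrating PageRank on the page meant to rank for category keywords.
    categorized_site = {
        "home": ["widgets", "about"],
        "widgets": ["home", "product-a", "gallery"],
        "gallery": ["widgets"],
        "about": ["home"],
        "product-a": ["widgets", "home"],
    }

    for name, site in (("flat", flat_site), ("categorized", categorized_site)):
        print(name, {p: round(r, 3) for p, r in sorted(pagerank(site).items())})

In this toy example the widgets category page ends up with more than double its share of PageRank in the categorized structure, which is exactly the kind of concentration you want on the pages meant to rank for category level keywords.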

Conversion oriented structure is a type of content. It is one of the biggest advantages smaller players have over large players that operate in many fields, and adds to the bottom line of any site that takes it seriously.

Compare the following...

What Happens to a Site With Bad Internal Navigation?

A piecemeal site with hacked-up internal navigation exhibits the following characteristics:

  • navigation is inconsistent and confusing, thus it is hard for spiders to know what pages are important and it is hard for humans to work their way through the conversion process

  • if the spiders do not rank the correct pages odds are pretty good that the visitors will come into the site on the wrong page (and have a hard time working their way through the conversion process if they start out at the wrong spot)
  • hard to buy broad keywords using PPC because competing sites are better at funneling visitors through the conversion process
  • hard to buy category level keywords using PPC because it is hard to place people on meaningful content if it does not exist; category pages should be more than a link list or groups of irrelevant chunks of content
  • what should be category pages do not add context, build trust, or build credibility - they are essentially placeholders gluing together various unrelated content
  • if you do not have well defined and reinforced category pages the site is not structured to rank for the mid level category related keywords
  • much of the site's PageRank is wasted on unimportant pages such as photo galleries or other low content pages
  • since PageRank is distributed improperly, the market feedback is largely irrelevant
  • has many similar pages that duplicate each other, and cleaning up the errors leads to broken links and other problems
  • the site is hard to grow or market because as your category gets more competitive and efficient you first have to restructure the site and undo the errors before you can compete

What Are the Benefits of Good Navigation?

A site with strong internal navigation exhibits the following characteristics:

  • properly flows PageRank throughout the site

  • search engines are likely to rank the most relevant page
  • easier to convert
  • is easy for users to move around
  • builds user trust
  • more likely to be referenced in a positive light than a site with broken navigation (gets free editorial links)
  • converts better, so it can afford to pay a higher lead price for traffic (and thus maintain market leadership even as the market gets more competitive)
  • category pages add context and target different relevant word sets than lower level pages
  • folder and filenames are logical so they aid relevancy and clickthrough rate and the site is easy to build out / extend
  • if you ever make errors they are typically far easier to correct
  • easy to promote seasonal specials or currently hot items

Many website owners with unorganized websites think that they just need more of the same, but in a game of market efficiency sometimes less is more, especially if it is better organized.

Google Trends Adds Hot Keywords

Google Trends now shows the top 100 fastest growing keywords by date. Each keyword shows peak search time, search profile by hour, related keywords, top web results, news results, and blog results.

Loooooooooooooooooong Tail Keywords

Eric Enge noted that at the Searchology event Google's Udi Manber stated that 20 to 25% of the queries Google sees on any given day are queries they have never seen before.

Helpful Robots.txt Tip

When creating a robots.txt file, if you specify something for all bots (using a *) and then later specify something for a specific bot (like Googlebot) then search engines tend to ignore the broad rules and follow the ones you defined specifically for them. From Matt Cutts:

If there's a weak specification and a specific specification for Googlebot, we'll go with the one for Googlebot. If you include specific directions for Googlebot and also want Googlebot to obey the "generic" directives, you'd need to include allows/disallows from the generic section in the Googlebot section.

I believe most/all search engines interpret robots.txt this way--a more specific directive takes precedence over a weaker one.
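
As an illustration, a robots.txt along these lines (the /tmp/ and /beta/ paths are made up for the example) behaves the way Matt describes: Googlebot matches the more specific section and skips the generic one, so any generic rule you also want Googlebot to obey has to be repeated there.

    # Generic section: applies to crawlers that have no more specific match
    User-agent: *
    Disallow: /tmp/

    # Googlebot follows only this section and ignores the generic one above,
    # so /tmp/ is repeated here, and /beta/ is blocked for Googlebot only
    User-agent: Googlebot
    Disallow: /tmp/
    Disallow: /beta/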

Market Timing and Search Results

You can learn a lot about how the search results will change based on recent changes that have been made and by seeing what tests the engines are running. As SEOs we track the algorithms quite intensively, but the search result display is just as important. Google allows webmasters to see what search tests they are currently performing via Google Experimental Search.
