Exerting Influence & Moving Markets

There are two basic ways to do SEO. One is to look at the criteria you think the search engines want to see and slowly build toward them day after day, chipping away at keyword research and picking up good links one at a time. If you understand what the search engines are looking for this is still readily possible in most markets, but with each passing day it gets harder.

The other way to do SEO is to move markets. When I interviewed Bob Massa, his words "search engines follow people" stuck in my head. So what does it mean to move markets? People are using the word linkarati. It wasn't a word until recently. Rand made it up. As that word spreads, his brand equity, market position, and link authority all improve. Does that make Rand an SEO expert or a person good with words? Probably both, as far as the engines and the public are concerned.

I have seen friends get free homepage links from businesses that are making tens of millions of dollars in profit per year. I have had Fortune 500 companies contact me with free co-branding offers for new sites. I have come up with content ideas that naturally made it to the #1 position on Netscape and stuck there for 20+ hours straight. I still fail often and have a lot to learn, but I do know this: if you are the featured content on most of the sites in your field then YOU are relevant, and search engines will pick up on it unless their algorithms are broken.

When I was new to SEO I did much more block-and-tackle SEO. I had to, because I had limited knowledge, no trust, no leverage, no money, and was a bad writer. The little things mattered a lot. They had to. As I have learned more about the web I have tried to transition into the second mode of marketing. Neither method is right or wrong, and each works better for different people at different stages, but as more people come online I think the second path is easier, safer, more stable, more profitable, and more rewarding.

If you are empathetic towards a market and have interests aligned with a market you do not need to understand exactly how search engines work. Search engines follow people.

It is still worth doing the little things right so that when the big things hit you are as efficient as possible, but if you can mix research, active marketing, and reactive marketing into your site strategy you will be more successful than you would be if you ignored one of them.

Different Links Have Different Goals

WMW has a good thread about some of the changes people are noticing at Google. Two big things are happening: more and more pages are getting thrown into Google's supplemental results, and Google may be getting more aggressive with re-ranking results based on local inter-connectivity and other quality related criteria. You need some types of links to have enough raw PageRank to keep most of your pages indexed, and to have your deeper pages included in the final selection set for long tail search results. You need links from trusted related sites in order to get a boost in result re-ranking.

There are also a few other types of links to look at, if you wanted to take a more holistic view:

  • links from general trusted seed sites

  • links that drive sales
  • links that lead to additional trusted links
  • links that gain you mindshare or subscribers

Some of those other links may not even be traditional links, but may come from a well placed ad buy.

Every unbranded site is heavily unbalanced in its link profile. If you do not have a strong brand then the key people in your community who should be talking about you are not (and thus you are lacking those links).

Most branded sites do not create enough content or do enough keyword research to fully leverage their brand strength, but occasionally you see some of them get a bit too aggressive.

Benchmarking Information Quality

Wikipedia ranks #2 for Aaron right now. They also rank for millions of other queries. They don't rank because their information is of great quality; they rank because everything else is so bad. About.com was once considered best of breed, but scaling content costs and profitability is hard. Google doesn't hate thin affiliate sites because they are bad. They only hate them because the same thing already exists elsewhere. Search engines try to benchmark information quality and create a structure which encourages the creation and open sharing of higher quality content. When you see poor sites at the top of search results, view it as a sign of opportunity. Realize that whatever ranks today is probably not what search engines want, but it is what is considered best given the lack of structure on the web and how poor most websites are.

The Ups and Downs of Socializing Your Content

There are many ups and downs to adding a user generated content section to a site. It has been interesting watching the effects of SEOMoz's user generated content and points systems. The ups:

  • users feel they are part of the brand.

  • they are more likely to push the brand and link to the site
  • points cost nothing to create but give some perception of value
  • users create free content for the site even when you are not doing so.
  • some of their content will rank in search results. today I did a search for search engine marketing and saw Google listing a link to recent blog posts that included this post
  • contributors might give you good marketing ideas or help you catch important trends before competitors do

The downs:

  • people who spend lots of time contributing tend not to value their time too much AND are hard to profit from (especially in savvy marketplaces that ignore ads).

  • having many relationships allows you to be a connector that knows someone for just about any job, but focusing heavily on building community and maintaining the many relationships needed to do so may hold you down on the value chain. A few strong relationships will likely create more value than many weak ones, especially as we run into scale related issues.
  • if your site is not authoritative, user generated content may waste your link authority and lead to keyword cannibalization
  • if your site is authoritative many people will look for ways to leverage your domain or authority
  • as you get more authoritative more people will try to exploit it. even friends get aggressive with it, and unless you call people out it gets out of control quickly.
  • as you extend your commitments, spending time to police a site, it is harder to change course. I get frustrated when I see spam on the homepage of ThreadWatch, but I guess I can't be surprised people do it, and due to database issues I am uncertain if I will be able to upgrade TW without just archiving the old information and switching to a new CMS.
  • some people looking to promote their work may spam or aggressively associate your brand with the articles they wrote. For example, is this comment spam? Or is it good?

If a relationship is affiliate based it is quite easy to police undesirable activity by banning accounts, but if people are adding content to your site and marketing it aggressively in ways that may not reflect well on your brand it can be harder to police, especially as you scale your community. And typically the people who are most likely to give you crap for it are hypocritical in their beliefs.

I think on the whole a community section is a pretty good idea if you tie it into a paid content model, but even when you do that you will still run into scale issues if you provide any type of support for the paid content. I have over 600 emails in my inbox, and recently stopped advertising free consulting with an ebook purchase because I stopped scaling as a person. As your profits scale, the opportunity cost of any one revenue channel becomes more apparent. That is one of the things which has prevented me from putting a forum or community section on this site.

Filters vs Penalties & Optimization vs Over-Optimization

Jill Whalen recently posted to SEL about how duplicate content penalties are not penalties, but filters.

On a small scale, duplicate content does not hurt you (other than perhaps wasting some of your link authority), but if you duplicate on a large scale (an affiliate feed or similar) then it may suck a bunch of link equity out of your site, put your site in reduced crawling status, and / or place many of your pages in Google's supplemental results. Jill's article mentioned the difference between penalties and filters:

Search engine penalties are reserved for pages and sites that are purposely attempting to trick the search engines in one form or another. Penalties can be meted out algorithmically when obvious deceptions exist on a page, or they can be personally handed out by a search engineer who discovers an infraction through spam reports and other means. To many people's surprise, penalties rarely happen to the average website. Most that receive a penalty know exactly what they did to deserve it.

From a search engineer's perspective, the line between optimization and deception is thin and curvy. Because that is the case it is much easier for Google to be aggressive with filters while being much more restrictive with penalties.

From my recent experience, most sites that lost rankings did so due to filters, and most site owners that got filtered have no idea why. If you were aggressively auto-generating sites your experience set might be different (biased more toward penalties than filters), but here are examples of some filters I have seen:

  • Duplicate Content: This filter doesn't matter for much of anything. Only one copy of a syndicated article should rank in the search results; if the other copies do not rank, who cares? Even though duplicate pages are filtered out of the search results at query time, they still pass link authority, so the idea that you need to remix articles to get them to pass link authority is a marketing scam.

  • Reciprocal Linking: Natural, quality nepotistic links are not bad (they are part of a natural community), but relying on them exclusively, or letting them comprise most of your link authority, is an unnatural pattern. A friend's site that was in a poor community had its rankings sharply increase after we removed its reciprocal link page.
  • Limited Authority & Noise: A site which has most of its pages in the supplemental results can bring many of them out by ensuring the page level content is unique, preventing low value pages from getting indexed, and building link authority.
  • Over-Optimization Filter: I had two pages on a site ranking for two commercially viable two-word phrases. Both of them were linked to sitewide using a single word that was part of the two-word phrases. Being aggressive, I switched both sitewide links to use the exact phrases in the internal anchor text. One of the pages now ranks #1 in Google, while the other page got filtered. I will leave the #1 ranking page as is, but for the other page I changed the internal anchor text to something that does not exactly match the keyword phrase. After Google re-caches the site, the filtered page will pop back to ranking near the top of the results.

The difference between a penalty and a filter is the ability to recover quickly if you understand what is wrong. The reason tracking changes is so important is that it helps you understand why a page may be filtered.

How can you be certain that a page is filtered? Here are some common symptoms or clues which may be present:

  • many forum members are complaining about similar sites or page types getting filtered or penalized (although it is tricky to find signal amongst the noise)

  • reading blogs and talking to friends about algorithm updates (much better signal to noise ratio)
  • seeing pages or sites similar to yours that were ranking in the search results that also had their rankings drop
  • knowing that you just did something aggressive that may make the page too well aligned with a keyword
  • seeing an inferior page on your site ranking while the more authoritative page from the same site is nowhere

Near Identical Articles for Content Syndication & Link Building?

People have asked my thoughts on content remixing and syndication. It is an ineffective approach to marketing.

There is enough content on the web, which is why Google is getting selective with their index. The problem with ineffective content is not that it needs to be mixed up and syndicated. A site syndicating watered down, vanilla, remixed content has too much content for its link authority, and most of its pages are doomed to Google's supplemental results. Lots of content and little link authority means remixing and syndicating is NOT the answer. What is the solution?

Rather than syndicate garbage, create things people would want to talk about and link at.

Group Interview on Links - Without Group Think

Rae recently posted a five person interview about link building that is well worth a read. Each of the five experts answers the same set of questions without seeing the others' answers until after the interview.

Common Internal Site Structure Issues

I recently spoke to a friend about some of his internal site structure errors and figured it would be worth sharing some of the better tips I gave him with readers here.

Canonical URL Issues:

Make sure search engines are seeing mysite.com and www.mysite.com as the same site. If they are not, 301 redirect the less popular version to the more popular version.
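
If your site runs on Apache with mod_rewrite enabled, a rule along these lines in .htaccess is one common way to do it (a rough sketch, not a drop-in snippet; mysite.com is a placeholder and the www version is assumed to be the more popular one):

  # 301 redirect the bare domain to the www version
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
  RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]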

Flat Site Structure:

In an ideal case your internal site structure would not be the same for every page, especially if your site has distinct sections.

  • Create section related navigation that promotes other offers inside that section of your site, without heavily crossing over to other sections.

  • Actively guide users from within the content area of your site. These links will drive conversions and help funnel PageRank through your site.
  • Highlight featured content.

Many content management systems highlight recent content without placing much emphasis on your featured content. If you have important content make sure it is easy to access. Also use your site statistics to place more link weight on your most popular or most profitable content.

Content Duplication / Limited PageRank / Google's Supplemental Results

Not too long ago I wrote a post about how to check your number of supplemental pages and another about getting a site out of the supplemental index.

There are a near endless number of ways a site can waste link authority:

  • printer friendly pages

  • individual post pages in forums
  • archive vs active content forum threads
  • endless cross referencing from heavy internal tagging and user generated tags
  • other cross referencing content sections that create thousands of thin content pages

If you have thin content portions of a site or duplicate pages, get rid of them or use robots.txt to prevent them from getting indexed.
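
As a rough sketch, a robots.txt along these lines would keep those sections from being crawled (the /print/ and /tag/ paths are hypothetical examples; use whatever paths your CMS actually generates for printer friendly pages, tag pages, and other thin content):

  # block thin and duplicate sections from all crawlers
  User-agent: *
  Disallow: /print/
  Disallow: /tag/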

If you have more pages than link equity you need to build links, but another thing you can do short term is publish more content per page and structure your internal links to place more link weight on your most important pages.

Two more things worth considering here are to limit template related duplication, and temporarily publish fewer pages until you build your link authority and clean up the supplemental index issues.

Sitewide Outbound Links:

Minimizing your number of sitewide outbound links will keep more of your link equity flowing internally. For many sites it does still make sense to link out to resources sitewide or to sell links. If you are selling links, try to sell fewer links at a higher price point. That will improve your internal to external link ratio, hold your PageRank up higher, and allow you to continue to charge higher rates.

Internal to External Link Ratio:

Make sure you have many internal links on each page. If you do not have many, perhaps you can duplicate your header navigation in your site footer.

Isolate Noisy Pieces of Your Site:

One last consideration is to isolate the noisy pieces of your site. Use subdomains to divide your content by content types. For example, if you have a great blog and add a forum to it you are probably best off placing the forum on a subdomain.

A Proprietary Web Graph: Is Google Paying You to Edit Their Search Results?

Google works so well because they are scalable, but they are not averse to paying people to review content quality, because they love human computation; just look at their Google Image Labeler game. What if Google came up with ways to determine which users were real and trustworthy, and gave those users an incentive to edit the search results for Google? And what if Google could give a similar incentive to advertisers and legitimate publishers?

What if just by reading this you are helping Google trust this site more?

Attention Data:

Google is already the market leader in tracking attention data on the active portions of the web. What if Google integrated attention data into their algorithm and, to offset that, decided to lower the quantity of links they count in any time period or the weight they place on them? What would that do to the value of link baiting? How can Google move away from links?

WebmasterWorld and Threadwatch both recently had great posts about a recent Google patent application about removing documents based on the actions of trusted users.

Google's Own Web Graph:

Google is setting up an alternate, proprietary web graph outside of linkage data. Sure, any single point of attention data may be gamed, but the data points are likely far more reliable when you triangulate them. And if a few data points fall outside of the expected ranges for the associated site profile, Google can pay to have the data reviewed, and based on that review demote spam or further refine the relevancy algorithms.

A Complete Feedback Cycle:

Google is the perfect shopping mall. Google...

  • verifies the legitimacy of user accounts by looking at their history, comparing them to other users, and challenging them with captchas.

  • tracks click-through rates and user activity, associated with user accounts and IP addresses.
  • hosts a bunch of web content, which is syndicated on many channels, and can further be used to understand the topical interests of the account holders and reach of publishers.
  • asks for searcher feedback on advertisements
  • allows people to note URLs using Google notebooks
  • tracks feedback after conversions
  • puts themselves in the middle of transactions with Google Checkout

Why wouldn't they be open to using those and other forms of feedback to help shape relevancy?

Opening up AdWords to display content partner URLs is probably a pretty good example of them adding an advertiser feature that can also improve organic relevancy scores. If advertisers think a site is garbage and Google knows that most of that site's traffic only comes from search engines, it would be pretty easy to demote that site. AdSense publishers have also created blacklists.

Popularity vs Personal Relevance:

Google can triangulate all these data points to see beyond how much hype any idea creates. They can understand user satisfaction and brand loyalty, which are far more important than short term hype.

If 100 searchers with somewhat similar profiles to mine are loyal to brands x, y, and z then I am going to see them more often, even if those sites are not well integrated into the web as a whole.

Digital Identities:

Bill Slawski recently posted about Google's Agent Rank patent, and there is a push to create a distributed login system called OpenID. Google may not need a single login to track everyone. All they have to do is get a large representative sample of legitimate web users to get a pretty good idea of what is important.

As Bob Massa said, search engines follow people.

Personalization is not going to be what makes SEO harder. It is going to be linguistic profiling, attention profiling, community interaction, and layered user feedback that make it harder to promote garbage and easier to promote quality sites. I still see spammy link building working today, but I don't see it staying that way 2 years out.

Google AdWords to Show Contextual Ad Location URLs

Jen noticed that Google's Kim Malone announced that in the next couple of months AdWords will start displaying content targeted ad locations.

Google AdSense pays most publishers crumbs for their ad space. People who are running AdSense ads are willing to sell ads. And sites that have AdSense ads on them are probably actively managed.

Is there a better way to get a list of relevant pages to acquire links from than to run a content targeted AdSense ad campaign and ping those webmasters?
