Full Text of Google's General Guidelines for Remote Quality Raters from April 2007

Mar 14th

I was not going to leak the document publicly until others did and it was cited by other popular sources, but given that SEL blogged about the remote quality rater document, now is a fine time to mention the best weekend reading any SEO could wish for... here is the 43-page confidential Google document in PDF format.

Make sure you download a local archive in case either or both go offline. And if you don't know why the document is so important for SEOs, read my post on spying on Google.

Search is About Communication

Jan 6th

Making Untrustworthy Data Trustworthy:

In social networks there tends to be an echo chamber effect. Stories grow broader, wider, and more important as people share them. Tagging and blog citation are inevitably going to help push some stories where they don't belong. Spam will also push other stories.

RSS, Wikipedia, government content, press releases, and artful content remixing mean automated content generation is easy. Some people are going so far as to try to automate ad generation, while everyone and their dog wants to leverage a publishing network.

What is considered worthwhile data will change over time. When search engines rely too heavily on any one data source it gets abused, and so they have to look for other data sources.

Search Engines Use Human Reviewers:

When John Battelle wrote The Search he stated:

Yahoo is far more willing to have overt editorial and commercial agendas, and to let humans intervene in search results so as to create media that supports those agendas…. Google sees the problem as one that can be solved mainly through technology–clever algorithms and sheer computational horsepower will prevail. Humans enter the search picture only when algorithms fail–and then only grudgingly.

Matt Cutts reviewed the book, stating:

A couple years ago I might have agreed with that, but now I think Google is more open to approaches that are scalable and robust if they make our results more relevant. Maybe I’ll talk about that in a future post.

Matt also states that humans review sites for spam:

If there’s an algorithmic reason why your site isn’t doing well, you can definitely still come back if you change the underlying cause. If a site has been manually reviewed and has been penalized, those penalties do time out eventually, but the time-out period can be very long. It doesn’t hurt your site to do a reinclusion request if you’re not sure what’s wrong or if you’ve checked carefully and can’t find anything wrong.

and recently it has become well known that they outsource bits of the random query evaluation and spam recognition process.

Other search engines have long used human editors. When Ask originally came out it tried to pair questions with editorial answers.

Yahoo! has been using editors for a long time. Sometimes in your server logs you may get referrers like http://corp.yahoo.com/project/health-blogs/keepers. Some of the engines Yahoo! bought out were also well known to use editors.

Editors don't scale as well as technology though, so eventually search engines will place more reliance upon how we consume and share data.

Ultimately Search is About Communication:

Many of the major search and internet related companies are looking toward communication to help solve their problems. They make bank off the network effect by being the network or being able to leverage network knowledge better than the other companies.

  • eBay
    • has user feedback ratings
    • product reviews at reviews.ebay.com
    • bought Shopping.com
    • bought PayPal
    • bought Skype
  • Yahoo!
    • partnered with DSL providers
    • bought Konfabulator
    • bought Flickr
    • My Yahoo! lets users save or block sites & subscribe to RSS feeds
    • offers social search, allowing users to share their tagged sites
    • bought Del.icio.us
    • has Yahoo! 360 blog network
    • has an instant messenger
    • has Yahoo! groups
    • offers email
    • has a bunch of APIs
    • has a ton of content they can use for improved behavioral targeting
    • pushes their toolbar hard
  • Google
    • may be looking to build a Wi-Fi network
    • has toolbars on millions of desktops and partners with software and hardware companies for further distribution
    • bought Blogger & Picasa
    • alters search results based on search history
    • allows users to block pages or sites
    • has Orkut
    • has an instant messenger with voice
    • has Google groups
    • Google Base
    • offers email
    • AdWords / AdSense / Urchin allows Google to track even more visitors than the Google Toolbar alone allows
    • Google wallet payment system to come
    • has a bunch of APIs allowing others to search
    • search history allows tagging
  • MSN
    • operating system
    • browser with integrated search coming soon
    • may have been looking to buy a part of AOL
    • offers email
    • has an instant messenger
    • Start.com RSS aggregation
    • starting their own paid search and contextual ad program based on user demographics
    • has a bunch of APIs
  • AOL
    • AIM
    • AOL Hot 100 searches
    • leverage their equity to partner with Google for further distribution
  • Ask
    • My Ask
    • Bloglines
  • Amazon
    • collects user feedback
    • offers a recommendation engine
    • allows people to create & share lists of related products
    • lists friend network
    • finds statistically improbable phrases from a huge corpus of text (see the sketch after this list)
    • allows users to tag A9 search results & save stuff with their search history
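
Amazon has never published exactly how its statistically improbable phrase detection works, but the core idea is easy to sketch: a phrase is interesting when it shows up far more often in one text than a large background corpus would predict. Here is a minimal Python sketch, assuming simple bigrams and naive add-one smoothing (both my assumptions, not Amazon's actual method):

```python
from collections import Counter

def bigrams(text):
    """Break text into lowercase two-word phrases."""
    words = text.lower().split()
    return [" ".join(pair) for pair in zip(words, words[1:])]

def improbable_phrases(document, corpus, top_n=5):
    """Rank phrases by how much more often they appear in the
    document than the background corpus would predict."""
    doc = Counter(bigrams(document))
    background = Counter(bigrams(corpus))
    doc_total = sum(doc.values()) or 1
    background_total = sum(background.values()) or 1

    def score(phrase):
        observed = doc[phrase] / doc_total
        # Add-one smoothing: phrases the corpus has never seen score highest.
        expected = (background[phrase] + 1) / background_total
        return observed / expected

    return sorted(doc, key=score, reverse=True)[:top_n]
```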

Even if search engines do not directly use any of the information from the social sharing and tagging networks, the fact that people are sharing and recommending certain sites will carry over into the other communication mechanisms that the search engines do track.

Things Hurting Boring Static Sites Without Personality:

What happens when Google has millions of books in their digital library, and has enough coverage and publisher participation to prominently place the books in the search results? Will obscure static websites even get found amongst the billions of pages of additional content?

What happens when somebody comment spams (or does some other type of spam) on your behalf to try to destroy your site rankings? If people do not know and trust you it is going to be a lot harder to get back into the search indexes. Some will go so far as to create hate sites or blog-spam key people.

What happens when automated content reads well enough to pass the Turing test? Will people become more skeptical about what they purchase? Will they be more cautious with what they are willing to link at? Will search engines have to rely more on how ideas are spreading to determine what content they can trust?

Marginalizing Effects on Static Content Production:

As the web userbase expands, more people publish (even my mom is a blogger), and ad networks become more efficient, people will be able to make a living off smaller and smaller niche topics.

As duplicate content filters improve, search engines gather more user feedback, and more quality content is created, boring static merchant sites will be forced out of the search results. Those who get others talking about them by giving away information will be better able to sell products and information.

To search engines, good content that no one else cares about simply is not good content.

Image showing marginalizing effects on the profitability of publishing boring static sites.

Moving from Trusting Age to Trusting Newsworthiness:

Most static sites like boring general directories or other sites that are not so amazing that people are willing to cite them will lose market share and profitability as search engines learn how to trust new feedback mechanisms more.

Currently you can buy old sites with great authority scores and leverage that authority right to the top of Google's search results. Eventually it will not be that easy.

The trust mechanisms that the search engines use are easy to defeat, and they matter less if your site has direct type-in traffic, subscribers, and people frequently talking about you.

Cite this Post or Subscribe to this Site:

Some people believe that every post needs to get new links or new subscribers. I think that posting explicitly with that intent may create a bit of an artificial channel, but it is a good base guideline for the types of posts that work well.

The key is that if you have enough interesting posts that people like enough to reference, then you can mix in a few other posts that are not as great but are maybe more profit oriented. Aim to typically post stuff that adds value to the feed for many subscribers, or post things that interest you.

Many times just by having a post that is original you can end up starting a great conversation. I recently started posting Q and As on my blog. I thought I was maybe adding noise to my channel, but my sales have doubled, a bunch of sites linked to my Q and As, and I have gotten nothing but positive feedback. So don't be afraid to test stuff.

You wouldn't believe how many people posted about Andy Hagans' post about making the SEO B list. Why was that post citation worthy? It was original, and people love to read about themselves.

At the end of the day it is all about how many legitimate reasons you can create for a person to subscribe to your site or recommend it to a friend.

Man vs Machine:

Inevitably the search algorithms will evolve to the point where, for most webmasters, it is easier and cheaper to manipulate human emotion than to manipulate the search algorithms directly. Using a dynamic publishing format which reminds people to come back and read again makes it easier to build the relationships necessary to succeed. To quote a friend:

This is what I think, SEO is all about emotions, all about human interaction.

People, search engineers even, try and force it into a numbers box. Numbers, math and formulas are for people not smart enough to think in concepts.

Disclaimer:

All articles are written to express an opinion or prove a point (or to give the writer an excuse to try to make money - although saying that SEO is becoming more about traditional public relations probably does not help me sell more SEO Books).

In some less competitive industries dynamic sites may not be necessary, but if you want to do well on the web long term most people would do well to have at least one dynamic site where they can converse and show their human nature.

Earlier articles in this series:

Why Bloggers Hate SEOs
Why SEOs Should Love Bloggers
Dynamic Sites & Social Feedback
Controlling Data & Helping Consumers Make it Smarter
Small vs Big & Voice in Brand
Syndication and How News Spreads
Trending and Tracking the Blogosphere and Newsosphere

Trending and Tracking the Blogosphere and Newsosphere

Jan 6th

Feedback Loops:

Most searches occur at the main search sites and portals (Google, Yahoo!, MSN, AOL, etc.), but some people also search for temporal information, looking to find what is hot right now, or seeing how ideas spread. Not everyone can afford WebFountain, but we can all track what people are searching for or how stories are spreading using:

Feed Readers:
Subscribe to your favorite channels (or topical RSS feeds from news sites)

Blog Search:
Search for recent news posted on blogs

Blog Buzz Index:
Search for stories rapidly propagating through blogs

General Buzz & Search Volume:

Product Feedback:

News Search:

Test Ad Accounts & Test Media:

  • Google AdWords
  • Yahoo! Search Marketing
  • write press releases and submit them cheaply to see how much buzz & news search volume there is around a topic, using sites like PR Web or PR Leap
  • post on a topic
    • see if it spreads
    • check referrer data (see the sketch after this list)
    • Sometimes stories emerge out of the comments. The Save Jeeves meme originated around the time the person who created that story commented on my post about Jeeves getting axed.
    • Don't forget to have friends tag your story on Del.icio.us and submit it to Digg.
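
You do not need a special tool for the "check referrer data" step above; your raw access logs already record the referring page of every visit. Here is a minimal sketch, assuming the common Apache/NCSA combined log format, where the referrer is the second quoted field:

```python
from collections import Counter
from urllib.parse import urlparse

def top_referrers(log_path, top_n=20):
    """Count which domains are sending visitors, from an access log."""
    domains = Counter()
    with open(log_path) as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 6:
                continue  # skip malformed lines
            referrer = parts[3]  # second quoted field in combined format
            host = urlparse(referrer).netloc
            if host:  # direct visits log "-" and parse to no host
                domains[host] += 1
    return domains.most_common(top_n)

print(top_referrers("access.log"))
```

A story that is spreading shows up here as a steady stream of new blog domains rather than one spike from a single source.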

Tagging:
Some are busy tagging what information they think is useful.

  • Delicious - personal bookmark manager.
  • Wink - tag search
  • Flickr - image tagging & hottest tags
  • Tag Cloud - shows graphic version of hot tags
  • Furl
  • Technorati Tags
  • Digg Top Stories
  • Reddit
  • Ning
  • Squidoo
  • My Yahoo!
  • Google Search History (you can't see what others are tagging, but I bet it eventually will influence the search results - Google is already allowing people to share feeds they read)
  • more tagging sites come out daily...lots of others exist, like Edgio, StumbleUpon, Shadows, Kaboodle, etc etc etc
  • also look at the stuff listed in Google Base...there may or may not be much competition there, and Google Base is going to be huge.

Track Individual Stories and Conversations & Trends of a Blog:

Bloggers typically cite the original source OR the person who does the most complete follow up.

Blog Trends:
See if a blog is gaining or losing market share, and compare blogs to one another

Overall Most Popular Blogs and Stories:

Did I miss anything? I am sure I did. Please comment below.

Here are earlier stories from this series:

Why Bloggers Hate SEOs
Why SEOs Should Love Bloggers
Dynamic Sites & Social Feedback
Controlling Data & Helping Consumers Make it Smarter
Small vs Big & Voice in Brand
Syndication and How News Spreads

Syndication and How News Spreads

Jan 6th

A while ago I started publishing bits of an article that I intended to finish quickly, but life slowed me down. Here were the first parts:

Why Bloggers Hate SEOs
Why SEOs Should Love Bloggers
Dynamic Sites and Social Feedback
Controlling Data and Helping Consumers Make it Smarter
Small vs Big and Voice in Brand

I am going to see if I can finish up the article today. Here is the next piece:

How News Spreads:

News has to start from somewhere. It doesn't really matter if it comes from blogs or traditional media. A few things that are important with both publishing formats are:

  • both have incentives to get the scoop or report on stories early
  • both have audiences who can further spread your message
  • both are fairly viral
  • both have lots of legit link popularity
  • getting viral marketing via blogs or news coverage is something that most people will not be able to replicate

Eventually if the story spreads the feedback network becomes the next round of news. If one or two well known reporters write your story other journalists and bloggers may feel like they are missing out if they do not cover it.

The story about me getting sued was picked up by another blogger, then BusinessWeek, then the WSJ. A few hundred blog citations followed. Sometimes news that goes a bit national comes back local, and even then you get bonus links. A Pittsburgh paper mentioned I was sued. That story was syndicated in a Detroit paper, and even got a mention in the blog of the local paper.

Newspapers love to syndicate content from each other to lower costs. Sometimes they even syndicate things that don't make sense because they need filler to surround their ads. I have even seen an Arizona column featuring local Rhode Island bloggers.

Small vs Big & Voice in Brand

Oct 29th

Part 5 of an ongoing series...
read parts 1, 2, 3, and 4:

  1. Why Bloggers Hate SEOs
  2. Why SEOs Should Love Bloggers
  3. Dynamic Sites & Social Feedback
  4. Controlling Data and Helping Consumers Make it Smarter

Can Individuals & Small Sites Compete With Big Ones?

Some people think individuals can't compete with large corporations. The numbers prove otherwise.

When I was recently sued, many sites linked through to my site referencing the lawsuit. The first day traffic volume from some of the leading referrers was:

  • Slashdot ~ 7,500
  • Wall Street Journal ~ 6,000
  • Atrios.blogspot.com ~ 6,000
  • News.com ~ 50

An individually written blogspot blog sent me nearly as much traffic as the Wall Street Journal did, and sent far more than most media sites did. Keep in mind that around 100 or so bloggers linked into the WSJ article, so the average blog post on Atrios.blogspot.com likely gets more online readers than most WSJ online articles do.

Working Alone:

If people like your biases or the way you present the news they will send you stories as well. As you develop trusted and trusting readers, even individuals do not end up working alone. Many people will send you tips about the news they uncover. Over time those relationships develop, you learn whom to trust, and if your channel becomes profitable enough you may even be able to hire one or two of your favorite researchers.

Should I Have Said That:

Being the first person with the news is also an easy way to get links. Sometimes misinterpreting a story, not fully analyzing it, or just going with gut instinct can also help uncover things that might otherwise have gone unnoticed.

Some people are afraid to blog because they think "I am not sure if I should have said that." In many cases when I write on the web I write it like... should I have said that? Hmm... if I was wrong someone will hopefully tell me, or it might get links or comments.

That's the whole point of feedback: to learn from it. The more authentic your voice sounds the better it will be received.

Those who write the rules write them to keep themselves in power. The advantages of being new & small are:

  • You can move quickly, changing your business model or adding multiple new channels each day.
  • If you make errors people may be more tolerant of them if they do not think of you as a professional or do not realize your reach.
  • If you are brand new you may not have much to lose if you break a few rules.
  • Sometimes hidden stories come out when we make mistakes.
  • Controversy is typically surrounded by links. Sometimes being wrong is more valuable than being right.

Most anything that may hurt your credibility in the eyes of some may help you in the eyes of others.

Niche & Bias:

Being small means lower overhead and you can focus more on a specific market. The tighter your niche the easier it is to carve out a market position. The same may be true for the way you bias stories.

If you look at the political sphere, the most prominent blogs are typically ones that lean far in one direction or another. If you fake a position it will eventually sound shifty and the truth will wash out, but if you are genuinely biased, or even a bit broken, that can lead to added profitability or authority on the web.

Controlling Data & Helping Consumers Make it Smarter

Oct 18th

Part 4 of my recent ongoing article...

Dynamic Site Advantages:

If you use a weblog or any other type of dynamic site, as content ages you create a large quantity of pages which can rank for a variety of terms in many engines. The site archive systems mean that posts not only get their own pages, but are also organized by date and category. This creates what is considered to be a legitimate keyword driftnet content bank.
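
To make the "large quantity of pages" point concrete, here is a toy sketch of the URLs a typical blog CMS mints around a single post. The exact paths are generic assumptions for illustration; every platform differs:

```python
def pages_for_post(slug, date, categories):
    """List the indexable pages one post typically feeds: its own
    permalink, a monthly archive page, and a page per category."""
    year, month = date.split("-")[:2]
    pages = [f"/{year}/{month}/{slug}/",  # the post's own page
             f"/{year}/{month}/"]         # date-based archive
    pages += [f"/category/{c}/" for c in categories]
    return pages

# One post feeds four pages, each another search lottery ticket.
print(pages_for_post("dynamic-sites", "2005-10-14", ["seo", "blogs"]))
```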

People can also subscribe to the feeds to remind themselves when to come back and read your new information. Many people who read feeds also write sites with feeds, and can provide you with additional link popularity and another channel to acquire readers from.

Most people who subscribe to what you have to say will usually be people who agree with many of your points. This means that when they talk about you or mention your site you are:

  • likely to be presented to additional like minded people with similar biases to your own
  • in a positive manner
  • from a voice readers likely trust.

If people disagree with you and still subscribe to your feed then there is a great chance they will frequently want to say how wrong you are, maybe even linking through to your site.

Ultra Targeted Content:

Not all ideas need a whole article to explain them. Publishing your thoughts one topic per post makes it easier for you to refer back to your own content in the future. It also makes it easier for others to point at / link to / reference it.

Ultra targeted content will also stand a good chance of ranking high for its keyword theme since it is so well targeted.

Consumer Feedback & Product Catalogs:

For a long time creating pages by keyword phrase permutation was a functional SEO strategy, but Google does not want to display hollow product databases in their regular search results. Creating industrial-strength spam works well for some, but as time passes the hollow databases need to get better at remixing sources and integrating user data.
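
For anyone who never saw those keyword-permutation page factories, the mechanics were as blunt as they sound: cross a list of modifiers with a list of product terms and mint a page per combination. A toy sketch (the example keywords are my own):

```python
from itertools import product

# Cross every modifier with every product term, one hollow page each.
modifiers = ["cheap", "discount", "wholesale"]
products = ["laptops", "laptop batteries", "laptop bags"]

for modifier, term in product(modifiers, products):
    print(f"{modifier} {term}")  # 3 x 3 = 9 near-duplicate pages
```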

If there is commercial value in a term, Google believes Froogle & AdWords work well. It seems to be almost a yearly process that Google dials up the rankings of authority sites right around the Christmas shopping season. This forces merchants to buy into the vertical shopping sites, buy AdWords, or spend Christmas out in the cold.

Allowing user feedback and interaction makes your content more original than most competing sites. It also adds value to the consumer experience & makes it easier to link at your site, both of which make Google far more likely to want to include your site in the result set. Tim O'Reilly states Data is the Next Intel Inside:

While we've argued that business advantage via controlling software API's is much more difficult in the age of the internet, control of key data sources is not, especially if those data sources are expensive to create or amenable to increasing returns via network effects.

Google is just a giant feedback network, learning to better understand the relationships between queries, links, and content. If you own the smartest and richest consumer feedback network in your vertical you will only continue to gain profit, customers, and leverage, at least up until someone creates a better feedback network that displaces the current one.

Dynamic Sites & Social Feedback

Oct 14th

Part 3 in a series... let me know what you think :)

Blog Software is a Simple CMS

Some of the conversations stemming from my article series starting with Why Bloggers Hate SEOs & Why SEOs Should Love Bloggers have stated that blogs are just a simple CMS. The one catch is that they are social in nature.

I have probably read about a couple hundred books, and have only emailed about 5 book authors to tell them how great their books were. Most of the book authors quickly replied to my emails to say thanks. This tells me that they must understand the value of having fans (Seth Godin surely fits in that group) or they are not as inundated with email as I sometimes am.

Compare the books, which take months to write, to most blogs. On blogs I have left hundreds or thousands of comments. Across my various blogs I have gotten thousands of feedback posts others have left. One blog is almost nothing but a framework for people to leave their comments, and yet they still do!

Some people have stated that blogs are a fad that will die out. They may be right, but if they die out it will only be if other software emerges which does a better job of social integration, as some of the current tools are lacking on many fronts.

Static Content & the Game of Margins

Some old established static sites may live on for a long time, but both directly and indirectly the web is becoming more of a read-write medium. Margins will require content to become more social.

In spite of years of branding and content creation, even the most well known publishers are caught playing the margins, selling ad space aggressively, and pushing the blame onto their advertisers.

Creating content is a game of margins. If you use a static website, and update its content to keep it current, you are writing over your old work, which means:

  • you are throwing away its historical record
  • you are creating fewer pages (which means fewer chances to pull in visitors), as each page is another search lottery ticket
  • it is likely going to be harder for an audience to find the new content
  • it is less likely people will reference the new content, since they do not know what URLs are changing when
  • it is less likely people will reference the old content, since it may eventually change
  • many people will not want to reread the parts they already read
  • as your content size grows it means you are forced to worry about keeping it up to date while still trying to keep up with the news and the shifting marketplace

Add all of those things together, and a business model which would otherwise wildly succeed could easily become a complete failure.

The static site this article is on generally sucked until my blog became popular. In spite of the effort that went into writing this article, my average blog post will probably be read many more times than this article.

Who is a Static Site For?

When you first learn about a topic it may be useful to create a large site about the topics you are learning, just as a way of forcing yourself to learn it all. Even in doing that, so long as you map out the general hierarchy ahead of time, there is no reason to avoid building the site on a dynamically driven database. Eventually when I have enough time this site will likely be shifted to a dynamic format.

The only people who can really afford to get away with using purely static sites are:

  • those who have other dynamic sites which help build their credibility & authority
  • those who are creating a site out of boredom or for a personal hobby
  • those who are not trying to profit or spread ideas
  • those who are known as the authority on their topic (who can do well in spite of the shortfalls in their publishing methods)
  • amazing writers who write so well that they can do well in spite of their publishing format
  • those who were first movers or are in niche fields with few competitors
  • those who are gurus in fields that change slowly
  • those who run tons of sites and want to make them scalable (although it is even easier to do this with dynamic sites)

In almost all the above cases I can point to examples of how using dynamic sites could save time or be more profitable.

Example of a Sucky Static Site:

Not too long ago I created a site called Link Hounds to give away free link building tools. I find the tools exceptionally useful, but the site failed to take off for a number of reasons.

  • API Limitations: When I first announced the site people used it beyond the API limits and it did not work. I should ask the engines for increased limits (see the caching sketch after this list).
  • Lack of Incentive to Syndicate: In part to make up for the API limitations I gave away the source code and referenced tool mirrors, but some who mirrored the tool did not want to share it with others. Also, Yahoo! requires DOM XML support if you program the tool in PHP4. I should have had my friend program it in PHP5.
  • Crap Design: While the site design was not bad for free, it obviously is not something stellar.
  • Open Source & SEO: These are generally not concepts which are paired together. I think it will take a bit of time for people to get used to it. An open source website recently asked me to write an article, so that may help a bit.
  • Perception of Value: People think they get what they pay for. In spite of the fact that some of my software is similar to (and in some ways better than) stuff that sells for $150 or more, some people think the software is worthless because it is free. Similar software with strong affiliate marketing is seen by many more people.
  • Boring / Static: If I started working a bit harder at link building and placed a blog offering a bunch of creative link tips on that site I suspect it would garner many more links.
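
On the API limitations point above, the standard client-side fixes are caching and backing off when the engine starts refusing requests, so one burst of visitors does not burn the whole daily quota. A generic sketch of that pattern; the retry counts and delays are arbitrary, and no particular engine's API is assumed:

```python
import time
import urllib.error
import urllib.request

_cache = {}

def rate_limited_fetch(url, max_retries=3):
    """Fetch a URL once, reusing cached results and backing off
    with increasing delays when the server refuses the request."""
    if url in _cache:
        return _cache[url]
    delay = 1
    for _ in range(max_retries):
        try:
            with urllib.request.urlopen(url) as response:
                _cache[url] = response.read()
                return _cache[url]
        except urllib.error.HTTPError:
            time.sleep(delay)  # wait out the rate limiter
            delay *= 2         # exponential backoff
    raise RuntimeError(f"gave up on {url} after {max_retries} tries")
```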

As it sits, there is little reason for people to remember to go back to the Link Hounds site, so they rarely do.

Sites that are dynamic in nature, which make it easy to give feedback, will fare far better.

Why SEOs Should Love Bloggers

Oct 13th

Part 2 of an ongoing series. The future mini articles will shift away from blogs and into other areas. At the end there will hopefully be a point to all of these :) and if not, well, sorry ;)

Blog Blog Blog Blah Blah Blah

I run an SEO related blog which sells a guide to doing SEO, and yet despite the chronic hate toward SEO many authoritative bloggers recently linked through to my site because I was sued by an SEO company for blog posts & comments. As of writing this I am unsure of the specifics of what made my site worth suing, although that lack of specifics pissed off many people.

Most likely Traffic Power thought they could scare me silent, and since I was a blogger with a few good blogging friends, that story backfired rather badly for them. It is an easy story to link at, and many people did. Adam Penenberg painted a rather accurate picture of the situation. The story spread far and fast. There was so much syndication of the story that my site started ranking for the word sued.

Traffic Power Sucks.com was sued along with me, and yet they got minimal coverage because they did not want to talk much and, more importantly, they had a static site. Method of publishing plays a hugely important role in whether or not ideas will spread, and how quickly they spread.

Smarter Content

Sometimes what makes you / your site comment worthy is what others do there, and how people react to that (just look at Threadwatch to see how important the comments can be). Allowing others to add content to my site allowed them to make the content smarter and more complete. It also was the exact reason why the lawsuit became so comment worthy.

People wanted to preserve the right to comment on blogs without needing to worry about others cutting off their feedback loops. It is a large part of the reason some think blog comment spam and trackback spam are so nasty: feedback about an idea is sometimes worth far more than the original idea.

Ease of Link Acquisition

By giving people something to talk about and reminding them to regularly visit your site it is much easier to build linkage data. It also is easier to reference old stories that once again become relevant as more news emerges.

The viral behavior of blog posts in a large social network benefits those who can figure out what stories would spread & why people would want to spread them. Arbitrarily answering questions like "How much is my blog worth?" is an easy way to get links.
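
The arithmetic behind that calculator was famously crude: divide the price AOL paid for Weblogs, Inc. by its Technorati inbound-link count to get a dollars-per-link figure, then multiply by your own blog's link count. A sketch using the roughly $564.54-per-link figure that circulated at the time (treat the number as folklore, not a real valuation):

```python
# Folklore figure from 2005: AOL's ~$25M Weblogs, Inc. purchase
# divided by its Technorati inbound links came out near $564.54/link.
DOLLARS_PER_LINK = 564.54

def blog_worth(inbound_links):
    """Arbitrary valuation: link count times an assumed price per link."""
    return inbound_links * DOLLARS_PER_LINK

print(f"${blog_worth(1000):,.2f}")  # 1,000 links -> $564,540.00
```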

Someone created a blog called anti-blog to say how lame blogs are. As soon as I found it I made a quick mention of how I thought they were a bunch of lamers. They quickly linked back saying how dumb I am. Easier and quicker than a link exchange, and that link is much more likely to be up in a year than most link trades, which usually turn out to be junk.

Echo Chamber

When you have a regular site and are stuck asking for links one at a time it is an arduous task. Blogs have an echo chamber effect. After stories are above the radar they spread without effort, and sometimes how stories spread makes them linkworthy.

Examples:

  • SEO Inc cease and desist letter, as Danny Sullivan states:

    That last thread we actually pulled from our forums back in mid-April. No, not because of a cease-and-desist letter or any message. Instead, our forums have a policy about public spam reporting. We don't allow it, unless a site is incredibly well-known or the issue has become discussed in a variety of public forums. Ironically, with the many blog comments now about the cease-and-desist, the thread that previously was pulled now qualifies for restoration.

  • Google onsite dentist blog is a hoax
  • MC Hammer visits Google - how hard would you normally have to work to get authoritative topically related links from sites with a quality level as high as SEW?

I was not trying to pick on Danny with those examples. I used his site as the example because he is the most authoritative voice on search, has a journalism background, and a long history of spotting the future trends in search before they emerge.

Everyone likes to have a bit of fun. The often informal nature of blogs makes it easier to reference somewhat random topics, especially if you get to be the crazy frog. Having a blog lets you tap the flow of linkage data from other related sites, for serious or fun stuff.

Hard to Reproduce

When you do link exchanges most of the sites that exchange with you will gladly exchange with your competitors. When your site garners linkage data from authoritative sites that are not heavily directly interested in making money or search rankings it is hard for competitors to reproduce your linkage data. In fact, if they prod too heavily on that front they stand a good chance of damaging their brand value and credibility.

Quality of Links

When you get links from within the content of an actively read channel, typically:

  • the individual archive pages have few links on each page
  • the links are the type that drive direct traffic. If search engines bias relevancy based more on user data and link activity, then these types of links will become more powerful
  • the Google Sandbox concept really does not matter much if all the high ranking active channels are referencing you anyway
  • many links in social networks lead to secondary links

Why Bloggers Hate SEOs

Oct 12th

I was writing a longer article and decided it would be better in pieces. This first one is about marketing, profit, and why I think most bloggers hate SEO.

Let me know what you think of part 1. Tomorrow I will post part 2.

Algorithm Manipulation & Constant Change:

In the forums recently there has been some whining about Google being an out of control beast with no relevancy, etc. I guess when you look closely enough there is always some amount of that.

People are also comparing the new Yahoo! algorithms to the Google Florida update. We tend to think the relevancy is not there when our own site disappears, even if it is temporary. Admittedly the algorithms may be jacked for a while, but if people like your site and sites like yours do not show up in the search results, it hurts the relevancy and brand of the search engine, even if some of those sites were banned long ago.

Search algorithm manipulation may still be beneficial, but is not necessary to succeed if others are interested in what you offer. Those who take a holistic approach to marketing do not need to worry as much about the ups and downs associated with changing search algorithms or search business models. If people believe in you enough they will push you to success even if you do not know what you are talking about.

There is nothing wrong with creating content with the intent to spread it. That is all SEO is focused on: spreading content, ideas, and websites for profit.

What is Profit?

Profit can be:

  • money
  • feeling important
  • knowing others are reading what you write
  • getting feedback
  • knowing you helped others
  • settling a score with someone you are pissed at
  • doing something which others stated you could not
  • or a variety of other things

Bloggers are SEOs?

When a blogger Google bombs a person they are doing nothing more than a souped up blog version of SEO. In spite of the fact that bloggers do the same things as SEOs (and sometimes even far worse), many bloggers like to tell you just how much scum the average SEO is.

Why is everyone and their dog launching or partnering in a blog network? Money. Plain and simple.

As NickW would say:

I suspect the ratio is more like 95/5 with regard to who's making reasonable money at blogging.

Personally, i see this as the new seo if you like - and the old guard, who can get their heads round the new medium, are all set to rake in what they want pretty much - it's open season out there...

While many bloggers and designers claim they absolutely hate the topic of SEO - and SEOs - much of the bad SEO advice offered is given by bloggers and web designers who never studied the topic.

At one point in time I had to tell a content rating website to stop hiding content on their own site. They were handed that dubious hide-the-content tip by a web designer (who assured them that it was search engine friendly). Even outside of search, think how poor it sounds for a content rating website to hide their own content. Where does the credibility go with moves like that?

Why Do Bloggers / Designers Really Hate SEOs?

Many people who are chronically pissed at anything remotely related to SEO are probably in that mode of thought for one of a few reasons:

  • Envy: I remember when I had just started out on the web and was doing badly economically because I had few connections, limited experience, and minimal business savvy. I am not ashamed to admit that for a period when I was barely getting by I felt envious of people with better business models. Many web developers, bloggers, and designers who are barely getting by like to blame their lack of marketing skills and low wages on budget shifts toward marketing and on spammy results clogging up the engines.
  • Sick of blog spam: Whenever a blogger talks about comment or trackback spam, or any webmaster gets referral spam, many of them blame that annoyance on SEO in general, although the software developers are at fault for selling software with holes in it. Some people have even been known to blog spam using a competitor's site to hurt that competitor, just because the software vendors make it so easy.
  • Sick of low quality search results: Search engines have spent a ton of money, time, and attention marketing their faults as belonging to a third party.
  • Quality of Content: Some people believe that SEOs aim to do anything necessary to avoid creating useful websites or content. While some people just aim to exploit algorithmic holes, some of these same people later go on to do SEO for many legitimate websites. As an SEO, if you have clients, it is ideal to have clients that give you a performance based cut (affiliate programs) or clients that have naturally authoritative sites which can easily rank for their official name and related terms. Some of the better SEOs refuse to work for a company unless they are deeply interested in that company's products and market.

    Most blogs are not of amazing quality. If the average blog's content quality were high, people wouldn't be failing the Turing test as often as they are.

  • Web Standards: Many sites follow no standards other than putting money in the pocket of the author. That would occur whether or not there was a field called SEO. Sadly most of my sites are not yet standards compliant, and it would probably cost me about $100,000 to even attempt that. I may eventually try it, but you can't learn everything all at once. Learning is a process. Just as many designers are bad at marketing, I am bad at design and web standards related stuff.
  • Easy to pick on: "XYZ marketing firm is a bunch of knuckleheads" is an easy story to spread. Web mob justice means those stories spread fast. Did you know that Gillette now has a 5 blade razor?
  • Selective Memory & Anchoring: How often do you hear a person thank an SEO for placing a relevant result at the top of the results? Never. How often do you hear of SEOs being $@&*ers? Much more often.

Andrew Goodman on Google's Recently Announced AdWords Change

Jul 27th

Andrew writes a 4 page article about the new AdWords system. Smart of him to reinforce his market position by writing an article about it. I also found it interesting that he wrote about his speculations as to why some things at Google change and how Google is viewing the ad system more like organic search results.

You can count on the backend technology driving both AdWords and Google's search results to get more complex. Eventually the systems may require some sort of degree or certification, although for now nothing can really beat what you get out of hands on experience.

    We value your privacy. We will not rent or sell your email address.