When I was new to SEO I did a bunch of on-the-page analysis to try to figure out exactly what other people were doing. The problem is that it gets you focused on things that do not matter: a site may end up ranking high at the sacrifice of conversion.
As search algorithms advance, basic link analysis tools, at least for Google, are starting to become what keyword density tools already are: a waste of time.
Link analysis software was cool, especially when Google used to show all of the PageRank 4 and above links, back when their search relevancy algorithm was a bit more dependent on raw PageRank.
Now Google only shows a limited random set of backlinks, and the other search engines also limit the search depth to 1,000 results, which makes it hard to do useful analysis with the various link analysis tools on the market.
If it were quick and easy to query a link database deeply (deeper than 1,000 results), the link analysis tools would be much more useful. None of those currently on the market make that a quick and easy process.
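To illustrate why the 1,000-result ceiling cripples deep analysis, here is a minimal sketch of paging through a link-query API; the function, the 25,000-link total, and the page size are all invented for illustration:

```python
# Stand-in for an engine's backlink query; the function name, the 25,000
# total, and the 1,000-result ceiling are all hypothetical.
def fetch_backlinks(query, start, count):
    total = 25_000                           # links the engine knows about
    end = min(start + count, 1_000, total)   # hard depth cap at 1,000
    return [f"link-{i}" for i in range(start, end)]

# Page through the results 100 at a time, as a link analysis tool would.
collected = []
start = 0
while True:
    page = fetch_backlinks("example.com", start, 100)
    if not page:
        break
    collected.extend(page)
    start += 100

# Only the first 1,000 of the 25,000 known links are ever reachable.
```

However patiently a tool pages through results, everything past position 1,000 is simply invisible, which is why deeper analysis is impossible no matter how good the software is.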
To keep improving the results, you find more variables for the algorithm-creating machine to use, and you add to your store of human-ranked pages for it to "learn" from. What you don't do is bother understanding the actual algorithm -- it was constructed by a machine and is way too complex for anyone to keep in their head.
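As a rough sketch of what an "algorithm-creating machine" means, here is a toy perceptron-style ranker trained on human ratings. The three features and every number here are invented for illustration; this is the general shape of a machine-built ranking function, not anything Google actually uses:

```python
# Each page: a feature vector (say: anchor text match, link count, age)
# paired with a human relevance rating (1 = relevant, 0 = not).
# All values are made up for this sketch.
training = [
    ([0.9, 0.2, 0.7], 1),
    ([0.1, 0.8, 0.3], 0),
    ([0.8, 0.6, 0.9], 1),
    ([0.2, 0.1, 0.2], 0),
]

def score(features, weights):
    return sum(f * w for f, w in zip(features, weights))

weights = [0.0, 0.0, 0.0]
for _ in range(100):  # repeatedly nudge weights toward the human judgments
    for features, rating in training:
        predicted = 1 if score(features, weights) > 0.5 else 0
        error = rating - predicted
        weights = [w + 0.1 * error * f for f, w in zip(features, weights)]
```

The machine ends up with a working weight vector, but nobody hand-picked those weights, and with hundreds of variables instead of three, nobody could explain them either.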
Psychologists have shown repeatedly that when you give people a system to optimize, all you have to do is secretly introduce a delay between their actions and the results of their actions, and they will go bonkers. In fact, in a very simple (single variable!) model in which people are trying to control the temperature in a virtual refrigerator, you can get some of the same irrational responses you see in these forums, and the first post here by Captain CaveMan (which, incidentally, is the name of an awesome cartoon character) does as well:
Without giving away the store, I don't know how else to say it. There is no sandbox. People speak of it as though it were some simple 'thing' that stops new sites from being seen. That has simply never been true. What was true was that in its early days, some of the algo elements and related filters were so tight that only a very few new sites got past them (some accidentally; some methodically). Over time that changed; more sites started getting out, presumably as G worked to surface more new, higher quality sites.
There is no sandbox. There is only a series of rotating algos and related filters that make it far harder for sites launched after spring of '04 to be widely seen in the SERPs. Not impossible. Harder. And certainly not as hard now as was true seven months ago. This has been hashed and rehashed so many times that it's hard to understand why it's still confusing.
If you can only see a few of the variables and exert too much effort satisfying them, you may end up tripping filters while failing to satisfy other criteria.
AdWords Spying: GoogSpy scrapes hundreds of thousands of searches from Google to determine who is bidding on what terms. The idea is killer, but the implementation is a bit lacking. Link found via ThreadWatch.
So I have been getting some of the Gmail feeds and ads recently. Hopefully I answered this question correctly, or you, the reader, will call me dumb...
Bad Call #1:
Here is an example thread
Question from Search Marketing Info
Which internet search engine was co-founded by a math major who chose the name to imply a vast reach ?
Thanks in advance,
Google was a misspelling of Googol, which means a 1 with a hundred zeros behind it.
Larry Page founded it, and Sergey Brin was his co-founder.
and here was Google's contextually targeted Gmail ad:
Head Gasket Blown? - www.rxauto.com - Repair It Yourself Guaranteed ThermaGasket The Mechanics Choice
That is data stored on Google's servers and that is the best that they can target it? When you couple that with all the AdSense spam sites and click fraud, it really makes you wonder why Google assumes anybody would want that traffic.
Bad Call #2:
One of the default feeds was Engadget, presumably because they run AdSense? Don't get me wrong here, it's cool to help smaller publishers, but if you put Engadget in there you should put Gizmodo in there also, unless you want people to question your motives.
Placing random off-target, off-topic crap I don't want in my email is being evil. At least the old Hotmail dating ads would occasionally show pictures of cute girls ;)
I know that I can unsubscribe from feeds, but I shouldn't have to opt out. Maybe at the start you could just promote Google News, Froogle, and your other portal pieces up there?
Bad Call #3:
Google actually places feeds in your spam folder. How stupid is it to place contextually relevant feeds near stuff that was deemed as being unwanted useless junk? What better way is there to turn users off?
Another thing that is really weird is that most (maybe all) of the spam feeds were for Spam recipes:
Spam Vegetable Strudel - Bake 20 minutes or until golden, serve with soy sauce
Savory Spam Crescents - Bake 12-15 minutes or until golden brown
French Fry Spam Casserole - Bake 30-40 minutes
They may place the Spam feeds in there to try to push a cute and innocent corporate culture, but I don't buy it.
After Google bogusly added the Google Toolbar AutoLink feature, which directed B&N customers to Amazon, many people became increasingly suspicious of potential hidden business partnerships. Is Google partnered with Hormel Foods now too?
I feared this post was reading as though I got it from Google's PR firm, so I felt I should include this... Google Blowout Quarter Update:
There's a blurb in the Wall Street Journal today that explains how Google's reported bottom line is being gamed by their own options program. It seems since they backloaded the options expense onto last year's earnings statement, this quarter's results will be ARTIFICIALLY BOOSTED almost 100%, even though it has absolutely nothing to do with their actual profitability as a business. Keep that in mind when they announce earnings tomorrow.
In after-hours trading, Google shares are trading at over $220.
Google's first-quarter net income rose to $369.2 million, or $1.29 a share, from $64.0 million, or 24 cents a share, a year earlier. Profit from the most recent quarter included a $49 million charge for stock-based compensation.
Gross revenue nearly doubled to $1.26 billion from $651.6 million.
The results easily topped Wall Street's average net profit target of 78 cents a share. Analysts had seen profit excluding some items at 92 cents and revenue at $1.16 billion, according to Reuters Estimates. source
PageRank was broken from the start. The concept they were going after may still well exist though if they can get enough users of their search history tool. While other search engines still seem relatively easy to spam Google may be trying to measure web wide trust scores using much more than just raw linkage data.
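For reference, the raw linkage-data concept PageRank started from can be sketched as a simple power iteration over a link graph. The three-page graph here is made up; the 0.85 damping factor follows the original PageRank paper:

```python
# Power-iteration sketch of the raw PageRank idea on a toy link graph.
links = {
    "a": ["b", "c"],  # page a links to pages b and c
    "b": ["c"],
    "c": ["a"],
}
pages = list(links)
n = len(pages)
rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere

for _ in range(50):  # iterate until the scores settle
    rank = {
        p: (1 - 0.85) / n
        + 0.85 * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }
```

The whole scheme rests on links being honest votes, which is exactly the assumption that link buying and link spam break, and why a trust measure beyond raw linkage data would matter.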
Google need not stomp SEO techniques out; they only need to:
make the costs of SEO high enough that it is cheaper to build legitimate business models and brands than it is to build a business off of manipulating search results
Some people will be untouchable. They will know enough about social engineering and database programming to where they will still spam Google all day long. I am sure Google realizes that, but they want to continually increase costs to where that is an exceptionally small pool.
As SEO gets harder Google makes more money from ads. As they make more money from ads they can spend more into making SEO harder.
Now if only they could share more data with advertisers to help make click fraud easier to detect. Google bought Urchin, so why not buy, create, or offer something like Who's Clicking Who? Surely Google has the market data, and it would not increase costs much to give advertisers more options and more data.
A search company which makes tons of profit organizing data should recognize that by making advertising transparent and making more ad information available they will create a more efficient market which creates more profits. The advertising community would likely police themselves if you gave them enough data and responded to feedback.
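As a hypothetical sketch of the kind of check an advertiser could run if engines shared per-click data, here is a trivial filter flagging IPs that click far more often than a normal visitor would; the field names, addresses, and the threshold of 3 are all invented:

```python
from collections import Counter

# Made-up per-click data of the sort an engine could share with advertisers.
click_log = [
    {"ip": "10.0.0.5", "minute": 0},
    {"ip": "10.0.0.5", "minute": 0},
    {"ip": "10.0.0.5", "minute": 1},
    {"ip": "10.0.0.9", "minute": 3},
]

clicks_per_ip = Counter(c["ip"] for c in click_log)

# Flag IPs that clicked far more often than a normal visitor would.
suspicious = [ip for ip, n in clicks_per_ip.items() if n >= 3]
```

Even a check this crude is impossible for advertisers today because the raw click data never leaves the engine, which is the point about transparency.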
Google Inc. (GOOG.O) on Wednesday debuted a test service called My Search History that analysts said is a move closer to personalized search, which is widely considered the Holy Grail for the Web search leader and its rivals. source
To use My Search History you must register for a Google Account and keep it active. Ask Jeeves has had a search history tool for a while now, and Yahoo! has My Yahoo! for various personalization features, although Yahoo! seems more focused on providing news and blog feeds and the like. I think Yahoo! is betting on the abundance of information making subscribing to channels much more appealing than searching the web. I believe Yahoo! also allows you to subscribe to Yahoo! News feeds by keyword phrase.
Personalized search allows engines to better understand users to improve search quality and ad targeting. Whoever is branded as the best market solution on that front is going to make a bucket of cash, because keeping your search history and learning the user raises the barrier to switching search providers.
It makes it hard for another search service to be as relevant if you have tons of personal information already locked in a competing service. This data will be hard to export to other systems as well, as importing huge hunks of data will also allow marketers to import large volumes of spam.
I just briefly tested Google's service. It is fairly slick. You can quickly sign in or out and it adds minimal clutter to the Google home page.
From the link in the upper right corner you are brought to a new page. It shows a calendar on the right side that color-codes your search volume. The left side shows your searches for that day and the results you clicked on. The My Search History results that you click on also show up in the Google OneBox area when you search for similar terms in the regular search results.
Some privacy advocates will likely go nuts over this offering. It is all opt-in, though. I encourage everyone to sign in, search for seo, scroll past the Japanese stuff, and click on my listing.
Presumably some searchers may be able to build up a search history.
As they build it up it could build Google's trust in that user, which in turn could potentially allow Google to use that user feedback to verify search result relevancy.
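A trust-weighted feedback scheme of that sort might be sketched like this; the trust scores, site names, and weighting are all made up for illustration, not anything Google has described:

```python
# Clicks from accounts with long, consistent search histories count for
# more than clicks from fresh accounts. All values here are hypothetical.
clicks = [
    # (user_trust, clicked_result)
    (0.9, "site-a"),   # established search history
    (0.9, "site-a"),
    (0.1, "site-b"),   # brand-new account, possibly a paid clicker
    (0.1, "site-b"),
    (0.1, "site-b"),
]

votes = {}
for trust, result in clicks:
    votes[result] = votes.get(result, 0.0) + trust

# site-a wins on trust-weighted votes despite fewer raw clicks.
best = max(votes, key=votes.get)
```

Weighting by account trust is exactly what would blunt the cheap-click schemes described next, since fresh throwaway accounts would carry almost no vote.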
I would not be surprised if this globalizes SEO a bit more: paying people in third-world countries to randomly click on certain sites. I am already building a search history today as a prospective SEO tool.
I just got an update email from Leslie Rhode of OptiLink...
A few days ago, Google began to employ a "spyware detector" that will in some cases block OptiLink through the use of a cookie and a human visible "ransom note".
The use of Google from "normal" browsers is not affected -- it is only specialized programs such as OptiLink that are targeted by Google's change, with the result that OptiLink can be blocked from Google for two or more hours.
While this is not a terrible problem, as no lasting impact has been found, I am not comfortable with Google being able to discover the use of OptiLink no matter how "gentle" the counter-measures.
So, OptiLink's Google interface has been REMOVED pending a solution to this problem. This has been done for your safety, and for the safety of all other OptiLink users.
Rest assured that this problem will be solved and Google access restored as soon as possible, but in the meantime, you should use the Yahoo and MSN interfaces for your Google ranking analysis.
I am a bit curious if Google is going too far with all of their recent anti-SEO moves. I can't even count how many times I have read that search relevancy is similar at Yahoo! and Google. Webmasters have undoubtedly helped to build Google's brand.
With the extensive filtering that Google does on its linking information, the loss of the Google interface in many cases is not that important.
In general, you can do your linking analysis using the Yahoo or MSN link databases and safely assume that Google has these links as well but is simply not showing them. The exception to this rule is of course the "banned domain," which appears to be a uniquely Google concept.
Google does provide useless linkage data. Some of the other engines, especially Yahoo!, provide useful linkage data.
The connectivity measurement (PageRank) that Google shows in its toolbar is outdated. In July of last year I talked to a Yahoo! Search employee and asked why they were not making a reliable Yahoo! connectivity measurement available.
A large part of how Google gained their brand was by creating concepts that were somewhat easy to explain, like PageRank. Why not force them to keep that data updated or take that market position from them by providing across the board better tools that are easier to explain? This also could help Yahoo! gain a much larger installed toolbar base, which may allow them to
gain market share
collect more market data
improve relevancy algorithms
MSN has also been significantly more supportive of the SEO industry than Google, even allowing people to subscribe to search results via RSS.
I understand that running automated queries adds to system load and has associated costs, but could that cost be a cheap form of marketing your high-margin search service over competing services?
On many fronts I do like Google as a company, but I think their idealism is at least as much of a hindrance as it is a strength.
Leslie also had the following to say in his update:
My Thoughts on the Future
It is certainly well known that Google does not look with favor upon SEO tools in general, and most especially tools that make use of its interfaces, so some sort of reaction is not totally unexpected.
OptiLink has been in very active use and continuous development since May 22, 2002, and has been on Google's "short list" since the moment they called me (true story) just 10 days after it was announced.