I keep reading these marketing books which say that markets are conversations, and over at SearchGuild we recently had two distinctly different types of people come to the forums. Each came to represent their product, and they fared very differently. Person 1: Dez from WebSearch.com.au spoke in a human voice about his search engine. He has even lent tips to some moderators on how to improve their sites...we think Dez is the bomb diggity!
Person 2: DirectorySearch from DirectorySearch announced his URL and basically bailed. In the process he alienated many of us and actually hurt his product. He never gave his name and failed to fully answer the questions about his product.
DirectorySearch does have a good script, but product alone does not make or break you. It's how you use the product and how you communicate with the customer that helps build long term value.
In marketing the power of the weak tie is astronomical. If you ask a large cross section of society "Who found a job through a casual acquaintance?" the percentage will be exceptionally high. Our close friends typically share much of our environment and lifestyle. Friends of friends live in a totally different world and know realities which are completely foreign to us.
Friendster is a free dating and social interaction network which operates using this idea. Reports have stated that Google wanted to buy them last year for $30 million, but they did not sell.
Google organizes the web based on the social structure of the linking of the entire web. Newer technologies are allowing them to better find local clusters, but The Boston Globe reports that today a new competitor will take this field using the direct route.
For Eurekster to be effective, features such as categorizing friends and settings such as trusting friends of friends a certain number of levels deep will be necessary. Eurekster works by allowing you to cast a silent vote for a site based on the time you spend visiting it. Read the official Eurekster about us and Eurekster how it works information.
They hope to get you to download their toolbar and to tell friends about Eurekster via email. Two things which I believe to be errors in spreading this message are that the name of the search engine is hard for me to remember, and that their home page is somewhat cluttered compared with the current major search engines.
(GEEK STUFF) One of the largest problems many search engines run into is that after they get to a few hundred million documents their algorithms and hardware hit a wall.
Even those companies that can afford the investment to get past this point run into the problem that each additional resource makes their job a bit harder.
One of the major ways around this problem is to take advantage of the natural patterns in human language. Latent Semantic Indexing allows indexing search results based on the co-occurrence of related words within documents.
Many complex searches may lack exact matches in the results as well. Being able to find near matches will allow search engines to provide more comprehensive results.
It's hard to get computers to understand anything human, but the process of Latent Semantic Indexing delivers conceptual results while being entirely mathematically driven.
Some of the steps of the singular value decomposition process are to:
create a database of all words in relevant documents
remove common stop words
remove words appearing in all results
remove words only appearing in one result
create a database of relevant keywords
weight the pages based on the frequency of keyword distribution
increase the relevance of terms which appear in a small number of pages (as they are more likely to be on topic than words that appear in almost all documents)
normalize the page to remove page length as a factor
create relevancy vectors for the keywords
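The steps above can be sketched in a few lines of Python. This is a toy illustration, not production search code: the corpus, stop word list, and cutoffs are all hypothetical, and real engines work at vastly larger scale.

```python
# Minimal sketch of the LSI steps above: vocabulary filtering,
# a normalized term-document matrix, and singular value decomposition.
import numpy as np

docs = [
    "search engines index web documents",
    "web search uses link popularity",
    "dogs and cats are popular pets",
    "pets need food and care",
]
stop_words = {"and", "are", "the", "uses", "need"}

# Build the vocabulary: drop stop words, words appearing in all
# documents, and words appearing in only one document.
tokenized = [[w for w in d.split() if w not in stop_words] for d in docs]
counts = {}
for toks in tokenized:
    for w in set(toks):
        counts[w] = counts.get(w, 0) + 1
vocab = sorted(w for w, c in counts.items() if 1 < c < len(docs))

# Term-document matrix weighted by keyword frequency, with each
# column normalized so page length is not a factor.
A = np.zeros((len(vocab), len(docs)))
for j, toks in enumerate(tokenized):
    for i, w in enumerate(vocab):
        A[i, j] = toks.count(w)
A = A / np.maximum(np.linalg.norm(A, axis=0), 1e-9)

# Singular value decomposition; keep the top k "concept" dimensions
# to get a relevancy vector for each document.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T

def cosine(a, b):
    # Similarity in concept space; near-duplicates score close to 1.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
```

Documents that land close together in the reduced space are conceptually similar even when they share few exact words, which is what makes near matching possible.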
The singular value decomposition process is not scalable enough to work on large scale search engines though, as it requires too much processor time. Multidimensional scaling allows us to take snapshots of the topology of different documents. "Instead of deriving the best possible projection through matrix decomposition, the MDS algorithm starts with a random arrangement of data, and then incrementally moves it around, calculating a stress function after each perturbation to see if the projection has grown more or less accurate. The algorithm keeps nudging the data points until it can no longer find lower values for the stress function."
This does not provide exact results, but only a rough approximation. When combined with other factors this approximation improves scalability and quality of search.
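The nudging loop from the quote above is easy to sketch. This is a deliberately naive toy, assuming a random-perturbation hill climb; the function names and parameters are mine, and real implementations use smarter update rules.

```python
# Toy multidimensional scaling: start with a random 2-D arrangement,
# nudge the points, and keep each nudge only if the stress drops.
import numpy as np

rng = np.random.default_rng(0)

def stress(points, target_dist):
    # Sum of squared gaps between current and desired pairwise distances.
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    return float(np.sum((d - target_dist) ** 2))

def mds(target_dist, dims=2, steps=5000, step_size=0.1):
    n = target_dist.shape[0]
    points = rng.standard_normal((n, dims))
    best = stress(points, target_dist)
    for _ in range(steps):
        candidate = points + rng.normal(0, step_size, points.shape)
        s = stress(candidate, target_dist)
        if s < best:  # keep the nudge only when stress improves
            points, best = candidate, s
    return points, best

# Three documents: 0 and 1 are near-duplicates, 2 is off-topic.
target = np.array([[0.0, 0.1, 1.0],
                   [0.1, 0.0, 1.0],
                   [1.0, 1.0, 0.0]])
coords, final_stress = mds(target)
```

The result is approximate, exactly as the text says: the loop stops wherever it can no longer find a lower stress value, not at a provably optimal projection.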
General tips to make a dynamic site get spidered
1.) Do not force feed the spider a cookie
2.) Use 3 or fewer variables
3.) Keep each query string to 10 or fewer characters
4.) Create a sitemap which links to many of the main database locations.
5.) Build up link popularity from a few quality inbound links. The PageRank (or link popularity in search engines other than Google) will make the spider more inclined to spider deep through your site.
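The first few tips above are easy to check mechanically. Here is a small hypothetical helper (the function name and limits are mine, mirroring the list) that flags URLs a spider might balk at:

```python
# Check a dynamic URL against the tips above: no more than 3 query
# variables, and a short query string. Uses only the standard library.
from urllib.parse import urlparse, parse_qs

def spider_friendly(url, max_vars=3, max_query_len=10):
    query = urlparse(url).query
    variables = parse_qs(query)
    return len(variables) <= max_vars and len(query) <= max_query_len
```

For example, `page.php?id=123` passes, while a URL carrying four variables in a long query string does not. Cookies and link popularity, of course, can't be checked this way.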
In any medium there will be free rides as new adopters take advantage of knowledge not shared by their competitors. While there is always a new technology which creates new markets, this quick read does a good job of explaining why off-the-page optimization is more effective than on-the-page optimization. Chris Ridings explains "The Glass Ceiling."
Update: the above link to chriseo.com/modules.php?op=modload&name=News&file=article&sid=62&mode=thread&order=0&thold=0 has been delinked, as the site is now owned by a domainer and is a page full of PPC ads.