Once you stopped accepting sites I submitted I stopped caring about your directory.
Sorry if that sounds blunt, but when you arbitrarily decide to stop accepting any and all submissions I stop trying to help you.
Hedir.com management has changed. I am the webmaster now. This is one of the projects I have undertaken recently. I am restructuring Hedir's design and policy.
I have noticed that you have 10 sites listed in my directory.
I do not find any history of your helping the directory, either with a link or in any other way. Could you please enlighten me regarding this? Because I believe in the concept of mutual help in the webmaster community.
I was helping your directory. It was in my directory of directories. It was on my list of directories that over 1,000 people have, etc.
When it started screwing around with rejecting every submission I made, I stopped making submissions and pulled my support for it.
I also have a theory that directories can charge for inclusion if they like, but if they require reciprocal links and are general directories then they are generally a waste of time.
The site is going paid this March, with basic submission still free. You are welcome to submit as many sites as you want.
Reciprocal links are part of my policy toward existing sites, as I am deleting the old base. So I am pulling your links from the directory.
You can resubmit your sites if you want.
On the other side of the coin I am a big fan of the Bruce Stone theory:
Give the webmasters what they need. Links, good and fast service, and as many pages indexed by the search engines as conceivably possible. This generates traffic, something directories have been poor in delivering. You're not doing the webmaster a favor by accepting their site. They are doing you the favor by developing the database for you. (from an interview)
It is a mutual thing and if you try to work hard to create something of value others will try to help you.
Sadly many directories subscribe to the Hedir philosophy.
Give me. I am greedy. Give me. Give me. Give me.
Directories list sites. Good sites do not generally list bad general directories.
The biggest problem with most sites these days is that most try to extract profit without building any value.
Sure you can do that if you are technologically gifted or know the right people, but deleting large portions of a database because they do not link to you is probably the wrong way to "build" a directory.
The online PR thing is automated and all these press releases end up archived somewhere, with link popularity passed on indefinitely (maybe). But it's only a matter of time before the PR wire turns into total spam, isn't it? I can see it now, we'll see days with press release titles like:
"Online Viagra Seller Offers Best Prices, Free Shipping"
"Web Casino Offers Gamblers $50 Free Just For Signing Up"
"Supermodel Has Wardrobe Malfunction, Photos Online Now"
Why would an SEO company create an arbitrary press release to inform the world of its clients' keywords?
Would you want your SEO company to use your niche keywords as an excuse to promote themselves?
Maybe tucked away on their site, in a testimonial or a corner somewhere, with a link to your site, but not something you should be reading fresh off the news wire!!! hehehe
Press releases are an easy way to dominate news search results, and I have even had some rank well in regular Google search results (part of the reason they can rank well is because some of the PR sites have a solid base authority level).
Some of these might not be exactly new, just new to me, but hey, may as well list them...
Key Words - new blog on SitePoint by Dan Thies
SEO Shed - newish SEO blog. I think it is done by Sebastian from SEW forums.
SMART Keywords Blog - newish SEO blog. I think it is done by AussieWebmaster, a moderator at SEW forums.
Hopefully I will have time to update the big list of SEO Blogs soon. I think the list is getting up near 5 pages. There are a bunch of us.
More information in the extended entry...
Most of my sales are generated through PayPal, which has substantially lower costs than ClickBank. That saves me money, and it saves affiliates money too.
Allows affiliates to send traffic to the home page or to a sales letter page.
The affiliate software makes it easy for affiliates to use various banners, text links, or text ads.
Since most of my ebook sales are through PayPal, and since that payment option is on nearly every page of this site, affiliates stand the greatest chance of getting commission for their sales. The triple redundant tracking also makes it easier for affiliates to ensure they get credit for their sales.
The new affiliate software does not use 302 redirects, so I do not need to worry about Google or Yahoo! penalizing my site by not properly following a 302 redirect.
I am crediting all affiliate accounts with a $20 bonus credit.
My affiliate software allows me to set result-based compensation levels. This helps me give added incentive and added reward to harder working affiliates. It also gives better affiliates a benefit in the marketplace.
The affiliate software also allows me to roll out pay per click advertising options. I have not yet turned this on but may do so eventually.
The affiliate software allows two tiers. I have not yet turned this on but may do so eventually.
Top affiliates will get $40 per sale, which is over 50% of the sales price.
Bad Deals with the New Affiliate Program:
New affiliates will start at $25 per sale, which is lower than the current program level. The affiliate payment amount goes up by $5 after each 5 ebook sales, so after 15 affiliate sales affiliates make $40 per sale.
The new affiliate program pays once a month, whereas ClickBank pays twice a month. My program will pay out quicker than ClickBank does though.
I have set the minimum payout to $50. New affiliates that do self sales will not be able to profit from self sales until they sell at least 1 additional copy of the ebook.
Currently my affiliate program requires PayPal accounts to receive payment.
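For clarity, the tier schedule (start at $25, rise $5 after each completed block of 5 sales, cap at $40) works out to a simple function; this is just an illustration of the arithmetic, not part of the actual affiliate software:

```python
def commission_per_sale(n):
    """Commission in dollars for an affiliate's nth sale:
    $25 base, +$5 after each completed block of 5 sales,
    capped at $40 from the 16th sale onward."""
    tier = min((n - 1) // 5, 3)  # tier 0 through 3
    return 25 + 5 * tier
```

So sales 1 through 5 pay $25, sales 6 through 10 pay $30, sales 11 through 15 pay $35, and everything after that pays $40.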
I am phasing out the old affiliate program and intend to have it fully phased out by March 5th.
If you are a current affiliate under the old program and have made multiple sales already please send me an email and I will see if I can start you off on a higher tier to credit you for your past sales.
I realize that changing affiliate programs sucks for affiliates (being an affiliate of many products myself), but I do not intend to do it often. I apologize for any inconvenience this change may cause.
If you have any questions, comments, or feedback please email me or post them below.
Q: My question is, with this # of links to the same form is it better to save the form with different file extensions or to have all links all going to the same form?
A: I think a large part of the potential problem is intent, which may not be clear and easy for robots and editors alike to understand. If you do go the multi page route you do not necessarily need to optimize those pages ... just get enough link popularity into your site and those pages should rank well. Plus you can pull that data from a database, right? [added: ensure you create content which clearly conveys its purpose]
I think another thing you need to look at is the user of your site. If they land on those contact forms or whatever, do those pages provide enough info to actually convert?
Many people tend to be stuck on a product or service or marketing angle. We tend to view these as good and then place our problems on others.
Overture and AdWords are too expensive.
blah doesn't provide enough traffic.
blah has too low of traffic quality.
Before looking to smaller engines I usually recommend creating a product or service offering which does decently on AdWords OR Overture.
Google AdWords and Overture have some fundamental differences in how they operate, which means that some ads can fail on one and succeed on the other. If an offering fails on both networks then refine it. Change the offering. Target the ads better. Bid on a different position. Position 1.0 might be the guy who is losing the most money. There are lots of things you can change.
When people just give up on Overture and AdWords it means that they are settling for
small streams of traffic
slow feedback loops
potentially lower traffic quality. If the traffic source is a good one why would they partner with a second rate PPC instead of one of the larger ones?
All that combines to likely lead to small streams of sales.
If you can't compete on the larger networks refine until you can. Create a profit stream and then look to duplicate the results.
If you corner yourself to the smallest networks as time passes you may be continually marginalized until your business goes under.
Have not tested it out myself yet, but a buddy found a link for a free $50 AdWords credit for new Google AdWords accounts at adwords.google.com/select/main?cmd=Login&sourceid=Yh91503. Here are some more recent AdWords coupons
You can get a free $75 AdWords coupon here (or here or here or here or here or here or here) ... many options linked because some of their coupon offers expire over time & we update this page periodically. The Google Partners Program also offers coupons to consultants managing AdWords accounts.
Update: in addition to this AdWords coupon we recently came across coupons for Bing Ads (formerly Microsoft adCenter).
Some SEOs I have spoken with have seen sites with many deep links do exceptionally well in this past Google update. One of them recommended having at least a 2 to 1 ratio of deep links to links pointing at the root URL. This of course makes the SEO process far more expensive, but should make the results far more stable than building links only to your home page.
Do you build many deep links?
Have you noticed a similar pattern with the sites doing well in this Google update?
Do you have any tips for maximizing your deep link ratio?
My observations show that those with a low percentage of unique backlinks when compared to the total # of backlinks are doing very poorly in these updates.
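As a rough sketch, both numbers (the deep link ratio and the share of unique linking sites) can be computed from a plain list of (source URL, target URL) backlink pairs; the tuple format and the root-matching rule here are my own assumptions, not anything a search engine publishes:

```python
from urllib.parse import urlparse

def link_ratios(backlinks, root="http://www.example.com/"):
    """backlinks: list of (source_url, target_url) pairs.
    Returns (deep_to_root_ratio, unique_source_share)."""
    deep = sum(1 for _, t in backlinks if t.rstrip("/") != root.rstrip("/"))
    root_links = len(backlinks) - deep
    # Count each linking domain once, however many links it sends
    unique_sources = {urlparse(s).netloc for s, _ in backlinks}
    deep_ratio = deep / root_links if root_links else float("inf")
    unique_share = len(unique_sources) / len(backlinks) if backlinks else 0.0
    return deep_ratio, unique_share
```

Under the 2 to 1 suggestion above you would want the first number to come out at 2.0 or higher, and a healthy site should also keep the unique-source share from collapsing toward zero.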
Many SEO experts I have been talking with prefer to host a presell page on the other site which is linked to from every page on that site. They then make that page semantically related to their site and link that page to various related pages on their site. Some people even throw a few other links to authority sites on those pages.
I believe at Chicago SES Jon Glick stated that each site only gives 1 vote. Algorithms such as Google's Hilltop also deweight nepotistic links. If search engines only want to count 1 vote per site or related site owner why not make that vote as strong as it can possibly be?
Do you still get many sitewide links? What is your preferred method to build links?
Malcolm is also going to be at the South by Southwest festival, which is looking rather appealing. I am thinking of going and seeing if I can snag his autograph on my copy of The Tipping Point or Blink. Anyone ever been to South by Southwest? What did you think of it?
Problems with About.com's Ugly Ads:
Last year at NYC SES (which is coming up again in a week) I remember hearing Jennifer Laycock (then Web Search Guide for About.com) express disappointment about the layout of the About.com pages, but she could do nothing to change it. She now is an editor at SearchEngineGuide.
I do not know the current Web Search guide much, but I do know I read that site less than I did a year ago.
Leveraging a Market Position:
Some of the blog & guide networks gain significant distribution and credibility by being part of a large cross linked and heavily referenced network, but that also has limits.
Is About.com Overpriced?
The NYT Company paid about $410 million, which works out to around a million dollars per channel, and that seems a bit expensive to me. Here is why:
With the decreasing cost of publishing one wonders how many people will start their own niche sites instead of being glued to a company that can change its advertising and publishing policies any day.
As search gets more sophisticated it is likely that being part of those large networks will not provide as much of a benefit as it does today.
RSS readers are still rather primitive. As they get more advanced people will be better able to subscribe to ideas instead of just subscribing to channels.
Sites like Topix and Google News make it easy for me to collect a variety of views about a story without committing to any particular site.
Contextual advertising programs such as Google AdSense make it easy for any person interested in a topic to write and sell ad space, even if they have no idea what their ad space is worth.
Cheap / niche topics may be prohibitively expensive to cover using the About.com business model. As an individual I can create a few different channels about various niches I am interested in.
As ideas become hot, market competition increases, and the strongest brands and most original or useful sites seem to rise to the top.
Large networks present a limited personal branding opportunity. Creating your own site allows you to create a much stronger personal brand than conglomerates do. With that brand and market position you can sell many other products or services which would not fly if you were stuck purely in an editor position at some conglomerate site.
Editorial guidelines may prevent people from displaying how human they are. It is much harder to subscribe to the ideas of a robot than a person.
In the past I signed up to be a guide at other similar networks, but I was too lazy to write for them.
It seems to me that so long as you are interested in a topic it is not that hard to start a blog of your own, and so I did ;)
Even if your blog sucks off the start (which I am sure mine did and many will argue that it still does) you can gain a wide readership just by participating in the community you write about.
Version 3 of the software also lets people automatically check their spelling in Web forms; translate words in English into several languages; and add Web links to certain plain text. For example, an address could be enhanced with a hypertext link to its location on a map, with the click of a button on the toolbar.
Recently while talking to two different friends, both stated that if you want to be a good SEO you should think more like a search scientist than like a webmaster, and Xan is surely trying to help us out with that ;)
With the recent February update Google has also decided to open up a feedback channel. If you have comments about the update you can mail them to email@example.com. I am still only a couple years into the web, but I don't think it is something Google has really done much of before, and Danny Sullivan stated that it is something new.
Blog Anchor Text:
While the author of asbestos.stinkmachine.com is not interested in the subject, he is acquiring some good linkage data as the blog community states how fascinated they are with him making money from AdSense.
In Google's investor day a slide showed Weblogs Inc makes over $600 a day from AdSense. It seems there are a ton of AdSense sites, but if you lack topical interest is it a sustainable business model? Other than the guy getting the free links from the oohs and ahhs most people are going to need to spend some $ to build linkage data.
There probably is still some easy money on the table, but as time passes surely that market will get much much more saturated and competitive.
On a side note, if bloggers are so smart and well connected, how are they so behind the loop on AdSense?
Fake Weblogs: So evil they may as well be terists...
What is funny is that blogs will link to other blogs just because they are profitable and made for AdSense, and then criticise other blogs for being fake.
Beyond intent what really matters?
My weblog is fake. If the fake blog wiki would ever go back online I would add my blog.
Naive & Manipulated:
perhaps worse than being fake?
Cold calling is evil. Nick W has some tips on how to cold call. My personal goal when people cold call me is to drastically increase the likelihood they will have a bad day, and hopefully to contribute to eventual attrition at their workplace.
Recently Google updated their index and relevancy algorithm with Update Allegra.
The update was believed to be related to latent semantic indexing.
Being that my own rankings just dipped, it would be easy for me to take things overly personally and perhaps be a bit biased about the situation. Then again some of my other sites are now ranking way better than they were, and I also pointed out this problem before it ever had any significant effect on me.
In doing this update the search results are in many areas less than stellar. It is understandable that shuffles will occur, as they must to consistently improve relevancy, but on more than one occasion Google seems to have lost focus on its official mission statement.
They tell you to design content for the user. Link to quality resources. Act as if search engines are not even there. Generally this is good advice for many webmasters.
What they do not tell you is that they do not follow their own guidance.
Sure, trying to rank for a term like "SEO" or other generic terms may be a bit unrealistic for many, and only a few sites can rank well for such a term. I am not particularly saying that I believe I deserve to rank #1 in Google for "SEO", because it is a generic term and they owe me nothing.
On another front there are brand names that people work long and hard to build. Sure the search results are just informational pages about a topic, and maybe Google doesn't give a shit about my brand, and that is fine too.
Where the real problem exists though is that since I have worked so hard to build that brand it gets a ton of traffic and people expect to see my site there.
When people search for stuff like "seobook" and 10 out of 10 of the front page results reference me but I am not listed that provides a poor user experience for Google's users.
To try to prevent their results from being manipulated they have often thrown the baby out with the bathwater. But maybe in the hopes of achieving their longterm goals Google realizes they have to take short term hits.
What if Google is wrong in their desires though? What if their desire to fight off commercial manipulation is so great that they fail to accept commerce as part of the web, and too often show informational results when people want to shop? Would that eventually cause people to stop using Google? Would accepting markets for more of what they are without trying to bias them away from marketing and toward aged sites or information dense pages potentially create a more efficient market?
A buddy of mine pointed me to a white paper by Zoltan Gyongyi, Hector Garcia-Molina, & Jan Pedersen about a concept called TrustRank (PDF).
Human editors help search engines combat search engine spam, but reviewing all content is impractical. TrustRank places a core vote of trust on a seed set of reviewed sites to help search engines identify pages that would be considered useful from pages that would be considered spam. This trust is attenuated to other sites through links from the seed sites.
TrustRank can be used to
automatically boost pages that have a high probability of being good, as well as demote the rankings of pages that have a high probability of being bad
help search engines identify which pages should be good candidates for quality review
Some common ideas that TrustRank is based upon:
Good pages rarely link to bad ones. Bad pages often link to good ones in an attempt to improve hub scores.
The care with which people add links to a page is often inversely proportional to the number of links on the page.
Trust score is attenuated as it passes from site to site.
To select seed sites they looked for sites which link to many other sites. DMOZ clones and other similar sites created many non-useful seed candidates.
Sites which were not listed in any of the major directories were removed from the seed set; of the remaining sites, only those backed by government, educational, or corporate bodies were accepted as seed sites.
When deciding which sites to review it is most important to identify high PR spam sites, since they are more likely to show in the results and because it would be too expensive to closely monitor the tail.
TrustRank can be bolted onto PageRank to significantly improve search relevancy.
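A toy version of the biased PageRank computation behind TrustRank can make the idea concrete; the damping factor, iteration count, and the simplification of letting dangling pages leak trust are all arbitrary choices for illustration, not the paper's actual parameters:

```python
def trustrank(links, seeds, beta=0.85, iters=50):
    """links: dict mapping a page to the list of pages it links to.
    seeds: the hand-reviewed trusted pages.
    Trust starts concentrated on the seeds and decays by beta per hop."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    # Bias vector: all initial trust mass sits on the seed set
    d = {p: (1 / len(seeds) if p in seeds else 0.0) for p in pages}
    t = dict(d)
    for _ in range(iters):
        nxt = {p: (1 - beta) * d[p] for p in pages}
        for p, outs in links.items():
            if outs:
                share = beta * t[p] / len(outs)  # split trust across out-links
                for q in outs:
                    nxt[q] += share
        t = nxt
    return t
```

Pages reachable from the seed set accumulate trust, while pages no trusted site links to stay at zero, which is exactly the boost/demote behavior described above.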
So a friend of mine is building a tool which will likely be publicly available for usage, and it may run a thousand or a few thousand queries a day. This tool may query some of the major search engines and may need to use some open HTTP proxies.
Does anyone know how he can gain access to reliable open HTTP proxies, and what costs would be involved?
Feel free to email me at seobook aT gmail DoT com if you do not want to post anything in the comments.
If and when he completes the tool I will mention its launch on this site :)
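For what it is worth, the simplest approach is round-robin rotation through a proxy list; here is a sketch using only Python's standard library, where every proxy address would be a placeholder to fill in with real proxies:

```python
import itertools
import urllib.request

def assign_proxies(urls, proxies):
    """Round-robin assignment: pair each request with the next proxy."""
    return list(zip(urls, itertools.cycle(proxies)))

def fetch(url, proxy):
    """Fetch url through a single 'host:port' HTTP proxy (placeholder)."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": "http://" + proxy}))
    return opener.open(url, timeout=10).read()
```

fetch would then be called once per (url, proxy) pair, ideally with delays between requests so the tool stays polite to whatever engines it queries.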
How to Be a Consultant:
Create The Warm Fuzzy Feeling™. Reading it certainly takes much longer than 10 minutes, but it is well worth it if you are considering becoming a consultant.
The list is great, but on the web / marketing front I would also add create affiliate and content sites to help build a stable income stream when down periods occur.
Even when you have few clients you help shore up your technical understanding by creating things. If you create great sites then they will make money and you will be able to better filter what work you are willing to take on. If you create lousy sites then they will make for great research and will help you identify symptoms of a lousy site when prospective customers contact you.
As stated in that article, it cannot be stressed enough
how important it is to be easily available; &
how amazingly well syndicated articles act as sophisticated salesmen
Many people have been noticing a wide shuffle in search relevancy scores recently. Some of those well in the know attribute this to latent semantic indexing. Even if they are not using LSI, Google has likely been using other word relationship technologies for a while, but recently increased their weighting.
How Does Latent Semantic Indexing Work?
Latent semantic indexing allows a search engine to determine what a page is about outside of specifically matching search query text.
A page about Apple computers will likely naturally have terms such as iMac or iPod on it.
Latent semantic indexing adds an important step to the document indexing process. In addition to recording which keywords a document contains, the method examines the document collection as a whole, to see which other documents contain some of those same words. LSI considers documents that have many words in common to be semantically close, and ones with few words in common to be semantically distant. This simple method correlates surprisingly well with how a human being, looking at content, might classify a document collection. Although the LSI algorithm doesn't understand anything about what the words mean, the patterns it notices can make it seem astonishingly intelligent. source
By placing additional weight on related words in content, or words in similar positions in other related documents, LSI has a net effect of lowering the value of pages which only match the specific term and do not back it up with related terms.
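The machinery behind LSI is a truncated singular value decomposition of the term-document matrix; here is a toy NumPy sketch, where the tiny vocabulary, the counts, and the rank are purely illustrative:

```python
import numpy as np

# Rows = terms, columns = documents (raw counts; real systems use tf-idf)
terms = ["apple", "imac", "ipod", "orchard"]
A = np.array([[2, 1, 1],   # "apple" appears in all three docs
              [1, 1, 0],   # "imac" only in the computer docs
              [1, 1, 0],   # "ipod" only in the computer docs
              [0, 0, 2]])  # "orchard" only in the fruit doc

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                   # keep the top-k latent "concepts"
docs_k = (np.diag(s[:k]) @ Vt[:k]).T    # documents projected into concept space

def cos(a, b):
    """Cosine similarity between two concept-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

In the reduced space the two computer documents land close together while the fruit document sits apart, even though all three share the word "apple", which is the effect described above: pages backed up by related terms cluster with their topic.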
LSI vs Semantically Related Words:
After being roasted by a few IR students and scientists I realized that many SEOs (like me) blended the concepts of semantically related words with latent semantic indexing, and due to constraints of the web it is highly unlikely that large scale search engines are using LSI on their main search indexes.
Nonetheless, it is overtly obvious to anyone who studies search relevancy algorithms by watching the results and ranking pages that the following are true for Google:
search engines such as Google do try to figure out phrase relationships when processing queries, improving the rankings of pages with related phrases even if those pages are not focused on the target term
pages that are too focused on one phrase tend to rank worse than one would expect (sometimes even being filtered out for what some SEOs call being over-optimized)
pages that are focused on a wider net of related keywords tend to have more stable rankings for the core keyword and rank for a wider net of keywords
Given the above, here are tips to help increase your page relevancy scores and make your rankings far more stable...
Mix Your Anchor Text!
Latent semantic indexing (or similar technologies) can also be used to look at the link profile of your website. If all your links are heavy in a few particular phrases and light on other similar phrases then your site may not rank as well.
Example Related Terms:
Many of my links to this site say "SEO Book" but I also used various other anchor text combinations to make the linkage data appear less manipulative.
Instead of using SEO in all the links some of them may use phrases like
search engine optimization
search engine marketing
search engine placement
search engine positioning
search engine promotion
search engine ranking
Instead of using book in all the links some other good common words might be
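However you choose your variations, the mixing itself can be sketched as weighted random sampling; every anchor text and weight below is a made-up example, not a recommendation:

```python
import random

# Hypothetical anchor text pool: the brand phrase dominates,
# with related variations mixed in so the profile looks natural.
ANCHORS = {
    "SEO Book": 40,
    "search engine optimization book": 15,
    "search engine marketing guide": 10,
    "SEO ebook": 10,
    "seobook.com": 15,
    "this SEO guide": 5,
    "click here": 5,
}

def pick_anchor(rng=random):
    """Draw one anchor text according to the weights above."""
    texts = list(ANCHORS)
    weights = [ANCHORS[t] for t in texts]
    return rng.choices(texts, weights=weights, k=1)[0]
```

Drawing from a pool like this keeps the brand anchor dominant while spreading the remainder across related phrases, instead of hammering one exact string in every link.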
How do I Know What Words are Related?
There are a variety of options to know what words are related to one another.
Search Google for search results with related terms using a ~. For example, Google Search: ~seo will return pages with terms matching or related to seo and will highlight some of the related words in the search results.
Look at variations of keywords suggested by various keyword suggestion tools.
Write a page and use the Google AdSense sandbox to see what type of ads it would deliver to that page.
Read the page copy and analyze the backlinks of high ranking pages.
Google Sandbox and Semantic Relationships:
The concept of "Google Sandbox" has become synonymous with "the damn thing won't rank" or whatever. The Sandbox idea is based upon sites with inadequate perceived trust taking longer to rank well.
Understanding the semantic relationships of words is just another piece of the relevancy algorithms, though many sites will significantly shift in rankings due to it. The Google sandbox theory typically has more to do with people getting the wrong kinds of links or not getting enough links than it does with semantic relationships. Some sites and pages are hurt though by being too focused on a particular keyword or phrase.
Where do I learn more about Latent Semantic Indexing?
A while ago I read Patterns in Unstructured Data and found it was written in rather plain, easy-to-understand English.
Brian Turner also listed a good number of research papers in this thread.
I'm not about to go post my research and examples on a public forum. But I'll warn you now - if you're not varying your anchor text, and you're not writing pages synonymous with your term that don't contain the term you're targeting, you're going to be in a world of hurt within the next 90 days.
We've been tracking this update for the last 6 months. I was surprised to see it happen now - I honestly didn't expect it until next month or March, but it's here.
I have a page about "baby clothes". I link to my site 100 times with the anchor text "baby clothes"
I now pull out the words "baby clothes" and all the links pointing to my site with the words "baby clothes"
Do I still have footing to rank for that term "baby clothes" after you've run some sort of semantic analysis on it?
That's my simplistic explanation. I think they're doing something very similar, but taking links into account like that and maybe even devaluing some links on the "main" term...
Well, if it hasn't changed by Monday I'm going out to buy a black hat.
If irrelevant junk is what Google wants then irrelevant junk is what it's gonna get. :-(
Man I'm glad I diversified my sites. I think I will work on diversifying some more...
Google Inc. is all about money. And IMHO ... so are Yahoo Inc. and Microsoft Corp. As webmasters we are the people who build sites and depend on these money hungry companies, who in the heat of the hunt put their interests miles ahead of ours.
My main concern with this new update is that if you search for my brand name (and there are quite a few that do, based on referrals), then right now my site does not even rank. Our brand name is perhaps the best in my industry, and Google is, in my opinion, diluting my brand name and costing my company money. The first result for my brand name is a spammy "scraper site" which is actually a SERPs page from somewhere - so that's basically useless.
The Hidden or Not so Hidden Messages:
If you are entirely dependent on any single network and a single site for the bulk of your income then you are taking a big risk. Most webmasters would be best off having at least a couple of income streams to shield themselves from algorithm changes.
If you are new to SEO you are best off optimizing your site for MSN and Yahoo! off the start and then hoping to later rank well in Google.
Make sure you mix your anchor text to minimize your risk profile. Even if you are generally just using your site name as your anchor text eventually that too can hurt you.
Search algorithms and SEO will continue to get more complicated. But that makes for many fun posts ;)
Update: a few additional tools recommended in our comments and the comments at ThreadWatch
Yahoo! launches Yahoo! Q, which shows contextually relevant news and links in a small pop up box next to content.
"The thinking is that if you can read an article, you can be inspired to search," said Ken Norton, senior director of product management at Yahoo search. "We're bringing search to the moment of inspiration... We'll save them time and energy, and the most relevant search." source: MarketWatch
A couple weeks ago Jakob Nielsen talked about using fat links (or smart links, which offer multiple options or open multiple windows when clicked). This is the first implementation of the concept I have seen by any major web player.
Users can download the Yahoo! Q DemoBar or add extensions to Firefox.
Eventually Yahoo! may integrate ads into their Q boxes, but off the start they are primarily hoping to improve search usage. The fact that Firefox is part of the beta release means that Yahoo! is really starting to create products which the web community will help market for them.
I have not tested it much, but it sure sounds like cool stuff.
PostScript: I installed Yahoo! Q on all my individual post pages. It was easy to install, but I am kinda tired.
A few things I do not like about it...
the Yahoo! Search blog has not yet installed Yahoo! Q. What is up with that? ;)
It slightly messed up my template. Not sure if I am at fault or it is at fault.
It requires me to pop the form element up within the content tags, when I would have preferred to have it lower, like near all the other search engine links. Currently if I do that it might place too much weight on the post title.
Since many of the highlights will be at the bottom of the screen it will require the user to scroll down to see the Q box. Perhaps they could find a way to ensure a large portion of it fits on the screen?
So I was looking for a site of a well known SEO in Google and he does not show up for his site name.
I remembered a few others this happened to recently and spoke to a friend who has seen a bunch of this. It appears that this is a rather common occurrence now, where sites that are aggressively improving their rankings stop showing up for their keyword and sometimes their site name.
I looked at some of the keywords for this site and some of the deep pages are ranking poorly for his primary terms in Google, but they are outranking the home page (which heavily targets those same terms and is absolutely buried). None of his pages rank for his site name.
I suppose this is a good way for Google to attack people selling competing advertising systems that manipulate their index. Rank them lowly for their keywords AND remove them from the index for their site name.
If people do not show up for their own name it hurts their brand. On the web AND off the web their entire brand is diminished by not showing up for their own name.
Then the only way these people can show up for their own name and brand is by buying in on AdWords, and if you have a strong brand that can become a competitive landscape and those costs can add up quick.
Pretty damn cool self regulating system if you are Google, but kinda sucky for joe average SEO company. :(
The attraction of hundreds of millions of web searches per day provides significant incentive to content providers to do whatever necessary to rank highly in search engine results. The use of techniques that push rankings higher than they belong is often called spamming a search engine. Such methods typically include textual as well as link-based techniques. Like e-mail spam, search engine spam is a form of adversarial information retrieval; the conflicting goals of accurate results of search providers and high positioning by content providers provides an interesting and real-world environment to study techniques in optimization, obfuscation, and reverse engineering, in addition to the application of information retrieval and classification.
The workshop solicits technical papers and synopses of research in progress on any aspect of adversarial information retrieval on the Web. Particular areas of interest include, but are not limited to:
- search engine spam and optimization,
- crawling the web without detection,
- reverse engineering of ranking algorithms,
- advertisement blocking, and
- web content filtering.
Papers addressing higher-level concerns (e.g., whether 'open' algorithms can succeed in an adversarial environment, whether permanent solutions are possible, etc.) are also welcome.
11 February 2005 E-mail intention to submit (optional, but helpful)
25 February 2005 Deadline for submissions
25 March 2005 Notification of acceptance
8 April 2005 Camera-ready copy due
10 May 2005 Date of workshop
The real question of course, is why would you give away spam white papers to a conference where many current search engineers are part of the program committee?
The Search Wars:
MSN makes the official switch announcement and is to spend big.
To appreciate the financial power of Microsoft you need only look at the various 4th quarter "US Personal Income Soars" news stories, which were primarily caused by Microsoft's $32,000,000,000 dividend.