Running Multiple Brands / Channels In Parallel

There are many problems with running multiple similar brands in parallel.

  • It splits your focus

  • You end up cross-posting stuff where it does not belong

I am going to consolidate many of my domains in the near future. SeoBook and ThreadWatch will remain unique channels, at least until someone tells me that I am screwing up badly and it should be otherwise.

The key for me to do well with them is to respect them for what they are, i.e. this site is supposed to be about actionable SEO tips, and ThreadWatch is about rumor and stuff that is fun and interesting about search / technology / SEO.

I also recognize that some of the people posting comments at TW have been in the game far longer than I. In the past I would reference some of the comments that I thought were great, and I will keep doing that going forward.

Dreamhost: Free Advertorial

So Nick recently sold me ThreadWatch. While transferring the site there was an issue with my account settings. At about 2:30am I sent a help request to Dreamhost. By 3am a Dreamhost rep called me up and sorted it out.

They don't normally do call backs on the graveyard shift, but the guy hooked me up. I just wanted to give them a bit of free marketing for having kick ass customer service.

Do Outbound Links Help SEO? Where Should I Link to?

So I recently set up a bunch of new websites and wanted to link out to a few authoritative sites right off the start. I added various numbers of outbound links to each channel, but after setting up a number of them I got pretty quick at researching where I should link at, even when I did not know a topic that well.

Outbound links are like a gimme in SEO. It's fairly hard to get the right type of people to link at a new site unless you bribe them or it's a great site, but just about anyone can improve their web community by linking out to some of the better resources on their topic.

Terms like PageRank leakage and bad neighborhood have made some webmasters so greedy or paranoid about their link popularity that their sites become islands, which makes them harder to link at.

How do you find the best resources to link at?

What do the Various Major Engines Like?
My first port of call is Myriad Search. Search for the core phrase your site is focused on, synonyms, and phrases slightly broader in scope than your keyword phrase. For example, a site about kayak fishing might also warrant searches for [canoe fishing] and [saltwater fishing].

If you are in a hyper competitive field MSN tends to bring up some fairly spammy results, but seeing the results mixed together gives you a nice flavor of what each engine likes, and sometimes you will find a nugget ranked #7 in Yahoo! or #6 in Ask that all the other engines missed.

Unfortunately I am a fairly default searcher (primarily just using Google) but using Myriad helps me get an idea of what people can get away with (as far as content quality goes) in some of the engines.

Some people recommend other meta search engines, but like Berkeley, I think most meta search engines have way too many ads in the content area to be of any use.

Research, Research, Research!
Yahoo! Mindset allows you to bias your search results toward commercial or informational results. Tilt that puppy full-on toward research and see what sites they think are informational in nature.

Is that Page Created by the Government?
The Yahoo! Advanced Search page makes it easy to search just .gov or just .edu resources (or both at once). Sometimes this will be a miss, but I have found many great resources using the Yahoo! Advanced Search page.
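
If you prefer query operators to the advanced search form, the site: operator does the same job inline in Google; for example, [arthritis treatment site:gov] restricts results to government sites (an illustrative query, swap in your own keywords).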

Directories:
Many directories mark picks or put a star on favorite sites. DMOZ and the Yahoo! Directory are the two most well known directories. You can also browse the Open Directory Project organized by PageRank using the Google Directory.

Don't forget some of the smaller higher quality directories created by librarians. LII is a killer site.

Vertical Authorities:
If you run a finance related site odds are pretty good that you can find something good at Fool.com, MarketWatch.com, Forbes.com, etc.

If you run an automotive site it is easy to link at Edmunds, NADA, Kelley Blue Book, etc.

To drill down go to a relevant vertical authority site and do a site level search for information related to your topic.
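
For example, a Google query like [site:fool.com retirement planning] (an illustrative query) quickly surfaces candidate pages on a vertical authority worth linking at.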

Broad Authorities:
When all else fails I like to link to sites with great overall authority scores if they have relevant pages or channels. Sites which have content on a wide array of topics, like Answers.com, Topix.net, HowStuffWorks.com, and Britannica.com, are easy to link at.

I believe this site lists some of the top .com's.

That is about as far as I have gone with most of these new sites, but sometimes you may want to hunt further if you have an uber spammy topic like cash advance, or if you are trying to go further in depth on a topic that is already well covered. Some other ideas...

Local Search:
If your searches are local in nature you may find some of the best information by using regional search databases.

Filetype:
Don't forget that you can specify filetype. Spamming is a game of margins, and on the whole the average .doc or .PDF is going to be of higher quality than the average web page.
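
For example, [student loan consolidation filetype:pdf] in Google returns only PDF documents, which skew toward research reports and other meatier resources; swap in filetype:doc for Word documents. (Illustrative query, pick your own keywords.)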

Search Engine Showdown has a good chart of search engine features.

Related Sites / Pages:
When you find a good site you can see which sites are related to it. Use a tool like the Google related search or the Touchgraph Google Browser.
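
For example, [related:edmunds.com] in Google lists sites Google considers similar to Edmunds, which is a quick way to walk outward from one good resource into its neighborhood.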

Social Bookmarking:
There are a variety of social bookmarking websites which can help you find that key resource you need to reference to complete an article. Del.icio.us is probably the most popular.

Link Socially: Tip for Blogs:
Technorati, Feedster, Daypop, IceRocket, Google Blog Search, and Yahoo! Blog Search are a few of the more well known blog search engines.

If you are running a blog or some type of a topical channel do not forget to use ego stroke techniques in your content and citations. Even if your idea is better than someone else's, or you read their idea after you came up with yours, still give them a bit of link love.

Sometimes mentioning a prominent blogger (saying they are so right, they are hosed, they are normally spot on but are messed up on issue x) equates to a quality link back and many secondary links from various non clued up ditto heads :)

Monetized Eyeballs...They're Back!!!

Om Malik writes on The Return of Monetized Eyeballs:

Web content deals are on the rise again, and Internet ad spending should reach $12 billion this year, meaning Jupiter’s once-ridiculed forecast wasn’t far off the mark. ... Of course, there are new metrics for valuing audiences. "Not all pageviews are created equal," cautions David Hornik, a partner at August Capital. Hornik and other VCs say the most prized traffic comes from sites that leverage "viral" content to acquire users who are intensely loyal.

If you do what is out of favor, when it comes back in favor you end up growing far quicker than the me-too companies that are always one step behind.

On a related note, here are 10 rules for startups

Matt Cutts on Sleaze Marketing

Matt Cutts posts on some blog and ping RSS Announcer software. A few months ago I also stated my disgust with some of the blog spam sales letters.

For those thinking of investing in blog spam software or other useless related crap, search engines can look at far more than the [delete this pink text] and other obvious software footprints. The Link Spam Detection research paper stated:

A number of recent publications propose link spam detection methods. For instance, Fetterly et al. [Fetterly et al., 2004] analyze the indegree and outdegree distributions of web pages. Most web pages have in- and outdegrees that follow a power-law distribution. Occasionally, however, search engines encounter substantially more pages with the exact same in- or outdegrees than what is predicted by the distribution formula. The authors find that the vast majority of such outliers are spam pages.
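
To make that test concrete, here is a rough sketch of the outlier check described above (my own JavaScript illustration, not code from the paper; the exponent and threshold are made-up values):

function flagDegreeOutliers(indegrees, exponent, threshold) {
  // Count how many pages share each exact indegree.
  var counts = {};
  for (var i = 0; i < indegrees.length; i++) {
    if (indegrees[i] > 0) counts[indegrees[i]] = (counts[indegrees[i]] || 0) + 1;
  }
  // Crude normalizer so the modeled power law sums to 1 over observed degrees.
  var z = 0;
  for (var d in counts) z += Math.pow(d, -exponent);
  var outliers = [];
  for (var d in counts) {
    // Pages the power law predicts should have exactly this indegree...
    var expected = indegrees.length * Math.pow(d, -exponent) / z;
    // ...versus pages that actually do. Big spikes suggest link spam.
    if (counts[d] > threshold * expected) outliers.push(Number(d));
  }
  return outliers;
}

// e.g. flagDegreeOutliers(allPageIndegrees, 2.1, 10) -> suspicious indegrees

The same check applies to outdegrees; in practice you would fit the exponent to your own crawl rather than hard coding it.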

It's funny to see the star internet marketing crew being too lazy to even change out their bogus testimonials. But at least in the short term there is far more money in the following model:

  • launch a shitty software product

  • use hyped affiliate marketing and joint venture opportunities with internet marketing expert hucksters
  • build an opt-in list
  • then spam that list via email (either directly or via affiliates)
  • repeat the process

than in creating real value. Maybe eventually people will get a clue.

On the web there are a large number of people who will try to help you along to doing well, and then there are hollow middle men who want to take your money to shit down your throat. Since it costs virtually nothing to make most software or sites you see far more of the latter in the internet marketing realm.

There was recently a WebmasterWorld thread about why people hate Google and other successful companies. Some chalk it up to jealousy, but I just think it is disappointing when sleaze marketing and / or other dubious business practices bring better returns than honest ones.

Here is a perfect example. So about a week ago I gave away one of many copies of my ebook to a charity and recently got this response:

Hi Aaron,

Great book, I finished it a few weeks ago and now need to take notes on all the highlights. And WOW, with all the resources you have, that was in itself would be worth the price of the book, oh wait, it was free for me, :) lol

I have read quite a few books over the last 1 1/2 years on internet marketing, etc, and I really believe you are one of the few guys out there that truly want to help others and not just try and milk their list, as I heard one big name internet marketer say and that's exactly how I feel when im on his list. It really came through in your writing and the fact that you update the book on a continual basis says enough.

I would like your opinion on http://www.xyz software.com/ from what I can tell, it appears to be a great piece of software but I always like to get a 2nd opinion before I buy. (btw, I was referred to this by one of the guys I consider that will sell any and everything, every week it's a new cant miss product some of these guys are hawking so its sometimes tough to tell the gems from the duds)

I responded (roughly):

Hi Name
thanks for liking the ebook

xyz software = crap
here is why...

-qpw ($30 or so) is great
-you should mix stuff up
-sites that accept automated type submissions will tend to be
bad neighborhoods

see also
http://www.search-marketing.info/newsletter/articles/trustrank-company.htm
http://www.martinibuster.net/2005/11/link-development-is-dead.html

FYI, I recently was bogusly sued and my lawyers are raping me, so if
you feel like you got tons of value out of the ebook donations are
accepted
http://www.seobook.com/archives/001130.shtml

Cheers,
Aaron

From that I got no response or donation. Give a guy your business model and your knowledge and get nothing in response. That's pretty useless considering I just saved him $97.

I find people that have no money to support those who help them, but want to spend money on the latest internet marketing scam software, to be greedy and/or stupid. It's not uncommon for business people to take the low road though.

Traffic Power Sucks saved people over half a million dollars, yet when they asked for donations to fight the bogus lawsuit they got, they did not receive much from the people they helped.

I have about 700 emails in my inbox and I likely will be changing my charity policy soon to be more accommodating to paying customers. I don't expect to profit from helping charities, but the semi shady charity requests weigh on my time and spirits. And I still haven't said thanks to all the people who have helped me fight off the internet marketing scam that is Traffic Power :(

What Links Have the Most Value?

Rand asks: do links from the top ranked sites for your keyword have the most pull?

The question I pose today, however, is on the subject of links from the top ranking pages in the existing SERPs. These links are heavily pursued by SEOs and traffic builders as good sources for both referrals and high conversion visitors. After all, if the visitors are primarily coming through search, it's easy to determine whether they are likely to convert simply by running PPC campaigns in those results. But, let's imagine for a moment that the visitor value was entirely removed from the equation, and the link was purchased/cajoled/traded purely for the purpose of boosting rankings.... Are the highest ranking pages on a subject providing the most valuable links?

Jim Boykin offers a workaround into the neighborhood:

I'd have to say that getting links from the "Similar pages" to those in the top 5 might even be better. (Google's "related:" command).

These "related/similar" sites are sites that have common backlinks to the top sites...so a kind of "back door" approach to getting "in the neighborhood" is approaching those sites since they help to identify the neighborhood.

It is a bit hard to separate out the direct and indirect effects of a link, especially if a story goes viral. A few months ago EGOL made a post about the importance of natural links and other elements of SVA. Sometimes the best part of a good link is the onslaught of secondary links that follow.

Partial Article Link Exchange

I have not tried this idea out yet, but St0n3y has a new content exchange take on the link exchange idea.

Some people who would be willing to trade content are too lazy to write an article for your site. So you write an article for their site that links back to your site, and then write most of an article for your own site that links to them.

Leave out a section for pointers or tips and let them write a pointer or two including links back to their site.

How Do You Turn Vision Into a Viral Story?

Bausch & Lomb did a study on Beer goggles.

Within a couple years up to 20% of viral linkbait will likely be from various scientific(ish) commissioned studies and the like. Even if the stories are half-assed the added controversy would likely equate to more links.

[from TW]

Blog About Early Days at Google

Xooglers - a blog about working at Google. Lots of interesting stories. [via SEW]

In other Google news, it appears they are testing click to call and are looking deeply at print ads.

How to Make Google AdWords Ineffective

In How to Become the Doctor for a Famous Rock Band Clifton Sewell tells the whole world (including his competitors) that AdWords is currently underpriced in his vertical.

The doctors have been running paid ads on Google for four months now, under terms such as "San Francisco Doctor". They're spending roughly 50 cents per click for top positions and their total monthly spend is about $500. "We get about 20-30 new patients a month from it, so we're happy," said Clifton.

Next month part 2 of the series will be out, where Clifton bitches about how AdWords is ineffective due to being hyper competitive.

Mapping the Google vs eBay Local Shopping Wars

WSJ: Getting an Oil Change Off eBay [sub req]

EBay is aiming to take over the phone book's customary role as the first place people turn to find local services from housecleaners to accountants.

While eBay Inc.'s focus for now is on auto services like oil changes and brake jobs, its goal may be to connect consumers with local businesses of all kinds. This could signal a major shift in the way consumers shop for such services and greatly affect pricing and competition among local shops.

Does eBay have any sort of a map? How are they going to do local without one? Craigslist (which eBay owns part of) links off to Google and Yahoo! Maps. I think Google and Yahoo! get their map information from the same source: Navteq. Not sure if Navteq is going to go after local search as well, but they recently partnered with Zagat.

Any chance someone would want to scoop the leading map company? Are there any other high quality digital mapping companies?

Froogle was recently upgraded, and without a hitch! Google's Froogle Drops eBay for a Day.

When PayPal was upgraded in June Google overshadowed the news with the announcement of Google Wallet.

Here is yet another article about who should be afraid of Google.

Expert Claims to Have Beaten Google Sandbox...

So WMW has a subscriber only thread titled Expert Claims to Have Beaten Google Sandbox. The best info is in the free Matt Cutts on the Google Sandbox thread. DaveN mentions a site that was quickly out of the box. His main tip?

don't think like an SEO
build site go get links from ...... <- seo

Dave also showed some encrypted logfiles showing that he was getting decent traffic from a couple sites outside of search engines.

RAE also offers up

IME it is more accurate to go *after* traffic and not links.

Find the Easter Eggs in Google AdWords

Andrew Goodman says there is an Easter egg feature in the new AdWords system:

And did we mention pay-per-call, separate content bidding, an obscure nameless easter egg feature that I don't wish to comment on but would like to thank Google for adding apparently in response to a wish list entry I think I posted long ago at Webmasterworld and SE Watch Forums...

Anyone know what egg Andrew is talking about? A bit hard to search WebmasterWorld right now, but SEW has What are your Top 5 AdWords feature or tool requests?

Based on a bit of thinking and Andrew's posts in that top 5 features thread I think some of the cool things you can now do with AdWords are:

  • Run site targeted content ads without paying CPM rates by bidding on the official site names or common phrase matched page elements of sites you want to advertise on.

  • Target ads against competing products without competitors being able to prove you are using their trademarks or product names to trigger your ads. This prevents you from taking the needless large amounts of crap that I took about a year or so ago.
  • In the past Andrew also hinted at trailer park geotargeting and a Googleplex-cam.

Angel Funding Via Google AdSense

When I recently interviewed Matt Cutts he stated that many companies at the Web 2.0 conference were powered by AdSense. Kevin Burton hopes to take AdSense one step further, using it for angel funding for TailRank. Some others are already donating their AdSense earnings.

Linkblogs Talking Content, Content, Content

Martinibuster: Link Development is Dead

THEY don't want you to promote your site. Anything that smells promotional is getting whacked at the knees.

Justilien: Using Google’s Love Affair with Quality Content to Garner Links

Jim Boykin is even using email spam and fax spam as content! Treehugger. ;)

Google to Reenable Inactive Content Ads

Until recently, when keywords were disabled in Google AdWords they were also disabled on the content network, but that is no more.

Why do 'inactive for search' keywords remain active for content?

The Google AdWords system uses all the keywords in your Ad Group to help match your ads with relevant content network sites. In some cases, keywords which have proven ineffective when triggering your ad for search turn out to be very effective when triggering content impressions. In other cases the keyword is simply useful as context in helping the system determine the overall subject areas of your ads.

How can they have such a large network and then just randomly announce that, effective now, things are changed?

Some advertisers who do not log into their accounts in the next few days may come back from their holiday break to see a ton of formerly disabled overpriced content clicks killing their ROI.

Google Killing the Press? The Press Fights Back?

Slate posts a worst case scenario article about Google:

It wasn't until Knight Ridder Inc.'s largest stockholder, Private Capital Management LP, called for the newspaper chain's breakup that the creative destruction of market forces turned on Google and began its rout.

Google's Froogle Supports Local Shopping Search

Right before Black Friday Google unveiled a new local shopping tool. Froogle now mashes up with Google Maps. According to a Silicon Valley article:

The Mountain View, Calif.-based company developed the free tool to help consumers avoid the frustration of traveling to a store that no longer has an item on their shopping lists, said Marissa Mayer, Google's director of consumer products.

Froogle, a comparison shopping site that Google launched three years ago, will continue to give visitors the option to buy the merchandise online. Google receives a commission for the online referrals.

I am not sure what they meant with that Google receives a commission bit. Is that just for the ads near it? In 2003 when Mike Grehan interviewed Craig Nevill-Manning, Craig said:

BUT - the bottom line is - they are unpaid listings, they're unbiased. They're the best results we can find for those products online. ... It'll be free forever.

The New York Times made it sound as though the Silicon Valley article was misquoting or overly vague in its description of how this service will make Google money.

The service will be freely available to merchants in the United States, Ms. Mayer said. Google, as it frequently notes, plans to gain revenue from the new Froogle service by placing relevant text ads on the same page as the local results.

The company also believes that it gains revenue when users employ Google more frequently as its services become more useful.

The Silicon Valley article also stated:

Initially, Google is depending on a contractor to pull the inventory information from several hundred major merchants. The search engine hopes to make the service even more comprehensive by encouraging stores to submit their own customized merchandise lists to the newly created "Google Base" - an information clearinghouse for everything from family recipes to scientific formulas.

What vertical search site is safe?

A while ago Battelle had a highly related post about comparison shopping called The Transparent (Shopping) Society. The New York Times made it sound like the eventual goal of this launch is spot on with what John was describing:

Marshal Cohen, chief retail analyst at NPD Group, a market research firm in Port Washington, N.Y., said that if Froogle delivered up-to the-minute inventory updates from retailers, "consumers will finally know whether a trip to a store is worthwhile."

Google wants to be the default inventory information clearinghouse. Users love defaults. I am guessing the value of being the default shopping search site is worth far more than any value they would extract by charging for the feeds, at least off the start. Just like with regular search, there will be incentive for merchants to spam this service. Any idea how Google will fend off spam if they aren't charging? Or are they charging?

GotLinks? Google Kills Reciprocal Link Networks, Even When You Don't Reciprocate...

Illuminating post by Greg Boser. One of his ex-clients was ranking well in Google when they parted ways. Since then the ex-client got busy entering a reciprocal link network, and his rankings tanked.

the details...

the experiment:

I collected a sample of 50 keyword phrases being targeted by sites in the GotLinks directory. In order to get a balanced set of keywords, I randomly selected phrases from several different categories.

the results:

Now I don’t know about you, but one top 20 listing in Google certainly isn’t enough to convince me that the GotLinks network is a place I want my clients to be.

Greg also notes that

  • The site ranked well in Google before the links were added to the GotLinks network.

  • The ex-client never reciprocated.

and the cause of the penalty?

  • Exceeding a threshold for the total number of links developed in a specific time frame, or

  • Simply being included in a specific network.

Other reciprocal link network owners have been showing how great their networks are, pointing out how some people got 1,600 links really quickly. I think there should be a bit more honesty in the marketing, as it is clear those are not 1,600 links that will bring you to the top of Google's search results.

Greg also promises to further research competitive sabotage. Read Greg's whole post at WG.

Egalitarian Effect of Search Engines

John points at an Economist article that mentions The Egalitarian Effect of Search Engines, which is based upon a thesis that search engines tend to send more traffic than expected to less popular sites.

The study is under scrutiny, but it is a bit counter to the commonly held thought of the rich get richer effect of linkage often mentioned in the SEO sphere.

It is a bit hard to isolate any one factor to determine how search interacts with it. You also have to consider the effects of most popular lists and how those build more linkage at things that are already popular. You know it is getting out of hand when there are aggregators like Diggdot.us that mash up the most popular items from different bookmarking channels.

I believe that as you go to more competitive fields generally competition scales faster than profit, and there is great value in being in a number of smaller niches. Perhaps the single best reason to have a high profile site in a competitive market is to make it easier to launch other channels.

When starting a new website it is cool to look at the power laws that guide the web and try to understand them and use them to your advantage, but I think it is far more important to:

  • see how they apply specifically to your sector of the web

  • think of other sectors near your topic that may be able to give you broader coverage
  • pick topics that would be easy to dominate
  • learn how to become an exception to the rule

Google AdWords Opens Up Content Bidding

A while ago in SEM2.0 Andrew Goodman mentioned that Google was enabling separate content bids. JenSense just posted on the topic from the publisher perspective.

Google was intentionally slow to roll this feature out and made it a bit hard to access, because they would prefer to automate the process using smart pricing and get you to buy as much advertising as you can afford.

Put another way, Google thinks that they algorithmically can determine the value of an ad better than you can estimate it. Having said all that, they do realize that sometimes the feeling of control will increase ad spend from some advanced advertisers, so...

Content bids let AdWords advertisers set one price when their ads run on search sites and a separate price when their ads run on content sites. If you find that you receive better business leads or a higher ROI from ads on content sites than on search sites (or vice versa), you can now bid more for one kind of site and less for the other. Content bids let you set the prices that are best for your own business.

I think a large part of the reason for the early success of Chitika has been that for certain types of content (like consumer electronics) image ads do have more value than the typical textual search ad.

If you have found underpriced content inventory, expect this added control to cause more people to dip their toes in the water and drive up costs.

Not only does this new service allow you to bid differently for content clicks than for search clicks, it also allows you to buy content ads while opting out of search ads. In the past AdWords also allowed content-only ads, but required them to be purchased on a CPM basis.

Now you can buy targeted content-only ads and only pay when people click. A cheap branding opportunity, I suspect. Perhaps with that type of distribution it makes sense to craft ugly, highly graphical, animated contextual ads that say DON'T CLICK HERE.

Track Google AdSense Clicks via Google Analytics - Free AdSense Tracker

There have been third party JavaScripts that track AdSense clicks out for a while, but to my knowledge no free ones that track clicks on Firefox. Until now.

This free script integrates with Google Analytics to allow you to track your AdSense clicks.

This tracking is done through "Goals". A goal is a way of tracking when a website visitor does something you want: buy an item, submit a contact form, or in our case click an AdSense ad.

Create a goal: To create a goal you assign it a URL. This URL doesn't have to exist, as the JavaScript will trigger it.
In the Goal URL field enter "/asclick", and enter "AdClick" for the goal name.

Google Analytics Goal.

Adding the JavaScript to your page: Copy astrack.js to your web server and add the following to the footer of your website. This has to come after all AdSense code.

<script src="/astrack.js" type="text/javascript"></script>
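
The contents of astrack.js are not reproduced in this post, but here is a minimal sketch of how such a script can work (my own illustration, assuming the legacy urchinTracker() function from the standard Analytics snippet is already on the page). AdSense ads render inside iframes, so the parent page never sees the click directly; a common workaround is to note when the cursor is over an ad frame and treat the window losing focus at that moment as a click:

var overAdFrame = false;

window.onload = function () {
  // Watch every AdSense iframe on the page.
  var frames = document.getElementsByTagName('iframe');
  for (var i = 0; i < frames.length; i++) {
    if ((frames[i].src || '').indexOf('googlesyndication') != -1) {
      frames[i].onmouseover = function () { overAdFrame = true; };
      frames[i].onmouseout = function () { overAdFrame = false; };
    }
  }
};

window.onblur = function () {
  // Losing window focus while hovering an ad frame almost always means a click.
  if (overAdFrame) urchinTracker('/asclick'); // logs the goal URL set above
};

The real script is more robust, but the goal firing is the important part: the "/asclick" URL it reports is the same one you entered when creating the goal.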

Testing it: DON'T! There is no way to test that this works, as it tracks AdSense clicks and you can't click your own AdSense ads. You'll just have to trust me that it works :)

After some time you should start seeing goal tracking appearing in your stats.
For example, here is source conversion. Note that the percentages are based on Visitors, not Pageviews, so they do not compare to CTR.

Google Analytics Source Conversion.

So from that graphic we can see that of the 11 visitors that came from MSN, 54% of them clicked on an AdSense ad over the course of their visit.

Below many graphs in Google Analytics is a list with round arrows. If you click the arrows on almost any item you see an option for "To-date Lifetime value".

Google Analytics Lifetime.

Click this and you see the goal conversion for that item. For example, here is the conversion rate for DSL users.

Google Analytics DSL user conversion rate example.

Once you have Google Analytics tracking your clicks, you can cross segment that data with almost any other data Google Analytics shows. It becomes a very powerful way of optimising your site, not just for CTR, but for the type of visitors that click AdSense ads.

[Thanks to Shawn Hogan and Jim from Digital Media Minute]

Content Optimization Changes to Content Generation

A friend of mine mentioned how the noise level in SEO forums has gone from around 95% to about 99%. I think it is largely due to a shift from content optimization to content creation (and remember that this is a site selling a book on optimization, so me saying this is not in any way to my benefit).

Here is why there is a large shift from optimization to creation

  • The ease with which content can be published: It took me less than 2 hours to teach my mom Blogger, Bloglines, RSS, XML, etc. She now blogs every day.

  • the ease with which content can be commented on and improved in quality
  • the casual nature in which links flow toward real content
  • the massive increase in the number of channels and quantity of information makes us more inclined to look for topical guides to navigate the information space
  • the ease with which content can be monetized has greatly increased. AdSense, Yahoo! Publisher Network, Chitika, new Amazon Product Previews, affiliate programs, link selling, direct ads, donations, (soon enough Google Wallet for microcontent), etc.
  • contextual ad programs teach the content publishers to blend links, which has the net effect of...
    • short term increase in revenues for small publishers

    • until users trust links less
    • at which point in time users will be forced to go back to primary trusted sources (i.e. one of the few names they trust in the field, or a general search engine like Google)
  • it is getting increasingly expensive to find quality link inventory that works in Google to promote non content sites, and margins are slimming for many of those creating sites in hyper competitive fields
  • the algorithms are getting harder for people new to the field to manipulate
  • around half of all search queries are unique. Most hollow spam sites focus on the top bits, whereas naturally published information easily captures the longer queries / tail of search
  • duplicate content filters are aggressively killing off many product catalog and empty shell affiliate sites
  • as more real / useful content is created those duplicate content and link filtering algorithms will only get better
  • general purpose ecommerce site owners will have the following options:
    • watching search referrals decrease until their AdWords spend increases

    • thickening up their sites to offer far more than a product catalog
    • switching to publishing content sites
  • and the market dynamics for Google follow popular human behavior, even for branded terms or keyword spaces primarily created by single individuals
    • the term SEO Book had 0 advertisers and about 0 search volume when I launched this site

    • this site got fairly popular
    • SEO Book is now one of my most expensive keyword phrases

As long as it is original, topical, and structured in a non wildcard-replace fashion, content picks up search traffic and helps build an audience.

I am not trying to say that optimization is in any way dead, just that the optimization process places far more weight on content volume and social integration than it did a year or two ago.

The efficiencies Google is adding to the market will kill off many unbranded or inefficient businesses. One of my clients has an empty shell product site and does no follow up marketing with the buyers. I can't help but think that there need to be some major changes in that business, or in 3 to 6 months we won't be able to compete on the algorithmic or PPC front without me being very aggressive.

Yahoo! Hired Andrei Broder

Already mentioned everywhere else, but I think it is worth noting that Andrei Broder joined Yahoo!. Google has been getting the lion's share of hires of big web names (like Vint Cerf), so it is good to see Yahoo! pick up one of them.

Gary Price also added links to a number of research papers Andrei Broder contributed to.

Google AdWords API Intentionally Sends Lower Quality Data

I recently bought advertising on a tool which gave Google AdWords ad impression estimates via the Google AdWords API. My ad dollars, and the Google AdWords search volume tool itself, were both rendered useless when Google decided they wanted to provide more consistent and accurate keyword data.

All quotes below are from the above linked Google Groups thread.

As Ben Michelson put it:

I believe this may be just the first phase of a new "less is more" concept.

I expect subsequent versions will alternately snip out or merge previously inaccurate fields, until finally (AdWords 1.0) the TrafficEstimatorService will be void of inaccuracy by providing no information whatsoever.

Robert, another programmer, was also thrilled by the recent "upgrade":

Well done, Google. I just want to release my first Adwords program - partly based on the ctr value. I work about two month for it. Why we should develop programs for Google, if Google changes the API every two month (see also KeywordService)?

Patrick Chanezon, who was hired by Google as an AdWords API evangelist, stated:

The algorithms used in the TrafficEstimator may return some results that do not match your quality expectations, but they are not skewed in any way.

And here I thought making something inaccurate was skewing it...

Inasisi ran through some examples of the intentional data skewing and said:

If it is not on purpose, I don't understand why Google is not correcting the huge skewness in its estimates and further remove the only good statistics that we had to access to. If Google felt the need to be consistent to both the API users and the advertisers who use the UI, then they should have provided more information on the UI instead of having to strip them from the API.

For being so concerned with efficient market theory and collecting so much data, Google sure is greedy with their data. They expect marketers to trust them with a bunch, while not trusting marketers with something as trivial as search counts.

Yahoo! and eBay allow access to their old marketplace data because it helps drive up costs, commerce, profits, and makes a more efficient market. Why can't Google get a clue on this?

SEO Chat Implodes

SEO Chat is quite possibly the most overly commercialized forum I have ever seen. They get their content free, and I think most of the moderators worked for free too. Recently they once again made changes without informing the moderators, and this time they pissed Rand off pretty good.

The irony of it is that they said they were fixing up the site for SEO reasons and did not ask any of the SEO moderators about the changes beforehand. Pretty stupid, IMHO.

Surely there is a great deal of noise they are trying to contend with, but after a system becomes noisy you can't change it without breaking it.

Channels can have a bunch of noise and still do well, but if some of the things that add some of the noise draw people toward your network then you are going to lose big when you remove them, especially if you do it in a disrespectful manner.

You only need about a half dozen to a dozen members to make a good community, and if you lose them then you are a bit SOL.

As a friend I met at Pubcon recently stated, creating a hierarchical framework can make moderators think you are helping them by letting them be a moderator, but there still needs to be some level of respect.

Rand wants people to join Cre8asite, and Barry thinks Digital Point is more fitting.

I see Digital Point forums doing well long term because

  • being uber technical and monetizing pageviews, Shawn will have no problem dealing with the massive server load

  • it is built around openness with minimal editing
  • Shawn created a bunch of free useful tools that are easy for anyone to use

Going forward I think most successful communities will be more about setting up a functional social framework and letting the best framework spread rather than advertising top down systems which do not respect their users.

Google TrustBox

Would the Sandbox concept be more accurately named the TrustBox?

NFFC, Lots0, and Massa highlight in far more detail what I was hinting at in my recent sandbox post and what I was trying to say on my WMW panel. Like it or not, SEO is largely becoming a game of public relations in many competitive industries.

There have been other semi-recent posts about the shift of bulk spam toward trust related techniques.

For those who recently complained that link trading doesn't work, it is largely because most link exchange offers and opportunities are garbage.

A long time ago I thought that search algorithms were going to advance to the point that it would be easier to influence (barter / manipulate / become friends with / etc.) people than search algorithms. With algorithms like TrustRank and the viral nature of blogging you really don't need to seek the approval of all that many people to do well. Put another way, if Danny Sullivan likes and frequently references your SEO website then odds are Google will too. In every industry there are going to be a limited number of people like Danny.

BTW, Danny also recently posted about the move toward trusted links.

If you watch Google closely enough you get a lot of good free tips on public relations. They are ALWAYS in the news. And it is on all fronts.

  • controversy (Google Print)

  • business (why split your stock when a lofty price makes front page news in the business / finance section)
  • technology (too numerous to mention)
  • etc etc etc

Google Sitemaps Updated...

Google Sitemaps now has more features on top of showing crawl stats and crawl errors:

  • PageRank distribution (high, medium, low, not yet assigned)

  • top 5 Google search queries for getting clicks to your site
  • top 5 Google queries returning your site in search results

Seems like that is just a tease at giving information (and if you want real stats you have to use Google Analytics to give them more info back), but here is the Sitemaps stats FAQ page.

I don't think Google really needs or wants the site owner sitemap data so much, I just think they want to be the default service people use in case it is useful down the road. That is why they are throwing in the few extra "goodies". Storing data costs Google next to nothing.

Most likely Google is the default search tool, advertising tool, email tool, analytics tool and free information storage database for a large number of people now.

Google AdWords at Bottom of Search Results

I was not able to get a screenshot, but at WMW Vegas I noticed that when Baked Jake was looking at Google search results for [Gwen Stefani Tickets] there were Google AdWords ads at the top, right, and bottom.

WMW Paid Link Advertising Panel

Patrick Gavin gave a similar presentation to his recent San Jose one.

Stuntdubl mentioned the techniques of Link Ninjas, which is a link building seminar that came out of the presentation.

He posted quite a bit of good stuff, like some of the recurring themes on his blog (link naturally, neighborhoods, use a variety of link types, etc). I will see if he posts his presentation online. If so I will update this post. Todd has become really good at presenting considering he only started somewhat recently.

Philip Kaplan of AdBrite showed his recently launched intermission ads (mentioned here). Also noted that AdBrite does not do direct links and is exceptionally transparent.

Martinibuster
Online magazines are sometimes underpriced and have great link neighborhoods.

Gives example Google Search [advertise $15 per month -cpm]

look for websites for stuff like [this website closed]

run Xenu Link Sleuth on directories to find broken links...some of those may be easy sites to buy cheaply

emphasizes alternative sources of links

look outside same networks everyone else is using

Q&A: there was a question about Google hating on paid links

don't forget Yahoo! and MSN give credit
stay on topic so you get direct value too

managing link buys?
you can use AdBrite to mine information (this could also be used to help you find what the top posts or topics are on some competing channels)
Excel can be used to show link dates, which also helps show the value if you are tracking

Oilman mentioned searching for powered by xyz forum plus a topic (like sci fi) to look for some potential cheap link buys. See the example below.
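
For example, a Google query like ["powered by vBulletin" science fiction] (an illustrative query) digs up sci-fi forums that might sell links cheaply.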

WMW Las Vegas Organic Site Reviews Panel

On this panel sites are reviewed for what they could do to improve their SEO.

ArtInternationalWholesale.com

  • use specific page titles on deep pages

  • Tim Mayer recommends optimizing for image search since it has lots of traffic and few people optimize for it. Use proper file names, alt tags, and link at the images.
  • duplicate content issues (individual product pages are so similar)
  • Matt says they need to look harder at link quality
  • the site is duplicated on the .co.uk domain

OnlineHighway or InformationHighway...something like that...I so could not see the URL

  • went from 50,000 to 5,000 visitors per day on Update Allegra

  • using popunders is just as evil as popups
  • the purpose of the site is unclear from looking at a page
  • used to have multiple location based URLs...301ed to one central domain. Matt Cutts recommended that.
  • Baked Jake said it can take 2 weeks to 6 months for 301s to take effect

TicketsToGo.com

  • TicketsToGo seems penalized in Google since October 2004

  • also created TicketsToGo.net because of that
  • duplicate content issues
  • Jake recommends starting from the bottom up. Building links into some of the subject specific pages and then working your way up.
  • target Geo specific concerts
  • Matt Cutts said "tell me about your backlinks" ... uber spammy reciprocal linking campaign. said good news is no manual spam penalty, but few of the low quality links this site has are doing it any good.

BargainTravel.com

  • Tim Mayer asked what is actually unique about your domain?

  • Yahoo! looks to ensure with travel sites that the travel box is owned by the domain, not an affiliate form. Would not recommend submitting to Yahoo! paid inclusion
  • Matt pointed out bad cross industry linking among his own sites (like mortgage and credit sites), but said there were some good links
  • Tim recommends making the site more unique from page to page and cleaning up the navigation links. The site navigation sitting in the footer, and the page content consisting primarily of wildcard-replace duplicate content, make him think the local pages are for search bots instead of users
  • not only link to related pages about immunization, etc., but also create tables of the location based related information, etc.

MicroMatic.com

  • home page title nice

  • site looks good
  • individual product pages have good data
  • Matt Cutts calls some of their paid backlinks "painfully obvious" to most any search engine. Matt said those links are not hurting them, but they are not helping in Google.

  • could probably be rather easy for a site like that to get many links from beer hobbyist sites

LendingTree.com

  • question about looking at their sitewide links to IACI partner network

  • instead of looking to rank for mortgage, Matt recommends looking at 20 year mortgage loan, etc.
  • Jake recommends geo targeted pages
  • Matt recommends maybe adding more text, but they are already looking at ROI testing and that is why there is limited text
  • internal links can help reinforce topics
  • Matt said their cross network linking seems pretty organic / not with intent to spam. Note that in Google's spam review guidelines IACI's travel sites were whitelisted examples for remote search quality raters

  • mortgage calculator link on LendingTree built for a manipulation test on Google...that was the reasoning the guy gave, and Matt Cutts made a funny face
  • Matt Cutts said the partner links sections on IACI properties do not work as a technique in Google.
  • Matt said the goal of engines is to detect and count editorial quality votes.

Matt Cutts Las Vegas WMW Keynote

Notes...
Spam is a subset of SEO...not all SEO is bad, etc.

Nissan Motors robots.txt blocks all spiders.
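
For reference, blocking every compliant spider takes just two lines of robots.txt:

User-agent: *
Disallow: /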

Testing fixing 302's. Want to accept destination URL except for like 0.5% of the time. Gives SF Giants URL as an example.

Some things in the index can be perceived in our process as the sandbox...does not apply to all sites.

Does not see Google buying DMOZ or killing reliance on it.

Google does not have the ability to hand boost any sites. They do have the ability to penalize things by hand they believe are spam or illegal.

Autolink...references how it was liked at Web2.0. Thinks the launch could have been better. Would like to allow users to enter their own triggers.

Users and privacy...to take search to the next level you need some information about the users. Matt said he wouldn't work at a company that he felt violated users privacy.

Matt has never worried much about hidden table row type techniques to organize word order. With CSS, if you want to see how it influences a file, test it.

Toolbar does not influence how frequently stuff is crawled. It is too easy to spam, and the toolbar does not have equal distribution across various regions. Many people assume some things provide clean signals which are not so clean.

Matt, as a webspam team member, said he has no ability or intent to access the Google Analytics data.

Litmus test of a site for spam is what value does it add to the web. User reviews, forums, community, etc. What makes a site unique.

Matt Cutts hates on paid links. He said they have manual and algorithmic approaches to paid links. Compares effectiveness of paid links going forward to how reciprocal link spam has largely died off with Update Jager.

If you have something creative and useful it is easy to get quality links that are hard for your competitors to recreate.

Not too long ago I interviewed Matt Cutts.

Yahoo! Publisher Contextual Lunch

Why chose Yahoo!?

  • provide control to publishers - not a black box

  • quality network
  • competitive revenue opportunity (over 100,000 ad buyers)
  • opportunity to integrate with Yahoo! content & Yahoo! users
    • customer service & community

Size of Yahoo! Publishing beta?
approximately 2,000 publishers

they just launched ads in rss feeds

  • open to all beta participants

  • diversifies rev ops
  • aligns with growing shift to RSS
  • supports Movable Type and WordPress
  • ads optimized to drive revenue

Yahoo! stated some think 5-6% of web users use RSS, but Yahoo! research showed it was closer to 30% of web users.

Jen asked if Yahoo! has anything similar to Google AdWords smart pricing?

  • not needed for the following reasons

  • allows advertisers to bid separately for the different content channels
  • Yahoo! is more selective with partners

Jen asked when Yahoo! Publisher would be global
likely early 2006

plans for an affiliate program?
want to work to lower the bar to make it easier for publishers to make money and work with Yahoo!...will allow affiliate programs and will likely eventually support CPM pricing

wide range of topics on one site...how to be relevant?
can target ads at the page level, directory level, or site level...page or directory settings can override the site level targeting

going to change rev share percentage after beta?
absolutely not, but eventually may use traffic quality to adjust click price

Will Yahoo! offer behavioral targeting on contextual ads?
no near-term plans, but may eventually

Rate of revshare / how compare to Google AdWords?
Yahoo! does not share the revshare %. more interested in being competitive in allowing you to monetize.
revshare by publisher will vary over time

may eventually say you are in x range... to get into another range you may need to get more traffic, higher quality clicks, etc.

Jen said targeting was no good at start...now better...is it where it needs to be?
still working to improve...pleased with speed at which it is being made better

Will Yahoo! offer a premium publisher program?
may give advertisers more control over who they work with, but even a small publisher may be premium with quality targeted traffic etc

How long will Yahoo! Publisher be in beta?
maybe toward the end of Q1

Jen asked about plans for CPM ads?
may add CPM / CPA. Yahoo! already does CPM on its internal network

Will Overture drop the minimum bid?
unsure.

Robert X. Cringely Keynote at WMW Las Vegas

  • Robert X Cringely created Triumph of the Nerds

  • he once lost a 96,000 word manuscript and there was no restore function. He created the trash can on Apple's Project Lisa, making emptying it a two step process.
  • in 1984 he helped build internal and corporate communications for Apple. In 1991 Apple sold that to Quantum Computer Services (later renamed America Online)
  • spooks went to Xerox PARC, and Xerox offered them a computer at a huge price. The price was too high, so they went to Stanford, and although Stanford never originally created computers to sell, Sun (Stanford University Network workstation) was born. Stanford saw no intellectual property in Sun.
  • Cisco came out of the same building as Sun, and used the same motherboard. Cisco started on credit cards
  • typically companies can get to $600K in monthly sales funded on credit cards; past that they typically fail if they are still funding on credit.
  • Robert could have got 15% of Excite for $1,800 (I think that was the number)
  • recently he has been working on PBS GeekTV
  • he tracks his accuracy, and thinks someone should create something like an accuracyinmedia.com
  • talks about consolidation in the space... MSN / Google / Yahoo! are the only serious competitors.
    • Windows and Office are profitable...nothing else at Microsoft is

    • Microsoft has cost items
    • Xbox: 4 billion dollars lost
    • they spend tons of money on other stuff as a plan B in case Office & Windows fail
    • the extra expenses are there so they can later cut them if profits from Office or Windows falter
    • thinks Google wanted the 4 billion to buy / create something (but unsure what)
    • Google sticks it to competitors
    • Gmail: 1 gig per user... around 3 million users
    • Yahoo! matched it with 154 million email users
    • Google's largest advantage is their clustering of hardware (see Skrenta's post on Google's source of power)
    • Google has an image problem: too busy trying to impress you with their brain instead of helping you think about how smart your brain is
    • Robert believes Google will beat Microsoft & define the internet going forward
    • Yahoo! will reposition to become something far different than Google
    • the Google Search Appliance is important in what it represents... it "just works"... you only have to plug it in
    • if Microsoft tried it you wouldn't trust them, or you would think they would screw it up
    • Google offers life to struggling companies like the dark fiber ones...get 300 boxes on the network
    • perhaps something like a Google internet will be more secure etc, just plain works; Robert sees it coming in the next 2 years
  • on content and monetization...
    • Robert has 200,000 weekly readers

    • his archives give him the same amount of traffic
    • NerdTV costs $1,000 a show and hosting costs the same amount
    • costs about 8/10 of a cent per download + $1,000 fixed cost
    • 130,000 downloads per show
    • people subscribe to 2-3 times as much as they consume, so sometimes it is not beneficial to make data as convenient as possible to access

Opportunity Optimization

Dan Thies was recently interviewed by Pandia. One big thing he stresses is the concept of opportunity optimization, and how many people focused on SEO are missing out:

Beginners have a hard time looking at the rest of the picture. Their #1 problem is probably not traffic, it's conversion, usability, opt-ins, follow-up, pricing, making the right offer.

You can use search engine marketing to help you solve these problems, but if you don't solve them you will eventually fail. Those who make the most profit per visitor have the most resources to compete for rankings and ad placement.

When I speak with someone who wants to improve their rankings, I usually ask if they do pay-per-click. Invariably, the answer is "no, we can't afford that." The bad news is that if your website can't convert well enough to support a PPC campaign, you'll often find that SEO is even more costly, especially in the short term.

To be honest I am pretty guilty of not maximizing monetization per pageview. I struggle a bit with trying to write about what I find new and interesting, when if I dumbed down most of the blog posts to be a better fit for newbies my sales could probably double or triple.

The people who vote with their link popularity to help boost your authority are frequently not the same people who buy your goods and/or services. What are the best ways you have found to be linkable and target new people without seeming overtly boring, etc? Content in multiple formats? Multiple similar channels? Free email tips?

The low cost of traffic from quality SEO can be as much of a problem as a blessing, because it allows people to get away with being fairly inept in other business facets, to the point that when the SEO techniques they use no longer work the only solution is to close the business.

link from TW

Free Excerpt of The Google Story by David Vise

Free excerpt from the book (about space race, biotech, etc.)

I am sure NickW will love this bit:

"Why not improve the brain?" Brin asked. "You would want a lot of compute power. Perhaps in the future, we can attach a little version of Google that you just plug into your brain. We'll have to develop stylish versions, but then you'd have all of the world's knowledge immediately available, which is pretty exciting."

AOL to Offer Reruns on Demand, More Links...

Flying Spaghetti Monster:
Martinibuster on quality linkbait

eBay:
price history - paid service

AOL:
TV Reruns on demand

Authority:
Article by Peter Morville

Book of the Week Club, Knight Ridder to be Sold? Speculation on Google Affiliate

Apparently the people at Google want to rent weekly digital access to books.

Web search leader Google Inc. has approached a book publisher to gauge interest in a program to allow consumers to rent online copies of new books for a week, The Wall Street Journal reported on Sunday.

The proposed fee is 10 percent of the book's list price, the Journal reported, citing an unnamed publisher.

The discussion with the publisher indicates Google may move toward adding a digital book-renting service. - Reuters

In related news about other business models Google and the web may be changing or killing off, Knight Ridder, the large newspaper company, is exploring selling itself. When will Google Affiliate come out?

Free Google Analytics

Urchin on demand:

  • yesterday $199 a month

  • today free

Matt Cutts says:

Blackhat SEOs may be leery of using Google for analytics, but regular site owners should be reassured.

Reassured of what? That Google wants more exceptionally valuable user data :) Lest we forget what a click is worth, or that what is acceptable in search marketing changes as the algorithms do.

Danny said:

Worried Google will use your data or the data overall to better understand how much you are willing to pay for ads, based on conversions. Google said that's definitely not done, nor are there any plans to do that. Nor are there any plans to tap into the data as a means of improving regular search results or to identify "bad" sites, Google said.

Peter asks where that info came from, and I gotta wonder how smart pricing works if they ignore the value received from a click. Why would they only track it one way on certain accounts? That seems counter to that whole efficient keyword market theory so much research is being done on. What value does the data have if they are not going to use it?

Even if they only use your data in aggregate, if you are exceptionally profitable on some terms those keywords could be suggested more frequently to competitors (to help raise those keywords to near fair market value), and smart pricing would discount less on content that your site proves converts. Search engines do not need to know how much money you are making off any term; just a peek at the ratios can give them a good idea when they have enough other data.

You will know the search engine wars are at their peak the day most computers, ISPs, and general web hosting are free and you are being paid to surf. :)

The Value of Writing Articles for Trusted Sites

By writing articles for high quality sites you get high TrustRank links cheaper than you can rent or buy them, many secondary links, and added credibility (I think Andy Hagans may have been asked to speak at a cool conference largely based on a recent article).

As an added bonus, when search engines place more bias on global popularity scores your article can show up for rather competitive terms if your site for some reason drops out of the results.

I was just looking through Google's [search engine optimization], and after close to two years SeoBook.com ranks at ~30 in Google, while Andy Hagans' recent article on A List Apart ranks at #19.

In my interview of NFFC he stated:

we offer marketing on demand, a webmaster needs to be visible in every channel.

Sometimes that means working hard to make your site fit a variety of algorithmic possibilities, and sometimes that means putting backup content on other sites.

Yahoo! Search Index Update Coming...

Another Yahoo! update is coming; followgreg at WMW said it rolled out on 68.142.226.54 first.

I had not mentioned it here yet, but a while ago Yahoo! also dropped the monthly minimum spend on Overture and lowered the initial deposit to $5.

Google a Web Bully? Hot Nacho Speaks Out Against Their Spam Double Standards

A while ago Chad Jones, from Hot Nacho, the site involved with the WordPress content spam fiasco, spoke out about what went wrong.

WordPress hosted about 4,000 content articles about expensive topics. Matt Mullenweg hosted the content on Wordpress.org and placed hidden links on the home page pointing at the articles.

WordPress, the popular blog software which used the hidden links, was back in the Google index quickly. Google is still punishing the owner of HotNacho to this day, as Chad states:

They seem to have taken punitive measures by looking up my other sites via WHOIS and punitively banning a bunch of my sites -- including my hobby freeware sites.

Sites I own (all of which Google has banned):
hotnacho.com
acme-web-hosting.info
avatarsoft.com
notepad.com
free-backup-software.net

Thoughts on his article:

  • I don't like his comparisons of his content vs real spam, but his point that it is hard for human-compiled content to be profitable against automated systems is accurate on many fronts.

  • His claim that Google controls over 90% of web traffic, coming right after he complains about others not doing any fact finding, undermines his credibility.
  • He has some good ideas on content rating, and on the importance of user feedback and strong quality guidelines from the start.
  • I know many other friends who run the exact same business model, but do it profitably, successfully, and in Google's good graces because of how the content is formatted: wrap it in a blog and post a few articles a day to each channel.
  • While he was talking about how his keyword placement software could increase the ability of content to rank, I think it is a mistake to look at it purely from an algorithmic front. The social structure of content matters.
  • It is far easier to build links into topical channels (such as blogs) than article banks.
  • He talks about creating a bunch of freeware and offering free support, but with the mob justice of the web, doing good on one front does not offset your actions on others.
  • I think it is pretty shitty of Google to have banned all of his sites. I mean, who does this help? Where is the relevancy?
  • And yet Google funds much of the garbage they purportedly hate. Google not only acts reactively, but blatantly overreacts when certain issues become public. I suppose they were trying to send a message to Chad Jones, but it was not one honestly focused on search relevancy. I wish I had seen this article sooner.
  • The fact that few people have mentioned the Hot Nacho article shows how biased blogs are toward grabbing the front end of a story and then prowling for the next one before adding any depth or further research. Sorta reminds me of the Nirvana song Plateau, although I admit I am just as guilty of it as the next blogger.

Risk vs Reward In Hiring a Cheap Link Monkey

Not only are the engines getting better at discriminating link quality, but when you outsource your link building to save money you often get automated junk which is sent WAY off target.

That presents three main problems:

  • potential bad publicity (few things suck as badly as Danny Sullivan highlighting one of your own link exchange requests as a bad example, as you know that probably gets read by MANY search engineers)

  • frequently exchanging way off topic links makes the quality resources on your topic less likely to link to your site (and, to a lesser extent, may cost you some of the quality links you already have)
  • if sites are willing to trade way off topic links, odds are pretty good that much of their link popularity is bottom of the barrel link spam. Thus as you trade more and more off topic links, a larger and larger percentage of your direct and indirect link popularity comes from link spam that is easy to algorithmically detect

The net result is that a somewhat well trusted and normal-looking link profile starts to look more and more abnormal. Eventually the bad publicity or the low quality links may catch up with the site, and it risks either getting banned or filtered out of the search results.

If you have a long-term website and are using techniques that increase your risk profile, and those techniques are easily accessible to and reproducible by your competitors at dirt cheap rates, it might be time to look for other techniques.

Some sites that practice industrial strength off topic link spam might be ranking well in spite of (and not because of) some of the techniques they use.

[Update: just got this gem

Hi

My name is Ben, and I'm working with Search Engine Optimisation [URL removed].

I have found your site and believe it would be mutually beneficial for us to exchange links as our sites share the same subject matter. As you may already know, trading links with one another helps boost both of our search engine rankings.

As a result, I am sending this email to inform you about our site and to propose submitting our link to your web page located at; www.search-marketing.info

We would appreciate if you could add a link to our web site on this/your web page, using the following information:

Title Link: Search Engine Marketing
Description: Tailored Search engine marketing campaigns for your business. Leverage our online marketing & pay per click management experience & achive fast ROI.
URL: http://www.[site].com.au/search-engine-marketing.html

NOTE: We will upload your link on our site, when you have notified us our link is live and we can see it online.

Thank you for your time and your consideration.

Sincerely, Ben

Linkmaster
ben@[site].com.au

Can you imagine how shitty their SEO services are for their clients if they send shit like that out for their own site?

They know I am an SEO, and they:

  • are too lazy to grab my name from my site, even though it is on every page (not hard to automate that)

  • say I may know something about how links work (get a clue)
  • call my home page a links page (really stupid)
  • want me to deep link into a useless service page on their site
  • call their search engine marketing services tailored, when it is pretty obvious that they are not using sophisticated or useful techniques for their own site.]

Greg Boser: Blogger

Greg says Oh my God, I’ve Become a Blogger. A great thing for webmasters and search in general, IMHO.

Greg asks:

But now comes the hard part. How do you go about creating a blog about search marketing that is truly unique?

Anyone ever notice that the black hat SEO blogs typically have both higher content quality and more original content than the typical white hat SEO blogs? Apparently, Gordon Hotchkiss has yet to get the memo.

via Oilman

Regulating Search Conference @ Yale

The Information Society Project at Yale Law School is hosting "Regulating Search?: A Symposium on Search Engines, Law, and Public Policy," the first academic conference devoted to search engines and the law. "Regulating Search?" will take place on December 3, 2005 at Yale Law School in New Haven, CT.

Topics covered:

  • Panel 1: The Search Space
    This panel will review the wide range of what search engines do and their importance in the information ecosystem.

  • Panel 2: Search Engines and Public Regulation
    This panel will discuss the possibility of direct government regulation of search functionality.

  • Panel 3: Search Engines and Intellectual Property
    This panel will review past and present litigation involving search engines and claims framed in the legal doctrines of copyright, trademark, patent, and right of publicity.

  • Panel 4: Search Engines and Individual Rights
    This panel will look at the role of search engines in reshaping our experience of basic rights, and at the pressures that the desire to protect those rights places on search.

Early bird registration fees (early registration ends on Nov. 15):

  • $35 for students

  • $75 for academic and nonprofit
  • $165 for corporate and law firm participants

Free Open Source Keyword Phrase List Generator

Probably the least exciting of the SEO / SEM tools I have put out so far: my friend Mike recently created a keyword phrase list generator.

I made it open source, so if you like it feel free to link to it, mirror it, or improve it.
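
If you are curious how such a tool works under the hood, the core is little more than a cartesian product of word lists. A minimal Python sketch (my own illustration, not Mike's actual code; the word lists are invented):

from itertools import product

# Hypothetical modifier and keyword lists; swap in your own.
prefixes = ["", "cheap ", "best ", "discount "]
keywords = ["seo book", "seo tools"]
suffixes = ["", " review", " download"]

# Every prefix / keyword / suffix combination becomes one candidate phrase.
for before, core, after in product(prefixes, keywords, suffixes):
    print((before + core + after).strip())

A handful of words in, 24 phrases out, which is the whole appeal: a few short lists expand into a long phrase list you can feed into your keyword research tools.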

Google May Sell Ads for Chicago Newspaper Company

UPDATE: Google Weighing Test Of Print Ads In Newspapers

Google Inc. (GOOG) is considering testing print advertisements in Chicago newspapers, in a sign that the Internet giant, to date seen primarily as a threat to traditional media, could also become an ally.

If Google could take the inefficiencies out of offline media they probably could end up making the papers more revenue in the short run. The long run is anyone's guess.

Google Sandbox Tip

What is the difference between how real news stories spread and how link spam or artificial link manipulation spreads?

Link Spam Detection Based on Mass Estimation

Not sure how many people believe TrustRank is in effect in the current Google algorithm, but I would be willing to bet it is. Recently another link quality research paper came out by the name of Link Spam Detection Based on Mass Estimation [PDF].

It was authored by Zoltan Gyongyi, Pavel Berkhin, Hector Garcia-Molina, and Jan Pedersen.

The proposed method for determining Spam Mass works to detect spam, so it complements TrustRank nicely (TrustRank is primarily aimed at detecting quality pages and demoting spam).

The paper starts off by defining what spam mass is.

Spam Mass - an estimate of how much PageRank a page accumulates by being linked to from spam pages.

I covered a bunch of the how-it-works theory in the extended area of this post, but the general take-home tips from the article are:

  • .edu and .gov love is the real deal, and then some

  • Don't be scared of getting a few spammy links (everyone has some).
  • TrustRank may deweight the effects of some spammy links. Since most spammy links have a low authority score they do not comprise a high percentage of your PageRank weighted link popularity if you have some good quality links. A few bad inbound links are not going to put your site over the edge to where it is algorithmically tagged as spam unless you were already near the limit prior to picking them up.
  • If you can get a few well known trusted links you can get away with having a large number of spammy links.
  • These types of algorithms work on a relative basis. If you can get more traditional media coverage than the competition you can get away with having a bunch more junk links as well.
  • Following up on that last point, some sites may be doing well in spite of some of the things they are doing. If you aim to replicate the linkage profile of a competitor make sure you spend some time building up some serious quality linkage data before going after too many spammy or semi spammy links.
  • Human review is here to stay in search algorithms. Humans are only going to get more important. Inside workers, remote quality raters, and user feedback and tagging gives search engines another layer to build upon beyond link analysis.
  • Only a few quality links are needed to rank in Google in many fields.
  • If you can get the right resources to be interested in linking your way (directly or indirectly) a quality on topic high PageRank .edu link can be worth some serious cash.
  • Sometimes the cheapest way to get those kinds of links will be creating causes or linkbait, which may be external to your main site.

On to the review...

  • To determine the effect of spam mass they compute PageRank twice: once normally, and then again with more weight on known trusted sites that would be deemed to have a low spam mass.

  • Spammers either use a large number of low PageRank links, a few hard to get high PageRank links, or some combination of the two.
  • While quality authoritative links to spam sites are rarer, they are often obtained through the following:
    • blog / comment / forum / guestbook spam

    • honey pots (creating something useful to gather link popularity to send to spam)
    • buying recently expired domain names
  • if the majority of inlinks are from spam nodes the host is assumed to be spam; otherwise it is labeled good. Rather than looking at the raw link count, this can further be biased by looking at the percent of total PageRank which comes from spam nodes
  • to further determine the percent of PageRank due to spam nodes you can also look at the link structure of indirect nodes and how they pass PageRank toward the end node
  • the presumption of knowing whether something is good or bad in advance is not feasible, so it must be estimated from a subset of the index
  • for this to be practical search engines must have white lists and / or black lists to compare other nodes to. These lists can be compiled automatically or manually
  • it is easier to assemble a good core since it is fairly reliable and does not change as often as spam techniques and spam sites (Aaron speculation: perhaps this is part of the reason some uber spammy older sites are getting away with murder...having many links from the good core from back when links were easier to obtain)
  • since the reviewed core is a much smaller sample than the number of good pages on the web, you must also review a small uniform random sample of the web to determine the approximate percent of the web that is spam, in order to normalize the estimated spam mass
  • due to sampling methods some nodes may have a negative spam mass, and are likely to be nodes that were either assumed to be good in advance or nodes which are linked closely and heavily to other good nodes
  • it was too hard to manually create a large human reviewed set, so
    • they placed all sites listed in a small directory they considered to be virtually devoid of spam in the good core (they chose not to disclose the URL...anyone want to guess which one it was?). This group consisted of 16,776 hosts.

    • .gov and .edu hosts (and a few international organizations) also got placed in the good core
    • those sources gave them 504,150 unique trusted hosts
  • of the 73.3 million hosts in their test set 91.1% have a PageRank less than 2 (less than double the minimum PageRank value)
  • only about 64,000 hosts had a PageRank 100 times the minimum or more
  • they selected an arbitrary limit for minimum PageRank for reviewing the final results (since you are only concerned about the higher PageRank results that would appear atop search results)
    of this group of 883,328 hosts, they hand reviewed 892:

    • 564 (63.2%) were quality

    • 229 (25.7%) were spam
    • 54 (6.1%) uncertain (like beauty, spam is in the eye of the beholder)
    • 45 (5%) hosts down
  • ALL high spam mass anomalies on good sites were categorized into the following three groups
    • some Alibaba sites (Chinese sites were far from the core group),
    • Blogger.com.br (relatively isolated from the core group),
    • .pl URLs (there were only 12 Polish educational institutions in the core group)
  • Calculating relative mass works better than absolute mass (which is only logical if you want the system to scale, so I don't know why they put it in the paper; a toy relative mass calculation follows this list). Example of why absolute spam mass does not work:
    • Adobe had lowest absolute spam mass (Aaron speculation: those taking the time to create a PDF are probably more concerned with content quality than the average website)

    • Macromedia had third highest absolute spam mass (Aaron speculation: lots of adult and casino type sites have links to Flash)
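
To make the relative mass math concrete, here is a toy Python sketch of the final step (the PageRank numbers are invented, and a real implementation computes both vectors over the full web graph):

# PageRank computed twice: once normally, and once with the random jump
# biased toward the trusted core (approximating the PageRank contribution
# of the good pages).
pr_all  = {"example.com": 0.0040, "spammy.biz": 0.0035}
pr_core = {"example.com": 0.0036, "spammy.biz": 0.0004}

def relative_spam_mass(host):
    absolute = pr_all[host] - pr_core[host]  # PageRank not explained by good pages
    return absolute / pr_all[host]           # fraction of PageRank due to spam

for host in pr_all:
    print(host, round(relative_spam_mass(host), 2))

Here example.com comes out at 0.1 while spammy.biz comes out at 0.89, so only the latter would cross a spam threshold, and per the paper only hosts above a minimum PageRank would be evaluated at all.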

[update: Orion also mentioned something useful about the paper on SEW forums.

"A number of recent publications propose link spam detection methods. For instance, Fetterly et al. [Fetterly et al., 2004] analyze the indegree and outdegree distributions of web pages. Most web pages have in- and outdegrees that follow a power-law distribution. Occasionally, however, 17 search engines encounter substantially more pages with the exact same in- or outdegrees than what is predicted by the distribution formula. The authors find that the vast majority of such outliers are spam pages. Similarly, Benczúr et al. [Benczúr et al., 2005] verify for each page x whether the distribution of PageRank scores of pages pointing to x conforms a power law. They claim that a major deviation in PageRank distribution is an indicator of link spamming that benefits x. These methods are powerful at detecting large, automatically generated link spam structures with "unnatural" link patterns. However, they fail to recognize more sophisticated forms of spam, when spammers mimic reputable web content. "

So if you are using an off the shelf spam generator script you bought from a hyped up sales letter and a few thousand other people are using it that might set some flags off, as search engines look at the various systematic footprints most spam generators leave to remove the bulk of them from the index.]
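
As an illustration of the in/outdegree idea in that excerpt, a crude Python sketch (the degree counts and the exponent are invented; real work estimates the exponent from the data):

from collections import Counter

# Toy sample of page outdegrees. Natural link graphs roughly follow a
# power law: the number of pages with outdegree d falls off like C / d**alpha.
outdegrees = [1]*10000 + [2]*2500 + [3]*1100 + [4]*625 + [5]*400 + [17]*3000

counts = Counter(outdegrees)
C, alpha = counts[1], 2.0  # assumed exponent, purely for illustration

for degree, observed in sorted(counts.items()):
    expected = C / degree ** alpha
    if observed > 3 * expected:  # arbitrary outlier threshold
        print(f"outdegree {degree}: {observed} pages vs ~{expected:.0f} expected")

Only outdegree 17 gets flagged: thousands of pages sharing the exact same outdegree is the sort of machine-generated footprint the quoted methods catch.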

Link from Gary

Google Accounts Being Pushed to Google AdWords Users

When you log into AdWords they have a notice that you should switch over to the new Google Accounts by January 15th, 2006.

Once you switch over, a new user access sub tab appears, which allows you to share your AdWords account with co-workers without needing to share your personal Google account.

Google has more information about sharing an account and how to send invitations.

Not too long ago Google was giving out Google Account passwords.

Quality Content Without Links Is Not Quality Content...

There is a thread on WMW about the right price to sell an article for. The general consensus is that the author should probably wait it out until their site ranks and just keep their content.

While that is nice in theory, there is no guarantee that a site will eventually rank well just because it has decent content. Of course I am taking stuff out of context here, but you can read the thread to get the gist.

Comment:

As one site is willing to pay you, it doesn't make sense to give your articles away to the other site just to get a link.

Reply:

A friend of mine recently published an article on A List Apart. I think it would be hard to sell most any article for the value he is getting out of the authority of the link from that site, let alone the boost in credibility.

plus good primary links to your site may lead not only to direct exposure and link popularity, but also secondary exposure and more link love.

since your site is new you likely have lots of content and not so many links.

comment:

Whatever you decide, don't make the mistake of granting anyone exclusive rights to publish your work in perpetuity for peanuts.

reply:

for books I totally agree, but if you are obscure / new and / or are operating in a not so well known field and are good at writing articles sometimes giving them away is a great form of marketing.

rule #1: Obscurity is a far greater threat to authors and creative artists than piracy.

comment:

Regarding your site, you will never leave the sandbox unless you keep your content 100% to yourself.

reply:

I think sitting chill with minimal link popularity is far worse than trading some of what you got a lot of for something you don't got a lot of (ie: content for links)

The web has taught me a lot about not dwelling on what things could or should be worth: unless you actively work to make them worth it, inferior products which are marketed more aggressively will often win big.

if you have around a hundred articles I don't think it hurts you to share a few of them.

Some of the links you get by giving stuff away are links you never could have bought. Those are the ones that are usually worth a bunch too.

Friends don't let friends go unlinked. ;)

Playing on the Web...2.0 ;)

Blummy - a Firefox bookmarklet management tool loved by Web 2.0 geeks. It allows you to put many bookmarklets into an expandable box that opens up when you click on it.

For example, the Link Harvester blummlet (code shown below, please ignore the line breaks I added for formatting) looks like:

javascript:Blummy.href
('http://www.linkhounds.com/link-harvester/backlinks.php
?query='+location.href)

and would run Link Harvester on whatever page you are viewing.

A regular bookmarklet for it would look like (again, ignore the formatting line breaks):

javascript:location = 'http://www.linkhounds.com/link-harvester/backlinks.php?query=' + escape(location);

Here is a list of a wide variety of Mozilla bookmarklets, including character count and word frequency bookmarklets.

I was reading some Dive Into Greasemonkey today...good stuff. I just wish I knew a bit more about XPath and Javascript Firefox strategies.

It will probably take me at least a few days before I could make anything cool. I may try though, and if not I could always bug Mike, and maybe Platypus is more my mode :)

A Greasemonkey Hacks book was recently released. Greasemonkey is cool stuff, not just because DaveN says so, but also because you can do things like number search results and import del.icio.us data right into Google search results.

Here is a cool free video maker. I made one today, though it takes forever to upload and sounds like I am eating the mic. I will probably upload it tomorrowish.

I have been far too textual, and think I need to start looking more at trying to learn programming languages, audio, and visual stuff :)

I got to chat for a while with one of the guys from Validome, and they sure do some cool stuff over there.

For those wondering how this post is in any way relevant to search, you can tell a good bit about how competitive a field may be by seeing how many of the top ranked results are annotated.

GoDaddy References Google's Patent

You know you have good reach as a search engine when registrars use your patent numbers to sell domains. GoDaddy says:

Google recently filed United States Patent Application 20050071741. As part of that patent application, Google made apparent its efforts to wipe out search engine spam, stating:

"Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith."

Domains registered for longer periods give the indication, true or not, that their owner is legitimate. Google uses a domain's length of registration when indexing and ranking a Web site for inclusion in their organic search results.

So to prove to everyone that your site is the real deal, register for more than one year and increase your chances of boosting your search ranking on Google.

I know registrars always sell bogus submit your site to the search engines garbage, but I don't think I have ever seen one recommend registering for extended periods of time because of a Google patent before.

Smart marketing by them, and smart marketing by Google for putting endless amounts of FUD in that patent.

When Getting Published...

Keep in mind that 93% of all ISBNs sold fewer than 1,000 units.

After selling a few thousand ebooks those numbers make getting published seem far less appealing, especially when you consider that my prospective publisher told me they still wanted me to do most of the marketing.

What is even more nuts is that I guaranteed the publisher I would sell more than that in under a year as an add-on to my current book (even offering to buy that many up front), and they still considered that small volume and would want me to push the physical book more prominently than the current ebook offering.

I don't get why so many businesses think it is ok to shift nearly all the risk onto another successful business just because they are smaller or new to the market.

Broadcast Television 99 Cents a Show

Comcast Cuts VOD Deal For Four CBS Shows

CBS and Comcast have signed an agreement that, beginning in January 2006, will make four of the network's shows--CSI: Crime Scene Investigation, Survivor, NCIS, and Amazing Race--available on a video-on-demand basis for 99 cents per 24-hour window for each show.

Eventually the content floodgates will open. The question is who will own the distribution rights and at what cost? Does anyone think the baby bells or cable companies will be less monopolistic or more efficient than Google?

Free WMW Las Vegas Conference Pass

Jim Boykin recently gave away a free pass to WMW Las Vegas. Noticing the BOTW WMW conference discount blog post I recently remembered that I had not yet signed up to go.

I signed up, and Brett asked me if I would like to be on this organic search session. I said sure. He gave me one free pass that I can give away, but...

it can not be combined with any other offers ;-) and no people that have already paid, or people that have been comp’d before.

So, tell me why I should give you the free pass for next week's conference. I will give one lucky winner the pass.

Please note that the conference is in Las Vegas from November 15th through 17th, and you will still be responsible for your travel related costs.

Andy Hagans - Not Haggis - Interviewed...

Andy Hagans recently swore to the importance of his accessible white hat SEO techniques, and I asked him about his love for haggis.

Tips on blogging, outsourcing, link building, and other goodies in the Andy Hagans interview.

Tips on Running an SEO Business

Disclaimer: I am not real good at business. When selling services I always sold myself short, which made it pretty hard to scale services while working by myself. Hence writing the ebook, and some pieces of the Cosmos falling into place for me :)

Todd has seen a good bit of a few different SEO businesses. He recently offered up links to a ton of resources to help you run a web based business. I am not so sure about the business card tip ;) but otherwise everything sounded good to me :)

Google Robots.txt Wildcard

Not sure if I have seen this mentioned before. Dan Thies noticed Googlebot's wildcard robots.txt support:

Google's URL removal page contains a little bit of handy information that's not found on their webmaster info pages where it should be.

Google supports the use of 'wildcards' in robots.txt files. This isn't part of the original 1994 robots.txt protocol, and as far as I know, is not supported by other search engines. To make it work, you need to add a separate section for Googlebot in your robots.txt file. An example:

User-agent: Googlebot
Disallow: /*sort=

This would stop Googlebot from reading any URL that included the string &sort= no matter where that string occurs in the URL.
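
To make the separate-section requirement concrete, a full hypothetical file might look like this (the cgi-bin and sessionid patterns are my own invented examples):

# rules every crawler understands
User-agent: *
Disallow: /cgi-bin/

# wildcard rules only Googlebot parses
User-agent: Googlebot
Disallow: /*sort=
Disallow: /*sessionid=

One caveat: once a crawler finds a section naming it specifically it obeys only that section, so any generic rules you also want applied to Googlebot have to be repeated in the Googlebot section.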

Good information to know if your site has recently suffered in Google due to duplicate content issues.

Dan also recently launched an SEO coach blog on his SEO Research Labs site.

Google Jagger 3 Update

Matt Cutts announced the Google Jagger 3 update is live at 66.102.9.104.

It sure is amazing the number of large vertical sites, .edu, and .gov results I saw in a few searches I did. Although there will probably still be a good amount of flux, most of the stuff I worked on seemed to get through ok.

I did see a few canonical URL issues, as noted by others on Matt's blog. Someone named Jason also left this gem in Matt's comments:

Our site has been negatively affected by Jagger. Therefore we just requested the transfer of 30,000 site wide links (paid in advance until July 06) to our main competitor who is currently ranked extremely well in Google for our main keyword.

Our entire website is legit SEO so our site wide links are the only thing that could have caused such a drastic drop in our ranking.

In a thread on SEW DaveN responded to a similar webmaster:

IN life there are 2 ways to get on :

1) Be the best you can and move to the top

2) Drag everyone who is above you too below your level ..

Both ways you end up at the Top, it depends on how you view life and how long you want to stay there.

As long as Google is going to announce their updates and data centers, has anyone made a free SEO tool to easily compare / cross reference the search results at various data centers? (Perhaps something like Myriad Search, but focused on just one engine, letting users select which data centers to compare.) I can't imagine it would be that hard to build unless Google blocked it, and they haven't been too aggressive in blocking web based SEO tools (just look at all the tools SEO Chat has).
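
If anyone wants to roll their own, the comparison step at least is trivial once you have the result lists. A Python sketch (the host lists are invented, and fetching / parsing the results from each data center IP is left out):

def overlap(results_a, results_b):
    # Percent of top results two data centers share, ignoring order.
    shared = set(results_a) & set(results_b)
    return 100 * len(shared) / max(len(results_a), 1)

# Hypothetical top five results for one query at two data centers.
dc_a = ["a.com", "b.com", "c.com", "d.com", "e.com"]
dc_b = ["a.com", "c.com", "x.com", "b.com", "y.com"]

print(f"{overlap(dc_a, dc_b):.0f}% of results shared")  # prints 60%

Run that across a basket of your keywords; a low overlap percentage is a decent hint the data centers are serving different index or algorithm versions.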

Today is the Right Time to Buy Old Sites...

I work by myself, and am always a bit scared of spreading myself too thin, so I have not been too active on the old domain buying front.

Having said that, now would probably be a good time to buy old domains. Jim Boykin again mentioned his new love for oldies, and Graywolf said:

Came to the same conclusion myself, emailed about 150 people picked up 2 domains from 2000 for under $1K.

Think of how cheap those site purchases are. Decent links can cost $50 to $300 or more each, so buying whole sites for $500 is cheap cheap cheap! How cheap is it? Even the most well known link broker is recommending buying a few old domains.

Why now is the perfect time to buy old domains:

  • It is right before the Christmas shopping season and many people not monetizing their sites might be able to use a bit of spare cash.

  • Many older domains are doing better than one would expect in Google's search results, which means they may recoup their costs quickly.
  • As Andy Hagans said, "Some older sites seem to be able to get away with murder in Google's search results."
  • Link popularity flowed much more naturally to commercial sites in the past than it does now. This means buying something with a natural link profile may be far cheaper than trying to reproduce similar linkage data.
  • At different times search algorithms show you different things. Before the Christmas shopping season each of the last few years, it seems Google rolled out a new algorithm that whacked many sites which SEO'ed their way to the top (IMHO via link trading and low quality linkage data). Most of the algorithm changes are related to looking at linkage quality, communities, and ways to trust sites. The most recent update seems to have (at least temporarily) dialed up the weighting on TrustRank or a similar technology, which has had the net effect of highly ranking many old/trusted/authoritative sites that may lack some query specific authority. If you shop for sites that fit the current Google criteria well and then add some good SEO, you should be sitting good no matter which way the algorithms slide.

Before MSN Search launched, GoogleGuy recommended everyone take a look at the MSN search results:

I recommend that everyone spend their full attention coming up to speed on beta.search.msn.com.

It's very rare to get to see a search engine in transition, because that's the best time to see what the different criteria are for ranking.

Now that Google is in a state of flux it might be a good time to perform many searches to look for some underpriced ad inventory. If you know what you are looking for you are more likely to find it in the organic search results than in the AdWords system.

The search vs SEO cat fight:

and going forward...

  • creating causes

  • social networking
  • buzz marketing

I think there is way more competition and SEO is way more complex than when I started learning about it, but that is offset in part by:

  • more searches

  • greater consumer trust of online shopping
  • many channels discussing the topic of SEO
  • many free tools (SEO and content management)
  • lower hosting costs
  • the speed at which viral stories spread if you can create one
  • the vastly expanding pool of options to use to monetize your sites

Why Off Topic Link Exchanges Suck

So I recently got this email:

Dear Webmaster,

As you are probably aware, Google has changed its algorythm and now removes sites from its search results when they have exchanged links with sites that are not in EXACTLY the same category.

To prevent being from blacklisted in Google, it is imperative that we remove our link to you and that your remove your link to us!

Our url is www.SITE.com.

We are removing our link to you now. PLEASE return the courtesy and remove your link to us!

Note that Google is updating its results this week and failure to remove these links immediately will likely mean not showing up in Google for AT LEAST the next 4 months!

Thank you for understanding,
Site.com Partners

The email is flat-out incorrect, and I don't think I even traded links with the site mentioned, which is exactly why emails like this are extra crappy.

If you trade links highly off topic you increase your risk profile, and if it helps you rank:

  • Whenever there is an update your competitors can send these remove my link reminders out for you.

  • There are only a limited number of relationships you can have. If you link out to a million sites, a higher percentage of your links will point at junky sites than is typical, you will have more dead links than most quality sites, and many of those people will remove their links to you.
  • Your competitors could pay people from Jakarta $3 a day to go through your link trades and trade the same links.
  • Quality on topic sites may be less likely to link to you if your site frequently links off to low quality resources.

I think most sites which recently went south in Google probably lacked quality linkage data; it was not that they had too many links.

Traffic Power Has Not Yet Replied to the Motion for Summary Judgement

I recently spoke with the lawyer again about the evil / shoddy anti free speech lawsuit the fine folks at Traffic Power thrust upon me.

About a month ago my lawyer filed a motion for summary judgement.

Stuff in the court system tends to drag out, but my lawyer said Traffic Power's response is at least a week late, and he filed a reply of non-opposition [doc] requesting that the motion be granted.

Link Monkeys and Links Within Content

Jim Boykin continues his ongoing rants about links:

Since most people are still thinking "the numbers game" when it comes to obtaining links, most people are buying "numbers" from "monkeys" on crappy link pages.

When will the world wake up that the numbers game has passed the tipping point in Google. Engines are trying to get smarter with how they analyze sites. My overall thought is that they are working to identify, simply, "Links within Content and Linking to Content"

Not too long ago when I interviewed NFFC he stated:

This is what I think, SEO is all about emotions, all about human interaction. People, search engineers even, try and force it into a numbers box.

Numbers, math and formulas are for people not smart enough to think in concepts.

Amazon to Sell Electronic Book Access

Amazon to Google: we own books...

From the Journal:

[Amazon] is introducing two new programs that allow consumers to buy online access to portions of a book or to the entire book, giving publishers and authors another way to generate revenue from their content.

Although Bezos does not come right out and say it, clearly this is a shot across the bow at Google, especially given the timing of Google's recent print offering.

While Amazon Chief Executive Jeff Bezos wouldn't comment specifically on the Google Print controversy, he said, "It's really important to do this cooperatively with the copyright holders, with the publishing community, with the authors. We're going to keep working in that cooperative vein."

After Google develops their micropayment system I bet they also directly broker a large amount of media.

New Google User Profiling Patent

Loren does a good rundown of a new Google patent, Personalization of placed content ordering in search results, in his Organic Results Ranked by User Profiling post. Some of the things in the patent may be a bit ahead of their time, but the thesis is...

GenericScore=QueryScore*PageRank.

This GenericScore may not appropriately reflect the site's importance to a particular user if the user's interests or preferences are dramatically different from that of the random surfer. The relevance of a site to user can be accurately characterized by a set of profile ranks, based on the correlation between a sites content and the user's term-based profile, herein called the TermScore, the correlation between one or more categories associated with a site and user's category-based profile, herein called the CategoryScore, and the correlation between the URL and/or host of the site and user's link-based profile, herein called the LinkScore. Therefore, the site may be assigned a personalized rank that is a function of both the document's generic score and the user profile scores. This personalized score can be expressed as: PersonalizedScore=GenericScore*(TermScore+CategoryScore+LinkScore).
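
Expressed as code the scoring itself is trivial; the interesting (and undisclosed) part is how the profile scores get computed. A Python sketch with invented numbers, since the patent gives the formula but not the scales:

def personalized_score(query_score, pagerank, term_score, category_score, link_score):
    generic = query_score * pagerank  # GenericScore = QueryScore * PageRank
    # PersonalizedScore = GenericScore * (TermScore + CategoryScore + LinkScore)
    return generic * (term_score + category_score + link_score)

# The same page scored against two different user profiles:
print(personalized_score(0.8, 0.5, term_score=0.9, category_score=0.7, link_score=0.4))  # ~0.8
print(personalized_score(0.8, 0.5, term_score=0.1, category_score=0.2, link_score=0.0))  # ~0.12

The same document can land in very different positions for different users, which is the whole point of the patent.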

For those big into patents: Stephen Arnold has a $50 CD for sale containing over 120 Google patent related documents.

I think he could sell that as a subscription service, so long as people didn't know all the great stuff Gary Price compiles for free. (Link from News.com)

Microsoft Buys Into VOIP, Again

TheStreet reports M$ looking at VOIP again:

Microsoft said Thursday it has agreed to buy Media-streams.com, a privately held firm in Zurich, Switzerland. Financial terms were not disclosed.

Media-stream's VoIP technology, which enables telephone calls over the Internet, will become a core part of Microsoft's platform that enables workers to use the Web to collaborate on projects. Microsoft envisions such collaboration encompassing several different modes of communication, including email, instant messaging, Web conferencing and telephone calls via the Internet.

Media-streams is the second VoIP firm acquired by Microsoft in the last few months. In August, Microsoft acquired Teleo.

Microsoft Windows Live Launched

Windows Live software platform getting plenty of buzz.

Firefox Support coming soon. No surprise there. Search champ unimpressed. So are others.

If Microsoft would learn to do the small things right they wouldn't need to try to create Monopoly 4.0, and Robert would not need to post about why so many people do not trust them, even though he knows how to fix it.

At least Robert has great job security. EVERYTHING should be secure.

Why People Use Each Search Engine & Lead Value Per Typical User

In Searching for why consumers use certain search engines, Internet Retailer looks at recent Majestic Research data on why searchers are loyal to their favorite search brands.

The study shows most Google users are primarily there because they believe Google has the most relevant results (although the fact that Google has not had a longstanding portal as long as its competitors may bias the study toward that conclusion).

Of the other major engines Yahoo! was the only one which had significant votes for relevancy, but Matt Cutts' recent post about Yahoo! hand coding probably does not help them much. Tim Mayer just did another Yahoo! update report, and a reader mentions that [online casinos] is just about as spammy as it could be.

In Following the Search Engine Referral Money Trail Jeremy Zawodny shows his search referral ratio, earnings ratio, ad CTR, and effective CPM per engine.

His site has a tech bias, which I believe favors Google somewhat, but Google sends the bulk of his referrals. MSN and AOL users are much more likely to click content ads than Google or Yahoo! users. I believe that is a function of user sophistication: less sophisticated people are click-happy because they probably don't know they are clicking paid ads.
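
For anyone mixing up the metrics: effective CPM just normalizes earnings to a per thousand impressions basis, so engines sending different traffic volumes can be compared. A quick Python illustration (all numbers invented; Jeremy's real figures are in his post):

def effective_cpm(earnings, impressions):
    # Effective CPM: ad revenue per 1,000 pageviews / impressions.
    return earnings / impressions * 1000

# The engine sending less traffic can still be worth more per visitor
# if its users are more click-happy.
print(effective_cpm(earnings=12.50, impressions=4000))  # 3.125 -> $3.13 eCPM
print(effective_cpm(earnings=9.00, impressions=1500))   # 6.0 -> $6.00 eCPM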

Google Launches Google Print Book Search

Google Print now allows you to search the full text of books in their Google Print program.

After you search you can click on a result and enter your Google Accounts password to view a full page and the two pages before and after it.

The WSJ stated:

The company said it won't display advertisements on public domain book pages or any book pages Google scans from a library.

Perhaps Google realizes that being the default search engine means they can afford a few loss leaders, and not monetizing the public domain works undermines the "Google is a greedy company" line cried far and loud by critics of the program. Google Print is facing numerous pending lawsuits.

Google vs Madison Avenue

Google looks like it wants to own Madison Avenue. The Journal also has a free article on Google vs Madison Ave., and John Battelle recently interviewed Google's Omid Kordestani and Sergey Brin.

If you look at the SEO Bytes monthly toplist you will see that in spite of a recent major Google update many of the most popular threads are about how to monetize Google AdSense ad space.

A year or two ago few of the threads covered monetizing content. It seemed like everyone just wanted to rank or assumed nobody would share that how to profit info. AdSense and similar programs work well for quality and automated sites alike.

While Google monetizes crap sites they usually deny their connection to them, keeping the shadiness at arm's length while funding much of it.

Ask Jeeves is a bit closer in some of their relationships. A few days ago I noticed my mom's computer had some Ask MySearch type spyware activities on it. Sure, some of it may be uninstallable, but sometimes when you enter a URL in the address bar it says no site found just to redirect you to ads. Shady.

While some say one bad AdSense site may bring down the whole program, it only took Google around an hour to approve my mom's new site for AdSense, so they are not putting up much of a barrier to entry.

The more I read and learn about communities and click pimping the less value I see in my current business model, especially when SEO is usually framed in a negative light and I have to deal with this sort of garbage. After all, even as Case is out AOL is suddenly hot again, and some said Steve was just another spammer. :)