Scholarly Journals' Premier Status Is Diluted by Web, Media and Money

The Wall Street Journal wrote an article about smaller upstarts and the web challenging the business models of big time academic publishers. It will be interesting to see how far advertising can take search companies before publishers get squeezed to the point where they revolt, or make demands that spread to a critical mass.

In other, somewhat related news, Bill Moyers addressed the National Conference for Media Reform in St. Louis about a week ago.

One reason I'm in hot water is because my colleagues and I at "Now" didn't play by the conventional rules of Beltway journalism. Those rules divide the world into Democrats and Republicans, liberals and conservatives, and allow journalists to pretend they have done their job if, instead of reporting the truth behind the news, they merely give each side an opportunity to spin the news.

he continues...

These "rules of the game" permit Washington officials to set the agenda for journalism, leaving the press all too often simply to recount what officials say instead of subjecting their words and deeds to critical scrutiny. Instead of acting as filters for readers and viewers, sifting the truth from the propaganda, reporters and anchors attentively transcribe both sides of the spin invariably failing to provide context, background or any sense of which claims hold up and which are misleading. ... Objectivity is not satisfied by two opposing people offering competing opinions, leaving the viewer to split the difference.

Later he comments on the Journal:

But I confess to some puzzlement that the Wall Street Journal, which in the past editorialized to cut PBS off the public tap, is now being subsidized by American taxpayers although its parent company, Dow Jones, had revenues in just the first quarter of this year of $400 million.

In other related news, on Friday Ask Jeeves announced they bought out Excite Europe.

Any way you slice it, there are going to be a few gatekeepers to this thing we call the web, and to most media outlets in general. The more there are the better it is for consumers, and for that reason I might start trying a bit harder to use Google and Yahoo! a bit less.

It will be interesting to see how it plays out, but anyone who knows about SEO should see it as a personal responsibility to make sure people can find the issues you feel are important.

Earlier today a friend of mine told me of a site he was creating about an ongoing, rarely covered, & brutal civil war.

More on TrustRank

A while ago I wrote a bit about TrustRank after reading the PDF about it.

It is fairly easy to understand many of the concepts of it (like attenuating a positive trust score, or offsetting the effects of link spam with a negative trust score), but it is even easier to understand them if you visualize the concept of trust attenuation.

Most sites are not exceptionally compelling, so there are usually not many legitimate hubs in any industry, but many sites are glorified link farms which will not pass any positive trust value.
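As a rough illustration, the propagation idea from the TrustRank paper can be sketched in a few lines of Python. This is a minimal sketch on an invented four-page link graph, not the algorithm any search engine actually runs; the damping value and page names are assumptions for illustration only.

```python
# Minimal sketch of TrustRank-style trust attenuation on a toy link graph.
# Seed pages start with trust 1.0; each hop away, trust is dampened and
# split evenly among a page's outlinks, so trust attenuates with distance.
DAMPING = 0.85  # attenuation factor per hop (illustrative value)

def propagate_trust(graph, seeds, iterations=20):
    """graph: {page: [outlinked pages]}; seeds: set of trusted seed pages."""
    trust = {page: (1.0 if page in seeds else 0.0) for page in graph}
    for _ in range(iterations):
        incoming = {page: 0.0 for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = DAMPING * trust[page] / len(outlinks)
            for target in outlinks:
                incoming[target] += share
        # seeds keep their base trust; everyone else gets propagated trust
        trust = {p: (1.0 if p in seeds else incoming[p]) for p in graph}
    return trust

graph = {
    "seed_hub": ["good_site", "link_farm"],
    "good_site": ["other_site"],
    "link_farm": [],
    "other_site": [],
}
scores = propagate_trust(graph, seeds={"seed_hub"})
# good_site, one hop from the seed, ends up with more trust than
# other_site, which is two hops away.
```

Note the link farm here only fails to pass trust onward; modeling negative trust (the "AntiTrust" idea below) would require assigning it a penalty score as well.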

For a while I helped promote many directories, but many of the new ones on the market have little to no legitimate value, and some of the links from them may even have negative value.

I just wrote an article called TrustRank & the Company You Keep, in which I made this graphic explaining the concept of AntiTrust (yet another SEO phrase I made up hehehe).

The red X's represent things that should be there, but are not.
Bad directory image, showing inbound & outbound link profile.

Yes, I know, the drop shadow is too dark, my web designer friend already yelled at me for that. Other than that, I hope the image clearly demonstrates the concept I was trying to get across.

Other than drop shadow remarks, please leave comments on the article and image below.

The Stock Market & Liquidity Theory

Not entirely SEO related, but the stock market is another information system which is often manipulated.

Not that I have much money, but recently I read a book called Trim Tabs Investing by Charles Biderman. On a macroeconomic level it looks at the stock market in terms of the volume of shares, their overall price, and the money chasing those shares. Rather than stating that forward earnings drive the stock market, Biderman believes short term stock prices are best described by supply and demand.

It breaks down the money chasing the shares into the following groups:

  • insider and corporate trading (smart money)

  • general investor trading (dumb money)
  • foreign investor trading (dumb money)
  • margin debt (dumb money)

In the short term the money from the typical investor can power the direction of the stock market, but the stock market inevitably goes in the direction of the insider and corporate trading. Peaks in the stock market (tops and bottoms) are often associated with rapid changes in margin debt.

People are emotionally attached to their investments, and tend to believe the future actions of the stock market will follow the recent past. People take out loans to be fully exposed to the market near tops. People also lose hope and cash out at a loss near bottoms. Foreign investment is also another lagging contrary indicator.

Insiders have access to better data, and their actions are thus inclined to be more representative of actual market conditions. Their ability to control the float (the number of shares on the market) gives them an unfair advantage. Also, sometimes they will forecast weak guidance while the stock market is doing badly so they can actively buy back their own shares for prices below their actual value.

I thought it was a pretty cool book. For small investors he still recommends just dollar cost averaging or using buy and hold, but for those who are rich (some of the early SEO gods are probably sitting on mounds and mounds of cash right now, as early Google workers likely are too - hi Matt) and seek larger gains, liquidity theory may help them do well in both good and bad market periods.
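Since dollar cost averaging comes up above, here is a toy Python illustration of why it is usually recommended for small investors. The price sequence is invented, not from the book.

```python
# Dollar cost averaging: investing a fixed dollar amount at regular
# intervals buys more shares when prices are low and fewer when they are
# high, pulling your average cost per share below the average price.
prices = [10.0, 8.0, 5.0, 8.0, 10.0]  # an invented dip-and-recovery
monthly_investment = 100.0

shares = sum(monthly_investment / p for p in prices)
avg_cost = (monthly_investment * len(prices)) / shares

print(round(avg_cost, 2))  # average cost per share
# The average cost per share lands below the simple average price
# (8.2 here), because more shares were bought at the lows.
```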

SEO Inc Back In Google, Somewhat

Not that it is huge news for the average SEO, but when SEO Inc was removed from Google the story got so much negative coverage, and the SEO Inc PR department botched the issue so badly, that it was just a really bad thing for them.

Another great example of how your reaction to something can be more important than what actually happens.

Right now SEO Inc is not showing up when I search for their site name and the like, but their home page was cached in Google 2 days ago and is showing up under Search Engine Marketing Firm.

Congrats on getting back in, SEO Inc.

Lee Odden mentioned it in SEW forums.

Free Local Keyword Generator Tool

A new free tool allows you to enter a zip code, a radius, and a few keyword phrases and it automatically generates keyword phrases based on your keywords and the locations within your radius.

The output looks like:
pizza "town name"
pizza "other town name"
lasagna "town name"

A few things that would make the tool cooler are:

  • allow people to enter multiple words to make up more phrases based on those various words (like GoogEdit or ThePermutator do)

  • allow output to match various match types (like broad, phrase, & exact)
  • allow people to create "location + keyword" and "keyword + location"
  • optionally format the output to allow people to enter max bids and let the search terms drive the URL.

Pretty cool tool for free. Found via a thread on SEW.
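The core generation step the tool performs can be sketched in a few lines of Python. This assumes you already have the list of towns within the chosen radius (the real tool derives those from the zip code and radius); the `location_first` flag shows the "location + keyword" variant from the wish list above.

```python
# Cross every keyword with every nearby town, quoting the town name,
# to produce local keyword phrases like those the tool outputs.
keywords = ["pizza", "lasagna"]
towns = ["town name", "other town name"]  # would come from zip + radius

def generate_phrases(keywords, towns, location_first=False):
    """Return every keyword/town combination as a search phrase."""
    phrases = []
    for keyword in keywords:
        for town in towns:
            if location_first:
                phrases.append(f'"{town}" {keyword}')
            else:
                phrases.append(f'{keyword} "{town}"')
    return phrases

for phrase in generate_phrases(keywords, towns):
    print(phrase)
# pizza "town name"
# pizza "other town name"
# lasagna "town name"
# lasagna "other town name"
```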

Link Harvester Updated

My friend fixed the error in Link Harvester's code & updated the free source code with a few new features.

The new features are:

  • sorts .mil domains with the .gov domains.

  • added links to Google cache and Google cache text of each page

If you have not heard of Link Harvester yet, here is some background on it. Are there any other cool features you can think of? I might have a friend create another SEO tool tomorrow too if he has time. I will probably be adding features to Hub Finder soon too if I have enough money and my friends have enough time.

BTW, someone pointed out Search Lores in a comment at ThreadWatch recently. The site is so amazing I can't believe I hadn't come across it yet.

Flat Rate Ads & the Quick Automated Cheapest Links

Flat Rate Advertising:
Recently a guy talked about creating a peer reviewed information system which charged a flat rate for clicks. The problems with flat rate advertising are:

  • not all ads have the same value

  • charging a flat rate would lead to oversaturation in competitive areas and minimal coverage in less competitive areas
  • the lower overall income generated through such a system would leave less money for marketing it

It is hard to bribe people to rate relevancy. The best bet on that front is to try to establish a system and idea good enough to build a user base which markets itself, and then figure out how to attach a business model later.

SEW also recently had another forum thread about acquiring cheap links.

Pyramid Link Building Scheme:
Someone recently spammed SEW forums asking about www.16links.com, which is a link building pyramid scheme that charges people to join it too.

What a hunk of crap site / idea!!!

One Time Fee Links:
Another person dropped in the 16links.com thread to recommend textlinkpopularity.com for building one way links for a one time fee. (incidentally, this person's only other comments are in a thread they started recommending textlinkpopularity.com).

Why one time fee links suck:

  • Low Quality: High quality sites selling useful ad space usually do not sell that ad space for a one time fee, even most directories suck.

  • Low Quality: If sites are hard up for cash then those sites likely are not going to be long lasting ones.
  • Low Quality: If a site is selling underpriced ad space for a one time fee, then eventually that ad space becomes hyper saturated to where the value diminishes.
  • Low Quality: If sites are made just to sell links for a fixed rate then they may not have enough money to put back into promotion. The site the ad is on may not grow with the web. If a site rarely picks up new links then it would be easy and likely that a search engine may discount the value of links from that site, especially if it is a site that is not well integrated into the web.
  • Low Quality: I started on the web by creating a site that was a bit critical of the military. It's a really bad site and I should take it down, but I leave it up to still speak my mind and show how quickly people can learn. One of the more reputable link brokers spammed that site asking if they could buy links on it. That shows there probably is not much quality in that business model if they are willing to risk their reputation for a few dollars.

Easy to Replicate:
Another common problem with most linking schemes is that they are easy to replicate. This means that if a quick low cost link scheme is effective, easy to trace, and has no quality standards then people will be able to quickly replicate it, thus any competitive advantage gained would be quickly minimized.

Cheaper Links:
I have lots of directory links and one time fee links, but most of them were not built through any broker. At the end of the day most of them do not drive much traffic, and for some of my sites I am moving away from buying them as much.

Most one time fee link programs charge about $20 - $30 per link, so a dozen crap links would cost you around $300. These links would most likely:

  • drive no traffic

  • be on pages full of other junk links
  • not be on authoritative, highly related, or well integrated sites

The links that drive the most traffic to my site are the ones where my site or I am featured or cited. Examples:

Articles:
Writing an article might take a couple hours, but if you get it syndicated through the right channels it can build dozens of quality links. These links:

  • drive targeted prospects

  • are on pages with few links
  • are on pages about your topic
  • some of them may be on related, authoritative, well integrated websites

Most articles I write and syndicate quickly bring in at least one or two consulting clients, so there is some value there. Plus, for about 3 hours of work (writing and submitting the article) I can get links that are worth well over $300, since the articles have more long-term value than the crap one time fee links.

Free Tools:
Like a twit I recently broke the Link Harvester tool. Currently I have an old version up, but my friend who made it is going to add a few new features to it and have it back up this weekend. :)

The Link Harvester tool so far has cost me about 2 hours of my time and around $500 to make. It got links from sites like SitePoint, ThreadWatch, & Yahoo! from within the content part of the pages. In most good algorithms, 3 of those links, from sites which:

  • are well established

  • drive traffic
  • are authoritative
  • are not going away anytime soon

are going to be worth far more than a dozen or two dozen permanent junk links. A few other beautiful things about getting links from authoritative sites:

  • Using tons of cheap one time fee links may raise your risk profile. Odds are that Yahoo! is not going to use their link pointing at my site as a reason to ban my site.

  • Getting links on authoritative sites is not as easy to replicate as getting links from a program which serves up links all you can eat at $25 each.

Scalability of a business model is important. If a project or idea does not gain steam then the value of the ad is limited at best. I like investing early in some ideas just in case they pan out, but the people selling links using cheapness instead of value as the selling point may not be giving you much value. Sometimes the value of links is destroyed by the business model of the site the link is on.

When you look at links on a sheer numbers level you end up missing the value of putting in a little effort, or spending the money in indirect ways, to get more long-term value out of your link ad spend. [/end rant hehehe]

Google Portal, Stemming, DMOZ Submission Review

Portalized:
Google offers portalization of Google.com. Danny Sullivan has an in depth review. They have a number of features and intend to add many, such as RSS feed support.

Stemming:
Rand points out a post by Xan on stemming and a free online stemming tool.

DMOZ:
kills the submission status review. Now it's even easier to be corrupt ;)

New York Times:
Begins charging for some of their content. Most of their content remains free. They are also replacing the CEO of About.com.

When Not to Submit to Directories:
when a person creates about a half dozen general directories and promotes them all together. That is not building value; that is trying to cash out and milk the web.

Many directory owners have become exceedingly greedy recently. All the while search algorithms continue to advance and few of the directory owners are actually trying to build any legitimate value.

The Search:
You can pre order John Battelle's new book. He said if you use this link he may be able to autograph it for you, assuming he can work out the shipping details.

The Size of Google's Index:
might have been a bit frothy

Google Factory Tour:
video presentations (should be up soon), Philipp Lenssen has highlights

Mirago AdSense:
Apparently they have a product similar to AdSense, which might be useful for companies like HotNacho.

SimCity & Google Earth

SimCity was always one of my favorite games. kpaul recently noticed a new site by the name of Chicago Crime, which overlays crimes with their locations using Google Maps. Pretty scary to see that in Chicago there was more than a murder a day last month.

What kind of ad marketplace would Google have if they:

  • integrated Google maps and public data into a social network

  • which linked to - or allowed people to upload - business feedback (think Local Froogle)
    • should I buy from here?

    • what other businesses are cheaper or provide better service?
    • should I consider working here?
    • who else is hiring in this field or near here?

    and destination reviews

    • is this place worth visiting?

    • when is best?
    • who has the best travel deals?

They also could show the history and trust rating of reviewers, as well as letting you determine how many social connections away you were willing to accept reviews from. Maybe they could match up personalities or demographic profiles if people gave them that data, or let you create your own combined metric.

Add strong recommendation engine technology to that (like how Amazon.com says "of the people who viewed this product, ultimately 37% ended up buying XYZ") and Google will serve ads that know what you want even when you don't.
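The Amazon-style statistic quoted above boils down to a simple co-occurrence count, sketched below in Python. The session data is entirely made up for illustration; a real recommendation engine would work over vastly more data and smarter models.

```python
# Toy co-occurrence statistic: of the sessions that viewed a product,
# what fraction went on to buy each other product?
from collections import defaultdict

sessions = [
    {"viewed": "camera", "bought": "memory card"},
    {"viewed": "camera", "bought": "memory card"},
    {"viewed": "camera", "bought": "camera bag"},
    {"viewed": "camera", "bought": None},  # viewed but bought nothing
]

def buy_rates(sessions, viewed_product):
    """Fraction of sessions viewing `viewed_product` that bought each item."""
    views = [s for s in sessions if s["viewed"] == viewed_product]
    counts = defaultdict(int)
    for s in views:
        if s["bought"]:
            counts[s["bought"]] += 1
    return {item: n / len(views) for item, n in counts.items()}

print(buy_rates(sessions, "camera"))
# {'memory card': 0.5, 'camera bag': 0.25}
```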

Google has data worth lots and lots of money. It will be interesting to see how they aggregate content and collect feedback to leverage their market position.

Any merchant heavily exposed to the web which is not building communities or other hard to replicate assets may end up in the hurt locker in the next couple years.

Google's ad serving technology is still somewhat primitive. As time passes and more major networks leverage their market positions, more and more merchants will get marginalized by the powers that be.

Making More Money with AdWords: Search Engines, Not Consumers

Sounds like a marketing product name, eh? Actually this is a link to a research paper Orion mentioned, a 20 page PDF about AdWords and Generalized On-Line Matching, which covers the idea of allowing search services to extract the maximum ad revenue out of advertisers.

One problem current search related ad systems have is that after one advertiser exhausts their budget, the remaining competing advertisers may get ads below their fair market value.
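The paper's fix for that budget-exhaustion problem is, roughly, to discount each advertiser's bid by how much of their budget they have already spent, so no single budget runs dry too early. A minimal Python sketch, with made-up advertiser names, budgets, and bids:

```python
# Budget-aware ad allocation in the spirit of the AdWords paper: score
# each bid by a tradeoff function of the advertiser's spent-budget
# fraction, and award the slot to the highest score, not the highest bid.
import math

def tradeoff(spent_fraction):
    """The paper's discount, 1 - e^(f - 1), where f is the spent fraction."""
    return 1 - math.exp(spent_fraction - 1)

def allocate(query_bids, budgets, spent):
    """Pick the advertiser maximizing bid * tradeoff(spent fraction)."""
    best, best_score = None, 0.0
    for advertiser, bid in query_bids.items():
        if budgets[advertiser] - spent[advertiser] < bid:
            continue  # cannot afford this click
        score = bid * tradeoff(spent[advertiser] / budgets[advertiser])
        if score > best_score:
            best, best_score = advertiser, score
    return best

budgets = {"A": 10.0, "B": 10.0}
spent = {"A": 9.0, "B": 0.0}
# A bids more, but has nearly exhausted its budget, so B wins the slot.
print(allocate({"A": 1.0, "B": 0.9}, budgets, spent))
```

Plain highest-bid allocation would hand A the slot and burn out A's budget, which is exactly the situation where later competitors get ads below fair market value.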

If a college student wanted to get a job at Google you could bet that writing a research paper about making AdWords more profitable would be a good idea :)

In related news...
AdWords Smart Keyword Evaluation Tool:
Sometimes without human review it disables some exceptionally well targeted terms even before you get a chance to display your ads. That is not so smart, as it frustrates advertisers and prevents them from selling part of their inventory.

You can't know how well an ad will perform based on past advertising experience, since so much of Google's ad space is full of "Buy dead animal at eBay" type ads.

Why Disabling Some Generic Term Makes more Money:
I advertise one product line on Overture where part of the name is an acronym. I can use that acronym to make a decent number of sales on Overture for a good sum of money. If I want to advertise for that term on Google AdWords, even with like 20 negative keywords (filtering out unrelated traffic), the term consistently gets shut off, despite getting a clickthrough near their minimum rate and converting exceptionally well.

Then again, maybe Google does not want me to get those conversions for a nickel. In deciding how broadly they allow you to advertise, search engines are also trying to control the way searchers search. If a person searches for a short acronym, Google would prefer that person give them more data, so they can gain a better understanding of what the person wants and deliver more targeted, and hopefully more expensive, advertising.

In my example, for targeted terms I pay over 10 times as much per click, which really sucks since the acronym had a higher conversion rate than the rest of the campaign.
