Is Your Attention Span Getting Shorter?

I believe that, in general, the attention span of most technologically enabled people is getting shorter by the day. In addition, the increased number of content channels and communication associated with the web leads people to become more biased.
I have not done any research, so I may be wrong on these ideas, but here is what leads me to say that:

  • Automated spam (email, blog comments, direct mail associated with registering a trademark, etc.), which may even look relevant, forces you to judge things more quickly.
  • Search makes it easier to get "good enough" answers quickly.
  • The web (and search) let you self select social groups with predefined similarities.
  • There is a far greater number of news channels than there were just a few years ago.
  • We tend to consume media that fulfills or reinforces our predefined worldview.
  • RSS (and devices like Tivo) make it easy to consume media how and when you want to.
  • Sites like YouTube make it easy to embed others' content within your site.
  • Google Video now allows you to link to an exact second of a video in their collection.
  • My shorter and more straightforward posts usually get more comments and more quality editorial links. I also notice many of the Digg homepage and Del.icio.us popular URLs are short and straightforward articles titled things like "10 ways to x".
  • The increased number of social aggregation sites is making it easier for people to see what ideas spread, how they spread, and why they spread in near real time.
  • My biased and/or controversial posts usually get more comments and more quality editorial links (and thus are more easily found and locked into more self-reinforcing positions). Can you name a popular political blog that is not highly biased?
  • As the amount of information on the web increases, and search engines trust old content until new content proves its value, new content is forced to be of higher quality than old content to gain exposure. Being of higher quality generally means being more citation worthy, which frequently means being more controversial.

If you have been on the web for a while, have you found it easier or harder to meet people off the web who are outside your realm of trade? Have your media consumption habits become more or less biased since you started interacting on the web?

Jackass Blog Comments

This guy wanted to be seen badly enough, so I may as well feature him:

this webmaster doesnt want to help you.....
i have a good website where you may find coupons for google, yahoo, MSN, looksmart, but this webmaster delete my messages all the time....
i wont give up, i will post my messages again and again....

How absurd is it to threaten me with continued spam until I leave it published? That jerk had a new phpBB-driven SEO forum which had one post on it. It offered the opportunity to get Looksmart coupons (affiliate links) after you made 50 posts AND linked to his new forum. He must have spammed my blog about 20 times before I banned his IP address.

Between that clown and some of the other blog comment spam that has been hitting my blogs recently, I was becoming quite scared about the current state of humanity and the concept of evolution. Then I realized I should just laugh at that sort of people and everything was better again.

In spite of requiring JavaScript to post a comment on this blog, over 90% of the comments fall into at least one of the following categories: pure spam, of no value, unoriginal, pointing at adult sites or sleazy one page domain parking pages or sleazy lead generation / salesletter sites, or using keywords as the anchor text. I wonder if my sometimes infrequent publishing, coarse copy, and political views prevent some of the better potential commenters from commenting.

I will be the first to admit that I was probably a bit of a blog comment spammer when blogs were new, but considering just how many people are automatically and manually blog spamming right now, I think there is far more value in doing things which make the publisher want to like you and read your thoughts.

Usage Data Will Not Replace Link Reputation

I am a big fan of usage data as an SEO and marketing mechanism (especially because usage data leads to editorial citations if the site is linkworthy), but I doubt usage data alone will fully replace linkage data anytime soon. Why? The web is but a small fraction of the overall advertising market. With search companies controlling so little offline media, would they really want to let ads outside of their networks control their relevancy?

Why does Matt Cutts frequently obfuscate the policies on buying and selling links outside of AdWords? Because abusing links undermines Google's relevancy and Google does not get a cut of the action.

Google's current results are heavily biased toward informational resources. If Google were heavily reliant on usage data, it would commercially bias its organic search results and make its paid ads seem less appealing.

Will Yahoo! Shares Bounce Back?

It seems a large part of the reason Yahoo!'s stock recently tanked was that the market was punishing them for delaying their new ad system.

I know factoring clickthrough rates into ad costs will help optimize their revenue stream, but does anyone think the new system will help them catch Google on the monetization front?
I don't. The three main reasons are:

  • They are losing marketshare daily. Google has a stronger search technology and search-related brand, and the next version of IE is going to integrate search into the browser. Even if MSN loses most of the associated browser distribution deals, they will still drive up the traffic acquisition costs for those who win them, and since Yahoo! has a less efficient marketplace than Google they are not going to be able to outbid Google.

  • Google is already busy taxing noise out of their ad network while Yahoo! is still fighting to keep up on pricing, let alone creating easy account management tools.
  • Yahoo! is more cautious with trademark protection in search ads. Since branded terms are some of the highest converting and most valuable terms that choice probably costs them a fat packet of cash.

Yahoo! Search Algorithm Moves Toward Links & Authority Sites

I do not think the recent Yahoo! update was as sharp as they may have hoped. If you have a variety of sites that were marketed using vastly different techniques and know a market or two well, it is pretty easy to pick up on some of the patterns.

Caveman, one of my best friends in the biz, is great at picking up the high level changes (and maybe that is why he got the nickname algo guy). He made a couple of great posts in a WMW thread about the update.

Here he talks about Yahoo! filtering out the most appropriate page:

Odd. I see orphaned pages (i.e., abandoned; not doorway pages) - with NO inbound internal or external links any more - ranking on page 1 of numerous SERP's. Don't think I've ever seen that before. ... They seem to be filtering out the best sub page to show for a specific search (e.g., "red widgets") and instead are now showing a page above it or below it or beside it.

A few weeks back, in a rant post, I mentioned Google ranking a home mortgage type page for a consumer loan query. I am now seeing the exact same page rank for the exact same query in Yahoo! (not due to any type of spam, but due to algorithms that rank page B for having navigation related to page A on it - see DaveN's post about a recent Google non-relevancy fiasco).

I am seeing some sub pages rank for things you would expect the more authoritative home page to rank for, and in other cases I am seeing the home page rank for rather specific queries where far more relevant sub pages exist.

I am seeing a general move toward promoting authoritative domains in Yahoo!'s SERPs. Not only is the trend visible as a general rule of thumb, but I also have a crusty old authoritative domain. I extended the domain out from its initial focus into related higher margin fields. I have not built up the authority on those new pages yet, but they ranked well in Google due to crustiness and high authority links to the site in general. They also ranked well in MSN due to on-the-page optimization. The site was not getting much love in Yahoo! until this algorithm update. The love (and increased earnings) are likely due to Yahoo! placing far more weight on core domain authority and applying that throughout relevancy scoring for all documents on the authoritative site.

Here Caveman talks about Yahoo!'s shift away from a literal MSN-type algorithm toward a more elegant link-based Google-type algorithm:

What if, for example, Y! substantially altered the way that links factor into the algo: Both from a quality and quantity standpoint.

Y's algo used to be much more onpage and kw oriented. Last year that began to change. Links became more a factor.

In this new update, links are again, IMO, playing a significant role: Both the quality and quantity of links. Y seems to be exploring ways to push authoritative links more to the fore.

Brilliant stuff Caveman.

As a marketer, I think Yahoo! shifting toward a sitewide authority type algorithm that tries to match natural text is a big deal, since it leaves MSN as the last literal type search algorithm. The current Y! algorithm hints that Yahoo! is willing to throw a bit of paid inclusion revenue in the trash can if it leads to more relevant search results. Within a year I wouldn't be surprised if:

  • Yahoo! solves their guestbook and blog spam link problems (and some of the other low quality link issues)

  • common forum questions about things like keyword density and the like are replaced by people talking more about spreading out your keywords, writing naturally, and using semantically related phrases
  • many people trade websites instead of just buying / selling / renting individual links
  • about 100,000 free service sites pop up that are nothing more than link schemes (via stuff like add our link to your site with this badge or whatever)

As a bonus, here is an image of the Yahoo! SERPs for SEO. Notice how many of the domains listed have the word authority next to them in my description of why I think they rank.

As far as SEO goes the word authority is generally synonymous with "heavily linked to via natural citations from other powerful sites."

MSN Search Spam Research

Many spam sites are based on automation, and the attempts to automate and mass produce content or sites leave footprints that are easy to detect.

While MSN Search is still chock full of spam, they are doing research to try to stop it (link via PeterD).

Our approach is to treat each spam page as a dynamic program rather than a static page, and utilize a “monkey program” [6] to analyze the traffic resulting from visiting each page with an actual browser so that the program can be executed in full fidelity.

Many successful, large-scale spammers have created a huge number of doorway pages that either redirect to or fetch ads from a single domain that is responsible for serving all target pages. By identifying those domains that serve target pages for a large number of doorway pages, we can catch major spammers' domains together with all their doorway pages and doorway domains.

Just about any piece of the publishing or monetization puzzle that is not well thought out can leave a footprint.
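
To make the doorway-grouping idea from the quoted research concrete, here is a minimal sketch in Python. It assumes you have already crawled a set of suspect pages and recorded which domain each one redirects to or pulls ads from; the sample data and threshold are hypothetical, and this only illustrates the idea, not MSN's actual system.

    from collections import defaultdict

    # Hypothetical crawl output: (doorway_url, target_domain) pairs, where
    # target_domain is the domain the doorway redirects to or pulls ads from.
    crawl_results = [
        ("http://cheap-widgets.example/page1.html", "ads-hub.example"),
        ("http://red-widgets.example/index.html", "ads-hub.example"),
        ("http://blue-widgets.example/index.html", "ads-hub.example"),
        ("http://legit-shop.example/products.html", "legit-shop.example"),
    ]

    def find_spam_hubs(results, min_doorways=3):
        """Group doorway pages by the domain serving their target pages;
        a domain feeding many doorways is likely a spammer's hub."""
        hubs = defaultdict(set)
        for doorway, target in results:
            hubs[target].add(doorway)
        return {d: p for d, p in hubs.items() if len(p) >= min_doorways}

    print(find_spam_hubs(crawl_results))  # flags ads-hub.example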

The downside of them doing that type of research and sharing it publicly is that it creates an incentive for one person to make a bunch of spam sites for a competitor just to knock the competitor's main site out of the search results. And if you think MSN has fixes in place for that sort of stuff, you would probably be wrong. Take, for example, their inept geo-location targeting algorithms.

Does Google cross reference AdSense accounts when fighting spam? I am not certain, but some friends have recently reported occasional $8 and $9 AdSense ad clicks on some low traffic spammy sites in a network of spammy sites linked to an AdSense account. If Google is going out of their way to filter the noise out of their ad network it shouldn't be surprising that they would use similar data points to clean up their organic results. If you start getting a ton of traffic and/or large earnings quickly that might flag your site for some type of editorial review.

How to Look Like an SEO

Google started to support the NoODP meta tag that was introduced by MSN in May. To use it, place the following code in the head of your DMOZ-listed page:
<META NAME="ROBOTS" CONTENT="NOODP">
I probably would not use that unless my DMOZ listing was really jacked. I believe it is a way to flag yourself as an SEO, which may not work in your favor.

I also think excessively using nofollow tags, outside of those that are typically associated with your content management system, is another way to flag yourself as a known SEO.

Google AdWords Landing Page Quality Scores

Google AdWords updated their landing page quality scoring algorithm. I have received quite a bit of email on the issue, though people are still working out exactly what Google is doing.

In much the same way that Google has clearly stated their hatred for low quality affiliate sites in the organic SERPs, some of that pure hate is crossing over into their AdWords relevancy algorithms, where they are looking at landing page quality (and other factors) and squeezing the margins on many business models. I believe if you spend huge money you probably get a bit more of a pass than smaller ad buyers, but the clear message with this update is that Google does not like noise, even if you are willing to pay them for the privilege of displaying your noisy message.

Many people liked PPC because they felt it was far more stable and more predictable than SEO, but for many PPC just started to look ugly quite quickly. If you are dealing with search marketing you have to evolve with the market or die. That is true with organic search and is true with paid search.

The brutal part of this Google update is that, beyond providing general guidelines, they failed to define what qualities they test for when they judge landing page quality. Some of the things Google might be looking for (a toy sketch of how such signals might combine follows this list):

  • if your AdWords ads redirect

  • your account history (are you a large reliable spender that has been spending for years? are you new to a saturated market? do you have a spotty past checkered with 20,000 unrelated keyword uploads? do your ads get a strong CTR?)
  • history of competitors with similar keyword selections
  • if your landing page links to known affiliate hubs
  • if your landing page has redirect on outbound links
  • if your landing page has many links to other sites or pages that are also advertising on the same or similar keywords
  • if your page has duplicate or limited content (or conversely if it has a huge number of links to external sites on it)
  • time on site
  • rate at which people click the back button after landing on your site
  • outbound ad CTR on your landing page (especially easy if you are arbitraging AdWords to AdSense)
  • conversion rate if you use Google Checkout, Google Analytics, or the AdWords conversion tracker

Don't forget that Google not only has a huge search engine, the largest ad network, and an analytics product, but they have their toolbar on a boatload of computers and can track, track, track their users!

Andrew Goodman reminded advertisers that one shouldn't be too reactive to this change:

As [Googler] Nick Fox suggested, there are rarely any gray areas, implying that it's generally seriously misleading ad campaigns and scam offers that are being targeted. Yes, there are landing page factors now in the mix.

But these will generally not affect accounts of long standing which have good CTR's established. You need to continue optimizing your landing pages for corporate goals and profitability, conversion rates, ROI, etc... not based on what you think it will do to your minimum bid in AdWords.

As Google Checkout and other direct merchant incentives (and affiliate disincentives) spread, you have to think that Google is going to make many PPC affiliate marketers cringe.

If you are already well established though this might improve your margins since it raises the barrier to entry to the AdWords market while wiping out some of the arbitrage players and some of the less sophisticated or lower budget merchants and affiliates. Some of the larger players in the space are seeing a significant rise in traffic as the squirrel population dies off.

In the same way that Google trusts older websites, maybe it is worth starting up an AdWords account now just to learn the medium before it gets any more complex, and, with any luck, to build up a level of trust that can be leveraged if you ever have a sudden urge to advertise a time-sensitive message down the road.

I have had a couple of search marketers tell me that they keep a couple of high-spend, low-maintenance PPC clients just to have the account spend necessary to have pull with the engines.

Marketing, Branding, Feedback, & Network Stability

I went to Affiliate Summit this week. I probably could have gone to more sessions than I did, but I had too much fun hanging out with the TLA crew.

I did see the keynote speech by Jim Bouton. His thesis for success was that you must be persistent and you must love the process of whatever you are doing. Jim made it to the majors twice, co-created Big League Chew, and wrote a groundbreaking book titled Ball Four, which in many ways changed the way baseball operated as a business. I have attended many conferences, and it seems like most everyone says the same thing, but with their spin on it (based largely on their own experiences). Go to SXSW and you will hear how important design, standards, and blogging are. And you will hear how you have to be persistent and work hard and keep learning, etc (that is generally the thing you hear everywhere, that and maybe if someone had good market timing they say they were lucky too).

At Affiliate Summit I also listened to Rosalind Gardner offer affiliate marketing tips. She is highly focused on collecting email addresses to create large targeted mailing lists, and on using pay per click to insulate your business from the whims of the engines. Her tips for success seemed similar to things Jim Bouton would say, things I would post here, or things I have read on many SEO forums. The one downside I felt in her speech was that she really talked down SEO, as though it were not as reliable, predictable, and safe as pay per click marketing.

While my position is largely biased by my own experiences, I never really understand when people say pay per click is going to be more reliable long-term than SEO. All markets are growing increasingly competitive. With PPC someone can overspend you out of the market, and the market makers weed noise out of the market, both of which result in many casualties.

People can spam the heck out of email too, which may limit how effective email is. And what happens when the major email providers allow more targeted ad buys on their email products? Competitors to your business may subscribe to your newsletter and bid against its contents to show up wherever you are.

With SEO, if you have good market timing and can create better ideas than the competition you carve out a market position and then are sorta stuck there, with the help of reinforcing links. I recently launched that SEO for Firefox extension. Assuming I keep the software functional the download page will probably rank in the top 5 for SEO Firefox and Firefox SEO for years.

The best converting terms are typically brand related terms, and search is about communication. Once you build a brand and gain mindshare, search engines deliver an irrelevant user experience if your site is not showing up.

In some cases it makes sense to buy mindshare, even if it only barely pays for itself and lowers your overall margins. Why? Because it provides another lead source and strengthens your overall brand awareness and mindshare (and, of course, exposure leads to more exposure).

I spend about $1,000 a month on AdSense, just breaking even on the ads, because the additional 12 or so unit sales do not increase my customer service load by much, but the $1,000 ad cost provides millions of ad impressions and increased mindshare. If I ever need to cut that ad cost I can.

Once people see your brand enough they will assume you are successful and offer free honest feedback. Exposure not only leads to more exposure, but it seems the less you need help the more people are willing to help you. And they may offer you free help that is better than anything you could have paid for.

Using any single medium as your exclusive lead provider is going to be risky, but by using multiple channels you can make your business far less risky.

Internal Article Anchors From Search Engines

I recently searched for [Tippecanoe County Shrine Club] and Google ranked a huge Wikipedia page first. When will search engines start directing searchers to portions of a page instead of just to a page? How will that change affiliate, contextual, and web merchant business models?

Dreamy Google Sitemaps & a Page Strength Tool

Matt Cutts is looking for feedback on improving Google Sitemaps.

I'm expecting some creative answers here. I'll phrase it more generally: Forget XML files or even what Sitemaps looks like currently. What info would you want as a webmaster?

If you could design your dream webmaster or site owner console on Google, what would it look like?

Rand announced the launch of his page strength tool, which aims to be more accurate than Google's PageRank. The one downside to the tool is that there is a delay in most of the data sources (for example, I think my SEO for Firefox page is ranking at #10 in Google for SEO right now, but the page strength tool shows it as being at 3.5), but it is probably quite a bit more accurate than PageRank alone.

It is also interesting that on one front Google asks how it can share as much data as possible with you, while on other fronts it makes external tools and ideas necessary and valuable because it is unwilling to share data it once shared. Thus markets which were once fairly open are getting more and more abstract. It happened with PageRank and SEO, and now it is happening with AdWords too.

Over / Under the Radar Link Buys

PageRank 9 links cheap!!!! Or maybe not ;)

If a market inefficiency is so great that people focus specifically on that inefficiency, then the inefficiency is going to dry up pretty quickly. Either the undervalued commodity is going to have its supply quickly exhausted, or the market maker which lends the value to the commodity will remove the value. Within any topic or vertical there are ideas, and sites focused on those ideas, which have high authority but limited income opportunity. Conversely, the sites focused on maximizing revenue generation are typically nowhere near as authoritative. So they either have to create secondary sites, launch viral marketing campaigns, or hunt for authority where they can buy it at an affordable price.

For example, there are lots of sports equipment and sports collectible sites online which have limited authority. There are, however, authoritative sites about each and every sport. It looks like this site, from 2000, with about 20 edu links, DMOZ listings, and Yahoo! Directory listings, allows you to sponsor pages for a year for $5 each.
I probably would not sponsor just a few pages on that site. I would be more inclined to spend a few grand to buy exclusive sitewide sponsorship rights.

Not all of the sites are going to suggest a price for a reference on their sites (and in fact most webmasters are quite unaware of the value of their content and their link authority). You may have to hunt around to find those kinds of sites. But if you think of sites that

  • would have high authority; and

  • would not be noticed by most of your competitors; and
  • would have almost no income

those will be the sites that will give you great long-term link value. Jim Boykin is great at finding those types of sites.

If you think that getting a link off the site which creates the standards that run the WWW is sneaky, or that nobody will find it, then you are probably wasting your money, and getting a bunch of links from smaller and more related sites is a better investment, especially in the long term. The big pages tend to get spotted quickly, fill up with spammy links quickly, and get either algorithmically or manually handled. I learned that in the past when I did a Mozdev donation.

Some people have assumed that I am a huge spammer because I donated to the W3C, but I have donated to many projects where I didn't donate just for a link, and am not ashamed to admit that I supported the WWW.

I was (and still am) a big fan of donating for links, but have generally got much lazier on that front because lately it has been far cheaper to create interesting content or tools to build up the authority of this site. It has enough exposure that if my ideas are well implemented they are going to spread.

I do, however, sometimes make spammy pages or buy spammy links. Some are just to joke or play around or test the market. Others are dual purpose or passive lead generation streams (for instance, on this page I am not selling anything to do with eBay; I just wanted to test the authority of my other blog, and a number of people who find that page end up connecting it to this site and buying my book). I don't actively solicit most of my spammy links (like the ones on the splogs about wall clocks), but what does it really matter if you have a few spammy links if you also have tons of legitimate ones? If getting a few low quality links gets people to talk about you, does it also increase your exposure and help build good free secondary links? Sometimes, methinks ;)

Who is the moral authority to determine the relevancy of a link or a search result? Are their guidelines anything deeper than self promotion? And why does their opinion matter? So long as whatever you do is enjoyable and profitable, and you weigh the risk to reward ratios, I don't think much else matters.

Search Engine Cloaking FAQs: an Interview With Dan Kramer, Creator of Kloakit

I recently asked Dan Kramer of KloakIt if I could interview him about some common cloaking questions I get asked, and he said sure.

How does cloaking work?

It is easiest to explain if you first understand exactly what cloaking is. Web page cloaking is the act of showing different content to different visitors based on some criterion, such as whether they are a search engine spider, or whether they are located in a particular country.

A cloaking program/script will look at a number of available pieces of information to determine the identity of a visitor: the IP address of the connection, plus the User-Agent string of the browser and the referring URL, which are contained in the HTTP headers of the request for the web page. The script will make a decision based on this information and serve the appropriate content to the visitor.

For SEO purposes, cloaking is done to serve optimized versions of web pages to search engine spiders and hide that optimized version from human visitors.
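
To illustrate the mechanism Dan describes, here is a minimal sketch of the decision logic in Python. The spider IP and User-Agent lists are tiny hypothetical stand-ins; a real cloaking package maintains far larger, constantly updated lists and hooks directly into the web server.

    # Minimal cloaking dispatch: choose which version of a page to serve
    # based on who appears to be asking. Lists are hypothetical stand-ins.
    SPIDER_IPS = {"66.249.66.1"}
    SPIDER_UA_TOKENS = ("googlebot", "slurp", "msnbot")

    def choose_page(ip, user_agent, referrer=""):
        ua = user_agent.lower()
        if ip in SPIDER_IPS or any(tok in ua for tok in SPIDER_UA_TOKENS):
            return "optimized.html"  # text-heavy version for spiders
        return "landing.html"        # version meant for human eyes

    print(choose_page("66.249.66.1", "Googlebot/2.1"))     # optimized.html
    print(choose_page("10.0.0.5", "Mozilla/5.0 Firefox"))  # landing.html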

What are the risks associated with cloaking? What types of sites should consider cloaking?

Many search engines discourage the practice of cloaking. They threaten to penalize or ban those caught using cloaking techniques, so it is wise to plan a cloaking campaign carefully. I tell webmasters that if they are going to cloak, they should set up separate domains from their primary website and host the cloaked pages on those domains. That way, if their cloaked pages are penalized or banned, it will not affect their primary website.

The types of sites that successfully cloak fall into a couple of categories. First, you have those who are targeting a broad range of "long tail" keywords, typically affiliate marketers and so on. They can use various cloaking software packages to easily create thousands of optimized pages which can rank well. Here, quantity is the key.

Next, you have those with websites that are difficult for search engines to index. Some people with Flash-based websites want to present search engine spiders with text versions of their sites that can be indexed, while still delivering the Flash version to human visitors to the same URL.

What is the difference between IP delivery and cloaking?

IP delivery is a type of cloaking. I mentioned above that there are several criteria by which a cloaking script judges the identity of a visitor. One of the most important is the IP address of the visitor.

Every computer on the internet is identified by its IP address. Lists are kept of the IP addresses of the various search engine spiders. When a cloaking script has a visitor, it looks at their IP address and compares it against its list of search engine spider IP addresses. If a match is found, it delivers up the optimized version of the web page. If no match is found, it delivers up the "landing page", which is meant for human eyes. Because the IP address is used to make the decision, it's called "IP delivery".
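
Since spider IPs are generally published as ranges rather than single addresses, a sketch of the IP-delivery check might match against CIDR blocks using Python's standard ipaddress module. The range below is purely illustrative, not an authoritative crawler list.

    import ipaddress

    # Illustrative CIDR block only; real lists are longer and change often.
    SPIDER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]

    def is_spider_ip(ip):
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in SPIDER_NETWORKS)

    print(is_spider_ip("66.249.66.1"))  # True
    print(is_spider_ip("203.0.113.7"))  # False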

IP delivery is considered the best method of cloaking because of the difficulty involved in faking an IP address. There are other methods of cloaking, such as by User-Agent, which are not as secure. With User-Agent cloaking, the User-Agent string in the HTTP headers is compared against a list of search engine spider User-Agents. An example of a search engine spider User-Agent is
"Googlebot/2.1 (+http://www.googlebot.com/bot.html)".

The problem with User-Agent cloaking is that it is very easy to fake a User-Agent, so your competitor could easily decloak one of your pages by "spoofing" the User-Agent of his browser to make it match that of a search engine spider.
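
Faking a User-Agent really is that easy. A few lines of Python's standard library will fetch a page while claiming to be Googlebot, which is exactly how a competitor might decloak a User-Agent-cloaked page (the URL is a placeholder):

    import urllib.request

    # Request a page while claiming to be Googlebot; a User-Agent-cloaked
    # page will hand this request its spider version.
    req = urllib.request.Request(
        "http://example.com/",
        headers={"User-Agent": "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read(500))  # first bytes of whatever the server served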

How hard is it to keep up with new IP addresses? Where can people look to find new IP addresses?

It's a chore the average webmaster probably wouldn't relish. There are always new IP addresses to add (the best cloaking software will do this automatically), and it is a never-ending task. First, you have to set up a network of bot-traps that notify you whenever a search engine spider visits one of your web pages. You can have a CGI script that does this for you, and possibly check the IP address against already known search engine spiders. Then, you can take the list of suspected spiders generated that way and do some manual checks to make sure the IP addresses are actually registered to search engine companies. Also, you have to keep an eye out for new search engines... you would not believe how many new startup search engines there are every month.
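
The registration check Dan mentions can be partially automated with a forward-confirmed reverse DNS lookup, sketched below. The hostname suffixes are illustrative assumptions, not a complete list of legitimate crawler domains.

    import socket

    # Suffixes legitimate crawler hostnames tend to end with (illustrative).
    KNOWN_SPIDER_SUFFIXES = (".googlebot.com", ".search.msn.com", ".crawl.yahoo.net")

    def verify_spider(ip):
        """Reverse-resolve the IP, check the hostname suffix, then
        forward-resolve the hostname to confirm it maps back to the IP."""
        try:
            host = socket.gethostbyaddr(ip)[0]
            if not host.endswith(KNOWN_SPIDER_SUFFIXES):
                return False
            return socket.gethostbyname(host) == ip
        except OSError:
            return False

    print(verify_spider("66.249.66.1"))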

Instead of doing it all yourself, you can get IP addresses from some resources that can be found on the web. I manage a free public list of search engine spider IP addresses. There are also some commercial resources available (no affiliation with me). In addition to those lists, you can find breaking info at the Search Engine Spider Identification Forum at WebmasterWorld.

Is cloaking ethical? Or, as it relates to SEO, is ethics typically a self-serving word?

Some would say that cloaking is completely ethical, others disagree. Personally, my opinion is that if you own your website, you have the right to put whatever you like on it, as long as it is legal. You have the right to choose which content you display to any visitor. Cloaking for SEO purposes is done to increase the relevancy of search engine queries... who wants visitors that aren't interested in your site?

On the other hand, as you point out, the ethics of some SEOs are self serving. I do not approve of those who "page-jack" by stealing others' content and cloaking it. Also, if you are trying to get rankings for one topic and sending people to a completely unrelated web page, that is wrong in my book. Don't send kids looking for Disney characters to your porn site.

I have seen many garbage subdomains owning top 10 rankings for 10s to 100s of thousands of phrases in Google recently. Do you think this will last very long?

No, I don't. I believe this is due to an easily exploitable hole in Google's algorithm that really isn't related to cloaking, although I think some of these guys are using cloaking techniques as a traffic management tool. Google is already cleaning up a lot of those SERPs and will soon have it under control. The subdomain loophole will be closed soon.

How long does it usually take each of the engines to detect a site that is cloaking?

That's a question that isn't easily answered. The best answer is "it depends". I've had sites that have never been detected and are still going strong after five or six years. Others are banned after a few weeks. I think you will be banned quickly if you have a competitor who believes you might be cloaking and submits a spam report. Also, if you are creating a massive number of cloaked pages in a short period of time, I think this is a flag for search engines to investigate. Same goes for incoming links... try to get them in a "natural" looking progression.

What are the best ways to get a cloaked site deeply indexed quickly?

My first tip would be to have the pages located on a domain that is already indexed -- the older the better. Second, make sure the internal linking structure is adequate to the task of spidering all of the pages. Third, make sure incoming links from outside the domain link to both the index (home) cloaked page and to other "deep" cloaked pages.

As algorithms move more toward links and then perhaps more toward the social elements of the web do you see any social techniques replacing the effect of cloaking?

Cloaking is all about "on-page" optimizing. As links become more important to cracking the algorithms, the on-page factors decline in importance. The "new web" is focused on the social aspects of the web, with people critiquing others' content, linking out, posting their comments, blogging, etc. The social web is all about links, and as links become more of a factor in rankings, the social aspects of the web become more important.

However, while what people say about your website will always be important, what your website actually says (the text indexed from your site) cannot be ignored. The on-page factors in rankings will never go away. I cannot envision "social techniques" (I guess we are talking about spamming Slashdot or Digg?) replacing on-page optimization, but it makes a hell of a supplement... the truly sophisticated spammer will make use of all the tools in his toolbox.

How does cloaking relate to poker? And can you cheat at online poker, or are you just head and shoulders above the rest of the SEO field?

Well, poker is a game of deception. As a pioneer in the cloaking field, I suppose I have picked up a knack for the art of lying through my teeth. In the first SEO Poker Tournament, everybody kept folding to my bluffs. While it is quite tempting to run poker bots and cheat, I find there is no need with my excellent poker skills. Having said all that, I quietly await the next tournament, where I'm sure I'll be soundly thrashed in the first few minutes ;)

How long do you think it will be before search engines can tell the difference between real page content and garbled markov chain driven content? Do you think it will be computationally worthwhile for them to look at that? Or can they leverage link authority and usage data to negate needing to look directly at readability as a datapoint?

I think they can tell now, if they want to devote the resources to it.

However, this type of processing is time/CPU intensive and I'm not sure they want to do it on a massive scale. I'm not going to blueprint the techniques they should use to pick which pages to analyze, but they will have to make some choices. Using link data to weed out pages they don't need to analyze would be nice, but in this age of rampant link selling, link authority may not be as reliable an indicator as they would like. Usage data may not be effective because in order to get it, the page has to be indexed so they can track the clicks, defeating the purpose of spam elimination. Their best bet would be to look at creation patterns... look to see which domains are creating content and gaining links at an unreasonable rate.
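
As a toy illustration that this is computable, here is one naive heuristic in Python: low-order generated text tends to reuse the same short word chains far more often than natural prose of similar length, so a repeated-trigram ratio can flag suspicious pages. The threshold is a guess, and a real engine would use far stronger language models.

    from collections import Counter

    def repeated_trigram_ratio(text):
        """Fraction of word trigrams occurring more than once; generated
        spam copy tends to score much higher than natural prose."""
        words = text.lower().split()
        trigrams = list(zip(words, words[1:], words[2:]))
        if not trigrams:
            return 0.0
        counts = Counter(trigrams)
        repeated = sum(c for c in counts.values() if c > 1)
        return repeated / len(trigrams)

    def looks_generated(text, threshold=0.2):  # threshold is a guess
        return repeated_trigram_ratio(text) > threshold

    print(looks_generated("buy cheap widgets online now " * 40))  # True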

What is the most amount of money you have ever made from ranking for a misspelled word? And if you are bolder than I am, what word did you spell wrong so profitably?

I made a lot of money from ranking for the word "incorparating". This was waaay back in the day. I probably made (gross) in the high five figures a year for several years from that word. Unfortunately, either people became better spellers or search engines got smarter, because the traffic began declining for the word about four or five years ago.

If I wanted to start cloaking where is the best place to go, and what all should I know before I start? Can you offer SEO Book readers a coupon to get them started with KloakIt?

KloakIt is a great cloaking program for both beginners and advanced users, because it is easy to get running and extremely flexible and powerful. There is a forum for cloakers there where you can go for information and tips. I am also the moderator of the Cloaking Forum over at WebmasterWorld, and I welcome questions and comments there.

SEO Book readers can get a $15.00 discount on a single domain license of KloakIt by entering the coupon code "seobook" into the form on the KloakIt download page. I offer a satisfaction guarantee, and, should you decide to upgrade your license to an unlimited domains license, you can get credit for your original purchase towards the upgrade fee.

----

Please note that I am not being paid an affiliate commission for KloakIt downloads, and I have not deeply dug in to try out the software yet. I just get lots of cloaking questions and wanted to interview an expert on the topic, and since Dan is a cool guy I asked him.

Thanks for the interview, Dan. If you have any other questions for Dan, ask them below and I will see if he is willing to answer them.

Why Linguistics is Important

As a marketer, in most cases you cannot shape public opinion or create a profitable economy of scale unless you understand how words are used in a manipulative manner to shape opinion and create profit for external antimarket institutions (like Google).

Looking through economic history and the history of linguistics enables you to realize opportunities when others are not being honest or consistent in their policies, and it helps you form an argument which enables you to sound logical and reasonable while reframing the debate at an appropriate time. (For example, let's look at Google's nofollow policies.)

If you like to read, I highly recommend A Thousand Years of Nonlinear History. Thus far it is the most important book I have ever read, and is worth far more than my SEO Book, even though it will cost you less than $20. It is not for everyone, but if you are able to understand abstract patterns I doubt you will ever find another book that is more convincing at shaping your worldview into a more impartial or profitable one.

Maximizers vs Optimizers & the Hollow Middle

I get asked to review a wide array of sites, with owners asking "what is wrong" and "why isn't this working".

Many times I think the underlying problem is something I call cart before the horse syndrome. While you can view many data points in the competitive landscape, what you see when you view a site now is not the way it has always been. Many of the most authoritative sites were created without any commercial intent; the site owner later fell into a business model, and as they saw profit they started to maximize their profit potential.

If you start off with a lead generation form as your website and are unwilling to give anything away until people give you money or an email address then you should be looking more toward the pay per click market than at organic SEO.

There is nothing wrong with maximizing your potential profit, but if you create a site geared around converting 10% of the site visitors into paying customers right from the start, you are probably going to limit your ability to gain any serious link authority and serious distribution, unless your conversion rate and profits are so great that you can convince affiliates to push your product.

If you can afford heavy PPC spending by automating your sales process and maximizing your ROI that is fine, but if you want free traffic there are hidden costs to maximizing right out of the gate. It is like buying a 99 cent burger. Sure the upfront cost is next to nothing (and it seems like you are getting more for less), but as competing sites build traffic while you stagnate those invisible costs start to reveal themselves. You have to consider what search engines want and what your site visitors want. Try to create something that covers those wants and then roll commerce into it.

Seth Godin frequently stresses that getting people to PAY attention is a cost, and even if they give you no money PAYING attention is still a cost. Once you earn that it is worth a lot of money, because it takes a long time to build trust. And trust is fragile. If I hadn't built up a lot of friendships and trust over the last couple years there is no way the SEO for Firefox launch would have gone so well. The new links and new readers that tool brought in are probably worth far more than the tool cost to build, but it may not have spread so well (and it may not have covered its cost) if I had not worked so hard to build up my authority.

Alexa traffic stats for SEO Book.

Hitting the traffic jackpot once does not make one a marketing expert, but in spite of being on the delicious popular list and Digg homepage yesterday this site only doubled its typical traffic. A friend of mine says that it is a marathon and not a sprint, and that is the way you have to look at getting traffic, especially if you have a new site.

Back to the new sites I get asked to review. What do they need to spread messages or compete in the SERPs?

  • Set reasonable goals. Do not expect to rank for mortgage or search in one month if you have a $0 marketing budget and a site that is so bland or conversion oriented that it would never merit a single legitimate organic citation.

  • Pick a path and run with it. Be a maximizer or an optimizer, but know your path and run with it. If you are stuck mixing it up in the middle you will probably do worse than a person who is working hard at either of the edges. After you are well established on either front and are beyond self sustaining, then you have money to invest and room to play and test, but you need to have a clear message from the start. You don't want your site to say one day that you believe in taking the hard and steady and slow and... way to the top, and then have visitors come to your site the next day to see a picture of a check for $50,000 that you allegedly made while you were on vacation last week.
  • Come up with a clear, unique branding angle that makes you stand out. Make sure it is obvious what you want people to do on your site and make sure it is obvious what message you want them to spread away from your site. When in doubt it is better to be niche and unique over broad and not unique.
  • Do not choose cheapest as your branding angle unless you are a masochist.
  • Create a clean site design which reinforces your brand image. For example, if your brand is supposed to be fun and hip, POO BROWN is a bad color. If your service is supposed to convey a sense of trust to businesses or people seeking health advice, go easy on red and orange. I typically favor clean over going too far with a design. If you can find a well priced logo designer and spend a day learning a bit of CSS you can create a reasonably decent looking site for around $100.
  • If you are unsure of what you want to do, participate in topical communities to learn about the market and what the market wants. If all of your marketing is done on your site and it is not backed up by friendships away from your site, it is going to be hard to convert potential prospects if they dig further into the SERPs and can't find anything about you other than a few cheesy syndicated articles and free directory listings. The web is cool, but also make sure you find your way to relevant off the web (i.e., real world) events. That is where you really solidify your friendships and get to know the people you really should know.
  • If you have down time make sure you keep learning. You should be able to learn quicker than the market leaders, because you know less, are more hungry, and have less busywork filling your day if you are seriously focused on success and are new to a market. Read and experiment widely. Especially if you aim to be a consultant, review that which you consume (it helps build relationships, and most personal brands are not too deeply developed, so it also provides a cheap and easy relevant traffic source). Don't wait around for a golden day when things just fall in place. Don't be afraid to be wrong. I have had people take the time to email me and tell me what a piece of shit I was for having incorrect information on my site, only later to have them buy my product, put huge ads for it on their site, and recommend it on various community sites.
  • If you want to rank for competitive terms you have to give to get. Look to create ways to make people want to revisit your site many times and/or link to your site for having a definitive topical resource. When you create a (hopefully) definitive article it may go nowhere, but if you do half a dozen of them well, eventually one of them will take off. You are over-investing, hoping that eventually one of the investments pays big dividends. When you have a great idea make sure you tell a few friends to see if they would be willing to help you market it.

Containers, Aggregators, and Editors

I recently got asked to write a couple of articles for various websites and publications. I said no problem, but then I kept putting them off. I just handed one in today and have not got feedback yet, but I am uncertain how well it went. Yesterday I handed in another and the editor was less than impressed. Then it sorta dawned on me that I am a bit spastic and random in nature, and without using those words that is sorta how my article was described (in a nicer way, of course).

Some people do well with containers and other people driving them, but I have been so (searching for a word here...maybe undomesticated) that it is quite hard to fill in the box or create something that is exactly how someone else wants it. I got so focused on random abstract thoughts that I am only really good at doing something if it is something I really want to do when I want to do it.

I have a PowerPoint presentation and speech to put together and am hoping I do well with it. The biggest benefit to it over the articles is that the request for it came after I put together something similar in nature but in another format.

So I guess my (semi?)relevant marketing thoughts on this post are:

  • I think the closer you are to your audience the easier it is to be successful (at least for me).

  • The more passion and interest you have in a topic the easier it is to be successful.
  • It is definitely worth focusing on what you are good at, but it is also a good thing to occasionally try different containers or formats. I suck at many containers and do well with others. Respect the container, or throw the container away and try something new ;)
  • For most people, publishing format (so long as it is legible) likely has little to do with your personal credibility level. Everyone is different and probably has their own best way to express themselves. I don't think mine is in 1,000-word articles...at least not at this point!

What have been your most successful publishing formats? Do you think the structure of the web will drastically change media consumption habits?

The Idiocy of Nofollow Abuse & Link Hoarding

Recently it was noted that Business.com started using nofollow on many of their outbound links. If you don't trust the content of a site then why link to it at all? To list it on your own site and then put nofollow on it is to say that you don't trust your own content. Which is especially stupid. And perhaps the quickest way to become irrelevant, if you are an editorial listing company.

It turns out they were likely using nofollow on the free listings to some of the higher quality sites, which in turn means that the links without nofollow are pointing at sites that are on average of lower quality than the sites they added nofollows to.

If I were a CRM company I would think that on average a link from a page that links to Salesforce.com is worth more than a page that does not. If I were a software company I would think that on average a link from a page that links to Microsoft.com is worth more than a page that does not.

I think it muddies their credibility. A lot. Think of the quality of their site from a search engineer's perspective:

Oh, the only links they left live were the low quality ones. Outbound link authority nuked. Next.

A site which uses nofollow on most of their quality outbound links also reduces their outbound good link to bad link ratio. Even if search engines still counted Business.com links I think the loss of quality outbound links hurts their authority far more than whatever gain someone gets from having a link on a page with fewer links on it.

SEO for Firefox

SEO for Firefox.

Probably the coolest Firefox SEO extension ever created. :)

It now works on international versions of Google and Yahoo! (previously it only worked on the global Google.com and Yahoo.com sites), and it pulls a ton of marketing data right into the search results to help you analyze how competitive a marketplace is. Learn more about SEO for Firefox.

Manuel De Landa's A Thousand Years of Nonlinear History - Cool Book

These are my opinions and ideas gelled with notes from the contents of A Thousand Years of Nonlinear History by Manuel De Landa. Keep in mind I may have misinterpreted some of his points and interjected my bias into the points. If something looks like a rational, well thought out point, I am probably syndicating Manuel De Landa's point. If something looks like a point of anger being expressed, that is probably me adding one of my related views to explain why and how I believe that portion of the book relates to my life or the world as I have experienced it. I believe the underlying points of the book are:

  • progress is a misguided notion
  • reality and consciousness (and everything around us) are just states of energy and biomass hardened by history
  • our own history biases how we evaluate history
  • many things are not linear even if we have traditionally been led to think of them in that way

Where applicable the book may also appear biased toward heterogeneous over homogenous systems due largely to the great blinding support of homogenous systems in the business community (homogenized systems are easier to extract profit from due to the lowered costs of mass production). This quote really states how he looks at the true cost of homogenization and discipline:

As with all disciplinary institutions, a true accounting must include those forces that increase (in economic terms of utility) and those that decrease (in political terms of obedience).

I think one of the biggest things I got out of the book was a fresh reminder, from a different perspective, that some of the scummiest aspects of capitalism are not intentional, but are just uncared-for side effects of other business processes.

I was talking to a reporter about some tech companies recently when I stated that I thought that certain companies would do this or that and why. He asked if I got that from talking to those companies. I said no, that my knowledge was from thinking about economic theory and business theory stuff.

Many of those ideas came from reading this book.

Almost any book you read is going to be in some ways biased, but I would say this book was biased toward reality without much propaganda or hidden agenda (i.e., I think this book was written out of passion and interest more than to mislead me into buying into something). This is really the most mind-opening book I have ever read. As a marketer participating in a somewhat new network it is amazingly fascinating to read about how economics, biology, and linguistics have evolved over the last 1,000 years.

Below is a point by point Aaron Notes type review of the various sections of the book. I initially took the notes for myself, but thought it would be worth posting them anyhow.

From the introduction

  • reality and consciousness are just a state of energy
  • evolution or other 'improvement' to the state of living does not mean things are inherently better...just that they are different.
  • when a new state of being comes into existence it can co-exist with prior states...the new state does not necessarily have to supersede the old state.
  • when we look back at history we are biased by the path it has taken and the narrative current society tells us about the past. to understand social dynamics you have to try to build things up from the bottom as well as break them down, instead of relying on breaking things down alone.

part 1

  • the creation of agriculture allowed an abundance of non human energy to be synthesized and stored for consumption, and led to the creation of many cities. fishing or other energy sources could have also led to the creation of cities.
  • trade winds were another important force of energy that were easy to capitalize on due largely to the inefficiency offered by limited competition and market separation
  • fossil fuels led to the next major growth (again because they made it easy to store and synthesize non human energy)
  • cities act as parasites that suck off the surrounding area
  • currency was first created as a political means to collect excise taxes, but eventually enabled commerce with less friction
  • a large part of the reason why Western culture advanced quicker than Eastern culture was competition and the lack of a large inefficient homogenized religious and political bureaucracy
  • The role of the isolated individual in society is typically overrated and oversimplified.

  • Adam Smith's invisible hand theory is a bit too idealistic for real world application. The market friction which it ignores is largely what drives many business models and large socioeconomic changes.
  • Virtually any manufactured product is created in cascading sets of quality, with people further from the source valuing things of lower quality and emulating the original to raise the quality of their products and local skill level.
  • the establishment of reliable credit sources allowed powerful organizations, cities, and nations to sap the resources of surrounding areas
  • Many of the most profitable merchants gained freedom of motion, allowing their businesses to capitalize on whatever is high profit at a given time. i.e.: Google does not do much in the physical world, but plays a large role in controlling human interaction with information and commerce. As new keywords develop they are quickly able to monetize them. As old markets die off due to political, cultural, economic, or other forces they are not tied to them.
  • as companies grow in international power they create forces that attack government norms.
  • the lack of centralization within Europe caused increased investment in arms races that required societies to be more efficient and innovative to survive (when compared with the inward focused monopolistic stronghold on power in Eastern cultures)
  • most markets are range bound rather than actually reaching a single state of equality or equilibrium.

part 2

  • social class stratification via genetics and other aspects is a natural part of life; however, it does not need to occur as aggressively as it does.

  • social stratification is largely driven by people who set up moral, ethical, religious, or legal guidelines for others to follow (which is a large part of the reason why I <3 civil disobedience, as it undermines the abuse of such power).
  • if some of those systems lost power many social and economic markets would remain self organized by other positive and negative feedback loops.
  • many people prefer to view things through a hierarchical lens because it is much harder to understand and visualize the world through thinking of effects of positive and negative feedback and reciprocal causality. Even at a young age we are generally taught to develop our thinking patterns in terms of concrete causes and effects.

part 3

  • the military required interchangeable parts, and the US military bred a system which provided quality assurance over the railroads. after the government created systems to make railroads a functional business it handed over the reins of profit to private enterprise
  • import-substitution dynamics and crafting of individual items gave way to automation and homogenization, such that interchangeable parts were cheap and easy to make.
  • many small businesses of similar trade exist near one another as being near one another improves their social circumstances, market mind share, and creates an environment where ideas can more easily flourish
  • most innovation comes from the smallest companies and individuals, as they are less confined by their business models.
  • after smaller businesses prove the profit of a business model, larger businesses based on economies of scale either replicate and automate those business models or consume those companies
  • companies buy other companies to control them via internal rules instead of buying their obedience and productivity
  • with shareholders existing external to corporations there is a bias for management not just to make the business as efficient as possible, but to make pieces of it complex enough to be incomprehensible to outsiders, such that executives can justify continuing to receive (and increase) their compensation for running the company.
  • electrical energy made it easier to miniaturize machines, and thus increased productivity by making automation easier, quicker, cheaper, and more decentralized.

part 4

  • every non-plant is a parasite

  • heterogeneous systems are more resilient than homogenous systems
  • humans make many pieces of the food chain more homogenous
  • genetic diversity is required to evolve new species

  • the genetic code within one animal type is quite homogenous
  • most human gene variations are superficial in nature

  • immigration is probably the single largest factor which causes a mixing of human genes
  • the dense population of cities made it easy for disease to spread
  • disease (local or imported) was a heavy factor in successful or unsuccessful colonization
  • richer individuals tend to opt for fewer children since they perceive a higher cost of living
  • whenever population declined (typically due to poor crop yield or disease) animals took back land

part 5

  • changes in the genetic code of one species change the genetic makeup of other species (this is especially true in predator-prey relationships).
  • the definition of optimal is relative (strengthening any part of a system may make some other parts of it weaker)
  • extreme energy flows can shift equilibriums
  • social darwinism is quite bogus, as it fosters racism and ethnic cleansing. earlier this month an SEO I know who describes himself as a Jew explained to me how he viewed all muslims as terrorists and that he did not think ethnic cleansing was a bad thing. He objected to giving me his address when I offered to send him a 'Hitler was right' t-shirt.
  • genetic change is glacial compared to the speed of cultural change
  • while different cultures and linguistic backgrounds have a varying number of color labels the order of accumulation tends to be well aligned (typically starting with black, white, primary colors in certain orders). In Basic Color Terms: Their Universality and Evolution Brent Berlin and Paul Kay stated there are genetic constraints on perception guiding accumulation of cultural replicators.
  • while it is much harder to detect than the other way around, cultural materials may influence the accumulation of genes. an example of this might be how some people are lactose intolerant.
  • cultural policies can eventually become institutional, which can have both good and bad effects. an example of a good effect would be the curbing of incest. a few bad examples would include medication replacing nutrition and land erosion due to poor cultural farming policies.

part 6

  • when times are good humans outgrow their own good

  • the new world (the americas, australia, etc.) created supply zones and gave a place to put the excess growth of humans
  • many old world plants and animals spread to the frontiers of the new world ahead of civilization
  • military and trade ports host many people, animals, plants, and goods in a wide array of states which are conducive to spreading disease.
  • because medical facilities in these locations saw people in a wide array of states it was important to make a clear distinction between that which is illegal and the concept of evil
  • the use of observation and binary systems improved medical care. De Landa mentions how Michel Foucault stated they "treat lepers as plague victims"
  • discipline and homogenization are required to create economies of scale
  • when applied to the food supply (typically by big business) it comes in the form of gene control
  • some corporations create seeds that die if not used that year, and also introduce other genetic defects which require the use of excessive fertilizer or other (often monopoly controlled) inputs
  • this genetic control can be described as "etching entry points for antimarkets into the crops' very genes"
  • the gene makeup of many seeds is protected as a trade secret
  • short term homogenization may increase quality, but in many cases, given enough time, natural selection will perform better than over-homogenized artificial selection. a hidden wealth stored in some poor countries is the genetic diversity of their food supply.
  • homogenized systems are more susceptible to epidemics
  • the genetic control applied to plants has also been applied to animals and some states went so far as trying to apply them to people.
  • eugenics is the belief that by studying heredity and deploying selective breeding techniques you can improve the human race. alternate eugenics definitions here
  • While the immigration laws did not clearly state eugenics in them, some portions of the US believed that Northern European humans provided the highest quality genetic stock.
  • starting in Indiana in 1907, over 20 US states sterilized thousands of people for things like being absent minded.
  • Those who still believe in ethnic cleansing after Hitler probably do not deserve to be alive and should cleanse themselves from the populace.
  • There are also non-traditional ways to control human reproductive cycles.
  • Some wars intentionally underequipped types of soldiers to allow them to be cleansed from the gene pool.
  • To this day military recruiters prey on the young, the poor, and those of below average intelligence. While that may sound ultra biased, my thesis for that statement is based on my own experiences. I grew up in a poor town and joined the military when I was 17. While I am quite economically successful I have not yet decided where I want to move to, and still live in a mobile home (I moved into it with a friend a few years ago to cut my living costs back when I was just learning about the web and only making a couple hundred dollars a month). Earlier this month yet another military recruiter knocked on my door. While being of about average intelligence I literally scored off the charts on most of the military tests I took when I was 17 (even the nuclear power test), which should have been a strong indication that the tests were scaled toward people of below average intelligence.
  • Early obstetricians and gynecologists screwed up much worse than midwives by making it easier to spread disease and also by excessively using forceps at birth.
  • Private enterprise also took choices away from mothers by sneaking in birthing formula without the mother knowing it was being given to the baby.
  • To this day tryptophan is common in birthing formulas but is illegal to buy as a supplement. Around 15 million Americans were using tryptophan, but it became illegal roughly around the same time that Prozac was launched as a wonder drug of the future. Few people questioned how shady that was.
  • large public outrage is often required to get special interests to yield authority. It was 1892 before Hamburg improved sanitation of its water supply. They only did so after a cholera epidemic hit.
  • biotechnology allows us to fight microbes more efficiently by doing things like gene-splicing and gene-gluing enzymes from one creature to inject that information into other creatures. this creates the ability to produce large quantities of affordable microbe fighting cells.
  • while biotechnology makes it easier to fight micro parasites it makes it easier for macro parasites to be injected into a large portion of the food chain and form monopolies
  • In some instances totally useless and potentially cancer causing chemicals are created to help increase yield. In many cases consumers are not even informed of which food products are contaminated with the garbage.

    Do you really want to trust the people who manufacture Agent Orange when they talk about the effects of chemicals they inject into the food supply? rBGH, which is illegal in many countries, is injected into many US cows to produce more milk. Fox News fired multiple reporters for wanting to air a report about how shady rBGH is. The Meatrix also provides a clip on rBGH.

  • efficiency of extraction and processing (including homogenized size and shape as well as predictable homogenized maturation dates) now are more important criteria than biomass value in many crops. The nutritional value of a crop is largely ignored in favor of the other "more important" (read as more profitable) genetic traits. Improving some of those other genetic traits also comes at the direct cost of lowered nutritional value.
  • when you buy food from outlets that sell on low price you are voting against genetic diversity in the food supply, and against the nutritional value of the food your children and their grandchildren will eat.
  • As nutrition is removed from the food supply, drug companies hook people on garbage prescriptions that treat symptoms of an unbalanced lifestyle with poor nutritional input. Of course it will not be the fault of drug companies when things go astray. In reference to some of these drugs, some FDA members have gone as far as to claim:

    they don't feel that _______ is addictive because it doesn't carry the behavior of a person that is dependent on a drug. A person that will go out and steal to obtain their drug of choice or cause harm to another

  • At the same time children are medicated with these pills that (IMHO) wrongfully replace or cover up natural human emotions. Some of these things are blatantly overprescribed by doctors who learned from textbooks and journals sponsored or funded by self-interested drug companies.

    “Journals have devolved into information laundering operations for the pharmaceutical industry”, wrote Richard Horton, editor of the Lancet, in March 2004. In the same year, Marcia Angell, former editor of the New England Journal of Medicine, lambasted the industry for becoming “primarily a marketing machine” and co-opting “every institution that might stand in its way”.

  • There are reports of things like

    Jeff Weise, the 16 year old who killed seven and then himself this week at his high school, had been taking ________.

  • In the above two _____'s the drugs were different, but they were both in the same drug family, and the same drug family as the drugs associated with a kid in the Columbine shooting.
  • That drug family was born with the original drug being announced as a wonder drug of the future.
  • Around the time of the release of that drug family a natural supplement that about 15 million people were taking which worked on the same neurotransmitter was banned from the United States

    On March 22, 1990, the FDA banned the public sale of dietary L-Tryptophan completely. This ban continues today. On March 26, 1990, Newsweek featured a lead article praising the virtues of the anti-depressant drug Prozac. Its multi-color cover displayed a floating, gigantic green and white capsule of Prozac with the caption: “Prozac: A Breakthrough Drug for Depression.”



    The fact that the FDA ban of L-Tryptophan and the Newsweek Prozac cover story occurred within four days of each other went unnoticed by both the media and the public. Yet, to those who understand the effective properties of L-Tryptophan and Prozac, the concurrence seems “unbelievably coincidental.” The link here is the brain neurotransmitter serotonin — a biochemical nerve signal conductor. The action of Prozac and L-Tryptophan are both involved with serotonin, but in totally different ways.

  • L-Tryptophan, the allegedly harmful supplement, is still added to baby formula in the United States to this very day. To quote the federal government:

    "At the present time, an import alert remains in force which limits the importation of L-tryptophan into the United States, except if it is intended for an exempted use. FDA has provided for the use of manufactured L-tryptophan for special dietary purposes. Manufactured L-tryptophan is a lawful and essential component of foods, such as infant formulas, enteral products and approved parenteral drug products..."

  • Instead of exercising, dieting properly, and/or taking natural supplements, hundreds of thousands of people are now hooked on (ie: a recurring subscription-based expense) addictive drugs that in some cases ruin their social relationships and have widely been reported to have HORRIFIC withdrawal-related side effects.
  • One doctor even offered my unemployed brother a free trial of one of these drugs even though he had no way to afford buying more.
  • systems highly focused on maximal yield efficiency often require external inputs. that reliance on external sources makes it easier for monopolies to corrupt or influence the chain for short term profits.
  • While mentioning the DuPont and Monsanto corporations De Landa stated "rather than transferring pest-resistant genes into new crop plants, these corporations are permanently fixing dependence on chemicals into the crops' genetic base."

Part 7

Before reading this book my only exposure to the concept of linguistics was from reading George Lakoff's rather introductory level Don't Think of an Elephant, so this next section might be a bit hosed.

  • dialects exist in a continuum of overlapping forms
  • linguistic patterns develop based on geographic and socioeconomic conditions
  • communication isolation leads to new languages
  • while people in different regions may speak different dialects it is also possible that many are not self aware of the differences in dialect
  • the further one moves from established prestige and power the more likely they are to find new emerging dialects
  • new dialects are standardized at seats of economic and political power to make it easier to govern or extract profit
  • the influence and standardization from the seats of power spread to the surrounding regions
  • Gottlob Frege's philosophy (as explained by De Landa) "The connection between a given name and its referent in the real world is effected through a mental entity (or psychological state) that we call 'the meaning' of the word."
  • Saul Kripke and Hilary Putnam stress linguistic inheritance by placing more emphasis on the historical and social aspects of language over the "inside the head" concept. Based on this theory "only certain experts can confirm the accuracy of usage."
  • one's ability to define a term is directly related to their ability to manipulate the items or systems being referenced (or their audience they are introducing the term to)
  • language related to survival is less likely to change than less common language
  • informal social networks act as enforcement mechanisms. dense networks are exceptionally self-reinforcing and quite stable in nature (and can thus withstand great pressures from societal norms from larger social networks)
  • middle class dialects change far quicker than local dialects or elite dialects (since the middle class is much more transitory than either of the edges)
  • the upper class can leverage their authority to influence governmental, religious, or other bodies with large reach to push their lens and linguistic frame of reference through to ambitious members of the middle class who soak up this information hoping to increase their own status
  • language or words do not mean anything until a group of people use them to communicate. the ability to introduce words (or word meanings) to a community and have them stick is proportional to your prestige and your number of contacts within the community
  • synthetic languages have inflections, which are used to show things like verb tense
  • analytic languages express grammatical functions through word ordering (subject-verb-object)
  • the trade of objects and experiences with nearby cultures influences linguistic patterns in the local language
  • pidgins occur "wherever contact between alien cultures has been institutionalized" like slave trading ports. pidgins simplify the linguistic norms of their source language.
  • a creole is born out of recomplication of pidgins into a more complex language
  • language usually goes from conqueror to conquered
  • words usually travel from more complex languages to less complex ones
  • J.L. Austin's speech acts "Involve a conventional procedure that has a certain conventional effect, and the procedure itself must be executed correctly, completely, and by the correct persons under the right circumstances."
  • attempts at defining formal languages have generally failed since most people have many influences that are far more influential on their lives than a formal linguistic rule set.
  • the printing press helped harden languages.
  • The Protestant Reformation helped boost local languages and undermine Latin's role in religion and education and thus power
  • "The usefulness of a given set of sounds is guaranteed by the more or less systematic contrast that they have with one another."
  • all languages are in a state of constant change. not only with the addition of new words, but also large variants in word meaning and/or structural purpose.
  • even within a single core language most people speak multiple different dialects, with the dialect depending on their audience and speaking circumstance (ie: professional, technical, family, informal, formal)
  • cities contain both large impersonal environments and hyper focused subcultures with private lives that cause them to be the source for a wide diversity of language.

part 8

  • Noam Chomsky's model of the structure of language is, in effect, an abstract robot
    • language consists of a dictionary and a set of rules
    • we can automatically check if a sentence makes sense
      • generative rules = universal across languages
      • transformational rules = not universal, language specific rules
  • Deleuze and Guattari
    • Chomsky not abstract enough
    • there is no universal language. there is always some overlap
    • need to look at history of social interaction and language to understand linguistic development
  • George Zipf
    • believed in "combinational constraints"
    • by looking at word co-occurrence patterns you could predict what other words might appear (a minimal sketch of this idea follows this list)
  • Zellig Harris
    • introduced "transformation" into linguistics
    • linguistic constraints come from "the gradual standardization (or conventionalization) of customary usage."
    • 3 main constraints guiding language
      • "likelihood constraints" - statistically modeled probability of co-occurance
        • "selection" - the set of the most common words grouped with a word. Words are defined by the words they commonly occur with.
      • operator-argument constraints
        • works on word classes (not individual words)
        • inclusion of a certain class of word demands that other word types occur
      • reduction
        • exceptionally common word pairs may morph into a single word, being reduced without losing meaning
  • Mary Douglas
    • also considers social elements of language in her model
    • "collective assemblage" - "intensity with which individuals are attached to a group"
    • breaks connection down into group and grid, indicating who we interact with and how
    • can create 4 quadrants using group and grid. many social forces drive people to one of the edges
    • "The fourth corner, the fully regulated individuals unaffiliated to any group, is plentifully inhabited in any complex society, but not necessarily by people who have chosen to be there. The groups [bureaucracies or sects] expel and downgrade dissenters; the competition of individualists...pushes those who are weak into the more regulated areas where their options are restricted and they end up doing what they are told."
    • the biggest limitation of her model is that it only works from within a social group or organization

part 9

  • in the 18th century language was subject to strong unification and uniformation forces
  • unification - driven largely by the formation of nation states and disciplinary institutions
  • uniformation - due largely to testing, training, and observing people to create an obedient populace
  • linguistic unity is necessary for tapping patriotism to drive men toward war or peace
  • large energy flows in and around capitals and major cities made it easy for their local standards to spread
  • dictionaries and grammar guides solidify language. Dr Johnson's dictionary was viewed as so important to English that in the 1880s a bill was thrown out of Parliament because it used a word that was not in his dictionary
  • the increasing speed of global communications makes linguistic isolation harder
  • "The very idea of massified advertising meant that large-circulation newspapers were not in the business of selling information to people, but rather of selling the attention of their readers to commercial interests."
  • Following many other industries, mass media quickly became largely driven by antimarket forces.
    • Examples of antimarket behavior:
      • "The formation of a cartel by six New York papers, which resulted in the formation of the Associated Press in the 1860s."
      • Reuters, AP, UPI, and French AFP still exert significant control over the global markets, operating as oligopolies
  • "Rather than aiming for objectivity [newspapers] aimed for widely acceptable neutrality."
  • "Although news agencies are not engaged in a conspiracy to promote 'capitalistic values' around the world, they do have a strong homogenizing effect arising from the routinization and standardization of point of view (with the concomitant distorting simplification) and, ultimately, from the very form of the flow, that is, a flow emanating from very few places to a large number of subscribers."
  • To appear authentic, newspapers widely distribute linguistically incorrect information (especially when quoting people).
  • Linguistic differences of lower classes were seen as a thing of barbarians, until those linguistically incorrect people were cherished as conscripts needed to fight in WWI
  • At the end of WWI French was seen as the most prestigious language in the world. By the end of WWII it was displaced by English.
  • As a contrast to traditional news organizations, the web is more of a community-based many-to-many framework. The web allows communities separated by great distances to come together to discuss a topic.
  • "Computer meshworks have created a bridge to a stable state of social life which existed before massification and continues to coexist alongside it." The Cluetrain Manifesto is a great book to read about that reformation of communities and marketplaces brought about by the web. They have 95 thesis statements. My favorite is Hyperlinks subvert hierarchy.
  • The destiny of the web is still of course largely undecided. While it may allow communities to form easily it is also leveraged by many communities with misguided belief sets.

part 10

  • "The flows of lava, biomass, genes, memes, norms, money (and other 'stuff') are the source of just about every stable structure that we cherish and value (or, on the contrary, that oppresses or enslaves us)."
  • The Earth, institutional norms, social structures, and language can be viewed as bodies without organs that exist at various levels of stratification driven by the intensities of their catalysts and energy flows.
  • Different histories with different stratification levels and rates of change are constantly co-occurring.
  • Our history, language, and science have generally been viewed through an arbitrarily linear lens. "Western societies transformed the objective world (or some areas of it) into the type of structure that would 'correspond' to their theories, so that the latter became, in a sense, self-fulfilling prophecies."
  • While there has been significant homogenization over the last 300 to 400 years, artificially becoming more heterogeneous does not guarantee a better state of humanity, and blindly pushing toward heterogeneous structures is not a good idea since it is likely that "the most destratified element in a mix effects the most rigid restratification" later on.
  • Rather than pushing hard for change without being certain of its effects we should "call for a more experimental attitude toward reality and for an increased awareness of the potential for self-organization inherent in even the humblest forms of matter-energy."

How does this relate to SEO, the web, and marketing?

The current fight over net neutrality is a fight for the belief in heterogeneous systems over monopolized homogenous systems. As noted in this book, antimarket institutions do not always add as much value as they extract from their market position. Network operators (and pocket-padded politicians) assume they know what is best for everyone, but if you listen to Ted Stevens speak you will realize just how misguided many of them are.

Ad agencies like Saatchi are trying to push brands to go after owning mind share for individual words, which could become self-reinforcing if they did it early enough or well enough. Companies sue search engines because they don't rank where they feel they should.

Who controls language? Will search engines and authoritative websites act as our new dictionaries and encyclopedias that harden language?

The web and search engines provide a new social dynamic in coding language.

In some ways search engines make the set norms more self reinforcing (via ease of access to the current status, search personalization reinforcing our current world view, reinforcement of citation data that led to the development of the current status, and running search business models that are more profitable if they limit their trust of new definitions and new statuses).

In other ways they make the set norms less self reinforcing (especially where language is loosely defined and/or a market has limited depth): since they make many opinions accessible and place many results near one another on equal footing, it becomes easier for people of traditionally limited authority and reach to help redefine the meaning of a word.

Some vertical engines also put your or my words alongside or above the New York Times in importance. This changes the media bias from being a rather homogenized bias controlled by similar large business structures to a more diverse set of biases.

The web and search engines provide new social dynamics not only in coding language, but also in coding our status. Currently if you have a lot of trust in a popular topic search engines allow you to leverage that across other topics.

Not only do our status levels rise and fall with the importance of the language we play a role in defining, but search engines also look at social bonds and social interactions that were likely hard to measure for past authority systems.

Search engines also try to understand a universal personal identity. For example, our search history may influence the results other searchers see, and the level of trust placed on that search history data may depend on our ties to the search systems (financial ties due to being large ad buyers or large ad sellers, financial ties due to being a large source of content or in need of search referrals, length of time and volume of activity - search, publishing, or advertising).

In addition to controlling how we find content search engines also control the payout levels from the largest distributed ad networks, which in part determine:

  • What content is profitable? (Ad sales automation makes content production more efficient for smaller groups at the expense of many larger groups. Many web based business models revolve around amateurs working for free for other companies.)
  • How we structure content to be maximally profitable (in terms of money or in terms of reach and influence).
  • What topics people will focus on. (Smaller niches are now more profitable).
  • The type of bias they may be interested in writing the content from.
  • How evergreen versus fresh content is.
  • How topics will merge together or drift apart. (Hyperlinks create connections where there never were links before, and many established trusted businesses will quickly consume new markets by recognizing those connections and drifting toward those markets at a quicker rate.)
  • How people will format content (ie: if I went with a traditional book publisher I would have been lucky to make 10% of what I make selling my ebook, and this factor also controls how much information people will typically put on a page and how frequently they publish)

Then there is the question of whether the mass amateurization of content will change the way citizens perceive content versus advertising, and whether people will become more isolated or more involved in building a world more like the one they want to see.

Affiliate Conversion Rate by Ad Format

For a while I didn't realize that my affiliate software tracked conversions by ad format. I have had a good number of affiliate sales since then ;)

By far the top converting affiliate format is text links using whatever anchor text the publisher wants. Other than that, the general trend is that the smaller the ad, the better it converts. Once you go below 100x100 the conversion rates drop off a bit, but anywhere from 100x100 to 350x250 does great. The traditional banner-sized ads and exceptionally large ads do not convert well, perhaps because they alert a part of the brain that says "I am trying to sell you crap".

What affiliate ad formats do you find most effective? Have you noticed any recent changes in conversion trends based on ad format?

Lee Dodd's Earners Forums Launched

I used to be a big forum junkie, but have recently cut back a bit. There is still lots of great stuff going on in the forum space though. Lee Dodd, who is serious about monetization, launched Earners Forum last week and it is already in the top couple thousand sites on Alexa.

As part of the launch he is giving away over $15,000 worth of cash and prizes, including 5 copies of my ebook. Sign up there if you want to win a free copy.

Tag Spam, See Also, X is Related to Y

I think the biggest form of spam to hit the web in the next year or so is going to be heavy social spam. Not just the stuff Seth mentioned here (where it appears that LookSmart is leading the charge to irrelevancy on yet another front) but lots of other stuff too.
A while ago I mentioned a few tips for getting quick and easy co-citation data, and I have also mentioned shopping comparison pages and writing natural content, but I think many new traffic sources are easy to manipulate right now. Since they are all rapidly evolving and fighting for market share they are going to leave many algorithmic holes open along the way.

On many sites I have seen people upload images for related products or companies using their company or URL as their username or tag name. Google, Yahoo!, Amazon, AOL, and eBay are all experimenting with tagging. You can tag something that gets millions of pageviews from a predefined relevant traffic source.

[Image: Tagging Google Video]

If you are a musician and friends tell you that you sound like an established star, and you submit your song to YouTube or Google Video, do you label it to include a similar star's name in the title, or have a friend tag it with someone else's name? Dare you cover old songs you like and submit those? If related content is already listed does it hurt to vote for it, list it as one of your favorites, or tag it with your URL?

Get in early on market edges and get exposure in the new verticals. Depending on your vertical and brand investment some techniques may provide different risk / reward ratios.

Google Checkout Launched...My Thoughts

Google Checkout launched today. It is a payment system which revolves around Google storing your credit details and making it quicker to check out at various stores. After having signed up and looked at it all, I think this system will likely spread like a weed. The biggest reasons are:

  • Google AdWords is the largest online ad market.

  • Merchants who use the Google Checkout product will get a little shopping cart badge next to their ads, which could increase their ad CTR and thus lower their per click cost.
  • The pricing policy for Google Checkout gives you $10 of free payment processing for every $1 you spend on AdWords (see the worked example after this list).
  • Google is a trusted brand.
  • Most shopping carts are garbage. This is all about making checkout quick, simple, and easy.
  • Merchants are not allowed to change the Google Checkout images. The Google Checkout images do a great job of conveying a fast, easy checkout.

Google uses AdSense as a way to spread their brand across the content portions of the web, but did not have much visibility on highly commercially oriented websites. Web = Google = web is really going to be pounded into people's heads as the Google brand appears on more and more websites. They are the leading monetization network across content sites, and now they have found a way to sneak into commercial sites in the middle of the highly visible conversion process.

This launch also gives Google another data point in the buying cycle, closing the loop from research right on through to purchase. In addition to giving Google a better idea as to the margins and size of a business, this launch also allows Google to trust a merchant more based on consumer feedback and transaction history (although it is unlikely they would want to integrate that data point into the organic results too heavily, because they would prefer to keep an informational bias in the organic results since AdWords already puts commercial results on the search results page).

It's pretty nasty how much power Google has right now. To build that in a decade is absurd. This is no eBay or Paypal killer though, as noted by Ars Technica. But in time, as this and similar programs evolve and are integrated into the AdSense program (or related programs), it helps Google ensure they have the most efficient product recommendation cycle, which should keep them at #1 in the contextual advertising space unless one of two things happens:

  • they lose their market position in search because they are so afraid of manipulation that they drive their results into the void of irrelevancy (and they have been heading down that path recently, but they can get away with it because Yahoo! has not been very good at branding their search and Microsoft is irrelevant and does not even know what a brand is).

  • Microsoft decides to spend billions of dollars operating a distributed ad network at a loss to steal market share (although this is not too likely because this would create an arbitrage opportunity which would be heavily abused by people like me)

I may be coming and going this weekend (and my internet connection has been exceptionally unreliable the last few days), but if you want, I am offering $10 off my ebook price if you buy it through Google Checkout. This is a short-term promotion (which may end sometime this weekend). I just want to see how this works from the merchant end to see how viable I think it is relative to Paypal. Please note that it may take a number of hours for me to get your ebook to you after purchase, as I sleep sometimes (but rarely ;) and my internet connection has not been that great recently.

Update: Order button removed. Thanks to everyone who bought my ebook through Google Purchase.

Here are some more thoughts from my purchase experience.

  • Processing the orders was quite easy, but...

  • I do not like the idea of Google "protecting" my customers for me. Having them suddenly put a brick wall between my customers' emails and my business is not much of a value add from the merchant perspective.
  • I do not understand the point of cloaking a person's email address if you give me their name and physical address.
  • One of the two people who used the Google email cloaking stuff also had order problems with their credit card not going through.

Who is Google Purchase good and bad for?

  • I think it may work well for those who have such a large established brand that most of their traffic comes from people already searching for their brand, but then again if you are operating at that scale, having Google play middleman with some of your customer emails might suck.

  • If you are heavily reliant on search traffic this could put a squeeze on your business. Trusting Google to constantly make the market more efficient when they profit from advertiser inefficiencies is somewhat altruistic for a merchant to do.
  • If you are new to the market and are confident in your product quality, I think Google Purchase can be a way to make it easier for Google to learn to trust you and understand what other products and services your business relates to. But even then you are self-selecting your site as commercial in nature, which may not be good since their search results have an obvious informational bias.
  • If you do not have a website and are just listing a couple products on Google Base and are in no rush to sell the items then Google Purchase can be useful.

Eventually Google will probably use purchase details, search history, and other information consumption habits to make their AdSense program a more efficient ad/product recommendation engine (think of the Amazon emails and on-site notifications that say people who bought x also bought y).

It might be worth thinking about that as one of their end goals and playing with Google Purchase to see how it works and how you can integrate it into your system, but like anything Google Purchase is just a small step in a direction and they are starting off slow, so I do not think it is a make or break now or never thing for most merchants.

If you are a market leader with a low price point and broad distribution you are giving Google a ton of relevant usage data back to refine their ads, especially when weighed against how little value they are adding to your business.

But they do give you a logo :)
[Image: Google Checkout cart]