Does Reciprocal Linking Work?
Recently I saw the Blue Gecko SEO forums ranking at #10 for SEO. Most of the site's link popularity looked like it came from link trades associated with its webmaster resources directory. The main reasons people say link trades do not work are that:
they are usually slow and expensive to build if you do not outsource or automate
most people exchanging links in bulk are not doing so with quality sites
DMOZ Weighting in Yahoo!:
I created a one page site about Effexor which is listed in DMOZ. I have not built any other linkage data, and it is ranking in the mid 30s for Effexor out of over 7,000,000 results.
When the Google Florida update occurred about a year and a half ago some people viewed it as a way to bias the results toward non-commercial or informational listings. If search engines can train users to use the search results for information and the ads for commercial sites, that could do a great deal to change the face of SEO.
Combine that idea with:
most searches being unique
longer and more unique queries typically having higher conversion rates
giving away information builds credibility (in linkage data, consumer trust, & karma)
building communities builds an abundance of content
So I just got an affiliate commission notification for a piece of SEO software that I thought was cool about a year ago, but no longer think is that great.
So the question is, how do you go about maintaining older posts? Is it ok to delete or edit profitable posts if you feel they undermine your current credibility? Should you edit them? Should you delete them? Even if you do prune the past you will likely miss a few posts. Should you feel guilty because someone bought bunk software? Should you not feel guilty since the person ignored that the post was a year old? Should you feel guilty editing or deleting old posts, as though you are hiding your past?
This also reminds me of handing out recommendations and testimonials. Doing so can be a great link and reputation building activity, but after stuff ages and loses its value (as SEO software is known to do) endorsing too many products could wear your credibility thin. Many people who are eager for testimonials are also greedy hucksters who will make sure you pay for your endorsements. As a web marketer my reputation is far and away the most valuable thing I have.
I am not afraid to admit that I am sometimes wrong or make lots of mistakes, but it makes little sense to leave errors that could and should be easy to fix, right? I think the answer going forward is to always be cautious with your endorsements.
Am in the UK now. A few observations:
everyone uses the word "ish"
NFFC was wearing an eSpotting looking shirt. Clearly a reason to like FWHT :)
the Down Hall hotel is in a pretty cool remote setting
JasonD likes the Gaping Void t shirts
a bartender gave a mate 7-UP when they ordered a lemonade
they have street signs that say Queing
Gurtie does not like the nickname TheGurtster
Z is for some reason zed in the UK
Rumour has it DaveN was showing off his back end?
my friend argued that they drive on the right side of the road, but clear as day it is the left side
Just got done doing another update of my ebook right before I left, and would think everyone is really really cool if I sell a bunch of them while I am gone. Of course, they may be cool anyway, but like peanut butter, SEO Book makes everything better. Buy the book. Wear a smile ;)
Currently, the predominant business model for commercial search engines is advertising. The goals of the advertising business model do not always correspond to providing quality search to users. For example, in our prototype search engine one of the top results for cellular phone is "The Effect of Cellular Phone Use Upon Driver Attention", a study which explains in great detail the distractions and risk associated with conversing on a cell phone while driving. This search result came up first because of its high importance as judged by the PageRank algorithm, an approximation of citation importance on the web [Page, 98]. It is clear that a search engine which was taking money for showing cellular phone ads would have difficulty justifying the page that our system returned to its paying advertisers. For this type of reason and historical experience with other media [Bagdikian 83], we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.
a few years later:
It looks bad, coming days after the recent song-and-dance at the Google Factory Tour about how much energy is supposedly expended on core search and ads. Here's a personalized home page, but don't worry, we're not a portal, Google said.
Funny, this type of inattention is exactly what made people get turned off from the portals of the past, when they lost focus on search quality. Yahoo seems to have fixed this redirect hijacking problem, but Google is still struggling with it?
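As an aside, the PageRank algorithm mentioned in the excerpt above can be sketched in a few lines. This is only a minimal illustration of the idea, not Google's actual implementation; the tiny link graph and the damping factor value are example assumptions.

```python
# A rough sketch of the PageRank idea: a page's score is the chance a
# random surfer lands on it, with a damping factor d so the surfer
# occasionally jumps to a random page instead of following a link.

def pagerank(links, d=0.85, rounds=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(rounds):
        new = {p: (1.0 - d) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += d * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new[target] += d * rank[page] / len(outlinks)
        rank = new
    return rank

# Hypothetical three-page web: "a" collects the most inbound links
web = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(web)
# "a" is the most linked-to page, so it earns the highest score
```

The point of the sketch is that rankings fall out of the link graph alone, which is exactly why the cell phone study outranked commercial pages.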
Buy someone lunch. Give them something to talk about. Any Penn State professors want a free lunch? Email me. If your university position is high enough and your university has a great link reputation I am also willing to fly out.
Go to college to become a system administrator and web designer for a school. A friend of mine was both as a freshman last year. I am thinking this friendship may soon grow leaps and bounds.
Get a crap job you do not care about. Write a humorous blog about it until you get fired for it.
Have your child send $8.43 cash to the government to help pay down the federal debt. Make sure you are available for press comments and get links in your coverage.
Move to Texas. They have big links there.
Always carry a big flag around with you. Even in the shower.
Help your local congressman get reelected. Get links from their site.
Join the local government.
Draft government bills with Orwellian terms, calling them the exact opposite of what they do.
Point out the flaws in said government bills.
Join the military and work the .mil link angle.
Buy your way into the government for .gov links. If you cannot afford this, move to a poorer country where you can afford to buy your way in.
Get caught on tape doing something illegal yet humorous.
Get on a radio show. A friend of mine who used to sell adult sex toys (he sold his site, but it still ranks #1 for his primary keyword phrases) would go on the radio to get stations to link to him. The shadier your marketplace is, the more valuable legitimate links are.
Donate or help someone with their site.
Fix someone's car tire on the side of the road.
Accidentally wreck into the car of a famous person, obscenely exclaim it was their fault, and then sue them.
Get run over by a rich person, etc.
Become a semi stalker. Sue the celebrity for stalking you.
Admit yourself to a psych ward or rehab where you know a link rich person is currently staying.
Tell others that they should start a site, knowing they will link to you.
Create free tools or software with powered by or designed by links in them.
Intentionally do something to get sued by a large overbearing company.
Date or marry an annoying overhyped celebrity or marry into a link rich family.
His post was a bit over the top, and I am surprised with that tone and frame anyone wanted to help him, but he got a ton of good advice.
One of the biggest problems with SEO is that sometimes, for an extended period of time, the free leads allow you to profitably run what would otherwise be a completely non functional business model. Too often people take success for granted and do not shore up other marketing methods. Then, seemingly out of nowhere, they pay for that arrogance or laziness as the leads dry up.
With this site, for a period of time, I was a bit arrogant in thinking that it wouldn't fall. Many of my links had the same anchor text since many of my early links used my official site name, and most people linking to this site use the same link text.
For a short period of time my rankings headed south due to too much similar anchor text and a new Google filter. Luckily I had other revenue streams and traffic streams. Even without Google sending much traffic to my site for about a month, my sales were still close to 90% of what they were the prior month.
You have had 1st position for a large number of keywords for years and years, you have mysteriously disappeared overnight from Google, through no fault of your own, and [despite having a very good run over the last decade] are merely days away from having to live on the streets.
The main reason why I don't put much emphasis on SEO for my own business is that we need to be able to manage growth, and the predictability of PPC is perfect for that. If we suddenly landed on page one of Google's results for the right search terms, I'd need to hire 15-20 more people to deal with the flood... then if we dropped back down again, what exactly would we do with those people? No thanks! For me, it's just as important to be able to turn the traffic off when we're growing too fast.
If some of the best SEOs in the world look for alternate marketing channels then it is probably best if other webmasters also create diverse marketing & revenue streams to help pull them through bumpy patches.
While Clint wanted to turn back the clock, search algorithms continue to evolve. This is another reason why some of the worst SEO clients are those who used to rank well when algorithms were less sophisticated. Some of them believe:
that it's easy to do
they know what to do (since they used to rank well)
and you should be able to work for next to nothing
meanwhile their revenue stream has been cut, and they are worried about paying their bills and have little to invest.
While I could probably afford to hire people now, I never have because I wanted to keep costs low in case anything ever fell out of favor. When it did I was still fine because I minimized costs, had other revenue streams, and have diverse traffic sources.
Last year was the first year I made a profit from the web, and I am already saving up and still working hard to create other revenue streams.
I am a Book Junkie:
My cost of living is generally dirt cheap, other than an odd obsession with books. I have bookcases full of books (I only had about 10 to 20 a year or so ago) and buy them way faster than I read them. Recently I have been trying to read many of them to reverse that trend.
I have been reading a good number about SEO and related topics to see how everyone else writes them, if there are certain graphics I should add, etc.
When I initially wrote my ebook I used no graphics at all because I did not want to create a fluffy image book. As time has passed I have been adding some graphics, as they can be useful and help explain some things better than words.
I keep reading lots of books on marketing and web related stuff because you only need to learn a few things for a book to pay for itself.
Marketing a Book:
A while ago Boris Morokovich offered me a free copy of his Pay Per Click Marketing Search Engine Handbook and I have not yet had a chance to read it. He just emailed me again to mention his book, so I glanced through it and am writing my thoughts.
At a glance it looks like it is well written and has some good information about the various engines, history of ppc marketing, ppc & branding, contextual advertising, click fraud, roi tracking, ppc tools and the like.
Again, I have only glanced at it for a few minutes. The general overall view looks solid and it covers lots of stuff, but there were a few things I did not like:
Uses affiliate links. That is fine to do on your site, but I don't like the idea of doing it in a book that is for sale. I used to do that, and did not let it affect my reviews of products, but as I increased the price of my book and it got broader distribution I realized it was not a wise idea to have some people think the book is there to get upsells.
Along the same lines as those affiliate links, I thought the book could have, and should have, given far more coverage to Google and Overture, instead of giving similar review sizes to Google AdWords and PageSeeker, which were both reviewed as part of the top 10 search engines. To me, Google AdWords and Overture are their own group, and the second tier starts after that. PageSeeker might be like 3rd or 4th tier IMHO.
I would have also liked lots of information about creating useful profitable campaigns on Google and Overture before even thinking of trying any of the smaller engines.
The reviews are short (about 1 page) and list pros and cons. The cons many times did not state some of the more important drawbacks. Examples:
If an engine has little to no traffic, then that is a con that should be stated. Don't give an Alexa ranking and expect me to know their level and quality of traffic from that. I mean, I launched a 6 page site about a month ago (spending under $1,000 building it and its tools, and $0 on marketing) which has a better Alexa ranking than some of the reviewed PPC search engines.
Why isn't there a mention of poor traffic quality in the LookSmart review? Leaving that type of information out and placing an affiliate link next to the reviews puts the authenticity of the reviews in question.
Opinionated reviews are usually worth far more than factual sounding reviews that do not give in depth information about personal experience. Quality customer service does not matter if they get no traffic and have garbage traffic quality. Good or bad ROI and net profits, as personally experienced, should be listed in most of the reviews.
huge numbers of people are going to be using RSS to create automated content streams.
RSS will become the next blog comment in the evolution of search. Since the content is machine readable, and some technologies make it easy to grab many related posts on a given topic, it might be a bit hard for search engines to distinguish between original blog posts and fake feeder blogs that just recompile market data from various sources. Some people may even mix legitimate regular posts with the automated streams to make them seem more legitimate or manually compiled.
With hundreds of channels on a given topic you know that search engines are going to be in for some fun. This is yet another reason it will be hard for search engines to move away from link based relevancy systems.
Luckily I already created an annoyingly bright favicon file which is ready and waiting to be used :)
Graphic text ads. hmm. To me it seems like it would make sense for Google to make ads on other sites look different than ads on Google to keep people clicking away at the Google.com ads for as long as they possibly can.
The more graphical Google makes ads on other sites the longer it will be before people become blind to text ads.
Someone should write a press release about Google not being able to control their search index, and nefarious webmasters hijacking other sites to remove them from the search index. Why?
Currently the ball is in Google's court, with SEO being branded as spam and scum of the web.
If someone could push the idea that Google cannot even control their own index, or rate their own site, then perhaps they could shift the frame: Google does not know how to control their index and needs the help of good SEOs to improve their search relevancy.
freely using webmaster content, while knowingly publishing false or deceptive "helpful" information to those same webmasters
consumer confidence in search relevancy
Most consumers do not realize how search results are manipulated, and most don't even know where the ads are.
It would be cool to see an SEO more daring or less lazy than me use this opportunity to toot their own horn and talk about how they help Google solve a problem it hasn't fully figured out yet - relevancy.
It would certainly be cheap marketing if you get national media coverage with the current feeding frenzy for Google's stock.
For those who spin all the ethics stuff, do you think Google knew of the problem and was lying when they said it was no big deal? If so, is it ethical for them to tell blatant lies? If not, how is it that SEOs know more about their search engine than they do, while they generally discount the whole concept of SEO?
Yahoo! Q Challenge: what's up with a $5,000 prize? That surely is not much payout for the value they could create with that contest. I might need to create a similar marketing program for myself. hehehe
There is a website that qualifies you and prints out your ordained minister certification in under a minute. A person today tried to justify me giving away my business model to them because they spent the minute to print one out.
One reason I'm in hot water is because my colleagues and I at "Now" didn't play by the conventional rules of Beltway journalism. Those rules divide the world into Democrats and Republicans, liberals and conservatives, and allow journalists to pretend they have done their job if, instead of reporting the truth behind the news, they merely give each side an opportunity to spin the news.
These "rules of the game" permit Washington officials to set the agenda for journalism, leaving the press all too often simply to recount what officials say instead of subjecting their words and deeds to critical scrutiny. Instead of acting as filters for readers and viewers, sifting the truth from the propaganda, reporters and anchors attentively transcribe both sides of the spin invariably failing to provide context, background or any sense of which claims hold up and which are misleading. ... Objectivity is not satisfied by two opposing people offering competing opinions, leaving the viewer to split the difference.
Later he comments on the Journal:
But I confess to some puzzlement that the Wall Street Journal, which in the past editorialized to cut PBS off the public tap, is now being subsidized by American taxpayers although its parent company, Dow Jones, had revenues in just the first quarter of this year of $400 million.
Any way you slice it, there are going to be a few gatekeepers to this thing we call the web, and to most media outlets in general. The more there are the better it is for consumers, and for that reason I might start trying a bit harder to use Google and Yahoo! a bit less.
It will be interesting to see how it plays out, but anyone who knows about SEO should see it as a personal responsibility to make sure people find what issues you feel are important.
It is fairly easy to understand many of the concepts of TrustRank (like attenuating a positive trust score or offsetting the effects of link spam with a negative trust score), but it is even easier to understand them if you visualize the concept of trust attenuation.
Most sites are not exceptionally compelling, so there are usually not many legitimate hubs in any industry, but many sites are glorified link farms which will not pass any positive trust value.
For a while I helped promote many directories, but many of the new ones on the market have little to no legitimate value, and some of the links from them may even have negative value.
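To make the attenuation idea concrete, here is a minimal sketch. The model, the decay factor, and the tiny example graph are my own illustrative assumptions, not the actual TrustRank algorithm from the paper: trust starts at hand-picked seed sites and shrinks with every link hop, so pages far from any trusted seed earn very little.

```python
# Toy trust attenuation: trust flows out from seed sites and decays
# by a damping factor at each hop, split among a page's outlinks.

def propagate_trust(links, seeds, decay=0.85, rounds=20):
    """links: dict mapping page -> list of pages it links to."""
    trust = {page: (1.0 if page in seeds else 0.0) for page in links}
    for _ in range(rounds):
        new = {page: (1.0 if page in seeds else 0.0) for page in links}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # attenuate, then split the passed trust across outlinks
            share = decay * trust[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        trust = new
    return trust

# Hypothetical graph: one hand-picked seed hub linking out
web = {
    "seed-hub": ["good-site", "spam-farm"],
    "good-site": ["niche-page"],
    "spam-farm": [],
    "niche-page": [],
}
scores = propagate_trust(web, seeds={"seed-hub"})
# niche-page, two hops from the seed, earns less than good-site (one hop)
```

Note the catch the sketch makes visible: a spam farm one hop from a careless seed inherits just as much trust as a good site, which is why who the seeds link to matters so much.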
I just wrote an article called TrustRank & the Company You Keep, in which I made this graphic explaining the concept of AntiTrust (yet another SEO phrase I made up hehehe).
The red X's represent things that should be there, but are not.
Yes, I know, the drop shadow is too dark, my web designer friend already yelled at me for that. Other than that, I hope the image clearly demonstrates the concept I was trying to get across.
Other than drop shadow remarks, please leave comments on the article and image below.
Not entirely SEO related, but the stock market is another information system which is often manipulated.
Not that I have much money, but recently I read a book called Trim Tabs Investing by Charles Biderman. On a macroeconomic level it looks at the stock market in terms of the volume of shares, their overall price, and the money chasing those shares. Rather than stating that forward earnings drive the stock market, it argues that short term stock prices are best described by supply and demand.
It breaks down the money chasing the shares into the following groups:
insider and corporate trading (smart money)
general investor trading (dumb money)
foreign investor trading (dumb money)
margin debt (dumb money)
In the short term the money from typical investors can power the direction of the stock market, but the stock market inevitably goes in the direction of the insider and corporate trading. Turning points in the stock market (tops and bottoms) are often associated with rapid changes in margin debt.
People are emotionally attached to their investments, and tend to believe the future actions of the stock market will follow the recent past. People take out loans to be fully exposed to the market near tops. People also lose hope and cash out at a loss near bottoms. Foreign investment is also another lagging contrary indicator.
Insiders have access to better data, and their actions are thus inclined to be more representative of actual market conditions. Their ability to control the float (the number of shares on the market) gives them an unfair advantage. Also, sometimes they will withhold guidance while the stock is doing badly so they can actively buy back their own shares at prices below their actual value.
I thought it was a pretty cool book. For small investors he still recommends just dollar cost averaging or buy and hold, but for those who are rich (some of the early SEO gods are probably sitting on mounds and mounds of cash right now, as early Google workers likely are too - hi Matt) and seek larger gains, liquidity theory may help them do well in both good and bad market periods.
Not that it is huge news for the average SEO, but when SEO Inc was removed from Google the story got so much negative coverage, and the SEO Inc PR department so botched the issue, that it was just a really bad thing for them.
Another great example of how your reaction to something is more important than what actually happens.
Right now I am not getting SEO Inc to show up for their site name and the like, but their home page was cached in Google 2 days ago and is showing up under Search Engine Marketing Firm.
added links to Google cache and Google cache text of each page
If you have not heard of Link Harvester yet, here is some background on it. Are there any other cool features you can think of? I might have a friend create another SEO tool tomorrow too if he has time. I will probably be adding features to Hub Finder soon too if I have enough money and my friends have enough time.
BTW, someone pointed out Search Lores in a comment at ThreadWatch recently. The site is so amazing I can't believe I hadn't come across it before.
charging a flat rate would lead to oversaturation in competitive areas and minimal coverage in less competitive areas
the lower overall income generated through such a system would leave less money for marketing it
It is hard to bribe people to rate relevancy. The best bet on that front is to try to establish a system and idea good enough to build a user base which markets itself, and then figure out how to attach a business model later.
SEW also recently had another forum thread about acquiring cheap links. Pyramid Link Building Scheme:
Someone recently spammed SEW forums asking about www.16links.com, which is a link building pyramid scheme that charges people to join it too.
What a hunk of crap site / idea!!!
One Time Fee Links:
Another person dropped in the 16links.com thread to recommend textlinkpopularity.com for building one way links for a one time fee. (incidentally, this person's only other comments are in a thread they started recommending textlinkpopularity.com).
Why one time fee links suck:
Low Quality: High quality sites selling useful ad space usually do not sell that ad space for a one time fee, even most directories suck.
Low Quality: If sites are hard up for cash then those sites likely are not going to be long lasting ones.
Low Quality: If a site is selling underpriced ad space for a one time fee, then eventually that ad space becomes hyper saturated to where the value diminishes.
Low Quality: If sites are made just to sell links for a fixed rate then they may not have enough money to put back into promotion. The site the ad is on may not grow with the web. If a site rarely picks up new links then it would be easy and likely that a search engine may discount the value of links from that site, especially if it is a site that is not well integrated into the web.
Low Quality: I started on the web by creating a site that was a bit critical of the military. It's a really bad site and I should take it down, but I leave it up to still speak my mind and show how quickly people can learn. One of the more reputable link brokers spammed that site asking if they could buy links on it. That shows there probably is not much quality in that business model if they are willing to risk their reputation for a few dollars.
Easy to Replicate:
Another common problem with most linking schemes is that they are easy to replicate. This means that if a quick low cost link scheme is effective, easy to trace, and has no quality standards then people will be able to quickly replicate it, thus any competitive advantage gained would be quickly minimized.
I have lots of directory links and one time fee links, but most of them were not built through any broker, and at the end of the day most of them do not drive much traffic, and I am moving away from doing it as much for some of my sites.
Most one time fee link programs charge about $20 - $30 per link, so a dozen crap links would cost you around $300. These links would most likely:
drive no traffic
be on pages full of other junk links
not be on authoritative, highly related, or well integrated sites
The links that drive the most traffic to my site are the ones where my site or I am featured or cited. Examples:
Writing an article might take a couple hours, but if you get it syndicated through the right channels it can build dozens of quality links. These links:
drive targeted prospects
are on pages with few links
are on pages about your topic
some of them may be on related, authoritative, well integrated websites
Most articles I write and syndicate quickly bring in at least one or two consulting clients, so there is some value there, plus for about 3 hours of work (writing and submitting the article) I can get links that are worth well over $300, since the articles would have more longterm value than the crap one time fee links.
Like a twit I recently broke the Link Harvester tool. Currently I have an old version up, but my friend who made it is going to add a few new features to it and have it back up this weekend. :)
The Link Harvester tool has so far cost me about 2 hours of my time and around $500 to make. It got links from sites like SitePoint, ThreadWatch, & Yahoo! from within the content part of the pages. In most good algorithms 3 of those links, from sites which:
are well established
are not going away anytime soon
are going to be worth far more than a dozen or two dozen permanent junk links. A few other beautiful things about getting links from authoritative sites:
Using tons of cheap one time fee links may raise your risk profile. Odds are that Yahoo! is not going to use their link pointing at my site as a reason to ban my site.
Getting links on authoritative sites is not as easy to replicate as getting links from a program which serves up links all you can eat at $25 each.
Scalability of a business model is important. If a project or idea does not gain steam then the value of the ad is limited at best. I like investing early into some ideas just in case they pan out, but the people selling links using cheap instead of value as the selling point may not be giving you much value. Sometimes the value of links is destroyed by the business model of the site the link is on.
When you look at links on a sheer numbers level you end up missing the value of putting in a little effort, or spending the money in indirect ways, to get more longterm value out of your link ad spend. [/end rant hehehe]
SimCity was always one of my favorite games. kpaul recently noticed a new site by the name of Chicago Crime, which overlays crimes with their locations using Google Maps. Pretty scary to see that Chicago had more than a murder a day last month.
What kind of ad marketplace would Google have if they:
integrated Google maps and public data into a social network
which linked to - or allowed people to upload - business feedback (think Local Froogle)
should I buy from here?
what other businesses are cheaper or provide better service?
should I consider working here?
who else is hiring in this field or near here?
and destination reviews
is this place worth visiting?
when is best?
who has the best travel deals?
They also could show the history and trust rating of reviewers, as well as letting you determine how many social connections away you were willing to accept reviews from. Maybe they could match up personalities or demographic profiles if people gave them that data, or let you create your own combined metric.
Add a strong recommendation engine to that (like how Amazon.com says "of the people who viewed this product ultimately 37% ended up buying XYZ") and Google will serve ads that know what you want even when you don't.
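The "people who viewed this bought that" statistic boils down to simple co-occurrence counting across sessions. A hypothetical sketch (the session data, item names, and function are made up for illustration, not how Amazon or Google actually does it):

```python
# For a given item, find what share of its viewers ultimately bought
# each product, by counting view/purchase co-occurrence per session.

from collections import defaultdict

def viewed_also_bought(sessions, item):
    """sessions: list of (viewed_items, bought_items) set pairs."""
    viewers = 0
    bought = defaultdict(int)
    for viewed, purchases in sessions:
        if item in viewed:
            viewers += 1
            for p in purchases:
                bought[p] += 1
    # share of this item's viewers who bought each product
    return {p: n / viewers for p, n in bought.items()} if viewers else {}

# Made-up browsing sessions
sessions = [
    ({"camera"}, {"camera-bag"}),
    ({"camera", "tripod"}, {"camera"}),
    ({"camera"}, set()),
    ({"tripod"}, {"tripod"}),
]
rates = viewed_also_bought(sessions, "camera")
# 3 sessions viewed "camera"; 1 of them bought camera-bag, so ~33%
```

With enough sessions those rates become exactly the "37% ended up buying XYZ" style numbers an ad system could serve against.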
Google has data worth lots and lots of money. It will be interesting to see how they aggregate content and collect feedback to leverage their market position.
Any merchant heavily exposed to the web which is not building communities or other hard to replicate assets may end up in the hurt locker in the next couple years.
Google's ad serving technology is still somewhat primitive. As time passes and more major networks leverage their market positions, more and more merchants will get marginalized by the powers that be.
One problem current search related ad systems have is that after one advertiser exhausts their budget the competing sites may get ads below their fair market value.
If a college student wanted to get a job at Google you could bet that writing a research paper about making AdWords more profitable would be a good idea :)
In related news... AdWords Smart Keyword Evaluation Tool:
Sometimes without human review it disables some exceptionally well targeted terms even before you get a chance to display your ads. That is not so smart, as it frustrates advertisers and prevents them from selling part of their inventory.
You can't know how well an ad will perform based on past advertising experience since so much of Google's ad space is full of "Buy dead animal at eBay" type ads.
Why Disabling Some Generic Terms Makes More Money:
I advertise one product line on Overture where part of the name is an acronym. I can use that acronym to make a decent number of sales on Overture for a good sum of money. If I want to advertise that term on Google AdWords, even with like 20 negative keywords (filtering out unrelated traffic), the term consistently gets shut off, despite getting a clickthrough rate near their minimum and converting exceptionally well.
Then again, maybe Google does not want me to get those conversions for a nickel. In deciding how broadly they allow you to advertise, search engines are also trying to control the way searchers search. If a person searches for a short acronym Google would prefer that person give them more data, so they can gain a better understanding of what the person wants and deliver more targeted, and hopefully more expensive, advertising.
For my more targeted terms I pay over 10 times as much per click, which really sucks since the acronym has a higher conversion rate than the rest of the campaign does.
The Motley Fool wrote an article about the death of affiliate marketing, talking about how AdSense text ads were better at selling than typical banner ads. Of course the author is right that banners are generally useless compared to what can be done in affiliate marketing, because they scream "I am an ad. Please ignore me."
Yesterday I had one fairly well targeted visual AdSense group display a couple thousand visual ads, and it had a zero percent clickthrough rate. People do not want to click on banners.
While some of the affiliate marketing companies may have stocks that will continue to falter, that in no way means that affiliate marketing as a whole is dead.
Many smart affiliates create testimonials, or factual looking review based content with affiliate links embedded in it.
The two highly successful affiliate techniques I know of are:
Creating useless spam sites chock-full of affiliate links or AdSense. Make the sites so ugly that people have to quickly click on something. On these sites AdSense might work better, and since Google does not enforce any legitimate publisher quality standards you can create tons of these sites.
Create smaller sites that review most every product in an industry. If a page only makes a hundred or a few hundred dollars a month, and you have 10 to 50 pages of useful, related, unique content per site, then it does not take long to build a few revenue streams that can make you well over $100,000 per year.
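The arithmetic behind that second model is easy to sanity check. The site count, page count, and per-page revenue below are hypothetical numbers I am plugging in to test the order of magnitude, not figures from any real campaign:

```python
# Rough sanity check of the review-site model, with assumed numbers.
sites = 5               # a handful of review sites
pages_per_site = 20     # within the 10 to 50 page range
monthly_per_page = 100  # low end of "a hundred or a few hundred" dollars

annual_revenue = sites * pages_per_site * monthly_per_page * 12
# 5 * 20 * 100 * 12 = 120,000 dollars per year
```

Even at the low end of those assumptions the model clears the $100,000 per year figure.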
Just yesterday I got a random check in the mail for unknown reasons, which tells me that bad affiliate marketing probably still has a while left, let alone good affiliate marketing, which will only get better as time passes.
So the people suing the major search engines for click fraud issues created a website.
With the money that is going to be needed in that sort of a case you would have thought they could have made an attractive professional looking site, but you would be wrong. They even have (not so) flashy "click here" banners.
"What we'd like is for http://www.LostClicks.com to become an electronic meeting place for advertisers and individuals who are concerned about pay-per- click (PPC) fraud," says attorney Joel Fineberg of Dallas, who represents online advertisers in the class action lawsuit. "It's very important that all of us share information because we're dealing with a new technology and a new challenge. The more people who visit the site, the more knowledge we can all gain."
Sending what visitors I can. They are surely in for an expensive battle. I wonder why they don't have a blog, forum, or anything else that would encourage community activity? They probably could have put a bit more effort in on that front.
Google Inc. on Wednesday launched a corporate version of its desktop search application. The Google Desktop Search for Enterprise allows employees at companies to search for information on their computers. The free, downloadable application is based on its desktop search tools introduced last year. Google said it collaborated with IBM on the program, which is able to search IBM Lotus Notes messages, among other features.
The purist will hate the ads, but if RSS is going to transition from early adopter to mainstream it will need to pay for itself. The two options are that the RSS post is a summary that brings visitor to your site to see ads or you place ads in your feed. Google wants ads in your feed.
Syndicate the full text of your articles. The more content that is available in a site's feed, the better the user experience, and the more likely people are to subscribe to your feed. If you can't put the full text of your articles in your feed, then in addition to the headline of each article, include as informative a snippet as possible of the article's text.
Typically most people do not view a feed until after they have subscribed to it, so how does showing the full content of your post in your feed make people more likely to subscribe?
I think I link to every article he writes. His latest, Hiring is Obsolete, says that if you are the young & motivated type you can let the market determine your value by starting a startup instead of going to work for a mega corp at lower than market value wages.
While those articles are not directly associated with SEO, I know many SEOs who:
smoke & drink
have lots of caffeine
are under heavy stress
constantly multi task
I don't think depression is just a physical or a psychological issue; the two are deeply intertwined. The articles focus more on the physical reinforcing aspects of severe depression.
Before doing SEO I was in the Navy and then later a mid level manager for another company. At my prior jobs it was not uncommon to drive & work 80 (mid level manager guy) to 120 (Navy) hours a week. I also did much of my initial learning SEO / marketing / web while in that mid level management position (and got so many speeding tickets during that time period too).
When you are first getting started in SEO you may have to work long hours, and sometimes it can be hard to escape work when there is so much to learn and it rests just beyond the edge of your bed. This is especially true when the alternative is to go work for a company that wants to chew you up for all you are worth, and then fire you or go under before you get any benefits out of your retirement.
A few other things that make it easy to stretch yourself too far doing SEO are:
that many times you do not have to leave the house or interact with society in general to get by
pricing SEO services can be somewhat hard, especially when you are new and do not appreciate the value of your services. About a year ago I had something like $20,000 of credit card debt, which has since been joyfully erased.
I don't necessarily agree with everything those articles said (particularly the endorsement of the prescription drugs), but I did find the articles interesting.
Hopefully this somewhat off topic post helps more people than it makes mad.
LookSmart is taking the bold step of private-labeling Furl.net for publishers within its strategy of licensing tools, content and technology so they can own search advertiser relationships, develop a larger search audience and retain their audience more effectively with sticky tools like Furl.
I think LookSmart needs to get in check with reality a bit. I mean, why should I trust their network when they use AdSense instead of LookSmart on some of their own sites. To me that just goes to show that the value of their ad network has eroded to next to nothing.
Bo Peabody is the guy who created Tripod, which only accidentally had its ever-popular site builder added because his workers created something other than what he wanted. Lucky or Smart is a super quick book (58 pages) which explains some of the tips that helped Bo and Tripod along.
A few tips from his book:
"Lucky things happen to entrepreneurs who start fundamentally innovative, morally compelling, and philosophically positive companies."
entrepreneurs are usually satisfied with well enough, whereas managers try to make things too perfect and thus move too slowly in starting new companies
startups generally attract many sociopaths
"no" is the most common word you will hear, but it is often an open door
being gracious is key
always spin your company / story / self to sell #1
press is sensationalistic. never believe in it.
news is past tense. doing is more important than reading the news.
know that you do not always know the answers and when to say you don't know
Consider this. Until earlier this month, WebPosition was owned by WebTrends, in turn owned by NetIQ, a publicly listed company in the US. Now it's owned by Francisco Partners, another publicly listed company. The purchase was announced March 28 and concluded May 3.
Now you're an investor wondering about this sale. You decide to research some of the products. You turn to your trusted research tool, Google. You do a search for one of the products you've heard about, WebPosition. And you can't find the official site about it?
That's relevancy? That's serving the user? That's organizing the web's information? And that's defending Google because it somehow stopped all the other resellers showing up in its editorial results as well as the ads Google itself accepted?
Not how to get more out of your ad spend, but how to spend more money on your PPC ads. Never did I think I would read an article about how to spend more. Why not though, eh?
The 1st major question they listed in evaluating your spend: Any Search Engines Missing?
Kinda funny that the article was sponsored by FindWhat and there is a huge FindWhat ad next to it ;)
From the few chats I have had with him Kevin is a super bright guy, and probably one of the top half dozen PPC experts in the world, but do you think ClickZ is being a bit transparent with the advertising business model there?
Heather Lloyd Martin is a well known SEO copywriter. I have been meaning to read her Successful Search Engine Copywriting ebook for a while and finally did. On to the review... Things I liked about Successful Search Engine Copywriting:
Focuses on writing for humans instead of writing exclusively for bots.
Focuses on importance of keyword phrases over words, citing resources which show that most search queries are longer and more specific, and going through examples showing why those types of queries convert better.
Quotes Greg Boser a good number of times on competitive analysis. As always his answers are insightful & succinct.
States importance of building credibility with content.
Covers the page title tag and meta description in depth.
Offers good tips on helping marketing, IT, & legal departments play well together. Answers many common what if conflict problems you can have working with a company.
Gives many tips on hiring and working with an SEO copywriter.
Covers XML data feeds in depth, including who they are best for and when to use them.
Quick and easy to read. Uses many analogies which parallel many off the web concepts.
The interviews at the back of the ebook add a good amount of value and cover many other search related topics.
Things I Thought Could be Improved:
Talks about how to get Google to craft a good description display, but does not mention that sometimes they match up with your page description if the exact search query exists in the meta description tag.
States tricking engines is unethical & expensive. In some cases this is true, but in others it is fast and cheap. It is all about determining your risk profile and goals. There is no universal right or wrong way to do SEO.
Does not discuss term weight, latent semantic indexing, or how search systems normalize page copy length. All of which are interesting issues related to SEO copywriting. Perhaps the lack of mentioning these was due to trying to keep the guide fairly non technical and easy to read.
At one point the guide said "Optimizing for one keyphrase is considered spam, and search engines don't like it." While I have not been involved with SEO as long as she has, I disagree with that. Being focused is important, but if you write naturally many modifiers and semantically related terms will end up in the copy. I think it is impossible to write naturally and not cover many related keyword phrases.
There are a couple of contradictions. In her example meta tag she stated that it weighs in at 191 characters and that the meta description tag should be around 200 characters, but not exceed 300. Later she offered information from Jeremy Sanches, which states that meta description tags should not exceed 170 characters.
I am very anti paid inclusion (XML feeds). That does not make either of us right or wrong on the topic, but since it went cost per click I generally consider paid inclusion a last resort.
Overall I thought the ebook was pretty good for those looking to learn about SEO copywriting. If you have to work with large companies and learn a few tips about how to get various departments to work together then that info covers the cost of the book. I also thought some of the interviews added good information as well.
Sometimes when doing link analysis you come across pages that would be appealing to get links from, but may not fit the profile of a page or site that the owner of the page in question would likely link to.
Of all the pages on the web, most of them are not overtly amazingly thoughtful or original. With that being said, it costs next to nothing to write an article or hire an article writer to write about a topic which could likely gain links from various trusted or authoritative resources.
For most people it is easier to create something worth promoting than trying to promote something not worth promoting. Along those lines of thinking, it is easier to create something people care about if you use their interests as the source of the content or idea.
Whether or not you care about search engine spam, it is easy to let the author of a page on the topic think you care by writing a piece that appears to care, even if your only goal is the link.
Of course, you don't want to destroy your brand value in the process, but there should be ways to use tact and get a link without writing something that is untrue.
If the thought or reasoning behind the article does not totally agree with you, then it might be a good occasion to hire a guest writer.
InfoSearch recently introduced a content licensing model that allows its clients to license the content, generally for a one year period, with renewal rights at the conclusion of the license term. Further, InfoSearch is gradually transitioning the current traffic model through its www.articleinsider.com network from a fixed CPC (cost per click) rate to a bidded CPC rate. After these new initiatives are integrated into the existing business model, the Company expects that they will provide continued revenue growth over the longer term.
I think it will be fun to watch to see what they can make of it.
They have over $4 million in the bank and are cashflow positive, but:
making their ArticleInsider network an open auction goes against their primary selling point of a low fixed cost. I can't imagine current customers will be pleased with the transition.
After they do make the transition they become a second (or third) tier PPC service. Looking at how some of those search stocks are doing in the market with lowering bid prices and marketshare makes you wonder how this helps them.
after you reach a point in market saturation there are some topics which are not as profitable to create content for. what then?
I think creating services or multiple compelling channels that keep consumers wanting to come back is a far better longterm model than profiting from static content.
the direct channels get direct traffic and search traffic
the direct channels are more likely to build natural linkage data
the direct channels, which frequently update, give people an excuse to come back and view more content & ads
Huge news for the beaten down FWHT stock, which was recently down to 4.07 from its 52-week high of 23.94, and gained about 10% on the day.
A judge declared a mistrial in a patent infringement lawsuit between Yahoo Inc. and FindWhat.com Inc. after a jury failed to reach a decision on all of the issues in the case, FindWhat.com said on Thursday.
In a note to clients on Wednesday, RBC Capital Markets analyst Jordan Rohan said the most likely outcome of the case would be a modest out-of-court settlement. He estimated that FindWhat could settle the case for around $7 million to $8 million.
Rohan said some investors had worried that a ruling against FindWhat in the case could wipe out the majority of the company's $50 million cash balance.
Most of the second tier search stocks are fading into irrelevance. Maybe this will help FWHT hang on a little longer. Also noted earlier today:
FindWhat.com noted the judge has yet to rule on the issue of whether the patent is unenforceable because of inequitable conduct committed by Overture. A hearing on the inequitable conduct issue and other motions that could impact the ultimate outcome of the case is currently scheduled for June 24, 2005.
Visitors to Yahoo's Music Unlimited will pay $6.99 a month for access to Yahoo's 1-million-song library. That's less than half what Napster and Real Networks' Rhapsody charge for similar services that permit the transfer of songs to portable music players. source
Keyword Locator is new keyword research & monitoring software which sells for $87. When I tried to download it there were download errors, but Frenchie Sano was quick to reply and help me with the download. On to the review... Features:
Like many of the other current keyword research tools on the market, it pulls keyword suggestions, search volumes, and bid prices from Overture.
Set which Google URL and Overture market you want to review ads from, and reports the number of competing ads.
Select Yahoo!, Google, Overture, or digging to grab your various keywords.
Easy data import and export.
Shows the number of competing ads in each engine.
Allows you to filter keywords by a term or select a group of them. After you select a set of results you can scroll through the URLs, ad titles, common words in ad copy, and bid prices by search engine. (please note Google AdWords does not give out ad price and search volume data).
By collecting and sorting the combined keyword data you can see what terms & emotional triggers people are using most frequently in your marketplace.
Has a character stripper and ad formatter. The character stripper could be improved by also letting you remove whole phrases or character sequences.
Tool also includes FindWhat & Enhance Interactive.
Can access data via proxy.
The format tool makes it easy for you to format keywords as exact match terms, phrases, or broad match terms. It would be nice if they added an all feature to that for those who may want to bid on all levels of relevancy matching.
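The formatting step described above, including the suggested "all" option, is simple enough to sketch. The function names below are mine, not the tool's, and this is only an illustration of the match-type wrapping convention, not Keyword Locator's actual code:

```python
def format_keywords(keywords, match_type):
    """Wrap keywords in PPC match-type syntax:
    broad (bare), "phrase" match, or [exact] match."""
    wrappers = {"broad": "{0}", "phrase": '"{0}"', "exact": "[{0}]"}
    return [wrappers[match_type].format(kw) for kw in keywords]

def format_all_match_types(keywords):
    """The suggested 'all' feature: emit every match type for each keyword,
    for those who want to bid on all levels of relevancy matching."""
    return [formatted
            for match_type in ("broad", "phrase", "exact")
            for formatted in format_keywords(keywords, match_type)]
```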
Things that would Make Keyword Locator Better:
Many people access keyword data from the same sources. This means:
this data is going to be prone to being spammed or thrown off course by automated bots and marketers.
These portions of the market are going to be much more competitive (and thus less profitable) than words from sources of unique or limited data.
Outside of good sex and choice narcotics almost nothing feels better than being the only bidder for a term which converts at 30% and only costs a nickel a click.
I like free access to data and information (I only sell my ebook because it is my main functional business model and I have not been creative enough to think of another yet), but sometimes paying for data creates a barrier which adds value to the usefulness of the data. It would be cool to see a tool like this interface with data from WordTracker, Keyword Intelligence, & Keyword Discovery.
I think Keyword Locator could also be improved by adding:
Overall Keyword Locator is pretty good software, but a few of the ideas listed above could make it a bit better.
If you spend significantly on PPC advertising it can likely help save you time and money, but some things can't be automated. Tools which show you what your competitors are already doing may not show you how to beat them (as you can't build creative thinking into software).
Keyword Locator can help you as one tool to use with PPC campaigns, but you may also want to use other tools and research databases & techniques as well.
Does not sugar coat things or make them seem too complex. Gives the exact way he figures out what to bid.
His guide walks users through setting up their first campaign. He also reminds them that some people may take up to 20 tries to find a profitable product & helps them determine if or when they should pause or delete a word or campaign.
Things I thought could be improved with Google AdWords 123:
The book uses affiliate links. I think these are part of the reason why it is cheaper than many similar competing ebooks, but sometimes authenticity of recommendation is questioned when affiliate links are used. A while ago some people complained to me when I used some of them (and so I quickly removed the ones I used).
Does not recommend creating separate campaigns with lower bids for content ads. Content ads will typically have less implied demand and value than search ads.
Does not talk about dynamic keyword insertion, which is huge for helping ads appear relevant and encouraging high clickthrough rates.
Points out that software automation is important for effectively using time, but does not point out keyword combination tools such as GoogEdit, ThePermutator, or this one.
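The core of a keyword combination tool is tiny, which is why so many of them exist. This is a minimal sketch of the technique, not the actual GoogEdit or ThePermutator code:

```python
import itertools

def permute_keywords(*word_lists):
    """Combine word lists into every ordered keyword phrase,
    the way keyword combination tools do."""
    return [" ".join(words) for words in itertools.product(*word_lists)]

phrases = permute_keywords(["cheap", "discount"], ["seo"], ["book", "ebook"])
# -> ["cheap seo book", "cheap seo ebook",
#     "discount seo book", "discount seo ebook"]
```

From a modifier list, a core term list, and a product list you can generate every phrase combination for a campaign in one call.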
Overall I thought it was a pretty good ebook, and at under $50 it was well worth it. Visit the Google AdWords 123 website to learn more.
Biased Search Ads:
Ads are going to be inherently biased, as paying for them means that the person buying them aims to use that money to manipulate others to perform a desired task.
From time to time someone will go too far and search engines will say the ad is out of bounds. The process will repeat.
Are Search Results Biased? Working with a Limited Information Pool:
Let's presume that the search engines aimed to be completely unbiased. Search engines can only display information they know about. They cannot serve up information that does not exist.
Information creation is either a labor of love, or must pay for itself.
Ideally it does not happen, but if a site creates profit the business model is going to bias the content.
If information is a labor of love then it is probably going to be highly opinionated - showing the world from a biased perspective.
If you pour yourself into something at a financial loss hopefully you are gaining in other areas, or else why would you create it?
While the best answers are usually somewhere in the middle, it is much more exciting to propose something that is cutting edge or deeply rooted in some ideology.
Linking to Information:
People are more inclined to link to overtly biased information. Whether they like the person:
Some might think ABC is a bit out there, but this is just a brilliant idea (link)...
or hate them:
XYZ is a real tool. This moron said "blah blah blah" (link)...
Political and religious related topics are going to come out with a higher ratio of biased to unbiased information. Stories where religion and politics overlap will build heavy linkage data.
In being somewhat biased people get more feedback (potentially more content), more readers (they can make more money from ads and thus further the content creation, brand, and distribution), and more links (furthering their authority score). Using the results of this type of social network, how could search engines be anything but biased?
Do the people reporting about money not know how to make any?
Part of the bad quarters may be due to
the sluggish stock market
rapid consolidation of wealth
trade and federal deficits
lack of trust in the market
and energy shortages.
They are also losing out due to the web being a faster moving and cheaper distributed advertising network. Another thing that really hurts them - and all unbiased trusted sources - is that I can read exactly what I want to from whatever channels I like. News biased the way I like it.
While news search algorithms can use systems like TrustRank to reduce bias in their news results, you can't fully remove bias from search results.
Most people are not cited or remembered as socially significant for being unbiased and centered. The channels (websites) which do not have to ask for citations (links) will usually beat out those that do.
"It was not a hacking or a security issue," said Krane. He said the problem was related to the DNS, or Domain Name System, though Krane did not elaborate. The DNS translates domain names for computers.
"Google's global properties were unavailable for a short period of time," Krane said. "We've remedied the problem and access to Google has been restored worldwide."
A while ago someone shot me an email about Constant Content and I forgot to post about it. I just remembered it again and thought to post on it.
From their site:
Constant Content is exactly what the name implies: A website where you will be able to find text to complete your website or project. This is a place to locate high-quality content at affordable prices. We will assist you in delivering the whole package, ensuring that the clients you service will be receiving a polished piece of perfection.
I have not bought or tested the content quality, but with the wide range of authors there is likely to be some real gems and some real duds in the mix.
Constant Content is a database which keeps 50% of the funds received when people purchase the content created by authors who submit their articles to the site.
Some of the articles are free, while others are available for sale to use exclusively or to buy an individual license for. Constant Content also runs AdSense on some of their article abstracts to help create another revenue channel.
They already have over 600 writers. It seems like a fairly scalable business model in a rather untapped market.
Economics of Link Buying VS Submitting Articles:
If you buy them, even crap links can usually cost $5 to $50 each. You could likely buy one of these exclusive articles and submit it to a few sites to build a dozen or so links for the cost of one link.
Economics of Buying Ads VS Buying Content & Selling Ads:
To further appreciate the economics of this idea, a single click from Google AdWords on legal, health, insurance, and other high margin subjects can cost $2-$50, while you can buy the rights to an article for about $5 - $10 (usage) and $50 for exclusive rights.
As this and other related business models develop it sure can put another spin on the AdSense business model. With some of these articles you could buy them, place AdSense ads on them to get a 30% CTR, and after a few dozen visitors you would pay for the usage cost and be into the pure profit zone.
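The break-even math there is simple to sketch. The 30% CTR comes from the paragraph above; the $1.00 average earnings per click below is my assumption, not a figure from the post:

```python
def breakeven_visitors(article_cost, ctr, earnings_per_click):
    """Visitors needed before AdSense earnings cover an article's cost."""
    revenue_per_visitor = ctr * earnings_per_click
    return article_cost / revenue_per_visitor

# A $10 usage license, a 30% CTR, and an assumed $1.00 earned per click:
# 10 / (0.30 * 1.00) = 33.3..., so roughly 34 visitors to break even.
```

At those assumed rates "a few dozen visitors" is exactly right, and everything after that is margin.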
Similar Competing Business Models:
I believe the people at Traffic Logic / Article Insider also sell content. I doubt they could compete on price with how cheap some of the articles at Constant Content are. I have also found much of the Article Insider content to be a bit less than impressive.
Some auction sites such as Elance allow you to bid on similar projects, but it's hard to be certain of quality. The nice thing about Constant Content is you can request articles and bid without obligation to buy, even if a half dozen people make articles for you.
I don't think I know who is behind Constant Content. The post was fairly positive because it sounds like a cool idea. Whether or not it pans out, the business model seems smart to me.
Keyword Intelligence data is based on Hitwise's sample of over 25 million home, work and educational Internet users worldwide and how these people use specific search terms across all search engines to find products and services online.
HitWise has partnerships with various ISPs and search services to track search and clickthrough data. Some of their products are a bit pricey for small webmasters (I believe starting at around $25,000 a year). The Keyword Intelligence offering looks like an attempt to break into the mid to lower market.
Keyword Intelligence has two different subscription plans starting at $90 and $190 a month. It allows you to subscribe to geographic markets and categories and do keyword research from there.
Thanks to Warren Duff for pointing me at Keyword Intelligence.
Tool from Last Month:
None of the major text link analysis tools for sale allow you to check co-citation, or pages which link to multiple related resources.
Last month I had a friend create Hub Finder, which is a free on topic link analysis tool which looks for co-citation. I have not got much feedback on the tool yet, but a few people have said they found it to be useful.
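The core of a co-citation check like Hub Finder's fits in a few lines. This is a simplification I am sketching myself, not Hub Finder's actual code, and it assumes you have already fetched the backlink list for each resource:

```python
from collections import Counter

def co_citing_pages(backlink_sets, min_overlap=2):
    """Find pages that link to at least min_overlap of the given resources.

    backlink_sets maps each resource URL to the set of pages linking to it.
    Pages linking to several related resources are likely topical hubs."""
    counts = Counter()
    for linking_pages in backlink_sets.values():
        counts.update(linking_pages)
    return {page for page, n in counts.items() if n >= min_overlap}
```

Raising `min_overlap` narrows the output toward the strongest hub pages in the topic.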
New SEO tool for this month:
Another common problem with most link analysis tools is that they do not make it quick, easy, and convenient to search past the 1,000 backlink barrier set by most search engines. What is the point of being slow to give you more details than you need, only to survey a small portion of the inbound links?
A friend of mine is a decent programmer, and I had him whip up a tool I call Link Harvester, which has a ton of cool features:
uses the Yahoo! API, so it is in compliance with their TOS.
makes saving and exporting data in CSV as simple as a click of the mouse
does not require any software downloading
quickly grabs the number of .gov, .edu, & .ac.uk inbound links while also listing each individual link.
quickly grabs the number of unique linking domains while listing them
quickly grabs the number of unique linking C block IP addresses while listing the C block next to each domain
allows you to check links pointing at a page or at a domain
displays the total number of links showed by Yahoo!
displays the total number of pages indexed by Yahoo!
includes links next to each domain pointing at its WhoIs information and Wayback Machine history.
if a site links to your site more than 5 times then it is bolded in the results and a checkbox is autochecked, which allows you to filter out that site and spider deeper through the link database. This harvesting action is how you can spider deeper than 1,000 backlinks, and it is where the tool got its name.
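The domain and C block counting in that feature list can be sketched like this. This is a simplified illustration, not Link Harvester's actual code, and the IP lookup (`domain_ips`) is assumed to come from elsewhere:

```python
from collections import defaultdict
from urllib.parse import urlparse

def summarize_backlinks(backlink_urls, domain_ips):
    """Count backlinks per domain and collect unique C-class IP blocks.

    domain_ips maps each linking domain to its IP address (a hypothetical
    lookup). A C block is the first three octets of the IP, so many domains
    hosted on one server collapse into one block."""
    links_per_domain = defaultdict(int)
    for url in backlink_urls:
        links_per_domain[urlparse(url).netloc] += 1
    c_blocks = {".".join(domain_ips[d].split(".")[:3])
                for d in links_per_domain if d in domain_ips}
    return dict(links_per_domain), c_blocks
```

Two domains on the same C block count as one block, which is exactly why the C block number is a better competitiveness signal than the raw link count.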
Link Harvester is open source. If you like the tool & find it useful you can add it to your site. Also if you can think of ways to make it better you can modify it however you please.
Why Not Look at Anchor Text?
I did not want this tool to spider websites.
I wanted this tool to be faster than anything on the market.
It is important to understand what anchor text variations people are using, but usually you can figure out how stiff the competition is just by quickly glancing through their backlink profile without necessarily looking too deeply into anchor text. The current off the shelf tools that monitor anchor text only give you a small sample of backlink data.
This tool was not designed to be the comprehensive show all link analysis tool, but just something that was useful and quick and easy to use.
After you see enough linkage data you become aware of how competitive a site is and how you should go about promoting it. It is kinda like the thin slicing concept Malcolm Gladwell talks about in Blink.
Please let me know what you think about Link Harvester in the comments below.
You guys, as you say, find inspiration in Orion's theories, even if they have not been proved, and it gives you the motivation to improve your content. That is sufficient enough to see the use of them.
The problem with the ideas as a whole is that they do not take into account the big picture, but focus down on a very specific area, which is the content on the page, when what you should be looking at is the content you share with your peers, and how this all links together. Starting to look at the various dimensions your content has in relation to the rest of the world around it may tell you some more. Demos I've seen do include the use of clustering, but in the sense of topic classification. Each site, or even each part of a site, will belong to one or many different spheres of belonging, if you like. I've seen demos that spit out the "topic sphere" and enable the user to visually or textually manipulate it to get the results they want.
Never forget the big picture!
I think Xan's point is valid in that by following rules or focusing on specific things sometimes we miss out on the big picture or create artificial machine identifiable patterns. With that being said I find lots of the stuff Orion posts interesting.
Off topic, but Orion the Hunter is my favorite constellation. I have been exploring the universe a bit recently, watching some Cosmos :)
Though I doubt the tool has much use, I love how smart the marketing is. They show a time meter of how much time the tool saved to make it seem as though it is providing an exceptionally useful service for users. To me it just looks like an excuse for Google to try to collect more data.
In my last post there was the following comment
I agree with the premises of privacy, and of not giving people too much information, I just don't know how Google would hurt any individual smart search marketer's business model.
There are many ways Google can hurt many people. One thing you have to worry about with some of these helpful plugins is how often will you see screens like this?
If you rely on internet marketing and do not have a strong brand you can count on Google swallowing more and more of your margins as their network will allow the richest & most socially active businesses to learn from and control the search results.
Most people, even in SEO / SEM, don't seem to be entirely clear about what data mining actually is about. A lot of fuzzy concepts abound, but only a few people seem to realize the commercial potential inherent in owning the world's largest database of trackable and verifiable user behavior.
Take AdWords: a great revenue stream for Google, true; but offhand I'd estimate that the overall value of the data generated from that venture alone probably beats the AdWords revenues by a factor of 6 or more if properly processed, analyzed, calibrated and marketed.
The difference is that now, the CTR of the ad copy itself is factored in, instead of it being solely the CTR of the keyword. Which only makes sense, IMO, given that it is the quality of the keyword and the particular ad it brings up that defines relevance, for a given search. source
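The quoted change is easy to see with a little arithmetic. A minimal sketch of bid-times-CTR ad ranking, with invented numbers and a hypothetical `ad_rank` helper (the actual AdWords formula and weights are not public here):

```python
# Hypothetical sketch: rank ads by bid * click-through rate.
# All numbers below are made up for illustration.

def ad_rank(max_cpc, ctr):
    """Effective rank is roughly the bid weighted by CTR."""
    return max_cpc * ctr

# Two advertisers bidding on the same keyword:
ad_a = ad_rank(max_cpc=1.00, ctr=0.05)  # higher bid, weaker ad copy
ad_b = ad_rank(max_cpc=0.60, ctr=0.10)  # lower bid, stronger ad copy

print(ad_a, ad_b)
```

Under this model the better-written ad (0.06) outranks the higher bid (0.05), which is why factoring in the CTR of the ad copy itself, not just the keyword, rewards relevance.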
I always like smaller conferences because when they get too big you (as in me) feel lost in the shuffle. It only costs $100 to attend this one. Smart move by JupiterMedia, as this will surely prevent others from having an easy entry into this market space.
The $199 per month Urchin On Demand also now includes report profiles for up to fifty individual websites (Urchin's previous offering included reporting for only one site). The price includes up to 100,000 pageviews per month. Users can add one million more pageviews for only $99 more per month.
In addition to the reduced price and increased number of profiles, Urchin On Demand is now able to import pay-per-click costs directly from Google AdWords accounts.
Many smart search marketers probably are not willing to be paid to give Google all their data. As time passes you can be sure that Google will drop their costs further as they try to kill off the business models of everything between them and ad dollars.
LOOK, FWHT, & MAMA are dropping like it's not hot. InfoSpace (which does search & mobile technology) recently lost about 30% of their market value as well.
From my limited perspective there are a couple major recurring flaws in the buying / selling cycle of SEO services.
Client ignorance of pricing: some clients view SEO as free money. This leads them to hire a cheap guy or someone who heavily markets the free money angle. Either of which stand a good chance of leading to fraud.
Client ignorance of process: some clients assume they know how to do it or that SEOs are messing it up based on slow feedback loops.
Rapid changes: this kinda goes hand in hand with the ignorance of process since most available information is dated. Some search algorithms are changed in ways that could best be described as random.
Big leap of faith: many SEOs want to sell a $10,000+ package right out of the gate.
Not that I am actively seeking lots of clients (because I am still bad at pricing and still am not sure what I want to do when I grow up), but I have found that by not being readily available, demand is much higher.
Another thing which works well for me is to do an hour-long or couple-hour phone consultation. Then when you are done, go through the client's site and write a report for them giving them specific action points for improving their site's visibility.
In doing that you can easily sell a $500 to $2,000 review package where you might be able to make a few hundred dollars an hour while still avoiding the long-term commitment of performing ongoing SEO services.
You get to feel the client out and they get to feel you out. At the end of that report you can say that if they have any questions or need any help they can get ahold of you.
By charging a decent bit off the start you help the client assume there is value which filters out many of the worst clients.
As long as the suggested improvements report and phone call do not sound like a sales pitch you start to build trust. If you and a client click you can go forward from there with a bit of trust built up.
I see tons and tons of sites selling full service SEO solutions, but few people seem to be looking for that middle ground where they can still deliver good value without expecting a long-term or big dollar commitment from the clients.
What are some of your favorite business models or sales techniques within the SEO space?
When I was new to SEO I did a bunch of on-the-page analysis to try to figure out exactly what other people were doing. The problem is that it gets you focused on things that do not matter. A site may end up ranking high at the sacrifice of conversion.
As search algorithms advance basic link analysis tools, at least for Google, are starting to become what keyword density tools are: a waste of time.
Link analysis software was cool, especially when Google used to show all of the PageRank 4 and above links, back when their search relevancy algorithm was a bit more dependent on raw PageRank.
Now Google only shows a limited random set of backlinks, and the other search engines also limit the search depth to 1,000 results, which makes it hard to do useful analysis with the various link analysis tools on the market.
If it were quick and easy to query a database deeply (deeper than 1,000) then the link analysis tools would be much more useful. None of them currently on the market really make that a quick and easy process.
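With only a capped sample of backlinks available per query, one way to squeeze more signal out of it is to collapse the results to unique linking domains, which is roughly what link analysis tools aim to surface. A minimal sketch, with invented URLs and a hypothetical `linking_domains` helper (not any actual tool's code):

```python
# Hypothetical sketch: given a capped list of backlink URLs, count
# backlinks per linking domain. Sample URLs are invented examples.
from urllib.parse import urlparse
from collections import Counter

def linking_domains(backlink_urls):
    """Count how many backlinks come from each distinct domain."""
    return Counter(urlparse(u).netloc.lower() for u in backlink_urls)

sample = [
    "http://example-directory.com/seo/page1.html",
    "http://example-directory.com/seo/page2.html",
    "http://some-blog.example.org/post/42",
]
print(linking_domains(sample))
```

Two pages from the same directory collapse into one linking domain, so even a 1,000-result cap can tell you how broad a site's support really is.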
To keep improving the results, you find more variables for the algorithm-creating machine to use, and you add to your store of human-ranked pages for it to "learn" from. What you don't do is bother understanding the actual algorithm -- it was constructed by a machine and is way too complex for anyone to keep in their head.
Psychologists have shown repeatedly that when you give people a system to optimize, all you have to do is secretly introduce a delay between their actions and the results of their actions, and they will go bonkers. In fact, in a very simple (single variable!) model in which people are trying to control the temperature in a virtual refrigerator, you can get some of the same irrational responses you see in these forums
and the first post here by Captain CaveMan (which incidentally is the name of an awesome cartoon character) does as well:
Without giving away the store, I don't know how else to say it. There is no sandbox. People speak of it as though it were some simple 'thing' that stops new sites from being seen. That has simply never been true. What was true was that in its early days, some of the algo elements and related filters were so tight that only a very few new sites got past them (some accidentally; some methodically). Over time that changed; more sites started getting out, presumably as G worked to surface more new, higher quality sites.
There is no sandbox. There is only a series of rotating algos and related filters that make it far harder for sites launched after spring of '04 to be widely seen in the SERPs. Not impossible. Harder. And certainly not as hard now as was true seven months ago. This has been hashed and rehashed so many times that it's hard to understand why it's still confusing.
If you can only see a few of the variables and overexert effort to satisfy those variables you may end up tripping filters and not satisfying other criteria.