There is a new version of Backlink Analyzer. I added PageRank to it (hopefully that doesn't piss off Matt too bad), made the search term feature more reliable for deep backlink analysis (ie: it shouldn't crash if you are doing an analysis of the keywords in thousands of backlinks), and it also shows what URL extension the links are coming from, as well as the page the links are pointing at if you do a linkdomain search.
If you find any problems with it please leave a comment on this post.
If it is not working when you try it, here are some things to take note of:
right off the start, if it does not work you may have to click the preferences button (the one with two check marks on it) and un-check the "use proxy server" option if you do not want to use a proxy
Norton (or equivalent) may block it
make sure you set the result limit count to a reasonable number (like 100 or 1000)
you have to select at least 1 engine to pull link data from
if you want to enable PageRank, site age, etc. you have to check those features
you can't get the keyword summary until the tool is done pulling in all the links
This thread will probably be up most of the rest of the weekend as I am reading a killer book and my internet access has been shifty recently. I have replaced the wall jack and all cords. A new router will be in next week. If my internet access is still unreliable that will speed up my decision of when and where to move.
Terrible post title, I know, but recently I have been asked many times over about the general theme of where to find high quality, free, easy-to-get links. But when you are brand new to a network you can't add up all those pieces and expect it to work. Here's why: if I know unique and innovative ways to pollute the web for personal profit, why would I share them for free with someone who has no passion for what they are doing and wants me to value my time at $0 to help them make money?
To be fair, some of the people who asked me were also paying customers, but the reasons I don't push the get as much as you can free without thinking angle are:
if you are doing the work for a client and your client has no marketing budget and/or values your time at next to nothing, then they are a client worth firing. If they want marketing to be free they should do it on their own dime and time.
I think to be good at profiting from algorithmic holes you have to have an analytical mind that is good at picking up on patterns. If you can't identify the obvious algorithmic holes by looking through the search results or looking at this then I am not certain what I can tell you that would help you.
if you do well and your work is easy to replicate then many others likely will.
if your work is easy to replicate a program will probably replicate it thousands of times over, so in essence you are valuing your own time as being worth less than that of an automated software program. Is that any way to live?
You can't do well long term just going after the holes unless you know how to spot them and predict them and quickly capitalize on them yourself.
There's something exciting about coming to musicians when they're just names, when you've no idea who Derrick Harriott looks like, or what his reputation is - considered naff by real dub fans, maybe? Derivative? Or maybe ground-breaking? I know the Stooges were ground-breaking - maybe that's what has been putting me off. The weight of knowing already how good it's meant to be. With this compilation, I've just ploughed through all these faceless names, liking things I probably shouldn't (covers of soul songs! Spanish guitar solos!) and maybe finding nothing I like by the supposed classics.
Sites like Del.icio.us, Digg and Techmeme surely have a techy slant to them, but they surface ideas that interest people daily. Sure some of them will be spam, hype2.0, or garbage, but occasionally some of them will be interesting, and give you ideas of how you can relate your site to the link-rich populace. Even casinos are doing a good job of it.
Find your keywords
I am targeting
resources listed below

I get a lot of site review requests for sites that recently took a dive in Google where the page generally follows the above format. Every time their main keyword phrase exists on the page it exists in the exact same format, and it exists just about everywhere.
There are two major problems with that format:
Over optimization: if a page is obviously targeting a phrase then Google may not want to rank that page for that phrase. When people write naturally (ie: for humans, not engines) there tend to be variations in it. Now some content management systems will cause some parts of the page to be fairly repetitive, but where you can, mix it up.
I see some examples of where Google is ranking a page focused on topic A for topic B just because topic B exists as a navigational element on page topic A. The same site has a more relevant page about the search query but the wrong page ends up ranking sometimes. While that is trashy relevancy from Google, to me that hints at where Google wants to head with their algorithms, showing me that Google is trying to figure out natural writing and reward pages for not being too repetitious or overly focused. They still need to do some serious work on how they interpret navigational elements into the relevancy algorithms, but when they do you can expect them to only get even more aggressive with favoring natural writing over spammy optimized content.
Wasted opportunity: assuming you took the time to create unique content for each page it only takes an extra minute or two to mix things up to help the page rank for a much wider net of keywords.
If you can find a way to mix up your keyword phrases, like:
sometimes leave one of the words out
sometimes just use one or two of the words isolated from the others
make the internal anchor text slightly different than what you focus the page content on
use variation in anchor text from external link sources, and focus it on slightly different words than you focused the internal linkage on
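The variation ideas above can be sketched in a few lines of code. This is a rough illustration only, not a recommendation of any particular phrase; the function name and example phrase are made up for the sketch:

```python
import itertools

def phrase_variants(phrase):
    """Generate loose variations of a keyword phrase: the full phrase,
    the phrase with one word left out, and one or two isolated words."""
    words = phrase.split()
    variants = {phrase}
    # sometimes leave one of the words out
    for i in range(len(words)):
        variants.add(" ".join(words[:i] + words[i + 1:]))
    # sometimes just use one or two of the words isolated from the others
    for w in words:
        variants.add(w)
    for pair in itertools.combinations(words, 2):
        variants.add(" ".join(pair))
    variants.discard("")
    return sorted(variants)

for v in phrase_variants("cheap seo tools"):
    print(v)
```

Rotating through a list like that for page copy, internal anchor text, and external anchor text keeps each signal slightly different from the others.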
you will end up ranking for a lot more phrases and will rank more consistently and reliably in Google. Others will bitch about the updates giving them a raw deal and Google making a power grab while you keep getting more and more traffic.
Google may not even realize how screwed up their search results are because they hold a flawed or blinded perception of value and they ignore important feedback. All search algorithms are just a way of interpreting or perceiving signs of value. Building a real brand requires creating a perception of value. To profit greatly you either need to build a brand, find flaws or underpricing in others' perception of value, or predict how markets will change and have the guts necessary to place a big bet on your intuition.
To sell for the cheapest price there are usually hidden costs, like: accounting fraud (Enron), increased risk of prostate cancer (possibly rBGH - makes me not want to drink milk at all), not listening to customer complaints (Google search quality, Paypal account reps for people who do $100,000's in Paypal transactions each year), poor customer support (Verizon DSL, Verizon DSL, Verizon DSL, Verizon DSL), etc.
Sometimes those hidden costs cost you far more than you make from them. People tolerate stuff for a while, then eventually a consumer creates a (yourbrand)sucks.com site or two and suddenly you are worried about what one irrational person does, when the irrational thing was expecting nobody to notice or mention your hidden costs, business warts, etc.
And the product price point matters too because your price point not only determines how many units you will sell, but it also helps determine how much support you can give with your offering, and the average quality of your consumer. Aim for the low end and that is just what you will get.
Telling a story about the value of your product and then adequately pricing it or overpricing it while following through on your offer is a much better way to profit than to fill your product with hidden costs and screw people over.
I just finally got unsick today (what are the odds of getting sick at a concert crammed with 80,000 other people hehehe), so I will be catching up on email tonight.
What are some easy perception of value points you could be using to create business models or authoritative linkable content that will make search engines and/or people more likely to perceive your site or business as being important?
It seems like their priorities are screwed at the moment. Am debating selling all my Goog shares as their SERPs are just so bad that it seems like they are driving themselves into irrelevance.
Will users stick to using an irrelevant Google with broken search operators? That is the question Google's current search results are asking right now each time you search. Lucky for Google the competition is deeply lacking at the moment, but I can't see it staying that way forever.
The number of conferences and other obligations I have been dealing with have overwhelmed me, so I decided to create a calendar of marketing and SEO conferences. It is updated through the end of 2006, although I am uncertain as to when WITS is. If I missed anything please let me know and I will add it.
The calendar is heavily focused on search and marketing. It will also list a few of the techy conferences like Web 2.0, Gnomedex, and SXSW.
If you are a forum junkie you may want to add Wickedfire.com to your list of daily visits. Be warned though that Jon might be a bit rough around the edges and likes to curse from time to time. But then again, the world would be a better place if political correctness was thrown in the trash can in favor of fucking honesty. :)
So my friend might take another week or two to get it done, but I am having him make an extension for adding data to Google's SERPs on the fly. A mock up might look something like this. Notice the links under each organic search result showing things like PageRank, site age, site size, and linkage data. Of course if this extension was made you would be able to actively pull in the data automatically or click a button to have the data selection pulled in on an as needed per URL basis.
Is SEO for Firefox an extension worth making? What marketing data would you like to see in Google's SERPs? What data points should be page specific? Which should be site specific? Which should be both? When gathering site data should it gather subdomain specific data? Or domain specific data?
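To make the page-specific versus site-specific question concrete, here is a tiny sketch of the kind of per-result annotation the extension might build. The fetch_* helpers are placeholders returning dummy values (a real extension would query each data source itself), and all names here are hypothetical:

```python
# Hypothetical sketch of per-result data an SEO extension might attach
# to each organic listing. The fetch_* helpers are stubs with dummy
# return values standing in for real data-source lookups.

def fetch_pagerank(url):
    return 4          # placeholder value

def fetch_site_age(domain):
    return "2001"     # placeholder value

def annotate_result(url):
    domain = url.split("/")[2]  # crude host extraction for the sketch
    return {
        "url": url,
        "pagerank": fetch_pagerank(url),     # page-specific data point
        "site_age": fetch_site_age(domain),  # site-specific data point
    }

print(annotate_result("http://www.example.com/page.html"))
```

The split between page-level and domain-level lookups is exactly the design question raised above: some data points make sense per URL, others only per site.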
A while ago I wrote about some of the reasons SEO is given a bad rap in general. Rand also posted today about how being an SEO is like being a plastic surgeon. I think another key issue which is not typically discussed much is the concept of authority and how it plays a role in media influence. If search may have the power to undermine many locally monopolistic publishing companies it benefits those companies to state that search has holes in it and that people manipulate it. Don't trust search - trust us, your reliable honest trustworthy truthful blah blah blah media source.
Circulation is directly proportional to revenue at large media companies. A story about some evil manipulative ____ is doing __________ is easier to spread than a story about how wonderful SEOs are.
Instances of the use of front groups as a PR technique have been documented in many industries. For example, the coal mining corporations have created environmental groups that contend that increased CO2 emissions and global warming will contribute to plant growth and will be beneficial, trade groups for bars and beer distributors have created and funded citizens' groups to attack Mothers Against Drunk Driving, and tobacco companies have created and funded citizens' groups to advocate for tort reform and to attack personal injury lawyers.
I believe that for the most part unbiased content will grow less and less profitable and decrease in quality and availability as time passes and more publishers are forced to become more aggressive with their monetization efforts. Google's drive for efficiency will train many independent publishers how to replace traditional media. Social networks and media consumption habits will also be heavily tracked and greatly replace the role of traditional intermediaries.
"The FCC's efforts on VoIP are like trying to solve traffic and energy problems by stifling the rollout of energy-efficient hybrid vehicles, while subsidizing SUVs," he said.
If you are exercising influence to dupe people it is fine if you are already in a seat of power, but if you are not then they want to expose you to make it look as though they are more pure - when it is rarely the case.
As corporations increasingly are able to embed themselves into the genes of humanity and create communication roadblocks while syndicating spin is there a way beyond it all? Will popular opinion be nothing more than people expressing how they are trained to think?
Sorry for all these cryptic rant posts. They are primarily driven by the following elements:
I have a killer flu / strep throat / headache / etc
Last weekend offered many experiences which made me realize a general lack of purpose and a lack of passion I have been living with for a while, which sucks. I not only saw the passion with which some others live, but also broadened my perspectives in a few other ways, and that made me feel great guilt for my lack of passions and living less than optimally for far too long.
I went from being a total failure to pretty successful pretty quick (at least financially), but I feel the learning curve has leveled off to where I have become pretty bored recently, and still need to do a lot of work on the social / physical / mental aspects of life.
I think it is important to question my own actions and authority MORE than I question anything else. I generally have a distaste for authority, and in the last month I have:
worked with people who are true mentors (though I feel they think more of me than I think of myself)
been mentioned on my favorite marketing blog (thanks Seth)
been asked to co-author books by well known publishers
been asked to review papers for well known journals (when I don't know shit about peer review processes, etc.)
been asked to talk to deans of a couple schools about how I would modify their courses (when I never went to college and only started learning about the web less than 4 years ago)
That sort of opportunity has gotten to feel a bit surreal when coupled with a feeling of stagnation and lacking purpose.
I recently started reading A Thousand Years of Nonlinear History, and it is probably the most powerful, insightful, and worldview changing book I have ever read. And I am only like half way done with it.
So does misrepresentation bother me? Sure, but so does rotten weather. It will exist as long as concentrations of power engender a kind of commissar class to defend them. Since they are usually not very bright, or are bright enough to know that they'd better avoid the arena of fact and argument, they'll turn to misrepresentation, vilification, and other devices that are available to those who know that they'll be protected by the various means available to the powerful. We should understand why all this occurs, and unravel it as best we can. That's part of the project of liberation - of ourselves and others, or more reasonably, of people working together to achieve these aims.
Sounds simple-minded, and it is. But I have yet to find much commentary on human life and society that is not simple-minded, when absurdity and self-serving posturing are cleared away.
So enough of my current rants and conditions... How do you fix this? Does the web help? What else is needed?
One of Google's biggest problems is that anything they do has a large impact on the web. AdSense made it profitable to create garbage, but at the end of the day it just leads to a web full of garbage.
How does Google fix the problem they created? Some businesses will be hesitant to trust giving Google too many profit points and data points in their business model, but for Google to improve the quality of the web (and thus the quality of their index) they are going to need to evolve their contextual ad program to evolve beyond just selling clicks. Not surprisingly, Google is launching a cost per action distributed affiliate network.
By providing search, analytics, a purchasing mechanism, an affiliate program, a search offering, contextual ads, and a toolbar bundled with everything they are able to get a more pure set of data and are able to insert themselves into more pieces of the shopping cycle while making the entire market more efficient.
I also believe that Google understands that biased content is in many ways more profitable than unbiased content. By teaching many traditional publishers and authorities about conversion Google has the net effect of allowing them to trade in some of their authority for profit. As traditional authorities lose some of their brand value and trust Google's role as a data aggregator and recommendation engine goes up since people will need to do more re(search) before trusting any entity.
I believe the net effect of search will be that it pushes a more biased and commercial web highly focused on psychographic marketing (that is where marketing is headed if Google is making markets as efficient as they can possibly be).
Many of the best business models are also atrociously inhumane. Are there any ethical guidelines to how well a search engine should automate knowing you and understanding what you want? If some of our worst ideas are reinforced and directed toward existing markets (or at least monopoly markets or markets with expensive and significant ad depth) at an early age it seems the world would become less diverse.
Is that a bad thing if it is also accompanied by consumers more aware of the biases of intermediaries?
Or how many people will think about authority related issues in commerce, life, and information consumption?
How do you increase global conversion rates without sacrificing the quality of the web?
If you were Google what would you do to improve network efficiency while also considering the hidden costs and concept of humanity that is often ignored by extremely efficient homogenized capitalism?
Fantomaster has some awe inspiring comments about the future of search and social engineering. Read them on a recent TW page starting with this one.
Not really SEO related, but I got back from Bonnaroo, and have to say it kicked ass. My favorite reason to go to big festivals is not just to hear the music, but more to see the human interaction and the display of passion associated with it. Sure some marketing agencies can push garbage on the radio enough to get some people to like it, but you don't get to the level of a group like Radiohead without having some real passion behind what you do. I think art is one of the hardest things to market, and even harder to keep producing with the same high quality and authenticity after success had produced a feedback loop that heavily influences the artist's life and work.
Jonny Greenwood is known to play guitar so aggressively that he had repetitive strain injury in his right arm and had to wear an arm brace for playing. Thom Yorke has broken down crying after recording a song. On top of being absurdly talented and working hard they also display more emotion than most would dare to.
I am sure I have posted about the concept of market edges before, but I am about to go away for a few days and wanted to make one last post before I went on my trip to Bonnaroo for a music filled weekend with Werty, Radiohead, DJ Sasha, Beck and Clap Your Hands Say Yeah. Each time a new vertical search service or content distribution type comes about it offers a quick and easy opportunity to help you boost your exposure. If you get in a market before it is saturated there is likely going to be
less competing content
fewer and weaker social barriers to break through
fewer signals of quality that are usable to organize information (thus if you title / label it smartly you won half the battle right there)
less of a requirement to be citation worthy
larger margin per unit effort
So let's say you go to the largest auction. The only way you are going to find great deals there is if:
the competition is clueless
there is a glut of supply that either saturates the market or prevents people from wanting to dig through the noise to find the gems
the seller does not know how to describe the value of what they are selling
you know a market better than the market does and can accurately predict future performance (I was bad ass at this as a kid with baseball cards)
you think you are getting a deal, but are actually getting quite screwed ;)
You can think of search (and the web as a whole) as an auction for attention. SEO is all about maximal ROI per unit effort while using a risk level you are comfortable with.
What has prevented us moving forward is a battle with a group of minority shareholders, some of whom claim to be lead by our ex-CEO Salim Ismail and are, in any case, primarily his "friends and family." This group is using very unusual clauses in our Shareholder's agreements to block mergers or financings. We've found it difficult to determine their motives, however, some have said that they believe that it is in their interest to drive the company into bankruptcy so that they can buy our software and start a new company.
Popular sites like YouTube prove not only that artists will give away their work for exposure, but that eventually some may even need to pay just for the opportunity to give their stuff away. Writers are going to have to be the same way. When I turned down a major publishing house I did so because I thought I would be able to do a better job marketing than they would. Part of that marketing is giving away some copies of my ebook for review, while other parts of that marketing include sharing a ton of ideas.
When I wrote a long post a couple days ago it only got about 10 links (which is a terrible short term ROI for a 30 page article) but it may have also gotten me a speaking opportunity at San Jose SES, which is huge huge huge.
To do well with search long-term you have to find the new markets or be willing to over-invest and realize that many of the things you do will not do much, but some of them will do much better than you think they should.
I recently ranted about blogspamming comments being a poor way to win an SEO contest. It is a technique that for the most part came and went. Especially if you are taking the time to do it manually to leave garbage comments on a blog with nofollowed comments that other people are hitting hard too.
Most of the large web companies are trying to bridge the gap to try to find new and innovative ways to make user feedback useful.
Yesterday I spent about 10 hours looking at eBay. Did you know they have a wiki, blogs, community forums, keywords, profile pages, auctions, eBay express, reviews, guides, market research data (for $25 a month) and stores. Setting up a store only costs $16 and then the per month per item listing fee is only 2 cents.
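Using the figures above ($16 to set up a store, 2 cents per listed item per month), a quick back-of-the-envelope cost sketch, done in whole cents to avoid floating point noise (the function name and listing count are made up for illustration):

```python
def first_month_cost_cents(num_listings, setup_fee_cents=1600, listing_fee_cents=2):
    """First-month cost of an eBay store using the figures in the post:
    $16 setup plus 2 cents per listed item. Computed in cents."""
    return setup_fee_cents + num_listings * listing_fee_cents

# e.g. 500 listed items: 1600 + 500 * 2 = 2600 cents, i.e. $26.00
print(first_month_cost_cents(500) / 100)
```

Even at a few hundred listings the carrying cost stays tiny, which is what makes it an interesting marketing channel to experiment with.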
Surely as they work on integrating all that information they are going to create under-priced marketing opportunities.
I have seen people make Amazon guides that recommended my ebook. I have seen people review other books and tell people to instead buy mine. Amazon also has tagging, product wikis, and Alexa feedback.
Again you can probably find some ways to market your site on there (by reviewing related products or creating guides, etc.).
Google is doing in-line search suggest for related popular searches. They also are providing guided categorization of travel, medical, and perhaps a few other types of search. Each time they split up their traffic on the generic terms and provide a path to more niche fields they help boost those niche markets. The framework with which they set up their categorization might be a good key for how to set up internal navigation or what niche sites are worth building.
Now there are a ton of meme trackers that provide free authoritative links to the quickest spreading ideas. Make sure you own a couple blogs you can link from and make sure you have a number of blogging friends on your IM list, so that when you need to spread an idea they can help give you a boost. The web is just a social network.
Not only are there meme trackers, but there are also social news sites that aim to cut the editorial costs out of running a news site. Netscape is looking to clone the Digg model, so again it is worth it to have a number of friends on the IM list.
If you just get a few blog links and a mention on those two sites you might only be a quality link or two away from being able to rank in most niche markets.
Those social news sites are also killing the importance (and availability) of blogs which aim to be first with all the news. And they are going to make it harder to get traffic to sites that lack opinion, because they are going to create tons of boring content on fairly authoritative domains.
With all these market edge type ideas am I suggesting you spam? Nope. I am just saying that at market edges there is great profit potential, and if you look to see where markets are headed and get in early on new markets you can establish a self reinforcing base with much less effort than is required to build yourself up from scratch in an already competitive marketplace. Just look at some of the junky old sites that rank in Google. Why do they still rank? Because in the past less was needed to be citation worthy.
For those people who consider all aggressive marketing as spam or for those who tie some arbitrary ethical garbage to their marketing methods, don't forget that Google is one of the biggest spammers on the web.
How does Google spam?
Profit share partnerships with garbage AdSense sites.
Inadequate editorial filtering of their ads such that they have even profited from ads promoting child porn.
Accidentally making pre-releases available or listing them in their robots.txt
Labeling everything as a beta so they can double dip on news.
Relaunching old products as though they are never before seen offerings. Just today they duped the Washington Post into writing an article about Google's *BRAND NEW* government search when the service is actually about 7 years old. How is that anything BUT spam?
The difference between spam and good marketing is perception. Most techniques are not typically classified as spam until after people heavily abuse them. In other words, market timing and unique techniques are all you need to do to succeed, and that is pretty cool since new markets are always forming.
I think Marc is a great guy and am sure he had great intentions when he created the Isulong Seoph contest, but getting manually comment spammed 10 or 15 times a day gets old.
When blogs were newer and I had less brand value I am certain I was probably a bit of a blog spammer too, but you have to use effective techniques while they are effective. I don't think you are going to win an SEO contest today by manually blogspamming garbage comments on a blog that uses nofollow and is getting hit by 100 other people using the exact same spamming technique. Whether or not something is spam is entirely up to user perception, but you have to think that I am going to know when an SEO contest is going on. The tolerance for spam and the ability of spam to go undetected is probably roughly inversely proportional to the frequency with which the person being spammed is exposed to that type of spam.
From this point forward I am going to just file anyone's comment signed with Isulong Seoph straight to the junk folder without even reading it (same goes for if they have made up contest words in their URL). Not trying to say I am better than anyone (and I am sure I did some manual blog spamming back in the day) but you have to use effective techniques while they are effective.
Today there is soooooooooo much spam opportunity out there:
Google over-trusting subdomains
MSN trusting just about any type of spam you can think of ;)
Wiki links and indirect wiki links
tagging, community, and social sites
other types of sites where you can create profiles without seeming overt
large ecommerce sites trying to integrate user feedback and guides into their sites
a few others I won't name
Then of course you got all sort of the more traditional spamming opportunities still available.
The fake words are boring AND make an obvious footprint that makes it easy to detect many types of spam. Whoever holds the next contest should use a real word. See who could be the first person to rank #1 in Google for spammer. That would be a bit more challenging though, since it would require them to beat one of the original blog spammers.
While I would describe myself as financially secure and profitable, I am still a bit wet behind the ears on business partnerships. I think I have had about a half dozen great business partners so far. Here are brief descriptions of some of the general attributes that made them great.
Hey Asshole! If a person is willing to tell you that you are a piece of shit or that you are screwing up it is much easier to trust them and their motives than the average person email spamming you with the Joint Venture opportunity of your lifetime. If they are willing to be blunt and honest with you then you have to respect that. I found at least 4 great business partners this way.
Questions Authority: When people are willing to ask "but why" they not only show the courage to tell you when you are full of crap (and thus help you make better ideas) but they also are going to be more likely to find other ideas that help you out-market the competition or find holes in relevancy algorithms to outmaneuver the search engines. Where conventional wisdom is wrong great profit potential exists.
Most authority systems are hypocritical garbage designed to increase the wealth or power of the authority figures or rule makers. If you are willing to look at them from that perspective it is much easier to find potentially profitable opportunities and algorithmic holes.
Believes in You: One of my friends quit his job and is working full time building out a website for me. Behind his computer on the wall he actually wrote the word FOCUS in big black marker. After about a month of consistent growth, yesterday was the first day that the website paid over 100% of his living costs (including his somewhat expensive home mortgage).
Focuses on Automation: It depends on your business models, but if people think of the scalability or ease of replication of a business model at launch that is going to typically lead to a much higher profit yield than a person who starts creating before they think about profits or automation.
Has Different Sources: Their sources may be their own experiences or channels that are not typically read by most people in your market, but generally if people can pull value from sources that are generally overlooked in your industry that is a good sign for the value they can create.
It is hard to make money doing the exact same thing everyone else is doing.
History of Execution:
One of my hyper-successful friends and business partners recently said
I do think it is all about execution though and we will not be out executed.
Having too much confidence can be a bad thing, but if you have partners that have shown the ability to follow through it is a great sign to hear them that confident.
Excitement: A person who feels they just deserve to be successful may not add much value to whatever you are doing. A person who is hard working and excited may not realize their value and/or can be trained to produce valuable work, and will be much more malleable than someone who is already stuck in their ways.
What attributes do you look for in a business partner?
If you think of a search engine as a user trying to perceive how credible documents are then many of those factors make a lot of sense from an SEO perspective, because
Your site visitors will consider many perceived credibility factors when deciding whether or not to buy, transact, or link to your site. Credibility is the key to conversion, especially with expensive or non-commodity products and services.
Search engines also evaluate how others perceive your site through looking at linkage data and usage data.
It might also be worth taking a look at Beyond Algorithms: A Librarian's Guide to Finding Web Sites You Can Trust. Imagine that many media members use some similar criteria as the above two documents, and it is easy to see how librarians, media members, and other authoritative voices propagate trust through the web, and why many sites lack credible citations until their site sells itself as being credible enough to merit quality citations.
I have bought a couple blog designs. SEO Book.com was designed by one of my favorite blog designers. Another blog design I bought was cheaper and for a network of blogs, and it came out to be far less appealing. Had I not been an SEO who looks at site structure frequently I might not have noticed many of the hidden costs that came with the bargain design. Here are some examples of things that were totally jacked up with the design product I got a deal on:
Does not look as professional: of course I expected this part, and sorta factored that into the consideration of value at the lower price point. What I did not factor in was all of the following:
Same page title on every page: well, obviously that sucks. How well will search engines understand the differences between documents when I throw one of the keys away at hello? So I had to go through, find the appropriate archive and individual post Typepad tags, and fix up the templates to offer unique page titles on individual post and archive pages.
Header links to alternate version of homepage: the site was designed such that all the internal link popularity flowed to site.com/index.html instead of site.com. Some search engines are still having canonicalization issues, so that had to get fixed.
Lack of modularity: although the designer knew I was going to use the template across a series of blogs, they chose to manually type out the URL paths and site anchor text when that could have easily been done using the tagging solutions. I eventually had to go through and add those tags to make it easier to duplicate the design across different blogs, rather than taking an hour per blog to edit the templates for each of about 20 blogs.
noindex nofollow: whether out of an attempt to sabotage a client or out of sheer incompetence, the designer included noindex and nofollow tags in most of the page templates.
So let's say you save a few grand by going with a cheaper designer. What are the potential hidden costs of those savings?
less professional design: I think this factor has to be broken down by the quality of your site
high quality content: if you are going to make a high quality site you might as well make the design look nice too. Links snowball on themselves, and a few more links today may be a hundred more next month and a thousand next year. Or imagine the cost if you missed out on those links. Eeek!
low quality content: for sites that are borderline spam, sometimes the difference between staying indexed and being booted out of the search results altogether is bridged by a decent design. A good design can carry bad content to some extent. Bad content + bad design = much more likely to get the boot for being spam.
poor page titles: this can easily cut your search referral traffic in half. Given that the people who reference your work are people who somehow found it, cutting off one of your most important inroads can cost a lot over time, especially when you consider how links compound over time.
canonicalization issues: this could cause indexing problems and prevent your homepage from ranking as well. Potentially worse too.
noindex nofollow: I guess it depends on how you monetize your site, but cutting the search engines off at hello is not a good way to work your way up to exposure
Someone newer to the web than me probably would not have caught all those errors either. So the problems could have lasted for months or years without being fixed for some people.
I don't think great design has to be expensive either. I am a fan of buying a great logo and then just using an ultra clean site design, and just letting the links and headings sorta match the colors of the logo. That is how this site was for about a year and a half before I found the designer who did a kick ass job designing the current version.
On top of design affecting how willing people will be to link to your site or read your site, it also plays a major role in determining how well your site will convert. Some ugly sites sell, but if you are selling something that is high end and individually branded, I think a great design can play a big role in helping build your credibility and boosting your conversion rates.
One thing I find frustrating is that Wikipedia lists the SEO page as being part of its spamming series, and yet you have people designing hundreds or thousands of websites with these sorts of information architecture errors in them.
Tony Spencer here doing a guest spot on SEOBook. Aaron was asking me some 301 redirect questions a while back and recently asked me if I would drop in for some
tips on common scenarios so here goes. Feel free to drop me any questions in the comments box.
301 non-www to www
From what I can tell Google has yet to clean up the canonicalization problem that arises when the www version of your site gets indexed along with the non-www version (i.e. http://www.seobook.com & http://seobook.com).
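A typical mod_rewrite version of this redirect, assuming Apache with mod_rewrite enabled and seobook.com standing in for your domain, goes in your .htaccess file:

```apache
RewriteEngine On
# If the request came in on the bare hostname...
RewriteCond %{HTTP_HOST} ^seobook\.com [NC]
# ...301 it to the same path on the www hostname
RewriteRule (.*)$ http://www.seobook.com/$1 [R=301,L]
```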
The '(.*)$' says that we'll take anything that comes after http://seobook.com and append it to the end of 'http://www.seobook.com' (that's the '$1' part) and redirect to that URL. For more grit on how this works, check out a good regular expressions resource or two.
Note: You only have to enter 'RewriteEngine On' once at the top of your .htaccess file.
Alternately you may choose to do this 301 redirect in the Apache config file httpd.conf.
Note that webhost managers like CPanel often place a 'ServerAlias seobook.com' in the first VirtualHost entry, which would negate the following VirtualHost, so be sure to remove the non-www ServerAlias.
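A rough httpd.conf sketch of that arrangement (hostnames and the DocumentRoot path are placeholders for your own setup):

```apache
# Main site answers only on the www hostname.
# Remove any "ServerAlias seobook.com" line from this entry.
<VirtualHost *:80>
  ServerName www.seobook.com
  DocumentRoot /home/seobook/public_html
</VirtualHost>

# The bare hostname gets its own entry that 301s everything to www.
<VirtualHost *:80>
  ServerName seobook.com
  Redirect permanent / http://www.seobook.com/
</VirtualHost>
```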
301 www to non-www
Finally the www 301 redirect to non-www version would look like:
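Mirroring the earlier rule, a standard .htaccess sketch for the reverse direction (again assuming seobook.com as the domain):

```apache
RewriteEngine On
# If the request came in on the www hostname...
RewriteCond %{HTTP_HOST} ^www\.seobook\.com [NC]
# ...301 it to the same path on the bare hostname
RewriteRule (.*)$ http://seobook.com/$1 [R=301,L]
```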
Let's say you no longer carry 'Super Hot Product' and hence want to redirect all requests to the folder /superhotproduct to a single page called /new-hot-stuff.php. This redirect can be accomplished easily by adding the following to your .htaccess file:
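One way to write that rule (a sketch, using the same mod_rewrite style as the earlier examples):

```apache
RewriteEngine On
# Send everything under /superhotproduct/ to the single replacement page
RewriteRule ^superhotproduct/(.*)$ /new-hot-stuff.php [R=301,L]
```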
But what if you want to do the same as the above example EXCEPT for one file? In the next example all files from the /superhotproduct/ folder will redirect to the /new-hot-stuff.php file EXCEPT /superhotproduct/tony.html, which will redirect to /imakemoney.html:
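A sketch of that variant; the exception rule has to come before the catch-all so it matches first:

```apache
RewriteEngine On
# Exception first: this one file gets its own destination
RewriteRule ^superhotproduct/tony\.html$ /imakemoney.html [R=301,L]
# Everything else under the folder falls through to the catch-all
RewriteRule ^superhotproduct/(.*)$ /new-hot-stuff.php [R=301,L]
```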
This one is more difficult, but I have experienced serious canonicalization problems when the secure https version of my site was fully indexed alongside my http version. I have yet to find a way to redirect https for the bots only, so the only solution I have for now is to attempt to tell the bots not to index the https version. There are only two ways I know to do this and neither is pretty.
1. Create the following PHP file and include it at the top of each page:
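A minimal sketch of such an include, assuming PHP on Apache (the HTTPS server variable check may differ on other server setups); include it inside the head of each page:

```php
<?php
// If this copy of the page was requested over https,
// ask search engines not to index or follow it.
if (isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] == 'on') {
    echo '<meta name="robots" content="noindex,nofollow">';
}
?>
```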
2. Cloak your robots.txt file.
If a visitor comes from https and happens to be one of the known bots such as googlebot, you will display:
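That is, the bots on https get a robots.txt that disallows everything:

```
User-agent: *
Disallow: /
```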
Otherwise display your normal robots.txt. To do this you'll need to alter your .htaccess file to treat .txt files as PHP or some other dynamic language, and then proceed to write the cloaking code.
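Assuming robots.txt is already being run through PHP (e.g. via a ForceType directive in .htaccess; the exact handler name varies by host), a sketch of the cloaking script, with an illustrative rather than exhaustive bot list:

```php
<?php
// Cloaked robots.txt: block known bots on the https hostname only.
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
$https = isset($_SERVER['HTTPS']) && $_SERVER['HTTPS'] == 'on';
$bot   = (strpos($agent, 'googlebot') !== false
       || strpos($agent, 'slurp') !== false
       || strpos($agent, 'msnbot') !== false);

header('Content-Type: text/plain');
if ($https && $bot) {
    // Bots on https: disallow everything
    echo "User-agent: *\nDisallow: /\n";
} else {
    // Everyone else: the normal (allow-all) robots.txt
    echo "User-agent: *\nDisallow:\n";
}
?>
```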
I really wish the search engines would get together and add a new attribute to robots.txt
that would allow us to stop them from indexing https URLs.
Getting Spammy With it!!!
Ok, maybe you aren't getting spammy with it, but you just need to redirect a shit ton of pages. First of all it'll take you a long time to type them into .htaccess, secondly too many entries in .htaccess tend to slow Apache down, and third it's too prone to human error. So hire a programmer and do some dynamic redirecting from code.
The following example is in PHP but is easy to do with any language. Let's say you switched to a new system and all files that ended in the old id need to be redirected. First create a database table that will hold the old id and the new URL to redirect to:
new_url VARCHAR (255)
Next, write code to populate it with your old IDs and your new URLs.
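Then the lookup-and-redirect script might look something like this. This is a hypothetical sketch: the "redirects" table name, the "old_id" column completing the fragmentary table definition above, and the URL pattern are all placeholders for your own schema:

```php
<?php
// Hypothetical table, completing the fragment above:
//   CREATE TABLE redirects (old_id INT NOT NULL PRIMARY KEY, new_url VARCHAR(255));
// Pull the trailing numeric id off the requested URL, e.g. /widgets-123.html -> 123
$id = (int) preg_replace('/^.*-(\d+)\.html$/', '$1', $_SERVER['REQUEST_URI']);

$db = mysql_connect('localhost', 'user', 'pass');   // your credentials here
mysql_select_db('mysite', $db);
$res = mysql_query('SELECT new_url FROM redirects WHERE old_id = ' . $id, $db);

if ($row = mysql_fetch_assoc($res)) {
    // Found a mapping: issue the 301 and stop
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.seobook.com' . $row['new_url']);
    exit;
}
// No mapping found: fall through to a 404 or your normal page handling
?>
```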
Bob Mutch at SEO Company created an inbound link quality extension for Firefox. You can download the extension from his home page, or access the tool online (again on his home page, but the web based tool has been slow). The tool checks to see if a site is listed in the Yahoo! Directory or DMOZ. In addition it searches Yahoo! for the number of .edu and .gov links pointing at a website.
The extension looks like this
While in some cases there are some .edu and .gov sites that offer up spammy links, the theory behind the tool is that most .edu or .gov links are going to be harder to get / more pure / of higher quality than the average link from most commercial sites. In that sense, the raw number of .edu and .gov links can be seen as a proxy for whether a site has any quality natural editorial inbound links, and an estimate of the depth of quality citations a site has received.
So there was an old domain name I really wanted. I saw that the site was down and that the PageRank was already stripped (which happens to most expiring domains anyhow) and the name was kinda junky, but I was hoping that it would go to auction and I would be the only one backordering it. Oh how I was wrong.
It just cost me about $4,000 to buy a generic domain with 0 PageRank because I was too dumb to try to get it earlier, perhaps while it still had PageRank. So the tip is, if you see a site down and think you would like the traffic stream the domain enjoys you are probably better off asking the current owner if they would part with it for a few hundred dollars instead of paying $4,000 at auction for it ;)
With Matt Cutts's recent post about the changing quality signals needed to get indexed in Google, and sites with excessive low quality links getting crawled shallower (and some of them not getting crawled at all), some people are comparing Google's improving crawling standard to an early version of something similar to how the Google Sandbox prevents new or untrusted sites from ranking. WebmasterWorld has a 7ish page thread about BigDaddy, where Graywolf said:
I'm personally of the opinion that we're starting to see the 'sandbox of crawling'
What is the Optimal Site Size?
Some people in the thread are asking for the optimal site size for crawling, or if they should change their internal navigation to accommodate the new Google, but I think to some extent that misses the mark.
If you completely change your site structure away from being usable to do things that might appease Google in a state of flux, you are missing the real message they are trying to send. If you rely too heavily on Google then you might find they are in a constant state of being broken, at least from your perspective ;)
The site size should depend largely on
how much unique content you can create around the topic
how well you can coax others into wanting to create unique topical content for you
how people shop
how people search for information
how much brand strength you have (a smaller site may make it easier to build a stronger niche specific brand, and in most cases less content of a higher quality is far more remarkable than lots of junk content)
Many times it is better to have smaller sites so that you can focus the branding messages. When you look at some of the mega sites, like eBay, they are exceptionally weak on deep links, but they also have enough authority, mindshare, and quality link reputation to where they are still represented well in Google.
Scaling Out Link Quality and Unique Content
Another big issue with crawl depth is not only link quality, but also how unique the content is on a per page level. I was recently asked about how much link popularity was needed to index a 100,000,000 page site with cross referenced locations and categories. My response was that I didn't think they could create that much content AND have it unique enough to keep it all indexed AND build enough linkage data to make Google want to index it all.
Sometimes less is more.
The same goes for links too. If you go too hard after acquiring links the sandbox is a real and true phenomenon. If you get real editorial citations and / or go for fewer and higher quality links you will probably end up ranking quicker in Google.
While it may help to be selective with how many links you build (and what sources you are willing to get links from), it also presents great value to be selective about who you are willing to link to AND to link out to many quality resources that would be hard to make look spammy. Rand recently posted:
From a trustworthy source - Googlebowling is totally possible, but you need to use patterns that would show that the site has "participated" in the program. What does that mean? Check who they link to - see if you can make the same spammy links point to those places and watch for link spam schemes that aren't in the business of pointing to people who don't pay them.
So if you make your site an island or only partner with other sources that would be easy to take out you limit your stability.
What Makes a Site More Stable?
The big sites that will have 100,000 pages stick in the SERPs are real brands and/or sites that offer added value features. Can individuals create sites to that scale that will still stick? I think they can, but there has to be a comment worthy element to them. They have to find a way to leverage and structure data, be comment worthy and / or they need to have an architecture for social participation / content generation.
The Net Cost & Value of Large Algorithmic Swings
Some people say that wild search algorithmic swings are not a big deal since for every person losing someone must gain, so the net effect is not driving people toward paid ads, but I do not buy that.
If your sites are thin spam sites and you have limited real costs the algorithmic swings might not be a big deal, but when businesses grow quickly or have their income sharply drop it affects their profitability, both as they scale up and scale down. You also have to factor in the cost of monitoring site rankings and link building.
At the very least, the ability to turn on or turn off traffic flows (or at least finely adjust them) makes PPC ads an appealing supplement to real businesses with real employees and real business costs. Dan Thies mentioned his liking of PPC ads largely for this reason when I interviewed him about a year ago.
As Google makes it harder to spam and catches spam quicker eventually the opportunity cost of spamming or running cheesy no value add thin sites will exceed the potential profit most people could attain.
Authority Systems Influence the Networks They Measure:
Attention is what people produce (as in "hand over the money" or "look at this ad") in exchange for information and experience. As Lanham writes in The Economics of Attention, the most successful artists and companies are the ones that grab attention and shape it, in other words, that exercise influence. With so much information, simply paying attention is the equivalent of consuming a meal or a tube of toothpaste.
Any system that measures influence also poses an influence on the market it measures. A retail site in an under-marketed industry that randomly winds up on the Delicious popular list or Memeorandum one day will likely outdistance competitors that do not.
Search has a self-reinforcing aspect to it as your links build up. A site with a strong history of top rankings gets links that other sites won't get. Each additional link is a re-validation of quality. The people at Google realize that they have a profound effect on how the web grows. Now that they have enough content to establish a baseline in most commercial markets they can be more selective with what they are willing to crawl and rank. And they are cleaning up the noise in their ad market as well.
The Infinite Web, Almost
Some people view the web as an infinite space, but as mentioned above, there is going to be a limit to how much attention and mindshare anything can have.
The Tragedy of the Commons is a must read for anyone who earns a living by spreading messages online, especially if you believe the web to be infinite. While storage and access are approaching free, eventually there is going to be a flood of traditional media content online. When it is easy to link at pages or chapters of books the level of quality needed to compete is going to drastically increase in many markets.
So you can do some of the things that Graywolf suggested to help make your site Google friendly in the short term, but the whole point of these sorts of changes at Google is to find and return legitimate useful content. The less your site needs to rely on Google, the more Google will be willing to rely on your site.
If you just try to fit where Google is at today expect to get punched in the head at least once a year. If you create things that people are likely to cite or share you should be future friendly.
In the last 3 days about 3 or 4 friends compared good marketing and branding with the iPod. How does that relate to SEO? Peter Da Vanzo recently posted about his iPod:
When I was considering buying a music player, some music-gadget obsessed friends offered a wealth of well-meaning advice. "No", they said, "don't get an iPod because it can't do xyz, unlike the XRX2000 (or whatever), which can do so much more! More stuff! Oh, and the iPod is overpriced". Those weren't the exact words, but that was the gist.
They were probably right, but the problem is: I don't care.
I knew that if I bought anything else, I'd always think "yeah, but it's not an iPod".
The other day in an IM Andy Hagans also mentioned his iPod:
I buy iPods regularly even though I know they're not better. For 3 times the price of the competition. Because I 'trust' them somehow.
How does all this relate to marketing? If you want to do well long-term you have to sell your product or service as a non-commodity. The more your product / service / business is sold as a piece of art, or as something thought to be worth paying more for, the more you have to move away from just being approved of on a rational level and toward a strong appeal on an emotional level.
The link profile of this site is far less than perfect, but a large part of the heavy anchor text focus on the phrase SEO Book is because I wanted to create a strong brand. If my inbound anchor text were mixed better this site could probably get a ton more traffic, but traffic without a strong branding element has much less value, especially when you sell an ebook for about 4 times the price that most physical books sell for.
The reasons that legitimate content works so well are
most markets usually take a while to react to quality content
because of that delay, it typically takes spending months or years over-investing before seeing any type of return on the effort required to create something unique and useful that will stand the test of time
most people looking to make a quick buck are all fighting for the same shallow traffic sources and are not willing to spend the time to deeply research their topic or emotionally invest in their content enough for it to pay off
Not every page is going to win awards or have a net positive return for the effort that went into it, but as you build a variety of legitimate useful original pages over time the site authority starts to build on itself and eventually you snowball toward the top.
Recently 11 blogs from the Fine Fools network sold with content, designs, and links for a total of $4,500. I am still busy kicking myself in the teeth, because I would have paid much more than that for those blogs. Those blogs were generating over 300,000 monthly pageviews, but the sites were generating only roughly $300 in monthly revenues.
Without even adding any content to those sites, given their traffic volume and link authority (most of the sites were strong PageRank 6 sites with natural backlink profiles) I could have easily increased the income to over $5,000 a month (ie: had the network more than pay for itself in the first month of ownership).
I think that limited $300 / month revenue figure is a great example of why it is worth worrying about more than just pageviews. SEO is just one piece of the puzzle, and usually most sites have big obvious on site gains that could be pursued long before you look to invest heavily in increasing traffic.
I get some people who tell me that they are already getting a million pageviews a month and they want me to guarantee they will get 3 million pageviews a month if they read my $79 book. If you can get 2 million monthly pageviews for $79 please let me know the source and I will follow you with a few thousand dollars in hand. At that scale the issue is not that you need more distribution. If you can't make money with hundreds of thousands or millions of pageviews you ought to consider changing your revenue model.
I realize that celebrity sites might get tons of low quality traffic, but how hard would it be to add ring tone affiliate ads, concert ticket text links, or dating affiliate ads to the sites? How hard would it be to write a dating ebook you sold for $30? And the blog A Man's View could have easily been changed to a porn blog that would make in excess of $5,000 a month by itself.
I guess a valuable lesson here is that networks that don't profit will eventually fall apart and/or will be sold for well less than they are worth. Another valuable lesson might be that there is still a huge disconnect between traffic and value in the minds of most webmasters, and the WWW still has near endless profit opportunities lying about.
A friend normally gives me the scoop on auctions at Sitepoint, but something went wrong on this one. I am still kicking myself in the teeth. I would have loved to have bought those blogs, especially that cheap. Damn damn damn damn etc ;)
Similar to the content categorization engine, but for keywords. In addition to the uses described above this tool can also show you how well your page is aligned with your core keywords. Try Microsoft's Keyword Categorization Engine
Demographics Prediction Tool:
Shows the age groups and gender of searchers for a particular query or visitors to a specific URL. Useful for:
showing the most common markets for a search query or domain.
showing you how well your site audience is aligned with your core keywords (for example, if a site lacks corporate bullshitspeak™, it would be unsurprising that the viewers of that site would be younger than the demographic averages for a field which is typically targeted toward older people who can't get enough corporate bullshitspeak™)
The most common groups of visitors (and their mindset) for a site or query might be obvious, but some of the secondary and tertiary markets may be far less defined. This tool can help you find some of those other markets.
Shows seasonal search spikes. It is like a hybrid between Google Trends and Google Suggest, but it will also show you relevant keyword phrases that have your keyword in the middle of them. This tool does not seem to have as much depth as Google Trends (ie: surprisingly few searches show results). They also seem to have stripped out many gambling and porn related keywords. Unlike Google, MSN places search volume numbers on their trends. Useful for:
Shows you Microsoft's opinion of the probability of a query or a page being informational, commercial-informational, or commercial-transactional in nature. Works well in conjunction with Yahoo! Mindset. Useful for:
seeing how commercial they think a term or page is, which is important because it is believed that some search engines, such as Google, have a heavy informational bias to their search results.