Imagine selling web traffic as a commodity in a blind auction, while touting its value based on the traffic being targeted, relevant, precise, and trackable. Then imagine taking away the default keyword tool on the internet that has been written about in thousands of marketing books, ebooks, and web pages - and replacing it with nothing. Then imagine signing up some seedy publishing partners that run clickbots against your highest value keywords, and giving them the lion's share of the click "value" on those keywords. Then imagine not making it easy for advertisers to opt out of that "traffic." Then imagine editing your advertisers' accounts without their permission to alter ad text and keywords, and only informing some of them about the changes sometime after they take place...with 1 in 5 rejecting the changes!
Google offers about a half-dozen public keyword tools, makes it easy to filter out bad traffic, has way more volume, offers enterprise level analytics for free, and does not edit your keywords and ad copy without your permission. Is it any wonder Yahoo! managed to lose hundreds of millions of dollars last quarter, while Google keeps exceeding market expectations - even during a recession?
I just hope that when Yahoo! gets bought out by Microsoft that they keep Site Explorer around for us SEOs, and don't do us as poorly as they did their advertisers. ;)
It is no secret to readers here that SEO is an ongoing process, but I was playing with SmartDraw and created an SEO process circle.
One of the problems many people have with SEO is that they think that they will use SEO to get their site in front of thousands of relevant people, but that model only works if they are...
using pay per click marketing (buying the traffic)
using black hat SEO (which may provide only short term results)
using an old trusted domain that already has many signals of quality built up
Amongst the hundreds or thousands of participants in your market, some of them enjoy an older site, more social relationships, more links, a more well known brand, a larger traffic stream due to their site already being trusted, and other traffic streams like RSS readers and email list members, etc.
All of those advantages for existing webmasters act as headwinds for a new webmaster (at least until you get established). You typically have to create some number of social interactions to leave the trail of signals of quality to make Google want to trust a site enough to put it in front of a large traffic stream, especially if you are starting a brand new site and are trying to operate within Google's guidelines. As Bob Massa says "search engines follow people."
If you like the above entry, you might also like the SEO Flowchart. :)
Good design makes quality content look and feel better. Design can help improve conversion rates, make a site more linkable, and sometimes a site generates additional links and mentions just for having a great aesthetic design.
I frequently get asked how we can run a wide array of websites with only a few high-quality part time employees. One of our secrets is staying away from the stuff we are no good at - like web design. I could show you my attempts at design, but you would think less of me if I did. ;)
Rather than going the DIY route, I have been getting quality custom website designs from Wildfire Marketing Group for many of our newer sites, and they look great. I liked their services enough to work a deal with them to get SEO Book training subscribers $100 off their designs, which start out at $765 for a basic design and $975 for a design + a WordPress theme. Their services page is here, and the coupon code is here.
Let's take a look at a web strategy that has a number of SEO and marketing benefits: the hub and spoke strategy. A hub and spoke strategy is when you create one authoritative domain (the hub), and then hang various related websites off that domain (the spokes).
If you don't yet have an authority site, it's probably best to focus on that one site. However, once you've built an authority hub, it can be a good idea to specialize in a number of niches using multiple, smaller sites.
Let's look at a few reasons why, in the context of dominating a niche.
Economic theory holds that division of labor increases profitability.
During the early days of the web, it was easy to make money by being a generalist. However, as the web got deeper and richer, it became difficult to maintain a generalist position unless you had significant resources.
Specialization, by way of niches, allows for greater targeting, and this targeting can increase value. Leads and advertising become more valuable, because the target audience can be reached more efficiently.
The hub and spoke approach is this theory in microcosm. The hub is the generalist authority, whilst the spokes allow for niche specialization.
We'll see how this dove-tails with SEO shortly.
If you were to create a series of sites on different topics, it might take a significant period of time to know each area well. However, if you create niche topics within your own area of expertise, you should be able to create new sites very quickly.
Why would you create new sites? Why not just stick with one?
Let's say your main site is fairly broad in its appeal. However, you've discovered some lucrative niche keyword areas within that broad topic area. By creating spoke sites, you can focus on these keyword areas, and dig deeper, without compromising the general appeal of your main site.
An example might be a hub site that is aimed at community education, whilst spoke sites might cover private tuition, corporate learning materials, and education facility hire.
This segmentation can be done in a number of ways. You could aggressively target one search engine's algorithm and/or audience (MSN) with one spoke, whilst targeting another search engine on another spoke. One site might be aimed at do-it-yourself people, whilst another site is aimed at a person looking to hire a professional. Both sites cover the same topic, but require a different approach in terms of language, structure, offer and tone.
Likewise, you may use spoke sites for brand reasons. When Google bought YouTube they wisely kept the YouTube name, as the brand appealed to users. Google Video - not so much. There is a general perception that YouTube does video, and Google is a search company, and never the twain shall meet.
Google knew better than to force the issue.
A hub site on education that links out to pharmaceutical affiliates could easily get hit by Google. The relationship between the two areas is questionable. However, if you link out to your spoke sites, that cover related niches, your link pattern will be much more acceptable.
From an SEO standpoint, it can be difficult to get links to purely commercial sites. If you have a hub site that already has link authority - or is created specifically to attract links - then you can pass this authority to your more specialized spokes. Once the spokes become more popular, you can either pass that authority along to yet more specialized sites (one way), or even promote your hub site (reciprocal). Either way, the link graph makes sense.
Each site doesn't need to be directly profitable. You can use one site to attract links, and pass this authority on to your monetarized domains. One can subsidize the production of the other.
If you've already built up name recognition in your niche, you'll find it easier to get links and press attention for your new projects.
Status is important because if no one knows who you are, they probably don't care about your content so much. If Danny Sullivan or Matt Cutts writes something, it will instantly get attention because of who they are and the trust relationship they have with their audience. If you're new to the SEO space, no matter how profound your content is, it could easily get overlooked.
This is why it can be more difficult building multiple areas across unrelated niches. You may need to establish yourself in each new area, which can be a lot more difficult than leveraging your name recognition in your existing niche, then going granular.
Enhanced Monetarization Opportunities
We've looked at how you can target the most profitable areas aggressively using a hub and spoke strategy, without affecting the main brand.
Other advantages include economies of scale. As your network grows, you have more ad inventory to sell people. The inventory can be segmented, as opposed to the advertiser having to accept a one-size-fits-all approach of a generalist site. Similarly, you may be able to demand higher affiliate payouts, because you can precisely target offers.
Note that few of these pages have any relevant on-page content. Is this a case of Google-bombing? Or did Google dial up the .edu bonus too far?
Does Google want to return all the irrelevant pages? Or does it not matter if they are deep enough in the result set? Will having mystery meat results on pages 2 through 100 hurt Google's brand? Or does everyone just click on the first page?
Another question we received recently from the SEOBook.com community was:
What qualities are common in Aaron Wall, DaveN, Bob Massa, Jason Duke, SugarRae, et al, that new SEOs can adopt to come closer to people like these in expertise? And where do most new SEOs go wrong when they set learning priorities?
I've asked these people to provide their views, which I'll get to shortly.
It's a great question, because the avalanche of SEO information that confronts the beginner can be overwhelming. How do you know what information is important? What aspects do you really need to spend your time on, and what information do you need to reject? What are the qualities that make for a good SEO?
Let's take a look...
Most people stumble into being an SEO.
An awareness of SEO usually comes about when a person launches a site, only to find that the site doesn't magically appear #1.
Soon after, the webmaster will likely find themselves knee deep in SEO forums and blogs, where everyone has a viewpoint, and often those viewpoints contradict each other. Contradiction is rife in SEO. To understand why, we need to understand the history of search engines.
The first step in setting learning priorities for SEO is to.....
Infoseek was one of the early search engines. Infoseek introduced a feature around 1996, whereby they would crawl a site and update their index immediately. This feature made it easy for webmasters to game the algorithm.
I had just launched a small, commercial site. I thought all I had to do was publish a site, and the search engine would do its job, and put me at number one! Unsurprisingly, that didn't happen.
So, I tried to figure out why Infoseek didn't think my site was great. I could see that there were sites ranking above mine, so there was clearly something about those sites that Infoseek did like. I looked at the code of the high ranking sites. Did that have something to do with it? To test that idea, I cut and pasted their code into my own code and republished my site. Voila, I was at number 2!
So far, so good.
But why wasn't I number one? The sites that were ranking highly tended to have long pages on the same topic, so I added more text to my pages. Soon enough, with a little trial and error, I was number one. Predictably, Infoseek soon pulled this feature when they saw what was happening.
I was clearly not alone in my underhanded trickery.
At the time, I thought my cut n paste trick was an amusing hack, but I wasn't earning my bread and butter from the internet. I was working in the computer industry, and unaware of "SEO". I soon forgot about it.
A few years later, a whole cottage industry had sprung up around SEO. The search technology had become a lot more sophisticated. My dubious copy n' paste hack no longer worked, and the search engines were locked in a war against webmasters who were trying to game their ranking criteria.
There is an inherent conflict between the business model of the search engine, and that of the SEO. The SEO wants their site to rank, the search engine wants to rank a page a searcher will find useful.
That isn't necessarily the same thing.
Therefore, the search engines are notoriously secretive about their ranking formulas. SEOs try to reverse engineer the formulas, or just guess the factors involved, which is why you'll see so many contradictory viewpoints.
So who do you listen to? What information is relevant?
2. Technical Know-How
Dave Naylor had this to say about doing too much at once:
Common qualities? That's simple: we notice the little things and understand the larger impact that they will have in the long term.
And where do most new SEOs go wrong when they set learning priorities?
From the new SEOs on the block that I chat to, they seem to run at a million miles an hour trying 100 different things at once. They need to slow down, get a decent data set of information, slowly pick through it and test small things one at a time, and work out things like: why is it that when I search for The FT in Google it returns Grand Theft Auto?
Most people new to SEO place a lot of emphasis on the technical aspects. It's natural to seek out the secret recipe of high rankings. Whilst most forums obsess over these issues, much of what you'll read is irrelevant fluff. These days, SEO is more about a holistic process, rather than an end unto itself.
Start with a solid, credible source - like SEOBook's course for example ;) The cost of a well researched course is nothing compared to the time you may spend heading in the wrong direction.
Most people will benefit by applying the 80/20 rule. To rank in Google, you need to be on-topic, you need to be crawlable, and you need to have inbound links.
You could spend a lifetime trying to figure out the other 20%. Unfortunately, the formula is in Google's hands, and even then, only known to a few. It is reasonable to assume Google tweaks the dials often, especially once a common exploit makes the rounds. Take Dave's advice and take it one step at a time. Focus on the key aspects first - relevance, crawlability and linking - then methodically test and evaluate in order to expand your knowledge.
I honestly think the only way anyone can go wrong, new to online promotion or a seasoned veteran, is to look too hard for tricks and magic beans from those who make their names posting those so-called tricks in forums.
I believe anyone can be successful at online marketing or even traffic generation and search engine placement specifically, if they just stop looking for ways to trick machines and instead look for ways to connect with humans.
Search engines are just computer programs and algorithms written by humans. The engine is only a tool intended to help humans do the things that are important to their lives faster and easier. I think machines can help with connecting humans, BUT the humans are the target, the goal, the end that machines can provide the means to.
I think one thing that is common among the list of people you mentioned is that they all realize, understand and accept that concept.
3. Strategy & Goals
The opportunity in SEO lies in the fact that Google must have content, around which it places advertising. If you rank high, you get "free" clicks.
Of course, nothing in this world is free, and SEO is no exception. There is significant time cost involved in getting lucrative rankings. And that cost comes with a reasonable degree of risk. Google has no obligation to show you at position x, and your competitors will always try and eat your lunch.
Strategy is the most important aspect, and one you should spend a lot of your time on. Why are you trying to rank? Are there better things you could be doing i.e. building up a community? Do you have an on-going publishing model? How is your brochure-web site ever going to attract links? Are you building enough link juice to ensure your entire 500K page affiliate site gets indexed?
I think some of the general principles that apply to most of them are that they are: smart, curious, hard working, blunt, honest, and sharing. They also view SEO as a tool to help them achieve other goals, rather than having SEO be the end goal.
Where a lot of people go wrong with SEO is that they try to think in concrete numbers based on a limited perspective built off a limited set of data. Some things may happen sometimes, but there are very few universal truths to the shifting field of SEO beyond preparing for change. And the certain lasting truths do not provide much competitive advantage...that is built through curiosity, testing, hard work, and creativity - Aaron Wall.
It's surprising how little time is spent talking about measurement, because without it, SEOs are flying blind.
One common metric is rank. It's not a very good metric, because it doesn't tell you very much, other than you've won the ranking game.
But so what?
What if that rank doesn't help you achieve your goals? What if every person who clicks on your link ends up buying from the guy who is advertising on AdWords instead?
This is why measurement, aligned with your goals, is important. If you track SEO efforts through to a goal, and most of those goals tend to involve making money, then you'll be head and shoulders above most of the forum hacks and pretenders. It doesn't matter what tracking software you use. Become an expert in tracking and metrics.
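As a minimal illustration of goal-aligned measurement (the numbers and function names here are invented for the example, not taken from any particular analytics package), the idea is to tie each keyword's traffic to money rather than to rank:

```python
def value_per_visit(visits, conversions, avg_order_value):
    """Estimate the revenue value of one organic visit for a keyword.

    visits: organic visits attributed to the keyword
    conversions: goal completions (e.g. sales) from those visits
    avg_order_value: average revenue per conversion
    """
    if visits == 0:
        return 0.0
    conversion_rate = conversions / visits
    return conversion_rate * avg_order_value

# A #1 ranking that converts poorly can be worth less per visit
# than a lower ranking that converts well:
top_spot = value_per_visit(visits=1000, conversions=5, avg_order_value=50)   # 0.25 per visit
fifth_spot = value_per_visit(visits=300, conversions=9, avg_order_value=50)  # 1.50 per visit
```

The comparison makes the point above concrete: rank alone tells you nothing about whether the traffic achieves your goals.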
Before you do, it pays to take a look at your competitors. By choosing the right sites to compete against, you can gain significant advantage.
Firstly, you need to position your offering relative to your competitors.
1. What Problem Do You Solve?
Making money is mostly about solving problems. Write down the problem you're going to solve. Be specific.
Provide auto repair training to amateurs
Sell bomb detectors to airlines
Sell ice to Eskimos
For this article, we'll use the idea "sell ice to Eskimos". No doubt you've already spotted the problem with this rather lousy business model, but let's have a look at what a bad idea looks like within this evaluation process.
2. Who Is Your Audience?
You may have noticed I included the prospective audience in my examples above. Know what you're selling, and to whom.
Demographics, in other words.
Who are your customers? What do they want? What type of language do they use? Build up a profile.
In our example, our customers are Eskimos. Eskimos live around the North Pole region, mainly in Siberia, Alaska, Canada, and Greenland. Internet access is obviously going to be an issue, not to mention language barriers, which is about the point the idea should die.
Yet, surprisingly, many prospective web businesses never address this simple question. Various Web 2.0 businesses clearly didn't ask questions 1 & 2. Presumably they jumped straight to the "how can we get a few million dollars in VC?" question instead.
3. Where Are Your Customers Hiding?
You need to get in front of your audience.
SEOs know a lot about keyword research, so have a huge advantage over others when it comes to finding out who their competitors are, and where the opportunity lies.
Find out the keyword terms your potential audience use, conduct searches, and make a note of the big players under those keyword terms. Keep in mind that language people use on search engines is always changing. There are more queries that are longer and more specific. These give you valuable insights into how to position your offering.
What questions are people asking? What problems are they trying to solve? What are the many different ways they describe that problem? What keyword areas are your competitors missing? What value can you provide that they do not? Have your competitors missed lucrative keyword areas?
4. What Is The Nature Of The Market?
You should look for rapidly growing markets. You want to avoid established, declining markets, unless you can provide a new layer of value that is difficult for competitors to emulate.
Take a look at the type of sites you intend to compete against. Are they big companies? Are they hobby blogs and thin affiliate sites? It's going to be much easier for your new site to compete against the thin affiliates and hobby projects than it is to compete against the establishment.
One of the common stumbling blocks at this point is solving a non-problem. The "ice to Eskimos" market is not dominated by established players, or hobby blogs for that matter, but there's a good reason for that - Eskimos don't have a "lack of ice" problem. Beware of the Web 2.0 trap - solving the non-problem.
5. What Related Markets Exist?
If the market you were thinking of entering is competitive, are there any closely related markets you can enter? You can find these areas by looking for patterns in the keyword research results.
You might notice there are numerous searches for fitness locations i.e. a gym, a center, a club. So, instead of targeting fitness in terms of health, which would see you up against established health organizations and generalist publications, you might want to target the fitness center section of the market e.g. a comparison of gyms and centers. Such a niche could possibly be more lucrative, as there is a clear money making opportunity as people need to pay to join these facilities.
Which brings us onto...
6. Is there Potential To Make Money?
Just because a lot of people are doing something, doesn't mean it is worth doing.
Some areas are difficult to monetarize. Science, for example. And social discussions. Some areas are saturated, making it very difficult to find a new value layer to add. SEO, for example.
How easy would it be for one of the major players to copy your value proposition? Look for areas that have a clear path to monetarization and that aren't dominated by major players, or saturated by sites with little to distinguish them.
I have been fond of the depth of Majestic SEO's data and the speed with which you can download millions of backlinks for a website. While not as hyped as similar offerings, Majestic SEO is a cool SEO service worth trying out, and their credit based system allows you to try it out pretty cheaply (unless you are trying to get all the backlinks for a site as big as Wikipedia)...as the credits depend on the number of inbound linking domains.
They give you data on your own domain for free, and share a nice amount of data about third party sites for free. For instance, anyone can look up the most well linked to pages on SEO Book free of charge.
What made you decide to create Majestic SEO?
We arrived at it naturally - our main aim with the Majestic-12 Distributed Search Engine project is to create a viable competitor to Google. We use volunteers around the world to help us crawl and index the web data. This project was started in late 2004, and about 2 years later it became clear that we needed to be as relevant as Google, and in order to do that we had to master the power of backlinks and anchor text. As time went on and many hundreds of terabytes of data were crawled, it also became clear that we needed to earn money as well in order to sustain our project. It took well over a year to reach a level we felt confident enough with to release it publicly in early 2008.
What were the hardest parts about getting it up and running?
The most difficult part was to avoid the temptation to simplify the problem and focus on a subset of data much smaller than that indexed by Google. We felt that would be a mistake, as you can't really be sure that you have the same view of the web unless you are close to Google's scale.
Once it was decided to follow the hard path, a lot of technical scalability problems had to be solved, and then we had to deal with the financial aspect of storing an insane amount of data using a sane amount of hardware.
You use a distributed crawl, much like Grub did. What were some of the key points to get people to want to contribute to the project? How many servers are you running?
The people that joined our project did so because they felt that Google was quickly becoming a monopoly (this was back in 2004) and a viable alternative was necessary. We have over 100 regulars in our project who run the distributed crawler and analyser on well over 150 distributed clients; all this allows us to crawl at a sustained rate of around 500 Mbit/s.
Since we recently moved closer to the commercial world with Majestic-SEO, it was decided that our project participants will benefit from our success by virtue of share ownership - essentially, project members are partners. It needs to be stressed here that our members did not join the project for financial reasons.
How often do you crawl? How often do you crawl pages that have not been updated recently?
We crawl around 200 million URLs every day. At the moment our main focus is to grow our database in order to catch up with Google (see analysis here), however we have dedicated some of our capacity to recrawls. In February we should release a new version of automatic recrawls of important (high-ACRanked) pages, and this will allow you to see competitor backlink building activity pretty quickly. Our beta daily updates feature shows new backlinks found in the previous day for registered or purchased domains; this gives you a chance to see new backlinks before we do a full index update (around every 2 months).
What is AC Rank? How does it compare to Google's PageRank?
This measure is not as good as PageRank because it does not yet "flow" between pages. We are going to release a much improved version of ACRank soon.
Do you have any new features planned?
Can't stop thinking about them ;)
You allow people to export an amazing amount of data, but mostly in a spreadsheet basis on a per site basis. Have you thought about creating a web based or desktop interface where people can do advanced analysis?
We offer a web based interface to all this data, with the ability to quickly export it in CSV format.
For example, what if I wanted to know pages (or sites) that were linking to SearchEngineWatch.com AND SearchEngineLand.com but NOT linking to SeoBook.com AND have a minimum AC rank of 3 AND are not using nofollow. Doing something like that would be quite powerful, and given that you have already done the complex crawl I imagine adding a couple more filters on top should be doable. Another such feature that would be cool would be an Optilink-like anchor text analysis feature which allows users to break down the anchor text percentages.
We do have powerful options that enable our customers to slice and dice data in many ways, such as excluding backlinks marked as nofollow or only showing such backlinks. This applies to single domain analysis, however; something like the interdomain linking you describe in your example will be possible soon.
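In the meantime, the kind of interdomain query described above can be approximated by post-processing CSV exports yourself. A rough sketch (the file columns, field names, and threshold are assumptions for illustration, not Majestic-SEO's actual export format):

```python
import csv
from collections import Counter

def load_backlinks(path):
    """Read a hypothetical backlink CSV export with columns:
    source_domain, target, anchor_text, ac_rank, nofollow."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def linking_domains(rows, min_ac_rank=0, include_nofollow=True):
    """Set of referring domains, optionally filtered by ACRank / nofollow."""
    return {
        r["source_domain"]
        for r in rows
        if int(r["ac_rank"]) >= min_ac_rank
        and (include_nofollow or r["nofollow"] != "true")
    }

def anchor_text_percentages(rows):
    """Optilink-style breakdown: share of backlinks per anchor text."""
    counts = Counter(r["anchor_text"] for r in rows)
    total = sum(counts.values())
    return {text: 100.0 * n / total for text, n in counts.items()}

def interdomain_query(sew_rows, sel_rows, seobook_rows):
    """Domains linking to both of the first two sites but not the third,
    with ACRank >= 3 and nofollow links excluded - the example query."""
    sew = linking_domains(sew_rows, min_ac_rank=3, include_nofollow=False)
    sel = linking_domains(sel_rows, min_ac_rank=3, include_nofollow=False)
    seobook = linking_domains(seobook_rows)
    return (sew & sel) - seobook
```

This is just set arithmetic over flat exports; a native implementation inside the crawl database would obviously be far faster at Majestic's scale.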
Have customers shared with you creative ways to use Majestic SEO that you have not yet thought of?
We get good customer feedback and often implement customer requested features to make data analysis easier. As for new creative ways our customers prefer to keep them to themselves, but once you look at the data you might see one or two good strategies on how to use it. ;)
How big of an issue is duplicate content while crawling the web? How do you detect and filter duplicate content?
It is a very big issue for search engines (and thus us), as many pages are duplicates or near duplicates of each other, with very small changes that make them hard to detect. We do not currently detect such pages (we crawl pretty much everything), though we have a good idea how to do it and will implement it soon. Our reports tend to show data ordered by importance of the backlink, so often it is not an issue, though it depends on the backlink profile of a particular site.
A lot of links are rented/bought, and many of these sources get filtered by Google. Does your link graph take into account any such editorial actions? If not, what advice would you give Majestic SEO users when describing desirable links vs undesirable ones?
At the moment our tools report factual information on where backlinks were found; we do not currently flag links as paid or not. This is something that humans are good at and computer algorithms are not - that's why Google hates such paid links so much. We do have some ideas, however, on how to detect topically relevant backlinks (paid links would usually come from irrelevant sites) - it's coming soon and might actually turn out to be a ground breaking feature!
Microsoft has done research on BrowseRank, which is a system of using usage data to augment or replace link data. Do you feel such a system is viable? If search engines incorporate usage data will links still be the backbone of relevancy algorithms?
BrowseRank is a very interesting concept, though we are yet to see a practical implementation in a large scale web engine. I don't think such a system obsoletes link data at all; in fact it is based on link data just like PageRank, only it allows the most relevant outgoing links on a page to be detected - essentially, such votes should be given more weight in PageRank-like analysis. For example, imagine that this very interview page is analysed using BrowseRank and it finds that the following cleverly crafted link to the Majestic-SEO homepage is clicked a lot; such a link could then be judged as the real vote that this page gives out!
This approach would also help identify the more important parts of on-page content, so that keyword matches within those content blocks could get a higher score in ranking algorithms. So I actually think there is a lot of mileage in the BrowseRank concept, but it would be a mistake to think that it will completely replace the need for link data analysis. I am pretty sure Google uses something like this already - Google Toolbar stats would give them all they need to know.
The great irony in my view is that Microsoft lacks good web graph data to apply their browsing concept; this is probably why they are so desperate to buy Yahoo!'s search operations, who are much better when it comes to backlink analysis, though Google are the real masters. Majestic-SEO is trying to slot itself just behind Google, and who knows what happens after that ;)
I look up a competing site and see that a competitor has 150,000 more links than I do and feel that it would take years to catch up. Would you suggest I look into other keywords & markets, or what tricks and ideas do you have for how to compete using fewer links, or what strategies do you find effective for building bulk links?
First of all: don't panic! :)
Secondly, use the SEO Toolbar, which will query our Majestic-SEO database to show the number of referring domains - it may well be very few.
Thirdly, consider investing in the detailed stats we have on this domain: these will tell you the anchor text used and the actual backlinks, which you can analyse by their importance (we measure it using ACRank). Once you see real data a lot of things become clear: for example, you can see whether your competitor has lots of backlinks pointing just to the homepage or spread around the site. Seeing actual anchor text is a real eye opener - it can show which keywords a site was optimised for, and this will allow you to make a good decision about whether you can catch up or not. Chances are you may find that your competitor is weak for some keywords; this is where a keyword research tool like Wordtracker is invaluable.
And finally consider that a few good relevant backlinks are likely to be worth more than many irrelevant ones: it is those backlinks that you want to get and knowing where your competitor got them should help you create a well targeted strategy.
You allow people to download a free link report for their site. How does this report compare to other link sources (Yahoo! Site Explorer, Alexa links, Google link:, and links in Google Webmaster Central)?
We give free access to verified websites. This is a great way to try our system, and you might see backlinks that you won't find elsewhere, because our index is so large and we show you all the backlinks we've got (rather than the top 1000). This will include backlinks from "bad neighbourhoods" (these are not yet automatically marked by our system, but visual human analysis wins the day here) that you may not be shown in other sources.
We believe that our free access reports are best in class; since it's free, why not find out for yourself?
For analyzing third party sites you have a credit based system. How much does it cost to analyze an average site?
The price depends on how large (in terms of external referring domains) a particular website is. We have some sites with hundreds of millions of backlinks, so the average would vary widely depending on what you are really after. The best option is just to run searches on our website for the domains you are interested in; this will give you very interesting free information as well as the price for full data access.
For a domain like Wikipedia I might only want the links to a specific page. Are you thinking about offering page level reports?
Yep, I am thinking of it - I have actually had requests like this, with, funnily enough, Wikipedia being the main object of interest.
What is the maximum number of links we can download in one report?
Our web reporting system tries to focus on the most valuable backlinks to avoid information overload; however, we allow a complete dataset download that includes all backlinks - some of our clients have retrieved data on domains with well over 100 million backlinks! Using our powerful analysis options you can focus on backlinks for particular URLs coming from particular pages and retrieve all qualifying data.
1. What Is Spam?
There have been SEOs who have argued - with a straight face - that whilst it's ok for them to game search engine algorithms, it's not ok for others to do so. This is usually because the other guy isn't following "the rules".
What are the rules?
The rules are decided - and vaguely defined - by the search engines, and then interpreted to mean whatever an SEO decides they mean. Far be it from a search engine to create rules that serve their own business interests, which may not align with the interests of the webmaster.
SEO is built on shifting sands. What do you do when what you were doing was "within the guidelines" and no longer is because the rules change? Do you willfully decide to rank lower?
Conclusion: Spam is what the other guy does. Also an acronym for "Sites Positioned Above Mine".
2. How To Create Meta Tags
Hard to believe now, but forum wars were fought over how many times a webmaster could repeat a keyword in a meta keyword tag. Twice was often deemed ok, but any more than that and you were almost certainly an "evil spammer" (see #1).
Meta tag manipulation doesn't count for anything these days. The tags are mainly used to describe the content of pages, which the search engines may display as snippet text.
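For reference, the tags in question look like this (the site and values here are made-up placeholders); only the description is likely to surface as a snippet, while the keywords tag is largely ignored by the major engines:

```html
<head>
  <title>Blue Widgets - Acme Widget Co</title>
  <!-- May be shown as the snippet text under your listing -->
  <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
  <!-- The subject of the old forum wars; now mostly dead weight -->
  <meta name="keywords" content="blue widgets, widgets">
</head>
```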
Conclusion: Deader than AltaVista
3. Is SEO Ethical?
A curious framing of SEO in terms of ethics and morality.
Is it good and proper to try to get a higher rank than the search engine would bestow otherwise? The point of SEO is, of course, to get a higher rank than the search engines would bestow otherwise.
These people were usually in the wrong game. Many went on to join Seminaries.
Conclusion: Welcome to the jungle
4. Should I Buy Links?
Paranoia runs rampant in SEO, especially when search engines make an example of someone, like SearchKing.
The argument quickly descended into a semantic war i.e. define "paid". Money changing hands? Favors? Nepotism? Erm...Yahoo Directory? One of the more interesting conclusions often got buried: "Hey, perhaps if Google dislike them so much, paid links really do work!"
Conclusion: Yeah, they work
5. Should There Be SEO Standards?
A natural progression of the ethical debate. It was proposed that SEOs should all conform to a common code of practice, as other professions often do.
The problem was that the relationship between search engines and SEOs has always been grey. Only the search engine can really define what the search engine wants, and what the search engine wants might not align with what the SEO, or their client, wants. In any case, the search engine isn't going to publicly define exactly what they want, as they are worried that people, like SEOs, will game their systems.
So you got a few self-appointed search police officers who would suggest that everyone follow their particular code of practice, based on their interpretation of the search engines' guidelines. The self-appointed cops usually outnumbered those who followed them, and invariably disagreed amongst themselves anyway.
Before Obama was sworn in, he launched Change.gov under The Office of The President-elect. Change is an easy concept to grasp after 8 years of crony capitalism and fraudulent wars built on lies from international war criminals. But "change" in and of itself is a tool, not a destination...where is it going?
While launching a plan to increase government spending by nearly a trillion dollars to "create" millions of jobs, upgrade infrastructure, and computerize the national health care system (what hidden costs might come to individuals from that "change"?), Barack announced that his plan to spend this money will be tracked under Recovery.gov.
Part of Obama's spending plan involves "expanding broadband access to millions of Americans so businesses can compete on a level playing field, wherever they are located." But the playing field has never been level (just look at how the bankers rewrote the consumer bankruptcy laws to shaft consumers a few years before the banks were begging the government for trillions of dollars of handouts).
Investments in the web will increase the value of web assets, but the increased competition will make it harder to gain attention and exposure unless you have capital to invest, and invest it wisely. Just this week a corporation worth tens of billions of dollars put one of our core keywords in the page title of their home page! We still outrank them, but for how long?
I thought it would be at least a decade before the United States government started domaining. Many large corporations are sure to catch on soon, increasing domain prices and closing off a great investment opportunity for smaller players. I have been busy over at BuyDomains looking for good names to hoard and build, picking up another 3 yesterday. SEO Book members have access to a coupon code to get 15% off BuyDomains domain names on our member discounts & coupons page.
Continuing on with our community questions, here are a few requests for specific ranking information:
"What are the 100+ variables Google considers in their ranking algorithm?"
Easy to say, hard to do. Take a job at Google, work your way up the ranks and join the inner circle.
Another question we received is along the same lines:
How do you outrank a super established website in your niche, one where Google is giving site links and their domain is older?
Again, easy to say, hard to do. Either forget outranking the domain and buy it, or spend time doing exactly what they have done, and hope they also stop their SEO efforts in order to let you catch up.
These types of questions arise often. "If I could just learn a few quick-fix insider secrets, I can outrank everyone!"
If there was a quick n easy secret formula that would guarantee high rank, why would those who know it, reveal it?
The reality is that quick-fix secret formulas don't exist.
Sure, there are quirks in the algorithms that can be exploited, but they are often trumped by historical factors, like authority metrics, that are difficult to fake. One common blackhat technique is to hack an established domain, and place "money" pages on that domain. That's an admission, if ever there was one, that technical trickery on your own domain is either too time consuming, or doesn't work so well.
I know some of the world's top SEOs, and I can't recall them spending much time talking about secret sauce. What they do talk about is making money and growing empires. They're more focused on the business strategy of SEO.
The effectiveness of many SEO techniques will be dead soon, anyway.
What you need to think about for the future is user interaction.
The Future Of SEO
Have a read of this document, by my good friend and Merlot drinker, Mike Grehan. Mike outlines his view on the future of search, and he makes a number of important points:
The web crawler model is nearing the end of its useful life
Signals from users, not content creators, will become more important
Universal Search changed the ranking game forever
Forget rank, think engagement
If you want to future proof your SEO strategy, take heed of Mike's words.
The crawler model is failing because the crawler was designed for structured text, not multimedia. The crawler can't see behind pay-walls. It has trouble navigating databases in which the data isn't interlinked or marked-up. The search engines will need to look for other ways of finding and making sense of data.
Social networks, blogs, Twitter etc indicate a move away from the webmaster as signaler of importance i.e. who you choose to link out to. The search engines will need to mine the social signals from those networks. The user will signal where their attention is focused by their interaction and paths.
Universal search, in many cases, has pushed results listings down below the fold. For example, to get a client seen high up on the results page may involve making sure they are featured on Google Maps. Similarly, if they have video content, it should be placed on YouTube. Google have shown they are increasingly looking to the aggregators for results and featuring their content in prominent positions.
That list of search results is becoming more and more personalized, and this will continue. Who knows, we may not have a list before too long. More and more "search" data - meaning "answers to questions" - might be pushed to us, rather than us having to go hunt for it.
The future of SEO, therefore, will be increasingly about engaging people. The search engines will be measuring the signals users send. In the past, it's all been about the signals webmasters send i.e. links and marked up content.
For now, you still need to cover the obvious bases - create crawlable, on-topic content, backed by quality linking. But you'll also need to think about the users - and the signals they send - in order to future proof your site. Google has long placed the user at the center of the web. Their algorithms are surely heading towards measuring them, too.
What are these signals? Ah, now there's a question.....
I have had a very well known SEO company dust one of my best link building strategies (by outing it directly to a Google engineer), because I was trusting enough to mention how effective it was inside our training program, thinking that a competitor would not out it. But I was wrong! At least I know what to expect, and can use that knowledge to mitigate future risks.
One of the common concerns about the SEO Toolbar is something along the lines of "does it phone home" or "are you spying on us" or "what data is it sending you". Some SEO companies offer a huge EULA and do spy on the people who use their toolbars, but we do not do that, for a number of reasons:
I felt rather angry when that well known SEO company outed my site (and haven't really trusted them since then)
I never really liked the idea of spying on customers, and going down that path could harm our perceived brand value
knowing that information is kept private adds value and builds trust
we are already under-staffed (running quite lean) and have more projects to work on than time, so we are not in need of new projects
It is pretty obvious that the trend in software (since the day I got on the web) is that open source software is commoditizing the value of most software products and tools. Providing tools that require limited maintenance costs and provide access to a best of breed collection of SEO tools makes it easy for us to evolve with the space and help our customers do so, without building up a huge cost sink that requires raising capital and having to listen to some icky investors. :)
The reason we can (and do) provide so many free SEO tools is because I feel doing so...
extends opportunity to more people around the globe (anyone who is just fresh starting out like I was ~6 years ago could use the help)
commoditizes the value of some bloated all-in-one SEO software (many of those products generally lack value and misguide people)
makes it hard for con-artists to sell hyped up junk (by commoditizing the value of their offerings to all but the most desperate of get rich quick folks)
helps to educate potential future customers (when we did a survey recently about 80% of our customers have been practicing SEO for over a year)
is an affordable distribution strategy for brand awareness
builds trust by delivering value for free (rather than trying to squeeze every penny out of potential customers)
is a big differentiator between us and most SEO websites
In addition to all the above points, most of the tools we create are tools I want to use. So the cost of building them would still be there even if we did not share them. Sharing them gets us lots of great user feedback to improve them, and does not cost us much relative to the potential upside.
Small Industry, Lightweight Strategy
Rather than centralizing things, we like to rely on a distributed software strategy which has a much lower cost structure.
That strategy allows this site (with a popular blog, an array of tools, some videos, training modules, and an active community) to run on 1 server. We find the Plenty of Fish story inspiring, though doubt we will need his distributed computing skills anytime soon given how small our industry is. After 5 years we are still millions of visitors and over a billion monthly pageviews behind Plenty of Fish :)
Though we are doing ok in our little corner of the web :)
We have analytics on our website to help us see where we are getting coverage, and to measure and improve conversions (an area ripe for opportunity given our brand exposure and site traffic). We may add relevant affiliate links and offers to some of our SEO tools to help pay for the tens or hundreds of thousands of dollars we spent developing our various tools (for example, see how we integrated a link to our Wordtracker keyword guide and the Wordtracker keyword research service in our keyword tool). But we have no need or desire to spy on users who download our tools. Spying and outing are poor strategies for professional SEOs to employ....they erode trust and value.
"SEO as we know it will be dead within the next 2 years – true or false? With the wealth of info at their fingertips combined with localized, customized search to name but a few Google will no longer need to do what it does now to determine rankings?"
I'd say "false".
People have been predicting the death of SEO since, well, the beginning of SEO. Here's a debate from 2004, and another from 2006. These arguments probably started around 1995.
So long as search engines display a list of sites, for which payment is not required, SEO will exist.
How SEO is done will change. It has always changed. In the bad old days, SEO was all about getting listed in the Yahoo Directory. If you didn't, you were pretty much invisible. There was a time that listing with Looksmart got you decent rankings in MSN. These days, few of those new to SEO have even heard of Looksmart.
Google will certainly adapt and change, and use a variety of metrics in order to determine relevance. SEOs will adapt and change, trying to work out what these metrics are.
Recently, Eric Schmidt made the following comment:
The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. Speaking with an audience of magazine executives visiting the Google campus as part of their annual industry conference, he said their brands were increasingly important signals that content can be trusted.
So, having a brand might be a signal of quality, which may, in turn, lead to a higher rank. Or perhaps Schmidt was just playing to the audience of newspaper owners. Difficult to tell ;)
Google collects a wealth of usage data from toolbars, analytics, and their ad systems, so it is conceivable they might fold these metrics into their ranking systems. Marissa Mayer recently suggested that SearchWiki data might be used in ranking calculations.
Will the bar get raised? Will SEO become more difficult? Of course. But a raised bar works two ways. If you can reach it, there's a new barrier between you and those who follow you. That gives you some level of defensibility.
So how do you do SEO going forward?
I've written a lot about the importance of holistic strategy. Your aim should be to sell something to people - be it an opinion, a product, a service. All your endeavors should support this goal, and most of the time, that means doing the basics well - make your site crawlable, well linked, and solve a genuine problem for people. If your SEO efforts are not resulting in an improvement in the bottom line, then there is little point doing SEO.
"I believe anyone can be successful at online marketing or even traffic generation and search engine placement specifically, if they just stop looking for ways to trick machines and instead look for ways to connect with humans".
There are so many SEO tasks demanding your attention. How do you prioritize them?
Seems to be a common issue, as when we asked for questions a while back, we received this one:
"What aspect of SEO should you be spending most of your time on? Optimizing the title tag, getting links, creating quality content? "
So which area of SEO will give you the most bang for your buck? Link building? On-page? Social media? Ask ten different SEOs, and you'll likely get ten different answers.
Let's take a step back and start with strategy.
1. Define Your Goals
Without clear goals, it's difficult to know how to spend your time. Start by listing your goals.
Do you want to sell services or product? Do you want to increase traffic levels? Do you want to increase brand awareness? Knowing which SEO dial to twist depends very much on what goal you're trying to achieve.
Once you have your list, create a set of KPIs. KPI stands for Key Performance Indicator. KPIs will give you a set of metrics to help you decide if you're meeting, or missing, your goals.
Here are a few examples:
Rank top ten for keyword term x in Google
Increase traffic from search engines by *x* percent by *date*
Get 1,000 new sign ups from search visitors in March
Sell ten widgets per day to search visitors by next week
The most useful KPIs are specific. You either hit the target or you miss.
Your strategy will be defined by your goals. For example, if your goal is to sell ten widgets by next week using a new site, then your strategy might be to forget SEO for the meantime, and focus on PPC instead. If your goal is to get 1000 new subscribers by the end of the year, then you might spend a lot of time analyzing your demographic, determining where they hang out, and getting your name and content in front of them at every available opportunity. If your goal is to get #1 for term X, then you'll be focusing a lot on link building, using keyword term X in the link.
And so on. Your goals define your tactics.
Once you have a list of clear objectives, and a clear list of KPIs, the next step is to consider the age of your site.
2. What Type Of Site Do You Have?
One of the most important tasks for new sites is link building. The sites with the highest quality linkage tend to trump sites with lower quality linkage when it comes to rank.
Until you build links, then tweaking on-page aspects of SEO on a new site won't make a lot of difference in terms of rank. Get the basics right - keywords in the title tags, keyword focused content, strong internal linking, a shallow structure and good crawlability - but focus your efforts on attaining links. If that means establishing a large body of quality content first, then so be it. Others may choose to buy their way up the chain, or aggressively pursue social media opportunities.
The opposite is true for an established site. Whilst links are always important, an established site can leverage on-page factors to a greater degree.
Once your site has built up sufficient link authority, then you may only need to add a new page of content, and link it internally, in order to rank well. People running established sites may wish to focus more on producing quality, focused content, and let the linking look after itself.
3. The Five Most Important Areas Of SEO On Which To Spend Your Time
These are highly debatable, but here's my ranking:
1. Produce Remarkable, Attention Grabbing Content
Everything starts with remarkable content i.e. content worth remarking on and linking to. Do you have unique, timely content? Does your content solve a problem? Does your content provide a new insight? Does your content spark controversy? Does your content start - or contribute to - a conversation?
2. Ensure Your Content Can Be Crawled
If your content can't be crawled, you won't rank. Ensure your site is easily accessible to both humans and search engine spiders.
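One quick crawlability sanity check you can automate is whether your robots.txt is accidentally blocking spiders from pages you want ranked. A minimal sketch using Python's standard library robots.txt parser; the site, paths, and rules below are all made up:

```python
from urllib import robotparser

# Hypothetical robots.txt rules, parsed from a string here for illustration.
# In practice RobotFileParser.read() fetches http://yoursite.example/robots.txt.
rules = """\
User-agent: *
Disallow: /search/
Disallow: /cart/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages you want ranked should come back True; checkout/search pages
# being blocked is usually intentional.
print(rp.can_fetch("Googlebot", "http://yoursite.example/widgets/blue"))   # True
print(rp.can_fetch("Googlebot", "http://yoursite.example/cart/checkout"))  # False
```

Running a list of your money pages through a check like this takes seconds, and catches the embarrassing case where a site-wide Disallow quietly blocks the pages you most want indexed.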
3. Build Links
Google's algorithm is heavily weighted towards links. Beg, buy, or earn links, and rankings follow. Get your keywords into the link text. Building links also means building relationships with people. Spend a lot of time doing this, especially in the early stages.
4. Title Tag
It is debatable how much ranking value the title tag has, but it definitely has click-thru value. Your listing fights for attention with all the other links on the page. What will make people click your link?
Learn the lessons of Adwords. Match your title tag to the keyword query. Solve a user's problem. Arouse curiosity.
5. On-Page Content
Forget endless on-page tweaking. Largely a waste of time. Instead, keep a few keyword phrases in mind when writing. Use semantic variations of your terms in order to help catch long tail terms. Link your page to related pages, using keyword terms in the link.
Bonus: Watch Your Competition. Do What They Do
Download the toolbar. And keep a very close eye on your competition. Whatever they do, you need to do more of it :)
SEO used to be a technical exercise involving the isolation of specific factors that, when tweaked, lead to higher rank. It still is, to a certain extent, but much less so than it used to be. Therefore, there is little point looking at each factor in isolation.
SEO has become a lot more holistic and strategic, so by far the most important aspect is clearly outlining your goals, and defining a strategy to achieve those goals.
I thought it would be worth highlighting a few of the advanced features in the SEO Toolbar. Some of the highest value ideas do not consist of looking at one data point, or boiling things down to 1 arbitrary and meaningless number (like many "professional" SEO tools do), but consist of looking at many data points across multiple sites, and hunting for inconsistencies that help you build new profitable traffic streams. Along those lines, I thought I would run through a few ideas to get your juices flowing...there are dozens more like these :)
In the past when you did something quite cool and attention-worthy, people would reference it on their blogs. But now, in the age of Twitter, many people mention your stuff on Twitter instead. This can be good if they have thousands of Twitter followers, but if most of the people mentioning a topic are all in the same small tight-knit space, then you are only reaching a fraction of a fraction of the potential distribution you would have had before the age of Twitter.
How many people read every Twitter update from the people they subscribe to? Very few. Since you are in a high volume aggregator the loyalty is nowhere near where it is with traditional blog subscribers.
Exciting news quickly falls into the archives due to the rapid nature and high volume of Tweets.
If you dominate a channel and keep reaching the same people over and over again that does help provide social proof of value, but after seeing the same message 5 or 10 times it becomes noise.
most people do not have automated mechanisms to dump their daily Tweets / Tweet links into their blog to provide trusted direct links
people rarely use Twitter as a bookmarking service, so yesterday's content is rarely worth searching for. The Twitter content is very zen-like...here today, gone today.
Multiple people asked me to add their RSS feeds to the default set in the SEO Toolbar that was soon to be downloaded by over 10,000 webmasters. And despite wanting all that exposure (and future exposure), they didn't even post about it on their blog. They mentioned it on Twitter...where the same 3,000 people saw the message 20 times each. No value add whatsoever.
Out of over 21 pages of Tweets (300+ Tweets) mentioning "SEO Toolbar" in the last 3 days, Yahoo! is showing less than 10 inbound links to the SEO Toolbar page that came from sources other than direct friend requests, social news sites, or automated links brought on by that exposure. Twitter is pretty worthless as a link building strategy, even if you are giving away something that is both free and better than similar tools selling for hundreds of dollars.
Even if you have a strong launch and a product far superior to related products, the exposure you get may not matter if your coverage is stuck on Twitter. It is a connecting medium, but it doesn't make money:
Venture Beat says that Twitter made Dell a million dollars. That's nuts. Did the phone company make Dell a billion dollars? Just because people used the phone to order their Dell doesn't mean that the phone was a marketing medium. It was a connecting medium. Big difference.
Is Twitter a nice complementary channel that can add exposure to your launch? Absolutely. But if the conversation does not leave Twitter.com then it has quite limited value in a search-driven Google-centric web. And that limited value is even less if you don't already have thousands of Twitter followers.
The "make money on Twitter" ebooks will be coming out soon, but other than the ebook authors, I doubt anyone will make much money from it (unless customer feedback helps them create new product lines).
“Practice not-doing and everything will fall into place.” - Lao Tzu
Did you take a vacation?
If you took a break, I hope you had a good one! I've just returned from a relaxing holiday - it is summer where I am, the weather is great, and life is lazy and fine.
Holidays provide a great opportunity to reflect and take a new perspective, so one thing I tried to do was to step away from the internet. I didn't take a laptop with me on holiday. Needless to say, I really missed it. After all these years, I suspect I may as well be hard-wired into the interweb.
However, out there amongst the isolated dunes, I was reminded that....
Most Stuff Doesn't Matter
Most blog posts don't matter. Most news doesn't matter. Most Tweets don't matter. Social networks don't matter. These things can quickly become a meaningless distraction.
What's worse is that we often miss the important things going on, because there is too much irrelevant clutter fighting for our attention. When I returned, there was so much stuff I hadn't read.
But was I any worse off?
Not really. I quickly came up to speed again by selecting a few important sources, and reading those.
It didn't take me long.
With this in mind, it was time to do some weeding and make a fresh start. My feed reader had become ridiculously cluttered.
Hard To See The Wood For The Trees
How many feeds do you subscribe to? Do you have a lot of unread items?
I certainly did.
Using my RSS reader had become a chore, mainly because I'd subscribed to so many feeds over the years that I was never, in reality, going to read. All those unread items just made me feel guilty. I needed to reduce the clutter.
So I took a chainsaw to it.
I asked myself - what are the one or two sites in any given vertical that provide me with genuine value? Could I name them without looking at them?
It was actually surprisingly easy, especially given the rather useful historical usage data. Once I answered this question, I kept the truly useful feeds, and deleted everything else.
My feed collection now feels very Zen. No more news re-writers or trivia about who is doing what to whom. It's simple, elegant and best of all, a lot more useful than it was before.
What Is Your Desert Island List?
Your list will probably differ significantly from mine, but I thought I'd share a few sites, and try to see if there was any pattern to my choices.
One pattern was a fondness for good aggregation. By subscribing to one good aggregation site, I pretty much know what is going on in the generalist tech world, without the need to subscribe to numerous individual blogs. One such site is Techmeme. Techmeme does a good job of harnessing the wisdom of crowds, by being selective about who is a member of that crowd.
The other thing I noticed was that I chose blogs with a distinctive personality behind them, coupled with an established reputation. For example, I read pretty much everything Danny Sullivan writes, because what he writes about is important.
Finally, there are the "official" blogs from the big companies in search - the blogs that form the horse's mouth. Most of Google's blogs appear in this folder.
Do you notice any patterns to your RSS selections?
Getting Noticed In Crowded Markets
One problem with my approach is that it tends to be elitist. I'm concerned I'm going to miss upcoming writers who don't yet appear on the establishment radar.
Were you planning to start a blog this year? Have you done so, but are having problems getting noticed?
This article is a good reminder of the essential factors you need when you plan to enter a crowded market:
"You can choose to sell to different people, such as small businesses; you can find new distribution channels; you can stratify the industry's price points by introducing a luxury class; or, you can redefine your selling proposition," he says, noting how Starbucks (SBUX) revolutionized the coffee shop by selling an experience rather than just a beverage. "However you choose to be different, you must be great at the basics and exceptional at your defining factor."
That last part is killer. If I look at my RSS choices, they all have those defining features.
Recommended Search Reading
By no means conclusive, but I guess that's the point :)
Please share your killer sources with the SEOBook community in the comments.
Search Engine Land - Great editorial. Also features some of the top search writers as columnists and feature contributors
SEOBook - How could this not be on anyone's list! ;) Aaron writes some of the most useful SEO instruction in this vertical.
Earlier today I accidentally uploaded an older version of the SEO Toolbar that does not have the update option built into it. If you downloaded it earlier today, please download it again from http://tools.seobook.com/seo-toolbar/
The current version should be 1.0.1 (rather than 0.1). Sorry about the error on the updating part...but you won't have to download it again after this time...the update feature will work, and it is safe to just download it now as it will write over the earlier version of the extension.
What would happen if you smooshed together many of the best parts of Rank Checker, SEO for Firefox, the best keyword research tools across the web, a feed reader (pre-populated with many SEO feeds), a ton of competitive research tools, the ability to compare up to 5 competing sites against each other, easy data export, and boatloads of other features into 1 handy Firefox extension? Well, you would have the SEO Toolbar.
Loren Baker is holding a 3 day spring break SEO get together in Deerfield Beach, Florida. There are only 200 tickets, cheaply priced at $500 each (a bargain when you consider that only 200 people will attend, and that speakers include Loren Baker, Chris Winfield, Todd Malicoat, Brent Csutoras, Rae Hoffman, and more).
Conclusion #1: The more they know about you as an individual, the more likely they will be to try and track and, as required, exploit or manipulate you - be it as a consumer, as a citizen i.e. a polity member, as a (perceived) health hazard, as a (perceived) sociopath, as a (perceived) security risk, etc. etc.
Conclusion #2: The better they are able to categorize you (aka slap some generalized "profile" of theirs onto you), the easier it will be for the process to become self-perpetuating and auto-referential: anything you may do or avoid doing (as tracked and monitored by them) will actually only reinforce their hold on you - both as an individual and as a member of whichever societal group or subgroup you may belong to.
Every listing site or review site has to start off from scratch at some point. Over the past 3 or 4 years it has gotten much harder to rank thin affiliate database sites, and it is only going to get harder now that Matt Cutts is asking for spam reports on empty review sites.
Of course if Amazon.com or TripAdvisor or Craigslist open new sections they can probably get away with using duplicate or thin content based on the strength of their brands. Branded networks can always throw out a new related niche site and have it be seen as being above board:
The internet is fast becoming a "cesspool" where false information thrives, Google CEO Eric Schmidt said yesterday. "Brands are how you sort out the cesspool."
But new competitors are going to have a hard time building the budget and funding the brand exposure needed to rank because SEO is getting more complex, and if you don't have enough brand or enough AdWords spend you pretty much are not going to get the exposure needed to get consumer reviews and rank organically, unless you license/steal/borrow/mix/re-mix content to build an opening "reviews" database. Some software tools, like Web Data Parser, make the process easier, but you still need to wrap everything in some type of value add (good design, mash ups, etc.). Or have great public relations. Or start your site off as an editorial only play, where you review what interests you, and then move the brand into the reviews space after you get some momentum and an organic traffic flow.
Matt Cutts explained how thin listing pages may be against their guidelines
Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.
Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches…
Search is growing more subjective, becoming more about competition and expanding the ad channel. Think like a black hat. You have to stay ahead of Google's internal products & services if you want to avoid the spam label.
The shopping search engines/price comparison sites spend enough on AdWords to be considered a value add user experience (they give AdWords a broad backfill baseline inventory which other merchants have to compete against), but if Google can evolve their Product Search into a revenue stream and encourage reviews then many shopping search engines will soon run out of steam.
I believe that the locus of advertising will gradually shift towards the creation of valuable and compelling content. There is, however, a relative dearth of professionals or companies that can provide such content creation services. Perhaps advertising agencies might evolve in this direction, or perhaps this may be an opportunity for forward-thinking individuals?
Eventually Google will need to become more of a content play if they want to keep growing revenues. This is why...
projects like Knol, YouTube, SearchWiki, and even Panoramio are core to their strategy
And if Google co-opts the media, that makes it hard to give them serious negative press. Eric Schmidt thinks the press needs to be more tightly integrated into Google
I think the solution is tighter integration. In other words, we can do this without making an acquisition. The term I've been using is 'merge without merging.' The Web allows you to do that, where you can get the Web systems of both organizations fairly well integrated, and you don't have to do it on exclusive basis.
With online ad networks slowing in growth and some collapsing I thought it would be a great time to interview the team from YieldBuild, an ad optimization service, to see where they saw online ads and ad networks heading.
What online ad markets are dominated by companies other than Google?
With its acquisition of Doubleclick, Google definitely does dominate both the largest publishers and the long tail of smaller content sites, but the fact it's the dominant player doesn't necessarily mean that it's exerted a monopolistic presence. At the high end, Doubleclick competes with (Microsoft's) Atlas, Yahoo, and (AOL's) Platform-A, while the long tail is served by a wide range of smaller ad networks, like Chitika, AOL's Quigo, Tribal Fusion, and Blue Lithium, in addition to the likes of Yahoo and Microsoft, which recently released its contextual ad network, Pubcenter. And when you go beyond traditional online media, the market's still open for mobile, games, and video. And, of course, lead generation (CPA) is a fragmented market that's certainly not dominated by Google.
Steve Ballmer mentioned how advertisers + search volume build off each other to create a higher yield. Do you see online advertising becoming a natural monopoly market?
I don't, because although economies of scale have an important part to play in establishing the pecking order among firms, there are other ways in which an ad network can successfully compete.
Google certainly benefited from being an early, aggressive mover in the space, and it is clearly the dominant player, but there is still substantial opportunity that is being capitalized on by other networks. Smaller ad networks fighting Google with a more generalized approach can still offer lower pricing, because heavy bidding with Google and limited high-quality inventory means that Google can't necessarily always provide the best value proposition to advertisers, so they offload to other networks. But there will always be other ad networks that either nail a specific vertical market well enough to be an attractive option for both niche advertisers and publishers, and smaller ad networks will also continue to innovate, creating engagement and pricing (payment) models that work better for some advertisers and publishers.
Google as a network can't do all these things - the market is just too big and too complex. But there is a way for Google to be the dominating player in the market by owning the marketplace, by opening up its platform to other advertisers, as it has done. Google doesn't derive any direct benefit by doing so, but one predominant ad delivery platform creates the liquidity in the market that makes Google's economies of scale matter.
What are some of the more innovative things smaller ad networks have done to gain ground on Google?
As I said before, Google draws tremendous strength on the economies of scale it's developed. But there are a large number of firms that can do some things better - either creative, targeting, deeper integration with advertisers - that can give it a leg up on Big G, enough to carve itself a profitable niche (at least that's been the case until now). Some have taken on the creative capabilities of traditional ad agencies, merged that with innovative, unique technology, and created online advertising formats that deliver better response/engagement. But targeting - being able to deliver a specific audience segment that advertisers want to reach - is something that smaller, vertical ad networks have been able to do better than Google or any of the larger, more general networks.
Do you see any ad networks playing the opposite role of Google? Google started a search engine to have an ad platform...do you think an ad network will ever build a search service around itself?
It's conceivable. I can see ad networks wanting to own properties that contribute valuable inventory. Search is an attractive piece of online ad revenues, but the competitive nature of search and the massive R&D budgets getting put into it make it unlikely that an ad network would be able to organically expand to include a consumer search service. It's much more likely that an ad network would build a search service through a partnership with Google, Yahoo or Microsoft, but it's a tremendous challenge to change people's search habits when Google does the job so well (although Ask, Yahoo, Cuil and others have certainly tried!). The closest I've seen is the one developed by Snap; their in-text links to pop-up windows include a small "Search the Web using Snap.com" below the related content.
As Google builds more verticals (local search/maps, checkout, Google Product Search, book search, searchwiki, etc.) and adds features to their ad program (checkout logos, product links) do you see an eventual advertiser backlash happening against them?
Yes, although this is just a natural progression from partner to competitor as Google expands its feature catalog. I can give you one example from our own experience. HubPages, a site that we started in 2006, began as a partner with Google, offering users of the site AdSense revenue for publishing unique content on our site (we were the first site that used the AdSense API to manage this). HubPages has become hugely successful, and is a terrific revenue maker. Two years later, Google launches Knol, which is similar in many ways to HubPages. Naturally, it remains to be seen if Knol will ever become as successful as HubPages, but it's not surprising to us that Google would see how lucrative the business is and try to enter the market.
How many ad networks are typically in strong rotation on each YieldBuild client site?
It is hard to say, because there is no typical. We have a lot of publishers who just use YieldBuild to optimize their AdSense. Others already have a relationship with a display network like Advertising.com or Tribal Fusion, and they'll add that to the networks we optimize for them. We do typically recommend that publishers cap impressions at around two per visitor per ad network each day; this can matter if a site gets a lot of repeat traffic.
How often do you guys rotate through services to test them? Do you use earnings data cross sites to help improve yield?
The entire process is done through an algorithmic approach that uses performance data from the ads tested to determine the networks, formats, and layouts that generate the maximum revenue. YieldBuild is constantly testing, looking at changes, and adapting its algorithm to produce better results for our publishers.
Some ad networks build added services in them to personalize the advertising experience. Do you see such personalization algorithms boosting yield?
I don't know of any data confirming it generally, but I can easily imagine that services which tailor each ad's creative or message to the visitor would boost yield. It's certainly been the case in our testing on HubPages that personalized/targeted campaigns generally do well, although the results are uneven.
What do you guys typically see performing better: text ads or image ads?
It completely depends on the site, page, and, most specifically, zone on the page. Sometimes a display/image ad on a page will do well, sometimes a text ad will, and often both will work well in rotation with each other. There is no way to know for sure unless you test; each site monetizes differently. It's certainly true that high CPM display ads are the holy grail, but there aren't enough of them (especially these days) to go around, so the goal should be to optimize your inventory with the best-performing ads, text and/or image, that are available.
How has the ad slowdown affected the network rotation ratios? Were smaller networks hit more than bigger ad networks?
I think it's too early to tell, but from our purview, rates are down across all networks. This is a pretty rough time regardless, though, since Q1 is weak generally. As the year wears on, I do think the biggest difference will be display vs text, mostly because text ads' CPC pricing model fits tighter marketing budgets better than display's CPM model, but we don't have the data yet to tell.
What baseline optimization ideas should a publisher implement before going to a third party for ad optimization help?
There are a lot of things that a publisher can do; some are simply applying best practices (like blending ads with the background, or embedding them in content), some involve a bit more work (testing). I've written a number of posts on our blog about how to optimize ads for blogs, forums and other sites. Naturally, a one-size-fits-all approach won't work best for everyone--you have to do more involved testing or use a service like YieldBuild--but it is better than putting in ads haphazardly and hoping for the best.
When does it make sense for a publisher to go with a third party ad optimization platform? Is the leading issue revenue, impressions, time, etc.?
I would say that unless optimization is a fun hobby for you and you enjoy it, or unless you're making little/no money and don't care about the revenue, then it's worth it to use a third-party optimization platform like ours. We haven't surveyed our users yet, but I'd guess the leading reason is maximizing revenue, with avoiding the hassle of constantly tweaking ads coming in second place. Just finding the best ad sizes, positions and optimal number of ads to display for each page is very daunting to do manually, given its ongoing nature and the complexity of permutations. Beyond maximizing revenue and saving themselves time and trouble, platforms like YieldBuild also offer ad network management and deep analytics (comprehensive, consolidated reporting) which help publishers get insight into what inventory and traffic is earning them money.
Some ad networks (like Federated Media) often get quotes or other input from publishers and use it to help build the ad campaign. Can publishers work with those types of networks and YieldBuild at the same time?
YieldBuild doesn't do any campaign management; we're not a classic ad network. Rather, we support a number of ad networks that our publishers typically already have a relationship with. Federated Media is a bit of a different animal in that it works on an exclusivity basis; i.e. you have to agree to allow them to manage all of your site's ads, so I'm not surprised that they allow the publisher some input in shaping the campaign.
Blend vs contrast: which usually works best? When should a publisher consider using each?
The rule of thumb is to blend, especially above the fold and with white/light backgrounds. Below the fold, and with dark backgrounds, sometimes a color very close to the background works better, and sometimes a highly-contrasting, even bright, color works well. But often there's substantial benefit to nailing the exact right color, as in this example: http://blog.yieldbuild.com/2008/03/24/myth-all-ad-units-on-a-page-should...
Google AdSense offers a heat map for ad placements. Do you think it is fairly accurate? What ad placements have you found that worked surprisingly well?
I would say it's a pretty good rule-of-thumb. It underscores that placement does matter, especially placing ads above the fold and juxtaposed/embedded in content. I hate to keep dropping links to our blog, but there is an example here that's interesting, because even at the handful-of-pixel level, the precise positioning of ads matters: http://blog.yieldbuild.com/2008/02/06/exact-position-of-ads-matters/
Do you feel that banner blindness will eventually carry over to other ad "units" to where advertising eventually has to leave the standard format size?
The IAB standard sizes have enormous value to the online ad industry because they help advertisers buy media at scale; too many custom ads just create too much friction for both advertisers and publishers, and relying on them would make the whole industry suffer. That said, although ad size is only one dimension that a viewer can become "blind" to (position, color, style, and format all matter, too), there will probably always be a market for custom ad solutions--there's an opportunity for combo packages that include a non-standard, custom ad product along with a lot of standard ads that publishers can slot in easily.
Excessive advertising on content can cut away at usability and site growth. What is the optimal number of ads/ad units that a publisher should display on a page? What are some ways people can include more ad units in their pages without making the pages look too ad heavy?
There was a study done on this recently that I blogged about, and if you're not using something like YieldBuild that makes that determination for you (YieldBuild will often serve less than the maximum number of ads per page, because this actually does improve page revenue), then I would probably do some sort of testing. Of course, it depends not only on the number of ads, but on their size, intrusiveness, how long your page is, etc. But you could always start conservatively, then slowly add more ad units and carefully monitor bounce rates. When bounce rates climb to an unacceptable level (minus ad clicks, naturally), then you could pare back the number of ads. This is assuming you don't have a way of A/B testing, which, of course, would probably offer better results.
Have you guys discovered great strategies for monetizing social media?
There is no one standard approach that works beautifully. Social media sites tend to monetize poorly, at least relative to their traffic - worse than original content sites. (This is something that even heavyweights like Facebook and YouTube are struggling with.) That doesn't mean that there aren't ways to improve what you are earning. Finding the right combination of ad networks, formats and layouts for your specific site can boost revenue. We have a large number of social media sites - some small, some very large - that are seeing impressive gains to their earnings by optimizing.
Who is the ideal client for YieldBuild? What types of publishers (site size, vertical, content type, etc.) can expect to see the biggest lift from working with you?
We actually work really well for just about every publisher. Naturally, larger sites will get through the training period faster, so they'll see improvement to their revenue more quickly. We've worked well for a lot of different types of verticals and content types: we optimize content sites, social networks, forums, blogs, and have seen success in all types. Occasionally we don't work well for a site, but we haven't determined any sort of pattern.
Thanks Jason. If you want to learn more about YieldBuild check out the below video or visit their site.
Since hosting is a high value field we can build a list of keywords that are selling for $1 or more by using the advanced filters. For lower value fields there is no need to use a price filter.
Enter each keyword here and export the top 800 keywords from each list. If Google shows fewer than 800 keywords you only need to export 1 list.
If the tool returns exactly 800 keywords (the export cap) you can use the keyword competition level as an additional filter, breaking the list down into two smaller lists that can each be exported. Using negative keywords (ie: hosting -web) can also help you dig deeper.
If some services (like fax servers and email hosting) do not fit your business model then you can add them as negative words to help the tool return fewer irrelevant keywords and more relevant keywords.
Step 3: Combine & Sort Data in Excel
Delete the unneeded columns (everything but keyword, click price, and search volume - optionally you may want to keep keyword competition level as well).
Copy and paste all the lists into 1 Excel spreadsheet.
Once you have that list add a column that uses the equation click price * estimated monthly search volume.
Sort your list by keyword value
Use the de-duplication tool to remove duplicate data.
Filter irrelevant keywords by hand, and/or remove rows containing x (ie: where x is the word free, or email, or some type of hosting/server you have no interest in providing).
Cut off the bottom of the combined list. Set a lower threshold based on how large a site you want to build and how rapidly the keyword values fall off.
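The combine, de-duplicate, score, filter, and sort steps above can also be scripted instead of done by hand in Excel. Here is a minimal sketch in Python; the function name, the stop word list, and the sample hosting keywords are all hypothetical illustrations, and real exports would need their columns mapped to (keyword, click price, monthly volume) rows first:

```python
def build_master_list(*exports, stop_words=(), min_value=0):
    """Combine exported keyword lists, dedupe, score, filter, and sort.

    Each export is a list of (keyword, click_price, monthly_volume) rows.
    Keyword value = click price * estimated monthly search volume.
    """
    seen = set()
    rows = []
    for export in exports:
        for keyword, cpc, volume in export:
            key = keyword.lower().strip()
            if key in seen:  # remove duplicates across the combined lists
                continue
            if any(w in key for w in stop_words):  # drop irrelevant terms
                continue
            value = cpc * volume
            if value >= min_value:  # cut off the bottom of the list
                seen.add(key)
                rows.append((keyword, cpc, volume, value))
    return sorted(rows, key=lambda row: row[3], reverse=True)

# Hypothetical sample data standing in for two exported keyword lists.
list_a = [("web hosting", 8.50, 90000), ("free hosting", 2.10, 40000)]
list_b = [("web hosting", 8.50, 90000), ("vps hosting", 6.00, 12000)]
master = build_master_list(list_a, list_b, stop_words=("free",), min_value=1000)
# master is sorted by value: "web hosting" (765000.0) before "vps hosting" (72000.0)
```

Note the stop word filter would drop "free hosting" here; as mentioned below, whether you actually want that depends on your business model.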
As an example, here is a list of ~ 4,000 hosting related keywords. Please note I did not filter out various irrelevant terms like free, because some hosting business models rely on pitching free hosting and then selling upgrades.
Step 4: Map Out Keywords Against Your Site Structure
Essentially you want to cluster relevant keyword groups together and try to map them out against your site structure, planning out page titles, URLs, on page H1 headings, meta description tags, and internal anchor text.
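A crude way to start that clustering is to bucket keywords by a shared modifier, then assign each bucket to a planned page or section. This sketch uses a simple substring match; the modifiers and sample keywords are hypothetical, and real keyword grouping usually still needs manual review:

```python
from collections import defaultdict

def cluster_keywords(keywords, modifiers):
    """Bucket keywords by the first modifier they contain; leftovers go to 'other'."""
    clusters = defaultdict(list)
    for kw in keywords:
        for modifier in modifiers:
            if modifier in kw.lower():
                clusters[modifier].append(kw)
                break
        else:  # no modifier matched this keyword
            clusters["other"].append(kw)
    return dict(clusters)

keywords = ["cheap web hosting", "dedicated server hosting",
            "best cheap hosting", "linux dedicated hosting", "hosting reviews"]
pages = cluster_keywords(keywords, ["cheap", "dedicated"])
# each bucket maps to one planned page: its title, URL, H1, and anchor text
```

Each resulting bucket suggests one page, with the modifier informing the page title, URL, H1 heading, and internal anchor text.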
There are other tools similar to the Google Search-based keyword tool (like Microsoft Ad Intelligence). You may want to watch these videos if you need help visualizing the process.
Disclaimer: The above strategy relies on pulling keywords from an ad network. It will thus have a commercial focus and a bias toward high search volume keywords. If you want non-commercial keywords then use a variety of keyword tools and/or focus the above method on grabbing some keywords with low bid prices as well.
Here are 7 quick ways to expand your keyword strategy:
If you have an established website with significant web traffic you can look at your analytics data for additional keywords and keyword modifier ideas.
Track industry news, blogs, and forums to see what people are talking about. Conversation creates search demand.
Use keyword suggestion tools built into browsers and search toolbars, and visual keyword cloud tools like Quintura to come up with additional keyword ideas.
If you are creating an affiliate or review site make sure to hunt down leading brands to review. Much of this work can be done by choosing good competing sites to draw keywords from in competitive research tools.
If you are advertising on AdWords you can login to your Google account then use the Google Search-based Keyword Tool and they will show you a list of up to thousands of additional keywords they think you should advertise for (one list a friend showed me had over 6,000 keywords in it).
The business cycle: Someone has a good idea - creates a company - creates a movement - creates profit - gets corrupted - becomes what they despise (leaving an opening for the next person with a good idea).
In many businesses financial interests eventually exceed the purpose of the business in importance. Which leads to ethical decay and sleazy behavior. Corner cutting starts off small, but keeps growing until the house of cards collapses. The small print keeps getting smaller until it creates a big deal:
The letter, posted on the FDA's Web site, notes that the ad presents risks associated with the drug in "extremely" small type that fails to adequately convey the serious risks connected to the product. Humira's label carries a black-box warning, the FDA's strongest, that details risks of tuberculosis and other infections, some fatal.
Some companies are founded on lies. Some businesses are only profitable because they lie. Some industry organizations exist exclusively to perpetuate lies. Some industry spokesmen are no better than whores - selling their mouths to the highest bidder.
As marketing becomes more integrated into the web, the web becomes more integrated into our lives, attention becomes more scarce, media is dominated by public relations talking points, and more scandals are surfaced by the glut of information, the need for (and value of) people who are willing to speak the truth keeps increasing.
I pay ~ $100 a year to subscribe to the Wall Street Journal. I would pay well over 10 times that to access Barry Ritholtz's blog (if he charged for access). If a company stays small it does not need to keep finding (or creating) additional growth...there is no need to work the books or lie to the public to please investors.
In circumstances where there is even the slightest chance that the result of failing to deal with a possible situation would be the death of the world, then, if it wishes to survive, the human race has no option but to take whatever action is necessary to deal with that situation, however unpleasant and difficult that may appear to be, and to take it at once.
Money becomes a tool and a means to an end rather than something that controls you. For most people, money becomes so important that it clouds judgment with regards to ethics, it breaks or makes relationships, and can devastate lives (winning the lotto or going broke). The less you focus on money (and, I'd argue, the more you are able to control money rather than let it control you), the more income you are able to generate. Very non-intuitive, but true.
Instead of worrying about money or competition, online publishers need to worry about creating the mood.
Since blogging about a toy flower I bought my wife, I have received multiple emails from guys who said they bought them for their wives. And one member went so far as to pay Mahalo Answers $10 to find the exact flower in the video. Think of how irrelevant a toy flower is for the audience of this site, and yet that casual mention likely caused over $100 of commerce to happen. What more if the offer was relevant to this site's audience?
If people want media free then I think it makes a lot of sense for publishers to use affiliate links to get a cut of the action. I reviewed SEM Rush before they had an affiliate program, and that likely added thousands of dollars in business for them. They have since added an affiliate program, and I have since gone back to that old post and added affiliate links. Their affiliate payout is 40%, but I still have not earned that much because their price point is too low for what their tool does. They could increase their price 400% and it would still be a good value.
A lot of high payout affiliate marketing is sleazy (diet scams, hyped network marketing, reverse billing fraud, cookie pushing, etc.) but if you have a real site with decent reach you can profit significantly from giving people discounts and recommending high value offers that you are proud to endorse. I shared some affiliate program integration tips in the forums, highlighting a couple of our blog posts that made 4 figures and 5 figures each.
Some areas of the economy are going well still, but many are crumbling. If you are a publisher that monetizes via advertising one way to make up for lost earnings is to add more ad units to a page, but if something is not working then doing more of it is probably a bad solution for fixing the problem. If ad prices drop then adding more irrelevant and cheaper ads to the site will not do much for increasing your current revenue, and it might sacrifice some of your future revenue!
Another option is to show fewer ads to create a better user experience. Under-monetize today to make your site look more appealing, increase usability, be more linkworthy, and steal market-share from larger competitors. If the ad market goes too far out of whack you can always move from being an ad seller to becoming a strategic ad buyer. If you gain enough marketshare to tread water while competing businesses and business models collapse then you are doing well, while positioning yourself nicely for the rebound.
Then when the market turns up you can place more ads on your site and monetize aggressively while it is actually worth doing so. Then you would have higher monetization * better rankings & traffic * more ad units per page, with each input compounding the earning power of the next.
On some sites we only monetize key pages while keeping others as white as snow and easy to link at. For new sites I skip the ads until the site gets some links, starts ranking, and has a solid traffic stream. I like to call this line of thinking conditional advertising, where you adjust your monetization strategy based on your site's market position and the condition of the market.
After seeing the rapid rise of Tip'd, I figured it would be a good idea to interview Muhammad Saleem, a social media addict who knows social media both from site manager and site participant perspectives. You can follow him on Twitter.
How did you get into social media?
I got into social media by reading James Surowiecki's Wisdom of the Crowds. I think it's one of the best books written so far on the topic. The lessons from that book and Gladwell's The Tipping Point are essential for anyone who truly wants to understand social, collaborative media, and viral marketing.
Some friends I know who have been actively involved in social media got burned out quickly. How do you keep it interesting after years of experience?
I think it's important to love the fundamentals of social media, and to be interested in the relationships and conversations, and the theory behind it. If you're genuinely interested then you won't get bored; in fact, I read 2-3 books a month on the topic and each one makes me appreciate it even more (I'm currently reading Groundswell).
On larger sites is social media largely a game of reciprocal voting, or is there something deeper to it?
I think a lot of people misunderstand what reciprocal voting really is. Consider this: people that are friends usually have similar interests and preferences (hence they bond and are friends), and when you have similar interests and preferences, of course there is going to be a large degree of voting for each other's content. Keeping that in mind, I really don't see this as you rub my back and I'll rub yours; it's more like we share the same interests, we are friends, and naturally vote on each other's submissions.
When there are thousands of people hunting for stories how do you manage to find new ones that have not yet been discussed? Do publishers give you exclusives?
There are definitely people who message me and say 'Hey, do you think this story would do well on Digg (or StumbleUpon, or...)?', and since they are friends or acquaintances, if the content is good, I see no reason why it shouldn't get exposure. Apart from that, I really don't have to 'hunt' for content much. I usually find most of my submissions when I'm just browsing my favorite websites and other social news sites.
From talking to friends it seems there is a lot of payola in social media. What percent of the top 100 and top 1000 contributors to sites like Digg, Reddit, Propeller, and StumbleUpon engage in payola?
There is quite a bit of payola that goes on but fortunately that's all short-term because the sites and users get banned pretty quickly. I routinely get some pretty spammy emails about payola and without exception I forward them to firstname.lastname@example.org and let them deal with it.
How heavy is the user overlap amongst the big social media sites?
The user overlap is pretty heavy in terms of registrations but it's not that heavy in terms of activity. For example, most of the top users on Digg are also on StumbleUpon and Reddit, but Digg is their primary social news activity, and they participate much less on the others. The same is the case for many top Stumblers and Reddit users, they will be on multiple sites but use them much less than their primary community.
How does a person decide if social media should be a core part of their marketing strategy?
It all depends on what your conversion goals are and what vertical you're in. For example, if you're trying to get affiliate sales, make money from advertisement, get newsletter (or other) subscriptions, and so on, social news is probably the worst place you could go because most of those users have Adblock Plus installed and have a severe case of banner blindness. You should consider the following: your demographic, their social technographics profile, their interests and preferences, and your conversion goals before deciding if social media should be part of your online marketing strategy. Even then, most people default to Digg - social news is just one aspect, don't forget social networking, online video, online communities and forums, and so on.
What sort of marketing tips would you give to a person who said that their site simply did not fit existing social media sites?
People focus too much on social media sites and often have too narrow a definition of the term (i.e. social news - Digg). First of all, I doubt there is any niche where 'social media' won't work; there is always rudimentary stuff like sharing content on microblogging or aggregator sites (Twitter and FriendFeed) or social networks (Facebook). And if that doesn't work, go back to the basics and participate in your blogging community, which is something you should be doing anyway. Work with other bloggers in your niche to increase your audience and give them exposure at the same time.
When Tip'd launched you got a lot of coverage from bloggers. What was key to making that happen? Were you surprised with that level of coverage?
We didn't hire a PR firm or write up an official press release. Instead we reached out to people who we had relationships with and asked them if they could do a write-up, and only approached sites whose audience would enjoy the coverage. Actually I was personally a bit disappointed with the coverage. It seems I gave too much credit to some 'pundits' and 'gurus' who ultimately didn't have the foresight to appreciate why Tip'd is an important development in the social space.
What were some of the biggest keys to getting Tip'd up and growing?
There are several important considerations. The site has to function properly and has to be simple enough so that Joe non-techie can use it but also robust enough that more tech-savvy users can enjoy it. It has to score high on design, usability, and branding. You need to have a good pitch to draw people in. And finally you need to build relationships with publishers in the space. We were able to avoid the 'chicken and egg problem' of "no one wants to participate if there is no existing community, but you can't build a community if no one participates" by building and leveraging relationships with key players in the personal finance and financial news blogosphere.
What is the biggest mistake you feel you guys have made with Tip'd so far?
I don't think we've made any missteps so far. If the site is limited in any way, it's because we have decided to build, market, and grow it entirely ourselves, without taking funding from anyone. Think about it: this is a bootstrapped operation that started with $25,000 in funding, took 3 weeks to launch, and didn't push for any PR. Even with all that, our growth rate and the feedback are largely positive. Just yesterday a marketer messaged me and said "even with 24 votes, a front page story on Tip'd sent me 100 visitors, while with 75 votes on Mixx, they only send about 25 visitors." If a site with $25,000 in funding is already driving a larger audience than one with $3.5 million in funding and relationships with mainstream media, I think we've come a long way in three months.
Many niche social news sites have come to market. How many of these do you think will be successful and still around 5 years from now? What will separate those that succeed from those that fail?
I don't think even 25% of them will succeed. The problem, I believe, is that they are all self-centered and don't have a forward thinking vision. What will separate the successes from the failures is a focus on the site's own community, but also relationships with publishers. Community participation is one aspect of growth on social news sites, but people really underestimate how big a role online publishers and marketers play.
Words that sound good are often used in marketing by those in dire need of credibility, or by those promoting a warped view of reality needed to justify their own business models. Through such usage, many catchwords and phrases take on an Orwellian opposite meaning. Some examples:
below radar network = huge obvious footprint
ethical SEO = while the search engines are not our clients, we put them above our clients (or, we are lacking in creativity)
industry standards & certification = low paying job, little to no competitive advantage
cheap original content = recycled, machine translated, or obviously outsourced
blog links = comment spam
automated link building solution = bot driven spam
spam = sites positioned above mine (think this one came from Brett Tabke)
I think my wife came to say hi to me when I was sleeping a week or so ago and I started singing "you are my sunshine" in a rather annoying voice (not sure where I first heard the song, but think it was maybe school chorus class). She started laughing and wanted to know where I got that annoying voice from (I wasn't sure but maybe it was my way of being annoying in chorus).
Few thoughts are original...so if I thought of it, I figured someone else had too. So I went to YouTube and searched for "you are my sunshine." A couple of normal versions came up, and then there at #6 was something that looked like it could be annoying. Perfect!
After seeing the singing flower in that video I had to buy one, and that took me back through Google and eventually to eBay, where it cost $20 after shipping. My wife loved the flower and was totally surprised by it.
What if YouTube not only listed related media, but also pulled related products/services/offers from Google Base (or other structured sources) and could charge merchants on a CPC basis or a percentage of the transaction?
Most of the people on YouTube are looking to be entertained and to waste time, but if media companies want their media to be free on YouTube then maybe they should try to sell that media in other formats and sell physical goods. YouTube has huge distribution...so even if only 1 in 100 or 1 in 1,000 viewers have interest in commercial offers, they should be able to make the market fairly efficient pretty quickly.
Recently we lightened the background color and increased spacing between keyword results in our keyword tool...making the results look more aesthetically appealing.
While playing with it I wondered how hard it would be to change the captcha questions to make them relevant and SEO oriented rather than having them ask generic questions. The big issue I had was the need for structured answers (as the PHP code was not yet set up for fuzzy matches)...which sort of forced me to either ask really generic questions or give away the answer. An example captcha is below.
The links in the various captcha questions lead to various sections of the training subdomain, which should cause a few people to click through and consider joining. At the very least it should help some people new to SEO get some of the basics, and is a reminder "hey, this is over here." :)
It seems like a pretty cool marketing idea...it is relevant and free, much like advertising on your own search results. I am not sure how well it will work, but this is yet another one of those 1 hour conversion improvement hacks that can pay for itself for many years.
The good news is that captchas can be a legitimate part of any interactive website...so this idea could apply to blogs, forums, web based software tools, etc...anything where people comment and/or interact. But will users find it useful or annoying? What do you think of the idea?
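To give a sense of what the fuzzy matching mentioned above might look like, here is a minimal sketch using Python's standard-library `difflib` (the actual tool was written in PHP, and the question text, answer, and threshold below are illustrative assumptions, not the real implementation):

```python
import difflib

# Hypothetical SEO-oriented captcha data; question and answer are
# illustrative examples, not taken from the actual tool.
QUESTIONS = {
    "What does the acronym SEO stand for?": "search engine optimization",
}

def is_correct(question, response, threshold=0.8):
    """Fuzzy-match a user's captcha response against the expected answer,
    so minor typos or spelling variants can still pass."""
    expected = QUESTIONS[question]
    ratio = difflib.SequenceMatcher(
        None, response.strip().lower(), expected
    ).ratio()
    return ratio >= threshold

# A close-but-imperfect answer (British spelling) still passes:
print(is_correct("What does the acronym SEO stand for?",
                 "Search Engine Optimisation"))
```

A similarity cutoff around 0.8 lets typos and regional spellings through while still rejecting unrelated answers, though the right threshold would need tuning against real user responses.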
When you are fairly new to the web and have a lot of free time, one of the best ways to come up with new content ideas is to read books in other industries and relate some of their key points to your own industry.
If you are starting a site and are new to a field, looking at the structure of some books in your space is a great way to get a baseline outline for a site structure. Spend $100 on books and 2 weeks reading them, and you can have a baseline of topical knowledge and a site structure superior to 99% of competing websites.
The Industry Standard put together about a decade's worth of Steve Jobs video clips. I wish I were 1% of the salesman that Steve Jobs is!!!
Ad Rates to Fall?
In 2006 Jakob Nielsen wrote "Over the last several years, Yahoo! has made between 0.2 and 0.4 cents per non-search pageview. However, I believe that Internet advertising is over-hyped and that advertisers are deluding themselves into overpaying. In the long term, non-search advertising's value will drop to 0.1 cents or less per page."
Tim O'Reilly offers this great quote "just maybe, we are getting the first signs that our society as a whole (and not just our financial system) is a kind of gigantic Ponzi scheme that will one day run out of room for growth, with disastrous consequences."
The dollar may lose as much as 40 percent of its value to 50 yen or 60 yen from the current spot rate of 90.40 today in Tokyo unless Japan takes “drastic measures” to help bail out the U.S. economy, Mikuni said. Treasury yields, which are near record lows, may fall further without debt relief, making it difficult for the U.S. to borrow elsewhere, Mikuni said.
“It’s difficult for the U.S. to borrow its way out of this problem,” Mikuni, 69, said in an interview with Bloomberg Television broadcast today. “Japan can help by extending debt cancellations.”
The US stock market went up 3% today, but I think anything based in US Dollars is a bit scary after reading the above quote. What exactly is backing the dollar if those holding federal debt consider writing it down? How many more companies and industries can the US government bail out?