Search can be used as a wedge in a variety of ways, most of which are poorly understood by the media and by market regulators.
Woot! Check Out Our Bundling Discounts
When Google Checkout rolled out, it was free. Not only was it free, but it came with a badge that appeared next to AdWords ads to make them stand out. That boosted ad clickthrough rates, which fed into ad quality scores & acted as a discount for advertisers who used Google Checkout. If you did not use Google's bundled services you were stuck paying above-market rates to compete with those who accepted Google's bundling discounts.
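To see how a clickthrough-rate lift becomes an effective discount, here is a rough sketch using the commonly cited simplified ad-auction formula (actual CPC ≈ next advertiser's ad rank / your quality score); all the numbers here are hypothetical:

```python
# Simplified second-price ad auction sketch (hypothetical numbers).
# In the commonly cited AdWords model, ad rank = bid * quality score, and
# actual CPC ~= (ad rank of the advertiser below you) / (your quality score).

def actual_cpc(next_ad_rank, my_quality_score):
    """Approximate cost per click under the simplified ad-rank model."""
    return next_ad_rank / my_quality_score

next_ad_rank = 10.0  # e.g. the ad below yours bids $2.00 at quality score 5

# A badge that lifts clickthrough rate can lift quality score with it.
cpc_without_badge = actual_cpc(next_ad_rank, my_quality_score=5.0)
cpc_with_badge = actual_cpc(next_ad_rank, my_quality_score=6.5)

print(round(cpc_without_badge, 2), round(cpc_with_badge, 2))  # 2.0 1.54
```

Same bid, same competitor, yet the higher clickthrough rate translates directly into paying less per click, which is exactly why a bundled badge functions as a discount.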
Companies spend billions of dollars every year building their trademarked brands. But if they don't pay Google for their existing brand equity, Google sells access to that stream of branded traffic to their competitors, even though Google's own internal studies have shown it causes confusion in the marketplace.
The Right to Copy
Copyright protects the value of content, but DoubleClick and AdSense drive up the cost of maintaining that value by funding a lot of copy-and-paste publishing, even of the automated variety. Sure, you can hide your content behind a paywall, but if Google is paying people to steal it and wrap it in ads, what legal recourse do you have if those people live in a country which doesn't respect copyright?
You can see how LOOSE Google's AdSense standards are when it comes to things like copyright and trademarks by searching for something like "bulk PageRank checker" and seeing how many sites that violate Google's TOS in multiple ways are built on cybersquatted domain names containing the word "PageRank." There are also sites dedicated to turning YouTube videos into MP3s which are monetized via AdSense.
Philosophically Google believes in (and delivers regular sermons about) an open web where companies should compete on the merit of their products. And yet when Google enters a new vertical they *require* you to let them use your content against you. If you want to opt out of competing against yourself, Google says that is fine, but the only way they will allow you to opt out is to block them from indexing your content & kill your search traffic.
“Google has also advised that if we want to stop content from appearing on Google Places we would have to reduce/stop Google’s ability to scan the TripAdvisor site,” said Kaufer. “Needless to say, this would have a significant impact on TripAdvisor’s ranking on natural search queries through Google and, as such, we are not blocking Google from scanning our site.”
From a public relations standpoint & a legal perspective, I don't think it is a good idea for Google to deliver all-or-nothing ultimatums. Ultimately that could cause people in positions of power to view Google's acts as a collection that has to be justified as a whole, rather than on an individual basis.
Lucky for publishers, technology does allow them to skirt Google's suggestions. If I ran an industry-leading review site and wanted to opt out of Google's all-or-nothing scrape job scam, my approach would be to selectively post certain types of content. Some of it would be behind a registration wall, some of it would be publicly accessible in iframes, and maybe just a sliver of it would be fully accessible to Google. That way Google indexes your site (and you still rank for the core industry keywords), but they can't scrape the parts you don't want them to. Of course that means losing out on some longtail search traffic (as the hidden content is invisible to search engines), but it is better than the alternatives of killing all search traffic or giving away the farm.
There are pushes to minimize the need for passwords, but after the Gawker leak fiasco who wants a common shared single point of failure for passwords? Sure, managing passwords sucks. But friction is a tool that helps cleanse demand & make it more pure. It is why paid communities have a higher signal-to-noise ratio than free-for-all sites. Any barriers will annoy people, but those same barriers will also prevent some people from wasting your time. If they are not willing to jump through any hoops, they were never going to pull out the credit card.
We have some exciting news to share about eHow.com. Beginning in February 2011, Facebook Login will be the exclusive means for login to the site. You’ll be able to use your new or existing Facebook username and password to connect with the eHow community. We’ll also be removing eHow member profiles to help you streamline friend lists and eliminate the work of managing multiple online accounts. Additionally, we’ll be closing forums on the site. We want to hear from you directly, so moving forward, we encourage you to communicate with us through the “Contact Us” section of eHow.com.
We’re excited to introduce these updates! Get started and click on the Facebook Connect button in the upper right corner of the home page to login. We want to keep in touch, so also remember to Fan Us.
My guess is they might be trying to diversify their traffic stream away from search & gain broader general awareness to further legitimize their site. But the big risk to them is that Facebook is an ad network, so now competing sites will be able to market to their base of freelance workers. What's worse is that there was a rumor that Facebook might launch a content mill strategy of its own. There are plenty of ways for that third-party login to backfire.
My belief is that you shouldn't force logins until you have something to offer, but that when you do, you should manage the relationship directly. Does that mean you have to reply to every message? No. But it does mean that if there are ways to enhance value through how you interact with your established relationships, you are not stuck under the TOS of a 3rd party website which may compete against you at some point. Sure, that means some upgrades will be painful, but it means that you get to choose when you do upgrades rather than letting someone else choose when your website breaks.
I view third party comment systems the same way. If the company providing the service changes its business model, you are not stuck paying whatever rate they want or starting over. This is one of the big advantages of owning your own domain name and using open source content management systems. You don't have to worry about a Ning pivot or a Geocities shut down. Sure this approach means you have to deal with security, but then leaving that sort of stuff to Facebook might not be great anyhow.
Have you ever noticed that a lot of blogs want to be seen as being the same as the media? And media companies are responding by hiring bloggers. But why is emulating the media so exciting? After all, the same media is so big, bloated & redundant that it is buried in debt. How is it possible that a humor blog network built on open source software would ever need to raise $30 million?
The million channel world brings addressability. There is no mass any more. You can't reach everyone. Mad Men is a hit and yet it has only been seen by 2% of the people in the USA.
The mcw brings silos, angry tribes and insularity. Fox News makes a fortune by pitting people against one another. Talkingpointsmemo is custom tailored for people who are sure that the other side is wrong. You can spend your entire day consuming media and never encounter a thought you don't agree with, don't like or don't want to see.
Do you find it perplexing that the same media (which claims to be legitimate) has no problems running ads for total scams? Isn't it bizarre that the same media that claims to protect citizens from the evils of the marketplace tries to blend the ads for such scams in with their navigation to sell their readers down the river? Is this what you would expect of Newsweek?
Is that anything to aspire to?
Geordie jokingly noted how annoying he finds the gallery sections on media sites, with videos and pictures that seem fresh off the Jerry Springer show. "WATCH: Teen beats ferret to death and eats it!" In the short run online advertising can grow quickly by tricking people, but the end result is distrust & people become less receptive.
The problem is lack of sufficiently broad exposure to the facts here in the US. We don’t have a fourth estate, a national media in the role of providing checks and balances to government and business excesses. Instead we have media that sells product. In the late 1990s it sold tech stocks, in the early 2000s the Iraq War, from 2002 until 2007 it sold houses, and in the future it will sell whatever measures are a “necessary” price for social stability, national security, or whatever phrases are used, because things are going to get dicey once this 40 year old Rube Goldberg monetary and trade contraption comes apart when it’s hit with a Peak Cheap Oil sledgehammer in the middle of the Jon Stewart show. I mean, how healthy is the American fourth estate when all of the serious journalism here is done by comedians?
Isn't it weird that the mark of a successful blog is that it starts to look and feel and act like bloated media organizations? Is social media any better? Or is social media mostly a bunch of lemmings following each other off the side of the mountain?
Just because there is lots of information doesn't make all of it valuable. In fact, some of it has negative value. Who are the people who login to Facebook so they can vote on Facebook about how Facebook is a waste of time & they don't use it?
How is it possible that we are told that data has value but privacy allegedly does not? Most such stores of data are built through an invasion of privacy.
Privacy has value. What happens when your account gets hacked & you start recommending some uncomfortable stuff? What happens when a stalker catches you on the way home based on one of your messages? How many such experiences will be viewed as a series of isolated events before people figure it out? Once these ads lose their novelty will there still be real businesses behind them? Or is narcissistic advertising the wave of the future even as people realize they are being spied on?
These companies blend their ad units into editorial so effectively that most people can't tell them apart. If that sounds familiar, that is because it is. The key to making it work is perceived relevancy. That is easy to do when you have a large ad auction and users type their intent into a search box, but is much harder to do when people are browsing pictures of cats.
Anyone who thinks that social is a clean search signal is forgetting that people vote most for stuff that is humorous & easy to share. And people share things that they see others sharing because they feel they have to. The echo chamber effect doesn't encourage critical thought. It is mostly a bunch of +1s.
The following video is sad & funny. It has been viewed widely, but it does nothing to fix a broken education system.
And there are entire categories that will never be featured honestly on social media. Sure the idea of turning Kinect into a virtual sex video game will get lots of exposure, but is anyone ever going to honestly Tweet about their favorite solutions to their genital warts problem? Is there enough context to matter? Worse yet, all these networks are turning their relevancy signals into ad units, so if a search engine were to count them heavily, all the search engine would be doing is subsidizing the third party ad network. And the scammers who are pushing reverse billing fraud products on the news sites will do as much damage as they can get away with on the social sites.
If there's a broad call at the company to integrate social networking features, Singhal hasn't quite heard it. He seems skeptical about whether social data can make search results significantly more relevant. If he's searching for a new kind of dishwasher, he argues, his friend's recommendations are interesting, but the cumulative opinion of experts manifested in search results is much more valuable. He notes that Google already integrates content from Twitter and says social networking data is easily manipulated. Can social context make search more relevant? "Maybe, maybe not. Social is just one signal. It's a tiny signal," he says.
Someone wants to eat my dog. Other than breathing, writing English(ish), and having a Twitter account, I probably do not have anything in common with that person. And yet there is no tool to sort that out.
That is the big problem with most social media tools: the monolithic nature.
I am not sure where I read this quote, but I think it went something like this: "we are most similar where we are most vulgar and most unique in the ways we are sophisticated." That is precisely why a lot of the broader networks will repeatedly fail in efforts to build strong niches outside their core. It is why there is so much value in being a fast follower.
The inflation and bubbles in the developing world are not yet destabilizing because the dollar is weak and the hot money supports their currency values. Historically, inflation becomes a crisis in the developing world when the dollar turns around and appreciates. However, it is possible for inflation to create a crisis without a currency crisis. It erodes the purchasing power of the people at the bottom. Social unrest can lead to political crisis.
The currency pegs mean that most of the inflationary pressure you're creating doesn't hit your nation, it's exported to others. That is exactly how you like it, because you can claim "inflation expectations are well-anchored." Perhaps they are in your nation, but in other places they're extremely unanchored and are not only expectations, they're realized facts as the basic cost of life spirals up out of control.
This, in turn, provokes food riots in these less-well-endowed nations that you managed to dupe into participating in your outrageous scheme. After all, there's only one thing worse than a hungry man. That's a man who used to be well-fed and now he's both hungry and ****ed, along with being unemployed.
When his belly growls loudly enough, he riots. And so do his similarly-situated neighbors.
Rather, such mobs are caused by the media failing to do its job of enforcing a sense of outrage over the injustices inflicted upon societies the world over by banking criminals. If there were any sense of justice, the large banks that caused this mess would have been bankrupted. But instead we base economic strategy on the theoretical economy rather than its impact on the real world.
People are just an externality for bankers to exploit.
One of my favorite approaches to save time online is to use multiple web browsers for different purposes. It allows you to combine speed + reliability with also having quick access to tons of valuable tools & data.
I set up Firefox fully loaded with bookmarks and extensions (all our free & premium ones, plus User Agent Switcher, Web Developer, Greasemonkey, Roboform, and Colorzilla), but realize that as a result it will often be a bit slower & crash more frequently. That is OK, because I don't use it as my primary web browser but as my primary SEO research browser, with all our SEO tools installed. Extensions like Web Developer and Greasemonkey make it the obvious choice for a fully loaded research browser.
I run Google Chrome bare-bones, with zero extensions installed. I once tried to install Roboform on it, but that slowed it down as well, so I got rid of it and keep Chrome bare. The benefit of a minimalistic browser is that it is quite stable & fast, so I can open 20 tabs from our forums at any given time without worrying about a crash. Better still, Chrome is good at restoring tabs if things do crash. Chrome is my forums + email browser & my general purpose browser for anything I don't have to log in to access, plus a few of the sites I am typically logged into (like this one).
And while Firefox is my normal research & testing browser, Chrome also has a nice feature where you can highlight & right-click to inspect an element. It tells you exactly which CSS file a property is in, and you can double-click on it and adjust the size/color/etc. within Chrome to see what changes.
I also run IE9. Its purpose is to give me a clean & pure localized view of search. It is set up to delete all cookies when it closes. I use it in conjunction with a VPN to compare how search results look in various parts of the world. It is another type of research browser, but it is not always-on the way that Firefox and Chrome are. Such a browser can also be handy for putting your computer in London for exclusive BBC content, or getting around other such geographic content-access limitations. I also have Roboform enabled on IE so I can log into client accounts easily while keeping them separate from my personal accounts.
I also have Opera installed & I use it for testing user-permission-based issues. Some pages on our site here operate in a way that is far more sophisticated than they might look at a glance. Some pages may look different based on whether you are not registered, logged in with a basic account, logged in with a premium account, or logged in as an administrator. When testing & tweaking that sort of stuff I can end up with 4 different browsers open. Over time, after we get everything up and running, I hope to improve further on this front, as we haven't done as much of the conditional permissions-based changes as I would like to do. But, first things first, we need to get re-launched soon. ;)
And the final reason to have most modern browsers installed is to check out how your site looks in all of them. I would NEVER describe myself as a website designer, but I am foolish enough to hack away at the CSS & HTML. Sometimes it works. Usually it doesn't. :)
Having all browsers available (well, all of them except Safari) makes it easy to see if something works or not. That said, tools like Adobe Browser Lab and Browser Shots are a nice complement to this approach. And we have Safari on my laptop, so if the design looks good elsewhere it is typically good to go in Safari too, so I check it last. If you use Safari as your primary browser, LastPass is good.
In the past I have highlighted how hype-driven hard launches often lead to hard landings. But even more challenging than launches are relaunches. Some relaunches are just flicking a switch, done mostly as a marketing gimmick. But those that involve real changes are brutal, largely because you have built up expectations in the past and have to keep managing them even while everything is changing, and many things are not in your clear control. The more polished you become, the worse you look when things go awry. :D
An Error of Confidence
In our member forums, while using vBulletin 3, I became confident enough with upgrades that I did them myself, even without a programmer standing by. Then I did the vBulletin 4 upgrade and it broke the templates & forced us to create a new design. vBulletin 4 has all sorts of bizarre variables in it, and a lot of members were at first put off by the new design vBulletin forced me into. There was an almost visceral emotional reaction amongst some members, because we hate change that is forced upon us, especially if it feels arbitrary!
Based on that experience I decided that when we were going to upgrade Drupal and install a new member management software that it made sense for us to pause user accounts in case anything goes wrong. Lots has gone wrong with the update, so that turned out to be a good decision. Although at the same time it means I am spending well into 5 figures a month on upgrades and such while the site is producing no revenues.
I figured the no revenues part would encourage us to be as fast as possible, but Drupal 7 was a far more difficult change than vBulletin 4 was.
Whenever you do major upgrades, things will break. And it is virtually impossible to catch it all in advance. There are issues with drop-downs that happen only on certain browsers, only on certain operating systems, only with certain sized monitors, and all sorts of other technical fun that doesn't appear until thousands of eyes have seen your website.
When you are new and obscure feedback comes in small bits and you keep getting incrementally better. But when everything changes you get hundreds of emails a day and it is nearly impossible to respond to them.
Did You Run Your Site Through a Geocities Generator?
We are trying our best to rush fixes & get up to speed, but some issues that are fine the day of an upgrade can appear crazy on day #2 due to how things interact. If our members' area were accessible now, how would we justify & explain end users seeing something like this... where, bizarrely, our designs merged:
Weird bugs like that can be difficult to troubleshoot, especially when they are intermittent. We have to fix those huge issues before we can even consider launching (and we mostly have already). But then there are other things that break in other ways that need to be fixed too.
A Laundry List of Issues
Post comment permalinks that add 30,000 pages of duplicate content to your site. (mostly fixed)
Updates that wipe out the ability to reply to a threaded comment on blog posts. (still need to fix)
Default sign-up page is ugly & the pretty version is not posting to the default. (still need to fix)
Users who desire our autoresponder still not getting it due to needing to test it again before having it send any emails. (still need to fix)
Integrating on-site social proof of value & activity like recent comments and member information. (still need to fix)
Redirect issues for certain login types. (mostly fixed)
Enable multiple product tiers & levels. (still need to fix)
Cookie issues from old cookies set before the CMS upgraded to the new system. (fixed for those who cleared cookies already, not for those who haven't)
Password reset emails don't send new passwords, but a one-time login link.
But some of those login links might be so long that they wrap and are broken by certain email clients.
Do you build a custom hack to try to fix that directly? or
Do you wait until you install your membership permissions management software and run everything through that? or
Do you convert your email module to send HTML emails? HTML email then requires a lot more testing, because it might get stripped by some email clients. (Or perhaps the email goes through but the unsubscribe link is broken, which immediately causes a douchebag freetard to open a support ticket with "lawsuit pending" as the title.)
That is only a partial list of items...there are literally about 100 more! And, as you can see from that last passwords issue, some corrections lead to additional issues. It is sorta like running up the side of a mountain carrying weights. :)
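On that password-link wrapping issue, one common way to sidestep it (a sketch of the general technique, not how our system actually works; the domain and path are placeholders) is to email a short one-time token URL instead of a long auto-generated link, keeping it under the line lengths at which plain-text email clients start folding:

```python
import secrets

# Sketch (hypothetical domain/path): map a short random token to a user
# ID so the emailed login URL stays short enough that plain-text email
# clients won't wrap -- and break -- it.
_tokens = {}  # token -> user_id; a real system would persist these with an expiry

def issue_login_link(user_id, base="https://example.com/l/"):
    token = secrets.token_urlsafe(9)  # 9 random bytes -> 12 URL-safe chars
    _tokens[token] = user_id
    return base + token

def redeem(token):
    """One-time use: pop the token so a link cannot be replayed."""
    return _tokens.pop(token, None)

link = issue_login_link(42)
print(link, len(link))  # well under the ~78-char line limit email clients fold at
```

A 12-character token is long enough to resist guessing for a short-lived login link, while the whole URL stays on one line in any email client.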
The challenging part of being a marketer, an SEO, and the guy who interacts with the customers is you deeply know how some things are flawed & that forces you to try to fix them as fast as possible. You can't just ignore the canonicalization issue that would be missed by most webmasters as you know the pain that leads to. :D
Even if you are pretty quick at fixing things, some will still blow for a bit. Complex systems are complex.
Not only will freetards complain, but you will get other forms of legitimate friction simply by virtue of being. Lots of eyes are on your errors. Once you have a well known website there is a lifeflow that goes through it 24 hours a day, whether you are there or not. And if any of the common interactive paths are broken you will hear about it again and again and again. And again. :)
And yet, while you are trying to decide the best way to keep making things better you get emails that are condescendingly friendly. ;)
You guys have some very useful tools on this site and provide very useful seo information. Yet your site's user flow is surprisingly confounding and awkward. You guys strike me as practical internet marketers and I can't help but wonder why, if you were to upgrade anything on your site, you wouldn't have addressed your awkward user flow as opposed to spending time and money on some hipster faux web 2.0 window dressing. I know I'm not a paying customer...but I've always used this site for the keyword search tool and it has helped me drive traffic and increase eCPMs on my sites.
My guess is the type of people who use your site are not impressed by silly, day-glow, pastel makeovers and are more interested in useful seo data and information.
Nice. So they use us, make money from our work while paying us nothing, and yet they need to sling insults towards us while we publicly state that we are doing upgrades. Way to be a winner! If only everyone in the world was like them this site would disappear.
And they are completely wrong in suggesting that aesthetics don't matter. You can't quantify the losses without testing a different approach, but companies do not sink billions of dollars into testing CPG packaging just for the fun of it. At a minimum, a better-looking site will increase trust. That leads to all sorts of other things like:
better perceived quality
lower perceived risk
higher conversion rates
being able to charge higher rates
higher visitor value
more media exposure
In many industries the winner is not whoever is well known within the industry, but rather whoever is safe and easy for outsiders to reference. Design is important for the same reason that domain names are. Either can yield an instant sense of credibility when done well & either can quickly take it away when done poorly.
And people who are new to an industry become the experts of tomorrow, so if they trust you more off the start then you build a self-reinforcing marketing channel. Whereas if you are not trusted you have to convince people to switch away from defaults after they already made their choice. And that is hard to do if they already passed you over once & your website is ugly.
And there is also the blunt straight talk feedback: "Your Products are bullshit."
I actually prefer the latter to the former because they don't insult your intellect by wrapping the insults in a passive-aggressive flowery packaging. (OK so I said a nice thing about him, so now I can REALLY insult him!!!)
One of the online issues that I think is rarely talked about is the issue of user friction. Media plays up the benefits of success but rarely highlights the cost of it. A popular game developer launched a hugely successful game at 99 cents & was devastated by his success:
I’m angry at a small percentage of customers who actively work towards harming its success. I’m angry at the customers who send me nasty emails or reviews, threatening me with ‘telling Apple to remove it’ or rating it 1 star with a ’should be cheaper than free’ remark because after paying the ridiculously exorbitant 99c, they found it didn’t live up to expectations. The absolute worst is users who condescendingly ‘try to help’ by outlining every little thing they think is wrong with it.
The anger, the sense of entitlement, and the overriding theme that I owe them something for daring to take up any of their time is sickening.
I can see now why many companies provide rubbish support, and have a ‘give us your money then piss off’ attitude. They have no doubt learned the hard way how soul destroying taking pride in your products can be.
That is a big part of the reason I abandoned the ebook business model. I felt that if I kept that model much longer I was going to have to sacrifice the quality of the customer interaction & become more like the companies I grew to hate. Rather than live that way, we moved upmarket to a higher quality of customer. Another benefit of our current (or soon-to-be-restored) model is that when people ask questions in a walled-garden social setting almost nobody is comfortable acting like a troll. People generally won't write the stuff in a social setting that they would write in an email, especially if they are not fully anonymous and they know doing so will make them look like a jerk.
While we still have tons of things to fix, the first things we fixed were related to duplicate content (to simply avoid that pain) and some issues associated with the registration path. The things people are going to complain about most are generally the ones you need to fix first, because that ends up saving you time in the long run.
But if you price too cheaply (but not at free) then it is hard to ignore any of the feedback, even when it is ugly. This is why you are better off having higher prices & only converting a small portion of your audience. The folks from MagneticCat left a good comment on the above blog post:
$0.99 is an unsustainable price point. Because, if you sell 1 million games, you make $700,000 BEFORE taxes. A nice amount of money, but you also get 1 million customers – the amount of people living in a huge city – that could potentially have some problems with your game. Maybe because their iPhone’s accelerometer is broken, or because their headphone jack is not working anymore, or because there is an actual bug in your game.
We are only at about a half-million registered users & it is hard (20+ hour work days) to keep up when anything breaks. I can't imagine what it would be like to have a million PAYING customers. I think I would be sitting in the fetal position somewhere. ;)
That said, I am excited to get our site re-launched again and miss the daily water cooler nature of our forums. And based on the emails I am getting every day, so do many of our customers. Sorry for the delay guys!
We have Drupal 7 installed on both parts of the site. We have 3 days of bug fixing left and testing our membership software (which will also take a couple days). We may try to do some of it concurrently & test our membership software Sunday or Monday & hope to have a recurring test & a cancellation test done by Tuesday evening for a soft launch to past subscribers. If that goes well then we would hope to do a full launch before the end of next week.
I have never been a huge fan of correlation analysis. The reason is that how things behave in aggregate may have nothing to do with how they would behave in your market, for your keywords, on your website.
Harmful High Quality Links?
A fairly new website ranked amazingly quickly on Google.com for a highly competitive keyword. It wasn't on the first page, but it was about #20 for a keyword that is probably one of the 100 most profitable keywords online (presuming you could get to a #1 ranking above a billion-dollar corporation). The site did a promotion that was particularly well received by bloggers and a few bigger websites in the UK press, and at first rankings improved everywhere. Then one day while checking its rankings with rank checker I saw the site had simply fallen off the map. It was nowhere. I then jumped into web analytics and saw search traffic was up.

What happened was Google took the site as being from the UK, so its rankings went to page 1 in the UK while the site disappeared from the global results. In aggregate we know that more links are better & links from highly trusted domains are always worth getting. And yet in the above situation the site was set back by great links. Of course we can set the geographic market to the United States inside Google Webmaster Tools, but how long will it take Google to respond? How many other local signals will need to be fixed to pull the site out of the UK?
Over time those links will be a net positive for the site, but it still needs to develop more US signals. And beyond those sort of weird things (like links actually hurting your site) the algorithms can look for other signals to push into geotargeting. Things like Twitter mentions, where things are searched for, how language is used on your website, and perhaps even your site's audience composition may influence localization. What is worse about some of these other signals is that they may mirror media coverage. If you get coverage in The Guardian a lot of people from the UK will see it, and so you might get a lot of Tweets mentioning your website that are from the UK as well. In such a way, many of the signals can be self-reinforcing even when incorrect.
Correlation analysis also has an issue of sampling bias. People tend to stick with defaults until they learn enough to change them. Unfortunately most CMS tools are set up in sub-optimal ways. If you look at the top ranked results, some of the sub-optimal setups will be over-represented in the "what works" category simply because most websites are somewhat broken. The web is a fuzz test.
Of course the opposite of the above is also true: some of the best strategies remain hidden in plain sight simply due to sheer numbers of people doing x poorly.
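The sampling-bias point above can be shown with a toy simulation. This is a hypothetical sketch (the 80% default-usage rate and the quality-only ranking model are assumptions for illustration): a CMS default that has zero effect on rankings still dominates the top results, simply because it dominates the web.

```python
import random

# Hypothetical simulation: a sub-optimal default setup has NO effect on
# ranking (ranking is driven purely by a random "quality" score), but
# because ~80% of sites keep the default, it is over-represented in the
# top results anyway -- the correlation-analysis trap described above.
random.seed(42)

sites = []
for _ in range(10_000):
    uses_default = random.random() < 0.8   # most sites stick with defaults
    quality = random.gauss(0, 1)           # the only thing that drives rank
    sites.append((uses_default, quality))

top_100 = sorted(sites, key=lambda s: s[1], reverse=True)[:100]
share = sum(1 for s in top_100 if s[0]) / 100

# The default shows up in roughly 80% of "winning" sites despite doing
# nothing -- naive correlation analysis would call it a ranking factor.
print(f"{share:.0%} of top results use the default setup")
```

The share among the top 100 tracks the base rate (about 80%), not any causal effect, which is exactly why "what the top results do" is a biased sample.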
Analyzing Data Pairs Rather Than Individual Signals
Another way signals have blurred is how Google uses page titles in the search results. That generally used to be just the page title. But more recently they started mixing in
using an on-page heading rather than the page title (when they feel the on-page heading is more relevant)
adding link anchor text into the title (in some cases)
adding the homepage's title at the end of sub-page titles (when sub-page page titles are short)
As Google adds more signals & changes how they account for signals, it makes analyzing what they are doing much harder. You not only need to understand how the signals are used, but how they interact in pairs or groups. When Google displays the H1 heading from a page in the search results, are they still putting a lot of weight on the page title? Does the weighting on the H1 change depending on whether Google is displaying it or not?
Along the same lines, any given snapshot of search is nowhere near as interesting as understanding historical trends and big shifts. If you are one of the first people to notice something there is far more profit potential than being late to the party. Every easily discernible signal Google creates eventually gets priced close to (or sometimes above) true market value. Whereas if you are one of the first people to highlight a change you will often be called ignorant for doing so. :D
Consensus is the opposite of opportunity.
When you do correlation analysis you are finding out when the market has conformed to what Google trusts & desires. Exact match domains were not well ranked across a wide array of keywords until after Google started putting more weight on them & people realized it. But if there is significant weight on them today & their prices are sky high, then knowing that they carry some weight might not represent real profit potential in your market. It might even be a distraction or a dead end. Imagine being the person who bets (literally) a million dollars that Google will place weight on poker.org, only to find out that Google changes their algorithmic approach & weighting, or makes a special exception just for your site (as they can & have done). That day would require some tequila.
As a marketing approach becomes more mainstream, not only do the costs rise, but so does the risk of change. As people complain about domain names (or any other signal or technique) it makes Google more likely to act to curb the trend and/or lower its weighting & value. To see an extreme version of such, consider that the past year has seen lots of complaints about content farms. A beautiful quote:
Searching Google is now like asking a question in a crowded flea market of hungry, desperate, sleazy salesmen who all claim to have the answer to every question you ask.
In times like these, clients tend to focus on the value proposition. "Throw it at the wall, see if it sticks" is not a phrase you hear a lot in recessions.
Instead, your customers will tend to have their eyes transfixed on your value proposition. "How does this spend make me better off?"
Whilst we may think search marketing services are essential, the spend on search services typically comes out of marketing budgets, and marketing budgets tend to be the first thing companies cut when things get tight.
So, they might need more convincing than usual.
If you weren't doing so already, it can be a good time to go over your proposals and pitch, and look to emphasize, and add to the value proposition you offer.
A few points to consider....
1. Address Genuine Needs
Address the need a client has, which may be different than the need they articulate.
This may seem obvious, but often people aren't quite sure exactly why they need search marketing, or they may have wrong ideas about it. Their genuine business need may be buried. You need to tease this out.
To do so, listen. Hard.
One common mistake "fixers" make - and SEOs tend to be fixers - is that they'll go through the motions of listening, but really they're just waiting for an opportunity to launch into their solution.
A client will tell you a lot, and perhaps cover a lot of angles you hadn't thought of, if you let them talk long enough. They will like the fact you are interested in them and their problems, and it will make your eventual solution sound more considered and tailor-made.
Because it will be.
If you don't solve a genuine problem, your relationship is more likely to be a short one. Services that don't solve genuine business problems are more likely to get cut.
2. Go Beyond

Look for ways you can enhance your offering.
Look to solve genuine problems in closely related fields. For example, a client may lack a content strategy. They may want to publish content regularly, but haven't got around to doing so. You could enhance your offering by incorporating this work in your offer, reasoning that it dovetails nicely with your SEO strategy, thus killing two birds with one stone.
This can also get you on-going work, if pitched right, and may involve little more than hiring the services of a copywriter.
3. Establish Feedback Mechanisms
Feedback is important.
Not only does it give you added insight into what the client is thinking, it also offers you the opportunity to demonstrate your value proposition in action.
You said you would do X, you do X, then show them you've done X. This helps build trust.
Clients will often elaborate, if given the opportunity, which can give you more ideas on how to "Go Beyond", and how to "Address Their Real Needs".
4. Look At Jobs As Partnerships
If you've ever bought services, you know that selecting a service provider can be a pain. It is time consuming, and there is risk involved. A wrong choice can lead to opportunity cost, and having to repeat the process all over again.
No one wants that. People want partnerships with their suppliers. They want someone on their side.
Once you've landed a client, try to see them as a business partner. This is certainly how they will view you if they like you. They are unlikely to go back out to the market unless they are dissatisfied, so try to make their business your business.
Take the approach that you will boost your own business by building theirs.
5. Every Job Is An Opportunity To Build Hybrid Skills & Knowledge
Let's say you have a travel client.
Learn everything you can about the travel industry. Press the client for information. Research and understand the wider industry, not just the search marketing opportunity within that industry.
One of the golden things about being a consultant is that you get to look inside people's businesses. This information is valuable and difficult to obtain by other means, yet you're getting paid to learn it. You're learning about real business issues, who's-who, and the language of the industry.
You then become more valuable to any other travel-related client as you're now "a travel guy". You can pitch convincingly to them, because you speak their language, understand their problems, and you've got industry history.
Some readers might be considering taking that giant leap from their boring day job into the wonderfest that is full-time SEO. Huge money! Party central! Hangin' at conferences with Matt Cutts! What could possibly go wrong?
Let's take a serious look at what your new life will look like.
It's Going To Hurt
SEO is a world of hurt.
When you start, you'll have little money. Your bills don't stop coming in. Google, rather uncooperatively, may not rank your sites for six months.
Perhaps you've already got a few sites ranking. You've got some steady adsense/affiliate money coming in, which is right about the time Update Oh-My-God happens.
A Google update, like a demented hurricane, trashes your site for no good reason. OK, maybe, maybe you had *some* links that were not, in the cold light of day, strictly-speaking, based 100% on merit. But hey, everyone else was doing it, right?
It will be no consolation that everyone else's sites will have been trashed, too. You will meet these people in SEO forums, gnashing their teeth as if the world has just come to an end.
It has, of course.
There are few more heart-breaking moments than when Google sends an H-bomb crashing down on your dreams. Google say they do this to improve their "service", but mostly they do it "because it's fun".
Your SEO forum buddies will explain, sometimes using elaborate math, why everyone's rankings dropped. These explanations are bullshit and can be safely ignored. Well-intended they may be, but your buddies don't have a clue. Chances are they just read something in another forum, thought it sounded profound, so they repeated it.
The sad reality is few people are doing any real testing these days.
Even more annoying will be the person who claims his site hasn't been affected. He will lecture everyone else on how, in the latest update, Google is finally rewarding higher quality sites.
Don't worry. This sanctimonious fool will likely get his site trashed in the next update. It will then be his chance to gnash his teeth.
In SEO, everyone gets their turn eventually.
Right about this time, that autographed picture of Matt Cutts hanging on your wall will start to look sinister. You could swear the picture is pulsing red with the faint glow of hellfire.
Feeling scared and alone, you take it down and hide it in the drawer.
Are You Serious?
Events, like those described above, are just life's way of testing to see if you're serious.
If you are serious, you climb back up on the horse, get back in that saddle, and go rope some steers. Or, if you're an SEO, not a cowboy, you start fixing your sites.
Alternatively, you could decide that the performance-based SEO lifestyle is way too difficult, and vow to become an SEO consultant instead. Being an SEO consultant really takes the pressure off. Mostly, you just talk about stuff. Repeat things you've heard in forums.
Firstly, gather together some cryptic sounding jargon - "latent semantic indexing" is always a crowd pleaser - and apply to talk at the SMXWebmasterWorldSearchEngineStrategies conference. Next, get your smiling, drunken self into a photo, with your arm around Matt Cutts. This implies you have an inside line at Google. Finally, knock together an SEO consultant web site to display it all to the world. Claim to be an "SEO Expert". Often.
Being an SEO Expert is not a rare commodity. There are 22,345,947 SEO experts in India alone. And many work for less than your weekly beer bill. So unless you've got the sales skills of Tony Robbins, the solitary SEO consultant gig is a tough one.
You may decide to join an SEO agency. This is an easier gig, as you can focus 100% on SEO, surrounded by people who claim to know a lot more about SEO than they actually do. Many of your co-workers post regularly on forums.
You will soon enjoy the delights of heading off to a client site to tell a room full of hostile designers why their award winning flash site will have to be redesigned, from scratch, preferably using bare HTML.
Best of luck.
Following that lively exchange of views, you may wish to kiss the dark arts of SEO farewell, and move into the world of PPC.
PPC is a lot easier than SEO. Well, it is if you have a bank balance the size of Texas. If you don't have a lot of money, you'll spend all your time tweaking budgets, which, if you get them wrong, can end up costing you your credit limit. PPC is dangerous, but at least you can take that autographed photo of Matt Cutts back out of the drawer.
He cannot touch you now.
If you fail miserably at being an SEO and PPC consultant, don't despair. You can always take the easy way out.
Become a social media consultant.
Becoming A Social Media Consultant
The beauty of this gig is you don't need any technical chops at all.
Simply grab a book on public relations, rewrite it by dropping the word "Facebook", or "Twitter" in every second paragraph, and hit the speaking circuit. Rehash the same old stuff about "reach", "audience share", and "convergence" and mix it up with new terms like "re-tweet". If you're feeling confident, throw some Cluetrain Manifesto quotes in, like "Markets are conversations", and "Hyperlinks subvert hierarchy".
They love that stuff. No one knows what it means, but that simply validates your high fees.
The problem is the barrier to entry for becoming a social media consultant is set even lower than becoming an SEO consultant. That, and the fact everyone started calling "bs" on the whole thing last year.
Over the past year or two there have been lots of changes with Google pushing vertical integration, but outside of localization and verticalization, core relevancy algorithms (especially in terms of spam fighting) haven't changed too much recently. There have been a few tricky bits, but when you consider how much more powerful Google has grown, their approach to core search hasn't been as adversarial as it was a few years back (outside of pushing more self-promotion).
There has been some speculation as to why Google has toned down their manual intervention, including:
anti-trust concerns as Google steps up vertically driven self-promotion (and an endless well of funding for anyone with complaints, courtesy of Microsoft)
a desire to create more automated solutions as the web scales up
spending significant resources fighting site hacking (the "bigger fish to fry" theory)
As we’ve increased both our size and freshness in recent months, we’ve naturally indexed a lot of good content and some spam as well. To respond to that challenge, we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments. We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others’ content and sites with low levels of original content.
It sounds like Google was mainly focused on fighting hacked sites and auto-generated & copied content. And now that hacked *GOVERNMENT* websites are available for purchase for a few hundred dollars (and perhaps millions in personal risk when a government comes after you), it seems Google's push toward fighting site hacking was a smart move! Further, there are a wide array of start ups built around leveraging the "domain authority" bias in Google's algorithm, which certainly means that looking more at page-by-page metrics was a needed strategy to evolve relevancy. Page-by-page metrics will also allow Google to filter out the cruddy parts of good sites without killing off the whole site.
As Google has tackled many of the hard core auto-generated spam issues it allows them to ramp up their focus on more vanilla spam. Due to a rash of complaints (typically from web publishers & SEO folks) content mills are now a front and center issue:
As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content. In 2010, we launched two major algorithmic changes focused on low-quality sites. Nonetheless, we hear the feedback from the web loud and clear: people are asking for even stronger action on content farms and sites that consist primarily of spammy or low-quality content. We take pride in Google search and strive to make each and every search perfect. The fact is that we’re not perfect, and combined with users’ skyrocketing expectations of Google, these imperfections get magnified in perception.
But what sort of sites are the content mills that Google is going to ramp up action on?
The tricky part with vanilla spam is the subjective nature of it. End users (particularly those who are not web publishers & online advertisers) might not complain much about sites like eHow because they are aesthetically pleasing & well formatted for easy consumption. The content might be at a low level, but maybe Google is willing to let a few of the bigger players slide. And there is a lot of poorly formatted expert content which end users would view as worse than eHow, simply because it is not formatted for online consumption.
My guess is that sites that took a swan dive in the October 23rd timeframe might expect to fall off the cliff once more. Where subjective search relevancy gets hard is that issues rise and fall like ocean waves crashing ashore. Issues that get fixed eventually create opportunities for other problems to fester. And after an issue has been fixed long enough it becomes a non-issue, to the point of being a promoted best practice, at least for a while.
Anyone who sees opportunity as permanently disappearing from search is looking at a half-empty glass, rather than a half-full one in which dead opportunities are reborn again and again.
That said, I view Matt's blog post as a bit of a warning shot. What types of sites do you think he is coming after? What types of sites do you see benefiting from such changes? Discuss. :)
Here's an interesting study, conducted by Benjamin Edelman and Benjamin Lockwood, from the Harvard Business School. The study measures how much search engines, Google in particular, favor their own web services.
We find that each search engine favors its own services in that each search engine links to its own services more often than other search engines do so. But some search engines promote their own services significantly more than others.... we find that Google's algorithmic search results link to Google's own services more than three times as often as other search engines link to Google's services. For selected keywords, biased results advance search engines' interests at users' expense
However, the study brings up an important point. If Google claims to have algorithmic, "objective" search results, then it follows that Google should not favor their own properties, unless those properties achieve a top ranking based on their own merit.
Google can't have it both ways.
The problem, of course, is that Google could tweak the algorithm to favor whatever qualities its own properties display e.g. the PageRank of Google's own pages could be calculated - in truly cryptic and oblique fashion - as being of higher "worth". After all, there's no such thing as "objective" when it comes to editorial, which is the function of a search algorithm. There are merely points along a continuum of subjectivity.
But where it gets interesting is the study goes one step further. It tries to figure out what the user wanted when she searched. Did the user want to find a Google service at #1? And if not, then isn't Google doing the user a dis-service by placing a Google property at #1?
In principle, a search engine might feature its own services because its users prefer these links. For example, if users of Google's search service tend to click on algorithmic links to other Google services, whereas users of Yahoo search tend to click algorithmic links to Yahoo services, then each search engine's optimization systems might come to favor their respective affiliated services. We call this the "user preference" hypothesis, as distinguished from the "bias" theory set out above
They tested this theory using click-thru data. Regardless of the search keyword, users almost always favor the #1 result - 72% of the time. So what if the user clicks further down, indicating that the first result is less relevant?
Gmail, the first result, receives 29% of users' clicks, while Yahoo mail, the second result, receives 54%. Across the keywords we looked at, the top-most result usually receives 5.5 times as many clicks as the second result, yet here Gmail obtains only 53% as many as Yahoo. Nor was "email" the only such term where we found Google favoring its own service; other terms, such as "mail", exhibit a similar inversion for individual days in our data set, though "email" is the only term for which the difference is large and stable across the entire period
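The inversion check in that quote is simple arithmetic, and it can be worth running on your own keywords. A small sketch using the numbers quoted above (the function name and the "ratio below 1 means inversion" threshold are my framing, not the study's code):

```python
# Sketch of the study's "inversion" check: compare how the #1 result's
# clicks stack up against the #2 result's, versus the typical pattern.
def ctr_inversion(clicks_rank1, clicks_rank2, typical_ratio=5.5):
    """Return (observed ratio, whether the pair is outright inverted).

    typical_ratio: per the study, the top result usually drew ~5.5x
    the clicks of the second result across the keywords examined.
    """
    observed = clicks_rank1 / clicks_rank2
    # A ratio below 1 means the #1 listing got FEWER clicks than #2,
    # a strong hint the top placement doesn't match user preference.
    return observed, observed < 1

# "email": Gmail at #1 drew 29% of clicks; Yahoo! Mail at #2 drew 54%.
ratio, inverted = ctr_inversion(29, 54)
print(f"Gmail drew {ratio:.1%} as many clicks as Yahoo! Mail; inverted={inverted}")
```

Against a typical 5.5:1 advantage for the top spot, Gmail drawing only ~0.54x the clicks of the second result is the anomaly the study flags.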
There is a huge incentive for search engines, which are increasingly crossing the line into publishing territory, to skew the results towards their own properties. The traffic is valuable, and, what is more, can be channeled away from competitors.
As Aaron pointed out a few months ago, if Google choose to enter a new vertical, such as travel or local, then you'd better watch out if you compete in those verticals. Regardless of how relevant you are to the search term, it's below-the-fold you'll likely be going.
So, yes, it may be Google's search engine, but they can't make claims about focusing on the user above all else, otherwise they'd return results the user wants, as opposed to possibly directing the user to Google properties due to other considerations. How can they claim "Democracy works", if they don't favour whatever site the link graph "votes" most relevant? And doesn't this come down slightly on the wrong side of "evil"?
So, What To Do?
If you feel Google can position their own sites where they like, then nothing.
Personally, I think any company can do what they like, until they reach a point where they become so influential, they can use their sheer size to reduce competition and choice. If we believe that free markets require healthy competition in order to thrive, then we should be wary of any entity that can reduce competition using anti-competitive behavior.
The Commission will investigate whether Google has abused a dominant market position in online search by allegedly lowering the ranking of unpaid search results of competing services which are specialised in providing users with specific online content such as price comparisons (so-called vertical search services) and by according preferential placement to the results of its own vertical search services in order to shut out competing services
Consider that Marissa Mayer said this:
[When] we roll[ed] out Google Finance, we did put the Google link first. It seems only fair right, we do all the work for the search page and all these other things, so we do put it first... That has actually been our policy, since then, because of Finance. So for Google Maps again, it’s the first link
Google is no longer able to stream in reviews from TripAdvisor to Places pages after the user review giant blocked it. TripAdvisor confirmed the move today in an email, stating that while it continues to evaluate recent changes to Google Places, it believes the user does not benefit from the "experience of selecting the right hotel." "As a result, we have currently limited TripAdvisor content available on those pages," an official says
But Google aren't really going to care much about you if you don't have some major clout.
Thirdly, stay out of any vertical Google is likely to want to own. It is likely that Google will be going after the big verticals, because a big company needs to score big on projects. Long tail stuff isn't going to make any difference to their bank balance, except in aggregate, so there will be millions of verticals in which you'll never face a direct threat.
This is also a timely reminder to build up your non-search traffic in case Google, or any other search engine, decides to change the game significantly in their favor. Encourage users to bookmark, develop your social media brand, build mailing lists, put some valuable content behind log-in/pay walls, and build membership sites. Relying on Google has always been a risky strategy, so diversify your traffic strategy where you can in 2011.
Google likes to make SEOs look like fools. Some are, but some are simply privy to less information. Or, in some cases, thrown under the bus by a new wave of editorial policy in the gray area. Inconsistent enforcement is a major issue, but even if you go beyond that, the truth is most businesses have a range of revenue streams, from pure as can be to entirely parasitic.
In Manufacturing Consent Noam Chomsky highlights that we should judge actions based on an equality of principles & that we are responsible primarily for our own actions. Yet Google complains about Microsoft. It took Microsoft less than a day to clean up their act, while Google still hasn't fixed issues that were highlighted publicly 6 years ago!
Many Subjective Warnings
Not only is Google trying to police their competitors, but recently they have offered warnings on all sorts of subjective issues, like...
an out of context tweet on cloaking: "Google will [look] more at cloaking in Q1 2011. Not just page content matters; avoid different headers/redirects to Googlebot instead of users."
Individually, each of those issues can be debated.
In our new site design our navigation is aggressively repetitive in some areas. The reason we did that was some people complained about not being able to effectively get around the site. To help make the navigation more intuitive and consistent we use drop downs and in some cases have 3 or 4 or even 5 links to the same location. Is that optimal from a search perspective? Probably not. But then again, search engines don't convert into paying customers. They are simply a conduit...a means to an end. When an engineer views a site they might not view it through the same lens as a customer would.
What is an unnatural link profile? Does it depend on who is building the links? We know that at an SEO conference, when some of IAC's cross linking was highlighted, Matt Cutts stated "those don't count" but didn't qualify it any further. Likewise when it was highlighted how Mahalo was link farming we were told that they deserved the benefit of the doubt. Since then the link farms have grown and mutated. I won't link to ask.inc.com/when-should-i-hire-a-company-for-lead-generation, but if I were told that the following is "natural" and "good to go" then I would have no problems building a few million links a week. Then again, I bet it would be "unnatural" if I did the same thing.
The part about treating Googlebot different from users is a bit perplexing. As technology has evolved this area has become quite blurry/murky.
Sometimes when clicking into big media sites that are 'first click free' I get kicked right to a registration page. In the past some iTunes pages would rank & force you into the iTunes software (though that may have recently changed).
Tools like Google Website Optimizer can be used to alter user experience significantly.
There is an SEO start up which pushes search visitors to sites like CNN to a heavily ad wrapped & paginated version of the same content.
I accidentally screwed up using a rel=canonical on a page (cut the source code from a dynamic page and pasted it as the basis for a similar static page & forgot to remove the rel=canonical tag). Eventually I figured out what was wrong & fixed it, but both the correct and incorrect pages ranked for weeks at #1 and #2. And isn't the whole point of the rel=canonical tag to give the search engines a different type of header than an end user gets (telling the search engine that the content is elsewhere while telling the user nothing of the sort)?
The thing is, Google is in a position to imply intent as they see fit. They are in a position to tilt the playing table as they see fit. They claim to be open and sometimes they are fighting the good fight, but businesses have a range of revenue streams from pure as can be to entirely parasitic.
The leaked internal Google documents about copyright and ads on trademarks certainly highlight that Google has no problem with a foot in each pond.
Syndication has long been a part of the media landscape, where portals chose what bits to mix in from where. But how much is fine & what should be done with duplicates? When does something go from 'legitimate syndication' to 'overt spam'? We see official Google blog posts which claim that AdSense ads are not allowed on unoriginal content, while Google has multiple large partners that wrap Google's search results in the AdSense feed and then serve it back to Google. Site categories which were described as 'shoot on sight' become viable enterprises when a person puts a web 2.0 design, venture capital & some public relations into the same basic business model. If Google is going to put out some 'thou shalt not' styled commandments under the label of 'fact vs fiction' they should have consistent enforcement of obvious issues that have been brought up publicly numerous times, including on the very post highlighting the policy. But we know they won't! They only care about dirty business practices if they are not getting a taste of the revenue stream (as shown by their BearShare partnership while policing Bing affiliates).
After purchasing Youtube Google rolled out their universal search & was fine with aggressively promoting Youtube over other video services. Only recent government reviews have pushed Google to give 3rd party services a fair shake, but the network effects and lead are likely already too great to overcome.
Due to past search bias, Google might get blocked from completing the ITA deal. The good news going forward for publishers is that, due to increasing regulatory heat, Google will only go after a small number of verticals where the payouts are huge. The regulatory blowback will be too great for them to try to be all things to all people.
We were hoping to launch today, but we still do not have all the bugs worked out for all our modules/plugins to make them compatible with Drupal 7. Further, our programmer mentioned that some of the Drupal 7 documentation is missing, which makes the above task even harder. He is making great progress with the upgrade, but between the design coming in a bit late + me getting sick for a long while + all the integration issues we are going through, we are estimating our re-launch date to be either January 31 or February 1st.
I realize that is about 2 weeks away, but I would rather be conservative on the estimate rather than promise it will be 3 days and keep moving the goal post over and over again every few days. If we can launch sooner we will, but barring our internet connection dying permanently or yet another major illness, we should be launched by the 1st of February at the very latest.
Sorry for the delays, but on a positive note, this also gives us more time to make more custom graphics for our training area and do more updates within the training section. It also allows me to blog somewhat regularly over the next week, before sorta disappearing publicly to work on the membership area of the site when re-launched.
Microsoft adCenter has recently increased their revenue per click to match Google, in spite of having a small chunk of the search market share (maybe 25% between Bing and Yahoo! Search to Google's ~ 75%).
How was Microsoft able to increase their yield so much? Go back 5 years: Yahoo! powered a greater share of search traffic then than Microsoft does now, and their ads were powering both MSN and Yahoo! Search. How did Microsoft catch up with Google when Yahoo! failed to compete?
Even today, Yahoo! still arbitrages search traffic through their home page's trending now section.
Notice the word highlighted in yellow. In most cases Yahoo! will typically spike one or two commercially oriented keywords into their trending box. Having ranked for a number of these keywords, I can tell you that they can drive thousands of search clicks...which can be an expensive shot of traffic if you are paying $5 a click for them. The 'high blood pressure' clicks might be a dollar or two, but I have seen some expensive finance keywords in there as well.
I won't tell you which search engine it was, other than to say it was a publicly traded one, but about 4 years back a second tier search engine sent me a spreadsheet of [keywords * their bid prices] and wanted me to "generate traffic" for them.
1). Direct Partnership - Pull our ads to display on the site(s) for high paying keyword terms. The traffic must be unique and convert well for the advertiser (search engine traffic is the best). We can display ads in a variety of format and target the top terms on our network. Makes for a good compliment to other revenue streams.
2). Aggressive Referral Partnerships - I will compensate you and/or any other contacts in the black hat SEO realm up to 10% of all revenues generated by referred partnerships. (There are some SEO guys out there doing 1K+ per day in revenue - 10% = 100.00 additional per day for the life of those accounts). I am definitely willing to compensate nicely for referral of these contacts for Direct Partnership deals.
That second-tier setup is of course why so many affiliate blogs recommend signing up for every affiliate network in the world. But the big issue with Yahoo! was that (in spite of being a leading search engine) they were still operating like the 3rd tier folks, with certain publishers able to access high payouts and CPC stats. Some of the folks running the Overture feed were whoring it out to others & one well known webmaster even has the word "clickbot" in his nickname. Yahoo! made it hard for advertisers to opt out, and that is what killed their click value.
"Although the Yahoo-Bing integration has been ongoing for several months, during which time we were able to adapt well to the volatile environment, in mid-December we began to experience average revenue per click decreases and the strategies we customarily deploy for responding to such decreases were not as effective," said Geoffrey Rotstein, CEO. "As a result, we are maintaining substantially lower traffic levels until we have better insight into the factors contributing to this issue. The Company is currently working diligently with the teams at Yahoo! in an effort to implement any necessary adjustments to this new marketplace."
"We have always been able to adapt quickly and positively to changes in the industry as a result of our intense focus on data and analytics. We intend to apply this same discipline to respond to these issues, as we continue to receive information from Yahoo! that will assist us to adapt our system for the new advertising marketplace," added Ted Hastings, President. "We intend to make whatever changes are required within our Company to ensure a fast and sustainable response to this new market".
SEO = Still Amazing
But the purpose of this post is to point back to the value of SEO clicks. Advertisers spend over $30 billion a year buying ads from search engines & the organic search results still get the bulk of the clicks. Of course search engines are pushing to eat the organic results as well, but for anyone who has a strong organic traffic stream it is easy to under-appreciate the value until you realize how scarce and expensive pure & clean search traffic is.
In an "oh what is the brown stuff oozing from my pants" moment for some e-commerce site owners, Google has quietly entered the space of pulling manufacturer data directly into Google product search:
To make these pages even better, we plan on working with suppliers and manufacturers to get product data straight from the source.
We are starting this effort through a business partnership with Edgenet, a provider of product data management solutions. Manufacturers and suppliers can work directly with Edgenet's Ezeedata service to submit high-quality product data and images to Google. For more information, you can visit their website at www.edgenet.com.
What makes this trend scarier is that everyone is doing it: Google, Yahoo!, Bing, Ask, etc.
The mental model I have come to view search through is this: if a search engine can cut you out of the supply chain while having similar quality then they consider you to be at best irrelevant and at worst a spammer. Alternatively, the more your offering looks like a search engine, the more likely it is to be viewed as spam.
The big issue with this is network effects. Outside of brand corrosion & legal issues, there is basically no limit to how far search engines can push. Sure the above focus is on ecommerce, but don't forget that Google is buying Metaweb + ITA Software. And they have the ability to create vertical databases on the fly. If you want their search traffic you have to opt into being scraped and disintermediated, like so:
You can differentiate by having product information. But Google scrapes it. You can differentiate through consumer & editorial reviews. But Google scrapes it. You can differentiate by brand, but Google sells branded keywords to competitors. No matter what you do, Google competes against you. You can opt out of being scraped, but then you get no search traffic (& the ecosystem is set up to pay someone else to scrape your content + wrap it in ads).
These sorts of trends make the concepts of branding and positioning more important. If Google (and similar companies) aim to consolidate down markets into fewer players then it makes sense to be a #1 in a smaller niche market than a #5 in a bigger one.
A lot of folks have been hammering away at sending out automated link exchange emails for Wordpress driven sites.
The hallmarks of many such efforts:
URL with something cheesy like "partners" or "friends" or "roundtable"
automated emails without a name that mention a search engine ranking and (falsely) apologize for being sent multiple times
auto-generated content that is overly boastful & looks like it comes from one of those internet marketing review sites that has fake comment bots which say *everything* is the best thing since sliced bread / a genius in motion / a deity of your choice
a bit of technical trickery
Nice bit of false empathy there. ;)
The technical trickery mentioned above is that if you visit the link they put in the email the linking post will appear *all over* the site that is "linking" to you. But if you open up a new browser from a different IP address and try to visit the parent category page before visiting the individual post page you will see that the post is only visible to a person who knows exactly where it is. So the people are not only mass automated email spammers, but they lie at hello as well (by deceiving folks into thinking there is an on-the-level exchange of some sort, while screwing them over with a page that is invisible to everyone but them).
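One way to sanity-check such an "exchange" is to fetch the parent category page yourself (from a fresh browser/IP) and verify the post is actually linked from it. A minimal sketch, assuming you already have the category page's HTML in hand:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def post_is_linked(category_html, post_url):
    """Return True if the category page actually links to the post.

    If the 'linking' post only renders when you visit its exact URL
    (the trick described above), this check comes back False.
    """
    parser = LinkCollector()
    parser.feed(category_html)
    return post_url in parser.links

# Hypothetical example: the category page omits the post entirely.
hidden = post_is_linked('<a href="/partners/">Partners</a>',
                        "/partners/your-site-review/")  # False
```

If the post URL never shows up on the category page, the "link" exists only for you, and the exchange is worthless.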
The stuff is so out of hand that even new age doomer movies about 2012 are using it & are sending the emails to sites about SEO, offering sources of 'enlightenment.' :D
Clearly they are enlightened. ;)
Some tips & strategies:
The easiest way around such issues is to delete unsolicited commercial messages, especially if they are not personalized. But if you want to give someone the benefit of the doubt, then the best way to do so is check the source code of the page inside Google's cache. If the page isn't cached by Google then Google probably doesn't care much about it. (Yes there are exceptions to that, but the people who are sending unsolicited emails probably do not deserve too much benefit of the doubt.)
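At the time of writing, Google's cached copy of a page can be pulled up via its `cache:` operator. A small helper to build that lookup URL (the webcache.googleusercontent.com host and query format are assumptions based on how the cache operator resolves in the browser):

```python
from urllib.parse import quote

def google_cache_url(page_url):
    """Build the lookup URL for Google's cached copy of a page.

    A 404 on this URL means Google has no cached copy, which is a
    strong hint the page carries little or no weight in the index.
    """
    return ("https://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe=""))

cache_url = google_cache_url("http://example.com/partners/")
```

Open the resulting URL in a browser and view source to see what Google actually indexed, rather than what the page shows you.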
If you are out sending emails asking for links then it goes without saying that you don't want to look like the above folks (though I have received *far* worse emails from some SEO companies & PR folks). Automated tools can be dangerous things when in the hands of tools!
We got our members' area fully paused on the 25th of December & on the 26th I got probably the worst flu of my life, losing 15 pounds in 3 days. As a bonus, I got a respiratory tract infection that still has me coughing 2 weeks later!
I am starting to feel a bit over the hump (and like I could be normal within a couple days), but I recently let a number of folks down because I am so used to working 16 hours a day that it's hard for me to keep up (even with SEO Book paused) when working only 8 or so hours a day. Worse yet, one day I slept over 21 hours! If I haven't emailed you it's likely not because I was trying to ignore you, but rather because I am still about a hundred emails in the hole from the period of getting sick.
That said, we are starting to make some progress on the site. We upgraded Drupal over the last couple days (from 5 to 6, but still need to upgrade from 6 to 7). I was also testing the new HTML site design & we have a version of it live here. (One page down, and only a few thousand to go. hehehe.)
Our old site design was a *major* upgrade from the hand rolled ugliness I made way back when. The big logo + strong colors really made it stand out & made the site look and feel more memorable. But after we created a membership site, built more tools, created the online training area, started offering more videos, built the community forums, and created a monthly newsletter it sorta seemed like we had outgrown the design.
The thing I dislike about the old design is that (to me) it looks sorta like a blog that kept bolting on more pieces. Largely that's because the site developed quite incrementally over the years. We never really started with a master plan; we just kept building more stuff we liked and bolting it on. Over time it added up & got a bit unwieldy, and the old design doesn't really hint at the breadth or depth of the offering. The new design feels more like a complete thought that better expresses what the site offers.
In terms of the infrastructural upgrades, we are not where we need to be, but we are finally making progress, and are trying to catch up quickly. If I owe you an email expect one before the weekend is out! (Unless I feel worse after another nap here soon).
Happy new year everyone, and more blog posts coming in the days to come.