The Google Search Advertising Cartel

Whenever I read a story about Google losing its competitive edge or spreading itself too thin, I think that the author just does not get the network effects baked into web distribution when a company is the leader in search and advertising, or how solidly Google competes where it allegedly failed.

Sideline projects, like their book scanning project, turn into a treasure for librarians and researchers, who then guide others to trust Google. Syndicated products and services like their book API nearly create themselves as an offshoot of creating indexable, searchable content.

They monetize search much more efficiently than the competition. And that gap is only going to grow as time passes, especially since their leading competitor would rather outsource to Google than fix their own monetization problems. Google can take any related market it touches and buy market share, or introduce a free product to push openness. Everything should be open, except Google itself.

To sum up Google's lasting competitive advantages (including brand, market share, price control, distribution, undermining copyright, strategic partnerships, etc.), I turn to telecom lobbyist Scott Cleland's Googleopoly:

Google arguably enjoys more multi-dimensional dominating efficiencies and network effects of network effects than any company ever - obviously greater than Standard Oil, IBM, AT&T, or Microsoft were ever able to achieve in their day.
...
The five main anti-competitive strategies in Google's predatory playbook to foreclose competition are:

  1. Cartelize most search competitors into financially-dependent 'partnerships;'
  2. Pay website traffic leaders predatory supra-competitive fees to lock up traffic share;
  3. Buy/co-opt any potential first-mover product/service that could obsolete category's boundaries;
  4. Commoditize search complements to neutralize potential competition; and
  5. Leverage information asymmetry to create entry barriers for competitive platforms.

If you have a spare hour to read, you may want to check out Mr. Cleland's Googleopoly 2 [PDF]. I don't agree with everything in it, but it sums up Google's competitive advantages and business strategies nicely. Anyone can learn a lot about marketing just by watching and analyzing what Google does.

Search Engine Optimization - Evolution or Extinction?

The following is a guest blog post by Jeremy L. Knauff from Wildfire Marketing Group, highlighting many of the recent changes to the field of SEO.

Marketing is constantly evolving, and no form of marketing has evolved more over the last ten years than search engine optimization. That fact isn’t going to change anytime soon. In fact, the entire search engine optimization industry is headed for a major paradigm shift over the next twelve months. Like many of the major algorithm updates in the past, some people will be prepared while others will sit teary-eyed amid the devastation, wondering what happened and scrambling to pick up the pieces. Unlike the major algorithm updates of the past, you won’t be able to simply fix the flaws in your search engine optimization and jump back to the top of the SERPs.

Why is this change going to be so different? In the past, the search engines have incrementally updated certain aspects of their algorithms to improve the quality of their SERPs, for example, eliminating the positive effect of Meta tag keyword stuffing which was being abused by spammers. Anyone who has been in the SEO industry for more than a few years probably remembers the chaos and panic when the major search engines stopped ranking websites based on this approach. This time around though, we’re looking at something much more significant than simply updating an algorithm to favor particular factors or discount others. We are looking at not only a completely new way for search engines to assign value to web pages, but more importantly, a new way for search engines to function.

Local search

A number one ranking for a particular keyword phrase was once the end-all, be-all goal, but now many searches are regionalized to show the most relevant web pages located in the area you are searching from. While this will probably reduce your traffic, the traffic that you now receive will be more targeted in many cases. Additionally, it gives smaller websites a more equal chance to compete.

Google suggest

This August, Google Suggest was moved from Google Labs to the homepage, offering real-time suggestions based on the letters you’ve typed into the search box so far. This can be an incredibly helpful feature for users. At the same time, it can be potentially devastating to websites that rely on long-tail traffic, because once a user sees a keyword phrase that seems like at least a mediocre choice they will usually click on it rather than continuing to type a more specific keyword phrase.
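
To see why that behavior eats long-tail queries, consider a minimal sketch of prefix-based suggestion: candidate phrases are matched against what the searcher has typed so far and ranked by popularity. The phrase list and counts below are invented; a real system draws on aggregate query logs:

```python
# Minimal sketch of prefix-based query suggestion, loosely in the spirit of
# Google Suggest. The phrases and popularity counts are invented for
# illustration; a real system is built from aggregate query log data.

PHRASES = [
    ("seo book", 9000),
    ("seo basics", 4000),
    ("seo blog", 2500),
    ("search engine optimization", 12000),
]

def suggest(prefix: str, limit: int = 10) -> list[str]:
    """Return the most popular phrases starting with the typed prefix."""
    prefix = prefix.lower()
    matches = [(count, phrase) for phrase, count in PHRASES
               if phrase.startswith(prefix)]
    # Most popular first - which is exactly why long-tail phrases get
    # abandoned: a head term appears before the searcher finishes typing.
    matches.sort(reverse=True)
    return [phrase for _, phrase in matches[:limit]]

print(suggest("seo b"))  # ['seo book', 'seo basics', 'seo blog']
```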

Devaluation of paid links

Google’s recent attempt to eliminate paid links has scared a lot of people on both sides of the link-buying equation into implementing the “nofollow” attribute. In the midst of this hypocritical nonsense, Google has also been taking great measures to devalue links based on quantifiable criteria, such as the “C” class of the originating IP, similarities in anchor text and/or surrounding text, the location of the link on the page, and the authority of the domain the link is from, to name a few. Regardless of how effective any search engine's ability to evaluate and subsequently devalue paid links actually is, the fear of getting caught and possibly penalized is more than enough to deter a lot of people from buying or selling links.
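
One of those criteria, the "C" class of the originating IP, is easy to illustrate: links from many different /24 ranges look more organic than a pile of links from one range. A rough sketch with invented IPs:

```python
# Rough sketch: grouping backlink source IPs by their "C class" (/24 range).
# Many links from a single C class often points to one owner or network,
# which is one of the quantifiable criteria mentioned above. IPs invented.
from collections import Counter

backlink_ips = [
    "203.0.113.7", "203.0.113.42", "203.0.113.99",  # same /24 - suspicious
    "198.51.100.23", "192.0.2.55",
]

def c_class(ip: str) -> str:
    """Return the first three octets, e.g. '203.0.113'."""
    return ".".join(ip.split(".")[:3])

by_class = Counter(c_class(ip) for ip in backlink_ips)
for net, count in by_class.most_common():
    flag = " <- low diversity" if count > 2 else ""
    print(f"{net}.0/24: {count} links{flag}")
```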

Visitor usage data

Again, Google is leading the charge on this one. Between their analytics, toolbar and web browser, they are collecting an enormous amount of data on visitor usage. When a visitor arrives at a website, Google knows how long they stayed there, how many pages they accessed, which links they followed and much more. With this data, a search engine can determine the quality of a website, which is beginning to carry more weight in ranking than some of the more easily manipulated factors such as keyword density or inbound links. This puts the focus on content quality instead of content quantity and, over time, will begin to knock many of the “me too” websites further down the SERPs, or out of the picture altogether. The websites that will prosper will be those that produce relevant, original content that their visitors find useful.
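
Nobody outside Google knows how such usage data is weighted, but a toy score built from time on site, pages per visit, and bounce rate illustrates the idea. The metrics and weights below are pure guesswork, not Google's actual formula:

```python
# Toy illustration only: a made-up "engagement score" from the kind of
# visitor usage data described above. Metrics and weights are guesses.

def engagement_score(avg_time_on_site_sec: float,
                     pages_per_visit: float,
                     bounce_rate: float) -> float:
    """Combine usage signals into a single rough 0..1 quality proxy."""
    time_part = min(avg_time_on_site_sec / 300.0, 1.0)   # cap at 5 minutes
    depth_part = min(pages_per_visit / 5.0, 1.0)         # cap at 5 pages
    stickiness = 1.0 - bounce_rate                       # lower bounce is better
    return 0.4 * time_part + 0.3 * depth_part + 0.3 * stickiness

# a content-rich site vs. a thin "me too" page
print(engagement_score(240, 4.2, 0.35))  # ~0.767
print(engagement_score(15, 1.1, 0.85))   # ~0.131
```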

TrustRank

Simply pointing a vast number of links with a particular keyword phrase in the anchor text at a website was once a quick and easy way to assure a top ranking. The effectiveness of this approach is diminishing and will continue in that direction as a result of TrustRank. In a nutshell, a particular set of websites is chosen (by Google) based on their editorial quality and prominence on the Internet. Then Google analyzes the outbound links from these sites, the outbound links from the sites linked to by these sites, and so on down the chain. The sites that are further up the chain carry more trust and those further down the chain, less trust. Links from sites with more TrustRank, those further up the chain, have a greater impact on ranking than links from websites further down the chain. On one hand, this makes it difficult for new websites to improve their position in the SERPs compared to established websites; on the other hand, it helps to eliminate many of the redundant websites out there that are just repeating what everyone else is saying.
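
That chain-of-links idea maps naturally onto a damped propagation from a seed set. Below is a minimal sketch of the general concept (invented graph and decay factor, not the published TrustRank algorithm verbatim):

```python
# Simplified trust propagation in the spirit of TrustRank: hand-picked seed
# sites get full trust, and trust decays as it flows down outbound links.
# The link graph and decay factor are invented for illustration.

links = {
    "seed-site.com":    ["quality-blog.com", "news-site.com"],
    "quality-blog.com": ["small-site.com"],
    "small-site.com":   ["spam-farm.com"],
}
trust = {"seed-site.com": 1.0}  # editorially chosen seed set
DECAY = 0.85                    # trust lost per hop down the chain

# Breadth-first walk: each hop away from the seeds carries less trust,
# and a site's trust is split among its outbound links.
frontier = ["seed-site.com"]
while frontier:
    site = frontier.pop(0)
    passed = trust[site] * DECAY / max(len(links.get(site, [])), 1)
    for target in links.get(site, []):
        if target not in trust:        # keep the first (shortest-chain) value
            trust[target] = passed
            frontier.append(target)

for site, t in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{site}: {t:.3f}")   # spam-farm.com ends up with the least trust
```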

Google Chrome

Utilizing a combination of visitor usage data and a not-so-gentle nudge in Google’s direction, Google Chrome is set to change the way search engines gather data and present it to users. For example, when a user begins typing in the address bar of the browser, they are presented with a dropdown list of suggestions consisting of the #1 result in Google’s SERPs, related search terms, and other pages they’ve recently visited. This gives a serious advantage to the websites that hold top rankings in Google and, at the same time, gives a serious advantage to Google by giving their Internet real estate even more exposure than ever before.

So the question remains, is search engine optimization facing evolution or extinction? Certainly not extinction, not by a long shot, but in a short period of time it is going to be drastically different from what it is today. The focus will soon be on producing a valuable and enjoyable user experience rather than just achieving a top ranking, which is what it should have been all along.

New Backlink Analysis Strategy: Compete.com Referral Analytics

Compete.com quietly launched a referral analytics product as part of their advanced package ($499/month). Even as a free user you can see the top 3 results for any site, which can be used to see how reliant a site is on search. Why is % of search traffic an important statistic?

  • If search traffic (as a % of total traffic) is low (relative to other competing sites), it could indicate that there are organic optimization opportunities currently being missed, and/or that the site has a large non-search traffic stream that can be marketed to in order to help it shore up any search-related weakness.
  • If search traffic (as a % of total traffic) is high (relative to other competing sites), it could indicate that the site is near its full search potential, that the site is not very engaging, and/or that it does not have many loyal users.

Here are search stats for SEO Book. Note that Google controls a minority of the traffic to this site, which means they have limited direct influence on the revenue of this site. Some sites are closer to 90% Google, which makes it easy for Google to effectively remove them from the web!

This sort of data is important for considering the viability of a business model, the stability of a site, and what multiple a site should sell for. It can also be used when considering the CPM of an ad unit - search traffic is much more targeted and goal oriented than a person browsing a forum is.
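
To put numbers on the two bullets above, here is a back-of-envelope sketch; the thresholds are my own rough rules of thumb, not anything Compete publishes:

```python
# Back-of-envelope sketch of the "% of traffic from search" statistic
# discussed above. Thresholds are rough personal rules of thumb.

def search_dependence(search_visits: int, total_visits: int) -> str:
    share = search_visits / total_visits
    if share > 0.75:
        note = "fragile: one algorithm update could gut revenue"
    elif share < 0.35:
        note = "loyal audience, and likely untapped organic upside"
    else:
        note = "reasonably balanced traffic mix"
    return f"{share:.0%} from search - {note}"

print(search_dependence(9000, 10000))   # 90% - fragile
print(search_dependence(2500, 10000))   # 25% - loyal audience
```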

Until everyone and their dog started looking at PageRank (and how to manipulate it), it was a rather sound way of finding the most valuable backlinks. But with the pollution of endless bought links and nepotistic links, and with PageRank only being updated quarterly, it is tough to glean much market data from looking at PageRank alone. Tools like SEO for Firefox (especially when used on a Yahoo! backlink search) allow you to gather more data about the quality of link sources. But they all try to measure proxies for value rather than how people actually surf the web.

Microsoft's BrowseRank research proposes using browsing data to supplement PageRank in determining relevancy. In Internet Explorer 8 (currently in beta) a person's browsing details are sent to Microsoft by default. With ~80% of the browser market, Microsoft does not need to rely on a random walk for the core of its relevancy algorithm - it knows what people are actually doing, and can use that usage data as a big part of its relevancy algorithms.
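
As I read the BrowseRank idea, the transition model is built from observed browsing sessions rather than the link graph. A heavily simplified sketch of that concept (toy session data, plain power iteration, and none of the paper's dwell-time modeling):

```python
# Heavily simplified BrowseRank-style sketch: estimate page importance from
# observed browsing transitions instead of the link graph. Session data is
# invented; the actual research also models dwell time, ignored here.
from collections import defaultdict

sessions = [
    ["home", "article", "product"],
    ["home", "article"],
    ["search", "article", "product"],
]

# count observed page-to-page transitions
counts = defaultdict(lambda: defaultdict(int))
pages = set()
for s in sessions:
    pages.update(s)
    for a, b in zip(s, s[1:]):
        counts[a][b] += 1

# power iteration on the empirical transition matrix (with uniform restart)
rank = {p: 1.0 / len(pages) for p in pages}
for _ in range(50):
    nxt = {p: 0.15 / len(pages) for p in pages}
    for a in pages:
        total = sum(counts[a].values())
        if total == 0:                       # dead end: redistribute evenly
            for p in pages:
                nxt[p] += 0.85 * rank[a] / len(pages)
        else:
            for b, c in counts[a].items():
                nxt[b] += 0.85 * rank[a] * c / total
    rank = nxt

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # "article" dominates
```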

Using a tool like Compete.com Referral Analytics makes it far easier to poach top affiliates, discover the best ad buying locations, and replicate a competitor's best backlinks. Be forewarned that the tool only works at the domain level, so it is much better at analyzing Yahoo.com than shopping.yahoo.com.

Along with referral analytics Compete offers destination analytics, which let you know what websites people visit AFTER visiting a particular site...which should help you glean information about how sites are monetizing, what offers are working well, what sites are well referenced by another site, and what sites people go to if they can't get what they want on the current site.

At $500 a month, this tool is probably only going to be used by those who are already fairly successful rather than as an entry level tool.

Google's Chinese Wall Between AdWords Ads & Organic Search Results Disappears*

In years past Consumer Reports WebWatch studies showed that consumers struggled to differentiate ads from organic search results and that "more than 60 percent of respondents were unaware that search engines accept fees to list some sites more prominently than others in search results."

Since those studies, Google has changed the background color on top ads from blue to a light yellow that is hard to notice on some monitors. After changing my contrast setting from 50% to 55% it is hard for me to see the edge of the sponsored box...it simply bleeds into the organic search results. When Google interviewed German searchers to ask if they noticed the yellow background on sponsored links, the answer was negative:

INT [interviewer]: “Why do the results on top have a yellow background, did you notice?”
TP [tester]: “I didn’t notice this.”
INT: “What does it mean?”
TP: “It definitely means they’re the most relevant.”

Google has done studies on the brand lift of search, but they only tell part of the story. When one considers that

  • many searchers do not know where the paid ads are,
  • people will be searching more on mobile devices,
  • maps and other verticals will eventually have ads integrated in them, and
  • search suggestion services may show ads before the searcher hits the search results

...it is going to get much harder to compete for attention in big verticals unless you have the best visitor value and can afford PPC, or you build a formal partnership with the search engines.

To see where this is headed, check out the Yahoo! Search results for a popular band, and see how Yahoo! turned their search results into a useful interaction AND an advertisement for Rhapsody - allowing searchers to play songs directly in the search results. Large portions of the search stream (lyrics, music, entertainment, sports) are going to be directly controlled by the search engines that keep users on their network longer and control the second click.

* at least in the minds of searchers tested by Google and used in Google promotions to promote paid search advertising.

SEO News & Interesting Links

I have been spending a lot of time building out other sites, doing interviews, and playing in our members-only forums...this post is a list of some recent interesting links.

Google's Scott Huffman highlighted some of the search quality evaluation process at Google. His post (as well as the older leaked search evaluation documents) should be required reading for all professional SEOs.

I did a quick rundown of some SEO tools over at Blogoscoped. Seocracy highlighted a free service called TwitScoop as a cool tool for finding fresh keyword ideas.

Michael Gray explains how to figure out what parts of your site are not being crawled regularly. Check out the comments on that post for more tips. If you use WordPress, you might find this crawl rate tracker plug-in handy. Crawl rate is probably a stronger signal of trust than toolbar PageRank is.
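
If you would rather work from raw server logs than a plug-in, you can rough out the same analysis by counting Googlebot hits per site section. A crude sketch, assuming an Apache-style combined log at a hypothetical path:

```python
# Crude sketch: count Googlebot hits per site section from an access log.
# Assumes an Apache-style combined log format; the file path is hypothetical.
import re
from collections import Counter

LOG = "/var/log/apache2/access.log"   # hypothetical path
request_re = re.compile(r'"GET (\S+) HTTP')

hits = Counter()
with open(LOG) as f:
    for line in f:
        if "Googlebot" not in line:      # crude user-agent filter
            continue
        m = request_re.search(line)
        if m:
            # bucket by the first path segment, e.g. /blog/post -> /blog
            section = "/" + m.group(1).lstrip("/").split("/")[0]
            hits[section] += 1

# sections Googlebot rarely visits are candidates for better internal linking
for section, n in hits.most_common():
    print(f"{section}: {n} Googlebot hits")
```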

Brian Clark launched Lateral Action, a site devoted to using creativity and productivity to drive success. The site looks like it is off to a great start with posts like Innovate or Die: Why Creativity Is Economic Priority Number One.

ChrisG has put together a special pre-launch offer on his new AuthorityBlogger course. At first glance it looks like he put a lot of work into creating a great service, well worth the outlay if you want to become a kick ass blogger and/or get the attention of other bloggers. Nice job Chris.

SugarRae has started posting regularly again. She offers up tips on how affiliate marketing works and the failure of excuses.

IMDB is offering lots of free shows and movies online, which may lead to people becoming more acclimated to watching videos online, but if it does, people might start expecting more in terms of production value. I am long on the value of video content, but this article shares some of my hesitation with creating tons of video in a complex, rapidly changing field.

Despite the rise of amateur video and the new modes of distribution and discussion, Internet technologies have not been able to change the fundamental character of video. Whether someone watches video on a television screen, or plays it on YouTube, video is a linear, passive experience, designed to be watched from beginning to end without alterations or input from the audience. In this sense, video is still following the model set by film in the late 19th century.

Many things I said in the past later turned out to be incorrect after the market changed. Only with years of experience did I learn how important the phrase "it depends" is. With text an edit might take 30 seconds, but with video it might take 30 minutes. One way to de-linearize video is to create many small targeted videos rather than one large video.

On the spammier front, it looks like 302 redirects might be back and XMPC offers tips on how to build semi-automated sites.

Marissa Mayer highlights some of the opportunities and challenges of search in the future. Bob Massa highlighted that her post hints at future opportunities for marketers.

Funny Email: Anyone Who Outranks MY Clients is Unethical ;)

I just came across one of the funnier SEO emails I have ever read. When I shared it with my wife we both laughed out loud, so I thought I would share it with you. Personally identifiable information has been removed to protect the guilty.

Hi,
___________ are looking for sites that would be interested in publishing content on behalf of a number of the UK's major brands, including the likes of ________ and _________ and ___.

For a site such as ___________ we'd be prepared to pay up to £30 per article a month, every month, depending on the nature of the agreement.

Naturally, you would have a say in what content is placed on your site, we would simply provide you with useful, accurate and well written content.

To see how the links might look on your home page please visit _____________ (the articles are near the bottom of the page under the title ‘___________’).

The reason we are looking to pursue this relationship with you is because there are a number of sub-standard websites that are ranking higher in the search engines than our clients for their own products by using unethical techniques. It is our intention to address this imbalance and is why we are willing to compensate you on a monthly basis for the publishing of our content and links to our client's sites on your site.

If you feel this is an opportunity that you are willing to discuss further or if you have any questions about this proposition then please feel free to contact me.

Regards,
________ _______, Media Buyer

Generally, by the time an SEO is experienced enough to be working with Fortune 500s and big brands, they are smart enough not to buy into the bogus ethics debate. But what was funny is that the "ethical" links they were buying on the example site were not even for brand-related queries...some of the anchor text was for core category keywords like life insurance and loans. :)

It was pretty stupid of them to list their clients (and a live site with link buying examples) in that email. If I had published it in full without redacting information, that probably would have made their rankings a bit worse ;)

That SEO firm claims to be award winning...I shall send them an email asking if they would like to be nominated for the worst link request email award.

How Does the Algorithm View Your Website?

Great article in the NYT over the weekend about an ad arbitrage directory named Sourcetool, which Google punted from the AdWords program. A couple of quotes:

When I pressed Mr. Fox about Sourcetool, he refused to tell me why the algorithm had problems with the site. When I asked him why the business.com site was in the algorithm’s good graces but Sourcetool’s wasn’t, he wouldn’t tell me that, either. All I got were platitudes about the user experience. It wasn’t long before I was almost as exasperated as Mr. Savage. How can you adapt your business model to Google’s specs if Google won’t tell you what the specs are?

Business.com...

  • sells links (yes they have editors, but when they were interviewed about a year ago by Aviva Directory they only had 6 editors managing 65,000+ categories...many of the listings not only included aggressive anchor text, but also allowed the use of up to 5 spammy sub-links with each listing)
  • used nofollow on many of the free editorial links (while passing link juice out on the paid links)...this was corrected after we gave them a proper roasting on Threadwatch :)
  • uses a funky ajax setup to hide work.com content in a pop-up (but makes it accessible to the Google crawler)
  • scrapes Google search results as "web listings" and in some cases Google ranks these pages! (Google is ranking a Google search result surrounded with Google AdSense ads, branded as Business.com)

Any one of those 4 would be enough to kill most websites, but because of Business.com's large scale, strong domain name, and brand, they can do things that most webmasters can not. They are given the benefit of the doubt because Google can not clean up all arbitrage without hurting their own revenues - and Google's job is easier if they have to police a few thousand companies rather than millions of individuals.

Google also told me that it never made judgments of what was “good” and “bad” because it was all in the hands of the algorithm. But that turns out not to be completely true. Mr. Savage shared with me an e-mail message from a Google account executive to someone at another company who had run into the same kind of landing page problem as Sourcetool. The Google account executive wrote back to say that she had looked at the site and found that “there seems to be a wealth of valuable information on the site.” Consequently, her team overruled the algorithm.

Want to learn what the algorithm thinks? Read Google's remote quality rater documents. They tell you what Google wants and how the algorithm really works.

Algorithms (and the under-waged third-world employees labeled as "the algorithm") often make mistakes. If a mistake is made when Google passes judgement against your site, is your site good enough to recover? If your site were deleted from the Google index, would anyone other than you notice and care?

Interview of Quintura Search CEO Yakov Sadchikov

When was Quintura launched? What gave you the idea to launch it? What problems were you trying to solve by launching it?

Quintura was founded in August 2005 and released its first search application in November of that year. One year later, we launched a web-based search. It was based on visual context-based search concepts that the founders had been developing since the 1990s. Quintura was founded to solve several fundamental problems inherent in today's search engines: too many irrelevant search results are returned and no one reads past the first page of results; there is no ability to manage or tune results by defining context or adding search scope; and there is no means for users to graphically visualize search terms or manage their relationship/relevance. Quintura is designed to make it visually simple for searchers to find what they are looking for, and to make it easy for web publishers to expose the content their visitors are looking for.

You guys have got a lot of great press from tech bloggers. On the marketing front what are some of the biggest and most successful surprises you have encountered? What have you found to be hardest when marketing your search service?

The simple fact that there is a tremendous amount of interest in our technology and service, in spite of the large field of alternative search engines on the market. We've invested most of our time and effort in research and development. Our biggest challenge has been getting our first marketing message out, which we're now in the process of expanding to mainstream media.

How do you guys generate the keyword clouds?

That's part of the magic behind the Quintura technology. At the heart of our technology is a semantic-based 'neural network' algorithm. The cloud is literally a depiction of those search terms laid out to show their contextual relationship. Since the graphic depiction is dynamic (you are interacting with the search in real time), one of our design goals has been to make the widget extremely responsive. Over the past year, we think we've reached that point.
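
(Interviewer's aside: Quintura's algorithm is proprietary, but for the technically curious, the general flavor of a context cloud can be approximated with simple term co-occurrence over result snippets. The toy sketch below is my own illustration, nothing like their actual neural network.)

```python
# Toy illustration of a context cloud via term co-occurrence. This is NOT
# Quintura's proprietary algorithm, just the general flavor: terms that
# frequently appear near the query term become cloud candidates.
from collections import Counter
import re

snippets = [
    "apple pie recipe with cinnamon and fresh apples",
    "apple iphone review and price comparison",
    "how to bake an apple pie crust from scratch",
]

STOP = {"and", "with", "an", "to", "how", "from", "the", "a"}

def cloud(query: str, docs: list[str], size: int = 8) -> list[str]:
    """Rank terms by how often they co-occur with the query term."""
    co = Counter()
    for doc in docs:
        words = re.findall(r"[a-z]+", doc.lower())
        if query in words:
            co.update(w for w in words if w != query and w not in STOP)
    return [w for w, _ in co.most_common(size)]

print(cloud("apple", snippets))  # 'pie' leads, since it co-occurs twice
```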

Quintura is popular as a keyword research tool amongst many SEOs (I use it all the time). Have you thought about combining your service with search volume data and/or competitive research data to create a formal premium keyword research (or competitive research) service/tool?

We've been asked that several times, but for now, our goals are to provide the best consumer site search services to the market and to provide our search widget to as broad an end-user audience as possible.

Quintura makes boolean search easy to visualize. Do you think searchers will eventually start using advanced search operators more on general web search engines, or will most only use it when it is presented in an aesthetically friendly way like Quintura does?

The question is whether users want to become adept at boolean logic or would they prefer to have that hidden in the background. From our experience, users would prefer to focus not on the math but on the search itself - finding the most relevant results in the least amount of time. By laying out search terms contextually and graphically, Quintura helps users manage their search and be in control of their search.

When partners sign up for Quintura, you guys create a custom index from a crawl of their sites. How many domains can be part of the same index? What sort of sites does Quintura's visual search work great on? Which ones are not as strong a match?

There is no limit. We're glad to work with large web publishers directly to ensure that we are indexing all important content as part of our site search solution. The publisher of several web-sites can create a “vertical” search engine based on the Quintura search cloud. Quintura works well with all the web-sites we have worked with to date, including numerous blogs. Our first major site search clients, though, were lifestyle portals and lifestyle magazine web-sites.

Do you see the face/interface of general web search changing drastically in the coming years? How might it change?

The web is getting more visual. So is the search interface. That’s the trend. We are enabling our content-publisher customers to be more creative through customization of the widget itself. We're also looking at ways to make the search results even easier to see through the use of even more graphics.

Does Google have general search locked up? What competitive positions might allow people to build out a strong competitor that can take marketshare from Google?

General search is mostly locked up by Google. In my opinion, the best way of taking market share from Google is not by building a better search destination site, but by changing the paradigm – giving users reasons not to go to a search engine, because when they think search engine, they think Google. Essentially, what Quintura site search does is create environments where users keep exploring their passions, their interests, and their information needs from wherever they are on the Web. People go to search engines when they can’t find what they want where they are.

Chitika has created a fairly large sized behaviorally targeted ad network by targeting ads to the search query prior to people landing on a page. Your site search strategy seems like it could be a rather powerful strategy for building a strong network. How has growth been going? Do you have any interesting success stories from the publisher or advertiser standpoints?

Quintura currently powers site search for a monthly audience of 8 million site users. The tests are underway on various U.S. sites, including two major men’s lifestyle sites and an educational publisher. We plan to reach the audience of 50 million in 2009. You can see Quintura search widget on lifestyle sites Maxim.com, Passion.ru and Cosmo.ru; technology news sites ReadWriteWeb.com and Compulenta.ru, business community portal E-xecutive.ru; web-sites of consumer magazines Hilary Magazine, Russian Newsweek, ComputerBild, luxury news site LeLuxe.ru, in addition to hundreds of smaller web-sites and blogs that joined our affiliate program for site search. We have also approached several online advertisers including security software vendor Kaspersky Lab to advertise on our search widget network of sites.

What types of ads work especially well with a service like Quintura? Which ones are less strong?

We have tested both contextual search ads and display ads. We are going to blend search ads with display ads for more visual appeal. Plus, we can target those contextual graphic search ads with much greater precision because of our context-based algorithm. Ads from companies with established brand logos benefit from our ability to graphically display their logos in the search cloud.

What areas does Quintura have a lot of inventory in?

Mostly in the lifestyle and technology areas.

Many search engines (Google, Yahoo! Search, Live), large content & commerce sites (Amazon.com, eBay, Wikipedia), and browsers (IE8 Beta 2, Google Chrome, Firefox 3) are now adding search suggestions in the browser via the search box and/or address bar. Do you see this eventually evolving into a Quintura-like service?

It’s a helpful feature that is mostly based on search statistics. We go a step further by offering contextual suggestions. One of the greatest aspects of our display cloud is that it shows contextually-related results and depicts them with a graphical element. Can you imagine a shopping experience that lets you see related items in real time?

Quintura is currently powered from the Yahoo! index. Do you guys ever plan on creating your own web-wide search index?

As a matter of fact, we are already creating our own web index from the individual indexes of the web-sites where Quintura powers site search. Quintura site search on those web-sites is powered by search results from the Quintura index of those sites.

How many regular users does Quintura.com have as a search destination? Do you guys intend to become a consumer search destination, or are you more focused on providing search for third party sites?

We focus on providing a site search, analytics and monetization platform for web publishers and content owners. As a search destination, Quintura has less than 1 million users per month. We will continue operating and developing our search sites to provide the benefits of our search technology to users. For example, Quintura.com will evolve into an online research tool where registered users will be able to save and share their searches online with other registered members.

You guys have a vertical search service for kids. Is that seeing good adoption? Do you plan on coming out with any other vertical search engines?

Children are far more graphically oriented and can grasp contextual depictions easily. It was a natural extension for us to offer a search engine designed specifically for children - Quintura for Kids. It's also a great test bed for us to further evolve search technologies while giving kids a hand. The search engine is used mostly in elementary schools and public libraries in the U.S., Canada, Australia, and New Zealand. Since its first launch in March 2007, several hundred school and teacher web-sites have linked to Quintura for Kids. According to site statistics, the search engine has 70 percent returning visitors, and 75 percent of visitors come to the site directly from a browser. In June 2007, Quintura for Kids was ranked the highest among search engines for kids by Search Engine Watch.

We are evaluating additional opportunities, including licensing our technology to intranets and major search engines.

For now, our hands are full with upcoming site search product enhancements and monetization, as well as with our growing site search customer base.

Interview of Matt Mullenweg of Wordpress and Automattic Fame

I recently asked Matt Mullenweg if he would be up for doing an interview via email. He said sure, and here are his answers to the best questions I could come up with. Thanks again for doing the interview Matt!

How did you get into web programming? What made you decide to start working on WordPress?
I had started off pretty badly with FrontPage and Dreamweaver. Later I started to use things like guestbooks and forum scripts, and light modifications of those, for sites I was working on. The breakthrough for me personally, though, was a book called Mastering Regular Expressions from O'Reilly, which inspired me to start writing my first code from scratch.

I think my first code contribution to any Open Source project was a set of regular expressions that would "curl" quotes to make them typographically correct, and it was accepted into the b2 system.
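
(Interviewer's aside: for the curious, quote-"curling" looks roughly like the following. This is my own minimal sketch in Python; Matt's actual contribution was a set of regular expressions for the PHP-based b2, not this code.)

```python
# Minimal quote-"curling" sketch: straight quotes to typographic ones.
# My own illustration; not the original b2 patch, which was PHP.
import re

def curl_quotes(text: str) -> str:
    # a double quote at the start of the text, or after whitespace/brackets,
    # opens a quotation; everything else closes one
    text = re.sub(r'(^|[\s(\[])"', "\\1\u201c", text)  # opening double quote
    text = text.replace('"', "\u201d")                  # remaining are closers
    text = re.sub(r"(^|[\s(\[])'", "\\1\u2018", text)  # opening single quote
    text = text.replace("'", "\u2019")                  # apostrophes/closers
    return text

print(curl_quotes('She said "it\'s done" yesterday.'))
# She said “it’s done” yesterday.
```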

Did any early setbacks make you want to quit the WordPress project? If so, how did you work through them?
Since I was just doing it for fun and my own personal usage there were never any problems that were *that* big a deal. There were plenty of times that were tough around security problems, spam links, or community splits, but most ended up being learning opportunities.

When did you know that WordPress was going to work out?
When Zeldman switched.

How did you get beyond wanting to do everything yourself?
That's a tough one - I'm a perfectionist. I think it was that I eventually met folks who were as passionate as I was about the product and were clearly more competent. I think you have to know someone is better than you at something before you can truly let go.

One of the things that blew me away at Elite Retreat was how deeply you grasped the web. Who were some of the major influences in shaping how you perceive the web? What are some key articles and books that you think programmers and marketers should read?
Books:

Links:

Do you think the strategy of "I'm happy to ship a crude version 1.0 and iterate. I find my time is more effective post-launch than pre-launch." applies to bloggers and content producers as well as software producers?
Not as much - for an individual atom of content you don't have ongoing usage, you have a single chance to make an impression on someone. For a site as a whole the iterate approach is good, but for a given post or article give it your all.

At Elite Retreat you mentioned the concept of a "personal newspaper." What does that phrase mean to you, and do you see that concept spreading far and wide as the web ages?
I think Google Reader has the best chance of doing this. Basically there is a ton of interaction data I produce every day about what I read, how long I spend on different types of content, what I buy and gadgets I own, what topics I'm actually interested in, what topics I aspire to be interested in... There's no reason all of this couldn't be used as a filter on the torrent of news and information available every single day.

Blogging has become perhaps the leading information distribution format online. Have you been surprised by the growth of blogging? Do you envision blogs leading online publishing for a long time? What other formats could gain significant traction?
I was pretty surprised by the growth of blogging, so I'm not going to attempt to make predictions about other formats I know even less about. :)

During past interviews you mentioned that you liked to "stay small while creating a lot of value." With powerful open source software tools & large community sites that may be possible, but what lessons should traditional niche service based business models and publishers take from successful open source software programs like Wordpress.org and communities like Wordpress.com and apply to their businesses?
I think one of the most important lessons is that you have to let go and let the community or your customers guide your direction, be it around development, pricing, or direction. The extent to which WordPress has been successful thus far is directly correlated with our responsiveness to our users.

At Elite Retreat you mentioned a meta tag change that dipped the traffic to Wordpress.com. What happened and how long did it take to figure out what happened? How long did it take traffic to recover?
We had changed the meta description on permalink pages to basically be an excerpt from the post. This was less effective in SERPs than Google's auto-generated excerpt and so traffic dipped as a result. It probably took a month or so to figure it out, but traffic came back pretty quickly after we reverted the change.

Wordpress.com is one of the leading user generated content sites on the web. What are some of the leading strategies you have used to encourage quality content creation? What strategies are key to deterring the creation of spam?
Well, one thing that has certainly helped is the lack of user ads, which removes people's direct financial incentive to create content purely for AdSense. Second, I would say we are very proactive in watching out for people trying to take advantage of the system to spam or drive traffic back to other sites inorganically.

Akismet says that 89% of comments are spam. Have you been surprised by the growth of comment spam? What seems to be driving the exponential growth of comment spam?
I think comment spam growth has mirrored what happened in the email world, and will probably continue to. The growth seems to be related to the low cost to spammers of just flooding everyone.

Someone used an automated bot to register an account on my site and post a contextually relevant comment about splogs being a problem. They then referenced a post on their own blog, which was stolen content, as their blog was a splog. That splog had 60 subscribers on FeedBurner! I have also caught a comment bot sequence that conversed with itself on one of my blogs. As spam gets more sophisticated, will central systems like Akismet become more powerful?
I sure hope so. :)

I imagine that comment spamming on MA.TT is a quick way to get into Akismet. As online marketing gets harder, some people are willing to do negative marketing against competitors. What steps can brands take to help prevent being listed as a spammer if someone else tries to ruin their reputation?
Akismet is pretty sophisticated and can usually detect that type of bowling, but of course if there is ever a persistent problem you can contact Akismet support 24/7 on the site.

At points in time I think many bloggers hated SEOs (probably for associating the field of SEO with all the comment spam they got every day). What do you think of the field of SEO? Does Wordpress employ key SEO strategies by default, and what modifications, if any, do you recommend?
I'm conflicted - on one hand there are certain things you can do to make your site more accessible to search engines that should be a baseline that everyone does but on the other hand search engines are just trying to deliver the best results to their users, so if you just focus on users and their experience the search engine should be able to figure out you're the canonical resource for a given topic over time.

WordPress' SEO I think is largely the result of focusing on other goals that also happen to have SEO benefits, like well-structured semantic markup, sane URL structures, meaningful title tags, and such. That said, people far more experienced with SEO than me have lots of suggestions of things we could do better and we listen to those closely. Ideally I think it's something WordPress users should never need to think about.

I imagine that many of the comment spammers have to be targeting high value keywords and niches. Have you ever thought about opening up some of the Akismet spam data to create a great keyword research tool? Doing that adds some opportunity cost and might disincentivize some of the comment spamming.
Nope.

You probably would be disappointed in me for this, but I had a number of WordPress blogs where I had not updated the CMS in years. About a week ago one of my blogs got hacked, and someone added spammy credit card links to it. I was surprised by how easy it was to upgrade WordPress. Do you foresee WordPress.org ever doing automated updates? If someone gets hacked and temporarily removed from Google, what are the quickest ways they can find out what went wrong and where the spam is?
We're working on making updates easier than they are today, and a number of web hosts have already integrated tools that make upgrading a one-click procedure, just like installs are.

I've heard from people that were removed from Google that contacting their support or webmaster tools describing what happened is a pretty good way to get re-included in the index. They understand that this new wave of SEO hackers is pretty malicious and it's not your fault.

If there was no Wordpress and you were starting from scratch on the web today what areas would you focus on? Where would you start?
An email client or a cloud-synced desktop text editor.

--------------

Thanks Matt! Check out Ma.tt to read more of Matt's stories, see his photo galleries, and keep up with Matt's latest travels.

Help Us Help You!

Now that Peter Da Vanzo has joined the site, we have another writer and can spend a bit more time on the blog. In the past some of my most popular blog posts came out of feedback from readers. What topics would you love to see us cover?

Nearly any SEO/PPC/blogging/internet marketing questions are fair game (although we won't do site reviews, or explain specifically why site X is ranking or why site Y does not rank).
