Review of SES NYC by Khalid Saleh

Conferences are especially interesting in a tough economy. Truth be told, I had low expectations for SES NY when all I was reading about was companies scaling back and downsizing.

But the first tweets about SES painted a brighter picture. And with close to 5,000 marketers registered, it was shaping up to be an excellent conference. As I walked through the exhibit hall, vendors told me they were having a very good show and were very pleased with the size of the crowds. Of all the different SES shows I have attended over the last few years, this SES NY had to top the list in the quality of the lectures, the speaker list, and even small details such as the quality of the food :).

Here is a quick wrap-up of some of the highlights.

I thought that Guy Kawasaki’s choice of “Twitter As A Tool For Social Media” as a topic was an interesting one. And although I am a fan of Guy, my assumption was that most everyone in attendance had been using Twitter for some time, so I wondered whether I would learn many new things from the session. The room was overflowing with people, and the few who showed up late had to spend the hour or so standing.

Guy made 9 main points:

  1. Forget the A list (sort of funny coming from an A-lister ;) )
  2. Defocus
  3. Increase your followers
  4. Monitor the conversation
  5. Copy best practices
  6. Use search
  7. Use the right tools
  8. Squeeze the triggers
  9. Make it easy to share

Guy’s favorite tools to use in conjunction with Twitter

My favorite portion of the presentation was the section on search and utilizing advanced search parameters to look for the terms people are using. That can be a valuable tool for growing a business. Let's say you are a web designer looking for work. You can set up a search for a term such as web design referrals; when someone tweets asking for one, that is an excellent time to jump in and introduce yourself.
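To make that a recurring check, the search can be saved as a URL and revisited a few times a day. A minimal sketch, assuming the era's search.twitter.com endpoint and a hypothetical helper name:

```python
from urllib.parse import urlencode

def twitter_search_url(phrase, exclude_retweets=True):
    """Build a Twitter search URL for a lead-generating phrase.
    The -RT operator is a crude way to skip retweets."""
    query = f'"{phrase}"'
    if exclude_retweets:
        query += " -RT"
    return "https://search.twitter.com/search?" + urlencode({"q": query})

url = twitter_search_url("web design referrals")
# url: https://search.twitter.com/search?q=%22web+design+referrals%22+-RT
```

Revisiting that URL (or subscribing to its feed) surfaces fresh tweets from people actively asking for referrals.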

TwitterHawk is a tool that can be used to send automatic “paid” messages when people search for a term. I am not familiar with the tool, but I see the potential to use it for business development. I am sure many will debate the tactics Guy suggested in the session; if you are a believer in pure social media, there are things here that will turn your stomach.

The session on Meaningful SEO Metrics focused on measurements that help generate better ROI. Traditional metrics focus on the number of visitors, pages per visit, time on site, etc. Ray "Catfish" Comstock discussed how bounce rates for keywords are critical in the process of conversion optimization. Ray suggested examining:

  • High Bounce Rate keyword phrases: which indicate keyword phrases that are generating traffic but users are not finding what they want.
  • High Conversion Rate keyword phrases: which indicate keywords that are working and therefore which phrases you should focus more resources on.
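Those two buckets are easy to pull out of a keyword report. A rough sketch, where the keyword stats and thresholds are made up for illustration (not from the session):

```python
# Hypothetical per-keyword stats, as exported from an analytics tool:
keyword_stats = {
    "cheap web hosting":    {"visits": 900, "bounces": 720, "conversions": 4},
    "managed vps hosting":  {"visits": 150, "bounces": 45,  "conversions": 12},
    "hosting coupon codes": {"visits": 400, "bounces": 350, "conversions": 1},
}

def triage(stats, bounce_threshold=0.7, conversion_threshold=0.05):
    """Split keyword phrases into the two buckets worth examining."""
    high_bounce, high_conversion = [], []
    for phrase, s in stats.items():
        bounce_rate = s["bounces"] / s["visits"]
        conversion_rate = s["conversions"] / s["visits"]
        if bounce_rate >= bounce_threshold:
            high_bounce.append(phrase)       # traffic arrives but leaves
        if conversion_rate >= conversion_threshold:
            high_conversion.append(phrase)   # worth more resources
    return high_bounce, high_conversion
```

Phrases in the first bucket point at landing pages to fix; phrases in the second point at terms worth more budget.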

I did not really appreciate the importance of mobile SEO until I chatted with an SEO for a large auto site. He mentioned that their site traffic usually peaks on Mondays and Tuesdays, when people are searching for cars, and usually dies off on weekends, when people are out shopping. If you think about it, people actually out shopping in the real world is the perfect moment for mobile SEO. That discussion was enough to convince me to attend the Mobile SEO Best Practices session. Cindy Krum, who specializes in mobile marketing consulting, did a great job covering mobile marketing strategies. While visitors from traditional marketing can arrive at a site at different stages of the buying process, mobile searches usually indicate immediate intent. Cindy pointed out that real mobile web browsing, flat-rate data pricing, and faster download speeds are all factors that help make the mobile web more relevant. Cindy's advice for basic mobile SEO includes:

  • Follow all Traditional & Local SEO Best Practices
  • Submit your Site to Mobile Search Engines & Directories
  • Avoid using Flash, scripts, and popup windows.
  • Follow XHTML standards
  • Use external CSS

The panel discussion on the most common search marketing mistakes CMOs make promised an interesting topic. My favorite of the mistakes was failing to assign a dollar value to every conversion on a website. Too many times we focus on generating sales or leads via a website and forget about the other conversions that can take place. These other conversions might not have the same dollar value as a sale, but they are still conversions. A visitor might subscribe to a newsletter, download a white paper, or subscribe to a blog. Assign a dollar value to each of these activities.
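Once each activity carries a dollar value, total site value becomes a simple sum rather than just a count of sales. A toy sketch, with hypothetical values:

```python
# Hypothetical dollar values assigned to each conversion type;
# the figures are invented for illustration.
conversion_values = {
    "sale": 120.00,
    "newsletter_signup": 2.50,
    "whitepaper_download": 8.00,
    "blog_subscription": 1.00,
}

def total_conversion_value(events):
    """Sum the dollar value of every conversion, not just sales."""
    return sum(conversion_values[e] for e in events)

week = ["sale", "newsletter_signup", "newsletter_signup", "whitepaper_download"]
# 120 + 2.50 + 2.50 + 8 = 133.0
```

Counting only the sale would value that week at $120; the micro-conversions add another $13 that would otherwise be invisible in the ROI math.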

The extreme makeover session with a focus on conversion made for an entertaining afternoon. Jeffrey Eisenberg of FutureNow, Tim Ash of SiteTuners, and Ethan Griffin of Groove Commerce took on one of the sites Groove Commerce worked on. You could tell right away the different approach to optimization each of these guys takes. Jeffrey evaluates different customers, looking at what might work for them. Tim is focused on the testing aspect. Ethan considers optimization as well as implementation details. Jeffrey and Tim seemed to disagree even when they were making the same point. As I listened to the panelists go back and forth on what to test and what to remove, someone sitting next to me asked, "So who should we listen to here?" I smiled and said, "Listen to your visitors!"

Night time at conferences is as valuable as day time, and SES New York was no exception. On the first night of SES NY I stayed up until 4 AM with Frank Watson (AussieWebmaster, AKA crocodile man in some Hollywood circles) and Patrick Sexton (whom you may know from SEOish or his latest venture GetListed.org). By the end, every muscle in my body was aching. I am just not sure how Frank was planning to stay up for a few more hours.

B2B complex sales involve longer cycles, many stages, and different people at each stage. The session on B2B marketing focused on search marketing tactics that can help deal with some of these complexities. Segmenting data becomes more critical in complex sales; this can be done by allowing customers to identify which segment they belong to (enterprise, small business, etc.). Another important factor in complex sales is going beyond cost per lead to cost per action, which is a better indicator of lead quality.
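The difference between the two metrics is easy to see with a toy calculation (the campaign figures below are invented for illustration):

```python
def cost_per_lead(spend, leads):
    """Spend divided by raw lead count."""
    return spend / leads

def cost_per_action(spend, actions):
    """Spend divided by leads that progressed (e.g. became qualified
    opportunities) -- a rough proxy for lead quality in a long B2B cycle."""
    return spend / actions

# Two hypothetical campaigns with identical spend:
cpl_a = cost_per_lead(5000, 100)    # Campaign A: $50 per lead...
cpa_a = cost_per_action(5000, 5)    # ...but $1000 per qualified action
cpl_b = cost_per_lead(5000, 50)     # Campaign B: $100 per lead...
cpa_b = cost_per_action(5000, 20)   # ...but only $250 per qualified action
```

Judged on cost per lead alone, Campaign A looks twice as efficient; judged on cost per action, Campaign B is four times cheaper, which is the number that matters in a complex sale.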

Another panel discussion I attended at SES was Slash Your Search Budget. As you can imagine, the title hit home with what many companies have to deal with nowadays. Unfortunately, this session was perhaps the most disappointing discussion of the conference. The speakers did not offer real ways to slash marketing budgets. The talk of mobile SEO as an alternative to traditional SEO threw me off completely: how would that relate to slashing a marketing budget? Talk of utilizing social media as a way to generate hits did not resonate with me either. Social media takes a lot of nurturing and a lot of budget. So, at that point, I could not help but raise my hand and ask how using social media helps cut SEM budgets. There was a bit of silence.

The only exception was Aaron Kahlow of the Online Marketing Summit. He offered candid suggestions: it is better to take charge of the budget discussion. Approach your manager or boss and tell him you want to slash the budget, then evaluate which SEM activities are not producing results. By doing so, you will be guaranteed a seat at the table.

The second night at SES included attending a live WebmasterRadio show hosted by David Szetela, learning more about the SEO community from Jim Hedger, and enjoying a lengthy discussion on online marketing with @webanalytic :).

On the third day of SES, I attended News Search and SEO. Most notable on that panel was John Shehata, who specializes in news search SEO. John provided many valuable tips, ranging from the basic to the more advanced. Some of the tips included:

  • Use trends/buzz keyword tools when writing news for an online audience (Google Hot Trends, Yahoo! Buzz, Google Zeitgeist, SEOmoz Popular)
  • Print headlines sell the story, optimized web headlines tell the story

Well, before I sign off, I have to congratulate Matt McGowan and his team for an excellent show that raised the bar for upcoming search events. I think Matt is on his way to Australia at this point. If you enjoyed this post and would like to connect, follow me on Twitter.

How Many Trillions Does it Take to Put a Banker in Jail?

About to go to jury duty here in about 15 minutes...which got me thinking about the concept of justice.

I don't mind paying a lot of taxes if it goes toward creating a better society, but in California when you get toward the upper end of the tax bracket you can pay ~ 60% (federal + state + local + self employment/social security) of your income in taxes. And those tax payments probably do not even offset the handouts we are giving to bankers that gambled with trillions of dollars and lost.

News of additional bailouts via a public-private partnership (one that makes almost all the reward private and almost all the risk taxpayer funded) has spurred Bank of America and Citibank into buying more of the toxic assets that they allegedly need help clearing off their books.

A guy writes $7,000 in bad checks and gets a 24-year prison sentence. These bankers cost taxpayers trillions of dollars. So much money was stolen that they debased the currency, and they are rewarded with free money for being incompetent.

I am not sure what will come of today, but if this country actually had any sense of justice then there would be at least a half dozen bankers serving a few decades in jail. The fact that none of them have been locked up yet shows how perverse our justice system is and how little you should trust the U.S. government. Politicians work for the bankers.

Obama has a poll allowing voters to ask questions about the economy. Most of the questions are about "what about me I am broke and don't know what to do" and "I need some relief" etc. And that is how they will remain until our financial system is fixed. And by fixed I mean these bankers serve the jail-time they earned and pay back their "earnings." Billions of hours of labor have been wasted propping up a ponzi scheme that promotes insider/bank traders winning on both sides of the trade, while handing you the losses.

Is there any wonder why so many people feel overextended?

These bankers put teeth in the consumer bankruptcy law (lying with bogus statistics to pass it), then wanted a decade-long ride on the free money train for their companies.

What's worse is that children who have yet to be born have interest working against them starting from the day they are born. They are in the hole from their first breath, having done nothing wrong other than being born into corruption. The politicians take care of their own children, just not our children.

We are so afraid of terrorism...and financial terrorists that cost us trillions of dollars are somehow just part of how the system works. No big deal.

Sorry, but I don't need to pay for someone else's second yacht or fourth home. If anything these career criminals should be scrubbing my floors and taking out my trash. You and I are the people who are actually paying their salaries (and bonuses) through a collective billions of hours of OUR LABOR that was confiscated and handed over to the banks. This makes me angry enough to want to go unemployed and stop working and/or move to another country. I hope I get to be a juror over a banker some day. And I hope you do too!

Update: here are a couple relevant articles in Rolling Stone & The Atlantic, and a nice video.

Google Expands Snippets & Related Searches Word Relationships

Google announced that they are rolling out a new technology to better understand word relationships and extend their snippets on longer search queries.

Starting today, we're deploying a new technology that can better understand associations and concepts related to your search, and one of its first applications lets us offer you even more useful related searches (the terms found at the bottom, and sometimes at the top, of the search results page).

Note that they claimed this is "one of its first applications." If they can improve relevancy by integrating this technology directly into the core search algorithms, it will lower the importance of on-page optimization (since you only need to be close rather than use the specific words that were searched for). Such a change would likely decrease the traffic to low-authority sites held up largely by strong site structure and on-page SEO, while increasing the amount of traffic going to high-authority sites and well-branded sites that are poorly structured from an SEO perspective.

I am not sure whether this sort of algorithm change would favor shorter or longer content pages. In most cases I would guess longer pages, if they were kept on theme and broken up into relevant chunks. The expanded snippets on longer search queries show a lot more information directly in Google's search results, which helps thicker pages show off their offering more than thinner pages, but cedes more control of the information to Google, as they can show close to 250 characters in the search results.

If the technology were applied to anchor text, it might also limit the value of anchor text manipulation by boosting the value of related phrases (if Google knows that the word Salesforce is relevant to CRM, then it might count that anchor text more).
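That idea can be sketched as a toy scoring function, assuming the engine already holds a relatedness table (the phrases and scores below are invented, not Google's actual algorithm):

```python
# Hypothetical relatedness scores between a query term and other phrases,
# e.g. as learned from co-occurrence across the web.
relatedness = {
    ("crm", "salesforce"): 0.8,
    ("crm", "customer relationship management"): 0.9,
    ("crm", "pizza"): 0.0,
}

def anchor_text_credit(query_term, anchors):
    """Credit inbound anchors: full weight for exact matches,
    partial weight for phrases known to be related."""
    credit = 0.0
    for anchor, count in anchors.items():
        if anchor == query_term:
            credit += count
        else:
            credit += count * relatedness.get((query_term, anchor), 0.0)
    return credit

links = {"crm": 10, "salesforce": 5, "pizza": 3}
# 10 + 5 * 0.8 + 3 * 0.0 = 14.0
```

Under a scheme like this, hammering one exact-match anchor matters less, because natural related phrases earn partial credit toward the same term.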

Greg Sterling noted that this change came from the Orion technology that was purchased by Google from Ori Allon in 2006. He also interviewed them:

I spoke yesterday to Google and Ori Allon. To the extent that I understood his discussion of the way Orion’s technology had been applied to refinements here’s what’s going on at a high level: pages are being scanned in “real-time” by Google after a query is entered. Conceptually and contextually related sites/pages are then identified and expressed in the form of the improved refinements. This is not solely keyword based but derived from an “understanding” of content and context.

It is hard to speculate if or when this technology will move from sideshow to big deal. The current usage is fairly trivial, but it could become much more deeply ingrained in many parts of the relevancy algorithms.

As search engines get more sophisticated with how they show word relationships (on branded and non-branded search queries) that is one more thing that can be optimized, though likely one that will require a holistic marketing strategy to optimize, because you will need to create a lot of co-citation (or some other signal of relevancy) across many pages on the web.

A couple years ago Lord Maurice Saatchi described their brand strategy as being built off of One Word Equity.

In this new business model, companies seek to build one word equity - to define the one characteristic they most want instantly associated with their brand around the world, and then own it. That is one-word equity.

It is the modern equivalent of the best location in the high street, except the location is in the mind.

Is Your Website Credible?

I saw a link-bait article at the top of TechMeme this past weekend entitled "Why Advertising Is Failing On The Internet".

The article outlines how internet advertising will fail because it (apparently) holds people captive and forces them to watch ads (huh?). I'm paraphrasing, but that's the gist of the conclusion reached by the author, Eric Clemons of the University of Pennsylvania.

I certainly hope a lot of would-be advertisers listen to his view on search advertising, because it will reduce the bid competition for the rest of us:

Misdirection, or sending customers to web locations other than the ones for which they are searching. This is Google’s business model....Misdirection most frequently takes the form of diverting customers to companies that they do not wish to find, simply because the customer’s preferred company underbid

Bizarre.

For starters, what is the searcher's "preferred" company? That statement assumes the searcher already knows what company they are looking for. Perhaps, as is often the case, they are looking to solve a problem, not locate a specific company.

Secondly, anyone who has paid for ads would know that the last thing you want to do as a search advertiser is to "misdirect" visitors to your site i.e. visitors who aren't interested in what you're selling. It costs a fortune, makes no money, and Google will likely demote such ads due to a poor quality score.

Sergey Brin is of the opinion that advertising can add value, so long as it is relevant:

"....it fits with the notion of Google co-founders Sergey Brin and Larry Page that ads can and should be at least as useful to people as search results and other online content. "We believe there is real value to seeing ads about the things that interest you,"

Of course, he would say that, but I think it is true. Ad content need not be intrusive. Relevant advertising, delivered when the customer wants it, can and does solve problems, and thus adds value. Advertising also facilitates a lot of web content that simply couldn't be offered for free if the advertising didn't support it. Google itself could not exist without advertising.

Anyway, Danny Sullivan does a good fisk of the article. Well worth a read.

Website Credibility

Danny brought up an interesting aside about credibility, which I thought I'd riff on and hopefully we can share some ideas in the comments.

Here is how Danny decides if a travel website is credible:

I have this “travel guide” test to use to help determine if an expert source knows what they’re talking about. Ever struggle to decide which travel book for some vacation destination might be the best one? Me, if it’s a travel series, I pull the guide for a destination I know well, like my hometown. I know my local area in an expert way — and if the travel guide suggests good stuff for my area, then I feel better about trusting it in other areas.

In this case, because Danny has established the credibility of the source, he is more likely to go to places the guide recommends. He is certainly more likely to keep reading the site, which means more opportunity for advertisers to be seen.

What Makes A Website Credible?

Credibility means the quality of being believable or trustworthy.

The markers we use to determine credibility online have a lot in common with the way we determine credibility offline: are we familiar with this person or business? Have we had previous, beneficial dealings with them? Do they come recommended by someone we trust? Does it look and feel right? This last point might be more important than we've been led to believe. More on this shortly.

Various articles have pointed to prescriptive credibility markers, such as displaying your address, having a privacy policy, showing a photo of the site owner etc, but I'd argue these are pretty much useless unless more fundamental credibility markers have been established first.

One of the problems with establishing credibility on the internet is that the internet is largely unregulated and anonymous:

the Internet has no government or ethical regulations controlling the majority of its available content. This unregulated flow of information presents a new problem to those seeking information, as more credible sources become harder to distinguish from less credible sources (Andie, 1997). Moreover, without knowing the exact URL of a given site, the amount of information offered through keyword searches can make finding a predetermined site difficult as well as increase the likelihood of encountering sites containing false information

The task of deciding the level of credibility lies mostly with the individual, rather than an external agency. A research report by Stanford Persuasive Technology Lab found:

The data showed that the average consumer paid far more attention to the superficial aspects of a site, such as visual cues, than to its content. For example, nearly half of all consumers (or 46.1%) in the study assessed the credibility of sites based in part on the appeal of the overall visual design of a site, including layout, typography, font size and color schemes. This reliance on a site's overall visual appeal to gauge its credibility occurred more often with some categories of sites than others. Consumer credibility-related comments about visual design issues occurred with more frequency with finance (54.6%), search engines (52.6%), travel (50.5%), and e-commerce sites (46.2%), and with less frequency when assessing health (41.8%), news (39.6%), and nonprofit (39.4%) sites. In comparison, the parallel Sliced Bread Design study revealed that health and finance experts were far less concerned about the surface aspects of these industry-specific types of sites and more concerned about the breadth, depth, and quality of a site's information.

The emphasis people place on a site's visual design when trying to determine credibility is interesting. This is not to say having a blinged-up site will make you appear more credible; it very much depends on what market you're in. A slick site is likely to be credible if you're selling lipstick to teenagers, but not if you're providing weather data to climatologists. Wikipedia and Google appear credible as information resources partly because they look staid and academic.

So the first step to making your site credible is to know your audience, and meet their expectations in terms of look and feel.

Accuracy Of Information

The studies also point to the accuracy of information as a credibility marker.

It stands to reason that a site that contains obvious lies or inaccuracies, as perceived by the reader, isn't going to be credible. That said, there are plenty of scam artists on the internet, and people peddling incorrect information; the difference is that their readers aren't aware they are being lied to or given incorrect information.

This is why it can often pay to cite known authorities to add credibility to your content. Besides the value of citation in terms of establishing accuracy, naming a credible resource can make you appear more credible by association. Go to Yahoo! Answers and notice how most answers lack credibility. The answers that are credible tend to cite known external authorities.

A Way With Words

Closely related to visual presentation is format and the way you use words.

In a study at Indiana University, Matthew Eastin looked at the credibility markers for online health information:

More recently, Rieh & Belkin (1998) identified criteria used when evaluating online information......they found that: (1) institutional sites were seen as more credible than individual sites, and (2) accuracy of content was used to assess online information. Respondents used knowledge of citations within the content and the functionality of hyperlinks as cues to evaluate the information. ....in addition to source and link accuracy, they also recommend that users consider peer evaluation, navigability, and feedback options (i.e., email, chat room, etc.)

Academic essays sound authoritative, even if what they say is nonsense, because they are long-winded and use big words. Even the length of an essay can lend credibility. For example, long Wikipedia pages appear more credible than short pages, simply by virtue of their length. Various studies in the direct marketing field appear to back this up, which is why you'll often see those long, single-page sales letters. Short letters don't sell so well. "Thoroughness" either reduces anxiety in the buyer, or enhances the credibility of the seller, or most likely both.

Again, the way you use words depends on your audience. An academic approach isn't much use if people can't comprehend what you're saying. Likewise, if an article is lightweight and flippant, it isn't going to appeal to an academic community.

In The Cluetrain Manifesto, a book that looks at communication within markets in the internet age, the authors assert that markets are conversations, and that conversation is conducted in the human voice, not the cliché-ridden hype language of the marketing brochure. The use of a colloquial "voice" often carries a lot of credibility on the web, presumably because it signifies a human presence.

The Reef Fish Effect

People like to go where other people are.

There is perceived to be less risk in crowds. This is why Amazon's customer reviews are so powerful. People's choices are affirmed by the wisdom of the crowd. It just feels safer.

Include as many human touches as you can: reviews from known authorities, signs of activity, signs that other people have visited your site before and that their experience has been positive. Being a known quantity makes you appear more credible.

What do you look for when trying to determine credibility?

SEO & Marketing Links of Interest

I have been saving these links up since January 21st. Time to share about 50 of them. :)

Niche SEO Guides

The Rising Commoditization of Everything But Experience

Graywolf shared this great video about the ongoing process of commoditization.

  • materials get commoditized
  • due to competition products become materials
  • customized service and personalization help create sustained value

The Fear of Success in Creative Arts

This Elizabeth Gilbert TED video talks about how to live with knowing your best work is likely behind you, which is true for many popular artists and authors.

More Search News

Yahoo! Search makes it easy to embed videos & docs with SearchMonkey.

Amazon is trying to use DMCA to block other ebooks from getting onto Kindles. Sony and Google partnered up to make 500,000 ebooks freely available on the Sony ebook reader.

Yelp was accused of extortion. Pay us or that negative review stays at the top. Lovely mafia-styled business model :)

Bryan Todd shares a powerful story about the power of words:

There is no such thing as right language or wrong language, good grammar or bad grammar, correct English or incorrect English. There is only language that got you what you wanted, or didn't.

Perry Marshall highlights how Google considers some businesses to be illegitimate businesses. If only they would get to the government grant stimulus ad scams.

Gab Goldenberg offers tips on online branding. Lance Loveday wrote a great article about the overlap between search and branding. In our member forums I started a thread called branding in the search channel.

George Michie explains why budgeting search is a bad idea and offered some SEM RFP questions worth asking. Generally I have avoided clients that needed an RFP because I felt they were still in the shopping phase.

Searchers have been using longer search queries.

John Andrews explained how he thought Sphinn moved on from its roots.

CJ shares some ideas for how search engines can hunt for paid links.

Joel Spolsky highlighted why you should not use Google apps for anything important. I am really hoping they never screw up my email account!

Michael Gray took a look at the influence of article directories on organic rankings.

Debra Mastaler offers a link building stimulus plan.

Rob explains how some sales techniques, particularly in social settings, work well by hiding the upsells in the price. Online if you sell a non-commodity you can make the core price higher (to increase perceived value) and then let people de-select the pieces they do not want or need.

Back in January Robert Scoble highlighted that Facebook is studying sentiment behavior.

Andrew Goodman highlighted how dumb some clever nanotargeted marketing is. Funny :)

Marissa Mayer and Eric Schmidt were on the Charlie Rose show. A couple interesting quotes...

  • from Eric - Technology has brought us closer together, but makes us more stressed.
  • from Marissa - searchable speech-to-text technology on YouTube should be around in 5 to 10 years
  • from Marissa - credit card companies know if you are going to get divorced with 98% certainty something like 2 years ahead of time.

Eric Schmidt also suggested that as netbooks get cheaper they may subsidize them to buy marketshare.

The Economist published a story about Brewster Kahle and the idea of an open library.

Tons of great free research from FutureLab.

Seth Godin

Seth Godin riffs on the purpose of schooling, including ideas like...Teach future citizens how to conform & Teach future consumers how to desire

Here he talks about the concept behind his new book Tribes.

Danny Sullivan Highlights Google's 2 Tier Justice System

Danny highlighted how many aggregators of aggregators and content cesspools are bogusly clogging up Google's search results with sites that would be viewed as spam if the owner was not socially well connected:

You kind of feel sorry for Joe Schmoe. Build a name by once having worked for Apple or by having written a few marketing books, and you seem to get much better treatment than Joe would get if he pulled the same SEO play stunts.

Alltop, Mahalo, Squidoo -- none of them dominate Google. But seriously, Squidoo has a PR8 home page? Alltop has a PR7? Search Engine Land, which actually produces original content, sits with a PR6 -- but these guys that simply compile content from others get a big fat PR kiss on the lips?

Hey, I don't fret about PR scores. I know how meaningless they can be. But Joe Schmoe who tried to launch one of these types of sites wouldn't get any PR at all. Google would have shut them down long ago. Lesson here? To be a really successful SEO, get successful at something else, then jump into your SEO play.

Danny Sullivan is probably the only neutral reporter in the search space with a decade-plus of experience AND a background in traditional journalism. He is usually quite neutral, so for him to say that, you know Google is screwing up pretty badly.

If you are good at public relations you can have all the PageRank you want. Can't afford a proper public relations campaign? Have no brand equity other than being branded as an SEO? You are the scum that makes the internet a cesspool. Better luck next life!

If you can't be found, you don't exist. As Google's "spam team" grows more subjective with the definition of spam (hey, I know him, it's not spam; never heard of him, it's spam; etc.), the web loses out on its diversity. Meanwhile, how about you view some great fraudulent government grant ads through AdWords.

Google's public relations team publicly lied about cleaning those fraudulent ads up.

"Our AdWords Content Policy does not permit ads for sites that make false claims, and we investigate and remove any ads that violate our policies," said Google in a statement e-mailed to ClickZ News. "We have discussed these issues with the Federal Trade Commission and reaffirmed our commitment to protecting users from scam ads."

The above LIE was quoted from an article published 3 weeks ago, but Google is still making over $10,000 a day carpet-bombing searchers with that reverse-billing fraud (and probably tens of thousands more on the content network).

Spam is only spam *if* the spammer is not paying Google and is too small to fight back against the often arbitrary and unjust decisions of the irrational Google engineers who "fight spam" while turning a blind eye to grant scam ads.

Pretty worthless hypocrisy, Google. Who is trying to turn the web into a cesspool full of fraudulent ads and corporate misinformation? This company:

SEM Rush Uncovered: An Interview With Michael Goldfinch

I have been a big fan of the SEM Rush project since it launched, and recently interviewed their CEO, Michael Goldfinch. We discussed their software projects, and how they got into the field of SEO.

You guys have had a strong string of hits in the SEO space with SEO Digger, Ads Spy, SEO Quake, and SEM Rush. How did you guys get involved in the SEO space? What do you attribute your string of successes to?

The SeoQuake Team became active on the Internet at the end of the 90s, back when Spedia was still alive. Since then we have developed a lot of web projects. In 2000 we started doing SEO, optimizing for Altavista and NorthernLight. Those were happy times! I remember that we made pages with enough keywords and, after entering a captcha, got the #1 spot immediately. After that we worked for different companies doing SEO and web development.

SeoQuake and SeoDigger are public products, and they make up only a small share of our work. The SeoQuake Team made them public to demonstrate our ability to develop such products. SeoQuake and SeoDigger are extremely popular for a reason: they are helpful, user-friendly, and affordable. They are innovative and developed with users’ needs in mind.

So you guys have created a pretty cool tool in SEM Rush. What made you guys decide to create it?

In the summer of 2008, after the quiet start of AdsSpy.com, we were playing with different ideas around AdWords keyword research, AdWords arbitrage, and competitors’ keywords. And when we found out that Velocityscape planned to launch their new version of Spyfu, we launched SEMRush. That was a nice joke, I suppose. When we saw PPC Web Spy, we just integrated SEMRush AdWords data into SeoQuake =)

When you guys created SEM Rush you closed down SEO Digger. Is there any reason you didn't do a 301 redirect during the transition? What made you feel that a new brand was needed after SEO Digger was already so well known amongst the SEO community?

SeoDigger has not been closed yet; API access for all registered users is still active :) We plan to close it after the integration with Market Samurai is finished. We gave the project a new name to emphasize its novelty. Besides, the product's value is more important to us than its name.

How hard is it to crawl and update that much data from Google? Do you guys need a lot of beefy servers to grab all that data and serve it up?

It can be really difficult for anyone except Google, but we like this problem, and with some relevant experience it becomes not so hard. Without going into the details of the technology, I have to say that a lot of developers do not bother optimizing their code and databases, and therefore they need large server farms. Instead, we use a lot of C++. We also use new technologies, such as SSDs (solid-state drives) in our servers. Of course we have a number of servers in different data centers to monitor geo-targeted SERPs on Google and other search engines.

Recently Google tested showing AJAX search results to some searchers. If they roll out such a program will you guys still be able to gather all that great data?

We have not tested it yet, but I believe SEMRush will still work. Google could block all these analyzers, but why would they do that? Such tools help advertisers, SEOs, and other people working on the web, and they make Google AdWords more popular. In addition, I suppose that AJAX SERPs are interesting for geeks, not for mass users.

One of the most interesting SEM Rush features (that I have not seen in any other competitive research tools) is the ability to cross compare the organic rankings for one site and the AdWords ads for another site. What gave you guys the idea to do that?

Our programmer, far away in Siberia, did this on his own without being asked. The team decided to leave it “as is” :) We like this feature and we plan to improve it.

SEM Rush does a break down of the value of each ranking. What statistics do you use to determine the difference between say a #3 and a #5 ranking?

SEMRush uses open statistics on how CTR depends on a URL's position in the SERPs. In addition, it uses statistics from our own sites.
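To make the idea concrete, here is a toy sketch of position-based traffic valuation. The CTR figures below are illustrative assumptions, not SEMRush's actual statistics.

```python
# Illustrative position-CTR curve; real tools use measured statistics.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_monthly_value(position, monthly_searches, value_per_visit):
    """Approximate traffic value of ranking at a given SERP position."""
    ctr = CTR_BY_POSITION.get(position, 0.02)  # long-tail default
    return monthly_searches * ctr * value_per_visit

# Under this curve, a #3 ranking is worth twice a #5 ranking:
print(estimated_monthly_value(3, 10_000, 0.50))  # 500.0
print(estimated_monthly_value(5, 10_000, 0.50))  # 250.0
```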

You guys created a list of some of the most valuable and high traffic domains with great organic Google rankings. Have you guys thought about putting together a list of the most valuable keywords as well?

Right now everybody sells “expensive” keywords, “huge” keyword databases, “profitable” keyword lists, etc. The SeoQuake Team is going to sell more valuable information: domains related to the top AdWords spenders! The full version of SEMRush Rank will be available soon. You will be able to download lists of high organic traffic domains (with stats) and lists of high AdWords traffic domains (with traffic and cost details). I think this info is really important to SEO firms and AdWords professionals.

There is a version of SEM Rush for the German market and one for the Russian market. Are these both primarily based on Google rankings? Are there any other similar tools that serve these markets, or are you guys first to market in these markets? Have you guys considered making a French version and/or a UK version?

Yes. We made them to simplify google.de and google.ru analysis. As you know, Keywordspy tries to do this for German, and nobody does it for the Russian version of Google. SEMRush.de (the German version of SEMRush) is still in beta today. There are some problems with keyword traffic estimation for local markets, because there are no accurate stats for German, UK, or French keywords. As you can see, there aren't enough geo-stats for these keywords even in the Google Keyword Tool. And there is a problem in sorting the keyword database: you can easily detect market-specific keywords, but what should you do with universal keywords like "cnn"? We try to build accurate keyword packs for local markets, and we are open to cooperation with anyone who can give us this information.

What are some of the most interesting ways people have used SEM Rush to boost their business?

SEMRush users don't report their successes to us, because silence is golden. I can only say that over the last month our users ran 1.5 million different queries for Google AdWords and organic keyword reports across different sites.

Do you guys have any other new analytics tools or other SEO related tools in the works?

Our users keep giving us tons of great ideas, and we generate a lot of them too. So we are constantly developing different tools. They are quite simple ones, and I'd rather not discuss them now because they are not ready yet. When we launch them, you will notice, I promise! One of them will be for brand analysis. That is not pure SEO, but we believe it is interesting too.

Do you guys plan on adding support for subdomains (like seeing the subdomains as their own site)?

We recognize top-level domains using this list: https://wiki.mozilla.org/TLD_List. SEMRush already supports reports for URLs and for domains. Recognizing subdomains correctly is a problem: there are no rules for whether a subdomain is a separate site or just a directory of the same site. We will solve this problem some day, but not today.
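For illustration, domain recognition with a suffix list works roughly like this. The tiny hard-coded list here is a stand-in for the real (much larger) Mozilla list.

```python
# Minimal stand-in for a public suffix list; the real list has
# thousands of entries.
PUBLIC_SUFFIXES = {"com", "org", "co.uk", "com.au"}

def registrable_domain(hostname):
    """Return the registered domain: the longest matching public
    suffix plus one extra label."""
    labels = hostname.lower().split(".")
    # Longer candidate suffixes are tried first as i increases from 0.
    for i in range(len(labels)):
        suffix = ".".join(labels[i:])
        if suffix in PUBLIC_SUFFIXES and i > 0:
            return ".".join(labels[i - 1:])
    return hostname

print(registrable_domain("www.example.co.uk"))  # example.co.uk
print(registrable_domain("blog.example.com"))   # example.com
```

Whether `blog.example.com` is "a different site" from `example.com` is exactly the judgment call the answer above says has no general rule.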

------

Thanks Michael. Check out SEM Rush if you would like to learn more about their new competitive research tool. You may also want to check out our review from when SEM Rush launched.

'Professional' Content vs Content Actually Worth Reading

Some media executives are bitching about Google ranking blogs and sites not controlled by the mainstream media. Of course Google has been tilting their algorithms in the direction of brands, and even includes trusted news partners directly in the search results for recent news items. But that is not enough to make bloated media companies profitable.

"The original source, and the source with real access, should somehow be recognized as the most important in the delivery of results."

Google subsidizes these media companies with additional exposure by

  • weighting domain authority
  • giving them first mover advantage in the search rankings (through direct inclusion of recent news results in the organic search results)
  • featuring their content (yet again) in their news search product
  • favoring informational content over commercial content

If a big business has "real access" and yet loses out to people rewriting the story, it means the original source did one (or more) of the following

  • did a pretty crummy job of reporting
  • did a pretty crummy job of SEO
  • erected barriers that made them not linkworthy
  • fought off niche brands with a generic brand that does not resonate as well with the market

Google could give these media companies almost 100% of the search traffic and many would still go bankrupt because their business models simply do not fit the web. Online ad rates are lower, most of the media infrastructure is unneeded bloat, and individuals and brands are starting to create their own media.

When I click the publish button, tens of thousands of people will read this post. It's not your fault or my fault that big media was too lazy to create niche brands offering relevant, regularly updated content.

Ironically, the quote from AdAge, begging for coverage of the original source, did not have a name on it. You can quote it, but there is no source. These clowns whine about something and are not willing to put their names behind their own words. Maybe that has something to do with why people would rather read elsewhere.

This is the same media that pushed the bogus Iraq war, laughed and joked about those errors (while people were still dying), and missed the financial terrorism occurring back home. Why again is the original source more important than those who dig a bit deeper and add further context?

If the relevancy algorithms are your enemy, then maybe your work is no longer relevant.

Maybe they can work on stuff that matters.

Corporate SEO

http://www.seobook.com/corporate-seo-services
http://www.ariozick.com/how-google-wants-to-destroy-small-business-online/
http://searchengineland.com/enterprise-seo-a-plumbing-problem-29237

link profile seen as a whole
http://www.bing.com/community/blogs/webmaster/archive/2009/06/19/links-t...
http://googlewebmastercentral.blogspot.com/2009/10/dealing-with-low-qual...

http://www.wordtracker.com/academy/brent-payne-interview
http://www.stonetemple.com/articles/interview-brent-payne.shtml
http://www.foliomag.com/video/new-york-times-chief-search-strategist-mar...

corporate SEO is largely about trimming away the fat and leveraging the assets you already have. and perhaps limited link buying. ;)

The corporate SEO faces a number of challenges, many of which are to do with procedure and diplomacy. We'll take a look at these challenges and how to handle them. We'll also look at the specific technical aspects of SEO on corporate sites, and the strategic advantages particular to corporate sites.

Big Obstacles, Big Opportunities

The biggest obstacles in corporate SEO are political.

Corporate sites usually have a team of people working on them. There are a number of stakeholders. These stakeholders consist of managers, related divisions, designers, developers, content producers and writers. There will often be people who will be openly hostile to a change in the way they work. Many of these people may be unfamiliar with search engines and their requirements.

Into this environment walks the SEO.

No matter what, you're going to ruffle a few feathers! How do you deal with the myriad of demands and internal politics?

Get Management Buy In

The first step to achieving good SEO outcomes within an organizational structure is to get management buy-in. Think of internal managers as customers.

Given that management have probably already hired you, getting their buy in should be relatively straightforward. Management will want to see facts, figures and strategies that support the business case. Prepare presentations that demonstrate your proposed strategy, how it supports the business case, how long it will take to achieve, and what your measures of success will be.

What type of facts and figures will they want to see?

1. Show the benefit you provide above the cost of hiring you

If they've hired a full-time inhouse SEO, it's most likely they've already done this calculation, but it doesn't hurt to reinforce it.

For example, let's say they're paying you $50K per year. Overhead for employees is likely another 50% on top of the wage figure. Can you come up with a business case showing you'll provide more than $75K in value per year? You don't need to state this figure explicitly; merely place a ballpark value on your strategy.

Value can be difficult to assess, but you can look at what they're spending on PPC and compare. If they're not spending on PPC, examine keyword bid prices in tools such as Google's AdWords keyword tool. Estimate the value of the keyword terms and traffic you're likely to receive from your SEO campaign.
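The arithmetic above can be sketched as a quick model; all figures are illustrative assumptions, not benchmarks.

```python
# Sketch of the break-even math: salary plus ~50% overhead versus the
# PPC-equivalent value of organic traffic. Figures are illustrative.
def seo_break_even(salary, overhead_rate=0.5):
    """Annual value an in-house SEO must create to cover their cost."""
    return salary * (1 + overhead_rate)

def equivalent_ppc_value(monthly_organic_visits, avg_cpc):
    """Annual value of organic traffic, priced at what PPC would cost."""
    return monthly_organic_visits * avg_cpc * 12

print(seo_break_even(50_000))             # 75000.0
print(equivalent_ppc_value(5_000, 1.50))  # 90000.0 -> above break-even
```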

Try to find out the business plan. What are the company's objectives? What are the objectives of the division? How are they measured? Businesses often have KPIs (Key Performance Indicators). Find out what these are, and fit your strategy to these metrics. The very fact you're asking for these details will likely impress those who have hired you.

2. Show Where The Site Is Now, And Where They Can Be With Good SEO

Demonstrate the position they occupy now, and show where you can get them to *if* your strategy is followed. Prepare charts of current rankings, traffic levels, conversion rates, and overall market trends.

Here's an example business case template you can follow:

  • Background - why SEO is useful
  • The costs of SEO - the cost to the organization of not doing SEO
  • The benefits to the company of SEO - focus on the business benefits
  • Why the organization needs SEO - show competitive advantage potential, decreased advertising costs, increased exposure etc
  • General Principles of SEO - stay broad and high level, avoid technical minutiae
  • Recommended scope and objectives of your SEO strategy
  • Risks - outline the conditions that will prevent you from executing your strategy
  • Cost of your SEO strategy - include any external costs, such as directory submissions, paid placement etc
  • Projected cost/benefit analysis for the organization - compare with other advertising channels, such as PPC
  • Measurement, outcomes, milestones and evaluation - set your KPIs
  • Anticipated overall results - also include a timeframe

What pushes management's buttons? Is it traffic numbers? Is it seeing the company top of the search results? Is it increased sales? It might be a combination of these things. Nail down - in writing - what it is they really want to see delivered, then figure out how to deliver it.

3. Show Them What Their Competitors Are Doing

Is there a competitor who is doing well with their SEO? Prepare facts and figures that show where your company is being outgunned. There is nothing managers, particularly marketing managers, like less than being outgunned by their competitors. If the competitors are using good SEO strategy, you can use this as justification for your strategy.

One objection you may hear is that the company is already running PPC. So why do they need SEO? Impress upon them that most people click on the main search results. SEO clicks are "free", especially over the long term.

Also, a study by iProspect showed that top search results can confer brand equity on highly ranked sites:

Finally, it continues to be apparent that brand equity is conveyed upon companies whose digital assets appear among the top search results by roughly a third of the search engine users. In 2008, 39% of search engine users believe that the companies whose websites are returned among the top search results are the leaders in their field. This figure has grown from 36% in 2006, and 33% in 2002.

Once your strategy is agreed to, you should have the backup you'll need to undertake the hard part.

Convincing The Minions

Various people within the web team need to buy into SEO in order for it to work.

Some companies locate their web team in their IT division, others place their web team in their marketing division. Often, these two business units share ownership of the strategy. It is important to determine which division has the most control, especially over aspects such as site structure, content production, and overall strategy. Get buy-in from the appropriate management team.

Look to establish rapport with, and train, the various people who occupy the following roles.

1. The Manager/Team Leader

You must have buy-in from the person with the most control over the business unit responsible for web strategy.

Managers tend to respond well to anything that helps them achieve departmental goals. Look for areas where synergy exists. For example, marketing managers often have traffic goals and similar visitor metric milestones. Show them how SEO will help meet those objectives.

This is why it is important to frame SEO in business terms, rather than as a purely technical process.

2. The Designer

The designers are responsible for the look and feel of the site. They are probably also responsible for site architecture. Architecture and design are two areas where you are likely to experience a lot of push-back.

There is good reason for this.

What is good for SEO may not be good for users or brand aesthetics. This area needs to be carefully balanced. If the designers think the SEO is compromising the look, feel and operation of the site, then you're not going to get very far, no matter how good your intentions are.

If your designers are familiar with usability, and good designers will be, you're in luck. There are a lot of usability integration points that work for users, designers and SEOs. For example, breadcrumb navigation can be great for usability and SEO, as it allows for the propagation of keywords, and provides strong internal link structure.
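To illustrate that integration point, here is a small sketch that derives a breadcrumb trail from a URL path; the path scheme is hypothetical.

```python
# Sketch: breadcrumb navigation derived from a URL path, one of the
# usability/SEO integration points mentioned above.
def breadcrumbs(path):
    """Turn a path like /guides/corporate-seo/ into (label, href) pairs."""
    parts = [p for p in path.strip("/").split("/") if p]
    crumbs, href = [("Home", "/")], ""
    for part in parts:
        href += "/" + part
        label = part.replace("-", " ").title()  # slug -> readable label
        crumbs.append((label, href))
    return crumbs

print(breadcrumbs("/guides/corporate-seo/"))
# [('Home', '/'), ('Guides', '/guides'), ('Corporate Seo', '/guides/corporate-seo')]
```

Each crumb becomes a keyword-bearing internal link, which is where the SEO benefit comes from.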

Are there disability access laws that the company must comply with? Depending on your legal jurisdiction, there may be disability discrimination laws regarding access, and these can apply to websites. For example, Target was the subject of a legal case brought by the National Federation Of The Blind.

The lawsuit alleged that Target had not made the minimum changes necessary to its Web site to make the site compatible with screen access technology and to allow blind users to access the site to purchase products, redeem gift cards, find Target stores, and perform other functions available to sighted customers.

In order to comply, sites need to provide equal access for those with impairments. If a person is visually impaired, then compliance may mean that the site must be able to be read by a text-to-speech converter in order to be accessible. Of course, any site that can be used and navigated with a text browser will also be search engine friendly. This can be a good angle to use if the law in your jurisdiction supports it, and you are otherwise having problems convincing the designers - and managers - to make a site more search engine friendly.

Also be on the lookout for other areas that require little change and provide natural synergies.


3. Writers & Content Producers

The writers provide the words. The content producers may provide video, pictures, and other media. You'll probably be dealing mostly with the writers.

Writers, especially if they have been writing professionally for a long time, can be very set in their ways. Writers schooled in journalistic or copywriting techniques use methods that predate internet search engines, and often the internet itself. Old habits die hard.

The problem with such writing is that it may not incorporate keyword terms in the right places - particularly headings - or in the frequency you require. Communicating this concept can be difficult, especially with journalists, who like things presented in terms they can understand, usually within a sentence or two.

Avoid jargon. Talk in their terms, not yours. Look for similar concepts and use the journalist's terminology to describe them. For example, both journalists and SEOs know the power of headlines: go for clarity and be descriptive, as opposed to generic. Both write in an inverted pyramid, top-down style, i.e. the most important facts - and keywords - are likely to appear at the top of the article. Both quote sources, i.e. an opportunity for keywords within a link. And so on.

Align their goals with yours. Show writers how much potential traffic there is out there and how keyword research can be used to suggest article topics and title ideas. Show them that by following a few SEO principles, they can get more readers reading their articles. Writers often have communications objectives i.e. to achieve wider reach and exposure, so there might be some obvious, natural synergies to be had. All writers have egos, and like to have their articles widely read.

Check out this tactic, used by Rudy De La Garza Jr at BankRate Inc to help convince writers to adopt SEO practices:

At Bankrate, Mr. De La Garza showed editorial employees that, for some articles, deciding on about 10 main keywords before writing could help increase their number of page views. Writers were already vying for bragging rights to the most popular articles. He told them: "You know what, guys? If we apply a few SEO tactics here, I can help you win the weekly battle," he says.

Writers need to research topics. I've often found writers to be very receptive to SEO data mining techniques i.e. the frequency of keyword searches. Show them how keyword research can be a good way to research topics for articles. They can ensure they are writing on popular themes, or can twist their copy a little in order to tap into search streams.

4. The Developer

The developers are responsible for the technical aspects of the website.

Developers are often located in IT, yet you rely on them to perform a marketing function. Developers tend to work on specific projects, which can cause conflict with the SEO, whose job is very much a work in progress.

Try to embed SEO into the development process. Developers usually work to a brief or requirements document, so include SEO where appropriate. Look for any design specifications that will affect SEO and get these sorted out before the developer starts coding.

One area that is likely to present problems is the structure of URLs. A developer doesn't care if the URL is long and unwieldy. It's probably never been cited as a problem before. Ensure the document specifies a URL structure and site hierarchy that gels with SEO i.e. descriptive, unique file names and a clear, flat directory structure. If the site has already been built, look into rewriting existing URLs.

Some of the marketing advantages include:

  • The URLs look nicer and will likely get clicked on more often
  • The URLs will provide better anchor text if people use them as the link anchor text
  • If you later change CMS programs, having clean core URLs associated with content makes it easier to mesh that content with the new CMS
  • The benefit Google espouses for dynamic URLs (Googlebot being able to stab more random search attempts into a search box) only matters if your site structure is poor and/or you have far more PageRank than content (like a Wikipedia or TechCrunch)
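As an illustration of the URL guidance above, here is a minimal slug routine; the exact rules are assumptions that a real requirements document would pin down.

```python
import re

# Sketch: generating descriptive, SEO-friendly URL slugs, the kind of
# rule a requirements document might specify for developers.
def slugify(title):
    """Lowercase a page title, collapse punctuation, and hyphenate."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics -> hyphen
    return slug.strip("-")

print(slugify("Corporate SEO: Big Obstacles, Big Opportunities"))
# corporate-seo-big-obstacles-big-opportunities
```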

Developers will be aware of the need for site response speed. They also need to ensure the site is crawlable, a job made somewhat easier of late by the introduction of Google Sitemaps.
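A sitemap file is simple enough to generate with a short script; this sketch emits the minimal XML format, and the URLs are placeholders.

```python
# Sketch: generating a minimal XML sitemap in the sitemaps.org format.
# The URLs are placeholders.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap(["http://www.example.com/",
                     "http://www.example.com/about"]))
```

Hooking a generator like this into the CMS publish step keeps the sitemap current without any manual work.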

There might be various coding practices that can be changed in order to enhance SEO. For example, try replacing JavaScript behaviors, particularly for menus, with CSS techniques. Are there other coding aspects that could be enhanced? It might provide an opportunity for the developer to train in new technologies. I've yet to meet a developer who didn't want to learn new ways of coding. It all adds to their CV.

5. Legal

In big companies, copy is usually run past legal before any changes are made. Lawyers, as a profession, are typically risk averse. This can play havoc with SEO strategies, especially edgy, link baiting SEO designed to attract links!

The only way to deal with this is to look for clear guidelines from legal in advance of implementing content strategies. Legal will almost certainly take precedence over SEO as companies look to protect their downside risk. On the bright side, the content of a page - especially if one or two words are changed - isn't going to make or break an SEO strategy.

SEO Best Practices For Corporates

In any change process, there is a lot of retraining that needs to be done. SEO is no exception.

The more people who understand what you do, and how and why you're doing it, the easier your job will be. There is no one way of achieving this, other than to communicate as often as possible. Look at training others as being a big part of your job, and something that should be done on an ongoing basis.

Once you've got people onside, you need to start building procedures into the work-flow itself.

Get a copy of the web site life-cycle and all documents relating to procedure, process, and specification. Amend all documents to include SEO requirements in the process. Highlight all areas that present a risk, and make notes about the consequences of not mitigating these risks. With any amendment to process, there will likely be meetings in which you'll need to justify these changes, so come prepared.

An example of a change of process might be:

When publishing new articles, writers should search for existing articles, and link to them in the related articles section

Look for ways that will make your changes easy to incorporate. For the example above, have the designers build a "Related Articles" section into the template, so the requirement of internal linking becomes a natural part of the article creation process.
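The related-articles step could even be semi-automated. Here is a toy sketch that ranks archived titles by keyword overlap with a new article; the titles are hypothetical.

```python
# Sketch: suggesting "related articles" by simple keyword overlap, so
# internal linking becomes part of the publishing workflow.
def related_articles(new_title, archive, top_n=3):
    """Rank archived titles by the number of words shared with the
    new title, highest overlap first."""
    new_words = set(new_title.lower().split())
    scored = [(len(new_words & set(t.lower().split())), t) for t in archive]
    scored = [(s, t) for s, t in scored if s > 0]  # drop non-matches
    scored.sort(key=lambda pair: -pair[0])
    return [t for _, t in scored[:top_n]]

archive = ["Corporate SEO Basics", "Link Building For Brands",
           "SEO For Site Architecture"]
print(related_articles("Corporate Site Architecture And SEO", archive))
# ['SEO For Site Architecture', 'Corporate SEO Basics']
```

A real implementation would stem words and weight rare terms, but even this crude version surfaces candidate links for the writer to approve.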


Wider Strategy

Big, corporate sites have advantages that small sites do not. Let's look at a few aspects, and how you can leverage them.

Brand Awareness

Corporate sites often have established brand awareness.

Let's consider Coca-Cola.com. Internal politics aside, getting more search exposure for such a ubiquitous brand, with a PR8, would be a cakewalk. It would simply be a case of ensuring the site is crawlable, the directory structure is well organized, and keyword-rich content is added on a regular basis.

However, if we take a close look at the Coca-Cola site, we can see that if they use SEO at all, it is most likely losing out to internal politics. If we run the query site:www.coca-cola.com, we can see that they don't have many pages indexed - around 350, most of which are regional versions of the site. The title tagging is poor, there is a lot of uncrawlable Flash, and the site architecture isn't conducive to SEO. Does any of this matter? Probably not; Coke will sell a lot of soft drink regardless. However, they are throwing away a cheap win in terms of internet marketing by not being more search focused. The same failings would be major for any corporate site that sells direct to the consumer, like an online retailer.

These well-known sites need to mainly focus on internal factors. The external factors are well established.

Some big brands don't have great linking. Perhaps no one has ever considered external linking to be important. Getting links for well known sites is relatively easy. Ask suppliers and customers for links. News media, particularly trade media and business media, will likely be interested in news releases from your company. If you have a PR division, make sure they are using optimized PR templates that include links back to your site. Leverage the extensive network of relationships that corporates usually have.

Make use of sales. Sales people typically have contacts throughout the industry, and these contacts can be useful when it comes to linking. Think of things you can give the customer, in order to help the sales people make a sale, or deepen the existing relationship they have with them. For example, can you profile customers on your site? A customer may welcome a case study that shows them in a good light, and they'll almost certainly link to it.

Get Offline Data Online

Big companies usually have a wealth of data stored on internal networks. Try to get as much of this as possible onto the web site. Obviously, data that is commercially sensitive can't be made public, but there is likely a lot of material that can be marked up and published.

Marketing departments often don't consider such data because they're thinking of the web site as a brochure. However, the more content you have, especially if your site is well linked, the more chances you'll get search engine visitors. If this type of content doesn't support the brand objectives, place it in an area of the site that visitors mostly reach via search engines. Perhaps create a general information section.

Sponsorship

Corporates often sponsor events. Make sure the organizations your company sponsors link back to you. Have your PR people write up search friendly press releases and leverage these events for all they're worth.

Further Reading

New SEO Tools

Get Listed is a cool tool for seeing how your website looks in local search, and to aid you in submitting your site to local search engines.

Wordtracker announced they will be launching a new version of their keyword tool soon.

FairShare helps you track scrapers publishing your content.

WebReader makes it easy for people to click a button that makes your blog speech enabled.

Wiep highlighted a new link building and public relations tool called BuzzStream.

SEOmoz announced they retooled their backlink anchor text analysis tool using their Linkscape data, which they also released an API for.

Majestic SEO created a neighboring sites search.

Wolfram Alpha is to launch in May as an answer engine. I predict that launch strategy will create branding issues (like not being able to outsource the blame for their poor algorithms onto spammers - like Google does) but if the tool is decent it might be a great tool to use when researching content generation ideas. The new CashKeywords toolbar also looks quite useful for researching content ideas.

In January Google started offering a new, more-advanced sitemap generator, and made Jaiku (a Twitter-like tool) open source.

Mozilla is testing collecting usage data to create open research, hopefully they don't pull an AOL with the data though.

Some developers are creating new tools to analyze Wikipedia.
