Interview of Tedster from WebmasterWorld

If you have been in the SEO field for any serious length of time you have probably come across (and benefited from) some of Tedster's work - either directly, or indirectly from others who have repackaged his contributions as their own. He is perhaps a bit modest, but there are few people in the industry as universally well respected as he is. I have been meaning to interview him for a while now, and he is going to be speaking at Pubcon South on April 14th in Dallas, so I figured now was as good a time as any :)

How long have you been an SEO, and how did you get into the field?

I started building websites in 1995, before the word SEO had even been invented. I came from a background in retail marketing, rather than technology or graphic design. So my orientation wasn't just "have I built a good site?", but also "are enough people finding my site?"

The best method for bringing in traffic seemed to be the search engines, so I began discussing this kind of marketing with other people I found who had the same focus. Ah, the good old days, right? We were so basic and innocently focused, you know?

If you could list a few key documents that helped you develop your understanding of search, which would be the most important ones?

Here are a few documents that acted as major watersheds for me:

Is PageRank still crucial? Or have other things replaced it in terms of importance?

What PageRank is measuring (or attempting to measure) is still very critical — both the quality and number of other web pages that link to the given page. We don't need to worship those public PR numbers, but we definitely do need quality back-links (and quality internal linking) to rank well on competitive queries.

There appears to be something parallel to PR that is emerging from social media — some metric that uses the model of influencers or thought leaders. Even with that in the mix, ranking would still depend on links, but those links would be modified a bit by "followers" and "friends", since many social sites are cautious with do-follow links.
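For readers who want to see the mechanics behind the link-counting that PageRank describes, here is a minimal, purely illustrative power-iteration sketch in Python. The toy link graph, damping factor, and iteration count are assumptions made up for the example; this is not Google's production formula or its actual weights.

```python
# Illustrative only: a toy power-iteration PageRank over a tiny link graph.
# The graph, damping factor, and iteration count are hypothetical examples.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
print(pagerank(toy_graph))  # pages with more and better inlinks score higher
```

The intuition it shows is the one above: pages that attract links from other well-linked pages end up with higher scores, which is why quality back-links and sensible internal linking matter.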

Let's play: I have got a penalty - SEO edition. Position 6, -30, 999, etc. Are these just bogus excuses from poor SEOs who have no business calling themselves SEOs, or are they legitimate filters & penalties?

If the page never ranked well, then yes - it could well be a bogus excuse by someone whose only claim to being an SEO is that they read an e-book and bought some rank tracking software. However, Google definitely has used very obvious numeric demotions for pages that used to rank at the top.

The original -30 penalty is an example that nailed even domain name "navigational" searches. It affected some sites that did very aggressive link and 301 redirect manipulation.

What was originally called the -950 (end of results) penalty, while never an exact number, most definitely sent some very well-ranked pages down into the very deep results pages. Those websites were often optimized by very solid SEO people, but then Google came along and decided that the methods were no longer OK.

In recent months, those exact-number penalties seem to have slipped away, replaced by something a bit more "floating" and less transparent. My guess is that a negative percentage is applied in the final re-ranking run, rather than subtracting a fixed number of positions. Google's patent for Phrase-based Indexing does mention both possible approaches.

But even using percentages rather than a fixed number, when a top-ranked page runs afoul of some spam prevention filter, it can still tank pretty far. We just can't read the exact problem from the exact number of positions lost anymore.
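To make that contrast concrete, here is a small hypothetical sketch of the two demotion styles just described: pushing a page down a fixed number of positions versus shaving a percentage off its final re-ranking score. The result list, scores, and penalty values are all invented for illustration.

```python
# Hypothetical illustration of two demotion styles; the result list, scores,
# and penalty values are invented, not taken from any Google system.

results = [("page-a", 9.2), ("page-b", 8.7), ("page-c", 8.1), ("page-d", 6.4)]

def demote_fixed(ranked, url, positions):
    """Push one URL down a fixed number of positions (a '-30' style penalty)."""
    urls = [u for u, _ in sorted(ranked, key=lambda r: r[1], reverse=True)]
    i = urls.index(url)
    urls.insert(min(i + positions, len(urls)), urls.pop(i))
    return urls

def demote_percentage(ranked, url, factor):
    """Scale one URL's final score down by a percentage before the last sort."""
    adjusted = [(u, s * (1 - factor) if u == url else s) for u, s in ranked]
    return [u for u, _ in sorted(adjusted, key=lambda r: r[1], reverse=True)]

print(demote_fixed(results, "page-a", 2))         # drops exactly two spots
print(demote_percentage(results, "page-a", 0.4))  # lands wherever the reduced score falls
```

With a percentage cut, how far the page falls depends on how tightly the competing scores are packed, which fits the more "floating", less readable behavior described above.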

Do you see Google as having many false positives when they whack websites?

Yes, unfortunately I do. From what I see, Google tends to build an algorithm or heuristic that gathers up all the URLs that seem to follow their "spam pattern du jour" — and then they all get whacked in one big sweep. Then the reconsideration requests and the forum or blog complaints start flying, and soon Google changes some factor in that filter. Voila! Some of the dolphins get released from the tuna net.

One very public case was discussed on Google Groups, where an innocent page lost its ranking because of a "too much white space" filter that misread the effect of an iframe!

Google's John Mueller fixed the issue manually by placing a flag on that one site to trigger a human inspection if it ever got whacked in the future. I'd assume that the particular filter was tweaked soon after, although there was no official word.

How many false positives does it take to add up to "many"? I'd guess that collateral damage is a single digit percentage at most — probably well under 5% of all filtered pages, and possibly less than 1%. It still hurts in a big way when it hits YOUR meticulously clean website. And even a penalty that incorrectly nails one site out of 300 can still affect quite a lot over the entire web.

How often when rankings tank do you think it is due to an algorithmic issue, versus how often it is due to an editorial issue with search employees?

When there are lots of similar complaints at the same time, then it's often a change in some algorithm factor. But if it's just one site, and that site hasn't done something radically new and different in recent times, then it's more likely the ranking change came from a human editorial review.

Human editors are continually doing quality review on the high volume, big money search results. It can easily happen that something gets noticed that wasn't seen before and that slipped through the machine part of the algorithm for a long time.

That said, it is scary how often sites DO make drastic errors and don't realize it. You see things like:

  • nofollow robots meta tags getting imported from the development server
  • robots.txt and .htaccess configurations gone way wrong
  • hacked servers that are hosting cloaked parasite content

Google did a big favor for honest webmasters with their "Fetch as Googlebot" tool. Sometimes it's the easiest way to catch what those hacker criminals are doing.
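In that same self-diagnostic spirit, here is a small, purely illustrative Python check (not an official Google tool) that fetches a homepage and its robots.txt and flags the first two accidents in the list above: a stray noindex/nofollow robots meta tag, or a blanket Disallow rule. The URL and the regex checks are assumptions for the example; cloaked parasite content still calls for something like "Fetch as Googlebot".

```python
# Hypothetical self-check, illustrative only: fetch a page and its robots.txt
# and flag the kinds of accidental blocks mentioned above.
import re
import urllib.request

def check_site(base_url):
    # 1. Look for a robots meta tag carrying noindex/nofollow on the homepage.
    html = urllib.request.urlopen(base_url).read().decode("utf-8", "ignore")
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    if meta and re.search(r'noindex|nofollow', meta.group(0), re.I):
        print("Warning: robots meta tag found:", meta.group(0))

    # 2. Look for a robots.txt rule that blocks the whole site.
    robots_url = base_url.rstrip("/") + "/robots.txt"
    robots = urllib.request.urlopen(robots_url).read().decode("utf-8", "ignore")
    if re.search(r'^\s*Disallow:\s*/\s*$', robots, re.M):
        print("Warning: robots.txt contains a blanket 'Disallow: /' rule")

check_site("https://www.example.com")  # example.com is a placeholder domain
```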

When does it make sense for an SEO to decide to grovel to Google for forgiveness, and when should they try to fix it themselves and wait out an algorithmic response?

If you know what you've been doing that tripped the penalty, fix it and submit the Reconsideration Request. If you don't know, then work on it — and if you can't find a danged thing wrong, try the Google Webmaster Forums first, then a Request. When income depends on it, I say "grovel".

I don't really consider it groveling, in fact. The Reconsideration Request is one way Google acknowledges that their ranking system can do bad things to good websites.

I've never seen a case where a request created a problem for the website involved. It may not do any good, but I've never seen it do harm. I even know of a case where the first response was essentially "your site will never rank again" — but later on, it still did. There's always hope, unless your sites are really worthless spam.

Many SEOs theorize that sometimes Google has a bit of a 2-tier justice system where bigger sites get away with murder and smaller sites get the oppressive thumb. Do you agree with that? If no, please explain why you think it is an inaccurate view. If yes, do you see it as something Google will eventually address?

I'd say there is something like that going on — it comes mostly because Google's primary focus is on the end user experience. Even-handed fairness to all websites is on the table, but it's a secondary concern.

The end user often expects to see such and such an authority in the results, especially when it's been there in the past. So Google itself looks broken to a lot of people if that site gets penalized. They are between a rock and a hard place now.

What may happen goes something like this: an A-list website gets penalized, but they can repair their spam tactics and get released from their penalty a lot faster than some less prominent website would. It does seem that some penalties get released only on a certain time frame, but you don't see those time frames applied to an A-list.

This may even be an effect of some algorithm factor. If you watch the flow of data between the various Google IP addresses, you may see this: There are times when the domain roots from certain high value websites go missing and then come back. Several data center watchers I know feel that this is evidence for some kind of white-list.

If there is a white-list, then it requires a history of trust plus a strong business presence to get included. So it might also make sense that forgiveness can come quickly.

As a practical matter, for major sites there can easily be no one person who knows everything that is going on in all the business units that touch the website.

Someone down the org chart may hire an "SEO company" that pulls some funny business and Google may seem to turn a blind eye to it, because the site is so strong and so important to Google's end user. They may also just ignore those spam signals rather than penalize them.

Large authority site content mills are all the rage in early 2010. Will they still be an effective business model in 2013?

It's tough to see how this could be quickly and effectively reined in, at least not by algorithm. I assume that this kind of empty filler content is not very useful for visitors — it certainly isn't for me. So I also assume it must be on Google's radar.

I'd say there's a certain parallel to the paid links war, and Google's first skirmishes in that arena gave them a few black eyes. So I expect any move against the cheap content mills to be taken slowly, and mostly by human editorial review.

The problem here is that not every provider of freelance content is providing junk - though some are. As far as I know, there is no current semantic processing that can sort out the two.

Given that free forums have a fairly low barrier to entry, there are perhaps false alarms every day ringing in the next major update or some such. How do you know when a change is the real deal? Do you passively track a lot of data? And what makes you so good at taking a sea of tidbits and meshing them into a working theme?

I do watch a lot of data, although not nearly to the degree that I used to. Trying to reverse engineer the rankings is not as fruitful as it used to be — especially now that certain positions below the top three seem to be "audition spots" rather than actually earned rankings.

It helps to have a lot of private communications — both with other trusted SEOs and also with people who post on the forums. When I combine that kind of input with my study of the patents and other Google communications, usually patterns start to stand out.

When you say "audition spots" how does that differ from "actually earned rankings"? Should webmasters worry if their rankings bounce around a bit? How long does it typically take to stabilize? Are there any early signs of an audition going good or bad? Should webmasters try to adjust mid-stream, and if so, what precautions should they take?

At least in some verticals, Google seems to be using the bottom of page 1 to give promising pages a "trial" to see how they perform. The criteria for passing these trials or "auditions" are not very clear, but something about the page looks good to Google, and so they give it a shot.

So if a page suddenly pops to a first page ranking from somewhere deep, that's certainly a good sign. But it doesn't mean that the new ranking is stable. If a page has recently jumped way up, it may also go back down. I wouldn't suggest doing anything drastic in such situations, and I wouldn't overestimate that new ranking, either. It may only be shown to certain users and not others. As always, solid new backlinks can help - especially if they are coming from an area of the web that was previously not heard from in the backlink profile. But I wouldn't play around with on-page or on-site factors at a time like that.

There's also a situation where a page seems to have earned a lot of recent backlinks but there's something about those links that smells a bit unnatural. In cases like that, I've seen the page get a page one position for just certain hours out of the day. But again, it's the total backlink profile and its diversity that I think is in play. If you've done some recent "link building" but it's all one type, or the anchor text is too obviously manipulated, then look around for some other kinds of places to attract some diversity in future backlinks.

On large & open forums lots of people tend to have vastly different experience sets, knowledge sets, and even perhaps motives. How important is your background knowledge of individuals in determining how to add their input into your working theme? Who are some of the people you trust the most in the search space?

I try never to be prejudiced by someone's recent entry into the field. Sometimes a very new person makes a key observation, even if they can't interpret it correctly.

There is a kind of "soft SEO" knowledge that is rampant today and it isn't going to go away. It's a mythology mill and it's important not to base a business decision on SEO mythology. So, I trust hands on people more than manager types and front people for businesses. If you don't walk the walk, then for me your talk is highly suspect.

I pay attention to how people use technical vocabulary — do they say URL when they mean domain name? Do they say tag when they mean element or attribute? Not that we don't all use verbal shortcuts, but when a pattern of technical precision becomes clear, then I listen more closely.

I have long trusted people who do not have prominent "names" as well as some who do. But I also trust people more within their area of focus, and not necessarily when they offer opinions in some other area.

I hate to make a list, because I know someone is going to get left out accidentally. Let's just say "the usual suspects." But as an example, if Bruce Clay says he's tested something and discovered "X", you can be pretty sure that he's not blowing sunshine.

Someone who doesn't have huge name recognition, but who I appreciate very much is Dave Harry (thegypsy). That's partly because he pays attention to Phrase-based Indexing and other information retrieval topics that I also watch. I used to feel like a lone explorer in those areas before I discovered Dave's contributions.

What is the biggest thing about Google where you later found out you were a bit off, but were pretty certain you were right?

That's easy! Using the rel="nofollow" attribute for PR sculpting. Google made that method ineffective long before I stopped advocating it. I think I actually blushed when I read the comment from Matt Cutts that the change had been in place for over a year.

What is the biggest thing about Google where you were right on it, but people didn't believe until months or years later?

The reality of the poorly named "minus 950" penalty. I didn't name it, by the way. It just sort of evolved from the greater community, even though I kept trying for "EOR" or "End Of Results".

At PubCon South I believe you are speaking on information architecture. How important is site structure to an effective SEO strategy? Do you see it gaining or losing importance going forward?

It is hugely important - both for search engines and for human visitors.

Information Architecture (IA) has also been one of the least well understood areas in website development. IA actually begins BEFORE the technical site structure is set up. Once you know the marketing purpose of the site, precisely and in granular detail, then IA is next.

IA involves taking all the planned content and putting it into buckets. There are many different ways to bucket any pile of content. Some approaches are built on rather personal idiosyncrasies, and other types can be more universally approachable. Even if you are planning a very elaborate, user tagged "faceted navigation" system, you still need to decide on a default set of content buckets.

That initial bucketing process then flows into deciding the main menu structure. Next you choose the menu labels, and this is the stage where you fix the actual wording and fold in keyword research. But if a site is built on inflexible keyword targets from the start, then it can often be a confusing mess for a visitor to navigate.

As traffic data grows in importance for search ranking, I do see Information Architecture finally coming into its own. However, the value for the human visitor has always been clearly visible on the bottom line.
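As a toy illustration of the bucketing step described above, here is a sketch that organizes the same handful of pages two ways: one mirroring internal departments, the other mirroring visitor tasks. The page names and buckets are invented examples, not a recommendation for any particular site.

```python
# Purely illustrative sketch of the bucketing step: the same pile of content
# organized two ways. The categories and page names are invented examples.

pages = ["running-shoes", "trail-shoes", "return-policy", "size-guide",
         "company-history", "press-releases", "gift-cards"]

# Bucketing that mirrors internal departments.
by_department = {
    "Footwear Division": ["running-shoes", "trail-shoes"],
    "Customer Service Dept": ["return-policy", "size-guide"],
    "Corporate Communications": ["company-history", "press-releases", "gift-cards"],
}

# Visitor-oriented bucketing: groups pages by the task the visitor came to do.
by_visitor_task = {
    "Shop shoes": ["running-shoes", "trail-shoes", "gift-cards"],
    "Get help": ["return-policy", "size-guide"],
    "About us": ["company-history", "press-releases"],
}

# The chosen buckets become the default main menu; keyword research then
# refines the labels, not the structure.
for label, items in by_visitor_task.items():
    print(label, "->", items)
```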

What are some of the biggest issues & errors you see people make when setting up their IA?

There are two big pitfalls I run into all the time:

  • Throwing too many choices at the visitor. Macy's doesn’t put everything they sell in their display windows, and neither should a website.
  • Using the internal organization of the business as the way to organize the website. That includes merely exporting a catalog to a web interface.

How would you compare PubCon South against other conferences you have attended in the past?

PubCon South is a more intimate venue than, say, Vegas. That means less distraction and more in-depth networking. Even though people do attend from all over the world, there is a strong regional attendance that also gives the conference a different flavor — one that I find a very healthy change of pace.

In addition, PubCon has introduced a new format — the Spotlight Session. One entire track is made completely of Spotlight Sessions with just one or two presenters, rather than an entire panel. These are much more interactive and allow us to really stretch out on key topics.

---

Thanks Tedster! If you want to see Tedster speak he will be at Pubcon Dallas on the 14th, and if you want to learn about working with him please check out Converseon. You can also read his latest musings on search and SEO by looking into the Google forums on WebmasterWorld. A few months back Tedster also did an interview with Stuntdubl.

Published: April 6, 2010 by Aaron Wall in interviews

Comments

pageoneresults
April 6, 2010 - 2:02pm

Great interview Aaron! tedster is one of a select group of people that I've paid very close attention to over the years. He and I have had some very detailed discussions at WebmasterWorld and it's always a pleasure to read tedster, always! He is a Diplomat and well respected by his peers. We love ya tedster, in a SEO guy kind of way. ;)

backlinktracking
April 7, 2010 - 12:22pm

Haven't read such good tips for quite a while. Thanks for your work tedster

Shrike99
April 10, 2010 - 2:07am

I've been reading SEObook for years now, but I never took the time to leave a comment. But now is the right time. Thanks a lot to Tedster. He always helped me professionally on WebMasterWorld, often on private messaging too. You were there years ago when I was learning SEO, and because of people like you, I kept liking it, and now work as an SEO. Thanks Tedster!

Jonathan Villiard
www.OptimisationV.com
