Does Domain Extension Matter?

Some countries have rules which make it harder or more expensive to get a local domain than a global one. For local search queries, sites which match the local domain extension or are hosted on a machine in that country may get a boost in relevancy over global domains (e.g. .uk may rank well in the UK, .de may rank well in Germany).

Google can use the increased price of local hosting and/or the rules associated with gaining a local domain extension to assume that locally hosted or locally registered domains may have greater local relevancy.

Likely due to less incentive to spam, a smaller content base, and a weaker understanding of the local language, many of the filters that are applied to the global search results may not be applied to some local results.

By looking at link reputation scores, Google lets pages on websites vote for other pages. On the commercial web the purity of many votes may be in question. Weblogs Inc., for example, runs gambling ads on over 40 of its blogs, in spite of Google being a minority owner of that network.

In a recent WMW thread someone mentioned this URL (maricopa.gov) as a .gov domain that accepts advertising links, but generally it is much harder to buy .gov or .edu links than .com or .net links.

Beyond .edu and .gov there are other rare extensions which people probably do not talk about much but which carry similar weight. In the UK .ac.uk is the equivalent of a .edu, and perhaps some .mil domains may be trusted a bit more than the average .com, .net, .info, or .biz domain.

The trust factor is threefold:

  • The standards required to get a .edu (or other rare domain extension) imply a certain level of credibility.

  • When the web started, educational institutions and governmental bodies were at its core. With that greater history they are more likely to have more link equity. Over time, webmasters of scraper sites and legitimate web pages alike become more inclined to link to the top ranking pages, which reinforces that link popularity.
  • Generally many of the well cited college papers and governmental pages are of higher quality than the average web page due to internal requirements. On top of that, they are harder to influence than most average web pages. For example, it is pretty damn hard to get a professor to link to your site or update his or her outdated links. No professor wants some random self-promotional asshole (which is how they will view many people who contact them) telling them that their content is outdated or inaccurate.

When you read about TrustRank, note that the seed set of sites was all backed by government, educational, or corporate bodies. If you don't think Google relies on third parties in this way, think about how they limit what sources they accept for their local search product or for their news search.

Surely many college students are selling .edu links by now, but those are still a bit harder for the AVERAGE webmaster to find than .com links for sale.

That which is rare, hard to obtain, hard to influence, or vetted by other trusted bodies may aid in relevancy scoring.
It has been a long time since a link is a link.

Google's Recent Search Result Changes

Google has been testing adding more information near search listings, including:

  • search this site
  • inside this site (links to other pages on that site)
  • related (links to related sites)

Testing the above, along with inline query suggestions, vertical results via Google OneBox, and suggested verticals for the broadest query types, allows Google to attack vertical search from many angles.

Suggesting the broadest databases (shopping, news, images) for broad query types allows them to prevent too many large verticals from being created unless their creators do something fundamentally innovative. Increasing minimum bids for low quality ads also filters out some of the arbitrage model.

Query suggestions as you type and inline suggestions guide searchers toward more common (and likely more meaningful) search paths which will - on average - lead searchers to more useful results. They also aggressively aggregate data in some of the larger verticals, which adds value to the top few players they trust while making it harder for new players to spring up in those markets.

By adding more information near regular search result listings (including site search, related internal links and related external links) they only have to get near the search answer without necessarily needing to answer it precisely. Get close enough, teach people about things like related links and site search, and searchers should be able to get the rest of the way to where they want to go.

The search box has the most value per pixel second, and until some major publishers find new monetization models or ways to challenge Google it is obvious that Google is going to keep adding more and more information to their results. Google may even be the one who helps them find better ways to monetize.

Unlike the competition, Google is not afraid to keep pushing the boundaries of their results, even if in the short term those tests lead to lower earnings. Why hasn't Yahoo! done anything with their Mindset search yet?

Once an engine gets enough marketshare there is a virtually endless stream of revenue possibilities, so long as they listen to their users.

The value isn't just in the network, but in how quickly and smartly it reacts to changes. Google generally is the king at that.

They lower costs across the board while making information more accessible. ClickTracks now has a free version. If companies like Britannica listen to the advice they are given, then Google may have access to encyclopedias of information for free, on top of having the largest user base.

I think Wall Street is a bit stupid for reacting to quarter-to-quarter results. I just don't see a way for Google to lose at this point. Even if eBay partners with another large player, it does not change the fact that eBay's value add and relevancy are decreasing with each day of non-innovation.

While keeping an eye on general search, Google has also refined some of the more important verticals, which allows them to answer queries more precisely for searchers who want much more than a near answer. In some of those verticals they are creating new standards for what is important.

You can count on Google hitting the education market hard, from funding literacy to pointing librarians at lesson plans.

From a marketing perspective these changes all add legitimacy while making the marketplace and SERPs more relevant. But as Google pushes these types of features they will also create new types of spam. For example, if you can't easily rank #1 for a competitive phrase, but you can easily make Google believe your document is somehow one of the most related documents to what is ranking at #1, that might be a cheap way to garner targeted traffic. Learning how to become related will be exceptionally useful when the current results do not answer the query as well as they should.

Inline Query Refinement - the Cheap Way to Rank

Instead of going after broad terms directly, you can sometimes rank for a frequently searched, slightly less broad term and have that page show up in the search results for the broad term.

It is likely going to be much easier to rank in the top 2 to 3 results for a longer query than it is to rank in the top 10 for a short generic query. Bill reviewed Google's query refinement here, and in this post I noted that for under $1 I was able to rank #4 for Marlboro by ranking #1 for Marlboro Miles.

AdSense 101 Tip: Controlling Page Width for Readability and Profitability

Fluid page designs are supposed to be nice, at least in theory, but if you don't control the page presentation it is hard to maximize the advertising opportunity and blend ads into the layout as well as you can.

A page width set to 100% gives the ads a relatively small percentage of the screen width unless the ads are huge, and huge ads get ignored because they scream I AM AN AD.

If the content area is exceptionally wide the page becomes hard to read, because the eye has to travel further from left to right on each line than is comfortable.

If you limit the page width but align it to the left, the ads still may not get clicked as much, because ads on either side of the content are not viewed as well.

So, to make more bank per page view, it makes sense to set a page width and center the page. If you still want to keep the page somewhat liquid but capped at a maximum width, you may want to use max-width.

760px is a common page width. People are migrating to bigger browser windows, but more people will also be connecting to the web on mobile devices.
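
A minimal sketch of that layout in CSS (the #wrapper id is hypothetical):

  /* fixed-width, centered content column */
  #wrapper {
    width: 760px;
    margin: 0 auto; /* centers the page horizontally */
  }

Or, for a semi-liquid layout capped at a maximum width:

  #wrapper {
    max-width: 760px; /* note: not supported by older versions of Internet Explorer */
    margin: 0 auto;
  }

Either way, the page content and ad units all go inside a single <div id="wrapper"> container.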

Another good AdSense tip is to match (or nearly match) your text size and font to that of the AdSense ads:

  font-size: 13px;
  font-family: arial, sans-serif;

Depending on your ad units the size may sometimes change. If you right click on an AdSense frame you can view the source of that frame to get that information.

Many designers recommend setting text sizes using em instead of pixels.
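
For example, a rough em-based equivalent of the above (assuming the common 16px browser default):

  body {
    font-family: arial, sans-serif;
    font-size: 0.8125em; /* 13px / 16px default = 0.8125em */
  }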

Google Image Search Optimization - Paris Hilton Pics

DaveN points at a screenshot of a Google search result with Paris Hilton giving.... well nevermind. ;)

I did a search on the same phrase and saw someone putting their URL in a top ranked image. Pretty smart marketing there, and no doubt one of the cheapest ways to tap into popular culture.

I bet many non-profit groups and other sites which have significant authority and limited funds will eventually start making their voices heard in the search results far more, especially in image search, where they may put the names of people they feel caused problems (and/or other messages) on horrific images of piss poor humanity in action.

I haven't done much on the image optimization front, but there can only be a limited number of ranking factors for images (a markup sketch follows the list):

  • file name

  • image alt text
  • image title
  • text near the image
  • image age
  • click streams
  • trust of site image is on
  • links referencing the image
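
Several of the on-page factors live right in the markup. A sketch, with a hypothetical file name, alt text, and title:

  <!-- descriptive file name, alt text, and title, with relevant text near the image -->
  <img src="/images/paris-hilton-premiere.jpg"
       alt="Paris Hilton at a movie premiere"
       title="Paris Hilton on the red carpet" />
  <p>Paris Hilton arriving at the premiere of...</p>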

Earning Google Trust With User Feedback & Engagement Metrics

Via Brickman comes an interesting WMW thread stating that Google may be using traffic analysis and user feedback (clickstreams, etc.) to help determine the quality of a site.

I have spoken to a well known engineer at another major engine who told me that it plays a significant role in their engine.

It makes sense that they would want to allow users to give feedback, but they wouldn't want to rely on traffic analysis alone, because that would promote large conglomerate sites and/or stifle innovation across the board (by promoting first movers at the expense of better products that followed).

In the WMW thread Walkman said:

Traffic? This would mean that a search engine is the last to know/show that you're popular.

But when you think about it, hasn't search ALWAYS been this way? Following links that OTHERS PUBLISHED. The links were not only used to crawl, but also as a good way to imply trust or quality.

Quality global web search has never been about helping you find stuff first. That is what vertical databases, vertical search services, human editors and reporters are for.

Links to a site without traffic no longer imply the same level of trust that they used to. Sites selling link trading services use sales copy like:

The more websites you can get to link to your website, the higher you will rank in the search engines, guaranteed!

And even the mainstream media is talking about SEO.

Some people will claim that using searcher feedback as a baseline to help determine site quality is nonsense, but most of those people are probably launching new brands off their old brands and current popularity, which makes them much more likely to instantly get significant traffic, mindshare, and linkage data.

It is easy to learn from your own SEO experiences, but it is also easy to assume that what happens to you is the way things happen everywhere, even when that is not the case.

I don't think Google would want to base much of the overall relevancy algorithm on site popularity (and clearly they don't, since the top results are not always the most popular sites), but they can and may use traffic patterns and searcher feedback to filter out junk sites. It may also help certain types of link spam stick out: a site that just picked up 50,000 backlinks of which few drive any traffic may be a red flag for spam.

Couple some of the temporal ideas with power laws and much of the spam should be pretty easy to detect.

Google Talk recently even started redirecting chat URLs through Google.com. Do you think they would do that without reason?

Google PageRank Leakage & Misconceptions on PageRank

Sometimes I get quotes like this:

"Bottom line, out-going links are always a BAD IDEA for SEO. It creates what we in the SEO community call SEO hemorrhage. It BLEEDS off GOOD PR. Not a good thing. We actually NEED MORE incoming links."

and

Somewhere in Google's webmaster guidelines is a warning about having more than 100 outbound links on a page. My advice is to take that point very seriously.

Using the same principle proves, at least to us here in this one office, that 101 outbound links on a page (don't forget to count navigation links in the total) may lead to an immediate decrease in absolute PageRank even if it's not demonstrated in the toolbar.

These ideas are typically short-sighted and miss a broader view of the web. Is it possible to start from scratch and build up a brand while being completely greedy with your link popularity? Sure it is, but generally it is going to be easier to create a useful site if you are willing to link out to some related resources.

Especially if you write about your industry, you have to source some ideas or information. Why avoid social interaction? How can you view links only as a cost? If you link out enough, sometimes links come back. Heck, sometimes other content authors will even defend your brand without you knowing about it.

What are search engines but link lists? And most of the links are free. And people come back and use them again.

I do have some clients that for a period of time did not link out to some sites that they should have. For about a year a client outranked their own manufacturer for the manufacturer's brand name in Yahoo! and MSN. In that case I was greedy with the link popularity because I didn't want to lower our exposure. After Yahoo! started ranking the appropriate site #1 for the brand name, I freely linked out to it.

For almost any site there are probably at least a few sites worth linking to.

As far as controlling internal link popularity goes, the reason for the 100 link suggestion was based on page usability. How many options can you give a person before you give them too many to be useful?

As recently noted by Matt, crawl depth is typically a function of PageRank:

One of the classic crawling strategies that Google has used is the amount of PageRank on your pages. So just because your site has been around for a couple years (or that you submit a sitemap), that doesn’t mean that we’ll automatically crawl every page on your site. In general, getting good quality links would probably help us know to crawl your site more deeply.

The theory I have, though, is that you have to point at thoughts of others that you find interesting if you hope to have others find you interesting. There is only so much one person can do. As a bonus to getting free content ideas by reading and linking to other people's work, sometimes those links come back.

Some people have taken the PageRank funneling concepts to an extreme, heavily using the nofollow link attribute on their own internal links, or whenever they point at official documentation on other sites. Both are usually bad form.
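
For reference, nofollow is just a rel value on a link. The extreme pattern looks something like this (URLs hypothetical):

  <!-- nofollowing your own navigation - usually counterproductive -->
  <a href="/about/" rel="nofollow">About Us</a>

  <!-- a normal link, which passes link popularity as intended -->
  <a href="/about/">About Us</a>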

To snag a quote from DG:

Sorry you fucksticks, but if you've ever used nofollow as anything other than a joke or to fuck someone else, yer an idiot, Just bend over and wrap yer lips around yer own asshole and suck until yer head explodes. At the very least, you'll reduce the number of stupid people that can breed. Follow?

I think things like many nofollows on internal pages might set off some sort of SEO flag, just as being stingy with outbound links often forces some sites into unnatural inbound link profiles.

It is just as easy to use links within your content to funnel your link popularity and actively drive users toward your desired goals. In funneling visitors one way, make sure you make it easy for them to go back in the other direction if they make a mistake or arrive on your site on the wrong page; otherwise you may hurt your conversion rates.

Worrying about bleeding PageRank is probably a bit arbitrary when you factor in the larger social aspects of the web. Plus some engines may also look at outbound links when trying to theme the content of your site.

Most content publishers have to vote for at least a few others before many people are willing to vote for them. If you are a business selling products and services, it still makes sense to link to business partners and other useful resources just to increase the depth and richness of your site without needing to recreate the web to do so.

Easy Linkbait...

You can get burned if you do it too often, but if you call others out you can get easy links.

This is pure horseshit. One surefire indicator that something is rotten in this particular pulpit is that Mark's "oops, sorry" Bob's column contains no links. In fact, his columns never link to any external sources of information. Isn't it remarkable that someone who writes a weekly column for the Internet never links to anyone else?

Notice the post also has an update with another link in it. You sometimes can even leverage another person's story about someone else being full of it for a strong link of your own. Of course sometimes you can get even more by being a contrarian to the contrarian.

Also worth noting that outbound links are perhaps the single cheapest form of marketing available to someone new to the WWW who is trying to break into their field. All the blog tracking and link tracking tools make the self-aggrandizing must-track-myself mindset even more pervasive. That makes it easier to break into a field by making sure the right eyes see it when you agree with industry leaders, and, more importantly, when you do not.

People who say hello through a topical post or a link are far more likely to get a receptive reply than lamers asking for links to spammy, ad-cluttered sites at hello.

Using Viral Content Ideas or Technology to Build Links in Spammy Industries

In some industries, like payday loans for example, it is hard to get legitimate citations.

If a well discussed technology runs close to your core field, it may be worth creating content around that linking opportunity and asking the right people for feedback on your end product. Many of them will give your site authoritative links that are going to be hard to compete with.

Even run of the mill online flower shops may go from a me too site to a heavily linked industry authority if they added something like this to their sites. As storage costs go down and more people filter information in more and more ways, there are more and more marketing opportunities, largely because there are more overlapping intersections between industries. Many of those intersections get talked about and heavily linked to.

Web Directories... Are They Relevant to SEO?

With Zeal recently closing (I think LookSmart was dumb to close it) some people have been questioning the value of directories.

$hoemoney recently ran a mini interview asking a few SEOs if directories are still relevant. The general consensus was that if a directory sends traffic then it is a good link to buy.

I think that is a good rule of thumb, but I am also a bit more aggressive. I still buy a few links that I figure won't drive much traffic, largely because I think they still work well in Yahoo! and MSN. Having said that, I think there are certain quality signals or anti-quality signals that it helps to look at.

  • Is it ranking in the SERPs? - If a site ranks well in the search engines it stands a good chance of being trusted by them. Plus even if those links do not count toward boosting your rankings they can still drive direct traffic. I frequently see directories like Business.com and JoeAnt ranking in the search results.

  • Do they sell direct links? - Direct links are more likely to be taken as editorial votes of quality. Some redirected links may still count, but many of them will not.
  • How frequently is their site crawled? - Check whether the category pages are being cached in Google, and how recently they were cached (see the example queries after this list). If their pages are not getting cached, or have not been cached in 6 months, the odds are pretty low of that link carrying much weight.
  • What is the quality ratio? - Does it list anyone who pays? Or do they hold sites to some quality standards? Do they categorize sites properly? Or do they sell links to anyone in any category, even if it is the wrong one? Does each page have unique content? Are most pages empty - adding nothing but clutter to search indexes? If they do not help engines categorize the web (ie: no editorial value) then eventually the engines are not going to trust their votes.
  • What is the ad ratio? - Are all the listings paid? Or will they list some useful sites without payment? Does the site look like it aims to serve end users? Or does it look like it exists just to get AdSense ads or affiliate ads indexed?
  • Do they sell outbound sitewide links? - Pretty much the equivalent of selling out - when a directory puts sitewide outbound links on their site (especially if those links are to junky sites) the odds are pretty good that the links are not going to count much.
  • Is it decrepit? - Directories which have 50% of their links broken or pointing at URLs that have been purchased by spammers or domainers are not going to pack as much punch as sites which have few broken links. I recently bought a 25 page directory that has not been updated in a couple years, and it had about 400 broken links in it. Not good!
  • Does it have unique content? - Is it a DMOZ clone? Are its listings manually compiled and unique from what is offered at other directories?
  • Is it relevant to my site? - Many small niche directories can drive decent value due to offering decent co-citation data and having exceptionally relevant traffic streams.
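
One quick way to run that cache check is with Google's site: and cache: search operators (directory URL hypothetical):

  site:www.example-directory.com/travel/
  cache:www.example-directory.com/travel/

The first shows whether the category pages are indexed at all; the second shows the date of the last cached copy.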

Bob Mutch recently rated 40 top directories based on their age, how many .edu and .gov links they have, and whether or not they are listed in DMOZ and Yahoo!. I would contend that WhatUSeek isn't a real directory, that ISEDB is a directory of search engines and directories, and that Vlib.org should be counted as a directory, but other than those minor points it is a pretty cool study.

  #   URL                .edu links   .gov links   Launched
  1   www.dmoz.org          128,000          761       1999
  2   dir.yahoo.com         111,000        2,060       1995
  3   www.business.com        2,420           73       1998
  4   www.joeant.com            106            1       2000
  5   www.botw.org               50            1       1996
  6   www.gimpsy.com             23            0       2001
  7   www.goguides.org           22            0       2001

I think the inbound link profile is a good starting point for rating directories (when you add it to the other criteria I mentioned above), but what is even more interesting is how quickly the quantity of quality links falls off. After DMOZ solved the general directory problem and Google solved the search problem, no general directories were able to get many citations. That sorta shows the importance of market timing.

While there are many quality links that do not come from .gov or .edu TLDs, I think those counts are a good proxy for overall link quality. Notice how quickly the .edu count falls off. That is why the top directories may be worth $300 for a listing... they are trusted quality links.

The quick fall off in legitimate citations is why some types of link spam are easier to detect than many people think. When people manually build links, many of the links they accumulate are low power outliers, often sharing similar link profiles with each other.

Which General Directories Provide Great Value?

I liked the ones I left linked above. There are a few others that are decent as well, but the broader I make the list the more likely I am to eventually promote sites that are doing lots of spammy things, like whoring out their sites to AdSense or sitewide casino links.

I see a couple of the unlinked directories listed above ranking in the SERPs for a broad range of queries, but some of them may not exercise much editorial control, and will eventually lose some of their authority.

For the sake of MSN and Yahoo! I still submit to a number of directories beyond those listed above. The number depends on the field, but if the business is a web savvy business that can afford to create strong brands and/or useful content then it will also have many links from outside the directory sphere.

Topically Relevant Directories:

It is hard for me to list quality topical directory examples because:

  • if you do not know a topic then it is hard to judge quality

  • directories change over time.

For example, I used to always use a certain directory as an example of a quality directory, but now that there are off topic airline ads on the home page and too many AdSense ads I don't put as much stock in it.

Think Local:

Some local directories are way underpriced and of high quality. Quality local directories tend to drive significant hyper targeted traffic.

A few other things to consider when registering with directories:

  • I use Roboform to submit my sites, but mix up my link anchor text and descriptions (especially since some search engines have certainly looked at word relations outside of on-the-page content and anchor text).

  • If your market is competitive and your site is new, you will also need to get other types of links if you want to rank in Google.
  • mix your anchor text
  • if your brand name is keyword rich, make sure you also try to get a few variations in your listing titles outside of your brand name, so that if you push the brand hard and cause significant natural linkage your link profile won't look wonky due to too much similar link text.
  • If your brand is not generic it may only take a couple links for you to rank at or near the top of the search results for it.
  • A Yahoo! listing or DMOZ listing may be worth 20 or more links from lower quality directories.
  • Each good link you get allows you to get many junky links (say ~50 or so, depending on the industry) without them really hurting you.

How Different Search Engines Count Directory Links:

Yahoo! and MSN still tend to count directory links (including low quality directory links) far more than Google does.

For a new one page flash site I got about 50 directory links in a couple days a while ago. The site competes for a basket of low traffic, $3 per click terms that would cost about $600 a month to rank at about #2 in the PPC ads.

In Google, 2 weeks after I started link building, the site ranked in the top 10 across a wide array of terms from this basket of keywords. After about 2 months without additional link building the site's rankings in Google dropped off. Since then they have slowly started to improve.

Yahoo! took a bit longer than Google to react, but once it did the site went to #1 and has stayed there almost every day for the last 5 months.

MSN reacted about as quickly as Google, perhaps even a few days quicker. Outside of a few fluctuations the site has ranked fairly consistently at #1.

The client ranks #1 for their brand name and related terms in all major engines. They would probably rank a bit better in Google if I had built those links over time and shown consistent growth, but considering how cheaply I sold those services I am still certain they got an exceptionally strong ROI, and I am certain their rankings will rise over time if we put more effort and resources into SEO.

Why Do People Think Directories Are Becoming Irrelevant?

  • As a business model directories do not work well unless you are hyper focused or have significant authority to leverage. (Unless you are selling PageRank to naive webmasters who have yet to learn much about SEO or get burned by shady directories.)

  • As more people write and compile information the quality of information needs to be better to be link worthy.
  • Most directories (especially most paid directories) do not add much context as to why a particular site is important, useful or worthwhile.
  • People do not give out links as freely as they once did. It is hard for a directory site to be viewed as link worthy as it was in the past, thus directories do not get as much authority to pass on.
  • Active channels, such as topical weblogs, tend to drive far more traffic than most fairly static directory sites.
  • Google's algorithms are improving. They are getting better at scrubbing link quality and filtering out duplicate or near duplicate content.
  • Most general directories are useless spam.
  • Couple improving search algorithms with social bookmarking sites and they make the job of professional catalogers and archivists less relevant, except perhaps for ultra niche categories that are not well cited.

Why is the Business Case for Directories Falling Apart?

  • Many of the reasons listed above (market hypersaturation, lessening authority, other content types - like blogs and wikis - fighting for the same audience, improving search quality, bottom-up social systems).

  • Directories create inefficiently priced marketplaces.
  • Most directories drive so little traffic and value that it is hard for them to make their marketplaces more efficient.

There is Still Some Value in Directory Links:

In conclusion, I still like a number of directories, but sometimes it helps to drill down and look at relevancy rather than just buying any old link. Even if some of the mid to low quality directories do not offer much value in Google, they still help with the other engines. Another bonus of building links from directories and other sources is that they can inflate your link count to help discourage competition and/or pollute your link profile, making it hard for competitors to see which links are helping you rank where you do.
