Google is Becoming Wikipedia Without the Talk Page

In a recent post about paid links, Danny Sullivan wrote about how Google's army of engineers is going to start hand editing PageRank scores if they think you are selling links, a move that reeks of desperation.

Google is only decreasing the PageRank for a subset of the sites they actually know about. ...

Google stressed, by the way, that the current set of PageRank decreases is not assigned completely automatically; the majority of these decreases happened after a human review. That should help prevent false matches from happening so easily.

In contrast, if you're a smaller site not deemed as important to relevancy, a harsher punishment of a ranking penalty may be dealt out.

Introducing the New, Corporate Web

If they actually follow through with any of this, then Google, for all its touting of PageRank, clearly no longer believes in its value. They already show stale data in their toolbar, and might as well scrap the whole thing and start fresh. Their mind control exercise is getting a bit obnoxious.

Now they are editing PageRank and relevancy scores. They don't edit based on quality of information, but based on method of promotion. And if it is a corporation breaking Google's arbitrary, shifting ruleset, then Google simply decides not to edit, or merely pretends to care.

Google is Wikipedia, but Worse

With this news of more hand editing, Google also shows that they are biased against small webmasters and are actively trying to screw them over to increase corporate profits.

Google is becoming much like Wikipedia, where generalists wrongly assume topical knowledge greater than that of the real topical experts. In some cases Wikipedia is saved by talk pages and community participation that allow the experts to be heard. Google has no talk page though, which means that Google search results will become a dried-out and dumbed-down version of the web.

The Real Problem With Half Truths & Hand Editing

The response to every move is a counter move. So if they actually try to squash link buying, then webmasters will look for indirect ways to purchase links. Google also offers tips on how to sculpt PageRank, but sculpt too much and suddenly the intent is changed, and you are banned.

Why leave such a thing up to a single Google engineer making a judgement call? If they want to increase the quality of the web they need to be more innovative in encouraging the creation of good content, not make people afraid to invest in creating content only to watch a Google engineer kill it.

Link bait is good when you are a large corporation or are syndicating Google spin, but if you are too successful at link bait they will ban your site for it. They did it to one of my sites, and they even banned one of their own sites.

If you are a small webmaster and get judged by Google don't expect compassion. They have no talk page, and they already paid an AdSense publisher to steal all your content. They don't need you.

How to Do Well in Google

If you are a webmaster, assume that Google is lying to you and ignore them. If their view of the web and their webmaster advice are reduced to half truths and lies, then we can only hope something a bit more honest will come out of their downfall.

Understanding the Psychology of the Google User (Through the Actions of an Engineer)

Frank Schilling and Shoemoney recently had two great posts about Google. When combined I think they paint a picture of Google that skips past the rhetoric and double talk. Frank said:

As a publisher, I've always viewed Google as a bit of a predator in this context.. taking publishers in, convincing them to serve Google ads, and then allowing those publishers to toil for Google, working sites into their algo to serve the beast, all for increasing revenues, finally to have Google's algorithm scrub you from the index if you become too successful at punching ad converting pages to the top.. Good publishers take on the role of sacrificial lamb to show the algo guys where the holes are and they get to ride the express elevator to the street as a reward.

Shoemoney's video about avoiding getting hit by Google stated that the key is simply to not do things that make Google look stupid or undermine the perceived magic that occurs at Google.

The site of mine that Matt Cutts hand removed from Google's search results was killed not because it was spammy, but because it was mine and it was getting too much mainstream traction and exposure. My marketing was too appealing, aggressive, and effective. In another year that site would have been untouchable, and that thought made Matt Cutts feel uncomfortable.

The Changing Desires of the Magical Fictitious Average User

How Relevancy is Defined

With Google, the whole concept of relevancy is a shifting mind control game. As long as you do not get the wrong types of exposure you can make a lot of money without them doing anything about it. Go too mainstream with something a little sketchy, or something with the scent of smart SEO to it, and they will try to kill you out of resentment, jealousy, and a desire to protect the lies that their business model is based on.

How to Rent a Half Million Links & Stay Below Google's Radar

Google tries to scare you away from renting links, but their paid link detection algorithms are at best laughable. That is why Matt Cutts puts so much effort into trying to scare you about bought links.

_________.com has repetitive and near machine-generated sounding content, like:

Loan calculators are made of different calculation types. In fact, for calculating the same type of loans, a large number of different calculator programs exist that will help you think about your loan and analyze your loans from different angles.

and that QUALITY content ranks in Google for thousands of search phrases. It looks like someone rented hundreds of thousands of links, with sitewide links on _______ and many other high authority sites.

I thought about making this post, but then decided it is bad karma to out the site I mentioned, so I edited out the identifying details. You can see how hollow Google's paid link scaremongering is by reading Jim Boykin's great post about quality sites never getting penalized for selling links, and by looking at some of the places sketchy links are popping up.

If Google is deceptive, misleading, and self-serving with the data they share (which they are), why should we expect anything different from their general advice for webmasters?

5 Differences Between Google.com & International Google Search Results

Having searched hundreds of times on google.ca and google.com.ph, I see some subtle differences in how the top ranked global / US results are mixed into international results.

  • High authority sites do not tend to rank as well internationally as they do in the US. Domain trust counts less. As an example, Matt Cutts recently posted about his favorite omron pedometer. He is right near the top on Google.com, but ranks a bit lower internationally.
  • Low authority sites that were near the top of the global search results tend to rank a bit better internationally. My mom has a lower link authority weight loss blog; she has also posted about her favorite omron pedometer, and she ranks better internationally than she does in the US results.
  • Exact match domains ("mykeyword.com" matches [mykeyword] and [my keyword]) seem to get a bit more love in international search results than in the US results.
  • The domain love is even stronger if it is a local domain extension.
  • Trusted local sites are aggressively mixed into the search results, especially for queries that hint at a local preference. In one local market I saw a local thin affiliate site ranking in the top 3 for a core mortgage term, and the site was only PageRank 2.

Why Google Hand Editing Seems Random

Many people wonder why Google hand editing seems random or incomplete, and why some of the best channels get edited while worse stuff is left untouched. Here are some of the reasons Google does a poor job of policing the web equitably:

  • The web is too large to police and engineer time is expensive.
  • Policing certain segments produces unwanted blowback. How often do large corporations or influential bloggers get policed? (It is rare, and when it happens, it is sold as a side effect of a feature for users.)
  • When issues become popular they get prioritized. Many times Google won't react unless they feel they are forced to. Sometimes they will lie and say they care, and then do nothing. Back in April I highlighted the Netscape search results in Google. Matt Cutts thanked me, but guess what...those Netscape lolita preteen search result pages are STILL ranking in Google, along with a bunch of other search results.
  • If they edit in an incomplete or random fashion they evoke fear.
  • It is easier to control people through fear than to create a perfect algorithm.
  • They have no need to hand edit the worst offenders. If they are competent programmers, the algorithms should take care of those sites. They sometimes edit a leading site in a field to send a signal. They may throw in a group of other sites if they need to create cover, but their goal is to send a message and alter behavior.

To appreciate how hard it is to police the web read Eli's recent post on how to create an SEO empire. How can Google compete with that?

7 Useful Webmaster Tools Google Stole From You

As a public facing SEO with many thousands of customers at the beginning of the SEO learning cycle, many of the questions I am asked most often come as a result of Google half truths. I thought it would be worthwhile to write a few of these down to save myself time answering the same emails each day.

It may be inappropriate to label Google a liar for doing the following. A more appropriate and fairer label would be an intentionally deceitful organization.

Want Link Data? Go Elsewhere

Google offers a link: function which shows a sampling of inbound links to a website. A few years back they had a much smaller allotment of machines for handling link queries, so they used to show mostly a sample of the most authoritative inbound links to a site. Then they started showing mostly lower quality inbound links, while filtering out most of the better ones. But they explained that they had doubled the size of the sample and showed more links to smaller mom and pop websites that lacked high authority inbound links, so it was a good feature for users.

When you hear some people at Google talk about this, they tend to frame it "from a historical perspective" and explain how they used to be limited, yet they still use virtually ALL link data in calculating result relevancy. Given that last statement, the "from a historical perspective" framing is self-serving positioning about not providing a useful feature because they want to make SEO harder.

Want further proof? If you sign up for Google Webmaster Central and verify your site, they will show you all your linkage data. I don't trust Google Webmaster Central because they profile SEOs and hand edit the search results. Signing up is probably an unwise decision.
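
If you want fuller link data without handing Google a verified profile, the other engines are currently more forthcoming. A rough illustration, using seobook.com purely as an example domain:

    link:seobook.com          <- Google: a heavily sampled, filtered subset
    linkdomain:seobook.com    <- Yahoo! Search / Site Explorer: a far deeper list

At the time of writing, Yahoo!'s linkdomain: operator and Site Explorer report far more of a domain's inbound links than Google's sampled link: results, which is about as close to a full backlink report as any engine gives away without registration.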

Google does offer us a free tool to estimate link authority though: PageRank.

Google PageRank

For as hyped as Google PageRank is, Google sure goes out of their way to ensure the values are inaccurate. They only update the Google Toolbar display about once every three months. Even then, the update is not fresh for that day; those stats might be from a week, two weeks, or a month ago. Also, the toolbar is sometimes buggy and shows the wrong PageRank values, where viewing the same page multiple times in a row will yield different PageRank values each time.

The only reasons they still place PageRank on the toolbar are that they get free marketing out of it and that it helps them collect more usage data. Years ago Apostolos Gerasoulis, the search scientist behind Teoma, said Google doesn't rely heavily on PageRank to score relevancy. Gigablast's Matt Wells said something similar:

PageRank is just a silly idea in practice, but it is beautiful mathematically. You start off with a simple idea, such as the quality of a page is the sum of the quality of the pages that link to it, times a scalar. This sets you up with finding the eigen vectors of a huge sparse matrix. And because it is so much work, Google appears not to be updating its PageRank values that much.
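
To make the eigenvector point concrete, here is a minimal power-iteration sketch of textbook PageRank; the toy four-page link graph and the 0.85 damping factor are illustrative assumptions drawn from the original PageRank paper's description, not anything Google has published about its current production system.

    # Toy PageRank via power iteration: a page's score is the damped sum of the
    # scores of the pages linking to it, split across each linker's outlinks.
    links = {                      # hypothetical 4-page link graph
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
        "d": ["c"],
    }
    damping = 0.85
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):            # iterate until the score vector settles
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 4))

On a toy graph the vector settles in a handful of iterations; on a web-scale sparse matrix that same calculation is the expensive part Wells is referring to, which is one plausible reason the published toolbar values lag so far behind.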

Any webmaster with an old URL that ranks exceptionally well with almost no PageRank knows that PageRank didn't drive them to outrank people with 10 times their link equity.

PageRank is important for one aspect of information retrieval though: crawl depth. If you have a lot of PageRank then you will get crawled deeply and more frequently. If not, then they will crawl shallow, and perhaps place many of your pages in the supplemental results.

Are My Pages in Google's Supplemental Results?

Want to know what pages from your site are rarely crawled, cached, or updated? Want to know where your duplicate content issues exist? Want to know what pages from your site Google doesn't trust links from, or trust enough to rank well for many search results? Look at the supplemental results. Oops, they took that label out of the results, but here is a more complicated search you can use to find your supplemental results, at least until it gets disabled. Jim Boykin and Danny Sullivan also offered tips on finding supplemental pages.

Right now a Google search for Google supplemental results provides low quality search results because most of the top results (including my #1 ranking at the moment) do not discuss how to find supplemental results. Unfortunately, if you only heard of the supplemental results AFTER they took the label out, you likely have no way of telling if any of your pages are supplemental, which is good for seasoned marketers but bad for mom and pop. If I don't edit my post, people will think I am a liar or an idiot, because Google is deceptive.

If they truly wanted to make the world's information universally accessible and useful why would they remove this label? At the very least, if webmasters paid attention to this label they would structure their sites better and help Google save bandwidth by not having Google crawl as many low quality pages.

The easiest way to get out of the supplemental results is to clean up site structure issues and build more high quality link authority. Cleaning up site structure issues is much harder now that it is harder to see what is in the supplemental results, and each day it gets harder to build honest links due to Google spreading FUD about links...

Organic Linking Patterns

With all the FUD Google spreads about paid links, they make many webmasters afraid to link out to other sites, which reduces the quality of information available on those sites and prevents some quality sites from ranking where they should. Nofollow is not about being organic. In fact, it was a tool created to directly manipulate the public's perception of linking. To appreciate how out of hand it is, consider the following situation.

A friend's business got covered by a mainstream media site. They wrote an entire article about his business but did not link to him because they felt linking would have made the article too promotional. Imagine being the topic of an article and the source of content for other sites without attribution for it. That is the side effect of Google's bought links FUD.

Instead of promoting quality content, current relevancy algorithms support information pollution. Google goes out and scares you about keeping your link profile natural while letting proxies hack your rankings. And they have known about this problem for years, just like the 302 redirect problem.

Since Google's link guidelines are self-serving and out of step with the realities of the web, what happens if I get aggressive with link building and eventually get busted for doing the same things my competitors are getting away with?

Lose All Your Link Equity (and Your Content and Your Brand, Too!)

Your Site Goes to Jail, You DO NOT Collect $200

Many webmasters have suffered the fate of hand editing recently. The site of mine that they hand edited had about 95% of its links cleanly built by me, with the other 5% coming before I bought the site. Because it was my site, they wiped away ALL of its link equity via a hand edit (simply because I bought a site that had some link equity). What makes hand edits worse is when they follow up a hand edit by paying an AdSense spammer to steal all of your content and then rank his site where you ranked prior to the hand edit.

When sites are hand penalized, they typically do not even rank for their own brand related keywords unless the webmaster buys their own brand name in Google AdWords, which means Google is even willing to sacrifice their relevancy to punish webmasters who fall outside of Google's ever-shifting rule-set. Unfortunately that punishment is doled out in an uneven fashion. Large corporations can own 10 first page rankings, or use 100 subdomains, but if you do the same with a smaller website, expect a swift hand edit. Even programmers who support Google's API get a hand edit from time to time.

Rank Checkers & Google API Keys

Were you one of the early programmers to build a tool that used the SOAP version of Google's API? Sorry, but they no longer offer Google Search API keys. Their (formerly useful) API has come back as an academic-only project which they can use to recruit college students studying information retrieval.

Anyone who built a tool based on Google's old API now has to explain to people why their tools broke. Google wanted the tools to break so they could replace the useful API with something worse. In fact, Google is pulling back more data in other spots, even when third parties create tools to add features that should have been core to Google's products. Let's take a look at AdSense.

Google AdSense Stats

Google does not tell smaller webmasters what their payout percentage is, what keywords triggered the ads, or what ads get clicked on. Some third party tools were created to help track the ads and keywords, but Google disabled those.

If you think about this, Google is directly undermining the profitability of their partners by hoarding data. If I know what sections of my site perform well then it is easier for me to create high value content in those areas. The more profitable my site is the more money I have to reinvest into building more content and higher quality content.

It doesn't make sense that they ban high quality content just because it is owned by an SEO, then fund the growing dirty field of cybersquatting. I invested nearly $100,000 into building an AdSense site, until it got hand edited and I realized how AdSense really works: cannibalizing the value of content and making your site too dependent on Google as a traffic source.

Summary

If Google was honestly interested in creating a maximally efficient marketplace they wouldn't disable third party tools, hold back information, and keep changing their systems to confuse webmasters. They wouldn't hand edit real sites that thousands of webmasters vote for. And they would not be spreading FUD amongst the market. They would simply find a way to monetize the markets, push out inefficiencies, and grow additional revenue streams.

In some cases, if you register your site with Google they may give you a few more crumbs of info, but unless you have scale they want you to fail. What they really want, like any for-profit, power-hungry, authoritative system, is control of your attention and information, so they can ensure as many dollars as possible flow through them. Look no further than their position on the health care industry to see their true vision for making information universally accessible and useful. Ads are a type of information:

We can place text ads, video ads, and rich media ads in paid search results or in relevant websites within our ever-expanding content network. Whatever the problem, Google can act as a platform for educating the public and promoting your message. We help you connect your company’s assets while helping users find the information they seek.

Looking Forward

Eventually Google will launch Claim Your Content.com, which is yet another way for them to get you to register your work with them so they can more easily profile webmasters and hand edit SEO owned sites. Allegedly it will help prevent content theft, but once it comes out, expect duplicate content filters to stagnate or get worse unless you sign up for their service. Dirty? Yes, but so is deceiving people to increase corporate profits. The above 7 examples are just a small set of the whole.

How to Know the Difference Between an Automated Penalty & a Hand Edit

Some of Google's algorithms give sites a +30 type penalty which prevents the site from ranking better than #31. When Google hand penalizes sites, the result tends to look exactly the same, with one exception.

When a site is hand edited, it typically won't even rank for its brand name related keywords. Assuming your site has at least a few links, I believe most of the automated penalties still allow mydomain.com to rank for mydomain. Over time, though, I suspect Google may change this, as editing out sites for their own brands hurts Google's relevancy.
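
As a rough way to apply that test, here is a sketch of the heuristic; the function, thresholds, and sample rankings are my own illustrative assumptions rather than anything Google publishes, and a real diagnosis still takes human judgement.

    def classify_penalty(brand_rank, keyword_ranks):
        """Guess the penalty type from observed rankings.

        brand_rank    -- your position for a search on your own brand / domain name
                         (None if you are nowhere to be found)
        keyword_ranks -- positions for queries the site used to rank well for
        """
        ranked = [r for r in keyword_ranks if r is not None]
        if brand_rank is None or brand_rank > 30:
            return "likely hand edit: the site no longer ranks for its own brand"
        if ranked and min(ranked) >= 31:
            return "looks like an automated +30 style penalty"
        return "no obvious penalty pattern"

    # Hypothetical example: still #1 for the brand, but everything else stuck at 31+
    print(classify_penalty(brand_rank=1, keyword_ranks=[34, 31, 58, None]))

However you normally track rankings, the brand query is the tell: if it disappears along with everything else, the odds are you are looking at a hand edit rather than an algorithmic demotion.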

Should Google Penalize Companies for Their Official Brand Names?

Sometimes when a Google engineer is pissed off at a site, he or she may penalize the site in a way that the company does not rank in the top 30 results for any search query, including branded navigational queries. Imagine if you searched for SeoBook and couldn't find this site. How is that a good user experience for searchers using Google? It is their index, so I think it is fair if Google nukes sites that they dislike for non-brand queries, but when they do it for brand related queries too, they make their search results irrelevant and provide a bad user experience. What makes the situation worse than it needs to be?

  • the definition of relevancy changes depending on Google's business interests
  • the definition of spam changes depending on Google's business interests
  • Google is unpredictable in their hand editing
  • if Google does not like a particular market they may hand edit the leading company while leaving competitors ranking at the top of the search results for the competitor's brand. In some cases they penalize the cleanest services while leaving dirtier market players easily accessible
  • even if a site is deemed as spam and penalized they can still buy ads on Google, which makes it seem as though it is only spam if Google isn't getting paid

If Google wants to become the default way we access all information can they continue to penalize brands for their official brand names?

Google Gobbles up News Vertical

Late on a Friday afternoon is a brilliant time for Google to announce a major change to their news service if they do not want people to talk about it much. With their AP, Agence France-Presse, The Press Association in the United Kingdom, and The Canadian Press syndication deals, Google claims they are improving duplicate detection, increasing listing variety, and, as a net result, sending MORE traffic to the people they signed syndication deals with.

Danny stated:

Google's going beyond just hosted news articles as part of this release. The company also says it will be doing a better job of duplicate detection overall, so that if there's the same article from wire agencies it doesn't have agreements with, such as Reuters, it should be more likely to point to the Reuters site than someone running Reuters material.

Google's market position allows them to address relevancy issues as needed, in order to suit their business agendas. They were lax on duplicate news content for nearly a decade because they wanted to spread their public relations spin through the media and get ad deals with many of the media outlets. Now that Google has secured their CNN partnership, it is time to solve the news syndication duplicate content problem and send traffic to the international news agencies.

A year or two down the road Google News will likely shift from temporarily archiving news to permanently doing so, and news will be yet another content vertical they own, along with search, ads, analytics, video, and books.

Look mom, newspaper ad revenue shrank 8.6% year on year and Google is getting those ad dollars. With improved duplicate content filters you can look for those numbers to fall further. I wonder if this is the end of Google's successful public relations campaigns in the mainstream media.

They use coercion to control traffic, and then sell it to you as a feature you wanted. Those guys are soooo good at business!

FUD & Relevancy: Inside the Mind of a Google Search Engineer

A large part of the search marketing game that gets little discussion is perception. Are the search results relevant? Are the search results fresh? Is there adequate result diversity? Is that particular result worthy of that ranking? What techniques did they use to rank there?

User vs Engineer Perspective

Gord Hotchkiss does a great job covering the searcher's perspective, but rarely do you get to see how a search engineer thinks of the results. This NYT article offered a snapshot, but that has been filtered through the public relations team. The results show not what the engineers want, but the best they can do given their current mindset and computing resources.

Reading Changes in the Search Results

If you can learn to read changes in the results you can see what they are trying to fix right now, what issues will become a big problem, and what issues they do not care about. For example, right now, Google does not care for result diversity. They are so afraid of small spam sites that they are willing to sacrifice that diversity to keep them out of the results.

Why You Should NOT Trust a Search Engineer's Blog

You can't really trust the perspective of a blog posted by a search engineer, because they typically discuss the view of search from an ideal world. In addition, the role of the search engineer blog is to control and manipulate public perception more than it is to share useful information. In some cases they tell you how going off topic is wrong, while they are proud to cite and praise your off topic marketing if you are syndicating their spin.

If You Were a Search Engineer, Would You Lie?

When you look at how they try to manipulate people, you can see the holes in their algorithms. They are nothing but an algorithm, an ad network, marketing, and the manipulation that gets people to cede power and authority to their fickle relevancy algorithms. If they are hypocritical in their view of the web, then manipulation is a large and important piece of keeping what authority they have.

How to Spam Google Right Now

A few tips they don't want you to know the truth about:

  • Buying old sites works amazingly well.
  • Buying and redirecting sites works amazingly well.
  • Paid links work amazingly well, and if you have a strong brand or can tolerate a bit of risk you would be an idiot not to exploit that hole.
  • Exact match domain names play a good role in helping a site rank for the associated keyword phrase.
  • The supplemental results suck, but they don't want the portion of the web they throw into them to realize just how bad they suck.
  • The search results have a lot of hand editing in them. Hand editing is doled out freely to smaller websites, but they are afraid to edit out large corporations.

Why might Google refer to some of the above techniques as spam? Simply because they are effective. We don't write the algorithms. We give the search engines what they want.

Have You Ever Been Hand Edited?

Search relevancy algorithms change depending on what types of spamming are popular and effective at the time. Experiencing your first hand edit on something you worked hard to build changes the way you perceive search engines, and how much you are able to respect them. If you are a professional you are not supposed to take it personally, but it is hard not to if you have to fire all of your employees.

Why is it that one person can review your site and kill your business model, yet when they wrap their ads around people stealing your content it is a long, drawn-out process to get them to fix that problem? It is just an extension of how Google thinks of consumers. If you don't have lawyers, they don't give a crap about you.

What is Spam?

Spam used to mean irrelevant results, but now that the web is a direct marketing channel, spam is more typically defined by who was paid to achieve the results. Search relevancy algorithms are based on ad sales. Something that is spam is perfectly acceptable if Google gets a buck a click out of it. Ad networks dictated by automation and economic efficiency also push a lot of fraud. Consider the following:

Much of their profit margin comes from supporting fraud, but most people do not realize the extent of it. Why is Google's ad-centric view of the web viewed as more honest than any other business model?

How to React to a Hand Edit

The way to look at search is that they want their techniques to be robust and scalable. Anytime a search engineer does something to you that is unjust and draconian, it is because they have a huge hole in their algorithm. After you get hand edited, the four strategies worth exploring are:

  • how to obfuscate your identity
  • how to scale exploiting the holes which required a search engineer to try to destroy your business
  • how to make your "spam" fit their view of relevancy so they don't go out of their way to keep hand editing your businesses
  • let others know if you think something is dishonest so you can help them avoid trusting it too much
