A while ago Randy Ray pointed at a post called The Ultimate Secret to Winning Poker, where Bill Rini posted about how he failed as a stockbroker because he did not cold call enough people. He spent too much time trying to study the market and perfect the craft, and did not realize until it was too late that calling, calling, calling was the most important step.
That post also mentions that if people read many books on a subject and still do not understand it, the answer does not necessarily rest in some obscure passage they will one day come across; more likely it is in a book they already read, but did not read closely, with enough attention and an open mind.
There are many problems with running multiple similar brands in parallel.
It splits your focus
You end up cross posting stuff where it does not belong
I am going to consolidate many of my domains in the near future. SeoBook and ThreadWatch will remain unique channels, at least until someone tells me that I am screwing up majorly bad and it should be otherwise.
The key for me to do well with them is to respect them for what they are...ie: this site is supposed to be about actionable SEO tips, and ThreadWatch is about rumor and stuff that is fun and interesting about search / technology / SEO.
I also recognize that some of the people posting comments at TW have been in the game far longer than I. In the past I would reference some of the comments that I thought were great, and I will keep doing that going forward.
So Nick recently sold me ThreadWatch. While transferring the site there was an issue with my account settings. At about 2:30am I sent a help request into Dreamhost. By 3am a Dreamhost rep called me up and sorted it out.
They don't normally do call backs on the graveyard shift, but the guy hooked me up. I just wanted to give them a bit of free marketing for having kick ass customer service.
So I recently set up a bunch of new websites and wanted to link out to a few authoritative sites right off the start. I added various numbers of outbound links to each channel, but after setting up a number of them I got pretty quick at researching where I should link at, even when I did not know a topic that well.
Outbound links are like a gimme in SEO. It's fairly hard to get the right type of people to link at a new site unless you bribe them or it's a great site, but just about anyone can improve their web community by linking out to some of the better resources on their topic.
Terms like PageRank leakage and bad neighborhood have made some webmasters become greedy or paranoid with their link popularity to the point where their sites become harder to link at because they are islands.
How do you find the best resources to link at?
What do the Various Major Engines Like?
My first port of call is Myriad Search. Search for the core phrase your site is focused on, synonyms, and phrases slightly broader in scope than your keyword phrase.
If you are in a hyper competitive field MSN tends to bring up some fairly spammy results, but sometimes seeing the results mixed together gives you a nice flavor of what they all like and sometimes you will find a nugget ranked at #7 in Yahoo! or #6 in Ask that all the other engines missed.
Unfortunately I am a fairly default searcher (primarily just using Google) but using Myriad helps me get an idea of what people can get away with (as far as content quality goes) in some of the engines.
Research, Research, Research! Yahoo! Mindset allows you to bias your search results toward commercial or informational results. Tilt that puppy full on research and see what sites they think are informational in nature.
Is that Page Created by the Government?
The Yahoo! Advanced Search page makes it easy to search just .gov or just .edu resources (or both at once). Sometimes this will be a miss, but I have found many great resources using the Yahoo! Advanced Search page.
Many directories have picks or a star on favorite sites. DMOZ and the Yahoo! directories are the two most well known directories. You can also browse the Open Directory Project organized by PageRank using the Google Directory.
Don't forget some of the smaller higher quality directories created by librarians. LII is a killer site.
If you run a finance related site odds are pretty good that you can find something good at Fool.com, MarketWatch.com, Forbes.com, etc.
If you run an automotive site it is easy to link at Edmunds, Nada, Kelley Blue Book, etc.
To drill down go to a relevant vertical authority site and do a site level search for information related to your topic.
When all else fails I like to link to sites with great overall authority scores if they have relevant pages or channels. Some examples include:
That is about as far as I have gone with most of these new sites, but sometimes you may want to hunt further if you have an uber spammy topic like cash advance or are trying to go further in depth on a topic that is already well covered. Some other ideas...
If your searches are local in nature you may find some of the best information by using regional search databases.
Don't forget that you can specify filetype. Spamming is a game of margins, and on the whole the average .doc or .PDF is going to be of higher quality than the average web page.
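To make that concrete, here are a few hypothetical queries of the sort I mean (the exact operators supported vary a bit from engine to engine, and the topics are just placeholders):

```
site:gov immunization schedule
site:edu filetype:pdf "search engine" history
"home brewing" filetype:doc
```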
If you are running a blog or some type of topical channel do not forget to use ego stroke techniques in your content and citations. Even if your idea is better than someone else's, or you read their idea after you came up with yours, still give them a bit of link love.
Sometimes mentioning a prominent blogger (saying they are so right, they are hosed, they are normally spot on but are messed up on issue x) equates to a quality link back and many secondary links from various non clued up ditto heads :)
Web content deals are on the rise again, and Internet ad spending should reach $12 billion this year, meaning Jupiter's once-ridiculed forecast wasn't far off the mark. ... Of course, there are new metrics for valuing audiences. "Not all pageviews are created equal," cautions David Hornik, a partner at August Capital. Hornik and other VCs say the most prized traffic comes from sites that leverage "viral" content to acquire users who are intensely loyal.
If you do what is out of favor, when it comes back in favor you end up growing far quicker than me too companies that are always one step behind.
A number of recent publications propose link spam detection methods. For instance, Fetterly et al. [Fetterly et al., 2004] analyze the indegree and outdegree distributions of web pages. Most web pages have in- and outdegrees that follow a power-law distribution. Occasionally, however, search engines encounter substantially more pages with the exact same in- or outdegrees than what is predicted by the distribution formula. The authors find that the vast majority of such outliers are spam pages.
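To illustrate the idea, here is a rough sketch of that kind of outlier check (this is not their actual method or code, and the parameters are made up for the example):

```python
from collections import Counter

def flag_outlier_degrees(indegrees, alpha=2.1, ratio_threshold=10.0):
    """Flag in-degree values that far more pages share than a power law predicts.

    indegrees: list of per-page in-degree counts.
    alpha and ratio_threshold are illustrative parameters, not values from the paper.
    """
    freq = Counter(indegrees)        # how many pages sit at each exact in-degree
    total = len(indegrees)
    outliers = []
    for degree, observed in freq.items():
        if degree == 0:
            continue
        expected = total * degree ** (-alpha)  # rough power-law expectation
        if observed > ratio_threshold * max(expected, 1e-9):
            outliers.append((degree, observed, expected))
    return outliers

# Pages sharing a flagged in-degree are candidates for spam review.
print(flag_outlier_degrees([1, 1, 2, 3, 40, 40, 40, 40, 40, 40, 40, 40]))
```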
It's funny to see the star internet marketing crew being too lazy to even change out their bogus testimonials. But at least in the short term there is far more money in the following model:
launch a shitty software product
use hype affiliate marketing and joint venture opportunities with internet marketing expert hucksters
opt in list
then email spamming (either directly or via affiliates)
than in creating real value. Maybe eventually people will get a clue.
On the web there are a large number of people who will try to help you along to doing well, and then there are hollow middle men who want to take your money to shit down your throat. Since it costs virtually nothing to make most software or sites you see far more of the latter in the internet marketing realm.
Here is a perfect example. So about a week ago I gave away one of many copies of my ebook to a charity and recently got this response:
Great book, I finished it a few weeks ago and now need to take notes on all the highlights. And WOW, with all the resources you have, that was in itself would be worth the price of the book, oh wait, it was free for me, :) lol
I have read quite a few books over the last 1 1/2 years on internet marketing, etc, and I really believe you are one of the few guys out there that truly want to help others and not just try and milk their list, as I heard one big name internet marketer say and that's exactly how I feel when im on his list. It really came through in your writing and the fact that you update the book on a continual basis says enough.
I would like your opinion on http://www.xyz software.com/ from what I can tell, it appears to be a great piece of software but I always like to get a 2nd opinion before I buy. (btw, I was referred to this by one of the guys I consider that will sell any and everything, every week it's a new cant miss product some of these guys are hawking so its sometimes tough to tell the gems from the duds)
I responded (roughly):
thanks for liking the ebook
xyz software = crap
here is why...
-qpw ($30 or so) is great
-you should mix stuff up
-sites that accept automated type submissions will tend to be
From that I got no response or donation. Give a guy your business model and your knowledge and get nothing in response. That's pretty useless considering I just saved him $97.
I find people that have no money to support people who help them, but want to spend money on the latest internet marketing scam software, to be greedy and/or stupid. It's not uncommon for business people to take the low road though.
I have about 700 emails in the inbox and I likely will be changing my charity policy soon to be more accommodating to paying customers. I don't expect to profit from helping charities, but the semi shady charity requests weigh on my time and spirits. And I still haven't said thanks to all the people who have helped me fight off the internet marketing scam that is Traffic Power :(
The question I pose today, however, is on the subject of links from the top ranking pages in the existing SERPs. These links are heavily pursued by SEOs and traffic builders as good sources for both referrals and high conversion visitors. After all, if the visitors are primarily coming through search, it's easy to determine whether they are likely to convert simply by running PPC campaigns in those results. But, let's imagine for a moment that the visitor value was entirely removed from the equation, and the link was purchased/cajoled/traded purely for the purpose of boosting rankings.... Are the highest ranking pages on a subject providing the most valuable links?
Jim Boykin offers a workaround into the neighborhood:
I'd have to say that getting links from the "Similar pages" to those in the top 5 might even be better. (Google's "related:" command).
These "related/similar" sites are sites that have common backlinks to the top sites...so a kind of "back door" approach to getting "in the neighborhood" is approaching those sites since they help to identify the neighborhood.
Some people who would be willing to trade content are too lazy to write an article for your site, so what you do is write an article for their site which links back to your site and then write most of an article for your site to link to them.
Leave out a section for pointers or tips and let them write a pointer or two including links back to their site.
Within a couple years up to 20% of viral linkbait will likely be from various scientific(ish) commissioned studies and the like. Even if the stories are half-assed the added controversy would likely equate to more links.
The doctors have been running paid ads on Google for four months now, under terms such as "San Francisco Doctor". They're spending roughly 50 cents per click for top positions and their total monthly spend is about $500. "We get about 20-30 new patients a month from it, so we're happy," said Clifton.
Next month part 2 of the series will be out, where Clifton bitches about how AdWords is ineffective due to being hyper competitive.
EBay is aiming to take over the phone book's customary role as the first place people turn to find local services from housecleaners to accountants.
While eBay Inc.'s focus for now is on auto services like oil changes and brake jobs, its goal may be to connect consumers with local businesses of all kinds. This could signal a major shift in the way consumers shop for such services and greatly affect pricing and competition among local shops.
Does eBay have any sort of a map? How are they going to do local without one? Craigslist (which eBay owns part of) links off to Google and Yahoo! maps. I think Google and Yahoo! get information from the same source: Navteq. Not sure if Navteq is going to go after local search as well, but they recently partnered with Zagat.
Any chance someone would want to scoop the leading map company? Are there any other high quality digital mapping companies?
So WMW has a subscriber only thread titled Expert Claims to Have Beaten Google Sandbox. The best info is in the free Matt Cutts on the Google Sandbox thread. DaveN mentions a site that was quickly out of the box. His main tip?
don't think like an SEO
build site go get links from ...... <- seo
Dave also showed some encrypted logfiles showing that he was getting decent traffic from a couple sites outside of search engines.
RAE also offers up
IME it is more accurate to go *after* traffic and not links.
And did we mention pay-per-call, separate content bidding, an obscure nameless easter egg feature that I don't wish to comment on but would like to thank Google for adding apparently in response to a wish list entry I think I posted long ago at Webmasterworld and SE Watch Forums...
Based on a bit of thinking and Andrew's posts in that top 5 features thread I think some of the cool things you can now do with AdWords are:
Run site targeted content ads without paying CPM rates by bidding on the official site names or common phrase matched page elements of sites you want to advertise on.
Target ads against competing products without competitors being able to prove you are using their trademarks or product names to trigger your ads. This prevents you from taking the needless large amounts of crap that I took about a year or so ago.
In the past Andrew also hinted at - trailer park geotargeting and Googleplex-cam.
The Google AdWords system uses all the keywords in your Ad Group to help match your ads with relevant content network sites. In some cases, keywords which have proven ineffective when triggering your ad for search turn out to be very effective when triggering content impressions. In other cases the keyword is simply useful as context in helping the system determine the overall subject areas of your ads.
How can they have such a large network and then just randomly announce that, effective now, things are changed?
Some advertisers who do not log into their accounts in the next few days may come back from their holiday break to see a ton of formerly disabled overpriced content clicks killing their ROI.
It wasn't until Knight Ridder Inc.'s largest stockholder, Private Capital Management LP, called for the newspaper chain's breakup that the creative destruction of market forces turned on Google and began its rout.
The Mountain View, Calif.-based company developed the free tool to help consumers avoid the frustration of traveling to a store that no longer has an item on their shopping lists, said Marissa Mayer, Google's director of consumer products.
Froogle, a comparison shopping site that Google launched three years ago, will continue to give visitors the option to buy the merchandise online. Google receives a commission for the online referrals.
I am not sure what they meant with that Google receives a commission bit. Is that just for the ads near it? In 2003 when Mike Grehan interviewed Craig Nevill-Manning, Craig said:
BUT - the bottom line is - they are unpaid listings, they're unbiased. They're the best results we can find for those products online. ... It'll be free forever.
The New York Times made it sound as though the Silicon Valley article was misquoting or overly vague in its description of how this service will make Google money.
The service will be freely available to merchants in the United States, Ms. Mayer said. Google, as it frequently notes, plans to gain revenue from the new Froogle service by placing relevant text ads on the same page as the local results.
The company also believes that it gains revenue when users employ Google more frequently as its services become more useful.
Initially, Google is depending on a contractor to pull the inventory information from several hundred major merchants. The search engine hopes to make the service even more comprehensive by encouraging stores to submit their own customized merchandise lists to the newly created "Google Base" - an information clearinghouse for everything from family recipes to scientific formulas.
What vertical search site is safe?
A while ago Battelle had a highly related post about comparison shopping called The Transparent (Shopping) Society. The New York Times made it sound like the eventual goal of this launch is spot on with what John was describing:
Marshal Cohen, chief retail analyst at NPD Group, a market research firm in Port Washington, N.Y., said that if Froogle delivered up-to-the-minute inventory updates from retailers, "consumers will finally know whether a trip to a store is worthwhile."
Google wants to be the default inventory information clearinghouse. Users love defaults. I am guessing the value of being the default shopping search site is worth far more than any value they would extract by charging for the feeds, at least off the start. Just like with regular search, there will be incentive for merchants to spam this service. Any idea how Google will fend off spam if they aren't charging? Or are they charging?
Illuminating post by Greg Boser. One of his ex clients was ranking well in Google when they parted ways. Since then his ex client has been busy entering a reciprocal link network and his rankings have tanked.
I collected a sample of 50 keyword phrases being targeted by sites in the GotLinks directory. In order to get a balanced set of keywords, I randomly selected phrases from several different categories.
Now I don't know about you, but one top 20 listing in Google certainly isn't enough to convince me that the GotLinks network is a place I want my clients to be.
Greg also notes that
The site ranked well in Google before the links were added to the GotLinks network.
The ex-client never reciprocated.
and the cause of the penalty?
Exceeding a threshold for the total number of links developed in a specific time frame, or
Simply being included in a specific network.
Other reciprocal link network owners have been showing how great their networks are, pointing out how some people got 1,600 links really quickly. I think there should be a bit more honesty in the marketing, as it is clear those are not 1,600 links that will bring you to the top of Google's search results.
It is a bit hard to isolate any one factor to determine how search interacts with it. You also have to consider the effects of most popular lists and how those build more linkage at things that are already popular. You know it is getting out of hand when there are aggregators like Diggdot.us that mash up the most popular items from different bookmarking channels.
I believe that as you go to more competitive fields generally competition scales faster than profit, and there is great value in being in a number of smaller niches. Perhaps the single best reason to have a high profile site in a competitive market is to make it easier to launch other channels.
When starting a new website it is cool to look at the power laws that guide the web and try to understand them and use them to your advantage, but I think it is far more important to:
see how they apply specifically to your sector of the web
think of other sectors near your topic that may be able to give you broader coverage
Google was intentionally slow to roll this feature out and makes the feature a bit hard to access, because they would prefer to automate the process using smart pricing and get you to buy as much advertising as you can afford.
Put another way, Google thinks that they algorithmically can determine the value of an ad better than you can estimate it. Having said all that, they do realize that sometimes the feeling of control will increase ad spend from some advanced advertisers, so...
Content bids let AdWords advertisers set one price when their ads run on search sites and a separate price when their ads run on content sites. If you find that you receive better business leads or a higher ROI from ads on content sites than on search sites (or vice versa), you can now bid more for one kind of site and less for the other. Content bids let you set the prices that are best for your own business.
I think a large part of the reason for the early success of Chitika has been that for certain types of content (like consumer electronics) image ads do have more value than the typical textual search ad.
If you have found underpriced content inventory look for this added control to cause more people to dip their toes in the water and drive up costs.
Not only does this new service allow you to bid differently for content clicks than search clicks, but it also allows you to buy content ads while opting out of search ads. In the past AdWords also allowed content only ads, but required content ads to be purchased on a CPM basis.
Now you can buy targeted content only ads and only pay when people click. Cheap branding opportunity I suspect. Perhaps with that type of distribution it makes sense to craft ugly highly graphical animated contextual ads that say DON'T CLICK HERE.
Testing it: DON'T! There is no way to test that this works as it tracks AdSense clicks, and you can't click your own AdSense ads. You'll just have to trust me that it works :)
After some time you should start seeing goal tracking appearing in your stats.
For example, here is source conversion. Note that the percentages are based on Visitors, not Pageviews, so they do not compare to CTR.
So from that graphic we can see that out of 11 visitors that came from MSN, 54% of them clicked on an AdSense ad over the course of their visit.
Below many graphs in Google Analytics is a list with round arrows. If you click the arrows on almost any item you see an option for "To-date Lifetime value".
Click this and you see the Goal conversion for that item. For example, here is the Conversion rate for DSL users.
Once you have Google Analytics tracking your clicks, you can cross segment that data to almost any other data Google Analytics shows. It becomes a very powerful way of optimising your site, not just for CTR, but for the type of visitors that click adsense.
A friend of mine mentioned how the noise level in SEO forums has gone from around 95% to about 99%. I think it is largely due to a shift from content optimization to content creation (and remember that this is a site selling a book on optimization, so me saying this is not in any way to my benefit).
Here is why there is a large shift from optimization to creation
The ease with which content can be published: It took me less than 2 hours to teach my mom Blogger, Bloglines, RSS, XML, etc. She now blogs every day.
the ease in which content can be commented on and improved in quality
the casual nature in which links flow toward real content
the massive increase in the number of channels and quantity of information makes us more inclined to look for topical guides to navigate the information space
the ease with which content can be monetized has greatly increased. AdSense, Yahoo! Publisher Network, Chitika, new Amazon Product Previews, affiliate programs, link selling, direct ads, donations, (soon enough Google Wallet for microcontent), etc.
contextual ad programs teach the content publishers to blend links, which has the net effect of...
short term increase in revenues for small publishers
until users trust links less
at which point in time users will be forced to go back to primary trusted sources (ie: one of the few names they trust in the field or a general search engine like Google)
it is getting increasingly expensive to find quality link inventory that works in Google to promote non content sites, and margins are slimming for many of those creating sites in hyper competitive fields
around half of all search queries are unique. most hollow spam sites focus on the top bits whereas natural published information easily captures the longer queries / tail of search
duplicate content filters are aggressively killing off many product catalog and empty shell affiliate sites
as more real / useful content is created those duplicate content and link filtering algorithms will only get better
general purpose ecommerce site owners will have the following options:
watching search referrals decrease until their AdWords spends increases
thickening up their sites to offer far more than a product catalog
switching to publishing content sites
and the market dynamics for Google follow popular human behavior, even for branded terms or keyword spaces primarily created by single individuals
the term SEO Book had 0 advertisers and about 0 search volume when I launched this site
this site got fairly popular
SEO Book is now one of my most expensive keyword phrases
As long as it is original, topical, and structured in a non wild card replace fashion, content picks up search traffic and helps build an audience.
I am not trying to say that optimization is in any way dead, just that the optimization process places far more weight on content volume and social integration than it did a year or two ago.
The efficiencies Google are adding to the market will kill off many unbranded or inefficient businesses. One of my clients has an empty shell product site and does no follow up marketing with the buyers. I can't help but think that there needs to be some major changes in that business or in 3 to 6 months we won't be able to compete on the algorithmic or ppc front without me being very aggressive.
Already mentioned everywhere else, but I think it is worth noting that Andrei Broder Joined Yahoo!. Google has been getting the lion's share of hires of big web names (like Vint Cerf), so it is good to see Yahoo! pick up one of them.
Gary Price also added links to a number of research papers Andrei Broder contributed to.
All quotes below are from the above linked Google Groups thread.
As Ben Michelson put it:
I believe this may be just the first phase of a new "less is more" concept.
I expect subsequent versions will alternately snip out or merge previously inaccurate fields, until finally (AdWords 1.0) the TrafficEstimatorService will be void of inaccuracy by providing no information whatsoever.
Robert, another programmer, was also thrilled by the recent "upgrade"
Well done, Google. I just want to release my first Adwords program - partly based on the ctr value. I work about two month for it. Why we should develop programs for Google, if Google changes the API every two month (see also KeywordService)?
The algorithms used in the TrafficEstimator may return some results that do not match your quality expectations, but they are not skewed in any way.
And here I thought making something inaccurate was skewing it...
Inasisi ran through some examples of the intentional data skewing and said:
If it is not on purpose, I don't understand why Google is not correcting the huge skewness in its estimates and further remove the only good statistics that we had to access to. If Google felt the need to be consistent to both the API users and the advertisers who use the UI, then they should have provided more information on the UI instead of having to strip them from the API.
For being so concerned with efficient market theory and collecting so much data, Google sure is greedy with their data. They expect marketers to trust them with plenty while not trusting marketers with something as trivial as search counts.
Yahoo! and eBay allow access to their old marketplace data because it helps drive up costs, commerce, profits, and makes a more efficient market. Why can't Google get a clue on this?
SEO Chat is quite possibly the most overly commercialized forum I have ever seen. They get their content free, and I think most of the moderators worked for free too. Recently they once again made changes without informing the moderators, and this time they pissed Rand off pretty good.
The irony of it is that they said they were fixing up the site for SEO reasons and did not ask any of the SEO moderators about the changes beforehand. Pretty stupid, IMHO.
Surely there is a great deal of noise they are trying to contend with, but after a system becomes noisy you can't change it without breaking it.
Channels can have a bunch of noise and still do well, but if some of the things that add some of the noise draw people toward your network then you are going to lose big when you remove them, especially if you do it in a disrespectful manner.
You only need about a half dozen to dozen members to make a good community, and if you lose them then you are a bit SOL.
As recently stated by a friend I met at Pubcon, creating a hierarchical framework can work to help moderators think that you are helping them by letting them be a moderator, but there still needs to be some level of respect.
I see Digital Point forums doing well long term because
being uber technical and monetizing page views, Shawn will have no problem dealing with the massive server load
it is built around openness with minimal editing
Shawn created a bunch of free useful tools that are easy for anyone to use
Going forward I think most successful communities will be more about setting up a functional social framework and letting the best framework spread rather than advertising top down systems which do not respect their users.
Would the Sandbox concept be more accurately named the TrustBox?
NFFC, Lots0, and Massa highlight in far more detail what I was hinting at in my recent sandbox post and what I was trying to say on my WMW panel. Like it or not, SEO is largely becoming a game of public relations in many competitive industries.
Other semi-recent posts about the shifting of bulk spam to trust related techniques:
For those who recently complained that link trading doesn't work, it is largely because most link exchange offers and opportunities are garbage.
A long time ago I thought that search algorithms were going to advance to the point that it would be easier to influence (barter / manipulate / become friends with / etc.) people than search algorithms. With algorithms like TrustRank and the viral nature of blogging you really don't need to seek the approval of all that many people to do well. Put another way, if Danny Sullivan likes and frequently references your SEO website then odds are Google will too. In every industry there are going to be a limited number of people like Danny.
Google Sitemaps now has more features on top of showing crawl stats and crawl errors:
PageRank distribution (high, medium, low, not yet assigned)
top 5 Google search queries for getting clicks to your site
top 5 Google queries returning your site in search results
Seems like that is just a tease at giving information (and if you want real stats you have to use Google Analytics to give them more info back), but here is the Sitemaps stats FAQ page.
I don't think Google really needs or wants the site owner sitemap data so much, I just think they want to be the default service people use in case it is useful down the road. That is why they are throwing in the few extra "goodies". Storing data costs Google next to nothing.
Most likely Google is the default search tool, advertising tool, email tool, analytics tool and free information storage database for a large number of people now.
I have not been able to get a screenshot, but at WMW Vegas I noticed that when Baked Jake was looking at Google search results for [Gwen Stefani Tickets] there were Google AdWords ads at the top, right, and bottom.
Patrick Gavin gave a similar presentation to his recent San Jose one.
Stuntdubl mentioned the techniques of Link Ninjas, which is a link building seminar that came out of the presentation.
He posted quite a bit of good stuff, like some of the recurring themes on his blog (link naturally, neighborhoods, use a variety of link types, etc). I will see if he posts his presentation online. If so I will update this post. Todd has gotten really good at presenting considering he only started somewhat recently.
Philip Kaplan of AdBrite showed his recently launched intermission ads (mentioned here). Also noted that AdBrite does not do direct links and is exceptionally transparent.
Online magazines are sometimes underpriced and have great link neighborhoods.
run Xenu link Sleuth on directories to find broken links...some of those may be easy sites to buy cheaply
emphasizes alternative sources of links
look outside same networks everyone else is using
Q&A: there was a question about Google hating on paid links
don't forget Yahoo! and MSN give credit
stay on topic so you get direct value too
managing link buys?
you can use AdBrite to mine information (this could also be used to help you find what the top posts or topics are on some competing channels)
excel can be used to show link dates, which also helps show the value if you are tracking
Oilman mentioned search for powered by xyz forum + a topic (like sci fi) to look for some potential cheap link buys
Matt says they need to look harder at link quality
has the site duplicated on the .co.uk
OnlineHighway or InformationHighway...something like that...I so could not see the URL
50,000 to 5,000 visitors per day on update Allegra
using popunders is just as evil as popups
unsure purpose of site by looking at a page
used to have multiple location based URLs...301ed to one central domain. Matt Cutts recommended that.
Baked Jake said it can take 2 weeks to 6 months for 301s to take effect
TicketsToGo seems penalized in Google since October 2004
also created TicketsToGo.net because
duplicate content issues
Jake recommends starting from the bottom up. Building links into some of the subject specific pages and then working your way up.
target Geo specific concerts
Matt Cutts said "tell me about your backlinks" ... uber spammy reciprocal linking campaign. said good news is no manual spam penalty, but few of the low quality links this site has are doing it any good.
Tim Mayer asked what is actually unique about your domain?
Yahoo! looks to ensure with travel that the travel box is owned by the domain, not an affiliate form. Would not recommend submitting to Yahoo! paid inclusion
Matt pointed out bad cross industry linking between his own site (like mortgage and credit sites), but said there were some good links
Tim recommends making the site more unique from page to page and cleaning up the navigation links. The site navigation being in the footer and the page content consisting primarily of wildcard replace duplicate content makes him think the local pages are for search bots instead of users
not only link to related pages about immunization, etc., but also create tables of the location based related information, etc.
home page title nice
site looks good
individual product pages have good data. Matt Cutts calls some of their paid backlinks "painfully obvious" to most any search engine. Matt said those links are not hurting them, but they are not helping in Google.
could probably be rather easy for a site like that to get many links from beer hobbyist sites
question about looking at their sitewide links to IACI partner network
instead of looking to rank for mortgage Matt recommends looking for 20 year mortgage loan, etc.
Jake recommends geo targeted pages
Matt recommends maybe adding more text, but they are already looking at ROI testing and that is why there is limited text
internal links can help reinforce topics
Matt said their cross network linking seems pretty organic / not with intent to spam. Note that in Google's spam review guidelines IACI's travel sites were whitelisted examples for remote quality search raters
mortgage calculator link on LendingTree built for a manipulation test on Google...that was the reasoning the guy said and Matt Cutts made a funny face
Matt Cutts said the partner links section on IACI properties as a technique do not work in Google.
Matt said the goal of engines is to detect and count editorial quality votes.
Testing fixing 302's. Want to accept destination URL except for like 0.5% of the time. Gives SF Giants URL as an example.
Some things in the index can be perceived in our process as the sandbox...does not apply to all sites.
Does not see Google buying DMOZ or killing reliance on it.
Google does not have the ability to hand boost any sites. They do have the ability to penalize things by hand they believe are spam or illegal.
Autolink...references how it was liked at Web2.0. Thinks the launch could have been better. Would like to allow users to enter their own triggers.
Users and privacy...to take search to the next level you need some information about the users. Matt said he wouldn't work at a company that he felt violated users privacy.
Matt has never worried much about hidden table row type techniques to organize word order. With CSS, if you want to see how it influences a file, test it.
Toolbar does not influence how frequently stuff is crawled. It is too easy to spam, and the toolbar does not have equal distribution across various regions. Many people assume some things provide clean signals which are not so clean.
Matt, as a webspam team member, said he has no ability or intent to access the Google Analytics data.
Litmus test of a site for spam is what value does it add to the web. User reviews, forums, community, etc. What makes a site unique.
Matt Cutts hates on paid links. He said they have manual and algorithmic approaches to paid links. Compares effectiveness of paid links going forward to how reciprocal link spam has largely died off with Update Jager.
If you do something creative and useful it is easy to get quality links that are hard for your competitors to recreate.
competitive revenue opportunity (over 100,000 ad buyers)
opportunity to integrate with Yahoo! content & Yahoo! users
customer service & community
Size of Yahoo! Publishing beta?
approximately 2,000 publishers
they just launched ads in rss feeds
open to all beta participants
diversifies rev ops
aligns w growing shift to rss
supports movabletype and wordpress
ads optimized to drive revenue
Yahoo! stated some think 5-6% of web users use rss but Yahoo! research showed it was closer to 30% of web users.
Jen asked if Yahoo! has anything similar to Google AdWords smart pricing?
not needed for the following reasons
allows advertisers to bid separately for the different content channels
Yahoo! is more selective with partners
Jen asked when Yahoo! Publisher would be global
likely early 2006
plans for an affiliate program?
want to work to lower bar to make it easier for publishers to make money and work with Yahoo!...will allow affiliate program and will likely eventually support cpm pricing
wide range of topics on one site...how to be relevant?
can target ads at page level, directory level, or site level...can allow page or directory to override the site level targeting
going to change rev share percentage after beta?
absolutely not, but eventually may use traffic quality to adjust click price
Will Yahoo! offer behavioral targeting on contextual ads?
no nearterm plans, but may eventually
Rate of revshare / how compare to Google AdWords?
Yahoo! does not share the revshare %. more interested in being competitive in allowing you to monetize.
revshare by publisher will vary over time
may eventually say you are in x range... to get in another range you may need to (get more traffic higher quality clicks etc)
Jen said targeting was no good at start...now better...is it where it needs to be?
still working to improve...pleased with speed at which it is being made better
Will Yahoo! offer a premium publisher program?
may give advertisers more control over who working with. but even small publisher may be premium if quality targeted traffic etc
How long will Yahoo! publisher in beta?
maybe toward end of q1
Jen asked plan on cpm ads?
may add cpm cpa. yahoo already does cpm on internal network
he once lost a 96,000 word manuscript and there was no restore function. He created the trash can on Apple's project Lisa, making emptying it a two step process.
in 1984 he helped build internal and corporate communications for Apple. In 1991 Apple sold that to Quantum Data Physics (later named AOL)
spooks went to Xerox PARC and Xerox offered a huge price for a computer. the price was too high. they went to Stanford, and although they never originally created computers to sell, Sun (Stanford University Network workstation) was born. Stanford saw no intellectual property in Sun.
cisco came out of the same building as Sun. It used the same motherboard as Sun. cisco started on credit cards
typically companies can get to $600K in monthly sales on credit cards; they then typically fail if they are still funding on credit.
Robert could have got 15% of Excite for $1,800 (I think that was the number)
recently he has been working on PBS GeekTV
he tracks his accuracy, thinks someone should create something like accuracy in media.com
talks about consolidation in the space... msn /goog /yhoo only serious competitors.
windows and office profitable...nothing else at msft is
msft has cost items
xbox 4 billion dollars lost
they spend tons of money on other stuff as a plan B in case office & windows fail
extra expenses there so they can later cut them if profits from office or windows falter
thinks google wanted the 4 billion to buy / create something (but unsure what)
Dan Thies was recently interviewed by Pandia. One big thing he stresses is the concept of opportunity optimization, and how many people focused on SEO are missing out:
Beginners have a hard time looking at the rest of the picture. Their #1 problem is probably not traffic, it's conversion, usability, opt-ins, follow-up, pricing, making the right offer.
You can use search engine marketing to help you solve these problems, but if you don't solve them you will eventually fail. Those who make the most profit per visitor have the most resources to compete for rankings and ad placement.
When I speak with someone who wants to improve their rankings, I usually ask if they do pay-per-click. Invariably, the answer is "no, we can't afford that." The bad news is that if your website can't convert well enough to support a PPC campaign, you'll often find that SEO is even more costly, especially in the short term.
To be honest I am pretty guilty of not maximizing monetization per pageview. I struggle a bit with the issue of trying to write about what I find new and interesting when if I dumbed down most of the blog posts to be more fitting toward newbies my sales could probably double or triple.
The people who vote with their link popularity to help boost your authority are frequently not the same people who buy your goods and/or services. What are the best ways you have found to be linkable and target new people without seeming overtly boring, etc? Content in multiple formats? Multiple similar channels? Free email tips?
The low cost of traffic from quality SEO can be as much of a problem as a blessing, because it allows people to get away with being fairly inept in other business facets, to the point that when the SEO techniques they use no longer work the only solution is to close the business.
"Why not improve the brain?" Brin asked. "You would want a lot of compute power. Perhaps in the future, we can attach a little version of Google that you just plug into your brain. We'll have to develop stylish versions, but then you'd have all of the world's knowledge immediately available, which is pretty exciting."
Apparently the people at Google want to rent weekly digital access to books.
Web search leader Google Inc. has approached a book publisher to gauge interest in a program to allow consumers to rent online copies of new books for a week, The Wall Street Journal reported on Sunday.
The proposed fee is 10 percent of the book's list price, the Journal reported, citing an unnamed publisher.
The discussion with the publisher indicates Google may move toward adding a digital book-renting service. - Reuters
In related news about other business models Google and the web may be changing or killing off, Knight Ridder, the large newspaper company, is exploring selling itself. When will Google Affiliate come out?
Worried Google will use your data or the data overall to better understand how much you are willing to pay for ads, based on conversions. Google said that's definitely not done, nor are there any plans to do that. Nor are there any plans to tap into the data as a means of improving regular search results or to identify "bad" sites, Google said.
Peter asks where that info came from, and I gotta wonder how smart pricing works if they ignore the value received from a click. Why would they only track it one way on certain accounts? That seems counter to that whole efficient keyword market theory so much research is being done on. What value does the data have if they are not going to use it?
Even if they only use your data in aggregate, if you are exceptionally profitable on some terms those keywords could be suggested more frequently to competitors (to help raise those keywords to near fair market value), and the smart pricing would discount less on content that your site proves converts. Search engines do not need to know how much money you are making off any term; just a peek at the ratios can help give them a good idea when they have enough other data.
You know the search engine wars are at their peak the day most computers, ISP, and general web hosting is free and you are being paid to surf. :)
By writing articles for high quality sites you get high TrustRank links cheaper than you can rent or buy them, many secondary links, and added credibility (I think Andy Hagans may have been asked to speak at a cool conference largely based on a recent article).
As an added bonus, when search engines place more bias on global popularity scores your article can show up for rather competitive terms if your site for some reason drops out of the results.
I was just looking through Google's [search engine optimization], and after SeoBook.com has been around for close to two years it ranks at ~ 30 in Google, and Andy Hagans recent article on A List Apart ranks at #19.
WordPress hosted about 4,000 content articles about expensive topics. Matt Mullenweg hosted the content on Wordpress.org and placed hidden links on the home page pointing at the articles.
WordPress, the popular blog software which used the hidden links, was back in the Google index quickly. Google is still punishing the owner of HotNacho to this day, as Chad states:
They seem to have taken punitive measures by looking up my other sites via WHOIS and punitively banning a bunch of my sites -- including my hobby freeware sites.
Sites I own (all of which Google has banned):
Thoughts on his article:
I don't like his comparisons of his content vs real spam, but his point that it is hard for human compiled content to be profitable against automated systems is accurate on many fronts.
Him saying Google controls over 90% of web traffic right after complaining about others not doing any fact finding undermines his credibility.
He has some good ideas on content rating and the importance of user feedback, and on using strong quality guidelines from the start.
I know many other friends who run the exact same business model, but do it profitably, successfully, and in Google's good graces because of how the content is formatted. Wrap it in a blog and post a few articles a day to each channel.
While he was talking about how his keyword placement software could increase the ability of content to rank, I think it is an error to look at it purely from an algorithmic front. The social structure of content matters.
It is far easier to build links into topical channels (such as blogs) than article banks.
He talks about creating a bunch of freeware and offering free support. With the mob justice on the web, doing good on one front does not offset the actions on others.
I think it is pretty shitty of Google to have banned all of his sites. I mean who does this help? Where is the relevancy?
And yet Google funds much of the garbage they purportedly hate. Google not only acts reactively, but blatantly overreacts when certain issues become public. I suppose they were trying to send a message to Chad Jones, but it was not one honestly focused on search relevancy. I wish I had seen this article sooner.
The fact that few people have mentioned the Hot Nacho article shows how biased blogs are at grabbing the front end of the story and then prowling for the next story before adding any depth or further research. Sorta reminds me of the Nirvana song Plateau, although I admit I am just as guilty of it as the next blogger.
potential bad publicity (few things suck as bad as Danny Sullivan highlighting one of your own link exchange requests as being bad, as you know that probably gets read by MANY search engineers)
frequently exchanging way off topic makes your site less likely to be linkable from the quality resources on your topic (and, to a lesser extent, may cost you some of the quality links you already have)
If sites are willing to trade way off topic that means odds are pretty good that much of their link popularity is bottom of the barrel link spam. Thus as you trade more and more off topic links a larger and larger percent of your direct and indirect link popularity comes from link spam that is easy to algorithmically detect.
The net result is that a somewhat well trusted and normal-looking link profile starts to look more and more abnormal. Eventually bad publicity or the low quality links may catch up with the site and it risks either getting banned or filtered out of the search results.
If you have a longterm website, and are using techniques that increase your risk profile and are easily accessible to and reproducible by your competitors at dirt cheap rates it might be time to look for other techniques.
Some sites that practice industrial strength off topic link spam might be ranking well in spite of (and not because) some of the techniques they use.
[Update: just got this gem
My name is Ben, and I'm working with Search Engine Optimisation [URL removed].
I have found your site and believe it would be mutually beneficial for us to exchange links as our sites share the same subject matter. As you may already know, trading links with one another helps boost both of our search engine rankings.
As a result, I am sending this email to inform you about our site and to propose submitting our link to your web page located at; www.search-marketing.info
We would appreciate if you could add a link to our web site on this/your web page, using the following information:
But now comes the hard part. How do you go about creating a blog about search marketing that is truly unique?
Anyone ever notice that the black hat SEO blogs typically have both higher content quality and more original content than the typical white hat SEO blogs? Apparently, Gordon Hotchkiss has yet to get the memo.
Panel 1: The Search Space
This panel will review the wide range of what search engines do and their importance in the information ecosystem.
Panel 2: Search Engines and Public Regulation
This panel will discuss the possibility of direct government regulation of search functionality.
Panel 3: Search Engines and Intellectual Property
This panel will review past and present litigation involving search engines and claims framed in the legal doctrines of copyright, trademark, patent, and right of publicity.
Panel 4: Search Engines and Individual Rights
This panel will look at the role of search engines in reshaping our experience of basic rights and at the pressures the desire to protect those rights place on search.
Early bird registration fees (early registration ends on Nov. 15):
Google Inc. (GOOG) is considering testing print advertisements in Chicago newspapers, in a sign that the Internet giant, to date seen primarily as a threat to traditional media, could also become an ally.
If Google could take the inefficiencies out of offline media they probably could end up making the papers more revenue in the short run. Long run is anyone's guess.
It was authored by Zoltan Gyongyi, Pavel Berkhin, Hector Garcia-Molina, and Jan Pedersen.
The proposed method for determining Spam Mass works to detect spam, so it complements TrustRank nicely (TrustRank is primarily aimed at detecting quality pages and demoting spam).
The paper starts off by defining what spam mass is.
Spam Mass - an estimate of how much PageRank a page accumulates by being linked to from spam pages.
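In rough terms (my paraphrase of the paper's definitions, so treat the notation as approximate): if PR(p) is the regular PageRank of page p and PR'(p) is the PageRank p receives when the random jump is biased toward a known good core, then

```latex
M(p) = \mathrm{PR}(p) - \mathrm{PR}'(p)
\qquad
m(p) = \frac{M(p)}{\mathrm{PR}(p)} = 1 - \frac{\mathrm{PR}'(p)}{\mathrm{PR}(p)}
```

where M(p) is the absolute spam mass and m(p) the relative spam mass: the fraction of p's PageRank that cannot be traced back to trusted pages.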
I covered a bunch of the how it works in theory stuff in the extended area of this post, but the general takehome tips from the article are
.edu and .gov love is the real deal, and then some
Don't be scared of getting a few spammy links (everyone has some).
TrustRank may deweight the effects of some spammy links. Since most spammy links have a low authority score they do not comprise a high percentage of your PageRank weighted link popularity if you have some good quality links. A few bad inbound links are not going to put your site over the edge to where it is algorithmically tagged as spam unless you were already near the limit prior to picking them up.
If you can get a few well known trusted links you can get away with having a large number of spammy links.
These types of algorithms work on a relative basis. If you can get more traditional media coverage than the competition you can get away with having a bunch more junk links as well.
Following up on that last point, some sites may be doing well in spite of some of the things they are doing. If you aim to replicate the linkage profile of a competitor make sure you spend some time building up some serious quality linkage data before going after too many spammy or semi spammy links.
Human review is here to stay in search algorithms. Humans are only going to get more important. Inside workers, remote quality raters, and user feedback and tagging gives search engines another layer to build upon beyond link analysis.
Only a few quality links are needed to rank in Google in many fields.
If you can get the right resources to be interested in linking your way (directly or indirectly) a quality on topic high PageRank .edu link can be worth some serious cash.
Sometimes the cheapest way to get those kinds of links will be creating causes or linkbait, which may be external to your main site.
On to the review...
To determine the effect of spam mass they compute PageRank twice: once normally, and then again with more weight on known trusted sites that would be deemed to have a low spam mass (a rough sketch of the resulting computation appears at the end of this post).
Spammers either use a large number of low PageRank links, a few hard to get high PageRank links, or some combination of the two.
While the quality authoritative links to spam sites are more rare, they are often obtained through the following
blog / comment / forum / guestbook spam
honey pots (creating something useful to gather link popularity to send to spam)
buying recently expired domain names
if the majority of inlinks are from spam nodes it is assumed that the host is spam, otherwise it is labeled good. Rather than looking at the raw link count this can further be biased by looking at percent of total PageRank which comes from spam nodes
to further determine the percent of PageRank due to spam nodes you can also look at link structure of in-direct nodes and how they pass PageRank toward the end node
the presumption of knowing whether something is good or bad is not feasible, so it must be estimated from a subset of the index
for this to be practical search engines must have white lists and / or black lists to compare other nodes to. this can be automated or manually compiled
it is easier to assemble a good core since it is fairly reliable and does not change as often as spam techniques and spam sites (Aaron speculation: perhaps this is part of the reason some uber spammy older sites are getting away with murder...having many links from the good core from back when links were easier to obtain)
since the small reviewed core will be much smaller of a sample than the number of good pages on the web you must also review a small random uniform sample of the web to determine the approximate percent of the web that is spam to normalize the estimated spam mass
due to sampling methods some nodes may have a negative spam mass, and are likely to be nodes that were either assumed to be good in advance or nodes which are linked closely and heavily to other good nodes
it was too hard to manually create a large human reviewed set, so
they placed all sites listed in a small directory they considered to be virtually devoid of spam in the good core (they chose not to disclose the URL...anyone want to guess which one it was?). this group consisted of 16,776 hosts.
.gov and .edu hosts (and a few international organizations) also got placed in the good core
those sources gave them 504,150 unique trusted hosts
of the 73.3 million hosts in their test set 91.1% have a PageRank less than 2 (less than double the minimum PageRank value)
only about 64,000 hosts had a PageRank 100 times the minimum or more
they selected an arbitrary limit for minimum PageRank for reviewing the final results (since you are only concerned about the higher PageRank results that would appear atop search results)
this left a group of 883,328 hosts, of which they hand reviewed 892
564 (63.2%) were quality
229 (25.7%) were spam
54 (6.1%) uncertain (like beauty, spam is in the eye of the beholder)
45 (5%) hosts down
All of the high spam mass anomalies on good sites fell into the following three groups:
some Alibaba sites (Chinese sites were far removed from the core group),
Blogger.com.br (relatively isolated from core group),
.pl URLs (there were only 12 Polish educational institutions in the core group)
Calculating relative spam mass works better than absolute spam mass (which is only logical if you want the system to scale, so I don't know why they bothered putting it in the paper); a toy sketch of the relative calculation follows these examples. Example of why absolute spam mass does not work:
Adobe had the lowest absolute spam mass (Aaron speculation: those taking the time to create a PDF are probably more concerned with content quality than the average website)
Macromedia had the third highest absolute spam mass (Aaron speculation: lots of adult and casino type sites have links to Flash)
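To make the two PageRank runs a bit more concrete, here is a toy sketch of relative spam mass estimation. This is my own illustration, not code from the paper or from any engine: the link graph, the trusted core, and the damping factor are all invented, and the real method also involves scaling the core-biased scores.

import numpy as np

# hypothetical link graph: host index -> hosts it links to
links = {0: [1, 2], 1: [2], 2: [0], 3: [2, 4], 4: [3]}   # hosts 3 and 4 link swap
n, damping = 5, 0.85
good_core = [0, 1]   # hosts trusted in advance (assumption)

def pagerank(teleport):
    out = {i: len(t) for i, t in links.items()}
    p = np.full(n, 1.0 / n)
    for _ in range(100):
        inflow = np.array([sum(p[j] / out[j] for j in links if i in links[j])
                           for i in range(n)])
        p = damping * inflow + (1 - damping) * teleport
        p = p / p.sum()
    return p

uniform = np.full(n, 1.0 / n)
core = np.zeros(n)
core[good_core] = 1.0 / len(good_core)

p = pagerank(uniform)    # normal PageRank
p_core = pagerank(core)  # PageRank with teleportation biased toward the good core

relative_spam_mass = (p - p_core) / p
for host in range(n):
    print(f"host {host}: relative spam mass {relative_spam_mass[host]:.2f}")

Hosts sitting close to the trusted core can come out with a negative spam mass, which lines up with the sampling note above, while hosts whose PageRank mostly arrives from outside the core end up with a mass near 1.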
[update: Orion also mentioned something useful about the paper on SEW forums.
"A number of recent publications propose link spam detection methods. For instance, Fetterly et al. [Fetterly et al., 2004] analyze the indegree and outdegree distributions of web pages. Most web pages have in- and outdegrees that follow a power-law distribution. Occasionally, however, 17 search engines encounter substantially more pages with the exact same in- or outdegrees than what is predicted by the distribution formula. The authors find that the vast majority of such outliers are spam pages. Similarly, BenczÂ´ur et al. [BenczÂ´ur et al., 2005] verify for each page x whether the distribution of PageRank scores of pages pointing to x conforms a power law. They claim that a major deviation in PageRank distribution is an indicator of link spamming that benefits x. These methods are powerful at detecting large, automatically generated link spam structures with â€œunnaturalâ€ link patterns. However, they fail to recognize more sophisticated forms of spam, when spammers mimic reputable web content. "
So if you are using an off the shelf spam generator script you bought from a hyped up sales letter and a few thousand other people are using it that might set some flags off, as search engines look at the various systematic footprints most spam generators leave to remove the bulk of them from the index.]
There is a thread on WMW about the right price to sell an article for. The general consensus is that the author should probably wait it out until their site ranks and just keep their content.
While that is nice in theory, there is no guarantee that a site will eventually rank well just because it has decent content. Of course I am taking stuff out of context here, but you can read the thread to get the gist.
If one site is willing to pay you, it doesn't make sense to give your articles away to another site just to get a link.
A friend of mine recently published an article on A List Apart. I think it would be hard to sell most any article for the value he is getting out of the authority of the link from that site, let alone the boost in credibility.
plus good primary links to your site may lead not only to direct exposure and link popularity, but also secondary exposure and more link love.
since your site is new you likely have lots of content and not so many links.
Whatever you decide, don't make the mistake of granting anyone exclusive rights to publish your work in perpetuity for peanuts.
for books I totally agree, but if you are obscure / new and / or are operating in a not so well known field and are good at writing articles sometimes giving them away is a great form of marketing.
Regarding your site, you will never leave the sandbox unless you keep your content 100% to yourself.
I think sitting chill with minimal link popularity is far worse than trading some of what you got a lot of for something you don't got a lot of (ie: content for links)
The web has taught me a lot about not dwelling on what things could or should be worth: unless you actively work to make them worth it, inferior products which are marketed more aggressively will often win big.
if you have around a hundred articles I don't think it hurts you to share a few of them.
Some of the links you get by giving stuff away are links you never could have bought. Those are the ones that are usually worth a bunch too.
You know you have good reach as a search engine when registrars use your patent numbers to sell domains. GoDaddy says:
Google recently filed United States Patent Application 20050071741. As part of that patent application, Google made apparent its efforts to wipe out search engine spam, stating:
"Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith."
Domains registered for longer periods give the indication, true or not, that their owner is legitimate. Google uses a domain's length of registration when indexing and ranking a Web site for inclusion in their organic search results.
So to prove to everyone that your site is the real deal, register for more than one year and increase your chances of boosting your search ranking on Google.
I know registrars always sell bogus submit your site to the search engines garbage, but I don't think I have ever seen one recommend registering for extended periods of time because of a Google patent before.
Smart marketing on them, and smart marketing on Google for putting endless amounts of FUD in that patent.
After selling a few thousand ebooks those numbers make getting published seem far less appealing, especially when you consider that my prospective publisher told me they still wanted me to do most of the marketing.
What is even more nuts is that I guaranteed the publisher I would be able to sell more than that in under a year as an add on to my current book (even offering to buy that many copies up front), and they still considered that to be small volume and would want me to push the physical book more prominently than the current ebook offering.
I don't get why so many businesses think it is ok to shift nearly all the risk onto another successful business just because they are smaller or new to the market.
CBS and Comcast have signed an agreement that, beginning in January 2006, will make four of the network's shows--CSI: Crime Scene Investigation, Survivor, NCIS, and Amazing Race--available on a video-on-demand basis for 99 cents per 24-hour window for each show.
Eventually the content floodgates will open. The question is who will own the distribution rights and at what cost? Does anyone think the baby bells or cable companies will be less monopolistic or more efficient than Google?
Disclaimer: I am not real good at business. When selling services I always sold myself short, which made it pretty hard to scale services while working by myself. Hence the writing the ebook and some pieces of the Cosmos falling into place for me :)
Not sure if I have seen this mentioned before. Dan Thies noticed Googlebot's wildcard robots.txt support:
Google's URL removal page contains a little bit of handy information that's not found on their webmaster info pages where it should be.
Google supports the use of 'wildcards' in robots.txt files. This isn't part of the original 1994 robots.txt protocol, and as far as I know, is not supported by other search engines. To make it work, you need to add a separate section for Googlebot in your robots.txt file. An example:
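[The snippet itself did not make it into this post. Based on Google's description of its wildcard support, it would look something along these lines:]

User-agent: Googlebot
Disallow: /*&sort=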
This would stop Googlebot from reading any URL that included the string &sort= no matter where that string occurs in the URL.
Good information to know if your site has recently suffered in Google due to duplicate content issues.
It sure is amazing how many large vertical sites, .edu, and .gov results I saw in a few searches I did. Although there will probably still be a good amount of flux, most of the stuff I worked on seemed to get through ok.
I did see a bit of canonical URL issues, as noted by others on Matt's blog. Someone named Jason also left this gem in Matt's comments:
Our site has been negatively affected by Jagger. Therefore we just requested the transfer of 30,000 site wide links (paid in advance until July 06) to our main competitor who is currently ranked extremely well in Google for our main keyword.
Our entire website is legit SEO so our site wide links are the only thing that could have caused such a drastic drop in our ranking.
In a thread on SEW, DaveN responded to a similar webmaster:
In life there are 2 ways to get on:
1) Be the best you can and move to the top
2) Drag everyone who is above you to below your level ..
Both ways you end up at the Top, it depends on how you view life and how long you want to stay there.
As long as Google is going to announce their updates and data centers, has anyone made a free SEO tool to easily compare / cross reference all the search results at various data centers? (Perhaps something like Myriad Search, but focuses on just one engine and lets the users select which data centers to compare.) I can't imagine it would be that hard to do unless Google blocked it, but they haven't been too aggressive in blocking web based SEO related tools (just look at all the tools SEO Chat has).
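Just to show how simple such a tool could be, here is a rough sketch. The datacenter IPs are placeholders (swap in whichever ones people are watching this week), the scraping is deliberately naive, and Google may well block or break it at any time:

import re
import urllib.parse
import urllib.request

DATACENTERS = ["64.233.161.104", "66.102.7.104", "216.239.39.104"]   # placeholder IPs
QUERY = "seo book"

def top_results(ip, query, count=10):
    # fetch a result page straight from one datacenter and pull out external URLs
    url = f"http://{ip}/search?q={urllib.parse.quote(query)}&num={count}"
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=10).read().decode("latin-1", "ignore")
    urls = re.findall(r'href="(http[^"]+)"', html)
    return [u for u in urls if "google" not in u][:count]

results = {ip: top_results(ip, QUERY) for ip in DATACENTERS}
baseline = set(results[DATACENTERS[0]])
for ip, urls in results.items():
    overlap = len(baseline & set(urls))
    print(f"{ip}: {overlap}/{len(urls)} top URLs shared with {DATACENTERS[0]}")

Nothing fancy: pull the same query from each datacenter, grab the outbound result URLs, and report how much each set overlaps with the first one.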
I work by myself, and am always a bit scared of spreading myself too thin, so I have not been too active on the old domain buying front.
Having said that, now would probably be a good time to buy old domains. Jim Boykin again mentioned his new love for oldies and Graywolf said
Came to the same conclusion myself, emailed about 150 people picked up 2 domains from 2000 for under $1K.
Think of how cheap those site purchases are. Decent links can cost $50 to $300 or more each, so buying whole sites for $500 is cheap cheap cheap! How cheap is it? Even the most well known link broker is recommending buying a few old domains.
Why now is the perfect time to buy old domains:
It is right before the Christmas shopping season and many people not monetizing their sites might be able to use a bit of spare cash.
Many older domains are doing better than one would expect in Google's search results, which means they may recoup their costs quickly.
As Andy Hagans said, "Some older sites seem to be able to get away with murder in Google's search results."
Link popularity flowed much more naturally to commercial sites in the past than it does now. This means buying something with a natural link profile may be far cheaper than trying to reproduce similar linkage data from scratch.
At different times search algorithms show you different things. Right before the Christmas shopping season each of the last few years Google seems to have rolled out a new algorithm that whacked many sites which SEO'ed their way to the top (IMHO via link trading and low quality linkage data). Most of the algorithm changes are related to looking at linkage quality, communities, and ways to trust sites. The most recent update seems to have (at least temporarily) dialed up the weighting on TrustRank or a similar technology, which has had the net effect of highly ranking many old/trusted/authoritative sites that may lack much query specific authority. If you shop for sites that fit the current Google criteria well and then add some good SEO you should be sitting good no matter which way the algorithms slide.
I recommend that everyone spend their full attention coming up to speed on beta.search.msn.com.
It's very rare to get to see a search engine in transition, because that's the best time to see what the different criteria are for ranking.
Now that Google is in a state of flux it might be a good time to perform many searches to look for some underpriced ad inventory. If you know what you are looking for you are more likely to find it in the organic search results than in the AdWords system.
The search vs SEO cat fight:
keyword stuffing documents and cloaking
copy and paste the top ranked site's code, resubmit
any link spam goes (guestbooks, etc.)
targeted anchor text
Florida update, generic directories ranked way too well
We are removing our link to you now. PLEASE return the courtesy and remove your link to us!
Note that Google is updating its results this week and failure to remove these links immediately will likely mean not showing up in Google for AT LEAST the next 4 months!
Thank you for understanding,
The email is bogus and factually incorrect, and I don't think I ever traded links with the site mentioned, which is exactly why this email is extra crappy.
If you trade links highly off topic you increase your risk profile, and if it helps you rank:
Whenever there is an update your competitors can send these remove my link reminders out for you.
There are only a limited number of relationships you can have. If you trade links with a huge number of sites, a higher percentage of your outbound links will point at junky sites than is typical, you will have more dead links than most quality sites, and many of those people will eventually remove their links to you.
Your competitors could pay people from Jakarta $3 a day to go through your link trades and trade the same links.
Quality on topic sites may be less likely to link to you if your site frequently links off to low quality resources.
I think most sites which recently went south in Google did so because they lacked quality linkage data, not because they had too many links.
Since most people are still thinking "the numbers game" when it comes to obtaining links, they end up buying "numbers" from "monkeys" on crappy link pages.
When will the world wake up that the numbers game has passed its tipping point in Google? Engines are trying to get smarter with how they analyze sites. My overall thought is that they are working to identify, simply, "Links within Content and Linking to Content"
[Amazon] is introducing two new programs that allow consumers to buy online access to portions of a book or to the entire book, giving publishers and authors another way to generate revenue from their content.
Although Bezos does not come right out and say it, clearly this is a shot across the bow at Google, especially given the timing of Google's recent print offering.
While Amazon Chief Executive Jeff Bezos wouldn't comment specifically on the Google Print controversy, he said, "It's really important to do this cooperatively with the copyright holders, with the publishing community, with the authors. We're going to keep working in that cooperative vein."
After Google develops their micropayment system I bet they also directly broker a large amount of media.
This GenericScore may not appropriately reflect the site's importance to a particular user if the user's interests or preferences are dramatically different from that of the random surfer. The relevance of a site to a user can be accurately characterized by a set of profile ranks, based on the correlation between a site's content and the user's term-based profile, herein called the TermScore, the correlation between one or more categories associated with a site and the user's category-based profile, herein called the CategoryScore, and the correlation between the URL and/or host of the site and the user's link-based profile, herein called the LinkScore. Therefore, the site may be assigned a personalized rank that is a function of both the document's generic score and the user profile scores. This personalized score can be expressed as: PersonalizedScore = GenericScore * (TermScore + CategoryScore + LinkScore).
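Taken literally, the formula just multiplies the query independent score by how well the site matches your profile. A toy example with invented numbers:

# invented numbers: how the patent's PersonalizedScore formula combines the pieces
generic_score = 0.6                                        # query independent importance
term_score, category_score, link_score = 0.9, 0.4, 0.2     # hypothetical profile matches

personalized = generic_score * (term_score + category_score + link_score)
print(personalized)   # 0.6 * 1.5 = 0.9 -- a strong profile match boosts the site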
For those big into patents: Stephen Arnold has a $50 CD for sale containing over 120 Google patent related documents.
I think he could sell that as a subscription service, so long as people didn't know all the great stuff Gary Price compiles for free. (Link from News.com)
Microsoft said Thursday it has agreed to buy Media-streams.com, a privately held firm in Zurich, Switzerland. Financial terms were not disclosed.
Media-stream's VoIP technology, which enables telephone calls over the Internet, will become a core part of Microsoft's platform that enables workers to use the Web to collaborate on projects. Microsoft envisions such collaboration encompassing several different modes of communication, including email, instant messaging, Web conferencing and telephone calls via the Internet.
Media-streams is the second VoIP firm acquired by Microsoft in the last few months. In August, Microsoft acquired Teleo.
The study shows most Google users are primarily there because they believe Google has the most relevant results (although the fact that Google has not had a longstanding portal the way its competitors have may bias the study toward that conclusion).
His site has a tech bias, so I believe that favors Google somewhat, but Google sends the bulk of his referrals. MSN and AOL users are much more likely to click content ads than Google or Yahoo! users. I believe that is a function of user sophistication. Less sophisticated people are click happy because they probably don't know they are clicking paid ads.
The company said it won't display advertisements on public domain book pages or any book pages Google scans from a library.
Perhaps Google realizes that being the default search engine means it can afford a few loss leaders, and not monetizing the public domain works undermines the "Google is a greedy company" line cried far and loud by critics of the program. Google Print is facing numerous pending lawsuits.