The difference is that now, the CTR of the ad copy itself is factored in, instead of it being solely the CTR of the keyword. That only makes sense, IMO, given that it is the quality of the keyword and the particular ad it brings up that defines relevance for a given search. source
I always like smaller conferences because when they get too big you (as in me) feel lost in the shuffle. It only costs $100 to attend this one. Smart move by JupiterMedia, as this will surely prevent others from having an easy entry into this market space.
The $199 per month Urchin On Demand also now includes report profiles for up to fifty individual websites (Urchin's previous offering included reporting for only one site). The price includes up to 100,000 pageviews per month. Users can add one million more pageviews for only $99 more per month.
In addition to the reduced price and increased number of profiles, Urchin On Demand is now able to import pay-per-click costs directly from Google AdWords accounts.
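The pricing above works out to a simple step function. Here is a toy sketch of it in Python; the assumption that the $99-per-million-pageviews add-on can be stacked repeatedly is mine, not stated in Urchin's announcement, and the function name is invented for illustration.

```python
import math

BASE_PRICE = 199          # USD per month, includes 100,000 pageviews
BASE_PAGEVIEWS = 100_000
EXTRA_BLOCK_PRICE = 99    # USD per additional block (assumed stackable)
EXTRA_BLOCK_VIEWS = 1_000_000

def monthly_cost(pageviews: int) -> int:
    """Estimated Urchin On Demand monthly cost for a pageview volume."""
    if pageviews <= BASE_PAGEVIEWS:
        return BASE_PRICE
    extra = pageviews - BASE_PAGEVIEWS
    blocks = math.ceil(extra / EXTRA_BLOCK_VIEWS)
    return BASE_PRICE + blocks * EXTRA_BLOCK_PRICE

print(monthly_cost(80_000))     # -> 199
print(monthly_cost(1_100_000))  # -> 298, one extra million-view block
```

So a site doing just over a million pageviews a month would pay roughly $298, which is cheap next to what log analysis packages cost at the time.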
Many smart search marketers probably are not willing to give Google all their data, even if they are paid to. As time passes you can be sure that Google will drop their costs further as they try to kill off the business models of everything between them and ad dollars.
LookSmart, FindWhat, & Mamma are dropping like it's not hot. InfoSpace (which does search & mobile technology) recently lost about 30% of its market value as well.
When I was new to SEO I did a bunch of on-the-page analysis to try to figure out exactly what other people were doing. The problem is that it gets you focused on things that do not matter. A site may end up ranking high at the expense of conversion.
As search algorithms advance basic link analysis tools, at least for Google, are starting to become what keyword density tools are: a waste of time.
Link analysis software was cool, especially when Google used to show all of the PageRank 4 and above links, back when their search relevancy algorithm was a bit more dependent on raw PageRank.
Now Google only shows a limited random set of backlinks, and the other search engines also limit the search depth to 1,000 results, which makes it hard to do useful analysis with the various link analysis tools on the market.
If it were quick and easy to query a database deeply (deeper than 1,000) then the link analysis tools would be much more useful. None of them currently on the market really make that a quick and easy process.
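The 1,000-result cap is worth illustrating, because it silently truncates any analysis built on top of it. Below is a toy simulation, not any real engine's API: `fetch_backlinks` is a stand-in for a backlink query that refuses to page past the cap, and the numbers are invented.

```python
RESULT_CAP = 1_000  # the depth limit described above
PAGE_SIZE = 100

def fetch_backlinks(all_links, offset, limit):
    """Simulated engine query: will not page past RESULT_CAP."""
    if offset >= RESULT_CAP:
        return []
    return all_links[offset:min(offset + limit, RESULT_CAP)]

def collect_backlinks(all_links):
    """Page through results the way a link analysis tool would."""
    collected, offset = [], 0
    while True:
        page = fetch_backlinks(all_links, offset, PAGE_SIZE)
        if not page:
            return collected
        collected.extend(page)
        offset += PAGE_SIZE

# A site with 5,000 real backlinks looks identical to one with 1,000.
links = [f"http://example.com/page{i}" for i in range(5_000)]
print(len(collect_backlinks(links)))  # -> 1000
```

Any two competitors with more than 1,000 links are indistinguishable through such a window, which is why deeper queries would make these tools useful again.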
To keep improving the results, you find more variables for the algorithm-creating machine to use, and you add to your store of human-ranked pages for it to "learn" from. What you don't do is bother understanding the actual algorithm -- it was constructed by a machine and is way too complex for anyone to keep in their head.
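The loop described above, feed the machine more variables and more human-rated pages, can be sketched in miniature. This is a toy linear learner with invented features and ratings, nothing like the scale or form of a real engine's ranker; the point is only that the resulting weights are fit by the machine, not designed by a person.

```python
def train_ranker(pages, ratings, lr=0.05, epochs=2000):
    """Fit linear weights to human relevance ratings by gradient descent."""
    w = [0.0] * len(pages[0])
    for _ in range(epochs):
        for x, y in zip(pages, ratings):
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

# Invented feature vectors: [keyword match, link score, freshness]
pages   = [[0.9, 0.8, 0.1], [0.2, 0.9, 0.9], [0.5, 0.1, 0.3]]
ratings = [0.95, 0.60, 0.20]  # human raters' relevance scores

w = train_ranker(pages, ratings)
score = lambda x: sum(wi * xi for wi, xi in zip(w, x))
# After training, the learned scores follow the human ratings,
# but nobody wrote down a rule saying why.
print([round(score(p), 2) for p in pages])
```

Add a fourth variable or a thousand more rated pages and you retrain; inspecting the weights by hand stops being meaningful long before you reach real-world scale.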
Psychologists have shown repeatedly that when you give people a system to optimize, all you have to do is secretly introduce a delay between their actions and the results of their actions, and they will go bonkers. In fact, in a very simple (single variable!) model in which people are trying to control the temperature in a virtual refrigerator, you can get some of the same irrational responses you see in these forums.
and the first post here by Captain CaveMan (which incidentally is the name of an awesome cartoon character) does as well:
Without giving away the store, I don't know how else to say it. There is no sandbox. People speak of it as though it were some simple 'thing' that stops new sites from being seen. That has simply never been true. What was true was that in its early days, some of the algo elements and related filters were so tight that only a very few new sites got past them (some accidentally; some methodically). Over time that changed; more sites started getting out, presumably as G worked to surface more new, higher quality sites.
There is no sandbox. There is only a series of rotating algos and related filters, that make it far harder for sites launched after spring of '04 to be widely seen in the SERPs. Not impossible. Harder. And certainly not as hard now as was true seven months ago. This has been hashed and rehashed so many times that it's hard to understand why it's still confusing.
If you can only see a few of the variables and overexert effort to satisfy those variables, you may end up tripping filters and not satisfying other criteria.
AdWords Spying: GoogSpy scrapes hundreds of thousands of searches from Google to determine who is bidding on what terms. The idea is killer, but the implementation is a bit lacking. Link found from ThreadWatch.
So I have been getting some of the Gmail feeds and ads recently. Hopefully I answered this question correctly or you the reader will call me dumb...
Bad Call #1:
Here is an example thread
Question from Search Marketing Info
Which internet search engine was co-founded by a math major who chose the name to imply a vast reach?
Thanks in advance,
Google was a misspelling of Googol, which means a 1 with a hundred
zeros behind it.
Larry Page founded it and Sergey Brin was his co-founder.
and here was Google's contextually targeted Gmail ad:
Head Gasket Blown? - www.rxauto.com - Repair It Yourself Guaranteed ThermaGasket The Mechanics Choice
That is data stored on Google's servers and that is the best that they can target it? When you couple that with all the AdSense spam sites and click fraud it really makes you wonder why Google assumes anybody would want that traffic.
Bad Call #2:
One of the default feeds was Engadget. Presumably because they run AdSense? Don't get me wrong here, it's cool to help smaller publishers, but if you put Engadget in there you should put Gizmodo in there too, unless you want people to question your motives.
Placing random, off-target, off-topic crap I don't want in my email is being evil. At least the old Hotmail dating ads would occasionally show pictures of cute girls ;)
I know that I can unsubscribe from feeds, but I shouldn't have to opt out. Maybe to start you could just promote Google News, Froogle, and your other portal pieces up there?
Bad Call #3:
Google actually places feeds in your spam folder. How stupid is it to place contextually relevant feeds near stuff that was deemed unwanted, useless junk? What better way is there to turn users off?
Another thing that is really weird is most (maybe all) of the spam folder feeds were for Spam recipes:
Spam Vegetable Strudel - Bake 20 minutes or until golden, serve with soy sauce
Savory Spam Crescents - Bake 12-15 minutes or until golden brown
French Fry Spam Casserole - Bake 30-40 minutes
They may place the Spam feeds in there to try to push the cute and innocent corporate culture, but I don't buy it.
After Google bogusly added the Toolbar AutoLink feature, which directed B&N customers to Amazon, many people became increasingly suspicious of potential hidden business partnerships. Is Google partnered with Hormel Foods now too?
I feared this post was reading as though I got it from Google's PR firm, so I felt I should include this... Google Blowout Quarter Update:
There's a blurb in the Wall Street Journal today that explains how Google's reported bottom line is being gamed by their own options program. It seems since they backloaded the options expense onto last year's earnings statement, this quarter's results will be ARTIFICIALLY BOOSTED almost 100%, even though it has absolutely nothing to do with their actual profitability as a business. Keep that in mind when they announce earnings tomorrow.
In after-hours trading Google shares are at over $220.
Google's first-quarter net income rose to $369.2 million, or $1.29 a share, from $64.0 million, or 24 cents a share, a year earlier. Profit from the most recent quarter included a $49 million charge for stock-based compensation.
Gross revenue nearly doubled to $1.26 billion from $651.6 million.
The results easily topped Wall Street's average net profit target of 78 cents a share. Analysts had seen profit excluding some items at 92 cents and revenue at $1.16 billion, according to Reuters Estimates. source
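A quick arithmetic check on the figures quoted above (all numbers are from the excerpt, the calculation is just mine):

```python
# Figures from the quoted earnings report
net_income   = 369.2e6   # most recent quarter, USD
prior_income = 64.0e6    # year-earlier quarter, USD
eps          = 1.29      # reported earnings per share
estimate_eps = 0.78      # Wall Street average net profit target

print(round(net_income / prior_income, 1))  # -> 5.8, income grew ~5.8x
print(round(eps / estimate_eps, 2))         # -> 1.65, beat the target by ~65%
```

Even allowing for the options-expense distortion the WSJ blurb describes, a 65% beat against the consensus target explains the after-hours pop.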
PageRank was broken from the start. The concept they were going after may well still exist, though, if they can get enough users of their search history tool. While other search engines still seem relatively easy to spam, Google may be trying to measure web-wide trust scores using much more than just raw linkage data.
Google need not stomp SEO techniques out, they only need to:
make the costs of SEO high enough that it is cheaper to build legitimate business models and brands than it is to build a business off of manipulating search results
Some people will be untouchable. They will know enough about social engineering and database programming that they will still spam Google all day long. I am sure Google realizes that, but they want to keep raising costs until that is an exceptionally small pool.
As SEO gets harder Google makes more money from ads. As they make more money from ads they can spend more on making SEO harder.
Now if only they could share more data with advertisers to help make click fraud easier to detect. Google bought Urchin. Why not buy, create, or offer something like Who's Clicking Who? Surely Google has the market data, and it would not increase costs much to give advertisers more options and more data.
A search company which makes tons of profit organizing data should recognize that by making advertising transparent and making more ad information available they will create a more efficient market which creates more profits. The advertising community would likely police themselves if you gave them enough data and responded to feedback.
Google Inc. (GOOG.O) on Wednesday debuted a test service called My Search History that analysts said is a move closer to personalized search, which is widely considered the Holy Grail for the Web search leader and its rivals. source
To use My Search History you must register at Google Accounts and maintain an active account. Ask Jeeves has had a search history tool for a while now, and Yahoo! has My Yahoo! for various personalization effects, although Yahoo! seems more focused on providing news and blog feeds and the like. I think Yahoo! is betting on the abundance of information making subscribing to channels much more appealing than searching the web. I believe Yahoo! also allows you to subscribe to Yahoo! News feeds by keyword phrase.
Personalized search allows engines to better understand users to improve search quality and ad targeting. Whoever is branded as the best market solution on that front is going to make a bucket of cash, because keeping your search history and learning the user raises the barrier to switching search providers.
It makes it hard for another search service to be as relevant if you have tons of personal information already locked in a competing service. This data will be hard to export to other systems as well, as importing huge hunks of data will also allow marketers to import large volumes of spam.
I just briefly tested Google's service. It is fairly slick. You can quickly sign in or out and it adds minimal clutter to the Google home page.
From the link in the upper right corner you are brought to a new page. It shows a calendar which color-codes your search volume on the right side. The left side shows your searches for that day and the results you clicked on. The My Search History results you have clicked on also show up in the Google OneBox area when you search for similar terms using the regular search results.
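That promotion of previously clicked results can be sketched with a toy re-ranker. This is a guess at the general idea, not Google's implementation; the boost factor, URLs, and base scores are all invented.

```python
def personalize(results, clicked_before, boost=1.5):
    """Re-rank (url, base_score) pairs, boosting previously clicked URLs."""
    def score(item):
        url, base = item
        return base * boost if url in clicked_before else base
    return sorted(results, key=score, reverse=True)

results = [("http://a.example/seo", 0.80),
           ("http://b.example/seo", 0.70),
           ("http://c.example/seo", 0.60)]
history = {"http://c.example/seo"}  # the user clicked this result before

# The remembered result outranks pages with higher base scores.
print(personalize(results, history)[0][0])  # -> http://c.example/seo
```

Even a crude scheme like this shows why accumulated history raises switching costs: a rival engine without your click data cannot reproduce the ordering you have grown used to.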
Some privacy advocates would likely go nuts with this offering. It is all opt in though. I encourage everyone to sign in, search for seo, scroll past the Japanese stuff, and click on my listing.
Presumably some searchers will build up a search history over time. As they do, it could build Google's trust in that user, which in turn could potentially allow Google to use that user's feedback to verify search result relevancy.
I would not be surprised if this globalizes SEO a bit more: paying people in third-world countries to randomly click certain sites. I am already building a search history today as a prospective SEO tool.