Instant Articles may have worked for an instant, but many publishers are likely back where they were before they made the Faustian bargain, except they now have less control over their content distribution and advertising while bearing the higher cost structure of supporting another content format.
When Facebook announced their news feed update to fight off clickbait headlines, it sure sounded a lot like the equivalent of Google's Panda update. Glenn Gabe is one of the sharpest guys in the SEO field; he regularly publishes insightful content & doesn't blindly shill for the various platform monopolies dominating the online publishing industry. He had the same view I did.
Further cementing the "this is Panda" view was an AdAge article quoting some Facebook-reliant publishers: "Glad we have already shifted our ways." "Nice to see them moving in the same direction we are." Etc. ... It felt like reading a Richard Rosenblatt quote in 2011 about Demand Media's strong working relationship with Google, or how right after Panda their aggregate traffic level was flat.
Richard Rosenblatt: It’s not directed at us in any way.
P K: they wrote this post, which talks about content farms, and even though you say they weren’t talking about you, it left a lot of people scratching their heads.
R R: Let’s just say that we know what they’re trying to do. ... He’s talking about duplicate, non-original content. Every single piece of ours is original. ... our relationship is synergistic, and it’s a great partnership.
Kara Swisher: What were you trying to communicate in the call, especially since investors seemed very focused on Panda?
R R: What I also wanted to show was that third-party data sources should not be relied on. We did get affected, for sure. But I was not just being optimistic, we wanted to use that to really understand what we can do better.
K S: Given Google’s shift in its algorithm, are you shifting your distribution, such as toward social and mobile?
R R: If you look at where trends are going, that’s where we are going to be.
K S: How are you changing the continued perception that Demand is a content farm?
R R: I don’t think anyone has defined what a content farm is and I am not sure what it means either. We obviously don’t think we are a content farm and I am not sure we can counter every impact if some people think we are.
Since the Google Panda update eHow has removed millions of articles from their site. As a company they remain unprofitable a half-decade later & keep seeing YoY media ad revenue declines in the 30% to 40% range.
Over-reliance on any platform allows that platform to kill you. And, in most cases, you are unlikely to be able to restore your former status until & unless you build influence via other traffic channels:
I think in general, media companies have lost sight of building relationships with their end users that will bring them in directly, as opposed to just posting links on social networks and hoping people will click. I think publishers that do that are shooting themselves in the foot. Media companies in general are way too focused on being where our readers are, as opposed to being so necessary to our readers that they will seek us out. - Jessica Lessin, founder of TheInformation
But is that a real long-term solution to turn the corner? Even if they see a short term pop in ad revenues by using some dumbed-down AI-enhanced low cost content, all that really does is teach people that they are a source of noise while increasing the number of web users who install ad blockers.
And the whole time penalized publishers try to recover the old position of glory, the platform monopolies are boosting their AI skills in the background while they eat the playing field.
“This isn’t motivated by inventory; it’s not an opportunity for Facebook from that perspective,” Mr. Bosworth said. “We’re doing it more for the principle of the thing. We want to help lead the discussion on this.” ... Mr. Bosworth said Facebook hasn't paid any ad-blocking software company to have its ads pass through their filters and that it doesn’t intend to.
What's more, even large companies operating at scale in big-money fields are struggling to monetize mobile users. On the most recent quarterly conference call TripAdvisor executives stated they monetize mobile users at about 30% of the rate they monetize desktop or tablet users.
What happens when the big brand advertisers stop believing in the narrative of the value of precise user tracking?
P&G two years ago tried targeting ads for its Febreze air freshener at pet owners and households with large families. The brand found that sales stagnated during the effort, but rose when the campaign on Facebook and elsewhere was expanded last March to include anyone over 18.
P&G’s push to find broader reach with its advertising is also evident in the company’s recent increases in television spending. Toward the end of last year P&G began moving more money back into television, according to people familiar with the matter.
For mobile to work well you need to be a destination & a habit. But there is tiny screen space and navigational searches are also re-routed through Google hosted content (which will, of course, get monetized).
In fact, what would happen to an advertiser if they partnered with other advertisers to prevent brand bidding? Why that advertiser would get sued by the FTC for limiting user choice:
The bidding agreements harm consumers, according to the complaint, by restraining competition for, and distorting the prices of, advertising in relevant online auctions, by reducing the number of relevant, useful, truthful and non-misleading advertisements, by restraining competition among online sellers of contact lenses, and in some cases, by resulting in consumers paying higher retail prices for contact lenses.
AMP content presented in both sections will be "de-duplicated" in order to avoid redundancies, Google says. The move is significant in that AMP results will now take up an entire phone screen, based on the example Google shows in its pitch deck.
Are many publishers in a rush to support Google AMP after the bait-n-switch on Facebook Instant Articles?
At the abstract level, if many people believe in something then it will grow.
The opposite is also true.
And in a limitless, virtual world, you cannot see what is not there.
The Yahoo Directory
Before I got into search, the Yahoo! Directory was so important to the field of search that there were entire sessions at SES conferences on how to get listed & people would even recommend using #1AAA-widgets.com styled domains to alphaspam listings to the top of the category.
The alphaspam technique was a carryover from yellow page directories - many of which have gone through bankruptcy as attention & advertising shifted to the web.
Go to visit the Yahoo! Directory today and you get either a server error, a security certificate warning, or a redirect to aabacosmallbusiness.com.
Before the Yahoo! Directory disappeared their quality standards were vastly diminished. As a webmaster who likes to test things, I tried submitting sites of various size and quality to different places. Some sites which would get rejected by some $10 directories were approved in the Yahoo! Directory.
The Yahoo! Directory also had a somewhat weird setting where if you canceled a directory listing in the middle of the term they would often keep it listed for many years to come - for free. After many SEOs became fearful of links the directory saw vastly reduced rates of submissions & many existing listings canceled their subscriptions, thus leaving it as a service without much of a business model.
At one point Google's webmaster guidelines recommended submitting to DMOZ and the Yahoo! Directory, but that recommendation led to many lesser directories sprouting up & every few years Google would play a whack-a-mole game and strip PageRank or stop indexing many of them.
Many have presumed DMOZ was on its last legs many times over the past decade. But on their 18th birthday they did a spiffy new redesign.
Different sections of the site use different color coding and the design looks rather fresh and inviting.
However improved the design is, it is unlikely to reverse this ranking trend.
Why did those rankings decline though? Was it because the sites suck? Or was it because the criteria to rank changed? If the sites were good for many years it is hard to believe the quality of the sites all declined drastically in parallel.
What happened is that as engagement metrics started getting folded in, sites that only point you to other sites became an unneeded step in the conversion funnel, in much the same way that Google scrubbed affiliates from the AdWords ecosystem as unneeded duplication.
What is wrong with the user experience of a general web directory? There isn't any single factor, but a combination of them...
the breadth of general directories means their depth must necessarily be limited.
general directory category pages ranking in search results is like search results in search results. it isn't great from the user's perspective.
if a user already knows a category well they would likely prefer to visit a destination site rather than a category page.
if a user doesn't already know a category, then they would prefer to use an information source which prioritizes listing the best results first. the layout for most general web directories is a list of results in alphabetical order rather than one displaying the best result first.
in order to sound authoritative many directories prefer to use a neutral tone
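The ordering problem above can be made concrete with a small sketch. This is illustrative only: the listing names and ratings below are hypothetical, standing in for an alphaspam-style name versus a genuinely strong site.

```python
# A minimal sketch of the ordering problem: alphabetical listings surface the
# alphaspam-style name first, while ranking by some quality signal (here a
# hypothetical user rating) surfaces the best site instead.
listings = [
    ("AAA Widget Emporium", 2.1),   # alphaspam-style name, weak site
    ("Widget World", 4.8),          # strong site, alphabetically last
    ("Mid Widgets", 3.5),
]

alphabetical = sorted(listings, key=lambda item: item[0].lower())
by_quality = sorted(listings, key=lambda item: item[1], reverse=True)

print(alphabetical[0][0])  # AAA Widget Emporium
print(by_quality[0][0])    # Widget World
```

A general directory locked into alphabetical order is effectively running the first sort, which is exactly what rewards the `#1AAA-widgets.com` naming trick.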
If a directory mostly links to lower quality sites Google can choose to either not index it or not trust links from it. And even if a directory generally links to trustworthy sites, Google doesn't need to rank it to extract most of the value from it.
The trend of lower traffic to the top tier general directory sites has happened across the board.
Many years ago Google's remote rater guidelines cited Joeant as a trustworthy directory.
Their traffic chart looks like this.
And the same sort of trend is true for BOTW, Business.com, GoGuides.org, etc.
There is basically nothing a general web directory can do to rank well in Google on a sustainable basis, at least not in the English language.
Even if you list every school in the city of Winnipeg that page can't rank if it isn't indexed & even if it is indexed it won't rank well if your site has a Panda-related ranking issue. There are a couple other issues with such a comprehensive page:
each additional listing is more editorial content cost in terms of building the page AND maintaining the page
the bigger the page gets the more a user needs something other than alphabetical order as a sort option
the more listings there are in a tight category the greater the likelihood of excessive keyword repetition on the page, which could get the page flagged for algorithmic demotion even if the publisher has no intent to spam. Simply listing things by their name will mean repeating a word like "school" over 100 times on the above linked Winnipeg schools page. If you don't consciously attempt to lower the count, a page like that could have the term repeated over 300 times.
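How the repetition piles up is easy to see with a rough word-count sketch. The listing names below are hypothetical; the point is that the core category word appears once per listing name before any body copy is even written, so a comprehensive page inherits hundreds of repetitions by construction.

```python
from collections import Counter
import re

# Hypothetical listing names for a city-wide "schools" directory page.
listings = [
    "Churchill High School",
    "Kelvin High School",
    "Grant Park School",
    "St. John's High School",
]

def term_counts(names):
    """Count how often each lowercase word appears across all listing names."""
    words = re.findall(r"[a-z']+", " ".join(names).lower())
    return Counter(words)

counts = term_counts(listings)
print(counts["school"])  # 4 -- one occurrence per listing, before any descriptions
```

Scale that to every school in a city and the count for "school" climbs into the hundreds without any intent to spam.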
What's more, many people who use automated link clean up tools take the declining traffic charts & low rankings of the sites as proof that the links lack value or quality.
That means anyone who gets hit by a penalty & ends up in warning messages not only ends up with less traffic while penalized, but they also get extra busy work to do while trying to fix whatever the core problem is.
And in many cases fixing the core problem is simply unfeasible without a business model change.
When general web directories are defunded it not only causes many of them to go away, but it also means other related sites and services disappear:
Editors of those web directories who were paid to list quality sites for free.
Web directory review sites.
SEOs, internet marketers & other businesses which listed in those directories.
Now perhaps general web directories no longer really add much value to the web & they are largely unneeded.
But there are other things which are disappearing in parallel which were certainly differentiated & valuable, though perhaps not profitable enough to maintain the "relevancy" footprint to compete in a brand-first search ecosystem.
Depth vs Breadth
Unless you are the default search engine (Google) or the default social network everyone is on (Facebook), you can't be all things to all people.
If you want to be differentiated in a way that turns you into a destination you can't compete on a similar feature set because it is unlikely you will be able to pay as much for traffic-driven partnerships as the biggest players can.
Can niche directories or vertical directories still rank well? Sure, why not.
Sites like Yelp & TripAdvisor have succeeded in part by adding interactive elements which turned them into sought after destinations.
Part of how they became destinations was intentionally going out of their way to *NOT* be neutral platforms. Consider how many times Yelp has been sued by businesses which claimed the sales team did or was going to manipulate the displayed reviews if the business did not buy ads. Users tend to trust those platforms precisely because other users may leave negative reviews & that (usually) offers something better than a neutral, objective editorial tone.
The conversation started when revenues were down, and I had to carry payroll for a month or two out of my personal account, which I had not had to do since shortly after we started this whole project. We tweaked some things (added an ad or two which we had stripped back for the redesign, reminded people about ad-blockers and their impact on our ability to turn a profit, etc.) and revenue went back up a bit, but for a hot minute, you’ll remember I was like: “Theoretically, if this industry went further into the ground which it most assuredly will, would we want to keep running the site as a vanity project? Probably not! We would just stop doing it.”
In the current market Google can conduct a public relations campaign on a topic like payday loans, have their PR go viral & then if you mention "oh yeah, so Google is funding the creation of doorway pages to promote payday loans" it goes absolutely nowhere, even if you do it DURING THE NEWS CYCLE.
While the real (and important) news stories go nowhere & the PR distortions spread virally, the individual blogger ends up feeling a bit soulless if they try to make ends meet:
"The American Mama reached tens of thousands of readers monthly, and under that name I worked with hundreds of big name brands on sponsored campaigns. I am a member of virtually every ‘blog network’ and agency that “connects brands with bloggers”. ... What’s the point of having your own space to write if you’re being paid to sound like you work for a corporation? ... PR Friendly says “For the right price, I will be anyone you want me to be.” ... I’m not saying blogging is dying, but this specific little monster branch of it, sponsored content disguised as horribly written “day in the life” stories about your kids and pets? It can’t possibly last. Do you really want to be stuck on the inside when it crumbles?"
If you can't get your own site to grow enough to matter then maybe it makes sense to contribute to someone else's to get your name out there.
I recently received this unsolicited email:
"Hello! This is Theodore, a writer and chief editor at SomeSiteName.Com I noticed that you are accepting paid reviews online and you will be glad to know that now you can also publish your Sponsored content to SomeSite via me. SomeSite.Com is a leading website which deals in Technology, Social Media, Internet Stuff and Marketing. It was also tagged as Top 10 _____ websites of 2016 by [a popular magazine]. Website Stats- Alexa Rank: [below 700] Google PageRank: 6/10 Monthly Pageviews: 5+ Million Domain Authority: 85+ Price : $500 via PayPal (Once off Payment) Let me know if you are interested and want to feature your website product like nothing! This will not only increase your traffic but increase in overall SEO Score as well. Thanks"
That person was not actually a member of that site's team, but they had found a way to get their content published on it.
"There's a space in the world for art, but that's different from trying to build products at scale. The one thing that does make me a little nervous is a lot of my designer friends are still focused building websites and I'm not sure that's a growth business anymore. If you look at people who are doing interesting work, they tend to be building inside these platforms like Facebook and finding ways to do interesting work in there. For instance, journalists. Instant Articles is a really great way for stories to be told."
And if you starve during the 7 lean years in between when your business model is once again well aligned with Facebook you can't go back in time to give yourself a meal to un-starve.
Ehow.com has removed *MILLIONS* of pages of content since getting hit by Panda. And yet their ranking chart looks like this
What is crazy is the above chart actually understates the actual declines, because the shift of search to mobile & increasing prevalence of ads in the search results means estimates of organic search traffic may be overstated significantly compared to a few years prior.
A half-decade ago a bootstrapped eHow competitor named ArticlesBase got some buzz in TechCrunch because they were making about $500,000 a month on about 20 million monthly unique visitors. That business was recently listed on Flippa. They are getting about a half-million unique monthly visitors (off 95%) and about $2,000 a month in revenues (off about 99.6%).
The negative karma with that site (in terms of ability to rank) is so bad that the site owner suggested on Flippa to publish any new content from new authors onto different websites: "its not going to get to 0 as most of the traffic is not google today, but we would suggest to push out the fresh daily incoming content to new sites - thats where the growth is."
Now a person could say "eHow deserves to die" and maybe they are right. BUT one could easily counter that point by noting...
the public who owns the shares owns the ongoing losses & many top insiders cashed out long ago
Google was getting a VIG on eHow on their ride up & is still collecting one on the way down (along with funding other current parallel projects from the very same people with the very same Google ad network)
Demand Media's partner program where they syndicate eHow-like content to newspapers like USA Today keeps growing at 15% to 20% a year (similar process, author, content, business model, etc. ... only a different URL hosting the content)
look at this and you'll see how many publishing networks are still building the same sort of content but are cross-marketing across networks of sites. What's more some of the same names are at the new plays. For example, Demand Media's founder was the chairman of an SEO firm bought by Hearst publishing & his wife is on the about us page of Evolve Media's ModernMom.com
The wrappers around the content & masthead logos change, but by and large the people and strategies don't change anywhere near as quickly.
Web Portals & News Sites
As the mainstream media gets more desperate, they are more willing to partner with the likes of Demand Media to get any revenue they can.
It doesn't look particularly bad, especially if you consider that Yahoo has shut down many of their vertical sites.
A flat search traffic chart also misses declining publisher CPMs & the click traffic mix shifting away from organic toward paid search channels as search traffic shifts to mobile devices & Google relentlessly increases the size of the search ads. Yahoo may still rank #3 for keyword x, but if that #3 ranking is below the fold on both mobile and desktop devices they might need to rank #1 to get as much traffic as #3 got a couple years ago.
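The below-the-fold argument can be sketched with rough numbers. Every figure here is an assumption for illustration (the CTR-by-position values and the below-the-fold discount are not measured data); the point is only that a stable #3 ranking can lose most of its clicks once ads push it off the first screen.

```python
# Illustrative only: assumed organic CTRs by position, and an assumed share
# of clicks a listing keeps once it falls below the fold.
ctr_above_fold = {1: 0.30, 2: 0.15, 3: 0.10}
below_fold_discount = 0.4

def estimated_clicks(searches, position, below_fold=False):
    """Estimate organic clicks for a ranking position under the assumptions above."""
    ctr = ctr_above_fold[position]
    if below_fold:
        ctr *= below_fold_discount
    return round(searches * ctr)

# Same keyword, same #3 ranking, two years apart:
then = estimated_clicks(100_000, 3)                  # above the fold
now = estimated_clicks(100_000, 3, below_fold=True)  # pushed below the fold by ads
print(then, now)  # 10000 4000
```

Under these assumptions, holding #3 is worth less than half of what it was, which is the sense in which a "flat" ranking chart overstates a publisher's health.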
Yahoo! was once the leading search portal & now they are worth about 1/5th of LinkedIn (after backing out their equity stakes in Alibaba and Yahoo! Japan).
The chart is roughly flat, but the company is up for a fire sale because organic search result displacement & the value of traffic has declined quicker than Yahoo! can fire employees & none of their Hail Mary passes worked.
Ms. Mayer compared the [Polyvore] deal to Google’s acquisition of YouTube in 2006, arguing that “you can never overpay” for a company with the potential to land a huge new base of users.
“Her core mistake was this belief that she could reinvent Yahoo,” says a former senior executive who left the company last year. “There was an element of her being a true believer when everyone else had stopped.”
The Web will only expand into more aspects of our lives. It will continue to change every industry, every company, and every life on the planet. The Web we build today will be the foundation for generations to come. It’s crucial we get this right. Do we want the experiences of the next billion Web users to be defined by open values of transparency and choice, or the siloed and opaque convenience of the walled garden giants dominating today?
And if converting on mobile is hard or inconvenient, many people will shift to the defaults they know & trust, thus choosing to buy on Amazon rather than a smaller ecommerce website. One of my friends who was in ecommerce for many years stated this ultimately ended up becoming the problem with his business. People would email him back and forth about the product, related questions, and basically go all the way through the sales process with getting him to answer every concern & recommend each additional related product needed, then at the end they would ask him to price match Amazon & if he couldn't they would then buy from Amazon. If he had more scale he might have been able to get a better price from suppliers and compete with Amazon on price, but his largest competitor who took out warehouse space also filed for bankruptcy because they were unable to make the interest payments on their loans.
We live in a society which over-values ease-of-use & scale while under-valuing expertise.
Look at how much consolidation there has been in the travel market since Google Flights launched & Google went pay-to-play with hotel search.
Expedia owns Travelocity & Orbitz. Priceline owns Kayak. Yahoo! Travel simply disappeared. TripAdvisor is strong, but even they were once a part of Expedia.
How many markets are strong enough to support the creation of that sort of featured editorial content?
And most companies which can create that sort of in-depth content leverage the higher margins on shallower & cheaper content to pay for that highly differentiated featured content creation.
But if the knowledge graph and new search features are simply displacing the result set the number of people who will be able to afford creating that in-depth featured content is only further diminished.
Over 5 years ago Bing's Stefan Weitz mentioned they wanted to move search from a web of nouns to a web of verbs & to "look at the web as a digital representation of the physical world." Some platforms are more inclusive than Google is & decide to partner rather than displace, but Bing's partnership with Yelp or TripAdvisor doesn't help you if you are a direct competitor of Yelp or TripAdvisor, or if your business was heavily reliant on one of these other channels & you fall out of favor with them.
Chewing Up Real Estate
There are so many enhanced result features in the search results it is hard to even attempt to make an exhaustive list.
As search portals rush to add features they also rush to grab real estate & outright displace the concept of "10 blue links."
There has perhaps been nothing which captured the sentiment better than the following, which is paraphrased but captures the intent to displace the value chain & the role of publishers:
"the journeys of users. their desire to be taken and sort of led and encouraged to proceed, especially on mobile devices (but I wouldn't say only on mobile devices).
there are a lot of users who are happy to be provided with encouragement and leads to more and more interesting information and related, grouped in groups, leading lets say from food to restaurants, from restaurants to particular types of restaurants, from particular types of restaurants to locations of those types of restaurants, ordering, reservations.
I'm kind of hungry, and in a few minutes you've either ordered food or booked a table. Or I'm kind of bored, and in a few minutes you've found a book to read or a film to watch, or some other discovery you are interested in." - Andrey Lipattsev
What role do publishers have in the above process? Unpaid data sources used to train algorithms at Facebook & Google?
Individually each of these assistive search feature roll outs may sound compelling, but ultimately they defund publishing.
People may think I am unnecessarily harsh toward Google in my views, but this sort of shift is not a Google-only thing. It is something all the large online platforms are doing. I simply give Google more coverage because they have a history of setting standards & moving the market, whereas a player like Yahoo! is acting out of desperation to simply try to stay alive. The market capitalization of the companies reflect this.
Google & Facebook control the ecosystem. Everyone else is just following along.
"digital is eating legacy media, mobile is eating digital, and two companies, Facebook and Google, are eating mobile. ... Since 2011, desktop advertising has fallen by about 10 percent, according to Pew. Meanwhile mobile advertising has grown by a factor of 30 ... Facebook and Google, control half of net mobile ad revenue." - Derek Thompson
The same sort of behavior is happening in China, where Google & Facebook are prohibited from competing.
As publishers get displaced and defunded online platforms can literally buy the media: “There’s very little downside. Even if we lose money it won’t be material,” Alibaba's Mr. Tsai said. “But the upside [in buying SCMP] is quite interesting.”
The above quote was on Alibaba buying the newspaper of record in Hong Kong.
As bad as entire industries becoming token purchases may sound, that is the optimistic view. :D
Facebook's Instant Articles and Google's AMP make even a token purchase unnecessary: "I don't think it's any secret that you're going to see a bloodbath in the next 12 months," Vice Media's Shane Smith said, referring to digital media and broadcast TV. "Facebook has bought two-thirds of the media companies out there without spending a dime."
Those services can dictate what gets exposure, how it is monetized, and then adjust the exposure and revenue sharing over time to keep partners desperate & keep them hooked.
“If Thiel and Nick Denton were just a couple of rich guys fighting over a 1st Amendment edge case, it wouldn't be very interesting. But Silicon Valley has unprecedented, monopolistic power over the future of journalism. So much power that its moral philosophy matters.” - Nate Silver
Give them just enough (false) hope to stay partnered.
All the while track user data more granularly & run AI against it to disintermediate & devalue partners.
"for all the original shows Netflix has underwritten, it remains dependent on the very networks that fear its potential to destroy their longtime business model in the way that internet competitors undermined the newspaper and music industries. Now that so many entertainment companies see it as an existential threat, the question is whether Netflix can continue to thrive in the new TV universe that it has brought into being."
“ ‘Breaking Bad’ was 10 times more popular once it started streaming on Netflix.” - Michael Nathanson
"the networks couldn’t walk away from the company either. Many of them needed licensing fees from Netflix to make up for the revenue they were losing as traditional viewership shrank."
Wikipedia is certainly imperfect, but it is also a large part of why other directories have gone away. It is basically a directory tied to an encyclopedia which is free and easy to syndicate.
Every large search & discovery platform has an incentive for Wikipedia to be as expansive as possible.
The bigger Wikipedia gets, the more potential answers and features can be sourced from it. More knowledge graph, more instant answers, more organic result displacement, more time on site, more ad clicks.
Even if a knowledge graph listing is wrong, the error doesn't hurt the search service syndicating the content unless people make a big deal of it. And if that happens, people will give feedback on how to fix the error, which becomes a PR lead into the narrative of how quickly search is improving and evolving.
"Wikipedia used to instruct its authors to check if content could be dis-intermediated by a simple rewrite, as part of the criteria for whether an article should be added to wikipedia. There are many rascals on the Internets; none deserving of respect." - John Andrews
Sergey Brin donates to fund the expansion of Wikipedia. Wikipedia rewrites more webmaster content. Google has more knowledge graph grist and rich answers to further displace publishers.
I recently saw the new gray desktop search results Google is testing. In that layout the knowledge graph appears inline with the regular search results & even on my huge monitor the organic result set is below the fold.
The problem with that is if your brand name is the same brand name that is in the knowledge graph & you are not the dominant interpretation then you are below the fold on all devices for your core brand UNLESS you pay Google for every single click.
How much should a brand like The Book of Life pay Google for being a roadblock? What sort of tax is appropriate & reasonable? How high will you bid in a casino where the house readjusts the shuffle & deal order in the middle of the hand?
I recently did a search on Bing & inside their organic search results they linked to a Mahalo-like page called Bing Knows. I guess this is a feature in China, but it could certainly spread to other markets.
If they partnered with an eBay or Amazon.com and put a "buy now" button in the search results they'd have just about completely closed the loop there.
The reason I started this article with directories is their role is to link to sites. They are categorized collections of links which have been heavily commodified & devalued to the point they are rendered unnecessary and viewed adversely by much of the SEO market (even the ones with decent editorial standards).
Big platform players like Google and Facebook broaden cross-device user tracking to create new relevancy signals and extract most of the value created by publishers. The more information the platform owns, the more of a starving artist its partners become.
"It's the golden age right now," [Thrillist CEO Ben Lerer] said. "If you're a digital publisher, you have every big TV company calling you. When I look at media brands, if a media brand disappeared tomorrow, would I notice?" he said. "And there are a bunch of brands that have scale, and maybe a lot of money raised, and maybe this and that, but, actually, I might not know for a year. There's so many brands like that. Like, what does it really stand for? Why does it exist?"
Disruption is not a strategy, but the whole point of accelerating it & pushing it (without an adequate plan for "what's next") is to re-establish feudal lords.
The web is a virtual land where the commodity which matters most is attention. If you go back in time, lords maintained wealth & control through extracting rents.
A few years ago a quote like the following one may have sounded bizarre or out of place:
These are the people who guard the company’s status as what ranking team head Amit Singhal often sees characterised as “the biggest kingmaker on this Earth.”
But if you view it through some historical context it isn't hard to understand:
"The nobles still had the power to write the law, and in a series of moves that took place in different countries at different times, they taxed the bazaar, broke up the guilds, outlawed local currencies, and bestowed monopoly charters on their favorite merchants. ... It was never really about efficiency anyway; industrialization was about restoring the power of those at the top by minimizing the value and price of human laborers." - Douglas Rushkoff
Google funding LendUp & ranking their doorway pages while hitting the rest of the industry is Google bestowing "monopoly charters on their favorite merchants."
The issue is not that the value of anything drops to zero, but rather that a combined set of factors shrinks the size of the market which can be profitably served. Each of these factors eats at margins...
the rise of ad blockers (funded largely by some big ad networks paying to allow their own ads through while blocking competing ad networks)
rise of programmatic ads (which shift advertiser budget away from publishers to various forms of management)
larger ad sizes: "Based on early testing, some advertisers have reported increases in clickthrough rates of up to 20% compared to current text ads. "
increase of vertical search results in Google & more ads + self-hosted content in Facebook's feed
increased algorithmic barrier to entry and longer delay times to rank
The least sexy consultant pitch in the world: "Sure I can probably rank your website, but it will take a year or two, cost you at least $80,000 per year, and you will still be below the fold even if we get to #1 because the paid search ads fill up the first screen of results."
That isn't going to be an appealing marketing message for a new small business with a limited budget.
“The open web is pretty broken. ... Railroad, electricity, cable, telephone—all followed this similar pattern toward closedness and monopoly, and government regulated or not, it tends to happen because of the power of network effects and the economies of scale” - Ev Williams.
The above article profiling Ev Williams also states: "An April report from the web-analytics company Parse.ly found that Google and Facebook, just two companies, send more than 80 percent of all traffic to news sites."
a big platform over-promotes a vertical to speed up buy-in (perhaps even offering above market rates or other forms of compensation to get the flywheel started)
other sources join the market without that compensation & then the compensation stream gets yanked
displacement of the source by a watered down copy (eHow or Wikipedia styled rewrite), or some zero-cost licensing arrangement (Facebook Instant Articles, Google AMP, syndicating Wikipedia rewrites)
strategic defunding of the content source
promise of future gains causing desperate publishers to lean harder into Google or Facebook even as they squeeze more water out of the rock.
Hey, sure your traffic is declining & your revenue is declining faster. You are getting squeezed out, but if you trust the primary players responsible for the shift & rely on Instant Articles or Google's AMP this time will be different.
The Internet commoditized the distribution of facts. The "news" media responded by pivoting wholesale into opinions and entertainment. — Naval Ravikant (@naval) May 26, 2016
So now we get story pitches where the author tries to collect a few quote sources to match the narrative already in their head. Surely this has gone on for a long time, but it has rarely been so transparently obvious and cringeworthy as it is today.
Can we talk about how strange it is for a group of Silicon Valley startup mentors to embrace secret proxy litigation as a business tactic? To suddenly get sanctimonious about what is published on the internet and called News? To shame another internet company for not following ‘the norms’ of a legacy industry? The hypocrisy is mind bending.
The desperation is so bad news sites don't even attempt to hide it. And part of what is driving that is bot-driven content further eroding margins on legitimate publishing. Google not only ranks those advertorials, but they also promote some of the auto-generated articles which read like:
As many as 1 analysts, the annual sales target for company name, Inc. (NYSE:ticker) stands at $45.13 and the median is $45.13 for the period closed 3.
The bearish target on sales is $45.13 and the bullish estimate is $45.13, yielding a standard deviation of 1.276%.
Not more than 1 investment entities have updated sales projections on upside over the last week while 1 have downgraded their previously provided sales targets. The estimates highlight a net change of 0% over the last 1 weeks period.
Sales estimated amount is a foremost parameter in judging a firm’s performance. Nearly 1 analysts have revised sales number on the upside in last one month and 1 have lowered their targets. It demonstrates a net cumulative change of 0% in targets against sales forecasts which were given a month ago.
In latest quarterly period, 1 have revised targeted sales on upside and 1 have decreased their projections. It demonstrates change of 4.898%.
I changed a few words in each sentence of that quote to make it harder to find the source as I wasn't trying to out them specifically. But the auto-generated content was ranked by Google & monetized via inline Google AdSense ads promoting the best marijuana stocks to invest in and warning of a pending 80% stock market crash coming soon this year.
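The mechanics behind that kind of copy are trivial: a boilerplate template with numeric slots filled from a data feed, one "article" per row. A minimal sketch of the idea (hypothetical field names and synthetic figures, not any real feed or publisher's actual system):

```python
# Minimal sketch of template-driven "financial news" generation.
# Field names and numbers are hypothetical, invented for illustration.

TEMPLATE = (
    "As many as {analysts} analysts cover {company} (NYSE:{ticker}). "
    "The annual sales target stands at ${median:.2f}, the bearish target "
    "is ${low:.2f}, and the bullish estimate is ${high:.2f}."
)

def generate_article(row):
    """Fill the boilerplate template from one row of feed data."""
    return TEMPLATE.format(**row)

feed = [
    {"analysts": 1, "company": "Acme, Inc.", "ticker": "ACME",
     "median": 45.13, "low": 45.13, "high": 45.13},
]

for row in feed:
    print(generate_article(row))
```

One script like this can emit thousands of such pages a day at essentially zero marginal cost, which is precisely the margin pressure on legitimate publishing described above.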
"There's all these really new, fun features we're going to be able to do with artificial intelligence and content to make videos faster," Ferro told interviewer Andrew Ross Sorkin. "Right now, we're doing a couple hundred videos a day; we think we should be doing 2,000 videos a day."
All is well, news & information are just externalities to a search engine ad network.
No big deal.
"With newspapers dying, I worry about the future of the republic. We don’t know yet what’s going to replace them, but we do already know it’s going to be bad." - Charlie Munger
Build a Brand
Build a brand, that way you are protected from the rapacious tech platforms.
Or so the thinking goes.
But that leads back to the above image where The Book of Life is below the fold on their own branded search query because there is another interpretation Google feels is more dominant.
The big problem with "brand as solution" is you not only have to pay to build a brand, but then you have to pay to protect it.
And the number of search "innovations" to try to siphon off some late funnel branded traffic and move it back up the funnel to competitors (to force the brand to pay again for their own brand to try to displace the "innovations") will only continue growing.
And at any point in time, if Disney makes a movie using your brand name as the name of the movie, you are irrelevant and in need of a rebrand overnight, unless you commit to paying Google for your brand forever.
Further, most large US offline retailers are doing horribly.
Almost all the offline growth is in stores selling dirt cheap unbranded imported stuff like Dollar General or Family Dollar & stores like Ross and TJ Maxx which sell branded item remainders at discount prices. And as Amazon gets more efficient by the day, other competitors with high cost structures & less efficient operations grow relatively less efficient over time.
The Wall Street Journal recently published an article about a rift between Wal-Mart & Procter & Gamble: “They sell crappy private label, so you buy Swiffer with a crappy refill,” said one of the people familiar with the product changes. “And then you don’t buy again.”
In trying to drive sales growth, P&G is resorting to some Yahoo!-like desperate measures, including meetings where "Some workers donned gladiator-like armor for the occasion."
Riding on other platforms or partners carries the same sorts of risks as trusting Google or Facebook too much.
The big difference between the web and offline platforms is that the marginal cost of information is zero, so web platforms can quickly & cheaply spread to adjacent markets in ways that physically constrained offline players cannot. Some of the big web platforms also have far more data on people than governments do. It is worth noting one of the things that came out of the Snowden leaks is that spooks were leveraging Google's DoubleClick cookies for tracking users.
As desperate stores/platforms see slowing growth they squeeze for margins and seek to accelerate growth any way possible. Chasing growth ultimately erodes the very promise of what differentiates them. I recently bought some "hand crafted" soaps on Etsy, which shipped from Shenzhen.
I am not sure how that impacts other artisanal soap sellers, but it makes me less likely to buy that sort of product from Etsy again.
And for as much as I like shopping on Amazon, I was uninspired when a seller recently sent me this.
America’s economy today is in some respects more concentrated than it was during the Gilded Age, whose excesses prompted the Progressive Era reforms the FTC exemplifies. In sector after sector, from semiconductors and cable providers to eyeglass manufacturers and hotels, a handful of companies dominate. These giants use their market power to hike prices for consumers and suppress wages for workers, worsening inequality. Consolidation also appears to be driving a dramatic decline in entrepreneurship, closing off opportunity and suppressing growth. Concentration of economic power, in turn, tends to concentrate political power, which incumbents use to sway policies in their favor, further entrenching their dominance.
And the local abusive tech monopolies are now firmly promoting the TPP: "make it more difficult for TPP countries to block Internet sites" = countries should have less influence over the web than individual Facebook or Google engineers do.
I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”
Luckily the world can depend on China to drive growth and it will save us.
Beijing’s intellectual property regulator has ordered Apple Inc. to stop sales of the iPhone 6 and iPhone 6 Plus in the city, ruling that the design is too similar to a Chinese phone, in another setback for the company in a key overseas market.
“The problem is the fake products today are of better quality and better price than the real names. They are exactly the same factories, exactly the same raw materials but they do not use the names.” - Alibaba's Jack Ma
The more companies who do them & the more places they are seen, the lower the rates go, the less novel they will seem, and the greater the likelihood a high-spending advertiser decides to publish it on their own site & then drive the audience directly to their site.
When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.
Further, as it gets more pervasive it will lead to questions of editorial integrity.
Get Into Affiliate Marketing
It won't scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they'll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate's cookie lasts for a shorter duration.
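The cookie-duration point follows from last-click attribution: the most recent affiliate click a shopper records overwrites any earlier affiliate's cookie, so each additional affiliate in a vertical shortens every other affiliate's effective window. A toy simulation of that dynamic (hypothetical names and a nominal 30-day lifetime, not any specific network's logic):

```python
# Toy last-click affiliate attribution: the newest click replaces
# any earlier affiliate cookie for the same shopper.

COOKIE_DAYS = 30  # nominal cookie lifetime, varies by network

def record_click(cookies, shopper, affiliate, day):
    """Last click wins: overwrite whatever cookie was set before."""
    cookies[shopper] = {"affiliate": affiliate, "expires": day + COOKIE_DAYS}

def credited_affiliate(cookies, shopper, purchase_day):
    """Return the affiliate credited for a purchase, if any cookie is live."""
    cookie = cookies.get(shopper)
    if cookie and purchase_day <= cookie["expires"]:
        return cookie["affiliate"]
    return None

cookies = {}
record_click(cookies, "alice", "site_a", day=0)   # site A refers Alice first
record_click(cookies, "alice", "site_b", day=10)  # site B's later click wins
print(credited_affiliate(cookies, "alice", purchase_day=12))  # site_b
```

Site A did the persuading but gets nothing once site B's click lands, which is why a crowded affiliate vertical means a smaller slice for everyone already in it.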
It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.
“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”
Charging People to Comment
It won't work, as it undermines the social proof of value the site would otherwise have from having many comments on it.
He wondered how Google could become like a better version of the RIAA - not just a mediator of digital music licensing - but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.
If we just give Google or Facebook greater control, they will save us.
Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.
Over time media sites are becoming more reliant on platforms for distribution, with visitors having fleeting interest: "bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today."
Accelerated Mobile Pages and Instant Articles?
These are not solutions. They are only a further acceleration of the problem.
How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?
If someone else hosts your content & you are dependent on them for distribution, you are competing against yourself with an entity that can arbitrarily shift the terms on you whenever they feel like it.
“The cracks are beginning to show, the dependence on platforms has meant they are losing their core identity,” said Rafat Ali. “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”
Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google's redesigned image search shunted traffic away from the photographers. Google's remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished & if you don't, good luck negotiating with a monopoly. You'll probably need the EU to see any remedy there.
"One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate." - Ben Thompson
Long after the benefit stops passing to the creative person the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places their own AI driven news rewrites in front of users?
Who then will fund journalism?
Dumb it Down
Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media's bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero sum to a negative sum market, particularly as Google keeps growing their knowledge scraper graph.
Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.
Yahoo’s billion-person-a-month home page is run by an algorithm, with a spare editorial staff, that pulls in the best-performing content from across the site. Yahoo engineers generally believed that these big names should have been able to support themselves, garner their own large audiences, and shouldn’t have relied on placement on the home page to achieve large audiences. As a result, they were expected to sink or swim on their own.
“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”
The above also has an incredibly damaging knock on effect on society.
People miss the key news: "what articles got the most views, and thus 'clicks.' Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite." - Karl Denninger
“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” ... “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”
That is basically the government complaining to the press about it being "too easy" to manipulate the press.
After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. ... A topic was often blacklisted if it didn’t have at least three traditional news sources covering it
As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.
The strategy to keep sacrificing the long term to hit the short term numbers can seem popular. And then, suddenly, death.
You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
- Neil Young, Stringman
If you have something unique and don't market it aggressively then nobody will know about it. And, in fact, in some businesses your paying customers may have no interest in sharing your content because they view it as one of their competitive advantages. This was one of the big reasons I ultimately had to shut down our membership site.
If you do market something well enough to create demand then some other free sites will make free derivatives, and it is hard to keep having new things to write worth paying for in many markets. Eventually you exhaust the market or get burned out or stop resonating with it. Even free websites have churn. Paid websites have to bring in new members to offset old members leaving.
In most markets worth being in there are going to be plenty of free sites in the vertical which dominate the broader conversation. Thus you likely need to publish a significant amount of information for free which leads into an eventual sale. But knowing where to put the free line & how to move it over time isn't easy. Over the past year or two I blogged far less than I should have if I was going to keep running our site as a paid membership site.
And the last big issue is that a paywall is basically counter to all the other business models above that the mainstream media is trying. You need deeper content, better content, content that is not off topic, etc. Many of the easy wins for ad funded media become easy losses for paid membership sites. And just like it is hard for newspapers to wean themselves off of print ad revenues, it can be hard to undo many of the quick win ad revenue boosters if one wants to change their business model drastically. Regaining your soul takes time, and often, death.
“It's only after we've lost everything that we're free to do anything.” ― Chuck Palahniuk, Fight Club
Google recently announced app streaming, where they can showcase & deep link into apps in the search results even if users do not have those apps installed. How it works is rather than users installing the app, Google has the app installed on a computer in their cloud & then shows users a video of the app. Click targets, ads, etc. remain the same.
Imagine if, in order to use the web, you had to download an app for each website you wanted to visit. To find news from the New York Times, you had to install an app that let you access the site through your web browser. To purchase from Amazon, you first needed to install an Amazon app for your browser. To share on Facebook, installation of the Facebook app for your browser would be required. That would be a nightmare.
The web put an end to this. More specifically, the web browser did. The web browser became a universal app that let anyone open anything on the web.
To meaningfully participate on those sorts of sites you still need an account. You are not going to be able to buy on Amazon without registration. Any popular social network which allows third party IDs to take the place of first party IDs will quickly become a den of spam until they close that loophole.
In short, you still have to register with sites to get real value out of them if you are doing much beyond reading an article. Without registration it is hard for them to personalize your experience & recommend relevant content.
Desktop Friendly Design
App indexing & deep linking of apps is a step in the opposite direction of the open web. It is supporting proprietary non-web channels which don't link out. Further, if you thought keyword (not provided) heavily obfuscated user data, how much will data be obfuscated if the user isn't even using your site or app, but rather is interacting via a Google cloud computer?
Who visited your app? Not sure. It was a Google cloud computer.
Where were they located? Not sure. It was a Google cloud computer.
Did they have problems using your app? Not sure. It was a Google cloud computer.
What did they look at? Can you retarget them? Not sure. It was a Google cloud computer.
Is an app maker too lazy to create a web equivalent version of their content? If so, let them be at a strategic disadvantage to everyone who put in the extra effort to publish their content online.
If Google has their remote quality raters consider a site as not meeting users needs because they don't publish a "mobile friendly" version of their site, how can one consider a publisher who creates "app only" content as an entity which is trying hard to meet end user needs?
The low price points for consumer apps in app stores make it hard for businesses to justify selling B2B apps for a high enough price to offset the smaller addressable audience.
It has become harder to sell consumer apps as the app stores have saturated with competition.
2008 I'll sell apps for $2.99 & make millions
2010 At $0.99 I'll make $1000s
2012 Ads might cover my rent
2014 Kickstart my app
2015 Hire me — Nick Lockwood (@nicklockwood) August 3, 2015
Exceptionally popular apps are disabled for interfering with business models of the platforms. Apps and extensions can be disabled at any time, even after the fact, due to violating guidelines or rule changes that turn what was once fine into a guideline violation. In some cases when they are disabled it is done with no option to re-enable.
We’re rapidly moving from an internet where computers are ‘peers’ (equals) to one where there are consumers and ‘data owners’, silos of end user data that work as hard as they can to stop you from communicating with other, similar silos.
If the current trend persists we’re heading straight for AOL 2.0, only now with a slick user interface, a couple more features and more users.
Katz of Gogobot says that “SEO is a dying field” as Google uses its “monopoly” power to turn the field of search into Google’s own walled garden like AOL did in the age of dial-up modems.
Almost 4 years ago a Google engineer described SEO as a bug. He suggested one shouldn't be able to rank highly without paying.
It looks like he was right. Google's aggressive ad placement on mobile SERPs "has broken the will of users who would have clicked on an organic link if they could find one at the top of the page but are instead just clicking ads because they don’t want to scroll down."
In the years since then we've learned Google's "algorithm" has concurrent ranking signals & other forms of home cooking which guarantees success for Google's vertical search offerings. The "reasonable" barrier to entry which applies to third parties does not apply to any new Google offerings.
And "bugs" keep appearing in those "algorithms," which deliver a steady stream of harm to competing businesses.
From Indy to Brand
The waves of algorithm updates have in effect increased the barrier to entry, along with the cost needed to maintain rankings. The stress and financial impact that puts on small businesses makes many of them not worth running. Look no further than MetaFilter's founder seeing a psychologist, then quitting because he couldn't handle the process.
there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.
If you participate on the web daily, the change washes over you slowly, and the cumulative effects can be imperceptible. But if you were locked in an Iranian jail for years the change is hard to miss.
These sorts of problems not only impact search, but have an impact on all the major tech channels.
iPhone autocorrect inserted "showgirl" for "shows" and "POV" for "PPC". This crowd sourcing of autocorrect is not welcomed. — john andrews (@searchsleuth998) November 10, 2015
Eventually they might even symbolically close their websites, finishing the job they started when they all stopped paying attention to what their front pages looked like. Then, they will do a whole lot of what they already do, according to the demands of their new venues. They will report news and tell stories and post garbage and make mistakes. They will be given new metrics that are both more shallow and more urgent than ever before; they will adapt to them, all the while avoiding, as is tradition, honest discussions about the relationship between success and quality and self-respect.
If in five years I’m just watching NFL-endorsed ESPN clips through a syndication deal with a messaging app, and Vice is just an age-skewed Viacom with better audience data, and I’m looking up the same trivia on Genius instead of Wikipedia, and “publications” are just content agencies that solve temporary optimization issues for much larger platforms, what will have been point of the last twenty years of creating things for the web?
A Deal With the Devil
As ad blocking has grown more pervasive, some publishers believe the solution to the problem is through gaining distribution through the channels which are exempt from the impacts of ad blocking. However those channels have no incentive to offer exceptional payouts. They make more by showing fewer ads within featured content from partners (where they must share ad revenues) and showing more ads elsewhere (where they keep all the ad revenues).
The problem is if you don't control the publishing you don't control the monetization and you don't control the data flow.
Your website helps make the knowledge graph (and other forms of vertical search) possible. But you are paid nothing when your content appears in the knowledge graph. And the knowledge graph now has a number of ad units embedded in it.
A decade ago, when Google pushed autolink to automatically insert links in publishers' content, webmasters had enough leverage to "just say no." But now? Not so much. Google considers in-text ad networks spam & embeds their own search in third party apps. As the terms of deals change, and what is considered "best for users" changes, content creators quietly accept, or quit.
The most recent leaked Google rater documents suggested the justification for featured answers was to make mobile search quick, but if that were the extent of it then it still doesn't explain why they also appear on desktop search results. It also doesn't explain why the publisher credit links were originally a light gray.
With Google everything comes down to speed, speed, speed. But then they offer interstitial ad units, lock content behind surveys, and transform the user intent behind queries in a way that leads them astray.
Most of you are too busy monitoring Google's latest algorithm updates, examining web analytics, and building links and content to stay up to date on the design world.
Usually, creative people who excel at design aren't very good at the left-brain thinking required to succeed in the highly-technical search engine optimization industry. Likewise, very few people with the analytical mindset required for search engine optimization would do well in the free-spirited design industry.
Unfortunately, in the real world, you're often expected to do exactly that. And while most people understand that it would be ludicrous to expect their doctor to also troubleshoot their plumbing, they don't seem to understand why they shouldn't expect the person responsible for their SEO to also handle their design needs from time to time.
So you're often forced to design things for your clients from time to time. Or sometimes, you just need to whip up something for yourself instead of trying to find someone who can deliver what you need on Fiverr.
Since you probably won't start sporting a black turtleneck and talking about crop marks, press checks, or CMYK colors anytime soon, it seems silly to shell out thousands of dollars on software you'll only use occasionally, so I've compiled a list of design resources for non-designers.
The resources in this list are every bit as powerful as any of the professional-grade software, but they are free. (Some do offer premium versions with more options.) The only downside is that it might be a little bit tougher to find tutorials for some of these programs compared to the industry standard software like Adobe Photoshop or Illustrator.
We all need to edit and create images from time to time, but if you only do it occasionally, software like Adobe Photoshop and Illustrator works out to be pretty expensive. Fortunately, there are several feature-rich image editing programs available.
Gimp - Anything you can do with Photoshop can be done with Gimp, and it runs on Windows, Mac, and Linux. The learning curve can be steep, but it's worth the time.
Pixlr - If you're used to Photoshop, this program has a very similar interface, and it even opens native .psd files with the original layers intact.
Canva - The drag-and-drop interface of this web-based design program makes graphic design quick and simple, plus it comes with a library of over one million professional stock images.
Inkscape - Easily create illustrations, logos, technical drawings, and vector images with this free alternative to Illustrator.
SVG Editor - If you're obsessed with website speed, you probably love SVGs (scalable vector graphics), and this handy tool from Google makes it easy to create and edit them.
OK, so you're not going to compete with Pixar anytime soon, but 3D capabilities do come in handy for designing mockups of books and DVDs, creating characters, and even complete photorealistic animations.
Online 3d Package - This tool lets you quickly and easily create photorealistic mockups of books, boxes, DVDs, and CDs.
Blender - If you occasionally need to create 3D renderings but can't justify spending big bucks for professional-grade software that you'll only use a few times, Blender is the perfect (and free) alternative.
Designing a website requires a blend of creative and technical skills. Fortunately, there are plenty of tools available to efficiently complete both. From the pretty parts, to the nuts and bolts, to the little details, here is everything you'll need:
Palette generator - Upload an image and this tool will generate the perfect color palette to complement it, which you can download as a CSS file.
Subtle Patterns - Creating seamless backgrounds can be a pain, so instead of starting from scratch, just download from over 400 high-quality seamless background images, including textures and patterns.
Web page editors
Whether you're building a website from scratch with a WYSIWYG editor or fine-tuning the code on an existing website with an HTML editor, web design software will probably get a lot of use in your hands. If you have the technical chops to hand code your websites, that's ideal, but if not, or if you just don't want to, here are several options:
Kompozer - With a WYSIWYG editor in one tab and raw HTML in the other, plus on-the-fly editing with built-in FTP, Kompozer will make creating and editing web pages a breeze.
Google Webdesigner - Build HTML5-based designs and motion graphics that can run on any device without writing any code! (If you want to get your hands dirty, you can edit all HTML and CSS by hand.)
Expression Web - Microsoft offers another free web page editor, which has made significant improvements since that abomination called FrontPage.
Favicon Generator - A truly polished website needs consistent branding throughout, and that means all the little details, including a favicon—that tiny little image that sits in the tab or bookmarks. Just upload an image file, such as your logo, and this handy tool will spit out the .ico files you need.
Web Developer Toolbar - This browser toolbar is available for Firefox and Chrome, and helps you troubleshoot your website and even test it at various screen sizes.
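Once the favicon generator above spits out the .ico file, wiring it up is a one-line addition to the head of each page. A minimal sketch (the path is illustrative):

```html
<!-- Most browsers also look for /favicon.ico in the site root automatically -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```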
Infographics are still an effective method to earn social shares and links, and they are a great way to present a lot of data-rich information, but they can be a pain to create. Here are several tools to simplify the process that might even be better (and easier) than traditional design software.
Infogram - Build beautiful data-driven infographics in just three steps with this free tool.
Piktochart - With a simple point and click editor and over 4,000 graphics, icons, and template files, Piktochart makes it easy to create infographics that look exactly the way you want.
Easel.ly - Loaded with tons of creative templates and an easy-to-use interface, this is another powerful tool to create your own stunning infographics.
Venngage - This drag and drop interface provides all the charts, maps, icons and templates you'll need to design attention-grabbing infographics.
Vizualize.me - Turn your boring resume into a unique visual expression of your skills and experience to stand out from the crowd.
If you are in a saturated market or have a great idea you are certain will be a success, then it may make sense to splash out for a custom-designed graphic, but in a less competitive market some of the above quick-n-easy tools can still be remarkably effective.
Google Charts is a great way to create all sorts of charts, and the best part is that you can create them on the fly by passing variables in the URL.
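As a minimal sketch of that URL-driven approach, using the older Image Charts URL format (the data values and labels here are illustrative):

```html
<!-- cht = chart type (p3 = 3D pie), chs = width x height, chd = data, chl = slice labels -->
<img src="https://chart.googleapis.com/chart?cht=p3&chs=400x200&chd=t:60,40&chl=Mobile|Desktop"
     alt="Example pie chart">
```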
Today you have plenty of options when it comes to font choices, so please stop using Arial, and for the love of all that is good, never use Comic Sans or I will hunt you down. You can choose from thousands of free fonts, so it's easy to pick one that fits your project perfectly.
Typegenius - Choosing the perfect font combo can be tough, but Typegenius makes it easy. Just pick a starter font from the drop-down list and the site will recommend fonts that pair well with it.
Google Fonts - I recommend embedding Google fonts instead of hosting them on your own server because they load more quickly and there is a chance they're already cached on visitors' computers.
Font Awesome - This is an awesome (hence the name) way to add all sorts of scalable icons without a load of extra HTTP requests. Simply load one font for access to 519 icons that can be colored, scaled, and styled with CSS.
DaFont - Download and install these fonts (.ttf or .otf formats) for designing documents or images on your computer.
What the Font - If you've ever experienced the rage-inducing task of figuring out what font was used when your client only has a 72dpi jpg and no idea how to track down their previous designer, then this is the tool for you. Just upload your image and it goes to work figuring out what font it is.
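To make the Google Fonts and Font Awesome suggestions above concrete, here is a minimal sketch (the font family, Font Awesome version, and CDN URL are illustrative):

```html
<!-- Embed a Google font, then reference it in your CSS -->
<link href="https://fonts.googleapis.com/css?family=Open+Sans" rel="stylesheet">
<style>
  body { font-family: 'Open Sans', sans-serif; }
</style>

<!-- Load Font Awesome's stylesheet once, then use any icon anywhere -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/font-awesome/4.2.0/css/font-awesome.min.css">
<i class="fa fa-camera"></i>
<i class="fa fa-camera fa-3x" style="color: #c00;"></i> <!-- scaled & colored with CSS -->
```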
Social media can multiply your website's exposure exponentially, but it takes a lot of work. From branding profiles on each network to crafting engaging visual content your fans will share, you'll have to create a lot of graphics to feed the beast. Doing that manually, the old-fashioned way, is tedious and slow, so I recommend these tools to speed up your workflow.
Easy Cover Maker - Stop wasting time trying to position your cover and profile photo for your Facebook and Google+ page. This tool lets you drag everything into position in one handy interactive window, then download the image files.
Quotes Cover - Just select a quote or enter your own text, apply various effects for your own unique style, and download eye-catching pictures perfect for social media. It even creates the perfect dimensions based on how you intend to use it.
Chisel - This tool has the most user-friendly interface and tons of great images and fonts to create the exact message you want to share.
Recite This - There are plenty of images and fonts available, but the downside is you have to scroll through images one at a time, and fonts are selected randomly.
Jing - From the makers of Camtasia, this free program gives you the ability to capture images or video (up to 5 minutes long) of your computer screen, then share it with the click of a button.
Social Kit - Create cover images, profile pictures, and ad banners for Facebook, Twitter, Google+, and YouTube with this free, up-to-date Photoshop plugin.
Social media image size guide - The folks over at Sprout Social created (and maintain) this handy and comprehensive Google doc listing the image sizes for all major social media networks, and since it's a shared document, you can have Google notify you anytime it's updated!
Instead of wasting time searching for the perfect meme, why not just create your own?
Powerful photos can mean the difference between a dry post that visitors ignore and one that entices them to read more. The good news is you don't have to take your own photos or spend a fortune on stock photos because there are several free and low-cost options available.
Unsplash - These are not your typical cheesy stock photos; they lean more towards the artistic side. New photos are uploaded every day and they're all 100% free.
StockVault - With over 54 thousand free images available, both artistic and corporate-style, you should be able to find the perfect photo for just about any project.
Dreamstime & iStockPhoto - Both of these sites give you the option of a subscription model or pay-as-you-go credits. Many images on one are available on the other, but I've found great images that were only on one of the two sites, so it's worthwhile to check both.
Even the best designers hit a wall, creatively speaking, so it helps to look for inspiration. These sites curate the best designs around and are updated regularly, so you'll find plenty of fresh ideas for your project.
Since your days are filled with keyword research, content development, link building, and other SEO-related tasks, you probably don't have time to stay up-to-date on the latest design trends and techniques. No worries—with these websites, you'll be able to find a tutorial to walk you through just about any design challenge.
CSS-Tricks - Whenever I have a CSS question, I always slap “css tricks” on the end of my search because Chris Coyier has the most detailed, yet easy-to-understand tutorials on damn near every scenario you could imagine.
Tuts+ - Learn everything about graphic design, web design, programming, and more with a growing library of articles and tutorials.
Smashing Magazine - This is probably one of the most comprehensive web design resources you'll find anywhere, going wide and deep on every aspect of web design.
About the Author
Jeremy Knauff is the founder of Spartan Media, a proud father, husband, and US Marine Corps veteran. He has spent over 15 years helping everyone from small businesses to Fortune 500 companies make their mark online, and now he's busy building his own media empire. You can follow Spartan Media on Twitter and Facebook.
If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile with it accounting for about 10% of their online ad revenues.
Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google's mobile ad clicks:
While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume
Those ad networks want other publishers to make their sites mobile friendly for a couple reasons...
If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relatively poor performance, particularly if they haven't spent resources optimizing user experience on the channel.
Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.
In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.
I wouldn't recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.
Those who ignore the warning might be in for significant pain.
Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
Another related issue referenced in the above post was tying in-app content to mobile search personalization:
Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. With the m. approach, on the regular version of the site (say, www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document.
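Following Google's documented pattern for separate mobile URLs, the markup looks something like this (the URLs and breakpoint are illustrative):

```html
<!-- On the desktop page (www.example.com/page/): point to the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page/">

<!-- On the mobile page (m.example.com/page/): point back to the desktop version -->
<link rel="canonical" href="http://www.example.com/page/">
```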
With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.
3 or 4 years ago it was a toss-up as to which of these 2 options would win, but over time it appears the responsive design option is more likely to win out.
Here are a couple reasons responsive is likely to win out as a better solution:
If people share a mobile-friendly URL on Twitter, Facebook or other social networks & the URL changes between devices, then when someone on a desktop computer clicks on the shared m. version of the page (which has fewer ad units & less content), the publisher is providing a worse user experience & losing out on the incremental monetization they would have achieved with the additional ad units.
While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on cross over devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
When Bing gave their best practices for mobile, they stated: "Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all." In that post Bing shows some examples of m. versions of sites ranking in their mobile search results, however for smaller & lesser known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you optimize the m. version of the site to a great degree, that isn't the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidation of ranking signals.
In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which uses the Vary HTTP header to detect the user-agent & use that to drive the layout. For large, complex & high-traffic sites (and sites with numerous staff programmers & designers) dynamic serving is perhaps the best optimization solution because you can optimize the images and code to lower bandwidth costs and response times. Most smaller sites will likely rely on responsive design rather than dynamic serving, in large part because it is quicker & cheaper to implement, and most are not running sites large enough to where the incremental bandwidth provides a significant incremental expense to their business.
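As a sketch, the distinguishing part of a dynamically served response is just an extra header telling caches & crawlers that the HTML differs by user-agent:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent
```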
Solutions for Quickly Implementing Responsive Design
New Theme / Design
If your site hasn't been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.
Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to do at least a few minor changes when you put the designs live to compensate for issues like third party ad units and other minor issues.
If you have an existing Wordpress theme, you might want to see if you can zip it up and send it to them; otherwise they may rebuild your new theme as a child theme of a stock default like Twenty Fifteen. If you are struggling to get them to convert your Wordpress theme directly, then another option would be to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.
For simple hand rolled designs there are a variety of grid generator tools, which can make it reasonably easy to replace some old school table-based designs with divs. Many of the themes for sale in marketplaces like Theme Forest also use a multi-column div grid based system.
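The core idea behind those div grids is small. A sketch of a fluid two-column layout that stacks into a single column on narrow screens (the class name and breakpoint are illustrative):

```css
.col {
  float: left;
  width: 50%;
  box-sizing: border-box;
  padding: 0 1em;
}

/* On small screens, stack the columns instead of floating them side by side */
@media (max-width: 640px) {
  .col {
    float: none;
    width: 100%;
  }
}
```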
Other Things to Look Out For
Third Party Plug-ins & Ad Code Gotchas
Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units to different screen sizes & skip showing some ad units on smaller screens. An AdSense code example is included in an expandable section at the bottom of this page.
Back up your old site before putting the new site live.
For static HTML sites or sites with PHP or SHTML includes & such...
Download a copy of your existing site to local.
Rename that folder to something like sitename.com-OLDVERSION
Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process you can rename the new site design to something like sitename.com-HOSED then rename the sitename.com-OLDVERSION folder to sitename.com to quickly restore the site.
Download your site to local again.
Ensure your new site design is using a different CSS folder or CSS filename so that the old and new versions of the design can be live at the same time while you are editing the site.
Create a test file with the responsive design on your site & test that page until things work well enough.
Once you have the general "what needs to change in each file" down, use find & replace to bulk edit the remaining files and make them responsive.
Use a tool like FileZilla to quickly bulk upload the files.
Look through key pages, and if there are only a few minor errors, fix them and re-upload. If things are majorly screwed up, then revert to the old design and schedule a do-over on the upgrade.
If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
If you want to view your top pages you could export that data from your web analytics to verify all those pages look good. If you wanted to view every page of your site 1 at a time after the change, you could use a tool like Xenu Link Sleuth or Screaming Frog SEO Spider to crawl your site & export a list of URLs into a spreadsheet. Then you could take the URLs from that spreadsheet and put them, a chunk at a time, into a tool like URL Opener.
If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower-stress approach, you could create a test site on another domain name for testing purposes. Just be sure to block crawlers via robots.txt, add a noindex meta tag, or password protect access to the site while testing. When you get things worked out on it, make sure your internal links are referencing the correct domain name, and that you have removed the robots.txt block or password protection.
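A test-site robots.txt that blocks all well-behaved crawlers is only two lines (remember to remove it when the design goes live on the real domain):

```text
User-agent: *
Disallow: /
```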
For a site with a CMS the above process is basically the same, except for how you might need to create a different backup. If you are uploading a Wordpress or Drupal theme, then change the name at least slightly so you can keep the old and new designs live at the same time so you can quickly switch back to the old design if you need to.
If you have a mixed site with Wordpress & static files or such then it might make sense to test changing the static files first, get those to work well & then create a Wordpress theme after that.
Google systems have tested 2,790 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 2,790 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.
When Google introduced the knowledge graph one of their underlying messages behind it was "you can't copyright facts."
Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.
When you search for love quotes, Google pulls one into their site & then provides another "try again" link.
Since quotes mostly come from third parties they are not owned by BrainyQuote and other similar sites. But here is the thing: if those other sites which pay to organize and verify such collections have their economics sufficiently undermined then they go away & then Google isn't able to pull them into the search results either.
The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn't be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.
Google has also done the above quote-like "action item" types of onebox listings in other areas, like software downloads.
Where there are multiple versions of the software available, Google is arbitrarily selecting the download page, even though a software publisher might have a parallel SAAS option or other complex funnels based on a person's location or status as a student or such.
Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here's a blog post from Malwarebytes referencing:
their software being advertised against their brand term in Google via AdWords ads by a third party engaging in trademark infringement & bundling the software with adware
numerous user complaints they received about the bundleware
required legal actions they took to take the bundler offline
The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
The ads themselves said things like “McAfee Support - Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks
Google started the knowledge graph & onebox listings on some utterly banal topics which were easy for a computer to get right, though their ambitions vastly exceed the starting point. The starting point was done where it was because it was low-risk and easy.
When Google's evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that...
Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”
Even as the people who routinely shill for Google parrot the "you can't copyright facts" mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.
Even if You Have Copyright...
What makes the "you can't copyright facts" line so particularly disingenuous was Google's support of piracy when they purchased YouTube:
cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)
To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn't so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: "one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search."
Altruism vs Disruption for the Sake of it
Whenever Google implements a new feature they can choose to not monetize it so as to claim they are benevolent and doing it for users without commercial interest. But that same "unmonetized & for users" claim was also used with their shopping search vertical until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.
There was literally no transition period.
Many of the "informational" knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.
starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.
Google's links to the Mayo Clinic in their knowledge graph are, once again, a light gray font.
In case you didn't find enough background in Google's announcement article, Greg Sterling shared more of Google's views here. A couple notable quotes from Greg...
Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.
Google doesn't need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
If this is done out of benevolence, it will appear *above* the AdWords ads on the search results — unlike almost every type of onebox or knowledge graph result Google offers.
If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those "cynics" to label Greg Sterling as a shill.
Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.
By defunding and displacing something they don't improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.
If their traffic drops and they don't do more with less, then...
their margins will fall
growth slows (or they may even shrink)
their stock price will tank
management will get fired & replaced, and/or they will get taken private by private equity investors, and/or they will need to do some "bet the company" moves to find growth elsewhere (and hope Google doesn't enter that parallel area anytime soon)
Things get monetized directly, monetized indirectly, or they disappear.
Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly-in ad units, auto-play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content-scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.
Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.
There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:
we were REALLY wrong yesterday
we are REALLY wrong today
Any change or disruption is easy to justify so long as you are not the one facing the consequences:
"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun
Monopoly Marketshare in a Flash
Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.
Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).
Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.
Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.
Anytime anyone using a browser other than Chrome has a Flash security update they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.
Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.
In Chrome, Google is the default search engine, as it is in Firefox, Opera, Safari, Android & iOS's web search.
In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.
Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.
Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.
Locking Down The Ecosystem
And Chrome is easily the most locked down browser out there.
While Google relies on bundling their toolbar & browser in updates to Flash and other plugins, they require an opposite strategy for anyone distributing Chrome plugins. Chrome plugins "must have a single purpose that is narrow and easy-to-understand."
Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.
I am frustrated @JohnMu saying that it will not cost CTR. Either Google lied about the increase in CTR with photos, or they're lying now.— Rand Fishkin (@randfish) June 25, 2014
The Right to Be Forgotten
This brings us back to the current snafu with the "right to be forgotten" in Europe.
Some have looked at the EU policy and compared it to state-run censorship in China.
Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."
Sorry About That Incidental Deletion From the Web...
David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:
In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.
Despite Google's great power they do make mistakes. And when they do, people lose their jobs.
People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when MetaFilter asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years & they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.
As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.
MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.
The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.
If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.
Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.
Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed the store hours on its Google listing to show it closed on busy days. That misinformation was embedded directly in the search results. That business is no more.
I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.
There are entire sectors of the offline economy being reshaped by Google policies.
When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."
John Milton, in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but against the printers' guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?
When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.
"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page
I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."
A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & use their increased level of trust to grow their profit margins by leveraging algorithmic journalism.
Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:
We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words – about the earnings of companies in roughly the same time that it took our reporters.
And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:
you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.
To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:
"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.
The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.
Last year Google dictated that press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was a greater emphasis on manual editorial review:
Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:
Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
Overuse of keywords and/or links within the message.
So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.
That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...
We partnered with Media.net to offer you a 10% earning bonus for your first 3 months in their program. When you click this link and sign up today, Media.net will add an extra 10%.
On to the review...
We have reviewed a number of contextual ad networks & Media.net scored as the best network outside of Google AdSense. Many smaller ad networks have a huge fall-off, where if you earned 50 cents or a dollar per click with Google AdSense, you'd see nickel and penny clicks. Thankfully Media.net is nothing like that & they are perhaps the best network at competing with AdSense on a CPM basis. Their interface is quite easy to use, both in terms of creating & customizing new ad units and in tracking performance reports.
The application only takes a couple of minutes. Account approval may take four or five business days to about a week. Once your account is approved, each additional site you submit must also be approved, but your account representative can help with that & getting additional sites approved should take a day or less.
They have high traffic quality standards and manually review all sites to help maintain network quality. They require English as your primary language & that your site receives the majority of its traffic from the United States, Canada, and the United Kingdom. Other publisher requirements are posted online. Their terms of service are published at media.net/legal/tos and their program guidelines are published at media.net/legal/programguidelines.
Media.net pays on a Net-30 basis and has a $100 minimum earning threshold.
You can select Paypal or bank wire transfer as your payment method.
RPM / CPM Rate
The earnings potential for any ad network is driven by:
the depth of the ad network
the relevancy of the ads
how tightly ads can be integrated to fit the theme of the site
the commercial appeal of the publisher's topic
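Since the rest of this review compares networks on yield, a quick refresher on the RPM metric itself may help: RPM is simply revenue per thousand pageviews. A minimal sketch of the arithmetic, with made-up dollar figures:

```javascript
// RPM (revenue per mille) = earnings per 1,000 pageviews.
// The dollar figures below are invented for illustration only.
function rpm(earnings, pageviews) {
  return (earnings / pageviews) * 1000;
}

// A site earning $45 across 30,000 pageviews runs at a $1.50 RPM.
var example = rpm(45, 30000); // 1.5
```

The same formula works at the ad-unit level, which is how the per-unit reports discussed later let you compare placements.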
Ad Network Depth
Since Media.net leverages the Yahoo Bing Network, it has significant ad depth inside the United States. Shortly after its launch in 2012, Media.net CEO Divyank Turakhia stated: "Media.net has contextually optimized over $200 million worth of internet traffic." Six months later their ad network already had over 2.5 billion pageviews.
While the earnings from Media.net are typically not vastly better than AdSense, they may be quite close to par and tend to outperform networks like Chitika, particularly when the published content is tied to a high value topic where pay per click (ppc) prices are significant. The cost per click (cpc) will vary across networks and topics, but in my experience the gap between AdSense and Media.net is far less than the gap between Media.net and networks like Chitika or the in-text ad networks like Infolinks, Kontera & Vibrant Media IntelliTXT. I've even seen some cases where Media.net outperformed AdSense on some topics. You don't have to choose one or the other though, as Media.net ads can be used in conjunction with AdSense ads on the same site.
Publishers who have had experience with the (now defunct) Yahoo! Publisher Network may recall the ads in Yahoo!'s old network were not particularly relevant. Ads in the Yahoo! Publisher Network lacked relevancy in part because Yahoo! placed excessive weight on the CPC which the advertiser was willing to pay. That in turn led to substantially lower ad clickthrough rates (CTR). And when some of the top paying advertisers like Vonage lowered their bids, ultimately that led to drastically lower RPM.
The good news with Media.net is it puts ad relevancy front and center. This leads to a high level of user engagement with the ads, which in turn drives a much better yield for publishers at a better RPM rate. Their ads have a 100% fill rate and use page level precision targeting.
Media.net is primarily a contextual ad network. Select publishers may be invited to sign up for the premium display advertising partnership Media.net has with Google, to complement the contextual ad performance with display ads. By leveraging ad retargeting features, display ads can help put a floor under the earning potential of pages covering topics of limited commercial appeal. Media.net also has mobile-specific ad units.
When a person sets up AdSense ads or other contextual ads on their site, there's a bit of a sense of "you're on your own." Worse yet, there is often a bit of a conflict between the recommendations from the AdSense team and the search quality team at Google.
One of Media.net's big points of differentiation is they have a team of over 450 employees who work on the product and help publishers better integrate the ads into their websites, including making the ad units really match the look and feel of the site. On some higher revenue sites Media.net will help create custom ad units. For instance, on TheStreet.com here's an example of an ad unit.
Even smaller sites will see a significant amount of effort spent on testing & optimizing ad colors & ensuring the ads match the look and feel of the site. Customer service is really one of the areas where Media.net shines.
Media.net offers a variety of ad unit sizes.
most popular sizes: 336x280, 300x250, 728x90, 600x250, 160x600
horizontal sizes: 728x20, 600x120, 468x60
vertical sizes: 120x600, 120x300, 300x600, 160x90
square: 200x200, 250x250
Media.net offers a variety of pre-set ad unit templates to choose from and the ability to customize the colors further.
Usage samples / examples
The colors can be adjusted on a per-unit basis, so you can test having some ad units blend into the design & use higher-contrast colors on other ad units. If your site has enough scale, the Media.net team can also help you split test different colors. Another useful ad integration strategy Media.net allows & recommends is the creation of jQuery sticky ads, which keep ads in view as a person scrolls around a page, helping the ad units stand out.
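The sticky-ad pattern mentioned above can be sketched in a few lines. The selector and the jQuery wiring here are illustrative assumptions, not code Media.net provides: the idea is simply to keep the unit static until the page scrolls past its original offset, then pin it.

```javascript
// Sketch of sticky-ad logic: once the page has scrolled past the ad's
// original top offset, pin the unit so it stays in view.
function stickyPosition(scrollTop, adOffsetTop) {
  return scrollTop > adOffsetTop ? "fixed" : "static";
}

// In a real page this would hang off the scroll event. With jQuery
// (selector "#ad-unit" is hypothetical):
//
// var adTop = $("#ad-unit").offset().top;
// $(window).on("scroll", function () {
//   $("#ad-unit").css("position",
//     stickyPosition($(window).scrollTop(), adTop));
// });
```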
responsive ad units
In addition to the above standard ad unit sizes, Media.net also has options to enable mobile anchor ads & even interstitial ads on mobile devices.
Publisher Interface & Reporting
Media.net has put a lot of thought into usability and detailed reporting. Creating new ad units only takes a minute or two and posting the ad code into your site is just as quick.
Publishers can login to their accounts at the Media.net homepage and view stats 24 hours a day. Currently the dashboard does not offer CPC or click reporting, but it does report impressions, RPM and estimated revenue. Live impression traffic stats appear in real-time on the welcome screen, while earnings stats are typically updated early the next morning. In addition to account-wide reporting, their interface allows you to drill down into reporting on a per-site or per-unit basis.
minimum traffic: none, but they tend to be more likely to approve sites which are already approved in other tier 1 networks and/or obviously have a strong traffic footprint
prohibited topics: illegal drugs, pornography, violence, other illegal activities
The main benefits would be:
Competitive eCPM when compared against AdSense in many categories.
Can be used in conjunction with AdSense.
Has some standard ad unit sizes & some that are custom, which gives you flexibility in terms of integrating them in typical ad spots and in terms of having units which look different than common ones and thus have greater eye appeal than a standard 468x60 or 728x90 banner.
Leverages the Yahoo! Bing Network, which gives it a fairly decent advertiser base & network scale to tap into to ensure there are relevant ads for most topics. I believe one thing that has helped them do so well is that Microsoft has done a much better job on pricing click quality than many ad networks did in years past.
Since they are a smaller company than Google, their partner communications are much clearer. You don't have to pull down millions of dollars a year to be considered a valued partner.
Their customer support team not only communicates clearly with publishers, but also works to help improve ad integration.
Once your account has been established and they see strong traffic quality they are generally quite quick at approving any additional sites you add to your account.
In addition to offering contextual ads, Media.net has a partnership to serve Google display ads on their network (though publishers have to sign up with Google).
While earning statistics are not real-time, they provide them the following day.
Fast Net-30 payouts.
The main drawbacks would be:
They require English as your primary language & that your site receives the majority of its traffic from the United States, Canada, and the United Kingdom. If you operate outside those markets, then they wouldn't be a great fit at the moment (though who knows where they may be in a couple years as Bing gets more aggressive with international expansion of their ad network).
It can take a while to get a new account approved, so it is worth applying early to have some experience with their network and to have a backup in place in case anything should happen to your AdSense account.
Inability to split test units. While you can use a PHP rotation script to compare 2 ad units against each other, there isn't a core split test feature baked into the ad platform by default - though if you are doing enough volume your customer support person will help set up and implement a split test for you.
While they do offer statistics on a per-site, per-day & per-ad unit basis (along with impression stats), they currently do not offer data down to the individual page or keyword level. They provide data on earnings, pageviews & eCPM; but they currently do not provide click or CPC data. (I believe they will be adding more granular metrics fairly soon).
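As a workaround for the missing split-test feature, the rotation approach mentioned above can be sketched client-side. The bucketing scheme and unit names below are my own illustration, not a Media.net API: hashing a visitor id keeps each visitor in the same bucket across pageviews, so earnings can be compared per variant.

```javascript
// Simple deterministic A/B rotation between two ad units.
// Hashing the visitor id (e.g. from a cookie) keeps each visitor in
// the same bucket across pageviews. The unit names are hypothetical.
function pickAdVariant(visitorId, variants) {
  var hash = 0;
  for (var i = 0; i < visitorId.length; i++) {
    hash = (hash * 31 + visitorId.charCodeAt(i)) >>> 0; // unsigned 32-bit
  }
  return variants[hash % variants.length];
}

// Usage sketch:
// var unit = pickAdVariant(cookieId, ["unit-blended", "unit-contrast"]);
// ...then write out the ad code for whichever unit was picked and tag
// the pageview so impressions & earnings can be tallied per variant.
```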