The more companies that do them & the more places they are seen, the lower the rates go, the less novel they seem, and the greater the likelihood a high-spending advertiser decides to publish similar content on their own site & drive the audience there directly.
When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.
Further, as it gets more pervasive it will lead to questions of editorial integrity.
Get Into Affiliate Marketing
It won't scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they'll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate's cookie lasts for a shorter duration.
It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.
“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”
Further, as it gets more pervasive it will lead to questions of editorial integrity.
Charging People to Comment
It won't work, as it undermines the social proof of value the site would otherwise have from having many comments on it.
He wondered how Google could become like a better version of the RIAA - not just a mediator of digital music licensing - but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.
If we just give Google or Facebook greater control, they will save us.
Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.
Over time media sites are becoming more reliant on platforms for distribution, with visitors having fleeting interest: "bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today."
Accelerated Mobile Pages and Instant Articles?
These are not solutions. They are only a further acceleration of the problem.
How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?
If someone else hosts your content & you depend on them for distribution, you are competing against yourself through an entity that can arbitrarily shift the terms on you whenever they feel like it.
“The cracks are beginning to show, the dependence on platforms has meant they are losing their core identity,” said Rafat Ali. “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”
Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google's redesigned image search shunted traffic away from photographers. Google's remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished, & if you don't, good luck negotiating with a monopoly. You'll probably need the EU to see any remedy there.
"One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate." - Ben Thompson
Long after benefit stops passing to the creative person the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places their own AI driven news rewrites in front of users?
Who then will fund journalism?
Dumb it Down
Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media's bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero sum to a negative sum market, particularly as Google keeps growing their knowledge scraper graph.
Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.
Yahoo’s billion-person-a-month home page is run by an algorithm, with a spare editorial staff, that pulls in the best-performing content from across the site. Yahoo engineers generally believed these big names should have been able to support themselves and garner their own large audiences without relying on home page placement. As a result, they were expected to sink or swim on their own.
“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”
The above also has an incredibly damaging knock-on effect on society.
People miss the key news: "what articles got the most views, and thus 'clicks.' Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite." - Karl Denninger
“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” ... “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”
That is basically the government complaining to the press about it being "too easy" to manipulate the press.
After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. ... A topic was often blacklisted if it didn’t have at least three traditional news sources covering it
As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.
The strategy to keep sacrificing the long term to hit the short term numbers can seem popular. And then, suddenly, death.
You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
- Neil Young, Stringman
If you have something unique and don't market it aggressively then nobody will know about it. And, in fact, in some businesses your paying customers may have no interest in sharing your content because they view it as one of their competitive advantages. This was one of the big reasons I ultimately had to shut down our membership site.
If you do market something well enough to create demand then some other free sites will make free derivatives, and it is hard to keep having new things to write worth paying for in many markets. Eventually you exhaust the market or get burned out or stop resonating with it. Even free websites have churn. Paid websites have to bring in new members to offset old members leaving.
In most markets worth being in, there are going to be plenty of free sites in the vertical which dominate the broader conversation. Thus you likely need to publish a significant amount of information for free which leads into an eventual sale. But knowing where to put the free line & how to move it over time isn't easy. Over the past year or two I blogged far less than I should have if I was going to keep running our site as a paid membership site.
And the last big issue is that a paywall runs counter to all of the other business models above that the mainstream media is trying. You need deeper content, better content, content that is not off topic, etc. Many of the easy wins for ad-funded media become easy losses for paid membership sites. And just like it is hard for newspapers to wean themselves off of print ad revenues, it can be hard to undo many of the quick-win ad revenue boosters if one wants to change their business model drastically. Regaining your soul takes time, and often, death.
“It's only after we've lost everything that we're free to do anything.” ― Chuck Palahniuk, Fight Club
Google recently announced app streaming, where they can showcase & deep link into apps in the search results even if users do not have those apps installed. Rather than the user installing the app, Google runs the app on a computer in their cloud & streams the interaction to the user as video. Click targets, ads, etc. remain the same.
Imagine if, in order to use the web, you had to download an app for each website you wanted to visit. To find news from the New York Times, you had to install an app that let you access the site through your web browser. To purchase from Amazon, you first needed to install an Amazon app for your browser. To share on Facebook, installation of the Facebook app for your browser would be required. That would be a nightmare.
The web put an end to this. More specifically, the web browser did. The web browser became a universal app that let anyone open anything on the web.
To meaningfully participate on those sorts of sites you still need an account. You are not going to be able to buy on Amazon without registration. Any popular social network which allows third party IDs to take the place of first party IDs will quickly become a den of spam until they close that loophole.
In short, you still have to register with sites to get real value out of them if you are doing much beyond reading an article. Without registration it is hard for them to personalize your experience & recommend relevant content.
Desktop Friendly Design
App indexing & deep linking of apps is a step in the opposite direction of the open web. It is supporting proprietary non-web channels which don't link out. Further, if you thought keyword (not provided) heavily obfuscated user data, how much will data be obfuscated if the user isn't even using your site or app, but rather is interacting via a Google cloud computer?
Who visited your app? Not sure. It was a Google cloud computer.
Where were they located? Not sure. It was a Google cloud computer.
Did they have problems using your app? Not sure. It was a Google cloud computer.
What did they look at? Can you retarget them? Not sure. It was a Google cloud computer.
Is an app maker too lazy to create a web equivalent version of their content? If so, let them be at a strategic disadvantage to everyone who put in the extra effort to publish their content online.
If Google has their remote quality raters consider a site as not meeting users needs because they don't publish a "mobile friendly" version of their site, how can one consider a publisher who creates "app only" content as an entity which is trying hard to meet end user needs?
The low price points for consumer apps in app stores make it hard for businesses to justify selling B2B apps at a high enough price to offset the smaller addressable audience.
It has become harder to sell consumer apps as the app stores have saturated with competition.
2008 I'll sell apps for $2.99 & make millions
2010 At $0.99 I'll make $1000s
2012 Ads might cover my rent
2014 Kickstart my app
2015 Hire me
— Nick Lockwood (@nicklockwood) August 3, 2015
Exceptionally popular apps are disabled for interfering with business models of the platforms. Apps and extensions can be disabled at any time, even after the fact, due to violating guidelines or rule changes that turn what was once fine into a guideline violation. In some cases when they are disabled it is done with no option to re-enable.
We’re rapidly moving from an internet where computers are ‘peers’ (equals) to one where there are consumers and ‘data owners’, silos of end user data that work as hard as they can to stop you from communicating with other, similar silos.
If the current trend persists we’re heading straight for AOL 2.0, only now with a slick user interface, a couple more features and more users.
Katz of Gogobot says that “SEO is a dying field” as Google uses its “monopoly” power to turn the field of search into Google’s own walled garden like AOL did in the age of dial-up modems.
Almost 4 years ago a Google engineer described SEO as a bug. He suggested one shouldn't be able to rank highly without paying.
It looks like he was right. Google's aggressive ad placement on mobile SERPs "has broken the will of users who would have clicked on an organic link if they could find one at the top of the page but are instead just clicking ads because they don’t want to scroll down."
In the years since then we've learned Google's "algorithm" has concurrent ranking signals & other forms of home cooking which guarantees success for Google's vertical search offerings. The "reasonable" barrier to entry which applies to third parties does not apply to any new Google offerings.
And "bugs" keep appearing in those "algorithms," which deliver a steady stream of harm to competing businesses.
From Indy to Brand
The waves of algorithm updates have in effect increased the barrier to entry, along with the cost needed to maintain rankings. The stresses and financial impacts that puts on small businesses makes many of them not worth running. Look no further than MetaFilter's founder seeing a psychologist, then quitting because he couldn't handle the process.
there’s no reason why the internet couldn’t keep on its present course for years to come. Under those circumstances, it would shed most of the features that make it popular with today’s avant-garde, and become one more centralized, regulated, vacuous mass medium, packed to the bursting point with corporate advertising and lowest-common-denominator content, with dissenting voices and alternative culture shut out or shoved into corners where nobody ever looks. That’s the normal trajectory of an information technology in today’s industrial civilization, after all; it’s what happened with radio and television in their day, as the gaudy and grandiose claims of the early years gave way to the crass commercial realities of the mature forms of each medium.
If you participate on the web daily, the change washes over you slowly, and the cumulative effects can be imperceptible. But if you were locked in an Iranian jail for years the change is hard to miss.
These sorts of problems not only impact search, but have an impact on all the major tech channels.
iPhone autocorrect inserted "showgirl" for "shows" and "POV" for "PPC". This crowd sourcing of autocorrect is not welcomed.
— john andrews (@searchsleuth998) November 10, 2015
Eventually they might even symbolically close their websites, finishing the job they started when they all stopped paying attention to what their front pages looked like. Then, they will do a whole lot of what they already do, according to the demands of their new venues. They will report news and tell stories and post garbage and make mistakes. They will be given new metrics that are both more shallow and more urgent than ever before; they will adapt to them, all the while avoiding, as is tradition, honest discussions about the relationship between success and quality and self-respect.
If in five years I’m just watching NFL-endorsed ESPN clips through a syndication deal with a messaging app, and Vice is just an age-skewed Viacom with better audience data, and I’m looking up the same trivia on Genius instead of Wikipedia, and “publications” are just content agencies that solve temporary optimization issues for much larger platforms, what will have been point of the last twenty years of creating things for the web?
A Deal With the Devil
As ad blocking has grown more pervasive, some publishers believe the solution to the problem is through gaining distribution through the channels which are exempt from the impacts of ad blocking. However those channels have no incentive to offer exceptional payouts. They make more by showing fewer ads within featured content from partners (where they must share ad revenues) and showing more ads elsewhere (where they keep all the ad revenues).
The problem is if you don't control the publishing you don't control the monetization and you don't control the data flow.
Your website helps make the knowledge graph (and other forms of vertical search) possible. But you are paid nothing when your content appears in the knowledge graph. And the knowledge graph now has a number of ad units embedded in it.
A decade ago, when Google pushed autolink to automatically insert links in publisher's content, webmasters had enough leverage to "just say no." But now? Not so much. Google considers in-text ad networks spam & embeds their own search in third party apps. As the terms of deals change, and what is considered "best for users" changes, content creators quietly accept, or quit.
The most recent leaked Google rater documents suggested the justification for featured answers was to make mobile search quick, but if that were the extent of it then it still doesn't explain why they also appear on desktop search results. It also doesn't explain why the publisher credit links were originally a light gray.
With Google everything comes down to speed, speed, speed. But then they offer interstitial ad units, lock content behind surveys, and transform the user intent behind queries in a way that leads them astray.
Most of you are too busy monitoring Google's latest algorithm updates, examining web analytics, and building links and content to stay up to date on the design world.
Usually, creative people who excel at design aren't very good at the left-brain thinking required to succeed in the highly-technical search engine optimization industry. Likewise, very few people with the analytical mindset required for search engine optimization would do well in the free-spirited design industry.
Unfortunately, in the real world, you're often expected to do exactly that. And while most people understand that it would be ludicrous to expect their doctor to also troubleshoot their plumbing, they don't seem to understand why they shouldn't expect the person responsible for their SEO to also handle their design needs from time to time.
So you're often forced to design things for your clients from time to time. Or sometimes, you just need to whip up something for yourself instead of trying to find someone who can deliver what you need on Fiverr.
Since you probably won't start sporting a black turtleneck and talking about crop marks, press checks, or CMYK colors anytime soon, it seems silly to shell out thousands of dollars on software you'll only use occasionally, so I've compiled a list of design resources for non-designers.
The resources in this list are every bit as powerful as any of the professional-grade software, but they are free. (Some do offer premium versions with more options.) The only downside is that it might be a little bit tougher to find tutorials for some of these programs compared to the industry standard software like Adobe Photoshop or Illustrator.
We all need to edit and create images from time to time, but if you only do it occasionally, software like Adobe Photoshop and Illustrator works out to be pretty expensive. Fortunately, there are several feature-rich image editing programs available.
GIMP - Anything you can do with Photoshop can be done with GIMP, and it runs on Windows, Mac, and Linux. The learning curve can be steep, but it's worth the time.
Pixlr - If you're used to Photoshop, this program has a very similar interface, and it even opens native .psd files with the original layers intact.
Canva - The drag-and-drop interface of this web-based design program makes graphic design quick and simple, plus it comes with a library of over one million professional stock images.
Inkscape - Easily create illustrations, logos, technical drawings, and vector images with this free alternative to Illustrator.
SVG Editor - If you're obsessed with website speed, you probably love SVGs (scalable vector graphics), and this handy tool from Google makes it easy to create and edit them.
OK, so you're not going to compete with Pixar anytime soon, but 3D capabilities do come in handy for designing mockups of books and DVDs, creating characters, and even complete photorealistic animations.
Online 3d Package - This tool lets you quickly and easily create photorealistic mockups of books, boxes, DVDs, and CDs.
Blender - If you occasionally need to create 3D renderings but can't justify spending big bucks for professional-grade software that you'll only use a few times, Blender is the perfect (and free) alternative.
Designing a website requires a blend of creative and technical skills. Fortunately, there are plenty of tools available to efficiently complete both. From the pretty parts, to the nuts and bolts, to the little details, here is everything you'll need:
Palette generator - Upload an image and this tool will generate the perfect color palette to complement it, which you can download as a CSS file.
Subtle Patterns - Creating seamless backgrounds can be a pain, so instead of starting from scratch, just download from over 400 high-quality seamless background images, including textures and patterns.
Web page editors
Whether you're building a website from scratch with a WYSIWYG editor or fine-tuning the code on an existing website with an HTML editor, web design software will probably get a lot of use in your hands. If you have the technical chops to hand code your websites, that's ideal, but if not, or if you just don't want to, here are several options:
KompoZer - With a WYSIWYG editor in one tab, raw HTML in the other, and on-the-fly editing with built-in FTP, KompoZer makes creating and editing web pages a breeze.
Google Web Designer - Build HTML5-based designs and motion graphics that can run on any device without writing any code! (If you want to get your hands dirty, you can edit all HTML and CSS by hand.)
Expression Web - Microsoft offers another free web page editor, one which has made significant improvements since that abomination called FrontPage.
Favicon Generator - A truly polished website needs consistent branding throughout, and that means all the little details, including a favicon—that tiny little image that sits in the tab or bookmarks. Just upload an image file, such as your logo, and this handy tool will spit out the .ico files you need.
Web Developer Toolbar - This browser toolbar is available for Firefox and Chrome, and helps you troubleshoot your website and even test it at various screen sizes.
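For the favicon generator above, wiring the resulting .ico file into a site is a one-line addition to the head; a minimal sketch, assuming the file sits at the conventional root path:

```html
<!-- Conventional root-relative favicon reference; browsers also
     check /favicon.ico automatically if no link tag is present -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
```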
Infographics are still an effective method to earn social shares and links, and they are a great way to present a lot of data-rich information, but they can be a pain to create. Here are several tools to simplify the process that might even be better (and easier) than traditional design software.
Infogram - Build beautiful data-driven infographics in just three steps with this free tool.
Piktochart - With a simple point and click editor and over 4,000 graphics, icons, and template files, Piktochart makes it easy to create infographics that look exactly the way you want.
Easel.ly - Loaded with tons of creative templates and an easy-to-use interface, this is another powerful tool to create your own stunning infographics.
Venngage - This drag and drop interface provides all the charts, maps, icons and templates you'll need to design attention-grabbing infographics.
Vizualize.me - Turn your boring resume into a unique visual expression of your skills and experience to stand out from the crowd.
If you are in a saturated market or have a great idea you are certain will be a success then it may make sense to splash out for a custom designed graphic, but in less competitive markets some of the above quick-n-easy tools can still be remarkably effective.
Google Charts is a great way to create all sorts of charts, and the best part is that you can create them on the fly by passing variables in the URL.
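As an illustration of the URL-driven approach, a chart can be embedded with nothing but an image tag whose query string carries the chart type, size, data, and labels. This sketch uses the Image Charts endpoint at chart.googleapis.com; the parameter values are illustrative:

```html
<!-- cht = chart type (p3 = 3D pie), chs = size in pixels,
     chd = data series, chl = slice labels -->
<img src="https://chart.googleapis.com/chart?cht=p3&chs=250x100&chd=t:60,40&chl=Organic|Paid"
     alt="3D pie chart of traffic mix">
```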
Today you have plenty of options when it comes to font choices, so please stop using Arial, and for the love of all that is good, never use Comic Sans or I will hunt you down. You can choose from thousands of free fonts, so it's easy to pick one that fits your project perfectly.
Typegenius - Choosing the perfect font combo can be tough, but Typegenius makes it easy. Just pick a starter font from the drop down list and the site will recommend fonts that pair well with it.
Google Fonts - I recommend embedding Google fonts instead of hosting them on your own server because they load more quickly and there is a chance they're already cached on visitors' computers.
Font Awesome - This is an awesome (hence the name) way to add all sorts of scalable icons without a load of extra http requests. Simply load one font for access to 519 icons that can be colored, scaled, and styled with CSS.
DaFont - Download and install these fonts (.ttf or .otf formats) for designing documents or images on your computer.
What the Font - If you've ever experienced the rage-inducing task of figuring out what font was used when your client only has a 72dpi jpg and no idea how to track down their previous designer, then this is the tool for you. Just upload your image and it goes to work figuring what font it is.
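To illustrate the Google Fonts embedding mentioned above, the whole integration is one link tag plus a CSS rule; the font family here is an arbitrary example:

```html
<!-- Load the font from Google's CDN, then reference it in CSS -->
<link href="https://fonts.googleapis.com/css?family=Open+Sans" rel="stylesheet">
<style>
  body { font-family: 'Open Sans', sans-serif; }
</style>
```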
Social media can multiply your website's exposure exponentially, but it takes a lot of work. From branding profiles on each network to crafting engaging visual content your fans will share, you'll have to create a lot of graphics to feed the beast. Doing that manually, the old-fashioned way is tedious and slow, so I recommend these tools to speed up your workflow.
Easy Cover Maker - Stop wasting time trying to position your cover and profile photo for your Facebook and Google+ page. This tool lets you drag everything into position in one handy interactive window, then download the image files.
Quotes Cover - Just select a quote or enter your own text, apply various effects for your own unique style, and download eye catching pictures perfect for social media. It even creates the perfect dimensions based on how you intend to use it.
Chisel - This tool has the most user-friendly interface and tons of great images and fonts to create the exact message you want to share.
Recite This - There are plenty of images and fonts available, but the downside is you have to scroll through images one at a time, and fonts are selected randomly.
Jing - From the makers of Camtasia, this free program gives you the ability to capture images or video (up to 5 minutes long) of your computer screen, then share it with the click of a button.
Social Kit - Create cover images, profile pictures, and ad banners for Facebook, Twitter, Google+, and YouTube with this free, up-to-date Photoshop plugin.
Social media image size guide - The folks over at Sprout Social created (and maintain) this handy and comprehensive Google doc listing the image sizes for all major social media networks, and since it's a shared document, you can have Google notify you anytime it's updated!
Instead of wasting time searching for the perfect meme, why not just create your own?
Powerful photos can mean the difference between a dry post that visitors ignore and one that entices them to read more. The good news is you don't have to take your own photos or spend a fortune on stock photos because there are several free and low-cost options available.
Unsplash - These are not your typical cheesy stock photos; they lean more towards the artistic side. New photos are uploaded every day and they're all 100% free.
StockVault - With over 54 thousand free images available, both artistic and corporate-style, you should be able to find the perfect photo for just about any project.
Dreamstime & iStockPhoto - Both of these sites give you the option of a subscription model or pay-as-you-go credits. Many images on one are available on the other, but I've found great images that were only on one of the two sites, so it's worthwhile to check both.
Even the best designers hit a wall, creatively speaking, so it helps to look for inspiration. These sites curate the best designs around and are updated regularly, so you'll find plenty of fresh ideas for your project.
Since your days are filled with keyword research, content development, link building, and other SEO-related tasks, you probably don't have time to stay up-to-date on the latest design trends and techniques. No worries—with these websites, you'll be able to find a tutorial to walk you through just about any design challenge.
CSS-Tricks - Whenever I have a CSS question, I always slap “css tricks” on the end of my search because Chris Coyier has the most detailed, yet easy-to-understand tutorials on damn near every scenario you could imagine.
Tuts+ - Learn everything about graphic design, web design, programming, and more with a growing library of articles and tutorials.
Smashing Magazine - This is probably one of the most comprehensive web design resources you'll find anywhere, going wide and deep on every aspect of web design.
About the Author
Jeremy Knauff is the founder of Spartan Media, a proud father, husband, and US Marine Corps veteran. He has spent over 15 years helping everyone from small businesses up to Fortune 500 companies make their mark online, and now he's busy building his own media empire. You can follow Spartan Media on Twitter and Facebook.
If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile with it accounting for about 10% of their online ad revenues.
Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google's mobile ad clicks:
While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume
Those ad networks want other publishers to make their sites mobile friendly for a couple reasons...
If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relative poor performance, particularly if they haven't spent resources optimizing user experience on the channel.
Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.
In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.
I wouldn't recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.
Those who ignore the warning might be in for significant pain.
Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
Another related issue referenced in the above post was tying in-app content to mobile search personalization:
Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. With the m. approach, on the regular version of the site (say, www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document.
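That annotation looks something like the following (a sketch based on Google's documented markup for separate mobile URLs; the seobook.com URLs and the 640px breakpoint are placeholders):

```html
<!-- On the desktop page, e.g. http://www.seobook.com/page-1 -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.seobook.com/page-1">

<!-- And on the corresponding m. page, pointing back at the desktop URL -->
<link rel="canonical" href="http://www.seobook.com/page-1">
```

The rel="alternate" line tells crawlers where the mobile equivalent lives, while the rel="canonical" on the m. page consolidates ranking signals back onto the desktop URL.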
With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.
3 or 4 years ago it was a toss up as to which of these 2 options would win, but over time it appears the responsive design option is more likely to win out.
Here are a couple reasons responsive is likely to win out as a better solution:
If people share a mobile-friendly URL on Twitter, Facebook or other social networks & the URL differs by device, then when someone on a desktop computer clicks on the shared m. version of the page (with its fewer ad units & less content), the publisher is providing a worse user experience & losing out on the incremental monetization they would have achieved with the additional ad units.
While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
Over time phablets are increasingly blurring the line between phones and tablets. Some high pixel density screens on crossover devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
When Bing gave their best practices for mobile, they stated: "Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all." In that post Bing shows some examples of m. versions of sites ranking in their mobile search results, however for smaller & lesser known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you optimize the m. version of the site to a great degree, that isn't the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidation of ranking signals.
In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which uses the Vary HTTP header to detect the user-agent & use that to drive the layout. For large, complex & high-traffic sites (and sites with numerous staff programmers & designers) dynamic serving is perhaps the best optimization solution because you can optimize the images and code to lower bandwidth costs and response times. Most smaller sites will likely rely on responsive design rather than dynamic serving, in large part because it is quicker & cheaper to implement, and most are not running sites large enough to where the incremental bandwidth provides a significant incremental expense to their business.
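As a sketch of how dynamic serving differs, the URL never changes; the server inspects the User-Agent request header and returns device-specific HTML, while the Vary response header warns caches & crawlers that the response differs by user-agent (the headers below are illustrative, not from any particular site):

```
GET /page-1 HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) ...

HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
```

Without the Vary header, an intermediate cache could serve the mobile-optimized HTML to a desktop visitor (or vice versa), and Googlebot may not realize it needs to crawl the page with a smartphone user-agent.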
Solutions for Quickly Implementing Responsive Design
New Theme / Design
If your site hasn't been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.
Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to do at least a few minor changes when you put the designs live to compensate for issues like third party ad units and other minor issues.
If you have an existing Wordpress theme, you might want to see if you can zip it up and send it to them; otherwise they may build your new theme as a child theme of 2015 or such. If converting your existing Wordpress theme directly proves to be a struggle, another option would be to have them do a static HTML file conversion (instead of a Wordpress conversion) and then feed that through a theme creation tool like Themespress.
For simple hand rolled designs there are a variety of grid generator tools, which can make it reasonably easy to replace some old school table-based designs with divs. Many of the themes for sale in marketplaces like Theme Forest also use a multi-column div grid based system.
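As a rough sketch of what those grid systems generate, a two-column div layout that collapses on small screens can be as simple as the following (the class names are illustrative, not tied to any particular framework):

```css
.row { width: 100%; overflow: hidden; }
.col { float: left; width: 50%; box-sizing: border-box; padding: 0 1em; }

/* Stack the columns vertically on narrow screens */
@media (max-width: 640px) {
  .col { float: none; width: 100%; }
}
```

Swapping an old table-based layout for markup like this is mostly a matter of replacing each table row with a `.row` div and each cell with a `.col` div.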
Other Things to Look Out For
Third Party Plug-ins & Ad Code Gotchas
Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different sized ad units to different screen sizes & skip showing some ad units on smaller screens. An AdSense code example is included in an expandable section at the bottom of this page.
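The general pattern (per Google's AdSense documentation on modifying responsive ad code; the publisher & slot IDs below are placeholders, and the adsbygoogle.js script include is elided) is to size the ad container with CSS media queries, which can also hide a unit entirely on the smallest screens:

```html
<style>
.my-ad { display: none; }  /* skip this ad unit on tiny screens */
@media (min-width: 500px) {
  .my-ad { display: inline-block; width: 468px; height: 60px; }
}
@media (min-width: 800px) {
  .my-ad { width: 728px; height: 90px; }
}
</style>
<ins class="adsbygoogle my-ad"
     data-ad-client="ca-pub-1234567890123456"
     data-ad-slot="1234567890"></ins>
<script>(adsbygoogle = window.adsbygoogle || []).push({});</script>
```

Google permits this sort of modification on their responsive units specifically so the ad sizes can track the responsive breakpoints of the surrounding design.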
Back up your old site before putting the new site live.
For static HTML sites or sites with PHP or SHTML includes & such...
Download a copy of your existing site to local.
Rename that folder to something like sitename.com-OLDVERSION
Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process you can rename the new site design to something like sitename.com-HOSED then set the sitename.com-OLDVERSION folder to sitename.com to quickly restore the site.
Download your site to local again.
Ensure your new site design is using a different CSS folder or CSS filename such that the old and new versions of the design can be live at the same time while you are editing the site.
Create a test file with the responsive design on your site & test that page until things work well enough.
Once you have the general "what needs to change in each file" down, use find & replace to bulk edit the remaining files and make them responsive.
Use a tool like FileZilla to quickly bulk upload the files.
Look through key pages and if there are only a few minor errors then fix them and re-upload them. If things are majorly screwed up then revert to the old design being live and schedule a do over on the upgrade.
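The bulk find & replace step might look something like this from the command line (a sketch only; the site/ folder and the old.css / responsive.css filenames are placeholders for your own files):

```shell
# Work on a local copy of the site, never directly on the server.
mkdir -p site
printf '<link rel="stylesheet" href="old.css">\n' > site/index.html
printf '<link rel="stylesheet" href="old.css">\n' > site/about.html

# Swap the old stylesheet reference for the new responsive one in every page,
# keeping a .bak copy of each file in case something goes wrong.
for f in site/*.html; do
  sed -i.bak 's/old\.css/responsive\.css/g' "$f"
done

grep -l 'responsive.css' site/*.html   # lists the pages now using the new CSS
```

The `-i.bak` flag works on both GNU and BSD sed, and the leftover `.bak` files double as a per-file undo if a replacement goes wrong.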
If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
If you want to view your top pages you could export that data from your web analytics to verify all those pages look good. If you wanted to view every page of your site 1 at a time after the change, you could use a tool like Xenu Link Sleuth or Screaming Frog SEO Spider to crawl your site & export a list of URLs into a spreadsheet. Then you could take the URLs from that spreadsheet and put them a chunk at a time into a tool like URL Opener.
If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower stress approach, you could create a test site on another domain name for testing purposes. Just be sure to include a noindex directive in the robots.txt file or password protect access to the site while testing. When you get things worked out on it, make sure your internal links are referencing the correct domain name, and that you have removed any block via robots.txt or password protection.
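Blocking crawlers from the test domain is a one-file change (the snippet below is the standard robots.txt mechanism, shown generically):

```
# robots.txt at the root of the test domain -- blocks compliant crawlers entirely
User-agent: *
Disallow: /
```

Alternatively, a `<meta name="robots" content="noindex">` tag in the head of each page keeps any crawled pages out of the index; just remember to remove whichever block you used before pointing the design at the live domain.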
For a site with a CMS the above process is basically the same, except you might need to create the backup differently. If you are uploading a Wordpress or Drupal theme, then change the name at least slightly so you can keep the old and new designs live at the same time & quickly switch back to the old design if you need to.
If you have a mixed site with Wordpress & static files or such then it might make sense to test changing the static files first, get those to work well & then create a Wordpress theme after that.
Google systems have tested 2,790 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 2,790 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.
When Google introduced the knowledge graph one of their underlying messages behind it was "you can't copyright facts."
Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.
When you search for love quotes, Google pulls one into their site & then provides another "try again" link.
Since quotes mostly come from third parties, they are not owned by BrainyQuotes and other similar sites. But here is the thing: if the sites which pay to organize and verify such collections have their economics sufficiently undermined, they go away, & then Google isn't able to pull the quotes into the search results either.
The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn't be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.
Google has also done the above quote-like "action item" types of onebox listings in other areas, like software downloads.
Where there are multiple versions of the software available, Google is arbitrarily selecting the download page, even though a software publisher might have a parallel SAAS option or other complex funnels based on a person's location or status as a student or such.
Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here's a blog post from Malwarebytes referencing
their software being advertised on their brand term in Google via AdWords ads, engaging in trademark infringement and bundled with adware.
numerous user complaints they received about the bundleware
required legal actions they took to take the bundler offline
The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
The ads themselves said things like “McAfee Support - Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks
Google started the knowledge graph & onebox listings on some utterly banal topics which were easy for a computer to get right, though their ambitions vastly exceed the starting point. The starting point was done where it was because it was low-risk and easy.
When Google's evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that...
Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”
Even as the people who routinely shill for Google parrot the "you can't copyright facts" mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.
Even if You Have Copyright...
What makes the "you can't copyright facts" line so particularly disingenuous was Google's support of piracy when they purchased YouTube:
cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)
To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn't so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: "one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search."
Altruism vs Disruption for the Sake of it
Whenever Google implements a new feature they can choose to not monetize it so as to claim they are benevolent and doing it for users without commercial interest. But that same "unmonetized & for the users" claim was also used for their shopping search vertical, until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.
There was literally no transition period.
Many of the "informational" knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.
starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.
Google's links to the Mayo Clinic in their knowledge graph are, once again, a light gray font.
In case you didn't find enough background in Google's announcement article, Greg Sterling shared more of Google's views here. A couple notable quotes from Greg...
Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.
Google doesn't need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
If this is done out of benevolence, it will appear *above* the AdWords ads on the search results — unlike almost every type of onebox or knowledge graph result Google offers.
If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those "cynics" to label Greg Sterling as a shill.
Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.
By defunding and displacing something they don't improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.
If their traffic drops and they don't do more with less, then...
their margins will fall
growth slows (or they may even shrink)
their stock price will tank
management will get fired & replaced, and/or the company will be taken private by private equity investors, and/or they will need to make some "bet the company" moves to find growth elsewhere (and hope Google doesn't enter that parallel area anytime soon)
Things get monetized directly, monetized indirectly, or they disappear.
Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly in ad units, auto play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.
Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.
There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:
we were REALLY wrong yesterday
we are REALLY wrong today
Any change or disruption is easy to justify so long as you are not the one facing the consequences:
"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun
Monopoly Marketshare in a Flash
Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.
Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).
Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.
Why doesn't that same process hit Chrome? Google not only pays Adobe to use security updates to steal marketshare from other browsers, but also pays Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.
Anytime anyone using a browser other than Chrome has a Flash security update they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.
Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.
In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.
In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.
Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.
Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.
Locking Down The Ecosystem
And Chrome is easily the most locked down browser out there.
While Google relies on bundling their toolbar & browser in updates to Flash and other plugins, they require an opposite strategy for anyone distributing Chrome plugins. Chrome plugins "must have a single purpose that is narrow and easy-to-understand."
Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.
I am frustrated @JohnMu saying that it will not cost CTR. Either Google lied about the increase in CTR with photos, or they're lying now.— Rand Fishkin (@randfish) June 25, 2014
The Right to Be Forgotten
This brings us back to the current snafu with the "right to be forgotten" in Europe.
Some have looked at the EU policy and compared it to state-run censorship in China.
Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."
Sorry About That Incidental Deletion From the Web...
David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:
In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.
Despite Google's great power they do make mistakes. And when they do, people lose their jobs.
People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years, & they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.
As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.
MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.
The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.
If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.
Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.
Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.
I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.
There are entire sectors of the offline economy being reshaped by Google policies.
When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."
John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?
When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.
"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page
I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."
A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & leverage their increased level of trust to increase their profit margins by leveraging algorithmic journalism.
Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:
We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.
And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:
you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.
To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:
"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.
The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.
Last year Google dictated that press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was greater emphasis on manual editorial review:
Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:
Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
Assessing release length, guarding against the issue of very short, unsubstantial messages that are mere vehicles for links;
Overuse of keywords and/or links within the message.
So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.
That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...
The stock market had a flash crash today after someone hacked the AP Twitter account & made a fake announcement about bombs going off at the White House. Recently Twitter's search functionality has grown so inundated with spam that I don't even look at the brand related searches much anymore. While you can block individual users, it doesn't block them from showing up in search results, so there are various affiliate bots that spam just about any semi-branded search.
"We [YouTube] can't review every submission, so basically the crowd marks it if it is a problem post publication."
On Wikileaks vs YouTube: "You have a different model, right. You require human editors."
We would post editorial content more often, but we are sort of debating opening up a social platform so that we can focus on the user without having to bear any editorial costs until after the fact. Profit margins are apparently better that way.
As Google drives smaller sites out of the index & ranks junk content based on no factor other than it being on a trusted site, they create the incentive for spammers to ride on the social platforms.
All aboard. And try not to step on any toes!
When I do some product related searches (eg: brand name & shoe model) almost the whole result set for the first 5 or 10 pages is garbage hosted on trusted platforms:
Facebook Notes & pages
subdomains off of various other free hosts
It comes as no surprise that Eric Schmidt fundamentally believes that "disinformation becomes so easy to generate because of, because complexity overwhelms knowledge, that it is in the people's interest, if you will over the next decade, to build disinformation generating systems, this is true for corporations, for marketing, for governments and so on."
Of course he made no mention of Google's role in the above problem. When they are not issuing threats & penalties to smaller independent webmasters, they are just a passive omniscient observer.
With all these business models, there is a core model of building up a solid stream of usage data & then tricking users or looking the other way when things get out of hand. Consider Google's Lane Shackleton's tips on YouTube:
"Search is a way for a user to explicitly call out the content that they want. If a friend told me about an Audi ad, then I might go seek that out through search. It’s a strong signal of intent, and it’s a strong signal that someone found out about that content in some way."
"you blur the lines between advertising and content. That’s really what we’ve been advocating our advertisers to do."
"you’re making thoughtful content for a purpose. So if you want something to get shared a lot, you may skew towards doing something like a prank"
Harlem Shake & Idiocracy: the innovative way forward to improve humanity.
Hubert said that if people file a reconsideration request, they should “get a clear answer” about what’s wrong. There’s a bit of a Catch-22 there. How can you file a reconsideration request showing you’ve removed the bad stuff, if the only way you can get a clear answer about the bad stuff to remove is to file a reconsideration request?
The answer is that technically, you can request reconsideration without removing anything. The form doesn’t actually require you to remove bad stuff. That’s just the general advice you’ll often hear Google say, when it comes to making such a request. That’s also good advice if you do know what’s wrong.
But if you’re confused and need more advice, you can file the form asking for specifics about what needs to be removed. Then have patience.
In the past I referenced that there is no difference between a formal white list & overly-aggressive penalties coupled with loose exemptions for select parties.
The moral of the story is that if you are going to spam, you should make it look like a user of your site did it; that way you don't have to bear responsibility for it.
Native advertising presents opportunities for SEOs to boost their link building strategies, particularly those who favor paid link strategies.
What Is Native Advertising?
Native advertising is the marketing industry's new buzzword for... well, it depends who you ask.
Native advertising can't just be about the creative that fills an advertising space. Native advertising must be intrinsically connected to the format that fits the user's unique experience. There's something philosophically beautiful about that in terms of what great advertising should (and could) be. But first, we need to all speak the same language around "native advertising."
Native advertising is often defined as content that seamlessly integrates with a site, as opposed to interruption media, such as pre-rolls on YouTube videos, or advertising that sits in a box off to the side of the main content.
Some high-profile examples of native advertising include Facebook Sponsored Stories; Twitter's Promoted Tweets; promoted videos on YouTube, Tumblr and Forbes; promoted articles like Gawker's Sponsored Posts and BuzzFeed's Featured Partner content; Sponsored Listings on Yelp; promoted images on Cheezburger; and promoted playlists on Spotify and Radio.
One interesting observation is that Adwords and Adsense are frequently cited as being examples of native advertising. Hold that thought.
Why Native Advertising?
The publishing industry is desperate to latch onto any potential lifeline as ad rates plummet.
Analysts say the slowdown is being caused by the huge expansion in the amount of online advertising space, as the companies who manage this space emerge to dominate it. In short, there are just too many ad slots chasing ads that are growing in number, but at a rate slower than the creation of potential ad slots.
This means the chances are dimming that online ad spending would gradually grow to make up for some of the falls in analogue spending in print. ....staff numbers and the attendant costs of doing business have to be slashed heavily to account for the lower yield and revenue from online ads
And why might there be more slots than there are advertisers?
“The model of ‘boxes and rectangles’ – the display banner – is failing to fully support traditional ‘content’ sites beyond a handful of exceptions,” wrote Federated Media founder John Battelle in a recent blog post. He explained that the next generation of native ads on social networks and strength of Google Adwords make direct sales more competitive, and that ad agencies must evolve with the growing trend of advertisers who want more social/conversational ad campaigns.
Advertisers aren't seeing enough return from the advertising for them to want to grab the many slots that are available. And they are lowering their bids to make up for issues with publishing fraud. The promise of native advertising is that this type of advertising reaches real users, and will grab and hold viewers' attention for longer.
Facebook is still largely centered around interactions with people one knows offline, making the appearance of marketing messages especially jarring. This is particularly true in mobile, where Sponsored Stories take up a much larger portion of the screen relative to desktop. Facebook did not handle the mobile rollout very gracefully, either. Rather than easing users into the change, they appeared seemingly overnight, and took up the first few posts in the newsfeed. The content itself is also hit or miss – actions taken by distant friends with dissimilar interests are often used as the basis for targeting Sponsored Stories.
If you’re planning on offering native advertising yourself, you may need to walk a fine line. Bloggers and other publishers who are getting paid but don’t declare so risk alienating their audience and destroying their reputation.
Some good ways of addressing this issue are policy pages that state the author has affiliate relationships with various providers, and this is a means of paying for the site, and does not affect editorial. Whether it’s true or not is up to the audience to decide, but such transparency up-front certainly helps. If a lot of free content is mixed in with native content, and audiences dislike it enough, then it might pave the way for more paid content and paywalls.
Just like any advertising content, native advertising may become less effective over time if the audience learns to screen it out. One advantage for the SEO is that this doesn't matter so much, so long as they get the link.
Forbes Insights and Sharethrough today announced the results of a brand study to assess adoption trends related to native video advertising that included senior executives from leading brands such as Intel, JetBlue, Heineken and Honda. The study shows that more than half of large brands are now using custom brand videos in their marketing, and when it comes to distribution, most favor “native advertising” approaches where content is visually integrated into the organic site experience, as opposed to running in standard display ad formats. The study also shows that the majority of marketers now prefer choice-based formats over interruptive formats.
Google’s Clamp-Down On Link Advertising
So, what’s the difference between advertorial and native content? Not much, on the face of it, except in one rather interesting respect. When it comes to native advertising, it’s often not obvious the post is sponsored.
The Atlantic, BuzzFeed and Gawker — are experimenting with new ad formats such as sponsored content or “native advertising,” as well as affiliate links. On Friday, Google engineer Matt Cutts reiterated a warning from the search giant that this kind of content has to be treated properly or Google will penalize the site that hosts it, in some cases severely.
If native advertising proves popular with publishers and advertisers, then it’s going to compete with Google’s business model. Businesses may spend less on Adwords and may replace Adsense with native advertising. It’s no surprise, then, that Google may take a hostile line on it. However, publishers are poor, ad networks are rich, so perhaps it's time that publishers became ad networks.
When it comes to SEO, given Google’s warning shots, SEOs will either capitulate - and pretty much give up on paid links - or make more effort to blend seamlessly into the background.
Blurring The Lines
As Andrew Sullivan notes, the editorial thin blue line is looking rather “fuzzy”. It may even raise legal questions about misrepresentation. There has traditionally been a church and state divide between advertising and editorial, but as publishers get more desperate to survive, they’re going to go with whatever works. If native advertising works better than the alternatives, then publishers will use it. What choice have they got? Their industry is dying.
I have nothing but admiration for innovation in advertizing and creative revenue-generation online. Without it, journalism will die. But if advertorials become effectively indistinguishable from editorial, aren’t we in danger of destroying the village in order to save it?
Likewise, in order to compete in search results, a site must have links. It would be great if people linked freely and often based on objective merit, but we all know that is a hit and miss affair. If native advertising provides a means to acquire paid links that don’t look like paid links, then that is what people will do.
And if their competitors are doing it, they’ll have little choice.
If you’re looking for a way to build paid links, then here is where the opportunity lies for SEOs.
The recent examples Google caught out looked heavily advertorial. They were placed in bulk, and they would likely have been barely credible to a human reviewer as they didn't read particularly well. Those I saw had an "auto-generated quality" to them.
The integration with editorial needs to be seamless and, if possible, the in-house editors should write the copy, or it should look like they did. Avoid generic and boilerplate approaches. The content should not be both generic and widely distributed. Such a strategy is unlikely to pass Google’s inspections.
Markets will spring up, if they haven’t already, whereby publications will offer editorial native advertising, link included. It would be difficult to tell if such a link was “paid for”, certainly not algorithmically, unless the publisher specifically labelled it “advertising feature” or something similar.
Sure, this has been going on for years, but if a lot of high level publishers embrace something called "Native Advertising" then that sounds a lot more legitimate than "someone wants to pay for a link on our site". In marketing, it's all about the spin ;)
It could be a paid restaurant review on a restaurant review site, link included. For SEO purposes, the review doesn't even need to be overtly positive and glowing, therefore a high degree of editorial integrity could be maintained. This approach would suit a lot of review sites. For example, "we'll pay you to review our product, so long as you link to it, but you can still say whatever you like about it". The publisher's production cost is met, in total, and they can maintain a high degree of editorial integrity. If Jennifer Lopez is in a new movie with some "hot" scene then that movie can pay AskMen to create a top 10 sexiest moments gallery that includes their movie at #9 & then advertise that feature across the web.
A DIY site could show their readers how to build a garden wall. The products could be from a sponsor, link included. Editorial integrity could be maintained, as the DIY site need not push or recommend those products like an advertorial would, but the sponsor still gets the link. The equivalent of product placement in movies.
News items can feature product placement without necessarily endorsing them, link included - they already do this with syndicated press releases. Journalists often interview the local expert on a given topic, and this can include a link. If that news article is paid for by the link buyer, yet the link buyer doesn't have a say in editorial, then that deal will look attractive to publishers. Just a slightly different spin on "brought to you by our sponsor". Currently services like HARO & PR Leads help connect experts with journalists looking for story background. In the years to come perhaps there will be similar services where people pay the publications directly to be quoted.
I’m sure you can think of many other ideas. A lot of this isn’t new, it’s just a new, shiny badge on something that has been going on well before the web began. When it comes to SEO, the bar has been lifted on link building. Links from substandard content are less likely to pass Google’s filters, so SEOs need to think more about ways to get quality content integrated in a more seamless way. It takes more time, and it’s likely to be more costly, but this can be a good thing. It raises the bar on everyone else.
Those who don’t know the bar has been raised, or don’t put more effort in, will lose.
Low Level Of Compromise
Native advertising is a new spin on an old practice; however, it should be especially interesting to the SEO, as the SEO doesn't demand the publisher compromise editorial to a significant degree, as the publisher would have to for pure advertorial. The SEO only requires they incorporate a link within a seamless, editorial-style piece.
If the SEO is paying for the piece to be written, that's going to look like a good deal to many publishers.
The web search giant, which is embroiled in a long-running row over the way it deals with pirated material, is considering the radical measure so that it can get rid of the root cause instead of having to change its own search results.
Executives want to stop websites more or less dedicated to offering links to pirated films, music and books from making money out of the illegal material. The plans, still in discussion, would also block funding to websites that do not respond to legal challenges, for example because they are offshore.
Last month Google announced a new format for their image search results, where they pull the image inline without sending the visitor onto the publisher website. At the same time they referenced some "phantom visitor" complaint from publishers to justify keeping the visitor on Google & highlighted how there were now more links to the image source. If publishers were concerned about the "phantom visitor problem" we wouldn't see so many crappy slideshow pageviews.
Google's leaked remote rater guidelines do mention rating an image lower in certain situations, such as where the author might want attribution for work from which they are routinely disintermediated.
On Twitter a former Googler named Miguel Silvar wrote: "If you do SEO and decide to block Image Search just because it's bringing less traffic, you can stop calling yourself an SEO expert."
Many "experts" would claim that any exposure is good, even if you don't get credit for it. Many clients of said "experts" will end up bankrupt! Experts who suggest it is reasonable for content creators to be stripped of payment, traffic & attribution are at best conflicted.
As Google continues to win the game of inches of displacing the original sources, they don't even need you to mark up your content for them to extract their knowledge graph. Bill Slawski shared a video of Google's Andrew Hogue describing their mass data extraction effort: "It's never going to be 100% accurate. We're not even going to claim that it is 100% accurate. We are going to be lucky if we get 70% accuracy ... we are going to provide users with tools to correct the data."
If you as a publisher chose to auto-generate content at 70% accuracy, pumped it up to first page rankings & then said "if people care they will fix it," Google would rightfully call you a spammer. When they do the same, it is knowledge, baby.
Google pays for default placement in Safari & Firefox. Former Google executives head AOL & Yahoo!. Google can thus push for new cultural norms that make Microsoft look like an oddball or outsider if they don't play the same game.