Most of you are too busy monitoring Google's latest algorithm updates, examining web analytics, and building links and content to stay up to date on the design world.
Usually, creative people who excel at design aren't very good at the left-brain thinking required to succeed in the highly-technical search engine optimization industry. Likewise, very few people with the analytical mindset required for search engine optimization would do well in the free-spirited design industry.
Unfortunately, in the real world, you're often expected to do exactly that. And while most people understand that it would be ludicrous to expect their doctor to also troubleshoot their plumbing, they don't seem to understand why they shouldn't expect the person responsible for their SEO to also handle their design needs from time to time.
So from time to time you're forced to design things for your clients. Or sometimes you just need to whip up something for yourself instead of trying to find someone who can deliver what you need on Fiverr.
Since you probably won't start sporting a black turtleneck and talking about crop marks, press checks, or CMYK colors anytime soon, it seems silly to shell out thousands of dollars on software you'll only use occasionally, so I've compiled a list of design resources for non-designers.
The resources in this list are every bit as powerful as any of the professional-grade software, but they are free. (Some do offer premium versions with more options.) The only downside is that it might be a little bit tougher to find tutorials for some of these programs compared to the industry standard software like Adobe Photoshop or Illustrator.
We all need to edit and create images from time to time, but if you only do it occasionally, software like Adobe Photoshop and Illustrator works out to be pretty expensive. Fortunately, there are several feature-rich image editing programs available.
Gimp - Anything you can do with Photoshop can be done with Gimp, and it runs on Windows, Mac, and Linux. The learning curve can be steep, but it's worth the time.
Pixlr - If you're used to Photoshop, this program has a very similar interface, and it even opens native .psd files with the original layers intact.
Canva - The drag-and-drop interface of this web-based design program makes graphic design quick and simple, plus it comes with a library of over one million professional stock images.
Inkscape - Easily create illustrations, logos, technical drawings, and vector images with this free alternative to Illustrator.
SVG Editor - If you're obsessed with website speed, you probably love SVGs (scalable vector graphics), and this handy tool from Google makes it easy to create and edit them.
OK, so you're not going to compete with Pixar anytime soon, but 3D capabilities do come in handy for designing mockups of books and DVDs, creating characters, and even complete photorealistic animations.
Online 3d Package - This tool lets you quickly and easily create photorealistic mockups of books, boxes, DVDs, and CDs.
Blender - If you occasionally need to create 3D renderings but can't justify spending big bucks for professional-grade software that you'll only use a few times, Blender is the perfect (and free) alternative.
Designing a website requires a blend of creative and technical skills. Fortunately, there are plenty of tools available to efficiently complete both. From the pretty parts, to the nuts and bolts, to the little details, here is everything you'll need:
Palette generator - Upload an image and this tool will generate the perfect color palette to complement it, which you can download as a CSS file.
Subtle Patterns - Creating seamless backgrounds can be a pain, so instead of starting from scratch, just download from over 400 high-quality seamless background images, including textures and patterns.
Web page editors
Whether you're building a website from scratch with a WYSIWYG editor or fine-tuning the code on an existing website with an HTML editor, web design software will probably get a lot of use in your hands. If you have the technical chops to hand code your websites, that's ideal, but if not, or if you just don't want to, here are several options:
Kompozer - With a WYSIWYG editor in one tab, raw HTML in the other, on-the-fly editing, and built-in FTP, Kompozer makes creating and editing web pages a breeze.
Google Web Designer - Build HTML5-based designs and motion graphics that can run on any device without writing any code! (If you want to get your hands dirty, you can edit all HTML and CSS by hand.)
Expression Web - Microsoft offers another free web page editor, one that has made significant improvements since that abomination called FrontPage.
Favicon Generator - A truly polished website needs consistent branding throughout, and that means all the little details, including a favicon—that tiny little image that sits in the tab or bookmarks. Just upload an image file, such as your logo, and this handy tool will spit out the .ico files you need (see the sample link tag after this list).
Web Developer Toolbar - This browser toolbar is available for Firefox and Chrome, and helps you troubleshoot your website and even test it at various screen sizes.
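Once you have the generated .ico file, wiring it up takes one line. A minimal sketch, assuming you upload the file to your site root (the path is a placeholder):

    <!-- Reference the generated favicon from the head of your pages.
         Simply placing favicon.ico in the site root also works for most browsers. -->
    <link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">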
Infographics are still an effective method to earn social shares and links, and they are a great way to present a lot of data-rich information, but they can be a pain to create. Here are several tools to simplify the process that might even be better (and easier) than traditional design software.
Infogram - Build beautiful data-driven infographics in just three steps with this free tool.
Piktochart - With a simple point and click editor and over 4,000 graphics, icons, and template files, Piktochart makes it easy to create infographics that look exactly the way you want.
Easel.ly - Loaded with tons of creative templates and an easy-to-use interface, this is another powerful tool to create your own stunning infographics.
Venngage - This drag and drop interface provides all the charts, maps, icons and templates you'll need to design attention-grabbing infographics.
Vizualize.me - Turn your boring resume into a unique visual expression of your skills and experience to stand out from the crowd.
If you are in a saturated market, or have a great idea you are certain will be a success, then it may make sense to splash out for a custom-designed graphic, but in a less competitive market some of the above quick-n-easy tools can still be remarkably effective.
Google Charts is a great way to create all sorts of charts, and the best part is that you can create them on the fly by passing variables in the URL.
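For instance, the URL-driven Image Charts flavor of Google Charts builds the entire chart from query-string parameters, so you can drop one into a page with nothing but an img tag. A minimal sketch (the data values and labels are made up, and note Google has since deprecated this URL-based API in favor of its JavaScript charts library):

    <!-- cht = chart type, chs = width x height, chd = data, chl = labels -->
    <img src="https://chart.googleapis.com/chart?cht=p3&chs=400x150&chd=t:60,25,15&chl=Organic|Paid|Referral"
         alt="Traffic sources pie chart">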
Today you have plenty of options when it comes to font choices, so please stop using Arial, and for the love of all that is good, never use Comic Sans or I will hunt you down. You can choose from thousands of free fonts, so it's easy to pick one that fits your project perfectly.
Typegenius - Choosing the perfect font combo can be tough, but Typegenius makes it easy. Just pick a starter font from the drop-down list and the site will recommend fonts that pair well with it.
Google Fonts - I recommend embedding Google fonts instead of hosting them on your own server because they load more quickly and there is a chance they're already cached on visitors' computers (a sample embed snippet follows this list).
Font Awesome - This is an awesome (hence the name) way to add all sorts of scalable icons without a load of extra HTTP requests. Simply load one font for access to 519 icons that can be colored, scaled, and styled with CSS.
DaFont - Download and install these fonts (.ttf or .otf formats) for designing documents or images on your computer.
What the Font - If you've ever experienced the rage-inducing task of figuring out what font was used when your client only has a 72dpi jpg and no idea how to track down their previous designer, then this is the tool for you. Just upload your image and it goes to work figuring what font it is.
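To make the Google Fonts and Font Awesome items above more concrete, here is a minimal sketch of what the embeds typically look like. The font family, Font Awesome version, and CDN URL are placeholder choices, so copy the exact snippets from each service rather than from here:

    <!-- Google Fonts: served from Google's CDN instead of your own server.
         "Open Sans" is just an example family. -->
    <link href="https://fonts.googleapis.com/css?family=Open+Sans:400,700" rel="stylesheet">

    <!-- Font Awesome: one stylesheet request covers the whole icon set. -->
    <link href="https://maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css" rel="stylesheet">

    <style>
      body { font-family: "Open Sans", Arial, sans-serif; }
      /* Icons are just text, so plain CSS handles color and size */
      .social-icon { color: #3b5998; font-size: 24px; }
    </style>

    <i class="fa fa-facebook social-icon"></i> Share this post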
Social media can multiply your website's exposure exponentially, but it takes a lot of work. From branding profiles on each network to crafting engaging visual content your fans will share, you'll have to create a lot of graphics to feed the beast. Doing that manually, the old-fashioned way, is tedious and slow, so I recommend these tools to speed up your workflow.
Easy Cover Maker - Stop wasting time trying to position your cover and profile photo for your Facebook and Google+ page. This tool lets you drag everything into position in one handy interactive window, then download the image files.
Quotes Cover - Just select a quote or enter your own text, apply various effects for your own unique style, and download eye-catching pictures perfect for social media. It even creates the perfect dimensions based on how you intend to use it.
Chisel - This tool has the most user-friendly interface and tons of great images and fonts to create the exact message you want to share.
Recite This - There are plenty of images and fonts available, but the downside is you have to scroll through images one at a time, and fonts are selected randomly.
Jing - From the makers of Camtasia, this free program gives you the ability to capture images or video (up to 5 minutes long) of your computer screen, then share it with the click of a button.
Social Kit - Create cover images, profile pictures, and ad banners for Facebook, Twitter, Google+, and YouTube with this free, up-to-date Photoshop plugin.
Social media image size guide - The folks over at Sprout Social created (and maintain) this handy and comprehensive Google doc listing the image sizes for all major social media networks, and since it's a shared document, you can have Google notify you anytime it's updated!
Instead of wasting time searching for the perfect meme, why not just create your own?
Powerful photos can mean the difference between a dry post that visitors ignore and one that entices them to read more. The good news is you don't have to take your own photos or spend a fortune on stock images because there are several free and low-cost options available.
Unsplash - These are not your typical cheesy stock photos; they lean more towards the artistic side. New photos are uploaded every day and they're all 100% free.
StockVault - With over 54 thousand free images available, both artistic and corporate-style, you should be able to find the perfect photo for just about any project.
Dreamstime & iStockPhoto - Both of these sites give you the option of a subscription model or pay-as-you-go credits. Many images on one are available on the other, but I've found great images that were only on one of the two sites, so it's worthwhile to check both.
Even the best designers hit a wall, creatively speaking, so it helps to look for inspiration. These sites curate the best designs around and are updated regularly, so you'll find plenty of fresh ideas for your project.
Since your days are filled with keyword research, content development, link building, and other SEO-related tasks, you probably don't have time to stay up-to-date on the latest design trends and techniques. No worries—with these websites, you'll be able to find a tutorial to walk you through just about any design challenge.
CSS-Tricks - Whenever I have a CSS question, I always slap “css tricks” on the end of my search because Chris Coyier has the most detailed, yet easy-to-understand tutorials on damn near every scenario you could imagine.
Tuts+ - Learn everything about graphic design, web design, programming, and more with a growing library of articles and tutorials.
Smashing Magazine - This is probably one of the most comprehensive web design resources you'll find anywhere, going wide and deep on every aspect of web design.
About the Author
Jeremy Knauff is the founder of Spartan Media, a proud father, husband, and US Marine Corps veteran. He has spent over 15 years helping everyone from small businesses to Fortune 500 companies make their mark online, and now he's busy building his own media empire. You can follow Spartan Media on Twitter and Facebook.
If you look just at your revenue numbers as a publisher, it is easy to believe mobile is of limited importance. In our last post I mentioned an AdAge article highlighting how the New York Times was generating over half their traffic from mobile with it accounting for about 10% of their online ad revenues.
Large ad networks (Google, Bing, Facebook, Twitter, Yahoo!, etc.) can monetize mobile *much* better than other publishers can because they aggressively blend the ads right into the social stream or search results, causing them to have a much higher CTR than ads on the desktop. Bing recently confirmed the same trend RKG has highlighted about Google's mobile ad clicks:
While mobile continues to be an area of rapid and steady growth, we are pleased to report that the Yahoo Bing Network’s search and click volumes from smart phone users have more than doubled year-over-year. Click volumes have generally outpaced growth in search volume
Those ad networks want other publishers to make their sites mobile friendly for a couple reasons...
If the downstream sites are mobile friendly, then users are more likely to go back to the central ad / search / social networks more often & be more willing to click out on the ads from them.
If mobile is emphasized in importance, then those who are critical of the value of the channel may eat some of the blame for relative poor performance, particularly if they haven't spent resources optimizing user experience on the channel.
Starting April 21, we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results.
In the past Google would hint that they were working to clean up link spam or content farms or website hacking and so on. In some cases announcing such efforts was done to try to discourage investment in the associated strategies, but it is quite rare that Google pre-announces an algorithmic shift which they state will be significant & they put an exact date on it.
I wouldn't recommend waiting until the last day to implement the design changes, as it will take Google time to re-crawl your site & recognize if the design is mobile friendly.
Those who ignore the warning might be in for significant pain.
Some sites which were hit by Panda saw a devastating 50% to 70% decline in search traffic, but given how small mobile phone screen sizes are, even ranking just a couple spots lower could cause an 80% or 90% decline in mobile search traffic.
Another related issue referenced in the above post was tying in-app content to mobile search personalization:
Starting today, we will begin to use information from indexed apps as a factor in ranking for signed-in users who have the app installed. As a result, we may now surface content from indexed apps more prominently in search. To find out how to implement App Indexing, which allows us to surface this information in search results, have a look at our step-by-step guide on the developer site.
Some sites have a separate m. version for mobile visitors, while other sites keep consistent URLs & employ responsive design. With the m. approach, on the regular version of the site (say, www.seobook.com) a webmaster adds an alternate reference to the mobile version in the head section of the document.
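A minimal sketch of that annotation, assuming the mobile pages live on an m. subdomain; the URLs and the max-width value here are placeholders:

    <!-- On the desktop page (www.seobook.com/some-page/) -->
    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="http://m.seobook.com/some-page/">

    <!-- On the matching mobile page, Google also recommends a canonical
         reference back to the desktop URL so the two versions are tied together -->
    <link rel="canonical" href="http://www.seobook.com/some-page/">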
With the above sort of code in place, Google would rank the full version of the site on desktop searches & the m. version in mobile search results.
3 or 4 years ago it was a toss up as to which of these 2 options would win, but over time it appears the responsive design option is more likely to win out.
Here are a couple reasons responsive is likely to win out as a better solution:
If people share a mobile-friendly URL on Twitter, Facebook or other social networks & the URL changes, then when someone on a desktop computer clicks the shared m. version of the page, with its fewer ad units & less content, the publisher is providing a worse user experience & losing out on the incremental monetization they would have achieved with the additional ad units.
While some search engines and social networks might be good at consolidating the performance of the same piece of content across multiple URL versions, some of them will periodically mess it up. That in turn will lead in some cases to lower rankings in search results or lower virality of content on social networks.
Over time there is an increasing blur between phones and tablets with phablets. Some high pixel density screens on cross over devices may appear large in terms of pixel count, but still not have particularly large screens, making it easy for users to misclick on the interface.
When Bing gave their best practices for mobile, they stated: "Ideally, there shouldn’t be a difference between the “mobile-friendly” URL and the “desktop” URL: the site would automatically adjust to the device — content, layout, and all." In that post Bing shows some examples of m. versions of sites ranking in their mobile search results, however for smaller & lesser known sites they may not rank the m. version the way they do for Yelp or Wikipedia, which means that even if you optimize the m. version of the site to a great degree, that isn't the version all mobile searchers will see. Back in 2012 Bing also stated their preference for a single version of a URL, in part based on lowering crawl traffic & consolidation of ranking signals.
In addition to responsive web design & separate mobile friendly URLs, a third configuration option is dynamic serving, which uses the Vary HTTP header to detect the user-agent & use that to drive the layout. For large, complex & high-traffic sites (and sites with numerous staff programmers & designers) dynamic serving is perhaps the best optimization solution because you can optimize the images and code to lower bandwidth costs and response times. Most smaller sites will likely rely on responsive design rather than dynamic serving, in large part because it is quicker & cheaper to implement, and most are not running sites large enough to where the incremental bandwidth provides a significant incremental expense to their business.
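As a sketch, the key signal in dynamic serving is the response header itself; the page keeps a single URL, and the server tells crawlers and caches that the HTML it returns varies by device:

    HTTP/1.1 200 OK
    Content-Type: text/html
    Vary: User-Agent

The Vary: User-Agent line tells Googlebot (and any caches in between) that the markup at this URL differs depending on the requesting device, so the smartphone crawler knows to fetch the mobile version separately rather than assuming it matches the desktop copy.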
Solutions for Quickly Implementing Responsive Design
New Theme / Design
If your site hasn't been updated in years you might be surprised at what you find available on sites like ThemeForest for quite reasonable prices. Many of the options are responsive out of the gate & look good with a day or two of customization. Theme subscription services like WooThemes and Elegant Themes also have responsive options.
Some of the PSD to HTML conversion services like PSD 2 HTML, HTML Slicemate & XHTML Chop offer responsive design conversion of existing HTML sites in as little as a day or two, though you will likely need to make at least a few minor changes when you put the designs live to compensate for things like third-party ad units.
If you have an existing WordPress theme, see if you can zip it up and send it to them; otherwise they may build your new theme as a child theme of a default theme like Twenty Fifteen. If you are struggling to get them to convert your WordPress theme directly, another option is to have them do a static HTML file conversion (instead of a WordPress conversion) and then feed that through a theme creation tool like Themespress.
For simple hand-rolled designs there are a variety of grid generator tools, which can make it reasonably easy to replace some old-school table-based designs with divs. Many of the themes for sale in marketplaces like ThemeForest also use a multi-column div grid system.
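As a rough sketch of what those grid systems boil down to, here is a minimal two-column layout that stacks on small screens; the class names and the breakpoint are arbitrary:

    <style>
      .row:after { content: ""; display: table; clear: both; }
      .col { float: left; width: 50%; padding: 0 10px; box-sizing: border-box; }

      /* Stack the columns on narrow screens */
      @media (max-width: 640px) {
        .col { float: none; width: 100%; }
      }
    </style>

    <div class="row">
      <div class="col">Main content</div>
      <div class="col">Sidebar</div>
    </div>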
Other Things to Look Out For
Third Party Plug-ins & Ad Code Gotchas
Google allows webmasters to alter the ad calls on their mobile responsive AdSense ad units to show different-sized ad units at different screen sizes & to skip showing some ad units on smaller screens. A sketch of that approach follows.
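Here is a rough sketch of the media-query approach Google documents for modifying responsive ad code. The publisher and slot IDs are placeholders, so copy the exact snippet from your own AdSense account and Google's current documentation rather than from here:

    <style>
      .adslot-top { display: inline-block; width: 320px; height: 50px; }
      /* Skip this unit entirely on very small screens */
      @media (max-width: 319px) { .adslot-top { display: none; } }
      /* Serve larger units as the viewport grows */
      @media (min-width: 500px) { .adslot-top { width: 468px; height: 60px; } }
      @media (min-width: 800px) { .adslot-top { width: 728px; height: 90px; } }
    </style>

    <script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
    <ins class="adsbygoogle adslot-top"
         data-ad-client="ca-pub-XXXXXXXXXXXXXXXX"
         data-ad-slot="XXXXXXXXXX"></ins>
    <script>(adsbygoogle = window.adsbygoogle || []).push({});</script>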
Back up your old site before putting the new site live.
For static HTML sites or sites with PHP or SHTML includes & such...
Download a copy of your existing site to local.
Rename that folder to something like sitename.com-OLDVERSION
Upload the sitename.com-OLDVERSION folder to your server. If anything goes drastically wrong during your conversion process, you can rename the new site design to something like sitename.com-HOSED, then rename the sitename.com-OLDVERSION folder back to sitename.com to quickly restore the site.
Download your site to local again.
Ensure your new site design is using a different CSS folder or CSS filename such that the old and new versions of the design can be live at the same time while you are editing the site.
Create a test file with the responsive design on your site & test that page until things work well enough.
Once you have the general "what needs to change in each file" down, use find & replace to bulk edit the remaining files and make them responsive.
Use a tool like FileZilla to quickly bulk upload the files.
Look through key pages, and if there are only a few minor errors, fix them and re-upload. If things are majorly screwed up, revert to the old design and schedule a do-over on the upgrade.
If you have a decently high traffic site, it might make sense to schedule the above process for late Friday night or an off hour on the weekend, such that if anything goes astray you have fewer people seeing the errors while you frantically rush to fix them. :)
If you want to view your top pages you could export that data from your web analytics to verify all those pages look good. If you wanted to view every page of your site 1 at a time after the change, you could use a tool like Xenu Link Sleuth or Screaming Frog SEO Spider to crawl your site & export a list of URLs into a spreadsheet. Then you could take the URLs from that spreadsheet and put them a chunk at a time into a tool like URL Opener.
If you have little faith in the above test-it-live "methodology" & would prefer a slower & lower-stress approach, you could create a test site on another domain name for testing purposes. Just be sure to keep the test site out of the index while testing: block crawling via robots.txt, add noindex meta tags, or password protect access to the site. When you get things worked out on it, make sure your internal links are referencing the correct domain name, and that you have removed any robots.txt block, noindex tags, or password protection.
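A minimal sketch of keeping a hypothetical test domain out of Google; the robots.txt blocks crawling, while the meta robots tag (or HTTP authentication) is the more reliable way to keep pages out of the index entirely:

    # robots.txt at the root of the test domain
    User-agent: *
    Disallow: /

    <!-- And/or in the head of each test page -->
    <meta name="robots" content="noindex, nofollow">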
For a site with a CMS the above process is basically the same, except you might need to create the backup differently. If you are uploading a WordPress or Drupal theme, change the theme name at least slightly so the old and new designs can be installed at the same time, letting you quickly switch back to the old design if you need to.
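For a WordPress theme, that rename is just the Theme Name header at the top of the copied theme's style.css. A sketch, assuming you duplicated the existing theme folder (names are placeholders):

    /*
    Theme Name: MyTheme Responsive
    Description: Responsive copy of the existing theme, kept alongside the original
    Version: 1.0
    */
    /* This is the style.css of the duplicated theme folder,
       e.g. /wp-content/themes/mytheme-responsive/ */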
If you have a mixed site with WordPress & static files or such, then it might make sense to test changing the static files first, get those working well, & then create a WordPress theme after that.
Google systems have tested 2,790 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 2,790 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.
When Google introduced the knowledge graph one of their underlying messages behind it was "you can't copyright facts."
Facts are like domain names or links or pictures or anything else in terms of being a layer of information which can be highly valued or devalued through commoditization.
When you search for love quotes, Google pulls one into their site & then provides another "try again" link.
Since quotes mostly come from third parties they are not owned by BrainyQuote and other similar sites. But here is the thing: if those other sites which pay to organize and verify such collections have their economics sufficiently undermined, then they go away & Google isn't able to pull the quotes into the search results either.
The same is true with song lyrics. If you are one of the few sites paying to license the lyrics & then Google puts lyrics above the search results, then the economics which justified the investment in licensing might not back out & you will likely go bankrupt. That bankruptcy wouldn't be the result of being a spammer trying to work an angle, but rather because you had a higher cost structure from trying to do things the right way.
Google has also done the above quote-like "action item" types of onebox listings in other areas, like software downloads.
Where there are multiple versions of the software available, Google is arbitrarily selecting the download page, even though a software publisher might have a parallel SaaS option or other complex funnels based on a person's location or status as a student or such.
Mix in Google allowing advertisers to advertise bundled adware, and it becomes quite easy for Google to gum up the sales process and undermine existing brand equity by sending users to the wrong location. Here's a blog post from Malwarebytes referencing
their software being advertised on their brand term in Google via AdWords ads that infringed their trademark and bundled the software with adware
numerous user complaints they received about the bundleware
required legal actions they took to take the bundler offline
The company used this cash to build more business, spending more than $1 million through at least seven separate advertising accounts with Google.
The ads themselves said things like “McAfee Support - Call +1-855-[redacted US phone number]” and pointed to domains like mcafee-support.pccare247.com.
One PCCare247 ad account with Google produced 71.7 million impressions; another generated 12.4 million more. According to records obtained by the FTC, these combined campaigns generated 1.5 million clicks
Google started the knowledge graph & onebox listings on some utterly banal topics which were easy for a computer to get right, though their ambitions vastly exceed the starting point. The starting point was done where it was because it was low-risk and easy.
When Google's evolving search technology was recently covered on Medium by Steven Levy he shared that today the Knowledge Graph appears on roughly 25% of search queries and that...
Google is also trying to figure out how to deliver more complex results — to go beyond quick facts and deliver more subjective, fuzzier associations. “People aren’t interested in just facts,” she says. “They are interested in subjective things like whether or not the television shows are well-written. Things that could really help take the Knowledge Graph to the next level.”
Even as the people who routinely shill for Google parrot the "you can't copyright facts" mantra, Google is telling you they have every intent of expanding far beyond it. “I see search as the interface to all computing,” says Singhal.
Even if You Have Copyright...
What makes the "you can't copyright facts" line so particularly disingenuous was Google's support of piracy when they purchased YouTube:
cofounder Jawed Karim favored “lax” copyright policy to make YouTube “huge” and hence “an excellent acquisition target.” YouTube at one point added a “report copyrighted content” button to let users report infringements, but removed the button when it realized how many users were reporting unauthorized videos. Meanwhile, YouTube managers intentionally retained infringing videos they knew were on the site, remarking “we should KEEP …. comedy clips (Conan, Leno, etc.) [and] music videos” despite having licenses for none of these. (In an email rebuke, cofounder Steve Chen admonished: “Jawed, please stop putting stolen videos on the site. We’re going to have a tough time defending the fact that we’re not liable for the copyrighted material on the site because we didn’t put it up when one of the co-founders is blatantly stealing content from other sites and trying to get everyone to see it.”)
To some, the separation of branding makes YouTube distinct and separate from Google search, but that wasn't so much the case when many sites lost their video thumbnails and YouTube saw larger thumbnails on many of their listings in Google. In the above Steven Levy article he wrote: "one of the highest ranked general categories was a desire to know “how to” perform certain tasks. So Google made it easier to surface how-to videos from YouTube and other sources, featuring them more prominently in search."
Altruism vs Disruption for the Sake of it
Whenever Google implements a new feature they can choose not to monetize it so as to claim they are benevolent and doing it for users without commercial interest. But that same "unmonetized & for users" claim was also used with their shopping search vertical until one day it went paid. Google claimed paid inclusion was evil right up until the day it claimed paid inclusion was a necessity to improve user experience.
There was literally no transition period.
Many of the "informational" knowledge block listings contain affiliate links pointing into Google Play or other sites. Those affiliate ads were only labeled as advertisements after the FTC complained about inconsistent ad labeling in search results.
starting in the next few days, when you ask Google about common health conditions, you’ll start getting relevant medical facts right up front from the Knowledge Graph. We’ll show you typical symptoms and treatments, as well as details on how common the condition is—whether it’s critical, if it’s contagious, what ages it affects, and more. For some conditions you’ll also see high-quality illustrations from licensed medical illustrators. Once you get this basic info from Google, you should find it easier to do more research on other sites around the web, or know what questions to ask your doctor.
Google's links to the Mayo Clinic in their knowledge graph are, once again, a light gray font.
In case you didn't find enough background in Google's announcement article, Greg Sterling shared more of Google's views here. A couple notable quotes from Greg...
Cynics might say that Google is moving into yet another vertical content area and usurping third-party publishers. I don’t believe this is the case. Google isn’t going to be monetizing these queries; it appears to be genuinely motivated by a desire to show higher-quality health information and educate users accordingly.
Google doesn't need to directly monetize it to impact the economics of the industry. If they shift a greater share of clicks through AdWords then that will increase competition and ad prices in that category while lowering investment in SEO.
If this is really done out of benevolence, it will appear *above* the AdWords ads in the search results, unlike almost every type of onebox or knowledge graph result Google offers.
If it is fair for him to label everyone who disagrees with his thesis as a cynic then it is of course fair for those "cynics" to label Greg Sterling as a shill.
Google told me that it hopes this initiative will help motivate the improvement of health content across the internet.
By defunding and displacing something they don't improve its quality. Rather they force the associated entities to cut their costs to try to make the numbers work.
If their traffic drops and they don't do more with less, then...
their margins will fall
growth slows (or they may even shrink)
their stock price will tank
management will get fired & replaced, and/or they will get taken private by private equity investors, and/or they will need to do some "bet the company" moves to find growth elsewhere (and hope Google doesn't enter that parallel area anytime soon)
Things get monetized directly, monetized indirectly, or they disappear.
Some of the more hated aspects of online publishing (headline bait, idiotic correlations out of context, pagination, slideshows, popups, fly-in ad units, auto-play videos, full page ad wraps, huge ads eating most of the above-the-fold real estate, integration of terrible native ad units promoting junk offers with shocking headline bait, content scraping answer farms, blending unvetted user generated content with house editorial, partnering with content farms to create subdomains on trusted blue chip sites, using Narrative Science or Automated Insights to auto-generate content, etc.) are not done because online publishers want to be jackasses, but because it is hard to make the numbers work in a competitive environment.
Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.
There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:
we were REALLY wrong yesterday
we are REALLY wrong today
Any change or disruption is easy to justify so long as you are not the one facing the consequences:
"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun
Monopoly Marketshare in a Flash
Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.
Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).
Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.
Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.
Anytime anyone using a browser other than Chrome has a Flash security update they need to opt out of the bundleware, or they end up installing Google Chrome as their default web browser, which is the primary reason Firefox marketshare is in decline.
Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.
In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.
In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.
Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.
Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.
Locking Down The Ecosystem
And Chrome is easily the most locked down browser out there.
While Google relies on bundling their toolbar & browser in updates to Flash and other plugins, they require an opposite strategy for anyone distributing Chrome plugins. Chrome plugins "must have a single purpose that is narrow and easy-to-understand."
Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.
I am frustrated @JohnMu saying that it will not cost CTR. Either Google lied about the increase in CTR with photos, or they're lying now. — Rand Fishkin (@randfish) June 25, 2014
The Right to Be Forgotten
This brings us back to the current snafu with the "right to be forgotten" in Europe.
Some have looked at the EU policy and compared it to state-run censorship in China.
Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."
Sorry About That Incidental Deletion From the Web...
David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:
In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.
Despite Google's great power they do make mistakes. And when they do, people lose their jobs.
People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years & they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.
As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.
MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.
The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.
If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.
Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.
Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.
I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.
There are entire sectors of the offline economy being reshaped by Google policies.
When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."
John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?
When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.
"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page
I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."
A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & leverage their increased level of trust to increase their profit margins through algorithmic journalism.
Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:
We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.
And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:
you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.
To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:
"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.
The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.
Last year Google dictated press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was a greater emphasis on manual editorial review:
Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:
Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
Overuse of keywords and/or links within the message.
So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.
That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...
The stock market had a flash crash today after someone hacked the AP account & made a fake announcement about bombs going off at the White House. Recently Twitter's search functionality has grown so inundated with spam that I don't even look at the brand related searches much anymore. While you can block individual users, it doesn't block them from showing up in search results, so there are various affiliate bots that spam just about any semi-branded search.
"We [YouTube] can't review every submission, so basically the crowd marks it if it is a problem post publication."
"You have a different model, right. You require human editors." on Wikileaks vs YouTube
We would post editorial content more often, but we are sort of debating opening up a social platform so that we can focus on the user without having to bear any editorial costs until after the fact. Profit margins are apparently better that way.
As Google drives smaller sites out of the index & ranks junk content based on no factor other than it being on a trusted site, they create the incentive for spammers to ride on the social platforms.
All aboard. And try not to step on any toes!
When I do some product related searches (eg: brand name & shoe model) almost the whole result set for the first 5 or 10 pages is garbage.
Facebook Notes & pages
subdomains off of various other free hosts
It comes as no surprise that Eric Schmidt fundamentally believes that "disinformation becomes so easy to generate because of, because complexity overwhelms knowledge, that it is in the people's interest, if you will over the next decade, to build disinformation generating systems, this is true for corporations, for marketing, for governments and so on."
Of course he made no mention of Google's role in the above problem. When they are not issuing threats & penalties to smaller independent webmasters, they are just a passive omniscient observer.
With all these business models, there is a core model of building up a solid stream of usage data & then tricking users or looking the other way when things get out of hand. Consider Google's Lane Shackleton's tips on YouTube:
"Search is a way for a user to explicitly call out the content that they want. If a friend told me about an Audi ad, then I might go seek that out through search. It’s a strong signal of intent, and it’s a strong signal that someone found out about that content in some way."
"you blur the lines between advertising and content. That’s really what we’ve been advocating our advertisers to do."
"you’re making thoughtful content for a purpose. So if you want something to get shared a lot, you may skew towards doing something like a prank"
Harlem Shake & Idiocracy: the innovative way forward to improve humanity.
Hubert said that if people file a reconsideration request, they should “get a clear answer” about what’s wrong. There’s a bit of a Catch-22 there. How can you file a reconsideration request showing you’ve removed the bad stuff, if the only way you can get a clear answer about the bad stuff to remove is to file a reconsideration request?
The answer is that technically, you can request reconsideration without removing anything. The form doesn’t actually require you to remove bad stuff. That’s just the general advice you’ll often hear Google say, when it comes to making such a request. That’s also good advice if you do know what’s wrong.
But if you’re confused and need more advice, you can file the form asking for specifics about what needs to be removed. Then have patience
In the past I referenced that there is no difference between a formal white list & overly-aggressive penalties coupled with loose exemptions for select parties.
The moral of the story is that if you are going to spam, you should make it look like a user of your site did it; that way you can plead ignorance and benefit from the same loose exemptions.
Native advertising presents opportunities for SEOs to boost their link building, particularly those who favor paid link strategies.
What Is Native Advertising?
Native advertising is the marketing industry's new buzzword for... well, it depends who you ask.
Native advertising can't just be about the creative that fills an advertising space. Native advertising must be intrinsically connected to the format that fits the user's unique experience. There's something philosophically beautiful about that in terms of what great advertising should (and could) be. But first, we need to all speak the same language around "native advertising."
Native advertising is often defined as content that seamlessly integrates with a site, as opposed to interruption media, such as pre-rolls on YouTube videos, or advertising that sits in a box off to the side of the main content.
Some high-profile examples of native advertising include Facebook Sponsored Stories; Twitter's Promoted Tweets; promoted videos on YouTube, Tumblr and Forbes; promoted articles like Gawker's Sponsored Posts and BuzzFeed's Featured Partner content; Sponsored Listings on Yelp; promoted images on Cheezburger; and promoted playlists on Spotify and Radio.
One interesting observation is that AdWords and AdSense are frequently cited as being examples of native advertising. Hold that thought.
Why Native Advertising?
The publishing industry is desperate to latch onto any potential lifeline as ad rates plummet.
Analysts say the slowdown is being caused by the huge expansion in the amount of online advertising space as companies who manage this emerge to dominate the space. In short there’s just too many ad slots chasing ads that are growing, but at a rate slower than the creation of potential ad slots.
This means the chances are dimming that online ad spending would gradually grow to make up for some of the falls in analogue spending in print. ....staff numbers and the attendant costs of doing business have to be slashed heavily to account for the lower yield and revenue from online ads
And why might there be more slots than there are advertisers?
“The model of ‘boxes and rectangles’ – the display banner – is failing to fully support traditional ‘content’ sites beyond a handful of exceptions,” wrote Federated Media founder John Battelle in a recent blog post. He explained that the next generation of native ads on social networks and strength of Google Adwords make direct sales more competitive, and that ad agencies must evolve with the growing trend of advertisers who want more social/conversational ad campaigns.
Advertisers aren't seeing enough return from the advertising in order for them to want to grab the many slots that are available. And they are lowering their bids to make up for issues with publishing fraud. The promise of native advertising is that this type of advertising reaches real users, and will grab and hold viewers' attention for longer.
Facebook is still largely centered around interactions with people one knows offline, making the appearance of marketing messages especially jarring. This is particularly true in mobile, where Sponsored Stories take up a much larger portion of the screen relative to desktop. Facebook did not handle the mobile rollout very gracefully, either. Rather than easing users into the change, they appeared seemingly overnight, and took up the first few posts in the newsfeed. The content itself is also hit or miss – actions taken by distant friends with dissimilar interests are often used as the basis for targeting Sponsored Stories.
If you’re planning on offering native advertising yourself, you may need to walk a fine line. Bloggers and other publishers who are getting paid but don’t declare so risk alienating their audience and destroying their reputation.
Some good ways of addressing this issue are policy pages that state the author has affiliate relationships with various providers, and this is a means of paying for the site, and does not affect editorial. Whether it’s true or not is up to the audience to decide, but such transparency up-front certainly helps. If a lot of free content is mixed in with native content, and audiences dislike it enough, then it might pave the way for more paid content and paywalls.
Just like any advertising content, native advertising may become less effective over time if the audience learns to screen it out. One advantage for the SEO is that it doesn't matter so much, so long as they get the link.
Forbes Insights and Sharethrough today announced the results of a brand study to assess adoption trends related to native video advertising that included senior executives from leading brands such as Intel, JetBlue, Heineken and Honda. The study shows that more than half of large brands are now using custom brand videos in their marketing, and when it comes to distribution, most favor “native advertising” approaches where content is visually integrated into the organic site experience, as opposed to running in standard display ad formats. The study also shows that the majority of marketers now prefer choice-based formats over interruptive formats.
Google’s Clamp-Down On Link Advertising
So, what’s the difference between advertorial and native content? Not much, on the face of it, except in one rather interesting respect. When it comes to native advertising, it’s often not obvious the post is sponsored.
The Atlantic, BuzzFeed and Gawker — are experimenting with new ad formats such as sponsored content or “native advertising,” as well as affiliate links. On Friday, Google engineer Matt Cutts reiterated a warning from the search giant that this kind of content has to be treated properly or Google will penalize the site that hosts it, in some cases severely.
If native advertising proves popular with publishers and advertisers, then it's going to compete with Google's business model. Businesses may spend less on AdWords and may replace AdSense with native advertising. It's no surprise, then, that Google may take a hostile line on it. However, publishers are poor, ad networks are rich, so perhaps it's time that publishers became ad networks.
When it comes to SEO, given Google’s warning shots, SEOs will either capitulate - and pretty much give up on paid links - or make more effort to blend seamlessly into the background.
Blurring The Lines
As Andrew Sullivan notes, the editorial thin blue line is looking rather “fuzzy”. It may even raise legal questions about misrepresentation. There has traditionally been a church and state divide between advertising and editorial, but as publishers get more desperate to survive, they’re going to go with whatever works. If native advertising works better than the alternatives, then publishers will use it. What choice have they got? Their industry is dying.
I have nothing but admiration for innovation in advertizing and creative revenue-generation online. Without it, journalism will die. But if advertorials become effectively indistinguishable from editorial, aren’t we in danger of destroying the village in order to save it?
Likewise, in order to compete in search results, a site must have links. It would be great if people linked freely and often based on objective merit, but we all know that is a hit and miss affair. If native advertising provides a means to acquire paid links that don't look like paid links, then that is what people will do.
And if their competitors are doing it, they’ll have little choice.
If you’re looking for a way to build paid links, then here is where the opportunity lies for SEOs.
The recent examples Google caught looked heavily advertorial. They were placed in bulk, and they would likely have been barely credible to a human reviewer, as they didn't read particularly well. Those I saw had an "auto-generated" quality to them.
The integration with editorial needs to be seamless and, if possible, the in-house editors should write the copy, or it should look like they did. Avoid generic and boilerplate approaches. The content should not be both generic and widely distributed; such a strategy is unlikely to pass Google's inspection.
Markets will spring up, if they haven’t already, whereby publications will offer editorial native advertising, link included. It would be difficult to tell if such a link was “paid for”, and certainly not algorithmically, unless the publisher specifically labelled it “advertising feature” or something similar.
Sure, this has been going on for years, but if a lot of high level publishers embrace something called "Native Advertising" then that sounds a lot more legitimate than "someone wants to pay for a link on our site". In marketing, it's all about the spin ;)
It could be a paid restaurant review on a restaurant review site, link included. For SEO purposes, the review doesn't even need to be overtly positive and glowing, so a high degree of editorial integrity could be maintained. This approach would suit a lot of review sites. For example: "we'll pay you to review our product, so long as you link to it, but you can still say whatever you like about it." The publisher's production cost is met in full, and they can maintain a high degree of editorial integrity. If Jennifer Lopez is in a new movie with some "hot" scene, then the movie's studio can pay AskMen to create a "top 10 sexiest moments" gallery that includes the movie at #9 & then advertise that feature across the web.
A DIY site could show their readers how to build a garden wall. The products could be from a sponsor, link included. Editorial integrity could be maintained, as the DIY site need not push or recommend those products like an advertorial would, but the sponsor still gets the link. The equivalent of product placement in movies.
News items can feature product placement without necessarily endorsing the products, link included - they already do this with syndicated press releases. Journalists often interview the local expert on a given topic, and this can include a link. If that news article is paid for by the link buyer, yet the link buyer doesn't have a say in editorial, then that deal will look attractive to publishers. It's just a slightly different spin on "brought to you by our sponsor". Currently, services like HARO & PR Leads help connect experts with journalists looking for story background. In the years to come, perhaps there will be similar services where people pay publications directly to be quoted.
I'm sure you can think of many other ideas. A lot of this isn't new; it's just a shiny new badge on something that was going on well before the web began. When it comes to SEO, the bar has been lifted on link building. Links from substandard content are less likely to pass Google's filters, so SEOs need to think more about ways to get quality content integrated in a more seamless way. It takes more time, and it's likely to be more costly, but this can be a good thing: it raises the bar on everyone else.
Those who don’t know the bar has been raised, or don’t put more effort in, will lose.
Low Level Of Compromise
Native advertising is a new spin on an old practice. However, it should be especially interesting to the SEO, because the SEO doesn't demand that the publisher compromise editorial to any significant degree, as pure advertorial would. The SEO only requires that a link be incorporated within a seamless, editorial-style piece.
If the SEO is paying for the piece to be written, that's going to look like a good deal to many publishers.
The web search giant, which is embroiled in a long-running row over the way it deals with pirated material, is considering the radical measure so that it can get rid of the root cause instead of having to change its own search results.
Executives want to stop websites more or less dedicated to offering links to pirated films, music and books from making money out of the illegal material. The plans, still in discussion, would also block funding to websites that do not respond to legal challenges, for example because they are offshore.
Last month Google announced a new format for their image search results, where they pull the image inline without sending the visitor onto the publisher website. At the same time they referenced some "phantom visitor" complaint from publishers to justify keeping the visitor on Google & highlighted how there were now more links to the image source. If publishers were concerned about the "phantom visitor problem" we wouldn't see so many crappy slideshow pageviews.
Google's leaked remote rater guidelines do mention rating an image lower in certain situations, such as where the author might want attribution for work they are routinely disintermediated from.
On Twitter a former Googler named Miguel Silvar wrote: "If you do SEO and decide to block Image Search just because it's bringing less traffic, you can stop calling yourself an SEO expert."
Many "experts" would claim that any exposure is good, even if you don't get credit for it. Many clients of said "experts" will end up bankrupt! Experts who suggest it is reasonable for content creators to be stripped of payment, traffic & attribution are at best conflicted.
As Google continues to win the game of inches of displacing the original sources, they don't even need you to mark up your content for them to extract their knowledge graph. Bill Slawski shared a video of Google's Andrew Hogue describing their mass data extraction effort: "It's never going to be 100% accurate. We're not even going to claim that it is 100% accurate. We are going to be lucky if we get 70% accuracy ... we are going to provide users with tools to correct the data."
If you as a publisher chose to auto-generate content at 70% accuracy, pumped it up to first-page rankings & then said "if people care they will fix it," Google would rightfully call you a spammer. If they do the same, it is knowledge, baby.
Google pays for default placement in Safari & Firefox. Former Google executives head AOL & Yahoo!. Google can thus push for new cultural norms that make Microsoft look like an oddball or outsider if they don't play the same game.
With so much interest and buzz around mobile and its impact on search, this recent study by the Harris Poll was telling and helpful from an SEO and link building standpoint.
Harris asked smartphone users about their habits and which device they used when performing certain online tasks like reading email and researching goods. They polled 2,400 adults, 991 of whom use a smartphone. Here are a handful of interesting results from the survey with the potential to influence SEO:
Tasks compared by device (percentage using a computer (desktop/laptop) vs. percentage using a smartphone):
Research goods and services
Read work emails
Send work emails
Read social media on sites/apps such as Facebook & Twitter
Share social media
The fact that 81% of the people polled use a computer to research or take a survey isn't surprising; both tasks are easier, visually and aesthetically, on a large screen. But 45% for mobile is not a number to dismiss. Both numbers reinforce several SEO points:
Keep your visual and written content separate so anyone using a smartphone can easily click through to what they want to find. There's a good case for building a presence on Pinterest or Flickr if you have a lot of visual products.
Keep producing descriptive, informative and up-to-date content for your website. (Don’t send it away!) Promote what you write through social media, email distribution lists and on your blogs, forums, etc.
Use a “social media” type press release when announcing new products and major content additions to your site.
For affiliate marketers: A growing number of shoppers use bricks and mortar stores as “showrooms” before going back online to make a purchase. Keep your best promotions and discounts on your site rather than on sites like Coupon Cabin. Create an app to alert people when new products and discounts are available.
Add RSS sign-up options on all pages, especially those with content and discounts.
Promote new content through an app as well!
If you use email in any way to build links, know that more people use computers than mobile devices to read and send work email.
If you are contacting people for content or link placement and doing it after business hours, know that more people will see your message on their smartphones. Keep the email short and to the point, and if you need to use images, link out to them rather than embedding them.
Keep in mind that most people using smartphones do so when they are on the move or after business hours. Either scenario means you need to hook their attention the second they open the email. Work hard to make subject lines pop and state your mission in the first sentence or two.
Based on the percentages shown here, people like their social media no matter what device they are on!
Include social media share elements on everything you publish (even PDFs)
Use niche social media sites as well as the big boys
Mix up the types of content you use; create contests for Twitter and polls on Facebook
Although this wasn’t included in the Harris Poll, the fact people are using their smartphone 45% of the time to research goods and services warrants a mention: add a click-to-call option and/or telephone number on all your mobile pages as well as links to your full website and email.
My name is Brandon. I have been with FindTheBest since 2010 (right after our launch), and I am really bummed you posted this Infographic without reaching out to our team. We don't scrape data. We have a 40 person+ product team that works very closely with manufacturers, companies, and professionals to create useful information in a free and fair playing field. We some times use whole government databases, but it takes hundreds-of-thousands of hours to produce this content. We have a product manager that owns up to all the content in their vertical and takes the creation and maintenance very seriously. If you have any questions for them about how a piece of content was created, you should go to our team page and shoot them a email. Users can edit almost any listing, and we spend a ton of time approving or rejecting those edits. We do work with large publishers (something I am really proud of), but we certainly do not publish the same exact content. We allow the publishers to customize and edit the data presentation (look, style, feel) but since the majority of the content we produce is the factual data, it probably does look a little similar. Should we change the data? Should we not share our awesome content with as many users as possible? Not sure I can trust the rest of your "facts", but great graphics!
I thought it was only fair that we aired his view on the main blog.
...but then that got me into doing a bit of research about FindTheBest...
In the past, when searching for an issue related to our TV, I saw a SERP that looked like this:
Those mashed sites were subdomains on trusted sites like VentureBeat & TechCrunch.
Graphically the comparison pages appear appealing, but how strong is the editorial?
How does Find The Best describe their offering?
In a VentureBeat post (a FindTheBest content syndication partner) FTB's CEO Kevin O’Connor was quoted as saying: “‘Human’ is dirty — it’s not scalable.”
Hmm. Is that at odds with the 40-person editorial research team claimed above? Let's dig in.
Looking at the top-listed categories on the FindTheBest homepage, I counted 497 different verticals. With 40 people on the editorial team, that would mean each person manages a dozen different verticals (if one doesn't count all the outreach and partnership building as part of editorial & ignores the parallel sites for death records, grave locations, FindTheCoupons, FindTheCompany & FindTheListing).
Google shows that they have indexed 35,000,000 pages from FindTheBest.com, so this would mean each employee has "curated" about 800,000 pages (which is at least 200,000 pages a year over the past 4 years). Assuming they work 200 days a year that means they ensure curation of at least 1,000 "high quality" pages per day (and this is just the stuff in Google's index on the main site...not including the stuff that is yet to be indexed, stuff indexed on 3rd party websites, or stuff indexed on FindTheCompanies.com, FindTheCoupons.com, FindTheListing, FindTheBest.es, FindTheBest.or.kr, or the death records or grave location sites).
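For anyone who wants to sanity-check that arithmetic, here's the same back-of-envelope calculation as a quick Python sketch. It only uses the figures already quoted above (35,000,000 indexed pages, 40 staff, roughly 4 years, ~200 working days a year); nothing here is new data.

indexed_pages = 35_000_000      # pages Google reports indexed from FindTheBest.com
team_size = 40                  # the claimed product/editorial team
years = 4                       # roughly how long the site has been running
workdays_per_year = 200         # a generous estimate of working days

pages_per_person = indexed_pages / team_size                              # 875,000
pages_per_person_per_year = pages_per_person / years                      # ~218,750
pages_per_person_per_day = pages_per_person_per_year / workdays_per_year  # ~1,094

print(f"{pages_per_person:,.0f} pages per person")
print(f"{pages_per_person_per_year:,.0f} pages per person per year")
print(f"{pages_per_person_per_day:,.0f} pages per person per working day")

Call it a thousand "curated" pages per person per working day, before counting anything outside the main site.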
Maybe I am still wrong to consider it a bulk scrape job. After all, it is not unreasonable to expect that a single person can edit 5,000 pages of high quality content daily.
Errr....then again...how many pages can you edit in a day?
Where they lost me, though, was the "facts" angle. Speaking of not trusting the rest of the "facts"... how crappy is the business information for SEO Book on FindTheBest, which states that our site launched in 2011, that we have $58,000 in sales, and that we are a book wholesaler?
I realize I am afforded the opportunity to work for free to fix the errors of the scrape job, but if a page is full of automated incorrect trash then maybe it shouldn't exist in the first place.
I am not saying that all pages on these sites are trash (some may be genuinely helpful), but I know that if I automated content to the extent FTB does & then mass-emailed other sites for syndication partnerships on the duplicate content (often full of incorrect information), Google would have burned it to the ground already. They likely benefit from their CEO having sold DoubleClick to Google in the past & are exempt from the guidelines & editorial discrimination that the independent webmaster must deal with.
One of the ways you can tell if a company really cares about their product is by seeing if they dogfood it themselves.
Out of curiosity, I looked up FindTheBest on their FindTheCompany site.
They double-list themselves and neither profile is filled out.
That is like having 2 sentences of text on your "about us" page surrounded by 3 AdSense blocks. :D
I think they should worry about fixing the grotesque errors before worrying about "sharing with as many people as possible" but maybe I am just old fashioned.
Certainly they took a different approach ... one that I am sure would get me burned if I tried it. A sampling of some partner sites...
analytics-software.businessknowhow.com "BusinessKnowHow ended the relationship with find the best as soon as we realized how spammy they were." - Janet Attard
we have seen search results where a search engine didn't robots.txt something out, or somebody takes a cookie cutter affiliate feed, they just warm it up and slap it out, there is no value add, there is no original content there and they say search results or some comparison shopping sites don't put a lot of work into making it a useful site. They don't add value. - Matt Cutts
That syndication partnership network also explains part of how FTB is able to get so many pages indexed by Google, as each of those syndication sources links back to FTB on (what I believe to be) every single page of the subdomains, and many of these subdomains are linked from sitewide sidebar or footer links on the PR7 & PR8 tech blogs.
And so the PageRank shall flow ;)
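To make that point concrete, here's a toy power-iteration PageRank over a made-up link graph: a few publisher articles all carry a sitewide footer link to a syndication subdomain, which links on to the partner site. This is purely an illustration under assumed numbers; the page names, graph, and scores have nothing to do with FTB's actual link profile.

# Simplified PageRank; dangling-node mass is simply dropped, which is fine for a toy example.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: every publisher article carries a footer link to the
# syndication subdomain, and the subdomain links on to the partner site.
links = {
    "publisher_article_1": ["syndication_subdomain", "publisher_article_2"],
    "publisher_article_2": ["syndication_subdomain", "publisher_article_3"],
    "publisher_article_3": ["syndication_subdomain", "publisher_article_1"],
    "syndication_subdomain": ["partner_site"],
    "partner_site": [],
}

for page, score in sorted(pagerank(links).items(), key=lambda x: -x[1]):
    print(f"{page:25s} {score:.3f}")

Run it and the subdomain and the partner site each end up with roughly double the score of any individual article, which is the whole appeal of sitewide footer placement.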
Hundreds of thousands of hours (e.g. 200,000+) for 40 people is 5,000 hours per person. Considering that there are on average 2,000 hours in a work year, this would imply each employee spent 2.5 full years of work on this single aspect of the job. And that is if one ignores the (hundreds of?) millions of content pages on other sites.
Here’s one reason to be excited: In its own small way, it combats the recent flood of crappy infographics. Most TechCrunch writers hate the infographics that show up in our inboxes— not because infographics have to be terrible, but because they’re often created by firms that are biased, have little expertise in the subject of the infographic, or both, so they pull random data from random sources to make their point.
Get that folks? TechCrunch hosting automated subdomains of syndicated content means less bad infographics. And more cat lives saved. Or something like that.
The gadget comparisons we built for TechCrunch are sticky and interactive resources comprised of thousands of SEO optimized pages. They help over 1 million visitors per month make informed decisions by providing accurate, clear and useful data.
SEO optimized pages? Hmm.
Your comparisons will include thousands of long-tail keywords and question/answer pages to ensure traffic is driven by a number of different search queries. Our proprietary Data Content Platform uses a mesh linking structure that maximizes the amount of pages indexed by search engines. Each month—mainly through organic search—our comparisons add millions of unique visitors to our partner’s websites.
If we expand the "view more" section at the footer of the page, what do we find?
Sorry the font is so small; the text needed to be reduced multiple sizes in order to fit on my extra-large monitor, and then reduced again to fit the width of our blog.
Each listing in a comparison has a number of associated questions created around the data we collect.
For example, we collect data on the battery life of the Apple iPad.
An algorithm creates the question “How long does the Apple iPad tablet battery last?” and answers it
So now we have bots asking themselves questions that they answer themselves & then stuffing that in the index as content?
Yeah, sounds like human-driven editorial.
After all, it's not like there are placeholder tokens on the auto-generated stuff...
Looks like I was wrong on that.
And automated "popular searches" pages? Nice!
As outrageous as the above is, they include undisclosed affiliate links in the content and provide badge-based "awards" for things like the best casual dating sites, to help build links into their site.
That in turn led to them getting a bunch of porn backlinks.
If you submit an article to an article directory and someone else picks it up & posts it to a sketchy site, you are a link spammer responsible for the actions of a third party.
But if you rate the best casual dating sites and get spammy porn links you are wonderful.