For those new to optimizing client sites, or those seeking a refresher, we thought we'd put together a guide to step you through it, along with some selected deeper reading on each topic area.
Every SEO has different ways of doing things, but we’ll cover the aspects that you’ll find common to most client projects.
The best rule I know about SEO is that there are few absolutes. Google is a black box, so complete data sets will never be available to you. That makes it difficult to pin down cause and effect, so there will always be a lot of experimentation and guesswork involved. If something works, keep doing it. If it doesn't, try something else until it does.
Many opportunities tend to present themselves in ways not covered by “the rules”. Many opportunities will be unique and specific to the client and market sector you happen to be working with, so it's a good idea to remain flexible and alert to new relationship and networking opportunities. SEO exists on the back of relationships between sites (links) and the ability to get your content remarked upon (networking).
When you work on a client site, you will most likely be dealing with a site that is already established, so it’s likely to have legacy issues. The other main challenge you’ll face is that you’re unlikely to have full control over the site, like you would if it were your own. You’ll need to convince other people of the merit of your ideas before you can implement them. Some of these people will be open to them, some will not, and some can be rather obstructive. So, the more solid data and sound business reasoning you provide, the better chance you have of convincing people.
The most important aspect of doing SEO for clients is not blinding them with technical alchemy, but helping them see how SEO provides genuine business value.
The first step in optimizing a client site is to create a high-level strategy.
"Study the past if you would define the future." - Confucius
You’re in discovery mode. Seek to understand everything you can about the client's business and their current position in the market. What is their history? Where are they now, and where do they want to be? Interview your client. They know their business better than you do, and they will likely be delighted when you take a deep interest in them.
What are they good at?
What are their top products or services?
What is the full range of their products or services?
Are they weak in any areas, especially against competitors?
Who are their competitors?
Who are their partners?
Is their market sector changing? If so, how? Can they think of ways in which this presents opportunities for them?
What keyword areas have worked well for them in the past? Performed poorly?
What are their aims? More traffic? More conversions? More reach? What would success look like to them?
Do they have other online advertising campaigns running? If so, what areas are these targeting? Can they be aligned with SEO?
Do they have offline presence and advertising campaigns? Again, what areas are these targeting and can they be aligned with SEO?
Some SEO consultants see their task as gaining more rankings under an ever-growing list of keywords. But ranking for more keywords, or getting more traffic, may not result in measurable business returns; it depends on the business and the marketing goals. Some businesses will benefit from homing in on specific opportunities that are already being targeted, while others will seek wider reach. This is why it’s important to understand the business goals and market sector first, then design the SEO campaign to support the goals and the environment.
This type of analysis also provides you with leverage when it comes to discussing specific rankings and competitor rankings. The SEO can’t be expected to wave a magic wand and place a client at the top of a category in which they enjoy no competitive advantage. Even if the SEO did manage to achieve this feat, the client may not see much in the way of return, as it’s easy for visitors to click other listings and compare offers.
Understand all you can about their market niche. Look for areas of opportunity, such as changing demand not being met by your client or competitors. Put yourself in their customers' shoes. Try to find customers and interview them. Listen to the language customers use. Go to the places where their customers hang out online. From the customers' language and needs, combined with the knowledge gleaned from interviewing the client, you can determine effective keywords and themes.
Document. Get it down in writing. The strategy will change over time, but you’ll have a baseline point of agreement outlining where the site is at now, and where you intend to take it. Getting buy-in early smooths the way for later on. Ensure that whatever strategy you adopt, it adds real, measurable value by being aligned with, and serving, the business goals. It’s on this basis the client will judge you, and maintain or expand your services in future.
Sites can be poorly organized, suffer from various technical issues, and miss keyword opportunities.
We need to quantify what is already there, and what’s not there.
Use a site crawler, such as Xenu Link Sleuth, Screaming Frog or other tools that will give you a list of URLs, title information, link information and other data.
Make a list of all broken links.
Make a list of all orphaned pages.
Make a list of all pages without titles.
Make a list of all pages with duplicate titles.
Make a list of pages with weak keyword alignment.
Retrieve robots.txt and check it by hand. It’s amazing how easy it is to disrupt crawling with a robots.txt file.
Broken links are a low-quality signal. Whether Google treats them as one is debatable, but users certainly do. If the client doesn't have one already, implement a system whereby broken links are checked on a regular basis. Orphaned pages are pages that have no links pointing to them. Those pages may be redundant, in which case they should be removed; otherwise, point inbound links at them so they can be crawled and have more chance of gaining rank. Page titles should be unique, aligned with keyword terms, and made attractive in order to gain the click. A link is more attractive if it speaks to a customer need. Carefully check robots.txt to ensure it’s not blocking areas of the site that need to be crawled.
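The list-making above is easy to script once you've exported your crawl. A minimal sketch, assuming you've flattened the crawler's output into records of URL, title, and inbound link count (the field names here are illustrative, not from any particular crawler):

```python
from collections import Counter

def audit_crawl(pages):
    """Flag common crawl issues: orphaned pages, missing and duplicate titles.

    `pages` is a list of dicts with illustrative keys: "url", "title",
    and "inlinks" (the count of internal links pointing at the page).
    """
    title_counts = Counter(p["title"] for p in pages if p["title"])
    return {
        "orphaned": [p["url"] for p in pages if p["inlinks"] == 0],
        "missing_title": [p["url"] for p in pages if not p["title"]],
        "duplicate_title": [p["url"] for p in pages
                            if p["title"] and title_counts[p["title"]] > 1],
    }

# Toy crawl export -- in practice this would come from your crawler's CSV.
pages = [
    {"url": "/", "title": "Acme Widgets", "inlinks": 12},
    {"url": "/old-promo", "title": "Acme Widgets", "inlinks": 0},
    {"url": "/blue-widgets", "title": "", "inlinks": 5},
]
issues = audit_crawl(pages)
```

Each resulting list maps directly onto one of the checklist items above, and re-running it on a schedule gives you the regular broken-link/orphan check the client probably lacks.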
As part of the initial site audit, it might make sense to include the site in Google Webmaster Tools to see if it has any existing issues there and to look up its historical performance on competitive research tools to see if the site has seen sharp traffic declines. If they've had sharp ranking and traffic declines, pull up that time period in their web analytics to isolate the date at which it happened, then look up what penalties might be associated with that date.
Some people roll this into a site audit, but I’ll split it out as we’re not looking at technical issues on competitor sites, we’re looking at how they are positioned, and how they’re doing it. In common with a site audit, there’s some technical reverse engineering involved.
There are various tools that can help you do this. I use SpyFu. One reporting aspect that is especially useful is estimating the value of the SEO positions vs the Adwords positions. A client can then translate the ranks into dollar terms, and justify this back against your fee.
When you run these competitive reports, you can see what content of theirs is working well, and what content is gaining ground. Make a list of all competitor content that is doing well. Examine where their links are coming from, and make a list. Examine where they’re mentioned in the media, and make a list. You can then use a fast-follow strategy to emulate their success, then expand upon it.
Sometimes, “competitors”, meaning ranking competitors, can actually be potential partners. They may not be in the same industry as your client, just happen to rank in a cross-over area. They may be good for a link, become a supplier, welcome advertising on their site, or be willing to place your content on their site. Make a note of the sites that are ranking well within your niche, but aren’t direct competitors.
Using tools that estimate the value of rankings by comparing Adwords keyword prices, you can estimate the value of your competitors' positions. If your client appears lower than the competition, you can demonstrate the estimated dollar value of putting time and effort into increasing rank. You can also evaluate competitors' rate of improvement over time vs your client's, and use this as a competitive benchmark. If your client is not putting in the same effort as their competitors, they’ll be left behind. If those competitors are spending on ongoing SEO and seeing tangible results, there is some validation for your client to do likewise.
A well organised site is useful from both a usability standpoint and an SEO standpoint. If it’s clear to a user where they need to go next, then this will flow through into better engagement scores. If your client has a usability consultant on staff, this person is a likely ally.
It’s a good idea to organise a site around themes. Anecdotal evidence suggests that Google likes pages grouped around similar topics, rather than disparate topics (see from 1.25 onwards).
Create a spreadsheet based on a crawl, after any errors have been tidied up.
Identify best-selling products and services. These deserve the most exposure and should be placed high up the site hierarchy. Items and categories that do not sell well, and are less strategically important, should be lower in the hierarchy.
Pages that are already getting a lot of traffic, as indicated by your analytics, might deserve more exposure by moving them up the hierarchy.
Seasonal products might deserve more exposure just before that shopping season, and less exposure when the offer is less relevant.
Group pages into similar topics, where possible. For example, acme.com/blue-widgets/ , acme.com/green-widgets/.
Determine if internal anchor text is aligned with keyword titles and page content by looking at a backlink analysis
A spreadsheet of all pages helps you group pages thematically, preferably into directories with similar content. Your strategy document will guide you as to which pages you need to work on, and which pages you need to relegate. Some people spend a lot of time sculpting internal PageRank, i.e. flowing PageRank to some pages while using nofollow on other links so they pass no link equity. Google may have deprecated that approach, but you can still link to important products or categories sitewide to flow them more link equity, while placing less important pages lower in the site's architecture. Favour your money pages, and relegate your less important pages.
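As a rough first pass at that thematic grouping, you can bucket crawled URLs by their first path segment. This is a sketch that assumes the URL structure already reflects topics, which won't hold for every site:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_theme(urls):
    """Group URLs by their first path segment as a rough thematic bucket."""
    groups = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        # First directory name stands in for the "theme"; homepage goes to (root).
        theme = path.split("/")[0] if path else "(root)"
        groups[theme].append(url)
    return dict(groups)

groups = group_by_theme([
    "https://acme.com/blue-widgets/",
    "https://acme.com/blue-widgets/large",
    "https://acme.com/green-widgets/",
    "https://acme.com/",
])
```

The buckets make a useful starting tab in the architecture spreadsheet: pages that land in the wrong bucket, or in no meaningful bucket at all, are candidates for regrouping.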
Ensure your site is deep crawled. To check if all your URLs are included in Google’s index, sign up with Webmaster Tools and/or other index reporting tools.
Include a sitemap.
Check the existing robots.txt. Keep robots out of non-essential areas, such as script repositories and other admin-related directories.
If you need to move pages, or you have links to pages that no longer exist, use page redirects to tidy them up
Make a list of 404 errors. Make sure the 404 page has useful navigation into the site so visitors don’t click back.
The accepted method to redirect a page is to use a 301. The 301 indicates a page has permanently moved location. A redirect is also useful if you change domains, or if you have links pointing to different versions of the site. For example, Google sees http://www.acme.com and http://acme.com as different sites. Pick one and redirect to it.
If you don’t redirect pages, then you won’t be making full use of any link juice allocated to those pages.
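The www/non-www canonicalization logic is simple to sketch. In practice you'd implement it as a server-level rewrite rule rather than application code, and `www.acme.com` here is just a stand-in for whichever host you pick:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.acme.com"  # pick one host and stick with it

def canonical_redirect(url):
    """Return (301, canonical_url) if the host is non-canonical, else None.

    A sketch of the logic only; the 301 status code is what tells
    search engines the move is permanent.
    """
    parts = urlsplit(url)
    if parts.netloc == CANONICAL_HOST:
        return None
    target = urlunsplit((parts.scheme, CANONICAL_HOST, parts.path,
                         parts.query, parts.fragment))
    return (301, target)

result = canonical_redirect("http://acme.com/widgets?colour=blue")
```

The key point the sketch encodes: every non-canonical request gets a single 301 hop to the equivalent canonical URL, preserving path and query string, so link equity consolidates on one version of each page.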
Backlinks remain a major ranking factor. Generally, the more high quality links you have pointing to your site, the better you’ll do in the results. Of late, links can also harm you. However, if your overall link profile is strong, then a subset of bad links is unlikely to cause you problems. A good rule of thumb is the Matt Cutts test. Would you be happy to show the majority of your links to Matt Cutts? :) If not, you're likely taking a high risk strategy when it comes to penalties. These can be manageable when you own the site, but they can be difficult to deal with on client sites, especially if the client was not aware of the risks involved in aggressive SEO.
Establish a list of existing backlinks. Consider trying to remove any that look low quality.
Ensure all links resolve to appropriate pages.
Draw up a list of sites from which your main competitors have gained links.
Draw up a list of sites you’d like to get links from.
Getting links involves either direct placement or being linkworthy. On some sites, like industry directories, you can pay to appear. In other cases, it’s making your site into an attractive linking target.
Getting links to purely commercial sites can be a challenge. Consider sponsoring charities aligned with your line of business. Get links from local chambers of commerce. Connect with education establishments that are doing relevant research, and consider sponsoring them or becoming involved in some way.
Look at the sites that point to your competitors. How were these links obtained? Follow the same path. If they successfully used white papers, then copy that approach. If they successfully used news, do that, too. Do whatever seems to work for others. Evaluate the result. Do more/less of it, depending on the results.
You also need links from sites that your competitors don’t have. Make a list of desired links. Figure out a strategy to get them. It may involve supplying them with content. It might involve participating in their discussions. It may involve giving them industry news. It might involve interviewing them or profiling them in some way, so they link to you. Ask "what do they need?" Then give it to them.
Of course, linking is an ongoing strategy. As a site grows, many links will come naturally, and that in itself is a link acquisition strategy: grow in importance and consumer interest relative to the competition. This involves your content strategy. Do you have content that your industry likes to link to? If not, create it. If your site is not something your industry links to, like a brochure site, you may look at spinning off a second site that is information-focused and less commercially focused. You sometimes see blogs on separate domains where employees talk about general industry topics, like Signal vs. Noise, Basecamp's blog. These are much more likely to receive links than sites that are purely commercial in nature.
Before chasing links, you should be aware of what type of site typically receives links, and make sure you’re it.
Once you have a list of keywords, an idea of where competitors rank, and what the most valuable terms are from a business point of view, you can set about examining and building out content.
Do you have content to cover your keyword terms? If not, add it to the list of content that needs to be created. If you have content that matches terms, see if it compares well with competing content on the same topic. Can the pages be expanded or made more detailed? Can more/better links be added internally? Will the content benefit from amalgamating different content types, e.g. video, audio, images, et al?
You’ll need to create content for any keyword areas you’re missing. Rather than copy what is already available in the niche, look at the best-ranking/most valuable content for that term and ask how it could be made better. Are there new industry analyses or reports that you can incorporate and/or expand on? People love the new. They like learning things they don’t already know. Me-too content can work, but it’s not making the most of the opportunity. Aim to produce considerably more valuable content than already exists, as you’ll have more chance of getting links, and more chance of higher levels of engagement when people flip between sites. If visitors can get the same information elsewhere, they probably will.
Consider keyword co-occurrence. What terms are readily associated with the keywords you’re chasing? Various tools provide this analysis, but you can do it yourself using the Adwords research tool. See what keywords it associates with your keywords. The Google co-occurrence algorithm is likely the same for both Adwords and organic search.
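If you want a rough DIY version of that analysis, you can count which terms show up alongside a seed keyword across a set of page texts. This sketch assumes you've already collected the texts (e.g. from top-ranking pages); a real tool would also strip stopwords:

```python
import re
from collections import Counter

def cooccurring_terms(docs, seed, top_n=5):
    """Count terms that appear in documents which also mention `seed`.

    `docs` is a list of text snippets. Each document contributes at most
    one count per term, so frequent terms within one page don't dominate.
    """
    counts = Counter()
    for doc in docs:
        words = re.findall(r"[a-z']+", doc.lower())
        if seed in words:
            counts.update(w for w in set(words) if w != seed)
    return counts.most_common(top_n)

docs = [
    "blue widgets with free shipping and a widgets warranty",
    "compare widgets prices and shipping options",
    "gadget reviews and ratings",
]
top = cooccurring_terms(docs, "widgets", top_n=3)
```

Terms that consistently co-occur with your seed keyword across competing pages are good candidates to work into your own content naturally.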
Also, think about how people will engage with your page. Is it obvious what the page is about? Is it obvious what the user must do next? Dense text and distracting advertising can reduce engagement, so make sure the usability is up to scratch. Text should be a reasonable size so the average person isn’t squinting. It should be broken up with headings and paragraphs. People tend to scan when reading online, searching for immediate confirmation they’ve found the right information. This was written a long time ago, but it’s interesting how relevant it remains.
Sites that don’t link out appear unnatural. Matt Cutts noted:
Of course, folks never know when we're going to adjust our scoring. It's pretty easy to spot domains that are hoarding PageRank; that can be just another factor in scoring. If you work really hard to boost your authority-like score while trying to minimize your hub-like score, that sets your site apart from most domains. Just something to bear in mind.
Make a list of all outbound links
Determine if these links are complementary i.e. similar topic/theme, or related to the business in some way
Make a list of pages with no links out
Links out are both a quality signal and good PR practice. Webmasters look at their inbound links, and will likely follow them back to see what is being said about them. That’s a great way to foster relationships, especially if your client’s site is relatively new. If you put other companies and people in a good light, you can expect many to reciprocate in kind.
Links, the good kind, are about human relationships.
It’s also good for your users. Your users are going to leave your site, one way or another, so you can pick up some kudos if you help them on their way by pointing them to some good authorities. If you’re wary about linking to direct competitors, then look for information resources, such as industry blogs or news sites, or anyone else you want to build a relationship with. Link to suppliers and related companies in close, but non-competing niches. Link to authoritative sites. Be very wary about pointing to low value sites, or sites that are part of link schemes. Low value sites are obvious. Sites that are part of link schemes are harder to spot, but typically feature link swapping schemes or obvious paid links unlikely to be read by visitors. Avoid link trading schemes. It’s too easy to be seen as a part of a link network, and it’s no longer 2002.
Clients can’t do a one-off optimisation campaign and expect it to keep working forever. It may be self-serving for SEOs to say it, but it’s also the truth. SEO is ongoing because search keeps changing, and competitors and markets move. Few companies would dream of running only one marketing campaign. The challenge for the SEO, like any marketer, is to prove the ongoing spend produces a return in value.
Competition monitoring i.e. scan for changes in competitors' rankings, new competitors, and changes in tactics. Determine what is working, and emulate it.
Sector monitoring - monitor Google trends, keywords trends, discussion groups, and news releases. This will give you ideas for new campaign angles.
Reporting - the client needs to be able to see the work you’ve done is paying off.
Availability - clients will change things on their site, or bring in other marketers, so they will want your advice going forward.
Whole books can be written about SEO for clients. And they have. We've skimmed across the surface but, thankfully, there is a wealth of great information out there on the specifics of how to tackle each of these topic areas.
Perhaps you can weigh in? :) What would your advice be to those new to optimizing client sites? What do you wish someone had told you when you started?
Facebook's early motto was "move fast and break things," but as they wanted to become more of a platform play they changed it to "move fast with stability." Anything which is central to the web needs significant stability, or it destroys many other businesses as a side effect of its instability.
There are a couple different ways to view big search algorithm updates. Large, drastic updates implicitly state one of the following:
we were REALLY wrong yesterday
we are REALLY wrong today
Any change or disruption is easy to justify so long as you are not the one facing the consequences:
"Smart people have a problem, especially (although not only) when you put them in large groups. That problem is an ability to convincingly rationalize nearly anything." ... "Impostor Syndrome is that voice inside you saying that not everything is as it seems, and it could all be lost in a moment. The people with the problem are the people who can't hear that voice." - Googler Avery Pennarun
Monopoly Marketshare in a Flash
Make no mistake, large changes come with false positives and false negatives. If a monopoly keeps buying marketshare, then any mistakes they make have more extreme outcomes.
Here's the Flash update screen (which hits almost every web browser EXCEPT Google Chrome).
Notice the negative option installs for the Google Chrome web browser and the Google Toolbar in Internet Explorer.
Why doesn't that same process hit Chrome? They not only pay Adobe to use security updates to steal marketshare from other browsers, but they also pay Adobe to embed Flash inside Chrome, so Chrome users never go through the bundleware update process.
Anytime anyone using a browser other than Chrome gets a Flash security update, they need to opt out of the bundleware, or they end up with Google Chrome installed as their default web browser, which is the primary reason Firefox marketshare is in decline.
Has anyone noticed that the latest Flash update automatically installs Google Toolbar and Google Chrome? What a horrible business decision Adobe. Force installing software like you are Napster. I would fire the product manager that made that decision. As a CTO I will be informing my IT staff to set Flash to ignore updates from this point forward. QA staff cannot have additional items installed that are not part of the base browser installation. Ridiculous that Adobe snuck this crap in. All I can hope now is to find something that challenges Photoshop so I can move my design team away from Adobe software as well. Smart move trying to make pennies off of your high dollar customers.
In Chrome Google is the default search engine. As it is in Firefox and Opera and Safari and Android and iOS's web search.
In other words, in most cases across most web interfaces you have to explicitly change the default to not get Google. And then even when you do that, you have to be vigilant in protecting against the various Google bundleware bolted onto core plugins for other web browsers, or else you still end up in an ecosystem owned, controlled & tracked by Google.
Those "default" settings are not primarily driven by user preferences, but by a flow of funds. A few hundred million dollars here, a billion there, and the market is sewn up.
Google's user tracking is so widespread & so sophisticated that their ad cookies were a primary tool for government surveillance efforts.
Locking Down The Ecosystem
And Chrome is easily the most locked down browser out there.
While Google relies on bundling their toolbar & browser in updates to Flash and other plugins, they require an opposite strategy for anyone distributing Chrome plugins. Chrome plugins "must have a single purpose that is narrow and easy-to-understand."
Whenever Google wants to promote something they have the ability to bundle it into their web browser, operating system & search results to try to force participation. In a fluid system with finite attention, over-promoting one thing means under-promoting or censoring other options. Google likes to have their cake & eat it too, but the numbers don't lie.
I am frustrated @JohnMu saying that it will not cost CTR. Either Google lied about the increase in CTR with photos, or they're lying now.— Rand Fishkin (@randfish) June 25, 2014
The Right to Be Forgotten
This brings us back to the current snafu with the "right to be forgotten" in Europe.
Some have looked at the EU policy and compared it to state-run censorship in China.
Google already hires over 10,000 remote quality raters to rate search results. How exactly is receiving 70,000 requests a monumental task? As their public relations propagandists paint this as an unbelievable burden, they are also highlighting how their own internal policies destroy smaller businesses: "If a multi-billion dollar corporation is struggling to cope with 70,000 censor requests, imagine how the small business owner feels when he/she has to disavow thousands or tens of thousands of links."
Sorry About That Incidental Deletion From the Web...
David Drummond's breathtaking propaganda makes it sound like Google has virtually no history in censoring access to information:
In the past we’ve restricted the removals we make from search to a very short list. It includes information deemed illegal by a court, such as defamation, pirated content (once we’re notified by the rights holder), malware, personal information such as bank details, child sexual abuse imagery and other things prohibited by local law (like material that glorifies Nazism in Germany).
Google is free to force whatever (often both arbitrary and life altering) changes they desire onto the search ecosystem. But the moment anyone else wants any level of discourse or debate into the process, they feign outrage over the impacts on the purity of their results.
Despite Google's great power they do make mistakes. And when they do, people lose their jobs.
People noticed the Google update when it happened. It is hard to miss an overnight 40% decline in your revenues. Yet when they asked about it, Google did not confirm its existence. That economic damage hit MetaFilter for nearly two years, and they only got a potential reprieve after they fired multiple employees and were able to generate publicity about what had happened.
As SugarRae mentioned, those false positives happen regularly, but most of the people who are hit by them lack political and media influence, and are thus slaughtered with no chance of recovery.
MetaFilter is no different than tens of thousands of other good, worthy small businesses who are also laying off employees – some even closing their doors – as a result of Google’s Panda filter serving as judge, jury and executioner. They’ve been as blindly and unfairly cast away to an island and no one can hear their pleas for help.
The only difference between MetaFilter and tons of other small businesses on the web is that MetaFilter has friends in higher places.
If you read past the headlines & the token slaps of big brands, these false positive death sentences for small businesses are a daily occurrence.
Conversations I’ve had with web publishers, none of whom would speak on the record for fear of retribution from Cutts’ webspam team, speak to a litany of frustration at a lack of transparency and potential bullying from Google. “The very fact I’m not able to be candid, that’s a testament to the grotesque power imbalance that’s developed,” the owner of one widely read, critically acclaimed popular website told me after their site ran afoul of Cutts’ last Panda update.
Not only does Google engage in anti-competitive censorship, but they also frequently publish misinformation. Here's a story from a week ago of a restaurant which went under after someone changed their Google listing store hours to be closed on busy days. That misinformation was embedded directly in the search results. That business is no more.
I am one of the few Real Locksmiths here in Denver and I have been struggling with this for years now. I only get one or two calls a day now thanks to spammers, and that's not calls I do, it's calls for prices. For instance I just got a call from a lady locked out of her apt. It is 1130 pm so I told her 75 dollars, Nope she said someone told her 35 dollars....a fake locksmith no doubt. She didn't understand that they meant 35 dollars to come out and look at it. These spammers charge hundreds to break your lock, they don't know how to pick a lock, then they charge you 10 times the price of some cheap lock from a hardware store. I'm so lost, I need help from google to remove those listings. Locksmithing is all I have ever done and now I'm failing at it.
There are entire sectors of the offline economy being reshaped by Google policies.
When those sectors get coverage, the blame always goes to the individual business owner who was (somehow?) personally responsible for Google's behaviors, or perhaps some coverage of the nefarious "spammers."
John Milton in his fiery 1644 defense of free speech, Areopagitica, was writing not against the oppressive power of the state but of the printers guilds. Darnton said the same was true of John Locke's writings about free speech. Locke's boogeyman wasn't an oppressive government, but a monopolistic commercial distribution system that was unfriendly to ways of organizing information that didn't fit into its business model. Sound familiar?
When Google complains about censorship, they are not really complaining about what may be, but what already is. Their only problem is the idea that someone other than themselves should have any input in the process.
"I think as technologists we should have some safe places where we can try out some new things and figure out what is the effect on society, what's the effect on people, without having to deploy kind of into the normal world. And people like those kind of things can go there and experience that and we don't have mechanisms for that." - Larry Page
I have no problem with an "opt-in" techno-utopia test in some remote corner of the world, but if that's the sort of operation he wants to run, it would be appreciated if he stopped bundling his software into billions of electronic devices & assumed everyone else is fine with "opting out."
A couple years ago we published an article named Branding & the Cycle, which highlighted how brands would realign with the algorithmic boost they gained from Panda & leverage their increased level of trust to increase their profit margins by leveraging algorithmic journalism.
Narrative Science has been a big player in the algorithmic journalism game for years. But they are not the only player in the market. Recently the Associated Press (AP) announced they will use algorithms to write articles based on quarterly earnings reports, working with a company named Automated Insights:
We discovered that automation technology, from a company called Automated Insights, paired with data from Zacks Investment Research, would allow us to automate short stories – 150 to 300 words — about the earnings of companies in roughly the same time that it took our reporters.
And instead of providing 300 stories manually, we can provide up to 4,400 automatically for companies throughout the United States each quarter.
Zacks maintains the data when the earnings reports are issued. Automated Insights has algorithms that ping that data and then in seconds output a story.
In the past Matt Cutts has mentioned how thin rewrites are doorway page spam:
you can also have more subtle doorway pages. so we ran into a directv installer in denver, for example. and that installer would say I install for every city in Colorado. so I am going to make a page for every single city in Colorado. and Boulder or Aspen or whatever I do directv install in all of those. if you were just to land on that page it might look relatively reasonable. but if you were to look at 4 or 5 of those you would quickly see that the only difference between them is the city, and that is something that we would consider a doorway.
A single automated AP article might appear on thousands of websites. When thousands of articles are automated, that means millions of copies. When millions of articles are automated, that means billions of copies. When billions ... you get the idea.
To date Automated Insights has raised a total of $10.8 million. With that limited funding they are growing quickly. Last year their Wordsmith software produced 300 million stories & this year it will likely exceed a billion articles:
"We are the largest producer of content in the world. That's more than all media companies combined," [Automated Insights CEO Robbie Allen] said in a phone interview with USA TODAY.
The above might sound a bit dystopian (for those with careers in journalism and/or lacking equity in Automated Insights and/or publishers who must compete against algorithmically generated content), but the story also comes with a side of irony.
Last year Google dictated that press releases shall use nofollow links. All the major press release sites quickly fell in line & adopted nofollow, thinking they would remain in Google's good graces. Unfortunately for those sites, they were crushed by Panda. PR Newswire's solution to their penalty was greater emphasis on manual editorial review:
Under the new copy quality guidelines, PR Newswire editorial staff will review press releases for a number of message elements, including:
Inclusion of insightful analysis and original content (e.g. research, reporting or other interesting and useful information);
Use of varied release formats, guarding against repeated use of templated copy (except boilerplate);
Assessing release length, guarding against issue of very short, unsubstantial messages that are mere vehicles for links;
Overuse of keywords and/or links within the message.
So now we are in a situation where press release sites require manual human editorial oversight to try to get out of being penalized, and the news companies (which currently enjoy algorithmic ranking boosts) are leveraging those same "spammy" press releases using software to auto-generate articles based on them.
That makes sense & sounds totally reasonable, so long as you don't actually think about it (or work at Google)...