Changes to the core concepts of quality and authority over the past year have altered the course of SEO forever, with Google both improving algorithms and increasingly relying on human “quality raters.”
Old methods of dynamically generating content and other quick hacks no longer result in long-term, sustainable SEO performance.
Gone are the days of low-quality ghostwriting as a means of rapidly producing new blog articles for the sake of “content freshness.” Additionally, SEOs can no longer rely upon re-purposing information otherwise found online (especially without citing one’s sources) as an effective and sustainable content strategy.
To succeed in this new landscape, SEOs must learn how Google has changed the rules regarding content quality and authority and what steps website owners must take to ensure they’re seen as trustworthy.
What are the biggest changes in how sites are evaluated and ranked that you’ve seen over the past year?
2019 was the year when Google turned up the dial on analyzing the quality and trustworthiness of web pages and domains. In previous years, making small tweaks to well-known ranking factors (such as adjusting title tags or adding new internal links) could have been enough to see an improvement in SEO performance, even if the website wasn’t known as an authority on the subject being written about.
Those types of quick on-page optimization tactics are no longer sufficient to obtain top positions in the search results, especially if the website has other issues related to quality and trust. Google is now homing in on the reputation and credibility of both the website itself and the creators who contribute to its content. It has also placed great importance on user experience, for example by rolling out several updates related to page speed and switching to mobile-first indexing.
What steps do you take when one of your clients has had a huge traffic drop due to an algorithm change?
When a client has been hit, we start by looking at both on-site and off-site issues related to quality and trust. We analyze which pages were affected, or whether the effect was site-wide, to determine whether a particular section of the site has E-A-T (expertise, authoritativeness, trustworthiness) issues or the entire domain's authority has been called into question. We also take a close look at the content on affected pages compared with that of the client's competitors. Does it adequately answer the search query with the appropriate vocabulary, citations and page structure?
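The first triage step described above — deciding whether a drop is site-wide or concentrated in one section — can be sketched programmatically. The snippet below is a minimal illustration only, assuming you have page-level click counts (e.g. exported from Search Console) for equal-length windows before and after the update; the data, URLs and thresholds are hypothetical, not part of any Google tool.

```python
# Hypothetical sketch: classify an algorithm-update traffic drop as
# site-wide or section-specific from a page-level clicks export.
# `before` and `after` map URL -> clicks for matching date windows.
from collections import defaultdict
from urllib.parse import urlparse

def drop_by_section(before: dict, after: dict) -> dict:
    """Aggregate clicks by top-level path segment; return % change per section."""
    totals = defaultdict(lambda: [0, 0])  # section -> [clicks_before, clicks_after]
    for url, clicks in before.items():
        section = urlparse(url).path.split("/")[1] or "(root)"
        totals[section][0] += clicks
    for url, clicks in after.items():
        section = urlparse(url).path.split("/")[1] or "(root)"
        totals[section][1] += clicks
    return {
        s: round((b_after - b_before) / b_before * 100, 1) if b_before else None
        for s, (b_before, b_after) in totals.items()
    }

# Illustrative data: a drop concentrated in /blog/ would point to E-A-T
# issues with that content, while uniform drops suggest a domain-wide problem.
before = {
    "https://example.com/blog/post-a": 1000,
    "https://example.com/blog/post-b": 800,
    "https://example.com/products/widget": 500,
}
after = {
    "https://example.com/blog/post-a": 300,
    "https://example.com/blog/post-b": 250,
    "https://example.com/products/widget": 480,
}

changes = drop_by_section(before, after)
```

Here the blog section loses roughly 69% of its clicks while products dips only 4%, the kind of pattern that would send us digging into the quality of that one section rather than the whole domain.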
Doing a “site:” search is another step we take when gauging the current organic footprint of a site; it’s common to find many thin, duplicate, or low-quality pages in the index that could benefit from being merged or removed. Another good tactic (which is actually suggested by Google) is to look up reviews of the content creator or the website itself, while excluding results pulled from the website in question. There may be external issues related to the client’s trustworthiness that need to be resolved.
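The two checks above map to simple search operators. These queries are illustrative; `example.com` and "Example Brand" stand in for the client's actual domain and name:

```
site:example.com                            # gauge the indexed footprint; scan for thin or duplicate pages
"Example Brand" reviews -site:example.com   # surface third-party reputation, excluding the site's own pages
```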
How closely should people be following the Google Search Quality Evaluator Guidelines?
Our SEO team keeps a printed copy of the 2019 Search Quality Evaluator Guidelines on our bookshelf, which we often dive into when analyzing an affected site or whenever we have questions about how Google defines "high quality" content. While we are fully aware that Google has stated that many of the recommendations in the guidelines are not current ranking factors, we know these are good long-term marketing strategies that Google teams will work to build into future evolutions of the search algorithm.
Furthermore, we have already seen ample evidence in recent months that the algorithm has made great improvements in learning how to behave like a human being when analyzing website quality. And although it may be obvious to most SEOs by now, the updated guidelines clearly state that old-fashioned SEO techniques like keyword stuffing, auto-generating content or writing low-quality blog articles for the sake of churning out new content are not sustainable strategies to produce good results in the future.
How much impact are artificial intelligence and machine learning having on ranking algorithms?
I think a lot of what we have seen in recent months with algorithm updates stems from Google’s rapid advancements in machine learning. When Ben Gomes of Google said the rater guidelines are “where we want the search algorithm to go,” he confirmed that humans continue to be the best judge of what is high-quality and what is not (for now at least), but there is still work to do for the algorithms to catch up with human judgment.
Machine learning makes that process much quicker and more scalable than it has ever been before. But when we see major fluctuations in search, as we have in the past six months with sites drastically rising and falling after each algorithm update, it’s a good indication that sometimes the algorithms can miss the mark and still have a lot of work to do.
What kind of changes are you anticipating for the coming year?
By focusing so much on the reputation of the website and the creators of the content — especially for YMYL (Your Money Or Your Life) sites — Google has made it hard to rank well with mediocre content or a lack of expertise. Ranking well in the coming year will require more time and effort than ever before, because short-term hacks won't lead to sustained SEO success.
I believe that companies and individuals who are focused on their search presence should keep a close eye on their online reputations and take steps to address anything that may bring their credibility or trustworthiness into question. This includes actually listening to customers and addressing their concerns across different platforms. It includes not overwhelming customers with calls to action or advertisements. It also includes providing easy ways for customers to get in touch when they have a problem, and incorporating that feedback into how they operate their business.
These are not easy things to achieve; building and maintaining long-term organic visibility demands an all-hands-on-deck effort from digital brands. But these are also the things human searchers evaluate as they make informed consumer decisions, and search engines are quickly catching up to humans in their ability to emulate those evaluation processes.