Affected by the core algorithm? 5 factors to consider

Many factors can affect rankings after a core algorithm update. It’s not always about the usefulness of the content; other factors can explain why the algorithm change negatively affected your website’s rankings.

If you find yourself saying, “It used to rank, why isn’t it ranking now?” then these five factors may be worth considering.

1. Algorithmic losses are not necessarily persistent

Sites affected by a core algorithm update (which now includes the helpful content system) do not have a permanent strike against them. Over the past ten years, Google has implemented complex algorithms and systems that can take months between update cycles, so affected sites can’t find a quick path back into the search results. While it’s not a permanent black mark, it can feel as if a site has acquired a curse that brands it as no good and bans it for good.

Google’s John Mueller responded to a question, confirming that being caught in a core algorithm update is not permanent and that, with work, a site can recover from being hit by an update.

Someone asked on X (formerly Twitter):

“Can an HCU-hit site grow again in terms of traffic if it improves in quality? Many fear that no matter how many improvements we make, an HCU-hit site will forever be assigned a classifier that prevents it from growing again.”

John Mueller answered:

“Yes, sites can grow again after being hit by the ‘HCU’ (well, core update now). This isn’t permanent. It can take a lot of work, time, and maybe update cycles, and/but a different, updated site will also be different in search.”

2. “Recover” is not the right word

Many people think of recovering from an update as a ranking reset in which websites regain their previous positions. John Mueller’s answer on X suggests that publishers can instead understand algorithmic effects as something that requires adjusting a website to fit an evolving web, including user expectations.

Mueller tweeted:

“Permanent changes are not very useful in a dynamic world, so yes. However, “recovering” implies going back to the way it was before, and IMO is always unrealistic, as the world, user expectations, and the rest of the web keep changing. It’s never ‘the same as before’.”

This statement seems to imply that, to some extent, algorithmic updates reflect what users expect to see in search results. One way to understand this is through the example of Google’s Medic update from a few years ago. That update reflected a realignment of search results with what users expect to see when they perform certain queries. After the Medic update, search queries on medical topics required search results with a scientific focus. Sites that promoted non-scientific folk remedies no longer fit this updated definition of relevance.

There are subtle variations in this realignment of search results that go directly to answering the question: what do users mean when they make a search query? Sometimes relevance means informational sites; for other queries, review sites are what users expect to see.

Therefore, if your site is affected by a core algorithm update, review the SERPs, try to determine what the new results mean in terms of relevance, and assess whether your site meets that new definition of relevance.

Going back to Mueller’s response, there is no “back to business as usual,” and that may be because there has been a subtle shift in relevance. Sometimes the fix is subtle. Sometimes, returning to the search engine results pages (SERPs) requires a major change to the website to meet user expectations.

3. Thresholds and ranking formulas

Another interesting point Mueller discussed is the difference between an ongoing algorithmic evaluation and the more persistent effects of a ranking system that requires an update cycle before a site can recover.

Someone asked:

“The simple question is whether to wait for a new core update to recover from HCU. A simple ‘yes’ or ‘no, you can recover anytime’ should suffice.”

John Mueller answered:

“It’s ‘no’ because not all changes require another update cycle. In practice, I’d assume that stronger effects will require another update. Core updates can include a lot of things.”

He then continued with these interesting comments:

“For example, a ranking formula + some thresholds could be updated. The effects of the updated formula are mostly ongoing, changes to the thresholds often require another update to adjust.

…(“thresholds” is a simplification for any number that takes a lot of work and data to recalculate, reassess, revise)”

The above means that there are two types of effects that can hit a site. One is part of a continuously evaluated ranking formula that can quickly reflect changes made to a site: the core algorithm makes relatively instantaneous assessments and raises or lowers rankings accordingly.

The other type of algorithmic effect is one that requires a massive recalculation. This is how the HCU, and even the Penguin algorithm before it, used to work until they were incorporated into the core algorithm: large-scale calculations that assigned scores which only updated on the next cycle.
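
As a purely illustrative aside, the short Python sketch below is a toy model of the distinction described above, not anything Google has published or confirmed. It contrasts a live formula that is re-evaluated on every query with a cached, threshold-style score that only changes when an update cycle runs. All names and numbers (quality, cutoff, the 0.5 demotion) are invented for illustration.

# Hypothetical toy model -- NOT Google's algorithm.
from dataclasses import dataclass

@dataclass
class Site:
    quality: float                        # invented 0.0-1.0 quality estimate
    cached_classifier_score: float = 1.0  # set during the last update cycle

def live_formula(site: Site) -> float:
    # Continuous effect: recomputed on every evaluation, so it tracks
    # the site's current state immediately.
    return site.quality * 100

def run_update_cycle(sites: list[Site], cutoff: float = 0.6) -> None:
    # Infrequent, expensive recalculation: stores a demotion factor that
    # stays fixed until the next cycle runs.
    for s in sites:
        s.cached_classifier_score = 1.0 if s.quality >= cutoff else 0.5

def rank_score(site: Site) -> float:
    return live_formula(site) * site.cached_classifier_score

site = Site(quality=0.4)
run_update_cycle([site])
print(rank_score(site))   # 20.0 -- low quality, demoted by the cached score
site.quality = 0.8        # the site improves its content
print(rank_score(site))   # 40.0 -- the live formula responds at once,
                          #         but the cached demotion still applies
run_update_cycle([site])
print(rank_score(site))   # 80.0 -- the demotion lifts only after the next cycle

In this toy model, improvements show up immediately in the live formula, while the cached demotion persists until the next recalculation, which is the practical difference between the two kinds of effects Mueller describes.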

4. The web and users change

In another recent exchange on X, John Mueller stated that the key to success is keeping track of what users expect.

He tweeted:

“…there is no single secret to lasting online success. Even if you find something that works now, the web, user desires, and the way they interact with websites change. It’s very difficult to make good, popular and persistent things.”

This statement offers these concepts to consider for online success:

The web
User desires
How users interact with websites
Popularity is not persistent

These are not algorithm factors. But they could be things that Google collects in terms of understanding what users expect to see when they make a search query.

What users expect to see is my favorite definition of relevance. This has practically nothing to do with “semantic relevance” and everything to do with what users themselves expect. This is something that some SEOs and publishers struggle with. They focus so much on what words and phrases mean and forget that what really matters is what they mean to users.

Mueller posted something similar in an answer to why a website ranks #1 in one country and doesn’t do so well in another. He said what users expect to see in response to a query may differ from country to country. The point is that it’s not about semantics and entities and other technicalities, but often the relevance of search ranking has a lot to do with users.

He tweeted:

“It is normal that search results vary between countries. Users are different, expectations can vary and the web is also very different”.

This insight may be useful for some publishers who have lost rankings in a core algorithm update. User expectations may have changed and the algorithm reflects those expectations.

5. Page-level signal

Google’s SearchLiaison stated that the helpful content component of the core algorithm is generally a page-level signal, although some site-wide signals are also considered. His tweet quoted the helpful content FAQ, which reads:

“Do Google’s core ranking systems evaluate the helpfulness of content at the page level or site-wide?

Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand the helpfulness of individual pages. We do have some site-wide signals that are also considered.”

Keep an open mind

It’s frustrating to lose rankings in a core algorithm update. I’ve been working in SEO for about 25 years and auditing websites since 2004. Helping site owners identify why their sites are no longer ranking has taught me that it’s helpful to keep an open mind about what is affecting rankings.

The core algorithm uses many signals, some of which are about helpfulness, while others relate to relevance to users, relevance to search queries, and overall site quality. So it can be helpful not to get stuck thinking that a site lost rankings for one reason, because it could be something else entirely, or even multiple factors.

Featured image by Shutterstock/Benny Marty


