Why Google can’t tell you about every ranking drop
In a recent Twitter exchange, Google Search Liaison Danny Sullivan provided insight into how the search engine handles algorithmic spam actions and ranking drops.

The discussion was sparked by a website owner complaint about a significant loss of traffic and the inability to request a manual review.

Sullivan clarified that a site could be affected by algorithmic spamming or simply not ranking well due to other factors.

He stressed that many sites that experience ranking drops mistakenly attribute it to spammy algorithmic action when that might not be the case.

“I’ve looked at a lot of sites where people have complained about losing rankings and decide they have algorithmic spam action against them, but they don’t.”

Sullivan’s full statement will help you understand Google’s transparency challenges.

It also explains why the desire for manual review to override automated ratings may be misguided.

Two different things. A site could have an algorithmic spam action. A site might not rank well because other systems that *don’t talk about spam* simply don’t see it as useful.

I’ve looked at many sites where people have complained about losing rankings and decide they have a…

— Google SearchLiaison (@searchliaison) May 13, 2024

Challenges in transparency and manual intervention

Sullivan acknowledged the idea of providing more transparency in Search Console, potentially notifying site owners of algorithmic actions similar to manual actions.

However, he highlighted two key challenges:

Revealing algorithmic indicators of spam could allow bad actors to game the system. Additionally, algorithmic actions are not site-specific and cannot be lifted manually.

Sullivan expressed sympathy for the frustration of not knowing the cause of a traffic crash and not being able to communicate with someone about it.

However, he cautioned against the desire for manual intervention to overrule automated systems.

Sullivan states:

“…you don’t really want to think, ‘Oh, I just wish we had a manual action, that would be so much easier.’ You really don’t want your individual site to attract the attention of our spam analysts. First of all, it’s not like manual actions are somehow processed instantly. Secondly, it’s just something we then know about a site going forward, especially if it says it’s changed but really hasn’t.”

Determining the usefulness and reliability of the content

Beyond spam, Sullivan discussed various systems that evaluate the helpfulness, usefulness, and trustworthiness of content and individual sites.

He acknowledged that these systems are imperfect and that some high-quality sites may not be recognized as well as they should be.

“Some of them rank very well. But they’ve moved down slightly, in positions small enough that the drop in traffic is noticeable. They assume they have fundamental problems, but they really don’t, so we’ve added a whole section about it to our traffic debugging page.”

Sullivan revealed ongoing discussions about providing more metrics in Search Console to help creators understand how their content is performing.

“Another thing I’ve been discussing, and I’m not alone in this, is that we could do more in Search Console to show some of these indicators. This all carries the same challenge as everything I said about spam, about not wanting to let the systems be gamed, and also how there’s no button we push that’s like ‘actually this is more useful than our automated systems think, rank it better!’ But maybe there’s a way we can find to share more, in a way that helps everyone and, along with better guidance, would help creators.”

Advocacy for small publishers and positive progress

In response to a suggestion from Brandon Saltalamacchia, founder of RetroDodo, about manually reviewing “good” sites and providing guidance, Sullivan shared his thoughts on possible solutions.

He mentioned exploring ideas like self-declaration using structured data for small publishers and learning from that information to make positive changes.

“I have some ideas that I’ve been exploring and pitching about what we could do with small publishers and self-declaration with structured data, and how we could learn from that and use it in a variety of ways. That’s getting well ahead of myself and the usual promises, but yes, I think and hope there are ways to move forward in a more positive way.”

Sullivan said he can’t make promises or implement changes overnight, but he expressed hope to find ways to move forward positively.

Featured image: Tero Vesalainen/Shutterstock
About the Author: Ted Simmons

I follow and report the current news trends on Google news.