Why Google seems to favor big brands and low-quality content

Many people are convinced that Google favors big brands and ranks low-quality content, and many believe the problem has gotten progressively worse. Maybe it’s not just a matter of perception: almost everyone has an anecdote about poor-quality search results, which suggests something really is going on. The possible reasons for it are surprising.

Google has shown favoritism in the past

This is not the first time that Google’s search engine results pages (SERPs) have shown a bias that favors big brand websites. In the early years of Google’s algorithm, it was obvious that sites with high PageRank would rank for pretty much anything they wanted.

For example, I remember a web design company that built many websites, creating a network of backlinks that raised its PageRank to a level usually seen only on large corporate sites like IBM’s. As a result, it ranked for the two-word keyword phrase Web Design, and pretty much every variant like Web Design + [any state in the USA].

Everyone knew that websites with a PageRank of 10, the highest level displayed in Google’s toolbar, practically had a free pass in the SERPs, which pushed big-brand sites above more relevant web pages. It didn’t go unnoticed when Google finally tweaked its algorithm to fix the problem.
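To make the mechanics of that free pass concrete, here is a toy sketch of the classic PageRank power iteration, based on the published formula from the original paper rather than anything in Google’s production systems. The graph and numbers are invented for illustration; the point is that a tightly interlinked network of sites inflates its own scores:

```python
import numpy as np

def pagerank(adjacency, damping=0.85, iterations=50):
    """Toy power-iteration PageRank. adjacency[i][j] = 1 means page i links to page j."""
    n = len(adjacency)
    A = np.array(adjacency, dtype=float)
    # Each page splits its vote evenly among its outlinks.
    out = A.sum(axis=1, keepdims=True)
    out[out == 0] = 1  # avoid division by zero for dangling pages
    M = A / out
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        # Standard PageRank update: teleport term plus damped link votes.
        rank = (1 - damping) / n + damping * (rank @ M)
    return rank

# Pages 0-3 form a tightly interlinked network (like the design firm's sites);
# page 4 is an independent site with a single inbound link.
graph = [
    [0, 1, 1, 1, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 1, 0],
    [1, 1, 1, 0, 1],
    [0, 0, 0, 0, 0],
]
print(pagerank(graph).round(3))
```

Running this, the four interlinked pages each score a multiple of the independent page’s score, which is the same dynamic that link networks exploited at scale.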

The point of this anecdote is to show an example of Google’s algorithm inadvertently creating a bias that favored big brands.

Here are other algorithm biases that editors exploited:

- Top 10
- Longtail
- How-To Article Posts
- Misspellings
- Free Footer Widgets That Contained Links (Always Free for Colleges!)

Big brands and low-quality content

There are two things that have been a constant throughout Google’s history:

- Low-quality content
- Big brands crowding out small independent publishers

Anyone who has searched for a recipe knows that the more general the recipe, the lower the quality of what ranks. Look up something like cream of chicken soup, and the main ingredient in almost every ranking recipe is two cans of chicken soup.

A search for authentic Mexican tacos returns recipes with these ingredients:

- Soy sauce
- Ground beef
- “Cooked chicken”
- Taco shells (from the store!)
- Beer

Not all recipe SERPs are bad. But some of the more general recipes that Google ranks are so basic that a hobo could cook them on a stove.

Robin Donovan (Instagram), a cookbook author and online recipe blogger, observed:

“I think the problem with Google search rankings for recipes these days (post HCU) is much bigger than if they are too simple.

The biggest problem is that you get a bunch of Reddit threads or sites with untested user-generated recipes, or scraper sites that are stealing recipes from hard-working bloggers.

In other words, content that isn’t ‘helpful’ if what you want is a well-written, tested recipe that you can use to make something delicious.”

Explanations of why Google’s SERPs are broken

It’s hard to escape the perception that Google’s rankings for a variety of topics seem to default to big-brand websites and low-quality web pages.

Small sites sometimes grow into big brands that dominate the SERPs; it happens. But that only reinforces the pattern: once a small site gets big, it becomes another big brand dominating the SERPs.

Typical explanations for poor SERPs:

- It’s a conspiracy to increase ad clicks
- The content itself is generally low quality
- Google has nothing else to rank
- It’s the SEOs’ and affiliates’ fault
- Google promotes big brands because [insert your conspiracy]

So what’s going on?

People love big brands and trash content

Google’s recent antitrust lawsuit exposed Navboost signals as an important ranking factor. Navboost is an algorithm that interprets user engagement signals to understand, among other things, what topics a web page is relevant for.

The idea of using engagement signals as an indicator of what users expect to see makes sense. After all, Google is user-centric, and who better to decide what’s best for users than users themselves, right?
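Navboost’s internals aren’t public, so what follows is a purely hypothetical sketch of the general idea rather than Google’s actual system: blend a core relevance score with a smoothed, click-derived boost. Every field name, formula, and weight here is an assumption made for illustration:

```python
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # score from the core ranking algorithm (hypothetical)
    impressions: int   # times this result was shown for the query
    good_clicks: int   # clicks where the user didn't bounce straight back

def engagement_boost(r: Result, prior: float = 0.1, strength: float = 20.0) -> float:
    # Smoothed click-through rate so results with little data don't dominate
    # (an assumed formula, not Navboost's real math).
    return (r.good_clicks + prior * strength) / (r.impressions + strength)

def rerank(results: list[Result], weight: float = 0.5) -> list[Result]:
    # Blend core relevance with the engagement signal and sort best-first.
    return sorted(
        results,
        key=lambda r: (1 - weight) * r.relevance + weight * engagement_boost(r),
        reverse=True,
    )

serp = [
    Result("independent-reviews.example", relevance=0.9, impressions=120, good_clicks=20),
    Result("big-brand.example", relevance=0.7, impressions=10_000, good_clicks=4_000),
]
for r in rerank(serp):
    print(r.url)  # the big brand edges out the more relevant independent site
```

In a blend like this, a recognizable brand that collects habitual clicks can edge out a more relevant independent page. That is exactly the familiarity-bias dynamic discussed next.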

Well, keep in mind that arguably the biggest and most important song of 1991, Nirvana’s Smells Like Teen Spirit, didn’t make the Billboard top 100 that year. Michael Bolton and Rod Stewart both made the list twice, with Rod Stewart taking the top spot for a song called “The Motown Song” (anyone remember it?).

Nirvana didn’t hit the charts until the following year…

My guess, given that we know that user interactions are a strong ranking signal, is that Google search rankings follow a similar pattern related to user biases.

People tend to choose what they know. It’s called familiarity bias.

Consumers have a habit of choosing things they know over things they don’t. This preference shows up in product choices, for example, where shoppers reach for brands they already recognize.

Behavioral scientist Jason Hreha defines familiarity bias like this:

“Familiarity bias is a phenomenon in which people tend to prefer familiar options over unfamiliar ones, even when the unfamiliar options may be better. This bias is often explained in terms of cognitive ease, which is the feeling of fluency or ease that people experience when they are processing familiar information. When people encounter familiar options, they are more likely to experience cognitive ease, which can make those options seem more attractive.”

Except for certain queries (like health-related ones), I don’t think Google makes an editorial decision about certain types of websites, such as brands.

Google uses many signals for ranking. But Google is very user-centric.

I think it’s possible that strong user preferences carry more weight than the signals from the review system. How else to explain why big-brand websites with fake reviews apparently rank better than honest independent review sites?

It’s not like Google’s algorithms haven’t created poor search results in the past.

Google’s Panda algorithm was designed to remove the bias toward cookie-cutter content. The review system is a patch to correct Google’s bias toward content that refers to reviews but is not necessarily a review.

If Google has systems in place to catch low-quality sites that its core algorithm would rank, why do big brands and poor-quality content still rank?

I think the answer is that users prefer to see these sites, as indicated by user engagement signals.

The big question is whether Google will keep ranking for the user biases and inexperience that trigger signals of user satisfaction. Or will Google continue to offer the sugar-frosted bonbons that users crave?

Should Google decide to rank quality content at the risk of making it too difficult for users to understand?

Or should publishers give up and focus on creating for the lowest common denominator, like the big pop stars do?


