There has been a massive leak of Google’s search algorithm documentation


People who aren’t in the business of running a website or optimizing other websites for Google’s search engine might find this boring and inside-baseball, but it’s a very big deal in certain internet circles. An anonymous source has leaked thousands of internal search documents that shed light on Google’s secret search algorithm.

This is a big deal because there is an entire industry of professionals, many of whom are helpful and some of whom are online scammers, dedicated to optimizing websites to get more traffic from Google. There are thousands of factors that go into the algorithm (apparently around 14,000, in fact), and not only has Google been secretive about them, its carefully worded public statements have often contradicted the information in the leaked documentation.

It’s important to note here that while Google hasn’t always been straightforward about its algorithm factors, it has always stressed the importance of writing good, original content and letting the rest take care of itself. For the most part, this is exactly what we try to do while ignoring the ins and outs of SEO. At Uproxx, where I worked for a decade, there was a much greater focus on search engine optimization (so much so that it would often take 20 to 30 percent longer to complete an article), and now half of Uproxx seems to be written not for a real audience but for search engines (they recently fired most of their feature writers and literally brought in Will.i.am and his AI software to reposition the site. Really).

Most of what was revealed was honestly common sense, but some of the revelations are interesting nonetheless. For example, despite public claims to the contrary, the specific author of a piece does factor into search ranking. That explains a lot. For about a decade, I probably wrote more articles about The Walking Dead than anyone else on the internet, so when I wrote about the series, either here or on Uproxx, my posts would generally rank well on Google. Conversely, other writers on the same sites would not necessarily rank as well on the same topic. This makes sense, though: someone with authority on particular topics should be able to maintain that authority with search engines elsewhere.

In fact, this ties into one of the main findings of the leak: the so-called E-E-A-T criteria — experience, expertise, authoritativeness, and trustworthiness. Google also has specific systems for evaluating and scoring news content on sensitive topics related to a person’s health, financial stability, safety, or well-being (i.e., there is a reason WebMD and the Mayo Clinic rise to the top of most health-related searches).

Other revelations: Despite its protestations, Google assigns a domain authority score to each site; click data is taken into account (i.e., if a certain site is clicked frequently for a specific search, that site will rise in the rankings); and there is apparently a so-called “sandbox” that limits the visibility of new sites. Notably, the documentation also says that certain sites providing election information are whitelisted (or downgraded) during election periods.

It’s also important to note here that, based on the information in the leak (and what this site’s editors have known for many, many years), it’s not easy to game Google, and tailoring articles and web pages specifically to Google can even backfire. SEO professionals can help, but they can only help so much.

What I can’t find in the documentation, however, is any reference to why 70 percent of search queries now bring up random Reddit pages. I suspect, however, that it has something to do with the fact that Google paid Reddit $60 million to use its content to train its AI.

Sources: iPullRank, SparkToro, and The Verge


About the Author: Ted Simmons

I follow and report the current news trends on Google news.
