Someone on Reddit asked how to handle a site-wide code change on a website with ten languages. Google’s John Mueller offered general advice on the pitfalls of site-wide changes and talked about complexity (which implies the value of simplicity).
The question was related to hreflang, but Mueller’s answer, because it was general in nature, had broader SEO value.
This is the question that was asked:
“I’m working on a website that contains 10 languages and 20 cultural codes. Assume that blog-abc was published in all languages. The hreflang tags in all languages point to the lang-based blog-abc version. For English, it can be en/blog-abc.
They did an update to the English one and the URL was updated to blog-def. The English blog page hreflang tag for en will be updated to en/blog-def. However, this will not update dynamically in other languages’ source code. They will still be pointing to en/blog-abc. To update hreflang tags in other languages we will also need to republish them.
Since we’re trying to make the pages as static as possible, updating the hreflang tags dynamically may not be an option. The options we have are to update the hreflang tags periodically (for example, once a month) or to move the hreflang tags to the sitemap.
If you think there is another option, that will be helpful too.”
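To make the sitemap option concrete, here is a minimal sketch, written in Python purely for illustration (it is not from the Reddit thread, and the URLs and the ALTERNATES mapping are hypothetical). It shows hreflang annotations generated in an XML sitemap rather than embedded in each page, so a URL change in one language only requires regenerating the sitemap, not republishing every language version.

```python
# Minimal sketch (hypothetical URLs): hreflang alternates declared in an
# XML sitemap instead of in each page's <head>.
from xml.sax.saxutils import escape

# One mapping per piece of content: language code -> current URL.
# If the English URL changes from /en/blog-abc to /en/blog-def, only this
# mapping and the regenerated sitemap change; the other language pages
# do not need to be republished.
ALTERNATES = {
    "en": "https://example.com/en/blog-def",
    "de": "https://example.com/de/blog-abc",
    "fr": "https://example.com/fr/blog-abc",
}

def sitemap_entries(alternates: dict[str, str]) -> str:
    """Build one <url> block per language, each listing every alternate."""
    blocks = []
    for url in alternates.values():
        links = "\n".join(
            f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{escape(href)}"/>'
            for lang, href in alternates.items()
        )
        blocks.append(f"  <url>\n    <loc>{escape(url)}</loc>\n{links}\n  </url>")
    return "\n".join(blocks)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    f"{sitemap_entries(ALTERNATES)}\n"
    "</urlset>"
)
print(sitemap)
```

Google documents the xhtml:link sitemap method as an alternative to on-page hreflang tags, which is what makes it appealing when the pages themselves are meant to stay static.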
Sitewide changes take a long time to process
I recently read something interesting in a research paper that reminded me of things John Mueller has said about how it takes time for Google to understand how updated pages relate to the rest of the Internet.
The research paper noted that when web pages are updated, their semantic representations (the embeddings) have to be recalculated, and then the same has to be done for the rest of the documents in the index.
Here is what the research paper (PDF) says in passing about adding new pages to a search index:
“Consider the realistic scenario where new documents are continuously added to the indexed corpus. Updating the index with methods based on dual encoders requires computing embeddings for new documents, followed by reindexing all document embeddings.
In contrast, building indices using a DSI involves training a transformer model. Therefore, the model must be retrained from scratch each time the underlying corpus is updated, thus incurring prohibitive computational costs compared to dual encoders.”
I mention this passage because in 2021 John Mueller said that Google can take months to assess the quality and relevance of a site and mentioned how Google tries to understand how a website fits in with the rest of the web.
Here’s what he said in 2021:
“I think it’s a lot more complicated when it comes to things about quality in general, where assessing the overall quality and relevance of a website is not very easy.
It takes a long time for us to understand how a website fits in with the rest of the Internet.
And that’s something that can easily take, I don’t know, a couple of months, half a year, sometimes even more than half a year, for us to recognize significant changes in the overall quality of the site.
Because we’re basically looking at… how this website fits into the context of the overall web and that just takes a lot of time.
So I would say that compared to the technical issues, it takes a lot longer to update in that regard.”
That part about evaluating how a website fits into the context of the overall web is a curious and unusual statement.
What he said about fitting into the context of the general web sounded strikingly similar to what the research paper said about how updating the search index “requires computing embeddings for new documents, followed by reindexing all document embeddings.”
Here is John Mueller’s answer on Reddit about the problem with updating many URLs:
“Changing URLs on a larger site will usually take time to process (which is why I like to recommend stable URLs… someone once said that cool URLs don’t change – I don’t think they meant SEO, but also for SEO). I don’t think any of these approaches will change that significantly.”
What does Mueller mean when he says big changes take time to process? It could be similar to what he said in 2021 about re-evaluating a site for quality and relevance. The part about relevance could also be similar to what the research paper said about embeddings, which are vector representations of the words on a web page, created as part of understanding its semantic meaning.
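To illustrate the idea, here is a minimal sketch that uses a toy stand-in for a real embedding model (nothing here is Google’s actual system, and the page URLs and text are invented). The point it demonstrates is the shape of the work the paper describes: when one page changes, its vector is recomputed and then compared against the rest of the corpus, and that second step grows with the number of documents involved.

```python
# Minimal sketch (not Google's system): recompute the vector for a changed
# page, then re-score it against the rest of a corpus.
# fake_embedding() is a toy stand-in for a real text-embedding model.
import math
from collections import Counter

def fake_embedding(text: str) -> dict[str, float]:
    """Toy 'embedding': a length-normalized bag-of-words vector."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(c * c for c in counts.values()))
    return {word: c / norm for word, c in counts.items()}

def cosine_similarity(a: dict[str, float], b: dict[str, float]) -> float:
    """Dot product of two unit-length sparse vectors."""
    return sum(weight * b.get(word, 0.0) for word, weight in a.items())

corpus = {
    "/en/blog-def": "updated english article about hreflang and sitemaps",
    "/de/blog-abc": "deutscher artikel ueber hreflang und sitemaps",
    "/fr/blog-abc": "article francais sur hreflang et les sitemaps",
}

# The changed page gets a fresh embedding, then every other document is
# re-scored against it to see where the updated page now "fits".
updated = fake_embedding(corpus["/en/blog-def"])
for url, text in corpus.items():
    if url == "/en/blog-def":
        continue
    print(url, round(cosine_similarity(updated, fake_embedding(text)), 3))
```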
Complexity has long-term costs
John Mueller continued his response:
“A more meta question might be whether you’re seeing enough results from this somewhat complex configuration to merit spending time maintaining it, whether you could drop the hreflang configuration, or whether you could even drop the country versions and simplify even more.
Complexity doesn’t always add value and comes at a long-term cost.”
Creating sites as simply as possible is something I’ve done for over twenty years, and Mueller is right: it makes upgrades and renewals much easier.
Featured image by Shutterstock/hvostik