Google to explore alternatives to robots.txt in wake of generative AI and other emerging technologies

Google is exploring alternative or complementary ways to control crawling and indexing beyond the 30-year-old robots.txt protocol. “We believe it’s time for the web and AI communities to explore additional machine-readable means for web publisher choice and control for emerging AI and research use cases,” Google wrote.
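
For context, robots.txt is a plain text file served from a site’s root (e.g., example.com/robots.txt) that tells crawlers which paths they may fetch. A minimal sketch of the existing protocol, using a placeholder crawler name rather than any real user agent:

User-agent: ExampleBot
Disallow: /private/

User-agent: *
Allow: /

Each “User-agent” line names a crawler, the “Disallow”/“Allow” rules beneath it apply to that crawler, and “*” matches any crawler not named elsewhere. The limits of this simple allow/block model are part of why richer controls are being discussed.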

Community involvement. Google is inviting members of the web and AI communities to discuss a new protocol, saying it is “initiating a public discussion” with a “wide range of voices from web publishers, civil society, academia, and other fields around the world.”

Timing. Google said it is inviting those interested to join the discussion, with conversations taking place over the next few months. So nothing is happening too quickly, and nothing will change tomorrow.

Paywalled content issue. OpenAI recently disabled the Bing browsing feature in ChatGPT after it was found to be able to access paywalled content without publisher permission. This is likely one of the many reasons Google is looking for alternatives to the robots.txt protocol.

Why we care. We have all grown accustomed to allowing bots to access our websites through robots.txt and other, newer forms of crawler directives. We may be using new methods in the future. What those methods and protocols might look like is unknown right now, but the discussion is happening.
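
To illustrate those other forms: besides robots.txt, publishers can already express per-page directives in markup. A hedged example using the standard robots meta tag, placed in a page’s head (the specific directives shown are illustrative, not a recommendation):

<meta name="robots" content="noindex, nofollow">

This tells compliant crawlers not to index the page or follow its links; whatever new protocol emerges would presumably sit alongside page-level controls like these.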
