BBC Embraces AI in Journalism While Blocking Data Scrapers

JJohn October 6, 2023 11:11 PM

The BBC, the United Kingdom's largest news organization, has expressed openness to integrating generative AI into its journalistic processes while also blocking data scraping by OpenAI and Common Crawl. The broadcaster says it aims to maintain public trust and support the rights of creators in the digital age.

The BBC's strategic approach to AI

The BBC has expressed enthusiasm about incorporating generative AI into its operations, particularly for research, journalism production, archiving, and personalized audience experiences. The broadcaster sees the technology as a way to deliver more value to its audiences and to society more broadly. To ensure ethical use of AI, the organization has proposed three guiding principles: acting in the public's best interests, prioritizing talent and creativity, and being open and transparent about AI-generated content.

The BBC is not going it alone in this endeavor; it plans to collaborate with tech companies, other media organizations, and regulators. The goal is to develop generative AI safely, in a manner that doesn't compromise trust in the news industry. The broadcaster intends to launch several projects to explore the use of generative AI and to understand the opportunities and risks involved.

Guarding digital assets against data scraping

While the BBC is diving headfirst into the AI realm, it is also taking measures to guard its digital assets. The broadcaster has blocked web crawlers from OpenAI and Common Crawl from accessing its websites, following in the footsteps of CNN, The New York Times, and Reuters. The decision is aimed at protecting the interests of its licence fee payers and maintaining the integrity of its data.
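The article does not detail the mechanism, but publishers typically implement such blocks through robots.txt rules targeting the crawlers' user agents, which are GPTBot for OpenAI and CCBot for Common Crawl. As a minimal sketch, assuming a block of this kind, Python's standard urllib.robotparser can be used to check whether a site's robots.txt disallows those crawlers:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt (URL used here for illustration only)
rp = RobotFileParser("https://www.bbc.co.uk/robots.txt")
rp.read()

# Check whether the OpenAI and Common Crawl user agents may fetch a sample page
for bot in ("GPTBot", "CCBot"):
    allowed = rp.can_fetch(bot, "https://www.bbc.co.uk/news")
    print(f"{bot} allowed: {allowed}")
```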
