Britain considers AI content labels to tackle deepfakes and disinformation

The words "Artificial Intelligence" are seen in this illustration taken on 31 March 2023. Reuters

Britain is considering introducing labels for AI-generated content to protect consumers from disinformation and deepfakes, the government said on Wednesday (18 March), as it sets out the next phase of its approach to regulating artificial intelligence.

Technology minister Liz Kendall said officials are also exploring how to balance safeguards for the creative industries with continued innovation in the fast-growing AI sector. She stressed that the government would take the time needed to "get this right."

The proposals form part of a broader review of copyright and artificial intelligence, which will examine risks such as unauthorised digital replicas, tools to help creators control how their work is used online, and support for independent creative organisations.

In 2024, Britain proposed easing copyright rules to allow developers to train AI models on legally accessed material, while giving creators the option to reserve their rights.

However, Kendall said that, following consultations with artists, technology firms, unions and academics, the government “no longer has a preferred option.”

“We will help creatives control how their work is used,” she said, adding that fair payment for artists and smaller organisations remains central to policy plans.

The announcement comes as Prime Minister Keir Starmer pushes to position the UK as a global leader in AI.

The sector is expanding rapidly. According to government officials, it is growing 23 times faster than the wider economy, and the UK's AI industry ranks behind only those of the U.S. and China in scale.

Governments and regulators worldwide are grappling with the rapid rise of AI systems capable of generating text, images and other content, raising legal and ethical concerns over the use of copyrighted material and the spread of misleading or manipulated media.
