UK is working on rules for training AI models on creative work

British ministers are working on plans to increase transparency into how tech companies train their artificial intelligence models after the creative industries raised concerns about work being copied and used without permission or compensation.

Culture Secretary Lucy Frazer told the Financial Times that the government would make its first attempt to create rules around the use of material such as TV programmes, books and music by AI groups.

Frazer said ministers would initially focus on ensuring greater transparency over the content used by AI developers to train their models, essentially allowing the industry to see if the work it produces is being ripped off.

Rishi Sunak’s government is caught between competing objectives: strengthening Britain’s position as a global center for AI and protecting the country’s leading creative sector. The general election expected this year, with Sunak’s Conservatives trailing in the polls, is also likely to limit the work ministers and civil servants can do.

Frazer said in an interview that she recognized that AI represented a “major problem, not just for journalism, but also for the creative industries.”

“The first step is just being transparent about what [AI companies] are using. [Then] there are other issues that people are very concerned about,” she added. “There are questions about opt-in and opt-out [for content to be used], and remuneration. I am working on all those things together with the industry.”

Frazer declined to say what mechanisms would be needed to achieve greater transparency so that rights holders can understand whether the content they produced was used as input to AI models.

Better transparency around the rapidly evolving technology will mean that rights holders can more easily detect infringements of intellectual property rights.

People close to the work said the government would try to come up with proposals ahead of elections, expected in the fall. Asked about the timing, Frazer said she was “working with the industry on all those things.”

Executives and artists in the music, film and publishing industries are concerned that their work is being unfairly used to train AI models developed by technology groups.

Last week, Sony Music wrote to more than 700 developers calling on them to disclose the sources used by their AI systems. In the strongly worded letter, the world’s second-largest music group stated that it was opting out of any use of its music for the training, development or commercialization of AI systems.

The EU is already preparing to introduce similar rules under its AI law, which will require developers of general-purpose AI models to publish a “sufficiently detailed” summary of the content used for training, and implement a policy to respect the bloc’s copyright laws.

Britain, on the other hand, has been slow to draw up similar rules. Officials have admitted there is a conflict between ministerial ambitions to attract fast-growing AI companies to Britain with a more favorable regulatory environment and the need to ensure companies in the creative industries are not exploited.

An attempt to create a voluntary set of rules agreed upon by rights holders and AI developers failed last year, forcing officials to reconsider next steps.

Frazer said the government wanted to create a “framework or policy” around transparency, but noted that “very complex international issues are developing rapidly.” She said Britain needed to ensure it had “a very dynamic regulatory environment”.
