Jun 26, 2023

UK Music sets out priorities for regulation of music-making AI

The boss of cross-sector trade group UK Music, Jamie Njoku-Goodwin, last week set out the British music industry’s priorities regarding the regulation of artificial intelligence, and in particular the AI tools that automatically generate new content.

In an opinion piece published by Music Week and on the UK Music website, Njoku-Goodwin declared: “We must ensure that AI technologies are developed in a way that supports human culture and artistry, rather than eroding our creative endeavours and threatening the deeply personal creations we all cherish – whether that’s music, literature or other great works of art”.

Law-makers across the world are considering how best to regulate rapidly evolving AI technologies, including generative AI. The latter poses lots of questions around copyright law, but also around personality and publicity rights, plus there is increasingly lots of talk about the need for more transparency.

The music industry is adamant that if a generative AI tool is trained by crunching data connected to existing songs and recordings – so, for example, the AI crunches digital music files – then that activity needs to be licensed by whoever owns the copyright in that existing music. Which would give the copyright owner the right to veto any such training and also make some money.

However, some companies developing generative AI tools argue that they can make use of copyright protected content for training purposes without getting permission, because that activity is covered by a so-called copyright exception. Or at least it is in whichever country they have set up their servers to crunch the data.

That argument is still to be properly tested. Though at one point the UK government said it would actually add a new copyright exception into British copyright law for data and text mining, which would specifically cover a lot of that activity. That controversial proposal was dropped, though copyright owners are still seeking clear confirmation from the UK government that the training of AI technologies with copyright material must be licensed.

Of course, the AI-generated music that has got the most attention of late is that which imitates the voices of famous artists. Which poses the additional question: can an artist protect their voice, over and above controlling the training of any AI with their recordings through copyright? It’s possible that can be done through legal concepts like personality rights or publicity rights. Though that is still to be properly tested. Plus that concept doesn’t currently exist in UK law.

The other big talking point around generative AI is the need for more transparency. It’s argued that generative AI platforms – and users of those platforms – should be more transparent about when AI has been used in the creative process, especially if a piece of content is predominantly or completely AI generated. And there should be more transparency about what content was used to train that AI, both generally and specifically in relation to each piece of content.

All these things are now being debated by law-makers in multiple jurisdictions who are looking into how to regulate AI in general and generative AI in particular. In the European Union, an AI Act is now in its final stages of negotiation.

That legislation was originally focused on AI technologies which EU law-makers reckon pose an “unacceptable risk”, such as those that “deploy subliminal or purposefully manipulative techniques, exploit people’s vulnerabilities or are used for social scoring”. However, with generative AI becoming such a big talking point while those proposals were being debated, the final draft of the AI Act includes some regulation of that too.

There has been plenty of debate around regulating AI and generative AI in the US as well – with a recent Congressional hearing specifically considering the copyright questions. And China’s internet regulator has also set out some new draft measures.

Meanwhile, in the UK, the government is currently drafting its own code of practice that will provide guidance for AI firms using copyright protected works. It’s in relation to that draft code that Njoku-Goodwin published his opinion piece last week. “It’s imperative that this code of practice promotes the highest ethical standards of copyright and intellectual property protection, and we protect the property rights of creators”, he wrote.

In terms of the music industry’s priorities, Njoku-Goodwin stated: “Government must put copyright and IP protection at the heart of its approach to AI, and commit categorically to there being no new copyright exceptions”.

“Secondly”, he went on, “there must be an obligation for adequate record keeping. We need to know exactly what content an AI has been trained on”.

“The same applies across the generative AI field. While it’s exciting to play around with large language models like ChatGPT, don’t forget that we have no idea what the inputs for these technologies were. We don’t know what trained them, or whether the creators consented, or what biases may have been introduced unwittingly”.

“Thirdly, labelling. It’s important to know whether something has been generated by a computer, or if it is a real human product. This is true not just for music, but for all manner of content. Advertisements, political campaign materials, reports, advice. This helps not only the creator, but also protects the consumer”.

And finally, he said, “we must rapidly look at the issues around the protection of personality rights, and ensure that there are adequate protections in order to keep pace with the rapid development of AI in the creative space”.

“Other countries are ahead of us”, the UK Music boss then noted. “It would be quite something if we were to end up in a situation where the UK’s regulatory framework ends up less transparent and with weaker protections for copyright and property rights than the People’s Republic Of China”.

“If our standards are set at a low bar then this would undermine both the AI sector and creative industries in the long-term. The AI industry can only benefit from high standards as it needs a successful creative sector to generate ‘new’ products”.

Concluding, Njoku-Goodwin wrote: “There is much to be excited about and just as much to be rightly concerned about, given the ferocious pace of AI development. It’s absolutely critical we enshrine and uphold these key principles as we work our way through these issues”.

You can read Njoku-Goodwin’s full piece here.