Creatives argue that having to ‘opt out’ of allowing developers to use their work to train generative AI software is unworkable and favors the tech industry.
LONDON — The U.K. has seen a fierce debate play out in the news in recent weeks over copyright changes related to artificial intelligence development.
In one corner are British stars with global pulling power — musicians such as Dua Lipa, Elton John, Paul McCartney and a whole slew of actors, writers and other creatives.
In the other corner are such tech giants as Google, Apple and ChatGPT-creator OpenAI, which are all keen to gain free access to high-quality materials with which to train their generative AI models.
In the middle is the government, which is striving to turn Britain into a global hub for AI in a bid to boost growth.
Government officials are not having an easy time adjudicating the dispute. Prime Minister Keir Starmer’s Labour administration has faced heavy fire from creative artists after announcing plans to alter rules on how developers can use content such as songs, art, photos, plays and books to train AI software.
In a consultation that closed last month, ministers stated that their preferred option was to introduce an “opt-out” system for creative artists who did not want to agree to AI developers having “fair use” of their copyright for machine learning purposes.
The artists have argued that an opt-out system is unworkable. More than 1,000 musicians — including Dua Lipa, Kate Bush and Blur’s Damon Albarn — last month released a silent album in protest. The tracklisting for the record spelled out the message: “The British government must not legalise music theft to benefit AI companies.”
Ursula Smartt, an associate professor of law at Northeastern University in London, says the opt-out model favored by the British government is derived from Article 4 of the European Union’s Digital Single Market Directive, legislation that came into force after Brexit and so was never adopted by the U.K.
Smartt says the government’s preference to introduce a text and data mining exception for commercial uses, binding copyright owners unless they opt out, “will not deliver sufficient control for authors or creators.”
“The proposed exception is disproportionate, uncertain and at odds with obligations under current U.K. and international copyright law,” argues Smartt, a media law expert.
“The associated opt-out system is unfair and technically unworkable. The net effect of this approach would be to seize intellectual property rights from authors and creators, with no means for redress or compensation. In other words, completely denying them control.”
Such an exception would allow AI developers to use artistic works at an “industrial scale,” says Smartt, and would likely result in an array of legal battles in the U.K.’s intellectual property court based on the “fair use” test, similar to cases currently making their way through U.S. courtrooms.
Alice Helliwell, an assistant professor in philosophy who researches AI and creativity, says there has been a long tradition of artists being “heavily influenced” by those who have practiced the craft before them, but that AI learning is seen as something separate.
“It is complicated because we have long histories of art being referential in some way,” she says. “There are plenty of arts and artists that use the work of others, or are influenced heavily by the work of others.
“But there is this view that that is quite different from an AI system being trained upon that work. People see this as a risk to their line of work, to their creative expression. They have spent a long time honing their craft and then it just ends up being fodder for AI — they see that as some sort of threat.”
Automation in the creation of materials, arriving first with the printing press and later with photography, has tended to be seen as a threat to creativity before being embraced and even incorporated into different forms of art, says Helliwell, who highlights photography’s influence on abstract art movements such as Cubism.
The Northeastern professor, who is based in London, says AI systems are able to create “novel” images, rather than just reproductions, after being trained on human-created images.
But the sheer scale of the objection from creative artists to the potential copyright changes, she argues, stems from the fact that AI developers are seen as making money off the back of an industry in which it can already be a struggle for people to make a living.
“It is not so straightforward as just saying, ‘Oh, it is copying my work’ — that is not how these systems are working, so the objection is a little deeper than that,” adds Helliwell, who made a submission to the government’s consultation.
“It is not that this is straightforward stealing but yet it is still basing what it is producing on the work of humans on a grand scale. And I think that is the concern. I think if people were fairly compensated in the arts in all cases, this may be less of a problem.
“Would you care about your work being used in a training set if you knew that it wasn’t copying directly and you were being paid for your work fairly? It is this perception that, ‘You’re taking my labor, you’re taking my work that I’ve taken a long time to develop, and I get nothing from that — and it is threatening my industry.’ I think that is a big part of the issue.”
Artists have readily voiced their opposition to the proposed copyright change, and newspapers have printed submissions on the topic from the likes of former Beatles members and actors such as Julianne Moore, because it is a concern the publishing industry shares.
On Feb. 25, every major newspaper in the U.K. published a wraparound advertisement for the industry’s “Make It Fair” campaign. The industry is pressing ministers to uphold current copyright laws and ensure that AI companies are required to declare the source of their training data as well as pay a license fee for the British-created material they scrape from the internet.
The government being sandwiched between the demands of the AI sector and artists is a “typical example of the double-sided role of the media in the accountability process,” says Gergana Dimova, associate professor in politics and sociology on Northeastern’s London campus.
“On the positive side, the media elevates the salience of voices demanding enhanced copyright protection, and then it channels these demands to the government,” explains Dimova, who researches how democracies function.
“In this sense, the media supplies the government with information about existing sentiments and provides a marketplace of ideas to discuss them. On the flip side, governments are sometimes forced into substituting potentially salubrious long-term decisions in favor of short-term fixes meant to silence the loudest critics in the room.”
Ministers are faced with a tough decision, Dimova points out. Do they side with a creative sector that added £125 billion ($161 billion) in gross value to the U.K. economy in 2023? Or do they focus on the future potential of AI, which is estimated to be capable of lifting national productivity by 1.5% annually over the next decade?
Dimova says the hope must be that all those involved in the democratic process — from the media, to interest groups, to lawmakers — will be able to agree on a copyright policy that not only pleases the stakeholders involved but also recognizes the “responsibility to materialize potential for the wider society.”