4 Sources
[1]
UK should back licensing-first approach for AI training, says upper house committee
LONDON, March 6 (Reuters) - Britain should reject any move to let artificial intelligence companies freely mine copyrighted material for commercial model training and instead adopt a licensing-first regime, a committee in the upper house of parliament said on Friday.

Governments worldwide are wrestling with how copyright should apply to AI training, as developers scrape vast amounts of online material to build models and creators say they are losing control of their work. Britain has been consulting on the issue but has yet to confirm a final approach after stepping back from an earlier preference for allowing commercial text-and-data-mining with an opt-out for creators.

Technology minister Liz Kendall said in January the government was seeking a "reset" on its AI copyright plans, calling its earlier proposal a mistake and saying the review would put "reward and control" for artists at its centre. The government is due to publish its review in March.

The House of Lords, the unelected second chamber of the UK Parliament, scrutinises legislation and conducts inquiries that shape government policy. Its communications and digital committee warned in a 180-page report that Britain risks long-term dependence on opaque foreign AI systems.

CALL TO DROP TEXT-MINING EXCEPTION

Britain faces a choice between becoming a leader in responsibly trained, transparently developed AI models, the committee said, or sliding into "tacit acceptance of large-scale, unlicensed use" of copyrighted works by mostly U.S.-based developers, a path it said could undermine creative livelihoods.

The upper house urged the government to formally abandon proposals for allowing commercial text-and-data-mining with the opt-out. It said similar opt-out systems in the European Union had "failed to support a strong licensing market" and were built on technical tools that were unreliable, patchy and burdensome for individuals.

($1 = 0.7502 pounds)

Reporting by Sam Tabahriti.
Editing by Mark Potter.
[2]
UK government delays AI copyright rules amid artist outcry
The UK government is working on a controversial data bill that would allow AI companies like Google and OpenAI to train their models on copyrighted materials without consent. However, following a two-month consultation, it looks like passage of the law will be delayed. "Copyright is going to be kicked down the road," a person with knowledge of the matter told The Financial Times.

Responses by stakeholders during the consultation period weren't favorable to any of the government's proposed ideas for use of copyrighted materials, the FT's sources said. There's no expectation now that an AI bill will be part of the King's Speech set for May this year. As a result, ministers have decided to go back to the drawing board and spend more time exploring other options.

The House of Lords Communications and Digital Committee called on the government to develop a licensing-first regime "underpinned by robust transparency that safeguards creators' livelihoods while supporting sustainable AI growth."

The UK parliament's preferred position on the bill (also argued by tech giants like Google) has been that copyright holders need to formally opt out if they don't want their materials used to train AI models. However, publishers, filmmakers, musicians and others have said that this would be impractical and an existential threat to the UK's creative industries. The House of Lords took the side of artists and introduced an amendment that would require tech companies to disclose which copyright-protected works were used to train AI models. That addition, however, was blocked by the UK's House of Commons in May last year.

The UK's majority Labour government -- already under fire for its handling of the economy -- has taken hits from publishers, musicians, authors and other creative groups over the proposed law. Elton John called the government "absolute losers" while Paul McCartney said that AI has its uses but "it shouldn't rip creative people off." McCartney and other artists were part of a "silent album" meant to show the impact of IP theft by AI.

Baroness Beeban Kidron from the House of Lords has also ripped the government over the AI bill. "Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work, and then rent it back from those who stole it," she said last year. "It's astonishing that a Labour government would abandon the labor force of an entire section."
[3]
UK arts must not be sacrificed for speculative AI gains, peers say
Ministers urged to abandon plans to let tech firms use work of novelists, artists and writers without permission

The UK's creative industries must not be sacrificed in the pursuit of speculative gains in AI technology, a House of Lords committee has warned, as the government prepares to reveal the economic cost of proposals to change copyright rules.

A report by peers has urged ministers to develop a licensing regime for the use of creative works in AI products and abandon proposals to let tech firms use the work of novelists, artists, writers and journalists without permission. The call from the House of Lords communications and digital committee comes as the government prepares to release an economic impact assessment of proposed changes to copyright law, as well as a progress update on a consultation about the legal overhaul, by a deadline of 18 March.

Barbara Keeley, a Labour peer and committee chair, said the UK's creative industries faced a "clear and present danger" from AI firms using their work without credit or payment. "AI may contribute to our future economic growth, but the UK creative industries create jobs and economic value now," she said. Official figures show the creative sector contributes £146bn a year to the UK economy.

"Watering down the protections in our existing copyright regime to lure the biggest US tech companies is a race to the bottom that does not serve UK interests. We should not sacrifice our creative industries for AI jam tomorrow," Lady Keeley added.

The government has been consulting on a new intellectual property framework for AI. The technology requires vast amounts of data, including copyright-protected work taken from the open web, to develop tools such as chatbots and image generators. However, British artists have responded with outrage at the main government proposal of letting AI firms use copyright-protected work without the owner's permission - unless the owner has signalled that they want to opt out of the process.
Elton John is among the artists who have protested over the prospect of a relaxation in copyright law, calling the government "absolute losers".

The House of Lords report, titled "AI, copyright and the creative industries", also urges the government to formally rule out the proposal to let AI firms use copyright-protected material. Other recommendations include supporting the development of a licensing market that ensures artists are paid by tech companies for use of their work; backing UK-developed AI models; requiring AI companies to reveal the data they have used to develop their products; and giving creators greater rights-based protection against deepfakes.

As well as the main government proposal, ministers have suggested three further options: to leave the situation unchanged; to require AI companies to seek licences for using copyrighted work; or to allow AI firms to use copyrighted work with no opt-out for creative companies and individuals.

The government has refused to rule out a copyright waiver for using material for the purposes of "commercial research", which creative professionals fear could be exploited by AI firms to take artists' work without permission. The notion of a commercial research exemption was raised in the Lords this week and Fiona Twycross, a minister at the Department for Culture, Media and Sport, said it would be "pre-emptive" to rule out any exception before the update report was published.

A government spokesperson said: "The government wants a copyright regime that values and protects human creativity, can be trusted, and unlocks innovation. We welcome the committee's contributions, and we will continue to engage closely with parliament going forwards."
[4]
AI and copyright - in a welcome move, UK legislators reject tech vendor claims, warn of existential danger to Britain's creative sectors
Britain's $165 billion creative industries face a "clear and present danger" from generative AI, and government must not sacrifice "our outstanding creative capacity for speculative AI gains". That's the stark, but welcome, conclusion of the Communications and Digital Committee of the UK Government's upper legislature, the House of Lords, in an 83-page report published today from its Inquiry into AI and Copyright.

From November 2025 to January this year, the Committee heard in-person testimony from all sides of the debate, including vendors, academics, copyright experts, news organisations, and industry groups, with written submissions published earlier this month. After considering all the evidence, the Lords' judgement could not be clearer. The report says:

The UK faces a choice between two futures. In the first, the UK becomes a world-leading home for responsible, licensing-based artificial intelligence (AI) development, where commercial model developers using UK content obtain permission, pay fair remuneration to rightsholders and can deploy their models without questions of legal liability. Domestic AI efforts would be directed towards building sovereign models whose training data and development processes are open to scrutiny. In this scenario, both the UK's creative industries and AI sector could thrive, building on our national strengths and unique selling point of innovating in creative technology.

[But] in the second scenario, the UK continues to drift towards tacit acceptance of large-scale, unlicensed use of creative content and long-term dependence on opaque models trained overseas, with most benefits accruing to a small number of US-based firms while harms to UK creators grow.

Until recently, the UK Government's position favoured the latter, though it claims to have undergone a policy "reset" since Christmas, despite forging a $2 billion strategic partnership with ChatGPT maker OpenAI.
The report warns:

Only the first path is compatible with the UK's long-term interests. The UK's creative industries are an economic powerhouse that contributed £124 billion [$165 billion] to the UK economy in 2023, with gross value added expected to reach £141 billion [$188 billion] by 2030. Their success is underpinned by a 'gold-standard' copyright framework, which rewards creativity, supports sustainable business models for creative work, and commands international respect.

By contrast, the Committee notes that Britain's AI sector contributed only £12 billion to the economy in 2024 and employed just 86,000 people, compared to the 2.4 million jobs supported by its other creative industries. The report continues:

In the age of AI, the protections for creators afforded by copyright are under threat. Generative AI systems can now generate imitations of creative material in seconds, but speed is not a substitute for the value of the human creativity, skill and dedication that underpin original work. And these capabilities of AI systems depend on training models on vast quantities of human-created content, much of it copyrighted and drawn directly from the creative sector.

It adds:

This is not because our copyright framework is outdated or in need of reform. Rather, widespread unlicensed use of protected works, coupled with limited transparency from AI developers about how their models have been trained, leaves rightsholders unsure about whether their content has been used, and unable to enforce their rights when it has. In addition, the absence of a robust 'personality right' or specific protection for digital likeness in the UK means creators and performers are unable to challenge harmful outputs that imitate their distinctive style, voice, or persona. These problems pose material risks to the livelihoods of individual rightsholders.
Creators are already losing meaningful control over how their works and identities are used, leading to tangible economic harms, while an influx of AI-generated content in the market is replacing human-made work and undercutting paid commissions.

Meanwhile, "technology sector stakeholders" - principally large US vendors, such as Inquiry witness Google - have been pressing for the introduction of a broad exception for commercial text and data mining (TDM). Until recently, this was Downing Street's preferred option too, as stated in the run-up to its public consultation. If introduced, this would legitimise large-scale AI training on copyright-protected works; indeed, it would have the effect of retrospectively legitimising actions that were illegal even under US law, such as the scraping of pirated, in-copyright books. As such, it would undermine ongoing lawsuits and class actions by rightsholders against those vendors, potentially losing them billions of dollars.

The Committee rejects vendors' claims that reform is essential to growth, competitiveness, and progress, saying:

There is only limited evidence to show that weakening UK copyright law would significantly expand our AI sector. In contrast, a broad commercial TDM exception presents predictable harms to rightsholders by removing incentives to license protected works for AI training.

On that point, it is worth noting that the unprecedented expansion of the AI industry since the launch of ChatGPT has taken place under existing copyright laws - strained though they have become. And it has also occurred during previous administrations' focus on AI safety, transparency, and fairness in both the UK and the US. So, where is the evidence of AI sector harm?
The coruscating report comes less than a fortnight before Downing Street has pledged to unveil its latest thinking on the topic, which has seen vendors such as Google state - before the Committee - that anything published online is fair game for AI training, while repeating the bogus claim that AI reads and learns about the world in the same way that a human does. It does not: it is fed on millions of documents by a corporation. In response, artist rights campaigners have claimed that vendors' position is questionable at best, given the industrial-scale scraping of copyrighted works, the core purpose of which is to create automated competitors, without consent, credit, or payment to the originators.

In general, the Committee sides with the views expressed by Britain's creative sectors, and against the claims made by AI witnesses to the Inquiry. Surprisingly, that position is shared by trade body UKAI, whose own report a year ago described the government's preferred option of opting creators into a TDM exception as "misguided", "damaging", and "divisive". Yet for some reason, UKAI's voice has not been heard in this debate since then, and - when questioned in writing by diginomica - the Department of Science, Innovation, and Technology (DSIT) has, on three separate occasions, refused to comment on the UKAI report, or on why the government has seemingly ignored the opinion of the domestic industry it claims to be helping.

But few people have supported the UK Government's position. For example, the public consultation last year saw its preferred option of opting creators into AI training by default supported by just eight percent of the public, while Option One, to strengthen copyright laws and require licensing in all cases, was supported by eighty-eight percent of 11,512 respondents. However, earlier this year, the UK's Intellectual Property Office hinted that it was taking that overwhelming response with a pinch of salt.
On 13 January 2026, Matt Cope, the IPO's Deputy Director of AI, Missions, and Technology, told a Westminster policy forum on intellectual property:

These are really important messages, but they have to be understood in light of the fact that the vast majority of respondents were from the creative industries. A consultation is about evidence and context, and we have to consider the findings in the round and what it means across the board. They only give us part of the picture.

This was seen by many as an outrageous response from a body whose remit is, after all, the protection of Britain's intellectual property. My own view is that Inquiry witness Ed Newton-Rex - a composer, technologist and artist rights campaigner - was quite correct when in January 2026 he wrote:

If you're trying to build AGI, you are by definition trying to compete with your training data. This is one reason training data must be licensed.

So, what are the Committee's own recommendations to government? It makes several, but the strongest and most urgent ones are as follows:

(1) It should make transparency about AI training data into a statutory obligation via a clear, mandatory transparency framework for AI developers. Any regime should be carefully designed to ensure disclosures are sufficiently granular to meet rightsholders' needs, while avoiding disproportionate burdens, particularly on small UK-based AI firms.

(2) Most importantly in my view, the UK Government should rule out a new commercial text and data mining exception with an opt-out model. My previous diginomica reports on this subject have explained why an opt-out is, in the real world, a fantasy solution that will do nothing to prevent works from being exploited by opportunistic developers - the very existence of pirate libraries being ample evidence of this.
The government was right to reset its initial approach to AI and copyright, notes the Committee, but mixed public messaging since then and an extended consultation have undermined trust and stalled licensing and investment in the meantime. Accordingly, the Government should, within the next year, publish a final decision. In fact, it may come within the fortnight, with any hold-up doubtless due to the deteriorating international situation.

So, will Downing Street heed the Committee's recommendations? Party politics aside, the problem is this: Britain has a stubborn, obstinate, tin-eared Prime Minister with an anti-Midas touch. Given a full range of options and sage, evidenced, popular advice to take just one of them, Sir Keir Starmer will invariably do the opposite in the apparent belief that growth will magically result from his instincts. Alas, those instincts are faulty.

In this, he is backed by the self-styled think tank the Tony Blair Institute for Global Change, run by Britain's former Prime Minister - the only organisation in Britain, aside from the local offices of US Big Techs, that is urging a change in UK copyright law, and one that receives a large amount of its funding from US Big Tech. In a sane world, those financial relationships would disqualify Blair's lobbyists from having a direct impact on government policy. But we don't live in that world; instead, the Institute has put itself at the centre of influencing these decisions, supported by Starmer, with some staff moving between the TBI and US vendors.

Another problem is this: the Communications and Digital Committee has form on this issue. Its earlier Inquiry into Large Language Models covered much of the same ground and was presented with even more oral and written evidence over an extended period. Its conclusions were similarly blunt: the UK should protect and enforce its copyright laws, not sacrifice them on the altar of American corporations' greed.
But the then Prime Minister Rishi Sunak ignored that report and was accused by the Committee of sitting on his hands and, through inaction, favouring US Big Tech's wishes. In other words, the law was there, and it was abundantly clear, but Sunak - now a paid advisor to both Microsoft and Anthropic in his post-PM life - chose not to push for its enforcement.

Meanwhile, Sunak's attempts to put Britain in the vanguard of AI safety have been bulldozed aside by the Trump 2.0 administration, and many of the same US vendors who lined up to support the AI Safety Summit at Bletchley Park are now cheering the abandonment of that policy. Hell, even Britain's AI Safety Institute has been renamed as the AI Security Institute. What more do you need to know?

So, will Starmer fare any better with his supposed US allies and innovation partners? No, of that you can be certain: he is dealing with a wildly unpredictable White House, the relationship with which has soured to a new low in recent days over the response to the Iran crisis, and a group of super-powerful corporations that see a one-off opportunity to sweep aside regulations and encircle the world's IP - permanently.

To which I'd make a passionate plea: for once in your political life, Prime Minister, stop barging people into the mud like they are on a second-rate Surrey rugby field, and listen to good advice. The Committee is right. UKAI is right. And the TBI is not only wrong, but it should also have no role in UK policymaking. You can only get this wrong once. If you do, we are all screwed.
A House of Lords committee has urged the UK government to abandon plans allowing AI companies to freely use copyrighted material and instead adopt a licensing-first regime. The call comes as Britain's £146bn creative industries face what peers describe as a 'clear and present danger' from generative AI, with passage of controversial AI copyright rules now delayed amid fierce opposition from artists, musicians, and creators.
The UK government faces mounting pressure to overhaul its approach to AI copyright as a House of Lords committee released a scathing 180-page report urging ministers to formally abandon proposals for commercial text-and-data-mining (TDM) with an opt-out for creators [1]. The communications and digital committee warned that Britain risks long-term dependence on opaque foreign AI systems if it continues down its current path, calling instead for a licensing-first approach that puts fair compensation and creator rights at the center [3].
The committee's findings represent a direct challenge to tech giants like Google and OpenAI, which have lobbied for broad exceptions allowing them to train AI models on copyrighted material without permission. The report noted that similar opt-out systems in the European Union had "failed to support a strong licensing market" and were built on technical tools that were unreliable, patchy and burdensome for individuals [1]. Britain's creative industries contribute £146bn annually to the UK economy, dwarfing the AI sector's £12 billion contribution in 2024 [4].

Barbara Keeley, a Labour peer and committee chair, emphasized the existential threat to livelihoods facing UK creators. "AI may contribute to our future economic growth, but the UK creative industries create jobs and economic value now," she said, warning against watering down copyright protections to "lure the biggest US tech companies" [3]. The committee's report stressed that Britain's creative sector employs 2.4 million people compared to just 86,000 in AI, with gross value added expected to reach £141 billion by 2030 [4].
The House of Lords committee outlined two starkly different futures for Britain. In the first scenario, the UK becomes a world-leading home for responsible, licensing-based AI development where commercial model developers obtain permission and pay fair remuneration to rightsholders. In the second, Britain drifts toward "tacit acceptance of large-scale, unlicensed use of creative content and long-term dependence on opaque models trained overseas, with most benefits accruing to a small number of US-based firms while harms to UK creators grow" [4].

Following a two-month consultation period, passage of AI copyright rules will be delayed as ministers go back to the drawing board. "Copyright is going to be kicked down the road," a person with knowledge of the matter told The Financial Times [2]. Responses by stakeholders during the consultation weren't favorable to any of the government's proposed ideas for use of copyrighted material, with no expectation that an AI bill will be part of the King's Speech set for May this year [2].

The UK government is due to publish its review in March, including an economic impact assessment of proposed changes to copyright law and a progress update on the consultation by a deadline of March 18 [3]. Technology minister Liz Kendall said in January the government was seeking a "reset" on its AI copyright plans, calling its earlier proposal a mistake and saying the review would put "reward and control" for artists at its center [1].
The UK's majority Labour government has taken significant hits from publishers, musicians, authors and other creative groups over the proposed legislation. Elton John called the government "absolute losers" while Paul McCartney said that AI has its uses but "it shouldn't rip creative people off" [2]. McCartney and other artists participated in a "silent album" meant to show the impact of intellectual property theft by AI developers [2].

Baroness Beeban Kidron from the House of Lords criticized the government's position sharply: "Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work, and then rent it back from those who stole it. It's astonishing that a Labour government would abandon the labor force of an entire section" [2].

The House of Lords report includes several critical recommendations beyond the licensing-first approach. The committee urged the government to require AI companies to reveal the data they used to develop their products, supporting greater transparency around how AI developers train their models [3]. Additional recommendations include backing UK-developed AI models and giving creators greater rights-based protection against deepfakes [3].
The government has refused to rule out a copyright waiver for using material for purposes of "commercial research," which creative professionals fear could be exploited by AI firms to take artists' work without permission [3]. The House of Commons previously blocked an amendment from the House of Lords that would have required tech companies to disclose which copyright-protected works were used to train AI models [2]. As the March deadline approaches, the economic impact of these competing visions for innovation and creator protection will become clearer, with implications extending far beyond Britain's borders.