18 Sources
[1]
Paul McCartney and Dua Lipa call on the UK to pass AI copyright transparency law
Last week, Paul McCartney, Dua Lipa, Ian McKellen, Elton John, and hundreds of others in the UK creative industry signed an open letter backing an effort to force AI firms to reveal the copyrighted works used to train their models. They support an amendment to the UK's Data (Use and Access) Bill, proposed by letter organizer Beeban Kidron, that would add the requirement, a change the UK government has opposed. The British House of Lords passed the amendment yesterday, 272 to 125, reports The Guardian, and now it's going back to the House of Commons, where the amendment could be removed again. The British government says the fight over the amendment "is holding back both the creative and tech sectors and needs to be resolved by new legislation," writes The Guardian. From the letter: We will lose an immense growth opportunity if we give our work away at the behest of a handful of powerful overseas tech companies, and with it our future income, the UK's position as a creative powerhouse, and any hope that the technology of daily life will embody the values and laws of the United Kingdom. Also signed by many media companies, music publishers, and arts organizations, the letter insists that the amendments "will spur a dynamic licensing market that will enhance the role of human creativity in the UK, positioning us as a key player in the global AI supply chain." Companies like OpenAI and Meta have been accused in court of using copyrighted material without permission to train their models. Baroness Beeban Kidron, who tabled the amendment, writes that although the UK's creative industries welcome creative advancements enabled by AI, "...how AI is developed and who it benefits are two of the most important questions of our time." "My lords," The Guardian quotes Kidron as saying yesterday, "it is an assault on the British economy and it is happening at scale to a sector worth £120bn to the UK, an industry that is central to the industrial strategy and of enormous cultural import."
[2]
Creatives demand AI comes clean on what it's scraping
Musicians, artists, writers, and actors urge government to protect copyright. More than 400 of the UK's leading media and arts professionals have written to the prime minister to back an amendment to the Data (Use and Access) Bill, which promises to offer the nation's creative industries transparency over copyrighted works ingested by AI models. Signatories include some of the UK's best-known artists such as musicians Paul McCartney, Elton John, Coldplay, writer/director Richard Curtis, artist Antony Gormley, and actor Ian McKellen. The UK government proposes to allow exceptions to copyright rules in the case of text and data mining needed for AI training, with an opt-out option for content producers. "Government amendments requiring an economic impact assessment and reports on the feasibility of an 'opt-out' copyright regime and transparency requirements do not meet the moment, but simply leave creators open to years of copyright theft," the letter says. The group - which also includes Kate Bush, Robbie Williams, Tom Stoppard, and Russell T Davies - said the amendments tabled for the Lords debate would create a requirement for AI firms to tell copyright owners which individual works they have ingested. "Copyright law is not broken, but you can't enforce the law if you can't see the crime taking place. Transparency requirements would make the risk of infringement too great for AI firms to continue to break the law," the letter states. Baroness Kidron, who proposed the amendment, said: "How AI is developed and who it benefits are two of the most important questions of our time. The UK creative industries reflect our national stories, drive tourism, create wealth for the nation, and provide 2.4 million jobs across our four nations. They must not be sacrificed to the interests of a handful of US tech companies." The letter was also signed by a number of media organizations, including the Financial Times, the Daily Mail, and the National Union of Journalists. Baroness Kidron added: "The UK is in a unique position to take its place as a global player in the international AI supply chain, but to grasp that opportunity requires the transparency provided for in my amendments, which are essential to create a vibrant licensing market." Labour peer Lord Brennan of Canton backed the amendment. "We cannot let mass copyright theft inflict damage on our economy for years to come," he said. "Transparency over AI inputs will unlock tremendous economic growth, positioning the UK as the premier market for the burgeoning trade in high-quality AI training data." Debate rages as to whether AI training should disregard copyright. For example, The Atlantic alleges that Meta, along with other GenAI devs, may have accessed millions of copyrighted books and research papers through the LibGen dataset. Researchers have speculated that OpenAI may have done the same, with those allegations forming part of lawsuits over the alleged use of copyrighted material. UK authors were alarmed to find their copyrighted books in the database. Meanwhile, the head of the US Copyright Office has reportedly been fired, a day after the agency concluded that AI models' use of copyrighted material went beyond existing doctrines of fair use. ®
[3]
Elton John and Dua Lipa urge Starmer to back UK artists in AI copyright row
Sir Paul McCartney, Richard Curtis and Dua Lipa are among the 400 top musicians, artists and media executives who have written to Prime Minister Sir Keir Starmer seeking support to protect copyright from being ripped off by artificial intelligence tools. The letter calls on Starmer to back, in a vote next week, an amendment introduced by Baroness Beeban Kidron, a crossbench peer, that would give transparency and protection over whether artists' work is being used to train AI models. Kidron's amendment would make tech giants tell copyright owners which individual works -- from music and books to films and newspapers -- they have used to train their AI models. This would allow companies and artists "to hold AI firms accountable for the mass theft of creative works that continues to take place", the letter says. The letter also has support from Coldplay, Sir Elton John, Russell T Davies, Antony Gormley and top executives from news groups such as the Telegraph and The Times. The Financial Times has also signed the letter. The amendment to the Data (Use and Access) Bill was this week defeated in the House of Commons, but will be voted on again in the House of Lords on Monday. The government has made its own amendments that would guarantee an economic impact assessment of different options, while ministers are retreating from a previously "preferred" position that would have meant that creative industries would need to opt out from their work being scraped by AI. Officials insist all options are on the table following the end of a consultation into various proposals earlier this year. However, executives are still concerned that tech groups will ultimately be allowed to override copyright rules unless the government provides legislative backing guaranteeing transparency and protection. The letter warns that if artists are forced to give away their work, the UK "will lose an immense growth opportunity . . . and with it our future income, the UK's position as a creative powerhouse, and any hope that the technology of daily life will embody the values and laws of the UK". The use of the data bill is the latest attempt by the creative industries to protect their copyright from being used by AI groups without attribution or payment. Kidron said the UK creative industries "must not be sacrificed to the interests of a handful of US tech companies". She added: "The UK is in a unique position to take its place as a global player in the international AI supply chain, but to grasp that opportunity requires the transparency provided for in my amendments, which are essential to create a vibrant licensing market." Lord Kevin Brennan of Canton, former MP and Labour peer, said: "We cannot let mass copyright theft inflict damage on our economy for years to come." The government did not immediately respond to a request for comment.
[4]
Opt out or get scraped: UK's AI copyright shakeup has Elton John, Dua Lipa fighting back
Celebrity musicians from Elton John to Dua Lipa are urging the U.K. government to rethink controversial plans to reform copyright laws that would allow artificial intelligence developers access to rights-protected content. An open letter signed by John, Lipa and a host of other high-profile artists this weekend called on Prime Minister Keir Starmer to back an amendment proposed by U.K. lawmaker Beeban Kidron to make the legal framework around AI model makers' use of copyrighted content stricter. "We are wealth creators, we reflect and promote the national stories, we are the innovators of the future, and AI needs us as much as it needs energy and computer skills," they said in the letter. "We will lose an immense growth opportunity if we give our work away at the behest of a handful of powerful overseas tech companies."
[5]
The UK's House of Lords kicks back bill that would let AI train on copyrighted content
It proposes an amendment to force companies to be transparent about their training. The UK's House of Lords just voted to add an amendment to a data bill that mandates that tech companies disclose which copyright-protected works were used to train AI models. The amendment faced government opposition but resoundingly passed with 272 votes to 125. The vote came just a few days after hundreds of artists and organizations joined together to urge the government not to "give our work away at the behest of a handful of powerful overseas tech companies." The artists involved in this push included Paul McCartney, Elton John and Dua Lipa, among many others. The government's preferred position has been a provision that would force copyright holders to formally opt out of having their works used to train AI models. Critics say this would be impractical and unworkable for many artists. The bill will now return to the House of Commons for another vote. If today's amendment is removed, it will likely lead to another confrontation with the House of Lords. "The House of Lords has once again taken the right decision by voting to establish vital transparency obligations for AI companies," Sophie Jones, chief strategist for the British Phonographic Industry, said. "Transparency is crucial in ensuring that the creative industries can retain control over how their works are used." This isn't the first time the House of Lords has demanded tech companies make clear whether they have used copyright-protected material when training AI models. Back in January, the body voted 145 to 126 in favor of adding amendments to the bill aimed at requiring such disclosure. "Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work, and then rent it back from those who stole it," Baroness Beeban Kidron of the House of Lords said during a debate before this week's vote. There are some signs that Prime Minister Starmer is backing off the proposed "opt-out" idea that would force creators to petition AI companies not to use their work. The government recently added its own amendments to the data bill that include a commitment to conduct an economic impact assessment and to publish reports on transparency with regard to licensing. Technology secretary Peter Kyle has been looking into a new proposal that would, instead, create a licensing system for copyright holders and AI developers.
[6]
UK Parliament opts not to hold AI companies accountable over copyright material
Ministers in the UK House of Commons have blocked an amendment to a data bill that would require AI companies to disclose their use of copyrighted materials. This transparency amendment was stripped out of the text by invoking something called financial privilege, an arcane parliamentary procedure that suggests that any new regulations would require a new budget. The official site of the UK parliament notes that financial privilege "may be used by the Commons as grounds for overruling any House of Lords proposal that has cost implications." It looks like that's exactly what happened here, with those in favor of removing the amendment bringing up the cost of a potential regulatory body. There were 297 MPs who voted in favor of removing the amendment, with 168 opposing. Chris Bryant, data protection minister, said that he recognized that this could feel like an "apocalyptic moment" for the creative industries, but that he thinks changes to copyright law need to be made "in the round and not just piecemeal." The amendment had passed the House of Lords earlier this week. Baroness Beeban Kidron of the Lords responded to today's move by saying that "the government failed to answer its own backbenchers who repeatedly asked 'if not now then when?'" She also said it was "astonishing that a Labour government would abandon the labor force of an entire sector," referring to the plight of creative workers whose jobs have been, or are at risk of being, replaced by AI. Lady Kidron went on to accuse the government of allowing "theft at scale" and cozying up "to those who are thieving." "Across the creative and business community, across parliament, people are gobsmacked that the government is playing parliamentary chess with their livelihoods," she concluded. As expected, Kidron will introduce a rephrased amendment before the bill's return to the Lords next week. This sets up yet another showdown when the bill returns to the Commons for another pass. Owen Meredith, the chief executive of the News Media Association, told The Guardian that it's "extremely disappointing that the government has failed to listen to the deep concerns of the creative industries, including news publishers who are so fundamental to upholding our democratic values." He accused the government of using parliamentary procedure to "dismiss industry concerns, rather than taking this timely opportunity to introduce the transparency that will drive a dynamic licensing market for the UK's immensely valuable creative content." The government's preferred plan relies on an opt-out clause. This would give AI companies free rein over any and all content, except in the cases when a creator has explicitly opted out. Last week, hundreds of artists and organizations signed an open letter urging the government not to "give our work away at the behest of a handful of powerful overseas tech companies." The artists involved in this campaign included Paul McCartney, Elton John and Dua Lipa, among others. America is set to host its own version of the "give everything to AI companies" game show. Republicans have snuck a provision into the budget bill that would bar states from regulating AI for ten years. That'll end well.
[7]
Elton John and Dua Lipa seek protection from AI
Julia Willemyns, co-founder of the Centre for British Progress think tank, said such proposals could hamper the UK and its bid for growth. The measures would "do nothing to stop foreign firms from using content from the British creative industries," she told the BBC. "A restrictive copyright regime would offshore AI development, chill domestic innovation, and directly harm the UK economy," she said. However, the letter comes amid mounting concern from artists over the inclusion of their works, and material protected by copyright, in the data used to develop generative AI systems. These tools, which can produce new content in response to simple text prompts, have become increasingly popular and available to consumers. But their capabilities have been accompanied by concerns and criticism over their data use and energy demand. In February, artists including Annie Lennox and Damon Albarn released a silent album to protest about the government's proposed changes to copyright law. The government carried out a consultation around its proposal to allow developers to use creators' content on the internet to help develop their models, unless the rights holders elect to "opt out". According to The Guardian, ministers were reconsidering the proposal following creator backlash. Author Kazuo Ishiguro pointed the BBC to an earlier statement in which he wrote, "why is it just and fair - why is it sensible - to alter our time-honoured copyright laws to advantage mammoth corporations at the expense of individual writers, musicians, film-makers and artists?" The Nobel Prize-winning author added that since then the only limited advance was that it now appeared the government had accepted the opt-out proposals were not likely to be workable. He thought a new consultation to find a fairer scheme was possible, though it remained to be seen how meaningful any consultation would be. "It's essential that they get this right," he wrote. MPs recently rejected a separate amendment tabled by Baroness Kidron that aimed to make AI developers accountable to UK copyright law. Now, she says transparency obligations for tech firms under the new proposed amendment could support the development of licensing agreements between creators and companies. "The UK is in a unique position to take its place as a global player in the international AI supply chain, but to grasp that opportunity requires the transparency provided for in my amendments, which are essential to create a vibrant licensing market," Baroness Kidron said. In its statement, the government said: "It's vital we take the time to work through the range of responses to our consultation, but equally important that we put in the groundwork now as we consider the next steps. That is why we have committed to publishing a report and economic impact assessment - exploring the broad range of issues and options on all sides of the debate."
[8]
UK ministers to block amendment requiring AI firms to declare use of copyrighted content
Government plans to use arcane procedure to strip amendment passed by House of Lords from its data bill. Ministers are turning to an arcane parliamentary procedure to block an amendment to the data bill that would require artificial intelligence companies to disclose their use of copyright-protected content. The government is planning to strip the transparency amendment, which was backed by peers in the bill's reading in the House of Lords last week, out of the draft text during a Commons debate on Wednesday afternoon by invoking financial privilege, meaning that there is no budget available for new regulations. The amendment, which would require tech companies to reveal which copyrighted material is used in their models, was tabled by the crossbench peer Beeban Kidron and was passed by 272 votes to 125 in a House of Lords debate last week. Kidron said: "Across the creative and business community, across parliament, people are gobsmacked that the government is playing parliamentary chess with their livelihoods. "Using parliamentary privilege is a way of not confronting the issue, which is urgent, for rights holders and the economy. The house is on fire and the government are playing croquet in the garden. "This is not a serious response, and we are horribly disappointed that a party that promised to put creativity into the DNA of the country - now in power - has turned its back. It will hurt them, it will hurt the country and it is already hurting creative industries who are witnessing their work being stolen at industrial scale." Kidron intends to respond to the government's block by tabling a rephrased amendment before the bill's return to the Lords next week, setting the scene for another confrontation. This could include removing the reference to regulation, or omitting a timeframe for it to be implemented. One industry insider said that "introducing moderate, proportionate transparency obligations" such as the Kidron amendments were necessary to protect creators' work from "wholesale abuse and theft by AI". He said: "This is a workable and pragmatic solution, which does not bind the government's hand on copyright, but would help facilitate a properly functioning licensing market for high-quality content. "Yet instead of listening to the overwhelming view from the House of Lords and addressing legitimate concerns by engaging on the issues, the government seems intent on using arcane parliamentary tricks to stand in the way of progress." Last week, hundreds of artists and organisations including Paul McCartney, Jeanette Winterson, Dua Lipa and the Royal Shakespeare Company urged the prime minister not to "give our work away at the behest of a handful of powerful overseas tech companies". The government's copyright proposals are the subject of a consultation due to report back this year, but opponents of the plans have used the data bill as a vehicle for registering their disapproval. The main government proposal is to let AI firms use copyright-protected work to build their models without permission, unless the copyright holders opt out - a solution that critics say is unworkable. The government insists, however, that the creative and tech sectors are being held back and this needs to be resolved through new legislation. It has already tabled one concession in the data bill, by committing to an economic impact assessment of its proposals.
[9]
Paul McCartney and Dua Lipa among artists urging Starmer to rethink AI copyright plans
Hundreds of leading figures from UK creative industries urge prime minister not to 'give our work away'. Hundreds of leading figures and organisations in the UK's creative industries, including Coldplay, Paul McCartney, Dua Lipa, Ian McKellen and the Royal Shakespeare Company, have urged the prime minister to protect artists' copyright and not "give our work away" at the behest of big tech. In an open letter to Keir Starmer, a host of major artists claim creatives' livelihoods are under threat as wrangling continues over a government plan to let artificial intelligence companies use copyright-protected work without permission. Describing copyright as the "lifeblood" of their professions, the letter warns Starmer that the proposed legal change will threaten Britain's status as a leading creative power. "We will lose an immense growth opportunity if we give our work away at the behest of a handful of powerful overseas tech companies and with it our future income, the UK's position as a creative powerhouse, and any hope that the technology of daily life will embody the values and laws of the United Kingdom," the letter says. The letter urges the government to accept an amendment to the data bill proposed by Beeban Kidron, the cross-bench peer and leading campaigner against the copyright proposals. Kidron, who organised the artists' letter, is seeking a change that requires AI firms tell copyright owners which individual works they have ingested into their models. Urging parliamentarians on all sides of the political spectrum and in both houses to support the change, the letter says: "We urge you to vote in support of the UK creative industries. Supporting us supports the creators of the future. Our work is not yours to give away." Spanning the worlds of music, theatre, film, literature, art and media, the more than 400 signatories include Elton John, Kazuo Ishiguro, Annie Lennox, Rachel Whiteread, Jeanette Winterson, the National Theatre and the News Media Association, which represents more than 800 news titles including the Guardian. Kidron's amendment will go to a House of Lords vote on Monday, although the government has already signalled its opposition to the change, saying that a consultation process already under way was the correct process for debating alterations to copyright law, which protects someone's work from being used by others without permission. Under the government proposal, AI companies will be able to use copyright-protected material without permission unless the copyright holder "opts out" of the process by indicating - in an as yet unspecified way - that they do not wish their work to be used for free. Giles Martin, the music producer and son of the Beatles producer George Martin, told the Guardian the opt-out plan could be impractical for young artists. "When Paul McCartney wrote Yesterday his first thought was 'how do I record this' and not 'how do I stop someone stealing this'," said Martin, who was the music supervisor on the documentary series The Beatles: Get Back and co-produced the "last" Beatles song Now and Then. Kidron said the letter's signatories were speaking out "to ensure a positive future for the next generation of creators and innovators". Supporters of the Kidron amendment claim the change will ensure creatives are compensated for the use of their work in training AI models via licensing deals.
Generative AI models, the term for technology that underpins powerful tools such as the ChatGPT chatbot or the Suno music-making tool, have to be trained on a vast amount of data in order to generate their responses. The main source of this information is online, including the contents of Wikipedia, YouTube, newspaper articles and online book archives. The government has submitted one amendment to the data bill that commits to officials carrying out an economic impact assessment of its proposals. A source close to Peter Kyle, the technology secretary, has told the Guardian that an opt-out system was no longer his preferred option. Officially, there are four options under consideration. The other three alongside the "opt-out" scenario are: to leave the situation unchanged; require AI companies to seek licences for using copyrighted work; and allow AI firms to use copyrighted work with no opt-out for creative companies and individuals. A government spokesperson said: "Uncertainty over how our copyright framework operates is holding back growth for our AI and creative industries. That cannot continue, but we're clear that no changes will be considered unless we are completely satisfied they work for creators."
[10]
House of Lords pushes back against government's AI plans
Peers back amendment to data bill requiring AI companies to reveal which copyrighted material they have used. The government has suffered another setback in the House of Lords over its plans to let artificial intelligence firms use copyright-protected work without permission. An amendment to the data bill requiring AI companies to reveal which copyrighted material is used in their models was backed by peers, despite government opposition. It is the second time parliament's upper house has demanded tech companies make clear whether they have used copyright-protected content. The vote came days after hundreds of artists and organisations including Paul McCartney, Jeanette Winterson, Dua Lipa and the Royal Shakespeare Company urged the prime minister not to "give our work away at the behest of a handful of powerful overseas tech companies". The amendment was tabled by crossbench peer Beeban Kidron and was passed by 272 votes to 125. The bill will now return to the House of Commons where the government is expected to remove it, setting the scene for another confrontation in the Lords next week. Lady Kidron said: "I want to reject the notion that those of us who are against government plans are against technology. Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work, and then rent it back from those who stole it. "My lords, it is an assault on the British economy and it is happening at scale to a sector worth £120bn to the UK, an industry that is central to the industrial strategy and of enormous cultural import." The government's copyright proposals are the subject of a consultation due to report back this year, but opponents of the plans have used the data bill as a vehicle for registering their disapproval. The main government proposal is to let AI firms use copyright-protected work to build their models without permission, unless the copyright holders signal they do not want their work to be used in that process - a solution that critics say is impractical and unworkable. The government insists, however, that the present situation is holding back both the creative and tech sectors and needs to be resolved by new legislation. It has already tabled one concession in the data bill, by committing to an economic impact assessment of its proposals. A source close to the tech secretary, Peter Kyle, said this month that the "opt out" scenario was no longer his preferred option but one of several being given consideration.
[11]
Lords examine new amendment to data bill to require AI firms to declare use of copyrighted content
It circumvents financial privilege grounds on which earlier version was rejected by MPs on Wednesday. A new amendment to the data bill that would require artificial intelligence companies to disclose their use of copyright-protected content has been tabled, after MPs voted to remove an earlier version on Wednesday. The amendment by cross-bench peer and former film director Beeban Kidron will be a fresh challenge to plans to let artificial intelligence firms use copyright-protected work without permission. It circumvents the financial privilege grounds - meaning there is no budget available for the regulation - on which its predecessor was rejected. The new wording states the government "may" make enforcement provisions rather than "must", and gives no detail about how the government could enforce them. It will be put to peers in the House of Lords for debate on 19 May after the earlier version of the amendment passed by 272 votes to 125 in a debate on Monday. Lady Kidron said: "We have accepted the Speaker's ruling on the Commons financial privilege and replaced the original amendment with another that would still offer transparency. "We very much hope that the government will accept it because it is in line with the review that they have proposed and the transparency that they have repeatedly said is a key to the outcome. But what it offers the creative industries and UK AI companies is a clear timeline, and a mechanism by which licensing and not stealing can become the norm." Owen Meredith, chief executive of the News Media Association, said: "This new amendment removes any potential direct spending implications for enforcement - which was the Commons' objection to the previous drafting - and would ensure copyright owners receive clear, relevant, accurate and accessible information about how their work is accessed and used, but gives the government flexibility over exactly how this is achieved. "The entire creative industries, the voting public, and multiple parliamentary reports and debates have given a clear view to the government that action now to ensure rights holders are better equipped to enforce the existing law, with the proportionate application of transparency, is a progressive way forward. It's time to act, not just 'listen'." In Wednesday's debate, the data protection minister, Chris Bryant, told MPs that although he recognised that for many in the creative industries this "feels like an apocalyptic moment", he did not think the transparency amendment delivered the required solutions, and he argued that changes needed to be completed "in the round and not just piecemeal". He added that the sooner the data bill was passed, the quicker he would be able to make progress on updating copyright law. The government's copyright proposals are the subject of a consultation due to report back this year, but opponents of the plans have used the data bill as a vehicle for registering their disapproval. The main government proposal is to let AI firms use copyright-protected work to build their models without permission unless the copyright holders opt out - a solution that critics say is unworkable.
[12]
Tech firms must tell newspapers when they use material to train AI, under Lords plan
Lord Black, who is the deputy chairman of the Telegraph Media Group, said the "centuries-old right" to copyright protection was in danger because the Government is "legalising theft" and allowing AI to "plunder someone else's work and profit from it". During the House of Lords debate, he argued that AI posed an "existential threat" to a free press, by allowing companies to steal news companies' content to train their models. He said: "AI has the capacity utterly to destroy independent news organisations because it feasts off millions of articles written by journalists without any attribution or payment, destroying the business model that makes the free press possible. "Without action this day, news will die in the cold darkness of cyberspace where no legal framework exists - the advertising which supports it taken by the platforms, its content stolen by AI. There will be only a husk left." News organisations are especially at risk of copyright violations by tech companies, many of which are looking to develop their own AI news services. Several of the UK's largest news companies, including The Telegraph, signed the letter to Sir Keir urging him to introduce a requirement for tech companies to inform the creators of content they have used. 'Threat to democracy' Lord Black added: "The term 'existential threat' is bandied around too much. But this is not crying wolf. "Unless we introduce transparency, control over content and fair remuneration within a dynamic licensing market, the threat to free media is genuinely existential. As a consequence the threat to democracy itself is also genuinely existential." The amendment, by Baroness Kidron, would require AI companies to publish details of copyrighted material they use to train models, and make it accessible to content owners upon request. Ministers have effectively abandoned earlier plans that would have given AI companies the power to train their models on copyrighted content unless the owner "opted out". Peter Kyle, the Science Secretary, is now considering a new licence-based model. The latest version of the Data Bill requires ministers to draw up a policy on AI and copyright within a year. However, the signatories of the letter argue that the process will take too long, and they will be forced to "give our work away at the behest of a handful of powerful overseas tech companies" if ministers do not act sooner.
[13]
British govt suffers setback in AI copyright battle
The British government suffered a setback to its plans to make it easier for AI companies to access data as the House of Lords backed more protection for content creators on Monday. The Labour government under Prime Minister Keir Starmer wants to introduce a copyright exception for commercial generative AI training with its Data (Use and Access) Bill. Under the proposed law, companies developing AI models would not need permission from creatives to access certain content -- a plan that has provoked a fierce backlash in the cultural sector. More than 400 artists and other creatives have signed an open letter calling for the plans to be scrapped, including Paul McCartney, Elton John and Dua Lipa. Beeban Kidron, a member of the House of Lords, Britain's upper house of parliament, on Monday tabled an amendment to the bill that was passed by 272 votes to 125. Under the amendment, authors must give permission for their work to be used and must also be able to see what has been taken, by whom and when. Artificial intelligence companies "are stealing some of the UK's most valuable cultural and economic assets," said Kidron, who directed one of the Bridget Jones films. "Creators do not deny the creative and economic value of AI. But we do deny the assertion that we should have to build AI for free, with our work, and then rent it back from those who stole it," Kidron said. "It's Harry Potter, it's the entire back catalog of every single music publisher in the UK. It's the voice of Hugh Grant, the design of an iconic handbag, the IP of our universities, great museums and library collections," she said. Labour digital minister Maggie Jones said there was a "real risk" that too many "obligations" would lead to "AI innovators, including many home-grown British companies, thinking twice about whether they wish to develop and provide their services in the UK". Starmer in January unveiled an "action plan" to make the UK "the world leader" in artificial intelligence and spark Britain's flagging economy, promising flexible regulations. The bill will now be sent back to the House of Commons, the lower house of parliament, for further debate.
[14]
Labour accused of blocking fresh AI copyright protection
Labour has been accused of "manipulating" parliamentary procedure to block new rules that would help protect artists, musicians and authors from copyright violations by AI. The Government has rejected an amendment put forward by the House of Lords, which would have required tech companies to tell creatives when their work had been used to train AI bots. The rebel amendment, which was backed by more than 400 industry figures including Sir Elton John and Robbie Williams, was blocked using an arcane parliamentary procedure. On Monday, peers voted by a 147 majority to amend Labour's Data Bill to add transparency requirements. However, MPs voted by a majority of 129 on Wednesday to disagree with this change, meaning so-called ping-pong between the two Houses of Parliament is set to continue. The reason given for rejecting the amendment was that it "would involve charges on public funds". Involve taxpayers' money Ministers argued that introducing new copyright regulations would involve spending taxpayers' money, which is the prerogative of the House of Commons to decide. Lord Black of Brentwood, the deputy chairman of the Telegraph Media Group, condemned the decision and called for ministers to do more to protect creatives from AI "theft". "For AI businesses to flourish here, they need access to our world-class content, which will only be produced if content creators have effective copyright protection," he said. "This House recognised that on Monday during the passage of the Data Bill and it's deeply disappointing to learn that, rather than act decisively to give creators transparency as we voted for [...] the Government is manipulating parliamentary procedure arrogantly to dismiss our views. "Is it really now the Government's extraordinary position that if it costs money to enforce the law we must just let criminals get away with theft?" Failure to protect creative industries Lord Black previously said the AI amendment was an existential issue for the news industry, which is at risk from theft by tech companies. The row follows a debate on the AI measures in the House of Lords, where peers lined up to accuse the Government of failing to protect the creative industries. The amendment was proposed by Baroness Kidron, a crossbench peer, who marshalled an open letter to Sir Keir Starmer about her suggested change. On Wednesday she said Labour used "procedure rather than policy to overturn amendments that could have turbocharged a lucrative market for AI training data". She added: "If theft at this scale was happening in pharma, finance, aerospace or indeed the tech sector, which protects its patents so fiercely, would the Government stand by and suggest the cost of regulation was too great to stop a multibillion-pound industry being plundered?" The Government has said it is "considering next steps" on addressing copyright issues after its 10-week consultation on the impact of AI, which received more than 11,500 responses, but that the Data Bill is not the right vehicle to implement change on this issue.
[15]
UK House of Lords backs amendment to AI bill thanks to peer vote that forces companies to reveal copyrighted material used in training AI models
The UK government doesn't seem to want to give people credit for their work in the name of progress. How good is the British Museum, btw? In a sentence that reads like something out of a very derivative sci-fi novella, peers in the House of Lords have pushed back against the United Kingdom government's bill around training AIs. Yes, this is exactly the kind of future I imagined living in. Not the cool one with flying electric vehicles. As the Guardian reports, the UK is looking to make it legal to train AI on copyright-protected materials, as long as the owners don't specifically object. This opt-out approach was rightly pointed out to be too cumbersome by critics of the movement. In some cases you'd have to get lucky to even know about the AI that's using your work, let alone have the savvy to contest it. The argument in favour used by the UK government is that we're currently slowing both artistic and technological advancements by placing such restrictions on training AI. It's likely mirroring other governments' concerns over the growing supremacy of China's DeepSeek LLM and other speedy developments in the industry. Still, as an artist, this reeks of not wanting to pay people for their work rather than driving an industry forward. While the government is concerned over global advancement, individuals' and artists' worries are around ownership of work. It's a common problem that's cropping up with these kinds of AIs, like in the court case where Sarah Silverman sued Meta over the use of her writing to train its models. In the days leading up to the vote in the House of Lords prominent artists and groups protested the move. These included the likes of Paul McCartney, Jeanette Winterson, Dua Lipa and the Royal Shakespeare Company who called for the government to oppose the bill, asking that they don't "give our work away at the behest of a handful of powerful overseas tech companies". At the vote an amendment was tabled by crossbench peer Beeban Kidron, which added the requirement for AI companies to reveal which copyrighted material is used in their models. The amendment was passed, despite government opposition, by 272 votes to 125. Lady Kidron said: "I want to reject the notion that those of us who are against government plans are against technology. Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work, and then rent it back from those who stole it." "My lords, it is an assault on the British economy and it is happening at scale to a sector worth £120bn to the UK, an industry that is central to the industrial strategy and of enormous cultural import." Assuming the government still goes ahead with the horrible opt-out method, this should at least make it easier for people to find out if their work has been cannibalised by AI. Plus forcing the acknowledgement of these training methods is at least some form of credit to the creators. This is, of course, all assuming these firms are honest in their reporting and don't find some other legal loophole in the meantime.
[16]
Elton John, Dua Lipa, Coldplay Among 400 Artists Seeking Copyright Protection Amid A.I. Surge
Elton John, Dua Lipa, Coldplay, and Florence Welch are among the over 400 artists who have signed a letter calling on British Prime Minister Keir Starmer to update copyright laws in the face of A.I. technology. "We, along with 400 other creatives, have signed and sent this letter to the Prime Minister, urging him to give Government support to proposals that would protect copyright in the age of AI," Elton John wrote on social media. "This comes ahead of a crunch vote on the plans in the House of Lords on Monday 12th May." Paul McCartney, who previously lobbied for copyright law protections in a BBC interview earlier this year, also signed the letter, along with Kate Bush, Robbie Williams, and hundreds more musicians, actors, playwrights, directors, and artists. "Creative copyright is the lifeblood of the creative industries. It recognizes the moral authority we have over our work and provides an income stream for 2.4 million people across the four nations of the United Kingdom," the letter reads. "The fight to defend our creative industries has been joined by scores of UK businesses, including those who use and develop AI. We are not against progress or innovation. The creative industries have always been early adopters of technology. Indeed, many of the world's greatest inventions, from the lightbulb to AI itself, have been a result of UK creative minds grappling with technology." The signees have thrown their support behind an amendment to the Data (Use and Access) Bill, which would require developers to be transparent with copyright owners about using their material to train AI models, the BBC reports. The amendment was proposed by Baroness Beeban Kidron, with a vote set Monday in the House of Lords. "The first job of any government is to protect its citizens," the letter continued, adding that the amendment would "put transparency at the heart of the copyright regime and allow both AI developers and creators to develop licensing regimes that will allow for human-created content well into the future." This past December, artists, publishers, media companies and more banded together to form the Creative Rights in AI Coalition, which aims to keep the current copyright protections in place despite the U.K.'s continued courtship of AI technology. "We're the people, you're the government. You're supposed to protect us. That's your job," McCartney said to lawmakers in a BBC interview earlier this year. "So if you're putting through a bill, make sure you protect the creative thinkers, the creative artists, or you're not gonna have them. If there's such a thing as a government, it's their responsibility -- I would think -- to protect young people to try and enhance that whole thing so it works. So that these people have got jobs and can enhance the world with wonderful art."
[17]
British government suffers setback in AI copyright battle
The British government suffered a setback to its plans to make it easier for AI companies to access data as the House of Lords backed more protection for content creators on Monday. The Labour government under Prime Minister Keir Starmer wants to introduce a copyright exception for commercial generative AI training with its Data (Use and Access) Bill. Under the proposed law, companies developing AI models would not need permission from creatives to access certain content -- a plan that has provoked a fierce backlash in the cultural sector. More than 400 artists and other creatives have signed an open letter calling for the plans to be scrapped, including Paul McCartney, Elton John and Dua Lipa. Beeban Kidron, a member of the House of Lords, Britain's upper house of parliament, on Monday tabled an amendment to the bill that was passed by 272 votes to 125. Under the amendment, authors must give permission for their work to be used and must also be able to see what has been taken, by whom and when. Artificial intelligence companies "are stealing some of the UK's most valuable cultural and economic assets", said Kidron, who directed one of the Bridget Jones films. "Creators do not deny the creative and economic value of AI. But we do deny the assertion that we should have to build AI for free, with our work, and then rent it back from those who stole it," Kidron said. "It's Harry Potter, it's the entire back catalogue of every single music publisher in the UK. It's the voice of Hugh Grant, the design of an iconic handbag, the IP of our universities, great museums and library collections," she said. Labour digital minister Maggie Jones said there was a "real risk" that too many "obligations" would lead to "AI innovators, including many home-grown British companies, thinking twice about whether they wish to develop and provide their services in the UK". Starmer in January unveiled an "action plan" to make the UK "the world leader" in artificial intelligence and spark Britain's flagging economy, promising flexible regulations. The bill will now be sent back to the House of Commons, the lower house of parliament, for further debate.
[18]
UK Lords Approve AI Copyright Transparency Amendment
The UK government's attempt to introduce a copyright exception for artificial intelligence (AI) training has met stiff resistance in the House of Lords, where peers voted 272-125 in favour of an amendment requiring greater transparency from AI developers. The measure compels companies to disclose whether copyrighted material was used during AI training and to share that information with rights holders -- marking a major pushback against unchecked data scraping in the age of generative AI. More than 400 British artists and creative leaders -- including Paul McCartney, Elton John, Dua Lipa, Kazuo Ishiguro, and Ian McKellen -- signed an open letter urging Prime Minister Keir Starmer to support enforceable transparency rules. They warned that allowing AI to train on protected works without consent could undermine the UK's status as a global creative hub. The letter backed amendments introduced by Baroness Beeban Kidron, a long-time advocate for digital accountability. The House of Lords approved Amendment 49B to the Data (Use and Access) Bill, setting new transparency obligations for developers of AI systems offered in the UK or directed at UK users. The amendment sets out specific disclosure duties for developers. Lawmakers also included a safeguard under Section 7 that instructs regulators to scale down compliance requirements for small and micro-entities, as defined in the Companies Act 2006. This provision ensures that transparency obligations do not impose an excessive burden on smaller or UK-based AI firms. The House of Commons subsequently rejected the amendment, citing financial privilege, which prevents the Lords from passing provisions that increase public spending unless approved by the Commons. The Labour government has opposed the disclosure requirement. UK data protection minister Chris Bryant acknowledged the concern from artists, saying the situation "feels like an apocalyptic moment" for many in the creative industries. However, he argued that the amendment failed to offer the right solutions and said changes to copyright law needed to be tackled "in the round and not just piecemeal." He added that passing the bill would help the government move faster on broader copyright reform. Rights groups such as the Music Publishers Association and Design and Artists Copyright Society (DACS) supported the amendment. These organisations argue that a lack of transparency prevents copyright owners from knowing when others use their work. Without that information, they cannot effectively enforce their rights. The open letter from The Music Publishers Association to the Prime Minister stated: "The first job of any government is to protect its citizens. These amendments recognise the crucial role that creative content plays in the development of generative AI. They will spur a dynamic licensing market that will enhance the role of human creativity in the UK, positioning us as a key player in the global AI supply chain." The UK amendment joins a broader international debate over copyright and AI. Countries are responding in different ways to the question of how AI should interact with copyrighted content. Generative AI depends on creative work, but the rules around how that work is used remain unclear. The UK's debate over disclosure rights has put a spotlight on this gap. India is watching closely. With a government panel now examining whether the Copyright Act still holds up in the age of machine learning, decisions made elsewhere may shape how domestic law evolves -- and who it protects.
Over 400 UK artists and media professionals, including Paul McCartney and Dua Lipa, support an amendment to the Data Bill requiring AI firms to disclose copyrighted works used in training models.
In a significant move, over 400 prominent figures from the UK's creative industry have rallied behind an amendment to the Data (Use and Access) Bill, aimed at protecting copyrighted works from unauthorized use in AI training. The amendment, proposed by Baroness Beeban Kidron, would require AI companies to disclose which individual copyrighted works they have used to train their models [1][2].
The open letter supporting the amendment boasts signatures from some of the UK's most renowned artists, including Paul McCartney, Dua Lipa, Elton John, and Ian McKellen. It also has backing from major media organizations such as the Financial Times, Daily Mail, and the National Union of Journalists [2][3].
Despite this strong show of support, the UK government has opposed the amendment, arguing that the fight over it "is holding back both the creative and tech sectors and needs to be resolved by new legislation" [1]. The government's preferred position has been a provision that would require copyright holders to formally opt out of having their work used to train AI models [5].
The British House of Lords passed the amendment with a vote of 272 to 125 [1][5]. This decision sends the bill back to the House of Commons, where the amendment could potentially be removed again. The vote nonetheless represents a crucial step in the ongoing debate over AI and copyright protection.
Supporters of the amendment argue that it is essential for protecting the UK's creative industries, which contribute significantly to the national economy. Baroness Kidron emphasized the sector's importance, stating, "It is an assault on the British economy and it is happening at scale to a sector worth £120bn to the UK, an industry that is central to the industrial strategy and of enormous cultural import" [1].
The debate in the UK reflects a broader global conversation about AI and copyright. Companies like OpenAI and Meta have faced legal challenges over alleged unauthorized use of copyrighted material in training their AI models [1][2]. The outcome of this legislative process in the UK could have far-reaching implications for how AI companies operate and how creative works are protected in the digital age.
As the bill returns to the House of Commons, the creative industries remain hopeful that their concerns will be addressed. Technology Secretary Peter Kyle has been exploring alternative proposals, including a potential licensing system for copyright holders and AI developers [5]. This suggests that there may be room for compromise as the government seeks to balance the interests of the creative and tech sectors.