Curated by THEOUTPOST
On Tue, 25 Feb, 4:03 PM UTC
12 Sources
[1]
AI can't help real artists reach their full potential | Letters
Proposed changes to copyright law would deny what makes us human, says Helen Ward. Plus letters from Jason Mills and Daniel Heuman

Your article (Kate Bush and Damon Albarn among 1,000 artists on silent AI protest album, 25 February) quotes a government spokesperson: "As it stands, the UK's current regime for copyright and AI is holding back the creative industries, media and AI sector from realising their full potential." To suggest that an artist's full potential can only be reached with the help of artificial intelligence is to dismiss what makes us human and to ignore the vast scope, power and achievement of thousands of years of human creativity. Genuine creativity is not about full potential - it is about imagination and learning, about continually asking and answering questions; it is a process. Creativity is a source of wellbeing and comfort even for those whose work may never be seen, heard or read by anyone else. Creativity needs to be valued, protected and encouraged as a human activity.

I suggest that all those involved in this AI-biased consultation spend time away from their screens, make something with their own heads or hands, and then listen to those deeply involved in creativity to understand what they are stealing, repackaging and, more disastrously, offering: AI-manufactured pseudo-creativity. This is junk food for the mind on the children's shelf of some megalomaniac's digital supermarket. A shortcut to "full potential" or yet another way to stop human beings thinking for themselves? I was brought up in a creative family - I am an illustrator and author - and quite apart from all of the above, my living depends on the selling of copyright.
Helen Ward
Stroud, Gloucestershire

Even as AI pushes from below, the top tier of artistic endeavour implicitly requires conscious authorship. We attend galleries and concerts to encounter communication from one mind to another about the experience of living.
Truly conscious AI might be as worth listening to as any other individual, but that's over the horizon. For now, a machine-generated production, while impressive, is of only novelty interest. But it's small comfort to those further down the chain whose gig is up - jobbing artists, writers and composers. For their changing situation, the government needs to do a lot more than roll over. Impoverishing creatives to accommodate big tech's latest overhyped wheeze is not progress.
Jason Mills
Accrington, Lancashire

As the CEO of a UK-based tech company, I might be expected to disagree with your article. On the contrary: we share the concerns about the erosion of copyright protections that the proposed data (use and access) bill would bring. Our firm is deeply invested in AI technology, and our customers are writers, editors, artists and scientists. Without enforceable copyright laws that protect their work, they simply won't be able to continue.

Your article rightly highlights the moral and economic implications of the unchecked use of creative content. It's essential that AI firms obtain consent and provide fair remuneration when using copyrighted materials. This respects the rights of creators and ensures the sustainability of the creative sector. It also supports British businesses that rely on the creative sector.

I stand with you in urging policymakers to reconsider this bill. It's crucial that we strike a better balance between fostering technological innovation and upholding the rights of those who contribute so much to our cultural and economic landscape.
Daniel Heuman
CEO and founder, Intelligent Editing
[2]
Help is coming in the AI copyright wars
As you might expect, the protests have been creative, even quirky. More than 1,000 artists, including Annie Lennox and Kate Bush, supported the release this week of a silent album containing nothing more than background studio noise. The 47-minute album called Is This What We Want? contains 12 tracks entitled: The. British. Government. Must. Not. Legalise. Music. Theft. To. Benefit. AI. Companies. As a musical experience, the album -- available on Spotify -- is not highly recommended. Personally, I prefer John Cage's 4'33'', a three-movement composition in which the orchestra does not play a note, mainly because it's shorter. But this mute protest is part of a worldwide revolt by creative artists and content companies against the unauthorised use of their work by big technology firms. In the US, the Authors Guild and 17 individual authors, including Jodi Picoult and Jonathan Franzen, are pursuing a more traditional American form of protest by suing OpenAI and Microsoft for copyright infringement, alleging "systematic theft on a mass scale." Japan's Newspaper Publishers and Editors Association has also protested against AI companies "freeriding on the labour of news outlets". These disputes are a classic example of what happens when new technologies outpace laws written for an earlier era. When intellectual property laws were enacted, no one could have imagined a day when massive companies would scrape the entire internet as training data for their generative AI models then spew out convincing simulacra of poems, images, music and videos. But the principle that no one should profit from another's intellectual property without consent should remain inviolable. As in many other countries, the British government is currently struggling to realign principle and practice and update its intellectual property laws for the AI age. As the protests show, this is not easy. The creative industries are of critical importance to the British economy. 
By the government's numbers, they contributed £124bn in gross value added to the economy in 2023, about 5 per cent of the total. On the other hand, the government is desperate to position the UK as an AI-friendly powerhouse behind only the US and China. The UK government appears fearful of stepping out of line with the Trump administration over tech policy and also wants to distance itself from intrusive EU regulations. Last month, the government published an AI Opportunities Action Plan saying the current uncertainty around intellectual property needed to be urgently resolved. It has been consulting widely but is toying with "fair use" exemptions, which would be welcomed by AI companies. What is partly overlooked in this debate is how desperate AI companies are to obtain fresh human-generated content to develop their models -- and how much they would pay if they were able to do so easily and legally. "We need to find new economic models where creators can have new revenue streams," Sam Altman, OpenAI's chief executive, admitted in December. As it happens, several start-ups are experimenting with such economic models, including ProRata.ai, TollBit and Human Native.ai. ProRata is developing an answer engine that would pay a share of an AI company's revenue to content creators whenever their work appeared in its responses. TollBit enables AI bots and data scrapers to pay websites directly for their content and thereby reduce legal uncertainty. And Human Native is creating a two-sided marketplace allowing AI creators to license data from content creators. Just as hackers pirated music from the record companies in the early 2000s -- before the industry evolved and enabled consumers to pay to stream music online -- so the creative industries are experiencing their own "Napster era," argues James Smith, Human Native's co-founder.
Some of these creative businesses are already striking individual content licensing deals with AI companies: Axel Springer, News Corp and the FT have signed agreements with OpenAI, while Agence France-Presse has partnered with Mistral. Human Native is aiming to automate that process on a mass scale. "We want to be the infrastructure for enabling data commerce on the internet," Smith tells me. The biggest of many differences between the Napster era and today, however, is that the pirates are no longer small groups of hackers but giant corporations with lobbying muscle. Revised legislation may be essential to force their hands. But nascent market mechanisms are developing that could enable mutually beneficial solutions. If AI companies do not bite harder on that carrot, they deserve to be hit with a big stick.
[3]
AI companies committing 'largest copyright heist in world's history'
"But a strong copyright regime is absolutely vital if the Government wants to achieve its growth goals, and that goes for absolutely everyone." Dame Caroline's comments come as she joined forces with Chi Onwurah, the chair of the science, innovation and technology committee, to urge Ms Nandy and Mr Kyle to rethink the planned regime. The joint letter emphasised the "vital and longstanding role" played by copyright laws in Britain and said the Government must pay artists fairly. "Underpinning our recommendations is the principle that everyone should receive fair remuneration for their creative work," it said. "The framework, underpinned by the principle of rights holder consent, provides for the commercialisation of copyrighted works, incentivising further creativity and innovation. Both sides of this debate benefit from copyright law." The same proposals were presented to the previous Tory government but blocked by Rishi Sunak's administration over fears about the theft of creative content by AI firms. It was the view of Mr Sunak's team that no changes to copyright laws should be pursued without the support of creative industries. A compromise could still be struck in the form of new commercial licensing agreements, which would give AI companies legal certainty while also remunerating rights holders. A Government spokesman said: "As it stands, the UK's current regime for copyright and AI is holding back the creative industries, media and AI sectors - and that cannot continue. That's why we have been consulting on a new approach that protects the interests of both AI developers and right holders and delivers a solution for both. "We will now consider the full range of responses we have received through the consultation, so it's premature to speculate on the way forward.
"No decisions will be taken until we are absolutely confident we have a practical plan that delivers each of our objectives, including increased control for right holders to help them easily license their content, enabling lawful access to material to train world-leading AI models in the UK, and building greater transparency over material being used."

Everyone suffers when copyright law is watered down

By Dame Caroline Dinenage

On Tuesday, in a rare display of unity, every UK newspaper joined together in opposition to the Government's proposals for the future of AI. The message they delivered was loud and clear: proposals to let AI companies hoover up their work without seeking permission were not fair, and amounted to intellectual property theft. Newspaper editors joined artists, authors, musicians and other creators to say that such a move presents a grave and existential threat to our creative industries. If the Government carries on down its preferred pathway of an "opt-out", they say, the very foundations that make the UK a world leader in culture and creativity would be lost. AI companies have already scraped the internet for books, films and music, all created by uniquely talented British creators, without permission. As I told the Prime Minister last year, it is effectively the largest copyright heist in the world's history. The Government has suggested that creators might be able to "reserve their rights" by opting out of having their work used in this way. Experience shows us that there is no effective way of opting out on offer anywhere in the world. Technology may have developed far enough to steal content, but it certainly hasn't developed far enough to protect it. The opt-out proposal does not align with our tacit understanding of property rights, developed over centuries.
These proposals are the equivalent of telling shop owners that shoplifters should have the freedom to pocket the goods on a supermarket's shelves unless every single item is marked with a note saying "do not steal". The creative industries are worth £125 billion to our economy, but if we fail to enforce intellectual property rights, overseas tech giants and their AI models will get a free subsidy on the backs of our talented and hard-working creators. It truly flies in the face of a well-established regime that has played a vital role in protecting creators for more than 300 years. The concerns expressed by many in the creative sectors echo what we have heard in Parliament, where this week two select committees, including the culture, media and sport committee, which I chair, submitted a response to the government consultation after holding a joint hearing with AI start-ups and creatives. The strong case against the Government's preferred opt-out model has been made eloquently across the board. But before even considering any changes to copyright, there is one step that could go a long way to ensuring the present law can be properly enforced in the face of the challenges brought about by new technology. Already, in both the EU and California, there are requirements for more transparency over the data big firms are using to train their AI models. More can be done in the United Kingdom to make creators aware that their work has been used in AI. This would not only ensure that rights holders could be properly paid for their work, but would allow consumers to make more informed choices about the AI models they use. Developers have pushed back on the basis that training data constitutes a trade secret. While there may be a case for protecting the make-up of the models themselves, it would seem a stretch to suggest that this would apply to the data they are trained on.
A focus on transparency would lessen the need for any changes to copyright and for the Government's proposal that rights holders can opt out of having their work used. I'm all for technology, and just as excited as anyone - including and especially the creative industries - about the ways in which AI might help us tackle some of humanity's biggest challenges and drive growth. But a strong copyright regime is absolutely vital if the Government wants to achieve its growth goals, and that goes for absolutely everyone. Ironically, as if hoisted by its own petard, it was recently alleged that Chinese company DeepSeek had breached ChatGPT's terms of service by copying OpenAI's output to train its model. Some might see this as karma, but it shows that there are risks to AI developers too. Everyone suffers when copyright law is watered down. While ministers have repeatedly claimed that there is "uncertainty" about copyright law, parroting the lines of tech lobbyists, in reality there is no confusion. Our gold-standard intellectual property regime needs to be enforced.
[4]
The Guardian view on AI and copyright: creativity should be cherished, not given away | Editorial
It's a seductive promise: let our computers scrape the internet for ideas, images, forms of words, stories, music, jokes ... and our industry will make your country rich. For a UK government desperate for economic growth, the demands of tech companies for copyright laws to be relaxed - in order that their artificial intelligence (AI) systems can access as much online content as possible without having to pay or seek permission - have been hard to resist. The US and China are the global leaders of this new tech race. But the UK has a chance to compete that ministers are desperate not to miss. To AI businesses, copyright is an irritant. Three years ago, it appeared that their lobbyists were on the verge of getting their way when a government agency, the Intellectual Property Office, recommended an exemption for data mining. This would grant bots free rein and - so the argument went - provide an incentive for tech companies to invest in the UK. The proposal was not taken up. Unfortunately, it was not killed off either. And the consultation regarding AI and copyright law being run by the current government, which concludes on Tuesday, is framed in terms that are far too favourable to big tech. Currently, the law is very clear. The makers of works of art, and other products of human creativity, including journalism, have for centuries been entitled to protection from copyists. Drawing a line between influence and imitation is not always straightforward, and can lead to interesting court cases. But the principle that original material cannot be ripped off and that creative people have rights over their work is widely understood and accepted. It is not surprising that big tech decided these rules did not apply to AI. Breaking things is, famously, part of the Silicon Valley ethos. Already, AI firms have absorbed a vast amount of material that they ought to have been obliged to pay for. 
There are parallels with the way that US social media companies built advertising-funded businesses that were heavily reliant on content, including news, that was paid for by others. Once again, regulators as well as other industries are struggling to keep up. But it is individual artists, writers and musicians, and smaller creative and media organisations and businesses, who are most in danger of being left behind. It is easy to see why ministers are attracted by big tech's boundless self-belief. Who doesn't hope that new technology will help to solve some of the world's many problems? But JD Vance, the US vice-president, was wrong to criticise European governments for being "too self-conscious, too risk-averse", at the AI summit held in Paris earlier this month. It is concerning that the UK, like the US, refused to sign a declaration committing to safety and sustainability. Thankfully, artists including Paul McCartney and Elton John have stepped up to make the case against big tech's anti-regulation push - and in favour of the continued protection of human artistry. In the House of Lords, an amendment to the government's data bill, asserting that licences to copyrighted material must be actively sought rather than taken for granted, was accepted. Ministers who have had their heads turned by the promise of new data centres and a seat at the AI table should reconsider their priorities and obligations. Big tech should have no more rights over the work of others than anybody else.
[5]
It's grand theft AI and UK ministers are behind it. Oppose this robbery of people's creativity | Andrew Lloyd Webber and Alastair Webber
The plan to weaken copyright law, allowing tech firms to profit for free from people's talent, is an outrage, and the nation would suffer We are father and son: one has written 16 musicals and counting, the other cofounded The Other Songs, a leading independent record and publishing company. Our work has employed thousands globally, nurturing the next generation of talent. Copyright is the foundation that protects this, and all creative work: from music, theatre and literature to film and art. Copyright ensures creators retain control and are fairly compensated. It underpins the creative economy. Put simply, it allows artists and creatives to make a living. Endless studies have shown what a benefit that creativity - music, theatre, dance, art, film, TV, the list is endless - has on the rest of society. Yet, today, the UK government is proposing changes that would strip creators of this protection. Under the data (use and access) bill, AI companies would be allowed to take works, past and future, and use them as training data without consent or payment. These models digest vast amounts of human-created content and then generate imitations, bypassing the rights of the original creators. The government's proposed "opt-out" system - the idea that creators will always be in a position to preemptively reserve their rights - is a sham. It is technically impossible for artists to opt out. The government's consultation ends today, but we should be clear: this is not regulation, it is a free pass for AI to exploit creativity without consequence. AI can replicate patterns, but it does not create. If left unregulated, it will not just be a creative crisis, but an economic failure in the making. AI will flood the market with machine-generated imitations, undercutting human creativity and destroying industries that drive jobs, tourism and Britain's cultural identity. The creative industry on which we all thrive in myriad ways will stumble and falter.
The government claims weakening copyright law will attract AI investment, and that it is offering "a copyright regime that provides creators with real control, transparency, and helps them license their content", but there is no evidence to support this. Global AI firms will extract UK intellectual property while continuing their operations elsewhere, leaving British creators at a disadvantage. Meanwhile, responsible AI companies such as Adobe and DeepMind already license content, proving that regulation and innovation can coexist. The solution is clear. Beeban Kidron's amendments to the bill would introduce safeguards, ensuring AI firms seek permission and pay for the content they use. The alternative is, as she told the House of Lords last month, that we continue the "delusion that the UK's best interest and economic future aligns with those of Silicon Valley". Copyright protections are not a barrier to AI innovation; they are the foundation that allows creators to produce the high-quality work AI depends on. Without strong copyright laws, human creativity will be devalued and displaced by machines. Do we want our children to discover the next David Bowie, or David BowAI? We stand at a pivotal juncture. The streaming era has already diminished the value of songwriters to the extent that many struggle to make a living. Streaming revenue allocates about 15% to songwriters, while record labels and artists receive 55%, and streaming services claim 30%. Moreover, songwriters are not compensated upfront when artists and labels use their songs, unlike in TV, film and theatre, where works are optioned. Consequently, relying solely on that 15% is an insurmountable challenge. Now, the UK risks making an even greater error. In 1710, Britain introduced the world's first copyright law, the Statute of Anne, setting the global standard for protecting creators. Until then, authors found the copyright to their work belonged to the printers of that work.
Self-publishing was effectively illegal, but the statute gave writers the ability to own their own creations. This was right and now seems obvious. It is extraordinary that more than 300 years later this government is planning to dismantle those protections. Labour claims to represent working people. Creative artists are working people, and their work is of untold value economically, socially and, of course, culturally. An AI machine is not a person. It is time to step up and protect the people at the heart of the UK's unrivalled creative economy. If these efforts fail, we will all suffer.
[6]
We must make AI fair - failure could scupper creative life
But perhaps the biggest shout out goes to the hundreds of individuals who have written to thank me for my amendments to the Data (Use and Access) Bill - amendments that would solve the Government's stated problem of making copyright fit for the age of AI. And now, a little slower to react, those companies and institutions that themselves rely on the UK's incredibly successful, lucrative and joyous industry - one that drives tourism, brings inward investment and tells the national story - are creaking into action. For them, the watering down of copyright is an existential threat, and rather than wrapping themselves in blue or writing to the press, they are reaching for the law. It will be interesting to watch the basis on which a class action can be fought, and whether the Government has acted fairly in putting forward a preferred option that would undermine their income while taking advice from companies that will build products in direct competition with what they have stolen. We have seen those with deep pockets in the US go to court: only last week a US federal judge found that ROSS Intelligence had infringed Thomson Reuters' copyright. That judgment shows it is far from inevitable that the US will end up with a more permissive copyright framework than currently exists in UK copyright law, making the argument of tech lobbyists and ministers - that we must become more permissive in order to compete internationally - entirely spurious. The case is also important because the judge found that ROSS's AI was designed to compete with the works it ingested and could make it more difficult for Thomson Reuters, the creator, to monetise its IP - a key concern of creators large and small.
Reuters has deep pockets, but now the creative sectors - press, music, film, drama, fiction, design and even fashion and academia - have started to see their interests aligned, and in doing so have realised that they don't have to fight one by one.
[7]
Prioritise artists over tech in AI copyright debate, MPs say
Cross-party committees urge ministers to drop plans to force creators to opt out of works being used to train AI Two cross-party committees of MPs have urged the government to prioritise ensuring that creators are fairly remunerated for their creative work over making it easy to train AI models. The MPs argued there needed to be more transparency around the vast amounts of data used to train generative AI models, and urged the government not to press ahead with plans to require creators to opt out of having their data used. The government's preferred solution to the tension between AI and copyright law is to allow AI companies to train the models on copyrighted work by giving them an exception for "text and data mining", while giving creatives the opportunity to opt out through a "rights reservation" system. The chair of the culture, media and sport committee, Caroline Dinenage, said there had been a "groundswell of concern from across the creative industries" in response to the proposals, which "illustrates the scale of the threat artists face from artificial intelligence pilfering the fruits of their hard-earned success without permission". She added that making creative works "fair game unless creators say so" was akin to "burglars being allowed into your house unless there's a big sign on your front door expressly telling them that thievery isn't allowed". She added: "Aside from any changes to copyright, there needs to be much tougher requirements on transparency of the data being used to train AI models, so creators will know without ambiguity where they need to be remunerated for the use of their works." The culture, media and sport committee and science, innovation and technology committee were responding to a government consultation on artificial intelligence and copyright, after their joint evidence session held earlier this month with representatives from AI startups and the creative industries. 
The letter to ministers calls on the government to improve transparency around training data to enable creators to identify the use of their works, ensure that any copyright holders who opt out aren't penalised through reduced visibility, and enable consumers to make informed choices about which AI model to use. The letter warned that without this, "the biggest impact would be felt by the long tail of creators and journalists already operating under financial constraints". Backlash to the government's AI proposals among celebrities and the creative industries has been steadily growing in recent months. On Tuesday, more than 1,000 musicians, including Kate Bush, Damon Albarn and Annie Lennox, released a silent album in protest. The letter also said that although AI developers have suggested that training data constitutes a "trade secret", both the EU and the state of California have introduced transparency requirements, including detailed technical record-keeping about training data. It added that the government should look at encouraging companies that are developing per-use revenue-sharing models, which it says "could move gen-AI past its 'Napster era', much as Spotify did in the advent of music streaming following two decades of peer-to-peer digital piracy". The MPs also asked that the government publish a full impact assessment for each option proposed in the consultation, with robust mechanisms to ensure compliance, enforcement and redress when it comes to copyright. The letter said that other jurisdictions, such as the US and EU, "have not settled this issue", despite the government's fears that AI developers may move to countries with "clearer or more permissive rules".
[8]
Make It Fair
The imagination of the UK's creative industries powers the British economy and shapes how the rest of the world sees us as a nation. The creative works of British artists, authors, journalists, illustrators, photographers, film-makers, scriptwriters, singers and songwriters are being scraped from the internet by tech companies, big and small, to build and maintain AI products that have the potential to reshape our world. But most of those companies are taking British creativity without permission and, crucially, without payment. Without fair reward, our creative industries simply won't survive. The government must stand behind its creative industries. It's time to fairly compensate the creators.

The UK's creative industries have today launched a bold campaign to highlight how their content is at risk of being given away for free to AI firms as the government proposes weakening copyright law. A government consultation seeking views on the copyright issue closes today. The 'Make it Fair' campaign was developed to raise awareness among the British public of the existential threat posed to the creative industries by generative AI models, many of which scrape creative content from the internet without permission, acknowledgement and, critically, payment. The impact on creative businesses and individuals throughout the country - who collectively generate over £120 billion a year for the UK economy - will be devastating if this continues unchecked, or worse still if the government legitimises this content theft.

On 25 February, the last day of the government's consultation, regional and national daily news brands are running the same cover wrap and homepage takeover. The campaign cover wrap states: "MAKE IT FAIR: The government wants to change the UK's laws to favour big tech platforms so they can use British creative content to power their AI models without our permission or payment. Let's protect the creative industries - it's only fair."
Weekly titles will run the campaign throughout the next week, with the aim of appealing to the British public to write to their MPs and back the creative industries.

Launching the campaign today, Owen Meredith, CEO of the News Media Association, said: "We already have gold-standard copyright laws in the UK. They have underpinned growth and job creation in the creative economy across the UK - supporting some of the world's greatest creators - artists, authors, journalists, scriptwriters, singers and songwriters to name but a few.

"And for a healthy democratic society, copyright is fundamental to publishers' ability to invest in trusted quality journalism. The only thing which needs affirming is that these laws also apply to AI, and transparency requirements should be introduced to allow creators to understand when their content is being used. Instead, the government proposes to weaken the law and essentially make it legal to steal content.

"There will be no AI innovation without the high-quality content that is the essential fuel for AI models. We're appealing to the great British public to get behind our 'Make it Fair' campaign and call on the government to guarantee creatives are able to secure proper financial reward from AI firms to ensure a sustainable future for both AI and the creative industries."

Launching a music industry campaign to coincide with 'Make it Fair', Ed Newton-Rex said: "1,000 UK musicians released a joint album today, recordings of empty studios, calling on the government to change course or risk empty studios becoming the norm. The government's proposals would hand the life's work of the UK's talented creators - its musicians, its writers, its artists - to AI companies, for free. The government must change course and make it fair."
If you wish to get a copy of the artwork, please speak to paul@newsmediauk.org.

The copyright consultation in short: On 17 December 2024, the UK government launched a consultation on copyright and AI. The government is trying to decide whether to let tech companies use content without permission unless creators specifically say "no". Creators argue this puts the burden on them to police their own work - which would be both costly and time-consuming - and that tech companies should pay for the content they use. The UK creative industries, which include artists, authors, journalists, illustrators, photographers, film-makers, scriptwriters, singers and songwriters, contribute around £120 billion a year to the UK economy. As it progresses towards an AI Bill, the government must take the consultation responses on board before making a final decision on proposed legislation. The Department for Science, Innovation and Technology (DSIT) is responsible for the bill. MPs are currently debating provisions added to the Data Bill which - in contrast to the government's plans - would make existing copyright law enforceable in the age of AI.
[9]
UK creative industries launch 'Make it Fair' campaign against AI content theft
British creatives band together to urge stronger copyright law.

Artificial intelligence and large language models are trained on vast troves of online information, including songs, articles, comments, books, drawings, pictures and more - so if you've ever commented on an Instagram post, posted a photo to Twitter or uploaded a video to YouTube, the likelihood is that your work has been used to train a model at some point. These models don't ask for permission, nor do they notify the creator - and the companies behind them make millions from the content. OpenAI reportedly used over a million hours of YouTube video data to train GPT-4, and Meta uses public posts from Instagram and Facebook to train its AI models - but British creatives are coming together to fight back.

Artists, singers, authors, journalists, scriptwriters and more - who collectively generate over £120 billion per year for the nation's economy - have come together to urge the UK government to apply British copyright law to AI companies, and to ensure 'content theft' is not legitimised by leaving the issue unchecked.

The 'Make it Fair' campaign comes at the end of the British government's AI and copyright consultation period, in which it is reviewing ways to boost trust and transparency between sectors, and "ensuring AI developers have access to high-quality material to train leading AI models in the UK and support innovation across the UK AI sector".

Owen Meredith, CEO of the News Media Association, which launched the campaign, added that the UK's "gold-standard" copyright laws have underpinned growth and job creation in the British economy, and that without the content creatives produce, AI innovation would not exist. "And for a healthy democratic society, copyright is fundamental to publishers' ability to invest in trusted quality journalism," Meredith said.
"The only thing which needs affirming is that these laws also apply to AI, and transparency requirements should be introduced to allow creators to understand when their content is being used. Instead, the government proposes to weaken the law and essentially make it legal to steal content. AI is at the forefront of productivity discussions in the UK right now, as the PM released plans to 'turbocharge AI' into the public sector, including the idea to 'unlock' public data by handing it over to 'researchers and innovators' to train AI models.
[10]
How UK's AI Copyright Changes Could Impact the Music Industry
More than 1,000 musicians have released a silent album protesting against the UK government's proposed changes to copyright law, which would allow companies to train their AI models on copyrighted works such as music, art and text without a license or remuneration. The album, titled 'Is This What We Want?', is live on Spotify and comprises empty recordings of studios and performance spaces, which the group says represent the impact of the government's proposal on musicians' livelihoods. Notably, the titles of the tracks on Spotify spell out the message "The British Government Must Not Legalise Music Theft To Benefit AI Companies". In a separate letter to The Times, several artists, including Elton John, Dua Lipa, and Paul McCartney, warned that the proposal undermines UK copyright law - a key factor in attracting rights holders and investment. Substantiating their claims, they added that the UK's creative industries contribute £126 billion to the country's economy annually and employ 2.4 million people.

Announced for public consultation in December 2024, the UK government's proposal aims to address both the concerns of rights holders about their works being used to train AI models and those of AI developers struggling to navigate the UK's copyright law. The plan is based on three objectives: supporting rights holders' control over their content, aiding the development of "world-leading AI models by the UK", and fostering greater transparency. While opening the proposal for public consultation, the government outlined three options and indicated its preference among them.

The first option would require licenses for AI companies to train their models on copyrighted works, backed by transparency provisions and easier routes to enforce copyright. The government contends that while this would improve access to remuneration for creators, it would make it harder for AI companies to access content in the UK compared with other jurisdictions. Because it curbed access to material for AI developers, despite giving rights holders greater control over their works, it was not preferred.

Under the second option, companies could openly mine data from copyrighted works for AI training purposes without the rights holders' permission, subject to certain exceptions like those in other countries such as Singapore and the US. Since this approach fell short of the transparency and control objectives, it was not preferred either.

The third, preferred option would enable companies to undertake text and data mining of copyrighted works unless the rights holders proactively opt out. Additionally, online works should include a machine-readable reservation of rights, which would automatically indicate copyright restrictions to AI systems. The government claims that this method satisfies all the aforementioned objectives and provides "spillover innovation and productivity benefits", enabling the UK's AI sector to become more competitive internationally.

The proposal forms part of the country's broader AI strategy, the "AI Opportunities Action Plan", which detailed investments such as dedicated "AI Growth Zones" for infrastructure development and funding worth £14 billion. This contrasts sharply with the former British government's more cautious approach to AI, which focused on tackling its risks. While tech giants like Google have previously argued in favour of the "opt-out" model for training their algorithms, publishers have expressed concerns over the increased compliance burden, the Financial Times reported.
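The proposal does not specify how a "machine-readable reservation of rights" would work in practice. One widely discussed candidate is the W3C community group's TDM Reservation Protocol (TDMRep), which signals a text-and-data-mining opt-out via an HTTP response header or an HTML meta tag. The sketch below is a hedged illustration of how a crawler might honour such a signal, not any AI firm's actual implementation; it assumes TDMRep-style names and lowercased header keys:

```python
from html.parser import HTMLParser

class TDMMetaParser(HTMLParser):
    """Collects the value of <meta name="tdm-reservation" content="...">, if present."""
    def __init__(self):
        super().__init__()
        self.reservation = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name") == "tdm-reservation":
                self.reservation = d.get("content")

def tdm_mining_allowed(headers, html):
    """Return False if the page reserves text-and-data-mining rights.

    Checks two TDMRep-style signals: an HTTP response header
    'tdm-reservation: 1' (headers assumed lowercased here; real HTTP
    headers are case-insensitive) or the equivalent HTML meta tag.
    Absence of any signal is treated as 'no reservation' - which is
    precisely the opt-out default the UK proposal is criticised for.
    """
    if headers.get("tdm-reservation") == "1":
        return False
    parser = TDMMetaParser()
    parser.feed(html)
    return parser.reservation != "1"

# A page that opts out via the meta tag:
page = '<html><head><meta name="tdm-reservation" content="1"></head></html>'
print(tdm_mining_allowed({}, page))             # False: rights reserved
print(tdm_mining_allowed({}, "<html></html>"))  # True: no signal, so mining proceeds
```

The second call illustrates creators' core objection: silence is read as consent, so the burden falls on every rights holder to emit the signal on every copy of their work.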
Fearing "widespread theft of copyrighted material without remuneration", creators have also pointed out flaws like unawareness about which companies are trying to mine their content within this system. Generative AI companies circumventing their responsibilities was another issue highlighted within this proposal. Besides musicians, major newspapers across the UK voiced opinions against the proposed efforts displaying a prominent "MAKE IT FAIR" message on their front pages. The campaign appeared in publications on the same date as the end of the AI and copyright consultation, i.e. on February 25, 2025. Although the two instances differ in context, this campaign by UK media outlets is reminiscent of the Indian Express' famous blank editorial on June 28, 1975, which protested press censorship during India's Emergency. To explain, following the resuming of its publication, the newspaper ran a blank editorial page highlighting press censorship by the Indira Gandhi government, drawing attention to injustice, akin to what UK publishers seek to do. Previously, music labels like UMG, Warner Brothers Music and Sony Music have filed complaints against AI companies for training their AI models from sound recordings from the labels' copyrighted works. Conversely, other companies like YouTube have also initiated discussions with record labels to "legally obtain popular artists' songs to train its new AI tools". Meanwhile, artists have called upon stakeholders to block AI models from 'infringing upon and devaluing the rights of human artists'. Previously, the American Union of Media Professionals- the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) in a tentative agreement with record labels has also demanded guardrails for artificial intelligence. 
These included the establishment of clear and conspicuous consent, minimum compensation requirements, and specific details of intended use before a sound recording using a digital replication of an artist's voice is released.

Generally speaking, music copyrights are of two kinds: songwriting or publishing copyright (the rights over the songwriting or composition of a piece of music) and sound recording or master copyright (the rights belonging to the owner of the master recording). Under typical record-label deals, the label owns and controls all elements of the copyright. The copyright system is also key to royalties - the payments made to owners when music is played, performed or reproduced.

Under the UK government's new proposal, since the copyright holders (record labels, in many cases) can prevent AI training on their works via an "opt-out", artists and songwriters are still left out of the picture, and their rights over the content they created may end up being trampled. This concern may be amplified by some labels' views on AI. For instance, UMG's Senior Vice President for Strategic Technology, Chris Horton, remarked in an interview with Music Ally on the future alignment of licensing, litigation and legislation to enable AI companies to work with the creative industries, adding that "music will be more interactive and more responsive" and that "musicians will find incredible uses for AI systems we can't predict today". In contrast, while Sony Music Group has considered embracing AI as a creative tool, it opted out of AI training, given the wishes of its artists and songwriters. Overall, with the disparate views of these record labels and the general criticism of the UK's proposed copyright changes, it remains to be seen how artists adapt and what steps they take next.
[11]
Make it fAIr - why human intelligence must prevail in the ongoing AI-driven assault on creativity and copyright
Artificial Intelligence (AI) is the transformative technology of the day. Popularized by Large Language Models and generative tools in the cloud, AI promises many things. Among these are the potential to help doctors, researchers, and pharma companies to not only find cures for diseases such as cancer, Alzheimer's, and cystic fibrosis, but also to detect them early and shift healthcare towards prevention. That may save millions of lives and improve the quality of life for many more. AI might also help stop climate change, minimize carbon emissions, and prevent further damage to the planet - though its own energy consumption and water usage need urgent consideration. AI might help us find new materials and cleaner fuels, design more efficient cities, transport networks, buildings, and machines, or spot anomalies in data that might otherwise have taken human workers decades to detect. And it will doubtless help diginomica's readers to interrogate their own trusted data sets and unlock their hidden value - or exploit new opportunities that might otherwise have remained invisible.

All these uses and more are positive and exciting - and there are plenty of them to read about on diginomica. But there is a massive problem. That same transformative technology also transforms human intelligence into zero-effort content, instantly, while giving nothing back to the originator of the training data. And survey after survey of enterprise adoption reveals that, primarily, organizations want to save money and be more productive (see diginomica, passim). Unfortunately, that incentivizes behavior that may have long-term destructive, rather than disruptive, effects.

Large Language Models (LLMs) and other generative tools transform human intelligence, and human expertise, talent, insight, skill, and professionalism - in some cases, people's life's work - into content that gives something to everyone except the person who did the work or originated the idea. And that simply isn't fair.
Because every instance creates a low-cost digital competitor to that creator - to that author, journalist, analyst, artist, filmmaker, video maker, photographer, illustrator, actor, songwriter, composer, musician, designer, consultant, and more. Worse, it commoditizes those professionals' work without consent - and frequently without their knowledge - and turns it into a subscription-based revenue stream for AI vendors, some of which are already among the most valuable, wealthy, and powerful companies on Earth.

At present, copyright is creators' only meaningful protection in a networked, AI-infused world. And protecting creatives' copyright does nothing to stop AI having transformative effects in healthcare or science: honouring a novelist's, illustrator's, filmmaker's, or composer's copyright won't prevent doctors using AI to find a cure for cancer. It is a completely different type of data.

In the UK, today is a day when all of this needs to be front of mind. It's the last day of consultation on the British Government's proposals for frankly unhelpful changes to copyright legislation, as well as marking the launch of a collective creatives' campaign against the free-for-all AI 'brain drain' of content. At the moment, the UK's copyright law is clear and unambiguous: exceptions can be made for academic research purposes, but not for the development of commercial products. As the UK House of Lords' Communications and Digital Committee noted in its LLM report last year:

"LLMs may offer immense value to society. But that does not warrant the violation of copyright law or its underpinning principles. We do not believe it is fair for tech firms to use rightsholder data for commercial purposes without permission or compensation, and to gain vast financial rewards in the process."
However, last May the Committee slammed government inaction, accusing the previous administration of "sitting on its hands" over vendors scraping unlicensed content to train AI models. In a coruscating letter to the government, Committee Chair Baroness Stowell said:

"The government's record on copyright is inadequate and deteriorating. This trajectory is concerning. The issues with copyright are manifesting right now and problematic business models are fast becoming entrenched and normalized."

Fast forward to the present day, and the UK government is now proposing to tear up the last remaining protection for proprietary work - copyright - and thus allow AI companies to train their models on that work with impunity, at zero cost. That is against the clearly expressed wishes of the creative communities, against all legal advice, and against the findings of the Committee which examined this issue in depth, after hearing weeks of expert testimony (from all sides) last year.

That's why over 1,000 musicians and composers, including the likes of Kate Bush, Damon Albarn, Sir Paul McCartney, Ed Sheeran, Sir Simon Rattle, and more, have today put their names to a 12-track album - of silence. Just the sound of empty recording studios. The album is called 'Is This What We Want?' Bush said:

"If these changes go ahead, the life's work of all the country's musicians will be handed over to AI companies for free. None of us have a say in it."

Make no mistake, the unsanctioned, unlicensed, and uncompensated scraping of proprietary content devalues original human creativity in hard financial terms, but also in wider cultural and behavioural ones. Millions of talented people might decide never to share their work, and never to pursue creative careers - never to write, design, sing, perform, paint, act, or point a camera. Not only that, but millions more may decide never to code, or learn about the law, finance, or strategy.
Many may decide never to research, investigate, question, or report their findings. And as we have previously discussed, there are other dangers in a world in which we rush to perceive machine intelligences as superior to our own - hallucinations, misinformation, deep fakes, and perhaps worst of all, a collective abandonment of critical thought and fact-checking.

As we reported earlier this month, a 2025 research paper from OpenAI backer Microsoft, 'The Impact of Generative AI on Critical Thinking: Self-Reported Reductions in Cognitive Effort and Confidence Effects from a Survey of Knowledge Workers', found that higher confidence in generative AI is associated with reduced critical thinking, as users engage in what the authors call "cognitive unloading". Put simply, they offload the task of thinking onto the machine. The researchers note:

"The data shows a shift in cognitive effort as knowledge workers increasingly move from task execution to oversight when using gen AI. Higher confidence in gen AI's ability to perform a task is related to less critical thinking effort. When using gen AI tools, the effort invested in critical thinking shifts from information gathering..."

diginomica is committed to human intelligence - we made our position entirely clear in this article - and to investing in and showcasing original work by named authors. We speak to decision-makers, CEOs, CIOs, CTOs, founders, professors, and entrepreneurs on readers' behalf. And we report on the latest issues from our informed, and sometimes flawed, human perspectives. Changing the law threatens that business model, and every other one like it. We know that AI is transformative technology and offers enormous potential for enterprise benefit - no question about that - and we'll continue to bring as many powerful use cases of AI in positive action to the fore as exemplars for us all to learn from collectively.
We also firmly believe that the British Government's proposals are misconceived, nonsensical, and completely unworkable. Moreover, they will have the effect of retrospectively legitimizing illegal behavior in the form of the unsanctioned scraping of copyrighted content to train AI models. That sets an appalling precedent, both in the UK and in other parts of the world, particularly countries that are already lax in the protection they offer. The UK should be setting an example. Giving rights holders an opt-out and hoping that all this will, somehow, create transparency, control, and fairness is hopelessly naïve in a world in which we have little control over where content ends up once it is public. Balance will have collapsed completely.

This is why we endorse Make It Fair, a new initiative from the Creative Rights in AI Coalition, hosted by the Professional Publishers Association (PPA). Launching the project this morning, PPA CEO Sajeeda Merali said: "Yesterday the PPA submitted a response to government as part of its consultation on AI, following weeks of gathering crucial evidence and examples from members on how publishers are being impacted. The government's proposed changes to copyright law threaten to allow AI companies to use the creative content from PPA members and others without permission or payment. The publishing industry stands united with the wider creative sector in calling for fairness, transparency, and control over how AI firms use our valuable work. Without these safeguards, the UK risks stifling both creative innovation and the long-term potential of AI."

We stand with them too. But the creative and publishing sectors can't do this alone. We need AI users - many of whom are within the creative fields - to be seen to be using the technology more responsibly. And we need AI vendors to listen and to understand that just because you can scrape copyrighted work, that doesn't mean you should.
Most of all, vendors should grasp a critical point: this industry-wide data grab will undermine public trust in AI and in the development of the sector. To trust the technology, we, as users, must be able to trust the companies that make it. So, can we? At present, the jury is out in an unfortunate number of cases. But on the subject of juries, one thing is clear: early results in ongoing litigation by rights holders against AI vendors in the US suggest courts are finding for the plaintiffs. So the UK Government's rush to change the law may soon be out of step even with those more relaxed legal regimes. On that basis alone, it would be wise for the British Government either to leave copyright alone, or to reinforce it rather than undermine it completely. After all, this is a decision that can never be reversed.
[12]
UK newspapers blanket their covers to protest loss of AI protections
Creative and media industries have teamed up on this "Make It Fair" initiative, calling on readers to help protect British creative industries. The campaign was created to fight government proposals that would allow artificial intelligence companies to train their models on copyright-protected work without permission. The move appears to be carefully timed: a public consultation period -- in which anyone can submit their opinions to the UK government -- closed today, having launched on December 17th with a proposal that exceptions to copyright law be made for AI training "for any purpose," including commercial use. Creatives would be able to opt out of the new "text and data mining" regime via a so-called "rights reservation" process, but that places greater responsibility and labor on individuals to ensure their works are protected.
The UK government's proposed changes to copyright law for AI have ignited a fierce debate between tech companies and creative industries, raising concerns about intellectual property rights and the future of human creativity.
The UK government's proposed changes to copyright law for artificial intelligence (AI) have ignited a fierce debate between tech companies and creative industries. The government is considering relaxing copyright restrictions to allow AI companies to use copyrighted material for training their models without seeking permission or providing compensation [1][2].

Over 1,000 artists, including Kate Bush and Annie Lennox, have joined a silent protest album titled "Is This What We Want?" to express their opposition to the proposed changes [2]. The creative sector argues that these changes would amount to intellectual property theft and pose an existential threat to the UK's creative industries [3].

The UK government claims that the current copyright regime is holding back the creative industries, media, and AI sectors from realizing their full potential [1]. It aims to position the UK as an AI-friendly powerhouse, following the US and China [2]. However, critics argue that this approach could impoverish creatives to accommodate big tech's latest developments [1].

The creative industries contributed £124 billion in gross value added to the UK economy in 2023, accounting for about 5% of the total [2]. Dame Caroline Dinenage, chair of the culture, media and sport committee, has described the situation as "the largest copyright heist in the world's history" [3]. There are concerns that the proposed changes could undermine the foundations of the UK's leadership in culture and creativity [3].

Some startups are experimenting with new economic models to address the issue. Companies like ProRata.ai, TollBit, and Human Native.ai are developing platforms to enable fair compensation for content creators when their work is used by AI companies [2]. Additionally, some creative businesses are already striking individual content licensing deals with AI companies [2].

In the House of Lords, an amendment to the government's data bill has been accepted, asserting that licences to copyrighted material must be actively sought rather than taken for granted [4]. Beeban Kidron's amendments to the bill would introduce safeguards to ensure AI firms seek permission and pay for the content they use [5].

The UK introduced the world's first copyright law, the Statute of Anne, in 1710, setting the global standard for protecting creators [5]. The current debate is seen as a potential dismantling of these long-standing protections. The US and EU are also grappling with similar issues, with some countries implementing transparency requirements for AI training data [3].
As the consultation on AI and copyright law concludes, the UK government faces a critical decision that will shape the future of both the creative and tech industries. The outcome of this debate could have far-reaching consequences for intellectual property rights, human creativity, and the economic landscape of the United Kingdom.
Reference
[1]
[2]
[3]
[4]