4 Sources
[1]
Ex-Meta exec: Copyright consent obligation = end of AI biz
Nick Clegg, former politico and Zuckcorp policy Prez, seems confused - can Reg readers help him?

Former British deputy PM and Meta apologist Sir Nick Clegg says that forcing AI companies to ask for the permission of copyright holders before using their content would destroy the AI industry overnight.

Clegg, who served as deputy to David Cameron in the Conservative / Lib Dem coalition that governed the UK between 2010 and 2015 before moving to Zuckcorp as president of global affairs, told the audience at a literary festival that demands to make tech firms seek consent from creators before using copyrighted material to train AI models were unworkable. Any such laws would "basically kill the AI industry in this country overnight," Clegg claimed, according to The Times.

This month, members of the House of Lords, the UK's upper chamber of Parliament, voted in favor of amendments to the proposed Data (Use and Access) Bill that would have protected copyrighted work from simply being copied by AI companies. However, government ministers used an arcane parliamentary procedure to block the amendment, which would have required tech firms to reveal what copyright material has been used to train their models.

Clegg stepped down from his role as president of global affairs at Facebook parent company Meta at the start of this year. He was speaking at the Charleston Festival in East Sussex to plug his forthcoming book, How to Save the Internet: The Threat to Global Connection in the Age of AI and Political Conflict.

The former politician seemed confused over the issue of AI and copyright, agreeing when questioned that people ought to be able to opt out of having their work copied and used for model training. But he then reportedly said that "quite a lot of voices say 'you can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."

He added: "I just don't know how you go around, asking everyone first. I just don't see how that would work. And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight."

So which is it, Mr Clegg? Do creators have the right to opt out or not? Because asking their permission after the fact is self-defeating. And admitting that the AI business model is dead unless LLM trainers are allowed to break the law doesn't sound like much of an argument.

That Clegg should side with the UK government and big business interests is probably not surprising considering his background. The Tony Blair Institute, founded by the former Prime Minister, also came out in favor of exceptions to copyright rules for developers training AI models. This is despite many of the UK's leading media and arts professionals speaking out against the data access bill, including playwright Tom Stoppard, Doctor Who producer Russell T Davies, and a slew of musicians such as Elton John, Paul McCartney, Kate Bush, and Robbie Williams.

The big AI companies haven't been waiting for permission so far: a recent study found that OpenAI mined copyright-protected content in order to train its GPT family of models, for example.

Baroness Kidron, who proposed the Lords amendments, said: "How AI is developed and who it benefits are two of the most important questions of our time." She warned the UK creative industries "must not be sacrificed to the interests of a handful of US tech companies."
The UK government, for one, has made AI a central plank of its plans for economic revival of the country, as detailed in the AI Opportunities Action Plan published earlier this year. This includes setting up "AI Growth Zones" with streamlined planning processes that allow developers to override both local authorities and the concerns of local communities when siting massive new AI datacenters.

There are alternatives: last month, we reported on a new licensing model that aims to let developers of large language models (LLMs) use copyrighted training data while paying the publishers for the privilege.

It isn't only the UK where formerly sacrosanct copyright protection is being shredded in favor of AI developers: every nation fears being left behind in some kind of tech arms race. Just recently, it was reported that the head of the US Copyright Office was sacked, just after the agency concluded that AI developers' use of copyrighted material went beyond existing doctrines of fair use. ®
[2]
AI will die "overnight" if copyright permission is...
Former UK Deputy Prime Minister Nick Clegg says artificial intelligence companies shouldn't need to seek permission every time they use copyright-protected data. Speaking at a recent event to promote his book, "How to Save the Internet," Clegg - who previously served as a Meta executive - sided with the AI industry on the issue.

Forcing technology firms to comply with copyright laws - and notify rights holders when they use protected content to train artificial intelligence models - would kill the UK's AI industry "overnight," Clegg warned. The content is already publicly available, he argued, and AI systems need vast amounts of data to improve their reasoning.

Clegg argues that current copyright laws are incompatible with artificial intelligence, as requiring companies to obtain permission every time they train a model would render the entire technology unworkable. He believes artists and rights holders should be able to opt out of data scraping for AI training, but seeking individual confirmations isn't a viable solution.

"I think people should have clear, easy to use ways of saying, no, I don't. I want out of this," the former Meta VP said. "But I think expecting the industry, technologically or otherwise, to preemptively ask before they even start training - I just don't see. I'm afraid that just collides with the physics of the technology itself."

Clegg is focusing on the UK AI industry as politicians debate the new Data (Use and Access) Bill, which aims to regulate access to customer and company data. A coalition of artists and authors, led by film director Beeban Kidron, pushed to amend the law, requiring AI companies to disclose the data they use to train their models. However, Parliament rejected the proposal.

In a recent op-ed in The Guardian, Kidron accused the government of essentially approving a plan to facilitate mass cultural theft. She said UK authorities are allowing AI companies to use copyrighted works freely, while opting out of such practices would be impossible without proper transparency. The government can certainly "bully its way to victory" and pass the bill by majority vote, but doing so would deal a catastrophic blow to Britain's creative industry.

However, the fight isn't over. The draft will return to the House of Lords for a new vote on June 2.
[3]
Former Meta executive claims forcing AI companies to ask for copyright permissions will kill UK's AI industry "overnight"
Editor's take: The UK Parliament is debating the Data (Use and Access) Bill, a law set to regulate access to user and customer data. The bill could have a dramatic impact on the IT sector, particularly AI companies that aggressively collect vast amounts of human-generated data online to train their often unpredictable chatbots.
[4]
Legendary Facebook Exec Scoffs, Says AI Could Never Be Profitable If Tech Companies Had to Ask for Artists' Consent to Ingest Their Work
Fresh on the heels of his exit from Meta, former Facebook executive Nick Clegg is defending artificial intelligence against copyright holders who want to hold the industry accountable. As the Times of London reports, Clegg insisted during an arts festival last weekend that it's "implausible" to ask tech companies to seek consent from creators before using their work to train their AI models.

During a speech at the Charleston Festival in East Sussex -- which was, ironically enough, meant to promote his new book titled "How To Save The Internet" -- Meta's former global affairs vice president initially said that it was "not unreasonable" that artists may want to "opt out of having their creativity, their products, what they've worked on indefinitely modeled." But he then went on to suggest that those same artists are getting greedy.

"I think the creative community wants to go a step further," Clegg then charged. "Quite a lot of voices say 'you can only train on my content, [if you] first ask.' And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data."

"I just don't know how you go around, asking everyone first," Clegg added. "I just don't see how that would work."

The former deputy prime minister then said that if AI companies were required only in Britain to gain permission to use copyright holders' works, "you would basically kill the AI industry in this country overnight."

Clegg's comments came amid a fiery debate in the UK about AI and copyright, spurred on by a recent Parliament vote on an amendment to the government's data bill that would have required companies to tell copyright holders when their work was used; the amendment was struck down in the House of Commons last week.

His stance also put him in opposition to Paul McCartney, Elton John, Dua Lipa, and hundreds of other artists who called on the British government to "protect copyright in the age of AI," as Sir Elton put it in an Instagram post. Unfortunately, it seems that Parliament's lower house agreed with Clegg's sentiments and not the artists' -- but history will show who was on which side of the AI wars.
Nick Clegg, former Meta executive, argues that requiring AI companies to seek copyright permissions would destroy the UK's AI industry, sparking debate on AI regulation and creative rights.
Former Meta executive and British politician Nick Clegg has ignited a fierce debate on artificial intelligence (AI) and copyright laws. Speaking at the Charleston Festival in East Sussex, Clegg argued that forcing AI companies to seek permission from copyright holders before using their content for training would "basically kill the AI industry in this country overnight" [1][2].
Clegg, who recently stepped down as Meta's president of global affairs, emphasized the impracticality of obtaining individual permissions for the vast amounts of data used in AI training. He stated, "I just don't know how you go around, asking everyone first. I just don't see how that would work" [1][3]. This stance aligns with the UK government's recent actions, which blocked amendments to the proposed Data (Use and Access) Bill that would have protected copyrighted work from being copied by AI companies [1].
Clegg's position has met strong opposition from the creative community. A coalition of prominent artists, including Elton John, Paul McCartney, and Kate Bush, has spoken out against the data access bill [1][4]. They argue that the current approach essentially approves "mass cultural theft" and fails to protect the rights of creators in the age of AI [2][4].
The UK Parliament is currently debating the Data (Use and Access) Bill, which aims to regulate access to customer and company data. While the House of Lords initially voted in favor of amendments to protect copyrighted work, government ministers used parliamentary procedures to block these changes [1][2]. The bill is set to return to the House of Lords for another vote on June 2 [2].
This debate extends beyond the UK, reflecting a global struggle to balance AI innovation with copyright protection. In the United States, similar controversies have arisen, with reports that the head of the US Copyright Office was dismissed shortly after the agency concluded that AI developers' use of copyrighted material exceeded existing fair use doctrines [1].
Clegg's argument suggests that the current AI business model relies heavily on unrestricted access to vast amounts of data, including copyrighted material. However, alternatives are emerging, such as new licensing models that allow AI developers to use copyrighted training data while compensating publishers [1].
As the AI industry continues to evolve rapidly, the outcome of this debate could have far-reaching consequences for both tech companies and content creators worldwide. The challenge lies in finding a balance that fosters innovation while protecting intellectual property rights in the digital age.
Summarized by Navi