5 Sources
[1]
Music publishers sue Anthropic for $3B over 'flagrant piracy' of 20,000 works
A cohort of music publishers led by Concord Music Group and Universal Music Group is suing Anthropic, saying the company illegally downloaded more than 20,000 copyrighted songs, including sheet music, song lyrics, and musical compositions. The publishers said in a statement on Wednesday that the damages could amount to more than $3 billion, in what would be one of the largest non-class action copyright cases filed in U.S. history. This lawsuit was filed by the same legal team from the Bartz v. Anthropic case, in which a group of fiction and nonfiction authors similarly accused the AI company of using their copyrighted works to train products like Claude. In that case, Judge William Alsup ruled that it is legal for Anthropic to train its models on copyrighted content. However, he pointed out that it was not legal for Anthropic to acquire that content via piracy. The Bartz v. Anthropic case became a slap on the wrist worth $1.5 billion for Anthropic, with impacted writers receiving about $3,000 per work for roughly 500,000 copyrighted works. While $1.5 billion seems like a substantial sum, it's not exactly back-breaking for a company valued at $183 billion. Originally, these music publishers had filed a lawsuit against Anthropic over its use of about 500 copyrighted works. But through the discovery process in the Bartz case, the publishers say they found that Anthropic had also illegally downloaded thousands more. The publishers tried to amend their original lawsuit to address the piracy issue, but the court denied that motion back in October, ruling they'd failed to investigate the piracy claims earlier. That move prompted the publishers to instead file this separate lawsuit, which also names Anthropic CEO Dario Amodei and co-founder Benjamin Mann as defendants. "While Anthropic misleadingly claims to be an AI 'safety and research' company, its record of illegal torrenting of copyrighted works makes clear that its multibillion-dollar business empire has in fact been built on piracy," the lawsuit says. Anthropic did not respond to TechCrunch's request for comment.
[2]
Music publishers sue Anthropic for $3 billion over 'flagrant piracy'
A group of music publishers led by Concord Music Group and Universal Music Group has filed a lawsuit against Anthropic. The suit accuses the AI company of illegally downloading more than 20,000 copyrighted songs, including sheet music, lyrics and compositions. These songs were then allegedly fed into the chatbot Claude for training purposes. There are some iconic tunes named by Universal in the suit, including tracks by The Rolling Stones, Neil Diamond and Elton John, among many others. Concord is an independent publisher that handles artists like Common, Killer Mike and Korn. The publishers issued a statement saying that the damages could amount to more than $3 billion. This would make it one of the largest non-class action copyright cases in US history. "While Anthropic misleadingly claims to be an AI 'safety and research' company, its record of illegal torrenting of copyrighted works makes clear that its multibillion-dollar business empire has in fact been built on piracy," the lawsuit says. The suit was filed by the same legal team as the Bartz v. Anthropic case. The music publishers say they found that Anthropic had been illegally downloading thousands of songs during the discovery process in that case. For the unfamiliar, the Bartz v. Anthropic case ended with an award of $1.5 billion to impacted writers after it was found that the company had illegally downloaded their published works for similar training purposes. The terms of that agreement dictated that the 500,000 authors involved in the case would get $3,000 per work. The $1.5 billion looks like a big number, but not so much when broken down like that. Also, Anthropic was recently valued at around $350 billion. In the Bartz case, Judge William Alsup ruled that it was legal for Anthropic to train its models on copyrighted content but not legal to acquire that content via piracy. We'll have to wait and see how this new suit shakes out. The legal precedent here seems to suggest that if Anthropic had just spent a buck on each copyrighted song, it would be in the clear. That's an odd distinction when it comes to building an entire company around snatching up copyrighted content, but whatever.
[3]
Music publishers sue Anthropic over illegal torrenting of 20,000 songs
A group of music publishers led by Concord Music Group and Universal Music Group filed a lawsuit against Anthropic in the United States, according to Reuters. The suit accuses the AI company of illegally downloading more than 20,000 copyrighted songs, including sheet music, lyrics, and compositions, then using them to train its chatbot Claude. Universal Music Group named specific iconic tracks in the lawsuit, such as songs by The Rolling Stones, Neil Diamond, and Elton John, along with many others. Concord Music Group, an independent publisher, represents artists including Common, Killer Mike, and Korn. These publishers discovered Anthropic's actions during the discovery phase of the prior Bartz v. Anthropic case, where evidence emerged that the company had illegally downloaded thousands of songs. The lawsuit states that damages could exceed $3 billion, positioning the case as one of the largest non-class action copyright disputes in U.S. history. It includes a direct quotation: "While Anthropic misleadingly claims to be an AI 'safety and research' company, its record of illegal torrenting of copyrighted works makes clear that its multibillion-dollar business empire has in fact been built on piracy." The legal team filing this suit is the same one that handled last year's Bartz v. Anthropic case. That earlier litigation concluded with a $1.5 billion award to 500,000 impacted writers and authors. Under the settlement terms, each of the 500,000 authors received $3,000 per work involved. Anthropic holds a valuation of around $350 billion. The publishers' claims stem directly from findings in the Bartz discovery process, which revealed the extent of the company's unauthorized song downloads for training purposes similar to those used for the writers' works. In the Bartz case, Judge William Alsup issued a ruling that permitted Anthropic to train its AI models on copyrighted content. The decision specified, however, that acquiring such content through piracy remained illegal. This distinction forms the basis for the current music publishers' arguments. The publishers assert that Anthropic's method of obtaining the 20,000 songs via illegal downloads violates this precedent. Evidence from the Bartz discovery phase documented thousands of such downloads, extending to musical works beyond the original literary focus. Concord and Universal emphasize the scale, covering sheet music, lyrics, and full compositions from major artists. The suit details how these materials fed directly into Claude's development, mirroring the training process challenged in Bartz. With damages pegged above $3 billion, the case tests the boundaries of Judge Alsup's acquisition ruling in a new domain of copyrighted music.
[4]
Music Publishers Say Anthropic Is 'Built on Piracy.' Now They're Suing for $3 Billion.
Concord Music Group and Universal Music Group claim the AI company illegally downloaded songs, lyrics, and sheet music to train Claude. Claude is a music thief, according to a coalition of major music publishers now suing for $3 billion. Concord Music Group and Universal Music Group are suing Claude creator Anthropic, claiming the company illegally downloaded more than 20,000 copyrighted songs, including sheet music, song lyrics, and musical compositions. The publishers said damages could exceed $3 billion, making it one of the largest non-class action copyright cases in U.S. history. The lawsuit comes from the same legal team behind Bartz v. Anthropic, where authors accused the AI company of using copyrighted works to train its products. In that case, Judge William Alsup ruled it's legal to train models on copyrighted content but illegal to acquire that content through piracy. Anthropic paid $1.5 billion. The lawsuit strikes a harsh note. "While Anthropic misleadingly claims to be an AI 'safety and research' company, its record of illegal torrenting makes clear that its multibillion-dollar business empire has been built on piracy," it states.
[5]
Why Are Music Publishers Again Suing Anthropic for Copyright?
Building on a previous copyright lawsuit and a landmark judgment that revealed Anthropic was using copyrighted books for AI training, a group of global music publishers has filed a new lawsuit against Anthropic, alleging it infringed copyrighted content, including song lyrics, largely through pirated means. This is not the first time Anthropic has faced copyright infringement claims. In September 2025, in the widely covered Bartz v. Anthropic case, the company agreed to pay $1.5 billion in what was described as "the largest publicly reported copyright recovery." During the same proceedings, before the final settlement, the US court found that using purchased copyrighted works to train AI models is 'fair use' under US copyright law.

Several music publishers had also previously filed a separate copyright infringement case against Anthropic, which the parties later resolved through an agreement. The fresh lawsuit, which includes some of the publishers from that earlier case, is a follow-up to it. In the new lawsuit, the music publishers say they tried to expand their earlier copyright case against Anthropic after Judge William Alsup's rulings in Bartz v. Anthropic revealed Anthropic's illegal downloading from pirated shadow libraries. "Until the revelations in those [Bartz vs Anthropic] opinions and filings, Publishers did not know that their works were being copied by Defendants from some of the most notorious pirated sources in the world," reads the lawsuit, referring to piracy shadow libraries like LibGen (Library Genesis) and its mirror sites such as Z-Library. Before evolving into Anna's Archive, Pirate Library Mirror (PiLiMi) replicated content from the banned Z-Library.

When the music publishers sought to amend their previous complaint, Anthropic opposed it, arguing that the torrenting claims were unrelated to that copyright infringement case and would "fundamentally transform" it. The publishers say they therefore filed this separate lawsuit to address what they call "willful infringement" through the downloading and uploading of unauthorised copies of their works from massive piracy websites. The publishers also noted they had already sued Anthropic in the earlier case over the alleged copying of their content to train certain Claude AI models. Despite Anthropic's agreement to enforce safety guardrails preventing its AI models from generating copyrighted content, the publishers claim the company has continued to use their works on a much larger scale since then, leading to this second lawsuit over the same copyright infringement issues surrounding AI training and outputs.

Concord Music Group, Universal Music, and others filed the lawsuit against Anthropic, its CEO, Dario Amodei, and co-founder Benjamin Mann. Along with the allegations of copyright infringement against Anthropic, the lawsuit also claims that Amodei and Mann used pirated libraries while they were at OpenAI between 2019 and 2020, as revealed in the Bartz v. Anthropic court proceedings. "From the very beginning, Anthropic has built its multibillion-dollar business on piracy," states the lawsuit, referring to Anthropic's founders and their alleged activity while at OpenAI. In their new lawsuit, the music publishers asked the court to award statutory damages of up to $150,000 per infringed work.
They also sought additional statutory damages of up to $25,000 per violation for the alleged removal or alteration of copyright management information from their original works. "In total, Defendants torrented at least 5 million copies of pirated books from LibGen in 2021, and at least another 2 million copies of pirated books from PiLiMi in 2022," alleges the lawsuit. It also claims that they illegally downloaded a separate catalogue of bibliographic metadata for each collection, which included information on book title, author, and ISBN, a numeric commercial book identifier. The publishers further requested an order requiring Anthropic to destroy all infringing copies of their works in its datasets under the court's supervision and to file a report on its compliance.

Music publishers say the industry depends on licensing and authorised deals to ensure songwriters and publishers are paid when their works are used or played. Publishers license the songs in their catalogues, collect the revenue, and share it with the artists they represent. They say these licensing arrangements can also extend to AI companies, bringing additional revenue in exchange for proprietary data for AI training. After initially suing Udio, an AI music generator, for copyright infringement, Universal Music Group announced a partnership with the company following their settlement. Similarly, a recent Tips Music earnings call revealed that its partnership with Warner Music Group involves AI training as part of NVIDIA's new music model, which is being developed in collaboration with Universal Music Group.

On the illegal downloading through peer-to-peer torrent networks in particular, the lawsuit said that by torrenting pirated books, Anthropic "violated Publishers' exclusive right of reproduction." Explaining this claim, it said, "to make matters worse, because of the two-way nature of the BitTorrent protocol, when Defendants downloaded copies of these pirated books via torrenting, they simultaneously uploaded to the public unauthorized copies of the same books, thereby infringing Publishers' exclusive right of distribution in these works and contributing to further infringement of Publishers' works as well." "Despite its multibillion-dollar valuation, Anthropic refuses to pay a cent for the vast amounts of copyrighted content -- including Publishers' musical compositions -- it takes without permission or credit to build its business," claims the lawsuit, accusing the AI company of building a "central library by copying and ingesting text from the internet and other sources."

The lawsuit states that Anthropic used BitTorrent to download and copy text from illegal pirate library websites. Torrenting is a peer-to-peer (P2P) file-sharing protocol that is "infamously used for widespread unauthorised reproduction and distribution of copyrighted materials," as the lawsuit claims. Citing disclosures from the Bartz v. Anthropic proceedings, the lawsuit said CEO Dario Amodei acknowledged that Anthropic had many legal options to obtain copyrighted works for AI training, but deliberately chose to obtain them illegally via torrenting because it was reportedly "faster and free." He allegedly described the legal route as a "practice/business slog." The lawsuit cites another instance to illustrate Anthropic's approach to copyrighted content. When one Anthropic founder learned he could torrent additional copyrighted works from PiLiMi, he wrote to colleagues, "Just in time!"
To which another employee allegedly replied, "zlibrary, my beloved." The publishers called LibGen and PiLiMi "two of the largest and most infamous" illegal libraries. "These pirate libraries contain every genre of book imaginable, including songbooks, sheet music collections, and other books of song lyrics, containing copyrighted musical compositions owned and controlled by Publishers and others," states the complaint. In addition to these allegations, internal documents unveiled in recent legal filings revealed that Project Panama is Anthropic's "effort to destructively scan all the books in the world," the Washington Post reported. The documents also revealed that Anthropic had "spent tens of millions of dollars to acquire and slice the spines off millions of books."

After Anthropic collects this vast text library, which includes the publishers' copyrighted works, it allegedly uses portions of that data to train its AI models through further unauthorised copying. The publishers alleged that Anthropic "cleans" its training text but leaves infringing content, such as song lyrics, in place, while using tools to remove copyright notices and other copyright management information that generally identify the copyright holders. They also said Anthropic copies and processes the corpus in memory, breaking it into "tokens" for storage, and makes additional copies during fine-tuning and reinforcement learning based on human and AI feedback. As part of that process, the publishers claimed Anthropic-directed human reviewers prompt and reward the model in ways that can involve outputs tied to the publishers' lyrics.

The lawsuit said that as early as May 2021, senior Anthropic employees, including founders Benjamin Mann and Jared Kaplan, discussed using extraction tools to strip webpage footers, where copyright notices often appear, from training data. In June 2021, they concluded that one tool, jusText, a Python-based boilerplate-removal tool, left too much "useless junk," including copyright notice information, compared with alternatives like Readability and Newspaper. Mann also said he wanted the model to "ignore the boilerplate." The publishers alleged that Anthropic chose another extraction tool, Newspaper3k, because it reportedly removed copyright owner names and notices more effectively (a generic illustration of how such extraction tools are typically invoked appears after this excerpt). "Because Newspaper [tool] removed Copyright Management Information more effectively, Anthropic purposefully decided to employ that tool to remove copyright notices and other Copyright Management Information from Publishers' lyrics and other copyrighted works," reads the lawsuit.

"The datasets Anthropic has copied and filtered to train its Claude AI models include a well-known dataset called 'The Pile,' which includes countless unauthorised copies of Publishers' lyrics," claims the lawsuit. The Pile, which has since been taken down, was also allegedly used by GPU maker NVIDIA. The publishers claim that Anthropic continues to use The Pile, which draws on several existing text sources and gives more weight to what it calls "high-quality datasets," to train its latest Claude models. These sources include Books3, a collection of hundreds of thousands of pirated books that allegedly contains many works with the publishers' musical compositions, as well as the YouTube Subtitles dataset of human-written closed captions. For additional context, Books3 is the same dataset allegedly used by NVIDIA, Apple, Adobe, Meta, and Anthropic itself.
In August 2023, after a legal complaint by Rights Alliance, a Denmark-based anti-piracy group, the Books3 dataset was removed from The Pile; The Pile itself was also taken down the same year. The new copyright lawsuit also said Anthropic uses the "Common Crawl" dataset for ongoing AI training. The publishers alleged that the dataset includes a large number of their copyrighted lyrics, scraped without permission from authorised sites such as Musixmatch, LyricFind and Genius. Common Crawl is a non-profit organisation that "maintains a free, open repository of web crawl data that can be used by anyone."
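For context on the extraction tools named in the excerpt above, the short Python sketch below shows how boilerplate-removal libraries such as jusText and Newspaper3k are commonly invoked: they keep a page's main text and drop navigation and footer content, which is where copyright notices usually sit. It is a generic illustration only; the URL is hypothetical, and the snippet is not drawn from Anthropic's pipeline or the court filings.

```python
# Generic illustration of boilerplate removal with jusText and Newspaper3k.
# The URL is hypothetical; this is not Anthropic's actual pipeline.
import requests
import justext
from newspaper import Article

url = "https://example.com/lyrics-page"  # hypothetical page

# jusText: classify each paragraph and keep only non-boilerplate text.
html = requests.get(url).content
paragraphs = justext.justext(html, justext.get_stoplist("English"))
justext_text = "\n".join(p.text for p in paragraphs if not p.is_boilerplate)

# Newspaper3k: download and parse the page, returning only the main article body.
# Footer material, where copyright lines typically appear, is usually excluded from .text.
article = Article(url)
article.download()
article.parse()
newspaper_text = article.text

print(justext_text[:200])
print(newspaper_text[:200])
```

In the lawsuit's telling, the relevant design choice was simply which of these extractors discarded copyright notices and owner names most aggressively.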
Universal Music Group and Concord Music Group lead a coalition of music publishers in a $3 billion lawsuit against Anthropic, accusing the AI company of illegally downloading over 20,000 copyrighted songs to train Claude. The suit alleges flagrant piracy through shadow libraries, marking one of the largest non-class action copyright cases in U.S. history.
A coalition of music publishers led by Universal Music Group and Concord Music Group has filed a $3 billion lawsuit against Anthropic, alleging the AI company illegally downloaded more than 20,000 copyrighted songs to train AI models. The Anthropic lawsuit, filed in U.S. federal court, accuses the company of flagrant piracy involving song lyrics, sheet music, and compositions from iconic artists including The Rolling Stones, Neil Diamond, and Elton John [1][2]. This marks one of the largest non-class action copyright cases in U.S. history, with potential damages exceeding $3 billion.
The lawsuit states that "while Anthropic misleadingly claims to be an AI 'safety and research' company, its record of illegal torrenting of copyrighted works makes clear that its multibillion-dollar business empire has in fact been built on piracy" [1]. The suit names not only Anthropic but also CEO Dario Amodei and co-founder Benjamin Mann as defendants, alleging they used pirated libraries during their time at OpenAI between 2019 and 2020 [5].

The music publishers discovered Anthropic's extensive copyright infringement during the discovery process in Bartz v. Anthropic, a separate case in which fiction and nonfiction authors accused the company of using their copyrighted works to train the AI chatbot Claude. That case concluded with a $1.5 billion settlement covering roughly 500,000 works, with impacted writers receiving approximately $3,000 per work [1][2].

During that litigation, evidence emerged revealing Anthropic's use of shadow libraries like LibGen (Library Genesis) and its mirror sites, including Z-Library and Pirate Library Mirror. According to the new lawsuit, "in total, Defendants torrented at least 5 million copies of pirated books from LibGen in 2021, and at least another 2 million copies of pirated books from PiLiMi in 2022" [5]. The publishers state they were unaware their works were being copied from "some of the most notorious pirated sources in the world" until Judge William Alsup's rulings revealed these practices [5].
The music publishers originally filed a lawsuit against Anthropic over approximately 500 copyrighted works. However, after discovering the extent of Anthropic's illegal downloading of copyrighted songs through the Bartz case, they attempted to amend their original complaint. Anthropic opposed the amendment, arguing that the torrenting claims would "fundamentally transform" the original case, and the court denied the motion in October, ruling the publishers had failed to investigate the piracy claims earlier. The publishers then filed this separate lawsuit to address what they term "willful infringement" [5].

In the Bartz case, Judge William Alsup established a critical legal distinction: it is legal for Anthropic to train AI models on copyrighted content, but acquiring that content through piracy remains illegal [1][3]. This precedent forms the foundation of the current music publishers' arguments, which focus on how Anthropic obtained the material rather than on its use as AI training data.

The lawsuit seeks statutory damages of up to $150,000 per infringed work, plus additional damages of up to $25,000 per violation for alleged removal or alteration of copyright management information; at the $150,000 statutory maximum, damages for more than 20,000 works would exceed $3 billion. The publishers also request a court-supervised order requiring Anthropic to destroy all infringing copies in its datasets and file compliance reports [5].
For the music industry, this case highlights the importance of licensing arrangements that ensure songwriters and publishers receive compensation when their works are used. Publishers collect revenue from licensing songs in their catalogues and share it with represented artists. These licensing deals can extend to AI companies, potentially generating additional revenue streams in exchange for proprietary data used to train AI models [5].

While the $1.5 billion Bartz settlement appeared substantial, it represents a manageable cost for Anthropic, valued at $183 billion according to some sources and around $350 billion according to others [1][3]. This raises questions about whether financial penalties alone will deter similar practices across the AI industry or whether companies will view such settlements as acceptable business costs. The outcome of this copyright infringement case could establish clearer boundaries for how AI companies acquire training data, potentially forcing the industry toward legitimate licensing partnerships rather than relying on pirated materials from shadow libraries. Recent examples show this shift is possible: Universal Music Group and AI music generator Udio announced a partnership following their settlement, while Tips Music's earnings call revealed AI training components in its Warner Music Group partnership [5]. Anthropic has not responded to requests for comment on the allegations [1].
Summarized by Navi