Curated by THEOUTPOST
On Mon, 7 Apr, 8:00 AM UTC
3 Sources
[1]
The music industry is battling AI -- with limited success
New York (AFP) - The music industry is fighting on platforms, through the courts and with legislators in a bid to prevent the theft and misuse of art by generative AI -- but it remains an uphill battle.

Sony Music said recently it has already demanded that 75,000 deepfakes -- simulated images, tunes or videos that can easily be mistaken for real -- be rooted out, a figure reflecting the magnitude of the issue.

The information security company Pindrop says AI-generated music has "telltale signs" and is easy to detect, yet such music seems to be everywhere. "Even when it sounds realistic, AI-generated songs often have subtle irregularities in frequency variation, rhythm and digital patterns that aren't present in human performances," said Pindrop, which specializes in voice analysis.

But it takes mere minutes on YouTube or Spotify -- two top music-streaming platforms -- to spot a fake rap from 2Pac about pizzas, or an Ariana Grande cover of a K-pop track that she never performed.

"We take that really seriously, and we're trying to work on new tools in that space to make that even better," said Sam Duboff, Spotify's lead on policy organization. YouTube said it is "refining" its own ability to spot AI dupes, and could announce results in the coming weeks.

"The bad actors were a little bit more aware sooner," leaving artists, labels and others in the music business "operating from a position of reactivity," said Jeremy Goldman, an analyst at the company Emarketer.

"YouTube, with a multiple of billions of dollars per year, has a strong vested interest to solve this," Goldman said, adding that he trusts the platform is working seriously to fix it. "You don't want the platform itself, if you're at YouTube, to devolve into, like, an AI nightmare," he said.

Litigation

But beyond deepfakes, the music industry is particularly concerned about unauthorized use of its content to train generative AI models like Suno, Udio or Mubert.

Several major labels filed a lawsuit last year at a federal court in New York against the parent company of Udio, accusing it of developing its technology with "copyrighted sound recordings for the ultimate purpose of poaching the listeners, fans and potential licensees of the sound recordings it copied." More than nine months later, proceedings have yet to begin in earnest. The same is true for a similar case against Suno, filed in Massachusetts.

At the center of the litigation is the principle of fair use, which allows limited use of some copyrighted material without advance permission and could limit the application of intellectual property rights. "It's an area of genuine uncertainty," said Joseph Fishman, a law professor at Vanderbilt University.

Any initial rulings won't necessarily prove decisive, as varying opinions from different courts could punt the issue to the Supreme Court.

In the meantime, the major players in AI-generated music continue to train their models on copyrighted work -- raising the question of whether the battle isn't already lost. Fishman said it may be too soon to say that: although many models are already training on protected material, new versions of those models are released continuously, and it's unclear whether any court decisions would create licensing issues for those models going forward.

Deregulation

When it comes to the legislative arena, labels, artists and producers have found little success. Several bills have been introduced in the US Congress, but nothing concrete has resulted.

A few states -- notably Tennessee, home to much of the powerful country music industry -- have adopted protective legislation, particularly when it comes to deepfakes.

Donald Trump poses another potential roadblock: the Republican president has positioned himself as a champion of deregulation, particularly of AI. Several giants in AI have jumped into the ring, notably Meta, which has urged the administration to "clarify that the use of publicly available data to train models is unequivocally fair use." If Trump's White House takes that advice, it could tip the balance against music professionals, even if the courts theoretically have the last word.

The landscape is hardly better in Britain, where the Labour government is considering overhauling the law to allow AI companies to use creators' content on the internet to help develop their models, unless rights holders opt out. More than a thousand musicians, including Kate Bush and Annie Lennox, released an album in February entitled "Is This What We Want?" -- featuring the sound of silence recorded in several studios -- to protest those efforts.

For analyst Goldman, AI is likely to continue plaguing the music industry for as long as the industry remains unorganized. "The music industry is so fragmented," he said. "I think that that winds up doing it a disservice in terms of solving this thing."
The music industry is struggling to combat the challenges posed by AI, including deepfakes and unauthorized use of copyrighted material for AI model training. Despite efforts on multiple fronts, progress has been limited.
The music industry is facing an unprecedented challenge from artificial intelligence, with deepfakes and AI-generated content proliferating at an alarming rate. Sony Music recently reported that it has demanded the removal of 75,000 deepfakes, highlighting the scale of the problem [1]. Despite claims from information security company Pindrop that AI-generated music has "telltale signs" and is easy to detect, such content remains widespread on popular platforms like YouTube and Spotify [1][2].
Major streaming platforms are actively working to address the issue of AI-generated music. Spotify's lead on policy organization, Sam Duboff, stated that they are "trying to work on new tools in that space to make that even better" [1]. Similarly, YouTube is "refining" its ability to detect AI dupes and may announce results soon [2]. However, industry analyst Jeremy Goldman notes that "bad actors were a little bit more aware sooner," leaving the music industry "operating from a position of reactivity" [3].
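Pindrop has not published how its detection works, but the properties it names, unusually steady pitch and machine-perfect timing, can be approximated with open-source audio tools. Purely as an illustration, the sketch below assumes Python with the librosa and numpy libraries and a hypothetical local file clip.wav; the features chosen are illustrative guesses, not Pindrop's, Spotify's or YouTube's actual methods.

```python
# Illustrative sketch only: rough pitch- and rhythm-variability statistics
# of the kind a detector might examine. Not any vendor's real pipeline.
import numpy as np
import librosa  # assumed available: pip install librosa


def describe_clip(path: str) -> dict:
    """Return simple variability statistics for one audio clip."""
    y, sr = librosa.load(path, sr=22050, mono=True)

    # Fundamental-frequency track; human singing normally drifts and wavers.
    f0, _, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )
    pitch_jitter_hz = float(np.nanstd(f0))  # very low = suspiciously steady pitch

    # Spacing between note onsets; human timing is never metronome-perfect.
    onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
    gaps = np.diff(onsets)
    rhythm_spread_s = float(np.std(gaps)) if gaps.size > 1 else 0.0

    return {"pitch_jitter_hz": pitch_jitter_hz, "onset_gap_std_s": rhythm_spread_s}


if __name__ == "__main__":
    # "clip.wav" is a placeholder filename for illustration.
    print(describe_clip("clip.wav"))
```

A production detector would combine many such features, along with learned representations, in a trained classifier rather than relying on two hand-picked statistics.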
Beyond deepfakes, the music industry is particularly concerned about the unauthorized use of copyrighted content to train generative AI models. Several major labels have filed lawsuits against companies like Udio and Suno, accusing them of using copyrighted recordings to develop their AI technologies [1][2]. These legal battles center on the principle of fair use, which could potentially limit the application of intellectual property rights.
Joseph Fishman, a law professor at Vanderbilt University, describes this as "an area of genuine uncertainty" [2]. The outcome of these cases remains unclear, with the possibility that varying opinions from different courts could eventually lead to a Supreme Court decision [3].
The music industry's attempts to address AI-related issues through legislation have seen limited success. While several bills have been introduced in the US Congress, no concrete results have emerged [1]. Some states, such as Tennessee, have adopted protective legislation, particularly concerning deepfakes [2].
However, potential roadblocks exist, including Donald Trump's stance as a champion of AI deregulation. Tech giants like Meta are urging the administration to treat the use of publicly available data for AI training as fair use, which could further complicate matters for the music industry [3].
The situation is not much better internationally. In Britain, the Labour government is considering changes to the law that would allow AI companies to use creators' content from the internet for model development unless rights holders opt out [1]. This has sparked protests from musicians, with over a thousand artists, including Kate Bush and Annie Lennox, releasing a silent album titled "Is This What We Want?" to voice their opposition [2].
Analyst Jeremy Goldman suggests that the music industry's fragmented nature is hindering its ability to effectively address the AI challenge. "The music industry is so fragmented," he stated, adding, "I think that that winds up doing it a disservice in terms of solving this thing" [3]. This lack of unity may continue to impede progress in the industry's battle against AI-generated content and copyright infringement.
As the situation evolves, it remains to be seen whether the music industry can unite and develop more effective strategies to protect artists' rights and creative works in the age of artificial intelligence.
A global study predicts significant revenue losses for music creators due to AI, highlighting the need for regulatory measures to protect artists' rights and income.
4 Sources
French streaming platform Deezer reports a significant increase in AI-generated music uploads, raising concerns about copyright issues and the impact on human artists. The company has implemented detection tools to manage the influx of AI content.
3 Sources
Suno, an AI-powered music creation platform, is embroiled in a legal battle with major record labels over alleged copyright infringement. The startup defends its practices while raising concerns about innovation and competition in the music industry.
5 Sources
Amazon's demonstration of Suno AI integration with Alexa Plus raises copyright issues, while Timbaland embraces AI music generation, highlighting the growing tension between AI technology and the music industry.
2 Sources