2 Sources
[1]
AI music is fine until it starts pretending to be real people
There's a difference between AI as a musical tool and using it to commit fraud.

AI-generated music is becoming more widespread, but not necessarily popular. And that's just the publicly acknowledged AI music. Now, artists are seeing their names and voices attached to music they never performed or approved, even if they passed away decades ago.

The most recent high-profile incident involves English folk singer Emily Portman, who heard from a fan who liked her new release. Except the album, Orca, though released under her name, was entirely fake. The whole thing had been pushed live on Spotify, iTunes, YouTube, and other major platforms without her knowledge or consent. Portman took to social media to warn her fans about what was happening. The fact that the AI could mimic her artistic style well enough to trick some fans only added to the creep factor. It took Spotify weeks to address the problem, and the album is still visible on Spotify even though the music is gone.

Portman joins a litany of acts, from pop artist Josh Kaufman to country artists Blaze Foley, who died in 1989, and Guy Clark, who died in 2016, whose work has been mimicked by AI without approval. It seems we've moved past the novelty of AI remixes and deepfake duets into digital identity theft with a beat. The thieves tend to release quietly, collecting whatever royalties might trickle in.

Further, even getting the music taken down might not be enough. A few days after the initial incident, Portman found another album had popped up on her streaming page. Except this time, it was just nonsense instrumentals, with no effort to sound like her.

Scammers using AI to steal from actual artists is an obvious travesty. But there are blurry middle grounds, like never-real musicians pretending to be human. That's where the AI-generated "band" Velvet Sundown stands.
The creators later admitted the band's AI origin, but only after millions of plays from a Spotify profile showing slightly uncanny images of bandmates that didn't exist. Because the music was original and not directly ripped from other songs, it wasn't a technical violation of any copyright laws. The band didn't exist, but the royalties sure did.

I think AI has a place in music. I really like how it can help the average person, regardless of technical or musical skill, produce a song. And AI tools are making it easier than ever to generate music in the style of someone else. But with streaming platforms facing 99,000 uploads a day, most of which are pushed through third-party distributors that rely on user-submitted metadata, it's not hard to slip something fake onto a real artist's profile. Unless someone notices and complains, it just sits there, posing as the real thing.

Many fans are tricked; some believed Orca really was Emily Portman's new album. Others streamed Velvet Sundown thinking they'd stumbled onto the next Fleetwood Mac. And while there's nothing wrong with liking an AI song per se, there's everything wrong with not knowing it is an AI song. Consent and context are missing, and that fundamentally changes the listening experience.

Now, some people argue this is just the new normal. And sure, AI can help struggling artists find new inspiration, fill in missing instrumentation, suggest chord progressions, and provide other aid. But that's not what's happening here. These are not tools being used by artists. These are thieves.

Worse still, this undermines the entire concept of artistic ownership. If someone can make a fake Emily Portman album, any artist is at risk. The only thing keeping these scammers from doing the same to the likes of Taylor Swift right now is the threat of getting caught by high-profile legal teams. So instead, they aim lower. Lesser-known artists don't have the same protections, which makes them easier targets, and more profitable in the long run, because there's less scrutiny.

Then there's the issue of how we as music fans are complicit. If we start valuing convenience and novelty over authenticity, we'll get more AI sludge and fewer real albums. The danger isn't just that AI can mimic artists; it's that people will stop noticing, or caring, when it does.
[2]
After the Velvet Sundown snafu, Spotify lands in another AI tangle
AI seems to be an increasing challenge for music streaming platforms to handle. A couple of months ago, many music fans were fooled by a band on Spotify called The Velvet Sundown that claimed to be real but turned out to be generated by AI, from the music to the photos of the group. Now, generative-AI tools -- or, more accurately, some of the people who use them -- are causing headaches for real musicians as AI-generated songs appear on their official profile pages without fans realizing they're fake.

Take British folk singer-songwriter Emily Portman. She recently received a message from a fan thanking her for her latest work, which came as a surprise, as she hasn't released anything recently, BBC News reported. When the artist took a look for herself, she found what looked like a new album -- called Orca -- on her official pages on Spotify, Apple Music, and other streaming platforms.

She said the music on the 10 tracks sounded a bit like something she might have created -- including a voice that sounded similar to hers -- leading Portman to conclude that the AI software had been carefully prompted to emulate her work. Even the song titles, she said, were "uncannily close" to ones she might have come up with. More concerning still, she was credited for all of the work on the fake album -- as both writer and performer -- and was even listed as the copyright holder.

A short while later, Portman said, another album appeared on her streaming pages, though the quality of this one was rather shoddy; she described it to BBC News as "20 tracks of instrumental drivel ... just AI slop." Portman contacted the streamers about the fake releases, and the AI-generated tracks have now been taken down, though Spotify took a lengthy three weeks to remove them.
Spotify's official statement on the incident is rather curious: "These albums were incorrectly added to the wrong profile of a different artist by the same name, and were removed once flagged." That makes it sound like less of an issue, but Portman said that while there is indeed another artist on Spotify with the same name, that artist's music is markedly different from her own, and the music removed from her Spotify profile has yet to appear on the other artist's page.

The BBC's report offers other examples of AI-generated music that uses an established artist's identity to drive more traffic, thereby earning the uploader more revenue. Portman's experience highlights a troubling and expanding problem in the music industry, one that is likely to get worse as generative-AI tools improve. Streaming services like Spotify clearly need to do more to ensure the authenticity of the tracks on their platforms, both to protect artists' branding and to give fans confidence that what they're hearing is the real deal.
The music industry is facing a new challenge as AI-generated music impersonating real artists appears on major streaming platforms, raising concerns about authenticity, copyright, and the future of music creation and consumption.
British folk singer-songwriter Emily Portman recently discovered an AI-generated album titled "Orca" on her official pages on Spotify, Apple Music, and other streaming platforms [1][2]. The album, which she had not created, contained 10 tracks that sounded similar to her style and even included a voice resembling hers. Portman was credited as the writer, performer, and copyright holder of the fake album.
Shortly after, another AI-generated album appeared on her streaming pages, described by Portman as "20 tracks of instrumental drivel ... just AI slop" [2]. It took Spotify three weeks to remove the fake content after being notified.
This incident follows the recent controversy surrounding "The Velvet Sundown," an AI-generated band that fooled many music fans on Spotify [2]. The band's creators later admitted to using AI to generate everything from the music to the photos of the non-existent group members.
The rise of AI-generated music impersonating real artists has several implications:
Artist Identity Theft: Real musicians are seeing their names and voices attached to music they never performed or approved [1].
Copyright and Royalties: Scammers can potentially earn royalties from fake releases, especially when targeting lesser-known artists [1].
Fan Deception: Many fans are being tricked into believing they are listening to authentic new releases from their favorite artists [1][2].
Platform Verification Challenges: With streaming platforms facing 99,000 uploads daily, it's becoming increasingly difficult to verify the authenticity of each release [1].
Source: Digital Trends
While AI has legitimate uses in music production, such as helping average people create songs or assisting artists with instrumentation and chord progressions, the current trend of impersonation goes beyond these beneficial applications [1].
The incident highlights the need for streaming services to implement stricter verification processes to protect artists' branding and maintain fan trust [2]. As AI technology continues to improve, the music industry must address these challenges to preserve the integrity of artistic creation and ownership.
Source: TechRadar
Some argue that this might become the "new normal," but critics warn that valuing convenience and novelty over authenticity could lead to more AI-generated content at the expense of genuine artistic expression [1].
Summarized by
Navi