15 Sources
[1]
AI Music Fools Most People, and They're Not Happy About It
A survey from Deezer and Ipsos reveals strong feelings about AI-generated tunes. Our playlists are becoming a playground for AI-generated music. And that's making us uneasy, especially because it's getting harder and harder to discern the genuine, human-made tunes from the musical deepfakes. According to a new survey of 9,000 people by the music service Deezer and the research firm Ipsos, participants listened to three songs and then had to choose which were fully AI-generated and which weren't. Nearly all respondents (97%) couldn't tell the difference. Of those who couldn't tell, 71% said they were surprised by the results and more than half, 52%, were uncomfortable that they couldn't distinguish the AI music. Respondents expressed ambivalence about AI and music: About two-thirds expressed curiosity about AI-generated music, with a willingness to try listening at least once, but four out of five (80%) agreed that AI music should be clearly labeled for listeners. Deezer, which commissioned the survey, has reason to underscore people's inability to tell if they're listening to AI-generated songs. In January, it rolled out a detection tool for AI in music. In the release for the survey, the company said it receives 50,000 AI-generated tracks every day. The unsettling feelings about AI and music have seen a crescendo in recent days as an AI-powered tune from an act called Breaking Rust topped Billboard's country digital music charts. Last month, music streaming giant Spotify signed deals with Sony, Universal and Warner to develop AI music products. Other findings from the Deezer/Ipsos survey showed a mix of curiosity and caution in listeners' attitudes toward AI music. The Deezer/Ipsos survey of 9,000 adults ages 18-65 was conducted in early October in eight countries: the United States, Canada, Brazil, UK, France, Netherlands, Germany and Japan.
[2]
AI music has beaten hat-act humans, but it's not a victory
Top of the slops signposts the undiscovered country for an industry. Opinion: Remember when the hottest news in the schoolyard was which band was the hottest this week? Those days are back, baby. An AI-generated band called Breaking Rust has just hit the top of the Billboard Country chart in the US with a song called Walk My Walk. Some questions will never be answered - could it ever release a sea shanty, and will all the albums be compilations? What this means for the future of the music industry, the AI industry, and music itself, is less funny. It is deeply ironic that the country genre is first to cop the slop. Country is very jealous of its authenticity, of communicating the emotions and experiences of common people. Three chords and the truth, as songwriter Harlan Howard said. Country is also deeply formulaic; it knows what buttons to push to connect with its audience. It has always been one of the easier genres to parody, and LLMs are nothing if not parody generators. It's not as if the music industry hasn't been producing low-quality slop all its life. We remember the exciting, rewarding songs, especially from our teenage years when everything is new and tastes are formed. Go back to the charts from those years of yours, though, and marvel at how much was anodyne, derivative, programmatic junk. The musicians making that were exploited by an industry skilled in keeping as much of the money as it could. That LLMs now offer it the chance to keep all the money is as unsurprising as my dog just upped and left me and the bank just took the house. How the song will end will be more surprising. At this stage, in music as elsewhere, the LLMs have eaten all the data. The web has been scraped, the content archives mined out, and all the millions of digitized songs thrown into the AI maw. Bigger may be better when it comes to models, and the best data is more data, but once there's no more data, what then? The thing about music is that for all its variety and importance in our consciousness, it's quite a small data set. There are probably between 100 and 200 million songs out there. Which sounds a lot, until you compare that to the five billion photographs taken per day. Moreover, it took a century to accumulate that much music, but that's not the problem. On current stats, some 120,000 new tracks are uploaded to streaming services daily, a number that's doubled in the past five years. Guess where a huge amount of this new music has come from. Well done you. Even if the rate of 50,000 AI-generated songs a day holds steady, it will take about five years to overtake all the human-generated music of today. It won't hold steady. Because the human music data set is so small, it is easy to overwhelm with slop. It doesn't matter that there are no musicians to interview or promote, no gigs or festival presence, because this stuff is farmed out by automated playlists. There is nothing to stop the slop. Which is bad for humans, but it's far worse for AI, which does not thrive at all when it feeds on its own output. Model Autophagy Disorder is one of those things AI boosters don't like to talk about. As the name suggests, it's a syndrome that can lead to model collapse in about five ingestion cycles when a model's own output is included in the ingested data. Like bovine spongiform encephalopathy, better known as mad cow disease, it comes about due to terrible industry practices. Those are difficult to eliminate if the industry in question is entirely dependent on those practices.
That's the song the music industry is singing right now. It seems like such a good idea. Consolidate the record labels and streaming distributors, grab as much data as you can to train your AIs claiming fair use, ladle out the results, and collect all the money. Take as many of the filters off as possible to maximize the efficiency of the process, and the streaming revenue checks will write themselves. Nobody's going to stop you. Trouble is, nobody's stopping anyone else from using those generative AIs either. Except, it turns out, the baleful dynamics of LLMs fed on their own swill. Even before model collapse, the quality of the output is smushed down and down as the model sheds diversity while heading towards the terminal feedback loop of insanity. The presence of Walk My Walk at the top of the charts certainly sounds like that process has begun. Our ears are the evidence, the charts themselves marking the progression of the madness. At last, AI benchmarks that mean something. For music to matter, it has to evoke emotion. If that emotion is one of horror at AI's exponential banalification of music, or grim satisfaction as it writhes out of the control of its makers and sinks its fangs into their veins, then it may precipitate the collapse of other, bigger, more dangerous models. Another great theme of country music is redemption, after all. ®
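The dynamic the column describes can be made concrete with a toy experiment. The sketch below is not from the article and is far simpler than the actual Model Autophagy Disorder studies: it assumes a Gaussian "model" as a stand-in for a generative system, with its fitted spread standing in for output diversity, and retrains each generation purely on the previous generation's output.

```python
# Toy illustration of self-ingestion: fit a model to data, resample from the fit,
# refit, repeat. The fitted spread (a proxy for diversity) collapses over time.
import numpy as np

rng = np.random.default_rng(0)

def retrain_on_own_output(n_samples=100, generations=1000):
    """Fit mean/std to the data, resample from the fit, refit; repeat."""
    data = rng.normal(loc=0.0, scale=1.0, size=n_samples)   # "human-made" starting corpus
    spreads = []
    for _ in range(generations):
        mu, sigma = data.mean(), data.std()                  # "train" the toy model
        spreads.append(sigma)
        data = rng.normal(mu, sigma, n_samples)              # next corpus is model output only
    return spreads

spreads = retrain_on_own_output()
# The expected variance shrinks by a factor (n-1)/n each generation, and sampling
# noise drives the spread toward zero over many cycles of pure self-ingestion.
print(f"spread at generation 0:     {spreads[0]:.3g}")
print(f"spread at final generation: {spreads[-1]:.3g}")
```

This is only a caricature of the feedback loop the piece warns about, but it shows why diversity loss compounds rather than stabilizes once a model's own output dominates its training data.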
[3]
That New Hit Song on Spotify? It Was Made by A.I.
Nick Arter, a thirty-five-year-old in Washington, D.C., never quite managed to become a professional musician the old-fashioned way. He grew up in Harrisburg, Pennsylvania, in a music-loving family. His father and stepfather were big into nineties hip-hop -- Jay-Z, Biggie, Nas -- and his uncles were working d.j.s spinning seventies R. & B. By his adolescence, he and his cousins were recording their own hip-hop tracks, first on cassette boom boxes, then on desktop computers, mimicking Lil Romeo and Lil Bow Wow, the popular kid rappers of the day. Music remained a hobby throughout Arter's college years, at Indiana University of Pennsylvania. After graduating, he briefly attempted to go pro, selling mixtapes at local shows, before settling into a job running a government call center in Harrisburg. That role eventually led to a position at Deloitte in D.C., and Arter continued rapping on nights and weekends without releasing any music. "I was getting a little bit too old to be a rapper," he recalled recently. Then, late last year, he started using artificial intelligence to create songs, and, within months, he had hits on streaming platforms netting hundreds of thousands of plays. Maybe he had a musical career after all. Arter's success is emblematic of A.I.'s accelerating inroads into the music industry. No realm of culture or entertainment remains untouched by artificial intelligence: Coca-Cola just released a Christmas ad made with A.I. visuals; A.I. actors are being hyped in Hollywood. But the technology has had an especially swift impact on songwriting. A couple of years ago, a smattering of A.I. tracks went viral for using tricks like replicating the voices of pop stars, including Jay-Z and Drake. Now we're in the midst of a full-blown A.I. music moment. This month, an A.I. country song called "Walk My Walk" (with percussive claps and forgettable lyrics such as "Kick rocks if you don't like how I talk") hit No. 1 on Billboard's Country Digital Song Sales chart, and passed three million streams on Spotify; the performer behind it is a square-jawed digital avatar named Breaking Rust. In September, Xania Monet, an A.I. R. & B. singer created by a young poet in Mississippi, landed a multimillion-dollar record deal after several Billboard-charting singles. And earlier in the year a mysterious psychedelic band called the Velvet Sundown passed a million plays on Spotify before its creators admitted that the group was "synthetic." Spotify does not mark A.I.-generated content as such, and the company has said that it is improving its A.I. filters without defining what qualifies as an A.I. song. In the past year, the platform has removed more than seventy-five million "spammy tracks" from its service, but countless unmarked A.I. tracks remain, and many listeners can't tell the difference. In one recent study, participants could successfully discern A.I.-generated music from human-made music only fifty-three per cent of the time. If you hear an A.I.-generated track online, chances are that it was created with one of two popular song-making apps, Suno or Udio. Arter's process involves both. He writes his own lyrics, often on his phone. Then he drafts text prompts with the lyrics and notes about the track he's envisioning, and plugs the prompts into the two apps to see which one produces better results. 
(Arter told me that "a good prompt consists of (year), (genre of music), (instrumentation), (mood) and (emotion).") He generates dozens of versions of each track this way, iterating on the tune and the instrumentation, until he's happy with an output. Finally, he uses Midjourney to create album art for each new single -- usually closeup portraits of generic soul musicians -- and uploads the songs to streaming services including YouTube and Spotify. One of his more popular hits, with nearly nine hundred thousand plays on Spotify, is "I'm Letting Go of the Bullshit," a pastiche of a late-seventies R. & B. torch song and hip-hop-style lyrical empowerment: "This year I'm in my flow / fuck anything that don't help me grow." The apps allow Arter to save a dashboard of style shortcuts, making it easier to produce future tracks in a similar vein. "The algorithm kind of learns your taste," he explained. Arter's music, released under the name Nick Hustles, is by no means subtle (another track is "Stop Bitching": "nobody ever got rich / acting like a little bitch"), and the instrumentals and vocals are undercut with the vacant tinniness that's the hallmark of A.I. sound. But the melodies -- and certain lyrical flourishes, such as a prominent expletive in "Dopest MotherFucker Alive" -- are catchy enough to stick in your head. This technology has "opened up a new realm of creative possibility," Arter said. He had never been a skilled singer; now he could dabble in the old-school R. & B. he grew up with. Suddenly, he could craft ageless personae to represent his music, complete with fictional backstories, in lieu of his aging millennial self. Arter has produced about a hundred and forty songs in the past year alone, and he doesn't hide the fact that his music is made with A.I., though the unsuspecting listener may not notice the name of his YouTube account, "AI for the Culture." Many of his songs function as punch lines about everyday life: They're "talking about being in traffic, Chipotle messing up my order," Arter said. His œuvre includes "Healthy Hoes At Trader Joe's," "I'm About to Take My Ass to Sleeep" [sic], and both "I Got to Stop Vaping" and "I LOST MY FUCKING VAPE AGAIN," catering to his demographic and covering all stages of addiction. He has never done marketing or promotion for his A.I. music, yet word of mouth and algorithmic recommendations, such as Spotify's Radio function, have propelled his work to a level of popularity that he could only dream of as a rapping teen-ager. Justin Bieber has used Arter's songs to soundtrack Instagram posts, and 50 Cent posted a video of himself singing along to a Nick Hustles track in his car. The rapper Young Thug adopted the chorus of Arter's "all my dogs got that dog in 'em" for his hit track "Miss My Dogs" and gave Arter credit as a lyricist. Arter was able to quit his job in consulting and embark on a full-time career as a semiautomated musician. He now works with the music distributor UnitedMasters and makes money from more than fifty different streaming platforms. On the side, he generates novelty songs for clients' birthdays or weddings at five hundred dollars a pop (half price if you supply your own lyrics). Arter has no doubt that what he's doing is just a new way of being an artist: If your music "changes someone's life," he said, "does it really matter if it was A.I.?"
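For readers curious what Arter's prompt recipe amounts to in practice, here is a minimal sketch assembling the (year), (genre), (instrumentation), (mood), (emotion) fields he lists into a single text prompt alongside the lyrics. The helper and its names are invented for illustration; Suno and Udio are driven through their own apps, and this is only a picture of the recipe, not their interface.

```python
# Hypothetical helper: combine the five stylistic fields Arter describes with
# user-written lyrics into one text prompt. Illustrative only.
from dataclasses import dataclass

@dataclass
class SongBrief:
    year: str
    genre: str
    instrumentation: str
    mood: str
    emotion: str

def build_prompt(brief: SongBrief, lyrics: str) -> str:
    """Assemble a style description plus the user's own lyrics."""
    style = (f"{brief.year} {brief.genre}, featuring {brief.instrumentation}, "
             f"{brief.mood} mood, conveying {brief.emotion}")
    return f"Style: {style}\nLyrics:\n{lyrics}"

brief = SongBrief("late 1970s", "R&B torch song", "warm electric piano and strings",
                  "laid-back", "empowerment")
print(build_prompt(brief, "This year I'm in my flow..."))
```

The point of the structure is iteration: hold the lyrics fixed, vary the five style fields, and regenerate dozens of candidates, which is essentially the workflow Arter describes.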
[4]
People Can't Tell If a Song Is AI-Generated, and That's Why It's Going to Be Inescapable
If you thought AI-generated videos were getting scary realistic, you now have one more worry to add to that list. According to music streaming platform Deezer, an overwhelming majority of people cannot tell AI-generated music apart from the real deal written and performed by actual humans. In a joint survey with market research company Ipsos, Deezer asked 9,000 people across eight countries (the United States, Canada, Brazil, the United Kingdom, France, the Netherlands, Germany, and Japan) to listen to songs and determine whether or not they were AI-generated. A whopping 97% of the respondents failed this task. The participants in Deezer's survey were split on how to view the findings, with 52% finding it uncomfortable that they were not able to tell the difference. 51% of the survey respondents also said they think AI will lead to more low-quality, generic-sounding (aka AI slop) music. Regardless of how they viewed AI's role in music, 80% agreed that AI-generated music should be clearly labeled. "The survey results clearly show that people care about music and want to know if they're listening to AI or human-made tracks or not," Deezer CEO Alexis Lanternier said in a press release. Labeling AI use in music is a hot topic. The conversation was sparked earlier this year, when a rock band called "The Velvet Sundown" amassed a million Spotify streams before it was revealed that the project was AI-generated. It led to increasing calls by artists for the clear labeling of AI use in music. Spotify said in September that it will start supporting a "new industry standard for AI disclosures in music credits." But a quick look at The Velvet Sundown's artist page shows that there is no clear, upfront labeling yet. Deezer clearly labels AI-generated content on its platform, but is still home to a growing number of AI-generated songs. The French streaming company announced in September that 28% of the music uploaded on its platform was fully AI-generated. There's a likely reason why AI-generated music was so hard to distinguish from the real thing in the Deezer survey, and it's the same reason why artists are worried about the rise of AI-generated music: it's because these AI song generators are trained on the hard work of actual human musicians. "There's also no doubt that there are concerns about how AI-generated music will affect the livelihood of artists, music creation and that AI companies shouldn't be allowed to train their models on copyrighted material," Lanternier said in the press release. Seventy percent of the respondents said they believed AI-generated music threatened the livelihood of musicians. It's unclear where copyright law goes from here in relation to AI and music. Early signs say the European Union might be siding with the artists. In a key case in Germany, a court ruled earlier this week that OpenAI's ChatGPT had violated copyright law by training its models with song lyrics. The story is different elsewhere, though. Earlier this year, famous British musicians like Elton John and Dua Lipa called on the British government to pass an amendment that would ensure copyright transparency when it comes to how AI companies can use their work to train models. But that amendment ultimately failed. AI-generated music is winning stateside as well.
In April of last year, leading artists from Billie Eilish to Aerosmith signed an open letter calling on AI developers and digital music services to pledge not to "develop or deploy AI-music generation technology, content or tools that undermine or replace the human artistry of songwriters and artists." A few months later, leading studios Universal Music Group, Sony, and Warner Records filed a copyright lawsuit against two popular AI music generation startups, Suno and Udio. But fast forward a year, Universal Music Group announced that it not only had an out-of-court settlement with Udio but was also partnering with the AI company to create a new product trained exclusively on their music catalogue. Spotify is also doubling down on AI. The streaming giant already uses AI to optimize its algorithm and provides services like an "AI DJ" to mimic a radio host interjecting with commentary in between a personalized music stream. The company also announced last month that it was planning to collaborate with Sony, UMG, Warner Music, and others to develop "responsible AI products," without yet divulging what exactly those products would be. "AI is the most consequential technology shift since the smartphone, and it's already reshaping how music is created and experienced," Spotify's co-president Gustav Söderström said in the press release. "Our company brings deep research expertise to this opportunity, and we're actively growing our AI team and capabilities to drive the continued growth of the entire music ecosystem." AI isn't just coming for your Spotify playlists, though. The music industry is everywhere. The livelihoods of real, human musicians depend on not just hit albums but also catchy brand jingles, movie soundtracks, podcast outros, hold songs on phone calls, and a lot of other melodies that we take for granted as background music. In a world where AI takes over music, the jobs of these anonymous musicians who create the soundtrack of our everyday lives could be the first on the chopping block.
[5]
AI slop tops Billboard and Spotify charts as synthetic music spreads
Three songs generated by artificial intelligence topped music charts this week, reaching the highest spots on Spotify and Billboard charts. Walk My Walk and Livin' on Borrowed Time by the outfit Breaking Rust topped Spotify's "Viral 50" songs in the US, which documents the "most viral tracks right now" on a daily basis, according to the streaming service. A Dutch song, We Say No, No, No to an Asylum Center, an anti-migrant anthem by JW "Broken Veteran" that protests against the creation of new asylum centers, took the top position in Spotify's global version of the viral chart around the same time. Breaking Rust also appeared in the top five on the global chart. "You can kick rocks if you don't like how I talk," reads a lyric from Walk My Walk, a seeming double entendre challenging those opposed to AI-generated music. Days after its ascent up the charts, the Dutch song disappeared from Spotify and YouTube, as did Broken Veteran's other music. Spotify told the Dutch outlet NU.nl that the company had not removed the music, the owners of the song rights had. Broken Veteran told the outlet that he did not know why his music had disappeared and that he was investigating, hoping to return it soon. For three weeks, Walk My Walk has led Billboard's "Country Digital Song Sales" chart, which measures downloads and digital purchases. The list is minor in comparison to Billboard's "Hot Country Songs" or "Top Country Albums", which measure a broader range of signals of success. Breaking Rust, JW "Broken Veteran" and Spotify did not respond to requests for comment. These three songs are part of a flood of AI-generated music that has come to saturate streaming platforms. A study published on Wednesday by the streaming app Deezer estimates that 50,000 AI-generated songs are uploaded to the platform every day - 34% of all the music submitted. Walk My Walk and We Say No, No, No to an Asylum Center, aren't the first AI-generated songs to reach a mass audience. Over the summer, AI-generated songs from a group going by the name Velvet Sundown amassed over a million streams on Spotify, in what one member later described as an "art hoax". Ed Newton-Rex, a musician and the founder of a non-profit that certifies generative AI companies' data training practices as fair to artists, says that the sheer number of AI-generated songs now online is a key factor driving the ascent of a few AI-generated hits. "It's part of the very rapid trend of AI music gaining in popularity essentially because it's spreading in volume," he said. "What you have here is 50,000 tracks a day that are competing with human musicians. You have a new, hyperscalable competitor and, moreover, this competitor that was built by exploitation." AI music has improved in quality from its early, clanking days. As part of its study, Deezer surveyed 9,000 people in eight countries and found that 97% could not distinguish between AI-generated music and human-written music. "There's no denying it. I think it's fair to say you can't distinguish the best AI music from human-composed music now," Newton-Rex said. Human-level quality is not the only reason for the success of AI-generated tracks on Spotify. As with many parts of the AI slop economy, there are a set of tools and platforms out there that enable AI music to spread easily - and sub-communities of users eager to share tips to game the system. 
Jack Righteous, a blog on AI content creation, recommends that its followers generate "streams of passive income" using a music distribution service called DistroKid, which funnels royalties to music creators every time their AI tracks are streamed on YouTube, Spotify or TikTok. DistroKid is part of an ecosystem of online music distribution services, such as Amuse, Landr and CDBaby, which help creators place their music on major platforms like YouTube and Spotify. These services have varied policies on AI-generated content, and blogs describe DistroKid's as "more lenient". Several hits from Breaking Rust, including Livin' on Borrowed Time and Resilient, appear to be distributed by DistroKid. "Basically every piece of AI music you see isn't distributed by a regular label. They're made by a person in their bedroom and uploaded to these distribution sites," said Chris Dalla Riva, author of Uncharted Territory, a book about the data behind music virality. When reached for comment, Spotify pointed to its policy on AI-generated tracks.
[6]
AI artists blow up on country music chart
Why it matters: The success of "artists" Breaking Rust and Cain Walker pits AI technology against humans who earn their living as songwriters, artists and music business professionals. Driving the news: Breaking Rust has the No. 1 song on the Billboard country digital song sales chart with the single "Walk My Walk." * Cain Walker's "Don't Tread On Me" comes in at No. 3 on the same chart. * Ella Langley, a human singer-songwriter, is No. 2 on the chart with "Choosin' Texas." Threat level: The situation is ringing alarm bells in Nashville. * Dating back to the 1950s when husband-and-wife writing duo Felice and Boudleaux Bryant were churning out hits for the Everly Brothers, Nashville has been a songwriting capital of the world. * Songwriting and music publishing are the cornerstones of the city's music industry. Countless singer-songwriters flocked to Nashville over the decades, and successful ones like Don Schlitz and Liz Rose proudly carried on the city's songwriting tradition. Yes, but: The age of streaming already put the traditional country music songwriter under threat as royalties declined. The number of professionals who make their living as just songwriters, and not also as performers, has dropped precipitously. * Streaming revenues continue to grow, and songwriters fought in 2018 to pass the Music Modernization Act, which they hoped would create more favorable revenue structures. * Still, it's gritty out there. Against that backdrop now comes AI-fueled competition. By the numbers: Breaking Rust boasts 2.3 million monthly listeners on Spotify. Cain Walker has over 842,000. * By comparison, ascending singer-songwriter Jackson Dean, whose single "Heavens to Betsy" is climbing the country radio charts, has 1.6 million monthly listeners. Between the lines: Aaron Ryan, editor at the country music website Whiskey Riff, took a critical view of the situation in a recent post: "With advances in technology, a lot of these songs are nearly indistinguishable from the real thing, which obviously poses a risk to actual artists, songwriters, and fans who value real art over AI slop." * Ryan reports the songs released by Breaking Rust are credited to a person named Aubierre Rivaldo Taylor, who is also behind other AI artists.

Songwriter advocates express concern

The sight of AI-generated songs topping sacred country music charts sent shockwaves through Nashville this week. What they're saying: "To creators, AI is scary and it's existentially scary," Bart Herbison, executive director of Nashville Songwriters Association International, the nation's leading songwriter advocacy group, tells Axios. "In this instance, it's ignited a conversation that I've never seen in my 28 years in this job." * "I'm going to speak to (the impact of AI on the) songwriter and songwriter protections. AI is here, but what we've espoused for a couple of years are the four P's: permission, payment, proof and penalties," Herbison says, explaining NSAI's stance on AI copyright protections for songwriters. "We want to see that in any context, whether it's the song or the artist that AI produced." Friction point: Herbison says he questions whether the AI tools used to generate successful songs are "trained on human works, and whether there's compensation for that." * Tennessee last year became the first state in the country to pass a law protecting creators from AI deepfakes. * Advocacy groups want creators to be compensated if AI models its work after them.
They want penalties in instances of unlicensed use of an artist's voice or likeness.
[7]
Your next favorite song could be made by AI
What Happened: So, the streaming service Deezer just dropped the results of a huge survey it ran about AI-generated music, and the findings are pretty wild. It teamed up with the polling company Ipsos to ask over 9,000 people across eight different countries what they really think about AI in music. And here's the headline that's freaking everyone out: in a blind test, 97% of people could not tell the difference between a song made by a human and one generated by AI. Spooky, right? Even worse, when people were told they couldn't tell the difference, more than half of them (52%) said it made them feel "uneasy." This all comes as Deezer is trying to be the good guy in this fight. It's pledging to be the first major service to put a clear label on any track that is 100% made by AI. Why Is This Important: This survey basically proves what a lot of us have been feeling: AI is getting good, way faster than our ears can keep up. People are totally torn. On one hand, almost half the folks in the survey thought AI could be a cool tool for discovering new music. But on the other hand, a lot of them (64%) are worried it's going to suck all the creativity out of music, and 70% are scared it's going to put human artists out of a job. The other big takeaway? People want rules, and they want them now. A massive 80% said AI-made songs must be clearly labeled. And 65% are saying AI companies shouldn't be allowed to train their bots on copyrighted music without getting permission and paying for it - which is a direct shot at how a lot of these AI companies operate. Why Should I Care: So, why does this matter to you? If you use Spotify, Apple Music, or, well, Deezer, this is your world now. AI-generated songs are flooding these platforms. It's getting harder to know who (or what) actually made the song you're listening to. Real artists could see their work get diluted or just plain ripped off without getting paid a cent. Your favorite "Discovery" playlist might soon be packed with songs made by a computer, and you wouldn't even know it. This survey just confirms that trust is breaking down. Fans are getting creeped out, and they want to know what's real before they hit "like." What's Next: Deezer says it's sticking to its guns. It's going to keep adding AI labels and is trying to push the rest of the industry to do the same (we'll see how that goes). But this is just the start. Get ready for 2025 to be the year of endless, loud arguments about AI training, how artists get paid, and what "real music" even means anymore. Deezer is just the first one to make a big, public move.
[8]
AI music is getting messy
Xania Monet just became the first "artificial" artist to chart on Billboard's airplay rankings and secure a multimillion-dollar record deal. But most listeners can't tell she's not actually human: She's a creation of generative AI. That disconnect is a problem the music industry is scrambling to solve. Monet's breakthrough arrives as the recording industry, already transformed by two decades of digital disruption, enters its next phase of reinvention. Major labels that once fought streaming are now racing to stake claims in AI territory, negotiating deals that will determine how music gets made, who gets paid, and what consumers actually know -- or care -- about what they're listening to. Behind Monet is Telisha Nikki Jones, a Mississippi poet who writes the lyrics that Monet performs using Suno's generative AI platform. Monet has released at least 31 songs since the summer, including a full-length album "Unfolded" in August with 24 tracks. Her songs "Let Go, Let God" and "How Was I Supposed to Know" have charted on Billboard's Hot Gospel Songs and Hot R&B Songs, respectively, a first for artificial artists. "AI doesn't replace the artist," Romel Murphy, Monet's manager, told CNN. "It doesn't diminish the creativity and doesn't take away from the human experience. It's a new frontier." But that frontier looks different depending on where you're standing. Working musicians see their already precarious livelihoods threatened by endless AI-generated alternatives. Industry executives see both opportunity and existential threat. And listeners? They mostly don't know what they're hearing. A recent study found that listeners could only correctly identify AI-generated music 53% of the time, barely better than random guessing. When presented with stylistically similar human and AI songs, accuracy improved to 66%, but that still means one in three listeners couldn't tell the difference. The terms of Universal Music Group's settlement with Udio remain undisclosed, but the structure hints at the industry's strategy. Artists must opt in to have their music included, and all AI-generated content must stay within Udio's platform. Similar deals are reportedly weeks away. According to the Financial Times, Universal and Warner are in talks with Google, Spotify, and various AI startups including Klay Vision, ElevenLabs, and Stability AI. The labels are pushing for a streaming-like payment model where each use of their music in AI training or generation triggers a micropayment. The urgency is understandable. Besides Monet, Billboard said at least one new AI artist has shown up on the charts for the last five weeks, meaning there are more and more chances for chart-topping confusion. Spotify revealed that it removed 75 million tracks last year to maintain quality, though the company won't specify how many were AI-generated. Deezer, another streaming platform, reports that up to 70% of AI-generated music streams on its platform are fraudulent, suggesting the technology is already being weaponized for streaming fraud at scale. The lack of transparency about what music AI models are trained on means independent artists could be losing compensation without even knowing their work was used.
Industry groups are calling for mandatory labeling of AI-generated content, warning that without safeguards, artificial intelligence risks repeating streaming's pattern of tech platforms profiting while creators struggle. Currently, streaming platforms have no legal obligation to identify AI-generated music. Deezer uses detection software to tag AI tracks, but Spotify doesn't label them at all, leaving consumers in the dark about what they're hearing. The industry's challenge goes beyond detection or regulation. Music has always been more than sound waves arranged in pleasing patterns. It's been about human connection, shared experience, and the stories we tell ourselves about the songs we love. As AI-generated artists climb the charts and secure record deals, the question isn't whether machines can make music that sounds real. They already can. The question is whether listeners will still care about the difference once they know the truth.
[9]
Humans can no longer tell AI music from the real thing: Survey
It has become nearly impossible for people to tell the difference between music generated by artificial intelligence and that created by humans, according to a survey released Wednesday. The polling firm Ipsos asked 9,000 people to listen to two clips of AI-generated music and one of human-made music in a survey conducted for France-based streaming platform Deezer. "Ninety-seven percent could not distinguish between music entirely generated by AI and human-created music," said Deezer. The survey came out as a country music song featuring a male singer's voice generated by AI reached the top of the US charts for the first time this week. "Walk My Walk" by Breaking Rust -- an artist widely reported by US media to be powered by generative AI technology -- made it to the top spot on Billboard magazine's chart ranking digital sales of country songs, according to data published Monday. Deezer said more than half of the respondents to its survey felt uncomfortable at not being able to tell the difference. Pollsters also asked broader questions about the impact of AI, with 51% saying the technology would lead to more low-quality music on streaming platforms and almost two-thirds believing it will lead to a loss of creativity. "The survey results clearly show that people care about music and want to know if they're listening to AI or human-made tracks or not," Deezer CEO Alexis Lanternier said.

One in three streamed tracks AI

Deezer said there has not only been a surge in AI-generated content being uploaded to its platform, but it is finding listeners. In January, one in 10 of the tracks streamed each day were completely AI-generated. Ten months later, that percentage has climbed to over one in three, or nearly 40,000 per day. Eighty percent of survey respondents wanted fully AI-generated music clearly labeled for listeners. Deezer is the only major music-streaming platform that systematically labels completely AI-generated content for users. The issue gained prominence in June when a band called The Velvet Sundown suddenly went viral on Spotify, and only confirmed the following month that it was in fact AI-generated content. The AI group's most popular song has been streamed more than three million times. In response, Spotify said it would encourage artists and publishers to sign up to a voluntary industry code to disclose AI use in music production. The Deezer survey was conducted between October 6 and 10 in eight countries: Brazil, Britain, Canada, France, Germany, Japan, the Netherlands and the United States.
[10]
People can't tell AI-generated music from real thing anymore, survey shows
It's become nearly impossible for people to tell the difference between music generated by artificial intelligence and that created by humans, according to a survey released Wednesday. The polling firm Ipsos asked 9,000 people to listen to two clips of AI-generated music and one of human-made music in a survey conducted for France-based streaming platform Deezer. "Ninety-seven percent could not distinguish between music entirely generated by AI and human-created music," said Deezer in a statement. The survey was conducted between October 6 and 10 in eight countries: Brazil, Britain, Canada, France, Germany, Japan, the Netherlands and the United States. Deezer said more than half of the respondents felt uncomfortable at not being able to tell the difference. Pollsters also asked broader questions about the impact of AI, with 51 percent saying the technology would lead to more low-quality music on streaming platforms and almost two-thirds believing it will lead to a loss of creativity. "The survey results clearly show that people care about music and want to know if they're listening to AI or human made tracks or not," Deezer CEO Alexis Lanternier said in a statement. Deezer said there's not only been a surge in AI-generated content being uploaded to its platform, but it's attracting listeners as well. In January, one in 10 of the tracks streamed each day were completely AI-generated. Ten months later, that percentage has climbed to over one in three, or nearly 40,000 per day. Eighty percent of survey respondents wanted fully AI-generated music clearly labelled for listeners. Deezer is the only major music-streaming platform that systematically labels completely AI-generated content for users. The issue gained prominence in June when a band called The Velvet Sundown suddenly went viral on Spotify and only confirmed the following month that it was in fact AI-generated content. The AI group's most popular song has been streamed more than three million times. In response, Spotify said it would encourage artists and publishers to sign up to a voluntary industry code to disclose AI use in music production.
[11]
Humans can no longer tell AI music from the real thing: survey
Paris (AFP) - It has become nearly impossible for people to tell the difference between music generated by artificial intelligence and that created by humans, according to a survey released Wednesday. The polling firm Ipsos asked 9,000 people to listen to two clips of AI-generated music and one of human-made music in a survey conducted for France-based streaming platform Deezer. "Ninety-seven percent could not distinguish between music entirely generated by AI and human-created music," said Deezer in a statement. The survey was conducted between October 6 and 10 in eight countries: Brazil, Britain, Canada, France, Germany, Japan, the Netherlands and the United States. Deezer said more than half of the respondents felt uncomfortable at not being able to tell the difference. Pollsters also asked broader questions about the impact of AI, with 51 percent saying the technology would lead to more low-quality music on streaming platforms and almost two-thirds believing it will lead to a loss of creativity. "The survey results clearly show that people care about music and want to know if they're listening to AI or human made tracks or not," Deezer CEO Alexis Lanternier said in a statement. Deezer said there has not only been a surge in AI-generated content being uploaded to its platform, but it is finding listeners as well. In January, one in 10 of the tracks streamed each day were completely AI-generated. Ten months later, that percentage has climbed to over one in three, or nearly 40,000 per day. Eighty percent of survey respondents wanted fully AI-generated music clearly labelled for listeners. Deezer is the only major music-streaming platform that systematically labels completely AI-generated content for users. The issue gained prominence in June when a band called The Velvet Sundown suddenly went viral on Spotify, and only confirmed the following month that it was in fact AI-generated content. The AI group's most popular song has been streamed more than three million times. In response, Spotify said it would encourage artists and publishers to sign up to a voluntary industry code to disclose AI use in music production.
[12]
A third of daily music uploads are AI-generated and 97% of people can't tell the difference, says report
Do you care if the music you're listening to is artificially generated? That question - once the realm of science fiction - is becoming increasingly urgent. An AI-generated country track, Walk My Walk, is currently sitting at number one on the US Billboard chart of digital sales and a new report by streaming platform Deezer has revealed the sheer scale of AI production in the music industry. Deezer's AI-detection system found that around 50,000 fully AI-generated tracks are now uploaded every day, accounting for 34% of all daily uploads. The true number is most likely higher, as Deezer's AI-detection system does not catch every AI-generated track. Nor does this figure include partially AI-generated tracks. In January 2025, Deezer's system identified 10% of uploaded tracks as fully AI-generated. Since then, the proportion of AI tracks - made using written prompts such as "country, 1990s style, male singer" - has more than tripled, leading the platform's chief executive, Alexis Lanternier, to say that AI music is "flooding music streaming".

'Siphoning money from royalty pool'

What's more, when Deezer surveyed 9,000 people in eight countries - the US, Canada, Brazil, UK, France, Netherlands, Germany and Japan - and asked them to detect whether three tracks were real or AI, 97% could not tell the difference. That's despite the fact that the motivation behind the surge of AI music is not in the least bit creative, according to Deezer. The company says that roughly 70% of fully AI-generated tracks are what it calls "fraudulent" - that is, designed purely to make money. "The common denominator is the ambition to boost streams on specific tracks in order to siphon money from the royalty pool," a Deezer spokesperson told Sky News. "With AI-generated content, you can easily create massive amounts of tracks that can be used for this purpose." The tracks themselves are not actually fraudulent, Deezer says, but the behaviour around them is. Someone will upload an AI track then use an automated system - a bot - to listen to a song over and over again to make royalties from it. Even though the total number of streams for each individual track is very low - Deezer estimates that together they account for 0.5% of all streams - the work needed to make an AI track is so tiny that the rewards justify the effort.

Are fully-AI tracks being removed?

Deezer is investing in AI-detection software and has filed two patents for systems that spot AI music. But it is not taking down the tracks it marks as fully-AI. Instead it removes them from algorithmic recommendations and editorial playlists, a measure designed to stop the tracks getting streams and therefore generating royalties, and marks the tracks as "AI-generated content".

Concerns about artists' livelihoods

Deezer's survey found that more than half (52%) of respondents felt uncomfortable with not being able to tell the difference between AI and human-made music. "The survey results clearly show that people care about music and want to know if they're listening to AI or human-made tracks or not," said the company's boss Alexis Lanternier. "There's also no doubt that there are concerns about how AI-generated music will affect the livelihood of artists."
Earlier this year, more than 1,000 musicians - including Annie Lennox, Damon Albarn and Kate Bush - released a silent album to protest plans by the UK government to let artificial intelligence companies use copyright-protected work without permission. A recent study commissioned by the International Confederation of Societies of Authors and Composers suggested that generative AI music could be worth £146bn a year in 2028 and account for around 60% of music libraries' revenues. By this metric, the authors concluded, 25% of creators' revenues are at risk by 2028, a sum of £3.5bn.
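The label-and-demote approach described above (tag fully AI-generated tracks, keep them out of algorithmic recommendations and editorial playlists, but do not take them down) is easy to picture as a simple filtering rule. Below is a minimal, hypothetical sketch of that kind of policy; every class, field, and function name is invented for illustration, and this is not Deezer's actual system.

```python
# Hypothetical sketch: flagged tracks stay on the platform but are tagged and
# excluded from recommendation pools. Names and fields are invented.
from dataclasses import dataclass, field

@dataclass
class Track:
    title: str
    fully_ai_generated: bool          # verdict from an upstream detector
    tags: set[str] = field(default_factory=set)

def apply_ai_policy(tracks: list[Track]) -> list[Track]:
    """Tag fully-AI tracks; return only tracks eligible for recommendation pools."""
    recommendable = []
    for track in tracks:
        if track.fully_ai_generated:
            track.tags.add("AI-generated content")   # label, but do not take down
        else:
            recommendable.append(track)              # only unflagged tracks feed recommendations
    return recommendable

catalog = [Track("Walk My Walk", True), Track("Choosin' Texas", False)]
print([t.title for t in apply_ai_policy(catalog)])   # only the human-made track remains
print(catalog[0].tags)                                # {'AI-generated content'}
```

The design choice the article describes is notable: the flagged track is never removed, only starved of algorithmic exposure, which targets the royalty incentive rather than the content itself.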
[13]
Humans can no longer tell AI music from the real thing: survey
A survey by Deezer shows that 97% of people cannot tell AI-generated music from human-made tracks. With AI content rising on streaming platforms, over half of listeners feel uneasy. 80% want clear labelling, while concerns grow over creativity loss and low-quality music flooding the industry.

It has become nearly impossible for people to tell the difference between music generated by artificial intelligence and that created by humans, according to a survey released Wednesday. The polling firm Ipsos asked 9,000 people to listen to two clips of AI-generated music and one of human-made music in a survey conducted for France-based streaming platform Deezer. "97% could not distinguish between music entirely generated by AI and human-created music," said Deezer in a statement. The survey was conducted between October 6 and 10 in eight countries: Brazil, Britain, Canada, France, Germany, Japan, the Netherlands and the United States. Deezer said more than half of the respondents felt uncomfortable at not being able to tell the difference. Pollsters also asked broader questions about the impact of AI, with 51% saying the technology would lead to more low-quality music on streaming platforms and almost two-thirds believing it will lead to a loss of creativity. "The survey results clearly show that people care about music and want to know if they're listening to AI or human made tracks or not," Deezer CEO Alexis Lanternier said in a statement. Deezer said there has not only been a surge in AI-generated content being uploaded to its platform, but it is finding listeners as well. In January, one in 10 of the tracks streamed each day were completely AI-generated. Ten months later, that percentage has climbed to over one in three, or nearly 40,000 per day. 80% of survey respondents wanted fully AI-generated music clearly labelled for listeners. Deezer is the only major music-streaming platform that systematically labels completely AI-generated content for users. The issue gained prominence in June when a band called The Velvet Sundown suddenly went viral on Spotify, and only confirmed the following month that it was in fact AI-generated content. The AI group's most popular song has been streamed more than three million times. In response, Spotify said it would encourage artists and publishers to sign up to a voluntary industry code to disclose AI use in music production.
[14]
AI can perform a song, but can it make art?
The most insulting thing about the success of Breaking Rust, an artificial intelligence "artist" that topped Billboard's Country Digital Song Sales Chart this week, is the titles of the hits. The EP -- which is also on the charts -- is called "Resilient," as if Breaking Rust spent years playing for tips in empty bars. And maybe Aubierre Rivaldo Taylor, who is credited for writing the songs, did. But the bluesy voice we hear singing about pain and suffering did not overcome anything. In fact, you could say this completely computer-generated country singer found chart success by mocking people. A year ago, a handful of loud industry folks in Nashville questioned whether Beyoncé, who was born and raised in Texas, was country enough to do a country album. Good times. Today AI-generated "performers" such as Breaking Rust and Xania Monet, which hit the Billboard R&B charts, are suggesting you don't even need to be human to fit into those genres. Eric Church, whose latest release "Evangeline vs. the Machine," was nominated this month in the best contemporary country album category at the Grammys, told me he's not too worried because fans still want to see live shows and "AI algorithm is not going to be able to walk on stage and play." He says that the best thing the industry can do is establish AI music as its own genre and that award shows should establish a separate category. "I think it's a fad," he said, adding that he finds it fun. "When people like a song or connect with an artist the ultimate thing for them is then to go experience that artist with people who also like that artist, that's the ultimate payoff. You're not going to be able to do that with AI." Church wraps up touring on Saturday at the Intuit Dome in Inglewood. In addition to promoting the new album, this year his foundation began providing housing for victims of Hurricane Helene using funds from a benefit concert. The North Carolina native also released a single to raise funds to help his neighbors. You know, things only a flesh-and-blood artist can do. Regarding Breaking Rust, he said: "The better thing we should be doing is making the general public aware that it's AI because ... I don't think they know that." "The biggest problem is the ability to deceive people or manipulate people because it looks real, it sounds real, it's pretty disingenuous if you didn't say it," Church told me. "I've seen stuff from me that is online.... They take my face and they put it on another body.... My mom sent me one and I was like, 'Mom, that's not me.' "That's where it gets dangerous and that's where it gets scary." If AI-generated "musicians" like Breaking Rust are a passing fad, as Church suggests, it's one that's been 50 years in the making. While use of the voice box on recordings goes back to the 1960s, it was the 1975 recording of Peter Frampton's double live album, "Frampton Comes Alive," that popularized its use. In the 1980s Zapp had a string of gold albums with front man Roger Troutman using the voice box technology to make his voice sound futuristic, and in the 1990s AutoTune went from being a tool producers use to fine-tune a singer's pitch on a recording to being the featured sound on a recording. This gave us Cher's global chart-topper "Believe." Over the decades, technology in the studio has made it possible for the vocally challenged to usurp craftsmanship and talent. Before MTV debuted in 1981, we were warned that video was going to kill the radio star. That obviously didn't happen. 
And now, AI-generated video can theoretically replace filmed human performances. But even that should not be a threat to real stars. As with most things in life, when expertise is devalued, it's easier to pass trash off as treasure. AutoTune and AI are enabling people who lack musical talent to game the system -- like audio catfish. When an artist like Church sings of heartbreak, listeners can identify with his lived experience. However, Breaking Rust is on the top of the charts with a song called "Walk My Walk" ... and the entity singing those words has never taken a step. That's not to say an AI ditty can't be catchy. It most certainly can be. I just wonder: If the artist isn't real, how can the art be?
[15]
Are you listening to bots? Survey shows AI music is virtually undetectable
(Reuters) -A staggering 97% of listeners cannot distinguish between artificial intelligence-generated and human-composed songs, a Deezer-Ipsos survey showed on Wednesday, underscoring growing concerns that AI could upend how music is created, consumed and monetized. The findings of the survey highlight the ethical and economic tension facing the global music industry as AI tools capable of generating songs in seconds raise copyright concerns and threaten the livelihoods of artists. The issue gained prominence earlier this year when AI band "The Velvet Sundown" sparked enough buzz to gain around one million monthly listeners on Spotify before people found out about its synthetic origins. The study, which polled 9,000 people across eight countries including the U.S., UK, France, Brazil and Canada, showed about 71% of the respondents were surprised by their inability to distinguish between human and machine-produced tracks. Streaming platform Deezer said more than 50,000 songs uploaded daily on its service are entirely AI-generated, accounting for about a third of all new music submissions. The company began tagging AI music earlier this year to promote transparency. "We believe strongly that creativity is generated by human beings, and they should be protected," CEO Alexis Lanternier told Reuters, urging transparency. Lanternier also said implementing a differential payout structure for AI music is a complex issue with varying viewpoints, making a "massive change" to remuneration difficult. Deezer is excluding fake streams from royalty payments. Another way is for companies to make deals with rights holders, he added, citing Universal Music Group's settlement of a copyright infringement case with AI company Udio. While financial terms were undisclosed, UMG will partner with Udio to launch a new AI music creation and streaming platform next year, with the AI tool getting trained on licensed music. (Reporting by Jaspreet Singh in Bengaluru; Editing by Nivedita Bhattacharjee and Saumyadeb Chakrabarty)
AI-generated music is flooding streaming platforms and topping charts, with Breaking Rust's country song reaching Billboard's top spot while most listeners remain unable to detect the artificial nature of these tracks.
Artificial intelligence has reached a significant milestone in the music industry, with AI-generated songs not only infiltrating streaming platforms but achieving chart-topping success. Breaking Rust, an AI-created country act featuring a digital avatar, has dominated Billboard's Country Digital Song Sales chart for three consecutive weeks with "Walk My Walk" [2]. The track, featuring forgettable lyrics like "Kick rocks if you don't like how I talk," has garnered over three million streams on Spotify [3].
This success extends beyond a single track. Breaking Rust's "Walk My Walk" and "Livin' on Borrowed Time" topped Spotify's "Viral 50" chart in the United States, while a Dutch anti-migrant anthem "We Say No, No, No to an Asylum Center" by JW "Broken Veteran" reached the top position globally [5]. The phenomenon represents a broader trend of AI-generated content achieving mainstream recognition across multiple markets and genres.

A comprehensive survey conducted by music streaming platform Deezer and research firm Ipsos reveals the extent of AI music's sophistication. The study, involving 9,000 participants across eight countries including the United States, Canada, Brazil, UK, France, Netherlands, Germany, and Japan, found that 97% of respondents could not distinguish between AI-generated and human-made music.

The results sparked mixed reactions among participants. While 71% expressed surprise at their inability to detect AI music, 52% reported feeling uncomfortable about this limitation [4]. Despite this discomfort, approximately two-thirds of respondents expressed curiosity about AI-generated music and willingness to listen at least once, though 80% agreed that AI music should be clearly labeled for listeners.

The volume of AI-generated music entering streaming platforms has reached unprecedented levels. Deezer reports receiving 50,000 AI-generated tracks daily, representing 34% of all music submissions to the platform [5].
This massive influx is facilitated by accessible AI music generation tools, primarily Suno and Udio, which allow users to create professional-sounding tracks through text prompts. Nick Arter, a 35-year-old Washington D.C. resident operating under the name Nick Hustles, exemplifies this new generation of AI-assisted musicians. After years of unsuccessful attempts at traditional music careers, Arter found success using AI tools to create R&B and hip-hop tracks, with one song achieving nearly 900,000 Spotify plays [3]. His process involves writing lyrics, crafting detailed prompts specifying year, genre, instrumentation, mood, and emotion, then iterating through dozens of versions until achieving desired results.
The music industry's response to AI-generated content reflects deep ambivalence about the technology's impact. While 70% of survey respondents believe AI-generated music threatens musicians' livelihoods [4], major streaming platforms and record labels are simultaneously embracing AI partnerships. Spotify announced collaborations with Sony, Universal Music Group, and Warner Music to develop "responsible AI products," while Universal Music Group settled its copyright lawsuit with Udio and formed a partnership to create AI products trained exclusively on their catalog.

The proliferation of AI music raises concerns about "Model Autophagy Disorder," a phenomenon where AI systems degrade when trained on their own output [2]. As AI-generated content increasingly dominates training datasets, experts warn of potential quality degradation and model collapse, creating a feedback loop that could ultimately undermine the technology's effectiveness while simultaneously flooding the market with homogenized content.