14 Sources
[1]
Apple Music to add Transparency Tags to distinguish AI music, says report | TechCrunch
Apple Music is changing the way that record labels and distributors can flag AI-generated or AI-assisted content when they upload it to the platform. According to Music Business Worldwide, Apple sent a newsletter to industry partners on Wednesday to explain how it will roll out a new set of metadata to promote transparency around how and when AI is used in music. Metadata typically refers to fields like the song title, album title, genre, artist name, and other information that helps keep files organized. Now, Apple Music will add the option to include metadata tags that distributors can apply to flag when AI-generated content is involved in certain aspects of a song. The tags let distributors flag AI involvement in a song's artwork, track (music), composition (lyrics), or music video. This seems like something that Apple Music users are interested in -- a Reddit user posted a mock-up of a similar feature concept just days ago. But the problem with this sort of opt-in tagging is that it's on the label or distributor to manually choose to flag their use of AI. Spotify is taking a similar path. Other music streaming platforms like Deezer are trying to flag content with in-house AI detection tools, but building such systems to be consistently accurate remains challenging. TechCrunch has reached out to Apple for more information.
[2]
Apple Music Has a New System to Identify AI Generated Content, Report Says
Apple Music has introduced Transparency Tags, a new metadata system that will help identify and label AI-generated content. The announcement reportedly came in a newsletter sent to music industry partners on Wednesday. Distributors and record labels can now use the system to label content that is submitted to the platform. Metadata is the labeling method used to identify, organize and categorize content on the platform and may include basic information such as song title, album title, artwork and genre. Apple Music's new Transparency Tags focus on four key elements: track (the music), composition (the lyrics), artwork and music video. These metadata tags will be added during the submission process, allowing distributors to flag music that contains AI-generated content falling into one of the four categories listed above. The goal of this move is to increase transparency around how and when AI is used in music on the platform. Last year, Spotify made a similar announcement that it would implement stronger AI protections. There is a catch, though. In the newsletter, Apple reportedly said it'll rely on the content providers to decide what qualifies as AI-generated content. Apple Music's Transparency Tag system currently operates on an opt-in basis. In Apple's specs for this update, it says, "If omitted, none is assumed." Basically, record labels and distributors can simply choose not to participate. An Apple representative didn't immediately respond to a request for comment.
[3]
Apple Music adds optional labels for AI songs and visuals
Apple is asking artists and record labels on its music streaming platform to voluntarily label songs that were made using AI. The new "Transparency Tags" metadata system for Apple Music was announced in a newsletter to industry partners yesterday, according to Music Business Worldwide, and covers four categories, including track, composition, artwork, and music videos. The track tag should be applied when "a material portion of a sound recording" has been generated by AI tools, while the composition tag covers other AI-generated compositional elements, such as song lyrics. The artwork tag applies to static or moving graphics, but only at the album level. For all other AI-generated visual content -- whether standalone or bundled with albums -- the music video tag should be applied. Multiple transparency tags can be used simultaneously for works that require more than one of these disclosures. In its newsletter, Apple says its new tags are a "concrete first step" toward achieving industry-wide transparency around AI-generated music, and that labels and distributors "must take an active role in reporting when the content they deliver is created using AI." Apple Music's tagging system follows other efforts from competing music streaming providers to protect authentic artists from spam and impersonation, and help make AI-generated music easier for users to identify. Spotify is developing a new metadata standard for AI music disclosures with DDEX -- a music standards-setting organization that currently lists senior Apple Music exec Nick Williamson as a board member. Deezer also made the AI music detection tool it launched last year available to other platforms in January, while Qobuz introduced its own proprietary AI detection system last week. 
In contrast to Deezer and Qobuz's proactive detection systems, Apple Music's Transparency Tags are entirely optional (for now) and place the responsibility for AI disclosure squarely on record labels and music distributors instead of the platform. Apple even says that determining what qualifies as AI-generated music and visuals will be left to the discretion of content providers, "similar to genres, credits, and other metadata," and that no AI usage will be assumed on works that providers haven't tagged. Honor-system approaches to AI labelling haven't worked out so far on other platforms. Given the lack of enforcement surrounding Apple Music's tagging system, I'm struggling to see why creators and record labels would be motivated to actually use it.
[4]
Apple Music Will Reportedly Require AI Disclosures on Songs
Apple Music users may soon be able to tell whether their new favorite tune is AI-generated. The streamer has launched new metadata "Transparency Tags" that will require distributors and record labels to disclose the use of AI in songs, Music Business Worldwide reports. Metadata refers to the basic information attached to a music file when it is uploaded to a streaming platform. This usually includes details such as the song title, artist name, album title, genre, and other data that users see in the Now Playing tab and elsewhere. With Transparency Tags, Apple Music will have labels and distributors disclose whether they used AI in the artwork, tracks, compositions, or music videos. The tag for each category applies when "a material portion" of the work was generated using AI. For an AI-generated song, they'd enable the track tag, and for AI-generated lyrics, they'd use the composition tag. The catch here, though, is that Apple leaves it to its music partners to decide what qualifies as AI content. "Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI... and we believe labels and distributors must take an active role in reporting when the content they deliver is created using AI," Apple told industry partners this week. In September, Spotify announced a similar feature for labels, distributors, and music partners. These updates come as more AI platforms let users create songs from prompts. With Gemini, for example, users can now generate 30-second audio clips. Notably, AI-generated content still remains ineligible for copyright in the US.
[5]
Apple Music can now flag AI content, but only if distributors elect to label it
While music streaming apps like Bandcamp, Spotify and Deezer have taken steps to inform users about AI-generated content, we haven't heard much out of Apple Music in that regard. However, Apple Music has now introduced "Transparency Tags" designed to show listeners if any elements were generated in whole or part by AI. The catch is that Apple is leaving it up to labels and distributors to create those tags, according to an Apple newsletter to industry partners seen by Music Business Worldwide. "Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI, and we believe labels and distributors must take an active role in reporting when the content they deliver is created using AI," Apple wrote, calling it a concrete first step toward transparency around artificial intelligence. Streaming platforms already use metadata tags for things like song and album titles, genre and the name of the artist. The new tags will now identify any artwork, tracks, compositions and music videos created in whole or in part by AI. However, Apple's new system requires labels and distributors to opt in and manually flag their use of AI, a system that's similar to what Spotify is doing. On top of that, Apple has no apparent enforcement mechanism for AI content. By contrast, other music platforms including Deezer and Bandcamp are using in-house AI-detection tools to flag content whether the distributor opts in or not. Deezer disclosed in January 2026 that it receives over 60,000 fully AI-generated tracks every day, double the number it saw in September 2025. Synthetic content, also called "AI slop," has accounted for 13.4 million tracks on its platform, Deezer added.
[6]
Apple Music introduces metadata tags to disclose AI-generated content - 9to5Mac
Today, Apple introduced new metadata tags that will let record labels and distributors disclose when artificial intelligence was used in the creation of music, artwork, and more. Here are the details. According to an email distributed to Apple Music partners earlier today, the company will now require AI transparency tags for new content delivered to the platform. From the email: The new Transparency Tags will include four tag types (artwork, track, composition, and music video) that correspond to the main creative elements of digital music content. Apple says that the new requirements will apply when "a material portion of the content has been created using AI," and that multiple tags can be applied to the same content. Interestingly, Apple will also leave it to the discretion of each partner "to determine what qualifies as AI content," much like how partners already handle metadata such as genres and credits. The company says that the new tags are a first step to "bring greater transparency about AI-generated content," as the industry adjusts to the new creative possibilities afforded by generative technology. Alongside today's announcement, Apple also updated the Apple Music Package Specification material with further details.
[7]
Apple Music Rolling Out Disclosure Tags for AI-Made Songs
Apple Music is rolling out a new metadata system called Transparency Tags, which indicates when AI has been used in the creation of music hosted on the platform. According to Music Business Worldwide, Apple sent a newsletter to industry partners on Wednesday to explain how it will roll out the new set of metadata. The system covers four categories including artwork, track, composition (lyrics), and music video. Labels and distributors can begin applying the tags immediately. Apple describes the tags as optional for now, noting that if omitted, no AI is assumed. Apple said it defers to content providers to determine what qualifies as AI-generated, and that it treats the tags similarly to genres, credits, and other existing metadata. The company describes it as a first step toward industry-wide transparency around AI-generated music. "Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI," Apple said in the newsletter, "and we believe labels and distributors must take an active role in reporting when the content they deliver is created using AI." Apple's approach contrasts with the route taken by competitors like Deezer, which has built its own detection infrastructure to independently identify AI-generated tracks, though its detection isn't 100% accurate. With Apple's tags, there isn't a visible enforcement or cross-verification process in place. Deezer reports that it receives over 60,000 fully AI-generated tracks per day, with synthetic content now accounting for roughly 39% of all music delivered to the platform. Up to 85% of streams on AI-generated music were fraudulent in 2025, according to Deezer's data. Apple's system is voluntary, or at least it is for now. Whether labels and distributors will actually use it remains to be seen.
[8]
Apple Music will add 'Transparency Tags' to AI-generated tracks, report says
It's getting harder to distinguish AI-generated music on streaming platforms, and companies including Spotify and now Apple Music are attempting to make it clearer. Apple's audio streaming service is adding "Transparency Tags" to content generated by artificial intelligence, according to a report by Music Business Worldwide. In a newsletter sent to industry partners, viewed by the news outlet, the company reportedly has new requirements for metadata when uploading tracks to Apple Music, which would disclose whether the song (or anything related to the making of the song) is AI-generated or not. The metadata tags reportedly cover artwork, track, composition, and music video, and labels must apply tags to each if AI has been used in production. However, as MBW noted, the Transparency Tags appear to be "optional" in Apple Music Specification 5.3.25 and "if omitted, none is assumed," so it will be on Apple Music to enforce the need for such tags. Mashable has reached out to Apple Music for further information. It's tricky territory for music streamers, and for listeners who feel caught unawares by AI-made music, as Mashable's Rachel Thompson asks, "How should we feel about AI-generated songs finding their way into our listener library? Some people aren't necessarily opposed to giving AI music a try, but their open-mindedness begins to shift once they feel deceived." After the whole Velvet Sundown fiasco, audio streaming competitor Spotify started adding AI disclosures through metadata in Sept. 2025, working with the Digital Data Exchange (DDEX) to allow "artists and rights holders a way to clearly indicate where and how AI played a role in the creation of a track -- whether that's AI-generated vocals, instrumentation, or post-production." However, this is separate from Spotify recommending AI-generated music to users, which we've still seen happening in 2026.
In a 2025 study by Deezer and Ipsos, 97 percent of people surveyed couldn't tell an AI-generated song from a human-made one, 80 percent want clear labels for AI music, and 72 percent want to know if they're being recommended it. Honestly, more tags on AI-generated content is welcome news, and that doesn't just apply to music.
[9]
Apple Music is flagging AI slop before Spotify has even started -- but there's a catch
* Apple Music is rolling out its own system for flagging AI-generated music
* 'Transparency Tags' will focus on visual elements as well as music composition
* The catch is that the responsibility lies with record labels and distributors
The war between AI and the best music streaming services continues, and Apple Music could be next to storm the battlefield with its own AI music tagging system. Dubbed 'Transparency Tags', Apple's flagship music platform is tipped to roll out a new metadata system to help identify AI-generated and AI-assisted music. This was revealed in a newsletter sent out to industry partners on March 4, but there's one big caveat that seems quite contradictory -- the tagging responsibility lies with record labels and distributors. Before content is submitted to Apple Music, record labels and distributors will have the option to disclose whether AI has been used during the production process, a method that Apple is implementing to increase transparency on how companies use AI in music production. The system will focus on four creative aspects. The first is the music itself, or the 'Track' tag, which will be used to show if AI has contributed to a material portion of the audio recording. This tag applies at the track level only. On a similar note, the 'Composition' tag will be used to highlight that AI was used in a material portion of the music composition, and also flags AI-generated lyrics. But Apple Music is taking it a step further with Transparency Tags that have a visual focus. As well as audio tags, Apple Music is rolling out an 'Artwork' tag that, as its name implies, flags AI-generated static and motion imagery used on the cover of an album or single. The 'Music Video' tag is similar, applying to visual components bundled with albums as well as standalone videos.
Though the newsletter has been limited to the eyes of industry partners, Music Business Worldwide shared an excerpt from the announcement; "Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI", the newsletter begins. "We believe labels and distributors must take an active role in reporting when the content they deliver is created using AI." Despite the information we have so far, Apple hasn't disclosed when Transparency Tags will roll out, but we've reached out for further comment and will update this story if we find out more. It's not surprising to see that Apple Music is taking initiative before Spotify has even started, but its frustrating loophole is all a bit 'one step forward, two steps back'. It's optional, for now... Putting a responsibility like this on record labels and music distributors welcomes a slew of issues. For one, since it's essentially an opt-in program, it doesn't guarantee 100% transparency as labels can easily choose not to disclose such information, but who can blame them? You could deliver the best song in the world, but having an obtrusive 'AI' tag slapped onto your song isn't the most attractive -- as a passionate music fan, I'd immediately skip a song if it had this. It's also a hindrance to streaming numbers, which labels aren't willing to risk. Even if a song only has a tiny portion of AI-generated audio for creative purposes, most listeners would assume the whole song is AI-generated and skip accordingly. That said, the opt-in system could simply be a temporary placeholder for now, as Music Business Worldwide reports that Transparency Tags may become a required practice for labels and distributors when delivering content to Apple Music. It's a very different approach to the ones its rivals have taken.
Deezer, for example, has developed its own AI detection tool which it recently made available to competitors while Bandcamp has banned AI-generated music altogether. That just leaves Spotify to join the growing army of music streamers against AI, but although it tightened new impersonation rules for music uploads, it has yet to follow suit with a legitimate system that clearly labels AI slop.
[10]
Apple Music will put custom tags for AI songs and visuals, but it's not enough
The streaming service now lets labels label AI-generated tracks and artwork, but there's no one checking their work. Apple Music started rolling out Transparency Tags on March 4, a system for flagging AI-generated music and artwork. Music Business Worldwide reports that labels can now tag content across four categories when they deliver it: tracks, compositions, artwork, and music videos. When a label sends a song to Apple, it can check a box saying AI generated a material portion of the recording or its packaging. Apple says in a partner newsletter that proper tagging is the first step toward giving the industry the data it needs for thoughtful policies. There's just one thing missing: Enforcement. The data problem with trusting labels The timing is uncomfortable. Weeks before Apple's announcement, Deezer dropped numbers that show what happens when AI music meets an honor system. The platform now gets more than 60,000 fully AI-generated tracks every day, roughly 39% of all music delivered. Since early 2025, Deezer has detected over 13.4 million AI tracks total. Recommended Videos Why do those tracks exist? Deezer found that up to 85% of all streams on AI-generated music in 2025 were fraudulent, up from 70% the year before. Those streams get demonetized and removed from the royalty pool. For context, streaming fraud across Deezer's entire catalog was just 8% last year. "We know that the majority of AI-music is uploaded to Deezer with the purpose of committing fraud," said CEO Alexis Lanternier. That's the environment Apple is stepping into. The company is asking people uploading fraudulent content to label it honestly. Why the fraud numbers matter Deezer's data explains why streaming services are rushing to address AI music. They want to put a stop to people who are siphoning money out of the royalty pool. Generate 60,000 tracks in a day, run bots to stream them, and every fake stream is money taken from a human artist. Spotify is watching closely. 
The company announced stronger AI rules last year and is working on industry tagging standards. But its detection infrastructure still lags behind Deezer's. Like Apple, Spotify depends on what labels disclose. What happens when trust isn't enough Deezer is already licensing its detection tech to other companies. French collecting society Sacem is among the first testing the tool, which claims to identify 100% of AI-generated music from models like Suno and Udio. The message is clear: automated detection works. Apple is taking a different road. Its transparency tags defer to labels and distributors to decide what counts as AI. The technical spec says the tags are optional for now and assumes none if omitted. For listeners, the takeaway is simple. You'll start seeing AI tags on Apple Music soon. Just remember who's doing the labeling.
[11]
AppleInsider.com
AI-generated content in the Apple Music app can now be more easily spotted, but only if record labels and distributors actually label it as such. Apple continues to enhance the iPhone's built-in Music application, with Apple Intelligence features such as Playlist Playground set to make their way to end users as part of the iOS 26.4 update. Now, the company has implemented an additional AI-related safeguard, referred to as Transparency Tags. They're essentially disclosure labels that let music distributors and record labels indicate specific content was made with the help of artificial intelligence. As TechCrunch points out, Apple's new tags cover various aspects of any given song. The labels, if present, make it clear whether select artwork, or a specific track, composition, or music video is AI-generated. To be more specific, the Track tag is used to disclose that a material portion of an audio recording was created by artificial intelligence. Similarly, the Composition tag denotes that a substantial portion of the music compositions present within a track is AI-generated. The new Music Video label, meanwhile, applies to visual content, including music videos delivered as standalone content, as well as to those bundled within albums. Apple's new labelling system was highlighted in a newsletter sent to industry partners, which Music Business Worldwide spotted. In it, the iPhone maker explains that its new tags are "similar to genres, credits, and other metadata." "Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI," says Apple's newsletter, "and we believe labels and distributors must take an active role in reporting when the content they deliver is created using AI." Per Apple, its optional AI-disclosure tags offer "a concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone." 
How Apple's Transparency Tags stack up against the competition
The lukewarm language hints at just how limited this new AI-related labeling system may prove in practice. At the time of writing, distributors do not appear to be under any obligation to label AI-generated content. In short, Apple's Transparency Tags are entirely optional, though they are a "first step" in the right direction, even if overdue. Spotify announced similar AI-disclosure labels back in September 2025, along with a policy that enables the removal of AI voice impersonations resembling real-world artists. Other music streaming platforms, meanwhile, have opted for a stricter, more decisive approach. Rather than relying solely on the willingness of distributors to disclose their use of AI, Deezer has already implemented an automated AI-detection system, and it's been available for well over a year. Deezer claims that it receives over 60,000 AI-generated songs each day, and its AI-catching utility has reportedly identified more than 13.4 million AI-generated songs. Deezer CEO Alexis Lanternier says most AI-music uploaded to the platform is used to commit fraud, and explains that the company plans to take action. This entails licensing the AI-detection tool for use in the wider music industry, among other things. Going back to Apple, the iPhone maker has made clear efforts to label AI-generated content created with its Image Playground app. The company could choose to implement more rigorous requirements regarding the use of AI in Apple Music. Alternatively, Apple also has the option to implement a supplementary AI-impersonation policy, as Spotify did. For now, though, the current AI-labeling system leaves AI disclosure entirely in the hands of distributors and record labels.
[12]
Apple Music adds transparency tags for AI-generated content
Apple Music will add metadata tags to distinguish AI-generated or AI-assisted content. The new metadata options aim to increase transparency regarding the use of artificial intelligence in music content uploaded to the platform. This development could impact how record labels and distributors categorize their releases. The changes were communicated to industry partners in a newsletter on Wednesday, according to Music Business Worldwide. Distributors will be able to use the new metadata tags to indicate AI involvement in specific components of a song. These components include artwork, music tracks, lyrical compositions, or music videos. The system relies on labels and distributors to manually opt-in and flag their use of AI. Spotify is implementing a similar approach. Other streaming services, such as Deezer, are attempting to use in-house AI-detection tools, but accuracy remains a challenge.
[13]
Who Made This Song? Platforms Start to Label AI Music | PYMNTS.com
The track, "Walk My Walk," was released under the name Breaking Rust, a fictional artist with an AI-generated cowboy persona, generic lyrics about perseverance, and an Instagram page that never disclosed its synthetic origins. It has become increasingly difficult to tell who or what was powered by AI, and to what extent. Apple Music moved this month to address that gap directly. According to a Wednesday (March 4) TechCrunch report, Apple sent a newsletter to industry partners announcing Transparency Tags, a metadata framework covering four content categories: Track, Composition, Artwork and Music Video. Labels and distributors can apply the tags immediately, with Apple noting that the requirements will eventually become mandatory for new content. The technical barriers to creating and distributing AI-generated music have effectively collapsed. Tools such as Suno and Udio allow users to generate full tracks from text prompts, with outputs that can be indistinguishable from human recordings to the untrained ear. Those tracks can be uploaded through standard music distribution services, the same aggregators labels and independent artists use, and appear in streaming catalogs alongside human-made work, carrying identical metadata fields and monetization rights. Breaking Rust attracted over 2 million monthly listeners on Spotify, where it was listed as a verified artist despite not having a biography. Several of its songs were played more than a million times, and one single exceeded 4.5 million streams. The creator never publicly identified themselves. Deezer, which has invested in its own AI detection infrastructure, reported in January that it is receiving over 60,000 fully AI-generated tracks on a daily basis, up from 10,000 when it first deployed its detection tool in early 2025. Synthetic content now makes up roughly 39% of all music delivered to the platform daily, according to a Music Business Worldwide report.
More significantly, Deezer found that up to 85% of streams on AI-generated music were fraudulent in 2025, used to game royalty payouts rather than reflect genuine listener demand. The transparency problem and the fraud problem are, in practice, the same problem. Apple framed Transparency Tags as "a concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone." The limitation is self-evident: Apple is asking the parties responsible for uploading synthetic content to voluntarily declare it. The technical specification describes the tags as optional for now, and if omitted, none is assumed. There is no independent verification, no detection layer and no enforcement mechanism for labels or distributors that choose not to disclose. Spotify's stance covers similar ground through a different rhetorical frame. Co-CEO Gustav Söderström said at the company's Feb. 10 Q4 earnings call that the platform should not police creative tools: "Spotify should not decide what kind of tools you are allowed to use. Are you allowed to use an electric guitar, a synthesizer, a digital audio workstation? Or AI or -- more complicated question -- a bit of AI. ... I do not think it is our decision to make." But Söderström acknowledged the listener demand for clarity: "What we do think is that consumers would like to know and understand what tools were used in the creation of their music. We've been working with the industry to allow creators and labels uploading music to put in the metadata how it was created so that we can surface this to users." The music industry's disclosure debate is one instance of a broader platform challenge. Music is a measurable case because streaming platforms have royalty data that quantifies the consequences of synthetic content at scale. The same structural problem applies to video and images, where detection is harder and the stakes for trust are arguably higher. 
Meta launched Vibes in September, a dedicated feed within the Meta AI app for short-form AI-generated video, allowing users to create, remix and share synthetic content. The choice to create a feed reflects a different philosophy than Apple's labeling approach: rather than surfacing disclosure within a shared content environment, Meta is routing AI-generated content into a separate container. Both approaches are experiments. Neither has been tested at the scale the problem demands.
[14]
Apple Music Introduces Transparency Tags for AI-Generated Music
Apple Music has launched Transparency Tags - a framework of disclosure labels that music distributors as well as record labels can begin incorporating for AI-created content that they deliver on the music streaming platform, a news report by Music Business Worldwide says. Apple reportedly announced this move via a newsletter sent out to its industry partners on March 4, 2026. In this newsletter, the Tim Cook-led company remarks that it is leaving it to content providers to determine what exactly qualifies as AI-generated content, which is "similar to genres, credits, and other metadata". Apple further says that these tagging requirements provide a "concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone." "Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI," the newsletter reads. "We believe labels and distributors must take an active role in reporting when the content they deliver is created using AI," it adds. The new metadata requirements framework on Apple Music encompasses four essential creative elements: artwork, tracks, compositions, and music videos. To explain, the artwork tag applies at the album level and flags when AI has been used to create a 'material portion' of either static or motion graphic artwork. Elsewhere, the track tag comes into play when AI generates a 'material portion' of a sound recording. Similarly, the composition tag applies to AI-created lyrics or other compositional elements, while the music video tag covers any kind of visual content - whether bundled with albums or delivered standalone - that AI helps in generating. One important factor within this announcement is the level of agency granted to music distributors and record labels, as they can take an independent call as of now about tagging any AI-created content on Apple Music. 
Thus, identifying AI-created music on Apple Music rests solely with labels and distributors; there is no platform-level detection at the time of writing.

This approach stands in stark contrast to what Paris-based music streaming platform Deezer is doing. Deezer has spent roughly the past year building in-house AI detection infrastructure that identifies AI-created content through technical analysis, rather than relying on self-reporting by labels or artists. The platform revealed in November last year that around 50,000 fully AI-created tracks are uploaded to Deezer every day, accounting for nearly 34% of total daily deliveries. That disclosure came as part of a wider survey on perceptions of and attitudes towards AI-generated music, which pointed clearly to a desire to tag 100% AI-generated music and to ensure that artists and songwriters receive fair treatment and remuneration when their music is used to train AI models. Deezer remains the only music streaming platform to clearly label 100% AI-generated music for its listeners, and it claims its AI music detection tool can identify fully AI-generated tracks even from the most prolific generative AI models, such as Suno and Udio.

Spotify, meanwhile, has partnered with the likes of Sony Music Group and Warner Music Group to build "artist-first" AI products, shifting from policing misuse to developing licensed AI tools that prioritise consent, credit, and fair compensation for artists. To that end, Spotify has set up a generative AI research lab, which combines the company's own research with input from artists, producers, and songwriters. Co-presidents Alex Norström and Gustav Söderström said Spotify's goal is to ensure that technology supports artists rather than competing with them.
Back in October 2025, Sony Music's Rob Stringer said that direct licensing before product launches is the right approach to AI development. Meanwhile, Universal Music's Lucian Grainge remarked that Spotify's initiative builds on ongoing efforts to keep artists at the centre of AI conversations.
Apple Music has launched Transparency Tags to help users identify AI-generated content across tracks, lyrics, artwork, and music videos. The new metadata system lets record labels and distributors flag AI usage when uploading content. However, the opt-in approach leaves enforcement entirely to content providers, raising questions about whether labels will actually use the tags.
Apple Music has introduced a new metadata system called Transparency Tags designed to help users identify AI music on the platform. According to Music Business Worldwide [1], Apple sent a newsletter to industry partners on Wednesday explaining how record labels and distributors can now flag AI-generated content when uploading music. The metadata tags cover four key categories: track (the music itself), composition (lyrics), artwork, and music videos [2].
The track tag applies when "a material portion of a sound recording" has been generated by AI tools, while the composition tag covers AI-generated compositional elements such as song lyrics [3]. The artwork tag applies to static or moving graphics at the album level, and the music video tag covers all other AI-generated visual content. Multiple Transparency Tags can be used simultaneously for works requiring more than one disclosure.

The critical limitation of Apple Music's approach is that the system operates entirely on an opt-in basis, placing responsibility for AI disclosures squarely on record labels and distributors rather than the platform itself. In its specifications for the update, Apple states, "If omitted, none is assumed," meaning content providers can simply choose not to participate [2]. Apple has also left it to the discretion of content providers to determine what qualifies as AI-generated content, "similar to genres, credits, and other metadata" [3].
This voluntary labeling approach mirrors what Spotify is implementing, as the competing platform develops a new metadata standard for AI music disclosures with DDEX, a music standards-setting organization [3]. However, without enforcement mechanisms, the effectiveness of such systems remains uncertain. As one report noted, "Honesty policies for other AI labelling solutions haven't worked out so far" [3].
While Apple Music relies on self-reporting, other music streaming platforms have adopted more proactive approaches to identifying AI music. Deezer made its AI detection tool available to other platforms in January after launching it last year, while Qobuz introduced its own proprietary AI detection system last week [3]. These in-house AI-detection tools flag content whether the distributor opts in or not [5].
The scale of the challenge is significant. Deezer disclosed in January 2026 that it receives over 60,000 fully AI-generated tracks every day, double the number it saw in September 2025. Synthetic content has accounted for 13.4 million tracks on its platform [5]. Yet creating maximally accurate AI detection systems remains challenging for the music industry [1].
Apple described the new metadata tags as "a concrete first step" toward achieving industry-wide transparency around AI-generated music, stating that "labels and distributors must take an active role in reporting when the content they deliver is created using AI" [3]. This push for transparency comes as more AI platforms let users create songs from prompts, with tools like Gemini now enabling users to generate 30-second audio clips [4].
The timing matters for copyright considerations as well. AI-generated content remains ineligible for copyright in the US [4], creating legal complexities for the music industry as AI tools become more sophisticated. For artists concerned about impersonation and spam, these transparency efforts represent attempts by streaming platforms to protect authentic creators, though the lack of mandatory enforcement may limit their effectiveness in the short term. The music industry will be watching to see whether voluntary AI disclosures gain traction, or whether platforms will need to implement stricter requirements to give listeners the data they need.