16 Sources
[1]
SoundCloud changes policies to allow AI training on user content | TechCrunch
SoundCloud appears to have quietly changed its terms of use to allow the company to train AI on audio that users upload to its platform. As spotted by tech ethicist Ed Newton-Rex, the latest version of SoundCloud's terms includes a provision giving the platform permission to use uploaded content to "inform, train, [or] develop" AI. "You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services," read the terms, which were last updated February 7. The terms have a carve-out for content under "separate agreements" with third-party rightsholders, such as record labels. SoundCloud has a number of licensing agreements with indie labels as well as major music publishers, including Universal Music and Warner Music Group. TechCrunch wasn't able to find an explicit opt-out option in the platform's settings menu on the web. SoundCloud didn't immediately respond to a request for comment. SoundCloud, like many large creator platforms, is increasingly embracing AI. Last year, SoundCloud partnered with nearly a dozen vendors to bring AI-powered tools for remixing, generating vocals, and creating custom samples to its platform. In a blog post last fall, SoundCloud said that these partners would receive access to content ID solutions to "ensure rights holders [sic] receive proper credit and compensation," and it pledged to "uphold ethical and transparent AI practices that respect creators' rights." A number of content hosting and social media platforms have changed their policies in recent months to allow for first- and third-party AI training. In October, Elon Musk's X updated its privacy policy to let outside companies train AI on user posts. Last September, LinkedIn amended its terms to allow it to scrape user data for training. And in December, YouTube began letting third parties train AI on user clips. Many of these moves have prompted backlash from users who argue that AI training policies should be opt-in as opposed to opt-out, and that they should be credited and paid for their contributions to AI training data sets.
[2]
SoundCloud backtracks on AI-related terms of use updates | TechCrunch
SoundCloud says it's revising its terms after widespread backlash over a clause related to AI model training. Earlier this year, SoundCloud quietly updated its usage policies, adding wording that many users interpreted as legal cover to allow the company to train AI on audio uploaded to its platform. SoundCloud was quick to assert that it wasn't developing AI by using its users' content, but the company's PR statement didn't allay fears that SoundCloud might do so in the future. On Wednesday, SoundCloud CEO Eliah Seton published an open letter admitting that the wording of the changes to the company's terms "was too broad and wasn't clear enough." The updates were intended to focus on other uses of AI internally at the company, Seton said -- including recommendations and tools to help prevent fraud -- but missed the mark. SoundCloud has now revised its terms "to make it absolutely clear [that] SoundCloud will not use [user] content to train generative AI models that aim to replicate or synthesize [a] voice, music, or likeness," said Seton.
[3]
SoundCloud says it isn't using your music to train generative AI tools
The music-sharing platform SoundCloud quietly updated its terms of use in February last year, adding language that lets it train AI models on its users' content, as TechCrunch reported. And while the company says it hasn't used user-created content for model training, it doesn't rule out the possibility that it will in the future. Marni Greenberg, SVP and head of communications at SoundCloud, provided the following in a statement emailed to The Verge: "SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes. In fact, we implemented technical safeguards, including a 'no AI' tag on our site to explicitly prohibit unauthorized use." Greenberg went on to say that SoundCloud's terms of service update "was intended to clarify how content may interact with AI technologies within SoundCloud's own platform." She said the company uses AI for things like personalized recommendations and fraud detection, and suggested its plans for future uses of AI on its platform fall along similar lines. When we asked about letting users opt out of having their music used for generative AI development, here's what Greenberg had to say: "The TOS explicitly prohibits the use of licensed content, such as music from major labels, for training any AI models, including generative AI. For other types of content uploaded to SoundCloud, the TOS allows for the possibility of AI-related use. Importantly, no such use has taken place to date, and SoundCloud will introduce robust internal permissioning controls to govern any potential future use. Should we ever consider using user content to train generative AI models, we would introduce clear opt-out mechanisms in advance -- at a minimum -- and remain committed to transparency with our creator community." Hopefully SoundCloud will go to greater lengths to tell users about those opt-out mechanisms than it appears to have done for last year's AI-related terms of use update. Tech ethicist Ed Newton-Rex, who spotted the changes reported by TechCrunch, posted that he "can't see any emails" alerting him that the terms had been altered. I've contributed to SoundCloud, too, and also didn't find any emails about the changes when I checked. SoundCloud's terms say it will provide "prominent notice" of significant alterations, but they don't guarantee you'll see that in an email.
[4]
SoundCloud changes its TOS again after an AI uproar
Music-sharing platform SoundCloud is saying it "has never used artist content to train AI models," and that it's "making a formal commitment that any use of AI on SoundCloud will be based on consent, transparency, and artist control." The update comes several days after artists reported that changes made last year to its terms of use could mean it reserved the right to use their music and other content to train generative AI tools. "The language in the Terms of Use was too broad and wasn't clear enough. It created confusion, and that's on us," writes SoundCloud CEO Eliah Seton. The terms that SoundCloud is currently using were updated in February last year with text including this passage: "In the absence of a separate agreement that states otherwise, You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." But Seton says that "in the coming weeks," that line will be replaced with this: "We will not use Your Content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism." Seton reiterates that SoundCloud has never used member content to train AI, including large language models, for music creation or to mimic or replace members' work. And, echoing what a SoundCloud spokesperson told The Verge in an email over the weekend, Seton says if the company does use generative AI, it "may make this opportunity available to our human artists with their explicit consent, via an opt-in mechanism." Ed Newton-Rex, the tech ethicist who first discovered the change, isn't satisfied with the revisions. In an X post, he says the tweaked language could still allow for "models trained on your work that might not directly replicate your style but that still compete with you in the market." According to Newton-Rex, "If they actually want to address concerns, the change required is simple. It should just read 'We will not use Your Content to train generative AI models without your explicit consent.'"
[5]
SoundCloud says it's never trained AI using artists' work after getting called out for terms of use change
A 2024 update that slid under the radar says content 'may be used to inform, train, develop or serve as input to' AI. Following backlash about a quietly added clause to SoundCloud's Terms of Use that says users' content may be fed to AI, the company says it's "never used artist content to train AI models," and insists it "has always been and will remain artist-first." The outrage came after tech ethicist Ed Newton-Rex (via TechCrunch) spotted a change to SoundCloud's terms that was made in February 2024 seemingly without notifying users. The updated text states that by using the platform, "You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." In a statement to TechCrunch, a spokesperson said the update was only meant to "clarify how content may interact with AI technologies within SoundCloud's own platform" and that the company "has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes." SoundCloud's official Reddit account posted similar statements on the social media platform in response to users' outrage, and both noted that SoundCloud added a "no AI" tag for artists "to explicitly prohibit unauthorized use." AI may be used for things like music recommendations, playlist creation and fraud detection, the company said. "Any future AI tools will be built for artists to enhance discovery, protect rights, and expand opportunities," SoundCloud posted on Reddit. "We hear your concerns and remain committed to transparency, artist control, and fair use." Just a few months ago, though, SoundCloud introduced a suite of AI tools geared toward music creation, on top of three others it had announced earlier that year. That includes AI tools for generating remixes, new tracks, beats and singing voices.
[6]
SoundCloud backtracks on 'too broad' AI terms of service
SoundCloud is updating its Terms of Use again after angering users with language around AI that even the company now describes as "too broad." The details of the change were shared in an open letter from SoundCloud CEO Eliah Seton affirming the company's commitment to artists. Specifically, SoundCloud's Terms of Use now forbids the company from using content uploaded to SoundCloud to train generative AI that replicates an artist without their consent. As it's phrased in the new terms SoundCloud is rolling out in the next few weeks: "We will not use Your Content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism." Seton also reiterated that SoundCloud has never used "artist content" to train AI. "Not for music creation. Not for large language models. Not for anything that tries to mimic or replace your work," Seton writes. The conflict over SoundCloud's approach to AI started when users noticed that the company had updated its Terms of Use in February 2024 to allow SoundCloud to use content to "inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." SoundCloud claims that it would only ever use artists' content to train AI music discovery tools; the company acquired Musiio in 2022 with that exact idea in mind. Still, it's understandable that SoundCloud users would be sensitive to any AI-related changes the company makes. As with most other content stored online, music has been scraped by AI companies without artists' consent, and those companies have been fairly open about it.
[7]
SoundCloud latest company to hit trouble with AI clause in T&Cs - 9to5Mac
SoundCloud is the latest company to run into trouble after quietly adding an AI clause to its terms and conditions. The clause appeared to allow the company to use subscriber work to train AI models ... A number of companies have been called out in the past for this, with Adobe one of the most high-profile examples back in June of last year. A change to Adobe terms & conditions for apps like Photoshop has outraged many professional users, concerned that the company is claiming the right to access their content, use it freely, and even sub-licence it to others. The company is requiring users to agree to the new terms in order to continue using their Adobe apps, locking them out until they do so. The company initially dismissed the controversy, but was later forced to issue a better explanation. Tech commenter Ed Newton-Rex spotted that SoundCloud had quietly added a similar clause to its own T&Cs without any fanfare: "You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." TechCrunch obtained a statement in which the company explained the reasons for the clause: "SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes. In fact, we implemented technical safeguards, including a 'no AI' tag on our site to explicitly prohibit unauthorized use [...] Any future application of AI at SoundCloud will be designed to support human artists, enhancing the tools, capabilities, reach, and opportunities available to them on our platform. Examples include improving music recommendations, generating playlists, organizing content, and detecting fraudulent activity. These efforts are aligned with existing licensing agreements and ethical standards. Tools like [those from our partner] Musiio are strictly used to power artist discovery and content organization, not to train generative AI models." As with Adobe, the wording is ambiguous to say the least. If companies don't want to have their users up in arms, they'd be well advised to explicitly state in the T&Cs what they will and won't do.
[8]
Soundcloud changed its AI policy so it can train on users' audio
If you don't want AI to scrape your music to learn, then it might be time to leave SoundCloud. The music streaming platform quietly updated its terms of service sometime last year to allow AI to train on audio uploaded to SoundCloud, TechCrunch reported this week. "You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services," the terms read, via TechCrunch. Not long after the TechCrunch report was published, SoundCloud clarified that it does not currently use audio uploaded by users to train AI. However, the company did not rule out doing so in the future. "SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes," Marni Greenberg, SVP and head of communications at SoundCloud, told The Verge. "In fact, we implemented technical safeguards, including a 'no AI' tag on our site to explicitly prohibit unauthorized use." Greenberg further noted that music from major labels would be exempt from any AI training and confirmed that other users would have the opportunity to opt out of any such usage. Greenberg told The Verge: "The [terms of service] explicitly prohibits the use of licensed content, such as music from major labels, for training any AI models, including generative AI. For other types of content uploaded to SoundCloud, the TOS allows for the possibility of AI-related use. Importantly, no such use has taken place to date, and SoundCloud will introduce robust internal permissioning controls to govern any potential future use. Should we ever consider using user content to train generative AI models, we would introduce clear opt-out mechanisms in advance -- at a minimum -- and remain committed to transparency with our creator community." So as of right now, if you're uploading music, podcasts, or other audio to SoundCloud, it is not using it to train AI. But it seems SoundCloud is preparing for the day it will.
[9]
SoundCloud Quietly Forced Artists to Let AI Feast on Their Music
SoundCloud -- a music sharing platform once so beloved by artists it spawned sub-genres named after it -- updated its terms of service (TOS), forcing artists who use SoundCloud to let their music train AI. It looks like the change went into effect during SoundCloud's last TOS policy update in February 2024, and it's just coming to light now. The updated terms read that "in the absence of a separate agreement that states otherwise," creators who upload content to the site "explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." In short, the language in the update suggests that musicians, artists, and other creators who use SoundCloud for distribution can expect their uploaded work to be funneled into AI models as training data. It's a sweeping provision. And how SoundCloud is actually applying it isn't exactly clear. But since early 2024, SoundCloud has integrated several generative AI tools and services into its platform that largely center on creating or producing new music. These changes started in late January 2024 (just before the TOS update), with the integration of three "assistive AI" products designed to help SoundCloud's creators. According to a press release, these tools are designed to "allow artists to upload songs created with assistive AI tools directly to SoundCloud." In November of that year, following the TOS update, SoundCloud introduced several more "assistive" AI integrations. These tools are similarly targeted at music generation and production tasks, and in a press release, SoundCloud lauded them as a way to "democratize music creation for all artists." Those services include products like Tuney, which "enables remixing, editing, and new track generation"; AIBeatz, a synthetic beat maker that allows users to "generate and customize [their] own beats like a Pro"; and Starmony, which SoundCloud says can help publishers "quickly produce high-quality tracks" and "distribute them across multiple platforms" while retaining a "large share of royalties," among others. During the November round of AI integrations, SoundCloud said in a press release it would double down on treating musicians better, per their "commitment to responsible, innovative and ethical use of creative AI tools" by partnering with services designed to "provide all of our existing and new AI partners access to content identification solutions," which SoundCloud said would ensure "rights holders receive proper credit and compensation." The company also entered into AI for Music's non-binding "Principles for Music Creation with AI" pledge, which, according to the AI for Music website, marks a commitment to the "vital contributions of human creativity and to the responsible development and application of artificial intelligence for music creation." Pledges notwithstanding, not all SoundCloud users appeared to be aware of the AI training update. We caught wind of the update earlier today after The Flight, a musical duo who have composed music for popular films and videogames, posted about the terms change on Bluesky. "Ok then..." read the post, which linked to SoundCloud's terms webpage, "deleted all our songs that we uploaded to Soundcloud and now closing account." "YIKES," read a following Bluesky post from the composer Adam Humphreys, "@soundcloud.dev no thank you."
"SoundCloud seems to claim the right to train on people's uploaded music in their terms," Ed Newton-Rex, a composer and the CEO of the nonprofit Fairly Trained, said on X-formerly-Twitter. "I think they have major questions to answer over this." We reached out to SoundCloud to ask about the update and what it means for artists' work, but have yet to hear back at the time of publishing. We'll update this story as it develops, and if SoundCloud responds. SoundCloud emerged in the late 2000s as an industry-shifting digital platform that allowed artists -- especially emerging and independent artists -- to upload and distribute their work at a remarkably low cost without relying on record labels and other guarded industry gatekeepers. This dynamic has also historically made SoundCloud an incredible venue for music discovery, and altogether, its place as an accessible online stage for emerging acts has cemented it as an important mainstay in the music industry landscape. But like countless other digital platforms in the AI era, SoundCloud seems to understand that the vast piles of data it's collected over its many years of operation are more valuable than ever. And as tensions between musicians, among other creative professionals, and the digital companies that platform them continue to rise over the practical and ethical implications of AI, pushback from users who didn't expect to see their creative work quietly vacuumed into AI models is anything but unexpected.
[10]
SoundCloud Backtracks on AI Plans After Artist Outrage
SoundCloud has altered its platform policies to require opt-ins for training generative AI models with artists' music following widespread user backlash, the company announced today in a letter from its CEO. On Friday, Futurism broke the story that SoundCloud had quietly updated its Terms of Use (TOU) in February 2024 with language allowing it to train AI using users' uploaded content, which could include uploaded music. The updated terms -- which were flagged by users on Bluesky and X (formerly Twitter) -- included some exceptions to account for music and other content licensed under separate agreements with third parties. But the AI provision was overall extremely broad, and could feasibly grant the music-sharing site the right to funnel much of its vast content library into generative AI models as training material, whether now or in the future. Though the change was made back in February 2024, site users seemed largely unaware of it. Artists responded with rage and frustration, taking to social media to express their anger at the company and, in many cases, claiming they'd deleted and scrubbed their accounts. In response to the mess, SoundCloud issued a lengthy statement clarifying that, despite the provision's sweeping language, it hadn't used artists' music to train AI models. That included generative AI tools like large language models (LLMs) and music generation tools, according to SoundCloud. Now, it looks like SoundCloud is doubling down on those promises -- and changing its policies. In the letter released today, SoundCloud CEO Eliah Seton conceded that SoundCloud's language around AI training was "too broad." To rectify that, said Seton, the company revised its user terms, which now bar SoundCloud from using artists' music to "train generative AI models that aim to replicate or synthesize your voice, music, or likeness" without the explicit consent of artists. The new clause adds that should SoundCloud seek to use its artists' music to train generative AI, it would have to earn that consent through opt-in mechanisms -- as opposed to opt-outs, which are notoriously slippery. Seton also reiterated SoundCloud's commitment to blocking third parties from scraping SoundCloud for AI training data, and characterized the changes as a "formal commitment that any use of AI on SoundCloud will be based on consent, transparency, and artist control." According to Seton, the initial AI policy change was a reflection of SoundCloud's internal use of AI for features like music discovery algorithms and Pro features, fraud detection, customer service, and platform personalization, among other features. SoundCloud also uses AI to target opted-in users with advertisements based on their perceived mood. It also allows users to upload AI-generated music, and boasts a slew of partnerships with platform-integrated AI music and generation tools. If there's any moral here, it's that language matters, as do the voices of the artists who power creative platforms -- especially in an era where data-hungry AI models and the companies that make them are looking to suck up valuable human-made content wherever they can. Seton, for his part, promised that SoundCloud would "keep showing up with transparency." "We're going to keep listening. And we're going to make sure you're informed and involved every step of the way," reads the letter. "Thanks for being a part of the SoundCloud community and for holding us accountable to the values we all share."
[11]
SoundCloud Just Updated Their Terms of Service After AI Policy Backlash
You might be having a bad week, but AI is having a worse one. First there was the "racism glitch" that beset Grok, and now music platform SoundCloud is facing some serious criticism for a clause buried in its terms of service. The imbroglio began in February 2024, when SoundCloud quietly changed its TOS to include: In the absence of a separate agreement that states otherwise, You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services. That quietly sat there in the TOS for more than a year, but this week, Ed Newton-Rex noticed the change and posted about it on his X account. The response was immediate and fiery, with many musicians and SoundCloud users decrying the use of their music to train AI. But some of that user ire seems misplaced; it's a more nuanced situation than it seems at first. It's easy to see why musicians wouldn't want their art used to train machines designed to replace them, but, according to SoundCloud, the TOS change was never about that. The company's president, Eliah Seton, issued an open letter on Wednesday explaining that they were using AI for "powering smarter recommendations, search, playlisting, content tagging, and tools that help prevent fraud" but the company has never "used artist content to train AI models. Not for music creation. Not for large language models. Not for anything that tries to mimic or replace your work." According to Seton, it's basically been a misunderstanding. "The language in the Terms of Use was too broad and wasn't clear enough. It created confusion, and that's on us," Seton wrote. SoundCloud may have cleared up how it has used AI in the past, but company reps waffled on what it plans to do with your music in the future. The initial response to the controversy, delivered in a statement to The Verge from Marni Greenberg, SVP and head of communications at SoundCloud, explained, "Should we ever consider using user content to train generative AI models, we would introduce clear opt-out mechanisms in advance." The community responded with, "shouldn't that read 'opt-in?'" "Yeah, opt-in. Sounds great," SoundCloud responded, eventually. In his open letter, SoundCloud CEO Seton got specific about planned changes to the service's terms of service. The offensive AI section will be replaced with: We will not use Your Content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism. So AI won't be used for replication or synthesis of users' music unless they opt in. Presumably, SoundCloud will continue to use AI for recommendations, tagging, and playlisting, a much more benign and generally accepted use of the technology. A couple of lessons from this flare-up: One, dealing with AI requires companies to be crystal clear with their users about how AI will be employed. Two, we should all read the TOS.
[12]
SoundCloud faces backlash after adding an AI training clause in its user terms
SoundCloud is facing backlash after creators took to social media to complain upon discovering that the music-sharing platform uses uploaded music to train its AI systems. According to SoundCloud's terms of use, unless a separate agreement states otherwise, users "explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." These terms appear to have been added to SoundCloud's website in February 2024. Futurism was the first to report on artists' concerns. Musical duo The Flight brought attention to the terms this week, alerting fellow creators. "ok then... deleted all our songs that we uploaded to Soundcloud and now closing account," the duo posted on Bluesky. Another user replied, "Thanks for the heads up. I just deleted it my account."
[13]
SoundCloud CEO admits AI terms weren't clear enough, issues new pledge
SoundCloud is revising its terms of use after facing widespread backlash over a clause that many users believed allowed the company to train AI models on user-uploaded content. The controversy began earlier this year when SoundCloud quietly updated its usage policies, sparking concerns among users. SoundCloud's CEO, Eliah Seton, acknowledged in an open letter on Wednesday that the updated terms were "too broad and weren't clear enough." Seton explained that the intended focus of the updates was on internal AI uses, such as enhancing recommendations and preventing fraud, but the wording failed to convey this clearly. The revised terms now explicitly state that SoundCloud will not use user content to train generative AI models that replicate or synthesize a user's voice, music, or likeness, according to Seton. This change aims to address user concerns and provide clearer guidelines on the company's AI practices.
[14]
SoundCloud Says It "Has Never Used Artist Content to Train AI Models" After Backlash on Terms of Service Change
SoundCloud has issued a statement clarifying that it isn't using artists' content for generative AI music, following considerable backlash over a change some users noticed in the platform's terms of service. In a statement issued to The Hollywood Reporter on Friday, a spokesperson for SoundCloud said the platform "has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes." "In fact, we implemented technical safeguards, including a 'no AI' tag on our site to explicitly prohibit unauthorized use," the spokesperson said. The company said that "SoundCloud has always been and will remain artist-first," adding that it believes AI can be a helpful creative tool for artists "especially when guided by principles of consent, attribution and fair compensation." SoundCloud's statement comes following vocal criticism from some musicians and music industry advocates online after the platform's updated terms of service had begun to make the rounds. The update said that users "explicitly agree that your content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." SoundCloud confirmed the terms of service had been updated in February 2024. SoundCloud said it updated the TOS "to clarify how content may interact with AI technologies within SoundCloud's own platform." Those uses, SoundCloud said, "include personalized recommendations, content organization, fraud detection and improvements to content identification with the help of AI Technologies." AI music generation is among the most hot-button issues in the industry; the major record labels sued prominent AI music generation platforms Suno and Udio last year on allegations of massive copyright infringement. That suit is ongoing. SoundCloud said on Friday that "any application of AI at SoundCloud will be designed to support human artists, enhancing the tools, capabilities, reach and opportunities available to them on our platform." "Examples include improving music recommendations, generating playlists, organizing content and detecting fraudulent activity," SoundCloud said. "These efforts are aligned with existing licensing agreements and ethical standards." Those services would align with some of the uses the company lists on the website for Musiio, an AI platform SoundCloud purchased in 2022. In SoundCloud's statement, the company said that "tools like Musiio are strictly used to power artist discovery and content organization, not to train generative AI models." The uses SoundCloud lists are less controversial than music generation, though several musicians began sharing posts Friday encouraging other artists to consider taking their music off the platform. Among the advocates voicing their concern Friday was Ed Newton-Rex, the founder of Fairly Trained, a non-profit that calls for AI companies to train their models ethically and with permission from the original content creators. Newton-Rex told THR Friday that he was still concerned in light of SoundCloud's clarification, noting that the statement "doesn't actually rule out SoundCloud training generative AI models on their users' music in future." "This is particularly worrying because the terms of service clearly allow it," Newton-Rex said.
"I think it's important they rule this out and update their terms accordingly. Otherwise I for one will be removing my music." (SoundCloud didn't immediately respond to request for comment on the consideration of any such carveout.) "We understand the concerns raised and remain committed to open dialogue," SoundCloud said. "Artists will continue to have control over their work, and we'll keep our community informed every step of the way as we explore innovation and apply AI technologies responsibly, especially as legal and commercial frameworks continue to evolve."
[15]
SoundCloud Updates Terms of Service Following Backlash to AI Policy
In an open letter published Wednesday, SoundCloud CEO Eliah Seton wrote that the company "has never used artist content to train AI models," echoing a statement a SoundCloud representative shared with The Hollywood Reporter last week. "Not for music creation. Not for large language models. Not for anything that tries to mimic or replace your work. Period," Seton wrote. "We don't build generative AI tools, and we don't allow third parties to scrape or use artist content from SoundCloud to train them either." The backlash comes from a February 2024 update to SoundCloud's terms of service to say that users "explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." That update started making the rounds last week, drawing the ire of musicians wary their content would be used to train generative AI models. Similar to SoundCloud's initial statement last week, Seton said in the letter that the platform updated its terms of service "to clarify how we may use AI internally to improve the platform for both artists and fans," citing functions like improved search, playlisting and content recommendations. Seton said Wednesday, however, that the language in that update was "too broad and wasn't clear enough." "It created confusion, and that's on us," Seton wrote. "That's why we're fixing it." Per the letter, SoundCloud is making another update to the terms of service saying "we will not use Your Content to train generative AI models that aim to replicate or synthesize your voice, music, or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism." The old language will be stricken. Seton said that "if there is an opportunity to use generative AI for the benefit of our human artists, we may make this opportunity available to our human artists with their explicit consent, via an opt-in mechanism." "Our position is simple: AI should support artists, not replace them. Any use of these tools on SoundCloud will continue to reflect that," Seton said. "AI is going to be a part of the changing landscape of music. It brings new opportunities, but also very real challenges. That's why our approach will always be guided by a single principle: artist-first." SoundCloud's move comes as AI remains one of the most contentious issues in the music and entertainment industries, as underscored by the concern following the ouster of U.S. Register of Copyrights Shira Perlmutter over the weekend. While the change addresses some critics' concerns, it hasn't seemed to appease all of them. Ed Newton-Rex, the founder of the nonprofit music advocacy group Fairly Trained (who said last week he'd be removing his music from the platform) tweeted Wednesday that the update "doesn't go nearly far enough." "Their new terms will say they won't train gen AI models that replicate your voice / style. But they leave the door open to the much more likely gen AI training: models trained on your work that might not directly replicate your style but that still compete with you in the market," Newton-Rex wrote. "If they actually want to address concerns, the change required is simple. It should just read 'We will not use Your Content to train generative AI models without your explicit consent.'"
[16]
SoundCloud's New Terms Allow AI Models To Train On Users' Data
SoundCloud changed its terms of use, allowing the company to use content uploaded by users to "inform, train or develop" AI. The music company has clarified that, until now, it has not utilised users' data to train AI models. In addition, it had previously pledged to uphold "ethical and transparent AI practices that respect creators' rights" in a blog post in November 2024. SoundCloud already has third-party AI-assistive tools that enable users to edit, remix, and change the vocals of songs. However, the company loosening its terms to allow it to train its AI models with users' content raises significant questions about the rights of independent artists. The change in SoundCloud's policy was first flagged by Ed Newton-Rex, the founder of Fairly Trained, which "certifies" generative AI companies for fair training data practices. He claimed that users, including him, were not informed about the recent changes to the terms of use, even though, by its own terms of use, the company should at least have sent an email. The recent update to the terms of service states that content uploaded to the platform can be used to train AI models: "You explicitly agree that your Content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services." - SoundCloud's Terms of Service. MediaNama couldn't find a setting for an opt-out feature that would exempt a user's data from being used to train or inform SoundCloud's AI models. This implies that users are deprived of the choice to stop the company from (mis)using their original content to train the music platform's AI models. "SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes," said Marni Greenberg, the Head of Communications at SoundCloud. The music company said that the terms of service were updated in February 2024 to specify how content may interact with AI technologies within SoundCloud. The intended use cases for AI are "personalised recommendations, content organisation, fraud detection, and improvements to content identification," SoundCloud said. The music platform didn't specifically say how it will use users' content to train AI models, which it can do under the current terms of service. "Artists will continue to have control over their work," the company mentioned. SoundCloud's terms also mention that if content is delivered under a separate agreement, neither SoundCloud nor any other third party is allowed to use or reproduce such content. Here, the company was referring to its licensing deals with music labels; for example, SoundCloud has content licensing agreements with giants like Universal Music and Warner Music Group. "To SoundCloud, artists without a label are apparently second-class citizens, who don't deserve the same protections as artists who are signed," said Newton-Rex, explaining why he had to delete his SoundCloud account. The legality of the origins of training datasets is a significant issue in AI model training, as many AI models are trained on data scraped from publicly available information. In an ideal world, acknowledging that publicly available data isn't royalty-free and copyright-free would be a no-brainer. But as AI models become more prevalent, we must scrutinise the legitimacy of the datasets used to train them.
The case with SoundCloud is peculiar because the service started in 2007 as a platform for independent artists to publish their music but has since transformed into a full-fledged commercial music streaming service competing with Spotify. As mentioned above, SoundCloud's terms state that once users have uploaded their content, they explicitly agree by default that it may be used to train or inform the platform's AI models. And if the music platform sticks with this policy in the long term, then independent artists, cover artists, and royalty-free music creators will inevitably bear the brunt.
SoundCloud faces controversy over changes to its terms of use regarding AI training on user content, leading to a swift policy reversal and clarification from the company's CEO.
In a move that sparked widespread concern among its user base, SoundCloud, the popular music-sharing platform, quietly updated its terms of use in February 2024 to include language that appeared to allow the company to train AI models using content uploaded by users [1]. The change, which went largely unnoticed at the time, stated that users explicitly agreed their content "may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services" [1][5].
The policy update was brought to light by tech ethicist Ed Newton-Rex, leading to significant backlash from the platform's community [1][2]. Many users argued that such policies should be opt-in rather than opt-out, and expressed concerns about proper crediting and compensation for contributions to AI training datasets [1].
Initially, SoundCloud attempted to clarify its position, stating that the company had never used artist content to train AI models and that the update was intended to focus on internal AI uses such as recommendations and fraud prevention [2][3]. However, this explanation did little to allay fears about potential future uses of user content for AI training.
In response to the growing controversy, SoundCloud CEO Eliah Seton published an open letter acknowledging that the wording of the policy changes was "too broad and wasn't clear enough" [2]. Seton announced that SoundCloud would revise its terms to explicitly state that the company "will not use [user] content to train generative AI models that aim to replicate or synthesize [a] voice, music, or likeness" [2][4].
SoundCloud has emphasized that it has implemented technical safeguards, including a "no AI" tag, to prohibit unauthorized use of content for AI training purposes [3]. The company also stated that its terms of service explicitly prohibit the use of licensed content, such as music from major labels, for training any AI models [3].
While SoundCloud maintains that it hasn't used user-created content for model training, it hasn't entirely ruled out the possibility for the future. The company has committed to introducing robust internal permissioning controls and clear opt-out mechanisms should it ever consider using user content to train generative AI models [3][4].
The controversy surrounding SoundCloud's policy changes reflects a broader debate in the tech and creative industries about the use of user-generated content for AI training. Other platforms, including X (formerly Twitter), LinkedIn, and YouTube, have made similar policy updates in recent months, often facing pushback from users and creators [1].
As AI technologies continue to advance, particularly in the realm of content creation and music generation, the incident highlights the need for clear communication, user consent, and transparent policies regarding the use of creative works in AI development. The swift response and policy reversal by SoundCloud may set a precedent for how other platforms approach these sensitive issues in the future.
Summarized by Navi