SoundCloud has been forced into damage control this week after quietly sliding an AI clause into its Terms of Use, one that looked a lot like the platform giving itself permission to feed your tracks into the AI meat grinder.
The clause, updated back in February 2024 but only brought to light recently by Futurism, told users in no uncertain terms: by uploading music, you “explicitly agree” to let SoundCloud use it to “train, develop, or serve as input” for artificial intelligence tech. Translation: your beats could be training the next generation of AI copycats without your knowledge.
Unsurprisingly, the internet didn’t take kindly to that, especially the artists who built SoundCloud from the ground up. The backlash was loud, fast, and full of fire—enough to get CEO Eliah Seton to jump in with some good old-fashioned corporate spin.
In a statement posted online, Seton said the company has “never used artist content to train AI models,” nor does it allow third parties to scrape or repurpose uploads for that use. He added, “We don’t build generative AI tools,” and that SoundCloud’s position is simple: “AI should support artists, not replace them.”
It all sounds nice, but it also came a little too late. The damage was in the wording, and artists saw what looked like a bait-and-switch buried in the fine print.
To SoundCloud’s credit, the company has now revised the Terms of Use again—this time stating clearly that no content will be used to train generative AI models that could mimic your voice, sound, or style without explicit, opt-in consent. It’s a win, but a shaky one.
This all follows a larger industry trend where artists are finding out—often after the fact—that their work might be feeding the AI boom. It’s part of a much bigger fight over ownership, transparency, and control.
So while SoundCloud might’ve dodged the worst of the fallout for now, it’s a reminder that tech platforms are playing fast and loose with your art—and it pays to read the fine print.