AI is no stranger to the music industry, but what was once a fun tool for remixes has now turned into something darker. Musicians are waking up to find albums released under their names without ever stepping into a studio.
AI-created songs spark ethical questions about identity and creativity in music (Unsplash)
Recently, English folk singer Emily Portman was stunned when fans congratulated her on a new album, Orca. The shocking part is that she never recorded it. The entire album was AI-generated, uploaded to Spotify, iTunes, and YouTube under her name. It fooled listeners, and it took weeks for streaming platforms to respond. Even after removal, another fake album popped up under her profile days later.
However, she’s not the only one who has been cloned by AI. AI-generated tracks have appeared under the names of other artists, including the late musicians Blaze Foley and Guy Clark. Closer to home, even Indian legend Kishore Kumar has been dragged into this trend. A version of the popular Bollywood song “Saiyaara” was circulated online, falsely promoted as an “original” sung by Kumar. In reality, he never sang it. The Kishore Kumar “Saiyaara” song is a fake, AI-generated track created with voice-cloning technology to mimic his voice.
What began as fun AI remixes has turned into the theft of artists’ identities through music. Scammers earn money from it, while real musicians are left fighting to prove which songs are actually theirs.
And the scale of the problem is staggering. Streaming platform Deezer recently revealed it receives 20,000 AI-generated songs every single day, almost double what it saw just three months ago. “AI-generated content continues to flood streaming platforms like Deezer, and we see no sign of it slowing down,” said Aurelien Herault, the company’s Chief Innovation Officer, in a statement to Forbes in April 2025.
With nearly 100,000 uploads a day, most platforms rely on third-party distributors and user-submitted data. That makes it easy for scammers to slip fake music into real artist profiles, where it often goes unnoticed until fans spot it and complain. The threat is even greater for smaller, independent musicians without big legal teams.
It would be wrong to say that AI in music is all bad. It can help artists brainstorm lyrics, suggest chord progressions, or experiment with sounds. But when it is used to impersonate real musicians, it becomes fraud. As more people value convenience over authenticity, the danger is clear: we could end up with more AI sludge and fewer real albums.
The music industry now faces its biggest remix yet, deciding how to protect creativity in an age where even the dead can drop a “new” track.