A new report from French streaming giant Deezer reveals that up to 70% of streams of AI-generated music on its platform are fraudulent. This widespread manipulation, driven primarily by bots, is aimed at illicitly claiming royalty payments and highlights a growing challenge for the music industry as AI content proliferates.
The Scale Of The Problem
Deezer's analysis indicates that while AI-generated music currently accounts for a mere 0.5% of all streams on its platform, an alarming proportion of these are fraudulent. The company found that up to seven out of ten streams of AI-made content are orchestrated by fraudsters. This issue is exacerbated by the rapid increase in AI content, with AI-generated music now representing 18% of all new uploads to Deezer, equating to approximately 20,000 tracks daily.
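As a rough back-of-the-envelope check of these figures, the implied total volume of daily uploads can be derived from the two numbers Deezer reports; the total-uploads figure below is an inference, not something stated in the report.

```python
# Back-of-the-envelope check of the reported figures.
# Stated by Deezer: AI-generated music is ~18% of new uploads,
# or roughly 20,000 tracks per day. The total-uploads figure is
# derived from those two numbers, not reported directly.

ai_share_of_uploads = 0.18    # 18% of new uploads are AI-generated
ai_tracks_per_day = 20_000    # reported daily AI-generated uploads

total_uploads_per_day = ai_tracks_per_day / ai_share_of_uploads
print(f"Implied total daily uploads: ~{total_uploads_per_day:,.0f}")
# Implied total daily uploads: ~111,111
```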
How The Fraud Works
Fraudsters exploit streaming platforms by deploying bots to simulate listens to AI-generated songs. Rather than concentrating enormous play counts on a handful of tracks, they typically spread streams across numerous bogus tracks, keeping each track's numbers low enough to evade detection while the royalties accumulate in aggregate. Thibault Roucou, Deezer's director of royalties and reporting, stated that as long as there is a financial incentive, efforts to profit from fraudulent streaming will persist.
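A minimal sketch of why this spreading strategy pays off; all numbers here (per-stream payout, track count, streams per track) are hypothetical assumptions for illustration and do not come from Deezer's report.

```python
# Illustrative only: hypothetical numbers, not figures from Deezer's report.
# Shows why spreading bot streams across many tracks is attractive:
# each track's play count stays unremarkable, but the aggregate
# royalty payout is substantial.

payout_per_stream = 0.003    # assumed average per-stream royalty (USD)
tracks_uploaded = 10_000     # hypothetical catalogue of AI-generated tracks
streams_per_track = 1_000    # modest per-track count, easy to overlook

total_streams = tracks_uploaded * streams_per_track
total_payout = total_streams * payout_per_stream

print(f"Total streams: {total_streams:,}")        # 10,000,000
print(f"Estimated payout: ${total_payout:,.0f}")  # $30,000
```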
Deezer's Response And Challenges
In response to this growing threat, Deezer has implemented a tool capable of detecting fully AI-generated content produced by prominent models such as Suno and Udio. The platform actively blocks royalty payments for streams identified as fraudulent. Furthermore, Deezer announced in April that it is removing all fully AI-generated content from its algorithmic recommendations. Despite these measures, the company acknowledges the ongoing battle, with Roucou noting that the perpetrators appear to be "organised."

Wider Industry Impact
The problem of fraudulent streaming extends beyond Deezer, impacting the entire global music streaming market, which was valued at $20.4 billion last year. The International Federation of the Phonographic Industry (IFPI) has highlighted that fraudulent streaming diverts money that should rightfully go to legitimate artists, and generative AI has "significantly exacerbated" this issue. A notable case from last year involved US musician Michael Smith, who was charged in connection with a scheme to generate hundreds of thousands of AI songs and stream them billions of times, allegedly obtaining $10 million in royalty payments.
Key Takeaways
Up to 70% of AI-generated music streams on Deezer are fraudulent.
Fraudsters use bots to simulate listens and illicitly claim royalty payments.
AI-generated music, though a small percentage of total streams, accounts for 18% of new uploads.
Deezer is actively combating the issue with detection tools and by blocking fraudulent payments.
The problem is a significant and growing concern for the entire music streaming industry.