wezebo
Article · May 4, 2026 · 4 min read

AI music is turning streaming into a spam problem

Deezer says AI-generated tracks now account for 44% of new uploads, while Spotify is leaning on labels and spam takedowns. The real fight is over incentives, not just disclosure.


AI music has moved from novelty to platform maintenance problem. The question for streaming services is no longer whether machine-made songs should exist. It is how much automated content a music catalog can absorb before discovery, royalties and trust start to break.

The Verge reported that major services are converging on a cautious middle ground: they are not banning AI music outright, but they are also trying not to let it quietly flood playlists and recommendation systems.

The new upload math

Deezer says AI-generated tracks now represent 44% of new music uploaded to its platform, or roughly 75,000 tracks per day, according to its April update. That is not a small edge case. It means a large streaming service is already receiving almost as much synthetic music as human-made music.

The company has taken one of the firmer positions in the market. It detects and labels AI-generated tracks, excludes them from recommendations and editorial playlists, and says it has demonetized most streams attached to detected AI music where it sees signs of fraud. Ars Technica noted Deezer's claim that AI music still accounts for only a small share of listening, partly because those tracks are kept out of organic discovery.

That distinction matters. Upload volume is cheap. Listener attention is not.

Labels help, but incentives matter more

Spotify is taking a more standards-driven route. It has pushed for AI credits that identify whether generative tools were used for vocals, lyrics or backing music, and it has worked with the DDEX standards group on metadata for AI disclosure. Spotify has also said it removed more than 75 million spam tracks over a 12-month period as part of broader platform cleanup, according to its policy update.

Apple's approach appears more dependent on self-reporting by labels and creators. That may be easier to implement across a large rights ecosystem, but it is weaker against bad actors whose whole strategy is to upload at scale and avoid scrutiny.

The hard problem is that generative music changes the economics of spam. A low-quality track used to require at least some production effort. Now a large batch can be created, titled, uploaded and tested against platform rules far more cheaply. Even if each track earns very little, automation can make the attack profitable.
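That shift in economics can be made concrete with a back-of-the-envelope model. All of the numbers below are hypothetical illustrations chosen to show the shape of the incentive, not real platform figures:

```python
# Hypothetical back-of-the-envelope model of spam-upload economics.
# Every number here is an illustrative assumption, not a platform figure.

def batch_profit(tracks, cost_per_track, streams_per_track, payout_per_stream):
    """Net profit for one upload batch: total payouts minus production cost."""
    revenue = tracks * streams_per_track * payout_per_stream
    cost = tracks * cost_per_track
    return revenue - cost

# Human-era spam: each track needs real production effort, so small batches
# of low-quality tracks lose money.
human = batch_profit(tracks=10, cost_per_track=50.0,
                     streams_per_track=1_000, payout_per_stream=0.003)

# Generative-era spam: near-zero marginal cost per track, huge batch size.
generated = batch_profit(tracks=10_000, cost_per_track=0.05,
                         streams_per_track=1_000, payout_per_stream=0.003)

print(human)      # -470.0  -> unprofitable
print(generated)  # 29500.0 -> profitable at scale
```

The per-track revenue never changes in this sketch; only the cost side does. That is why enforcement aimed at detection and demonetization attacks the problem more directly than disclosure alone.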

What users are actually owed

For listeners, the minimum useful standard is not a philosophical label saying a song involved AI. It is control. People should be able to know when a track is synthetic, filter it if they want to, and trust that recommendations are not being padded with automated filler.

For artists, the concern is more direct: royalty pools and discovery slots are finite. If platforms allow machine-generated catalogs to compete on the same terms as human artists without clear disclosure or anti-fraud enforcement, the cost lands on creators who cannot scale output like software.

That does not mean every AI-assisted song is spam. Some musicians will use generative tools as instruments, effects or writing aids. The line worth defending is not human versus machine in the abstract. It is consent, disclosure and whether the upload is trying to reach listeners or game a payout system.

The likely end state

Streaming services will probably settle on a layered model: metadata labels for legitimate AI-assisted work, detection for undisclosed synthetic tracks, recommendation limits for bulk AI catalogs and aggressive demonetization when listening patterns look fraudulent.
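One minimal sketch of what that layered model could look like as logic, with field names and thresholds invented for illustration rather than taken from any platform's actual system:

```python
# Sketch of a layered moderation policy for uploaded music.
# Field names and thresholds are invented for illustration; they do not
# reflect any streaming service's real implementation.

def moderate(track: dict) -> list[str]:
    """Return the ordered list of actions applied to one uploaded track."""
    actions = []
    if track.get("ai_disclosed"):
        actions.append("label:ai-assisted")         # metadata label, layer 1
    elif track.get("detector_score", 0.0) > 0.9:
        actions.append("label:detected-synthetic")  # undisclosed, flagged
    if track.get("bulk_catalog"):
        actions.append("exclude:recommendations")   # keep out of discovery
    if track.get("fraud_signal"):
        actions.append("demonetize")                # no royalty payout
    return actions

# A disclosed AI-assisted song keeps normal distribution, just labeled.
print(moderate({"ai_disclosed": True}))
# ['label:ai-assisted']

# An undisclosed bulk upload with fraudulent streams hits every layer.
print(moderate({"detector_score": 0.97, "bulk_catalog": True,
                "fraud_signal": True}))
# ['label:detected-synthetic', 'exclude:recommendations', 'demonetize']
```

The point of the layering is that the penalties escalate with the behavior: an honest AI-assisted track is only labeled, while a bulk fraudulent catalog is labeled, hidden, and cut off from payouts.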

The services that move fastest may avoid turning AI music into another version of search spam: technically abundant, economically motivated and exhausting for users. The services that wait will discover that moderation is much harder after the catalog has already filled up.