Saturday, June 14, 2025

AI Music Scam: $10M Fraud Using Fake Songs and Streams



The AI music scam that shook the industry in 2024 became a landmark case in how streaming fraud is prosecuted. Using AI tools and bot networks, one man exploited major platforms and earned millions until authorities caught up with him.

The AI Music Scam Exposed

In a historic case, U.S. prosecutors charged Michael Smith, a North Carolina resident, with running a $10 million fraud operation involving AI-generated music. According to the Department of Justice, Smith and his associates created thousands of fake songs using AI and uploaded them under fabricated artist names to platforms like Spotify.

Inside the $10M AI Music Scam

Smith used over 1,000 bot accounts to mimic real listeners and inflate stream counts. This tactic earned him more than $3,000 per day, as reported by Wired. Over the course of a year, that translated to more than $1.2 million.

The musician used bots in the AI music scam.

Furthermore, he spread his activity across thousands of songs and accounts, making it harder to detect. Smith wrote, “We need a TON of songs fast to work around these anti-fraud policies.”

Additionally, he used VPNs and spoofed IP addresses to make the bot traffic look legitimate, creating the illusion of real users streaming music around the world. A Forbes report revealed the scam ran for more than seven years before the FBI stepped in.

How the Scam Exploited the Royalty Model

Smith’s operation exploited the streaming payout system: on most platforms, royalties scale with play counts, so bot-generated streams earn real income even though no real listeners are behind them. Moreover, services like Spotify had difficulty identifying these tracks, often dismissed as “AI slop.”
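The economics behind this are simple to sketch. Assuming an average per-stream royalty of around $0.005 (a commonly cited industry ballpark, not a figure from the case) and a catalog of 10,000 tracks (an assumed round number standing in for the "thousands of songs" reported), the $3,000-per-day figure implies roughly 600,000 bot streams daily, spread so thinly that each track sees only a trickle of plays:

```python
# Illustrative sketch of the payout math behind stream-count fraud.
# PER_STREAM_RATE and NUM_TRACKS are assumptions for illustration,
# not figures from the court filings.
PER_STREAM_RATE = 0.005  # assumed average royalty per stream, in USD
NUM_TRACKS = 10_000      # assumed catalog size ("thousands of songs")

daily_revenue = 3_000    # reported daily earnings at the scheme's peak

# How many streams the bots must generate to hit that revenue,
# and how few land on any single track when spread evenly.
streams_per_day = daily_revenue / PER_STREAM_RATE
streams_per_track = streams_per_day / NUM_TRACKS

print(f"Streams needed per day: {streams_per_day:,.0f}")   # 600,000
print(f"Streams per track per day: {streams_per_track:,.0f}")  # 60
```

Spreading ~600,000 daily streams across 10,000 tracks leaves only about 60 plays per track per day, low enough to slip under per-track anomaly thresholds, which is exactly why the scheme demanded "a TON of songs."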

A report by The Verge highlighted how fake album covers, similar-sounding titles, and manipulated metadata made detection even harder. Consequently, the scam kept growing.

The case is now viewed as a turning point, and industry experts welcomed the charges. As a result, platforms are improving identity verification and flagging suspicious uploads; some are even partnering with law enforcement and rights organizations.

What This Means for Artists and Platforms

Clearly, the AI music scam revealed a weakness in digital music platforms. Although these tools make music distribution easier, they can also be exploited. Hence, artists, distributors, and developers must collaborate on stronger systems. Without them, fraud may continue to grow unchecked.

The rise of AI-generated music brings new risks and new responsibilities. Platforms must evolve as these tools become more advanced and more widely used.

Conclusion

Ultimately, the $10 million AI music scam is a warning to the entire music industry, but it also offers a chance to adapt. With stronger defenses, better transparency, and ethical AI use, the industry can avoid similar threats in the future.
