Meta sues nudifying app creator in legal bid to block deepfake ads
Meta, the parent company of Facebook and Instagram, is suing the maker of a nudifying app to clamp down on AI-generated sexual content. On June 16, it filed a complaint in a Hong Kong district court against Joy Timeline HK Limited, the developer of CrushAI, alleging the app ran more than 87,000 ads across at least 170 fake accounts targeting users in the U.S., U.K., Canada, Australia, and Germany, in direct violation of Meta's ban on intimate deepfakes.
AI-Generated Deepfake Exploitation Sparks Legal and Policy Crackdown
The lawsuit reveals that CrushAI ads carried captions like "erase any clothes on girls" and "upload a photo to strip for a minute," breaching Meta's advertising policy on adult sexual content. Meta acted after media investigations, including coverage by Faked Up and CBS News, revealed that advertisers were exploiting proxy domains and cloned Facebook pages to evade its ad review and rejection systems.
Meta sues nudifying app creator and rolls out advanced AI ad moderation
Alongside the lawsuit, Meta has enhanced its AI-powered ad-screening tools. These systems now analyze ad text, including suggestive phrases and emoji, even when the accompanying visuals appear harmless. Meta also participates in the Tech Coalition's Lantern initiative, sharing signals with other platforms to help curb the spread of non-consensual intimate imagery.
Meta sues nudifying app creator against a backdrop of tougher regulation
Meta's legal action mirrors broader regulatory moves such as the U.S. Take It Down Act, which criminalizes the publication of non-consensual intimate imagery, including AI-generated deepfakes, and requires platforms to remove it quickly. It also follows pressure from Senator Dick Durbin, who challenged Mark Zuckerberg over the issue. The trend reflects a growing consensus among lawmakers, activists, and ethicists that non-consensual deepfakes demand stronger enforcement.
What it signals for platform accountability
The lawsuit marks a new phase for Meta, one that combines legal and technical measures against deepfake abuse. Users can expect more rigorous screening of ads involving intimate imagery, while developers of such apps face heightened legal risk. Meta aims to escalate the fight against deepfake exploitation and help set industry-wide standards.