Non-consensual Imagery: Meta Lawsuit Against Joy Timeline
Meta has filed a lawsuit in Hong Kong against Joy Timeline HK Limited, the developer behind the CrushAI app. The lawsuit seeks to stop the company from advertising AI technology that creates non-consensual imagery—specifically, fake nude images of clothed people—on Meta’s platforms. Meta alleges that Joy Timeline repeatedly attempted to evade its ad review processes and has run thousands of disguised ads since 2023.
Non-consensual Imagery: How CrushAI App Technology Works
CrushAI uses AI models to generate interactive experiences, including the controversial “nudify” feature: users upload photos, and the app uses AI to produce non-consensual imagery by digitally removing clothing from them. Other features include:
- AI-powered character conversations
- NSFW content generation without restrictions
- Custom character creation
- A credit-based system for interactions
Non-consensual Imagery: Meta’s Platform Policies
Meta strictly prohibits the sharing of non-consensual imagery, including both real and AI-generated content. In July 2024, the Oversight Board ruled that explicit AI images resembling public figures must be removed. Meta supports victims through StopNCII.org, which uses hashing to prevent the spread of intimate images without consent. The company also allows certain content for awareness-raising, provided victims are unidentifiable and the content is not sensationalized.
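The hash-matching approach used by StopNCII.org can be sketched in a few lines. This is an illustrative simplification: the function names, the blocklist structure, and the use of SHA-256 here are assumptions for demonstration only. Real systems such as StopNCII use perceptual hashes (designed to survive resizing and re-encoding) rather than cryptographic hashes, and the image itself never leaves the victim’s device—only the hash is shared with participating platforms.

```python
import hashlib

def image_fingerprint(data: bytes) -> str:
    """Return an opaque fingerprint of the image bytes.

    Illustration only: production systems use perceptual hashes that
    tolerate re-encoding; a cryptographic hash matches exact bytes only.
    """
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes, blocklist: set[str]) -> bool:
    """Check an uploaded image's fingerprint against victim-submitted hashes."""
    return image_fingerprint(data) in blocklist

# Hypothetical usage: a victim submits a hash; the platform checks uploads.
blocklist = {image_fingerprint(b"private-photo-bytes")}
print(should_block(b"private-photo-bytes", blocklist))  # True: exact match
print(should_block(b"different-photo", blocklist))      # False: no match
```

The key design property is that platforms can detect and block known images without ever storing or exchanging the images themselves.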
Non-consensual Imagery: Industry and Legal Response
Lawmakers and researchers have raised concerns about the proliferation of “nudify” apps. In early 2025, U.S. Senator Dick Durbin urged Meta to address Joy Timeline’s ads, citing violations of Meta’s standards on adult content and harassment. Meta is now enhancing its detection technology and sharing information about violators through the Tech Coalition’s Lantern Program.