Meta Files Lawsuit Against Developers of AI-Powered Deepfake Nude Generators

As Meta continues to push the use of its AI content creation tools, it’s also facing a growing challenge: the spread of harmful AI-generated content on its platforms. In response, Meta is now taking legal action to combat this trend.
Today, Meta announced a lawsuit against a company called “Joy Timeline HK Limited,” which promotes an app named “CrushAI.” The app lets users generate nude or sexually explicit AI images of real people without their consent, a serious and alarming misuse of the technology.
Meta explained:
“We’re witnessing a troubling rise in ‘nudify’ apps that use AI to produce fake, non-consensual intimate images. Meta has strict policies against this type of content. Over a year ago, we updated our guidelines to clearly prohibit the promotion of nudify services. We take action by removing ads, Pages, and Instagram accounts that promote these tools, blocking website links hosting such services, and limiting related search terms like ‘nudify’, ‘undress’, and ‘delete clothing’ across Facebook and Instagram.”
Despite these efforts, some content and promotions still manage to bypass Meta’s detection systems.
This lawsuit marks Meta’s first direct legal action against the developers of a nudify app.
“We’ve filed a case in Hong Kong—where Joy Timeline HK Limited operates—seeking to prevent them from advertising CrushAI on our platforms,” Meta said. “They’ve repeatedly tried to evade our ad review process, continuing to place violating ads even after removals.”
The situation underscores a central tension: Meta promotes AI tools for creative expression while simultaneously having to police their misuse.
Unfortunately, that misuse is inevitable. As history shows, any technological advancement tends to attract bad actors, and generative AI is no exception.
Just last month, University of Florida researchers reported a sharp rise in AI-generated explicit imagery created without consent. Their review of 20 nudification sites surfaced even more disturbing findings: some sites were producing images of minors, and the vast majority of targets were women.
This has fueled growing support for stronger legal protections. The National Center for Missing and Exploited Children (NCMEC) is advocating for the “Take It Down Act,” federal legislation that would criminalize the publication of non-consensual intimate imagery, including AI-generated deepfakes, and require platforms to remove such content promptly.
Meta has endorsed this initiative, and its latest legal action is another step in that direction.
Still, completely eliminating such tools may be impossible. The internet has repeatedly shown that every innovation, however well intentioned, eventually attracts unethical use, and AI-generated adult content is following the same pattern.
But actions like this may help slow the spread, limit accessibility, and set legal precedents—sending a message that AI abuse won’t be tolerated.