Nudify Apps Continue Spreading Across Major App Stores Despite Explicit Policy Bans
An investigation by the Tech Transparency Project found that Apple’s App Store and Google Play continue to host dozens of artificial-intelligence “nudify” applications, tools that allow users to generate sexualized or nude images from ordinary photos without the subject’s consent. The apps remain available despite both companies maintaining policies that prohibit explicit content and non-consensual imagery. The report identified 55 such apps on Google Play and 47 on Apple’s App Store, collectively accounting for more than 705 million downloads and approximately $117 million in revenue. The findings renew concerns that enforcement of existing rules is weak and inconsistent, even as these tools pose clear risks of harassment, exploitation, and privacy violations.
Sources
https://www.theregister.com/2026/01/27/nudify_app_proliferation_shows_naked/
https://www.theverge.com/news/868614/nudify-apps-ttp-report-google-apple-app-stores
https://indianexpress.com/article/technology/apple-google-app-stores-still-host-dozens-of-ai-nudify-apps-report-claims-10498397/
Key Takeaways
- Dozens of AI-powered nudify apps remain available on Apple’s and Google’s platforms, even though both companies explicitly ban apps that generate sexualized or non-consensual imagery.
- These apps have reached massive scale, with hundreds of millions of downloads and substantial revenue, indicating systemic moderation failures rather than isolated oversights.
- The continued availability of such tools heightens risks of harassment, blackmail, and abuse while eroding public trust in the ability of major tech platforms to police their own ecosystems.
In-Depth
The rapid spread of AI-driven “nudify” applications highlights a widening gap between the public commitments of major technology platforms and their real-world enforcement practices. According to the Tech Transparency Project, Apple and Google continue to distribute dozens of apps that can digitally undress individuals in photos, often producing realistic nude or sexualized images without the subject’s knowledge or consent. While both companies claim to prohibit explicit and exploitative content, the scale of the findings suggests that enforcement mechanisms are either inadequate or inconsistently applied.
The sheer reach of these applications is difficult to dismiss. With hundreds of millions of downloads and well over one hundred million dollars in revenue, nudify apps are not fringe products slipping through the cracks. Some have even benefited from in-store promotion and advertising, suggesting a commercial incentive structure that favors engagement and revenue over user protection. Google has stated that it removed or suspended several apps after being notified, and Apple reportedly pulled some listings, but many similar apps remain accessible, underscoring the reactive nature of these responses.
Beyond policy failures, the issue raises deeper concerns about the social consequences of unchecked AI tools. Nudify apps make it trivial to create non-consensual intimate imagery, a practice widely condemned across the political spectrum and already associated with severe emotional, reputational, and legal harm. When such tools are readily available in mainstream app stores, they normalize a form of digital exploitation that should instead be aggressively curtailed.
From a conservative standpoint, this is not a call for new speech rules but for accountability. Companies that write rules and profit from their platforms have a responsibility to enforce those rules consistently. If Apple and Google cannot or will not uphold their own standards, public confidence in self-regulation erodes, and the broader technology sector risks inviting heavier external scrutiny. Enforcing existing policies decisively would be a straightforward first step toward restoring trust and protecting basic personal dignity in the age of AI.