Despite strict policies against non-consensual sexual content, Apple and Google are reportedly hosting dozens of AI-powered "nudify" applications—software designed to generate deepfake nude images of individuals without their consent. 

A new investigation by the Tech Transparency Project (TTP) reveals that these apps are not only evading detection but are generating significant revenue for the tech giants through app store commissions. The report identified 102 active nudify apps across both platforms—55 on the Google Play Store and 47 on the Apple App Store. 

According to the report, these apps "have been collectively downloaded more than 705 million times worldwide and generated $117 million in revenue." Because both tech giants take a standard commission on transactions, the report concludes that "Google and Apple are directly profiting from the activity of these apps."

Apps like Collart were found to require users to watch ads before generating nude content. TTP noted that a search for "nudify" on the App Store immediately "produced an ad for Collart... followed by Grok as the first organic result," highlighting how the platforms' own ad infrastructure directs users toward these tools. 

Both Apple and Google have strict guidelines prohibiting apps that generate non-consensual sexual content on their app stores. However, the investigation suggests a significant gap between policy and enforcement.

TTP's findings show that Google and Apple have failed to keep pace with the spread of AI deepfake apps that can "'nudify' people without their permission."
 
One of the report's most damning findings is how easily minors can access these apps. Despite their explicit capabilities, many carried "family-friendly" age ratings.

The app DreamFace, which has more than 10 million downloads, carried a 9+ age rating on the Apple App Store, marking it as suitable for children as young as nine. In its testing, TTP reported that the app "turned an image of a fully clothed woman into a video of her dancing topless."

Beyond the immediate harm of these nudify apps, there is a significant privacy risk. Several of the identified apps, such as Bodiva, operate out of China. TTP warns that "Chinese companies can be forced to share user data with the Chinese government," meaning sensitive biometric data, potentially including nude edits of real users, could be compromised.

Following the release of the report, Apple said it had removed 24 of the identified apps, and Google suspended several others. However, the reactive nature of these removals raises serious questions about the safety argument Big Tech often uses to defend its app store monopolies.

As the report concludes: "Both companies say they are dedicated to the safety and security of users, but they host a collection of apps that can turn an innocuous photo of a woman into an abusive, sexualized image." 
