Apple and Google have rules against “nudify” apps, but their app store search and advertising systems actually point users to them, according to an investigation by the Tech Transparency Project. Here are the highlights from the report:
• Searches for terms like “nudify,” “undress,” and “deepnude” in the app stores produced multiple apps capable of digitally stripping the clothes off women in photos.
• These apps can take images of real people and use AI to make them look naked, put them into pornographic videos, or turn them into sexually explicit chatbots.
• Apple and Google ran ads for nudify apps in some of the search results, and the app stores even suggested additional nudify search terms through their autocomplete function.
The apps identified by TTP have been downloaded 483 million times and made more than $122 million in lifetime revenue, according to data compiled by a mobile analytics firm. The investigation found 31 nudify apps that were rated suitable for minors, a notable finding given the growing number of sexual deepfake scandals in schools.
The Tech Transparency Project (TTP) is a research initiative from the Campaign for Accountability that investigates and exposes the influence, business practices, and societal impacts of major technology companies like Apple, Google, Facebook (Meta), and Amazon. It uses public records, data analysis, and investigations to hold Big Tech accountable for policy failures, such as content moderation issues and user safety lapses.
I hope you’ll help support Apple World Today by becoming a patron. Almost all our income is from Patreon support and sponsored posts. Patreon pricing ranges from $2 to $10 a month. Thanks in advance for your support.
Also, check out my daughter-in-law’s “Scattered Words” website if you’re interested in unique, handcrafted jewelry made out of an array of vintage dictionaries, books, and even a few antiques.