
The Tech Transparency Project (TTP) reports that Apple and Google still surface apps capable of creating non-consensual sexualized images when users search terms such as “nudify” and “undress,” even though both platforms explicitly ban sexualized or non-consensual content. TTP identified 18 such applications in the Apple App Store and 20 in Google Play, 38 in total. According to AppMagic estimates, these applications have been downloaded 483 million times and have generated about US$122 million in revenue, roughly US$0.25 per download if spread evenly. Some apps are explicitly sexual in their names and marketing, while others are packaged as general-purpose image tools that can easily be misused.

Apple said it removed 15 applications after Bloomberg contacted it and blocked some of the search terms listed in the report; it also warned six developers that their apps may be removed if the issues are not addressed. Google said many violating applications had been suspended and that its investigation continues. One specific case is Google Play’s Video Face Swap AI: DeepFace, rated “E” (Everyone) with over 1 million downloads, where a “Girls” category lets users paste faces onto templates featuring bouncing breasts or hip shaking. Its developer, Okapi Software, said it launched an internal investigation and removed user-uploaded content, while asserting that the app does not offer a direct nudify feature.

Researchers and outside experts say enforcement remains uneven and opaque: an app framed as a generic image generator can pass review, while ranking and autocomplete features then steer users toward more “nudify” options, so engagement-driven systems end up amplifying misuse. Regulatory pressure is rising in parallel. In 2025, U.S. President Donald Trump signed the Take It Down Act, which criminalizes publishing non-consensual sexual content, and the UK plans to introduce legislation this April that could make tech executives personally liable when intimate-image abuse is not promptly removed. The controversy shows that both platforms still draw traffic and revenue from policy-sensitive content while favoring post-hoc takedowns over effective pre-screening.

2026-04-20 (Monday) · 86ebb0d7b1b19fe8893b72f18e1ecbeca78444dc