Sexual deepfakes become more realistic, easier to access, and more harmful
A WIRED review and researcher reports show that dozens of websites, bots, and apps can turn a single photo into explicit AI video, fueling widespread nonconsensual abuse of women and girls.
- Dozens of websites, Telegram bots, and apps now offer explicit image-to-video deepfakes that can create short sexual clips from a single photo.
- Many services charge small fees for videos and additional fees for AI-generated audio or advanced options.
- WIRED reviewed more than 50 sites, which likely receive millions of views per month combined, and found 1.4 million accounts across 39 Telegram deepfake channels and bots.
- Services list dozens of sexual templates and accept custom prompts, and the ecosystem can enable the creation of child sexual abuse material.
- Developers often build on open-source models and provide APIs, allowing consolidation and rapid scaling of the tools.
- Victims are overwhelmingly women and girls, and harms include harassment, humiliation, sextortion, and private circulation to friends or family.
- Platform responses vary: Telegram removed many of the tools after they were flagged, but laws, detection, and enforcement remain limited.