(CNET) – A free, easy-to-use deepfake bot found on the Telegram messenger app has victimized tens of thousands of women by replacing the clothed parts of their bodies in photos with nudity. More than 100,000 of these nonconsensual sexual images have been posted publicly online, but the bot may have produced orders of magnitude more that haven’t been traced.

The victims are mostly private individuals, women whose photos were taken off social media or pulled from a personal stash of pics, according to a research report about the bot published Tuesday. Some victims were originally photographed in bathing suits. Some were wearing simple T-shirts and shorts. Some were visibly underage. All are women.

Deepfake porn isn’t new. Deepfake technology — artificial intelligence that makes sophisticated media forgeries — has been used early and often to fabricate pornography. But this Telegram bot takes the ease and access of this technology to a new level. “The innovation here is not necessarily the AI in any form,” said Giorgio Patrini, CEO of deepfake-research company Sensity and coauthor of the report. “It’s just the fact that it can reach a lot of people, and very easily.”
Computer manipulation of media has existed for decades, and sexual imagery has been weaponized online for as long as the internet could host photos. Whether they’re nude photos posted without consent or crudely doctored forgeries, sexual images have been used to extort, threaten, humiliate and harass victims.