This terrible incident demonstrates how deepfake technology threatens women’s rights and safety. Currently, a deepfake can be produced in under 24 hours for less than $1,000, and soon creating one may require nothing more than a single image. As the technology improves and access widens, the threat to everyday women grows.
Outlined below are the four key steps we suggest you take if you find yourself in this situation.
1. Document everything. Record all evidence of the abuse — screenshot the images, your takedown requests, and any related communications.
2. Flag or report the post and alert the platform’s administrators or moderators. Non-consensual pornographic content of any kind violates the terms of use of every mainstream platform.
3. Contact a lawyer to explore civil or criminal avenues. You may be able to seek justice under laws covering related offenses, such as revenge porn, extortion, harassment, or defamation, and some of the most effective routes for removing this content rely on intellectual property and copyright law.
4. Reach out to civil society groups. Organizations such as DeepTrust Alliance, Cyber Civil Rights Initiative, Electronic Frontier Foundation, and EndTab are dedicated to helping and protecting women targeted by this abuse. Other gender advocacy and women’s rights organizations can connect you to hotlines, attorney recommendations, and mental health resources and support.
Unfortunately, the mechanisms for taking down this content are not nearly as advanced or systematic as the AI algorithms that create it, and a disproportionate share of the burden falls on individual victims. We hope you never find yourself in this situation, but if you do, these steps can help until state and federal law catches up to fully address the unchecked development and dissemination of pornographic deepfakes.