Digital Rape: Women Are Most Likely to Fall Victim to Deepfakes

Juliane Reuther

When you google the term "deepfake", the search results are mainly funny manipulated pictures of the Mona Lisa and doctored videos of Facebook CEO Mark Zuckerberg. But these results are deceiving: Around 93 percent of the deepfakes available online in 2020 were not funny deepfakes of Obama but non-consensual porn, according to a report by the company "Sensity".

Expert Harrison considers non-consensual deepfake porn to be the biggest problem at the moment

In 2017, Vice tech journalist Samantha Cole became aware of the Reddit user "deepfakes", who used open-source software and artificial intelligence to create deepfake porn of famous women. The app "DeepNude" made similar technology accessible to everyone in 2019 and allowed users to "undress" photos of women with AI. The creators discontinued the app themselves shortly afterward due to abuse. The software behind it is still available.

Numerous websites already specialize in deepfake porn. The four largest sites attracted around 134.4 million views of these AI porn videos in 2019. Particularly striking: 100 percent of the deepfaked victims were women.

In 2020, famous women from the world of entertainment and fashion made up around 80 percent of the victims. According to "Sensity", athletes, politicians and news anchors were also among those targeted. All of this content is likely to have been produced non-consensually.

Germans made up 1.3 percent of the worldwide victims of deepfakes in 2020. Almost 89 percent of this content was NSFW or pornographic, and 95 percent of these nude fakes depicted women.

So far, the quality of some apps still leaves a lot to be desired, but deepfake nude photos will look more and more realistic in the future

That fact seems to be part of the appeal of such videos: they are morally wrong, voyeuristic, illegal. Women are being objectified and sexualized by men. The same noticeable power imbalance we experience in real life plays out in the digital space.

In "Rolling Stone“ American law professor and author Danielle Citron called deepfake porn a kind of digital rape: "When you see a deepfake sex video of yourself, it feels viscerally like it’s your body. It’s an appropriation of your sexual identity without your permission. It feels like a terrible autonomy and body violation“.

Not Only Celebrities Are Affected by Deepfake Porn and Fake Nude Photos

Since 2020, deepfake bots trained to undress pictures of women have been available in Telegram channels. All users have to do is upload a picture to have the person in it undressed by AI. As with other websites and apps, the quality of these photos varies greatly: some look deceptively real, others catastrophic. These bots are dangerous nonetheless.

According to a report by "Sensity", more than 100,000 women were unknowingly targeted by such deepfake bots between July and October 2020. 70 percent of the pictures uploaded were private and possibly obtained from social media. This means that every woman and girl who has ever posted a picture of herself, or had a photo of herself posted by others, is a potential victim.

That is exactly what happened to 19-year-old Charlotte from Leipzig. You can watch her story here.

In an anonymous survey, 16 percent of users of such deepfake bots stated that they wanted to deepfake famous women. The majority, 63 percent, on the other hand, wanted to see women they knew in real life "naked" with the help of artificial intelligence. 

The appeal of virtually undressing a colleague or classmate is evidently greater than that of seeing celebrities naked. One can only wonder why. Perhaps it is the allure of a kind of sexualized power over a person that one cannot hold in real life. Voyeuristic curiosity or revenge on an ex-lover seem possible motives, too.

In some US states, such as California, the creation of non-consensual deepfake porn has already been criminalized. In Australia, creating deepfake porn has been punishable by prison sentences of up to seven years since 2018.

At this point, there is no such legislation in Germany. However, legal steps are still possible. Deepfake porn and fake nude photos generally violate personal rights, the right to one's own image, and, under Section 184 of the German Criminal Code, the prohibition on the distribution of pornographic material.

Christian Solmecke is a specialist lawyer for internet and media law

In the EU, too, artificial intelligence and the content it creates have become a topic of interest. On April 21, 2021, the European Commission presented a proposal aimed at regulating AI. Deepfakes are classified as a technology with "limited" risk and only need to be labeled as deepfakes. For victims of deepfake porn, this might be little consolation; after all, watermarks can be removed with a little skill.

Victims of deepfake porn in need of emotional and legal support can contact local aid agencies like "Weißer Ring".
