What We Need Is Moral Integrity

Juliane Reuther

The last few years have made us suspicious. Donald Trump popularized the term "fake news", which all too quickly entered everyday vocabulary worldwide. Deepfakes are misinformation of the highest quality: we can no longer believe our own eyes.

Even today, anyone can order deepfakes on the Internet for a few euros or, with a little patience and the right apps, create them themselves. For now, the quality still leaves something to be desired, but we know how rapid technical progress can be. We must not underestimate the risks that such easy access to manipulated videos poses for our society and for ourselves. That is why we have to fasten our moral seatbelt and act responsibly, especially when it comes to non-consensual pornographic content and manipulated information of public interest.

In 2020, according to Sensity, around 93 percent of deepfakes on the Internet were of a pornographic nature and most likely created without the knowledge of the person depicted. Morally, deepfake porn is hardly any different from revenge porn: both have the potential to wreak havoc on another person's life. As soon as you have a nude photo or sex tape of another person, you have the opportunity to post it online and expose them. Fortunately, most of us are sensible enough not to do that. We must not lose this common sense despite the lure of deepfake software that allows us to create such compromising imagery from as little as a single selfie. And when in doubt, everyone is at the mercy of the technology, including minors.

It is essential that legislation adapts and explicitly criminalizes the creation of non-consensual synthetic media. For this, awareness and basic knowledge of deepfake porn and its impact on those affected must be built among the authorities, as this is the most common abusive use of the technology. Reporting deepfakes has to become easier for victims. Social networks such as Instagram, Facebook, and TikTok must be obliged to immediately delete content that was not created consensually and to block accounts that publish or spread it.

But we also have to tackle the problem at the root. We need to educate society and encourage it to act sensibly. Legislation is good; moral integrity is better. After all, the technology behind deepfakes is neither evil nor wrong in itself, only what people make of it.

In response to an inquiry from the FDP on the subject of deepfakes, the German Bundestag announced at the end of 2019: "The development and research on deepfakes is still in its infancy. On this scientific basis, which is still to be developed, a strict distinction between harmless and dangerous is currently not expedient."

Since then, a lot has changed technologically, but not legally: there is still no legislation that explicitly criminalizes, for example, the non-consensual creation of deepfake porn. The European Commission considers deepfakes a technology of "limited risk" and proposes a regulation that merely requires deepfakes to be labeled as such, and nothing more. Precisely because our governments are not yet willing to take a clear position, we have to show moral integrity ourselves. We have to educate ourselves and each other and be aware that a deepfake that begins as a harmless joke could destroy the lives of those affected.

Deepfakes may be a futuristic technology, but the challenges they pose are nothing new. A nude photo or porn video that is made public, even if it is a deepfake, can still mean the end of a career or relationship and lasting trauma, above all for women. Proving that deepfake porn of yourself is fake is time-consuming and re-traumatizing. Stopping its spread is often impossible. A report usually comes to nothing.

So we have no choice but to stop relying on legislation alone and to act with moral integrity. We need to prevent such harmful content from being created and spread in the first place. We must remain vigilant, do our research when deepfakes reach us, and make others aware of what really matters: the truth.
