The Dutch Public Prosecution Service is concerned about so-called deepfakes: videos that have been manipulated with the help of AI algorithms. You can replace someone’s face with someone else’s, or make someone say completely different things.
“We have to realize that we can no longer believe something just because we see it,” said prosecutor Lodewijk van Zwieten. “Soon it will be easy to depict something that never actually happened.”
It is becoming increasingly easy to make deepfake videos. Last week an app went viral in China that lets anyone place their face in a film scene of their choice. That app cannot be downloaded in Europe; a Chinese telephone number is required for activation.
“These kinds of apps are fun, but if you use these kinds of videos to ridicule people, you cross a line,” says Van Zwieten. The public prosecutor fears this will become a reality as it gets ever easier to make fake videos yourself.
This often still requires some expertise – unlike the Chinese app – but there is a good chance it will not stay that way. “If we all have an app on our phone that can do this, it will be easy to make a video in which you have someone else appear to do something criminal.” And that can be punishable, for example as libel or defamation.
Blackmail
Researcher Theo Gevers from the University of Amsterdam shares the fear of deepfakes.
“As long as it stays with nice apps, it’s pretty harmless,” says Gevers. “But if it is applied to blackmail, for example, it is very dangerous.”
And the developments are moving fast. “Things are possible now that could not be done six months ago. And in six months to a year you will probably no longer be able to tell with the naked eye whether a video is a deepfake,” says Gevers, who works on software that recognizes deepfakes.
At the moment the hardest part is faking audio: this requires a lot of recordings of someone’s voice, although there are reports that a company lost 240,000 dollars because someone impersonated a director with a fake voice.
Deep fakes are getting scary. In this one, we see Jim Carrey’s face transposed onto Alison Brie’s body. What do you think will be the societal effects when this technology is easily accessible to everyone on their phone? pic.twitter.com/j4ZMyGsBkX
— Ken Rutkowski (@kenradio) September 3, 2019
Well-known women – in practice it is always women – already sometimes find themselves embedded in a porn video without their permission, because someone has pasted their face into it.
According to Van Zwieten, there is a risk that, as the technology becomes more accessible, “horny teenagers” in schoolyards will also start making such videos of each other. And that can be punishable, because virtual pornography involving minors is also prohibited in the EU.
Fake videos could also be used for extortion: if he or she does not pay, the video will be distributed. Or someone could be wrongly arrested because a video appears to show them committing a crime. But foreign (and our own) governments could, for example, also spread fake news to influence elections.
Social media are also used for threats, yet they are not banned either. It is not about the technology, but about what you do with it.
Researcher Gevers is not in favor of a ban either. “There are plenty of positive applications imaginable. Fun apps, of course, but also educational or psychological ones.”
Students could, for example, be taught by historical figures brought back to life. The bereaved could also talk virtually with deceased loved ones.
Incidentally, the public prosecutor does not think new regulations are necessary for the time being.
“Regulation will come eventually. But first we must have a debate about this technology.”