When mobile apps like Snapchat appeared, everyone enjoyed playing with face recognition technology. We were happy to apply filters to our faces and become what we're not. Face recognition then evolved and met artificial intelligence, giving rise to what we now call deepfakes.
The word "deepfake" is a compound of the terms "deep learning" and "fake". The concept is about generating multimedia content, generally videos, that shows people participating in events they never actually took part in.
How do deepfakes work?
Deepfake is an AI-based technology that uses a generative adversarial network (GAN). The algorithm's inputs are multimedia content, which it uses to learn the person's face from every angle.
To generate a deepfake, the AI must be given a source video, the person to be replaced, and the person who will take their place. Frame by frame, it uses the data gathered from those inputs to seamlessly swap the two faces.
AI is also capable of cloning a human voice. Some tools simplify the process to the point that all the user has to do is type a sentence; the AI then generates speech in the targeted person's voice.
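To give a flavor of the adversarial idea behind GANs, here is a deliberately tiny, hypothetical sketch: a one-parameter-per-role "generator" and "discriminator" playing the adversarial game on simple numbers instead of faces. Real deepfake systems use deep convolutional networks; this toy only illustrates the two-player training loop, with all names and numbers invented for the example.

```python
# Toy adversarial training loop (illustrative sketch only, not a real deepfake model).
# The generator G(z) = a*z + c tries to imitate "real" samples drawn from N(3, 1);
# the discriminator D(x) = sigmoid(w*x + b) tries to tell real samples from fakes.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, c = 1.0, 0.0   # generator parameters (starts producing samples around 0)
w, b = 0.1, 0.0   # discriminator parameters
lr = 0.05

initial_gap = abs(c - 3.0)  # how far the generator's mean starts from the real mean

for step in range(2000):
    # --- Discriminator step: push D(real) up and D(fake) down ---
    x_real = rng.normal(3.0, 1.0)
    z = rng.normal()
    x_fake = a * z + c
    d_real = sigmoid(w * x_real + b)
    d_fake = sigmoid(w * x_fake + b)
    # Gradients of -log D(real) - log(1 - D(fake)) w.r.t. (w, b)
    gw = -(1 - d_real) * x_real + d_fake * x_fake
    gb = -(1 - d_real) + d_fake
    w -= lr * gw
    b -= lr * gb

    # --- Generator step: push D(fake) up, i.e. fool the discriminator ---
    z = rng.normal()
    x_fake = a * z + c
    d_fake = sigmoid(w * x_fake + b)
    # Gradients of -log D(fake) w.r.t. (a, c) via the chain rule
    ga = -(1 - d_fake) * w * z
    gc = -(1 - d_fake) * w
    a -= lr * ga
    c -= lr * gc

final_gap = abs(c - 3.0)
print(f"generator mean moved from 0.00 toward the real mean 3.00; now {c:.2f}")
```

The same push-and-pull dynamic, scaled up to image-generating networks, is what lets a GAN learn to produce faces convincing enough to fool both the discriminator and, eventually, human viewers.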
Applications of deepfakes
Until now, the use of deepfakes has been largely limited to amateurs and some unsavory purposes.
The word deepfake is becoming synonymous with porn: the majority of deepfake videos available on the internet are fake celebrity sex tapes.
It started on Reddit, when a Redditor using the pseudonym "deepfakes" created multiple deepfake porn videos featuring the faces of famous actresses, like Wonder Woman's Gal Gadot. Reddit later reacted by banning what it calls "involuntary pornography".
As with much of the evolution of artificial intelligence, this technology can be harmful. Used against politicians and celebrities, it can cause irreparable damage. Worse still, audiovisual content will lose its credibility, since we won't be able to tell the real from the fake. If every video clip could potentially be fake, why believe anything is real?
Seeing is believing; that was the rule until now. Yes, we're familiar with special effects and CGI, but those were used in specific contexts, such as making us believe in supernatural events on screen. Now the goal may be purely personal or political. Today we hear about deepfakes used for revenge porn; tomorrow they could target a presidential candidate.
What’s the legal situation?
As explained in a Mashable article, legally speaking it's difficult to build a solid case for deepfake victims.
You can claim copyright if you shot the original footage yourself. For a defamation claim, the deepfake's creator has to assert that it really is you in the video.
If you're a celebrity, you can pursue a right-of-publicity claim if the creator is trying to make money from the video.
At the end of the day, all this talk about deepfakes is itself our last resort. Once people know these fake videos exist, it will hopefully be harder to use them to ruin someone's life.