DeepFakes have entered the arena of public interest in the form of a battle over censorship. Websites, most notably PornHub, have found themselves in the midst of a no-win shitstorm as a result of taking the proactive step of disallowing content digitally altered to look like it contains recognizable public figures. Their reasoning: it’s a form of non-consensual content. The individuals whose likenesses are being used ostensibly did not consent to having imagery made of them in this fashion.
On the surface the issue seems almost childish. The internet is in a tumult over naked videos of a few hot actresses, as if their inevitable sex tapes were not already on the way. Dig a little deeper, however, and you may find that the act of superimposing a neural-network representation of Scarlett Johansson’s face onto someone else’s body speaks volumes about human desire, and that the effort to prevent it says just as much about the landscape of our political world.
It starts with a simple human instinct. Men see the face of an actress in a movie, an actress famous because of her good looks, and they get to thinking about what the rest of her looks like. When the imagination and a few cartoonish artist renderings fail to fill the basest urge to mate with attractive women, the nerds get to work. A computer sifts through thousands of images of the actress’ face, piecing together what she really looks like. Then an unrelated video with a different kind of actress is selected, and the neural net paints over her face with its own idea of what a face should look like. If done correctly the effect is convincing enough to bring euphoria to the dry masses.
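The process described above, stripped of its sophistication, boils down to one shared encoder trained on two people’s faces, with a separate decoder for each person; swapping decoders at inference time produces the face transfer. A toy linear-autoencoder sketch of that idea (random arrays stand in for real face crops; actual tools use deep convolutional networks and far more data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for aligned, flattened face crops of two people.
faces_a = rng.normal(size=(200, 64))   # person A (the celebrity's face)
faces_b = rng.normal(size=(200, 64))   # person B (the body in the video)

LATENT = 16

# One shared encoder learns features common to both faces;
# each person gets a private decoder.
enc = rng.normal(scale=0.1, size=(64, LATENT))
dec_a = rng.normal(scale=0.1, size=(LATENT, 64))
dec_b = rng.normal(scale=0.1, size=(LATENT, 64))

def mse(x, enc, dec):
    """Mean squared reconstruction error of a linear autoencoder."""
    return float(np.mean(((x @ enc) @ dec - x) ** 2))

def train_step(x, enc, dec, lr=1e-3):
    """One gradient step on reconstruction error."""
    z = x @ enc                          # encode
    err = z @ dec - x                    # reconstruction error
    grad_dec = z.T @ err / len(x)
    grad_enc = x.T @ (err @ dec.T) / len(x)
    return enc - lr * grad_enc, dec - lr * grad_dec

loss_before = mse(faces_a, enc, dec_a)

# Train both autoencoders, sharing the encoder between them.
for _ in range(500):
    enc, dec_a = train_step(faces_a, enc, dec_a)
    enc, dec_b = train_step(faces_b, enc, dec_b)

loss_after = mse(faces_a, enc, dec_a)

# The swap: encode person B's frames, decode with person A's decoder,
# yielding one A-styled face per input frame.
swapped = (faces_b @ enc) @ dec_a
```

Because the encoder only ever learns what the two faces have in common, the swap inherits B’s pose and expression while the decoder repaints A’s identity on top.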
Ironically, the desire to see more than just Scarlett Johansson’s face has led us down a very complicated and winding road leading full circle to placing just her face on anything but her body. The effect of the compilation is more desirable than its parts. It is the combination of both, the face and the context it is in, that makes DeepFakes desirable.
Highlighting this division brings up the question: what part of a DeepFake is non-consensual? All the source material is obtained legally. The faceless woman agreed to be filmed and was compensated. The images that teach the algorithm are freely available. The final output of the process is a likeness but not a replication. It’s a machine’s interpretation, not a camera’s capture of a real person. If the process were a little less sophisticated, no one would care whose images were used to produce it.
It’s only now that the process is good enough, that the likeness is close enough, that there begins to be concern. That concern does not rise from the fear that people will be tricked into thinking the videos are authentic. The videos boldly advertise themselves as DeepFakes, and all censorship has been directed towards the videos that represent themselves as such.
Instead, the concern rises from the fact that these videos fulfill the desire to place the likeness of a carefully maintained public personality in a sexual context. They show the people what they want in a manner that bridges the gap between fabrication and reality so completely that their imaginations are satisfied. They are the truest form of porn, adulterating not only the visual image of a person, but the idea of them as well.
Imagine a world where DeepFakes are used for political purposes. Not to seriously convince people of something untrue, but to play with the idea of a person in a fabricated context. A politician could be placed in endless scenarios, fall victim to the imaginations of his enemies, and viewers at home would join in the fantasy.
We already exist in a world that substitutes repetitious talking points and suggestive impressions for facts. DeepFakes will be the nuke in that arsenal soon enough.
That leaves one question: why the backlash? Why the push to bury these things? The answer is that nukes are no good if everyone has them. This technology is cumbersome and hard to use right now, but it’s going to become freely available soon, and when it does the only thing mitigating the damage will be the censorship measures put in place today.