Deepfakes Have No Tech Solution

Funding technological fixes for algorithmically generated fake videos obscures the deeper problems of consent and media literacy.

Google Alerts regularly emails me a list of the latest articles containing the phrase “deepfake.” The stories alternate between speculating that deepfakes could start a war and examining Hollywood's latest bizarre application of face-swapping technology. It's a media frenzy right at home in the rest of 2018. Yet this coverage frequently ignores what we should be most concerned about: a society in which masses of people are taken in by a video of an event that never happened, and have their false beliefs reinforced by it.

In the nine months since Motherboard discovered a man with the username “deepfakes” posting face-swapped, algorithmically generated porn on Reddit, the rest of the world has rushed straight for the literal nuclear option: if internet nerds can create fake videos of Gal Gadot having sex, they can also create fake videos of Barack Obama, Donald Trump, and Kim Jong Un that somehow spark an international incident ending in nuclear war. Citing these potentially dangerous political ramifications, the US government is funding research into automatically detecting fake videos.

In April, the Media Forensics division of the US Defense Advanced Research Projects Agency (DARPA) awarded the nonprofit research organization SRI International three contracts to develop methods for automatically detecting manipulated digital video. DARPA also funded researchers at the University at Albany to study deepfakes. They found that one way to tell a deepfake from a genuine video is to examine the blinking: because the generators are trained mostly on still photos of people with their eyes open, the faces in early deepfakes blink far less often than real people do.
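
The blink cue is simple enough to sketch. A common proxy from the facial-landmark literature (not necessarily the exact method used in the DARPA-funded work) is the eye aspect ratio, or EAR: the ratio of an eye's height to its width, computed from six landmark points, which dips toward zero whenever the eye closes. A real face should produce those dips every few seconds. Here is a minimal sketch, assuming per-frame eye landmarks have already been extracted with a tool such as dlib or MediaPipe, and using illustrative thresholds rather than published cutoffs:

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """EAR from six (x, y) landmarks, ordered as in the standard
    68-point facial landmark scheme: high when the eye is open,
    near zero when it is closed."""
    vertical = (np.linalg.norm(eye[1] - eye[5]) +
                np.linalg.norm(eye[2] - eye[4]))
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def count_blinks(ear_per_frame, closed_thresh=0.21, min_frames=2):
    """Count blinks in a sequence of per-frame EAR values: a blink is
    a run of at least `min_frames` consecutive frames below threshold."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < closed_thresh:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks

def looks_suspicious(ear_per_frame, fps, min_blinks_per_minute=5.0):
    """Flag a clip whose blink rate is implausibly low for a real face.
    People typically blink roughly 15-20 times a minute; the cutoff
    here is an illustrative assumption."""
    minutes = len(ear_per_frame) / fps / 60.0
    rate = count_blinks(ear_per_frame) / max(minutes, 1e-9)
    return rate < min_blinks_per_minute
```

A detector this simple is also easy to defeat: once the tell is public, the next generation of models can be trained on footage that includes blinking, which is why detection research is a treadmill rather than a cure.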

Though it makes for a compelling worst-case scenario, the fear that deepfakes might one day start a nuclear war overshadows today's crucial concerns around consent, media literacy, bodily autonomy, and digital self-ownership. These problems are neither hypothetical nor far-fetched; deepfakes are already making them worse. Could someone fabricate a video of President Trump declaring war on North Korea and endanger all of us? Maybe. But that most extreme outcome, the end of humanity, is getting more attention than respect for women's bodies, or the question of why the people making deepfakes felt entitled to use women's images without their consent in the first place.