It’s not male politicians whom “deepfake” videos target: it’s women turned into pornography

Deepfakes should be a concern for politicians, but for a range of reasons. Photo: PA

This morning a video of Boris Johnson was uploaded to Twitter. The leader of the Conservatives leans earnestly into the camera as he endorses “his worthy opponent,” opposition Labour leader Jeremy Corbyn. “Only he, not I, can make Britain great again,” says Johnson, before explaining why his voice sounds slightly disembodied. “I am a fake, a deepfake to be precise.”

The stunt by the think tank Future Advocacy, which also released a video of Corbyn, is designed to draw attention to an unregulated world of synthetic video ready to manipulate our elections.

While the technology could be used to manipulate elections, it hasn’t been yet. Video trickery in politics has so far avoided outright fabrication. The Conservatives’ edited Keir Starmer interview and the video of American Democrat Nancy Pelosi, slowed down to make her speech sound drunk, are both examples of slippery editing, or “shallow fakes,” but they are not deepfakes.

“My general view is that we still have a way to go before deepfakes become a real problem in elections, and at the moment we should be more concerned about ‘shallow-fakes’ and the many—many—other problems that the digital environment poses to our existing electoral systems,” Martin Moore, author of Democracy Hacked: Political Turmoil and Information Warfare in the Digital Age, told Prospect.

That reality has not stopped artists, AI companies and researchers from producing their own deepfakes to warn of what could be coming. But focusing entirely on the electoral disruption deepfakes could cause distracts from an issue that has already arrived. The technology is not yet a problem in politics, but it already is somewhere else: pornography.

When Deeptrace, a cyber-security start-up based in Amsterdam, embarked on its yearly study of deepfake videos online, it found the number had almost doubled since last year, reaching 14,678. Some 96 per cent of those videos featured pornography, all of it targeting women.

Danielle Citron, a law professor at Boston University, calls deepfake pornography made without consent “an invasion of sexual privacy.” In a WIRED article, she said this was not new; rather, it shows how misogyny evolves with technology. “At each stage we’ve seen that people use what’s ready and at hand to torment women. Deepfakes are an illustration of that.”

Right now, 99 per cent of that pornography features women working in entertainment: British or American actresses and South Korean musicians. There’s a technical reason for that, says Deeptrace CEO Giorgio Patrini, speaking over the phone: “Even today, people need a lot of material [to create a deepfake]: video or hundreds of photos. With celebrities, there’s a lot of footage of them, which makes this technology much easier.”

Scarlett Johansson, who has been a target, commented last year on the fake porn circulating online. “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired…” she told the Washington Post.

“The internet is a vast wormhole of darkness that eats itself,” she added.

While some deepfake porn is hosted on specialist sites or forums, mainstream porn providers are also guilty. In February 2018, Pornhub, the world’s biggest porn site, told Motherboard it considered deepfakes to be “non-consensual” and would remove “all said content as soon as we’re made aware of it.” But as of today, deepfakes featuring famous actresses have been up on Pornhub for months, with some of them clocking up more than one million views each.

Pornhub did not respond to Prospect’s request for comment by the time this article went to print.

As the technology evolves, it’s getting easier and easier to create deepfake porn. “Five years ago, it would have been more or less impossible for anyone except professionals or researchers,” says Deeptrace’s Patrini, “but we are now at the level where people who know even a little bit about how to program, even college students studying computer science, can make fairly realistic videos after just a few weeks of experience with this type of software.”

One app could “undress” a photograph of a woman

For those with no programming skills, cash can be a substitute. Patrini notes that there are deepfake freelancers out there making videos for clients; for some companies, that’s the business model. Earlier this year, an app called DeepNude allowed users to “undress” a photograph of a clothed woman.

The technology’s evolution from niche to easy-to-access has sparked fears that ordinary women could suffer the same treatment as celebrities. Already there has been one public case in Australia, where the face of 18-year-old Noelle Martin was featured in a deepfake video uploaded to various porn sites. At the time Martin, who is now an activist, was a law student with no public profile.

Deepfake videos join a list of other online tools enabling harassment, from so-called “revenge porn” to stalkerware used to spy on partners and smart home tech turned into instruments of domestic abuse. Yes, deepfakes could cause election chaos, but the technology is already being used to degrade women. That should also be a concern for politics.