An application called “FakeApp” makes it possible for people to create fake porn using the faces of their favorite celebrities.
FakeApp: Oh look, a new way to exploit women.
In the wake of the reckoning in Hollywood and other industries over sexual abuse comes a new violation: near-convincing fake pornography built from celebrities’ faces. Even people without a background in computer programming can, in a way, live out their fantasies with an easy-to-use app.
The future is indeed here. But did we anticipate that it would be this disturbing?
It all started with a Reddit user posting under the handle deepfakes. He had already made a treasure trove of convincing pornographic videos with the faces of celebrities like Gal Gadot and Taylor Swift edited onto performers’ bodies. All he needed was a machine learning algorithm, his home computer, and a stock of publicly available videos. Armed with those, and perhaps a thirst for something not readily available, he was able to make it look as if those celebrities were doing things they had never actually done.
A screenshot of a video made with FakeApp, featuring Daisy Ridley
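For the technically curious: reporting on the original deepfakes posts describes the underlying technique as an autoencoder with a single shared encoder and a separate decoder trained for each face. What follows is a minimal, illustrative PyTorch sketch of that idea, not FakeApp’s actual code; the network sizes, the 64x64 face crops, and the random tensors standing in for real training data are all assumptions made for the sake of the example.

# Illustrative sketch only (assumed architecture): a shared encoder learns a
# common face representation; one decoder per identity learns to reconstruct
# that person's face. The swap: encode person A, decode with person B's decoder.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256),  # shared latent face code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(256, 128 * 8 * 8),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per person
loss_fn = nn.L1Loss()
params = (list(encoder.parameters()) + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=5e-5)

# One training step; random tensors stand in for batches of aligned face crops.
faces_a = torch.rand(8, 3, 64, 64)  # person A's faces
faces_b = torch.rand(8, 3, 64, 64)  # person B's faces
loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
        + loss_fn(decoder_b(encoder(faces_b)), faces_b))
opt.zero_grad()
loss.backward()
opt.step()

# The swap: person A's expression and pose, rendered with person B's face.
swapped = decoder_b(encoder(faces_a))

Trained long enough on two well-aligned sets of faces, decoding person A’s latent code with person B’s decoder produces B’s face mimicking A’s expression and pose, which is the entire trick.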
Since discovering deepfakes’ work, a number of other people have begun creating their own fake porn with machine learning. Deepfakes has even created his own subreddit, which has amassed over 43,000 subscribers. Another user, posting under the handle deepfakeapp, created FakeApp, which makes generating fake porn easy for anyone so inclined. All the tools are free, and there’s even a set of instructions to walk users through the process. With the app and one or two high-quality videos of a chosen celebrity or public figure, anyone can produce realistic-looking porn.
Experts had said that, at least for the next year or two, fake porn videos would remain fairly easy to spot for anyone who bothered to look closely. Yet it took only a couple of months for convincing fakes to surface. As more people experiment with the app, some of the videos they’ve come up with could fool viewers who don’t know they’re fake, and some observers believe it will soon be impossible to tell whether a video is fake at all.
So is this exploitative? Yes. Even users subscribed to the deepfakes subreddit know that what they’re doing isn’t something they can tell their parents about at the dinner table. According to one user, their work is “derogatory, vulgar and blindsiding to the women that deepfakes works on.” Still, the same user claims that the technique has applications beyond the realm of fake, exploitative pornography.
Seems like Hollywood has a new problem on its hands.
“The work that we create here in this community is not with malicious intent. Quite the opposite,” according to Reddit user Gravity_Horse. “We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design.”
Whatever these users might say to elevate what they’re doing above its inherent creepiness and exploitativeness, their intentions don’t matter in the greater scheme of things. The celebrity porn site CelebJihad, for example, has already uploaded a deepfake of Emma Watson while implying that the video is real rather than the product of artificial intelligence. FakeApp users control how their fakes are made, but not what happens to them once they’re put on the internet.
In the era of fake news, when the truth is already difficult to dig out, it’s technology like this that has the potential to wreak havoc. First it’s Emma Watson showering with another girl; next it’s Donald Trump being urinated on by a pair of Russian prostitutes. The technology certainly holds potential for great things, but it will likely make a mess before its powers are harnessed for anything productive.