Most people familiar with "face-swapping" know it as an innocuous social media feature. An algorithm captures a person's face and pastes it onto someone else's. The result is rarely seamless and often funny.
But as it grows more sophisticated, the technology has taken a sinister turn: It has become easier to superimpose the faces of celebrities onto those of actors in pornographic films, resulting in highly realistic fake videos.
Deepfakes, as these videos are called, take their name from the Redditor "deepfakes," the first person known to create them. Celebrities Daisy Ridley, Gal Gadot and Taylor Swift are among the early victims.
Samantha Cole, an editor at Motherboard who first reported on the trend, tells NPR's Scott Simon that the videos are created with a machine-learning algorithm trained on hundreds of photos of an individual's face.
"Someone takes a dataset of one person's face — and a lot of pictures of that person's face — and then a video that they want to put it on," Cole says. "And they run a machine-learning algorithm, train it on these two images, and after a few hours, gives you the result, which is, these very realistic, fake porn videos."
So while well-photographed actors and actresses are easy targets, the technology is quickly becoming more advanced and accessible, and people with not-so-famous faces are worried about where they might show up online.
That's the talk of Reddit threads right now, Cole says: whether this can be done with people users know, using images scraped from Facebook or Instagram. "It's definitely possible, if you have enough images of someone."
In fact, a new, user-friendly tool called FakeApp democratizes the technology, letting anyone generate fake videos with their own datasets. Deepfakes enthusiasts have been using it to insert the Internet's favorite face, Nicolas Cage's, into movies.
This quickly advancing technology has been outpacing the law, though. "It's all very hazy right now," Cole says. "Celebrities could sue for misappropriation of their images, like when you use a celebrity's face for an ad without their permission. But the average person has little recourse. Revenge porn laws don't include the right kind of language to cover this kind of situation."
Similar technologies have already stirred fears. Last year, journalist Nick Bilton considered the implications of combining these new manipulation tools. He pointed to a video demonstrating technology, developed by researchers, that allowed them to manipulate the facial expressions of world leaders, including President Trump and Russian President Vladimir Putin.
If fabricated text-based stories can snowball into events like PizzaGate, he suggested, imagine the consequences that could play out on an international stage when bad actors latch onto this technology.
"Celebrities and porn performers are two groups of people that have lots of images of themselves publicly so they're easy targets for this, but so are politicians," Motherboard's Samantha Cole says. "It's going to be difficult trying to suss out all of this in an era of fake news."
SCOTT SIMON, HOST:
Most people familiar with face-swapping know it as a harmless, fun feature on social media apps. An algorithm captures a person's face and places it on somebody else's head. The result is rarely seamless, and often it's pretty funny.
But face-swapping has recently been used to superimpose the faces of celebrities into pornographic films. This isn't just alarming for actors and actresses who appear to perform in movies they never made. Because the technology is more advanced and accessible, not-so-famous faces are worried where they might show up online. Is face-swapping a dark sign for online identities?
Samantha Cole is an editor at Motherboard and has been covering this. Thanks very much for being with us.
SAMANTHA COLE: Sure. Thanks for having me.
SIMON: You've seen one of these, right?
COLE: Yes. I've seen probably dozens, if not a hundred, of them by now.
SIMON: Well, you tracked down and interviewed a Reddit user who goes by the name of Deepfakes who, I guess, has created three adult films with celebrity faces, yes?
COLE: He's created probably a lot more than that, to be honest. He was the person who first posted one of these on Reddit, and his name has become the name for this form of video - these fake porn videos.
SIMON: How does it work?
COLE: So basically, it's generated using a machine-learning algorithm. So someone takes a dataset of a lot of people's - or one person's face and a lot of pictures of that person's face, and then a video that they want to put it on. And they run a machine-learning algorithm, train it on these two images. And after a few hours, it gives you the result, which is these very realistic fake porn videos.
SIMON: So hypothetically, could you take somebody's photos or videos off their social media feeds and put them into adult films?
COLE: So yes. Hypothetically, it's definitely possible if you have enough images of someone. It's not something that we've seen happen yet. But as quickly as this technology is moving, it's definitely possible.
SIMON: Is it legal? Or does anyone care?
COLE: (Laughter) I think both sides care quite a bit - the people making them and the people who are the targets of them. The legality is honestly in a very gray area. It's all very hazy right now. We're not really sure what to make of it. Celebrities could sue for misappropriation of their images, like when you use a celebrity's face for an ad without their permission - things like that. But the average person has little recourse, honestly. Revenge porn laws don't include the right kind of language to cover this situation because it's a mashup of two things.
SIMON: Yeah. Revenge porn is when someone takes an intimate film of someone, and they don't have their permission.
COLE: Exactly. Yeah. So this is not quite that. And that's creating a lot of problems legally and a lot of questions of how we're going to handle this.
SIMON: I have to tell you my biggest worry as a citizen is not porn but that somebody might put somebody's face - let's say - at a crime scene or in some other - you know, at a rally that you never attended or something like that.
COLE: That's definitely possible, and that's something that we're thinking about. It's splashy right now because it is porn. And celebrities and porn performers are two groups of people that have lots of images of themselves publicly out there, so they're easy targets for this. But so are politicians, you know, anyone who's on TV or on the Internet, showing their face quite a bit.
SIMON: And what about regular citizens who just have a lot of pictures and videos on social media sites? Could they be victimized, too?
COLE: I mean, it's theoretically definitely possible. You would need hundreds of pictures of someone. It's worth taking a look at your privacy settings and thinking about how you use the Internet and whether or not you're sharing your face in all these private forums.
But then again, that puts a lot of pressure on users to - for them to kind of self-regulate over platforms. And those are the ones that really need to be accountable for taking care of the people who are using these platforms and kind of regulating how people are using them and hoping that they're not for harm.
SIMON: I mean, if the solution is just don't put pictures or videos on social media platforms, that also kind of destroys the utility of social media platforms, doesn't it?
COLE: Sure. And that's definitely not - that's not what I'm saying. I'm not saying don't put pictures of yourself out there. That's an extreme solution to this. The better solution would be to have more stringent laws around revenge porn, ownership of our own images, more responsive platforms who act quickly and serve their users better.
Yeah. It's - right now, it's just easier to say think twice about your privacy settings because that's all we can do. That's all we have control of right now.
SIMON: Samantha Cole at Motherboard, thanks so much for being with us.
COLE: Thank you for having me.

Transcript provided by NPR, Copyright NPR.