Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Stuff You Should Know, a production of iHeartRadio's How Stuff Works. Hey, and welcome to the podcast.
I'm Josh Clark. There's Charles W. Chuck Bryant. Here's our
brand new producer from now on, Josh. Brother Josh. Have
you said your last name, Josh? We don't do that. Okay,
(00:23):
sometimes we say Jerry's. Jerry quit? She really did, but
she did it, like, kind of quiet, silent — weird for her. She just stopped showing up. Yeah, Jerry
didn't quit, everyone. We don't think. We're not... it's not
entirely certain until I see her sitting in that chair. Until then,
I'm assuming she's quit. What if, Chuck, she sent us a
(00:44):
video of herself saying, I quit, I'm so sick of
you guys, I'm done with this forever. Would you believe
it then? But the lips didn't quite match up. Then
it would be a deepfake. It would be a
deepfake. I saw a tweet. I don't remember who
it was, but it was maybe Ryan Lizza or somebody,
complaining, saying, why do we call these things deepfakes?
(01:07):
And somebody schooled them on it. It was kind of
nice to watch. Who did you say? Ryan Lizza — I think
he's like a CNN correspondent, a journalist. Uh, so, first of all,
we want to issue a CYA here, uh,
that maybe kids shouldn't listen to this one. We're
talking about some real, everyday dark, harmful stuff. Yeah,
(01:33):
really despicable, gross stuff. The only thing I can think
of that would be worse than covering this would be
to do one on, like, snuff films. I kept thinking
of that while I was reading this. I don't know, man,
I don't want to. I don't want to pit types
of despicable media against one another. But um, I think
revenge porn might have a leg up on this, which
(01:55):
this sort of is as well. Right, it's definitely a
close cousin of it at least. Yeah, but this one
ain't for kids. Uh. And I was shocked and dismayed
because I didn't know about this. And when I saw
it was a podcast on fake videos, I thought, well, how fun,
because I love those videos of David Beckham kicking soccer
(02:16):
balls into the cans from a hundred yards out on
the beach. That's for real. Uh, no. Yeah, and it's
just coincidence he's holding a Pepsi can up. I saw
it with my own eyes. I thought that's what this
was about. I was like, oh, those are fun. It
is, kind of, but then I wanted to take a
bath after this. I can understand. So we
(02:36):
should probably start out, after the CYA, by saying what
a deepfake is. A deepfake, D-E-E-P-F-A-K-E, all one word, is a
type of video where somebody is saying or doing something
that they never actually said or did. What's that you say? Okay,
this is nothing new. This has been around for a while,
(02:56):
Like, people have doctored photos and videos and stuff like
that for basically as long as there have been videos.
CGI, sure. This is different. This is
in the same ballpark, but this is an entirely different league.
Like, this league plays on Saturday and Sunday afternoons, not
Tuesday night, you know what I mean? Like, this is
something totally... Just let it simmer for a little while
(03:18):
and you'll be like, wow, it's a really good analogy. Uh,
this is just... it's different. It has a lot
of the same principle, but because they are so
realistic, and they're getting more and more realistic by the day,
they actually, in a lot of people's minds, pose
a threat not just to individual people, as we'll see,
but possibly to society at large, say a lot of
(03:39):
people who are really worried about this kind of stuff. Yeah,
and we're not talking about, I'm assuming, the fake lip
reading thing. That's a deepfake, right? Or is that just
no manipulation of video whatsoever, and that's just people using
their voice? So what that is is... yeah, it's
just somebody pretending. They're just fake lip reading and
(04:02):
then doing a voiceover. They're not. They're not manipulating the
video at all. Okay, No, they're just doing a really
bad job of lip reading, which is hysterical. They
are hilarious. I would put those up with the
G.I. Joe PSAs. Pork chop
sandwiches. Like, those are just all-time classics. I can watch
them anytime and still laugh. Have you ever seen the
(04:23):
G.I. Joe action figures on, like, dead roadkill?
It's sad because it's a dead animal, but there'll be, like,
you know, a dead squirrel in the road, and someone
will pose, like, a G.I. Joe action figure with
his foot on its head, like it's a trophy. Yeah, big game,
it's kind of funny. Yeah, I can see that. It's
let's put it this way, it's as funny as you
(04:44):
can make a picture of a dead animal that
I assume got hit by a car. You hope. But
maybe they are, like, killing squirrels. Just after reading this,
I don't doubt anything. And it makes me hate the
Internet even more. All right, so let's get into
this a little bit. Okay, Chuck, calm down. We're not
allowed to have personal positions on these things, so this
(05:06):
is totally a neutral thing. Okay. So, um, there's this
really interesting Gizmodo article that talked about the history
of, kind of, um, not necessarily deepfakes, but altering videos,
like presenting a doctored video as reality. And apparently
there was a long tradition of it at the beginning
of cinema, when people got their news from newsreels, like
(05:28):
you actually go to a movie theater to see the
news, because you were just an early-twentieth-century yokel
living in Kansas or something like that. Yeah. And after
reading this bit, I thought that was a very Gizmodo
way to say, here's one not-so-interesting fact that
really has not much to do with this. Oh, I
love it, really. I personally selected it and put it
(05:48):
in here, and I thought it was kind of funny.
I think it's great. Uh, yeah, they used to fake,
uh, real-life events and recreate them. Don't try to backpedal.
And that has nothing to do with deepfakes. It does,
because one of the big problems, or threats, from
deepfakes is it's a way of seeing what you
think is news, but it's not. It's a sham.
(06:09):
It's recreated. Yeah. The difference I see is they were
recreating real news events, and just, like, here, you didn't
see it, so this is what it may have looked like.
But they were passing it off as real. Therein lies
the tragedy of all times. I thought it was a
very thin Gizmodo... whatever, we'll edit this part out. Webster's
defines deepfake... I like it. I put it in
(06:31):
there specifically because I thought it was good. That's all right. Okay,
so we'll take another take. Okay, why don't we just
talk about deepfakes? So, Chuck, let's talk about deepfakes.
We can just cut there. So deepfakes actually are
super new. Yeah. And the reason they're called deepfakes
is because in late 2017, I think November,
(06:52):
this guy who was a redditor, a guy who posts
on Reddit — that's your first warning sign. Not necessarily; Reddit's
pretty sharp and smart and has got some good ideas
going on. Of all the social media platforms,
I'd throw my two cents in with Reddit. But there
was a redditor called deepfake, D-E-E-P-
F-A-K-E, all one word, and he said, hey, world,
(07:14):
look at what I figured out how to do. And
he started posting pornography, but with celebrities' faces transposed onto it,
and he said, this is just my hobby, but here's
how I did it. And he said that he used —
I'm assuming it's a him; I don't know if it's
a man or a woman, I'm gonna go with a man —
(07:35):
um, and he said, I just used Keras and TensorFlow.
And these are a couple of, um, basically open-source
AI programs that this guy was smart enough to
figure out how to use to train, to create, um,
these videos where you take a celebrity's face and put
it on a clip from a porn movie, and it
(07:55):
looks like the celebrity is doing what you're seeing. And
at first it was kind of hokey and not very...
it was very obviously not real. Yeah, I think the
scary part was how quickly and easily it could be done.
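For a rough sense of what "I just used Keras and TensorFlow" can mean in practice, here is a minimal sketch of the face-swap recipe widely described from that era — one shared encoder with a separate decoder per person. The layer sizes and the random placeholder photos are illustrative assumptions, not the redditor's actual code.

```python
# Sketch of the shared-encoder / two-decoder autoencoder idea (illustrative only).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def make_encoder():
    # Compresses a 64x64 face crop into a 256-number description of pose/lighting.
    return tf.keras.Sequential([
        layers.Conv2D(32, 5, strides=2, padding="same", activation="relu",
                      input_shape=(64, 64, 3)),
        layers.Conv2D(64, 5, strides=2, padding="same", activation="relu"),
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
    ])

def make_decoder():
    # Paints one specific person's face back from that 256-number description.
    return tf.keras.Sequential([
        layers.Dense(16 * 16 * 64, activation="relu", input_shape=(256,)),
        layers.Reshape((16, 16, 64)),
        layers.Conv2DTranspose(32, 5, strides=2, padding="same", activation="relu"),
        layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid"),
    ])

encoder = make_encoder()
decoder_a, decoder_b = make_decoder(), make_decoder()   # one decoder per person

faces = tf.keras.Input(shape=(64, 64, 3))
autoencoder_a = tf.keras.Model(faces, decoder_a(encoder(faces)))
autoencoder_a.compile(optimizer="adam", loss="mae")

# Placeholder training data; a real attempt would use many aligned face crops,
# and autoencoder_b would be built and trained the same way with decoder_b.
photos_of_a = np.random.rand(8, 64, 64, 3).astype("float32")
autoencoder_a.fit(photos_of_a, photos_of_a, epochs=1, verbose=0)

# The swap itself: encode person A's frames, then decode them with person B's decoder.
swapped = decoder_b(encoder(photos_of_a))
```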
Motherboard, who we used to write for every now and then.
(08:15):
Remember that? I tried to forget. They tried to forget,
for sure. Yeah. They hit us up, what, like, I
feel like seven or eight years ago — I'm trying to
forget and you're making it really hard — and said, you
guys want to write some blogs for Motherboard? We said, sure,
so we did. We wrote ten. Yeah, you
can probably go find those on the internet if you
want to learn how to, like, drive a stick shift
or something. The fine people at Motherboard scrubbed those from
the internet, let's hope. So, uh, this deepfake
character figures this out. Another guy released a downloadable desktop
software that said here, you can do this awful thing too. Right,
within, like, two months of deepfake coming out
and saying, look what I did and here's how I
did it, somebody said, that's a really good idea, I'm
going to turn it into an app and
(08:59):
give it to everybody. That's right. And now, uh, people
can... You know, at this time — this was a very
short time ago — people... it's really come a long
way in the past, whatever, not even two years, because
it was late 2017, right, early 2018, when it really
first popped up. Yeah. So this thing was downloaded a
hundred thousand times in the first month alone, and some
(09:21):
people used it for fun stuff, like, uh, putting Nic
Cage in movies he wasn't in. Yeah, those are called
derpfakes. Yeah, they've all got fun names, don't they? Dude,
Nicolas Cage as Yoda is patently, objectively hilarious. I didn't like
that one. I thought the, I don't know, the Raiders
of the Lost Ark thing was interesting, I guess, but none
(09:44):
of them made me laugh. Like maybe I just don't
have that kind of sense of humor. Yeah, but yeah,
I just never was like, oh my god, that's hysterical,
it's Nic Cage's face. I understand. I understand where you're
coming from. I don't think I was, like, you know,
in stitches or anything like that. But it's pretty great. Okay. Okay,
it's just not my thing. You're not a Gizmodo reader,
(10:04):
are you? No, none of this is my thing. But
that doesn't mean we can't report on it. However, since
it started happening, it became pretty clear pretty quickly
that this could be a bad thing in the future,
and not just for putting your ex-girlfriend's face on
a sex video, you know, saying, look what she did.
(10:26):
You could put a world leader up there
and, uh, really cause a lot of problems. Yes, hypothetically
you could. And that's really, as we'll see — this
new technology, this deepfake technology, it poses at
least two immediately obvious risks, and they're hyper-
(10:46):
individualized and hyper-macro-social risks, but they both
stem from the same thing, from the same root, or
same seed, to keep the metaphor going and on track.
So, let's talk about the technology behind this, because
this stuff is just totally fascinating. Surely you agree. It
is AI. It was created by a guy named Ian Goodfellow —
just this particular type of AI. He didn't make the
deepfake stuff. No, no, no. But basically, with this
model — you know, everyone knows AI is basically when you
teach a machine to start teaching itself; it starts learning on
its own, which is a little creepy. Um. But the
(11:29):
model that they're using these days is called an artificial neural
net, which is, uh, machine learning. And basically what they've
done in this case is, all you have to do
is show something a lot of data for it to
start to be able to recognize that kind of data when you
aren't showing it that data. Yeah, and it learns on
its own. The classic example is, um, AI
(11:50):
that can pick out pictures of cats, and it's easy
enough, but you don't tell the AI, here's what a cat is,
find pictures of cats in this data set. It's, here's
a bunch of stuff, figure out what a cat is —
and they get really good at picking it out. You
can also turn it the opposite way, once you have
an AI trained on identifying cats, and get it to
(12:11):
produce pictures of cats, but they're usually terrible and often
very, very bizarre, like anyone would look at it and
be like, a human didn't make this; it's just off
in some really obvious ways. And what Ian Goodfellow
figured out was a way around that problem.
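A minimal sketch of that "here's a bunch of stuff, figure out what a cat is" setup, assuming the TensorFlow/Keras tools mentioned earlier in the episode; the tiny network and the random placeholder arrays are illustrative stand-ins, not a real cat detector.

```python
# Sketch: a small classifier that learns "cat / not cat" purely from labelled examples.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # probability the picture contains a cat
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stand-in data: a real run would feed thousands of photos labelled 1 (cat) or 0 (not).
images = np.random.rand(128, 64, 64, 3).astype("float32")
labels = np.random.randint(0, 2, size=(128, 1))
model.fit(images, labels, epochs=1, batch_size=32, verbose=0)
```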
Yeah. So, uh, I'm not sure I agree with his wording here, um,
(12:34):
but we'll say what he calls it. Uh, he
set up two teams: one is a generator and
one is a discriminator, and he calls it a generative adversarial network.
So basically his contention is that these two are adversarial.
I saw it as more managerial in nature. Okay, bureaucratic. Yeah,
(12:57):
I mean, isn't that what it felt like to you? The
discriminator is like, yeah, I'm gonna need you
to come in on Saturdays. It kind of felt like it.
So you've got these two networks and they're both trained
on the same data set, but the generator is the
one that's producing these fake cats, and then there's a
discriminator, or what I like to call a manager, saying,
(13:18):
these look good, these don't look so good. Right. The
other way, the way that Goodfellow has proposed it,
is that the discriminator is going through and looking at
these generated pictures and trying to figure out whether each
one is fake — the generator created it — or whether
it comes from the real data set. And based on the
(13:40):
feedback that the manager gives the generator, um, the generator
is going to adjust its parameters so that it gets
better and better at putting out more realistic pictures of cats. Yeah.
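A minimal sketch of that generator-versus-discriminator loop — the "forger and appraiser" — again assuming TensorFlow/Keras; the tiny layers, learning rates, and random stand-in images are illustrative, not Goodfellow's actual setup.

```python
import tensorflow as tf
from tensorflow.keras import layers

latent_dim = 32

# Generator ("the forger"): turns random noise into a fake 28x28 image.
generator = tf.keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=(latent_dim,)),
    layers.Dense(28 * 28, activation="sigmoid"),
    layers.Reshape((28, 28, 1)),
])

# Discriminator ("the appraiser"): guesses whether an image came from the data or the forger.
discriminator = tf.keras.Sequential([
    layers.Flatten(input_shape=(28, 28, 1)),
    layers.Dense(128, activation="relu"),
    layers.Dense(1),  # logit: real vs. fake
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

def train_step(real_images):
    noise = tf.random.normal((real_images.shape[0], latent_dim))
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fakes = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fakes, training=True)
        # The appraiser is rewarded for calling real images real and fakes fake.
        d_loss = bce(tf.ones_like(real_logits), real_logits) + \
                 bce(tf.zeros_like(fake_logits), fake_logits)
        # The forger is rewarded when its fakes get scored as real.
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return g_loss, d_loss

# Placeholder batch; a real run would use photos of the thing being faked.
batch = tf.random.uniform((16, 28, 28, 1))
for step in range(3):
    g_loss, d_loss = train_step(batch)
    print(step, float(g_loss), float(d_loss))
```

The two loss lines are the whole point the hosts are circling: the discriminator is scored on telling real from fake, the generator is scored on getting its fakes called real, and that feedback is what nudges the generator's parameters toward more convincing output.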
I don't get the adversarial part, unless it gets mean
in how it delivers that message. The way I saw it,
the reason they call it adversarial is, like, um,
(14:00):
it's like an art forger and a detective — or an
appraiser is a better way to put it. The art
forger is putting out forged art, and the appraiser is
like, this is fake, this is fake, this is fake.
Well, I'm not sure about this one.
I don't know if this one is fake. This is real,
this is real, this is real. And then at that point,
(14:21):
the generator has become adept at fooling an AI that's
trained to identify pictures of cats — creating pictures of cats
that don't exist. Okay, it's adversarial. The generator's
trying to fool the discriminator, and the discriminator's trying to
thwart the generator. That's the adversarial part. But in the
end they're really on the same team. Yeah. Okay, I
(14:44):
guess that's where it loses me. You have a really
positive image of corporate America. What does that have to do
with anything? The manager's on the same team as everybody.
Come on, get on board. Get on the trolley, Josh. So,
if you want to look up Mona Lisa talking, there
was a video this year from Samsung that showed how
you could do this, and all the stuff is on YouTube.
(15:07):
If you want to see Nic Cage as Indiana Jones,
which is pretty funny, or as Yoda, which is hilarious, or
if you want to see the Mona Lisa talking — it
looks fairly realistic, like, hey, they brought that painting
to life. Right, yeah. Yeah, and if you scroll down
a little bit, they did one with Marilyn Monroe too;
they brought her to life. They did. Interesting. Well, I
(15:28):
mean, this is, like, just set up for TV commercials. Yes,
they've already done stuff like this, right? This is like
Fred Astaire dancing with a Dirt Devil, or whatever that was.
What this could bring is creating entirely new movies and
bringing back dead actors and actresses — or I guess just
(15:49):
actors now, these days. Right. Uh, well, I mean, you've
seen some of them. Are you talking about the de-aging, or
people just creating, like, bringing back someone that's
been long dead? No, no, I'm saying, like, you call
actors and actresses just actors these days. Oh, that part.
Uh, you can do whatever you want. Okay. Uh, motion
(16:09):
picture performers, maybe even television performers — but bringing them back
and giving them, like... they could star in an entirely
new movie, because it's so realistic and lifelike. Yeah. Um, they're not
at that point yet, because they're just now getting to
where the de-aging looks decent, depending on who it is. Uh,
like the Sam Jackson stuff in Captain Marvel, uh, my
(16:32):
friend, looked pretty good. It looked amazing. Yeah. And the
Will Smith stuff in this new Ang Lee movie looks really good.
What's that one? It's an Ang Lee movie where he plays
some sort of assassin that, uh... Aladdin? Yeah, that's it.
Have you seen that? Is it good? No, I have
no interest. Um, does he go back and kill the
(16:55):
younger version of himself, or is the young one trying to
kill the older one? That's what it is, man. It's
a lot like Looper, uh, sort of. But it looks
pretty good. Like, it looks like young Will Smith. Slightly uncanny,
but not as bad as... I think some people are
easier than others. Like, the Michael Douglas stuff in Ant-Man
and the Marvel stuff is kind of creepy looking.
I haven't seen that. I mean, I've seen parts of it,
(17:16):
but I didn't notice that they were de-aging Michael Douglas.
They took Michael Douglas back, in scenes, to like the seventies.
It just doesn't look great. But anyway, that's
sort of off track, um, but not really. I mean,
it's kind of a similar type of stuff, I guess. No
more than that Gizmodo article, to start. Yeah, that's a
good point. Um, but we should point out the whole reason
that people are doing this stuff with celebrities and
(17:40):
stuff like that: it's just because there's more data out
there. It's a lot easier when you have a gazillion pictures
of Brad Pitt on the internet to do a fake Pitt. Yeah,
because the data set that the AI is trained on
is just more and more robust. And the more pictures there are,
the more angles the AI has seen Brad Pitt, you know,
looking around at, and so it can recreate these faces,
(18:03):
because the AI has seen, like, every possible pose or expression
or whatever Brad Pitt's ever made. But the thing is,
that Mona Lisa and Marilyn Monroe thing that Samsung showed —
they showed that you could make a pretty convincing deepfake
with just one picture, one pose. Right. So that's
a big deal. But again, the bigger the data set,
(18:26):
the better. And that's why, like you said, celebrities
and, um, world leaders were the earliest targets. But over time,
with the advent of other software, and the fact that
people now post tons of stuff about themselves and pictures
of themselves on social media, um, it's become easier and
(18:46):
easier to make a deepfake video of anybody. There's,
like, whole software that scrapes social media accounts for
every picture and video that's been posted. Right. Um, yeah,
for sure. Um, and then there are other
sites and other apps that say, oh, this picture
(19:07):
of this person you're targeting — your classmate or whatever —
they probably have a pretty good match with this porn star,
so go find videos of this porn star. And then
the next thing you know, you run it through that
deepfake app that came out, and you've got
yourself a deepfake video, and you've officially become a
bad person. All right, that's a good place to take
(19:28):
a break, and, uh, we'll talk more about these bad people right after this.
(19:54):
So you mentioned before the break, uh, this person's face
that I just stole off the internet fits this porn
actor's body — which is a consideration if you're
making that, because to look right, they
have to bear a passing resemblance. I think that's right. Okay,
So I was just about to say, so, yeah, So
(20:16):
what they're doing now is they're pairing these applications with
facial recognition software to make this a lot easier. And
that's what most of this is about: like, let's
just see how easy we can make this and how
much we can democratize this, where any schmo can
take any single photo and do the worst things possible
with it — but also how convincing they've become as well. Yeah,
(20:42):
I mean, so it's another big change. It's looking better
and better, quicker and quicker, which is pretty scary. Did
you see the Obama one. Had it not been for
had it not been for Jordan's Peel's voice obviously being
not Obama, I would have been like, Wow, this is
really convincing. Really, Yeah, see how didn't think the lips
matched up at all? Oh? I thought it looked pretty close. Yeah.
(21:03):
So what we're talking about is, Jordan Peele did
basically a demonstration video to raise awareness about how awful
this is by doing it himself, and did a video
of Obama, like, you know, referring to Trump as a
curse word, a dipstick, and basically saying, like, hey, this
is Obama, and this is what, you know, people are doing. He
(21:24):
basically is describing what's happening as you're watching it. And
I thought it looked kind of fake. He's describing a
deepfake through a deepfake. And Jordan Peele
did it in conjunction with BuzzFeed and another production company.
But in their defense, they were making this in, like,
early 2018, like April 2018, and since then,
even more technology has come out that is dedicated to matching
(21:46):
the movement of the mouth to whatever words you want
the person to say. Yeah, and you can also, like,
use only parts of it, so it's even more convincing.
So, like, if Obama had a lead-in that actually worked,
you could just keep that in there and then take
out certain words, and you can manipulate it however you
want to. Right. And the AI can go through and
(22:07):
find, like, phonemes and stuff like that, to make
the new words that the person never said. It's
becoming extremely easy. Let's just put it like this: it's
becoming extremely easy, and it's um widely available for anybody
to make a video of somebody doing something or saying
something that they never did or never said, and to
(22:27):
make it convincing enough that you may believe it at first. Yeah.
Which, like we said, you know, the obvious scariest implications
that aren't just of the personal variety are in politics,
where you could create real fake news that actually puts
people in jeopardy, or puts the entire world in
(22:48):
jeopardy by, like, announcing a nuclear strike or something like that. Right. Yeah,
Marco Rubio — I can't remember when it was, but, um,
within the last year or two — basically said that deepfakes
are the modern equivalent of a nuclear bomb, that
you could threaten America to the same degree with a deepfake.
(23:08):
I think that is a little hyperbolic, for sure. And
we're not the only ones. Yeah, there are other people
that say — people that know what they're talking about, not
just, you know, schlubs like us — other people that say, like, hey, listen,
this is probably not like a nuclear bomb going off. Um,
we should keep our eye on it, but there are other,
bigger fish to fry when it comes to stuff like this,
(23:28):
for sure. And then there are other people who are saying,
well, that's not to discount, like, the real
problem it poses, right? Like, we're already in a very
polarized position in this country. Um, so the idea of
having realistic, um, indistinguishable-from-reality videos of, like, world
(23:49):
leaders or senators or whoever saying whatever is not going
to help things at all. It's not going to bring
everyone together, like, look at this hilarious deepfake. It's
just going to erode that trust that
is necessary for a democracy to thrive. Um. And to take it to
(24:10):
its logical conclusion, this one researcher put it like this:
eventually, we're going to lose our ability to agree on
what is shared objective reality. And at that point, what
we would face is what's called the death of truth —
like, there is no such thing anymore. And
(24:32):
on one hand, that's horrible. It's a horrible idea, the
idea that nothing's real because there's such a thing as
deepfakes, and anybody could make something like this.
But on the other hand, you can kind of say —
you can't engage his Yoda, right? Exactly. On the other hand, though,
you can say, the fact that people know that deep
(24:53):
fakes are out there means that it's gonna be easier
and easier to be like, that's obviously not real. It's just
too unbelievable. So it may actually make us more discriminating
and more discerning of the news than we are today. Yeah,
that's the only thing that salvaged my brain from this
track of talking about this today was like, well, we'll
(25:16):
go tell our listeners: at least be on the lookout,
be wary, take everything with a grain of salt, because
we're already in a place where, like, you don't even
need some deepfake video. Like, it's happened all over
the place. You can see, uh, something that's photoshopped,
or a real photo that someone just writes a false
story about. Yeah, that's a good one. Um, you can
(25:38):
just come up with a false narrative from a picture
where there's a guy on the street and he's lying
there bleeding, and you can just say, uh, this person
was attacked yesterday by a group of angry Trumpers, or
by Antifa on the other side, and it'll get passed
around twenty million times, and then the retraction gets seen
by hardly anyone. Exactly. But that's not a deepfake. No,
(26:02):
that's just low-hanging fruit. That's a lo-fi fake.
Imagine inserting into that climate — and this is what we're talking about —
a video where you're looking at the
person and seeing with your own eyes what they're saying.
And a lot of people... like, I thought
the Obama video looked pretty fake, you thought it looked
pretty real. Everyone's eye is different and ear is different.
(26:25):
Like, a lot of people will believe anything they see
like this. Right. And we'll talk about, like,
how to discern deepfakes in a second. But we're
getting to the point — people seem to be in wide
agreement — that very soon it will be up to digital
forensic scientists to determine whether a video is authentic or not.
(26:47):
And that's because you or I will not
be able to distinguish it from reality. Yeah, and I
imagine that every country will have their own team that
will be hard at work doing that stuff. And DARPA
already... yeah, has since the end of
2017. Yeah, or at least they're scrambling to catch up.
Because when the video comes out of, um, you know,
(27:12):
the leader of North Korea saying, we want to drop
bombs on America at two o'clock this afternoon, that's going
to send our DARPA team scrambling to try and disprove
this thing before we push the button. Right. It's like
WarGames. It is, but way, way worse. Yeah. So,
just to reiterate one more time, the one thing that
you and I can do, and the one thing that
(27:33):
you guys out there listening can do to keep society
from eroding, is to know that deepfake videos are
very real, and just about anybody with enough computing power
and patience to make one can make one. And the
very fact that those things exist should make you question
anything you see or hear with your own eyes that
(27:53):
seems unbelievable or sensational. Unfortunately, I think the Stuff You
Should Know crowd is pretty savvy, so we're sort of
preaching to the choir here. Yeah, but maybe they can
go preach to their older relatives on Facebook. Or exactly —
take this to Thanksgiving dinner and just explain it to folks. Uh,
we should talk about porn a little more. Should we
(28:13):
take a break first? Sure. Are you okay with that? Yeah.
I feel bad now. No? Okay. Well, we'll wait and
talk about porn in about sixty seconds.
(28:37):
All right, Chuck, you promised talking about porn. Yeah, this, uh...
In this research, it says one of the defenses people
make in favor of deepfake porn is that it
doesn't actually harm anyone. Is anyone actually saying that? Yeah —
a lot of people. Well, I shouldn't say a lot;
I've seen at least quotes from people who make this
(28:57):
stuff saying, like, this is the media
drumming up a moral panic. Like, what's the
problem here? What's the issue? It's not like
they're going and hacking into, like, a star's
iCloud account, getting naked pictures of them, and then distributing
that, like this is really a private naked picture
(29:17):
of a celebrity, or fooling people into thinking they've done that to them.
I think they would say they're just creating some
fantasy thing that's not even real; it doesn't exist. I'm
not defending it, I'm just telling you what the other
side is saying. Yeah. Well, and that's the perfect example
of why these are very bad people, because it
is harmful, um, to everyone involved — to the person
(29:38):
whose face you're using, to the adult film actor who
did a scene and wants credit. And, well, yeah, I mean,
regardless of how you feel about that stuff, someone
did something and got paid a wage to do so,
and now it's being ripped off, and there are real
people's faces involved, and real bodies involved, and real lives.
(30:00):
It's, you know... it's not a moral panic, but, you know,
it's not like we need to march this to the
top of Capitol Hill right now. Well, that's funny, because
Congress held hearings on it this year. Well, yeah, but
I have a feeling that's a little bit more about
the political ramifications than putting your ex-girlfriend's face on
a porn body. Oh yeah, yeah, I see what
you mean. Although they could — you know, they could
(30:21):
put your governor's face on a porn body and
get them removed from office. You know, this video was
just dug up; look at this, uh, look
at your governor. Yeah, look what he's doing. For sure.
But even take that down to the less
political level, like you were saying: you could ruin somebody's
marriage if it was shaky or on the rocks before. Sure.
(30:43):
Hey, here's a sex tape of your husband or your
wife. You know? Yes. Blackmail's another one, too. Um,
there are a lot of ramifications of this, and it seems like the
more you dig into it, the more it becomes clear that
really the big threat from this is to the individual
whose face is used in the deepfake porn. Right.
They could do a video of us holding hands walking
(31:04):
down the street. Right, or they could just use that
video of us doing that, the one that exists around.
And I'm glad you picked up on that one. Um, there
was — just, jeez, just a couple of weeks ago, because
I saw this one when I googled this under News, so
it's very recent — there was an app that we won't
name, that undressed women. Basically, what you could do is just
(31:27):
take a picture of any woman, plug it into this app,
and it would, uh, show her, um, what she would
look like nude — not her body, but it would just
do it so fast and so realistically that you could
nude up some woman with the touch of a button. Yeah,
and, like, it would replace her clothes in
the picture with nude clothes. Right. So, the birthday suit. Right,
(31:53):
birthday suit, right, that's what I was looking for. And it's
just as awful as you think. Uh, and the creator
actually even shut it down within, like, three or four days. Yeah,
but, like, what was this guy thinking? Like, this is
a great idea... Oh, people have a problem
with this? Well, I'll shut it down. Like, really? In
his defense, he's probably, like, fourteen. Well, I guess that's
(32:14):
a good point. Uh. Even if you plugged in a
picture of a man, it would show a woman's nude body.
And you know what that means. That means that And
the person who created this app says, well, I just
did that because there are way more pictures of naked
women on the internet, and I was gonna do a
man's version, but I had to go to baseball practice
and never got a chance to. Yeah. That's uh, that's
(32:34):
pretty amazing. And of course that person is anonymous, right?
As far as I know. Yeah, which means that they
really must be fourteen, because they weren't unmasked on
the internet despite the outrage against this. You're probably right.
I wonder if it's just Pharma Bro. That guy, they just outed him.
He's still in jail. Is he really? Good. So that's
(32:54):
a good, um, segue into what you can do if
this happens to you. Right. There's a lot of outrage —
Congressmen are against it; there's a lot of outrage against this kind of
thing on the internet. So if you are
targeted and you end up in a deepfake
porn clip or video or whatever, um, you could drum
(33:20):
up some moral outrage on the internet, or go to
the site that it's being hosted on directly and say, hey,
I know this is messed up, you know this is
messed up, please take this down. I didn't consent to this.
This is an invasion of my privacy. Get rid of
this video. Porn websites are good about that, actually. Yep,
they don't want that stuff on there. No. And it's
not just porn websites: like, um, Pornhub, Reddit, Gfycat,
(33:45):
um, some other sites have banned all kinds of deepfake
videos. And apparently Gfycat — I think it's spelled
G-f-y-c-a-t? Yeah, I don't know either; I'd never heard
of it until I started researching this — this site
actually created an AI that's trained to spot deepfake
videos and remove them from the site, which is a
big tool that they need to be sharing with everybody else.
(34:08):
But if you can contact the site and say, hey, man,
take this down, this is me, they will probably take
it down, just because everybody knows this is
really messed up. Yeah, they've got enough. They have plenty
of videos to work from that are real; they
don't need this stuff. But there are no laws that say
they have to take it down, are there? Well, not yet. Uh,
(34:29):
there's this guy, Hany Farid. He studies digital forensics at
Dartmouth, and they are hard at work. Like, again, this
just started, you know, very recently, so all of this
stuff, they're just, like, scrambling to get ahead of it as
far as, uh, sniffing this stuff out. Well, the whole
world was caught off guard by this. Oh yeah, like,
(34:49):
this guy was just, hey, look what I can do,
and I'm going to change the world here. Nic Cage
is so funny. Oh wait, what is Nic Cage doing
as Yoda? Oh my god. So, professionals — there are
some pretty, uh, easy-to-spot things if you're a pro.
Unless it's just really bad, um — your average layman can
spot those. But if you're a pro, you're gonna look
for, like, bad compression, um, stuff like, you know,
(35:14):
lighting that looks off. Yeah. Not blinking is a big one. Yeah,
like, Michael Caine doesn't blink. Maybe he's just been a big
deepfake his whole career. Man, that'd be something. Um,
sound is a big thing — like, wait, wait, hold on, hold on,
I want to say why not blinking is a thing.
Oh, sure, because it's fascinating. I mean, it probably wasn't
to you, because you didn't like that Gizmodo article.
(35:37):
But, um, the reason why not blinking is a thing
in deepfakes is because deepfake AI that is
trained on data sets is probably being shown photos of
somebody not blinking, so it doesn't learn that people blink,
so the AI doesn't know to add blinking
when it renders the new face onto the video. But
(35:59):
all they have to do is just say, all right, well,
now we'll program it to blink. Right. That's the
big problem, Chuck. It's like, everything they can spot... In fact,
when they list out all the things to look for —
blocking, compression, fixed-pattern noise — I'm sure there's some
deepfaker that's like, check, check, check, thanks for the
list of stuff we need to work on. And there they are.
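As a small illustration of the blink heuristic they just described, here is a hedged sketch in Python; it assumes some facial-landmark tool has already produced one eye-openness (eye-aspect-ratio) value per frame, and the thresholds are illustrative guesses rather than any forensic standard.

```python
# Hypothetical helper: given one eye-aspect-ratio value per video frame
# (a number that drops toward zero when the eye closes), count blinks and
# flag clips where the subject blinks far less often than people normally do.
from typing import Sequence

def estimate_blinks(ear_per_frame: Sequence[float], closed_threshold: float = 0.2) -> int:
    """Count distinct dips below the threshold; each dip is treated as one blink."""
    blinks, eye_closed = 0, False
    for ear in ear_per_frame:
        if ear < closed_threshold and not eye_closed:
            blinks += 1
            eye_closed = True
        elif ear >= closed_threshold:
            eye_closed = False
    return blinks

def looks_suspicious(ear_per_frame: Sequence[float], fps: float = 30.0,
                     min_blinks_per_minute: float = 5.0) -> bool:
    # Illustrative cutoff: people normally blink many times a minute, so a clip
    # with almost no blinks is worth a closer look.
    minutes = len(ear_per_frame) / fps / 60.0
    if minutes == 0:
        return False
    return estimate_blinks(ear_per_frame) / minutes < min_blinks_per_minute

# Example: a 10-second clip where the eyes never close reads as suspicious.
print(looks_suspicious([0.3] * 300))   # True
```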
It's still maybe, like you were saying, at a point
(36:21):
where possibly you or I could look at a shoddy
video and be like, yeah, I see this, this, and
this is a little wrong — like, there's a little
bit of compression, um, relics or remnants or whatever, like
they're not blinking, the shadows are off a little bit. But
there are also plenty of videos where you do need to
be, like, a digital forensic scientist to find them,
or an AI to find them. Yeah, you can also
(36:43):
use your ear holes, because, you know, look at the
room that the person is in: would it sound
like that in a room like that? And that's a good
one. One of the things audio specialists look at is, like, uh,
you know, if you have Obama speaking in a concert hall
and it sounds like someone's closet, or, yeah,
it sounds like a can, like our earliest episodes. Yeah, exactly,
(37:05):
that's a pretty, you know, pretty strong indicator. It is. So
there are things you can do. Um, when BuzzFeed tweeted
that Jordan Peele Obama deepfake, they included a list
of things to do to spot a deepfake. Um,
I wonder how many times we've said deepfake in
this episode. Don't jump to conclusions. Consider the source — that's
(37:30):
a big one. That's what's gonna guide us through here.
People are like, this is Jordan Peele, I can trust that. No,
but I mean, like, if you go onto a site —
I can spot, like, a fake news site a
mile away. Like, you can just tell. It's
off, it's uncanny. It's our sense of the uncanny that's
gonna guide us through this. Yeah, you can
always tell because the screen is black and the text is
fluorescent green in Comic Sans. Um. And then another one
florescent green in comic sance. Um. And then another one
is, check where this thing is and isn't. This is
kind of like the opposite of a tip we always give people,
where if you see the same thing, in basically the
same wording, throughout the internet, you should question that. If
you see a deepfake video in only a couple
of places, and it's big news, but you don't see
(38:13):
it on, like, ABC or CNN or Fox News or wherever —
if you don't see it on, like, a reputable news site —
you should probably question it. Yeah: Donald Trump threatens nuclear war,
we have the video, from slappy dot com. Right, it's
probably a good indicator. Slappy dot com — I'm sure the
good people there are like, we make... we make hamburger buttons.
(38:34):
Picking up. I should probably check and see what that is.
Everyone else is right now. What else? Look closely at
their mouth. Yeah. Uh, and then here's kind of
a no-brainer: like, slow it down. Slow the video down,
slow your roll, and, like, really look at it closely,
because that's where you're gonna see,
like, strange lighting changes and stuff. Um, but it's all legal.
(38:57):
It is. So we were kind of
talking about that, like, the best way to get a
video taken down is to contact the website. Um,
just be like, bro, come on, this is awful. Um, there
are no laws that protect you directly. But a lot
of people are saying, well, we've got revenge porn laws
that are starting to pop up around the country; it's
a very short trip from revenge porn to
(39:21):
deepfake porn. It's virtually the same thing: it's involuntary pornography. Um,
it's even more involuntary, because with revenge porn, the person
at least posed for the picture or whatever initially, for whatever
context or reason, with no intention for it to get out.
With deepfake porn, this person never even posed or
(39:41):
engaged in this act or anything like that. So it's,
even, in a way, maybe even worse than revenge porn —
which feels like bitter acid in my mouth to say. Um,
so you can make a case, though, that these revenge
porn statutes that protect people could be extended to this
as well. But that's for personal stuff. For,
(40:04):
like, national stuff, or a public figure or something like that,
especially when it comes to politics, you could make a
really strong case that these deepfake videos — even the
most misleading, nefarious deepfake video you can imagine — would
be protected under the First Amendment. Yeah, I could see
a satire defense being mounted in the future. Uh. Like,
(40:27):
you know, what's the difference between doing a really good
deepfake and doing an animated cartoon like South Park,
which shows people saying and doing things they wouldn't do either? Uh,
it is very slippery and thorny, and a very fine line.
But even if the person who makes the deepfake
says, no, I did not mean this as satire, it
was meant to be misleading, and I wanted to see
(40:49):
what effects it had — sure, they didn't shout fire in
a crowded theater, so they could probably still get away
with it under the First Amendment. Yeah, it'll be interesting to
see where this is just gonna go. Hopefully right down the toilet. Nope,
it's just gonna keep going. It's gonna get more and
more realistic, and we're gonna end up, inadvertently, um, falling
(41:10):
into the simulation. That's what's gonna happen. Chuck, prepare for it.
That's great. Okay, just try to put a smile on
your face regardless. That's me smiling. If you want to know more
about deepfakes — um, it's so hot right now —
just go on the internet
and you can read all sorts of news articles about it.
(41:33):
Since I said that, it's time for listener mail.
I think the first thing that turned me off was the name.
Anytime I see something that's, like, not a real word,
but they've, like, squeezed together two words and it's all
lowercase or something — oh, gosh. Yeah, it's just the
(41:53):
worst of the Internet. It's terrible. All right. Hey, guys,
just finished the Neanderthal episode, and you mentioned that their
language could have some remnants in our modern languages.
That was a really good one. I automatically remembered that during How
Swearing Works, you guys mentioned that a different part of
our brains activates when hearing or using swear words. So maybe,
(42:16):
just maybe, that small percentage of our Neanderthal
DNA activates when we stub our toes or
hit our shins, to unleash our primitive and original language.
How about that? I like this person. Anyways, loved the podcast, guys.
Grateful for the knowledge and entertainment. And thank you, Jerry —
or should we just say thank you, Josh? Josh T.,
(42:36):
for keeping the quality of these podcasts awesome. We're not
thanking Jerry anymore. From my overpriced apartment in Redlands, California.
That is from Falcone. No, that's his last name. Wow, the
e at the end is silent, though, so it's pronounced Falcon. Thanks a lot, Falcone.
We appreciate you. Um, that was a great name. Thanks
(42:58):
for swooping in with that idea. Mm-hmm. And, uh,
I'm sorry to everybody for that. If you want to
get in touch with us like Falcone did, you can tweet
to us: we're at SYSK Podcast,
I'm at joshuam Clark. We're on Instagram, we're on Facebook.
Where else are we, Chuck? Uh, we're on every deepfake.
(43:19):
We might be on Gfycat, who knows — gif? Uh, you
can also send us a good old-fashioned email. Spank
it on the bottom after you wrap it up, of course,
and send it off to stuffpodcast at iheartradio dot com. Stuff You Should Know is a production
of iHeartRadio's How Stuff Works. For more podcasts from
(43:39):
iHeartRadio, visit the iHeartRadio app, Apple Podcasts,
or wherever you listen to your favorite shows.