February 4, 2020 • 54 mins

With the development of increasingly smart artificial intelligence and more cameras spread around than ever before, we have reached a critical point in the US and other countries where governments can easily track everyone, everywhere, all the time.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey everybody, it's me, Josh, and I'm here to tell you it's official: we're going to be in Vancouver, BC and Portland, Oregon this March. On March twenty-ninth we'll be at the Chan Centre in Vancouver, and in Portland we'll be at the Arlene Schnitzer Concert Hall. So come see us. Tickets go on sale this Friday.

(00:21):
Go to SYSK live dot com for ticket links and info and everything you need, and we'll see you guys in March. Welcome to Stuff You Should Know, a production of iHeartRadio's How Stuff Works. Hey, welcome to the podcast. I'm Josh Clark. There's Charles 'Scooter

(00:45):
Computer' Bryant, and Jerry 'Matthew Broderick in WarGames' Rowland, and this is Stuff You Should Know. That's good. Thanks. What's your name, Josh? Hmm. Okay: Josh 'Ally Sheedy in WarGames' Clark. I wonder what Ally Sheedy is doing

(01:08):
right now? Man, how cute was she in that movie? Like, I think they designed that movie for, like, every thirteen-year-old boy in America to fall in love with Ally Sheedy. I think you're talking about Short Circuit. I never saw that. You believe that? What, the one with Johnny Five? I mean, I know the movie. You gotta see it. Really, it's pretty awful, especially with, um, oh, what's his name? Who

(01:35):
is the sleazeball from Fast Times at Ridgemont High? Who is the ticket scalper? Yes. He plays an Indian, like Asian Indian, character, like full-on brownface and everything. It's really bad. The movie is bad enough, but then

(01:57):
now when you go back and see that, you're like, I can't believe this. I can't believe it. I think he's Italian. Oh, easily. Maybe Jewish, or maybe just a straight-up white guy. He's definitely not Asian Indian. No, he's not. But anyway, go see Short Circuit, okay, and see what you think. Ally Sheedy just keeps looking at the camera going, I'm so sorry. That was a big

(02:18):
hit, though. She didn't have anything to be sorry about. Um, so, I guess WarGames is kind of in your wheelhouse. Yeah. It was a little old for me. Yeah, because I saw that when I was like twelve-ish, and you would have been, I don't know, how much younger are you? I would have been seven. Seven? Yeah,

(02:38):
that's a little young for WarGames, I would think. I mean, I still watched it or whatever, but I was like, yeah, that's hilarious, the computer is called WOPR. Yeah, I mean, it was right in my wheelhouse, right. I remember at the end of WarGames, they lock in, you know, they're decoding the code one number and letter at a time. Very suspenseful. And yeah, very suspenseful. And it finally locks in, and me and my friends memorized it

(03:01):
so we could go home and plug it into an Apple II to see what happened. Nothing. Okay, how do you plug in a number anyway? What does that even mean? Oh yeah, that's true, you know. Yeah. I remember a very rudimentary program you could run where you could type in like

(03:21):
four lines of whatever, I don't even know if you'd call it code, with a phrase, and it would run the phrase like a thousand times all over your screen in a big scroll. And I just thought that was the coolest thing ever. I feel like I remember what you're talking about. It was just like five lines. The only part I remember is twenty, go to ten, and

(03:42):
ten was the phrase, I think. Something like that, I don't remember.
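What's being half-remembered here is the classic two-line BASIC loop (10 PRINT a phrase, 20 GOTO 10). Here is a minimal sketch of the same idea in Python; the phrase is a placeholder, not whatever was actually typed back then:

    # The remembered BASIC two-liner was roughly:
    #   10 PRINT "YOUR PHRASE HERE ";
    #   20 GOTO 10
    # It loops forever, filling the screen with the phrase in a big scroll.
    phrase = "YOUR PHRASE HERE "  # hypothetical placeholder phrase

    while True:  # the "20 GOTO 10" part: jump back and print again (Ctrl-C stops it)
        print(phrase, end="")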
Then I was like, man, let's just play Castle Wolfenstein. That was a good one. I remember I did Oregon Trail. I never did. It was Castle Wolfenstein for me, but like Wolfenstein on the PC. Oh yeah, like move, left arrow, right arrow, shoot, uh, shoot a dash, which

(04:02):
is some sort of a bullet. Yeah. It was fun. That was fun, and I thought it was just like the height of technological gaming. It was, it was at the time. But now, Chuck, we've reached the height of technology, which is being tracked everywhere you go, looked at all the time, by whoever wants to do that. I'm gonna change your name to Josh 'Smooth

(04:23):
Operator' Clark for that transition. Very nice. Yeah, like, I like Ally Sheedy Clark. Josh 'Ally Sheedy' Clark. Yeah, this is a good one. Did you put this together? Was this you? And this is Dave, Dave Roos. Good stuff, and hi, Dave. We finally got to meet Dave and his family, lovely family that we cursed awfully

(04:45):
in front of in Seattle. I felt terrible, and he was like, you know, whatever, he was fine. His kids were adorable. They were, they were great. They couldn't look at me, though. Really? They were probably just intimidated by your presence. No, no, it was because I cursed so badly. So this is good stuff, though. Facial recognition technology, which they've

(05:08):
been kind of at since the nineteen fifties, and which they rolled out as a test in two thousand two at the Super Bowl in New Orleans. It did not go that well. No, it was a little clunky back then, but it's gotten a lot better since then. Let me explain why. For anyone who's listened to The End of the World with Josh Clark, the AI episode in particular: everything associated with

(05:31):
artificial intelligence got way better starting around two thousand seven, when neural nets became a viable form of machine learning, because you don't have to train a computer on what constitutes a human face and what to look for. You just feed it a bunch of pictures of faces and say, these

(05:51):
are human faces, learn what a human face is, and they train themselves. And so around two thousand seven, two thousand eight, two thousand nine, everything that had to do with machine learning got way smarter, because we started using neural nets, and facial recognition software is no exception.
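As a rough illustration of that shift, here is a minimal sketch in Python (using PyTorch) of what "feed it labeled pictures and let it train itself" means. The tiny network, image sizes, and random stand-in data are assumptions for illustration, not any real face recognition system:

    import torch
    import torch.nn as nn

    # A toy face / not-a-face classifier. Nobody hand-codes what a face is;
    # the network adjusts its own weights to fit the labeled examples.
    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, 2),  # two outputs: face / not a face
    )

    images = torch.randn(64, 3, 64, 64)  # stand-in for real labeled photos
    labels = torch.randint(0, 2, (64,))  # 1 = "this is a human face", 0 = not

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for step in range(100):  # "they train themselves" on the examples
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()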
Yeah, and there were a few things that kind of converged all at the same time, or around the same time.

(06:14):
Social media kind of coming on the scene right in that wheelhouse was a big deal. Um, Facebook, this is staggering: Facebook just by itself processes three hundred and fifty million new photos through its facial recognition software every day,

(06:34):
and every time one comes through, Mark Zuckerberg goes, like... You think it's neat when you put a picture up and it says, like, would you like to tag Emily, your wife? Because that's her. And you think, oh, well, that's super easy, thanks, Facebook. But then you don't think, like, wait a minute, all right, how do they know that's my wife? And, you know, it's like with everything else:

(06:56):
there's privacy people that were like, whoa, do you guys realize what's going on? And then the rest of the sheep, they're like, huh? No, no, they're like, no, it's great, like, I don't have to go in and click two links or two buttons to tag, it's way easier. So that was one thing: there's way more photos out there for those machines to learn on, like good, high-

(07:19):
quality photos, right, three hundred and fifty million a day just on Facebook alone,
which means the machines were getting smarter. They were getting better and better at training themselves. And then, lastly, that has led to a ubiquity in facial recognition: the better the machines have gotten, the easier it

(07:41):
has been to put together data sets for them to train on, which is lots and lots of pictures of people, and the cheaper the technology has gotten, which means more people are now using facial recognition than ever. Yeah.
Amazon has a service called Rekognition, with a K, which is not a good look. No, it looks very German.

(08:02):
There's something about replacing a C with a K. It just looks creepy. Like when you spell America with a K, it means something. It means, like, bad America. Yet they went full steam ahead and called it Rekognition. You have to say it like that. You do, I think, and you have to be like squeezing the air out

(08:24):
of a syringe while you're saying it too. Um, so
they have this. I didn't even know about this, but it's ubiquitous and it's not super expensive, and that means that law enforcement agencies don't have to, like, create their own. They can just say, well, let's just sign up for Rekognition. Right, exactly. Because it's there,

(08:45):
and because it's relatively cheap, you can just get a subscription. Not just law enforcement agencies, either: if you have a photo-sharing app or whatever and you want some facial recognition technology, you just contract with Amazon, and Amazon goes, here you go, here's our code, and you can put

(09:06):
it onto your platform. Anybody can use it.
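To make the "anybody can use it" point concrete, here is roughly what a call to the service looks like through AWS's boto3 library in Python. The file names are hypothetical, and note the SimilarityThreshold parameter, a setting that comes up again later:

    import boto3

    client = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("probe.jpg", "rb") as f:      # hypothetical photo to search with
        source_bytes = f.read()
    with open("candidate.jpg", "rb") as f:  # hypothetical photo to compare against
        target_bytes = f.read()

    response = client.compare_faces(
        SourceImage={"Bytes": source_bytes},
        TargetImage={"Bytes": target_bytes},
        SimilarityThreshold=90,  # only return matches scoring 90 or above
    )

    for match in response["FaceMatches"]:
        print(f"similarity: {match['Similarity']:.1f}%")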
Um, so it is kind of everywhere, and that makes a lot of people, including me, very, very nervous, because of this guy Woodrow Hartzog. If he's not Werner Herzog's brother, I'll be disappointed. But a Woodrow and a

(09:28):
Werner? Come on, you know. Anyway. Woodrow Hartzog is a professor of computer science and, I believe, privacy and civil liberties. He basically says, look, there is no way we're going to reap the benefits of facial recognition without ultimately sliding irreversibly into a dystopian surveillance state. And

(09:55):
it's happening right now, and if we don't do something about it, it's never going to change back. We're about to fully give up our privacy, because it's one thing to have your phone tracked: you can give up your phone, get yourself a burner phone like you're Jesse Pinkman or something like that, and then you just throw that phone away and you can't be tracked anymore. You can't get a burner face.

(10:16):
And if that does become a thing down the line, it'll be very, very expensive, so the average person can't get a burner face. We'll be tracked by our faces everywhere we go. And as we add more and more cameras and this technology becomes cheaper and cheaper, we will be living in a world where there will be zero privacy, and we'll be monitored and tracked because it

(10:39):
will be so easy. And it will be sold to us, like it's being sold to us now, as a law enforcement tool to get the bad guys. But it's eventually going to extend to include everybody. But what do you have to worry about? You're an upstanding citizen. It doesn't matter if you're tracked. That's not true. That's just not the case, everybody. It's not the case. All right,

(11:00):
we're gonna call that soapbox soliloquy number one of what I guarantee will probably be three or four. Uh, let's talk a little bit about how it works. It is biometric authentication, um, like a fingerprint or a retina scan. And basically what it does is take precise measurements of a face to calculate every person's very unique visual geometry, like how far apart your
(11:25):
person's very unique visual geometry, like how far apart your
eyes are, how far apart your pupils are from your nostrils. Yeah, your facial geometry, how your face is all set up, I think. Yeah, and it's even gotten into things like facial hair, skin tone, skin texture. Yeah, I'm sure it'll get just more and more specific. Yeah.

(11:45):
And then, you know, because the machines are getting better and better and easier and easier to train on this stuff, you can just add more and more data to it, and the recognition will just become increasingly good. Yeah. And if you want to throw off facial recognition software and freak out every

(12:06):
human you meet, shave your eyebrows. Oh yeah, that would be a little freaky. Have you ever seen that? You've seen it before, I'm sure, in movies and stuff. It's an interesting thing. I remember a kid in industrial arts class did that one year. He was, like, a little, you know, kind of a ninth-grade burnout, and he just showed up one day with no eyebrows. I think not having a nose would

(12:27):
be more easily accepted. There's something just uncanny when someone shaves their eyebrows. Like, one day they have them, the next day they don't. Was it, like, immediately recognizable what the thing was, or was it like, something's off today? Whereas if you, you know, came in the next day without a nose, the first thing you

(12:48):
would say is, what happened to your nose? What happened to your nose, Todd? Yeah, and Todd would be like, I can't remember, it fell off. So, those measurements we were talking about: what happens then is they compare that, just like a fingerprint, with a database of images. And depending on what this is for, it could be

(13:08):
like just within your company, or it could be the FBI's database of mug shots, or it could be the DMV's database of driver's license photos, yeah, which we'll get into. Yeah. What's interesting is, at each stage of the way there's a different algorithm that does, you know, each increasingly sophisticated step, until you finally have basically all of

(13:29):
the different data points for, you know, what makes up that facial geometry, and then you can compare it to all the other data points. We think of, like, a computer running a picture: you've got your input picture, and it's running all the other pictures next to it. That's not what it's doing. It's running the numbers. Basically, it's doing computer stuff.
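A minimal sketch of what "running the numbers" means in Python, assuming, as modern systems typically do, that each face has already been reduced to a vector of measurements (an embedding). The 128-dimension size and the random stand-in data are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    database = rng.normal(size=(100_000, 128))  # stand-in for enrolled face vectors
    probe = rng.normal(size=128)                # stand-in for the input photo's vector

    # Cosine similarity of the probe against every record: pure arithmetic,
    # no pictures are ever laid side by side.
    scores = database @ probe / (
        np.linalg.norm(database, axis=1) * np.linalg.norm(probe)
    )

    best = int(np.argmax(scores))
    print(f"best match: record {best}, similarity {scores[best]:.3f}")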

(13:51):
Yeah. I love that first step, which is, you have to teach the computer what a face is. So, I mean, it seems silly, but of course that's what it is. Well, yeah, because, I mean, if you show it a picture of a person standing next to a fire hydrant, and it looks at the fire hydrant and says, hello, handsome, you go, no, this is what a human face looks like. Yeah, or, no,

(14:13):
that's a butt. And then it gets, you know, closer, closer: all right, now that's a face. You know what a face is. Now move on to step two, which is, stop screwing around. Yeah. So now you know what a face is; you've got to normalize it for the photo. Which means? Well, first you have to put it in Dockers. That normalizes it. Um,

(14:34):
you isolate that face, and then you have to make sure that it's normalized as far as looking at the camera. So if you get a photo of someone from a CCTV, let's say, and it's sort of a three-quarter view, they have the ability to make it as if it is looking straight at you. Yeah, the
computer can pretty accurately predict what the rest of the

(14:56):
face looks like face-on, I guess. And when it normalizes it like that, it makes it much easier to compare to other pictures, because, as we'll see, most of the pictures, or most of the data points that it's comparing it to, are taken from databases of pictures that have been taken of people

(15:17):
face-on.
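Here is a sketch of the simplest version of that normalization step in Python with OpenCV, assuming eye coordinates already found by a detector; the file name and coordinates are placeholders. This only handles in-plane rotation (leveling the eyes); synthesizing the unseen side of a three-quarter face takes a more involved 3D model:

    import cv2
    import numpy as np

    image = cv2.imread("cctv_frame.jpg")          # hypothetical CCTV still
    left_eye, right_eye = (140, 210), (220, 200)  # placeholder landmark coordinates

    # Rotate the image so the line between the eyes sits level.
    angle = np.degrees(np.arctan2(right_eye[1] - left_eye[1],
                                  right_eye[0] - left_eye[0]))
    center = ((left_eye[0] + right_eye[0]) / 2, (left_eye[1] + right_eye[1]) / 2)
    rotation = cv2.getRotationMatrix2D(center, angle, 1.0)
    aligned = cv2.warpAffine(image, rotation, (image.shape[1], image.shape[0]))
    cv2.imwrite("aligned.jpg", aligned)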
Right, so that's why it wants to go to driver's licenses. Yeah, well, you just spoiled it. But yes, we already said that. Oh, we did? Yeah, I did. Okay, I missed it, I know. So, from there, you have more algorithms still that isolate parts of the face.

(15:38):
And this is where my old theory comes in, that, like, there are only so many facial combinations, and that's why you have doppelgangers. We've got to do an episode on doppelgangers. There's only so many things you can do with two eyes, two eyebrows, a nose, a mouth, cheekbones, and a chin. Well, okay, what else? I mean, there's not a whole lot. There's lips. Lips, sure. What about, um, uh... that's about it. These

(16:01):
are called elevens, the ridges between your eyebrows. Well, if you want to get super specific. But that's what I'm saying: I think they're getting more and more specific. Oh yeah, yeah. But my whole point is, and we'll learn here in facial recognition, they do use doppelgangers, but put a pin in that. So, they recognize all these features, and then each feature becomes what's called a nodal point,

(16:24):
or no-DAHL point? Nodal, I think. And this is where you're gonna get your super exact angles and distances between all these parts, as a flat, two-dimensional thing.
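A minimal Python sketch of turning nodal points into comparable numbers: pairwise distances between 2D landmarks, divided by the distance between the eyes so overall image size drops out. The landmark names and coordinates are illustrative placeholders:

    import numpy as np

    landmarks = np.array([
        [30.0, 40.0],  # left eye
        [70.0, 40.0],  # right eye
        [50.0, 62.0],  # nose tip
        [50.0, 80.0],  # mouth center
        [50.0, 95.0],  # chin
    ])

    # Distance between every pair of nodal points.
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    distances = np.linalg.norm(diffs, axis=-1)

    # Dividing by the interocular distance makes the signature scale-invariant.
    signature = distances / distances[0, 1]
    print(signature.round(2))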
My question was, because below here, you know, it talks about Apple and their iPhone having 3D facial recognition: is two-dimensional

(16:47):
superior to 3D? I don't know. Or is it just because that's what all the pictures in the databases are, so that's what they do? I don't know. All I know is my phone usually unlocks when I look at it. You know what? Hey, that's without having to take off my sunglasses. Worse, though, I found I've got some Wayfarers that I don't have to take off, but my aviators I

(17:09):
do have to take off. Interesting. Does it keep trying to make you into Mav when you have the aviators on? Go ahead. What is that? That was Tom Cruise laughing and chewing gum. Okay, wow, thanks. I feel like, okay, we gotta keep going, because I was about to take a break unnecessarily. So, when the

(17:34):
when the computer is running through the pictures, it just goes, like, no, no, no, no, no, millions and millions of times, and then finally goes, yes. But when it says yes and it spits out another picture, it's not, like, this is that person. No. You want it to be, because we all watch NCIS, we all watch CSI, we all watch

(17:55):
Law and Order, we all watch, um, Party Down, Andy Griffith, Matlock, the whole deal. Um, so we want it to just spit out and be like, here's your person of interest, right. But what it's really doing is producing a similarity score

(18:17):
that is probabilistic. It's saying there's this percent chance that this is the same person as the person in the picture that you uploaded. It's a bit of a guess. It is, a sophisticated guess it is, and the better computers get at this, the likelier it is that when they say this is

(18:38):
probably the same person, it's the same person. But, as we'll see, it's up to the human user to determine what is an acceptable confidence threshold. Frankly, it really should be about ninety-nine percent or higher for the

(18:58):
confidence setting. Isn't that what Amazon's Rekognition says the threshold should be?
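A toy Python illustration of why that setting matters: the same list of similarity scores produces very different numbers of "hits" depending on where the threshold sits. The scores here are made up:

    match_scores = [99.2, 97.5, 92.1, 88.4, 83.7, 81.0, 62.3]

    for threshold in (80, 90, 99):
        hits = [s for s in match_scores if s >= threshold]
        print(f"threshold {threshold}%: {len(hits)} candidate match(es) {hits}")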
I'm glad you said that, man, because it really is creepy and I couldn't put my finger on it. And it's exactly, I mean, I knew the K looked weird or whatever, but it hadn't hit me just how creepy it is, and just how off the mark, or potentially on the mark, that name is. Like, oh, like,

(19:23):
if my name was spelled C-H-U-K. I'm sinister? A little bit; you'd be more sinister. Yeah, I don't think you could ever be truly sinister. I appreciate that. All right, let's take a break. I'm gonna go work on sinistering up a bit, and we'll talk a little bit more about some of the uses of FR, right after this. So,

(20:04):
as with all technology, it has to be abbreviated into two letters, the second of which is R. Do they call it FR? I've seen it. Yes. I was just being silly, but it doesn't surprise me. Nope. So in FR, facial recognition technology, there are some beneficial uses for it. Yeah. Like we said, you

(20:25):
get to tag people, that's chief among them. For people like you and me, that's the pinnacle as it stands. You don't have to tag people yourself; Facebook does it for you. That's what we're trading everything for. I gotta calm down. Okay. There are some other, like, genuinely beneficial

(20:45):
uses too. There's a nonprofit company called Thorn that scans missing persons' pictures against pictures of children in child porn videos or suspected human trafficking, to get matches, and apparently they've rescued a hundred kids so far using that technology. That's a pretty beneficial use

(21:08):
of facial recognition software. Dating apps, let's say. You can get pretty specific on what kind of face you find attractive, which is interesting. You can say, I really think, um, I like guys with high cheekbones and full lips. But no, it would be more like, um, somebody could be like, oh,

(21:32):
I really find Christian Bale attractive, and they feed a picture of Christian Bale into this dating app, and I would come up. But I wouldn't, because I wouldn't be in the dating app, because I'm happily married. Do you think you look like Christian Bale? I'm told that a lot. Really? Yeah. That's weird. I don't think you look anything like him. I don't either, but people say. Interesting. I don't know what I would do if I was dating now. I

(21:52):
guess I would just go to a service and say, an Ally Sheedy type, sure, WarGames era. But they'd be like, okay, sir, you just upload the picture. You don't have to come into the office, which is really not even open to the public, and just tell us you're interested in an Ally Sheedy type, like a weirdo. You mean dating apps don't have offices where they just field complaints and interested parties? You sit down and they

(22:13):
videotape you with the VHS camera, put you on the tape with some other guys. That's how they used to do it. Oh yeah, that was one of the subplots of Singles, the Cameron Crowe movie. Oh yeah, Expect the Best was the name of the dating service, and you would make a videotape and, like, watch videotapes of people, you know, saying who

(22:35):
they are. How do you remember that? I was a big Singles fan. That's not... ah, got you. Yeah, Expect the Best. You, me, a pack of cigarettes, and some coffee. We don't need anything else. So what else here? This was, um, Taylor Swift. She and her security team on tour used it to scan the audience to see

(22:56):
if any of the creeps who have harassed and stalked her were in the audience. That's super beneficial. No one should have to go through that. Also, cops use it in myriad ways, but it's especially beneficial when they use facial recognition to identify people who can't identify themselves. Somebody in the midst of a psychotic break,

(23:18):
perhaps, somebody wasted on shrooms, um, somebody, you're not Jesus Christ, who has amnesia. Our friend Benjamin Kyle, who apparently knows who he is now but has decided not to disclose it publicly. Remember, the guy was found behind a Burger King near a dumpster, had zero recollection

(23:41):
of who he was or how he got there, and there was all this international publicity about who he was, and the fact that he couldn't remember who he was, and somebody finally came forward and identified him. So now he knows who he is, but he went like a decade without knowing.
By the way, when I said, sorry, you're not Jesus Christ,

(24:01):
I was making fun of the guy on mushrooms, not someone in the midst of a psychotic break. I just want to be very specific. I think that was very clear. All right, just to make sure everybody knows that. So.
Those are some of the good ways that it can be used. Now let's talk about all the bad ways. Yeah, I mean, when you're talking about the government, you're talking about law enforcement, when you're talking about things like what's

(24:27):
going on, allegedly, in China, with CCTVs everywhere trained to single out ethnic minorities and religious groups just walking down the street, going about their day. Yeah, it gets into much different territory than tagging people in dating apps. Yeah. It's pretty difficult to attend your religious

(24:50):
service if you're not allowed to attend your religious service and you're being tracked everywhere you go. Yeah, and that's why places like, and this is the most predictable thing in the world, San Francisco, Oakland, and Berkeley, and then Somerville, Maine. I knew the Mainers would be in there. They're not into this. That's right. They have banned law enforcement

(25:13):
from using facial recognition altogether, and California as a state has put a three-year moratorium on the use of it in body cams. And the ACLU is basically, I know this is jumping ahead, but they're at the point where they're like, we need to tap the brakes here for a few years, because there's no legislation about this yet and it's just going full steam ahead. Yeah, and I really
(25:35):
and it's just going full steam ahead. Yeah, I really
don't want to, like, run past that: aside from Berkeley, San Francisco, who's the other one, Oakland, and Somerville, Maine, there are no laws, state, local, or federal, governing the use of facial recognition technology by

(25:57):
law enforcement. It's just happening very fast. Whatever they want to do, they can do, and in some cases they do all sorts of stuff with it. They will use it, like the NYPD very famously used, um, what you were talking about with doppelgangers. There was a guy who was caught stealing beer at a CVS. Not even

(26:18):
a Duane Reade, a CVS. And they said, this guy looks a lot like Woody Harrelson, and we don't have a good shot of him to use in facial recognition. So they went and got a pic of Woody Harrelson, ran that, and they came up with a match, and they think it was the guy on video at the CVS. And so, um, the Georgetown

(26:42):
School of Law produced a study called Garbage In, Garbage Out, and they were basically like, that's not okay, you really shouldn't be doing that. But that's the level of legality as it stands right now. It's just open season; it's basically whatever you want to do, you can do, as far as facial recognition is concerned. In that story in particular, it's like,

(27:05):
some people are like, awesome, the system works. Sure. Other people are like, what about poor Woody Harrelson? He was really in danger right then of being implicated in this beer-stealing scheme at CVS. And what did he say? What? Dude, I love that guy, man. True Detective, the first season. Yeah, the first four episodes, just amazing. That's called using a probe

(27:27):
photo, when you say, hey, that looks like someone. They also did the same with one of the New York Knicks. Apparently, I could not, for the life of me, find out who. Yeah, it's like he's being protected or something. Maybe no one said who it was. Um, a couple of numbers for you, though. Uh,
The FBI receives about fifty thousand facial recognition search submissions

(27:49):
a month for their database. So that's the other thing: if you don't even have the money for a subscription to Amazon Rekognition, or you don't have an IT person who's capable of assembling and, you know, using it, you can just submit these requests to the FBI. So there's a lot of different avenues you could take as law enforcement to use

(28:12):
facial recognition technology to catch suspected criminals. Yeah, I was about to say bad guys, but who knows; as we'll see, that's not always the case. So here's some more numbers, though, because, you know, it needs to be regulated, but when it works, it really works. Yeah, it really does, though. That is the thing. Yeah, there was one department where

(28:32):
they said it lowered the average time required for an officer to identify a subject from an image from thirty days to three minutes, which kind of brings home the point. There's another number in here that's interesting, but it brings home the point that, like, this is something that human policemen, officers, were doing with their

(28:54):
eyeballs, by flipping through books, yes, for thirty days straight, saying, like, it doesn't look like this person. This is like a chance to really speed up that process and to spend more time, in theory, catching bad guys. Yes. I'm not arguing for it; I'm just saying they were doing this anyway, just through manpower. Right. I think the

(29:16):
thing is, anytime you add artificial intelligence, it automatically makes the side using the artificial intelligence unfairly advantaged. It's not like the criminals are able to use AI to steal beer from CVS more effectively, but the cops are using AI to catch them stealing beer more effectively.

(29:37):
And it's kind of like, yes, it makes sense to catch, like, child pornographers and human traffickers and rapists and murderers and violent criminals with this stuff, but using that kind of technology to catch somebody who stole beer from a CVS? That's when it starts to feel like, what kind of society are we moving towards? You know? Well,

(29:58):
I think so. And let me keep going here for a second, because I don't want people to be like, what, are you in favor of the guy stealing beer from CVS? No, I'm not. I think you're a scumbag if you steal beer from CVS. But I also think that it's overkill to use facial recognition technology to catch that person. Use old-fashioned police tactics, or don't catch them. Yes,

(30:19):
it's just kind of the fairness of it, the old West in New York City. I think I might be on the other side, because I don't think we need to set a fair playing field between criminals and cops, saying, like, it's unfair that cops can use this stuff and criminals are just out there not able to use these same techniques. Okay. So, my fairness thing doesn't just

(30:41):
end at the law-and-order thing, right? Like, it's not just with cops using it, that they have this huge advantage. I totally get how people would be like, no, give the cops that huge advantage; I don't have an issue with that in and of itself. I think my issue comes a step or two down the road, sure, where the government, or the cops acting on behalf of

(31:05):
the government, use that against everyday citizens who have no recourse whatsoever. That lopsidedness that's so evident when you're using AI to catch somebody stealing beer from the CVS, it's really easy to kind of follow that a little further across to the horizon and see just how unfair life could be, and how oppressive that could

(31:28):
be, using that technology. I think that's ultimately what I'm saying. I hopefully dug myself out of that hole by now.
So, and this gets into some of the controversies and the arguments: if you're scanning mug shots for rapists and arsonists and murderers and violent criminals,

(31:50):
and you're catching people, you're not gonna find a lot of people who say, well, that's not fair, go back and take a month to look through a mug shot book instead, and waste a bunch of time, and don't be efficient. So I think most people would say that's fine if you're looking at mug shots. Although we should point out that a mug shot just means you were arrested; it doesn't mean you were guilty of anything.

(32:13):
So, there are plenty of opportunities for false positives, and people being put in jail that shouldn't be. But there's not a lot of people who are like, no, don't use mug shot databases. Right, exactly. If you're scanning driver's license databases, or other just general public databases, that's when it gets super tricky, because we can't avoid the fact

(32:35):
that, as the Center on Privacy and Technology kind of stated very plainly, what that means is everyone is in a perpetual lineup. Essentially, if you have a driver's license, you're part of a police lineup. Yeah, whether you like it or not, whether you know it or not. And if that computer says, here's the guy, it's Chuck Bryant, they will say, oh,

(32:58):
he doesn't strike me as very sinister, and the computer would be like, trust me, this is the guy, with, like, an eighty-something percent confidence interval. Chuck, suddenly you're going to get visited by the cops, and maybe you'll even get arrested because you were a little cagey when they talked to you and you set off their cop radar or whatever. And then the next thing you know, you're in court, being charged with a crime

(33:20):
that you didn't commit, because the computer implicated you and the cops thought that you were acting cagey. And let's say that you were a very, very poor person and you don't have any money to mount a decent defense. The best you can afford is a free public defender who has fifty other cases and is not really paying very much attention to you, and you're in jail now because you got convicted wrongly, because you were put in a lineup

(33:44):
just because you had a driver's license. Yeah. I think, for me, and this is totally my privilege coming through as well, like, I'd want to see some numbers. If, for every ten thousand arrests and convictions of real criminals, rapists and murderers, there's three people that get falsely identified and have

(34:05):
to go through the system and may or may not be acquitted, I'd want to see those numbers. But again, that's coming from a privileged position, as someone who could afford a legal defense, who is, uh, white. Yeah, exactly. That's another one, too: people of color bear an inordinate, disproportionate burden when it comes to facial recognition technologies, as we'll see. Well, I mean, you might
(34:26):
facial recognition technologies. We'll see well, I mean you might
as well go ahead and talk about that. Um, I think from the beginning, even with social media, there were certain early facial recognition technologies that admitted, like, we're not as good at seeing or recognizing faces with darker skin. It's just not that good. Yeah, I

(34:49):
think something like twelve percent of darker-skinned men and thirty-five percent of darker-skinned women were misidentified, compared to one percent and seven percent of light-skinned men and women. And they say it's because of the data sets that these machines have been trained on, which is not, it's not purposeful, but it makes sense if you live in, like, a

(35:10):
society where, generally, like, the white people are in power, and it's like whiteness is the most celebrated part of the society or whatever. That's why you're going to have more pictures of white people. And when you feed just a bunch of pictures from your society into a machine and say, learn what faces are, it's going to go, oh, white men, I got you. Well, there just are more white people, numbers-wise. That probably has something to do with it, right? Um, yes,
(35:32):
Numbers wise, probably has something to do with it, right, Um, Yes,
that is an excellent point as well, for sure. But the fact of the matter is, the data sets that the machines are learning on are largely white and largely male, and so they're just not as good at recognizing the differences in faces among people

(35:55):
who aren't white males. Uh, let's read these quotes. There's a couple of good quotes here. The first one is from Woodrow Hartzog. I was going to read it as Werner; I don't know if I can. I should get Nolan in here, he does a good Werner. 'The most uniquely dangerous surveillance mechanism ever invented. It's an irresistible tool

(36:18):
for oppression that's perfectly suited for governments to display unprecedented authoritary-tane, I'm sorry, authoritarian control, and an all-out privacy-eviscerating machine.' I just realized it's Hartzog, so it's spelled differently. It's H-A-R-T-

(36:39):
Z-O-G, and Herzog is just H-E-R-Z-O-G. I'm glad that we didn't figure that out beforehand, though. You want to take the other one, though? I also have to say I detected a hint of Michael Caine in there too. There might have been a little bit. It's hard to get Michael Caine out of my system. What's the other one? Oh, from Microsoft president Brad Smith? Yeah. So,
Brad Smith says that when combined with ubiquitous cameras and

(37:02):
massive computing power and storage in the cloud, a government could use facial recognition technology to enable continuous surveillance of specific individuals, like they're supposedly doing in China, as an aside. It could follow anyone anywhere, or, for that matter, everyone everywhere, at any time, or even all the time. And this wasn't a sales pitch. He was speaking out

(37:23):
against this to Congress, saying, like, guys, we have to do something about this, because this is the path we're heading down. And that's why Seth Abramowitz changed his name to Brad Smith. It sounds like a total, like, I-just-want-to-blend-in name. So, you've got scanning against mug shots, scanning against driver's licenses,

(37:47):
and then there's a new one that just came out. The New York Times just released this exposé on January eighteenth, just a few days ago, on a company called Clearview AI. And apparently, even among Silicon Valley, there has been this longstanding, kind of unspoken thing where, let's steer clear of this facial recognition technology, because it's such a tool of oppression, potentially. And Clearview AI said, hey,
(38:10):
such a tool of oppression potentially. And clear View AI said, hey,
we're not from Silicon Valley, we're just going to do our own thing. So now there's this tool that's available to law enforcement agencies, and they're using it. Remember that one guy who had a quote saying that it went from thirty days to three minutes? They were almost certainly using Clearview AI. And the

(38:33):
reason Clearview AI has such an advantage is because they've gone to this place that everyone else said was off limits, which is scraping social media. So rather than the forty-one million driver's license and mug shot pictures that are available in the FBI's database, Clearview AI is this app that you can subscribe to for a year for like two thousand to ten thousand dollars, and they have
(38:54):
like two thousand and ten thousand dollars, and they have
three billion pictures, including links to the social accounts of
the people whose pictures come up, so that you can
not only see who it is, you can find out
where they're at right then. And it's a hugely invasive thing.
And there's no legislation on this whatsoever. And it's only

(39:16):
just recently come out that this company even exists, or that this app exists, and that law enforcement is using this stuff, because, again, there's basically no laws saying you can do this, you can't do that. And
again, Woodrow Hartzog has basically said there's no way we're going to realize the benefits of this without the incredibly disproportionate drawbacks, and, um, he just calls for an all-

(39:40):
out ban of the technology. He's basically saying it's not worth it.
All right, let's take another break. Oh my gosh, we
haven't taken our second break yet. Uh, and we'll be
right back to talk about the rest of this stuff
right after this. I think we should talk a little bit,

(40:18):
like we've talked about the false positives. And I think with Amazon, their contention is that what you're talking about with, like, the studies out of MIT that said that there are too many false positives, they're saying, wait a minute, you're talking about facial analysis, not facial recognition, and those are two different things.

(40:40):
I did not understand this at all. I went and looked it up and I just don't fully get it either. It sounds like some tap dancing to me. I looked, and there's, like, not a distinction between those two aside from this quote. Yeah, it's basically the same thing. And also, it doesn't even make sense as a defense. So basically, what they're saying is that they

(41:00):
were being called out by MIT's Media Lab. They did a two thousand eighteen study; that's the one that found that there's, like, a twelve and thirty-five percent misidentification rate among darker-skinned men and women. I think, yeah. Um, and Amazon said, no, no, no, you guys are using facial analysis, not facial recognition. It's like, no, that's

(41:23):
not the case at all. They're doing facial recognition. All right, I'm glad it wasn't just me, because, you see, I wrote 'I don't get it' next to this. It was a bad, a bad jam, I guess. But I think their point was, well, you're trying to tell the gender of somebody, and if you're doing binary gender stuff, like you're trying to say this is male or female, you can't really use facial recognition

(41:47):
for that, especially among darker-skinned people. And they said that you shouldn't use that, especially in cases involving people's civil liberties or whatever. But it still remains the case that if you are a darker-skinned person and you're being looked at by a police department that has its

(42:07):
threshold for a confidence level set low, yeah, there's a chance that a false positive is going to be put out there. Right. And that can be trouble for you if you don't have the money to mount a defense. And even if you do have the money, you shouldn't have to mount a defense, to spend money on that, to be acquitted of a crime, just because the computer is not as good at distinguishing Black people

(42:30):
as it thinks it is among white people. Yeah. And, you know, when it comes to where this is going to end up legally, you might want to look at the Fourth Amendment. It gets really dicey on how you interpret the Constitution when you talk about illegal search and seizure. Is this a search or a seizure? Probably not,

(42:50):
because it depends on what we're talking about, with the Supreme Court. You've probably been stopped at a DUI checkpoint, and that's stopping everybody. That's sort of the same thing. It's like, if you're in a car, we're gonna stop you and check you out, because the public has said, you know,

(43:11):
that's okay, it's reasonable, it's not super invasive, and if you're stopping drunk drivers, it's just putting someone out for a few minutes. Yeah, the court said if it's minimally invasive, and the public good, or the potential for public good, which in this case is getting drunk drivers off the road, is high enough, then it's okay to basically search everybody without probable cause. Yeah. Same with TSA

(43:32):
checkpoints. When it comes to official rulings, obviously we don't have one on facial recognition yet. But if you look at Carpenter v. United States, the court ruled five to four that police violated the Fourth Amendment rights of a man when they asked for his cell phone location data from T-Mobile without a warrant. So hopefully this

(43:53):
nuance will prevail. It looks like it probably won't be some blanket ruling that just says, yep, you can use it for whatever you want, right, if it even gets to that point, and if the court hears it, which it probably would. So, the other thing that has become worrisome for people, though, is that our society is becoming increasingly surveilled, right,

(44:17):
like Ring, the Ring doorbell. They market to law enforcement, basically saying, like, these people will pay to have video cameras put on their house, and you can go get these videos. On the neighborhood apps, all the time, people are like, my car got broken into, who can help me out with their cameras? So it's being marketed to law enforcement. Your TV has a camera, and your smart speaker has a microphone. So the

(44:38):
more that we are surveilled, and the more ubiquitous facial recognition technology gets, the easier it will be to not just scan a picture of somebody stealing beer from a CVS against the mug shot database, but to say, this person right here that you're looking at, that the

(45:01):
camera's following, that's, um, that's Chuck Bryant right there. And everywhere you walk, there's a little icon next to your head: Chuck Bryant. You know, if you click it, it'll show your Facebook page, or a map to your house, or whatever they want to know, your police record, it doesn't matter. And this is what we're increasingly getting closer to. And some

(45:23):
people say this is what they're already doing in China. Yeah, and London has, well, they were one of the first on the CCTV train. Yes, but they use humans, which is fair, right, for recognizing faces. Yeah, they have people actually looking at the individual monitors, looking for crime. The idea of this

(45:43):
is, it's just tracking people who are just doing nothing wrong. Yeah,
but there are plenty of people on the other side, we should point out, who are like, you know what, if you're catching bad guys, that's great, and if you're a good guy, you've got nothing to hide, so you shouldn't sweat it. Yeah. I can never remember the name of the article; I'll try to find it. Man, I wish I could remember it off the top of my head. But there's this amazing article from a

(46:06):
few years back that basically says, like, that's a terrible argument. That even if you have nothing to hide, you still are a human being, and if somebody wanted to put together, like, a dossier on the embarrassing things that you've done or said or thought or whatever,

(46:29):
and put it all together and condensed it, you could make anybody look bad. No one should want to live in a situation where, like, that could conceivably happen, in a police state. Yeah, police state. Good stuff. I guess we'll see how it pans out. Not saying the police state is good stuff. It's good stuff, though. Yeah,

(46:51):
we'll see what happens. Write in, Woodrow Hartzog, and let us know what to do. If you want to know more about facial recognition technology, you can go onto the internet and start reading stuff about it. Definitely read the New York Times exposé about Clearview AI; it came out January eighteenth. Yes. Okay, since I said that, it's time

(47:14):
for listener mail. All right, no, it's not. You know what it's time for? Oh yeah, I know what it's time for. You ready? Yeah, you say it. It's time for administrative details. All right. This is part two. This is where we thank people on the show who have

(47:35):
sent us kindnesses via snail mail. Siggi, S-I-G-G-I, sent me some hand-knitted socks. Not you, for some reason. I don't know why. I got some socks too. Oh, really? Yeah, I didn't know who they were from, so they may be from Siggi. I think probably what it was is they left them with my desk. And I thank you for it. Chuck,

(47:58):
do another one while I'm pulling up my list. Julie Shoop made us T-shirts. Shoop! This is good stuff: faux band-name tour shirts. Uh, super fun. Thanks a lot, Julie. Very cool. Uh, you're still working, so
I'm gonna keep going. Thalia Dawes is our pal from Australia; she sent my daughter a couple of books. She's a very

(48:20):
lovely lady who has a very adorable and whip-smart daughter about the same age, who listens to our show.
And, um, I was just like, man, I wish they lived here, we could do a play date. They both seemed like lovely humans. There's such a thing as planes. Yeah, go to Australia for a play date. Um. So at our Portland, Maine show, Chuck, we had, like, a lot of,

(48:42):
we got a lot of neat gifts. Jim Dieffenbacher made us amazing cross-hatch portraits. Yeah, those were great, of us, like, of a photo we took, I think, on, like, our West Coast tour. It brought back some memories seeing that. It's just really great stuff. And you can see Jim's work at jimdieffenbacher.com,

(49:03):
J-I-M-D-I-E-F-F-E-N-B-A-C-H-E-R dot com, and they were framed and everything. Yeah, very sweet stuff, Jim. We got some home-tapped maple syrup from Andy Huntsberger from Elgin, IA. Okay, what's IA? Is that Iowa? Yeah. Yeah, okay. Yeah,

(49:24):
I was about to say the wrong state. What were you gonna say? You know, I think I was going to say Illinois. If you ever see Gary Gulman's bit about abbreviating the states, dude, just look it up. One of the great comedy bits I've ever seen. It's hysterical.
Let's see. Oh, another one: at the Portland show, we got a letter from Togue Brawn from Downeast Dayboat.

(49:47):
From Lloyd Braun? Togue Brawn. Downeast Dayboat's mission is to bring sustainable, delicious scallops from Maine to the world, and she said that scallops have varietals like oysters, and that Maine has the best. So check out downeastdayboat.com. Togue Brawn, feel free to send us some scallops, as long as they've been

(50:09):
appropriately refrigerated the entire time. I got another children's book, Are You a Good Egg?, and that was from Peter Deutschel, along with some Stuff You Should Know coasters. Yeah, yeah, thanks again, Peter. I think we thanked him last episode for the coasters too. I didn't know about the children's book. Then Sarah Law, who is an SYSK

(50:31):
Army member. Um, she came to the Toronto show and she brought us a bunch of Canadian goodies, everything from Japanese cheesecakes and tarts from Uncle Tetsu's. So good. And, uh, I think some other stuff too, like Coffee Crisps, which are my favorite. Um, yeah, so thanks a lot, Sarah. Isn't everything from Japan awesome?

(50:54):
They just, it's really good. They don't necessarily invent much; they just take other people's inventions and perfect them. And it seems like they take a lot of pride in, like, doing things right. I think you could say that. Probably, because we got from Matt an assortment of food things from Japan, and that came in today, including our

(51:17):
beloved Kewpie mayo. I love that stuff. It's been too long, man. God bless you. Let's see, Leah Harrison gave us some amazing goodies too, including Coffee Crisp and Canadian Smarties, which are way better than American Smarties because they involve chocolate, and Super Smarties. A student named Maria Styling wrote us

(51:37):
a letter for an honors English project, because she had to write someone who inspired her, and she asked us, how do we choose a topic? Maria, how we choose a topic: it's pretty low-fi. We just send each other one each week, whatever happens to grab our fancy. We're always looking around our world and thinking, I wonder about that.

(52:00):
That's as easy as it gets. And we'll just send each other an email, and ninety-nine times out of a hundred we'll say, great, let's do it. Yeah. Let's see, um, oh: Michael C. Learner, who's an attorney
at law in Reno, sent us a letter about getting the word out about the National Consumer Law Center, for

(52:20):
which Learner does a lot of pro bono work for people who are poor and getting screwed over because of debt, as he put it. So, he pointed to the National Consumer Law Center and the Practising Law Institute's Consumer Financial Services Answer Book. So if you are in debt and you're getting pushed around, go check those things out, says Michael C. Learner.
Good stuff. Van Ostro, we gotta thank him again.

(52:42):
Our buddy from Washington sent us a book by his friend Andy Robbins called Field Guide to the North American Jackalope. Pretty awesome. That's pretty fun. Paul Speth from Marz Community Brewing Company in Chicago gave us a bunch of beer at the Chicago show. Thank you for that. Um, I got one more. Okay, I'll go ahead and finish up and then you can round us out. Man, I have a whole page.

(53:03):
[Laughs] All right. Robert Highland from Wham-O, this just came in today. Okay. He works for Wham-O. He sent us each their seventieth-anniversary SuperBook. Oh wow, thanks a lot. Like, you guys talk a lot about Wham-O products. Does it bounce? I have not dropped it on the floor. Yeah, let's find out. Give it a try. I'm gonna do a couple more, and then, well, maybe we'll split these up, because they're for both of us, for another episode.

(53:26):
It's up to you. You can blaze through them too. No, there's too many. Um, so, let's see: the Crown Royal people, again, for hooking us up. Very sweet. They've hooked us up many, many times. And they gave us a nice congratulations, because we got the Best Curiosity Award from the iHeart Podcast Awards last year. Oh yeah,

(53:47):
that's how old this one is. Mick Sullivan gave us a copy of his book The Meat Shower, which is amazingly illustrated. You can check it out at thepastandthecurious.com. Yeah, that just sounds really great. It really does. Let's see, um, and we'll round everything out with Danielle Dixson, who is a real-life marine biologist, Chuck, at the University of Delaware, and she

(54:08):
sent us a couple of copies of her kids' books, Sea Stories, children's books based on real science. You can check them out at seastorybooks.com. All right, you're gonna save the rest? I'm gonna save the rest and we'll split them up. All right. Thanks,
everybody who sent us stuff, and thank you also just for saying hi, to anyone who does. You can

(54:30):
say hi to us by sending us an email. Wrap it up, spank it on the bottom, and send it off to StuffPodcast@iheartradio.com. Stuff You Should Know is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
