Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is riddled with unexplained events. You can turn back now or learn the stuff they don't want you to know.
(00:25):
welcome back to the show. My name is Matt. They call me Ben. You are you, and that makes this the stuff they don't want you to know. Listeners, you're probably
wondering whatever happened to the other guy on the show.
Don't worry. Our third amigo, Noel, will be back in
due time. Today, however, we've pulled a switcheroo on you.
(00:45):
You see, this is not an ordinary episode of stuff they don't want you to know. Today we are joined by a very special and returning guest. Ladies and gentlemen, our resident tech expert, Jonathan Strickland. I'd like to say thank you. This is an extraordinary episode of stuff they don't want you to know, and, uh, I am here to talk
(01:06):
about stuff they legit do not want you to know.
That's true, That's true, Jonathan, thank you for coming. You know.
One thing, one thing that we can say about Jonathan
off the bat is that he has a face that
is more recognizable than most. You host a show called
Tech Stuff. You host a show called Forward Thinking. You've
(01:27):
worked with everyday science on BrainStuff, and you regularly hit the streets at CES and E3. Yep,
these are all true. I I often will get maybe
not often, but frequently enough, get recognized simply because my
face has been in a lot of places and associated
(01:47):
directly with my identity. It's not just that, Hey, who's
that guy in the background of every Avengers movie or whatever.
I'm someone who my face is tied to who I
am frequently enough that people recognize me for the person
I am. Most people, I would argue, probably don't experience
that to any great degree, apart from, you know,
(02:09):
their close friends and acquaintances, that sort of thing, whatever, and their exes. Yeah, the enemies that they have racked up. As people may or may not be aware, Ben
and I are longtime enemies. I declared you as such
within my first couple of months of working at How
Stuff Works. Well. Yeah, but also, to be fair, I
push people. Matt can attest to that. So most people
(02:33):
would think there's not a real great chance they would
be recognized whenever they go out in public. But here's
an interesting fact that you may or may not know.
In the United States, one out of every two adults actually has their face tied to their identity in some law enforcement database. That law enforcement agency might be local,
(02:59):
might be federal. Most of them are linked together loosely
in the sense that if one law enforcement agency wants
to run a recognition UH test on a suspect photo,
they can get that done within other law enforcement agencies.
So you can, in theory, have half of the adult
(03:21):
population of the United States as your virtual lineup when
you are looking for a match on a photo. So
if you are an adult in the United States, there's a coin flip chance that your face is in one of these databases. If you've committed a crime, or if you've been accused of committing a crime in the United States, it's essentially certain that your face is in one
(03:44):
of these databases. So that's what we're talking about today.
Ladies and gentlemen. You will remember at the close of
our previous episode, we asked you to consider how many
times your face has appeared on a camera? Right, how
many times a day does a camera perceive your visage.
I would argue that a lot of people don't want
(04:05):
to be recognized casually, for one reason or another, perfectly legitimate, perfectly legal reasons. Yeah, if I'm walking around
a place and I just want to duck into a store,
I don't necessarily want the entire world to know that
I went into that store. Let's say that I'm shopping
(04:26):
for a surprise gift for my wife, totally innocent, nothing
wrong about that. Then I don't necessarily want that information
to get back to my wife and thus ruin the surprise.
Or it could be, it could be anything that you
would argue is on the further end of the spectrum.
At any rate, we have a certain expectation of privacy
(04:49):
in here in the United States that I think is
largely based off of wishful thinking at this point. So now we're getting philosophical. Yeah, waxing philosophical about privacy, or the nature of it, or how long it's existed, is, uh, sort of a hobby of Matt's and mine on the show.
(05:11):
I guess we should start, as the Mad Hatter said, at the beginning. Sure. So what do we mean when we say facial recognition? What is this? So this
was something that people were starting to work on very
seriously in the nineties and early two thousands, because it
required a great deal of processing power back in those
(05:33):
days to achieve what now we can do much more
simply with new types of computer architecture. But your basic
approach has been the same, which is that you create
a program and algorithm that identifies special anchor points on
an image, such as a face, and identifies this looks
(05:55):
like an eye. So I'm going to assume that this
is an eye. So here's the other eye, Here the nose,
here's the mouth. Based upon the relationship of these various components within a face, I would assign a numeric value, more or less, to this. It's called a face print, but it's essentially the sum total of
(06:17):
all the different features that particular algorithm is looking for.
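To make the face print idea concrete, here is a minimal sketch, in Python, of the kind of computation being described: take a few landmark (anchor point) coordinates, measure the distances between them, and normalize them into a short numeric vector. The landmark names, coordinates, and chosen ratios are illustrative assumptions, not any vendor's actual algorithm.

```python
import math

# Illustrative anchor points (x, y) pulled from a face image by some
# landmark detector; the names and values here are made up for the sketch.
landmarks = {
    "left_eye":  (120.0, 150.0),
    "right_eye": (200.0, 152.0),
    "nose_tip":  (160.0, 200.0),
    "mouth":     (161.0, 245.0),
}

def distance(a, b):
    """Euclidean distance between two landmark coordinates."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def face_print(pts):
    """Turn landmark geometry into a small numeric vector (a 'face print').

    Distances are divided by the inter-eye distance so the vector does not
    change when the same face appears larger or smaller in the frame.
    """
    eye_span = distance(pts["left_eye"], pts["right_eye"])
    return [
        distance(pts["left_eye"], pts["nose_tip"]) / eye_span,
        distance(pts["right_eye"], pts["nose_tip"]) / eye_span,
        distance(pts["nose_tip"], pts["mouth"]) / eye_span,
    ]

print(face_print(landmarks))  # e.g. [0.80..., 0.78..., 0.56...]
```

Real systems use far more landmarks, or learned features, but the output is the same in spirit: a list of numbers that stands in for the face.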
And just to get this out of the way, there
are a lot of different algorithms out there by a
lot of different companies, so not everyone does it exactly
the same way. Everyone claims that theirs is the best, but
they all use similar but not identical methods. You then
take this information, this face print, this numeric value you
(06:40):
have created, and you run it against a database of
previous face prints that are tied to known individuals. So the picture you take is called a probe photo; that is, the photo you are using to probe the database to see if you can find a match. You look and see if you find a close numeric match between the two,
(07:02):
the probe photo and all of the collected images that are in your database, where you know who those belong to. So the probe photo is of a person of interest. They might not be a suspect, but it's certainly someone you would like to talk to if you had a chance, because they're involved in an investigation
in some way. You look against the database, you try
(07:24):
to find a match numerically.
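Continuing the same sketch, matching a probe photo then amounts to comparing its face print against every stored face print and keeping the closest candidates under some cutoff. The database contents, distance cutoff, and result cap below are assumptions for illustration; as described next, most real systems hand a ranked candidate list to a human examiner rather than declaring a match on their own.

```python
# Assumed database of face prints tied to known identities (illustrative values).
known_prints = {
    "person_a": [0.80, 0.78, 0.56],
    "person_b": [0.95, 0.60, 0.70],
    "person_c": [0.55, 0.90, 0.40],
}

def print_distance(p, q):
    """Overall difference between two face prints (smaller = more similar)."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def candidates(probe_print, db, threshold=0.15, max_results=20):
    """Return the closest identities to the probe, best match first.

    max_results mirrors the idea that agencies cap how many candidates
    they hand back for human review.
    """
    scored = sorted((print_distance(probe_print, fp), name) for name, fp in db.items())
    return [(name, round(d, 3)) for d, name in scored if d <= threshold][:max_results]

probe = [0.81, 0.77, 0.57]              # face print computed from the probe photo
print(candidates(probe, known_prints))  # -> [('person_a', 0.017)]
```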
At that point, at least with most systems, but not with all, you would then have
a human being, a real life person, comparing these images
to each other to see if they actually represent a match, as in, this photo looks like, yes, it is in fact the person who's in the database. Because people can look very similar. People can look very similar, but
(07:47):
depending upon how the algorithm breaks down the face, it
may mistakenly think that two people are the same, even
if you were to look at them in person and say, well,
that's clearly not the same person. I mean, I can
tell that's not the same person. There are times where
that happens. Uh. It also has a lot of other factors,
such as, in the picture you take, is the person facing
(08:10):
straight on to the camera? If it's at a slight angle,
that changes things as well. Right. Uh, there have been
advances in algorithms where it's able to take better guesses
as to the three dimensionality of a face. But previously
it was very much a two dimensional image, which meant
that you could fool it. You could fool it either
(08:31):
on purpose or by accident. So this doesn't have to
do with law enforcement. But in Japan they incorporated facial
recognition software in vending machines that sold cigarettes because the
idea was that if you appeared to be too young,
the vending machine wouldn't sell cigarettes to you. But if
you just held up a photo of someone who was
(08:54):
old enough to purchase cigarettes, you could fool it. Because it couldn't detect depth, it was entirely just two dimensional: is this the face of a person over, what is it, eighteen or whatever? Right?
And now it's gotten to a point where, because of camera sophistication, there are a lot of cameras that can incorporate two lenses, which gives it that bi-optic
(09:15):
view, a parallax view that can simulate... Yeah, exactly, you can simulate that three dimensional vision. And there are algorithms that can take advantage of that. So that vulnerability has been decreased somewhat. It is not as easy to fool, at least for a sophisticated machine, but it can still happen.
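A rough sketch of why the second lens matters: points on a real, three-dimensional face sit at different depths and so shift by different amounts between the left and right views, while every point on a held-up photo shifts by roughly the same amount. The landmark values and tolerance below are invented for illustration; real liveness checks are considerably more sophisticated.

```python
# Horizontal (x) positions of the same landmarks as seen by a left and a right lens.
# These numbers are made up to illustrate the idea.
left_view  = {"left_eye": 120.0, "right_eye": 200.0, "nose_tip": 160.0, "chin": 161.0}
right_view = {"left_eye": 114.0, "right_eye": 194.0, "nose_tip": 148.0, "chin": 156.0}

def looks_flat(left, right, tolerance=2.0):
    """Crude liveness check: if every landmark shifts by about the same amount
    between the two views (near-zero spread in disparity), the subject is
    probably a flat photo rather than a three-dimensional face."""
    disparities = [left[name] - right[name] for name in left]
    spread = max(disparities) - min(disparities)
    return spread < tolerance

print(looks_flat(left_view, right_view))  # False here: the nose shifts more than the eyes
```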
Now, earlier, at the very beginning, yeah, you gave everybody a pretty
(09:39):
harrowing statistic. Yeah, what did you call it, a coin flip chance? Yes, there's a coin flip chance that if you are an adult in the US, your face is in one of these databases. Even if you've never done anything wrong. Yeah, you don't have to have done anything wrong, really. All you have to have done is applied for some sort of license or accreditation that also includes your photo: a passport, a driver's license,
(10:05):
a state ID. There are a lot of security clearances
that do this. So let's let's back up a little
bit and and let me explain the landscape because it
is complicated, right, It's not as simple as there's this
one massive computer system at Langley and that's where your
face is. It's much more distributed than that, much like
the Internet itself. So the FBI has the Interstate Photo System.
(10:28):
This was sort of an evolution of their national
fingerprint database. And I think you would say, all right, well,
I understand the fingerprint database. They've collected fingerprints from various
criminal investigations. This speeds things along when they are investigating
a new crime. This, however, is more about faces, less
about fingerprints. So now what we're looking at is a
(10:52):
system where if the FBI uses their normal method of operations,
in theory at least, you would probably feel a little less creeped out about all this. And that's because if
they're using their basic method of running a search, what
the FBI would do is take a probe photo from
an investigation. They would then run it against their database,
(11:15):
which includes both criminal and civil photos. The civil photos,
some of those are tied to some of the criminal photos.
If they have pictures of the same person from a criminal investigation and from, say, a license that they had applied for, then both of those will be tied together in the database. When they run a search,
(11:36):
the search only looks for matches in the criminal side.
It cannot look for matches on the civil side. So
if your ID is in the FBI's, uh, Interstate Photo System, but it's only in there as a civil photo,
and you have no criminal record to tie it to,
it should not pull you as a result. So this
(11:59):
leads to an immediate question. We're talking about the division
between civil and criminal images here facial images, at least
as it pertains to the FBI. Do they not automatically
correlate because of a matter of ability or because of
(12:19):
a matter of legislation? They do it as a policy, saying, we don't want to bring in the civil photos, trust us. But this was
the basic search. I haven't gotten to their other search,
and we'll dive into that after a word from our
sponsor and we're back. So for everybody who is just
(12:54):
now becoming acquainted with us, we've heard all of this stuff. People have seen, like, crime shows, CSI or whatever, and, uh, facial recognition in action films and facial recognition in science fiction. And what we've talked
about so far is a basic search. We've explained the technology,
(13:16):
and we've explained a little bit about the evolution. I
think it would surprise people how quickly this has evolved.
But what's after the basic search? So all right, the FBI's general M.O. is using this criminal database and avoiding the civil database, uh, when they're doing
(13:41):
their own searches. But they also have a division called
the Facial Analysis, Comparison and Evaluation Services, or FACES. They really wanted to make that acronym work. FACES is kind of like their specialty division, for when they want to run a more, uh, more extensive search. So the FACES operators can
(14:04):
run a search that incorporates both criminal and civil, uh, sources, and they can work with local and state law enforcement agencies, well, not just state, but tribal law enforcement agencies as well. In the United States, they have this ability to work with all of them,
(14:25):
to partner with them. Um, it's kind of a tit-for-tat sort of approach. Whenever anyone needs to run one of these, they can run it up the chain and see if they can get a bigger net to pull from. Right. So what will happen is you
would send a request to the FACES department. They could
run it against not only the FBI's database, which is
(14:46):
it's extensive, but that is not the one in every
two Americans databases that that statistic comes from the collection
of databases across the United States at all different levels
of law enforcements, So against state, tribal, city, UH, federal,
all of those different levels combined, that's where you get
(15:06):
the one in two. But the FBI essentially has access
to most of that because of these relationships they've developed
with various other entities. So you would send a request
to FACES. FACES would then handle the request, sending it out to other agencies. Right, so they would do
(15:27):
the FBI search, but they would also say, hey, State of Utah, can you do this search? City of Las Vegas, can you do this search? You know what, we're gonna expand this, California, can you do this search? And you start, again, casting a wider net.
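As a rough sketch of the fan-out being described, the function below sends one probe face print to several independent agency databases, lets each one run its own local search and return a capped candidate list, and merges everything for review. The agency names, caps, face prints, and matching math are all invented for illustration; this is not the FBI's actual system.

```python
# Each "agency" is just a dict of identity -> face print, plus its own result cap.
# All names, caps, and face prints here are fabricated for the sketch.
agencies = {
    "state_dmv":   {"cap": 20, "db": {"alice": [0.80, 0.78, 0.56], "bob": [0.95, 0.60, 0.70]}},
    "city_police": {"cap": 2,  "db": {"carol": [0.81, 0.76, 0.58], "dave": [0.55, 0.90, 0.40]}},
}

def local_search(db, probe, cap):
    """One agency's own search: rank its records by distance to the probe,
    return at most `cap` candidates, best match first."""
    def dist(fp):
        return sum((a - b) ** 2 for a, b in zip(fp, probe)) ** 0.5
    ranked = sorted(db.items(), key=lambda item: dist(item[1]))
    return [(name, round(dist(fp), 3)) for name, fp in ranked[:cap]]

def federated_search(probe):
    """Fan the probe out to every agency, then merge the capped candidate lists.
    The merged list would go to human reviewers, not straight to 'suspect'."""
    merged = []
    for agency, info in agencies.items():
        for name, score in local_search(info["db"], probe, info["cap"]):
            merged.append((score, agency, name))
    return sorted(merged)

# Probe face print computed from the photo under investigation (made up here).
print(federated_search([0.81, 0.77, 0.57]))
```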
Well, you start running into problems very, very quickly with this approach. Well,
(15:48):
multiple avenues. Let's start with technology. You remember earlier
I mentioned that no facial recognition algorithm does the approach the exact same way as a different one, right? Like, so, it's not like there's just one way
to achieve facial recognition technology and some may be better
(16:09):
than others. And the real problem is that we don't
really know because most of these companies haven't had a
third party come in and evaluate the software for accuracy.
So there's no, um, quality assurance, there's no accountability at all. So, one, from a technology standpoint, you
(16:29):
cannot be certain that everyone is on the same playing ground,
like they're on the same level. They may all be
using different technologies, some of which may be more effective
than others. That's problem number one. Just to add on
to that, one of the larger problematic trends in many, uh, many government agencies' attempts to implement technology is the
(16:55):
arrival and persistence of outdated legacy stuff, because they're locked into some service agreement or contract, or they just don't have the budget for upgrades. Like, if you're running
running old servers, old computers, and they aren't capable of
running the most up to date versions of software that's
(17:19):
out there, then yeah, that's yet another layer of technology issues. Yes, so,
ladies and gentlemen, if this sounds kind of weird to you, um,
one of the one of the moments in American pop
culture when the average non military, non law enforcement officer
(17:41):
person learned about this was actually watching The Wire, when
they had to use typewriters. Do you remember that? And
this is just another example of this. So not only are there various forms of facial recognition algorithms and technology, some of which may be proprietary... all of it is proprietary.
(18:03):
And not only do these not all, let's say, play nice together, I guess, or agree at times, but some are, I'm overwhelmingly certain, pretty outmoded. Yeah, effectively obsolete. Some
of them are. And again you don't necessarily know unless
you are getting regular updates from all the different agencies
(18:27):
out there. There are thousands of these law enforcement agencies
right? Like, we're not talking about, oh, there's fifty of them because there are fifty states. No, there's way
more than that. And you're talking about databases that are
not necessarily directly connected to one another. So each one
is its own little island with its own little, like...
Some of them may be using the same facial recognition software.
Some of them may be using different versions of the
(18:48):
same vendor software. Now, if if they all worked, if
all the facial recognition software worked flawlessly, this would not
be an issue. It wouldn't matter if you went and
bought from Company A versus Company B, or if you
had Company A's one-point-oh version versus Company A's one-point-five version. If they all worked perfectly, that's
not an issue. But we don't know because we haven't
(19:10):
had these third party audits for a lot of those technologies.
You get a lot of claims, Like you'll see a
company say we're accurate, and then you say, well, well,
where's the proof of that? And they say, trust us,
And then you say, well, can we hold you to
that in court? And like, oh no, our terms of
service say you can't. We can't be legally held accountable
(19:31):
for the claim that we make that it is accurate.
That's an issue, right, So that's all technology. There's another
technology issue that gets super uncomfortable. What's that? Most of the time, facial recognition software is not equally accurate across all ethnicities, because of the way people design these algorithms and
(19:55):
the way they design the facial recognition, you know, the elements that they're looking for. It may be that in one ethnicity you see a greater variety in the features that were chosen to be the elements you're looking for versus others. So within one
ethnicity you might see a greater variation in something like cheekbones.
(20:16):
Let's say the cheekbone size, the cheekbone, like, the prominence. Yeah, that kind of stuff might be really important, or at least there might be a greater variety within one ethnicity and less in another. Well, if your technology is focusing on that, and it doesn't focus so much on the other qualities that have a greater variety within a
(20:37):
different ethnicity, you're gonna get a lot of false positives. Because if the technology is looking at a subject from an ethnicity that doesn't have great representation in your technology, you're gonna get a lot of false positive identifications, because the tech can't tell the difference. It's like that statement about someone being racist: they cannot tell
(20:58):
the difference between two different individuals from the same race. They don't see the difference because they have become so focused on their own ethnicity that they only recognize the differences that are representative of their respective ethnicity.
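One way to make that concern measurable, and roughly what an independent audit would check, is to compute the false positive rate separately for each demographic group rather than reporting a single overall accuracy number. The records below are entirely fabricated, just to show the bookkeeping.

```python
from collections import defaultdict

# Fabricated audit records: (group, system_said_match, actually_same_person)
results = [
    ("group_1", True,  True), ("group_1", False, False), ("group_1", True, False),
    ("group_2", True, False), ("group_2", True,  False), ("group_2", False, False),
    ("group_2", True,  True),
]

def false_positive_rates(records):
    """False positive rate per group: of the pairs that were NOT the same
    person, what fraction did the system wrongly call a match?"""
    flagged = defaultdict(int)    # wrong matches per group
    negatives = defaultdict(int)  # genuinely different-person pairs per group
    for group, said_match, same_person in records:
        if not same_person:
            negatives[group] += 1
            if said_match:
                flagged[group] += 1
    return {g: flagged[g] / negatives[g] for g in negatives}

print(false_positive_rates(results))  # e.g. {'group_1': 0.5, 'group_2': 0.666...}
```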
And this is what's fascinating and terrifying about this. I
(21:21):
think the most immediate fascinating and terrifying thing is that
we see, again, that software is limited by its creator. Technology is limited by its creator. We all make these amazing tools based on our own parameters, and we very quickly make an assumption that something that's working on what appears to
(21:45):
be cold, hard logic is infallible, right? That emotion, that things like, uh, irrationality, don't play a factor, except for the fact that they played a factor
when we made this stuff, and it didn't necessarily come
from a conscious decision. It may have been something that
just in the design elements, people were trying to figure out,
(22:05):
all right, well, what components are the ones we need
to really focus on in order for our technology to work,
And some guy leaned in, took off his Klan hood, and said, cheekbones. Yeah, that's a very... I mean, probably not quite as extreme as that, but
the point being that there have been some claims, at least, I don't know about any deep studies, but
(22:27):
claims that facial recognition technology disproportionately affects people of, uh, specific ethnicities, particularly black people. They have been, uh, disproportionately affected by these false positives. Yeah, absolutely, yeah. And
also, that goes down a road that is
(22:49):
far beyond the scope of this episode, because we'd have
to start talking about the vast gap uh in the
experience of being say, a white person in the United
States and being the subject of a law enforcement investigation
and being a black person. There's huge issues there that
really go beyond what I'm talking about here. Uh. And
(23:09):
I'm not gonna jump into that because I'm sure you've
covered those sort of things in previous episodes. If you're
interested in learning more about that, you can look at
the Center on Privacy and Technology at Georgetown Law. They
have a website called Perpetual Lineup dot org and it
goes through a lot of what you've been talking about. Yeah,
it's it is a real thing. I want to stress
(23:30):
that it is a real thing. People sometimes will... I've encountered people with resistance to it because they don't like the idea of an inherently unfair system. It goes against some of their, um, their own personal values,
which I completely understand. But it is a real thing. Yeah. Yeah. Well, Also,
(23:50):
this is gonna be Ben's blanket statement for the episode. I always end up having one, and please write to me if you disagree. All systems are inherently, on some level, unfair. Yes, yeah,
because systems parse, and we don't have
a perfect way of incorporating every possible life experience and
(24:14):
perspective when developing a system. Systems are meant to sort
things into different categories. Typically, it's like a bureaucracy. If
you know how the bureaucracy works, you can very quickly
move through a bureaucracy. But if you don't, your experience
is going to be painfully laborious. And also, I mean,
what else can we expect if every system is
(24:36):
inherently imperfect, we can't even as a species get our
collection of people's face pictures correct. And we're pouring millions
into this. Let me throw another issue in here. Yes,
so we had the technology. It's all proprietary, it doesn't all
(25:00):
work well together. It's disproportionately discriminatory, leading not only to false positives but in some cases wrongful arrest and imprisonment. And, uh, Ben, it's about to get a whole
lot worse. Yes, you're right, it's time for a word
from our sponsor. And we're back. It wasn't that bad.
(25:30):
I actually chuckled a little bit, but it was a tease. We weren't talking about the
ad break. We were talking about yet another disturbing aspect
of the emergence of facial recognition. Yeah, I wish, I
wish I could say there's only maybe one or two
layers left. But honestly, this is a
(25:53):
mire of issues. So remember I said, we have thousands
of different agencies, all with various databases, all using various
types of facial recognition software. The FBI can tap many
of them when doing one of these searches. Um, on top of this, you don't have a lot of regulations in place for most of these agencies
(26:16):
on how they use this technology. So there are no rules,
which means there's no accountability there either. It means that
your face could be pulled into a virtual lineup. Essentially,
it could be pulled up as a hit on one
of these facial recognition searches. And because many of these
places have no rules or regulations, you are not necessarily
(26:39):
going to be alerted to this fact. You may not
be aware that your face is even there in the
first place. The first you may hear of it is
if it goes far enough for an agent to come knocking and ask, like, hey, your face
popped up when we ran this search. We need to
talk to you about this thing that you may or
may not have any connection to. You better have an alibi. Better.
(27:03):
So this is incredibly disturbing that because there are no
rules and no accountability, you can't as a citizen take
any steps, because they didn't break rules. The rules didn't exist. Before we started recording, I said, it's kind of like if there were no laws, I could walk into your house and take whatever I wanted, because there's no
(27:24):
law against stealing. As long as I have the might... if might makes right, then this is your time, because there's only policy. Yeah,
there's policy internally for these agencies, and even that doesn't match up from agency to agency. Yeah, there are eighteen
states that have memorandums of agreement on this with the
FBI specifically, and and there are more states that they're
(27:46):
looking at adding to this all the time. But again,
within each individual agency, they may have very different policies.
Some of them have been trying to be responsible, right,
some of them have done third party audits of the
systems that sort of thing. That's great, that's at least
a step in the right direction. But until we actually
make rules, either at the state or federal level, we
(28:10):
don't have, again, any accountability if something were to go wrong. Like, how do you complain if there's no
rule against it? Right? Well, one of the policies that
some of these agencies have is that before you can
return results to the FBI or to the investigative agent,
(28:30):
it has to first pass through humans who look at
the pictures and determine whether or not they make a
good match. So each of these agencies also has different agreements as to how many results they will return to whichever agent. Right. Like, uh, in the FBI, it's, I believe, between two and fifty, with the default being twenty results.
(28:54):
But some places, they'll be like, no, we will send you one, or we'll send you two, or we'll send you up to eighty results. And then human beings are supposed to cull through these and decide which ones are actually the more, uh, likely matches to the one that the computer found. You're ready to have your mind
blown here, Ben, Let's say, okay, all right, Let's say
(29:18):
that your picture has come up in one of these investigations.
Let's say that the agency that did it does have
this policy that it must be reviewed by a human
being before it goes a knee further, fifty of the
time they're going to make the wrong call the human
(29:38):
being because this is hard. It's not just hard for computers.
It's hard for people to take. If you're not looking
at the exact same photos side by side, if you're
looking at a person on a different day, from a
different angle, from a different distance, under different lighting conditions,
wearing different clothing, wearing a different hairstyle, wearing whatever. It's hard,
(30:00):
like unless you are really familiar with that person. I mean,
I've seen pictures of people who look like me where
I looked and I thought, I don't remember ever being there,
and then as I looked, I was like, oh wait, that's not me. That has happened, just to me. Don't beat yourself up.
I went home with the wrong girl. Like yeah, I
mean I've definitely walked into the wrong house. But that's
(30:21):
a totally different issue. That has nothing to do with
my recognition. That just has to do with me not
paying attention. That was back before you realized that stealing was against the law. Yeah, you know, people need to
make these things a little more clear to me. Challenging stuff. So it turns out that people really need to
go through a very thorough, challenging, difficult training process to
(30:43):
become experts in matching photos. If you have not done that,
and a lot of agencies don't require that level of training,
there's a fifty percent error rate. So not only could the
machines misidentify somebody, the human beings who are meant to
(31:04):
be the checks and balances against such things could make the same mistake. So, too bad: the coin flips, you're in a lineup, and you might be matched as someone of
Now you might say, let's say, let's say that you're
the kind of person who says, I didn't do anything wrong. What do I have to worry about? Well,
(31:27):
I mean, you're talking about a violation of a basic
amendment here, potentially the protection against unreasonable search and seizure.
You're being searched with no justification. There's no reason for
them to search you, but they're going to be looking
into you and your background. Let's say that you did
do something that was of minor significance in the past
(31:50):
that maybe, you know, you're sorry for. You did something dumb, you are sorry for it, you have reformed, and you, you know, luckily weren't taken to task for it.
I guarantee you if there's a federal investigation, that's gonna
be one of the things that pops up. I mean,
it's things like, your life could be ruined
or at least significantly impacted in a negative way for
(32:12):
something of minor importance that may have happened in the
past that has nothing at all to do with whatever
the investigation is. And also, let's go a step further here. Sure. So the practice of expunging a criminal conviction, right, that happens a lot, especially with people who commit
(32:36):
a crime when they're teenagers. Right. Sure. But whether or not that's expunged doesn't really matter in today's investigatory climate.
It's kind of like that game where people pretend the
floor is lava and no one steps on the floor.
(32:56):
I'm sorry you have to learn about it like this. The floor is lava over at the editorial department. We, um, we started that joke, um, that's why we have the carpet tiles, Ben, about two weeks before you got here, and it just kept going. You don't have... they're not special shoes, man. On the flip side, my
(33:16):
hamstrings are amazing. That's true. That's true. Alright, so, um,
we're adding some levity, but we need that levity. Yeah, you have to, because... and I am not, you know me, Ben, I am not a they're-all-out-to-get-you kind of guy. No, no, you endeavor to be
(33:39):
as objective as possible when you're discussing anything tech related.
I just feel very strongly that this particular practice is
dangerous and irresponsible on multiple levels. And I'm not the
only one. Congress called the FBI in front of the
principal's office back in March of 2017 and essentially was saying
(34:02):
the same thing that we're saying. There's no accountability here. Uh.
This is horrifying. It is a gross, uh, overstepping of your authority. It, uh, it is violating people's privacy.
People should have an expectation to not be called into
(34:23):
these virtual lineups, even if you never know about it.
The fact that it's happening is disturbing, right. I wonder
if it qualifies on your unreasonable search that and that's
one of the arguments. That's Georgetown University has has a
great report on this that is very easy to read. Uh.
There's also the Government Accountability Office has an amazing report
(34:43):
specifically about the FBI. Georgetown looks at the broader picture.
Government Accountability Office looks specifically at the FBI's use because
it's a federal agency, and the Government Accountability Office is also a federal agency. The Georgetown one looks at the broader use of facial recognition in law enforcement in general,
and they make some very strong points saying this
(35:05):
seems like it violates our protection against unreasonable search and
seizure and that you know that that's a huge deal.
That's that that's a protection granted to us by the
Constitution of the United States. So there's definitely a movement
in government to push these agencies to develop regulations. And
(35:26):
I suspect that if the agencies fail to do so in a timely manner, we will see actual legislative movement in various state and federal governing bodies to codify it
at that level, if not at the actual agency level. Uh.
And I would prefer it to be codified at the
(35:48):
state and federal level because I always worry about any
group that's allowed to make up its own rules. Yeah,
And I find it optimistic to say that someone would limit this ability, because we know, for instance, that historically, when aggregation of powerful databases occurs, it
(36:14):
almost never, just due to the different paces of technological evolution and legislative evolution, it almost never is reined back efficiently. I mean, like, right now, if,
for instance, I decided to pull an
(36:34):
Edward Snowden or Julian Assange, or, like, what's a good Bond villain? The old Doctor No? Okay, Doctor No. I was like, I'm gonna do a Doctor No thing, or Goldfinger. Yes, yes, I'm gonna do a Doctor No slash Goldfinger thing. And I'm like, Matt, Jonathan, and I are
going to do some some terrible attack on the US
(36:57):
right from our secret underwater base. Of course, of course
that's where we would do that. The volcano base is nice,
But if you're doing like a world level attack where
you're legitimately causing vast destruction, you want to be in that underwater base. You genuinely have to be afraid of floor lava in the volcano base. You really do. Those carpet tiles are a lifesaver. Your hamstrings do look great.
(37:22):
So what would happen, though, is, uh, I would quickly learn that whatever legislation exists will not in practice stop, uh, stop the NSA from knowing everything I've
done online and from extrapolating, in the interests of national
(37:46):
security any other information. The FISA courts are for the birds.
Yeah, let me, let me throw in one more. It would make it worse. Yeah, okay, you want it to be worse. Okay. So you know I mentioned earlier that
that idea that if I haven't done anything wrong, I'm
all right. Here in the United States, one of the
(38:08):
things that we are allowed to do, that is protected by the Constitution, is to assemble peacefully in public and express
free speech. So there has been a fear, justifiably in
my mind, that an agency that perhaps has a specific
political inclination, could use such facial recognition software to identify
(38:34):
participants in a peaceful protest for later let us say discussions. Right,
you don't think that's already been done. What I'm here
to say is that Georgetown, when they did their report,
they asked various agencies about their policies as to the
(38:55):
use of facial recognition software they found. Out of the
fifty two agency ease that they were able to contact
and get a response from, only one has a specific
rule against using facial recognition to identify people participating in
public demonstrations or free speech in general. Only one out
(39:15):
of fifty two that they asked specifically has a rule
against that. So, in other words, fifty one out of
fifty two have no rule. It means that if the
police were to use it, or whatever other law enforcement
agency were to use facial recognition, perhaps even in a
live feed, because you can do it in video just
as well as you can do it in photographs, to
(39:37):
start identifying, to populate a crowd with the names and identities of people who are protesting a particular thing. Let's say it's police brutality,
which would be very relevant, especially in recent years. That
is a legitimate concern, that people's free speech could be, uh, could be violated. If you feel like
(39:57):
you can't go out there and express your thoughts because
you might be unfairly targeted as a result of that,
that is a violation of your free speech, not to mention,
you know, your personal safety. Depending upon how legit those
fears are, so that is incredibly troubling. I don't know
if you noticed this, Ben, Matt. Maybe you guys don't
(40:19):
notice it. I've noticed that there have been a couple
of protests recently in the last twelve months or so.
I've seen a few, like, some of which have had a significant number of participants, on different sides of political issues, right? Not just, I'm
not saying that this is one group targeting one other group.
(40:42):
I'm saying there are a lot of different parties involved here,
and no matter how this technology is used against whichever
group it is used, the fact that there are no
rules about how to use it is deeply troubling, and
I would argue fundamentally un American, I would agree, what
do you think that? Yeah, definitely, I've seen the photographs
(41:05):
and videos of police filming using video cameras to film
protests and protesters, and that makes me very nervous. And honestly,
if you're able to get at those images, then you
can run that search, even if you weren't the one
filming it. Right. Like, one of the elements we didn't
(41:26):
really touch on in this episode, but that
is related, is the fact that we have lots of
private companies out there that incorporate facial recognition technology as
a feature for you, for consumers. Right. So Facebook is
a great example. You take a picture of someone on
Facebook that you're friends with. Facebook will often say, hey,
do you want to tag this photo? And they'll even
(41:46):
give you the name of the person because their facial
recognition software is good enough to let you know, like, oh, well,
we're pretty sure this is your buddy, you know, Bob,
So you want us to go ahead and tag Bob
for you? Well, you know that on its own as
a feature, there's nothing necessarily wrong with that. But you
could imagine that same sort of deep, deep wealth
(42:08):
of data being put to nefarious use under the right circumstances,
maybe not by Facebook itself, but perhaps by an entity
that forces Facebook to hand that information over. Exactly. And
I'm glad you said that, because more frightening than
law enforcement unfairly targeting people is the idea of the
(42:32):
increasingly gray area and the increasingly black box interactions between
large data aggregators, like Facebook, Google, your local ISP, your Internet service provider, and the rest of the
alphabet soup of the federal government, you know, and it
could be anything from the FBI to the NSA
(42:56):
to a local police department that writes in to Facebook
and says, hey, we need to... We think there are
these three guys who went to this underwater base off
the coast of redacted, and they've been posting photos in
their timeline and we just want to make sure that
these three guys are the same three guys who did
(43:16):
that amazing diamond heist a few years ago and for
their, for their freeze ray. Yes, yes. Yeah, hey, do
you guys remember B.o.B? Yeah, yeah. Yeah, yeah, B.o.B. Uh, he came at us a little while ago, us being the world, through social media,
(43:37):
and talked about the Earth being flat and we talked
about that. Well, he thinks that Snapchat is really just
collecting blackmail photos of our silly faces doing things. I
will tell you this, Uh, if you think that the
images and video that you take on Snapchat, the ones that disappear after twenty-four hours, have truly disappeared, you're absolutely wrong. Yeah. No,
(44:02):
that exists on a server somewhere. It has to. And
then furthermore, and I love that we're exploring this part too,
because, okay, Facebook. People pile onto Facebook often because it is, you know, diabolical. It's an enormous corporation that is completely dependent upon users surrendering information about themselves. So,
(44:25):
due to its very nature, what it is and how it makes money... it makes money through the fact that people use it and reveal their likes, their dislikes, you know, the things that they want, and thus
you can serve ads against that. It's very easy for
that business model to take a very dark turn without a lot of discipline and policies and the
(44:48):
will to actually uh perform business in an ethical manner.
And to be completely fair, I don't think Facebook is
run by a bunch of vampire demons, but I do
think that it can be very easy to make unethical
decisions even without being aware of it, when you have
that massive amount of data. Absolutely, and it also extends
(45:10):
to as you said, snapchat. And I had an issue
when um one of one of our co host here
knowl was going through a Pokemon Go phase and it
don't put put Pokemon Go camera on me. We're hanging out,
so there's always you know, a different Pokemon that's on you.
(45:32):
And of course being a little bit on the paranoid side.
I'm kidding. I'm way deep into that. Well, Ben, Ben, I mean, I hate to tell you this, but you are a Pidgey post. You've just got Pidgeys all over you. What do you think they mean when they say catch them all? They're talking about all the faces. Yeah.
(45:54):
One of the first things I turned off on Pokemon Go,
although it was not for fear of... Well, there is some concern for privacy, because, you know, you're using it
in public and there are people all around you. And
while you can take photos of people in public, there's
nothing illegal against that. I just think it's, you know, rude.
So I turned off the camera slash augmented reality part,
(46:14):
so it just has the animated background as opposed to
an actual camera view. Um and uh. But again I
did that not really out of concern for other people, but rather because it doesn't drain my battery as fast. So it's a selfish reason. I wish I could say
I was being a really decent human being, but really
I just didn't want my phone to die. But is
it really turning your camera off? Well, I mean, it's
(46:36):
not showing it on my screen. Whether it's showing it on someone else's screen... Oh gosh. I guess what
I am doing is consistently giving Nintendo and the Pokemon
Corporation up to date information about who is on the
BeltLine. That's good, that's what they need, and it's
a shame that they don't pay you. So there there
are a couple of things here that we should explore. Sure. Another fact that we must add is, yes, your face is out there. You know, I think it was Andy Warhol who said, now everybody will have fifteen minutes of fame.
who said, now everybody will have fifteen minutes of fame.
We're increasingly arriving at a world where everyone in some
way is going to be recognizable in one of these databases.
(47:21):
But living an anonymous life is almost impossible if you are living in modern society, right. And to me, a little twist of the knife with this: this is being sold at a profit, and no one is being compensated. So, for instance, every time Matt's, uh,
(47:42):
image shows up, it's a false positive. Every time that one resource that is entirely created by him shows up in, you know, a PD, FBI, et cetera, database, he doesn't get any royalty from that. And I,
for the record, I completely think that will never actually happen. Now,
(48:04):
if we ever see a time where the FBI opens up a fun and zany FBI T-shirt store where you can get the photo that was stored in the law enforcement database printed onto a T-shirt, maybe as a mug shot or something, that's a great idea. Oh, on a coffee mug! Mug shot. Coffee mug shot.
So let's, uh, well, let's, as they like to say,
(48:28):
let's embiggen this, all right? I think that's a cromulent decision. Thank you. Uh, I agree. So we're talking right now about the United States, which does have an extensive network, as you said, an extensive
array of networks. But my question is when it runs
(48:51):
into like multinational private industry, does it also run into
international law enforcement, like Interpol? I mean, really, if you
don't think agencies like the Central Intelligence Agency have their
own versions of the database that incorporate people from all
over the world, you're absolutely wrong. I mean it has
(49:12):
to, right? Like, and at some point, in some aspects,
you understand, right, you're talking about trying to protect national
interests from very flexible, very mobile, very agile enemies essentially,
or agents, if you want to call them that,
whether they're state agents or otherwise. So you understand, you
(49:36):
understand the necessity. The problem is again, unless you have
a clear set of rules, you know you have some
transparency there on how it operates, and you have the
ability to actually put your technology to the test to
show that it is in fact accurate. Ultimately, you're left
with a question of do we did we get the
(49:56):
right person? This leads to deaths if we If we
get the wrong person, two things happen. The bad guy
got away, right because you didn't get the person you
wanted to get, and you've given the bad guys more
reason to hate you. Because you are targeting people who
are innocent, you are creating more bad guys. Right. If
you go out there and you are targeting people who are innocent of the crimes that you think they
(50:19):
committed because you're following a wrong lead due to a
technological glitch, then you are justifying that anti sentiment because
you are doing what you have been accused of doing,
which is coming into another culture and disrupting it. So
this is true whether it's domestically and we're talking about
(50:41):
people who are disproportionately targeted by law enforcement because of
this same sort of issue, or on a
global scale. Moreover, there are other nations that have even
more pervasive camera technologies built around them than the United
States does. Hello England, I love you, but I can't
go anywhere without being on a camera in England, right,
(51:01):
don't they have the highest density of CCTV. They certainly did,
at least for a long time. I don't know if
they currently hold that record, but they did for a long time, like, especially in London, because they had so many different CCTV systems that were, most of them, self-contained. Right, so a shop owner has a security camera. Yeah, essentially,
(51:23):
that's what it was. So it wasn't like it was
just this vast network, apart from what, you know, you would see, like, in, um, Sherlock, where Mycroft is able to zoom in on Watson wherever he goes because he can tap into every camera that's on every street corner. Well,
that's one thing, because those were streets, so those would presumably be operated by the government itself
(51:44):
or at least some branch of the government, but you
wouldn't be able to necessarily tap into, like, every store's feed, because it's a self-contained system. It's not
connected to a network for now, which brings us to,
of course, the future. Yes, okay, very last part, uh, speculation on speculation, educated guesses: where is this going? So
(52:09):
On the one hand, I am somewhat encouraged in the
United States that a lot of focus has been brought
to this matter along with the Government Accountability Office and
Georgetown University. You have Congress actually taking the FBI to
task about this. Obviously, there would need to be a
lot more of those sort of discussions at all levels
(52:30):
of government for that to actually lead to regulations and transparency.
I think it's a good start. I do not know
if it's going to have enough momentum to carry it
forward to prevent massive abuse of this kind of technology
on multiple levels, because I mean that's happening right now.
I don't know if it'll curtail it. Um. I hope
(52:51):
it will. That's one possible future is that we'll see
over time the development of these rules and policies that
various agencies have to follow so that the use of
this technology is done in an accountable and responsible way,
and that uh, you know the FBI. They state that
when they're running these searches, any search result they get
(53:14):
back that does not mean the person that they get
has become a suspect. It means that they have a
match that a potential lead to follow. They're very careful
in their wording to say that they haven't identified to suspect.
That's why they use probe photo rather than suspect photo. However,
words only go so far. How where do the actions go?
(53:35):
And that's the big fear is that we don't really
know yet. So optimist Jonathan would say, over time, we
developed these rules, we hold these agencies accountable for them.
We make certain that no one is relying too heavily
on a technology that is not perfect, and it just
(53:56):
becomes another tool for agencies to use in specific
situations in order to investigate leads. And that is as
far as it goes. Cynical Jonathan, I'm more familiar with
this one. Cynical Jonathan thinks this is going to continue
largely unchecked at various levels of law enforcement agencies, including
(54:23):
the federal level, largely used in agencies that have no transparency because of the nature of what they do, things like the NSA. And that there will
be a growing number of people very much interested in
using it for purposes for which it was not intended, in
other words, not to investigate crimes alone. I mean they
(54:47):
would still want to use it for that obviously, but
also to perhaps push specific political agendas by suppressing opposing
political agendas. And this sounds very Orwellian, and, like, I'm not, again, I'm not an Orwellian-type-of-dystopia person, but it makes it so easy to do
(55:08):
that even if you weren't consciously setting out on that path,
you could do it just by happenstance of your other
methods of going about your business. And because we have
no rules or regulations for it, there's no protection against it.
So I find it very concerning. I think it needs
to be addressed, and it needs to be consistently
(55:34):
looked at as something that we have to fix, because
if we don't, the abuse of it is going to
be rampant. It's going to be disruptive. And I mean,
if you are not a member of whichever privileged group
is not consistently targeted by this, you might not see
a problem with it. But for everybody else, it's
(55:56):
a huge deal. It's unfair, it's a threat, it's not cool. Uh, so yeah, I'm afraid we're going to a
real dark place and it's literally stuff they don't want
you to know. When we talk about where this data goes,
where it's aggregated, who's talking to whom? Right? This is?
(56:17):
This is... I don't know. I think you make an important point when you say it's not illegal,
because there's no measure of legality yet. But legal and
ethical are not always the same thing. No, I mean,
(56:37):
there are plenty of things that I think you could
argue are unethical that are perfectly legal, and vice versa.
There are some things that you might argue are illegal
but are perfectly ethical. Right. Like, it's legal to, uh, sing that Journey song at karaoke, but is it ethical? I would think not. I would think, at this point, Don't Stop Believin' has been overdone. Go with a different
(57:02):
Journey song. So at this point we've talked about the past,
the present, and the possible future of facial recognition, and Jonathan,
I want to thank you so much for coming on
the show and illuminating us. Choosing that word very carefully, uh,
with the facts behind something that people hear about
(57:26):
in this amorphous boogeyman sort of way. But
it turns out that even when you shed this light
on it, for us, it's still a boogeyman. Oh yeah,
Now, this is something that is happening. It's happening without any real oversight. It is certainly incorporating
(57:46):
uh data from people who have no criminal background whatsoever,
and it's very disturbing. Like, it's one
thing if everyone were to go with the rules that
the FBI originally set, which is that we're only using
the data that we've collected from criminal cases. Even then
you could argue, well, we don't necessarily know that everybody
(58:09):
who's in that criminal database actually did something criminal. There
might be some innocent people in there. Actually, there probably are some innocent people in there. And you're also missing
out on everybody who has yet to commit a crime.
But that part would be less concerning for the average citizen, right, Matt. I mean, if you're the FBI, you
hate it because it makes it harder for you to
(58:30):
find a lead. But if you're an average citizen, you're
relieved because you know you're not gonna get targeted. But
the problem is that, no, those rules, that's not what everyone's following. In fact, we don't know what everyone is doing. And it's terrifying, because there's
no way of knowing how frequently your face has already
(58:52):
been in virtual lineups. It probably has been at some point.
It may have been that it reached a very early
stage before it was dismissed. It never went further than that.
But right now, as we're talking, as your listeners are
listening to this, there are cold algorithms looking for that
numeric match and your face might be it. And that
(59:14):
reminds me, uh, do you guys want to take a picture for Instagram after we close the show? It doesn't have to be of us. All right, we'll break out eye patches or something. Maybe. This is all true.
This is not a conspiracy theory. There is an active collection of groups that are cooperating, using your
(59:40):
data in what could be and what have been very
dangerous ways. And Jonathan, I think you've ended it for
us on the perfect note, because this is only weirdly
enough the beginning. So if people want to learn more
about not just this, but all things tech related? Could you?
(01:00:01):
Could you tell us where to check you out? Absolutely?
I host the podcast Tech Stuff. Uh, we publish twice a week, Wednesdays and Fridays. We cover all
sorts of technology topics, from things that are scary, like
I did a recent episode about near misses with nuclear war,
the times when the world was on the
(01:00:22):
brink of nuclear devastation and cooler heads prevailed because clearly
that did not happen. But I also do fun, silly stuff.
I did one that was all about the top memes
of the last few years, so talking about where those
came from and, uh, how they evolved over time. So it's the whole gamut of technology stories.
(01:00:44):
So if you want to check that out, go to
your favorite podcatching service such as iTunes podcasts and listen
to Tech Stuff. If you like it, subscribe. If you
really like it, you can watch me record it live
because I'm doing something crazy on Wednesdays and Fridays over
at twitch dot tv slash tech Stuff. I live stream
(01:01:05):
the recording sessions of my show, so you can go
and watch me record live. And here's the cool thing
or lame thing, depending upon your point of view, you
can witness all the times I mess up or have
to take a break, or have to cough, or have
to drink some water or whatever it may be. Because
it's live, it's all the stuff that we cut out
before we published the show. You can witness it. And
(01:01:28):
you can say I heard that show a month before
it published, because that's how far out I am. That's fantastic,
And I think Matt sometimes helps you produce that, is that right? He does. I have pulled Matt in on multiple occasions to help set up both the
recording and the Twitch streaming side of that. Matt has
appeared on camera briefly as part of that. I don't
(01:01:50):
make him stay for the whole thing because, as it turns out, he's got stuff to do sometimes. Yeah,
so, uh, do check out Jonathan, and if you are interested in the future of technology and want to feel a little bit better about the state of humanity, then check out Jonathan's other show, Forward Thinking, which is available on YouTube and the Facebooks. Yeah. Yeah, if you wanna...
(01:02:14):
If you wanna see my face getting identified all over
the place, check out Forward Thinking. That one was kind of like the shiny, happy cousin to Stuff They Don't Want You to Know, the dark, dark emo cousin.
I think we prefer to think of it as conspiracy realism.
I look, I'm not making any kind of judgments here,
(01:02:35):
I'm merely making an observation. I love all the children
of How Stuff Works equally. All right, Well, now we're
definitely putting a picture of you on Instagram, and you
may be asking yourselves, ladies and gentlemen, Wait, the guys
have an Instagram? Oh my gosh, OMG, where can
I find it? Well, the answer is Conspiracy Stuff Show.
You can also find us on Facebook and Twitter, where
(01:02:57):
we are Conspiracy Stuff. And if you're sufficiently freaked out by social media, where you're saying, Ben, Matt, Jonathan, Noel, love the show, I want to talk to you guys, I have some stuff that my fellow listeners should know, but I'm scared shirtless of social media, then you
(01:03:18):
can write to us directly. We are conspiracy at how
Stuff Works dot com and we especially want to hear
from you if you live in Alabama, Arizona, Arkansas, Delaware, Illinois, Iowa, Kentucky, Maryland, Michigan, Nebraska,
New Mexico, North Carolina, North Dakota, South Carolina, Tennessee, Texas, Utah,
and Vermont, because those are the states that are working directly with the FBI. Why, I counted eighteen of them, Matt.
(01:03:39):
That's right, guys. We end when we say the email: conspiracy at how stuff works dot com.