
May 13, 2020 57 mins

As governments work to fight COVID-19 and industries struggle to survive, more and more people are concerned society will never actually return to normal. World leaders have been disappearing (and reappearing). Millions are unemployed. Governments and tech companies are rolling out vast surveillance schemes that would make a supervillain blush. And, around the world, people are wondering -- did we start fighting this too late? Are we opening too soon? Tune in to learn more in part four of this continuing series on the coronavirus pandemic.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. A
production of iHeartRadio. Hello, welcome back to the show.

(00:25):
My name is Matt, my name is Noel. They call
me Ben. We're joined as always with our super producer Paul,
Mission Control, Decant. Most importantly, you are you. You are here,
and that makes this Stuff They Don't Want You To
Know. Here in the age of social media, right, we're
recording this, as long-time listeners know, on a virtual

(00:50):
interface right where we're um, we're mentally and soulfully together,
but not physically together, and that may be the case
for some time. However, technology is allowing us to feel closer, right,
And and this the crazy thing is this stuff is
so cheap. Now we're talking about the proliferation of cheap

(01:10):
imaging technology. You guys, remember when not everyone could afford
a camera, editing software and stuff like that. I mean, hell,
I remember when not every phone had a camera on it,
or it was like, who's got a digital camera? Go
find one? You know. Now it's like a non issue.
You know, well, if you're going to take a video

(01:30):
with one of those, it was this tiny little MPEG
file that you'd end up getting. Even though you have
this powerful camera, the digital camera you could use, your
video was so terrible. You oh yeah, and you just
had a nice curated set of pixels that would move
and what appeared to look like a face exactly. But

(01:51):
but now we have you know, now it's it's so
affordable to buy these things, and and on balance that's
a good thing. But combine that with the massive growth
of social media and the massive growth there's something you
people don't talk about is often the massive growth of
archival capacity. This means, if you want to be
sweetly nostalgic about it, that our days of cherishing a

(02:14):
few glamour photographs from your local photography
studio, or a few expensive Polaroids. Those days are all
but gone. Now you can post photos of yourself, your friends,
or strangers, and you can post them anywhere you please.
There are some laws about it, um, but they're they're
not really good because legislation has always been technology's slower sibling,

(02:39):
and so those those laws often don't have a lot
of teeth to them. And with the rise of all
this technology and the rise of all these capabilities. We
also see the rise of technology that is exclusively meant
to exploit the vast amount of stuff that we've been
making just from, like, LiveJournal, Tumblr, MySpace, Friendster,

(02:59):
Facebook, all you know, all the hits, all that jazz.
So what do you guys think, I mean, optimists and
pessimists have a different idea of of what this means
for the world. Well, yeah, so on the optimist side,
I would just say the number of photographs and captured
memories I have of my son is probably more, you know,

(03:22):
it's certainly thousands of times greater than that which exists
for me as a child and my growth, my you know,
my parents, and then going back and back and back
and back. It's just exponentially a larger amount of memories
that are captured throughout his life, my son. So that's great,
that's wonderful. We can look back on those, you know, um,

(03:44):
until the the machines all turn off or we lose electricity.
Those those images will exist for my family and people
that care about me and and my son. But on
the flip side of that, Matt, you know, like folks
like my mom, for example, she doesn't consider those real.
Like, she's like, if it's not printed out and
it's not in a frame on my mantle, then that

(04:06):
is not a real memory. Which is interesting because you
to your point about like the machines turning off. There's
all these companies now that will take your Instagram photos
and print them out and send you framed copies of them,
and the proliferation of, like, Instax cameras and, like,
more analog technology is sort of a response, almost a
backlash to this over-digitization of, like, our memories and

(04:29):
our collective kind of you know, unconscious really in a
way with all the stuff that's out there. It's like
there's this sense of is it really ours once it's
out there in the ether? And I think that's a
big part of today's question. Yeah, who who owns the art?
The artists or the audience. It's a very old question,
but it's one that has continually been relevant, right and yeah,

(04:52):
But but the other question, Ben, is is it art
or is it just data? Right? Right there we go.
That's a good that's a good distinction. Um, it's a
tough distinction to make. Optimists are saying this is great,
you guys, the world is going to be a safer place,
And think how convenient everything's gonna be. You'll never lose

(05:12):
your favorite photo again, you know, and criminals can no
longer disappear into a crowd. And then pessimists say, you know,
in a world beyond privacy, every person is monitored continually
throughout their lives. Your image is logged countless times, and
all the data from that image and all the other
images of you, it's analyzed. It's bought and sold without

(05:34):
your knowledge or your permission. A world careening towards sci-fi dystopia:
pre-crime, pre-existing medical conditions, advertising so invasive and
so bespoke that it might as well be reading your mind,
because it's doing something very close to that. But before
you get all Blade Runner and Minority Report, let's let's

(05:55):
start closer to the beginning. There is a revolution happening
in the world of facial recognition right now and
its name is Clearview AI. Here are the facts. Yeah,
and the first time I heard of Clearview AI
was on a recent uh NPR piece where the founder
was interviewed. So this is a very relatively new company,

(06:16):
at least to me. But as far as the background, UM,
it is a research tool that is being used by
law enforcement specifically, and the company makes it very
clear that this is something that's designed exclusively for use
by law enforcement, UH in order to identify UM perpetrators
and victims of crimes. It's a little bit icky, I

(06:40):
would say, yeah, I mean that's you know, we have
to be honest. That's that's the high level like logline
or pitch from the company on its own website. So
we can't blame them for not wanting to get too
into the weeds about algorithms. But if we look at
how it actually works, we can we can get a
high level understanding. It's a startup. It has a massive

(07:04):
database of faces, just specifically faces, somewhere north of three
billion images and as we're as we're recording this, that
number is likely still growing. So that number is going
to be a little bit higher by the time, UM,
you get to the end of this episode. Yeah, And
it was the gentleman who founded the startup, Hoan

(07:26):
Ton-That, who I heard interviewed on, I believe
it was The Daily actually, So if you want to
hear directly from this gentleman's UH mouth what he believes
the mission of this technology and this company is. It's
very fascinating interview that kind of delves into some of
the slippery slope sides of this technology that we're going
to get into. But as far as the end users concerned, UM,

(07:49):
which in this case would be law enforcement, because currently
this is not available to UH civilians. UM. You take
a picture, UM, and it identifies who that person is
to blows that by by matching it with all of
this information that they have collected, and they've collected it,
you know, not through any really nefarious means. It's all

(08:09):
that information that we talked about that's already out there.
All they've got to do is reach out and grab it.
It's a it's a search engine for faces, and it
sounds innocuous at first. I also, UM, if it was
The Daily I think I listened to that interview, UM,
but I also listened to an excellent interview you sent
Matt an extended CNN Business UH conversation. So if you

(08:31):
want to see them visually, like if you want to
see the founder using the app a couple of different times,
I think he does it on himself. He does it
on UM. The interviewer and then he does it on
one of their producers and and it it seems to work. Um,
it's it's not a sensationalized thing either, at least the
way they're presenting it. Yeah, but we're we're gonna get it.
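(To make the "search engine for faces" idea just described a little more concrete, here is a minimal sketch of how that style of matching generally works: every scraped photo is reduced to a numeric face embedding, the embeddings are stored alongside the page each photo came from, and a probe photo is compared against the whole index by similarity. This is a generic illustration with made-up values, not Clearview's actual code; the embedding vectors themselves would come from a face-recognition model, which is assumed here.)

```python
# A minimal sketch (not Clearview's code) of a "search engine for faces":
# store one embedding vector per scraped face together with the URL it was
# found on, then answer a query by cosine similarity against the whole index.
import numpy as np

class FaceIndex:
    def __init__(self):
        self.vectors = []   # one embedding per scraped face
        self.sources = []   # public URL each face was scraped from

    def add(self, embedding, source_url):
        v = np.asarray(embedding, dtype=float)
        self.vectors.append(v / np.linalg.norm(v))  # normalize once
        self.sources.append(source_url)

    def search(self, probe_embedding, top_k=5):
        # Dot products of unit vectors are cosine similarities; the highest
        # scoring rows are the closest stored faces, returned with their URLs.
        matrix = np.vstack(self.vectors)
        probe = np.asarray(probe_embedding, dtype=float)
        probe = probe / np.linalg.norm(probe)
        scores = matrix @ probe
        best = np.argsort(scores)[::-1][:top_k]
        return [(self.sources[i], float(scores[i])) for i in best]

# Toy usage with made-up three-dimensional "embeddings"; a real system would
# use vectors from a face-recognition model and billions of rows.
index = FaceIndex()
index.add([0.1, 0.9, 0.2], "https://example.com/festival-crowd-photo")
index.add([0.8, 0.1, 0.3], "https://example.com/old-profile-picture")
print(index.search([0.12, 0.88, 0.19], top_k=1))
```

(The design point echoed later in the episode is that such an index returns the source pages rather than a dossier; everything else comes from following those links.)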

(08:54):
We're gonna maybe talk about that a little more. Um,
just just if you do, wanna look at that video. Um,
it's an interview with Donie O'Sullivan with CNN Business and
it is where we're going to reference it several times
in this episode. Yeah. So, like you said, Noel, Clearview
has taken this from publicly available sources, your Facebooks,

(09:15):
your YouTubes, your Venmos, which yes, are publicly available. And
the federales and local law enforcement admit that they only
have a limited knowledge of how the app actually works mechanically,
the nuts and bolts of it, but they do confirm
they've already used it to help solve shoplifting cases, identity

(09:38):
theft cases, credit card fraud, child abuse, even and even
several homicides. This technology goes, by the way, far beyond
anything constructed by Silicon Valley giants or anything officially created
or used by the US government. Yeah, in that that interview,
that we just mentioned. Juan mentions that they they had

(10:04):
this image or this video file of a child abuse
UM incident and the person of interest walks past in
the background at one point, and there were only two
usable frames of this person's face. And again, like imagine
in the background of a video, they were able to

(10:25):
take a still from that video of this person's face
and identify him through other social media websites where they've
got all this data from and make an arrest on
this person like that. That is um, it's mind blowing
that you could do that, and thank goodness that they

(10:46):
were able to do that, But it's the implication of
being able to do that in that one instance and
then applying it across everything is what makes it feel
a little scary, because, yeah, for the critics or for
the people who are looking at it big picture. So
if you're you're thinking about it just in a law

(11:07):
enforcement application, then it's amazing, right, and it's the best
thing ever. But if you apply it to everything, to
everyone at all times outside of law enforcement, then well,
and just strictly on like a technological level, it's fascinating
and and clearly a huge leap forward in this kind

(11:29):
of technology because I mean we've seen these videos they're
talking about, these grainy surveillance videos or ATM
camera videos. I mean it's hard to as a human being,
even if you knew that person, it would be difficult
for you to identify them. So this using this algorithm
and you know, analyzing comparing it to all these different subjects,
I imagine using points of articulation like on the faces

(11:51):
and the structures or whatever, it's able to come back
with positive matches. I think that's really fascinating, and they're
really you know, pretty innovative use of technology for sure,
as a nerd speaking you know, exclusively on that level.
But you're right, Matt, when you start to apply it
on a larger scale, it gets really nineteen eighty four

(12:13):
ish and kind of scary. I appreciate that point. No, well,
it's something that's going to come into play later and
it may surprise some of our fellow listeners, Uh, to
learn a little bit more about this technology and some
of the other topics that you raised there. I
want to say this: their pitch is very law enforcement

(12:34):
based on the website. Uh, they're quick to point out
they've they're quick to claim I should say that they've
helped law enforcement track down hundreds of at large criminals. Uh.
They mentioned the real the things that make people very emotional,
like um, child abusers, terrorists, sex traffickers. But they also
say this technology is used to help exonerate the innocent

(12:58):
and identify victims. Uh. And it seems that there they
might be embellishing a little bit in terms of the
degree of help they're providing, but they are providing help.
The authorities have used this, it's happening now. They've done
so without much public scrutiny, or they were, for the most
part, doing so without much public scrutiny. In early 2020, Clearview

(13:23):
said more than six hundred different law enforcement agencies
started using our app just in the last year, but
they refuse to provide more specifics because they're very protective
of their client base. So the badger is out of
the bag. Metaphorically, this is not a what if scenario.

(13:43):
This exists. This is in use right now, and that
means it can therefore be used on you and every
single person you know. Is that a bad thing? Uh?
Here here's the interesting part. This technology is maybe innovative
in application, but very much not innovative in

(14:03):
theoretical terms. It has been possible for a while. People
in the tech world knew about this possibility for a
long long time, and some tech companies this is so weird.
Tech companies barely ever do this. Some tech companies like
Giants said, you know, we can build something like this.
They realized it years ago, and they actually decided to

(14:26):
stop researching it. They decided not to deploy this because
of the massive potential privacy concerns. Even Google wouldn't do it.
It's like that thing in the Batman, um, which which
one is it? It's the one with Heath Ledger,
where there's that technology that turns every cell phone in
the world into a listening device, and you know he

(14:46):
has to destroy it after using it once because it's
just such problematic tech that, you know, Morgan Freeman's character
basically builds a safeguard device that causes it to you know,
destroy itself, because it's that's the thing ben about the
badger being out of the bag. Once it's out, you
can say all day long, this is just for law enforcement.
But like you said, Ben, the capability has been there,

(15:09):
so who's to stop somebody from doing a kind of
copycat version of this once it's out there and available
to check out, even if it's only quote unquote just
for law enforcement, you know. I mean, so were certain
kinds of radios and tasers and things like that, you know,
and people in the public have those now too. No, I
think it's a very good point. Yeah, I mean back

(15:29):
in 2011, think about how long ago that was, almost
ten years ago, the chairman of Google said facial recognition technology,
this kind of stuff that Clearview is doing, is
the one technology that Google is holding back on because
we feel like it could be used in a very
bad way. And now this is a fascinating point here.

(15:50):
Some cities have even preemptively banned police from using any
kind of facial recognition technology. And one of the most
notable examples of a city that's banned this is San Francisco,
at the very heart of technological innovation here in the US,
so they definitely knew what was coming. Their lawmakers, their

(16:10):
constituents gave it a big nope. And and here's here's
the other thing, Clearview AI. If we think about
the future of it, the code behind it is fascinating.
If anyone out there has been watching the latest season
of Westworld. You may be familiar with the glasses that

(16:31):
several characters wear, and just think about that. If you
are familiar with that, think about that as we describe
what this thing can do. Yeah, think about it carefully
because, you see, Clearview AI, the code behind it
isn't just facial recognition. It also includes programming language that

(16:55):
would allow this tool to pair with AR, or
augmented reality, glasses. What does this mean? This means
that if the sunglasses I'm wearing now were, uh, were
functioning with Clearview AI, I could walk down the
street and I could identify almost every person I saw,
and not just other pictures of them, but breadcrumbs that

(17:19):
would tell me where they lived, you know, what their
children's identities were, what their name was, what they did
for a living, and so it would automatically play the
six Degrees of Kevin Bacon. Uh. Actually, that's a really
great idea for an add on game for these glasses.
You can see how close people are to Kevin Bacon.
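(As a rough, hypothetical illustration of the AR pairing being described, the glasses-side loop is conceptually simple: detect faces in the current camera frame, query a face index like the one sketched earlier, and build a label for each match. Both helper functions below are placeholders standing in for a face detector and for whatever backend the glasses would talk to; nothing here is a real product's API.)

```python
# A hedged, hypothetical sketch of the AR-glasses loop described above:
# for every face visible in a frame, ask a face-search backend who it might
# be and build an on-screen label. Both helper functions are stand-ins.
from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple          # (x, y, width, height) of a face in the frame
    embedding: list     # embedding vector for that face crop

def detect_faces(frame):
    # Placeholder for a real face detector; returns no detections here.
    return []

def lookup(embedding, top_k=1):
    # Placeholder for a call to a face index / backend service.
    return [("https://example.com/unknown", 0.0)]

def annotate_frame(frame):
    labels = []
    for det in detect_faces(frame):
        source_url, score = lookup(det.embedding, top_k=1)[0]
        # In real glasses this label would be rendered next to the face box.
        labels.append((det.box, f"{source_url} ({score:.2f})"))
    return labels

print(annotate_frame(frame=None))  # no detections in this toy run
```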
You heard it here first, But but it's scary because

(17:41):
then you would like you would see you would see
Matt Frederick, and then you would you would know his friends.
Paul Mission Control Decant and Noel Brown and Ben Bowlin,
you would know everyone he worked with. It just it
gets real sticky really quickly. And and can you hear
the feds drooling in the background? Is that just my cat?

(18:03):
I don't know at this point, but that's the high
level look. Or spies, I mean, just any spy, imagine that. Well,
you guys, I mean, I think we're forgetting an even
more classic sci fi example of this, that that kind
of heads up display you'd see in a movie like
The Terminator. You know, where you're seeing from the perspective
of this, you know, assassin robot who's looking for a

(18:27):
target and being able to see who everybody is and
get all these stats and metrics, and you could apply
this to any number of you know, um let's call
them parameters surrounding an individual. You could then go deeper
with this and start pulling stats from their social media
and be served up with information about their height and
or you know whatever, like any like, um, let's say

(18:49):
they have a pre-existing condition or something like that, or
they have some kind of you know, uh, what's the
word I'm looking for, like kompromat on them. You could
even go deeper and find pictures of them, like maybe
smoking when they've claimed that they don't smoke, and they are,
you know, committing insurance fraud. I don't know. I'm just saying,
there's all kinds of ways you could deep dive scrape

(19:11):
this data and apply these points to an individual and
then get that information served up to you very quickly
in this display. And and yeah, and optimists are so
quick to point out the potential benefits of Clearview
AI and facial recognition in general. It's true there are benefits. Pessimists, however,
claim the disadvantages here in this specific case far outweigh

(19:35):
any of those benefits, outweigh them, in fact, by a
vast and disturbing margin. We'll pause for word from our
sponsor and return to dive down the rabbit hole. Here's
where it gets crazy. There are a ton of problems

(19:58):
we've we've you know, we've done some foreshadowing on these.
But but let's let's talk a little bit more about
them first. You know, you can categorize these right, So first,
there are not many strong operational restrictions on how this
technology is actually used. That means that it can not

(20:20):
only be easily misused, but that there will probably not
be repercussions for people, at least not not legal repercussions.
It's like a point you made earlier, Ben, off off-mic.
It's like when you used to go into those uh
tobacco shops and uh, you know you are not allowed
to refer to certain smoking apparatuses as bongs. They

(20:42):
are water pipes exclusively because that absolves the distributor, the
retailer of any responsibility if you choose to use them
in an illegal manner. It's the same thing. It's like saying, I, hey,
whatever you do with my thing is up to you.
I I know it's designed exclusively for a legal purpose. Um.

(21:03):
It's a real sticky situation, isn't it? Because I mean
it's it not only UM puts all of the burden
on the user, it absolves the creator of any kind
of responsibility at all. Or does it? Hey hey buddy,
that's a tobacco water pipe, right Okay? Yeah? Yeah? UM

(21:23):
saying you know, saying something like that the way the
founder did in that interview with CNN where he was
just saying, you know, the thing says just use it
in this way. It's it's we've seen it over and
over and over again. There there are a couple of
good things that he does speak about in that CNN
Business interview, And one of the major things, you know,

(21:46):
if you put it at the top, it's that the
software costs around fifty dollars to have a license for
a you know, for a department or something for years,
for two years, and so at least there's a buy
in level there that just your average person if they're
getting if they're obtaining it legally, wouldn't be able to use.

(22:06):
And then there are you know, ways to verify if
you know, an end user is actually, if that license
is actually the person that should be using it.
That's good. Um. And the other thing is if you're
a manager within the software of a you know, a
licensee of the software, you get audits of every

(22:27):
time the software has been used, which is really nice.
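(For a concrete sense of what a per-search audit trail like that could look like, here is a minimal, hypothetical sketch: every lookup writes a record of who ran it, when, and for what case, so a department manager can review usage later. The file name and field names are assumptions for illustration, not Clearview's actual schema.)

```python
# A minimal sketch (not Clearview's real API) of the per-search audit trail
# described above: every lookup is recorded with who ran it, when, and why,
# so a department manager can later review how the tool was used.
import csv
from datetime import datetime, timezone

AUDIT_LOG = "search_audit.csv"  # hypothetical log location

def run_face_search(user_id: str, case_number: str, image_path: str) -> None:
    # The matching step itself is out of scope here; the point is that the
    # search is never run without first writing an audit record.
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "case_number": case_number,
        "image": image_path,
    }
    with open(AUDIT_LOG, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=record.keys())
        if f.tell() == 0:        # new or empty file gets a header row
            writer.writeheader()
        writer.writerow(record)
    # ... perform the search itself ...

run_face_search("officer_42", "case-001", "suspect_frame.png")
# A manager's review is then just reading the log back:
# with open(AUDIT_LOG) as f: print(f.read())
```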
So at least at least there could be some oversight
to it and they could check to see if you
are quote, using it in a way they're not supposed to. Yeah, yeah,
there's a there's an example of that. I think they'll
they'll come into play here later, which is kind of
it's pretty weird. Uh, it's it's a little it's a

(22:48):
little spooky, but I want to go back to that warning.
It's it's literally something. It's it's like that old FBI
VHS tape warning that we mentioned in an earlier episode. You
go onto the homepage essentially the app, which you can
use on desktop or you can use on your phone,
and basically all it says is like, don't use this

(23:09):
in a bad way. And that's it's. It's like as
effective as those tags on mattresses that say don't remove
this tag, or or the the little note on your
box of Q-tips that's like, hey, don't put these in
your ear. I know it feels awesome, uh, and I
know that's what everybody does with it, but don't don't

(23:31):
put them in your ear. Uh. And then like you said, no,
that absolves them. Or the warning about seizure is at
the beginning of when you start up your PlayStation, you
know what I mean, It's like, yeah, okay, maybe I
get seizures, but I'm gonna roll the dice because I
sure want to play me some Borderlands three, you know. Yeah,
because you know, now that I think about it, we
might be in a bit of a glasshouse situation. Here, uh,

(23:53):
my friends, because our show literally starts with a warning
that says you can turn back now. We're not We're
not as bad as Q-tips, you can you can turn
back now. That's definitely, as is our prerogative, to
follow said warnings and not stick Q-tips in our ears.
But who wants to do that? What are you gonna
use them for? Like putting on eyeshadows? I mean, I
don't know, like the back of your ear you need

(24:17):
a Q-tip for that? Can't you just use like
a like a like a tissue. Some people are living
wild, man. I don't know what to tell you. Q-tips
are literally probes. I don't know anyway. I just feel
like they're very clearly designed to shove in your ear
and then it does feel great, But yeah, it can.
It's pretty scary when you pull out some blood. You know.
I'm just saying, so, yes, I've never done that. Actually

(24:39):
I'm gonna stop. No, I've never gotten blood, like gone
in for wax out with blood. But you know, maybe
I'm just fortunate. I want to bring up something that
is possible with this software and just how it uses
it before we jump into, uh, the next thing. We're
gonna talk about. And that's just that when your face

(25:01):
ends up on a social media website somewhere, if you
even if you did not take that picture, even if
you were in the background of a selfie that someone
took at Bonnaroo, let's say, or something like that, your
face is still searchable with facial recognition software if it
is visible in the background of an image, if you're

(25:22):
in a crowd, and anywhere you are, anywhere you've
been where a photograph has been taken where you
are visible, uh, somewhere in the background, this thing, Clearview
AI, very much likely has that image
within its database and your face will be recognizable to it.
Just let's put that there. So it doesn't matter if

(25:44):
you're on social media exactly, Guys Devil's advocate here, Isn't
that one of the situations where if you're in public
you're sort of giving up that right to privacy, Or
if you go to a Bonnaroo, there's usually signs that
say when you pass these gates, you are subject to
being filmed, etcetera. And that's usually for more like documentary

(26:05):
purposes of the festival, but I would think it would
also apply if you're in public and you're caught on
you know, one of these millions of little surveillance devices
that people carry around with them. Is that fair game?
Can you sue over use of, or misuse of,
your image? I wouldn't have thought. I wouldn't
have thought so. I'm certainly not arguing about, you know,

(26:26):
the legality of it. I'm just saying the reality of
it is, wherever you go, if there is a camera
taking a picture, whether it's CCT, well, CCTV generally wouldn't
show up in one of these things unless it was
placed into the system. At the risk of
sounding too, I don't know, too extreme about this,

(26:47):
maybe too prescient. I think the best for people concerned
about this, the best way to handle it, which is maddening,
is to is to apply some of the rules that
some of the rules that we apply to guns
to cameras. So you always assume a gun is loaded, right,
you never want to point it at you, So always

(27:08):
assume a camera is recording. And if you don't want
to be with that assumption, if you don't want to
be in the footage or in the photo, then don't
have it pointed at you. You guys, remember I was
very uncomfortable with Pokemon Go when it came out, and
I was like, no, they're putting poke They're putting these

(27:29):
are little cute things to collect because you're filming, you're
filming for them. You guys are like Ben big brother
does not give a like the faintest hint of a
of a about Charizards or whatever. Right, But still
that that stuff freaks me out, you know. But we
we did a whole story about that company,

(27:51):
Niantic, and like how it had roots in intelligence gathering.
I mean, you're not wrong, Ben, and I remember that
we'd be out hanging out and but would be doing
their little capturing their pokemons and then be on your
cell phone or something, and you would like cover your face,
and that is your prerogative to do. And I completely
understand where you're coming from, because it's like with the
whole zoom phenomenon that we're obviously we're using it right now.

(28:14):
You can turn off that camera, uh as far as
you're concerned where it doesn't show it to the group,
But I guaran-damn-tee you, it's still capturing what's
going on. And and storing it somewhere, so you know,
it's a great technology. It works real well, which is
why it's having this moment right now. But I just
would wouldn't trust it too implicitly. You know, you know,

(28:36):
the best technology, that's why it's got a sticker or some
tape at all times. Yeah, but that's uh, that's that's
still that's that's a low tech hack. But you have
to do it because this stuff can be weaponized. And uh,
there's a guy near me, Eric Goldman at the High

(28:58):
Tech Law Institute at Santa Clara University. I like the
way he put it. He said, the weaponization possibilities of
this are endless. Imagine a rogue law enforcement officer who
wants to stalk someone that they want to hit on later,
or maybe a foreign government is doing similar to what

(29:18):
you described NOL. They're digging up secrets about people to
blackmail them or to throw members of an opposition party
in jail. This stuff can happen. And you know, clear
View is also notoriously secretive. Even now. That's the second
big issue. It leaves a lot of critics ill at ease.
We we know some stuff because of some excellent research

(29:41):
from earlier reporters. Let me tell you, guys this weird story.
So there's a there was a New York Times journalist
who was working on a story about clear View that
really brought it to international attention. And and while this
journalist is working with clear is trying to do this story,
clear View isn't really answering him there. They're like ducking

(30:03):
attempts to contact via text, email, phone call. And so
this reporter knows, uh knows that a lot of law
enforcement agencies have clear View access. And so this reporter
asked some contacts in the police department to run his
photo through the app and they do a couple of times.

(30:25):
So the first approach, the first thing he learns about
clear View approaching outside forces is not when they come
to him. It's when they go to the police that
he worked with, and they asked the police, Hey, are
you talking to the media. How shady is that? Yeah?
That is whoa They have an explanation, right, oh absolutely,

(30:49):
And just really quickly that's the piece that was on
the Daily, because The Daily is affiliated with the New York Times.
So this was after the fact of this reporter, whose
name I'm trying to find, um, had done all this
research and gone through this whole runaround to even get
a sit down. The reporter eventually found the office and
it was like very shady like there weren't that many
people there. It was a skeleton crew and uh finally

(31:10):
got access to the to the founder. But that is
all um, you know, described in great detail on that
piece by The Daily. I highly recommend checking it out. Yeah,
Kashmir Hill, "The Secretive Company That Might End Privacy as
We Know It" is the um is the Times article. Yeah,
I like what you're saying about. When when when Hill
finally figured out where the company was originally on LinkedIn,

(31:34):
they just had like one employee listed and it turned
out to be a pseudonym or a fake name that
the founder was using. Uh so they're they're aware of
the dangers here. But but when they did eventually speak
to the reporter, they had and they had reasonable explanations
for both of these things. First, the founder said, look,

(31:56):
we were avoiding speaking to the media, not because as
we're like super villains or something. It's because we are
a tiny startup and a lot of times when you're
a tiny startup, you're in you're in stealth mode, right,
and that makes sense because the NDAs
on those things are pretty ironclad. You don't want anyone
to scoop you. And then he additionally said, here's why

(32:17):
we talked to the police about your search terms. We
actually monitor all the searches that are done because we
want to make sure that they're being used correctly. In
other words, you know, we want to make sure, uh
nothing like like, there's never a situation where a police
officer is a jilted is jilted by a romantic partner

(32:38):
and then they find out their ex has a new
lover and they're like, well, I'm gonna find out about this.
Um god, I'm even afraid to make up a name
because I don't want to put it out there. Uh
uh, Marcus McGillicutty. That's a swing and a miss. Funny thing
about that, though, Ben is like, you don't even need
technology this robust to get this. Somebody that closely removed

(33:02):
from you know, a person you really know, they're probably
already like posting tagging each other on Instagram posts and
you just you know, go and just do a couple
of clicks and you're there. Um, but no, it's it's
a really good point and that's what what set off
those alarm bells. Um. But yeah, the way it's described
in the in the in the daily piece, it was
very it felt very like, okay, A, they know they're

(33:23):
onto something and, B, they don't want to be found.
And I can see, to your point, Ben, the whole
startup mode. They don't want to get scooped. They
don't want anybody stealing their proprietary info or their, you
know technology and trying to misuse it or you know,
create a copycat kind of version of it. So that
does make sense, but just it's the whole thing is

(33:44):
a little strange. But this app, Clearview claims, uh,
you know, is is up to security industry standards in
terms of I guess in terms of you know, hackability
or, like, firewall security or you know, being being
vulnerable to being infiltrated. But who is actually watching them,

(34:06):
who is actually monitoring them because it's it's essentially untested
and already being put out into the wild in cases
that can have huge effects on people's lives. We'll talk
more about that after a quick word from our sponsor. Yes,

(34:27):
you know that was that was a perfect ad break.
Will uh also gave me time to get get this
cat out of out of my recording studio. I heard
you had a little meow-er. Yeah yeah, Well we're
all dealing with new coworkers, right, Can I just do
a quick aside, A quick quarantine aside. Had a crisis yesterday.

(34:49):
Uh we I don't know if I've mentioned the show
that my daughter got a hamster right before quarantine kicked in.
I had to inherit it because her mom was allergic.
She found out. Hamster's name is Hanako, which is
like an anime character. But anyway, we realized yesterday afternoon
that Hanako was not in her cage. Uh. So we
had a hamster search party all up in this house,

(35:12):
uh freaking out and I had just these horrible visions
of like finding a dead, decaying hamster, you know, two
weeks later in my sock drawer or something. So luckily,
the little the little critter found herself back to where
she belonged, on her own, which was great. Yeah, she
just popped out and was right there. I even thought it
was a sock and it turns out it was
a hamster. And it wasn't long. I read all this

(35:34):
horror stories about how you can only search late at night,
and you gotta bait it with peanut butter and all
this stuff, and it just she just came came right back.
It was great. That's fantastic. Yeah. I had a had
a gerbil in my younger days that escaped and lived
and died in the walls of a house, and my

(35:54):
tried to try to move the oven and appliances to
get to it, uh, but then we couldn't, And my
father thought it would be a good lesson in mortality.
Oh boy, I was hoping to avoid that particular lesson
for my kid, especially during these uh trying times when
things have been mainly pretty positive for us here at

(36:14):
the house. I'm so glad. Yeah, thanks guys. Um, but no,
you're right, Ben, tell us a little bit about the
security of this app and the idea of it being
tested for you know, um vulnerabilities. Right. Yeah, this is
a very interesting point. So a lot of the reporting
that came up about Clearview AI came before late

(36:40):
February of this year. In late February, Clearview's app
security was actually tested in the form of a hack.
That's when the public learned that the company had been
breached, their security had been compromised. Hackers had stolen Clearview
AI's entire customer list, which was coveted and very much secret.

(37:02):
Right the customer list, adding to the troubling facts about
this company. The customer list did not jibe with what
Clearview had said earlier. It spans thousands of government
entities and private businesses across the world, so the US
Department of Justice, Immigration, and Customs. But it also includes banks,

(37:23):
and he does allude to banks in a couple of
different interviews, the founder does. It also includes Macy's and Best
Buy and Interpol. Uh. And then on the on the list, uh,
you also see credentialed users at the FBI. Uh. You know,
we said hundreds of police departments. But here's something very interesting.
It also includes users in Saudi Arabia and the United

(37:45):
Arab Emirates. These are countries that are, you know, not
particularly known for their progressive policies or their their First
Amendment rights. The founder, by the way, says that they're
exercise in the First Amendment right and they scrape all
these images. Yeah, they're they're exercising a First Amendment right
because they're all publicly available and out there. And if

(38:08):
the argument that they make generally is that if Google
can scrape all the websites and create search terms for
all those things and then allow you to search that information.
If um, you know, YouTube has you know, when you
upload something to YouTube, it's all publicly available. They can
scrape all the information for all the data, and then
the other companies intertwine and use that data to help

(38:31):
each other find other things. Well, hey, why can't we
do that too? Uh So we're going to However, a
lot of the scrutiny that's been happening because of that
that hack that was reported, as well as just the
tech in general and how these other giant tech companies
how their data is being used. Tech giants like Twitter, Google,
and YouTube same thing have sent cease and desist letters

(38:53):
to clear view orders saying do not do this anymore. Um.
And even like because of this, some police departments and
other users have just discontinued use of the software. And
when the founder you know, responds to this kind of thing, um,
he says, just you know, our legal team is on it.
They're working right now to fight that battle, um, using

(39:17):
the First Amendment as our major argument. Yeah. And then
importantly we should note that they said the customer list
was the only thing compromised, So the hackers don't have
that three billion plus and growing image archive. But you're right, Matt,
legal troubles gather like storm clouds on the horizon. Clearview

(39:39):
has said it's complied with things um like policies
proposed by the ACLU. The ACLU
has said that they don't comply. And I
think there's there's a case coming up against Clearview
in Illinois, and it's one of what may be many. Arguably,
legal troubles should be gathering on the horizon, and the

(40:01):
storm should hit, especially when we look at the implications
of this technology, and before we continue, we want to
be like cartoonishly transparent here. We're not saying that Clearview
or the people in the startup are bad people,
and we're not saying even that their intentions are bad.
We're saying that this technology, just like any other technology,

(40:24):
has inherent implications, just like fire. It has
it has some positive benefits, and it has some very
dangerous possible consequences, like how could it be misused? The
first one just off the top here. The first one
is the one we talked about a couple of times
in this episode. Somebody with access could use it for

(40:46):
personal reasons. We did that hypothetical cop who's you know,
mad at his ex for dating Marcus McGillicutty or whatever.
But we can go beyond the hypothetical examples. There's a
real life example. There's a billionaire Clearview investor who
used the app to stalk his daughter's newest beau. He
wanted to find out about this guy. I imagine, you know, um,

(41:09):
whether you're a millionaire or a pauper, you of course
you want to know who your kids dating. You want
to know about them. But it's crazy that this guy
was able to um to use it and he found
out all the stuff that we just mentioned, where the
guy lives, what his job is, uh, you know, his
known connections, his social media and stuff. This example is

(41:31):
a little weird though, because I imagine being a billionaire,
you could just hire a person to dedicate their time
to doing that. Just hire a PI. This is
a This is a really important thing about this that
I maybe we mentioned, but we didn't get into it.
The reason why he was able to find all that
stuff about this person. The way I've seen the software

(41:52):
work, it's not like you identify this person and it
gives you a giant readout of all of this
person's information. You you identify that that person's face, then
all of the other publicly publicly available on the internet
faces show up and you can click on it and
it takes you to the website where that face is,

(42:13):
where that image is located. So that means you're then
directly connected to their social media, to anywhere that that
photo has been posted, and then through those other websites
you can gather all that other information. At least that's
the way it was demonstrated in that CNN Business interview
in a couple of other places, that's the way it
works for now. But we know that that UI

(42:35):
could easily be upgraded to maybe pull to scrape that
information if it's publicly available, right, I mean, think about
how many things are we've talked about before, how easy
it is to find a phone number, how easy it
is to find not just your address, but the last
few addresses you lived at over the years. This this

(42:56):
is where we should also mention people don't trust Clearview
AI. This is a little bit of stereotyping, but they
don't trust it because one of the investors, Peter Thiel,
is also big in Facebook. So Facebook, Facebook
is good at a lot of things, and users' privacy
has never been one. That is not a bug, that's

(43:18):
a feature for them. And then this is just a
dovetail on all of that. Um. In that interview,
Donie O'Sullivan does the search of his face and it's
the official profile picture on the website um where he
works for the CNN business site. And when it scrolled,

(43:39):
when they scrolled all the way down, there was an
image of him because he's in his uh, I guess thirties,
late thirties or something like that. I don't know how
old he is, but he's a person about our age,
Ben and Noel, and they scroll all the way down
and there's an image of him from a local newspaper
when he was in Ireland, when he was sixteen years old,

(44:02):
and it's a group of you know, teenagers holding up
a sign and you can just see his face and
he looks absolutely different. I mean you, I would not
have been able to tell that that was him, but
he recognized the photo and he knew it was him.
And this is the kind of thing where you know,
this is not an allegation on my part. This is

(44:24):
just me wondering and ruminating. I'm wondering how they connected
that up, How this face recognition software inserted that picture
of him where he looks so different. And the founder says, well,
you know that there's still parts of your face, the
geometry of it that remained the same over the years,

(44:44):
even if you put on weight, even if you're wearing glasses,
even if you're covering your face. I get that argument,
But to me, it makes me wonder if there is
some kind of added like Spokeo search that's going on
within this system where you're looking at that's just another

(45:06):
company that, um, you can search social media, uh, any
social media accounts that somebody has using an email or
a phone number or a name. And it makes me
wonder if they're connecting things up there as an extra
layer of a name, like prioritizing an image by uh
other information. Yeah, I mean it makes me wonder if

(45:26):
that's happening. I guess not, but I they I would
say he didn't sufficiently, the founder didn't sufficiently explain how
that image got in there, and and the interviewer is
clearly a little weirded out and disturbed by it. Well,
another thing too, is I mean, you know, in in
the United States, presumably um you you know, you might

(45:47):
be identified using this technology, but doesn't mean you are
immediately going to be convicted of something without some kind
of physical evidence or without a trial. But in other
countries where stuff like that isn't as much of a thing,
it could be used to identify social dissidents or like
people that are speaking out against the totalitarian government and
flag them and throw them in the gulag or whatever,

(46:09):
you know, without ever you know, batting an eye. And
that is uh, sort of the crux of the problem
with this software is it depends on the notion of
people that use it being good actors and and and
being good stewards of this power that they have. And
as we know, power corrupts, and we've got a lot

(46:30):
of governments that are kind of corrupt and UM out
to shut down any kind of UM criticism of of
what they do. And this would be a great way
to identify those people and round them up, UM and
put them away. Yeah, and it's it's it's dangerous for
that reason. You know, you could you could detain people,

(46:52):
and you could cut around some of their legal protections,
like what if this what if this counted in a
court of law as definitive identification, then you would say, well,
we don't need to figure out if you're the person, right,
it could get so dangerous so quickly. There's another point
we have to add. You know, this sounds like some
kind of Skynet Big Brother stuff. It can also be

(47:15):
used inaccurately. It is far from perfect. There's um It's
one thing that I've been talking a lot about with
some people online about. Uh facial recognition has an inherent
racial bias that I wish more people talked about for
non white people, for persons of color. This stuff lags inaccuracy,

(47:37):
not just clear view, but in general. Uh facial recognition
is known for this. And this means that you could
be arrested due to a computer error because the software
decided that you look like a guy you never met,
who lives in the state you've never been to, who
committed a crime that occurred while you were not alive,
and it could happen. It could happen here. And to

(48:00):
continue with that, we mentioned, you know, the people that
were known to be using this software, the organizations and
two of them are three of them. Let's just point
these these out. Department of Justice, the US Immigration and
Customs Enforcement. Let's just leave it at those two. So
take the problems you're talking about ben that are inherent

(48:22):
to this system. Apply that then to let's say, Immigrations
and custom looking for people who are in the United
States illegally, and imagine that you're attempting to round everyone up.
The implications of the software would be um extremely effective,
I would say, or completely ineffective because they're they're detaining

(48:46):
and rounding up people incorrectly because of the problems with
the software. Well think think about this too. Uh, the
problems with the software, So they compound on the state level,
but they compound on the private level. Two, and this
is one that I think for the immediate future is
at least as dangerous as the Orwellian stuff we're talking

(49:10):
about right now. So that data, when it's mind right,
just like your data on any other social media, it
can be sold to third parties, insurance companies, financial institutions,
and so on. This goes way past ads with the phrase.
I know I'm kind of churchifying here, but the phrase
I think of is a longitudinal profile of your face

(49:33):
and your behavior over time. So it's like watching a
real life progression of you aging and the let's say
that UH, an insurance company or an algorithm could start
to pick up on certain medical conditions that are manifesting
in your face in very minuscule ways, ways a human being,

(49:54):
and even a human doctor wouldn't sense. The computer picks
this up before you know anything's wrong, sort of like
the way Target's algorithms figured out that poor girl was
pregnant before her her family knew. So here's the question.
Would the insurance company legally be required to notify you

(50:14):
of this possible health condition, say, like, there's clearly something happening,
maybe if they have video too, in your behavior, that
is indicative of UM an early onset terminal condition, would
they be legally required to tell you that they knew
you had a let's say eight percent chance of dying

(50:35):
in the next two years and give you the assistance
you need right now. That's a nope. There is a
big no, Chief. They don't have to do a damn
thing for you. What they can do, with absolutely no
repercussions, is say, hey, this person is probably gonna be
dead in two years and it's gonna be expensive to
get them to year two, so let's just drop them.

(50:56):
Let's just drop him. There's no law that there's no
like ethical legal requirement for them to tell you what
they know. And that's that's mind boggling. That is reprehensible.
That is like I need, I need to get my
good unabridged thesaurus to come up with all
the different words for how bad and disturbing that is. Oh, Ben,

(51:18):
I would I love it when you do that, because
I always learn a new word. Uh, did you guys ever?
Just a light in the mood, Did you guys ever
hear that old joke about the abridge the sourrous? It's like,
I have a an abridge the sourus. Not only is
it terrible, it's terrible. So I didn't write that, but

(51:38):
you but you can, um. The good news is right now,
depending on where you live, you can opt out of
Clearview. Oh my gosh. But even the way you
opt out is, like, it's such a burden. It's such
a burdensome process. You can't just click a box in
an email, right? You, all you have to do is
send a headshot. We all have those, uh, and and

(52:01):
an image of your government-issued ID, UM, and
this also only applies to residents of California or the
EU, UM, for whatever reason, sorry, the rest of the
world right because of privacy, because of the privacy laws
that exists there. Yeah, and we shall also point out
going back to this accuracy thing there, there are serious

(52:23):
questions about how accurate clear View is in general. So BuzzFeed,
according to marketing literature they found from Clearview, UH
says that the company the company touts the ability to
find a match out of one million faces n point
six percent of the time. But when Clearview did
finally start talking to the New York Times, they said,

(52:44):
the tool produces a match up to seventy percent of the time,
and we don't know how many of those are quote
unquote true matches. So there's some contradictory information it's coming out,
which can make people feel uncomfortable, obviously, And that's that's
where we leave today. The battle lines are firming up.
On one side, you have Clearview, its investors, its clients,

(53:06):
and what I would call the techno optimists. Right on
the other side, you have privacy advocates, you have tech giants,
like Microsoft and IBM. You even have the Pope. The Pope
came out against facial recognition and he summed it up
pretty nicely. He said, quote, this asymmetry, by which
a select few know everything about us while we know
nothing about them, dulls critical thought and the conscious exercise

(53:30):
of freedom. Imagine applying that to a priest who hears
the confessions from everyone uh every week when when they
go and tell their dirty secrets. The priest is then
the one who everybody knows nothing about, uh, but
he knows everything about them. Just putting it out there,

(53:51):
especially hey, and also, uh, this thing is targeting child
sex abuse and abusers. Just also leaving that there. Yeah.
Oh one other thing for anyone listening along and thinking,
good thing. I made my profile on insert social media
here private years ago. Sorry, homie. Anything that was public

(54:12):
at some point is pretty much in the system, even
if you later made it private. So check your MySpace.
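(A tiny sketch of why that is, under the obvious assumption that the scraper keeps its own copy: once a public image has been fetched and indexed, flipping the original account to private changes nothing in the scraper's store, because the scraper never needs to consult the source again. This is a generic illustration, not any vendor's code.)

```python
# Minimal illustration (hypothetical): a scraper keeps its own copy of
# whatever was public at fetch time, so making the source private afterwards
# does not remove the already-indexed image.
class SocialProfile:
    def __init__(self, image_url):
        self.image_url = image_url
        self.is_private = False

class Scraper:
    def __init__(self):
        self.archive = []   # copies live here, independent of the source

    def scrape(self, profile):
        if not profile.is_private:      # only public content is reachable
            self.archive.append(profile.image_url)

profile = SocialProfile("https://example.com/me.jpg")
scraper = Scraper()
scraper.scrape(profile)      # image copied while the profile was public
profile.is_private = True    # later privacy change...
print(scraper.archive)       # ...the archived copy is still there
```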
In that CNN Business interview, as you said, Ben, they
test out the software on the producer who has a
private Instagram account, and images from that account show up
in the search, and it is because it was public
at one point. And that's where we leave off today.

(54:36):
What do you think, fellow conspiracy realists. Do the benefits
outweigh the potential consequences of this? Do the consequences outweigh
the potential benefits? Uh? Let us know. You can find
us on Facebook, you can find us on Instagram, you
can find us on Twitter. We always like to recommend
our Facebook community page. Here's where it gets crazy. Now

(54:58):
you can jump on there. You can talk about this
episode or any other in the past. You can
post some dank memes, let's say, some conspiracy memes or
facial recognition stuff. Maybe you've had access to this before
to clear view and you want to talk to us
about it. Don't do that on Facebook. Don't do that.
But if you do want to tell us about that,
you can call our number. We are one eight three

(55:20):
three S T D W Y T K. You can leave
a message there, tell us about it. Let us know.
If you don't want to be identified, if you don't
want us to know UM or talk about anything on air,
just just let us know. Give us the information. UM.
We'd love to hear from you. And I want to
add to this to say that I finally started, despite

(55:43):
my strange phobia regarding phones. I started diving in. Matt,
You're doing massive, amazing work there and I've started listening
to these calls as well. Thank you so much to
everyone who calls in. It's inspiring and I don't know
about you. I don't know about you guys, but it
makes me feel like what we're doing is worthwhile. If

(56:07):
that makes sense. Oh, definitely, definitely. I've had you know.
Hopefully you're gonna if you could get past the phobia, Ben,
hopefully you can call a few people because you can
use that app and you should be good to go. Um,
actually speaking to to you know, you know who you are.
If if we've talked on the phone, um, it means

(56:29):
a great deal, as Ben is saying, just to us
to know that we're not just talking in a darkened room, um,
and that you guys, you guys care about us as
much as we care about you. So this is a
great relationship. Let's keep doing it well, said Matt Well said,
And we'll be following up with some of those messages

(56:50):
in the future, so keep them coming. One last thing,
if you say, look, Ben's right, phones are terrifying and weird,
but also social media. You guys just told me how
dangerous that is? Why? Well, why on earth? What gives how?
I have a story that my fellow and listeners need
to know about. I have some I have some experience.

(57:12):
I have some terrible jokes to tell you, But I
don't know how to contact you. Well, we have good
news for you. You can always twenty four seven contact
us at our good old fashioned email where we are
conspiracy at iheartradio dot com. Stuff they don't

(57:47):
want you to know is a production of iHeartRadio.
For more podcasts from iHeartRadio, visit the
iHeartRadio app, Apple Podcasts, or wherever you listen to
your favorite shows.

Hosts And Creators

Matt Frederick

Ben Bowlin

Noel Brown