Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is riddled with unexplained events. You can turn back now, or learn the stuff they don't want you to know. A production of iHeartRadio.
Speaker 2 (00:24):
Hello, welcome back to the show. My name is Matt, my...
Speaker 3 (00:27):
Name is Noel. They call me Ben. We're joined as always with our super producer Paul, Mission Control, Decant. Most importantly, you are you. You are here, and that makes this Stuff They Don't Want You to Know. Here in the age of social media, right, we're recording this, as longtime listeners know, on a virtual interface. Right, we're mentally and
(00:54):
soulfully together, but not physically together, and that may be the case for some time. However, tech is allowing us to feel closer, right? And the crazy thing is this stuff is so cheap now. We're talking about the proliferation of cheap imaging technology. You guys remember when not
(01:15):
everyone could afford a camera, editing software and stuff like that.
Speaker 4 (01:19):
I mean, hell, I remember when not every phone had
a camera on it or it was like, who's got
a digital camera?
Speaker 5 (01:25):
Go find one, you know. And now it's like a non-issue.
Speaker 6 (01:28):
You know, well, and if you were going to take
a video with one of those, it was this tiny
little MPEG file that you'd end up getting. Even though
you had this powerful camera, the digital camera you could use,
your video was so terrible.
Speaker 3 (01:41):
Yeah, yeah, and you just had a nice curated set of pixels that would move in what appeared to look like a face, exactly. But now, you know, now it's so affordable to buy these things, and on balance that's a good thing. But combine that with the massive growth of social media and the massive growth of
(02:03):
something people don't talk about as often, the massive growth of archival capacity. This means, if you want to be sweet and nostalgic about it, the days of cherishing a few glamour photographs from your local photography studio, or a few expensive Polaroids, are all but gone. Now you can post photos
(02:25):
of yourself, your friends, or strangers, and you can post them anywhere you please. There are some laws about it, but they're not really good, because legislation has always been technology's slower sibling, and so those laws often don't have a lot of teeth to them. And with the rise of all this technology and the rise of
(02:47):
all these capabilities, we also see the rise of technology that is exclusively meant to exploit the vast amount of stuff that we've been making, just from, like, LiveJournal, Tumblr, MySpace, Friendster, Facebook, all you know, all the hits, all that jazz. So what do you guys think? I mean, optimists and pessimists have a different idea of what this
(03:08):
means for the world.
Speaker 6 (03:09):
Well, yeah, on the optimist side, I would just say the number of photographs and captured memories I have of my son is, you know, certainly thousands of times greater than that which exists for me as a child, my growth, my, you know, my parents,
(03:30):
and then going back and back and back and back.
It's just an exponentially larger amount of memories that are captured throughout his life, my son's. So that's great, that's wonderful. We can look back on those, you know, until the machines all turn off or we lose electricity. Those images will exist for my family and people that
(03:51):
care about me and my son.
Speaker 4 (03:53):
But on the flip side of that, Matt, you know, folks like my mom, for example, she doesn't consider those real. Like, she's like, if it's not printed out and it's not in a frame on my mantle, then that is not a real memory. Which is interesting, because to your point about, like, the machines turning off, there's all these companies now that will take your Instagram photos
(04:15):
and print them out and send you framed copies of them. And the proliferation of, like, Instax cameras and, like, more analog technology is sort of a response, almost a backlash, to this over-digitization of, like, our memories and our collective kind of, you know, unconscious, really, in a way, with all the stuff that's out there. It's like there's
(04:36):
this sense of is it really ours once it's out
there in the ether. And I think that's a big
part of today's question.
Speaker 3 (04:42):
Yeah, who owns the art, the artist or the audience? It's a very old question, but it's one that has continually been relevant, right? And yeah.
Speaker 6 (04:52):
But the other question, Ben, is is it art or
is it just data?
Speaker 3 (04:57):
Right? Right, there we go. That's a good distinction. It's a tough distinction to make. Optimists are saying, this is great, you guys, the world's going to be a safer place, and think how convenient everything's gonna be: you'll never lose your favorite photo again, you know, and criminals can no longer disappear into a crowd. And then pessimists say,
(05:18):
you know, in a world beyond privacy, every person is monitored continually throughout their lives. Your image is logged countless times, and all the data from that image and all the other images of you, it's analyzed, it's bought and sold without your knowledge or your permission. A world careening towards sci-fi dystopia: pre-crime, pre-existing medical conditions, advertising
(05:43):
so invasive and so bespoke that it might as well be reading your mind, because it's doing something very close to that. But before you get all Blade Runner and Minority Report about it, let's start closer to the beginning. There is a revolution happening in the world of facial recognition right now, and its name is Clearview AI. Here are
(06:06):
the facts.
Speaker 4 (06:07):
Yeah, I mean, the first time I heard of Clearview AI was on a recent NPR piece where the founder was interviewed. So this is a relatively new company, at least to me. But as far as the background, it is a research tool that is being used by law enforcement specifically, and the company makes it very
(06:27):
clear that this is something that's designed exclusively for use
by law enforcement in order to identify perpetrators and victims
of crimes.
Speaker 3 (06:39):
It's a little bit...
Speaker 5 (06:40):
Pretty vague, I would say, yeah.
Speaker 3 (06:42):
Yeah, I mean, that's, you know, we have to be honest, that's the high-level, like, logline or pitch from the company on its own website. So we can't blame them for not wanting to get too into the weeds about algorithms. But if we look at how it actually works, we can get a high-level understanding. It's a startup.
(07:03):
It has a massive database of faces, just specifically faces,
somewhere north of three billion images, and as we're recording this,
that number is likely still growing, so that number is
going to be a little bit higher by the time
you get to the end of this episode.
Speaker 4 (07:23):
Yeah, and it was the gentleman who founded the startup, Hoan Ton-That, who I heard interviewed on, I believe it was The Daily, actually. So if you want to hear directly from this gentleman's mouth what he believes the mission of this technology and this company is, it's a very fascinating interview that kind of delves into some of the slippery slope sides of this technology that we're going
(07:46):
to get into. But as far as the end user's concerned, which in this case would be law enforcement, because currently this is not available to civilians: you take a picture and it identifies who that person is. It does that by matching it with all of this information that they've collected, and they've collected it, you know, not through any really
(08:08):
nefarious means. It's all that information that we talked about
that's already out there. All they've got to do is
reach out and grab it.
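The description above, scraping publicly posted photos and filing away a searchable record of every face, can be made concrete with a small sketch. This is illustrative only: it assumes the open-source face_recognition library, the file name and URL are hypothetical, and none of this is Clearview's actual code.

```python
# A minimal, hypothetical sketch of the indexing side described above:
# take a publicly posted photo, compute a numeric "faceprint" for each
# face in it, and store that faceprint next to the URL it came from.
import face_recognition

def index_public_photo(face_index, image_path, source_url):
    """Add every face found in one downloaded photo to a simple in-memory index."""
    image = face_recognition.load_image_file(image_path)
    # Each encoding is a 128-number vector summarizing one detected face.
    for encoding in face_recognition.face_encodings(image):
        face_index.append({"embedding": encoding, "source_url": source_url})

face_index = []  # a real system would use a vector database, not a Python list
index_public_photo(face_index, "party_photo.jpg", "https://example.com/post/123")
```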
Speaker 3 (08:14):
It's a search engine for faces, and it sounds innocuous at first. If it was The Daily, I think I also listened to that interview, but I also listened to an excellent interview you sent, Matt, an extended CNN Business conversation. So if you want to see the visuals, like if you want to see the founder
(08:35):
using the app a couple of different times, I think
he does it on himself, he does it on the interviewer,
and then he does it on one of their producers,
and it seems to work. It's not a sensationalized thing either, at least the way they're presenting it.
Speaker 2 (08:52):
Yeah, but we're gonna get to it.
Speaker 6 (08:54):
We're gonna maybe talk about that a little more. Just, if you do want to look at that video, it's an interview with Donie O'Sullivan with CNN Business, and we're going to reference it several times in this episode.
Speaker 3 (09:07):
Yeah. So, like you said, Noel, Clearview has taken this from publicly available sources, your Facebooks, your YouTubes, your Venmos, which, yes, are publicly available. And federal and local law enforcement admit that they only have a limited knowledge
(09:29):
of how the app actually works mechanically, the nuts and bolts of it. But they do confirm they've already used it to help solve shoplifting cases, identity theft cases, credit card fraud, even child abuse, and even several homicides. This technology goes, by the way, far beyond anything constructed by
(09:49):
Silicon Valley giants or anything officially created or used by
the US government.
Speaker 2 (09:56):
Yeah.
Speaker 6 (09:56):
In that interview that we just mentioned, Hoan mentions that they had this image, or this video file, of a child abuse incident, and the person of interest walks past in the background at one point, and there were
(10:17):
only two usable frames of this person's face. And again, like, imagine, in the background of a video. They were able to take a still from that video of this person's face and identify him through other social media websites, where they've got all this data from, and make an arrest on this person. Like...
Speaker 1 (10:39):
That is.
Speaker 6 (10:42):
It's mind-blowing that you could do that, and thank goodness that they were able to do that. But it's the implication of being able to do that in that one instance and then applying it across everything that makes it feel a little scary, because...
Speaker 3 (10:57):
For the critics, for the critics.
Speaker 6 (11:00):
Oh yeah, for the critics, well, for the people who are looking at it big picture. So if you're thinking about it just in a law enforcement application, then it's amazing, right, and it's the best thing ever. But if you apply it to everything, to everyone, at all times, outside of law enforcement, then...
Speaker 4 (11:20):
Well, and just strictly on, like, a technological level, it's fascinating and clearly a huge leap forward in this kind of technology, because, I mean, we've seen these videos they're talking about, these grainy surveillance videos or ATM camera videos. I mean, as a human being, even if you knew that person, it'd be difficult for you
(11:41):
to identify them. So, using this algorithm and, you know, analyzing and comparing it to all these different subjects, I imagine using points of articulation, like on the faces and the structures or whatever, it's able to come back with positive matches. I think that's really fascinating, and a really, you know, pretty innovative use of technology, for sure, as
(12:02):
a nerd speaking, you know, exclusively on that level. But you're right, Matt, when you start to apply it on a larger scale, it gets really Nineteen Eighty-Four-ish and kind of scary.
Speaker 3 (12:14):
I appreciate that point, Noel. It's something that's going to come into play later, and it may surprise some of our fellow listeners to learn a little bit more about this technology and some of the other topics that you raise there. I want to say this: their pitch is very law-enforcement-based on the website. They're quick to
(12:36):
point out, they're quick to claim, I should say, that they've helped law enforcement track down hundreds of at-large criminals. They mention the things that make people very emotional, like child abusers, terrorists, sex traffickers, but they also say this technology is used to help exonerate the innocent and
(12:58):
identify victims. And it seems that they might be embellishing a little bit in terms of the degree of help they're providing, but they are providing help. The authorities have used this, it's happening now. They've done so without much public scrutiny, or they were, for most of twenty nineteen, doing so without much public scrutiny. In early twenty twenty,
(13:22):
Clearview said more than six hundred different law enforcement agencies started using our app just in the last year, but they refused to provide more specifics because they're very protective of their client base. So the badger is out of the bag, metaphorically. This is not a what-if scenario.
(13:43):
This exists. This is in use right now, and that means it can therefore be used on you and every single person you know. Is that a bad thing? Ah, here's the interesting part. This technology is maybe innovative in application, but very much not innovative in theoretical terms.
(14:05):
It has been possible for a while. People in the tech world knew about this possibility for a long, long time, and some tech companies, this is so weird, tech companies barely ever do this, some tech giants said, you know, we can build something like this. They realized it years ago, and they actually decided to stop researching it.
(14:27):
They decided not to deploy this because of the massive
potential privacy concerns. Even Google wouldn't do it.
Speaker 4 (14:35):
It's like that thing in the Batman movies, which one is it? It's the one with Heath Ledger, where there's that technology that turns every cell phone in the world into a listening device, and, you know, he has to destroy it after using it once, because it's just such problematic tech that, you know, Morgan Freeman's character basically builds a safeguard device that causes it to, you know,
(14:57):
destroy itself. Because that's the thing, Ben, about the badger being out of the bag. Once it's out, you can say all day long, this is just for law enforcement. But like you said, Ben, the capability has been there. So who's to stop somebody from doing a kind of copycat version of this once it's out there and available to check out, even if it's only quote unquote just for law enforcement? You know, I mean, so
(15:20):
were certain kinds of radios and tasers and things like that.
You know, and people in the public have those now too,
don't they.
Speaker 3 (15:27):
It's a very good point. Yeah, I mean, back in twenty eleven, think about how long ago that was, almost ten years ago, the chairman of Google said facial recognition technology, this kind of stuff that Clearview is doing, is the one technology that Google is holding back on, because we feel like it could be used in a very bad way. And now, this is a fascinating point here: some cities
(15:51):
have even preemptively banned police from using any kind of facial recognition technology. And one of the most notable examples of a city that's banned this is San Francisco, at the very heart of technological innovation here in the US, so they definitely knew what was coming. Their lawmakers, their constituents
(16:12):
gave it a big nope. And here's the other thing about Clearview AI: if we think about the future of it, the code behind it is fascinating.
Speaker 6 (16:22):
If anyone out there has been watching the latest season
of Westworld, you may be familiar with the glasses that
several characters wear, and just think about that. If you
are familiar with that, think about that as we describe
what this thing can do.
Speaker 3 (16:41):
Yeah, think about it carefully, because, you see, Clearview AI, the code behind it isn't just facial recognition. It also includes programming language that would allow this tool to pair with AR, or augmented reality, glasses. What does this mean?
(17:02):
This means that if the sunglasses I'm wearing now were functioning with Clearview AI, I could walk down the street and I could identify almost every person I saw, and not just other pictures of them, but breadcrumbs that would tell me where they lived, you know, what their children's
(17:23):
identities were, what their name was, what they did for a living, and so it would automatically play the Six Degrees of Kevin Bacon. Actually, that's a really great idea for an add-on game for these glasses: you can see how close people are to Kevin Bacon. You heard it here first. But it's scary, because then you would, like, you would see Matt Frederick, and
(17:45):
then you would know his friends Paul Mission Control Decant and Noel Brown, Ben Bowlin, and you would know everyone he worked with. It just gets real sticky really quickly. And can you hear the fan heads drooling in the background? Is that just my cat? I don't know at this point, but that's the high-level look.
Speaker 6 (18:07):
Or spies, I mean, just any spy. Imagine that. Well, you guys, I mean, I think we're forgetting an even more classic sci-fi example of this, the kind of heads-up display you'd see in a movie like The Terminator.
Speaker 4 (18:21):
You know, where you're seeing from the perspective of this,
you know, assassin robot who's looking for a target and
being able to see who everybody is and get all
these stats and metrics. You could apply this to any
number of you know, let's call them parameters surrounding an individual.
You could then go deeper with this and start pulling
(18:41):
stats from their social media and be served up with information about their height, or, you know, whatever, like, let's say they have a pre-existing condition or something like that, or they have some kind of, you know, what's the word I'm looking for, like kompromat on them. You could even go deeper and find pictures of them, like maybe smoking when they've claimed that they don't smoke,
(19:04):
and they are, you know, committing insurance fraud.
Speaker 5 (19:07):
I don't know. I'm just saying there's all kinds.
Speaker 4 (19:08):
Of ways you could deep dive, scrape this data and
apply these points to an individual and then get that
information served up to you very quickly in this display.
Speaker 3 (19:18):
And yeah, optimists are so quick to point out the potential benefits of Clearview AI and facial recognition in general. It's true, there are benefits. Pessimists, however, claim the disadvantages here, in this specific case, far outweigh any of those benefits, outweigh them, in fact, by a vast and disturbing margin.
(19:42):
We'll pause for a word from our sponsor and return to dive down the rabbit hole. Here's where it gets crazy. There are a ton of problems. We've, you know, we've done some foreshadowing on these, but let's talk a little
(20:05):
bit more about them. First, you know, you can categorize these, right? So, first, there are not many strong operational restrictions on how this technology is actually used. That means that it can not only be easily misused, but that there will probably not be repercussions for people, at least
(20:26):
not legal repercussions. It's like a point you made earlier, Ben, off-mic.
Speaker 4 (20:31):
It's like when you used to go into those tobacco shops and, you know, you were not allowed to refer to certain smoking apparatuses as bongs. They are water pipes exclusively, because that absolves the distributor, the retailer, of any responsibility
if you choose to.
Speaker 5 (20:51):
Use them in any illegal manner.
Speaker 3 (20:53):
It's the same thing.
Speaker 4 (20:54):
It's like saying, hey, whatever you do with my thing is up to you, I know it's designed exclusively for a legal purpose. It's a real sticky situation, isn't it? Because, I mean, it not only puts all of the burden on the user, it absolves the creator of any kind of responsibility at all.
Speaker 3 (21:14):
Or he does this, taps the table: hey, hey buddy, that's a tobacco water pipe.
Speaker 6 (21:19):
All right, okay, yeah, yeah. Say, you know, saying something like that, the way the founder did in that interview with CNN, where he was just saying, you know, the thing says, just use it in this way.
Speaker 2 (21:33):
It's, we've seen it over and over and over again.
Speaker 6 (21:36):
There are a couple of good things that he does speak about in that CNN Business interview, and one of the major things, you know, if you put it at the top, is that the software costs around fifty thousand dollars to have a license, you know, for a department or something, for a year or two years. And so at least there's a buy-in
(21:59):
level there that just your average person, if they're obtaining it legally, wouldn't be able to meet. And then there are, you know, ways to verify if, you know, an end user is actually, if that license is actually the person who should be using it.
Speaker 2 (22:16):
That's good.
Speaker 6 (22:18):
And the other thing is, if you're a manager within the software of a, you know, a licensee of this software, you get audits of every time the software has been used, which is really nice. So at least there could be some oversight to it, and they could check to see if you are, quote, using it in a way you're not supposed to.
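For what it's worth, the audit feature Matt describes here, a log of every search that a manager can review later, could be as simple as the sketch below. This is a hypothetical illustration of the idea, not Clearview's actual system; the field names and the review check are made up.

```python
# A hypothetical audit trail: record who ran each search and why, so a manager
# can later flag searches that weren't tied to an actual case.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SearchRecord:
    officer_id: str
    case_number: str
    probe_image: str
    timestamp: datetime

audit_log = []

def log_search(officer_id, case_number, probe_image):
    # Record the search before it runs, so nothing escapes review.
    audit_log.append(SearchRecord(officer_id, case_number, probe_image,
                                  datetime.now(timezone.utc)))

def searches_without_case_number():
    """A manager-style check: searches not attached to any case look suspicious."""
    return [r for r in audit_log if not r.case_number.strip()]
```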
Speaker 3 (22:39):
Yeah, yeah, there's an example of that I think will come into play here later, which is kind of, it's pretty weird, it's a little spooky. Uh, but I want to go back to that warning. It's literally something, it's like that old FBI VHS tape warning that we mentioned in an earlier episode. You go on to the homepage, essentially, of
(23:01):
the app, which you can use on desktop or you can use on your phone, and basically all it says is, like, don't use this in a bad way, and that's it. It's like as effective as those tags on mattresses that say don't remove this tag, or the little note on your box of Q-tips that's like, hey, don't
(23:24):
put these in your ear. I know it feels awesome, and I know that's what everybody does with it, but don't put them in your ear. And then, like you said, Noel, that absolves them.
Speaker 4 (23:34):
Or their warning about seizures at the beginning, when you start up your PlayStation, you know what I mean. It's like, yeah, okay, maybe I get seizures, but I'm gonna roll the dice, because I sure want to play me some Borderlands 3, you know.
Speaker 3 (23:46):
Yeah, because, you know, now that I think about it, we might be in a bit of a glass house situation here, my friends, because our show literally starts with the warning that says you can turn back now. We're not as bad as Q-tips. You can turn back now.
Speaker 4 (24:02):
That's definitely, as is our prerogative, to follow said warnings and not stick Q-tips in our ears.
Speaker 3 (24:08):
But who wants to do that?
Speaker 4 (24:09):
What are you gonna use them for? Like putting on eyeshadow or something? I mean, I don't know. Like, I feel like, the back of your ear, do you need a Q-tip for that? Can't you just use, like, a tissue?
Speaker 3 (24:21):
Some people are living wild, man. I don't know what to tell you. Q-tips are literally probes. I don't know. Anyway.
Speaker 4 (24:26):
I just feel like they're very clearly designed to shove in your ear, and it does feel great. But yeah, it can... it's pretty scary when you pull out some blood, you know, just saying.
Speaker 6 (24:36):
So, yes, I've never done that. Actually, I'm gonna start... keep trying. Uh, no, I've never gotten blood, like, gone in for wax, out with blood. But, you know, maybe I'm just fortunate.
Speaker 4 (24:49):
Uh.
Speaker 6 (24:49):
I want to bring up something that is possible with this software, and just how it's used, before we jump into, uh, the next thing we're going to talk about. And that's just that when your face ends up on a social media website somewhere, even if you did not take that picture, even if you were in the background of a selfie that someone took at Bonnaroo, let's say,
(25:12):
or something like that, your face is still searchable with facial recognition software if it is visible in the background of an image, if you're in a crowd. Anywhere you are, anywhere you've been where a photograph has been taken, where you are visible somewhere in the background, this thing, Clearview AI, very likely has
(25:37):
that image within its database and your face will be
recognizable to it.
Speaker 2 (25:42):
Just let's put that there.
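A quick way to see Matt's point is that face detection is indifferent to whose photo it is: every visible face in a crowd shot can be cropped out and treated exactly like the face of the person who posted it. The snippet below is a hedged illustration with a made-up file name, again assuming the open-source face_recognition library rather than anything Clearview has published.

```python
# Detect every face in one group photo, foreground or background, and crop
# each one out; any of these crops could then be encoded and indexed.
import face_recognition
from PIL import Image

crowd = face_recognition.load_image_file("festival_selfie.jpg")
locations = face_recognition.face_locations(crowd)  # (top, right, bottom, left) per face
print(f"Found {len(locations)} faces, including anyone in the background.")

for i, (top, right, bottom, left) in enumerate(locations):
    Image.fromarray(crowd[top:bottom, left:right]).save(f"face_{i}.png")
```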
Speaker 3 (25:43):
So it doesn't matter if you're on social media. Exactly.
Speaker 4 (25:47):
Guys, devil's advocate here: isn't that one of the situations where, if you're in public, you're sort of giving up that right to privacy? Or if you go to a Bonnaroo, there's usually signs that say when you pass these gates, you are subject to being filmed, et cetera. And that's usually for more, like, documentary purposes of the festival. But I would think it would also apply if you're in
(26:08):
public and you're caught on, you know, one of these millions of little surveillance devices that people carry around with them. Is that fair game? Can you sue over use of, over misuse of, your image? I wouldn't have thought so.
Speaker 6 (26:24):
So, I'm certainly not arguing about, you know, the legality of it. I'm just saying the reality of it is that wherever you go, if there is a camera taking a picture... whether it's CCT, well, CCTV generally wouldn't show up in one of these things unless it was placed into the system.
Speaker 3 (26:40):
At the risk of sounding, I don't know, too extreme about this, maybe too prescient: I think, for people concerned about this, the best way to handle it, which is maddening, is to apply some of the rules that we apply to guns to cameras. So you always assume
(27:04):
a gun is loaded, right, you never want to point it at you. So always assume a camera is recording. And with that assumption, if you don't want to be in the footage or in the photo, then don't have it pointed at you. You guys remember, I was very uncomfortable with Pokemon Go
(27:25):
when it came out. I was like, no, they're putting these AR little cute things to collect because you are filming, you're filming for them. You guys were like, Ben, Big Brother does not give a, like, the faintest hint of a, about Charizards or whatever. Right, but
(27:46):
still that stuff freaks me out.
Speaker 2 (27:48):
Yeah.
Speaker 4 (27:48):
Well, no, but we did a whole story about that company, Niantic, and, like, how it had roots in intelligence gathering. I mean, you're not wrong, Ben, and I remember that we'd be out hanging out and people would be doing their little capturing their Pokemons, and they'd be on their cell phone or something, and you would, like, cover your face, and that is your prerogative to do. And I completely understand where you're coming from, because it's like with the
(28:10):
whole Zoom phenomenon, which obviously we're using right now. You can turn off that camera as far as you're concerned, where it doesn't show it to the group, but I guaran-damn-tee you, it's still capturing what's going on and storing it somewhere. So, you know, it's a great technology. It works real well, which is why it's having this
(28:31):
moment right now. But I just wouldn't trust it too implicitly, you know.
Speaker 6 (28:36):
You know, the best technology, it's what Ben and I use: it's a sticker or some tape. Just keep it covered at all times.
Speaker 3 (28:47):
Yeah, but that's, uh, that's still a low-tech hack. But you have to do it, because this stuff can be weaponized. And, uh, there's a guy named Eric Goldman at the High Tech Law Institute at Santa Clara University. I like the way he put it. He said the weaponization possibilities of this are endless. Imagine
(29:09):
a rogue law enforcement officer who wants to stalk someone that they want to hit on later, or maybe a foreign government is doing something similar to what you described, Noel: they're digging up secrets about people to blackmail them or to throw members of an opposition party in jail. This stuff can happen. And, you know, Clearview is also notoriously
(29:30):
secretive, even now. That's the second big issue. It leaves a lot of critics ill at ease. We know some stuff because of some excellent research from early reporters. Let me tell you guys this weird story. So there was a New York Times journalist who was working on a story about Clearview that really brought it to international attention.
(29:54):
And while this journalist is working, is trying to do this story, Clearview isn't really answering him. They're, like, ducking attempts to contact via text, email, phone call. And so this reporter knows that a lot of law enforcement agencies have Clearview access, and so this reporter asks
(30:16):
some contacts in the police department to run his photo through the app, and they do, a couple of times. So the first approach, the first thing he learns about Clearview approaching outside forces, is not when they come to him. It's when they go to the police that he worked
(30:37):
with, and they ask the police, hey, are you talking to the media? How shady is that?
Speaker 2 (30:43):
Wow. Yes, that is... whoa.
Speaker 3 (30:47):
They have an explanation, right? Oh, absolutely. And just really quickly,
Speaker 4 (30:50):
That's the piece that was on The Daily, because The Daily is affiliated with the New York Times. So this was after the fact of this reporter, whose name I'm trying to find, had done all this research and gone through this whole runaround to even get a sit-down. The reporter eventually found the office, and it was, like, very shady, like there weren't that many people there. It was a skeleton crew, and finally got access to
(31:11):
the founder. But that is all, you know, described in great detail in that piece by The Daily.
Speaker 5 (31:17):
I highly recommend checking it out.
Speaker 3 (31:19):
Yeah, Kashmir Hill. 'The Secretive Company That Might End Privacy as We Know It' is the Times article. Yeah, I liked what you were saying about when Hill finally figured out where the company was. Originally on LinkedIn, they just had, like, one employee listed, and it turned out to be a pseudonym, or a fake name, that
(31:40):
the founder was using. So they're aware of the dangers here. But when they did eventually speak to the reporter, they had reasonable explanations for both of these things. First, the founder said, look, we were avoiding speaking to the media not because we're, like,
(32:00):
supervillains or something. It's because we are a tiny startup, and a lot of times when you're a tiny startup, you're in stealth mode, right? And that makes sense, because the NDAs on those things are pretty ironclad. You don't want anyone to scoop you. And then he additionally said, here's why we talked to the police about your search terms: we actually monitor all the searches that
(32:22):
are done, because we want to make sure that they're being used correctly. In other words, you know, we want to make sure nothing like, there's never a situation where a police officer is jilted by a romantic partner and then they find out their ex has a new lover and they're like, well, I'm gonna find out about this. God, I'm even afraid to make
(32:46):
up a name, because I don't want to put it out there. Marcus McGillicuddy. Oh, that's a swing and a miss.
Speaker 4 (32:54):
Funny thing about that, though, Ben, is, like, you don't even need technology this robust to dig into somebody that closely removed from, you know, a person you really know. They're probably already, like, posting, tagging each other on Instagram posts, and you just, you know, go and
Speaker 5 (33:09):
Just do a couple of clicks and you're there.
Speaker 4 (33:12):
But no, it's just a really good point, and that's what set off those alarm bells. But yeah, the way it's described in the Daily piece, it felt very like, okay, A, they know they're onto something, and B, they don't want to be found. And I can see, to your point, Ben, the whole startup mode. They don't want to get scooped,
(33:33):
They don't want anybody stealing their proprietary, you know, technology and trying to misuse it or, you know, create a copycat kind of version of it. So that does make sense, but the whole thing is a little strange. But this app, Clearview claims, uh, you know, is up to security industry standards in terms of
(33:53):
I guess, you know, hackability, or, like, firewall security, or, you know, not being vulnerable to being infiltrated. But who is actually watching them? Who is actually monitoring them? Because it's essentially untested and already being put out into the wild in cases that can
(34:14):
have huge effects on people's lives. We'll talk more about
that after a quick word from our sponsor.
Speaker 3 (34:26):
Yes, you know, that was a perfect ad break. It also gave me time to get this cat out of my recording studio. I heard you had a little meower. Yeah, yeah. Well, we're all dealing with new coworkers, right?
Speaker 4 (34:44):
Can I just do a quick aside, a quick quarantine aside? Had a crisis yesterday. I don't know if I've mentioned on the show that my daughter got a hamster right before quarantine kicked in, and I had to inherit it because her mom, she found out, was allergic. The hamster's name is Hanako, which is like an anime character. But anyway, we realized yesterday afternoon that Hanako was not in her cage.
Speaker 3 (35:07):
Uh, so we had a hamster.
Speaker 4 (35:09):
Search party all up in this house, UH freaking out
and I had just these horrible visions of like finding
a dead, decaying hamster, you know, two weeks later in
my sock drawer or something. So uh, luckily, the little
the little critter found herself back to where she belonged
on her own, which was great.
Speaker 3 (35:27):
Great.
Speaker 4 (35:27):
Yeah, she just popped out and was right there. I even thought it was a sock, and it was... turns out it was a hamster. And it wasn't long. I read all these horror stories about how you can only search late at night and you gotta bait it with peanut butter and all this stuff, and she just came right back.
Speaker 2 (35:41):
It was great.
Speaker 3 (35:42):
That's fantastic.
Speaker 6 (35:43):
Yeah.
Speaker 3 (35:44):
I had a, uh, I had a gerbil in my younger days that escaped and lived and died in the walls of a house. We, uh, tried to move the oven and appliances to get to it, uh, but then we couldn't, and my father thought it would be a good lesson in mortality.
Speaker 4 (36:05):
Oh boy, I was hoping to avoid that particular lesson
for my kid, especially during these trying times when things
have been mainly pretty positive for us here at the house.
Speaker 3 (36:15):
I'm so glad.
Speaker 4 (36:16):
Yeah, thanks, guys. But no, you're right, Ben. Tell us a little bit about the security of this app and the idea of it being tested for, you know, vulnerabilities.
Speaker 3 (36:29):
Right. Yeah, this is a very interesting point. So a lot of the reporting that came out about Clearview AI came before late February of this year. In late February, Clearview's app security was actually tested in the form of a hack. On February twenty-sixth, the public learned that
(36:52):
the company had been breached, their security had been compromised. Hackers had stolen Clearview AI's entire customer list, which was coveted and very much secret, right, the customer list. Adding to the troubling facts about this company, the customer list did not jibe with what Clearview had said earlier. It spans
(37:13):
thousands of government entities and private businesses across the world. So the US Department of Justice, Immigration and Customs, but it also includes banks, and the founder does allude to banks in a couple of different interviews. It also includes Macy's and Best Buy and Interpol. And then on the list you also see credentialed users at the FBI.
(37:37):
You know, we said hundreds of police departments. But here's something very interesting: it also includes users in Saudi Arabia and the United Arab Emirates. These are countries that are, you know, not particularly known for their progressive policies or their First Amendment rights. The founder, by the way, says that they're exercising a First Amendment right when they scrape all
(38:01):
these images.
Speaker 6 (38:02):
Yeah, they're exercising a First Amendment right because they're all publicly available and out there. And the argument that they make generally is that if Google can scrape all the websites and create search terms for all those things and then allow you to search that information, if, you know, when you upload something
(38:23):
to YouTube, it's all publicly available, they can scrape all the information, all the data, and then the other companies intertwine and use that data to help each other find other things, well, hey, why can't we do that too? So we're going to. However, a lot of the scrutiny that's been happening, because of that hack that was reported, as well as just the tech in general and how
(38:44):
these other giant tech companies' data is being used, has led tech giants like Twitter, Google, and YouTube to send cease and desist letters to Clearview, orders saying do not do this anymore. And even, like, because of this, some police departments and other users have just discontinued use
(39:04):
of the software. And when the founder, you know, responds
to this kind of thing, he says, just you know,
our legal team is on it. They're working right now
to fight that battle, using the First Amendment as our
major argument.
Speaker 3 (39:20):
Yeah, and importantly, we should note that they said the customer list was the only thing compromised, so the hackers don't have that three-billion-plus and growing image archive. But you're right, Matt, legal troubles gather like storm clouds on the horizon. Clearview has said it's complied with things
(39:42):
like policies proposed by the ACLU. The ACLU has said that they don't comply. And I think there's a case coming up against Clearview in Illinois, and it's one of what may be many. Arguably, legal troubles should be gathering on the horizon and the storm should hit, especially
(40:03):
when we look at the implications of this technology. And before we continue, we want to be, like, cartoonishly transparent here. We're not saying that Clearview or the people in the startup are bad people, and we're not saying even that their intentions are bad. We're saying that this technology,
(40:23):
just like any other technology, has inherent implications. Just like fire, it has some positive benefits, and it has some very dangerous possible consequences. Like, how could it be misused? The first one, just off the top here, the first one is the one we talked about a couple times in this episode: somebody with access
(40:45):
could use it for personal reasons. We did the hypothetical cop who's mad at his ex for dating Marcus McGillicuddy or whatever. But we can go beyond the hypothetical examples. There's a real-life example: there's a billionaire Clearview investor who used the app to stalk his daughter's newest beau. He wanted to find out about this guy. I imagine,
(41:08):
you know, whether you're a millionaire or a pauper, of course you want to know who your kid's dating, you want to know about them. But it's crazy that this guy was able to use it and he found out all the stuff that we just mentioned, like where the guy lives, what his job is, you know, his known connections, his
(41:29):
social media and stuff. This example is a little weird though,
because I imagine being a billionaire, you could just hire
a person to dedicate their time to doing that. Just
hire a PI.
Speaker 6 (41:40):
This is a really important thing about this that maybe we mentioned, but we didn't get into it: the reason why he was able to find all that stuff about this person. The way I've seen the software work, it's not like you identify this person and it gives you a giant readout of all of this person's information. You identify that person's face, then all of
(42:05):
the other publicly available faces on the internet show up, and you can click on one and it takes you to the website where that face is, where that image is located. So that means you're then directly connected to their social media, to anywhere that that photo has been posted, and then through those other websites you can gather all that other information. At least that's the way it was
(42:27):
demonstrated in that CNN Business interview and a couple of
other places.
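What Matt describes, a probe face coming back as a list of matching images you can click through to their source pages, maps onto a simple nearest-neighbor lookup. The sketch below is a guess at that shape, reusing the hypothetical face_recognition-based index from earlier; the 0.6 distance threshold and file names are assumptions, not figures from the product.

```python
# Hypothetical lookup over the face_index built earlier: encode the probe face,
# measure its distance to every stored faceprint, and return the source URLs
# of the closest matches so a user could click through to the original pages.
import face_recognition

def search_face(probe_image_path, face_index, max_distance=0.6):
    probe = face_recognition.load_image_file(probe_image_path)
    encodings = face_recognition.face_encodings(probe)
    if not encodings:
        return []  # no face found in the probe image
    probe_encoding = encodings[0]
    stored = [entry["embedding"] for entry in face_index]
    distances = face_recognition.face_distance(stored, probe_encoding)
    hits = [(dist, face_index[i]["source_url"])
            for i, dist in enumerate(distances) if dist <= max_distance]
    return [url for _, url in sorted(hits)]  # closest matches first

# e.g. matches = search_face("still_frame.png", face_index)
```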
Speaker 3 (42:31):
That's the way it works for now, but we know that that UI could easily be upgraded to maybe pull, to scrape, that information if it's publicly available, right? I mean, think about how many things, we've talked about before how easy it is to find a phone number, how easy it is to find not just your address, but
(42:52):
the last few addresses you lived at over the years. This is where we should also mention people don't trust Clearview AI, and this is a little bit of stereotyping, but they don't trust it because one of the investors, Peter Thiel, is also big in Facebook. And Facebook, Facebook is good
(43:13):
at a lot of things, and user privacy has never been one of them. That is not a bug, that's a feature for them.
Speaker 6 (43:20):
And this is just to dovetail on all of that. In that interview, Donie O'Sullivan does a search of his face, and it's the official profile picture on the website where he works, for the CNN Business site, and when they scrolled all the way down, there
(43:41):
was an image of him, because he's in his, I guess, thirties, late thirties or something like that. I don't know how old he is, but he's a person about our age, Ben, Noel, and I. They scrolled all the way down and there's an image of him from a local newspaper from when he was in Ireland, right when he was sixteen years old. And it's a group of, you know,
(44:03):
teenagers holding up a sign and you can just see
his face and he looks absolutely different. I mean I
would not have been able to tell that that was him,
but he recognized the.
Speaker 2 (44:15):
Photo and he knew it was him.
Speaker 6 (44:17):
And this is the kind of thing where, you know, this is not an allegation on my part, this is just me wondering and ruminating. I'm wondering how they connected that up, how this face recognition software inserted that picture of him where he looks so different. And the founder says, well,
(44:38):
you know, there are still parts of your face, the geometry of it, that remain the same over the years, even if you put on weight, even if you're wearing glasses, even if you're covering your face. I get that argument, but to me, it makes me wonder if there is some kind of added, like, Spokeo search that's going on
(45:01):
within this system. Spokeo, that's just another company where you can search social media, any social media accounts that somebody has, using an email or a phone number or a name. And it makes me wonder if they're connecting things up there as an extra layer of, like...
Speaker 3 (45:20):
A name, like prioritizing an image by other information.
Speaker 6 (45:25):
Yeah, I mean, it makes me wonder if that's happening.
I guess not, but I would say he didn't sufficiently,
the founder didn't sufficiently explain how that image got in there,
and the interviewer is clearly a little weirded out and
disturbed by it.
Speaker 4 (45:40):
Well, another thing too is, I mean, you know, in the United States, presumably, you know, you might be identified using this technology, but that doesn't mean you are immediately going to be convicted of something without some kind of physical evidence or without a trial. But in other countries where stuff like that isn't as much of a thing, it could be used to identify social dissidents, or, like, people
(46:03):
that are speaking out against the totalitarian government, and flag them and throw them in the gulag or whatever, you know, without ever, you know, batting an eye. And that is sort of the crux of the problem with this software: it depends on the notion of people that use it being good actors and being good stewards
(46:23):
of this power that they have. And as we know,
power corrupts, and we've got a lot of governments that
are kind of corrupt and out to shut down any
kind of criticism of what they do. And this would
be a great way to identify those people and round
them up and put them away.
Speaker 3 (46:45):
Yeah, and it's dangerous for that reason. You know, you could detain people, and you could cut around some of their legal protections. Like, what if this counted in a court of law as definitive identification? Then you could say, well, we don't need to figure out if you're the person, right? It could
(47:06):
get so dangerous so quickly. There's another point we have to add. You know, this sounds like some kind of Skynet Big Brother stuff; it can also be used inaccurately. It is far from perfect. It's one thing that I've been talking about a lot with some people online: facial recognition has an inherent racial bias that I wish
(47:30):
more people talked about. For non-white people, for persons of color, this stuff lags in accuracy, not just Clearview, but in general. Facial recognition is known for this, and this means that you could be arrested due to a computer error, because the software decided that you look like a guy you never met, who lives in a state
(47:52):
you've never been to, who committed a crime that occurred
while you were not alive, and it could happen. It
could happen here.
Speaker 6 (48:00):
And to continue with that, we mentioned, you know, the people that were known to be using this software, the organizations, and two of them, or three of them, let's just point these out: Department of Justice, US Immigration and Customs Enforcement.
Speaker 2 (48:16):
Let's just leave it at those two.
Speaker 6 (48:18):
So take the problems you're talking about, Ben, that are inherent to this system. Apply that, then, to, let's say, Immigration and Customs looking for people who are in the United States illegally, and imagine that you're attempting to round everyone up. The implications of this software: it would be extremely effective,
(48:41):
I would say, or completely ineffective, because they're detaining and rounding up people incorrectly because of the problems with the software.
Speaker 3 (48:51):
Well, think about this too: the problems with the software, they can compound on the state level, but they can compound on the private level too. And this is one that I think, for the immediate future, is at least as dangerous as the Orwellian stuff we're talking about right now. So that data, when it's mined, right, just
(49:14):
like your data on any other social media, it can be sold to third parties, insurance companies, financial institutions, and so on. This goes way past ads. The phrase, I know I'm kind of churchifying here, but the phrase I think of is a longitudinal profile of your face and your behavior over time. So it's like watching a real-life
(49:38):
progression of you aging. And let's say that an insurance company or an algorithm could start to pick up on certain medical conditions that are manifesting in your face in very minuscule ways, ways a human being, even a human doctor, wouldn't sense. The computer picks this up before
(49:58):
you know anything's wrong, sort of like the way Target's algorithms figured out that poor girl was pregnant before her family knew. So here's the question: would the insurance company legally be required to notify you of this possible health condition? Say, like, there's clearly something happening, maybe if they have video
(50:21):
too, in your behavior, that is indicative of an early-onset terminal condition. Would they be legally required to tell you that they knew you had, let's say, an eighty percent chance of dying in the next two years, and give you the assistance you need right now? That's a no. That is a big no,
Speaker 5 (50:42):
Chief.
Speaker 3 (50:43):
They don't have to do a damn thing for you. What they can do, with absolutely no repercussions, is say, hey, this person's probably going to be dead in two years and it's going to be expensive to get them to year two, so let's just drop them. Let's just drop them. There's no law, there's no, like, ethical legal requirement for them to tell you what they know.
(51:04):
And that's mind-boggling. That is reprehensible. That is, like, I need to get my good unabridged thesaurus to come up with all the different words for how bad and disturbing that is.
Speaker 4 (51:17):
Oh, Ben, I love it when you do that, because I always learn a new word.
Speaker 6 (51:23):
Uh.
Speaker 3 (51:23):
Did you guys ever, just to lighten the mood, did you guys ever hear that old joke about the abridged thesaurus? It's like, I have an abridged thesaurus. Not only is it terrible, it's terrible. So, I didn't write that. But you can, the good news is, right now, depending on where you live, you can opt out of
(51:45):
Clearview. Oh my gosh.
Speaker 4 (51:47):
But even the way you opt out, it's such a burden. It's such a burdensome process. You can't just click a box in an email, right? All you have to do is send a headshot, we all have those, and an image of your government-issued ID. And this also only applies to residents of California or
(52:08):
the EU.
Speaker 5 (52:09):
For whatever reason. Sorry, rest of the world.
Speaker 6 (52:12):
Right, because of privacy, because of privacy laws that exist there.
Speaker 3 (52:18):
Yeah, we should also point out, going back to this accuracy thing, there are serious questions about how accurate Clearview is in general. So BuzzFeed, according to marketing literature they found from Clearview, says that the company touts the ability to find a match out of one million faces ninety-eight point six percent of the time. But
(52:40):
when Clearview did finally start talking to The New York Times, they said the tool produces a match up to seventy-five percent of the time, and we don't know how many of those are quote unquote true matches. So there's some contradictory information coming out, which can make people feel uncomfortable, obviously.
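Those two figures, the claimed 98.6 percent hit rate and the 75 percent figure given to the Times, are hard to interpret without knowing the false match rate, which isn't public. As a purely hypothetical back-of-the-envelope illustration:

```python
# Illustrative arithmetic only: the false match rate below is invented to show
# how even a tiny per-comparison error compounds across a huge database.
database_size = 3_000_000_000   # roughly the three-billion-image figure cited earlier
false_match_rate = 1e-9         # hypothetical: one wrong match per billion comparisons

expected_false_hits = database_size * false_match_rate
print(f"Expected false hits per search: {expected_false_hits:.1f}")
# ~3 wrong candidate faces per search even at one-in-a-billion, so a returned
# "match" is not, by itself, proof that it's the right person.
```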
And that's where we leave today. The battle lines are
(53:01):
firming up. On one side, you have Clearview, its investors, its clients, and what I would call the techno-optimists, right? On the other side, you have privacy advocates, you have tech giants like Microsoft, IBM, you even have the Pope. The Pope came out against facial recognition, and he summed it up pretty nicely. He said, quote, this asymmetry by
(53:23):
which a select few know everything about us while we know nothing about them dulls critical thought and the conscious exercise of freedom.
Speaker 6 (53:31):
Imagine applying that to a priest who hears confessions from
everyone every week when they go and tell their dirty secrets.
The priest is then the one who everybody knows nothing about,
but he knows everything about them.
Speaker 2 (53:50):
Just putting it out...
Speaker 6 (53:50):
There. Especially, hey, and also, this thing is targeting child sex abuse and abusers.
Speaker 2 (53:58):
Just also leaving that there.
Speaker 3 (54:00):
Yeah. Oh, one other thing for anyone listening along and thinking, good thing I made my profile on, insert social media here, private years ago: sorry, homie. Anything that was public at some point is pretty much in the system, even if you later made it private. So check your MySpace.
Speaker 6 (54:18):
In that CNN Business interview, as you said, Ben, they
test out the software on the producer who has a
private Instagram account, and images from that account show up
in the search and it is because it was public
at one point.
Speaker 3 (54:33):
And that's where we leave off today. What do you think,
fellow conspiracy realists? Do the benefits outweigh the potential consequences
of this? Do the consequences outweigh the potential benefits? Let
us know. You can find us on Facebook, you can
find us on Instagram, you can find us on Twitter.
We always like to recommend our Facebook community page. Here's
(54:56):
where it gets crazy.
Speaker 5 (54:57):
Now.
Speaker 6 (54:58):
You can jump on there. You can talk about this episode or any other in the past. You can post some dank memes, let's say some conspiracy memes or facial recognition stuff. Maybe you've had access to this before, to Clearview, and you want to talk to us about it.
Speaker 2 (55:14):
Don't do that on Facebook. Don't do that.
Speaker 6 (55:16):
But if you do want to tell us about that,
you can call our number. We are one-eight-three-three-STD-WYTK. You can leave a message there, tell us about it, let us know if you don't want to be identified, if you don't want us to know or talk about anything on air. Just let us know, give us the information. We'd love to hear from you.
Speaker 3 (55:38):
And I want to add to this to say that, despite my strange phobia regarding phones, I finally started diving in. Matt, you're doing massive, amazing work there, and I've started listening to these calls as well. Thank you so much to everyone who calls in. It's inspiring,
(56:00):
and I don't know about you guys, but it makes me feel like what we're doing is worthwhile, if that makes sense.
Speaker 2 (56:08):
Oh, definitely, definitely. I've had you know.
Speaker 6 (56:11):
Hopefully, if you can get past the phobia, Ben, hopefully you can call a few people, because you can use that app and you should be good to go. Actually speaking to, you know, you know who you are, if we've talked on the phone, it means a great deal, as Ben is saying, just to us, to know that
(56:33):
we're not just talking in a darkened room and that you guys care about us as much as we care about you. So this is a great relationship.
Speaker 3 (56:42):
Let's keep doing it. Well said, Matt, well said. And we'll be following up with some of those messages in the future, so keep them coming. One last thing: if you say, look, Ben's right, phones are terrifying and weird, but also, uh, social media, you guys just told me how dangerous that is. Why on earth? What gives? How?
(57:06):
I have a story that my fellow listeners need to know about. I have some experience, I have some terrible jokes to tell you, but I don't know how to contact you. Well, we have good news for you. You can always, twenty-four-seven, contact us at our good old-fashioned email, where we
Speaker 6 (57:24):
Are conspiracy at iHeartRadio dot com. Stuff they don't want
(57:47):
you to know is a production of iHeartRadio. For more
podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or
wherever you listen to your favorite shows.