
September 9, 2025 59 mins

It's "AI" week here on Stuff They Don't Want You To Know. In tonight's Classic episode, Ben, Matt and Noel revisit their 2020 original exploration of Clearview AI -- to its supporters, this company's facial recognition software revolutionizes safety. To its critics, there's much more on the horizon.

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Friends and neighbors, fellow conspiracy realists, it is a government
and AI week here on Stuff They Don't Want You
To Know. So for tonight's classic, we looked back through
the archives and we found some stuff that we had
been sort of flummoxed about for a while, for
more than five years now.

Speaker 2 (00:20):
Oh yeah, we talk a lot about the technology that
was shown off in that certain Tom Cruise movie. You
remember the one, the one with the psychics, uh huh,
the ones that could pre crime everybody. Well, one of
the big things that was shown off in that movie
that struck all of us at the time when it
came out is this concept of facial recognition everywhere you

(00:43):
go in a city. Well, there's a company that's been
trying to do this for a long time, and some
stuff went down in twenty twenty, and that's when we
made this episode.

Speaker 1 (00:55):
Yes, this is where we introduce you to the concept
of Clearview AI. The argument of supporters would be this
can prevent crime, this can make the world a safer
place through the power of pattern recognition and super juiced
up algorithms. But then, as we learn, there are a

(01:15):
lot of detractors for technology like this being rolled out,
and Clearview, the stories they originally tell don't always
match the things people found out later.

Speaker 3 (01:26):
Yeah, I mean this is a story that is more
relevant now than ever. So why don't we take a
quick break here, hear a word from our sponsor, and then jump
into this classic.

Speaker 4 (01:34):
From UFOs to psychic powers and government conspiracies. History is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. A
production of iHeartRadio.

Speaker 2 (01:58):
Hello, welcome back to the show. My name is Matt,
my name is Noel.

Speaker 3 (02:02):
They call me Ben.

Speaker 1 (02:03):
We're joined as always with our super producer Paul, Mission
Control Decant. And most importantly, you are you. You are here,
and that makes this Stuff They Don't Want You To Know.

Speaker 3 (02:15):
Here in the.

Speaker 1 (02:15):
Age of social media, right, we're recording this as longtime
listeners know on a virtual interface. Right, we're mentally and
soulfully together, but not physically together, and that may be
the case for some time. However, technology is allowing us
to feel closer.

Speaker 3 (02:37):
Right.

Speaker 1 (02:38):
And the crazy thing is this stuff is so cheap.
Now we're talking about the proliferation of cheap imaging technology.
You guys, remember when not everyone could afford a camera,
editing software, and stuff like that?

Speaker 3 (02:54):
I mean, hell, I remember when not every phone had
a camera on it or it was like, who's got
a digital camera? Find one? You know, Now it's like
a non issue.

Speaker 2 (03:02):
You know, Well, if you were going to take a
video with one of those, it was this tiny little
MPEG file that you'd end up getting. Even though you
had this powerful camera, the digital camera you could use,
your video was so terrible.

Speaker 1 (03:15):
Yeah, yeah, and you just had a nice curated set
of pixels that would move and.

Speaker 3 (03:21):
What appeared to look like a face exactly.

Speaker 1 (03:25):
But, but now, we have, you know, now it's
so affordable to buy these things, and on balance
that's a good thing. But combine that with the massive
growth of social media, and the massive growth of something
people don't talk about as often, the massive
growth of archival capacity.

Speaker 3 (03:43):
This means, if you.

Speaker 1 (03:44):
want to be sweetly nostalgic about it, that our days
of cherishing a few, like, glamour photographs from
your local photography studio, or a few expensive Polaroids, those
days are all but gone. Now you can post photos
of yourself, your friends, or strangers, and you can post
them anywhere you please. There are some laws about it,

(04:07):
but they're they're not really good because legislation has always
been technology's slower sibling, and so those those laws often
don't have a lot of teeth to them. And with
the rise of all this technology and the rise of
all these capabilities, we also see the rise of technology
that is exclusively meant to exploit the vast amount of

(04:28):
stuff that we've been making just from, like, LiveJournal, Tumblr, Myspace, Friendster,
Facebook, you know, all the hits, all that jazz.
So what do you guys think, I mean, optimists and
pessimists have a different idea of what this means for
the world.

Speaker 2 (04:43):
Well, yes, on the optimist side, I would just say
the number of photographs and captured memories I have of
my son is probably more, you know, it's certainly thousands
of times greater than that which exist for me as
a child in my growth, my you know, my parents,

(05:04):
and then going back and back and back and back,
it's just exponentially a larger amount of memories that are
captured throughout his life, my son. So that's great, that's wonderful.
We can look back on those, you know, until the
machines all turn off or we lose electricity, those those
images will exist for my family and people that care

(05:25):
about me and my son.

Speaker 3 (05:28):
But on the flip side of that, Matt, you know,
like folks like my mom, for example, she doesn't consider
those real Like she's like, if it's not printed out
and it's not in a frame on my mantle, then
that is not a real memory. Which is interesting because
to your point about like the machines turning off, there's
all these companies now that will take your Instagram photos

(05:49):
and print them out and send you framed copies of them,
and the proliferation of, like, Instax cameras and, like, more
analog technology is sort of a response, almost a backlash,
to this over digitization of like our memories and our
collective kind of you know, unconscious really in a way
with all the stuff that's out there, it's like there's

(06:10):
this sense of is it really ours once it's out
there in the ether. And I think that's a big
part of today's question.

Speaker 1 (06:16):
Yeah, who owns the art? The artist or the audience.
It's a very old question, but it's one that has
continually been relevant.

Speaker 2 (06:25):
Right And yeah, but but the other question, Ben, is
is it art or is it just data?

Speaker 3 (06:32):
Right?

Speaker 1 (06:32):
Right there we go. That's a good distinction. It's a
tough distinction to make. Optimists are saying this is great,
you guys, the world's going to be a safer place,
and think how convenient everything's gonna be. You'll never lose
your favorite photo again, you know, and criminals can no
longer disappear into a crowd. And then pessimists say, you know,

(06:54):
in a world beyond privacy, every person is monitored continually
throughout their lives. Your image is logged countless times, and
all the data from that image and all the other
images of you, it's analyzed. It's bought and sold without
your knowledge or your permission. A world careening towards sci
fi dystopia, pre crime, pre existing medical conditions. Advertising so

(07:18):
invasive and so bespoke that it might as well be
reading your mind, because it's doing something very close to that.
But before you get all Blade Runner, Minority Report,
let's start closer to the beginning. There is a revolution
happening in the world of facial recognition right now, and
its name is Clearview AI.

Speaker 3 (07:39):
Here are the facts. Yeah, I mean, the first time
I heard of Clearview AI was on a recent NPR
piece where the founder was interviewed. So this is
a very relatively new company, at least to me. But
as far as the background, it is a research tool
that is being used by law enforcement specifically, and the company

(08:00):
makes it very clear that this is something that's
designed exclusively for use by law enforcement in order to
identify perpetrators and victims of crimes. It's a little bit
on the vague side, I would say. Yeah.

Speaker 1 (08:16):
Yeah, I mean that's you know, we have to be honest.
That's that's the high level like logline or pitch from
the company on its own website. So we can't blame
them for not wanting to get too into the weeds
about algorithms. But if we look at how it actually works,
we can we can get a high level understanding.

Speaker 3 (08:36):
It's a startup.

Speaker 1 (08:37):
It has a massive database of faces, just specifically faces,
somewhere north of three billion images and as we're as
we're recording this, that number is likely still growing. So
that number is going to be a little bit higher
by the time you get to the end of this episode.

Speaker 3 (08:57):
Yeah, and it was the gentleman who founded the startup, Hoan
Ton-That, who I heard interviewed on, I believe it
was, The Daily, actually. So if you want to hear
directly from this gentleman's mouth what he believes the mission
of this technology and this company is, it's a very
fascinating interview that kind of delves into some of the
slippery slope sides of this technology that we're going to

(09:20):
get into. But as far as the end users concerned,
which in this case would be law enforcement, because currently
this is not available to civilians. You take a picture
and it identifies who that person is, simple as that,
by matching it with all of this information that they've collected,
and they've collected it, you know, not through any really

(09:42):
nefarious means. It's all that information that we talked about
that's already out there. All they've got to do is
reach out and grab.

Speaker 1 (09:48):
It's a it's a search engine for faces, and it
sounds innocuous at first. I also, if it was The Daily,
I think I listened to that interview, but I also
listen to an excellent interview you sent, Matt, an extended
CNN Business conversation. So if you want to see them,
like if you want to see the founder using the

(10:10):
app a couple of different times, I think he does
it on himself, he does it on the interviewer, and
then he does it on one of their producers, and
it seems to work. It's not a sensationalized thing either,
at least the way they're presenting it.

Speaker 2 (10:26):
Yeah, but we're gonna get to it. We're gonna maybe talk
about that a little more. Just if you do want
to look at that video, it's an interview with Donie
O'Sullivan with CNN Business, and we're going
to reference it several times in this episode.

Speaker 3 (10:41):
Yeah.

Speaker 1 (10:42):
So, like you said, Noel, Clearview has taken this from
publicly available sources, your Facebooks, your YouTubes, your Venmos, which, yes,
are publicly available. And federal and local law enforcement
admit that they only have a limited knowledge of how

(11:03):
the app actually works mechanically, the nuts and bolts of it,
but they do confirm they've already used it to help
solve shoplifting cases, identity theft cases, credit card fraud, child abuse,
even and even several homicides. This technology goes, by the way,
far beyond anything constructed by Silicon Valley giants or anything

(11:26):
officially created or used by the US government.

Speaker 2 (11:30):
Yeah, in that interview that we just mentioned, Hoan mentions
that they had this image or this video file of
a child abuse incident and the person of interest walks
past in the background at one point, and there were

(11:51):
only two usable frames of this person's face. And again,
like imagine in the background of a video, they were
able to take a still from that video of this person's
face and identify him through other social media websites where
they've got all this data from and make an arrest

(12:12):
on this person like that. That is it's mind blowing
that you could do that, and thank goodness that they
were able to do that. But it's the implication of
being able to do that in that one instance and
then applying it across everything is what makes it feel
a little scary because.

Speaker 3 (12:31):
For the critics, for the critics, well.

Speaker 2 (12:34):
Yeah, for the critics, well for the people who are
looking at it big picture. So if you're you're thinking
about it just in a law enforcement application, then it's amazing, right,
and it's the best thing ever. But if you apply
it to everything, to everyone at all times outside of
law enforcement, then well.

Speaker 3 (12:55):
And just strictly on like a technological level, it's fascinating
and clearly a huge leap forward in this kind of
technology because I mean we've seen these videos they're talking about,
these grainy surveillance videos or ATM camera videos. I mean
it's hard to as a human being, even if you
knew that person, it'd be difficult for you to identify them.

(13:17):
So this using this algorithm and you know, analyzing and
comparing it to all these different subjects I imagine using
points of articulation like on the faces and the structures
or whatever, it's able to come back with positive matches.
I think that's really fascinating and the really, you know,
pretty innovative use of technology. For sure, as a nerd

(13:37):
speaking you know, exclusively on that level. But you're right, Matt,
when you start to apply on a larger scale, it
gets really nineteen eighty four ish and kind of scary.
I appreciate that point, Noel.

Speaker 1 (13:50):
It's something that's going to come into play later and
it may surprise some of our fellow listeners to learn
a little bit more about this technology and some of
the other topics that you raise there. I want to
say this, their pitch is very law enforcement based on
the website. They're quick to point out, they're quick to

(14:13):
claim I should say that they've helped law enforcement track
down hundreds of at-large criminals. They mention the real
things that make people very emotional, like child abusers, terrorists,
sex traffickers, but they also say this technology is used
to help exonerate the innocent and identify victims. And it

(14:35):
seems that they're they might be embellishing a little bit
in terms of the degree of help they're providing, but
they are providing help. The authorities have used this, it's
happening now. They've done so without much public scrutiny, or
they were for most of twenty nineteen doing so without
much public scrutiny. In early twenty twenty, Clearview said more

(14:58):
than six hundred different law enforcement agencies started using our
app just in the last year, but they refuse to
provide more specifics because they're very protective of their client base.
So the badger is out of the bag. Metaphorically, this
is not a what if scenario.

Speaker 3 (15:17):
This exists.

Speaker 1 (15:19):
This is in use right now, and that means it
can therefore be used on you and every single person
you know.

Speaker 3 (15:26):
Is that a bad thing?

Speaker 2 (15:28):
Ah?

Speaker 1 (15:29):
Here's the interesting part. This technology is maybe innovative in application,
but very much not innovative in theoretical terms. It has
been possible for a while. People in the tech world
knew about this possibility for a long long time, and
some tech companies this is so weird. Tech companies barely

(15:50):
ever do this. Some tech companies, like, giants, said, you know,
we can build something like this. They realized it years ago,
and they actually decided to stop researching it. They decided
not to deploy this because of the massive potential privacy concerns.
Even Google wouldn't do it.

Speaker 3 (16:10):
It's like that thing in the Batman, which, which one
is it? It's the one with Heath Ledger, where
there's that technology that turns every cell phone in the
world into a listening device, and you know he has
to destroy it after using it once because it's just
such problematic tech that, you know, Morgan Freeman's character basically
builds a safeguard device that causes it to destroy itself,

(16:32):
because it's that's the thing, Ben about the badger being
out of the bag. Once it's out, you can say
all day long, this is just for law enforcement. But
like you said, Ben, the capability has been there, so
who's to stop somebody from doing a kind of copycat
version of this once it's out there and available to
check out, even if it's only quote unquote just for
law enforcement, you know. I mean, so were certain kinds

(16:55):
of radios and tasers and things like that, you know,
and people in the public have those now too, don't they?
It's a very good point.

Speaker 1 (17:02):
Yeah, I mean back in twenty eleven, think about how
long ago.

Speaker 3 (17:06):
That was.

Speaker 1 (17:06):
Almost ten years ago, the chairman of Google said, facial
recognition technology, this kind of stuff that Clearview is doing,
is the one technology that Google is holding back on
because we feel like it could be used in a
very bad way. And now this is a fascinating point here.
Some cities have even preemptively banned police from using any

(17:29):
kind of facial recognition technology. And one of the most
notable examples of a city that's banned this is San Francisco,
at the very heart of technological innovation here in the US,
so they definitely knew what was coming. Their lawmakers, their
constituents gave it a big no, nope. And here's the

(17:50):
other thing, Clearview AI. If we think about the future
of it, the code behind it is fascinating.

Speaker 2 (17:56):
If anyone out there has been watching the latest season
of Westworld, you may be familiar with the glasses that
several characters wear, and just think about that. If you
are familiar with that, think about that as we describe
what this thing can do.

Speaker 1 (18:15):
Yeah, think about it carefully. Because, you see, Clearview AI,
the code behind it isn't just facial recognition. It also
includes programming language that would allow this tool to pair
with AR or augmented reality glasses.

Speaker 3 (18:35):
What does this mean.

Speaker 1 (18:37):
This means that if the sunglasses I'm wearing now were
functioning with Clearview AI, I could walk down the street
and I could identify almost every person I saw, and
not just other pictures of them, but breadcrumbs that would
tell me where they lived, you know, what their children's

(18:57):
identities were, what their name was, what they did for a living,
and so it would automatically play the six Degrees of
Kevin Bacon. Actually, that's a really great idea for an
add-on game for these glasses. You can see how
close people are to Kevin Bacon.

Speaker 3 (19:13):
You heard it here.

Speaker 1 (19:14):
First. But, but it's scary, because then you would, like,
you would see, you would see Matt Frederick, and then
you would, you would know his friends Paul Mission Control Decant,
Noel Brown, Ben Bowlin, and you would know everyone he
worked with. It just gets real sticky really quickly. And
can you hear the FEDS drooling in the background? Is

(19:36):
that just my cat? I don't know at this point,
but that's the high level look.

Speaker 2 (19:41):
Or spies, I mean, just any spy. Imagine that, well,
you guys, I mean.

Speaker 3 (19:46):
I think we're forgetting. An even more classic sci fi
example of this, that the kind of heads up display
you'd see in a movie like The Terminator. You know,
where you're seeing from the perspective of this, you know,
assassin robot who's looking for a target and being able
to see who everybody is and get all these stats
and metrics. You could apply this to any number of

(20:09):
you know, let's call them parameters surrounding an individual. You
could then go deeper with this and start pulling stats
from their social media and be served up with information
about their height or, you know, whatever, like any,
like, let's say they have a pre-existing condition or something
like that, or they have some kind of, you know,

(20:29):
what's the word I'm looking for, like, kompromat on them.
You could even go deeper and find pictures of them,
like maybe smoking when they've claimed that they don't smoke,
and they are, you know, committing insurance fraud. I don't know.
I'm just saying, there's all kinds of ways you could
deep dive scrape this data and apply these points to
an individual and then get that information served up to

(20:50):
you very quickly in this display.

Speaker 1 (20:53):
And yeah, optimists are so quick to point out
the potential benefits of Clearview, of AI recognition in general. It's
true there are benefits. Pessimists, however, claim the disadvantages here
in this specific case, far outweigh any of those benefits,
outweigh them, in fact, by a vast and disturbing margin.

(21:16):
We'll pause for word from our sponsor and return to
dive down the rabbit hole.

Speaker 3 (21:27):
Here's where it gets crazy.

Speaker 1 (21:30):
There are a ton of problems we've you know, we've
done some foreshadowing on these, but let's talk a little
bit more about them first. You know, you can categorize
these right. So first, there are not many strong operational
restrictions on how this technology is actually used. That means

(21:53):
that it can not only be easily misused, but that
there will probably not be repercussions for people, at least
not not legal repercussions.

Speaker 3 (22:03):
It's like a point you made earlier, Ben, off, off mike.
It's like when you used to go into those tobacco
shops and uh, you know, you are not allowed to
refer to certain smoking apparatuses as bongs. They are water
pipes exclusively because that absolves the distributor, the retailer of

(22:23):
any responsibility if you choose to use them in any
illegal manner. It's the same thing. It's like saying, hey,
whatever you do with my thing is up to you.
I know, it's designed exclusively for a legal purpose. It's
a real sticky situation, isn't it, Because I mean it's
it not only puts all of the burden on the user,

(22:44):
it absolves the creator of any kind of responsibility at all.

Speaker 1 (22:48):
Or is it like, hey, hey buddy, that's
a tobacco water pipe, all right?

Speaker 2 (22:54):
Okay, yeah, yeah, say, you know, saying something like the
way the founder did in that interview with CNN where
he was just saying, you know, the thing says, just
use it in this way. It's it's we've seen it
over and over and over again. There there are a
couple of good things that he does speak about in

(23:14):
that CNN Business Interview, and one of the major things,
you know, if you put it at the top, it's
that the software costs around fifty thousand dollars to have
a license for, you know, for a department or
something, for a year or for two years, and so
at least there's a buy in level there that just

(23:35):
your average person, if they're getting it, if they're obtaining it legally,
wouldn't be able to use. And then there are, you know,
ways to verify, you know, whether an end user is
actually, whether that license is actually the person who
should be using it. That's good. And the other thing
is if you're a manager within the software of a

(23:58):
you know, a licensee of this software, you get audits
of every time the software has been used, which is
really nice. So at least at least there could be
some oversight to it and they could check to see
if you are quote, using it in a way they're
not supposed to.

Speaker 1 (24:13):
Yeah, yeah, there's an example of that I
think will come into play here later, which is kind of,
of it's pretty weird.

Speaker 3 (24:21):
It's it's a little it's a.

Speaker 1 (24:22):
Little spooky, but I want to go back to that warning.
It's it's literally something. It's it's like that old FBI
VHS tape warning that we mentioned in an earlier episode. You
go on to the homepage essentially of the app, which
you can use on desktop or you can use on
your phone, and basically all it says is like, don't

(24:42):
use this in a bad way.

Speaker 3 (24:45):
And that's it's.

Speaker 1 (24:46):
It's like as effective as those tags on mattresses that
say don't remove this tag, or the little
note on your box of Q-tips that's like, hey, don't
put these in your ear. You know it feels awesome. Uh,
and I know that's what everybody does with it, but
don't don't put them in your ear.

Speaker 3 (25:06):
Uh. And then, like you said, Noel, that absolves them.
Or the warning about seizures at the beginning of when
you start up your PlayStation, you know what I mean,
it's like, yeah, okay, maybe I get seizures, but I'm
gonna roll the dice because I sure want to play
me some Borderlands three, you know.

Speaker 1 (25:20):
Yeah, because you know, now that I think about it,
we might be at a bit of a glass house situation here, uh,
my friends, because our show literally starts with the warning
that says you can turn back now. We're not, we're
not as bad as Q-tips.

Speaker 3 (25:35):
You can, you can turn back now. That's definitely, as,
as is our prerogative, to follow said warnings and not
stick Q-tips in our ears. But who wants to
do that? What are you gonna use them for? Like
putting on eyeshadow? I mean, I don't know, the back
of your ear? Really, do you need it to?

Speaker 1 (25:51):
A Q-tip for that?

Speaker 3 (25:52):
Can't you just use, like, a, like, a tissue?
Some people are living wild, man. I don't know what
to tell you. Q-tips are literally probes. I don't
know. Anyway, I just feel like they're very clearly designed
to shove in your ear and it does feel great,
but yeah, it can. It's pretty scary when you pull
out some blood, you know, just saying.

Speaker 2 (26:10):
So, yes, I've never done that. Actually, I'm gonna, I'm
gonna start trying. Uh, no, I've never gotten blood, like,
gone in for wax, out with blood. But you know,
maybe I'm just fortunate. I want to bring up something
that is possible with this software and just how it
uses it before we jump into the next thing we're

(26:31):
going to talk about. And that's just that when your
face ends up on a social media website somewhere, if
you even if you did not take that picture, even
if you were in the background of a selfie that
someone took at Bonnaroo, let's say, or something like that,
your face is still searchable with facial recognition software if

(26:54):
it is visible in the background of an image, if
you're in a crowd. Anywhere you are, anywhere
you've been where a photograph has been taken where
you are visible somewhere in the background, this thing,
Clearview AI, may very much likely have that image
within its database, and your face will be recognizable to it.

(27:16):
Just let's put that.

Speaker 1 (27:17):
There. So it doesn't matter whether you're on social media, exactly.

Speaker 3 (27:21):
Guys Devil's advocate here. Isn't that one of the situations
where if you're in public you're sort of giving up
that right to privacy, Or if you go to a
Bonnaroo, there's usually signs that say when you pass
these gates, you are subject to being filmed, etc. And
that's usually for more like documentary purposes of the festival,
but I would think it would also apply if you're

(27:42):
in public and you're caught on you know, one of
these millions of little surveillance devices that people carry around
with them. Is that fair game? Can you sue over
the use of, the misuse of your image? I'm gonna,
I wouldn't think. I wouldn't have thought.

Speaker 2 (27:58):
So I'm certainly not arguing about, you know, the legality
of it. I'm just saying the reality of it is
that wherever you go, if there is a camera taking
a picture, whether it's CCTV. Well, CCTV generally wouldn't show
up in one of these things unless it was placed
into the system.

Speaker 1 (28:14):
At the, at the risk of sounding too, I don't know,
too extreme about this, maybe too prescient, I think the
best, for people concerned about this, the best way to
handle it, which is maddening, is to apply
some of the rules that
we apply to guns to cameras. So you always assume

(28:38):
a gun is loaded, right, you never want to point
it at you, So always assume a camera is recording.
And if you don't want to be with that assumption,
if you don't want to be in the footage or
in the photo, then don't have it pointed at you.
You guys, remember, I was very uncomfortable with Pokemon Go

(28:59):
when it came out. I was like, no, they're putting Poké,
they're putting these AR little cute things to collect because
you're filming.

Speaker 3 (29:07):
You're filming for them.

Speaker 1 (29:08):
You guys are like, Ben, Big Brother does not give
a, like, the faintest hint of a ... about Charizards
or whatever. Right? But still, that stuff freaks me out,
you know. But we did.

Speaker 3 (29:23):
A whole story about that company, Niantic, and like how
it had roots and intelligence gathering. I mean, you're not wrong, Ben,
And I remember that we'd be out hanging out and
people would be doing their little capturing their pokemons and
they'd be on your cell phone or something, and you
would like cover your face, and that is your prerogative
to do. And I completely understand where you're coming from,

(29:43):
because it's like with the whole Zoom phenomenon that we're,
obviously, we're using it right now. You can turn off
that camera as far as you're concerned, where it doesn't
show it to the group, but I guaran-damn-tee you,
it's still capturing what's going on and storing it somewhere.
So you know, it's a great technology. It works real well,

(30:04):
which is why it's having this moment right now. But
I just wouldn't trust it too implicitly.

Speaker 2 (30:10):
You know, you know the best technology? It's what Ben,
Ben does, and it's a sticker or some tape, just on
at all times.

Speaker 1 (30:21):
Yeah, but that's, that's still, that's a low
tech hack. But you have to do it because this
stuff can be weaponized. And uh, there's a guy named
Eric Goldman at the High Tech Law Institute at Santa
Clara University.

Speaker 3 (30:37):
I like the way he put it.

Speaker 1 (30:38):
He said, the weaponization possibilities of this are endless. Imagine
a rogue law enforcement officer who wants to stalk someone
that they want to hit on later, or maybe a
foreign government is doing similar to what you described. No,
they're digging up secrets about people to blackmail them or
to throw members of an opposition party in jail.

Speaker 3 (30:59):
This stuff can happen.

Speaker 1 (31:01):
And you know, Clearview is also notoriously secretive. Even now
that's the second big issue. It leaves a lot of
critics ill at ease. We know some stuff because of
some excellent research from earlier reporters. Let me tell you, guys,
this weird story. So there's a there was a New

(31:22):
York Times journalist who was working on a story about
Clearview that really brought it to international attention. And while
this journalist is working, is trying to do
this story, Clearview isn't really answering him. They're like ducking
attempts to contact via text, email, phone call.

Speaker 3 (31:41):
And so this reporter.

Speaker 1 (31:44):
Knows that a lot of law enforcement agencies have Clearview access,
and so this reporter asks some contacts in the police
department to run his photo through the app, and they
do a couple times. So the first approach, the first
thing he learns about Clearview approaching outside forces is not

(32:07):
when they come to him. It's when they go to
the police that he worked with and they ask the police, Hey,
are you talking to the media.

Speaker 3 (32:16):
How shady is that?

Speaker 2 (32:17):
Wow? Yes, that is whoa.

Speaker 3 (32:21):
They have an explanation right, oh absolutely, And just really
quickly that's the piece that was on the Daily because
The Daily is affiliated with the New York Times. So
this was after the fact of this reporter whose name
I'm trying to find, had done all this research and
gone through this whole run around to even get a
sit down. The reporter eventually found the office and it
was like very shady, like there weren't that many

(32:42):
people there. It was a skeleton crew and finally got
access to the founder. But that is all, you know,
described in great detail on that piece by The Daily.
I highly recommend checking it out.

Speaker 1 (32:53):
Yeah, Kashmir Hill, "The Secretive Company That Might End Privacy
as We Know It," that is the Times article. Yeah,
I like what you're saying about that. When, when Hill
finally figured out where the company was, originally on LinkedIn,
they just had like one employee listed and it turned
out to be a pseudonym or a fake name that

(33:14):
the founder was using. So they're aware of the
dangers here. But when they did eventually speak to
the reporter, they had reasonable explanations for
both of these things. First, the founder said, look, we
were avoiding speaking to the media, not because we're like

(33:35):
super villains or something. It's because we are a tiny
startup and a lot of times when you're a tiny startup,
you're in you're in stealth mode, right, And that makes
sense because the NDAs on those things are pretty ironclad.
You don't want anyone to scoop you. And then he
additionally said, here's why we talked to the police about
your search terms. We actually monitor all the searches that

(33:57):
are done because we want to make sure that they're
being used correctly. In other words, you know, we want
to make sure nothing like like there's never a situation
where a police officer is jilted, is jilted by a
romantic partner and then they find out their X has
a new lover and they're like.

Speaker 3 (34:15):
Well, I'm gonna find out about this.

Speaker 1 (34:19):
God, I'm even afraid to make up a name because
I don't want to put it out there. Uh, Marcus
McGillicuddy. Oh, that's a swing and a miss.

Speaker 3 (34:28):
Funny thing about that, though, Ben is like, you don't
even need technology this robust to dig into somebody that
closely removed from you know, a person you really know
they're probably already like posting tagging each other on Instagram
posts and you just you know, go and just do
a couple of clicks and you're there. But no, it's
just a really good point and that's what set
off those alarm bells. But yeah, the way it's described

(34:51):
in the, in The Daily piece, it was,
it felt very like, okay, A, they know they're
onto something, and B, they don't want to be found.
And I can see, to your point, Ben, the whole
startup mode. They don't want to get scooped. They don't
want anybody stealing their proprietary, their, you know, technology

(35:11):
and trying to misuse it or you know, create a
copycat kind of version of it. So that does make sense,
but just it's the whole thing is a little strange.
But this app, Clearview claims, you know, is, is up
to security industry standards in terms of, I guess in
terms of, you know, hackability, or like firewall security, or

(35:33):
you know, being not being vulnerable to being infiltrated, but
who is actually watching them? Who is actually monitoring them
because it's it's essentially untested and already being put out
into the wild in cases that can have huge effects
on people's lives. We'll talk more about that after a
quick word from our sponsor. Yes, you know that was

(36:03):
a perfect ad break.

Speaker 1 (36:04):
It'll also give me time to get this cat out
of, out of my recording studio.

Speaker 3 (36:10):
I heard you had a little meower.

Speaker 1 (36:13):
Yeah uh yeah. Well we're all dealing with new coworkers, right.

Speaker 3 (36:19):
Can I just do a quick aside, a quick quarantine
aside? Had a crisis yesterday. I don't know if
I've mentioned on the show that my daughter got a hamster
right before quarantine kicked in, and I had to inherit
it because her mom was allergic, she found out. The hamster's
name is Honoka, which is like an anime character. But anyway,
we realized yesterday afternoon that Honoka was not in her cage.

(36:41):
So we had a hamster search party all up in
this house, freaking out, and I had just these horrible
visions of like finding a dead, decaying hamster, you know,
two weeks later in my sock drawer or something. So luckily,
the little the little critter found herself back to where
she belonged on her own, which was great. Great, yeah,

(37:02):
she just popped out and was right there. I even thought
it was a sock, and it turns out it
was a hamster. And it wasn't long. I read all
these horror stories about how you can only search late
at night and you gotta bait it with peanut butter
and all this stuff, and she just came, came
right back. It was great.

Speaker 1 (37:16):
That's fantastic. Yeah, I had a, I had a gerbil
in my younger days that escaped and lived and
died in the walls of a house of mine. We tried,
tried to move the oven and appliances to get
to it, but then we couldn't, and my father thought
it would be a good lesson in mortality.

Speaker 3 (37:39):
Oh boy, I was hoping to avoid that particular lesson
for my kid, especially during these trying times when things
have been mainly pretty positive for us here at the house.
I'm so glad. Yeah, thanks guys, but no, you're right, Ben,
tell us a little bit about the security of this
app and the idea of it being tested for.

Speaker 1 (38:02):
Vulnerabilities, right. Yeah, this is a very interesting point. So
a lot of the reporting that came up about Clearview
AI came before late February of this year. In late February,
Clearview's app security was actually tested in the form of

(38:22):
a hack. On February twenty sixth, the public learned that
the company had been breached, their security had been compromised. Hackers
had stolen Clearview AI's entire customer list, which was coveted
and very much secret, right, the customer list. Adding to
the troubling facts about this company, the customer list did

(38:43):
not jibe with what Clearview had said earlier. It spans
thousands of government entities and private businesses across the world,
so the US Department of Justice, Immigration and Customs Enforcement. But
it also includes banks, and the founder does allude to banks
in a couple of different interviews. It
also includes Macy's and Best Buy and Interpol. And then

(39:06):
on the list you also see credentialed users at the FBI.
You know, we said hundreds of police departments. But here's
something very interesting. It also includes users in Saudi Arabia
and the United Arab Emirates. These are countries that are,
you know, not particularly known for their progressive policies or

(39:27):
their their First Amendment rights. The founder, by the way,
says that they're exercising a First Amendment right when they
scrape all these images.

Speaker 2 (39:36):
Yeah, they're they're exercising a First Amendment right because they're
all publicly available and out there. And the argument
that they make generally is that if Google can scrape
all the websites and create search terms for all those
things and then allow you to search that information, if,
you know, YouTube, you know, when you upload something

(39:57):
to YouTube, it's all publicly available, they can scrape all
the information, all the data, and then the other
companies intertwine and use that data to help each other
find other things. Well, hey, why can't we do that too?
So we're going to. However, a lot of the scrutiny
that's been happening because of that hack that was reported,
as well as just the tech in general and how

(40:18):
these other giant tech companies how their data is being used.
Tech giants like Twitter, Google, and YouTube, same thing, have
sent cease and desist letters to Clearview, orders saying do
not do this anymore. And even like because of this,
some police departments and other users have just discontinued use

(40:38):
of the software. And when the founder, you know, responds
to this kind of thing, he says, just you know,
our legal team is on it. They're working right now
to fight that battle using the First Amendment as our
major argument.

Speaker 3 (40:54):
Yeah.

Speaker 1 (40:54):
And then, importantly, we should note that they said the customer
list was the only thing compromised, so the hackers don't
have that three billion plus and growing image archive. But
you're right, Matt, legal troubles gather like storm clouds on
the horizon. Clearview has said it's complied with things like

(41:17):
policies proposed by the ACLU. The ACLU has said that
they don't comply. And I think there's a case coming
up against Clearview in Illinois, and it's one of what
may be many. Arguably, legal troubles should be gathering on
the horizon, and the storm should hit, especially when we

(41:37):
look at the implications of this technology, and before we continue,
we want to be like cartoonishly transparent here. We're not
saying that Clearview or the people in the startup are
bad people, and we're not saying even that their intentions
are bad. We're saying that this technology, just like any

(41:57):
other technology, has inherent implications.

Speaker 3 (42:01):
It has, just like fire.

Speaker 1 (42:04):
It has it has some positive benefits, and it has
some very dangerous possible consequences, like how could it be misused?
The first one just off the top here. The first
one is the one we talked about a couple times
in this episode. Somebody with access could use it for
personal reasons. We did that hypothetical cop who's you know,

(42:25):
mad at his ex for dating Mark McGillicuddy or whatever.
But we can go beyond the hypothetical examples. There's a
real life example. There's a billionaire Clearview investor who used
the app to stalk his daughter's newest beau. He wanted
to find out about this guy. I imagine, you know,
whether you're a millionaire or a pauper. Of course you

(42:48):
want to know who your kid's dating. You want to
know about them. But it's crazy that this guy was
able to use it and he found out all the
stuff that we just mentioned. Where the guy lives, what
his job is, you know, his known connections, his social
media and stuff. This example is a little weird though,
because I imagine being a billionaire, you could just hire a

(43:09):
person to dedicate their time to doing that.

Speaker 3 (43:13):
Just hire a PI.

Speaker 2 (43:15):
This is a, this is a really important thing about
this that maybe we mentioned, but we didn't get
into it: the reason why he was able to find
all that stuff about this person. The way I've seen
the software work, it's not like you identify this person,
then it gives you a giant readout of all of
this person's information. You you identify that that person's face,

(43:38):
then all of the other publicly available faces on the
internet show up, and you can click on it,
and it takes you to the website where that face is,
where that image is located, so that means you're then
directly connected to their social media to anywhere that that
photo has been posted, and then through those other websites
you can gather all that other information. At least that's

(44:00):
the way it was demonstrated in that CNN Business interview
in a couple of other places.

Speaker 1 (44:05):
That's the way it works for now. But we know
that that UI could easily be upgraded to maybe pull,
to scrape, that information if it's publicly available, right? I mean,
think about how many things are we've talked about before,
how easy it is to find a phone number, how
easy it is to find not just your address, but

(44:26):
the last few addresses you lived at over the years.
This is where we should also mention people don't trust
Clearview AI. This is a little bit of stereotyping, but they
don't trust it because one of the investors, Peter Thiel,
is also big in Facebook.

Speaker 3 (44:42):
So Facebook.

Speaker 1 (44:46):
Facebook is good at a lot of things, and users'
privacy has never been one of them. That is not a bug,
that's a feature, in their opinion.

Speaker 2 (44:55):
This is just a dovetail on all of that. In
that interview, Donie O'Sullivan does a search of his face
and it's the official profile picture on the website where
he works, the CNN Business site. And when it scrolled,
when they scrolled all the way down, there was an

(45:16):
image of him, because he's in his, I guess, thirties,
late thirties, or something like that. I don't know how
old he is, but he's a person about our age,
Ben and Noel and I. They scrolled all the way down
and there's an image of him from a local newspaper
where he was in Ireland, right when he was sixteen
years old. And it's a group of you know, teenagers

(45:38):
holding up a sign and you can just see his
face and he looks absolutely different. I mean, I would
not have been able to tell that that was him,
but he recognized the photo and he knew it was him.
And this is the kind of thing where you know,
this is not an allegation on my part. This is
just me wondering and ruminating. I'm wondering how they connected

(46:03):
that up, How this face recognition software inserted that picture
of him where he looks so different. And the founder says, well,
you know that there's still parts of your face, the
geometry of it that remained the same over the years,
even if you put on weight, even if you're wearing glasses,
even if you're covering your face. I get that argument,

(46:25):
But to me, it makes me wonder if there is
some kind of added, like, Spokeo search that's going on
within this system. Spokeo, that's just another
company where you can search social media, any social media

(46:45):
accounts that somebody has using an email or a phone
number or a name, And it makes me wonder if
they're connecting things up there as an extra layer of like.

Speaker 1 (46:54):
A name, like prioritizing an image by other information.

Speaker 2 (46:59):
Yeah, it makes me wonder if that's happening. I guess not,
but I would say he didn't sufficiently, the founder
didn't sufficiently explain how that image got in there, and
the interviewer is clearly a little weirded out and disturbed
by it.

Speaker 3 (47:15):
Well, another thing too, is I mean, you know, in
the United States, presumably, you know, you might be
identified using this technology, but it doesn't mean you are immediately
going to be convicted of something without some kind of
physical evidence or without a trial. But in other countries
where stuff like that isn't as much of a thing,
it could be used to identify social dissidents or like

(47:37):
people that are speaking out against the totalitarian government and
flag them and throw them in the gulag or whatever,
you know, without ever you know, batting an eye. And
that is sort of the crux of the problem with
this software is it depends on the notion of people
that use it being good actors and and being good

(47:57):
stewards of this power that they have. And as we know,
power corrupts, and we've got a lot of governments that
are kind of corrupt and out to shut down any
kind of criticism of what they do. And this would
be a great way to identify those people and round
them up and put them away.

Speaker 1 (48:19):
Yeah, and it's it's it's dangerous for that reason. You know,
you could you could detain people, and you could cut
around some of their legal protections like what if this,
what if this counted in a court of law as
definitive identification, then you would say, well, we don't need
to figure out if you're the person, right, It could

(48:40):
get so dangerous so quickly. There's another point we have
to add. You know, this sounds like some kind of
Skynet Big Brother stuff. It can also be used inaccurately.
It is far from perfect. It's one thing that
I've been talking about a lot with some people online, uh,
facial recognition has an inherent racial bias that I wish

(49:04):
more people talked about. For non-white people, for persons
of color, this stuff lags in accuracy, not just Clearview,
but in general. Facial recognition is known for this. And
this means that you could be arrested due to a
computer error because the software decided that you look like
a guy you never met, who lives in a state

(49:27):
you've never been to, who committed a crime that occurred
while you were not alive, and it could happen. It
could happen here.

Speaker 2 (49:34):
And to continue with that, we mentioned, you know, the
people that were known to be using this software, the organizations,
and two of them or three of them, let's just
point these out: Department of Justice, US Immigration and
Customs Enforcement. Let's just leave it at those two. So
take the problems you're talking about ben that are inherent

(49:56):
to this system. Apply that then to, let's say, Immigration
and Customs looking for people who are in the United
States illegally, and imagine that you're attempting to round everyone up.
The implication is this software would be extremely effective, I
would say, or completely ineffective, because they're detaining and rounding

(50:21):
up people incorrectly because of the problems with the software.

Speaker 1 (50:25):
Well think about this too, the problems with the software.
So they can compound on the state level, but they
can compound on the private level too. And this is
one that I think for the immediate future is at
least as dangerous as the Orwellian stuff we're talking about
right now. So that data, when it's mined right, just

(50:48):
like your data on any other social media, it can
be sold to third parties, insurance companies, financial institutions, and
so on. This goes way past ads the phrase. I
know I'm kind of churchifying here, but the phrase I
think of is a longitudinal profile of your face and
your behavior over time. So it's like watching a real

(51:12):
life progression of you aging. And let's say that an
insurance company or an algorithm could start to pick up
on certain medical conditions that are manifesting in your face
in very minuscule ways, ways a human being, and even
a human doctor, wouldn't sense. The computer picks this up

(51:32):
before you know anything's wrong, sort of like the way
Target's algorithms figured out that poor girl was pregnant before
her family knew.

Speaker 3 (51:42):
So here's the question.

Speaker 1 (51:44):
Would the insurance company legally be required to notify you
of this possible health condition? Say, like, there's clearly something happening.
Maybe if they have video, too, of your behavior that
is indicative of an early-onset terminal condition, would
they be legally required to tell you that they knew

(52:06):
you had a let's say, eighty percent chance of dying
in the next two years and give you the assistance
you need right now. That's a nope.

Speaker 3 (52:15):
That is a big no, chief. They don't have to
do a damn thing for you.

Speaker 1 (52:19):
What they can do with absolutely no repercussions is say, hey,
this person's probably going to be dead in two years
and it's going to be expensive to get them to
year two.

Speaker 3 (52:28):
So let's just drop them. You know, let's just drop them.

Speaker 1 (52:31):
There's no law that, there's no like ethical legal requirement
for them to tell you what they know.

Speaker 3 (52:38):
And that's that's.

Speaker 1 (52:40):
Mind boggling, that is reprehensible. That is like I need
I need to get my good non-abridged thesaurus
to come up with all the different words for how
bad and disturbing that is.

Speaker 3 (52:51):
Oh, Ben, I love it when you do that,
because I always learn a new word.

Speaker 1 (52:57):
Uh, did you guys ever, just to lighten the mood,
did you guys ever hear that old joke about the
abridged thesaurus? It's like, I have an abridged thesaurus. Not
only is it terrible, it's terrible. So I didn't write that,
but you, but you can... The good news is, right now,
depending on where you live, you can opt out of Clearview.

Speaker 3 (53:20):
Oh my gosh.

Speaker 5 (53:21):
But even the way you opt out, it isn't like it's
such a burden. It's such a burdensome process. You can't
just click a box in an email, right? You, all
you have to do is send a headshot, we all
have those, and an image of your government-issued ID.
And this also only applies to residents of California or

(53:42):
the EU.

Speaker 3 (53:43):
For whatever reason, sorry, rest of the world.

Speaker 2 (53:46):
Right, because of privacy, because of privacy laws that exist there.

Speaker 1 (53:52):
Yeah, we shall also point out, going back to this
accuracy thing there, there are serious questions about how accurate
Clearview is in general. So BuzzFeed, according to marketing literature
they found from Clearview, says that the company
touts the ability to find a match out of one
million faces ninety eight point six percent of the time.

(54:14):
But when Clearview did finally start talking to the New
York Times, they said, the tool produces a match up
to seventy five percent of the time, and we don't
know how many of those are quote unquote true matches.
So there's some contradictory information coming out, which can make
people feel uncomfortable, obviously, and that's that's where we leave today.

(54:34):
The battle lines are firming up. On one side, you
have Clearview, its investors, its clients, and what I would
call the techno optimists. Right on the other side, you
have privacy advocates, you have tech giants like Microsoft IBM,
you even have the Pope. The Pope came out against
facial recognition and he summed it up pretty nicely. He said,

(54:55):
quote, this asymmetry, by which a select few know everything
about us while we know nothing about them, dulls critical
thought and the conscious exercise of freedom.

Speaker 2 (55:05):
Imagine applying that to a priest who hears the confessions
from everyone every week when they go and tell their
dirty secrets. The priest is then the one who everybody
knows nothing about, but he knows everything about them, just
putting it out there, especially hey, and also this thing

(55:28):
is targeting child sex abuse and abusers. Just also leaving
that there.

Speaker 1 (55:34):
Yeah, one other thing for anyone listening along and thinking,
good thing.

Speaker 3 (55:40):
I made my.

Speaker 1 (55:40):
Profile on insert social media here private years ago.

Speaker 3 (55:44):
Sorry, homie.

Speaker 1 (55:45):
Anything that was public at some point is pretty much
in the system, even if you later made it private.

Speaker 3 (55:50):
So check your MySpace.

Speaker 2 (55:52):
In that CNN Business interview, as you said, Ben, they
test out the software on the producer who has a
private Instagram account, and images from that account show up
in the search and it is because it was public
at one point.

Speaker 1 (56:07):
And that's where we leave off today. What do you think,
fellow conspiracy realists? Do the benefits outweigh the potential consequences
of this? Do the consequences outweigh the potential benefits?

Speaker 3 (56:21):
Let us know.

Speaker 1 (56:22):
You can find us on Facebook, you can find us
on Instagram, you can find us on Twitter. We always
like to recommend our Facebook community page.

Speaker 3 (56:29):
Here's where it gets crazy. Now.

Speaker 2 (56:32):
You can jump on there. You can talk about this
episode or any other in the past. You can post
some dank memes, let's say, some conspiracy memes, or facial
recognition stuff. Maybe you've had access to this before, to clearview,
and you want to talk to us about it. Don't
do that on Facebook. Don't do that. But if you
do want to tell us about that, you can call

(56:52):
our number. We are one eight three three STD WYTK.
You can leave a message there, tell us about it, let
us know. If you don't want to be identified, if
you don't want us to know or talk about anything
on air, just just let us know, give us the information.
We'd love to hear from you.

Speaker 1 (57:12):
And I want to add to this to say that
I finally started, despite my strange phobia regarding phones, I
started diving in. Matt, you're doing massive, amazing work there,
and I've started listening to these calls as well. Thank
you so much to everyone who calls in. It's inspiring

(57:34):
and I don't know about you. I don't know about
you guys, but it makes me feel like what we're
doing is worthwhile, if that makes sense.

Speaker 2 (57:42):
Oh, definitely, definitely. I've had, you know... Hopefully, you're gonna,
if you could get past the phobia, Ben, hopefully you
can call a few people, because you can use that
app and you should be good to go. Actually, speaking
to, to, you know, you know who you are: if
we've talked on the phone, it means a great deal,

(58:04):
as Ben is saying, just to us, to know that
we're not just talking in a darkened room and that
you guys, you guys care about us as much as
we care about you. So this is a great relationship.

Speaker 1 (58:16):
Let's keep doing it. Well said, Matt, well said. And
we'll be following up with some of those messages in
the future, so keep them coming. One last thing, if
you say, look, Ben's right, phones are terrifying and weird,
but also social media. You guys just told me how
dangerous that is.

Speaker 3 (58:37):
Why? Why on earth? What gives? How?

Speaker 1 (58:40):
I have a story that my fellow listeners need to
know about. I have some I have some experience. I
have some terrible jokes to tell you, but I don't
know how to contact you.

Speaker 3 (58:50):
Well, we have good news for you.

Speaker 1 (58:53):
You can always, twenty-four seven, contact us at
our good old-fashioned email, where we...

Speaker 2 (58:58):
Are conspiracy see at iHeartRadio dot com. Stuff they Don't

(59:21):
Want You to Know is a production of iHeartRadio. For
more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts,
or wherever you listen to your favorite shows.

Hosts And Creators

Matt Frederick

Ben Bowlin

Noel Brown
