
May 16, 2019 35 mins

If data doesn't make you think of a new world order, it should. AI is enabling wholesale surveillance, and changing the landscape in countries like China, where cameras monitor citizens to decide their social credit score. But how is this already playing out in the US? We speak with experts on both sides of the Pacific, and visit the NYPD to learn how they use AI. Plus, we see where else predictive technology is being used in the American criminal justice system. 

In this episode: Ian Bremmer of Eurasia Group, Kai Fu Lee of Sinovation Ventures, Mary Haskett and Alex Kilpatrick of Blink Identity, Lisa Talia Moretti of Methods, Glenn Rodriguez of the Center for Community Alternatives, Ben Singleton of the NYPD, and Jason Tashea of Justice Codes.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Sleepwalkers is a production of iHeartRadio and Unusual Productions.
Every single thing that you do on a social network
or on a website is essentially recorded. How many pages
you visited, what did you click on, how did you

(00:25):
get to that website? What page did you leave on?
How many photos have you ever uploaded? Where were those
photos uploaded? How many places have you checked into? Who
have you tagged? What photographs have you been tagged in, with whom?
Where were those photographs taken? Who's in your friendship circle?
Who did you go to school with? There are algorithms
and machine learning technologies that connect all of that data

(00:49):
together and start to find patterns. That's Lisa Talia Moretti speaking.
She's a digital sociologist and tech ethics advocate with a
big focus on data. I think some of the data
that's quite concerning is facial recognition technology, where machines are
being fed huge amounts of images pictures of people's faces,

(01:13):
and that can come from dating websites, from photographs of
you on Facebook or Instagram or anywhere on the Internet.
You don't even have to upload it yourself. It could
be somebody else that's uploaded it for you. With all
that data out in the wild, all it takes is
for somebody to suck it up and they can start
connecting the dots. So there were researchers last year from

(01:35):
Stanford University who, without the users' permission, scraped thirty thousand
photographs from a dating website that was public, right. So
they took it that they could just use those photographs
for research. Because it was a dating site, people were
also asked, as data points, um, what is your sexual orientation?

(01:56):
So gay, straight, bi, right, and so they had all
of that data connected, so they could then connect faces
to sexual orientation, and essentially they built an algorithm that
they said could detect, just by somebody's face, if they
were gay or straight or bi. The algorithm worked; it
could predict sexual orientation from photographs with accuracy for men

(02:21):
and accuracy for women based on just five photographs. And
Stanford isn't the only place using facial recognition to categorize people.
Governments are too all around the world. This is Sleepwalkers Welcome.

(02:48):
I'm Oz Woloshyn. In the last episode, we looked at what
happens when artificial intelligence digests huge data sets to find
patterns and make predictions, like what's the most typical movie synopsis?
Or is that mole cancerous? But advances in machine learning
are also making us more and more legible. In today's episode,

(03:10):
we ask what happens when we become the data set
and the power to predict is turned on us. Here's
Lisa again. The extra terrifying thing is, why create this
kind of technology? Like, who does it serve? Why do
we need this? If you take this type of technology,

(03:30):
feed it to a citywide CCTV surveillance system, and say
that you go to a place like Saudi Arabia, where
being gay is considered a crime, and suddenly you're just,
what, pulling people off the street and arresting them because
you're gay, because the computer said so. So, like, now
you're going to prison. This may sound like a terrifying

(03:54):
Minority Report style future, but actually it's here today. I'm
Karah Preiss. So Lisa was talking about what happens when
governments start to connect the dots of all this data,
but it's already happening with private enterprise. For example, insurance
companies can now see someone who joins a Facebook group

(04:14):
about a genetic mutation and use that data to guess
that that person may have the genetic mutation and the
condition that's associated, and then the computer says, let's raise
their premium or let's deny them care. And so we
can use this proxy data to do something The New
York Times has recently called proxy discrimination. And I think

(04:34):
something that's even more widely applicable is this definition of
surveillance capitalism, which is essentially that data is not just
data anymore, it's money. Companies can use data to make
predictions about future behavior and that can make them profit. Right,
that's surveillance capitalism. I mean, using information about joining a Facebook

(04:57):
group to make an insurance decision, and it's I don't know,
it's a little bit scary. In the US, it's all
about capitalism. But in other countries, surveillance is used for
other purposes. For example, in China, it's about social control.
But in China, this same massive ingestion of data and
statistical modeling is used for governance. So let's take a

(05:18):
closer look at China and how they're using technology to
amplify the power of the state. They have this notion
of a social credit score, which is how good of
a citizen you are according to the government. They have
literally hundreds of millions of cameras around and they can
basically do things like you've been out at the bar,
you know, until two the last couple of nights. That's
not really what a good upstanding citizen would do. So

(05:39):
your social credit score can get dinged because you've been
spending too much time at night at a bar. That's
Dr. Alex Kilpatrick. He and Mary Haskett co-founded Blink Identity,
a facial recognition company that can recognize the user's face
in, yes, the blink of an eye, about zero point
four seconds. Here's Blink co-founder Mary on Chinese surveillance.
They have cameras, so you're recorded jaywalking, and so your

(06:02):
score goes down, you know, and you automatically get a ticket,
which again doesn't sound like that big a deal
if you really believe in law and order, except if
your score isn't high enough, you can't buy a plane ticket,
you have to travel by bus, you know, you can't
live in certain areas. And you can obviously see how
this could be abused. I mean, it doesn't take much
of an imagination. Other factors that can bring down your
social credit score include what you buy at the store,

(06:24):
your online browsing, and even having a friend with a
low score. This use of generalized surveillance can keep a
whole population in check, which is more or less the
explicit goal of the Communist Party of China. This can
be stifling for the average Han Chinese citizen. For minorities,
it can be much much worse, right, you know, more
specifically and more dauntingly, it can be used for internment,

(06:47):
you know, in the case of the Uyghur minority in China,
using data that is being created by this minority to
just communicate, you know, one person communicating with another person,
those who are connected are both Uyghurs, right, five Uyghurs
are gathering in the same place. Now that we know that,
you know, what are we going to do with that data? Oh,
We're going to send these people to re education camps.

(07:08):
And as technology improves, so does the state's ability to
project power. China today is cleaning the floor with the
Americans on voice and facial recognition technology. That's Ian Bremmer,
an expert on global political risk and the founder of
the Eurasia Group. The Chinese have much more data. You

(07:29):
also have a government that is consolidating the data and
allocating it for different types of purposes. And you have
no presumption of privacy whatsoever. With no presumption of privacy,
the amount of data you can collect from your citizens
grows exponentially, and that actually gives you a huge technological advantage.
This is how Kai Fu Lee explains it. What makes

(07:49):
an AI algorithm work better is how much data you
used to train it. And that's the beauty of deep learning.
You just keep throwing data at it and it just
performs better. And Kai Fu understands this world better than most.
His fund, Sinovation Ventures, has invested in Megvii, a
facial recognition company valued at four billion dollars. Kai Fu also

(08:11):
ran Google China, so he knows the landscape. China simply
has more data than the US due to not only
the large number of users, but also the depth in
which Chinese users use the Internet for ordering food, for
shared bicycles, for mobile payments. So the AI will actually
just perform better because it's trained on more data. Some

(08:33):
of that data is taken from citizens using surveillance, but
according to Kai Fu, much is freely given. I think the
Chinese culture and the Chinese people are more pragmatic, so
that if the software delivers pragmatic value, they ask fewer questions.
For example, you know, we've funded an app that loans

(08:57):
money. It asks you a couple of questions and takes
data from your phone, at the same level as
Facebook would take data from your phone, and it zaps
the money to you instantly if it decides to lend
money to you. I think in the US people might
question, do I really want to give my data to
a lending application? And is it appropriate to consider the

(09:17):
makeup of my phone as a part of that decision of
giving me money, or my zip code? Because that might
reflect certain things about me. The breadth and depth of
data in China, both from a larger population and a
much deeper integration with technology, gives China a serious competitive advantage.
Where the Americans have much better scientists, the Chinese were

(09:39):
able to buy a lot of science. Now, when you
talk to specialists in this field, they will tell you
that in many parts of AI, great data and okay
scientists will frequently beat great scientists and okay data. Kara, the
scary thing is that China is using that technological superiority
to build a very different kind of state, you know,

(10:00):
one in which the price of dissent is intolerably high.
You know. So according to Ian, public demonstrations have fallen markedly, right,
because if you know you're being watched, you're probably less
likely to commit public displays of civil disobedience. Right. We
talked about the Uyghurs earlier. In Kashgar, which is a
Muslim city in China, Uyghurs have to register to go

(10:20):
into the mosque, and once they're inside, they face a
bank of cameras like many many surveillance cameras, and go
figure, Muslims stopped going to mosque voluntarily. Well, because going
to mosque is an act of civil disobedience where they are.
Even if it's not explicitly stated, it's it's heavily implied, right.

(10:41):
I mean, I think about it for myself, Like if
I were living in an area in the United States
where going to temple was going to land me in
an internment camp, I would not be going voluntarily if
I knew there were security cameras all over my temple. Absolutely,
And the crazy thing is you wouldn't even have to
know if those security cameras work. It's like on the
motorway in England, we have a lot of speed

(11:01):
cameras and no one knows if they're actually working,
apart from once every five years you've got
a ticket. But it still slows people down. The panopticon
works even if they can't do that, but people think they
can do that, right. I mean, you don't need a
hundred percent certainty. You just need a government that is
starting to get that capacity and make it known and
have a few people that are sort of strung up
as examples, and suddenly everyone's scared. And this isn't only

(11:26):
happening in China. According to Ian, it was a key
part of Bashar al Assad's strategy in the Syrian Civil War.
Assad got some help from the Russians, who gave them
a couple of hundred computer scientists to go in work
with the Syrian military and identify on social media and
on text messaging who were those Syrian citizens that were

(11:47):
nodes of dissent and within six months no more moderate
opposition in Syria. They specifically were looking into individual Syrian
citizens that were saying things about the regime that were untoward,
that were connected to influencers that were helping to organize protests,
and suddenly, you know, a bunch of those people were

(12:10):
rounded up and some were never heard from again. And
as I mentioned in terms of China, you don't have
to do that with many people before people start ratting
out their friends, being scared of talking to anyone not
going out. The system worked. We may feel comfortably far
from the battlefields of Syria here in the US, and

(12:31):
from the overwhelming number of surveillance cameras on practically every
street corner in China. But the more effective these technologies are,
the more likely they are to be adopted by others. Now,
in other countries, you're going to have a confluence of
both them liking the model and the Chinese directly exporting it.
Who are those countries? Well, look at One Belt, One Road,

(12:52):
the you know, trillion plus dollar investments that the Chinese
are making all over the world in Pakistan, Southeast Asia,
you know, Cambodia, Laos, a whole bunch of countries. And
when you look at those countries and you see that
the Chinese are providing the money with conditionality in return;
some of that conditionality is to use Chinese standards for technology. That's in
many of these contracts. And with the spread of Chinese

(13:14):
standards of technology comes the spread of Chinese style surveillance,
which could ultimately make the whole world trend more authoritarian.
So as that happens, these governments are going to say,
ah ha, we get the money from China, we use
their technology. We're stuck with their system, but we can
use it to ensure that our people stay in power. Again,

(13:37):
it's easy to let all of this feel comfortably far away,
but remember the Internet doesn't have borders, so you don't
have to be in China for the Chinese state to
access your data. Yeah. I don't know how many people
know this, but Grindr is actually owned by a Chinese company.
Grindr, the dating app? Yes, and actually there have been
articles about the fact that the US government is trying

(13:58):
to force China's hand so that we can buy it
back because there's so much user data that this company
now owns. Well, that Grindr user data is basically
being seen by the US government as a strategic asset.
It is a strategic asset. I mean, if you think
about it, if somebody is on a military base or
in a barrack and trying to connect with someone on Grindr,

(14:21):
they're turning their location services data on because they want
to see people in their area and if they're turning
that location services data on, they're basically making themselves vulnerable
to the company that owns the data, because it's basically
saying here I am, here, I am and not only that,
here I am, here, I am, and I'm gay, and
that can lead to some possibilities of blackmail. Even today, exactly,

(14:43):
there was an article in The Interface that was written
by this guy, Casey Newton: chat history, photos, videos, real
time location. All of that is connected to a user's
email address, and that means that the user's identity can
be very easily learned. That's pretty scary. And even if
you don't use Grindr, you might use Clash of Clans or Fortnite,

(15:03):
which are very popular gaming apps also owned by the Chinese.
Now we keep saying the Chinese. To be clear, these
apps aren't owned by the Chinese government; they're owned by
Chinese companies. What the actions of the US government imply
by trying to force this company to sell Grindr back
is that they don't believe in the distinction. And do
you think the US government would really be working that

(15:25):
hard to get back a gay dating app if they
didn't think that there was a murky separation between
the government and companies in China. Right, So every time
we give data away, I mean, we're aware that it
opens us up to targeted ads on Facebook. We talked
about those with Gillian. We're not aware that that data

(15:45):
may end up in the hands of a potentially hostile foreign government.
So once again, Sleepwalking. We've been talking about how foreign
governments are using AI, but when we come back, we'll
look at how the police and courts are using it
at home in America. Kara, it's easy to look at

(16:09):
China and to see the big bad wolf. They're using
surveillance technology for the wholesale suppression of an ethnic minority.
They have a social credit score that can limit access
to opportunities and even travel. But algorithms also determine outcomes
here in the US. Exactly. If you think about it,
we do use social ratings. If you use Uber and
have a rating lower than four out of five stars,

(16:31):
you can't get a car, right. And you can't get
a loan if you have a low FICO credit score. And
our criminal justice system also uses algorithmic ratings to decide
people's fate. When I got arrested at sixteen, I was
in high school, at John F. Kennedy High School. That's
Glenn Rodriguez. When he was a baby, Glenn's mother was murdered,

(16:51):
and when he was three, his father committed suicide. From
then on, Glenn was raised by his grandmother and he
searched for belonging. The kid who wants to feel accepted,
wants to feel a part of something right. Whatever the
group was up for, I was down. And if they
were going one step forward, I would take two. We
pretty much planned a robbery at a car dealership in

(17:14):
Queens, and we entered the premises. We took three cars.
There was a twenty five year old man in there,
and he initially pulled a gun, and so I had
a gun and I shot him. Glenn was arrested and
convicted of second degree murder. He was sentenced to twenty

(17:35):
five years in jail, and he was still a high schooler.
You feel powerless, you feel hopeless, especially at that age.
So the way I saw it was, this is my life.
You know, I'm probably gonna die in jail, and so
whatever it is that I have to do, I need
to survive. One of the things that I learned very
quickly is that in prison, one of the only things
that is respected is violence, and so in order for

(17:56):
you to survive in there, you have to be violent,
because otherwise you'd become prey. In time, Glenn established
his reputation and started to feel safer. With that security
and getting older, his thinking began to change. And it
wasn't until later, like my mid twenties, when I
started saying, you know what, I need to reverse this
trend if I want to have any chance at parole.

(18:18):
Glenn had to reverse thirteen years of behavior to survive
in prison. He had learned to behave one way, but
to get out he had to behave another. I availed
myself of the Puppies Behind Bars program, so I was
training service dogs for wounded war veterans for five years.
That was an amazing experience, right because throughout incarceration it's
almost like you build a wall around yourself with the dogs.

(18:41):
You can't fake it with the dogs. If you're trying
to like teach them a command, sometimes you may have
to be silly, which guess what, in prison, being silly
is not acceptable. That's perceived as a weakness. But with
that program, oftentimes you had to resort to being
silly and throwing yourself on the floor and giggling loud
and making all kinds of crazy sounds to try to
get the dog's attention, right. To a very large extent,

(19:01):
I believe that that program kind of helped me regain
my humanity. As well as helping Glenn personally, taking part
in prison programs for the public good is looked upon
favorably by parole boards. So everything I did, I wanted
to document, to kind of showcase what I've done, this
is who I am today. As part of the process,
there's also the COMPAS Risk Assessment. COMPAS stands for Correctional

(19:25):
Offender Management Profiling for Alternative Sanctions. It's an algorithm that
claims to be able to predict how likely a defendant
is to commit another crime based on a list of
a hundred and thirty seven questions. Since being developed,
it's estimated COMPAS has been used to assess more than
one million defendants, including Glenn. You meet with this person

(19:48):
a few months before your scheduled parole board date, and
they asked you a series of questions. And so when
he got to the disciplinary history section of the COMPAS
Risk Assessment, there was a list of offenses, right, for
him to check off. Yes or no for the past
twenty four months, and it was all no. And anyone
who has any experience with prison would tell you that

(20:08):
that is almost impossible to do, right, because misbehavior reports
can be for something as simple as having too many pillows,
something as simple as your pants hanging off your butt,
your sneaker is untied. It takes a lot of energy
to dodge a misbehavior report during the course of a year,
let alone ten, and in my case, it had been eleven.

(20:29):
And then I heard him read the question and he says,
does this person appear to have notable disciplinary issues? And
he says yes, And I was like, hold up, wait
a second, did I just hear you right? Because I
just heard you say that I have notable disciplinary issues?
Do you realize that I haven't had a misbehavior
report in over a decade? Right? And his answer was well,

(20:50):
I was told that if there's any instance of misbehavior
at any point, I have to check yes for this answer.
So I was like, okay, So at that point there
was nothing I could do. I'm appearing before the parole
board panel. I presented to them a portfolio that was
approximately one hundred pages, had letters of support. Now the

(21:12):
COMPAS is saying that I'm a disciplinary issue and so
I shouldn't be released. But I was denied because of
the fact that I scored high on COMPAS. They played
it safe and kept me in. It may have been
less than five minutes the hearing. I waited twenty six
years to sit in front of a panel of three
people for less than five minutes. No one wants to

(21:33):
be the one to go against COMPAS, and next you know,
something goes wrong and now your job is on the
line because you departed from COMPAS, which is taken as
factual and scientific. In time, Glenn went before another parole
board and this time they freed him against the recommendation
of COMPAS. And now Glenn has built a life for

(21:54):
himself working with teenagers at risk of incarceration at the
Center for Community Alternatives. But he's still being affected by
the algorithm. COMPAS does not end upon your release, because
the same COMPAS risk assessment that's considered for your release
determines how you're going to be supervised upon release.
There's a number of restrictions that I have. I have

(22:16):
a curfew. I'm still haunted by COMPAS. Despite
turning his life around, COMPAS is still limiting Glenn's freedom,
and that should haunt all of us. According to ProPublica,
COMPAS inaccurately labels black defendants as likely to reoffend twice
as often as white defendants. Algorithmic discrimination isn't government policy

(22:39):
here in the US like it is against the Uyghurs
in China, but it still exists. There's this issue
where you can have computer scientists building a more accurate algorithm,
but on account of dubious input factors like gender or
race or religion, you've created something that's unconstitutional. That's Jason

(22:59):
Tashea. He introduced us to Glenn, and he's the
founder of Justice Codes and a legal affairs writer for
the American Bar Association Journal. There's this predisposition to believe
that math doesn't carry all of the biases that humans do.
It's an objective science. I think we need to dispel
that idea. Jason is describing the very human habit of

(23:23):
taking computer output as gospel truth. It's called automation bias,
and it's why parole boards often don't feel comfortable
overriding algorithms like COMPAS, and why some people follow their
GPS even when it has them driving into the ocean.
This idea that somehow, because math is an underlying force
to these tools, makes them more objective or beyond certain

(23:46):
types of scrutiny is wrong. Computer algorithms are being used
to determine human fate today, whether it's COMPAS in the
US or the social credit score in China, So we
have to scrutinize them and understand that their output is
not necessarily neutral. The foundational principle of AI is using
historical data to predict what will happen next, and that

(24:10):
in itself is a challenge to our culture because the
American dream is built on the idea that we have
a capacity to change, that we can move from rags
to riches, from the penitentiary to the boardroom. And it's
not just an American narrative. It's Scrooges change of heart
delivering the turkey to the Cratchits on Christmas Day. It's
Saint Paul's conversion on the Road to Damascus. It's at

(24:33):
the very heart of Western culture. But algorithms like COMPAS
aren't built to see the potential in people. They're designed
to calculate risk based on past actions, and COMPAS isn't
the only example of algorithms being used in our criminal
justice system. When we come back, we go right inside
the NYPD to understand how new technology is powering policing.

(25:03):
It's a freezing cold day in New York City when
we arrive at the NYPD headquarters, and before we even
get into the main building, Julie and I have to
pass through airport style security and naturally give up some data,
including submitting a selfie in a kiosk. Yes, right now,
our society is holding big conversations about body cameras, police accountability,

(25:26):
and government monitoring, so we had to ask how does
one of the most recognizable police forces in the world
handle our data. My name is Benjamin Singleton, director of
Analytics at the NYPD, so I probably spend half my
day writing code and half my day in meetings. The
police Department collects records as a regular course of our business.

(25:47):
We respond to nine one one calls, we take crime reports,
we make arrests, we issue moving summonses when you,
you know, speed in the city. These are examples of
the kind of data that we collect. You know, I
think there's probably some sentiment that there are back doors
into various systems UM, but the NYPD is governed by
the same legal processes as any other UM law enforcement agency.

(26:11):
If we want data from an outside company or vendor,
we get a search warrant from a judge or through
a DA's office issue a subpoena, and that's how
we collect our data. There are cameras throughout the subways,
at the turnstiles, and E-ZPass readers on the roads and
more in New York. So what might the NYPD know
about me? If you haven't sort of stood in front

(26:32):
of a police officer who's taken a report by hand,
we probably don't have records on you. That being said,
we do collect data through sensors like license plate readers UM,
and we do have data sharing agreements with some other
criminal justice agencies like corrections, like the courts, and so
there's obviously opportunities for that kind of data to enter

(26:53):
our realm. But one thing that's built into every single
NYPD application is an auditing track. So anytime you look
at any piece of information, no matter what system you're in,
that's being audited. And so we have a very large
internal affairs bureau, and people have gotten in trouble before
for misuse of computer systems, and so I think that

(27:13):
that's an important check. That's reassuring to hear. But why
collect so much data in the first place? I think
that the next frontier of machine learning in policing is
bringing decisions and information into the hands of cops who
need to make decisions quickly. We recently rolled out tens
of thousands of mobile phones to all of our cops,

(27:34):
and putting a computer in their hands has really changed
the way that they police. When you have more information,
you can make better decisions. So we could be responding
to a job at a specific location in a building,
and we know what's happened at that building before. We
responded to nine one one calls there last week in apartment,
you know, four C, and in that interaction it led

(27:55):
to some sort of altercation or we found out that
that person that we interacted with had some sort of issue. Well,
the cop who's working today might not be the same
cop who is working a week ago. And so how
do I convey that information? Maybe in a phone as
a pop up, as a notification that tells
you take extra time, take caution. Um, this sort of
incident happened. Using data to give officers context is

(28:18):
hard to argue against if it can lead to safer
interactions for everyone. But of course, what many people find
more concerning is ambient surveillance. Surveillance that happens all the time,
and despite much pressure, the NYPD has yet to release
an explicit facial recognition policy. And where do the efforts
stand on facial recognition technology. Our Facial Identification Section, which

(28:41):
sits under the Detective Bureau, is a group of trained
detective investigators. They use a tool, an algorithm, that compares
faces that we might get from a surveillance photo, and
they run that algorithm, get potential matches and then conduct
an investigation. But it's not as simple as you know,
a facial recognition hit occurs and that's suddenly licensed to

(29:04):
make an arrest. It doesn't generate probable cause for us.
We still require much more evidence in order to make
a determination that that hit is truly viable and something
we can act on. But there are cases where that
technology has been used as part of an arrest or prosecution.

(29:25):
In the absence of an explicit policy, Ben wasn't able
to answer the question live in the room, but we
did get a statement from the NYPD.
The NYPD has moved deliberately and responsibly in the use
of facial recognition software. There is no NYPD case where
an arrest or prosecution was brought on the basis of
facial recognition. The NYPD uses it on a case by

(29:47):
case basis, and the case must always be supported by
further investigation before any arrest is made. The NYPD has
absolutely no interest in wholesale surveillance, which would be an
enormous and entirely pointless task. We have little choice but
to trust. But that said, Ben does speak convincingly about

(30:07):
how the NYPD actually uses technology to police themselves. There's
also statistical tools around fairness that can actually measure whether
an algorithm is fair, whether it's causing bias, etcetera. And
so we're very interested in utilizing these metrics and we
fully embrace them. We want to get better, and
we're taking a conservative approach because we know how high

(30:31):
stakes this is. The stakes are high and the path
is murky. I didn't know what to expect at the NYPD.
Would they optimize purely for reducing crime, or would they take
a broader view of justice? Personally, I found Ben reassuring,
but the potential for abuse remains. So how do we

(30:51):
here in America guard against that abuse? Well, let's return
to Mary Haskett, who founded Blink Identity with Alex Kilpatrick.
Anytime you're using face rec without consent, it's going to
get abused, because why wouldn't it? And here's the problem.
I don't think it's appropriate to ask a police department
to just voluntarily not use a tool that's awesome for them.

(31:13):
I mean you need to have a different level. You
need to have your government, federal, state, some governing
body needs to be saying, sorry, this is not appropriate.
This is violating people's rights. The difference between what is
happening in China and in the US is not technological,
it's cultural and political. Edward Snowden had a phrase for this,

(31:35):
turnkey tyranny, meaning that the technical infrastructure of mass surveillance
already exists, and that we're only protected by our values
and our laws. And thinking of China, I think there
are some profoundly creepy things that we are right on
the edge of starting to see. There's cameras everywhere. If
you add face recognition, it's not just oh, they saw

(31:55):
my face, they saw that I went to Starbucks. It's
where you were every day, every time, all of her
history and all that gets saved. It's my pattern of
where I go when I'm outdoors forever. Five years ago,
I would have said that could never happen here. Part
of the reason Mary cares so much about privacy is
that she knows how quickly facial recognition is spreading. In fact,

(32:18):
in twenty eighteen, Blink Identity raised money from Live Nation,
Ticketmaster's parent company, to allow future concertgoers to
use their faces instead of their tickets. We wanted to
be a case study of how to do this in
a way that preserves individual privacy and respects the individual,
and maybe that will help set a precedent, and maybe
some of these other objectionable use cases just won't be

(32:41):
able to take off. Facial recognition and other AI technologies
are being developed all over the world, and we can't
trust everyone to be as conscientious as Mary and Alex.
In America, the liberty we take for granted is hard
won and fragile, and cases like Glenn's show what can
happen when algorithms are blindly trusted to determine outcomes. So

(33:03):
much hangs in the balance right now about our technological future,
and the decisions we take will affect our lives profoundly
and echo through the lives of our children too. I
mentioned Charles Dickens' A Christmas Carol earlier. To me, one of
the most powerful scenes in the book is Scrooge seeing
for the first time the chains he has made for
himself through his own decisions. Nowadays, we would call those

(33:25):
decisions and those chains longitudinal data, and they'd be very
hard to get rid of. They're the record that Glenn
couldn't shake, that might deny a Chinese citizen a plane ticket,
or deny you health insurance because of your social media activity.
But data can also set us free. In the next episode,
we investigate what's possible when our data is used to

(33:47):
help us. From a dying man brought back to his youth,
to movies and music that read our bodies while they play,
and what happens when Alexa becomes part of the family.
How long did that dinosaur live? Hmm, I don't
know that one. I am still learning more about dinosaurs.
Ask me some dinosaur trivia. Sleepwalkers is a production of I

(34:24):
Heart Radio and Unusual productions. For the latest AI news,
live interviews, and behind the scenes footage, find us on Instagram,
at Sleepwalker's podcast or at Sleepwalker's podcast dot com. Special
thanks this episode to Laurie Arlam and Lucy Brady. Sleepwalkers
is hosted by me, Oz Woloshyn, and co-hosted by me,
Karah Preiss, and produced by Julian Weller with help from

(34:47):
Jacopo Penzo and Taylor Koin. Mixing by Tristan McNeil and
Julian Weller. Our story editor is Matthew Riddle. Recording assistance
this episode from Dina Bridgett, Rachel London, and Phil Bodger.
Sleepwalkers is executive produced by me, Oz Woloshyn, and Mangesh Hattikudur.
For more podcasts from iHeartRadio, visit the iHeartRadio
app, Apple Podcasts, or wherever you listen to

(35:09):
your favorite shows.
