Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker (00:04):
Welcome back to episode eighteen of The Gray Files, where we peel back the layers of technology, economics, data science, and even the human condition itself, all in an effort to try and understand this vast and often perplexing world we live in.

Speaker (00:26):
I'm your host, Erika Barker, and tonight we are talking about how you and everyone you know are being watched. Not from a dark van down the street with sophisticated surveillance equipment, but by algorithms.

Speaker (00:46):
We're examining how the everyday act of posting, liking, and scrolling turns into something else once machines start keeping score.
Speaker (01:06):
"It was a bright cold day in April, and the clocks were striking thirteen." That's the famous opening line to Orwell's Nineteen Eighty-Four.

Speaker (01:21):
Now, in twenty twenty-five, I just read a disturbing story in Wired magazine. It was about documents that show how the US government plans to hire dozens of contractors to scan X, Facebook, TikTok, and other platforms to target people for deportation.

Speaker (01:45):
This is not some simple think tank memo or request for information. It has requirements and deadlines. This is a detailed plan for around-the-clock social media monitoring.

Speaker (02:03):
And here's the part that caught my attention. The system is built to sort, rank, and predict. It turns what we say in public into scores that travel, categories we never agreed to but that still shape how we are treated.

Speaker (02:26):
Tonight, not in April of nineteen eighty-four but in October of twenty twenty-five, the stakes change again. Immigration and Customs Enforcement is seeking vendors for a social media surveillance program designed to turn public posts into operational leads. That is the visible tip of something larger.

Speaker (02:56):
Every employer, every lawyer, every insurer, every agency is learning to read your digital shadow, and they are getting really, really good at it. Big Brother is indeed watching you.
Speaker (03:18):
Part one. The watchers have names and addresses.

Speaker (03:25):
Two buildings. One in Williston, Vermont, population eight thousand, the kind of place where everyone knows your coffee order. The other in Santa Ana, California, in the shadow of Disneyland, where dreams come true.

Speaker (03:45):
Inside these buildings, nearly thirty contractors will sit at screens watching. Not watching television, not watching movies. Watching you.

Speaker (04:00):
The contract documents are public, filed October third. They want artificial intelligence that can process X, Facebook, Instagram, TikTok, YouTube, Reddit, every major platform where humans gather to be, well, human.

Speaker (04:23):
Thirty minutes. That's the turnaround time for urgent cases, from your post to their report. Half an hour to transform a tweet into intelligence.

Speaker (04:39):
They're calling it OSINT, which stands for Open Source Intelligence. It sounds technical, neutral, almost boring. So let me tell you what these systems actually do.

Speaker (04:57):
Imagine you post a photo at your cousin's wedding. Beautiful day. Everyone's happy. You're holding a beer, laughing at something off camera. The location tag says El Paso.

Speaker (05:12):
The system sees: subject was mobile on October fifteenth. Subject consumes alcohol. Subject has family connections in border region. Subject associates with (and here it starts checking your cousin's profile) an individual who posted anti-enforcement content in March of twenty twenty-three.

Speaker (05:36):
Your moment of joy becomes data points in someone else's database. The algorithm doesn't see the wedding. It sees patterns, and patterns in the wrong light can look like anything you want them to.
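To make the mechanics concrete, here is a minimal sketch of how a single post can be reduced to flags like the ones above. Every field name, watchlist, and rule here is a hypothetical illustration; the actual vendor logic is not public.

```python
# Minimal sketch: reducing one post to context-free "subject" flags.
# All field names and rules are invented for illustration, not the
# actual ICE/OSINT vendor logic, which is not public.
from dataclasses import dataclass, field
from datetime import datetime

BORDER_CITIES = {"el paso", "laredo", "nogales"}   # assumed watchlist
ALCOHOL_TERMS = {"beer", "wine", "drinks", "cheers"}

@dataclass
class Post:
    author: str
    text: str
    location: str
    posted_at: datetime
    tagged_users: list[str] = field(default_factory=list)

def extract_flags(post: Post) -> list[str]:
    """Reduce one post to the kind of findings a scoring pipeline
    stores. Note what is lost: the wedding itself."""
    flags = [f"subject was mobile on {post.posted_at:%B %d}"]
    if any(term in post.text.lower() for term in ALCOHOL_TERMS):
        flags.append("subject consumes alcohol")
    if post.location.lower() in BORDER_CITIES:
        flags.append("subject has family connections in border region")
    # Every tagged account becomes a lead for a second-hop lookup.
    flags += [f"associate to review: {u}" for u in post.tagged_users]
    return flags

print(extract_flags(Post("you", "Cheers to the happy couple!",
                         "El Paso", datetime(2025, 10, 15), ["cousin_ana"])))
```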
Speaker (05:57):
Part two. The infrastructure was already here.

Speaker (06:04):
This isn't new. Since twenty fourteen, ICE has operated something called GOST, or possibly "ghost," I'm not quite sure how to pronounce it, but it stands for Giant Oak Search Technology. Sounds like something from a spy novel, doesn't it?

Speaker (06:25):
Well, it's real, and it's been watching. GOST crawls social media looking for what the system calls derogatory information: posts critical of America, photos at protests, connections to people already flagged in other databases.

Speaker (06:47):
The interface is almost insultingly simple. Thumbs up: this person seems fine. Thumbs down: flag for review. Like Tinder, but for human freedom.

Speaker (07:03):
Meanwhile, ICE maintains contracts worth millions with data brokers, LexisNexis, Thomson Reuters, companies that aggregate everything that's technically public about you. Your address history, your relatives, that speeding ticket from twenty nineteen, the house you looked at but didn't buy, your voter registration. All of it compiled, cross-referenced, and scored.

Speaker (07:39):
A company called Palantir, which we did an episode on back in episode seven, named after the seeing stones in Lord of the Rings, just got thirty million dollars to build something called ImmigrationOS. An operating system like Windows, but for tracking humans.

Speaker (07:59):
Peter Thiel, Palantir's founder, once said privacy was dead. He was wrong. Privacy isn't dead, per se. It's just expensive. The wealthy can afford lawyers, PR firms, reputation management services. The rest of us? Well, we are just raw data.
Speaker (08:29):
Part three. The profile you'll never see.

Speaker (08:36):
Everyone knows employers check social media. That stopped being news in twenty ten. What's happening now is algorithmic behavioral profiling, and you have no idea what story your patterns tell.

Speaker (08:52):
ResumeBuilder's twenty twenty-three survey confirms seventy-three percent of hiring managers screen social media. But that statistic misses the evolution. They're not reading your posts anymore. They're feeding them into systems that analyze patterns.

Speaker (09:13):
Think about what an algorithm might see. Ten photos with drinks across two months. You see normal social life. The algorithm might flag potential substance dependency risk, poor judgment patterns, health insurance liability.

Speaker (09:33):
Posting regularly at two a.m.? Well, you're a night owl. The algorithm might interpret poor self-regulation, likely to miss morning meetings, potential productivity issues.

Speaker (09:48):
Do you delete posts frequently? You're just editing yourself, right? Well, the algorithm might code impulsive decision making, potential PR risk, emotional instability indicators.

Speaker (10:02):
These systems are being sold right now. HR tech companies advertise behavioral prediction, cultural fit analysis, risk assessment modeling. They promise to predict who will quit, who will cause problems, who will cost more in health insurance.

Speaker (10:26):
How do they calculate this? We don't know. These algorithms are proprietary trade secrets. Black boxes. But consider what's technically possible. Sentiment analysis: tracking emotional volatility across posts. Network analysis: measuring professional versus personal content ratios. Temporal patterns: identifying focus and self-control indicators. Language processing: detecting aggression, negativity, risk-taking terminology.

Speaker (11:05):
Every like, share, comment becomes a data point. Your three a.m. political rants, your deleted posts, your weekend location tags at bars, your complaint-to-positivity ratio, the gaps in your posting that might indicate depression or instability.

Speaker (11:30):
You are not being judged by a person who might understand context. You are being scored by software that sees patterns.
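Here is a toy version of a few of the analyses just listed: sentiment volatility, late-night posting, and deletion rate. The word list, thresholds, and labels are invented for the sketch; real vendor models are proprietary black boxes.

```python
# Toy behavioral-profiling heuristics. All thresholds, weights, and
# labels are invented; this only illustrates the shape of the idea.
from datetime import datetime
from statistics import pstdev

NEGATIVE = {"hate", "angry", "worst", "furious"}

def profile(posts: list[dict]) -> dict:
    """posts: [{'text': str, 'time': datetime, 'deleted': bool}, ...]"""
    texts = [p["text"].lower() for p in posts]
    # Crude per-post negativity count, then its spread over time.
    neg = [sum(w in t for w in NEGATIVE) for t in texts]
    # Share of posts made between midnight and five a.m.
    late = sum(0 <= p["time"].hour < 5 for p in posts) / len(posts)
    # Deletion rate as an "impulsivity" proxy.
    deleted = sum(p["deleted"] for p in posts) / len(posts)
    return {
        "emotional_volatility": pstdev(neg) if len(neg) > 1 else 0.0,
        "late_night_ratio": round(late, 2),
        "deletion_rate": round(deleted, 2),
        "flag": late > 0.3 or deleted > 0.2,   # arbitrary cutoffs
    }

posts = [
    {"text": "Worst day. Furious.", "time": datetime(2025, 3, 1, 2), "deleted": True},
    {"text": "Lovely brunch!", "time": datetime(2025, 3, 2, 11), "deleted": False},
]
print(profile(posts))
```

Notice that nothing in the sketch can tell a bad week from a bad person; the inputs are too thin for the conclusions drawn from them.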
Speaker (11:42):
And here's the trap: forty-seven percent of employers are suspicious if you have no online presence whatsoever. But maintaining one means feeding the profiling system.

Speaker (11:56):
Maybe the algorithm is sophisticated enough to accurately predict behavior from social media patterns. Maybe it's digital astrology dressed up as a science. You'll never know. The company won't tell you why you weren't selected. They might not even know themselves. The algorithm generated a risk score. That's all they needed.

Speaker (12:22):
Your digital shadow tells a story. You just don't get to read it. And you definitely do not get to edit it.
Speaker (12:35):
Part four. The insurance detective and your phone.

Speaker (12:43):
A ski trip posted to Instagram. Black diamond run. Pure joy. Six weeks later, a back injury from moving furniture. It's legitimate. The MRI confirmed it. Your doctor prescribed treatment, but the insurance company denies the claim. Inconsistent with reported physical limitations. They don't mention the ski video. They don't have to.

Speaker (13:13):
This is how modern insurance investigation works. The ski video was from before the injury, but the algorithm doesn't care about temporal context. It sees athletic capability, risk-taking behavior, active lifestyle. It calculates lower payout probability.

Speaker (13:36):
PropertyCasualty360, an insurance industry publication, reported that in twenty twenty-five, social media evidence factors into forty-two percent of disputed claims investigations. Almost half. Insurance companies now contract with specialized firms that scan social media for claim verification. They have names like Risk Mitigation Services and Claim Integrity Solutions.

Speaker (14:11):
What they're looking for depends on the claim. For injury cases, it's any physical movement or exercise activity. For disability claims, it's evidence of unreported work or income. Health insurance applications are screened for lifestyle indicators that suggest higher risk.

Speaker (14:33):
That marathon you ran three years ago and posted about? To the algorithm, you're either low risk because you're healthy or high risk because athletes get injuries. It depends on what they're trying to prove. The same data tells opposite stories, depending on who's interpreting it.

Speaker (14:56):
Think about wedding videos on TikTok. Someone with chronic pain having one good day, dancing for thirty seconds at their sister's wedding. The investigators see physical capacity inconsistent with claim. The algorithm flags potential fraud indicator. The claim gets denied.

Speaker (15:21):
They don't see the three hours of pain before the dance, the two days in bed after, the context that makes thirty seconds of joy possible amid months of suffering.

Speaker (15:34):
Industry training materials, publicly available from investigation certification programs, teach techniques like cross-referencing geotags with claimed limitations, analyzing Venmo transactions for undisclosed income, searching tagged photos where the subject appears in others' posts, and using facial recognition to find unclaimed social profiles.
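The first of those techniques, cross-referencing geotags against claimed limitations, is mechanically simple. A sketch under assumed data shapes (the schema, radius, and sample coordinates are all hypothetical):

```python
# Sketch of geotag cross-referencing: compare where a claimant was
# tagged against the mobility their claim reports. Data shapes and
# thresholds are hypothetical.
from datetime import date
from math import radians, sin, cos, asin, sqrt

def km(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Haversine distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * asin(sqrt(h))

def flag_inconsistencies(home: tuple[float, float],
                         claimed_radius_km: float,
                         geotags: list[tuple[date, tuple[float, float]]],
                         claim_start: date) -> list[date]:
    """Dates where a post-claim geotag exceeds the claimed mobility
    radius. Note the missing context: WHY the person traveled."""
    return [d for d, loc in geotags
            if d >= claim_start and km(home, loc) > claimed_radius_km]

home = (31.76, -106.49)                         # El Paso, roughly
tags = [(date(2025, 6, 1), (31.77, -106.48)),   # nearby, fine
        (date(2025, 7, 4), (29.42, -98.49))]    # San Antonio: flagged
print(flag_inconsistencies(home, 50, tags, date(2025, 5, 15)))
```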
Speaker (16:04):
You document your life thinking you're sharing with friends. You're actually building an evidence file that can be read against you by anyone willing to pay for it.

Speaker (16:19):
Part five. The tools they use.

Speaker (16:26):
Penlink. Cobwebs. ShadowDragon. They sound like hacker handles from a nineties movie. Well, they're actually companies with government contracts and corporate clients.

Speaker (16:42):
ShadowDragon's SocialNet monitors over two hundred platforms, not just Facebook and Twitter. We're talking about niche forums, regional social networks, places where you forgot you had accounts. It maps relationships, finds alternate accounts, builds what they call patterns of life.

Speaker (17:12):
The Texas Department of Public Safety gave a five million dollar contract for a tool called Tangles. It builds relationship webs from social media, finding connections between people who've never met in person but liked the same post.
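A relationship web built purely from co-engagement is a few lines of graph-building. This sketch uses invented accounts and post IDs; it only illustrates the inference, not any vendor's actual pipeline:

```python
# Sketch of a co-engagement graph: two accounts get an edge whenever
# they liked the same post. Data and structure are illustrative only.
from collections import defaultdict
from itertools import combinations

# post_id -> accounts that liked it (hypothetical sample)
likes = {
    "post1": ["ana", "ben", "cruz"],
    "post2": ["ben", "cruz"],
    "post3": ["ana", "dev"],
}

edges: dict[frozenset, int] = defaultdict(int)
for users in likes.values():
    for a, b in combinations(sorted(users), 2):
        edges[frozenset((a, b))] += 1   # edge weight = shared likes

# ben and cruz are "connected" twice over, despite possibly never
# having met; that leap is the entire inference.
for pair, weight in sorted(edges.items(), key=lambda e: -e[1]):
    print(sorted(pair), weight)
```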
Speaker (17:34):
Clearview AI just got a nine million dollar ceiling contract from ICE. This is the company that scraped billions of photos from social media without permission. They built a facial recognition system so powerful it's actually illegal in Europe.

Speaker (17:57):
You know that photo from your friend's Instagram where you're in the background? Clearview has it, and it knows it's you.
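Clearview's internals are not public, but face search systems generally work the same way: each face is mapped to an embedding vector, and a query face is matched against its nearest stored neighbors. A toy sketch with random vectors standing in for real embeddings:

```python
# Generic face-search sketch: embeddings plus nearest-neighbor match.
# Random vectors stand in for a real face-embedding model; nothing
# here is Clearview's actual system, which is not public.
import numpy as np

rng = np.random.default_rng(0)
gallery = rng.normal(size=(1_000, 128))          # 1,000 scraped "faces"
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)
names = [f"profile_{i}" for i in range(len(gallery))]

def search(query: np.ndarray, top_k: int = 3) -> list[tuple[str, float]]:
    """Cosine similarity against the whole gallery; at scale this is
    an approximate nearest-neighbor index, not a brute-force scan."""
    q = query / np.linalg.norm(query)
    sims = gallery @ q
    best = np.argsort(sims)[::-1][:top_k]
    return [(names[i], float(sims[i])) for i in best]

# A "photo in the background" is just another query vector.
query = gallery[42] + rng.normal(scale=0.05, size=128)
print(search(query))   # profile_42 should top the list
```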
Speaker (18:06):
But the newest tool is even more invasive. Location data, not from your posts, but from your phone itself.

Speaker (18:18):
A company called Babel Street had to cancel their contract after journalists exposed they were selling location data. So ICE just bought the same capability from Penlink instead.

Speaker (18:35):
Billions of location signals every day from hundreds of millions of phones. No warrant needed. They can see where you've been, who you've been near, how long you stayed. And most disturbingly, they can look back in time.
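"Looking back in time" is less exotic than it sounds: the broker already holds the pings, so a retrospective search is just a filter over stored history. A sketch with an invented schema and sample data:

```python
# What a historical location lookback means mechanically: no new
# collection happens, the past is simply queried. Schema is invented.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Ping:
    device_ad_id: str        # the advertising ID that links everything
    lat: float
    lon: float
    seen_at: datetime

def devices_near(pings: list[Ping], lat: float, lon: float,
                 start: datetime, end: datetime,
                 box: float = 0.01) -> set[str]:
    """Every device inside a small lat/lon box during a past window."""
    return {p.device_ad_id for p in pings
            if start <= p.seen_at <= end
            and abs(p.lat - lat) < box and abs(p.lon - lon) < box}

history = [
    Ping("ad-111", 31.7601, -106.4900, datetime(2024, 3, 2, 18, 5)),
    Ping("ad-222", 31.7603, -106.4898, datetime(2024, 3, 2, 18, 7)),
    Ping("ad-333", 40.7128, -74.0060, datetime(2024, 3, 2, 18, 7)),
]
# Who attended a gathering a year ago? ad-111 and ad-222 did.
print(devices_near(history, 31.76, -106.49,
                   datetime(2024, 3, 2, 17), datetime(2024, 3, 2, 19)))
```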
Speaker (18:58):
Part six. When context dies.

Speaker (19:05):
December of twenty thirteen. Justine Sacco boards a plane to South Africa. She tweets: "Going to Africa. Hope I don't get AIDS. Haha, just kidding, I'm white."

Speaker (19:21):
It was sarcasm. A really bad joke about privilege and inequality. At the time of the tweet, she only had one hundred and seventy followers. She then turned off her phone for the eleven-hour flight. Fell asleep. When she landed, she was the number one trending topic worldwide. Hashtag Has Justine Landed Yet.

Speaker (19:48):
People were waiting at the airport to photograph her reaction. She had been fired while in the air. Death threats filled her inbox. Her life as she knew it was over. Context died at thirty thousand feet.

Speaker (20:09):
This is what happens when human communication meets algorithmic interpretation. Sarcasm becomes statement. Jokes become evidence. Mistakes become permanent.

Speaker (20:27):
In twenty seventeen, ten students had their admissions revoked from Harvard University. Someone leaked screenshots from a private Facebook group where they shared offensive memes. These kids, and they were kids, eighteen years old, thought they were being edgy in a private space. Harvard disagreed.

Speaker (20:54):
Four years of perfect grades, extracurriculars, SAT prep, gone. One hundred and forty characters can end a career. A private group chat can derail a future. A photo from five years ago can deny an insurance claim. The algorithm doesn't understand context and, strangely enough, increasingly neither do we.
Speaker (21:27):
Part seven. The authentication of everything.

Speaker (21:33):
Every platform wants to verify you now. Blue checks on Twitter, ID verification on Facebook, real names on LinkedIn. They say it's about authenticity. Fighting bots, creating trust. Well, it's not. It's about making you surveillable. An anonymous account can speak truth to power. A verified account has a home address.

Speaker (22:05):
China's social credit system seemed dystopian when it launched. Bad social score? Well, you might not be able to buy plane tickets, get a loan, or you might not be able to send your kids to good schools. And we here in the States laughed. That could never happen here.

Speaker (22:28):
Well, it's happening. We just call it different names. Background checks. Credit scores. Social media screening. Algorithmic risk assessment. Insurance evaluation. The Chinese were just honest about centralizing it.
Speaker (22:51):
A company called Fama is a major player in social media screening. They advertise that it searches ten thousand online public sources for what it calls behavioral intelligence. According to their own marketing materials, they screen for nine types of workplace misconduct, including fraud, harassment, threats, and violence. They claim ninety-nine point nine five percent accuracy.
Speaker (23:27):
And they say their AI can identify extremist symbols, violent imagery, memes, and gestures. They use what they call avatar recognition to find your accounts, even when you use different names. This is actually from their marketing materials, not speculation. Again, this is from their pitch decks.

Speaker (23:57):
Apparently another screening company analyzes what they call eleven different behaviors using a billion-profile OSINT database. They advertise the ability to spot bias threats: political, disparaging, and prejudiced speech across text and visual content.
Speaker (24:23):
AccuSourceHR combines AI-driven search tools with what they call expert human analysis from trained social anthropologists. Social anthropologists, studying your tweets like you're an undiscovered tribe.
Speaker (24:45):
And here is the number that should terrify everyone. A Sterling survey found that sixty-eight percent of employers admitted to using social media to find answers to illegal interview questions, questions they can't legally ask you. Age. Religion. Sexual orientation. Health conditions. Sixty-eight percent admitted it.

Speaker (25:18):
These companies defend themselves with the same line: it's all public data. What's public is your entire life, and they've built an industry around reading it.
Speaker (25:36):
Part eight. The deletion myth.

Speaker (25:41):
You can't delete anything. Not really. Sure, you can remove posts from your timeline, but the data persists. Backups, caches, archives, screenshots proliferate. Aggregators scrape and store.

Speaker (26:02):
The European Union has the right to be forgotten. Between twenty fourteen and twenty seventeen, two point four million people requested Google remove their information. Google approved forty-three percent. Less than half.

Speaker (26:23):
In America, we don't even have that. No federal right to deletion. No right to see what's in your file. No right to correct errors.

Speaker (26:36):
When you delete a post, here's what actually happens. The platform removes it from public view. That's all. Server logs remain. Backup systems retain copies. Law enforcement can still access them through legal requests.
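On most platforms, "delete" is a soft delete, roughly like this sketch: a visibility flag flips while the row itself persists for logs, backups, and legal process. The schema is hypothetical, not any specific platform's:

```python
# Soft-delete sketch: the "Delete" button hides a row, it doesn't
# erase it. Schema and queries are hypothetical illustrations.
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT, "
           "deleted_at TEXT)")  # deleted_at NULL = publicly visible
db.execute("INSERT INTO posts (body) VALUES ('my regrettable take')")

def delete_post(post_id: int) -> None:
    """What the button typically does: hide, don't erase."""
    db.execute("UPDATE posts SET deleted_at = ? WHERE id = ?",
               (datetime.now(timezone.utc).isoformat(), post_id))

def public_timeline() -> list:
    return db.execute(
        "SELECT body FROM posts WHERE deleted_at IS NULL").fetchall()

def legal_request() -> list:
    # No visibility filter: the row is still there for subpoenas.
    return db.execute("SELECT body, deleted_at FROM posts").fetchall()

delete_post(1)
print(public_timeline())   # [] : gone, as far as you can see
print(legal_request())     # [('my regrettable take', '<timestamp>')]
```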
Speaker (26:54):
Meanwhile, the Internet Archive captures pages. Google caches search results. Third-party monitoring services, the ones selling to HR departments and insurance companies, have already scraped and stored. Other users screenshot before you think to delete.

Speaker (27:15):
Data brokers operate on a simple principle: accumulation without expiration. Companies like LexisNexis and Thomson Reuters aggregate public records, social media, purchase histories. They create profiles on hundreds of millions of Americans. Profiles you can't see. You can't correct. You can't delete.

Speaker (27:44):
They sell access to these profiles tens of thousands of times per year to employers, insurers, landlords, anyone with a business account and a credit card.

Speaker (27:59):
The data multiplies. Every search someone runs on you generates metadata about who's interested in you. Every analysis creates derived data: risk scores, behavioral predictions, network maps. You exist in databases you've never heard of, scored by algorithms you'll never see, sold to buyers you'll never know.

Speaker (28:30):
Deletion is theatre. The data persists, and in America you have no right to even know what story it tells.
Speaker (28:44):
Part nine. The Great Quieting.

Speaker (28:51):
Something is happening to public discourse. People are self-censoring. Not because the government told them to. Because they know everything is watched. Recorded. Analyzed. Scored.

Speaker (29:08):
A Pew Research study found that seventy-two percent of Americans believe their online activities are being tracked by companies. And they are right. But it's what they do with that knowledge that matters. They become careful, calculated, boring.

Speaker (29:30):
The phrases repeat across social media. "I don't post about politics anymore." "I deleted all my party photos." "LinkedIn voice only." "I made a separate account for work." The same story. Different voices. Self-censorship as survival strategy.

Speaker (29:56):
The chilling effect is real and measurable. When Cambridge Analytica's data harvesting was exposed, Facebook lost two point eight million US users under twenty-five. Not just inactive. Gone. When people learn how their data is used, they don't just change passwords, they change behavior.

Speaker (30:22):
We are watching democracy's immune system shut down. Dissent requires the freedom to be anonymous. Innovation requires the freedom to be wrong. Growth requires the space to be foolish. When everything is permanent and searchable, nothing important gets said.

Speaker (30:51):
The result is a generation learning to perform stability rather than experience growth. They curate for an algorithmic audience that never forgets and never forgives. Everyone self-edits now. Not for friends. For the machine that's watching. And the machine is always watching.
Speaker (31:20):
Part ten. The resistance that isn't.

Speaker (31:26):
People think they are fighting back. Private accounts. Fake names. Deleted apps. Digital detoxes. It doesn't work.

Speaker (31:40):
Your phone's advertising ID connects everything. Your credit card links your purchases. Your location data tells the real story. Your contacts upload their address books with your number in them. You exist in other people's data shadows, even if you never, ever created accounts.

Speaker (32:08):
Ghost profiles. That's what Facebook calls them internally. According to leaked documents, they are profiles of people who never signed up. Built from photos others post, contact lists others upload, and relationships others document.
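Here is a sketch of how a profile like that can emerge from other people's uploads alone: the platform only ever ingests contact books, yet ends up with a node for someone who never signed up. The data and structure are invented for illustration:

```python
# Ghost-profile sketch: a node materializes from contact uploads
# alone. All names, numbers, and field names are invented.
from collections import defaultdict

# Each registered user uploads their address book: name -> phone.
uploads = {
    "ana":  {"Sam R": "+15550001", "Mom": "+15550002"},
    "ben":  {"Sammy": "+15550001", "Dentist": "+15550003"},
    "cruz": {"Sam Rivera": "+15550001"},
}

# Key ghost profiles by phone number; merge every name seen for it.
ghosts: dict[str, dict] = defaultdict(lambda: {"names": set(), "knows": set()})
for uploader, book in uploads.items():
    for name, phone in book.items():
        ghosts[phone]["names"].add(name)
        ghosts[phone]["knows"].add(uploader)

# +15550001 never created an account, yet now has a name cluster and
# a three-person social graph.
print(ghosts["+15550001"])
```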
Speaker (32:30):
Now, you of course can opt out of Facebook, but you can't opt out of being in Facebook.

Speaker (32:39):
The real resistance is political. The only real resistance is political. Laws with teeth. Regulations that bite. Rights that can't be waived in terms of service.

Speaker (32:57):
Europe has GDPR. California has CCPA. They're imperfect, but they are something. The rest of America has thoughts and prayers.

Speaker (33:12):
Federal privacy legislation has been proposed repeatedly. It never passes. The tech lobby warns it would break the internet. The security lobby says it would help terrorists. The business lobby claims economic catastrophe.

Speaker (33:31):
We can't even agree on basic transparency, whether people should know what data companies collect, whether they should see their own files. The answer is always the same: too complicated, too expensive, too dangerous. Translation: transparency would reveal how bad it really is.
Speaker (34:01):
Part eleven. The score you can't see.

Speaker (34:08):
You have a score. Multiple scores, actually. Credit score, you know that one already. But also customer lifetime value score. Insurance risk score. Employment probability score. Social influence score. Fraud likelihood score.

Speaker (34:33):
These are not conspiracy theories. These are products sold at industry conferences. Data analytics companies openly advertise predictive scoring systems. Know your customer's true value. Predict employee churn before it happens. Identify risks before they manifest.

Speaker (34:59):
HR technology firms claim their algorithms can predict with high accuracy which employees will quit, which will get sick, which will become what they call problematic.

Speaker (35:18):
How do they do this? Well, they analyze digital footprints, email patterns, social media activity, even physical workplace data when available. Badge swipes, parking times, cafeteria purchases. The privacy policies are buried in employment agreements. Page forty-seven, section three. The part no one reads. You consented without knowing what you consented to.
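A toy version of that churn scoring, built over exactly the kinds of signals the vendors list. The features, weights, and cutoff are invented; real products are opaque, which is rather the point:

```python
# Toy "employee churn" score: a hand-weighted logistic model over
# invented features. Nothing here is any vendor's actual model.
from math import exp

# (feature, weight): positive weight pushes the "will quit" score up.
WEIGHTS = {
    "late_badge_swipes_per_month": 0.15,
    "after_hours_emails_per_week": -0.05,  # read as "engaged", oddly
    "linkedin_profile_updated": 1.2,       # the classic tell
    "cafeteria_visits_per_week": -0.10,    # proxy for on-site presence
}
BIAS = -1.5

def churn_score(employee: dict[str, float]) -> float:
    """Logistic score in [0, 1]; no one will explain the cutoff."""
    z = BIAS + sum(WEIGHTS[k] * employee.get(k, 0.0) for k in WEIGHTS)
    return 1 / (1 + exp(-z))

worker = {"late_badge_swipes_per_month": 4,
          "after_hours_emails_per_week": 2,
          "linkedin_profile_updated": 1,
          "cafeteria_visits_per_week": 1}
print(round(churn_score(worker), 3))   # an opaque number that decides things
```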
Speaker (35:55):
Part twelve. Tomorrow's crime, today's punishment.

Speaker (36:03):
Predictive policing was supposed to stop crime before it happened. Instead, it criminalizes probability.

Speaker (36:12):
Chicago's heat list algorithm, officially called the Strategic Subject List, identified people likely to be involved in shootings, either as victims or perpetrators. The algorithm did not distinguish. If you made the list, police visited your home, knocked on your door, told you they were watching.

Speaker (36:44):
The algorithm considered arrests, not convictions. Social networks. Geography. Age. Being young, black, and living in certain neighborhoods was enough to score high.
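A sketch of that logic as described: inputs are arrests (not convictions), social ties, geography, and age. The weights are invented, since the real model was never fully public, but the failure mode is visible on the toy's face: the score moves without any crime occurring.

```python
# Heat-list-style score over the inputs named above. Weights are
# invented; the real Strategic Subject List model was not public.
def heat_score(arrests: int, flagged_associates: int,
               neighborhood_flagged: bool, age: int) -> float:
    score = 0.0
    score += 10 * arrests               # arrests, regardless of outcome
    score += 6 * flagged_associates     # guilt by network
    score += 15 if neighborhood_flagged else 0   # guilt by geography
    score += 12 if 18 <= age <= 25 else 0        # guilt by birthday
    return score

# Never convicted of anything, young, wrong zip code, cousin on the
# list: high score, knock on the door.
print(heat_score(arrests=1, flagged_associates=2,
                 neighborhood_flagged=True, age=20))   # 49.0
```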
Speaker (37:01):
The program was discontinued after a RAND Corporation study found it ineffective, and civil rights groups proved it was algorithmic racial profiling. But the concept didn't die. It evolved. Now it's person-based predictive analytics. It's the same idea, but better branding.

Speaker (37:27):
Police departments nationwide use variations. ICE uses it as well, and so does Border Patrol. Your social media feeds these systems. That protest you attended, that article you shared, that friend who got arrested: all became inputs to algorithms calculating threat probability.

Speaker (37:54):
The companies building these systems claim accuracy rates above eighty-five percent. They say the algorithms identify patterns humans miss. What patterns? Well, they won't say. Trade secrets. Proprietary methods. The algorithm generates a score. The score generates an action, and no one knows exactly why. Not even the people using it.
Speaker (38:30):
Part thirteen. The future that's already here.

Speaker (38:37):
William Gibson said the future is already here, it's just unevenly distributed. He was right, but he was also wrong. The surveillance future isn't unevenly distributed. It's universal. We all live in it. We just experience it differently based on our scores.

Speaker (39:03):
If your scores are good, employed, insured, documented, compliant, you might never notice the cage. The doors are open for you. Services appear. Life just feels frictionless.

Speaker (39:19):
If your scores are bad, every interaction is friction. Every application is denied. Every benefit requires proof. Every movement is questioned.

Speaker (39:34):
The architecture of surveillance is complete. The only question is how it will be used in the future. ICE's social media monitoring program isn't the beginning. It's the formalization of what's already happening. The government is just catching up to what corporations have been doing for a decade.

Speaker (39:59):
Your posts are being watched, your patterns analyzed, your future predicted, your worth calculated. Right now, as you listen to this podcast, an algorithm somewhere is updating your file. You can't see it. You can't correct it. You can't escape it. But you should know it exists.

Speaker (40:28):
Because in a world where everything is remembered and nothing is forgiven, the most radical act isn't revolution. It's remembering that you are more than the sum of your data points. You are a human being. Complex. Contradictory. Capable of change. The algorithm doesn't know that. But you do. And maybe, just maybe, that's enough to start with.