May 6, 2025 • 13 mins
From early internet horror stories about stolen identities to contemporary fears about deepfake technology, we examine how identity theft legends have evolved with technology. The episode traces the progression from simple credit card fraud tales to complex narratives about AI-generated doubles, revealing how these stories reflect our deepening anxiety about digital identity and authenticity. Through interviews with cybersecurity experts and folklore scholars, we uncover how these legends have actually shaped security policies and online behavior, while exploring their role in processing collective trauma from real-world data breaches.
Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Urban Legends Decoded. Urban Legends Decoded.

Speaker 2 (00:15):
Every urban legend starts with someone saying, this really happened
to a friend of mine. But what if I told you
the most interesting part isn't whether it's true, but why
we desperately need it to be? I'm Ryan Carter,
and this is Urban Legends Decoded, where we dig beneath
the surface of the stories that spread like wildfire to

(00:36):
discover what they reveal about who we are and what
keeps us awake at night. So the evolution of identity
theft legends, right, it's fascinating how it all began with
those early Internet horror stories about stolen credit cards. Yeah.

Speaker 1 (00:50):
Absolutely, it's like folklore for the digital age. Remember those
chain emails warning about phishing scams? They were like modern
day campfire.

Speaker 2 (00:56):
Tales exactly, and they tapped into real anxieties. Credit card
fraud was like a tangible fear even back then, but
the narratives were often exaggerated, embellished for dramatic effect.

Speaker 1 (01:08):
Oh for sure. It's the classic folklore pattern, right, a
kernel of truth amplified and distorted through retelling and social anxieties.
But those early stories laid the groundwork for what came next.

Speaker 2 (01:21):
Yeah, what do you mean?

Speaker 1 (01:22):
Well, think about it. Those credit card tales evolved into
identity cloning narratives, imposters on social media, criminals stealing identities
to evade the law. It's like a digital doppelganger, a
phantom self out there wreaking havoc.

Speaker 2 (01:36):
I see, interesting. And the Wikipedia article highlights this criminal
identity theft aspect. Criminals using stolen identities at the point
of arrest, court summonses arriving for crimes you didn't commit.
It's a nightmare scenario.

Speaker 1 (01:49):
Total nightmare. And the legal system often struggles to keep
up right, clearing your name, proving you're you. It can
be a bureaucratic labyrinth, definitely.

Speaker 2 (02:00):
And this whole idea of a stolen identity, it's deeply unsettling.
It like violates our sense of self, our control over
our own narrative, which brings us to the AI era
and deep.

Speaker 1 (02:12):
Fakes, right, deep fakes take this to a whole new level.
It's not just your information being stolen, it's your likeness,
your face, your voice synthesized and weaponized against you. It's
like science fiction made reality exactly.

Speaker 2 (02:28):
And the article on deep fakes it mentions how they're
used for blackmail, fraud, even political manipulation. It's like a
disinformation wildfire waiting.

Speaker 1 (02:37):
To happen, definitely, And the speed at which this technology
is evolving is concerning. It's getting harder and harder to
distinguish real from fake. It undermines trust in everything we
see and.

Speaker 2 (02:50):
Hear, absolutely, and that erosion of trust that's the real
danger here. It's not just about individual victims. It's about
the fabric of society reality.

Speaker 1 (03:01):
Yeah, and the data breaches they're like fuel for this fire.
Every breach feeds the deep fake machine with more data,
more raw material for creating these digital doppelgangers. It's a
vicious cycle, it is.

Speaker 2 (03:15):
And the Wikipedia article details how breaches exposed millions of records,
social security numbers, personal identifiers. It's like handing the keys
to your life to a stranger, right.

Speaker 1 (03:24):
And then there's the psychological trauma, the feeling of violation,
the loss of control. It's not just about financial damage.
It's about the deep seated fear of being erased, replaced
by a digital phantom.

Speaker 2 (03:39):
Yeah. And I find it particularly disturbing how children are targeted.
The article mentions child identity theft, digital kidnapping. It's like
stealing their future before they even have a chance to
live it.

Speaker 1 (03:52):
It's horrifying, and it underscores how vulnerable we are in
this digital age. Our identities, our very selves, are increasingly fragmented,
scattered across the digital landscape, ripe for exploitation. It's a
challenge we have to confront head on.

Speaker 2 (04:07):
So we were talking about the deeply unsettling nature of
identity theft, how it violates our sense of self right.

Speaker 1 (04:14):
And how it's evolved from those almost quaint early Internet
stories about stolen credit cards.

Speaker 2 (04:21):
Yeah, those chain emails almost laughable now, but they really
did tap into a primal fear.

Speaker 1 (04:28):
Exactly, and that fear has only intensified, hasn't it. As
the technology has advanced, so have the methods of theft
and the potential consequences. I mean, think about the legal ramifications.

Speaker 2 (04:40):
Oh absolutely. The Wikipedia article touched on that, the bureaucratic
labyrinth of clearing your name when the system itself is struggling.

Speaker 3 (04:46):
To catch up.

Speaker 1 (04:47):
It's a nightmare scenario. Court summonses for crimes you
didn't commit, trying to prove you're you. It's like something
out of Kafka.

Speaker 3 (04:55):
It is.

Speaker 2 (04:56):
It's the ultimate loss of control. And then you layer
on deep.

Speaker 1 (05:01):
Fakes, which takes it to a whole other dimension, not
just your information, but your image, your face, your voice weaponized.
It's chilling totally.

Speaker 2 (05:11):
And the uses: blackmail, fraud, political manipulation. The article
highlights how deep fakes can spread disinformation like wildfire, and the.

Speaker 1 (05:19):
Speed at which the technology is evolving is terrifying. It's
getting harder and harder to distinguish real from fake, which.

Speaker 2 (05:27):
Erodes trust in everything. It's not just about individual victims anymore, right,
it's societal. It's about our shared reality precisely.

Speaker 1 (05:35):
And the data breaches they're the fuel on the fire.
Every breach feeds the deep fake machine with more raw material,
creating more of these digital doppelgangers.

Speaker 2 (05:44):
It's a vicious cycle. Millions of records exposed, social security numbers,
personal identifiers. It's like handing over the keys to your life.

Speaker 1 (05:53):
And the psychological trauma, that sense of violation, it's profound,
not just financial damage, but the fear of being erased, replaced.

Speaker 2 (06:04):
And the fact that children are targeted digital kidnapping, stealing
their future. That's particularly horrifying.

Speaker 1 (06:12):
I think it underscores just how vulnerable we are in
this digital age. Our identities are fragmented, scattered across this
digital landscape, ripe for exploitation. It's a challenge we absolutely
have to confront head on.

Speaker 2 (06:27):
So this deep fake phenomenon, it's chilling, right, the idea
of your image, your voice weaponized against.

Speaker 1 (06:35):
You, absolutely, and the implications extend far beyond individual harm.
Think about the erosion of trust in well, everything, news, evidence,
even our own perceptions.

Speaker 2 (06:48):
Yeah, it's like, how can you even know what's real anymore?
The technology's evolving so rapidly it's outpacing our ability to
even comprehend the consequences, let alone regulate it.

Speaker 1 (06:57):
Yeah. The article mentions the EU AI Act, the GDPR, all these
frameworks struggling to keep up with this digital wildfire.

Speaker 2 (07:04):
And the data breaches. Right, they're like pouring gasoline on
that fire. Every breach feeds the deep fake machine with
more raw material. It's a terrifying cycle, it is.

Speaker 1 (07:14):
And the psychological impact that sense of violation being erased
replaced by a digital phantom. Oh, that's a deep wound.

Speaker 3 (07:24):
It really is.

Speaker 2 (07:25):
The article highlights the trauma the fear. It's not just
about financial damage, it's about our very sense of self exactly.

Speaker 1 (07:35):
And then there's the political dimension: disinformation, micro-targeting.
The potential to destabilize democracies is immense. Huge.

Speaker 2 (07:45):
The examples given are, I mean, from world leaders to celebrities,
no one is immune. Zelensky, Putin, even Pope Francis in
a puffer jacket. It's absurd and yet deeply unsettling.

Speaker 1 (07:58):
It speaks to the power of these synthetic realities. They
can bypass our skepticism, tap into our existing biases, and
the consequences can be devastating.

Speaker 2 (08:08):
The article mentions how deep fakes can actually reinforce pre
existing beliefs, which is counterintuitive almost. You'd think
they'd create more doubt, but.

Speaker 1 (08:18):
It's more insidious than that. It's not about creating new beliefs,
it's about solidifying existing ones, creating echo chambers of yes,
manufactured reality.

Speaker 3 (08:27):
Yeah.

Speaker 2 (08:28):
And what about the detection methods. The article talks about algorithms,
light reflections in eyes, even just asking someone to turn
sideways on a video call.

Speaker 1 (08:37):
It's a constant arms race. As detection methods improve, so
does the technology creating the deep fakes. It's like a
moving target, it is.

Speaker 2 (08:49):
And the Deep Fake Detection challenge, the winning model was
only sixty five percent accurate. Humans were slightly better, but
that's not exactly reassuring, not at all.
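The light-reflections-in-eyes check the hosts mention comes from real forensics work on GAN-generated faces: both corneas reflect the same light source, so each eye's specular highlight should sit at roughly the same offset from its iris center, and synthesized faces often break that symmetry. A minimal sketch of the idea (the 0.3 threshold and the hand-supplied highlight coordinates are assumptions for illustration, not a production detector):

```python
import math

def reflection_mismatch(left_highlight, right_highlight, iris_radius, threshold=0.3):
    """Toy heuristic: compare each eye's corneal specular highlight,
    given as an (x, y) pixel offset from that eye's iris center.
    In a genuine photo both eyes reflect the same light source, so the
    normalized offsets should roughly agree; a large gap is suspicious.
    iris_radius normalizes for face scale in the image."""
    lx, ly = (c / iris_radius for c in left_highlight)
    rx, ry = (c / iris_radius for c in right_highlight)
    distance = math.hypot(lx - rx, ly - ry)
    return distance > threshold, distance

# Highlights at nearly the same relative offset: consistent, not flagged
flag, _ = reflection_mismatch((3.0, -2.0), (3.2, -1.9), iris_radius=20.0)
print(flag)  # False

# Highlights on opposite sides of each iris: inconsistent, flagged
flag, _ = reflection_mismatch((3.0, -2.0), (-6.0, 4.0), iris_radius=20.0)
print(flag)  # True
```

In a real pipeline the highlight positions would come from face landmarking and thresholding the cornea region; the arms-race point stands, since newer generators can simply learn to render consistent reflections.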

Speaker 1 (08:58):
It underscores the difficulty in establishing truth in this
increasingly synthetic world. And then there's the legislative side. The
laws are scrambling to catch up.

Speaker 2 (09:09):
Right, the California laws, the EU AI Act, they're all grappling
with this. How do you regulate something that's constantly evolving,
that blurs the lines between real and fake speech and manipulation.

Speaker 3 (09:23):
It's a massive challenge, and the ethical questions. The potential
for misuse is just staggering, from revenge porn to political sabotage.
The possibilities are chilling, deeply disturbing.

Speaker 1 (09:40):
And the fact that children are being targeted, digital kidnapping.
It's horrifying.

Speaker 2 (09:45):
It is, and it highlights a larger issue our vulnerability
in this digital age. Our identities are fragmented, scattered across
this digital landscape, ripe for exploitation.

Speaker 1 (09:59):
We need to be having these conversations. We need to
be educating ourselves and others about the dangers and the
potential solutions, media literacy, critical thinking. These are crucial tools
in navigating this. It's a challenge we have to
confront head on.

Speaker 2 (10:19):
There's no other way. So we're delving into the world of
deep fakes. Huh. It's fascinating and a little unnerving how
quickly this technology has evolved.

Speaker 1 (10:28):
It really is. From those early, almost clunky examples to
what we're seeing now. It's a huge leap, and the
implications are well pretty far reaching.

Speaker 2 (10:38):
Yeah, absolutely, the sheer range of uses, from entertainment.

Speaker 1 (10:41):
Stuff like that AGT act with Simon Cowell, oh.

Speaker 2 (10:45):
Yeah, metaphysic wild stuff, to more malicious applications.

Speaker 1 (10:52):
The political disinformation campaigns, the fake Zelensky video for example, chilling.

Speaker 2 (10:56):
Definitely, and then there's the whole realm of scams,
which seems to be exploding.

Speaker 1 (11:01):
Exploding is right. The article mentions billions in losses. Billions,
and the methods are getting more sophisticated all the time.

Speaker 2 (11:09):
It's like an arms race, isn't it. The technology develops
then the detection methods try to catch up.

Speaker 1 (11:14):
And then the tech evolves again, a vicious cycle. The
deep fake detection challenge even the winning model was only
sixty five percent.

Speaker 2 (11:21):
Accurate, right, and humans weren't much better, which is not
exactly reassuring, not at all.

Speaker 1 (11:28):
It makes you wonder, how do we establish truth in
this increasingly synthetic world.

Speaker 2 (11:33):
It's a tough question. The legal frameworks are struggling to
keep pace, the EUAI Act, the California laws, they're all
grappling with this.

Speaker 1 (11:42):
It's a moving target. How do you regulate something that's
constantly evolving, that blurs the lines between real and fake
speech and manipulation exactly.

Speaker 2 (11:52):
And then there are the ethical questions, which are even thornier.
The potential for misuse is staggering, from.

Speaker 1 (12:03):
Which sadly was one of the earliest applications.

Speaker 2 (12:07):
To political sabotage, to financial fraud. It's a minefield.
And the fact that children are being targeted with digital
kidnapping and such, it's horrifying.

Speaker 1 (12:18):
It really is. It underscores just how vulnerable we are
in this digital age. Our identities, our images, our voices
scattered across this digital landscape, ripe for exploitation. It's a
challenge we absolutely have to confront head on. The next
time someone shares a story that sounds too perfectly terrifying
to be true, remember, they're not just passing along information,

(12:40):
they're sharing a piece of our collective unconscious. These legends
survive because they speak to something real, even when the
facts don't. Until next time, keep questioning not just what
we believe, but why we need to believe it. From

(13:01):
a stony La Saga and As by Danna and Land
from Sada

© 2025 iHeartMedia, Inc.