Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
I learned about the story from a reporter named Elizabeth
Hernandez at the Denver Post. She wrote this piece in
August about a murder that supposedly happened in Littleton, Colorado.
Speaker 2 (00:23):
On July four, twenty fourteen, just before noon, the Littleton
Police Department made a decisive move. Harrison, twenty one, a
college student and part time bartender, had been living a
life of turmoil and confusion. Harrison admitted to the police
that he had been in a secret sexual relationship with
his stepfather for the past two years.
Speaker 1 (00:42):
There was this YouTube channel that made this video and
it was a pretty wild case.
Speaker 2 (00:48):
Welcome to True Crime Case Files. Today we uncover the
tragic and complex story of Richard Engelbert, a successful real
estate agent whose hidden life and secrets led to his
brutal murder.
Speaker 1 (01:00):
And Elizabeth Hernandez's editor told her about these emails that the newsroom had been receiving, all linking to this video, all talking about this weird murder. And her editor said, this seems like it's a story made for you. And she read these emails, and there were some that just said, you should check out this video, this is weird. But there were others that were mad at the Denver
(01:22):
Post, that were saying, why aren't you covering this? How could you miss such a big, horrific case?
Speaker 3 (01:30):
It makes sense that people would be upset. This video had been seen almost two million times, and frankly, it was embarrassing that the story of Richard Engelbert, this grisly murder, hadn't been covered by the local paper. How is it that an independent YouTube channel was doing a better job of investigating than professional reporters? Well, there
(01:51):
was a reason.
Speaker 1 (01:53):
It was pretty quickly clear that the story was made up, that it was invented using AI, and that no such murder had ever happened.
Speaker 3 (02:02):
I talked to Henry Larson, who wrote about this not-so-true true crime phenomenon for 404 Media.
Speaker 1 (02:08):
My name is Henry Larson. I'm a reporter, and I typically cover criminal justice. This is not that at all. It's a much weirder story about fake crimes.
Speaker 3 (02:19):
The channel was called True Crime Case Files and it
had about one hundred thousand subscribers and tons of videos.
Speaker 1 (02:26):
The channel itself seemed to have been making dozens and dozens of similar fake crime videos, pumping them out, and it seemed like people were buying into these fake crimes, thinking they were real.
Speaker 3 (02:45):
From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas.
Speaker 4 (03:22):
Goodbye.
Speaker 3 (03:26):
Henry started looking at the YouTube channel and he noticed something.
Speaker 1 (03:31):
A lot of videos on the channel were pretty perverse, sexualized.
Speaker 3 (03:37):
Then again, so are a lot of true crime stories.
These ones just always had a particular extra element of drama.
Speaker 1 (03:44):
That would usually have something to do with someone in a position of power murdering or taking advantage of someone who had less power than them, like a sheriff and a secretary, or a teacher and a student, or a parent and their stepchild.
Speaker 3 (03:58):
There were dozens of stories with that same setup, and just as an example, let me read you some of these titles here. So one of them goes 'Judge Beats College Student to Death After Secret Gay Affair Ends in Scandal.' Then 'Couple's Wife Swap Experiment Ends in Obsession and Brutal Murders,' or 'Cheating Husband Murders Loyal Wife and Claims He Was
(04:21):
on an Acid Trip.' And then, of course, there's the one the reporter at the Denver Post got the emails about: 'Husband's Secret Gay Love Affair with Stepson Ends in Grisly Murder.' And whatever you're imagining the story is like, you're probably right.
Speaker 1 (04:39):
I will say that they got progressively more scandalous, clickbaity, sexual over time. When this channel first started making these videos, they were a little tamer.
Speaker 3 (04:51):
The video that went viral and alerted the Denver Post to this fake channel was definitely from the era after the channel had gotten much more scandalous. It told the story of Richard Engelbert, a real estate agent who lived a seemingly perfect life but was having a secret sexual affair with his stepson.
Speaker 2 (05:10):
Richard kept this relationship hidden from everyone, fearing it would
ruin his reputation and career. In addition to this, he
continued to meet other men for sex through his work,
using the homes he showed as secret meeting places.
Speaker 3 (05:22):
The whole crime took place in Littleton, Colorado, and the video showed pictures of these generic looking rows of suburban homes. And for anyone who actually lived there, it was obvious that this was not Littleton, but for people who don't
live there, it might have been convincing. All these videos
look kind of like a lower budget version of something
(05:43):
you might see on the Hallmark Channel. They're usually around
twenty five minutes long. They're narrated by a host with
this authoritative radio voice, and there's photos of suburban living rooms,
and then there's the setup. It introduces the victim or
the witnesses or the perpetrator with a plausible looking photograph.
You'll see a smiling businessman with a blue suit and
(06:04):
bleach white teeth, or an office worker with blonde hair
and dangling earrings, and then it'll cut to a photo
of police cars parked outside the crime scene. And the stories
are on par with a lot of true crime stuff
you might see on TV. Was it obvious to you
that this stuff was AI generated?
Speaker 1 (06:23):
It was pretty clear to me that the narrator was an AI generated voice, and the photos were very weird. Everyone had impeccable veneers in the headshots that were generated, and they looked very glossy and kind of plasticky. And so that made me initially think, oh, okay, this is pretty obvious, right,
(06:43):
this has been AI generated. But then when you go into the comments, there might be a couple people raising a red flag, saying, I don't know about this, I can't find any information about this online. But they were drowned out by people finding specific details from the videos and saying, I can't believe that the police missed this, or, I think there's this theory that was overlooked,
(07:06):
or, here's this other comment I have about, like, a material fact that was presented in the video. And that, at least to me, showed that people were buying into this. And combined with the people who were mad at the Denver Post for not reporting the story, it makes me think that this was compelling to at least a fair number of this channel's viewers. Generally, people were interacting with each
(07:31):
other, talking about the details of this fake case as if it was real.
Speaker 3 (07:35):
Okay, just to recap: the images look AI generated, the voice is also AI generated. Okay, it's fake. And, huh, people are falling for it. Not surprising; this is the Internet. And that is where most of us would stop. We'd close the window and go back to our day.
But Henry didn't do that. He kept digging.
Speaker 1 (07:57):
I got really intrigued about the kind of person that
would make this video. So I found the contact email
associated with the YouTube channel and I reached out.
Speaker 3 (08:09):
So hold on. This person, they had just left their contact email just on their YouTube page?
Speaker 1 (08:14):
Yeah, they had a contact email.
Speaker 3 (08:16):
Not every YouTuber does that. That's actually kind of unusual.
Why do you think they did that? I don't know.
Speaker 1 (08:23):
I think, well, I guess I have a theory. I think a lot of the reason why these videos were being made by this person was because they wanted to be a filmmaker and get some attention.
Speaker 3 (08:35):
So Henry gave him some of that attention. Now, just
a note in case you want to go and read
about this. The guy behind the channel, his name is
out there now if you want to look for it.
But back when Henry wrote the article and when we
did the interview, he was referring to him anonymously with
the name Paul. So that's the name that you'll hear
in this episode. So, yeah, what's his deal? Why'd
(08:57):
he start all this up? What did he tell you?
Speaker 1 (08:59):
He graduated from college right before the pandemic, and when the pandemic hit, he was living with his parents, and, you know, a lot of families did all sorts of traditions around this time. Together, Paul and his family decided they were going to watch Dateline, okay? And they watched a lot of Dateline. He told me he really didn't
(09:22):
like the show all that much, but he taught himself the formula. He taught himself the process of the procedural true crime genre: here's the characters, here's the grisly murder, and this lengthy investigation presenting each of the suspects in turn, and then eventually a trial and some resolution for the victims.
(09:44):
This is also around the time that ChatGPT is really rolling out publicly, and he watches this Twitch live stream of AI generated Seinfeld episodes.
Speaker 3 (09:57):
Hey, Yvonne, did you hear about that new restaurant around
the corner? I remember that. Yeah, they're supposed to have
the best food in town. Yeah, I forgot all about that.
Speaker 4 (10:11):
I heard they just opened up, and I'm dying to
try it, but it looks so expensive. Maybe we can
make a deal with the owner, you know, trade them
some of our jokes for a free meal.
Speaker 1 (10:24):
He loved it. He got really interested. He was like, oh, this is the thing. This is weird and new and probably costs nothing to make.
Speaker 3 (10:35):
So Paul decides to try his hand at AI generated content,
but before he lands on true crime, he tries a
different genre.
Speaker 1 (10:43):
First, he comes up with the bones of a plot for a four-to-five-minute, Hallmark-style rom-com. Crucially,
he labels them as AI generated. He calls this channel
AI Film Studio. He thinks they're very good. We disagree on that. I didn't think they were very good when I watched them.
Speaker 3 (11:02):
Did you tell him that?
Speaker 1 (11:03):
I asked him if he thought they were good, and he said yes, and I said okay. And they bomb. The videos do so terribly that none of them cross one hundred views. And so for some, that might have been a lesson that this technology isn't ready, that there's something weird about this format, that you need some more production value even if you want to use AI in media.
(11:26):
The lesson he learns is: don't tell people it's made with AI. Don't tell people; let them think it's real.
Speaker 3 (11:31):
The other decision he makes is to focus on true crime,
and in January of twenty twenty four, he starts the
True Crime Case Files channel and he tests this idea
he had that maybe one of the things that was
holding him back was that he was telling people that
these videos were AI generated. So he stops doing that.
I mean, it turns out that maybe he wasn't wrong,
(11:55):
because his videos started to take off after that.
Speaker 1 (11:57):
Yeah, in all honesty, he was a success, at least for a series of months. His channel was working. It was
working so well that other people noticed and started copying
his style, his formats, even the exact titles of some
of his videos.
Speaker 3 (12:13):
And now you can find dozens of channels that are
all posting similar fake true crime content. There are variations in
different niches here and there, but it's all basically the
same format, AI generated images accompanied by AI generated voices
reading stories that are also probably AI generated.
Speaker 1 (12:32):
He saw himself at the forefront of this gold rush, this new medium of entertainment, and here he is, experimenting, trying something. And so I think part of the allure of making AI films for him was the fact that it was AI, the fact that this was a new technology. He told me he does consider himself to be a filmmaker, one without a studio or expensive production costs.
Speaker 3 (12:56):
Or in this case, I guess a camera.
Speaker 1 (12:59):
Or a camera, yeah, or a microphone.
Speaker 3 (13:02):
Okay. So how much of a gold rush are we talking, though?
Speaker 3 (13:06):
It's a little hard to say. A few times Henry
did try asking him how much he was making, but
he couldn't get a straight answer.
Speaker 1 (13:14):
He never told me exactly how much money he made.
I did learn that this was the only thing he
was working on, and as far as I know, it
was his only source of income.
Speaker 3 (13:25):
All right, let's just stick with the facts for a second.
This is all fake and nobody's denying that the stories
are made up. The voice isn't real, the images aren't real. Yes, people, a lot of people actually, are being fooled. But does
fake mean bad? Is there anything wrong with what he's
(13:45):
doing here? We'll get into that after the break. So
true crime. Listen, I don't know how you feel about
true crime. I've got some feelings. I think a lot
(14:08):
of people have some feelings about true crime. But let's
be real, it is an extremely popular genre of, I'm gonna say, content. I'm gonna be as neutral as I can here. Not a fan personally. But why did he pick true crime, do you think?
Speaker 1 (14:26):
He picked true crime because he knew the format. But Paul
has a lot of criticisms of true crime, and this was actually one of the ways he justified his work.
He said, what I'm doing making these AI videos is
actually better than real true crime.
Speaker 3 (14:43):
It's better.
Speaker 2 (14:44):
It's better.
Speaker 1 (14:45):
It's better because there are no real victims involved. He got
to make his videos and his money and no one suffered.
Speaker 3 (14:53):
How did you feel about that?
Speaker 1 (14:55):
I thought he was wrong. I still think he's wrong, and I told him that. I mean, there's the reporter line, which is just: injecting fake information into the world is bad. But there's also the component that true crime as a medium has plenty of flaws that are not just about the specific victim of a crime, but also about our societal
(15:16):
understandings of criminal justice in general. We turn on the evening news on our local TV broadcaster and it leads with a murder in a neighborhood near us, and we think that crime is going up. And we listen to a true crime podcast about a serial killer, and we're a little more nervous around our neighbors. And there's a
(15:38):
real societal impact, one that a lot of researchers have looked into and analyzed, of crime media affecting our perceptions of actual crime.
Speaker 5 (15:47):
Right.
Speaker 1 (15:48):
So, I think his work in many ways was committing
some of the same sins as the true crime genre
in general.
Speaker 2 (15:55):
Right.
Speaker 3 (15:55):
What you're talking about here is research, essentially, that shows that this stuff also makes us feel like our neighborhood is not safe, when probably your neighborhood's pretty safe. You live in the suburbs, you're fine, it's gonna be okay. But we watch a lot of this stuff, and maybe it's fun to watch, you know, when you just want to turn your brain off at the end of
(16:16):
the day, which I think is what a lot of people do with true crime. Look, I get it. But in the back of your mind, it also makes you think that the world around you is more dangerous than it actually is, and what does that do to you? But Paul, the AI true crime creator, had a different point of view on what he was doing.
Speaker 1 (16:36):
He said what he was making is a form of abstract art. He really liked the structural touches that he would introduce into these videos, and at several points he basically said, you know, I make these stories so ludicrous, so insane, that people should just assume that they're fake, that they didn't really happen. And if they don't get it,
(16:59):
that's on them.
Speaker 3 (17:00):
Really? So it was, like, an absurdist art form?
Speaker 1 (17:03):
Yeah, he said what he's doing is absurdist art, and he doesn't regret any of it.
Speaker 5 (17:07):
Yeah, I mean like his larger artistic message is lost
on me because it sounds like complete BS.
Speaker 6 (17:14):
I was gonna say, I'm gonna call BS on the
AI excuse.
Speaker 3 (17:18):
You might have been wondering what actual true crime podcasters
think about all this stuff. Well you just heard from
two of them. Hi, I'm Bob Motta. And my name is Lauren Bright Pacheco. Bob used to be a
defense attorney. Then he made a podcast telling the story
of how his father defended the notorious serial killer John
Wayne Gacy, and since then he's been a true crime podcaster.
(17:41):
Lauren is a former television producer, but now she's focused
on true crime audio. She produced the podcast Happy Face,
which was about another serial killer, and that's been adapted
into a show on Paramount Plus. The two of them
work together on a true crime podcast called Murder on
Songbird Road. But I think what might be happening here, and
I'm curious to hear what you think about this, is
(18:02):
this may be exposing something about the audience for true crime.
And I think this is one of the things about this person who we're calling Paul, you know, this point he's trying to make, where he's saying, well, the stuff that he's making, this AI generated stuff, is actually better than real true crime because no actual victims
(18:24):
are being exploited.
Speaker 6 (18:25):
That would work with the assumption that the intention of
all true crime creators is to exploit victims. And that's
the antithesis of my intention of Bob's intention. And I
would ask Paul, since he has wrapped this very I
(18:46):
think disingenuous scam up with the I'm teaching everybody a lesson, Bo,
what he's doing with the profits?
Speaker 5 (18:55):
Mm hmm. Like, are you sending all the proceeds to, like, victims' organizations? Like, are you doing good with it to teach your lesson? Or are you just using that as a convenient excuse as to why you're creating this stuff under the guise that it's actually real? But look, I'm going to play devil's advocate on Paul's behalf here, true criminal defense attorney here. Yeah right, I mean,
(19:18):
I can't help it. So, I mean, there is a large chunk of creators out there in the true crime realm that are just out there peddling the violence of the crimes. They are not taking into consideration the victims themselves, their families that have to live with these tragedies for generations beyond when
(19:41):
the crime took place. They're merely just retelling a story that's not their story to tell, you know, really just in order to make money. And look, I have a lot of friends that do it, you know, that aren't deep divers, that really just kind of sit there and reread. They'll watch a Discovery ID thing and they'll go write up an episode about
(20:01):
it and just...
Speaker 6 (20:02):
Speak about it with authority.
Speaker 5 (20:04):
Right, you know. So it's like, I don't want to offend any of the people that do that out there, but I think that Paul might have a point as to those types of creators, because, really, at the end of the day, what are they bringing to the table?
Speaker 3 (20:18):
I'm hearing you two really kind of distinguishing yourselves from a lot of other people who make true crime podcasts, even though you all may, you know, be situated in the same podcast listings, right, even in the same genre. In this way, we could maybe think of what you two do as creating really carefully prepared organic meals, while
(20:40):
a lot of people are very happy with a bag of Cheetos.
Speaker 6 (20:45):
Right.
Speaker 5 (20:45):
I mean, like a lot of people just do like
kind of that pulp fiction quick hitter. I want to
rip open that bag of Cheetos. I want to dive in.
And it's arguably a much bigger market.
Speaker 3 (20:59):
But remember, Paul talked about his work as holding up a mirror to the industry of true crime, which I think is actually pretty interesting as a concept. You know, show the audience truly what it is that they're looking at. But that made me wonder about the response of that audience when he was talking about his work as absurdist art.
(21:21):
He's got to be looking at the comments. He's seeing the same comments you're seeing: a lot of people, the majority, the bulk of the comments, clearly are people who do not understand that this is fake.
Speaker 1 (21:35):
Yeah, in large part also because he would moderate his own comments and delete comments that would call him out for his lies.
Speaker 3 (21:42):
Did he tell you that.
Speaker 1 (21:43):
Yeah, he said that he would go in and cut
comments that were negative or said that what he was
making was fake. He didn't get all of them, but
he said that he would try and get as many
as he could.
Speaker 3 (21:54):
Yo, hold on. Okay, I'm sorry, man. So at that point, saying 'it's on you if you don't get it,' you are manufacturing something that's fake, and you're also manufacturing an echo chamber of other people, of basically social proof. You know,
it's like walking into a room and everybody says, wow,
look at this amazing thing. You think, well, everybody else
(22:15):
thinks this thing is amazing. I suppose if I think
it's not amazing, something's probably wrong with me. So if
you watch the video and look at the comments, everybody else seems to think it's real, you're gonna feel kind of weird if you don't also go along with that. Also, like, maybe it's just me. Everybody else seems to think it's real. Yeah,
that's incredible. So I don't know about you, but maybe
(22:38):
this does change things a little. Are we trying to alert people to the danger of harmful entertainment, or are we just trying to make money? Or are those two things totally compatible? Maybe it's just recognizing that people wanted Cheetos, and Paul figured out that he could provide people with those Cheetos really easily, really quickly, and in massive quantities.
(23:00):
He started picking up the pace, and he was publishing videos every day or so. Before long, he had put out over one hundred and fifty of these things, and people were watching them. Some people are being fooled. But what if some people don't care? What if people just want Paul's Cheetos? Is that so bad?
Speaker 6 (23:18):
I'll go further with that metaphor. You are what you eat. And are you putting ideas out there into the world? Are you having an overspill into real life, in which these crimes could become real?
Speaker 3 (23:37):
I mean, if I'm picking up what you're saying here, do you think that there's some worry that this AI generated stuff is becoming more prevalent, that it's getting more and more salacious, the details get more and more outlandish, but also seem to get more listens, more views, more clicks, and that this could start to affect
(23:58):
how people perceive actual cases, or even just the news, and give them ideas?
Speaker 5 (24:05):
Yeah, I mean, what's to say that AI created, you know, true crime isn't going to come up with concepts of ways to really effectively evade...
Speaker 6 (24:17):
You know, I'll tell you another issue that I have
with it. If you're leaning heavily into sensationalizing what you
are claiming is real trauma suffered by real people, but
it's all fake and all made up, you are pulling
(24:37):
out emotion and concern from real people. And when you
are exhausting that reservoir, they're going to have a lot
less capacity to care about real trauma and crime for
other people.
Speaker 3 (24:57):
So... and that's a real thing?
Speaker 5 (24:59):
That's a real thing. I mean, we see it especially
in what we do Dexter in terms of people become
emotionally invested in cases.
Speaker 6 (25:08):
And I think it's kind of like one of the unforeseen side effects of Botox that has been studied extensively: when people have been doing it for a period of time, they lose their ability to be empathetic, because when you're listening to somebody in real time, we
(25:29):
don't realize how much our face is mimicking that person's emotions,
and that gives us the empathy.
Speaker 3 (25:37):
That is wild.
Speaker 5 (25:39):
I had not heard that.
Speaker 3 (25:40):
It makes sense though.
Speaker 6 (25:41):
But now just take that and put that over AI
true crime. It will ultimately be the same thing. We'll
stop caring about these cases.
Speaker 3 (25:56):
So what now? Have we become too addicted to junk food?
Speaker 7 (25:59):
Kinda.
Speaker 3 (26:00):
Can we, or can the platforms, do anything about all this? That's after the break.
The channel had around one hundred thousand subscribers and, across the videos, millions of views. It
(26:23):
made Henry wonder: he might not like this, but was Paul actually doing anything against YouTube's policies? So he contacted YouTube and asked them. So you hit up YouTube, and YouTube nuked his channel, essentially? Yeah.
Speaker 1 (26:41):
Not only that channel, but another three or four that
he also had.
Speaker 3 (26:45):
What was the reason that they gave for pulling these
channels down?
Speaker 1 (26:49):
YouTube told me in an email that Paul's videos had violated YouTube's policies around child safety, particularly their policy on child sexual exploitation.
Speaker 3 (26:59):
So not about AI.
Speaker 1 (27:00):
No, no, he wasn't violating any of YouTube's policies around AI.
Speaker 3 (27:04):
And even though this channel was taken off of YouTube, you can still find it on other platforms. He's still making this stuff, clearly, because it's on Spotify. I found it on Amazon Music; all the platforms that I looked for it on, it's on there too.
Speaker 1 (27:18):
Yeah, he has an RSS feed and a podcast player, and he's still generating these true crime stories. Also, they're selling ads. Ads on all of these, yeah, for hummus and universities and all sorts of weird stuff. So he's still clearly making money.
Speaker 3 (27:35):
So again, YouTube takes down the channel because they say
it violated child safety policies, not because it was fake,
and Spotify doesn't really seem to care that it's fake
or that it's explicit. So for now, there's not really
any reason for this stuff to stop.
Speaker 1 (27:51):
True crime is really popular and people love it, and
so of course there's going to be ripoffs and parodies
and scams associated with it. It's just the world we
live in now, I think.
Speaker 7 (28:03):
So if you want to know how to make a true crime case story video that will go viral, then you are in luck. Hey there, and how are you doing?
Speaker 3 (28:10):
And now there are tutorials on how to make this stuff.
Speaker 7 (28:13):
In this video, I am going to show you how
you can create your own true crime story video and
go viral. So are you excited? Well, let's dive right in.
Speaker 1 (28:22):
I mean, of course there are tutorials, right? Of course. Pandora's box is now open, and there's really nothing to do except hope for a solar flare, right? That's what my fingers are crossed for.
Speaker 3 (28:36):
I mean, at this point, just bring the asteroid, man,
because I don't know what we're gonna do.
Speaker 1 (28:39):
Yeah, let's just clear all the satellites from orbit and start fresh. And I don't know, maybe someone will cut all the undersea telecom cables and we'll be good.
Speaker 3 (28:49):
I mean, that might be it. At this point, I'm
going to refrain from saying kill switch. I'm going to
refrain from saying that because that would be too corny.
So you spent months reporting on this, What is your
takeaway from this?
Speaker 1 (29:05):
There's a couple. I think the very boring reporter in me is like, you know, truth matters more than ever, right, and misinformation is bad. But of course we all know that. I think we have to ask ourselves why we get interested in the media we consume. I think that's a really important part of being a consumer in an age where we have so much to choose from.
(29:28):
There's so much more of our media diet that's completely in our own hands, and we can totally screw ourselves over if we let ourselves.
Speaker 3 (29:37):
I would imagine that there are some people who will continue listening to Paul's podcasts, who know that it's fake and who don't care, because it's good enough for them. And not only is it good enough for them, it's escalated to something, to a level that Netflix is not going to give them, because the source material doesn't exist. There isn't a husband who is secretly gay who kills
(30:00):
his wife on a cruise, that happens every week. There is not a trans person who crosses state lines to participate in some drug ring or something like that every single week. It's not possible, but AI does make it possible. And maybe that's where we are. Maybe we've gotten so addicted to true crime that there are people who want the fake stuff. This is something Bob also
(30:25):
brought up.
Speaker 5 (30:25):
I mean, I think at some point, people are gonna... like, more and more people are listening to it, and they're going in as if it happened. And then when they come to realize that it didn't, they're going to be upset. But then the question becomes, the next time it comes up, they'll be like, well, I know now that it's fake,
(30:46):
but it was still pretty good.
Speaker 7 (30:47):
Again.
Speaker 3 (30:49):
Look, when I open a bag of Cheetos, I know what I'm in for, right? And Bob and
Lauren are also worried about some other real life effects.
Speaker 5 (30:57):
I've got this distinct fear about how it's going to actually move into the criminal justice system, because it's only a matter of time before they're able to use these same types of programs in terms of creating imagery, where they're going to be able to bring false evidence into cases. Like, there is nothing stopping somebody from saying, yeah, I have a recording of a phone call wherein this
(31:17):
person just confessed. I mean, the thought of what could be in the very, very near future is terrifying on an entirely different level, just in terms of the fact that it's able to create out of whole cloth
if they do.
Speaker 6 (31:37):
You know, we're seeing it on the political stage right now, and so we are gradually becoming more and more comfortable with accepting AI...
Speaker 3 (31:52):
AI generated evidence being used in a courtroom. I hope
we never have to do an episode about that, but honestly,
at this point, man, give it a couple months. But
I think what we're starting to learn is that there
is a segment of the population that knows what they're
getting is fake and they don't care. Maybe you personally
are okay with the ChatGPT filters that can redraw
(32:14):
your picture as a Simpsons character, or you're cool with
the audio generation engines that can turn your lyrics into
a pop song or a mariachi song or a rap song.
Maybe you personally draw the line at simulated approximations of
people being murdered, but we should acknowledge that these are
all uses for the same technology. Thank you so much
(32:39):
for listening to Kill Switch. You can hit us up at killswitch at kaleidoscope dot nyc with any of your thoughts, or you can hit me personally at dexdigi, that's d e x d i g i, on Instagram or Bluesky, if that's more your thing. And if you liked the episode, hopefully you did, if you're on Apple Podcasts or Spotify, you know, take that phone out
(33:00):
of that pocket and leave us a review. It really does help people find the show, which in turn helps us keep doing our thing. Kill Switch is hosted by me,
Dexter Thomas. It's produced by Shena Ozaki, Darl of Potts, and Kate Osborne. Our theme song was written by me and Kyle Murdoch, and Kyle also mixed the show. From
(33:25):
Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers are Katrina Norvell and Nikki Ettore. We'll catch you on the next one.