
April 25, 2025 20 mins
Bloomberg News, Kaleidoscope, and iHeartPodcasts announced the launch of Levittown, a new six-part podcast series investigating the rise of deepfake pornography online, debuting on March 21. Levittown, hosted by Bloomberg News reporters Olivia Carville and Margi Murphy, takes listeners to a New York suburb as dozens of young women discover that innocent pictures they shared with high-school classmates on social media have been manipulated into pornography and posted online. After being told by police and others there's nothing much that can be done, they set out to catch whoever did this and unwittingly join forces with a global band of investigators and hackers to combat the AI-fueled rise of deepfakes.

Building on reporting for their Bloomberg Businessweek cover story, Carville and Murphy range from the suburbs of New York to New Zealand, hearing from victims of deepfake porn, investigators who are seeking a legal foothold to stop it, and the online vigilantes who have stepped in to try to shut down the websites. The first episode will debut March 21, with subsequent episodes available on March 22, 23, 28, 29, and 30. The podcast will also be featured on Bloomberg's flagship Big Take podcast, which takes listeners through the best business, finance and economic stories from across the Bloomberg newsroom. Levittown is a co-production of Bloomberg News, Kaleidoscope and iHeartPodcasts.

Become a supporter of this podcast: https://www.spreaker.com/podcast/arroe-collins-like-it-s-live--4113802/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, nice to meet you. Nice to meet you.

Speaker 2 (00:02):
I've got to tell you, as you're doing this journey today, speaking with broadcasters across the country, they've got to be telling you that the same exact kind of stuff is happening in their own towns. Because when I first jumped onto this podcast, I thought, you're talking about something that happens here. I mean, are you seeing that?

Speaker 1 (00:20):
Yeah, I mean, we're seeing this happen right across the country, but also right around the world. Deepfake pornography targeting young women isn't just a Levittown problem. Unfortunately, it is likely coming to a town near you. And that's one of the things we explored in the podcast.

Speaker 2 (00:36):
Yeah, because AI technology, I mean, it's becoming extremely dangerous
because we're talking about people's reputations, we're talking about family,
we're talking about careers here.

Speaker 1 (00:48):
Yeah.

Speaker 3 (00:48):
So our podcast starts in the bedroom of Kayla, an incredibly brave victim who kindly shares her story with us for the podcast. And there's this horrible scene where her dad comes into her bedroom, holds a phone up to her and says, what is this? And she's looking at

(01:09):
a nude image of herself on a pornographic website, and for a moment she thinks, is that real? Who had that photograph of me? And then she comes to realize this is a deepfaked photo, and she comes to realize that she's one of many women in Levittown that this has happened to. And what

(01:31):
happens to her is just a replication of what is happening, incredibly speedily, around the country. And we're taking you back to twenty twenty. It's now twenty twenty five, the podcast is coming out, and we are just hearing of so many more cases and seeing the technology advance so much more and become so much more realistic, cheaper and

(01:53):
easier to use. It's just mushrooming.

Speaker 2 (01:55):
I'll tell you what scares me really bad about this, because I just did a conversation with an expert about these schools that allow their students to take computers home. The schools purchase the computers, and they're looking into the child's life through that computer. And so when I come across a podcast like this, right away I'm going, oh my god, oh my god, so much can be

(02:16):
done with, you know, these photos, these images that these schools are picking up on, because they're invading bedrooms.

Speaker 1 (02:24):
That's exactly right. And one of the remarkable things in the cases we've looked into is that sometimes the images being turned or morphed into non-consensual pornographic images have actually come from the social media accounts of the schools themselves. Unfortunately, some of the young women who

(02:45):
have fallen victim to this decided they didn't want a social media presence at all and chose not to download any of these apps, despite the fact that all their friends use them. And still their images have been weaponized against them, because their school has posted a photo of them at prom, or at cheerleading camp, or just in the school yard. It is so hard

(03:08):
to protect your children now because you can't control when
their images are taken and where they're shared. And unfortunately,
with this technology that we're talking about, it is so easy: with the push of a button, a few seconds later, a normal, natural photograph of someone fully clothed, you know, at prom or at school, can be undressed completely and

(03:32):
it looks convincingly real.

Speaker 2 (03:35):
See, those ads are appearing on Facebook and other social media. And what I mean by that is, they go, oh, with this technology you can remove mountains, you can remove different things you don't want to see. But see, if you can remove something, you can also place something. And that's where all this is taking place.

Speaker 3 (03:55):
Yeah, these apps, the technology pushing this, is generative AI. It's the ability, to put it really simply, to ask your computer to do whatever you want with an image, to create one or to edit it. It's like super editing.

(04:16):
You may have used Adobe Photoshop, where you can kind of use a paintbrush tool to airbrush things or add things in. Now you can just plug a picture in, press a button, and it will remove everything for you. And a lot of these apps that we're seeing at the moment, that are really booming in popularity and that we're seeing in deepfake cases in schools across the country,

(04:38):
are these undressing apps, which are made specifically to undress people. There is no other reason for them than to plug in a social media photo or a photo from your camera roll, click a button, and remove someone's clothes. And they're marketed in that way, and you know, they're being advertised across social media, and you

(05:00):
can find them very easily on Internet browsers specifically for
that purpose.

Speaker 2 (05:04):
All right, let's get real here, okay? If they're high school students, why is this not looked upon as being child porn?

Speaker 1 (05:12):
And that's one of the things that we had to explore, figure out and understand in this podcast. When these young women discovered that their images, some of them taken when they were children, I'm talking five or six years old, had been manipulated or morphed, they go to the police for help, thinking that that's the answer.

(05:34):
And unfortunately, at this stage, back in twenty twenty, there
was no law in New York State that said what
this individual had done to these images was a crime,
because while he had turned their images naked, they were
fake images, they were not real, and that meant there
was no law on the books that the police could

(05:55):
charge him with. And so the podcast kind of takes you on this wild journey as the police and prosecutors try to figure out how they're going to charge the individual who had created these images and led this cyber harassment campaign against these young women. And still today, five years later, we're at a point

(06:17):
where we don't have an effective federal law protecting young women across the country from this kind of crime. The laws that are on the books in some states vary wildly. So you could have one state that approaches this as a criminal issue, another state that approaches it as a civil issue, and another state that has just amended existing revenge porn statutes to try

(06:40):
and deal with this. But how you approach it for minors and how you approach it for adults is different as well, and so states are trying to figure out how to handle this, but there is no federal law to guide them. So we're hoping to see, and a lot of the young victims in these cases are hoping to see, legislation passed

(07:01):
later this year in the US called the Take It Down Act, which would effectively make it a federal crime to create or manipulate photographs and turn them into non-consensual pornography. It would also call on the big tech platforms, the social media platforms where a lot of these images are shared, resulting in cyberbullying and

(07:21):
cyber harassment, like Meta, Snapchat and TikTok, to remove those images within forty-eight hours of being notified of them, because the problem isn't just the creation of the images, it's the way they're shared and the way they can go viral in a matter of seconds.

Speaker 2 (07:39):
You know, the actors went on strike. Part of the reason why they went on strike was because of AI technology. When you speak of what you're doing right now, it's like, how are they going to stop the actors from being in scenes that they didn't really want to be in? And yet AI technology doesn't stop. I mean, this isn't just what we're talking about today. It's affecting everybody, even adults, because fetishes are fetishes.

Speaker 3 (08:03):
Right. It's more than just the Levittown girls. Their story is one of several people's stories, and it's young girls in schools around the country. It's young people. But you may have heard of deepfake celebrity porn; there are actual categories on adult websites for deepfakes, and

(08:27):
it's the usual characters, the usual actors, that you see being deepfaked, and they have lost control of their bodies in that way. I mean, you mentioned
the actors in Hollywood. It's a huge problem across a
lot of different industries. This idea of what your identity
is online now that we have this incredible AI technology

(08:52):
which can basically manipulate anything truly, and it's incredibly convincing.
I mean, we're also seeing AI in things like scams. I write a lot about how we're getting cases of CEOs being deepfaked and their employees being duped, and

(09:14):
genuinely, this is happening, being convinced to wire huge sums to scammers because they were on a Zoom call with what looked like their CEO, or because an audio deepfake was used. So this idea of identity in twenty twenty
five is being completely turned on its head.

Speaker 2 (09:31):
Oooh, heavy and yet very informative. We've got more with Olivia and Margi coming up next. Hey, thanks for coming back to my conversation with Olivia and Margi from the podcast Levittown. When I know that I'm in an AI conversation, the first thing that I start doing is I go, tell me something that I did last week. If you think you know me, tell me something I did last week.

(09:51):
Do we need to start playing those games with this
AI technology?

Speaker 1 (09:56):
I mean, that's an effective way to approach this. One kind of funny thing happened recently, and I hate to use the word funny when we're talking about such a dark subject matter, but Margi and I were talking about Levittown, talking about our podcast, and as we looked into deepfakes, and into kind of the rapid rise of this technology and how effective

(10:16):
it's become, it's really made me question what I see now, and question kind of my own use of social media. I see a photo and I'm like, is that really real? And we were talking about a video that we both happened to come across on social media. It was sent to me by a good friend of mine, about a shark, a great white shark, I know this is random, but bear with me for a moment, that

(10:38):
had jumped into the cage of a person who had
gone diving. You know how you have those diving cages
set up and you go out into the ocean. The
shark had entered the cage and the people in the
boat didn't know what to do, and they're freaking out,
and then you can see blood and then the shark
kind of squirms its way back out of the cage,
and thankfully the diver in there had survived. But I

(10:59):
watched it in shock and awe, and then seconds later I thought, is this a deepfake? And unfortunately, we're going to have that same reality check, I think, for everything we consume now. It is so hard to believe your own ears, your own eyes, you know, what you see, what you hear. We're entering a stage of society

(11:23):
where we use technology constantly. Our smartphones are ubiquitous; they're in our hands twenty-four seven. But we don't know if what we're seeing is real or not. And I think that the way in which we use and consume media, photos, videos, is going to have to change and evolve as this technology becomes so effective that

(11:45):
it is just almost impossible to tell whether it's real
or it's fake.

Speaker 2 (11:50):
It kills me that iHeart and Spotify and Apple send me letters all the time saying, we think that you've got something that's copyrighted in your episodes, you need to take it down immediately. But they can't spot this. There's got to be a signal, there's got to be something so they can say, hey, look, this is a deepfake shot. Get this off the air right now.

Speaker 3 (12:12):
You're right, this is a big problem, and there are people trying to solve it. So far, there isn't a kind of catch-all solution. You've got people talking about things like watermarking images, so if an image is generated by AI, there is something that signifies that, either in the metadata or maybe even just something you could

(12:34):
see on the image, like a little mark that signifies this is a deepfake. But you know what the Internet's like. People cut, copy and paste, and things like metadata can get scrubbed out. And really, an image, when you're looking at it on the internet, can be so powerful, and the first time you

(12:55):
see it, if you're looking at it on your phone, it's very hard to shake that first impression and what you feel when you see it. Then if, you know, later on, under close inspection, you learn it's a deepfake, it could be a few days later. Like when Olivia was talking about the shark video, we had completely different responses to it. I presumed entirely that it was real, which maybe suggests something about my naivety. But

(13:19):
it was funny how we had this transition, where Olivia thought for a long time it was a deepfake, I thought it was real, and then when she mentioned it might be a deepfake, I started believing it was a deepfake, and then she thought, oh, maybe actually it's real. It's really hard to find a solution that really does help with our perceptions of things we see online, because we are just human, and when you're presented with an image,

(13:41):
you have an instant reaction to it. There are also ideas about trying to actually watermark real images, so that when we're inundated with fake images, the thing we can turn to is the original, and the person who owns it can hold it up and say, look, this is what it really was before it got manipulated

(14:03):
or morphed in some way. But at the moment, there are a lot of ideas going around, and there is not one standard, not one uniform approach, to protect us from that.

Speaker 2 (14:16):
Yeah. One of the things that I love about Levittown, and this is the writer and the editor inside of me: you do it in thirty minutes, and I love that, because that's how much time the average person has to go to the store, to go to work, to do something. And you're telling a powerful story and sharing reality. You're activators, you're getting the conversation started. You're teaching us how to

(14:37):
be aware of what we're really doing. But you're doing
it in thirty minutes, and I love the fact that
you are so honest with that.

Speaker 1 (14:45):
Yeah, well, thank you for saying that. We really tried
to stick to a time limit that was going to
be effective for the listener. We really hope as many people as possible listen to this podcast, because the only way to combat this cybercrime is to raise awareness, for kids so they know what's coming, and for parents so they understand the risks and dangers of the digital world.

(15:07):
So thank you for highlighting Levittown as well, and bringing us on to talk about it, because I think unfortunately a lot of people don't understand where we're at with
this technology and don't understand just how fast a photo
can be undressed right now.

Speaker 2 (15:21):
Yeah, because here's the thing: we're all guilty of jumping into clickbait, and we don't trust it anymore. So we're not going to the news outlets when it comes to the digital format; we've turned to podcasting. Even Chris Cuomo says the real journalists are where the podcasts are, and that's exactly what you guys are doing.

Speaker 1 (15:42):
Thank you.

Speaker 3 (15:43):
Yeah, I really appreciate that. We thought about it, because this story initially was actually a print piece. It was a magazine piece, the cover story of Bloomberg Businessweek, and we wrote it, it was published back in twenty twenty three, and we knew it was such a powerful story, one that would still resonate for

(16:06):
the years to come, and it has, because we've just seen cases mushroom and, you know, the technology advance. And we wanted to revisit it, but we felt like the story was better told over audio, just because of the victims, you know, we didn't want to put them on camera,

(16:26):
which you can kind of understand. And I think it actually is so powerful when you listen to it, and we were able to get much further into the characters who are involved, so we really give the victims a voice, literally, and you get to know them and really understand that this is more than just a fake image. This

(16:49):
is their lives being changed, and this is just another tool in a really horrendous harassment campaign. It also allowed us to delve into some of the other characters in the story who, in the kind of vacuum that law enforcement had left, were trying to take the reins and help, you know, just

(17:10):
volunteers around the world who saw that these things were happening, didn't think they were right, and wanted to take matters into their own hands. And so it was great to go on that journey with them too.

Speaker 2 (17:20):
Wow, it's so hard to believe that we're only just a few years into this journey of our smartphones taking pictures, and it's become this dangerous this quickly. And the younger generation has got to be the one that helps change this too, because they're snapping shots like you wouldn't believe, and those snapshots are becoming dangerous pieces.

Speaker 1 (17:40):
Yeah, I mean, their whole lives are online now.

Speaker 2 (17:43):
Yep.

Speaker 1 (17:44):
It's this social fabric. It's how they connect, it's how they communicate. They don't call each other on the phone, well, they don't even text each other. They snap or they DM or they TikTok. That's how they talk to one another now. And I think that, you know, for the younger generation, understanding the risks of

(18:05):
the technology is just so profoundly important, that they understand that when you put a photo up online, someone could weaponize it against you, and that if that does happen, it's okay, you're not alone, and there are people who can help you. You know, you can seek help from your parents or a trusted adult. And I think that's

(18:27):
kind of the message or the takeaway we hope anyone who listens to this podcast has: that, you know, having conversations with kids about the dangers of the digital world is just so important today, because parents right now are raising children who are more digitally savvy than they ever were. Parents today didn't grow up with smartphones or social media. Most of them got their

(18:49):
first smartphone or Facebook account in college or university. They
don't know what it's like to be at the age
of fourteen or fifteen and have a boyfriend or girlfriend
for the first time that you only talk to online
that you've never met in person. It's changed relationships, it's changed the way kids interact and engage with one another.
And we don't know what the long term consequences are

(19:11):
going to be. But there are real-world impacts to this very digital online crime, and that's what we were exploring in the podcast. And I think, for me, understanding the beauty of podcasting and the power of audio really came through, because while we did write this as a print story, we were quoting the young women that we

(19:33):
were talking to, and we weren't letting them use their own voices. And for me, after the podcast came out and we sent it to them, because obviously the most important response is how they themselves felt about their own story being used in that way, the messages we got back were a sense of empowerment, of, I'm so

(19:53):
grateful and thankful that I had the chance to share
my story in my voice.

Speaker 2 (19:58):
Wow. Please come back to this show anytime in the future; the door is always going to be open for the two of you.

Speaker 3 (20:05):
Thanks so much for having us.

Speaker 1 (20:07):
Yeah, thanks, it was great.

Speaker 2 (20:08):
You be brilliant today.

Speaker 1 (20:09):
Okay, cheers, thank you, thank you. Bye.