Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Bloomberg Audio Studios, podcasts, radio, news.
Speaker 2 (00:08):
Hey, it's Sarah. Today, we're bringing you a special episode
from our colleagues, reporters Margie Murphy and Olivia Carville. They've
been following the stories of women and girls who have
had their lives upended by deep fake pornography. They've been
trying to understand what's behind the rise of these cyber
harassment campaigns and why it's so hard for victims to
(00:30):
get any accountability. It's an issue they explored in their
podcast series Levittown, and an issue that's recently captured national attention.
We've put the link to the series in the show notes.
US lawmakers just passed first-of-its-kind federal legislation
meant to help teens and young women get this kind
(00:50):
of non-consensual pornography off social media.
Speaker 3 (00:54):
On a busy news day, the President is set to sign the bipartisan Take It Down Act, which would help combat sexually explicit online materials, sometimes generated by AI, also known as deep fakes or revenge porn. The event will begin.
Speaker 2 (01:09):
The Take It Down Act is one of the few
policies this Congress has managed to unite behind. Democratic Senator
Amy Klobuchar and Republican Senator Ted Cruz both pushed for it,
and so did First Lady Melania Trump. Tech companies like Meta and Google backed it too, and on May nineteenth,
President Donald Trump signed the Take It Down Act into law.
Speaker 4 (01:32):
Today it's my honor to officially sign the Take It Down Act into law. With the rise of AI image generation, countless women have been harassed with deep fakes and other explicit images distributed.
Speaker 2 (01:43):
The law puts more pressure on potential perpetrators, making it
illegal to publish or even threaten to publish, revenge porn
or deep fake images, and starting next year, it will
also put new pressure on the social networks where this
content can spread. Tech platforms will have forty eight hours
to take down any non-consensual sexual images reported by
(02:04):
victims or risk legal action from the FTC. Today on the Big Take: Margie and Olivia in conversation with two
mothers whose children had two very different experiences in this
dark online world of cyber sexual harassment. One is the
mother of a victim who helped push for this new
(02:25):
federal legislation, and the other is the mother of someone
who is committing these kinds of crimes. I hope you
find the conversations as surprising and revealing as I did.
A quick note before we start that this episode explores
(02:48):
sexualized imagery involving minors and violence. Please take care while listening.
Here's Margie Murphy.
Speaker 1 (02:58):
It's every parent's nightmare to think that their child could
be a victim of deep fake pornography, of a long,
intimidating campaign of cyber harassment. But what if you learned
that your child was a perpetrator of a crime like this,
That the child you nurtured and cared for not only
created non-consensual pornographic images of girls they knew from school,
(03:20):
but continued to harass them over the course of
several years, all while hiding their identity online. What if
you found out this was happening only when the police
knocked on your door with a search warrant.
Speaker 5 (03:36):
I had a glimpse of something that was absolutely horrible. And if it wasn't for the fact my son was involved,
I still wouldn't even know about it. And parents need
to know. You know, this needs to get sorted out.
Speaker 1 (03:47):
This is the mother of a young man who was
accused of the kind of online harassment we covered in
our series. She wanted to remain anonymous because she feared
speaking out publicly could impact her career and safety. So we've agreed not to identify her or her son, and we'll refer to her using a pseudonym, Barbara. She went
(04:07):
through a lot with her son, and she said she
thought it might help others to hear about it. Barbara
says that when her son was young, she thought that
she was lucky to have a happy child who was
pretty adaptable pre puberty.
Speaker 5 (04:22):
He was literally an all-rounder, really. He loved a lot of different things. He had a lot of my characteristics, so we related well. He's quite creative. He was extremely companionable, just a lovely young boy.
Speaker 1 (04:40):
But over time things changed.
Speaker 5 (04:43):
He developed some mental health issues as he approached puberty,
and it was quite a dramatic change, very, very sudden.
Speaker 1 (04:52):
She noticed her son pulling away from sports, school and
his social life. To her, something knocked him off course,
effectively driving him into isolation.
Speaker 5 (05:03):
The serious social withdrawal started to kick in, and the depression, and that was difficult. So it went from a child who was participating in things to a teenager who refused to participate. It was like black and white. And he started to have some negative comments about school as well, about his teachers, or about not enjoying subjects.
(05:25):
He couldn't be bothered. Also, a thing that I had
noticed with him was that, unlike kids of my friends,
he had no vision of his future. And I saw
this teenager who refused to take any opportunities and had
no vision of what he wanted to do. So there was,
(05:47):
you know, so I started to get quite worried. I was concerned enough about it even then that I did talk to a counselor, was there anything I could do. I remember being told that because I was an awesome mother, I was a great role model, and I didn't have to worry about anything. That's really one of the things that sticks in my throat, because clearly things went pretty pear-shaped.
(06:13):
And I'm not going to go into the details. Because I think mental health, mental illness, they don't come
in neat little boxes anyway.
Speaker 1 (06:23):
Barbara was a pretty tech-savvy mum, and there were always computers in their home. She didn't think much of her son using technology, like playing Harry Potter video games,
but as he got older, she noticed he spent more
time online and he began making friends there too.
Speaker 5 (06:43):
I guess in hindsight, there were some influences that started
to come into the house. I will describe it in
general terms. So if you're a parent and you've got a son and they're having a sex life in their bedroom, locked up with the curtains drawn, you can't not know that's happening. There's signs of that. So that
(07:06):
had been going on for a long time. But I thought it was consensual, you know, modern online relationships, because he told me by that time about real relationships that he believed were real. So I was aware that he was having an online thing, so I wasn't expecting this other aspect. But I had warned him about things
(07:26):
that can happen online. You know, these are things you
need to be really careful about because people can get
groomed and sucked into, you know, stuff.
Speaker 4 (07:36):
You know.
Speaker 5 (07:37):
I tried to explain to him seriously and he's just like,
you know, go away, Mom, What do you know?
Speaker 1 (07:47):
Can you recall the moment when you learned that your son was harassing women online with doctored graphic images?
Speaker 5 (07:54):
A bunch of investigators and police turned up with a
search warrant. It must have been a weekend or after work. I was outside cleaning up with a wheelbarrow and a rake or something like that. I looked up and it was just this array of police and plain clothes investigators at my gate, and a female detective came to speak to me, showed me that there was a
(08:16):
search warrant. They asked if he lived here, where was
he in the house. So I guess I was shocked,
but I wasn't one hundred percent surprised because I'd been
really concerned about my son with the illness, and because
I knew that I didn't know what he was doing.
Speaker 1 (08:37):
Can you characterize what you were told that he had been doing, why the police were investigating him?
Speaker 5 (08:43):
They came in with a warrant and some paperwork. They left it on the bench top. I just read the front page, it had some charges on it, and I remember, it's like, I think a lot of it was like online harm. Later, because he was having difficulty dealing with
(09:08):
the lawyers, the lawyers were having difficulty because it's a new area, I thought, well, I'm going to have to have a close look here and I'm going to have to help. The lawyer had a massive file and then I saw all the stuff.
Speaker 1 (09:28):
All Barbara knew were the charges brought against her son.
He wasn't willing to talk then about what he'd been
accused of. We asked if he would be open to
speaking to us, and he declined. But looking at the
online images the investigators had collected, this was the moment
that Barbara realized what her son had been up to
alone in his bedroom. We'll be right back. We're back
(09:59):
with Barbara, who's taking her first look at the images
that showed her what her son was doing online.
Speaker 5 (10:05):
And there was all these men, all these dick pictures, which is like erect penises, or there were guys taking photos of themselves in front of their computer and then interacting somehow in this big gang of other men. And then they were talking just disgusting things about what they wanted
(10:29):
to do to various females. Honestly, I had no clue that there were these kinds of porn sites. To me, it seemed like a virtual kind of a gang rape scenario. That's how I immediately perceived it. Oh, my son has been groomed into this stuff by these, like, guys
(10:50):
in their thirties, forties, not just young boys. These are adult men out there saying, hey, you know, you're one of us.
Speaker 1 (10:59):
As Olivia and I learned through our reporting, there are many places for non-consensual pornographers to post their creations,
places where they get a boost from likes and comments
from other men.
Speaker 5 (11:11):
Tell us what you'd like to do to this woman,
or show us a woman that we can make comments about,
or whatever. Posts pictures of you, you know, your jennitator
on the computer, and what the hell? It's never been
something that my generation. It's just like, it's just completely
(11:34):
wouldn't even imagine it.
Speaker 1 (11:36):
Barbara says she hates when she hears the police talk
about lone wolves committing these kinds of crimes. It bothers
her when people suggest that the perpetrators are introverted, acting alone,
because that's not how she sees it.
Speaker 5 (11:50):
That's a gang activity. My son wasn't there alone. He
was getting pumped up by mass testosterone, fueled by thousands
of men around the world telling him he was the greatest.
It's not just one fifteen-year-old inventing all of this.
Speaker 1 (12:06):
She knows all about this world now, but it's not
something she was aware of before her son's arrest, and
she's frustrated that the people she hired to help her son
didn't flag this as a possible issue.
Speaker 5 (12:17):
And I've just been through years of talking to psychologists, psychiatrists.
My son's been assessed and reassessed and reassessed and reassessed
for years. Why didn't it come up? So I have
a lot of questions about it, a lot of questions
about those unknown harms.
Speaker 1 (12:39):
You know, has your son been willing to talk to
you about what he was accused of?
Speaker 5 (12:45):
Largely, no. At the start, he just denied everything, just denied everything for a long time. And that's when I realized the extent of the problem. I suppose before that
I really trusted him. I thought we had a good relationship,
and I went through a period of feeling utterly betrayed
(13:05):
and heartbroken basically for quite some time. It's irreparable, because this goes against my entire values, you know, everything that I've lived in my life to support women, encourage women, back women, and stand up to bullying and all that kind of thing. I just saw this as some
(13:27):
horrible monster that had come into my house without my permission.
Speaker 1 (13:32):
But Barbara says, over time she started to look at
the situation differently.
Speaker 5 (13:38):
And then I started to see him not as a criminal,
but as a very vulnerable adolescent, really struggling with his identity,
with his social life, with his sexual development and stuff,
and being just sucked into this because he didn't, you know,
because of the state of his mind. Basically he was
very vulnerable to it. I told my son this recently.
(14:02):
This is a problem of male behavior and it must
be solved by men. Women aren't going to solve this problem.
Mums, angry mums, aren't going to solve the problem. Men that have been through it, they're going to have to figure out how they got into it, look at the damage it's caused to others and to themselves, and then
(14:24):
they're going to have to figure out a solution. I
think that's the only way that that harm can be addressed.
Speaker 1 (14:30):
Really. I know you've spoken about him being up in his room and the vacuum that these groups filled for your son. But are there any other warning signs that you might tell parents to look out for?
Speaker 5 (14:45):
Yeah, I wrote down a list of them. So, a
lack of friends in the real world, the amount of
time spent online, that it is an all-consuming, all-consuming thing. Pulling the curtains and shades, locking the door as a routine procedure to set themselves up for maximum privacy,
(15:06):
having a printer installed in the bedroom suddenly when maybe
they don't need it for work. There could be like little tiny things on their own, as a one-off, you might not think anything of them. But I think it's like signs of addiction, that that world becomes more
(15:28):
real to them, so it affects their relationships in the home.
They start to be rude, not listening to other people,
that aggression, that sort of like almost a hostility. Don't
come near my world.
Speaker 1 (15:42):
With the evolution of technology at the moment, we're just
seeing such an advance in what's available. Even since the
experience that you've described, technology has moved incredibly quickly, and
images that the men in these networks were using and
(16:03):
taking hours to alter now take minutes. And there's nudify apps that produce fake nudes in moments. Can you imagine your son's scenario, but with today's technological ease of
creating that fake porn, how do you think that would
have played out? And also how troubling do you think
(16:26):
that this technology is just so easily available to young
men all around the world right now?
Speaker 5 (16:32):
So I think either these sites need to not exist,
or if they do exist, we need to restrict access. And I think parents have to be able to do that. And at the moment, you can't, because there's this whole kind of ethos about, oh, we're training and teaching our
(16:53):
young people to make good choices, and so you should stand back and let them make choices. I think that it's a load of rubbish, because they can't. They have no idea, and they don't listen to their parents or to anybody, so they're not going to make good choices. We can't just step back, because too many of them are
(17:16):
falling into this abyss.
Speaker 1 (17:19):
It's interesting you talk about restrictions because there's a real
range of parenting philosophies over how to protect young people from dangers online. On one side, you have parents who advocate for withholding devices completely. Then you've got others who are making rules like, okay, in communal areas, the living room, but not the bedroom. And then on the
(17:42):
other end you have parents who say tech is going
to be here forever. Now, you can't stop children from
using it, but what we can do is just talk
to them about it. Where do you come down on that spectrum?
Speaker 5 (17:58):
I definitely like just having a rule around those bedrooms.
Do not allow teenagers to set up their bedrooms as
a fortress where no one can see in, no one
can come in, and they sort of develop an arrogant
attitude of saying that their parents don't even have the
right to know what's happening in there. I think that's
(18:18):
a real danger zone. To me, that is one thing that I could have actually controlled better, but I would have had to fight about it, it wouldn't have been easy. So I think parents are going to
have to be very strong about this, and there's
going to be a lot of pushback. So I don't
think there's going to be I don't have any hope
(18:39):
that there's going to be a quick or fast fix
for it. I think we have to really raise awareness
at the moment, really really raise awareness to a lot
of parents, to a lot of community leaders, and anyone
involved in supporting youth in any way.
Speaker 1 (18:58):
Great. Well, we so appreciate you being so candid and
sharing that, and I think a lot of people will
find it incredibly enlightening and helpful.
Speaker 5 (19:11):
So thank you from me as well. I really am so happy, I will burst into tears. So you know,
I've had the world on my shoulders with dealing with this,
and you know, I'm extremely grateful to take this opportunity
to tell it how it is, because I think it's
(19:33):
such a problem.
Speaker 6 (19:35):
After the break: obviously, the second step after calling the school is calling your lawyer to find out, you know, what are your rights. So we've been informed there are no regulations, no legislation globally at that point. So as soon as Francesca came back the day of, she told me, Mom,
I want a law. I need you to help me
(19:58):
to find a law.
Speaker 7 (20:08):
Of course, parents of young women have also been struggling,
struggling to figure out how to prevent their daughters from
falling victim to deep fake pornography and struggling with how
to adequately protect them from the dangers of the digital world.
I'm Olivia Carville. We spoke to one mother, Dorota Mani,
(20:31):
from New Jersey, whose daughter was deep faked while she
was a fourteen-year-old at Westfield High School, and
she decided to fight back.
Speaker 6 (20:41):
Francesca has been confirmed to be one of many AI
victims of deep fake pornography at Westfield High School in
New Jersey, and ever since we have been advocating for
regulation and legislation and just AI literacy in general.
Speaker 7 (20:56):
Dorota Mani and her daughter were just in Washington, DC, in the Rose Garden of the White House, where President
Donald Trump signed the Take It Down Act.
Speaker 4 (21:08):
With us are several other brave Americans whose lives were
rocked by online harassment, including Francesca Mani.
Speaker 7 (21:16):
Dorota Mani describes herself as an entrepreneur, an educator, and
now an advocate for deep fake porn survivors. What she's not,
she says, is someone who was ever looking to get
tangled up in politics.
Speaker 6 (21:33):
We live a very comfortable life, surrounded by two dogs, two cats. We go for walks, we travel a lot, we hike, we kayak. We don't need to be on CNN, or in the New York Times, or god knows where else. This is not our prerogative. So that spotlight was put on
us without our consent and without our knowledge.
Speaker 7 (21:57):
Dorota says that in October twenty twenty three, when Francesca
was a sophomore, she learned that deep fake porn images
of her were being shared among her male classmates on Snapchat. Francesca was one of several girls at her school who were targeted. At first, Dorota says this came as a
(22:19):
shock for her daughter, but that quickly turned to anger.
Speaker 6 (22:24):
So, you know, immediately when she found out, she texted me,
then she called me, and I know everybody expected she
would be crying or she was just shocked. Multiple girls were crying in the hallways. I mean, there was havoc during that time in the high school, and some
boys were pointing at them and making fun of them.
(22:44):
And that's when she stopped and she said, you know,
I shouldn't be crying, I should be mad. There's nothing to be sad about. I should be mad about what
has happened to us girls.
Speaker 7 (22:55):
Dorota was mad too, once she fully understood the gravity of what had happened when she answered that first call
from her daughter that day. She'd heard about deep fakes,
but only in the political realm. She knew that sometimes CEOs,
lawmakers or celebrities had been targeted, but high school girls
(23:18):
being undressed with AI?
Speaker 6 (23:20):
Never. And when I said, can you please explain it to me, what does that mean? She said, well, there's a technology where you can just undress girls. So you can take any picture and, with a click of a button, sometimes free, sometimes for ninety-nine cents, sometimes for five dollars, depending on the site, you will be able to undress
that woman, girl, boy, anyone.
Speaker 7 (23:42):
I mean that sounds like a nightmare scenario for any
parent. And Dorota, what did you decide to do next
as a mother? As a parent, what did you do?
Speaker 6 (23:52):
So when Francesca came back home that evening, she was pissed, because her school suggested that there's nothing they
can do to the perpetrators. She said, well, what about
the code of conduct?
Speaker 3 (24:06):
You have it?
Speaker 6 (24:08):
They said, well, but there are no AI laws, so we
can't really apply it.
Speaker 7 (24:12):
The school did have a harassment, intimidation, and anti-bullying policy that applied to technology, but Dorota says it
was outdated, to say the least.
Speaker 6 (24:24):
When Francesca brought it home, she said, look, Mom, they're actually referring to pagers and, what was it, what else, Walkmans? She's like, what is a pager? What is a Walkman?
Speaker 7 (24:35):
No one in authority seemed to have the right playbook
to respond to what had happened at Francesca's school.
Speaker 6 (24:43):
Obviously she already knew who it was, that there would be a one-day suspension for one boy and that's it, and that the school was offering counseling. So Francesca says, I don't need your counseling. What I need is accountability.
Their answer was, there's really not much we can do
because there are no laws.
Speaker 7 (25:01):
Did you feel at this point that you, your daughter,
the other young women who had been targeted in this case,
had been kind of failed in some way?
Speaker 6 (25:11):
One hundred percent failed. Disappointed, offended. It's a very strong
message that we're sending to the female community in our
high school that basically says, you know, you are a girl. At some point you will wear a victim's badge, so
go for counseling. Well, screw that. Yes, this happens, and
(25:33):
it's our job to learn how to protect our image,
but it's also our job to point out that certain
behaviors are unacceptable and should not be cultivated, especially in
a place as safe as school should be.
Speaker 7 (25:48):
We reached out to Westfield High School and the school
said that it can't comment on individual student matters or
disciplinary actions. In a statement, the school's superintendent, Raymond Gonzalez,
said the incident took place outside of the school year,
but that the administration started investigating immediately in coordination with
(26:12):
local police. It also provided counseling support to the victims,
and it revised its policies to include the use of
AI and the definition of cyber bullying, and updated its
code of conduct to better address emerging digital issues. And
what about for you as a parent, as a mum
(26:33):
who watches your teenage daughter go through this?
Speaker 6 (26:36):
You know, it's interesting. I've been asked this question over
and over and I always want to respond with, well,
how do you think it makes me feel? So I'm
going to leave it at that. But what I'm going
to say instead is instead of constantly asking any victim
of any kind of crime how it made them feel, we should start asking the important questions. One
(26:59):
is why is this happening? Two is what can we
do to fix it? There's also the question of what
are the consequences of those images being shared?
Speaker 7 (27:10):
Consequences like a college recruiter googling a prospective student and
stumbling across what looks like real life nudes online, or
a future boss or a future boyfriend, or really anyone
for that matter. Dorota says she grew increasingly frustrated at
(27:33):
people in positions of power telling her not to worry
about the deep fake pornographic images of her daughter because
they're not real. To her, the problem was that deep
fakes looked real. It felt like a real crime, even
if it only occurred in the online world, and even
(27:56):
if the offline world hadn't created the laws to criminalize it yet. Can you talk about your own sense of realization that
this lack of any kind of regulation or legislation to
try and prevent this, that this wasn't just a New
Jersey issue, your realization that this really was a national
(28:18):
and actually a global issue?
Speaker 6 (28:20):
Yes. So the day of the incident, we already figured out what was wrong, because obviously the second step
after calling the school is calling your lawyer to find out,
you know, what are your rights. So we've been informed
of exactly the same, that there are no regulations, no
legislation globally at that point. So as soon as Francesca
(28:40):
came back the day of, she told me, Mom, I
want a law. I need you to help me to
find a law. So that set us into motion for
advocacy and I did tell her, you know, I was
slightly reluctant of how much of her involvement I would
like to see in this advocacy. I said, you know,
(29:01):
you are fourteen. You don't know how people will react
if you go full force out there like you usually do.
She's a fencer, she's been fencing since she was six.
We just came back from a Junior Olympics, so she
has that kind of focus when she wants something. I said,
but be prepared. Some people are going to stay
on your side and some people will be really against you.
(29:23):
She says, I don't care. I know I'm right. People
will think what they want to think. I know what
I want.
Speaker 7 (29:30):
Francesca and Dorota started talking to a lot of lawmakers.
One big victory for them came in early April.
Speaker 8 (29:39):
Today I stand before you as the happiest sixteen year
old girl. Not because the journey has been smooth, but
because at fourteen, I chose not to wear a victim's badge. Instead,
I decided to fight for my rights and pursue the
justice so many called impossible.
Speaker 7 (29:57):
That's Francesca. She introduced New Jersey Governor Phil Murphy at an event in Newark, where he signed a state deep
fake law that she advocated for.
Speaker 3 (30:09):
Let me again begin by thanking Francesca Mani for her
incredibly powerful words, for your bravery and for your advocacy.
We would not be here today without you.
Speaker 7 (30:21):
Then came the signing of the federal Take It Down
Act in Washington, DC. How do you feel about the
legislation that has been passed?
Speaker 6 (30:31):
I think it's a great step forward. It's a beginning.
Unfortunately or fortunately, AI technology, it's so complex and multifaceted, and so fast-paced, that we will have to try to catch up with it in terms of legislation as well. So we need to start somewhere and then we need to continue bettering what's already in
the legislative realm. That being said, I think it's also
(30:54):
very important to point out Francesca has been nominated part of the TIME one hundred AI Most Influential group last year. So we've been connected with some of the most amazing individuals in the AI realm that are advancing climate control and education
and research and medicine, you name it, and that really
(31:18):
opened our eyes. So now we strongly believe that conversation
about AI cannot be one-sided. It's a holistic conversation that should include deep fakes and ethics and legislation, but at the same time, we should simultaneously educate about the great possibilities that AI provides as a tool of
(31:40):
advancement in the future.
Speaker 7 (31:42):
For Dorota and Francesca, this years-long advocacy crusade
isn't about vengeance against teenage boys. Some parents from Westfield
High School called for the police to press charges against the male students who created and shared the deep fake images, but for this mother-daughter duo, it was about more
(32:05):
than that.
Speaker 6 (32:06):
Vengeance is not something we're looking for. We were looking
for a simple: this is wrong, you're worth enough to fight for, and certain things will not be tolerated. So I think that is extremely important to talk of. We should educate our boys, because unfortunately it's mostly boys, of how
(32:26):
not to misuse this technology instead of educating our girls
of how to protect their image. If we want this
to change and truly instigate meaningful progress, we will have
to start asking, where are the boys?
Speaker 7 (32:43):
That's a great point, and something we did delve into in our podcast series. Often this conversation centers around the impact on the young women, the victims. But there's a whole other realm to this, which is, what is going on with young men? What would you say to
the parents of young boys to try and prevent this
(33:05):
from happening at the source, to try and guide your
kids to not create deep fakes?
Speaker 6 (33:10):
That's such a good question. I would say, talk to
your boys. Yes, there are ethics in place, and there
should be certain ways that we use technology in general,
and we should do the right thing. This is the
wrong thing. But at the same time, do tell them
that it's a criminal offense. Right now, I know the
law that we just signed with Governor Murphy provides up
(33:32):
to five years of punishment in prison, which is huge.
Take It Down, it's two to three years. Still, you know, if you won't choose to educate your children because it's
the right thing, then teach them how to protect themselves
from ending up in jail.
Speaker 7 (33:49):
Yeah, so rather than the conversation being, you will upset, offend, and hurt your female friends if you do this, it's, you could be imprisoned for this. And what about advice that you may have for parents who are watching their children go through this?
Speaker 6 (34:07):
Support your child. Know that there are tools like Take It Down that will allow you to take the image immediately down from any website. Know your laws, but most importantly,
understand that there's no right and there's no wrong in
the way you should handle the situation. Every child
is different, every family is different, every incident is different.
Speaker 7 (34:31):
I love that you started this conversation talking about how
your daughter wanted a law, and that after this happened, the day she found out about this incident, she came to you and said, I want a law, Mom. And
she got it.
Speaker 6 (34:43):
As you did.
Speaker 2 (34:50):
That was Olivia Carville and Margie Murphy, the hosts of Levittown,
an investigative series into deep fakes fueled by AI. You can find the full series here in our Big Take
feed or anywhere you listen to podcasts. Jeff Grocott produced
this episode and Caitlin Kenney edited it. Sound design by Blake Naples, original composition by Steve Boone. Levittown is a production
(35:14):
of Bloomberg, Kaleidoscope and iHeart Podcasts. I'm Sarah Holder.
Speaker 5 (35:18):
Thanks for listening.