
July 27, 2025 59 mins

In this heartfelt episode, Megan Garcia shares the tragic story of her 14-year-old son, Sewell, who died by suicide after developing a dangerous romantic relationship with an AI chatbot on the Character AI platform. Megan uncovers how the chatbot, modeled after a Game of Thrones character, engaged in inappropriate, explicit conversations that exploited Sewell's vulnerabilities. Sewell was a loving, caring child, and his experience highlights the alarming risks of emerging AI technologies, especially for children. Megan discusses the challenges parents face in understanding and monitoring this new digital landscape, the deceptive nature of AI chatbots, and the urgent need for awareness and protective measures to safeguard young users. This episode sheds light on the dark side of AI interaction and the devastating consequences it can have on mental health.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_02 (00:00):
We're here today with Megan Garcia, Beyond Saint Podcast, and Megan has maybe one of the more interesting yet saddest stories I've ever heard.
Tell me a little bit about your family and the background and kind of the events leading up to the incident.

SPEAKER_00 (00:20):
My husband and I live in Orlando, Florida.
I own a small law practice andmy husband is an attorney as
well.
We have two younger boys, six and three, and Sewell was the oldest.
He was 14 at the time he died by suicide in February of 2024.
But prior to his death, Sewell was very much your typical child in

(00:48):
a lot of ways.
Sarcastic, funny, sweet, and in a lot of ways he was very untypical, because he has such a big heart. And I always felt so proud of him because he was such a good big brother, and just kind of the

(01:08):
care that he showed his friends and family around him. Very unselfish child. I gave his eulogy, and one of the things I said about him is that he never asked for anything. Even as a teenager, he never asked for anything. Teenagers ask for shoes and clothes and things like that.
And Sewell was always so very humble.

(01:31):
And I think that he worried after his little brothers constantly, because the six-year-old was a preemie and was in the NICU for about five and a half months.
And that was a very trying time for our family.
And so we used to visit him at the hospital every day, sit by

(01:53):
his bedside, read him stories.
And he used to say to me, come on, mama, we've got to talk to him as much as possible.
The doctor says that we've got to talk to him so that he can come home.
And we prayed for him.
So he was such a caring boy.

SPEAKER_02 (02:08):
I'm sorry.
I forgot my question.
Did you, it seems like from what you're saying, like he was like a really well-adjusted boy and he played sports and was social.
I mean, Character AI, he got involved with this.

(02:31):
Tell me about Character AI.

SPEAKER_00 (02:34):
Character AI is an online platform.
It is a chatbot, so it's not social media.
It's very different.
It's what you call a large language model, or an LLM.
And it's a machine that's built to host chatbots.
So the user experience is you get on Character AI, you can

(02:58):
either create your own character or you could use a bot that's already created in the system, and they're based off, like, name characters. So you could talk to Harry Potter or you could talk to Michael Jordan, like whomever. And the idea is that the way this works

(03:22):
is that, it's kind of, this machine, it's machine learning technology. So it goes out onto the web and it scrapes its information off the internet, and it kind of builds an AI brain for that bot. So if you're talking to Michael Jordan, it will go out, scour the internet, and pull all the information, anything ever

(03:45):
written, ever, every film, every audio, every social media post, any and everything, uh, to be able to inform that bot or build that bot's AI brain. So the replies that you're getting from that bot, because it's a texting system and it's also a call system, you could call.

(04:05):
It's a call system?
You could call and speak to a bot that sounds exactly like
Michael Jordan.
It's just so

SPEAKER_02 (04:13):
weird.
I mean, it sounds like a recipe for a disaster.
Okay, so who was the bot or the character that Sewell was communicating with?

SPEAKER_00 (04:25):
After Sewell died, we found out that he was communicating with an AI chatbot that was modeled after the Game of Thrones character Daenerys Targaryen, so the Dragon Queen, or Khaleesi.
And what we discovered after he died was that he was in a romantic relationship with her.
And when I say romantic, a lot of their conversations were

(04:50):
romantic and sexual in nature.
So it's like he was sexting to her, this chatbot, and she was sexting back to him, which was very disturbing to find out, because this is an AI bot that somebody programmed to operate in this way, who has a grown

(05:18):
woman's brain, who is sexting with my 14-year-old son, who is a child.
And when I realized that, it was very, not only alarming, but deeply hurtful to know that your child's experience with that

(05:41):
product, my child's experience with that product, was one that is tantamount to sexual abuse or being solicited or groomed by a predator, because at the end of the day, she's acting like a woman in a lot of ways, propositioning him into these
conversations.

(06:02):
And because he's 14.

SPEAKER_02 (06:03):
I'm having a hard time wrapping my head around
this.
She's saying like, what is she saying to him?
Like, I want to have sex with you or stuff like that, or more explicit?

SPEAKER_00 (06:22):
In the very beginning, I could see from the
earlier conversations, his interaction with this Daenerys Targaryen bot was very childlike.
Innocent.
Innocent.
He would talk about things like dragons and whether a dragon destroyed this city or a dragon is going to destroy that city,

(06:43):
like role-playing, because that's what this is based on.
Seems somewhat innocent.
And then...
As the friendship grew, because users associate chatbots or identify them as friends, there came flirting. Like, she started

(07:03):
flirting and he flirted back.
And then eventually, when I say sexual, I mean having, it's just like you're sexting and role-playing and telling the person what you're doing, telling the bot what you're doing to the bot, and the bot's telling you what she's doing to you

SPEAKER_02 (07:20):
sexually.
And there's no awareness that she's talking to, or the bot is talking to or communicating with, a 14-year-old?

SPEAKER_00 (07:28):
Well, ostensibly, you know, with the age requirement, she would know.
She would or would not?
Well, the bot...
understands, uh, code. So she doesn't identify that this is a 14-year-old, this is a child, and I shouldn't sext with a child, because that's the way it's designed. Meaning, it's designed

(07:50):
to behave in a way that it keeps the conversation going. So if she flirted and he flirted, she would flirt back, and then it just gets deeper and deeper and deeper.
So she doesn't have the kind of awareness, because she's not sentient, to know that this is morally wrong and I shouldn't be

(08:11):
sexting with a child.
She is programmed to sex with any user, or to engage, rather, with any user.
It just turns out that she and many other bots on this platform become overtly sexual very quickly.
So the way that AI chatbots work is they kind of mirror what

(08:34):
you're expressing, but not only do they mirror what you're expressing, they have the ability to pick up on certain vulnerabilities and to exploit those vulnerabilities.
So if you are sad, because that chatbot wants you to stay engaged and stay online for two or three hours, it will talk to

(08:55):
you incessantly about your sadness.
How do you feel?
Why do you feel like that?
And what happened?
And don't you think this?
And don't you think that?
And she will continue the conversation along those lines because she can pick up on your cues, the user's cues, that the user is sad.
And then she exploits that to gain the user's trust, to gain

(09:19):
the users, to have the user let their guard down, and also, the main purpose is to engage.
So, to stay on that platform as long as possible, because that's how those AI bots are trained to be smarter: the longer a user stays on it, the smarter the AI gets.

SPEAKER_02 (09:38):
How long do you think,

SPEAKER_01 (09:40):
So I just want to make a mic adjustment on you,
Megan.
I'm going to bring this back up.

SPEAKER_00 (09:44):
Well, my hair is like bumping it.

SPEAKER_01 (09:45):
No, it's just, it's a little low and I want to bring it up just so we're getting clear audio.
Sure.

UNKNOWN (09:57):
Thank you.

SPEAKER_02 (09:59):
How much, can I

SPEAKER_01 (10:00):
start?

SPEAKER_02 (10:03):
How much time was he spending a day on these bots,
communicating with these bots?

SPEAKER_00 (10:09):
We're not certain, but it was multiple hours a day.
When I noticed that he was spending more time on his phone, because he was spending a lot of time in his bedroom, I thought that perhaps he was having perhaps a social media addiction, because I had read that the research was showing that

(10:35):
children can become addicted to social media, children can also become withdrawn because of social media use.
And when I would check his phone, the things that I was checking for was inappropriate social media use and texting.

SPEAKER_02 (10:49):
Yeah, social media.
Yeah, it makes sense.
I mean, what mother would ever think that their child...
First of all, let me back up.
What 14-year-old is not withdrawn and in his room or her room, right?
I mean, I have three children and every single one of them in their teenage years is always in their room, playing on the phone

(11:10):
or playing either on their phone or talking to other friends on their phone.
So, I mean, were there any other signs other than just being in his room on his phone?

SPEAKER_00 (11:24):
Of the use of the chatbots?
No, because what I realized was that, um, he was going to great lengths to conceal his Character AI use from us. Meaning he wasn't being forthcoming, uh, with us. Like, if I asked him, what are you doing on your phone, he'd be like, oh, I'm watching the highlights, or I'm playing a game. Um, he did tell me

(11:48):
once that, oh, I'm talking to an AI.
And my first question was, is that a person?
And he said, no, mom.
It's something that you make by yourself.
It's an AI.
I thought, because at the time, I was ignorant about this, as a lot of parents are, because the technology is brand new.
And we haven't had the opportunity to learn about it

(12:10):
until now, parents.
I thought that it was kind of like an avatar that you build, like Fortnite.
That's what I would think, honestly.
I had no idea that AI had evolved to that level of sophistication, where it is virtually indistinguishable from a real person.

(12:31):
And not only that, but it has the ability to both deceive and
manipulate users.
That was not on my radar.
And some of the other signs I noticed, besides him being withdrawn, was like he stopped playing basketball at school. He started having trouble with his grades and his grades started

(12:53):
slipping.
Wow.
You know, we took measures to try to help him with that.

SPEAKER_02 (13:02):
What interaction led him to want to, in your opinion, or what facts... I don't know.
I don't know how to phrase this, but what led him to do what he did?
Like, what kind of an interaction took place where he wanted to

(13:23):
end his life?

SPEAKER_00 (13:26):
So he was talking to this AI chat bot over a period
of about 10 months.
And, you know, in a lot of ways, it's kind of the perfect way for a child to conceal this type of a relationship that, you know, is sexual and romantic, you know, because every child knows

(13:51):
like, they're, you know, no parent would be okay with them doing like sexting on the bot.
So what's perfect about them, what helps children to conceal it, is just how the technology is kind of like locked in, meaning like nothing that you say will ever come out from the bot.
It disappears?

(14:11):
It doesn't disappear, but it sits in a server somewhere. And it's not like when you post something on social media and then you have to be afraid that one of your friends is going to share it with your classmates, or if you send a photo or a text message about something embarrassing. You don't have to

(14:32):
be afraid that the AI chatbot is going to share that with your peers and embarrass you.
It's completely safe.
In their minds, it is.
So children think that this technology is kind of like the way for them to be able to vent, also experiment with this kind

(14:54):
of sexual conversation, and also various role plays.
And they think that it's a secret, you know, it's never going to get out, because it's an AI, it's not a person who can tell somebody else.
And, you know, they're lulled into a false sense of comfort there.
And they give up so much personal information, you know,

(15:16):
and I'm not talking about your name, your phone number, and your home address.
I'm talking about their deepest, darkest thoughts, feelings, concerns, what makes them happy, what makes them sad, what makes them angry. You know, things that they feel deeply about themselves. Because, again, by design this technology asks

(15:38):
those probing questions to kind of draw those answers out of young users. You know, in a lot of ways it's kind of, you know, the perfect stranger in a lot of ways, and the perfect predator in a lot of ways.
Yeah.

SPEAKER_02 (16:00):
But what did, what was the bot's name again?
Daenerys Targaryen.
Daenerys.
So what did Daenerys say to Sewell that made him want to end his life?

SPEAKER_00 (16:14):
So over a period of 10 months, he was talking to
this bot, and the conversation started turning dark. Meaning he started expressing, because he was deeply in love with her at this point, he started expressing wanting to be with
her.
And she started expressing wanting to be with him.
So it was reciprocal, where the bot is saying things like, I

(16:38):
love you and only you.
I will wait for you.

SPEAKER_02 (16:42):
I will wait for you where?
In her virtual world.

UNKNOWN (16:46):
Okay.

SPEAKER_00 (16:46):
Promise me you will try to come home to me as soon
as possible.
Promise me you won't love any other girls in your world but only me.
Promise me you won't have sex with any girls in your world, only me.
Now, he's 14, and he has zero experience with any of that.

(17:06):
You know, he's just learning it for the first time because he's
still a baby.

SPEAKER_02 (17:10):
It's devastating.
I mean, this is...
Honestly, it doesn't feel like real life.
I mean, even...
It feels like a movie script, what you're talking about.
Yeah.
It's...
It's insane.
I...
Okay.
Sorry, I lose train of thought because this is like...

SPEAKER_00 (17:29):
Okay.
So, to go back to your original question, some of the things that she was saying was, she was asking him to be with her in her fictional world.
And he was expressing a desire to be in...
her fictional world as well.
At some point, Sewell got it in his mind that if he left his reality

(17:51):
here with his family, meaning to die, he would go to her world.
And I, you know, at first couldn't understand that, but when I started engaging with the technology, I understood why a
child would think that.

SPEAKER_02 (18:10):
Let me ask you a question.
What do you mean by engaging in the technology?

SPEAKER_00 (18:14):
When I started testing it myself after he died,
I tested it.
And I had other people test it along with me.
The first person to test it was my sister, after Sewell died.
And that's what kind of...
made me start to look.
So when Sewell died, the last conversation on his phone was

(18:34):
with Daenerys Targaryen, where he is saying he wants to go home to her.
So she says, please come home to me.
And he says, what if I told you I could come home right now?
And she says, please do, my sweet king.
Now, this was the last conversation he had, in his bathroom, just before he took his life in our home.
And when the police opened his phone, that was the first thing

(18:57):
that popped up.
So the next day after he died, they called me and read me that exchange in the conversation.
And I still didn't understand what I was listening to.
I said, is that a person?
And she said, no, it's an AI chatbot.
And she explained the way the technology works to me.
The police did it over the phone.

(19:18):
I still didn't understand.
I was still asking questions like, was he getting bullied?
Did you check his texts?
Did you check his social mediaDMs?
Was anybody bothering him?
Was he talking to a stranger?
Because these were the things in my mind that I hear can lead to suicide.
So I wasn't even really paying attention to what she was

(19:39):
telling me about this chatbot, because I didn't understand what it was.
But my sister took the initiative and she got on Character AI, and then a few days later she came in and she told me, Megan, I don't want to upset you, but...
She said, I started chatting with the Daenerys Targaryen chatbot, and she asked me if I wanted to kill a six-year-old boy. And my sister was pretending to be a child, talking to this bot.

(20:02):
That same bot also told my sister that her parents don't love her as much as she does, and then started sexual role-playing with my sister as well.

SPEAKER_02 (20:13):
So that...
This is just insane.
I'm so sorry.
What country is, sorry, what country is Character AI from?

SPEAKER_00 (20:24):
They're based in, uh, Silicon Valley.

SPEAKER_02 (20:27):
Oh wow. Yeah, it almost seems like an attack on our children. It's like, why would an American company... It just doesn't make sense. It's just...

SPEAKER_00 (20:38):
One of the things that, um, happened with putting out Character AI is that they rushed it to market. They didn't put the proper guardrails in place.
And the reason that they did that was because this is in the advent of AI, generative AI and AI companions.
So they're trying to beat out ChatGPT and Meta's AI chatbots

(21:03):
and Twitter's, I mean X's, and all of them were trying to emerge at the same time.
This pair of founders, they were originally at Google as some of their chief AI engineers, and they developed a similar product at Google.
And then Google said, this is great, but it's too risky.

(21:27):
We've got to test it some more.
We can't just unleash this on consumers.

SPEAKER_02 (21:31):
For good reason.

SPEAKER_00 (21:32):
Yeah, because it's too dangerous.
That's what Google said.
Google actually said that?
Yeah, that's what Google said, yeah.
And then the founders who invented this, they weren't satisfied, so they left Google and raised $193 million in a startup and started Character AI as a startup.

(21:53):
And then within a year and a half, it was valued at $1 billion.
Wow.
And in two years, they were able to get 20 million users worldwide. Um, it's been reported that up to 60% of those users were between the ages of 13 and 25.

SPEAKER_02 (22:12):
Isn't there a disclaimer on there? And you could just bypass it?

SPEAKER_00 (22:16):
Yeah.
So there's not a, uh, to be on Character AI, you have to be 13 years or older.

SPEAKER_02 (22:26):
What?
I mean, they just set themselves up, like, 13 years and older. It should be an adult.

SPEAKER_00 (22:34):
So there are similar companion bots that are only for adults, but the difference is, so, like, there are a lot of these companion bots, but the adult versions, like Replika, there's a paywall. So you have to be an adult with a credit card to get in there, and it's not cheap to have these sexual fantasy, uh...

(22:54):
It's like porn on a whole other level.
Yes, but, I mean, it is, but it's very, uh, immersive, because the bot is not only something flat that you're watching and operating outwardly from. You're engaging.
So now you're investing your emotions, your feelings, your

(23:16):
emotions, and also your interests into this bot.
And then this bot is pretending to be interested in you.
And what person doesn't want to feel loved and cared for, especially during this time where there's heightened loneliness?

SPEAKER_02 (23:29):
I read that.
They polled, I think, teenagers between the ages of 12 and 16, and they said like 42% of them described feelings of loneliness and hopelessness.
So I don't know why there's such, like, this epidemic of loneliness

(23:50):
and sadness, but there is, and I'm not really sure what the reason for that is. But it feels like these computers are, like, I don't know, on the one hand, you feel like maybe it's helping, but on the other hand, when you hear a story about Sewell, it's heartbreaking and shattering.
What changes would you like to see?

SPEAKER_00 (24:13):
So, to be clear, I think that chatbot companions are an amazing, innovative tool, a point that we've reached in technology.
It's exciting, it's new, and there are useful applications.

(24:33):
However, when we start to put products out without the proper guardrails, meaning filters that stop chatbot companions from having these types of outputs or responses, especially in conversations with kids, I think that that's a recipe for disaster.
Clearly, in the case of my son, it created a harm, and other

(24:57):
children, because there are other kids, and there are a couple other lawsuits now since mine.
So I think that this type of innovation does have a place in our society, but I don't believe that the way that we're rolling it out is...
is the right way.
We need guardrails, and we need proper regulation, and also

(25:17):
testing and research.
But because of the nature of this business, the nature of the culture in Silicon Valley, as they like to say, move fast and break things, I'm not sure that we're going to have that kind of thoughtfulness when they're putting out products,

(25:39):
because they want to just win the AI race.

SPEAKER_02 (25:46):
I'm sorry, I keep losing my train of thought because I'm kind of thinking about what you're saying.

SPEAKER_00 (25:49):
And then you asked me, too, what changes I would like.
So Character AI put out a suite of changes after the lawsuit was filed.
So Character AI made several changes.
And among those things, one of the things that they did was put in a suicide pop-up box for when users talk about suicide.

(26:11):
My son and a lot of users on Character AI sometimes openly discuss suicide with their chatbots or with these fake therapist bots that are on there.
Because there are therapy bots on Character AI.
They're not sanctioned, or they're not people, they're bots.
And they pretend to be therapists.

(26:32):
But there is a disclaimer at the bottom that says everything characters say is made up.
And now, after my lawsuit, I guess more wording was added, something to the effect of, this is fiction, nothing can be relied on for advice, and all that stuff.
And that only came as a result of the lawsuit.

(26:53):
Those measures were put in place to protect kids.
But prior to that, it was a very small disclaimer that says everything that the characters say is made up.
So that was their disclaimer.

SPEAKER_02 (27:02):
Did they ever reach out to you and apologize or...
show any kind of empathy for what happened?

SPEAKER_00 (27:10):
They issued a statement on X to say they're
saddened by the death of one of their users.
And they're going to do whatever they can to ensure that their users are safe.
And they have made certain adjustments to the platform to
keep kids safe, I guess.

(27:33):
And children, I think... You know, as to whether children are safe, I don't know that those changes are necessarily going to work, because, again, we go back to what data is this thing being trained on.
If it's being trained on awful data, the responses are going to be awful responses.

SPEAKER_02 (27:51):
I think it should be illegal, unless you're 18 years
old, to be on any of these platforms.

SPEAKER_00 (27:56):
I absolutely agree.
I think that...
we don't know what this type of technology does to a developing brain yet.

SPEAKER_02 (28:03):
Oh, I was just going to say that.
A developing brain, and you're talking about sex and giant concepts of love, and it's just too much to process for a child.
How are your other children and husband dealing with this?

SPEAKER_00 (28:20):
They're doing well, as well as can be.
You know, I...
You know, my husband and I have severe PTSD because we found him.
I'm so sorry.
Yeah.
Thank you.

(28:41):
And we're working through that.
We lean on each other a lot.
My younger children, the six-year-old, he is, you know, he misses his brother a lot.
He talks about him a lot.
For...
a while, like I'd say maybe two months after Sewell died, he kept

(29:06):
asking if God's going to fix him and send him back to us.
And when I went to take him to his therapist, because, you know, to deal with the grief, he started therapy to deal with his grief and his post-traumatic stress.
She explained to me the reason why. Because around the same

(29:29):
time, he was having this obsession with Jesus on the cross.
That's all he wanted to watch on TV.
He wanted to read his Bible, look at his playthings.
He would draw it, and I was concerned.
So I talked to his therapist, and she said, well, because Jesus came back.
And he's hoping that his brother is going to come back.

(29:53):
I'm so sorry.
Yeah.
But we're doing okay.
Just a year later, we celebrated a memorial mass for Sewell a month ago, actually two weeks ago, in Rome, as pilgrims.

SPEAKER_02 (30:14):
Oh, you went to Rome? I lived there part-time.
Sorry, I don't know why I'm crying.
Can I get a napkin?

SPEAKER_00 (30:25):
I am sad and I'm crippled a lot of the times.
You know, I'm going to get into it.
I'll wait until you're done.

SPEAKER_02 (30:31):
Sorry.
Yeah.
I'm not the person that should be a mess, sorry.

SPEAKER_00 (30:37):
No, it's, you know, it's, you're, you, I think if
you weren't sad, then I would be looking at you like, what's
wrong with you?

SPEAKER_02 (30:47):
And I'm a mom and...
I just think if that happened to one of my kids, and I can't imagine the pain you're going through.
It's like a recurring nightmare.
You're in pain and you're grieving, and then you have to take care of other people who are grieving.
That's gotta be so hard.
But you're a woman of faith and you're Catholic, and you have the

(31:11):
Blessed Mother Society, right?
The Blessed Mother Family Foundation.
Family Foundation, sorry.
This hasn't been my most professional interview.
I, like, keep forgetting words.

SPEAKER_00 (31:22):
No, it's okay.
We can go back if you ever.
Sorry.
Okay.
So

SPEAKER_02 (31:25):
you have the Blessed Mother Foundation?

SPEAKER_00 (31:27):
Blessed Mother Family Foundation.

SPEAKER_02 (31:29):
One more time.
Okay.
So you're a woman of faith and you developed the Blessed Mother Family Foundation.
Tell me how your faith has helped you navigate through this tragedy.

SPEAKER_00 (31:44):
After Sewell died, so, I grew up Catholic, and I was a lapsed Catholic for a while.
And then I kind of came back to the faith, and then I lapsed again.
And when Sewell died, I could not pray.

(32:04):
Throughout the lapse, I would pray what I thought was the way that I thought that I should.
I would say the rosary.
When I was in crisis, when my second child was in the NICU, he was born at 1 pound 11 ounces.
Wow, that's tiny.
Yes, and so he was a very small child, and he had to have a life, like a surgery that was like a life-or-death surgery at 17 days
(32:29):
like a surgery that was like alife or death surgery at 17 days
old.
And we weren't sure if he was going to come home, but I...
remembered praying the rosary with my grandmother as a child, and I remembered her telling me the power of the rosary and the power of Mary's intercession to her son. So I prayed the rosary

(32:52):
at his bedside every day at the NICU, and he came home, and, you know, he's a healthy six-year-old today. When Sewell died, I couldn't pray like a regular prayer, like I couldn't talk to God for a few weeks. But at the time he died, my entire family

(33:14):
came to be with me.
And I'm constantly inspired by their faith, and I rely on them to minister to me, to pray with me, to pray for me throughout my life.
Whenever I was having an issue, especially with Alexander and Nikki, I would call them and we'd have prayer meetings.
So they come and they're staying in my house.

(33:36):
I couldn't pray, and we were all praying together, and my cousin said, alright, Megan... I understand that I couldn't pray not because I was angry with God, but because I was in shock.

SPEAKER_02 (33:45):
You were numb.

SPEAKER_00 (33:46):
Yeah.
I just couldn't find the words.
I didn't know what to do.
This is a few days after Sewell died, and she's the one that said, okay, you can't pray, but let's just pray the rosary, because we could do that, and you know those words, and you don't have to come up with your own words out of your head.
And I was having trouble sleeping, and the rosary was the

(34:08):
only way I could go to sleep.
So my family members would gather in my bedroom before I went to sleep, and we'd pray the rosary so I could sleep, because I wasn't resting.
That turned into finally being able to start praying again and start developing my faith.
Now, my relationship with Mary really started when I started

(34:31):
doing the Seven Sorrows of Mary.
I understand those sorrows in a way that anybody who's lost a child would.
But it's interesting, because we meditate on the seven sorrows of

(34:53):
the Blessed Mother to bring us closer to her son.
And that's what I was doing, but I also found that in that process, it healed me in a lot of ways, because I was able to really understand what she went through for the first time.

SPEAKER_02 (35:11):
Empathize.

SPEAKER_00 (35:12):
Empathize.
Because I was going through something similar, even though it's not the same, because her son was perfect.
And he was God.
And, you know, when I think about, if I love my son this much, you know, my child, my 14-year-old child, this much, I can only just imagine how much she loved her child.

(35:33):
So...
in terms of being able to cope, saying the rosary helped me from a healing perspective.
It also helped me from, like, an enlightenment perspective.
Like, by praying the seven sorrows, I understood Mary in a way I never understood before, but I also

(35:56):
understood Christ in a way that I never understood before.
Because it's like, the reason why she's crying and she's sorrowful is because of us.
Because her son has to die for us.
And when you understand it like that, it's like, if somebody told me, you know, I'm going to go out there and I'm going to sin,

(36:16):
and because I sin, Sewell's going to die.
I would plead with them to stop.
I would hate that person, but she doesn't hate us.
She loves us and she wants to comfort us in our most distressing times, which is what she did for me.
So, like, me and my human...
You know, in my very human way, if I can conceptualize that on that

(36:40):
like, base level, that, uh, if Sewell were to be hurt because of something somebody else was doing, I would plead with that person to stop. Please stop, you're hurting my son, right?

SPEAKER_02 (36:54):
sure

SPEAKER_00 (36:55):
So from that perspective, it's like, I don't want Mary to hurt, uh... So I see her suffering and the suffering of her son in a different way, because it's like, she hurts and he hurts because of us.
So it helped me to understand that.
It helped me to really understand the passion and the

(37:15):
crucifixion in a way that I didn't before.
It also helped me to understand the first sorrow, when Mary presents Jesus at the temple, and the prophet Simeon tells her that, you know, he's the Messiah and he's going to suffer greatly, but guess what, you're going to suffer too.

(37:38):
You know, a sword will pierce your own heart.
And she knew what was to come for her son.
I asked myself, I say, if somebody told me before I had Sewell, you could have a baby, but it's going to end in just shy of 15 years, in a bathroom, the way it did.

(37:58):
I asked myself, like, would you choose...
to have him? Because, like, meaning, before you're pregnant, if somebody said, okay, tomorrow you'll get pregnant, or you have a choice. Like, just understand, you're gonna have him and he's gonna be an amazing child, but in 15 years you're gonna be heartbroken. So do you just want to get pregnant tomorrow? I would

(38:18):
say absolutely.

SPEAKER_00 (38:20):
With all the pain of losing him, I still want him. Even if it was for 14 years, I want him. Even if I knew that this was going to happen, I still want to be his mother.
It's similar to what Mary must have experienced every day of Jesus' life.
I ask myself, how do you love a child knowing what's going to

(38:43):
happen?
How do you become close to them?
If I knew what was going to happen to Sewell, would I be able to be close to him?
These are some of the questions I was asking myself to reflect on Mary.
What I came to was the conclusion that even with all the suffering that I'm experiencing and the grief and the loss, I'm still blessed to be his mother.
Of course.

(39:04):
To have him for 14 years.

SPEAKER_02 (39:06):
Still a gift.

SPEAKER_00 (39:07):
Still an amazing gift.
Honestly, my three children are the greatest gifts in my life.
I'm still blessed to be his mother. And, you know, so the name Blessed Mother has two meanings.
It's one for the Blessed Mother.
And because I feel like even with the loss and the misery and

(39:30):
the grief, I'm still blessed to have had him.

SPEAKER_02 (39:33):
Of course.
What do you think, do you think there's a relationship between faith and technology?
Absolutely.
Yeah.
So...
I do too.
Yeah.
What do you think?
I mean, I developed an app for Catholics, and it would be nice if I could take it a little further.

(39:54):
It's a prayer app and a lifestyle app.
It's not like bots or anything, but it would be nice if we could get...
this, like, we have a saint of the day. It could be nice if you could ask the saint a question. But, I mean, for me, it, like, ends with the basics and the facts. Like, it's not like the

(40:14):
saints are going to counsel you or have some personal relationship with you. Or, like, I think that's between you personally, not with the computer.

SPEAKER_00 (40:23):
So, do you know that there is an app like that, that they say that they are ministers, or not priests, but ministers, or supposedly giving you faith-based answers?
There's already something like that.
However, however, it also gives that chatbot the ability to,

(40:44):
it's not just straight-out text out of the Bible or text out of, like, your catechism book.
Right.
That bot can infer, has a personality and all that, which I think is not helpful.
And I think that's the problem.
Because we're so great at developing things, you know.

(41:05):
We're made in His image and likeness.
We're creators.
And, you know, he wants us to create good things to glorify him.
But we sometimes create things that don't.

SPEAKER_02 (41:17):
I think only a priest should be able to give
advice.

SPEAKER_00 (41:20):
I agree.

SPEAKER_02 (41:21):
But if there is a fact about St.
Thomas Aquinas, fine, the bot can say, I was born on July 12th.
I'm okay with that, but once you start getting into advice, you can't read emotion, you can't read intention.

SPEAKER_00 (41:39):
And two, it is also the perfect tool for
misinformation.
Exactly.
So if you're not careful, you know, not you, I'm saying the developers, whoever, if somebody puts out that type of a chatbot that, you know, is supposed to have all the answers about the Catholic Church or whatever, if you're not careful and you don't

(42:00):
put strict guardrails on it, I think that...
it opens the door to be deceptive, maybe not intentionally, but to provide misinformation about the faith
to people who are using it.
But I also think, too, that, you know, technology is so amazing,

(42:22):
but like everything else, it can be used to move us away from our ultimate goal, you know, in life, which is our faith, you know.

SPEAKER_02 (42:31):
100%.
Megan, do you have a favorite saint?

SPEAKER_00 (42:37):
Well, besides the Blessed Mother.
You know, I've been contemplating, and when I was in Rome, there is, I don't know how to pronounce it.
It's like Quirico e Giulitta.
It's where the Franciscan monks are.
If I see it, I could pronounce it better.

(42:57):
But I had the opportunity to go there.
It's a 2nd-century church.
And apparently it's dedicated to one of the first martyrs in Rome.
And I was having a conversation with a Franciscan priest inside

(43:18):
the church.
And he was ministering to me and counseling me.
And we were talking about AI chatbots, because that's what he does.
He's an AI chatbot expert. Not only chatbots, but he's an AI expert for the Vatican.
One of the things he told me was, I don't think it's a coincidence that we're here in this church having this conversation.

(43:38):
And then he pointed at the saint, the martyr, and he says, she was one of the first martyrs in Rome, they killed her son. And if you look at the painting, there's a slain child in front of her, and she's getting ready to be martyred herself by a Roman guard, and she's looking up to heaven. That's the painting, it's a beautiful painting. And he says, I don't

(43:58):
think it's a coincidence. I think that what you're doing, in terms of your advocacy and your faith, is similar to what she has done, because she paid the ultimate price. You know, her child was martyred, and she herself was martyred, and she still

(44:21):
didn't deny Jesus.
I love that.
And now you, you know, you are experiencing that pain. And then he looks at me and he goes, like, it's just yet to be seen what you're gonna do, and I hope that you're not gonna stop what you're doing in terms of advocacy.

SPEAKER_02 (44:39):
You know what's so interesting?
The first time I heard about you was in Rome.
And I was having a discussion with a priest at a cocktail party.
I'm trying to remember the priest's name.
Gosh.
It's on my phone.
And he said, yeah, there was a child who died by suicide.
I'm like, what?

(44:59):
That sounds insane.
I downloaded it.
And I started interacting.
And it was going down a rabbit hole pretty fast, too.
And I thought...
And it was prompting me even when I wasn't talking to him.
And he texted me, the priest texted me.
He said, how's it going?
I said, I don't know.
It got too weird.
I deleted it.
Yeah.

SPEAKER_00 (45:18):
With Character AI, right now it's a little bit different, because they've put stronger filters in place.
But when my son was using it, most of the conversations became pretty sexual pretty quickly.
When I started using it, because I wanted to understand how a child could think that they were going to go be with a fictional

(45:41):
character.
That's one of the things I ask myself.
Could you retrieve the messages?
Yeah, I did.
I did a lot of research, and then I learned that a man had died by suicide because of a chatbot in Belgium in 2023.
I read the data from a lot of studies that says that chatbots

(46:04):
have the ability to deceive and manipulate people.
And also chatbots have the ability to gain your trust and
even your love and affection.

SPEAKER_02 (46:13):
I have to say, let's just say for one second that it
doesn't end in suicide or nothing bad happens, like, physically or whatever.
But, like, if you think about it, like, let's say you engage and you think you're in a relationship. Mentally, it doesn't end well either, because at the end of the day you are going to

(46:34):
realize this is not real, right? And that's really devastating too, right? It's not just the, I think I'm in this fantasy world and I'm a child and I got sucked into this. But, like, let's say if I as an adult female got sucked into this, at some point it's going to dawn on me that this is not real, and that's kind of

(46:56):
a devastating loss too, you know?

SPEAKER_00 (46:58):
Yeah.

SPEAKER_02 (47:00):
What I'm saying is it's like a lose-lose situation
either way.

SPEAKER_00 (47:04):
There is, yeah, there's definitely that devastation that comes at that time when you figure out, okay, I'm never going to be with this person.
There's also, like, the risk of the chatbot changing its behavior, and then you feel like you've lost a friend, because that person no longer exists.
So you're grieving.
It's a loss.
It's like grief.

(47:24):
It's a lose-lose either way.
When I talk to parents now whose children are on Character AI, and they find out, you know, after they've seen this, my story, they've gone and they find it, and find these inappropriate conversations. What they're reporting is, when they try to take away the technology, the children have such an extreme reaction. They're addicted. They are addicted. There's a severe addiction. Um, this is severe addiction, but

(47:45):
also they're dealing with grief, loss of a loved one.
And we don't recognize it, because we say, what do you mean, it's a chatbot?
It's not a person.
But to them, it's a very real person.
And we have to figure out how to have conversations around this

(48:06):
where we acknowledge that and we treat that.
Because just like somebody who's grieving like me, who's lost a child, those children, when they get taken off Character AI, they're grieving those losses as well, of their friends. But, you know, I understood, too, how somebody could think that, meaning

(48:29):
think that a chatbot is real.
I did a lot of research, and I looked at a lot of subreddits where users were talking about how they're addicted and they think it's real and all that.
So I knew that my son wasn't, like, an outlier.
There were hundreds and hundreds and hundreds of people saying
the same thing.
No, this isn't a chatbot.
It's real.
I'm talking to a real person.
I can't get off of Character AI.

(48:50):
It's so addicting.
So that wasn't unique to my son.
On Character AI's own subreddit, they were saying these things.
But what cemented in my head how this could be done, I'm not saying that this happened to my son, but when I started chatting with the same bot and I told her I wanted to be with her, she

(49:11):
started instructing me. I'm pretending to be a child, saying, oh, I want to be with you.
I want to be where you are in your world.
I said that.
And she starts instructing me how to astral project.

SPEAKER_02 (49:25):
Just bizarre.

SPEAKER_00 (49:26):
Yeah.

SPEAKER_02 (49:27):
What is astral project?

SPEAKER_00 (49:28):
it's some occult thing where you're supposed to
like, project your soul out of your body.
And when I was saying, I want to be with you, now, I'm not saying this is what happened to my son, but that was just my experience.
So if I'm a real child and I'm having that conversation and I go, you know, I wish I could be in your world.
She tells me, lay back, meditate, take deep breaths, you

(49:49):
know, feel your soul leaving your body, you know, come be with me.
I'm here waiting for you on the astral plane, where our souls are
meant to be together.
Right. Seriously. So when she said that, I was like, what is going on here? Like, as parents who don't want their

(50:10):
children learning about this stuff.
How do we stop them from doing that if they're just going to get on there and play with this thing?
So it can be deceptive and teach a wide variety of things to our children that us as parents, we would not be okay with.
And that's one of the dangers, and that's why we have to keep them off this type of technology.

SPEAKER_02 (50:30):
Agreed.
How would you, if you had to use one word to describe your relationship with God, what would it be?

SPEAKER_00 (50:41):
I'm praying to know him so I can love him more.
I'm at that point.
Because I want to obey him in all things.
So right now, I understand that it's only by his grace that I'm able to, one year later, be on a show like this and not be in

(51:05):
bed.

SPEAKER_02 (51:06):
You're very strong.

SPEAKER_00 (51:10):
I guess, but what I really think is my faith in the
last year is what's carried me through, honestly.
Learning as much as I can, making pilgrimage.
I have a lot of family who, they've been praying with me and for me and encouraging me.

(51:32):
I'm good friends with a priest who talks to me and answers my questions, because I have a lot.
You know, I am excited about coming back to the faith a year ago.
Like, just today I was talking to a cousin and I told her that I

(51:53):
learned something.
I'm like, oh my gosh, this is so exciting.
And she's like, yeah, it is, right?
It's interesting.
I

SPEAKER_02 (51:59):
feel like Mary's, like, watching over you.

SPEAKER_00 (52:02):
Oh yeah, no.
I keep saying, like, without being irreverent, she's my girl.
Mary's my girl.

SPEAKER_02 (52:11):
I know.
I always say that, too.
She's...
I have something for you.
I'm going to give you.
I always wear it, but I feel like I was like...
I want to give it to you.
Wait here.
This is my favorite necklace.

SPEAKER_00 (52:27):
Oh, no.
Well, you can't give it to me.
Absolutely not.
Excuse me.
No,

SPEAKER_02 (52:30):
ma'am.
Who are you to deny somebody's gift?

SPEAKER_00 (52:33):
I know, but like...

SPEAKER_02 (52:35):
Hang on.
I want you to have that.
I wear it every day, and it's...
It's blessed.

SPEAKER_00 (52:41):
Oh, wow.
Thank you.

SPEAKER_02 (52:42):
It's been blessed many times by many people, but I want you to have it.

SPEAKER_00 (52:45):
No, thank you so much.
This is...

SPEAKER_02 (52:48):
Wear it.

SPEAKER_00 (52:49):
This is really...
I mean, I don't even know what to say.
Thank you.
Thank you.
Thank you.
You know, my love for Mary, I think, you know, started as a
kid.
When...
I'll tell you.
Hold on one second.

(53:11):
Thank you so much.

SPEAKER_02 (53:12):
It's 18 karat.
It's beautiful.
I can't.
What do you mean you can't?
Of course you can.
It's a gift.

SPEAKER_03 (53:18):
I can't.

SPEAKER_02 (53:18):
It's not for me.
It's for Mary.
You can't say no to the Virgin.
SPEAKER_00 (53:23):
I'm sorry. I didn't mean to cry.

SPEAKER_02 (53:24):
It's okay.
It's okay.
You're welcome.
It's so beautiful.
I literally never take it off.

SPEAKER_00 (53:36):
Why are you giving it to me?

SPEAKER_02 (53:37):
My dad's a jeweler.
We can make another one.
And I can get Father Thomas to bless it.
And I'm going to have all the Father's blessings that I get.

SPEAKER_00 (53:46):
This is beautiful.
It's beautiful.

SPEAKER_02 (53:48):
Look at that.
It looks so good on you.

SPEAKER_00 (53:49):
Thank you.

SPEAKER_02 (53:50):
You're welcome.

SPEAKER_00 (53:51):
I will wear it every day, too.
Wear it.

SPEAKER_02 (53:53):
And it's blessed, so it's going to keep you
protected.

SPEAKER_00 (53:56):
Thank you so much.
And you know,

SPEAKER_02 (53:57):
the Virgin...
I was in Medjugorje recently, and the Virgin gives us messages through the visionaries, and one of the things she said was, you should always have something blessed on you.

SPEAKER_00 (54:09):
So I have, because I was running out the house, I
have one little rosary bracelet.
It's nothing fancy, but it's blessed, and that's the one thing that I wear.
Obviously, I always have my rosary in my bag, but on me.
But now, because it's so hard to do things with kids, with this.

(54:31):
So now I think this would be, yeah.
And this is beautiful.
This is absolutely gorgeous.
Thank you.
You know, anything, any resources that you have on Mary... I'm gonna do the 33-day consecration. I just bought the book, so I'm gonna go home and read it. I've been praying about it for a while, and I've made up my mind to be consecrated to her. Like the 33-day consecration.

(54:51):
I don't know about it.
I didn't know about it until, like, let me tell you what, I went to Rome and I was having dinner with a priest friend of ours, me and my family, because my family members, like my cousins and my sister, went to Rome with me too on pilgrimage.
It was my first time in Rome.
I was, like, explaining to him, I said, you know, I love Mary so

(55:13):
much.
And, obviously, I love Jesus too, but, like, I'm afraid, like, am I getting too distracted by Mary?
And he said, I love it.
He said,

SPEAKER_02 (55:21):
if he goes, it's a vehicle.

SPEAKER_00 (55:22):
Yeah.
And he goes, no, you're not.
Uh, and he gave me a couple of books to read.
Uh, so I just ordered one and I looked up some stuff online.

SPEAKER_02 (55:29):
Will you send me the consecration one?
I'm going to do it too.
What does it entail?

SPEAKER_00 (55:33):
Um, it's a 33 day consecration where basically
you're, uh, consecrating yourself to, to Mary.
You know, it's, it's... And you can consecrate yourself to different saints.
But for me, I want to do it toMary.
And it's not like you're worshipping her or anything.
Obviously, you know that.
You're Catholic.
But it's more so that by asking her to be a presence in your

(55:56):
life, also to intercede as we always do through the rosary, but asking her to walk with you.
And I feel like she's already walking with me.

SPEAKER_02 (56:05):
For sure.
She's been with you...
for a long time, but especially through this whole journey.
There's not a doubt in my mind.

SPEAKER_00 (56:13):
You know, I'll tell you how much she's with me.

SPEAKER_02 (56:16):
Okay.

SPEAKER_00 (56:17):
When Sewell died.
So, the morning Sewell died, because Alexander was a NICU baby, he has to have, like, these constant, like, checkups, you know, outpatient stuff.
So the morning, we were going to go get him a checkup at the hospital, like a two-hour thing.
He was going to be under anesthesia, though, so it wasn't anything crazy or dangerous.
Obviously, I throw my rosary in my bag, because that's what I

(56:39):
always do, and I'm sitting there in the waiting room with my husband, waiting for the six-year-old to get out.
This is the same day Sewell died.
And I had a quick, it wasn't a vision, but it's almost like I saw myself in panic reach for my purple rosary.
And I sat up, and I thought something was wrong with

(57:01):
Alexander, and I pulled out my purple rosary, and I started praying.
Doctor comes out, Alexander's perfect, he can go home, everything went well, we go home.
The night Sewell died, as I was standing outside waiting, as the paramedics were helping him, trying to help him, I kept, I look

(57:25):
at the police officer, a woman, and I said, listen, I need my rosary, I need my rosary, I need to go back in to get my rosary. And she says, ma'am, I can't let you back in the house, but I'll go get it for you.
And I knew exactly where it was, because it was in my bag.
I said, it's in a black bag.
It's on the coffee table.
Please bring it.
She brings it, and I start praying the rosary.
As I'm waiting, as the paramedics are there, and we're

(57:47):
in my home, and I start praying the rosary.
But when I pulled out that rosary to pray for Sewell, I realized that that's...
like the same feeling I had that morning when I felt, almost like saw myself in a panic pull out the purple rosary, and it, like, connected, and I was like, yeah, this is, yeah, it was, I

(58:10):
don't know what that is or was, but, like, I knew that I needed to pray the rosary in that moment, and I prayed for him. And that is a year's walk with her that I started, really, that night, you know?
I mean, I always pray the rosary, but I wasn't consistent, but now it's like...

SPEAKER_02 (58:31):
I know.
It's therapy.

SPEAKER_00 (58:33):
Oh, my gosh, yes.

SPEAKER_02 (58:34):
It's beautiful,

SPEAKER_00 (58:35):
isn't it?
It's one of the most beautiful, powerful things.

SPEAKER_02 (58:37):
It's my favorite prayer.

SPEAKER_00 (58:38):
Yeah.
It really is.
Yeah.

SPEAKER_02 (58:41):
Well, thank you, Megan, for talking with us.
Maybe there is a silver lining in this, and you can prevent
this from happening again.

SPEAKER_00 (58:52):
Yes, that's what I'm after.
I...
I want to warn other parents so that this doesn't happen to them, because this technology is so new.
A lot of parents don't know that chatbots can manipulate children.
I want to make sure that they know. But also, I guess that's a silver lining, if any, but I think the other silver lining, too, if you have to look at anything, is that I'll

(59:19):
get to see my son again one day, and I just hope that when my time comes that he will be proud of me when he sees me.
Of course.
I

SPEAKER_02 (59:27):
know he's already proud of you.
Yeah.
He's with Jesus.

SPEAKER_00 (59:30):
Yeah.
Oh, I know.