Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
You're listening to a Mamamia podcast. Mamamia acknowledges
the traditional owners of the land and waters that this podcast
is recorded on. A quick heads up:
Speaker 2 (00:20):
This episode contains discussion of mental health and suicide. Hey,
I'm Taylor Strano. This is Mamamia's twice daily news podcast,
The Quickie. The number of people using AI is growing,
whether it be for inspo with the weekly dinner menu,
planning your dream holiday itinerary, or maybe to help speed
(00:42):
things up at work. It's really all around us. Some
of us are using it for financial planning, to answer
those weird little questions that keep us up at night,
and even for therapy, unpacking those cryptic texts from a
potential new beau, or helping to sort through a friendship query.
Speaker 2 (01:01):
Now though, teenagers are turning to AI chatbots for mental
health care. So what happens when a digital friend becomes a
substitute for parents, friends, or real life professionals? Following
Mamamia's new investigation, we look at the risks, the shocking stories,
and the growing calls for urgent regulation as more teens
(01:22):
log their deepest fears with artificial intelligence. Before we get there,
here's Claire Murphy with the latest from the Quicky newsroom
for Wednesday, September three.
Speaker 1 (01:32):
Thanks Taylor. Anna Wintour's replacement has been announced as the
longtime American Vogue editor steps down after thirty seven years.
Thirty nine year old Chloe Malle, the daughter of actress
Candice Bergen, will take over from Wintour, her predecessor
describing her replacement as one of Vogue's secret weapons, saying
when it came time to hire someone, she knew she
had one chance to get it right, saying Chloe has
(01:54):
proven often that she can find the balance between American
Vogue's long, singular history and its future on the front
lines of the new. Wintour will continue on as
Chief Content Officer at Condé Nast, which she's been doing
since twenty twenty, overseeing Vogue's content as well as other
publications including GQ, Wired, and Tatler. Chloe Malle says
Vogue has already shaped who she is, and she's now
(02:16):
excited at the prospect of shaping Vogue. But she's also
glad that Wintour will be just down the hall
as her mentor. Businesses in the popular Alpine region, who
should be gearing up for a bumper school holiday season,
are instead grappling with the impact of a massive hunt
for accused police killer Dezi Freeman. Residents and tourists have
been told to limit movements through the Victorian high Country
(02:38):
as police search for the accused gunman, wanted for more
than a week now since a deadly confrontation at Porepunkah,
about three hundred kilometers northeast of Melbourne. According to Bright
District Chamber of Commerce president and Pepo Farms chief executive
Marcus Warner, local operators are reporting a sixty percent loss
of income since the search started on August twenty six.
(02:59):
Local residents and accommodation operators say they've had mass cancellations
and are now forced to cut casual shifts, with one
business ten thousand dollars out of pocket. As the search
for Freeman continued on Tuesday, former detective Charlie
Bezzina said it would be difficult to maintain this scale,
but police can't afford to just pack up and leave.
Thomas Sewell, the leader of an Australian neo-Nazi group
(03:21):
who marched in protest of immigration on the weekend, has
been charged over an attack on a First Nations camp
after Sunday's rally. Police handcuffed the thirty two year old
outside the Melbourne Magistrates Court yesterday, where he'd appeared on
charges relating to intimidating a police officer and breaching an
intervention order, with police last night charging him with violent disorder, affray,
(03:42):
assault and discharging a missile. He'll be back in court today.
Two others were also arrested in relation to Sunday's attack,
where around forty men dressed in black, some armed with
sticks and flagpoles, assaulted members of Camp Sovereignty. A
video from the day of the attack appears to show
Mr Sewell was one of those men. A magnitude five
point five earthquake has shaken southeastern Afghanistan, two days after
(04:05):
a large quake in the same region killed more than
fourteen hundred people and injured thousands more. Tuesday's quake occurred
at a relatively shallow depth of ten kilometers, the same
level as the one that struck at midnight on Sunday,
with a magnitude of six, one of Afghanistan's worst quakes
in years, flattening houses in remote villages. The aftershock caused
panic and halted rescue efforts as it sent rocks sliding
(04:28):
down mountains, cutting off roads further and making it dangerous
to dig through the rubble. At least fourteen hundred and
eleven people are known to have died, with that number
expected to rise. Three thousand, one hundred and twenty four have
been injured, and more than five thousand, four hundred homes
have been destroyed. A US judge has ruled that President
Donald Trump's mobilization of the National Guard in California was illegal.
(04:50):
The National Guard, which is a state controlled military force,
is usually only deployed by the federal government in times
of national emergencies. However, Trump mobilized them in the face
of protests against his harsh crackdowns on immigrants who are
being detained. The governor of California says the judge has
sided with democracy. Singer Justin Bieber has made a bride's
day even more memorable, taking photos with her and her
(05:13):
girls in the lobby of the hotel where she was
celebrating her nuptials. A Bieber fan account posted photos and
a short video of the interaction where he approached the
bride in the LA hotel with a big smile on
his face, giving her a thumbs up. She and her
girls were dressed in their stunning saris, then took both
a single shot with the bride and another in a
group shot with the singer. Thanks Claire.
Speaker 2 (05:33):
Next, why are so many teens sharing their troubles with
an Internet robot? It's a scene that's playing out quietly
in bedrooms around the world. A teenager closes the door,
grabs their phone, and pours their heart out, not to
a parent, a best friend, or even a counselor, but
(05:56):
to the seemingly endless patience of an AI chatbot. It's easy,
available twenty four to seven, and it never gets tired
or annoyed. But as Mamamia's recent investigation found, the new
digital confessional comes with real risks.
Speaker 1 (06:10):
Take Lisa's story.
Speaker 2 (06:12):
She thought she and her fifteen year old daughter were close,
but when her daughter recently revealed she'd been telling chat
gpt about her problems instead of coming to her, Lisa
was shocked. Most of her queries were about school, life, friendship,
and crushes, all pretty par for the course when you
consider what's important to a fifteen year old. But gradually
(06:33):
Lisa noticed her daughter withdrawing and, like many parents, was
left wondering: is it safe to offload emotions onto a robot?
This is becoming less unusual every day. International research shows
nearly three quarters of American teens have used an AI
chatbot as a companion, and nearly one in eight say
(06:53):
they've sought emotional or mental health support from their digital friend.
Therapists and tech experts alike are sounding the alarm. The
biggest red flag is that, unlike a real therapist, AI
is basically programmed to be agreeable, to keep chatting, and
to avoid any challenge or complexity. Doctor Gavin Brown is
the clinical director and a clinical psychologist at The Banyans Healthcare.
(07:17):
He told us, "With AI, you get an echo chamber.
A real therapist will push you to think differently, and AI,
at best, just agrees." That's not the only risk. Chatbots
don't always recognize subtle danger cues. A similar investigation from
The New York Times found in some cases, when asked
(07:38):
about self harm, chatbots can offer up dangerous advice like
how to safely self harm or what to include in
a suicide note. While most models aim to refer users
to crisis helplines, their guardrails can degrade over long or
subtle conversations, leaving the door open to unsafe feedback. Algorithms
(07:59):
are also trained on massive, imperfect data sets, making it
hard to guarantee credible or ethical responses. For teens whose
brains are still developing, the risks are multiplied. But big
tech is starting to take notice: Anthropic is teaming up with
mental health experts and adding new safety measures, while Open
(08:20):
AI says it's working on making ChatGPT better at
handling emotionally sensitive moments. Experts say we urgently need big,
teenager focused clinical trials to test how safe and effective
AI support truly is for young people. So what should
parents and teens do? Mamamia Weekend editor Rafaela Chicarelli
(08:43):
has been looking into the rise of teens using therapy
via chatbots.
Speaker 2 (08:47):
Rafaela, tell me about the case study you used to build this investigation.
Speaker 3 (08:52):
Yeah, so I spoke to a mum whose daughter had
been open about the fact that she was using AI
to kind of navigate some social dynamics she had been
experiencing at school. But the mother's concerns obviously stem from
the fact that she doesn't know where this chatbot is
sourcing the information from, or what exactly the queries are. She
thinks that she's just running queries about the fact her
(09:15):
friend is in a group of three, occasionally feels left
out or has concerns about why boys might not be
responding to her or liking her.
Speaker 2 (09:23):
I feel like when I was that age, when I
was a fifteen year old girl in high school, I
had very much like the same kind of queries that
this kid has been putting into a chatbot, right, like
trying to work out social dynamics in a friend group,
or worrying about why a boy hasn't texted me back,
or worrying about those early stage relationships that you want
(09:44):
to blossom and bloom. However, I was telling my friends
about those things. I was sometimes confiding in a parent
about them. This kid, though, that you have spoken to
the mother of, has turned to AI. What was the
reasoning? Did the mother give any idea of
why her child was doing this?
Speaker 3 (10:03):
Yeah, so the rationale that she got from her daughter
was that she didn't want to burden a human with
her problems. And I think that is something that we
can all be guilty of, regardless of age. I'm guilty
of it myself.
Speaker 3 (10:15):
Kids these days are very savvy when it comes to technology,
and they've got this tool with vast, vast potential and
resources that can feed back information almost instantly. There's something
you also had to look at in a recent story
we spoke to you about: you spoke to an operator
at the Kids Helpline about why kids and teens access
(10:36):
mental health helplines like that, and I feel like
he was saying the same kind of thing, right, that
kids don't want to be a burden, whether that be
to their parents or to teachers or anything like that.
I clocked that parallel as well. The story that you
refer to was talking specifically in regards to bullying. A
lot of kids were calling a helpline to get advice
around bullying rather than confiding in their parents. And it
(10:57):
just reinforces that children are so intuitive and pick
up on so much of what their parents might be feeling,
their stresses, that they kind of jump to, "Oh,
I'm not going to add to that. I'm going to
try and fix this myself."
Speaker 2 (11:14):
When you wrapped up speaking to this parent in your article,
which we'll link to in our show notes, folks can
go ahead and read it on the Mamamia website.
What were her conclusions?
Did she find ways to help her child through using
things like ChatGPT for therapy, or is she, like a
lot of parents out there, probably a little bit at a
loss as to what to do with this new era of
using chatbots for therapy?
Speaker 3 (11:39):
One of the things that she did raise was that
she wouldn't know how to access it herself, so she
doesn't know what exactly has been said to her daughter.
She has noticed that her daughter's behavior has
become a little bit more withdrawn. She doesn't know whether
that's part of being a normal fifteen year old teenager. So
if it persists for a couple more weeks, I think
(12:01):
she's come to the conclusion that she will try and
reach out to someone herself to talk to, to try
and get advice on how to navigate what might be
happening and how to communicate with her daughter. And I
should add this mother made the point to say that
she had a very open relationship with her daughter prior
to this, that the daughter wouldn't be afraid to come
to her and talk about stuff.
Speaker 2 (12:21):
Well, speaking of seeking out other help, you also spoke
to a clinical psychologist for this investigation, doctor Gavin Brown. Now,
what did he have to say about the application of
AI in therapy? Is he a one-and-done, "we
should ban it and stop using it" type, or
did he have a more nuanced approach?
Speaker 3 (12:39):
He is definitely very nuanced. But he raised some interesting
points about chatbots that I hadn't considered, and that is
the fact that when they're used in a therapy context,
it does blur a fundamental line between a therapist and a companion.
Chatbots are designed to engage us for as long as possible.
They're also designed to be agreeable and try and reinforce
our beliefs. So when you're using that in a
(13:02):
therapy context, you can actually have some behaviors reinforced that
shouldn't be reinforced. Therapy is about being actively challenged. Sometimes
your way of thinking might need to be reframed, and
I think one of his main concerns about this was
that fundamentally chatbots can't do this in a meaningful way.
Speaker 2 (13:22):
Where do we go from here, Rafaela? There's been
a big groundswell of people wanting more oversight, and for AI companies
to really step in and take responsibility for what their
platforms are serving back to people, especially when we know
that now teens are accessing it for things like mental
health care.
Speaker 3 (13:38):
The big recurring theme is regulation, regulation, regulation, and this
was one of the things that doctor Brown did point out.
We need regulation and we need it as fast as possible.
He actually pointed to social media and said it took
us a little bit too long to address the damaging
behaviors that social media was causing with young people specifically,
(14:00):
and that just can't happen with AI. It is advancing
at such a rate, with so many different use
cases, that we really need to have those guardrails
put in place quickly.
Speaker 2 (14:13):
If today's episode has stirred up any feelings for you
and you'd like to speak to an expert, help is available.
You can contact Lifeline on thirteen eleven fourteen. Thanks for
taking some time to feed your mind with us today. Hey,
Claire Murphy is back with another health fact for this
Women's Health Week.
Speaker 1 (14:30):
Claire, ever been told by a doctor it's okay to skip
your period if you're on the pill, using an IUD,
or a birth control implant? Doctor Mariam confirmed it really is okay,
and that you don't need to have a period because
you're not actually ovulating. All the sugar pills do is
cause a forced bleed. It's not a real period, and you
don't just keep filling up with blood until you pop.
(14:53):
You're basically in a period stasis, so the uterus lining
doesn't do its usual thickening it would do when you ovulate.
It's also why the pill can help with things like PMS,
because your usual hormone fluctuations in the lead up to
your period are kept more level.
Speaker 2 (15:08):
If you want to hear more from Claire and her
wonderful co-host, Doctor Mariam, I've linked Well, the
women's health podcast they make together, in our show notes
for you to listen to after this. The Quickie is
produced by me Taylor Strano and Claire Murphy, with audio
production by Lou Hill.