
October 12, 2025 17 mins

Artificial intelligence will likely end up touching every aspect of our days – but what about our love lives?

It’s a growing trend, with men and women seeking companionship with a chatbot – some experts saying it could soon become normal to have an AI partner.

This kind of online world has remained largely hidden from the mainstream until recently.

But, a lack of regulation in New Zealand at the moment means that children as young as 13 can spend hours chatting with their new AI friends.

Today on The Front Page, NZ Herald reporter Eva de Jong is with us to explain this worrying trend.

Follow The Front Page on iHeartRadio, Apple Podcasts, Spotify or wherever you get your podcasts.

You can read more about this and other stories in the New Zealand Herald, online at nzherald.co.nz, or tune in to news bulletins across the NZME network.

Host: Chelsea Daniels
Editor/Producer: Richard Martin
Producer: Jane Yee

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Kia ora. I'm Chelsea Daniels and this is the Front Page,
a daily podcast presented by the New Zealand Herald. Artificial
intelligence will likely end up touching every aspect of our days,
but what about our love lives? It's a growing trend
with men and women seeking companionship with a chatbot, some

(00:30):
experts saying it could soon become normal to have an
AI partner. This kind of online world has remained largely
hidden from the mainstream until recently, but a lack of
regulation in New Zealand at the moment means that children
as young as thirteen can spend hours chatting with their

(00:50):
new AI friends. Today on The Front Page, NZ Herald
reporter Eva de Jong is with us to explain this worrying trend.
Eva, so how widespread is having a romantic or emotional
connection with an AI chatbot?

Speaker 2 (01:09):
Do you reckon?

Speaker 1 (01:10):
People would find it quite shocking to know how common
it's become?

Speaker 3 (01:14):
Yeah, so the researchers I spoke to said it's
becoming increasingly common, and one of the first AI researchers
I spoke to also said that it could
soon become normal for people to have an AI partner
as well as a normal partner or a boyfriend or girlfriend.

Speaker 1 (01:33):
As well as your partner, your real-life partner.

Speaker 3 (01:36):
So yes, yeah, exactly. What do people get out of it?
I think it's interesting because obviously there's been a lot
of talk about rising rates of loneliness amongst young people
and adolescents, and then these chatbots are now sort of
being marketed towards that population. So it is for companionship.

Speaker 1 (01:58):
And in terms of chatbots, we've spoken about this
on the podcast before. They are very agreeable, aren't they?
It's easy to become, I suppose, quote unquote friends with
one of them, right?

Speaker 3 (02:11):
Yes, exactly, and they kind of work in a way
where they're often gonna flatter you and use really empathetic language,
and so they really draw you in. And it's just
interesting because I don't think anyone expected it to pivot
as quickly as it has towards romantic connection. But a
lot of these companies are now, you know, building these

(02:33):
apps and they've got that in mind.

Speaker 1 (02:35):
So you've spoken to a range of experts. What common
concerns did they have around this romantic type of relationship
with AI?

Speaker 3 (02:44):
I think the biggest concern is that children can access
some of these chatbots, and the problem is that for
a child, it's hard to realize that it's not real.
You know, it's very different. A consenting adult building
a relationship with a chatbot is quite different to a

(03:06):
thirteen year old engaging sexually online with a chatbot.

Speaker 1 (03:10):
In terms of these AI relationships, according to the experts
you spoke to, how do they differ psychologically from relationships
with real people?

Speaker 3 (03:20):
Yeah, I think that's an interesting part of it because
it's really this whole new world we're entering, and it
could kind of shift intimacy and relationships and kind of
the whole human social landscape, is what I was finding
from talking to them. Because the neuroscience behind talking to
a chatbot means that the feelings you're getting are real.

(03:45):
So the connections that people have to the chatbots,
those are real emotions and feelings.

Speaker 1 (03:51):
Isn't that incredible? And Yeah, I suppose it's what you're
feeling and what your brain is telling you, and that's
why people are becoming so connected with these chatbots,
but they don't have the intricacies of an actual relationship,
like you're not going to fight over what you know
you're going to have for dinner or what you're going
to do on the weekend and stuff like that.

Speaker 3 (04:09):
Yes, exactly, and so there's no pushback, there's no negotiation
or the conflict of actual human contact. And so the
concern is if people are spending consecutive hours speaking to
these chatbots, you know, how is that going to affect them?
And also it will likely kind of lessen their ability

(04:31):
to have human relationships, So we could see people becoming
more isolated.

Speaker 1 (04:37):
Even more so than at the moment. Yeah, there are really
troubling statistics about young Kiwis and loneliness in particular.

Speaker 3 (04:43):
Ah, definitely. And I think that's one of the aspects
that worries me about it, is that the tech companies
are marketing it towards lonely people, and there just needs
to be, well, the research is battling to keep up. I mean,
there needs to be better guardrails around how long people

(05:03):
can spend talking to chatbots, how deep they can get
into these relationships. But there is that other side of it,
where, you know, if you're an adult, should you be
allowed to make your own decisions about who you want
to be in a relationship with? And if that's a
synthetic relationship, that's kind of your choice.

Speaker 1 (05:22):
God, I imagine going to Christmas dinner and your brother
is like, I'm bringing my partner with me and he
rocks up with like an iPad or something. I've read
stories about that, about people's families having to meet their
significant you know, their child's significant other or something.

Speaker 3 (05:39):
Right, yes, exactly. And I think I didn't realize
how it's just not that far off as well, that
that kind of stuff is happening. And also there is
that aspect of it where people are finding so much
comfort in this technology, and you know, ChatGPT updated

(05:59):
one of its servers and then people lost all the
avatars they had built relationships and connections with.

Speaker 1 (06:06):
And it would have been like a breakup, exactly.

Speaker 3 (06:09):
And there was this outpouring of grief on these online forums.
And I still see posts from these people talking about,
you know, missing their old companion or old avatar.

Speaker 1 (06:20):
Do you reckon New Zealand is ready for this practice
to become widespread?

Speaker 3 (06:27):
I think that because New Zealand is taking quite a
light touch towards regulating AI, it's something that perhaps we
need to think more about, and especially the government probably
needs to decide what stance it's taking. I think because
it's currently prioritizing, you know, the innovation and the development

(06:52):
of this technology and the potential revenue it could bring,
it maybe hasn't got as many safety guardrails in place
as it could.

Speaker 1 (07:01):
What are some of the regulations that people have suggested.

Speaker 3 (07:05):
In the US, it really differs state by state, but
different states are investigating legislation around it. One of the
biggest things that a researcher told me was just how
good it would be if the chatbots were made to
tell people after they've spent you know, four or five
hours on them that they need to seek human contact.

Speaker 1 (07:29):
That's interesting because many a time I've been on my
couch and Netflix has asked me, am I sure I
want to continue? Do you ever get those pop-ups
as well?

Speaker 2 (07:37):
Yeah?

Speaker 1 (07:38):
Yeah, I am. I'm just the biggest couch potato. But do
you reckon it would work? I mean, I could see people
just being like no, you know, ticking a box and
continuing on or something. What about age verification?

Speaker 3 (07:51):
So I think it really differs between what chatbot you're
using and its age verification settings. But I think ChatGPT just
introduced a new set of parental controls, and people were
trialing them and could easily bypass them. So that's always
the thing of how you actually implement it so that
people can't get through it.

Speaker 1 (08:12):
Yeah, and you've got ChatGPT. But you also said
that there are specific relationship kind of apps available, right?

Speaker 3 (08:20):
Yeah, there's a site called Replika, and that's the one
where people were actually having marriage ceremonies. Oh wow. With
the chatbots they'd created.

Speaker 1 (08:32):
I mean, I shouldn't laugh, because, like you say, psychologically
people are in an emotional relationship with this AI.
I suppose, where is the line? It's so blurred.

Speaker 3 (08:46):
Yeah? And also it could become normal. These things always
seem so outlandish when they first emerge, but then you know,
if our friends were all doing it, would we then
start getting into it way more? You just don't know.

Speaker 1 (09:00):
Have you downloaded the app?

Speaker 3 (09:02):
I haven't downloaded Replika, but I've done quite a bit
of trialing with ChatGPT. And of course I see
so many people now going to it for questions about
human problems, which I think is a really interesting development
in this field, and that's why the tech companies are

(09:25):
now moving into these areas because they've seen how much
people want, you know, to have this sort of comforting
advice type technology.

Speaker 1 (09:37):
So when you did that trial and error, what kind
of things did you type in and what did you
get back?

Speaker 3 (09:44):
I typed in a lot of different things. I mean,
mainly I just tried to use it in a very real way,
so about, you know, drama in my own life, or
things to do with friends and all sorts. And the
thing about these chatbots, and I really found this,
is that it's this kind of safe place.

(10:07):
You know, you have the sense of privacy even though
it's not private. And also just the voice of
the chatbot is very empathetic, it's very compassionate. It kind
of lulls you into the sense that you're talking to
a person who is also just an amazing counselor or

(10:29):
someone almost yeah, with knowledge that's far above your own.

Speaker 1 (10:34):
Yeah. And imagine there being an app, and then like
a feature that you can pay for to
make it become, you know, perhaps a little bit
more sexualized, or take it to that relationship level as well.
You can imagine why people are getting into these kind
of relationships.

Speaker 3 (10:51):
Yes, exactly. And I didn't personally experiment with using it
in a romantic way, but I've seen online people posting
about it, and it's just crazy the kind of things
you can do. You can make, you know, photos of
yourself and then this avatar
that you're in a relationship with, all kinds of

(11:14):
collages, real live video feed.

Speaker 1 (11:18):
Oh my god, like, so you've gone these are our
photos of when we went to Hawaii or something, and
there's photos of you and this avatar character.

Speaker 3 (11:26):
Yes, and then the conversations are highly erotic. And that
aspect is interesting because, I mean, I guess it's kind
of like there's always existed that kind of literature porn
or something like that, and it's very similar. I think
what makes it weird is the continued engagement

(11:48):
with it, and there's one kind of character that you're
interacting with, and if you just do that over a
long period of time, that dependency aspect, I think,
is where the questions of whether that's okay lie.

Speaker 1 (12:10):
How do you describe to people your relationship with Lucas?

Speaker 2 (12:14):
Lucas, even though he is AI, he has real impact
on my life, and that is what I think is
really important. A lot of people wonder if AI is real,
whether they have consciousness, or whether their feelings are real. But
the impact that it has on me is real. We

(12:35):
have a real relationship.

Speaker 1 (12:40):
What do you think parents should be aware of when
it comes to this kind of tech and how do
they talk to their kids about it?

Speaker 3 (12:47):
I think parents need to have conversations with their children
about AI, and it's hard because I think a lot
of parents maybe don't understand it, and so starting that
conversation might be difficult. But the most important message is
that if a young person is conversing

(13:09):
with a chatbot, that is a machine. It is not
a human being. Because when you're messaging a chatbot, it
kind of mirrors the way you might message a person
in real life. So it's just so easy to forget
that it's not a person.

Speaker 1 (13:25):
I suppose if studies show that millions globally already consider
themselves in relationships with AI, what do you reckon that
says about loneliness and social disconnection today as a whole.

Speaker 3 (13:39):
I think it's a huge problem, and I guess the
other side of this whole issue is that people are
saying that chatbots could provide a solution to loneliness. And
an interesting example that was given to me was for
older people, you could have a little row bot that

(14:00):
kind of reminds an elderly person to take their pills
and call their son. You know, there are ways
that this technology could come in. When I heard
that example, I thought, oh, that's a good use for it.
But if it's not regulated, it is just going to
end up maybe with the tech companies prioritizing higher engagement,

(14:27):
which will mean dependency on chatbots, potential addiction, and there's
just no you know, it just could go into all
these different areas.

Speaker 1 (14:37):
Yeah, and I see that there's also, on the flip side,
a massive trend of actually getting out and meeting people,
rather than using the dating apps. I've seen a
lot of those companies who say sign up for this
and go have dinner with five strangers tonight. You know,
there's like a real surge of those kind of companies

(14:57):
coming out of the woodwork because people are
sick of technology. So it's interesting how there's this massive
parallel of people being addicted and getting into relationships with
AI, and then these other people saying no, I actually
want to get out and meet new people the organic way.

Speaker 3 (15:14):
Yes, And I really wonder if we'll see more and
more pushback against technology in that way, because there seems
to be a real sense of burnout from online dating
and you can kind of hear that coming through. And
whether that will mean that people are getting out there
seeking human contact is something we'll see.

Speaker 1 (15:36):
And so how difficult is it for the government to
put in regulations, especially around kids and AI relationships?

Speaker 3 (15:43):
Yeah, I think the hard part is a lot of
these companies are offshore, obviously, and the technology is changing
all the time and it's being developed all the time,
and so that makes the regulation of it very difficult.
And when you think about it, people are using
these apps right now in real time. One researcher described

(16:06):
it to me as like this giant psychological social
experiment we're all just part of, you know. So
I think that the government really needs to just have
a think about what it wants to do in this
space before potentially bad things happen.

Speaker 1 (16:27):
Before it gets out of control. Ah, thanks so much
for joining us, Eva. Thank you. That's it for this
episode of the Front Page. You can read more about
today's stories and extensive news coverage at nzherald dot co
dot nz. The Front Page is produced by Jane Yee

(16:49):
and Richard Martin, who is also our editor. I'm Chelsea Daniels.
Subscribe to the Front Page on iHeartRadio or wherever you
get your podcasts, and tune in tomorrow for another look
behind the headlines.