Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
You're listening to the Sunday Session podcast with Francesca Rudkin
from Newstalk ZB.
Speaker 2 (00:13):
And joining me today on the panel, we have ZB
host Roman Travers, good morning, kia ora, and we have the host
of the Front Page podcast, Chelsea Daniels. How are you, Chelsea?
Speaker 3 (00:23):
Good morning, Chelsea.
Speaker 2 (00:24):
Good to have you both with us. Hey, I want
to have a little chat first up about sperm donors.
This is a really interesting topic which is coming up
because as of this year, children born as a result
of sperm or egg donation can find out the identity
of their biological parents. This was a law that
changed around two thousand and four, two thousand and
five. Up till then you couldn't, and now when
(00:47):
you're eighteen you are able to access that information. But
what we are seeing is there are a lot of
people who won't fit into that category who are also
discovering that they may have children. Donor-conceived children
may discover they have siblings and parents through these
DNA testing websites, these ancestry websites, and I reckon this
(01:10):
is really tricky, Roman, because you might not
know that you are donor conceived. You may have presumed,
if you were a donor, that it would be confidential,
and people are finding out in ways where there isn't
a lot of duty of care to look after these
people, to help them deal with this information. We need to be
more transparent about this, don't we?
Speaker 3 (01:32):
We do. And it's a big issue, isn't it. We're
not just talking about flipping a couple of dollars to
someone as a donation. Creating a life or being part
of that process is incredibly important. Is there anything more
important that you will do as a human being? Probably not,
apart from winning Lotto and paying IRD. But I
reckon there could be a catalogued process. It could be
a two-stage process. Perhaps the person who is created
(01:54):
has the choice to go and see who you are
as a donor, and step two, if the donor accepts,
is to let that person contact them, because you know,
some people just want to help out and create
more babies and not be known. And I think it's
a really emotionally fraught issue. What about you, Chelsea?
Speaker 2 (02:09):
It's life changing information, right, Chelsea?
Speaker 4 (02:12):
Yeah? Absolutely. And it throws up a lot of questions. It's
an ethical dilemma, isn't it? Because you've got people who
may have donated on the assumption that they would remain anonymous,
But then you've got people wanting to find out where
their roots are.
Speaker 2 (02:26):
And I think, I don't know.
Speaker 4 (02:28):
It's a really tricky one because obviously I haven't been
in this situation, but I am a firm believer
that biological, you know, family doesn't always matter.
Some people choose their family. So I think if
(02:48):
you want to learn more about your medical history, for instance,
I think that that would be a good way to go.
But it's definitely a two way street, isn't it.
Speaker 2 (02:57):
The problem is there's been very little oversight or regulation
around this field for so long that has kind of
protected maybe the parent and the donor without thinking about
the rights of a donor-conceived child. Yeah, and they
are now growing up, going, well, hang on a minute.
I should be allowed to find out, and I should know,
(03:18):
and we should have good records on this. And I'd
like to know how many siblings I've got. I'd like
to make sure I don't date them. I'd like to
make sure.
Speaker 1 (03:24):
You know exactly.
Speaker 3 (03:25):
I think that there can't be anything more emotionally charged.
To Chelsea's point about all that important stuff like family history,
familial hypercholesterolemia or anything that might run through the genes,
that's important. And to live and die without knowing who
your dad was, you know, I'm not particularly close to mine,
and he may as well have been a sperm donor,
but you know, it's something that would be at
(03:47):
the heart and hovering over you like a cloud for
your whole life, not knowing that stuff.
Speaker 4 (03:53):
I think I understand that. And of course, you know,
we're talking about finding out who your dad is, but.
Speaker 2 (04:01):
Is dad really the right.
Speaker 4 (04:04):
Word for it?
Speaker 2 (04:05):
So as well.
Speaker 4 (04:06):
Logical with that, But then you've gone through your life
and you've got a dad already, so that kind of
I find that relationship really interesting, Like what would that
throw up in terms of your dad and now finding
out that you have a biological dad that would be
incredibly disruptive.
Speaker 2 (04:26):
Well, and you don't know what relationship the other person
may want or desire or be prepared to give. I
think you're right, though. I think there is some basic
information that people may need. Take the relationship out
of it, whether it's medical, you know, and having
access to that I think is really important. But we
(04:47):
just didn't do it and we haven't been able to
make it available for so many decades. So it's going
to be really interesting and I think, gosh, I'd love
to hear from somebody if they had discovered something very
interesting when they were using one of those ancestry websites
and how that all came out. Guys, I want
to move on to AI nudes, so we're on all
the big topics today. So Australia has basically banned them,
(05:10):
should New Zealand follow suit? Australia is making moves
to criminalize non-consensual digitally manipulated sexual material, or deepfakes,
which, I don't know, makes a lot of
sense to me. We constantly let technology dictate how we live
our lives and what we do, and our laws very
(05:30):
rarely keep up with them. We can't follow it.
Speaker 3 (05:33):
We can't even tell if an email from the ANZ
is from the ANZ anymore, can we, let alone
whether her boobs are real. Are they? You know?
You can't reach out and squeeze them to find out.
There's all sorts of weird stuff and I have to
say I'm a little bit disgusted by us as a
human species. We have let the Internet govern us when
we're all now going, ah, let's control it. We could
(05:54):
have done this from the outset. Then we'd know if
those boobs were real or not.
Speaker 2 (05:57):
You know, seven years' imprisonment for sharing or creating that
kind of material in Australia these days. Well, that's
what the case is going to be. Chelsea, would you like
to see the law clarified here in New Zealand to
make sure that deepfakes, you know, would be covered
as well in our law?
Speaker 4 (06:15):
Absolutely. And could you imagine going online and your friend's like, well,
you know, you've got a photo out there.
Speaker 2 (06:20):
I can see your bits and it's not you.
Speaker 4 (06:25):
So it's happening with celebrities a lot at the moment,
I see. But it's a very thin line as to
when that crosses over to the general public as well, right?
And in terms of that, there's the ethical side
of it. You know, you don't want people manipulating the
image that we all put out onto the internet
willy-nilly. But also sex workers.
Speaker 2 (06:45):
They have a job, you know.
Speaker 4 (06:47):
This is another case of AI taking over an entire industry of women.
I mean, sex workers have had a really awful time
for a very long time, even now in twenty twenty four,
so why jeopardize their money-making endeavors?
Speaker 3 (07:03):
That's a bit of deep thinking.
Speaker 2 (07:04):
Are you concerned about the future of OnlyFans, Chelsea?
Speaker 3 (07:07):
Yes, I mean.
Speaker 4 (07:08):
You've got to respect them. Imagine
a job, imagine a job where everyone thinks it's awful
and hates you, like, say, journalism. You go up against a
lot of, you know, hatred and vitriol just to earn
a crust. And these women, I think, are doing an
incredible job. They're filling a gap in the market that
(07:29):
we've all needed since the beginning of time, and
this would jeopardize them.
Speaker 3 (07:34):
I think it's fair to say in terms of AI,
certain people have benefited from AI. Look at Donald Trump.
He makes more sense when it's not him, when it's
a computer talking for him, because you know, he often
just opens his mouth to change feet. So AI has
benefited some people like Donald Trump. That's all I've got,
and dead silence.
Speaker 4 (07:54):
So you've got: AI equals Donald Trump.
Speaker 3 (07:57):
That's all I got.
Speaker 2 (07:59):
I've got to move on really quickly. We spoke before
to Doctor Michelle Dickinson about the state of our water
bottles, and the bacteria that you find in our
reusable ones is way more than you would
find on your toilet seat if you don't clean them
on a daily basis. So, oh dear. Roman, okay,
how often do you... do you have a
(08:19):
reusable water bottle?
Speaker 3 (08:20):
I do, and I've also been a very keen cyclist
who almost, very nearly, qualified for the Tour de France
in my imagination. And when you open a bidon, which
is a cycling bottle, often around the valve there is
mold, and it's just water and a bit of electrolyte.
So you've really got to wash them properly.
Speaker 2 (08:35):
Chelsea.
Speaker 4 (08:36):
Oh, they're terrifying.
Speaker 2 (08:37):
Are you a washer?
Speaker 4 (08:40):
I am a washer, yeah, because I get really sick
really easily, and once, yeah, like Roman says, you see
a little bit of that mold, I need to
go hell for leather with the soapy water.
Speaker 2 (08:51):
So sorry, just clarifying, Roman: you don't wash your bottles,
obviously, as often as you should?
Speaker 3 (08:57):
My cycling bottles always go through the dishwasher,
even though I've washed them by hand, but I have
those great big Yeti steel bottles with the great big wide
mouth, like Donald.
Speaker 2 (09:06):
Yeah, you're fine then, as long as they've got a
wide mouth and you can get that scrubber in. All right,
you guys are quite good then, you're quite hygienic. I'm impressed.
Thank you both very much for joining us.
Speaker 1 (09:17):
For more from the Sunday Session with Francesca Rudkin, listen
live to Newstalk ZB from nine a.m. Sunday,
or follow the podcast on iHeartRadio.