Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
You're listening to the Weekend Collective podcast from Newstalk ZB.
Speaker 2 (00:10):
In Parliament, ACT MP Laura McClure displayed a
naked image of herself to highlight the need for her
proposed bill. Now, when I say of herself, it wasn't
really. It was an AI-generated deepfake,
made in just minutes with technology that can be
found basically on the first page of a Google search.
(00:31):
Who goes to the second page these days? I don't know
about that. Anyway, Laura's Deepfake Digital Harm and Exploitation
Bill would restrict the creation of non-consensual deepfakes,
as well as expanding revenge porn laws, and ACT
MP Laura McClure is with me now. Good afternoon. Good afternoon.
How did this issue first come to your attention?
Speaker 3 (00:52):
Yeah, this issue first came to my attention because I'm
our party's education spokesperson and out and about talking
with schools and other parents. I'm also a parent, and this issue,
you know, a couple of years ago I was hearing
about it, and this year I'm hearing about it
far more frequently, and I just thought, look, something
has to happen in this space. It's become more and
(01:14):
more urgent.
Speaker 2 (01:15):
What data or info do we have about
actually how pervasive it is, or prevalent it is, in New Zealand?
Speaker 3 (01:20):
The data that we have is quite broad. It's
basically exposure to all kinds of pornography generally, for example,
so we don't have any actual data on deepfakes
themselves and their prevalence. But what we do know is
Netsafe last year reported that they are getting daily
calls from individuals, and they reported that callers were, you know,
(01:44):
very, very distressed on the other end, and it's becoming
a major, major issue. So while we don't have that data yet,
we do know that it is becoming a big deal.
Speaker 2 (01:52):
Is it more likely to be school-age kids as well?
Because, well, school kids do some pretty stupid stuff.
But is there an age where
it's more prevalent?
Speaker 3 (02:02):
Yeah, absolutely. Look, as far as I'm aware, it tends
to be happening in schools, and the data, especially
from the UK and Australia, is that it's ninety to ninety-five
percent females that seem to be victimized by
this type of bullying behavior. So yeah, it definitely seems to
be happening within the younger generations. I've heard of a
(02:24):
girl as young as thirteen that unfortunately attempted suicide over having
her image deepfaked into pornography, which was really, really
sad and really scary, and one of my main motivators for this.
Speaker 2 (02:36):
Yeah, can you walk us through your decision
to display the AI-generated image in Parliament? Now,
it's blurred out obviously in the media that I see,
but was it actually blurred out in Parliament as well?
And your decision to go ahead with that?
Speaker 1 (02:49):
Yeah?
Speaker 3 (02:49):
Yeah, absolutely, so it was blurred out. I mean, it
would be well and truly outside of Standing Orders for
me to hold up the nude, real or not. But look,
I ummed and ahhed about how to do this.
So at the moment, this bill is just a member's bill.
It's in the ballot. It has, you know, like a one-in-seventy
or one-in-forty chance of getting pulled
on a good day. But I believe that this issue
(03:11):
is really serious and I believe that it's something that
we need to tackle sooner. So the General Debate was
coming up and I thought, let's highlight this issue, let's
bring it to life and discuss what I'm doing. So
essentially it was to get attention to the issue,
so that we could actually have a big debate about
why it is that we need to do this. And
I think it's working so far. It's certainly
(03:34):
bringing attention to the issue, yeah, absolutely.
Speaker 2 (03:39):
Well, what about the response from your colleagues and across
different parties? Because, as you say, at the moment it's just in
the biscuit tin. And when you
consider the mental health implications, affecting especially young girls,
what's the chance that you actually might get this adopted
and supported by the parties?
Speaker 3 (03:59):
Yeah, so I'm really hopeful that this will get picked
up either as a government bill or potentially through the ballot.
So far, I've had an excellent response from across the benches, from
the likes of the Green Party wanting to co-sponsor
the bill, Labour have indicated that they would support it,
and also New Zealand First. So the reality is
we could skip the ballot and be debating this bill tomorrow
(04:21):
if I had the support of Te Pāti Māori or
National, or we could, yeah, we could make this a
government bill.
Speaker 2 (04:27):
So what's your bet? Are you going to have to
wait for the biscuit tin, or are you
optimistic that we might get some progress on this?
Speaker 3 (04:35):
Well, what I am hopeful for is that
we will get some kind of progress on this before
some young person takes their life over this. It's becoming
really serious, and like you said, the mental health implications,
the fact that there's no victim support. Schools are really
struggling with this issue. Do we wait till we see,
(04:55):
you know, something serious happen, or do we turn around
and say, hey, this isn't a political issue. We all
agree that this is becoming more and more serious, and
now is the time to do something about it.
Speaker 2 (05:05):
What does your bill look like at the moment? And
are there other jurisdictions, other countries, that have got laws
like this in place that we can look towards?
Speaker 3 (05:14):
Yeah, so my bill basically builds upon the legislation
that we already have. There was revenge porn legislation
passed through in twenty twenty-two, so essentially my bill
would just add AI synthetic deepfakes into the non-consented
image part. It's pretty basic. It's actually not a huge
technical change, and it would also amend part of the
(05:36):
Crimes Act, so you've got the Harmful Digital Communications Act
and the Crimes Act. It's mirrored a little bit off
the legislation from South Korea, because this became a huge
issue there, particularly again with their young people in schools.
You know, it used to be isolated to celebrities, potentially politicians,
but now it could be anybody. It could be you,
(05:57):
it could be your daughter, your mum. It is a
really, really scary thing, unfortunately, that comes
with this technology, which is wonderful and great, but we
do need to think about the behavior of individuals that
misuse it.
Speaker 2 (06:11):
And just being clear, at the moment, if somebody did
manufacture a deepfake of you or me or anyone else, is
that something they couldn't be prosecuted for under the current revenge
porn laws?
Speaker 3 (06:23):
There is a possibility, and there are people that have
taken up civil cases around this. However, the threshold is
really challenging, and because it's not defined, it is quite grey.
There is a loophole where prosecutions are difficult, so the
police actually often aren't prosecuting because they know this. So,
I mean, the reality is, yes, possibly there is an avenue,
(06:45):
but at the end of the day, it seems to
be only in the civil courts, you know, for defamation,
for example.
Speaker 2 (06:50):
Yeah, okay. Oh well, hey, good luck with it, Laura,
and here's hoping we'll hear about it. Well, here's hoping
you just draw it out of the biscuit tin and we
get it done.
Speaker 3 (06:58):
Yeah. Well, here's hoping that one of the other
political parties that we need to get it over the
line comes to the party, and we can start protecting
our young people.
Speaker 2 (07:07):
Excellent. Hey, Laura, I appreciate your time this afternoon.
All the best. That is Laura McClure, ACT MP.
Speaker 1 (07:12):
For more from the Weekend Collective, listen live to Newstalk
ZB weekends from three pm, or follow the podcast
on iHeartRadio.