Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
You're listening to the Sunday Session podcast with Francesca Rudkin
from Newstalk ZB.
Speaker 2 (00:13):
Right. So, this week we saw MP Laura McClure share a
nude pic of herself in Parliament. It was a bold move,
but the picture wasn't Laura. It was an AI deep fake,
an image created by Laura herself to support her proposed bill
to criminalize the creation and sharing of non-consensual deep fakes.
The UK has already made moves to crack down
on this, citing a four hundred percent increase in deep
(00:34):
fakes since twenty seventeen, and the US has also introduced
legislation. ACT MP Laura McClure is with me now. Good
morning Laura, thanks for your time.
Speaker 3 (00:42):
Good morning, thanks for having me.
Speaker 2 (00:45):
Okay, so, standing up in front of Parliament with a
photo of yourself, a nude photo, even though it was a fake,
what was that like?
Speaker 3 (00:54):
Absolutely terrifying. I almost chickened out at the last minute.
My colleague Nicole McKee was next to me and she's like,
you've got this, it's worth it. So yeah, absolutely terrifying.
It was blurred, and clearly I know the image is
not actually me because I made it, but it still
(01:15):
was absolutely terrifying. And yeah, I really feel for the
victims that have had this happen to them. So it felt
like I had to do it to make the point.
Speaker 2 (01:24):
And what was the point you were trying to make?
Speaker 3 (01:27):
So the point was to get the attention of the
other MPs in the House to raise the issue of
how concerning it is and how prolific this issue is becoming,
particularly with our young people at high schools, for example,
because I'm our education spokesperson, and I'm out talking
to principals and parents quite frequently, and I'm a parent myself.
So this deep fake kind of abuse has really increased
(01:52):
in the last sort of three to five years, and
it's becoming more and more prolific, and it's really doing
quite serious harm. I'm hearing some really sad stories. So yeah,
it just blows my mind that this is not included
in our current legislation, and I think it's a gap
that we need to address really urgently to stop this
kind of harm happening.
Speaker 2 (02:12):
Yeah, I agree. How long did it take you to
create that image?
Speaker 3 (02:18):
Less than five minutes or so. I just opened
up the Google browser, turned off the safe filter,
typed in deep fake nude, and there were literally
hundreds of different websites. The first one that popped up
was a blog recommending the sixty top sites to do
this on. I didn't need to download an app, I
didn't need to create an account. All I needed was
(02:41):
a headshot of myself, which obviously is, you know, out
there readily available. And I've heard of this happening at
schools, where people have had their image taken off, like,
the school server for example. So yeah, it didn't take long
at all, less than five minutes, and I had ten
images of myself that looked startlingly like me, in places
(03:04):
that could be in my home, like the bathroom, the bedroom,
the kitchen, for example.
Speaker 2 (03:09):
So anyone can do this. I think you have to
tick, like, yes, I'm over eighteen, but was that
about the only thing stopping you doing this?
Speaker 3 (03:17):
Only yes, I'm yes, I'm over eighteen, and yes I
give you know, I have the consent to your.
Speaker 2 (03:22):
All right, okay, which is completely, yeah, yeah, okay.
Speaker 3 (03:25):
It's completely irrelevant exactly.
Speaker 2 (03:27):
So what would this bill do?
Speaker 3 (03:30):
Yeah, so with this bill, we've already got a framework in place.
We have a revenge porn kind of framework that
came in in twenty twenty two. So all it would
do is add synthetically created images, i.e. deep fakes,
and essentially we already have all of the descriptions
(03:54):
as to what it is in place, and it's
basically ready to go. A couple of words in
the Harmful Digital Communications Act and also the Crimes
Act, and that would set this up to be an official
crime. Because at the moment, without it being well defined,
(04:14):
people are actually using this to get off. It's a
bit of a loophole, unfortunately. And also the police find
it hard to get a prosecution, so they don't really
do much when they hear of these cases. So
we need something in place, and we also
need help for victims, because there's no victim support at
the moment, and schools are just crying out for help
with dealing with these issues.
Speaker 2 (04:33):
So it sounds like the legislative aspect of it is
relatively simple, as far as legislation goes.
Speaker 3 (04:42):
Absolutely. And look, I mean, in twenty twenty two there
was an amendment put through to include deep fakes, I
think by the Green Party at the time, and ACT
supported it, but unfortunately it didn't get the full
support of the House to get it across. But it really is
a matter of just changing and adding in a new
definition as to what would be considered harmful. I think that,
(05:06):
like everybody can agree that it is quite harmful when
you've got images out there that look exactly like yourself.
And it's not just images, it's actually like pornography as well,
so we're talking all kinds of degrading, dehumanizing videos that
are out there as well.
Speaker 2 (05:25):
This is a member's bill at the moment, Laura. How
is it going to be possible for you to make
this a government bill? Because it seems to me, I
don't know why we're wasting time on this. This technology
just keeps going faster and changing faster than we can
keep up with. Why aren't we making a move on this?
Speaker 3 (05:40):
I absolutely agree with you. Look, it's something that I
obviously feel very passionately about. And the more that I'm
researching this topic and having my bill out there, the
more sad stories are coming to me, and you know,
like, I'm very motivated to try and push this to
become potentially a government bill. I've got the support of New
Zealand First to do that; it's just convincing my other
(06:01):
coalition partner that this is, you know, quite an urgent issue.
But it also has support from across the opposition side, because
I do think that this isn't a political issue. I
think everybody, you know, whether they're female, whether they've
got a mum or a sister or a daughter, for example,
can all relate that they would not want this to
happen to them. So look, I think I do have
(06:21):
quite a lot of support out there. It's just whether
we can either make this a government bill or whether
I can get the support to potentially skip the ballot even.
Speaker 2 (06:29):
Can you tell me a little bit about the impact
it has had on people? You mentioned you have
been contacted by victims of deep fakes. Is
there a lack of understanding and appreciation for just how
damaging these are?
Speaker 3 (06:43):
Yeah, look, I do think so, and I think when
you're dealing particularly in the youth space, that's where
we've seen the biggest rise in this kind of abuse.
I call it that because it is abuse, realistically, or
it's kind of a bullying mechanism. But the types of
cases that I'm hearing about, you know, young girls...
(07:05):
one of the victims that I've heard about, she
was of a religious faith, not originally from New Zealand,
and she got deep faked by somebody in her class,
and not only was that completely and utterly humiliating, it
was also really, really bad for her family. So she
(07:26):
went on to have quite significant issues, wouldn't return
to that particular school, which is somewhat understandable, and is
now still struggling with the ongoing mental health issues from this.
And there's not a lot of support, and there's not
a lot that the schools can do, you know, other
than telling them that it's, you know,
a really bad thing to do. I think if
(07:47):
we had this actually as a crime,
it would send a message to our young people to say, hey,
this is actually how serious it is. It's not just
a funny joke with your mates. A line in the sand.
Speaker 2 (08:01):
I mean, will the police be able to actually do
anything more than they're currently doing, or is it more
changing society's perception of it? I like the idea
of offering more support as well.
Speaker 3 (08:12):
Yeah, definitely. Look, I think it's both. I
think it would be sending a signal to society
that, hey, this actually is harmful and it's not acceptable,
so we've got that. And secondly, it gives police the
ability to actually come and look at potential prosecution for,
you know, really bad cases, or potentially some sort of rehabilitative
(08:33):
program, depending. You know, I mean, a judge would have
the ability to imprison someone for up to two years,
but they'd have lots of other options here. I don't
expect it to be a hugely punitive thing with our youth. However,
I do think it would set an example, and it
would give schools the opportunity to access victim support, because they
can't access that currently. So that is really, I
(08:53):
think, super important. And it gives the schools a way
of, you know, dealing with this issue as well. And look,
I think if we could get onto this,
we might actually be able to nip it in the
bud before someone takes their own life. Because in
one of the cases I've heard about, a young
thirteen-year-old girl attempted to take her
own life over this. So it is really serious, and
(09:14):
I think, at what point, you know, are we going
to do something about this? Or are we going to
wait for somebody to take their own life?
Speaker 2 (09:19):
Yeah, ACC currently covers therapy for sexual abuse victims.
Would somebody who is a victim of a
deep fake crime be eligible for that funding
to get help for their mental health?
Speaker 3 (09:36):
Look, I'm not sure on that technical point, but I
would expect them to be under the same category as
the revenge porn and so I would expect it to be.
But at the moment, definitely there is no specific support
it or available unfortunately, Laura, really and especially for the
families too.
Speaker 2 (09:53):
Yeah, no, absolutely. Laura, really appreciate your time this morning.
Good on you for getting up there with your
fake nude and bringing this to attention, because I'm
with you as a parent of teenagers, and I'm hearing a lot
about the impact that these deep fakes are having,
and they're coming for all of us, and it's already rather terrifying.
So thank you very much.
Speaker 1 (10:11):
For more from the Sunday Session with Francesca Rudkin, listen
live to Newstalk ZB from nine a.m. Sunday,
or follow the podcast on iHeartRadio.