Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Kia ora. I'm Chelsea Daniels and this is the Front Page,
a daily podcast presented by the New Zealand Herald. Is
New Zealand's legal system moving fast enough to adapt to
new technologies? It's a question being asked by some of
our top academics and MPs. The conversation around covert recordings
(00:28):
has made headlines again this week, as well as raising
questions around whether it's actually illegal. ACT MP Laura
McClure made global headlines after holding up a photo of
herself naked in Parliament. It was an AI-generated deep fake,
which McClure said took her only moments to create. So
do our existing laws protect victims from being abused through
(00:52):
rapidly developing technology? We discuss that later with the University
of Canterbury Professor of Law, Dr Cassandra Mudgway. But first
on the Front Page we discuss that viral deep fake
moment with ACT MP Laura McClure. So, Laura, what motivated you
(01:12):
to create this nude deep fake of yourself?
Speaker 2 (01:15):
Yeah, I wanted to highlight the growing or emerging trend
that is deep fake pornography, and realistically it's now starting
to happen for our youth in particular, and it's becoming
a huge problem at schools. I'm our party's education spokesperson,
so I'm out there talking to principals, teachers and parents alike,
and I'm also a parent myself. So I've noticed this
(01:38):
concerning trend over the last few years, and I thought
that I would create a members bill in order to
attack the loophole that we have and see if we
can do something about this.
Speaker 1 (01:47):
Are you surprised by the international reaction to this? I've
seen it picked up by a lot of international news outlets.
Speaker 2 (01:54):
No, look, I'm actually not, and I think this is
a worldwide trend. When we see the increase in technology
like AI, we often find that it is sometimes used
by bad actors, though most of the time it's used for good, as
we know. But I think that this is a growing
trend worldwide, so I'm not surprised to see it picked
up internationally.
Speaker 1 (02:12):
I saw that while Justice Minister Paul Goldsmith said they're
not currently considering adopting the bill, he'll meet with you
to discuss it. Has that meeting taken place?
Speaker 2 (02:22):
Not just yet, but I do believe I am meeting with him next month,
which I think is a positive step. I understand the
government has a very busy workload, but this is a
really big problem, and the stories that I've been hearing
are, like, harrowing, to be honest. Is this
government going to wait until somebody actually commits suicide over this,
or are we going to actually do something
about it?
Speaker 1 (02:41):
Well, I know that you need to convince sixty MPs
from across the House to support it so it can
skip that member's bill ballot process. How's that all going?
Speaker 2 (02:49):
Look, I think it's going really well. There's positive signs
from all of the parties in Parliament that if this
bill was pulled they would definitely support it, as far
as bypassing the tin. I've got great support from the Greens,
which is sometimes a bit unusual coming from ACT, but
they also share the same concerns, Labour and
definitely New Zealand First, and I'm talking with Hannah Rafferty
(03:10):
to discuss if she's on board too. So I've written
to all the party whips saying, look,
I think this is something a bit more serious and
requires a lot more urgency. Can we get this out
of the tin and see if we can get some
action on it?
Speaker 3 (03:25):
This image is a naked image of me, but it
is not real.
Speaker 1 (03:32):
This image is what we call a deep fake.
Speaker 3 (03:36):
It took me less than five minutes to make a
series of deep fakes of myself. Scarily, it was a
quick Google search to find the technology that's available. In fact,
I didn't need to enter an email address. I just
had to tick a box to say I was eighteen
and that it was my image and I could use it.
Speaker 1 (03:57):
Do you think we're moving fast enough as a country
to adapt to new technologies? If we wait for your
bill to go through the biscuit tin, get drawn, get debated,
go through three readings, there could be a brand new
tech or ways of doing things by then.
Speaker 2 (04:10):
Hey, well absolutely there could be. But I believe
my bill would protect from any synthesized images.
whether it's done via the current tech or future tech,
I think it would cover us into the future. And
I do believe it's something that we absolutely need to do.
And this tech is moving really fast, like day by
day there are new sites, new apps that can do
all of this kind of thing. And I think, you know,
(04:30):
as lawmakers, we really need to consider, one, how
can we adopt AI for good. I think that's a
real positive thing, but what can we do when people
use it for malicious purposes?
Speaker 1 (04:40):
What are some of the horror stories you've heard from
parents and students and teachers?
Speaker 2 (04:44):
Yeah, I've heard quite a lot of harrowing stories that
are really motivating me to push on with this and
actually show how urgent it is. One of the main stories,
or the key stories that I've been hearing, is about
a thirteen-year-old girl, and I've got a twelve-year-old
boy, so thirteen still feels awfully young to me.
She was deep faked and it was shared amongst her
peers at school, just year nine, and she attempted suicide
(05:07):
on school site. It was absolutely terrifying and traumatizing, not
just for the individual, but for her peers and
the school, and the lack of support and resource around
this is really terrifying. That would be on the extreme
side of things, but other than that case, I've heard
of many, many other young people, predominantly females, who this is
happening to, whether it's at school, university for example,
(05:29):
or within the workplace.
Speaker 1 (05:30):
Thanks for joining us, Laura. Thank you. Last week, Michael Forbes,
a former deputy press secretary in Christopher Luxon's office, was
(05:51):
exposed in a Stuff investigation for having allegedly made audio
recordings of sex workers and photographed women on the street
without their consent. This case, alongside the calls for
a crackdown on deep fake porn, has highlighted the issues
facing our legal system in keeping up with new technology.
To discuss further, we're joined now on the front page
(06:13):
by Dr Cassandra Mudgway from the University of Canterbury. First off, Cassandra,
what are some of the current laws in place around
covert audio recordings? I imagine they must be illegal in
some way.
Speaker 4 (06:30):
So there are some crimes under the Crimes Act around
covert audio recordings, but they only cover audio conversations where
the person who's intercepting (which is what it's called, interception)
is not party to the conversation itself.
Speaker 1 (06:48):
Right. So if I was to covertly record someone, just
in a conversation between us two, that's okay?
Speaker 4 (06:54):
Yes, yes. So if one party consents to that, then
that would be legal.
Speaker 1 (07:00):
What about in the case where, I don't know, say
it was a covert recording of a session with a
sex worker?
Speaker 4 (07:08):
So that is a sort of an intimate
audio recording, and that is lawful, so it's not illegal
under the Crimes Act. The Crimes Act only covers intimate
video recordings.
Speaker 1 (07:21):
What about recording or filming in public places, things like
a woman walking down a street, at the gym, extreme
close ups, that kind of thing.
Speaker 4 (07:30):
Yeah, so intimate visual recordings are prohibited under the
Crimes Act, which is the non-consensual creation, possession, or
distribution of those. That definition requires that the person that
you're covertly recording is in a place where they have
a reasonable expectation of privacy. And secondly, that person is
(07:51):
in a state of undress or engaged in sexual activity,
or otherwise maybe showering or going to the bathroom. If
you have COVID taking of photographs or videos of women
in public places, the obvious fish hook that I see
is that the place where the women are being captured,
generally when you're out using public spaces are so walking
(08:11):
down the road or in a supermarket or even using
a public gym, you have a lesser expectation of privacy
than you do in your own private home. And even
where you could argue that perhaps zooming in on certain
body parts, where that becomes the subject of the
image or video, perhaps it could be recognized that you
have a higher expectation of privacy there, the image itself,
(08:34):
according to the definition, must be of those body parts
being exposed in some way, so naked or in your underwear.
So it's not easily captured under that slate of criminal offenses.
Speaker 1 (08:47):
Does any of this count as harassment?
Speaker 4 (08:51):
Potentially. It really depends on the circumstances. So under the Harassment Act,
harassment involves a pattern of behavior, so at least two instances,
and that includes specified acts, which does include following or
watching, as well as other behavior that causes a person to fear
for their safety. So covert filming could meet this definition
(09:13):
if it's repeated and or if it causes fear or distress. However,
proving intent or knowledge regarding that fear can be a
legal hurdle, I think, especially when the conduct is framed
as more voyeuristic rather than threatening. It's probably
also difficult to prosecute if the women who are being followed
(09:33):
or watched never become aware of that. I can
see that being difficult to prosecute.
Speaker 1 (09:38):
Yeah. Does it seem like the laws are a bit
outdated here?
Speaker 4 (09:42):
Yeah, I think that this situation has really exposed some
gaps in the law, particularly when it comes to those
intimate audio recordings, which the women who brought this forward
were shocked wasn't a criminal offense. And generally
this is a really good opportunity, I think, to really
step back and have a look at our current laws,
(10:03):
and while they address some forms of non-consensual recording,
as we've discussed, they were designed with specific technologies and
behaviors in mind. So that slate of criminal offenses I'm
talking about was created in two thousand and six.
We're talking about the influx of digital cameras. It wasn't
created with everyone having high-powered cameras just
(10:25):
on your phones like we do now with smartphones. So
as a result, the criminal offenses that we have fail
to capture newer, equally harmful conduct like the covert intimate
audio recordings, but also other things like synthetic media abuse,
like non-consensual sexualized deep fakes. So I think that
we need to really look at law reform in a
(10:45):
sort of in a different way, something that is more
technology neutral, and perhaps look at what kind of harm
we want to protect people from. So what
do we want to protect? We want to protect sexual autonomy,
bodily integrity, privacy, and a real focus on that lack
of consent, I think.
Speaker 5 (11:08):
Initially we figured that there was a gap in the law.
It's something that was raised as far back as twenty eleven.
Speaker 6 (11:15):
It's not criminal harassment. I feel this case
is so extreme. It's the most extreme end of cyber
harassment. But it could be used to say, okay,
this probably doesn't have any.
Speaker 5 (11:32):
That has been a constant query from the victims
I've spoken to.
Speaker 7 (11:37):
Why, why can the government not step in and put
protocols in place to protect people? Surely there must be
something out there that, you know, the government can do.
Speaker 5 (11:48):
Well, sorry, it turns out our laws are pretty good.
Speaker 1 (11:55):
Last year the Herald released Chasing Ghosts: The Puppeteer,
which looked at the case of catfish Natalia Burgess.
They looked at the legal side of trying to stop
her and found there were laws that could target her,
but not a specific anti-catfishing law or anything like that,
Despite a lot of the victims wanting something like that
(12:17):
put forward. Do you think we trend towards being, I
suppose, too specific in some of our laws on issues like this?
Speaker 4 (12:23):
Yeah, because catfishing is a kind of impersonation scam,
so it was sort of like a romance scam; colloquially,
that's kind of how we refer to it. And if
you are too specific, too tech specific, it could create
gaps later on. I would assume that would be something
around fraud or blackmail. But part of the purpose of
(12:46):
criminal law, I think, is to signal public condemnation of
certain harmful behavior. There is a purpose to explicitly criminalize
a specific behavior. You are making it very clear that
this particular behavior is criminal behavior, and if you are
convicted of that offense, it carries the stigma of
(13:06):
committing that particular behavior. And that's what we call fair labeling.
So I can see how that could be a really
good argument for making a specific catfishing offense or a
specific deep fake offense. You know, I can see that
being very useful. But again, it runs that risk of
being too tech specific, and you know, you can find
gaps in the law later when new tech is developed
(13:29):
and there are new ways to commit this kind of
harm in the future.
Speaker 1 (13:34):
What kind of laws would you put forward to kind
of capture all of the things that we've been talking
about and to make sure that it's kind of future proofed,
I guess.
Speaker 4 (13:43):
Yeah, I think it is just going back and thinking about,
as I said before, thinking about law reform in a
principled kind of way, focusing on what we want to
protect and what is the harmful conduct that we want
to criminalize, and that focus on a lack of consent.
I think that's a whole of society sort of conversation
(14:04):
to be had.
Speaker 1 (14:05):
It's interesting as well, because these things happen all the time.
Look at the stalking laws that have just come through,
or been passed. I mean, we do have the ability
to look at these kinds of consent issues and create
legislation to cover it, which covers things like covert
recordings and taking photos of someone in the street, ay?
Speaker 4 (14:23):
Yep, yep. The new stalking law, although we don't know what the final
form is going to be, probably would have been helpful
in this situation and get sort of an earlier police
intervention as well.
Speaker 1 (14:34):
Thanks for joining us, Cassandra. No worries. Thank you. That's
it for this episode of the Front Page. You can
read more about today's stories and extensive news coverage at
nzherald dot co dot nz. The Front Page is
produced by Ethan Sells and Richard Martin, who is also
(14:56):
our sound engineer. I'm Chelsea Daniels. Subscribe to the Front
Page on iHeartRadio or wherever you get your podcasts, and
tune in tomorrow for another look behind the headlines.