
January 13, 2026 5 mins

The UK is introducing a law making it illegal to create non-consensual intimate images following growing concern over Elon Musk's AI chatbot. 

Officials in countless countries are condemning the lack of regulation of sexualised content on the app X, formerly known as Twitter, including depictions of children. 

ACT MP Laura McClure, author of the Deepfake Digital Harm and Exploitation Bill, told Andrew Dickens she’s glad to see some action being taken, but the problem is far wider than X and Grok.  

She says there are hundreds of websites that can do this, so the first thing we need to do as a government or society is make this behaviour illegal.


See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
You're listening to a podcast from Newstalk ZB. Follow this
and our wide range of podcasts now on iHeartRadio.

Speaker 2 (00:16):
All right. So, the UK's communications regulator is investigating X,
formerly Twitter, over concerns about its AI Grok being used to
generate sexual images. Now, the concerns started after it became
something of a trend to use Grok to generate images
of public figures and politicians and celebrities, including children, even
when they're fully clothed in the photo. You say to Grok,

(00:39):
put them in a bikini and boom, there they go.
If you have the paid version, you can say make
them completely naked and there you go. So the Elon
Musk-owned social media responded by claiming the UK was
trying to censor free speech and put the ability to
generate sexualized images behind a paywall, which is why you
have to get the premium to do the naked ones. I
probably shouldn't have told you that. Laura McClure is an ACT MP,

(01:01):
the author of the Deepfake Digital Harm and
Exploitation Bill, and she joins me now. Hello, Laura, good morning.
Is the UK doing the right thing?

Speaker 3 (01:12):
Look, I think the UK has the right to be
concerned about X and Grok. I do, however, have
concerns about targeting a specific platform, because I know
in my line of work that there are literally hundreds
of different websites and technologies where you can do this.
It's actually quite prolific and the issue's far wider than

(01:36):
just X. But absolutely they have the right to be
concerned about it.

Speaker 2 (01:40):
Yes, and that was exactly going to be my next question.
Why is the UK only targeting Elon Musk's AI when
many other AIs are doing the job? So do you
think it's politically motivated because of the owner of the company.

Speaker 3 (01:51):
Oh, look, of course I do think there is
a bit of political motivation behind it. What I'm actually
happy about is that we are seeing a bit of
action on this and that we're actually highlighting that there
is a real problem. But like I say, it's
not X and Grok, it is actually everywhere.

(02:12):
There's hundreds and hundreds of websites that can do this, and
for me in New Zealand, it's a real concern because
that behavior itself isn't illegal, let alone the technology.

Speaker 2 (02:22):
Yeah, exactly, which is why you introduced your bill about
deepfake nude images, and you highlighted all of that
in your bill that's been doing the rounds for a while now.
What do you want to regulate?

Speaker 3 (02:36):
Yeah, so my bill actually, before we get to that
regulation part, is about saying that the behavior in
itself is explicitly illegal, because the Harmful Digital Communications
Act and the Crimes Act don't include AI or
synthesized images, which means there's a bit of a loophole.

(02:56):
And I think the very first thing that we do
as a government or a society is to say that
this behavior is illegal. Big platforms, and I'll use
X as an example, but there's many of them, actually
say they'll comply with local laws. So look, if it
was illegal here in New Zealand, we'd expect them to
say that part of their technology wasn't available. So at
the very minimum, I think that's where we need to start.

(03:18):
We need to say, you know what, this is so
damaging and hurtful and harmful, particularly for our young people,
that we say that this behavior is illegal. And that's
what my bill does as a start.

Speaker 2 (03:29):
Elon Musk's defense is that this is censorship of free speech.
So ACT, famously libertarian, where does your party draw the
line between freedom of expression and harmful content online?

Speaker 3 (03:42):
Yeah. Look, I think it's pretty obvious. And we do
not want to suppress people's freedom of speech and their
use of social media platforms, because we know that social
media and those platforms are often spaces for political debate and where
people find communities and things like that. We draw the
line at the behavior of individuals and their personal responsibility

(04:04):
to not harm other people. So that's where we say, actually,
you know, it's illegal to share somebody's actual nude in
New Zealand, it's illegal to share revenge porn, for example,
and some of these images are so good now and
so real, and also it's the degrading things that
you can animate them to do, for example, that we
say that, actually, you know what, that is harmful enough

(04:26):
to be illegal. But as far as the technology itself
and using the sites, I think we do need to
be careful and we do need to be mindful of
people's rights and freedoms. And I think that's why when
we target the behavior, we still allow the technology to
exist. We say that, actually, it's illegal to do this.

Speaker 2 (04:44):
Okay, and where is your bill at?

Speaker 3 (04:47):
Yes. So my bill was drawn from the tin, which
was a really lucky thing to have happened towards the
end of the year, and it's now going to be
read for the first time, probably in the first quarter
of this year. So I'm looking for support from across
the House so that we can get this through into
a Select committee, so we can hear from those people
that have been affected by this kind of technological abuse

(05:11):
I call it, and, you know, we can
have this bill become law, hopefully by the end
of the year.

Speaker 2 (05:17):
Laura McClure, I thank you for your time today. ACT MP,
author of the Deepfake Digital Harm and Exploitation Bill.

Speaker 1 (05:23):
For more from Newstalk ZB, listen live on
air or online, and keep our shows with you wherever
you go with our podcasts on iHeartRadio.