
September 10, 2025 14 mins

Images of child sexual abuse generated by artificial intelligence are on the rise.

Australia’s eSafety Commissioner, Julie Inman Grant, says 100,000 Australians a month have accessed an app that allows users to upload images of other people – including minors – to receive a depiction of what they would look like naked.

Predators are known to share know-how to produce and spread these images – and in Australia, the AI tools used to create this material are not illegal.

All the while, Julie Inman Grant says not a single major tech company has expressed shame or regret for its role in enabling it.

Today, advocate for survivors of child sexual assault and director of The Grace Tame Foundation, Grace Tame, on how governments and law enforcement should be thinking about AI and child abuse – and whether tech companies will cooperate.


If you enjoy 7am, the best way you can support us is by making a contribution at 7ampodcast.com.au/support.


Socials: Stay in touch with us on Instagram

Guest: Advocate for survivors of child sexual assault and director of The Grace Tame Foundation, Grace Tame

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
I'm Ruby Jones and you're listening to seven AM. The
eSafety Commissioner, Julie Inman Grant, says one hundred thousand
Australians a month have accessed an app that allows
users to upload images of other people, including minors, to
receive a depiction of what they would look like naked.
Predators are known to share know-how to produce
and spread these images, and in Australia, the AI tools

(00:24):
used to create the material are not illegal and all
the while, Julie Inman Grant says, not a single major
tech company has expressed shame or regret for its role
in enabling it. Today, advocate for survivors of child sexual
assault and director of the Grace Tame Foundation, Grace Tame
on how governments and law enforcement should be thinking about
AI and child abuse and whether tech companies will cooperate.

(00:50):
It's Thursday, September eleventh, and just a warning, the following
episode contains details of child sexual exploitation. So Grace, it's
Child Protection Week and at the Grace Tame Foundation, you've
been focused on AI and the potential harms and the
risks for children. So to begin with, tell me why AI.

(01:13):
Why are you looking at that in particular?

Speaker 2 (01:15):
Well, technology has always been used to facilitate child sexual abuse,
and child sex offenders.

Speaker 3 (01:22):
Are often among the early adopters themselves.

Speaker 2 (01:25):
We go back to nineteen eighty six when the then
US Attorney General convened a commission on pornography.

Speaker 3 (01:30):
In their final report.

Speaker 2 (01:32):
They stated very clearly that child sex offenders were using
computer networks to trade child sexual abuse material, but also
to network with each other, share tactics of abuse and
procuring children. In the realm of safeguarding children, there's always
been a concern about technology, and as we're seeing in

(01:54):
recent years that technology is becoming more advanced. So if
we look at even just chatbots, where you can trick
a chatbot into generating descriptive child exploitation material, you can
use a chatbot to give you advice on how to
evade detection. You can ask it what it might do

(02:16):
or what might be the best course of action. You know,
in the case of actually being confronted by law enforcement,
what are the first things that you should say to
you know, mitigate any potential punishment. Then when it comes
to tools that are able to generate images, we're seeing instantaneous,
prolific creation of very realistic abuse material.

Speaker 3 (02:41):
And contrary to.

Speaker 2 (02:42):
Myth that this is harmless, all of these programs that
generate images or videos have actually been trained on photographs
of real children, if not existing child exploitation material. You know,
when we look at, for example, the nature of the
images that are being generated, that are becoming more and
more creative, and you know, it's a very low barrier
to entry for a lot of these programs. Whereas you know,

(03:03):
if you're producing child exploitation material in the traditional sense,
you know you would actually have to film real child
exploitation material, you would potentially edit it and then distribute it,
whereas now it's a few clicks to prompt programs that
are capable of learning to generate this material and then
spread it far and wide.

Speaker 4 (03:25):
You can download an image or two of your classmate, and
in most cases this is girls or women. You
put it into the online website and it generates a
deep nude, so an accurate depiction of what that person
would look like naked.

Speaker 1 (03:40):
And this is something that Australia's eSafety Commissioner recently
pointed out as an area of particular concern just how
prolific the use of these apps is.

Speaker 3 (03:50):
Do we have any sense of how many Australian children or
young people have used it?

Speaker 4 (03:54):
For these particular sites that we're taking action against,
we know at least one hundred thousand visits a month,
one hundred thousand visits a month from Australia.

Speaker 1 (04:03):
So what is the legal landscape here and how much
power does law enforcement have in what sounds like a
very rapidly changing space.

Speaker 2 (04:12):
Well, there is low-hanging fruit for governments, for legislators
to pick off, such as "nudify" apps, apps whose
sole purpose is nefarious. There's no good intention behind an
app that is designed to remove.

Speaker 3 (04:30):
The clothing of a clothed.

Speaker 2 (04:33):
Photograph of someone without their consent, presumably, so you know, outlawing.

Speaker 3 (04:40):
Those sorts of apps.

Speaker 2 (04:41):
But it is very hard to keep up, because whereas
in the past, with the development and release of technologies,
there'd sort of be a kind of a minimum time
frame where you could expect something to become publicly available,
now there are no limits, or there are very
few limits, and the technology itself is becoming smarter and

(05:05):
smarter and faster and faster. So it's very hard for
legislators to keep up. We need safety by design. We
need safeguards built into publicly available AI tools or online
spaces in general.

Speaker 3 (05:22):
In terms of actually enforcing the law.

Speaker 2 (05:23):
In terms of detecting crime, it becomes a case of
needing to fight fire with fire. It becomes a case
of needing to empower law enforcement with the relevant AI
software to combat these AI harms. We need victim ID
tools available to law enforcement, tools that also can be

(05:45):
applied to the reviewing process of child exploitation material to
be able to discern whether it is quote unquote real
or has been generated by a computer. So yeah, it's
a complex and rapidly shifting space.

Speaker 1 (06:03):
And when it comes to all of these apps, this tech,
the default is that they are legal until they're not.
So when it comes to legislation around this, where does
Australia sit globally?

Speaker 2 (06:14):
Australia is at once a front runner in certain areas and
then in others we lag. And some of that has
to do with our privacy laws, particularly what we
were just talking about before in terms of the limitations
on law enforcement here to actually use victim identification tools
to speed up the process of you know, otherwise very

(06:36):
drawn-out investigations. But in terms of legislating the use
of and the possession of certain AI tools that are
used for nefarious purposes, Australia is I think stepping in
the right direction there. There's obviously so much work to

(06:58):
be done. It's difficult when we consider the bigger picture
of the global system in which we are currently operating,
which is essentially a techno feudal system where you know,
even governments have no option, essentially, but to buy
in and it's really the tech industry that is able

(07:20):
to wield so much power and control.

Speaker 3 (07:26):
Coming up:

Speaker 1 (07:26):
Will governments worldwide stand up to tech companies?

Speaker 5 (07:41):
Let's get some more now on that national roundtable in
Parliament House where child safety advocates have been discussing the
threat artificial intelligence poses to child safety. Former Australian of
the Year and advocate for Survivors of child sexual abuse,
Grace Tame has been calling on the government to prevent AI.

Speaker 1 (07:59):
Being used to create child abuse material.

Speaker 6 (08:02):
You recently attended a roundtable at Parliament House which was
around discussing child safety online including AI abuse material. Can
you tell me a bit about how that went and
whether or not you get the sense that this government
is taking the issue as seriously as it should.

Speaker 2 (08:18):
I think it should be commended that we're able to
have these discussions that bring together a combination of politicians,
of people who work in the tech industry.

Speaker 3 (08:27):
Of law enforcement, lawyers.

Speaker 2 (08:30):
Child safety advocates, academics who are all essentially working towards
the same end, and that is keeping children safe from harm.
I find it hard sometimes not to be cynical. You know,
we've gone I think broad on change, broad on education

(08:51):
and awareness raising of the potential harms and the very
real harms faced by children both on and offline, but
we haven't gone deeper. I am always struck when I
have conversations with everyday people who don't work in
safeguarding children.

Speaker 3 (09:09):
I'm always struck by how little they know, which is to.

Speaker 2 (09:12):
Everyone's detriment and it's to the benefit of perpetrators. I
don't think that we, really, generally speaking, in the broader population,
have an appropriate level of awareness of just how
dark this world is.

Speaker 3 (09:28):
Yeah, so, I mean.

Speaker 1 (09:29):
It's obviously a complex thing to tackle, but specifically, what
would you like to see from the Australian government.

Speaker 2 (09:34):
So obviously there's been a lot of hype around consent
education and respectful relationships education rollouts in recent years, and
while I think they are significant and important milestones, the
next step is for grooming prevention education that is not
just targeted towards children, but targeted towards parents, teachers, childcare workers,

(09:59):
essentially anyone who has a duty of care to a
young person. And the reason for that is that grooming
is a highly specialized stratagem of preparing an environment for
child sexual abuse to take place in plain sight,
and it is, you know, in some ways parallel to
other forms of non contact offending, but it is very

(10:21):
specific to the context of adults harming children, and it
has been conflated in the process of developing these other
types of harm prevention programs. It has been dangerously conflated
again to the benefit of perpetrators who really thrive off

(10:42):
social confusion.

Speaker 3 (10:44):
So I would like to see grooming.

Speaker 2 (10:47):
Prevention education as the next key focus area in terms
of prevention.

Speaker 3 (10:53):
Broad education will not tick that box.

Speaker 2 (10:56):
It needs to be highly specialized because we're up against
a highly specialized cohort of offenders who often deliberately harm children,
and there's also the opportunity to criminalize the possession of
certain apps that are clearly solely designed to cause harm.

Speaker 1 (11:17):
And you mentioned the power that the tech industry wields.
Given that, what sense do you get of how successful
governments can be in getting their cooperation in changing the
products that they make.

Speaker 2 (11:30):
We've seen very little action on the part of governments
worldwide to actually stand up to these tech companies that
are producing products that they.

Speaker 3 (11:43):
Know are used to harm children. You know, with encryption.

Speaker 2 (11:50):
Now embedded in even platforms like Facebook, we're seeing offenders
grooming children on these platforms and going undetected because there's
no way to find these conversations.

Speaker 3 (12:03):
Then when it comes to tools.

Speaker 2 (12:06):
That are able to generate images, they are capable, obviously.

Speaker 3 (12:09):
Of creating things that don't exist in real life.

Speaker 2 (12:12):
So this ceiling of depravity that reality has doesn't exist online.

Speaker 3 (12:19):
And then the concern.

Speaker 2 (12:21):
Is as well that if you're getting individuals who are
producing and consuming this ever more depraved material, that when
they do actually come into contact with a child that
they are intending to harm, that they are going to,
you know, push those limits of violence even further because

(12:44):
they have been desensitized to, you know, the quote unquote
normal boundaries of interaction. But profit is the big motivator,
certainly over protection.

Speaker 1 (12:59):
Well, Grace, thank you for taking the time to talk
to me about all of this today.

Speaker 3 (13:03):
Thanks for having me on, Ruby. I really appreciate it.

Speaker 1 (13:17):
Also in the news today, Australia has condemned an Israeli
strike on Hamas targets in Qatar that killed six people,
including the son of Hamas's exiled Gaza chief and top negotiator.
Qatar is a security partner of the United States and
has acted as a mediator alongside Egypt in talks
between Israel and Hamas for a ceasefire in Gaza. Hamas

(13:38):
has described the attack as an attempt to assassinate the
group's ceasefire negotiation team. Foreign Minister Penny Wong says the
strike risks an escalation of conflict in the region, and
the Victorian government has said it will offer financial support
to businesses in the town of Porepunkah as the search
for alleged gunman Desmond Freeman continues. The details have yet

(13:59):
to be formally announced. Premier Jacinta Allan says the
community has been carrying a heavy burden since Freeman allegedly
shot two police officers prompting warnings to the public to
stay away from the area. I'm Ruby Jones. This is
seven am. Thanks for listening.