
November 25, 2025 15 mins

The year is 1934, and the setting is Scotland. Startling photos have emerged, appearing to capture the Loch Ness monster.

Fast-forward 90 years to 2024, to an election campaign in Australia, and a video is released of a prime minister promising to ban gambling ads.

Both are fake – designed specifically to deceive. And while humans have always had a tendency to mislead each other, what’s new is the technology.

It’s now easier than ever to create highly realistic fake content. And we’re only just starting to see how wide-ranging and insidious the impact will be.

Today, independent senator David Pocock – on his new bill to crack down on deepfakes – and why he thinks the government has dropped the ball on regulating AI.

 

If you enjoy 7am, the best way you can support us is by making a contribution at 7ampodcast.com.au/support.

 

Socials: Stay in touch with us on Instagram

Guest: Independent senator David Pocock

Photo: AAP Image/Mick Tsikas

See omnystudio.com/listener for privacy information.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The year is nineteen thirty four, the setting is Scotland.
Startling photos have emerged, appearing to capture the Loch Ness Monster.
Fast forward ninety years to twenty twenty four to an
election campaign in Australia and a video is released of

(00:20):
a prime minister promising to ban gambling.

Speaker 2 (00:23):
Ads on all forms of gambling advertising, digital and broadcast.

Speaker 1 (00:28):
A fake designed specifically to deceive. And while humans have
always had a tendency to mislead each other, what's new
is the technology. It's now easier than ever to create
highly realistic fake content and we're only just starting to
see how wide ranging and insidious the impact will be.

(00:48):
I'm Ruby Jones and you're listening to seven AM today
Independent Senator David Pocock on his new bill to crack
down on deep fakes and why he thinks the government
has dropped the ball on regulating AI. It's Wednesday, November
twenty six. Senator Pocock, thank you so much for joining me.

Speaker 3 (01:16):
Yeah, great to be with you.

Speaker 1 (01:17):
So you've been in the public eye for a very
long time now, and I wonder have you been aware
of people making deep fakes about you?

Speaker 3 (01:27):
Not a deepfake.

Speaker 4 (01:28):
Back when I was playing professional rugby, there
were a couple of scams where, you know, someone was
just using a photo of me to flog off like
protein powders or something, and it was harder than it
should have been to try and get them taken down.
I think they're on Facebook or you know, some social
media site, but you know, that's just a photo. Now
we're firmly in the era of AI and deep fakes,

(01:52):
and yeah, that's just a whole other level.

Speaker 1 (01:53):
Yeah, I want to get into all of that. But
I wonder when that was happening. I mean, it must
have felt strange to see your image used in that way,
and I mean, did you kind of have much recourse
to stop it.

Speaker 4 (02:07):
I think my manager ended up helping contact the social
media company and get it taken down. But you know,
that was kind of just it. It was very, very
hard to do anything else. And yeah, it didn't
feel great having some company selling questionable products
using your name. And you know, I think we're seeing

(02:27):
this across the board. We know that scams are on
the rise. People are getting far more sophisticated, and I
really think that we need to be legislating against this
sort of misuse of technology.

Speaker 1 (02:38):
Yeah, so, as you say, as AI has come into view,
the scope of this has just exploded. And this week
you introduced a bill into the Senate to crack down
on deep fakes. So tell me what's in the bill.

Speaker 4 (02:50):
Well, my bill basically says that you own your face
and you own your voice. Australians should own their likeness,
that is part of them, and you shouldn't be able
to deepfake someone's face or voice, or create a video
or an audio recording, without their consent. And so it
establishes a complaints process for the non-consensual creation and

(03:13):
then sharing of deepfake material, and it strengthens the
eSafety Commissioner's powers to respond to AI-generated harm, to
issue removal notices and formal warnings to social media companies, and
then actually sets up an avenue for civil redress through
the courts if an individual is wrongfully depicted or exploited. I

(03:36):
guess that's probably going to be more used when there
is some sort of commercial gain, and I think we
really need that avenue.

Speaker 1 (03:42):
What about if someone unwittingly shared fake imagery?

Speaker 4 (03:46):
Very little you can do there, and that is
very clear in the bill: you have to knowingly be
sharing what is clearly a non-consensual deepfake. It's
very much focused on the creation of them, and so
being able to draw a bit of a line in
the sand and say, this is not okay. We're
entering this very different world where you can now

(04:06):
put a few prompts and a few images potentially some
clips of someone speaking or video and generate their likeness
and make them say anything. We've seen this happen already
in scams. I think some of the
examples have been Kochie and Dick Smith selling
scam investment schemes. So this is happening, and this is

(04:29):
really trying to get the Parliament to think more about
this and actually put in our law that this is
not okay and there's actually ways that Australians can combat it.

Speaker 1 (04:38):
And so tell me a bit more about I guess
the problem that you're trying to fix here, because this
idea that you know, we should own our own face,
it seems obvious, it seems like common sense. But what
is the current reality? What recourse do people have at
the moment if a fake image or a fake video
of them is shared without their knowledge or consent?

Speaker 4 (04:58):
Very little. I mean, you know, contact the Facebook complaints team, good luck,
contact Instagram. There is no sort
of designated complaints pathway. And a big part of this
bill is actually to give the eSafety Commissioner
those powers, so you can get in touch and say, hey,
this is a deepfake of me, I didn't consent
to this, I don't want this up there,

(05:19):
and that they can then issue a directive to the
social media company to take it down.

Speaker 3 (05:24):
And then, as I said, you then have.

Speaker 4 (05:25):
Some sort of recourse through the courts if it has
genuinely caused you you damage. So you know, I think
this is really about our laws evolving with technology and
putting some safeguards in place. Engaging in the
debate around AI, I'm really concerned that you have a
lot of companies, and it seems politicians, focused on all

(05:47):
of the upside. This is so amazing for productivity, for
our economy, for all these things, and yet we're not
actually looking at well, there are probably some real.

Speaker 3 (05:57):
Downsides to this.

Speaker 4 (05:58):
One is preventing this sort of identity theft, essentially replicating
someone without their consent, and so I do think this
task is urgent and really have been saying to the
government for years now, you've got to get cracking on
this because AI is moving fast and you're certainly not.

Speaker 1 (06:14):
Coming up: how AI can disrupt elections and how
political parties exempt themselves from the rules that everyone else
has to follow. Let's talk a little bit more about
the implications when this kind of material is put out there.

(06:38):
We could use a specific example. During the last election campaign,
you created fake videos of Dutton and Albanese
to highlight how easy it is to do. They
were obviously fake.

Speaker 2 (06:48):
That my government will be introducing legislation into the Parliament,
so we'll see a three year, phased in complete ban
on all forms of gambling advertising.

Speaker 1 (07:04):
Albanese, I think the one that you created was
promising to ban gambling ads, which

Speaker 3 (07:09):
he obviously hasn't done.

Speaker 1 (07:13):
So tell me a bit about how deep fakes have
been used in politics here and overseas, and then I
suppose the effect on the public when those kinds of
videos are put out there.

Speaker 4 (07:26):
In the last parliament, I really urged the government to
act before the election to rule out the use of
deepfakes in political advertising. They didn't have any interest
in it. Thankfully, it didn't seem to play a part
in our election. But there are examples from South Korea
and the US where deep fakes of politicians were created.

Speaker 3 (07:45):
In the South Korea example, the parliament.

Speaker 4 (07:47):
They acted very quickly to put some very heavy penalties.

Speaker 3 (07:53):
On the use of deep fakes in elections.

Speaker 4 (07:55):
In the US, we saw a voice clone of then
President Biden tell people not to turn up in one
of the primaries.

Speaker 5 (08:03):
What a bunch of malarkey. We know the value of
voting democratic when our votes count. It's important that you
save your vote for the November election. We'll need your
help in electing Democrats up and down the ticket. Voting
this Tuesday only enables the Republicans in their quest to
elect Donald Trump again.

Speaker 4 (08:23):
So you know, there's I think huge potential to destabilize
the democratic processes. That's one thing, and I think that
should have been a priority. And hopefully for the next
election we'll have some safeguards in place.

Speaker 1 (08:39):
Can we talk about the issue, I guess, a bit
more broadly than just deepfakes?
I mean, you made a number of recommendations to the
Senate Committee on Adopting AI, and that was all about
how AI should be regulated in particular to protect the
political system. But when you look at how quickly the
technology is moving, what do you think we need to
see in terms of guardrails.

Speaker 4 (08:58):
Well, at the press conference when I announced this,
the Member for Curtin, Kate Chaney, an independent
from WA, was calling for an AI Safety Institute.
I see today the government has announced they're going to
do it, which is a great thing. I
think that's a step forward. But again, we're sort of
on the back foot here. Everyone knows how fast AI

(09:20):
is moving. Yes, we need a body, a
safety institute, that employs genuine experts in this space
to advise government free of sort of industry spin on this.

Speaker 3 (09:32):
These are the things you need to put in place.

Speaker 4 (09:35):
Then I think we need an overarching AI Safety Act.
This is the kind of technology where you can't constantly
be legislating against it. We actually need some broad safeguards.
This is what you can use it for, this is
what you can't use it for. And we need to
then use that, I guess, to connect with like-minded
countries, you know, the EU, the UK, Canada, and ensure

(09:58):
that we're sort of creating, I guess, a system
of laws that are comparable and really do protect citizens'
rights ahead of these enormous companies, which clearly,
I mean, the only thing they seem to care about
is, you know, winning the AI race and
making a lot of money.

Speaker 3 (10:17):
So I think that's one thing.

Speaker 4 (10:19):
I think we're also really not talking about potential impacts
on labor markets, on people's jobs, whether that's white collar jobs.

Speaker 3 (10:28):
And I think particularly sort of entry.

Speaker 4 (10:30):
level jobs, like who is going to be able to
get those entry level jobs if there are you know,
AI agents able to do this sort of work. And
then you look at companies like Amazon and they are
you know, full steam ahead on trying to automate so
much of their supply chain and even delivery. So I

(10:53):
think these are very real risks that for a long
time we've thought, well, that's just a long way over
the horizon, let's not worry about it. But I really
think that as Parliament, that's our job. We should be
looking ahead and saying these things could be coming, let's
have a very serious conversation about them and then put
some safeguards in place. And sadly, I think that's really

(11:13):
been missing.

Speaker 1 (11:15):
What about the idea of how political parties collect
and use data? Do you have concerns about how that
data is retained and whether or not it is used
to train AI?

Speaker 3 (11:27):
I do.

Speaker 4 (11:28):
I mean, you know, this is something independents have been
banging on about when it comes to data harvesting. You know,
major parties send out these forms saying, hey, do you
want to do your postal vote? It kind of looks
like an AEC official form. You send them all this
data and then they, you know, put it into their system.
The major parties have exempted themselves from the Privacy Act,

(11:52):
from the Spam Act. I just don't think that
cuts it. And so I think, yeah, we should be
asking the major parties what their policies are around AI.
What are they actually feeding into, you know, AI platforms?
We don't really know. But again, these are conversations
we should be having.

Speaker 1 (12:13):
And the government does currently have a review into AI underway.
It's set to wrap by the end of the year.
What would you like to see come out of it?

Speaker 4 (12:23):
Well, I mean they did a lot of reviewing in
the last term of parliament, so I'd be interested in
what came of those reviews. I would hope that there would
be a broad AI sort of safety act that would
actually look at all these things in a really holistic
way and put some safeguards in place. I have been

(12:44):
worried about the influence of industry. It seems like there's been
a big shift in the government's sort of talking points
since the election. But you know, we'll give them the
benefit of the doubt until this review comes out and
see what it says.

Speaker 1 (12:56):
When you say industry, you mean what tech industry talking points?

Speaker 3 (13:00):
Yeah, big tech, you know, open AI.

Speaker 4 (13:03):
Yeah, all the players who seem
to have an endless supply of money and are genuinely
in a race to dominate this space.

Speaker 1 (13:17):
Well, David, thank you so much for your time.

Speaker 3 (13:20):
Thanks Ruby.

Speaker 1 (13:21):
By the way, what is the latest on the Parliamentary
Sports Club? Are you back playing?

Speaker 4 (13:25):
I'm not actually, you know, publicly they said I was
invited back, but I haven't heard a word from them.
So I've just been doing my own thing in the mornings,
running with the dog. Watched a leaden flycatcher chasing a
little sparrowhawk this morning, which was pretty fun.

Speaker 1 (13:45):
So waiting for the official invite, but not sure if
it's coming.

Speaker 4 (13:48):
Well, actually, I have no interest in going back
while they have this sort of cash-for-access scheme.
I totally disagree with the whole setup, and so
I don't know, we'll see. They've kind of said
they may review the policy. But I mean, one
of the things I'm concerned about is that a lot
of people just think, well, this is just normal and
this is how things operate, and I think that kind
of says a lot about the.

Speaker 3 (14:09):
Problem with it.

Speaker 1 (14:22):
Also in the news, Labor has told government departments and
agencies to find savings of up to five percent of their
budgets as it tries to rein in the deficit. Finance
Minister Katie Gallaher has denied the call amounts to cuts,
but unions and independents are warning it could mean job
losses in the public service and hit agencies such as
the CSIRO and the AFP. It comes after Labor campaigned

(14:44):
against Peter Dutton's pledge at the last election to cut
the forty one thousand public service jobs created in Labour's
last term, and One Nation leader Pauline Hanson has been
suspended from the Senate for seven sitting days after wearing
a burqa in the Senate chamber. Senator Hanson wore the
garment as a stunt to call for a nationwide ban

(15:04):
on burqas and head coverings, refusing to leave the Senate
floor after she was sanctioned. After her suspension was ordered,
Senator Hanson told the chamber, "The people will judge me
at the next election." I'm Ruby Jones. This is seven am.
Thanks for listening.