Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Hi, I'm Ruby Jones and you're listening to 7am.
As companies across the world pursue generative AI technology, new
forms of manipulation become possible. At the forefront of this
are neurotechnologies which directly connect to your brain and collect
(00:22):
your brain data, and persuasive technologies which can use that
data to influence how you feel. In countries like China,
where the tech industry is controlled by the government, the
use of these technologies for disinformation and social control has
already taken hold, and as the tech gets better, the
risks get bigger. Today, senior analyst at the Mercator Institute
(00:45):
for China Studies, Daria Impiombato, on the rise of
persuasive technologies and the way we could be exploited without
even knowing. It's Wednesday, July 30. So, Daria, you've recently been
(01:07):
looking into the ways that the Chinese government is using
and developing AI. Recently we heard from the Taiwanese government
that there had been this mass disinformation campaign linked to
the CCP. Could you tell me a bit about what
we saw?
Speaker 2 (01:24):
Yes, absolutely so.
Speaker 3 (01:26):
What we've been seeing in Taiwan for a while now
is a sophisticated use of AI by the Chinese government
to influence public opinion and disrupt democracy.
Speaker 2 (01:38):
There.
Speaker 4 (01:38):
With just over forty days till the elections, China's election
interference is ramping up. A scholar says China is sending
targeted propaganda videos to Taiwan voters by using fake Internet accounts,
real influencers, and big data analytics. The New York Times
writes that a new wave of disinformation is hitting Taiwan,
(01:59):
testing the island's defenses.
Speaker 3 (02:01):
So for example, during the presidential and general elections in
Taiwan in twenty twenty four, there were very large disinformation
campaigns that targeted candidates from the Democratic Progressive Party,
and that included current President Lai Ching-te.
Speaker 5 (02:19):
That's presidential candidate William Lai. His Democratic Progressive Party or
DPP rejects Beijing and wants closer ties with the West.
But in this YouTube video, Lai praises the pro-China opposition,
claiming they represent Taiwan's mainstream view. Turns out this video
is a fake, produced with
(02:41):
artificial intelligence using a clip from this actual press conference
where Lai condemned his rivals.
Speaker 3 (02:49):
These campaigns weren't just your typical fake news stories. You know,
they used AI-generated content like videos and avatars to
make the messaging more believable but also harder to detect,
and platforms get flooded with that type of content all
the time.
Speaker 2 (03:07):
Now.
Speaker 3 (03:08):
In fact, Taiwan's National Security Bureau reported that just in
the first three months of twenty twenty five, they detected
over five hundred thousand instances of CCP-linked disinformation.
Speaker 1 (03:23):
And so do we know what effect this disinformation had
on people in Taiwan, on what they believe to be
true about their country right now?
Speaker 3 (03:32):
This is not just peculiar to Taiwan, I think, but
the impact there has definitely been significant. So these AI-driven
campaigns are usually designed to confuse people and they
make it much harder for people to trust whatever they
see online. So by tailoring these messages to specific groups,
(03:53):
this disinformation becomes much more persuasive, and this has led
to increased polarization in many places. You know, people
are more divided and it's harder for them to figure
out what's true and what's propaganda. So over time, this
kind of manipulation can really weaken trust in democratic institutions,
(04:17):
which is exactly what the Chinese government wants to achieve
in Taiwan, with the aim of gaining control over the island.
So it's not just about spreading lies, it's about undermining
the very foundation of democracy.
Speaker 1 (04:35):
And the tech involved in all of this is evolving
very rapidly when it comes to China. How intertwined is
that development in the tech sector with the CCP?
Speaker 3 (04:48):
So in China, the government and the tech industry are
deeply intertwined through many initiatives, and one of them is
called military-civil fusion. This initiative means that private companies
are often expected to align their innovations with the government's goals,
and especially tech companies, so whether it's in the space
(05:11):
of surveillance or propaganda or even military applications.
Speaker 2 (05:17):
For example, companies like Silicon
Speaker 3 (05:19):
Intelligence or Goertek are developing AI tools that can
detect emotions or create lifelike avatars, and emotion detection AI
can predict when someone is at their most vulnerable to
manipulation so that you can deliver those targeted messages exactly
at the right moment, and this can be used to
(05:42):
suppress protests like it often is, but also to spread
propaganda or even influence events. So these initiatives are directed
from the very top of the Chinese Communist Party. In fact,
Xi Jinping himself constantly emphasizes that the
private sector must serve the state's priorities. And it's obviously
(06:05):
a very different model to what we are used to,
where we have at least some separation between government and business.
Speaker 1 (06:13):
You're talking about the government really working hand in hand
with private companies to work out how to read emotions
and then exploit them. Can you tell me a bit
more about how those models are trained.
Speaker 3 (06:28):
Absolutely. Some companies were accused of training these systems by
analyzing the responses of Uyghur detainees in Xinjiang Province of China,
where, you know, up to one million Uyghur Muslims have
been detained for years now under very suspect circumstances that
(06:50):
the United Nations says may amount to crimes against humanity.
Speaker 2 (06:54):
So while these detainees
Speaker 3 (06:55):
were in prison or in camps, these companies basically
interviewed them to train their algorithms. So this is something
that you know, the Chinese government has achieved a fairly
high degree of success at home already. So it's something
that we call tech-enhanced authoritarianism. So these tools
(07:18):
have for a long time allowed the Chinese government to
monitor people but also increasingly shape how people think and
how they behave.
Speaker 2 (07:30):
And it's not just happening in China.
Speaker 3 (07:32):
These technologies are being exported and used globally in ways
that can unfortunately undermine democracy and human rights.
Speaker 1 (07:44):
After the break, how we're being influenced without even knowing it. Daria,
You've been looking into the very rapid evolution happening in
persuasive technologies. What direction is this technology taking? What types
of tech are we going to see more and more of?
Speaker 3 (08:04):
So aside from generative AI that we are all more
familiar with, there are things that go a little bit
deeper into influencing us
without us noticing. So things like immersive technologies, you know,
when you wear wearable devices that can send you pings
(08:25):
and nudges and get you to do things. Even more,
things like neurotechnologies, so technologies that are able to directly
connect to your brain and collect your brain data. So
if it's any sort of data that your brain produces
in the form of thoughts or things like that, it
sounds like far fetched and kind of very far away
(08:46):
in the future. But if you look at the development
of these companies, these things are coming quite quickly, and
we will soon find ourselves in a situation where, from
one day to the next, they will be delivered to
the public, and then we don't have the right regulation
set up to make sure that we have the guardrails,
that we as individuals have the protection, to not
(09:08):
be completely exploited by these technologies, but also by the
people who control these technologies, be that you know, tech
moguls or authoritarian states.
Speaker 1 (09:19):
Yeah, so we're talking about tech that can read us,
influence us, influence how we might be feeling. But are
we likely to know that that is happening. Are we
likely to register that we're being influenced by persuasive technologies
as it's happening.
Speaker 3 (09:35):
Honestly, probably not, or, like, not in most cases, except
the most obvious cases. This is what these technologies are
designed to do. They're designed to influence in a way
that is subtle and increasingly undetectable. So it's not
our fault for not being able to pick it up;
it's just by design and how these technologies are developed.
(09:57):
But don't get me wrong, you know they do have
a lot of positive applications. However, things like AI-powered
chatbots that we use every day now, they can
weave propaganda into what feels like a casual conversation, So
you just wouldn't really be able to pick it up
unless you were really careful. It's just manipulation that feels
(10:21):
very personal, and it makes it harder to spot. So
that's why I think public education is really really important
on this.
Speaker 1 (10:29):
When you look at so-called persuasive tech and how
it's advancing, is there any one development in particular that
you think about. I mean, what keeps you up at
night when you're thinking about all of this.
Speaker 3 (10:42):
I think neurotechnologies are definitely something that's quite spooky to me,
especially now that companies are increasingly able to read brain waves.
If one day companies are actually able to predict what
we're thinking before we even think it, then they can change
Speaker 2 (11:02):
those thoughts. It's really scary.
Speaker 3 (11:04):
I actually don't know whether we truly want that from
tech companies. So something I really want to emphasize is
that I do think we should start trying to shape
these technologies in a way that really benefits society and
that follows the values that we want to achieve, not
just reinforcing control or manipulation, either for commercial or political purposes.
Speaker 1 (11:34):
These technologies, as you say, they're being used around the world.
If we were to focus that on how the CCP is
thinking about and how it's using this kind of technology,
what are the implications for us here in Australia?
Speaker 2 (11:50):
I think the implications are really profound.
Speaker 3 (11:53):
Chinese companies that are subject to CCP oversight, they
are exporting technologies globally, including to Australia.
Speaker 2 (12:04):
Something that is peculiar to.
Speaker 3 (12:05):
China is that under Chinese law, these companies are required
to share data with the government under national security circumstances.
So this raises significant security and privacy concerns for Australia.
This means that we really should be vigilant about the
technologies that we adopt and the technologies that we become
(12:28):
dependent on, especially in light of the potential risks that
these pose to our national security and democratic values.
Speaker 1 (12:37):
And how do you think Australia is going on that,
on its ability to regulate these kinds of technologies? Do
you think that we are prepared for what's coming?
Speaker 3 (12:47):
You know, the Australian government has made some progress, especially
in addressing cyber foreign interference by other states, but it
still lags behind, especially when you think about how
quickly persuasive technologies are developing. So there is a lack
of comprehensive regulation. But this is true for most countries
(13:08):
around the world. It's not just Australia, and so most
countries are just not prepared to address the ethical and
security risks, especially when they originate from authoritarian states like China.
So Australia really should start investing more in domestic research
to establish ethical guidelines, not just for AI, but for
(13:31):
a whole set of potentially dangerous technologies.
Speaker 1 (13:38):
Well, Daria, thank you for talking to me.
Speaker 2 (13:40):
It's been very interesting. Thank you for having me.
Speaker 1 (14:00):
News today, the Prime Minister Anthony Albanese has hit back
at comments from Israel's Prime Minister Benjamin Netanyahu, and echoed
by the Israeli Embassy in Canberra, that there is no
starvation crisis in Gaza. It's been reported that Albanese told
the Labor caucus room that Israel's stance is quote, beyond comprehension.
Two leading human rights organizations in Israel have this week
(14:23):
said that the country is committing genocide against Palestinians in Gaza,
and the UN's climate chief is urging Australia to take
more ambitious climate action. Simon Stiell told an independent industry
body in Australia this week that, quote, bog standard is
beneath you, and argued that colossal economic rewards are waiting
(14:44):
for countries who aim higher. Stiell called Australia's September announcement
of a new twenty thirty five emissions reduction target a
potential defining moment for Australia. I'm Ruby Jones. This is
7am. Thanks for listening.