Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
You're listening to a podcast from News Talk ZB.
Follow this and our wide range of podcasts now on iHeartRadio.
Speaker 2 (00:16):
So a bit of an eye-opening reality this week
when it comes to our protection, or lack of it, when it
comes to AI. Talking on The Little Things podcast
with my co-host Louise and me this week, doctor
Michelle Dickinson told us New Zealand has no regulation in
place to protect us. What does a lack of regulation
lead to? Well, here's one example.
Speaker 3 (00:35):
Meta, Facebook, just came out recently and said, actually, what
we've done is we have been data scraping Australian and
New Zealand citizens because you guys didn't make any regulation
to protect against that. So in the EU you can opt out.
In New Zealand, we cannot. We have been using adult
Facebook and Instagram pages since two thousand and seven and
(00:56):
scraping that data and training all AI models on what
you look like. But also, if you're holding your child
or you're posting a picture of your child and it's
not private, we've been using your children's faces, and children's
images have been created from New Zealand and Australian citizens, and
you can never unlearn that.
Speaker 2 (01:12):
A group of AI experts are also calling for more
regulation in the AI space. One of the authors of
that letter is Programme Director of Artificial Intelligence doctor Andrew Lensen,
and he's with me now. Good morning, Andrew. Good morning.
So just how far behind is New Zealand in terms
of regulation when it comes to AI?
Speaker 4 (01:30):
Ah, we're pretty nearly at the end, to be honest. I
mean we see that Australia is going to do things,
Canada and the US, well, the US was doing things,
and of course the EU is quite busy as well.
So yeah, we're at the back of the pack.
Speaker 2 (01:43):
Okay, when we look at the countries that are doing something,
who's kind of leading the way? Are there examples of
countries, you know, that we should be looking closely at?
Speaker 4 (01:53):
Yeah, so the example most people turn to will probably
be the EU because they have their EU AI Act. It's
a very comprehensive piece of legislation and it has its
own issues or criticisms, but it's still a really good
piece to sort of start looking at as a
starting point. And they have this quite nice risk management
(02:15):
based approach, which is sort of what we've advocated for
in our letter, and so I think they're a good one.
And then of course we can also look at Australia,
being our neighbours; they also are doing some quite
good stuff as well.
Speaker 2 (02:25):
What are your concerns if we don't regulate AI?
Speaker 4 (02:29):
Yeah, I think, like Michelle said, there's quite specific
concerns like the ones she's raised, but there's also, you know,
many others. I mean, in our letter we see various
things like the misuse of intellectual property, right, we see
all these deep fakes, all these images being created,
the use of AI for fraud, the sort of upheaval of
the workforce. We know that AI will replace some jobs,
(02:52):
but how are we going to manage that transition? As
well as even more sort of sinister things like the
effect that AI has on democracy itself. I mean a
lot of what we're seeing happening in the US. You know,
there's a lot of AI lobbying and AI manipulation
behind the scenes there from different parties. So quite wide ranging,
but quite scary. But we really can tackle this head
(03:13):
on if we choose to.
Speaker 2 (03:15):
We need to be clear, this isn't about not having AI,
is it? I mean AI can be really useful and
we want to be innovative here in New Zealand. But
it's about us in New Zealand deciding what role it
plays in our lives, isn't it?
Speaker 4 (03:27):
Yeah, that's exactly right. I mean we see really really
cool and helpful uses of AI. I mean I have
a research project working in the medical sector to use
AI as a way to enhance cancer treatment. And you
know that's not something we want to stop.
That is the sort of thing we should be encouraging.
But it's when we think about these sort of third
party interests and, as Michelle said, places like Facebook and
(03:51):
Meta coming into New Zealand and doing whatever they like
because we're not saying no. And so I think it's
actually really dangerous that the government has basically said we're
not going to regulate AI, you know, you're welcome to
come here and use it however you like, because we will
get manipulated and we will see these big players misuse it,
rather than getting to set our own course and say, hey,
(04:12):
this is what we as a country want to use
AI for, and going from that position.
Speaker 2 (04:17):
Because if you were a Facebook or Instagram user, you
would probably have to trawl through very small terms and
conditions in fine print to find where it said that
they were going to scrape your public material, right? I mean,
I don't think a lot of New Zealanders have probably
known that's been going on for such a long time.
Speaker 4 (04:33):
Well no, and I mean that's just one of many
examples you could find, right? And I think it's
kind of indicative of how beholden we are
to these companies, and that people aren't going to say no,
I'm not going to use Facebook, or no, I'm not
going to use Instagram, because that's your whole social network.
That's how you talk to your friends, your family, that's
how you, you know, are part of your social group in
(04:54):
this modern world. And so the fact is they can sort
of make these rules, and technically we could say
no and not use them. But at the same time,
I'm not going to stop talking to my relatives overseas
on Facebook, am I? It's very much an unfair transaction.
Speaker 2 (05:10):
We mentioned sort of what other countries are doing, and
we probably need to be creating our own regulation
that's specific to New Zealand, though, don't we? I mean,
it's good to look at what other people are doing,
but we probably need to be creating what's right for
us here, is that correct?
Speaker 4 (05:24):
Yeah, for sure. And I mentioned the EU AI Act as
an example, but obviously that has been designed first of
all for the European market and also for the culture and
sort of the social systems of the EU. And we
know in New Zealand we obviously have quite different demographics,
we have different value systems, and we have different concerns
around the use of AI. So I would like to
(05:46):
see us look at these international examples as sort of
starting points or as almost case studies, but then make
a version for ourselves that is fit for purpose. It
doesn't mean starting from scratch. It just means, as you said,
tailoring it to our needs.
Speaker 2 (05:58):
Okay, so we want the benefits, but we want to
minimize the risks. So what kind of regulation do we need?
Speaker 4 (06:05):
Yeah, and so this is why we've been asking for
a risk-based approach, where we have, for example, some
forms of AI use such as state surveillance, facial tracking,
you know, those really sort of dystopian things, that we
say no, no, no, those are not allowed at all.
Those are banned. And then we also have a lot
of uses of AI that are not concerning. So if
(06:27):
people are using Copilot to summarize their emails, you know,
we don't really want to put red tape around that.
But it's sort of the stuff in the middle
where we look at things like the use of AI
in social media, or the use of AI in healthcare,
where we can potentially see some benefits, or where it is
needed perhaps, but we want some more oversight as to
(06:47):
how that's done, and we want some more guardrails in place.
And so this is the idea of having a risk-based
approach: you're saying, okay, what's the potential harm here,
and based on that, what
rules do we need to put in place to make
sure those harms are minimized and those benefits are maximized.
Speaker 2 (07:03):
Are you seeing any interest or effort from politicians when
it comes to regulating AI?
Speaker 4 (07:10):
Yes and no. I mean I had a couple of conversations.
We put this letter out, but it's not going anywhere
particularly fast, and we haven't really had much response from
the government, of course. And so, yeah, I'm a little bit...
I'm not surprised, but I'm still disappointed.
Speaker 2 (07:27):
How urgently do we need to get something done?
I mean, tech development moves very fast, and we just
generally always seem to be behind the eight ball, but we
are really far behind in New Zealand. Has someone just
put it in the too-hard basket?
Speaker 4 (07:41):
I think so, I think so. I think it
might become a bit of an election issue next year,
but of course even next year is still a long
time away when you think about AI.
And I mean New Zealand is really concerned about this too, right?
I think we're third to last in a global study
in terms of trusting AI, and about eighty percent of
New Zealanders want AI to be regulated. So there's definitely
(08:03):
a political mandate there. It's just that the government is, for
whatever reason, not wanting to bother with it. And so
I would be very interested to hear why they're ignoring
it and why they think it's not an issue,
because as you've raised, you know, there's very real concerns,
not just from what you've said with Michelle, but from
when you talk to everyday people as well.
Speaker 2 (08:22):
Absolutely. Thank you so much, doctor Andrew Lensen, for talking
us through that. The episode of The Little Things is
out now, you can get it at iHeart. Michelle is
really interesting. Andrew mentioned there, of course, the upheaval of
the workforce, and Michelle talks a lot about that, and,
you know, about how Singapore put an AI strategy in
place ten years ago in order to make sure
that, you know, if you're over forty, you can deal
(08:43):
with AI. So we really are behind the eight ball.
But look, if you want to understand a little bit
better about what it is and what it does, and
also how it might affect you, how it might affect
your children, how it's going to affect our
jobs and things, Michelle covers all that in the podcast
The Little Things. So iHeart or wherever you get your podcasts,
have a listen.
Speaker 1 (09:02):
Check it out. For more from News Talk ZB,
listen live on air or online, and keep our shows
with you wherever you go with our podcasts on iHeartRadio.