Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Kia ora.
Speaker 2 (00:06):
I'm Chelsea Daniels and this is the Front Page, a
daily podcast presented by the New Zealand Herald. New Zealand's
Privacy Commissioner has issued new rules around the automated use
of biometrics. Biometric processing is the use of tech like
(00:26):
facial recognition to collect and process people's biometric information.
The code comes into force today, the third of November,
twenty twenty five, but agencies already using the technology have
until the third of August next year to align themselves
with the new rules. Today on the Front Page, Privacy
Commissioner Michael Webster is with us to take us through
(00:49):
what all of this actually means and how we can
protect ourselves. First off, well, you're probably asked this
a lot, but what exactly is biometric information?
Speaker 3 (01:07):
Well, I guess the simplest way to describe it
is each individual's physical and behavioral characteristics, personal
Speaker 4 (01:16):
information about them. The ones I think most people
will be familiar with are things like fingerprint scans, iris scanning,
and of course, increasingly, we've seen here in New Zealand
facial recognition technology, or FRT. Those are the sorts of
biometric processing that we have seen here in this country.
Speaker 2 (01:37):
Right, so instead of something that only happens in
a Tom Cruise movie, it's increasingly happening in real life.
Can you give us some examples?
Speaker 4 (01:48):
Sure. I guess two of the most common: there's a
number of workplaces, for example, where there are sensitive areas,
and some workplaces are now using fingerprint scanning for people
to be able to get into those places. I think
many people will already be familiar with the use of
iris scanning to unlock programs or devices, that sort
of thing. And of course in New Zealand we'll be
(02:11):
familiar with the recent trial that Foodstuffs North Island did of
the use of FRT to address some serious, harmful retail
crime issues that they were having.
Speaker 2 (02:21):
So what prompted the introduction of this new biometric code?
Speaker 4 (02:26):
Well, we've been scanning both what's happening internationally and here
in New Zealand, and increasingly businesses and other agencies are using
technological developments to either improve their customer service or to
deliver on their objectives. And so inevitably there's going to
be, we think, a greater use of biometric technologies by
(02:49):
organizations out there.
Speaker 3 (02:50):
And we wanted to make sure that, both in the
public and the
Speaker 4 (02:54):
private sector, this country was ready for that coming
wave of uses of this sort of technology.
Speaker 2 (03:02):
Right, so I'm walking around the supermarket, and obviously all
the cameras are catching me, what I'm picking up, and
how I'm walking, and what I look like and everything.
What can the supermarket then now do with that information
now that there's a code in place.
Speaker 4 (03:16):
Well, in the past, of course, you would have been
watched by CCTV, which would have just been recording you,
and then a person would have been looking at that
to see what you're up to. Now, for example, if
someone has threatened a staff member or engaged in assault
at, for example, a supermarket or a hardware store, they
(03:37):
might go onto what's called a watch list, a group
of people who, for example, have been trespassed and are
not allowed back in. And so when you come
in again, what the biometric technology can do through the
camera system is match your face with the face that's
on the watch list. So there's that kind of, I guess,
verification process there, identification process. And then the store, because
(04:02):
there always needs to be human oversight, the store can
then decide what to do about that.
Speaker 3 (04:05):
Whether to call the police or to approach you directly, those
are the choices open to them.
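To make that matching step concrete, here is a minimal sketch of what it might look like, assuming faces have already been converted into numeric embeddings. Nothing here comes from any actual deployment; the function names, the threshold value, and the use of cosine similarity are all illustrative.

```python
import numpy as np

MATCH_THRESHOLD = 0.92  # illustrative value; operators tune this, as discussed below


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face: np.ndarray, watchlist: dict) -> object:
    """Return (person_id, score) for the best watch-list match, or None.

    A hit is only a candidate: because the code expects human oversight,
    the result goes to a staff member for review rather than triggering
    any automatic action.
    """
    best_id, best_score = None, 0.0
    for person_id, stored_embedding in watchlist.items():
        score = cosine_similarity(face, stored_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        return best_id, best_score
    return None  # non-match: nothing retained (see the safeguard discussed later)
```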
Speaker 2 (04:11):
Right, so if I go back in wearing a fake
mustache and a wig, will it pick me up?
Speaker 3 (04:18):
Is it that good?
Speaker 4 (04:19):
Some models of facial recognition technology, some biometric technology, are
that good. It comes back to a question about at what
level you set the accuracy reading. And one of
the discussions we've had with New Zealand organizations is ensuring
that you don't leave yourself at the risk of misidentification
(04:41):
or incorrectly accusing somebody of being on a watch list
when they're not, because that is incredibly harmful and damaging
and upsetting for individuals. And so you want to be
able to set the match criteria, I guess, quite high,
and you want to be able to use a very
reliable and good biometric technology product, software, that sort of thing.
(05:09):
Unfortunately, even if you could make it through every other security measure,
you won't get past the last one. That's because it's protected
by gait analysis, the step beyond facial recognition.
Speaker 1 (05:18):
These cameras actually know how the agent walks, how he talks,
how he moves, right down to facial tics.
Speaker 3 (05:25):
So what you're saying is no mask can beat it.
Speaker 1 (05:29):
Ah, right.
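The "accuracy reading" the Commissioner describes is essentially a match threshold, and the trade-off can be shown with made-up numbers: raising the threshold reduces wrongful flags (the misidentification harm he warns about) but misses more genuine matches. Every score below is invented purely for illustration.

```python
# Invented similarity scores: (score, whether the person truly is on the watch list)
scores = [(0.97, True), (0.95, False), (0.93, True), (0.91, True),
          (0.90, False), (0.88, False), (0.86, True), (0.85, False)]

for threshold in (0.85, 0.90, 0.95):
    # False accepts: innocent shoppers wrongly flagged as watch-list matches.
    wrongly_flagged = sum(s >= threshold and not on_list for s, on_list in scores)
    # False rejects: genuine watch-list entries the system fails to flag.
    missed = sum(s < threshold and on_list for s, on_list in scores)
    print(f"threshold {threshold:.2f}: {wrongly_flagged} wrongly flagged, {missed} missed")
```

Running this shows the tension directly: the lowest threshold flags the most innocent people, while the highest misses the most genuine matches, which is why the setting matters so much.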
Speaker 2 (05:30):
How does this code differ from privacy laws we already
have in place?
Speaker 4 (05:35):
Sure, so we already have here in New Zealand our
own New Zealand Privacy Act, and biometric information,
like all personal information, was already governed under the Privacy Act.
What this does, though, is, I guess, clarify and strengthen
some of the requirements on organizations thinking about using
this technology.
Speaker 3 (05:56):
So now, for example, they need to go through a
deliberate process of
Speaker 4 (06:00):
considering whether they have privacy safeguards in place. For example,
with facial recognition technology, the Foodstuffs North Island trial had a
system of immediate deletion of non-matches, so they didn't
build up this giant database of Kiwis' faces, if you
see what I mean; they were immediately deleted. That would
be a privacy safeguard. We expect under the code greater transparency.
(06:24):
So if you go into a business that's using biometric technology,
we would expect there to be very visible signage, whether
it's for employees or customers, saying that it's in use.
And we also want people to go through a very
careful process of working out whether the proposed use is
actually proportionate to the problem or the gain that you're
(06:47):
looking to achieve. So will the privacy risks be outweighed
by the benefits from using that technology? Is it basically necessary?
Is it justified? Is it effective for you to use?
Is it proportionate?
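The "immediate deletion of non-matches" safeguard can be sketched in the same illustrative terms, reusing the hypothetical check_against_watchlist helper from the earlier sketch: if nobody on the watch list matches, nothing derived from the shopper's face is ever written anywhere, so no database of faces accumulates. This is a sketch of the idea, not the trial's actual implementation.

```python
def process_frame(face_embedding, watchlist):
    """Illustrative 'immediate deletion of non-matches' safeguard.

    Non-matches are discarded in memory and never persisted; only
    candidate matches are passed on, and those go to a human reviewer.
    """
    hit = check_against_watchlist(face_embedding, watchlist)  # earlier sketch
    if hit is None:
        # Drop our only reference to the embedding so nothing is retained;
        # crucially, it is never written to disk or any database.
        del face_embedding
        return None
    return hit  # a staff member decides what, if anything, to do next
```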
Speaker 2 (07:02):
I'm going to play Devil's advocate here because the first
thing that comes into my mind with this kind of
technology is brands looking at customers and seeing who picks
up their product, who, you know, what their customer base
would be. So would this code protect your biometric information
from being used, say, via marketing?
Speaker 4 (07:24):
So there's a couple of things there. One is, again,
we should always remember that people have always been watched
while they're in stores, with CCTV. That has always been
a feature, as well as the people in the store.
One of the things we've done with this
code is that we've said there needs to be some
particular limits on its use, and one of those limits
(07:46):
is around what we would call inferential biometrics, such as,
I guess, trying to read people's emotions or their mental state. So,
for example, do they appear to be more excited when
they walk past a particular item of clothing than when
they walk past something else, those sorts of things. And
we've said, actually, unless there are particular exceptions in place,
(08:09):
we don't think biometric technology should be used for those purposes.
Speaker 2 (08:13):
How does the code address particularly sensitive uses, I suppose,
like profiling based on ethnicity, or even health?
Speaker 4 (08:21):
I guess there's a couple of issues there. I mean,
the first is that there is always a risk with
the use of biometric technology around bias and profiling, and
that is why, for
Speaker 3 (08:32):
Example, to come.
Speaker 4 (08:33):
Back to that earlier the ashue I talked about, you
need to be able to set the match criteria quite
a higher level. A lot of the biometric tools that
have been developed overseas aren't representative or reflectives of New
Zealand's population. They're not very good at recognizing people with
darker skin types as well, for example, And so you
need to be able to assure your customers because at
(08:57):
the end of the day, you want the trust and
confidence of your customtomers. You need to be able to
assure them that those that are matched are matched accurately.
Speaker 3 (09:06):
And you also need to have pretty strict criteria if
Speaker 4 (09:10):
you're using it for, say, retail crime reasons, around who
ends up on a watch list and why they end
up on a watch list. You've got to watch out
for any human bias coming into how the systems are used.
Speaker 2 (09:20):
How will compliance be monitored, or even enforced, and
what penalties might there be in place once the code
is enacted?
Speaker 4 (09:30):
So the Code includes the same requirements and obligations on
businesses and organizations that exist in the current Privacy Act
at the moment. For example, if you feel that your
privacy rights have been intruded on or you've been treated
unfairly in terms of the management and protection of your
personal information, what we say is that you should first
(09:53):
approach the organization concerned, and if you can't resolve your
concerns with them, you can, and this is the same
under the code as well, complain to our
office. If we feel that your complaint has merit,
we will investigate further. We also have a
compliance and enforcement team and one of the roles that
(10:13):
they carry out is just generally doing proactive scans across
what's going on
Speaker 3 (10:18):
in New Zealand.
Speaker 4 (10:20):
One of the things about New Zealanders is that they're
not shy these days about complaining, and so if we
see, for example, that there's an uptick in media stories
about a particular organization and how it's using biometric technology,
we have the right and the ability to go and
see what's going on.
Speaker 2 (10:36):
What rights do individuals have regarding their biometric information?
Speaker 4 (10:42):
Again, individuals have the same rights as they have under
the Privacy Act. So, for example, you have the right
for your information to be held securely. We don't want
people building up databases of personal information and then that
information being at risk from a cyber attack.
Speaker 3 (10:59):
And I guess the key point here is that, for
Speaker 4 (11:03):
Example, I had my driver's license stolen through a cyber attack,
I can go and get a new driver's license, right,
I can get a new driver's license number.
Speaker 3 (11:13):
That's okay. If I have my fingerprints stolen, or my
iris scan stolen, or my face stolen through a cyber attack,
that's not just information about me, that is me, if
you know what I mean. And you can't replace that;
it's gone.
Speaker 4 (11:27):
It's out there, probably being sold on the dark web.
So individuals need to know they have the right to
have their information held securely. Information that is held,
you also have a right to access as well. You
can ask organizations for the information they hold about you.
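One common way to honour that "held securely" obligation, sketched below with the widely used Python cryptography library, is to encrypt stored biometric templates at rest, so a stolen database is useless without the separately held key. The code itself doesn't prescribe any particular mechanism; this is just one illustrative safeguard, and the template bytes here are a placeholder.

```python
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service,
# never alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

template = b"serialized biometric template bytes"  # placeholder content
stored = fernet.encrypt(template)    # ciphertext is what lands in the database
recovered = fernet.decrypt(stored)   # only possible with access to the key
assert recovered == template
```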
Speaker 2 (11:44):
And you said that you looked internationally at what other
countries have been doing around these kinds of codes and
this information. What kind of lessons did you guys learn,
either good or bad, from what others are doing elsewhere?
Speaker 4 (11:57):
Well, we're actually in a little bit of a catch-up
mode here with this code. Other countries have for
quite a while now treated biometric information as sensitive personal
information because of its inherent nature that it is you,
not just about you, and so they've already had their own
Speaker 3 (12:16):
rules and regulations in place.
Speaker 4 (12:18):
A number of the countries that we are particularly close
to and compare ourselves with, if you think of Australia,
the UK, Canada, those sorts of countries, have
been and are currently looking at how
Speaker 3 (12:31):
they regulate this information as well. So it's not just us.
Speaker 4 (12:35):
All around the world, privacy regulators are looking at the
issue of the increasing use of this technology, and ensuring
that when it is used, trust and confidence and
people's privacy rights are not impacted by that.
Speaker 2 (12:48):
It's funny you say that we're in catch-up mode,
because when I read about this code, I thought, finally
New Zealand is getting ahead of the curve because we're
always talking about, especially on this podcast as well, when
it comes to AI or anything tech, really how far
behind we are, how far behind the laws are. For example,
you know, there are some laws that don't even really
(13:11):
understand that a phone is in our pocket these days,
you know what I mean. So it really does feel
like even for a lay person that we are getting
ahead of this. Are we that behind?
Speaker 4 (13:23):
First, the Privacy Act itself is actually technology neutral,
it's not, kind of like, you know, anti-technology, and
you know, at the end of the day, we want
New Zealand to benefit from the use of innovative technologies.
But what it's about is creating, I guess, guardrails for
how that technology is used, because, at the end
of the day, doing privacy well is going to
(13:46):
be good for the individual, but it's also good for business.
One of the things we're increasingly seeing in overseas surveys,
and this reflects perhaps a younger, more digitally savvy group
of people growing up, is that if people are unhappy
about how their personal information is being managed by a company,
by an organization, they can quite easily pick up and
(14:09):
leave and go somewhere else and we see that happen
increasingly in New Zealand as well, and so businesses are
now becoming more and more aware of that.
Speaker 3 (14:20):
And so,
Speaker 4 (14:22):
as long as any new regulatory frameworks like this code
are practical for them to use, and we provide, as
we have, lots of guidance about how to do it well,
they're accepting of that as a way to help ensure
that when they do use this new, innovative technology,
it is, as I say, good for them as well
(14:43):
as good for the individual.
Speaker 1 (14:49):
Another thing we can do to protect ourselves, though, is never
rely on a biometric as a single factor of authentication.
It should be part of multi-factor authentication. That way,
even if my fingerprint does get compromised, well, I have
more fingers. But even if that were the case, I'm
still relying on more than just that alone. I'm relying
on a password, something I know, or something I have,
(15:12):
like a particular device. So with multi-factor authentication, we're
reducing that risk surface by spreading out the different security
mechanisms we're using.
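That advice translates into a very simple policy, sketched below with hypothetical factor checks: authentication succeeds only when at least two independent factors pass, so a compromised fingerprint on its own gets an attacker nothing.

```python
def authenticate(biometric_ok: bool, password_ok: bool, device_ok: bool) -> bool:
    """Require at least two of: something you are (biometric), something
    you know (password), something you have (a registered device).
    Purely illustrative policy logic."""
    return sum([biometric_ok, password_ok, device_ok]) >= 2

# A stolen fingerprint alone is not enough...
assert authenticate(biometric_ok=True, password_ok=False, device_ok=False) is False
# ...but combined with a registered device, access is granted.
assert authenticate(biometric_ok=True, password_ok=False, device_ok=True) is True
```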
Speaker 2 (15:25):
And I note that businesses already using this biometric information
one way or another have until August next year. Is
that just logistics-wise? Does it just take a while
to, I guess, recalibrate the technology that they're using at
the moment?
Speaker 3 (15:41):
That's right.
Speaker 4 (15:42):
We wanted to ensure, again from a practical point of view,
that any business that is already using biometric technology had
a period of time just to reassess its systems against
the requirements in the code, to seek any further advice
they needed, to carry out, for example, an updated what
we call privacy impact assessment, to relook at the privacy
(16:03):
safeguards they've got in place, and to think about whether they
might want to add in some more, for example, to
ensure that they are consistent with the expectations that are
set out in the code.
Speaker 2 (16:14):
And it's really incredible what biometric information we do carry.
I mean, you mentioned the iris scan and the fingerprint.
That's what people are kind of used to seeing, but
you've also got things like the way someone walks, the
way someone moves. That's, you know, things that we may
not think of. Did you ever think when you first
(16:36):
became Privacy Commissioner, or came into this space, that you
would be speaking about people's iris scans, fingerprints and the
way they walk, talk and get excited when they see
a product or something.
Speaker 3 (16:51):
Yes, I did.
Speaker 4 (16:52):
And interestingly enough, we've been through quite a consultative process in
developing this code. It started actually a little while
ago now, a few years ago now. But increasingly, both overseas
and here, you see this sort of information used, and you
even, of course, can see it heralded in popular literature,
(17:13):
particularly, say, for example, the sort of sci-fi literature
you've talked about.
Speaker 3 (17:17):
It is a reality.
Speaker 4 (17:19):
New Zealand wants to benefit from and use digital innovation,
new technology.
Speaker 3 (17:25):
And part of my role, with our
Speaker 4 (17:29):
technology-neutral Privacy Act and our regulatory framework, is ensuring
that new technology can be used in a way that
is safe and as protective of people's privacy rights as
Speaker 3 (17:41):
it can be.
Speaker 2 (17:42):
Thanks for joining us, Michael. That's it for this
episode of The Front Page. You can
read more about today's stories and extensive news coverage at
nzherald dot co dot nz. The Front Page is produced
by Jay and Richard Martin, who is also our editor.
(18:03):
I'm Chelsea Daniels. Subscribe to the Front Page on iHeartRadio
or wherever you get your podcasts, and tune in tomorrow
for another look behind the headlines.