Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Back to the Privacy Commissioner, who's found that the facial
recognition trial by the Foodstuffs people was compliant with
the Privacy Act. It also worked: violent behavior was reduced
by sixteen percent and it prevented more than one hundred serious harm incidents.
Michael Webster, the Privacy Commissioner, is with us this morning. Good morning, mate,
what sort of lens did you look at this through?
Speaker 2 (00:21):
So when we looked at the trial of the use
of facial recognition technology, we looked at it from the
point of view of whether its use was justified,
whether it was proportionate and necessary to address the problem,
the problem being serious harm caused through retail crime. And
we also looked at, as it was used, what privacy
(00:43):
safeguards needed to be put in place to ensure that,
as much as possible, New Zealanders' privacy was protected.
Speaker 1 (00:50):
At the same time, did it pass easily or scrape through?
Speaker 2 (00:55):
We determined that its use by Foodstuffs was compliant with the Act,
but we also recommended to Foodstuffs a
number of steps they need to take to improve and
enhance how it's being used, and we also set out
some recommendations for others who might be thinking of using
it as well. There are still concerns around technical bias
(01:18):
issues with the use of facial recognition technology software that
comes from overseas.
Speaker 1 (01:23):
Right, is that improving rapidly, do you know, technologically speaking,
or not?
Speaker 2 (01:28):
It is improving. What we would like to see here
in New Zealand is a development of our own data
set to use with FRT, so representative of the New Zealand population.
Speaker 1 (01:38):
That'll come with time, won't it?
Speaker 2 (01:42):
We're certainly recommending to the powers that be that that
sort of thing happens.
Speaker 1 (01:47):
Right, when you say proportionate, who decides on proportion?
Speaker 2 (01:51):
Well, what Foodstuffs did was get an independent evaluator
in to examine the degree to which the use of
the technology reduced, say, aggression or bullying behavior, or
large-scale retail theft. And so we've examined that data.
We also went out and spoke to customers and staff
(02:13):
in supermarkets, looked at how the systems were being used,
and, given the degree of concern that's out there, Mike,
about retail crime, we determined that the way in particular
that Foodstuffs had implemented the system met that very high
threshold in the Privacy Act.
Speaker 1 (02:30):
So how much weight did you place on the violent
behavior reduced by sixteen percent? Was that a slam dunk,
or was that kind of we'll put that in with
the mix of other things we worry about?
Speaker 2 (02:42):
That was an important statistic. What was also important was
some of the safeguards put in place. For example, Mike,
New Zealanders might be surprised to know that actually close
to two hundred and twenty six million faces were scanned
during the trial, but ninety nine point nine nine nine
percent of them were immediately deleted. And that was because
(03:03):
we suggested to Foodstuffs, and they took on board the
idea, that if there wasn't a match, the face got
immediately deleted. So throughout the life of the trial, only
about three thousand or so scans of interest were actually
actioned and fed into the system.
Speaker 1 (03:18):
See, I, as a punter, couldn't care less. If you
can reduce crime and it makes life easier for them
to do business, I'm all for it. Am I normal
or not really normal?
Speaker 2 (03:29):
We did a survey this year and we asked questions
about exactly that, Mike, and what we saw was that
about two thirds of New Zealanders were willing to see
an increased use of what we described as privacy-intrusive technology,
which FRT, facial recognition technology, is. They were willing to see
its use if it reduced theft, if it increased personal safety.
(03:54):
That said, forty to fifty percent of New Zealanders were
concerned about the use of facial recognition technology as well.
So people are willing to see this technology being used,
but they want it to be used in a way
that is absolutely as privacy protective of them and their
personal information as possible.
Speaker 1 (04:12):
All right, good. Appreciate it. Michael Webster, who's the
Privacy Commissioner.
Speaker 2 (04:15):
For more from The Mike Hosking Breakfast, listen live to
Newstalk ZB from six am weekdays, or follow
the podcast on iHeartRadio.