
June 17, 2025 12 mins

Having experienced the devastating consequences of online criminal activity, founder and CEO Andrew Bud vowed to find a safe and secure way for organizations to verify the genuine presence of an individual. To enable this, it was necessary to ensure that the person setting up an online account was an actual person, and then to authenticate that person whenever they returned to use a digital service or verify their identity at a secure location.

The Big Idea was to use facial biometrics and controlled illumination to assure the genuine presence of a human being. Why facial biometrics? Because most government-issued identity documents contain a picture of a face, so an identity can be verified against a trusted source. Why controlled illumination? Because it's completely effortless for the user, with no instructions to follow or complex actions required: just a brief, simple selfie capture that takes only seconds to complete. This gives strong assurance that an individual is not an imposter using a photo, a mask, a deepfake video, or any of the multitude of other sophisticated cyberattack tactics employed by criminal gangs worldwide. iProov does facial verification, not facial recognition. The difference is that with verification tech like iProov's, the user consents and derives a benefit from it.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios: Podcasts, Radio, News. You're listening to Bloomberg Business Week with Carol Massar and Tim Stenovec on Bloomberg Radio. Carol, you have traveled abroad in the last year, haven't you? Yeah? You did. You went and visited your daughter.

Speaker 2 (00:20):
Oh yeah, that's right.

Speaker 1 (00:21):
Yeah, you came back to the airport.

Speaker 2 (00:22):
Thanks too.

Speaker 1 (00:23):
Yeah, no prob, you came back to the airport.

Speaker 2 (00:25):
Yeah.

Speaker 1 (00:26):
You're in line at Customs and Border Protection. Totally. You go up to them, you give them your passport. You say, I'm Carol Massar, and they say, how do we know you're Carol Massar? Facial recognition? Well, facial verification. There's a difference. Andrew Bud is going to explain it. And Andrew Bud's product is actually used by Customs and Border Protection in that exact situation to verify that we

(00:49):
are who we say we are when we are crossing borders.

Speaker 2 (00:52):
I love that you said that because it's an interesting designation.
So let's get to it. Alex, I'm sorry. Andrew Bud
is with us. He's founder and CEO of iProov
and he joins us from Amsterdam. It's good to have
you here, especially as we are continuing to embark on
our new world order thanks to AI. Tell us a
little bit more about your company and the importance of

(01:13):
this distinction between facial verification versus facial recognition, because they're
very different things.

Speaker 3 (01:19):
They are very different, and it's very easy for them
to be confused. Look, facial verification is about empowering the person,
empowering the citizen. If you can use your face to
secure your own identity, if you know that it's happening,
if you've consented to it happening, if you get personal benefit from it happening, and your privacy is protected, that's

(01:42):
facial verification. That's empowering the citizen. Facial recognition is about identifying people, often without their consent, often without their knowledge. It's a completely different thing, and that's about empowering organizations. It's about tracking people. Facial verification is about empowering you. It's about enabling you to use your face as a

(02:04):
security credential. And it's a great security credential because it can't be stolen, it can't be lost, and it can't be shared.

Speaker 1 (02:10):
So is Face ID on an iPhone, for example,
is that facial recognition or facial verification?

Speaker 3 (02:16):
That's facial verification, very much facial verification.

Speaker 1 (02:19):
Facial recognition is essentially used for security purposes by third parties.
Then how would you define facial recognition?

Speaker 3 (02:27):
Yes, facial recognition is really, in our terms, about surveillance. It's about identifying who the person is. Facial verification is when I start by saying, hey, I'm the owner of this phone. Hey, I'm the owner of this passport. Hey, I'm the owner of this enterprise account. And I now want to prove it. I want to use my face to prove that I

(02:49):
am who I've claimed to be. So with facial verification, I start by claiming who I am. In facial recognition, the system recognizes me.
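
In concrete terms, the distinction being drawn is between a one-to-one match against a single claimed identity (verification) and a one-to-many search across a gallery of people (recognition). Below is a minimal sketch in Python, assuming face embeddings come from some face-embedding model; the function names and the 0.6 threshold are illustrative assumptions, not iProov's actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (higher means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(live: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """Facial verification (1:1): the user first claims an identity, and the
    only question is whether the live face matches that one enrolled face."""
    return cosine_similarity(live, enrolled) >= threshold

def identify(live: np.ndarray, gallery: dict[str, np.ndarray],
             threshold: float = 0.6) -> str | None:
    """Facial recognition (1:N): no claim is made; the system searches the
    whole gallery to decide who this is, the surveillance case."""
    best_name, best_score = None, threshold
    for name, enrolled in gallery.items():
        score = cosine_similarity(live, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None if nobody in the gallery matches well enough
```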

Speaker 2 (02:57):
When my doorbell camera says, hey, I recognize somebody familiar, and it's the UPS delivery person. And then it says, I don't know, it's your husband or it's your wife. Is that facial recognition gone wrong? In other words, like, it's picked up some cues and there's something familiar, you know. I'm trying to understand this.

Speaker 3 (03:15):
Yeah. So ask yourself always, does the subject have agency? Does the subject know it's happening, have they consented to it happening, and are they getting personal benefit from it happening? Those are the real questions. So when you turn up at the border with CBP, we have systems installed in nearly a dozen airports around the US for Global Entry, and now increasingly

(03:39):
also for US citizens, at Orlando, for example. You turn up, you can choose to go through these systems that are based upon facial verification. You are exercising your agency by choosing to go and present yourself. You don't have to, but if you do, it's a lot faster, it's a

(04:00):
lot simpler, it's a much better experience, it's a lot
more convenient.

Speaker 1 (04:03):
Well, let's talk a little bit about the big business in the solution, because as I mentioned, the technology of iProov is being used by Customs and Border Protection at entryways at airports, for example. You've raised about seventy million dollars. You've got employees all over the world, the US, UK, Latin America, Asia Pacific, and more. How does the technology work?

Speaker 3 (04:24):
So most of what we do actually involves verifying that people are whom they claim to be when they're not physically present, when they're at home, on their couch, for example, and they're trying to assert their identity for the purposes of opening an account or, maybe very crucially, resetting their credentials when something's gone wrong. When a person's on their couch,

(04:44):
it's really very difficult to know whether they are whom they claim to be. You can look at them. You can use your mobile phone to stream video to a central system and check, do they look like they should, do they look like who they claim to be? But how do you know they're real? We live in a world of AI, of deepfakes, where it's extremely straightforward to put a digital mask over a person's

(05:07):
face and pretend to be, to impersonate, somebody. We stop that impersonation. We can detect very accurately when a deepfake is trying to be used to impersonate someone.

Speaker 2 (05:18):
Come on, Andrew, we've all seen like the Mission Impossible
movies or something like that.

Speaker 1 (05:22):
Those fake faces, that was a long time ago too. They were used twenty years ago, thirty years ago. AI is used nowadays.

Speaker 2 (05:27):
It's a lot better, right? Exactly, like the technology just gets better and better. So I mean, how do we make sure the facial verification systems stay ahead of, you know, the AI that is creating those deepfakes, in terms of their complexity? How do you do that?

Speaker 3 (05:45):
That's the central problem that we solve. That's what we've built a global business doing, and we've been doing it now for quite some time. Two things we do. One is we change the physics of the situation. We illuminate the user's face with a rapidly changing sequence of colors from the screen of their own device. It's an unpredictable sequence of colors and it illuminates their face. And

(06:06):
while those colors are illuminating their face, we stream video back to our servers, where we look at how that screen illumination, those changing colors, reflects off the user's face. They're unpredictable. The attackers can't know what it's going to look like, so they can't pre-prepare some beautiful deepfake. That's one element of it. And then the other is we analyze everything that's happening worldwide all the time. We

(06:26):
monitor the systems and we study it so we can detect the attacks. We can detect if a foreign power, for example, is busily mounting a set of experiments, and we learn from them, and we learn more from them than they learn from us. So we stay ahead because we have visibility and they have no visibility. You know, information advantage enables us to win.
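
In protocol terms, the flashing-colors technique described here is a challenge-response liveness check: the server issues an unpredictable, one-time color sequence, the device's screen plays it onto the user's face, and the server only accepts the capture if the reflections in the returned video actually track that sequence. Below is a minimal sketch of the general idea, with a hypothetical dominant_reflected_color helper standing in for the computer-vision analysis; it is an illustration under those assumptions, not iProov's actual implementation.

```python
import secrets

COLORS = ["red", "green", "blue", "yellow", "cyan", "magenta"]

def issue_challenge(length: int = 8) -> list[str]:
    """Server side: an unpredictable, single-use color sequence, so an attacker
    cannot pre-render a deepfake with the right reflections baked in."""
    return [secrets.choice(COLORS) for _ in range(length)]

def dominant_reflected_color(frame) -> str:
    """Hypothetical helper: estimate which screen color is reflecting off the
    face in this video frame (in practice, a trained vision model)."""
    raise NotImplementedError("illustrative placeholder only")

def verify_liveness(challenge: list[str], frames: list) -> bool:
    """Server side: accept the capture only if the observed reflections follow
    the issued challenge, frame for frame."""
    if len(frames) != len(challenge):
        return False
    observed = [dominant_reflected_color(f) for f in frames]
    return observed == challenge
```

The second defense mentioned, analyzing attack traffic worldwide and learning from it, is an operational monitoring process rather than something a short sketch like this can capture.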

Speaker 1 (06:45):
Carol and I were talking ahead of this when we were preparing for this, and we've been traveling a lot, so it's on our mind. We're members of Clear now because we are on the plane a lot. Sometimes it's faster than the traditional security line, sometimes it's not. It depends, yeah, depending on where you are. But sometimes we can get through security without even showing our plastic IDs. Does Clear use a system such

(07:10):
as iProov?

Speaker 3 (07:12):
I can't comment on Clear. Clear have their own technology; they have a very sophisticated technology. What I can say is that when you go through Global Entry in places like Newark or Los Angeles or Miami, you walk up to an iProov terminal and you don't do anything. You don't have to put out any card or any identification. Your face is captured and

(07:33):
it's matched by CBP itself against the traveler verification system they have. You've already, in the past, given them all of your details, so you're expected because you are coming in on a particular flight, and they kind of go, oh yes, that's Andrew Bud, we were expecting him. We know his details because he's given them

(07:53):
to us and we had them already. He looks like he's supposed to look. Go forward. And the rate at which people can be processed like that is awesome. In the places where we've been installed, we've eliminated most of the queues.

Speaker 1 (08:06):
What happens if I don't look like I looked in the photo or the documents that I provided to the government maybe five or ten years ago? What if I've aged? What if I have a different hair color? What if, let's say, I had a lot of work done to my face?

Speaker 3 (08:21):
One of the great advantages of face verification technology is that it works extraordinarily well, and it works a lot better than people do. There was a study done about a decade ago by a university which showed that the performance of really skilled passport officers was of a level that is now about one hundred thousand

(08:41):
times less good than a modern face verification system. Face verification systems tend to be relatively indifferent to face furniture, as we call it, beards, glasses, piercings, and so on. They're really very good at matching people.

Speaker 2 (08:59):
Hey, can I just ask you, before we get a little bit more into the business, just quickly, is a face better than a fingerprint?

Speaker 3 (09:07):
Yes, absolutely. Why? Because they solve two different problems. When you're trying to verify somebody, you want to make sure that there's a good match, and you're not trying to figure out which of seven billion people in the world this is. You know, they're expecting you, and you just have to make sure that you match the person accurately. And

(09:28):
a face is extremely good at doing that. But the reason a face is better is that the real way these things get attacked, especially remotely, is by fakes, forgeries, copies, these kinds of things. And it's much easier and much more feasible for us to detect a forged or deepfaked face than a forged or deepfaked fingerprint, because there's so much more information, there's

(09:51):
so much more texture, there's so much more depth, there's so much more information around a face. The ambient illumination is very important, so you get much more information from a face, which makes it much more reliable to detect whether a face is real or not.

Speaker 1 (10:05):
Interesting, are we going to start to see this type
of technology being used in more places than just unlocking
something on our phone or getting through the line at
the airport?

Speaker 3 (10:14):
Absolutely, you're going to see this. So already, if you want to set up bank accounts in many parts of the world, you iProov yourself remotely. You iProov yourself to do the Know Your Customer, KYC, thing, to make sure that the person setting up the account is genuinely the person that they claim to be and not an impersonator, a money

(10:36):
mule who's going to launder money. That's already happening in many parts of the world, not so much in the US yet. For enterprises, we're going to see staff doing this both at the time when they're hired and also when they have to reset their credentials. There's now a real problem in the United States with impersonators, particularly members of the North Korean secret service,

(11:00):
impersonating American staff and being signed up and given jobs to work remotely. There was a very public case in the summer of last year. Now, the CTO of Mandiant, which is a cybersecurity company, says that literally every Fortune 500 company has at least dozens, if not hundreds, of job applications from North Korean IT workers whose faces have been deepfaked to look like American

(11:23):
staff when they're not. Three hundred US companies have been scammed by a woman who pleaded guilty in February for having run a scheme like this. So it's a huge thing. So our technology will make sure that the person who is being hired remotely is whom they claim to be. And then, you know, when you come to reset your password,

(11:43):
you haven't got any other way to assure yourself, so that will be used for that.

Speaker 2 (11:47):
Stay in touch. We'd love to check in with you again in the future. Andrew Bud, founder and CEO of iProov, joining us right here on Bloomberg Business Week.