Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Ian Horne (00:05):
Hello everyone and welcome to the Money Pot. This is a special episode recorded live from Money 2020, Amsterdam 2024. And, as you can tell, I'm energetic. I'm happy with the show. We're on day three right now. The weather is mediocre, but the show has been excellent, so I'm in good spirits. I'm joined today by co-host Rachel Morrissey, the
(00:25):
ever-present and wonderful Rachel Morrissey. How are you doing today?
Rachel Morrissey (00:29):
I'm, uh, you know, I just finished my session on the main stage, and now that that pressure is over and my knees have knocked together, I am doing well. I much prefer this stage.
Ian Horne (00:40):
Main-stage Morrissey. It's got a ring to it.

Rachel Morrissey (00:41):
There you go.

Ian Horne (00:45):
Yeah, no, I came here, I recorded one yesterday, having come from the main stage, and I think my adrenaline levels were still through the roof. I know my intro to it had energy; at least it felt like it did. I don't know. I'm sure it did, Rach. Our topic: combating fraud through cross-sector collaboration and shared intelligence.
(01:07):
Now that is quite the topic, and we've got Laura Barrowcliff joining us to talk about it. So, Laura, welcome to the Money Pot.

Laura Barrowcliff (01:12):
Thank you very much. Yeah, thank you. Really glad to be here. And similar to you, Rachel, I did a panel session yesterday. It's that pressure, isn't it? You kind of build yourself up, and then afterwards you breathe.
Rachel Morrissey (01:25):
I can relax now. You take a couple of deep breaths, get all that extra oxygen, and your adrenaline levels are, like, through the roof. You come off and you're like, ugh.
Ian Horne (01:33):
Oh, it's mad, isn't it? It's actually mad. And you were moderating, right, Laura?

Laura Barrowcliff (01:38):
Yes, I was moderating, so I was in kind of the easiest seat, I thought. No, moderating is hard; I think moderating is hardest. I quite enjoyed it, actually. I'm going to take the badge of "moderating is hardest."
Rachel Morrissey (01:50):
I enjoyed it.
Ian Horne (01:52):
Right, so let's get
into fraud.
Rachel Morrissey (01:53):
Yeah, let's
talk about this, let's talk some
fraud.
Ian Horne (01:55):
Okay, so fraud. This is a UK one, Rachel, but you'll be fine, it's a familiar theme.
Rachel Morrissey (02:00):
Well, fraud is a big, fat, lovely industry unto itself, right? Like a $2 trillion industry. Absolutely, criminals are good at this. They're making a lot of money. It does pay, until it doesn't pay.
Ian Horne (02:16):
You just absolutely USA'd us there by saying trillion, because I was going to say digital identity fraud cost the UK finance industry over $1.2 billion in 2022 alone. And now we feel really small fry.
Rachel Morrissey (02:23):
No, no, no. Globally, it's trillion. It's a global problem as well.
Laura Barrowcliff (02:29):
So, you know, we've put a UK figure in there, but actually GBG is a global company, so we see this across the board. In the US, massive synthetic identity problems. In Australia, actually less of a problem with synthetic IDs but more of a problem with scams and data breaches.
(02:50):
So fraud is absolutely a global problem; it's not, you know, country-specific. It is absolutely a problem in the UK and something we need to solve, but it's a problem everywhere.
Rachel Morrissey (03:01):
Yeah, I mean, I was looking at this last year because we did the fraud summit, and you guys have done a summit here that deals with a lot of the issues around fraud. And one of the things that became very clear to me is that fraud is a business. The fraudsters treat it just like it's a business.
Laura Barrowcliff (03:21):
They're professional criminals. Absolutely, it's industrialized. You can still get your opportunistic fraudsters, you know, people just trying to chance it, but it's a business in itself, and actually that gets worse as technology advances. They
(03:41):
can essentially do it in volume, at scale. It is absolutely a business in itself.
Rachel Morrissey (03:50):
You're making it sound great. All the things we would celebrate, at scale, in volume, if it wasn't illegal.
Laura Barrowcliff (04:00):
It's a great business. We'll do hyper-personalized fraud at Fraud 2020.

Ian Horne (04:04):
This is the fraud part. It's a great business.

Laura Barrowcliff (04:04):
That's it. It's been a pleasure. Seriously.
Ian Horne (04:08):
Anyway, digital identity fraud is massive. Oh yeah, obviously, Laura, you're here to talk about it. You're an expert in it. What are the current trends that you're seeing in this? I'm still talking about it like it's a really cool thing. Oh yeah.
Laura Barrowcliff (04:25):
What are people doing? What should we worry about? I mean, as we said, it's absolutely increasing at scale. AI, I think, gets a lot of bad press, and AI does increase the risk of fraud. We've all seen videos of deepfakes this year. We've all seen deepfakes going into the various elections going on, which is scary. But AI is also an opportunity, I think, from a fraud detection perspective, and we can get into that maybe a bit later on.
(04:48):
But in terms of some of the trends, I think synthetic ID is absolutely an issue, and actually an identity stolen every 22 seconds at the moment is something we look at also, which is pretty scary, isn't it? Every 22 seconds.
Rachel Morrissey (05:08):
Three a minute, yeah. As we're speaking here, identities are being stolen. Ninety identities will be stolen while we have this podcast.
Laura Barrowcliff (05:17):
Yeah, good maths, good maths, I'm glad you did that. But actually, about 85% of synthetic identity fraud goes undetected. So if you think about that from a business perspective, but also from a consumer perspective as well, it's actually really, really scary. You know, we've got some statistics of how often
(05:38):
identities are stolen, but actually a lot of that goes undetected. So there's a lot of unknown in what is happening as well.
Rachel Morrissey (05:44):
You know, it's interesting. Yesterday we were having a podcast with a guy from the London Stock Exchange, and we were talking about reusable digital identities, and we mentioned Aadhaar, because Aadhaar was sort of this revolution in digital identity. And he said within the first year there were
(06:05):
over, I think he said, like 200,000 false digital identities created in Aadhaar. So yes, it was, and yes, it did, but there was also this loophole, this kind of gap, where things got created that are not real but act real.
(06:26):
And I thought, that makes this a problem. You're like, okay, we thought we had solved something here, and now we've got a new problem. It's whack-a-mole, you know.
Laura Barrowcliff (06:35):
You know, absolutely. Reusable identity makes it easier for the consumer. You can own your own identity, all of that, but fraudsters just find a new way of stealing identity credentials. So, I know this is a bit of a bingo phrase, but it's absolutely not a silver bullet in terms of the digital identity piece. Actually, what you need is intelligence around fraud on top
(06:58):
of the digital identity piece as well, because it's forever changing. Fraud is real time, so you can't just stick with an identity and then expect that to stay the same. It's absolutely happening.
Rachel Morrissey (07:10):
Well, I'm just
glad you said it, because I
really needed to fill out thatbingo card.
Ian Horne (07:16):
Yeah, but with an identity as well, we're seeing people put everything in one place, and that almost sounds kind of scary, because if someone can steal one thing, they can notionally access everything. Obviously, some of the biggest sessions we've had at the show this year have been about the Open Wallet Foundation or about the EUDI. What do these initiatives need to get right? What are the big concerns here?
Laura Barrowcliff (07:35):
Yeah, they need to get right the kind of multilayered approach to this, because digital identity is not a thing in itself. It needs to go hand in hand with fraud prevention as well. So it's around adding multiple layers of intelligence, I would say. We work with 20,000 customers across the globe, and
(07:57):
we say don't just look at this with a one-size-fits-all approach, and don't just take one single solution to do this. Actually, you need to add multiple layers to your digital identity and your fraud prevention technology. So it's adding multiple layers in. It's actually sharing and collaboration. You know, we talked in the subject heading around
(08:19):
collaboration. Actually, especially in the finance sector, I would say a lot of people are scared of sharing data with other companies, and quite rightly so, from a privacy perspective. I think everybody's worried about, oh, what can I do with this data? What can't I do? But actually, if we're sharing intelligence rather than data, you can really learn from each other, you know, of how to
(08:42):
detect that fraud and how to prevent fraud.
Rachel Morrissey (08:44):
You know, that's interesting, because I sort of understand why they are reticent to share data. I mean, we're still in the dark ages. The UK is a little bit ahead, with the open banking laws that they have and the way that they have data sharing according to customer wants. We're approaching it.
(09:05):
But the thing that I find interesting about that is, you know, there's a fight about who owns the data, and even within banks there are silos of data where they don't share with each other, and some of that is compliance, right. So sharing intelligence versus data is a
(09:25):
very interesting juxtaposition. So how would you define intelligence versus data?
Laura Barrowcliff (09:28):
Really, it's about looking at the insight that you can gain off the data. So, for example, we provide customers back with a trust score. We would give a transaction a score based on all of the data that we see in the background, but we don't share that data back with the individual or the customer. It's really about sharing the score, and it's around sharing
(09:49):
some of the behaviours that we've seen with that data. So that's what we would say is intelligence or insight, rather than sharing data back in the clear. What we're really looking at is how that data behaves within our identity network. We're looking at: do we see any anomalies there? Do we see any combinations of data that don't look quite right,
(10:09):
or that do look right, actually, to identify good customers as well. And we apply a score to that, so we can say, yeah, this looks like a pretty good customer, or, actually, no, you might want to take another look at this customer, or you might want to actually prevent them. And one piece around the intelligence as well is using that before you even onboard customers. If
(10:31):
you think about it as a consumer, you're going through that process of getting on board with a bank or opening a new account somewhere. You really have to jump through many hoops to get through that, from an identity perspective, and that can cause a lot of friction for good consumers. But actually, if you do some of this preemptive intelligence at
(10:54):
the front door, you're not necessarily then adding that friction to the process of getting customers on board. So it's better from a consumer perspective as well.
Ian Horne (11:03):
I want to jump in on something that Rachel introduced just there, which is that collaboration piece. Rachel's right, people understandably don't share data, but when you're having conversations with people, what reasons do they actually give you? And do you think there is an effective workaround, where you can get people to, you know, drop their guard and share data proactively?
Laura Barrowcliff (11:22):
Yeah, absolutely. So some of the pushback that you might get is from a privacy perspective; I think people are worried about what they can and can't do in terms of sharing their customers' data. The other piece, also from a consumer perspective: you obviously worry about sharing your data, but most of our data is out there anyway. As we talked about, fraud is an industry.
(11:43):
Most of our data is available on the dark web. So actually, it's about making sure that you're working with somebody who has gone through all of the privacy processes. We're absolutely, really stringent from a privacy perspective. So, making sure the solution you're using is privacy by design,
(12:03):
and that can really help alleviate some of the reasons that people give around sharing data, for privacy reasons. Also competitive reasons as well. So, for example, we have some customers that don't necessarily want to share their bad customers with their competitors, because they're like, actually, we want them to get the bad customers.
(12:25):
So if they get them, we don't.
Yeah, yeah.
Rachel Morrissey (12:28):
So there is a bit of that, a little bit like: I don't have to outrun the bear, I only have to outrun you.
Laura Barrowcliff (12:33):
Yeah, yeah, but everybody's trying to solve the fraud problem, so it's like, come on, guys, grow up and share.
Rachel Morrissey (12:42):
Well, from a consumer perspective, I mean, obviously it's hard to say, but maybe we're a little bit more sophisticated consumers now, because we have so much knowledge about this. We talk about privacy, and it's just sort of ephemeral at this point. Like, I mean, some of us are like, oh, they have
(13:05):
everything, just approve the damn cookies, and who gives a crap? I mean, at this point you sort of feel like you're out there, and I'm not sure that's always a good thing. There is something about wanting to pull back on that that I understand. But what I think is funny about the competitive one is they would really all benefit from sharing, because if you're
(13:28):
talking about preventing onboarding a bad customer, or a fraudulent customer, right, not really a real customer, then if you are sharing the insights, and you're sharing the data that's feeding those insights, and you're seeing patterns at a greater scale, then you're preventing it for you too. Like, there's a lot more sensibility around sharing, but
(13:53):
I think collaboration is hard when you are in a competitive environment.
Laura Barrowcliff (13:57):
Oh yeah, yeah, but I think people are adopting that even more so now. So if I just give you an example, we've seen a guy that we lovingly refer to as Mr C in our network. Over the last 12 months we've been tracking this guy, and we've seen him apply over 5,000 times now to multiple
(14:21):
organizations, 23 different organizations that are in our network, that Mr C is trying to defraud. So we've then been able to tell some of our customers, of those 23 different organizations, like, look, this guy is attacking you. He's also attacking companies over here. Again, we never give the names of the different
(14:42):
organizations that are using the network, but actually we can see this across different sectors. So we've talked about the finance sector here, but Mr C is targeting gambling companies, he's targeting retailers, he's targeting financial services companies. So, like you say, it's going to benefit them, as well as others in the network, if they
(15:02):
just share some of that intelligence. And actually, the funny thing, as you mentioned, Rachel, is within organizations data is siloed, but we have many customers who have acquired other organizations, so they've actually got multiple brands under their banner, and again, they're using different processes. They're keeping their own data to themselves, but we can see Mr
(15:23):
C across these different brands. So I think people are really starting to adopt this approach now and can see the sensible approach around, you know, collaborating and sharing some insight. And I now want a Mr C comic strip. I want to see the adventures of Mr C the fraudster.
Rachel Morrissey (15:42):
And also he
sounds really energetic.
Ian Horne (15:44):
Yeah, who's got time?
Laura Barrowcliff (15:48):
Who's got
time to do that?
Yeah, this guy's legit.
Rachel Morrissey (15:51):
He's going on
one of our summits.
I'm talking to him this week,seriously.
Ian Horne (15:55):
One thing I actually want to look at here: obviously, collaborating is how we can get stronger as an industry to protect against fraud. But what competitive advantages do we have over fraudsters? Because, as you say, they're sophisticated. Every time we create something new, they create a new approach. AI is great; all of a sudden, AI is good for fraud. Biometrics are cool; all of a sudden, we find ways to do deepfakes, and so on. So do we literally just have first-mover advantage, or
(16:18):
is there some other thing that we've got going for us?
Laura Barrowcliff (16:20):
Yeah, no, I think it is about finding those holes within the processes. You may never get ahead of fraudsters, but I think being able to, again, work together across different organizations and across different sectors to identify, because some fraudsters will be stupid as
(16:41):
well, some of the things that, you know, they're doing wrong, or to identify some of the holes in their processes, and just trying to get ahead of them, really. But yeah, it's a cat-and-mouse game.
Rachel Morrissey (16:53):
I even think there's actually an aphorism right there: nothing makes it so easy to catch a criminal as them being stupid. Yeah, I love that. So, as you're talking about barriers to collaboration, what
(17:16):
are you finding when people do collaborate? What are they finding for themselves?
Laura Barrowcliff (17:19):
Yeah. So I think they're really surprised at the results that they see. We do some studies on our customers' data and show them some results, and they're like, we didn't even know we had this problem. You know, we might analyze a month's worth of data, and
(17:40):
in one of our customers' cases we found over 6,000 records that were suspicious, for them to then go on and investigate, within a month. So they're really surprised at some of those results. And what we see as well is, again, highlighting some
(18:01):
of the holes that they have in their processes. So, for example, companies are using one-time passcodes, thinking that they are ticking a box, a security box, and that's secure. But actually we see fraudsters intercepting those one-time passcodes straight away.
(18:22):
within four hours we sawmultiple matches to these
disposable mobile numbers thatpeople go on and register online
and just use for one-timepasscodes.
Again, we were able tohighlight that to some of our
customers and they're like, whoa, we didn't really know we had
this issue.
So I think that's the positivepiece that our customers are
seeing is actually the benefitof, of that collaboration um, so
(18:44):
it's really seen in in theresults that's terrifying.
Rachel Morrissey (18:49):
I'm sorry, I
just.
I just realized how many timeson my phone I get a one-time
answer like oh god okay, yeah, Iguess for businesses.
Ian Horne (18:58):
What are the main
vulnerabilities?
Rachel Morrissey (19:00):
that people
have right now because I feel
like this.
Ian Horne (19:02):
Let's say it
constantly evolves.
As you say, individual peoplewithin a company could be a weak
point.
It could be human behavior.
That's a problem.
It could be something else.
But what are the biggest risksthat we've got, that perhaps are
new, that people aren't savvyto?
Laura Barrowcliff (19:12):
I think it is around, because businesses have tended to concentrate on what they need to do for regulatory reasons, or to tick the compliance box, at that onboarding stage of the customer journey. But actually, they're really vulnerable at that onboarding stage. So I think it's thinking about how do you block that at the
(19:35):
front door, how do you really put that in place to preempt the crime that's going to happen further down the line. So I think that's maybe some of the vulnerabilities. And then, as we've talked about, there are obviously new technologies, there's AI, there's generative AI, that you have to worry about. But I just think it's really making sure, from the business's perspective, that they're working with a vendor who is
(19:56):
keeping on top of some of those fraud complexities that are arising.
Rachel Morrissey (20:04):
That makes me laugh in a way, because we're working with this juxtaposition, right? On one side, they're being pressured to make onboarding more frictionless, and on the other side, that is where they are most vulnerable.
Laura Barrowcliff (20:18):
Yeah.
Ian Horne (20:19):
Right.
Rachel Morrissey (20:20):
And those two things are in conflict, because a little friction might decrease the vulnerability. Definitely.
Laura Barrowcliff (20:27):
Yeah.
Rachel Morrissey (20:29):
And so what do you think the balance is? What would you tell your customers about the balance?
Laura Barrowcliff (20:40):
It's a really tricky balance, actually. So, for example, we speak to fraud and compliance teams, and they're saying, we need to put friction in this process, we need to make sure we send them down this route and do some knowledge-based authentication, and all of that. Bicycle wheels, oh yeah, I'm kidding.
Rachel Morrissey (20:59):
I hate the
bicycle wheels.
I'm like I don't know, I can'teven see it anymore.
Laura Barrowcliff (21:04):
So yeah, from that perspective. But then you see the marketing team saying, hang on a minute, we need to get more customers on board.
Ian Horne (21:16):
Or the customer experience team saying, we need to make this customer journey really seamless, we want it to be, you know, biometric-led and all of that.
Laura Barrowcliff (21:19):
So it's really difficult to get that balance. But I would say, back to that kind of multi-layered approach: if you have to do, for example, document verification and a selfie at the beginning, that is quite a lot of friction for a customer to experience. It's just making sure you're balancing that risk with what you're actually, you know, going to lose if you don't
(21:40):
do that. So I think it's making sure you add the things that you need to do from a regulatory perspective, but also add that intelligence that we've talked about at the beginning, so you can do that kind of seamless stuff in the background without putting too much friction into the customer journey.

Rachel Morrissey (22:00):
The selfie thing seems almost like a generational one. Like, you're asking me to do a selfie, I would be like, I don't want to do a selfie. But if you're asking Gen Z to do something, they're like, I do one every five minutes, why not? I wouldn't have any friction there for them. And I almost wonder about that being a generational friction.
Laura Barrowcliff (22:15):
Oh yeah, well, you think about, like, I think about my mom and dad, for example. They would find that really difficult to do. Do you get a duck face or not? Sorry. So it is thinking about what works, and the inclusion side of things as well, making sure your
(22:35):
processes can be used by people with disabilities, or people that are not used to doing that kind of technology process as well. So I think it's making sure you're plugging in, again, those multi-layered insights at the beginning of that onboarding stage to make that as seamless as possible. It's also good to prioritize your good customers using that
(22:57):
insight as well. So actually, if you see that this is a really good customer, we've been able to collaborate and see from other sectors and from other companies that they use their data really well, that their identity score is really good, that they're really trusted, then you don't have to put that friction in the process. So it's just thinking about
Rachel Morrissey (23:15):
That's interesting, the idea that the collaboration could actually feed one another's information about how to onboard, so that you can decide who gets a little more friction and who doesn't. Oh yeah, that's it. That's a really personalized UX, actually.
Laura Barrowcliff (23:27):
It's thinking about good customers and bad customers. It's not just about stopping the fraudsters. It's actually thinking about your good customers. Why should good customers lose out because you're trying to stop the fraudsters?
Ian Horne (23:37):
Yeah, but how can you
spot a bad customer?
Because obviously you know Mr Cis not even trying anymore to
hide what he's doing.
Laura Barrowcliff (23:44):
You know, all day, every day, he is frauding, and he frauds hard, and that's great, and he's having a good time. But yeah.
Ian Horne (23:50):
Is there a way to
spot someone early, before
they've done their like 900thfraud?
Laura Barrowcliff (23:53):
Yeah, yeah, oh, absolutely. We see these behaviors time and time again within the network, so we're experts in being able to identify this. So you can see it before it gets to the 900th time; we can see this a few times in. So it's looking at the velocity of the transactions, looking at the number of times somebody is trying to
(24:14):
come through. It's looking at anomalies within the data. So our lovely Mr C very often changes around his full name and his middle name, for example, or just changes his date of birth slightly. So it's looking for those anomalies within the data, and looking at things like, so, using data like mobile numbers, which usually people would only have one, two, or three of,
(24:37):
that then link different data types together. So it's using those data types to link them to an address, or to a name, that you then see switching around within the network. So it's spotting some of those behaviors.
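The behaviors described here, high application velocity and one mobile number linked to many different names, can be sketched as simple grouping rules. This is an illustrative sketch only: the field names, thresholds, and scoring are assumptions for the example, not GBG's actual network logic.

```python
from collections import defaultdict

# Illustrative thresholds -- a real network would tune these.
VELOCITY_LIMIT = 5   # applications per phone number before flagging
LINK_LIMIT = 3       # distinct names per phone number before flagging

def score_applications(applications):
    """Group applications by phone number and flag the behaviors
    described above: high velocity, and one number, many names."""
    by_phone = defaultdict(list)
    for app in applications:
        by_phone[app["phone"]].append(app)

    flags = {}
    for apps in by_phone.values():
        names = {app["name"].lower() for app in apps}
        reasons = []
        if len(apps) > VELOCITY_LIMIT:
            reasons.append("high velocity")
        if len(names) > LINK_LIMIT:
            reasons.append("one number, many names")
        for app in apps:
            flags[app["id"]] = reasons
    return flags
```

A production system would add many more signals (address and date-of-birth mutations, device data, cross-sector history) and score in real time, but the grouping-and-thresholding shape is the same.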
Ian Horne (24:49):
Essentially, things that don't look quite right. Yeah. And I guess we're seeing the evolution of technology here.
We keep talking about, you know, hyper-personalization. We keep talking about seamless transactions, all that kind of stuff. And it makes me think which customer market we're going for. Obviously, with the new kinds of propositions we have, when you look at embedded finance, open banking,
(25:09):
all that stuff, I see the mass-market opportunities. All this technology we're seeing on fraud, does that create a mass-market vulnerability? All these kinds of AI tools that we've got now, does it mean that more people can be defrauded more quickly?
Laura Barrowcliff (25:22):
It does, but as I said at the beginning, it's also an opportunity. It's an opportunity for the financial services industry. It's an opportunity for people like GBG, and there are obviously other suppliers out there that provide fraud prevention and identity verification. We use AI to be able to tackle those fraudsters as well, so
(25:43):
it's an opportunity as well as a threat.
Ian Horne (25:45):
Right, I want Rachel to get the last question, but I'm going to ask one first. I'm sorry, I can't lose this concept of Fraud 2020. I'm thinking of alternative careers for myself. I think Fraud 2020 could be the one.
Laura Barrowcliff (25:56):
Mr H. Exactly, exactly, I can do that, six or 7,000 a day.
Rachel Morrissey (26:01):
Maybe it could be your side hustle.
Laura Barrowcliff (26:03):
It could be
and no one ever needs to know.
Ian Horne (26:04):
But you know, if I'm booking the content agenda for Fraud 2020, I want to get some tips. What topics am I booking?
Laura Barrowcliff (26:11):
You're booking AI. I know it sounds kind of obvious, but it's so cliché. I think it's looking at, you know, reusable identity is an interesting one as well. It's like how some of what look like
(26:34):
solutions to reusable identity actually have some of the holes, as you mentioned before, Rachel. I think maybe regional and country differences as well is an interesting one.

Ian Horne (26:46):
Great, every nation gets a stand.
Laura Barrowcliff (26:48):
Every nation gets a stand. You know, I can see all the flags now for Fraud 2020. But it's thinking about, again, learning from other countries, what they're seeing, because there are different problems within different countries, different data availability, different regulations, et cetera. So I think maybe those would be some of the topics. Love that.
Rachel Morrissey (27:06):
I just
realized that it would be a
really interesting culturalstudy to just look at what fraud
is popular where.
Ian Horne (27:15):
Actually, I'd be
fascinated.
Rachel Morrissey (27:17):
It would be
fascinating just to look at all
the geographies and say whatfraud is working within what
cultures.
Laura Barrowcliff (27:23):
I mean.
Rachel Morrissey (27:23):
I'm not sure we should collect that, because then we'd inform the fraudsters: oh, go do that there, that's a good idea. But it would be really interesting to know what you're more vulnerable to because of the cultures that you're raised in.
Laura Barrowcliff (27:35):
Yeah, absolutely, and it depends. Like I said, data availability varies. So, for example, we see in the Middle East that people don't celebrate their birthdays; they usually put it as the first of the month. So that's again a vulnerability that companies who operate in the Middle East really need to think about, because something that we see as quite unique is not unique in
(27:57):
the Middle East. So it's thinking about other ways to then find that uniqueness.
Rachel Morrissey (28:02):
You know, everybody shares a birthday. There are so many shared birthdays. That would never have occurred to me. Yeah, I'm way too Western. Oh my God, I'm just a dumb American. I just fell right into that.
Ian Horne (28:14):
Okay, it's all right, I'm a British person who's been mispronouncing everyone's name for the last week. It's all good. But yeah, I see that. Again, we'll talk about it, with the selfies and Gen Z, and, you know, segmentation, let's not go there because we don't have time.
Rachel Morrissey (28:34):
No, it would be fun, I would love to. But yeah, well, we're going to have to, like, hire you as a consultant. I think we will.
Ian Horne (28:37):
So, yeah, we'll wrap it up, but this has been the Money Pot. I better not say the wrong one; it has been the Money Pot. It's been really good fun, live again from Amsterdam. It's been a pleasure having the live audience with us; that's given me some energy. I've really enjoyed that. Laura, it's been fantastic having you on. Thank you, appreciate it. Rachel, as always, what a co-host. Oh, thank you. And of course, of course, thank you
(28:58):
to all of our fintech nerds listening in. Again, this is the Money Pot. Goodbye for now.