Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
YS Chi (00:00):
The Unique Contributions
podcast is brought to you by
RELX. Find out more about us by visiting RELX.com.
Stephen Topliss (00:09):
Nowadays cyber
crime has evolved into a massive
industry. It's like a business.
We have to assume now that all of us, our names, our addresses, our emails, some of our passwords are all out there.
YS Chi (00:40):
Hello, and welcome to
Unique Contributions, a RELX
Podcast, where we bring you closer to some of the most interesting people from around our business. I'm YS Chi, and I'll be exploring with my guests some of the big issues that matter to society, how they are making a difference, and what brought them to where they are today. Today, we're talking
(01:01):
about cybercrime. The digitalization of physical services is happening at an unprecedented pace, and across almost every sector. What does this mean for the threat of cybercrime? And what are we doing to prevent it? To explore these issues with me today is Stephen Topliss, VP of market planning for global fraud and identity at LexisNexis Risk
(01:24):
Solutions. Stephen, welcome.
Great to have you and thank you for joining the podcast.
Stephen Topliss (01:29):
Thanks YS, nice
to meet you and glad to be here.
YS Chi (01:32):
You are based near
Amsterdam, I guess the Hague in
the Netherlands. How have things been there for you over the last
few months?
Stephen Topliss (01:39):
Yeah, that's
right. I am in the Hague. It's
been an interesting time, I think for everyone. For me personally, I'm actually very used to working from home. I travel a lot in the role I have and when I'm not travelling, I tend to be at home a lot. So I think that transition to working from home was easier for me than most. I think what has been
(02:03):
interesting has been having the whole family around me as I work. So over the last few months, I've spent a lot more time than I ever have doing homework with my three children. I know a lot more about history, geography and physics now than I have done for 20 or 30 years.
YS Chi (02:21):
I doubt that. With a PhD,
I'm sure this has been quite
easy for you.
Stephen Topliss (02:28):
It's
interesting how much you forget.
But it's been very enjoyable.
It's been enjoyable spending time with them. Yeah, yeah.
YS Chi (02:36):
Well, speaking of PhD,
I'm really curious. You went
from having a master's in French, and then a PhD in space physics, now to the cybersecurity space. Love to know more about your journey
to this point.
Stephen Topliss (02:53):
Yeah,
absolutely. So when I finished
school, I didn't really know what I wanted to do. I was inspired by my physics teacher at school and I also really liked to travel and I liked languages. So I went to university thinking, let's try and do something combined with physics and French. I studied in Manchester. I had a year out
(03:17):
studying physics in France and in the end, I still didn't really know exactly what I wanted to do. I had been enjoying physics, and so as I went into the PhD, I had a few different options that appeared there. One was to do a PhD in space plasma physics. I'd never really been that fascinated by
(03:43):
space, I have to admit, but it was still intriguing and it was definitely the most interesting of the options I had. So I spent three years doing my PhD in that, thoroughly enjoyable. But I think what I did learn from that was a couple of things. First of all, as I kind of already alluded to, my
(04:03):
passions probably lie elsewhere and so I felt, maybe this wasn't a career I wanted to commit to. I simply didn't have quite the passion for space that some people do have. The other thing that I learned was, in the space industry, projects take an incredibly long time. So, generally as we were doing research there, we
(04:26):
would come up with a research project. The work I did was based on analysing data from instruments on satellites flying in space. To plan a research project like that, first you have to define, you know, what is the question you're trying to answer, then you have to develop the instrument that's going to
(04:49):
get you the data for it. That instrument then needs to be aligned with a satellite and then the satellite needs to be aligned with a rocket launch, and that takes, you know, probably a five to 10 year process to get to the stage of launch. You then launch it, and then once the satellite's flying, you start getting data back, and then you have several years of analysing that data. So these projects can be, you know, 15
(05:12):
years of your life. I also had the misfortune, I guess, of seeing one or two of these rockets, unfortunately, exploding shortly after takeoff. So not only are those long term projects that you commit to, but there is a fairly high element
of risk involved there as well.
(05:32):
So I think what it did was it changed my perspective quite a lot. So once I completed my PhD, I went into the consulting world, which was really the other extreme.
YS Chi (05:44):
From the long term to
short term.
Stephen Topliss (05:46):
Exactly,
exactly. And so consulting I thought was just very, very interesting. I got experience in a lot of different organisations very quickly. I got exposed to organisations within Silicon Valley. For me, that's where there was the click, because there was the technical excitement, the
(06:08):
entrepreneurship there. And so I started working on projects with tech companies in the US who were then expanding into Europe. And eventually, in 2010, I was looking kind of for the next opportunity there, and there were a few different firms that were making this expansion into Europe, and one of them was
(06:30):
ThreatMetrix. And really, it was a no-brainer once I kind of understood what was going on there. ThreatMetrix had really cool technology. It was in this fraud prevention space which I didn't know a lot about, but I realised that this was a sector that I could really get passionate about. So not only
(06:50):
was it an exciting tech company, they were offering to bring me in as the first person on the ground in Europe and build the business. And the sector was really meaningful.
YS Chi (07:01):
Yeah, great timing.
Stephen Topliss (07:02):
Yeah. So the
ThreatMetrix product basically
enables clients to prevent fraud. So we're providing fraud prevention techniques for the likes of online banking, ecommerce sites, media streaming sites, and any company that provides services to an end user base.
YS Chi (07:23):
You say it was a no-brainer, but I must not have any brain, because at that point in 2010, it would have gone right over my head, and I would not have seen the potential that you did. But I'm glad you did, and I'm glad that ThreatMetrix found you. You sound like a true renaissance man. From humanities, to science, to engineering, everything in
(07:45):
between. So like our other guests that have been here and have done so well, I think this session is going to be a great lesson. Can you please start by painting a picture of what's going on out there in the dark web, and how cybercrime has
evolved in the last few years?
Stephen Topliss (08:06):
Yeah, I think
that the best way to do that is
to go back a little bit to the start, and in particular, the start of commerce moving online, moving on to the Internet, and starting to give you that ability to buy things online. What happened very, very early on was that a secondary marketplace started opening up, and that secondary marketplace was
(08:29):
selling stolen information and in particular, stolen credit card details. It was a natural progression for criminals. If they had the ability to use stolen credit cards, literally stolen in shops, why not use that information when you try to buy things online as well. So when I kind of got into the
(08:52):
sector the first time, ecommerce was really trying to battle that problem. And it wasn't just there, there were other new services emerging online. So the online dating industry, for example, was also quite an early adopter of the digital space. They had a very different problem, they were focused on scams. The kind of scams we still see today in the social
(09:16):
networking space, where people create accounts on there, pretend to be other people, try to build a relationship with you, and ultimately try to ask you for money. So those were problems, you know, early on, and the problem was, obviously, it had an impact for customers that were actually involved in that
(09:36):
fraud. They would find that their cards had been used to make a purchase, and it would often be quite difficult to get that money back. Similarly, in the dating site example, if you've ended up paying money somewhere, chances are you were never going to get that back. So a real impact to individual people like ourselves, as well as, then, impacts to the organisations as
(09:59):
well. Whether they were ecommerce or, then later on, online banks. And if you look at the kind of the sophistication of cybercrime and how you fought it in the early days, it was actually really quite easy. Most of the time, if you could identify the IP address where fraud was coming from, you could
(10:19):
block it, you could just ignore it. But the complexity started to occur as fraudsters started to understand how to change their IP addresses, or how they could hide behind things like a proxy server, so that you couldn't identify where they were coming from. And at the time as I was joining, what was interesting with ThreatMetrix was that they had this novel
(10:41):
idea of being able to try to gather digital intelligence. So, to be able to identify if these proxies were being used to hide the fraudsters. So they thought, if the IP address isn't the key anymore, maybe we can have an identifier for a device. So if this fraudster is using this computer, and we can identify
(11:03):
it, then we can block the computer. So that was a little bit kind of what was happening in the early days. But if you look nowadays, cybercrime has evolved into a massive industry. It's like a business. There's knowledge sharing, there are sophisticated tools being used. You see these automated bots, these botnets, these networks of
(11:23):
devices that have been kind of captured and convinced to carry out attacks on sites. So the scale of cybercrime has completely changed, as has the level of stolen or breached identity data. I mean, we've heard in the press over the years of many large, well known businesses where there have been
(11:46):
data breaches, and we have to assume now that all of us, our names, our addresses, our emails, some of our passwords are all out there for those that want to buy them and test them. So the sheer scale of cybercrime has changed. And what we're seeing now is that fraudsters are actually, they've
(12:08):
always been shifting to focus on how they can be successful. But one of the recent trends we've seen is a move away from fraudsters maybe accessing our accounts directly and stealing from us, and actually going back to those scams that I mentioned earlier from dating. There are different kinds of scams now to convince you or me to actually move our money and pay it into a
(12:32):
fraudster's account. But having a story so convincing that we're doing it willingly and not realising we're being scammed.
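The early device-blocking idea described above, blocking the computer rather than the IP address, can be sketched in a few lines. This is a hypothetical illustration only: the attribute names, the use of SHA-256, and the simple blocklist are my assumptions, not the actual ThreatMetrix technology.

```python
import hashlib

def device_id(attributes: dict) -> str:
    """Naive device fingerprint: hash a sorted set of observable
    attributes (user agent, screen size, timezone, ...) into one
    stable identifier. Real systems use far richer signals."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hashed identifiers of devices previously seen committing fraud.
blocked: set[str] = set()

def assess(attributes: dict) -> str:
    """Allow or block an event based on the device blocklist,
    independently of whatever IP address the event came from."""
    return "block" if device_id(attributes) in blocked else "allow"
```

The point of the sketch is that the identifier stays stable even when the fraudster changes IP address or hides behind a proxy, so blocking the fingerprint once blocks all future events from that device.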
YS Chi (12:41):
Right.
Stephen Topliss (12:42):
So it's a scary
world out there and criminals,
you know, ultimately, it's much easier and safer for them to try to commit these types of fraud and crime online than walking
into a bank with a gun.
YS Chi (13:10):
You are keeping this so
nice, clean, diplomatic. But I
need some juicy stuff. So you must have come across some particularly interesting stories about fraudsters who have been caught, without necessarily mentioning names. What are some of the most sophisticated ways that you've seen fraudsters
(13:30):
operate online?
Stephen Topliss (13:33):
Yeah, I've
definitely got a few stories I
can share, and maybe I'll change the angle slightly and highlight a little bit how we've dealt with them, or how I've seen companies deal with some of these fraudsters as well. So if I go back to the scams first of all, again, thinking of an early example, with those dating sites and scammers. One of the
(13:56):
problems that you have to deal with is: if you can identify these fraudsters on your site, what do you actually do about it? So you could just block them. But the problem is, if you block them straight away, they know that you're onto them, and that actually is only going to help them study that a little bit more in detail and figure out how to change their attack and
(14:18):
come back and try and get past you again. So one thing that, for me, I remember very well in the early days was that some of these dating sites would use our technology to identify the scammers. But rather than blocking the scammers, they would lead them to a second site, which would look exactly the same as the original dating site, but was actually a
(14:40):
fictitious dating site where they would bring all the scammers and all the fraudsters together and they could effectively play against themselves. So I thought that was always an interesting...
YS Chi (14:50):
...lovely sting
operation.
Stephen Topliss (14:52):
Exactly, exactly. But you asked, I mean, another example about actually catching fraudsters. So I guess back in 2014, 2015, we saw that the banks that we were working with were getting more successful at being able to identify the fraud because of all this rich, digital intelligence that we could provide. But there was always the frustration that the fraudsters and the criminals, they never really got caught by the police and went to jail. And
(15:25):
one of the reasons for that was that historically, there wasn't really the evidence that could be used in a court of law, or even if it was there, maybe the police didn't really understand how to use it. So there was one specific case in that year where there was some fraud. It was clear that there
(15:47):
was some fraudulent behaviour and a number of online banking accounts were being accessed and money was being moved out. In one way or the other, I don't remember exactly how, but the police had identified that there was a potential link to a hotel room, and they did a raid on this hotel room, found and arrested a couple of men in there and seized a laptop. That
(16:09):
laptop was actually opened at the time in an online banking session. So they now had suspects, they had a laptop and they had records from several of the UK banks, highlighting fraudulent transactions, fraudulent money transfers. They were pretty sure that this was all linked, but they couldn't
(16:31):
link them together. And so one of our banking customers came to us and said, can you help us explain the data that we have from your service to the police? So I got personally involved there and worked together with the bank and one of the police officers to look at that device, and
(16:52):
through doing some analysis around the device, we were able to identify how that device was linked to the individual events, and that device ID that I mentioned earlier. And so I generated, I wrote up a witness statement for that to help in the prosecution there, and it was really one of the first times
(17:14):
that we were able to see the police actually prosecute, based on actual online crime using digital intelligence. And these...

YS Chi (14:58):
Bravo.

Stephen Topliss (15:01):
...guys got locked away. So for me, it was, you know, personally very satisfying, but I think for the industry as a whole, it was nice, because it started
(17:35):
changing the focus and made people think they could actually succeed in prosecuting some of these criminals.
YS Chi (17:38):
Right. I think that's probably what they wanted to see, proof that it can be done. Right?
Stephen Topliss (17:43):
Exactly,
exactly.
YS Chi (17:45):
Well, the pandemic, I'm
sure, has caused not only disruption and suffering to most of us, but it has become an exploitative opportunity for these fraudsters. So what is the impact of COVID on financial crime, as you can see already?
Stephen Topliss (18:04):
So it's changed
actually during the course of
the pandemic. So at the beginning, as these lockdowns started coming into force around the world, when we looked in our data, we saw that actually the fraudsters were being impacted as much as the rest of us, so fraud actually decreased initially. And then as we've seen this rapid growth in
(18:26):
digital transactions online, as everyone starts shopping from home, as everyone's logging in remotely to services from home, we've seen the volumes of good transactions go up very, very fast, and the fraud has gone up as well. Luckily, the fraud hasn't gone up as fast as the good transactions, at least not
(18:46):
in the data that we're seeing.
But what we do then see is the fraudsters are shifting to the opportunities that are out there. So you can imagine that with everyone working from home, not everyone is used to doing that, not everyone has maybe the same security in place. There's potentially opportunity for fraudsters to take advantage of
(19:08):
that. Then there are also new opportunities for new types of fraud. So around the world, we see governments, you know, government-supported loans that are being made available to businesses to get them through the pandemic, and they are absolutely the new target for the fraudsters. So we're seeing fraudsters...
YS Chi (19:27):
Easy money.
Stephen Topliss (19:28):
It's easy money
and the thing is those
programmes are being rolled out rapidly, so people can take
advantage, and that...
YS Chi (19:35):
With a bunch of holes
Stephen Topliss (19:36):
Exactly.
Exactly, and the fraudsters areall over that.
YS Chi (19:39):
And the amounts are
small, too, isn't it, Stephen? So that, you know, nobody's really going after the 200 frauds, they want to go after the 2 million fraud, and some of these can be lost.
Stephen Topliss (19:49):
Yeah exactly,
exactly. So that's definitely
concerning.
YS Chi (19:53):
What would you say that
we can do to protect ourselves
and not help criminals get away too easily?
Stephen Topliss (20:01):
You know, a lot
of it and I hate to say it, but
a lot of it comes down to common sense. I've seen plenty of examples of fraud prevention experts as well being caught out. So we should never be, you know, embarrassed or concerned about, you know, questioning something. But you know, only share what personal data you need to share when
(20:24):
necessary when you're online.
Use trusted websites if you're going to buy online. Buy from the websites that you can trust. If you're using a new website, you know, just spend a bit of time checking that it doesn't look a bit dodgy. Look out for anything unusual and suspicious. Definitely don't trust anything that sounds too good to be true,
(20:44):
because it probably isn't. And then the last thing is, especially on the payment side, keep an eye on those bank statements. I think that's also a big risk, that we don't necessarily look at those as much anymore, especially as we're sometimes moving from getting paper statements that we would open on a monthly basis and scan through, to now having everything online. That's what fraudsters also, you know,
(21:08):
are playing on, that people maybe don't notice those small transactions coming through.
YS Chi (21:13):
Where there is
prosperity, there is a magnet
for fraudsters who like to take shortcuts, like parasites.
Stephen Topliss (21:21):
Exactly.
YS Chi (21:22):
Yeah. Well, thank you
for that background. Shall we
move on to how ThreatMetrix and, more broadly, LexisNexis Risk Solutions are addressing these issues? You're working at the extreme front of digital identity intelligence, which is all about aggregating billions of anonymized transactions, I
(21:42):
guess to differentiate a trusted customer from a cyber attack. I understand you can do this in near real time. Can you tell us how it works without revealing any undue trade
secrets?
Stephen Topliss (21:56):
Yeah,
absolutely. So I mentioned
earlier the concept of a device ID, an identifier for a device. If we start simply and we look at kind of the initial approach to building fraud rules to catch online fraud, we would take a device ID and we would look for unusual things related
(22:18):
to that. So a couple of really simple examples would be: is one device actually buying things online with more than, say, two or three different credit cards? That could be an indication of a fraudster with a list of stolen credit cards that they're testing. Or with that one device, do I have, you know, really quite a number of different
(22:39):
email addresses that I'm using to log into different accounts? Again, that could be indicative of someone having a list of compromised data. So, in fraud analysis, we have that data. We have the digital intelligence and we combine it with the event data, the transactional data, whether it's a credit card number, or an email address, or phone number, or name, or
(23:03):
something like that. All of those pieces of information are anonymized or hashed in the database, and what we do over time, with those fraud rules, is we also look to compare over time. So, if I log into my account today, how does that behaviour compare to when I logged in yesterday?
(23:25):
Am I using the same device? Do I have the same kind of relationships between those identifiers? What we realised over time, actually, was that we were kind of creating a kind of digital identity. Nowadays everyone talks about digital identity without thinking about it. But for us, I guess six or seven years ago, we
(23:48):
started thinking, yeah, this is a form of identity in the digital space. And that's one of the reasons we refer to it that way in the solution today. So basically, what the solution is doing is, in real time, every time we analyse an event for one of our clients, we're taking the data from the event, we anonymize it, and then we create this digital identity. We see how it aligns
(24:13):
with the digital identity's history in that global network, with the goal to basically understand: is it looking normal? Or is there something different? So if I log into my bank today, does my digital fingerprint for this event generally match what has logged in historically into my
(24:34):
account? If it does, then trust me and let me in. If it doesn't, then raise a red flag to the bank.
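The kind of checks described above can be pictured with a minimal sketch. Everything in it is an assumption for illustration, not the actual LexisNexis Risk Solutions implementation: the thresholds, the class and field names, and the use of SHA-256 for anonymization. Identifiers are hashed before storage, a device seen with too many distinct cards or emails is flagged, and a login from a device outside the account's history raises a red flag.

```python
import hashlib
from collections import defaultdict

def anonymize(value: str) -> str:
    """Hash a raw identifier (email, card number, device ID) so that
    no personal data is stored in the clear."""
    return hashlib.sha256(value.lower().encode()).hexdigest()

class FraudRules:
    """Toy version of the rule-based checks described above.
    Thresholds and structures are illustrative assumptions only."""

    def __init__(self, max_cards_per_device=3, max_emails_per_device=5):
        self.cards_by_device = defaultdict(set)   # device -> hashed cards seen
        self.emails_by_device = defaultdict(set)  # device -> hashed emails seen
        self.devices_by_account = defaultdict(set)  # account -> device history
        self.max_cards = max_cards_per_device
        self.max_emails = max_emails_per_device

    def assess(self, account: str, device_id: str, card=None, email=None) -> list:
        device = anonymize(device_id)
        flags = []
        if card:
            self.cards_by_device[device].add(anonymize(card))
            if len(self.cards_by_device[device]) > self.max_cards:
                flags.append("many-cards-one-device")
        if email:
            self.emails_by_device[device].add(anonymize(email))
            if len(self.emails_by_device[device]) > self.max_emails:
                flags.append("many-emails-one-device")
        # Compare today's device against the account's history.
        history = self.devices_by_account[account]
        if history and device not in history:
            flags.append("new-device-for-account")  # raise a red flag
        history.add(device)
        return flags
```

Note that the rules operate entirely on hashed values: the checks work without ever knowing who the person behind the identifiers is, which is the privacy-by-design point made later in the conversation.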
YS Chi (24:40):
Right. This sounds like
a smart person overcoming a
smarter person, and yet another smart person on top of that.
For businesses, particularly online consumer-facing ones,
(25:02):
there is probably an ongoing challenge of balancing between convenience for the customer and security verification. Are there different approaches for different regions when it comes to combating fraud? And are there different approaches to security versus convenience between different generations,
(25:23):
perhaps?
Stephen Topliss (25:24):
Yeah, and
that's one of the fun things of
the international role, because there are clear differences. I think they're changing over time. So if we look at the US, for example, I would say that in the digital world, or the online world, the focus is absolutely on customer experience. Customer experience is first in America, whether
(25:48):
you're in a store or whether you're online. That experience can come at a raised level of risk from a security point of view. If you contrast that to parts of continental Europe and parts of Asia Pacific, in those cultures,
(26:10):
you're much more accustomed to stronger, what we call multi-factor authentication when you access services online. So you'll be asked for a username and password and then there'll be another step. Maybe you'll be, you know, challenged in some other way. So the experience, the user experience, there is more friction, as we call it. But it's generally more
(26:33):
secure, and therefore the risk of fraud and the rates of fraud tend to be lower. The UK is probably somewhere in between the two. So that's what you see, kind of, generally around the world. Then also, considering the type of industry and service, there are also different levels of security. The generational
(26:56):
question is a really interesting one, because what I described is probably the norm right now. I think that the shift is going to come from the younger generations. The way that children at school and leaving school now are so familiar with all this digital technology and the latest phones and devices.
(27:20):
With the younger generations, I think what we're going to see being a bit of a disrupter here is that those services that are more securely protected, with higher levels of friction, with stronger authentication techniques, are probably going to turn away those younger people.
YS Chi (27:40):
That sounds like you
have to go through more doors to
enter, finally, to the performance.
Stephen Topliss (27:45):
Yeah, yeah,
yeah. And they're going to look
for other services offering a similar service, but with an easier entry point. So, from a security point of view, we are going to have to adapt to meet the expectations of younger generations. At the same time, I think there's an educational part that's the responsibility of all of us: public outreach,
(28:08):
schools, organisations like ourselves, to make sure that the young are really aware of the risks online, and that they shouldn't be too open with sharing their data. Just to
protect themselves a little bit.
YS Chi (28:23):
I suppose they need to
know that there's a pain on the
other end for being careless.
Stephen Topliss (28:28):
Exactly,
exactly.
YS Chi (28:29):
For adults, it could be
loss of, you know, financial assets. For young people it might have to be an embarrassing situation.
Stephen Topliss (28:37):
Yep. Well
exactly, exactly, and it's, I
think it's something that we can do more of. It's really interesting. I spent two years, I dragged my family to Silicon Valley for two years to live out there, and when they were in the American schools there, they really had lessons that focused on exactly that: how to be safe online. Since we've moved back to
(29:01):
Europe, I feel that we do that a little bit less in Europe. So I think that's something we should do more of, and make children more aware of.
YS Chi (29:10):
Yeah. I think that this
trade off is one that companies
have to make also, but one we as individuals have to make, between our privacy and convenience. When we do that properly, we can take responsibility for the choices
that we make.
Stephen Topliss (29:29):
So I think as a
society, we've become
increasingly aware of how our data online is potentially being harvested and used for marketing or other purposes. That's driven that focus on, you know, our data and privacy around it, and it's introduced regulations to protect our data, and I think
(29:50):
that, you know, we're on the right tracks there. Certainly in Europe we have the GDPR regulations, and other parts of the world have been looking at all that. I think, looking at it from a cybersecurity point of view, we just need to make sure that these regulations consider where private data can be used
(30:14):
in the right way to prevent crime. So, you know, we sometimes see regulations carving out the ability to use data to prevent crime, and it's a really important distinction there. Otherwise that can also potentially impact cybersecurity
in the future.
YS Chi (30:28):
Indeed, we need to make
sure that we don't throw the
baby out with the bathwater.
Stephen Topliss (30:31):
Exactly.
Exactly.
YS Chi (30:33):
Because we have to fight
those fraudsters.
Stephen Topliss (30:35):
Exactly. And
that, you know, there's the regulation side there; I think it's important for us to get that right. At LexisNexis Risk Solutions, the way we design our products is also, we follow what we call a privacy by design approach. So, you know, with those anonymization techniques that I mentioned earlier, you end up actually being able to develop products that can identify
(30:58):
fraud, by creating these anonymized digital identities, and you actually don't need to know who the person is behind those digital identities.
YS Chi (31:06):
That's right. So looking
ahead, the keyword these days, of course, is AI and machine learning. How is that going to come into the future of digital fraud prevention at ThreatMetrix?
Stephen Topliss (31:20):
So we're
already using machine learning
today, and we continue to build that out. One of the reasons that's so important is just the sheer amount of data. So I talked about those simple fraud rules earlier, and fraud analysts used to go in and change those rules on a weekly basis. But the volumes of data, the number of events that you
(31:42):
have to look at now, it's not possible to do that manually. So you need to use machine learning techniques to optimise those rules, to learn from the changing fraud trends. So I think that's, you know, one area that machine learning is being used. I think, you know, there's a couple of other things that we're doing in the short term, looking ahead with the
(32:05):
ThreatMetrix product, and that's to actually continue to bring in the latest digital intelligence, like phone intelligence. So as everyone is turning more and more to the mobile channel, the data associated with the mobile channel is becoming increasingly important. As we bring more data in from that, and also from
(32:26):
emerging technologies like behavioural biometrics, it just increases the data again, and increases the need for this automated machine learning to optimise things.
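One way to picture rules being optimised by machine learning rather than re-tuned weekly by an analyst: a tiny online logistic model that re-weights fraud signals as labelled outcomes stream in. This is a generic sketch under my own assumptions; the feature names, learning rate, and model choice are illustrative, not the ThreatMetrix approach.

```python
import math

class OnlineFraudScorer:
    """Online logistic regression over named fraud signals. Each
    labelled outcome nudges the signal weights, so the scorer tracks
    changing fraud trends without manual rule edits."""

    def __init__(self, features, lr=0.1):
        self.weights = {f: 0.0 for f in features}
        self.bias = 0.0
        self.lr = lr

    def score(self, event: dict) -> float:
        """Probability-like fraud score in (0, 1)."""
        z = self.bias + sum(self.weights[f] * event.get(f, 0.0)
                            for f in self.weights)
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, event: dict, is_fraud: bool):
        """One gradient step: signals correlated with confirmed
        fraud gain weight, the rest decay toward irrelevance."""
        error = self.score(event) - (1.0 if is_fraud else 0.0)
        for f in self.weights:
            self.weights[f] -= self.lr * error * event.get(f, 0.0)
        self.bias -= self.lr * error
```

Feeding the scorer a stream of events labelled by fraud analysts (for example, with hypothetical signals like "new device" or "many cards on one device") gradually raises the score of events matching the current fraud pattern, which is the sense in which the rules are "optimised" automatically.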
YS Chi (32:38):
Does that give us the
future in which we're not just
doing cat-and-mouse catch-up after the fact, but perhaps a solution that is so innovative that maybe it can really prevent them before they happen?
Stephen Topliss (32:56):
I'm not sure
that we're ever going to reach
that stage. I think that biometrics specifically is generally going to have quite a big impact. And by biometrics, in this case, rather than behavioural biometrics, I'm thinking more about the biometrics from facial recognition, fingerprints, things like that. They are
(33:18):
really quite secure, although not impossible to beat. The problem with some of those technologies is adoption, though. You can have a really secure piece of technology, but if only a small part of your population actually wants to use that technology, then you still haven't solved the problem.
YS Chi (33:39):
Right. And is this adoption, Stephen, a matter of economics? Or is it a matter of convenience, of choice?
Stephen Topliss (33:49):
I think it's
three things. It can definitely
be economics, especially in different parts of the world. It's choice, some people just don't like the idea. And third, it comes back to the generational aspect as well. I think that with biometrics, the younger generations are probably very comfortable with it. Some of the older generations aren't
(34:10):
very comfortable with it, and it's maybe difficult for them to use. So that's generally a problem for some of these new techniques, not being able to be adopted. And if they're not adopted across the board, then obviously the fraudsters, they adapt and they figure out exactly who to target that isn't using that, for example. Plus, we talked about scams earlier,
(34:32):
and if I use biometrics, that doesn't change anything when it comes to scams. I'll still carry on sending my money to the wrong place and I'll smile happily at the camera.
YS Chi (34:43):
Yes, often it is us who
open the door for them.
Stephen Topliss (34:46):
Exactly.
YS Chi (34:46):
Voluntarily.
Stephen Topliss (34:47):
Exactly.
YS Chi (34:48):
You know, in a slightly
different scale, but in a
parallel manner. I have spent the past few decades in the publishing sector where we've been fighting for copyright issues and IP protection. I've always said three E's as a principle. First is, educate
(35:09):
people. That stealing is stealing, whether it's digital or physical goods, it's stealing. Second is economics. It has to be such that there isn't as much economic incentive to steal as there is to earn it the right way. And then the third is enforcement. When all else fails, there's got to be exemplary enforcement to show
(35:31):
people that there's a price to pay for crime. I wonder if these thoughts that you had kind of shared with us today in different parts of the conversation are similarly applicable to online cybersecurity fraud.
Stephen Topliss (35:50):
Yep, no, I
agree. It's an interesting
parallel. Yeah.
YS Chi (35:53):
Well, I've learned an
awful lot from you today about
cybersecurity, and the unique contributions of the LexisNexis Risk Solutions team to combating this. Before I let you go, though, I'm very curious about what else you're doing these days to keep yourself, you know, kind of balanced, and not
(36:14):
constantly working, or doing homework with your children.
Stephen Topliss (36:20):
So I've always
liked to keep fit, and I love
swimming. I haven't been able to swim, really, the last year.
YS Chi (36:28):
Me too.
Stephen Topliss (36:29):
So I've tried
to figure out a different
approach to keeping fit. So if I go back to my last years at school and university, I used to skip as part of my fitness routine. I saw an article in the paper recently about skipping and I thought, you know what, I can do that in the home, I can start doing that again. I went online, I bought a skipping
(36:50):
rope. But what I discovered is there is a whole new world out there of skipping ropes and jump rope workouts that I don't think existed five years ago. It's a booming business. You could spend 10 euros, you could spend 150 euros on a skipping rope. So I now have a kind of mid-range skipping rope. I'm skipping
(37:12):
every day, there's lots of...
YS Chi (37:13):
I'm hoping to graduate
to the high end skipping rope
one day. Like the bikers do, or like us golfers, who think it's the equipment that's going to get us a better scorecard.
Stephen Topliss (37:23):
Exactly. I
think my children think I'm
crazy. But yeah, that's my brand new world of keeping fit at home right now.
YS Chi (37:31):
Fantastic. Stephen, it's
been a great pleasure. Thank you
so much for being with us today, and I hope that the audience has
learned as much as I have.
Stephen Topliss (37:40):
Thanks YS. It's
a pleasure talking to you.
YS Chi (37:42):
Thank you to our
listeners for tuning in. Don't
forget to hit subscribe on your podcast app to get new episodes as soon as they're released. In our next episode, we'll be talking with Anna Dycheva, who runs Reed Exhibitions in Russia, the Middle East, Turkey and the UK, about the role of the events
industry. Thanks for listening.