
March 7, 2024 33 mins

A key actor in risk assessment is the data provider. These commercial operations aggregate and analyze the data produced by governments, enterprises, individuals, and even other data providers, all to feed today’s insatiable appetite for understanding who it is we are dealing with online.

In this Making Data Better episode, Steve and George are joined by Cindy Printer, Director, Financial Crime Compliance and Payments, at LexisNexis Risk Solutions. The company is a major data provider to government and enterprise; Cindy focuses her work on financial services firms and their need for regulatory compliance.

We discuss the granular nature of the data LexisNexis Risk Solutions offers its customers and the breadth of sources used to meet their needs. It’s astonishing.

Cindy makes the point, one we heartily agree with at Lockstep, that risk is specific, a concern for each individual entity, and that the data required by each entity varies based on its specific concerns. That’s why LexisNexis Risk Solutions tunes the data services it provides to the industry segment and the individual firm.

Sitting on top of such vast data resources, and knowing the complications associated with deriving meaning from it all, LexisNexis Risk Solutions also provides analytical services that save an enterprise from having to analyze the data itself.

This is a great conversation if you want to understand the data provider role, the scale of its operations, and its priorities. So take a listen.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:17):
Welcome to Making Data Better, a podcast about data quality and the impact it has on how we protect, manage and use the digital data critical to our lives. I'm George Peabody, partner at Lockstep Consulting, and thanks for joining us. With me is Lockstep founder, Steve Wilson. How are you, Steve?

Speaker 2 (00:33):
Fantastic, George, good to be here again.

Speaker 1 (00:37):
I need a quick weather report, Steve. What's the temperature down under?

Speaker 2 (00:41):
It's 7 o'clock in the morning and it's, let's say, 25 Celsius. So what's that, about 80 degrees? Oh wow. And Cindy's in Chicago, I'm in Boston.

Speaker 1 (00:50):
Yes, yes, Cindy, I assume you're looking out at snow as well.

Speaker 3 (00:55):
It's just that it's snow on the ground. So yeah, we won't see 80 for several months here, maybe July.

Speaker 1 (01:02):
Yeah, exactly, exactly. All right, well, let's get started. So yes, Cindy Printer is with us today. She is the Director of Financial Crime Compliance and Payments at LexisNexis Risk Solutions. Cindy, thanks for joining us. I confess, for joining us once again. Yeah, I'm going to re-record this as I had a technical issue.

(01:26):
Want to make this right.

Speaker 3 (01:28):
Happy to be here.
Yeah, thanks for having me back.

Speaker 2 (01:31):
Thank you for coming back.

Speaker 1 (01:32):
Yeah, thank you. For those of you who have listened to us before, Steve and I have a pretty hard focus on data quality and, as part of that, and I know Cindy shares the same perspective, that the risk owner, the party that's taking on the liability, taking on transaction risk, who in the parlance of the

(01:54):
identity industry is called the relying party, they're consumers of data that Cindy's organization and many others provide. And, of course, many of these organizations, many of our client organizations, also gather their own data sets. So we're really interested in talking about the breadth of data that Cindy has at her disposal to offer her clients at

(02:18):
LexisNexis Risk Solutions. So before we dive into that, Cindy, why don't you, would you tell us a little bit about what brought you to risk and fraud as an industry and as a career?

Speaker 3 (02:32):
Yeah. Yeah, it's a great question. I suppose I always liked solving problems, solving mysteries, always sort of enjoyed watching movies or reading books that were mystery focused, and not with any intent, but as my career evolved I really found myself putting that toward my career and using that now for the good of other people

(02:57):
, which really is what LexisNexis does: we help organizations focus on making better risk decisions, better decisions around opportunities, by using data and analytics. It's just by chance that I found myself using my personal interest in my career.

Speaker 1 (03:13):
It's nice when those two things line up, isn't it? It's kind of fun. So orient us a little more thoroughly, then, into what LexisNexis does, and especially the risk solutions group that you're part of.

Speaker 3 (03:26):
LexisNexis Risk Solutions in particular does help organizations make better decisions in two ways: both identifying risks for their organizations, but also identifying opportunities. And we do that, to use some of your words, George, from earlier, through our depth and breadth of data that we provide, as well as our advanced analytics. And in the risk solutions group, I'm

(03:51):
part of the financial crime compliance group in particular.

Speaker 1 (03:56):
Help me understand: who are your customers, particularly in your organization?

Speaker 3 (04:00):
Sure, we have a wide variety of customers across financial institutions, so that would be banks, that's credit unions, it's just some of the financial services organizations that are out there, as well as corporates. So we service both financial institutions and corporates. We have a wide coverage geographically.

(04:22):
We're really in 180 countries. So we, I think we're a little... we're everywhere, George.

Speaker 1 (04:30):
Before I turn it over to Steve, and I bet you this question has lots of answers, I have one vision of you as a provider of raw data that is consumed by your enterprise, your large customers, who then take it in and analyze it themselves, versus what I would think is a significant amount of

(04:50):
value add on your part, where you're doing the analytics and saying to your customers: this transaction, this particular interaction, this identification suggests you should be careful, or go ahead and approve it. Is that the right question to ask you? Is there a balance between those two in terms of what

(05:11):
you're seeing your customers asking you for?

Speaker 3 (05:14):
Yeah, absolutely. We've made our legacy by way of data, and so we've been providing this very rich data for nearly 40 years at this point, and through evolution of our own business, today and for many years now. But the difference between just being a data provider then and now is

(05:36):
we offer solutions as well. Those solutions are pieces of technology, for example screening solutions, so watchlist screening solutions that help identify risks for an organization, and then also it's how we deliver the output from that, and that kind of comes to that analytics piece: what does that mean?

(05:57):
There's data, there's a solution to ingest that data, and then there's output. So for us, that output is going to look a little bit different for every organization. So we have organizations asking for only data, but we have organizations that also look to us, recognize us as a leading provider of those analytics and also of those technologies.

(06:19):
So we really have our customers using any variety of the data, the analytics and the technologies.

Speaker 1 (06:26):
So that you can tune it based on what they're asking
for.

Speaker 3 (06:29):
It's all about the tuning, George. It is, yeah.

Speaker 2 (06:32):
So, Cindy, that brings us to sort of design thinking. A lot of what we're doing in our work is trying to work out how to make data better. What are the questions that you need to answer about data? You've already used language around solutions and the, let's call them, information products that you have. Tell us more about the dialogue. What's keeping your users up at night, and what are they asking

(06:53):
for you or from you in terms of your feature set in your information products?

Speaker 3 (06:58):
Sure. So at the heart of it, it's very... it's integral. It's key that the data our customers use and look to us for is credible and reliable. So they're looking to us to ensure that what we're delivering to them is reliable data, it's coming from a

(07:19):
credible source. So, regardless of the problems our customers are having, it all starts with that credible and reliable data. But, that said, in the risk solutions space, and the financial crime compliance area more specifically, keeping our customers up at night is regulatory compliance. The regulatory landscape has always been dynamic.

(07:41):
It's always been a top concern. But in recent years, given the changes due to geopolitical events, technological advancements, various innovations, the pace of change, the rate of change of regulations is at its utmost highest, and so

(08:03):
for our customers, it has always been a concern. But it's almost risen back to the top, as one of the primary concerns is assuring that compliance, and that all starts with the data, the credibility and the reliability of the data.

Speaker 1 (08:19):
How do you prove that? Casting no aspersions, but just because it comes from your company... you source it from somebody else, clearly. How do you guarantee that provenance? How do you guarantee the credibility, the currency of the data?

Speaker 3 (08:33):
We've been sourcing data for 40 years, so our processes are highly structured, highly standardized. So, in short, we know what good looks like because we've proven it. We've proven it to ourselves, we've proven it to the industry. So we know what good looks like. So, as we analyze new data, aspects of data, we have a

(08:58):
process for testing that, and we have a process that we don't vary from. It's a proven process, and one of those steps is always comparing it against what good looks like. We have the benefit of knowing what good looks like, and so we're able to just constantly test, constantly compare.

(09:19):
We have access to a lot of sources of data, so we don't have a problem passing on a source of data we deem to be unreliable or not credible, or, even if it's reliable and credible, maybe just not good enough, not up to our very, very high standards.

Speaker 2 (09:38):
We are obviously preoccupied with fakes. Not to be too negative, we think that the data industry is full of good things, but given the prevalence of fakes, and given people's concern about deepfakes and generative AI, but at the consumer level fraud has always been around, how do you know who you're dealing with? Further up the food chain, with a lot of your sources and a lot

(10:00):
of your clients being enterprise people, do you see fakes at that level? Do you have a lot of issues with, let's say, fidelity or genuineness of enterprise customers and data sources and so on?

Speaker 3 (10:14):
Certainly our customers do. They're the ones that see the fake data. They are experiencing those very overt efforts to disguise identity, to present an identity, a person or an entity, as something other than who they are.

(10:35):
And when someone wants to fake who they are, there are ways to do that, and so we have a constant goal of helping our customers sort through that fake data. And that's a lot about the relevancy of the data, what's important to organization A versus financial institution B,

(10:57):
so on and so forth, and so that comes with those analytic value propositions that we have, some of the technologies that we offer, and that is a little bit different for each organization, but we absolutely help our customers see through that fake data, because it's very real.

Speaker 2 (11:17):
And so the dimensions of quality that your customers are looking for, that changes from customer to customer.

Speaker 3 (11:23):
It absolutely does. Yeah, each of our customers is serving a different industry, you know, or different industries. Some look like the other, but whether you're a financial institution coming in and operating in a highly regulated environment, or a corporate, maybe not as tightly regulated,

(11:44):
however serving a riskier customer base, what they're looking for, what risk identifiers and what's important to those different organizations, based on the profile of who they're serving and the regulatory environment that they're operating in, means they're focused on different aspects of the data.

(12:05):
So it's our goal and our objective not just to provide a product to a customer, but to provide a solution: get to the bottom of what their problem is and then work towards solving that problem.

Speaker 2 (12:21):
It's so fascinating, because we're in one of the tightest regulated... let's say financial services. We're in one of the most tightly regulated sectors of the world and you'd think the things are uniform, and yet we deal a lot with KYC, know your customer. Everybody does KYC differently, yes, and they've all got their own concerns, their own risk management habits, their own settings. It's fascinating how much diversity there is in a tightly

(12:43):
regulated space.

Speaker 3 (12:44):
It absolutely is, and every customer base looks different from your peers' customer base, and there is opportunity to leverage... leverage practices, leverage content. There is a lot of talk now about data consortia, sharing data across organizations, that might be anonymized.

(13:05):
However, there are still indicators that can be shared. But the reality is, while there are practices that can be shared, at the end of the day it's really important that each organization is focused on what matters to them. It's not only about doing what your peers are doing and benefiting from that; you must be focused on your own organization

(13:27):
and make policies and processes tailored to your organization.

Speaker 1 (13:31):
I love that you're pointing at that, because it really underscores the historical failure of federated approaches that have hoped to, for example, apply standardized data sets and processes for know-your-customer onboarding, where the risk profile of one bank is very different from the bank

(13:53):
down the street.

Speaker 3 (13:55):
Never mind the credit union. And those standardized processes are minimums. They should be looked at as minimums. This is what you must be doing. There is more to do now, based on your risk appetite, your customer base, so on and so forth, what your particular problems are that you're looking to solve. So it is important to have standardization, but those are

(14:18):
minimums and should be viewed that way.

Speaker 1 (14:20):
I can imagine that you have a lot of data that actually endures for years and years and years: someone's address, or maybe their phone number, or their social security number and birthday. Those are generally close to lifetime things.

(14:41):
Just to drill in a little bit, is there data that changes frequently that you guys have a hard focus on, or that you think you're particularly good at monitoring?

Speaker 3 (14:54):
Data changes all the time. It all changes, quite frankly, even if someone doesn't move. Someone's social security number is what it is, someone's birthday is what it is, but people move. They take new jobs, they get married, they get divorced, they have children. Gosh, if we really took a step back and thought about all we each did individually in the last month, let alone the last

(15:16):
12 months or the last couple of years, if you recorded all those data points and changes in your life... Data is changing constantly, and so our organizations... You've come into, George, you've come into that KYC, CDD, customer due diligence, and of course I've mentioned regulations, sanctions, various watch lists.

(15:39):
So that's a lot of what our customers are focused on: what are the changes in regulations themselves? But then how? What does that mean for the data that I now need to be aware of and track, and ensure that I'm constantly maintaining it to be accurate and up to date?

Speaker 1 (15:59):
You've got a real breadth of data available to you. I mean, just a little look at the acquisition history that LNRS has done fairly recently: you bought Emailage and ID Analytics and Flyreel and others. Each of those companies has been focused on a particular problem set. Do you bring that data in and make it available to other

(16:25):
subsidiaries within your organization? The more data, the better, is sort of the default supposition on my part.

Speaker 3 (16:31):
Well, now you've just complicated the question, George. Now you've just complicated it... but you added a whole new dimension to the question. In the first instance, when we're analyzing a business for acquisition, for example, it absolutely needs to lend to the core of what we do. That's coming back to improving the decision-making ability around risks and opportunities. If there is opportunity to share

(16:55):
that data within the organization in a highly sensitized way... We have parts of our organization that, for example, work with governmental agencies. That's a very secure bubble. Nothing goes into it, nothing comes out of it, that isn't very highly monitored, and so that would be

(17:16):
sort of like a random situation when we say no. But, for example, I'm in the financial crime compliance group and then we have our fraud and identity group. Absolutely, that's a group that's tracking a lot of digital activity or digital identity footprint, for example, mobile device, laptop, any kind of apps on a phone.

(17:40):
So there is opportunity for financial crime compliance to marry our physical identity, our data, physical identity data, with that very digital identity data and provide a significant value to our customers. And so that's an example of where we do.

(18:02):
We do share information. It's anonymized in most instances, kept in maybe even a data clean room, and so access to it might be tightly monitored. So it's done in a very secure way. But where we find opportunity, and we're always looking at that, of how can we provide a better outcome for our customers, then

(18:26):
we will take that opportunity to share information or data within our organization.

Speaker 1 (18:32):
Since this culture of ours is becoming largely digital, I imagine you anchor a fair amount of your analytics, going forward, on when the kid gets their first mobile phone.

Speaker 3 (18:43):
Absolutely, absolutely. I mean, think about at what age now do children... I have nieces and nephews, and I think they're like 8 and 10. They don't have their phones yet with cell service, but they have iPads, and so they're using those devices and they have a digital footprint.

(19:03):
Someone out there knows they're online. Yeah, they're online, and so that's really an opportunity for benefit to some of our customers. And that's maybe a child at that age, but even someone that goes to college, let's say, bump up the age a little bit, that doesn't have a credit card yet, but they do have this digital footprint.

(19:24):
Well, how do you provide credit to someone who doesn't have credit? Like, it's got to start somewhere. You may not have a parent to co-sign for them. So this is where some of the data we're able to use and turn that around and actually provide a situation where our customers can really provide a great opportunity for someone young

(19:44):
that only has a mobile digital presence.

Speaker 1 (19:47):
I've seen similar kinds of decisioning in developing markets, where the metadata around how a mobile phone is used, for example, actually provides good information with respect to: is this potential customer going to pay their bill on...

Speaker 3 (20:04):
Time. Risky or not risky, right.

Speaker 1 (20:07):
Risky. Which you even got down to: does this person charge their mobile phone on a regular basis, or does it run out of power and they have to re-establish it?

Speaker 3 (20:19):
Never charge it to 100%, George; that runs down the battery earlier than it needs to.

Speaker 1 (20:24):
I'm guilty of that. That just means I just don't move around very much.

Speaker 2 (20:30):
So that reminds me of that sort of modern parenting experience that certainly most of us have had with our teenagers, like financial literacy, and I remember encouraging, against my best instincts, my teenagers to start thinking about credit cards and getting into the system and having a credit rating before it's too late. I hope these days, I mean, some of that stuff gets creepy,

(20:51):
George, but these days there's certainly got to be better data available to make credit scoring for teenagers as they grow up. But that segues to privacy. Cindy, our privacy business at Lockstep has got to do with design thinking, and we ask people to think about what you really need to know about people to deal with them. In identity, privacy is like: what is the least amount of

(21:14):
information you need to know about somebody to still deal with them? And these days in your industry there's so much natural drive and selection pressure to know more and more, so there are some interesting tensions there. Can you talk to us all about how your business looks at

(21:34):
those tensions? To minimize data in the interest of privacy and yet to get the sort of precision that your customers need for risk management?

Speaker 3 (21:43):
It's not about more data; more data isn't what's going to enable me to make a better decision, a more accurate decision. It's about looking at the best data, the relevant data, and I used that word earlier. I don't want more data, I just want to cut through the noise. I want to see the data that's important to me, whether that's

(22:04):
alias data, someone trying to mask an identity, or whether it's, we used that word, fake data, or whether it's the real data. I do want to see all of that. I want to have the capabilities, the analytics, to poke through what the fake data is, get to the good data. But I want to know that fake data is out there as well, because now I can identify that person as risky.

(22:26):
But too much data and I'm not going to know what to do with it. For us, we have an approach, we have a scoring methodology. It's called the Exposure Index, and it's something that we've created and it's something that we enhance. Even open source data, like watch list data, we enhance it

(22:46):
with this Exposure Index that tells an organization: how relevant is this Steve Wilson record to the Steve Wilson that's in my database? Are they the same? Kind of common first name, semi-common last name.

Speaker 2 (23:01):
There's a lot of us.

Speaker 3 (23:02):
Yeah. And even George Peabody, that might be a common name. Mine is not so common. Yeah, it's poking through that: what's relevant to me? That's one of the value adds that we provide, but also really important for our customers out there, something very, very valuable to them.
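
The Exposure Index itself is proprietary and isn't specified in this conversation, but the underlying idea, scoring how relevant a watchlist record is to a record in your own database, can be sketched. The snippet below is a hypothetical illustration only: the Record fields, the weights and the name_similarity and relevance_score functions are all invented for the example, not the LexisNexis methodology.

```python
# Hypothetical sketch of record-relevance scoring; NOT the LexisNexis Exposure Index.
# It illustrates the idea of asking "how relevant is this watchlist record to the
# Steve Wilson in my database?" using only the Python standard library.
from dataclasses import dataclass
from difflib import SequenceMatcher
from typing import Optional


@dataclass
class Record:
    full_name: str
    birth_year: Optional[int] = None
    country: Optional[str] = None


def name_similarity(a: str, b: str) -> float:
    """Rough 0..1 similarity between two names, ignoring case and extra spaces."""
    def norm(s: str) -> str:
        return " ".join(s.lower().split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()


def relevance_score(watchlist: Record, customer: Record) -> float:
    """Weighted 0..1 relevance score; the weights are illustrative, not calibrated."""
    score = 0.6 * name_similarity(watchlist.full_name, customer.full_name)
    if watchlist.birth_year is not None and customer.birth_year is not None:
        score += 0.25 if watchlist.birth_year == customer.birth_year else 0.0
    if watchlist.country and customer.country:
        score += 0.15 if watchlist.country.lower() == customer.country.lower() else 0.0
    return round(score, 3)


# A common name alone yields a middling score; corroborating attributes raise it.
print(relevance_score(Record("Steve Wilson"), Record("Steven Wilson")))    # ~0.58
print(relevance_score(Record("Steve Wilson", 1964, "AU"),
                      Record("Steve Wilson", 1964, "AU")))                 # 1.0
```

The point of the sketch is the one Cindy makes: a match on a common name by itself says little, so a relevance score should only climb when additional attributes corroborate that the two records describe the same entity.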

Speaker 2 (23:18):
We know about the enormous burden, compliance burden, reporting burden, that banks have at the moment. I've recently heard that compliance costs have tripled in the last 10 years for business in general, and it's worse than that for financial services and banking. This is all about data, isn't it? Suspicious matter reporting, AML obligations: businesses need

(23:39):
to be really sharp in the way that they test for suspicious matters and detect suspicious matters. What's the state of the art from your perspective, Cindy, about the signals and the products that help people detect suspicious matters and so on?

Speaker 3 (23:56):
It's a great point because, regardless of what data you're looking at, you're really going to be able to identify suspicious data if it's entity related, it's something specific. It's like that KYC data, that CDD or customer due diligence data, whether that be on an organization, a corporate, or a person. It's really utilizing that entity data and looking at

(24:20):
behaviors of that data. Does this look normal, and what's normal? I'm a college student. Am I going to Taco Bell and 7-Eleven using my money and transferring funds digitally? Or am I transferring money in and out of Cuba or something?

Speaker 2 (24:39):
So some of this is just like yeah, it's on the
front-takes, isn't it?
It's kind of taxed.
Normal varies from one customer to another.

Speaker 3 (24:44):
That's exactly right. So it's the data, it's how you analyze that data, and again, this comes back to a little bit of common sense: what makes sense for my customer base? And that's where we see that variation from business to business.
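
As a toy illustration of that "does this look normal for this customer?" idea, and nothing more than that, a minimal profile-based check might look like the sketch below. The CustomerProfile fields, the thresholds and the looks_suspicious helper are all hypothetical; real transaction-monitoring rules and models are far richer than this.

```python
# Toy illustration of "does this look normal for this customer?".
# Not any LexisNexis solution or a real AML rule set; thresholds are invented.
from dataclasses import dataclass, field


@dataclass
class CustomerProfile:
    typical_max_amount: float                   # largest amount seen in normal activity
    usual_countries: set = field(default_factory=set)


def looks_suspicious(profile: CustomerProfile, amount: float, country: str) -> list:
    """Return human-readable reasons a transaction deviates from the customer's profile."""
    reasons = []
    if amount > 5 * profile.typical_max_amount:  # illustrative threshold
        reasons.append(f"amount {amount} far above typical {profile.typical_max_amount}")
    if country not in profile.usual_countries:
        reasons.append(f"counterparty country {country} not seen before")
    return reasons


student = CustomerProfile(typical_max_amount=40.0, usual_countries={"US"})
print(looks_suspicious(student, 25.0, "US"))     # [] -- Taco Bell money, looks normal
print(looks_suspicious(student, 2500.0, "CU"))   # two reasons -- worth a closer look
```

The same transaction can be perfectly normal for one customer base and a red flag for another, which is why, as Cindy says, the analysis ends up tuned per organization.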

Speaker 1 (25:00):
I've worked on push payments quite a bit in my past life, and so I'm super interested in the problem of authorized push payment fraud, where, and of course we're seeing lots of that today, scammers trick people into sending money to them. And, Cindy, it really intrigued me to see that you are offering

(25:23):
a service called confirmation of payee. When I saw that title, I went: oh, this is really interesting, because a lot of these payees are bogus, or they're certainly scammers who actually control a legitimate bank account. I'm very curious what it is that the confirmation of payee

(25:45):
does.

Speaker 3 (25:46):
What you're referring to is a solution that focuses on detecting authorized push payment fraud. So it's that fraud, of course, being the key part, and you even mentioned it: it's authorized. Someone is originating that payment and saying, I want to send this payment to George Peabody, but, by the way, someone

(26:10):
is posing, they've taken over your name, but it's really their account number on the other end. So they've represented themselves. That's one situation. Or they've just represented themselves as a charitable organization, playing on the good heart of someone in society that agrees to send that money, and it turns out it's a scam.

(26:31):
It's not going to a charitable organization. No tax write-off for you. So we offer a solution that matches. It basically validates the identity of the intended receiver. What it does is it matches account information. This solution focuses on bank account to bank account, so

(26:52):
account to account. So it does involve digital payments, but not credit card payments. It's all around verifying that whoever is receiving this is who they say they are, the owner of that bank account on record, so there's a check that happens in the background with our solution. It's called Validate and Safe Payment Verification. It's an API that is seamless to the sender, and it's typically a

(27:15):
corporate who would license it, and they've installed this API in their payment process and they're just running it in the background. It happens in less than two seconds, but what it does is it stops a bad payment before it's ever sent. The situation of authorized push payment: someone's authorized it and they're liable for that money.

(27:36):
Like that poor victim never gets their money back.
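
Beyond "name-to-account match, called from the payment flow, answer in under two seconds, block the payment on a mismatch", the episode doesn't spell out the API contract, so here is a hypothetical sketch of how a corporate might wire a confirmation-of-payee check into their payment process. The directory, field names and function signatures are invented for illustration; they are not the actual Validate / Safe Payment Verification interface.

```python
# Hypothetical confirmation-of-payee check in a payment flow; all names and the
# in-memory "directory" are invented for illustration, not a real provider API.
from dataclasses import dataclass


@dataclass
class PayeeCheckResult:
    match: bool          # does the account's owner of record match the stated payee name?
    reason: str = ""


# Stand-in for the provider lookup; a real integration would be an HTTPS call
# that returns in a couple of seconds, as described in the episode.
_OWNER_OF_RECORD = {("12345678", "021000021"): "George Peabody"}


def verify_payee(payee_name: str, account_number: str, routing_code: str) -> PayeeCheckResult:
    owner = _OWNER_OF_RECORD.get((account_number, routing_code))
    if owner is None:
        return PayeeCheckResult(False, "account not found")
    if owner.lower() != payee_name.lower():
        return PayeeCheckResult(False, "account is registered to a different name")
    return PayeeCheckResult(True)


def send_payment(payee_name: str, account_number: str, routing_code: str, amount: float) -> bool:
    """Run the payee check before releasing funds; block the payment on a mismatch."""
    result = verify_payee(payee_name, account_number, routing_code)
    if not result.match:
        print(f"Payment blocked before sending: {result.reason}")
        return False
    print(f"Payee confirmed; releasing {amount} to {payee_name}")
    return True


send_payment("George Peabody", "12345678", "021000021", 250.0)   # confirmed, released
send_payment("Charity Appeal", "12345678", "021000021", 250.0)   # blocked: name mismatch
```

The essential design point from the conversation is that the check sits in front of the irrevocable step: an authorized push payment can't be clawed back, so the only cheap place to stop it is before it leaves.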

Speaker 1 (27:40):
So your service might, for example... if a fraudster or scammer has gotten a hold of, well, they've got a legitimate bank account. You're checking as to whether that bank account matches up to the name of the scammer?

Speaker 3 (27:52):
Yes, yep, it matches, so the owner on record matching with that bank account, and there are a couple of other checks that are happening, because keep in mind that the sender is also still validating identity, and our solution does that also: it validates the bank and the address and some of those other data points that are just really important to be accurate to

(28:14):
make a payment. But also we'll make sure it's going to the intended recipient.

Speaker 1 (28:18):
I'm curious, are any of your customers who are actually using the service, are they giving a thumbs up to the sender to say it's passed our checks, go ahead?

Speaker 3 (28:29):
No, it's really the opposite. When it doesn't go through, they're going back to that sender to say, by the way, we didn't send this payment, it was being sent to an unintended party, and it's saving that sender money. But also we're going to the entity who's also sending the money. You know, not just the originator, but the originating

(28:50):
entity. There's a lot of money involved in these failed payments and these fraudulent payments. So it's doing a couple of things. It's not just about saving money, it's about stopping fraud, it's about detecting and preventing fraud.

Speaker 1 (29:06):
That's fraud today. Let's wrap up with a little discussion about what you're seeing as the future of fraud, and particularly with respect to... I'm always curious how regulators are possibly keeping up with technology, with the evolution of fraud, now that we have, you know... Obviously the poster child of this concern is generative AI. That's got to be high on your work schedule.

Speaker 3 (29:29):
It is. In fact, we recently issued a company-sponsored statement, a couple of statements, but a company-sponsored statement on our use of AI, and we take it very seriously. You know, there's some form of AI that has been used in some of

(29:50):
our products for years, but this generative AI, in terms of a regulated space, is something that must be used in a very, very measured way. We realize there are benefits to it and we realize it's going to get used in some of our solutions over time. But, you know, I'm not here stating it will or won't, but it

(30:13):
is certainly an aspect that LNRS recognizes and that we look into and we're answering to. We want to provide the best solutions for our customers. That may or may not involve AI over time. There are other ways for us to advance our products. Things like I mentioned earlier, this consortia, just consortia

(30:35):
of data, that sharing of data across businesses and industries, across customers, across your peers. So there's a very real opportunity there. That's just another way, not an AI-focused way, but another way where advancement is happening. Regulators recognize the benefit of that, and so you know it's very much supported.

(30:55):
Some of these are very much supported by regulators, and so we try to work hand in hand with them as the industry advances.

Speaker 1 (31:03):
So let me be clear on that: LNRS is actually actively supporting data sharing across organizations and making data networks available, so that the experience of all of these parties can be pooled, and I assume your analytics capabilities can be applied to enrich the overall result.

Speaker 3 (31:25):
Yes, so that is happening in some aspects of our business, especially in some of our digital businesses, our digital data that we provide. Anonymizing data... and it's always anonymized, it's always protected. Data privacy, data security is a number one concern for us, but there are

(31:45):
some areas of our business where it is supported by our users. It's a contributory model, very aware of the fact that their peers are contributing data as well. It's anonymized and it's reshared out for the good of stopping fraud, of preventing financial crimes, terrorist financing, money laundering, and always for that goal of doing

(32:11):
good.

Speaker 2 (32:12):
That's great to see. We've always been struggling with what's essentially a tragedy of the commons, haven't we? With siloed businesses, competitive businesses, understandably, the natural thing is to hold that data close, because data is like the crown jewels. But more and more willingness and more and more understanding of that need to share experience and know-how and intelligence,

(32:33):
that's so fresh. It's good to hear, Cindy. Thank you.

Speaker 3 (32:36):
Yeah, things done in a responsible way can do a lot of good. It's always focusing on being responsible about that use and sharing of data.

Speaker 1 (32:46):
Well, we'll leave it there, Cindy. Thank you so much for joining us on Making Data Better and for coming back.

Speaker 3 (32:53):
Really lovely to be here, George, and I hope I get another opportunity.

Speaker 2 (32:57):
Oh, you will.

Speaker 1 (32:58):
Yeah, let's do that. We look forward to it.

Speaker 3 (33:00):
That'd be great.

Speaker 1 (33:01):
Thanks again.

Speaker 3 (33:02):
Thank you.