Episode Transcript
Speaker 1 (00:22):
Welcome to Making Data Better, a podcast about data quality and the impact it has on how we protect, manage and use the digital data critical to our lives. I'm George Peabody, partner at Lockstep, and joining me is, of course, Steve Wilson, founder of Lockstep. Great to see you, Steve.
Speaker 2 (00:38):
Good to see you too, George, over the channel, over the signal processing, making podcasts better.
Speaker 1 (00:45):
There we go, we're making audio data better.
All right. So, look, this is going to be one of those podcast discussions between Steve and George, so we won't inflict it on you for very long. But we wanted to talk about, I guess, the distinction between technology and the context that technology has to operate in.
(01:09):
That context has everything to do with the business incentives, the perceived benefits of adopting a technology, and the competing priorities involved in adopting it. And, of course, we're talking about security here, or security-related topics: the tension between implementing those kinds of things versus adding yet another function to the software that you're selling, for example.
(01:32):
Right. And Steve and I are all about the sharing of metadata, the story of the data around an identifier like a phone number or a driver's license. We want to know: how long has that identifier been in existence?
(01:53):
Where did it come from?
There are great examples out there in the world of metadata being shared using existing tools, and it works. We like PKI, digital certificates and cryptography at the edge, so we've got things like chip cards, a great example of a combination of hardware and software getting rid of plain-text presentation.
(02:14):
And now, well, it's been a year or so since passkeys were introduced, and here's another hardware- and software-based combination, this one to kill passwords. Very exciting.
(02:38):
But phone numbers can still be spoofed by a robocaller to make me think I'm getting a call from my next-door neighbor. Indeed, the telecom industry has got a standard aimed at this, called STIR/SHAKEN. Of course, STIR and SHAKEN, for those of you who may remember
(03:02):
James Bond, Ian Fleming's spy character...
Speaker 2 (03:02):
He took his martini shaken, not stirred. It's terrible, it dissolves the ice. It's one thing James Bond got wrong, I think.
Speaker 1 (03:08):
Here is an industry that's trying to go after robocalling. In the United States it's getting some significant pressure from the Federal Communications Commission, and yet the business case for deploying it clearly isn't high enough: what we have,
(03:33):
after a few years, is a flat adoption rate for that technology, in the 35% range I think, which means the impact on robocalls is essentially negligible. That really gets back to the preamble I gave: you've got to have a business reason and you've got to have ubiquity if you're going to be able to put up a barrier to bad actors.
Speaker 2 (03:55):
There are so many demands on product developers and software engineers these days. The product development lifecycle is so tight, and the competitive pressures for overt features are so high, that basic hygiene, I think, always suffers, which is sad. You know, we've had the capability with the mobile protocols to do much better confirmation of origin.
(04:17):
That metadata is available, if only developers had time to know about it, implement it and talk to their colleagues. I mean, a lot of this has got to do with co-opetition, doesn't it? You do need to be coordinating across competing telco providers, and even handset providers, to really leverage some of this technology. So I think in the busy day-to-day life of a developer there are
(04:39):
so many demands on their work that it becomes difficult to get what we call a non-functional requirement onto the table.
I reflect sometimes: we've been talking about verifiable credentials, George, for months now. Well, years in the industry, but months on our pod. I always say that one of the original verifiable credentials is actually the SIM card itself.
(05:00):
And if you think about it, there's a chip in everybody's handset that protects their mobile phone number. It actually protects their so-called IMSI, the International Mobile Subscriber Identity, which maps onto your cell phone number. But the irony of all of this is that the world's first verifiable credential is sitting in everybody's handset: a cryptographically signed copy of your definitive IMSI,
(05:23):
and the handset could make that verifiable credential available to the firmware in the phone so that every single phone call could actually be checked against the IMSI. And the network, of course, does this: the network checks the IMSI, because that's how it generates bills. Global roaming depends on the SIM card signing the IMSI and
(05:44):
sending it into the network so that a handset identifies itself at the start of a call.
(06:04):
You know, it kind of frustrates me that that signal is not also available, I guess, at the firmware or the software level in the phone, so that the handset could also identify whether the claimed caller ID actually matches the IMSI or not.
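To make Steve's idea concrete, here is a minimal, purely hypothetical sketch in Python of the kind of firmware-level check he is describing. No handset exposes an interface like this today: the HMAC secret stands in for the SIM's carrier-provisioned keys, the IMSI-to-number directory stands in for a mapping only the carrier holds, and every name in it is illustrative rather than a real API.

```python
# Hypothetical sketch: cross-checking a claimed caller ID against a SIM-signed IMSI.
# Real SIMs use carrier-provisioned keys and AKA-style crypto, not a shared HMAC secret.
import hmac
import hashlib
from dataclasses import dataclass

SIM_KEY = b"carrier-provisioned-secret"  # stand-in for key material held inside the SIM

@dataclass
class SignedImsiAssertion:
    imsi: str         # International Mobile Subscriber Identity, e.g. "505013485006789"
    signature: bytes  # produced inside the SIM; verifiable by the network (and, ideally, firmware)

def sim_sign_imsi(imsi: str) -> SignedImsiAssertion:
    """What the SIM conceptually already does: emit a signed copy of the IMSI."""
    sig = hmac.new(SIM_KEY, imsi.encode(), hashlib.sha256).digest()
    return SignedImsiAssertion(imsi, sig)

def verify_assertion(assertion: SignedImsiAssertion) -> bool:
    """Check the signature before trusting the IMSI it carries."""
    expected = hmac.new(SIM_KEY, assertion.imsi.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, assertion.signature)

def caller_id_matches_subscriber(claimed_caller_id: str,
                                 assertion: SignedImsiAssertion,
                                 directory: dict[str, str]) -> bool:
    """Hypothetical firmware check: does the claimed caller ID map to the subscriber
    identified by the verified IMSI? `directory` stands in for the carrier's
    IMSI-to-phone-number mapping, which handsets do not actually hold today."""
    if not verify_assertion(assertion):
        return False
    return directory.get(assertion.imsi) == claimed_caller_id

# Usage sketch with invented numbers
directory = {"505013485006789": "+61400000111"}
assertion = sim_sign_imsi("505013485006789")
print(caller_id_matches_subscriber("+61400000111", assertion, directory))  # True
print(caller_id_matches_subscriber("+61499999999", assertion, directory))  # False: spoofed caller ID
```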
Speaker 1 (06:07):
I wish I knew more about the STIR/SHAKEN protocol, but it doesn't seem to go down to the hardware at that level. It looks to me like it's a software-only approach.
Speaker 2 (06:18):
Yeah, and you mentioned before, George, as we were preparing for this, that there is this sort of cross-protocol problem too: telephony these days is mostly IP telephony, not just the traditional network infrastructure. So what we think happens is that, even if you're using this protocol, as the call data moves between an IP
(06:39):
network and a mobile broadband network, a lot of that metadata gets lost. It's just too hard to keep the metadata across those different boundaries, so it disappears. Priority is given to the actual call data, the IP payload, while all of that metadata that shows us where
(06:59):
it came from, what it's supposed to be used for and how the data originated gets lost. It's dropped on the floor. It's an important resource, but we just can't maintain it.
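For readers curious about what that provenance metadata actually contains, here is a rough sketch of the claims a SHAKEN PASSporT carries in the SIP Identity header (per RFC 8225 and RFC 8588). All of the values below are invented, the certificate URL is hypothetical, and the ES256 signing and verification that a real service performs against the originating carrier's certificate is deliberately left out.

```python
# Rough sketch of the provenance metadata in a STIR/SHAKEN "PASSporT".
# Values are invented for illustration; the signature is elided.
import base64
import json
import time

def b64url(obj: dict) -> str:
    # JWT-style base64url encoding without padding
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

header = {
    "alg": "ES256",
    "ppt": "shaken",
    "typ": "passport",
    "x5u": "https://certs.example-carrier.test/shaken.pem",  # hypothetical certificate URL
}
claims = {
    "attest": "A",                    # A/B/C: how strongly the originating carrier vouches for the caller
    "orig": {"tn": "12025550142"},    # claimed calling number
    "dest": {"tn": ["61400000111"]},  # called number(s)
    "origid": "hypothetical-origination-id",  # opaque identifier for the call's point of origin
    "iat": int(time.time()),          # when the attestation was made
}

# A terminating provider would verify an ES256 signature over these parts
# against the x5u certificate; that step is omitted in this sketch.
print(f"{b64url(header)}.{b64url(claims)}.<signature elided>")
```

It is exactly this attestation and origin information that tends to be stripped or ignored once a call crosses out of the originating IP network.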
Speaker 1 (07:10):
So, Steve, what have you seen with respect to solving this ubiquity problem? And I'm going to take up some more airspace here. In the payments world, where I come from, Apple Pay has been around for almost a decade. It's had a very long time to reach hockey-stick
(07:34):
level adoption, because it required handsets, it required people to reach a comfort level with loading their phones with payment credentials, which was a new thing, and of course it required a contactless acceptance footprint all around. The convenience factor is way up there, so it's getting used more and more, but it's still not ubiquitous. Even in that case, the specification on which Apple Pay is based, and which Google
(07:57):
Pay and Samsung Pay follow as well, the EMV tokenization framework specification, specifies a lot of metadata types. That metadata is really handy for risk management, and the challenge here is that a lot of entities that might use it don't.
(08:17):
They don't take it up and use it for their risk management capabilities. So, back to my question: how do we get security to be a priority against the functional stuff?
Speaker 2 (08:32):
It's our old friend regulation, isn't it? I was wondering where you'd go with that. Look, ours is hardly a lone voice calling for regulation. None other than Bruce Schneier, for example, probably the world's preeminent security commentator and security engineer, has said for a long time of things like IoT, the Internet of Things, that
(08:54):
security is such an important thing, and yet it doesn't get a voice at the table in the competitive product engineering landscape. So Bruce Schneier calls for regulation to mandate minimum security on IoT devices, and I think he's right. I think you get a market failure with safety. I mean, Lord, let's go back to automotive safety.
(09:17):
With the best will in the world, competitive businesses don't prioritise safety, especially when it's not a competitive differentiator, and arguably we don't want safety to be that differentiator. The brutal truth is that capitalism fails to deliver safety when safety is not a competitive differentiator. That's almost like a law of nature.
(09:39):
So we do look to regulators, and perhaps that's why these things move so slowly, because regulators are notoriously slow. But I think the parallels between car safety and internet safety and cyber safety are very strong and very obvious.
Speaker 1 (09:55):
Sure, we've seen it with automobiles and, of course, we've seen it in airlines and aircraft.
Speaker 2 (10:01):
The FAA is very strong, your favorite example. The importance of airline safety and the advances in airline safety are down to regulation; I think everybody would have to agree on that. That's right.
Speaker 1 (10:12):
Very rigorous certification requirements for every type of aircraft and every modification to that aircraft.
Speaker 2 (10:19):
So we're actually asking politicians, I guess, to regulate the things that lead to robocalls, and maybe the political imperative to regulate robocalls just isn't there. I mean, where do most of our robocalls come from? Politicians?
Speaker 1 (10:35):
Do we just stop and weep at this point in frustration? You know, we've talked about this a lot amongst ourselves: how do you build in economic incentives to improve security? You and I have talked a lot, in our own thinking about a solution, about the fact that there are economic models
(10:55):
for sharing verifiable credentials in which the party consuming those credentials, the one that has to make a risk decision based on them, has an economic incentive to pay for them.
Speaker 2 (11:07):
Paying for better data, you're saying. Pay for the quality signals; maybe they're optional quality signals.
Speaker 1 (11:13):
We've got a ton of relying parties who subscribe to services like LexisNexis Risk Solutions and other data providers, because they need to spend money to manage their risks. And there are alternatives that require less storage of data and offer stronger proofs of data provenance. So to say
(11:37):
that there's no economic engine available, that's just not the case, is it?
Speaker 2 (11:45):
No, it's plainly not the case. We have a free market that has led to a marketplace of data signals. Businesses do pay, and they pay a lot, for better data, and in a sense that's a good thing. I don't think the way that data is distributed at the moment makes a lot of sense, though; I think the market is corrupted by a number of
(12:06):
misaligned interests, shall we say, to be politically correct. But clearly, in a sense, information wants to be free: there's so much free information out there, and yet businesses do pay a premium to get those risk management signals.
Speaker 1 (12:20):
"Information wants to be free" is just some hippie nostrum. We have always paid for quality information; we've paid for books for centuries, and that's quality information. Just because I don't have to go to a library or a store to buy a book online, there's no reason it should be free. I think that, as an industry, by looking at systemic economic incentives
(12:45):
and getting those aligned around the sharing of data using broader mechanisms, rather than having to contract with company A, company B and company C to get the data I'm looking for and hoping it's accurate, we can do a lot better.
All right, let's leave it there, Steve.
Speaker 2 (13:06):
Yeah, it's one of those little conversations that raises more questions than answers. We always hope that our listeners and audiences come here for ways of thinking about problems and unpacking the dimensions of this wicked problem. There are no easy solutions, but I think the principles are becoming clearer: how do you make data quality signals, the metadata, how do you
(13:29):
make them available, how do you have a flatter playing field, I guess, for accessing that data, and how do you pay reasonable fees for it?
Speaker 1 (13:38):
Well, that's it. I think it's this combination of regulation, sure, where it's feasible and possible. Living in the US, regulation is a challenging piece.
Speaker 2 (13:49):
How very polite of you.
Speaker 1 (13:58):
Politics drives our regulation hugely, of course. And we need to tempt businesses to deploy, with a regulatory kick in the back if need be, because I do agree that a regulatory push really moves markets. You know, one of the reasons I'm excited to have this conversation with you, Steve, is that what appears to be happening in Australia is a lot of strong, focused thinking on
(14:22):
this topic: what's the role of regulators, what's the role of the market? It's top of mind there, so at least I hope our conversation is useful in your neck of the woods.
Speaker 2 (14:40):
Definitely so. We could flag another podcast, from National Australia Bank, which had me as a guest on their Digital Next podcast. That's going to be posted next week sometime, and we've got a couple of blogs lined up that you'll also see on the Lockstep website, discussing what we think is shaping up to be world's best practice in terms of regulating digital ID, and a vision for how this might be extended to regulating all sorts
(15:01):
of other data as well. So it is a really interesting time here in Australia. It's been a long time coming; we've got probably the third iteration of what used to be called digital identity legislation, the third iteration in 15 years. But no, we're absolutely getting there. It's good to see some progress.
Speaker 1 (15:21):
Good. Well, on that optimistic note, I'll see you next time. Got to be optimistic, Steve.
Absolutely. Thanks, my friend. Talk to you later. Cheers, thank you.