
August 13, 2023 38 mins

Consumers are beginning to understand that their personal data has real value, but it is still held in the hands of big technology companies. The promise of consumers managing and owning their own data will become a reality thanks to decentralised or self-sovereign identity (SSI).

To delve into this topic, I'm delighted to welcome onto the podcast Marie Wallace, Digital Identity Lead at Accenture.

Marie and I worked together at IBM, and I've always enjoyed our discussions about analytics and data. This conversation explains SSI in a very accessible way.

Self-Sovereign Identity, or decentralized identity, is a model where individuals have full control over their personal data.

During the episode, Marie debunked the common myth that implementing this model necessitates a blockchain, and we looked at the concept of "streaming trust".

We also looked at Marie's instrumental role in projects like IBM's Digital Health Pass and New York's Excelsior Pass during the pandemic.

A fascinating part of the discussion revolved around the concept of verifiable data, and the potential of AI in offering personalised career advice, primed with our own personal data.

We also examined how companies might interact with customers to provide incentives for releasing individual data and how LinkedIn is starting to verify profiles to allow individuals to have more control over the data they own.

This episode also looked at the integral role of trust networks, the need for secure digital wallets, and the exciting prospects of verifiable data.

We also covered topics such as:

  • The concept of "streaming trust" & how SSI works
  • Sovereign identity & worker onboarding
  • Trusting the trust provider
  • The need for secure, trusted digital wallets
  • Using SSI to validate my LinkedIn profile
  • Exploring the concept of verifiable data
  • Empowering consumers with their own data
  • The role of AI with SSI & personal data
  • SSI drivers: risk, cost, fraud reduction
  • Where do I store my SSI data?
  • The philosophy of self-sovereign identity
  • Examples where SSI is working already
  • Self-sovereign identity and AI explorations
  • Embedding trust within the supply chain
  • AI uses in SSI
  • Empowering individuals with self-sovereign identity
  • Who needs to drive SSI adoption?
  • The biggest challenge in SSI
  • What are the steps to take to control my own data?
  • Being more data literate and caring about your identity and online safety
  • Three actionable steps to manage your own identity


More on Marie
Marie on LinkedIn
Marie on Twitter
Marie's blog
Marie's TED talk

Resources mentioned

Thanks for listening to Digitally Curious. Pre-order the book that showcases these episodes at digitallycurious.ai/pre-order

Your Host is Actionable Futurist® Andrew Grill

For more on Andrew - what he speaks about and recent talks, please visit ActionableFuturist.com

Andrew's Social Channels
Andrew on LinkedIn
@AndrewGrill on Twitter
@Andrew.Grill on Instagram
Keynote speeches here
Pre-order Andrew's upcoming book - Digitally Curious


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:30):
Welcome to the Actionable Futurist podcast, a show all about the near-term future, with practical and actionable advice from a range of global experts to help you stay ahead of the curve.
Every episode answers the question "What's the future of?", with voices and opinions that need to be heard.

(00:51):
Your host is international keynote speaker and Actionable Futurist, Andrew Grill.

Speaker 3 (01:19):
An area I haven't covered on the podcast to date, but is an important and evolving space, is that of self-sovereign or decentralized identity.
To dive deeper into this topic, I'm delighted to have on today's show a former colleague of mine, Marie Wallace, whom I worked with at IBM.
She is now Managing Director, Digital Identity Lead at Accenture.
Welcome, Marie, it's so good to see you again.
We're not in the same room, but I always enjoyed our

(01:41):
conversations at IBM about data and social and all things in life.
This is a fascinating topic.
When we met at IBM, you'd already been there for some 13 years.
Perhaps you could explain your journey to date, from your work in natural language processing at IBM to leading digital identity at Accenture.

Speaker 4 (01:57):
I was 21, nearly 22 years in IBM Research, technically doing research and innovation.
It's all been predominantly around AI, natural language processing, analytics in various incarnations.
We built our first smart assistant back in 2007 or 2008.
I've been in this for a long time.
All the analysis and the data that I was working with was

(02:19):
predominantly personal data.
I got increasingly interested in the ethics of data science, privacy and just generally in a digital society where a lot more data is going to be flowing around.
Right now, people don't have the same agency over their data.
They don't know what happens with their data.
They don't really understand anything about their data.
It's fundamentally going to be used more and more.

(02:41):
There's more risk associated with the ethics.
Anyway, to make a long story short, this got me really interested in this idea of self-sovereign identity — less about the identity piece and more just about self-sovereign data.
How do I have greater agency over my own data?
That's really what got me into this space.
In the last probably six years, I built something called the IBM Digital Health Pass when I was at IBM, which was used as

(03:03):
the Excelsior Pass in New York during COVID.
I subsequently did a variety of different projects.
Essentially, the reason I joined Accenture, and why I'm really excited to be here, is we've now got to the stage where it's not mainstream yet, it's not at market adoption.
People are slowly realizing the value and the importance and the challenges of a digital society and the need to have a

(03:25):
decentralized identity approach.
Why I joined Accenture is because, really, it's about reinventing the enterprise, reinventing the world, reinventing digital society and thinking about how we do this in a really ethical, really privacy-preserving way, empowering people with their data.
Accenture has lots and lots of clients across pretty much every industry and every geo.
We're working with them.
I've got some really interesting use cases that we're

(03:46):
exploring with a variety of our clients that all revolve around decentralized identity.

Speaker 3 (03:51):
I'd love to get into some of those use cases that you can talk about.
I've been talking about the fact that we will own our own data for some time.
As the technology evolves, my predictions start to come true, I keep saying.
When that whole Facebook Cambridge Analytica thing happened a few years ago, I was surprised at how blasé many people were about their data.
Initially there were lots of headlines — you know, oh my goodness, they did this with our data.

(04:12):
Now it's fallen back and people, I think, have forgotten the amount of data they have out there.
I want to own my own data, but there are challenges around that, and SSI will help solve that.
How do you explain SSI to my mom?
On your blog, and with a number of colleagues you've been talking with on LinkedIn, you talked about SSI with the example of streaming music like Spotify or Napster.
You coined the phrase streaming trust, maybe using that as an

(04:36):
analogy.
How does self-sovereign identity work, using that analogy?

Speaker 4 (04:39):
If we think about the music industry, there's a few things that are interesting.
One, we all remember — actually, the older of us remember; some of the youngest maybe don't — we remember buying big stacks of CDs, and you'd have a big CD case in your living room with all the CDs that you had bought.
Then if you wanted to have a dinner party you'd have to take out a CD and put it in.
So you bought all the data, all the songs, even though there were maybe only three or four songs on the CD you

(05:01):
might want to listen to, and you might listen to them not that often.
Then you had to keep buying more and more CDs.
So that's kind of the old analogy, and I equate that very much to the way things work today in terms of data.
People buy data.
They might be buying it from a third-party data aggregator, or they might actually be effectively buying it or getting it directly from you.
They ask for a stack of data and then they stick that stack

(05:22):
of data in a big MDM somewhere, and then they have to manage it, manage security and the risk of data theft.
So they have all the costs, the same way we did, of managing a big stack of data.
What streaming did for music was allow people to say: I don't need to own any songs, I don't need to have stacks of CDs, but what I can do is get the song when I need to.
So when I'm sitting down for dinner I can say this is the

(05:43):
song I need, and you basically pay on demand.
So that was basically the model, and we all remember the content battles — content providers did not want to go down this route, and what happened is they went out of business and a new generation of companies came around that really took advantage of the streaming model.
And I think this is the same thing with data.
To take a very simple analogy: we're doing some really interesting work around worker credentialing.
Back in February, I

(06:05):
joined Accenture and I had to go through the onboarding process.
What you do today with onboarding: you have to have proof of prior employment, so it's a letter of proof from when you worked for a specific company; you've got your transcript, or you basically tell them — you self-attest — that you got it from this college.
So for all this data, you provide them photocopies or you self-attest, and then they either go themselves or they have to

(06:27):
pay a third party to go verify all that data.
They go to the college, and what happens is you go back to these institutions that verify academic data, they'll get all the academic data and then they'll pass back the data.
So now what you have is all this manual work, and the reality is you share all this data — say, academic credentials, as an example — that you don't need to.
Now what we're proposing is this: with a verifiable credential,

(06:48):
when I qualify, or when I work for a company, I get proof of employment.
When I leave, I still have proof of employment, but it's just expired — you know, the dates of employment were this date and this date.
When I then want to get a new job, I just have to exchange verifiable credentials.
It happens as I need it.
The receiving company, Accenture, doesn't even need to store it.
They can keep a record of the exchange, but they don't need to store all the data.

(07:09):
They only need to store the minimum amount of information they needed to complete the transaction, which was to hire Marie as a permanent employee.
So it completely changes the dynamics, both in terms of the user experience — how easy and quick it is to onboard with the company, or to get a new credit card, or whatever the process might be — and the minimum data that the receiving company actually really needs to hold.
Because, as one of the examples I'll give,

(07:31):
which I always think is a really nice one, is your address.
People's addresses change all the time.
Why do banks need to store addresses?
The only time they need the address is when they want to send you a letter, at which point they should be able to send a quick proof request, check your address and then send you your letter — if they have to physically send you a letter at all.
So there's all this data we're holding on to that we don't need

(07:53):
to be, and that's really how I equate music to this kind of streaming data and streaming trust.
We get the data on demand when we need it, and it's a whole different business model.
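To make the streaming-trust analogy concrete, here is a minimal Python sketch of an expiring employment credential held in the individual's wallet and queried on demand, so the receiving company stores only the answer. The class and field names (and the dates) are illustrative assumptions, not any particular wallet product or credential standard.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class EmploymentCredential:
    """Issued into the individual's wallet by an employer."""
    holder: str
    employer: str
    start_date: date
    end_date: Optional[date]  # None while the employment is current


@dataclass
class Wallet:
    """The individual holds their own credentials."""
    credentials: list


def prove_prior_employment(wallet: Wallet, employer: str) -> dict:
    """Answer the narrow question 'did this person work at X, and when?'
    Only the dates leave the wallet; nothing else in the credential is shared."""
    for cred in wallet.credentials:
        if isinstance(cred, EmploymentCredential) and cred.employer == employer:
            return {
                "employed": True,
                "from": cred.start_date.isoformat(),
                "to": cred.end_date.isoformat() if cred.end_date else "current",
            }
    return {"employed": False}


if __name__ == "__main__":
    wallet = Wallet(credentials=[
        EmploymentCredential("Marie", "IBM", date(2001, 5, 1), date(2023, 1, 31)),
    ])
    # The onboarding system keeps only this small answer, not a stack of documents.
    print(prove_prior_employment(wallet, "IBM"))
```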

Speaker 3 (08:02):
There's a recent example where this went horribly wrong in Australia: one of my former employers, Optus, had a data breach.
People then realized that Optus had been storing their full driver's license details.
Now, in your world, with driver's license verification — here in the UK it's the DVLA — you basically have to ask: do you have a valid driver's license and are the credentials correct, yes or no?
It's almost a flag.
What people were surprised about is that Optus had stored

(08:25):
the full driver's license number and everything else.
Of course, when that was compromised, people's driver's licenses were exposed.
Optus, I think, had to pay something like $300 or $400 per person.
There was literally a line-up outside the motor vehicle registry for people to get their new driver's license numbers.
In your world, what would have happened there is that there would have been an exchange of verifiable credentials between the government and the telco provider to say, yes, Andrew Grill has that license, and all they'd be storing would be a
(08:47):
Grill has that license, and allthey'd be storing would be a
verifiable check mark rather than the driver's license.
Is that how it would work in practice?

Speaker 4 (08:53):
You're coming to really what differentiates the decentralized model.
The way decentralized works is the individual is issued their data.
Today, if I want to rent a car from Hertz, I go to Hertz, I self-attest my driving details, and I have to give normally a lot more information than they need, because they then have to use that data to go back and verify with the DMV or with the driver's license authority, either using an API or something.

(09:15):
I give a load of data and then they have to go verify it.
Then I have to give consent so that they can share the data.
You get all this palaver of consent and security and privacy because these two companies are sharing your data, whereas in the self-sovereign identity world they don't talk to each other at all.
It's not the business of the DMV whether I rent a car at Hertz.
The DMV issues me my driver's

(09:36):
license.
I have my driver's license in my wallet as a cryptographically verifiable document.
I then, in turn, share that directly with Hertz as a proof exchange, and what Hertz store may just be: yes, the proof exchange was successful.
They might, for example, before you can rent the car, need you to prove that you are who you say you are — so proof of identity, proof of a driver's license, and maybe even proof of insurance, if you
(09:57):
license and maybe it might evenbe proof of insurance, if you
can use your own insurance.
You respond to three proofs and then they give you your car.
It could be as simple as that, moving forward.
The person is the agent that does the exchange of data, not two organizations sending your data back and forth without your visibility.
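As a rough illustration of the proof exchange Marie describes, the sketch below shows a verifier asking for three proofs and the wallet answering from credentials it already holds, with the verifier recording only the outcome. It is a toy model with made-up names, not a real presentation protocol such as OpenID4VP or DIDComm.

```python
from dataclasses import dataclass


@dataclass
class Credential:
    type: str    # e.g. "identity", "driving_license", "insurance"
    issuer: str
    valid: bool


class Wallet:
    """Holds the individual's credentials and answers proof requests."""
    def __init__(self, credentials):
        self.credentials = {c.type: c for c in credentials}

    def respond(self, requested_types):
        # A yes/no answer per requested proof; no raw documents are sent.
        return {t: (t in self.credentials and self.credentials[t].valid)
                for t in requested_types}


class RentalVerifier:
    PROOF_REQUEST = ["identity", "driving_license", "insurance"]

    def run_exchange(self, wallet: Wallet) -> dict:
        answers = wallet.respond(self.PROOF_REQUEST)
        # The only record kept: the exchange outcome, not the license number itself.
        return {"proof_exchange_successful": all(answers.values())}


if __name__ == "__main__":
    wallet = Wallet([
        Credential("identity", "passport_office", True),
        Credential("driving_license", "DMV", True),
        Credential("insurance", "insurer", True),
    ])
    print(RentalVerifier().run_exchange(wallet))
```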

Speaker 3 (10:13):
I hold the credentials.
How can you trust that the credentials I hold are real and verifiable?
Who trusts the trust provider, or is that an unfair question?

Speaker 4 (10:22):
It's actually the number one question, to be honest with you, Andrew.
I think there's a couple of things.
Somebody said to me they were really surprised Accenture was interested in this because it sounds like anarchy.
No, because there still are trusted issuers.
There still is the DMV issuing driver's licenses.
That fundamentally doesn't change.
The DMV, as an example, is issuing a cryptographically signed document, using their private key that nobody else has,

(10:42):
to Marie.
None of the data elements within that document can be changed, because otherwise the verification won't be possible.
That's one protection.
The other critical thing is: how does the DMV know that it's issuing the data to, indeed, Marie Wallace?
This is where you have to have a digital wallet which is tightly bound to that individual's identity.

(11:04):
Very often, you also actually probably want it bound to a device, so that if Marie's data or wallet or something was stolen, you'd have to get her phone, you'd have to get her biometrics to get into the phone, and then you might have to get a passkey.
You'd actually have to do a lot to try to get into her wallet and then to be able to use it.
The critical thing is you need a super secure wallet, and the issuer needs to be issuing to a secured, trusted wallet.

(11:26):
The verifier then obviously needs to trust the wallet and they need to trust the issuer.
That's where the trust network comes into play.
There isn't a single network; it's ultimately a network of networks.
Ultimately, the verifier does indeed need to trust: they either need to directly trust the issuer — if the DMV is publishing their keys, then you as a verifier can say, I want to keep my own trust registry — or you might want to trust a third-party registry, like a Yellow Pages, that is managing onboarding of all these
(11:49):
Yellow Pages that is managingonboarding of all these
different entities, to say: yes, these are trusted entities.
Trust, like beauty, is in the eye of the beholder.
If I'm in some country — I don't know, some country somewhere — they might, for whatever reason, decide we're not going to trust the DMV.
It's not like there's a rule that you have to trust them.
The only critical thing is you have to be able to identify the entity, to know this is indeed the DMV that is the official
(12:10):
entity to know this is indeedthe DMV that is the official
issuer of driver's licenses for the United States.
The trust network and the trust registries are a critical component of this system working.
These can be something that an organization maintains for themselves.
That actually works in cases where you have small ecosystems.
But for very large ecosystems, obviously, what you're going to

(12:32):
end up having is networks of trust registries, and maybe even government-mandated or government-managed trust registries.
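The two trust checks described above — the issuer's signature protects the credential's contents, and a trust registry tells the verifier who the issuer is — can be sketched roughly as follows. An HMAC stands in for the public-key signature a real issuer like the DMV would use, and the registry entries and DIDs are made-up examples.

```python
import hashlib
import hmac
import json

# The verifier's trust registry: the issuers it recognises and their key material.
# Real registries would hold public keys or DID documents; this is a stand-in.
TRUST_REGISTRY = {
    "did:example:dmv": {"name": "US DMV", "key": b"dmv-signing-key"},
}


def issue_credential(issuer_id: str, issuer_key: bytes, claims: dict) -> dict:
    """The issuer signs the claims; any later change to a field breaks verification."""
    payload = json.dumps(claims, sort_keys=True).encode()
    signature = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    return {"issuer": issuer_id, "claims": claims, "signature": signature}


def verify_credential(credential: dict) -> bool:
    """Check the issuer is in the trust registry and the signature still matches."""
    issuer = TRUST_REGISTRY.get(credential["issuer"])
    if issuer is None:
        return False  # unknown issuer: this verifier does not trust it
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(issuer["key"], payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])


if __name__ == "__main__":
    cred = issue_credential("did:example:dmv", b"dmv-signing-key",
                            {"holder": "Marie Wallace", "license_class": "B"})
    print(verify_credential(cred))            # True: trusted issuer, untampered claims
    cred["claims"]["license_class"] = "HGV"   # tamper with one data element
    print(verify_credential(cred))            # False: verification now fails
```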

Speaker 3 (12:39):
You mentioned the word wallet.
When I think wallet, I think blockchain, but we talked off air and there's actually no link between the two.
Could you do some myth-busting — that self-sovereign identity doesn't need a blockchain?

Speaker 4 (12:48):
No, it doesn't.
Now, this is maybe not the most secure way of doing it, but during COVID, as an example, countries around the world — Europe being a good example — wanted to be able to issue COVID passes.
They didn't want to have to build everything on blockchain, but they wanted to have a wallet, they wanted to have verifiable credentials, they wanted the individual to have the agency, and they wanted it to be peer-to-peer.
So, on the edge, they wanted everything that is self-sovereign

(13:09):
identity.
But in that case, for example, in Europe, they basically had a public key directory.
The trust registry for Europe was the EU gateway.
It was an actual gateway, a centralized system.
That was the trust registry for Europe.
Now, in the United States, as an example, they didn't have that sort of centralization.
If we look at, for example, New York and the Excelsior Pass: New York State had its

(13:32):
trust registry for its Excelsior Passes.
That was, I think, in a database.
Then, when we built the Digital Health Pass, we did indeed build it on blockchain.
You can use many different technologies in different ways.
Blockchain is one of them.
In certain cases it provides some very, very nice characteristics, but you don't have to use blockchain.
So people use different technologies for doing that, and

(13:53):
that's perfectly fine.
For me, the key thing about the concept of self-sovereign identity is essentially the philosophy, the concept, the design.
That's what has to be respected.

Speaker 3 (14:02):
For a while now, I've been talking about the concept of tokens, and I've been using the example of a blue tick against the roles on my LinkedIn profile.
You know that I did actually work at IBM in the roles I had from 2013 to 2017.
Let's look at how that would work in practice.
I understand the whole trust network, but IBM would have to be able to issue a credential to me or to LinkedIn, and LinkedIn would have to verify that.

(14:23):
How would that work in practice?
What's the incentive for LinkedIn, or me, or IBM to even offer that service?

Speaker 4 (14:28):
Funny that you should mention that, because this is actually something that we're really in the process of exploring here within Accenture.
I've had some interesting conversations with a number of our clients over the last month about the incentives around releasing individuals' data back to them — respecting your customer — and there's a surprising amount of interest from the companies I speak with in releasing data to their customers, for a number of different reasons.

(14:49):
One is they want to minimize the footprint of PII that they're holding.
There's the risk element: they don't want to be holding a load of data about you that they don't need to be holding.
There's something really attractive about you having more and more autonomy over your own data.
There's a real differentiation if you're a company that empowers individuals with their own data.
I'll give you a very, very simple example.
It seems trivial, but it could be very powerful.

(15:09):
Most companies, if you're a customer, generate a risk profile or a credit score of some description — how good is Marie at paying her bill on time?
All companies do that, so they know which customers are very good, pay their bills on time, and things like that.
That's never data that I get.
Just imagine if the company released it to me.
Then, if I'm renting an apartment, I can prove that my local telecom provider or the electricity supply board or whoever is able to verify that
(15:32):
electricity supply board orwhatever here's to verify that
I've always paid my bill on time.
That's a very simple example where the incentive to the individual is: this is a company I do business with that is giving me value over and above the service that they're already giving me.
When I was at IBM, Ginni Rometty was quoted as saying data is the new currency, or data is the fuel that's going to drive our economies.
And if it's a currency, if it's a fuel, it should be leveraged,

(15:55):
it should be utilized, and I as a citizen should be able to leverage the data for value.
So I would argue there are a growing number of incentives.
If we look, say, at proof of employment — just think of it.
Proof of employment can be very useful for companies in simply giving access to systems, giving access to the building.
I'm an Accenture employee, so I can get into the Accenture

(16:15):
building, so there's that kind of obvious benefit.
There are also benefits for how I represent myself on LinkedIn, and the incentive for me as an individual is to increase the trust of people looking at me on LinkedIn, so increase my reputation.
The value to LinkedIn is they have trusted, more valuable data that everybody's going to trust, and the value to my company, as an example — okay, there's a risk that if my profile increases, I might get

(16:38):
poached by other companies.
But again, I think of it as, like, you know: if you love somebody, set them free, and if they love you, they'll come back to you.
If they don't, they never loved you in the first place.
So there is an element that you have to trust people with this data.
If you go a step further, you mentioned things like checkmarks for other types of data.
We're also doing some really interesting work to look at employees and what other information would they really

(16:59):
benefit from.
I think about developing my career: how do I identify whether I have the right skills, and what sort of training should I do?
What sort of job should I do?
What sort of project should I do?
Just imagine if I'm collecting, over time, all the data that represents the work I've been doing, the formal education I've done, the certifications I've taken, maybe even the jobs I've done within a company.
So, while I may not be able to expose the fact that I worked

(17:20):
with company X on project Y, it could still be issued as a verifiable credential, because all the verifiable credential needs to prove is that you worked with a Fortune 500 company in the automotive space doing X and you got a five-star rating.
So, over time, your CV isn't a static snapshot, it's every data element.
So now we think about AI and the role of AI as, basically,

(17:44):
a career manager helping me with my career: you're actually having a wonderful set of data points, so I can now really get personalized career advice and really figure out what jobs I could be going for.
Both me as an employee, but also the employer, will have a much, much easier time of finding the right employee.
The potential is huge but, again, AI is only as good as the data that feeds it.

(18:04):
We need really good data, and this is one of the interesting challenges around credentialing.
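A hedged sketch of the selective disclosure Marie outlines: the wallet holds a rich project-history credential, but the derived proof asserts only the facts the receiver needs (industry, company tier, a minimum rating) without naming the client or project. All field names here are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ProjectCredential:
    client_name: str       # sensitive: never disclosed
    project_name: str      # sensitive: never disclosed
    industry: str
    client_is_fortune500: bool
    rating: int            # 1-5 star rating from the engagement


def derive_proof(cred: ProjectCredential, industry: str, min_rating: int) -> dict:
    """Build a minimal disclosure: true/false answers, no identifying details."""
    return {
        "worked_in_industry": cred.industry == industry,
        "client_is_fortune500": cred.client_is_fortune500,
        "rating_at_least": cred.rating >= min_rating,
    }


if __name__ == "__main__":
    cred = ProjectCredential("<client withheld>", "<project withheld>",
                             industry="automotive", client_is_fortune500=True,
                             rating=5)
    # A recruiter or career-advice AI sees only this, never the client or project name.
    print(derive_proof(cred, industry="automotive", min_rating=4))
```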

Speaker 3 (18:09):
And then if I own my data, I can choose to give it to another downstream system, and I have much more control over that.
So I think that's fantastic.
Just back to LinkedIn: they're starting to do a level of verification.
I've seen people now with verified profiles, where LinkedIn basically sends an email to your work address and says, well, do you work there?
And if you have an accenture.com email address, then you must work there, so they say that that's correct.

(18:29):
At least they're starting to do that.
But I think the next phase probably requires effort on both sides, and they're owned by Microsoft.
Why wouldn't they want to do it?

Speaker 4 (18:37):
And they are actually doing it already.
There was an announcement a few months ago.
Microsoft Entra, which is the new rebranded set of Microsoft identity offerings, has Entra Verified ID, which is indeed a self-sovereign identity solution that issues verifiable credentials.
They can be sent directly to your Authenticator app.
So if you have the Authenticator app, you can get your verified credentials — proof of employment, as an example — and

(18:57):
indeed LinkedIn leverages that.
So LinkedIn is already providing support for verifiable credentials.
So this is where we are seeing use cases emerge.
We're seeing companies starting to adopt.
Sometimes it's not immediately visible, because the average Joe on the street doesn't care whether this is a verifiable credential, but they care about a more seamless, frictionless user experience.
And that is, I think, where the value is really going to come,

(19:19):
where this concept of verifiable data is going to enable these more frictionless user experiences.
And the other critical thing as well is that I don't talk so much about privacy these days.
I think more about safety, because when you talk about privacy — "privacy? Well, I have nothing to hide" — it's not about that.
It is about you being safe as you move through your digital world, and I think safety is the more important issue for

(19:41):
everybody out there.
The amount of fraud that exists, the amount of money — it's mind-blowing, the amount of money lost on these things like phone scams, all sorts of scams.
The reality is, verifiable credentials can solve all those problems as well.
So I think there's a really interesting set of drivers at this point in time — risk, cost, fraud, frictionless experiences — that's starting to make

(20:04):
verifiable data really relevant and really timely.

Speaker 3 (20:08):
If I now own my own identity, where does it get stored, and who should store it?

Speaker 4 (20:12):
Using the analogy that data is currency, I think about it very similarly to how we think about banking.
Years ago, people put all their money under their mattress and they felt it was safe there, and then eventually they started trusting banks.
I would argue — and this has always been the vision of self-sovereign identity — that ultimately, in the future, what you'll have is wallet providers, and there'll be many of them, so you can choose, the same way you choose what

(20:33):
bank you want to go to, and those wallet providers will have secure, privacy-preserving, protected wallets that have both a cloud wallet and a mobile wallet, so it will also mean you'll have backups to the cloud.
If you lose your device, you can basically brick it if you need to.
So I think that, ultimately, is what's going to happen, and that way it'll be much more like having an online bank.

(20:55):
The thing that's nice about this model, as we start to adopt these open standards, is that if you don't like the wallet you're currently using and the service and capabilities you're getting, you just go to another one, like you would change bank accounts.
So that's kind of where I think this ultimately needs to go in order to make this sort of data ownership work — to kind of offload the

(21:16):
responsibility of managing data from the companies to the individuals.
You obviously need to provide them the tools that allow them to do that.

Speaker 3 (21:23):
Well, just on that.
I use 1Password.
I've been using a password manager for about 15 years now, and I tell everyone in all my talks that they should look at doing that.
They're now adding things like passwordless; they're looking at passkeys.
Is that the place — if I trust 1Password, and they have two-factor authentication, and on top of that they have a mobile app?
I actually had my phone stolen two weeks ago and it was really easy to get everything back, because I had one device that I

(21:44):
was able to authenticate everything on again.
Is that the style of provider that I already trust, with thousands of passwords, that could become my wallet provider?
Is that where things are headed?

Speaker 4 (21:54):
In theory.
I also use 1Password; I've used them for years as well.
I definitely come from the concept of liking to have a vault.
I guess the only thing I would say — and I don't know where 1Password is going or what their vision is — is that I think the difference with verifiable credentials is that it's not just about access and authentication.
I would argue that when people think of identity today and they think about tokens, they think about access tokens.

(22:16):
It's basically a one-trick pony.
It's all about access, and access is fine.
But if you think about all the digital touch points between you and a company, specifically as we get more and more digital, accessing something is actually a fraction of the interactions you actually have with the company.
So it's all the other data elements that are needed: your address credential that might be issued by the post

(22:37):
office, or your banking credentials, your health credentials.
There's a whole slew of types of data that you use, or that you will use, all the time as part of your daily life, and being able to make the exchange of that super frictionless — not necessarily the exchange of data, it might be just exchanging proof requests.
Say I'm ordering food from a menu.
How do I make sure that I'm not allergic to anything on that

(22:58):
menu?
Your phone has the allergies that you have, and it basically strips out, marks up the allergens.
That's not a difficult thing to do.
You don't want to share your health data with the retailer.
The data's on your device, and then you can basically have this personalized user experience directly on your device.
The really interesting thing, I think, about self-sovereign identity is it's a philosophy, it's a concept, but it's also a

(23:20):
bit of a mindset shift.
It's a bit of a head-wrecker for people, because you have to flip your perspective of the world on its head.
You have to look at things completely differently.
But once people do, then you start seeing all these really exciting use cases.
But it requires a bit of a head-flip for you to kind of see the use cases.
They're not necessarily evident.
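The menu example can be sketched as purely local matching: the allergy list stays in the wallet on the device, the menu data is pulled onto the phone, and nothing health-related is shared with the retailer. The data structures here are illustrative only.

```python
def flag_allergens(menu, allergies):
    """Mark up menu items locally against allergies stored on the device."""
    return [
        {**item, "contains_allergen": bool(set(item["allergens"]) & allergies)}
        for item in menu
    ]


if __name__ == "__main__":
    my_allergies = {"peanuts", "shellfish"}   # lives in the wallet, never shared
    menu = [                                  # fetched from the restaurant
        {"dish": "Pad Thai", "allergens": ["peanuts", "egg"]},
        {"dish": "Margherita pizza", "allergens": ["gluten", "dairy"]},
    ]
    for item in flag_allergens(menu, my_allergies):
        print(item["dish"], "avoid" if item["contains_allergen"] else "ok")
```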

Speaker 3 (23:39):
Give me some tangible examples where this is working already, so other companies can say: oh, we can do that too.

Speaker 4 (23:44):
Obviously, we know about government IDs.
We know about EU eIDAS, for example: Europe is going to be issuing a digital wallet to every citizen around the end of 2024, 2025, and everybody will have digital IDs which will be verifiable credentials.
So we know governments are starting to do this.
There's a couple of areas that I think are really interesting — I mean, there are many areas — but let me come back to the worker example.
I like that example for a couple of reasons.

(24:08):
One is it takes a significant amount of existing cost out of the business — it costs a lot already — particularly as we're moving more towards that gig economy.
I spent 21 years in IBM; the next generation coming through won't spend 21 years at any company.
People are moving, so the turnover is a lot higher, which means the cost of every time you bring people on board is a lot higher.
So the thing about worker credentialing that's really

(24:30):
interesting is that there's a big cost-saving impact and also, as we move toward skills-based economies and skills-based organizations, optimizing your workforce becomes a huge challenge.
So I do think we're already starting to see it — LinkedIn was an example; they're already starting to support verifiable credentials for proof of employment.
We're starting to see Microsoft issuing this to lots and lots

(24:50):
of their clients.
We in Accenture are starting to look at this more broadly — not just issuing proof of employment, but looking at things like certifications, academic qualifications, skills.
So I think the workforce space is particularly interesting, and there are a lot of companies where it's a really easy sell, because they've got a lot of expense at the moment and this helps them address that.

(25:10):
I think the other area that is getting a lot of traction is around organizational identity and verification.
So we're seeing the likes of GLEIF and the work they've done around kind of the whole governance of issuing organizational identities.
We're working with some projects — which I can't talk specifically about, but hopefully later in the year we'll be able to — where they're looking to become a network for

(25:31):
issuing organizational IDs and then integrating kind of trust within the supply chain.
We think about fraud with individual personal identities, but the amount of organizational fraud is eye-wateringly large.
So how can we kind of take some of that fraud out of the system?
So I think organizational identity and credentialing is very interesting.
We've also seen the concept of things like: how do I make certain information public?

(25:52):
So things like green credentials.
If I'm, for example, an organization, I want my ESG credentials, and how do I, number one, prove that I've been audited by a trusted provider of these checks?
And then how do I share them with other suppliers I'm working with, or companies I want to supply to?
But maybe I want to also share them with consumers.
So you could imagine a world, moving forward, where, you know, I

(26:13):
might just want to look on a website, but I might want to, whenever I buy a product, check the green status of the company that manufactures the product — and it's not a difficult thing.
Again, to the example of when you're scanning stuff to check against your allergies: you might also check to see how green the company is.
So we're seeing organizational credentials as something that's really interesting as well.
And then there's a whole slew of things around the hospitality sector: how do we make the whole travel experience kind of more

(26:35):
seamless and touchless?
That's the key thing.
So those are some of the use cases that we're seeing a lot of interest in.

Speaker 3 (26:42):
So you touched on AI before, and everyone's talking about AI — you can't have a podcast these days without mentioning AI and ChatGPT.
Where does it play a part in decentralized identity, what are some typical use cases, and where are you seeing the opportunity for AI right now?

Speaker 4 (26:55):
I've worked in it for 20-odd years.
There are so many different use cases of AI when I think specifically about identity and verifiable credentials.
I know there's been a lot of hype about ChatGPT, and it is very impressive, but at the end of the day, you know, garbage in, garbage out.
If you want highly specialized recommendations and analysis and matching, or whatever, to understand something about an individual: data in, data out.

(27:16):
So the more reliable, trustworthy data and the more data elements you can bring into the algorithm — particularly because they can now handle a huge amount of data and a huge amount of complexity — the more data you give it, the better results you're going to get out the other side.
I think the credentialing is interesting on two fronts.
Number one is it releases more and more data to the individual.
So it actually generates more data, because there's a lot of

(27:39):
data that is sitting in silos.
So if you think about, you know, the example I gave: companies might have your credit score or whatever, and they have it in a system.
Or LinkedIn has a bucketload of God knows what data about us, and Facebook — all these companies have a lot of information about us.
But just imagine if that data was released to us in a way that we can use, so it becomes usable data for us, which then obviously could potentially feed AI.

(28:00):
Now, the other thing that's interesting about verifiable credentials and self-sovereign identity is that how we share that data with an AI engine again puts us more in the driving seat.
It's giving us greater autonomy and agency, and it's allowing data to be shared in maybe pseudonymous ways, so maybe our actual identity isn't exposed.
It can kind of create these really interesting ways of us sharing lots and lots of data, potentially data points,
(28:21):
sharing lots and lots and lotsof data, potentially data points
with these algorithms, in a way that we choose to share it — whether we want to be completely anonymous or we want to be identified, or whatever the case is.
So there's one interesting example that I really like, because it's a good example.
It was a healthcare scenario; I think a while ago they issued some RFPs around looking at citizen health.
It is a

(28:43):
really interesting example because of this incremental idea that I might incrementally be sharing some information about myself anonymously for the purpose of getting health recommendations that are really targeted: you need to think about this diet, or maybe you should be on this drug, or whatever the case is.
But at some point in time the engine maybe recognizes something concerning: now I want you to go see a doctor.

(29:04):
So now there's going to have to be some identification.
But what it is is incremental identification.
You can be completely anonymous to start with.
You can use these algorithms completely anonymously.
If they surface something important where you maybe want to identify yourself, then you can choose, incrementally, to do that.
So I think the potential is it just gives you a lot of flexibility in terms of how you're sharing the data.
It's not a magic wand.

(29:24):
There are still going to be a lot of issues around ethics.
That goes without saying.
I think empowering people with their own data, and putting pressure on companies to release your data back to you, is really important, and I do think it actually has an interesting positive side effect on the ethics of AI.
Just one quick example: just imagine if all the insights that

(29:46):
a company generates about you, they had to release to you.
I feel really uncomfortable that a company knows more about me than I know about myself.
That doesn't sit right with me.
So fine, they can use the data, but they have to tell me.
And the thing that's interesting about that is it would actually be a self-managing cycle, because if I have to tell an individual: I'm talking to you because I know you're easy to

(30:08):
manipulate, and I have to tell you that's the reason you're getting this — all of a sudden, companies are going to have to be careful.
Do I actually want to use that?
Do I want to generate that insight?
Because I'm going to have to tell somebody I'm doing that.
So even without policing the algorithms, it really becomes self-managing.
So I do think there are really interesting benefits to transparency and giving people more of their data.

(30:31):
So I'm a geek, I'm a futurist, so I want things to happen now.

Speaker 3 (30:33):
I would love to be able to manage my own data in my wallet right now.
We mentioned ChatGPT, and you mentioned you've been working in AI for a long time.
I think what happened when ChatGPT was launched is that it removed the friction.
Anyone can now use the tool without having to write models and scripts and Python; they can actually play with an AI model.
It's a great demo.
It's not perfect.
What is the frictionless moment where there's another ChatGPT

(30:56):
moment, where self-sovereign identity just becomes so in demand that everyone is saying: I need access to this, I want this?
Is there a frictionless moment like ChatGPT, or are we a long way off?
What I'm really trying to say is: who's got to really drive this?
Is it government?
Is it individuals?
Is it companies?
How do we accelerate this?
I'm sure you're more than excited about this space.
How does it become a reality by the end of 2024?


Speaker 4 (31:18):
One of the reasons I joined Accenture is because I was really interested in the demand side.
If we think about the currency example, there's been a lot of focus over the last few years on the supply side of credentials — governments issuing IDs and, you know, all that kind of supply side: how do I issue credentials?
Obviously nobody gets up in the morning and says, I'd love a wallet with a load of data in it.
You need it because you need to solve a particular business problem.

(31:39):
I think the demand side is what's going to drive it, and that's where the frictionless user experiences come in.
So, for example, it might be, you know, the telecom providers introduce this into the system, and whenever you make a telephone call, or you receive a call or a text message, you can see a green check mark to see: is this really my bank, or is this somebody pretending to be my bank?
So you can start to kind of get rid of trillions of dollars'

(31:59):
worth of fraud out of the system.
So that's an example — that's a really big use case which could really start things off: all of a sudden, organizations have to have verifiable credentials, because otherwise they're not going to be trusted, and I'm never going to answer a call from a company that doesn't have the green check mark.
So that's an example of maybe another kind of big use case that could be a ChatGPT-like transformation.
I'm not sure that's what's going to happen.

(32:21):
I think what we're probably going to see is more kind of incremental.
So it might be that we have specific types of use cases — we see, for example, in the healthcare space, movement on things like, you know, provider credentialing.
So there'll be certain industries that start to introduce this because they need to, for various reasons — it might be for risk, for cost savings.
So doctors will need to have their wallet with their

(32:41):
credentials, because they need to get into the operating room, or to get into the hospital, or to prescribe drugs — as an example, they'll need to prove the license to prescribe certain types of drugs.
I tend to think that we may not have that big ChatGPT moment.
You could end up having a big moment where, you know, all of a sudden it starts to become: this is the telecom provider that gives me the safest experience, that everybody wants to get.

(33:02):
Everybody has to get on board.
All the other providers have to get on board.
I don't have an answer; it's hard to know what is going to be big or incremental.
I tend to think at the moment we'll see something a bit more incremental.
What do you think is the biggest challenge in a decentralized identity world?
The biggest challenge is probably that the very fact that it's decentralized means you have multiple parties participating.
The critical thing is the wallet.

(33:24):
So you think about the decentralized model.
You have an issuer of data — your DMV or whatever.
You have the issuer of data, you have the verifier or consumer, you know, who's going to request the proof, and you've got the wallet.
To make this really, really frictionless and seamless, you ideally want to have multiple wallets that people can choose from and that can interact with issuers using different types of issuing technologies.
So, interoperability.
So I guess, if I was to think about what are the big

(33:46):
challenges today, I think interoperability — I mean, this is something everybody knows, I'm not saying anything somebody doesn't know — and we're working on it.
I think we're really seeing companies really, really working on interoperability, so it's getting a lot better, but it's not there yet.
I think interoperability is a big thing and, to your point about frictionless, I think what we really have to do is figure out how we can introduce verifiable credentials

(34:07):
and this proof exchange protocol concept into existing applications, or into entirely new classes of applications, that make it easier to make telephone calls, to transfer money, to manage your career, whatever.
People don't really need to be thinking about credentials; what they need to be thinking about is how quickly they can apply for a job or find a job.
That's sort of, I think, where we need to get to.
I'm kind of excited by the next year.

(34:28):
As I said, there's a lot going on.
It's not clear which use case is going to be the killer use case.
Watch this space.

Speaker 3 (34:37):
I want to control my own data — what are the first steps I should take?

Speaker 4 (34:38):
I guess my call-out to people will be slightly different.
I think one is, to your point, I definitely use a password manager; I just think that's basic housekeeping.
Everybody should be using a password manager of some description to manage their passwords, because, you know, you don't want identity fraud.
That's a big thing.
But I think my bigger call-out, really, to people in general will be to start to care about your data

(35:01):
and to start to push back.
Because for us to make this really happen, to allow people to be empowered with their data, obviously we want to have the use cases where companies will benefit from it.
They might benefit from revenue generation, cost saving, risk mitigation; it might be these new frictionless experiences that will help them get more clients.
But it needs to be bottom-up as well.
The consumer needs to say: well, hang on, I want my data.

(35:21):
I want to know more about what these companies know about me.
I want to be able to have a frictionless experience when I rent a car, or when I get a job, or when I go to the bank.
I, for one, haven't replaced my credit card in 20-odd years, because it's literally worse than getting teeth extracted at the dentist.
It was painful, the amount of checks or whatever that have to happen.

(35:41):
There's no reason why that should happen in today's world.
So I think we need to see consumers putting more and more pressure on the companies that they work with — to have a grassroots movement around self-sovereign identity, and also putting a value on our data.

Speaker 3 (35:54):
We don't actually understand how much it's worth.
Years ago, I used to put up a cartoon in my talks that showed the value we had to people in terms of advertising revenue, and it was in hundreds of dollars or pounds, and you often think: well, why don't I get a cut of that if you're making all this money out of my data?
To your point, I want some of my data back.
I want the enriched data back that I can then use in other systems.
And now we've had that ChatGPT moment.

(36:16):
We can now plug our data into AI to make it useful and make our lives much easier.
So I think people are going to say maybe ChatGPT has done more than we think.
It's that watershed moment where now AI can do things for us.
But we need the right data and we need our own data, and now it's available on a platter if people make it easy for us to access it.
It's about having people be more data literate and caring about

(36:37):
their identity and their safety.

Speaker 4 (36:40):
So we stop the conversation being about privacy and make it about safety.
I want to be safe in the world, so I don't want my data floating out there and everybody's grandmother knowing things about me so that they can steal my identity.
So that's kind of my call-out: for people to start to care about this.

Speaker 3 (36:55):
So we're almost out of time, and we're up to my favorite part of the show, the quickfire round, where we learn more about our guest.
iPhone or Android?
Android.
Window or aisle?
Aisle.
In the room or in the metaverse?

Speaker 2 (37:04):
I work from home, so I'm going to say in the metaverse.
Your biggest hope for this year and next?
That we see self-sovereign identity go mainstream.

Speaker 4 (37:11):
I wish that AI could do all of my meetings — particularly all my prep work and all my actions afterwards, all that mess around meetings that takes so much time for me.
What's the app you use most on your phone?

Speaker 2 (37:22):
Google Maps.
I couldn't find my way out of a paper bag.
The best piece of advice you've ever received?

Speaker 3 (37:26):
Don't be afraid to fail.
How do you want to be remembered?

Speaker 4 (37:29):
That I made our digital world a little bit safer.

Speaker 3 (37:33):
As this is the Actionable Futurist podcast, what three actionable things should our audience do today when it comes to managing their own identity?

Speaker 4 (37:38):
I use 1Password, so I think managing your passwords is really, really important.
Then, be just very alert and very aware when you're sharing data, when you're accepting the check boxes — and I'm the worst in the world for accepting, you know, things without necessarily reading the fine print — but at least be a little bit aware, even if you don't read all the fine print.
Just be conscious of every time you're sharing data, because

(37:59):
that data, if it's with a company you're not familiar with, can be used potentially for questionable things, including stealing your identity, if you're not careful.

Speaker 3 (38:07):
Marie, fascinating chat about a really interesting topic.
How can people find out more about you and your work?

Speaker 4 (38:12):
You know, I'm on LinkedIn — you'll be able to find me there.
I tend to blog.
I have a blog, allthingsanalytics.com.
That's from my analytics days, but I still kept the URL.
So allthingsanalytics.com is me.
Come to my blog, connect with me on LinkedIn, follow me.
I talk about this quite a lot and I'll continue to share more over the next year.

Speaker 3 (38:28):
Great to connect with you again.
I've always enjoyed our discussions.
I really enjoyed this one today.
Thank you so much for your time.

Speaker 4 (38:33):
Thank you, Andrew, great chatting with you again.

Speaker 1 (38:35):
Thank you for listening to the Actionable Futurist podcast.
You can find all of our previous shows at actionablefuturist.com, and if you like what you've heard on the show, please consider subscribing via your favorite podcast app so you never miss an episode.
You can find out more about Andrew and how he helps

(38:56):
corporates navigate a disruptive digital world with keynote speeches and C-suite workshops, delivered in person or virtually, at actionablefuturist.com.
Until next time, this has been the Actionable Futurist podcast.