Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:20):
Welcome to Making Data Better, a podcast about data quality and the impact it has on how we protect, manage and use the digital data critical to our lives. I'm George Peabody, partner at Lockstep Consulting, and thanks for joining us. With me is Lockstep founder, Steve Wilson. Hi, Steve.

Greetings, George, how are you?

I'm very well. I'm glad to be on the road a little bit outside of my normal
(00:43):
office down in New York City. Now, to tee up our conversation today, I want to acknowledge the fact that you schooled me on the fact that privacy and security are very different concerns. But there is a linkage I did come up with connecting privacy and data breaches that might merit consideration by those
(01:04):
who continue to be tempted to vacuum up every data crumb, and that is: if you're scrupulous about data privacy, you collect only what you need and nothing more, which means it won't even be there when you get breached. And for more on that, take a listen to our discussion with Michelle Finneran Dennedy in Episode 11.
Speaker 2 (01:24):
Yeah, we had a great time with Michelle and I'm hoping that our guest today has overlapped with Michelle, the definitive privacy engineer. But, as discussed with Michelle, this is one of these areas where privacy and security overlap so nicely, because it doesn't matter how you look at data, it's best not to stockpile
(01:45):
the stuff without a good reason. I mean, that's privacy 101, and it's also the security risk that everybody's running by having excessive data. You know, they talk about it as the new toxic waste. I'm actually one of those people who likes the "data is crude oil" metaphor. It's very much contested, like all metaphors, but I do love the image of a data spill being much like an oil spill, and the
(02:07):
cost and the angst that it takes to clean it up and, I think, the regulatory imperative that comes with the risks that go with this stuff, crude oil or data or whatever. So look, I can talk some more about context, because this podcast is all about exposing and exploring the multiple facets of data quality and methods to secure data.
(02:29):
There's an orthodoxy. We talk about securing data at rest using encryption, cryptographic scrambling of data; tokenization, one of George's favorite pet topics, a really important technique. Data, of course, is vulnerable in flight, whenever it's moving through a network, when it's moving from place to place. One of my pet issues is the data supply chain and the
(02:51):
transformation steps, the processing steps, the evaluation steps that data goes through from point to point through the network, increasingly complicated and increasingly needing security for data in motion. So look, we've got that covered, haven't we? We've got data at rest, data in motion, pretty well covered by conventional cryptography. But, as mentioned, and we don't want
(03:14):
to get too technical, I guess, but when you encrypt data, when you scramble it, you necessarily make it useless, or you put it beyond conventional use, and one of the challenges that we've had for a long time is how do you protect data while it's in use? We had a great conversation a few episodes back with Mark Anjuna. Mark Bauer of Anjuna, that is. Gosh,
(03:37):
that's a furphy, isn't it? I've known the guy forever.
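To make that three-states-of-data point concrete for readers following along, here is a minimal Python sketch, using the widely available cryptography package. The key handling and data are purely hypothetical; the point is that data at rest and in motion are routinely covered, while using data requires decrypting it.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice this key would live in an HSM or KMS
f = Fernet(key)

record = b"cardholder=4111111111111111"
ciphertext = f.encrypt(record)     # data at rest: safely scrambled on disk
# ciphertext can also travel over TLS, covering data in motion

plaintext = f.decrypt(ciphertext)  # data in use: back in the clear
print(plaintext.decode().split("=")[1])
# While decrypted, anyone with admin/kernel/hypervisor access to this
# machine can read it; that gap is what confidential computing targets.
```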
Speaker 1 (03:40):
I think there are probably days when Mark thinks the name of the company is his last name. It's a good way to go, actually.
Speaker 2 (03:48):
A lot of people think that my surname's Lockstep. So there you go. Without further ado: the latest weapon that we have, the latest piece of our data protection arsenal, is confidential computing, and we are delighted to talk to, shall we say, Mr Confidential Computing. I'm sure he's too modest to take that, because it's a huge effort, but we're delighted to have Mike Bursell, security
(04:11):
consultant, author. We could talk about his wonderful book, but he's the executive director, for today's purposes, of the Confidential Computing Consortium.
Speaker 3 (04:20):
Welcome, Mike.
Thank you very much indeed, Steve, and nice to speak to you. And speak to you, George. And I'm going to start off with just a quick sort of expansion before we go anywhere, which is that, although it's the one I care about the most, confidential computing isn't the only technology to help with privacy and data-in-use protection.
(04:41):
There's a bunch of them, sometimes called privacy enhancing. So, "privacy": I'm a Brit, so I pronounce things weirdly. Privacy enhancing technologies, or PETs. There's things like fully homomorphic encryption or secure multi-party compute or zero knowledge proofs and stuff like that, so it kind of falls in a family. But certainly the thing I care about the most, and I think which
(05:03):
is most usable in most use cases, is absolutely confidential computing. So I probably shouldn't be sort of marketing other technologies, but I'm just going to put it out there and be entirely honest for you.
Speaker 1 (05:14):
It sounds to me as if you've been taken to task for not mentioning those techniques.
Speaker 3 (05:19):
No, no, no, no, no, no, absolutely not. They can actually be really interestingly complementary. I suspect we're not going to have time to go into that in detail today, but there's actually some really interesting use cases where you might use two of them, or use them at different stages in a process or in a system or stuff like that. So, yeah, no, I'm just being a good citizen.
Speaker 2 (05:38):
You are being a good citizen, Mike, and being very humble. We could play good cop, bad cop, because I actually do think that what you're doing is going to be more impactful and more practical than some of those things that you mentioned. Homomorphic encryption, I get it, but it's a difficult thing. It comes with tremendous constraints.
Speaker 3 (05:56):
I agree, the point is really well made about PETs.
Speaker 2 (06:01):
Mike, tell us more about your background and how you got to where you are today.
Speaker 3 (06:05):
How long have we got? So, yeah, my undergraduate degree is obviously in English literature and theology. And I, well, exactly, yeah, it's kind of weird, right. And I took an MBA along the way as well; that was kind of later on. But I left university not quite sure what I was going to do, ended up doing something
(06:26):
called electronic publishing, which was kind of CD-ROMs, and then morphed into the web. And then the web got big, so I moved on, because obviously you wouldn't want to be spending too much time doing anything. And then I became a software engineer, not a hugely good one, but okay, and then moved into security, became a product manager, then became a security architect, and I've kind of
(06:46):
moved between those roles forever, so for 25 years or so now. What I've always found interesting is being on the intersection, on the boundary between the business side and the technical side. I can go deep, low-level threat modelling or whatever, on one side, but on the other side it's about how does this interact
(07:09):
with risk and the business and strategy and all those sorts of things. And I think that it's really important that we've got people in the cybersecurity realm who can look in both directions and talk in both directions, because too often the security people are seen as the people who just say no, right, and that's no good for anybody. So I've really enjoyed that sort of stuff, and so I did all
(07:33):
those things with a whole bunch of people like Citrix, twice, Intel, Red Hat, some other folks, smaller players. And then I created an open source project, co-founded it with a friend of mine at Red Hat, and then we did a startup, which failed, in confidential computing around this open source project.
(07:53):
And just after that, the Confidential Computing Consortium, which was set up in 2019 and we'd been a member of, said: look, we're looking for an executive director. This is beginning to get big. We need to professionalize the consortium. We need someone who can help run it. You know this stuff. Would you be interested? I joined as executive director in about April 2023.
(08:14):
I've been doing that ever since. I do some other stuff on the side as well, but that's my main gig.
Speaker 1 (08:20):
And you wrote a book. That's right. Yeah, you wrote a book, Trust in Computer Systems and the Cloud. And I have to say, how often do I get to read a book that mentions the movie WarGames, the 17th century philosopher Thomas Hobbes, and includes its own playlist? So cheers to that. So, Mike, I've got to say, one of the things I really appreciate about the book is that you've taken that very human concern,
(08:44):
trust, and examined it deeply, in such a way that you take us through the processes of really saying: what are the implications of trust? What do you need to do to be able to claim trust in digital systems? Your definition: that trust is the assurance that one entity
(09:05):
holds that another will perform particular actions according to a specific expectation. Gosh, those are a lot of human words there and, as I say, you express those in digital terms, particularly in the cloud, and a multi-vendor cloud. What was the inspiration for writing this book?
Speaker 3 (09:24):
Oh, I got really annoyed. That's a really simple answer. So I've been kind of thinking about this stuff and authority and how authority works in the digital world and the non-digital world. I've been thinking about that for 20 years or so. And I went to a conference and someone, I can't remember who it was, gave a talk, and I can't remember what the talk was,
(09:46):
but it had the word trust in it, and I thought: this person does not know what they're talking about. And then it occurred to me that nobody really knows what they're talking about, because there's no definition, there's no sort of framework for us to talk about this stuff. People talk about zero trust, and every time I talk to someone about zero trust at a conference or in a seminar or whatever, we're all saying different things using different
(10:08):
words. I thought, well, we need some sort of agreement on what this might mean and what could lead us to it, or things that are bad about it. And so I thought: I'll write a book with a framework trying to bring it all together, and I don't care if the framework's wrong. At least we as an industry have a starting point.
(10:28):
And I looked at stuff like open source and community and trust. I looked at stuff like, yeah, zero trust architectures. I looked at stuff like crypto and cryptocurrency and blockchain generally and trust, and I looked at hardware roots of trust and all of those sorts of things, and tried to put it
(10:49):
all together and walk through what a trust chain might look like, and just give people something to look at and argue about and say: at least we know what we're talking about. And that was why I wrote it, just because I was annoyed, and I'd been thinking about this stuff for 20 years and I thought, well, there's probably nobody else who's got as much in their head about it, and I'll just dump this in a book.
Speaker 1 (11:10):
And I love your reaction to being outraged: write a book. That's terrific.
Speaker 2 (11:15):
But should we move on to confidential computing?
Speaker 1 (11:17):
The thrust of the book points to the value of confidential computing.
Speaker 3 (11:22):
It does. It does, absolutely. Because, as Steve mentioned kind of at the beginning of the podcast, we've been able to do data in motion, data in transit, network encryption for ages and protect data on that. We've been able to do it in storage and on databases or on hard disks or wherever. That's fine. But data in use has been much more difficult,
(11:48):
because the basic classical model of computing and virtualization, which is how the cloud works, is that if you own the machine, if you have control of the machine, or if you've compromised the machine, so if you pwn it or own it or operate it, so if you have administrative access, kernel access or hypervisor access, you can look at and mess with any
(12:13):
data or any applications on that machine. It's just that simple. That is literally the way that virtualization works, and that's kind of fine for maybe my holiday snaps, that's kind of great. But if I'm a multinational company, or even a small company, with customer data or patient data or intellectual property or
(12:35):
credit card information or research data or cryptographic keys or anything, basically: how happy am I, how happy are my regulators and auditors, about that data, those applications, being on the cloud where anyone who has access to that machine can look at it? Now, I'm not saying that Azure and AWS and Google and IBM and
(13:00):
folks are going to look at this stuff, but all you have is organizational process and legal ways to stop them. You don't have any technical way of stopping them, and that's just not good enough.
Speaker 2 (13:15):
You're on a tear, and good on you. There have been technical responses to your point, and you make a very good point. We've had things like hardware security modules in the cloud, which is really just leasing space on a Thales box. Not a bad idea, but very difficult to make it work.
Speaker 1 (13:31):
We've had multi-party computation.
Speaker 2 (13:33):
So there are promises. I mean, I've spoken with some of the big three-letter cloud providers who promise, and I believe them, that thanks to multi-party computation, we, the cloud service provider, can't unpack your data, even if we wanted to. But along comes confidential computing, and I want you to talk to the issue of how do you tame the technology
(13:54):
and make those technology tricks available in the cloud. But first: we started this podcast and we reached out to you, Mike, because we found a shocking lack of awareness of confidential computing as a brand or as a concept. I think it's the future of the cloud. I think it's the most important thing I've seen for a long time. So talk to us about the awareness problem, and what do we
(14:15):
do about that? It is surprising.
Speaker 3 (14:17):
The reason that the Confidential Computing Consortium was created was largely to deal with this issue, right? So the charter of the CCC, as we call it (not the Chaos Computer Club; this is the Confidential Computing Consortium), is to promote usage of confidential computing and encourage open source implementations and projects around it as
(14:39):
well. And it's a part of the Linux Foundation, so it allows a safe space for even competitors to get together and talk about this stuff and work out ways to market together or do technical stuff. So Intel, Arm, AMD, Nvidia, Huawei, they're all members of the top tier, the premier tier, and they can all have these
(15:00):
conversations and we can do stuff together. But, as you say, there's this shocking lack of awareness of the technology, and it's kind of weird, because you can use this technology in all of the major clouds and most of the minor ones these days, because it just requires them to have systems with a specific set of chips.
(15:24):
There's a whole bunch of AMD chips, a whole bunch of Intel chips, there's a whole bunch of Nvidia chips these days, and there's some ARM-based chips coming soon as well, and you just turn it on. So why people don't know about it is kind of weird, actually. Using it has only just become easier, I think that's fair to say as well.
(15:45):
So the maturity of the technology for just your standard user is only just coming on. And there's one other really interesting and quite complex piece, which I'll try and explain simply, which goes by a nice word, which is attestation. Without attestation you've got a problem. So let's say I want to use one of these chips and I'm going to
(16:09):
create what's called a TEE, a trusted execution environment. It's a way to do your computing using these chips so that anybody who has access to the machine can't look into it, even if they're admin, kernel, hypervisor, all that sort of stuff. So it's protecting your data and your application. So you want to use one of those. So I say to you: George, George, George, you are my cloud
(16:30):
provider, can you set one of these up for me? And you say, of course I can, there we go. And I put all of my stuff into your TEE. And I suddenly think: wait a sec, the whole reason I'm using this thing is I don't trust you, George. I don't trust you not to look at my stuff. So how can I trust you to say you set this thing up to protect
(16:51):
me correctly? And that's... oh dear, that sounds like a flaw. And so the answer is, I can actually ask the chip, which created this for me, rather than you, George, to give me a measurement, a cryptographic measurement, of this thing. It's kind of like a signed thing to say this has been set up correctly, and I then need to check that that's correct.
(17:13):
And if it is correct, well, that's good, because it means that George can't have messed with it. So you need a way to do attestation, and that is tricky, and if you don't do it just right, you've kind of lost the whole point. And part of the problem here is that I've got to make sure that the person who's checking the attestation for me is also not...
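To make the flow Mike just walked through concrete, here is a heavily simplified Python sketch of attestation. Real schemes (AMD SEV-SNP or Intel TDX quotes, for example) involve certificate chains and much richer evidence; every key, name and value below is invented for illustration only.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- on the chip: measure the launched TEE and sign that measurement ---
chip_key = Ed25519PrivateKey.generate()      # stands in for a burned-in device key
tee_image = b"my-application-image-v1.2"
measurement = hashlib.sha256(tee_image).digest()
quote = chip_key.sign(measurement)           # the "signed thing" Mike describes

# --- on my side: check the chip's signature AND my own expectation ---
chip_pubkey = chip_key.public_key()          # really obtained via the vendor's cert chain
expected = hashlib.sha256(b"my-application-image-v1.2").digest()

try:
    chip_pubkey.verify(quote, measurement)   # a genuine chip produced this
    if measurement != expected:
        raise ValueError("not the code I asked for")
    print("attestation OK: George hasn't messed with it")
except (InvalidSignature, ValueError):
    print("do NOT put data in this TEE")
```

Note the two separate checks: the signature proves the quote came from real hardware, and the comparison against an expected value proves the TEE contains what you intended, which is exactly why who verifies the quote matters.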
Speaker 2 (17:35):
George. Thanks, Mike. You sort of read our mind. That TEE is the Trusted Execution Environment which, as I understand it, is more and more a built-in feature of those big processors that you're talking about.
Speaker 3 (17:47):
It is, yeah, absolutely. The ARM and...
Speaker 2 (17:49):
Intel. And attestation, so that we have an independent assurance of the state of the software, the state of the keys, the configuration. Nice. If you think that you're running Elliptic Curve 256, is it really the algorithm, properly implemented?
Speaker 3 (18:04):
Even more than that, I would say. Not only has it been set up correctly, but the software that I think I put in it is what I expected, which means that I can start doing some of that multi-party computation and collaboration that we talked about before, which is really interesting.
Speaker 2 (18:21):
I wanted to underline the two things that I think are so important and complicated: things like having a good chip, and having good attestation of things that are just so difficult to manage. So the consortium, the CCC, comes in to manage some of that. It's a great story. But tell us more about what's required on the enterprise side to tool up for this sort of thing. I mean, how do I engage with this if I might already have my
(18:44):
workloads in a particular cloud service? Is it much of a pivot to take advantage of confidential computing?
Speaker 3 (18:50):
And I think this is one of the areas where we're trying to help people understand what needs to happen. And the answer is: it depends how you want to do it. If you want to sort of put existing workloads into a TEE, then as long as you've got an attestation mechanism, it's actually fairly easy to do that. If you want to be building a new application from the ground
(19:11):
up to take advantage of these new technologies, then it's going to be a bit more complex, and there are open source projects to do this, lots of startups, there's other people doing it. So, for instance, Microsoft has moved all of its credit card processing, $22 billion worth a year for Azure, into confidential computing.
(19:31):
Good example: one of the big multinational anti-human trafficking agencies is using confidential computing to keep data safe and so that it can't be changed as well. So this is one of the things: putting your existing stuff in it kind of works, but once you start thinking about how
(19:54):
this stuff could be used, it kind of changes the way you think about computing. The primitives that you have to play with, to use data, to share data in ways that you can be sure are correct, make you start thinking: well, actually, I should probably be architecting my applications differently.
(20:15):
And then, once you start doing that, obviously it's a longer journey. So everyone's talking about AI at the moment, right? What does AI have to do with confidential computing? Well, there's kind of two areas where you might want to use it. The first one is at sort of the end. So you've got your model, it's all trained, you want to use it, right?
(20:35):
So I want to engage with an AI model which is hosted by, I'm going to pick on George again, by George, right. But I may be asking questions of that model that I don't necessarily want George to know about, right? It might be stuff about domestic violence, for instance. Or it might be a proprietary model which is hosted by George,
(20:57):
but it has information that's proprietary to me. So not only do I not want him to know what I'm asking of it, but I also don't want him to know what the answers are. I also don't want him to change the data in that model, right? And if I run that model in a confidential computing, a TEE, world, then that's the sort of isolation that allows me to have
(21:20):
those assurances. So that's kind of one end. But also, at the other end, there's a lot of concern at the moment about what data you're training your models on, right? And one of the things you can do with confidential computing, and attestation is very, very important here, is to track what data's being used and prove, when
(21:42):
you get to the actual usage of the model, that that is the only data that's being used. You can also use it for things like supply chain, to track exactly what's gone into your supply chain. You talked about supply chain earlier. So it just makes you think about it in rather a different way, and that's really exciting. But it takes companies, particularly in an economic recession, a while to think about: how could I be doing new
(22:04):
stuff in different ways? So at the moment, very soon, in the next month or so, we're going to be publishing a set of use cases, a white paper of use cases from some of the members, to give people an idea of how they might use these technologies.
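A toy sketch of that training-data idea, with an invented manifest format: hash each input dataset so that an attested training job could later prove exactly which data went in. Real systems would bind this manifest into the TEE's attestation report rather than just printing it.

```python
import hashlib
import json

# pretend these are the only datasets fed to the training job
datasets = {
    "consented_patients_2023.csv": b"...dataset bytes...",
    "public_trials.csv": b"...dataset bytes...",
}

# one hash per input; bound into an attestation report, this makes the
# claim "trained only on these datasets" independently checkable
manifest = {name: hashlib.sha256(blob).hexdigest() for name, blob in datasets.items()}
print(json.dumps(manifest, indent=2))
```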
Speaker 2 (22:19):
Thanks for speaking to the mental models too, because I think that computing, you know, cloud, is necessarily a fuzzy idea. I mean, it's in the name. And virtualization, I think, has led people to be, I don't want to say lazy, but pretty laid back about where the data is and who else has got access to it. We tend not to think about that anymore. I don't want to go to ownership per se, but I think control and,
(22:40):
you know, leasehold over data is the issue now. We are entering into an area, where the mental models are about who's got access to my data and how can I think about that, where confidential computing is breaking ground.
Speaker 3 (22:53):
If you're a hospital, and a pharmaceutical company comes to you and says, look, we want to run some models and modelling on your patient data, give us all your patient data, you're going to say: I don't think so. And HIPAA, or whoever regulates you, is going to say: I don't think so either. Right? But if the pharmaceutical company can come
(23:16):
up with an application and prove to you that that application won't exfiltrate your data, it'll only do certain processes on it, and then run that in a confidential computing environment, then you can be sure that the data you're sending will only be used in the way that you expect it to be. So this is the sort of collaborative use, and
(23:38):
you could combine that with other data from other hospitals and not worry about the wrong people seeing it, or it getting leaked to the wrong people. So this is why things kind of change in how you think about it. You could use this for fraud management. You could use it for oil and gas exploration. We're seeing people talking about this for space usage. You've got microsatellites, or edge use cases, where you want to
(24:02):
combine this stuff while making sure that it can't be tampered with or can't be seen where it shouldn't be. It's just new ways of thinking about it, and that's why I get so excited about it and what it does. And you used a word really early in this podcast, risk, Steve, and that's why I care about it, because this allows you to change your risk
(24:22):
profile. Before, putting stuff in the cloud set you up with a whole bunch of sets of risk: that George is going to mess with it if he's the cloud provider (I keep picking on you, George, sorry); that if that machine is compromised, there'll be problems. There's no perfect security, but it raises the bar for that particular set of risks, and allows you to think: you know
(24:43):
what, maybe there's stuff that I could put in the cloud now that I could only do internally before, because my risk profile has changed because of these security technologies.
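Here is a sketch of that hospital scenario in Python, with hypothetical names throughout: the hospital releases data only after verifying that the attested measurement matches the application build it audited. verify_quote is a stand-in for a real attestation verification service, not an actual API.

```python
APPROVED_MEASUREMENT = "9f2b..."  # hash of the application build the hospital audited


def verify_quote(quote: dict) -> bool:
    """Placeholder for real quote verification (vendor signature, freshness)."""
    return quote.get("signature_valid", False)


def release_patient_data(quote: dict, data: bytes) -> None:
    if not verify_quote(quote):
        raise PermissionError("attestation quote is not genuine")
    if quote["measurement"] != APPROVED_MEASUREMENT:
        raise PermissionError("TEE is not running the application we audited")
    # only now does data leave the hospital, into the attested TEE
    print(f"releasing {len(data)} bytes into the attested TEE")


release_patient_data(
    {"signature_valid": True, "measurement": APPROVED_MEASUREMENT},
    b"patient records ...",
)
```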
Speaker 1 (24:54):
I'm excited to hear about this, Mike, because it strikes me actually that I'm hearing the inklings of competitive differentiation based on security. I'm hearing that there are opportunities for re-architecting business processes which would give the architect, and its compliance, a competitive differentiation over
(25:19):
another company or another provider. And it's been so rare: security has always been this sunk cost as far as enterprise is concerned. But, Steve and I, as we talk about how do we make data better, how do we put an economic model underneath data, drive investment to drive a marketplace, to drive really
(25:43):
strong businesses around data provenance, data custodianship, as opposed to today's current one, which is either internet advertising on the one hand, or hoovering up every data crumb, repackaging it and selling it back out.
Speaker 3 (26:00):
So I want to talk about the vacuum cleaner problem, which is kind of a weird one, but what I mean by that is that I don't know anybody who buys a vacuum cleaner because they want to. Everyone buys a vacuum cleaner because they have to, right? It's just one of those things. It's got old, you've moved to a new house, you have pets now,
(26:21):
whatever, but you buy it because you have to. And security is often kind of like that. We need to change security from being a vacuum cleaner to something which adds value to you, allows you to create value, and this, I think, is a technology which absolutely allows that. Steve, you mentioned earlier on supply chains, and I just
(26:42):
thought it might be nice to come back to that again briefly, because we've had just recently this XZ problem, or "ex-zed" if you're a Commonwealth person, right. And people may not be aware of that, but it was, is, an attack on basically the Linux operating system and ecosystem supply chain, by people
(27:02):
taking a long time to sort things out and doing long-term attacks, what we call an APT, I guess. It's over two years these things have been happening. We need to think about how we imbue trust into the supply chain, and confidential computing does not fix all of the problems, but it allows you to fix some of the problems, because
(27:26):
it allows you to build your piece of the supply chain in a confidential computing environment where you can know what the build environment is. You can pass that assurance, that attestation, through into the supply chain, through your SBOMs or whatever
(27:48):
you're using to do that, and it allows you to start having more assurance in particular points in the supply chain. And I think that we've got a long way to go on supply chains. There's things like in-toto which are helping us with this. But if we come back to trust and the book: you gave the
(28:09):
definition, but there's three corollaries, one of which is that trust is always contextual. So when I do a build, and I sign that build as the maintainer or whoever it is, am I signing to say: yes, this was the software I think it was that has been built? Am I signing to say: yes, the build system is what it should
(28:31):
be? Yes, the hardware is what it should be? Yes, the implementation is correct? Yes, the underlying cryptographic algorithms, for instance, are correct? You need to contextualize every single thing there, otherwise you start having gaps. And once you can contextualize that, and then you can put assurances in place that that has been checked,
(28:54):
and that flows up or down the supply chain, whichever way you look at it, things start changing and you start to differentiate, again, George, to your point, the offering that you provide.
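As a rough sketch of "trust is always contextual" applied to a build, loosely in the spirit of in-toto-style attestations: instead of one ambiguous signature over "the build", sign a statement that names each context being vouched for. All field names and values below are invented for illustration.

```python
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

artifact = b"compiled-library-bytes"

# each field is a separate, explicit context the signer is vouching for
statement = {
    "artifact_sha256": hashlib.sha256(artifact).hexdigest(),
    "source_repo": "git+https://example.org/somelib.git@v1.0",
    "build_env_measurement": "sha256:abc123",  # e.g. from the build machine's TEE attestation
    "builder": "maintainer@example.org",
    "claims": ["source-matches-repo", "build-environment-attested"],
}

maintainer_key = Ed25519PrivateKey.generate()
signature = maintainer_key.sign(json.dumps(statement, sort_keys=True).encode())
print(statement["claims"], signature.hex()[:16], "...")
```

The design point is that the signature covers the named claims and nothing more, so a verifier downstream knows exactly which of Mike's questions ("the build system is what it should be", and so on) has actually been answered.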
Speaker 2 (29:07):
Yeah, you're telling the story behind the data, or you're providing infrastructure that allows stakeholders to tell the story behind their own data.
Speaker 3 (29:13):
Yeah, and the applications as well. What is data? What is application? At this point, you know, everything is data. It's all ones and zeros, until we start going quantum. But that's a whole bunch of other questions, right?
Speaker 2 (29:25):
It's going to change our mental models.
Speaker 3 (29:27):
Oh yeah, I know, my brain already hurts whenever I think about it. Yeah. So, as you can tell, I'm really passionate about this stuff, and for me it's this combination of security, trust, risk and privacy, which you brought up right at the beginning, George, and how these interact and what it allows us to do when we think of
(29:50):
them in new ways that allows us to build new things, differentiate our businesses, our projects, whatever we're doing. And I think that's really exciting. And, as I say, at the moment it's: why would you use these technologies? In 5, 10 years it's going to be: why wouldn't you?
(30:11):
Right, it's going to be: if you're not using this, I'm suspicious of you. And that's in the same way that it used to be: why would you bother with HTTPS? Now, you would find it very difficult to find a website which doesn't have HTTPS, and if it doesn't, you start worrying. And that's where we should be with confidential computing.
Speaker 1 (30:32):
So, Mike, let's leave it there. But before we let you go: what do you see as the next frontier in handling data?
Speaker 3 (30:40):
It's being able to trace it and its supply chain, and having a knowledge of who's interacted with it, when and how, so that we think of data not just as contextless but full of context, and I think that changes the way we think about a
(31:01):
whole bunch of stuff. I think open source is taking over the world, but the world hasn't quite understood the business ways to understand that yet, and we're beginning to address this, and I think this is one of the ways that that happens too.
Speaker 1 (31:16):
Well, that's terrific. I know Steve and I couldn't agree with you more.
Speaker 2 (31:19):
Tell us about your next steps. The organization has a bit of an event coming up, I think.
Speaker 3 (31:25):
Yeah. So we've got the Confidential Computing Summit coming up in June, I believe early June, in San Francisco. In fact, I'll be speaking at RSA in San Francisco in just a couple of weeks' time, so maybe people will make it to that. But yes, the Confidential Computing Summit in San Francisco: we would love to see you there.
(31:45):
Come and talk, ask us questions, and we'll take it from there.
Speaker 2 (31:49):
And here's another plug. Between RSA and the Confidential Computing Summit, we'll see you at Identiverse for a panel on confidential computing and the future of identity. Indeed, I'm really looking forward to that. This is hot stuff, Mike. Thanks for sharing, thanks for demystifying, and hopefully we're raising awareness of this thing. It's the future of the cloud. Thank you. Thank you.