
January 29, 2024 37 mins

Data provides the basis for how we make decisions. An enemy of security these days, from our point of view, is plain text. We need better than that. We need device-assisted support for proving where data comes from and how it's been handled. We need systems that keep data (and code) from being altered without cause, that give us the ability to trace the change history of data. 

Confidential computing is a new compute paradigm that provides a hardware-based foundation for running code and the data it manipulates. It safeguards data and code (it's all data; it's all code) in its most vulnerable state: while it's being processed.

In this episode of Making Data Better, Steve and George are joined by Anjuna's Mark Bauer to dive into this new model's high impact on security and low impact on cloud app development.

Mark dissects the mechanics behind this approach, including how it strengthens the software supply chain through hardware-based attestation. He addresses its fit in modern cloud infrastructure, including Kubernetes, data loss prevention (DLP), API scanning and more.

The conversation addresses the initial major use cases for confidential computing. High-risk environments including defense, banking, and healthcare are obvious. Not so obvious is securing multi-party data sets in the cloud for machine learning and AI-based applications.

So take a listen to this episode of Making Data Better and learn how hardware-based security can harden the cloud. 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:22):
Welcome to Making Data Better, a podcast about data quality and the impact it has on how we protect, manage and use the digital data critical to our lives. I'm George Peabody, partner at Lockstep, and thanks for joining us. With me is Lockstep founder, Steve Wilson.
Hey, Steve.

Speaker 2 (00:39):
Good day George.

Speaker 1 (00:39):
How are you going? Good to see you. Happy New Year. Here's to a great 2024 for all of us. So, Steve, today we're going to talk about an important technology that has much to offer for those concerned with security but, at least as far as I'm concerned, is not well

(01:00):
understood or even known. It's called confidential computing. And we're talking about this on our podcast about making data better because data is the basis on which we make decisions, and we need systems that keep data from being altered without cause, that give us the ability to trace the change history of data.

(01:22):
And, as you and I both talk about all the time, the enemy of security these days is, from our point of view, plain text. We need better than that. We need device-assisted support for proving, well, where data comes from and how it's been handled. So here's this new notion: confidential computing.

(01:43):
We went to this trade group called the Confidential Computing Consortium, and here's the definition they posted. It says confidential computing protects data in use by performing computation in a hardware-based, attested, trusted execution environment. We're going to get to all those terms. These secure and isolated environments prevent

(02:05):
unauthorized access or modification of applications and data while in use, thereby increasing the security assurance for organizations that manage sensitive and regulated data. So you can see sort of the domain here. I know, Steve, you're going to like it, and I like it too, because it's converging software and hardware. We think the two need to go together when we're talking

(02:27):
about security. Absolutely.

Speaker 2 (02:30):
I love confidential computing, George. It's an idea whose time has finally come. We've had a go, as an industry, at hardware-based security for a long time. Listeners would be familiar, and I'll try not to get too technical, but most people would be familiar with the idea of encryption and the concept that we need encryption over data when it's in storage, at rest, and we need encryption

(02:53):
over data when it's moving, when it's in motion. Now confidential computing is about yet another idea: using encryption over data when it's in use. There are a number of ways of doing this. There's homomorphic encryption, there's secure multi-party computation. I think confidential computing is the best go that we've had at

(03:14):
this so far, and I'm really looking forward to diving into some of those details. I'm just going to flag attestation, the idea that somebody's got your back about the quality of the data, the quality of the stories, the quality of the algorithms, and the quality of the computing environment that we're talking about. It's a bit Wild West at the moment. There's all sorts of secure chips and enclaves and secure

(03:35):
elements and God knows what. How do you know that any of this hardware is actually in a proper state? How do we know that it's fit for purpose? That's what we're going to get into today, and how do these properties, these characteristics, become available to the enterprise?

Speaker 1 (03:51):
To take us deeper into this topic, I'm delighted to welcome Mark Bauer, who is VP of Product at confidential computing company Anjuna. Mark joined me on episode three of Making Data Better, where I asked him to set his wayback machine to the aughts and his role in helping Heartland Payments increase the security of its systems after a

(04:13):
card breach. Mark, I'm delighted to have you come back to Making Data Better and to talk about what you're actually doing today.

Speaker 3 (04:22):
Yeah. So, George and Steve, good to meet you again, and thank you for having me back. By the way, I'm happy to be here with everyone too. So, yes, I think Steve's absolutely right. Confidential computing is a relatively new technology, though the concepts have been around for some time. But here at Anjuna we essentially have the vision that anybody, be it

(04:42):
enterprises, software companies, SaaS providers, should have access to this technology, because of the bar-raising capabilities that it has in reducing risk. And if you think about all of the breaches that we've had over the many years, they often start in memory or in software that is vulnerable, and being able to reduce those vulnerabilities really has to come down to taking it down to

(05:05):
the hardware itself, which is what confidential computing is about.

Speaker 1 (05:12):
Wow. So before we get into all of that, Mark, tell us a little bit about your backstory. How did you come to this role and interest?

Speaker 3 (05:26):
Yeah, so it actually goes back to the story that we told in the previous episode, where we think about the Heartland breach and how that was resolved and how it came about. It was ultimately systems that were vulnerable, attackers getting in, getting access to things that they should have had zero access to at the end of the day, and exploitation of vulnerabilities that resulted in hundreds of millions of credit

(05:49):
cards being exposed. And the company, Heartland, reacted in a really positive way to turn that from being a potential disaster into a change that the industry had to embrace, which was really introducing hardware to protect data as it's acquired from the card readers themselves, which were always quite secure but weren't used in a way to protect card data, and then

(06:11):
keeping that data secure all the way to the backend systems. And so that was in the days of what we thought of as end-to-end security for that card data. When you think about the problems we're dealing with now, when organizations are embracing machine learning, massive amounts of data, video feeds, live audio coming in, you need a better way to handle not just the data but the code

(06:33):
itself. And so I'd been watching this technology come about over the last few years. I'd had it on my product roadmaps in the past, and I'd spent time at Amazon in the hardware security group as well. And so when I saw the opportunity to jump into a vendor that was spearheading the development of this technology at a very early stage in the market, I jumped across, because

(06:54):
it is a shift in the way compute will take place, and I think within five years, and it could be even sooner than that, it'll almost become the default way we compute, whether we like it or not, because of the demands on security and the risks that we're dealing with and the regulations and the privacy requirements that people ultimately expect. So I think there's a lot of change coming as a result, and I

(07:17):
want to be part of that.

Speaker 1 (07:19):
Cool. So this is actually kind of hurting my head, because I'm trying to square how we do business today, how somebody... the folks who've been breached in the past, they just store data. There's no such thing as data they don't like.

Speaker 3 (07:41):
Right, this is a data opportunity.

Speaker 1 (07:45):
Right and, as we know, the downside is it's also an opportunity for fraudsters. From a breach to liability, it's an opportunity for fraudsters to steal it. Where does confidential computing come in? I mean, we think data minimization is a great idea, but I take it that you're applying confidential computing

(08:05):
principles to the storage of all that data.

Speaker 3 (08:09):
Well, when you think about it, we've had, and Steve alluded to this, we've had data-at-rest protection for many years, and today you almost wouldn't go into a cloud or even into your own data center without data-at-rest protection. For the most part, it's something that, if it's there, it should be turned on. In fact, we've abstracted it to the hardware in a lot of cases. Same with data in motion. Data in motion is about the integrity of the data, it's

(08:32):
about the confidentiality of the data, and it's about making sure you know who you're talking to at either end. But all too often, all of those environments are still processing data in memory during use. So you read data from a database into memory to process it, and now that data is in the clear, and it's remarkably easy, if you have the right level of privilege, which you can gain

(08:53):
from malware and from vulnerable software, to get access to that memory. And there have been many attacks as a result of that that have compromised systems, because you can pull down things like threads. You can pull down personal data straight out of memory. You don't have to attack systems. Or worse, we see encryption keys being stolen out of memory. You know, maybe it's a memory dump that then gets moved to

(09:15):
another system, and that key then gets extracted by an attacker in a lower-trust environment and used to decrypt data that's in a production environment. And so the data-in-use problem is very, very real, and it's been very difficult to solve until you take it to a hardware-based approach, to what

(09:37):
is essentially confidential computing. At its simplest, it's essentially taking the memory, encrypting it during use, restricting the processing to processes that themselves can decrypt that content. At the very core, it's encrypting the registers, the

(09:59):
caches, everything else, so that you're shrinking the attack surface to only the computational thing, the CPU core itself, which also has some security properties, so that you're eliminating unauthorized access to memory. And if you can do that, then data that is protected at rest

(10:20):
or transported out of that environment can also be protected from within that most critical place, essentially the CPU environment itself. And so that's fundamentally what it is about at the highest level.

Speaker 1 (10:32):
So you're taking advantage of the hardware that's built into the CPU. The trusted... the TEE, the trusted execution environment.

Speaker 3 (10:42):
Yeah, very much so. In most modern processors now, or in some of the hyperscalers that have their own proprietary versions of this, there are instructions that have been added to the processors to more or less isolate the processing. So either creating a virtual machine instance that is isolated from the rest of the processor, instrumenting a

(11:03):
process itself which cannot be seen by any of the other cores, not even by the rest of the operating system, almost, or creating an isolated instance that is locked down. And so you have techniques like memory encryption, isolation, and then things like process isolation. Plus, on top of that, as Steve mentioned, attestation, which

(11:25):
I'll get to. But in essence it's new capabilities in the processors that give you what were typically the kinds of things you'd do using, say, hardware security modules, very bespoke pieces of kit that you would rack and stack for things like payments encryption and key management, but to do that for more general-purpose computation. And so that gives you the

(11:48):
elasticity you need, the scale that you need, as opposed to running in a very confined hardware box that sat over here on the left and did one thing well and secured it well. Now you can run any application in this kind of environment, with the right enabling software and so on.

Speaker 1 (12:03):
And are you able to virtualize the TEE, then? Essentially, yeah.

Speaker 3 (12:12):
So there are hardware capabilities, and there are mechanisms called hardware roots of trust, and there are hardware modules in the processors that handle the encryption, that offload the encryption of memory. So it's not done on the regular processor; it's all offloaded, so it runs at full speed, if you will. But the notion of having hardware roots of trust

(12:36):
gives you this ability to always come back and prove that the software that you're running is in this trusted environment, which gives you a different way to start to prove that you're running in an environment where you can at least establish there was an acceptable level of security before you started to use it, which is something that you just don't get with regular

(12:58):
processors at all. You make an assumption of trust.

Speaker 2 (13:01):
So what's different now, Mark? We throw some jargon around, like trusted platform modules and secure elements and hardware security modules. I think most people have got a sense of this. A hardware security module is like a shoebox-sized piece of kit that runs in a computer rack, and it costs $50,000, and it's a great idea. But unless you're a bank, you can't afford it.

(13:23):
We've had commoditized versions of this before. The trusted platform module was supposed to be a chip on the motherboard of every PC, and in fact most PCs have got them, but they're not sort of turned on. What's happened (leading question) to make CC accessible, confidential

(13:44):
computing accessible, where we've had all these dead ends before?

Speaker 3 (13:50):
Yeah, so confidential computing itself has actually been around for more than five years. In fact it started probably almost a decade ago, with very early processors that tended to look more like the hardware security module: small memory footprints, limited capabilities, very complex to use, just like HSMs, in fact.

(14:12):
These days, though, that's changed. Think about the demands of modern workloads. So you're running an AI workload that needs to process billions of points in a model, or a core banking system that you want to run in these kinds of environments. Then you need the same computation capability that you get with the regular instances that you have in the cloud.

(14:32):
So I need 200 processors, I need 10,000 processors; the technology is scalable to that level, number one. There's also been a kind of... you know, availability is huge now. Nearly every AWS instance has what's called AWS Nitro Enclaves, which is their term for confidential computing.

Speaker 2 (14:51):
It's extensions to run isolated workloads, and so this is much more than what Amazon and others have had, like cloud HSMs, for a long time. It's going beyond that, isn't it?

Speaker 3 (15:03):
Yes. You know, the HSMs are great for storing keys, signing transactions, like PKI. So I've got a document that I need to sign with a digital certificate in a high-integrity environment; it's great for that. It's not great if you want to run core banking in one of those things, because they're just not designed for it, and you'd have to re-architect the app and everything about it. Proprietary operating systems... it's a mess.

(15:25):
These days you want to run Kubernetes applications, and to run them in an isolated environment so that insiders have no access, so your admins have no access to them.

Speaker 2 (15:34):
Without re-architecting them in some special scripting language.

Speaker 3 (15:37):
Exactly, exactly, and that's what our role here at Anjuna is around: making this very, very simple, so that you're not only taking advantage of this environment because it's there and you can, which is just a good thing generally, you know, better security overall. But it starts to get into that problem of how do I trust something before I use it?

(15:59):
And if you think about that, when you go to a cloud today and you get an instance, it's like getting a server. You assume that the BIOS is high integrity and has no backdoors. You assume that the hypervisor you're using is good. You assume that the operating system is good, and you look at the certifications that say this organization went through PCI, HIPAA, SOC 2, etc.,

(16:20):
which is an assessment done by a human at some point in time. But it doesn't give you a way to actually measure, as in prove mathematically, that this is a piece of hardware that meets this level of security and has this level of firmware and BIOS and microcode, so I know it hasn't been tampered with. And confidential

(16:42):
computing lets you do those things. So you can ask the hardware: tell me what state you are in, and prove it. Tell me what software you're running, and is it the same software that I built in my secure environment over here in the data center, so I know that it hasn't been tampered with. So think of SolarWinds, that attack where there was

(17:06):
manipulation of compilers and libraries being inserted into the supply chain. You could prevent that kind of situation by showing that this code has been tampered with before I run it, and I can prove that, and the hardware can tell me. It's not a piece of software that could be manipulated; it's the processor itself that can prove that. That's a game changer

(17:27):
when it comes to real things like zero trust, where we don't just rely on a one-way assumption of trust.

Speaker 2 (17:34):
Now remind the audience: SolarWinds was the so-called software supply chain attack. It was almost like a black swan event. It shouldn't have been. I mean, we should have known all along that software is incredibly complicated. It's got its own life story, it's got its own supply chain, and what happened with SolarWinds was that elements of that supply chain... the attackers were very smart. They found the software module sub-providers, the

(17:56):
subcontractors that were most vulnerable, and they attacked those. So then the software modules come back into the mainstream, everything's recompiled, everything runs, and you've got that vulnerability lurking in the supply chain that has been exploited. That is really why George and I are so interested in confidential computing. Foundationally, because it helps us tell the story behind

(18:18):
the data. It helps us, you know, code is data, and it helps us improve that confidence in the backstory of the code and the data that we're all depending on.

Speaker 1 (18:29):
Yeah, exactly, it's all about... Are you taking some fingerprint of the compiled code on the development system, and then using that to compare to the running code that's inside of your system?

Speaker 3 (18:50):
It's a little more than that. So when you think about all of the things that you can measure: what constitutes something that you're going to run? Well, you've got things like the bootloader, the firmware, you've got the microcode version in the processor, then you've got the actual code itself for the application, which may be, you know, a containerized application, multiple things, and then you've also got the initial

(19:12):
memory state that you expect it to start from. All of those things constitute essentially what you expect to run. So when you've gone through your build process, and typically you'll have good security practices around the build process, you've got to make sure that what you're building is actually what you're running, and that you don't have an unexpected hypervisor that's actually leaking data out to a

(19:35):
third party, that you don't have an operating system with a backdoor that you didn't expect to be there, or that you're not using something that's not in the build process, that has a vulnerability that somehow crept in, or malware that's been injected somewhere. So, essentially, you're taking secure hashes of all of those components.

(19:55):
You know, you can choose which ones you want to do, depending on your sophistication and risk tolerance, but it's all about making measurements of things, computing hashes, and then asking the hardware to compute those hashes for you. And it will give you those measurements back, and you can then look at those measurements and compare them to what you built. And the upside of this, too, is that the measurements on that

(20:18):
software, and the fact that the hardware has digitally signed them with a key that's essentially embedded at manufacture time into the secure processor, it can sign that, so this then becomes evidence of what you're running. And that evidence can also then be used as identity, like a machine identity, to then do things like pull a secret into that

(20:40):
application. And so the old way of having to give a cred to an app after you start it, and then hope that it's in a secure environment and hope that that cred doesn't get stolen by an insider, that kind of threat can be eliminated.
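Mark's build-time versus hardware-reported measurement comparison can be sketched in a few lines of Python. This is a hypothetical illustration of the comparison step only; the component names, digest choice, and flat report format are assumptions, not a real TEE API (SGX, SEV-SNP, TDX and Nitro each define their own attestation report structures, signed by the hardware).

```python
import hashlib

# Hypothetical sketch: compare measurements computed at build time against
# the measurements the hardware reports (and signs) at attestation time.
# Component names and the flat dict "report" are illustrative only.

def measure(path: str) -> str:
    """SHA-384 digest of one build artifact (bootloader, kernel, app image...)."""
    h = hashlib.sha384()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_measurements(expected: dict[str, str], attested: dict[str, str]) -> bool:
    """True only if every component measured at build time matches what the
    hardware says is actually running inside the enclave."""
    return all(attested.get(name) == digest for name, digest in expected.items())
```

In a real deployment the attested values come out of a signed attestation report, so a mismatch means either the workload was tampered with or the evidence wasn't produced by genuine hardware.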

Speaker 2 (20:55):
And that means credential.

Speaker 3 (20:57):
Everything's got a credential these days.

Speaker 2 (21:00):
Yeah, yeah, thank you .

Speaker 3 (21:02):
Yeah. So the power of attestation comes down to proving trust in something that you're running, and then also using that information to bind the identity to other things, like a key management system, so you can then pull in secrets or pull in configurations. If you can do all of that automatically, then confidential

(21:24):
computing becomes very, very seamless, and you can instrument it into your dev processes, as opposed to building encryption tools or having to turn things on at the application level. That's the way we were doing it until today.
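The attestation-as-identity flow Mark describes (evidence signed by a key embedded at manufacture time, then used to gate secret release) might be sketched like this. It is a toy illustration only: the HMAC and the stand-in key below play the role of the hardware's fused signing key purely so the sketch stays self-contained, whereas a real key-management service verifies a signature chain rooted in the CPU vendor's certificates (AMD, Intel, Arm).

```python
import hashlib
import hmac

# Stand-in for the signing key fused into the secure processor at manufacture.
HARDWARE_KEY = b"key-fused-at-manufacture"

def hardware_sign(measurement: str) -> str:
    """What the secure processor does: sign the measurement of the running code."""
    return hmac.new(HARDWARE_KEY, measurement.encode(), hashlib.sha256).hexdigest()

def release_secret(measurement: str, signature: str,
                   policy: dict[str, bytes]) -> "bytes | None":
    """Key-management side: hand out a secret only to a workload whose attested
    measurement is both authentic (signed by the hardware) and on the allow-list."""
    if not hmac.compare_digest(signature, hardware_sign(measurement)):
        return None                 # evidence wasn't signed by the hardware
    return policy.get(measurement)  # None if this workload isn't approved
```

This is how the "old way" of handing a credential to an already-started app goes away: the secret only ever materializes inside a workload that has cryptographically proven what it is.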

Speaker 2 (21:34):
Nice to trust, but it's better to verify.

Speaker 3 (21:37):
Ah, hold that: it's trust but verify. But the question is, well, how do you really verify if you can't verify the thing that's verifying, which is software for the most part? And so you abstract that to the hardware. And you do still have to trust the hardware, yes, but we trust AMD, we trust Intel, we trust Arm. You know, we trust the manufacturers, and they have very, very strong processes over this, and the trust mechanisms in

(21:59):
that process.

Speaker 1 (22:02):
Mark, what's the process of deploying this? It sounds complex, and it sounds... I mean, my experience is that when encryption gets involved, there's overhead, even just from a processor. Overhead, you know, there's a performance hit, for those of us who think about CPU performance. So how do

(22:23):
you sell around that?

Speaker 3 (22:28):
Well, the thing is, you don't have to, because of the modern processors. If I look at, say, an AMD SEV processor and Intel TDX (these are the very current processors, and TDX is also fairly new), the benchmarks of these show that the performance impact of the full confidential computing capabilities is somewhere on the order of, you know, 5% on average, even under serious load, which is a very acceptable

(22:52):
number when you're getting this amount of value from encrypting data in use. So, you know, the question that you asked is exactly the right one. Traditionally, software-based encryption mechanisms always dedicated a fair chunk of the processor to the processing and encrypting of data, whereas this is offloaded into hardware accelerators on the

(23:14):
processors, for the memory, and then also for things like IO and disk encryption and so on. And so you can get some very high performance, and you have the elasticity, and so you can scale horizontally and, you know, set your limits on Kubernetes, run them in a confidential pod, and you're away, and you don't have to worry so much about that performance.

Speaker 1 (23:37):
So, Steve, I know you've got... I suspect you have more technical questions. I want to know how the heck you get this into the marketplace. You know what? Who is buying this? What are the objections you run into, Mark? How do you knock those objections down?

Speaker 3 (23:54):
Yeah, good question. So number one is always: I haven't heard of this technology before; this sounds too good to be true. And it is a very powerful technology, and so there's definitely education that's needed, which is why we're doing this podcast, and why we spend a lot of time with customers in workshops and things like that, for education on the technology. So that'd be the first one, I think, when you think

(24:16):
about, you know, coming back to your last question about, well, how do you get this seen? Traditionally, confidential computing was quite an onerous task for organizations. They had to build applications to it. That's no longer the case, especially with what we're doing. It's a matter of essentially taking applications and processes and putting them into a confidential computing

(24:37):
environment, and you can do that in very short order. You know, if you think about the old way of protecting data, the best practice was typically to encrypt at the application tier, so we'd use a toolkit and we'd have key management and we'd have to think about the data flows and so on. That was, you know, weeks to months of effort, typically, per application in a typical enterprise. Confidential

(24:59):
computing, including the ability to not only protect data in use but enforce data at rest and in motion, can be instrumented into a CI/CD pipeline, and so then it becomes a question of, well, how are you building software?

Speaker 1 (25:12):
And CI/CD is?

Speaker 3 (25:14):
Sorry, good point. So it's in the development pipeline process, so you can actually turn on confidential computing when you need to use it, as opposed to coding it in, which is how you used to build it in. You know, you used to build application security with SDKs; RSA BSAFE is the classic one, and there are loads of vendors that do this sort of stuff. With confidential computing, it doesn't have to be built into

(25:39):
the app. It can be instrumented by operations and turned on, so that the instance becomes confidential and the application then runs confidentially, at least the way we implement it. So it's easy.

Speaker 2 (25:53):
So I'm imagining that a confidential computing element, like a TEE, a trusted execution environment, is essentially a hardened processor, a computing environment. I'm imagining that you're talking about taking enterprise software in its current state and, in a sense, recompiling it or running it in a virtual machine inside the TEE.

Speaker 3 (26:15):
Yeah, it's more like the latter. So the confidential computing environment is going to be regular processors. It's not like a special-purpose processor, the way the TPM was a dedicated special processor that managed keys. So you can, you know, get an Intel Xeon with TDX or SGX extensions, you can have AWS Nitro Enclaves.

(26:36):
These are all the kind of brands, if you will, of confidential computing. In essence, the way we look at it is that you should be able to take your applications, run them in confidential computing without change, without re-architecting, not even a line of code change, or even recompiling. It's taking the binary and running it virtualized so that

(26:57):
you can run in a confidential computing environment, a run-time environment. Essentially, an operating system gives the application what it needs to run. So where the confidential computing environment was missing networking and storage, we filled in that gap, so it just appears like a regular application environment.

(27:18):
And then think about, say, starting a web server, right? So one of the things a web server needs is the TLS key, so it can decrypt traffic coming in from the browser. Simple key. That's a very important key, because if that's exposed, all sorts of compromise can happen. So if you can securely inject that into the enclave as it starts, so that it just looks like a file that it would

(27:40):
normally pick up, then it can just run as it would in a regular environment. And so confidential computing has to be about the simplicity of the application, and then also simplicity for the processes around it: managing configurations, keys, orchestrating, integrating into Kubernetes, so that it's just seamless. So that is how this works.

(28:01):
It is almost a kind of virtualized way of looking at the world, and an application that was formerly insecure can now be run securely without changing it. That's how you get to do this in the development-to-operations process, as opposed to in the coding process.
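The TLS-key injection Mark walks through could be sketched as below. Everything here is an assumption for illustration, not Anjuna's actual mechanism: `fetch_key_from_kms()` is a placeholder for an attestation-gated key-management call, and the runtime simply materializes the key at the path the unmodified web server already expects, so the server starts as if reading an ordinary file.

```python
import os
import tempfile

def fetch_key_from_kms() -> bytes:
    # Placeholder: in the scenario described, this call would only succeed
    # after the enclave's attestation evidence has been verified.
    return b"-----BEGIN PRIVATE KEY-----\nMIIE...\n-----END PRIVATE KEY-----\n"

def inject_secret(target_path: str) -> None:
    """Write the key atomically with owner-only permissions, so the unmodified
    application just sees an ordinary file inside the enclave."""
    key = fetch_key_from_kms()
    # target_path must include a directory so the temp file lands beside it.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(target_path))
    try:
        os.write(fd, key)
    finally:
        os.close(fd)
    os.chmod(tmp, 0o600)
    os.replace(tmp, target_path)  # atomic rename into place
```

The atomic rename matters: the application either sees no key file yet, or the complete key, never a half-written one.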

Speaker 2 (28:21):
Got it.

Speaker 1 (28:22):
So that strikes me that part of your value proposition is that you're eliminating the need for, I guess, deep-dive security audits of all the source code to employ every protection technique, every secure technique, that you might. Now you can just take that code that you've been running,

(28:44):
you knew you had some technical debt associated with it, move it into your environment and, because of that connection back to the hardware, you can attest it.

Speaker 3 (28:57):
You can certainly raise the bar on security for existing applications like that. You have to be mindful and realistic also about expectations. So, for example, if you have a database that you want to run in confidential computing, you absolutely can and should, so that the operations on the database are not visible to insiders, and they can't memory-dump it, and encryption of data at

(29:17):
rest is protected inside the walled garden of the enclave, of the secure execution environment. But if you have a SQL interface that is vulnerable and you can query it, you'll still have a vulnerable interface, and so you have to be mindful of how you use it and the expectations around it. It's definitely a very powerful technology, but you might want to also think about the overall threat model of what you're

(29:39):
instrumenting into the cloud or your data center, and think about what mitigations you can apply. But we've done analysis against the MITRE ATT&CK matrix, which is basically the typical threats that people have to deal with in running applications, and there are at least 77 MITRE attacks, as in vulnerabilities, that are immediately eliminated

(30:02):
by running in hardware, which is a pretty big chunk of high risk that's reduced just out of the gate, without doing anything.

Speaker 2 (30:13):
So what are the use cases that you're seeing the most traction in? The people that are coming to you... is it beyond regulated and sensitive data?

Speaker 3 (30:23):
Yeah, it's some really interesting stuff that we're seeing, and it's probably like three categories, I'd say.
One is the regulated industries, the obvious suspects.
You know, the defense agencies, as you do.
Other early ones are banking and healthcare, the next obvious ones, and it's just unblocking things like cloud migration, where, you know, the irony of confidential computing

(30:43):
is it's widely available in the cloud, but it isolates you from the CSP.
So if you don't trust the cloud, you can now control what you're putting into it.
So it gives you hardware controls that you might have had in the data center that weren't there before.
So now you can move things to the cloud.
So banking, you know, healthcare.
But I think the two interesting categories we're seeing are

(31:04):
especially machine learning and AI, where, if you think about AI, you're often dealing with multiple entities' data, so people that don't often trust each other.
So in a bank, you might have two lines of business that could, you know, merge data together, but the merger of that data could be very high risk from a breach perspective, like

(31:26):
high net worth data, above and beyond, you know, typically regulated; it's the crown jewels of most banks.
Yet being able to analyze that with machine learning might give you something that's brand new, and you want to make sure that you have integrity over the execution of the model itself.
So you want to make sure that you have integrity over the data, exactly.
And so now you can start to shrink the attack surface of

(31:47):
execution of the AI model and create environments where you can have multiple parties come together into what is a trusted ecosystem that can't leak, and you can control what comes out of that.
So let's get the results out.
So let's target this individual with these products, much more granularly than I could with just anonymized data.

(32:09):
Or, you know, minimized data.
AI, and AI as a service inside organizations who just want to explore it, and it has popularity and excitement, and so on.
That's a really big one.
And then the one that's really cropped up is third parties that for years have been dealing with security, and they've got visibility into data that is very high value.

(32:31):
So imagine all of those DLP vendors, you know, the ones that have the perimeters that used to inspect the data coming in and out.

Speaker 2 (32:37):
DLP.
These are the data loss prevention vendors? Yeah, data loss prevention.

Speaker 3 (32:41):
Or there's companies that do things like API scanning, to see what the behavior looks like, with applications to detect risks based on the behavior of users and systems.
Those have privilege over what's coming in and out.
They also have the API tokens, so being able to secure that and lock that down closes a gap on what is actually security

(33:03):
infrastructure itself, and that's helping those organizations move more of their customers to the cloud, customers who have concerns over that level of access and visibility into the client on infrastructure that they don't own.
So there's some really interesting scenarios that span everything from banks to healthcare to highly regulated

(33:24):
industries, and then brand new businesses forming on the back of this technology as well, like multi-party computation that's practical.
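The multi-party pattern Mark describes, where raw data never leaves and only an agreed result comes out, can be sketched in plain Python. This is purely conceptual (the names are hypothetical, and the isolation here is only a class boundary; in a real deployment it comes from the attested enclave hardware):

```python
from statistics import mean

class EnclaveComputation:
    """Stands in for code running inside an attested enclave: parties can
    submit data and read the agreed aggregate, but never each other's rows."""

    def __init__(self):
        self.__rows = []  # private by convention; the enclave enforces this in reality

    def submit(self, party: str, balances: list[float]) -> None:
        """Each party contributes its data into the trusted environment."""
        self.__rows.extend((party, b) for b in balances)

    def agreed_output(self) -> dict:
        """Only the pre-agreed aggregate crosses the enclave boundary."""
        values = [b for _, b in self.__rows]
        return {"count": len(values), "mean_balance": round(mean(values), 2)}

enclave = EnclaveComputation()
enclave.submit("line_of_business_A", [120_000.0, 95_000.0])
enclave.submit("line_of_business_B", [310_000.0])
print(enclave.agreed_output())  # {'count': 3, 'mean_balance': 175000.0}
```

The design point is that the output contract is fixed before any data goes in, which is what lets two lines of business that don't fully trust each other merge crown-jewel data safely.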

Speaker 1 (33:37):
Well, Mark, I think we're gonna leave it there.
This has been really super interesting, unless there's something else you'd like to address.

Speaker 3 (33:45):
No, well, I think I come back to the fact that, you know, irrespective of whether you're thinking about cloud or computation in the data center, over the next few years confidential computing is gonna be on people's radars and it's gonna end up on their roadmaps, whether they like it or not.
You know, the stakes are too high, especially as we get into higher levels of data that need to be processed by machine

(34:07):
learning and AI.
We're gonna hear about trusted AI.
We're gonna hear more and more about trust. I know that Steve's been talking about this for years, and all of these things come together with trust.
That has to come back to hardware somewhere, and then from there everything else can be trusted on top, and without that it's a house of cards.
So this is absolutely in people's futures, and I think

(34:31):
what it'll mean is that security resolves to being about denial of service and human error, if we can lock down computation in this way. Which is a very profound, forward-looking statement, but I think in five years we'll look back and think, you know what, that might've been the right thing to do.

Speaker 2 (34:47):
And then I think we'll take the confidential off
the front of this and just makeit all computing.

Speaker 3 (34:52):
I think that is 100% correct.

Speaker 2 (34:55):
It's understandable that we triage at this time, because things are expensive for this, and we triage sensitive and regulated data to be the first use cases for confidential computing.
But, you know, as computing gets commoditized, I mean, I'm sitting here in a room with, for all I know, a software-controlled LED light bulb with a couple of thousand lines of code, and there's millions of these things around

(35:15):
the world, and they're all points of attack. And that idea of attestation and hardware roots of trust, even in light bulbs and vacuum cleaners and automobiles.
This stuff needs to be spread far and wide, and it's great to see the practicality of the deployment that you talk about, that the existing compute and the existing workflows can

(35:36):
be picked up and moved into a secure environment.
It's gotta be the way to go. Yeah, absolutely.

Speaker 1 (35:42):
I agree, so I may just stop right there, but I'm gonna say for myself that, while tremendously powerful, I'm also struck by the need for existing approaches.
Mark, you were talking about how do you protect a SQL

(36:05):
interface, and the database where that data is normally stored in plain text, and you still have the techniques of good security hygiene that have to apply to those.
Similarly, you were talking about AI data; so much of the data that an LLM has been trained on, it's bullshit.

Speaker 3 (36:31):
You have to think about integrity.

Speaker 1 (36:33):
So integrity, the regulatory discussions around bias that's built into the data sets: confidential computing can't do anything about that.
There's still a tremendous amount of work that needs to be done around the data itself. Oh sure.

Speaker 3 (36:48):
Yeah, yeah, absolutely.
I mean, you can make sure that it's coming from somewhere you trust, but you have to make sure that the data is also reasonable as well.

Speaker 1 (36:56):
And that's what's so powerful about what you're
talking about.
Yeah, exactly. To be able to prove that a trusted device is manipulating your bits and bytes, exactly.
Oh, Mark, thank you very much.
We'll look forward to gettingyou back sometime.

(37:18):
We'd love to hear about your progress, so reach out to us.
Let us know when something hot happens.
We'd love to have you back.

Speaker 3 (37:24):
Absolutely anytime.
All right, good to see you again, guys.

Speaker 2 (37:27):
Thanks, Mark. So good.
Thanks everyone, cheers.
We'll see you next time.