Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
All right, welcome everybody to the SNIA Experts on Data podcast. Super excited, because we've got an amazing panel of experts and fantastic humans that I'm joining today to have a great discussion around the experts on data when it comes to practice areas in SNIA. So we've brought a lot of neat things around focus areas, and
(00:25):
today we're going to talk about Protect, and you're probably going to say, what does that mean? That's why I've got these amazing folks with me. And, oh, by the way, my name is Eric Wright. I'm the Chief Content Officer and co-founder of GTM Delta, and also the lucky host of the SNIA Experts on Data podcast. And with that, I'm going to take us around the room and introduce the incredible panel.
(00:45):
We'll start with Michael. Do you want to give a quick intro yourself? And then we'll get rolling through the crowd.
Speaker 2 (00:53):
Thank you, Eric. My name is Michael Dexter. I'm an independent support provider in Portland, Oregon. I'm very active in open source circles, and I host three calls a week: one on containers, one on OpenZFS and one on hypervisors.
Speaker 1 (01:06):
Well, I'm glad we could sneak in between those. Thank you very much for joining, Eric Hibbard. Thank you, not just because you're amazing, because you have a great first name, but there's definitely a lot more that I'm a real big fan of with you, Eric.
Speaker 3 (01:20):
Thank you, Eric. So I actually, in SNIA, chair the Security Technical Work Group. My day job is with Samsung Semiconductor, and I deal with NAND and SSDs, and I'm very active in the security and privacy standardization activities: IEEE, ISO, things of that nature.
(02:02):
[no transcript] and mostly I'm a senior manager with Ernst & Young.
Speaker 4 (02:09):
I've been in the industry for more than 35 years, all around data protection, data retention and data archival. I have held a lot of disaster recovery, business continuity and ransomware protection roles across my career, and I'd be happy to be a member of this discussion panel.
Speaker 1 (02:31):
Fantastic. And, John, I can say that you love KIOXIA so much that it's your middle name. I love your tag that I get to see on here. So, John, if you want to do a quick introduction, then we're going to jump into the chat.
Speaker 5 (02:54):
Sure. So hi, I'm John Geldman. I'm employed by KIOXIA. I get to lead their standards activity in general. It's fair to say I am a standards geek. I am also on the SNIA board and active in most committees you can name, including a lot that Eric runs.
Speaker 1 (03:14):
Now the fun part is that we get to stay on you for a second, John, because one of the things we have with the practice areas is really understanding what it means. You know, the data focus areas are curious to folks from the outside, so let's kind of start fresh. No one knows who we are. Let's talk SNIA and the Protect data focus area.
Speaker 5 (03:37):
All right. So today I'm here to introduce this talk on one of SNIA's focus areas: to secure and protect data. I'd like to call out two of the active groups that SNIA has focusing on this area: the Data Protection and Privacy Committee, which generates
(03:58):
community marketing and educational materials, and the Storage Security Technical Work Group, which generates standards, provides comments on various domestic and international standards, provides support to other SNIA technical work groups and works with many of SNIA's alliance
(04:19):
partners. By the way, I also co-chair that group. Now, the SNIA online dictionary, which, if you haven't seen it, is a great tool,
(04:47):
describes data protection as the combination of data integrity, data availability and confidentiality. I'm going to throw in data privacy as the fourth element. That's ensuring that data has not been altered, ensuring that data is available even if there are physical failures, ensuring that data is only accessible by authorized parties, and what it actually means for a person to have control over
(05:10):
their personal data. Each of these elements is complex, with multiple aspects and approaches. Each of these four elements is affected by the ongoing development of media, new data organizational frameworks and
(05:35):
also by the growing capabilities of quantum computing, which affects our cryptography. Each of those elements is supported by a combination of cryptographic systems and permission systems and data redundancies and regulations. For systems and developers, we use
(05:56):
all of that stuff, except, usually, the regulations. Now, the lawyers and legislators: that's different. There are now laws on the books that have consequences if you aren't doing the right things in these areas, if your company is responsible for data breaches or doesn't appropriately provide
(06:17):
control over personal data. Now that I've got that scary thought going, I'll turn this over.
Speaker 1 (06:25):
Nothing like opening on a bright note. And it's funny that you did highlight something that I think we're probably going to spend a good amount of time digging down into across the whole panel, which is the difference between sort of
(06:55):
technical capabilities, the implications of that as an operational challenge, and then the third pillar being the regulatory challenge that comes along with it, and that is, you know, regional as well. So you talked about some of the fun part of, as lawmakers enter the conversation. And maybe let's kind of start as we look at defining data protection. You were looking at how the US focuses on it versus...
(07:16):
You know, there's not a person in this audience who likely hasn't heard the phrase, you know, the letters GDPR. It does go all the way down to the metal, because there are capabilities that we're having from the very, very lowest possible, you know, viewpoint that will traverse to the point where they are, you know, bound by regulations and excitement. So, uh, who wants to?
(07:38):
I'm going to open this up and see who gets fastest to the mic on this one. But let's talk about, you know, that sort of market definition and why it's regionally different.
Speaker 3 (07:49):
So I can probably take that one, since I do quite a bit of work in the privacy space. Yeah, you're right, it does get all the way down to the metal, so to speak. And you can use the GDPR, since you invoked the well-known acronym, because something as, quote, simple as destruction of
(08:17):
data could constitute a data breach. And that was one of the big sort of eye-opening things for a lot of folks, especially in the storage industry, because I think up until that point we always thought of a data breach as unauthorized access, but that, in fact, is
(08:38):
not the case. Corruption and destruction can get you in just as much hot water as unauthorized access. That said, GDPR was a sort of second move by the European
(09:00):
Union to try and get this right. They've since been followed by a bunch of different jurisdictions, so places like China, for example, have very, very strict laws in this space. In the US, we're a bit schizophrenic when it comes to privacy regulations.
(09:22):
At this point, I believe every state has something on the books, and they vary significantly. California, where I think John and I are at, has some of the more stringent requirements, but, you know, Massachusetts has got its own sort of version. None of them are tracking, like, GDPR.
(09:45):
At the moment, the focus is a little bit different, and, you know, we don't get too worried about things like IP addresses and things like that, where the Europeans are very fussy. Even photographs and whatnot are, you know, protected information in certain settings, so we haven't crossed over into that space. It's unclear that we will see any serious privacy regulation
(10:12):
in the US. If I had to speculate, it would probably take a path that would preempt some of the more stringent requirements that are on the books in some of the states.
Speaker 1 (10:23):
So yeah, I think the chaos factor we've got in the US is going to continue. I can say it coming from... I'm Canadian by birth, as my text and accent always give away here, but I worked for, let's just say, a large insurance company that rhymes with Fun Life. Anybody who goes back to LinkedIn will easily tell
(10:45):
where I was from. But I was really exposed early to the idea that it's not just, you know, as a company you have requirements; it was as a company per location, because there were provincial regulations around the types of data we can hold and maintain. And then it became country to country, of course, because it's
(11:06):
a global company, and I did get that early exposure to it. And, like you said, I think, Eric, you brought it up: the idea that we think of it as exfiltration. Like, the removal and selling or publication of data is what people see as, like, the sacrosanct, the ultimate thing. But in fact just possessing and holding the data still has an
(11:27):
incredible amount of regulatory stuff attached to it. And when we think of, like, the right to be forgotten, I think that was one of the really interesting parts of legislation that came out of the EU, and that introduces a whole exciting thing that takes us down to that. So maybe, you know, on that: what does it mean,
(11:48):
you know, as a standards development body, when we see something like that come from the regulatory side? How do we then parlay that down through the rest of our organizations and how we handle it?
Speaker 3 (12:01):
Well, I mean, as somebody who's heavily involved in, like, ISO standards, we don't cross over into the regulatory space. In fact, this is something that we have to be extremely careful about, because, you know, like in the case of some of the work in, you know, ISO Subcommittee 27, which deals with security
(12:25):
and privacy (you know, that's a little more expanded than that), we have on the order of 75 national bodies represented. So one size would not fit all from a regulatory perspective. So the standards tend to dodge that issue. The twist is that it's not uncommon for regulations to
(12:50):
actually cite certain standards. So there is a relationship, but in general the standards try to avoid kind of getting into it.
Speaker 5 (13:01):
I'll note that I would tend to agree with Eric. We don't really touch them in the standards, but I know, for what I do, I get called for advice from legal about export rules from different places.
Speaker 1 (13:20):
We need to put our product specifications, not our standards, out. That's actually part of what we need to control carefully: where we're actually exporting trade secret information. Remember the days when even just running the wrong version
(13:40):
of Netscape could actually get you in trouble? Remember there was a 40-bit and a 128-bit version? Like, it's kind of wild that we take for granted how easy it is to just grab software and put it down on our systems these days. And then, when it comes to, you know, data protection in general: again, we sort of take it for granted that what we assume when we hear data protection is that it means backup, but
(14:05):
that's actually, like, one sliver of it. So maybe, on that: what is the difference between security, privacy and the storage, you know, elements of this Protect data focus area?
Speaker 4 (14:25):
Maybe I can step in here.

Go for it, Muneer.

Yeah, the idea here is, when you talk about data protection in general, our thought within the infrastructure resources is that it's protection from data loss, data corruption, data unavailability, by providing what you just said, Eric, the
(14:46):
backup, as an example; maybe replication, maybe cloning, maybe snapshotting. All this is to protect the physical capability of the data. To protect the data from prying eyes, that is data privacy. So we always tend to mix the two. As Eric Hibbard just said, the line separating those two items
(15:13):
is very blurred, so it's very important for the end user to understand what he is protecting from. Like, for example, if your data is on your local device that is not connected to anything, it is unlikely to be accessible to prying eyes; this is protected from the privacy point
(15:35):
of view, but that never protects you from hardware failure or data loss. So these are the barriers, or the separations, between the data privacy concept and the data protection concept.
Speaker 3 (15:54):
I might put a slightly different spin on that, because, you know, privacy, as a practicing privacy professional, has a legalistic side to it. So I would say what Muneer described is probably more from a confidentiality perspective. Privacy has got a legal regime, a regulatory regime, and,
(16:20):
to use an example, the fact that you might have certain data on that laptop might actually be a privacy violation, but it wouldn't be a confidentiality issue, because it's not connected. And so, understanding... I mean, that may sound like a subtle difference, but it's not. You may not be authorized to have
(16:42):
anything to do with certain data that's sitting on your laptop. So privacy has a dependency on some of the security aspects, and you said earlier availability, integrity, confidentiality. I said it the wrong way; I should have said the CIA sort of wording. But security itself doesn't actually care about privacy from
(17:08):
a regulatory perspective, and that's, you know, one of the interesting challenges. Coming back to Muneer's comment about data protection through kind of the lens of storage, there is, from a security perspective, kind of an assumption that you're doing the
(17:31):
storage side of things correctly, because there are concerns about, like, business continuity and things of that nature. So there's, you know, a little bit of... so privacy has some dependencies on sort of the security community. The security community kind of has some dependencies on sort of
(17:51):
the storage-oriented aspects of data protection. But it's subtle, in many cases, you know, how strong that dependency actually is.
Speaker 1 (18:04):
Yeah, that's an interesting thing you brought up, Eric: this idea that it's almost like kinetic and potential, as we talk about in the sense of energy, in that the data you have on your machine has the potential to be something. It's not necessarily actively at a point where it's at risk of exploit, but just the fact that data is possessed on that drive
(18:29):
introduces a regulatory, or could be a regulatory, issue. Yeah, and that's why it's such a careful dance. I thought I just had to call my storage team up; I didn't realize that I had to call legal every time I need to ask one of these questions. But that almost is this interesting boundary. And, like, I'm going to come to you for the hot take; there's got to be
(18:49):
something. How do you explain this when you're talking to your community, and what's the angle at which you look at this, from this security, privacy and protection?
Speaker 5 (19:01):
So one of the key aspects of privacy is it really is about somebody's personal information and control of it, and so if it doesn't have to do with that person, then it's not a privacy concern with that person. So it's the sharing of that information and whether it's
(19:25):
still there or not, or whether it is allowed to be shared in other places, which is important. When we talk about confidentiality in general, it's a really more general thing about: are only the people who are supposed to have access to that having access to that? So if the company has decided that it's okay to have a
(19:49):
database of names and phone numbers and everybody has access to it, then we need to make sure that only the people who are supposed to have that have that.
Speaker 3 (20:19):
Except that's a good example, John, in that companies were suddenly forced to deal with that to comply with GDPR. And, in many cases, people would split up their call centers so that they would be in certain regions, or they would say, yeah, we can't, you know, we can't keep this. Oh look, we've got 25 years' worth of records.
(20:40):
Why do we have this? There was a lot of soul searching that kind of went on. But yeah, I think, more generically, Eric, to a comment that you kind of made: healthcare information stored away, squirreled away somewhere. If a regulation goes into force now, everybody's got to scramble to
(21:07):
figure out what we actually have, you know?
Speaker 1 (21:10):
What do we get?
Speaker 3 (21:10):
...stored. Because now it's suddenly, you know, it's now regulated and you've got to basically deal with it. This happens all the time with, you know, the regulatory space. As John pointed out, personal information is a little easier to track because you know what to chase, but we're seeing regulations pop up. You know, California a while back had an IoT-oriented
(21:34):
regulation where, all of a sudden, you know, am I an IoT company? And it depended on the definition, which was basically everything but routers. You know, so we get hit with these things where we've got to sort of skulk through the data to figure out what we have and how we have to protect it.
Speaker 1 (21:54):
And it's such a tough challenge too, as technologists, because we live in this intermediary space where we have to understand the regulatory impact, but then we also have to relate to the technical capabilities. And that was a great example where, yeah, unless it was bolted into a 42U rack, it was considered to be mobile, you know, like that, and
(22:14):
that introduces an interesting challenge. Now I'm going to switch to the question I brought up to you, Michael, before, because I want to go... let's talk about bottom up. What's your take, as you look at describing what are the capabilities that we're standardizing that attend to some of these challenges being brought up, as we're talking about it at this regulatory
(22:35):
and sort of top-level layer?
Speaker 2 (22:38):
I'm glad you mentioned that, because you used a phrase like "I need to talk to legal every time I do something." Well, while we have countless storage products out there, the average user, be it small and medium organizations and even rather large ones, does not have storage engineers. And I'm glad SNIA has the storage developer event, because
(23:02):
there are very few true storage developers, and so, at best, we are handing smaller organizations weapons that are unsafe at any speed. And I'd hope that organizations like SNIA can help bring them these notions: that when you are
(23:22):
setting up storage (and this was an eye-opener for me; thank you, Eric Hibbard), the destruction of data is just as important as the protection of data, because your average storage engineer will hold on to that data until the end, and that's their job, and they are failing at their job if you need to destroy that. So, bottom up, I think we need tools that are available to all.
(23:43):
I personally believe those lie in open source tools, because the vendors are inconsistent in their delivery of data protection solutions, and it's a challenge. There's no question it's a challenge, be it those competing interests of protecting yet carefully deleting, and, as you've mentioned, the sudden HIPAA or GDPR requirements.
(24:07):
Untangling a rock-solid archive of data can be a nightmare. And insofar as we've succeeded as storage engineers in providing multi-tiered, decades-ready storage, suddenly someone, one user out of a thousand, has the right to be forgotten. How do we take that solid, don't-touch-it store and extract that
(24:31):
one user? So the challenges are palpable. Every effort is appreciated, and I certainly hope SNIA can play a role in all that. I hope that answers your question to some degree.
Speaker 1 (24:42):
That's great, yeah. And I think, actually, we're going to go to the deepest possible complex thing that has now come to the fore, which is, you know, vector databases being a way in which we store data that's not easy to remove. You know, and this is the problem: we've got people training models, we're storing models that contain a vast
(25:03):
amount of information, we struggle with explainability, and the reason is because it goes down to this idea of, like, protecting, destruction and lifecycle of that data. How do we make sure that we can safely and even potentially remove some of that stuff? Now, we're not solving the vector database problem; I pulled that as one example. But, you know, what are the artifacts
(25:27):
that we're dealing with at the metal and up that are being leveraged as we see new tech come in that's going to challenge the methodology?
Speaker 2 (25:38):
I'll throw in one key point there that I always come back to in our meetings with the DPPC. It's that when there's something new, be it a vector database or something exciting, generally there's a question: well, is it faster, more reliable or more secure or cheaper? And so it's still a storage technology. Let's find where it fits in that mix.
(25:59):
But yeah, there's always something new and shiny, and there always will be. So that's my take on it.
Speaker 3 (26:06):
Yeah, and part of that, sort of following up on his summary there: from the security piece, we typically ask what happens when you're all done with it. So it's not so much... you know, you could actually have the security problem solved
(26:28):
while it's operational, in use, but if you haven't thought through what it means when I'm all done, then you may have actually exposed yourself to a lot of data breach scenarios at a time when, if you're getting rid of something or you're shutting operations down, you surely don't want to be
(26:50):
dealing with something like this. So, you know, this is a facet that is very important, especially in the storage arena, because, you know, we are the guardians of the data, and, as I think everybody is aware, we go to great lengths to squirrel it away in lots of places to ensure
(27:11):
it never disappears. But when it's time to make it disappear, we've got to basically deal with that problem.
Speaker 1 (27:21):
I had the strangest, funny thing happen years ago. I remember that we had, like, you know, workstation-style servers. This is back in the early 90s; I'm an older fella, so I've been around since some of the early implementations. And we had a bunch of spare servers, and so, when they would come into production, we would take their drives away and move them. So a shelf of servers, a shelf of storage, so all these disks,
(27:44):
you know, gigantic 256-megabyte drives, massive, you know. But I took drives. I'm like, oh, there's a spare server, and I just took five disks, popped them into my nice little tower, and I powered it up, and it took about three or four minutes and it came up to a
(28:05):
Windows screen. I was like, oh my God, I literally picked a stack of drives that still had controller memory and disk-side memory, and it recovered the array, and I just happened to have picked five that were in order. So that was a scary moment where I was like, okay, what
(28:26):
just happened here? I accidentally installed Windows NT. I don't know how this works.
Speaker 4 (28:32):
Actually, this problem now is very common across the Internet, the hyperscalers and the cloud providers, where a company or an individual acquires storage, let's say for a project, and puts some test data or dev data on it to do his development and qualification and certification
(28:57):
of the project, and then, once he's done, he releases the resource to the cloud provider again to stop billing, without removing the data. The next person who picks up that resource is going to be having a lot of fun going through PII data that he can use for
(29:19):
multiple purposes. So that's a concern that has always existed, to be sure.
Speaker 5 (29:26):
Now, personally, I'm a fan of the sanitize command, but I helped invent it, so there you go. But that's a command which is used in hardware devices to really reset everything that's there. One thing that I'll share, coming from NVMe (but I won't
(29:49):
give too much status about it), is the ability to purge a namespace, which will be very helpful for logical people and for logical storage in the future. But that's not available yet. It's coming.
(30:11):
It's a lot easier to deal with these logical constructs than it is with the physical constructs.
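For listeners who want to see what the sanitize command John mentions looks like in practice, it is exposed by the open-source nvme-cli utility. This is a minimal sketch, not a recipe: the device path /dev/nvme0 is an assumption, and these commands irreversibly destroy data on the target drive.

```shell
# WARNING: sanitize irreversibly destroys all user data on the target
# controller. /dev/nvme0 is a placeholder device path for this sketch.

# Check which sanitize actions the drive supports (the SANICAP field
# of the Identify Controller data).
nvme id-ctrl /dev/nvme0 | grep -i sanicap

# Start a crypto-erase sanitize (sanitize action 4): the internal
# media encryption key is destroyed, making all existing data unreadable.
nvme sanitize /dev/nvme0 --sanact=4

# Poll the sanitize status log page until the operation reports done.
nvme sanitize-log /dev/nvme0
```

Unlike a filesystem-level delete, this operates on the whole controller, which is exactly why the namespace-level purge John describes would be a welcome addition for logical storage.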
Speaker 3 (30:18):
And John said something actually that's worth pouncing on here. He used the term "purge," and that, in the security world, hopefully has some special meaning to you, because that's essentially a technique that is supposed to hold up to,
(30:41):
potentially, a nation state trying to do a recovery. It should definitely be anti-forensic. So, I mean, if you're using purge techniques that have been implemented correctly, you know, forensics people should be very unhappy if they're trying to recover data. There should be no way of getting it from the data itself.
(31:03):
Now, forensic people are very, very creative, and systems are notorious for temp space and, again, squirreling data away in a bunch of different locations, and so they may be able to recover it that way. But if you just hand them a drive where, you know, a purge operation has been executed,
(31:23):
there should be nothing available to them, or to a nation state, if done right.
Speaker 1 (31:29):
Yeah, and it's to the point where, you know, on the physical side, you know, we used to just literally degauss drives, which was always fun, to watch them make little dances for us on these little exciting magnets and heat up. But then, yeah, as we move to the cloud, we move to distributed, shared storage environments. We don't have the degaussing option, and it's that real
(31:53):
understanding of lifecycle that goes to the point of termination of the data, and we just easily forget. And we even talk about backup products. They're not backup products; they're restore products. But we always talk about it in the context of: you first have to back it up. But time and time again it's: can it be restored? Whether it's going to be sitting in cold but live storage,
(32:27):
whether it's going to be sitting in hot storage in a secondary site, there are so many ways in which we can manage how that data lives and, ultimately, where it dies.
Speaker 3 (32:43):
Yeah, and just, you know, another angle that we're having to deal with, related to the standardization, is when you consider sustainability. You know, going out and destroying drives that are potentially reusable is really becoming less acceptable.
(33:07):
I don't think we've sort of quite crossed that line yet, but there are a lot of people that are looking at that situation of, like: so how do we eradicate the data on the drive but leave the drive in a state that it's usable? And this is back to that term that John used earlier,
(33:27):
you know, purge and, in particular, using cryptographic erase, which is essentially, you know, using encryption and certain key management techniques, as in, like, losing the key intentionally. There are some other conditions that you have to worry about. So not only are we having to worry about eradicating the data,
(33:51):
but now we're also having to look at a situation of doing it in a way where we're not causing physical destruction to the underlying storage device or media, and it's clear to the storage industry that we've got to deal with this. SNIA has been operating in this space for quite some time.
(34:13):
The Green Storage TWG has worked with the EPA on, you know, a lot of technologies, you know, in the past. We've seen the Open Compute Project has, you know, got an entire sustainability activity. So this is something that, you know, the information communications technology industry is basically
(34:35):
trying to figure out how to sort of step up and deal with.
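The cryptographic-erase idea Eric describes, encrypt everything on the way in, then intentionally lose the key, can be sketched in a few lines. This is a toy illustration only: real drives use AES on the media, whereas this sketch uses a dependency-free XOR pad, and the record contents are made up for the example.

```python
import secrets

# Toy illustration of cryptographic erase: data is stored only in
# encrypted form, so destroying the key renders the ciphertext useless.
# (A real drive uses AES; XOR with a random one-time pad is used here
# purely to keep the sketch dependency-free.)

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

record = b"PII: Jane Doe, +1-555-0100"          # hypothetical example data
media_key = secrets.token_bytes(len(record))    # key held by the "drive"
stored = xor(record, media_key)                 # what actually hits the media

assert xor(stored, media_key) == record         # readable while the key exists

media_key = None                                # "purge": destroy the key
# The ciphertext is still physically present on the media, but without
# the key it is indistinguishable from random noise: the data is gone,
# and the drive itself remains reusable.
```

This is why crypto erase is attractive for sustainability: the erase is effectively instant and leaves the device intact for reuse.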
Speaker 1 (34:44):
Yeah, and that is where we often feel like we have competing interests, because, you know, the best way to destroy it would be, you know, burn it with fire, throw it into a volcano. But obviously the impact of doing such is terrifyingly bad for so many other reasons. So we have to think about recycling, but destruction at the data layer, with absolute belief and proof that this data is irrecoverable.
Speaker 4 (35:06):
Yes.
Speaker 5 (35:06):
One of the fun things...
Speaker 2 (35:09):
Go ahead, John. You go; you haven't gone yet. There is indeed a secondhand market, and in this broader education of, okay, how do we properly sanitize, a consumer sure appreciates knowing the number of hours on a drive, the SMART status, the health of that drive and
(35:30):
whatever they can about the history. And there are vendors who are somewhat famous for fudging firmware and somehow having terabytes of data through a drive but very few hours, which is completely implausible. And so this whole ecosystem is indeed a challenge, and that hasn't quite been solved to our broader satisfaction.
(35:51):
Go ahead.
Speaker 5 (35:52):
So, two interesting things. One is destruction isn't as easy as it used to be. When we can put multiple Encyclopedia Britannicas on an eighth of my pinky nail, then, all of a sudden, if you're trying to shred this, you just
(36:16):
can't. You can't get it down to something which is meaningless to a state organization. You have to effectively phase-change it. You have to make it go from solid to liquid or something else. You have to melt it.
(36:38):
You have to do something to really change it. Now, to erase is actually one of the easier ways to actually get things done, if your hardware supports it and can support it. And when we're talking about where we're holding data,
(37:00):
you know, if we're actually holding data which is in a database or in an FPGA, we've got a different story on our hands, but we still have these potential confidentiality and privacy concerns to worry about.
Speaker 1 (37:18):
I think this reminds me that it's such a perfect conversation, and I only wish we had, like, another hour to be able to go, because there's so much we could dive into. But I think I'll say, thematically, I want to wrap by just saying the most important reason why what we're doing in SNIA, and collecting amazing folks like
(37:39):
everybody I've got here, is the fact that we are going to come out of here, as we should, and out of every architecture discussion, with a series of questions, not answers. Obviously, we drive toward answers, but the best thing you can have is: are all the questions being asked? And I would say that this is probably the number one reason why I recommend people get involved with SNIA, because you're
(37:59):
surrounded by other technologists who are experiencing the same problems in parallel with you, and we're all solving them together. So the pace of acceleration of innovation is so much better in this broader tech community. And, inevitably, you know, let's not reinvent the wheel, and let's certainly not destroy the wheel
(38:20):
in a way that doesn't, you know, protect us from PII being on the wheel. You know, there are so many things that we could do, but I want to say thank you to all the amazing folks. This has been a fantastic conversation. Like I said, there are so many thousand questions I want to ask; I want to have so much fun with everybody, but you've all been great. So, for just a quick round: what's the best way that folks
(38:42):
can reach you? We're going to start with you, Eric Hibbard. What's the best way, if people want to get caught up with you?
Speaker 3 (38:52):
Other than, of course, through directly meeting up at SDC and the SNIA events?
Speaker 1 (38:55):
Sure.

Speaker 3:
Well, yeah, I'm on email, eric.hibbard at samsung.com, or I'm pretty easy to track down on LinkedIn.

Speaker 1:
Muneer, again, I recommend people check it out. Muneer and I had a fantastic conversation in the past, so do check out some of the other amazing podcasts as well, with other folks that have been on here. So, Muneer, how do we get ahold of you if we want to catch up after the fact?
Speaker 4 (39:15):
I think the fastest way to access me, and potentially everybody on this call, is LinkedIn. Our profiles are open for anybody who wants to get information or connect. I'll be happy to respond to any inquiry following on.
Speaker 1 (39:33):
Excellent. And Michael?
Speaker 2 (39:35):
So, by email, editor at callfortesting.org, and, on the Fediverse, dexter at bsd.network.
Speaker 1 (39:44):
And John, last but very certainly not least: thank you again for everything you're doing around keeping the wheels on the bus, the hamsters spinning inside the wheels, and all the amazing stuff you're doing with SNIA, keeping all of us really having these great conversations. What's the best way we can get ahold of you if we want to reach you again?
Speaker 5 (40:02):
First, I'll have to admit that, whatever it is, my boss would like to know. But I can be reached at john.geldman at kioxia.com, and I also have a LinkedIn page where I can be reached.
Speaker 1 (40:24):
Fantastic, yeah. And there's a great white paper that's coming up; we're going to be seeing that. As soon as it becomes available, we'll probably be able to share that around: the data protection white paper. There's tons of great content that goes along with these conversations. So, folks, do check out and follow the podcast, smash that like button and subscribe and do all those things that the Zoomers
(40:45):
tell us we're supposed to do. But, more than anything, connect with these amazing folks. Thank you all for sharing the time with me today.