Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
How's it going?
Jason, it's great to get you on the podcast. You know, we've been planning this thing for a little bit, you know, off and on, and now, on my end, it's kind of a mad dash to get stuff done before my next kid arrives. It's, like, very hectic on my side.
Speaker 2 (00:18):
Well, thanks for
having me, Joe.
It's great to be here, we're excited for it. And congratulations, by the way. It's an exciting time.
Speaker 1 (00:25):
Yeah, it's an
exciting time to get back into
the suck, as I call it.
It's just like a blur of those first couple months, you know.
Speaker 2 (00:34):
Yeah, absolutely, and
then you forget about it later
so you can do it again, right?
Yeah, not too bad.
That first five-hour night sleep after about six weeks old
is perfect.
Speaker 1 (00:45):
Oh man, yeah. Like, it's interesting. You know, I never thought that, like, I would be able to, I guess, adapt, you know, to the lack of sleep. But, you know, I wear, like, a fitness tracker. I wear two fitness trackers, one's a Whoop and one's an Apple Watch. Right, so my Whoop tells you, you know, your recovery percentage, right, like, how recovered are you, how good
(01:08):
you should be feeling throughout the day so you can put in, you know, more work, or, you know, better workouts, whatnot. And, you know, in the very beginning, I mean, it'd be like zero percent, you know, because you're on three, four hours of sleep, right? You know, it's just so brutal. But after a couple weeks my body was, like, well rested after four hours of sleep, and I'm like, man, this is wild.
(01:31):
And I'm up at 4 am, like, wide awake with this kid, right? It's the craziest thing, but then, like, by noon, I'm crashing. You know, I need an hour nap or something.
Speaker 2 (01:39):
Sure, for sure. It's a lot like running a startup, actually, at the same time. About the same amount of sleep.
Speaker 1 (01:47):
So yeah, yeah. I feel like, I don't know, you know, so I do consulting on the side, right, and I really try to limit how I market it so I don't get that much work coming in, right. But I could imagine, you know, like, focusing more on it, and it's literally like having a
(02:08):
newborn, you know, where you're just constantly doing stuff, you're constantly, you know, changing and whatnot, right? Like, it's hectic, from what I understand, you know.
Speaker 2 (02:21):
Yeah, for sure.
And for me, it's just constantly thinking about how to move the ball forward, what needs to happen next, concern about the problems that are out there and how you can solve them faster. So your brain's constantly going, but that's also the exciting part of it, right? So you're solving new problems and helping people out, so it's a huge motivating factor to spend that time.
Speaker 1 (02:54):
Yeah, it's fascinating, because it takes a certain kind of mentality to run a startup. I feel like, to run a startup and be somewhat successful, bring in revenue, of course, you have to have that vision and know what needs to get done, get it done, and all that sort of stuff. You're kind of like raising a child as you're raising this business, so to speak. One that can step into any problem, make
(03:22):
a quick decision and move on to the next problem, right? And maybe the decision you make is wrong, and that's fine. You're the one that takes ownership of it. You're the one that, you know, lives with it and whatnot. Maybe you readjust, recalibrate, make another decision to make the right choice and whatnot. But I think that that's where a lot of people get hung up, right? And I see that even, you know, in nine-to-fives, right,
(03:47):
where people are mulling over, you know, different scenarios and whatnot. You know, for myself, like, I go into it and it's like, you know, a five-minute decision. This is what we're doing, this is what I think we should do, this is why I think we should do it, you know. And if someone else disagrees with me, that's totally cool, I'm not tied to it
(04:08):
or anything. But still, having that mentality really pushes things along, even if it's an incorrect answer or decision. At least you learned, like, hey, that didn't work over there, right? For sure, I'm going to be open about that.
Speaker 2 (04:18):
Yeah, you definitely have to be impatient, I think, and also, like you said, versatile, and want to do all the things. I enjoy that. Like, in my role, you know, being able to jump from a deep technical conversation to a sales call, to all those things. And then, yeah, all that background of impatience that comes to it, and making, you know, smart, quick decisions, and then, like you said, being able to react: you know, what did we learn?
(04:39):
And then having a great team around you that can also move fast and react. That sort of always is what's driven me to a startup-like environment at the end of the day, because you like to move quick, you like to see that progress and the output of your work. You know, constant reinforcement that you're doing great things.
Speaker 1 (05:03):
How do you manage being highly technical and doing sales as well? Because it's like two different sides of the brain, almost, right? Like, that's how different the two things are. But then a good salesperson, or a good tech person that does sales, is able to meld between the two. And it took me a long time to kind of figure it out, and I think the best way that I was able to kind of get that thought
(05:25):
into my head of how it should be is how I, as an end user, expect to interact with a vendor, right? What am I looking for from them? I'm looking for straight-up answers. I'm not looking for, you know, oh yeah, we can do everything under the sun. I'm looking for, you know, the stuff that they can deliver and whatnot.
(05:45):
And so I kind of factor that in from, like, a sales side, where, you know, there's also, like, a business philosophy where it's like, you say yes now and then you figure out how to do it later. Yeah, but, like, you have to be able to make that quick five-second decision of, is this even something that I can deliver, you know, and then start working towards it, even if you can't deliver it today.
Speaker 2 (06:06):
Although, you know, that's kind of why having that spectrum of ability is helpful, right? So you're making educated decisions. In that, there's a lot of context switching, and that's, you know, I tell people I meet, that's a muscle you can build if you don't think you have it. It's just like lifting weights, you know, and you can work on moving between contexts, and it's hard at first and tiring, and you get better at it and better at it.
(06:27):
And also, I think I'm maybe benefiting from where, you know, we're in the cybersecurity industry, and so that sales is a lot of technical sales. And back to exactly what you said, you know, we look to provide value to customers in any conversation that we're having, right? So, you know, what is your problem, what are you dealing with? How are we helping? Is there something else going on? So you're bringing that technical expertise to that
(06:50):
sales call. At the end of the day, you kind of treat them all like, you know, partners with you, and, you know, a collaboration of the good guys versus the bad guys, and then it kind of gets easier at the end of the day.
Speaker 1 (06:59):
Yeah, that makes sense. Well, Jason, you know, we kind of went off the deep end right into this business stuff, but we didn't hear anything about your background. So, you know, why don't we backpedal a little bit and, you know, start with your background, right? Like, what made you want to get into IT? Did you have experience with it, you know, previously? Or, you know, what does that journey look like?
Speaker 2 (07:18):
Yeah for sure.
So, growing up, loved computers, started programming at an early age, I think on an Apple II, so don't age me there. And then, you know, I just wanted to program, work on computers. Went to college, graduated in the 90s, right before the dot-com boom and the Y2K stuff, and then had a technical career for a while, but always interested in product,
(07:40):
and then got interested in business and, thankfully,
(08:13):
through a series of promotions and career changes, was able to kind of see the entirety of a business.
(08:17):
And then, you know, before Invary, you know, I worked for Matterport from, like, their series D to their IPO. So, spatial data company, got a chance to do a lot of product engineering, temporary CISO, and yeah, it's just that broad experience that kind of led me to the CEO role here at Invary. So I'm a technical guy, a programmer.
Speaker 1 (08:31):
And probably the discipline along the way, you kind of learned it, maybe, as you went, right? Is that fair to say?
Speaker 2 (08:31):
Yeah, and the products that we built.
So, you know, a long time ago I was at Motorola, in their home division, working on around six product lines, IoT and networking, and there's always a security component in that and what you're doing. So it's kind of built into you, right? In your genes and your DNA, so to speak. And then, you know, went on and did more consumer-facing IoT
(08:54):
work, and then, you know, two aspects of security there. There's home security, so we're securing people's, you know, physical property and their life. But then the software itself obviously needs to be highly secure. There's a lot of private information there, so we put a lot of attention and focus and learning into that.
And then at Matterport, if you don't know, it's a spatial data company, so they digitize physical places into 3D data,
(09:17):
a lot of it, and, as the responsible person for their platform, I had big customers saying, I have very important information in your system, tell me why that's secure, right? So, again, I think it's probably good for any organization to have the security practices built into all of the operations, especially in engineering. And, yes, there are specific security roles and cyber roles and CISOs and things,
(09:40):
but if you create a culture around security, it helps, and I certainly think we did that, and that's how I learned and ended up ultimately being responsible, and eventually running a cyber company.
Speaker 1 (09:52):
Yeah, I feel like a lot of companies make the mistake of not really establishing a security culture around their security program, and they just start kind of dictating, you know, different security principles and things that they want, you know, done and secured a certain way. Right, like, it's good in theory, you know, because you're
(10:14):
thinking from, like, a technical perspective and whatnot, but, you know, when it comes down to it, like, that's a difficult way to go about it. You know, there's easier ways, I feel.
Speaker 2 (10:26):
Yeah, for sure.
And at the end of the day, you know, your customers deserve it and will demand it, and so, you know, you have to make sure it's embedded inside your processes, your culture, like I said, to make sure that you're serving them correctly. And I think if you have that attitude about it, you know, you can do well. And, you know, sometimes a mandated checklist, you know,
(10:47):
those are just, hey, the box is checked. A little different than, actually, you know, for me, if I'm standing in front of a customer and they're saying, you know, how do you do X, Y or Z, I want to be able to explain that to them, have them have that confidence in the organization and what I'm saying. It's super important, much different than, like, a checkbox on a compliance checklist.
Speaker 1 (11:07):
Okay, it's
interesting.
Yeah, you know, I feel like the industry as a whole is kind of, you know, in limbo right now, almost, right? Everyone kind of wants to push forward to this new technology of, like, AI, right, but security hasn't really caught up with,
(11:28):
how do we do good security around AI, right? Is that something that you're also seeing in the industry?
Speaker 2 (11:36):
Yeah, we talk to partners about it. I mean, at Invary, what we do is we ensure that the core of the system is free from advanced persistent threats, so your operating system, your kernel, down to your hardware. And, you know, in the AI space a lot of people are, like, you know, trying to figure out, what do I do to secure this? And, you know, the data, the inference data, the metadata,
(11:57):
your training data, it's one of the most valuable assets a company has, and, you know, maybe they haven't put a lot of thought into what am I doing to deter that. And so we work with DoD, intel customers, and then commercial customers, so some of the most sensitive AI frameworks, and we learn from them and how they use our technology to ensure that.
(12:18):
But I do get a lot of questions. I think you're right: beyond AI, I think the problem that we're facing is our attack surface is just expanding exponentially, and I'm sure you're familiar, and I talk about it a lot, the threat cycle, where the attack surface expands, and then the attackers find a way to exploit it, and then we as defenders figure out they did that, try to stop them, but then
(12:39):
we're repeating the cycle already because something else has changed. And so getting in front of that is a lot of what we talk about at Invary, and what we need to do. I think it's very relatable to your comment there. I want AI, the features and power of it are amazing, right, but if I don't secure it, you know, a customer's put inference data in there that has their personal information and it gets stolen.
(13:00):
Somebody steals my weights or messes up my weights, and now it's giving out bad, improper information. All kinds of things can go wrong.
Speaker 1 (13:07):
So there's a lot to do, for sure. Yeah, talk to me about your, not your experience with the government, but, like, working with them, right, as, like, an outside party. I have a little bit of experience with that, and it was always very interesting seeing, like, who tries to, you know,
(13:27):
push you around a little bit, and, like, the levers that are being pulled and the locations that you go to. I thought the locations part was probably the most fun part for me, right? Like, okay, flying to this airport, drive these directions, you know, they don't even give you an address, and, yeah, you show up and you're like, am I at the right place?
Speaker 2 (13:47):
Sure. I mean, so, at Invary, you know, our software is based on a license that we have of technology from the NSA, and then we have a collaborative research agreement with their Laboratory for Advanced Cybersecurity Research. On top of that we also, of course, do business with entities in the government, sometimes through a prime, so, like, I probably shouldn't say which, but the large primes that are out there, into those programs.
(14:09):
So a lot of experience across that board, I would say. And, to be honest with you, in my background we've done work in government before, and supporting things at those different companies I mentioned. But here it's a lot more intimate, and with the NSA relationship and our collaboration agreement, it's really, I feel, like the good guys collaborating, meaning that,
(14:31):
coming into it, I wasn't sure what to expect, but they're really just security engineers there to kind of make sure we're doing the right things with the technology and that we're helping each other. I find that very rewarding. I think that the goal is to protect all of us, right, and that's sort of what's driving the research, the conversation, the work that we do, and everything else is kind of
(14:51):
secondary to it. So I found that interesting. And then, you know, on working with the customer: they are very thorough. You know, there's weird timelines and systems and things that you have to do, and we can get into all of that. I kind of think about it, though, as kind of walking through mud, where you're always making progress, it's slow, but they also, you know, some of the brightest
(15:15):
individuals are in those programs, really looking at the scope of things. And I think a lot of the requirements, the compliances that we in the commercial world have to adhere to, are coming out of this work that's happening in a lot of these places, where we're thinking about it to the next level, figuring out how to make it scale, economical, efficient, and then bringing those things back. And it's a community, the commercial world does it too.
(15:36):
But that's kind of what I see when I work with these entities is, you know, obviously it's a serious thing, but it's also a partnership where we're working together, sharing ideas. Sometimes it's even a disagreement. So a lot of times we talk about the walls that we put in place to secure our infrastructure versus the verification that those walls are doing what they're doing.
(15:56):
There's a relationship there: nobody can get to my kernel, versus a zero trust world where maybe somebody could, so we better verify that a persistent threat actor isn't there. Great conversations to have, and that's kind of been my experience, you know, working in those communities.
Speaker 1 (16:14):
So what is NSA-licensed tech? I think that's the term that you used there. Well, what is that? I've never heard of that before.
Speaker 2 (16:24):
Yeah, and I don't know if your listeners have heard of any kind of tech transfer. So normally you think about it in terms of a university. So the university, a professor, will do some research and come up with some patents, and that's their IP, and you're like, hey, I think I can make a product out of that. And then you license, essentially, those patents from them and you pay a royalty back. Turns out the NSA has a tech transfer office as well.
(16:46):
So you can go Google-search NSA tech transfer, and there's a series of patents and things that they have out there that they're willing to license to individuals. So part of our founding team is a professor of EECS at the University of Kansas who does work in trusted computing, sometimes with DARPA, sometimes with NSA, and then our CTO, who has a PhD
(17:07):
in this space. So we were familiar when this technology came up and they said, hey, we're willing to license this out. And there, again, you create a relationship that says, hey, I have a business plan, I think I can use this technology for the greater good, and then you just negotiate a license. They get a small royalty back on money that we make, as an example. And from there, sometimes, they don't always have a collaborative research agreement; we happen to do that as well, in the space that we're in, trusted mechanisms.
(17:27):
But yeah, it's just like going to a university, but you go to a government entity as well. And, to be honest, again, before this I would have been like, I don't know about that. I would do it again in a heartbeat.
(17:48):
Very, very smart, bright people, very great technology, and in our case, when I looked at it, I'm like, we find threats that are underneath a lot of things. Right, our demonstration is usually somebody running a bunch of security vendors, and we can still get past that, and to me I was like, this needs to get out there. Right, this is a technology that needs commercial scale, and, you know, that's part of their mission too.
(18:12):
So it was a great, I hate to use the word synergy, but a great synergy, I think, in what they were able to offer and what we were able to do.
Speaker 1 (18:26):
Yeah, that's really interesting, because you're talking about one of the top one or two intelligence agencies in the world, and they kind of have that collaborative experience with the public sector, or private sector, which is fascinating, right? Because typically when you hear about something like that, you're hearing about it from,
(18:50):
you know, one of Israel's Unit 8200 people, right, that left the service, allegedly, and now they formed this company and they're doing, like, amazing things. You know, that's the closest thing that you really hear of that, you know, in this space, in this world. I'm going to have to look into, you know, their patents and their offerings, because I'm actually getting my PhD right now in satellite
(19:15):
security, and a part of it is actually utilizing zero trust in a secured architecture format to secure communications within the satellites, preparing it for, like, quantum encryption, you know, a post-quantum world, right. So getting that would be very helpful, because I'm sure that they've already looked at something like that,
(19:36):
right, and they kind of just need someone to run with it to some extent, and that's definitely something that I'm looking to do, right. Because now that I'm in my PhD and I'm doing the research and everything, I always think, you know, five years out, right. So now I'm thinking of, okay, well, what do I do with this? You know, where do I see this going, and stuff like that, right? So that's all a part of it as well.
Speaker 2 (19:58):
Yeah, for sure. I mean, a lot of need for what you're doing, so thank you for doing that, number one. And, you know, to give you a sense of, like, the people behind this stuff: you know, they are also the originators of Security-Enhanced Linux, SELinux, which went to open source, and so really they're just security engineers, like I said. And then, this is my view, so I don't represent them at all, but I don't know
(20:20):
what rules or restrictions they have around publicizing what they do. I know it's not as easy as it may be at a university, especially a private one, where you control that, and so I think it's just maybe a lack of exposure, and they need more exposure to create that benefit for more people. But yeah, definitely worth checking out. And, to be honest with you, I'm sure the other agencies have
(20:41):
similar programs, not as familiar with them, but I would definitely look at those opportunities as well.
Speaker 1 (20:48):
Yeah, probably at least, like, the NSA and DARPA. You know, that would be, uh, that'd be really interesting. Yeah, and you bring up SELinux there, kind of takes me back almost. Uh, I don't know, it was an interesting experience, you know, like, fresh out of college, handed this project and it has SELinux on it,
(21:09):
and it was like, oh, what's SELinux? Right, went down a rabbit hole for 18 months learning SELinux. So by the end of it, it was like, oh, yeah, you have to run, you know, these 15 commands to do this thing. Whenever someone at my company ran into an issue, they were like, just go to Joe. Like, even the developers that were supposed to be deploying and, you know, integrating SELinux
(21:29):
into our product, like, they would come to me and, you know, be like, hey, like, what's going on here? Like, oh yeah, well, you guys broke it this way, so you have to run these things and it'll allow it through. Like, what a powerful solution, you know, in terms of just application security, server security overall. You know, like, I mean, that's amazing, what they built, and then
(21:53):
they released it out into the public. I couldn't believe it when I fully understood it.
Speaker 2 (21:58):
Yeah, for sure, and it's got its own language to it, kind of, right? So it is extensive, and I think that shows. I've used the phrase trusted mechanisms a lot, and again, my opinion, not theirs, but you look at: how are we assuring
(22:18):
the things that we are relying on? We'll talk about zero trust in a minute, to kind of go back to your PhD work, some interesting thoughts there. And so I think of SELinux as, like, a lot of work that they put in for boot-time security. Right, they're like, well, you know, I need some assurance right at boot that I can trust kind of what's happened. And then, you know, the technology we licensed was called kernel integrity measurement, we call it runtime integrity, but it essentially extends that to runtime, saying, are the things that I believe to be true about this particular operating system always true? And so you're constantly challenging yourself, back to zero trust: what are my assumptions, right? So it turns out, for a typical security vendor, there is a really bad assumption that, like, the operating system is okay, because I have to ask it a bunch of information, right, you know?
(23:04):
Give me the logs. Well, did an attacker in the middle alter those logs? You don't know. Tell me when a file was opened. Did it not tell you that a particular file was opened? You don't know. And so, even, you know, going back to satellites and zero trust, a lot of times people think about authorization and authentication. Of course those are important, but you forget about assets and resources, right? So you can keep going down, and then you're
(23:26):
into that whole root-of-trust program, and then we can talk about confidential computing and TPMs and things like that. All grown out of the thinking that I think was the genesis of SELinux.
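Jason's point here, that a compromised kernel can silently alter the answers it hands back, including its own logs, is the classic argument for tamper evidence. As a purely illustrative aside (this is a generic technique, not the NSA-licensed measurement and not anything Invary ships), here is a minimal hash-chain sketch in Python showing why an edit to a chained log is detectable while an edit to a plain log is not. All names and log messages below are hypothetical.

```python
import hashlib
import json

def chain_entry(prev_digest: str, message: str) -> dict:
    """Build an append-only log entry that commits to the previous entry's digest."""
    record = {"prev": prev_digest, "message": message}
    record["digest"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

def verify_chain(entries: list[dict], genesis: str = "0" * 64) -> bool:
    """Recompute every digest; an edited, dropped, or reordered entry breaks the chain."""
    prev = genesis
    for entry in entries:
        expected = hashlib.sha256(
            json.dumps({"prev": prev, "message": entry["message"]}, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev"] != prev or entry["digest"] != expected:
            return False
        prev = entry["digest"]
    return True

# Hypothetical usage: build a short chain, then tamper with one entry.
log, prev = [], "0" * 64
for msg in ["file /etc/shadow opened", "user jdoe logged in", "sshd restarted"]:
    entry = chain_entry(prev, msg)
    log.append(entry)
    prev = entry["digest"]

print(verify_chain(log))              # True: untouched chain verifies
log[1]["message"] = "user jdoe logged out"
print(verify_chain(log))              # False: the alteration is detectable
```

Of course, the chain only proves tampering if the latest digest is anchored somewhere the attacker cannot reach, which is exactly the root-of-trust point raised above; a threat actor who controls both the kernel and the chain head can rewrite everything.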
Speaker 1 (23:38):
Yeah, yeah. You know, I remember when zero trust, when that term was starting to be thrown around, and I think, maybe to some extent, Zscaler, like, maybe coined the term or made it popular, right? Yeah, and then it quickly turned into a framework, right? At least I view it as a framework, I think it's a framework.
(23:58):
You know, the, uh, the NIST documentation covers it now, so I think it is a framework now, right? But, you know, with my research, it's almost like going down a rabbit hole. It's like, well, how zero trust do I need to make this thing, right, to make it secure and completely trustworthy? Do I need a tiered architecture with the satellites, right,
(24:20):
where you have, like, a ring of satellites around the planet, you know, that then communicates to a higher-level ring that has authentication mechanisms of their own, you know, that can authenticate these satellites? Well, how do I verify it? Well, that's going to talk to a dish, you know, back on Earth, right, and that has to have its own verification system. It's, you know, it's interesting, right, because my
(24:43):
background, I got my bachelor's in criminal justice and then I went and got my master's in cybersecurity, and, you know, like you said, right: well, what's the authenticity of that log file? What's the authenticity of this system? Did it change, did it get removed, or anything like that? During my master's, because it was a very hands-on program, I
(25:05):
mean, we learned about how to modify all that stuff, right? Like, we learned about the importance of log files, and my professor immediately said, and all that goes out the window. When you're hacking something, this is exactly what you do to modify it and make it look the way that you want it to look. And I mean, I'm being taught by, you know, NSA hackers and, I'd say, people that do forensics, I mean, the smartest people on the
(25:27):
planet, you know, and they're literally saying, yeah, like, last week I was doing this on a target and this is how I overrode it. They'll never know any different because they don't do this other check over here, and so you have to have all these checks in place. It's fascinating, fascinating stuff, how deep you can go with this technology, with this space. There's almost no limits.
Speaker 2 (25:48):
Yeah, I know, you know, there's a couple of things that stick out in my mind there. So we're advised by a retired NSA individual who did a lot of the inventions that we talked about, and he likes to remind me that you have to think about how a computer works. And it's actually interesting how many cyber professionals and even software engineers sometimes don't understand how a computer works,
(26:10):
down to disk and memory and the kernel itself and how it's getting that information, to know that an attacker can circumvent you in these ways. And it gets very esoteric. The flip side of that coin, and I'll toss it back to you, is I think it's our job to simplify this for the enterprise and government space, right, so it can get extended, right? So, how do I ensure the authenticity of a particular
(26:33):
thing, and implement zero trust, in an economical way? And that's kind of what we look at: from an organization of 20 people to an organization of a million people, or however large you want to get, how can we scale that and solve those problems for them super fast? So, as you're going through your research, I'm sure you're thinking about that too,
(26:54):
right? Like, you can do it once, but can you do it n times and scale that?
Speaker 1 (26:56):
Right, yeah, that's kind of that bridge between the government and the private sector. The government kind of, I mean, they'll create something, they'll spend millions of dollars to use it once on one target, or develop a whole framework, or an attack framework, off of something,
(27:16):
right? They kind of pass it back, from what it sounds like, right, with what you're saying, with the licensing or the patented, you know, technologies that they have, right? They then turn it back and give it to the private sector to say, make this scalable, right? Like, we don't have that kind of, maybe not that they don't have that capability, right, because
(27:38):
when I think of them, I'm thinking of smarts, intellect, you know, that sort of thing, right? But their mission set isn't to make it scalable for everyone. Their mission set is to make it and work to protect America. And then, from there, what goes on with it is beyond them, right? And that's a really, that's a fascinating bridge that not many
(27:59):
people talk about or even know about. And to your point, right, with understanding the underlying, you know, processes of computers and how it kind of operates, right? I'm not that deep, right, but I know enough to where I can start pointing in the right direction and things like that. And I have a friend, he's very experienced in
(28:24):
security, been in security longer than me for sure. He said, as security professionals, we typically earn our paycheck maybe two to three times a year. And what that means is that when no one else in your organization can figure out what's going on, it's typically the security guy that is the one that has to step in, because they understand the networking perspective, they understand the systems, the database, they understand the kernel level.
(28:47):
Those are very difficult things to understand, and the security person is the one that's typically pulling it all together, tying all the dots together, right? And that's a difficult thing, and you really only learn that by going deep into these areas, I feel like, early on in your career, to build that foundation.
(29:07):
You know, like, for myself, I didn't really understand virtualization until I sat down with an engineer and I said, okay, how is this tied together? Like, actually show me the port, the socket that it is using. How do I terminate a socket? All that sort of stuff, you know, going through it. And I mean, that took me a couple of years, right, to really understand it and, you know, be able to make
(29:33):
my own troubleshooting deductions from it, you know. Right, but it's a step that's really required if you want to be successful in this space.
Speaker 2 (29:43):
Yeah, I mean, in there, you know, the interrelationships between all the systems, and, as a security professional, having to understand them does give you a leg up in a lot of, you know, what's going on in the environment, and helping, you know, lots of people out, a hundred percent. Um, so it's more than just how a computer works, I think that's a good point.
(30:04):
Right, it's, how does the system work together, with all the computers in it, right? Yeah, and then being able to make those determinations. And then, you know, at least for me too, it's complex, right? So you're like, oh my gosh, I need to make it better. And then, you know, there are folks out there, but it's sometimes hard to hire, you know, very exceptional professionals that can do that.
(30:27):
So how can we help these people? And I think that's the problem that we face today, that Invary tries to solve, that lots of other people are trying to solve. You know, I talk to CISOs all the time, and they're busy and stressed all the time, and they have a lot of pressure on them, you know, at the same time. And so clearly, as an industry, we
(30:49):
have a lot of work to do to make that easier for them and stay on top of the innovation that's happening. So, all the way back to AI, right? Like, as a CISO, I was like, oh, I've got to have AI, it's so powerful for, you know, maybe a large retail chain, I have to do it. But now there's this whole other set of interrelationships and problems I haven't
(31:11):
thought about in zero trust. And boy, what am I going to do to put that together? I think that's great, as you're going through your PhD work, you know, as we're thinking about how do we scale this for, you know, commercial use, and the technology license, those things that go into it, which is: how am I making this easier? How am I solving exponential problems for those people? Um, so we can get in front. I don't know if you're a Star Wars fan.
(31:32):
Are you Star Wars? Okay, so the scene where they're in the trash compactor and Leia is yelling, get on top of it. Almost every day I'm talking to somebody and they're underwater, and I'm thinking about that line of, like, get on top of it so you're not buried in the trash, right? Yeah, you know, the behaviors that we do are interesting, and
(31:58):
really what attracted me to the technology when I saw it at the NSA is, instead of, like, looking for bad things, which we need to do, this kind of flipped the script, and this thing says, no, I'm just telling you that everything is doing what it was supposed to be doing. And so, of course, the absence of that means that something bad has happened. But what that means is I don't have to go look for a thousand unique threats, I have to just go look for one alteration to proper behavior. And so that's how our minds think and work, is like:
(32:21):
am I solving these security problems for the world? How can I apply different frameworks, techniques, thoughts around zero trust in such a way that reduces the number of tools, the amount of time, the amount of decision-making that these folks have to have?
Speaker 1 (32:36):
Yeah, yeah, that's really interesting, you know, staying on top of the technology evolution, right? That reminds me, I was doing some consulting work on the side with a state-run, you know, government institution, and they wanted to have an AI model that
(32:57):
ingests all of the safety infrastructure, safety, security, of their state, and they wanted to have it to where you could ask it questions of, what are the top five bridges that are at the highest risk of collapsing within the next three years, so
(33:17):
they can prioritize properly, right? Because they were getting all of this data in, but by the time they would get to the critical stuff, it might have been too late, it might have caused more issues, they might have been spending more money when they shouldn't have been. And so they were creating it, and they said, yeah, we're almost done with it, but we just
(33:37):
realized we didn't even, like, check if we secured it. Like, we kind of need to create this as a copy-and-paste sort of thing, where it's like, hey, you just copy this GCP project and you paste it over here, and it has all the security controls built in and whatnot. And so we need you to come in and tell us, you know, how to do it best, right? And it's a really, I mean, it's
(33:59):
an interesting project, right, because where else am I going to get that sort of experience? You know, what other opportunity out there will I get to look at an AI model that's doing that sort of thing and see, how do I secure it, how are you authenticating to it, and all that sort of stuff? It's really fascinating. And, to your point too, trying to stay
(34:19):
on top of it, that's like a day job in and of itself, just trying to stay on top of this stuff.
Speaker 2 (34:27):
Cyber is a big industry, and it's good for a company in cyber, but really it's bad for us as a world, and a waste of GDP in some cases, and so it keeps getting worse. I don't know if that's exactly true, but you and I could probably find statistics that kind of show that things are getting worse. But I think it just comes down to, what can we do to be in front of the attackers?
(34:48):
And there's a lot of things. It goes from security by design, so, kind of what you're saying, I shouldn't have maybe made the system before I secured it, but that has to be easier. I mean, there's a reason people aren't doing it: it's expensive, it's hard, maybe it's not taught. So there's that part. And then the solutions that we have have to be innovative in such a way where I don't have to have tools upon tools and spend enormous amounts of energy and time to figure things out.
(35:11):
So, you know, I'm very interested. You know, AI is a big portion of that, right, both protecting it and then using it for those things. But to me that's the challenge. We have to, A, also communicate better as the good guys, right? I get to work with universities and government entities and commercial entities all the time, so I'm lucky. I don't think lots of people have that opportunity.
(35:32):
And, like you, you sound like, you know, you had that luck in your experience too. So how can we get your listeners, more people, to be involved with, you know, larger groups of people in a non-competitive way, where we're all just trying to get ahead, to get on top of it, stay on top of it?
Speaker 1 (35:57):
Yeah, we've kind of danced around it, but why don't we circle back, so to speak, to your company, right? What's the name of your company? What are you guys trying to solve, and how are you doing it?
Speaker 2 (36:00):
Yeah, sure. So it's Invary, short for invariance, and we ensure the security and confidentiality of systems, and we start at the operating system, because that's the core of assumptions being made that aren't validated today, down, as I mentioned, to hardware, and validating that. For instance, if you're aware of trusted execution environments and confidential computing, they are meant to
(36:20):
ensure that when you're on a VM, only users of that VM can see that memory, and it's encrypted in such a way where the host or other VMs on that machine can't see it. Is that hardware doing what it's supposed to be doing? So we do a lot of proper-behavior attestation at runtime, so a lot of that boot-time security we extend to runtime, again based on, you know, that technology that we licensed from
(36:41):
the NSA. And then, real quick, on how we do it at a high level, I'll talk about the operating system for a second. So it turns out that we can map, in a graph data structure, behaviors of an operating system. It's about a million data points of data structure objects, relationships, code sequences, and I think about them as
(37:01):
constellations. So you think about a million data points, that's a lot, but if you think about, like, our night sky constellations, the Big Dipper: I know where it's at, I know what it should look like. It has an invariance to it, right? So if I looked up and it had changed location, changed shape, something's gone terribly wrong, but we probably don't have a lot to worry about there. So it turns out software has some of those invariants
(37:24):
built into its design. So the kernel is designed in such a way that it has these invariants in it that you can map. A lot of the IP that we've got from the NSA has that definition of those invariances. So we baseline an operating system, and then at runtime, we never have to have seen that machine before, we grab that same map very efficiently, pull it up, and then essentially compare
(37:45):
the constellations and see, hey, are they doing anything different? And if so, that difference is almost assuredly an attacker implanting themselves in the middle of one of those data structures.
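For readers who want the comparison step in concrete terms, here is a minimal sketch, assuming the baseline and runtime "constellations" have already been collected as name-to-fingerprint pairs. The invariant names and measured values below are hypothetical, and the real measurement happens against live kernel memory; this is only a toy illustration of the compare, not Invary's implementation.

```python
import hashlib

def fingerprint(items: list[str]) -> str:
    """Collapse one ordered measurement (e.g. a code sequence) into a single digest."""
    return hashlib.sha256("\n".join(items).encode()).hexdigest()

def compare_constellations(baseline: dict[str, str], runtime: dict[str, str]) -> list[str]:
    """Report every invariant that moved: changed, missing, or unexpected."""
    findings = []
    for name, expected in baseline.items():
        observed = runtime.get(name)
        if observed is None:
            findings.append(f"{name}: missing at runtime")
        elif observed != expected:
            findings.append(f"{name}: measurement changed")
    for name in runtime.keys() - baseline.keys():
        findings.append(f"{name}: not in the baseline")
    return findings

# Hypothetical invariants; real measurements come from kernel memory, not strings.
baseline = {
    "syscall_table": fingerprint(["sys_read", "sys_write", "sys_open"]),
    "sched_class_ops": fingerprint(["enqueue_task", "dequeue_task"]),
}
runtime = {
    "syscall_table": fingerprint(["sys_read", "hooked_sys_write", "sys_open"]),
    "sched_class_ops": fingerprint(["enqueue_task", "dequeue_task"]),
}
print(compare_constellations(baseline, runtime))
# ['syscall_table: measurement changed']  ->  loss of integrity, no threat list needed
```

The useful property is the one described above: the check does not enumerate known threats, it only asks whether anything has moved from its expected shape.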
Speaker 1 (38:02):
Huh, yeah, that's really fascinating. It's interesting how something like this hasn't really existed, for the most part, you know, up until now, right? Because I'm thinking back to when I was, like, managing, you know, Bit9, before they got bought out by VMware and everything else, right, where, you know, you would have to do research into the process, into the service, and how it's
(38:25):
actually, you know, hooking in everything. And I remember sitting there being like, can't someone just put something together that, just, you know, builds a model off of this? Like, there's no way that Microsoft doesn't know what their OS is doing and the expected services and stuff like that. Like, why can't we just, you know, have that and then map
(38:45):
off of that? You know, and when you build it in like that, you know, you're starting with security in a very good place. I mean, how much better can it get from there?
Speaker 2 (39:00):
Yeah, there's other layers to it. If you could, say, push a button and know that your entire system is doing only what it's supposed to be doing, you would do that, right? As opposed to the flip of, like, running a bunch of software to look for bad things. I think traditionally it's been mathematically hard to prove and do. So, like I mentioned, Dr. Alexander, one of our founders, he spends a
(39:21):
lot of his research time mathematically proving attestation and trusted-mechanism techniques, so we know them to be true. And then now you have to make it perform at scale. Again, going back, I don't want to say, or don't know, how much time they spent researching this just for kernels, but it's probably quite a bit, right? It's a lot of IP that went into what to measure, how to measure it. You know, we put our own IP on top of that.
(39:42):
So you do have to make it scale, and I think if you think about bespoke applications, then they get unique, right? So you have to know a little more about them. So now there's a relationship, I think, with the developers of those applications, and metrics that they know are important. But it also turns out, at least from our perspective, the
(40:06):
relationships between applications and the kernel also have an invariance to them. And so, like, I'll use the log4j example: when it got exploited, it used different code paths and sequences that it normally doesn't. Effectively, in my opinion, it violated an invariance of the purpose of the design of that particular application. So, I don't know, are you familiar, I'm sure you're familiar with the log4j exploit from a
(40:27):
few years ago. So you can start thinking about it like that. And then I like to think about it as, you're a really cold person, and then putting layers of blankets on you. I don't have to make you 100% warm right away, but if I can add a lot of value by making you slightly warmer over time, that's worth doing. So start with the kernel. We moved up on Linux to eBPF, which a lot of security and
(40:51):
network optimization companies are using, so we're making sure it stays secure, because it turns out it's being attacked quite a bit. And then moving up to applications, by looking at the relationships with things that we know have strong invariance to them, at the same time. So traditionally it has been very computationally expensive, but fortunately we found a way to make it super performant, which you have to when you run on some of the platforms that we run on. It has to be non-invasive to the mission of that machine.
Speaker 1 (41:14):
Right, yeah, that's really interesting. And then being able to potentially hook into it to create, basically, blocks in the system and secure it further, and things like that. It's a fascinating space. I feel like Bit9 was one of the first players in this space, and it kind of died off, right, and everyone has tried to kind
(41:36):
of build a product around, you know, that sort of functionality, right, of having that in-depth look at your system and building in, you know, like what you said, the invariance of, hey, this is so far, you know, outlying from what's normal for your system, we should look at it more, you know, and building in that sort
(41:58):
of functionality. It's really fascinating.
Speaker 2 (42:02):
It's closely related to anomaly detection, right, which also has a lot of value, but this is deeper: it's knowing what's supposed to happen, right? So sometimes with anomaly detection it's a lot of investigation. I don't know if you've experienced this, like, in my teams, myself, you get a lot of fatigue, right, and so you don't know, right? And so that's, like, for us, we tell our customers, if you get a signal that you don't have
(42:25):
integrity, that is a step one. There's no need to investigate that, right? And we've proven that over and over, and that's kind of baked into the work that came before us as well, so we've benefited from that. But I think that's where we need to get to, where I can tell a forensics person or a response person, this is a fact, essentially, and why I know it's a fact. And then that gives
(42:47):
them a much quicker ability to respond, and then they don't have to worry about the noise in and around it.
Speaker 1 (42:54):
Yeah, yeah, that makes a lot of sense. Well, you know, Jason, we're basically at the top of our time here. You know, it was a fascinating conversation. I think we're definitely going to have to, you know, have you back on and, you know, potentially even have on, you know, some of the advisors that you were mentioning that are doing some of that research. I think that would be really interesting, really fascinating,
(43:14):
to hear from them as well. Yeah, we'd love to do that. Yeah, yeah, absolutely. Well, you know, Jason, before I let you go, how about you tell my audience, you know, where they can find you, if they wanted to connect with you and, you know, learn more about your company and whatnot?
Speaker 2 (43:29):
Yeah, sure. So it's invary.com, I-N-V-A-R-Y dot com. You can reach me at jason@invary.com or info@invary.com. I'm happy to answer any questions or help anybody out.
Speaker 1 (43:39):
Awesome.
Well, thanks everyone.
I hope you enjoyed this episode. Cool.