Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_03 (00:53):
How's it going,
Heather?
It's great to finally get you on the podcast.
It's been an interesting couple months for me, for sure.
Things are just getting ramped up in my life, and lots of changes with two little kids, but I really appreciate you taking the time to come on and talk with me today.
SPEAKER_00 (01:11):
Yeah, Joe, thank you
so much for having me here.
I am excited to share all things cybersecurity and future related.
SPEAKER_03 (01:19):
Yeah, you know, it's interesting that you bring that up, because I always try to coach people to, like, look to the future, and then that's what you should be focusing on now.
Right.
And you know, I think of my own career, and when I was getting into IT, you know, security was kind of in its
(01:41):
infancy of blowing up, right?
Where organizations started to realize, okay, we need to look at this thing, we need to pay attention to it and give it more money and whatnot.
And so I knew IT overall wasn't for me, but I knew security was.
And so I started to work towards getting into security.
And then once I got in, I immediately started looking at
(02:03):
cloud security.
And cloud security was, like, AWS had, I think, 10 or 12 services at the time, you know, like nothing compared to the 200-plus that it has now.
And so I started to look towards getting specialized in cloud security, and now I'm actually getting a PhD in something that is probably gonna be more relevant in five years.
(02:24):
So I bring all that up because it's extremely important, you know, to look towards the future and then start mapping your own career for that.
SPEAKER_00 (02:33):
I think what you're talking about, like how you decided to focus on cloud security.
I mean, when I started in tech, cloud security didn't exist.
We had to go to data centers and do all the coding.
And I had a friend who was a consultant, and he's like, yep, setting up some servers in the rack space.
Those, I don't know.
(02:54):
I used to look on Flickr and find all of those funny images of how people would do their cabling.
Sometimes that cabling was, like, crazy, and sometimes it was really beautiful.
I mean, behind all of the clouds, there is still a rack somewhere, a server somewhere, just who knows where it is.
So yeah, yeah.
(03:15):
Can I ask what you're focusing on for your PhD?
I'm curious.
SPEAKER_03 (03:19):
Yeah, so I'm focusing on deploying the zero trust framework on communication satellites to prepare for post-quantum encryption.
Because right now, there's no real good way to maintain security on satellites once they're launched.
It's kind of like you get what you get when they're on the ground, and that's it.
Because you only have like a 10-to-15-minute window to actually
(03:42):
patch them as they orbit around the earth, and you can, you know, switch ground stations, but switching ground stations mid-upload of a security patch is pretty difficult to do right without something breaking.
So it's just easier to not patch it at that point.
SPEAKER_00 (04:03):
You know, it's really smart to start thinking about, like, the post-quantum world and how we're gonna secure those things.
Because I think at least one of the things that I've been kind of seeing is this fear that a lot of the threat actors are trying to have, like, these data grabs right now, that maybe
(04:23):
they can't crack into them and get that yet, or even like with blockchain or something like that, just having it and, you know, keeping it on some server somewhere because space is so cheap, and then just waiting until the current encryption is cracked through quantum computing.
And yeah, it's gonna be really interesting.
(04:44):
So I'll put on my futurist hat here for a minute.
What I would look for is: what organizations are already thinking about implementing the zero trust framework like you're talking about?
Which ones are already looking at quantum-resistant algorithms, or quantum-resistant, I'm blanking on the word, whatever,
(05:05):
like the NIST guidelines that they have.
Like Apple and Signal are two companies.
And the companies that are using this post-quantum, quantum-resistant security, they're the ones that are going to be less targeted in the future.
And from a user perspective, or, like, a vendor perspective,
(05:28):
you're gonna want to trust those organizations more, versus organizations that are just kind of focusing on other things.
And it doesn't matter who's gonna crack quantum first. It might be China, or the United States, or the United States in combination with the EU.
(05:48):
It's a very real concern that, unfortunately, most of the focus is on all the big fires, you know, all the current drama, the fire that we have to deal with.
Dealing with quantum and understanding it, from both security perspectives and other perspectives, is a must-focus
(06:11):
that is not urgent yet.
And so it's actually a really great example of why you want to think about the future.
Do you want to wait until quantum is a huge fire for you, or do you want to have a little bit of a plan so that you're not taken by surprise when there is a pretty radical quantum breakthrough?
You know, there will be news stories saying that quantum
(06:33):
computing's been reached.
And then, of course, all the hackers and threat actors, and even the nation-state threat actors, are going to start using it for all kinds of stuff.
It'll be like a whole new paradigm shift.
SPEAKER_03 (06:46):
Yeah, you know, as part of my research, I'm talking with a lot of experts in the field, of course, and all of them pretty much agree on the same thing: companies are kind of downplaying it right now, and they're waiting until the last minute, when their urgency should actually be right now, because of the amount of work that is
(07:09):
actually going to be required to prepare their organizations for, you know, the newer algorithms and whatnot, right?
And so a lot of places are gonna be in a situation where, you know, in seven years, when it's actually real, present day, they didn't prepare at all, and now they're in a mad dash to
(07:30):
get it all done, right?
Which is interesting, because the other side of it is, oh, we've been told for the last 20 years that quantum is, you know, five years away, or whatever it might be, right?
But it's a little bit different when you start throwing quantum on the satellites, and then you're using those satellites to connect to ground stations as they orbit, via quantum, like China
(07:55):
did.
Or at least that's what they claim they did.
Uh-huh.
It's a little bit different when that becomes a reality, because it's like, oh, this is being used.
Like, this is real, this is no longer theory, you know. And one expert was actually telling me, you know, a lot of people are looking at the wrong thing to determine if
(08:16):
quantum is mainstream yet or ready to be used.
And, you know, I don't want to give away everything from my research, right?
But he pointed out one key thing, and he's like, everything else just doesn't matter.
If we get this one factor working, everything else just corrects itself.
And he's like, and that's what I'm focused on, right?
(08:37):
And he literally told me, he's like, yeah, it could happen tomorrow.
I say five years because I think it'll actually be three, but I give myself more time, you know, to actually make it happen.
But it's interesting, because it's a completely new field, or new domain, within security. Like,
(09:00):
you know, even when I'm reading these research papers on how quantum cryptography works, I have to read them like 10 times and then go talk to the person that wrote it, and the person that wrote it still, you know, isn't quite able to explain it to me as, like, you know, a high schooler, right?
Or a grade schooler.
And so it's like, okay, well, this thing is so complex,
(09:22):
you know, that it's going into truly uncharted territory.
SPEAKER_00 (09:27):
You know, I think what we've ended up landing on talking about, you know, people are always like, what does a futurist do?
How do you know what the future is?
Yeah, everyone wants a prediction of what the future is.
We've just been talking about a future right now, one that is gonna be really, really important.
And unlike, you know, AI or the metaverse, or pick whatever
(09:50):
most recent hyped technology, this is really serious and is gonna have a really serious impact.
And it's interesting that it's not being taken as seriously as, say, the metaverse or AI.
But in futurist terms, I would call what we are talking about right now a pocket of the future in the present.
(10:12):
And you're deep in research, talking to folks for whom this is their day-to-day present, and you're thinking about this, and I study this as well.
But that's because we're attracted to this, and we're interested, and we're, like, living in the future in this little moment.
And William Gibson has a quote: the future's
(10:33):
already here, it's just unevenly distributed.
So we're in one of these little unevenly distributed pockets of the future, and we're like, why aren't more people paying attention to this?
Because they're living in some other pocket of the future, or maybe not even the future in the present, whatever, you know.
So, and also, Joe, this isn't the only big future thing that
(10:59):
might be happening.
It feels like that to us, and probably to you, because you're deep in that research.
But, you know, other people are looking at other things, and you gotta realize that they're focused on other things, which is another point.
No one's in control of the future.
(11:19):
No one can say, this is gonna happen, and I have all the resources and I can make it happen. Because you're specifically researching quantum, and there's a whole lot of stuff happening.
There's governments, there's private companies, there's nation states, there's organizations, there's, you know, your impact with satellites, there's impact with
(11:43):
other things. Like, even within this topic area, there's a lot going on.
And so none of the people and the players in that space have control over the trajectory of how quantum's gonna unfold into the present moment.
And that, I think, is a really important understanding.
(12:05):
Like, the future is this kind of fluid, movable thing, but we can influence it.
Like, you and everyone you're talking to in quantum right now have a tiny bit of leverage.
You can kind of leverage it.
You're doing your research, highlighting your points.
You're gonna be able to say, these are the things you might want to have.
(12:25):
Recommendations. You definitely will probably have security recommendations for your security friends.
And you might be able to positively influence the adoption of quantum-resistant technology within your circle.
And that's you being able to take this knowledge and influence the future.
And that's, like, what I'm trying to do, helping people understand
(12:48):
how they can be empowered by knowing this stuff.
And I didn't expect to come in and have such a clear case of: you might not think that you are working on and influencing the future, but you really are, in this specific area that you're passionate about.
So that's awesome.
SPEAKER_03 (13:08):
Yeah.
Yeah, no, it's definitely interesting.
And I have a bad habit of, like, when I get bored, I start trying to challenge myself and learn new things and whatnot.
And so I think that this is kind of what came of it.
But you know, Heather, we kind of just dove right in, right? Without telling, you know, your background, right?
(13:28):
What's your story?
Like, what made you want to get into the IT field overall?
And, you know, it sounds like you're a bit specialized in security, but with some other things as well, being a futurist and whatnot.
And I want to hear all about what that is and what that means.
SPEAKER_00 (13:48):
Okay, so I'm gonna try to keep it short, but I can kind of be a talker sometimes.
When you asked me what got me into this: well, I landed in San Francisco in 1996, and my first job was at a startup.
And so I kind of just got started into that.
And then, like, you know, one of my favorite films that came out shortly afterwards was the original Ghost in the Shell.
(14:10):
And that is set in Japan. It's, you know, a little bit of espionage, really questioning, like, the human identity, the technology identity.
And I've always been really curious about how technology augments and extends and helps us create our personal
(14:32):
identities.
And so I've always been on the internet. And, like, when I was in college, my first website was in 1992, '93, and I had to learn how to code HTML and upload it on a VAX system and stuff like that.
Then doing HTML and Movable Type and blogging, like, all I've
(14:56):
known, all that kind of stuff in the back.
So, understanding how all that goes around.
I worked in tech, and I always found myself attracted to companies that were doing cool new things that were ahead of their time.
They were ahead of their time in that they were not financially successful when I was involved.
(15:16):
Oftentimes, later, they would be really successful.
I was a DHTML evangelist in 1999 and 2000, which ended up being Ajax.
It was a whole new way of how do you build the web, and how do you build interactive web technologies that had a different type of relationship between the client and the server,
(15:39):
rather than back in the old days, when you hit a new web page, you hit the server, and it recreated the web page from scratch.
And so you'd have long load times.
And then, anyway, this technology really changed it.
So it wasn't just about the new technology and what it could do, it was getting people to adopt it and realize, oh, okay, hey, we're gonna adopt it, and it's better, and it's
(16:01):
gonna keep improving.
Let's fast forward.
I got tired of working for startups that were ahead of their time and not making any money.
So I tried to figure out what I could do where I could stay naturally attracted to the future.
And I met someone at a conference, told them what I did, and they said, you're a futurist.
I said, that's crazy.
What's that?
And then a couple of years later, I went back and I got my
(16:22):
master's of science in strategic foresight.
Strategic foresight, or being a futurist, is both an art and a science.
It is studying research and then putting patterns together, understanding patterns, looking at trends, extrapolating them out into the future, creating little stories about the future, understanding that the future hasn't happened
(16:44):
yet, and that if you understand certain trend lines, you might be able to influence them.
Why would you want to do all of that?
I say: because you might want to make better decisions in the present moment.
I like to use a nerdy analogy.
Being a futurist, or teaching someone to think like a futurist, allows you to see the Klingon ship decloak just a little bit
(17:06):
sooner than everyone else.
So you can react to it.
It's almost, in some ways, like, you know how you do tabletop games in order to be familiar with different scenarios?
So when you do have an incident come up, you don't freak out, you're a little bit more relaxed.
Well, you did the scenario planning. You think about these scenarios in order to understand what the future could be.
(17:29):
So as you start to identify, it could be like that, or it could be like that, and then say, oh, is that what we want?
If you're working for a company, or in security, on a specific security thing, you could say, okay, well, if this thing shifts, that might mean we're gonna get more attacks here, or more impact here, or more attention or scrutiny here.
So we can watch that trend and see how that might happen.
(17:53):
Unfortunately, not a lot of people have the kind of bandwidth to be able to think that way, because we're just so busy dealing with fires.
I officially got my re-entrance into cybersecurity as we know it today in about 2014 or '15.
I was doing a project for the US Army. It was about teaching with a
(18:18):
futures teaching module.
And the topic that we used in that teaching module was cybersecurity, because, you know, cyber warfare is the biggest warfare that's happening right now.
And in the process of going through that, of course, I was watching all these DEF CON videos, and I came across Chris Rock's, what was it, how he hacked into a country?
(18:39):
And I thought, who's this crazy guy?
This crazy, audacious guy.
Like, what the hell?
Is it for real?
This just sounds too crazy.
So, yeah, of course I had to learn more.
You know, a few years later, fast forward, I partnered up with a friend of mine, Bob Blakely, and we wrote a paper for the New Security Paradigms
(19:01):
Workshop, which basically took all my strategic foresight methodology and applied it to cybersecurity.
And I just pulled it up over here.
It's called Shifting Paradigms: Using Strategic Foresight to Plan for Security Evolution.
Because I had this theory.
I was like, okay, if we can anticipate what the future could be, maybe I could help security folks secure things better.
(19:24):
And so we came up with all these scenarios, and I came up with 12 new paradigms based on old paradigms.
I'm just gonna pull up a few of them, because they might be interesting.
My proudest moment with this is that we did this research in 2017 and '18, when we were looking at security in
(19:44):
2038.
And at that time, one of the paradigms that was just taken for granted was this idea of usernames and passwords.
And this is also, like, a huge attack vector, because everyone wants to steal credentials to go do bad things with them.
(20:04):
And so at that time, I had the concept of having no more passwords. And, interesting: we were talking about using zero-knowledge proofs and biometrics as a way to potentially reduce the use of passwords and have something that's more secure and more difficult to, you know,
(20:27):
have credential stealing with.
Back in 2018, it was crazy. No one was talking about no passwords.
That was just the way the world worked then.
And now, you know, we have a lot of different options that aren't passwords.
Magic links, zero knowledge, you know, security at the edge, new
(20:47):
concepts.
And that was interesting.
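[Editor's note: the passwordless idea above can be sketched in a few lines. This is a toy challenge-response scheme using a shared secret, not a real zero-knowledge proof or passkey protocol like the ones mentioned; all function names here are illustrative, and the point is only that the secret itself is never transmitted.]

```python
import hashlib
import hmac
import secrets

def issue_challenge() -> bytes:
    """Server side: generate a fresh random nonce for each login attempt."""
    return secrets.token_bytes(32)

def client_response(secret: bytes, challenge: bytes) -> bytes:
    """Client side: MAC the challenge with the secret; the secret never leaves the device."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def server_verify(secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the expected MAC and compare in constant time."""
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Demo: the only things crossing the wire are the challenge and the MAC.
secret = secrets.token_bytes(32)           # provisioned out of band
challenge = issue_challenge()
resp = client_response(secret, challenge)
print(server_verify(secret, challenge, resp))                        # True
print(server_verify(secrets.token_bytes(32), challenge, resp))       # False
```

An attacker who records the exchange gets a one-time nonce and a MAC, neither of which can be replayed against a fresh challenge, which is the property that makes credential-stealing attacks far less useful.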
Some of the other old paradigms and new paradigms that might be interesting: the user is the weakest link in security. And in tech, we always like to blame the user.
They screwed up, they forgot their password, they put their password on a Post-it note.
(21:09):
But what if we built security that actually augmented what humans are really good at, instead of fighting with our human experience?
Anyway, I could talk more about that, but that's kind of how I got into it.
And yeah, I've written some books.
I also like exploring spies and espionage. That kind of was, like,
(21:32):
on the edge.
I'm also really curious about studying disinformation and misinformation and psychological manipulation and, like, social engineering, and everything, though.
What I look at is: I'm motivated to have people be empowered to make more conscious decisions and have more conscious awareness.
(21:53):
And hopefully give, like, all the people, like you and people who are listening to you, extra tools and powers, so that you can be heroes in your own areas, because we can all influence the future in very different ways.
SPEAKER_03 (22:10):
Yeah, that's interesting.
You know, when I think back on my career, when I was trying to do some work that would prepare, you know, my team for the future, with a solution and whatnot, I remember very clearly, right, I wanted to put all of our work on hold and really optimize, you know, the rules that we were
(22:34):
putting into this one particular tool.
Because we were seeing a lot of performance issues, and, you know, our rules were very inefficient.
And my argument was, we should probably have, like, I don't know, maybe 25% or 20% of what we actually have.
That would reduce it to, you know, 50 rules compared to the insane
(22:57):
amount that we had.
And a part of it was actually, you know, me having the vendor's top architect come into the environment, review, you know, every single rule that we created, tell us what the most efficient ones are, tell us which ones we can get rid of and how to do it, and everything.
And when he did, he came back and he said, yeah, you guys only
(23:20):
need like 25 rules, you know, to do everything, and you guys had over 800 at the time.
And the issue was that my manager didn't really see it that way.
He saw it as, well, we need to keep pushing forward, and we'll correct it as we go.
But at the level of work and effort that we were putting
(23:43):
in for, like, current ongoing initiatives, there was no way that we were ever going to go back and actually clean it up or anything.
And I eventually left that company, and I still, you know, kept in touch with people that were there on the team and whatnot.
And the one guy that was left on the team, you know, he
(24:03):
told me, he goes, hey, we're finally, like, doing the cleanup.
He's like, well, I'm actually doing the cleanup.
And I said, oh, okay, you know, let me know how it goes.
And like 18 months later, he says, I finally got done.
Like, now we're running really efficiently.
And I mean, we had so many issues with the solution, just because it was running so inefficiently.
(24:27):
You know, you would make a change, and it would take two to four hours for that change to actually take place, unless you went into the SQL database and, like, manually triggered it.
SPEAKER_01 (24:37):
Wow.
SPEAKER_03 (24:38):
It's like, guys, you understand that this is a really bad way to interact with a SQL database.
SPEAKER_00 (24:44):
So, okay, well, clearly you are of a futurist mind.
I can tell, with your quantum and this.
It is very hard to make the case to a lot of executives in traditional organizations.
And that is, unfortunately, because they are just really
(25:04):
dealing at the fire-drill level.
I mean, I know thinking a little bit about the future opens a little bit more capacity.
I guess if I had to pitch this, I think maybe the most successful way to pitch it would be about increasing internal resiliency.
(25:26):
Increase the resiliency of your systems, right?
Because in the example you brought up, the tool wasn't running efficiently.
And, I don't know, if something had broken or gone wrong, maybe it would have been really bad, right?
Whereas if you had just, yeah, well, if you had just done a little
(25:47):
bit of cleanup, just allocated a little bit of time to do some of these proactive resilience-building things, then you're gonna be able to absorb and recover faster from, you know, stuff that might come in to try to knock it off.
(26:08):
And I think so much of for-profit organizations is just maximizing efficiency, so much, for profit.
And when you're maximizing for efficiency, you are taking away from resiliency.
And efficiency is great, because, boom, it does that one thing
(26:31):
really efficiently. But efficient things are also fragile and not always flexible. And so if something in the world, or the system, or whatever, breaks the thing that this was made efficient for, you've lost
all of it.
(26:52):
And so I think it's really interesting that maybe we're seeing a little bit of a comeback to some resilience, a desire for resiliency.
And I think, like, government and the military maybe get a bad rap for not being very efficient, but they are very resilient, highly resilient, and they have to be. Like, military and
(27:15):
government, they do think about the future, because the future of the country, the future of democracy, the future of capitalism, free-market capitalism.
Like, that's all stuff that, you know, the United States protects. Not just the borders; it protects our beliefs, freedom thinking, like, that kind of stuff.
(27:37):
And that enables capitalism.
And so, I do think, like, I've worked both in the private sector, for big companies, transglobal companies, and for startups.
I've worked on government projects, I've done European government work.
I think there is a lot that the United States private sector
(27:57):
could learn about resiliency from government, versus taking private-sector fragile efficiency and forcing that on government.
I do think some government stuff can be streamlined.
And I think we are starting to see some of that with the adoption of new, like, identity technologies.
(28:17):
It takes forever, but that's what you get with resiliency.
And bureaucracy also generally has a level of transparency at the leadership levels, and even the military has transparency. Like, you know the route you're gonna take to get promoted, or you
(28:37):
know what you need to do to be promoted.
It's not secret.
Well, you don't always know that in the private sector.
Like, you might be like, why'd that person get promoted versus someone else?
So it's interesting to see how these different agencies and organizations work, you know, like the private-sector corporation versus, like, a government agency.
(28:58):
And what's more focused on security? You know, security is a cost area for the private sector; for government, no, that's just part of it; and military, that is what it is, it's security, in a way.
So I think understanding all of that, yeah, and learning,
(29:19):
like, it's not like one is better or worse. It's not like capitalism is bad, and it's not like AI or quantum or humans or technology is bad.
It's about knowing what's the best tool for the job.
And sometimes the human's the best tool for the job, and sometimes technology is the best tool for the job, and sometimes having a free market is a great way to get great new products.
(29:42):
And sometimes you need, like, you know, a very resilient base to have security for everyone.
SPEAKER_03 (29:49):
Yeah, it's fascinating that you bring up resiliency, right?
Because, I mean, obviously, I don't plan anything, right? So we didn't plan to talk about resiliency or anything like that.
But yesterday, you know, AWS had probably their biggest outage ever, which impacted something like 80% of the internet.
So, like, for most users, it was 100% of the internet
(30:12):
that was affected.
And it is just so absurd to me how many people fully rely on AWS US East 1 and put all their stuff into it, when AWS makes it actually pretty easy for you to make your things redundant across regions. Like, it's not terribly
(30:36):
difficult, and I talk about it, like, all the time, right?
Because, you know, a couple years ago I actually got the AWS security specialty certification.
There's a whole section in there that just drills into high availability, redundancy, disaster recovery.
And I mean, you have to know those topics inside and out with
(30:58):
AWS.
Like, you have to be able to, you know, be presented with a failure somewhere and say, yeah, it's a failure there, but we have to look at this problem all the way over here, that no one else would look at, right?
Mm-hmm.
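[Editor's note: the cross-region redundancy point above boils down to a simple fallback shape on the client side. This is a minimal sketch with made-up region ordering and a simulated outage; real AWS multi-region setups would lean on things like Route 53 health checks, S3 cross-region replication, or multi-region database replicas rather than hand-rolled logic like this.]

```python
# Ordered by preference: primary region first, then fallbacks.
REGIONS = ["us-east-1", "us-west-2", "eu-west-1"]

def call_with_failover(fetch, regions=REGIONS):
    """fetch(region) returns a response or raises ConnectionError.
    Try each region in order; return (region, response) from the first
    one that answers, or raise if every region is down."""
    last_error = None
    for region in regions:
        try:
            return region, fetch(region)
        except ConnectionError as exc:
            last_error = exc  # remember the failure and try the next region
    raise RuntimeError("all regions failed") from last_error

def flaky_fetch(region):
    """Simulate the outage scenario: the primary region is down."""
    if region == "us-east-1":
        raise ConnectionError("us-east-1 outage")
    return f"ok from {region}"

region, body = call_with_failover(flaky_fetch)
print(region, body)  # falls through to us-west-2
```

The point of the sketch is that an application pinned to a single region has no `for` loop to fall through; redundancy only helps if the data and the failover path both exist in the second region before the outage.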
And the government actually does an extremely good job of this.
And I tell this example because, early on in my career, I
(31:21):
managed all of our government clients for the company that I was working for.
And so that included me going on site to some really cool places that, you know, the public doesn't even realize are there.
SPEAKER_00 (31:32):
You know, and so you have your fingerprint scanned to get in.
SPEAKER_03 (31:36):
That's like the lowest-tier facility that I was at, right?
So I mean, you know, at this one facility, I was in the middle of the mountains, and I think it was, like, West Virginia.
And, you know, it's this facility just tucked away in between mountains.
It's a giant facility.
And, you know, I'm walking through and I'm asking my
(31:56):
handler, I was like, why are there so many people here?
Because, like, the building is literally as long as you can see, right?
Like, you can stand at one end and not see the other end.
And there's these huge modules of people.
I mean, rows of, like, 50 to 100 people in each row.
And I said, like, you know, is everyone really working on
(32:18):
something different here?
And he goes, yes and no.
All of us in the same row have the same skill set.
We can do each other's job with no issue.
It's not really a big deal.
But we are all working on different things, and we don't know what the other one's working on.
But if they came over to one of us and was like, hey, the guy across the aisle from you, you know, is no longer with us,
(32:41):
you have to do his job and the stuff that you are doing.
This is what it is.
All of them can do it.
And within each module, they're only working with one tech stack.
And the tech stack overall is servicing, you know, the underlying infrastructure of whatever servers and data they're protecting and whatnot.
Well, each module has its own completely separate tech stack.
(33:05):
So, for instance, one module will have an Avaya phone system.
The entire thing will be Avaya, and it will be scaled to the point where everyone in the building could run off of this phone system, but only one module is actually running off of it.
And then the next one over is all of the competitors of that tech stack.
All of them.
Cisco, everyone is in that next one.
(33:27):
And then it just keeps on going down.
Because to them, they're like, we're gonna have high availability, high resiliency, across absolutely everything.
We're gonna get the top five products in every single category that we need for a tech stack.
We're gonna buy them all, and we're gonna deploy them all, and we're gonna hire the experts in; they're gonna be in-house.
(33:49):
You know, like, that's the sort of stuff, right, that organizations should be thinking of.
You know, like, when I go into that building and they're asking me how high availability works for my product, and then they're asking me what's the limitation of it.
You know, their response to me was, okay, well, let's do
(34:13):
five-tier high availability for each module, right?
So we're gonna put 25 of your servers in here.
They're all gonna be servicing different modules.
We'll have five tiers of high availability.
And they did not care what the cost was.
Like, they just straight up didn't care.
Send it over.
And at the end of the year, they would always ask me, is there
(34:36):
anything else I could buy from you?
And I'm just sitting here, like, guys, you buy literally every single SKU we have.
Like, there isn't another thing you can buy.
And they're like, well, can we buy more of it?
Okay.
SPEAKER_00 (34:49):
They probably had an extra, extra budget.
I think that's probably one of the things that was really surprising to me, because I kind of started the early part of my career working for startups.
And all the folks that started the startups, you know, the founders, they all came from the same tech private-sector area.
(35:09):
And then I did some projects for the government, and it feels like it's not efficient.
And in fact, for a while, I was involved with developing and designing decentralized identity, which was a new type of technology designed for privacy, private data sharing, and securing, and stuff like that.
But there was a whole divisioninside of DHS that was looking
(35:34):
to fund startups developing this technology, and their attitude wasn't, we're gonna just pick one.
Their attitude was, we want to support the development of this industry, so we're going to give money, we're gonna fund five projects of X amount, in order to create a
(35:58):
robust marketplace, so the startups will compete with each other, and we consumers will end up with a better product, because they're helping all of these startups create similar solutions using this technology, though each might want to build different products.
That's kind of the best way of using capitalism: to have an idea, develop it, et cetera.
And you win if you make a profit, right?
Money is how you win.
But we end users win from this, and we
(36:43):
don't invest anything into it.
And the government wins, because it ends up having the best, most innovative things come out, versus, say, only investing in Microsoft products, or everyone being on Amazon Web Services.
The diversity is critical to the success, because the
(37:03):
competition pushes everyone: in order to compete, you become better.
So it was just really interesting to see this relationship, how the government enables this competitive marketplace that we then all win from.
But the private sector doesn't even see this.
They are only really focused on winning in their stack or
(37:26):
whatever their product area is.
And so I feel like, again, the private sector could have a little bit more awareness of where they fit in the overall scheme of things.
I'm not saying that they shouldn't make money.
You have to make money to survive; that's the currency of the realm.
But there's more to these technologies than just profit.
(37:52):
And I think there are some other business models that kind of explore that.
So it's just really interesting to see what comes.
And also, you know, the internet and the government: some of these projects, you might think, how are you gonna apply this?
I mean, I was on the internet in the really early days, when it had no user interface.
(38:12):
You had to, like, dial into a BBS, you know.
But it turned out that now I'm in my office, you're in your office, and we're recording a podcast through that same technology.
So it's kind of cool that, 30 years later, something like that happens.
So the way technology evolves is not a direct path.
(38:35):
And that point from today, that trend extrapolated into the future, is not a direct path either.
So you can kind of have an idea of what the future could look like in some ways, but it's really just more of a feeling, like, oh, it could feel like that.
Because if you're familiar with some of
(38:57):
what those futures might be, and I'm thinking about some of the futures that we put in our cybersecurity futures paper, then you can better respond to it.
Oh, I just remembered something.
One of the things that resulted from this paper: I'm writing a new book, Cybersecurity Futures Playbook, kind of a playbook model.
If you are in
(39:20):
cybersecurity and you want to apply some of these ideas, this would be a playbook for you.
So we have general variables for talking about the future.
But working with Bob on this paper, we ended up coming up with cybersecurity-specific variables.
And of course, we got very detailed; there are like 20 or 30 of them, and I just added like 10 more.
(39:43):
But at a high level, they're basically: attack surface variables, are attack surfaces increasing or decreasing?
And you can talk about specific attack surfaces like quantum or IoT.
Attacker effectiveness, are attackers more effective and successful at getting through?
(40:05):
Or are they not as successful?
Is their effectiveness going down, a.k.a. are we better at repelling them or securing things?
And then defender effectiveness: are defenders better able to increase their effectiveness, or is there a new technology that's causing
(40:27):
the defenders to not be as effective?
I think AI is really interesting in this case.
Deepfakes enable threat actors, but AI is also starting to be explored to see how it can be used for pattern recognition, to make things more secure and better for
(40:48):
defenders.
It's not just a magic key for attackers.
So, those kinds of variables.
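Since the playbook isn't published yet, the exact format is unknown; but as a sketch, the three high-level variables named here (attack surface, attacker effectiveness, defender effectiveness) could be captured in a small scoring structure for comparing scenarios. All names and the scoring rule below are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass, field
from enum import Enum

class Trend(Enum):
    """Direction a variable is moving in a given future scenario."""
    DECREASING = -1
    STABLE = 0
    INCREASING = 1

@dataclass
class FutureScenario:
    """One hypothetical cybersecurity future, scored on the three
    high-level variables discussed in the episode."""
    name: str
    attack_surface: Trend          # e.g. quantum or IoT expanding the surface
    attacker_effectiveness: Trend  # are attackers getting through more often?
    defender_effectiveness: Trend  # are defenders repelling more attacks?
    notes: list[str] = field(default_factory=list)

    def pressure(self) -> int:
        """Crude net-pressure score: positive tilts toward attackers,
        negative toward defenders."""
        return (self.attack_surface.value
                + self.attacker_effectiveness.value
                - self.defender_effectiveness.value)

# Example: a deepfake-heavy future in which AI also helps defenders.
deepfake_future = FutureScenario(
    name="AI-assisted social engineering",
    attack_surface=Trend.INCREASING,
    attacker_effectiveness=Trend.INCREASING,
    defender_effectiveness=Trend.INCREASING,
    notes=["deepfakes boost attackers", "AI pattern matching boosts defenders"],
)
print(deepfake_future.pressure())  # net score of +1: attackers slightly ahead
```

A structure like this makes it easy to line up several futures side by side and ask which ones tilt toward attackers, which is roughly the kind of exercise the variables are meant to support.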
And I really want to get some feedback on how these variables, the concepts of attack surface, attacker effectiveness, and defender effectiveness, could be used by you or by the people listening.
(41:09):
And that's one of the reasons why I'm realizing I need to write this book and get it out there.
And I'm interviewing folks to see if some of these theories can practically improve your ability to secure things.
SPEAKER_03 (41:26):
Yeah, it's interesting, you know, because I've talked to a lot of experts, and it's pretty well recognized that most companies secure their environment pretty well, right?
Like the vast majority of them will put the
(41:47):
funds in, get the right tools, deploy them, configure them properly, and everything else like that.
And so attackers realize that.
They realize, okay, if I'm gonna actually attack this company, it's gonna take a huge amount of resources if I do anything technical.
And so they're resorting to other methods, social
(42:08):
engineering, right?
Like, you look at the Caesars Entertainment breach or the MGM breach, whichever one it was, I think it was Caesars, where they just called up the help desk and said, hey, I'm locked out, I need to reset my MFA token.
You know, can I reset my password?
I just don't remember what it was.
SPEAKER_00 (42:25):
But like the Twitter hack, the Twitter one that happened, you know, probably three or four years ago.
I was super impressed by that one, that people got in and changed people's names.
Yeah.
SPEAKER_03 (42:42):
Yeah, it's really relevant too, right?
Because I was working for a company, and it was a regular occurrence that the CEO would call up the CFO and say, hey, send X amount of money to this account, we need to fund this thing or whatever.
It was a pretty regular thing.
So the security team enacted...
(43:03):
Yeah, so the security team enacted a passcode that rotates every month or every week, whatever it was.
And they just told the CFO, look, if we figure out that you don't request this passcode, or if you do it without the passcode, it doesn't matter if the CEO remembers it or not.
(43:23):
You have to get this passcode, and if you don't, you're fired the next day.
Like, the next day, you are fired.
It's done.
It was written into his contract.
And so, you know, one time, an attacker actually called up and deepfaked the CEO's voice, which is very easy.
They just get some earnings call recordings of
(43:45):
his voice and mimic it.
It sounded exactly like the CEO, and it looked like it came from his number, right?
And the CFO was good with everything.
He's like, okay, I'll send to this account; it was like 15 million to this account.
Then he said, okay, what's the passcode?
And they couldn't figure it out.
It completely threw them off.
(44:05):
And the CFO was like, okay, I'll give you one more try with the passcode, and if you can't do it, we're ending this call.
And they couldn't do it.
So he ended the call, and then he forwarded the thing on, right?
Like how he should.
And sure enough, it was an attacker, and that was the only thing keeping them from it.
Because, you know, my paranoid mindset is like, okay, well,
(44:29):
what if everything else fails?
Like, deepfakes are brand new; surely an attacker wouldn't be using a deepfake.
Well, no, let's assume that they are, right?
Let's assume that they are, let's assume that they're competent.
Let's assume everything else fails in the environment.
What's the last thing that we have?
Something you can request, you know?
And it just rotates regularly.
(44:49):
So the CFO even has to go look up what it is at times, you know, within a secured document, on a secured server and whatnot, right?
Wow.
But being able to think ahead and, you know, kind of outsmart or outwit these attackers... because these guys, and I guess I am one of these guys to some extent,
(45:11):
right?
It's just like, you see a problem and you see it as a challenge, and it's like, okay, I'm gonna spend an abnormal amount of time thinking about this, testing it.
Like, I can't tell you the amount of hours that I've spent testing out different things just to see if they would work, and the thousands of failures
(45:33):
that I had.
Like, for me, that's just another day, and people think that they'll just eventually stop.
It's like, no, these guys, when they're set on something, there's no stopping them.
When you pay Chris Rock to go hack into a country's water system, he's getting in, and he doesn't really care how he does it.
Literally, that's the mindset of an attacker.
And I think it
(45:55):
might have been MGM, or it might have been Caesars Palace, where a couple years ago, right before DEF CON, he said our network is too secure, they're never gonna attack us, you're totally secure on our network.
And it's like, dude, you've obviously never spoken to a hacker before.
Because if you use those words with us... like, we have an
(46:18):
unlimited amount of drinks.
Vendors at DEF CON, if I just go say, hey, I work for this company, buy me drinks, they will buy me an unlimited amount of drinks.
And if I want a bottle of something, they will go buy me the bottle, right?
With that, you just challenged the top 40,000 hackers in the world.
(46:39):
Like, we're going to get in.
And then, sure enough, they got breached, and they were down hard for like a week and a half, all because this guy challenged us, right?
And then we came back and did it the next year, and the year after that, and the CISO got fired after the first breach.
It was insane.
SPEAKER_00 (46:59):
I think that's like a lot of people.
I did a research survey once on hacker motivation, because I was with this group of people and they were always like, okay, so they're doing it for the money, or they're doing it for fun.
And I'm like, I know some of these people; I don't
(47:21):
think they're only money motivated.
I think some are.
And yeah, it turned out that there are some people who are money motivated, but there's fame motivation.
There's doing it for the lulz, you know, because you're challenged, like what you're talking about.
It's like, I think this is part of understanding the human nature of the type of people that are drawn to
(47:46):
security, you know.
Like you were saying, I mean, I've had that itch, where you're trying to figure something out, and you're about ready to give up, and then it's like, no, I will stay awake for the next seven hours and figure out how to do this one thing, you know.
There's something that snaps in my brain, that goes from being a normal person to not a normal person.
(48:09):
It's like, I want to figure this out.
And so I try to stay on the normal-person side of the line.
But I think that it's great to have that type of perspective, that kind of neurodiversity, a different way of seeing things.
I mean, that's spies, that's people doing social engineering, people trying to secure things.
(48:29):
You gotta think that way in order to outthink the bad guys, who are, I think, some of the most creative people out there.
I see what they do, and this is a very positive framing, as actually helping us make our systems more secure.
SPEAKER_03 (48:49):
So yeah, no, it's interesting.
I mean, some of the people that I've had on, I jokingly say it, but they even agree with me, that it's literally a national security threat if these people get bored.
Like, there needs to be an organization that just keeps them busy all day long so that they can't figure out how to hack airplanes while they're mid-air
(49:12):
on the plane, start to turn it, and spoof all of the digital controls in the cockpit so the pilots don't know, right?
Like, I've talked to someone that did that, allegedly.
I'm sure he'll be pissed off if I don't say allegedly there.
But it's just, you know, you get bored and your
(49:34):
mind starts going, oh well, how does this entertainment system hook up to the internet?
What else is hooked up to that network?
You know, is there another way to pivot through, right?
It starts going down that path.
You know, like you said, with the spy aspect of it, I feel like the agency really cultivates that sort of mentality as well.
(49:55):
And I've talked to some former spies on the podcast, and they tell me the same thing: sometimes when they have a big target, they start arranging things, preparing their target in non-recognizable ways for when they interact with them.
You know, like they make sure their target, even though they've
(50:17):
never talked to them before or anything, gets invited to this party, right?
Or gets invited to this thing.
And he knows that the target has some sort of addiction or affiliation, something he can entice them with to gain their attention, right?
And he shifts the conversation in a way that makes it feel like
(50:39):
they're not betraying their country.
Like Jim Lawler explained it to me.
He said, you know, the vast majority of people are actually good people.
If you look at the Iranian nuclear scientists, and he literally said this, they don't think that they're on the bad side.
They think that
(51:00):
they're just building, you know, a solution for their country that they love, and they don't believe that Iran would use it on anyone, right?
They're just building it to secure their own country's future.
And you know, he said that he would always approach the
(51:21):
conversation by just level-setting with them.
It's like, hey, yeah, I don't want the world to blow up.
I'm sure you don't want the world to blow up.
You're the one designing the bomb, and I'm sure you don't even think that would ever happen.
Well, let's, you know, make sure that it won't happen.
Can you just tell me a little bit about this facility?
And that's how they tie them all in,
(51:44):
right?
And it makes a lot of sense, because you kind of disarm them before they even know that they need to put up defenses, right?
You're disarming them when they're in Iran, not in Switzerland or Zurich, for instance.
You're disarming them there, where they're like, oh, you're invited to this party, you're this really smart guy, we need you to talk at this thing.
(52:05):
It's like, no, the agency is putting it on; they're hosting it.
They invited you, through someone that you already know and expect things from, right?
It's just a fascinating way of doing it.
It's the same thing with social engineering for hackers.
You know, I'm approaching the help desk as this person.
Maybe I even sound like that person.
(52:26):
Like you mapped his voice,great, because I have his voice
already locked in.
It's already in my deep fake,it's already making the call,
you know, all those sorts ofthings.
But it's a it's a fascinatingworld for sure.
SPEAKER_00 (52:40):
I feel like we could keep talking for hours, because I even told you about this work I just finished on research security, which I'd forgotten I'd been doing.
Also, in my free time right now, I'm actively writing a Cold War spy espionage screenplay.
So I just feel like we could keep talking about this
(53:01):
forever, but we've been going for a while already.
SPEAKER_03 (53:05):
Yeah.
Yeah, we're unfortunately out of time.
And I apologize for going over.
I'm normally a lot better about staying on time.
SPEAKER_00 (53:17):
I think we're having so much fun.
This happens to me with pretty much every conversation.
We just get going into something, and there's just so much fun stuff to talk about.
SPEAKER_03 (53:28):
Yeah.
Well, you know, that just means that I'll have to have you back on sometime.
SPEAKER_00 (53:36):
Yeah, well, anytime.
Anytime, Joe, just let me know.
It's pretty easy talking.
SPEAKER_03 (53:42):
Yeah, absolutely.
Well, before I let you go, how about you tell my audience where they can find you if they want to connect with you, and maybe where they can find some of the research papers or books that you may have put out there.
SPEAKER_00 (53:54):
Yeah, great.
So you can find me on LinkedIn.
That's kind of my more curated professional identity, so it's, you know, how we all are on LinkedIn.
But I post about a lot of different stuff up there.
I also have a Substack; it's the Cybersecurity Futures Substack.
And I kind of try to keep that on cybersecurity futures
(54:16):
related topics, although I'm writing this spy espionage screenplay, so I might talk about that, or movies or things I'm watching.
I've got some books on Amazon.
The Cyber Attack Survival Manual will probably be extremely boring to everyone in your audience; you probably all know everything in it already.
It's more for giving to your mom, or to someone who
(54:39):
is not like us and doesn't know about this stuff.
It does have some really fun stories in there.
This audience might also really like a book I co-wrote a bit of, The Secrets of Spies.
It's kind of a tabletop book with great photos.
You can open to any page and there are stories; it's all about spies and espionage.
And I did a lot of the writing in the book, but I also
(55:01):
did the final chapter, which was the future of espionage.
I put on my futurist hat.
So when they brought me on to do that project, I was like, there are no futures in here.
What's that about?
So my shifting paradigms paper, which might be interesting to folks, is called Shifting Paradigms: Using Strategic Foresight to Plan for Security
(55:26):
Evolution.
It came out in 2018.
There's a copy on ResearchGate.
If you connect with me on LinkedIn and you want to know my work, I'm happy to share it.
I'm also very searchable, although my blog of 21 years just got taken down because of a miscommunication.
So I am trying to come up with a new website with all these
(55:47):
materials and stuff.
So much information can be overwhelming.
And yeah, if you or your audience are curious about any of the things we've talked about, or about the future, please just reach out.
You can get me on LinkedIn; say you heard me on this podcast.
I'm always down for having conversations.
SPEAKER_03 (56:05):
Cool.
Well, awesome.
Yeah, it was a great conversation.
And everyone listening or watching, I hope you enjoyed this conversation.
Feel free to pick up Heather's books, and reach out if you want.
Thanks, everyone.
Thank you.