Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_00 (00:04):
You're listening to The Audit, presented by IT Audit Labs.
My name is Joshua Schmidt, your co-host and producer.
Today we're joined by Eric Brown and Nick Mellum with IT Audit Labs.
And our guest today is Foster Davis of BreachBits.
Thanks so much for joining us today, Foster.
How are you doing?
Yeah, thanks for having me, guys.
Looking forward to it.
Absolutely.
So we'd like to hear a little bit of background from you and
(00:24):
your experience.
You're a naval officer as well as a tech entrepreneur.
That's right.
SPEAKER_02 (00:29):
Well, at least in my former life.
I recently left the reserve, but before I did, I served for about 15 years active duty, U.S. Navy, cyber warfare cryptologic officer, as they were calling it as I left, and spent about half the time in the service fighting pirates on the high seas, and
(00:51):
the second half working spooky missions in dark buildings, and red teaming, of course.
SPEAKER_00 (00:58):
Awesome.
So we've seen these traditional red team engagements: they cost a fortune, and you kind of mentioned that a lot of the time they become outdated within weeks.
SPEAKER_02 (01:07):
I think, from our end, it's really important that you have an outside look at your security posture and what you're doing.
There are, I think, two reasons for that.
One is technical, because those things need to be looked at.
As we used to say in the Navy, inspect what you expect.
The second reason, though, is psychological.
(01:29):
You know, there's this blindness that you get from looking at your own situation, and of course we're talking about security, but that could be everywhere in life.
You know how sometimes your best friend is the one that will tell you you've got, you know, some mustard on your face or something like that.
So that has always fascinated me, from the first
(01:52):
time I ever did red teaming.
So there's the part about red teaming itself, and that it should be done at some point, and then there's the magic of: what if we could do this process continuously, or near continuously, and why would that even matter?
And I would say that's because attackers are sure coming after you all the time.
(02:13):
And so why do something once a year when, the other 364 days, the attackers are?
So that's kind of how we approach it.
SPEAKER_01 (02:23):
That outside-in perspective really brings to light a lot of the things that the internal teams already know and say, but when you have that, essentially, certification from an outside resource who says the same thing, packages it up, and puts it into a format that's digestible, it seems to be better accepted and listened to
(02:46):
by the people that make decisions in the organization.
And we see that too, coming in as consultants.
We could be saying the exact same thing that the internal team has.
And when we've gone in to do these engagements in the past, the internal team, who's usually partnered with us to do the engagement, will just tell us all of the areas that they
(03:06):
have been trying to get attention on, local admin or whatever it is, so that they can make sure that those things appear in the report as well.
SPEAKER_00 (03:15):
What is it that's inherent about these red team engagements that makes them go obsolete so quickly?
And then, what have you both, Eric and Foster, seen to remediate the timeliness of those assessments?
SPEAKER_02 (03:30):
I think what we've seen is the speed with which attackers develop, whether it's new tactics or just variations on tactics that are tried and true.
Take the life cycle of the attacker's re-attack, their next try, and couple that with the fact that
(03:53):
IT infrastructures' attack surfaces are always changing, and there's a challenge to keep up with that.
Put those two together, and now you have a situation where if you aren't looking often, it's almost like you're not looking at all.
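To make that "look often" idea concrete, here is a minimal sketch of continuous attack-surface monitoring: snapshot the externally visible services on a schedule and diff against the previous snapshot. The target list, port set, and interval are hypothetical placeholders, not anything specific to BreachBits.

```python
"""Minimal sketch of continuous attack-surface monitoring:
snapshot externally reachable services on a schedule and
report what appeared or disappeared since the last look."""
import socket
import time

TARGETS = ["example.com"]          # assumption: external hosts you own
PORTS = [22, 80, 443, 3389, 8080]  # small, illustrative port set

def snapshot(targets, ports, timeout=1.0):
    """Return the set of (host, port) pairs that currently accept connections."""
    open_services = set()
    for host in targets:
        for port in ports:
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    open_services.add((host, port))
            except OSError:
                pass  # closed, filtered, or unreachable
    return open_services

def watch(interval_seconds=3600):
    """Re-scan every interval and print the diff against the last snapshot."""
    previous = snapshot(TARGETS, PORTS)
    while True:
        time.sleep(interval_seconds)
        current = snapshot(TARGETS, PORTS)
        for svc in sorted(current - previous):
            print(f"NEW exposure: {svc}")      # the attack surface grew
        for svc in sorted(previous - current):
            print(f"REMOVED exposure: {svc}")  # the attack surface shrank
        previous = current

if __name__ == "__main__":
    watch()
```

A real platform would layer asset discovery, fingerprinting, and attack emulation on top of this, but the scan-and-diff loop is the core of looking often rather than once a year.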
SPEAKER_01 (04:08):
Yeah, and Nick, you and I have been faced with this recently here in the local environment, related to the breach that's going on in Minnesota, and in particular the St. Paul area, where there was an encryption event that occurred.
And the localities around here are just scrambling to get
(04:31):
their cyber, really to have a focus on that, because it wasn't just one breach, but a few different breaches at a couple of different municipalities.
And it just shines the light on the fact that point-in-time hygiene is really not what works.
(04:51):
It has to be that continual investment in people, process, and technology, because you can't just say, hey, we're gonna spend a million dollars this year and then we're gonna forget about it for five years, which unfortunately is what we see all too often.
So you do need that continual reminder.
(05:12):
And Nick, you've been living in this for a while.
What are you seeing?
SPEAKER_03 (05:19):
Well, yeah, I guess around this whole topic, I feel it's kind of like chasing your tail, right?
They're chasing the tail, they're doing the same thing, and they're doing it sometimes for compliance reasons, because they own a set of data, maybe CJIS or PCI, whatever it is, and they're doing it just enough to meet those compliance
(05:40):
needs.
But to Foster's point, they're doing it once every year or whatever it is.
In those other 364, they're not doing anything, and then that loop comes back around, and they're chasing their tail.
And then that's when that breach happens.
And this is what we're seeing at St. Paul.
We're not necessarily hands-on with St. Paul right now, but we've gone into that defensive posture at one of the accounts to make sure that there's no issues.
(06:03):
So we're well defended, but we're seeing, maybe, at St. Paul that things could have been done differently.
And I think that's probably the case for most anybody that goes through a breach; you know, hindsight's 20/20, they could have done some things differently.
SPEAKER_00 (06:18):
So, Foster, I'm curious if you could give us your take on what a red team engagement looks like.
Certainly.
SPEAKER_02 (06:24):
I like to start by thinking about why we even call it red teaming.
I've heard different stories around this, but one that I like is that it's sort of like if you are on the football team and you're preparing for the championship: what are you gonna do?
You're gonna throw half the players a red pinnie, a red practice jersey.
They're gonna play the other team.
(06:46):
And they're maybe even gonna run some of the plays that the other team is known for running, to see how well your team is gonna do against it.
And even just making a statement like that, or this thing I kind of brought up earlier: I see it as two halves.
There's the technical part of it, but then there's the
(07:06):
psychology.
And so I would say start with the psychology of what we are here to do.
The psychology here is that there's a team of people who are inside, who are working very hard every day to accomplish a mission.
And what we need to do is have a group of people that are outside of that, who will pretend to be bad guys,
(07:30):
nefarious, call it what you want.
And if you start the engagement by just saying we're gonna have two separate teams, even if you just start with that, then later you can have different levels of technical ability and things like that, but start by keeping it a bit separate.
And so what you're gonna do with red team is you're gonna have
(07:52):
this separate team who's gonna think different.
They're not gonna communicate or trade information with the blue team, as we would say, the good guys, the internal guys, and we're gonna have them be separate and essentially plan, you know, spy versus spy or a chess match, however you want to think about it, and start to plan how you're going
(08:12):
to achieve an attack on an organization.
And even then, right there, even if you just did it on paper: the goals of the organization might be X, Y, and Z.
The posture and protections of that organization may also be there.
But already that red team is thinking in ways about how to skirt around or how to approach that organization to get what
(08:36):
they want.
Because what the attackers want isn't always what the defenders think they want.
So in setting up this framing, you allow this red team to go through that independent thought of what needs to be done.
And then, when you start an engagement, when you start the scrimmage, start the clock,
(08:58):
you allow each team to put up their best effort.
And then periodically, what you're gonna do is take a break.
Each team will talk about the things that they noticed, and tell them from their perspective.
And it's one of those engagements where there's no right answer, there's no wrong answer, it just is.
It's just what was observed.
(09:20):
And if both teams can equally, mutually respect what the other team is seeing, then what each team is seeing is absolutely correct, in that that's what they observed.
And then you put some programming around it, such that you're led through a discussion to understand that
(09:40):
what somebody saw is what they saw, but we can also reconcile what is actually correct.
And then, for the blue team: did that particular issue even matter towards the mission of the company?
So some of those things all put together, and then of course, a lot of the red team guys are going to be technical experts.
Maybe they've done some hacking in the past,
(10:00):
maybe they've done penetration testing.
And if you do all that together in a very deliberate way, now you are sort of evolving past what I might call a discrete penetration test into this red teaming methodology, this mentality that what the hacker believes could
(10:21):
be true is very, very important to know as you're trying to defend against it.
SPEAKER_03 (10:26):
Yeah, I certainly agree, too.
And I think a lot of the way we're thinking about this at IT Audit Labs is shifting focus to, not reactionary, but playing the offensive game.
We're not waiting for a breach to happen.
We're training as if it did happen, and getting people to shift into that mindset of not just playing defense, being reactionary.
And I think we just see that across the board.
(10:48):
People just live in that space of waiting for something to happen.
They do what they do.
I don't know if I necessarily want to say the bare minimum, but they're getting by, right?
It hasn't happened yet, or it hasn't been reported yet, at least, that something's happening.
But I think a lot of it is a culture shift, you know, to get people to think about these things, how it could happen, where their weak points might be, and to continue to
(11:11):
put these efforts forward to find those issues before they become problems.
SPEAKER_01 (11:15):
You know, Nick, we're experiencing it right now, right?
Where, never waste a good breach.
And that's kind of the internal messaging for the IT and cyber teams to take advantage of, to get the funding or to bring on the resources that they need to accomplish the items they've identified as areas to improve in the
(11:36):
organization.
But it's unfortunate that it comes down to that, right?
You know, look at St. Paul as an excellent case study.
This thing is taking them offline for weeks, if not months.
It's gonna cost tens of millions of dollars, probably upwards of 50 million, to resolve, when, had they just had the right tooling
(12:00):
in place, the right financial commitment from the organization in that environment, maybe spent, you know, a tenth of that on proper tooling, education, and resourcing up front, they wouldn't be in the situation they're in.
But we see this time and time again; it's like people
(12:22):
just don't learn.
Baltimore County School District: huge breach.
They weren't investing in cyber beforehand.
Then, you know, they have this event, and not only do they have the cleanup and the impact to the students and teachers and community, now they've got to reinvest in tools.
And the money's got to come from somewhere.
It's unbudgeted, so they've got to pull it from other resources,
(12:44):
and the whole thing is disruptive.
But having this continual way of talking about cyber means getting it up to the board level, where the board has that responsibility to ask: what are we doing about cyber?
Are we doing enough?
Do we have the right people in place?
Are we having the right conversations?
You know, let's hear from the cyber teams themselves.
(13:05):
Let's not filter it through the reporting structure of, you know, this person reports to this person, this person reports to somebody else that, you know, can't even spell cyber, and they're responsible for it, but they don't know what it is.
Let's get the actual teams to the board level, having those conversations and presenting to the leadership of whatever the
(13:27):
entity is, on where their true risk is.
And I think, you know, tools like we're talking about today can help with that, from a continual perspective of looking at the organization as a whole, that hacker mentality.
If we were in college or high school, or even
(13:48):
at the professional level, playing another team, we're going to be watching game footage, in that football scenario.
We're going to be studying those other players, which is exactly what the threat actors are doing against all of our organizations, and especially in the public space.
It's public information what tools they bought years ago, because it's public funding.
(14:09):
So all of those records are available.
And it's just another one-up for the threat actors, to be able to learn that environment and specifically focus on breaches that might impact the tools that that environment's using.
So, just circling back, it is just frustrating that we get into this cycle of do
(14:32):
nothing, breach, do everything, then do nothing, and then the cycle just repeats.
SPEAKER_00 (14:37):
So, how do these engagements look different on the civilian side of things, or what we see in the private or public sector, versus what your experience was in the military?
I'm really curious to hear how these red team engagements go down, and even, if you could share, you know, what some of the goals are, or the takeaways, or even some of the particulars of those engagements, what those might
(15:00):
look like in a Navy situation or a naval situation or a military situation.
No, that's a very interestingquestion.
SPEAKER_02 (15:09):
The first thing that popped to my mind when you said that was that there's a stark difference.
One of the stark differences we noticed is that we thought it was difficult to run red teaming exercises when we were in the military.
We thought it was difficult getting people on board.
We thought it was tough.
And we had admirals commanding other admirals to do
(15:33):
it.
We had, you know, forces of law essentially commanding people to do it, in the real sense of that word, commanding people to do it, and it was still very tough.
And you still had to get people together, and you had to have people have mutual empathy and understanding, and you had to do really tough things when people would just rather focus
(15:55):
on other things.
Boy, was it even tougher when there was not a literal command from the commanding officer to do something.
Now, there are organizations where the chief executives or the C-level or the board understand, perhaps implicitly,
(16:15):
or perhaps before we got there, they understood this concept of why red teaming, this adversarial approach to testing, is superior as compared to other types of testing, and they knew that that's what needed to be done, and people tended to fall in line and we could get things done.
But I found that there was a lot of education needed in
(16:39):
helping people understand, especially in a market where the concept of penetration testing is well understood and has its place and is important, but where there are other options to find out what you think is wrong; it became even harder to suggest, well, what we really need to do is the
(17:00):
pinnacle practice here, and that's red teaming.
So that was a major difference.
Before, in the military, when I was conducting red teaming across our different geographic fleets, these were very large exercises with very large
(17:21):
command structures.
The reason that was able to happen was because the two admirals that I worked with were bought into the idea from the beginning.
And it actually came not from cyber; it came from war gaming.
It came from, you know, the first admiral who we did these red teaming exercises with at a fleet level;
(17:42):
he was not a cyber guy.
He was actually what we called a ship fighter, a surface fighter guy.
And he saw that the wisdom carried over from this other practice.
And so that's really the major difference, I think, that comes to mind.
I think that there are ways to incentivize people in
(18:06):
the private sector.
And that's what we've been really fascinated with: what incentivizes what we security folks know is good behavior, I'll just kind of put it that way.
What does incentivize people?
And I think it's equally important to understand when business owners don't want to do a test continuously, when they don't want to do a test at all, when they say they have no
(18:29):
reason to do it, when they say it can't happen to me, when the money only unlocks because there was a recent breach somewhere nearby. I think it's really important to understand, because those are very valid ways of thinking.
And what we've learned since 2018 is how to understand and
(18:49):
unlock those ways that the private sector needs to operate.
And of course, a big one that we found: compliance, of course, has its place, but insurance, because it's a completely voluntary scheme, and yet it drives to the mission of the organization.
SPEAKER_00 (19:09):
Well, other than breaches, what have you seen work in untying those purse strings, or loosening up those purse strings, to actually create some change in an organization's security posture?
SPEAKER_01 (19:22):
So we often get brought into an organization to come in and perform a leadership role in that organization.
So we have a seat at that strategic table, which is really helpful.
But I view my role coming in as that cyber advocate.
So, as quickly as we can, we try to move up the organization from
(19:44):
an educational standpoint.
So we're speaking with the decision makers that are leading that organization, whether it's a board or a council or an executive team, whatever that looks like, but then bringing cyber to them in a very tangible way, because a lot of what we do behind the scenes really can be
(20:09):
complicated, right?
You know, you start talking in jargon about IOCs and threats and things like that that aren't really tangible.
But you can bring that to a level that is understandable at the personal level with the decision makers; maybe it's around email security, or maybe it's around physical
(20:29):
security.
Likely, somebody in their family has had a cyber event in the past; maybe their credit has had an issue from credit theft, or they know somebody that's been impacted.
But being able to translate that personal interaction into what's happening in the workplace: just the other day I had the
(20:52):
opportunity to present to a large group at one of our customers, and we were going through, reviewing, here's all of the inbound email that comes into the organization in a month.
Yes, millions of emails.
But inside all of those emails that are coming in, the filters that are in place are stripping away all the bad emails, the malware, the phishing, and you're left with a
(21:13):
much smaller amount.
So diving into that, and just using some of these key aha moments to show that we're under attack 24 by 7, and what does that mean to the organization?
You know, who's being phished in the organization?
Who are the attacked people, and why?
And I think once you bring that to a tangible level and can
(21:35):
actually show, this is what's going on, and a year ago or two years ago, before you had any of this protection in place, this was all hitting your organization.
And of course, somebody is going to click on something and enter creds.
And at that point, it's pretty much a waiting game until the threat actors perform an encryption event or steal some
(21:56):
data, and then you're in front of the news camera.
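As a rough illustration of the funnel Eric describes, here is a minimal sketch that turns raw mail-filter counts into the kind of tangible summary a board can read at a glance. Every stage name and number is made up for illustration.

```python
"""Minimal sketch of an inbound-email funnel summary for a leadership
briefing. The stages and counts are hypothetical, not real telemetry."""
funnel = [
    # (stage, messages remaining after this stage)
    ("Total inbound mail this month", 2_400_000),
    ("After spam filtering",            310_000),
    ("After malware stripping",         295_000),
    ("After phishing detection",        280_000),
]

total = funnel[0][1]
for stage, remaining in funnel:
    removed = total - remaining
    print(f"{stage:32s} {remaining:>9,} kept "
          f"({removed / total:5.1%} of all mail removed so far)")
```

The point is not the arithmetic; it is that "we were attacked millions of times and the filters caught nearly all of it" lands with decision makers in a way a raw log export never will.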
SPEAKER_00 (21:59):
I guess we all want to hear, you know, these kind of Hollywood movie stories.
But if you have anything you would like to share that's real-world action, I think that'd be really cool for our listeners.
SPEAKER_02 (22:11):
That's a good question, and I think it actually draws on something Eric brought up, which I really agree with, the way you put that, Eric.
So then, Joshua, to your question of how the military goes about these things.
I think that's something that, of course, we were a part of and we witnessed, and later I found out I took for granted when
(22:34):
we came to the private sector.
In the military, we have a very good process for when you are doing these, let's call them war games for simplicity, but when you are thinking about how to go against your adversary.
So I'll tell a story from one of the red teaming operations we did.
(22:54):
I will not tell you where it was, I won't tell you the specific vulnerabilities, and anything specific I do say, I'm just making up for illustration.
Okay, so all the caveats are out there, right?
I've got lifetime obligations, like I'm sure Nick and others do, to protect some of that information.
So this one, though, I think gets to something Eric,
(23:17):
in a way, alluded to.
In this story, what happened is one of our most junior people, one of our most junior red teamers, so this would be a hacker, an active duty service member.
She was on our red team, and she's one of the most
(23:38):
junior people, but she found one of the biggest issues.
So let me start to tell the story.
We were in this theater of operations.
There were several different military units, probably a dozen Navy units.
There were some Army and Air Force units there.
This was a base where they conducted
(24:03):
a lot of operations in a very large area, anywhere from keeping people fed, to giving people a place to sleep, to resupplying them, doing reconnaissance missions, and also what we would politely call kinetic missions as well in this area.
So, you know, on day one, we walk in, my team, my red
(24:25):
team, and I say, Admiral, in about six weeks, my team is going to come and we are going to try to completely take down your network while you're doing live operations.
And he says, you know, like, heck you are, Commander, what the heck are you thinking you're gonna do?
(24:46):
And I say, well, hear me out, Admiral, hear me out.
Here's how we're gonna do it.
Here's how it's gonna be done safely.
Here's the type of ways we'll communicate.
Here's how the referee can blow the whistle if this happens.
So don't worry, we will not put anybody in particular danger, but what we will do is test the five layers of redundancy you have in your
(25:08):
systems to conduct your operations.
So that was sort of the situation we were in.
We were the red team.
We did a lot of our preparation work, and we got to, let's call it a halfway point.
Let's say it was halftime.
(25:28):
At halftime, we paused everything and we had sort of a halftime check.
And we were telling the admiral a lot of issues that were going on, some of which were minor, but a couple that we really needed him to pay attention to.
And honestly, we were in the room with a lot of the very senior people.
(25:49):
And tell me if you've ever seen this.
You just see a glaze go over the eyes.
We on the red team had done some of the most incredible work that I think had been done in this practice in the military ever.
Okay, and that's not necessarily me.
I'm talking about the skill of the service members that we had, who really knew what they were doing.
(26:10):
These were very good white hat hackers.
They definitely knew what they were doing.
We had done some incredible things, and yet we saw the glaze.
Okay, that was halftime.
We all go back, and I say, guys, we've got to figure out what's going on, because they didn't notice this particular issue that we came up with.
So then what happened is we said, look, we have to describe
(26:31):
this, we have to communicate this in a way that the business owner, the admiral, could understand.
And that's where we said, what really has to happen here is we have to tightly couple red teaming with risk management principles.
So the next time we came back, everything we told them, we said, here's the bad thing that you don't want to happen.
(26:53):
Here is how we've quantified this risk, in terms of how likely it is to happen and how dangerous it would be if it were to happen.
And we used a risk chart.
Nick, you may know this as operational risk management.
Yep.
We used operational risk management, which is the same language. I told you we had a ship-fighter admiral; we
(27:15):
were talking to pilots, we're talking to submariners, we're talking to the Army, all these guys. They all use operational risk management to know, here's the bad thing that could happen, because it highlights to the leadership, here's something that actually should be resourced right away.
We now showed, let's say, the same data to the admirals, but now there were these several red dots, and the glaze that was
(27:38):
there immediately disappears.
Sonny, what's that right there?
What's this big red thing right there?
What are you telling me about here?
Are you saying there's a... yes, Admiral, there's an issue.
And right here I've got, you know, Petty Officer Smith, and she's going to tell you what they found.
Why is it this 20-year-old Petty Officer Smith
(28:02):
who's coming to tell me about this?
This is the part about the junior person.
Oh, why is this here?
I said, well, because she's actually the smartest person in the room right now about this particular issue.
She's gonna tell you exactly why this was an issue.
And so we were kind of having a bit of fun with it all, but the point there was that we got people to break down the walls.
As security people, we broke down our wall
(28:25):
of just wanting to tell people a table of data, you know, and push our glasses up.
We broke that down to give it in a way that they like to hear it, and they broke down their walls, understanding that, hey, actually, a 20-something in the room might be the person that we need to listen to the most.
Why, Admiral? Why?
Because this thing that she found ties back to this absolutely critical item that you told the president you were going to make sure
(28:46):
needed to be done.
And so, needless to say, you know, she got lots of accolades for that.
At the end of the day, I think we even awarded this sailor a medal just for that specific item that she was able to find.
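A minimal sketch of the operational-risk-management-style chart Foster describes, assuming a simple likelihood-by-severity scoring on a 1-to-5 scale: score each red team finding, band it green, yellow, or red, and brief the red dots first. The findings and thresholds here are illustrative, not an official ORM standard or real engagement data.

```python
"""Minimal sketch of an ORM-style risk chart for red team findings."""
from dataclasses import dataclass

@dataclass
class Finding:
    title: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    severity: int    # 1 (negligible) .. 5 (mission-ending)

    @property
    def score(self) -> int:
        return self.likelihood * self.severity

    @property
    def band(self) -> str:
        # Simple banding; tune thresholds to your own risk appetite.
        if self.score >= 15:
            return "RED"
        if self.score >= 8:
            return "YELLOW"
        return "GREEN"

findings = [  # hypothetical examples
    Finding("Default creds on ops network device", likelihood=4, severity=5),
    Finding("Stale DNS record pointing offsite",   likelihood=3, severity=2),
    Finding("Phishable webmail login without MFA", likelihood=5, severity=4),
]

# Brief leadership with the red dots first, in plain risk language.
for f in sorted(findings, key=lambda f: f.score, reverse=True):
    print(f"[{f.band:6s}] {f.title} "
          f"(likelihood {f.likelihood}/5, severity {f.severity}/5)")
```

The chart works on leadership precisely because it strips the technical detail: an admiral does not need to know the exploit, only that a red dot touches the mission.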
SPEAKER_03 (29:03):
It sounds exactly like a military engagement.
You get senior leaders that don't like hearing from somebody younger, maybe even a junior officer, and then you go down to the enlisted ranks. I've got so many things that are bopping into my head now about this.
It has nothing to do with cyber warfare, but I'll recall it really quickly: we did a security engagement, physical
(29:24):
security, in the kinetic mindset, for the anniversary of Iwo Jima, at Mount Suribachi, where, if I recall correctly, it was one of the last surviving Japanese military members.
I think he was an officer in World War II, during the Battle of Iwo Jima.
This is 15 years ago.
(29:45):
And we went out there to provide security.
We went to Mount Suribachi because the dying wishes of this officer were to have his ashes spread on the beach, or on the island.
So we went out there, and I went to Mount Suribachi, came back one day.
I was one of five Marines on this ship with the Navy.
And I was only an E4 at the time, and it was me and one
(30:07):
of my E3s.
We're the only two Marines on the ship.
And we were running comms.
Comms happened to go down.
Of course, it always does.
It goes down.
I got on the hook with the captain of the ship.
I don't recall his rank.
And I'm talking to this guy as an E4.
And I'm like, sir, you know, whatever I said.
I was like, we gotta move the ship.
Whatever is happening right now, we can't get comms.
(30:27):
And he's like, son, what rank are you?
I'm like, uh, I'm a corporal.
And he's like, what rank is that in the Marine Corps?
I was like, E4.
He's like, you've gotta be shitting me.
An E4 is telling me to move my ship.
And I was like, I'm sorry, sir.
Whatever it was, he laughed, but it just made me remember all these times hearing from lower enlisted, and how that must have felt for her.
SPEAKER_00 (30:46):
Eric, have you seen any of that in your sector?
Just, uh, maybe some young guns coming in with some information that kind of changes the game.
SPEAKER_01 (30:54):
We've got a young guy on the team now; he joined us about seven months or so ago, just through a random encounter at a security conference that we were at.
And he came to a couple of our game nights, and he's just a really enthusiastic individual.
(31:15):
And as we got to know him, it's like, wow, this guy's a genius.
You know, the more we threw at him, the more he's able to absorb, and he's bringing in all of these ideas.
And, you know, for me, it was really cool to have somebody with that energy and that ability to absorb information and do something tangible with it.
(31:37):
So it's kind of like I gotta rein him back in in some areas, where he could just, you know, go way off into the toolies with something that might not be 100% relevant to our customers, but it's just, you know, really refreshing to be able to interact with the younger people who are coming up, that, you know, don't know what a dial-up modem
(32:00):
is, and all of these things that we just kind of take for granted, but yet, you know, here they are, kind of coming out of the womb with AI.
So I just get really excited about working with people that are, you know, maybe a little bit different intellectually, and how do we work with that as a team?
(32:23):
And, you know, maybe they don't have the best presentation skills, but how do we bring that forward and kind of round out presentation skills, but then get the other team members excited about this new technology?
I mean, we're interfacing with customers now that are still pushing back, giving the Heisman on AI. It's like, wow, you guys just don't
(32:44):
get it.
Like, it is here, it is coming, whether you like it or not; you cannot escape it.
And if you continue to try to escape it, you're just gonna get shuffled further and further back.
People are gonna leave the organization.
You know, it's everything that we all know; it's just shocking, the meetings that we're in, arguing about what sort
(33:09):
of data some AI tool has about you that, you know, is probably already replicated 500 times out there somewhere else.
Like, nobody gives a shit about your data.
To me, and Nick, I know you're gonna appreciate it, Josh, too: to me, it seems like, yes, the piracy is a problem, but it seems like it's a problem that could be solved with some 5.56
(33:31):
NATO and, you know, really not be a persistent problem.
You know, back with the movie with Tom Hanks, where, you know, he was a captain, and all that sort of stuff.
But, like, I'm surprised that it's still happening.
I'm also surprised at the deterrence: like, okay, we're spraying, we're washing them with some hoses, hoping
(33:53):
that's gonna stop the problem.
But pirates in a rubber dinghy? I mean, let's solve the problem permanently.
SPEAKER_02 (34:03):
And, depending if we want to go down the topic: I hadn't actually thought about it until you bring this up.
But, you know, when I talk to people about piracy, they have a similar reaction, like, what, that actually happens?
What are we talking about, Jack Sparrow coming down here?
And I would have said the same thing before I was in the Indian Ocean being briefed by the intelligence
(34:26):
operations there.
And the short of this is that it's the same reason people don't understand piracy, like, how could that even happen?
Yes.
It's because you don't understand what the pirates are going through in their life and in their circumstances; you don't understand how easy it is to do certain types of attacks
(34:47):
on ships.
Which is a very appropriate concept, because these are the same people who perhaps might say, well, hackers aren't here to get me.
I mean, did you know that with a canoe, an outboard engine, and, after the monsoon season, when the Indian Ocean is flat, you can go 50 miles an hour in a canoe, and all you need is a machine gun?
(35:09):
And because most vessels, due to the laws of their nation, are not allowed to have guns on board, there you go.
Now, did you know that? So that's a CVE right there, in terms of piracy.
The CVE is canoe plus machine gun plus calm water equals hack.
unknown (35:31):
Right.
SPEAKER_02 (35:32):
You know, but once we did piracy and we understood what these pirates are going through, and there's a whole other element to this, obviously, as to why people would even engage with it, but once you actually see what their lifestyle is like and what's going on there, it actually makes perfect sense.
And so that's the benefit of the red team mentality and the methodology: things that would make no sense to you. Now
(35:56):
you have some 20-something using AI, doing something a completely different way than you would have ever done it, and now, oh my gosh, all of a sudden, that completely destroys the mission you were trying to accomplish.
SPEAKER_03 (36:10):
Yeah, it's a great topic to get into.
I think we all appreciate it.
I do want to ask one more question about cyber before we keep going on the military thing.
I guess a good place to start, Foster: you were talking about the engagement with the junior enlisted finding the issue.
So let's say that's completed, right?
You've briefed the commanding officer, the commander of the ship.
(36:30):
What does the remediation look like after that in the military?
SPEAKER_02 (36:34):
I'll tell you the right way and the wrong way, and there are direct analogies to the private sector.
Let me start with the wrong way.
The wrong way, which happens occasionally, you know, in the military, but which we certainly see in the private sector, here's the wrong way.
Here's your report.
We've given you the brief, we've told you the issues, we'll see
(36:55):
you later, we'll see you in a year.
Let's take that report and let's put it in a filing cabinet.
SPEAKER_01 (37:01):
Yeah.
SPEAKER_02 (37:01):
And it's human nature; it makes sense.
There are lots of reasons to do that.
We've got a lot of things going on, right?
We've got a lot of things in our lives going on, and we're kind of just trying to get home to make dinner with our kids, or whoever we spend our time with.
So the second part of that, which would be, I think, the reason that's the wrong way, is because what happens is there's
(37:23):
not an imperative to do something about it.
There's no follow-up, there's no corrective action.
The fact that it's buried and not seen often, I would say, is the prime root cause, or one of the prime root issues, as to when things go wrong.
So what's the stark difference?
(37:44):
In the military, when it would go right, it was like this admiral I was talking about. He said, okay, Davis, you take that operational risk management chart that you made, and we're just gonna put this in front of everybody every week.
And I want all the commanders, and I want all the division heads, to come tell me every week how we're doing against this picture of Skittles right here.
(38:06):
And, he says, I don't have to be an expert.
I just know that Skittles that are red are bad.
Get rid of them.
But what he did was, every week, every cycle, everybody was up there explaining what did or did not get done, and it was independently verified, you know, by the red team.
(38:26):
Yep, that dot's gone.
And similar things in the private sector: if there is an ethos, or strong leadership, that is telling people we cannot just forget about it, that's where we see it go well.
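A minimal sketch of that weekly cycle: nothing drops off the chart until the red team independently verifies the fix. The statuses and findings are hypothetical.

```python
"""Minimal sketch of a verified-remediation tracker for the weekly review."""
from enum import Enum

class Status(Enum):
    OPEN = "open"
    FIX_CLAIMED = "fix claimed by owner"
    VERIFIED = "independently verified by red team"

board = {  # finding -> current status, reviewed every cycle
    "Default creds on ops network device": Status.FIX_CLAIMED,
    "Phishable webmail login without MFA": Status.OPEN,
    "Stale DNS record pointing offsite":   Status.VERIFIED,
}

def weekly_review(board):
    """Print the standing agenda: only red-team-verified items drop off."""
    for finding, status in board.items():
        if status is not Status.VERIFIED:
            print(f"STILL ON THE CHART: {finding} [{status.value}]")

weekly_review(board)
```

The owner saying "fixed" moves a dot at best; only the independent re-test removes it from the chart.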
SPEAKER_03 (38:42):
Yeah, so it sounds like they're pretty similar.
The remediation is similar.
Yeah, very similar.
I mean, that's really how we're operating, too, at IT Audit Labs.
We want to do an engagement where we're with you from the start, and then we finish the engagement, but then we stick with you throughout the remediation process.
We don't want to leave the pile of papers on your desk and say, thanks for the check, you know, we'll see you next year.
(39:03):
Let's stay with you throughout the year and make sure everything is good, and remediate these processes that have been broken.
SPEAKER_02 (39:11):
Yeah.
Yeah.
Continuous.
I mean, that gets to why we're putting everything we are into the idea of: let's do this continuously, let's make sure AI is incorporated in the right spot, let's make it easy so that the easy choice is to do the right thing, and so it's continuously brought up.
And as long as you reduce, as long as you eliminate
(39:33):
false positives, because that'll kill you.
As long as you can eliminate false positives, that, we've seen, is the way.
SPEAKER_00 (39:40):
Thank you so much for your time today, Foster Davis with BreachBits.
You can check them out on LinkedIn.
Or are there any other places you'd like to point our listeners if they want to learn more?
SPEAKER_02 (39:49):
BreachBits.com, or see us when we travel to London and the financial district.
SPEAKER_00 (39:54):
Awesome.
We'd love to stay in touch with you guys and stay abreast of what you're working on.
And you've been listening to The Audit, presented by IT Audit Labs.
My name is Joshua Schmidt, your co-host and producer.
Today our guest was Foster Davis with BreachBits.
We also have Eric Brown, managing director at IT Audit Labs, and Nick Mellum.
Thanks so much for joining us today.
And please check us out on Spotify.
(40:16):
We have video now.
Please like, share, and subscribe, and leave a comment and a review on Apple Podcasts if you're so inclined.
SPEAKER_01 (40:23):
You have been listening to The Audit, presented by IT Audit Labs.
We are experts at assessing risk and compliance, while providing administrative and technical controls to improve our clients' data security.
Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact, while our security control assessments rank the level of maturity relative to
(40:46):
the size of your organization.
Thanks to our devoted listeners and followers, as well as our producer, Joshua J. Schmidt, and our audio-video editor, Cameron Hill.
You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials, and subscribing to this podcast on Apple, Spotify, or wherever you source your
(41:09):
security content.