
March 31, 2025 45 mins

In this episode of CISO Tradecraft, we host Chris Hughes, CEO of Aquia, cybersecurity consultant, and author. Chris shares insights on the evolving landscape of cybersecurity, discussing software supply chain threats, vulnerability management, relationships between security and development, and the future impacts of AI. Tune in to gain expert advice on becoming an effective cybersecurity leader.

Chris Hughes - https://www.linkedin.com/in/resilientcyber/

Transcripts: https://docs.google.com/document/d/1j5ernS0Gk3LH-qcjhi6gOfojBqQljGhi

Chapters 

  • 00:00 Introduction and Special Guest Announcement
  • 00:55 Chris Hughes' Background and Career Journey
  • 02:46 Government and Industry Engagement
  • 03:42 Supply Chain Security Challenges
  • 07:34 Vulnerability Management Insights
  • 12:13 Navigating the Overwhelming Vulnerability Landscape
  • 22:19 Building Positive Relationships in Cybersecurity
  • 23:41 Empowering Risk-Informed Decisions
  • 24:29 Aligning with Organizational Risk Appetite
  • 25:33 Navigating Job Changes and Organizational Fit
  • 26:32 The Role of Compliance in Security
  • 33:27 The Impact of AI on Security
  • 43:05 Balancing Build vs. Buy Decisions
  • 45:05 Conclusion and Final Thoughts

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
G Mark Hardy (00:00):
Hey, today I've got a special guest on our show, Chris Hughes.

(00:02):
He's the CEO of Aquia.
He's a chief security advisor.
He's an author, he's a veteran.
He's done amazing stuff, and I think you'd love to hear
what he's got on his mind.
So stick around.

(00:24):
Hello, and welcome to another episode of CISO Tradecraft, the podcast that
provides you with the information, knowledge, and wisdom to be a more
effective cybersecurity leader.
My name is G Mark Hardy.
I'm your host for today, and I've got a special guest on the show, Chris Hughes.
Chris, welcome to the show.

Chris Hughes (00:39):
I'm excited to be here.
I've listened to many episodes over the last several years,
so I'm excited to be on.

G Mark Hardy (00:43):
Awesome.
So hopefully this is your opportunity to, well, obviously
you're gonna be on an episode,
but more importantly, to contribute to the body of knowledge that we've been
trying to build here at CISO Tradecraft for people in the cybersecurity career.
First of all, let's give you a chance to introduce yourself.
Tell me a little bit about yourself, your background, and how you
got to where you are right now.

Chris Hughes (01:01):
Yeah, as you mentioned, I'm the CEO of a company named Aquia.
It's a government-focused cybersecurity consulting company,
or digital services more broadly.
By way of background, I've been in the federal and commercial
IT and cyber space for about 20 years.
I was active duty Air Force and then a government employee at a couple of
different agencies, doing cloud, DevSecOps, cybersecurity kind of stuff.

(01:22):
and then started my own company.
Then outside of that, a lot of industry engagement.
I have a Substack called Resilient Cyber, where I do a lot of writing
and sharing, a newsletter on industry news and things like that,
as well as separate pieces and deep dives on different topics.
I have authored a couple of books, one called Software Transparency that focuses
on software supply chain security.

(01:43):
Everything from IoT, open source, critical infrastructure,
that kind of stuff.
And then another book after that one called Effective Vulnerability
Management, where I went really deep into the topic of vulnerability management
and, I don't wanna say all the things that are wrong with the ecosystem,
but a lot of problems we have and some innovations and opportunities as well.
I'm excited to be on and, like I said, a big fan of the show and

(02:05):
excited to be part of the community.

G Mark Hardy (02:07):
Great.
First of all, thank you for your service.
Appreciate that.
For those of us who've had the privilege to wear the cloth of our
nation, time goes so fast, doesn't it?

Chris Hughes (02:15):
It does.

G Mark Hardy (02:16):
You're on the other side, and I remember when I finished,
I had 30 years of misspent youth in the Navy, and then when it came time
to retire, of course you don't want to go, and you miss the camaraderie
and everything else like that.
And one of my buddies gave probably one of the better pieces of advice. He said, hey,
you worked really hard to create and defend the system of government
and the way of life we have,
so why don't you go enjoy that way of life that you helped to go ahead and provide.

(02:39):
So again, thanks on behalf of everybody who is willing
to extend that thanks to you.
But I also understand you're on to other stuff.
A cyber innovation fellow at CISA, is that something that's still going on, or is...

Chris Hughes (02:51):
No, I've wrapped up that last year. That was an interesting non-governmental
kind of appointment, as they call it, non-governmental employees.
So they bring in folks from industry with various areas of expertise.
Mine was supply chain security and vulnerability management
and cloud and things like that.
And you get to collaborate with some of the folks within CISA that are working
on different things, industry working groups, publications for the community,

(03:14):
stuff like that.
And it was a great opportunity for me to be able to,
again, the service aspect,
get involved in government while still not leaving the company I've
started and some of the work that I'm doing in the commercial space.
So I had a lot of great times there, met some great folks, and
they're doing a lot of awesome work for the community over there.

G Mark Hardy (03:32):
Yeah.
So I think on this supply chain topic, I had Cassie Crossley on my show,
I'm guessing about a year ago.
That was about number 171, and we're now at 226,
so you can do the math on it.
And what we're finding then, from a perspective of supply chain,
and I want to go into that a little bit and then we wanna talk about some
of your other books, is we're seeing more and more with the nature of

(03:54):
how products and even software are put together, that it's very
difficult to identify a provenance for what's going into our system.
I'm trying to remember who it was, but somebody had said they
decided to do a little project: they were gonna try to,
without using anything else other than your own resources, make one of

(04:17):
these, a pencil, nothing fancy.
He said, after years,
he realized that it can't be done all by yourself.
Yes, you can grow a tree, and I guess you can go ahead and cut the tree
down, but then you need to get an ax,
so you gotta create some tool to cut down the tree, and then you have to
get the lead in there, or you can get the graphite, then you have to
refine it, and then the little metal thing and the rubber on the end, and
then the ink and the paint.

(04:38):
And it turned out that there are so many different inputs in something as simple
as a pencil. Now multiply that by orders of magnitude when we look
at other things that are going on.
And I think the error might be that we think software is not
like that, that it's just monolithic.
Oh, I just coded it up and I'm done.
But what do you see, and what should CISOs and other security experts
be aware of, the potential dangers with respect to supply chain?

(05:03):
And that goes to not only our software systems, but even
the hardware that runs it.

Chris Hughes (05:07):
Yeah, it's a very interesting and broad question.
And first off, Cassie is awesome.
Her book on supply chain security, I definitely recommend
folks check that out.
And there are a lot of challenges around software supply chain
security, like you talked about:
sourcing materials, which in our world often means open source software.
It's part of, I think, 90% plus of every single code base that

(05:30):
they're looking at, and 70% or more of modern code bases are made
up of open source software components.
In terms of provenance, we don't really know who these things came
from, what their intentions are,
what they may or may not have done to it, and many folks don't even know
what's in the products that they're using, right?
In terms of vendors selling them software or applications and services and so on,

(05:51):
you don't really understand the components, the dependencies, the
things that are in those products.
And it's crazy that we're in this world of zero trust,
where there's no implicit trust,
but when it comes to software, we implicitly trust random people on the
internet all the time to put these components into things that we use, and
we really don't know where they came from,
what they've done to them, and that kind of thing.
And I think attackers have gotten really keen to that over the

(06:14):
last several years in particular.
That's why we've seen things like the cybersecurity executive order and
things like that gain traction, as attackers started targeting the supply
chain more, both on the commercial front and the open source front.
And there are also some challenges; I've written a piece
recently talking about this.
We try to draw parallels from the software supply chain to other
supply chains, say medical or automobile manufacturers and stuff.

(06:37):
The challenge there, looking at automobiles: there are really stringent
regulations and requirements,
and there are only several thousand automobile suppliers around the world.
But in software, there are
literally millions.
You could have thousands, hundreds of thousands, millions of
people around the world contributing to this and putting things out on
the internet that people are using.
And then they get used as these nested dependencies, as they call it,

(06:59):
of direct and transitive dependencies that are in your software, these
different components and libraries.
And they're taking advantage of it in a lot of different
ways, whether it's maliciously inserting malicious code in there,
going after the developers who are using things, doing typosquatting,
just really unique social engineering type attacks.
We recently saw a supply chain attack against something called tj-actions.

(07:20):
I think it was a GitHub Action that was targeted last
week, that made the news.
So we have a porous and problematic ecosystem
when it comes to software, and the attackers are taking advantage
of it in a lot of different ways.

G Mark Hardy (07:34):
Yeah, I'm looking through the table of contents right now of your
Software Transparency. Of course you start out with the software supply chain
threats, but then you talk about things such as vendor risk management and
our vulnerability databases.
Now, I think a lot of times we tend to think of vulnerabilities
as being stuff that comes with a CVSS score assigned to it.

(07:55):
It's something that somebody else discovers.
It's posted.
We then have to go ahead and do a risk assessment.
A, do we have it implemented?
B, is this on a mission critical system?
And C, can we patch it without taking down that business operation?
If the answer is yes, and yes, and this is a high enough score and we've
got the time for it, yeah, we do it.
But how does that tie into supply chain risk with respect to looking

(08:18):
at vulnerability databases?
Does it at all?
And if so, what should we also be thinking about in addition to just waiting for
the latest list to come out, or the latest super Patch Tuesday, which says,
oh, we're fixing all of these things?

Chris Hughes (08:31):
Yeah, as you point out, historically we use CVEs, or Common
Vulnerabilities and Exposures, from the NIST National Vulnerability Database.
We love our acronyms in cyber, by the way, the NVD, which has some challenges
that we can get into there around the vulnerability database ecosystem itself.
But as you said, proprietary commercial products often will have CVEs associated

(08:52):
with a vulnerability, and people can run scans with the different vulnerability scanning
tools that we use and identify these CVEs and see: is there a patch available?
Is it on a critical system?
Am I able to patch this without causing operational disruption?
But when it comes to open source, the landscape is much more vast,
and many of these things don't have
unique CVE identifiers for them, or if they do, you have to track them down,

(09:15):
not just in the NVD, but the OSV or the GitHub Security Advisory database and so on.
There are many different sources of truth,
and trying to reconcile that as an organization or an
enterprise can be challenging.
And that's just the things that have an identifier.
There are things like misconfigurations and things like
that that need to be considered too.
And then there are resources out there. For example,

(09:37):
OWASP has the open source software top 10 list that looks at not just
known vulnerabilities, but how often is something maintained,
when was it last updated, who's contributing to it, minimizing the
bloat of the software that you have.
Do I need all these dependencies and libraries, for example, or can
I minimize this to just what's needed to run the software?
There are a lot of different factors that you should be looking at in

(10:00):
open source: not just known vulnerabilities, but who's maintaining
it, how often it's maintained, when it was last updated, whether you're using
the latest version, and so on. CVEs are lagging
indicators of risk. At that point
we know there's a problem;
it has an identifier assigned to it. But there are leading indicators of risk
that we can get ahead of, to start to look at something and say, hey,

(10:22):
there's not a problem here yet, but there very well could be one here soon.
And how will we respond to it?
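To make those leading indicators concrete, here is a minimal sketch of checking one simple health signal, how long it has been since a dependency's last release, using PyPI's public JSON API. The one-year threshold and the second package name are illustrative assumptions, not guidance from the episode.

```python
import json
import urllib.request
from datetime import datetime, timezone

def days_since_last_release(package: str) -> float:
    """Days since the newest file of the latest release was uploaded to PyPI."""
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)

    files = data.get("urls", [])  # files belonging to the latest version
    if not files:
        return float("inf")

    newest = max(
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for f in files
    )
    return float((datetime.now(timezone.utc) - newest).days)

if __name__ == "__main__":
    for pkg in ["requests", "some-niche-dependency"]:  # second name is hypothetical
        try:
            age = days_since_last_release(pkg)
            status = "possibly unmaintained" if age > 365 else "recently maintained"
            print(f"{pkg}: last release {age:.0f} days ago ({status})")
        except Exception as exc:
            print(f"{pkg}: lookup failed ({exc})")
```

A signal like this is a leading indicator in exactly the sense Chris describes: nothing has a CVE yet, but a dependency that hasn't shipped a release in years deserves a closer look.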

G Mark Hardy (10:27):
I think a good leading indicator is: has a
vendor put out a new version
of anything? Because almost always we'll find out that although they might
have been fixing old vulnerabilities or old weaknesses, they're probably
adding some functionality, because you always wanna keep moving forward.
I worked as a software developer, and we always had a queue of things to
put in and get going, and we had 'em prioritized and stuff like that.

(10:49):
And of course, if there's a weakness or vulnerability, you want to go
ahead and move that to the top of the stuff you're gonna work on,
depending on the size of the organization.
You talked about supply chain.
This was a little small company, and
I remember the sales guy came back and said, hey, we
can get a sale this week if you put this feature in the software.
And we're like, yeah, it's number 27 on the list.
They said, we'll make payroll on Friday.

(11:11):
If you can put this feature in this week?
You're like, okay, it's number one on the list.
And so what happens is you adapt to the market and things like that.
Now, I used to get, and I still get them from CISA,
alerts of things that, hey, this vulnerability has been discovered or this
has been found, and things like that.
So we look at the CVEs and identification, and oh, by the way,
a quick little aside here for people: CVEs are not assigned in, if you will,

(11:34):
chronological or numerical order.
Are they?

Chris Hughes (11:37):
No. And then there's also the problem
that, just to your point a second ago, I wanna touch on:
we have these competing incentives.
We often in security like to think we're the
center of the universe, right?
Like the world revolves around us, the business revolves around us.
But the business, believe it or not, is in the business of making money and
generating revenue and having a return for shareholders and things like that.

(12:00):
As much as we love cybersecurity, things like speed to market,
feature development, customer attraction and retention, those
often will come ahead of security.
And that's just the nature of how it is,
unless we have a systematic kind of paradigm change.

G Mark Hardy (12:13):
Yeah, and that's really one of the challenges for security leaders,
to drive home the point that we are in the business of revenue protection.
We're not a cost center.
Rather, we enable
all these other profit centers to actually generate a profit.
And for anybody who's taken a look at some of the major breaches that
have taken place, and either the costto repair or the regulatory fines

(12:34):
or anything else, they realize that,yeah, those are a lot of times unforced
errors on behalf of the business that.
With a proper investment upfront incybersecurity, they're able to go
ahead and potentially avoid, either,maybe not entirely, but you could
certainly avoid the impact of some ofthese things simply because you've got
controls in place that will identify,Hey, something is going sideways.

(12:56):
Let's go ahead and drop all the barriers, if you will, start to
isolate systems and stop the lateral movement of a potential adversary.
But back to the CVE and things like that. One of the things I had
noted is that you look at a CVE and, wait a minute, what do you mean 2023?
It just came out yesterday. Or all of a sudden it's CVE-2025-99999.

(13:21):
It's like, how in the world did we get that high?
Any insight in terms of how those numbers are generated?

Chris Hughes (13:26):
Yeah, so I think there is a date component in terms of
chronological order, in terms of when it's discovered and documented.
There's a whole ecosystem of what they call CNAs,
CVE Numbering Authorities.
They submit these CVEs to be added to the catalog, and they
get an ID and things like that,
and then they add numerical numbers after that.
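As a quick aside on the ID format itself: a CVE identifier is CVE-&lt;year&gt;-&lt;sequence&gt;, where the year reflects when the ID was reserved or assigned, not necessarily when the flaw was found, and since the 2014 syntax change the sequence can be four or more digits. A small illustrative parser, assuming Python 3.9 or later:

```python
import re

# CVE-<year>-<sequence>; the sequence may be 4+ digits, which is why IDs like
# CVE-2025-99999 exist, and the year is when the ID was reserved or assigned.
CVE_RE = re.compile(r"CVE-(\d{4})-(\d{4,})", re.IGNORECASE)

def parse_cve(cve_id: str) -> tuple[int, int]:
    match = CVE_RE.fullmatch(cve_id.strip())
    if not match:
        raise ValueError(f"not a valid CVE ID: {cve_id!r}")
    return int(match.group(1)), int(match.group(2))

print(parse_cve("CVE-2025-99999"))  # (2025, 99999)
print(parse_cve("CVE-2023-12345"))  # (2023, 12345)
```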

(13:47):
But there's an aspect of when it's reported, when it's identified,
when it's analyzed, when it's added to the NVD, that kind of thing.
And in terms of how we're getting
those numbers so high,
it's really amazing when you look at the figures.
For folks interested, there's a vulnerability researcher named
Jerry Gamblin, who puts out some really great insights on LinkedIn,
and he showed that we exceeded 40,000 CVEs just last year.

(14:12):
The number now is 200,000 plus in the NVD.
That's just the NVD.
Again, not OSV, not OSS Index, not GitHub Security Advisories,
and any number of other
sources of vulnerability data and advisories out there.
So 200,000 plus, and 40,000 just last year. He showed that we're
on a pace of, I think, 35% year
over year growth so far, just in 2025 over 2024, which already

(14:36):
had double digit growth as well.
So organizations are simply drowning.
The vulnerability backlogs are hundreds of thousands, even millions,
in large enterprise environments,
so these things keep piling up.
And organizations, as you said, if you're an SMB, right,
you live below the, quote unquote, cybersecurity poverty

(14:57):
line, as it's been called by Wendy Nather.
You are in a really challenging situation, and even if you're in a
large enterprise environment, again, to your previous point, you may be
coming and pointing these things out,
but you're competing with other things like speed to market,
feature velocity, new customer requests.
So you have your work cut out for you, and the problem is getting worse.
It's not getting any easier, and we can dive into perhaps how AI

(15:20):
and things like that play a role and may make it worse or better.
But safe to say we have a lot of work to do and not enough people to do it.

G Mark Hardy (15:29):
Yeah, and I wish my 401(k) were able to give the return that we're
getting in terms of these vulnerabilities.
We'd all be set for life at this point.
So obviously the more complex we get... I think it was Bruce Schneier who said
complexity is the enemy of security,
and if he didn't, I'm sure he
probably did at some point, and I like to give attribution where possible.
But we look at the CVSS and it's evolved.

(15:50):
It started out at version one, and I got into it when it was CVSS version two, then we
had three, 3.1, and now version four.
And so we're finding that it's more than just a baseline score.
And one of the things is you cannot just take a raw number and say,
oh, this is a 9.8, so I'm gonna make it the most important thing.
In fact, it has to do with your vulnerability
understanding and also what's actually been exploited in the real world.

(16:14):
A 9.8 that came out a few years ago, I think, was POODLE,
and yet I don't know of any real world, in-the-wild exploit of that, outside
of it being demonstrated in a laboratory.
So from that perspective, as CISOs, as security leaders, when we're faced
with the potential challenge of an overwhelming number of vulnerabilities
being reported, being charted, being presented to us, and then being told,

(16:38):
okay, you've gotta solve this with the resources you have, and we don't have
enough resources, what's your advice on how you deal with this problem?
It's like cleaning out the Augean stables.
It's just beyond the scope of one person.

Chris Hughes (16:53):
Yeah, this is a topic that has been the
focal point of many conversations.
It could be an entire episode.
And this is what kind of led me down the path: I've been in roles
doing vulnerability management and doing Nessus scans, and it's like a
treadmill, like a gerbil on a treadmill, just running around over and over week
after week and never getting anywhere.
And it's because of the problem that we've been discussing, and that's why we've been

(17:16):
seeing... So what happens is, historically we've used these CVSS scores, right?
And they have a base score of a 9.2 or whatever, a numerical score, but it
has no real relevance to your particular organization, for example, in terms of
your environment, the applications that you're running, those kinds of things.
And it also has no context around known exploitation, exploitability,

(17:39):
likelihood of exploitation.
So we've seen these additional kinds of systems or scoring mechanisms pop up.
One that's taken a lot of interest in the last several years is called EPSS,
the Exploit Prediction Scoring System,
and that one looks ahead 30 days and gives you a predictive
number, a probability score between zero and one, that the CVE will

(18:01):
be exploited in the next 30 days.
And you can adjust that based on your risk tolerance as an organization.
And it's pretty accurate; they broke down how it does over
the course of a year and how it performs,
so people are starting to use that.
There are of course sources like the CISA KEV, or Known Exploited Vulnerabilities
catalog, where you can go and say, hey, not only is this thing likely to be

(18:23):
exploited, it's already been exploited.
We have evidence of known exploitation in the wild
against organizations, and you can lean into that. And I'll add that the KEV,
like anything else, any of these databases we talked about, is not infallible.
It's not perfect.
There are other people who have built on that and found additional
exploitation that the KEV didn't find, and things like that.

(18:43):
But the KEV is a great place to start.
It's a free resource to the community, again, to CISA's credit.
And then, so you have EPSS, you have KEV, and then there are additional
tools that are coming out,
more modern, like software composition analysis or runtime security tools.
They're now looking at things like, is it reachable?
It may be known to be exploited, it may be likely to be exploited.

(19:05):
It may be a, quote unquote, critical,
but can I even reach it in a production environment or reach it in the code base?
Does this function get called?
Or is this system even reachable based on my architecture?
Those kinds of things.
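Real reachability analysis in SCA and runtime tools builds call graphs across the whole dependency tree; as a toy illustration of the "does this function even get called?" question, here is a sketch that walks Python source with the standard ast module and flags calls to a known-risky entry point. The target (older PyYAML's yaml.load() without a safe loader) is just a stand-in example; substitute whatever an advisory names.

```python
import ast
from pathlib import Path

# Hypothetical target: an advisory says module.function is the vulnerable
# entry point. yaml.load is used here as a well-known stand-in.
VULNERABLE_CALLS = {("yaml", "load")}

def attribute_calls(path: Path) -> set[tuple[str, str]]:
    """Collect (object, attribute) pairs for calls like yaml.load(...)."""
    tree = ast.parse(path.read_text(), filename=str(path))
    found = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
            target = node.func.value
            if isinstance(target, ast.Name):
                found.add((target.id, node.func.attr))
    return found

def scan(project_root: str = ".") -> None:
    for py_file in Path(project_root).rglob("*.py"):
        for module, attr in attribute_calls(py_file) & VULNERABLE_CALLS:
            print(f"{py_file}: calls {module}.{attr}() -- potentially reachable")

if __name__ == "__main__":
    scan()
```

If nothing in the code base ever calls the flagged function, that finding probably belongs lower in the queue, which is the windowing-down effect described next.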
And you can window it down pretty significantly.
They show, I think, it's like a 95 to 98%
reduction you can get in kind of the noise and the toil and

(19:26):
the burden that you would have if you just used CVSS base score.
And that has a cultural element to it too.
Going back to your point about being a business enabler: for many
years in security, we've just run scans and thrown it over the wall to
developers and engineers and said, hey,
you gotta figure this out before you can go to production, before
you can have this next product release, whatever the case is.
And that has fostered a lot of resentment, a lack of trust and

(19:49):
frustration on both the development side and the business's side with security.
So I think it's in our best interest to optimize prioritization using
some of the things we talked about, to build back that trust, make it an
actually manageable problem that's not intractable, and help them move faster
as a business while still protecting the revenue, as you talked about.

G Mark Hardy (20:09):
Yeah, and for those who are interested in the CVSS or the EPSS,
I know EPSS version four came out a week ago, so that's an update.
You'll find them on FIRST.org, and so that's a reference for that.
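Both data sources are free to query. Below is a minimal sketch of the EPSS-plus-KEV triage Chris describes, assuming the FIRST EPSS API and CISA's published KEV JSON feed are still at the URLs shown (treat those as assumptions and verify before relying on them), and an illustrative 10% EPSS threshold:

```python
import json
import urllib.request

EPSS_API = "https://api.first.org/data/v1/epss"            # FIRST EPSS API (assumed current)
KEV_FEED = ("https://www.cisa.gov/sites/default/files/feeds/"
            "known_exploited_vulnerabilities.json")          # CISA KEV catalog (assumed current)

def fetch_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def triage(cve_ids: list[str], epss_threshold: float = 0.10) -> None:
    kev = {item["cveID"] for item in fetch_json(KEV_FEED)["vulnerabilities"]}
    epss_rows = fetch_json(f"{EPSS_API}?cve={','.join(cve_ids)}").get("data", [])
    scores = {row["cve"]: float(row["epss"]) for row in epss_rows}

    for cve in cve_ids:
        score = scores.get(cve, 0.0)
        if cve in kev:
            print(f"{cve}: in CISA KEV -- known exploited, fix first")
        elif score >= epss_threshold:
            print(f"{cve}: EPSS {score:.2f} -- likely exploitation, fix soon")
        else:
            print(f"{cve}: EPSS {score:.2f} -- schedule with routine patching")

if __name__ == "__main__":
    triage(["CVE-2021-44228", "CVE-2019-0708"])  # example IDs only
```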
Very good point about the reachability.
Because if you have systems that are isolated, air gapped...
And of course I use little air quotes on the air gap, because an

(20:31):
awful lot of times we find out systems somehow get cross connected.
We didn't expect them to, but somehow they do.
And then also people have demonstrated, more in the laboratory than anything else,
oh wow,
I could cause this hard drive to spin this way and then get audio and then pick
it up over here and then make it do this and transmit information in Morse code.
And it's really not necessarily considered to be a real exfil.

(20:54):
It's gonna be pretty slow to get stuff out that way.
But more to the point is that you indicated the relationship you
have, not only with the business, but with the IT and the dev teams.
And they're both important, because as IT security, one thing is,
if you're not doing it right, management views you as a cost center.
You're just a tax on the organization.

(21:15):
You slow things down.
You're the department of no. And while we have compliance, which is
the only reason you probably even have a budget in the first place,
because we have to be compliant,
we all know as security professionals that compliance does
not equal excellence in security; it's a minimum passing grade.
Then we have the relationship with the development team, who just
sees security as a pain in the...

(21:36):
you know what.
They keep coming up and manufacturing all these requirements, and we've
been running perfectly fine
without this in there. We haven't needed to go ahead and change
our crypto algorithms or put this particular thing in our code base, and
there's never been a problem with it.
But why are they banging on this drum when we've got
all these other things lined up on our list of things to do?

(21:57):
So if we look at the political reality of these relationships, it
transcends the functional requirement.
It's not just, can you get your job done and then go home?
It's, can you create positive working relationships with two primary groups
that have the potential to make your life difficult? And there may be more.

(22:19):
But what are your thoughts about that, from the political perspective, in
your experience over the 20 plus years you've been doing this?

Chris Hughes (22:26):
Man, you just laid out so many great points that we
could dig into on each of those.
And first off, like you talked about, the kind of political realm and the
relationships and building rapport: often in security, I feel like historically it's
been like you were looked down upon if you weren't, quote unquote, technical,
but we're realizing the soft skills, the communication, the empathy,
building trust and building relationships.

(22:46):
That's actually the quality stuff that makes the biggest difference.
You build those relationships, you have an opportunity now to drive the ball
forward for security much better than if you just come in as the office of no
and tell people they can't do things
and slam your hand on the desk, as you said, and be a blocker.
And what happens in that case is we often foster what we call shadow...
you hear shadow IT, shadow SaaS, shadow whatever.

(23:08):
It's because they don't wanna work with us.
They don't wanna engage with us because of the way we behave
and the way we've done things.
So they work around us, and then we
perpetuate the age-old bolted-on-rather-than-built-in security paradigm
that we all know and love so well.
So that's part of it.
And then, building those relationships, building that trust with your
peers, things like that, bringing context-rich information to making decisions

(23:32):
around vulnerabilities is so critical.
And then, being a business enabler, as you said, of helping the business: in
my opinion, not saying no, but you can say,
here's the risk of doing said activity;
you are in a position as the business to make a risk-informed
decision on whether you wanna move forward with something or not.
You're empowering them with that information. Rather than just saying
yes or no, you could say, here's how we could do it the most secure way possible.

(23:56):
Here are some options, here are some recommendations, and
let them drive with that.
So I think that's the key point that you made that really jumped out to me:
doing those things is far more important than any technical depth
in terms of a particular technology or a scan or whatever, right?
Technology will change,
but those human relationships, those kinds of things, are critical.

G Mark Hardy (24:16):
And what I advise people is you want to not be the department of no.
You wanna be the department of how: when someone says, we want to do something,
here's how to do it in a way that keeps our risk within an acceptable level.
Now, ultimately, the risk appetite of the organization is gonna drive
your decision making, and it's really important to understand

(24:36):
what that risk appetite is from senior management and leadership,
and then also try to align your own actions.
'Cause if you're, if you will, a cowboy and you're working in a
very conservative organization, there's gonna be some issues there.
Conversely, if it's the other way around, and you're working for someone who says,
what the heck, we'll just do it live,
and you're the person who wants to do everything measured and carefully,

(24:59):
you're gonna be very uncomfortable in that position, because you can
be way outta your comfort zone.
So a lot of what we find then
is that success,
career-wise, transcends your technical skills, even your
communications, management, leadership skills, and political awareness.
It's: is there a good fit?
And this is a tough area to look into, because we spend so much time looking at

(25:21):
the technical stuff, and we can write books upon things such as software and supply
chain and security and things such as that, and look at vulnerability management.
But do you have any thoughts or insights in terms of
how to read the organization?
I've got some friends right now who are looking for other jobs, either
because their old positions terminated abruptly, or the organizations
are changing, or sometimes they just said, I don't like it here.

(25:46):
Any thoughts you have in terms of how you would figure out where you're
gonna land, and land safely, so that you don't find yourself in three
to six months going, why am I here?

Chris Hughes (25:54):
Yeah, this is a good question.
I feel like, to be honest, at least in my experience, it gets easier
to navigate the longer you've been doing this, because you've got those war
wounds: what to look out for, what kinds of questions to ask of the leadership and the
team, and how they handle certain things,
things like that.
But just having those open conversations during, perhaps, the
hiring process to ask: how do they define organizational risk tolerance?

(26:16):
What happens if they have a situation that exceeds that risk tolerance?
Is someone willing to put their name on a dotted line and sign off on things?
Is security there to really be a trusted, collaborative partner to
the business, or merely to check some boxes due to compliance, for
example, as you talked about earlier?
And just for a quick moment while I'm at it, you said something earlier, you said
you have a job because of compliance.

(26:37):
And I wanna stop on that for a moment, because we hear so often
that compliance isn't security.
The reality is compliance is security.
It's just not the elimination of risk.
Nothing is, as we know, unless you disconnect the system, throw it
in a lake or something, right?
But in the absence of compliance, anyone who's been doing this
long enough knows:
if you show up to your engineering, development peers, et cetera, and you

(26:59):
ask 'em to do something without any kind of compliance requirement to tie it back to,
you have a steep hill to climb, whereas if you can point to a compliance requirement
of here's why we have to do said thing, you have a much easier case to make.
So I think compliance often, as I've said in some articles I've
written, compliance is the floor.
It's not the ceiling, but without it, we'd be in free fall.

(27:20):
It at least gives us a foundation to start upon.
So I did wanna call out that point
you made earlier.

G Mark Hardy (27:25):
And that's a very good point.
And also it helps, because when you have a compliance framework, most of
these things are ideally well thought out and have probably evolved over a
period of years with a lot of inputs.
And so if you're trying to get an organization, let's say
for a CISO coming to do a new job,
and so we say, we need to do this and this,
why? If I say it's the Chris Hughes security model,
yeah... but what if I go,

(27:47):
this is something that comes from
CISA, or it comes from CMMC, or fill in the blank with your own favorite acronym,
and you can point to something externally,
it's a little bit like, oh, okay.
So, in a way, we get a little bit of credibility for our
approach, regardless of your
intelligence, your skills, your abilities, your knowledge, and all
the capabilities you bring to it. As you had said, it's a hard sell when you

(28:08):
can't point to an external requirement.
And then the time comes to go ahead and cut some budgets or trim back a little bit,
and if you can't show an anchor out there to say, we've gotta do this, or
otherwise we can't keep this customer because we breach our agreement,
we end up in a compliance problem, which means that we might not be able to process
credit cards if it's PCI DSS, or at least we get charged a whole lot more.

(28:29):
Or it could be a government thing where they might just simply say, either, like
CMMC, we're gonna take away your contract,
or we're gonna just write a big fine.
So how many zeros have you got in your bank account?
Let's go ahead and do that.
And I was just looking recently at some of the fines that had come out;
my last episode I did was on the Irish security cell, and nine figure fines

(28:50):
and even 10 figure fines, a billion euros, with exchange rates a little bit more.
And you go, but that's viewed by some organizations as a cost of doing business.
And so here's a difficulty
for us as CISOs: we may be able to say, in an ideal world, everything is pure
and kind and just, and it's all secure.
And then at some level of leadership they say, the worst risk we have if we violate

(29:15):
this is a $10,000 fine, but we can make a hundred thousand dollars if we do it.
Let's just do it. And then you'll find out that it's the equivalent...
I remember when I used to work in Charleston, I was out there with the Navy,
I talked to a guy who used to work downtown.
He said that parking garages were something like $20 a day, but
a parking ticket was $10 a day.

(29:36):
So you'd park on the street, you'd get five tickets, you'd pay them every week,
and it was the cheaper way to park.
And they didn't take away your license or your car, because you kept paying them.
But it was cheaper than parking in there.
So from that perspective, it was a business decision.

Chris Hughes (29:50):
It is.

G Mark Hardy (29:51):
It's an interesting approach.

Chris Hughes (29:52):
Yeah, no, I wanted to jump in there.
'Cause it's so funny: anyone who's been doing this for a long time, we've
gone through CISSP or other things like that; again, you don't put a control
on something that costs more than the asset you're trying to protect.
The business is making that business decision to say,
we're not gonna implement a control or meet a security requirement,
'cause it's cheaper to just pay the fine or the regulatory

(30:13):
ramifications, or whatever the outcome of this is.
And it's funny you said Charleston.
I actually used to work in the Navy for several years, as a civilian
in the government, and with NIWC Atlantic in Charleston as well.
So we have that in common.
Yeah, but you point out that great point.
And then, this is quite a double-edged sword in terms of compliance.
On one hand, again, we need it, right?

(30:33):
'Cause in its absence we wouldn't do security.
But it is a problem too, as you said.
We've got HIPAA, HITRUST, SOC 2, FedRAMP, CMMC, NIST 800-171,
NIS2 in the EU, and I can go on and on, like it just keeps going.
And so we...
yeah, we have this compliance framework sprawl that is happening

(30:54):
and it makes it really problematic and challenging for the business to navigate.
What have I met already?
What do I still need to meet?
Do I need to do all of these?
So there are efforts, like the government last year had an
RFI for regulatory harmonization.
We've had an administration change;
the speed of government is a natural force that's gonna be at play,
but I do hope we see some regulatory harmonization, because it can go the

(31:17):
other direction, where, as you said, the cost of doing business can be so prohibitive
due to fines and things like that
that it can have an economic impact.
If you look at the EU, for example, they've set out
to be a regulatory superpower.
And that may be great in terms of compliance and security and governance,
but it may have economic implications and even national security implications.

(31:39):
So we gotta pick our poison in terms of how heavy-handed we wanna get with
compliance as a forcing function.

G Mark Hardy (31:45):
Yeah, and again, I mentioned this in last week's episode, and it was
something where I wanna do the homework.
I haven't had a chance to do it yet, but I had heard from somebody a couple weeks
ago, and again, I don't like dealing in rumors, so that's why I'm gonna couch
this with, say, trust but verify,
go look it up:
that fines that are collected for GDPR violations by countries
can be used as an offset for the dues that you have to pay
in to be a member of the EU.

(32:07):
It's basically, it's a pass through.
So it's as if to say, hey, I've got my country club dues and they're a thousand
bucks a month, but if I can go ahead and fine every golfer who puts a golf ball
in my backyard $250, I might be able to
be a member for free, as compared to making a profit on it.
I dunno what happens past that point.
But it does suggest, though, that their goals for doing

(32:28):
these enforcements are twofold.
One, on the business side, you say, I want to avoid unnecessary
costs; but as you had said,
correctly, maybe the cost of doing business this way is less than the
opportunity cost of not doing it at all.
And then on the other side, from a regulatory perspective, ultimately,
what's the purpose of a government? To protect their citizens.

(32:48):
And then secondly, territorial integrity.
Then we go on to things like that.
But basically you want to protect the life and safety of your citizenry.
And to a certain extent, when we look at
the concepts behind a lot of the regulations that come out,
if they have that at the core, you go, okay, I get it.
They might be a little bit more extreme,
GDPR, when it says, hey, your IP address is PII; it's, wait a minute,

(33:09):
your IP address is ephemeral and it's generated on the fly,
unless it's 192.168.100.1;
that one travels with me for some reason everywhere; every house
that I set up seems to have that.
But that's another story.
So you had mentioned, though, and I made a note of this here, about AI.
I wanna make sure we don't skip AI before we get to the end of the show.

(33:31):
So what are your thoughts about how that's gonna impact everything
we've been talking about so far?

Chris Hughes (33:35):
Yeah.
I don't wanna be the one to get in here with a buzzword, right?
But it is a real thing that's getting traction and having an impact.
For example, the exact figures change depending on who's reporting
'em, but many organizations... I think it's 70% of developers report they're
using coding assistants and Gen AI tools, LLMs, right, to accelerate,

(33:57):
quote unquote, productivity.
We now have a trend, if you've heard of this, called vibe coding,
where they just, you know, you...

G Mark Hardy (34:04):
describe what you want and it writes it for you.

Chris Hughes (34:07):
And you don't even look, you don't even verify or validate it.
You just run with it,
and, again, due to speed to market and competitive
factors that are at play there,
you just run with it.
And so that's a factor.
And then, you know, this quote unquote productivity boom of
allowing people to develop more code faster, or even the
democratization of development, where now people who aren't even developers,

(34:29):
right, can just, in plain natural language, produce applications and
software through these tools without any kind of validation or verification,
is gonna have a massive impact,
in my opinion, of exploding the attack surface
of things that are out there.
If you think about these large frontier models in terms of coding,
a lot of 'em are trained on open source, and we talked about open

(34:49):
source earlier in the conversation, which has many vulnerabilities,
often is not well maintained, has provenance and pedigree challenges.
So all those challenges now get proliferated via the LLMs and coding
assistants that people are using.
And a lot of times, there are other studies showing that developers don't
often go back and check the work of the LLM; they just inherently trust it

(35:09):
and they just run with it.
They don't
double-check it.
So, you know, that's gonna be a problem.
But at the same time, there's this two-pronged challenge with AI,
where we have the need to secure AI, but we also have AI for security.
So we may be able to use AI for security purposes.
Like, we know AppSec engineers and things like that are often outnumbered
in large organizations, a hundred to one or a thousand to one, by developers.

(35:32):
So if we can use these evolving, emerging technologies, like
LLMs and copilots and AI and
agentic AI, things like that, to actually implement security at
scale in a way we couldn't historically due to workforce constraints,
there's an opportunity there to review more code, make recommendations,
provide quicker feedback loops to development teams and engineering teams.

(35:54):
So it's this dual-pronged challenge that I think is unfolding, and it'll
be interesting to see which one wins out, whether it's needing to
secure AI or using AI for security.
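On the "AI for security" side, one low-lift pattern is an LLM first-pass review of each change, so a badly outnumbered AppSec team sees a pre-filtered list. This is a hedged sketch, assuming the official OpenAI Python client is installed and using a placeholder model name; the output is advisory only, and a human reviewer still decides.

```python
import subprocess
from openai import OpenAI  # assumes the OpenAI Python SDK (v1+) is installed

MODEL = "gpt-4o-mini"  # placeholder assumption; use whatever model your org approves

REVIEW_PROMPT = (
    "You are an application security reviewer. For the following git diff, list "
    "any injection risks, hard-coded secrets, unsafe deserialization, or missing "
    "input validation. Be concise and point to the exact lines."
)

def staged_diff() -> str:
    """Return the currently staged changes in this git repository."""
    return subprocess.run(
        ["git", "diff", "--cached"], capture_output=True, text=True, check=True
    ).stdout

def review(diff: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system", "content": REVIEW_PROMPT},
            {"role": "user", "content": diff[:50_000]},  # keep the prompt bounded
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    changes = staged_diff()
    print(review(changes) if changes.strip() else "No staged changes to review.")
```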

G Mark Hardy (36:04):
Yeah, so the risk I think with the vibe coding is the
open source supply chain attack vector.
Now, as you had said, it'll ingest all this stuff that's out there,
mostly open source, because that's what you can get your hands on. And the second
thing then is that if I am a patient
threat actor,
I'll go ahead and I'll put out deliberately vulnerable code, but

(36:26):
it's not... it'll run just fine,
except when you add this extra little component at the end, which then
goes ahead and triggers whatever that vulnerability would be.
And then I just sit back and I wait, and everybody vibe codes.
Eventually this thing works out and then I go look for it.
And so these are the novel types of supply chain attacks that I'm
thinking of, that I'm hoping the bad guys aren't gonna think about.
I don't think they watch my podcast, but if so, yeah, give us a shout

(36:48):
on LinkedIn and let us know.
Yep.
The dark side is listening too. But what we find then is that people,
in general, given a choice between a tremendous workload, I
gotta get a whole bunch of stuff done,
and a shortcut, most people are gonna take the shortcut.
And so the availability of models that can generate code, whether it's Python

(37:11):
code or whatever it happens to be, that seems to run, that doesn't go through the
traditional code validation, as you said, an AppSec engineer being able to say, hey,
wait a minute... How do you keep up with,
you know, the infinite monkeys on infinite typewriters generating
code, when it's really just your
people vibe coding, or some vice president of sales who said, I think it'd be really

(37:33):
cool if I could do something like this.
And hey, it works.
Off it goes, and they wanna plug it in.
How do we create a gatekeeper function in our enterprises to ensure that, with
this rise of AI generated code, we can check for the provenance of it?
I don't know if we can tag it somehow, or the reverse might be almost like
a canary: the absence of a tag.

(37:56):
It says this was not developed by one of our developers. So I want to go
ahead and give everybody their own...
and this is me thinking on the fly, being the security
guy... I'm gonna give everybody...
you have your public-private key pair and things like that.
You are gonna sign your code, and there's gonna be signed code there
that's gonna be based upon a signature from a key pair that is signed by
a CA. And now any block of code

(38:19):
that doesn't have that stands out.
And oh, by the way, if you put that signature onto a piece of AI
generated code, you agree in writing that you may be terminated, with
prejudice, for violating that.
Now, I didn't say you can't do it, but you can't sign it.
What do you think about something like that?
Would that at least cause a little bit of pause, or at least allow the AppSec

(38:42):
team to look at stuff and say, hey, all this code was generated last week;
90% is signed, 10% is not.
I'm gonna presume that it came out from this vibe coding, so
it's gonna get special attention.
Thoughts about doing an approach like that?
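Worth noting that Git's built-in commit signing (git commit -S with GPG or SSH keys) already implements most of what G Mark sketches here, and a CI job can refuse or flag unsigned commits. As a toy illustration of the sign-and-verify step itself, here is a sketch using Ed25519 keys from the cryptography package; the routing policy in the comments is G Mark's hypothetical, not an established control.

```python
from pathlib import Path
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# One key pair per developer; in practice the public keys (or the CA vouching
# for them) would live in the CI system, not beside the code.
def sign_file(path: Path, private: Ed25519PrivateKey) -> bytes:
    return private.sign(path.read_bytes())

def verify_file(path: Path, signature: bytes, public: Ed25519PublicKey) -> bool:
    try:
        public.verify(signature, path.read_bytes())
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    private = Ed25519PrivateKey.generate()
    public = private.public_key()

    source = Path("example_module.py")  # hypothetical file name
    source.write_text("print('hello from a signed, human-reviewed change')\n")
    sig = sign_file(source, private)
    print("signed content verifies:", verify_file(source, sig, public))

    # Anything unsigned, or failing verification, would get routed to the AppSec
    # queue for the extra scrutiny described above.
    source.write_text("print('unreviewed generated code pasted in later')\n")
    print("after modification, verifies:", verify_file(source, sig, public))
```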

Chris Hughes (38:55):
Yeah, I think that could work in particular organizations.
Again, going back to the kind of risk tolerance thing, and getting
people willing to sign on the dotted line, whether it's the
executives or the developers, in terms of the code that's being produced.
I think implementing sound security practices like CI/CD pipelines
and SAST, DAST, SCA, all the acronym soup of security tooling

(39:15):
that we're all used to, of course should be part of any deployment
process to production environments.
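As one concrete example of wiring that acronym soup into a pipeline, here is a minimal sketch of an SCA gate. It assumes pip-audit is installed and relies on its non-zero exit code when known-vulnerable dependencies are found; in practice this would run as one step of a CI job rather than as a standalone tool.

```python
import subprocess
import sys

def sca_gate(requirements: str = "requirements.txt") -> int:
    """Run pip-audit against pinned dependencies and fail the build on findings."""
    result = subprocess.run(
        ["pip-audit", "-r", requirements],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print("SCA gate failed: vulnerable dependencies reported", file=sys.stderr)
        print(result.stderr, file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(sca_gate())
```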
And then the open source piece is problematic, because it's
just such a massive ecosystem that tracking every single library and
component and dependency down, and trying to determine whether it's known
or trusted or not, is a big problem.

(39:36):
Some organizations are standing up
OSPOs, Open Source Program Offices, that can help 'em implement
some governance and rigor around how we consume and use open source
in our software development and our applications and things like that.
But again, it's a big undertaking.
You gotta have the resources to do that.
You gotta be a large enterprise that can afford to fund

(39:57):
a team for that purpose.
And then just keeping up with it:
anyone who's had to manage, say, golden images, if you're familiar with that term,
you're always getting new requests, you're always making
updates, you're always patching.
Now imagine doing that for thousands and thousands of open source
libraries that you need to maintain.
Say, this one's trusted.
This one's trusted.
Oh, we have a request for 87 more.
We need to go review those and approve those.

(40:18):
So it could be problematic.
Having tools that can help look at the pedigree, the provenance,
the vulnerabilities, when it was last maintained or updated,
is it gonna be reachable at runtime,
all the things that we've talked about, can go a long way as well.
The idea that you won't have some kind of mechanism and some kind of
control in place is just absurd;

(40:39):
it's not a viable strategy. But we do know, as you said, it's human nature.
We're always looking for the shortcut, whether it's financial
or physical fitness, whatever it is;
we love our shortcuts,
and we're gonna do the same when it comes to software development.
Not to mention the competing incentives we talked about, like
speed to market and revenue.
So we gotta have these mechanisms in place, and, in my opinion, we should be

(41:01):
looking to use the same technology to our own benefit, rather than being a
laggard, as security often is, and late to the party, and bolted on, et cetera,
like we have been in previous technological waves like cloud
and SaaS and mobile, et cetera.
We should be using this technology early and often, seeing how we can use
it to our own benefit and improve the security of the organization.

(41:22):
'Cause, to your point, the attackers are early adopters of this.
They're already using this for improving phishing, improving and
inserting malicious code, improving all the things that they do,
so we should be looking to do the same on our defensive cyber side.

G Mark Hardy (41:35):
Yeah, I'm thinking, I'm gonna be at RSA next month, and
almost for fun, I like to make my AI bingo card, where I'll take 24
different categories from Gartner.
My free spot is AI.
And anytime you see a vendor who fits into one of these categories that
has AI in their advertising, you get a check, and then all of a sudden
someone yells, like, AI bingo, and people are, what is that person talking about?

(41:55):
And it says, yeah, you just went over the top for this particular thing.
It's a little bit of fun; at least it gets people going around, getting a
stamp, collecting all the stamps, and then going ahead and getting involved in a drawing.
But I think what we're gonna see, and you had very good insight on that one, is that
AI is being used both by adversaries

(42:16):
and by our people who are creating code, who are not really our adversaries
per se, but they do represent kind of a competitive force within the
organization, both for budget, for time to market, things such as that,
and then also for political viability.
Because at the end of the day someone might argue, hey, security, how many

(42:36):
dollars have you put to the bottom line?
But we put the money to the bottom line.
And I said, yeah, but we make sure that it gets to the bottom
line, that it doesn't go away.
Putting AI into our defensive strategies, being able to look at the tool sets that
we have right now... we turn to vendors,
and so for those of us who served in the military, we know the term COTS, right?
Commercial off-the-shelf software.

(42:57):
And so when we have a room full of developers, we go, why don't we
just find a COTS application for it?
It never is a perfect fit.
So, kind of a last thought as we're running outta time here,
but do I reduce my risk by
crowdsourcing my software with the equivalent of the COTS, or
am I managing my risk better by doing my own development in-house?

(43:21):
And that almost sounds like a toss-up question I should put out on LinkedIn,
but what are your thoughts on that?

Chris Hughes (43:26):
I think you should definitely put it out on LinkedIn.
But in my experience, and I've done a lot of work with large federal
agencies, as we talked about, throughout my career, this build versus
buy decision is constantly something that they're trying to navigate.
And it really comes down to resources and expertise and core competencies.
For example, do I have the financial resources and the security
and development resources to go and build something myself?

(43:48):
Maintain it in perpetuity, indefinitely, right?
And make it more secure,
'cause you gotta patch it, you gotta maintain it, you gotta
securely configure it, you gotta do that forever, basically,
as long as your organization or others relying on you use it.
Or am I better off going to use a commercial off-the-shelf
software solution?
And even that is a spectrum.
There's a big difference between, say,

(44:09):
a GitHub, right,
which is run by Microsoft and used by millions of developers around
the world, versus a mom and pop SaaS startup that just started and doesn't
even have a security team member.
So the COTS solution is a big spectrum,
just like building it yourself is: maybe you have two developers,
maybe you have 20 developers and a security engineer embedded with them.
It really depends on what your resources are and your core competencies.

(44:33):
Do I wanna be in the business of building and maintaining this thing?
Or is my core competency delivering value to my customers,
my stakeholders, my mission owners, and instead buying something and
letting, you know, the commercial industry do what it does best?
And that's something each organization has to navigate, and it might look
different even within the organization, depending on the particular tool
or software or capability that you're discussing at the time.

(44:55):
So it's always "it depends," in my opinion; there's
no perfect answer on that.

G Mark Hardy (45:00):
That's a good insight and a good way to dodge it, but
you're right, it does depend, as my lawyer buddy would tell me.
And we are out of time.
So, Chris Hughes, I want to thank you very much for being on our show.
I appreciate the fact that you've been a fan, if you will, of CISO
Tradecraft and now you're a part of it.
And I do appreciate also your contributions
to the community with the books that you've written on effective

(45:20):
vulnerability management, software transparency, and people can go and
find those up on the net.
And you are on LinkedIn as Resilient Cyber, and people can find you there.
So thank you very much for being part of the podcast, Chris.
And for those who are listening to CISO Tradecraft, we have a lot more
than just podcasts, so make sure you're following us on LinkedIn.

(45:42):
We also have our YouTube channel.
If you wanna see my smiling face once a week.
And until next time, this is your host, G Mark Hardy.
Thank you for being part of CISO Tradecraft, and stay safe out there.