
October 21, 2024 · 47 mins

Building secure software isn't optional—it's critical. Here's how you can do it right!

In this episode of The Audit presented by IT Audit Labs, we're joined by Francis Ofungwu, CEO of DevSecFlow, to break down the urgent topic of software security. Together with Nick Mellem and Bill Harris, we dive into the common security threats developers face today and discuss the vital steps every company should take to secure their software development lifecycle.

In this episode, we’ll cover: 

  • The biggest software security threats developers face in 2024 
  • How to integrate security seamlessly into the software development lifecycle 
  • The convergence of infrastructure security and software security 
  • The role of AI in secure coding and software development 
  • The importance of threat modeling and attack surface reviews 
  • How to create a more resilient software supply chain and manage risk effectively 

Whether you’re a developer, security pro, or IT decision-maker, this episode is packed with actionable insights to elevate your security strategy and ensure your software is built to withstand today’s evolving cyber threats. 

Don’t forget to hit that subscribe button and drop a comment below on your top takeaway! 

#CyberSecurity #DevSecOps #SoftwareSecurity #AICoding #IncidentResponse  #ITSecurity #CloudSecurity #RiskManagement


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joshua Schmidt (00:04):
Welcome. You're listening to The Audit, presented by IT Audit Labs. I'm your co-host and producer, Joshua Schmidt. As usual, we're joined by Nick Mellem, and today we have Bill Harris from IT Audit Labs filling it out as well. And we're joined by Francis Ofungwu; Francis is Chief Executive Officer at DevSecFlow and Global Field CISO at GitLab,

(00:25):
among many other hats he's wearing here. He's got a pretty prolific LinkedIn page, so we'd like to hear more about that. So, without further ado, I'll hand it over to you, Francis. Our topic today will be software security, so maybe give us a little background on yourself and tell us how you got interested in what you're doing now.

Francis Ofungwu (00:45):
Sure, and just a quick point of clarification. You can choose to leave this in, but I left GitLab about a month ago to start my own firm. So you're right, I am the CEO of DevSecFlow. We are a company focused on helping organizations build secure, reliable software, and my background is fairly

(01:06):
straightforward, like most people in tech or in cybersecurity: I've been doing this a while, but in different domains. So I started off doing sort of the security analyst role at a company called Rackspace, which is a cloud and managed hosting company based out in Texas. For a lot of my time at Rackspace I worked out of the London office, where I spent close to six years helping set

(01:28):
up that team and establishing a security capability for Europe (sorry, for EMEA and APAC, I should say), and during that time I sort of learned a lot around the initial days of cloud and exactly what customers were looking for for all things software security and cloud security.

(01:48):
I think that's where I cut my teeth in this space. So for the last 20 years I've had a variety of roles in that software security and application security space, doing CISO roles, product roles, field CISO roles, like you just mentioned, and really helping organizations understand where their risks truly are in this space and coming up with more

(02:09):
robust solutions to help them address that.

Nick Mellem (02:12):
That answered one of my questions, Francis. I was going to ask how you ended up doing schooling in London, so it sounds like you were working over there and you ended up doing some schooling.

Francis Ofungwu (02:22):
Yeah, I'll give you the short version of the story. So I wanted to go back to school to get a master's degree in cybersecurity, and I had two options: one in Chicago and one in the UK. The UK option was shorter, but it also meant I got to live in London for what was supposed to be 18 months and turned out to be

(02:43):
close to seven years in total. So yeah, that was my sort of entry into that market and that part of the world, and I loved every second of it.

Joshua Schmidt (02:54):
So I usually like to start out the podcast with a little icebreaker question: are you a soccer fan now, or do you like to watch football, Francis?

Francis Ofungwu (03:02):
All of the above. So, soccer fan: I actually coach my son's under-10 soccer team, so I get to wear that hat on the weekends, but I still like football. I have the fortunate or unfortunate distinction of being a Bears fan, and that's come with its own trials and tribulations over the last few years.

(03:23):
But yeah, I tend to dabble in both.

Joshua Schmidt (03:27):
I can relate as a Vikings fan; we share a similar disposition. How about you, Nick? We don't know what side of the fence Nick's on. Now he's down in Texas, but he's been a lifelong Vikings fan, so we have a game coming up. Feeling a little torn about that?

Nick Mellem (03:42):
Vikings fan. But I gotta say, this weekend I think I'm going for the Texans, because the Vikings have let me down for my whole life. So I figured, you know, maybe we can bring it to them this time. So I'll be going for the Texans this weekend.

Joshua Schmidt (03:56):
Bill, what do you like to do on Sundays, while we're all wasting our day sitting inside watching TV?

Bill Harris (04:02):
Yeah, I would like to do that too, but I've got a 12-year-old who's really active, so I'm taking her to softball now for the next several weeks.

Joshua Schmidt (04:12):
That's great, yeah. Well, we'll jump back into it. And yeah, we wanted to talk to you, Francis, about software security; that was one of the things from our pre-production meeting, and it seemed to be something that really animated you. So I wanted to ask you: what are some common security threats that developers are facing right now, when they're developing

(04:33):
modern software projects? And maybe you can give us kind of a high-level overview of how you view security in software development.

Francis Ofungwu (04:42):
Yeah, so I've been focused on software security, and maybe application security as well, significantly focused over the last five years. I've dabbled throughout my cybersecurity career, but really the last five years have been where I've taken a deeper view on the subject matter. And really it started while I was on the

(05:02):
consulting side of my career, where we were advising all these large Fortune 500 organizations on supply chain threats, but focused more on physical, and then the whole digital supply chain came up as, at the time, a smaller risk than the physical supply chain security problems. And what we saw in

(05:27):
those last five years is that this dependency we have on our digital ecosystem, whether it's cloud, software, whatever it is that we're doing as part of our digital tool chains, has created this new risk: this digital supply chain risk, or software supply chain risk, that most organizations aren't as equipped to handle as they are their physical security or physical supply chain risk.

(05:48):
So really, in that time, when we speak to either CISOs or CTOs, or really people that have responsibility for delivering digital products, whether that's through software or any other digital product, we're hearing that, because of the lack of cohesion in the development process, trying to get governance over everything

(06:11):
from the ideation process to the release process is a challenge. And back to your question around developers: I think it starts with developers, or developers are a huge stakeholder in this process, but it's not just their responsibility. It's: how do we get every stakeholder involved, from your developers to your security engineers, to your release

(06:33):
engineers, everybody that exists in this whole (it's not even new anymore) DevOps work chain or tool chain? How do we get them all singing from the same hymn book around every governance step required to secure the end-to-end development lifecycle?

Nick Mellem (06:50):
Wow, that sounds like a heavy lift. You mentioned physical security before; I was just curious if you guys were doing any sort of social engineering work alongside that.

Francis Ofungwu (06:58):
So I had responsibility for data center security as well in my previous life, so there were some social engineering aspects to that role. But doing data center security is an interesting beast, because you are going out to the middle of nowhere, to data centers that

(07:20):
are unmarked, and trying to figure out the best way to get into man traps or things that are typically quite difficult to penetrate. But I can't say I miss those days around the data center security side. I'm definitely more comfortable in the software, digital world.

Nick Mellem (07:39):
I'm sure you've got a lot of fun war stories from those days.

Francis Ofungwu (07:43):
Oh yeah. As much as you want to present your best foot forward, or your best case forward, for data center security, what you end up doing is having to manage a lot of third-party audits. When I say put your best foot forward, I mean you have your SOC 2, your ISO reports that say: here are all the controls that

(08:04):
you have. But because you have all these fairly large public sector and financial sector institutions that have the right to audit, you're basically managing a lot of auditors coming to your door and trying to validate the controls themselves. And we've had people show up and try to climb water tanks to make sure

(08:24):
the water is at the right level, or do crazy things that are not necessary from an auditor standpoint. But yeah, as I said, I don't miss those days.

Bill Harris (08:35):
Do they just show up and try to do that, or do they ask for permission?

Francis Ofungwu (08:39):
They have permission based on their contractual agreements; it's not just knocking at your door, but there is never a no. Audits were the same in my experience.

Joshua Schmidt (08:52):
Nick and Bill, maybe you could speak to how you view software security and how that shows up in your security practices, and then maybe pass it back to Francis to give us some insights, or maybe some stories. In the real world, how does that show up? What does that really mean? Is that like the CrowdStrike thing that we've seen, when those things aren't done correctly? Or how does that show up as a threat or a risk in the real

(09:16):
world?

Bill Harris (09:17):
Sure, yeah. So, Francis, I see a rift between, say, the folks who work on infrastructure security and the software security fields. How has it been trying to bring those two groups of people together, to really collaborate and come up with more of a holistic security perspective?

Francis Ofungwu (09:37):
Yeah, I'd actually like to hear more about the rift, because it differs slightly from what I'm seeing. I would say two, three years ago there was definitely that distinction between what you had to do for infrastructure security and what you had to do for software security, in terms of coming up with a holistic software supply chain solution.

(09:59):
But what we're seeing now, especially as more people adopt cloud infrastructure and use a programmatic way of configuring their cloud infrastructure (as in infrastructure as code, or just basically sending your cloud provider instructions via scripts and code), is that there is an intersection between the type of tests you have to do for your software security health and

(10:22):
your infrastructure security health. So I'm actually seeing those two worlds converging to having the same pipeline of activities that you have to follow to achieve the same level of assurance. So, if you don't mind, I would love to hear what you're seeing in terms of that rift that you mentioned.
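The convergence Francis describes, testing infrastructure defined as code with the same static checks used on application code, can be sketched in a few lines. This is a minimal illustration only; the resource schema and rule names below are invented, not any specific tool's format:

```python
# Sketch: static checks over an infrastructure-as-code definition,
# run in the same pipeline as application security tests.
# The resource schema here is hypothetical, for illustration only.

def check_resources(resources):
    """Return a list of (resource_name, finding) tuples."""
    findings = []
    for res in resources:
        if res.get("public", False):
            findings.append((res["name"], "publicly exposed"))
        if res.get("encryption") != "enabled":
            findings.append((res["name"], "encryption not enabled"))
        if "*" in res.get("allowed_actions", []):
            findings.append((res["name"], "wildcard permissions"))
    return findings

if __name__ == "__main__":
    # Example infrastructure definition, expressed as plain data.
    resources = [
        {"name": "logs-bucket", "public": True, "encryption": "enabled",
         "allowed_actions": ["read"]},
        {"name": "app-db", "public": False, "encryption": "disabled",
         "allowed_actions": ["*"]},
    ]
    for name, finding in check_resources(resources):
        print(f"{name}: {finding}")
```

Because the infrastructure is data, the same check runs on every commit, which is the "same pipeline of activities" point.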

Bill Harris (10:41):
Yeah, so I think you're right, insofar as there is this convergence from on-premises infrastructure into the cloud, and then you're changing the whole paradigm from infrastructure on hardware to infrastructure as code, right? To your point, it's coming together. Really, the quote-unquote rift I'm referring to is for private clouds, right,

(11:04):
and private infrastructure, where you have teams of people highly skilled in securing their infrastructure, securing their servers, their networks, their IAM and so forth, and then you have the developers who have to work within all of that,

(11:25):
right? And how do they secure their applications, and have their applications work in a way that's compatible with sort of this self-managed infrastructure, right, where typically you don't have infrastructure engineers who are super skilled at development tasks? So there's a little bit of a language barrier there.

Francis Ofungwu (11:44):
Yeah. So I would say that my experience in that space is: developers are obviously trying to move at the speed of the business, right, and they're looking for some level of frictionless guidance to implement those security best practices, or those compliance obligations, or whatever you want to govern that end-to-end delivery lifecycle with.

(12:07):
The challenge that we have today is an over-reliance on tools within that delivery lifecycle, and tools, for lack of a better description, are garbage in, garbage out. So if you don't have the engineering capacity to program them, or to implement the tools so they have the context of your

(12:27):
environment, you're going to be getting a lot of false positives and things that essentially detract from the trust you're trying to establish with your developers. There are so many examples in my last two gigs, or last two stops, where we've gone to developers and said, here's the spreadsheet of CVEs in your specific code, and then

(12:52):
the first four lines are not applicable, this one is not exploitable in production. And they just go down the list, do a very high-level triage of the findings, and tell you why your tool is not giving you the information required to secure the entire delivery lifecycle. It starts to erode that trust, and then they start to have this conflict, or this lack of collaboration, between

(13:15):
what security is saying and what developers are trying to achieve. The way you address that is having some level of understanding from the security side around what the value stream, from, again, ideation to delivery, looks like within your organization, and really getting into the weeds, for lack of a better term, around the delivery cycle.

(13:37):
So you're giving more bespoke guidance to your software developers, and not just something you can't really explain back to them because whatever scanning tool you've implemented has said that's a problem. The way that we've seen it work in the past is really engineering solutions throughout the development lifecycle. What do I mean by that? It's not just doing static scans on the

(14:00):
code they're developing; it's: how are we doing permissions? How are we doing the build process? Can we get some level of attestation or assurance that your build artifacts have gone through some level of provenance? There are so many steps that you have to do validation on, and if you can create a paved road, a frictionless paved road, for

(14:22):
developers to go through that level of testing, then you have essentially won them over, for lack of a better term, if you're able to make that part of their instrumentation and not just "here's a document or a Confluence page you have to go read on secure development." It has to be those instructions codified into the development

(14:43):
lifecycle. A lot of what we're seeing is: how do we get an engineering solution for an engineering problem?
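One concrete form of the attestation step Francis mentions is verifying that a build artifact still matches the digest recorded at build time. A minimal sketch using only the standard library; the provenance record format here is made up for illustration:

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Digest of the artifact bytes, as recorded at build time."""
    return hashlib.sha256(data).hexdigest()

def verify_artifact(artifact: bytes, provenance: dict) -> bool:
    """Accept the artifact only if its digest matches the provenance record."""
    return sha256_digest(artifact) == provenance.get("sha256")

if __name__ == "__main__":
    built = b"example build output"
    # Hypothetical provenance record written by the build system.
    record = {"builder": "ci-pipeline", "sha256": sha256_digest(built)}
    print(verify_artifact(built, record))               # unmodified artifact passes
    print(verify_artifact(b"tampered bytes", record))   # modified artifact fails
```

Real provenance schemes add signatures and build metadata on top of this, but the digest comparison is the core check a deploy step can codify.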

Nick Mellem (14:50):
Sounds like you need to be a good project manager.

Francis Ofungwu (14:52):
To some degree, yeah. You have to understand the end-to-end delivery lifecycle, which comes with project management, but you also have to, fortunately or unfortunately, have some engineering knowledge as well that says, here's what good looks like. And if you don't have the engineering knowledge, you have developers or engineers that have that knowledge, and you can collaborate with them and have them be part of the solution, as

(15:16):
opposed to sort of throwing stone tablets from above.

Joshua Schmidt (15:20):
So I'm going to be coming in with kind of the lowbrow questions here, since my background is music, music production and audio production and stuff. So I get to play the role of kind of the average Joe in these types of conversations, because they get really complicated really quickly, right? So my question was: are you basically training

(15:42):
developers to start thinking more like cybersecurity professionals, in a way? Or how does that show up when you're guiding them to start thinking about each one of these steps from a security standpoint? Is it based in coding, or is it just kind of a multi-stage process throughout the whole development?

Francis Ofungwu (16:00):
So I think it's both sides. You have to educate both sides. You have to have developers think in more cybersecurity terms, and then you also have to have cybersecurity folks think in more engineering and developer terms. Maybe you don't have to choose, but if I had to choose which side to focus my efforts on, meaning do I get more

(16:29):
developers to think like cybersecurity folks, or more cybersecurity folks to think like developers, I would do the latter. I'd have more cybersecurity folks think like developers and understand the engineering lifecycle, because I find that us cybersecurity folks are better equipped to have those engineering and developer conversations if we truly understand what their value streams look like. And I don't

(16:52):
know if a lot of us in the cybersecurity space are there yet. Especially on the compliance side, where we're trying to get evidence around the right things being done throughout the development lifecycle, we're not asking the questions that will give us that level of assurance, in my experience.

Nick Mellem (17:11):
Yeah, no, I agree with what you're saying, Francis. We definitely want engineers, or cybersecurity professionals, that can think more in engineering speak, and I think that speaks to what you were just saying: it closes that gap of the rift, right? It brings everybody closer together. People can, I guess, for lack of a better term, speak out of both sides of their mouth, right? They understand what's happening on both sides, and that's kind of why I brought up

(17:33):
the project manager portion. So I think if you're able to pull both sides together, you can walk that walk, especially in an audit. I do a lot of audits day to day for the organizations I'm working with now, and a lot of times we find people running away from us with their, you know, hair on fire, because they don't want to talk to the guy that's holding the audit in

(17:53):
their hand.

Eric Brown (17:54):
So, you know, but if you come to them being able to speak...

Nick Mellem (17:57):
...you know, the terms that they're used to, and in a polite manner, obviously, but speak to what they're used to, what they work with day to day, you know, I find a lot better outcome.

Francis Ofungwu (18:08):
Yeah, and to your point around project management: I think the best project managers that I've seen know what good looks like. They don't necessarily know how to engineer or how to write code, but they know what good looks like, what a good project execution should look like. And I would say the same thing for audits, and for compliance, and the security personas: they

(18:31):
should know what good engineering looks like, what well-governed engineering looks like, without having to know exactly how to execute it. And that training and that understanding is something that I'm asking a lot of the people I work with, both in our firm and the customers that we serve, to look into as part of their continued development.

Nick Mellem (18:51):
Yeah, I guess, you know, there's a lot going on. Obviously every day in the news, every day on Bleeping Computer, there's something crazy going on. But we talked about this in the pre-production meeting, and I also wanted to bring it up to get your thoughts on the recent issue with CrowdStrike: what they're doing, and maybe what tactics or tools

(19:12):
could they have used to make sure that situation didn't happen, or what could they do going forward to make sure it doesn't happen again?

Francis Ofungwu (19:22):
Yeah, so I'll avoid speaking about CrowdStrike in specifics. I know there are a lot of details out there around what happened and what may have happened, but I would say that what happened to CrowdStrike is emblematic of what we're discussing: an over-reliance on tools, and not the right engineering to apply those tools in our systems.
So the way we rely on software today, and not just CrowdStrike,

(19:46):
if you look at SolarWinds from about three, four years ago, or the Kaseya issue from a few years back as well: when there is an issue within that supply chain (and in the CrowdStrike case it was obviously something at a kernel level), we don't know how the issue got there, because we don't know how these deployments happen within our environments, and we don't

(20:07):
have a good recovery plan to get back to green. And what I mean by that is: we are wholly reliant on either the vendor, or whoever installed whatever software we're using, to get it up and running, and we don't have the in-house knowledge or resources to know exactly how to back out of that

(20:29):
specific update, or back out of that specific install. That over-reliance is what's driving our resilience challenges when it comes to software supply chain risks, whether it's today with CrowdStrike or the next one. We need to invest in some level of know-how within the organization to say: it's not just that the vendors need to do

(20:50):
right by us; we need in-house resources who understand exactly how to recover from such an incident, because these incidents aren't going away. The speed of software delivery means that there's going to be another one just around the corner. And, different from the CrowdStrike incident, there was Log4j three years ago as well:

(21:12):
we knew that there was an issue, but we couldn't find out which of our components, or which of our software, relied on that specific package or that specific library. So the asset management, or the understanding of the environment, was something that was lacking for a lot of organizations, and that's why they struggled with the recovery.

(21:32):
So, repeating myself a little bit, the true path forward is getting some level of analysis around where we have these things deployed, how to back out from those deployments, and how to quickly recover when we identify the next issue. Because it's something that is going to happen again, and I'm pretty

(21:54):
sure I can bet money on that.
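The asset management gap Francis describes with Log4j, knowing which of your systems depend on an affected library, reduces to a query over a dependency inventory. A hedged sketch; the SBOM-like inventory format below is invented for illustration:

```python
# Sketch: answer "which services use the affected package?" from a
# dependency inventory. The inventory format is invented here.

def affected_services(inventory, package, bad_versions):
    """Return service names that depend on a vulnerable package version."""
    hits = []
    for service, deps in inventory.items():
        for name, version in deps:
            if name == package and version in bad_versions:
                hits.append(service)
    return sorted(set(hits))

if __name__ == "__main__":
    inventory = {
        "billing-api": [("log4j-core", "2.14.1"), ("guava", "31.0")],
        "reports":     [("log4j-core", "2.17.2")],
        "gateway":     [("log4j-core", "2.14.1")],
    }
    print(affected_services(inventory, "log4j-core", {"2.14.1", "2.15.0"}))
    # -> ['billing-api', 'gateway']
```

The hard part in practice is keeping the inventory accurate and complete; once it exists, the recovery question becomes answerable in minutes instead of weeks.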

Nick Mellem (22:02):
Yeah, I agree with everything you said. I think a lot of us woke up, well, in the middle of the night, to a very real tabletop exercise that hopefully we don't have to do again anytime soon.

Francis Ofungwu (22:08):
Exactly, exactly. And you know, tabletop exercises are something I've done throughout my career, but we're always in a position where we're doing more of a breach-type tabletop versus a resilience tabletop. Obviously we need to do both, and as we look at the software

(22:29):
landscape and we see our dependency on software increase over time, we have to start doing these resiliency tabletops, because that's going to be as important as, you know, confidentiality issues or breach issues going forward.

Nick Mellem (22:46):
Yeah, so we want to play more offense, I think you're describing, than defense. We don't want to react all the time. We want to be thinking about these outcomes and, you know, get a solution in place for all these different situations that could come up.

Francis Ofungwu (23:00):
Yeah, especially for critical customer-facing assets that we're dependent on for the lifecycle of our business, the lifeblood of our business, we need to make sure those assets, especially if they're cloud assets, have some level of recovery or some level of resilience, because those tools or those platforms aren't going to be up 100% of

(23:23):
the time. Make sure that call tree, those phone numbers, are updated.

Nick Mellem (23:27):
Yeah, absolutely.

Joshua Schmidt (23:29):
I want to throw it to Bill for a second and back up a little bit in our conversation about having security professionals thinking like developers. Bill, with your wealth of information and education, have you found yourself thinking like a developer at times when setting up architecture, mitigating risks, or analyzing organizations?

Bill Harris (23:49):
Yeah, so I have, increasingly, especially as we pivot into the cloud, especially for smaller organizations, where everything is tool-driven. And as I work with developers who come to me saying, hey, give me some insight into what I need to

(24:11):
do to secure my code, I find myself wishing I thought more like a developer. But that does lead to a question that I had for Francis. In your answers, you have suggested that developers have an overreliance on tools, and it

(24:36):
sort of reminds me of when engineers got really used to the GUI and kind of lost the command line, really digging down deep. And so it seems like a similar thing might be happening. As these smaller companies in particular move into the cloud, and they really rely on tools and these cloud-enabled

(24:57):
levers that they can pull, are they really losing their grasp on the underlying development security principles? And what challenges do you see present now in the public cloud that we may or may not be prepared for, and what can we do

(25:18):
to get ahead of them?

Francis Ofungwu (25:19):
Yeah, I'll answer your last question first, just because it's top of mind.
So we see a lot, whether it's on the cloud infrastructure side or the development side: that reliance I talked about earlier is a result of this move towards microservices architecture, so containerized

(25:41):
applications, and using that way of segmenting applications, typically for resiliency. But because of all these microservices, we have various tools and services and identities talking to each other for performance. So when you look at your typical microservices architecture, the

(26:02):
permissioning structure, or the identity structure, to make all these things sing can be very complex, and the way we get ourselves in trouble is: well, it works, so let's not look under the hood to see what's talking to what. And what happens in that setup is you have either accidental or

(26:25):
malicious capturing of either a service account or a developer's credentials, and once you do that, you sort of have the keys to the kingdom. Your ability to move laterally, or your ability to get sensitive information or to cause resiliency events, is fairly trivial once you get those permissions.

(26:46):
So we're basically building bad permissions at scale, without having the know-how, or applying the engineering effort, to really make sure that we're securing those identities at scale. That's where I see the biggest risk in this space: the tools and technologies that we've used and cobbled together into

(27:09):
microservices architectures are so reliant on getting secrets and passwords and permissions to talk to each other, but we don't really have a way of managing that at scale. So that's the first piece. To answer your earlier question: developers want to do the right thing, in my experience. Yes, they may look for a fast track to

(27:30):
production, but if you're able to instrument controls in their world, meaning that you're going to give them code, a programmatic way of codifying your security rules, not telling them to go read a document around the security policies, they're able to implement that within their pipelines. And I feel that in a lot of organizations you hear

(27:53):
developers say, well, nobody told me how to do that. It's like, well, there is a policy that says you have to develop software, develop applications, securely. Yeah, I know that, but it's not been translated into my world. And that's where I think we have an opportunity, as governance professionals, to help them translate those policies into code, and there are various tools out there that

(28:16):
are available to help you do that. And I'm seeing more standards that offer very bespoke secure software development frameworks in this space: things like the OWASP Software Assurance Maturity Model (OWASP SAMM), and things like the NIST SSDF, the Secure

(28:38):
Software Development Framework. These are now speaking specifically to this problem, and offering us governance professionals specific steps we can take to developers to help them secure their pipelines. And with these frameworks, we're able to have a more coherent conversation around the gaps and the risks that exist in this space.
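Translating policy into code, as Francis puts it, can be as small as a pipeline step that fails the build when a manifest violates a written rule. A minimal sketch; the manifest fields and policy names here are hypothetical:

```python
# Sketch: a pipeline gate that codifies two written security policies
# as executable checks. Manifest fields are hypothetical.

POLICIES = {
    "no wildcard permissions": lambda m: "*" not in m.get("permissions", []),
    "secrets from a vault, not inline": lambda m: not m.get("inline_secrets"),
}

def evaluate(manifest):
    """Return the names of policies the manifest violates."""
    return [name for name, rule in POLICIES.items() if not rule(manifest)]

if __name__ == "__main__":
    manifest = {"service": "checkout", "permissions": ["read", "*"],
                "inline_secrets": ["DB_PASSWORD=hunter2"]}
    violations = evaluate(manifest)
    for v in violations:
        print("policy violation:", v)
    exit_code = 1 if violations else 0  # a nonzero exit blocks the deploy in CI
    print("exit code:", exit_code)
```

The developer gets an actionable failure at commit time, in their own tooling, instead of a policy document they were never pointed to.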

Joshua Schmidt (29:00):
How do you identify what is a threat before releasing the software? How are you testing things out? Are you identifying these threats because you've seen other software have problems, like we've seen show up in the real world where things break? Which comes first, the chicken or the egg there?

Francis Ofungwu (29:23):
Yeah, I would say there are two approaches to that. There's a macro approach at the high level, where each use case, or each new design element of your software, or new software, should go through some level of threat modeling to uncover the bad things that could happen. Honestly, a lot of organizations that I've worked

(29:46):
with, or that I know, don't do this well at scale. They do it sometimes, but you have a lot of legacy applications running today that didn't have the opportunity to go through a true threat modeling exercise. At a more micro level, at a more grassroots level, there is the opportunity to do something called an attack surface review.

(30:07):
Where should we be concerned? What is being exposed publicly? What kind of identity provider are we using? Where do we have multiple accounts that haven't been changed in the last few years? We can do an attack surface type of mapping to get a more

(30:28):
pragmatic view of the areas that we should double-click on.
Francis Ofungwu (30:32):
And a combination of those two things, the attack surface analysis and the threat modeling exercise, should give us some level of comfort that we're topping and tailing this risk identification piece, if that makes sense. And please feel free to pull me out of the weeds if I'm going too deep, but I want to make this as surface level as possible so

(30:54):
people understand what we're trying to solve for here.

Joshua Schmidt:
Now I'm curious about that, because the software, especially our operating systems, gets updated so frequently because of security threats they're identifying. A lot of the time it seems to me that they're not really testing it, that the users are the beta, and they're

(31:16):
kind of waiting to see what the feedback or what kind of support tickets come in, and they're trying to roll things out as fast as they can. And we're kind of in a culture, with our technology and with sales and marketing, where the shiny new thing is what gets the most attention, and we're in an attention economy. So you know, updates all the time on your phone, updates all

(31:36):
the time. And I've said this before on the podcast: as a software engineer, there are times where I just have to skip updates because they will break third-party plugins or things won't be talking to each other. It even came up in our CrowdStrike conversation: maybe don't push this update to everyone at the same

(31:57):
time. Maybe choose a segment and see how that floats with a smaller group before rolling it out to everyone. So that's kind of where my inquisition was coming from.
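The segmented rollout Josh is describing — ship to a small cohort first, watch the support tickets, then go wide — can be as simple as a deterministic hash bucket. This is an illustrative sketch; the 5 percent cohort size and the hashing scheme are assumptions, not any vendor's actual mechanism.

```python
import hashlib

def rollout_stage(device_id: str, canary_percent: int = 5) -> str:
    """Deterministically assign a device to the canary cohort or the
    general population; the same device always lands in the same bucket."""
    digest = hashlib.sha256(device_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100
    return "canary" if bucket < canary_percent else "general"
```

The update goes to the "canary" bucket first; only after that cohort looks healthy does the "general" population receive it.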

Francis Ofungwu (32:09):
Yeah, you're a hundred percent accurate. And looking into my crystal ball, what I feel is there is now an acknowledgement from legislators and governments that this is an issue, and that we have to push back on the software providers to take some responsibility for it.

(32:32):
If you look at the new Secure by Design initiative from CISA, they're basically saying: release secure software. Don't release insecure software with hardening guides for your consumers to harden it themselves; actually harden it upon release. And they're pushing that responsibility back on the

(32:52):
software producers, to actually be responsible for the secure use of their software.
We're also seeing that in the EU. We have the new DORA, the Digital Operational Resilience Act, and NIS2. These are new pieces of legislation coming out in the EU over the next six months that again put the emphasis on the

(33:16):
software provider, or the digital product provider, to have some accountability for the security of their product and their software. And what I hope is that these, like GDPR and PCI DSS before them, create a groundswell of acknowledgement that you can't

(33:38):
just keep releasing insecure software. You have to take some level of accountability for the secure use of your software, and I hope that it changes the current experience that you described.

Joshua Schmidt (33:53):
Bill and Nick, do you think that would help make your job easier, for lack of a better word, to have that onus on the software developers? What are your thoughts on that? Does that feel like it would shore up some of the risk that you're seeing pop up day to day?

Nick Mellem (34:13):
I think the question is twofold, or the answer is twofold. It's obviously going to take a lot more time in development and a lot more cost to complete that mission you're describing. But wearing a security hat, obviously we would want things to come out of the box ready, and I think we're seeing it across the board from everybody. For example, and this is where I was going to go with my

(34:34):
question before: take AI, or take Apple, for example. Their new iPhones just came out last week, releasing today, but the new software isn't coming out to further Apple Intelligence until next month. Maybe it's not directly the same thing, but

(34:54):
they're shipping a product today on the promise of software that's coming out later, and I think that's along the same lines of what you're talking about, Josh. They're giving you a product that's maybe not fully baked or fully secure, and they're leaving it up to security professionals, like all of us sitting here, to make sure that we're good stewards of data for whatever organization

(35:15):
we're working for at the time. So that's kind of a broad answer. In a perfect world everything would be ready, but then I could also say I might not have a job.

Joshua Schmidt (35:28):
So it's kind of like we're walking the line. I don't know, because it seems to me that you guys have so many things coming at you every day that if that was taken care of, you could focus more on training and cultural things, and on shoring up risk in other areas that are often neglected, because you're always trying to put out these

(35:49):
little fires. Bill, maybe you could speak to that. Do you have a take on it?

Bill Harris (35:55):
You know, it reminds me of a story. I was talking with a head of development, it's been a while ago now. He was very proud of saying that he doesn't put any firewalls in front of his products, because it makes his developers more conscientious about security. Now, I would not advocate that approach, but there's something to be said just for the attitude.

(36:22):
However, I think even if your developers are doing a great job, and they're doing secure by design, and they're following OWASP and doing everything they can, you still have to have that defense in depth, right? Because there are just millions and millions of lines of code across all these interconnected products.

(36:42):
You're never going to get it 100 percent right. There's always going to be a gap someplace. So I feel like security personnel will always have a role to play next to developers, providing that defense in depth, providing compensating controls, and that'll just be the way it is.

Francis Ofungwu (36:58):
And, if you don't mind, Joshua, just to add to the last two points that Bill and Nick made. So, yes, if we're able to operationalize a lot of the things we've talked about today as proactive efforts to secure software, it allows us to get better at the reactive

(37:18):
incident management and breach management, to Bill's point, because I don't know anyone that is really great at doing that. So if we can put out these fires as part of the development process, we can spend that time building out better incident and recovery processes, because we

(37:38):
need it. And then, going back to a point that Nick made, and this is just a slight pushback: I don't believe that building more secure, resilient software is a significant increase in cost or time, because we can engineer those practices to be part of the development lifecycle, not separate line items or separate

(38:00):
efforts. And I want organizations to embrace that message, because there is that perception: oh, we have to spend more money on security controls or governance controls. I don't believe it's significant enough to not do it, or to kick the can down the road. There are ways to make it simple.

(38:20):
There are ways to make it part of the development lifecycle, so it's not something that slows down the development lifecycle.

Nick Mellem (38:30):
Yeah, that quickly you've changed my mind, I would agree. Going back to the project management discussion I was having before, having those plans and processes in place is probably going to get you a much better, more secure package right out of the gate, and, like you said, probably not add to the time or cost if you budget appropriately. So yeah, great point.

Joshua Schmidt (38:48):
We talked a little bit about tools today. I'm curious, Francis, with the rise of AI-assisted development and generative AI, have you seen any of that come into the space to help shore up the security of coding? It seems to me that AI could easily go through those millions of lines of code that Bill just mentioned and maybe

(39:10):
point out some weaknesses, but I'm curious to see what tools are coming online for that.

Francis Ofungwu (39:15):
I'll answer the question, but I want to make a distinction first: the distinction I make in my discussions between secure coding and secure software development. There are great tools today, from an AI standpoint, that are able to give you initial or immediate feedback on the insecurity of the code you've written.

(39:38):
But what we're seeing, as I mentioned earlier, is that code generation, or coding, is becoming a smaller part of the entire software engineering effort. When we talk about, again, the microservices architecture, the development, your build pipelines, your CI/CD testing: all those parts of your software factory need some level of

(39:59):
security attention, and I've not yet seen that level of AI improvement in the rest of the software factory. So we've shifted left in terms of securing the code, getting more secure code or initial feedback, but we also need to apply that same level of engineering and automation for

(40:22):
better security throughout the rest of the lifecycle.
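One way to picture "security attention" across the whole software factory, rather than just at the coding step, is a pipeline of ordered gates where any failure blocks the release. The gates below are toy stand-ins (a naive secret grep and a hypothetical vulnerable-package list), not real scanners.

```python
# A change is a dict carrying the diff text and declared dependencies.
KNOWN_VULNERABLE = {"example-lib@1.0.0"}  # hypothetical advisory feed

def secret_scan(change):
    # Toy check: block diffs that appear to embed a private key.
    return "PRIVATE_KEY=" not in change["diff"]

def dependency_audit(change):
    # Toy check: block any dependency on the known-vulnerable list.
    return not (set(change["deps"]) & KNOWN_VULNERABLE)

def run_pipeline(change, gates):
    """Run a change through ordered security gates; stop at the first
    failing gate and report its name, mirroring a blocking CI/CD stage."""
    for name, check in gates:
        if not check(change):
            return (False, name)
    return (True, None)

GATES = [("secret-scan", secret_scan), ("dependency-audit", dependency_audit)]
```

The point of the sketch is the shape, not the checks: every stage of the factory — build, dependencies, tests, release — gets a gate, so security is not confined to the code-writing step.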

Nick Mellem (40:26):
This is the portion of AI I was going to bring up before, and maybe it's a non-issue, but I was curious: you and Bill were talking about engineers relying on tools. Do you think this is the next tool that developers are going to rely on? That AI will cover them?

Francis Ofungwu (40:42):
Yeah, it's a great point, because what you're essentially doing by adopting things like copilots and all these code generation tools is potentially creating a bottleneck for the rest of your lifecycle, because that code still has to go through security testing and functionality testing and all the different

(41:02):
things it has to go through before release. So if you have all these junior developers, and even senior developers, now pushing so much code through the rest of the software factory, you're actually unintentionally slowing down the process, using something that you thought would help you with velocity and productivity.

(41:23):
And then, going back to your earlier point as well, we're now back to an over-reliance on a tool, on something to do part of our role, without true understanding of what it's spitting out. So if we just take whatever a copilot spits out for code and

(41:44):
push it through without understanding exactly what it's doing, then we're back to where we started, which is not really knowing the context and the business logic of our applications, which can make incident response and recovery more painful than it is today.

Nick Mellem (41:58):
You went exactly where I was going to go. Then we're back to where we didn't know what was going on within the application, and, like you said, at that point we're opening ourselves up for more risk. We're not quite sure, so we're back to playing defense versus offense.

Bill Harris (42:14):
Yeah, you know, I've read, and I hope I don't get the name of the college wrong, if I do, I apologize, but it was Stanford, or maybe it was Cambridge. One of the notable schools released a paper several months ago that concluded that code that was co-developed with AI was less secure than code that was developed by people, and

(42:37):
their conclusion was that AI just doesn't yet understand the nuances of application and development security. So, first of all, would you agree with that conclusion? And do you see, or at least hope for, changes in AI engines that will help, or at

(42:59):
least assist, developers with writing more secure code?

Francis Ofungwu (43:03):
Yeah, I haven't seen that paper, but I understand where it's coming from. What I would say is, if that was true three months ago, it's probably less true today, because of the way AI is improving over a very short period of time. Repeating myself a little bit,

(43:24):
I still think that the secure coding part, developing secure code, is a smaller part of the equation we're trying to solve here. And even if AI starts to develop more secure code, there's the permission structure that we talked about earlier, the

(43:45):
identities, the fact that no matter how secure your code is, if I can phish a developer and get their credentials, then the robustness of the code doesn't prevent that level of attack. So I still think that AI is helping from a productivity

(44:08):
standpoint, and the security of the code that AI is producing is improving over time. But let's not focus too much on that specific part of the development lifecycle. Let's look at it holistically, and look at all the other threat vectors that could impact the security of our software, including the security of the code but, more importantly, the

(44:30):
rest of the pipeline.

Joshua Schmidt (44:32):
Yeah.
So, Francis, can you tell us a little bit about your new business venture, what you've started here, and what you're working on now?

Francis Ofungwu (44:39):
Yeah. So, as I mentioned, I've been in this space for a while now, and what we've seen over the last five years is that companies need better education, both on the security side and obviously on the developer side, to build more secure, reliable software. And we see that there is a gap in that level of education, in

(45:00):
terms of people that can speak to both sides, and we have a team of people that can have those conversations and know the right tools and levers to pull to get better, more secure and more reliable software. And we are seeing an increase in the people that have to

(45:30):
comply with the new CISA SBOM attestation. How do we make it more efficient? How do we make it less costly?

(45:50):
How do we make it frictionless, so it's not an additional burden on your development team? So we're having those conversations with folks and helping them just build better, more secure software.
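For readers unfamiliar with the term, an SBOM (software bill of materials) is simply a machine-readable inventory of what a piece of software contains. A minimal CycloneDX-style document can be assembled like this — the component list is invented for illustration, and a real attestation workflow adds signing and tooling well beyond this sketch.

```python
import json

def make_sbom(app_name, app_version, components):
    """Build a minimal CycloneDX-style SBOM as a JSON-serializable dict.
    `components` is an iterable of (name, version) pairs."""
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "metadata": {
            "component": {"type": "application",
                          "name": app_name, "version": app_version}
        },
        "components": [
            {"type": "library", "name": name, "version": version}
            for name, version in components
        ],
    }

# Serialize for handoff to a consumer or an attestation step.
sbom_json = json.dumps(make_sbom("demo-app", "1.0.0",
                                 [("requests", "2.31.0")]), indent=2)
```

In practice SBOMs are generated by build tooling rather than written by hand; the value of the format is that consumers can audit a release's dependencies without access to the source.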

Joshua Schmidt (46:03):
And what's the name of your company?

Francis Ofungwu (46:05):
DevSecFlow. So that whole DevSecOps movement is definitely something that we believe in, but the part that is missing is sort of that workflow aspect: getting a true workflow from development to operations, which includes security. How do we make that a seamless process?

(46:26):
Excellent.

Joshua Schmidt (46:27):
Well, thanks so much for joining us today. You've been listening to Francis Ofungwu from DevSecFlow. We've also been joined by Nick Mellom and Bill Harris from IT Audit Labs. My name is Joshua Schmidt, your co-host and producer. You've been listening to the Audit, and thanks so much for joining us today, Francis. Thanks for your time. It's been a really interesting conversation. I hope we'll stay in touch, maybe on LinkedIn, and we'd like

(46:50):
to track what you are up to over the course of the next couple of years here. So don't be a stranger, and thanks again for joining us.

Francis Ofungwu:
Thank you for having me. Appreciate it.

Eric Brown (46:58):
You have been listening to the Audit, presented by IT Audit Labs. We are experts at assessing risk and compliance, while

(47:20):
providing administrative and technical controls. Thanks to our devoted listeners and followers, as well as our producer, Joshua J Schmidt, and our audio-video editor, Cameron Hill. You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials and subscribing to this podcast on Apple, Spotify or wherever you

(47:43):
source your security content.