
January 13, 2025 51 mins

Dive into the transformative power of data in cybersecurity in this must-watch episode with Wade Baker, where cutting-edge insights meet real-world applications.  

Hear from The Audit team as we explore how massive data sets are reshaping risk management, AI's evolving role in combating cyber threats, and the surprising insights data can unveil about security incidents. We also dive into ransomware trends, phishing techniques, the ethics of AI, and the critical role of storytelling in decision-making, with some fun nods to fantasy swords along the way.

In this episode, we discuss: 

  • Using big data to tackle cybersecurity challenges 
  • Ransomware and phishing trends 
  • The ethical debate around AI in security 
  • Unique discoveries from security data analysis 
  • Practical strategies for influencing decision-makers 

 

Catch this insightful conversation and stay ahead of the cybersecurity curve. Like, share, and subscribe for more expert discussions on the latest security trends! 

#Cybersecurity #DataAnalytics #RiskManagement 

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joshua Schmidt (00:04):
All right, you are listening to The Audit, presented by IT Audit Labs. I'm your co-host and producer, Joshua Schmidt. We have Eric Brown and Nick Mellem, and today we're joined by our guest, Wade Baker from the Cyentia Institute. Wade is an alumnus of Virginia Tech, as you can see by his sweater there, and we brought him on today to talk about his company. Hopefully we can go into large data sets, maybe touch on how

(00:27):
AI affects that. Get to know a little bit about Wade. So, without further ado, Wade, could you give us a little background on yourself and how you came to be working with Cyentia?

Wade Baker (00:37):
Yeah, sure. Thanks for having me, by the way; I always enjoy doing these kinds of things. So yeah, a little bit about me: I founded the Cyentia Institute about seven years ago. I am an alumnus of Virginia Tech, but I'm also a professor at

(00:58):
Virginia Tech currently. So I teach some cybersecurity management classes in their business school. So, one leg in academia and the other leg in the security industry, and it kind of captures what I like to do. I like learning things, I like researching, trying to figure out answers to hard questions, and security has a ton of hard

(01:21):
questions that really haven't been answered yet, and so that keeps me interested. And I like answering questions with data rather than just taking a guess from, you know, a finger-in-the-wind kind of thing. So I really have spent a lot of my career chasing different security data sets, seeing what I can learn from them and

(01:43):
sharing what we learn with the community, and that's what we get to do at the Cyentia Institute. We get access to a ton of interesting data, and our job is mining it, drawing out insights, and sharing that in various forms of content. So it's a good gig. It's sort of one of those where I said: what do I want to do with my life? I want to do this. I wonder if I can create a business out of that.

(02:03):
Yay, I can. And here we are, you know, seven years later. So I feel good.

Joshua Schmidt (02:08):
Excellent. Well, thanks for joining us today. One thing you'll get to know about Eric and the IT Audit Labs: we have a game night once a month on Wednesday, so big-time gamers. He's really gotten me into gaming, or back into gaming. So I have an icebreaker question for us today, to kind of get the conversation flowing here. And that is: what is your go-to D&D fantasy character?

(02:28):
Uh, I tend to be a barbarian, whether it's Diablo or, you know, I have a HeroQuest here as a board game. Uh, we have yet to play it at game night, but I kind of just go for the bash: the high strength, the, uh, high fortitude, and I just bash my way through the game. So that's kind of my go-to. How about you, Eric?

Eric Brown (02:50):
You know, for me, I'm more on the, uh, probably the magic side, so sorcerer, wizard type. Um, I kind of like the intricacies and nuances of the spell casting, and debuffs or buffs, things like that, and just a little bit more of the battlefield-control type of characters.

(03:11):
But you can do that with healer types too. But Wade, a question for you. We've got to hear from you and Nick too on your characters, but I see you've got some swords there behind you. It looks like a bastard sword and some long swords or something. What's the story behind those?

Wade Baker (03:31):
uh, the story is, I just like fantasy literature and
I've had these since I was inhigh school and, uh, I finally
get a chance in my adult life todisplay them and I'm thrilled
about it.

Joshua Schmidt (03:44):
So wait, I've got to interject and say I'm just finishing up The Once and Future King by T.H. White. It's been one of my favorite books I've ever read. Uh, are you familiar with that?

Wade Baker (03:54):
That one I am not. But, uh, I'm jotting that down.

Joshua Schmidt (03:59):
It's the quintessential King Arthur story from the thirties, and it's a joy to read. So, okay, while he's jotting that down, maybe we can get an answer from Nick. What's your go-to?

Nick Mellem (04:11):
Well, I had to rack my brain to think of something, and since we've been talking, I changed my answer. Uh, pre-episode, I was going to go with Lord of the Rings, and I'm not much of a fantasy guy. I appreciate it; it's not quite my thing. I like Lord of the Rings. The second one, I think it was; I can't remember what it's called. Anyways, there was a scene.

Wade Baker (04:32):
Can we kick it off?

Eric Brown (04:34):
I can exit.

Nick Mellem (04:39):
I might redeem myself, because there was a part when the whole battle is going on. Probably, what is it, three quarters of the movie is the battle? Anyways... Legolas. Two Towers, thank you. Legolas, right, is his name. He's got a bow. Yeah, he looked down, it was him and the dwarf, and they were counting out the kills they were getting, like 17 or whatever, and they were going on. I'm going with Legolas. He's nimble; you can't beat a bow, right? So that's how I

(05:02):
gotta go. What?

Wade Baker (05:03):
I, uh, I like it. It's a good choice. I usually end up going with some kind of ranger character. Um, you know, I always say, no, I'm going to do something different this time. But I always land there, because I do like the bow, but I also like, you know, some combat.

(05:24):
You know, they usually have a few spells in their pocket as well, or, you know, something like that. I thought we were going for actual characters, but one of the early fantasy books I read was the Crystal Shard series, and Drizzt, depending on who you ask, is a

(05:45):
Dark Elf Ranger character in the D&D world. Love that character. So that would be my answer.

Eric Brown (05:56):
Have you read The Lies of Locke Lamora yet? Have you read any of the books?

Wade Baker (06:04):
No, no.

Joshua Schmidt (06:04):
I'm 0 for 2 on questions.
We're learning things today.
I didn't expect you to actuallyask.

Wade Baker (06:09):
I just want to say, you know, we're not all about the data today. I guess the book I have now that I'm cracking open is Brandon Sanderson's latest in the Stormlight Archive. I don't know if you do any Brandon Sanderson, but that just came out in the last, I don't know, week or so, so I'm digging

(06:30):
into that.

Joshua Schmidt (06:31):
Cool. Yeah, there's a large library attached to IT Audit Labs. I still have to go dig through that. Eric's kind of a prolific reader, so you're in a good spot.

Eric Brown (06:45):
I had another question. Go ahead, Nick.

Nick Mellem (06:47):
What's that? I was gonna say, I thought Josh was leaning into something when he said he let the cat... oh boy.

Eric Brown (06:54):
So Wade, just real quick: Nick's got a lot of hairless cats, and we always kind of joke about them. But, uh, it looks like you've got an eclectic collection there behind you, certainly. Um, the original Star Wars, which is awesome to see. Looks like... is that, uh, one of the Transformers, that Optimus Prime back there?

Wade Baker (07:15):
so that is optimus prime but.
But it's a lego optimus and itactually transforms which to me
same thing with the Voltron overthere.
It's a Lego Voltron.

Eric Brown (07:25):
Oh, that's cool.

Wade Baker (07:27):
Yeah, it makes it even harder to make.

Joshua Schmidt (07:29):
So we can all align on our love for fantasy.

Nick Mellem (07:33):
We'll give you a pass.
We'll give you a pass.
I'll see you guys later.

Joshua Schmidt (07:36):
You'll get there. Chat, books and stuff as you approach middle age. You'll get there, Nick. So yeah, one of the things I wanted to start out with, shifting gears here, Wade, is: you must have some security background if you're dealing with data analysis around security. So what is your security experience, and how does that inform the data

(07:56):
analysis?

Wade Baker (07:58):
I was actually a security person before I ever
got into data analysis, but theygo very much hand in hand.
So I was doing a lot ofsecurity and risk assessments.
This would have been 20 yearsago or more and that went okay
for a time.
But there got to a point wherepeople started asking questions

(08:21):
like how often does this attackhappen, and is this one more
risky than this thing over herethat I'm worried about?
And I didn't really have goodanswers and I didn't like the
fact that I didn't have goodanswers.
My response to that was oh,maybe the answer is out there
somewhere and I'll startcollecting some data.
And then I became kind of a dataanalyst.

(08:42):
I still don't consider myself ahardcore data scientist.
I'm a little bit more of a datastoryteller, but I love the
whole process and those two feedoff each other, I feel like.
Because I have securityexperience, I know what to look
for in the data and the types ofdata that I need to answer the

(09:02):
questions that I want to answer,and so that keeps me going,
whereas if I didn't, I don'tthink I would be nearly as good
at what I do if I was doing itin a completely separate field.
I have no interest in being adata analyst in something that's
not security, at least not atthis point in my career.

Eric Brown (09:21):
How did you start digging in on the data side?

Wade Baker (09:25):
It was probably... so, I started a PhD in 2003, and I realized that I needed some data to do my dissertation. I wanted to do it on some kind of security topic, and I landed on risk management and how do we take a more

(09:45):
quantitative approach to risk management and decision-making. And that's when I started collecting any shred of data that I could collect. So it's been, yeah, a little over 20 years ago for that.

Eric Brown (09:59):
So 2003,.
That was really when a lot ofinformation was probably mostly
in libraries and starting tocome online, right, and you know
, now it's like if you're goingto the library the data is too
old.
So how have you seen thatchange over time for you in your
practice about almost real-timeaccess to data?

Wade Baker (10:23):
Yeah, I definitely, over that 20-year period, have
seen a shift from when I firststarted there was not sufficient
data available, it just didn'texist or people weren't making
it available or publishing it tonow where I think there's

(10:43):
plenty of data.
It's more a matter of how do wemake sense of all of this and
use it for a specific purpose?
And yeah, especially in thesecurity industry, I mean it's
kind of a cagey space.
People don't just readilypublish data because we're
supposed to protect data.
We don't just put it out therefor the world to see data.

(11:04):
Because we're supposed toprotect data, we don't just put
it out there for the world tosee, and especially data on risk
factors.
Now you're talking about whenincidents occur and the
frequency of those incidents andwhat controls were in place or
not in place and the lossesassociated with them, which
people never voluntarily reportthat kind of information.

(11:24):
And you started to see I think2005 was the California data
breach ruling that required ifthere was a data breach on a
California resident that had tobe publicly reported.
And then the data started kindof spilling out and getting
larger and larger as more andmore states did that and making
those kinds of data available.

(11:45):
So lots of things contributingto the current state of data
availability.

Eric Brown (11:52):
On security factors I noticed I was working with a
client recently and they wereevaluating a product that would
help with AP automation.
So essentially, invoices comein, the product would look at
those invoices and then assignsome automation and workflows to

(12:14):
the right approver and youcould essentially train the
model on.
You know, it's like you getthese invoices, you match them
to a PO, or if there's not a PO,what's the company and then who
it should go to in theorganization for approval.
And the organization wastouting their ability not the
client, but the product theywere looking at was touting the

(12:36):
ability to recognize fraudulentinvoices.
And I've seen a couple theylook pretty good where they just
they send invoices into theaccounts payable department and
they don't really know thatthey're not supposed to pay this
Oracle invoice, or if they are,you know, just it looks funny.

(12:58):
But they were saying that theirproduct was able to detect
fraudulent invoices and I saidwell, you know, does the product
then use data that it'sgathering from the hundreds of
or thousands of otherimplementations that it's in?
And if it sees, you know, a badOracle invoice in this company,
does it then allow some sort ofnotification to happen in other

(13:22):
companies, or at least allowthat option so you could kind of
take advantage of this.
And they said no, it's reallyfocused on the instance of the
particular install for thatparticular client and it doesn't
share knowledge.
I was like, wow, that's kind ofa gap right of like.
You would think that if it isdetecting it it could share that
.
And we see that with some ofthe more modern phishing tools.

(13:46):
If they're starting to see somebad behavior in one client,
they let all of their clientsknow.
And I think, where we're at nowwith the speed of information,
if the product isn't smartacross its install base, it's
really a disservice to thecustomers that use it.

Wade Baker (14:04):
Yeah, agreed, and a lot of those things can be
shared without putting thetarget organization at risk.
If you get a phishing email orone of those things and it has a
link or URL out, well, the nextemail that is sent that

(14:24):
contains that link, regardlessof the subject line, regardless
of who it was sent from, couldbe classified as suspicious.
Don't go to that link.
So there's a ton of thatinformation that can be gleaned
and shared to bolster defensesand it's usually a very good
idea.
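The indicator-sharing Wade describes, flagging any later email that carries a known-bad link regardless of sender or subject, can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the blocklist entries and function name are hypothetical.

```python
import re

# Toy shared blocklist of URLs previously seen in confirmed phishing emails.
# In practice this would come from a shared threat-intelligence feed.
SHARED_BAD_URLS = {"http://login-verify.example.net/reset"}

URL_PATTERN = re.compile(r"https?://\S+")

def is_suspicious(email_body: str) -> bool:
    """Flag an email if any link in it matches a known-bad URL,
    regardless of subject line or sender."""
    return any(url in SHARED_BAD_URLS for url in URL_PATTERN.findall(email_body))

print(is_suspicious("Please confirm at http://login-verify.example.net/reset today"))
print(is_suspicious("Agenda attached, see https://example.com/agenda"))
```

The point of the sketch is that the match keys only on the URL itself, which is exactly why one organization's sighting can protect every other subscriber to the feed.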

Nick Mellem (14:41):
It's never ending, Wade.
I was on your website before westarted the show and I was kind
of more curious on the IRIS.

Wade Baker (14:50):
Oh, IRIS yeah.

Nick Mellem (14:51):
Yeah, are you able to speak on what that is or how
you guys use it?
I saw some eBooks on there.
It looked like maybe, if you goon, yeah.

Wade Baker (15:01):
Yeah.
So the information risk insightstudy I is a series of research
that we've been doing for thelast four years I think we
started in 2000.
And it's an effort todemonstrate that you really can
collect data on cyber risk andanalyze it and share that in a

(15:23):
meaningful way.
There's long been debate of canyou really quantify risk
factors in a way that isreliable to make risk-based
decisions in security.
A lot of people say, oh no,everything changes too quickly
and all of this and there's noway to measure it with any

(15:45):
degree of confidence.
So we might as well just stickwith high, medium and low red,
yellow, green kinds of things.
And so, iris, for us is a veryintentional attempt to say no, I
mean, look, we can't actuallymeasure these things.
Look, we're doing it here.
Let's show you how we can comeup with probability estimates

(16:08):
that are well-founded inhistorical data and draw a
distribution around financiallosses of what a typical loss or
an extreme loss is and all ofthese kinds of things and kind
of push the industry forward.
And you know we do that in kindof a rolling series of reports.
The latest one we did focused onransomware, just because that's

(16:30):
such a big topic for many andlots of organizations and cyber
insurers are losing a lot ofmoney to ransomware, so we
wanted to turn that analysisonto it.
But yeah, these are freereports.
Anybody can get them.
There's no registration.
Most of them are sponsored byreports.
Anybody can get them.
There's no registration.
Most of them are sponsored byorganizations and the last few
have been sponsored by the USCybersecurity and Infrastructure

(16:52):
Security Agency, or CISA, and Isaw that on your report where
you were talking about the topattack techniques that were used
by ransomware.

Eric Brown (17:06):
I was surprised, right, because I've always
firsthand experience aroundphishing and seeing phishing and
just reading about phishingbeing that that top threat
vector.
But then I I saw that thereport called out that the
public-facing applications werealso very high and that made

(17:28):
sense with the waterhole attacksor attacks like SochGhoulish,
where sites are infected withthis malicious JavaScript and
then lots of people areconnecting to them and getting
your browsers out of date,pop-ups or what have you.
So I thought that wasinteresting and a great way to

(17:50):
circle back and just make thattop of mind.
So I don't know if you had anythoughts on that piece, but yeah
, certainly interesting.

Wade Baker (17:58):
Yeah, it is, and I think that's why it's important
to kind of dissect these thingsand analyze them, because
sometimes there'scounterintuitive findings
Sometimes there are.
I think the industry has done agood job with phishing
awareness.
I'm not saying it's perfect,because people still click on

(18:18):
phishing messages a lot, butmost people are familiar that
phishing exists and theyshouldn't just click on
everything sent their way andthe attackers know that.
So they're always looking fornew and cheaper, faster ways to
exploit targets and it's good tohave an understanding of how

(18:39):
they're going about that.
And these things change, I meanthey fluctuate over time.
I can remember a time when theremote access, remote desktop or
other types of things like thatwere the number one vector for
most cyber attacks that I saw,and these things just come and

(18:59):
go and it's good for us asdefenders to track them because
that helps focus what we'redoing.

Joshua Schmidt (19:09):
You kind of mentioned, Eric, that you had
kind of picked up on somepatterns there.
When reviewing what Wade wastalking about has looking at
large data sets or kind ofseeing trends, how has that
informed the way you approachsecurity or personal
informational security?

Eric Brown (19:23):
From my perspective, one of the things that we're
looking at now and starting tospend some cycles researching
are the different languagemodels because there's lots
coming out and the differentchat interfaces with them and
then using different prompts toget information out of the

(19:48):
language models.
So, either language models thatare private or language models
that are public, and one thatwe've been doing a little bit of
work and research with iscalled Anything, called Anything

(20:11):
LLM, and it's a self-containedlanguage model where it's
essentially not using your datato train other models, which is
something that a lot ofcustomers are sensitive to.
So things like Google'sNotebook LM, which is a pretty
interesting one.
Right, it keeps your dataprivate, it uses the Gemini
language model.
It even creates like atwo-person podcast off of the
data.
But you could say upload apolicy document to Notebook LM

(20:37):
and then you could have a textchat with that document to say
say, you uploaded your securitypolicies.
You could then say well, youknow how long does my
organization require my passwordto be?
And rather than you reading allof the documents, it would just
quickly find out and cite howlong the password had to be and
you know other information aboutit.

(20:58):
So we've been going down thatrabbit hole of how do we use
different prompts to elicitinformation out of documents,
and there's a tool called Fabricwhich is an open source tool,
and there's lots of differentprompts in Fabric that you can
then press against a largelanguage model.

(21:19):
And one of the things that'sinteresting is if you run those
prompts against differentlanguage models, you're going to
get slightly different answers,you're going to get some
hallucinations, and justyesterday I saw we were looking
at what were we looking at?

(21:40):
I think Grok was one, geminiwas another and maybe GPT-01.
The answers that were comingback were different and I can't
remember which one it was, butone of them threw in a
completely hallucinated citationwhere we went back and we're

(22:03):
trying to validate the findings,and one of the articles that it
cited didn't exist at all and Iwas like, wow, this is.
You know it's really cool.
I mean kind of cool in one handbut not cool in the other.
You know we've all seen thearticle or read the article
where the lawyer citedhallucinated findings in a

(22:29):
courtroom setting.
It's easy to do because it lookslegitimate and balancing the
information that we're gettingback from these language models
versus reality is sometimes hardto differentiate.
So wait, I'm really curious.
I've kind of gone on a tangenthere about this, but there's so

(22:53):
much with big data that youcould, if you just took the
information, republished theinformation, and then something
else scoops up the republishedinformation, you're kind of
quickly creating this kind offalse sense of data, right,
you're almost, you know,creating these, you know,

(23:16):
hallucinations, or generatingthis data that isn't real, and
then if that's publishedmultiple times, then it could
essentially become fake news, ifyou will.
So I don't know, to me it'sjust really interesting.
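The document-chat workflow Eric describes, uploading policies and then asking questions against them, boils down to retrieval plus citation. The toy below stands in for that idea with plain keyword overlap; real tools like NotebookLM use embedding-based retrieval and an LLM, and the policy text, section names, and functions here are all hypothetical.

```python
# Toy sketch of "chat with your policy documents": retrieve the most
# relevant passage for a question, then answer and cite from it.
import re

POLICY_SECTIONS = {
    "4.1 Passwords": "Passwords must be at least 14 characters long.",
    "4.2 MFA": "Multi-factor authentication is required for remote access.",
}

def _words(text: str) -> set[str]:
    """Lowercase word set with punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str) -> tuple[str, str]:
    """Return the (section, text) pair sharing the most words with the question."""
    q = _words(question)
    return max(POLICY_SECTIONS.items(), key=lambda kv: len(q & _words(kv[1])))

section, text = retrieve("How long does my password have to be?")
print(f"{text} (cited from section {section})")
```

Because the answer is quoted from a retrieved passage and cited back to its section, a wrong retrieval is at least visible, which is the property that makes this pattern safer than asking a model to answer from memory.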

Wade Baker (23:30):
Yeah, sometimes I feel like that's no different
than the rest of the securityindustry media.
Yeah, sometimes I feel likethat's no different than the
rest of the security industrymedia.
But I hear what you're sayingand you know all of our real
analysis.
I mean, we haven't trusted thatto any kind of AI.

(23:50):
Here you can take a data setand throw it in.
We've done basic super hey,just summarize this just testing
around and playing around, butnot in any real sense for a
major research project.
We're doing, where we have donea lot of experimentation sounds
similar to some of the thingsthat you're doing is in some
background research and datacollection.

(24:11):
So we try to gather a lot ofinformation on security
incidents that occur and we'vebeen working with prompts just
to hey.
Has this been reported to theSEC?
Are there any known financiallosses for this event?
Give me a summary of thisattack chain and the common
MITRE attack techniques thatwere used in this and we've been

(24:35):
noticing some hallucinationsand things like that.
But by and large it's a timesaver for gathering those kinds
of details and we've beenputting some effort into that.

Nick Mellem (24:49):
That looks very promising, I think.
Going back to Josh's question,the low-hanging fruit obviously
for me is risk assessments, butbehavior analytics has been big
for us to try to predict youknow what might be coming in
using historical data fromdifferent tool sets that we do
use, you know, to make surewe're at our.
Some of our clients are moresecure, so we've got a small

(25:11):
team that's that's working onthat.
But the big one for me also ispolicy improvement, because you
know at some clients we comeacross that policies might be
four, five, 10 years, you know,unrevised.
So we're using, we can paint apicture through data that you
know why we think this and thenwe can shuffle that efficiently

(25:31):
into policy.
Another thing I wanted to bringup is funding.
I think has been a big one andI think that's what a lot of
organizations struggle with isif you're at a small, medium,
large, whatever it isorganization, a lot of times
they might have a tight pursestring right For cybersecurity
tools and I think with data andwhat we're doing to protect our

(25:53):
organization whether it'sphishing attacks, like we've
already discussed, socialengineering, whatever it is we
can prove with this databackground why we these tools
are so beneficial to oursecurity posture and our cyber
hygiene, we can show C-levelsuites that aren't maybe in our
space or our swim lanes, whythis is so important.
So you know that's.

(26:13):
So.
That's a quick answer for me.
But hearing you talk, wait,obviously you're super, you love
this space, obviously.
So I'm really curious on.
You know, I'm not a day's datascientist either, I don't.
You know, I do it minorly for,obviously, our careers.
But, uh, what does a day-to-daylook like?
Like you guys go into theoffice or you go and wherever

(26:34):
you guys are working that day Ifyou are working at your
organization you're looking atthese data sets.
What does it look like?
What does your day look like?

Wade Baker (26:46):
So, yeah, we all work from home, so the office is
home and we like that, butnormally we are sort of paired
on projects.
You know, for instance, today Iam working with with one of my
colleagues and it's all aboutsome software security data that

(27:09):
that we have and that we'reanalyzing and we're we're trying
to work it to a report.
So you know we're well, youknow what's the prevalence of
security flaws acrossapplications scanned with static
, dynamic and open sourcescanning, and how old are these
flaws and how quickly does ittake to remediate them.
And so it's just the daybecomes sort of chasing these

(27:34):
things.
We want to answer that we thinkmight yield interesting insights
, and then things sort ofdevolve into okay, well, that
seems interesting, but how do wevisualize that?
And should we do it as a barchart or something more
interesting?
What options do we have?
Does that adequately convey themessage that we want to give?
What else could we do thosekinds of things?

(27:56):
So it's a lot of fun.
I mean, I enjoy that.
I realize that maybe everybodywouldn't enjoy that, but it's
good.
And so we spend most of ourdays trying to flush out those
findings.
Sometimes it's a slog, you know, we have messy data sets,
missing data.

(28:16):
We don't have the data that wereally think answers the
interesting questions, and youknow so.
So there's there's definitelychallenges, but it's it's, it's,
it's good.

Nick Mellem (28:29):
I it's very much research which I like, but it's
applied research which which Ilike, and it's very much
research which I like, but it'sapplied research which I like.
I think you used the term datastoryteller earlier, yeah, so
kind of what you just said there.
It fits it perfectly.
So I kind of got my answer fromthat one little statement, but
I appreciated the longer one.
I think it's really interestingwhat you guys are doing, yeah
thanks.

Joshua Schmidt (28:50):
It connects in with our icebreaker of you know,
fantasy novels storytelling.
You mentioned monetaryassessing monetary impact from
the cybersecurity risk.
How do you use the storytellingabilities or how do you present
those findings to organizationsto justify spending or

(29:11):
mitigation or remediationefforts?

Wade Baker (29:16):
What's your communication style like there?
It's kind of multiple.
So when we're publishing areport, we'll have a section
that just analyzes this fromevery direction.
Some just statisticaltechniques.
What's a median financial lossfrom a security incident?
What's a 95th percentile moreextreme style loss?

(29:39):
What does the distribution looklike?
What can we do with this?
And then pivoting that to allright.
Well, if you have a certainnumber of records that are
compromised in a data breach,how do you get to a loss?
Well, you know, here's a tablethat you can kind of look up and
get some upper and lower bounds.
So we try to throw a bunch ofdifferent visualizations and

(30:02):
statistics and descriptions inwhat we produce.
We do a lot of presentations onthese things too, because that
resonates with people.
We try to do a lot ofinfographics.
If you don't want to read afull report on something like
that, here's a bite-sized chunkwith this one piece of

(30:24):
information that we want you totake away.
And that's a challenge for us,by the way, because I think the
tolerance for long form, heavyanalytical content it only goes
down over time I don't feel likethat goes up.

(30:44):
So, given the nature of what wedo, which is heavily analytical
.
We always have a challenge oftrying to communicate it in ways
that resonate with people andmake it easy to consume, and
those kinds of things.
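The "typical loss" and "extreme loss" figures Wade describes are just the median and 95th percentile of an incident-loss distribution, which the standard library can compute directly. The dollar amounts below are invented sample data, not figures from any IRIS report.

```python
# Sketch: summarizing incident losses the way a risk report might,
# with a median ("typical") and 95th-percentile ("extreme") loss.
# The loss figures are invented sample data, not real incident numbers.
from statistics import median, quantiles

losses = [12_000, 45_000, 80_000, 150_000, 300_000, 2_500_000]

typical = median(losses)
# quantiles with n=20 gives 5% steps; index 18 is the 95th percentile.
extreme = quantiles(losses, n=20, method="inclusive")[18]

print(f"typical (median) loss:   ${typical:,.0f}")
print(f"extreme (95th pct) loss: ${extreme:,.0f}")
```

Note how far apart the two numbers sit even in this tiny sample: loss data is heavy-tailed, which is exactly why a single "average loss" figure tells decision-makers so little and a distribution tells them much more.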

Nick Mellem (30:58):
Josh, with your budget questions, I think we
were for a long time fightingthe you need to have a security
incident happen to get fundingfor something right.
So the higher ups, or whoeveryou want to classify, they need
to see an issue happen, theyneed to be, you know, you know,
kicked while they're down, let'ssay, for them to maybe buy that

(31:21):
expensive product you had beentelling them for years to get.
And I think one thing that we'vebridged that gap through
exactly what we're talking about, through the data, painting
that picture.
But I think, with all thisresearch especially you guys are
doing, we're able to take that,put it into exactly what you
said, into a short form, insteadof doing long drawn out, maybe

(31:43):
meetings, or learn, teachingthem how to use a product for
them to see the worth.
We're able to paint thatpicture, as you would say,
storytelling, in maybe amanageable way that they would
be on board much quicker,instead of of going down the
painful route where you'regetting that breach on a Friday
evening where nobody wants to beworking and you know we're up

(32:03):
all weekend trying to, you know,rectify the situation, you know
.
So hopefully at least that's myhope is we've gone, are the
days of begging for somethingand not getting it, needing an
incident to happen, then wemight get it here with data and
what we can present from otherapplications.
Why it's a need you know.
Hopefully a budget opens up,and a big part of that too is a

(32:25):
risk register eric, I know yougot something to say there.

Eric Brown (32:32):
As a security consulting firm, we often get
brought in as a third partyexpert to come into an
organization and help them withsecurity.
99% of the time, we're sayingthe same thing that the
incumbent people are saying, butjust because we're a third
party saying the same thing,it's like, oh yeah, whatever

(32:55):
they say is that we should dothat, but the internal
organization has been sayingthat for years.
So, wade, I would imagine it'sthe same for you, right?
You're able to articulate andtell that story through data and
help those teams articulate themessage they're trying to
convey.

Wade Baker (33:14):
Yeah, I agree.
In fact, just today I postedsomething on LinkedIn about
human risk and the stat wasbasically there's a really low
number of users that areresponsible for the vast
majority of phishing clicks ormalware downloads and stuff like

(33:36):
that.
And one of the comments was like, yeah, we kind of already
knew that, I don't really see this as all that novel.
And that's true, but that's not really the point.
Sometimes it's good to have numbers around, or actual
percentages and ratios, and benchmark that to know, well, are

(33:57):
you along with everybody else, or are you outside the norm in
whatever percentage of that you're seeing?
So I think that can be helpful, even if we don't learn
something that's just earth-shattering and brand new that we
never even suspected.
Validating the things that we think are true is a good thing

(34:20):
and, honestly, something I don't think we do enough in the
security industry.
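The kind of benchmarking Wade describes here is easy to sketch. The snippet below is a hypothetical illustration, with made-up click counts and a made-up 10% cutoff, not figures from any real report, of measuring how concentrated phishing clicks are among users:

```python
def click_concentration(click_counts, top_fraction=0.1):
    """Fraction of all phishing clicks attributable to the most
    click-prone `top_fraction` of users."""
    counts = sorted(click_counts, reverse=True)
    n_top = max(1, int(len(counts) * top_fraction))
    total = sum(counts)
    return sum(counts[:n_top]) / total if total else 0.0

# Hypothetical data: 50 users, a handful doing most of the clicking
counts = [10, 6, 4] + [1] * 10 + [0] * 37
share = click_concentration(counts, top_fraction=0.1)
print(f"Top 10% of users account for {share:.0%} of all clicks")  # 73%
```

A number like that only becomes meaningful next to a benchmark, which is Wade's point: knowing whether 73% is typical or an outlier requires comparing against ratios measured across many organizations.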

Joshua Schmidt (34:29):
Has there been something that your analysis has
found that's kind of an outlier or surprising, that you might
not have expected going into data analysis?
Something that stuck out as like, oh wow, that's
unique, this is something weird?

Wade Baker (34:42):
Yeah, I mean, there are always little things
like that, you know, in what we do, where,
huh, I really didn't know that.
And there are things that I think run very counterintuitive.
I'll give an example from a long time ago.

(35:04):
One of the early public projects I was involved in: I started
the Verizon Data Breach Investigations Report, and it came
from, hey, I'm interested, I want to go collect data on
actual security incidents, see what I learn, and publish
this.
And that's where that came from.
And I remember when the first version of that published, and we

(35:26):
had a statistic in there that 80% of incidents were from
outsiders, not insiders.
And man, there were people that were just really upset about
that and calling it bunk.
Everybody knows that insiders are 80% of security risk.
This is trash research.

(35:48):
Get this out of my face.
A very strong reaction, because it was a deeply held
belief at that time that it was all insiders.
And, you know, it prompted some interesting conversations.
But now you think about that, and everybody's like, yeah, duh.
I mean, you know, everybody's worried about state-affiliated

(36:09):
attacks and organized criminal groups, and we're still worried
about insiders, but I think at that time that was the main
thing. And since, we've realized that, yeah, well, there's a lot
more people outside your organization that want to attack
you than inside, just by volume.
So I like those kinds of discussions, when some piece of

(36:34):
research or analysis sort of challenges the status quo or the
thinking of the day, because I think it's healthy. Even if the
analysis is wrong or something, we can talk about it.

Nick Mellem (36:46):
Wade, you touched on the Verizon breach just a
second ago that you worked on.
Have you been involved in any of the other ones?
I feel like it's happening a lot now.
T-Mobile has been out a bunch, or they've had their incidents,
but most recently it was in the news about how SMS messaging and
phone calls are no longer considered secure, right? Yeah,

(37:07):
have you had any involvement or done any research on that?

Wade Baker (37:11):
I have not.
I worked for Verizon for a long time, but I've been gone for
almost 10 years now, so I haven't kept up with that.
But what you bring up, I think it's better than just a password
alone by many orders of magnitude.
But as attackers evolve to our evolving controls, then we've got

(37:47):
to continue to evolve, and that's important.
That's part of what we do. Yeah, exactly, that's why we do what
we do, yeah.

Eric Brown (37:56):
And I think, just to clarify, you worked on the
Verizon breach report.
Like the report that comes out every year, it's like 100 pages
now.
I was just looking at it yesterday, I think, right?

Wade Baker (38:09):
Yes, yes, that is correct.
I was the initial person behind that, and then I led that team
for seven or eight years and then left.
I think it's been going on longer without me than it was
with me now, which is cool to see.
Yeah.

Eric Brown (38:30):
I always look forward to reading it.
That's good, and it's often cited.

Wade Baker (38:40):
It is, yeah, and that was fun to be a part of,
because back then there wasn't a whole lot of data on security
incidents, and it really got a lot of attention and a
following because of that. And I don't think it would today, you
know, because there's so much more information available on
data breaches, and people are reporting them and reporting
analysis, whereas back then that was kind of novel.

Joshua Schmidt (39:05):
Just picturing Eric on vacation on a beach, reading the
100-page Verizon security risk report.

Nick Mellem (39:12):
I was laughing to myself about that.

Eric Brown (39:17):
Yeah, I'd be in the room because I don't want to get
sunburned or anything.

Nick Mellem (39:22):
Eric doesn't like the camp.

Joshua Schmidt (39:24):
I've got to get the SPF 50 going. Absolutely. Big hat,
gloves, the whole works.

Wade Baker (39:31):
Talk about risk, we did try to make those enjoyable
to read by putting a bunch of little jokes and things in there.
I wouldn't call them the height of literature or anything like
that, but hopefully more fun than your average security
report.

Joshua Schmidt (39:51):
Have you considered adding cat memes to
your reports?

Wade Baker (39:53):
Oh boy, I'm pretty sure there are some in there.
We're off the rails.

Nick Mellem (40:00):
Well, okay, I'm going to bring us back with a
question, because I've been curious about this as we've been
talking about it.
You know, just because you're in it every day, Wade.
You know where we're at right now; if we fast forward
five or 10 years, where do you see your industry going?
Oh man, I know it's a tough one.

Wade Baker (40:20):
I, um, when you say your industry, do you mean
the cybersecurity industry or the data industry? For a long time, I think, it was very

(40:41):
basic.
If you just collected some data and analyzed it, you had
answers that were new, early days, a long time ago.
And then it gets harder and harder, and I view that as a
good sign that we're maturing, right?
We need to do better analysis in order to answer that next

(41:02):
question, and so we have to mature the art of analysis.
We have to get better and more refined data, or start combining
data sets together to answer these questions, and so I think
that's what I see.
I see continued improvement and abilities to do that.
I've loved it that data science over the last decade or so has

(41:25):
become its own thing, and now security data scientist is like
an actual job title.
So you've got people doing what used to be just a super niche
thing.
People generally realize that, yeah, it makes sense to combine
data skills and cybersecurity acumen together, because you can
do some great things, and I think that unlocks a lot of

(41:48):
doors and makes the industry better and more mature, makes
better products, makes for better decisions.
I mean, decision-making is such a big, important part of
managing security, and the more we transform that
decision-making from you're just winging it and it's all about

(42:11):
how good you are, to where you can rely on good data and make
data-driven decisions, I think the whole industry is better
served.

Nick Mellem (42:21):
So instead of follow the money, we follow the
data.

Wade Baker (42:23):
We'll still follow the money.

Nick Mellem (42:27):
You know who am.

Wade Baker (42:28):
I kidding.

Nick Mellem (42:29):
Yeah, I think we probably share that same hope too,
because the better we get at reading and sifting through
data, the better these applications we rely on every
day get, right? The better we get, and these applications are,
I don't want to say taking over, doing the work that we would do
individually, right.

(42:49):
It might replace a person on your team, but I think it's
piggybacking off that person, or it's watching out
for things that go bump in the night. So those things just
continue to get better and evolve, and, you know, it's making
the security professionals even better, you know, reading
all these reports and whatever else.

Wade Baker (43:05):
Yeah, and you guys have mentioned, you know,
investment decisions a few times, and one of the things that I
really hope happens is that there's continued pressure to
vet out the investments that we make in security products and
services, because I think for a long time the industry has kind

(43:29):
of gotten through without having to do that, but now security
budgets are swelling so large, I think people start asking
questions and executives want more, and that forces better
data and analysis to really figure out, all right, this
actually works, and here's my evidence for that. Because we,
by and large, haven't had to do that very,

(43:51):
very much, and there's questions we still can't answer, like, if
I implement this thing, what reduction in incident likelihood
should I expect to see?
Or what reduction in losses over the next five years should
we expect to have?
And we're still not very good at answering those questions.

(44:12):
But I think there's more and more pressure and
expectation that we need to be, which goes back to the data and
analysis, and so I think the analysis can actually help the
industry with that.
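The questions Wade lists are, at bottom, expected-loss estimates. Here is a minimal sketch of how such a comparison is often framed, as annualized loss expectancy (ALE) before and after a control; every figure below is hypothetical, purely for illustration:

```python
def annualized_loss_expectancy(incidents_per_year, loss_per_incident):
    """Classic ALE: expected incident frequency times expected loss per incident."""
    return incidents_per_year * loss_per_incident

# Hypothetical estimates: baseline vs. after implementing a control
baseline = annualized_loss_expectancy(incidents_per_year=2.0, loss_per_incident=250_000)
with_control = annualized_loss_expectancy(incidents_per_year=0.8, loss_per_incident=250_000)

reduction = baseline - with_control
print(f"Estimated annual loss reduction: ${reduction:,.0f}")  # $300,000
```

The hard part, as Wade says, is not the arithmetic but defending the frequency and loss inputs with real data rather than guesses.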

Nick Mellem (44:28):
I think that's one thing we do focus on.
You were talking about loosening the purse strings, I
think, in vetting out the applications we're bringing in
because of that budget.
Change management boards, I think, are a big deal for that.
We assist on one that I can think of in particular, and, you
know, we're working on vetting out different applications.

(44:50):
You know, goods and bads, why we might want to do that.
The business might come with an idea of what they want to do.
We can help direct them in a meaningful way that can tell a
story about why the budget is the way it is or isn't.
So the change management piece, I think, is big for that, at
least for me.

Eric Brown (45:08):
Yeah, and Wade, just to go back to influencing
the decision-makers, you know, kind of day-to-day in security, or,
you know, often in security or other aspects of the technology.
Like, you know, we're all kind of nerds, so to speak, and, you
know, just steeped in this stuff, reading it.

(45:28):
You know, for pleasure, right? We're probably
going to read more scientific news or security news
than we are about, like, you know, who Miley Cyrus
is dating or whatever, right? I think, I don't know, that's,
I mean, that's hard-hitting news.
Josh might be a Swiftie, but you know.
I actually do know who Miley Cyrus is dating. Just so you

(45:51):
know, I'm going to know more about a Falcon 9 engine than I
am about news like that.
But anyway, going back to influencing the decision makers,
have you found any tips or tricks of how to take the

(46:13):
knowledge that's going to be impactful for an organization to
that leadership, the board leadership, and have that
conversation with people that don't know a lot about
technology but yet control lots of the budget direction?
Do you have any tips for people who might be challenged with
that?

Wade Baker (46:33):
There are approaches that seem to resonate better
than others.
You know, I think one of the tendencies that is a mistake
that security people make is they think, well, here's what I
need to do my job as a security manager, and what is important
to me.

(46:53):
Therefore, I'm going to pass that up the chain to those
people.
And that's usually a mistake, you know, because, you know, your
job is managing the security program, so you need, you know,
these certain KPIs and that kind of stuff, but they're
not what non-security people at the business level need.
So doing that translation, culling it down, do not think you

(47:18):
have to over-communicate, really boiling it down to what matters.
And I think that kind of goes back to risk, right?
One of the reasons there's been a lot of emphasis on risk and
resilience is because that's sort of the bottom line for the
business, right. And communicating that in ways that

(47:40):
sound like the other executives reporting into the board or
business.
If security can kind of learn that language, I think it seems
to help them out.
And storytelling, I mean, that's been a theme in this
conversation.
I've heard from a lot of board members that storytelling is
very effective, because that can take a very complicated scenario

(48:03):
and kind of make it interesting, and, hey, I want to listen.
Okay, great, I get what you're saying.
You know, let's make sure that doesn't happen to us, and
here are the three things we need to do for that.
Those kinds of things I've heard a lot of success with, and
heard from people that that resonates well.

Joshua Schmidt (48:21):
We'll wrap it up with this, this last little thing.
Wade, you mentioned that when you're dealing with large data
sets, you're not relying on AI to ingest that material, but you
do use AI for data analysis that you then bring in to
inform your algorithm or your process.
What's your stance on AI in terms of just ethics and
security at this point?

Nick Mellem (48:46):
Oh man.

Wade Baker (48:50):
You need another hour, right?
That's an interesting wind-down question.
My stance is developing, honestly, because I will admit
that I'm not an expert in AI and haven't put it through its
paces in all the use cases in security, and there's a little
bit of a wait-and-see approach that I have on this, trialing it

(49:15):
where it's good. And this is interesting in my world as a
professor too; there's been tons of discussion about
the use of AI for completing assignments, you know.
And my kind of approach on this is, well, if you're going to do
that, I'm not going to prevent you from doing that.

(49:36):
I mean, that's just one of the tools that the world now
has in its tool belt.
Figure it out.
But what you can't do is just have it rip and hallucinate a
bunch of junk and pass that to me, and I'm going to be okay
with that.
You need to do your homework and check it and make sure that
whatever it's given back to you passes muster. And so it's

(50:03):
unfolding.
I don't have a stance yet, and I'm kind of as confused and
figuring it out like everybody else at this point.

Joshua Schmidt (50:13):
Well, thanks for your time today, Wade.
We'd love to stay in touch with you and see how things develop
on the AI front, and more generally as well.

Wade Baker (50:21):
Yeah, I'd like to see your sword collection grow.
Maybe next time we see you, we'll have a few more katana or
something like that in the background.
Yeah, my, uh, my kids have a

Eric Brown (50:31):
Have a few katanas. They're wooden, but, uh, you know,
they do like that.
Which one of those did you get first?

Wade Baker (50:41):
Huh, I think I got the Excalibur-ish one first,
and then I probably got this one after I saw Braveheart, because
it's the Claymore. And this is the Conan sword.

Joshua Schmidt (50:57):
Any Ren festivals down in your neck of the woods, Wade?

Wade Baker (51:08):
There are, and I went, um, to something, uh, I forget what
it was called, recently, but it was, yeah, along those lines.

Joshua Schmidt (51:12):
So I brought my kids to their first Ren fest
this year, so, um, I might have to pick up a sword.
Come on back, Nick, to your hometown here. I was gonna bring
it up, though.

Nick Mellem (51:21):
Did you all see the, uh, documentary on HBO about the
Ren fair guy in Texas?
No? Apparently he's like an hour from my house,
uh, here, and he basically was the originator of
Ren fairs, and it's the biggest one in the world.
People live with their campers on this site, uh, like an hour
away, uh, and, uh, check it out.

(51:44):
It's super interesting.
The guy's a total nut, and it's wild to see this world, and it's
huge here.
It is huge.
One of the Saturdays two weeks ago, they had like 60,000 people
there or something.
Oh my goodness.
So it's just crazy.
So if you're into that, check out the documentary.
The guy is, he's crazy in a good way.

(52:04):
That's cool.

Joshua Schmidt (52:08):
And with that we've come full circle on the
conversation.
Thanks again for your time, Wade. You've been listening to
the Audit, presented by IT Audit Labs.
My name is Joshua Schmidt, your co-host and producer.
Today we have Eric Brown and Nick Mellem from IT Audit Labs,
and we've been joined by Wade Baker from Scientia.
Scientia, I think I got it, Scientia.
All right, we will be publishing this on Spotify,

(52:32):
YouTube, and all of the places where you get your podcasts.
Please like, subscribe, and share with your friends, and
we'll see you in a couple weeks.

Eric Brown (52:40):
You have been listening to the Audit presented
by IT Audit Labs.
We are experts at assessing risk and compliance, while
providing administrative and technical controls to improve
our clients' data security.
Our threat assessments find the soft spots before the bad guys
do, identifying likelihood and impact, while our security
control assessments rank the level of maturity relative to

(53:03):
the size of your organization.
Thanks to our devoted listeners and followers, as well as our
producer, Joshua J Schmidt, and our audio-video editor, Cameron
Hill. You can stay up to date on the latest cybersecurity topics
by giving us a like and a follow on our socials and
subscribing to this podcast on Apple, Spotify, or wherever you

(53:25):
source your security content.