
September 20, 2023 • 62 mins

How do you approach your duty and responsibility to protect yourself and those around you from cyber threats? Whether you're a large organization or an individual, it's never right to assume that someone else is protecting your best interests.

Our generation is living in an incredibly significant transitional period, and the rules and expectations about how people should be treated are still being tested and hotly debated. The only way to ensure that you and your business remain safe is to take personal responsibility for that safety. In addition, we should also be lining up to preserve the safety of the more vulnerable members of our community - the older and younger generations.

This week, I sat down with Kristi Hoffmaster, who is a senior analyst at Okta, a company that provides secure identity management for businesses of all sizes. Kristi's experience and expertise in third-party risk management yielded a conversation that centered around the need to carefully evaluate and monitor the technology that you allow into your business and your life.

While we spent a good deal of time talking about the specifics of securing an organization's technology stacks, it was the individual, human aspect of cybersecurity that resonated most deeply for me. I was left at the end of our conversation feeling very confident that the only way for humankind to ensure our future cyber safety is to begin fostering a strong sense of individual responsibility across the board.

I hope this discussion leads you to the same conclusion.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Sam Gerdt (00:07):
Welcome everybody to Road Work Ahead, a podcast that explores the unmapped future of business and technology. My name is Sam Gerdt and I am your host. How do you approach your duty and responsibility to protect yourself and those around you from cyber threats? Whether you're a large organization or an individual,

(00:28):
it's never right to assume that someone else is protecting your best interests. Our generation is living in an incredibly significant transitional period, and the rules and expectations about how people should be treated are still being tested and hotly debated. The only way to ensure that you and your business remain safe

(00:50):
is to take personal responsibility for that safety. In addition, we should also be lining up to preserve the safety of the more vulnerable members of our community: the older and younger generations. This week, I sat down with Kristi Hoffmaster, who is a senior analyst at Okta, a company that provides secure

(01:11):
identity management for businesses of all sizes. Kristi's experience and expertise in third-party risk management yielded a conversation that centered around the need to carefully evaluate and monitor the technology that you allow into your business and your life. While we spent a good deal of time talking about the specifics

(01:32):
of securing an organization's technology stacks, it was the individual, human aspect of cybersecurity that resonated most deeply for me. I was left at the end of our conversation feeling very confident that the only way for humankind to ensure our future cyber safety is to begin fostering a strong sense of

(01:52):
individual responsibility across the board. I hope this discussion leads you to the same conclusion. Kristi, you are in third-party risk management, which, to my understanding, is not a specialized role. It's more of a general role. You have eyes on every part of cybersecurity, is that correct?

Kristi Hoffmaster (02:16):
That's right.
Third-party risk management is also known as VRM or vendor risk management; they usually mean the same thing.

Sam Gerdt (02:25):
So you have eyes on, then, auditing companies, looking at every aspect of cybersecurity. As you're looking at the industry, because everybody knows that this is shifting and changing so rapidly, and as you're looking at all types of business, small business, large business, what would you

(02:46):
say is the greatest challenge that you're facing in the next five years?

Kristi Hoffmaster (02:53):
So the challenge, I would say, is first of all, for a company, understanding what's there: what is their inventory, what suppliers exist that that company is doing business with. And then, from a TPRM assessor standpoint, when you're looking at suppliers and their

(03:13):
businesses, I would say it's understanding kind of two lanes. The first lane would be: what is the baseline, what are we doing business with? From a baseline assessment, a static point in time, what is this organization's structure, what are the primary systems that

(03:35):
this organization is made out of? And then how are they managing those risks across their organization? So that's the first lane. The second lane would be periodically asking what has changed about that company. Have there been incidents that would affect that company's operations or how it services

(03:57):
the companies that it does business with? And then just looking at what the response to those changes is. Has the response been nothing? Or has an auditor come in and attested to the fact that they've handled this change in some way, hopefully in a positive way? So those two lanes are kind of what an assessor, a third-party

(04:21):
risk management professional, is looking at. The challenge for an organization that is being assessed by a TPRM evaluator would be the overwhelm, because there are multiple inquiries and there are multiple questions that a TPRM professional would issue and request from that company or that supplier.

(04:43):
So sometimes those teams that are, like I mentioned, the trust team or the assurance team, would be the team in-house fielding those questions and responding to those inquiries. A larger company that does business with hundreds of other companies could be answering thousands of questions a month

(05:06):
from these external companies. So there's a lot of overwhelm in this field as well, and there needs to be a strategy from leadership in managing both sides of TPRM and how to kind of optimize and handle that function efficiently. Otherwise it could be anywhere from one extreme of legal

(05:28):
liability to the other extreme of just burnout from employees working in this field.
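
As a rough illustration of the "two lanes" Kristi describes, here is a minimal sketch of a supplier register: a static baseline assessment per vendor, plus a periodic check for staleness or incidents. The field names, risk tiers, and dates are illustrative assumptions, not Okta's actual tooling or process.

```python
# Minimal sketch: lane one is a point-in-time baseline per supplier;
# lane two periodically flags suppliers whose baseline is stale or who
# have had incidents since the last assessment. All fields are illustrative.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Supplier:
    name: str
    systems: list[str]            # primary systems the supplier runs for you
    baseline_date: date           # when the point-in-time assessment was done
    risk_tier: str                # e.g. "low" / "medium" / "high"
    incidents: list[str] = field(default_factory=list)

def needs_reassessment(s: Supplier, today: date, max_age_days: int = 365) -> bool:
    """Lane two: flag suppliers whose baseline is stale or who had incidents."""
    stale = today - s.baseline_date > timedelta(days=max_age_days)
    return stale or bool(s.incidents)

suppliers = [
    Supplier("CloudPOS Inc.", ["payments"], date(2022, 6, 1), "high"),
    Supplier("SchedulerCo", ["scheduling"], date(2023, 3, 15), "medium",
             incidents=["2023-08 outage"]),
]

for s in suppliers:
    if needs_reassessment(s, today=date(2023, 9, 20)):
        print(f"Reassess {s.name}: tier={s.risk_tier}, incidents={s.incidents}")
```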

Sam Gerdt (05:35):
From a subjective standpoint, it seems to me that this thing that you're talking about, all of this overwhelm, is going to be the greater challenge for the small business versus a larger corporation. And we're getting to the point now where the small business can't ignore cybersecurity and really needs services like what

(05:57):
you're talking about, but half the time they're not even aware that it's a need, and then the other half, when they become aware, like you said, they're overwhelmed at the very prospect of undergoing even an audit. So in that case, when you're discussing these things with a smaller business who's facing that challenge, what's the

(06:18):
response?

Kristi Hoffmaster (06:19):
Sure, I think small business owners and those who are involved in protecting a small business can do the same thing that third-party risk management professionals do, which is look at the data. The data is the crown jewels. The data is really the king of your business.

(06:39):
So step back, like, from a very high level. You might want to do this once or twice a year. A small business owner can say: what kind of data does my business handle and manage and deal with? What kind of promises do I need to make to my customers to protect that data?

(07:00):
Obviously, a landscaping small business is going to have financial data. If they take digital payments, they're going to have invoicing and personal data on some of their customers that they need to protect. How are they protecting that? What systems are they using? So, if they're using a cloud point-of-sale system to handle

(07:21):
financial transactions, they need to be looking at, first of all, internally, what can they do procedurally to handle that data properly? And then looking at the actual third-party vendor that is providing those financial transactional services. Looking at: is this a company that just created itself last

(07:44):
week and is using your business as a guinea pig? Is it a friend's startup? In that case, you're taking on some risk; there's not a lot of history there to prove that they've maturely been able to handle that capability. Or are you using a very well-established cloud vendor that's been around for a long time, that has a track record of

(08:06):
protecting data and dealing with challenges in a transparent way with its customers? Those are the kind of things you, on a high level, would want to do as a small business owner.
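
To make Kristi's once-or-twice-a-year review concrete, here is a minimal sketch of a data inventory for a small business. The data categories, systems, and promises are hypothetical examples, not a standard taxonomy.

```python
# Minimal sketch: list what data the business handles, where it lives, and
# what promise you make about it; the follow-up question for each row is how
# that system (and the vendor behind it) is protected. Entries are examples.
data_inventory = [
    # (data type, where it lives, promise to customers)
    ("payment card details", "cloud point-of-sale vendor",
     "never stored on our own machines"),
    ("customer names and addresses", "invoicing system",
     "used only for billing and scheduling"),
    ("employee payroll records", "payroll provider",
     "access limited to the owner"),
]

print("Annual data review:")
for data_type, system, promise in data_inventory:
    print(f"- {data_type}: held in {system}; promise: {promise}")
```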

Sam Gerdt (08:18):
Are there any other things that you would consider if you were evaluating a vendor as a small business, or are there any other red flags that you might look out for?

Kristi Hoffmaster (08:27):
Sure, that's a really good question. The number one: I work for, I am employed with, an identity provider company, so identity is how you're logging into an application, how you're authenticating, and how you're doing that safely. I happen to work for a company where, you know,

(08:48):
our pride is, you know, promising that we can help businesses secure that authentication handshake. So the number one red flag, I would say, is if a small business, you know, is wanting to take payments but not requiring any kind of single sign-on authentication to its

(09:08):
application. That would be a huge red flag. Or just that the system of record for your business with them, and operating and sharing that data, doesn't have consistency, or maybe it's not even present. Those would be red flags.
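
Kristi's red flags translate naturally into a short checklist a small business could run before onboarding a vendor. A minimal sketch follows; the questions and the pass/fail logic are assumptions made for illustration.

```python
# Minimal sketch: if a vendor handles sensitive data, treat missing SSO/MFA,
# a missing track record, or poor incident transparency as red flags.
QUESTIONS = {
    "supports_sso_or_mfa": "Can we require SSO / multi-factor authentication?",
    "established_track_record": "Does the vendor have a history of handling this maturely?",
    "transparent_about_incidents": "Does the vendor disclose incidents and how they were handled?",
}

def review(vendor: str, handles_sensitive_data: bool, answers: dict[str, bool]) -> None:
    red_flags = [q for key, q in QUESTIONS.items() if not answers.get(key, False)]
    if handles_sensitive_data and red_flags:
        print(f"{vendor}: red flags to resolve before onboarding:")
        for q in red_flags:
            print(f"  - {q}")
    else:
        print(f"{vendor}: no obvious red flags at this (very high) level.")

review("CloudPOS Inc.", handles_sensitive_data=True, answers={
    "supports_sso_or_mfa": False,          # the number one red flag above
    "established_track_record": True,
    "transparent_about_incidents": True,
})
```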

Sam Gerdt (09:24):
So, to give you an example: I work for, you know, we do digital marketing, digital sales. We're in this digital space as an agency, and a big part of my job is recommending software-as-a-service platforms for

(09:46):
companies to run things like marketing and sales. And the average size of a tech stack for a company that we would work with, depending on the size of the business, but even the landscaping business, as an example, is going to have at least a dozen vendors, and the tech stack for a larger, mid-sized company, say, is going to be way higher, maybe even

(10:08):
close to 50, 60, 70 vendors. And so we're evaluating these vendors (sometimes we resell them, sometimes we recommend them, sometimes we just test them and use them ourselves and then, if the need arises, we have them as a recommendation). You're looking at it completely differently than the way we look

(10:29):
at it. We look at it and we see pricing, we see feature-set capability, we might look at customer reviews, we might hop on a call with a salesman from that provider and hear more about the internal workings of the company from them. But what we lack a lot of times, and a lot of times it's not even available to us,

(10:50):
what we lack is insight into their security, their cybersecurity practices, compliances, all of that stuff. And so I'm curious, for my own benefit and for the benefit of our customers: what's the mark of a really conscientious vendor,

(11:10):
such that you're not going to get that email that says, hey, we had a data breach, your credit card might have been exposed?

Kristi Hoffmaster (11:17):
Right. Okay, that's a great question. So I think that there are, on a positive note, there are actions you can take from that perspective, sitting in that chair, that don't require a lot of legwork. First of all, in this day and age, most people (and there have

(11:40):
been studies that show this with consumer purchasing habits), most human beings do a little bit of research on their own before they even engage another party in a transaction or a question or any kind of engagement. So you can look at their privacy policy, which should be

(12:01):
on their website. If a company doesn't have a website, that's not necessarily a red flag, but that is an easy way to tell. A hallmark of "this is a company I want to do business with" is that they have some kind of statement that is publicly available for customers to look at and see: how do they handle information?

(12:23):
What kind of commitments do they make to you, a consumer, or to you, another business that they're doing business with, and how are they going to promise that they are going to make an effort to protect that information? Also, usually, most likely, if you scroll down on any application or any software or any business or organizational

(12:48):
website, there should be, at the very bottom of the privacy policy, a small paragraph that speaks to how you can engage that company, how you can communicate with them, whether it's a phone number or an email address. And so you can look very quickly: if that's there, that means we are open to talking with you and

(13:09):
working with you. There are also legal reasons and regulatory reasons that that business might be subject to providing that information, which is another reason they do it. So usually a privacy policy would be present, as well as a security statement or some kind of portal or page speaking to

(13:29):
what frameworks they are compliant with, and what kind of, you know, regulations they acknowledge are out there and that they are subject to in keeping your information safe.
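
The quick public check Kristi describes (does the vendor publish a privacy policy, a security or trust page, and a way to contact them) can even be scripted. Here is a minimal sketch; the candidate paths are guesses, many sites use different URLs, and a missing page is a prompt to look manually, not proof of a problem.

```python
# Minimal sketch: probe a vendor's site for common privacy/security/trust
# page locations. Paths are assumptions; treat the output as a starting point.
import urllib.error
import urllib.request

CANDIDATE_PATHS = ["/privacy", "/privacy-policy", "/security", "/trust", "/legal/privacy"]

def find_public_statements(base_url: str) -> list[str]:
    found = []
    for path in CANDIDATE_PATHS:
        url = base_url.rstrip("/") + path
        req = urllib.request.Request(url, method="HEAD",
                                     headers={"User-Agent": "vendor-check/0.1"})
        try:
            with urllib.request.urlopen(req, timeout=5) as resp:
                if resp.status == 200:
                    found.append(url)
        except urllib.error.URLError:
            continue   # unreachable or missing; check by hand instead
    return found

# Example with a hypothetical vendor domain:
# print(find_public_statements("https://example-vendor.com"))
```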

Sam Gerdt (13:42):
Yeah, you actually bring up an interesting secondary point to this whole thing, and that is: it's not just ensuring that your data is safe from outside attack when you engage with a third party, it's also making sure that that party is going to be responsible with it themselves.

Kristi Hoffmaster (13:59):
Absolutely.

Sam Gerdt (14:00):
And we're seeing more and more of this. There was the incident with Zoom, where they had changed their policy to indicate that data collected on the platform could potentially be used in the training of LLMs, AI, and people lost their minds.

Kristi Hoffmaster (14:20):
Yeah, there was a lot of industry pushback there.

Sam Gerdt (14:23):
Yeah, there was a huge backlash and they walked it back and they tried to say no, no, no, that's not what we meant, but it was very apparent that they intended to quietly allow themselves access to that data for that purpose, and customers obviously did not want that. So how do we protect that?

(14:45):
You can have the most compliant vendor in the entire world, and if that vendor chooses to use your data in a way that you don't want them to, what's the recourse other than leaving them?

Kristi Hoffmaster (14:59):
Leaving them is one recourse. There is strength in numbers when customer behavior can influence a company's response and posture, for sure, and that's kind of what happened here with Zoom: there was a lot of noise in response to what had been reported. So you bring up a really good point, which I would probably

(15:25):
tie to a lot of questions that someone like you could ask about cybersecurity, which is the importance of socialization and normalization of the human factor in protecting information. So it can work on both sides. The human factor can influence company behavior in that type of

(15:45):
scenario, where people are just saying: we don't like this, this isn't right, we don't want to enter this territory of information being used in a way where we aren't sure of the outcome. And then on the other side, there's the human factor, and you brought up this question, of what organizations can do

(16:07):
internally to keep those promises. There's the human factor of, just, there's always subjectivity when it comes to human beings making decisions, and they can be safe decisions or unsafe decisions. So what someone in TPRM wants to see a company do is have

(16:27):
those internal controls that are administered within the company: background checks for employees, security training happening regularly, whether it's once a year or upon hire and then every quarter or twice a year. You would want to see those types of things and activities happening. You would want to see an auditor coming in and attesting

(16:50):
that they have looked at that and that the activity is actually present and working as intended. So as far as public recourse, we live in a society and a culture, fortunately, where there is the ability to discuss this and have public recourse about it.

(17:11):
What can really happen in this situation with Zoom is where you have customers saying, you know, what's happening here, we don't appreciate this, and then you have practitioners from the industry breaking it down and saying, here are the risks that this could introduce and here are the things that we see wrong with this picture. So public recourse is actually really effective in a lot of

(17:34):
situations.

Sam Gerdt (17:36):
It just comes down, then, to whether or not you can uncover it. Before we get to that individual responsibility, that individual vigilance, there's one more thing that I want to touch on, a challenge of the small business that kind of goes along with that: building a tech stack and just kind of outsourcing pieces of your business to these third-party vendors. And that is, when we do this, especially on the small business

(18:02):
side (I see this all the time), there's this assumption of just letting go: okay, they handle that, I don't have to think about that. So we saw this a lot with GDPR and the data privacy rules that came out of GDPR. There was this idea, this attitude, that my

(18:23):
vendors will be compliant, my vendors will sort this out. I don't need to necessarily know the inner workings of this law and how it affects me because I'm using a very popular vendor, or if they're compliant, I'm compliant. I don't need to worry about this; they're worrying about it for me. Is that always a good idea?

Kristi Hoffmaster (18:44):
No, great question. You can take that statement and you can blast it out to a macro perspective. Companies like the one I work for have the same problem, which we do; large companies have this challenge, which is continuing to have a pulse on what is happening,

(19:08):
number one, with your data, with your customers' data; continuing to understand what those promises that vendor is making to you are and what the termination of those promises is. So now we're looking at the legal, contractual agreements. You are most likely going to sign a purchase order and a

(19:29):
contract with that business, that vendor that you're handing part of your business to and giving part of your data to. So what are those promises? What's in the contract? So I would say, for a company like the one I work for and for you as a small business owner, or anyone out there, you need to have someone in-house, whether it's you or someone that you

(19:53):
hire, part-time or full-time, that handles those relationships and what I would call the supplier inventory that is helping operate your business. Because, ultimately, the business is accountable for any problems, any residual problems, or any gaps where omission

(20:16):
of knowledge of what's going on could introduce data loss or financial loss. It could affect, let's say, your employees. You've got 20 employees and your scheduling cloud vendor is not available for two days and something goes down and that affects your operations; that's a good example. So having that resource, whether it's you or someone else, to

(20:40):
understand what has been agreed in writing and how long those agreements last. And Amazon has kind of coined this for their industry landscape, which is the phrase "shared responsibility" and having the shared responsibility model, and what that means is: what is the vendor responsible for, and what are

(21:04):
you, as the customer of that vendor, responsible for doing on a regular basis? And a good example would be: let's say your cloud vendor is responsible for the infrastructure and that platform, and providing that to you and being available 99.5% of the time. They make that promise to you, and they ask you to be responsible,

(21:26):
in this example, for patching the software that lives in that cloud instance. That's your responsibility as a business. So if that goes down and it's on your watch because you haven't patched it in 13 months, that would be your accountability that you would be facing. So having someone in-house that can look at what the promises

(21:49):
are and what the responsibilities are, from you as a business and from them, then you kind of have a template that you can apply to any vendor that you do business with, whether it's five or 20 or 100 vendors.
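
The shared responsibility model Kristi outlines can be written down as a simple matrix per vendor: what they promise, what stays on your plate, and when you last did your part. A minimal sketch, with made-up vendors, duties, and intervals:

```python
# Minimal sketch: record who owns each duty in the shared responsibility
# split and flag the duties you own that are overdue (e.g. patching).
from datetime import date, timedelta

responsibilities = [
    # (vendor, duty, owner, last_done, max_interval_days)
    ("CloudHost", "keep the platform available (99.5% uptime promise)", "vendor", None, None),
    ("CloudHost", "patch the software we run on the cloud instance", "us", date(2022, 8, 1), 90),
    ("CloudPOS",  "rotate the API credentials we were issued", "us", date(2023, 7, 1), 180),
]

today = date(2023, 9, 20)
for vendor, duty, owner, last_done, interval in responsibilities:
    if owner != "us" or last_done is None:
        continue   # the vendor's side is tracked through their attestations instead
    if today - last_done > timedelta(days=interval):
        overdue_days = (today - last_done).days - interval
        print(f"OVERDUE by {overdue_days} days: {vendor}: {duty}")
```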

Sam Gerdt (22:03):
And something additional that I think a lot of small businesses neglect to think about is the fact that their vendors have vendors.

Kristi Hoffmaster (22:09):
Yes.

Sam Gerdt (22:11):
And this relationship, this relationship is not just two people or two companies. This relationship is your relationship with this organization, which has a relationship with dozens of organizations, and those organizations...

Kristi Hoffmaster (22:23):
And some of those vendors are open source products. So there is no company managing that organization that's providing that software to you, for example. So you might be reliant on an open source project that is handling some aspect of your business, and that's important to know as well.

Sam Gerdt (22:42):
So, for every listener out there, what we're saying is: you need somebody knowledgeable about all of this stuff to evaluate your business decisions with regards to who you work with, and don't just take a simple reputation metric as gold and go from there. You need to have somebody who's constantly on top of this.

Kristi Hoffmaster (23:02):
Yeah, and fortunately it's the cost of doing business in 2023 and beyond.

Sam Gerdt (23:08):
Yeah, it's really good to hear you say that, because a lot of what I do in my capacity is evaluating vendors for specific tasks and looking at all of this, and it is a challenge.

Kristi Hoffmaster (23:19):
Yeah, and you have to look at jurisdiction too. Is your company doing business with locals within a 30-mile radius, so that your jurisdiction is the state you're in and operating in, or are you subject to laws and regulations all over the world? It depends on what your business is doing.

Sam Gerdt (23:38):
Yeah, it gets really complicated as soon as you go global. I'll tell you that.

Kristi Hoffmaster (23:42):
Yes.

Sam Gerdt (23:44):
So let's get into, then, the individual's responsibility. This is for businesses, they have responsibility, but also for employees. I feel like there's this great need for everyone, in every capacity, whether you're at home sitting on your couch, or at work sitting at your desk, or driving.

(24:04):
Even these days, you have to have this awareness of all of the things that are happening around you, how they're affecting you and how they're affecting your family, your organization. Data is constantly being collected. We are constantly interfacing with platforms that are watching us, collecting data on us and using that data. Whether it's

(24:28):
toward us or for the benefit of a company somewhere else, that data is being used. And so we talk about vigilance with your individual accounts, two-factor authentication, email confirmation, all of these other little things that most people are aware of, but there's also just this general need for, I

(24:51):
think, a basic cybersecurity education across the board.

Kristi Hoffmaster (24:57):
I agree.

Sam Gerdt (24:59):
Is that something that we're seeing more of on the company level? Are companies getting on board with this and providing this kind of information, or are we still woefully behind?

Kristi Hoffmaster (25:09):
Well, I grew up (I'm going to date myself here), but I grew up in this industry. I started in IT many, many years ago, and back then the phrase was that security was bolted on, and now systems and organizations are attempting to operate with security and

(25:30):
privacy by design as a concept and a best practice. So I do think that it's been forced on us, culturally and as a society, to become more mindful and aware, because it's not really a matter of if, it's a matter of when things are stolen in the, you know, digital society that we live in.

(25:52):
So I do think there's more awareness, which is great. We're coming up on the annual Cybersecurity Awareness Month that happens every year around this time, which CISA, one of our United States federal agencies, promotes each year to the public with publications and tips and

(26:14):
best-practice reminders. So there are activities like that that happen on a cyclical basis that can help us be more mindful. But going back to, you know, cybersecurity traditionally was thought of as endpoints and firewalls to a network, and now it's everything: it's data management, it's the network,

(26:38):
it's the human factor. So, going back to that aspect, you know, I would say one of the most costly and impactful and expensive impacts to any company or any organization (you know, a hospital, a school, anything) would be, you know, ransomware, taking that data and those systems and

(27:03):
holding them hostage, as well as social engineering. The tactics and the threats are out there and they're more sophisticated than ever. So, as a baseline best practice, individuals, you know, should be handling their credentials in very, very safe manners, like not writing down passwords; we're way past that age.

(27:25):
You should be using a vault and using two-factor, multi-factor authentication with, you know, phishing-resistant mechanisms for every authentication login that you personally have. So I'm a parent. I grew up, you know, raising my children, like most parents, wanting to protect our family, and I

(27:46):
would make that analogy from a family to a company with operations. You know, you want to protect your family's physical structure, their brains, their citizenry, their, you know, spiritual soul and everything else. As a parent, you would want to protect that child that's vulnerable. You might even set up some best practices for protecting your

(28:09):
family with disciplining and correction, you know, working with your family and your partners in the village that's raising that child or those children. So it might seem far-fetched, but you could even take it to your family, taking that responsibility that you're talking about, and write a policy and a procedure for how your

(28:29):
family handles digital best practices and applications in your home. Everything's going to look different depending on what family and how you're raised, but, for example, you can make a policy that, you know, doesn't allow your children to do certain things with technology. So those are ways that you can take responsibility: helping

(28:50):
educate people, whether older or younger, to understand what happens when you download that application. What are you giving that vendor access to then, and residually? So those are steps that you can take for responsibility. But I do think that companies are doing a better job of training

(29:11):
the workforce and understanding everything from best practices to innovative steps to be mindful of, like, you know, AI and deepfakes: what should we look for? So, just helping understand what your expectations are for yourself and for anyone that you're responsible for, whether it be a family or a school or a business, having that awareness

(29:35):
and those expectations laid out is really important.

Sam Gerdt (29:39):
We've arrived at this place, and I think we've been here for a little while, where we really need to be looking out for the more vulnerable and lifting them up and coming alongside them. I think you touched on it, but you think about the older generation and the younger generation. The older generation that didn't grow up with this stuff,

(30:02):
with these threats, is woefully incapable of managing the real threats that are kind of at their doorstep, and in many ways they're the more targeted demographic as well, because they're the ones that have money, they're the ones that have something to take. And so you think about the myriad of

(30:22):
scams that prey on the elderly. This is something that we, as, you know, the generation that is in charge, really should be helping with. I mean, I feel very strongly about this, but it pains me to see how we've kind of left a generation behind.

(30:43):
And, you know, to watch, as an example, our grandparents try to navigate social media so that they can see, you know, family pictures that they would have had other ways of seeing, you know, 15 or 20 years ago, to expect them to adopt a platform that they're very unfamiliar with, and then

(31:03):
all of the wolves that exist on that platform: we do a disservice to them. And then the flip side of that is children. To, you know, hand a child a piece of technology like a phone or a tablet, without that education that comes alongside of it, understanding, like you said, what happens when you download this app, what exactly is happening when you download this

(31:25):
app, and then having those conversations where you say: listen, you may not be thinking about this right now, but what you do on this device will come back to you in 10 years, 15 years, 20 years, because they're remembering, the people that you're dealing with, the companies, the devices that you're dealing with.

Kristi Hoffmaster (31:44):
Right, the data. Yeah, I totally agree with you. I mentioned Cybersecurity Awareness Month; it is coming up. There are posters and flyers and reminders about those types of, you know, statements reminding you. But I think, to go back to a parenting analogy: when you,

(32:10):
whether it's a person or an entity, an organization, when you have the responsibility to protect another party that is, you know, less able and maybe has less insight, or you're just keeping that promise to protect that person or that entity and

(32:30):
that party, you have to realize the responsibility. And then, like you're saying, you know, we should take more ownership; you should lay out what the expectations are. So it just comes down to basic expectations. For example, you would tell a three-year-old child in a parking lot: my expectation is that you hold my hand as we make

(32:54):
this journey into this place, and I'm not gonna let go of your hand. And if someone else were to come up, you know, hypothetically, and ask to take your hand, you know, you should never do that. You should never expect that someone wanting to do that has your best interests in mind. So, laying out to the elderly: this is what you should expect to see here. And if you don't see that,

(33:16):
or if you see such and such, you know it's malicious, most likely. So, just people understanding what's to be expected. Whether people own a business or don't own a business, whether they've been to school for cybersecurity or not, those of us who have a little bit more awareness, it's our responsibility to communicate that.

(33:37):
I definitely agree there.

Sam Gerdt (33:40):
And so, moving forward, what we would like to see, what I would like to see, is companies, especially small businesses, taking this seriously and inviting professionals in. Not necessarily in a business capacity, where they're being invited in for, you know, a professional audit, but inviting them in and saying: hey, listen, we're going to buy lunch, you know, we'll pay you for your time, but we want you to just talk

(34:03):
to us about these things and bring our people up to speed.

Kristi Hoffmaster (34:05):
Yeah.

Sam Gerdt (34:06):
Those kinds of lunch-and-learn environments, I would love to see more of that on the smaller scale. And then, on the larger scale, I think there's this need for larger companies to recognize that not every employee is on the same page with this, and there needs to be this conscientious effort to make sure that you're not leaving

(34:26):
someone behind, especially as you think about all of the innovation happening around AI. There's a lot of people who are just really insecure about all of this, and most of that's coming from ignorance, I think. They've not been brought along.

Kristi Hoffmaster (34:41):
Exactly. And we talk about diversity and inclusion as, you know, something that business owners are more and more aware of as well. One way that you can include the people that work for you in your conversations is to combine training with fun. You know, you want to boost your employees' morale,

(35:02):
you want them to stay engaged, you want them to enjoy working for your company. My suggestion would be to combine some of that bringing experts in and, you know, combine it with an outing or a fun activity, where you have not just someone talking to your workforce but

(35:22):
having an actual, engaging conversation where all questions are welcome, all ideas are welcome. You know, curiosity is really something that can boost morale in your company. So the more you allow healthy, curious debate and conversation and inquiry, the more productivity you have, and that

(35:43):
training just kind of happens organically. You can have that coupled with formal training as well.

Sam Gerdt (35:50):
Yeah, I agree. I think it's so needed. It will become very helpful for the companies who do adopt a culture of this education.

Kristi Hoffmaster (36:00):
Yeah, we have to, because we're in this interim generation and time in history. You know, the White House just pushed out a strategy for enhancing cybersecurity in the workforce very recently; I think it was about a month ago or so. So you're going to see the impact of that directive coming from the US government. You're going to see that residually happening, where

(36:23):
cybersecurity is normalized, and just talking about it is going to be like: we don't refer to the phone as a smartphone anymore. That language has kind of passed; we just say phone and we all know it's a smartphone. So normalizing safety classes about digital safety and privacy

(36:46):
and security, those are going to be coming into, you know, high school and middle school curriculum over the next decade. So that's going to be something that's advantageous for that generation, because they will grow up with that mindfulness. But we're in this interim phase where we have to step back and say: not everyone grew up around this and not everyone

(37:08):
understands the importance of keeping information safe and how it could impact your business if you don't.

Sam Gerdt (37:14):
We're coming to this point too, and I'll be curious to hear your thoughts on this; we're going to get more into the realm of speculation here. But we've seen some scary stuff with the capabilities of artificial intelligence to do things like, you know, just passive learning, where it's just constantly churning data

(37:36):
behind the scenes. There's no directive; it's just a machine processing these points of data and coming up with these insights that are unbelievably accurate. And then the flip side of that is we're giving more eyes to these systems, more inputs, more sensors, and I feel like we're getting to this point where, even as, like

(38:00):
you were saying, we've shifted from a bolted-on approach to security to more of an integrated approach to security, even there, I feel like we're so far behind. Now we've got agents that, in particular tasks, can act with superhuman intelligence to analyze data and produce insights that could

(38:25):
produce malicious attacks on individuals, and then also could do things like, you know, brute force, but brute force in creative ways, where it's not even necessarily just trying a million passwords. It's listening, you know, on your phone, on your device; it's using those sensors that

(38:48):
it's been given to try to break its way into your life. These are things that we can't necessarily protect against with single sign-on, because the approach is so, it's so human. How do we adapt to that?

Kristi Hoffmaster (39:06):
Wow. I would never presume to be the expert voice on speaking on how to address this. There are many voices out there, and that question, asking that, is kind of the golden question right now for the world.

(39:27):
You know, NIST has put out an AI risk management framework recently, and an AI resource and trust center that people and organizations can go to and kind of get the most recent questions and guidance from hundreds of private sector and federal contributions.

(39:48):
So I would highly recommend everyone check that out. I would say the challenge, inferring from what you're getting at, the challenge is: we don't know. And when you don't know, as a TPRM professional, when we don't know the presence or absence of something, we assume the worst,

(40:09):
or we assume the highest risk is involved. The cybersecurity terminology that covers that would be zero trust. So there's the concept of, when you are looking at something, or when data is moving, when systems are interacting, you

(40:29):
assume the worst. So there's zero trust. The gate is closed. Everything needs to be tightened up. So from a technical control aspect, what you would want to do is have as much monitoring as possible. When the eyes and the inputs are myriad and just out there exponentially, you have no idea;

(40:49):
there's thousands or millions of data flows, inputs and connections. You would want to be monitoring that activity as much as possible. From a small business perspective, that's not always easy to do, but the number one, primary way that you can address authentication handshakes at those gates where that data is

(41:11):
coming and going in transit is single sign-on and phishing-resistant, multifactor authentication. So you can, for example, say: every system that our business uses must have this enabled. If the vendor cannot, you know, provide that single sign-on

(41:33):
function and that MFA function, then we won't do business with them. That's just a requirement these days, because we don't want that data leaking. Another thing that organizations can do is create an AI policy that's appropriate for your workforce. You know, you could say the scope of this policy applies to

(41:55):
everyone in the company and everyone in the organization, and everyone who contracts with us to do services for us. And then you could say: we are only going to allow this type of data to be processed by any system that uses AI. It could be anything that's publicly available to our

(42:17):
company or our customers, or it could be, you know, this kind of information is off limits. And you train your workforce to say, you know, if you're going to work here and be an employee, you've got to promise and commit to not putting such-and-such type of data into these systems, not sharing this type of information,

(42:40):
because we don't know yet. And then having those resources to be able to technically look at what's happening. You know, every technical software company these days should have experts who understand APIs and understand what kind of permissions and what kind of data is flowing with those APIs, and what are the settings, what are the

(43:01):
configurations for them. So, yeah, we're there. We're at a day where we look at the unknowns and we have to kind of assume the worst and understand more about what systems are doing, and opt out where you can. You know, it's just reading and understanding those terms, and understanding the terms of service for the applications that

(43:23):
your companies are relying on.
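
The AI policy Kristi sketches (decide which classes of data may ever be processed by an AI system, and default everything else to "no") lends itself to a default-deny rule. A minimal sketch, with class names invented for illustration:

```python
# Minimal sketch of a default-deny AI data policy: only explicitly approved
# data classes may be sent to external AI tools; everything else is off limits.
ALLOWED_FOR_AI = {"public", "marketing"}   # e.g. already-published content

def may_send_to_ai_tool(data_class: str) -> bool:
    """Default-deny: customer PII, financials, credentials, etc. never qualify."""
    return data_class in ALLOWED_FOR_AI

for item, data_class in [
    ("published blog post", "public"),
    ("customer invoice export", "financial"),
    ("support ticket containing a customer's email", "customer_pii"),
]:
    verdict = "allowed" if may_send_to_ai_tool(data_class) else "do not paste into AI tools"
    print(f"{item} [{data_class}]: {verdict}")
```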

Sam Gerdt (43:26):
That's actually a really good answer to the question, and I know that was a hard question.

Kristi Hoffmaster (43:29):
That's okay, thank you. It is a very big question right now.

Sam Gerdt (43:33):
It is. It's so big. If we're going to talk cybersecurity, we have to talk about it, because I feel like the cybersecurity industry has done such a great job of responding to the pressure of five years ago, just in time for us to see this new existential threat. And I don't mean, you know, existential threat in the sense that AI is going to wipe out humanity, but existential threat

(43:56):
in the sense that, all of a sudden, we're facing intelligence, you know, computer intelligence, that is incredibly capable when acting maliciously, just like it's incredibly capable when acting for our good.

Kristi Hoffmaster (44:11):
Yeah, and getting your arms around that is a huge challenge. I would make an analogy to social media. You know, some of the CEOs of the most prominent AI companies have faced Congress very recently, saying: we don't want to see a repeat of the social media revolution happening with AI.

(44:32):
In other words, we're asking for partnership with government to form, you know, legislation and regulation around this industry. But it's global, and that's another aspect of it, the global aspect, that it's important we understand,

(44:53):
and this is something we need to teach the most vulnerable as well: we do not live in our very narrowly scoped society and local culture anymore. This is a global world that we live in, and we are globally dependent on each other, and we are globally connected, and the threats are global.

(45:13):
So it is really important to understand that as we go forward.

Sam Gerdt (45:20):
Do you feel optimistic that that's going to be the case, that we will be able to control, to regulate, the explosion of these technologies, or is it going to be the Wild West, like it was with social media?

Kristi Hoffmaster (45:37):
I think that it's an exciting dance of caution and innovation that's happening. Leadership in large tech companies, we're all looking for the competitive edge, and to see how this plays out and who the

(45:58):
key players in these spaces are, and how AI can innovate and help foster innovation. So, on one hand, I'm very optimistic: healthcare, technology, security. I think that AI is going to bring a tremendous amount of innovation. I do also think, like social media, there's going to be more

(46:19):
accountability. You look at the way that human beings operate now versus a decade ago with social media: there is more accountability. If there is an altercation happening in a public square and someone is live-streaming that to a social media platform, there is accountability where there used to not be. I heard on the news the other day

(46:41):
that the IRS either has, or is stating that it will have, the capability to rapidly investigate and analyze tax return information and data with AI, in order to foster more accountability to the taxpayer. So there will be more accountability across the board

(47:02):
as well. But there will be accidents and bumps along the way. Hopefully they're not tremendously harmful to society, but who's to say? It's an exciting time to be a human being on this earth, and in outer space.

Sam Gerdt (47:19):
Yes, we're not just on the earth anymore, are we? Kind of getting right into that as well: we talk about this generation being a generation of transition and looking at initiatives that are going to be better for the younger generation coming up. We are going to leave them with a very different world. And so,

(47:45):
what are the ways in which we prepare the younger generation so that, as kids are getting into that career age, they're choosing a path that's going to be both beneficial for them but also beneficial for the world? And I think, specifically with regards to things like

(48:06):
cybersecurity, what are the ways that we can foster the right attitude?

Kristi Hoffmaster (48:14):
That is such a great question. That kind of gets to the core of what I'm passionate about at this point in my life. Yeah, what a weighty responsibility we have in this generation. Like I said, I am a mom. I have children who are very much growing up in a culture

(48:36):
that can be very frightening. There's a lot of unknowns. My son said to me the other day (we were having dinner, maybe it was last night even): Mom, this is a crazy world to grow up in. It's just crazy. So, I think I mentioned our government.

(48:58):
Our culture is trying to foster a quick pivot (nothing happens quickly in government), but a pivot to an attitude of, by design, conceptually educating humans as they grow up in primary and secondary education to understand the importance of

(49:22):
keeping information safe and understanding how it impacts your personal safety with infrastructure, and considering just the larger aspect of keeping systems safe. So we have a cyber force, a space force arm now in our

(49:43):
military branches that is looking at satellite technology, securing everything above and around and below us, so encouraging people to understand that there are so many opportunities for them to work and do this work. And then, stepping back, there's that viral meme about

(50:05):
disasters, where people say it's Mr Rogers and he says: look for the helpers. Those are the ones in a crisis that will be, those are the ones doing the good. And so, stepping back and saying: which side do you want to be on? Do you want to be on the helper side or do you want to be on the side creating harm?

(50:26):
And so, I know it sounds cliche, but to me there's a lot of work to be done in protecting human life and protecting organizations and systems. There's a ton of work to be done and there's a lot of, you know, paths that you could take to do that work. And it's way more exciting, in my opinion.

Sam Gerdt (50:47):
We talked a little bit about normalizing some of this stuff, and we have. I think the stigma is dying away finally, but there was a time, when I was growing up this was the case, where there were the computer geeks and everybody else, and now it seems like what we need is an entire generation of computer geeks if

(51:08):
we're going to get through this safely. Is that something that you agree with? I'm assuming the answer is absolutely yes.

Kristi Hoffmaster (51:17):
Yeah, I mean, there have been publications from multiple agencies in our government about the dire need for increasing the cyber workforce. But yeah, I think, since practically everything has gone digital in the last decade, for this generation of students, you know, it makes complete sense that things are out there and

(51:41):
need to be kept safely secure. But there's so many fields that one person could go into, and the beauty of cyber, especially TPRM that I'm in, or any GRC role in cybersecurity, is that you don't have to try on one hat

(52:01):
and stay there. You know, you could work in pen testing, you could work in vulnerability management, you could work in networking, and you could even work in engineering and then pivot across the landscape. There's so many paths. So I don't think it's nerdy anymore. I think it's kind of really cool.

Sam Gerdt (52:20):
And I don't even mean that we necessarily need a full generation of people who are working in IT or cybersecurity or any of it, but there just needs to be this computer literacy, so that even the guy, you know, who's building houses has a computer literacy to the point that he can protect himself against ever-increasing, increasingly capable systems,

(52:44):
these intelligent systems that are, you know, for better or worse, going to be looking at him.

Kristi Hoffmaster (52:49):
Yeah, and I think that, I personally, I don't know if you do this, Sam, but every December I sit down and I create a word; I hone in on a word for the following year. It's my word, and I want to see it and challenge myself with it in everything out there.

(53:09):
You know, I do a little personal retreat and meditate, and I think about, like, what do I want my word to be, and how am I asking it and inviting it to come into, like, all my interactions in my work life, my personal life, everything. And it's so interesting; I've been doing it for about seven years, and I see, like, how, in crazy and amazing ways, this word plays out. But I chose last year my word to be freedom, and I think you're

(53:34):
hitting on my word. There is just the concept that everyone wants freedom. Everyone, no matter what they do for a living, no matter how much they get paid, what their salary is, what their net worth is, what their family looks like, where they live: they want freedom. So, you know, the company I work for, part of our motto is

(53:55):
freeing people to safely use any technology. So when you look at that, the person who's owning or working with a small business: you know, in South Carolina we have I-85 going from South Carolina to North Carolina. There are men out there building that highway. That's what they do, sun up to sun down.

(54:16):
They may not care about the app they're using out there, and it may not matter to them, but once something happens with that application or that data that affects their freedom, that's when people's attention perks up, you know. So I think, when you can tie cybersecurity and tie privacy to

(54:38):
freedom, and frame it in such a way that this is why this is important to us (we would like to have the freedom to get into the things we need to get into, to wrap up the things we need to wrap up at the end of the day, to walk away from things we don't have, you know, a need or interest in), that's why we care. We want to be free to use that technology.

Sam Gerdt (55:01):
It's a shame that we don't act more proactively when it comes to freedom. We do have that, you know, reaction when it's taken away, but we don't proactively protect it all the time. And I think, you know, businesses (we've been talking about business) and also individuals absolutely have to realize that just the mere

(55:24):
existence of the majority of the technology today, it's not meant necessarily to make you more free. If you want freedom (freedom is your word), then you've got to understand that. Just blindly adopting a technology because maybe it, you

(55:47):
know, maybe it frees up five minutes of your time per day, or maybe it makes you feel like, yeah, you've gained convenience: that doesn't equate to freedom. And in many cases, what we're seeing these days is it actually equates to less freedom, because you are handing over so much information, and that information is being used in ways that are not necessarily easy to recognize.

Kristi Hoffmaster (56:12):
Yeah, that's a great point. Humans, we have data, and data is the commodity, you know. So that is the biggest consideration really, when it comes down to it: we're all interacting with technology and we are the consumer at the end of the day.

Sam Gerdt (56:29):
We saw that with social media. We saw, you know, these platforms that were, you know, free; they didn't cost money. But then you ask: how is this company valued at billions of dollars when their product is free? You know, any thinking person is going to look at that and recognize: I'm the product. I'm the value, not, you know, not my fee,

(56:54):
not my wall.

Kristi Hoffmaster (56:55):
Yeah, there are some great books out there on that, about, you know, the whole phenomenon of us being the product in this data commodity.

Sam Gerdt (57:06):
And we're going to see this again with AI. There are certain things that AI will not be able to do because it's not human. And humans: there's going to be the possibility, there's going to be the attempt, to commoditize humans to augment AI, and it should be the other way around. AI should augment the human experience. And so we need to be

(57:30):
aware of that, I think, and resist it wherever we can in the name of that freedom that you're talking about. Recognizing (and I think this has come up in every interview I've had so far) that AI is a tool and people are where the value is. And there's going to be that attempt to flip it around and say: oh no, people are the tool, AI is where the value is. And for companies especially,

(57:54):
I really want to see this conscientious rejection of that, and I want to see it play out.

Kristi Hoffmaster (58:00):
I think that, you know, as we are trained in cybersecurity, in that field, the best-practice mindset to have is that security needs to enable business. Security doesn't need to get in the way of business. So I think that, full circle to what you're saying:

(58:21):
when we can secure technology, we free up what really needs to happen, which is people being people, people interacting, you know, business operating. So security is often perceived as getting in the way. But really, when security by design is followed in a business,

(58:43):
in operations and in systems development and in assessing it, looking at it, purchasing it, all of that, the whole life cycle, security can enable what needs to happen instead of getting in the way.

Sam Gerdt (58:59):
Absolutely. So, one last question before we wrap this up. Considering all that we've talked about, if you were to sit down with a small to mid-sized business owner (let's say a small business owner) today, what's the next step? We've talked about a lot here, but what's the practical next thing that this person needs to be thinking about?

Kristi Hoffmaster (59:22):
Looking at your resources: what is your budget, what do you have available to invest in an internal resource, an advisory or legal guide, whether that's hiring someone or consulting with or outsourcing? So getting

(59:42):
your arms around: how can I secure what I have, and how can I keep better promises to my customers that I'm going to keep their information safe? Making that investment before you need to is going to pay off, in my opinion. That would be my one next-step guidance. And then, of course, on a technical level: every point in your business where you need

(01:00:08):
to have a credential and a login, secure that with MFA.

Sam Gerdt (01:00:14):
That's a good, practical one, because you can do that today.

Kristi Hoffmaster (01:00:16):
You can do that over the weekend, for sure.

Sam Gerdt (01:00:19):
One more thing: building trust. How do we build trust when we're talking about communicating the value of trust? How do we do that when it comes to data and security?

Kristi Hoffmaster (01:00:36):
That's great. Trust and transparency go hand in hand, so openly stating what you're doing, why you're doing it and how you're doing it is really important. Every business could stand up a page at businessname.domain/trust. Explain what you're doing with your customers' data and why it's

(01:00:59):
important to running the business. From a technical standpoint, software companies can actually produce an SBOM, a software bill of materials, that shows what is in this software product: what vendor products, what libraries, what open source libraries are in this product. That came from an executive order from the US government, I

(01:01:23):
think in 2021, which is asking companies to provide that software bill of materials from any private company, any company that's doing business with the government. But you can do that proactively. If you're a startup creating technology products, make an SBOM; that would be my practical advice as well. That promotes trust and transparency, saying: here's

(01:01:43):
what's in our stuff; here, you can take a look at it and analyze it and see it. And also just managing your vulnerabilities, looking at not just technical vulnerabilities with software, but looking at all your vulnerabilities. How are you managing those? That doesn't have to go on a public trust page, but that's something you can do internally and keep track of.
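
For the SBOM suggestion, here is a minimal sketch of what a hand-written software bill of materials might look like. The structure loosely follows the CycloneDX JSON format, but treat the exact fields and versions as illustrative assumptions; in practice an SBOM is usually generated by tooling from your real dependency manifest.

```python
# Minimal sketch: write a tiny CycloneDX-style SBOM listing the libraries a
# product ships with. Component names and versions are made-up examples.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {"type": "library", "name": "requests", "version": "2.31.0",
         "purl": "pkg:pypi/requests@2.31.0"},
        {"type": "library", "name": "openssl", "version": "3.0.10",
         "purl": "pkg:generic/openssl@3.0.10"},
    ],
}

with open("sbom.json", "w") as f:
    json.dump(sbom, f, indent=2)
print(f"Wrote sbom.json with {len(sbom['components'])} components")
```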

Sam Gerdt (01:02:04):
This has been such a good conversation.

Kristi Hoffmaster (01:02:06):
Thank you, I've enjoyed it.

Sam Gerdt (01:02:09):
I feel like I'm only halfway through, but I'm absolutely going to end it here. We may have to just have you back. There's so much that we could talk about here, but, Kristi, this is an excellent start. I'm glad we got to focus on cybersecurity from an individual level, cybersecurity from that small business perspective,

(01:02:30):
because I feel like these are the people who most need to be paying attention. And I think once we get the ball rolling on this, there's so much more to talk about, maybe on the more technical side, but for right now, this has been excellent.

Kristi Hoffmaster (01:02:47):
Thank you.
Thank you for having me.