Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joshua Schmidt (00:08):
My name is Josh.
I'm the producer of the IT Audit Labs podcast called The Audit.
Today we're going to get to know some of the ITAL members.
We have Scotty Rysdahl, Eric Brown, Bill Harris and Nick Mellem.
I'd like you guys to go around, kind of do a popcorn style, introduce yourselves and maybe give us a little background on what you do at IT Audit Labs.
(00:28):
All right, I'll start.
Scotty Rysdahl (00:31):
Hey, thanks, Josh.
I'm Scotty Rysdahl.
I am a security practice lead, generally kind of a blue teamer, here at IT Audit Labs, and I spend most of my time working directly with clients, managing some security staff and also working hands-on on some security projects for one of our main clients.
In my free time I like to work on free software and keep up on
(00:58):
infosec news, follow other security podcasts, Twitter (X), Reddit, et cetera, you name it. Just love soaking up that infosec news.
So that's me in a nutshell.
Bill Harris (01:10):
All right, this is Bill Harris, and I am focused on really the administrative aspects of cybersecurity.
So I really spend a lot of time drilling down into cybersecurity policy, vulnerability management, technology futures and security assessments in both the private and the public sectors.
Nick Mellem (01:30):
And Nick Mellem here, also a security engineer.
I'd say my main functions or focuses at IT Audit Labs are policies and procedures; compliance from NIST, CIS, CMMC, PCI; and vendor reviews.
My favorite, though, has got to be social engineering.
So whenever that comes up, I'm always down to get
(01:54):
into some social engineering aspects, but then also risk register reviews and creation, things of that nature.
Eric Brown (02:01):
And Eric Brown.
I founded IT Audit Labs in 2018, and have been working with a couple of major clients in a fractional CIO and fractional CISO role, as well as keeping IT Audit Labs running and keeping the great team that we've got
(02:22):
here motivated and working on some cool things.
But probably my most important duty is keeping the coffee filled here at IT Audit Labs.
So tough job, but somebody's got to do it.
Joshua Schmidt (02:32):
I noticed you have a lot of beer in the cooler there, and I'll probably help you with that problem on Wednesday.
Eric Brown (02:38):
Awesome, yeah, gonna move some inventory.
We've got a craft soda place here in Minneapolis where they have all kinds of different flavored sodas, so I picked up some of that too for the cooler. Should be pretty cool.
And I guess in my free time I do a little bit of aviation.
(02:58):
That's probably my current passion, is flying, and I was able to fly a small plane to Deadwood this year for Wild West Hackin' Fest, and that was a fun trip.
I certainly will take anybody up who wants to go.
I think I've offered to Bill, like, hey, let's go up, Bill.
Bill and I go back a number of years, and before I flew
(03:24):
fixed wing, I learned to fly helicopter, and I actually became a helicopter flight instructor.
But I don't think, Bill, we've ever gone up, have we?
There's always been some excuse about why we weren't going to go.
Nick Mellem (03:38):
I don't blame you,
Bill.
Bill Harris (03:39):
I've been up in the helicopter with you.
I'm going to decline any trip in a single-engine plane to any place that starts with the word "dead."
Nick Mellem (03:46):
That's a great
point.
My hands are sweating right now thinking about getting up in a plane that small.
I was in the military, and before we deployed to Afghanistan we had to do training for helicopters and planes.
So they put you in a rotisserie dunker.
You know, have you seen the videos where they throw you into
(04:08):
the water? It's for if you do a crash landing into water of any nature.
And then we also did it for trucks, so they roll you around and they stop, and then you have to, like, undo your seatbelt, fall to the ceiling and then crawl out a turret hole or whatever, just getting you ready for conflict.
And so, Eric, talking about this stuff, I'm getting flashbacks of doing that training.
Joshua Schmidt (04:30):
Now we know why IT Audit Labs is strategically positioned near the airport.
Nick Mellem (04:36):
We can bug out
anytime we want.
Joshua Schmidt (04:38):
Speaking about
preparing for conflict.
No, I just wanted to transition to the Cisco article.
Scott, would you be able to pull that up and share it with us?
I don't have the fancy ad blocker that you do, and I'm embarrassed to... yeah, something like My Little Pony.
Nick Mellem (04:55):
Yeah, bronies.
Eric Brown (04:57):
I was gonna answer that.
Scotty Rysdahl (05:00):
It's getting weird over here.
Feel free, anybody, to jump in. And I guess breach is the wrong word; it's a Cisco vulnerability that has led to a number of security incidents or breaches around the globe.
If you work in infosec long enough, the first few times, the first few years, you kind of start getting on the
(05:21):
vulnerability disclosure train.
Like it's pretty exciting stuff. Like, oh, I can't believe these, you know, million- or billion-dollar IT tools or operating systems have all these problems. Like, why don't they fix them more?
By the time you get five, six, eight, ten years into your career, you're just like, you know, just another vulnerability.
Yeah, no surprise at all.
(05:42):
But sometimes there are those ones that stand out a lot, and this is one of those.
I think the Exchange vulnerabilities from a couple of years ago, or maybe just last year, time flies, was another one.
But it's the situations where, you know, a well-respected, well-regarded, widely deployed product or technology has a
(06:03):
very fundamental flaw and then very quickly gets weaponized by whoever, you know, a nation-state, people with political or economic goals.
Almost overnight it just sort of goes around the world, and all of a sudden everyone's scrambling to catch up, and this is one of those.
So at a really high level, it affects a lot of Cisco devices
(06:23):
that run a version of their firmware, or their operating system, that's just called IOS XE.
XE is just kind of a newer iteration of IOS, which has been Cisco's core device operating system for a long time.
What's also cool about this one is it's really easy to explain, and for kind of non-technical people to
(06:44):
understand, because really all it takes is for the web page that is used by administrators to configure a Cisco device to be open and exposed to the internet, or to a network that is otherwise accessible by bad people.
It doesn't require any special access, it doesn't require any credentials.
So they say it's, you know, remote, unauthenticated.
(07:07):
Really, anybody who knows about it and has just enough technical aptitude to know where to look, or has read online postings about how to weaponize it, can go in and completely take over a device.
It lets them create a new level 15 account, which is kind of Cisco-speak for a super admin account on the device, and the device is theirs.
(07:28):
And actually, one of the clients that I work for had this happen, and it took a little back and forth with some people who manage our network to recognize that, yes, indeed, this was our device, and yes, indeed, it had been compromised.
So just last week I got to deal with this firsthand, and it's an interesting feeling knowing that somebody else owns
(07:49):
one of your very core systems.
In our case it was a core router, one of our two border routers, which is about as important a system as you can have.
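For listeners following along at home, one quick triage step on a suspect device is simply listing local accounts and looking for privilege-15 users nobody on your team created. This is a hedged sketch in Cisco CLI; the example account name is illustrative, drawn from public reporting on this campaign rather than from anything discussed in this episode:

```text
! List every locally configured account on the device.
show running-config | include username
! Anything like the following, if your team didn't create it, is a red flag.
! Attacker-created accounts reportedly used plausible-sounding names such as
! "cisco_tac_admin", so an official-looking name proves nothing.
username cisco_tac_admin privilege 15 secret 5 <hash>
```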
Joshua Schmidt (07:58):
What Cisco products are out there? Like, what devices would this affect?
Eric Brown (08:03):
Isn't it any Cisco device that has web management, that can be managed over their web interface? So routers, switches.
Bill Harris (08:15):
That's my understanding, yeah. Anything that's exposed to the public internet over HTTP or HTTPS for that management GUI appears to be what's impacted here.
So the workaround, as I understand it, is that you would disable that. There's not currently a patch for this that Cisco has released, unless I'm mistaken, in which
(08:37):
case, let me know, but I thought the only way around this is to shut down that public access to that GUI.
Scotty Rysdahl (08:43):
That's all totally tracking with my understanding of the situation, and who knows when this episode will air, so hopefully by then Cisco will have released a patch to correct it.
You can see on the screen here, whenever this was posted, I think a few days ago, there were tens of thousands of compromised devices that security researchers were already finding out there, and of course, when the attackers
(09:05):
completely take over the device, they can lock out the good guys.
So, as in our case, we had to send a network guy out to the data centers with a thumb drive, essentially, to physically re-image these things to a known good state.
So we totally reinstalled the operating system, totally reapplied the configuration, and then
(09:28):
of course turned off the web UI, because it's still not patchable.
Eric Brown (09:38):
Presumably, when they got access to your border router, they were able to get information about the network.
Do you know what other information they might have been able to get from that device?
Scotty Rysdahl (09:54):
Yeah, so the first thing that we just assumed was that any passwords, the login passwords or, you know, the "enable," make-me-a-superuser-after-I-log-in password, had been compromised. Not only because they're stored using somewhat weak encryption on some of these devices, but also because our admins, trying to figure out if the devices were affected, tried to log in after
(10:16):
the device was likely compromised.
So yeah, as far as other sensitive information, sort of luckily these border routers by their nature are sort of outside the rest of the network.
They sit between one's internet service provider, the upstream service provider, and your own security perimeter
(10:38):
.
So really, they don't have a lot of privileged access or privileged information about the network.
The most that they would have is, you know, fairly high-level kind of summary routes between the on-prem network space and the internet or the upstream network.
So that in itself isn't necessarily too damaging, but
(11:00):
it's still kind of that feeling like somebody came into your house while you were gone and looked through your drawers, you know.
Nick Mellem (11:08):
Scott (maybe you mentioned it and I missed it), how did you become aware that you guys were compromised by this?
Scotty Rysdahl (11:14):
Yeah, yeah, it's a good question.
So Cisco released a security advisory a few days after the disclosure, and along with saying, hey, just turn off the web service, this is the problem, there's no patch, they also gave you a little kind of customized curl command, so a Linux command-line web request tool, that can go out
(11:35):
and look for a specific fingerprint, basically, of the implant that threat actors were using after they had compromised a device.
So they compromised the device with the vulnerability, then they load into memory this little sort of backdoor, and so you could use this curl command that Cisco provides to just test and see if it had been compromised, and then, based on
(11:56):
the output it spits out, you just know.
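For reference, that advisory check can be approximated in a few lines of Python. This is a hedged sketch: the endpoint path follows the widely reported version of Cisco's curl check (verify it against the current advisory before relying on it), `device_ip` is a placeholder you supply, and the published indicator of compromise was a short hexadecimal string in the response.

```python
# Hedged sketch of the implant check Scott describes. The endpoint path is
# taken from public reporting on Cisco's advisory; verify it yourself.
import ssl
import urllib.request

def implant_check_url(device_ip: str) -> str:
    # POSTing to this path on a compromised device reportedly returned a
    # short hex string from the in-memory implant.
    return f"https://{device_ip}/webui/logoutconfirm.html?logon_hash=1"

def probe(device_ip: str, timeout: float = 5.0) -> str:
    # Management UIs commonly use self-signed certs, so skip verification
    # (the curl equivalent of -k).
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    req = urllib.request.Request(implant_check_url(device_ip), method="POST")
    with urllib.request.urlopen(req, timeout=timeout, context=ctx) as resp:
        return resp.read().decode(errors="replace").strip()
```

An empty or error response suggests no implant; a hex string back means you likely have the same cleanup ahead of you that Scott describes.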
Bill Harris (11:58):
Should most organizations even have their Cisco devices presenting a management GUI to the internet, or should that be more on the private network?
Scotty Rysdahl (12:11):
This is why, Bill, this is why you make the big bucks.
Obvious question, maybe, but maybe there's a good reason? I don't know. There isn't.
You know, the usual answer is, "Well, when we set it up, dot dot dot," insert historical excuse here.
So it's just a matter of things being set up to work, and once they work, you don't take away unnecessary access or turn off
(12:33):
unneeded features.
And that seems like that's what happened here.
One other interesting wrinkle to this whole thing is that, as security researchers were scanning to track how many were still exposed day to day after this thing got announced, they noticed (I think it's on the screen here) a huge drop, so from like 50 or 60,000 down to like
(12:54):
under 10,000 devices that had the implant.
And so it turned out, I think somebody discovered (I don't know if it's in this article) that the bad guys, they believe, basically went in and patched their implant, their backdoor, to make it not appear that the device had been compromised.
So the bad guys are even coming in and doing quality control after
(13:14):
the fact on their compromised infrastructure.
Nick Mellem (13:17):
Did they say what group it was?
Yeah, I don't recall the name, but they did provide, I think, two years of social security protection, you know, for anybody's data that was leaked.
Scotty Rysdahl (13:32):
I wonder how social security numbers would have been exposed as a result of this.
It's very much a nuts-and-bolts, you know, core networking vulnerability, not like a CRM product or anything that would directly hold that information.
Eric Brown (13:47):
Bill, on these, right, we'll get pulled into these things from time to time, in the administrative aftermath of something like this, where an organization recognizes that a breach occurred, and then it's like, okay, now what, right?
What are the downstream ramifications? How bad is it? How is our user base or our customers impacted?
(14:10):
So, as you look at this particular breach, and the aftermath of, say, a border router being impacted, quickly found and then cleaned up, what are some of the things that you would be looking at from an organizational standpoint,
(14:31):
where they're looking to do breach notification, or potentially bring in their insurance and their breach coach?
What are some of the things that you'd be thinking about in advising some of those customers?
Bill Harris (14:46):
So for this one, I mean, for a zero-day exploit like this, it really is all about response, right? You're not going to necessarily get ahead of it in terms of prevention, so it's about getting the quick response from the vendor to understand they found the zero-day exploit, and making sure that you are on that response list.
(15:07):
And then the other thing that you could potentially do, in the aftermath of this, is just make sure that your IDS and your IPS are really up to snuff, right?
Some of these events can be detected as they're in progress, because they detect unusual activity.
Especially in the days of artificial intelligence, some of the intrusion detection systems are really
(15:29):
pretty smart, so they'll detect that unusual activity and they'll flag it for human intervention.
Eric Brown (15:35):
It's interesting, too, where some of these devices that are getting hit could sit, right?
If they're sitting outside of maybe that IDS, that IPS, that firewall that may be performing those services, then unless you are getting some of those feeds back into your SIEM, you might not really be aware of what's happening on that device.
(15:59):
So it is particularly interesting that the malicious actor went ahead and patched it, just in case.
A couple days later people are reading the article and saying, oh, we should look and see if we're vulnerable. No, we're not, we're good.
And then they don't even check; they open the door back up.
Bill Harris (16:16):
Yeah, and this is something you're going to see now in every pen test that comes out, right?
From this point forward, all the pen testers are going to be looking for this exploit.
They're going to add this to their list, because it's accessible from the internet if they've got that port open.
Eric Brown (16:33):
And I think, for what Josh had asked about, you know, what can you do to prevent this? One of the things that's top of mind, and this is probably one of a few, is, as you're doing your monthly scans, maybe you have a service, maybe you're working with a
(16:55):
government agency, or you're at a government agency and they have access to CISA's free scans.
If not, maybe you've got some form of a service that's looking at all of your exposed IP addresses and then running, essentially, a vulnerability assessment against those exposed
(17:17):
IP addresses. This would have come up as a vulnerability once the vulnerability was known, but it would also come up as an exposed IP address to the internet and give you something to look at, as in, well, do you really want this exposed?
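The kind of external-exposure check Eric describes can be sketched very simply: given a list of your public IPs, flag any that answer on common management ports. This is a hedged sketch only; a real scanning service (or CISA's free scans) goes far beyond a TCP connect test, and the port list here is illustrative.

```python
# Minimal external-exposure check: which of these ports accept a TCP
# connection on a given address? A real vulnerability scan does far more.
import socket

MANAGEMENT_PORTS = (80, 443, 8443)  # illustrative web-management ports

def exposed_ports(ip, ports=MANAGEMENT_PORTS, timeout=2.0):
    """Return the subset of `ports` that accept a TCP connection on `ip`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((ip, port)) == 0:
                open_ports.append(port)
    return open_ports
```

Run something like this monthly against every internet-facing address and review anything that answers: do you really want that exposed?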
Scotty Rysdahl (17:35):
Maybe it's not a great idea to leave potentially privileged administrative portals or tools sitting open to the entire internet on important devices.
Turning them off is great, but let's say you have to keep this on for legitimate administrative business.
Most network and security devices allow you to have kind
(17:56):
of like a trusted set of IPs that you can do management things from.
So even if the web UI has to be open to the internet, you can restrict just a set of known good organizational IPs who can even open it, and that would have shut this thing down before it even got started.
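On Cisco gear specifically, both options Scott mentions look roughly like this. A hedged sketch: exact syntax varies by IOS XE version, and the ACL name and address range are placeholders.

```text
! Option 1: disable the web UI entirely (the mitigation Bill described).
no ip http server
no ip http secure-server
! Option 2: if the UI must stay on, restrict it to trusted management IPs.
ip access-list standard MGMT-ALLOW
 permit 203.0.113.0 0.0.0.255
ip http access-class ipv4 MGMT-ALLOW
```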
Joshua Schmidt (18:13):
So is this a
matter of just kind of a blind
spot in the security protocolsof Cisco, or would this be
something that should have beentaken care of?
Or is this just another one ofthose inevitable casualties of
the ongoing cyber war?
Bill Harris (18:31):
I think for a lot of organizations, this was preventable in terms of any damage caused.
Right, a lot of organizations should not have had this port open to the whole world on their edge routers, or whatever devices they have outside their network.
It's usually not necessary.
However, this is a zero-day exploit, so there was little that they could
(18:52):
really do to know that the vulnerability was there in the first place.
Joshua Schmidt (18:57):
Are you guys
familiar with Akira?
Is this kind of a bad boy on the block, so to speak? Or any insights into this breach?
Nick Mellem (19:06):
I did read through the article on this one, for sure.
They were talking about data that was siphoned off, and they are also providing protection to anybody that was in the fallout.
I believe it shows, yeah, right there.
It says right underneath: any personal data, personal
(19:29):
information that was stolen in September.
Oh yeah, so there you go. There are full names, date of birth, social security, healthcare information or health information.
So yeah, all the juicy information was stolen.
You definitely want to be locking social security numbers, if you don't already do that, on all the major credit bureaus.
(19:51):
Get on there. If you don't have an account, they're free to make.
That would be critical, I think, right now, to do that.
Joshua Schmidt (19:58):
Just for anybody?
Nick Mellem (19:58):
You should be doing
that as just general practice
too.
Joshua Schmidt (20:01):
So it looks like
they got super transparent
right away.
Was that to help kind of mitigate the damage, solve the problem?
Scotty Rysdahl (20:07):
I would guess some of it is led by disclosure requirements for certain industries.
Oftentimes now, it's just the law that organizations have to disclose in a reasonable amount of time when things like this happen, and maybe doubly so if they're hoping that their cyber insurance will pay out. There's
(20:29):
usually kind of a prescribed process that organizations should follow, and responsible disclosure is typically somewhere in there.
Eric Brown (20:36):
And they're all going to be working under the guidance of a breach coach and a legal team that's going to vet all of the communication that goes out, publicly and internally, once this happens.
This is critical infrastructure, and there are going to be quite a
(20:57):
few legal teams involved.
Nick Mellem (21:00):
I know it does mention somewhere in the article here, too, about deploying all kinds of new EDR and AV tools, and you can see here they're decommissioning legacy systems.
With one of the other clients I work with, that's been a big pinch point, legacy systems, and we just had Server 2012 go out of service.
This is also why it's so important
(21:27):
to do that security scanning and penetration testing, to find these kinds of systems if you're not aware of them.
They haven't said so anywhere, but those legacy systems potentially could have been the way in.
So doing those scans every month, or weekly, is one of the best things you can do to protect yourself, especially for something like this.
Eric Brown (21:46):
So this would be in their SCADA environment, and you would typically see a separation between the SCADA environment and the internal enterprise network. That SCADA environment would be really firewalled off or completely
(22:07):
isolated, so that it makes it really difficult to hop between those networks.
Which, if you remember the Stuxnet situation with the Iranian centrifuges, essentially something was taken off of an internal network, and then that worm was
(22:31):
able to move to that private network because somebody, I believe it was a USB drive that they moved between the two.
You'd have similar guardrails in this type of SCADA environment, I would think, so it'd be interesting to see how this happened.
It could be, and I'm not up to speed on this
(22:56):
particular breach, it could be that their enterprise network was the one that was breached, and not the SCADA environment that is really running the operations of the energy generation.
All of this information, now that I'm kind of thinking about it, came from their enterprise network, and the malicious actor probably did not
(23:19):
get to that SCADA environment.
Joshua Schmidt (23:21):
How does this differ from the last article we talked about?
When it's dealing with critical infrastructure, is it kind of like they're looking for anyone that can help in this situation to provide assistance?
Are they tagging specific private companies like IT Audit Labs? Or how does that work, as far as the recovery process
(23:42):
responding to this sort of thing?
Scotty Rysdahl (23:43):
Yeah, it's pretty common these days, especially in high-profile things like this where somebody really gets taken out, to bring in a third-party incident response company like Mandiant, or whatever they're called now.
Because on the one hand, it's not really economical
(24:04):
or common for a lot of companies to keep that level of incident response talent in-house.
But also, if a company does have the breach insurance, oftentimes they'll mandate that a third party come in and handle the response.
Eric Brown (24:19):
Typically what will happen, Josh, to echo what Scott said, is that their insurance company is going to really drive what happens, and there'll be a pre-vetted list of companies that the insurance company and the breach coach work with.
The entity that is looking for services will leverage that list
(24:44):
of service providers that already have a negotiated agreement with them, and with the insurance company, to provide the forensics work that's going to happen.
It's just a well-coordinated system of events, but it's usually under the direction of
(25:09):
that breach coach when you're talking about something at this scale.
A smaller organization, if they don't have insurance or they're trying to solve the problem on their own, may go and reach out to an organization that they find or that they've heard
(25:30):
about to help them.
But it's much more structured in larger organizations. There would have been, more than likely, playbooks and drills that have happened throughout the year, hopefully, for when something like this happens: here are the people that we're going to call, and going through
(25:52):
a variety of scenarios of what happens if we're completely locked out of our systems.
How do we get the contact information to know who to call, or who to bring in from the internal team?
So it can be a well-structured operation that organizations
(26:17):
would run tabletop exercises on to make sure that it's orchestrated properly, especially something as critical as a power utility.
Scotty Rysdahl (26:30):
It's really difficult in 2023 to even get cyber insurance if you don't have MFA on just about everything, if you don't have a good password policy that's enforced, if you don't have some reasonably current, if not cutting-edge, endpoint detection and response tool.
If they really were lacking all these things on some or all of
(26:52):
their systems, they were in the dark ages.
Not to victim-shame, but these are just table stakes for even being connected to the internet with your business, and it seems like maybe they didn't have a lot of that stuff yet.
So I'm sure they won't waste a good crisis, and they'll make a quantum leap here.
Nick Mellem (27:14):
It looks like the
initial attack happened through
stolen VPN credentials from a third-party contractor.
Scotty Rysdahl (27:22):
I saw that. With presumably no multi-factor on that, they were in. Probably pretty easy.
Nick Mellem (27:32):
Could have been a
social engineering event.
Eric Brown (27:33):
It's like the Target breach all over again.
Bill Harris (27:36):
What's really interesting, too, about this one is there's a double whammy here.
We often hear about ransomware attacks that just encrypt your data, and then you pay them and they give you the encryption key.
This did that, but it also got the double whammy in that they stole the data at the same time.
So now the attackers have the data outside of the network, and they've encrypted the data.
What we're seeing from ransomware actors increasingly
(27:57):
is this double whammy, where they will encrypt your data so that you can't get to it, and then they will extort you.
And if you don't pay the ransom, or a second ransom, then they'll release your data to the public, which would be devastating for the type of information that they
(28:18):
stole here.
Scotty Rysdahl (28:18):
So what was stolen? You can see it on the bottom of the screen there.
Some companies may still sort of live in that space where they don't think they're really a worthwhile target.
What does an energy company have to steal? Maybe you can't siphon credit card numbers directly off of them, or whatever. But even just SSNs, things that can be used for identity theft, that's valuable data, and what company
(28:41):
doesn't have that information about their employees at least, or their customers too?
So really, there are fewer and fewer companies that can consider themselves not a worthwhile target.
Joshua Schmidt (28:52):
And just for our audio-only listeners, the data that was stolen was full name, date of birth, social security numbers and health information.
Have you, Bill, worked with a breach coach before, or anyone on the podcast today?
Have you been in on the ground level of these kinds of situations?
Eric Brown (29:12):
I unfortunately have, a couple of times, with some customers where we were brought in to help steer out of a situation that had occurred.
Yeah, it's a long process.
In the breach I was involved with, personally identifiable
(29:38):
information was stolen, or could have been stolen, from tens of thousands of people.
That was the data set that was taken, so we have to assume that tens of thousands were exposed.
Whether or not the malicious actors actually did anything
(30:01):
with it, we have no way of knowing.
So that's where you go through the process of using forensics organizations to assist in actually determining what was stolen, or what was possibly stolen, and then going through notifications, like this organization had done. I
(30:27):
think this organization was using Experian to provide credit monitoring for a number of years. But essentially going through that work, and then the aftermath, the internal cleanup.
The one that I was involved with, I think it took the better part of a year and a substantial
(30:47):
amount of money, not just in the cleanup efforts, but then the ongoing changes internally to policy, the administrative controls that were put in place, technical controls that were put in place.
Even four years later, the organization has that memory of
(31:12):
what occurred, and it takes a while to heal from that, if ever.
But as people change roles and leave the organization, and new people come in, the event tends to dim, and you see maybe ways of going back to
(31:35):
things that were happening before, that caused the sloppy behavior that allowed the breach to occur in the first place.
So it is really important to not dwell on what happened, but make sure that the changes that were put in place for a good reason stay in place, even though sometimes they're
(32:00):
uncomfortable to the business process.
They're there for a good reason, and continually reminding the organization and talking about it openly internally, I think, is really what helps organizations maintain a good security posture.
It's frustrating when you look at companies like T-Mobile, which
(32:23):
seems to experience a major breach every year, where they don't seem to apply those lessons learned and hundreds of thousands of people are impacted, and to me that's pretty frustrating.
Joshua Schmidt (32:41):
So it sounds like the key to any good relationship is communicating, even in the cybersecurity world.
Well, let's lighten it up a little bit.
I thought this next article was interesting. Scott, do you mind pulling it up?
Eric Brown (32:53):
Is this another one
about Nick's cat?
Joshua Schmidt (32:55):
Unfortunately, this one's about AI.
This is our last news article, and then we'll move on to some get-to-know-you stuff and try to wrap up at 60 minutes.
I thought this was an interesting tool, and I was wondering if you guys have come across it.
It's called Nudge, and it helps you discover who in your company
(33:20):
is using AI tools, and get alerts when new AI tools are introduced.
What are your thoughts on this? Is it something that businesses should be using?
Is this something that you could see yourselves using?
I think it was kind of interesting.
We've been talking a lot about AI lately and how that's impacting our workflow, our businesses and the security of our information, so I thought this was an interesting tool to
(33:42):
bring up to discuss.
Nick Mellem (33:44):
I'm not familiar with it.
It seems like a neat tool, to be able to kind of go through and show you what you have in the network.
The first thing I think of when I'm looking through this is, with how quickly this is moving, organizations probably need to stop and figure out what their stance is, figure out their standard of how they're going to
(34:04):
move forward, how they want to implement AI and what tools they want to use, before they're going to use a tool like this, and then implement something like this, and then move forward.
But I think right now we're kind of letting it blow in the wind.
What is the organization's stance? How do you want to move forward with AI? Figure that out before you start doing that.
But this could be a really interesting tool to help with
(34:25):
that.
Joshua Schmidt (34:26):
Yeah, we're
already using AI on the podcast
in some way, shape or form.
There are a lot of AI tools to come up with descriptions for YouTube videos, for example, or suggest titles and things like that, so it's already being used just for this podcast.
Nick Mellem (34:43):
We've all used ChatGPT, yeah.
Eric Brown (34:48):
When it comes to this thing, I'm on the other end, because trying to contain the use of AI is like trying to hold water in your hands.
It's just not going to work.
This tool, it's like herding cats. Yeah, exactly, Nick, like herding cats.
In my opinion, this is a waste of time and money to buy and to roll
(35:13):
out, because it might tell you some interesting stuff, but it's probably more designed to be interesting to buy than it is interesting to use.
But what are you going to do? So what? Who cares if somebody is using one of 10,000 AI tools
(35:33):
that are out there?
AI is built into Microsoft products now. Certainly there's ChatGPT and all of those things that we know about.
Well, maybe you're not going to use it on your network, but I can pull it up on my phone with a cellular connection to the internet.
And what's the point of spending any amount of time or energy
(35:58):
implementing tools like this to catch something that you can do nothing about?
My opinion is that the time is better spent, like Nick was talking about, having a plan organizationally around what it is that you're trying to protect or prevent.
Are you trying to protect a certain amount of information on
(36:21):
your users or your customers? Well, that's valid, and that's worth a conversation.
So understanding how you're going to protect that data is important.
Or are you, say, a company that's generating content, either visual content, audio content or written content?
(36:45):
Then you do need to have some ideas about how your employees are generating content, either naturally or with AI, because if they're using AI, there could be some downstream ramifications if they're claiming that they authored it but they were really using AI.
(37:06):
So those are the more important conversations, again, in my opinion, to have, rather than detecting something that somebody in your organization might be using while they're on your network, using your property, when they could easily just do it with their own property over a
(37:28):
cellular network.
Scotty Rysdahl (37:29):
That's one thing we've seen a lot of: kind of the initial wave of blowback against last year's AI revolution is lawsuits, like you hinted at, Eric. All these AIs are trained on training data. That's how they work, and a lot of that training data is pulled directly off the internet, or in various kind of underhanded
(37:51):
methods that you might call scraping of one kind or another, so the AI can produce something that seems novel and new and interesting and informed. But really it's just... I remember the Bass-O-Matic from SNL back in the day with John Belushi, where he puts the fish in the blender and you get out this sort of uniform fish gel.
(38:13):
It doesn't come from nowhere. It's not really an intelligent thing creating output. It's just doing a whole lot of computation and analysis and machine-learning algorithm work behind the scenes and spitting out the Bass-O-Matic version of written content or songs or
(38:35):
whatever. So I think the legal exposure is something companies really need to keep in mind, because this is uncharted territory and there isn't legal precedent yet for a lot of this stuff. If you just jump in with both feet, without having kind of an organizational approach to it, like Nick said, you could end up in some hot water.
Bill Harris (38:56):
You know, I was in an ISC2 conference this week, and they spent a lot of time talking about AI. One of the really interesting aspects of this is that this is a great example of a product that's taking an early swing at a problem that is so new, Scott, to your point.
(39:17):
Right now the White House and Congress are trying to get some legislation in the United States to help inform some decisions and put some guardrails around AI, whereas in Europe they take a much, much more prescriptive approach, right, with some of the GDPR laws you have in Europe, very, very prescriptive. But here in the
(39:39):
United States it's more descriptive, and there's probably not going to be a really heavy-handed "you have to do things specifically this way" from the government. So I think tools like this will proliferate, because people will be trying to follow some rough guidelines.
(39:59):
So I think you're going to see more of this. But I do kind of wonder if some of these things could also be solved with, like, a DNS or a URL filter, to see what people are going through the corporate network to hit some of these generative AI sites. Snake oil.
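Bill's DNS-filter idea can be sketched in a few lines. This is only an illustration, not how Nudge or any particular product works; the log format, domain list and function name below are all assumptions for the sake of the example:

```python
# Minimal sketch: flag DNS queries to known generative-AI domains
# from a resolver log. The watched-domain list and the log format
# here are illustrative assumptions, not from any specific product.
AI_DOMAINS = {"chat.openai.com", "openai.com", "bard.google.com",
              "claude.ai", "midjourney.com"}

def flag_ai_queries(log_lines):
    """Yield (client_ip, domain) for each query that hits a watched domain."""
    for line in log_lines:
        # Assumed log format: "<timestamp> <client-ip> query: <domain>"
        parts = line.split()
        if len(parts) >= 4 and parts[2] == "query:":
            client, domain = parts[1], parts[3].rstrip(".")
            # Match the domain itself or any of its subdomains.
            if domain in AI_DOMAINS or any(domain.endswith("." + d) for d in AI_DOMAINS):
                yield client, domain

log = ["2023-12-01T10:00 10.0.0.5 query: chat.openai.com.",
       "2023-12-01T10:01 10.0.0.9 query: example.org."]
print(list(flag_ai_queries(log)))  # [('10.0.0.5', 'chat.openai.com')]
```

As Eric points out next, a filter like this only sees traffic on the corporate network; it says nothing about what people do on their own devices.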
Nick Mellem (40:16):
I don't know if this pertains so much, but on the side I do a lot of photography, and AI is huge there. The reason I bring it up is that one of my favorite companies, Leica, released a new camera yesterday, and the reason why it's cool is, if you're familiar with the technology, the Coalition for Content, something, and Authenticity, I can't
(40:40):
remember what it's called, something of that nature, but it's basically the standard now of what's real and what's not. So in the camera, when you take a picture, you're able to put in the metadata your, basically, signature, which can't be altered, and it's a live track of what's happening. So, let's say, you take this photo, you put it on the internet and you edit the picture.
(41:02):
Somehow it shows the edits you've done. So if you sell the picture or move the picture, you can see: okay, this was Eric's picture, he took it here, at this time, in this place, and he made these changes to it. I think it's a cool technology, whether it's directly related to cybersecurity or not. It's kind of that battle against AI, moving in parallel with different technologies to combat what's real and what's not, you know, what's a phishing email and what's not. So they kind of are a parallel technology. I did think it was neat that they came out with this to combat AI, as it's so prevalent in our industry.
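The tamper-evident metadata Nick describes can be illustrated with a toy sketch. This is not the actual standard's format; it only shows the core idea of signing the pixel data together with an edit log, so that any later change to either one breaks verification. A symmetric HMAC stands in here for the asymmetric signing key a real camera would use:

```python
import hmac
import hashlib
import json

# Toy stand-in for the camera's signing key; real systems use
# asymmetric keys embedded in the camera's hardware.
CAMERA_KEY = b"example-camera-key"

def sign(pixels: bytes, edit_log: list) -> str:
    """Sign the image bytes together with its edit history."""
    payload = pixels + json.dumps(edit_log, sort_keys=True).encode()
    return hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()

def verify(pixels: bytes, edit_log: list, signature: str) -> bool:
    """True only if neither the pixels nor the edit log were altered."""
    return hmac.compare_digest(sign(pixels, edit_log), signature)

photo = b"\x89PNG...raw pixel data..."
edits = [{"op": "crop", "by": "Eric"}]
sig = sign(photo, edits)

print(verify(photo, edits, sig))         # True: untouched
print(verify(photo + b"x", edits, sig))  # False: pixels altered
```

As Scott notes next, this kind of signature travels with the file's metadata, so it does nothing against someone who simply screenshots the image and discards the metadata entirely.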
Scotty Rysdahl (41:33):
Yeah, I wonder, Nick, how that would work with a, you know, a scraping and sort of generative AI system that doesn't care about your watermark. You know, certainly it might be hard to get that through the filter of the AI at the end of the pipeline.
Nick Mellem (41:53):
But I guess, I don't know, you're right. I don't know enough about it as well, and this is the first I was hearing about it, yesterday. But you know, you're always going to have the issue of somebody just taking, like, a screenshot of the picture or something and using it that way. I think this is just an initial, you know, I don't want to say throw it at the wall and see what sticks, but it's an initial swath of something that can help creators
(42:16):
or whatnot. But yeah, real interesting point you made. There are going to be a lot of different angles that we're not going to be able to protect, and you know, that's kind of the industry we work in as well. It's 360 degrees and we're firing in all directions.
Joshua Schmidt (42:30):
One thing I wanted to touch on today, if we don't get to all the fun stuff, is this: if someone could quickly explain, you know, what is an audit, what is a security audit? That might be a bigger conversation down the road, but for somebody that has maybe got a multi-million-dollar business that's starting to think a lot harder about cybersecurity after listening to our podcast, what does that look like for them?
(42:52):
Can you walk us through the process quickly, give us any insights as to what you guys do and why that's valuable?
Eric Brown (43:01):
I could take a first crack at it. So an audit would be just a high-level assessment by a third party of the current state of your environment, so a point-in-time evaluation of your environment by someone qualified to conduct that assessment.
(43:29):
The outcome would be, let's say it's a security audit. To set that up, you would go through with the person conducting the audit what it was that they were looking to discover, and then you'd get a report at the end that would tell you the outcomes of their tests against your environment.
(43:53):
It could be something like physical controls, where you're trying to test how hard or easy it is to get into your building after hours, for example. So did your security company alert you when the auditor was attempting to get in through your front door? Could they bypass your side door by cloning an employee
(44:14):
badge, or, Nick, putting on the UPS uniform and carrying a package in? Could they get into your server room? Could they get somewhere where they could install something in a network jack? These are kind of more of a physical audit. Other security-related audits could be something where someone is trying to impersonate a remote actor, that is, without
(44:39):
being physically in your environment. How far can they get just by using information gained from social media, open-source intelligence gathering and what you look like on the internet? You know, as Scott alluded to earlier, we were talking about that Cisco breach.
(44:59):
So a scan against your network: oh, there's a vulnerability there. Now what can we do from that entry point? Different types of audits based on what you need. There are some regulatory audits, like a PCI audit, if you're handling credit card information and the credit
(45:20):
card processors and merchant banks want to know that you're practicing due care and due diligence around how you're handling credit card data. Are you writing down the credit card data on a sticky note and then entering it into the system later, or do you have, you know, really good practices around how you take credit cards,
(45:41):
either card-present or card-not-present, and are you adhering to the industry standards? I'm probably going on a long ramble about all of the different types of audits, but at a high level, I think that would give a good synopsis. And, you know, certainly, Nick, Bill, Scott, anything else to add there?
Nick Mellem (46:00):
I think for me, the point that I want to bring up is people aren't doing it enough. We're not testing social engineering enough, and it's so huge, from dropping a thumb drive in a parking lot, you know, to, like Eric said, dressing up as a UPS driver and, you know, trying to infiltrate any place of business. I know one we've done before was dressing up as a fire marshal.
(46:23):
You know, you walk in with the fire chief shirt, you know, style on the hat and jeans and a clipboard. You know, as soon as you grab a clipboard, it's pretty much, you know, open season everywhere, because people aren't going to stop you. But, as I digress, the point of bringing it up is, you know, we spent so many years trying to combat phishing emails and teaching our, you know, co-workers and, you know, anybody in our
(46:46):
organization what is a phishing email and how to, you know, combat that. Now we're seeing we're probably not putting enough emphasis on social engineering. What's happening when they're in the lunchroom, or they're letting somebody tailgate them into a building? That's something we're probably not talking about
(47:07):
enough, I think. And when we're doing these exercises, that's what we're seeing. You know, we have a pretty good success rate, and I think, you know, starting the conversation from just the phishing side to the physical side is something we need to start doing.
Bill Harris (47:17):
Yeah, that makes a lot of sense to me. I think phishing is a huge one, as you point out, Nick. The other thing I see a lot of times with audits is that it's really difficult for people to identify what the gaps are, right? So a lot of audits are driven by interviews, and they're driven by access to the information within the organization. It's generally pretty technical in nature.
(47:40):
So I think IT organizations need to do, and are starting to do, an increasingly better job of minimizing the burden on that customer to provide that information, because we have to get in there and really help them out.
(48:00):
They may not have the expertise to supply all those answers for us.
Scotty Rysdahl (48:00):
One thing about IT audits in 2023 that's different maybe from a decade or two ago is that there's some really, really good frameworks out there now that are essentially free or have a nominal cost, that really outline the process in a structured and data-driven way. In 2003, cybersecurity as an industry was very much the Wild
(48:24):
West, and you had just very different people doing very different types of work, and it's really matured quite a bit. So now we have these great blueprints and battle plans to use when we go into organizations, developed by the government or by industry or by academia, and it doesn't have to be a best-effort thing so much anymore.
(48:45):
There's really good systems that practitioners like us can use to do a deep and thorough evaluation.
Eric Brown (48:54):
And where our claim to fame is, so to speak, is where not only can we help on that audit side of understanding where a customer might want to be, so helping them create that roadmap. A lot of times you'll talk to a customer and they've got audit fatigue, because they've been through a number of audits in a short period of time, and each audit ends with a stack of
(49:17):
papers. And I always equate it to: you're driving a car, you take your car to the mechanic shop, and they say, well, here's all the things wrong with it, and they hand you a printout. It's the same thing that happens in the cyber world. You get your printout of everything that's wrong, and typically then the person who conducted the audit, they go off
(49:39):
and they're on to conducting another assessment the following week, and you're left to take the piece of paper and figure out what to do with it. Where we really found a niche was that not only would we give them the piece of paper that told them what was wrong, but we'd also help them build a plan to correct it, and oftentimes
(50:01):
that was with either staff augmentation or coming in to run a part of their security program, to really take the corrective actions and the strategy and build the plan so that they really had a roadmap to get better. And, equated back
(50:21):
to the car example, we're going to be the mechanics there that help them get that car back on the road and driving the way they would want it to.
Nick Mellem (50:30):
We want to be in the trenches with you.
Joshua Schmidt (50:32):
That's what I'm looking for. Yeah, all right, guys. Well, we had Scott Rysdahl, Eric Brown, Bill Harris and Nick Mellum on today from IT Audit Labs. I'm Joshua Schmidt, the producer and co-host of the Audit. Thanks so much for joining us today. This is the end of the 2023 calendar year for the Audit, so
(50:52):
we'll see you next year with some new content, new topics and the same old cast and crew here. So thanks for joining us today. Subscribe and follow. You can find us on YouTube. We also have LinkedIn, Instagram and Facebook as well, so check us out, and we hope to see you next year.
Eric Brown (51:10):
Josh, one thing for next year: you think we can get Bill to grow a beard?
Joshua Schmidt (51:16):
Probably have a lot better chance than I do. I have the facial hair of a 12-year-old. I've never been able to do that. It looks pretty gross when I try to go for the beard, I'm not going to lie. I do do the Captain Morgan 'stache, though.
Nick Mellem (51:34):
I'm strong.