
January 23, 2023 36 mins

Breaches, phishing, attacker programming, and more, in this week of The Audit. Tales from the Trenches will talk about several scenarios our hosts have experienced with fraudulent situations, as well as ways these hackers implement their tactics. Tune in to The Audit today to hear more! #cybersecurity #protection #itauditlabs #theaudit 

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Eric Brown (00:05):
You're listening to the Audit presented by IT Audit
Labs.
Welcome to the Audit.
My name is Eric Brown, and joining me today are Kyle
Rosendahl, Nick Mellom and James Arndt.
James is our guest today, as we're going to tell some tales

(00:28):
from the trenches.
Hey, James.

James Arndt (00:31):
Hello, thanks for this opportunity.

Eric Brown (00:33):
Absolutely. So, a day in the life of James.
What is that like?
What do you do professionally?

James Arndt (00:41):
I am a cyber threat intelligence analyst, and so we
get big feeds of malicious emails and malware coming in, and
we analyze that and send it out to our customers so that
they can use it to check their own environments for
known-bad things.
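The workflow James describes, turning analyzed malware into indicators that customers can sweep their own environments for, can be sketched as a simple hash check. This is an illustrative sketch, not any vendor's feed format; the indicator set and function names are made up (the one hash shown is the widely published SHA-256 of the EICAR test file).

```python
import hashlib
from pathlib import Path

# Hypothetical indicator set: SHA-256 hashes of known-bad samples, the
# sort of "known bad things" a threat-intel feed might publish.
# (The hash below is the well-known EICAR test file, for illustration.)
KNOWN_BAD_SHA256 = {
    "275a021bbfb6489e54d471899f7db9d1663fc695ec2fe2a2c4538aabf651fd0f",
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large samples don't load into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def sweep(root: Path) -> list[Path]:
    """Return files under `root` whose hash matches an indicator."""
    return [p for p in root.rglob("*")
            if p.is_file() and sha256_of(p) in KNOWN_BAD_SHA256]
```

In practice a real sweep would also match on fuzzier indicators (domains, mutexes, import hashes), since exact file hashes are trivial for attackers to churn.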
So we've got a variety of roles that we are rotating through

(01:03):
and it's been kind of a slow week this week, so I've been
messing around with a lot of .NET malware, doing some research on
that.
It's pretty interesting.

Eric Brown (01:14):
What are some of the things that you've seen?

James Arndt (01:16):
Well, it's .NET malware.
It's really quite interesting in that it is accessible to just
about anyone.
You only need a couple of tools, like dnSpy or ILSpy.
You just load it up in there and you can see all of the code
right there in front of you.
It's not like you're digging down into binary or some obscure

(01:37):
assembly language or anything like that.
It's all just written in C#, and you can easily
follow the code around, and, you know, if I can do it,
I think anybody can do it, really.
But yeah, there's a couple of families that are mainly written in

(01:58):
.NET, like the Agent Tesla keylogger and Snake Keylogger.
Occasionally FormBook as well.
And when you can get past that initial first stage and get
into the second stage, where all of the different variables and
things are, the C2 channels, the email addresses it's sending
information out to, once you get there, then it is all just

(02:19):
laid out right in front of you.
You can really see the internals and how it works.
It's really quite fascinating.

Eric Brown (02:25):
What have you seen recently?
What are they going after?

James Arndt (02:35):
Well, it really kind of depends on the malware
family. When it comes to, like, Agent Tesla or Snake Keylogger,
they're really just embedding themselves on the endpoint and
then, well, being a keylogger, monitoring all
your keystrokes, grabbing them all and sending them off to
whoever is listening on the other end.
Snake Keylogger is also quite modular,
in that it can also run through all of the installed programs on

(02:57):
your endpoint and start trying to grab saved usernames and
passwords inside them, mucking all those together and then sending
them off to the bad guy too.
So, yeah, grabbing passwords, keylogging, even screenshots,
and all information about the endpoint too.
So, whatever it is that the bad guy is going to try to get,

(03:18):
yeah, he'll try to do it.

Eric Brown (03:21):
Wow, and how are these getting onto the end
user's workstation?

James Arndt (03:27):
Through phishing, as always.
You know, it is not overly difficult for
an attacker to sneak malware or
a malicious attachment in an email through all of the
security devices that are on your network and get it to a

(03:47):
person's inbox.

Eric Brown (03:49):
And then when they click the link.

James Arndt (03:51):
Yep, yep, they open up the HTML attachment and they
click the link. And sometimes, well, okay, sometimes, when you
open the HTML attachment and click the link, the actual first
stage is embedded directly in the HTML document itself, so
it looks like you just downloaded something.
You open it up and, hey, what do you know?
It's a malicious executable.
Other times you click the link and then it downloads something.

(04:13):
Most often, though, you click the link in a malicious email
or attachment, it brings you to another web page, and then it asks
you to download the thing.
That's how they're bypassing detection.
It's not necessarily the case that the malicious thing, the
stage one, is always inside the email itself.

Eric Brown (04:36):
So it used to be that you could prevent users
from having local admin on their machines so that they couldn't
install these executables, and you'd have ways to detect C2
callouts, or command-and-control callouts, on your network.
Are you seeing ways now that the threat actors are able to

(05:00):
bypass these things?

James Arndt (05:02):
A lot of them, it doesn't require administrative
rights to run them in user space.
I mean, at my old company we'dhave to play whack-a-mole
because people were installing abrowser that we didn't want
them to.
Well, they didn't need adminrights to install them locally
as a user.
Well, malware can work.
The exact same way also when itcomes to C2 information.
Yeah, if you know about it, Ifyou're whatever network security

(05:25):
devices you have, if they knowabout it, then they can detect
it.

Kyle Rosendahl (05:28):
But, man, if you don't know about it, it's not always
the case that it's going to be found.
Yeah, so when these are getting installed,
I mean, one of the things is,
how effective do you find these new endpoint
protection programs to be, right?
I think a lot of people use something like CrowdStrike;
Palo Alto, Symantec are out there, right? I mean, they all

(05:49):
have these new EDR platforms for endpoint protection.
How well are these guys getting around those protections to get
their code executing on the boxes, and what are some of the
strategies that they're using to kind of bypass those EDR
detections?

James Arndt (06:06):
I really don't live in that aspect of malware
analysis, just testing out different EDR tools.
What I do know is that, depending on the EDR, you know,
they'll be looking at different behaviors: hey, where did
this actually get installed from?
Where did this get executed from?
Somewhere in Downloads?

(06:26):
Okay, maybe that should raise an alert.
Is this a DLL that's being loaded up?
Is it signed or not?
And then whom is it signed by?
So there are different ways they can try to detect all of
this stuff.
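The "where did this get executed from" heuristic James mentions can be sketched as a simple path check. The directory list below is just a set of common user-writable staging locations chosen for illustration, not any EDR vendor's actual rule set:

```python
from pathlib import PureWindowsPath

# Directories EDR products commonly treat as suspicious launch points,
# because droppers tend to land there (an illustrative list only).
SUSPICIOUS_DIRS = {"downloads", "temp", "tmp", "appdata"}

def suspicious_launch(exe_path: str) -> bool:
    """True if the executable's path passes through a user-writable
    staging directory like Downloads, which is a cheap behavioral
    signal, not proof of maliciousness."""
    parts = {p.lower() for p in PureWindowsPath(exe_path).parts}
    return bool(parts & SUSPICIOUS_DIRS)
```

A real product would combine many such signals (signature status, parent process, network behavior) before alerting, exactly the layered behavioral approach described here.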
For a while, everybody was hating on, you know, using
signatures in antivirus products and stuff.
But there's a lot of them, and they're fast, and they're good at

(06:47):
detecting yesterday's stuff.
The new hotness now is always: how quickly can we figure out
and find today's stuff?
And that's how you get all of these different sorts of
behavioral-analysis and behavioral-analytics promises
that EDR vendors are selling.
So yeah, at my old job I lived in that quite a bit.
As far as nowadays, man, ifthere was something that

(07:11):
attackers were doing and itwasn't working anymore, they'd
find a new way to do it.
So what's the quote fromJurassic Park?
Life will find a way.
Well, so will threat actors.

Eric Brown (07:21):
So it's interesting that you talk about the phishing.
And I think we've all seen this, we all know it, we all
talk about it, that phishing is the number one threat vector for
our end users to be attacked. And I saw one over the summer

(07:42):
that just reminded me how ingenious these threat actors
are and can be, and why we need to continually educate and
remind our users of the importance of really paying attention to
the emails.
It's not just emails that have links in them.

(08:04):
The one that I was involved with involved the company that I
was working with and their third-party rental broker,
a property manager.
So the company I was working with had a few rental properties.

(08:31):
Somewhere around $16,000 a month in rent was due to one of
the property management entities that they rent from
through this property manager, and that property management
company had suffered a breach previously, maybe a month or so
before, and had not told their customers of this breach.

(08:55):
So the property management company was breached.
One of their employee accounts was taken over.
They reviewed all of the emails that were in that person's
account, learned about their customers, and started crafting

(09:18):
email rules that would send emails off-site to another email
address that the threat actor controlled.
So when a property management customer sent an email in, that
email would be rerouted to the threat actor's mailbox, and they

(09:39):
could curate and learn about their intended targets, which
were the property management company's customers.
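The attacker's first move here, a hidden inbox rule forwarding mail to an outside address, is something defenders can hunt for. A minimal sketch, assuming you have already exported rules into plain dicts; the "forward_to" field name and the domain are made up for illustration, not any particular mail API:

```python
INTERNAL_DOMAIN = "example.com"  # assumption: the org's own mail domain

def risky_rules(rules: list[dict]) -> list[dict]:
    """Flag rules that forward mail outside the org, the BEC
    persistence trick described above. Each rule is assumed to be a
    dict with a hypothetical "forward_to" list of destinations."""
    flagged = []
    for rule in rules:
        targets = rule.get("forward_to", [])
        if any(not addr.lower().endswith("@" + INTERNAL_DOMAIN)
               for addr in targets):
            flagged.append(rule)
    return flagged
```

Reviewing mailbox rules for external forwards after any suspected account takeover is a common early triage step, precisely because this technique survives a password reset.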
So in this particular instance, the threat actor crafted an
email on behalf of the account they had

(10:00):
compromised, and at this time the person doesn't know that their
account has been compromised, and they send that targeted
email to that customer, who happened to be the company
I was working with. And the email was structured to
be enticing, in that it said: we are changing our ACH

(10:34):
information, and we want to offer you a discount of 5% if you
prepay your rent through the end of the year.
So $16,000 a month times six, somewhere close to $100,000 in
rent.
They get a discount of 5% on that if they paid by the end of
June, and there are five days or so left in the month of June at
this point in time.
So the person at the company I was working with then sends back:

(11:00):
well, you need to fill out this information change request,
which was a form where they put down the new ACH routing
information and contact information, as the policy was
that you call to verify any bank routing change instructions.
But this is where things got a little bit convoluted. They

(11:22):
received back the form, and then that form needed to go to a
different department, and that department would do the
verification.
Well, the threat actor added that undue pressure: it's
getting close to the end of the month, we need these routing
instructions changed in order to take advantage of that discount.

(11:47):
With that time pressure, and not following the administrative
controls that were in place, the ACH
information was changed and payment was sent. The
person who thought they were doing a good thing for the
company sent it off to finance and said,

(12:08):
you know, this information needs to be changed.
Finance changes it, executes the payment, and then everybody
forgets about it until about a month and a half later, when the
property management company reaches out via phone and
says, we didn't get payment for the last two months. That
set off an investigation and discovery of what actually

(12:34):
happened, and at that point in time the property management
company revealed that they had been previously breached.
So, long story short, phishing without actually clicking on a

(12:54):
link is still widely prevalent, and it works.

James Arndt (12:56):
Nice. Did that company then change up any of
their procedures for payments going outbound?
Do they now have checks and double-checks on top of that?

Eric Brown (13:06):
That is in process as I understand it.

James Arndt (13:09):
It seems like, in a situation like that, it's not
necessarily the fault of the person who messed up or who blindly
believed the person on the other end, you know.

Eric Brown (13:19):
Yeah, absolutely.
And you know, I think it's one of those things that
realistically could happen to anybody, and no matter how many
technical controls you put in place, you need those
administrative controls that are thoroughly reviewed and tested,

(13:48):
and then roadblocks cleared or the process restructured.
When you do run through those in a tabletop exercise and you
sit there and say, wow, you know, yeah, this isn't working,
it could take two weeks to get this approved because of all the
bureaucracy. Let's fix that.
And, you know, unfortunately, without those
tight and tested administrative controls, I could see this
happening again, for sure.

Nick Mellem (14:09):
Yeah, that's a wild story.
I think one of the biggest problems, and things I hear
happen in a situation like this, is everybody is quick to blame
it on the phish or however it came in, but nobody ever steps
back to realize nothing was reviewed.
The processes weren't reviewed.
How long has the business process been in place?
So when we go back and review it, we realize, well, that

(14:31):
probably was one of the main issues.
I think we don't review policies and procedures enough
to prevent situations of this nature.
But great story, very interesting.

James Arndt (14:44):
Yeah, that reminds me. I heard this in a class one
time, where the instructor was saying: you know, let's say that
someone at your company does have an eight-character password
and it does get cracked super easily.
You could say it's that person's fault for having an
eight-character password, but what technically allowed that
person to make an eight-character password in the
first place?
That's the problem.
Eric Brown (15:04):
Right there. You have to get down to the root of it
and understand.
Yes, if we understand that the problem is that technical
control, well then that's something that you can fix. And
in that case, there'd be the technical control that would
enforce the password, the characters and the length of the
password, and then there'd be the administrative control that
would say what length of password is okay for the

(15:25):
company to have.
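Eric's distinction, that the administrative control states the policy while the technical control enforces it, can be sketched in a few lines. The 12-character minimum below is just an example value standing in for whatever the policy mandates:

```python
MIN_LENGTH = 12  # example value; the real minimum comes from policy

def password_allowed(candidate: str) -> bool:
    """Technical control: refuse any password shorter than the policy
    minimum, so an eight-character password is never possible in the
    first place."""
    return len(candidate) >= MIN_LENGTH
```

The point of the classroom example survives here: once the check exists, "the user chose a weak password" stops being a failure mode the user can even reach.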
How about you, Nick?
What have you run across in your past?

Nick Mellem (15:30):
Yeah, I'm glad you asked.
So when we were talking about doing this, I was trying to
think of all the great stories, and I always come back to social
engineering, because it's my favorite of everything
cybersecurity has to offer us.
But thinking about the stories and what we've been
through, recently we have, you know, engaged in different

(15:53):
types of social engineering, but this specific story was in
person, and the company engaged us to test the new RFID badges
that they had just had installed.
I think that's what we're seeing more often nowadays:
everybody's going away from the physical key and they're going
to the reader. So you get within,
I think, generally under 12 inches.

(16:14):
Sometimes people just touch, you see them touch the pad, and
the door unlocks.
So they specifically wanted to see if we could utilize this new
technology and gain access to the building.
When we were doing this, this was specifically at a
production facility of a construction company.
So there's a lot of people in and out, a lot of moving pieces,

(16:35):
which generally aids us, making us more
effective.
So what we did is we essentially reverse engineered one of these
readers, and you can get cloners online.
I think Hak5 makes them, if I'm not mistaken.
So we took one of these and kind of reverse engineered
it.
So every time we walked by somebody with a badge, it was

(16:59):
essentially taking a reading of the badge.
It stores that information.
You go back to your office, car, whatever you have,
and you start cloning badges and seeing which ones work.
So the easiest way we found to do it was to put this reader
in a bag or a backpack.
We dressed up as a UPS driver making a delivery, carrying a

(17:23):
bag.
We walked through the lunch room, basically just around, and
nobody said anything to us, because we had a clipboard and a
package, right?
Those are kind of the two items where you pass go, you're good,
right? They see them, they don't question you.
So we walked through a bunch of areas.
At the end we totaled about 12 badges that were read.

(17:45):
So we come back the next day, after we cloned the
badges. The one we found to be the most useful was maintenance.
Right, generally maintenance has access to everything.
So we used this.
We did try all the badges that we had cloned,
but when we used maintenance we were able to get into basically

(18:07):
any room we wanted, in this case including the crown jewels,
right: we were able to get into the server room.
So then we were able to use different Hak5 tools, right,
the Shark Jack, and we kind of bled over into there.
But the most fun, right, was getting this badge reader to
actually read the badges, walking around collecting all
that data, and just seeing that people were not questioning
us at all.

Eric Brown (18:38):
So that was one of the most interesting, most
recent stories, I think.
Again, I think that comes back to the technical control side,
where there's quite a few things that you could do.
Encrypted readers make it harder to clone them, although
I think you've got some tools where you've cloned
encrypted badges before, but that's probably a different
podcast. So you've got those technical controls,
but then you have the scenario of the impossible travel, which

(19:01):
we look at in the IP space.
Right, you know, is this user coming from New York, and then 10
seconds later they're coming from Moscow?
Probably something not quite right there. We could do the same
thing as security practitioners with badge access.
If we see person one going into building one at 10 o'clock, and

(19:23):
then at 10:01 they're going into another building across campus,
well, that's probably unlikely because of an impossible-travel
scenario.
And I think the way most organizations are set up,
property management controls the door, the badge access and

(19:47):
the door security, and information security controls
the technical side of security for the organization,
and those two don't often meet, or don't often have access to
each other's data.
In just about every organization that I've worked
in that was large enough to have multiple departments,

(20:08):
there was that separation of duty between the property
management side and the technical information security
side.
But I always thought it would be an interesting thing to go
and grab the logs from the door readers and just comb through
them, see how much impossible travel is in there, and identify

(20:31):
which badges have been cloned.
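Eric's idea of combing door-reader logs for impossible travel can be sketched directly: sort each badge's swipes by time and flag pairs of readers that are too far apart for the elapsed time. The event format, reader names, distances, and the walking-speed threshold below are all assumptions for illustration:

```python
from collections import defaultdict

# Badge events are assumed to be (badge_id, timestamp_seconds, reader_id).
# READER_DISTANCE_M holds distances between reader pairs; in practice
# you would derive these from a site map.
READER_DISTANCE_M = {
    frozenset({"bldg1-door", "bldg2-door"}): 800.0,
}
MAX_WALK_SPEED_M_S = 2.0  # assumed generous walking pace

def impossible_travel(events):
    """Return (badge, earlier_swipe, later_swipe) tuples where the badge
    would have had to move faster than MAX_WALK_SPEED_M_S."""
    by_badge = defaultdict(list)
    for badge, ts, reader in events:
        by_badge[badge].append((ts, reader))
    hits = []
    for badge, swipes in by_badge.items():
        swipes.sort()
        for (t1, r1), (t2, r2) in zip(swipes, swipes[1:]):
            dist = READER_DISTANCE_M.get(frozenset({r1, r2}))
            if dist is None or r1 == r2:
                continue  # same door, or unknown pair: skip
            elapsed = max(t2 - t1, 1e-9)
            if dist / elapsed > MAX_WALK_SPEED_M_S:
                hits.append((badge, (t1, r1), (t2, r2)))
    return hits
```

A flagged pair is exactly the "building one at 10:00, across campus at 10:01" case: either the logs are wrong or two copies of the badge exist.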

Nick Mellem (20:33):
Yeah, absolutely.
And I think, you know, with the technical controls, the biggest
issue we have is just not educating staff on actually
confronting people that they might think are illegitimate.
Right, we don't question authority.
In another instance, we actually dressed up as a fire marshal
checking fire extinguishers.

(20:54):
So again, people just think, oh, they're important,
they've got a badge. We just rigged up a badge on a little
plastic thing, quick; you know, if you don't get
close enough, you won't notice.
Similar to the story I just told,
it's still an issue with the technical control, but
it all kind of comes back to the same thing.
A position of authority, generally,

(21:15):
is fair game.

Eric Brown (21:17):
Okay, so you can get in anywhere with a clipboard.
Exactly.

Kyle Rosendahl (21:21):
Yeah, and I think, Nick, that's an
interesting point too that you bring up, about not training
users or people to confront people that they think might not
belong, right? I mean, working
in the consulting space, we're constantly among different
buildings, different work groups, with different clients, things
like that, right? We're constantly kind of a new face in
new areas.

(21:42):
Sometimes we have, you know, badge access with readers that
function.
Sometimes not so much. But the number of times that I've been
able to just tailgate somebody through a door, or just look like
I belong there, and they'll hold the door open for me and say,
oh, you're going this way, and you go, oh, yeah, yeah, yeah,
right, and you just put on a smile and walk through the door.

(22:05):
Nobody really gives you a second thought, right?
And similar to Eric's story, where you see an email and, if you
think about it too long, you know, maybe something doesn't
seem quite right.
Or you think about, you know, holding the door open for
somebody that you don't know 100%.
You know, there's always that:
what if they're here for something bad, or what if

(22:25):
they're doing something, or what if this isn't what they say
it is, right, when it comes to an email or tailgating
someone through a door? And I just think people don't want to
be uncomfortable and have that potentially awkward conversation,
like, you know, I'm interrogating somebody because I
don't recognize them, they're coming into my office
space, right? People just want to do their job, get
home, and do a good job in most

(22:49):
cases.
So it's just interesting that so many of these are possible
just because people don't want to rock the boat, I guess, in a
lot of cases. Well, we are Midwest nice.
We are, yeah, and it's good and it's bad.
Right, I mean, try to tailgate someone in New York.
I'm sure it's a different story than doing it in Minneapolis.

Nick Mellem (23:09):
Yeah, you're spot on, Kyle, with everything you
just said, and it does make it very difficult to train and
educate staff because of that.
I think that when we are educating staff, one
thing that I think about is: ask a question that is friendly,
right?
You ask them, hey, who are you going to see?

(23:29):
Where are you off to? What meeting are you going to?
And that's when you start to think, okay, well, I've never
heard of that person, or maybe they have a long pause, right, or
they start to trip up there.
So to me it's, well, would you let somebody tailgate into your
own house?
Right, obviously you wouldn't.
But we go back to: obviously, if this person's got
a clipboard, I'm probably gonna let them through.
So it is very tough to, you know, kind of figure out what's right

(23:53):
and what's wrong, because you don't want to confront
somebody.
But yeah, great points you're bringing up, absolutely.

James Arndt (24:02):
You need to have that culture of security built
in from the top down, though. You know, you need buy-in from the
upper people in order to push that down to everybody.
Otherwise, if it's just from the security team, it might not
get all the way across the company like you need. It kind of
gets looped in with the phishing training.

Nick Mellem (24:19):
Right, we give it all the time, but somebody's
still going to click on that email or the link within an email.

Kyle Rosendahl (24:28):
I mean, that kind of leads into my story that I
prepped for today.
But you know, there's kind of social engineering, trusting, all
of those pieces, and maybe it'll be a good segue for Jamie
too, or James.
It started with a penetration test that took place at a
client years past, years before I had even started there. This

(24:49):
group of penetration testers, during the engagement, ran a
phishing campaign, right, so sending out emails that look
legitimate to drop a piece of malware onto the client
computers, to try and get remote access to those systems or
command-and-control capabilities. You know, just kind of basic
stuff.
And they sent out this email that looked like it came from

(25:09):
the Citrix organization and said, hey, you know, your Citrix
client is out of date.
Install this application to update your Citrix, and
everything will be good to go.
So they did that.
In reality, it was actually, what was the tool set,
PowerShell Empire, which is like a kind of C2 framework for

(25:30):
PowerShell.
I think there's some Python scripts with it now.
Gives you a whole slew of capabilities.
I don't know if it's currently being worked on, it was for a
while. But anyway, it was basically an HTA script that
would execute in the browser and then do a callback to a C2
server that lived at the penetration tester's company.
So they sent this out. Years later,

(25:54):
I start kind of helping out at this client, looking through
some of their logs, looking through some of their old stuff,
looking through, you know, alerts that are popping every so
often.
And there's this one called citrix_update.hta,
and it keeps going off and saying, hey, this looks like, you
know, some malicious file.
This looks bad.
This doesn't look good.
And, you know, having just started there, I'm like, hey,

(26:15):
team, what's this file that's hanging out, right? This
notifies us that it's bad.
This says it's bad.
You know, everything here says it's bad.
If we upload it, it looks like it's a bad piece of malware.
And they're like, oh, it's not malware, it's just a Citrix
updater.
You know, we keep it around in case we need to update anybody's
Citrix.
And we're like, okay, cool.
So, you know, I let it slide the first time, and it pops up again.

(26:38):
And it pops up again.
So I said, okay, what am I going to do with this?
So I grab it.
Not super difficult to get into. I mean, it was basically just
an HTA, which is an HTML application file, basically just
base64-encoded, a little bit of kind of trickery here and there
to obfuscate.
It wasn't necessarily encrypted, but it was obfuscated code in

(26:59):
there.
Pull that apart, see what it's doing, run it in a sandbox that
I had, find the IP address that it was reaching out to, pull back
those records, go to that IP address and say, okay, who owns
this domain that lives there?
Sure enough, it leads back to that penetration tester's domain.
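The pull-apart Kyle describes, decoding the base64 blob inside the HTA and hunting for the hard-coded C2 address, can be sketched with the standard library. This is a generic illustration of that triage step, not the actual Empire stager format, and the regex thresholds are arbitrary choices:

```python
import base64
import re

# Two artifacts we care about: long base64 runs and dotted-quad IPs.
B64_RUN = re.compile(r"[A-Za-z0-9+/]{40,}={0,2}")
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def extract_c2_ips(hta_text: str) -> set[str]:
    """Decode every plausible base64 run in the HTA's text and collect
    any IPv4 addresses found in the decoded payloads."""
    ips = set()
    for run in B64_RUN.findall(hta_text):
        try:
            decoded = base64.b64decode(run, validate=True)
        except Exception:
            continue  # not valid base64 after all
        ips.update(IPV4.findall(decoded.decode("utf-8", "replace")))
    return ips
```

Real stagers often layer further obfuscation (XOR, string splitting), so in practice you iterate: decode, inspect, decode again, exactly the step-by-step pull-apart described later in this story.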
You know, somebody else who's doing security work. And so I

(27:22):
reach out to their guy over there and I say, hey, you know,
I've got a question about this file that I found on our systems,
not knowing at this point that it was part of a penetration
test. Just like, well, you know, why is this thing reaching out
to your servers?
You know, I'm thinking, could it be, you know, a
man-in-the-middle, like they're trying to send it
to another trusted source and then exfiltrate data?
You know, what's the purpose behind this? And he's like, oh no,

(27:44):
that's, yeah, let me pull those old reports.
You know, here's our old reports we did for this client,
back, you know, four or five years ago.
And yeah, here's that file.
It says it was a Citrix update, and nobody had cleaned it up,
right?
So it's an instance where people receive this email, they
receive this thing that looks official, they believe it,

(28:06):
and then they save it, hang on to it, and
say, oh yeah, no, this is an official document, right? We got
the email, this is fine.
Even when the penetration test completes and the report is handed
out, nobody goes through and cleans up the files.
Nobody alerts the entire staff that, hey, you know that file
you received, that's not really a Citrix update, right?

(28:28):
So, I mean, a successful phish,
in the fact that, you know, not only was it successful and
it helped them achieve their goals during the pen test,
but it was also successful in that, you know, they tricked
people for years on end into thinking that this file isn't
what it is.
Even, you know, faced with detections in their anti-malware,

(28:49):
in their endpoint detection software, right?
Even when faced with the evidence that, hey, these things
say it's malicious, they were like, no, it's not malicious,
right, this is a totally normal file.
Somebody sent it to us.
That's just a misclassification.
So, you know, when it comes from an official source, or
something somebody thinks is an official source, and they don't
want to rock the boat, right, then

(29:09):
even when given evidence against that, they're still
going to just claim it's the other thing.
So a pretty interesting story, both on making sure you actually
clean up after your penetration tests, but also, don't always
believe everything everybody tells you in an email. So that's
really similar to a social engineering situation.

Nick Mellem (29:29):
Right, you don't question authority, you don't
question what somebody told you, and it lives there as a
backdoor forever.

James Arndt (29:35):
So were those HTA files still calling out
regularly?

Kyle Rosendahl (29:39):
Only when somebody tried to execute it.
Now our anti-malware was stopping it from running, because
it recognized the source, right,
saying, hey, this is an Empire script. And so, no, they
weren't calling out anymore.
But I had to get the actual file out of quarantine when
it got picked up, and then pull it apart to find where that IP
was hard-coded.
So no, they weren't receiving any traffic from us, but people

(30:03):
were trying to execute it, and then it would get caught and
then sent off to the security team.

James Arndt (30:08):
Who did not seem very concerned about it.

Kyle Rosendahl (30:12):
Yeah, until we started taking a really close
look at it, and I got to write up a whole, like, five-, six-page
report.
I'm like, here is why this is not a Citrix update, right? Like,
here's all the evidence, here's all the screenshots.
You know, go do it yourself.
Here's how you pull it apart, you know, similar to what, James,
you do for work, I'm sure. But, I mean, just step by step, pull
the thing apart, from, you know, obfuscated to plain text, and

(30:36):
show people that, no, it's not what you think it is.

James Arndt (30:39):
Pretty much.
Well, one time I messed with a pen testing company
that was pen testing us.
It was when I was working for an electrical
utility.
You know, phishing and phishing analysis, that was my bread and
butter.
I loved that.
That was my favorite thing.
And we start getting reports coming in, and the email was

(31:01):
beautifully written.
It had some C-level person's, you know, signature, phone
number, cell phone number, in the signature.
I mean, it was perfect.
I followed the link safely, and it came to an exact copy of our
login portal.
I mean, it was branded and everything.
It was beautiful.

(31:22):
So I was like, oh, I mean, it was one of those times like, okay,
I know we're a target, we were definitely being targeted.
This was bad, all right.
And then 20 came in, 30, and then up to, like, you know, maybe
75 to 80 of these, all coming in to different people around the
company.
Like, okay, okay, okay, get all hands on deck, right? Start
going through it.

(31:43):
We're blocking any sort of traffic to that URL.
We're, you know, getting a list of all the people
who received it.
We're deleting it from inboxes.
And then a different one starts coming in from a different
C-level person.
It was beautiful again.
It had all this information in it, still going to the same
place. Like, oh, this is horrible.
And then one of my coworkers says, wait a minute, wait a
minute, are we being pen tested right now?

(32:04):
And we're like, oh, you're probably right, because this
wasn't just really, really good, it was perfect.
It was almost too perfect, right?
So what I did was I put in a fake username and a fake PIN, and
just left it, then told my boss.
He's like, okay, good job, guys.

(32:24):
So we wrote up what we did andeverything and we did that.
Well, some weeks later, the pentesting company came to our
company and they did a big youknow all hands meeting about hey
, how did it go?
And everything.
And they even did some physicalpen testing too.
And you could see on camera howthey piggybacked off of someone
.
They, you know, had on theappropriate looking sort of you
know helmet and gear and jeansand boots and they were carrying

(32:48):
big binders that were brandedwith our company stuff, you know
, and they just snuck right inbehind someone.
But when he was talking about the phishing email, the stats, who they sent it to and everything, that's when I perked up. He's like, yep, we didn't get anybody to click on it, but we did get one. Not that nobody clicked on it, but as far as gathering actual

(33:10):
usernames and PIN numbers, you know, because they were waiting for those to come in so they could quickly take that PIN, reuse it on the real site, and get back in. But he said, yeah, I know we got found out, because somebody put in the username Jenny and the PIN number 8675309.

(33:30):
Everybody died laughing.
He's like, all right, who was it? And so I raised my hand. He's like, all right, that was hilarious, that's going on our wall of pen testing memories. So yeah, but still, when you don't know you're being pen tested and you see the perfect phish coming in, it's like, this is not good. And then everybody starts getting one.

(33:50):
It was a good test of our incident response and email deletion capabilities. It all turned out well, though, so that's good.
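The decoy-credential trick James describes, seeding a suspected phishing portal with a recognizable fake login so whoever harvested it exposes themselves when they replay it, can be sketched in a few lines. This is a minimal illustration, not anything from the episode: the `username`/`pin` field names, the tagging scheme, and the example URL in the comment are all assumptions.

```python
import secrets

def make_canary_credential(tag: str = "jenny") -> dict:
    """Build a decoy login that defenders will recognize instantly
    but that looks plausible enough for an attacker to replay.

    The username embeds a known tag plus a random suffix, so any
    later login attempt with it can be traced back to the specific
    phishing page it was planted on."""
    suffix = secrets.token_hex(3)  # 6 random hex characters per submission
    return {
        "username": f"{tag}.{suffix}",
        "pin": "8675309",  # the memorable fake PIN from the story
    }

# Submitting it to the suspected phishing page might look like this
# (hypothetical URL -- only ever do this from an isolated analysis box):
#
#   import requests
#   requests.post("https://phish.example/login", data=make_canary_credential())

cred = make_canary_credential()
print(cred["username"], cred["pin"])
```

The payoff is on the detection side: an alert on any real login attempt using a canary username tells you the phishing kit's operator is actively replaying harvested credentials.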

Eric Brown (33:59):
And Jamie, you do a little bit of instruction or
work with SANS too, don't you?

James Arndt (34:05):
Well, I was in the SANS mentoring program for a while. It was like a local community class, where it wasn't the full seven-day course; it was maybe one or two hours, one night a week, for maybe six

(34:28):
or seven weeks. So yeah, I was doing that, and it was a lot of fun. I have teaching in my background too, so it was a lot of fun to do technical things. But it was during the pandemic that SANS closed down the mentoring program, since so much of their content is also available streamed online or prerecorded, so it makes sense for them to have closed it down.

(34:49):
But boy, was it fun when I had the opportunity.

Eric Brown (34:53):
Yeah, good for you, and I know those courses are really well received. I don't think you can go wrong with any of the SANS courses, and the folks on our teams who have taken them have all spoken very highly of them.

James Arndt (35:10):
Yeah, they have a very impressive cadre of instructors working for them. They're all very, very good at what they do.

Eric Brown (35:18):
Yeah, Jamie, thanks for coming on, really appreciate your time. I've always loved the work that you've done, and the training and the chats with the teams when you've come in and done some in-person training, so thank you for that as well.

James Arndt (35:33):
You bet. Always happy to help out. Thanks again.
Thanks again.

Eric Brown (35:40):
Want security leadership without the headcount? As an extension of your team, IT Audit Labs will provide the experts to guide and counsel your company. We will start by creating a custom security program that caters to your industry while providing transparency and remediation to improve cyber posture and reduce risk.

(36:02):
Contact IT Audit Labs to find out more.