Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Music.
(00:14):
We are live. Dave Holder,
Mr. Rambo, how are
you, sir? I am doing well, man. How are
you doing? Doing great. Yeah, man, it's Tuesday
afternoon, chatting with you and not doing anything else I'm supposed to be doing,
so this is fantastic as far as I'm concerned. Yeah, I should be doing a lot of
(00:37):
other things. So I've had a couple of weeks off now. I've told everybody on LinkedIn,
so I guess it's not a huge surprise,
but I am no longer with my previous company and I am a free agent,
which feels kind of nice, kind of awkward too, because I haven't been without
work in, I don't know, 30 years.
And so it's kind of one of those interesting things.
(01:02):
And I'll tell you a funny story. So the day I found out that I was separating
from my previous company, I shot my wife a text. And as soon as she got home,
she was like, oh, that's great news. We can remodel your office now.
And so by we, we know what she meant.
I was like, I said, okay, well, I didn't expect that.
(01:25):
Initially I pushed back, but then I thought, you know, it's a great idea, actually.
So we ripped everything out of the office, repainted it, did some remodeling,
added tint to the windows. We essentially just worked our butts off for a
solid week after I'd separated.
So it looks really good in here,
(01:46):
but what I should be doing is hanging up pictures and my "I love me" wall.
And I just haven't been able to get to that point yet.
So maybe this week. That's not what you should be doing. You should be chilling,
hanging out with me and all of our friends that are listening in.
So that's what you should be doing. That's what I'm doing. That's cool, though,
(02:08):
that you guys worked on it together and crushed a project that sounds like fun man.
And we didn't get a divorce, you know, which is always a bonus.
Yeah, I love my wife to death and she has a great work ethic like me.
But man, you know, sometimes working that closely with your spouse is not always
(02:28):
a good idea. For some people, I guess.
Yeah, I certainly don't have that problem. We live and work together, and we
both work at home, for different companies, you know, and it's great.
But it's just our personalities, I guess. We enjoy it. Yeah.
And sometimes I wonder if it
might not be beneficial for me to go to an office every once in a while.
(02:49):
I've been working from home for five years now and my wife and I,
and I love my wife. Don't get me wrong.
Sometimes a nice little break, you know, maybe, maybe some TDY would be beneficial.
Give her a chance to miss me a little bit. And catch up on her own stuff.
You know, Leah and I both like to alternate travel
(03:09):
for, not really for that reason, but we
do enjoy that side benefit of traveling, just
kind of getting in our own head space and staying grounded
individually. That's good stuff, dude.
Yeah. So what have you been working on? Anything interesting? I think
so. Most of my family, friends, and peers think it just sounds really insanely
(03:32):
boring, but I've been working on building out some research on foreign threat
actors that aren't just impacting the U.S.,
but basically any company trying to do the right thing and looking for their
selectors so I can do threat hunting inside of our environment.
So I've been doing that over the last week, as well as prepping to go on a six-week
(03:55):
off-roading tour with my daughter from the East Coast to the West Coast.
So I'll drop her off at UC Santa Cruz in late September.
And between now and then, you know, the next few times you and I talk will be
with a backdrop of either the Rocky Mountains or the desert working from under
an awning with a little table and a comfortable chair and a couple extra monitors and a generator.
(04:19):
Not a generator, but a big solar-powered battery.
I've been wanting to do this for three years. And it's taken a long time to
get all the gear sorted out and optimized and tested. It's going to be really fun.
Oh, gotcha. Six weeks. And a lot of work. Yeah, I bet it is.
So, you know, we were talking with some neighbors of mine last weekend or a
(04:42):
couple of weekends ago, and they all came together and said,
hey, you know, we should all get condos on the beach at Orange Beach, Alabama.
And I said, you know, I've had enough beach. I've had enough water.
I really need to go see some mountains, go see some rugged country and get away
from the beach vibe for a while.
(05:05):
And they all looked at me like I had a thing growing out of my head.
You don't want to go to the beach. I was like, no, it's hot.
It's saltwater. It's sandy.
I want to go see some mountains. So I may have to do that actually over the next week or two.
Just take a little trip up through the Appalachians and see what I could find.
(05:28):
Yeah, sure. Sure, or fly out to Denver and jump in one of our two touring vehicles
and do a ride along for a couple of days or Vegas.
We'll pass through Vegas on the way to Yosemite.
That's how me and my family and friends typically do it. Whoever's on a trip,
we try to just jump in when somebody has the time for it. It's always possible, dude. You're invited.
(05:49):
I mean, if I can make it happen, I definitely will.
Right on. Yeah. The last time I did that, I think it was in 2011 or 2012,
and it was for a project I was working on with the Army.
And instead of meeting in a restaurant or a hotel, the guy said,
hey, why don't we just jump in my Jeep and we'll go ride through the mountains
(06:11):
and we'll talk while we're driving through the mountains.
Coolest, coolest meeting I've ever had.
It was an all-day deal. Eight, nine hours just cruising through the mountains.
I mean, we saw all kinds of wildlife, a moose, a baby moose, deer. It was magical.
Yeah. I was like, dude, I could do this all the time. Yeah. And as it turns
(06:34):
out, it's really difficult to perform surveillance in those types of environments.
I remember turning some heads when I mentioned that I'd had a meeting with my
FBI co-case behind the breakers.
We were surfing one morning before work hours, mind you. So we were on our own
time and talked around the issues.
(06:56):
But by the end of that surf session, we had a great time, got some exercise,
and knew exactly what we were going to do that day.
Worked out great. And I wrote up my little memorandum for record,
my interagency collaboration results of the meeting and submitted it.
But I didn't talk about the venue.
(07:18):
No outdoor meetings, man. What's that? Did you file an ICF for that meeting?
Surfing's free, brother. I love it.
Well, dude, it is so great to catch up with you. We had a wonderful episode 11.
Got a lot of great feedback from episode 11, by the way. I have to give a shout
(07:39):
out to a guy named Dave Snell.
And I think you're connected with Dave on LinkedIn, and he actually direct messaged me
and said, hey, that was one of the most insightful episodes he's listened to so far.
And I want to personally thank him one for the feedback, but also give you the
feedback so that you're aware of what he said.
(08:02):
And you did most of the talking on that one, which was great.
I mean, you're an expert in investigations, and he really felt like he
got a lot out of that episode. So, congratulations. Well, that's wonderful to hear.
I'll give a shout out to a couple individuals I got a chance to meet up with
in person this weekend, Brian and Sam, who said they're learning what they know
(08:23):
about counterintelligence from our show because they don't work in this lane normally.
And it's kind of exposed them to these concepts and this side of the business
where, okay, there's professionals here that are trying to deny red from seeing
blue or influence what red sees.
And I think that's a great way to put it. Once they put that sort of set of
(08:44):
glasses on or lenses, everything clicked into place.
They were like, do you have t-shirts yet?
We need to get some merch. We've got to get some merch.
We're a little behind the power curve in that regard. So we'll get on it.
We're getting there. I think you're working on a new logo too,
(09:05):
though, and we may need to wait for that before we start making merch.
Well, we'll get the challenge coin going. That's ready. But yeah,
the logo, I've had to take a break and deal with some other stuff with this trip planning.
Well, it's going good. I know we've got some great topics today.
I won't introduce them yet, but we want to go back to some things that we had
(09:28):
done in early episodes and revisit some of that and see where it takes us in
the early part of this podcast.
So, you asked a great question earlier, I'll just ask it back to you.
Dave, what books, or book, are you reading currently?
Well, the one I'm reading now is Spy Fail by James Bamford.
(09:52):
So I'm working my way through it, and I haven't gotten through it yet.
But my wife, Leah, read it and said it was really good. And that's all I need to hear.
She's a prolific reader, if you can put those two words together.
And we ran out of bookshelves long ago. I mean, six years ago.
And this was one of the ones that
(10:12):
she said, yeah, this is really good. And I like
that it addresses where counterintelligence meets politics.
When I joined the Army, I was
concerned that I would end up
fighting a war or supporting a program or something that was just a political
football. I was always painfully aware that that's how wars often work, and that's
(10:36):
how Vietnam was. And a lot of people felt like they went to war and came back,
sometimes multiple times,
and just weren't appreciated, and we didn't get the results we wanted at the
strategic level, and it was just a political game.
And I didn't want to deal with all of that. So what did I do?
Joined the infantry at the ass end of the accordion, because I just didn't know
(10:57):
anything about the national defense.
It kind of felt like I wanted to start at the bottom of the educational curve
and learn what it means to be a soldier.
And what it means to do soldiering and then to lead in that environment.
And then moved over into counterintelligence pretty soon afterwards.
(11:17):
But the one thing that brought me and my late friend, Mike Ethier,
I think, fulfillment about our career field, in spite of all the BS that we
faced along the way, was that...
It didn't really matter what the political football was, because from a counterintelligence
perspective, we're protecting a couple things that are really important to us.
(11:40):
One is our nation's ability to fight and win wars, and maintaining secrecy is critical.
And the other is protecting our troops, wherever they were.
They're fighting political footballs, fighting legitimate skirmishes here and
there, of course, around the world, and we're protecting them from insider threats.
(12:02):
And we caught a lot of insider threats that saved lives.
And so when I read the cover for Spy Fail, it connected all those dots for me
about where counterintelligence meets politics and where do we fit there?
You know, should counterintelligence people be involved in politics?
Things like that. What about you, dude? What are you reading?
(12:24):
So believe it or not, I had to step away.
You know, sometimes I bounce between fiction and nonfiction. And I also read spy books and other books.
And I just had read or listened to three or four of the Jack Carr books,
the series that he's writing, starting from The Terminal List.
(12:47):
Yeah. Okay. Yeah. Yeah. So really good series.
I heard the second season is going to come out fairly soon. And so I dipped
back into Jack Carr's world with James Reece, and I'm listening to In the Blood.
I've also been doing a lot of driving and traveling, so it just made more sense
for me to listen to an Audible book versus read one.
(13:10):
Well, nobody's judging you, dude. Yeah. I mean, read, listen,
it's all the same thing, right? There's a lot of words on those pages.
There really are. And I can't pronounce most of them, but it's pretty good.
And you can see, the great thing about the Jack Carr books is that as he's writing more
and becoming a more polished author, it comes through in his books.
(13:33):
The stories get better with each new book.
So I'm listening to In the Blood.
Really, really well done. Has
nothing to do with counterintelligence or espionage or any of that stuff.
You know, you could switch off, listen to it, and be entertained.
But I do also have a book that's sitting on my desk, and as soon as I finish
(13:56):
In the Blood, I plan to pick it up and read it.
And it's Calder Walton's book called Spies: The Epic Intelligence War Between East and West.
Now, it's a job. It's a pretty thick book. I don't know if you know this,
but I look at a 600-page book and become intimidated because I'm like,
okay, this is going to take some commitment.
(14:18):
But I keep hearing great feedback about it. People are posting about it.
And so obviously, I had to get the hardcover version of it because I wanted it on my bookshelf.
Over the next few days, I plan to open it up and actually spend some quality
time with it. So looking forward to that.
Cool. Yeah, I've seen that one making the rounds on our LinkedIn network.
(14:39):
I also bought that. I bought it on audio. Don't judge me.
How many hours is that?
A Jack Carr book is 12 hours. I'd imagine this is going to be 26, 27 hours' worth of audio.
I don't know. I never look at the length because to me, if it's interesting
content, it could just go forever and I wouldn't want it to end.
(15:01):
I remember reading Atlas Shrugged. That was my first over-a-thousand-page book.
And this was like 25 years ago or something.
I thought it was just an enjoyable pace of reading and the story kept changing
enough to keep me interested.
So I don't care. I mean, give me a good book. Good book's a good book.
Well, you'll be driving too for the six weeks.
(15:23):
So maybe you'll get a chance to, to kick it in and listen to it as you're on the road.
Oh, a hundred percent. Yeah. I've got a couple of audio books lined up for the
trip. That's fantastic.
All right. So we've covered books. What good movies or series have you been
watching or paying attention to lately? Well, just finished The Night Manager,
and it's a John le Carré adaptation, yeah.
(15:45):
And it was gorgeous to look at. Just amazing.
And a great story. You know, it was more like a collection operation almost, or a sting.
So really, not a lot of counterintelligence per se, much more of a sting type
of collection, like an investigative source operation.
But from an insider threat perspective, I could see that they ran a clinic on
(16:12):
insider threat throughout.
It's just one season. It's one story. It was nice to look at.
The Little Drummer Girl was better, I thought.
That was another series adaptation. If you want to know what the stress
on an asset is like, I've never seen a more realistic depiction than the woman
who portrayed the intelligence asset in that show.
(16:36):
When I was going through our case officer training, one of the things that happens
sometimes is they try to provide access to one or more people who have retired as assets.
And they'll come in and it's a big process because they're still living in the
shadows, even though they're retired from that business.
(16:58):
And so it can sometimes take two or three days to get them into that that environment safely.
And one person that came in just broke down in tears on the stage,
maybe 50 of us or so in the room,
because it was the only chance in her life that she had to chat about her life experience.
(17:20):
There was literally no other approved venue for it.
And all of that stress and all of that emotion for taking part in these just
incredibly powerful and successful operations during a 20-year-plus career.
Just, she can't talk about it, nor about all of the prices that she had to pay along the way.
(17:41):
And so Little Drummer Girl for me was...
I don't know. That's a great series. That is a fantastic series.
Yeah, I forgot about that one, but I need to go back and re-watch it.
Actually, my wife watched it with me, and she loved it. So I need to go find
that again and re-watch it. That's a good one.
(18:01):
Do you have a favorite all-time CI-focused movie or anything?
I do. And I've told you Tinker Tailor Soldier Spy, obviously.
I mean, that's way up there. But the other one is The Good Shepherd.
Oh, yeah. Yeah, yeah. Of course, I'm a huge fan of Matt Damon.
I just think that guy is so cool. The Good Shepherd, I thought,
(18:25):
was just a fantastic intelligence, counterintelligence story.
It's loosely based on James Jesus Angleton, on his story and how he evolved
and how it impacted his life, how it impacted his family.
And ultimately, if you know the story about James Jesus Angleton,
(18:47):
he's a polarizing figure in the history of US counterintelligence.
But you could see what it did to his mind.
I mean, he became completely paranoid towards the end of his career and killed
a lot of careers and stymied a lot of really awesome intelligence collection
opportunities because he was so afraid that the Russians had penetrated all
(19:10):
levels of the U.S. government.
That is one of my favorite all-time counterintelligence movies.
What about you? You got one? I do love that movie.
I mean, that's kind of where counterintelligence was born.
And I mean, counterintelligence has, of course, been around from time immemorial, but here in the U.S.,
anyway, we started formalizing the CI approaches around that time.
(19:33):
So, great piece of history. Yeah, I've said it before, my all-time favorite CI movie was Casino Royale.
When I watched it, I did not expect it to have so much overlap with my career.
It was obviously on steroids and with much better looking people all around.
(19:59):
Way cooler cars than I ever got to drive. But it was the theme lines running
through it and the consequences of getting it right or wrong were fantastic.
And the whole thing about coercion, too, is something else I talk about a lot
in my work. When you're countering an insider threat, sometimes it's not malicious.
(20:21):
It might be intentional, but it could be coerced. And there's this huge cultural
aspect of coercion around the world.
And when we're supporting workforces that work under those conditions,
I think about that a lot and how to help our workforce be successful and safe.
And that was an insider threat that was coerced and it took a long time to figure
(20:44):
out the truth, because they weren't applying
counterintelligence fundamentals. They were assuming everybody was good
because they were on the inside. Oh, you're gonna hate me for this, and you brought
this up in episode one. I still haven't come back and watched Casino Royale yet, and
I just wrote myself a note, like, please watch this movie. I've got to see this
again. So, oh my gosh, that's on me. Yeah, it's intense. It's great, and very intense.
(21:09):
And it was also a breath of fresh air for me because I always saw this potential
in the James Bond franchise.
And when this new one came out, it was a whole different level.
What about miniseries? Watching any miniseries? Anything lately that has got
you turned on or turned off?
How do I want to answer this question?
(21:32):
Phrasing, as Archer would say. Hey, let's just say Archer for that question. How about that?
A day with Archer is like a day in any Army office I've ever worked in. Yes, it's such a great series.
I haven't watched it in years. Are they still making new episodes of that? I don't
(21:53):
know, I haven't either. Yeah, but whenever it first came out, I couldn't get enough of it. Oh, yeah.
Yeah, it's outrageous, and so, so true to life in some places. I will tell you that
I've re-watched A Spy Among Friends, about Kim Philby, which is kind of an alternate history. Oh, yeah.
I went back and revisited it pretty good.
(22:17):
Whether it's true or not, who will ever know. But I just thought it was well done. Love the actors.
And I just love British espionage movies. Anyhow, man, I just think they...
They do a much better job in many cases with their espionage movies.
It's not a lot of explosions, but it's real.
I think they bring out the essence of the chess aspect of counterintelligence.
(22:41):
Yeah, you're right. They take their time to build out the real storyline instead of just cut shots.
That's one reason I like John le Carré.
I've read a bunch of John le Carré books and seen all of his film productions,
TV series, and things over the years, and they just focus so much more on the
human experience within this lifestyle.
(23:03):
Like The Constant Gardener, you know, you would never say, oh,
that's a spy movie, but it absolutely was.
Also, Turn, Turn's one of my all-time favorites.
It was very well done, very close to the book it was based on, Washington's Spies.
A lot of people still haven't heard of that. It just looks like this random
old AMC boring sort of documentary style reenactment, but it is not that.
(23:28):
Well, it's funny you mentioned that, because during one of the very first few
episodes you mentioned Turn. And back in the 2014,
'15 timeframe, whenever it first came out, I did watch several of the first few
episodes and really didn't find it that appealing.
And so I quit watching it. And then,
as we were talking about it, I said, okay, I'm gonna make the effort, I'm
(23:51):
gonna go back and watch it. And oh my God, me and my wife
binge-watched it over, like, a weekend.
Couldn't get enough of it. It was really well
done. And as the seasons progressed, I mean,
I just thought it got better. I'm glad
you turned me on to it, because I really, really enjoyed that series. I mean,
(24:11):
since we're turning each other
on... Hey, how you doing? Oh no, camera's off. Earmuffs! Earmuffs, kids! Mr. Rambo,
Yes, sir. How would you describe the difference between information security
continuous monitoring and cyber incident response, digital forensics, and e-discovery?
(24:39):
That's an easy question. Not compound at all, right? Yeah, not at all. Where do I begin?
Well, they're obviously all different disciplines. Totally separate functions
that all support the same effort, but totally different.
When you came up with these topics, I thought, okay, well, this is interesting.
(25:04):
How are we going to tie all of these different security functions together in
a podcast that makes it enjoyable?
So for me, information security continuous monitoring, I'll tell you what I've
seen in the past. It was really just vulnerability management.
We were working a government contract. The government is really big on making
(25:29):
sure their vulnerabilities are patched on time, that they're identified,
that you have asset inventory.
And so we did that every week. Every Thursday, we'd sit down for four hours
and bring all the teams together.
We would make sure that our inventory was straight, both the prod,
non-prod, cert, test, build segments of the network, and just talk about how
(25:51):
we're going to eliminate vulnerabilities.
And that was the part of ISCM that I've worked.
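That weekly vulnerability-review flavor of ISCM could be sketched roughly like this. The `Finding` record, the segment names, and the SLA day counts are illustrative assumptions, not any particular scanner's export format:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical scan finding; fields are illustrative, not a real scanner schema.
@dataclass
class Finding:
    host: str
    segment: str      # e.g. "prod", "non-prod", "cert", "test", "build"
    severity: str     # "critical", "high", "medium", "low"
    discovered: date

# Example patch SLAs in days, keyed by severity (assumed values).
SLA_DAYS = {"critical": 15, "high": 30, "medium": 60, "low": 90}

def overdue_by_segment(findings, today):
    """Group findings that have exceeded their patch SLA by network segment."""
    overdue = {}
    for f in findings:
        age = (today - f.discovered).days
        if age > SLA_DAYS[f.severity]:
            overdue.setdefault(f.segment, []).append(f)
    return overdue

findings = [
    Finding("web01", "prod", "critical", date(2024, 5, 1)),
    Finding("db02", "non-prod", "low", date(2024, 6, 1)),
]
report = overdue_by_segment(findings, date(2024, 6, 1))
# web01's critical finding is 31 days old (past its 15-day SLA), so only
# the "prod" segment appears in the report.
```

A weekly meeting like the one described would then walk each segment's overdue list.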
Now, you may have a different view of that, and I can't wait for you to tell
me what you think about information security, continuous monitoring.
Really? Because when you say it like that, it almost makes me think you can
wait to hear how I'm going to say it.
(26:15):
Well, I mean, I think it means different things to different people,
but I can't wait to see. Like the word "is."
Yeah, in the 90s. Just like the word "is." Yeah. Yeah.
No, you're right, man. I mean, there's a couple different sides to the ISCM coin.
You think about ISCM from the NIST description, it can include a couple things.
(26:35):
And then in practice, I like to think of it as also monitoring human behavior
on IT assets, like that kind of continuous monitoring.
So, you know, when you're talking about ISCM from more of an infosec engineering
perspective, then yeah, absolutely.
You know, you're doing continuous assessment of things like security controls.
(26:58):
You're doing real-time monitoring of systems and networks.
You're measuring compliance with policies and regulations. And there can be
some anomaly detection, but not so much on the human side, right?
It's anomalies with systems that need attention.
There's, yeah, vulnerability scanning, there's log and security event analysis.
(27:20):
There's response to alerts, integration with the SIEM, and everything that's happening
on the cyber defense perimeter, and all that.
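A toy version of that log and security-event analysis might look like the following. The event records and the fixed threshold are made-up assumptions, not a real SIEM schema; real deployments would baseline per user and time window:

```python
from collections import Counter

# Toy security-event stream; in practice these records would come from a
# SIEM query. Field names here are illustrative.
events = [
    {"user": "alice", "action": "login_failed"},
    {"user": "alice", "action": "login_failed"},
    {"user": "alice", "action": "login_failed"},
    {"user": "alice", "action": "login_failed"},
    {"user": "bob",   "action": "login_failed"},
    {"user": "bob",   "action": "login_ok"},
]

def flag_anomalies(events, threshold=3):
    """Flag users whose failed-login count exceeds a fixed threshold."""
    fails = Counter(e["user"] for e in events if e["action"] == "login_failed")
    return {user for user, n in fails.items() if n > threshold}

alerts = flag_anomalies(events)   # alice has 4 failures, bob has 1
```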
And then cyber defense also kind of plays into the ISCM,
because I think about a CISO, you know, by definition, it's the chief InfoSec
officer. And that's what that means.
(27:42):
And typically the CISOs have those two big branches of InfoSec on one side and
then the perimeter, cyber perimeter defense on the other.
But there's a lot of overlap, of course.
We're protecting information from maybe misuse internally, but also theft from
the outside or corruption. Anything that disrupts the CIA triad: confidentiality,
(28:06):
integrity, and availability. So other things that cyber defense plays into are
just all of the different systems that prevent the outsiders from getting in,
prevent inside things from beaconing out. So a firewall is really important.
But on the other side of it, you've got this human aspect of ISCM,
and that's how do we monitor employees' compliance with InfoSec policies?
(28:30):
How do we detect abuse of privileges and abuse of access?
So let's take NIST and the new cybersecurity framework, 2.0,
that came out, I think it was earlier this year.
There are some great InfoSec fundamentals covered in that, even though it's
(28:51):
obviously designed more for cyber defense.
With the InfoSec NIST publications, going back to those, we can find touch points
with monitoring human behavior.
And then what does that look like? It's almost like humans are a system too.
It's monitoring the performance and compliance of human behavior on IT systems.
So you're also looking for things like compliance, you're enforcing policy with controls like DLP.
(29:18):
So you're not only detecting non-compliant behavior, but you have security controls
built in, and that's connected to the ISCM paradigm.
And when there is a problem, just like with cyber defense, there's a playbook for that.
And then you can kind of get out of ISCM and into incident response or investigative activities.
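That kind of DLP-style control, detect non-compliant content and enforce policy, might be sketched like this. The rule labels and regex patterns are illustrative assumptions, not any particular DLP product's configuration:

```python
import re

# Illustrative DLP rules: each label maps to a regex for a sensitive-data
# pattern. Patterns here are simplified examples only.
RULES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
}

def scan_outbound(text):
    """Return the labels of every rule matching the outbound content.
    A real control would block or quarantine and open an incident."""
    return sorted(label for label, rx in RULES.items() if rx.search(text))

hits = scan_outbound("Customer SSN is 123-45-6789, card 4111-1111-1111-1111")
# hits == ["credit_card", "ssn"]
```

When `scan_outbound` returns anything, that is the hand-off point from monitoring into the incident-response playbook mentioned above.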
When we're talking about insider risk, there's always this question about what
(29:42):
are you really monitoring?
Are you talking about people's lifestyle things and their stressors that
cause them to perhaps go down a behavioral spiral towards violence or rash behavior
or addictive behavior, things like that?
Or are you talking about people trying to steal trade secrets?
I like to separate ISCM on the human side between InfoSec behavior
(30:08):
and malicious behavior. InfoSec behavior is all the accidents and negligence
and all of that, just lack of compliance with policy and best practices.
Whereas malicious behavior is everything else.
Things you would find in the criminal code somewhere, like fraud,
embezzlement, theft, and sabotage of IT and OT assets.
And that's interesting, because I don't know that we spend enough time doing
(30:34):
continuous monitoring on the human systems.
I don't know that we go all the way back to even the fundamentals: roles,
and what access each role gets.
Should there be a difference between this role and their access versus a role
in a totally different area, maybe the same title, same kind of role,
but do they need the same level of access?
(30:56):
And so I think there's a lot of value in watching how employees interact with
the data, what events take place with the data, and then going back and refining
whether or not we should allow,
even from the basic level, this role to have as much access as they have.
(31:16):
And I think we've talked about it in the past where we hire somebody,
maybe they go through four or five different interviews.
Then they land the job, they're onboarded for a few days, and then bam,
they get their laptop and complete and total access to the network.
And I just wonder if we're not approaching this in a smart way,
(31:39):
if we could do it a better way.
I just don't know. I wonder how widespread that is, because where I've worked,
that really hasn't been the case.
And I've been quite impressed with access management.
Access management is, of course, part of InfoSec. So on the InfoSec engineering
side, that's where you should understand privileges and access.
(32:00):
And, you know, anyone that's gone through CISSP or some of the predecessors,
the ones that you would do before that,
it's just one of those fundamentals you keep getting hit on over and over and
over is access control and identity management,
managing need to know and separation of duties.
(32:20):
All of that is along that side of infosec engineering.
And then your ISCM is just monitoring compliance with whatever framework you've set up and policies.
And then on the human side, of course, ISCM is monitoring infosec behavior across
the whole workforce, particularly around people working on crown jewel projects and R&D type stuff.
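The least-privilege, deny-by-default fundamental behind that access-control discussion, including the phased-onboarding idea raised earlier, could be sketched as follows. Role names and permission strings here are hypothetical, not any specific IAM product's model:

```python
# Minimal role-based access sketch with deny-by-default semantics.
ROLE_PERMS = {
    "finance_analyst": {"read:reports", "read:ledger", "write:ledger"},
    "new_hire": set(),   # start with nothing instead of full network access
}

def grant(role, perm):
    """Widen a role as onboarding phases (or access reviews) complete."""
    ROLE_PERMS.setdefault(role, set()).add(perm)

def allowed(role, perm):
    """Deny by default: unknown roles and ungranted permissions fail."""
    return perm in ROLE_PERMS.get(role, set())

grant("new_hire", "read:reports")               # phase 1: read-only access
ok = allowed("new_hire", "read:reports")        # True
blocked = allowed("new_hire", "write:ledger")   # False
```

Separation of duties falls out of the same structure: two roles simply never share the conflicting permissions.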
(32:43):
But do you think that we have too much access?
Even thinking back to your military time, do you think we had too much access to information?
In your corporate environment, do you think we have too much access to information day-to-day?
Certainly at times, and sometimes all the time, depending on each organization's setup.
(33:05):
But the alternative is that...
And just interesting thoughts popping into my head here.
I'm just remembering back to so many episodes where there was a dilemma where
somebody had too much access, but by removing it, it significantly limited their productivity.
And it basically exposed all of these other hangups that weren't there before.
(33:27):
And they were all solvable. So we would just turn on their access and go solve
those little pain points and then dial it in.
We have to be super careful not to
impact productivity when it really needs to
happen and where it needs to happen. It's the
same reason like you know big it operations teams will do change management
(33:49):
initiatives really carefully because if something breaks it could have million
or billion dollar impacts like let's say a plant goes down and it's a big plant
It could cost a billion dollars,
a insert time frame here, a week, a month.
So we do need to be careful with our security controls. And if we're going to
(34:11):
change things, do we have too much? Maybe.
What's the alternative? We need to hire a massive
team or bring in a really expensive software application to monitor everyone's
productivity and InfoSec behavior based on a benchmark of a RACI chart and a
(34:32):
job description and a project plan.
Now, have you ever worked anywhere, like any company, that had a job description,
a project plan, and a RACI so detailed that you could map every task they needed
to perform and which data they would need? No.
Short answer, no.
(35:17):
Putting controls around critical assets that only have small teams working on
them and spend more time on educating those small teams.
The better they're educated, the fewer problems you're going to have.
That's always statistically proven out.
So that balance is going to be different for every organization based on their
risk tolerance and their cost benefit.
(35:38):
So, but I have seen where, I mean, companies do a really good job of protecting,
say, for example, financials. You know, the financial reports,
the accounting team, the accounts
payable team, really, really great job protecting that information.
And it's rare that I've seen it. I have seen it, don't get me wrong,
(35:58):
but the number of data exfiltration events that I've witnessed
coming out of finance teams at a couple of different corporations is very limited.
So they do a pretty good job of it. So I'm
not beating anybody up. No, I think
it's being done well. I just wonder if we
(36:20):
give too much access right off the bat, and maybe it should be a phased approach
or something different, you know, just looking to solve that problem. And that
can be down to each individual line manager. Line managers have a tremendous
amount of influence over what access their new team
members have, and any supporting resources.
(36:41):
Like if they're bringing in agency or contract hires, they have to make some
hard decisions about, you know, do we expose them to cycle plans and revenue
cycles and crown jewel information or not?
What about team members that are working remotely from other global regions? Yeah.
(37:02):
You know, are there information security concerns or laws or regulations or
privacy regimes that we need to be aware of?
So, I mean, it's a tough question. I don't know if there's really an answer, right?
There's probably, there's a best practice around securing your most critical
assets and the people that work on them.
But then let's say you have 100 people working on critical assets and you have
(37:27):
30,000 working on everything else.
It's a big difference in monitoring InfoSec behavior and also like looking for
malicious indicators on 100 versus 30,000.
Think about the scale of the team or the software resources you'd need for that.
And overall, I think we do a pretty good job. Of course,
(37:48):
we see insider threat activity all the time or insider risk activity all the
time, but I don't see it as such a huge problem that we get ridiculous with it.
The one size fits all hurts productivity, and then the company doesn't make
money, and then they can't pay for all the security controls.
So, you know, we kind of find that balance with our people paying the bills
(38:13):
and with the board and the senior leaders.
So are you part of the information security continuous monitoring team?
That's part of any charter I've ever been on with an insider threat team.
So you're doing infosec monitoring. Yes. Things like DLP, unauthorized data
exfil, things like that.
But you're also monitoring for behavioral concerns that are indicative of malicious intent.
(38:39):
That's a very different type of ballgame than ISCM, like monitoring for infosec behavior.
Now, are you using in your process or in your continuous monitoring process,
are you using any type of lifestyle triggers, significant life events to kind
(39:00):
of whittle down the people that you're monitoring to a kind of a group?
For example, those employees that are on a PIP, those employees that have put
in a termination notice, a 14-,
30-, 60-, or 90-day notice of termination, any of them that have any significant
hardships brought to ER or HR's attention.
(39:25):
Are you using that at all in your continuous monitoring activities?
Well, I will say that I advocate for it. I support a lot of different client requirements,
and each client looks for something a little bit differently based on what kind
of sector they're working in and what kind of regulatory oversight they have,
(39:49):
what types of crown jewels they have, the core competencies represented on those teams.
And, you know, if you think about the difference between, say,
people working in a security department and a chief research scientist,
you know, they think very differently about security.
And so it kind of depends on the monitored population what approach I would take.
You know, if I'm working in a defense contractor, which I have before,
(40:12):
you're looking at compliance is really big because that affects the bottom line right up front.
Beyond compliance, reputational harm is probably the next thing they think about.
If an employee gets kicked off a contract because they're showing up drunk to
work every day and hitting on all the females or males in the workforce,
then it's disruptive and the company's got to replace that person.
(40:36):
And the client may just give the contract to somebody else next time it comes up for renewal.
And if they get tired of dealing with some BS, there's just a range of ways
that each client wants it configured.
But I'll tell you what I think is ideal in all of these different client requirement configurations.
I think it's ideal to monitor certain
(40:57):
risk signals based on the monitored population and what their role is.
This comes up a lot when people are interested in using something like LexisNexis
or Thomson Reuters Clear or Indera or TRSS,
Any of these types of LexisNexis special services where they're getting criminal
(41:19):
and civil records brought into their overall employee continuous monitoring program.
Well, when you do that, you want to take a micro-segmentation approach so that
if you've got a team of delivery drivers, you're probably not concerned with
espionage. But what you might be concerned with is DUIs, reckless driving.
(41:40):
You might be concerned about things like package fraud. That's huge now.
So that's what I would look for. But I wouldn't just have this massive omnibus surveillance program.
Likewise for software developers. I probably
don't care about DUIs and reckless driving. I don't care if they get in a fistfight
(42:01):
at the soccer field with, you know, another parent.
I mean, it doesn't impact their job unless they just, I don't know,
can't show up to work for a week or something on a big project.
But I would be really concerned with anything in their background that indicated
they'd been involved in intentional fraudulent behavior,
like resume fraud, or working for 12 companies full-time at the same time,
(42:23):
outsourcing to North Korean hackers, like whatever.
I want to know those. So that micro-segmented approach for risk signals is really
important when you've got a workforce that consists of a bunch of different core competencies.
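That micro-segmented approach to risk signals can be sketched as a simple role-to-signal mapping. This is a toy illustration, not any vendor's model; the role names and signal labels are assumptions drawn from the examples in the conversation:

```python
# Hypothetical sketch of micro-segmented risk signals: each monitored
# population only gets the external record types relevant to its role,
# rather than one omnibus surveillance feed.
ROLE_SIGNALS = {
    "delivery_driver": {"dui", "reckless_driving", "package_fraud"},
    "software_developer": {"resume_fraud", "concurrent_employment",
                           "intentional_fraud"},
}

def relevant_hits(role, record_hits):
    """Keep only the external-record hits that matter for this role."""
    watched = ROLE_SIGNALS.get(role, set())
    return [hit for hit in record_hits if hit["type"] in watched]

hits = [
    {"type": "dui", "source": "county_court"},
    {"type": "resume_fraud", "source": "background_check"},
]
# A DUI matters for a driver but not for a developer, and vice versa.
print([h["type"] for h in relevant_hits("delivery_driver", hits)])    # ['dui']
print([h["type"] for h in relevant_hits("software_developer", hits)]) # ['resume_fraud']
```

A role with no entry in the mapping simply generates no hits, which is the point: no signal is watched unless someone decided it matters for that population.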
And you mentioned Clear and a handful of others. And I've used Ontic in that way.
(42:45):
Ontic is a great platform that was able to pull data from external sources.
Marry it up with the information that you have internally, which solved a lot of challenges for me.
So I have to give them kudos. They really do a great job with their platform in that respect.
And it is important to be able to pull that information.
I mean, you may have been the guy that, you know, did a code change at,
(43:09):
you know, 10 o'clock at night for an endpoint security company that shut down
all of the Microsoft platforms across the globe.
I mean, you probably don't have a job the next day, but, you know,
It might be interesting to know
if that thing is following you from job to job. Well, not necessarily.
I have seen these types of cases before, and in some instances,
(43:32):
it wasn't even the employee's fault.
They were following policy, but the policy was jacked up.
Or they had a manager that had a very permissive culture
with security controls and just allowed employees to basically do whatever they wanted.
I've run into situations where corporate cell phone usage was such that the
managers encouraged their staff to get rid of their personal cell phone bills
(43:55):
and phones because they didn't need two phones.
Just use the company phone for your personal stuff. Who cares?
It doesn't cost any more.
Well, there's problems with that. Now you're intermingling personal private data with company data.
And if you're going to monitor the usage of that IT asset, which is exactly
(44:16):
what it is, it's on a register.
For sure. As an IT asset, there is a risk register at GRC somewhere that has
some kind of risk identifiers and probably some security controls like containerization at least.
It kind of messes it up to commingle personal data in there.
Yeah, man, there's, there's all kinds of ways, like little issues that you run
(44:40):
into with cultural application of the policies.
Yeah, for sure. And I've always carried two cell phones with me.
Sometimes I've carried more than two cell phones with me, but yeah,
I don't want to call you. Well, aren't you just special?
Well, I used to be special. Now I'm just that guy.
I love it.
(45:01):
But there's a term for guys like me, but I can't say it on the,
on the air. Oh, you can't, huh? Why is there some prohibition against free speech in America?
Well, I can, but it might offend people. So, uh, I don't, I don't want to offend
people, but people that, yeah, they, they know what I'm talking about.
So they'll probably be DMing me with that, that phrase before long,
(45:26):
But, I mean, my wife used to always give me hell because I'd
have two cell phones with me all the time.
But just for that reason. I mean, I don't want to put my bank information,
my bank app, all of that stuff on a corporate device and have that commingled
with company information and that sort of stuff.
But I carry two cell phones just to make sure that there's a clear delineation
(45:52):
between what's corporate, what's Ryan Rambo's stuff.
And so there's no mystery to any of that. But I do have another question for you.
Wait, wait, wait. Before we're off the cell phone topic, just because we're
here, I have a challenge. I have a challenge for all of you out there and for you too.
Someday, do an inventory of the apps that are downloaded onto your corporate phones.
(46:17):
Just export an Excel spreadsheet from whatever MAM or MDM you're using,
Intune or Jamf or whatever.
And take a look at it by category.
And I dare you to do this, because you're going to find some. Almost every time
I've done this for different places, I've found things like, you know, of course,
(46:39):
lots of gambling apps and games and things like that, but also Ashley Madison,
Tinder, Grindr. The whole list is all there. And it's like,
these phones can be monitored, people.
They typically are. Yes. Yeah. And deleting the app doesn't mean that it's deleted forever.
(47:00):
There's an audit trail. The audit trail is going to be there,
and the app's going to leave potential artifacts.
There's apps that have a lot of malicious code wrapped into them.
There are apps designed to at least attempt to defeat containerized areas where
you have your business files and business stuff.
So, yeah, I would reset that phone and get rid of those apps.
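The app-inventory challenge is, at bottom, a group-by over the export. This is a sketch only; the column names ("app", "category") and the flagged categories are assumptions, since every MDM export (Intune, Jamf, etc.) labels things differently:

```python
from collections import Counter

# rows would normally come from csv.DictReader over the MDM export;
# the "app"/"category" field names here are hypothetical.
def apps_by_category(rows, flagged=("Dating", "Gambling")):
    """Count installed apps per category and pull out the eyebrow-raisers."""
    counts = Counter(r["category"] for r in rows)
    concerns = sorted({r["app"] for r in rows if r["category"] in flagged})
    return counts, concerns

rows = [
    {"app": "Outlook", "category": "Productivity"},
    {"app": "Tinder", "category": "Dating"},
    {"app": "PokerStars", "category": "Gambling"},
    {"app": "Tinder", "category": "Dating"},
]
counts, concerns = apps_by_category(rows)
print(counts["Dating"], concerns)  # 2 ['PokerStars', 'Tinder']
```

Even this much is usually enough to start the conversation about what belongs on a monitored corporate device.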
(47:22):
Yeah. Well, and what was the go ahead?
The question that I have for you is when you're setting all of this up,
your continuous monitoring, is it an alert-based type setup or are you actively
going in and threat hunting?
Because I think it's two different scenarios and I'll give you what I mean by this.
(47:47):
Normal day-to-day operations for continuous monitoring, it was all alert-based.
And I had my alerts set up, you know, from a counterintelligence perspective,
which weren't out-of-the-box type rule sets.
There were some that I created just for me.
And so day-to-day operations, it was all, I only acted on alerts.
(48:08):
But then there were times when the company was downsizing or they were doing
layoffs or any of that stuff, where we'd flip the switch and then go into active threat hunting mode.
And I'm just curious, what's your preference?
What have you used in the past? Which ones do you think work better, work more efficiently?
(48:30):
Well, it's not really one size fits all again. I've had a lot of configurations
that worked, but you're right.
There are a few different modes of operation, if you will.
Personally, I mean, I think an alert-based thing is important as a security control.
So when I think about alerts from an InfoSec perspective or an insider threat
(48:52):
detection and response perspective, I'm thinking about DLP, data loss prevention,
where you have a known bad.
Meaning you've got an issue type that is common, you're well aware of everything
about that issue, and you can create a policy that detects an incident happened.
(49:14):
It alerts somebody and you can program in either SOAR automation or DLP controls
like blocking, encryption, quarantining, and all of that to give analysts a
chance to see, hey, what is this user trying to do?
And is it an acceptable business practice? Maybe they just have a productivity
issue, an urgent requirement, and they need help?
(49:36):
Or is it potentially malicious? or is it just pure negligence?
So, you know, your question about whether I look for rule-based alerts,
I would look for that in that DLP space,
but I would not necessarily look for rule-based alerts in a lot of other spaces,
where you need to find clusters of behaviors rather than known bads.
(50:00):
So with like, with insider threat, it's not just data protection, right?
You're also trying to detect various other types of criminological activity,
criminological profiles, you know, and activity along those attack cycles.
And so for all of those, you might be doing more like anomaly detection plus
certain triggering events.
(50:21):
So for example, if you're setting up a detection platform based off of anomaly
detection, you can select your triggers.
So you start with a use case of, let's say, IT sabotage.
Well, if you analyze all the IT sabotage cases out there that make the news,
most of them are a result of some type of disgruntlement.
(50:42):
Okay, so let's take a disgruntled employee and see what path they generally
take on the way to installing a logic bomb that detonates a month after they're gone.
You can work backwards and say, okay, well, there are certain things they have
to do in order to get this into place without being detected.
And that becomes your attack cycle. So if you look at the attack cycle in each
(51:06):
of its phases, if it's MITRE ATT&CK,
which doesn't really work, but parts of it work for insider risk,
or the DTEX-5 phases, there's a lot of different companies out there that have
emerging frameworks for attack cycles,
for human behavior, they've got to go through all those steps.
(51:27):
And maybe not in the same order every time, but they have to do all of those things.
Therefore, you can set up more like algorithms than rules that are looking for
certain clusters of behaviors that are indicative of a certain type of activity.
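The cluster idea can be made concrete: map each observed event to a phase of a hypothesized attack cycle and flag a user once enough distinct phases show up, in any order. A minimal sketch, with made-up phase and event names (a real program would draw its phases from the applicable parts of MITRE ATT&CK or a vendor's insider-risk framework):

```python
from collections import defaultdict

# Illustrative phases of a hypothetical IT-sabotage attack cycle.
PHASE_OF = {
    "snooped_restricted_share": "access_expansion",
    "researched_security_controls": "defense_evasion",
    "created_rogue_account": "persistence",
    "staged_script_in_cron": "emplacement",
}

def clustered_users(events, min_phases=3):
    """Flag users who have touched at least min_phases distinct phases."""
    phases = defaultdict(set)
    for user, action in events:
        if action in PHASE_OF:
            phases[user].add(PHASE_OF[action])
    return {u for u, p in phases.items() if len(p) >= min_phases}

events = [
    ("kim", "snooped_restricted_share"),
    ("kim", "researched_security_controls"),
    ("kim", "created_rogue_account"),
    ("lee", "snooped_restricted_share"),  # one phase alone is just noise
]
print(clustered_users(events))  # {'kim'}
```

Note the order of events never matters here, matching the point that the steps don't have to happen in the same sequence every time.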
Then, of course, you've got to pivot and go into triage mode,
and that's where you can get into some more intrusive means and start really
(51:49):
drilling down into what's going on.
Now, that's half of it. The other half is the triggering event. So if we're concerned
about IT sabotage resulting from disgruntled employees, what's the triggering event?
That's where the behavioral data feed is critically important.
Are they a leaver? And what's the characterization of their leaving?
(52:13):
Are they leaving because they're being terminated for poor performance or some
kind of concerning behavior?
Or because they're super happy with what they've done at their company and they're
just ready to move on to another challenge, or because they're in a mass layoff.
All of these characterizations of why they're leaving are really important and
(52:33):
help you color your lens about kind of where should I look for evidence of malfeasance.
Likewise, you mentioned a PEP, a performance improvement or enhancement plan.
That almost always statistically results in the employee leaving the company. Yeah.
So therefore, if we believe the statistics that something like 70,
(52:55):
you know, between 40 and 70% of leavers take proprietary data with them,
depending on which survey you look at, then we should expect that someone leaving
the company is going to take data.
And we should expect that someone on a PIP or a PEP is at least vulnerable to
leaving the company and therefore taking data with them.
(53:16):
That should be a good triggering event. You know, a senior manager that gets
demoted is a great trigger.
And so some companies will set up what they would call an enhanced monitoring
regime or protocol where they have a certain set of controls that kick in after
these triggers, and you could call that trigger an alert.
(53:37):
It's not really, though, because it's descriptive.
It's not prescriptive. It's not predictive.
It's simply a descriptive event that says, here's a little bit of context.
Now, if you get that along with concerning behavior, now you've got a case.
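That distinction, a trigger as descriptive context rather than an alert, boils down to: enhanced monitoring kicks in on a trigger, but a case only opens when the trigger and concerning behavior co-occur. A sketch with illustrative flag names:

```python
# A trigger (involuntary leaver, PIP, demotion) is descriptive context;
# it becomes a case only when paired with concerning behavior.
TRIGGERS = {"involuntary_leaver", "on_pip", "demoted"}

def case_warranted(user_flags, concerning_behaviors):
    """Enhanced monitoring starts on a trigger; a case needs both halves."""
    triggered = bool(TRIGGERS & user_flags)
    return triggered and len(concerning_behaviors) > 0

print(case_warranted({"on_pip"}, ["bulk_download_after_hours"]))  # True
print(case_warranted({"on_pip"}, []))                             # False
print(case_warranted(set(), ["bulk_download_after_hours"]))       # False
```

The trigger by itself returning False is the whole point: it colors the lens, it doesn't predict anything.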
(53:57):
So, again, you asked me a question. I'm going to give you the longest possible answer.
You're on a roll, brother. Go for it. I do love it.
And I love what you said about alerts and how to use them and how to set them up.
And I was specifically talking about DLP because that's the one that I have
(54:20):
the most experience with from an alert standpoint.
But I don't know
if you've dug into the weeds on DLP. Sure.
But, you know, it doesn't catch everything, obviously.
No, because we only control for the known bad, right? And so when you
go in and pull the logs and do Excel spreadsheet, CSV analysis, of all of the
(54:47):
traffic that flows out to different email domains and that sort of thing, you can start to pick up
And you'll see it in clusters of emails, for example. Yeah.
Like, Oh my God. Yeah. Over a weekend. Yeah. And that was always,
whenever I was in threat hunting mode, I would go and pull that CSV just for
(55:09):
the weekends because there's really no reason for you to be working on the weekend typically.
And from Friday evening to
Monday morning, emails flowing
out of the company, just unreal. Yeah. And you're like,
you're not supposed to be doing that. And then
you see an employee send out 30 emails,
(55:31):
you know, to their personal email address. And a lot
of times it's obvious it's their personal email address,
I mean, it has their name at gmail.com. And,
or you could just stop it at gmail.com, because
that is obviously not an authorized place for your company's proprietary
information to reside. And
we get just that little bit of analysis over the
(55:53):
weekend, 72 hours' worth of email traffic. You know, it doesn't take that long to
do, but you see a lot of interesting hits. But I didn't feel like I wanted
to do it all the time, because it comes across a little Big Brother-ish and,
you know, kind of outside the norm. So,
like I said, day-to-day operations, I would be alert-based.
(56:16):
But whenever we had to flip on the threat hunting mode, then I would dig in
and do a little bit deeper dive, a little bit more investigation,
a little bit more analysis on those triggers.
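The weekend pull described here, Friday evening to Monday morning, outbound to personal domains, amounts to a filter and a group-by over the mail-gateway log export. A sketch only; the field names and the list of personal domains are assumptions:

```python
from collections import Counter
from datetime import datetime

PERSONAL_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com"}

# rows would normally come from a CSV export of the mail gateway logs.
def weekend_senders(rows, min_count=10):
    """Count weekend emails per sender that went to personal domains."""
    counts = Counter()
    for r in rows:
        ts = datetime.fromisoformat(r["timestamp"])
        # Friday 17:00 through Monday 08:00 (weekday(): Mon=0 .. Sun=6)
        weekend = (
            (ts.weekday() == 4 and ts.hour >= 17)
            or ts.weekday() in (5, 6)
            or (ts.weekday() == 0 and ts.hour < 8)
        )
        domain = r["recipient"].rsplit("@", 1)[-1].lower()
        if weekend and domain in PERSONAL_DOMAINS:
            counts[r["sender"]] += 1
    return {s: n for s, n in counts.items() if n >= min_count}

rows = [
    {"timestamp": "2024-06-01T14:00:00",  # a Saturday
     "sender": "a@corp.example", "recipient": "a.lastname@gmail.com"},
    {"timestamp": "2024-06-01T14:05:00",
     "sender": "a@corp.example", "recipient": "a.lastname@gmail.com"},
    {"timestamp": "2024-06-04T10:00:00",  # a Tuesday, ignored
     "sender": "b@corp.example", "recipient": "b@gmail.com"},
]
print(weekend_senders(rows, min_count=2))  # {'a@corp.example': 2}
```

Tuning `min_count` is what keeps this from flagging the one-off personal email and lets the 30-emails-over-a-weekend cluster stand out.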
Well, that's a perfect approach. And you and I have done that onesies and twosies,
like either by ourselves with a couple individuals our whole career,
just because there weren't big, robust teams with all the right core competencies
(56:40):
present at all times, right?
So maybe we can think about, for a second, if we are on a big team,
what would we want it to look like?
And for me, I would want DLP to be one team because that's a beast of a tool for a big company.
And the policy creation and management aspect of it is a full-time job for probably three people. Yep.
(57:07):
And then somebody's sitting there dealing with up to millions of alerts per
month until it's properly tuned.
And then they're so lost in the weeds that they struggle to convert all of the
lessons learned out of it into educational materials that help the workforce
start getting better at their ISCM behavior.
(57:29):
So you need a big team that's just focused on seeing what the workforce is struggling
with and creating messaging and training resources for them to help them reduce the number of alerts.
On the other side, you've got this more proactive approach where you're not
sitting around waiting for a bad thing to happen so you can respond to it.
(57:50):
You're trying to get more proactive, where you catch certain behaviors along
the attack cycle of malicious actors before the negative event happens, and look
for opportunities to help your organization intervene.
And this is where I come back to like the IT sabotage or OT sabotage type incident with disgruntlement.
(58:12):
On the proactive side, you would never say, oh, somebody's a leaver,
so we're going to turn off their access. They're leaving in a month, and it's voluntary, and they're
not going to a competitor, and they have great ISCM behavior, and whatever,
there's no other indicators. Well, you probably don't just turn off their access,
especially if they're not working on anything particularly sensitive.
(58:34):
But if, oh, by the way, they're somewhere along the attack cycle, they're
snooping in data environments they have no business in.
Plus, they're looking at the
security controls in place, and they're researching how to defeat them.
And they're on some newspaper's website that gives them instructions for how
(58:55):
to give them classified information without detection.
And, of course, they're doing this on a company computer, which at the bottom
of the page, they say don't do that. And then it's like, oh,
well, it's too late, right?
You know, NSA would have caught Daniel Hale if they were monitoring for that
type of behavior early on,
and they had some kind of algorithm or detection protocol for behavior like,
(59:18):
oh, I'm on the open Internet looking for ways to give a newspaper classified information.
Here's how you do it. Yeah. Bells and whistles, man. It does happen.
I've actually seen it. And we got a guy doing that exact thing.
Can't give you any of the details, but...
I've actually seen it. Sensitive meetings were happening. And while in the sensitive
(59:42):
meeting, the individual is reaching out to a reporter and looking at their contact
information on a web page.
And it's in the logs. I mean, you could see what he was doing and reading his
mind as he is going through the process of, yep, I'm going to reach out to that
reporter and give him the scoop.
So, yeah, when you aggregate all
(01:00:04):
of that, that time-event correlation from the
disparate data, you know, any person using a computer's probably
got several processes running at the same time, so you need almost something
similar to a SIEM to bring it all together into a time-event-correlated linear
progression. It becomes quite obvious that someone's either doing their job or
(01:00:25):
they aren't, with just logs.
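That SIEM-style time-event correlation is, at its core, a merge of disparate log sources into one timeline ordered by timestamp. A minimal sketch, with invented sources and events modeled on the reporter scenario just described:

```python
from datetime import datetime

def timeline(*sources):
    """Merge events from disparate logs into one time-ordered progression."""
    merged = [e for src in sources for e in src]
    return sorted(merged, key=lambda e: datetime.fromisoformat(e["ts"]))

# Hypothetical slices of three different logs for the same window.
web = [{"ts": "2024-03-05T10:02:00", "src": "proxy",
        "event": "visited reporter contact page"}]
mail = [{"ts": "2024-03-05T10:07:00", "src": "gateway",
         "event": "draft to reporter address"}]
cal = [{"ts": "2024-03-05T10:00:00", "src": "calendar",
        "event": "sensitive meeting begins"}]

for e in timeline(web, mail, cal):
    print(e["ts"], e["src"], e["event"])
# calendar first, then proxy, then gateway: the linear progression
```

A real SIEM adds normalization, clock skew handling, and correlation rules on top, but the value comes from exactly this: putting events from separate systems side by side in time.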
And other things are dead ringers, is what I call them.
I gave a talk to OSIT, Carnegie Mellon's OSIT last year, and we were talking
about indicators of different crimes.
And when you think about an indicator of espionage, I had two columns.
(01:00:46):
One is your descriptive factors.
And this is what DOD and FBI tell you to look for.
Sudden changes of behavior, you know, all this kind of stuff.
On the right side was what I call dead ringers.
And one item in there was obfuscation. And I keep coming back to this because
(01:01:06):
it seems pretty easy for different types of skill teams to latch onto.
But, you know, nobody should be hiding what they're doing from the company.
They might be trying to hide their activity to,
you know, maybe play a joke on their co-worker. Okay, well, you're going to
do a quick triage and realize that was the case, and it's going to go away. Or
(01:01:28):
they're trying to hide their activity for some other personal gain. Yeah.
Yeah, good employees don't try to hide what they're doing. No, no, that's a problem.
Right, and so for those dead ringers, they're the exception to the clustering principle.
You don't need a cluster of behaviors to predicate an investigation off of a dead ringer like that.
(01:01:52):
You still need to do your methodical investigation so that you don't hurt your
chances of getting to the evidence of what really is going on,
of course. But you know you've got a case.
And then you can do other stuff like digital forensics.
And a great lead-in to the next topic. Yeah. Well played, sir.
(01:02:15):
Well played. Like it's scripted or something.
So your first question, what is the difference between ISCM,
cyber digital forensics, and eDiscovery?
And so here we are in cyber forensics. Tell me, what do you think is the difference
between ISCM and cyber forensics?
(01:02:37):
I think it's the charter, because the skills can be very similar,
particularly when you're doing, if you're doing counter insider threat work,
but you're focused exclusively on super insiders or advanced insider threats, as SANS might call it.
And these are people that might be pen testers, for example,
(01:03:00):
and they know every single protocol and its vulnerabilities.
They know every single security tool in place and process.
They know the cultural application of them and what the company cares about.
And I'll give you
a second challenge. I challenge everyone
(01:03:21):
out there to hire a pen tester and put them on your insider threat or your counterintelligence
team. They will look at your setup and just shake their head. It doesn't
matter what you have in place, because a talented
pen tester will almost immediately find workarounds.
(01:03:41):
I've never seen a company that was so locked down that a pen tester couldn't
easily defeat from the inside.
There's just too many ways to do it. So you need to have some forensic skills
on your team while you're doing this monitoring and triage so you understand
the significance of the artifacts coming across your desk and you know who to call for assistance.
(01:04:04):
So when I think about inside risk, we use forensic methods and techniques,
but our charter is different.
We're trying to detect risk to the company from the human domain,
where on the cyber defense side, forensics is an investigative
process, and you need a set of tools.
And there are legal requirements for how you do the forensics,
(01:04:26):
sometimes including user activity monitoring,
which is kind of like a video capture of everything the analyst does in their incident response,
so that they can show how they arrived at their forensic conclusion on the stand
without anybody challenging their techniques.
And how well they follow their internal procedures and things like that.
(01:04:50):
Digital forensics might mean collections. It might mean analytics.
Let's tie it all together with a scenario.
DLP gets a hit that someone is trying to move a couple hundred really sensitive
files to an online cloud account that's a personal account.
They look into it. They ping inside risk and they say, Hey, are there any other
(01:05:13):
risk factors that you can see on your platform?
Behavioral, whatever, anomalies, clusters, stuff like that.
Inside risk comes back and says, yeah, we have a trigger here that they are
a known leaver. And prior to that, they were put on a PIP,
a performance improvement or enhancement
plan. A lot of things are coming together now, aren't they? Yeah.
(01:05:36):
So now DLP comes back and says, well, all of these are encrypted and we can't see what the files are.
The user encrypted them using some COTS application that we weren't tracking
and we can't get into them.
All right, well, now we've got a real case.
Okay, so let's send it up to legal. Let's open an investigation and ask for
(01:05:58):
some special permissions.
To do some real intrusive stuff on this data set, on the company data that was zipped up and sent.
And by the way, you know, the file name was changed to Dave's 1966 Mustang photos,
you know, and the extensions were changed from, you know, whatever to JPEG. Yeah.
(01:06:20):
And, you know, typically, you know, a lot of teams don't have access to endpoint
metadata that would show you all that file activity.
They have to call cyber. Yeah.
Once legal has a case open, and then legal can authorize cyber to do forensics,
and they can pull the endpoint metadata directly off the machine,
either over the wire or physically in person, and get a forensic image of it.
(01:06:45):
Once they've got that, then they've got access to the logs, because they can
get down to the kernel level where a lot of that super user or really advanced
insider threat activity happens.
So you need that dedicated support to those investigations and you need experts
who have the right credentials.
They can do it in a controlled environment so that it'll stand up in court.
(01:07:09):
And they understand the significance of the
artifacts that are indicative of criminal intent and criminal behavior,
because that's ultimately what will be rolled up in an evidence chain of custody
and provided to legal for litigation or law enforcement referrals.
And so that for the cyber forensics piece over the course of my corporate experience
(01:07:33):
has always been pushed out to external counsel to handle because it is so complicated.
We didn't have the experts. We didn't have the tooling, like EnCase or some
of the other technologies that are out there on the market for digital forensics.
And so what I've typically seen is we just gave everything to legal.
(01:07:55):
Legal would reach out to external counsel. External counsel would have subcontractors
that were capable of doing that digital forensics for us.
And then we would just pass everything off to them just to make sure that we
didn't foul up anything through the investigative process or the examination process.
And so I haven't done a lot of cyber forensics throughout the course of seven
(01:08:19):
or eight years of being in corporate, because if we did get an indication and
it did look like it was going to go to litigation,
it was taken out of our hands almost immediately and passed externally.
I guess there just wasn't the expertise on hand to make sure we didn't screw it up.
Yeah, I'd say that's pretty common in small to mid-sized businesses where,
(01:08:41):
you know, they can't just invest in these, you know, many hundreds of thousands
of dollars for various types of subscription services.
And there's an outside counsel and even, you know, big companies,
of course, farm a lot of things out to outside counsel and their networks, too.
But, you know, at a big company, I've always seen that there
(01:09:02):
are at least some internal capabilities for doing things like forensic collection.
At least get a solid image of the machine or the data set and then make a couple working copies.
Archive that best evidence, best copy,
or best copy, best evidence, depending on where you learned about BCBE or BEBC.
(01:09:27):
So it's not hard to do forensic imaging. It's one of the entry-level tasks that
a lot of cyber forensics people start with and get their credentials for.
Collecting data, essentially, in a way that establishes a reasonably tight evidence chain of custody.
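The chain-of-custody idea behind imaging, acquire once, hash it, archive the best copy, and verify every working copy against it, can be sketched with stdlib hashing. Real acquisitions use write blockers and dedicated tools; this only shows the verification step:

```python
import hashlib

def sha256_of(path, chunk=1 << 20):
    """Hash a disk image (or any file) in chunks, without loading it all."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_working_copy(best_evidence_path, working_copy_path):
    """A working copy is usable only if it hashes identically to the archive."""
    return sha256_of(best_evidence_path) == sha256_of(working_copy_path)
```

In practice the best-evidence hash is recorded on the chain-of-custody form at acquisition time, so any later copy, or any tampering, can be checked against it independently.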
So that's
FC, forensic collections, or forensic imaging,
(01:09:50):
FI. And then you've got FA, which is forensic
analysis. On the analytics side, that's where a lot of companies suffer and need
outside counsel, to know where to look and to have access to those big tools
and things like that. But some are doing it internally. I mean, I've done it myself
with supervision by credentialed FI and FA experts,
(01:10:13):
where I needed to be looking at the data in its rawest form so I could threat hunt to infinity.
Like every time we did a record check or an interview or an operational activity of any kind,
I would take that new data and pivot right within that big air-gapped forensic
(01:10:37):
examination machine that I've got just for that case.
Where I'm dumping all of my FI, my forensic imaging collections from every place
I've gotten it onto one big air-gapped machine that's got all the software loaded
onto it and UAM to video basically how I'm doing everything under supervision.
(01:11:00):
And I'm doing content analysis, mind you.
And when I get off of the machine to go do something else, the forensic experts
come back on and they start doing forensic analysis.
So in a sense, like I'm using the forensic tools, I'm using forensic methods,
but I'm only interested in the content and what can the content tell me about
(01:11:24):
my investigation and where to look next? Yeah.
That's a great way. I think that we should be working with our cyber forensics
teams in corporate America.
Now, I did that in my last company that I worked for very successfully after a fashion.
Like it took us a little while to figure out our terminology
and get past some hang-ups. Cyber forensics a lot of times has an issue with
(01:11:50):
looking at human behavior. That's why, when we come in, we
have to first establish an open investigation with legal, so that they feel comfortable
supporting and are getting their direction not from us, but from legal.
And legal will say, we want you to do the forensic imaging, collection, analysis,
and all of that. But the investigator needs to be able to have access to
(01:12:13):
the entire data set for threat hunting purposes to do content analysis.
That's one of those differences maybe that answers the initial question we started
the podcast with is what are all these differences?
If you don't have forensic credentials, you shouldn't be doing forensic work, right? Right.
Unless you're doing the content analysis under supervision
(01:12:36):
of a forensics person who is taking care of the forensic requirements.
Well, I think it depends as well. I mean, if it's not going to litigation
and you're just trying to gather evidence to terminate somebody, or some of those
things, maybe the certifications and the process can be a little bit fluid and less
rigid. It depends.
(01:12:58):
Are you going to litigate? Are they going to sue you after they get fired?
Maybe, but you got to take everything into consideration.
But you make a good point. You're absolutely right. Yeah. Very few cases like that.
If it's not going to be criminally prosecuted by the state or feds,
then you're right. A lot of this doesn't matter.
(01:13:21):
However, if you're going to litigate or expect litigation in response to a
termination, then you've got to run just as much of a tight ship, I think.
Yeah. Well, I had a really interesting conversation today with an IT leader and we were discussing.
He just flat out. Well, no, I asked him, I said, well, look,
(01:13:43):
here's the deal. I said, I've worked in two different environments.
One environment, I had to request cybersecurity support from multiple teams to gather information.
So I had to go to the identity access management team, the DLP team,
the Microsoft Purview team for e-discovery.
I had to go to another team for URL logs. I spent the majority of the initial
(01:14:07):
part of the investigation just doing negotiations with teams to get the logs
that I needed to prove or disprove allegations.
Painful. Now, in another scenario, I had access, like ready-made access to all of these systems.
At the user level, I could pull my own logs without having to bother somebody.
And I like the second option best because there's an opportunity cost.
(01:14:31):
Whenever I reach out to an engineer or an architect and break them away from
what they're currently working on to pull a log for me, it becomes friction.
They're like, okay, yeah, I'll get to that in three or four hours from now.
The other downside of doing it in the first scenario is that you're expanding
knowledge of your investigation to whoever you have to request data from.
(01:14:57):
So now it's not as discreet as you would like it to be, you know,
depending on who you're investigating.
I mean, it could be a peer, could be a best friend, could be an acquaintance.
So the chances of compromising the investigation become higher,
the more people you bring into it.
And just thinking, you know, have you experienced that?
You know, which way do you prefer your team set up to be?
(01:15:21):
Do you like it to have access to all of the systems or do you prefer reaching
out to the different teams to gather and collect that information for you? Yes.
Perfect answer. Next you're going to follow that up with "it depends,"
in true warrant officer fashion. Yeah, there's
(01:15:44):
so many factors. I would start with, you know, I've either set up or helped set
up a number of programs from the beginning, and I've also gotten in, you
know, midstream to look at how they're doing things. And I'll tell you what I
think is kind of the best setup.
I think the best setup is to agree with your steering committee and executive
(01:16:06):
sponsor as to which data sources you're authorized to pipe into your detection platforms.
And you just have to make sure that those data sources have potential to inform
you enough about InfoSec and malicious behavior that you can get some insights out of it.
So let's say you do a scrub and you've identified, you know,
(01:16:29):
like you probably want some kind of identity and access management,
active directory type thing for a source of truth about who people are and what
they have access to, right?
You probably need something like mail exchange metadata, right?
You need some endpoint metadata for that kernel level file activity and change
(01:16:51):
activity stuff. You want to see what protocols they're using on their endpoint
to communicate data externally that aren't monitored by DLP.
So you need a core set of data sources, but you don't need everything.
Then once you better understand your critical assets, you've got a couple of
different new data streams to think about.
Things like facility access badge logs become really important for areas that
(01:17:18):
have extra security controls around them. You also...
Have the application layer of InfoSec that's happening.
And at the application layer, that's typically not something that an InfoSec
team is monitoring or an insider threat team or any kind of counterintelligence element.
Those happen at the business unit level. So let's say you've got a big R&D team
(01:17:42):
or a big software development team, and they use GitHub,
and they've got hundreds or thousands of repos, some of which contain sensitive data, some don't.
That application layer needs to be funded, housed, and cared for by that business unit.
Then when there are problems, that's what gets sent over to an insider risk monitoring team.
(01:18:09):
They can take all the other disparate data sources and compare what's happening
with that employee to that one indicator and see if there's a cluster.
So as far as having access to everything, I'd say start with the core
data set. You know, get everybody comfortable that you know what you're doing,
you know how to follow an approved workflow,
you're not snooping, you're not trying to do your own privilege escalation. And
(01:18:33):
once you can show that you're getting ROI and doing it in a compliant manner,
then go for the rest. There are a lot of data streams that are, I think, critically
important. One is you've got to have access to endpoint metadata from the kernel level coming in.
And I don't mean stuff that an EDR can get, because it's not going to see everything.
(01:18:53):
You need something purpose-built for monitoring human behavior on IT assets.
It would be wonderful to have things like,
HR triggers. Certain HR events predicate human risk, like involuntary terminations and PEPs.
And so what you want to do is get
an integration with the enterprise case management system or HR system.
(01:19:17):
Some companies have one for investigative case management and another for HR actions.
And you need both of those so that you can bring those triggers in.
You don't need all the data.
You don't need to know what the nature of the problem was, really.
You don't need to see all the details of the case.
All you need to see is a yes or a no.
(01:19:40):
Yes, they're on a PEP, or no, they are not on a PEP.
Yes, they are on a voluntary or involuntary term list, or no,
they aren't. that is enough to trigger a look at what else is happening.
And if you see clusters associated with like the known types of things people
do when they're disgruntled or leaving, you know, then you pivot.
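The yes/no trigger idea can be made concrete with a small sketch. Everything here is hypothetical naming, not a real HR system's API: the monitoring side sees only boolean flags, never case details, and escalates only when a flag coincides with a cluster of technical indicators.

```python
# Hypothetical illustration: HR exposes only yes/no flags (no case details),
# and escalation happens only when a flag clusters with technical indicators.
from dataclasses import dataclass

@dataclass
class RiskTriggers:
    on_pep: bool               # performance plan flag from HR (yes/no only)
    on_term_list: bool         # voluntary/involuntary termination-list flag
    technical_indicators: int  # count of correlated detections, e.g. mass downloads

def should_escalate(t: RiskTriggers, cluster_threshold: int = 2) -> bool:
    """Escalate only when an HR trigger coincides with an indicator cluster."""
    hr_trigger = t.on_pep or t.on_term_list
    return hr_trigger and t.technical_indicators >= cluster_threshold
```

The design choice matches the transcript: the integration carries the minimum signal needed to justify a closer look, which keeps it defensible with HR and legal.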
(01:20:04):
So I would say you never need really access to everything.
What you need access to is the type of information that'll help you generate
insights about risk along the lines of your critical asset protection program, or CAP,
and also your risk register in GRC and your associated control register.
(01:20:26):
Yeah. Bam. A whole lot of words there. It's a total word salad. I'm sorry.
No, but it's all relevant. And you're right. I don't need to see every data point, every log.
I mean, it's just way too much traffic.
But being able- It's too intrusive for what we're doing. Yeah,
but being able to jump into that identity access management system and pull
(01:20:51):
your own log, you know, for a specific, you know, investigation or incident,
I think it's important.
It's critical. Oh, you've got to have immediate access to those things.
Right. You should have access to your IP database that contains your patents
and who filed them and who was responsible for the invention and also your NDAs.
(01:21:14):
You need to see all of that.
What about USB exceptions to policy?
You must have access to that to see if it's part of someone's authorized daily duties.
So yeah, there's some always-on stuff that you would want to get approved right in the beginning.
They don't really have a lot of potential for abuse, actually. And they also,
if you think about the cost-benefit, like you kind of alluded to earlier on this topic,
(01:21:39):
they shave days or weeks or months off of the triage and investigative cycle
timelines just by clarifying a fact about some descriptive data.
That's it. Yeah.
It used to frustrate me to no end.
(01:21:59):
And I had to look at it through their eyes.
But to reach out to a team and say, hey, we're conducting a counterintelligence
investigation or an insider risk, insider threat investigation.
I need this log. Very specific. Here's the time frame. Here's the keywords.
Here's everything I need to get the answer back.
(01:22:22):
Like, okay, well, look, I'm right in the middle of a project.
And we'll get to that as soon as we can, but it's going to be at least a couple of weeks.
I'm like, dude, did you hear what I said about it's an investigation potentially
impacting the business?
But it happens all the time, or it used to happen all the time until I was able
(01:22:43):
to gain access to the systems and pull my own data.
So if you're out there and you're setting up a corporate counterintelligence
program, see if you can work that out, negotiate that, you know,
get access to the systems so you can do your own data pulls.
Yeah, it may be helpful to build out a visual workflow in something like Visio,
(01:23:06):
where you show the principle of least intrusive means throughout the workflow,
and you can get pre-approved under certain conditions to escalate how deeply
you're going into the data.
I would highly encourage you to segment the phases of your triage,
and then if you do have that investigative chart or your investigative activities,
(01:23:29):
so that you're showing we're not going to ask to do user activity monitoring
on video and keystroke until we've reached a pretty significant threshold.
And we've determined there is definitely a problem, and we just simply can't
resolve it any other way than seeing what's displayed on their screen.
(01:23:50):
Because that is a great way to get away with all kinds of murder in an IT environment, right?
There's so many ways to hide what you're doing that aren't captured in logs.
So, yeah, least intrusive means, display that. Show those thresholds.
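One way to "display the thresholds" is to encode the least-intrusive-means ladder directly, so each tier and its pre-approved condition is explicit. The tier names and conditions below are illustrative assumptions to negotiate with stakeholders, not a standard.

```python
# Illustrative tiers for a least-intrusive-means ladder; the names and
# pre-approval conditions are assumptions, to be negotiated with stakeholders.
from enum import IntEnum
from typing import Optional

class Tier(IntEnum):
    METADATA_ONLY = 1   # logs, badge records, file-activity metadata
    CONTENT_REVIEW = 2  # email/file content under an open investigation
    UAM_CAPTURE = 3     # screen video and keystrokes; most intrusive

PREAPPROVED_CONDITIONS = {
    Tier.METADATA_ONLY: "always-on for the approved core data sources",
    Tier.CONTENT_REVIEW: "open investigation with direction from legal",
    Tier.UAM_CAPTURE: "documented threshold met and no less intrusive option",
}

def next_tier(current: Tier) -> Optional[Tier]:
    """Escalate one tier at a time; never jump straight to UAM."""
    return Tier(current + 1) if current < Tier.UAM_CAPTURE else None
```

A visual workflow in Visio and a table like this carry the same content; the advantage of writing it down either way is that the escalation conditions are pre-approved rather than argued case by case.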
Show what you think is appropriate for each level and then negotiate with those
(01:24:15):
stakeholders and educate them as to the significance, the cost benefit of not approving things.
Educate them as to what they can expect their workload to be if you're very restricted.
Because you know what you're going to do? You're going to get a thousand indicators
a day and just chuck them over the fence to their section and say,
have fun. Sorry you
(01:24:37):
don't know how to do any of this. You know, call me.
No, don't call me, just deal with it. Yeah.
Or I can have access to the data and it's
over, do it myself. Yeah. And I love that concept of least intrusive means.
And I've shared some recipes with you, I think, in the past, where I've included
that language. In a corporate environment, it's critically important. Yeah, and
(01:24:58):
so early on, as I was building out my programs, the most important thing was to build trust.
And so even whenever I would toss up an incident and request an investigation
from employee relations and legal, I would add in there a basic investigative plan.
Like this is exactly what I'm going to do to get the data that I need to prove
(01:25:19):
or disprove allegations.
And they enjoyed it. Now, as time went on and they became more comfortable with
me and trusted the program, trusted I wasn't going to overstep,
then I could leave the investigative plan off.
It was just, they already knew what to expect and how we were going to operate in that.
But I think early on in the process, it's important
(01:25:41):
to go above and beyond to make sure that everybody knows this is a legit activity
and, you know, not big brother.
We're not trying to ruin culture or any of that stuff.
We're just trying to do our job. Yeah, we're responding to things you did as an employee.
We're not inventing things you did. You did that thing.
(01:26:02):
And what we're trying to do is not be judgy, but figure out the context while
leaving you alone to be productive.
And then if there's nothing, we just drop it and move on, right? Right.
So that's how it's supposed to work. I like the way you characterize that with least intrusive means.
It made me think of something that a really important colleague in my maturity
(01:26:25):
kind of growth said once that's super common.
But for whatever reason, I just hadn't really heard it to the point where it sunk in.
But he said, capture repeatable processes into local procedures. Yep.
And when you talk about your investigative plan.
If every investigative plan is about the same up to a point,
(01:26:46):
then that just becomes a workflow or a playbook.
A playbook is that repeatable process that's always the same given certain conditions.
So you don't need an IP. But I would say for anything deviating from the playbook,
you would want an approval in an email or something saying, I'm okay with you
(01:27:07):
deviating from the playbook or whatever.
Well, the last thing, and we're talking about least intrusive means,
and one of the most intrusive things that we could do is to pull an e-discovery.
I mean, that's where you're reaching in, you're grabbing emails,
and you're reading somebody else's message traffic.
(01:27:27):
And that was the third thing in the original question that you had.
Oh, you just gave me the stank face too.
You didn't like something I said, I can't wait. That's just because my dog walked
in the room. No, I'm just kidding.
But e-discovery is the third topic that you raised in your multi-topic question at the beginning.
(01:27:52):
It's the fourth and last. Yeah, we hit ISCM, incident response,
and digital forensics. Yeah, fourth and the last.
Incident response, we won't really dig into because that is a very well understood domain.
E-discovery isn't. And a lot of people have maybe been around it,
(01:28:12):
or they knew it was happening, but not how or who was responsible for it,
who drove the requirement,
who authorized it, and what was it really designed to accomplish.
There's a lot of fuzziness there sometimes in my conversations with people.
In legal cases, there's discovery.
Each party shares what they have.
(01:28:34):
And that's really important because, you know, especially if there's exculpatory evidence.
You have to share that. Let's say you're the prosecutorial team and you've got exculpatory evidence.
If you don't share that, then it can hurt the credibility of the team and even
cause cases to get thrown out.
(01:28:56):
Well, e-discovery is simply discovery with digital artifacts.
So the e-discovery team isn't really doing any kind of analysis.
That's where your forensic support to the investigation happens.
That's when you're looking at email body content and looking for leads and evidence.
E-discovery is capturing every communication and
(01:29:19):
every related document, file, and artifact related
to the case, zipping it up into one file, and getting it into the hands of your
legal team in the correct condition, so that evidence chain of custody is intact
and it will be admissible as evidence in court and stand up to scrutiny.
(01:29:41):
So it's not really part of the investigative process. It's actually part of
the legal process after an investigation.
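The "zip it up in the correct condition" step can be sketched as bundling artifacts with a hash manifest, so the receiving legal team can verify nothing changed in transit. This is a simplified, hypothetical illustration; real e-discovery productions follow counsel's load-file and format specifications.

```python
# Simplified, hypothetical sketch of the e-discovery hand-off: bundle the
# case artifacts with a hash manifest so the recipient can verify integrity.
import hashlib
import json
import zipfile
from pathlib import Path

def package_for_discovery(artifacts: list, out_zip: Path) -> dict:
    """Zip the artifacts plus a MANIFEST.json of per-file SHA-256 hashes."""
    manifest = {}
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for artifact in artifacts:
            data = Path(artifact).read_bytes()
            manifest[Path(artifact).name] = hashlib.sha256(data).hexdigest()
            zf.writestr(Path(artifact).name, data)
        zf.writestr("MANIFEST.json", json.dumps(manifest, indent=2))
    return manifest
```

The manifest is the simple part; what makes it discovery rather than backup is the surrounding documentation of who collected what, when, and under whose direction.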
See, and I've seen it a little bit different. And I guess I haven't done as
many investigations as you.
The only time that I've ever really done e-discovery is on email.
(01:30:02):
Now, Microsoft Purview, you know, going in, pulling emails, maybe batches of
emails, timeframes, subjects, and that sort of thing.
So whenever you say e-discovery, that's the thing that comes to my mind.
It's just that very specific task, but it's much more involved than that.
Well, I think it's just a misnomer, you know, because discovery...
(01:30:28):
Is often used interchangeably with threat hunting. And that's an acceptable
use of discovery during investigations.
When you say e-discovery, though, it's kind of like saying cyber incident response.
There's a very specific application that at least is normally understood.
And it's specifically focused on the legal process of finding and handling electronic
(01:30:53):
evidence during litigation or investigations.
And so I suppose you could leverage an e-discovery expert as part of a case
before it goes to some kind of legal action and just use that process and playbook
to help you find and gather the right artifacts for your content analysis.
(01:31:14):
I don't see any reason you couldn't do that if you got legal approval.
Otherwise, I think what you're doing is simply content analytics.
And you're doing some cyber forensics to get the content in a condition so that
you can perform content analysis.
Yeah. So maybe a couple different applications of e-discovery out there.
I'd be interested if anyone else out there is using the term in the way that you mentioned.
(01:31:39):
Yeah, I haven't seen it go past that.
But again, in most situations that I've seen, if it was going to litigation,
we would always tend to pull in external counsel and let them do all of their
data pulls and give them whatever they need, whether it's a PST file or some
(01:32:01):
kind of cybersecurity log or something else.
We would just be hands-off at that point and really defer to external counsel's
expertise on that. Sure.
And especially as a counterintelligence leader or practitioner.
That's the legal team's obligation. By the time it gets to the legal team,
(01:32:22):
we're kind of done, apart from some advisory services.
Yeah, that e-discovery, at least the kind of e-discovery I guess I'm referencing in comparison,
is more of a legal task to pull all of the data together into a format that
maintains evidence chain of custody in a way that'll hold up in court.
You give it to your legal counsel, and your general counsel for any company
(01:32:48):
is the one directing a resource somewhere to do it.
And I've seen that most big cyber defense teams typically have just one or a
couple forensic experts on their team, like incident response people,
that have gotten additional training on e-discovery.
And that is all they do for the company. That's their job. They don't do incident
(01:33:12):
response anymore. They just do e-discovery.
And then if they're not busy for a period of time, then they can jump in and work other projects.
Well, dude, we have covered a lot of ground during this conversation.
Yeah. I mean, there's so many different aspects of this program that we get
a chance to explore and talk about.
It's been a lot of fun. Brass tacks, man. We should label this episode brass tacks, dude.
(01:33:37):
I love it. I love it. But I really look forward to the feedback that we get
from this discussion, because I think there's a lot of things of interest.
There's a lot to pick apart.
And I hope we do have enough information in here that we can at least have a
continued conversation with our peers and other experts in the field to kind
(01:34:00):
of help round out what we've said and their experiences.
So really important topics, really important discussion, and I had a lot of fun doing it.
Yeah. I'd say if we don't create some questions out of this,
then our listeners might be listening to the wrong podcast.
We put a lot of challenging ideas out there, you know, and, and thought starters.
(01:34:24):
So hopefully we can generate some conversation. Ideally, that's the whole point,
to get some buzz around these topics and clarify some things.
And I'll speak for myself. I won't speak for you, but I will say I'm not the
expert in all of these things, you know, not even close.
(01:34:44):
I wish I was. And I know there's some experts out there.
We're all still trying to navigate and figure this whole thing out.
It's still relatively a new concept, especially whenever you talk about counterintelligence
in a corporate environment.
It really is a newer concept.
And I had an opportunity to talk to some really smart people over the last few
(01:35:06):
days, people that set up a counterintelligence program at a major bank,
set up a counterintelligence program at a huge defense contractor.
And they've had varying degrees of success. You know, and it's led to people
reaching out to me and saying, hey, I saw that, you know, this one bank has
a counterintelligence program.
(01:35:26):
I work at a bank. Do you think we need a counterintelligence program?
I love that conversation. I'm like, hell yeah, you do. Absolutely.
But, you know, it's becoming more mainstream.
We are demystifying the counterintelligence and insider risk piece,
what it looks like, how it lays out.
And hopefully, hopefully, we're providing a valuable service.
(01:35:50):
Those in the industry are at least interested in learning more about the topic
while having some fun at the same time. Well, of course, we're always having
fun. And even on the job, we're having fun with our peers.
And if nothing else, you know, when we talk about counterintelligence,
insider risk, investigative fundamentals,
hopefully it gives you the right questions to ask of your internal leadership,
(01:36:15):
you know, wherever, whatever companies and projects you're supporting,
it'll give you some ideas for what about this? Have you thought about that?
Who does this type of work? Who can I ask about this?
And just do some of that ad hoc discovery until you generate some hypotheses
about how to go about improving things.
(01:36:36):
Yeah, hey, take it, run with it, let us know what works, what doesn't,
things that you're doing that are really working well, that are novel.
Those are the things that if you're not trying to patent or monetize somehow
and are willing to share, we'd love to facilitate that. Yeah.
And if you've seen it in a way that it works better, please share it with us.
(01:36:56):
You know, by all means. Keeping it to yourself doesn't do the community any good.
I know that I will help anybody along their journey if I can.
I get questions all the time and do the best I can to answer them.
But if you point out something or if you found something that works better in
your world, share it with us. Let's get it on the podcast.
(01:37:18):
Let's share it with the community so that we all get a big win.
And I think the collaboration that we have amongst peers is probably one of the
most essential things to a successful program or a successful field, and also
to protecting our organizations from people who wish to do us harm.
So that's my public service announcement for this podcast.
(01:37:43):
Yeah, I'll just add to it by saying, don't be afraid to reach out.
The only things I respond to on LinkedIn are requests for assistance or people
that want to talk about this kind of business.
I'm not responding to advertisers and people trying to fix my finances and give
(01:38:03):
me a new executive credit card and all this BS.
But if you're interested in this field, or you're in it and just have a
clarifying question or an idea or a comment, those are the things that we're here for.
This is a passion project of ours. And we're not here to like tell everybody how smart we are.
You can see that from the title of the podcast and just our general demeanor on the show.
(01:38:29):
It's here to generate ideas, thoughts, questions, challenges,
and show some best practices.
So join us. Well, you know, and I forgot to mention our honorary third member,
Beth. Our podcast has a standard question. Let me see.
So she's always in the booth with us, and she always has this question.
(01:38:52):
So I'll go ahead and ask it for us. Bethany, Jay Redmond's question.
So Dave, what are you drinking tonight, my friend?
Hashtag so blessed I have Lagunitas Daytime, and it's my light beer,
midweek, sip on it, and be very happy.
Very, very nice. So I am drinking water currently, but earlier I had a very
(01:39:18):
nice glass of Four Roses bourbon.
Oh, my goodness. Yeah, it's good stuff, man. So, you know, just one glass to
kind of shift gears from what I was doing during the day to making a podcast.
Is that the glass behind you there that I see? It looks more like a flower pot
for like a really big plant.
(01:39:40):
What would I say glass on that bottle?
175, obviously. No, just kidding.
There is something to having a glass of whiskey. Oh, yeah. Yeah.
Well, I hope you have a great trip, six weeks on the road.
Can't wait to see some of the videos and pictures.
Maybe we should post it on the Unintelligence website on LinkedIn so that we
(01:40:05):
can share the good news on the journey.
But dude, I really, really hope you have a great trip and I look forward to
doing the next episode with you. Thanks, brother. Likewise.
All right, dude. Great convo, man. All right. See you, buddy. All right. Bye.
Music.
(01:40:34):
Thank you for listening to Unintelligence, the corporate counterintelligence podcast.
The thoughts and opinions, along with any mistakes made during live sessions, are our own.
We represent no corporate or government agencies. And if interested,
we invite you to join the conversation as we seek to turn over stones and shed
light on the counterintelligence mindset and its applications in the places we all work.
(01:40:59):
And we're always open to feedback, so we make sure to speak to the topics you're
interested in. So keep the comments and likes flowing.
Until next time, stay safe and secure and report concerns to security professionals.
Music.