Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
How's it going, Brad?
It's great to get you on the podcast.
We've been working towards this thing for a while, and we've had some hiccups along the way, of course, but I'm glad you're here.
Speaker 2 (00:12):
Yeah, it's an honor
and a pleasure.
I appreciate it.
Yeah, absolutely.
Speaker 1 (00:16):
So, Brad, why don't you tell my audience how you got into IT, and what made you want to get into security, into the security space?
I start everyone off there because it serves two purposes, right?
I have a lot of CISOs, directors, and managers that listen to this, a lot of experienced professionals.
(00:37):
So it kind of opens up who Brad is to them.
And then it also shows my younger audience that is trying to get into IT, maybe trying to get into security.
It shows them your background, right?
So they can maybe match that up and say, hey, if he did it, and I'm coming from a similar background, maybe I can do it too, right?
Speaker 2 (00:57):
I will kind of Tarantino this and say, you know, start with the ending first and then we'll go back to the beginning.
So I will say that if I can do it, anyone can do it.
I did not originally start on the IT path by any means.
I always loved sci-fi, I always loved technology.
I was always a tinkerer, going back to when I was a kid.
You know, I had the GE clock radio that everyone had, the
(01:21):
brown one with the little buttons, and I would basically take it apart and put it back together again.
My dad would yell at me and be like, what are you doing?
And I would overclock my Apple II and everything else.
I would say I was maybe an above-average tinkerer. I took apart my first computer when I was a teenager, just wanted to know about technology, and was always just
(01:41):
super curious about everything and how the world operated.
But I did not know what I wanted to be when I grew up, and in the 90s, when I was going through applying to colleges, I was trying to figure out, okay, where do I go?
Do I go in the military?
Do I not go in the military?
What career do I pick?
(02:02):
Um, what do I do?
Ultimately, my dad convinced me to go to college, and I ended up starting in engineering because I wanted to do math things, hated it, switched to business, and ended up going into the military because of 9/11. I got a military scholarship, went into the Army,
(02:23):
and took what they assigned me.
I originally wanted to be a pilot, took three flight physicals and failed the third for vision, and then got assigned to the Signal Corps, which is basically where cybersecurity now sits, along with IT and all that, and got thrown into it.
And then, as a young 22-year-old lieutenant, I was in charge of basically one of the first of what
(02:46):
was known as a quick reaction force, a rapidly deployable team that would go around the world and respond to natural disasters. We dealt with the influenza scare, we'll call it a scare now, but the influenza outbreak that almost turned into a pandemic back in 2007.
I was there in Thailand during Cyclone Nargis, which was a
(03:09):
Category 4 storm that devastated the region and took hundreds of thousands of lives during that time period.
While I was learning all that, there were no real books. I was very limited in terms of certifications and training, there weren't any real degrees out there to learn all this stuff, and we were responsible for securing our networks.
(03:29):
I mean, it was basically a mobile deployment center.
So if you know how the Federal Emergency Management Agency works, or FEMA, or the Red Cross, they typically have their own command centers.
I used to run those command centers, and eventually I ended up getting reassigned, really an expanded role, not stopping what I was doing, but basically doing cybersecurity
(03:50):
for military intelligence while doing this. That basically summed up the first four, really six, years of my career in the Army, and eventually I moved on to what's now SecureWorks and built out a lot of their different capabilities.
But I had no idea, really. I just went on deployment after deployment.
I always had a book with me.
(04:12):
You know, Google really wasn't a thing back then.
So I read whatever I could get my hands on through other folks, and when I say books I mean manuals, really boring, long Cisco manuals on how routing works. It was pretty limited in terms of what the Army had to offer for training.
When I started, I could basically take a computer and put it back
(04:34):
together again, do some basic routing, and configure Outlook.
That was about it.
And then, basically, fast forward over the past 20 years, going on 20 plus now, and it's been an interesting journey for sure. I worked with a lot of different people in terms of building all that out
(04:56):
at SecureWorks.
You know, it was really interesting, because one of the things we talked about for the podcast was working with others, helping them get into the space. What we did at SecureWorks was, as the SOC manager, I was basically working with my peers and trying to figure out, like, we
(05:17):
need to hire people and expand and grow.
SecureWorks was one of the leading MSSPs, or managed security service providers, at the time, and it was really difficult. We would find people right out of college who didn't really have any experience.
Some of the people that I hired, this was when I was in Rhode Island at the time:
(05:39):
I hired someone out of Florida who was working at a Publix as a bagger and was basically learning cybersecurity themselves.
I was like, if you're willing to relocate up here, we'll train you, we'll teach you how to be an expert.
You know, they became one of the top threat researchers in the industry, arguably in the top 1%.
(06:00):
And then another was a general manager at UNOS who just wanted a career change and saw the opportunity.
What a lot of people don't realize is you can actually make 20% more doing a similar job in cybersecurity, if you're doing marketing at a cybersecurity company versus another applicable field.
(06:21):
You actually can make 10%, 15%, 20%, if not 30% more in this field, just because it's in such high demand.
Speaker 1 (06:28):
So it's been an interesting journey.
Speaker 2 (06:30):
Certainly a lot more.
We could probably do a podcast just on the early days.
But yeah, that's kind of how I got thrown into it.
Speaker 1 (06:36):
It wasn't by choice, right?
So tell me about what that was like on that QRF team.
What was the workday like?
Are you going from nine to five, or are you going until you hit a certain point and then you're studying, you're reading a book, to learn those different skills and whatnot?
(06:59):
What does that look like?
Speaker 2 (07:04):
Yeah, I don't think I slept the first four years I was in the military, that's the short version.
Luckily I did it when I was in my 20s and not my 30s and 40s.
I was in charge of six different teams, over 100 soldiers, and billions of dollars of equipment. Basically, two teams were on standby, two were on rest period, and two were in
(07:25):
training or ramping up for the next cycle.
So at any given moment, two teams were ready to go, and we were trained, whether it was boat, rail, vehicle, or air, to figure out how to get there.
From a logistics perspective, we had to be anywhere in the
(07:45):
world within 54 hours, and so you have to figure it out. I mean, literally measure it down to bringing printers and measuring out all of the paper that you need, toilet paper, because you're going out into a really remote, complete disaster area.
You're not going to have power, you're not going to have clean water. The internet is going to be what
(08:11):
you bring with you becausewe're deploying with satellites,
and so basically what it waswas, um, uh, how the vehicles,
we, we had basically our trucksand on the back would have like
a shelter unit and in that you'dhave, you know, pretty much a
data mobile.
It was a mobile data center iswhat it was.
And then we had basicallydifferent satellite systems that
would connect to differentmilitary networks and, if needed
(08:32):
, on some of the deployments, as I got into more of the top secret levels, we used commercial bands as well, and it worked alongside three-letter agencies, depending on the mission.
And, you know, especially if it's an allied nation and there's a natural disaster element, intelligence becomes incredibly key, as well as cybersecurity elements, and people
(08:55):
don't realize that when a natural disaster happens.
So part of it is being on standby, and until the stuff hits the fan, we're kind of on easy street, so to speak. It's what we call being in garrison, and we would do our day-to-day training.
(09:16):
We'd do maintenance.
Everything had to be tip-top, ready to go on the plane that day if needed. My bags were packed, I was ready to go, I'd leave them in my trunk, that kind of scenario, where at any given moment I had to be at the airport to go where I needed to be.
Really, what it breaks down to is every free moment that you get.
You've got to remember, the internet isn't what it is today
(09:38):
, like you're not sitting on YouTube.
YouTube didn't exist.
Google wasn't mainstream. People actually sat around and read books.
You know, it wasn't a face-in-your-phone world; we were just living in the moment.
I mean, a lot of that stuff really didn't come out until the later 2000s, if that makes sense.
But you know, the first couple of years, you're dealing with
(10:12):
basically remote areas and, you know, rubble.
You don't want to bring your nice stuff; most of the clothing and gear that I had got torn up.
Yeah, it's interesting to kind of see that, and you fit it in. Like, you're on the plane for, you know, six, eight hours at a stretch.
I was stationed in Hawaii.
Just to get to Japan, or to Thailand, or to
(10:33):
Korea, South Korea, or anywhere, or the Philippines, it takes 18 hours, and people don't realize that.
It's a nine-to-twelve-hour direct flight to Japan, and then it's another nine to twelve to get to Southeast Asia, depending on where you are.
And sure, you can get on a military aircraft, but we didn't always have the ability to, because, believe it or
(10:56):
not, despite us having a primary mission, we didn't have dedicated aircraft, and things were constantly being sent over to Iraq and Afghanistan.
We were ramping up and supporting the two wars, so sometimes it was a matter of how we got there, and sometimes we got there before the equipment did.
But at least we could be liaisons, we could be advisors
(11:18):
and advise folks.
Speaker 1 (11:21):
Yeah, I asked that because there are a lot of people, a couple of friends of mine actually, who always say, I'm so busy, I can't do it, I don't have time, and this and that. And I'm just sitting here thinking, well, you don't want it bad enough.
I mean, plain and simple, you just don't want it bad
(11:41):
enough.
Like, you want the money, you want the flexibility, you want the ability to do something else, but you don't actually want it, because if you actually wanted it, you would figure out how to optimize your sleep so that you're sleeping fewer hours and spending more time studying or doing
(12:03):
whatever you need to do to get this done.
I can give you the roadmap, I can give you the plan, but at the end of the day, if you're not going to do the work, if you're going to keep on claiming you don't have time, it's never going to happen, right?
And so I think it's always important to point out
(12:27):
, when I have people like yourself on who are very much self-starters, that at some point in time you saw this and you were like, hey, I want to get into this.
Like, I want to learn, I want to know as much as I possibly can about this area.
When you saw that, when you identified that, you went for it.
And a lot of people are missing that part.
Speaker 2 (12:46):
Yeah, and I have a lot to say about that.
So when people would say, oh, I didn't get a good night's sleep last night, I'm like, every mission that we had, we didn't sleep the first 72 hours.
You're on the plane, you're going.
We would sleep on the plane getting there, but even then, sometimes we didn't, because, depending on the mission, we didn't have everything ready.
(13:08):
We were reviewing the material on the plane. We would get the maps, we would get our kit, and it'd be like, here are the maps of the island. That's what it looked like before the natural disaster. You can see the river, and all of that's not there anymore.
But yeah, at least you figure it out when you get there.
You figure it out when you get there, but part of it is
(13:28):
you're writing the operations order and the execution plan and everything on the plane. We would have our standard packing list, and we would do last-minute packing and all that.
But you have to be there in 54 hours.
So that means that every second counts leading up to that
(13:49):
moment when you hit the ground.
That's when the fun begins. And from there, we kind of cut it off at 72 hours, because you start to get loose, you know.
You start to become a safety hazard, and you get microsleep and all that.
But really, you're pretty sleep-deprived the first couple of days, and once you get the internet flowing and phones
(14:11):
going and people can operate, then you can kind of take a breather.
But yeah, it really is typically something that ends up happening.
I mean, your adrenaline's pumping so hard because it's life and limb. I'll spare the gory details, but not everybody's alive when you get there.
But it's that scenario. I would say a couple of people a month reach out to me and
(14:33):
they're like, you know, you helped me break into cybersecurity.
So, a little-known fact, I've actually worked with three different police officers over the past couple of years to help them transition, because there's been a lot of pressure on police officers; it hasn't been so nice for them over the past couple of years, and a lot of them are good friends of mine.
They're like, yeah, I've got to get out of here.
(14:54):
It's not safe.
Defund the police is killing me, everyone hates me.
I want a purpose, I want to make a decent living.
And you know, I'll give them the answers to the test, so to speak.
I'd say I love helping those types of individuals because they have that military mindset, that warrior mindset, like, I'll do anything it takes to accomplish my mission.
(15:17):
Now, I'd say nine times out of ten, I talk to somebody in college, or a younger demographic. People are like, hey, can you help out my son?
Or can you help out my friend?
All he does is play video games on his couch.
And it's like, okay, well, I'll talk to him.
And it's like, hey, this is what you've got to do to succeed,
(15:38):
and then they instantly get nauseous and don't do anything.
Yeah, so one of the things I like to do is show people, you just go to Google and type in security certification roadmap.
Paul Jerimy has built this beautiful diagram in terms of
(15:58):
how all the certifications fit together, and there's over 400 of them, yeah, 400 and something certifications, from beginner to intermediate to expert.
Any one of these certifications could be, at a minimum, 80 hours, if not 800 hours, of studying, depending on the level of complexity, whether it's
(16:20):
hands-on or not, and where you're starting from.
So it's a major commitment, and I show people, it's like, hey, this is Mount Everest, but you can get there, and most companies will pay.
You know, not for all of them, but I never, well, maybe once, paid out of pocket for one of my certifications.
I was able to get the military and, you know, SecureWorks,
(16:41):
IBM, and others to pay for certifications.
As you go along, you have different grants and things like that, and ways around it.
There's tons of free information and certifications that you can get out there, a lot of non-profit stuff.
Cisco has their academy, and there's plenty of other good free stuff out there, tons of resources.
(17:01):
But I show them this and they're like, no, I can't.
Typically I start people off and I say, hey, just look at CompTIA and start with A+, go to Net+, go to Security+. You'll have enough that you could start working at, like, an internet service provider, and you can basically get
(17:22):
work at a help desk effectively, learn the ropes, and get some experience. And then, once you have one or two years under your belt, you go somewhere else. At SecureWorks we would take people with zero experience and train them.
We actually almost preferred it sometimes, because we'd get people out of school who have a master's degree in
(17:43):
computer science and they don't know the difference between TCP and UDP. Like, you've got to be kidding me. How is that possible?
You just literally wasted your entire postgraduate education. That's thousands of
(18:04):
dollars at a top-tier school, and you didn't learn a damn thing. So it really comes down to aptitude, and, like I said, I hired people that worked at a supermarket, that worked at a restaurant, that came from different walks of life, plus police officers.
Some of those skills are translatable and transferable, so you're not necessarily starting from
(18:26):
scratch.
It really comes down to fortitude, really rolling up your sleeves and putting the work in. It's that perspiration, sweating it out and doing it. When I was starting, I really
(18:47):
fully immersed myself, like I was learning a language, literally learning binary and hexadecimal conversions and all of that, and learning how to do the packet analysis component, but really treating it like I was fully immersed in learning Spanish or French or any language. That's really
(19:07):
the best way to learn anything.
And I would follow newsletters and podcasts.
I mean, podcasts weren't super readily available 20 years ago; they really took off over the past five or ten years.
But whatever was available: SANS had their NewsBites feed.
And after 2010,
(19:29):
things started to really come out in terms of open source, and a lot more information became readily available that you could get your hands on.
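A minimal sketch of the binary and hexadecimal drill Brad describes, the kind of conversion you do constantly in packet analysis. The bytes below are made up (they just mimic the first four octets of an IPv4 header), so treat this as an illustration only:

```python
# Made-up sample bytes shaped like the start of an IPv4 header:
# 0x45 = version/IHL, 0x00 = DSCP/ECN, 0x00 0x3C = total length (60).
sample = bytes([0x45, 0x00, 0x00, 0x3C])

for b in sample:
    # Show each byte as hex, binary, and decimal side by side.
    print(f"0x{b:02X}  0b{b:08b}  {b:3d}")

# The first byte splits into two 4-bit fields, just like a real IPv4 header:
version = sample[0] >> 4      # high nibble: 0x4 -> IPv4
ihl_words = sample[0] & 0x0F  # low nibble: header length in 32-bit words
print(f"version={version}, header length={ihl_words * 4} bytes")
```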
Speaker 1 (19:39):
Hmm, yeah.
It's always interesting, you know, when someone comes to me and says they want to get into security, how do they do it and whatnot. I actually take the approach of trying to convince them not to get into it.
Because if you're so easily swayed by what I'm going to tell you, you probably shouldn't waste your time getting
(20:01):
into it.
Like, I was telling a buddy of mine, he owns his own franchise of some business and whatnot, and he really enjoys it.
I was telling him how a group of developers, I'm talking about 150 developers at my day job, were trying to convince me to put in an exception that would
(20:22):
essentially bypass our entire WAF.
They knew what they were doing, but they were wording it in such a weird way that I was asking a lot of questions, and because I didn't understand what they were asking, I didn't understand the reason, I didn't understand any of what they were trying to say, because they were literally trying to just bully me into
(20:43):
accepting it.
It was literally 30 minutes of me asking questions, and then it finally got down to the answer where they couldn't escape it, and they were like, oh yeah, we just want to get around this whole thing. And I'm sitting here like, these are adults.
These are grown adults.
(21:04):
We're paying them a good amount of money, and they're trying to get around our security tools that are protecting them from themselves, to some degree even, which is frustrating in and of itself. And I had to respond in a very stern way, saying no, we're not doing it, you need to
(21:26):
suck it up.
I don't care how much time this is going to cost you or anything else like that.
You need to figure out how to handle it.
And if you have a problem with it, I'll just bring it up to your director, because that's who I got the approvals from.
That's who told me you had the time.
It was your director, it was your VP, it wasn't anyone else, it wasn't your manager.
My buddy, though, who's not even interested in getting into
(21:47):
security, he heard that story and he's like, yeah, I would never want to be in that industry.
I don't blame him.
It takes a certain kind of person to really look at that and be like, yeah, but it's worth it. And that's
(22:08):
who I want in the industry, because there's a lot of people that get burnt out.
You've got to be able to be mindful of your mental health and maintain that properly.
But it's a great field that really anyone can get into if they want to get into it.
It doesn't matter where you're at right now. Like you mentioned with those police officers, who's busier than
(22:29):
police officers, right?
I mean, I tried to go into law enforcement out of college, and I'm very glad that did not work out.
I think I had no more than two brain cells at the time, and the riots started kicking off, I think it was St. Louis or something, and I saw that and I'm like, I just feel like I don't want to die
(22:53):
doing a nine to five.
I just feel like I need to find something else I'm passionate about; it just didn't add up to me.
So I'm like, okay, let's go some other route, and then it just escalated from there.
It just got worse for them from there.
But who has a more rigorous schedule outside of the military?
(23:13):
These guys are driving around, they're patrolling all day long.
They're putting their lives on the line all day long, and then you want them to come home, have family time, decompress, and study.
Speaker 2 (23:24):
I mean, that's a large ask, right? Yeah, it's tough, it really is. And that's ultimately why I left the military.
You know, it wasn't very safe, and I knew that my body wasn't going to be able to keep up with it, and I'm still dealing with a lot of that. It's tough, and part of the
(23:45):
problem with transitioning was the decompression.
So, for all the soon-to-be veterans or current veterans, either first responders or military, regardless of where you are in the world, you just have to be conscious of that decompression sickness, as I call it, when you go from 100 miles an hour to not so much.
(24:06):
That was our problem.
Being at that operational tempo for so long, I forgot what normal was.
I mean, it's like drinking two pots of coffee every day, down to nothing overnight.
You go 100 miles an hour and then just slam on the brakes. And I started off as a
(24:29):
shift supervisor. I went from basically being in charge of all this stuff, millions of dollars of equipment flying around the world, in charge of 116 people, seeing pretty much everything under the sun, as a junior-level officer in the Army, to a first-shift supervisor in a security operations center at SecureWorks, in charge
(24:50):
of like six people.
I was like, oh, this is different.
Speaker 1 (24:55):
Yeah.
Speaker 2 (24:57):
And then you start looking around, like, what can I fix around here?
And before you know it, I ended up in charge of like 30 people, then moving up, then eventually moved to product management, and eventually kept going up the ladder.
So eventually it ended up leapfrogging.
(25:18):
But it was definitely a hard transition, for sure, mentally more than anything else. And I don't recommend the 100-miles-an-hour pace to anybody. If you've seen that movie American Sniper, when Kyle is sitting there in the doctor's office, just sitting there, and his blood pressure's through the roof, it's like, you know,
(25:38):
that's how you get hypertension, you get high blood pressure, because you get used to that tempo of going a million miles an hour. But yeah, it really comes down to dedication, dedication to your craft, and as a hiring manager, those
(25:59):
are the things that I look for.
Anybody that's ever worked with me before knows that in my interviews I always used to ask a variation of this question: in one minute or less, tell me 10 things that you can do with a number two pencil.
And typically I wouldn't hire anybody that didn't try to get
(26:22):
all 10.
And that's the whole point.
I don't really care what you're going to use the pencil for.
It really breaks down to: are you going to give up or not?
I've had people give up at five.
I'm like, you can't think of any more, you're just going to give up?
You have 30 seconds left. And so if you can't get through an
(26:42):
interview question for literally 60 seconds, then that tells me something. I started off doing two minutes; there are different pros and cons to doing one minute versus two minutes.
I'll sit there and I'll just be quiet the whole time and let them list off what they have.
It's just interesting, the answers that people give.
It's an interesting question to ask.
(27:04):
It really shows you how people go about a problem and how they do it. And the people that ended up getting 10 out of 10, I've seen people rattle off 10 out of 10 in 30 seconds.
Those are the type of people you want to hire, because you know when the stuff hits the fan that they're going to be
(27:25):
the ones that really push forward.
Because really it comes down to problem solving.
If you have a custom routing issue in your Cisco firewall, or whatever, with custom rules that you've built, you're not going to find that on Google.
You're not going to find that in the Cisco manual.
You built it.
You made your bed.
(27:47):
Now you lie in it.
So you're going to have to figure it out.
You put custom code on your machine, only you know that code.
It's like if you bypass the WAF and you built some custom tool or whatever to do X, Y, Z, given the example that you gave, that's on you, that's shadow IT.
That's an unsanctioned app. Don't be calling the help desk about
(28:10):
this, because you jailbroke your laptop or your device so you could play some video game or watch sports on your work laptop or do something naughty. That's on you. And that's a systemic problem right now across the board that I
would say arguably over 40, if not 50, percent of organizations have right now, and it's probably even
(28:33):
substantially higher because of all the adoption of artificial intelligence.
So you'll look it up, like, what are the number one applications in the environment? And it'll be like Microsoft, it'll be Adobe, it'll be whatever, and it's like, no, it's actually AI.
It's just no one's talking about it.
(28:55):
People are using it basically as a competitive edge against their own team. They'll be assigned a project and they'll do it five times faster because they used an AI tool to do it, but they're not going to admit that they used an AI tool for it. And that's kind of what's happening: people are using these unsanctioned apps,
(29:17):
these web apps; all they have to do is open a Chrome browser. And now we're getting to really scary stuff where you have AI avatars that are really hard to discern from the real thing.
Speaker 1 (29:31):
Yeah, I actually know someone that very recently had someone interview and apply for an open role at their company, and they passed the interview, they got hired, and then someone else basically showed up. They had something going on with their screen where it
(29:52):
looked like the person that had interviewed, but the answers that they were giving were just completely off the wall.
It was wild. They thought they must have had some AI listening in on the conversation and telling them what to say and stuff like that.
It was pretty crazy.
(30:13):
I've heard of that, but I've never talked to someone that has actually gone through it and experienced it firsthand.
Speaker 2 (30:23):
AI is turning into a crazy world where everything now has a significantly higher attack surface. Yeah, and it's gone through the roof, and part of what's driving this forward is obviously the economy.
So the last two years have not been very forgiving.
We're definitely in the post-COVID economy now.
(30:47):
So we had basically the massive boom that occurred in '21 and '22.
And then mid-'22, we started to hit a critical point.
We saw sales cycles take longer, purchasing decisions were greatly reduced, budgets were reduced.
A lot of layoffs, a lot of trimming of the fat, a lot of shift away from rapid hypergrowth towards better margins and
(31:10):
better profitability, a lot of hoarding of cash.
Then the introduction of OpenAI, you know, in Q1 of '23, they really started to take off, and really the
(31:38):
only way to get funding right now is you have to have an AI story, and it's been that way for over two years now.
So hat tip to the investment community for being that forward-thinking.
Going back to '22, you had to have an AI story in order to influence them.
So over 80, if not 90, percent of all tech companies and cybersecurity companies have it as part of the roadmap to
(32:00):
have AI embedded into their solution.
So it's like a board mandate. And then, concurrently, from an adopter perspective, over 80% of organizations are evaluating it in some way, shape, or form, and it's a bell curve in terms of that maturity level. What's driving a large portion of it is sales and marketing, as well as operations.
It's a lot of low-hanging administrative tasks, but some
(32:24):
companies have gone all in, and everything has to be AI-driven or AI-assisted, depending on what they do for a living.
And part of the problem, and Gartner's devoted a whole subdivision of studies to this. For folks that don't know who Gartner is, it's a research analyst firm.
They do market insights and they're the leading provider, so
(32:45):
they have this space where they track basically AI and the trust behind it and the overall adoption. And basically the challenge that we're running into is over 80% of organizations are evaluating this and, unfortunately, like one third of all implementations are resulting in a data breach
(33:06):
because of the expanded attack surface.
So we're rushing towards this goldmine.
It's kind of like the early days of the internet, there was no protection, and we're implementing things like this. And now we have scenarios where, like Anthropic, which is the OpenAI competitor, their new bot that they have, Claude, controls
(33:27):
your entire computer.
Does that sound safe to you?
I don't know.
I don't even understand why that's even remotely a good idea.
I understand the use cases of it, but just because you could doesn't mean that you should.
And it was a similar issue with Microsoft and their AI Copilot.
(33:51):
When they launched it, they were taking screenshots of your desktop.
Talk about an invasion of privacy and a violation of GDPR and all the other privacy rules and protections.
Part of the problem, because of this rush, this giant gold rush and drive towards AI and efficiency and profitability without any kind of forethought, is not
(34:12):
giving any thought to the ramifications of this.
So it's resulting in data breaches.
It's resulting in stolen intellectual property, data exfiltration.
It's resulted in a lot of these impacts; over one-third of these implementations have resulted in that kind of issue.
And then that's compounded by the challenge of it
(34:35):
being forced upon you. So, like, I'm an Adobe user, and all of a sudden the AI assistant pops up. It's like, I didn't enable you, I do legal stuff.
I don't want that.
No, I don't want to have the default opt-in, I don't want to
(34:57):
have the default opt-out.
And then there was another issue with LinkedIn, where they were using all of the data for their AI mechanism and tool and their learning capabilities.
Speaker 1 (35:09):
I don't want that.
Speaker 2 (35:10):
And then even NVIDIA
was caught scraping YouTube data
and YouTube video for their AI.
It's like, I don't want my own videos on YouTube getting scraped and embedded in your NVIDIA AI system, sorry.
So there are no rules and regulations anymore.
And Gartner came out and said that basically over 92 percent
(35:31):
of organizations have some form of unsanctioned apps running in their environment.
That's a very scary statistic, and it's not slowing down, and it's leading over into AI basically being used for social engineering.
It's being used for phishing, and it goes
(35:52):
right down to the layers of business email compromise. I mean, I don't know if you've used a technology device in the past couple of months, but automation is everywhere.
Every day I get spam calls, I get spam email.
LinkedIn is now flooded with this crap.
(36:13):
It's overwhelming amounts, and it's really clever; every phishing email is now a spear phishing email.
It's like, is this real or not?
It's really difficult. And part of what I do on the side, part-time work, is expert witness.
So if you've ever watched the movie My Cousin Vinny, it's like you get on the stand and basically you evaluate and
(36:37):
provide an expert opinion. To show you how far I've come, I've now had several cases with Fortune 500 companies, and part of the cases that are coming up now is fraud using AI. I've listened to recordings and reviewed case files and things like that where AI was
(37:00):
used.
It's interesting what they do.
So these tools have the ability, unfortunately, to listen to, like, this podcast or a YouTube video or any kind of snippet of your voice and then create an avatar of it.
And then what they'll do is they'll call in to a bank, Bank of Whatever, and basically they'll get your information from the
(37:21):
dark web or wherever, piecemeal it together so they have your banking information, or whatever, enough information for them to call in, socially engineer the end user, and mimic your voice. And I've actually listened to the recordings, and the defendant or plaintiff in the case, if they're suing the bank or whatever, will actually say
(37:43):
on there, like, that's not my voice.
And part of the problem is the detection of that is not at the level it needs to be.
The technology creating it is, like, 10, if not 100, times more advanced than the detection mechanisms.
So what do you do to protect against these things?
Because the technology is evolving so fast, but the
(38:04):
prevention mechanisms aren't there.
Speaker 1 (38:07):
Yeah, you said it pretty well. It seems like everyone is racing towards this thing, but we don't know the implications.
I mean, we kind of know the implications, but I feel like AI is still in its infancy, and we're running into all these issues, all these security problems, and I haven't
(38:29):
heard a really good answer for them.
The core question is, how do I take an AI model or an LLM, put it into my company's environment, and use it to assist my employees to do better work in that environment?
But how do I do that in a secure way?
How do I ensure that, if they put private data into
(38:53):
it, by accident or intentionally, that it's protected? How do I ensure that no one else can query that data?
All these different things, we don't have any answers for them right now, and all of the things that you and I just mentioned are huge breaches of these different regulations that, you know,
(39:15):
companies pay, I mean banks pay, billions of dollars, not even to get around those regulations, but to make sure that they are abiding by them to the fullest extent.
So now we're going into a situation where it's like, yeah, that $2 billion that you just spent on security for your
(39:37):
organization, yeah, it's probably going to have to 5x. I mean, that's insane.
Speaker 2 (39:44):
Yeah, it's absolutely insane.
(40:07):
I mean, at Morphisec, and this is why I love the company, we're trying to at least mitigate a sliver of this larger problem.
A large portion of it ends up landing on the endpoint.
So that's where we sit.
It's an agent-based solution, and we basically help with getting visibility down at the endpoint level.
How we go about it is: we identify different vulnerabilities at the endpoint and in the different controls.
We monitor and validate that security controls are working properly.
We identify high-risk software.
(40:27):
So if you don't want something to run in your environment, you have the ability to detect it and then see who's running it.
So if you're bypassing a WAF or you're bypassing some other control, the endpoint's going to pick it up. And then we catch any kind of misconfigurations that are there.
So those are the four major tenets, and it's all on a
(40:48):
continuous basis.
It has to be continuous; it can't just be a routine scan. We call that adaptive exposure management, being able to have visibility into all of that.
So that's kind of the outer ring of what we call blast radius resiliency, or adaptive cyber resiliency is another way to think about it.
(41:10):
And if you think of a blast radius, whether it's an earthquake or, since we're talking about natural disasters, whatever else: the more you can reduce that blast radius, the more it's contained to a smaller circle, and you can really mitigate the impact.
So it's just that endpoint, or that application, or that subsector that you're monitoring, and
(41:31):
really you want to be able to prevent that infiltration from happening.
So we have something called infiltration protection as well.
We're looking at the different memory components, we're looking at privilege escalation, we're looking at credential theft protection, hacking tool mechanisms.
So if you're using a certain AI tool that's been flagged as
(41:52):
malicious, or known to create these different avatar mechanisms, then we're able to mitigate the different ways that these attacks can get in, like prompt injection, or executable code execution, or manipulation of memory. So we might not be able to stop the first components.
(42:14):
But from the actual infiltration or impact perspective, actually hitting that point where it's going after your crown jewels, going after the intended target, we have those layers. And the last layer that we have is impact protection, mitigating against that as well.
So it's kind of a three-layer, defense-in-depth approach to preventing these types of attacks, as well as things like
(42:37):
ransomware, preventing ransomware and advanced-level attacks, regardless of what you have operating in your environment.
So it's kind of a standalone element in addition to the other endpoint security or application security elements that you have. And really what it breaks down to is you have basically an AI war happening right now.
(42:59):
All of the offensive tools and defensive tools are AI-driven now, so that's a threat model versus a threat model.
So what happens when that cat-and-mouse game comes to a head? You have to have that layer, that safety net, that additional layer that protects against it, and that's kind of
(43:20):
where we come in, and the core value proposition that we have, at least within the vein of AI security.
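A rough sense of what "continuous, not a one-off scan" can mean at the endpoint, as a toy sketch only: it polls running process names against a hypothetical unsanctioned-software list and a list of controls that should be present. The names, lists, and interval are assumptions for illustration; this is not Morphisec's actual agent or API.

```python
import time

import psutil  # third-party library: pip install psutil

UNSANCTIONED = {"torrent-client.exe", "unapproved-ai-helper.exe"}  # hypothetical names
REQUIRED_CONTROLS = {"MsMpEng.exe"}  # e.g. an AV/EDR process you expect to be running

def running_processes() -> set[str]:
    """Names of all processes currently running on this endpoint."""
    return {p.info["name"] for p in psutil.process_iter(["name"]) if p.info["name"]}

def check_exposure() -> list[str]:
    """One pass: flag unsanctioned software and missing security controls."""
    running = running_processes()
    findings = [f"unsanctioned software running: {name}" for name in UNSANCTIONED & running]
    findings += [f"expected control not running: {name}" for name in REQUIRED_CONTROLS - running]
    return findings

if __name__ == "__main__":
    while True:  # continuous, not a single routine scan
        for finding in check_exposure():
            print(finding)  # a real agent would alert or enforce policy here
        time.sleep(60)
```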
Speaker 1 (43:28):
It's a different approach.
It's interesting, yeah. When you put it like that, we have defensive and offensive tools with AIs that are going up against each other.
It's almost like a recipe for disaster, right? It is.
Speaker 2 (43:43):
Well, yeah, and if you look at, and I don't want to pick on any leading vendors or anything like that, or call anyone out on the podcast, but they're all touting the same narrative: we have the special, super-duper AI that's coming in, and we have all these algorithms, and we're able to detect the things that no one else can,
(44:06):
and all these other things. And it's like, okay, what if, as a threat actor, I poison that algorithm and I'm able to get in there and basically manipulate it and change your signatures?
That's why traditional antivirus is a dead and obsolete solution: because threat actors figured out how to change the signature types.
(44:27):
And one of my favorite stories is, early on, I was doing some forensic analysis and I noticed that a threat actor had changed the mechanism. Just to simplify it, the rule said, this is spam, stop it. And they changed the content to read like, this isn't spam, this is ham.
(44:47):
It's something that simple.
And then you've modified the signature. Not everybody listening here is super technical, but think of the way the rule set is written.
It's that simple.
If you change the algorithms and the main elements of that, you're able to tamper with it.
You're able to bypass it.
(45:07):
Every security tool now can be bypassed.
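To make the spam-versus-ham story concrete, here is a deliberately naive sketch of a string-matching signature and how a one-word content change slips past it. The signature and messages are invented for illustration; real engines are far more sophisticated, but the brittleness being described is the same idea.

```python
# A naive "rule set": flag any message containing a known signature string.
SIGNATURES = ["this is spam"]

def is_flagged(message: str) -> bool:
    """True if any known signature appears in the message."""
    text = message.lower()
    return any(sig in text for sig in SIGNATURES)

original = "Hello! This is spam, click here to claim your prize."
modified = "Hello! This is ham, click here to claim your prize."  # one word changed

print(is_flagged(original))  # True  -> matches the signature
print(is_flagged(modified))  # False -> same payload, trivially bypasses the rule
```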
So what happens?
Where's that safety net?
It has to be something that's not AI-driven, that has that actual layer of protection that protects against it, especially if you're going all in on one vendor, because then you lose that heterogeneous diversification of having multiple vendors.
(45:27):
Because, unfortunately, with some of the bigger players, you have one AI. Yeah, there are subsets of that, but it's really one mothership that's running the whole scene by design.
So if you poison that lake, then unfortunately all the water's poisoned, and we're just waiting for that
(45:49):
Armageddon day to happen.
I think the recent CrowdStrike event that happened in July is a good testament; that was maybe 1% of what could have happened.
Take a vendor that's equally as big, make the blast radius that much bigger.
(46:09):
What if that had been a malicious attack?
What if they went after the core mothership?
How bad would that have been?
And then what happens if it was a SolarWinds-type event that went undetected for months, if not years?
What would the ramifications be?
Speaker 1 (46:29):
Yeah, we're going into uncharted territory, but it'll be interesting.
It creates a lot of opportunity for people to separate themselves and stand out from the rest of us: the people that know a little bit more about AI, that have that specialty, that have that knowledge that companies will be able to capitalize on because
(46:51):
they desperately need that skillset.
So it's an interesting time, for sure, in IT overall.
Well, you know, Brad, unfortunately we're at the end of our time here.
I'll have to have you back on.
It was a fascinating conversation.
I love how we went through everything; I thought it was really interesting.
Speaker 2 (47:19):
But before I let you go, how about you tell my audience where they can find you if they want to reach out and connect, and where they can find Morphisec?
Yeah, for me the best place is LinkedIn, so drop me a connection, happy to connect and talk there.
And then for Morphisec, just reach out on our website; there's a contact-us form on there, and we're happy to have a chat, teach you more about our solution, and let you learn more about us.
You can follow us on social media, on LinkedIn.
(47:43):
On Twitter, all the major sites.
And now it's X, sorry, X.
It's going to be really hard for me to learn that.
But X, and LinkedIn. Awesome.
So thanks for having me.
Yeah, it was good to catch up.
Speaker 1 (48:00):
Yeah, absolutely.
It'll be interesting. I wonder, we could probably do a part two about emerging AIs and how to protect against them and whatnot. Yeah, absolutely, I'd love to do a deep dive.
Speaker 2 (48:13):
So pick a topic and I could talk about this stuff all day long. I've come a long way in 20-something years now.
Speaker 1 (48:21):
Yeah, definitely. Well, thanks, everyone.
I hope you enjoyed this episode.