
January 17, 2025 48 mins

Data and security breaches are a dime a dozen nowadays, and despite their frequency, they’re still just as dangerous. That’s where Yasmin Abdi, the CEO of noHack, comes in. Despite her relatively short career, she’s already worked for some of the giants of the tech industry like Google and Snapchat. Along with Justin and Autumn, Yasmin breaks down real-world security challenges and solutions, a firsthand view into managing role-based access, phishing simulations for employee training, and the delicate balance between security and usability.


Show Highlights

(0:00) Intro

(0:32) Tremolo sponsor read

(2:04) How Yasmin built noHack

(3:15) Breaking down Yasmin's impressive resume

(4:17) What sparked Yasmin's interest in security?

(7:54) Yasmin's biggest challenge since starting noHack

(11:05) How Zero Trust has evolved over the past decade

(12:34) Balancing usability and security

(15:43) The problems with role-based access and how Yasmin's work addresses them

(19:31) Phishing schemes and AI's role in the future of security

(23:14) Tremolo sponsor read

(24:13) Yasmin's efforts to educate organizations on the dangers of phishing and poor security

(29:31) "Security theater" and the lack of serious education

(34:20) How to get people to take security seriously

(39:37) Yasmin's opinions on third-party scanning vendors

(43:17) How Yasmin would have handled the CrowdStrike attack

(46:52) Where you can find more from Yasmin

About Yasmin Abdi

Yasmin Abdi is the CEO and Founder of noHack, a cybersecurity company focused on delivering high-impact solutions for public- and private-sector clients, from startups to SMBs. Yasmin's expertise spans enterprise security, secure software development, vulnerability and risk management, threat detection and intelligence, security assurance and education, and privacy best practices. She has also shared her knowledge on major industry platforms (including Forbes, Cisco, and Voice of America) and has established herself as a leading voice in the cybersecurity space.


Before launching noHack, Yasmin led global security and privacy initiatives at tech giants like Google, Meta, and Snap. With over seven years of experience, she played a pivotal role as a founding member of Meemo, an AI-powered social finance app later acquired by Coinbase for $95M.

Links Referenced

Sponsor

Tremolo: http://fafo.fm/tremolo

Sponsor the FAFO Podcast!

http://fafo.fm/sponsor


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
I think something here that would be really important is like the principle of

(00:03):
least privilege, really ensuring that users only have the needed permissions
while they're doing their work that wouldn't disrupt their workflow.
Welcome to Fork Around and Find Out, the podcast about
building, running, and maintaining software and systems.

(00:36):
Managing role based access control for
Kubernetes isn't the easiest thing in the world.
Especially as you have more clusters and more users
and more services that want to use Kubernetes.
OpenUnison helps solve those problems by bringing
single sign on to your Kubernetes clusters.
This extends Active Directory, Okta, Azure AD, and
other sources as your centralized user management.

(00:59):
for your Kubernetes access control.
You can forget managing all those YAML files to give someone access
to the cluster and centrally manage all of their access in one place.
This extends to services inside the cluster
like Grafana, Argo CD, and Argo workflows.
OpenUnison is a great open source project, but relying
on open source without any support for something as

(01:20):
critical as access management may not be the best option.
Tremolo Security offers support for OpenUnison
and other features around identity and security.
Tremolo provides open source and commercial support for OpenUnison
in all of your Kubernetes clusters, whether in the cloud or on prem.
So check out Tremolo Security for your single sign on needs in Kubernetes.

(01:41):
You can find them at fafo.fm/tremolo.
That's T R E M O L O.
Thank you for opening up your firewall ports in your
head to listen to this episode with Yasmin Abdi.
Welcome to the show, Yasmin.

(02:02):
Thank you for having me.
I wanted to talk a little bit about security.
This is really cool seeing that you were saying that you built noHack,
you're the founder and creator of noHack LLC, and you were doing security
at Snap before that, but you built this on the side kind of as a project.
And what drove you to do this?
Why were you like, I want to do a side project
and start doing security for other people?
It initially started off with me just helping a few friends with their security

(02:26):
and their digital presence, as well as small business owners and startups.
I just saw a lot of vulnerabilities and how they
were securing their data and most of them were not.
And there's just like a lot of bad hygiene and bad practices.
So it kind of just started off as a hobby,
which helped people a few hours out of the week.

(02:46):
A few years ago around when COVID started and everything became digital.
And then it kind of just snowballed and grew from there.
So while I was at Snap, I was building, building, building, but never
really took it to be like a formal side company or a side
hustle, just kind of something to help friends and family.
And then over the past year, it just snowballed and kind of grew

(03:06):
into a full functioning security services and solutions company.
So I decided to leave Snap back
two months ago and focus full time on it.
Can you tell us about your resume?
Because it is very impressive.
It's baller.
Yeah, sure.
Um, I've had the opportunity to work at some of the big name companies.
I started my career off at Snapchat, where
I interned twice back in 2017 and 2018.

(03:30):
And then I went over to Google, was there for a few months.
I was a software engineer on the Android team, learned a lot.
I think that's every computer science and
software engineer's dream is to work at Google.
So had the, uh, had the opportunity to be there, learn from some of
the smartest people, and then also had the opportunity to, uh, work

(03:50):
at Facebook, which is what it was called back then, on the Instagram-specific team.
And that was back in 2019.
So yeah, I worked at Google, Meta, Snap.
I also was a founding member of Meemo, which was an
AI fintech startup that a few of my friends and I started.
And that was acquired by Coinbase back in 2021.

(04:12):
So had the opportunity to do all of the fun things at these places.
What did you do at Snap before you left?
Like, can you run us through like your security career?
I was like, how
did you get into security?
Like what, what started all of this?
I've always had a passion.
I tell myself that earlier on in my career, I always thought like a hacker.

(04:32):
So there's like a way to do things the normal way, you
know, like log into the wifi, here's your password and, and
you know, you're logged in and everything is good to go.
And then there's always a backdoor.
There's always another way to get into something.
There's another way to break into a system.
Um, so I always thought, Hey, like, how can I manipulate the system?
Like, how can I, I don't know if that's the
best response or if that's even legal, but, um,

(04:56):
totally valid.
Everyone has some use case where they're like, I wanted to hack a game.
Security
people used to do like some other stuff and
that's how they learned how to do security.
That's just white-hat hacking.
That's just, you know, well,
I mean, you have a personal bend on most,
like I did war driving back in the day.
Cause I wanted like free wifi, right?
I was like, Oh, my neighbors have wifi.
How do I get on that?
Right.
Yeah, I think for me also my parents would like shut down like internet access

(05:20):
after a certain time, like put passwords on the like desktops in our houses
and like just do things like that and I was like, no, like, there's no way
you're going to turn off my internet after a certain like, there's no way
you can block these certain TV shows and TV channels after a certain time.
So I was kind of for my own personal preference.
But at that time, I didn't know what I was like, I didn't know it was called

(05:40):
hacking and throughout my college career, or even high school, as early
as high school, I said, Hey, like, you know, I enjoyed doing those things.
And then I found out about job security, I found out this was something that
was going to be long lasting in the world that we live in.
So studied at the University of Maryland, um, took my first
formal cybersecurity software engineering course there.
Graduated back in 2019 with a bachelor's in computer

(06:04):
science and a focus in cybersecurity, and then started
my full time career in cybersecurity at Snapchat.
And at Snap, I was a software engineer, software security
engineer, working on developing internal tooling for the
security team, and then, um, moved my way up and then became a
manager and kind of led the Insider Risk Program before I left.
Y'all, she's like gorgeous and smart and got like a crazy career.

(06:26):
How did you do all that since 2019?
Like, did you sleep?
Like, what?
Like,
to be honest, no, I didn't sleep.
There was lots of late nights, but I think for
me, I've always just had a passion to learn more.
So even in times and days where I didn't think I was working, working like the
startup that me and my friends started, I didn't really think of that as a job.

(06:48):
I thought of that as more of a side hustle, like you
and your friends are working on building something cool.
And then, I mean, acquisition was the goal,
but I didn't think it would happen that fast.
I think that turnaround was like 18 months.
I had a lot of very senior people on my team, I'll tell
you that, like ex Google directors of search, et cetera.
So like, it was, it was a heavy, heavy group of people working

(07:10):
on that product, but still the turnaround time was, was amazing.
Super fast.
But yeah, I think like working on that project
with, with my friends, like you lose track of time.
You lose track of the days.
And then also when I was at Snap, like working on a product that
you care so much, you feel so much passion and you care about.
And then with Nohack, like I think I work seven days a week.
Like I can't even tell the difference between like work and not work

(07:32):
because sometimes I I work on stuff that's, like, super fun and engaging.
Like, I do a lot of public speaking.
I do a lot of panels.
I do a lot of conferences.
I travel the world now.
And it's, like, for work, but it's, like, fun work.
So, sometimes I get lost in the track of time
with all the different moving pieces that I do.
It seems like you really enjoy what you do and that
you just are born with the hustle and curiosity.
I
would say I agree

(07:53):
with that.
Thank you.
What's, uh, What's the biggest challenge you've faced so far starting
Nohack or maybe what's the biggest thing that you didn't expect?
So as a CEO, I'm learning a lot of non-technical skill sets.
So, um, when I was at Snap or when I was at Meemo or even Google and Meta, I
was learning very much software engineering or security engineering skills.

(08:14):
So it was very much within the engineering realm of things.
And now I'm learning about cap tables and investing and like how to
pitch and how to sell yourself and how to sell your company and like
how to form like partnership deals, which for me, I've always had
like the, I feel like I've always been born with the ability to
communicate, but I think it's like the selling aspect of

(08:34):
noHack is something that I've been learning and it's definitely fun.
I feel like I'm back in school where I'm learning something
all over again, from the ground up.
I feel like with, with an engineering, there's, everything's changing, but
there's, there's a certain way that things are changing that you can grasp on.
But like with sales or with business or with marketing, like I, I
tend to spend a lot of my time figuring out like all these social

(08:55):
media strategies and things that I've just never been privy to.
So I would say those are some of the things that, that
have been challenging, but they're fun challenges.
Cause I'm always learning and I feel like I'm a lifelong learner.
So it's fun to learn new, new areas.
What sort of software did you start building with?
Like, cause you mentioned you initially,
initially this was like friends and family.
And a lot of that's just like, here's a

(09:15):
webpage with a blog post or something, right?
Like use a password manager, put on TFA if you can, something like that.
And then at some point you have to like transition that into
like, Oh, if I'm going to help a company do this, I need tools or
I need some automation or I need some way to do this reporting.
What sort of things did you focus on first to do?
Like here's your public port scans or here's
a CVE report or what are you doing first?

(09:37):
So at noHack, we really started off with
penetration testing and vulnerability scanning.
So regular just assessing systems for weaknesses and vulnerabilities.
Typically, that's that's how we would start.
So, hey, like, you know, if you're a startup, primarily with like a
digital footprint, we would scan your infrastructure, your
systems, your architecture, your endpoints, like

(09:58):
your APIs, all of the things that you have digitally
connected to the world and scan them for vulnerabilities.
And then we would also do some red teaming.
So we would pen test and really see how we could maybe
break into your system or find weaknesses in your system.
So that's kind of how we started.
And then we built a bunch of other like services and solutions from that.
They can range anywhere between like AI threat detection and response.

(10:20):
So really utilizing a lot of like machine learning and a
lot of like the insider risks experience that I had to kind
of understand, okay, like, where is the threat happening?
Where are the pain points here?
And then what responses can we build?
So whether that's alerting, whether that's setting
up continuous monitoring and things of that sort.
So I would say we started with vulnerability assessment
and scanning, moved over to threat detection response.

(10:43):
And then now we really focus a lot on like zero trust architecture.
So making sure that we are never trusting and always verifying every request
that comes in; no matter if the user or device has access to any systems,
we always would verify and have very strict authorization and authentication.
So I think those are probably the three biggest things that we focus on at

(11:04):
noHack.
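The "never trust, always verify" flow described here can be sketched as a per-request check, roughly like the following. This is a minimal illustration only; the tokens, users, and function names are invented, not noHack's actual implementation.

```python
# Minimal zero-trust sketch: every request is authenticated (authN) and
# authorized (authZ), no matter where it originates. All data is made up.
ALLOWED = {
    ("alice", "read", "billing-db"),
    ("alice", "write", "billing-db"),
    ("bob", "read", "billing-db"),
}

def verify_token(token):
    """Authenticate: map a bearer token to an identity (stub lookup)."""
    sessions = {"tok-alice": "alice", "tok-bob": "bob"}
    return sessions.get(token)  # None means unauthenticated

def handle_request(token, action, resource):
    user = verify_token(token)                    # authN on every request
    if user is None:
        return "401 Unauthorized"
    if (user, action, resource) not in ALLOWED:   # authZ on every request
        return "403 Forbidden"
    return f"200 OK: {user} may {action} {resource}"

print(handle_request("tok-alice", "write", "billing-db"))  # 200
print(handle_request("tok-bob", "write", "billing-db"))    # 403
print(handle_request("stolen-token", "read", "billing-db"))  # 401
```

The point of the sketch is that no request skips either check, even from a previously seen user or device.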
How has Zero Trust evolved over time?
Like when I remember back in the 20 teens or whatever, like Zero
Trust was just like, Oh, at some level of your network, you need
to be able to figure out if this device is trusted on the network.
And you just do it with mutual certificates.
You're like, Oh, you got a cert, I got a cert.
We trust the signing authority.
We're fine.
Let's keep talking.
But when you talk about
authZ versus authN or whatever.

(11:25):
Like you're like moving into the application layer.
You're saying, Oh, you can do that at the network layer.
You can do it at the request load balancing layer.
You can do it at the application layer.
How has that changed over time for something that is a zero trust mindset?
I think it started off with adding like an additional layer of protection.
So when you think about adding 2FA or MFA.
And I think that's kind of like the early days.

(11:47):
And then now it's kind of evolved into like a continuous monitoring
approach where every single request that comes in, you're going to
verify the identity before allowing any type of level of access.
And then I also think that identity and access
management has also been increasingly important.
So always managing the user's identities and
permissions to minimize any unauthorized risks.

(12:08):
So there's frameworks like RBAC, so Role Based Access Controls, and
access management systems to really ensure that
employees, regardless of like their role, can only access
data that's necessary and needed for their work.
So I think it really started off with the MFAs and 2FAs
moved over to a continuous monitoring approach with that
identity and access management, managing users permissions,

(12:30):
and very granular to the RBAC framework that I mentioned.
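The RBAC framework mentioned here boils down to a role-to-permission mapping checked on each access. A toy sketch, with role names and permissions invented purely for illustration:

```python
# Toy RBAC: employees only get the permissions attached to their role(s).
ROLE_PERMISSIONS = {
    "engineer": {"repo:read", "repo:write"},
    "analyst":  {"dashboard:read"},
    "admin":    {"repo:read", "repo:write", "dashboard:read", "users:manage"},
}

USER_ROLES = {"dana": ["engineer"], "sam": ["analyst", "engineer"]}

def can(user, permission):
    """True if any of the user's roles grants the permission."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, [])
    )

print(can("dana", "repo:write"))    # True: granted via the engineer role
print(can("dana", "users:manage"))  # False: admin-only permission
print(can("sam", "dashboard:read")) # True: granted via the analyst role
```

Least privilege falls out of keeping each role's permission set as small as the job actually requires.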
How do you balance usability and security and kind of like educating people?
Because I feel like That is the hardest part, kind of
getting people to realize how important security is and why.
Because people just be like, well, I don't want to get
alerted all the time and I don't want to sign in twice and

(12:50):
this is so much harder, but trying to really educate them.
But also make it easy to use, but secure is always the best.
I just
moved one of my Google accounts to a passkey and I don't like it.
I'm like, Oh, this is more secure.
It's better.
Blah, blah, blah.
And like this, the sign in flow is just worse now compared
to one password auto filling my username and password.
And now I'm like, Oh, I got three.

(13:11):
I hate remembering all the ridiculous passwords I make, but.
Google, definitely, there is some sort of a bug in the passkey
that sometimes it doesn't always work the way it's supposed to,
but I really appreciate that Apple products use the same passkey.
Like, you know, you can use it from one
of your phones because it's all connected.
I do appreciate that my face is my passkey and it doesn't require

(13:33):
me to remember, like, passwords and constantly change them, so.
Yeah, I mean, I think that that's a core challenge
that's always being spoken about in security.
I think the overcomplex and restrictive systems can lead to like frustration.
I remember when I was at Snap, we had like four different layers of

(13:53):
authentication that we needed to get through to get into our Google account.
So it was definitely a lot.

(14:14):
One thing that always comes up is password complexity.
So if a system requires like a long, complex password, maybe not asking
for it to be changed every 90 or 180 days. I saw some
organizations doing that and I was like, hey, that's a bit too much.
That would be too annoying for me personally,
if I had to create a new password every 180 days.

(14:36):
And especially they're so long.
Yes.
I think there's like that battle where like you want
it to be usable because if not they're going to go around it just
like you said how you went around your parents stuff like I talk about
all the time my kids are going to end up working for the NSA to get
around all of the like you know like so people will go around it and
be super lazy and not use all the safeguards or try to get out of them.

(14:59):
But you want it to also be safe, so it's like the struggle.
Yeah, no, I agree.
I think something here that would be really
important is like the principle of least privilege.
So again, with going back to like role based access control,
really ensuring that users only have the needed permissions
while they're doing their work that wouldn't disrupt their workflow.

(15:20):
So the principle is least privilege, I think, is
extremely important when trying to find that balance.
And then I also think kind of creating like a human centric design, really
designing these security measures that are intuitive, minimally disruptive
to the workflow.
So something like a single sign on could be helpful, but
I think, yeah, it'd always be, it'll always be interesting

(15:41):
to kind of find what that balance is going to be.
One of my pain points of any role based access control system
is it's so hard to define a person as a single role, right?
Like most people, once they're, when they start their job, like your role is
clearly defined and in larger corporations, it might be easier to fit you in.
Like, this is your role.

(16:01):
This is the only access you ever have access to, but once they move
positions internally, they're now doing two, like a role and a half.
Cause they're like, Oh, I still do some of that stuff for that old job.
I have this new one.
And they switch again, or the team moves or org
charts move or the products move, whatever it is.
All of those roles get really, really messy once we try to
maintain them after six months or a year of real life experience.

(16:25):
Not even just that, but like packages, like, you know, when you're
like are responsible for certain code packages and having ownership
of the testing and the pipelines and all of that stuff, like there's
always some point you have to give somebody access to binaries or
something, but then how long can you give them access to binaries?
And like that
you're, you have a temporary role in this case, right?
Like here, you need this for a week or two.
I don't know.

(16:46):
And, and that just gets messy.
And a lot of times it's just like, Oh, I have, I have root access because I got
it three years ago and I still have root access and no one knew to take it away.
And trying to do it fast, you know, like where, Hey, I need this.
And you're like, uh, I know you need it.
Exactly.
So when you're in production, you're trying
to fix something that makes it so complicated.
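The temporary, "you need this for a week or two" access described above is usually handled with time-bound grants whose expiry is enforced at check time. A minimal sketch; the data shapes and names are assumptions, not any particular company's system:

```python
# Time-bound access grants: access expires automatically instead of
# lingering for years. All users, resources, and dates are illustrative.
from datetime import datetime, timedelta

grants = {}  # (user, resource) -> expiry timestamp

def grant_temporary(user, resource, days, now):
    """Record a grant that is only valid for the given number of days."""
    grants[(user, resource)] = now + timedelta(days=days)

def has_access(user, resource, now):
    """Check the grant and its expiry on every access attempt."""
    expiry = grants.get((user, resource))
    return expiry is not None and now <= expiry

now = datetime(2025, 1, 17)
grant_temporary("pat", "release-binaries", days=7, now=now)
print(has_access("pat", "release-binaries", now + timedelta(days=3)))   # True
print(has_access("pat", "release-binaries", now + timedelta(days=10)))  # False
```

Because the check happens at access time, nobody has to remember to revoke the grant: the "root access from three years ago" case simply stops working.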
Have you
seen easier ways to manage that or to, to change

(17:07):
that or, or just to make RBAC fit the real world?
Yeah.
So that's actually one of the services that
I built when I was a software engineer.
One of the, one of the internal services I built at Snap.
It was so hard to get right.
It was so difficult to get right.
So I'll start with that.
But we built a tool that allows us to know who has access to what.

(17:29):
So given like an employee's email address, it would show us everything they had
access to, when they got access, what role they had, what permission, et cetera.
Like what if it was GitHub, like what repository.
So it looked at internal services, it looked at external as well.
And I think for us, like, First, visibility
and awareness was the most important.
So we can't revoke your access if we don't know what we had.

(17:51):
So I think like kind of what you were saying around
like the temporary access or if someone changes teams.
I know I changed teams like two or three times at Snap and when I was building
this tool I was like oh wow I still have access to the old teams that I
had or I requested access for temporary for this one project and I still
have access from like a external partner that I don't even need access to.
So I think like awareness was the first part and then we built this

(18:14):
mechanism that had like, if you didn't use this access within, I think it
was 90 days or 180 days, then you most likely won't need it moving forward.
And then if you did, well, you'll just have to request it again.
So I think that's a way that it was more applicable to like real world.
Like I wasn't on that team for two years.
I didn't need that access.
So it was revoked.
Like that worked pretty well just because people wouldn't,

(18:37):
and if they needed it, they would just re-request it.
It's always really challenging because of all the lateral
movements within organizations, um, temporary access,
time bound access, or if someone leaves the company.
But what we did was we hooked it onto the Workday APIs.
So depending on like your role or depending on the org or the organization
that you were in, you would get whatever access was applicable for that.

(19:00):
But if you changed orgs, it would ideally drop or remove that access.
So early days of it, but it was working, it was
working well when I left.
So I
hope, I hope it still is.
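The "revoke it if it hasn't been used in 90 or 180 days" mechanism described above can be sketched like this. The data shapes and the 90-day window are illustrative assumptions, not the actual internal tool:

```python
# Stale-access revocation sketch: grants unused for longer than the
# window are revoked; the user can always re-request if needed.
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=90)  # assumed policy window

grants = [
    {"user": "yasmin", "resource": "old-team-repo", "last_used": datetime(2024, 1, 5)},
    {"user": "yasmin", "resource": "current-infra", "last_used": datetime(2025, 1, 10)},
]

def revoke_stale(grants, now):
    """Partition grants into (kept, revoked) by last-use recency."""
    kept, revoked = [], []
    for g in grants:
        (revoked if now - g["last_used"] > STALE_AFTER else kept).append(g)
    return kept, revoked

kept, revoked = revoke_stale(grants, now=datetime(2025, 1, 17))
print([g["resource"] for g in revoked])  # ['old-team-repo']
print([g["resource"] for g in kept])     # ['current-infra']
```

The inventory step (knowing who has access to what) has to come first; this sweep only works once every grant is visible in one place.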
Keeping those roles and org chart in sync is extremely difficult.
Not just that, but when you move from different job families, like going
from an SA role where you touch code, but you don't touch production code.

(19:21):
And then all of a sudden you're in production code.
Like I had so many permission issues just because it still
thought I was an SA when I was a dev and it was always confused.
What role do you feel like automation and machine
learning are going to play in the future of security?
Because you said that you do, you did work
on a, um, machine learning tool, right?
The biggest one is around like being able to detect threats faster and smarter.

(19:47):
So once you have like a vast amount of data and you can
kind of like see similarities and identify anomalies
in real time. And I think AI will definitely help with faster,
better, smarter, real time threat detection, responding to potential
threats, like blocking access if it's unauthorized, if it looks

(20:08):
malicious, or if you see incoming traffic in the network that
looks, it looks suspicious, it could stop it before escalating.
So I think that will be a high ticket area
where AI and automation will help a lot.
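The anomaly-detection idea described here, spotting a data point that deviates sharply from the baseline, can be illustrated with a simple z-score check. This is a toy sketch; real systems use far richer models, and the data and threshold below are made up:

```python
# Toy anomaly detection: flag samples whose z-score exceeds a threshold.
import statistics

def anomalies(samples, threshold=2.0):
    """Return indices of samples that deviate sharply from the mean."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # no variation, nothing stands out
    return [i for i, x in enumerate(samples)
            if abs(x - mean) / stdev > threshold]

# Hourly failed-login counts; the spike at index 5 is the "threat".
failed_logins = [3, 4, 2, 5, 3, 250, 4, 3]
print(anomalies(failed_logins))  # [5]
```

Once a spike like that is flagged in real time, the automated response (blocking access, alerting, escalating) can kick in before a human even looks.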
Do you feel like there's any areas where AI is going to make us more vulnerable?
Yeah. In the future, with us giving it access to so many things,

(20:28):
it will get better around social engineering and, like, phishing
and that area and realm of things.
I, even yesterday I was with a friend and they got like
a credit card fraud email alert when we were in Colombia.
And I was like, you don't even have, it was a Chase, Chase card.
I was like, I've never even seen you use Chase over the past four days.
Like, it's probably not real.

(20:50):
And they were like, yeah, Yasmin, like, I think it's real.
And I was like, Oh, okay, whatever.
And then they kept doing their thing 20, 30 minutes later and they
were like, yeah, they even have the same like four digits of like my card.
And I was like, I'm telling you, like, I don't
even, I haven't even seen you pull out a Chase card.
Like you shouldn't, this is phishing.
The email looked so real.
And then I think after, I think it was very fine tuned.

(21:11):
I don't remember what the exact detail was.
And he was like, Oh my gosh, this is actually phishing.
And I was like, I told you from the jump.
Like, I don't, like, I don't understand why you didn't listen to me, but I
think it'll just get really smart and really good at all of social engineering
and like phishing campaigns and spearphishing and all of those things.
I don't know how and where they got, like, our trip location, the card

(21:31):
form, the last four, all those details, like, the T was exactly right.
And then, you know, he almost fell victim to
it, but, but thankfully, thankfully I was there.
Saved the day.
But I think it, uh... Being your friend has to be a total flex.
But like, isn't it crazy though, with all the information that we
give out, like there's been so many times I've had to stop my friends

(21:52):
and they're like, I'm gonna go like, do one of those like, surveys on
Facebook and I'm like, you just gave eight people your passwords, but okay.
Like, you're just like, there's so many different ways, like people
are always giving their location on social media, then they're always
talking about how they're not home and I'm just like, Can y'all just,
yeah, I think you bring up a really good point too.
Cause like for as long as I've been adjacent to security and interested in

(22:16):
security, we've basically always told people like your instincts suck, right?
Like your passwords suck.
You are all of these things that you think are
unique or random and computers can't hack into it.
Like, nah, just don't trust any of that stuff.
Hand off all that stuff to a password manager, certificates,
all these other things that are external to you.
But when it comes to, like, these

(22:37):
phishing attacks and AI generation, none of them pass the vibe check.
If you like have any experience, right?
And like, immediately you're like, this vibe is off.
Don't trust it.
But they were like, no, no, the bank has told me I have to trust them.
And so I'm externally mounted, like all of my, all these
systems I have to go through to make sure I don't lose my money.
Right.

(22:58):
And that's like a big risk.
And that's like, but.
You know, that, that picture of that person has 18 fingers, right?
Like don't trust it.
Like there's some level here that you just
have to be able to like trust yourself.
But in security specifically, we've just
always told people they're terrible at it.
And now we're like reversing some of that.
Running Kubernetes at scale is challenging.

(23:20):
Running Kubernetes at scale securely is even more challenging.
Access management and user management are some of the most
important tools that we have today to be able to secure
your Kubernetes cluster and protect your infrastructure.
Using Tremolo Security with OpenUnison is the
easiest way, whether it be on prem or in the cloud,
to simplify access management to your cluster.

(23:42):
It provides a single sign on and helps you with its robust security
features to secure your cluster and automate your workflow.
So check out Tremolo Security for your single sign on needs in Kubernetes.
You can find them at fafo.fm/tremolo.
That's T R E M O L O.

(24:13):
Yeah, and I think that's it.
It's still, phishing is still the number one way that organizations get hacked.
It's always through people.
It's always through their lack of education.
So, I always try to help organizations educate their
employees through phishing, like mock phishing emails.
I actually set a campaign up at
Snap, where we would send mock phishing emails to employees just to

(24:36):
see what the click through rate was, how many of them clicked the link,
but then also entered their credentials in the link and then downloaded
files, and then we had some very nice follow up calls from that.
Like, look at this pie chart.
You all opened a PDF.
Wouldn't it be funny if, like, you sent it like an
email to see like what people would click on or whatever,
and then like a big pop up came and it was like, you failed?

(25:01):
That's
actually that's exactly what we did so like if they did click on the link
or if they downloaded it it would be like Boom, like you have failed,
like now you have a mandatory education training that you have to go to.
So it wasn't just like a simulation for us to kind of see like how,
like the posture and the health of the organization, but also like we
very much so sent them to like a mandatory training, um, and awareness.
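Tallying a mock-phishing campaign like the one described, who clicked, who entered credentials, who gets routed to training, is straightforward. A minimal sketch; the field names and data are invented for illustration:

```python
# Sketch of phishing-simulation reporting: click-through rate,
# credential-entry rate, and the list routed to mandatory training.
results = [
    {"email": "a@example.com", "clicked": True,  "entered_creds": True},
    {"email": "b@example.com", "clicked": True,  "entered_creds": False},
    {"email": "c@example.com", "clicked": False, "entered_creds": False},
    {"email": "d@example.com", "clicked": False, "entered_creds": False},
]

def campaign_report(results):
    total = len(results)
    clicked = [r for r in results if r["clicked"]]
    compromised = [r for r in results if r["entered_creds"]]
    return {
        "click_rate": len(clicked) / total,
        "credential_rate": len(compromised) / total,
        "needs_training": [r["email"] for r in clicked],
    }

report = campaign_report(results)
print(report["click_rate"])      # 0.5
print(report["needs_training"])  # ['a@example.com', 'b@example.com']
```

Tracked over repeated campaigns, these rates give the "posture and health of the organization" measure mentioned above.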

(25:24):
We keep leaning back on that, like, we
need to educate people to get beyond this.
But at the other end, we're like, we want machine learning to do the vibe check.
And at some point, like, I don't know that machines are going to
get the vibes, but people aren't getting the education either.
And so I don't know where that meets in
the middle of like, both these sides suck.
But not just that, but we also constantly
talk about, like, least privilege, right?

(25:46):
the principle of least privilege, but now we
want to give machines access to everything.
We've given like AI so much data, there's so many companies that are piping
their own data back into their AI, and then they're giving it privileges
to infrastructure, giving it privileges to data, giving privileges to
like their code bases and to writing their code bases, and I'm just like

(26:08):
I mean, I wish I knew more about it, but I'm like, how many safeguards
are in the different like, areas that these things aren't talking to?
People gotta get their jobs done.
And like, the AI
systems are the new Jenkins, right? Because the CI/CD systems were the place that every hacker went to attack, because they had all the credentials and all the access, all the automation.
That's what I'm saying.
Like, and like, just working in production, like, I think getting a degree

(26:29):
that was about, like, secure software development, I actually went to the
same school you went to, but the online, like, military version of it.
And it's wild, like, what people do in real life production,
because things don't always work the simplest ways.
Like you know what I mean?
Sometimes there is like a weird way that you have to give
something permission to do that or make it so it's automated so
you can release a bunch of versions at once or just something.

(26:50):
And you'd be surprised... I was on a business intelligence team, and they were testing on Redshift clusters, and I was like, what are we doing? I was the most junior person, and I'm like, can we... this is a bad idea.
There's so many different layers to what you can do in production, and sometimes

(27:10):
you have to do something quickly. And I'm just like, it's this bad when we know the principle of least privilege and we're humans, and then we're going to give machines access to all these different levels of data at the same time.
It's going to make it so much easier.
You hack one thing and you get the keys to the candy store for everything.
And I think that's why we set up systems in place at Snap, where we

(27:34):
would be able to see if you were putting anything into these AI systems. Like, we would send alerts on data exfiltration, so copying any source code or copying any documentation out. We couldn't flag it inside ChatGPT or any of these models themselves, but I think you're right.
If you hand them over your source code or anything
like that, and God forbid, I mean, I hope that no one's

(27:56):
storing keys or any credentials in code these days.
But how many
times... like, there's literally a bot that goes around, I mean, not Google, but GitHub, telling people you put your keys on the internet. Because how often do we do it by accident?
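The kind of bot being described, like GitHub's secret scanning, boils down to pattern matching over committed text. Here is a toy sketch with just two illustrative rules; real scanners carry hundreds of provider-specific patterns plus validity checks.

```python
import re

# Two illustrative patterns; real secret scanners use many more
# provider-specific rules (these regexes are the well-known public ones).
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_block": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_text(text: str):
    """Return (line_number, rule_name) for every line that looks like a leaked secret."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                findings.append((lineno, name))
    return findings
```

Run over every pushed commit, this is enough to catch the classic "keys on the internet" mistake before an attacker's scanner does, and attackers do run exactly this kind of scan against public repos.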
I remember I was sitting at Google Next and they were like,
we're going to, it's going to write your infrastructure.
It's going to write your app and then it's going to make a database.

(28:16):
And I'm just sitting there like, oh no. And then they exposed the EC2 instance name. And I was like, oh... like on the stage, at their keynote. And I was just like, my little security heart died. Like, I was like, y'all, this is like a 101 of what we

(28:37):
should not do in public.
And I think for me, the biggest things, like the easiest to catch, the most obvious vulnerabilities, are right in front of you. Sometimes people just overlook them. And a lot of these vulnerabilities that happen are sometimes the most obvious.
The biggest hacks are the ones that walked through the door.
Like Target got taken down by a literal least

(28:59):
privilege failure, because they gave access to a contractor. It's never something crazy. I think the only thing that we've really thought was really crazy was that guy who did the social engineering to make the maintainer really depressed, to get the binaries in. Yes, that was the only one. Think about it. Out of all the news, that dude, like, look, he

(29:19):
deserves to get like something named after him.
Like I was like, I, like, I can't even be mad at you.
That was the Trojan horse of 2024.
Okay.
Like, but most of the time they walk right in the door.
What's something, Yasmin, that you think is commonly said, like, you should do this thing, but it's mostly just security theater and doesn't matter?

(29:39):
Right.
Is there something where they're like, oh, this is the advice that the news will tell you, and you're like, actually, it just doesn't matter? Or something that a company's investing millions of dollars in, and you're like, you're probably not going to get the security outcomes that you want by doing that process?
You know, I think it just goes back to: there's a lot of compliance rules and regulations around mandating data and

(30:05):
maybe just, like, actual security education for companies. There are laws and regulations now where the government says, oh, you need to educate your employees. But sometimes the employees are just clicking through these docs and submitting okay, or fast-forwarding the video, not actually watching it. So I think there's a disconnect: we actually need

(30:27):
to educate employees, but how we are doing it is not actually materializing into anything that's beneficial, because I've surveyed so many people, like, hey, did you actually watch this or read through this? No, just click to accept, acknowledge, and move on.
And I think it just highlights a lot of policies around, like, privacy or data usage, data deletion, data retention, all of those

(30:48):
things that people just don't really... they think, oh, my data is secure, or they're not using my data, or they're not retaining it or anything like that, when in actuality they are. There are a lot of fine lines that people are misreading or not even reading.
So I think, oh, like a company doesn't have access to my snaps.
Like, do
they really not?
Are you sure they don't?

(31:09):
I was in SA and we have all this training, but the training for SDEs was different. And I remember getting on an SDE team and they were like, oh, this customer's having this issue. And then the other SDE was like, I'll just log into their account. And I was like, you're going to do what? Like, no, you're not. It's crazy, because, I mean, we all know... like, I love security and I think it's interesting, but I definitely

(31:30):
have gone through, like, one of those required learnings.
And I'm just like, this is so boring, but
how do we make better education though?
You know?
It's not only the fact that like, usually it's just
dry content that no one's really interested in.
That's what I'm saying.
And it doesn't really give you the real use case. Like, you know what I mean? I mean, all the cartoons and the silly situations they try to

(31:50):
portray. But most of the time, any security training I've been at is usually like, oh, fit this into your normal schedule, right?
Here's the 37 meetings you have this week.
Here's the things you have to get done for work.
Oh, and there's all this training thing, right?
So we're like, well, I'm going to have to do
this, you know, like as I'm doing something else.
And those are all the times that, like, I would hack into it, or I'd look at the JavaScript and change the timestamp.

(32:12):
I'm like, oh yeah, I watched this for 30 minutes.
Yeah.
I changed my system clock and we can fast forward.
I didn't even think
about that.
Justin.
They're all time based and you're just like, oh.
A computer doesn't know what time is.
I do.
Let me skip past these parts.
That's like the thing I learned about it: oh, you did client-side validation. You're an idiot, right? Like, we can bypass some of that stuff, because again, it

(32:36):
was only enough of a priority to get the checkboxes that people are trained, but not to give them time to learn something or a person to ask questions to, right?
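The bypasses being joked about here (editing the client-side timestamp, changing the system clock) work because completion is validated in the browser. A sketch of the fix, using a hypothetical `TrainingSession` that trusts only the server's own clock:

```python
import time

class TrainingSession:
    """Track required viewing time server-side; ignore what the client claims."""

    def __init__(self, required_seconds: int, clock=time.monotonic):
        # `clock` is injectable so tests can simulate elapsed time.
        self.required = required_seconds
        self.clock = clock
        self.started_at = None

    def start(self) -> None:
        self.started_at = self.clock()

    def complete(self, client_reported_seconds: float = 0.0) -> bool:
        # The client-reported number is deliberately unused: a user can
        # edit it in the browser's JavaScript or wind their system clock
        # forward, so only server-side elapsed time counts.
        elapsed = self.clock() - self.started_at
        return elapsed >= self.required
```

Injecting the clock also makes the policy testable without actually waiting 30 minutes, which is exactly the kind of shortcut the trainees were taking.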
Like sit down with someone like, Pair programming is a thing
because it's like, wow, we learned so much by just watching
someone else, an expert in their field, do something, or even
not even an expert, just someone else with a different approach.

(32:56):
They don't even want to invest time into pair programming, though. And, like, look at all the studies that show how fast that helps people to ramp up, and they're like, oh no.
I
mean, pair debugging is like the best
experience I've had in my engineering career.
It was like watching someone else use 18 different tools
to debug something like, what was that command you ran?
I'm writing that down.
I'm going to read the man page later.
This is amazing.
Which
is wild because like everybody can steal code from somewhere,

(33:17):
but debugging is like you will always have to debug something.
Yeah, I was, I was gonna agree.
I think that's why we, well, to the first point, that's why we did the real live phishing mock simulations, where it wasn't a manual or a video or a document that said, hey, I read this, but a real live simulation where, okay, you actually read the email, you

(33:40):
clicked on it, and then boom. And then especially when you CC their managers or leadership, and it's like, your org accounted for 30 percent of the failures in this simulation, and this could lead to how many millions of dollars, or how much user data could be exposed. So then leadership is like, okay, we actually really have to invest. And if you already got caught and your team and your org is

(34:02):
performing very poorly at this, it just becomes so much more impactful. So then people start to take it seriously.
So.
After you get caught in that kind of a pop up, you're probably going
to pay attention to that class that you got sent to and you're never
clicking on another email link that you don't know about again.
Is there an offset for that?
Like, cause you can't care about everything and you can't pay attention

(34:23):
to everything. But there are a set of things that, maybe this is more relevant now.
And I've been subscribed to Have I Been Pwned for I don't know how long. And I've gotten so many emails after so many breaches that I don't read them anymore, because I'm like, yeah, there's nothing I can do about this.
My data got leaked somewhere.
Someone else didn't secure it the right way, or

(34:44):
someone got a phishing attack and they got in the door.
I'm like, I can't do anything about this anymore.
Now it's just noise.
And at that point I stopped caring.
Originally it was like, I really care about these things.
Let me make sure every time I rotate my passwords, all that stuff.
And now I'm just like, I just don't have the time to care.
And I don't have the memory bandwidth to care anymore. How do we, like, eliminate, or not eliminate, but

(35:04):
just reduce the fatigue and help people focus?
Like you can't focus on everything.
I would say, um, if you have multi-factor authentication set up on your accounts, and you are not connecting to public Wi-Fi, and you have secure best practices, then you'll most likely be at less risk for these attacks, or all the noise that you're

(35:27):
saying you get from these different applications and stuff. I would say, yeah, always enable 2FA, MFA, secure best practices in your day-to-day workflow. And I think you could take a lesser look at some of these notifications. And also regularly update your password. Um, not every 180 days, but definitely something frequent.
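To ground the MFA advice: the one-time codes an authenticator app shows are just TOTP (RFC 6238), an HMAC over a 30-second counter derived from a secret shared at enrollment. A minimal sketch:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6, digestmod=hashlib.sha1):
    """RFC 6238 TOTP: HMAC the current time-step counter, then truncate."""
    if for_time is None:
        for_time = time.time()
    counter = int(for_time) // step          # which 30-second window we're in
    mac = hmac.new(secret, struct.pack(">Q", counter), digestmod).digest()
    offset = mac[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Because the code depends on a secret the phisher doesn't have and expires every 30 seconds, a stolen password alone stops being enough, which is why it cuts through so much of the noise described here.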

(35:51):
And then I think also, I mean, I'm not telling you guys, but maybe other listeners that are not aware: don't just update it with one extra character. I think that's the most obvious way for you to get hacked. And a lot of times your email has already been in databases where it's been compromised, so you adding one additional character is not really going to make it more secure.
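To make the "one extra character" point concrete: once the old password is in a breach dump, the tweaked version falls inside a tiny guess set that cracking tools try first. An illustrative sketch (the function and thresholds here are made up for demonstration):

```python
import string

def increment_guesses(leaked_password: str) -> set:
    """Guesses a cracker tries first against a known-leaked password:
    append one character, or bump a trailing digit."""
    guesses = set()
    for ch in string.ascii_letters + string.digits + string.punctuation:
        guesses.add(leaked_password + ch)
    if leaked_password and leaked_password[-1].isdigit():
        bumped = str((int(leaked_password[-1]) + 1) % 10)
        guesses.add(leaked_password[:-1] + bumped)
    return guesses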

(36:12):
One of my first and favorite projects when I started at Disney Animation was, they wanted to see, hey, can you use John the Ripper to find whose passwords are easy to crack? And I'm like, sure. Could you give me the LDAP dump? And they're like, oh yeah, here's literally admin access. Go get the dump.
Uh, and then get all the hashes from it and then see what

(36:33):
John the Ripper could do. And we had a render farm, and it was Christmas break, and we didn't have a lot of stuff to do. So I'm like, how much of the render farm can I use to start this John the Ripper process? Like, you can have a rack. And I'm like, cool, I get a bunch of machines. And the amount of passwords that were just very basic... things I would expect to be a password at Disney.

(36:54):
And they'd just increment a number. So I'm like, oh, these aren't secure at all. And that was 2014. And basically ever since then, I stopped knowing any of my passwords. I'm like, no, a password manager is generating everything. You know, if I know the password, I have 2FA on it, right? Like, we have to have some level of security. If I had to create this thing out of my head, it's not that random in there.
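What "the password manager is generating everything" means in practice is drawing each character from a cryptographically secure RNG rather than a human head, e.g. with Python's `secrets` module:

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    # A CSPRNG pick from ~94 symbols per position: nothing a human would
    # invent from memory, and nothing a wordlist-driven cracker like
    # John the Ripper will reach by rules and mutations.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

At 24 characters over a 94-symbol alphabet, the search space is far beyond any render farm, which is the contrast with the incremented Disney passwords above.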

(37:15):
And so, yeah, having that is one of those things where "security best practices", that phrase, is always really hard for me, because it always depends, right? It always depends on the context, on what the information is, on what actual system you're using. If this is an internal AI system at Snap, I have different best practices compared to, you know, a forum login that is a throwaway that I don't care about.

(37:40):
Yeah, I agree.
I was just going to add to that.
I mean, it definitely depends on what context you're speaking
about, but password managers, like you mentioned, I think
something that's always really important is endpoint protection.
So always making sure updates are in sync. Um, you have security patches, firewalls, antivirus, anything like that is super important.

(38:00):
I know a lot of people probably are familiar with, uh, password managers, but not as much with, hey, we're not just sending you these pop-ups because your device is not updated; it's probably some security patches that need to be updated as well.
So.
In my opinion, one of the best and worst things that Microsoft did for the security ecosystem is reliably releasing updates on the second Tuesday of the month.

(38:22):
And I was a Windows system admin when that was happening.
It was always, oh, second Tuesday's here, we've got to go through tests. We would block out time, because they were predictable. And then we could say, oh, I can build predictability into my schedule for how I'm going to roll these out, where I'm going to roll them out, how I'm going to test them. But the downside of that is they weren't equally prioritized, as in, sometimes there was

(38:43):
a zero day that was actively exploited across the world.
And it just came out normally on a Tuesday. It's just like, oh yeah, also Excel crashes once or twice, right? It's just like, oh, this thing is critically important, and then this other thing.
And I can't tell you how many times I've been in situations where the infrastructure wasn't kept up to date, and that helped us not have a CVE, because the CVE was in the recent four

(39:07):
releases and we're like, oh, we're six versions old. We're good, right? This wasn't introduced yet. That bug, that CVE, that security hole that was being critically exploited somewhere... now we don't have to update, because we were never vulnerable. And I can't tell you how many times that has happened to me. Giving it time to bake and letting somebody else find all the bugs. A lot of big things do that though.

(39:29):
Like they don't let you update right away.
Like they will look, definitely let it bake
and see if other people exploit it first.
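The "we're six versions old, the CVE was never introduced" check is just an affected-range comparison, which is how advisory feeds express it (introduced/fixed version events). A simplified sketch, assuming plain dotted-numeric versions:

```python
def parse_version(v: str) -> tuple:
    # Simplification: assumes plain dotted-numeric versions like "1.6.3";
    # real schemes (semver pre-releases, epochs) need more care.
    return tuple(int(part) for part in v.split("."))

def is_affected(installed: str, introduced: str, fixed: str) -> bool:
    """Vulnerable only if introduced <= installed < fixed."""
    return parse_version(introduced) <= parse_version(installed) < parse_version(fixed)
```

Parsing into integer tuples matters: comparing "1.10.0" to "1.9.0" as strings gets the answer backwards, and that is a classic source of scanner false positives.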
What do you think about making, like, third-party scanning vendors better and not getting so many false positives? Cause it seems like the more automated we get, the more of those we get.
That's a good question.
I just also wanted to quickly add on to the, um, previous point.

(39:53):
I think at Snap we actually would shut down access for you to log in if you didn't update within a certain time. I know IT was very, very, very big on, hey, there's this zero-day happening, and, you know, it's all managed software from the company, so you will not be able to log into your computer, or you won't be able to do anything on

(40:14):
your computer, until you update.
So that's, that's interesting that you said that.
We used to get logged out, cause we'd close our computers on a Friday, and Monday you're like, oh...
The amount of time I spent fighting Amazon's Acme system internally for updates, because they were so aggressive on doing every piece of software update all of the time.

(40:34):
And if you didn't do it after, like, three or four days, it's like, yeah, you can't get email now. Stop what you're doing and update. And I'm like, wow, this is on the extreme end.
And
I spent more time doing that than writing code.
Yeah, exactly.
I can't tell you how much time I was waiting for my system to update, and on chats because it wouldn't work, and things were, you know, oh look, these new five updates you rolled out don't work together,

(40:55):
and I need that fourth one or whatever. It's such a hard balance to keep: we need to keep it secure, we need to keep it compatible, and also give people time back so they don't have to think about it.
And I do think, beyond what Microsoft did with keeping it predictable, what Google did with Chrome and ChromeOS of

(41:17):
making updates more immutable, of saying, hey, we're doing whole patches of systems that roll from one image to another, and if it fails, we can roll back. You could never roll back with Windows, you can't roll back with a Mac, and those things make it really difficult. The downside is you have to reboot, and no one wants to reboot. But the

(41:38):
bonus is, oh, I know this is safe to try, because if it doesn't work, I always have a fallback.
Java did something similar, not so much for rollbacks, but they made the release cadence shorter so people would no longer get stuck on giant updates.
Yeah.
So like after eight, we learned our lesson and they were like, okay.
Really?
Eight?
Sorry.
Like eight will die when the universe nukes itself, okay?

(42:02):
Like that's when it'll die. But the release cadence made it easier to release software more regularly, and it also meant you're getting new LTSs that are long-lived enough for people to want to switch to them, while at the same time the versions weren't so different that they were hard to move between.

(42:24):
I think going back to the question around the
third party scanning, it's super critical.
It's a critical component for modern cybersecurity practices.
But I think that there's also a lot of supply chain risk that's introduced.
I'm not sure if you guys are familiar, if you heard of the
SolarWinds attack that was back maybe a few years ago, but

(42:46):
that originated from vulnerabilities in third party systems.
So I think that having these third party scanning capabilities is super
important, but we also have to remember that it increases the attack surface.
So as you're integrating more third party solutions, those
potential entry points for attackers increases significantly.
So there's a shared responsibility. There are a

(43:08):
lot of benefits, but there are a lot of increased vulnerabilities when you just think about all the new entry points that, um, attackers have.
How would you have fixed or changed CrowdStrike?
This is a good one, this is spicy. Just a little spicy, but this is mild compared to his normal shade that he throws at cloud companies.

(43:30):
I'm
just... I would love to hear some insights on what you think is something that could have been done differently or should have been done differently.
Fundamentally, the testing could have been a lot better, but I also think that they should have had a layered approach for monitoring, maybe combining some type of endpoint detection or some

(43:50):
type of network traffic analysis or behavior analysis as a way to detect these anomalies in their system. But just going back to it: how could they miss something as fundamental as testing? For a company that big, for them to be at fault at that level.
Because it changed the
behavior.

(44:10):
You know what I mean?
Like it changed such a behavior that, like, you know your product, you know it's running in airports, you know it's running on things that can't be rebooted, that don't have keyboards.
You know what I mean?
So like, I'm just like.
I don't know.
I feel like we all have use cases and bugs
that you can't account for every now and then.
You know, it gets so out of the realm on how a user is
going to use it that like, we all have our issues, but

(44:32):
that wasn't even like a user using it in a weird way.
Yeah, I agree.
I think that they could also have like a better incident response approach.
Maybe if they had more speed or clarity of response during that incident, that would have helped a lot.
So yeah, I think there's a lot of ways in
which that they could have made this better.
Just to play a little devil's Advocate here.

(44:53):
The thing is, the bug they had crashed roughly 1 percent of Windows clients.
I would never, I don't ever test my software enough that
1 percent of my customers could not be affected, right?
There's always this edge case of like, how
thoroughly can I test something like security?
And yeah, it's in all these places and all this stuff is, is obviously bad.

(45:13):
And I think the global deployment of the thing was a YOLO moment for them, of just like, here it goes, it's tested on my machine, it works on my machine, and then 1 percent, 8.5 million Windows devices, crash from it.
Which, again, like, it just seems like a
really small edge case in a lot of ways.
The way they
dug into it though, it just seemed like there were so many opportunities.

(45:36):
Anytime I'm looking from the outside on anything, I'm
like, Oh, this is, this should have been easy, right?
Like, Oh, I could have figured that out.
Right.
But like, when I really look at those, like edge 1
percent edge cases, I don't know that it would matter.
Yeah, I agree.
I feel like there are so many ways and so many lessons that they could have, well, now have, learned. But yeah, that

(45:57):
was... I think one of the really interesting outcomes from this is the fact that Microsoft is giving out the kernel hooks so that they don't have to run in kernel space, right? Like that was the thing: the API limits that Microsoft walled off in Windows Vista are now becoming open again, so that the
security vendors have the proper access to not run this highly
privileged code that is sometimes untested and causes those things.

(46:19):
So I think the actual, uh, eventual outcome that's interesting is the Microsoft changes, not the CrowdStrike changes necessarily. Cause everyone's going to have 1 percent errors, and everyone at some point is going to say, this has a fix that has to go out now. And how much access, or how critically that software runs, is the real interesting learning to me.
Yeah.
Vendor accountability, super important.

(46:41):
And partnerships, right? Like, they build the thing for Microsoft Windows, and that's where it runs, the primary use case. And that was what was affected, and Microsoft never allowed the vendors to get in there.
Yasmin, this has been great.
Thank you so much for coming on the show.
Thank you for teaching us all about your career path and different security aspects at different companies.
Where should people find you if they want to reach out online or get in contact?

(47:03):
Yeah, absolutely.
This was so much fun.
Thank you for having me.
My, uh, socials are Yasmin Abdi, so you can find me on LinkedIn at Yasmin Abdi. Um, Instagram at yaz abdi, Y-A-Z-A-P-D-I. Also, uh, nohackllc.com.
Feel free to message us.
Feel free to reach out if you wanna learn more about
cybersecurity or you wanna partner or work together.

(47:25):
Yeah, this was super fun and I'm, I'm super glad that we did this.
I'm so excited to meet you, to have met you.
I'm going to be rooting for you and like fangirling the whole time.
It's going to be great.
Right back at you.
Thank you so much and thank you everyone for listening.
We will see you again soon.

(47:53):
Thank you for listening to this episode of Fork Around and Find Out.
If you like this show, please consider sharing it with
a friend, a coworker, a family member, or even an enemy.
However we get the word out about this show
helps it to become sustainable for the long term.
If you want to sponsor this show, please go to fafo.fm slash sponsor and reach out to us there about what

(48:14):
you're interested in sponsoring and how we can help.
We hope your systems stay available and your pagers stay quiet.
We'll see you again next time.