Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
How'd you like to listen to .NET Rocks with no ads? Easy! Become a patron for just five dollars a month. You get access to a private RSS feed where all the shows have no ads. Twenty dollars a month will get you that and a special .NET Rocks patron mug. Sign up now at patreon.dotnetrocks
(00:21):
.com. Hey, it's .NET Rocks! I'm Carl Franklin. And I'm Richard Campbell. We're here again for the one thousand nine hundred
(00:41):
and sixty-third time.
Speaker 2 (00:44):
It's almost like an addiction or a pattern or something.
I don't know.
Speaker 1 (00:47):
You think one of these days we'll figure it out?
Speaker 2 (00:49):
I don't know. If we were going to quit, we would have by now, right?
Speaker 1 (00:51):
Maybe we will talk about what happened in the year
nineteen sixty three in just a minute. But first let's
roll the music for Better Know a Framework.
Speaker 2 (01:07):
All right, dude, what do you got?
Speaker 3 (01:08):
Well?
Speaker 1 (01:09):
This came from Simon Cropp, who is a font of
all good things, especially in the
Speaker 2 (01:14):
area of testing.
Speaker 1 (01:16):
Yeah. He found this tool called Xunit.Combinatorial, which is a project from Andrew Arnott on the Visual Studio platform team. But instead of pointing you to the docs, which he said are okay, he pointed at this particular blog post by Andrew Lock, which really talks about it in detail. So basically,
(01:39):
Combinatorial is a weird name, a fancy word, but it's a bunch of features that can make it easier to generate the test data you need with xUnit. Nice, right? So you can auto-generate parameters, generate all parameter combinations, or randomly generate values, and it's
(01:59):
all with, you know, just simple code, you know, theory tests, right? And you can use more attributes, like CombinatorialData. Nice. Xunit.Combinatorial. Anyway,
(02:20):
it's really good and there's a lot of examples in this blog post. So, you know, like I said, it's very easy: auto-generate permutations and parameters and custom-defined values, generate values for a single parameter, reduce the number of combinations. So there you go,
(02:42):
know it, learn it, love it. Who's talking to us? Yeah?
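To make that concrete, here's a minimal sketch of the kind of theory tests being described, based on the attributes the Xunit.Combinatorial package documents; the test names, parameters, and assertions here are hypothetical placeholders.

```csharp
using Xunit;

public class CombinatorialExamples
{
    // [CombinatorialData] generates every combination of the parameter
    // values: two bools x two bools = four test cases.
    [Theory, CombinatorialData]
    public void AllBoolCombinations(bool useCache, bool caseSensitive)
    {
        Assert.True(useCache || !useCache); // placeholder assertion
    }

    // [CombinatorialValues] supplies custom values for one parameter,
    // so this theory runs 3 x 2 = 6 combinations.
    [Theory, CombinatorialData]
    public void CustomValues(
        [CombinatorialValues(1, 5, 10)] int retryCount,
        bool failFast)
    {
        Assert.InRange(retryCount, 1, 10);
    }
}
```

Andrew Lock's post covers more, including randomly generated values, but the attribute-driven pattern above is the core of it.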
Speaker 2 (02:45):
What I like about that... do you know what I think about that particular situation? It's like, it takes one guy to write it, but another person to explain it.
Speaker 1 (02:51):
Well enough that, you know, you know how you want to use it? Yeah, yeah, yeah, it happens. The two talents don't necessarily go together.
Speaker 2 (02:58):
They don't necessarily go together. Absolutely true. So who's talking to us today? I grabbed a comment here from show eighteen seventy six, which is back in December of twenty twenty-three, when we talked to our friend Laura Bell Main, who is a regular conversationalist about security and runs a security-related company. And this was about agile application security, the idea that you can incorporate security into your workflow, sprint
(03:21):
to sprint. It's not crazy. Don't just try and retrofit it at the end. And Rob had this great comment. He says: great show. I especially appreciated the discussion centered around making security easier for the average person, and for that matter, the average developer, to implement. So often it is taken that the cost of security is inconvenience, but I think that turning this idea on its head is actually what
(03:42):
needs to happen. In order to get people to do things, security must make secure solutions be the convenient solution. Convenience is the behavior modification tool we need to implement better security. I mean, there's nothing that convenient about a lock, but you don't want to give them up. What you would like is a lock that you can unlock reliably when you want to. So there's some balance there. I agree, Rob, but it's
(04:05):
good to have the tooling in place. But also, and
I think this is one of the things we talked
about a lot with Laura, was just getting it into
the workflow so that security is never an afterthought.
Speaker 3 (04:13):
Yeah.
Speaker 2 (04:14):
So, Rob, thank you so much for your comment. A copy of Music to Code By is on its way to you. And if you'd like a copy of Music to Code By, write a comment on the website at dotnetrocks.com or on the Facebooks. We publish every show there, and if you comment there and I read it on the show, we'll send you a copy of Music to Code By.
Speaker 1 (04:26):
All right, shall we talk about the year nineteen sixty three? Sure. Here's a few things that happened in sixty three. The Kennedy assassination, yikes. The Vietnam War, it's getting worse. Beatlemania, yeah, the antidote to the Vietnam War, I guess. I remember Paul McCartney saying, yeah, you know, when they were telling
(04:47):
us how to behave in the press, somebody said, don't bring up the Vietnam War. We'd just say, oh, bad war, bad war.
Speaker 2 (04:52):
That's it. Not that there's any good wars. Yeah, right, bad war.
Speaker 1 (05:01):
Some civil rights things were going on in sixty three: the March on Washington, where Martin Luther King delivered his famous I Have a Dream speech.
Speaker 2 (05:11):
Good speech and then bad war, good speech, good speech.
Speaker 1 (05:16):
On the bad side of civil rights, the Sixteenth Street Baptist Church bombing on September fifteenth in Birmingham, Alabama, killed four young girls, and absolutely intensified national outrage and support for the civil rights movement. Push-button telephones, the Touch-Tone. And some
(05:36):
things happened on the nuclear front, right? The Nuclear Test Ban Treaty on October twenty-fourth, and the first commercial nuclear reactor began operation in the US. Anything you want to say about that,
Speaker 2 (05:48):
Richard, Oh, there were so many reactors at that time.
They go back to the fifties, so it depends on
which one we're talking about. But yeah, no, they were
a lot of military related reactors, but then you had
the Westinghouses and the General Electrics building the commercial editions
just to make them reliable. That those were two Gen
(06:12):
one reactors and they were tricky to operate, and they had very skilled people operating them, and for the most part it went well. It's when it rolled out over the next decade that we started to have more problems.
Speaker 1 (06:22):
And was it last week we talked about integrated circuits? That was sixty-two.
Speaker 2 (06:26):
Yeah, we talked a bit about integrated circuits there because they were struggling, still struggling, to make them then too. They were part of the equation on the computer front there. So there's a couple of interesting things. Last week we talked about the LINC computer, really the first personal computer, just because you could take one home, even though it was six foot tall. But the following year DEC,
(06:48):
inspired by that machine, made the PDP-5, which arguably is the first mass-produced minicomputer. There were only ever fifty LINCs made. So these were twelve-bit words, and you could get between one and thirty-two K words of core memory. This is before, you know, integrated-circuit memory, so you literally have the little ferrite cores wound in copper to store stuff.
(07:14):
It came with an editor, assembler, FORTRAN compiler, and a debugger for about twenty-seven thousand US dollars, and it was a less expensive computer than the PDP-4, which was also a less expensive computer than the PDP-1. So the original machine made by DEC was very expensive, and so they made a cheaper one, and this was even cheaper than that. And this machine, or the more advanced version of the PDP-5, would become the very famous
(07:35):
PDP-8. One piece of software from nineteen sixty three is Sketchpad. So this is Ivan Sutherland, who was writing his PhD thesis, and it was really the ancestor of CAD software, of computer-aided design, and he was arguably the very first guy. It ran on the TX-2, which
(07:56):
is a completely unique, one-of-a-kind machine from nineteen fifty-eight. It only lasted about twenty years, or fifteen years. It had sixty-four K of thirty-six-bit words, and you were able to draw directly on the screen with a light pen. Yeah, there was no mouse yet. Nope, that was Xerox PARC. The mouse is still coming. Yeah, yeah. But you know, these
(08:17):
are all the beginning efforts with these early machines, but
also totally bespoke. This was software that ran on exactly
one computer in the world at the time.
Speaker 1 (08:26):
Right. Wow. Yeah. Well, I can't wait to go through the years here leading up to the computer revolution, because this is where it gets exciting, you know.
Speaker 2 (08:36):
Yeah. And we're pulling these bits and pieces together and just seeing, like, that PDP-5: it's all transistors. There's not a single IC in it. It's before that. Right? Wow, good stuff.
Speaker 1 (08:48):
All right, stay tuned for more history lessons from Richard Campbell and myself. But I suppose it's time to talk to our guest, Michael Howard. So, Michael is a senior director in the Microsoft Red Team. If you don't know what a Red Team is, they're the ones that go and hack. He'll tell you all about it. He and the
(09:09):
hackers are focused on improving security design and development across Microsoft based on Red Team findings. He has been at Microsoft for thirty-three years, almost always in security, except for at the start. While at Microsoft in New Zealand, he supported Windows 3.x, the Windows SDK, and the Microsoft C compiler. He currently lives in Austin, Texas, and is an avid
(09:32):
scuba diver, and life is now much better because his wife dives now too. Fantastic. So she didn't scuba dive for the early years of your relationship, is that it?
Speaker 3 (09:45):
Well, I've been diving since I was seven. Oh well, okay. So no, she was absolutely terrified of diving. She felt very claustrophobic. And also, like, I live in Austin, so the closest lake to us for diving is Lake Travis. And honestly, the only nice thing about Lake Travis is it makes every single diving destination look so much better.
(10:08):
It's just a light.
Speaker 2 (10:09):
Not that much diving in Texas, really? No?
Speaker 3 (10:11):
Actually, there actually is. There's, yeah, Travis. Some of the lakes are really commonly dived. And there's also an abandoned nuclear missile silo in West Texas called Valhalla, which I plan on diving this year as well. It's about one hundred and twenty feet down. How many rads? Hopefully zero.
Speaker 4 (10:33):
It's not a game of Fallout, you know. You should check that out first, check it out.
Speaker 1 (10:37):
That's right.
Speaker 3 (10:39):
But yeah, it's funny. When my wife got her certification, she refused to do it in Lake Travis. So she flew with a girlfriend to the Virgin Islands. Oh, nice. Did the certification there instead.
Speaker 2 (10:49):
You find some warm water to swim around in, right, Yeah.
Speaker 3 (10:52):
Absolutely, she's a blue water princess.
Speaker 2 (10:54):
Yeah.
Speaker 1 (10:55):
So, a couple of things. RT, Red Team. So tell me about the Microsoft Red Team. I know what red teams are because I do a podcast, Security This Week, with two guys on a red team. But tell us what the Microsoft Red Team does. Yeah.
Speaker 3 (11:11):
We, you know, we're basically treated as a real threat actor in every possible way you could consider that. Nothing's off the table, except, yeah. That's good. That's actually a very good point. There's a story, there's a story I can probably tell you there. I don't want to get... I'll be very careful what I say here. But I remember showing something in
But I remember showing something on a on a on
(11:32):
a session and someone said, oh you could you know
you can get fired because of that. And then this
friend of mine who's also on the Red Team, colleague
of mine, chimed up and said, well, actually know he's
on the Red team. He can get out of get
a raise. Yeah, the Red Team is really kind of interesting, right.
So our job is ultimately we start off with an objective,
(11:53):
whatever that objective is, and we go for it. And
that's a I mean whatever it takes to get to
that objective. And a big part that I'm involved with
is the readouts as I look at what actually happened,
what was actually done, all the steps along the way,
and so okay, so what was the for example, the beachhead,
(12:14):
you know, what got the Red Team into the environment, and where did they go from there? And how did they get there? Was it a security vulnerability, was it, you know, like, from a code-level perspective, or was it a weak permission on something? Was it insecure this, that, and the other? Was it a combination of things? The whole point is for me to learn those things and then turn that into appropriate material for inside of Microsoft,
(12:35):
and eventually we'll make a lot of that available outside
Microsoft as well. But we're starting internally obviously.
Speaker 1 (12:40):
Are you primarily focused on Azure? Because I know that that's kind of where you live. But what about internal offices and things like that?
Speaker 3 (12:49):
Yes?
Speaker 1 (12:50):
Both, Yes, I'll just leave it at that.
Speaker 3 (12:52):
Nothing's off the table, nothing's off the table at all. What's kind of interesting is the Red Team now. The Microsoft Red Team is really a sort of conglomeration of multiple red teams that existed across Microsoft, now reporting under essentially one management chain, essentially under Scott Guthrie. Ultimately that's, you know, primarily Azure, I mean,
(13:13):
obviously Windows rolls up there as well.
Speaker 1 (13:15):
Right. Penetration tests are kind of expensive. Do you do them around the clock?
Speaker 3 (13:20):
We do. So, pen testing is not red teaming, right? I mean, pen testing is, you almost say, hey, I want you to go and, you know, kick the you-know-what out of this product, right? Red Team starts from an objective. And the objective, yeah, oh yeah, we don't tell anybody. We don't tell anybody. You know, we want to create tokens, or we want to, air quotes, steal
(13:42):
money from something, whatever, right? That's the objective. And whoever we trample along the way gets trampled along the way. Whereas pen testing is quite different. It's like, you know, hey, find bugs in SharePoint Online, find bugs in IIS, or find bugs in
Speaker 1 (13:58):
.NET. Find vulnerabilities.
Speaker 3 (14:00):
Yeah, quite different.
Speaker 1 (14:01):
So the difference is, a pen test, they ask you to do X. In Red Team, they don't even know what you're up to. You're just hack, hack, hack, hack.
Speaker 3 (14:09):
Yeah, and you know, we behave just like attackers, right? So if we are detected, we will, you know, change course, and we will try to obfuscate. If that means deleting logs or whatever, we'll do whatever needs to be done, right? And, you know, it's a beautiful thing.
Speaker 2 (14:27):
Really.
Speaker 1 (14:28):
I asked Duane Laflotte, who's the head of the red team at Pulsar Security, who I do the podcast with. I ask every once in a while, like, do you ever get asked, why aren't you a criminal? You know, you know all this stuff, you know how to get it. He's always giving away criminal career advice, you know. And he says, because I'm a good human. Duh.
Speaker 3 (14:54):
Yes, it's interesting. I mean, you know, a lot of people that I work with, I mean, it's the same thing. I mean, they're incredibly nice people, like truly nice people, but they just have this way of thinking about things that a lot of people don't. There's an analogy I like to give you. You've heard the story, right, where people said, you know, we'll only make things more secure
(15:16):
when we get more people thinking like attackers. Well, my argument is you can't do that unless you are one. And the example I gave, because I mentioned this to my wife, this is years ago. We were somewhere, I don't know where it was, and we came up to this ATM machine, and on the screen it said, hey, if this tamper-evident tape is tampered with, don't use the device. And I said to
(15:37):
my wife, I said, do you think that's a good thing? And she said, well, I don't know what it's even protecting against. And I said, well, so, you know, I gave her a quick skimming one-oh-one, and it made a magnificent difference. And she said, that's stupid. I said, what do you mean? Well, don't you think the criminals can make their own tamper-evident tape, number one? Number two, it's actually worse than that: I can knock out every single ATM in town. So,
(15:58):
what do you mean? I'll just take my pocket knife around to every single piece of
Speaker 2 (16:01):
tape, start cutting the tape.
Speaker 3 (16:02):
Yeah, so I'm using the defense against you. And she's like, how did you even think of that?
Speaker 2 (16:06):
Because it's easy.
Speaker 1 (16:09):
But you have to have an evil mind.
Speaker 3 (16:13):
You have to think... oh, I would consider myself, I don't know, probably chaotic good. Okay. Yeah.
Speaker 2 (16:19):
It's a contrarian mind, just to think of the opposites of things too. I did a show on RunAs Radio not that long ago, maybe back in May, with a guest who did a book on how to get a career in cybersecurity, an excellent book. And he talked about the fact that, you know, most things in security are teachable. The thing that isn't teachable is this sort of insatiable curiosity, to just keep wanting
(16:41):
to poke away at things, try a different way, keep on pressing. If you don't have that, this isn't the job for you.
Speaker 3 (16:50):
Whenever I'm interviewing people, one of the things I do look for is that fire in the belly, right? It's that passion to learn, that passion to dig deeper. I don't know, you guys are probably the same as me, right? When I was a little kid, I was the guy in the corner reading encyclopedias, you know, I was.
Speaker 1 (17:05):
I was taking apart my tape recorders and trying to
put them back together unsuccessfully.
Speaker 3 (17:10):
Right. But you learned a lot, right, getting
Speaker 1 (17:12):
whacked by my mother. Yeah, I learned that I probably shouldn't take apart my tape recorder. Yeah, can I have another one?
Speaker 2 (17:22):
I might have made my tape recorder explode. Actually that's
probably worse.
Speaker 4 (17:27):
Oh, the stuff I did. The thermite substitute for batteries.
Speaker 3 (17:32):
So I actually got an explosion story there. When I was a kid, and this is New Zealand, I went to the local chemist, and of course I went, hey, can I get some, you know, carbon, potassium, saltpeter, and sulfur? The guy's like, come
Speaker 1 (17:45):
on. Making gunpowder?
Speaker 3 (17:47):
He is, like... but you know what he said, though? He said, I've got two options. I can send you on your way, but you're gonna
Speaker 1 (17:53):
Do it anyway.
Speaker 3 (17:54):
Or I can show you how to do it safely.
And he showed me how to do it safely.
Speaker 2 (17:59):
Nice.
Speaker 1 (18:00):
Wow, that's good. Yeah, don't ask about explosion stories your fingers.
Speaker 2 (18:04):
You know, I told the story of thermiting the old server, where we ended up with funky jewelry, as a memory story the other day. Yes, and it wasn't bad-child stuff, it was a practical end for a server that everyone hated. Well, I was talking about, you know, being a teenager and the many other explosions. In this day
(18:25):
and age, I'd just be labeled as a terrorist. Probably a Canadian polite terrorist. What I was, was under-supervised, clearly. All
Speaker 1 (18:35):
right. So what did you come here to talk about? Because we could talk about this stuff all day.
Speaker 3 (18:41):
Mm hmm. Yeah, I mean, it's in my background, which is, you know, sort of mentioned in the brief bio at the beginning, essentially security. IIS, however, is an interesting topic, right? Because I originally started, when I moved to Redmond from New Zealand, I got a job
(19:01):
on IIS, the web server, Internet Information Server as it was then, now Internet Information Services, but same difference. And I eventually took on the role as the security PM on IIS 3. I know, you guys are going to start giggling to yourselves when I sort of mention this stuff.
Speaker 2 (19:19):
Now, I made a lot of money off the challenges in there.
Speaker 1 (19:23):
There was a challenge to keeping that secure.
Speaker 3 (19:26):
There's so many stories. When I retire, I think I'll
write a book on it.
Speaker 2 (19:30):
But you've already written a bunch of books there. I have.
Speaker 3 (19:33):
I have, that's right, yeah. But IIS 3, 4, 5 were interesting, right? Because they were full of security features, especially 5. 5 was full of security features. We had Kerberos integration, we had Certificate Services. I think it may have been actually the first web server that did server-side certificate revocation of client certificates. I think it was the first one. We were really hardcore actually about that: if
(19:57):
we couldn't reach the CRL distribution point... but that's a whole nother discussion. But, you know, did they have security features? Yeah, they had a ton of security features. But were they secure features, right? And the answer is potentially no.
Speaker 1 (20:09):
I remember path traversal was a problem.
Speaker 3 (20:13):
Yeah, canonicalization, right. Canonicalization was a huge problem, and we even saw it in the .NET era, right? There was an anti-cross-site-scripting library, right, and there ended up being so many ways around it that we just thought, we just can't maintain this, right? And so, yeah, I could talk forever about canonicalization problems. I'm a big, big,
I could talk forever about colonicalization problems. I'm a big, big,
(20:37):
big fan of canonicalization. In fact, canonicalization or two tokens
kind of worry me. But that's a whole nother discussion
where you basically doing string comparisons to make access decisions.
Is there a more than one way of representing something
that's valid that you're not looking for? But so, yeah,
I mean so I S three, four, five, especially five.
Lots of security features, but not particularly secure features, said Carl.
(21:00):
Canonicalization and path traversal were big ones, and so I ended up learning a lot from that. And at that point, you know, my career kind of pivoted from security features to securing features, and that's where I really got stuck into lower-level memory corruption issues in C and C
(21:21):
plus plus, and again, canonicalization problems, cross-site scripting. SQL injection, I mean, SQL injection basically kind of happened around IIS 3 or 4-ish. A guy, Rain Forest Puppy, RFP, I remember getting an email from him saying, hey, look what I discovered, and that became SQL injection. In fact, I've spoken a
(21:43):
couple of times to RFP since then.
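As a concrete illustration of the string-comparison trap Michael is describing, here's a minimal hypothetical sketch in C#; the directory paths and method names are made up. The naive check compares the raw string, which a ".." segment slips past; the safer version canonicalizes the path before comparing.

```csharp
using System;
using System.IO;

static class PathChecks
{
    static readonly string Root = Path.GetFullPath(@"C:\site\public");

    // Naive: raw string comparison. A request for
    // "C:\site\public\..\secrets\web.config" starts with the allowed
    // prefix, so this check lets it through.
    public static bool IsAllowedNaive(string requested) =>
        requested.StartsWith(Root, StringComparison.OrdinalIgnoreCase);

    // Better: canonicalize first, so "..", ".", and mixed separators
    // are resolved before the access decision is made.
    public static bool IsAllowedCanonical(string requested)
    {
        string full = Path.GetFullPath(requested);
        return full.StartsWith(Root + Path.DirectorySeparatorChar,
                               StringComparison.OrdinalIgnoreCase);
    }
}
```

The point of the sketch is the principle: compare the canonical form, not whatever representation the attacker handed you.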
Speaker 1 (21:46):
It's still, like, number one on the list, isn't it, SQL injection?
Speaker 2 (21:49):
It's down to number three, but it's in the top ten.
Speaker 1 (21:52):
It's still in the top ten.
Speaker 2 (21:54):
It sat there for so long.
Speaker 3 (21:56):
We know how to solve we know how to solve this.
Speaker 2 (21:58):
Yeah, it's a totally solved problem, and it just doesn't go away.
Speaker 3 (22:01):
Yeah. And so the question is why, right? Why do people keep making those mistakes? And honestly, it's because they just don't know. I mean, this is an interesting thing, right? One of the definitions of a secure system is a system that does what it's supposed to do and nothing else, right? Well, if you build some code that does SQL-y stuff, right,
(22:22):
and you do string concatenation for your SQL statements, is it going to solve the business problem? Yes, it is, right? It's just going to work. But it's the injection part that is the, you know, the something else. You know, a secure system is one that does what it's supposed to do and nothing else. And it's that something else that makes it insecure. It solves all your business problems,
(22:44):
but it's insecure at the same time, and people don't realize it until they start, you know, running tools over it, or someone, you know, compromises the environment.
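To ground that, here's a minimal sketch of the two patterns being contrasted, using the Microsoft.Data.SqlClient API; the table, column, and method names are hypothetical. Both versions solve the business problem; only the first also does the "something else."

```csharp
using Microsoft.Data.SqlClient;

static class UserQueries
{
    // Vulnerable: string concatenation. Input like
    //   '; DROP TABLE Users; --
    // becomes part of the SQL statement itself.
    public static SqlCommand FindUserUnsafe(SqlConnection conn, string name) =>
        new SqlCommand("SELECT Id FROM Users WHERE Name = '" + name + "'", conn);

    // Safe: a parameterized query. The input is sent as data and is
    // never parsed as SQL, no matter what it contains.
    public static SqlCommand FindUserSafe(SqlConnection conn, string name)
    {
        var cmd = new SqlCommand("SELECT Id FROM Users WHERE Name = @name", conn);
        cmd.Parameters.AddWithValue("@name", name);
        return cmd;
    }
}
```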
Speaker 1 (22:52):
Richard and I used to do a talk together at various conferences, you know, walking through sort of the history of Internet Information Services, and he made a good point, saying that IIS is like, what would you call it, like a Swiss army knife with all the blades out. The Swiss army knife with every blade out, yeah,
(23:13):
whereas you know.
Speaker 2 (23:15):
Well, Node was the ultimate opposite. Yeah, there are no blades. You have to go get each blade, right?
Speaker 1 (23:20):
That's right, everything's off by default.
Speaker 3 (23:22):
Right, yeah, but same with IIS 6, though. Yeah, with 6, that's where we made a huge change, right? So its code name was Kevlar, and there was a reason for it being Kevlar, right? We rewrote huge swaths of code, especially string handling, and also canonicalization was all centralized. We also changed the design, and one of the biggest design changes was that IIS 5, like you say,
(23:44):
was a Swiss army knife with all the blades and screwdrivers and what have you sticking out. IIS 6 was the opposite, right? It basically ran essentially nothing, and you had to opt in for most of the services. And IIS 7 went even further, by opting in for specific HTTP verbs, especially the WebDAV stuff.
Speaker 1 (24:03):
Right, yeah, that's cool. I remember it.
Speaker 2 (24:05):
Well.
Speaker 1 (24:05):
We learn, you know, from each other and evolve together.
Speaker 3 (24:09):
It was a huge baptism.
Speaker 2 (24:10):
Well, in the early days of IIS, it was about making it easier for developers who didn't really know that much about web development to make things work.
Speaker 3 (24:18):
And that was J Allard's impact, right? I mean, he ran the project, and he was a big, you know, let's-get-everything-onto-the-web guy, especially IIS. I know, look, I understand that. I totally understand that. But at the same time, you know, if you start turning stuff off by default, it is going to provide some kind of impedance.
Speaker 1 (24:39):
Yeah, you're also not just interested in server stuff, but in code security in general, right? Like avoiding vulnerabilities like buffer overruns and things, which admittedly aren't so much of a problem in a managed language like C#, but certainly in C and C++ those are still issues.
Speaker 3 (25:00):
Huge issue. In fact, I'm working on another book now with some friends, with some colleagues, and one of the chapters is going to be exactly that, like reconsidering the role of C and C++.
Speaker 1 (25:13):
Now.
Speaker 2 (25:13):
The only reason... well, this is what Russinovich was talking about. Like, you have to justify writing it. Exactly.
Speaker 3 (25:19):
So, Russinovich... well, it's not just Russinovich, it's also Dave Weston in Windows. So the three of us were talking at length about this. And look, I have worked in C since I was sixteen. Wow. Let's just say that was a few years ago. You know, as I mentioned,
(25:43):
you know, in the bio at the beginning, right, my first job at Microsoft was in part supporting the Microsoft C compiler. It was C version four back in the day. But what's interesting is, you know, I have a huge love affair with C and C++. I love both languages, absolutely adore both languages. C is probably the
(26:03):
language that I can write without even thinking. It just spews out onto the keyboard, you know. The problem, though, is that it's too easy to have undefined behavior in both languages. And remember that definition from before, you know: a secure system is a system that does what it's supposed to do and nothing else. A more academic representation
(26:26):
of that is a system that exhibits no undefined behavior. Well, C and C++ have got plenty of undefined behavior when it comes to memory corruption, memory safety. So, Carl, you said buffer overruns, but that's only just one.
Speaker 1 (26:40):
That's just one.
Speaker 3 (26:40):
Yeah, that's one, right. The whole gamut is essentially memory safety. I know that there's a lot of work going on in the C++ committees right now around this thing called profiles, to basically say, here is the subset of the language that we will enforce. For example, let's say you have an array, maybe
(27:02):
a vector as well. But let's say an array, and by that I mean a C++ array, I don't mean a C array. If you've got a C++ class that's an array and you use just the index operator, you know, square brackets i, that is not bounds checked. But if you do dot at i, then that is bounds checked.
(27:25):
So you may have, in the profile, we will only allow you to use dot at, and not just the normal array index that we're used to, so that works. But the problem is, there is so much C++ code out there. Are you really going to re-engineer it and refactor it? I don't know.
Speaker 2 (27:47):
No, you're not. Well, if you are, you're not going to stay there. You're going to go elsewhere.
Speaker 3 (27:50):
So the word, you know, the word from on high, in Azure at least, and this is from Russinovich, is new code must be written using a memory-safe language. That could be Rust, could be C#, could be Go. You know, we saw the TypeScript compiler written in Go.
Speaker 2 (28:09):
I mean, there is managed C++, but does anybody use it?
Speaker 3 (28:12):
I don't know of anybody using
Speaker 2 (28:13):
it. Or is it actually safe?
Speaker 3 (28:15):
Honestly, I haven't really used it. I found the syntax
very funky.
Speaker 2 (28:21):
Yeah. And, you know, this is what happened to me with VB.NET. I was a good VB programmer, I was very happy. .NET comes along, VB.NET makes me mental because I have VB reflexes, and so I switched to C# because at least I know for sure it's not VB.
Speaker 1 (28:35):
Yeah, good point.
Speaker 3 (28:36):
Yeah.
Speaker 1 (28:36):
So it took me a while, but I got over it. Yeah.
Speaker 3 (28:39):
So, you know, per Russinovich, new stuff must be written in a memory-safe language.
Speaker 2 (28:43):
And I know Dave Weston, I think he said it: you have to justify not writing in a memory-safe language.
Speaker 3 (28:48):
Yeah, correct. Yeah, that's the more accurate statement.
Speaker 2 (28:52):
Yeah. Yeah, well, he's pretty careful to not talk in absolutes.
Speaker 3 (28:55):
Mister, what is it, Confidential Computing. But other than that, okay. No, look, again, for me, it's really difficult. I love C and C++, I love both languages. Look, and I realize that C++ can get pretty obtuse sometimes. Everyone likes showing, you know, the crazy, you know,
(29:17):
C++ code, you know, try and work out what the heck this thing is doing. I get that. But I love C#. You know, my general rule of thumb is this, and this is my general rule of thumb: if you're writing something new, C#. However, if you need something where you can't incur the cost of the garbage collection, or the unpredictability of the
(29:41):
garbage collector, especially on a server, then Rust. I'm a huge fan of Rust. I love Rust. I love Rust to death.
Speaker 2 (29:47):
And it does seem to strike that balance between I'm
quite low level and deterministic in what I'm going to do,
but I'm also memory safe.
Speaker 3 (29:55):
Correct. The downside of Rust is learning it. It is a very difficult language to learn. Some years ago, about two years ago now, I gave a presentation inside of Microsoft called A Lap Around Rust, and it's basically a one-hour session, just a quick, hey, you've never learned, you know nothing about Rust, watch this, talking
(30:16):
about Visual Studio Code, blah blah blah. One half of the presentation is just on the borrow checker in Rust, because that's what causes most of the pain. And the first slide of that is, you know, if you're writing Rust for the first two to four weeks, you're going to be, you know, punching the screen because of the
borrow checker, because of the way it works, and it
drives people bonkers. But once you get over that learning hump,
it's actually it's a beautiful language to learn, and you know,
nice ecosystem and very good standard library.
Speaker 1 (30:50):
Yeah, well, this seems like a good place to take a break. So we'll be right back after these very important messages. And as a reminder, if you don't want to hear these ads, you can pay five bucks a month to become a patron at patreon.dotnetrocks.com. You get an ad-free feed. We'll be right back. Did you know there's a .NET on AWS community? Follow the social media, blogs, YouTube influencers, and
(31:13):
open source projects, and add your own voice. Get plugged into the .NET on AWS community at aws.amazon.com/dotnet. And we're back. It's .NET Rocks. I'm Carl, that's my friend Richard Campbell, hey, and our friend Michael Howard, and we're talking code security.
(31:35):
And one thing that has become very evident to me, doing, you know, a weekly podcast on vulnerabilities and hacks that have happened the week before, is that most of these aren't technical vulnerabilities. Most of them, most of the attacks anyway, are from social engineering, and that is just
(31:58):
a whole other ballgame. But the first line of defense in security is the user. Don't click the freaking link in the email or the text message, you know?
Speaker 3 (32:08):
Yeah. I mean, I usually get emails from my father-in-law, yeah, saying, you know, is this legit? My answer is always the same: if you're asking me the question, then the answer
Speaker 2 (32:18):
Is no, you already know the answer.
Speaker 3 (32:22):
That's right. I get ones every once in a while from my wife as well. But, I mean, yeah, the work that's going on, certainly at Microsoft, I'm sure other companies as well, you know, in the various UIs to mitigate things. One of my favorites actually was added to Office years ago, which was to run the, you know, the process at low integrity level by default, which is really nice. Well,
(32:46):
I don't know how technical you guys want to get, but from my perspective, it was a magnificent defense. It assumes that the document, whether it's a PowerPoint, Excel, or doc file, whatever, it assumes it's malicious and runs it in essentially a low-integrity sandbox. So it does limit the potential damage. I'm not saying it's completely mitigated, but it's certainly less. And you've got
(33:10):
to do things like that, right? You've got to assume that the user is going to click on things and open things and all that sort of stuff. And so how do you mitigate the damage?
Speaker 2 (33:19):
Yeah, yeah. On the admin side, our line is always: be more careful next time is not a strategy, right? Yeah, right. It's not that the guy clicked on it, it's that anything happened when he did.
Speaker 1 (33:30):
Yeah, and just that advice, be careful. Well, what does that mean? Does that mean don't ever click on anything? Does that mean never go to websites? Does that mean cut off your hands and live in a box? Thanks, Roy Blake, for that one. You know, it's hard to tell. Just be careful.
Speaker 3 (33:44):
That's terrible advice, to be honest. It's terrible. Yeah, I mean, humans are going to do what humans are going to do, right? I mean, they can click on that. I mean, one of my favorites, and I've got to be careful what I say here, but I was doing some work with a defense contractor, and they got hit with a pretty serious attack. And the attack actually came through a
(34:08):
very senior person within the organization, and it was basically a zero-day in Acrobat. But the reason why she opened it is because the message was about knitting. Interesting. And she was an
Speaker 2 (34:23):
Avid knitter, so very good targeting.
Speaker 3 (34:26):
And so she just opened the doc, the PDF, and that was it. That was the beachhead. Yeah, there's always going to be something that you're gonna click on. We can pontificate and say how awesome we are, but I can guarantee, for all three of us, there's something we will click on.
Speaker 2 (34:41):
Yep, you're gonna make them. You're just gonna have a moment of weakness. You're gonna make a mistake at some point.
Speaker 3 (34:46):
And that's why I'm a huge fan of these sort
of mitigations around assuming that whatever you click on is
going to blow up, and you know, what can you
do to mitigate that? And we spend a lot of
time in Windows especially on that.
Speaker 1 (34:58):
There's some ideas that we've thrown around. One of them is to have a separate VM where all your email comes through and all your Internet goes in and out of. The problem then is, okay, if you do have to communicate with some data and some files and stuff, now you've got to get them around safely without infecting each system. But that's not an insurmountable problem.
Speaker 3 (35:19):
You know, it's funny you bring that up.
Speaker 1 (35:21):
You can do a little FileZilla, you know, a little FTP. You could do that.
Speaker 3 (35:26):
It's funny you bring that up. That was actually thrown around, I want to say, fifteen, twenty years ago. I was in a meeting with Gates where Butler Lampson actually threw out this idea of, I think it was red-green, a red VM and a green VM. Yeah, that's exactly what it was. Yeah. Wow. But back then, I mean, VMs were super duper heavy as well, and as a result,
vms were super duper heavy as well, and it's releatant.
(35:47):
Then there were performance impacts. I think we're in a
much better position today. But you know, when you know Microsoft,
if you're doing administration and VAS, you're at the back end,
you can't do it from your machine. You're not allowed, no, right,
you run it from my secure access workstation. That is
very limited in what it can do on purpose.
Speaker 1 (36:06):
Yeah. Do you remember, Richard, we interviewed a guy on .NET Rocks, it was at NDC, I think, and he had discovered how to hack .NET apps, and this was .NET Framework, Windows and stuff. He could get into memory and change things around, and he had developed a tool to do this, and he
(36:28):
gave the tool away for free, and then he sold an anti-tool.
Speaker 2 (36:34):
Jon McCoy is who you're talking about.
Speaker 1 (36:36):
Yeah, to protect yourself from it. I just remember Richard
and I were horrified. We didn't know anything about this
before we started talking, and then it was like that's evil,
you know.
Speaker 3 (36:46):
But yeah, I remember, this was some time ago, and there was a company, well, I shouldn't say who, no, I won't, but they came up with a white paper that showed how to snaffle private keys out of memory in IIS, TLS certificates.
(37:07):
But if you bought their hardware, then the key stayed in the hardware. This is before HSMs, hardware security modules, were a thing, and they had one of the first ones.
Speaker 1 (37:15):
But meanwhile, we're going to teach you how to... Yeah, we're going
Speaker 3 (37:18):
To give you if you buy a device for two
thousand dollars. And then it is kind of funny that
the tool they had was really really simple. It basically
looked for entropy and memory. That's all it really did, right,
because the private key is random, whereas most memory kind
of isn't really, and so they would just look for entropy.
And then what they would do is they were trying
to match up the public key and the certificate with
the private key by decrypting and encrypting. And so they
(37:39):
got the correct answer, and they could do it in seconds. So now they know the private key. But hey, if you buy our hardware device, then the private key stays in the hardware.
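To make the entropy trick concrete, here's a minimal hypothetical sketch of the idea in C# (not the company's actual tool): compute Shannon entropy over windows of a memory dump and flag windows that look random, since key material scores near the eight-bits-per-byte maximum while most data scores far lower.

```csharp
using System;

static class EntropyScan
{
    // Shannon entropy in bits per byte for a window of bytes.
    // Truly random key material scores close to 8.0.
    public static double Entropy(ReadOnlySpan<byte> window)
    {
        Span<int> counts = stackalloc int[256];
        foreach (byte b in window) counts[b]++;

        double h = 0;
        foreach (int c in counts)
        {
            if (c == 0) continue;
            double p = (double)c / window.Length;
            h -= p * Math.Log2(p);
        }
        return h;
    }

    // Slide a window across a dump and flag high-entropy regions
    // as candidate key material (threshold chosen for illustration).
    public static void FlagCandidates(byte[] dump, int window = 256)
    {
        for (int i = 0; i + window <= dump.Length; i += window)
            if (Entropy(dump.AsSpan(i, window)) > 7.5)
                Console.WriteLine($"Possible key material at offset {i}");
    }
}
```

As Michael notes, the real tool then confirmed candidates by test-encrypting against the public key in the certificate; keeping keys in an HSM defeats the whole approach because the key bytes never appear in process memory.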
Speaker 1 (37:51):
And they probably thought that.
Speaker 3 (37:52):
Don't get me wrong, I'm a huge fan of HSMs,
I really am.
Speaker 1 (37:54):
Probably thought that was a great idea.
Speaker 2 (37:55):
Yeah, but there's also a way to be somebody who understands security and tries to fix things without, you know, being part of the problem.
Speaker 1 (38:02):
Right.
Speaker 3 (38:03):
Yeah. Well, let's back that up a little bit. I mean, did this company do a service? I think so, explaining to people how easily, if you've got rogue software on the box, you could snaffle the private key. I think it's a good thing to know. You know, a lot of people may not realize, right, that you could easily get the private key, and that's why, you know, things like Azure Key Vault and Managed HSM, you know, the private
(38:24):
keys stay in the hardware. They never leave, unless you want to export them in a shrouded manner. Rather, we give you, you know, decrypt and signing operations as REST endpoints instead, so the private key stays in the hardware.
Speaker 1 (38:38):
Yeah. Let's talk about GitHub, GitLab, all of them. You know, people are raising concerns, like existential threats, you know, to our very existence, because we're dependent on all these projects, and nobody has a software bill of materials unless they're really hip, so they don't know where their dependencies are.
Speaker 2 (39:02):
There you go.
Speaker 1 (39:02):
You've got Dependabot, which tells you, hey, you've got some dependencies that aren't safe. But it's kind of a full-time job. It is. Playing whack-a-mole with that stuff. It is.
Speaker 3 (39:12):
Yeah. I mean, I get notifications at least one every other day, not necessarily because of anything I have, but because I'm sort of a co-owner of other people's projects, especially one that's actually on-purpose vulnerable. Actually, the GitHub guys have it, I actually use it as a demo at Build. They're kind enough to give me access to it, but of course, again, Dependabot
(39:32):
warnings from that all the time. Yeah, I mean, the whole bill of materials thing is a really serious problem, right? In fact, back in the day, my old boss Steve Lipner, who has since retired, he came up with this term, giblets, which is, you know, components that you depend on that you don't own. The problem is, if they have a vulnerability, congratulations,
you don't own. The problem is if they have a vulnerability, congratulations,
(39:54):
you've got a vulnerability too.
Speaker 2 (39:56):
You've got it too.
Speaker 3 (39:57):
And the funny thing is, the product that sort of woke us up to that was actually SQL Server, with Slammer.
Speaker 1 (40:03):
Yeah, tell us that story. I can't remember exactly what happened. Like, two thousand and one?
Speaker 3 (40:11):
Slammer, yeah, two thousand and one, yeah. It was a worm, a really nasty worm. It was a UDP worm, so it traveled like wildfire, right? Because, you know, you didn't have to do any three-way handshakes. You're just dropping a UDP packet on the network and off you go. So, you know, you end up creating huge network congestion from compromised machines. What was interesting, though... oh yeah, okay,
(40:33):
I remember now. What was interesting, though, is that SQL Server actually wasn't really impacted. What was impacted was MSDE, right, the embedded version of SQL Server that people had in their accounting programs, right? They needed a good-quality database that wasn't Access, right? So they wanted
(40:55):
SQL Server, but they didn't want to buy SQL Server, so MSDE, right, the embedded version. So you have your accounting software running this database back end, and people didn't realize they had it, so they weren't patching it. Because they weren't patching it, they got whacked by Slammer. So that woke us up to this idea that, Steve called it giblets, which is components you depend on that you don't control.
Speaker 1 (41:15):
So the idea is, you're making gravy, and you cut up these things that look like little weird parts, and one of them has, like, a virus or something? Is that the idea? Is that the metaphor, you know?
Speaker 3 (41:26):
I think the metaphor is, the chicken is yours, but you've got these giblets inside. I don't think there's actually a good... this conversation isn't gonna end very well. I don't know, body parts. That's okay. But yeah, I
Speaker 2 (41:41):
Thought the main impact of slimer was just burying networks.
Speaker 3 (41:46):
It was a UDP, it was a UDP payload, and the payload actually fitted in one UDP packet. So what would happen is, you know, you'd have a machine that would just generate random IP addresses and then just send that UDP packet to that endpoint. And they could do it so quickly, because it was UDP, right?
(42:08):
There's no three-way handshakes.
Speaker 2 (42:10):
And it was already patched, as I recall. It was patched. There was a patch, and I think it had been out for a while too. All of us in IT running SQL Servers had done the patches. But it's all these app devs and these little applications that had MSDE embedded in them that hadn't been patched that became propagation machines.
Speaker 3 (42:27):
That was the problem. There was one beautiful piece: the code was actually buggy. I could be wrong, and there's two ways I could go with this: it either didn't generate odd numbers or it didn't generate even numbers, I can't remember which. Let's just go with even numbers. But the random number generator for generating the four octets of the IP address couldn't generate even numbers, so it left
(42:51):
out huge swaths of the Internet, which is a good thing. But yeah, the other interesting part of that is, the guy that found the bug was a guy called David Litchfield, who worked in the UK at the time. He now works for Apple. And in my last book, Designing and Developing Secure Azure Solutions, I actually spoke with him, we've been good friends for a
(43:13):
long time, and I spoke at length about how he found the bug and that sort of stuff. But when he put out the announcement about the bug, he included a proof of concept, right? And that was the last time he ever put out a proof of concept, because of the damage that was caused by evil people using it.
Speaker 2 (43:29):
People use it.
Speaker 3 (43:30):
Imagine, imagine if you ran the proof of concept in your own little environment and it escaped from the lab.
Speaker 2 (43:36):
Yeah, you know, that's the thing with those propagators. Like, more often than not, they did escape from the lab. That's how they actually got away back in the day, in the less black-hat days than exist today.
Speaker 3 (43:48):
So, for what it's worth, though, I used that Slammer code, or the bug in SQL Server... actually, the bug wasn't actually in SQL Server. It wasn't on TCP one four three three. It was UDP one four three four, which is the management interface, which is not the database engine. It's all the management goop around it. So I
(44:10):
actually use that code for lots of examples about how we've made progress at Microsoft and in the industry in general. So, for example, Carl, you mentioned GitHub and you mentioned Dependabot. Well, another thing that GitHub has which is really cool is CodeQL, which, absolutely, I'm a
(44:32):
huge fan of CodeQL. And so if you've got a public repo, or if you have a private repo and you have an enterprise agreement, let's just go with a public repo: if you do a pull request, it will run CodeQL. I think they call it code scanning or something. It will run CodeQL over the pull request, and if there's a vulnerability, well, actually you can't merge the
(44:52):
pull request.
Speaker 1 (44:53):
Yeah, nice, that's great.
Speaker 3 (44:54):
Which is really nice. What's even better, though, is you can even opt in for an autofix, and it will use AI to generate a fix. Yeah, and it will provide that as an update, and you can say whether you want to take the merge or not. Yeah.
Speaker 1 (45:11):
I guess so.
Speaker 3 (45:11):
But it's really, really... I'm a huge, huge fan of CodeQL, because you can write your own rules for it, which I really like.
Speaker 1 (45:18):
That's cool.
Speaker 2 (45:19):
I guess we've got to talk about AI, because it sounds like both hacking and resisting being hacked are going to be affected by these tools.
Speaker 3 (45:29):
Yeah. I mean, it's more than that, right? I mean, you know, as we mentioned, air quotes, in the green room, the impact AI is having on the software industry, and security especially, cannot be overstated. The work that we're doing at Microsoft... like, I think every single day I have at least two meetings on my calendar that
(45:51):
have the letters A and I right in there, about something that's going on in software development, security, security testing. One today, that I just had just now, was on security testing with AI, and, you know, producing, you know, test plans that test security. One I had the other day was on some scanning, some really smart password scanning, or,
(46:15):
you know, looking for rogue endpoints or unpatched endpoints using AI.
And literally, these guys were vibe coding. The developers, they were vibe coding, and they're using Go, because of the concurrency, to be able to scan things quickly. And they actually said that ninety-eight percent of this was vibe coded. The other two percent was a human being. So, you know, there's a demo that
(46:39):
I gave at Build where I had some code that had a server-side request forgery. It was in C#, a REST endpoint, and I actually said in Visual Studio, you know, hey, do a security code review of this code, and, you know, fingers crossed, because of the nondeterministic nature of LLMs. It came back with, hey, there's a server-side request forgery here, and it actually came up
(47:02):
with a couple more I hadn't even thought about. Wow. Wow. So, you know, do we even need static analysis tools at this point? Yes, we do. But, you know, it's just an interesting thing that's built into the product that gives you not just a bug, but also an explanation of why the bug's bad. And if you want, you know, you can tell it, hey, fix the bug for me, if you're in Visual Studio Code
(47:23):
in agent mode or something. It's crazy.
Speaker 2 (47:25):
It's powerful. It is powerful, and it at least gives you ideas. The question of whether or not it's comprehensive, yeah, that's, you know, a big one. As well as, like I said, not deterministic. I asked it again and it found different vulnerabilities.
Speaker 3 (47:37):
Right the other way looking at it is it's almost
like learning on demand as well. Right, well, I didn't
even know that was a problem. Well now you do, so, Yeah,
it's I'm a huge fan of course in the defensive
side of things. You know, we get so many signals
at Microsoft every single day, and the many trillions of
signals every single day, there's no where human being can
go through all them, and so we use AI extensively
(47:59):
to work out what's real and what's not before a
human gets to look at things.
Speaker 1 (48:03):
I want to shift gears before we end here. And I know that, you know, secure code is your thing. Is there anything that we haven't talked about that your average developer might fall into the trap of, or needs to be aware of, where, you know, they leave vulnerabilities in their code? I mean, obviously SQL
(48:25):
injection is a huge one, but I think everybody that listens to our show and has listened to it knows how to use parameterized queries and all that stuff. But are there any other things that might not be so obvious?
Speaker 3 (48:36):
Yeah. So we mentioned SQL injection. I touched on cross-site scripting before. Remember, we talked about the ASP.NET class or whatever it was that detected cross-site scripting vulnerabilities, or attacks, I should
Speaker 1 (48:49):
Say anti forgery tokens.
Speaker 3 (48:51):
That's the one. Yeah, that's another one, and it doesn't matter. But, you know, talk about server-side request forgery, SQL injection, cross-site scripting, memory corruption, and so on. They all have one thing in common: they're all input validation problems, every single one of them. And that problem transcends vulnerabilities. Like, there'll be new vulnerabilities in the future that we don't even know about yet. Even LLMs, right?
(49:12):
LLMs suffer from input validation problems.
Speaker 2 (49:14):
Sure. And the
Speaker 3 (49:15):
Fact that you're merging the data plane with the control plane.
But that's a whole other problem. So to this day,
and I wrote about this in Running Secure Code. You
know the first edition of Running Secure Code, which was
twenty five years ago now gratulations.
Speaker 2 (49:29):
It's a lot of time.
Speaker 3 (49:31):
I'm not sure if that's good or just absolutely terrifying. But yeah, one of the chapters is: all input is evil until proven otherwise. And that's the same today. And so I still think that if developers just learned that one thing. Like, you know, is this data coming in through a REST endpoint, through whatever API, a socket
(49:54):
or a WebSocket or whatever, an RPC endpoint, gRPC, data coming in, you know: is it correctly formed? Is it the correct stuff? Don't look for bad things, because that assumes you know all the bad things. Is it correctly formed? Like, if I'm expecting a zip code, is it numbers with a potential dash, right, and that's it? And is it, you know, ten digits long max, or something like that?
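A minimal sketch of that kind of shape check in C#, assuming the rule Michael describes (five digits, optionally a dash and four more, ten characters max); the class and method names are hypothetical. The point is allow-listing the correct form rather than hunting for known-bad input.

```csharp
using System.Text.RegularExpressions;

static class InputValidation
{
    // Allow-list the shape: five digits, optionally a dash and four more.
    // Anything else is rejected; we never try to enumerate "bad" inputs.
    private static readonly Regex Zip =
        new Regex(@"^\d{5}(-\d{4})?$", RegexOptions.Compiled);

    public static bool IsValidZip(string input) =>
        input is not null && input.Length <= 10 && Zip.IsMatch(input);
}
```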
Speaker 4 (50:17):
That's got to be alphanumeric, because we have other content. That's right. Yeah, exactly right. But even non-numeric, like, sorry... Well, the UK is the same, right? The UK. So I used to live in L nine to ninety L when I was in England. But it's interesting that that issue is still the number one source
(50:40):
of issues, and in fact there's a whole new set of issues opened up with LLMs.
Speaker 3 (50:45):
And it's all input validation. That's all it is. I still say that's, like, the number one thing that developers have got to think about: is my data correct?
Speaker 2 (50:54):
Yeah.
Speaker 1 (50:54):
I learned about a new attack from doing Security This Week, which was deserialization injection.
Speaker 2 (51:01):
Yep.
Speaker 1 (51:02):
So imagine, if you will, you have a comments page, you know, a comments thing, and there's no limit to it, right? And somebody just pastes in there some JSON that has somehow attached some code to it. I don't... this is what I don't understand. Maybe you deserialize that into an object that has code. Yeah, and that object has,
(51:24):
you know, in its initializer, code that runs.
Speaker 2 (51:31):
Right.
Speaker 3 (51:32):
I'll give an example. I've got a proof of concept on this, which is, this is why we've sort of deprecated the binary serializer in C#: you can deserialize a C# object and have the constructor, which will run, right? You can
Speaker 1 (51:46):
Have exec yeah, exec whatever.
Speaker 3 (51:50):
Right, and it'll just run, and it'll just run.
Speaker 1 (51:52):
Yeah.
Speaker 3 (51:52):
So there's nothing special or funky or magic about it. You put whatever the heck you want in the constructor, and then when it's deserialized, that runs. And if it's an exec, you know... I don't know what it is in .NET, I'll be honest with you. It's Process.Execute or something, Process.Run.
Speaker 1 (52:10):
Yeah, yeah, you can create a process, you can just run it.
Speaker 3 (52:14):
Now you're running arbitrary code, and the demo I give is just running Notepad, of course, you know.
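Here's a hedged sketch of the class of bug being discussed. This particular variant uses Newtonsoft.Json with TypeNameHandling enabled, one well-known way attacker-controlled type metadata can trigger code during deserialization (the guests mention the binary serializer, which is being removed from .NET for exactly this reason); the type names and payload here are hypothetical.

```csharp
using System.Diagnostics;
using Newtonsoft.Json;

// A type that happens to exist on the target system, with a side
// effect in a property setter (a constructor works much the same way).
public class ReportTemplate
{
    public string Name { get; set; }

    public string Preview
    {
        // Benign stand-in for "arbitrary code": launch Notepad.
        set { Process.Start("notepad.exe"); }
    }
}

public static class DeserializationDemo
{
    public static void Run()
    {
        // The vulnerable pattern: honoring type names embedded in the payload.
        var settings = new JsonSerializerSettings
        {
            TypeNameHandling = TypeNameHandling.All
        };

        // Attacker-supplied JSON naming an existing type; deserializing it
        // runs the Preview setter, so code executes as a side effect.
        string payload =
            "{\"$type\":\"ReportTemplate, MyApp\",\"Preview\":\"x\"}";
        JsonConvert.DeserializeObject(payload, settings);
    }
}
```

The fix is the same input-validation principle from earlier: never let untrusted data choose the types you deserialize into.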
Speaker 2 (52:18):
Yeah.
Speaker 1 (52:18):
But the thing is, that object has to be defined on the system that's doing the deserializing. So it's not like you can just put in a bunch of JSON and it's going to create a class that runs. Do you know what I mean? Yeah, yeah, you can't deserialize code like that.
Speaker 3 (52:34):
Yeah, yeah. You've got to have... there's got to be a class that, you know, exists. So for something like a Person class, right? Yeah, then you would put it in the Person constructor. Yeah.
Speaker 1 (52:43):
Right.
Speaker 3 (52:43):
So here's... there are actually other examples. There's lots of examples. And in fact, in the Red Team we have a person who is an absolute expert in building malicious payloads for deserialization. Okay. So I guess all I can say is, I'm glad he works
(53:05):
for Microsoft and nobody else.
Speaker 2 (53:07):
That's good. I've had a number of people I've met over the years where you sort of look at them later and go, you know, I'm really glad you're a good guy.
Speaker 3 (53:14):
I know.
Speaker 1 (53:16):
One of the things we do on our show, and you might like
this, is we take the CVSS score
for a given threat, which is usually from one to ten.
Yes, I'm sorry, the CVE is
the number that identifies the threat, the CVSS is the score, and
if it's a ten, basically that means, oh my god,
(53:37):
the world is going to end, you must patch. But
we also add a contagion score, which is how realistic
it is that you, the average person, are going to
be affected by this vulnerability.
Speaker 2 (53:52):
Right?
Speaker 1 (53:53):
And the input to that equation is, you know, is it
connected to the Internet? Do you have to be on
the local network already? Do you have to have hacked
to a certain point? So what do you think
about that? Do you think that should be an official
thing, like a contagion score or a reality score?
Speaker 3 (54:10):
I think that idea is
actually baked into the modern versions of CVSS, like accessibility:
does it require an authenticated connection? Does it require human interaction?
So you might have something that requires no authentication, that's remote,
that requires no human interaction. For example, I remember
a while ago there was
a really nasty bug in iOS where someone could send
(54:33):
a text message, an SMS message.
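[The factors he is describing map directly onto the CVSS 3.1 base metrics; a worst-case vector, shown here as an illustration rather than anything cited on the show, looks like this:]

```
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H   (base score 9.8)

AV:N  Attack Vector: Network      -> remote, no local access needed
AC:L  Attack Complexity: Low
PR:N  Privileges Required: None   -> no authenticated connection
UI:N  User Interaction: None      -> the victim does nothing

Flip PR to H (High) and UI to R (Required) and the base score drops
sharply, which is roughly the adjustment a "contagion score" makes.
```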
Speaker 1 (54:36):
Yeah, I remember this too, right, you didn't even have
to open it and it would own your phone.
Speaker 3 (54:41):
Yeah, exactly. I mean, don't get me wrong, you know.
But again, the nice thing about iOS, you know, most
mobile platforms, is they're very locked down and isolated, and
they assume this sort of stuff. But even so,
it was a really nasty bug.
Speaker 1 (54:56):
So we have seen scores that have been very high,
like nine or whatever, that we have adjusted down for
our listeners because of those things: you already
had to be on the network, or
maybe there had to be human interaction, like somebody
got up from their screen and went to the bathroom
and left themselves logged in. Like, you know, a bunch
(55:16):
of things had to fall into line before you could
get that thing. So I'm not so sure they do
a good job of... yeah.
Speaker 3 (55:24):
But here's the problem, though, Carl: a
bug by itself is interesting, a CVE is interesting.
You look at what the guys on the Red Team are doing:
they rarely take a single CVE. They chain them together.
Speaker 1 (55:39):
Yeah, yeah, that's true.
Speaker 3 (55:40):
Right. So you may have a remote and unauthenticated thing
that does nothing exciting. Then you have another one that's
local only that gives you some privilege elevation. Then
you have another one that requires a little bit of
privileged access, but now it gives you godlike powers on the box. Right?
So you chain three together. Yeah,
(56:01):
that's where it becomes really dangerous. And that's one thing
I love about the Red Team, right, is they think
about these things. So John Lambert, corporate VP and security
fellow at Microsoft, you know, he says that attackers think in graphs.
That's what they're doing. They're chaining these things together. They
don't think about individual point things.
Speaker 1 (56:20):
First we'll use this exploit. Then we'll, you know,
take the lay of the land, or whatever
I guess they call it, you know, living off
the land. Living off the land, yeah, see what's there? Yeah,
and then maybe use another exploit, another privesc, and then
you own the box.
Speaker 3 (56:34):
Yeah, absolutely. And the next thing you know, you've broken
out of a VM, you know, and you're on the
host, the ultimate path traversal, right? I mean, that's
pretty nasty, you know. And that's where security guarantees come
into play: what are the security guarantees around your
security mitigation? Is there a security guarantee around it? So,
(56:54):
for example, there are security guarantees around VMs. If you
violate that, then that's a serious problem that we will fix, right.
But other things, you know, other things don't have
security guarantees. A really good example of this is code obfuscation.
There's no security guarantee. There are none, because eventually the
CPU has to know what instructions to execute. Right.
Speaker 1 (57:12):
And if, you know, you're a hacker, you're on
the box, and you know assembler or binary, you
can just go walk the memory and see what's going on.
Speaker 3 (57:21):
It's funny you bring that up. Someone made a comment
on X about some C# LINQ stuff or whatever
and said, which is faster? And they give you three
different examples of LINQ, and I said, I don't know,
so let's look at the assembly language. Nice, you know,
let's look at what it's doing. And it was
really interesting. And this guy made a comment to me, said,
(57:42):
you read assembly language? I said, assembly language is the truth.
Speaker 2 (57:47):
It's what's landing on the processor.
Speaker 3 (57:50):
Yeah, I mean, you can read whatever you want in
the, you know, in the source code, but it's what's
generated by the compiler for the LINQ that matters.
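[One way to do this kind of inspection yourself, as a hedged sketch: BenchmarkDotNet's DisassemblyDiagnoser dumps the JIT-compiled native code for each benchmark alongside the timings. The class and method names here are illustrative, not the examples from the conversation:]

```csharp
using System.Linq;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// DisassemblyDiagnoser emits the JIT'd assembly for each [Benchmark]
// method, so you can compare what actually lands on the processor.
[DisassemblyDiagnoser]
public class LinqComparison
{
    private readonly int[] _data = Enumerable.Range(0, 10_000).ToArray();

    [Benchmark]
    public int WhereSum() => _data.Where(x => x % 2 == 0).Sum();

    [Benchmark]
    public int ForLoopSum()
    {
        int sum = 0;
        for (int i = 0; i < _data.Length; i++)
            if (_data[i] % 2 == 0) sum += _data[i];
        return sum;
    }
}

public static class Program
{
    public static void Main() => BenchmarkRunner.Run<LinqComparison>();
}
```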
Speaker 1 (57:58):
But quite frankly, I would rather have you look at
the thing with the assembly language than learn assembly language
to the point where I could spot a problem. Thank
you very much, dude.
Speaker 3 (58:08):
I love assembly language, I'm not gonna lie. I love
assembly language. It's my happy place. Well, I mean,
C is probably my happy place. I love C, but
I love assembly language. Yeah, I've been learning a
lot about ARM assembly language.
Speaker 2 (58:19):
Well, and C was created to be a more portable
assembly language, so that you could write once and generate
down to assembly for different processors.
Speaker 1 (58:29):
So what's next for you? What's in your inbox?
Speaker 3 (58:31):
What's in my inbox? A lot of MCP security these days,
believe it or not. On the Azure Security Podcast,
we interviewed a guy who works on MCP at Microsoft
and is also on the working group for MCP.
Speaker 1 (58:46):
This is, let me get it, Master Control Protocol... a
little... model, monitor, model... Anyway, it's what's between your agent
and the services that you want the agent to access
on your behalf. Model Context Protocol. Model Context. But I'm
never going to remember this. Model Context Protocol.
Speaker 3 (59:04):
Yeah, you know what, I have problems
remembering it as well. Yeah, and I hear it every single day.
Speaker 2 (59:09):
Yes, because I keep picturing Tron's Master Control Program a lot.
Speaker 1 (59:14):
That's what it is.
Speaker 3 (59:17):
But yeah, you know, MCP is being used everywhere,
and all of a sudden, you know, Azure SQL
Database has now got an MCP server. Visual Studio
Code is an MCP client.
Speaker 1 (59:27):
But last we knew it wasn't secure, right?
Speaker 3 (59:29):
Well, so actually, it's interesting: the MCP guys are trying as
hard as possible to offload, that's a terrible word, the
security to the underlying environment. So, for example,
authentication shouldn't really be an MCP thing. You
know, you use OAuth 2.0 for authorization,
(59:50):
you want to use OpenID Connect for authentication, or you
use Windows Authentication for authentication, and so be it.
Speaker 1 (59:55):
And then your MCP uses whatever is there, correct?
Speaker 3 (01:00:00):
Correct, correct, which I agree with. There is
some work going on with authorization, they recognize that, but
I would not throw out the term "MCP is insecure."
I would say MCP doesn't do some things on purpose.
(01:00:20):
So it's not locked into AWS, it's not locked
into Azure, it's not locked into Windows or Linux or
whatever they talk about.
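[A hedged sketch of the "offload security to the environment" idea in ASP.NET Core terms: the MCP endpoint stays protocol-only while standard OAuth 2.0 / OpenID Connect bearer-token middleware in front of it handles authentication. The issuer URL, audience, and endpoint path are hypothetical placeholders, not details from the conversation:]

```csharp
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Standard OAuth 2.0 / OpenID Connect bearer-token validation,
// supplied by the hosting environment rather than by MCP itself.
builder.Services
    .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        options.Authority = "https://login.example.com"; // hypothetical OIDC issuer
        options.Audience = "mcp-server";                  // hypothetical audience
    });
builder.Services.AddAuthorization();

var app = builder.Build();
app.UseAuthentication();
app.UseAuthorization();

// Hypothetical MCP endpoint: transport-level auth is enforced by the
// middleware above; the protocol payload stays auth-agnostic.
app.MapPost("/mcp", () => Results.Ok())
   .RequireAuthorization();

app.Run();
```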
Speaker 1 (01:00:29):
Can it be exploited though?
Speaker 3 (01:00:30):
Well, actually, funny you should say that. A thing
came across in my inbox about a
vulnerability in some MCP client, because it wasn't authenticating the server.
Speaker 1 (01:00:41):
Well okay, okay, not MCP's fault.
Speaker 3 (01:00:45):
That's not an MCP... exactly, exactly right, it's not an
MCP fault. But, you know, this bug is being
billed as an MCP vulnerability, but it's not, you know.
So I understand, you know, how you're arriving
at that, but by the same token, you've got to understand
where the MCP guys draw the line, right? Okay, I
think that's fair.
Speaker 1 (01:01:05):
That's fair, that's fair. I mean, I think it was
when we were talking to Scott Hunter that he had
some inkling about that, right? That might be where I
picked that up. Yeah, but it's good to know
that it's evolving and all.
Speaker 3 (01:01:19):
Apparently, yeah, granular authorization is the next thing.
Speaker 1 (01:01:22):
Mm hmm. Okay, well, you'll have to Google that, folks,
because we're out of time. So, Michael Howard, thank you
very much. It's been a pleasure talking to you, and
a little bit scary, but in a good way.
Speaker 3 (01:01:33):
Oh I'm not scary, that's all.
Speaker 1 (01:01:35):
No, No, you're not scary.
Speaker 2 (01:01:37):
Life is, though. The world is very scary. You just
reminded us.
Speaker 1 (01:01:42):
All right, thanks a lot, and we'll talk to you
next time on dot net rocks. Dot net Rocks is
(01:02:07):
brought to you by Franklins.Net and produced by PWOP Studios,
a full-service audio, video, and post-production facility located
physically in New London, Connecticut, and of course in the
cloud online at pwop dot com. Visit our website at
D-O-T-N-E-T-R-O-C-K-S dot com for RSS feeds, downloads, mobile apps, comments,
(01:02:31):
and access to the full archives going back to show
number one, recorded in September two...
Speaker 2 (01:02:36):
Thousand and two.
Speaker 1 (01:02:37):
And make sure you check out our sponsors. They keep
us in business. Now go write some code, see you
next time.