
April 30, 2025 41 mins

While you were going about your life, we all narrowly avoided a complete disaster. This is the story of how one lone Microsoft engineer saved all of us. Dexter talks to Alex Stamos, Chief Information Security Officer at SentinelOne, about how it all went down and why this story still terrifies him today.

Got something you’re curious about? Hit us up killswitch@kaleidoscope.nyc, or @dexdigi on IG or Bluesky.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
Quick question: do you know about the xz Utils backdoor hack? What backdoor?

Speaker 2 (00:15):
Wait?

Speaker 1 (00:16):
Wait wait wait, wait, what? The xz Utils backdoor? I have no idea what you're talking about.

Speaker 3 (00:21):
I don't know what that is.

Speaker 1 (00:24):
This is something almost nobody's heard of, but in the
spring of twenty twenty four, we narrowly avoided a complete
technological disaster.

Speaker 3 (00:33):
So you've you've never heard of this though? Nope?

Speaker 2 (00:37):
Yeah, I just searched it up on Wikipedia, and it seems way too niche.

Speaker 3 (00:40):
To read about.

Speaker 1 (00:42):
These aren't just random people. These are other journalists, people in general who keep up with the news.

Speaker 4 (00:47):
Okay, I was like, wait, what did I miss?

Speaker 3 (00:49):
And I feel bad. But I guess maybe I'm not the only one. What journalist doesn't know what this is about?

Speaker 1 (00:57):
Even they didn't really know about what could have been
the biggest hack in the history of the Internet.

Speaker 2 (01:03):
If this had not been caught, then this would have
been a skeleton key that would have allowed these attackers
to break into tens of millions of incredibly important servers
around the world. We probably would have had airlines not working, trading halted, ATMs not working, banks not working, people not able to get their money. You'd have a huge loss

(01:26):
of credibility of technology in people's lives.

Speaker 1 (01:30):
Alex Stamos is a cybersecurity expert. Specifically, he's the Chief Information Security Officer, or CISO, at a cybersecurity company called SentinelOne, and he's the former CSO at Facebook. He's also a lecturer in the computer science department at Stanford, and this attempted hack is something that is still keeping him up at night.

Speaker 2 (01:51):
It's fallen out of popular discussion, but among people in
security we're still talking about it. It uncovered a real
fundamental weakness that terrifies lots of people who have responsibility
in this area.

Speaker 1 (02:07):
And what scares them most, and this should scare us too,
is that this was caught by complete chance.

Speaker 2 (02:13):
We just got lucky, right like one dude got really
bored and noticed a tiny little change in the speed
of one program executing and pulled the thread. And on
the end of this thread was a humongous, ticking time bomb.
It was one dude, and he should never have to

(02:33):
buy a beer for himself ever again.

Speaker 4 (02:38):
Andres Freund, I'm raising a toast to you right now. This is just water, but I wish it were more.

Speaker 1 (02:53):
From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas.

Speaker 2 (03:01):
Amentarians.

Speaker 1 (03:23):
If you've never heard about this, that's no reason to
feel bad. But if it hadn't been caught, it absolutely
would have affected you. What kind of activity are we talking about here?

Speaker 2 (03:34):
Well, we really don't know, and because we don't know
who the attackers are, we don't know whether that would
have been used for really quiet surveillance. It could have
been used for national security intelligence gathering purposes. It could
have been used for a humongous heist of hundreds of
millions or billions of dollars of cryptocurrency, or it could

(03:55):
have been used as part of a massive cyber attack
to shut down millions of computers and cause massive disruptions.

Speaker 1 (04:02):
One of the main reasons that this potential attack isn't
talked about much is because the details are kind of technical. Well,
some of the details are a lot of this stuff
is really just basic human behavior. It's stuff that you
or I could do if we really wanted, and it
shows us that sometimes the best hacks are the simplest ones.
Let me break it down for you. In late March

(04:25):
of twenty twenty four, Andres Freund, who's an engineer at Microsoft, was sitting at his desk doing his job when he discovered a malicious piece of code in this little-known tool called xz Utils. This code created a method that would allow hackers to access a

Speaker 3 (04:40):
lot of different computers.

Speaker 1 (04:42):
Maybe right now you're thinking, okay, so why is this a problem for me? I mean, I don't use xz Utils, so they couldn't get on my computer. And yeah, maybe you've never heard of xz Utils. Actually, I hadn't either, and I did what most people do when they don't understand something about a computer: call an expert. But it turns out that this really well-respected expert, he found

(05:04):
out about xz Utils when I did.

Speaker 2 (05:07):
Yeah, so I personally had not heard of xz Utils before this.

Speaker 1 (05:11):
Even you? Really? Yeah, I had definitely not heard of xz Utils. I figured you would have. Hearing that you had not heard of it before all this happened, frankly, that's a little bit more scary to me. Now, so why did this backdoor into a program that no one seems to know about matter so much? And what is xz Utils?

Speaker 2 (05:32):
This is the brilliance of what these attackers did. xz Utils is an ingredient to an ingredient to an ingredient to something really important. So the thing that they wanted to have a backdoor into is a really important program called OpenSSH.

Speaker 4 (05:49):
So this is something that every techie has heard of.

Speaker 1 (05:51):
All right, but what if you're not a techie? So in order to understand the xz Utils hack, we do need to back up and understand something that xz Utils is used in, this thing called OpenSSH.

Speaker 2 (06:03):
This is the program through which the majority of Unix-like systems, especially Linux, but also Macs and some other operating systems, allow you to access them remotely over the internet.

Speaker 1 (06:18):
I'll get to the open later, but SSH stands for Secure Shell. Let's just focus on secure right now. If you think of the difference between posting a tweet online and DMing someone, you're actually kind of halfway there. OpenSSH allows you to communicate with a remote computer just like
you were sitting there right in front of it. So
even though you're far away, if you want to send

(06:39):
a message, or install programs or delete files, you know
that the connection is safe and that nobody else can
see what you're doing or tamper with that connection.

Speaker 2 (06:48):
So you know, when you see like people in the
matrix typing really fast, see a lot of text, right
if somebody is doing that remotely, it's probably over OpenSSH.

Speaker 1 (06:57):
You might think this doesn't matter for you because you don't use OpenSSH, but you do, because that's what you use to connect to systems running Linux around the world.

Speaker 2 (07:07):
Linux has become the standard operating system for the cloud.
So when you talk to Google, you're talking to a
Linux system. When you talk to Facebook, you're talking to
a Linux system. When you talk to Apple, you're probably talking.

Speaker 4 (07:21):
To a Linux system.

Speaker 2 (07:22):
Right now, the system that we're talking to each other
with almost certainly is running Linux. So the vast majority
of systems you talk to in the cloud are running Linux.

Speaker 1 (07:30):
Linux is used for Apple's iCloud, for social media sites
like Facebook, Instagram, for YouTube, for Twitter, It's used for
the New York Stock Exchange. Gamers use it when they
run Steam or they play games online, and the list
goes on. The vast majority of the Internet runs on
Linux, and OpenSSH makes sure that it's you logging

(07:51):
in and not somebody else.

Speaker 2 (07:53):
When you log in and you get your mail, the server that holds your mail has SSH on it. The server that holds your social media has SSH. The servers that have your banking information have SSH. It's the door by which you get into these systems.

Speaker 1 (08:07):
So OpenSSH is incredibly important to the Internet and all the cloud systems that we rely on, and because of that, it has a lot of eyes on it. Trying to hack OpenSSH directly would pretty much be impossible.
Someone would catch you pretty quick.

Speaker 2 (08:24):
People pay a lot of attention to it. A lot
of people run their code scanners on it, a lot
of people look for bugs in it, and so it
has been a while since OpenSSH has had a humongous security flaw in it. If you just joined the OpenSSH project and said, hey, I'm a new
guy that nobody ever knew, here's my code, everybody would

(08:45):
be super suspicious, right? And whoever these bad guys are, they know that. So what they did was they looked at OpenSSH, and they looked at what we call its dependency graph. They looked at all the stuff that goes into OpenSSH, and what they saw was: OpenSSH depends on other things.
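
The move described here, walking OpenSSH's dependency graph to find a weakly guarded ingredient, can be sketched with a toy graph. The package names and edges below are a simplified, illustrative version of the real chain (on some Linux distros, a patched sshd links libsystemd, which in turn links liblzma from xz Utils), not a complete picture:

```python
# Toy dependency graph. Names and edges are illustrative; on some Linux
# distros the real chain was: patched sshd -> libsystemd -> liblzma (xz Utils).
deps = {
    "openssh": ["libsystemd"],
    "libsystemd": ["liblzma"],
    "liblzma": [],  # maintained, at the time, by a single volunteer
}

def transitive_deps(pkg, graph):
    """Everything pkg ultimately pulls in, i.e. everything you implicitly trust."""
    seen, stack = [], [pkg]
    while stack:
        for dep in graph[stack.pop()]:
            if dep not in seen:
                seen.append(dep)
                stack.append(dep)
    return seen

# Backdooring any package in this list effectively backdoors OpenSSH itself.
print(transitive_deps("openssh", deps))  # ['libsystemd', 'liblzma']
```

The design point the attackers exploited is that trust is transitive: every package in that list inherits OpenSSH's privileges without inheriting its scrutiny.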

Speaker 1 (09:06):
This is where xz Utils comes in. xz Utils is one of the things that OpenSSH depends on. What does xz Utils actually do?

Speaker 2 (09:16):
It's a compression library, so it's just a library that
is used to make data that comes in smaller so
that if you're moving like a big file back and forth,
it can fit down a smaller pipe. Right, you might
be talking to a server on a satellite link, you
might be talking over a modem. Right, you might be
talking over a cell phone, and so you want your
big file to fit into a smaller pipe.
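
Python's standard library happens to ship `lzma`, a binding to liblzma, the very compression engine the xz Utils project provides, so the library's day job can be shown in a few lines (the payload here is just a toy string):

```python
import lzma

# Highly repetitive data, the kind xz-style compression shrinks dramatically.
original = b"the quick brown fox jumps over the lazy dog " * 500

compressed = lzma.compress(original)

# The round trip is lossless, and the compressed form is far smaller,
# which is the whole point of pulling a library like this into a build.
assert lzma.decompress(compressed) == original
assert len(compressed) < len(original) // 10
print(len(original), "->", len(compressed), "bytes")
```

Unremarkable, useful plumbing like this is exactly why liblzma ends up linked into so many bigger programs.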

Speaker 1 (09:36):
If you've ever used a zip file on your computer, you get the general idea. Smaller files can be transferred faster, which is important when you're dealing with so much data flowing back and forth. xz Utils allows OpenSSH to be both safe and fast, but that's the trick. By inserting a backdoor into xz Utils, the hackers created

(09:56):
a way to access anything being transmitted via OpenSSH. That meant they could not only read supposedly secure messages, but remotely run code on any server that uses OpenSSH. And since basically the entire Internet uses this thing, once you're in there, you can do anything you want.

Speaker 2 (10:15):
You could have used it for a bunch of very
quiet surgical attacks over a multi year period, or you
could have done one humongous big bang where you knock
out a huge chunk of the Internet all at once.

Speaker 1 (10:28):
But how did hackers get access to xz Utils in the first place? Well, remember when I promised to tell you about the open in OpenSSH? OpenSSH, and also Linux, are open source programs. This means that anyone can look at the source code, because it's open and it's posted publicly. The idea is that if everyone works together on the code, it'll be better and the public will benefit.

(10:51):
And so anyone's free to look at the code, to
learn from the code, or even to remix it for
their own use. And even if you have no interest
in all that nerd stuff, you still use versions of
open source code every day on basically all of your
devices.

Speaker 2 (11:05):
When you're running open source software, which people don't understand, basically everybody is, right? So what kind of phone do you have?
Do you have an iPhone or Android?

Speaker 3 (11:13):
I actually have an Android?

Speaker 4 (11:14):
Yeah, okay, so Android.

Speaker 2 (11:15):
A humongous chunk of that code is open source, right? And that is code that is maintained by volunteers, and you have no idea who those people are. Google has no idea who those people are. Google collects all this code from around the internet, they package it all up, and then they put it on a phone, or they send it to Samsung, and Samsung puts it on the phone.

Speaker 1 (11:34):
And before we get any further, iPhone people, this applies
to you too. Your iPhone uses a lot of open
source code also. And don't get me wrong, this is
not a bad thing.

Speaker 2 (11:44):
It's great because it's free and it makes the phone cheaper,
and it's cool that we all get to contribute. But
the flip side is that, yes, OpenSSH itself gets lots of love. The Linux kernel gets lots of love, right? But something like xz Utils, which is this tiny little component over here, does not get enough. And xz Utils at the time was maintained by one person.

(12:06):
That one dude was then manipulated into giving up control of it, and the person he gave up control of it to turned out to be a totally fake persona, to not exist.

Speaker 1 (12:18):
This is where we get to the human part of
the story. The one guy who was maintaining xz Utils, his name was Lasse Collin. He'd been maintaining xz Utils since two thousand and nine, and he was the sole maintainer for the project. He wasn't being paid for it. He was a volunteer. That's usually how open source projects go. In twenty twenty two, Lasse Collin started to get a

(12:39):
lot of requests to make updates to the code. Throughout the year, multiple accounts, seemingly out of nowhere, started complaining that Collin wasn't working fast enough and implying that if he wasn't interested in doing this anymore, maybe he wasn't the guy for the job. And the pressure was getting to him.

Speaker 3 (12:56):
In June of.

Speaker 1 (12:57):
Twenty twenty two, Collin wrote in a public note, quote,
I haven't lost interest, but my ability to care has
been fairly limited, mostly due to long term mental health issues,
but also due to some other things. He also went
on to remind people that quote, It's also good to
keep in mind that this is an unpaid hobby project. Thankfully,

(13:19):
right about that time, a new programmer had come in to help. This new person's name was Jia Tan. Collin seemed a little relieved that finally someone wasn't just complaining but helping. In that same note from June, he wrote that he'd been working a bit with Jia Tan on xz Utils to address all of those complaints, and he said about Jia, quote,

(13:39):
perhaps he will have a bigger role in the future. We'll see. Over the course of a few years, Jia Tan really started to gain Lasse Collin's trust. Jia Tan was the ideal contributor. He didn't just help when he was asked to, but he would offer to take on more work, and by twenty twenty four, Collin had made Jia Tan a co-maintainer on the project, which allowed

(14:02):
him to add code without needing approval.

Speaker 4 (14:05):
This is a human attack, right.

Speaker 2 (14:07):
It all happened in the open, but the way they did it was they created these fake personas, where one guy's super friendly and one guy's a jerk, and the jerk basically is abusing the person who's maintaining the software and saying, oh, I need this change, I need this change.

Speaker 4 (14:22):
You're so slow. Why are you so slow?

Speaker 2 (14:23):
And remember, this guy's not getting paid, right? And so eventually they basically bully this guy into saying, oh, I'm tired of doing this, I don't want to do it anymore. And then the nice guy's like, oh, well, you know, I'll do it for you. I'll take over, man, let me take this burden.

Speaker 1 (14:39):
For you, right, very convenient, right, And.

Speaker 2 (14:42):
This took several years, and so this shows you kind
of the long play. They're willing to spend months and
months and months and in fact years building these personas,
because, like, look, if you just created an account and you're like, hey, I've got code, take it, that wouldn't work. What these people figured out is that you have to create these personas. They have to seem real.

(15:05):
You have to make posts, you have to contribute legit stuff.
You've got to create.

Speaker 4 (15:09):
Kind of a history, build a relationship, you have to
build a relationship.

Speaker 2 (15:12):
And so the guy who maintains it gives it up, like, oh, thank you so much for taking this burden from me, because look at these jerks.

Speaker 4 (15:18):
Now.

Speaker 2 (15:18):
Of course, he doesn't know that the jerks work for the same team, or are maybe even the same person as the nice guy, right? And then he hands it over to this nice guy who's a friend of his,
and then the friend takes it over and then does
a bunch of legitimate stuff, and then in the middle
of all that legitimate stuff inserts a very very subtle backdoor.

Speaker 1 (15:37):
I've seen this backdoor talked about using the word sophisticated, that it was very sophisticated. Yes, in some ways it sounds sophisticated, but in some ways it sounds like it kind of wasn't, because a lot of it just revolved around getting somebody to give them some access.

Speaker 2 (15:53):
The code was sophisticated, The method of getting in there
was very human. It was bugging a guy until he
gave up control. Yes, right. Just being a nuisance. Just being a nuisance. So who was behind those fake personas? We don't know for sure, but Alex has a theory. That's after the break. Over the course of years, the

(16:25):
one guy maintaining this very important tool called xz Utils, Lasse Collin, was being bullied and manipulated online into giving a persona called Jia Tan a lead role in handling the code. But who is Jia Tan? Everybody's been asking this question of, like, who did this? Who's behind this?
this question of like who did this, Who's behind this?

(16:46):
Most of the names have kind of an Asian origin, right? So there's accounts like Jigar Kumar. The key one is Jia Tan, which could be Chinese, could be Korean. Most of the names or the technical indicators point that way, right? So the time zones that this person was working in are kind of the East Asian time zones, so it's like Beijing or Korea. The names are Asian.

(17:09):
Everything points to Asia, which makes a lot of people think it's Russia, actually, because it's just too perfect, right? Somebody spent three years doing all this work, and then, like, let's say you're Chinese. Are you going to use a Chinese name as your fake name? Are you going to spend three years but then work in your normal time zone?

(17:31):
And generally, the only actor who has shown this level of patience, who's been willing to spend three years working on a backdoor like this, the only people who have ever done that are either the United States or the SVR. So Russia, okay, yeah, those are really the only groups where you've seen people spend years kind of

(17:51):
doing this kind of work. And a lot of people don't think it would be the US doing something like this, that they would never mess with something this important. Also, the thing the Russians like is to blame other people, right? Again, because we never got to the point of it being used. Usually attribution is done after something's used, and so it's a lot easier to figure out, because then you can ask cui bono, right? Who benefits?

Speaker 4 (18:14):
But like all of these indicators.

Speaker 2 (18:15):
Pointing specifically to kind of China or Korea makes you
think it's just a little too obvious.

Speaker 1 (18:23):
A major theory in cybersecurity circles is that Jia Tan isn't one person. It's potentially multiple people, but likely Russian hackers
working for the SVR, which is Russia's foreign intelligence service,
and that they tried to cover their tracks even if
it wasn't consistent.

Speaker 4 (18:40):
The guys who.

Speaker 2 (18:40):
Work for the professionals will change their time zones, either around what allows them to avoid detection or specifically around whatever they're doing for attribution.

Speaker 1 (18:51):
Well, there were some times that the time zones actually pointed to an Eastern European time zone, or another time zone, right?

Speaker 2 (18:58):
Yeah, I mean, there is a little mix there, right? So somebody could be working from the eastern side of Russia, or they could be waking up early in Moscow or Saint Petersburg, and then they slipped, right?
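
The sort of commit-timestamp analysis being described can be sketched in a few lines. The timestamps below are hypothetical stand-ins, not Jia Tan's actual commit log; the point is only that the same UTC instants read as office hours in one candidate time zone and as pre-dawn work in another:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical commit timestamps in UTC, standing in for a real commit log.
commits_utc = [
    datetime(2023, 6, 5, 1, 30, tzinfo=timezone.utc),
    datetime(2023, 6, 6, 2, 10, tzinfo=timezone.utc),
    datetime(2023, 6, 7, 3, 45, tzinfo=timezone.utc),
]

def local_hours(commits, utc_offset_hours):
    """Hour of the day each commit lands in a candidate home time zone."""
    tz = timezone(timedelta(hours=utc_offset_hours))
    return [c.astimezone(tz).hour for c in commits]

print(local_hours(commits_utc, 8))  # UTC+8, Beijing: [9, 10, 11], office hours
print(local_hours(commits_utc, 3))  # UTC+3, Moscow: [4, 5, 6], pre-dawn
```

Real attribution work layers holidays, weekends, and slip-ups on top of this, but the core arithmetic is just an offset shift like the one above.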

Speaker 1 (19:09):
In other words, they might have just slipped up and
forgot to change their time zones. Because remember this happened
over the course of years. Maybe somebody had an off
day and forgot to change the computer settings. But Alex
has another reason for suspecting Russia over China.

Speaker 2 (19:25):
Chinese hackers, for the most part, work very rigorous hours.
You can almost always tell when Chinese hackers are working
because they work office hours. They work eight to five,
eight to six. Really, it's like very regular, yeah, okay,
Whereas it's much harder to do time zone stuff based
for the Russians because they they will work whatever hours
they need to work. You know that that scene in

(19:46):
like one of the Bourne movies, where it's like the club scene. I always think about this with the Russians. There's like a club scene in Russia, and you think it's the middle of the night, and he walks out and it's like ten a.m. or something, right? That's what I think about with Russian hackers. Whereas for China, it's amazing, because it's like, oh, six p.m. in Beijing.

Speaker 4 (20:02):
You know, it's like, you know, everybody goes home.

Speaker 3 (20:04):
The hacking stops.

Speaker 4 (20:05):
Yeah, or like Chinese.

Speaker 2 (20:06):
Chinese New Year, Lunar New Year, everybody goes home, goes to see their parents in the village or whatever. Like, hacking stops.

Speaker 4 (20:13):
It's amazing.

Speaker 1 (20:14):
And in the case of xz Utils, looking at the timing of when this Jia Tan was submitting code, there's a bunch of submissions during Lunar New Year, but during big Eastern European holidays like Christmas, crickets. But that leaves a question: what's the motive? Why would the Russian SVR want to do this?

Speaker 4 (20:35):
So OpenSSH, everybody uses it.

Speaker 2 (20:37):
That's why this is so powerful: you don't
to have a specific target in mind, which is why
you'd also spend three years doing it. Because let's say
you're at the SVR, you know, no matter what war
you're involved with, no matter what target you're going after,
OpenSSH is gonna be useful. So this is probably a team at the SVR, and they don't know what it's gonna be used for. They just know they're gonna

(20:58):
get a medal.

Speaker 1 (20:58):
You'll be able to use this at some point, Yeah,
who knows for what?

Speaker 2 (21:01):
And the US does the same thing, right? Like, there's people whose job it is to get the capability, and it's other guys' job, guys who understand the geopolitics, who understand the intelligence,

Speaker 3 (21:10):
To use it.

Speaker 1 (21:12):
But thankfully, last spring, Andres Freund, the Microsoft engineer, was able to discover the backdoor. But this was all by chance. He wasn't looking for it.

Speaker 2 (21:23):
He works on a database called Postgres, so he doesn't work on xz Utils. Postgres is a big open source database program that Microsoft uses in their Azure cloud, so I'm guessing that's why Microsoft pays him. And in the next version of Debian, a popular Linux distribution,

(21:44):
Postgres was running a little bit slower, just tiny, tiny,
a little bit tiny, tiny, little bit.

Speaker 1 (21:49):
Right, so tiny, like how much slower?

Speaker 2 (21:52):
Like, in one specific circumstance, it's taking a couple of milliseconds longer to do something, right? So, like, a millisecond, yeah. But if you're a database guy, that's a lot, right? And so he is super looking into what is going on, and he realizes, oh, it's not actually Postgres that's doing this, it's OpenSSH. And so he could have stopped there, because he could have been like, oh, well, it's

(22:12):
not my problem, right? It's not my thing. And then maybe nobody would have looked at it, right? What you see in open source is that people pass problems to each other all the time. So I think a normal person, even a normal open source developer, would have been like, oh okay, I looked at this, it's not me. I'm gonna let

(22:33):
it go. But he did not let it go. He ended up digging into, okay, well, what changed in OpenSSH? And then he looks into OpenSSH and sees this code. And so what the attackers did is they created what's called a NOBUS backdoor. Nobody but us.
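
The regression Freund chased can be reproduced in miniature with a wall-clock microbenchmark. The two functions below are hypothetical stand-ins (he was timing sshd logins, not Python), with the "backdoored" path padded by a few milliseconds of hidden extra work:

```python
import time

def bench(fn, runs=30):
    """Median wall-clock time of fn over several runs, in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    return sorted(samples)[runs // 2]

def clean_path():
    sum(range(10_000))

def backdoored_path():
    sum(range(10_000))
    time.sleep(0.005)  # ~5 ms of hidden extra work, stand-in for the hooked code

baseline = bench(clean_path)
patched = bench(backdoored_path)
print(f"{baseline:.3f} ms -> {patched:.3f} ms")

# A few milliseconds is invisible to a user and glaring to a database engineer.
assert patched - baseline > 1.0
```

Taking the median rather than the mean is deliberate: it damps the scheduler noise that would otherwise swamp a millisecond-scale difference.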

Speaker 1 (22:51):
NOBUS, or nobody but us, is a way of creating a backdoor into something where nobody but you, the hackers, has the key.

Speaker 2 (23:01):
They wanted a skeleton key that only they can use. But NOBUS backdoors, nobody but us backdoors, are actually hard to sneak in, because they're pretty obviously sketchy.
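
The nobody-but-us property can be sketched with the standard library. The real xz backdoor reportedly verified an asymmetric Ed448 signature against a hard-coded attacker key; this HMAC version is a simplified, hypothetical stand-in (made-up key and command) that just shows the shape: the door only opens for commands authenticated under a key nobody else holds:

```python
import hashlib
import hmac

# Hypothetical key. The real backdoor used asymmetric crypto, so even
# reading the code didn't hand you the ability to use the door.
ATTACKER_KEY = b"only-the-attacker-knows-this"

def backdoor_accepts(command: bytes, tag: bytes) -> bool:
    """Open the 'door' only if the command authenticates under the key."""
    expected = hmac.new(ATTACKER_KEY, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

cmd = b"run-as-root: id"  # made-up command format
good_tag = hmac.new(ATTACKER_KEY, cmd, hashlib.sha256).digest()

assert backdoor_accepts(cmd, good_tag)           # the attacker gets in
assert not backdoor_accepts(cmd, b"\x00" * 32)   # everyone else is locked out
```

To any other attacker probing the server, a NOBUS door like this is indistinguishable from no door at all, which is exactly why it makes such a good skeleton key.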

Speaker 1 (23:12):
So instead of doing everything all at once, they delivered
multiple patches in multiple different places, little things here and
there that wouldn't raise suspicion if you looked at one
or two or three of them, but layered on top
of each other, they created a key that only they
could use.

Speaker 2 (23:30):
And so because they did all this stuff to kind of obfuscate it and make it super secret, they actually created the performance impact that Andres saw and then went way out of his way to pull on. And then he posts in a public post: guys, this is super sketchy, right? Like, look at this code. There's no good argument for what's going on here.

Speaker 1 (23:51):
Right. So, I mean, I kind of have to wonder about what the implications of this are. I mean, clearly it almost worked. Do you think there's hackers out there saying, okay, yeah, let me change my

Speaker 3 (24:06):
Approach and maybe this is the way to do it.

Speaker 4 (24:08):
Yeah.

Speaker 2 (24:08):
I mean what I'm afraid of is we haven't found
any other ones like this. So what I thought would
happen is at the time, I'm like, oh man, we'll
have one or two more of these because everybody started
looking and then nobody else found any other ones.

Speaker 4 (24:21):
Which terrifies me.

Speaker 3 (24:22):
You think there's more like this out there?

Speaker 4 (24:24):
I think it's quite possible there's more like this.

Speaker 2 (24:25):
Yeah, Like, if anybody has an idea, two or three
other people have had the idea, right, So I can't
imagine these are the only people who are like, oh,
I'm gonna go bully some maintainer of one of the
five thousand libraries on Linux to go take it over
or submit a patch. I can't imagine there aren't other
ones. Now, are they in OpenSSH, or are they something much more subtle?

Speaker 4 (24:45):
I don't know.

Speaker 2 (24:46):
I mean, this would have been both in kind of
one of the worst possible places, and it would have
been a skeleton key that only this attacker could have used,
which is like kind of the worst case scenario. It's
also the hardest level of difficulty, right? These people picked the hardest level. If you want to do something much simpler, you go after a much lesser-used

(25:08):
service that's specifically at the target that you're going after.
If you're going after a specific target and you're like, oh,
they use this specific this one specific service that's much
less popular, that doesn't have all these eyeballs on it,
then you don't have to be as tricky.

Speaker 1 (25:24):
There haven't been any like this in OpenSSH, but there
have been other attempts that the Open Source Security Foundation
and the OpenJS Foundation have found that use similar
social tactics. One project received emails from accounts asking to
be designated as project maintainers despite having little prior involvement,
and two other projects saw very similar suspicious patterns. This

(25:47):
kind of social engineering is really effective because you don't
have to manipulate code. You just manipulate the person who
has their hands on the code. And it's only going
to get easier to do and harder to detect.

Speaker 2 (26:00):
Now we're at the point where with AI, like you
could be fake now and I have no idea if
you really exist or vice versa.

Speaker 1 (26:07):
Wait, are you suggesting that doing something like this might be a little bit easier, because somebody could fake that they actually exist? Oh yeah. With a phone conversation or a video conversation?

Speaker 2 (26:18):
Oh yeah, we're already seeing that from the ransomware actors.
It's easy for phone, right, So you're already seeing them
fake people's voices. So people are getting phone calls from
like their CEO. The CEO goes on CNBC for two minutes,
they get their voice from CNBC, they plug it into
an AI voice library, and then you call and, like, hey,

(26:39):
it's Bob, I need you to do a million dollar transfer, right?
So that kind of stuff, and now you see real
time video too. It's not perfect, but it's getting there.

Speaker 3 (26:48):
Yeah.

Speaker 2 (26:49):
The trick, by the way, if this happens to any
of your listeners: the trick is you can ask people to move, touch things in the background, do a three sixty of the head. It's harder for them to do ears, for whatever reason, but they'll get there, right? So, like, if I asked you to take your glasses off, it'd be very hard for the model. Like, take your glasses off.

Speaker 1 (27:05):
By the way, hold on. For those of y'all listening at home, I took my glasses off here, just to double-check it.

Speaker 2 (27:13):
Oh, you kind of froze on me when you did that. So that's sketch, man. It's sketchy AF, as my students say. Sorry, they keep me on my toes at Stanford.

Speaker 1 (27:23):
But you know, in the future, though, it is going to be easier to spoof people's personalities, yeah, and stuff like that. So these things that you're suggesting right now, they work now. Are they going to work in a year?

Speaker 3 (27:36):
So?

Speaker 2 (27:36):
I mean, the good thing about this is open source
developers have become much more paranoid, right, So people have
become much more paranoid about new people. And there's a
downside of that, right, that if you're trying to get into open source, it's harder. There are now projects where it's like, okay, great, let's meet up in person. If somebody's only willing to communicate with you over email, then

(27:57):
you have to be kind of sketched out. Now, there have been some changes since this. I think people have been more paranoid. There's been a bunch of work. On the flip side of AI, traditional code scanning tools, pre-AI code scanning tools, are not extremely good at detecting this kind of malicious code. But there is some hope that some of the newer AI-based code scanning

(28:18):
tools could do this kind of stuff at scale. The flip side is AI is really good at writing code. So, you know, do you now not have to be SVR-level anymore to be able to write a backdoor?

Speaker 4 (28:31):
That's good. That's probably true as well.

Speaker 1 (28:32):
Is open source too much of a risk in the age of AI? And can we protect ourselves from another hack like this?

Speaker 3 (28:40):
That's after the break.

Speaker 1 (28:55):
So, and I want to get back into kind of the play-by-play here, but a lot of this hinges on open source. And I think one of the really kind of concerning things about this entire thing that happened, or almost happened, is the fact that it basically happened in broad daylight. Yes. And it happened

(29:17):
because this is open source. The thing about open source,
I think, is when you start to explain it to
somebody who's never heard of it. Are you familiar with
the galaxy brain meme?

Speaker 4 (29:28):
Yeah?

Speaker 3 (29:29):
Do you know what I'm talking about? Yeah?

Speaker 1 (29:30):
So I feel like this is like that galaxy brain meme,
where at the very top, when you tell somebody about open source, the response is: this is a terrible idea, everybody can see the code. And then you get a little bit further down, it's: oh, this is a great idea, everybody can see the code. And then they hear about xz Utils, and we get down to the bottom, and it's:
a terrible idea. Everybody can see the code. What's the

(29:51):
true galaxy brain take on this for open source?

Speaker 2 (29:54):
I mean, people go back and forth. So one of
the ideas is that if you can see all the code,
you can see all the bugs.

Speaker 3 (30:01):
Right.

Speaker 2 (30:02):
The idea is that because it's open source, it should be more secure than closed source, because you can see the flaws. I don't think that has empirically turned out to be true, right. And so I think what
I would say is I'm a big proponent of open source.
I think it's great. I think it has a humongous
economic benefit to the world. The truth is, the

(30:25):
entire kind of cloud computing revolution we're all living through
only exists because of open source software. So that's an
incredible thing. That's a wonderful thing. Open source is great
from an economic perspective, it is great from an innovation perspective.
We should not pretend that it magically solves trust and
security problems. And if you're a company that's relied upon

(30:45):
open source, you have an ethical and moral obligation to deal with the security aspects of it, and to contribute back. And I do think that is something that's gotten lost: people have just kind of assumed somebody else is dealing with it, and everybody assumes somebody else is doing the security work, and that turns out not to be true.

Speaker 1 (31:06):
You know, I think that really gets to the core of what a lot of this is. Because if somebody sees "xz Utils, there was a potential security flaw in that," it's: okay, well, I don't care about that.

Speaker 3 (31:17):
What's that?

Speaker 1 (31:18):
Oh, well, you know, it's involved with OpenSSH. Well, I don't use that either. I don't have that app on my phone. I don't know what you're talking about.
And in this weird way, I feel like the more technology actually starts to become just magic, that things just work, yeah, the less tech literate we actually are. All the stuff that was science fiction even

(31:42):
ten years ago, two years ago frankly, is just normal now.

Speaker 4 (31:47):
Yeah.

Speaker 1 (31:47):
And so we're able to do so much with technology, just regular people, things we just do with our phone every day, that we've become really removed from the technology itself, and so fewer and fewer of us actually know how to use a computer. Yeah, and so this feels totally removed from us. This is like, oh, this is some weird nerd shit. Like, I don't use that nerd program. Doesn't affect me.

Speaker 2 (32:10):
Yeah, no, you're totally right. I mean, I tell my Stanford students security is one of the best fields to get into professionally because it's the only part of computers that gets worse every year. Everything else magically gets better, man. So you could find yourself in any other field being made irrelevant, but if you get into security, you have job security for life, because every year

Speaker 4 (32:29):
I've been in it, it's gotten worse.

Speaker 2 (32:32):
And one of the reasons is because, you say it's nerd shit, but even the nerds, we, the normal median nerd, gets further and further away from the truth, the reality
of what's going on on computers. So when I learned
how to program, I learned assembly language, right, I learned
how to write like the lowest level languages. And then

(32:55):
you know, they stopped teaching assembly language unless you took special classes, and you learn in Python, right, a very high level language where you don't even, you know, you don't learn how to do memory management.

Speaker 3 (33:07):
Right, I mean, Python.

Speaker 1 (33:09):
And just to break this down: Python, for a casual person, you can look at it, you can kind of tell what's going on. It basically looks like English. Yeah. Assembly is letters and numbers.

Speaker 2 (33:19):
Right, right, But the nice thing about assembly is it's
the truth of the matter.

Speaker 3 (33:23):
Right.

Speaker 2 (33:23):
It has a one-to-one matching to what the processor itself is doing. And from a security perspective, if you look at it, the reality of what a security flaw is, is seen in the assembly.

Speaker 3 (33:33):
Right.

Speaker 2 (33:34):
In Python, you get further abstracted away, you get further from the reality of what's actually going on on the computer. Now, what you see is incredibly powerful, it's incredibly cool, and so I'm not gonna crap on it, because I think it's an incredibly good thing for people. But you look at, like, Claude 3.7 code, you know, this new Claude model, and you see people on Twitter who don't know anything about

(33:56):
computers, and they're able to program now, because they can go in there and they can say, build me software that does X. And that is going to be terrible for security. It's super cool for people's economic opportunities, because anybody can become a programmer right now. But man, are people in security gonna love it, because now you don't need to know anything about how computers work and you're just gonna ask the AI system to build it

(34:18):
for you.

Speaker 4 (34:18):
And I see it with my students.

Speaker 2 (34:19):
Stanford students, like, one of the top computer science programs in the world, and you can graduate and not actually really understand how operating systems work. I apologize to the Stanford Computer Science department, right, but really, like, you can have
a totally productive career in Silicon Valley and not really
understand what's going on three or four layers down. In fact,

(34:39):
it's better for you not to, right. It's better because you're at the high level where you're much more productive. You're much more productive having the AI do the work for you. You're much more productive having GitHub Copilot help you rewrite stuff. You're much more productive using all the
cloud intermediation layers. And so that's one of the reasons
why security gets worse every single year is that we
add these layers of abstraction that makes things easier for people.

(35:02):
And AI is the ultimate abstraction layer, because now you can talk to computers in plain English and have them do incredibly complex things.

Speaker 1 (35:12):
The thing about this whole story, I mean, I'm thinking about, you know, we're in a time right now where anything bad happens or almost happens, Netflix documentary, Hulu documentary, it's a true crime podcast at some point. I don't see that happening with this. This is something that, as you were saying, truly could have been catastrophic. Yeah,

(35:38):
but it's also kind of boring.

Speaker 2 (35:41):
Well, you don't think I could sell ten episodes to Netflix on this?

Speaker 1 (35:45):
If you can hire me as a producer, I'd love
to help. But you see what I'm saying, it takes
a while to even explain what the heck we're talking about. Yeah,
And I think that comes back to some of this
in the same way that this vulnerability was introduced via
social engineering. A lot of this is social I mean
a lot of your work you probably think about this.

(36:06):
How do you get people to care about something like this?

Speaker 2 (36:09):
I mean, so that's a challenge. That's one of the biggest challenges. If you're, like, a chief information security officer, one of your big jobs is getting the rest of the company to care about security. CISOs, we have a reputation of being the people who say no all the time. So I was the CISO of Facebook, and I once walked into a meeting with a bunch of other VPs

(36:29):
and somebody literally said, like, oh shit, Stamos is here. And I'm like, hey guys, I

Speaker 4 (36:34):
Can hear you. I can hear you, and like, no, no,
it's not you.

Speaker 2 (36:37):
It's just, like, whenever you come, it's just because you're telling us, like, there's a coup in Turkey or something terrible. Because I was just the bearer of bad news, right. But this is a real challenge for my colleagues, and it's a real challenge for us as a society.
People don't want to think that the systems that they
rely upon are fragile, and I think that's like a

(36:57):
real problem.

Speaker 3 (37:02):
What do we learn from this? What is it?

Speaker 1 (37:05):
Let me just say, because I personally don't think so, just from being out and talking to people. If I try to tell somebody, hey, yeah, man, what do you think about the xz Utils thing?

Speaker 3 (37:13):
Have you?

Speaker 4 (37:14):
You know, hey buddy, what's up?

Speaker 3 (37:15):
Yeah? Has that Has it changed anything about how? Yeah?

Speaker 1 (37:18):
Has that changed anything about how you go about your life?
People can tell me no. So I got to ask
somebody who's actually closer to this. Has this changed how
you approach things? Has this changed how the industry approaches things?
Has this changed how? I mean the theory that you're
putting out is that this is a state actor? Has
this changed how national security is being looked at?

Speaker 2 (37:41):
So for companies that know what they're doing, it has changed how they approach open source. For a handful of really big, you know, like the Googles, the Metas, the Amazons, the Microsofts, the really big tech companies that do a lot of open source work, they are looking more carefully at open source. For security companies, to do this work

(38:01):
we're investing in software and AI that can do this work for us. But it has not changed anything massively, right.
We're still running Linux, We're still all pulling in fifty
thousand packages. We have these humongous dependency graphs. The truth is,
you can't just pivot all these things, right. It has
made us more concerned about these problems. When you talk
to CISOs, my colleagues and I, we're all more concerned.

(38:24):
But we can't magically pivot off of the infrastructure we
have built over a decade. I do not think we've
dealt with the fact that if you get on the subway in the morning and you look around, most of the people on that train, in their pocket, xz Utils is in their pocket. For every single person in there, hundreds

(38:44):
of copies of their Social Security number are sitting on servers that would have been backdoored by this attack. That's how you can think of it, right. So that's how close.

Speaker 3 (38:55):
We came, man.

Speaker 1 (39:01):
So just some closing thoughts here. Again, the reason that most people don't know about what was almost the biggest hack in the history of the Internet is because this is really hard to describe to a non-technical audience. I mean, when you say xz Utils or Linux or OpenSSH, people's eyes just roll into the back of their heads.

Speaker 3 (39:19):
But we can't.

Speaker 1 (39:20):
allow tech literacy to be a barrier to understanding how the world works. And the truth is, even beyond all the tech jargon, a lot of these things are very human, and they're not so hard to understand. And so
that's one of the things that we're really trying to
do here on kill Switch as we keep doing these episodes,
is to open it up so that more people are
able to feel like they're part of the conversations that

(39:43):
affect all of us. And that is it for this one,
for real. Thank y'all so much for listening to kill Switch.
You can hit us up at kill Switch at Kaleidoscope
dot NYC if you've got any thoughts or if there's
anything you want us to cover in the future, and
you can get me at dexdigi, that's d-

(40:04):
e-x-d-i-g-i, on Instagram or Bluesky if that's more your thing. And if you like the episode, you know, take that phone out of your pocket and leave us a review. It helps people find the show, which in turn helps us keep doing our thing. And
this thing is hosted by me Dexter Thomas. It's produced
by Sena Ozaki, Daryl luck Potts and Kate Osborne. Our

(40:26):
theme song is by Kyle Murdoch, who also mixes the show. From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers are Katrina Norville and Nikki Ettore.

Speaker 3 (40:40):
That's it for this time.

Speaker 1 (40:42):
Catch you on the next one.
