
May 28, 2025 41 mins

This is an episode of Kill Switch – a new podcast about our supercharged technological lives. In this episode, host Dexter Thomas explores the biggest hack you’ve never heard of and how one man saved us from complete disaster. This is the XZ Utils story.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:12):
Hey, it's Oz Woloshyn, host of TechStuff. This week,
we want to do something a little bit different and
share an episode of a show that's new to the
Kaleidoscope and iHeart Podcast family. It's called Kill Switch. Kill
Switch is a technology survival guide to modern culture. Host

(00:33):
Dexter Thomas answers questions big and small, pulling back the
curtain on the systems we rely on daily, as well
as the more absurd corners of the Internet. So here's
an episode of Kill Switch.

Speaker 2 (00:54):
Quick question: do you know about the XZ Utils backdoor hack?

Speaker 3 (00:59):
So what?

Speaker 2 (01:00):
Or wait? Wait, wait, wait, wait, what? The XZ Utils
backdoor? I have no idea what you're talking about.

Speaker 1 (01:06):
I don't know what that is.

Speaker 2 (01:09):
This is something almost nobody's heard of, but in the
spring of twenty twenty four we narrowly avoided a complete
technological disaster. So you've never heard of this though?

Speaker 1 (01:20):
Nope?

Speaker 3 (01:22):
Yeah, I just searched it up on Wikipedia and it
seems way too dense to read about.

Speaker 2 (01:25):
These aren't just random people. These are
other journalists, people in general who keep up with the news.

Speaker 3 (01:32):
Okay, I was like, wait, what did I miss? And
I feel bad, but I guess maybe I'm not the
only one, what journalist who doesn't know what this is about.

Speaker 2 (01:42):
Even they didn't really know about what could have been
the biggest hack in the history of the Internet.

Speaker 3 (01:48):
If this had not been caught, then this would have
been a skeleton key that would have allowed these attackers
to break into tens of millions of incredibly important servers
around the world. We probably would have had airlines not working,
trading halted, ATMs not working, banks not working, people not
able to get their money. You'd have a huge loss

(02:11):
of credibility of technology in people's lives.

Speaker 2 (02:15):
Alex Stamos is a cybersecurity expert. Specifically, he's the Chief
Information Security Officer, or CISO, at a cybersecurity company called
SentinelOne, and he's the former CISO at Facebook. He's
also a lecturer in the computer science department at Stanford,
and this attempted hack is something that is still keeping
him up at night.

Speaker 3 (02:36):
It's fallen out of popular discussion, but among people in
security we're still talking about it. It uncovered a real
fundamental weakness that terrifies lots of people who have responsibility
in this area.

Speaker 2 (02:52):
And what scares them most, and this should scare us too,
is that this was caught by complete chance.

Speaker 3 (02:58):
We just got lucky, like one dude got really bored
and noticed a tiny little change in the speed of
one program executing and pulled the thread, and on the
end of this thread was a humongous, ticking time bomb.
It was one dude, and he should never have to
buy a beer for himself ever again. Andres Freund, I'm

(03:22):
raising a toast to you right now. This is just water,
but I wish it was more.

Speaker 2 (03:33):
From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm
Dexter Thomas. If you've never heard about this,

(04:09):
that's no reason to feel bad, but if it hadn't
been caught, it absolutely would have affected you. What kind
of activity are we talking about here?

Speaker 3 (04:19):
Well, we really don't know, and because we don't know
who the attackers are, we don't know whether that would
have been used for really quiet surveillance. It could have
been used for national security intelligence gathering purposes, It could
have been used for a humongous heist of hundreds of
millions or billions of dollars of cryptocurrency, or it could

(04:40):
have been used as part of a massive cyber attack
to shut down millions of computers and cause massive disruptions.

Speaker 2 (04:47):
One of the main reasons that this potential attack isn't
talked about much is because the details are kind of technical. Well,
some of the details are. A lot of this stuff
is really just basic human behavior. It's stuff that you
or I could do if we really wanted, and it
shows us that sometimes the best hacks are the simplest ones.
Let me break it down for you. In late March

(05:10):
of twenty twenty four, Andres Freund, who's an engineer at Microsoft,
was sitting at his desk doing his job when he
discovered a malicious piece of code in this little-known
tool called XZ Utils. This code created a method that
would allow hackers to access a lot of different computers.
Maybe right now you're thinking, okay, so why is this
a problem for me? I mean, I don't use XZ Utils,

(05:32):
so they couldn't get on my computer. And yeah, maybe
you've never heard of XZ Utils. Actually I hadn't either,
and I did what most people do when they don't
understand something about a computer: call an expert. But it
turns out that this really well-respected expert found
out about XZ Utils when I did.

Speaker 3 (05:52):
Yeah, so I personally had not heard of XZ Utils
before this, even.

Speaker 2 (05:56):
You really? Yeah, I had definitely not heard of XZ Utils.
I figured you would have. Hearing that you had not
heard of it before all this happened, frankly, that's a
little bit more scary to me. Now, so why does
this backdoor into a program that no one seems to
know about matter so much? And what is XZ Utils?

Speaker 3 (06:17):
This is the brilliance of what these attackers did. XZ
Utils is an ingredient to an ingredient to an ingredient
to something really important. So the thing that they wanted
to have a backdoor into is a really important program
called OpenSSH. So this is something that every techie
has heard of.

Speaker 2 (06:36):
All right, but what if you're not a techie? So
in order to understand the XZ Utils hack, we do
need to back up and understand something that XZ Utils
is used in: this thing called OpenSSH.

Speaker 3 (06:47):
This is the program that the majority of Unix-like systems,
especially Linux, but also Macs and some other operating systems,
use to let you access them remotely over the Internet.

Speaker 2 (07:03):
I'll get to the open later. But SSH stands for
Secure Shell, and let's just focus on secure right now.
If you think of the difference between posting a tweet
online and DMing someone, you're actually kind of halfway there.
OpenSSH allows you to communicate with a remote computer
just like you were sitting there right in front of it.
So even though you're far away, if you want to

(07:23):
send a message, or install programs or delete files, you
know that the connection is safe and that nobody else
can see what you're doing or tamper with that connection.
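
For anyone who wants to see what that looks like in practice, here's a minimal sketch of talking to a remote machine over SSH from code, using the Python paramiko library. The host name, username, and key path are placeholders for illustration, not anything from this story.

```python
import os
import paramiko  # a widely used Python SSH client

# Placeholder connection details; substitute your own server and credentials.
HOST = "server.example.com"
USER = "dexter"
KEY_PATH = os.path.expanduser("~/.ssh/id_ed25519")

client = paramiko.SSHClient()
# Accept the server's host key automatically (fine for a demo, not for production).
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER, key_filename=KEY_PATH)

# Run a command on the remote machine as if you were sitting in front of it.
stdin, stdout, stderr = client.exec_command("uptime")
print(stdout.read().decode())

client.close()
```

Everything in a session like that is supposed to be encrypted end to end, which is exactly why a backdoor sitting underneath OpenSSH would have been so valuable.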

Speaker 3 (07:33):
So you know, when you see people in The
Matrix typing really fast, you see a lot of text, right?
If somebody is doing that remotely, it's probably over OpenSSH.

Speaker 2 (07:42):
You might think this doesn't matter for you because you
don't use OpenSSH, but you do, because that's what
you use to connect to systems running Linux around the world.

Speaker 3 (07:52):
Linux has become the standard operating system for the cloud.
So when you talk to Google, you're talking to a Linux system.
When you talk to Facebook, you're talking to a Linux system.
When you talk to Apple, you're probably talking to a
Linux system. Right now, the system that we're talking to
each other with almost certainly is running Linux. So the
vast majority of systems you talk to in the cloud

(08:14):
are running Linux.

Speaker 2 (08:15):
Linux is used for Apple's iCloud, for social media sites
like Facebook, Instagram, for YouTube, for Twitter, It's used for
the New York Stock Exchange. Gamers use it when they
run Steam or they play games online, and the list
goes on. The vast majority of the Internet runs on
Linux, and OpenSSH makes sure that it's you logging

(08:36):
in and not somebody else.

Speaker 3 (08:38):
When you log in and you get your mail, the
server that holds your mail has SSH on it. The server
that holds your social media content has SSH. The servers
that have your banking information have SSH. It's the door
by which you get into these systems.

Speaker 2 (08:52):
So OpenSSH is incredibly important to the Internet and
all the cloud systems that we rely on, and because
of that, it has a lot of eyes on it.
Trying to hack OpenSSH directly would pretty much be impossible.
Someone would catch you pretty quick.

Speaker 3 (09:09):
People pay a lot of attention to it, a lot
of people run their code scanners on it, a lot
of people look for bugs in it, and so it
has been a while since OpenSSH has had itself
a humongous security flaw in it. If you just joined
the OpenSSH project and said, hey, I'm a new
guy that nobody ever knew, here's my code, everybody would

(09:30):
be super suspicious, right, and whoever these bad guys are,
they know that. So what they did was they looked
at OpenSSH and they looked at what we call its
dependency graph. They looked at all the stuff that
goes into OpenSSH, and what they saw was OpenSSH
depends on other things.

Speaker 2 (09:51):
This is where XZ Utils comes in. XZ Utils is
one of the things that OpenSSH depends on. What
does XZ Utils actually do? It's a compression library.

Speaker 3 (10:02):
So it's just a library that is used to make
data that comes in smaller so that if you're moving
like a big file back and forth, it can fit
down a smaller pipe. Right, you might be talking to
a server on a satellite link, you might be talking
over a modem. Right, you might be talking over a
cell phone, and so you want your big file to
fit into a smaller pipe.
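
You can see the "smaller pipe" idea in a few lines of code. Python's standard lzma module is built on liblzma, the same compression engine that ships as part of XZ Utils. The sample data below is made up purely for illustration.

```python
import lzma  # Python's built-in binding to liblzma, the XZ Utils compression library

data = b"log line: everything is fine\n" * 2000   # ~58 KB of repetitive data
compressed = lzma.compress(data)                   # produces .xz-format output

print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")
assert lzma.decompress(compressed) == data         # round-trips back to the exact bytes
```

Highly repetitive data like logs shrinks dramatically, which is part of why a library like this ends up quietly wired into so many other programs.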

Speaker 2 (10:21):
If you've ever used a zip file on your computer,
you get the general idea. Smaller files can be transferred faster,
which is important when you're dealing with so much data
flowing back and forth. XZ Utils allows OpenSSH to be
both safe and fast. But that's the trick. By inserting
a backdoor into XZ Utils, the hackers created a way

(10:42):
to access anything being transmitted via OpenSSH. That meant
they could not only read supposedly secure messages, but remotely
run code on any server that uses OpenSSH. And
since basically the entire Internet uses this thing, once you're
in there, you can do anything you want.

Speaker 3 (11:00):
You could have used it for a bunch of very
quiet surgical attacks over a multi year period, or you
could have done one humongous big bang where you knock
out a huge chunk of the Internet.

Speaker 2 (11:12):
All at once. But how did hackers get access to
XZ Utils in the first place? Well, remember when I promised
to tell you about the open in OpenSSH? OpenSSH,
and also Linux, are open source programs. This means
that anyone can look at the source code because it's
open and it's posted publicly. The idea is that if
everyone works together on the code, it'll be better and

(11:34):
the public benefits, and so anyone's free to look at
the code, to learn from the code, or even to
remix it for their own use. And even if you
have no interest in all that nerd stuff, you still
use versions of open source code every day on basically
all of your devices.

Speaker 3 (11:50):
When you're running open source software, which people don't realize,
but basically everybody is, right. So what kind of phone do
you have? Do you have an iPhone or Android?

Speaker 2 (11:58):
I actually have an Android?

Speaker 3 (11:59):
Yeah, okay, Android. A humongous chunk of that code is
open source, right, right. And that is code that is
maintained by volunteers, and you have no idea who those
people are. Google has no idea who those people are, right.
Google collects all this code from around the internet, they
package it all up, and then they put it on
a phone, or they send it to Samsung, and Samsung
puts it on the phone.

Speaker 2 (12:19):
And before we get any further, iPhone people, this applies
to you too. Your iPhone uses a lot of open
source code also. And don't get me wrong, this is
not a bad thing.

Speaker 3 (12:29):
It's great because it's free and it makes the phone cheaper,
and it's cool that we all get to contribute. But
the flip side is that, yes, OpenSSH itself gets
lots of love, the Linux kernel gets lots of love, right,
but something like XZ Utils, which is this tiny little
component over here, does not get lots of love. And
XZ Utils at the time was maintained by one person.

(12:51):
That one dude was then manipulated into giving up control
of it, and the person he gave up control of
it to turned out to be a totally fake persona
that does not exist.

Speaker 2 (13:03):
This is where we get to the human part of
the story. The one guy who was maintaining XZ Utils,
his name was Lasse Collin. He'd been maintaining XZ
Utils since two thousand and nine and he was the
sole maintainer for the project. He wasn't being paid for it.
He was a volunteer. That's usually how open source projects go.
In twenty twenty two, Lasse Collin started to get a

(13:24):
lot of requests to make updates to the code. Throughout
the year, multiple accounts, seemingly out of nowhere, started complaining
that Collin wasn't working fast enough and implying that if
he wasn't interested in doing this anymore, maybe he wasn't
the guy for the job, and the pressure was getting
to him. In June of twenty twenty two, Collin wrote
in a public note, quote, I haven't lost interest, but

(13:47):
my ability to care has been fairly limited, mostly due
to long-term mental health issues, but also due to
some other things. He also went on to remind people
that, quote, it's also good to keep in mind that
this is an unpaid hobby project. Thankfully, right about
that time, a new programmer had come in to help. This

(14:08):
new person's name was Jia Tan. Collin seemed a little
relieved that finally someone wasn't just complaining but helping. In
that same note from June, he wrote that he'd been
working a bit with Jia Tan on XZ Utils to address
all of those complaints, and he said about Jia, quote,
perhaps he will have a bigger role in the future.
We'll see. Over the course of a few years, Jia

(14:31):
Tan really started to gain Lasse Collin's trust. Jia Tan
was the ideal contributor. He didn't just help when he
was asked to, but he would offer to take on
more work, and by twenty twenty four, Collin had made
Jia Tan a co-maintainer on the project, which allowed
him to add code without needing approval.

Speaker 3 (14:50):
This is a human attack, right. It all happened in
the open, but the way they did it was they
created these fake personas where one guy's super friendly and
one guy's a jerk, and the jerk basically is abusing
the person who's maintaining the software and saying, oh, I
need this change, I need this change. You're so slow.
Why are you so slow? And remember, this guy's not
getting paid, right. And so eventually they basically bully this

(15:13):
guy to say, oh, I'm tired of doing this. I
don't want to do it anymore. And then the nice
guy's like, oh, well you know, I'll do it for you.
I'll take over, man, let me take this burden for you.

Speaker 2 (15:24):
Right, very convenient, right.

Speaker 3 (15:27):
And this took several years, and so this shows you
kind of the long play. They're willing to spend months
and months and months and in fact years building these
personas because like, look, if you just created an account
and you're like, hey, i've got code, take it, that
wouldn't work. So what these people figured out is that
you have to create these personas. They have to seem real.

(15:49):
You have to make posts, you have to contribute legit stuff,
you've got to create kind of a history, build a relationship.
You have to build a relationship. And so the guy
who maintains it gives it up, like, oh, thank
you so much for taking this burden from me, because
look at these jerks. Now, of course he doesn't know
that the jerks work for the same team, or maybe
are even the same person as the nice guy, right,

(16:10):
And then he hands it over to this nice guy
who's a friend of his, and then the friend takes
it over and then does a bunch of legitimate stuff,
and then in the middle of all that legitimate stuff
inserts a very very subtle backdoor.

Speaker 2 (16:22):
I've seen this back door talked about using the phrase sophisticated,
that it was very sophisticated. Yes, in some ways it
sounds sophisticated, but in some ways it sounds like it
kind of wasn't because a lot of it just revolved
around getting somebody to give them some access.

Speaker 3 (16:38):
The code was sophisticated, the method of getting in there
was very human. It was bugging a guy until he
gave up control. Yes, right. Just being a nuisance. Just
being a nuisance. So who was behind those fake personas?
We don't know for sure, but Alex has a theory.
That's after the break. Over the course of years, the

(17:10):
one guy maintaining this very important tool called XZ Utils,
Lasse Collin, was being bullied and manipulated online to
give a persona called Jia Tan a lead role in
handling the code. But who is Jia Tan? Everybody's been
asking this question of, like, who did this? Who's behind this?

(17:31):
Most of the names have kind of an Asian origin, right.
So there's accounts like Jigar Kumar. The key one is
Jia Tan, which, like, could be Chinese, could be Korean.
Most of either the names or the technical indicators
point to Asia, right. So the time zones that this
person was working in are kind of the East Asian
time zone, so it's like Beijing or Korea. The names

(17:53):
are Asian. Everything points to Asia, which makes a lot
of people think it's Russia actually, because it's just too perfect, right?
Which is like, somebody spent three years doing all
this work, and then you're like, let's say you're Chinese.
Are you gonna use like a Chinese name as your
fake name? Are you going to spend three years but

(18:13):
then work in your normal time zone? And generally,
the only actor who has shown this level of patience,
who's been willing to spend three years working on a
backdoor like this... The only people who have ever
done that is either the United States or the SVR,
so Russia. Okay, yeah, those are really the only groups where

(18:33):
you've seen people spend years kind of doing this kind
of work. And a lot of people don't think it
would be the US doing something like this, that they
would never mess with something this important because also the
thing the Russians really like to do is blame other
people, right. Again, because we never got to the point
of it being used. Usually attribution is done after something's used,
and so it's a lot easier to figure out, because

(18:55):
then you can ask cui bono, right, who benefits. But
all of these indicators pointing specifically to kind of China
or Korea make you think it's just a little too obvious.

Speaker 2 (19:07):
A major theory in cybersecurity circles is that Jia Tan isn't
one person. It's potentially multiple people, but likely Russian hackers
working for the SVR, which is Russia's foreign intelligence service,
and that they tried to cover their tracks, even if
they weren't consistent about it.

Speaker 3 (19:24):
The guys who work for the professionals will change their
time zones, specifically around either what allows them to
avoid detection or whatever they're doing for attribution.

Speaker 2 (19:35):
Well, there were some times that the time zones actually
pointed to an Eastern European time zone or
another time zone, right?

Speaker 3 (19:43):
Yeah, I mean, there is a little mix, right.
So somebody could be working from the eastern side of Russia,
or they could be waking up early in Moscow or Saint
Petersburg, and then they slipped, right.

Speaker 2 (19:54):
In other words, they might have just slipped up and
forgot to change their time zones, because remember, this happened
over the course of years. Maybe somebody had an
off day and forgot to change the computer settings. But
Alex has another reason for suspecting Russia over China.

Speaker 3 (20:10):
Chinese hackers, for the most part, work very rigorous hours.
You can almost always tell when Chinese hackers are working
because they work office hours. They work eight to five,
eight to six. Really, it's like very regular, yeah okay,
Whereas it's much harder to do time zone-based stuff
for the Russians, because they will work whatever hours
they need to work. You know that scene in

(20:31):
like one of the Bourne movies where it's like the
club scene. I always think about this with the Russians.
There's like a club scene in Russia, and it's like
you think it's the middle of the night and he
walks out, it's like ten am or something. Right, It's like,
that's what I think about with Russian hackers. Whereas like
for China, it's amazing because it's like, oh, six pm
in Beijing, you know, it's like you know, everybody goes home.

Speaker 2 (20:49):
So the hacking stops.

Speaker 3 (20:50):
Yeah, or like Chinese New Year, Lunar New Year,
everybody goes home, goes to see their parents in the village
or whatever, like hacking stops. It's amazing.

Speaker 2 (20:59):
And in the case of XZ Utils, looking at
the timing of when this Jia Tan was submitting code, there's
a bunch of submissions during Lunar New Year, but during
big Eastern European holidays like Christmas, crickets. But that leaves
a question: what's the motive? Why would the Russian SVR
want to do this?

Speaker 3 (21:20):
So OpenSSH, everybody uses it. That's why this is so powerful:
you don't have to have a specific target in mind,
which is why you'd also spend three years doing it.
Because let's say you're at the SVR. You know, no
matter what war you're involved with, no matter what target
you're going after, OpenSSH is going to be useful.
So this is probably a team at the SVR who,

(21:40):
they don't know what it's going to be used for. They're
just, they know they're going to get a medal.

Speaker 2 (21:43):
You'll be able to use this at some point, Yeah,
who knows for what?

Speaker 3 (21:46):
And the US does the same thing, right, Like, there's
people whose job it is to get the capability, and
it's other guys' job, who understand the geopolitics, who understand
the intelligence, to use it.

Speaker 2 (21:57):
But thankfully, last spring, Andres Freund, the Microsoft engineer,
was able to discover the backdoor. But this was
all by chance. He wasn't looking for it.

Speaker 3 (22:08):
He works on a database called Postgres, so he doesn't
work on XZ Utils. He works on Postgres, which is a big
open source database program that Microsoft uses in their Azure cloud,
so I'm guessing that's why Microsoft pays him. And in
the next version of Debian, a popular Linux distribution,

(22:29):
Postgres was running a little bit slower, just tiny, tiny,
a little bit, tiny, tiny.

Speaker 2 (22:33):
A little bit, right. So tiny. Like, how much slower?

Speaker 3 (22:36):
Like in one specific circumstance, it was taking a couple
of milliseconds longer to do something.
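
To make the scale concrete: a couple of milliseconds is tiny, but it is measurable if you benchmark the same operation over and over. Here's a rough, hypothetical sketch of that kind of comparison in Python; the command names are placeholders, and this is not how Freund's actual investigation was done.

```python
import statistics
import subprocess
import time

def median_runtime_ms(cmd, runs=25):
    """Run a command repeatedly and return the median wall-clock time in milliseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, capture_output=True, check=False)
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.median(samples)

# Hypothetical comparison of the same operation against an old and a new build.
# old = median_runtime_ms(["ssh", "old-test-host", "true"])
# new = median_runtime_ms(["ssh", "new-test-host", "true"])
# print(f"old build: {old:.1f} ms, new build: {new:.1f} ms")
```

Taking the median over many runs smooths out noise, which is why a consistent shift of even a few milliseconds can stand out to someone who benchmarks for a living.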

Speaker 2 (22:40):
Right. So, like, a millisecond.

Speaker 3 (22:42):
Yeah, but like if you're a database guy, that's a lot, right.
And so he is super looking into what is going on,
and he realizes, oh, it's not actually Postgres that's doing this,
it's OpenSSH. And so he could have
stopped there, because he could have been like, oh, well,
it's not my problem, right, it's not my thing, and
then maybe nobody would have looked at it, right. Like,
what you see in open source is that people pass

(23:04):
problems to each other all the time, right. So I
think a normal person, even a normal
open source developer, would have been like, oh, okay,
I looked at this, it's not me. I'm gonna let
it go. But he did not let it go. He
ended up digging into, okay, well, what changed in OpenSSH?

(23:25):
And then he looks into OpenSSH
and sees this code. And so what
the attackers did is they created what's called a
NOBUS backdoor: nobody but us.

Speaker 2 (23:36):
NOBUS, or nobody but us, is a way of
creating a backdoor into something where nobody but us, or, you
know, the hackers, have the key.

Speaker 3 (23:46):
They want a skeleton key that only they can use,
but NOBUS backdoors, nobody-but-us backdoors,
are actually hard to sneak in because they're pretty,
like, obviously sketchy.

Speaker 2 (23:57):
So instead of doing everything all at once, they delivered
multiple patches in multiple different places, little things here and
there that wouldn't raise suspicion if you looked at one
or two or three of them, but layered on top
of each other, it created a key that only they
could use.
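
To make "a key that only they could use" a little more concrete: public write-ups of the backdoor describe it as checking a cryptographic signature before obeying anyone, so only whoever holds the matching private key gets in. The sketch below illustrates that general idea with an ordinary Ed25519 signature check using the Python cryptography package; it's a conceptual illustration, not the actual backdoor code.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The attackers keep the private key; only the matching public key is baked
# into the compromised code, so only they can produce commands it will accept.
attacker_key = Ed25519PrivateKey.generate()
baked_in_public_key = attacker_key.public_key()

def gate(command: bytes, signature: bytes) -> bool:
    """Accept a command only if it was signed with the attackers' private key."""
    try:
        baked_in_public_key.verify(signature, command)
        return True
    except InvalidSignature:
        return False

command = b"open the door"
print(gate(command, attacker_key.sign(command)))  # True: the key holder gets in
print(gate(command, b"\x00" * 64))                # False: everyone else is locked out
```

That's what makes a NOBUS backdoor a skeleton key rather than an open hole: anyone who finds the door still can't walk through it without the key.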

Speaker 3 (24:15):
And so because they did all this stuff to kind
of obfuscate it and make it super secret, they actually
created the performance impact that Andres saw and then went
way out of his way to pull on. And then he
posts in a public post, guys, this is super sketchy, right? Like,
look at this code. There's no good argument for what's going on here.

Speaker 2 (24:35):
Right. So, I mean, I kind of have
to wonder about what the implications for this are. I mean,
this clearly, it almost worked. Do you think there's hackers
out there saying, okay, yeah, let me change my
approach, and maybe this is the way to do it?

Speaker 3 (24:53):
Yeah. I mean, what I'm afraid of is we haven't
found any other ones like this. So what I thought
would happen is at the time I'm like, oh, man,
we'll have one or two more of these, because everybody
started looking and then nobody else found any other ones,
which terrifies me.

Speaker 2 (25:07):
You think there's more like this out there.

Speaker 3 (25:09):
I think it's quite possible there's more like this. Yeah.
Like, if anybody has an idea, two or three other
people have had the idea, right, So I can't imagine
these are the only people who are like, oh, I'm
gonna go bully some maintainer of one of the five
thousand libraries on Linux to go take it over or
submit a patch. I can't imagine there aren't other ones. Now,
are they in OpenSSH or are they something much more subtle?

(25:30):
I don't know. I mean this would have been both
in kind of one of the worst possible places, and
it would have been a skeleton key that only this
attacker could have used, which is like kind of the
worst case scenario. It's also the hardest level of difficulty, right,
these people picked the hardest level. Thing. If you want
to do something much much simpler is you go after

(25:51):
a much lesser used service that's specifically at the target
that you're going after. If you're going after a specific
target and you're like, oh, they use this specific this
one specific service that's much less popular, that doesn't have
all these eyeballs on it, then you don't have to
be as tricky.

Speaker 2 (26:09):
There haven't been any like this in OpenSSH, but there
have been other attempts that the Open Source Security Foundation
and the OpenJS Foundation have found that use similar
social tactics. One project received emails from accounts asking to
be designated as project maintainers despite having little prior involvement,
and two other projects saw very similar suspicious patterns. This

(26:32):
kind of social engineering is really effective because you don't
have to manipulate code. You just manipulate the person who
has their hands on the code. And it's only going
to get easier to do and harder to detect.

Speaker 3 (26:45):
Now we're at the point where with AI, like you
could be fake now and I have no idea if
you really exist or vice versa.

Speaker 2 (26:52):
Wait, are you suggesting that doing something like
this might be a little bit easier because somebody could
fake that they actually exist. Oh yeah, with a phone
conversation or a video conversation.

Speaker 3 (27:03):
Oh yeah, we're already seeing that from the ransomware actors.
It's easy for phone right, So you're already seeing them
fake people's voices. So people are getting phone calls from
like their CEO. The CEO goes on CNBC for two minutes,
they get their voice from CNBC. They plug it into
an AI voice library, and then you call and, like, hey,

(27:23):
it's Bob, I need you to do a million dollar transfer. Right,
So that kind of stuff, and now you see real
time video too. It's not perfect, but it's getting there.

Speaker 2 (27:33):
Yeah.

Speaker 3 (27:33):
The trick, by the way, if this happens to any
of your listeners, the trick is you can ask people
to move, touch things in the background, do three sixty
on the head. It's harder for them to do ears
for whatever reason, but they'll get there, right. So, like if
I asked you to take your glasses off, it'd be
very hard for the model, Like take your glasses off.

Speaker 2 (27:50):
By the way, hold on, for those of y'all listening
at home, I took my glasses off here, just to
double-check it.

Speaker 3 (27:58):
Oh, you kind of froze on me when you did that.
That's sketch, man. It's sketch AF, as my students say. Sorry,
I pick that up from my Stanford students. But you know, in

Speaker 2 (28:09):
the future, though, it is going to be easier to spoof
people's personalities and stuff like that. So these things
that you're suggesting right now, they work now. Are they
going to work in a year?

Speaker 3 (28:21):
So, I mean, the good thing about this is open source
developers have become much more paranoid, right. So people have
become much more paranoid about new people. And there's a
downside of that, right, that if you're trying to get
into open source, it's harder. There are now projects where
it's like, okay, great, let's meet up in person. If
somebody is willing only to communicate with you over email,
then you have to be kind of sketched out now.

(28:44):
So there have been some changes since this. I think
people have been more paranoid. There's been a bunch of
work on that. On the flip side of AI, traditional
code scanning tools, pre-AI code scanning tools, are not
extremely good at detecting this kind of malicious code, but
there is some hope that some of the newer AI-based
code scanning tools could do this kind of
stuff at scale. The flip side is, AI is

(29:06):
really good at writing code, so, you know, you do
not have to be SVR-level anymore to be able
to write a backdoor. That's good.

Speaker 2 (29:16):
That's probably true as well. Is open source too much
of a risk in the age of AI, and can
we protect ourselves from another hack like this? That's after
the break. So, this, and I want to get back

(29:42):
into kind of the play by play here, but a
lot of this hinges on open source. So and I
think one of the really kind of concerning things about
this entire thing that happened or almost happened is the
fact that it basically happened in broad daylight. Yes, and
it happened because this is open source. The thing about

(30:05):
open source, I think, is when you start to explain
it to somebody who's never heard of it. Are you
familiar with the galaxy brain meme?

Speaker 3 (30:13):
Yeah?

Speaker 2 (30:14):
Do you know what I'm talking about? Yeah? So I
feel like this is like that galaxy brain meme,
where at the very top, when you tell somebody about
open source, the response is, this is a terrible idea,
everybody can see the code. Yeah. And then you get
a little bit further down, it's, oh, this is a
great idea, everybody can see the code. And then they
hear about XZ Utils, when we get down to the bottom,
and it's, this is a terrible idea, everybody can see the code.

(30:36):
What's the true galaxy brain take on this for open source?

Speaker 3 (30:39):
I mean people go back and forth. So one of
the ideas is that if you can see all the code,
you can see all the bugs. Right. Is the idea
that because it's open source, that it should be more
secure than closed source because you can see the flaws, right.
I don't think that has empirically turned out to be true. Right,
And so I I think what I would say is

(31:03):
I'm a big proponent of open source. I think it's great.
I think it has a humongous economic benefit to the world.
The truth is, the entire kind of cloud computing
revolution we're all living through only exists because of open
source software. So that's an incredible thing. That's a wonderful thing.
Open source is great from an economic perspective. It is
great from an innovation perspective. We should not pretend that

(31:24):
it magically solves trust and security problems. And if you're
a company that's relied upon open source, you have an
ethical and moral obligation to deal with the security aspects
of it and to contribute back. And I do think
that is something that's gotten lost, is that people have
just kind of assumed somebody else is dealing with it,

(31:46):
and everybody assumed somebody else is doing the security work,
and that turns out not to be true.

Speaker 2 (31:51):
You know, I think that really gets to the core of
what a lot of this is. Because if somebody sees,
XZ Utils, there was a potential security flaw in that. Okay, well,
I don't care about that. What's that? Oh, well, you
know, it's involved with OpenSSH. Well, I
don't use that either. I don't have that app on
my phone. I don't know what you're talking about. And
in this weird way, I feel like the more and

(32:14):
more technology actually starts to become just magic, that things
just work. Yeah, we are less and less actually tech literate.
All the stuff that was science fiction even ten years ago,
two years ago, frankly, it's just normal now.

Speaker 3 (32:32):
Yeah.

Speaker 2 (32:32):
And so we're able to do so much with technology
just regular people things we just do with our phone
every day, that we've become really removed from the technology itself,
and so less and less of us, fewer and fewer
of us, actually know how to use a computer. Yeah,
and so this feels totally removed from us. It's like, oh,
this is some weird nerd shit. It's like, I don't
use that nerd program, it doesn't affect me.

Speaker 3 (32:55):
Yeah, no, you're totally right. I mean, I tell my
Stanford students, security is one of the best fields to
get into professionally, because it's the only part of computers
that gets worse every year. Everything else magically gets better. Man,
so you could find yourself in any other field being
made irrelevant, but if you get into security, you have
job security for life. Because every year I've been in it,

(33:16):
it's gotten worse. And one of the reasons is because
you say it's nerd shit. But even the nerds, the
normal median nerd, gets further and further away
from the truth, the reality of what's going on on computers.
So when I learned how to program, I learned assembly language, right,
I learned how to write like the lowest level languages.

(33:39):
And then you know, they stopped teaching assembly language unless
you took special classes, and you learn in, like, Python, right,
like a very high level language that you don't even
you know, you don't learn how to like do memory management.

Speaker 2 (33:52):
Right, I mean, Python. And just to bring
this down, like, Python, for a casual person, you can
look at it, you can kind of tell what's going on.
It basically looks like English. Yeah, assembly is letters and numbers,
right, right.

Speaker 3 (34:05):
But the nice thing about assembly is it's the truth
of the matter, right. It has a one-to-one
matching to what the processor itself is doing. And from
a security perspective, if you look at it, the
reality of what a security flaw is is seen in
the assembly.
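
As a small illustration of those layers, Python can show you the lower-level instructions hiding under a single readable line via its built-in dis module. What you get is Python bytecode rather than true processor assembly, but it makes the same point: the friendly line on top is not what the machine actually executes. Exact opcodes vary by Python version.

```python
import dis

def add(a, b):
    # One readable line of Python...
    return a + b

# ...and the stack-machine instructions it actually turns into.
# Prints opcodes such as LOAD_FAST, BINARY_OP (or BINARY_ADD on older
# versions), and RETURN_VALUE, depending on the interpreter version.
dis.dis(add)
```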

Speaker 2 (34:18):
Right.

Speaker 3 (34:19):
In Python, you get further, you get abstracted away, you
get further from the reality of what's actually going on
on the computer. Now, what you see, it's incredibly powerful.
It's incredibly cool, and so I'm not gonna crap on
it because I think it's an incredibly good thing for people.
But you look at like Claude three point seven code.
You know, this new Claude model, and you see people

(34:39):
on Twitter who don't know anything about computers and they're
able to program now because they can go into there
and they can say, build me software that does X.
And that is going to be terrible for security. It's
super cool for people's economic opportunities, because anybody
can become a programmer now. But man, are people
in security gonna love it, because now you don't need
to know anything about how computers work, and you're just

(35:01):
going to ask the AI system to build it for you.
And I see it with my students. Stanford students, like,
one of the top computer science programs in the world,
and you can graduate and not actually really understand how
operating systems work. I apologize to the Stanford Computer Science department, right,
but really like, you can have a totally productive career
in Silicon Valley and not really understand what's going on

(35:22):
three or four layers down. In fact, it's better for
you not to, right. It's better because you're at the
high level where you're much more productive, right. You're much
more productive having the AI do the work for you.
You're much more productive having GitHub Copilot help you rewrite stuff.
You're much more productive using all the cloud intermediation layers.
And so that's one of the reasons why security gets
worse every single year is that we add these layers

(35:45):
of abstraction that make things easier for people. And AI
is the ultimate abstraction layer, because now you can talk
to computers and plain English and have them do incredibly
complex things.

Speaker 2 (35:57):
The thing about this whole story, I mean, I'm thinking
about, you know, we're in a time right now where
anything bad happens or almost happens: Netflix documentary, Hulu documentary,
it's a true crime podcast at some point. I don't
see that happening with this. This is something that, as
you were saying, almost, it truly could have been catastrophic. Yeah,

(36:23):
but it's also kind of boring. You don't think?

Speaker 3 (36:27):
I could. I could sell ten episodes to Netflix
on this.

Speaker 2 (36:30):
If you can hire me as a producer, I'd love
to help. But you see what I'm saying, is it
takes a while to even explain what the heck we're
talking about. Yeah, and I think that comes back to
some of this. In the same way that this vulnerability
was introduced via social engineering, a lot of this is
social. I mean, a lot of your work, you probably
think about this. How do you get people to care

(36:52):
about something like this?

Speaker 3 (36:54):
I mean, so that's that's a challenge. That's one of
the biggest challenges. If you're like a chief information security officer,
one of your big jobs is getting the rest of
the company to care about security. CISOs, we have a
reputation of being the people who say no all the time.
So I was the CISO of Facebook, and I once
walked into a meeting with a bunch of other VPs,

(37:14):
and somebody literally said, like, oh shit, Stamos is here.
And I'm like, hey guys, I can hear you. I can hear you.
And they're like, no, no, it's not you. It's just like,
whenever you come, like, it's because you're telling us, like,
there's a coup in Turkey or something terrible. And like,
because I was just the bearer of bad news, right.
But this is what's a real challenge for my colleagues,
and it's a real challenge for us as a society.
People don't want to think that the systems that they

(37:39):
rely upon are fragile. And I think that's like a
real problem.

Speaker 2 (37:47):
What do we learn from this? What is it? Let
me just say, because I personally don't think,
just being out and talking to people, if I was
trying to, if I try to tell somebody, hey, yeah, man,
what do you think about the XZ Utils thing? Have you?

Speaker 3 (37:58):
You know, hey, buddy, what's up?

Speaker 2 (38:00):
Yeah? Has it changed anything about how? Yeah, has
that changed anything about how you go about your life?
People can tell me no, So I got to ask
somebody who's actually closer to this. Has this changed how
you approach things? Has this changed how the industry approaches things?
Has this changed how I mean the theory that you're
putting out is that this is a state actor? Has
this changed how national security is being looked at?

Speaker 3 (38:26):
So for companies that know what they're doing, it has
changed how they approach open source. For a handful of
really big, you know, like the Googles, the Metas, the Amazons,
the Microsofts, the really big tech companies that do a
lot of open source work, they are looking more carefully
at open source. For security companies that do this work,

(38:46):
we're investing in software and AI that can do this
work for us. But it has not changed anything massively, right.
We're still running Linux, we're still all pulling in fifty
thousand packages. We have these humongous dependency graphs. The
truth is, you can't just pivot all these things. Right.
It has made us more concerned about these problems. When
you talk to CISOs, my colleagues and I, we're all

(39:08):
more concerned. But we can't magically pivot off of the
infrastructure we have built over a decade. I do not
think we've dealt with the fact that if you get
on the subway in the morning and you look around,
most of the people on that train, in their pocket,
XZ Utils is in their pocket. Every single person in there,

(39:29):
hundreds of copies of their Social Security number is sitting
on servers that would have been backdoored by this attack.
That's how you can think of it, right, So that's
how close we came.

Speaker 2 (39:42):
Man. So just some closing thoughts here. Again, the reason
that most people don't know about what was almost the
biggest hack in the history of the Internet is because
this is really hard to describe to a non-technical audience.
I mean, when you say XZ Utils or Linux or OpenSSH,

(40:02):
people's eyes just roll into the back of their heads. But
we can't allow tech literacy to be a barrier to
understanding how the world works, and the truth is, even
beyond all the tech jargon, a lot of these things
are very human and they're not so hard to understand.
And so that's one of the things that we're really
trying to do here on Kill Switch, as we keep
doing these episodes, is to open it up so that

(40:24):
more people are able to feel like they're part of
the conversations that affect all of us. And that is
it for this one, for real. Thank y'all so much
for listening to Kill Switch. You can hit us up
at killswitch at kaleidoscope dot NYC if you got
any thoughts or if there's anything you want us to

(40:45):
cover in the future, and you can hit me at
dexdigi, that's d e x d i g i, on
Instagram or Bluesky if that's more your thing. And
if you like the episode, you know, take that phone
out of the pocket and leave us a review. It
helps people find the show, which in turn helps us
keep doing our thing. And this thing is hosted by
me, Dexter Thomas. It's produced by Sena Ozaki, Darl Luk Potts,

(41:09):
and Kate Osborne. Our theme song is by Kyle Murdoch,
who also mixes the show. From Kaleidoscope, our executive producers
are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne.
From iHeart, our executive producers are Katrina Norvell and Nikki Ettore.
That's it for this time. Catch you on the next one.
