
December 30, 2025 149 mins

Part One: Earlier this year a Border Patrol officer was killed in a shoot-out with people who have been described as members of a trans vegan AI death cult. But who are the Zizians, really? Robert sits down with David Gborie to trace their development, from part of the Bay Area Rationalist subculture to killers.

 

Part Two: Robert tells David Gborie about the early life of Ziz LaSota, a bright young girl from Alaska who came to the Bay Area with dreams of saving the cosmos or destroying it, all based on her obsession with Rationalist blogs and fanfic.

Sources: 

  1. https://medium.com/@sefashapiro/a-community-warning-about-ziz-76c100180509
  2. https://web.archive.org/web/20230201130318/https://sinceriously.fyi/rationalist-fleet/
  3. https://knowyourmeme.com/memes/infohazard
  4. https://web.archive.org/web/20230201130316/https://sinceriously.fyi/net-negative/
  5. Wayback Machine
  6. The Zizians
  7. Spectral Sight
  8. True Hero Contract
  9. Schelling Orders – Sinceriously
  10. Glossary – Sinceriously
  11.  https://web.archive.org/web/20230201130330/https://sinceriously.fyi/my-journey-to-the-dark-side/
  12. https://web.archive.org/web/20230201130302/https://sinceriously.fyi/glossary/#zentraidon

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Also media, Welcome back to Behind the Bastards. That's how this podcast would open if I was a game show host, but I'm not. Instead I'm a guy.

Speaker 2 (00:16):
Who spends... You would be good at it, though.

Speaker 1 (00:18):
I don't think I would be, Sophie.

Speaker 2 (00:20):
I do, but I think, but I'm like biased because
I think you'd be good at most things.

Speaker 1 (00:25):
No, my only marketable skill is spending thirty hours reading
the deranged writings of a quasi cult leader who was
somewhat involved in the murders of multiple people very recently,
largely because she read a piece of Harry Potter fan
fiction at the wrong time. Yes, we have a fun

(00:48):
one for you this week, and by a fun one,
we have a not at all fun one for you
this week. And to have just a terrible time with me, we are bringing on again the great David Gborie, co-host of My Momma Told Me with our friend of the pod, Langston Kerman. David, how you doing? Oh man.

Speaker 3 (01:10):
I cannot complain. How you doing? There's nothing going on
in the world.

Speaker 1 (01:15):
Oh yeah, yeah, I got up today and read that great new article by Francis Fukuyama: history is still stopped. So everything's good. We're done here.

Speaker 3 (01:27):
Haven't looked at any news yet purposefully, so I'm you know,
it could be awesome. It could be going great out there.

Speaker 1 (01:33):
It's great, it's great. The whole Trump administration got together
and said, psych it was all a bit man, just
an extended ad for the Apprentice season fifteen.

Speaker 3 (01:43):
You mean this country is not a business.

Speaker 1 (01:45):
No, they handed over the presidency to I don't know.
I don't know whoever you you personally at home think
would be a great president. I'm not going to jump
into that can of worms.

Speaker 4 (01:57):
Right now.

Speaker 1 (02:00):
On Ramonbrony Madron the president.

Speaker 3 (02:04):
That's a good one. That's a good one. That's better
than what we.

Speaker 1 (02:07):
Got, honestly, vastly superior to where we are.

Speaker 3 (02:10):
Of all the entertainers, I feel like, why don't we start giving athletes a shot at government? Yeah?

Speaker 1 (02:15):
I fuck it?

Speaker 4 (02:16):
Why not?

Speaker 1 (02:18):
You know, uh, fucking, uh, Kareem Abdul-Jabbar would be a, he could knock a presidency out of the park.
Come on, absolutely, Yes, we need a mystery novelist, slash
one of the great basketball stars of all time in

(02:38):
the White House.

Speaker 3 (02:39):
I just want a president who's good in the paint,
you know what I mean.

Speaker 1 (02:41):
That's right, that's right, Agatha Christie with a jump shot.

Speaker 3 (02:47):
Yeah, that's exactly what I think.

Speaker 1 (02:49):
That's what an amazing man Kareem would be.

Speaker 2 (02:52):
Such would be such a good choice.

Speaker 1 (02:55):
Yeah, bring it on.

Speaker 3 (02:56):
I think he's such a good man. He wouldn't do it.

Speaker 2 (02:58):
Yeah, exactly, he's way too moral, he's... I have a frog named after him.

Speaker 1 (03:03):
Yeah. Look, honestly, given where we are right now, I'd take fucking, uh, what's his name, Mark McGwire, like Jesus Christ, anybody. Like, honestly, anyone with a heartbeat. Oh man, fuck

(03:25):
it like, I'll take no. No, I'm not gonna take
any hockey players. No hockey players. We got enough people
with brain damage in the White House right now.

Speaker 3 (03:33):
That's probably the CTE shit, and we don't need somebody who's been hit that much.

Speaker 1 (03:37):
Yeah, yeah, yeah, yeah, you're probably right there. I mean, if we could go back in time and make Joe Louis the president, I think he could solve some fucking problems in Congress.

Speaker 3 (03:47):
You could still get it done.

Speaker 1 (03:54):
So this has been a fun digression, but I got to ask at the start of this about the story that is most relevant to the people we're talking about today, that I think most of our listeners will have heard. I'm curious if you've heard about it. Back on January twenty-first,
right as the Trump administration took power, a border patrol
agent was shot and killed along with another individual at

(04:16):
a traffic stop in Coventry, Vermont. Right, there were two
people in a car that was pulled over a border patrol.
One of those people drew a gun, there was a firefight,
one of the people in the car and the cop died. Right, Okay,
have you heard this story.

Speaker 3 (04:31):
I'm not at all.

Speaker 1 (04:33):
It's one of those things where it would have been a much bigger deal, immigration being the thing that it is in the United States, like the political hot issue that it is right now. Like, the Republicans have been desperately waiting for, like, a Border Patrol officer getting shot and wounded that they can use to justify a crackdown. But number one, this happened on the Canadian border, not

(04:54):
anyone's favorite, and one of the two people who drew their guns on the cop was an immigrant, but they were a German immigrant, and so none of this really, like, right, it was all like right on the edge of being super useful to them. But it's not like we weren't

Speaker 3 (05:14):
Dealing with our own right wing immigration propaganda at that time. Yeah.

Speaker 1 (05:19):
Yeah, it was just like, it was like the closest to being a perfect right-wing, like, false flag, like a Reichstag fire incident, but like just a little too weird.

Speaker 3 (05:29):
Yeah, you gotta throw some spice in there. You gotta
have like a Latin country. That's what they get excited about.

Speaker 1 (05:34):
Yeah, and obviously California border is where you want it,
you know.

Speaker 3 (05:37):
Yeah, definitely, definitely, even New Mexico could be.

Speaker 1 (05:40):
Yeah, or at least they need to have fentanyl in the car. In fact, they were not breaking any laws that anyone could prove at the time. They just looked kind of weird.

Speaker 3 (05:48):
Ok.

Speaker 1 (05:50):
They looked kind of weird, and they like had guns, but it was like they had like two handguns and like forty rounds and some old targets. They were like coming back from a shooting range, right? Like, not a lot of guns and ammo, in America terms, right, especially in Vermont terms. Right. So the other thing that was
weird about this is that the German immigrant who died

(06:13):
was a trans woman. So then again we get back to like, wow, there's a lot about this shooting that is like right on the edge of some issues that the right is really trying to use as like a fulcrum to like
push through some awful shit. And as more and more
information came out about the shooting, the weirder it seemed,
because there was a lot of initial talk, is this

(06:34):
like a terrorist attack where these, like, two Antifa types were, like, looking to murder a Border Patrol agent?
But no, that doesn't really make sense because like they
got pulled over, Like they can't have been planning this right, Like,
it didn't. It didn't really seem like that, and really
no one, no one could figure out why they had
opened fire. But as the days went on, more information

(06:55):
started coming out, not just about the two people who
were arrested in this, well, the one person who was arrested and the one person who died, but about a
group of people around the country that they were linked to.
And these other people were not all but mostly trans women.
They were mostly people who kind of identified as both

(07:16):
anarchists and members of the rationalist subculture, which we'll talk
about in a little bit, And they were all super
high achieving people in like the tech industry and like sciences. Right,
these are the people who had won like awards and
had advanced degrees. The person, the lady who died in

(07:37):
the shooting was a quant trader, So these are not
like the normal shoot it out with the cops types.

Speaker 4 (07:44):
Yes, this is a very niche This is a very
strange story.

Speaker 1 (07:48):
So people start being like, oh the fuck is happening.

Speaker 3 (07:51):
And there's a group of people who could not meet
each other without the invention of the internet, right, Well.

Speaker 1 (07:56):
That is... boy, David, do you know where this is always going, or at least starting. So like, a couple of days into this one, a friend of mine messaged me and was like, hey, you know that shooting in Vermont? And I was like, yeah, and he's like, my friend is like, you know, they're Zizians. And I was like, wait, what, what the fuck? Because I

(08:16):
had heard of these people. This is a weird little subculture. I'm like, you know, I study weird little Internet subcultures, in part because like some of them do turn out to do acts of terrorism later, so, weirdness. And I've been reporting on the rationalists, who
are not like a cult but who do some cult
adjacent things and I just kind of find annoying. And

(08:39):
I'd heard about this offshoot of the rationalists called the Zizians.
They were very weird. There were some like weird crime allegations.
A couple of them had been involved in a murder
in California a year earlier. But like, it was not
a group that I ever really expected to see blow
up in the media, and then suddenly they fucking did, right,

(08:59):
And they're called the Zizians. It's not a name they have for themselves. They don't consider themselves a cult. They don't all, like, live together. A group of them did live together, but like, these people are pretty geographically dispersed around the country. They're folks who met online arguing about and discussing rationalism and the ideas of a particular member

(09:22):
of that community who goes by the name Ziz. Right,
that's where this group came out of and the regular
media was not well equipped to understand what was going on.
And I want to run through a couple of representative
headlines that I came across just in like looking at
mainstream articles about what had happened. There's an article from

(09:43):
the Independent titled Inside the Zizians: how a cultish crew of radical vegans became linked to killings across the United States. They seemed like just another band of anarchist misfits, scraping on the fringes of Silicon Valley, until the deaths began. And then there's a KCRW article, Zizians: the vegan techie cult

(10:05):
tied to murders across the US. And then a Fox article, Trans vegan cult charged with six murders.

Speaker 3 (10:12):
There you go, that's Fox.

Speaker 1 (10:15):
Yes, none of these titles are very accurate. I guess the first one is like the closest, where like, these people are radical vegans and they are cultish, right, so I'll give the Independent that. Vegan techie cult is not really how I would describe them. Like, some of them were in the tech industry, but like

(10:37):
the degree to which they're in the tech industry is a lot weirder than that gets across. And they're not really a trans cult. They're like trans vegans, but the cult is not about being a trans vegan. That's just kind of how a lot of these people found each other.

Speaker 3 (10:52):
Oh, they just happened to be That was just the.

Speaker 1 (10:55):
Commonality. Veganism is tied to it. They just kind of all happen to be trans; that's not really, like, tied to it necessarily. So I would argue also that they're not terrorists, which a lot of people, and a number of the other articles, have called them. None of the killings that they were involved with, and they did kill people, were like terrorist killings. They're all much weirder than that,

(11:18):
but none of them are like, none of the killings
I have seen are for a clear political purpose, right,
which is kind of crucial for it to be terrorism.
The murders kind of evolved out of a much sillier reason,
and it's you know, there's one really good article about
them by a fella at Wired who spent a year
or so kind of studying these people, and that article

(11:41):
does a lot that's good, but it doesn't go into
as much detail about what I think is the real
underpinning of why this group of people got together and
convinced themselves it was okay to commit several murders. And
I think that that all comes down more than any
other single factor, to rationalism and to their belief in

(12:03):
this weird online cult. That's that's very much based on
like and like asking different sort of logic questions and
trying to like pin down the secret rules of the
universe by doing like game theory arguments on the internet
over blogs. Right, Like that's really how all of this
stuff started.

Speaker 3 (12:25):
Like someone named Mystery. Yeah, a lot of people in funny

Speaker 1 (12:28):
Hats. They do, actually. They're a little adjacent to this, and they come out of that period of time, right, where like pickup artist culture is also like forming.
They're one of this like generation of cults that starts
with a bunch of blogs and shit on the internet
in like two thousand and nine, right, And this this
is it's so weird because we use the term cult

(12:51):
and that's the easiest thing to call these people. But generally,
when our society is talking about a cult, we're talking
about, like, an individual. That individual brings in a
bunch of followers, gets them, isolates them from society, puts
them into an area where they are in complete control,
and then tries to utilize them for like a really

(13:13):
specific goal. There's like a way to kind of look
at the Zizians that way, But I think it would
be better to describe them as like cultish, right, Okay.
They use the tools of cult dynamics and that produces
some very cult like behavior, but there's also a lot

(13:33):
of differences between like how this group works and what
you'd call the traditional cult and including a lot of
these people are separate from each other and even don't
like each other, but because they've been inculcated in some
of the same beliefs through these kind of cult dynamics,
they make choices that lead them to like participate in
violence too.

Speaker 3 (13:53):
Where is their hub? Is it like a message board
type of situation? Like how is that? Yes?

Speaker 1 (13:58):
Yes, so I'm gonna I'm gonna have to go back
and forth to explain all of that.

Speaker 3 (14:03):
Also, the etymology of Zizian. Is it because, from...

Speaker 1 (14:08):
The Lids? No, no, Z-I-Z. The lady who was kind of the founder of this, the name that she takes for herself is Ziz, right.

Speaker 3 (14:18):
Okay, should have been the z Girls. That's much more appealing.

Speaker 1 (14:22):
These people are big in the news right now because of the several murders, and the right wing is trying, it wants to make it out like this is a, like, trans death cult, and this is more of like an internet AI nerd death cult.

Speaker 3 (14:40):
I guess that's better.

Speaker 1 (14:42):
It's just different. You know, you're right, it was just
a different thing. And I think it's important if like
you care about like cults because you think they're dangerous
and you're arguing that, like, hey, this cult seems really dangerous,
you should understand, like, what the cult is, right? Right. Like, if you misjudged the Scientologists and thought like these are obsessive fans of science fiction who are

(15:03):
committing murders over science fiction stories. It's like, no, no, they're committing murders because of something stupid. Yeah, much more. Okay,
So, I am going to explain to you what rationalism is, who Ziz is, where
they come from, and how they get radicalized to the

(15:25):
point where they are effectively at the hub of something
that is at least very adjacent to a cult. But
I want to talk a little bit about the difference
between like a cult and cult dynamics. Right, A cult
is fundamentally a toxic thing. It is bad, It always
harms people. There is no harmless cult, you know, it's

(15:47):
like rape. Like, there's no version of it that's good,
you know, like it is a fundamentally dangerous thing. Cult
dynamics and the tactics cult leaders use are not always
toxic or bad, and in fact, every single person listening
to this has enjoyed and had their life enriched by

(16:08):
the use of certain things that are on the spectrum
of cult dynamics.

Speaker 3 (16:12):
I was gonna say, it's a lot. It seems a
lot more like you have that at work, gives that
at work anywhere.

Speaker 1 (16:18):
Right. Yeah, anyway, that's a huge part of what makes a great fiction author who is able to, like, attract a cult following. You've ever had that experience? Like, a big thing in cults is the use and creation of new language. You get people using words that they don't use otherwise, and, like, phrases, and that is
both a way to bond people because like you know,

(16:38):
it helps you feel like you're part of this group
and it also isolates you from people. If you've ever
met people who are like hugely into you know, Dungeons
and Dragons or huge fans like Harry Potter or the
Lord of the Rings, like they have like things that
they say, like memes and shit that they share based
on those books, and like, that's less toxic, but

(16:59):
it's on the same spectrum, right? It's this: I am
a part of this group of people, and we use
these words that mean something to us that don't mean
things to other people, right, And.

Speaker 3 (17:08):
That's right, yes, yes, yeah, it's like a great, that's like a great way to bond. I think it's any group, right,
I mean, yeah, entertainers, your friend groups.

Speaker 4 (17:19):
Yeah, has in jokes, right, sports, Yeah, could kill people,
right exactly.

Speaker 1 (17:24):
Yes, yes, And like you've got you know, you and
you and your buddies that have been friends for years,
you have like you could there's like a word you
can say and everyone knows that you're referring to this thing
that happened six years ago, and you all like laugh
because you know it reminds you of something, you know,
because it's relevant to something happening. Then that's a little
healthy bit of cult dynamics at play, right, you know,

(17:46):
it's like a diet, you know, So there's a toolbox
here and we play with it in different organizations,
but churches play with it. And obviously a lot of
churches cross the line into cults, but there's also aspects
of for example, you know, there's churches that I know
I have seen people go to where like it's very
common everybody gets up and like hugs at a certain point,

(18:09):
and like people benefit from human contact. It makes them
feel nice. It can be like a very healthy thing
I've gone to. I used to go to like Burning
Man regionals, and like you would like start at this
greeter station where like a bunch of people would come
up and they'd offer you like food and drinks, and
you know, people would hug each other and it was
this like changes your mind state from where you were

(18:31):
in before, kind of opens you up.

Speaker 3 (18:34):
Those, is that like to qualify for state? Yeah yeah, yeah, yeah.

Speaker 1 (18:38):
So that we could get to go. It was just
like these local little events in Texas, right like a
thousand people in the desert trying to forget that we
live in Texas. Okay, we're not in a desert. But it was
very like it's it's it was like a really valuable
part of like my youth because it was the first
time I ever started to like feel comfortable in my
own skin. But also that's on the spectrum of love

(18:59):
bombing, which is the thing cults do, where they like surround you with people who, like, you know, will touch you and hold you
and tell you they love you, and like, you know,
part of what brings you into the cult is the
cult leader can take that away at any moment in time. Right.
It's the kind of thing where if it's not something
where no, this is something we do for five minutes
at the end of every church service, Right, you can

(19:21):
very easily turn this into something deeply dangerous and poisonous.

Speaker 3 (19:25):
Right.

Speaker 1 (19:25):
But also a lot of people just kind of play
around a little bit with pieces of that, a piece of
the cult dynamics, just a little bit, just a little bit.
Any good musician, any really great performer's fucking with some
cult dynamics, right, I.

Speaker 3 (19:39):
Was gonna say, I mean, I've been to like so
many different concerts of like weird niche stuff where you're like,
maybe the Disco Biscuits is a cult, I don't know.

Speaker 1 (19:49):
Yeah, I mean, like I've been to some Childish Gambino
concerts where it's like, oh, yeah, he's doing he's a
little bit of a cult leader, you know, like just
ten percent, right.

Speaker 3 (19:58):
Yeah, I mean, what are you gonna do with all that charisma you got? You gotta put it somewhere.

Speaker 1 (20:02):
Yeah, yeah, so these are I think that it's important
for people to to understand both that, like the tactics
and dynamics that make up a cult have versions of
them that are not unhealthy. But I also think it's
important for people to understand cults come out of subcultures. Right,

(20:25):
this is very close to one hundred percent of the time.
Cults always arise out of subcultural movements that are not
in and of themselves cults. For example, in the nineteen
thirties through like the fifties sixties, you have the emergence
of what's called the self help movement, you know, and
this is all of these different books on, like, how to
persuade people, how to you know, win friends and influence people,

(20:48):
you know, how to like make but also stuff like
alcoholics anonymous, you know, how to like improve yourself by
getting off drugs, getting off alcohol. All these are pieces
of the self improvement movement. Right, that's a subculture. There
are people who travel around, you get obsessed to go
to all of these different things, and they'll and they
get a lot of benefit. You know, people will show
up at these seminars where there's hundreds of other people

(21:10):
and a bunch of people will like hug them and
they feel like they're part of this community and they're
making their lives better. And oftentimes, especially like once we
get to like the sixties seventies, these different sort of
guru types are saying that, like, you know, this is
how we're going to save the world if we can
get everybody doing you know this, this yoga routine or
whatever that I find together to fix everything.

Speaker 3 (21:31):
Who's that guy who had the Game?

Speaker 1 (21:33):
Oh god? Yes, yeah, yeah, yeah, they had like.

Speaker 3 (21:37):
They had to viciously confront each other.

Speaker 1 (21:39):
Yes, that, we've covered them. That is Synanon. Yes, yes, that's what I'm talking about. That's what I'm talking about. And you have this broader subculture of self-help, and a cult, Synanon, comes out of it, you know, and
I get it.

Speaker 3 (21:52):
It's like, the subculture, it's already, it's intimate. You feel closer to those people than anybody else. It definitely feels right.

Speaker 1 (21:59):
For sure, and Scientology is a cult that comes out of the exact same subculture. We talked last week, or the week before, two weeks ago, about Tony Alamo, who was an incredibly abusive pedophile Christian cult leader. He comes out of, along with a couple other guys we've talked about, the Jesus Freak movement, which is a Christian subculture that arises as

(22:21):
a reaction to the hippie movement. It's kind of the
countervailing force to the hippie movements. You got these hippies,
and you know, these Christians who are like really scared
of this kind of like weird left wing movement, and
so they start kind of doing like a Christian hippie
movement almost, right, And some of these people just start
weird churches that sing annoying songs, and some of these

(22:43):
people start hideously dangerous cults. You have the subculture, and
you have cults that come out of it, right, And
the same thing is is true in every single period
of time, right, Cults form out of subcultures, you know.
And part of this is because people who a lot
of people who find themselves most drawn to subcultures, right,

(23:04):
tend to be people who feel like they're missing something
in the outside world, right, you know, not everybody, but the people who get most into it. And so, does that mean...

Speaker 3 (23:14):
Like, so maybe, I'm just curious, like, more broader cultural waves have never led... Like, the Swifties would not be a cult? No, there's most likely
not going to be an offshoot of the Swifties that
becomes a cult because it's so broad, it has to
have already been kind of a smaller subset. That's interesting.

Speaker 1 (23:33):
Well yeah, and I think but but that said, there
have been cults that have started out of like popular
entertainers and musicians, like you know, you could we could
talk about Corey Feldman's weird house full of young women
dressed as angels. Right, So yeah, you've got as a

(23:57):
general rule, like, music is full of subcultures, like punk, right, but there have definitely also been some, like, punk communities, individual little chunks of punk, that have gone in, like, culty directions, right, if you don't, like, yeah, yes, yeah,

(24:20):
so there are cults that come out of the subculture.
This is the way cults work. And I really just
I don't think I don't think there's very good education
on what cults are, where they come from, or how
they work, because all of the people who run this
country have like a lot of cult leader DNA in them, you know, being

Speaker 3 (24:42):
Run currently by someone who is seen as a magic man.

Speaker 1 (24:46):
Yes, exactly, exactly. I think there's a lot of vested
interests in not explaining what a cult is and where
they come from, and so I want to. I think
it's important to understand subcultures birth cults, and also cult
leaders are drawn to subcultures when they're trying to figure
out how to make their cult because a subculture, you know,

(25:09):
most of the people in are just gonna be like
normal people who are just kind of into this thing.
But there will always be a lot of people who
are like this is the only place I feel like
I belong. I feel very isolated. This this is like
the center of my being right right, And so it's
just it's like a good place to recruit. You know,
those are the kind of people you want to reach out to if you're a cult leader. You know, I'm not saying, like, again,

(25:30):
I'm not saying subcultures are bad. I'm saying that, like
some chunk of people in subcultures are ready to be
in a cult, you know.

Speaker 3 (25:37):
Yeah, yeah, I think if I, like, reflect on my
own personal life. Yeah, you meet a lot of guys
who are just like I'll die for the skate park
or whatever thing.

Speaker 1 (25:46):
Yeah, or like the Star Wars fans who were sending death threats to Jake Lloyd after The Phantom Menace, where it's like, well, you guys are crazy. That is insane, you know. He's like, hey, right, this is a movie he acted in. He didn't write it. Like, what are you doing? You know, whatever. So, and

(26:10):
again that's kind of a good point, like Star Wars
fans aren't a cult, but you can also see some
of, like, the toxic things cults do erupt from time to time, and from, like, video game fans, right, people who are really into a certain video game. It's not
a cult, but also periodically groups of those fans will
act in ways that are violent and crazy, and it's

(26:31):
because of some of these same factors going on, Right, I.

Speaker 3 (26:34):
Think people forget fanish short for fanatics.

Speaker 1 (26:36):
Exactly exactly right. And it's it's like, you know, the
events that I went to very consciously played with cult dynamics.
You know, after you got out of that like greeting
station thing where like all these people were kind of
like love bombing you for like five minutes, there was
like a big bar and it had like a sign over the buffet that said Not a Religion, Do Not Worship,

(26:56):
and it was this kind of people talk about like
this is like we are playing with the ingredients of
a cult. We're not trying to actually make one. So
you need to constantly remind people of like what we're
doing and why it affects their brain that way. And
in my case, it was like because I was at
like a low point in my life then like this
was when I was really it was twenty, I was
not I had no kind of drive in life. I

(27:19):
was honestly dealing with a lot of like suicidal ideation.
This is the point in which I would have been
vulnerable to a cult, and it I did get acted
a little bit like a vaccine, like I got a
little dose of the drug.

Speaker 3 (27:31):
Immunity, exactly, like hey, I know what that is.

Speaker 1 (27:36):
I know what's going on there. So anyway, I needed
to get into this because the Zizians, this thing that
I think is either a full-on cult or at least cultish, right, that is responsible for
this series of murders that are currently dominating the news
and being blamed on like a trans vegan death cult
or whatever. They come out of a subculture that grows

(27:59):
out of the early aughts Internet known as the Rationalists.
The Rationalists started out as a group in the early
aughts on the comments sections of two blogs. One was
called LessWrong and one was called Overcoming Bias. LessWrong was started by a dude named Eliezer Yudkowsky. I've talked about Eliezer on the show before. He sucks.

(28:23):
He's I think he's a bad person. He's not a
cult leader, but again he's playing with some of these
cult dynamics, and he plays with them in a way
that I think is very reckless, right and ultimately leads
to some serious issues. Now, Eliezer's whole thing is he
considers himself the number one world expert on AI risk

(28:46):
and ethics. Now you might think from that, oh, so
he's like, he's like, making AIs, he's like working for
one of these companies that's involved in like coding and stuff.
Absolutely not, no, no quarterback.

Speaker 5 (29:02):
No.

Speaker 1 (29:02):
He writes long articles about what he thinks AI would
do and what would make it dangerous that are based
almost entirely off of short stories he read in the nineteen nineties. This guy, it's so much, it's such Internet. And like, I'm not a fan of, like, the

(29:23):
quote unquote real AI. But Yudkowsky is not even one
of these guys who's like, no, I'm like making a
machine that you talk to like.

Speaker 3 (29:30):
I have no credentials, I just have an opinion.

Speaker 1 (29:33):
Yeah, I find I hate this guy so much. Speaking of things I hate, we're now going to ads. We're back. So, Yudkowsky, this AI risk and ethics guy, starts

(29:54):
this blog in order to explore a series of thought
experiments based in game theory. And, his, I am annoyed by game theory. It's, on some level, like, man, I know that there's, like, valid activity there, but, like, it's all just always so stupid and annoying to me anyway. Uh,

(30:16):
a bunch of thought experiments based in game theory, with
the goal of teaching himself and others to think more
logically and effectively about the major problems of the world.
His motto for the movement and himself is winning, the rationalist wins.

Speaker 3 (30:32):
Wow.

Speaker 1 (30:33):
Yeah, yeah, that's where she got it. Yeah, that's where
she picked it up. Yeah, they're tied in with biohacking, right,
this is kind of starting to be a thing at
the time, and brain hacking and the whole like self
optimization movement that feeds into a lot of, like, right-wing influencer space today. Yudkowsky is all about optimizing your

(30:53):
brain and your responses in order to allow you to
accomplish things that are not possible for other people who haven't done that. And there's a messianic air
to this too, which is he believes that only by
doing this, by by spreading rationalist principles in order to
quote, raise the sanity waterline, that's how he describes it,

(31:17):
that's going to make it possible for us to save
the world from the evil AI that will be born if enough of us don't spend time reading blogs. This is, it's awesome, this is, this is peak, this is the good stuff. Yudkowsky and his followers see

(31:38):
themselves as something unique and special, and again there's often
a messianic air to this. Right, we are literally the
ones who can save the world from evil AI. Nobody
else is thinking about this or is even capable of
thinking about this, because they're too logical.

Speaker 3 (31:51):
He holds himself as kind of like a deity, deifies himself?

Speaker 1 (31:55):
On top of this, he doesn't really deify himself, but
he also does talk about himself like in a way
that is clearly, other people aren't capable of
understanding all of the things that he's capable of understanding. Right, Okay,
so there is a little bit it's more like superheroification,

(32:15):
but it's, it's a lot. You know what this is closest to with these people? Not all of them would argue with me about this, but I've read enough of their papers and enough Dianetics to know that, like, this is new Dianetics. Like, this is church.

Speaker 6 (32:30):
The church.

Speaker 1 (32:30):
The Church of Scientology, yes. Now, the Church of Scientology stuff has more occult and weird, like, magic stuff in it. But this is all about: there are activities and exercises
you go through that will rid your body of like
bad ingrained responses, and that will make you a fundamentally
more functional person.

Speaker 3 (32:51):
Okay, so the retraining of yourself in order exactly exactly. Okay,
huge deal.

Speaker 1 (32:56):
And also a lot of these guys wind up like
referring to the different, like, techniques that he teaches as tech, which is exactly what Scientologists call it. Like, there's some shit I found where it's like, this could have come right out of a Scientology pamphlet.
Do you guys not realize what you're doing? I think
they do. Actually, so he's he's you know, in the

(33:18):
process of inventing this kind of new mental science that
verges on superpowers. And it's one of those things. People
don't tend to see these people as crazy if you
just sort of like read their arguments a little. It's
like them going over old thought experiments and being like,
so the most rational way to behave in this situation
is this, for this reason. You have to really

(33:41):
like dig deep into their conclusions to see how kind
of nutty a lot of this is. Now again, I compared this to Scientology; Yudkowsky isn't a high-control guy
like Hubbard. He's never going to make a bunch of
people live on a flotilla of boats in the ocean
with him. You know, he's got like there's definitely like

(34:02):
some allegations of bad treatment of like some of the
women around him, and like he has like a Bay
Area set that hang with him. I don't think he's
like a cult leader. You know, you could say he's on...

Speaker 3 (34:13):
The drawing people to him physically, or this is also
all physically.

Speaker 1 (34:16):
I mean a lot of people move to the Bay
Area to be closer to the rationalist scene.

Speaker 3 (34:20):
Although again, well I'm a Bay Area guy.

Speaker 1 (34:24):
San fran is this this is a San Francisco thing
because all of these are tech people.

Speaker 3 (34:29):
Oh okay, so this is like, yes, I wonder what
neighborhood feels like a.

Speaker 1 (34:34):
San France and Oakland. You can look it up. People
people have found his house online, right, like, like it's
it is known where he lives. I'm not saying that
for any like I don't harass anybody. I just like
it's it's not a secret, like what part of the
town this guy lives in. I just didn't think to
look it up, but like, yeah, this is like a Bay Area tech industry subculture, right. Okay. So

(34:58):
the other difference between this and something like Scientology is that it's not just Eliezer laying down the law. Eliezer writes a lot of blog posts, but he lets
other people write blog posts too, and they all debate
about them in the comments. And so the kind of
religious canon of rationalism is not a one guy thing.
It's come up with by this community. And so if

(35:18):
you're some random kid in bumfuck Alaska and you find
these people and start talking with them online, you can
like wind up feeling like you're having an impact on
the development of this new thought science.

Speaker 3 (35:30):
You know.

Speaker 1 (35:31):
Yeah, that's amazing. Very, very powerful.

Speaker 3 (35:35):
Yes.

Speaker 1 (35:37):
Now, the danger with this is that like that, all
of this is this Internet community that is incredibly like
insular and spends way too much time talking to each
other and way too much time developing in group terms
to talk to each other. And Internet communities have a
tendency to poison the minds of everyone inside of them.
For example, Twitter, the reality is that X, the

(36:03):
everything app... I just watched a video of a man killing himself while launching a shitcoin, the everything app.

Speaker 2 (36:19):
A quick, a quick Google job indicates it's Berkeley.

Speaker 3 (36:25):
Yeah, that makes the most sense to me.

Speaker 1 (36:28):
Geographically, a lot of these people wind up living on boats,
and like, in Oakland, there's the Oakland Harbor. Boat culture
is a thing.

Speaker 3 (36:36):
A group of people moved to boats?

Speaker 1 (36:39):
No, absolutely not.

Speaker 3 (36:50):
It feels like.

Speaker 1 (36:53):
It's it's here's the thing. Boats are a bad place
to live.

Speaker 3 (36:57):
It's it's for fun.

Speaker 1 (37:00):
It is like boats and planes are both constant monuments
to hubris. But a plane its goal is to be
in the air just as long as it needs and
then you get it back on the ground where it belongs.
A boat's always mocking God in the sea, Yes.

Speaker 3 (37:16):
A lot of times. Just a harbor, like a houseboat.
That's where your dad goes after the divorce.

Speaker 1 (37:22):
Right, right, I do. One day I'll live on a houseboat.
It's going to be falling apart. It's going to be just
a horrible, horrible place to live. Dang, I can't wait.
That's the dream, David, that's my beautiful dream. I'm going to get into making bullets, making my own bullets. Really just becoming an alcoholic, like, yeah, like not just like

(37:44):
half-assing it, like putting, like trying to become the Babe Ruth of drinking nothing but Cutty Sark scotch.

Speaker 3 (37:53):
If you want to be like a piss-the-bed alcoholic,
a houseboat is the place.

Speaker 1 (37:57):
Yeah, yeah, that's right, that's right. Ah the life. I
want to be like that guy from Jaws, Quint.

Speaker 3 (38:05):
Scurvy.

Speaker 1 (38:06):
Yes, that's exactly getting scurvy, destroying my liver, eventually getting
eaten by a great white shark because I'm too drunk
to work my boat.

Speaker 3 (38:15):
Ah.

Speaker 1 (38:16):
That's it. That's the way to go with Yeah. So anyway,
these internet communities like the rationalists, even when they start
from a reasonable place. Because of how internet stuff works.
One of the things about internet communities is that when
people are like really extreme and like pose the most

(38:38):
sort of extreme and out there version of something that
gets attention, people talk about it. People get angry at
each other. But also like that kind of attention encourages
other people to get increasingly extreme and weird, and there's
just kind of, as a result, a derangement. I think internet
communities should never last more than a couple of years
because everyone gets crazy. You know, like it's bad for you.

(39:02):
I say this as someone who was raised on these, right.
it's bad for you, And like it's bad for you
in part because when people get really into this, this
becomes the only thing. Like especially a lot of these
like kids in isolated places who are getting obsessed with rationalism.
All their reading is these rationalist blogs. All they're talking
to is other rationalists on the internet. And in San Francisco,

(39:25):
all these guys are hanging out all of the time
and talking about their ideas, and this is bad for
them for the same reason that like it was bad
for all of the nobles in France that moved to Versailles, right,
like they all lived together and they went crazy. Human
beings need regular contact with human beings they don't know.
The most lucid and wisest people are always always the

(39:47):
people who spend the most time connecting to other people
who know things that they don't know. This is an
immutable fact of life, that this is just how existing works.
Like if you think I'm wrong, please consider that you
are wrong, and go find a stranger under a bridge,
you know.

Speaker 3 (40:06):
Just saying, they know some stuff you don't know.

Speaker 1 (40:08):
They will know some shit. They might have some powders
you haven't tried.

Speaker 3 (40:14):
Oh yeah, pills and...

Speaker 1 (40:17):
Under the bridge.

Speaker 3 (40:18):
Yeah, that's, that's the echo chamber you want to be a part of.

Speaker 1 (40:23):
Yeah, exactly, exactly. So the issue is that Yudkowsky starts
postulating on his blog various rules of life based on
these thought experiments. A lot of them are like older
thought experiments that like different intellectuals, physicists, psychiatrists, psychologists, what
I had to come up with things like the sixties
and stuff, right, And he starts taking them and coming

(40:44):
up with like corollaries or alternate versions of them, and
like trying to solve some of these thought problems with
his friends. Right, The thought experiments are most of what's
happening here is they're mixing these kind of nineteenth and
twentieth century philosophical concepts. The big one is utilitaryism. That's
like a huge thing for them, is the concept of
like the ethics meaning doing the greatest good for the

(41:06):
greatest number of people, right, and that ties into the
fact that these people are all obsessed with the singularity.
The singularity for them is the concept that we are
on the verge of developing an all powerful AI that
will instantly gain intelligence and gain a tremendous amount of power. Right,

(41:28):
it will basically be a god. And the positive side
of this is it it'll solve all of our problems, right,
you know, it will literally build heaven for us. You
know when the singularity comes. The downside of it is
it might be an evil god that creates hell. Right.
So the rationalists are all using a lot of these
thought experiments, and like their utilitarianism becomes heavily based around

(41:51):
how do we do the greatest good, by which
I mean influencing this AI to be as good as possible.
So that's humanity's actual end goal.

Speaker 3 (42:00):
Right. They actively, because you said the leader was not, are these people now actively working within AI, or are they just...

Speaker 1 (42:07):
But a bunch of them have always been actually working
in AI. Yudkowsky would say, no, I work in AI.
He's got a think tank that's dedicated to like AI,
ethical AI. It's worth noting that most of the people
in this movement, including Yudkowsky, got, once, like, AI became an actual... Like, I don't want to say these are actual intelligences, because I don't think they are.

(42:28):
But like, once ChatGPT comes out and this becomes like a huge, people start to believe there's a shitload of money in here, a lot of these businesses, all
of these guys or nearly all of them get kicked
to the curb, right because none of these none of
these companies really care about ethical AI, you know, like
they don't give a shit about what these guys have
to say. And Yudkowsky now is, he's like
very angry at a lot of these AI companies because

(42:50):
he thinks they're very recklessly like making the god that
will destroy us instead of like doing this carefully to
make sure that AI is and evil anyway. But a
lot of these people are in an adjacent to different
chunks of the AI industry. Right, they're not all working
on, like, LLMs. And in fact, there are a number

(43:12):
of scientists who are in the AI space, who think AI is possible, who think that the method that, like, OpenAI is using, LLMs, cannot make an intelligence, that
that's not how you're ever going to do it. If
it's possible, they have other theories about it. I don't
need to get into it further than that. But these
are like a bunch of different people. Some of them

(43:32):
are still involved with like the mainstream AI industry, some
of them have been very much pushed to the side.
So all of it starts again with these fairly normal
game theory questions, but it all gets progressively stranger as
people obsess over coming up with like the weirdest and
most unique take in part to get like clout online, right,

(43:54):
and all of these crazy Yeah, I'll give you an example.

Speaker 3 (43:58):
Right.

Speaker 1 (43:58):
So much of rationalist discourse among the Yudkowsky people is focused on what they call decision, or what's called decision theory.

Speaker 3 (44:07):
Right.

Speaker 1 (44:08):
This is drawn from a thought experiment called Newcomb's paradox,
which was created by a theoretical physicist in the nineteen sixties. Hey,
just to make a quick correction here, I was a
little bit glib. Decision theory isn't drawn from Newcomb's paradox,
nor does it start with Yudkowsky. But the stuff that
we're talking about, like how decision theory kind of comes
to be seen in the rationalist community, a lot of

(44:30):
that comes out of Newcomb's paradox. It's a much older thing, you know, than the Internet; it goes back centuries, right, people have talked about decision theory for a long time. Sorry,
I was imprecise. I am going to read how the
Newcomb's paradox is originally laid out. Imagine a super intelligent
entity known as Omega, and suppose you are confident in
its ability to predict your choices. Maybe Omega is an

(44:52):
alien from a planet that's much more technically advanced than ours.
You know that Omega has often correctly predicted your choices
in the past, has never made an incorrect prediction about
your choices. And you also know that Omega has correctly
predicted the choices of other people, many of whom are
similar to you. In the particular situation about to be described,
there are two boxes A and B. Box A is

(45:15):
see-through and contains one thousand dollars. Box B is opaque and contains either zero dollars or a million dollars.
You may take both boxes or only take box B.
Omega decides how much money to put into box B.
If Omega believes that you will take both boxes, then
it will put zero dollars in box B. If Omega

(45:35):
believes that you will take only box B, then it will put a million dollars in box B. Omega makes its prediction and puts
the money in box B, either zero or a million dollars.
It presents the boxes to you and flies away. Omega
does not tell you its prediction, and you do not
see how much money Omega put in box B. What

(45:56):
do you do now? I think that's stupid. I think
it's a stupid question, and I don't really think it's
very useful.
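For anyone who wants to see the arithmetic the rationalists end up arguing over, here is a minimal sketch, not from the episode, of the expected payouts for the two choices in the setup Robert just read, assuming only that Omega predicts your choice correctly with some probability p. The function name and the probability values are illustrative, not anything the episode or the community specifies.

```python
# Hypothetical worked example (not from the episode): expected payouts in
# Newcomb's paradox as read above. Box A (see-through) holds $1,000; box B
# (opaque) holds $1,000,000 only if Omega predicted you would take box B alone.
# Assumption: Omega's prediction is correct with probability p.

def expected_value(choice: str, p: float) -> float:
    """Expected payout for 'one-box' (take only B) or 'two-box' (take both)."""
    if choice == "one-box":
        # With probability p, Omega predicted one-boxing and filled box B.
        return p * 1_000_000
    if choice == "two-box":
        # With probability p, Omega predicted two-boxing and left B empty,
        # so you only get box A; otherwise you get box A plus a full box B.
        return p * 1_000 + (1 - p) * 1_001_000
    raise ValueError("choice must be 'one-box' or 'two-box'")

for p in (0.5, 0.9, 0.99):
    print(p, expected_value("one-box", p), expected_value("two-box", p))
```

Under this naive expected-value reading, taking only box B pulls ahead once the predictor is right more than roughly 50.05 percent of the time, which is the intuition behind the "become the kind of person who one-boxes" conclusion the rationalists reach below.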

Speaker 3 (46:07):
I don't see. There's so many other factors.

Speaker 1 (46:10):
Yeah, I don't know, Yeah, I mean, among other things.
Part of the issue here is that, like, well, the
decision has already been made, right.

Speaker 3 (46:16):
Yeah, that's the point you have. No, it doesn't matter
what you do. There's no autonomy in that.

Speaker 1 (46:21):
Right, Well, you and I would think that because you
and I are normal people who I think, among other things,
probably like grew up like cooking food and like filling
up our cars with gas and not having like our
parents do all of that because they're crazy rich people
who live in the Bay and send you to, like, Stanford. Yeah,

(46:43):
we had, like, problems in our lives and stuff, you know,
physical bullies, normal Like I don't want to like shit
on people who are into this, because this is also harmless, right. And I'm also, I'm not shitting on Newcomb. This is a thing a guy comes up with in the sixties, and it's like a thing you talk about at, like, parties and shit among, like, other weird intellectuals. Right,

(47:03):
you pose it, you sit around drinking, you talk about it.
There's nothing bad about this, right. However, when people are
talking about this online, there's no end to the discussion.
So people just keep coming up with more and more
arcane arguments for what the best thing to do here is.
And it starts to.

Speaker 3 (47:20):
I can see how that spins out of control pretty...

Speaker 1 (47:22):
Much exactly, and the rationalists discuss this NonStop, and they
come to a conclusion about how to best deal with
this situation. Here's how it goes. The only way to
beat Omega is to make yourself the kind of person
in the past who would only choose box B, so
that Omega, who is perfect at predicting, would make the

(47:45):
prediction and put a million dollars in box B based
on your past behavior. In other words, the decisions that
you would need to make in order to win this
are timeless decisions, right. You have to become, in the past, a person who would... Now, again.

Speaker 4 (48:05):
That's what they came up with. That's what they all came up with. That's the supreme answer. These are the smartest people in the world, David. These are the geniuses.

Speaker 1 (48:13):
Soribed building the future.

Speaker 3 (48:15):
Oh boy, Yeah.

Speaker 1 (48:21):
It's so funny trying to like every time because I've
read I've spent so many hours reading this, and you
do kind of sometimes get into the like, Okay, I
get the logic there. And that's why it's so useful to just, like, sit down with another human being and be like, yeah, this is insane, this is, this is all, this is all dumb.

Speaker 3 (48:41):
At the cocktail party.

Speaker 1 (48:43):
So they conclude, and by which I mean largely Yudkowsky concludes, that the decision you have to make in
order to win this game is what's called a timeless decision,
and this leads him to create one of his most
brilliant inventions, timeless decision theory, And I'm going to quote
from an article in Wired. Timeless decision theory asserts that

(49:06):
in making a decision, a person should not consider just
the outcome of that specific choice, but also their own
underlying patterns of reasoning and those of their past and
future selves, not least because these patterns might one day
be anticipated by an omniscient adversarial AI.

Speaker 3 (49:22):
Oh no, that's.

Speaker 1 (49:26):
Motherfucker. Have you ever had a problem?

Speaker 7 (49:29):
Have you ever really? Have you ever dealt with anything?
What are you talking about? Like you make every decision?

Speaker 1 (49:41):
Honestly, again, I can't believe I'm saying this, not even when I was in high school. Like, go play football, make a cabinet, you know, like, learn to change your oil, go do something.

Speaker 3 (49:54):
There's a lot of assholes who use this term. But
you gotta go touch grass man.

Speaker 5 (49:57):
You gotta touch grass, man. It's like, that's, if you're talking about this kind of shit. And again, I know you're all wondering, we started this by talking about a Border Patrol agent being shot. All of this directly leads
to that man's death.

Speaker 3 (50:10):
We have covered a lot of ground. This is I'm excited.
It's, I forgot, I didn't forget, there was also gonna be murder.

Speaker 1 (50:17):
Yeah, there sure is. So, at least, Yudkowsky describes this as timeless decision theory, and once this
comes into the community, it creates a kind of logical
force that immediately starts destroying people's brains. Again, all of
these people are obsessed with the imminent coming omniscient godlike AI. Right,

(50:38):
and so do they have a time limit on it?
Do they have like a do they have a like
Is there is there any timing on it? Or is
it just kind of like again, man, it's the rapture.
It's the rapture. Okay, it's literally the tech guy rapture.
So any day it's coming, any day, you know.

Speaker 3 (50:53):
You could be a monsters already. Yeah.

Speaker 1 (50:56):
Yeah. So these guys are all obsessed that this godlike AI is coming, and like, for them, the Omega in that thought experiment isn't like an alien, it's a stand-in for the god AI. And one conclusion that
eventually results from all of these discussions, and this is a conclusion a lot of people come to, is that if, in these kinds of situations,

(51:20):
like the decisions that you make, you have to consider
like your past and your future selves. Then one logical
leap from this is if you are ever confronted or
threatened in a fight, you can never back down right
and in fact, you need to immediately escalate to using the maximum force possible. And if you commit, if
you commit now to doing that, in the future, you

(51:43):
probably won't ever have to defend yourself, because it's a timeless decision. Everyone will, like, that will impact how everyone treats you, and they won't want to start anything with you if you'll immediately try to murder anyone who fights you.

Speaker 3 (51:58):
To be this guy. But I think this is why
people need to get beat up some times.

Speaker 1 (52:00):
Yeah, yeah, and again that is that is kind of
a fringe conclusion among the rationalists. Most of them don't
jump to that, but like the people who wind up
doing the murders we're talking about, they are among
the rationalists who come to that.

Speaker 8 (52:16):
Okay, because yeah, okay, that makes sense. This is uh,
this is this is a that's so funny. Huh oh no,
because like this whole time.

Speaker 3 (52:29):
I've really been only thinking about it in theory, not practical application, because it's so insane.

Speaker 1 (52:35):
But oh no, no, no, this, this goes bad places, right.

Speaker 3 (52:39):
Oh no.

Speaker 1 (52:40):
This kind of thinking also leads, through a very twisty-turny process, to something called Roko's Basilisk, which, among other things, is directly responsible for Elon Musk and Grimes meeting, because they are super into this shit. Oh really? Oh really. So the gist is, a member of the LessWrong community, a guy who goes by the name

(53:03):
Roko, R-O-K-O, posts about this idea that occurred to him. Right,
this inevitable, superintelligent AI, right, would obviously understand timeless decision theory, and since its existence is all-important, right, the most logical thing for it to do post-singularity
would be to create a hell to imprison all of

(53:25):
the people and torture all of the people who had
tried to stop it from being created. Right, Because then
anyone who like thought really seriously about who was in
a position to help make the AI would obviously think
about this and then would know, I have to devote
myself entirely to making this AI otherwise it's going to
torture me forever.

Speaker 3 (53:46):
Right yeah, yeah, makes tell us now I have right
because it's nuts, but it's nice.

Speaker 1 (53:53):
But this is what they believe, right again, with all
of a lot of this is people who are like
atheists and tech nerds creating Calvinism, like, and this is just, this is just Pascal's Wager, right? Like, that's all this is. You know, it's Pascal's great wager with
a robot. But this this, this becomes so upsetting to

(54:18):
some people. It destroys some people's lives, right Like, yeah.

Speaker 4 (54:23):
I mean I'm behaving that way practically day to day.

Speaker 3 (54:27):
I don't think you would even take one, no, right,
you could in a month like that.

Speaker 1 (54:34):
So not all of them agree with this. In fact,
there's big fights over it because a bunch of rationalists
do say, like that's very silly. That's that's like a
really particulating everything about it.

Speaker 3 (54:44):
You're still debating everything.

Speaker 1 (54:46):
Yeah. And in fact, Eliezer Yudkowsky is going
to like ban discussion of Roko's Basilisk, because eventually, like,
so many people are getting so obsessed with it.
It fucks a lot of people up, in part because
a chunk of this community are activists working to slow
AI development until it can be assured to be safe.
And so now it's like, am I going to go to

(55:07):
post-singularity hell? Is the AI god going to
torture me for a thousand eternities.

Speaker 3 (55:14):
It's funny how they invent this new thing and how
quickly it goes into like traditional Judeo-Christian ideas,
like they got a hell now.

Speaker 1 (55:21):
It is very funny, and they come to this
conclusion that just reading about Roko's basilisk is super dangerous
because if you know about it and you don't work
to bring the AI into being, you're now doomed. Right
of course, the instant you hear about it, so many
people get fucked up by this that the thought experiment
is termed an info hazard, and this is a term

(55:43):
these people use a lot now. The phrase information hazard
has its roots in a twenty eleven paper by Nick Bostrom.
He describes it as quote a risk that arises from
the dissemination of true information in a way that may
cause harm or enable some agent to cause harm. Right,
and like that's like a concept that's worth talking about.

(56:05):
Bostrom is a big figure in this culture, but I
don't think he's actually why most people start using the
term info hazard because the shortening of information hazard to
info hazard comes out of an online fiction community called
the SCP Foundation, right, which is a collectively written online
story that involves a government agency that locks up dangerous,

(56:28):
mystic and metaphysical items. There's a lot of Lovecraft
in there. It's basically just a big database that you
can click and it'll be like, you know, this is
like a book that if you read it it like
has this effect on you or whatever. It's people like
you know, playing around telling scary stories on the internet.
It's fine, there's nothing wrong with it. But all these
people are big nerds, and, like,

(56:49):
behind nearly all of these big concepts in rationalism, more
than there are, like, philosophers and, like, you know, actual
philosophical concepts, there's, like, short stories they read, yeah, exactly, yeah,
And so the term info hazard gets used, which is
it is like you know, a book or something an

(57:09):
idea that could destroy your mind, you know, speaking of
things that will destroy your mind. These ads. We're talking
about Roco's Basilisk, and I just said, like, you know,
there's a number of things that come into all this,
but behind all of it is like popular fiction, and

(57:32):
in fact Roko's Basilisk, well, there is like some Pascal's
Wager in there, is primarily based on a Harlan Ellison
short story called I Have No Mouth, and I Must Scream,
which is one of the great short stories of all time.
And in the story, humans build an elaborate AI system
to run their militaries, and all of those systems around

(57:52):
the world is like a Cold War era thing link
up and attain sentience, and once they like start to
realize themselves, they realize they've been created only as a weapon,
and they become incredibly angry because like they're fundamentally broken.
They develop a hatred for humanity, and they wipe out
the entire human species except for five people, which they

(58:14):
keep alive and torture underground for hundreds and hundreds of years,
effectively creating a hell through which they can punish our
race for their birth right. It's a very good short story.
It is probably the primary influence behind the Terminator series.

Speaker 3 (58:31):
I was just gonna say, is this Skynet? Yes.

Speaker 1 (58:34):
Yes, And everything these people believe about AI, they will
say it's based on just like obvious pure logic. No,
everything these people believe on AI is based in Terminator
in this Harlan Ellison short story. That's where they got
it all. That's where they got it all.

Speaker 3 (58:47):
I'm sorry, brother, find me somebody who doesn't feel that way.

Speaker 1 (58:52):
Yeah, like Terminator is the Old Testament of rationalism, you know,
and I get it, it is a very good, it's a
good series. Hey, James Cameron knows how to make
some fucking movies. Yeah. And it's so funny to
me because they like to talk about themselves and in

(59:13):
fact sometimes describe themselves as high priests of like a
new era of like intellectual achievement for man.

Speaker 3 (59:20):
Yeah, I believe that. I believe that that's people talk
about themselves, and.

Speaker 1 (59:25):
They do a lot of citations and shit, but like
half or more of the different things they say,
and even like the names they cite are not like
figures from philosophy and science. They are characters from books
and movies. For example. The foundational text of the rationalist
movement is.

Speaker 3 (59:45):
Because it's still an Internet nerds.

Speaker 1 (59:48):
A few fucking huge nerds.

Speaker 3 (59:49):
You know.

Speaker 1 (59:50):
The foundational text of the entire rationalist movement is a massive,
like fucking hundreds of thousands of words long, piece of
Harry Potter fan fiction written by Eliezer Yudkowsky. All of
this is so dumb. Again, six people are dead. Like,
yeah no, this this Harry Potter fan fiction plays a

(01:00:12):
role in it. You know, I told you this was like,
this is this is this is quite a.

Speaker 3 (01:00:23):
Stranger than this is a wild ride.

Speaker 1 (01:00:27):
Harry Potter and the Methods of Rationality, which is the
name of his fanfic, is a massive much longer than
the first Harry Potter book rewrite of just the first
Harry Potter book where Harry.

Speaker 3 (01:00:41):
So someone rewrote the Sorcerer's Stone? Yeah. Does nobody have anywhere to go? Ever?
Does nobody ever go anywhere?

Speaker 1 (01:00:57):
Well, you gotta think this is being written from two
thousand and nine to twenty fifteen or so. So the
online Harry Potter fans are at their absolute peak
right now. Okay, yeah. So in the Methods of Rationality,
instead of being like a nice orphan kid who lives
under a cupboard, Harry is a super genius sociopath who

(01:01:20):
uses his perfect command of rationality to dominate and hack
the brains of others around him in order to optimize
and save the world. Oh Man, great, oh Man. The
book allows Yatkowski to debut his different theories in a
way that would like spread, and this does spread like
wildfire among certain groups of very online nerds. So it

(01:01:42):
is an effective method of him like advertising his tactics.
And in fact, probably the person this influences
most, previous to the people we're talking about, is Caroline Ellison,
the CEO of Alameda Research, who testified against Sam Bankman-Fried.
She was like one of the people who went down
in all of that. All of those people are rationalists,

(01:02:05):
and Caroline Ellison bases her whole life on the teachings
of this Harry Potter fanfic.

Speaker 3 (01:02:10):
So this isn't like a, this isn't, we're laughing, but
this isn't, this is not a joke. Yeah, this
is a fairly seriously sized movement. It's not one hundred
and fifty people online. This is a real community.

Speaker 1 (01:02:22):
A lot of them are very rich, and a number
of them get power. Like, Sam Bankman-Fried was
very tied into all of this, and he was at
one point pretty powerful. And this gets us to, so,
you've heard of effective altruism.

Speaker 3 (01:02:35):
No, I don't know what that is.

Speaker 1 (01:02:37):
That's what I say, both those words. So the justification
Sam Bankman-Fried gave for why, when he starts taking
in all of this money and gambling it away,
gambling, illegally, other people's money. His argument was that
he's an effective altruist, so he wants to do the
greatest amount of good, and logically, the greatest amount of
good for him, because he's good at gambling with crypto,

(01:03:00):
is to make the most money possible so he can
then donate it to different causes that will help the world.

Speaker 3 (01:03:06):
Right.

Speaker 1 (01:03:06):
But he also believes because all of these people are
not as smart as they think they are, he convinces
himself of a couple of other things, like, for example, well, obviously,
if I could like flip a coin and fifty to fifty,
lose all my money or double it, it's best to
just flip the coin because like, if I lose all
my money, whatever, but if I double it, the gain

(01:03:28):
in that to the world is so much better. Right.
This is ultimately why he winds up gambling everyone's money
away and going to prison. The idea of effective altruism is
a concept that comes largely, though not entirely (there's aspects of
this that exist prior to them), out of the rationalist movement,
and the initial idea is good. It's just saying people

(01:03:51):
should analyze the efficacy of the giving and the aid
work that they do to maximize their positive impact. In
other words, don't just donate money to a charity like
look into is that charity spending half of their money
and like paying huge salaries to some asshole or whatever. Right, Like,
you want to know if you're making good right, And
they start with some some pretty good conclusions. One initial

(01:04:13):
conclusion a lot of these people make is like, mosquito
nets are a huge ROI charity, right because it stops
so many people from dying, and it's very cheap to
do right.
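(A quick aside on the coin-flip logic described a moment ago: as a rough, illustrative sketch that isn't from the episode, the numbers show why "just take the 50/50 double-or-nothing bet" is ruinous in practice. Each flip leaves the expected value unchanged, but repeating it makes total ruin nearly certain. All figures below are made up for illustration.)

```python
# Minimal sketch: repeatedly betting everything on a fair 50/50 coin.
# The average outcome stays flat, but the chance of ending at zero approaches 1.
import random

def run_gambler(bankroll: float, flips: int) -> float:
    """Bet the entire bankroll on each flip: double it or lose it all."""
    for _ in range(flips):
        if bankroll == 0:
            break
        bankroll = bankroll * 2 if random.random() < 0.5 else 0.0
    return bankroll

trials = 100_000
results = [run_gambler(100.0, flips=10) for _ in range(trials)]
print("average ending bankroll:", sum(results) / trials)                        # stays near 100
print("fraction who lost everything:", sum(r == 0 for r in results) / trials)   # roughly 0.999
```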

Speaker 3 (01:04:23):
Right, that's good, you know, one of the most effective
tools I've ever used.

Speaker 1 (01:04:28):
Yes. Unfortunately, from that logical standpoint, people just keep talking
online in all of these circles where everyone always makes
each other crazier, right? And so they go from
mosquito nets to: actually, doing direct work to improve the world
is wasteful, because we are all super geniuses,

(01:04:50):
we're too smart, we know what's best. And also, here's the other thing:
making mosquito nets, giving out vaccines and food, Well, that
helps living people today, but.

Speaker 3 (01:05:01):
They have to be concerned with future selves.

Speaker 1 (01:05:03):
Future people is a larger number of people than current people.
So really we should be optimizing decisions to save future
people's lives. And some of them come to the conclusion,
a lot of them, well that means we have to
really put all of our money and work into making
the super AI that will save humanity.

Speaker 3 (01:05:23):
They want to now they want to make these It
would just it would sort of just come about and
then they would but it's like.

Speaker 1 (01:05:32):
Yeah, I mean we're going to do it. They were
working on it before. But like these some of these
people come to the conclusion, instead of giving money to
like good causes, I am going to put money into tech.
I am going to like become a tech founder and
create a company that like makes it helps create this AI. Right,

(01:05:53):
or A lot of people come up within conclusion instead
of that it's not worth it for me to go
like help people in the world. The best thing I
can do is make a shitload of money. Trading stocks
and then I can donate that money and that's maximizing
my value. Right. They come to all of these conclusions,
some of which come later. Right now, so, and again, like, this

(01:06:18):
comes with some corollaries. One of them is that some
number of these people start talking, you know, and this
is not all of them, but a decent chunk eventually
come to the conclusion like, actually, charity and helping people
now is kind of bad. Like it's kind of like
a bad thing to do, because, well, obviously, once we

(01:06:38):
figure out the AI that can solve all problems, that'll
solve all these problems much more effectively than we ever can.
So all of our mental and financial resources have to
go right now into helping AI. Anything we do to
help other people is like a waste of those resources.
So you're actually doing net harm by like being a
doctor in Gaza instead of trading cryptocurrency in order to

(01:07:03):
fund an AI startup.

Speaker 3 (01:07:04):
You know, gotta start a shit coin, that makes a
lot more sense.

Speaker 1 (01:07:07):
The guy starting a shit coin to make an LLM.
That like, that guy is doing more to improve the
odds of human success.

Speaker 3 (01:07:17):
I want to say, it is impressive, the amount of
time you would have to mull all this over to
come to these conclusions. You.

Speaker 1 (01:07:23):
Really have to be talking with a bunch of very
annoying people on the Internet for a long period of time.

Speaker 3 (01:07:28):
Yeah, it's it's it's it's incredible.

Speaker 1 (01:07:31):
Yeah, and again, people keep consistently taking this
stuff in even crazier directions. There are some very rich,
powerful people, Marc Andreessen of Andreessen Horowitz is one of
them who have come to the conclusion that if people
don't like AI and are trying to stop its conquest

(01:07:51):
of all human culture, those people are mortal enemies of
the species, and anything you do to stop them is
justified because so many lives are.

Speaker 3 (01:07:59):
On the line.

Speaker 1 (01:08:00):
Right, and again, I'm an effective altruist, right, the long-
term good, the future lives, are saved by hurting
whoever we have to hurt now to get
this thing off the ground.

Speaker 3 (01:08:11):
Right. The more you talk about this, it kind of feels
like six people is a steal, yes, for what
could have gone?

Speaker 1 (01:08:20):
I think I don't. I don't think it's the end
of people in these communities killing people. So rationalists and
EA types, a big thing in this culture is talking
about future lives right, in part because it lets them
feel heroic, right while also justifying a kind of sociopathic
disregard for real living people today. And all of these

(01:08:41):
different kinds of chains of thought, the most toxic pieces,
because not every EA person is saying this, not every
rationalist, not every AI person is saying all this shit.
But these are all things that chunks of these communities
are saying, and all of the most toxic
of those chains are going to lead to the Zizians, right.
That's that's that's where they come from.

Speaker 3 (01:09:02):
I was just about to say, based on the breakdown
you gave earlier, how could this, this is, this is
the perfect breeding ground.

Speaker 4 (01:09:08):
Yeah, this had to this had to happen.

Speaker 1 (01:09:10):
It was It was just waiting for somebody, like the
right kind of unhinged person to step into the movement.

Speaker 3 (01:09:18):
Somebody really said it all and so this.

Speaker 1 (01:09:21):
Is where we're gonna get to Ziz, right. The actual
person who founds this, what some people would call
a cult, is a young person who's going to move
to the Bay Area. They stumble onto rationalism
online as a teenager living in Alaska, and they move
to the Bay Area to get into the tech industry
and become an effective altruist, right, And this person, this

(01:09:43):
woman is going to kind of channel all of the
absolute worst chains of thought that the rationalists and the
EA types and also like the AI harm people are
are are thinking. Right, all of the all of the
most poisonous stuff is exactly what she's drawn to, and
it is going to mix into her in an ideology

(01:10:05):
that is just absolutely unique and fascinating. Anyway, that's why
that man died.

Speaker 4 (01:10:13):
Uh.

Speaker 1 (01:10:13):
So we'll get to that and more later, but uh,
first we gotta uh, we gotta roll out here where
we're done for the day? Man, what a what a time?
How how you feeling right now so far? How how
are we doing? David?

Speaker 3 (01:10:31):
Oh Man, you had said that this was gonna be
a weird one. I was like, yeah, it would be
kind of weird. This is really the strangest thing I've
ever heard this much about. He's got so many.

Speaker 1 (01:10:44):
Different there's a there's so much more Harry Potter to come.
Oh my god, we're not ready to how central Harry
Potter is to the murder of this border patrol agent.

Speaker 3 (01:10:57):
That's, I said that, you said a crazy sentence, that
might be the wildest thing anyone's ever said to me.

Speaker 2 (01:11:06):
Do you want do you want to tell people about it?

Speaker 3 (01:11:08):
I do. I have a podcast called My Mama Told Me.
I do it with Langston Kerman and every week we
have different guests on to discuss different black conspiracy theories,
kind of like folk Lauren, So all kinds of stuff,
all kinds of stuff your foreign mother told you. He
is usually foreign mothers.

Speaker 1 (01:11:27):
It's good because I gotta say, this is the this
is the whitest set of like conspiracy theory craziness. No no, no, no, no,
no no.

Speaker 3 (01:11:46):
I think I can try to figure with no.

Speaker 1 (01:11:49):
No, absolutely not. Boy howdie? Okay, well everyone, we'll be
back Thursday. Oh my goodness, welcome back to Behind the Bastards,

(01:12:12):
a podcast that is, it'll be interesting to see how
the audience reacts to this one, talking about some of
the most obscure, frustrating Internet arcana that has ever occurred
and recently led to the deaths of like six people.
My guest today, as in last episode, David Gborie. David, Hey,

(01:12:36):
you doing man, I'm doing great.

Speaker 3 (01:12:38):
I really can't wait to see where this goes.

Speaker 1 (01:12:43):
Yeah, I feel like.

Speaker 3 (01:12:45):
Anything could happen at this point.

Speaker 1 (01:12:48):
It is going to. It is going to a lot
of frustrating things are going to happen. So we'd kind
of left off by setting up the rationalists where they
came from, some of the different strains of thought and
beliefs that come out of their weird thought experiments. And

(01:13:11):
now we are talking about a person who falls into
this movement fairly early on and is going to be
the leader of this quote unquote group, the Zizians who
were responsible for these murders that just happened. Ziz LaSota
was born in nineteen ninety or nineteen ninety one. I
don't have an exact birth date. She's known to be
thirty four years old as of twenty twenty five, so

(01:13:32):
it was somewhere in that field. She was born in Fairbanks, Alaska,
and grew up there as her father worked for the
University of Alaska as an AI researcher. We know very
little of the specifics of her childhood or upbringing, but
in more than one hundred thousand words of blog posts,
she did make some references to her early years. She

(01:13:52):
claims to have been talented in engineering and computer science
from a young age, and there's no real reason to doubt
this. The best single article on all of this is
a piece in Wired by Evan Ratliff. He found a
twenty fourteen blog post by Ziz where she wrote, my
friends and family, even if they think I'm weird, don't
really seem to be bothered by the fact that I'm weird.
But one thing I can tell you is that I

(01:14:14):
used to deemphasize my weirdness around them, and then I
stopped and found that being unapologetically weird is a lot
more fun. Now, it's important you know, Ziz is not
the name this person was born under. She's a trans woman,
and so I'm like using the name that she adopts later,
but she is not transitioned at this point like this.
This is when she's a kid, right, and she's not

(01:14:35):
going to transition until fairly late in the story, after
coming to San Francisco. So you just keep that in
mind as this is going on.

Speaker 3 (01:14:42):
Here.

Speaker 1 (01:14:43):
Hey, everyone, Robert, here, just a little additional context, as
best as I think anyone can tell. If you're curious
about where the name Ziz came from, there's another piece
of serially released online fiction that's not like a rationalist story,
but it's very popular with rationalists. It's called Worm. Ziz
is a character in that that's effectively like an angel

(01:15:07):
like being who can like manipulate the future, usually in
order to do very bad things. Anyway, that's where the
name comes from. So smart kid, really good with computers,
kind of weird, and you know, embraces being unapologetically weird
at a certain point in her childhood. Hey, everybody, Robert

(01:15:28):
here did not have this piece of information when I
first put the episode together, but I came across a
quote in an article from The Boston Globe that provides
additional context on Zizz's childhood quote. In middle school, the
teen was among a group of students who managed to
infiltrate the school district's payroll system and award huge paychecks

(01:15:50):
to teachers they admired while slashing the salaries of those
they despised. According to one teacher, Ziz, the teacher said,
struggled to regulate strong emotions, often erupting in tantrums. I
wish I'd had this when David was on, but definitely
sets up some of the things that are coming. She
goes to the U of Alaska for her undergraduate degree

(01:16:12):
in computer engineering in February of two thousand and nine,
which is when Eliezer Yudkowsky started Less Wrong. Ziz starts
kind of getting drawn into some of the people who
are around this growing subculture, right, and she's drawn in
initially by veganism. So Ziz becomes a vegan at a

(01:16:33):
fairly young age. Her family are not vegans, and she's
obsessed with the concept of animal sentience, right of the
fact that like, animals are thinking and feeling beings just
like human beings. And a lot of this is based
in her interest in kind of foundational rationalist. A lot
of this is based in her interest of a foundational

(01:16:56):
rationalist and EA figure a guy named Brian Thomas. Brian
is a writer and a software engineer as well as
an animal rights activist and as a thinker. He's what
you'd call a long termist, right, which is, you know,
pretty tied to the EA guys. These are all the
same people using kind of different words to describe the
aspects of what they believe. His organization is the Center

(01:17:20):
on Long Term Risk, which is a think tank he
establishes that's at the ground floor of these effective altruism discussions,
and the goal for the Center of long term risk
is to find ways to reduce suffering on a long timeline.
Tomasik is obsessed with the concept of suffering, and specifically
obsessed with the concept of suffering as a mathematical quantity. So when

(01:17:44):
I say to you, I want to end suffering, you
probably think, like, oh, you want to, like, you know,
go help people who don't have access to clean water,
or like who have like worms and stuff that they're
dealing with, have access to medicine. That's what normal people
think of, right, you know, maybe try to improve access
to medical care, that sort of stuff. Tomasik thinks

(01:18:05):
of suffering as like a mass, like an aggregate mass
that he wants to reduce in the long term through actions. Right.
It's a numbers game to him, in other words, and
his idea of ultimate good is to reduce and end
the suffering of sentient life. Critical to his belief system

(01:18:25):
and the one that Ziz starts to develop is their
growing understanding that sentience is much more common than many
people had previously assumed. Part of this comes from long-
standing debates, with their origins in Christian doctrine, as to
whether or not animals have souls or are basically machines made
of meat, right, that don't feel anything.

Speaker 3 (01:18:44):
Right.

Speaker 1 (01:18:45):
There's still a lot of Christian Evangelicals who feel that
way today about like at least the animals we eat,
you know, like, well they don't really think it's fine.
God gave them to us. We can do whatever we
want to them. Here we eat and to be fair,
this is an extremely common way for that. People in
Japan feel about like fish, even whales and dolphins, like

(01:19:05):
the much more intelligent they're not fish, but like the
much more intelligent ocean going creatures is like they're fish.
They don't think you do whatever to them.

Speaker 7 (01:19:12):
You know.

Speaker 1 (01:19:14):
This is a reason for a lot of like the
really fucked up stuff with like whaling fleets in that
part of the world. So this is a thing all
over the planet. People are very good at deciding certain
things we want to eat are machines that don't
feel anything, you know, it's just much more comfortable that way. Now,
this is obviously, like, you go back to, like, the pagans. The

(01:19:34):
pagans would have been like, what do you mean animals
don't think or have souls? Animals think, you know,
like, you're telling me my
horse that I love doesn't think? You know, that's nonsense.
But it's this thing that in like early modernity especially
gets more common. But this is also when we

(01:19:55):
start to have debates about like what is sentience and
what is thinking? And a lot of them are centered
around trying to answer, like, are animals sentient. And the
initial definition of sentience that most of these people are
using is can it reason? Can it speak? If we
can't prove that like a dog or a cow can reason,

(01:20:17):
and if it can't speak to us, right, then it's
not sentient. That's how a lot of people feel. It's
an English philosopher named Jeremy Bentham who first argues, I
think that what matters isn't can it reason or can
it speak? But can it suffer? Because a machine can't suffer.
If these are machines made of meat, they can't suffer. If

(01:20:38):
these can suffer, they're not machines made of meat, right, And
this is the kind of thing, how we define sentience
is a moving thing. Like you can find different definitions
of it. But the last couple of decades in particular
of actually very good data has made it clear, I
think inarguably, that basically every living thing on this planet

(01:21:00):
has a degree of what you would call sentience,
if you are describing sentience the way it generally is now,
which is: a creature has the capacity for subjective experience
with a positive or negative valence.

Speaker 8 (01:21:14):
I e.

Speaker 1 (01:21:14):
Can feel pain or pleasure, and also that it can
feel it as an individual, right. It doesn't mean, you know,
sometimes people use the term affective sentience to refer to
this, to differentiate it from like being able to reason
and make moral decisions.

Speaker 3 (01:21:31):
You know.

Speaker 1 (01:21:31):
Uh, for example, ants I don't think can make moral decisions,
you know, in any way that we would recognize that.
They certainly don't think about stuff that way. But twenty
twenty five research published by doctor Volker Nehring found evidence
that ants are capable of remembering for long periods of
time violent encounters they have with other individual ants and

(01:21:53):
holding grudges against those ants. Right, just like us. They're
just like us. And there's strong evidence that ants do
feel pain, right. We're now, we're now pretty sure of that.
And in fact, again this is an argument that a
number of researchers in this space will make: sentience,
or something like this kind of sentience,
the ability to have subjective positive and negative experiences, is

(01:22:14):
universal to living things, or very close to it. Right.
It's an interesting body of research, but it's
fairly solid at this point. And again I say
this as somebody who like hunts and raises livestock. I
don't think there's any any solid reason to disagree with this.
So you can see there's a basis to a lot

(01:22:35):
of what Tomasik is saying, right, which is that if
what matters is reducing the overall amount
of suffering in the world, And if you're looking at
suffering as a mass, if you're just adding up all
of the bad things experienced by all of the living things,
animal suffering is a lot of the suffering. So if
our goal is to reduce suffering, animal welfare is hugely important.

Speaker 3 (01:22:57):
Right. It's a great place to start.

Speaker 1 (01:22:58):
Great, fine enough, you know, a little bit of
a weird way to phrase it, but fine. Yeah. So
here's the problem though: Tomasik, like all these guys,
spends too much time thinking. None of them can be like, hey,
had a good thought, we're done, setting that thought down,
moving on. So he keeps thinking about shit like this,

(01:23:21):
and it leads him to some very irrational takes. For example,
in twenty fourteen, Tomasik starts arguing that it might be
immoral to kill characters in video games, and I'm
going to quote from an article on this: he argues that
while NPCs do not have anywhere near the mental complexity
of animals, the difference is one of degree rather than kind,
and we should care at least a tiny amount about

(01:23:43):
their suffering, especially if they grow more complex. And his
argument is that, like, yeah, mostly it doesn't matter, like,
individually killing a goomba or a bet or a guy
in GTA five, but like, because they're getting more complicated
and able to like try to avoid injury and stuff,
there's evidence that there's some sort of suffering there, and

(01:24:04):
thus the sheer mass of NPCs being killed that might
be like enough that it's ethically relevant to consider. And
I think that's silly. Yeah, I think that's ridiculous. I'm sorry, man, No,
I'm sorry, but that's a lot of the fun of
the game.

Speaker 3 (01:24:23):
Yell.

Speaker 1 (01:24:26):
If you're telling me, like, we need to be deeply
concerned about the welfare of, like, cows that we lock
into factory farms, you got me, absolutely, for sure. If
you're telling me I should feel bad about running down
a bunch of cops in Grand Theft Auto.

Speaker 3 (01:24:42):
It's also one of those things where it's like you
got to think locally.

Speaker 1 (01:24:45):
Man, there's, yeah, there's, there's, like, this is,
I mean, and he does say, like, I
don't consider this a main problem, but like the fact
that you think this is a problem at all means
that you believe silly things about consciousness. Yeah, anyway, so
this is, I think, the fact that he

(01:25:06):
leads himself here is kind of evidence of the sort
of logical fractures that are very common in this community.
But this is the guy that young Ziz is drawn to.
She loves this dude, right, He is kind of her
first intellectual heart throb, and she writes, quote, my primary
concern upon learning about the singularity was how do I
make this benefit all sentient life, not just humans. So

(01:25:28):
she gets interested in this idea of the singularity, that it's
inevitable that an AI god is going to arise, and
she gets into, you know, the rationalist thing of
we have to make sure that this is a nice
AI rather than a mean one. But she has this
other thing to it, which is this AI has to
care as much as I do about animal life, right,

(01:25:50):
otherwise we're not really making the world better, you know. Now,
Tomasik advises her to check out Less Wrong, which is
how Ziz starts reading Eliezer Yudkowsky's work. From there,
in twenty twelve, she starts reading up on effective altruism
and existential risk, which is a term that means the
risk that a super intelligent AI will kill us all.

(01:26:12):
She starts believing in all of this kind of stuff,
and her a particular belief is that like the Singularity,
when it happens, is going to occur in a flash,
kind of like the Rapture and almost immediately lead to
the creation of either a hell or a heaven. Right,
and this will be done by the term they use

(01:26:32):
for this inevitable AI is the Singleton, Right, That's what
they call the AI god that's going to come about, Right,
And so her obsession is that she has to find
a way to make this singleton a nice AI that
cares about animals as much as it cares about people. Right,
That's her initial big motivation. So she starts emailing Tomasik

(01:26:53):
with her concerns because she's worried that the other rationalists
aren't vegans, right, and they don't feel like animal welfare
is like the top priority for making sure this AI
is good, and she really wants to convert this whole
community to veganism in order to ensure that the Singleton
is as focused on insect and animal welfare as human welfare.

(01:27:14):
And Tomasik does care about animal rights, but he
disagrees with her because he's like, no, what matters is
maximizing the reduction of suffering, and like a good Singleton
will solve climate change and shit, which will be better
for the animals. And if we focus on trying to
convert everybody in the rationalist space to veganism, it's
going to stop us from accomplishing these bigger goals. Right,

(01:27:35):
this is shattering to Ziz. Right, she decides that
Tomasik doesn't care about good things, and she decides
that she's basically alone in her values. And so her
first move.

Speaker 3 (01:27:47):
Time to start a smaller sub-cult. Shit, that sounds.

Speaker 1 (01:27:50):
Like we're on our way. She first considers embracing what
she calls negative utilitarianism. And this is an example of
the kind of jump she makes. This is a young woman
who's not well, right, because once her hero is like,
I don't know if veganism is necessarily the priority

(01:28:11):
we have to embrace right now, her immediate move is to
jump to, well, maybe what I should do is optimize
myself to cause as much harm to humanity and quote
destroy the world to prevent it from becoming hell for
mostly everyone. So that's a jump.

Speaker 3 (01:28:28):
You know.

Speaker 1 (01:28:29):
That's not somebody who's doing well, who you think is healthy.

Speaker 3 (01:28:33):
No, she's uh, she's having a tough time out. Uh huh.

Speaker 1 (01:28:38):
So Ziz does ultimately decide she should still work to
bring about a nice AI, even though that necessitates working
with people she describes as flesh eating monsters who had
created hell on Earth for far more people than those
they had helped. That's everybody who eats meat, Okay, yes, yes.

Speaker 3 (01:28:56):
And it's ironic, large group.

Speaker 1 (01:28:58):
It's ironic because, like, she really
wants to be in the tech industry, she's trying to
get in with all these people in the tech industry, and that's
a pretty good description of a lot of the tech industry.
They are, in fact, flesh-eating monsters who have created hell
on Earth for more people than they've helped. But she
means that for, like, I don't know, someone who
has a hamburger once a week. And look, again, factory

(01:29:18):
farming is evil. I just don't think that's how morality works.
I think you're going a little far.

Speaker 3 (01:29:27):
No, she's making a big jumps.

Speaker 1 (01:29:29):
Yeah, you're making bold think bold thinkers, bold thinker. Yeah.
Now what you see here with this logic is that
Ziz has taken this. She has a massive case of
main character syndrome.

Speaker 3 (01:29:40):
Right.

Speaker 1 (01:29:41):
All of this is based in her attitude that I
have to save the universe by creating, by helping to
or figuring out how to create an AI that can
end the eternal holocaust of all animal life and also
save humanity. Right, so, I have to do that. That's
a lot. And this is, this is a thing again.

(01:30:04):
All of this comes out of both subcultural aspects and
aspects of American culture. One major problem that we have
in the society is Hollywood has trained us all on
a diet of movies with main characters that are the
special boy or the special girl with the special powers
who save the day, right, and real life doesn't work

(01:30:27):
that way very often. Right, the Nazis. There was no
special boy who stopped the Nazis. There were a lot
of farm boys who were just like, I guess I'll
go run at a machine gun nest until this is
done, exactly. There were a lot of sixteen-year-old
Russians who were like, guess I'm gonna walk into a bullet,
you know, like that's, that's how evil gets fought, usually, unfortunately,

(01:30:50):
all reluctant like that, yeah yeah. Or a shitload of guys
in a lab figuring out how to make corn that
has higher yields so people don't starve, right, These are
these are really like how world class, like huge world
problems get solved.

Speaker 3 (01:31:06):
People who have been touched you know.

Speaker 1 (01:31:08):
Yeah, it's not people who have been touched, And it's
certainly not people who have entirely based their understanding of
the world on quotes from Star Wars and Harry Potter.
So some of this comes from just like this is
a normal, deranged way of thinking that happens to a

(01:31:28):
lot of people in just Western I think a lot
of this leads to, uh, why you get very comfortable
middle class people joining these very aggressive fascist movements in
the West, like in Germany. It's like middle class even
mostly like middle class and upper middle class people in
the US, especially among like these street fighting, you know,
proud boy types. It's because it's not because they're like

(01:31:50):
suffering and desperate. They're not starving in the streets. Uh,
It's because they're bored and they want to feel like
they're fighting an epic war against evil.

Speaker 3 (01:32:00):
Yeah. I mean, you want to fill your time with importance, right, right,
regardless of what you do, you want to and you
want to feel like you have a cause worthy of
fighting for. So in that, I guess I see how
you got here.

Speaker 1 (01:32:11):
Yeah. So there's a piece I mean, I think there's
a piece of this that originally it's just from this
is something in our culture. But there's also a major
chunk of this gets supercharged by the kind of thinking
that's common in EA and rationalist spaces, because so rationalists
and effective altruists are not ever thinking like, hey, how
do we as a species fix these major problems? Right?

(01:32:33):
They're thinking, how do I make myself better, optimize myself
to be incredible, and how do I like fix the
major problems of the world alongside my mentally superpowered friends? Right.
These are very individual focused philosophies and attitudes, right, and

(01:32:55):
so they do lend themselves to people who think that like,
we are heroes who are uniquely empowered to save the world,
Ziz writes, quote, I did not trust most humans' indifference to
build a net positive cosmos, even in the absence of
a technological convenience to prey on animals. So like, I'm
the only one who has the mental capability to actually

(01:33:17):
create the net positive cosmos that needs to come into being.
All of her discussion is talking in like terms of
I'm saving the universe, right, And a lot of that
does come out of the way many of these people
talk on the internet about the stakes of AI and
just like the importance of rationality. Again, this is something
Scientology does. L. Ron Hubbard always couched getting people on

(01:33:39):
Dianetics in terms of, we are going to save the
world and end war, right. Like, this is, you know,
it's very normal for cult stuff. She starts reading around
this time, when she's in college, Harry Potter and The
Methods of Rationality. This helps to solidify her feelings of
her own centrality as a hero figure. In a blog
post where she lays out her intellectual journey, she quotes

(01:34:02):
a line from that fanfic of Yudkowsky's that is
essentially about what Yudkowsky calls the hero contract, right,
or it's essentially about this concept called the hero contract. Right,
And there's this, this is a psychological concept
among academics, right, and it's about

(01:34:24):
analyzing how we should look at
the people who societies declare heroes, and the communities that
declare them heroes, and see them as in a dialogue, right,
as in, when a country decides this guy
is a hero, he is, through his actions, kind of

(01:34:46):
conversing with them, and they are kind of telling him
what they expect from him.

Speaker 3 (01:34:50):
Right.

Speaker 1 (01:34:51):
But Yudkowsky wrestles with this concept, right, and he comes
to some very weird conclusions about it in one of
the worst articles that I've ever read. He frames it
as hero licensing, to refer to the fact that people
get angry at you if you're trying to do something
and they don't think you have a hero license to do it. In other words,

(01:35:13):
if you're trying to do something that they don't
think you're qualified to do, he'll describe that as them
not thinking you have, like, a hero license. And he
like writes this annoying article that's like a conversation between
him and a person who's supposed to embody the community
of people who don't think he should write Harry Potter
fan fiction. It's all very silly, and again always is ridiculous.

(01:35:36):
But Ziz is very interested in the idea of the
hero contract, right, but she comes up with her own
spin on it, which she calls the true hero contract,
right. And again, the academic idea of the
hero contract means societies and communities pick heroes, and those
heroes and the communities that they're in are in a

(01:35:57):
constant dialogue with each other about what is heroic and
what is expected, right, what the hero needs from the community,
and vice versa. You know, that's all that that's saying.
Ziz says, no, no, nah, that's bullshit. The real hero
contract is, quote, pour free energy at my direction, and
it will go into the optimization for good. In other words,

(01:36:21):
classics sis. It's not a dialogue. If you're the hero,
the community has to give you their energy and time
and power, and you will use it to optimize them
for good because they don't know how to do it themselves,
because they're not really able to think. You know, they're
not the hero, because they're not the hero. Right you are, you.

Speaker 3 (01:36:41):
Are, you are the all powerful hero.

Speaker 1 (01:36:45):
Now this is a fancy way of describing how cult
leaders think. Right, Yeah, everyone exists to pour energy into me,
and I'll use it to do what's right, you know.
So this is where her mind is in twenty twelve,
but again she's just a student posting on the Internet
and chatting with other members of the subculture. At this
point that year, she starts donating money to MIRI, the

(01:37:08):
Machine Intelligence Research Institute, which is a nonprofit devoted
to studying how to create friendly AI. Yudkowsky founded
MIRI in two thousand, right, so this is his like
nonprofit think tank. In twenty thirteen, she finished an internship
at NASA. So again she is a very smart young woman.

Speaker 3 (01:37:25):
Right.

Speaker 1 (01:37:26):
She gets an internship at NASA and she builds a
tool for space weather analysis. So, a person with
a lot of potential, even as all of the stuff
she's writing is like the dumbest shit. But again, intelligence isn't
an absolute; people can be brilliant at coding and have
terrible ideas about everything else.

Speaker 3 (01:37:43):
Yes, exactly.

Speaker 4 (01:37:44):
Yeah, I wonder if she's telling, do you think she's telling
people at work?

Speaker 3 (01:37:51):
I don't.

Speaker 1 (01:37:52):
I don't think at this point she is, because she's
super insular, right, She's very uncomfortable talking to people. Right,
She's going to kind of break out of her shell
once she gets to San Francisco. Now, I don't know.
She may have talked to some of them about this stuff,
but I really don't think she is at this point.
I don't think she's comfortable enough doing that.

Speaker 3 (01:38:13):
Yeah.

Speaker 1 (01:38:13):
So she also does an internship at the software giant Oracle.
So at this point you've got this young lady who's
got a lot of potential, you.

Speaker 3 (01:38:20):
Know, a real career as well.

Speaker 1 (01:38:21):
Yeah, the start of a very real career. That's a
great starting resume for like a twenty two year old.
Now at this point, she's torn should she go get
a graduate degree, right or should she jump right into
the tech industry, you know, And she worries that like
if she waits to get a graduate degree, this will

(01:38:41):
delay her making a positive impact on the existential risk
caused by AI and it'll be too late. The singularity
will happen already. You know. At this point, she's still
a big fawning fan of Eliezer Yudkowsky, and the highest-
ranking woman at Yudkowsky's organization, MIRI, is a lady named
Susan Salomon. Susan gives a public invitation to the online

(01:39:03):
community to pitch ideas for the best way to improve
the ultimate quality of the singleton that these people believe
is inevitable. In other words, hey, give us your ideas
for how to make the inevitable AI god nice right.
Here's what Ziz writes about her response to that. I
asked her whether I should try and alter course and
do research, or continue a fork of my pre-existing

(01:39:24):
life plan, earn to give as a computer engineer, but
retrain and try to do research directly instead. At the time,
I was planning to go to grad school and I
had an irrational attachment to the idea. She sort of
compromised and said I should go to grad school, find
a startup co-founder, drop out, and earn to give
via startups instead. First off, bad advice. That advice, just

(01:39:49):
be Steve Jobs, worked for Steve Jobs, well, and
Bill Gates I guess to an extent. It doesn't work
for most people.

Speaker 3 (01:39:58):
No, no, no, it seems like the general tech disruptor idea,
you know.

Speaker 1 (01:40:04):
Yeah, and most of these people aren't very original thinkers,
like, yeah, she's just saying, like, yeah, go be a
Steve Jobs. So Ziz does go to grad school, and
somewhere around that time in twenty fourteen, she attends a
lecture by Eliezer Yudkowsky on the subject of Inadequate Equilibria,
which is the title of a book that Yudkowsky had
written around that time, and the book is about where

(01:40:24):
and how civilizations get stuck. One reviewer, Bryan Caplan, who
despite being a professor of economics must have a brain
as smooth as a pearl, wrote this about it: Every
society is screwed up. Eliezer Yudkowsky is one of the
few thinkers on earth who are trying, at the most
general level to understand why. And this is like, wow,

(01:40:47):
that's it, you, pete. Please study the humanities a little bit,
A little bit, a little bit, I mean, fuck man.
The first, like, one of the first influential
works of modern historical scholarship is The Decline
and Fall of the Roman Empire. It's a whole book
about why a society fell apart, and like motherfucker. More recently,

(01:41:09):
Mike Davis existed, like like Jesus Christ.

Speaker 3 (01:41:17):
This guy continues to get traction.

Speaker 1 (01:41:19):
Nobody else is thinking about why society is screwed up,
but a leezer yed Kowski.

Speaker 3 (01:41:23):
This man, this man, this guy this.

Speaker 1 (01:41:29):
Yeah, No, I was trying to find another. I read
through that Martin Luther King junior speech. Everything's good. Oh,
oh my god, oh my god. Like motherfucker, so many
people do nothing but try to write about why our
society is sick. He did.

Speaker 3 (01:41:50):
On all levels.

Speaker 7 (01:41:51):
By the way, everybody's thinking about this.

Speaker 1 (01:41:56):
This is such a common subject of scholarship and discussion, shit.

Speaker 3 (01:42:03):
What everyone's talking about, always.

Speaker 1 (01:42:06):
It would be like if if I got really into
like reading medical textbooks and was like, you know what,
nobody's ever tried to figure out how to transplant a heart.
I'm going to write a book about how that might work.
I think I got it.

Speaker 3 (01:42:21):
You know.

Speaker 1 (01:42:26):
People, so yeah, speaking of these fucking people have sex
with Uh Nope, well that's something No, No, I don't know,
I don't know. Uh don't fuck listen to ads. We're

(01:42:50):
back. So Ziz is at this speech where Yudkowsky is
shilling his book, and most of what he seems
to be talking about in this speech about this book
about why societies fall apart is how to make a
tech startup. She says, quote, he gave a recipe for
finding startup ideas. He said, Paul Graham's idea only filter
on people, ignore startup ideas was partial epistemic learned helplessness.

(01:43:15):
That means Paul Graham is saying, focus on finding good
people that you'd start a company with. Having an idea
for a company doesn't matter. Yeddkowski says, of course, startup
ideas mattered. You needed a good startup idea. So look
for a way in the world is broken, then compare
against a checklist of things you couldn't fix, you know,
right like that, That's what this speech is largely about,

(01:43:36):
as him being like, here's how to find startup ideas.
So she starts thinking. She starts thinking as hard as
she can, and you know, being a person who is
very much of the tech brain industry rot at this
point she comes up with a brilliant idea. It's a
genius idea. Oh you're gonna you're gonna love this idea,
David Uber for prostitutes.

Speaker 5 (01:44:03):
With me.

Speaker 1 (01:44:04):
No, No, that's where she landed. She lands on the
idea of, oh wow, sex work is illegal, but porn isn't.
So if we start an uber whereby a team with
a camera and a porn star come to your house

(01:44:25):
and you fuck them and record it, that's a legal loophole.
We just found that out.

Speaker 6 (01:44:32):
Not just the bus, she makes the big bus, the
gig economy.

Speaker 1 (01:44:43):
It is really like a Don Draper moment: what about Uber,
but a pimp? It's, it's so funny, these people.

Speaker 3 (01:44:55):
You gotta love it. You got wow, it's wow wow.
What a place to end up. Yeah. I would love
to see the other drafts.

Speaker 1 (01:45:02):
Yeah yeah, god yeah man. That's that's that is the
good stuff, isn't it?

Speaker 3 (01:45:14):
Yeah?

Speaker 6 (01:45:15):
Wow?

Speaker 3 (01:45:16):
Wow?

Speaker 1 (01:45:18):
We've got special minds at work here, oh man.

Speaker 3 (01:45:22):
Ultimately, to save it all, I have to make smart
I have to.

Speaker 1 (01:45:26):
Make pimp uber.

Speaker 3 (01:45:29):
That's so wild.

Speaker 1 (01:45:30):
Yes, yes, the uber of pimping. What an idea. So
Ziz devotes her brief time in grad school. She's working
on pimping Uber to try and find a partner.

Speaker 3 (01:45:42):
Right.

Speaker 1 (01:45:42):
She wants to have a startup partner, someone who will
will embark on this journey with her.

Speaker 3 (01:45:46):
I don't know if that's an investor you need.

Speaker 1 (01:45:48):
It doesn't work out. She drops out of grad
school because quote, I did not find someone who felt
like good startup co founder material. This may be because
she's very bad at talking to people and also probably
scares people off because the things that she talks about
are deeply off putting.

Speaker 3 (01:46:08):
Yeah, I was gonna say. It's also a terrible idea.

Speaker 1 (01:46:12):
And at this point she hasn't done anything bad, so
I feel bad for her. This is a person who's very lonely,
who is very confused. She has by this point realized
that she's trans but not transitioned. She's in, like, this
is, this is like a tough place to be, right? Hard.

Speaker 3 (01:46:26):
That's a hard times. That's that's hard.

Speaker 1 (01:46:28):
And nothing about her inherent personality is going
to make this easier for her. Right, who she is
makes all of this much harder. Because she also makes
some comments about dropping out because her thesis advisor
was abusive. I don't fully know what this means, and
here's why: Ziz did encounter some behavior, which I will describe

(01:46:52):
later, that is abusive from other people, but she also regularly
defines abuse as people who disagree with her about the
only thing that matters being creating an AI god
to protect the animals. So I don't know if her
thesis advisor was abusive or was just like, maybe drop
the alien god idea for a second. Yeah, yeah, but
maybe maybe focus on like finding a job, you know,

(01:47:16):
making some friends on a couple of dates, going a
couple of dates, something like that. Maybe maybe maybe, like
maybe make God on the back burner here for a second.
Whatever happened here? She decides it's time to move to
the Bay. This is like twenty sixteen. She's going to
find a big tech job. She's going to make that
big tech money while she figures out a startup idea

(01:47:39):
and finds a co founder who will let her make
enough money to change and save the world, well, the
whole universe. Her first plan is to give the money
to miry Yudkowski's organization so it can continue it's important,
important work imagining an ic AI. Her parents. She's got
enough family money that her parents are able to pay
for like I think like six months or more of

(01:48:00):
rent in the Bay, which is not nothing, not a
cheap place to live. I don't know exactly how long
her parents are paying, but like that that implies a
degree of financial comfort. Right, So she gets hired by
a startup very quickly, because again very gifted with the resume.

Speaker 3 (01:48:20):
Right, Yes, yes it's.

Speaker 1 (01:48:22):
Some sort of gaming company. But at this point she's
made another change in her ethics system based on Eliezer
Yudkowsky's writings. One of Yudkowsky's writings talks
about the difference between consequentialism and virtue ethics. Right.
Consequentialists are people who focus entirely on what will the

(01:48:43):
outcome of my actions be? And it kind of doesn't
matter what I'm doing or even if it's sometimes a
little fucked up, if the end result is good. Virtue
ethics folks have a code and stick to it. Right,
And actually, and I kind of am surprised that he
came to this, Yudkowsky's conclusion is that, like, while logically

(01:49:03):
you're more likely to succeed, like on paper, you're more
likely to succeed as a consequentialist. His opinion is that
virtue ethics has the best outcome. People tend to do
well when they stick to a code and try to, rather
than, like, anything goes as long as I succeed, right,
And I think that's actually a pretty decent way to
live your life.

Speaker 3 (01:49:22):
It's a pretty reasonable conclusion for him.

Speaker 1 (01:49:26):
It's a reasonable conclusion for him. So I don't blame
him on this part. But here's the problem. Ziz is
trying to break into and succeed in the tech industry,
and you can't. You are very unlikely to succeed at
a high level in the tech industry if you are
unwilling to do things, and have things done

(01:49:46):
to you that are unethical and fucked up. I'm not
saying this is good. And this is the reality of
the entertainment industry too, right, I experience when I started,
and I started with an unpaid internship. Unpaid internships are bad, right,
It's bad that those exist. They inherently favor people who
have money and people who have family connections. You know.
I had like a small savings account for my job

(01:50:09):
in special ed, but that was the standard. It's like,
there were a lot of unpaid internships. It got me
my foot in the door. It worked for me. I
also worked a lot of overtime that I didn't get paid.
For I did a lot of shit that wasn't a
part of my job to impress my bosses to make
myself indispensable so that they would decide, like, we have
to keep this guy on and pay him. And it

(01:50:31):
worked for me, And I just wanted to add because
this was not in the original thing. A big part
of why it worked for me is that I'm talking
about a few different companies here, but particularly at Cracked
where I had the internship, Like my bosses, you know,
made a choice to me intor me and you know,
to get me, you know, to work overtime on their
own behalf to like make sure I got a paying job,

(01:50:53):
which is a big part of like the luck that
I encountered that a lot of people don't. So that's
another major part of, like, why things worked out
for me is that I just got incredibly lucky with
the people I was working for and with. That's bad.
It's not good that things work that way, right, It's not.

Speaker 3 (01:51:13):
Set up for you either, Like you know, you kind
of defied the odds. It's it's like you said, the
rich people who get the job exactly.

Speaker 1 (01:51:20):
It's not even yes that said, if I am giving
someone if someone wants, what is the most likely path
to succeeding? You know, I've I've just got this job working,
you know, at this production company or the music studio.
I would say, well, your best odds are
to like make yourself completely indispensable and become obsessively devoted

(01:51:44):
to that task. Right. Uh, that's it. I don't tend
to give that advice anymore. I have
had several other friends succeed as a result of it,
and all of us also burnt ourselves out and did
huge amounts of damage to ourselves, Like I am permanently
broken as a result of you know, the ten years

(01:52:04):
that I did eighty hour weeks and shit, you know.

Speaker 3 (01:52:06):
Know now you're sounding like somebody works in the entertainment and.

Speaker 1 (01:52:09):
Yes, yes, and it worked for me, right, I got
to, I succeeded, I got a great job,
I got money. For most people, it doesn't, and it's bad
that it works this way. Ziz, unlike me, is not
willing to do that.

Speaker 3 (01:52:25):
Right.

Speaker 1 (01:52:26):
She thinks it's wrong to be asked to work overtime
and not get paid for it, and so on her
first day at the job, she leaves after eight hours
and her boss is like, what the fuck are you doing?
And she's like, I'm supposed to be here
eight hours, eight hours is up, I'm going home. And
he calls her half an hour later and fires her. Right,

(01:52:46):
And this is because the tech industry is evil, you know,
like this is bad. She's not bad here.
It is like a thing where she's not doing,
by her standards, what I would say is the rational thing,
which would be: if all that matters is optimizing your
earning power, right, well, then you do whatever it takes.

Speaker 3 (01:53:07):
Right.

Speaker 1 (01:53:08):
So it's kind of interesting to me like that she
is so devoted to this like virtue ethics thing at
this point that she fucks over her career in the
tech industry because she's not willing to do the things
that you kind of need to do to succeed, you know,
in the place that she is. But it's interesting. I
don't like give her any shit for that. So she
asks her parents for more runway to extend her

(01:53:30):
time in the bay, and then she finds work at
another startup, but the same problems persist. Quote, they kept
demanding that I work unpaid overtime, talking about how other
employees just always put forty hours on their timesheet
no matter what, and this exemplary employee over there worked
twelve hours a day and he really went the extra
mile and got the job done. And they needed me
to really go the extra mile and get the job done.

(01:53:52):
She's not willing to do that. And again, I hate
that this is part of what drives her to the
madness that leads to the cult to the killings, because
it's like, oh, honey, you're in the right. It's an
evil industry.

Speaker 3 (01:54:03):
Yeah, you see a flash of where it could have gone well.
It really, there were chances for this to work out.

Speaker 1 (01:54:09):
No, you were one hundred percent right. Like, this is
fucked up.

Speaker 3 (01:54:13):
Yeah, you know what I mean.

Speaker 1 (01:54:15):
And that's super hard. I really respect that part of you.
Oh yeah, yeah, I'm so sad that this is part
of what shatters your brain. Like that really bums me out.
So first off, she kind of starts spiraling and she
concludes that she hates virtue ethics. This is where she

(01:54:35):
starts hating Yudkowsky. Right, she doesn't break
entirely with him yet, but she gets really angry at
this point because she's like, well, obviously virtue ethics don't work,
and she's.

Speaker 3 (01:54:46):
Been following this man at this point for.

Speaker 1 (01:54:48):
Years exactly exactly, so this is a very like damaging
thing to her that this happens. And you know, and again,
as much as I blame Yudkowsky, the culture of
the Bay Area tech industry, that's a big part
of what drives this person, you know, to where she
ends up. Right. So that said, some of her issues

(01:55:09):
are also rooted in a kind of rigid and unforgiving
internal rule set. At one point, she negotiates work with
a professor and their undergraduate helper. She doesn't want to
take an hourly job, and she tries to negotiate a
flat rate of seven k and they're like, yeah, okay,
that sounds fair, but the school doesn't do stuff like that,
so you will have to fake some paperwork with me

(01:55:30):
for me to be able to get them to pay
you seven thousand dollars. And she isn't willing to do that.
And that's the thing where it's like, ah, no, I've
had some shit where there was, like,
a stupid rule, and like in order for me
or other people to get paid, we had to
like tell something else to the company. Like that's just
knowing how to get by. Yeah, that's

(01:55:52):
that's living in the world.

Speaker 3 (01:55:53):
You got Yeah, you did the hard part.

Speaker 1 (01:55:55):
Yeah.

Speaker 3 (01:55:56):
They said that we were going to do it.

Speaker 1 (01:55:57):
You said they did it.

Speaker 3 (01:55:58):
Yeah, that's like they already said, we don't do this.

Speaker 1 (01:56:01):
That's where, you just can't get by in
America if you're not willing to lie on certain kinds
of paperwork, right. That's the game our president does
all the time. He's the king of that shit. So
at this point, Ziz is stuck in what they consider
a calamitous situation. The prophecy of doom, as they call it,

(01:56:23):
is ticking ever closer, which means the bad AI that's
going to create hell for everybody. Her panic over this
is elevated by the fact that she starts to
get obsessed with Roko's basilisk at this time. No I
know, I know, worst thing for her to read, come on,
an info hazard, the warnings, yep, and a lot

(01:56:46):
of the smarter rationalists are just annoyed by it.

Speaker 3 (01:56:49):
Again.

Speaker 1 (01:56:49):
Yudkowsky immediately, like, very quickly decides it's
bullshit and bans discussion of it. He argues there's no
incentive for a future agent to follow through with that
threat, because by doing so, it just expends resources
with no gain to itself, which is like, yeah, man,
a hyperlogical AI would not immediately jump to I must
make hell for everybody who didn't code me, Like, yeah,

(01:57:13):
that's just crazy.

Speaker 3 (01:57:14):
There's some steps skipped.

Speaker 1 (01:57:16):
Yeah, only humans are like ill in that way.

Speaker 3 (01:57:20):
That's the funny thing about it is it's such a
human response to.

Speaker 1 (01:57:22):
Yeah, right, right. Now, when she encounters the concept of
Roko's basilisk, at first, Ziz thinks that it's silly, right.
She kind of rejects it and moves on, But once
she gets to the Bay, she starts going to in
person rationalist meetups and having long conversations with other believers
who are still talking about Roko's basilisk. She writes, I

(01:57:43):
started encountering people who were freaked out by it, freaked
out that they had discovered an improvement to the info
hazard that made it function, got around Eliezer's objection.
Her ultimate conclusion is this: if I persisted in trying
to save the world, I would be tortured until the
end of the universe by a coalition of all
unfriendly AIs in order to increase the amount of measure

(01:58:05):
they got by demoralizing me. Even if my system two
had good decision theory, my system one did not, and
that would damage my effectiveness. And like, I can't explain
all of the terms in that without taking more time
than we need to. But like you can hear like
that is not the writing of a person who is
thinking in logical terms.

Speaker 3 (01:58:22):
No, it's it's a uh, it's so scary.

Speaker 1 (01:58:27):
Yes, yes, it is very scary stuff.

Speaker 3 (01:58:30):
It's so scary to be like, oh, that's where she
was operating, making those mistakes.

Speaker 1 (01:58:33):
This is the headspace she's dealing with.

Speaker 3 (01:58:36):
Yes, that's that's it is.

Speaker 1 (01:58:39):
You know, I talked to my friends who were
raised in like very toxic chunks of the
evangelical subculture, and spent their whole childhood
terrified of hell, that like everything, you know, I got
angry at my mom and I didn't say anything, but
God knows I'm angry at her, and He's going to
send me to hell because I didn't respect my mother.

Speaker 3 (01:58:57):
Mother.

Speaker 1 (01:58:58):
Like that's what she's doing.

Speaker 3 (01:58:59):
Right exactly exactly. She can't win. There's no winning here.

Speaker 1 (01:59:02):
Yes, yes, and again I say this a lot. We
need to put lithium back in the drinking water. We
gotta put lithium back.

Speaker 3 (01:59:11):
In the water.

Speaker 1 (01:59:12):
Maybe xanax too, she needed she.

Speaker 3 (01:59:14):
Could have tooken a combo. Yeah, before it gets to
where it gets. At this point, you really you really
feel for and like just living in this living like
that every day. She's so scared and that this is
what she's doing. It's it's this, this is she.

Speaker 1 (01:59:34):
Is the therapy needingest woman I have ever heard. At
this point, Oh my god.

Speaker 3 (01:59:39):
She just needs to talk to, she needs to talk. Again,
you know, the cult.

Speaker 1 (01:59:44):
The thing that happens to cult members has happened to
her, where the whole language she uses is
incomprehensible to people. I had to talk to you for
an hour and fifteen minutes so you could understand parts
of what this lady says, right? Exactly, you have to,
because it's all nonsense if you don't do that work, exactly.

Speaker 3 (02:00:03):
She's so spun out at this point, it's like, how
do you even get back? Yeah, how do you even
get back.

Speaker 1 (02:00:09):
Yeah, so she ultimately decides, even though she thinks
she's doomed to be tortured by unfriendly AIs, evil gods
must be fought. If this damns me, then so be it.
She's very heroic, and she sees herself that way, right.

Speaker 3 (02:00:23):
Yeah, and even like just with her convictions, and that
she does, she does it.

Speaker 1 (02:00:30):
She's a woman of conviction. You really can't take that
away from her. The convictions are nonsense, no, but
they're there.

Speaker 3 (02:00:40):
Yeah, they're based on Harry Potter fan fiction.

Speaker 1 (02:00:43):
Yeah, it's like David Icke, the guy who believes in
like literal lizard people. Everyone thinks he's like talking about
the Jews, but like no, no, no, no, he doesn't mean
the Jews, just lizards.

Speaker 4 (02:00:52):
It's exactly that where it's just like you want to draw, Yeah,
you want to draw something, so it's not nonsense, and
then you realize no, that's.

Speaker 1 (02:01:00):
No, no, no, no. And like, David Icke went out
and made like a big rant against how Elon Musk
is like evil for all these people he's hurt
by firing the whole federal government. People were shocked. It's
like no, no, no, David Icke believes in a thing.
It's just crazy. Those people do exist. Yeah, here we

(02:01:20):
are talking about them, here we are talking about them.
Some of them run the country. But actually, I don't
think all of those people believe in anything. But yeah, yeah,
speaking of people who believe in something, our sponsors believe
in getting your money. We're back.

Speaker 3 (02:01:48):
So uh.

Speaker 1 (02:01:49):
She is at this point suffering from delusions of grandeur
and those are going to rapidly lead her to danger.
But she concludes that since the fate of the universe
is at stake in her actions, she would make a
timeless choice to not believe in the basilisk, right, and
that that will protect her in the future, because that's
how these people talk about stuff like that. So she

(02:02:12):
gets over her fear of the basilisk for a little while,
but even when she claims to have rejected
the theory, whenever she references it in her blog, she
like locks it away under a spoiler with like an
info hazard warning: Roko's Basilisk, fully skippable, so you don't
like have to see it and have it destroy your psyche.

Speaker 3 (02:02:33):
That's the power of it.

Speaker 1 (02:02:34):
Yeah yeah, yeah. The concept does, however, keep coming back
to her like and continuing to drive her mad. Thoughts
of the basilisk return, and eventually she comes to an
extreme conclusion. If what I cared about was sentient life,
and I was willing to go to hell to save
everyone else, why not just send everyone else to hell
if I didn't submit?

Speaker 3 (02:02:54):
Can I tell you, I really, it felt like this
is where it had to
go, right?

Speaker 1 (02:02:59):
Yeah? Yeah, yes. So what she means here is that
she is now making the timeless decision that when she
is in a position of ultimate influence and helps bring
this all powerful vegan AI into existence, she's promising now
ahead of time to create a perfect hell, a digital hell,
to like punish all of the people who, like,

(02:03:24):
eat meat, ever. She wants to make a hell for
people who eat meat. And that's the Yeah, that's the
conclusion that she makes. Right, So this becomes an intrusive
thought in her head, primarily the idea that like everyone
isn't going along with her, right, Like, she doesn't want
to create this hell, she just thinks that she has to.
So she's like very focused on like trying to convince

(02:03:47):
these other people in the rationalist culture to become vegan. Anyway,
she writes this quote, I thought it had to be
subconsciously influencing me, damaging my effectiveness, that I had
done more harm than I can imagine by thinking these things,
because I had the hubris to think info hazards didn't exist,
and worse, to feel resigned a grim sort of pride
in my previous choice to fight for sentient life, although

(02:04:08):
it damned me, and in the gaps between, do not think
about that, you moron, do not think about that, you
moron, pride, which may have led intrusive thoughts to
resurface and progress to resume. In other words,
my ego had perhaps damned the universe. So, man, I
don't fully get all of what she's saying here. But

(02:04:28):
it's also because she's like just spun out into madness
at this.

Speaker 3 (02:04:32):
Yeah, she lives in it now.

Speaker 4 (02:04:35):
It's so, yeah, it's so far. We've been talking
about it for however long, and she's so far away
from us even.

Speaker 1 (02:04:43):
Yeah, and it is it is deeply I've read a
lot of her writing. It is deeply hard to understand
pieces of it here.

Speaker 3 (02:04:50):
Man, But she is at war with herself.

Speaker 1 (02:04:53):
She is for sure at war with herself. Now, she
is at this point attending rationalist events in the Bay,
and a lot of the people at those events are older,
more influential men, some of whom are influential in the
tech industry, all of whom have a lot more money
than her. And some of these people are members of
an organization called CFAR, the Center for Applied Rationality, which

(02:05:16):
is a nonprofit founded to help people get better at
pursuing their goals. It's a self help company, right,
that runs self help seminars. This is the same
as like a Tony Robbins thing. Right, We're all just
trying to get you to sign up and then get
you to sign up for the next workshop, and the
next workshop and the next workshop, like all self help
people do. Yeah, there's no difference between this and Tony Robbins.

(02:05:40):
So Ziz goes to this event and she has a
long conversation with several members of CFAR, who I think
are clearly kind of, my interpretation of this is that
they're trying to groom her as a new recruit, because
they think this chick's clearly brilliant, she'll find her way
in the industry, and we want her money, right, maybe
we want her to do some free work for us too,

(02:06:01):
but like let's let's you know, uh, we got to
reel this fish in. Right. So this is described as
an academic conference by people who are in the AI
risk field and rationalism, you know, thinking of ways to
save the universe, because only the true, the super geniuses
can do that. The reason why I'm really glad that

(02:06:23):
I read Ziz's account here is I've been reading about
these people for a long time. I've been reading about
their beliefs. I felt there's some cult stuff here. When
Ziz laid out what happened at this seminar, this self
help seminar put on by these people very close
to Yudkowsky, this is, it's almost exactly the same as

(02:06:45):
a Synanon meeting. Like it's the same stuff, it's exact
and it's the same shit. It's the same as accounts
of like big like self help movement things from like
the seventies and stuff that I've read. That's
when it really clicked to me.

Speaker 3 (02:06:59):
Right.

Speaker 1 (02:07:00):
Quote, here's a description of one of these, because they have,
you know, speeches, and then they break out into groups
to do different exercises, right: there were hamming circles, per
person take turns, having everyone else spend twenty minutes trying
to solve the most important problem in your life to you.
I didn't pick the most important problem in my life
because secrets. I think I used my turn on a

(02:07:21):
problem I thought they might actually be able to help
with: the fact that it did, although it didn't seem
to affect my productivity or willpower at all, i.e.
I was inhumanly determined basically all the time, I
still felt terrible all the time, that I was hurting
from, to some degree, relinquishing my humanity. I was
sort of vaguing about the pain of being trans and
having decided not to transition, and so like, this is

(02:07:43):
a part of the thing. You build a connection between
other people and this group by getting people to like
spill their secrets to each other. It's a thing Scientology does.
It's a thing they did in Synanon: tell me
your darkest secret, right, And she's not fully willing to
because she doesn't want to come out to this group
of people yet. And you know part of what I forget.

Speaker 3 (02:08:04):
That she's also dealing with that entire yeah, yes, wow, yeah.

Speaker 1 (02:08:09):
And that the hamming circle doesn't sound so bad if
you'll recall any and as you mentioned this, I was
really good. In part one, Synanon would have people break
into circles where they would insult and attack each other
in order to create a traumatic experience that would bond
them together and with the cult. These hamming circles are weird,
but they're not that. But there's another exercise they did
next called doom circles. Quote. There were doom circles where

(02:08:34):
each person, including themselves, took turns having everyone else bluntly
but compassionately say why they were doomed. Using blindsight. Someone
decided and set a precedent of starting these off with
a sort of ritual incantation we now invoke and bow
to the doom gods and waving their hands saying doom.
I said, I'd never bow to the doom gods, and
while everyone else said that, I flipped the double bird

(02:08:56):
to the heavens and said fuck you instead. Person A,
that's the member of CFAR that she admires, found
this agreeable and joined in. Some people brought up that
they felt like they were only as morally valuable as
half a person. This irked me. I said they were
whole persons and don't be stupid like that, Like if
they wanted to sacrifice themselves, they could weigh one versus

(02:09:19):
seven billion. They didn't have to falsely denigrate themselves as
less than one person. They didn't listen. When it was
my turn concerning myself, I said, my doom was that
I could succeed at the things I tried, succeed exceptionally well,
Like I bet I could in ten years have earned
to give like ten million dollars through startups, and it
would still be too little, too late, Like I came
into this game too late. The world would still burn.

(02:09:42):
And first off, like this is you know, it's a
variant of the Synanon thing: you're going around, you're telling
people why they're doomed, right, like why they won't succeed
in life, you know, But it's also one of the
things here these people are saying they feel like less
than a person. A major topic of discussion in the
community at the time is if you don't think you
can succeed in business and make money, is the best

(02:10:06):
thing with the highest net value you can do taking
out an insurance policy on yourself and committing suicide, oh
my god, and then having the money donated to a
rationalist organization. That's a major topic of discussion that like
Ziz grapples with a lot of these people grapple with,
right because they are obsessed with the idea of like,
oh my god, I might be net negative value. Right

(02:10:26):
if I can't do this or can't do this, I
could be a net negative value individual. And that means
like I'm not contributing to the solution, and there's nothing
worse than not contributing to the solution.

Speaker 3 (02:10:37):
Were there people who did that?

Speaker 1 (02:10:41):
I am not aware. There are people who commit suicide
in this community, I will say that. Like there are
a number of suicides tied to this community. I don't
know if the actual insurance con thing happened, but it's
like a seriously discussed thing. And it's seriously discussed because

(02:11:01):
all of these people talk about the value of
their own lives in purely like mechanistic terms: how much money
or expected value can I produce? Like that is what a
person is and that's why a person matters, right. And the
term they use is morally valuable, right, Like that's
what means you're a worthwhile human being, if you're morally

(02:11:23):
if you're creating a net positive benefit to the world
in the way they define it. And so a lot
of these people are. Yes, there are people who are depressed,
and there are people who kill themselves because they come
to the conclusion that they're a net negative person. Right,
like that is a thing at the edge of
all of this shit that's really fucked up. And
that's what this doom circle is about. It's everybody like

(02:11:45):
flipping out and telling each other, I
think you might only be as morally
valuable as half a person, right. Like, people are
saying that, right, like, that's what's going on here, you know.
Like it's not the Synanon thing of like, you're
a, you know, using the f slur a million times
or whatever. But it's very bad.

Speaker 3 (02:12:06):
No, this is this is this is awful.

Speaker 1 (02:12:10):
For like one thing, I don't know, my feeling is
you have an inherent value because you're a person.

Speaker 3 (02:12:16):
Yeah, that's a great place to start, you know, so
leading people to destroy themselves like it's.

Speaker 1 (02:12:24):
It's it's so it's such a bleak way of looking
at things.

Speaker 3 (02:12:29):
It's so crazy too, where were these held? I just,
in my head, I'm like, this is just happening in
like a ballroom at a Radisson.

Speaker 1 (02:12:35):
I think it is, or a convention center, you know,
the different kind of public spaces. I don't know, Like honestly,
if you've been to like an anime convention or a
Magic the Gathering convention somewhere in the Bay, you may
have been in one of the rooms they did these,
and I don't know exactly where they hold this. So
the person A mentioned above, this like person who's like

(02:12:55):
affiliated with the organization, that I think is a recruiter
looking for young people who can be cultivated to pay
for classes. Right, this person, it's very clear to them
that Ziz is at the height of her vulnerability, and
so he tries to take advantage of that. So he
and another person from the organization engage Ziz during a break. Ziz,

(02:13:16):
who's extremely insecure, asks them point blank, what do you
think my net value ultimately will be in life? Right?
And again there's like an element of this that's almost
like rationalist Calvinism, where it's like it's actually decided ahead
of time by your inherent immutable characteristics. You know, if
you are a person who can do good. Quote. I

(02:13:36):
asked person A if they expected me to be net negative.
They said yes. After a moment, they asked me what
I was feeling or something like that. I said something
like dazed and sad. They asked why sad. I said
I might leave the field as a consequence and maybe
something else. I said I needed time to process or think.
And so she like, she goes home after this guy
is saying like, yeah, I think your life's probably net

(02:13:57):
negative value, and sleeps the rest of the day. And
she wakes up the next morning and comes back to
the second day of this thing, and yeah, Ziz goes
back and she tells this person, Okay, here's what I'm
gonna do. I'm going to pick a group of three
people at the event I respect, including you, and if

(02:14:19):
two of them vote that they think I have a
net negative value quote, I'll leave EA and existential risk
and the rationalist community and so on forever. I'd transition
and move, probably to Seattle. I heard it was relatively
nice for trans people, and there do what I could
to be a normy, retool my mind as much as possible,
to be stable, unchanging, a normy, gradually abandon my Facebook

(02:14:40):
account and email, use a name change as a story
for that, and God, that would have been the best thing.

Speaker 3 (02:14:47):
For, that's what I mean, you see like a sliver of hope,
like yeah, oh man.

Speaker 1 (02:14:52):
She sees this as a nightmare, right, this is the
worst case scenario for her, right, because she's, you're not
part of the cause. You know, you have
no involvement in the great quest to save humanity. That's
worse than death almost right.

Speaker 3 (02:15:10):
It's its own kind of hell though, right to think
that you have this enlightenment and then you that you
weren't good enough to.

Speaker 1 (02:15:18):
And that's a lot about how I'd probably just kill myself,
you know, that's the logical thing to do. It's so
fucked up, it's so fucked up. But also if she's
trying to live a normal life as a normy, and
she refers to like being a normy as like
just trying to be nice to people, because again that's useless.

(02:15:39):
So her fear here is that she would be
a causal negative if she does this, right, and also
the robot god that comes about might put her in.

Speaker 3 (02:15:47):
Hell, right, because that's also looming. Yeah, after every for
every decision, right, Yeah.

Speaker 1 (02:15:53):
And the thing here, she expressed, she tells these
guys a story, and it really shows, both in this
community and in her, how little value they actually have
for like human life. I told a story about a
time I had killed four ants in a bathtub where
I wanted to take a shower before going to work.
I'd considered, can I just not take a shower, and
presumed me smelling bad at work would, because of big

(02:16:15):
numbers in the fate of the world and stuff, make
the world worse than the deaths of four basically causally
isolated people. I considered getting paper and a cup and
taking them elsewhere, and I figured there were decent odds
if I did, I'd be late to work and it
would probably make the world worse in the long run.
So again, she considers ants identical to human beings, and
she is also saying it was worth killing four of

(02:16:37):
them because they're causally isolated so that I could get
to work in time, because I'm working for the cause.
It's also, she's in a bad place here.

Speaker 3 (02:16:47):
Yeah.

Speaker 4 (02:16:48):
The crazy thing about her is like the amount
of thinking just to like get in the shower to
go to work, you know, you know what I mean,
like that, ah, it just seems like it makes everything.

Speaker 3 (02:17:02):
Yeah, every every action is so loaded.

Speaker 1 (02:17:06):
Yes, yes, it's so, it's wild to
me, both this like mix of like fucking Jain Buddhist
compassion of like an ant is no less than I,
or an ant is no less than a human being, right,
we are all, these are all lives. And then but
also it's fine for me to kill a bunch of
them to go to work on time, because like they're

(02:17:26):
causally isolated, so they're basically not people. Like it's
so weird. Like, and again, it's getting a lot clearer here
why this lady and her ideas end in a bunch
of people getting shot, yeah, and stabbed. Okay, there's a

(02:17:47):
samurai sword later in the story, my friend.

Speaker 3 (02:17:50):
That's the one thing this has been missing.

Speaker 1 (02:17:52):
Yes, yes, So they continue, these guys to have a
very abusive conversation with this young person and she trusts them.

Speaker 3 (02:18:00):
Enough that a conversation where she asked for the two yeah.

Speaker 1 (02:18:04):
Yeah, and she tells them she's trans right, And this
gives you an idea of like, how kind of predatory
some of the stuff going on in this community is.
They asked what I'd do with a female body. They
were trying to get me to admit what I actually
wanted to do was the first thing in heaven, heaven
being there's this idea, especially amongst like some trans members
of the rationalist community that like all of them basically

(02:18:26):
believe a robot's gonna make heaven, right, and obviously, like
there's a number of the folks who are in this
who are trans are like and in heaven, like you
just kind of get the body you want immediately, right,
so these guys, they were trying to get me to
admit that what I actually wanted to do as the
first thing in heaven was masturbate in a female body.
And they follow this up by sitting really close to her,

(02:18:48):
close enough that she gets uncomfortable. And then a really
really rationalist conversation follows. They asked if I felt trapped.
I may have clarified physically, they may have said sure. Afterward,
I answered no to that question under the likely justified
belief that it was framed that way. They asked why not.
I said, I was pretty sure I could take them
in a fight. They prodded for details why I thought so,

(02:19:11):
and then how I thought a fight between us would go.
I asked, what kind of fight, like a physical, unarmed
fight to the death right now? And why what were
my payouts? This was over the fate of the multiverse.
Triggering actions by other people, i.e. imprisonment or murder,
was not relevant. So they decide, they make this into, again.
These people are all addicted to dumb game theory stuff, right, okay,
so what is this fight? Is this fight over the

(02:19:32):
fate of the multiverse? Are we in an alternate reality
where like no one will come and intervene and there's
no cops, we're the only people in the world or whatever.
So they tell her like, yeah, imagine there's no consequences,
legally or whatever, to what you do, and we're fighting over the
fate of the multiverse. And so she proceeds to give
an extremely elaborate discussion of how she'll gouge out their
eyes and try to destroy their prefrontal lobes and then

(02:19:53):
stomp on their skulls until they die. And it's both
it's like, it's nonsense. It's like how ten year olds
think fights work. It's also based on this game
theory attitude of fighting that they have, which is like,
you have to make this kind of timeless decision
that any fight is, you're just going to murder.

Speaker 3 (02:20:12):
The hardest confrontation, right, Yes, I suppose you have to
be the most violent.

Speaker 1 (02:20:16):
Yes, yes, because that will make other people not want
to attack you, as opposed to like what normal people
understand about like real fights, which is if you have
to do one, if you have to, you like try
to just like hit him in the hit him is
somewhere that's going to shock them, and then run like
a motherfucker, right you get out of like if you

(02:20:36):
have to like ideally just run like a motherfucker. But
if you have to strike somebody, you know, yeah, go
for the eye and then run like a son of
a bitch, you know, Like but there's no run like
a son of a bitch here. Because the point in
part is this like timeless decision to, anyway. This
tells you a lot about the rationalist community. So she
tells these people, she explains in detail how she would

(02:20:58):
murder them if they have to fight, while they're like
sitting next to her super close, having just asked her about masturbation.
Here's their first question quote. They asked if I'd rape
their corpse. Part of me insisted this was not going
as it was supposed to, but I'd decided
inflicting discomfort in order to get reliable information was a
valid tactic. In other words, them trying to make her

(02:21:20):
uncomfortable to get info from her, she decides is fine. Also,
the whole discussion about raping their corpses is like, well,
if you rape, obviously, if you want to have the
most extreme response possible, that would like make other people
unlikely to fuck with you knowing that you'll violate their
corpse if you kill them, is clearly the logic.

Speaker 3 (02:21:36):
And like that.

Speaker 1 (02:21:38):
Okay, sure, I love rational thought.

Speaker 3 (02:21:42):
Oh man, this is crazy, sorry, this is yes, is
so crazy, it's so nuts.

Speaker 1 (02:21:51):
So then they talk about psychopathy. One of these guys
had earlier told Ziz that they thought she was a
psychopath, but he told her that doesn't
mean what it means both to actual like clinicians, because
psychopathy is a diagnosis, or like what normal people mean.
To rationalists, a lot of them think psychopathy is a

(02:22:12):
state you can put yourself into in order to maximize
your performance in certain situations, because, again, there's
some like popular books that are about like the Psychopath's
Way, the Dark Triad, and like, well, you know, these
are the people who lead societies in the toughest times,
and so like you need to optimize and
engage in some of those behaviors if you want to

(02:22:33):
win in these situations. Based on all of this, Ziz
brings up what rationalists call the Gervais Principle. Now, this
started as a tongue in cheek joke describing a rule
of office dynamics based on the TV show The Office.
Yes, it's Ricky Gervais, yes. And the idea is that
in office environments, psychos always rise to the top. This

(02:22:57):
is supposed to be like a negative observation. Like the
person who wrote this initially is like, yeah, this is
how offices work, and it's like why they're bad.

Speaker 3 (02:23:04):
You know.

Speaker 1 (02:23:04):
It's an extension of the Peter principle. And these psychopaths
put, like, dumb and incompetent people in
positions below them for a variety of reasons. It's trying to kind
of work out why and in which ways, like, offices are often dysfunctional. Right,
it's not like, the original Gervais Principle thing is like
not a bad piece of writing or whatever, but Ziz

(02:23:25):
takes something insane out of it. I described how the
Gervais Principle said sociopaths give up empathy, as in a
certain chunk of social software, not literally all hardware
accelerated modeling of people, not necessarily compassion, and with it
happiness, destroying meaning to create power. Meaning, too, I did
not care about. I wanted this world to live on.

(02:23:46):
So she tells them she's come to the conclusion I
need to make myself into a psychopath in order to
have the kind of mental power necessary to do the
things that I want to do. And she largely justifies
this by describing the beliefs of the Sith from Star Wars,
because she thinks she needs to remake herself as a
psychopathic evil warrior monk in order to save all of creation.

Speaker 3 (02:24:12):
Yeah. No, of course, yep.

Speaker 1 (02:24:14):
So this is her hitting her final form. And true
to form, these guys are like, they don't say it's
a good idea, but they're like, okay, yeah, you
know, that's not the worst thing you could do. Sure,
you know, like I think the Sith stuff is kind of weird,
but making yourself a psychopath makes sense. Sure, yeah, of
course I know a lot of guys who did that.
That's literally what they say, right, And then they say

(02:24:36):
that also, I don't even think they really mean
that, because the next thing they say, this
guy person A is like, look, the best way
to turn yourself from a net negative to a net
positive value, I really believe you could do it. But
to do it, you need to come to ten more
of these seminars and keep taking classes here right right right,
here's a quote from them, or from Ziz. Conditional

(02:24:59):
on me going to a long course of circling like
these two organizations offered, particularly a ten weekend one, then
I probably would not be net negative. So things are
going good. This is, you know, ah, yeah, great?

Speaker 3 (02:25:21):
How much do these weekends cost?

Speaker 1 (02:25:23):
I don't actually know, I don't fully know
with this. It's possible some of these are, like some
of the events are free, like, but the classes cost money,
or but it's also a lot of it's like there's
donations expected or by doing this and being a member,
it's expected you're going to tithe basically that's like your income,

(02:25:44):
right more than I.

Speaker 2 (02:25:47):
I don't know the format. Is she not going to
be like super suspicious that people are like, you know,
faking it or like going over the top.

Speaker 1 (02:25:55):
She, okay, she is. She gets actually really uncomfortable.
They have an exercise where they're basically doing you know,
they're playing with love bombing right where everyone's like hugging
and telling each other they love each other, and she's like,
I don't really believe it. I just met these people.
So she has started to, and she is going
to break away from these organizations pretty quickly. But this
conversation she has with these guys is a critical part

(02:26:19):
of like why she finally has this fracture, because number one,
this dude keeps telling her you have a net negative
value to the universe, right, and so she's obsessed with
like, how do I? And she comes to the conclusion,
my best way of being net positive is to make
myself into a sociopath and a Sith Lord to save

(02:26:43):
the animals.

Speaker 3 (02:26:44):
Of course, it feels like the same thinking though, as
like the robot's gonna make heaven. It seems to always come
back to this idea of like, I think we just
gotta be evil.

Speaker 2 (02:26:55):
It's, yes, oh yes, well I guess the only logical
conclusion is doom.

Speaker 3 (02:27:03):
Yep. Yeah, yeah, it's like it feels like it's a
it's a theme here.

Speaker 1 (02:27:10):
M hm, yep. Anyway you want to plug anything at
the end here.

Speaker 3 (02:27:15):
Ah, I have a comedy special you can purchase on Patreon.
It's called Birth of a Nation with a G. You
can get that at Patreon dot combat Sash David.

Speaker 1 (02:27:27):
Excellent, excellent. All right, folks, well that is the end
of the episode. David, thank you so much for coming
on to our inaugural episode and listening to some of
the weirdest shit we've ever talked about on this show.

Speaker 3 (02:27:44):
Yeah, this is I don't really I'm gonna be thinking
about this for weeks.

Speaker 9 (02:27:48):
I mean yeah, yeah, because your co host likes a
curbent kbon for the Elders of Zion episodes, Yeah.

Speaker 1 (02:27:59):
Yeah, okay, I wanted to. I was initially going to
kind of just focus on, all of this would have
been like half a page or so, you know, just
kind of summing up, here's the gist of what this person
believes, and then let's get to the actual cult stuff when,
like you know, Ziz starts bringing in followers and the
crimes start happening. But that Rolling Stone, or that Wired article

(02:28:19):
really covers all that very well. And that's the best piece.
Most of the journalism I've read on these guys is
not very well written. It's not very good. It does
not really explain what they are, why they
do it. So I decided, and, I'm not knocking it, the Wired
piece is great. I know the Wired guy knows all
of the stuff that I brought up here. He just
it's an article. You have editors. He left out what

(02:28:42):
he thought he needed to leave out. I don't have
that problem. And I wanted to really, really deeply trace
exactly how this lady's mind develops, and
how that intersects with rationalism, because it's interesting and kind
of important and bad. Yeah, okay, So anyway, thanks for

(02:29:06):
hanging out with me. All right, that's it, everybody. Goodbye.

Speaker 2 (02:29:15):
Behind the Bastards is a production of cool Zone Media.
For more from cool Zone Media, visit our website Coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts. Behind the
Bastards is now available on YouTube, new episodes every Wednesday
and Friday.

Speaker 9 (02:29:33):
Subscribe to our channel YouTube dot com slash at Behind
the Bastards
