Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Also media.
Speaker 2 (00:04):
Oh my goodness, welcome back to Behind the Bastards, a
podcast that, well, it'll be interesting to see how the
audience reacts to this one, talking about some of the
most obscure, frustrating Internet arcana that has ever existed and that
recently led to the deaths of like six people. My
(00:27):
guest today, as in the last episode, is David Gborie. David, hey,
how you doing man? I'm doing great.
Speaker 3 (00:34):
I really can't wait to see where this goes.
Speaker 2 (00:39):
Yeah, I feel.
Speaker 3 (00:41):
Like anything could happen at this point.
Speaker 2 (00:44):
It is going to. It is going to... a lot
of frustrating things are going to happen. So we'd kind
of left off by setting up the rationalists: where they
came from, some of the different strains of thought and
belief that come out of their weird thought experiments. And
(01:07):
now we are talking about a person who falls into
this movement fairly early on and is going to be
the leader of this quote unquote group, the Zizians, who
were responsible for these murders that just happened. Ziz LaSota
was born in nineteen ninety or nineteen ninety one. I
don't have an exact birth date. She's known to be
thirty four years old as of twenty twenty five, so
(01:28):
somewhere in that field. She was born in Fairbanks, Alaska,
and grew up there as her father worked for the
University of Alaska as an AI researcher. We know very
little of the specifics of her childhood or upbringing, but
in more than one hundred thousand words of blog posts,
she did make some references to her early years. She
claims to have been talented in engineering and computer science
(01:51):
from a young age, and there's no real reason to
doubt this. The best single article on all of this
is a piece in Wired by Evan Ratliff. He found
a twenty fourteen blog post by Ziz
where she wrote, my friends and family, even if they
think I'm weird, don't really seem to be bothered by
the fact that I'm weird. But one thing I can
tell you is that I used to deemphasize my weirdness
around them, and then I stopped and found that being
(02:14):
unapologetically weird is a lot more fun.
Speaker 3 (02:17):
Now.
Speaker 2 (02:17):
It's important, you know: Ziz is not the name this
person was born under. She's a trans woman, and so
I'm, like, using the name that she adopts later, but
she has not transitioned at this point. This
is when she's a kid, right, and she's not going
to transition until fairly late in the story, after coming
to San Francisco. So you just keep that in mind
as this is going on here. Hey, everyone, Robert here,
(02:40):
just a little additional context, as best as I think
anyone can tell, if you're curious about where the name
Ziz came from: there's another piece of serially released online
fiction that's not, like, a rationalist story, but it's very
popular with rationalists. It's called Worm. Ziz is a character
in that, effectively an angel-like being who
(03:04):
can like manipulate the future, usually in order to do
very bad things. Anyway, that's where the name comes from.
So smart kid, really good with computers, kind of weird
and embraces being unapologetically weird at a certain point in
her childhood. Hey, everybody, Robert here. Did not have this
(03:26):
piece of information when I first put the episode together,
but I came across a quote in an article from
the Boston Globe that provides additional context on Ziz's childhood. Quote:
In middle school, the teen was among a group of
students who managed to infiltrate the school district's payroll system
and award huge paychecks to teachers they admired while slashing
(03:48):
the salaries of those they despised. According to one teacher, Ziz,
the teacher said, struggled to regulate strong emotions, often erupting
in tantrums. I wish I'd had this when David
was on, but it definitely sets up some of the things
that are coming. She goes to the University of Alaska
for her undergraduate degree in computer engineering in February of
(04:10):
two thousand and nine, which is when Eliezer Yudkowsky started
LessWrong. Ziz starts kind of getting drawn into some
of the people who are around this growing subculture, right,
and she's drawn in initially by veganism. So Ziz becomes
a vegan at a fairly young age. Her family are
(04:31):
not vegans, and she's obsessed with the concept of animal sentience,
right, the fact that animals are thinking and
feeling beings just like human beings. And a lot of
this is based in her interest in a kind of
foundational rationalist and EA figure, a guy named Brian
(04:55):
Tomasik. Brian is a writer and a software engineer as
well as an animal rights activist, and as a thinker,
He's what you'd call a long termist, right, which is,
you know, pretty tied to the EA guys. These are
all the same people using kind of different words to
describe the aspects of what they believe. His organization is
(05:15):
the Center on Long Term Risk, which is a think
tank he establishes that's at the ground floor of these
effective altruism discussions, and the goal for the Center on
Long-Term Risk is to find ways to reduce suffering
on a long timeline. Tomasik is obsessed with the concept
of suffering, and specifically obsessed with treating suffering as a
(05:37):
mathematical concept. So when I say to you, I want
to end suffering, you probably think, like, oh, you want
to like, you know, go help people who don't have
access to clean water, or like who have like worms
and stuff that they're dealing with, have access to medicine.
That's what normal people think of, right, you know, maybe
try to improve access to medical care that sort of stuff.
(06:00):
Tomasik thinks of suffering as like a mass, like
an aggregate mass that he wants to reduce in the
long term through actions right. It's a numbers game to him,
in other words, and his idea of ultimate good is
to reduce and end the suffering of sentient life. Critical
(06:20):
to his belief system and the one that Ziz starts
to develop, is the growing understanding that sentience is much
more common than many people had previously assumed. Part of
this comes from long standing debates with their origins in
Christian doctrine as to whether or not animals have souls
or are basically machines made of meat, right, that don't feel anything.
(06:41):
There's still a lot of Christian evangelicals who feel that
way today about like at least the animals we eat,
you know, like, well, they don't really think, so it's fine.
God gave them to us. We can do whatever we
want to them.
Speaker 3 (06:52):
Here we eat and.
Speaker 2 (06:54):
To be fair, this is an extremely common way
that people in Japan feel about, like, fish, even whales
and dolphins, which aren't fish but are much more
intelligent ocean-going creatures; the attitude is, like, they're fish,
they don't think, you can do whatever to them,
you know. This is a reason for a lot of
like the really fucked up stuff with like whaling fleets
in that part of the world. So this is a
(07:16):
thing all over the planet. People are very good at
deciding certain things we want to eat are machines
that don't feel anything, you know; it's just much more
comfortable that way. Now, obviously, if you go back
to, like, the pagans, the pagans would have been like, what
do you mean animals don't think or have souls?
Animals think, you know. They're like,
(07:38):
you're telling me my horse that I love
doesn't think? You know, that's nonsense. But it's this thing
that in early modernity especially gets more common. And
this is also when we start to have debates
about what is sentience and what is thinking, and
a lot of them are centered around trying to answer
whether animals are sentient. And the initial definition of sentience
(08:03):
that most of these people are using is can it reason?
Can it speak? If we can't prove that like a
dog or a cow can reason, and if it can't
speak to us, right, then it's not sentient. That's how
a lot of people feel. It's an English philosopher named
Jeremy Bentham who first argues, I think, that what matters
(08:24):
isn't can it reason or can it speak? But can
it suffer? Because a machine can't suffer. If these are
machines made of meat, they can't suffer; if they can suffer,
they're not machines made of meat, right. And this is the
kind of thing where how we define sentience is a moving target.
Like you can find different definitions of it. But the
(08:46):
last couple of decades in particular of actually very good
data have made it clear, I think inarguably, that basically
every living thing on this planet has a degree of
what you would call sentience, if you describe being
sentient the way it generally is described now, which is that a
creature has the capacity for subjective experience with a positive
(09:08):
or negative valence, i.e., it can feel pain or pleasure,
and also it can feel it as an individual, right.
You know, sometimes people use the term
affective sentience to refer to this, to differentiate it from
like being able to reason and make moral decisions. You know,
(09:28):
for example, ants I don't think can make moral decisions,
you know, in any way that we would recognize that
they certainly don't think about stuff that way. But in twenty
twenty five, research published by Doctor Volker Nehring found evidence
that ants are capable of remembering for long periods of
time violent encounters they have with other individual ants and
(09:49):
holding grudges against those ants. Right, just like us. They're
just like us. And there's strong evidence that ants do
feel pain. Right, we're, we're now pretty sure of that.
And in fact, again this is an argument that a
number of researchers in this space will make: something
like this kind of sentience,
the ability to have subjective positive and negative experiences, is probably
(10:10):
universal to living things, or very close to it.
Speaker 3 (10:13):
Right.
Speaker 2 (10:15):
It's an interesting body of research, and it's
fairly solid at this point. And again, I say this
as somebody who hunts and raises livestock: I don't
think there's any solid reason to disagree with this.
So you can see there's a basis to a lot
of what Tomasik is saying, right, which is that
if what matters is reducing the overall amount
(10:38):
of suffering in the world, and if you're looking at
suffering as a mass, if you're just adding up all
of the bad things experienced by all of the living things,
animal suffering is a lot of the suffering. So if
our goal is to reduce suffering, animal welfare is hugely
important, right.
Speaker 3 (10:53):
It's a great place to start.
Speaker 2 (10:54):
Great, fair enough, you know. Maybe a little bit of
a weird way to phrase it, but fine.
Speaker 1 (10:58):
Yeah.
Speaker 2 (11:00):
So here's the problem, though. Tomasik, like all
these guys, spends too much time thinking. None of them can
be like, hey, had a good thought, we're done, setting
that thought down, moving on. So he keeps thinking about
shit like this, and it leads him to some very
irrational takes. For example, in twenty fourteen, Tomasik starts
(11:23):
arguing that it might be immoral to kill characters
in video games, and I'm going to quote from an
article in Vox. He argues that while NPCs do not
have anywhere near the mental complexity of animals, the difference
is one of degree rather than kind, and we should
care at least a tiny amount about their suffering, especially
if they grow more complex. And his argument is that like, yeah,
(11:46):
mostly it doesn't matter, like, individually killing a Goomba or
a bot or a guy in GTA V, but like,
because they're getting more complicated and able to try to
avoid injury and stuff, there's evidence that there's some sort
of suffering there, and thus the sheer mass of
NPCs being killed might be, like, enough that it's
ethically relevant to consider. And I think that's silly. Yeah,
(12:09):
I think that's ridiculous. I'm sorry, man, No, I'm sorry.
Speaker 3 (12:18):
But that's a lot of the fun of the game.
Kill you.
Speaker 2 (12:22):
If you're telling me, like we need to be deeply
concerned about the welfare of like cows that we lock
into factory farms, you got me, absolutely for sure. If
you're telling me I should feel bad about running down
a bunch of cops in Grand Theft Auto.
Speaker 3 (12:38):
It's also one of those things where it's like you
got to think locally. Man.
Speaker 2 (12:41):
Yeah, there's, like, this is, I mean... and he does say, like, I
don't consider this a main problem. But the fact
that you think this is a problem at all means
that you believe silly things about consciousness. Yeah, anyway, so
this is, I think, the fact that he
(13:02):
leads himself here is kind of evidence of the sort
of logical fractures that are very common in this community.
But this is the guy that young Ziz is drawn to.
She loves this dude, right. He is kind of her
first intellectual heartthrob, and she writes, quote, my primary
concern upon learning about the singularity was how do I
make this benefit all sentient life, not just humans? So
(13:24):
she gets interested in this idea of the singularity. It's
inevitable that an AI god is going to arise, and
she gets into the, you know, the rationalist thing of
we have to make sure that this is a nice
AI rather than a mean one. But she has this
other thing to it, which is this AI has to
care as much as I do about animal life, right,
(13:46):
otherwise we're not really making the world better, you know.
Speaker 3 (13:50):
Now.
Speaker 2 (13:51):
Tomasik advises her to check out LessWrong, which is
how Ziz starts reading Eliezer Yudkowsky's work. From there, in
twenty twelve, she starts reading up on effective altruism and existential risk,
which is a term that means the risk that a
superintelligent AI will kill us all. She starts believing
in all of this kind of stuff, and her
(14:14):
particular belief is that the Singularity, when it happens,
is going to occur in a flash, kind of like
the Rapture, and almost immediately lead to the creation of
either a hell or a heaven, right. And this will
be done by what they call the Singleton; that's the
term they use for this inevitable AI, the AI god
that's going to come about, right. And so
(14:37):
her obsession is that she has to find a way
to make this Singleton a nice AI that cares about
animals as much as it cares about people, right. That's
her initial big motivation. So she starts emailing Tomasik with
her concerns because she's worried that the other rationalists aren't vegans, right,
and they don't feel like animal welfare is like the
top priority for making sure this AI is good, and
(15:00):
she really wants to convert this whole community to veganism
in order to ensure that the Singleton is as focused
on insect and animal welfare as on human welfare. And Tomasik
does care about animal rights, but he disagrees with
her, because he's like, no, what matters is maximizing the
reduction of suffering, and, like, a good Singleton will solve
climate change and shit, which will be better for the animals.
(15:22):
And if we focus on trying to convert everybody in
the rationalist space to veganism, it's going to stop
us from accomplishing these bigger goals, right. This is shattering
to Ziz. She decides that Tomasik
doesn't care about good things, and she decides that she's
basically alone in her values. And so her first move...
Speaker 3 (15:43):
Time to start a smaller subcult, shit like that.
Speaker 2 (15:46):
Sounds like we're on our way. She first considers embracing
what she calls negative utilitarianism. And this is an example
of the fact that from the jump, this is a
young woman who's not well, right, because once her hero
is like, I don't know if veganism is necessarily
(16:06):
the priority we have to embrace right now, her immediate
move is to jump to, well, maybe what I should do
is optimize myself to cause as much harm to humanity as possible
and, quote, destroy the world, to prevent it from becoming
hell for mostly everyone. So that's a jump.
Speaker 3 (16:24):
You know.
Speaker 2 (16:25):
That's not somebody who's doing well, who you think is healthy.
Speaker 3 (16:29):
No, she's uh, she's having a tough time out. Uh huh.
Speaker 2 (16:34):
So Ziz does ultimately decide she should still work to
bring about a nice AI, even though that necessitates working
with people she describes as flesh-eating monsters who had
created hell on Earth for far more people than those
they had helped. That's everybody who eats meat, okay. Yes, yes,
a large group. And it's ironic, because, like, if
(16:55):
you... she really wants to be in the
tech industry, she's trying to get in, all these
people are in the tech industry, and that's a pretty good
description of a lot of the tech industry. They are in
fact monsters who have created hell on Earth for more
people than they've helped. But she means that for, like,
I don't know, your aunt who has a hamburger once
a week. And look, again, factory farming is evil. I just
(17:16):
don't think that's how morality works. I think you're going
a little far.
Speaker 3 (17:23):
No, she's making big jumps.
Speaker 2 (17:25):
Yeah, bold thinker, bold thinker. Yeah. Now, what
you see here with this logic is that Ziz
has a massive case of main character syndrome.
Speaker 3 (17:36):
Right.
Speaker 2 (17:37):
All of this is based in her attitude that: I
have to save the universe by helping to create,
or figuring out how to create, an AI that can
end the eternal holocaust of all animal life and also
save humanity.
Speaker 3 (17:53):
Right to do?
Speaker 2 (17:55):
That's me and this is this is a thing.
Speaker 3 (17:59):
Again.
Speaker 2 (18:00):
All of this comes out of both subcultural aspects and
aspects of American culture. One major problem that we have
in this society is Hollywood has trained us all on
a diet of movies with main characters that are the
special boy or the special girl with the special powers
who save the day, right, and real life doesn't work
(18:23):
that way very often, right. The Nazis, there was no
special boy who stopped the Nazis. There were a lot
of farm boys who were just like, I guess I'll
go run at a machine gun nest until this is
done, exactly. There were a lot of sixteen-year-old
Russians who were like, guess I'm gonna walk into a bullet,
you know, like that's that's how evil gets fought. Usually, unfortunately,
(18:46):
they were all reluctant like that, yeah, yeah. Or a shitload of
guys in a lab figuring out how to make corn
that has higher yields so people don't starve, right. These
are, these are really how, like, huge
world problems get solved.
Speaker 3 (19:02):
People who have been touched, you know.
Speaker 2 (19:04):
Yeah, it's not people who have been touched, and it's
certainly not people who have entirely based their understanding of
the world on quotes from Star Wars and Harry Potter.
So some of this comes from just like, this is
a normal, deranged way of thinking that happens to a
(19:24):
lot of people in Western culture. I think a lot
of this explains why you get very comfortable
middle-class people joining these very aggressive fascist movements in
the West. Like in Germany, it's mostly
middle-class and upper-middle-class people; in the US,
it's especially these street-fighting, you know, Proud Boy types.
It's not because they're, like, suffering and desperate.
(19:48):
They're not starving in the streets. It's because they're bored
and they want to feel like they're fighting an epic
war against evil.
Speaker 3 (19:56):
Yeah, I mean, you want to fill your time with importance,
right right, regardless of what you do, you want to
and you want to feel like you have a cause
worthy of fighting for. So in that, I guess I
see how you got here.
Speaker 2 (20:07):
Yeah. So I mean, I think there's
a piece of this that originally just comes from
something in our culture. But there's also a major,
a major chunk of this that gets supercharged by the kind
of thinking that's common in EA and rationalist spaces, because
rationalists and effective altruists are not ever thinking, like, hey,
how do we as a species fix these major problems?
Speaker 3 (20:28):
Right?
Speaker 2 (20:29):
They're thinking, how do I make myself better, optimize myself
to be incredible, and how do I like fix the
major problems of the world alongside my mentally superpowered friends.
Speaker 3 (20:45):
Right.
Speaker 2 (20:46):
These are very individual-focused philosophies and attitudes, right. And
so they do lend themselves to people who think that, like,
we are heroes who are uniquely empowered to save the
world. Ziz writes, quote, I did not trust most humans' indifference
to build a net positive cosmos, even in the absence
of a technological convenience to prey on animals. So, like,
(21:09):
I'm the only one who has the mental capability to
actually create the net positive cosmos that needs to come
into being. All of her discussion is in, like,
terms of I'm saving the universe, right. And a lot
of that does come out of the way many of
these people talk on the Internet about the stakes of AI,
and just like the importance of rationality. Again, this is
(21:31):
something Scientology does. L. Ron Hubbard always couched getting people
on Dianetics in terms of we are going to save
the world and end war, right. Like, this is, you know,
very normal cult stuff. She starts reading, around
this time when she's in college, Harry Potter and the
Methods of Rationality. This helps to solidify her feelings of
(21:51):
her own centrality as a hero figure. In a blog
post where she lays out her intellectual journey, she quotes
a line from that fanfic of Yudkowsky's that is,
it's essentially about what Yudkowsky calls the hero contract, right,
or sorry, it's essentially about this concept called the hero contract, right.
And this is a psychological concept
(22:17):
among academics, and it's about
analyzing how we should look
at the people who societies declare heroes and the communities
that declare them heroes, and see them as in a dialogue,
right. As in, when a country
(22:37):
decides this guy is a hero, he is, through his
actions, kind of conversing with them, and they are kind
of telling him what they expect from him.
Speaker 3 (22:47):
Right.
Speaker 2 (22:47):
But Yudkowsky wrestles with this concept, right, and he comes
to some very weird conclusions about it. In one of
the worst articles that I've ever read, he frames it
as hero licensing, to refer to the fact that people
get angry at you
if you're trying to do something and they don't think
you have a hero license to do it. In other words,
(23:09):
if you're trying to do something that they don't
think you're qualified to do, he'll describe that as them
not granting you, like, a hero license. And he, like,
writes this annoying article that's like a conversation between him
and a person who's supposed to embody the community of
people who don't think he should write Harry Potter fan fiction.
It's all very silly, and again, it's always ridiculous. But
(23:32):
Ziz is very interested in the idea of the hero contract, right,
but she comes up with her own spin on it,
which she calls the true hero contract, right and instead
of again, the academic term is the hero contract means
societies and communities pick heroes, and those heroes in the
community that they're in are in a constant dialogue with
(23:54):
each other about what is heroic and what is expected, right,
what the hero needs from the commune unity, and vice versa.
You know, that's all that that's saying, Ziz says, No, no, nah,
that's bullshit. The real hero contract is quote poor free
energy at my direction, and it will go into the
optimization for good.
Speaker 3 (24:16):
In other words, classic.
Speaker 2 (24:17):
Ziz. It's not a dialogue. If you're the hero, the
community has to give you their energy and time and power,
and you will use it to optimize them for good
because they don't know how to do it themselves, because
they're not really able to think.
Speaker 3 (24:34):
You know, they're not the.
Speaker 2 (24:35):
Hero, because they're not the hero. Right you are? You are?
Speaker 3 (24:38):
You are the all powerful hero.
Speaker 2 (24:41):
Now this is a fancy way of describing how cult
leaders think, right. Yeah, everyone exists to pour energy into
and I'll use it to do what's right. You know.
So this is where her mind is in twenty twelve,
but again she's just a student posting on the Internet
and chatting with other members of the subculture. At some point that year,
(25:01):
she starts donating money to MIRI, the Machine Intelligence
Research Institute, which is a nonprofit devoted to studying how
to create friendly AI. Yudkowsky founded MIRI in
two thousand, right, so this is his, like, nonprofit think tank.
In twenty thirteen, she finishes an internship at NASA. So
again she is a very smart young woman.
Speaker 3 (25:22):
Right.
Speaker 2 (25:22):
She gets an internship at NASA and she builds a
tool for space weather analysis. So this is a person with
a lot of potential, even as all of the
stuff she's writing is, like, the dumbest shit. But again, intelligence
isn't an absolute. People can be brilliant at coding and
have terrible ideas about everything else.
Speaker 3 (25:39):
Yes, exactly. Yeah, I wonder if she's telling... you think
she's telling people at work?
Speaker 2 (25:47):
I don't. I don't think at this point she is,
because she's super insular. Right, She's very uncomfortable talking to people. Right,
She's going to kind of break out of her shell
once she gets to San Francisco. I don't know. She
may have talked to some of them about this stuff,
but I really don't think she is at this point.
I don't think she's comfortable enough doing that. Yeah. So
(26:10):
she also does an internship at the software giant Oracle.
So at this point you've got this young lady who's
got a lot of potential, you.
Speaker 3 (26:16):
Know, a real career as well.
Speaker 2 (26:17):
Yeah, the start of a very real career. That's a
great starting resume for like a twenty two year old.
Now at this point, she's torn should she go get
a graduate degree, right, or should she jump right into
the tech industry, you know, and she worries that like,
if she waits to get a graduate degree, this will
(26:37):
delay her making a positive impact on the existential risk
caused by AI, and it'll be too late. The singularity
will happen already, you know. At this point, she's still
a big fawning fan of Eliza Yedkowski and the highest
ranking woman at Gydkowski's organization, Mary is a lady named
Susan Salomon. Susan gives a public invitation to the online
(26:59):
community to pitch ideas for the best way to improve
the ultimate quality of the singleton that these people believe
is inevitable. In other words, hey, give us your ideas
for how to make the inevitable AI god nice right.
Here's what Ziz writes about her response to that. I
asked her whether I should try an alter course and
do research or continue a fork of my pre existing
(27:21):
life plan earned to give as a computer engineer, but
retrain and try to do research directly instead. At the time,
I was planning to go to grad school and I
had an irrational attachment to the idea. She sort of
compromised and said, I should go to grad school, fight
a startup co founder, drop out, and earn to give
via startups instead. First off, bad advices and bad advice
(27:45):
being stobs, being Steve Jobs, worked for Steve Jobs well,
and Bill Gates. I guess, to an extent, it doesn't
work for most people.
Speaker 3 (27:54):
No, no, no, it seems like the general tech disruptor idea,
you know.
Speaker 2 (28:00):
Yeah, and most of these people aren't very original thinkers.
Like, yeah, she's just saying, like, yeah, go be a
Steve Jobs. So Ziz does go to grad school, and
somewhere around that time, in twenty fourteen, she attends a
lecture by Eliezer Yudkowsky on the subject of Inadequate Equilibria,
which is the title of a book that Yudkowsky
(28:20):
wrote around that time, and the book is about where
and how civilizations get stuck. One reviewer, Bryan Caplan, who,
despite being a professor of economics, must have a brain
as smooth as a pearl, wrote this about it: every
society is screwed up. Eliezer Yudkowsky is one of the
few thinkers on earth who are trying, at the most
general level, to understand why. And this is like, wow,
(28:43):
that's it, buddy. Please study the humanities a little bit,
a little bit, a little bit, I.
Speaker 3 (28:50):
Mean, fuck man.
Speaker 2 (28:51):
The first, and most, like... one of the first influential
works of modern historical scholarship is The Decline and
Fall of the Roman Empire. It's a whole book about
why a society fell apart. And, like, motherfucker, more recently,
Mike Davis existed. Like, Jesus Christ.
Speaker 3 (29:12):
I believe this guy continues to get traction.
Speaker 2 (29:15):
Nobody else is thinking about why society is screwed up
but Eliezer Yudkowsky, this.
Speaker 3 (29:19):
Man, this man, this guy, this marry.
Speaker 2 (29:25):
Yeah, no, I was trying to find another. I read
through that Martin Luther King Jr. speech. Everything's good? Oh,
oh my god, oh my god. Like, motherfucker. So many
people do nothing but try to write about why our
society is sick.
Speaker 1 (29:43):
They did.
Speaker 3 (29:46):
On all levels, by the way, on.
Speaker 2 (29:49):
Everybody's thinking about this. This is such a common subject
of scholarship and discussion.
Speaker 3 (30:00):
What everyone's talking always.
Speaker 2 (30:02):
It would be like if I got really into
like reading medical textbooks and was like, you know what,
nobody's ever tried to figure out how to transplant a heart.
I'm going to write a book about how that might work.
I think I got it, you know. People. So yeah, speaking
(30:28):
of... do these fucking people have sex with... uh, nope, well.
Speaker 3 (30:35):
That's not something.
Speaker 2 (30:36):
No, I don't know. I don't know. Uh... fuck it,
listen to ads. We're back. So Ziz is at this
speech where Yudkowsky is shilling his book, and most
of what he seems to be talking about in this
speech, about this book about why societies fall apart, is
(30:58):
how to make a tech startup. She says, quote, he
gave a recipe for finding startup ideas. He said Paul
Graham's idea, only filter on people, ignore startup ideas, was
partial epistemic learned helplessness. That means Paul Graham is saying,
focus on finding good people that you'd start a company with.
Having an idea for a company doesn't matter. Yudkowsky says,
(31:19):
of course startup ideas mattered. You needed a good startup idea,
so look for a way the world is broken,
then compare against a checklist of things you couldn't fix,
you know, right. Like, that's what this speech is
largely about, him being like, here's how to find
startup ideas. So she starts thinking. She starts thinking as
hard as she can, and you know, being a person
(31:41):
who is very much full of the tech industry brain rot
at this point, she comes up with a brilliant idea.
It's a genius idea. Oh you're gonna you're gonna love
this idea, David. Uber for prostitutes.
Speaker 3 (31:58):
You're fucking with me?
Speaker 2 (32:00):
No, No, it's.
Speaker 3 (32:04):
That's where she landed.
Speaker 2 (32:06):
She lands on the idea of, oh wow, sex work
is illegal, but porn isn't. So if we start an
Uber whereby a team with a camera and a porn
star come to your house and you fuck them and
record it, that's a legal loophole we just found.
(32:26):
It's not just the Bang Bus; she makes the Bang Bus plus
the gig economy. It is really like a Don Draper moment:
what about Uber, but a pimp? It's so funny, these people.
Speaker 3 (32:51):
You gotta love it, you got wow, it's wow. Wow,
what a place to end up. I would love to
see the other drafts.
Speaker 2 (32:58):
Yeah, yeah, first because god, yeah, man, that's that's that
is the good stuff, isn't it.
Speaker 3 (33:10):
Yeah?
Speaker 2 (33:11):
Wow, wow. We've got special minds at work here. Oh man.
Speaker 3 (33:18):
Ultimately it all I have to make smart.
Speaker 2 (33:21):
I have to make pimp uber.
Speaker 3 (33:25):
That's so wild.
Speaker 2 (33:27):
Yes, yes, the Uber of pimping. What an idea. Now,
so Ziz devotes her brief time in grad school, while she's
working on pimping Uber, to trying to find a partner.
Speaker 3 (33:38):
Right.
Speaker 2 (33:38):
She wants to have a startup partner, someone who will
embark on this journey with her.
Speaker 3 (33:42):
I don't know if that's an investor you need to.
Speaker 2 (33:48):
It doesn't work out. She drops out of grad school
because quote, I did not find someone who felt like
good startup co-founder material. This may be because she's
very bad at talking to people, and also probably
scares people off because the things that she talks about
are deeply off-putting.
Speaker 3 (34:04):
Yeah, I was gonna say. It's also a terrible idea.
Speaker 2 (34:08):
And at this point she hasn't done anything bad, so
I feel bad for her. This is a person who's very lonely,
who is very confused. She has by this point realized
that she's trans but not transitioned. This
is, like, a tough place to be, right?
Those are hard times.
Speaker 3 (34:23):
That's that's hard.
Speaker 2 (34:24):
And nothing about her inherent personality is going
to make this easier for her. Right, who she is
makes all of this much harder. She also makes
some comments about dropping out because her thesis advisor
was abusive. I don't fully know what this means, and
here's why: Ziz does encounter some behavior I will describe
(34:48):
later that is abusive from other people, but she also regularly
defines abuse as people disagreeing with her about the
only thing that matters being creating an AI god to
protect the animals. So I don't know if her thesis
advisor was abusive or was just like, maybe drop the
AI god idea for a second. Yeah, yeah, but maybe
(35:09):
maybe focus on, like, finding a job, you know, making
some friends, go on a couple
of dates, something like that. Maybe, like, maybe put
God on the back burner here for a second. Whatever
happened here, she decides it's time to move to the Bay.
This is like twenty sixteen. She's going to find a
(35:30):
big tech job. She's going to make that big tech money.
While she figures out a startup idea and finds a
co-founder who will let her make enough money to
change and save the world, well, the whole universe, her
first plan is to give the money to MIRI, Yudkowsky's
organization, so it can continue its important, important work, imagining
a nice AI. She's got enough family money
(35:53):
that her parents are able to pay for, like, I
think, like, six months or more of rent in the Bay,
which is not nothing; it's not a cheap place to live.
I don't know exactly how long her parents are paying,
but that implies a degree of financial comfort, right.
So she gets hired by a startup very quickly, because,
again, very gifted.
Speaker 3 (36:16):
Right.
Speaker 2 (36:16):
Yes, yes it's some sort of gaming company. But at
this point she's made another change in her ethics system
based on Eliezer Yudkowsky's writings. One of Yudkowsky's writings
talks about the difference between consequentialism
and virtue ethics, right. Consequentialists are people who focus entirely
(36:38):
on what the outcome of my actions will be, and
it kind of doesn't matter what I'm doing, or even
if it's sometimes a little fucked up, if the end
result is good. Virtue ethics people have a code
and stick to it, right. And actually, and I kind
of am surprised that he came to this, Yudkowsky's conclusion
is that, while logically you're more likely to succeed,
(37:01):
like on paper, you're more likely to succeed as a consequentialist.
His opinion is that virtue ethics has the best outcome.
People tend to do well when they stick to a
code and they try to, rather than, like, anything goes
as long as I succeed, right. And I think that's
actually a pretty decent way to live your life.
Speaker 3 (37:19):
It's a pretty reasonable conclusion for him.
Speaker 2 (37:22):
It's a reasonable conclusion for him, so I don't blame
him on this part. But here's the problem. Ziz is
trying to break into and succeed in the tech industry,
and you can't... you are very unlikely to succeed at
a high level in the tech industry if you are
unwilling to do things and have things done to you
(37:43):
that are unethical and fucked up. I'm not saying this
is good. This is the reality of the entertainment industry too, right.
I experienced it when I started, and I started with an
unpaid internship. Unpaid internships are bad, right, It's bad that
those exist. They inherently favor people who have money and
people who have family connections. You know, I had like
(38:03):
a small savings account from my job in special ed,
but that was the standard. Like, there were
a lot of unpaid internships. It got me my foot
in the door. It worked for me. I also worked
a lot of overtime that I didn't get paid for.
I did a lot of shit that wasn't a part
of my job to impress my bosses to make myself
indispensable so that they would decide, like, we have to
(38:25):
keep this guy on and pay him. And it worked
for me. And I just wanted to add because this
was not in the original thing. A big part of
why it worked for me is that I'm talking about
a few different companies here, but particularly at Cracked where
I had the internship, like, my bosses, you know, made
a choice to mentor me and, you know,
to work overtime on their own behalf to
(38:47):
like make sure I got a paying job, which is
a big part of like the luck that I encountered
that a lot of people don't. So that's another major
part of why things worked out for me is
that I just got incredibly lucky with the people I
was working for and with. That's bad. It's not good
that things work that way, right. It's not healthy.
Speaker 3 (39:09):
It's not set up for you either. Like, you know, you kind
of defied the odds. It's, like you said, the
rich people who get the job. Exactly, it's not
Speaker 2 (39:17):
even. Yes. That said, if I am giving someone,
if someone wants to know what is the most likely path to succeeding,
you know, I've just got this job working, you know,
at this production company or in music, I would,
I would say, well, your best odds are to, like,
make yourself completely indispensable and become obsessively devoted to that task.
Speaker 3 (39:41):
Right.
Speaker 2 (39:41):
Uh, that's it. I don't tend to give that advice anymore.
I have, and I have had, several other friends succeed
as a result of it, and all of us also
burnt ourselves out and did huge amounts of damage to ourselves.
Like, I am permanently broken as a result of,
you know, the ten years that I did eighty hour
(40:01):
weeks and shit, you.
Speaker 3 (40:02):
Know, now you're sounding like somebody who works in the
entertainment industry.
Speaker 2 (40:06):
Yes, yes, and it worked for me, right. I,
I succeeded, I got a great job, I got money.
For most people it doesn't. And it's bad that it works
this way. Ziz, unlike me, is not willing to do that, right.
She thinks it's wrong to be asked to work overtime
and not get paid for it, and so on her
(40:27):
first day at the job, she leaves after eight hours
and her boss is like, what the fuck are you doing?
And she's like, I'm supposed to be
here eight hours. Eight hours is up, I'm going home.
And he calls her half an hour later and fires her, right.
And this is because the tech industry is evil, you know.
Like, this is bad. She's not bad here. She is...
(40:49):
It is, like, a thing where she's not doing,
by her standards, what I would say is the rational thing,
which would be: if all that matters is optimizing your
earning power, right, well, then you do this, then
you do whatever it takes, right. So it's kind
of interesting to me like that she is so devoted
to this like virtue ethics thing at this point that
(41:10):
she fucks over her career in the tech industry because
she's not willing to do the things that you kind
of need to do to succeed, you know, in the
place that she is. But it's interesting, I don't like
give her any shit for that. So she asks her
parents for more runway to extend her time in
the Bay. And then she finds work at another startup,
but the same problems persist. Quote, they kept demanding that
(41:33):
I work unpaid overtime, talking about how other employees
always just put forty hours on their timesheet no matter what.
And this exemplary employee over there worked twelve hours a
day and he really went the extra mile and got
the job done. And they needed me to really go
the extra mile and get the job done. She's not
willing to do that. And again I hate that this
is part of what drives her to the madness that
(41:53):
leads to the cult to the killings, because it's like, oh, honey,
you're in the right. It's an eavil industry.
Speaker 3 (41:59):
Yeah, you a flash of where it could have gone.
Well it really there were chances for this to work out.
Speaker 2 (42:05):
No, you were one hundred percent right. Like, she
stuck to it. Yeah, you know what I mean, and
that's super hard. I really respect that part of you.
Oh yeah, yeah, yeah, I'm so sad that this is
part of what shatters your brain. Like that really bums
me out. So first off, she kind of starts spiraling
(42:28):
and she concludes that she hates virtue ethics. This is
where she starts hating Yudkowsky, right. She doesn't
break entirely with him yet, but she gets really
angry at this point, because she's like, well, obviously virtue
ethics doesn't work.
Speaker 3 (42:42):
And she's been following this man at this point for.
Speaker 2 (42:44):
Years exactly exactly, so this is a very like damaging
thing to her that this happens. And you know, and
again as much as I'm blamed Yadkowski, the culture of
the Bay Bay area tech industry. That's a big part
of what drives this person and you know, to where
she ends up.
Speaker 3 (43:02):
Right.
Speaker 2 (43:03):
So that said, some of her issues are also rooted
in a kind of rigid and unforgiving internal rule set.
At one point, she negotiates work with a professor and
their undergraduate helper. She doesn't want to take an hourly job,
and she tries to negotiate a flat rate of seven k.
And they're like, yeah, okay, that sounds fair, but the
school doesn't do stuff like that, so you will have
(43:25):
to fake some paperwork with me for me to be
able to get them to pay you seven thousand dollars.
And she isn't willing to do that. And that's the
thing where it's like, ah, no, I've had some shit
where there was, like, a stupid rule,
and, in order for me or other
people to get paid, we had to tell something
else to the company. Like, that's just, that's just
(43:46):
knowing how to get by. Yeah, that's that's living.
Speaker 3 (43:49):
In the world. You got Yeah, you did the hard part. Yeah,
they said that we were going to do it.
Speaker 2 (43:53):
You said they did it.
Speaker 3 (43:54):
Yeah, that's like they already said we don't do this.
That's where are you just?
Speaker 2 (43:59):
You can't get by in America if you're not willing
to lie on certain kinds of paperwork, right. That's,
that's the game. Our president does it all the time. He's the
king of that shit. So at this point, Ziz is
stuck in what they consider a calamitous situation. The prophecy
of doom, as they call it, is ticking ever closer,
(44:21):
which means the bad AI that's going to create hell
for everybody. Her panic over this is elevated by the
fact that she she starts to get obsessed with Roco's
basilisk at this time. No I know, I know, worst
thing for her to read, come on and info hazards
right the warnings yep, and a lot of the smarter
(44:43):
rationalists are just annoyed by it.
Speaker 3 (44:45):
Again.
Speaker 2 (44:45):
Yudkowsky immediately, like, very quickly decides it's
bullshit and bans discussion of it. He argues there's no
incentive for a future agent to follow through with that
threat, because by doing so, it just expends resources at
no gain to itself. Which is like, yeah, man, a
hyperlogical AI would not immediately jump to I must make
(45:06):
hell for everybody who didn't code me. Like, yeah, that's
just crazy.
Speaker 3 (45:10):
There's step skew.
Speaker 2 (45:12):
Yeah, only humans are like ill in that way.
Speaker 3 (45:16):
That's the funny thing about it is it's such a
human response to it.
Speaker 2 (45:18):
Yeah, right, right. Now, when she encounters the concept of
Roko's basilisk at first, Ziz thinks that it's silly, right.
She kind of rejects it and moves on. But once
she gets to the Bay, she starts going to in
person rationalist meetups and having long conversations with other believers
who are still talking about Roko's basilisk. She writes, I
(45:39):
started encountering people who were freaked out by it, freaked
out that they had discovered an improvement to the info
hazard that made it function, got around Eliezer's objection.
Her ultimate conclusion is this: if I persisted in trying
to save the world, I would be tortured until the
end of the universe by a coalition of all
unfriendly AIs in order to increase the amount of measure
(46:01):
they got by demoralizing me. Even if my system two
had good decision theory, my system one did not, and
that would damage my effectiveness. And like, I can't explain
all of the terms in that without taking more time
than we need to. But, like, you can hear that
that is not the writing of a person who is
thinking in logical terms.
Speaker 3 (46:18):
No, it's it's a uh it's so scary.
Speaker 2 (46:23):
Yes, yes, it is very scary stuff.
Speaker 3 (46:26):
It's so scary to be like, Oh, that's where she
was operating those mistakes.
Speaker 2 (46:29):
This is what she's dealing with.
Speaker 3 (46:32):
Yes, that's that's it is. You know.
Speaker 2 (46:35):
I talk to my friends who were raised in,
like, very toxic chunks of the evangelical
subculture and who spent their whole childhood terrified of
hell, that, like, everything... you know, I got angry at
my mom and I didn't say anything, but God knows
I'm angry at her, and he's going to send me
to hell because I didn't respect my mother.
Speaker 3 (46:53):
Mother.
Speaker 2 (46:54):
Like, that's what she's doing right.
Speaker 3 (46:56):
Exactly, like exactly, she can't win. There's no winning here.
Speaker 2 (46:58):
Yes, yes, and again I say this a lot. We
need to put lithium back in the drinking water. We
we gotta put lithium back in the water. Maybe Xanax too. She
Speaker 3 (47:10):
Needed she could have tooken a combo. Yeah, before it
gets to where it gets. At this point, you really,
you really feel for her, like, just living in this,
living like that every day. She's so scared, and that,
that this is what she's doing. This,
this is who she is.
Speaker 2 (47:30):
The therapy-needingest woman I have ever heard of. At this point,
Oh my god.
Speaker 3 (47:35):
She just needs to talk to she needs to talk again.
Speaker 2 (47:38):
You know, the cult thing, the thing that happens to cult
members, has happened to her, where the whole
language she uses is incomprehensible to people. I had to
talk to you for an hour and fifteen minutes so
you could understand parts of what this lady says, right. Exactly.
You have to, because it's all nonsense if you don't
do that work exactly.
Speaker 3 (47:59):
She's so spun out at this point, it's like, how
do you even get back? Yeah, how do you even
get back?
Speaker 2 (48:05):
Yeah. So she ultimately decides, even though she thinks
she's doomed to be tortured by unfriendly AIs, evil gods
must be fought. If this damns me, then so be it.
She's very heroic, and she sees herself that way, right.
Speaker 3 (48:19):
Yeah. And even, like, just with her convictions, and that
she, she does, she does it.
Speaker 2 (48:26):
She's a woman of conviction. You really can't take that
away from her. Her convictions are nonsense, no, but they're there.
Speaker 3 (48:36):
Yeah, they're based on elaborate Harry Potter fan fiction.
Speaker 2 (48:39):
Yeah, it's like David Icke, the guy who believes in,
like, literal lizard people, and everyone thinks he's, like, talking
about the Jews, but like, no, no, no, he means
just lizards.
Speaker 3 (48:48):
It's exactly that where it's just like you want to draw, yeah,
you want to draw something, so it's not nonsense, and
then you realize.
Speaker 2 (48:55):
No, that's... no, no, no, no. And like, David Icke
went out, he made, like, a big rant against how
Elon Musk is, like, evil for all these people
he's hurt by firing the whole federal government, and people
were shocked. It's like, no, no, no, David Icke believes
in a thing. It's just a crazy thing. Those people do exist.
(49:16):
And here we are talking
about them. Some of them run the country. But actually,
I don't know how much all of those people believe
in anything. But yeah, yeah, speaking of people who believe
in something, our sponsors believe in getting your money. We're back.
(49:44):
So, uh, she is at this point suffering from delusions
of grandeur, and those are going to rapidly lead her
to danger. But she concludes that, since the fate of
the universe is at stake in her actions, she would
make a timeless choice to not believe in the basilisk,
right, and that that will protect her in the future,
(50:04):
because that's how these people talk about stuff like that.
So she gets over her fear of the basilisk for
a little while. But even when she claims
to have rejected the theory, whenever she references it in
her blog, she like locks it away under a spoiler
with, like, an info hazard warning (Roko's basilisk family, skippable),
so you don't, like, have to see it and have
(50:26):
it destroy your psyche.
Speaker 3 (50:28):
That's the power of it.
Speaker 2 (50:30):
Yeah, yeah, yeah. The concept does, however, keep coming back
to her and continuing to drive her mad. Thoughts
of the basilisk return, and eventually she comes to an
extreme conclusion: if what I cared about was sentient life,
and I was willing to go to hell to save
everyone else, why not just send everyone else to hell
if they didn't submit? Can I tell you, I really...
(50:52):
it felt like this is where
it had to go, right? Yeah, yeah, yes. So what
she means here is that she is now making the
timeless decision that when she is in a position of
ultimate influence and helps bring this all-powerful vegan AI
into existence, she's promising now, ahead of time, to create
(51:12):
a perfect hell, a digital hell, to, like, punish all
of the people who don't stop, like, eating meat, ever.
She wants to make a hell for people who eat meat.
And that's, yeah, that's the conclusion that she makes, right.
So this becomes an intrusive thought in her head, primarily
the idea that like everyone isn't going along with her, right, Like,
(51:36):
she doesn't want to create this hell. She just thinks
that she has to. So she's like very focused on
like trying to convince these other people in the rationalist
culture to become vegan. Anyway, she writes this, quote: I
thought it had to be subconsciously influencing me, damaging
my effectiveness, that I had done more harm than
I can imagine by thinking these things, because I had
(51:57):
the hubris to think info hazards didn't exist; worse, to
feel resigned, a grim sort of pride in my previous
choice to fight for sentient life although it damned me,
and in the gaps between "do not think about that, you moron,"
"do not think about that, you moron," pride, which may
have led intrusive thoughts to resurface, and
progress to resume. In other words, my ego had perhaps
damned the universe. So, man, I don't fully get all
damned the universe. So Man, I don't fully get all
of what she's saying here. But it's also because she's
like just spun out into madness at this.
Speaker 3 (52:28):
Yeah, she lives in it now. It's so, yeah, it's
so far... for however long we've been talking about it,
she's, she's so far away from us even.
Speaker 2 (52:39):
Yeah, and it is, it is deeply... I've read a
lot of her writing. It is deeply hard to understand
pieces of it, man.
Speaker 3 (52:47):
But she is at war with herself.
Speaker 2 (52:49):
She is for sure at war with herself.
Speaker 3 (52:52):
Now.
Speaker 2 (52:53):
She is at this point attending rationalist events by the Bay,
and a lot of the people at those events are older,
more influential men, some of whom are influential in the
tech industry, all of whom have a lot more money
than her. And some of these people are members of
an organization called CFAR, the Center for Applied Rationality,
which is a nonprofit founded to help people get better
(53:16):
at pursuing their goals. It's a self-help company, right,
one that runs self-help seminars. This is the same
as, like, a Tony Robbins thing, right: we're all just
trying to get you to sign up and then get
you to sign up for the next workshop and the
next workshop and the next workshop, like all self help
people do. Yeah, there's no difference between this and Tony Robbins.
(53:36):
So Ziz goes to this event and she has a
long conversation with several members of CFAR, who, I think,
are clearly kind of... my interpretation of this is that
they're trying to groom her as a new recruit, because
they think this chick's clearly brilliant, she'll find her way in
the industry, and we want her money, right. You know,
maybe we want her to do some free work for us too,
(53:58):
but like, let's you know, uh, we got to reel
this fish in, right. So this is described as an
academic conference by people who are in the AI risk
field and rationalism, you know, thinking of ways to save
the universe, because only the true, the super geniuses can
do that. The reason why I'm really glad that I
(54:19):
read Ziz's account here is I've been reading about these
people for a long time. I've been reading about their beliefs.
I felt there's some cult stuff here. When Ziz laid
out what happened at this seminar, this self-help
seminar put on by these people very close to Yudkowsky,
it's almost exactly the same as a Synanon meeting.
(54:43):
Like, it's the same stuff, it's exactly the
same shit. It's the same as accounts of, like, big,
like, self-help movement things from the seventies and
stuff that I've read. That's when it really
clicked for me.
Speaker 3 (54:55):
Right.
Speaker 2 (54:56):
Here's a description of one of the exercises, because they
have, you know, speeches, and they break out into groups
to do different exercises, right. Quote: there were hamming circles. Per
person, take turns having everyone else spend twenty minutes trying
to solve the most important problem in your life, to you.
I didn't pick the most important problem in my life
because secrets. I think I used my turn on a
(55:17):
problem I thought they might actually be able to help
with: the fact that, although it didn't seem
to affect my productivity or willpower at all, i.e.,
I was inhumanly determined basically all the time, I
still felt terrible all the time, that I was hurting
from, to some degree, relinquishing my humanity. I was sort
of vaguing about the pain of being trans and having
decided not to transition. And so, like, this is a
(55:39):
part of the thing. You build a connection between other
people in this group by getting people to like spill
their secrets to each other. It's a thing scientology does.
It's a thing they did. It's send it on, tell
me your dark is secret.
Speaker 3 (55:50):
Right.
Speaker 2 (55:51):
And she's not fully willing to because she doesn't want
to come out to this group of people yet. And
you know, part of... I forget
Speaker 3 (56:00):
That she's also dealing with that entire thing, yes.
Speaker 2 (56:03):
Wow, yeah. And the hamming circle doesn't sound so
bad, if you'll recall. As I mentioned
in part one, Synanon would
have people break into circles where they would insult and
attack each other in order to create a traumatic experience
that would bond them together and with the cult. These
hamming circles are weird, but they're not that. But there's
another exercise they did next called doom circles. Quote. There
(56:28):
were doom circles where each person, including themselves, took turns
having everyone else bluntly but compassionately say why they were
doomed using blindsight. Someone decided and set a precedent of
starting these off with a sort of ritual incantation we
now invoke and bow to the doom gods and waving
their hands saying doom. I said I'd never bow to
(56:49):
the doom gods. And while everyone else said that, I
flipped the double bird to the heavens and said fuck
you instead. Person A, that's this member of CFAR that
she admires, found this and joined in. Some people
brought up that they felt like they were only as
morally valuable as half a person. This irked me. I
said they were whole persons and don't be stupid like that,
(57:11):
like if they wanted to sacrifice themselves, they could weigh
one versus seven billion. They didn't have to falsely denigrate
themselves as less than one person. They didn't listen. When
it was my turn, concerning myself, I said my doom
was that I could succeed at the things I tried,
succeed exceptionally well, like I bet I could in ten
years have earned to give, like, ten million dollars through startups,
and it would still be too little, too late, like
(57:33):
I came into this game too late. The world would
still burn. And first off, like, this is, you know,
a variant of the Synanon thing going on here.
You're telling people why they're doomed, right, like why they
won't succeed in life, you know. But it's also, one
of the things here is these people are saying they feel
like less than a person. A major topic of discussion
(57:54):
in the community at the time is if you don't
think you can succeed in business and make money, is
the best thing with the highest net value you can
do taking out an insurance policy on yourself and committing suicide,
oh my god, and then having the money donated to
a rationalist organization. That's a major topic of discussion that
(58:14):
like Ziz grapples with, a lot of these people grapple
with right because they're obsessed with the idea of like,
oh my god, I might be net negative value, right,
if I can't do this or can't do that, I
could be a net negative value individual. And that means
like I'm not contributing to the solution. And there's nothing
worse than not contributing to the solution.
Speaker 3 (58:33):
Were there people who did that?
Speaker 2 (58:37):
I am not aware... there are people who committed suicide
in this community, I will say that. Like, there are
a number of suicides tied to this community. I don't
know if the actual insurance con thing happened, but it's
like a seriously discussed thing. And it's seriously discussed because
(58:57):
all of these people talk about the value of
their own lives in purely, like, mechanistic terms: how much
money or expected value can I produce? Like, that is
a person and that's why a person matters, right. And
the term they use is morally valuable, right, like,
that's what it means: you're a worthwhile human being if you're
(59:19):
morally if you're creating a net positive benefit to the
world in the way they define it. And so a
lot of these people are. Yes, there are people who
are depressed, and there are people who kill themselves because
they come to the conclusion that they're a net negative person,
right like that that is a thing at the edge
of all of this shit that's really fucked up. And
that's what this doom circle is about, is everybody
(59:41):
like flipping out over, and telling each other,
I think you might only be as
morally valuable as half a person. Right, like, people
are saying that, right, like, that's what's going on here,
you know. Like, it's not the Synanon thing of, like,
screaming, like, you're a, you know, using the f slur
a million times or whatever. But it's very bad.
Speaker 3 (01:00:03):
No, this is this is this is awful for like.
Speaker 2 (01:00:06):
One thing, I don't know, my feeling is you have
an inherent value because you're a person.
Speaker 3 (01:00:12):
Yeah, that's a great place to start, you know. So
leading people to destroy themselves, like, it's... yeah.
Speaker 2 (01:00:21):
It's so... it's such a bleak way of looking
at things.
Speaker 3 (01:00:25):
It's so crazy too. Where were these held? I just,
in my head, I'm like, this is just happening in,
like, a ballroom at a Radisson.
Speaker 2 (01:00:31):
I think it is, or a convention center, you know,
the different kind of public spaces. I don't know, Like honestly,
if you've been to like an anime convention or a
Magic the Gathering convention somewhere in the Bay, you may
have been in one of the rooms they did these in.
And I don't know exactly where they held this. So
the Person A mentioned above, this, like, person who's
(01:00:51):
affiliated with the organization, who I think is a recruiter,
uh, looking for young people who can be cultivated to
pay for classes, right. This person, it's very clear to
them that Ziz is at the height of her vulnerability,
and so he tries to take advantage of that. So
he and another person from the organization engage Ziz during
a break. Ziz, who's extremely insecure, asks them point blank,
(01:01:15):
what do you think my net value ultimately will be
in life? Right? And again, there's, like, an element of this.
It's almost like rationalist Calvinism, where it's, like, it's actually
decided ahead of time by your inherent, immutable characteristics, you know,
whether you are a person who can do good. Quote.
I asked person A if they expected me to be
net negative. They said yes. After a moment, they asked
(01:01:38):
me what I was feeling or something like that. I
said something like dazed and sad. They asked why sad.
I said I might leave the field as a consequence
and maybe something else. I said I needed time to
process or think. And so she goes home after this
guy is saying like, yeah, I think your life's probably
net negative value, and sleeps the rest of the day,
and she wakes up the next morning and comes back
(01:02:00):
to the second day of this thing, and yeah, Ziz
goes back and she tells this person, Okay, here's what
I'm gonna do. I'm going to pick a group of
three people at the event I respect, including you, and
if two of them vote that they think I have
a net negative value quote, I'll leave EA and Existential
(01:02:21):
risk and the rationalist community and so on forever. I'd
transition and move, probably to Seattle. I heard it was
relatively nice for trans people, and there do what I
could to be a normy, retool my mind as much
as possible, to be stable, unchanging, and a normy, gradually abandon
my Facebook account and email, use a name change as
a story for that. And God, that would have been
(01:02:42):
the best thing for her, you see.
Speaker 3 (01:02:45):
But sliver of hope, like yeah, oh man.
Speaker 2 (01:02:48):
She sees this as a nightmare. Right, this is the
worst case scenario for her, right, because you're not part
of... you're not part of the cause. You know, you
have no involvement in the great quest to save humanity.
That's worse than death, almost, right. It's its own kind...
Speaker 3 (01:03:07):
Of hell, though, right? To think that you have this
enlightenment and that you... that you weren't good enough to...
Speaker 2 (01:03:14):
And she talks a lot about how, I'd probably just kill myself,
you know, that's the logical thing to do. It's so
fucked up, it's so fucked up. But also, if she's
trying to live a normal life as a normy, and
she refers to, like, being a normy as, like,
just trying to be nice to people, because again, that's useless.
(01:03:35):
So her fear here is that she would be
a causal negative if she does this, right. And also
the robot god that comes about might put her in hell, right.
Speaker 3 (01:03:45):
Because that's also looming, yeah, after every... for every decision, right. Yeah.
Speaker 2 (01:03:49):
And the thing here, she expressed... she tells these
guys a story, and it really shows, both in this
community and in her, how little value they actually have
for, like, human life. Quote: I told a story about
a time I had killed four ants in a bathtub
where I wanted to take a shower before going to work.
I'd considered, can I just not take a shower, and
presumably me smelling bad at work would, because of big
(01:04:11):
numbers and the fate of the world and stuff, make
the world worse than the deaths of four basically causally
isolated people. I considered getting paper and a cup and
taking them elsewhere, and I figured there were decent odds
if I did, I'd be late to work and it
would probably make the world worse in the long run.
So again, she considers ants identical to human beings, and
she is also saying it was worth killing four of
(01:04:33):
them because they're causally isolated, so that I could get
to work in time, because I'm working for the cause.
She's also in a bad place here. Yeah.
Speaker 3 (01:04:44):
The crazy thing about her is, like, the amount
of thinking just to, like, get in the shower
to go to work, you know what I mean?
It just seems like it makes everything... yeah,
every action is so loaded, yes, yes...
Speaker 2 (01:05:04):
It's so wild to me, both this,
like, mix of, like, fucking Jain Buddhist compassion of, like,
an ant is no less than I, or an ant
is no less than a human being, right, we are
all... these are all lives. And then, but also, it's
fine for me to kill a bunch of them so I can
go to work on time because, like, they're causally isolated,
so they're basically not people. Like, it's so weird.
(01:05:29):
Like and again it's getting a lot clearer here why
this lady and her ideas end in a bunch of
people getting shot.
Speaker 3 (01:05:39):
Yeah and stabbed.
Speaker 2 (01:05:41):
Okay, there's a samurai sword later in the story, my friend, that's...
Speaker 3 (01:05:46):
The one thing this has been missing.
Speaker 2 (01:05:48):
Yes, yes. So these guys continue to have a
very abusive conversation with this young person, and she clearly,
she trusts them enough.
Speaker 3 (01:05:57):
To have a conversation where she asks for the... yeah, okay.
Speaker 2 (01:06:00):
Yeah, and she tells them she's trans, right, And this
gives you an idea of like how kind of predatory
some of the stuff going on in this community is.
Quote: They asked what I'd do with a female body. They
were trying to get me to admit what I actually
wanted to do as the first thing in heaven... heaven
being, there's this idea, especially amongst, like, some trans members
of the rationalist community, like, all of them basically
(01:06:22):
believe a robot's going to make heaven, right. And obviously,
like, there's a number of the folks who are in
this who are trans who are like, and in heaven, like,
you just kind of get the body you want immediately, right.
So they were trying to get me to
admit that what I actually wanted to do as the
first thing in heaven was masturbate in a female body.
And they follow this up by sitting really close to her,
(01:06:44):
close enough that she gets uncomfortable. And then a really,
really rationalist conversation follows. They asked if I felt trapped.
I may have clarified, physically? They may have said sure. Afterward,
I answered no to that question, under the likely justified
belief that it was framed that way. They asked
why not. I said I was pretty sure I could
take them in a fight. They prodded for details why
(01:07:07):
I thought so, and then how I thought a fight
between us would go. I asked, what kind of fight,
like a physical, unarmed fight to the death right now?
And why? What were my payouts? This was over the
fate of the multiverse. Triggering actions by other people, i.e.,
imprisonment or murder, was not relevant. So they decide,
they make this into... again, these people are all addicted
to dumb game theory stuff, right. Okay, so what is
(01:07:27):
this fight? Is this fight over the fate of the multiverse?
Are we in a you know, an alternate reality where
like no one will come and intervene and there's no
cops we're the only people in the world or whatever.
So they tell her, like, yeah, imagine there's no consequences,
legally, whatever, for what you do, and we're fighting over the
fate of the multiverse. And so she proceeds to give
an extremely elaborate discussion of how she'll gouge out their
eyes and try to destroy their prefrontal lobes and then
(01:07:49):
stomp on their skulls until they die. And it's both...
it's, like, it's nonsense. It's like how ten-year-olds
think fights work. It's also based on this game
theory attitude of fighting that they have, which is, like,
you have to make this kind of timeless decision
that in any fight you're just going to...
Speaker 3 (01:08:08):
The hardest confrontation, right, Yes, I suppose you have to
be the most violent.
Speaker 2 (01:08:12):
Yes, yes, because that will make other people not want
to attack you, as opposed to like what normal people
understand about, like, real fights, which is if you have
to do one, if you have to, you, like, try
to just, like, hit them
somewhere that's going to shock them, and then run like
a motherfucker, right, you get them, get out of there. Like,
(01:08:32):
if you have to, like, ideally just run like a motherfucker.
But if you have to strike somebody, you know, yeah,
go for the eye and then run like a son
of a bitch, you know. Like, but there's no run
like a son of a bitch here, because the point,
in part, is this, like, timeless decision thing. Anyway, this
tells you a lot about the rationalist community. So
she tells these people, she explains in detail how she
(01:08:53):
would murder them if they have to fight. They're, like,
sitting super close, having just asked her about masturbation.
Here's their first question, quote: They asked if I'd rape
their corpse. Part of me insisted this was not going
as it was supposed to, but I decided
inflicting discomfort in order to get reliable information was a
valid tactic. In other words, them trying to make her
(01:09:16):
uncomfortable to get info from her, she decides, is fine. Also,
the whole discussion about raping their corpses is like, well,
obviously, if you want to have the
most extreme response possible, that would, like, make other people
unlikely to fuck with you, knowing that you'll violate their corpse
if you kill them. That's clearly the logic there. And, like, that... okay, sure,
I love rational thought.
Speaker 3 (01:09:38):
Oh man, this is crazy. Sorry, this is so crazy.
It's so nuts.
Speaker 2 (01:09:47):
So then they talk about psychopathy. One of these guys
had earlier told Ziz that they thought she was a psychopath.
Speaker 3 (01:09:55):
But he told.
Speaker 2 (01:09:57):
Her that doesn't mean what it means to actual,
like, clinicians, because psychopathy is a diagnosis, or like what
normal people mean. To rationalists, a lot of them think
psychopathy is a state you can put yourself into in
order to maximize your performance in certain situations. It's because
they've... again, there's some, like, popular books that are
(01:10:18):
about, like, the psychopath's way, the Dark Triad, and like, well,
you know, these are the people who lead societies in
the toughest times, and so, like, you need
to optimize and engage in some of those behaviors if
you want to win in these situations. Based on all
of this, Ziz brings up what rationalists call the Gervais Principle. Now,
this started as a tongue in cheek joke describing a
(01:10:41):
rule of office dynamics based on the TV show The Office.
Yes, it's Ricky Gervais, yes. And the idea is that
in office environments, psychos always rise to the top. This
is supposed to be, like, a negative observation. Like, the
person who wrote this initially is like, yeah, this is
how offices work, and it's like why they're bad.
Speaker 3 (01:11:00):
You know.
Speaker 2 (01:11:00):
It's an extension of the Peter Principle. And these psychopaths
put, like, dumb and incompetent people in
positions below them for a variety of reasons. It's trying to kind
of work out why and in what ways offices are often dysfunctional. Right,
like, the original Gervais Principle thing is
not a bad piece of writing or whatever, but Ziz
(01:11:21):
takes something insane out of it. Quote: I described how the
Gervais Principle said sociopaths give up empathy, as in a
certain chunk of social software, not literally all
hardware-accelerated modeling of people, not necessarily compassion, and with it
happiness, destroying meaning to create power. Meaning, too, I did
not care about. I wanted this world to live on.
(01:11:42):
So she tells them she's come to the conclusion I
need to make myself into a psychopath in order to
have the kind of mental power necessary to do the
things that I want to do. And she largely justifies
this by describing the beliefs of the Sith from Star Wars,
because she thinks she needs to remake herself as a
psychopathic evil warrior monk in order to save all of creation. Yeah, no,
(01:12:08):
of course, yep, So this is her hitting her final form.
And true to form, these guys are like, they don't
say it's a good idea, but they're like, okay, yeah,
you know, that's not the worst thing you
could do. Sure, you know, like, I think the Sith
stuff is kind of weird, but making yourself a psychopath makes sense. Sure, yeah,
of course I know a lot of guys who did that.
(01:12:29):
That's literally what they say, right. And then they say
that... also, I don't even think that's what they really believe,
they just say that, because the next thing they say, this
guy, Person A, is like, look, the best bet is to later
turn yourself from a net negative to a net positive value.
I really believe you could do it. But to do it,
you need to come to ten more of these seminars
and keep taking classes here, right, right, right. Here's a
(01:12:52):
quote from them... er, from Ziz: He said conditional on me
going to a long course of circling like these two
organizations offered, particularly a ten-weekend one, then I probably
would not be net negative. So things are going good.
This is... this is, you know, ah, yeah, great.
Speaker 3 (01:13:17):
How much does ten weekends cost?
Speaker 2 (01:13:19):
I don't actually know. I don't fully know
with this. It's possible some of these, like some
of the events, are free, but the classes cost money.
But also, a lot of it is, like, there's
donations expected, or by doing this and being a member,
it's expected you're going to tithe, basically, like, your income,
(01:13:40):
right, more than at a...
Speaker 1 (01:13:42):
Cult. I don't know the format. Is she not going
to be, like, super suspicious that people are, like, you know,
faking it or, like, going over the top?
Speaker 2 (01:13:51):
She... okay, she is. She gets actually really uncomfortable.
They have an exercise where they're basically doing you know,
they're playing with love bombing right where everyone's like hugging
and telling each other they love each other, and she's like,
I don't really believe it. I just met these people.
So she has started to, and she is going to,
break away from these organizations pretty quickly. But this conversation
(01:14:12):
she has with these guys is a critical part of,
like, why she finally has this fracture. Because, number one,
this dude keeps telling her you have a net negative
value to the universe, right, and so she's obsessed with,
like, how do I... and she comes to the conclusion,
my best way of being net positive is to make
(01:14:33):
myself into a sociopath and a Sith Lord to save
the animals.
Speaker 3 (01:14:40):
Of course. It feels like the same thinking, though, as,
like, the robot's gonna make... it seems to always come
back to this idea of, like, I think we just
gotta be evil.
Speaker 1 (01:14:51):
It's like, yes, oh, yes, well, I guess the only
logical conclusion is doom, yep.
Speaker 3 (01:15:01):
Yeah, yeah.
Speaker 4 (01:15:02):
It's like... it feels like it's a theme here,
mm, yep. Anyway, you want to plug anything at
the end here?
Speaker 3 (01:15:13):
I have a comedy special you can purchase on Patreon. It's called
Birth of a Nation with a G. You can get
that at Patreon dot com slash...
Speaker 2 (01:15:22):
David Borie. Excellent, excellent. All right, folks, well, that is
the end of the episode. David, thank you so much
for coming on to our inaugural episode and listening to
some of the weirdest shit we've ever talked about on
this show.
Speaker 3 (01:15:40):
Yeah, this is uh, I don't really I'm going to
be thinking about this for weeks.
Speaker 1 (01:15:44):
I mean, yeah, yeah, fair because your co host likes
a curbent kbon for the Elders of Zion episodes.
Speaker 2 (01:15:54):
Yeah, yeah, okay. I was initially gonna
kind of just focus on... all this would have been
like half a page or so, you know, just kind
of summing up, here's the gist of what this group believes,
and then let's get to the actual cult stuff when,
like, you know, Ziz starts bringing in followers and the
crimes start happening. But that Rolling... that Wired article
(01:16:15):
really covers all that very well. And that's the best piece.
Most of the journalism I've read on these guys is
not very well written. It's not very good. It does
not really explain what they are or why
they do it. So I decided... and I'm not saying the
Wired piece is bad, the Wired piece is great. I know the Wired guy knows
all of the stuff that I brought up here. He
just... it's an article, you have editors. He left out
(01:16:38):
what he thought he needed to leave out. I don't
have that problem, and I wanted to really, really deeply
trace exactly how this lady's mind develops,
and how that intersects with rationalism, because it's interesting and
kind of important and bad.
Speaker 3 (01:16:56):
Yeah, Okay, he's so.
Speaker 2 (01:17:01):
Anyway, thanks for having a had fuck with me, all right.
That's it, everybody, goodbye.
Speaker 1 (01:17:11):
Behind the Bastards is a production of cool Zone Media.
For more from cool Zone Media, visit our website Coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts. Behind the
Bastards is now available on YouTube, new episodes every Wednesday
and Friday. Subscribe to our channel YouTube dot com slash
(01:17:32):
at Behind the Bastards