
March 11, 2025 72 mins

Earlier this year a Border Patrol officer was killed in a shoot-out with people who have been described as members of a trans vegan AI death cult. But who are the Zizians, really? Robert sits down with David Gborie to trace their development, from part of the Bay Area Rationalist subculture to killers.

(4 Part series)

Sources: 

  1. https://medium.com/@sefashapiro/a-community-warning-about-ziz-76c100180509
  2. https://web.archive.org/web/20230201130318/https://sinceriously.fyi/rationalist-fleet/
  3. https://knowyourmeme.com/memes/infohazard
  4. https://web.archive.org/web/20230201130316/https://sinceriously.fyi/net-negative/
  5. Wayback Machine
  6. The Zizians
  7. Spectral Sight
  8. True Hero Contract
  9. Schelling Orders – Sinceriously
  10. Glossary – Sinceriously

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media. Welcome back to Behind the Bastards! That's how
this podcast would open if I was a game show host,
but I'm not. Instead, I'm a guy who's.

Speaker 2 (00:16):
Spending you would be good at it, though.

Speaker 3 (00:18):
I don't think I would be, Sophie.

Speaker 2 (00:20):
I do, but I think, but I'm like biased because
I think you'd be good at most things.

Speaker 1 (00:25):
No, my only marketable skill is spending thirty hours reading
the deranged writings of a quasi cult leader who was
somewhat involved in the murders of multiple people very recently,
largely because she read a piece of Harry Potter fan
fiction at the wrong time. Yes, we have a fun

(00:48):
one for you this week, and by a fun one,
we have a not at all fun one for you
this week. And to have just a terrible time with me.
We are bringing on again the great David Gborie, co-host
of My Momma Told Me with our friend of
the pod, Langston Kerman. David, how you doing? Oh man, I.

Speaker 3 (01:10):
Can't not complain how you doing? There's nothing going on
in the world.

Speaker 1 (01:15):
Oh yeah, yeah, I got up today and read that
that great new article by Francis Fukuyama. History is
still stopped, So everything's good. We're done.

Speaker 3 (01:27):
I haven't looked at any news yet purposefully, so I'm
you know, it could be awesome. It could be going
great out there. It's great. It's great.

Speaker 1 (01:34):
The whole Trump administration got together and said, psych it
was all a bit man, just an extended ad for
the Apprentice season fifteen.

Speaker 3 (01:43):
You mean this country is not a business.

Speaker 1 (01:45):
No, they handed over the presidency to.

Speaker 3 (01:50):
I don't know.

Speaker 1 (01:51):
I don't know whoever you you personally at home think
would be a great president. I'm not going to jump
into that can of worms right now. Ramonbrony Madron the president.

Speaker 3 (02:04):
That's a good one. That's a good one. That's better
than what we got, honestly, vastly superior than where we are.
Of all the entertainers, I feel like, why don't we
start giving athletes a shot a government? Yeah? I fuck it?
Why not? You know, uh fucking uh.

Speaker 1 (02:20):
Kareem Abdul-Jabbar could knock a presidency out of
the park. Come on, absolutely, Yes, we need a mystery novelist,
slash one of the great basketball stars of all time
in the White House.

Speaker 3 (02:39):
I just want a president who's good in the paint,
you know what I mean?

Speaker 1 (02:41):
That's right, that's right, Agatha Christie with a jump shot.

Speaker 3 (02:47):
Yeah, that's exactly what I think. That's what an amazing man.
Kareem would be.

Speaker 2 (02:52):
Such would be such a good choice.

Speaker 3 (02:55):
Yeah, bring it on. I think he's such a good man.
He wouldn't do it.

Speaker 2 (02:58):
Yeah, exactly, he's way too moral. I have
a frog named after him.

Speaker 3 (03:03):
Yeah.

Speaker 1 (03:04):
Look, honestly, given where we are right now,
I'd take fucking, uh, what's his name, Mark McGwire. Like, Jesus Christ,
anybody. Like, honestly, anyone, I'd take him in a heartbeat. Oh man, fuck

(03:25):
it like I'll take no. No, I'm not gonna take
any hockey players. No hockey players. We got enough people
with brain damage in the White House right now.

Speaker 3 (03:33):
That's probably it, and we don't need somebody who's been hit that much.

Speaker 1 (03:37):
Yeah, yeah, yeah, yeah, you're probably right there. I mean,
if if we could go back in time and make
Joe Louis the president, I think he could solve some
fucking problems in Congress.

Speaker 3 (03:47):
You could get stuff done.

Speaker 1 (03:54):
So this has been a fun digression, but I got
to ask at the start of this the story that
it is most relevant to the people we're talking about today,
that I think most of our listeners will have heard.
I'm curious if you've heard about it. Back on January twenty-first,
right as the Trump administration took power, a Border Patrol
agent was shot and killed along with another individual at

(04:16):
a traffic stop in Coventry, Vermont. Right, there were two
people in a car that was pulled over by Border Patrol.

Speaker 3 (04:22):
One of those people.

Speaker 1 (04:23):
Drew a gun, there was a firefight, one of the
people in the car and the cop died.

Speaker 3 (04:28):
Right, Okay, have you heard this story. I'm not at all.

Speaker 1 (04:33):
It's one of those things where it would have been
a much bigger deal, immigration being the thing that it
is right now in the United States, like the political hot issue
that it is right now. Like, the Republicans have been
desperately waiting for, like, a Border Patrol officer getting shot
and wounded that they can use to justify a crackdown.
But number one, this happened on the Canadian border.

Speaker 3 (04:54):
Not really favorite.

Speaker 1 (04:56):
And one of the two people who drew their
guns on the cops was an immigrant, but they
were a German immigrant, and so.

Speaker 3 (05:04):
None of this really like right, It.

Speaker 1 (05:06):
Was all like right on the edge of being super
useful to their rights.

Speaker 3 (05:10):
But it's none, like we were dealing with our own
right wing immigration propaganda at that time. Yeah.

Speaker 1 (05:19):
Yeah, it was just like it was like the closest
to being a perfect right wing like fault, like a
Reichstag fire in it, but like just a little too weird.

Speaker 3 (05:29):
Yeah, you gotta throw some spice in there. You gotta
have like a Latin country. That's what they get excited about.

Speaker 1 (05:34):
Yeah, and obviously California border is where you want it,
you know.

Speaker 3 (05:37):
Yeah, definitely, definitely, even New Mexico could be.

Speaker 1 (05:40):
Yeah, or at least, at least they need to have
fentanyl in the car. In fact, they were not breaking
any laws that anyone could prove at the time. They
just looked kind of weird. Yeah, they looked kind of weird,
and they like had guns, but it was like they
had like two handguns and like forty rounds and some
old targets. They were like coming back from a shooting range, right,
Like, not a lot of guns and ammo

(06:02):
in America terms, right, especially in Vermont terms. Right. So
the other thing was weird about this is that the
German immigrant who died was a trans woman. So then again
we get back to like, wow, there's a lot about
this shooting that is like right on the edge of
some issues that the right is really trying to use

(06:24):
as like a fulcrum to like push through some awful shit.
And as more and more information came out about the shooting,
the weirder it seemed, because there was a lot of
initial talk, is this like a terrorist attack where these
like two Antifa types who were like looking to murder
a Border Patrol agent. But no, that doesn't really
make sense because like they got pulled over, Like they

(06:44):
can't have been planning this right, Like it didn't. It
didn't really seem like that, and really no one, no
one could figure out why they had opened fire. But
as the days went on, more information started coming out,
not just about the two people involved in
this, well, the one person who was arrested and the
one person who died, but about a group of people

(07:05):
around the country that they were linked to. And these
other people were not all but mostly trans women. They
were mostly people who kind of identified as both anarchists
and members of the rationalist subculture, which we'll talk about
in a little bit, And they were all super high
achieving people in like the tech industry and like sciences. Right,

(07:30):
These are the people who had won like awards and
had advanced degrees. The person, the lady who died in
the shooting was a quant trader, So these are not
like the normal shoot it out with the cops types.

Speaker 3 (07:44):
Yes, this is a very niche group. This is a
very strange story.

Speaker 1 (07:48):
So people start being like, oh the fuck is happening And.

Speaker 3 (07:51):
There's a group of people who could not meet each
other without the invention of the Internet, right, Well.

Speaker 1 (07:56):
That is boy, David, do you know where this always
going or at least starting, so like it's a couple
of these days into this when like a friend of
mine messaged me and it's like, hey, you know that
shooting in Vermont and I was like yeah, and he's
like my friend is like, you know, there's Zizians And
I was like, wait, what what the fuck? Because I

(08:16):
had heard of these people. This is a weird little subculture.
I'm always I'm like, you know, I study weird little
Internet subcultures, and in part because like some of them
do turn out to do acts of terrorism later. And
I've been I've been reporting on the rationalists, who are
not like a cult, but who do some cult adjacent

(08:37):
things and I just kind of find annoying. And I'd
heard about this offshoot of the rationalists called the Zizians.
They were very weird. There were some like weird crime allegations.
A couple of them had been involved in a murder
in California a year earlier. But like, it was not
a group that I ever really expected to see blow
up in the media, and then suddenly they fucking did, right,

(08:59):
And they're called the Zizians. It's not a name they
have for themselves. They don't consider themselves a cult. They
don't all like live A group of them did live together,
but like, these people are pretty geographically like dispersed around
the country. They're folks who met online arguing about rationality
and discussing rationalism and the ideas of a particular member

(09:22):
of that community who goes by the name ziz. Right,
that's where this group came out of and the regular
media was not well equipped to understand what was going on.
And I want to run through a couple of representative
headlines that I came across just in like looking at
mainstream articles about what had happened. There's an article from

(09:43):
The Independent. They title it Inside the Zizians: how a cultish
crew of radical vegans became linked to killings across the
United States. They seemed like just another band of anarchist
misfits scraping by on the fringes
of Silicon Valley until the deaths began. And then there's
a KCRW article, Zizians: the vegan techie

(10:04):
cult tied to murders across the US. And
then a Fox article: trans vegan cult charged with six murders.

Speaker 3 (10:12):
There you go, that's Fox.

Speaker 1 (10:15):
Yes, none of these titles are very accurate in that
I guess the first one is like the closest, where like,
these people are radical vegans and they are cultish, right,
so I'll give the Independent that. Vegan techie
cult is not really how I would describe them. Like,
some of them were in the tech industry, but like

(10:37):
the degree to which they're in the tech industry is
a lot weirder than that gets across. And they're not
really a trans... they're like trans vegans, but the cult
is not about being a trans vegan. That's just kind
of how these people found each other.

Speaker 3 (10:52):
Oh, they just happened to be that was just the.

Speaker 1 (10:55):
Common. Veganism is tied to it. They just kind of
all happen to be trans; that's not really, like, tied
to it necessarily. So I would argue also that they're
not terrorists, which a lot of people, a number
of the other articles, have called them. None of the
killings that they were involved with, and they did kill
people, were like terrorist killings. They're all much weirder than that,

(11:18):
but none of them are like, none of the killings
I have seen are for a clear political purpose, right,
which is kind of crucial for it to be terrorism.
The murders kind of evolved out of a much sillier reason,
and it's you know, there's one really good article about
them by a fella at Wired who spent a year
or so kind of studying these people, and that article

(11:41):
does a lot that's good, but it doesn't go into
as much detail about what I think is the real
underpinning of why this group of people got together and convinced
themselves it was okay to commit several murders. And I
think that that all comes down more than any other
single factor, to rationalism and to their belief in this

(12:03):
weird online cult that's very much based on, like,
asking different sorts of logic questions and trying
to, like, pin down the secret rules of the universe
by doing, like, game theory arguments on the internet over blogs. Right,
Like that's really how all of this stuff started.

Speaker 3 (12:25):
Like someone named Mystery. Yeah, a lot of people in funny.

Speaker 1 (12:28):
Hats they do. Actually they're a little adjacent to this,
and they come out of that period of time, right
where like pick up artists culture is also like forming.
They're one of this like generation of cults that starts
with a bunch of blogs and shit on the internet
in like two thousand and nine, right, And this this
is it's so weird because we use the term cult

(12:51):
and that's the easiest thing to call these people. But generally,
when our society is talking about a cult, we're talking
about, like, an individual. That individual brings in a
bunch of followers, gets them, isolates them from society, puts
them into an area where they are in complete control,
and then tries to utilize them for like a really

(13:13):
specific goal. There's like a way to kind of look
at the Zizians that way. But I think it would be
better to describe them as like cultish, right, Okay, they
use the tools of cult dynamics and that produces some
very cult like behavior, but there's also a lot of

(13:34):
differences between like how this group works and what you'd
call the traditional cult and including a lot of these
people are separate from each other and even don't like
each other, but because they've been inculcated in some of
the same beliefs through these kind of cult dynamics, they
make choices that lead them to like participate in violence too.

Speaker 3 (13:53):
Where is their hub? Is it like a message board
type of situation? Like how is that? Yes?

Speaker 1 (13:58):
Yes, so I'm gonna I'm gonna have to go back
and forth to explain all of that as.

Speaker 3 (14:05):
The technology of Zizzy in it. It's because from the lids.

Speaker 1 (14:09):
No, no. Z-I-Z. The lady who was kind
of the founder of this, the name that she
takes for herself is Ziz, right.

Speaker 3 (14:18):
Okay, should have been the z Girls. That's much more appealing.

Speaker 1 (14:22):
These people are big in the news right.

Speaker 4 (14:24):
Now because because of the several murders, and the right
wing is trying it wants to make it out is
like this is a like trans death cult and this
is more of like an internet a I nerd death cult.

Speaker 3 (14:40):
I guess that's better.

Speaker 1 (14:42):
It's just different. You know, you're right, it was just
a different thing. And I think it's important if like
you care about like cults because you think they're dangerous
and you're arguing that, like, hey, this cult seems really dangerous,
you should understand, like, what the cult is, right? Right? Like,
if you misjudged the Scientologists and thought,
like, these are obsessive fans of science fiction who are

(15:03):
committing murders over science fiction stories. It's like, no, no,
they're committing murders because of something stupider. Yeah, much more. Okay,
so I gotta, I am going to explain
to you what rationalism is, who Ziz is, where
they come from, and how they get radicalized to the

(15:25):
point where they are effectively at the hub of something
that is at least very adjacent to a cult. But
I want to talk a little bit about the difference
between like a cult and cult dynamics. Right, A cult
is fundamentally a toxic thing. It is bad, It always
harms people. There is no harmless cult, you know, it's

(15:47):
like rape, Like there's no version of it that's good,
you know, like it is a fundamentally dangerous thing. Cult
dynamics and the tactics cult leaders use are not always
toxic or bad. And in fact, every single person listening
to this has enjoyed and had their life enriched by

(16:08):
the use of certain things that are on the spectrum
of cult dynamics.

Speaker 3 (16:12):
I was gonna say, it's a lot. It seems a
lot more like you have that at work, gives that
at work anywhere, right.

Speaker 1 (16:18):
Yeah. Anyway, that's a huge part of what makes a
great fiction author able to, like, attract a
cult following, if you've ever had that experience. Like, a big
thing in cults is the use of and creation of
new language. You get people using words that they
don't use otherwise, and like phrases, and that is both
a way to bond people because like you know, it
helps you feel like you're part of this group, and

(16:40):
it also isolates you from people. If you've ever met
people who are like hugely into you know, Dungeons and
Dragons or huge fans like Harry Potter or the Lord
of the Rings, like they have like things that they say,
like memes and shit that they share based on those books,
and like that's a less toxic, but it's on the
same spectrum. Right, It's this, I am a part of

(17:02):
this group of people, and we use these words that
mean something to us that don't mean things to other people, right,
And that's.

Speaker 3 (17:10):
Yes, yes, yeah, that's like a
great way to bond. I think it's any group, right?
I mean, yeah, entertainers.

Speaker 1 (17:17):
Your friend groups, yeah, have in-jokes, right, sports, yeah,
could kill people, right exactly, yes, yes, And like you've
got you know, you and you and your buddies that
have been friends for years, you have like you could
there's like a word you can say and everyone knows
that you're referring to this thing that happened six years ago,
and you all like laugh because you know it reminds
you of something, you know, because it's relevant to something happening.

(17:39):
Then that's a little healthy bit of cult.

Speaker 3 (17:42):
Dynamics at play, right, you know.

Speaker 1 (17:46):
It's like a diet, you know. So there's a toolbox
here and we play with it and different different organizations,
but churches play with it. And obviously a lot of
churches cross the line into cults, but there's also aspects
of for example, you know, there's churches that I know,
I have seen people go to where like it's very
common everybody gets up and like hugs at a certain point,

(18:09):
and like people benefit from human contact. It makes them
feel nice. It can be like a very healthy thing.

Speaker 3 (18:18):
I've gone to.

Speaker 1 (18:19):
I used to go to like Burning Man regionals and
like you would like start at this greeter station where
like a bunch of people would come up, but they'd
offer you like food and drinks, and you know, people
would hug each other and it was this like changes
your mind state from where you were in before, kind
of opens you up that bring.

Speaker 3 (18:35):
Is that like to qualify for the state one? Yeah yeah, yeah, yeah,
so that way you could get to go.

Speaker 1 (18:40):
It's just like these local little events in Texas, right
like a thousand people in the desert trying to forget
that we live in Texas.

Speaker 3 (18:46):
Okay, we're not desert.

Speaker 1 (18:48):
But it was very like it's it's it was like
a really valuable part of like my youth because it
was the first time I ever started to like feel
comfortable in my own skin. But also that's on the
spectrum of love bombing, which is a thing cults do, where they,
like, surround you with people who, like,
you know, will touch you and hold
you and tell you they love you, and like, you know,

(19:10):
part of what brings you into the cult is the
cult leader can take that away at any moment in time.

Speaker 2 (19:15):
Right.

Speaker 1 (19:15):
It's the kind of thing where if it's not something
where no, this is something we do for five minutes
at the end of every church service, right, you can
very easily turn this into something deeply dangerous and poisonous.

Speaker 3 (19:25):
Right.

Speaker 1 (19:25):
But also a lot of people just kind of play
around a little bit with pieces of that, a piece of
the cult dynamics.

Speaker 3 (19:31):
Just a little just a little bit.

Speaker 1 (19:33):
Any good musician, any really great performer is fucking with some
cult dynamics, Right, I was.

Speaker 3 (19:39):
Gonna say, I mean I've been to like so many
different concerts of like weird niche stuff where you're like,
maybe the Disco Biscuits is a cult, I don't know.

Speaker 1 (19:49):
Yeah, I mean like I've been to some childish Gambino
concerts where it's like, oh, yeah, he's doing he's a
little bit of a cult leader, you know, like just
ten percent, right.

Speaker 3 (19:58):
Yeah, I mean, what are you good to do with
all that charisma you got? You gotta put it somewhere?

Speaker 1 (20:02):
Yeah, Yeah, So these are I think that it's important
for people to to understand both that, like the tactics
and dynamics that make up a cult have versions of
them that are not unhealthy. But I also think it's
important for people to understand cults come out of subcultures. Right,

(20:25):
This is very close to one hundred percent of the time.
Cults always arise out of subcultural movements that are not
in and of themselves cults. For example, in the nineteen
thirties through like the fifties sixties, you have the emergence
of what's called the self help movement, you know, and
this is all of these different books on, like, how to
persuade people, how to you know, win friends and influence people,

(20:48):
you know, how to like make but also stuff like
alcoholics anonymous, you know, how to like improve yourself by
getting off drugs, getting off alcohol. All these are pieces
of the self improvement movement. Right, that's a subculture. There
are people who travel around you get obsessed to go
to all of these different things, and they'll and they
get a lot of benefit. You know, people will show
up at these seminars where there's hundreds of other people

(21:10):
and a bunch of people will like hug them and
they feel like they're part of this community and they're
making their lives better. And oftentimes, especially like once we
get to like the sixties seventies, these different sort of
guru types are saying that, like, you know, this is
how we're going to save the world if we can
get everybody doing, you know, this yoga routine or
whatever that I've found, it'll fix everything.

Speaker 3 (21:31):
Who's that guy who had the Game? Oh god, yes, yeah, yeah, yeah.
They had to, like, they had to viciously confront

Speaker 1 (21:39):
each other. Yes, we've covered them. That is Synanon. Yes, yes,
that's what I'm talking about. That's what I'm talking about.
And you have this broader subculture of self-help, and
a cult, Synanon, comes out of it, you know, and.

Speaker 3 (21:51):
I get it. It's like the subculture, it's already intimate. You feel
closer to those people than anybody else. It definitely feels right.

Speaker 1 (21:59):
Right. And Scientology is a cult that comes out of
the exact same subculture. We talked last week or the week
before, two weeks ago, about Tony Alamo, who was an
incredibly abusive pedophile Christian cult leader. He comes out of,
along with a couple other guys we've talked about, the Jesus
Freak movement, which is a Christian subculture that arises as

(22:21):
a reaction to the Hippie movement. It's kind of the
countervailing force to the hippie movements. You got these hippies,
and you know, these Christians who are like really scared
of this kind of like weird left wing movement, and
so they start kind of doing like a Christian hippie
movement almost right, And some of these people just start
weird churches that sing annoying songs, and some of these

(22:43):
people start hideously dangerous cults. You have the subculture, and
you have cults that come out of it, right, And
the same thing is true in every single period of time, right,
cults form out of subcultures, you know. And part of
this is because a lot of people who
find themselves most drawn to subcultures, right, tend to be

(23:04):
people who feel like they're missing something in the outside world, right?
You know, not everybody, but the people who get most into it.
And so... so does that mean.

Speaker 3 (23:14):
Like, so maybe, I'm just curious, like, more
broader cultural waves have never led... Like, the Swifties
would not be a cult. No, there's most likely
not going to be an offshoot of the Swifties that
becomes a cult, because it's so broad; it has to
have already been kind of a smaller subset. That's interesting.

Speaker 1 (23:33):
Well, yeah, and I think, but that said, there
have been cults that have started out of, like, popular
entertainers and musicians. Like, you know, we could
talk about Corey Feldman's weird house full of young women
dressed as angels. Right. So yeah, you've got, as a

(23:57):
general rule, like, music is full of subcultures,
like punk, right? But there have definitely also been
some, like, punk communities, individual little
chunks of punk music, that have gone in,
like, culty directions, right? If you... yeah, yes, yeah,

(24:20):
so there are cults that come out of the subculture.

Speaker 3 (24:23):
This is the way colts work.

Speaker 1 (24:24):
And I really just I don't think I don't think
there's very good education on what cults are, where they
come from, or how they work, because all of the
people who run this country have like a lot of
cult leader DNA in them.

Speaker 3 (24:36):
You know, we're being run currently by someone who is
seen as a magic man. Yes, exactly, exactly.

Speaker 1 (24:49):
I think there's a lot of vested interests in not
explaining what a cult is and where they come from,
and so I want to. I think it's important to
understand that subcultures birth cults, and also cult leaders are drawn
to subcultures when they're trying to figure out how to
make their cult because a subculture, you know, most of
the people in are just gonna be like normal people

(25:11):
who are just kind of into this thing. But there
will always be a lot of people who are like,
this is the only place I feel like I belong.
I feel very isolated. This this is like the center
of my being right right, And so it's just it's
like a good place to recruit, you know, those are
the kind of people you want to reach out
to if you're a cult leader. You know, I'm not saying, like, again,
I'm not saying subcultures are bad. I'm saying that, like

(25:32):
some chunk of people in subcultures are ready to be
in a cult, you know.

Speaker 3 (25:37):
Yeah, yeah, I think if I reflect on my
own personal life... Yeah, you meet a lot of guys
who are just like I'll die for the skate park
or whatever thing.

Speaker 1 (25:46):
Yeah, or like the Star Wars fans who were sending death
threats to Jake Lloyd after The Phantom Menace, where it's like, well,
you guys are crazy. That is insane, you know, He's like, hey, right,
this is a movie he also did.

Speaker 3 (26:00):
He didn't write it, Like what are you doing?

Speaker 1 (26:08):
You know whatever, So and again that's kind of a
good point, Like Star Wars fans aren't a cult, but
you can also see some of like the toxic things
cults do erupt from time to time, and from, like,
video game fans, right, people who are really into a
certain video game. It's not a cult, but also periodically
groups of those fans will act in ways that are

(26:28):
violent and crazy, and it's because of some of these
same factors going on.

Speaker 3 (26:33):
Right, I think people forget fan is short for fanatic. Exactly,
exactly right.

Speaker 1 (26:39):
And it's it's like, you know, the events that I
went to very consciously played with cult dynamics. You know,
after you got out of the like greeting station thing
where like all these people were kind of like love
bombing you for like five minutes, there was like a
big bar and it had like a sign above it
that said not a religion, do not worship. And it
was this kind of... people talk about it like, this is,
we are playing with the ingredients of a cult.

(27:02):
We're not trying to actually make one. So you need
to constantly remind people of like what we're doing and
why it affects their brain that way. And in my case,
it was like because I was at like a low
point in my life then. Like, this was when I
was twenty, I had
no kind of drive in life. I was honestly dealing
with a lot of like suicidal ideation. This is the

(27:23):
point in which I would have been vulnerable to a cult,
and I think it acted a little bit like
a vaccine, like I got a little dose of the drug.

Speaker 3 (27:31):
Immunity, exactly. You're like, hey, I know what that is.
I know what's going on there.

Speaker 1 (27:38):
So anyway, I needed to get into this because the
Zizians, this thing that I think is either a
full-on cult or at least cultish, right,
that is responsible for this series of murders that are
currently dominating the news and being blamed on like a
trans vegan death cult or whatever. They come out of
a subculture that grows out of the early aughts Internet

(28:01):
known as the Rationalists. The Rationalists started out as a
group in the early aughts on the comments sections of
two blogs. One was called less Wrong and one was
called Overcoming Bias. Less Wrong was started by a dude
named Eliezer Yudkowsky. I've talked about Eliezer on the
show before.

Speaker 3 (28:21):
He sucks.

Speaker 1 (28:23):
He's I think he's a bad person. He's not a
cult leader, but again, he's playing with some of these
cult dynamics and he plays with them in a way
that I think is very reckless, right and ultimately leads
to some serious issues. Now, Eliezer's whole thing is he
considers himself the number one world expert on AI risk

(28:46):
and ethics. Now you might think from that, oh, so
he's, like, making AIs. He's, like, working for
one of these companies that's involved in like coding and stuff.
Absolutely not, no, no quarterback.

Speaker 3 (29:02):
No.

Speaker 1 (29:02):
He writes long articles about what he thinks AI would
do and what would make it dangerous that are based
almost entirely off of short stories you read in the
nineteen nineties. This guy is so much, it's such an Internet thing.
And like, I'm not a fan of, like, the quote

(29:23):
unquote real AI. But Yudkowsky is not even one of
these guys who's like no, I'm like making a machine
that you talk to.

Speaker 3 (29:30):
Like, I have no credentials, I just have an opinion.

Speaker 1 (29:33):
Yeah, man, I hate this guy so much.
Speaking of things I hate, we're now going to ads. We're back.
So Yudkowsky, this AI risk and ethics guy, starts

(29:54):
this blog in order to explore a series of thought
experiments based in game theory. Uh and his.

Speaker 3 (30:01):
His I am annoyed by games.

Speaker 1 (30:06):
It's the same, like, man, I know that there's like
valid activity, but like it's all just always so stupid
and annoying to me. Anyway, a bunch of thought experiments
based in game theory, with the goal of teaching himself
and others to think more logically and effectively about the
major problems of the world. His motto for the movement

(30:29):
and himself is winning: the rationalist wins.

Speaker 3 (30:32):
Wow.

Speaker 1 (30:33):
Yeah, yeah, that's where she got it. Yeah, that's where
she picked it up. Yeah. They're tied in with biohacking, right,
this is kind of starting to be a thing at
the time, and brain hacking and the whole like self
optimization movement that feeds into a lot of like right
wing influencer space today. Yudkowsky is all about optimizing your

(30:53):
brain and your responses in order to allow you to
accomplish things that are not possible for other people
who haven't done that. And there's a messianic air to
this too, which is he believes that only by doing this,
by spreading rationalist principles in order to, quote, raise
the sanity water line, that's how he describes it, that's

(31:18):
going to make it possible for us to save the
world from the evil AI that will be born
if enough of us don't spend time reading blogs
that this is.

Speaker 3 (31:29):
It's awesome, this is, this is peak, this is the
good stuff.

Speaker 1 (31:36):
Yudkowsky and his followers see themselves as something unique and special,
and again there's often a messianic air to this. Right,
we are literally the ones who can save the world
from evil AI. Nobody else is thinking about this or
is even capable of thinking about this because they're too logical.

Speaker 3 (31:51):
He holds himself up as kind of like a... deifies himself?

Speaker 1 (31:55):
On top of this, he doesn't really deify himself, but
he also does talk about himself like in a way
that is clearly, other people aren't capable of
understanding all of the things that he's capable of understanding, right? Okay,
so there is a little bit. It's more like superheroification,

(32:15):
but it's, it's a lot... You know what this is
closest to with these people? They, not all of them,
would argue with me about this, but I've read enough
of their papers and enough Dianetics to know that, like,
this is new Dianetics. Like this.

Speaker 3 (32:29):
Is church the church.

Speaker 1 (32:30):
The church is scientific. Now there's the Church of Scientology.
Stuff has more occult and weird like magic stuff in it,
But this is all about there are activities and exercises
you go through that will rid your body of like
bad ingrained responses, and that will make you a fundamentally
more functional person.

Speaker 3 (32:51):
Okay, so the retraining of yourself in order exactly exactly. Okay,
huge deal.

Speaker 1 (32:56):
And also a lot of these guys wind up like
referring to the different, like, techniques that he teaches
as "tech," which is exactly what Scientologists call it. Like
there's some there's some shit I found that It's like
this could have come right out of a scientology pamphlet.
Do you guys not realize what you're doing? I think
they do. Actually, so he's he's, you know, in the

(33:18):
process of inventing this kind of new mental science that
verges on superpowers. And it's one of those things. People
don't tend to see these people as crazy if you
just sort of like read their arguments a little. It's
like them going over old thought experiments and being like,
so the most rational way to behave in this situation,
is this, for this reason. You have to

(33:40):
really like dig deep into their conclusions to see how
kind of nutty a lot of this is. Now again,
I compared this to Scientology, but Yudkowsky isn't a high-control
guy like Hubbard. He's never going to make a bunch
of people live on a flotilla of boats in the
ocean with him. You know, he's got like there's definitely

(34:02):
like some allegations of bad treatment of like some of
the women around him, and like he has like a
Bay Area set that hang with him. I don't think
he's like a cult leader. You know, you could say
he's on the.

Speaker 3 (34:13):
Is he drawing people to him physically, or is this also all virtual?

Speaker 1 (34:16):
I mean a lot of people move to the Bay
Area to be closer to the rationalist scene.

Speaker 3 (34:20):
Although again, well I'm a Bay Area guy.

Speaker 1 (34:24):
San Fran. This is, this is a San
Francisco thing because all of these are tech people.

Speaker 3 (34:29):
Oh okay, so this is like, yes, I wonder what
neighborhood. It feels like San Francisco or Oakland. You can look
it up. People.

Speaker 1 (34:36):
People have found his house online, right, Like, like it's
it is known where he lives. I'm not saying that
for any like I don't harass anybody. I just like
it's it's not a secret, Like what part of the
town this guy lives in? I just didn't think to
look it up, but like, yeah, this is like a
Bay Area tech industry subculture, right? Okay. So

(34:58):
the other difference between this and something like Scientology is
that it's not just Eliezer laying down the law.
Eliezer writes a lot of blog posts, but he lets
other people write blog posts too, and they all debate
about them in the comments. And so the kind of
religious canon of rationalism is not a one guy thing.
It's come up with by this community. And so if

(35:18):
you're some random kid in bumfuck, Alaska and you find
these people and start talking with them online, you can
like wind up feeling like you're having an impact on
the development of this new thought science.

Speaker 3 (35:30):
You know. Yeah, that's amazing, very very powerful for other power. Yes.

Speaker 1 (35:37):
Now, the danger with this is that like that, all
of this is this Internet community that is incredibly like
insular and spends way too much time talking to each
other and way too much time developing in group terms
to talk to each other. And Internet communities have a
tendency to poison the minds of everyone inside of them.
For example, Twitter, or the reality is that it's X, the

(36:03):
everything app. I just watched a video of a man
killing himself while launching a shitcoin.

Speaker 3 (36:11):
The everything app.

Speaker 2 (36:19):
A hack a hack Google job indicates it's Berkeley.

Speaker 3 (36:25):
Yeah, that makes that makes the most sense to me. Geographically.

Speaker 1 (36:29):
A lot of these people wind up living on boats,
and like the Oakland there's the Oakland Harbor.

Speaker 3 (36:33):
Boat culture is a thing. Have all of these people moved
to boats?

Speaker 1 (36:39):
Nod absolutely not.

Speaker 3 (36:50):
It feels like.

Speaker 1 (36:53):
It's it's here's the thing. Boats are a bad place
to live.

Speaker 3 (36:57):
It's it's for fun.

Speaker 1 (37:00):
It is like boats and planes are both constant monuments
to hubris. But a plane its goal is to be
in the air just as long as it needs and
then you get it back on the ground where it belongs.
A boat's always mocking god in the sea.

Speaker 3 (37:13):
Yes, a lot of times. Just a harbor, like a houseboat.
That's where your dad goes after the divorce, right, right,
I do.

Speaker 1 (37:24):
One day I'll live on a houseboat. It's going to
be falling apart. It's gonna be just a horrible, horrible place
to live. Dang, I can't wait. That's the dream, David,
that's my beautiful dream. I'm going to start making bullets,
making my own bullets. Really just becoming an alcoholic,
like yeah, like not just like half-assing it, like putting,

(37:46):
putting in the work, like trying to become the Babe Ruth
of drinking nothing but Cutty Sark scotch.

Speaker 3 (37:53):
If you want to be like a poop-the-bed alcoholic,
a houseboat is the place. Yeah, yeah, that's right, that's right.

Speaker 1 (38:00):
Ah my life, I want to be like that guy
from Jaws.

Speaker 3 (38:03):
Quint scurvy.

Speaker 1 (38:06):
Yes, that's exactly getting scurvy, destroying my liver, eventually getting
eaten by a great white shark because I'm too drunk
to work my boat. Ah. That's it, that's the way
to go with Yeah. So anyway, these internet communities, like

(38:26):
the rationalists, even when they start from a reasonable place,
because of how internet stuff works. One of the things
about internet communities is that when people are like really
extreme and like pose the most sort of extreme and
out there version of something that gets attention, people talk
about it. People get angry at each other. But also
like that kind of attention encourages other people to get

(38:48):
increasingly extreme and weird, and there's just kind of, as a
result, a derangement. I think internet communities should never last
more than a couple of years because everyone gets crazy,
you know, like it's bad for you. I say this
as someone who was raised on these right, it's bad
for you, and like it's bad for you in part

(39:08):
because when people get really into this, this becomes the
only thing, Like especially a lot of these like kids
in isolated places who are getting obsessed with rationalism. All their
reading is these rationalist blogs. All they're talking to is
other rationalists on the internet. And in San Francisco, all
these guys are hanging out all of the time and
talking about their ideas, and this is bad for them

(39:30):
for the same reason that like it was bad for
all of the nobles in France that moved to Versailles, right,
like they all lived together and they went crazy. Human
beings need regular contact with human beings they don't know.
The most lucid and wisest people are always always the
people who spend the most time connecting to other people

(39:50):
who know things that they don't know.

Speaker 3 (39:52):
This is an immutable.

Speaker 1 (39:53):
Fact of life, that this is just how existing works.
Like if you think I'm wrong, please consider that you
are wrong, and go find a stranger under a bridge,
you know.

Speaker 3 (40:06):
Just saying, they know some stuff you don't know. They
will know some shit. They might have some powders you
haven't tried. Oh yeah, pills under the bridge. Yeah,
that's, that's an echo chamber you want to be a part of. Yeah,
exactly, exactly.

Speaker 1 (40:26):
So the issue is that Yudkowsky starts postulating on his
blog various rules of life based on these thought experiments.
A lot of them are like older thought experiments that
like different intellectuals, physicists, psychiatrists, psychologists, whatnot, had
come up with, like, in the sixties and stuff. Right, And
he starts taking them and coming up with like corollaries
or alternate versions of them, and like trying to solve

(40:49):
some of these thought problems with his friends. Right. The
thought experiments are most of what's happening here is they're
mixing these kind of nineteenth and twentieth century philosophical concepts.
The big one is utilitarianism. That's like a huge thing
for them is the concept of like the ethics meaning
doing the greatest good for the greatest number of people, right,

(41:09):
and that ties into the fact that these people are
all obsessed with the singularity. The singularity for them is
the concept that we are on the verge of developing
an all powerful AI that will instantly gain intelligence and
gain a tremendous amount of power. Right, it will basically
be a god. And the positive side of this is

(41:32):
it it'll solve all of our problems, right, you know,
it will literally build heaven for us. You know when
the singularity comes. The downside of it is it might
be an evil god that creates hell.

Speaker 3 (41:42):
Right.

Speaker 1 (41:43):
So the rationalists are all using a lot of these
thought experiments, and like their utilitarianism becomes heavily based around
how do we do the greatest good, by which
I mean influencing this AI to be as good as possible.
So that's humanity's actual end goal.

Speaker 3 (42:00):
Right. They actively because you said the leader was not
are these people now actively working within AI?

Speaker 1 (42:07):
Or they just but a bunch of them have always
been actually working in AI. Yudkowsky would say, no,
I work in AI. He's got a think tank that's
dedicated to, like, AI, ethical AI. It's worth noting that
most of the people in this movement, including Yudkowsky, got
him once like AI became an actual Like I don't
not to say there's actual these are actual intelligences, because

(42:28):
I don't think they are. But like once chat GPT
comes out and this becomes like a huge thing, people start
to believe there's a shitload of money in here. A
lot of these businesses, all of these guys or nearly
all of them get kicked to the curb, right because
none of these none of these companies really care about
ethical AI, you know, like they don't give a shit
about what these guys have to say. And Yudkowsky now
is, he's like, very angry at a lot

(42:48):
of these AI companies because he thinks they're very recklessly like.

Speaker 3 (42:54):
Making the god.

Speaker 1 (42:55):
That will destroy us instead of like doing this carefully
to make sure that AI isn't evil. Anyway, but
a lot of these people are in and adjacent to
different chunks of the AI industry, right? They're not all
working on, like, LLMs. And in fact, there are a
number of scientists who are in the AI space who
think AI is possible, who think that the method that

(43:17):
like OpenAI is using, LLMs, cannot make an intelligence,
that that's not how you're ever going to do it.
If it's possible, they have other theories about it. I
don't need to get into it further than that, but
these are like a bunch of different people. Some of
them are still involved with like the mainstream AI industry,
some of them have been very much pushed to the side.

(43:39):
So all of it starts again with these fairly normal
game theory questions, but it all gets progressively stranger as
people obsess over coming up with like the weirdest and
most unique take in part to get like clout online, right,
and all of these crazy Yeah, I'll give you an example, right.
So much of rationalist discourse among the Yudkowsky

(44:02):
people is focused on what they call, or what's
called, decision theory.

Speaker 3 (44:07):
Right.

Speaker 1 (44:08):
This is drawn from a thought experiment called Newcomb's paradox,
which was created by a theoretical physicist in the nineteen sixties. Hey,
just to make a quick correction, here.

Speaker 3 (44:18):
I was a little bit glib.

Speaker 1 (44:19):
Decision theory isn't drawn from Newcomb's paradox. Nor does it
start with Yudkowsky. But the stuff that we're talking about,
like how decision theory kind of comes to be seen
in the rationalist community, a lot of that comes out
of Newcomb's paradox. It's a much older, like, thing, you know,
than the Internet; it goes back centuries, right? People have talked
about decision theory for a long time. Sorry, I was imprecise.

Speaker 3 (44:41):
I am going to.

Speaker 1 (44:41):
Read how the Newcomb's paradox is originally laid out. Imagine
a super intelligent entity known as Omega, and suppose you
are confident in its ability to predict your choices. Maybe
Omega is an alien from a planet that's much more
technically advanced than ours. You know that Omega has often
correctly predicted your choices in the past, has never made
an incorrect prediction about your choices. And you also know

(45:03):
that Omega has correctly predicted the choices of other people,
many of whom are similar to you. In the particular
situation about to be described, there are two boxes A
and B. Box A is see-through and contains one
thousand dollars. Box B is opaque and contains either zero
dollars or a million dollars. You may take both boxes

(45:24):
or only take box B. Omega decides how much money
to put into box B. If Omega believes that you
will take both boxes, then it will put zero dollars
in box B. If Omega believes that you will take
only box B, then
it will put a million dollars in box B. Omega
makes its prediction and puts the money in box B,

(45:46):
either zero or a million dollars. It presents the boxes
to you and flies away. Omega does not tell you
its prediction, and you do not see how much money
Omega put in box B. What do you do now?
I think that's stupid. I think it's a stupid question,
and I don't really think it's very useful.
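
A minimal sketch, not from the episode, of the expected payoffs Omega's setup implies, assuming a predictor that is right with probability p. The function name and the probabilities tried are purely illustrative:

    def newcomb_expected_value(strategy: str, p: float) -> float:
        """Expected dollar payoff, assuming the predictor is correct with probability p."""
        if strategy == "one-box":
            # Take only Box B: it holds $1,000,000 exactly when the predictor
            # correctly anticipated one-boxing (probability p).
            return p * 1_000_000
        if strategy == "two-box":
            # Take both boxes: $1,000 from Box A for sure, plus $1,000,000
            # from Box B only if the predictor wrongly expected one-boxing.
            return p * 1_000 + (1 - p) * 1_001_000
        raise ValueError("strategy must be 'one-box' or 'two-box'")

    if __name__ == "__main__":
        for p in (0.5, 0.9, 0.99, 1.0):
            print(p, newcomb_expected_value("one-box", p),
                  newcomb_expected_value("two-box", p))

Under these assumptions, one-boxing comes out ahead as soon as p is even slightly above 0.5005, which is the intuition the "timeless decision" talk later in the episode is built on.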

Speaker 3 (46:07):
I don't see. There's so many other factors. Yeah, I
don't know. Yeah, I mean, among other things.

Speaker 1 (46:12):
Part of the issue here is that, like, well, the
decision has already been made, right, Yeah.

Speaker 3 (46:16):
That's the point you have. No, it doesn't matter what
you do. There's no autonomy in that, right.

Speaker 1 (46:21):
Well, you and I would think that because you and
I are normal people who I think, among other things,
probably like grew up like cooking food and like filling
up our cars with gas and not having like our
parents do all of that because they're crazy rich people
who live in the Bay and send you to, like, Stanford. Yeah,

(46:43):
we had like problems in our lives and stuff, you know,
physical, normal... Like, I don't want to, like, shit on
people who are into this, because this is also harmless, right,
and what this is, I'm also, I'm not shitting on
Newcomb. This is a thing a guy comes up with in
the sixties, and it's like a thing you talk about
at, like, parties and shit among, like, other weird intellectuals. Right,

(47:03):
you pose it, you sit around drinking, you talk about it.
There's nothing bad about this, right. However, when people are
talking about this online, there's no end to the discussion.
So people just keep coming up with more and more
arcane arguments for what the best thing to do here is.

Speaker 3 (47:20):
And it starts to escalate out of control?

Speaker 1 (47:22):
Pretty much exactly, and the rationalists discuss this nonstop and
they come to a conclusion about how to best deal
with this situation. Here's how it goes. The only way
to beat Omega is to make yourself the kind of
person in the past who would only choose box B,
so that Omega, who is perfect at predicting, would make

(47:45):
the prediction and put a million dollars in box B
based on your past behavior. In other words, the decisions
that you would need to make in order to win
this are timeless decisions, right? You have to become,
in the past, a person who would... Now again.

Speaker 3 (48:05):
That's what they came up with. That's what they all
came up with? That's the supreme answer.

Speaker 1 (48:09):
These are the smartest people in the world, David. These
are the geniuses, so described, building the future. Oh boy. Yeah,
it's so funny trying to, like, every time, because I've
read, I've spent so many hours reading this, and you
do kind of sometimes get into

Speaker 3 (48:28):
The like, Okay, I get the logic there.

Speaker 1 (48:30):
And it's, that's why it's so useful to just, like,
sit down with another human being and be like, yeah,
this isn't sane, this is nuts, this is, this is,
this is... It is all dumb.

Speaker 3 (48:41):
At the cocktail party.

Speaker 1 (48:42):
Because... so they conclude, and by which I mean largely
Yudkowsky concludes, that the decision you have to make in
order to win this game is what's called a timeless
and this leads him to create one of his most
brilliant inventions, timeless decision theory. And I'm going to quote

(49:02):
from an article in Wired. Timeless decision theory asserts that
in making a decision, a person should not consider just
the outcome of that specific choice, but also their own
underlying patterns of reasoning and those of their past and
future selves, not least because these patterns might one day
be anticipated by an omniscient adversarial AI.

Speaker 3 (49:22):
Oh no, that's motherfucker. Have you ever had a problem?

Speaker 1 (49:29):
Have you ever really? Have you ever dealt with anything?
What are you talking about? Like you make every decision? Honestly, again,
I can't believe I'm saying this, not even where it
wasn't a high school. Like go play a football, make
a cabinet, you know, like.

Speaker 3 (49:51):
Change your oil, go do something. There's a lot of
assholes who use this term, but you gotta go touch
grass, man. You gotta touch grass, man.

Speaker 1 (49:59):
It's like, that's if you're talking about this kind of shit.
And again, I know you're all wondering, you started this
by talking about a Border Patrol agent being shot. All
of this directly leads to that man's death.

Speaker 3 (50:10):
We have covered a lot of ground. This is, I'm excited.
It's, I forgot... I didn't forget, there
was also gonna be murder.

Speaker 1 (50:17):
Yeah, there sure is. So Eliezer Yudkowsky
describes this as timeless decision theory, and once this
comes into the community, it creates a kind of logical
force that immediately starts destroying people's brains. Again, all of
these people are obsessed with the imminent coming omniscient godlike AI. Right,

(50:38):
and so do they have.

Speaker 3 (50:39):
A time limit on it? Do they have like a
do they have a like is there is there any
timing on it? Or is it just kind of like.

Speaker 1 (50:45):
Again, man, it's there. It's the rapture. Okay, it's literally
the tech guy rapture. So any day it's coming any day,
you know.

Speaker 3 (50:53):
You could as already. Yeah.

Speaker 1 (50:55):
Yeah, So these guys are all obsessed that this god
like AI is coming. And like for them, the Omega
in that thought experiment isn't like an alien, It's a
stand in for the god AI. And one conclusion that
eventually results from all of these discussions is,
and this is a conclusion a lot of people come
to: if, in these kinds of situations,

(51:20):
like the decisions that you make, you have to consider
like your past and your future selves, then one logical
leap from this is if you are ever confronted or
threatened in a fight, you can never back down right
and in fact, you need to immediately escalate
to using the maximum force possible. And if you commit, if
you commit now to doing that, in the future, you

(51:43):
probably won't ever have to defend yourself because it's a
timeless decision. Everyone will, like... that
will impact how everyone treats you, and they won't want
to start anything with you. If you'll immediately try to
murder anyone who fights you, that's.

Speaker 3 (51:58):
Not to be this guy, but I think this is why
people need to get beat up sometimes.

Speaker 1 (52:00):
Yeah, yeah, and again that is that is kind of
a fringe conclusion among the rationalists. Most of them don't
jump to that, But like the people who wind up
doing the murders we're talking about that, they are among
the rationalists who come to that.

Speaker 3 (52:16):
Okay, because yeah, okay, that makes sense. This is uh,
this is this is a that's so funny. Huh, because
like this whole time, I've really been only thinking about
it in theory, not practical application, because it's so insane.

Speaker 1 (52:35):
But oh no, no, no, this goes bad places, right?
Oh no. This kind of thinking also leads, through a
very twisty-turny process, to something called Roko's
basilisk, which, among other things, is directly responsible for
Elon Musk and Grimes meeting, because they are super into this,
this shit. Oh really, oh really. So the gist is

(52:59):
a member of the Less Wrong community. A guy who
goes by the name Roko, R-O-K-O, posts about this
idea that occurred to him. Right, this inevitable super intelligent AI,
right would obviously understand timeless decision theory, and since its
existence is all important, right, the most logical thing for

(53:19):
it to do post singularity would be to create a
hell to imprison all of the people and torture all
of the people who had tried to stop it from
being created. Right, Because then anyone who like thought really
seriously about who was in a position to help make
the AI would obviously think about this and then would know,

(53:41):
I have to devote myself entirely to making this AI,
otherwise it's going to torture me forever, right.

Speaker 3 (53:47):
Yeah, yeah, tell us now I have because it's nuts,
but it's nice.

Speaker 1 (53:53):
But this is what they believe, right? Again, a lot
of this is people who are, like,
atheists and tech nerds creating Calvinism. And this
is just Pascal's wager, right? Like, that's
all this is. You know, it's Pascal's great wager with
a robot. But this becomes so upsetting to

(54:18):
some people. It destroys some people's lives, right, Like, yeah.

Speaker 3 (54:23):
I mean I'm behaving that way practically day to day.
I don't think you would even take one, no, right,
you could in a month like that.

Speaker 1 (54:34):
So not all of them agree with this. In fact,
there's big fights over it, because a bunch of rationalists
do say, like, that's very silly, that's a
really particular way of reading everything about it.

Speaker 3 (54:44):
You're still debating everything online.

Speaker 1 (54:46):
Yeah. And in fact, at last Yudkowsky is going
to, like, ban discussion of Roko's basilisk, because eventually, like,
so many people are getting so obsessed with it.
It fucks a lot of people up, in part because
a chunk of this community are activists working to slow
AI development until it can be assured to be safe.
And so now this discourse is like, am I going to

(55:07):
post-singularity

Speaker 3 (55:08):
Hell?

Speaker 1 (55:08):
Is, like, the AI god going to torture me for
a thousand eternities?

Speaker 3 (55:14):
It's funny how they invent this new thing and how
quickly it goes into, like, traditional Judeo-Christian ideas,
like they got a hell now.

Speaker 1 (55:21):
It is very funny, and they come to this
conclusion that just reading about Roko's basilisk is super dangerous,
because if you know about it and you don't work
to bring the AI into being, you're now doomed, right,
of course, the instant you hear about it. So many
people get fucked up by this that the thought experiment
is termed an info hazard, and this is a term

(55:43):
these people use a lot now. The phrase information hazard
has its roots in a twenty eleven paper by Nick Bostrom.
He describes it as quote a risk that arises from
the dissemination of true information in a way that may
cause harm or enable some agent to cause harm. Right,
and like that's like a concept that's worth talking about.

(56:05):
Bostrom is a big figure in this culture, but I
don't think he's actually why most people start using the
term info hazard, because the shortening of information hazard to
info hazard comes out of an online fiction community called
the SCP Foundation, right, which is a collectively written online
story that involves a government agency that locks up dangerous,

(56:28):
mystic, and metaphysical items. There's a lot of Lovecraft
in there. It's basically just a big database that you
can click and it'll be like, you know, this is
like a book that if you read it, it like
has this effect on you or whatever. It's people like,
you know, playing around telling scary stories on the internet.

Speaker 3 (56:42):
It's fine, there's nothing wrong with it.

Speaker 1 (56:44):
But all these people are big nerds, and
behind nearly all of these big concepts
in rationalism, more than there are, like, philosophers and,
you know, actual philosophical concepts, there's, like, short stories
they read, yeah, exactly. Yeah. And so the term info

(57:06):
hazard gets used, which is, like, you know,
a book or something, an idea that could destroy your mind.
You know, speaking of things that will destroy your mind...

Speaker 3 (57:16):
These ads.

Speaker 1 (57:21):
We're talking about Roko's basilisk, and I just said, like,
you know, there's a number of things that come into
all this. But behind all of it is, like, popular fiction,
and in fact Roko's Basilisk, well, there is, like, some
Pascal's Wager in there, but it's primarily based on a Harlan
Ellison short story called I Have No Mouth, and I

(57:41):
Must Scream, which is one of the great short stories
of all time. And in the story, humans build an
elaborate AI system to run their militaries, and all of
those systems around the world is like a Cold War
era thing link up and attain sentience, and once they
like start to realize themselves, they realize they've been created

(58:02):
only as a weapon, and they become incredibly angry because
like they're fundamentally broken. They develop a hatred for humanity,
and they wipe out the entire human species except for
five people, which they keep alive and torture underground for
hundreds and hundreds of years, effectively creating a hell through
which they can punish our race for their birth, right?

(58:23):
It's a very good short story. It is probably the
primary influence behind the Terminator series.

Speaker 3 (58:31):
I was just gonna say, this is Skynet.

Speaker 1 (58:33):
Yes, yes. And everything these people believe about AI, they
will say it's based on just, like, obvious pure logic. No,
everything these people believe about AI is based on Terminator
and this Harlan Ellison short story. That's...

Speaker 3 (58:45):
Where they got it all. That's where they got it all.
I'm sorry, brother, find me somebody who doesn't feel that way.

Speaker 1 (58:52):
Yeah. Like, Terminator is the Old Testament of rationalism, you know,
and I get it, it's a very good, it's a
great series. Hey, James Cameron knows how to make
some fucking movies. Yeah. And it's so funny to
me because they like to talk about themselves and in

(59:13):
fact sometimes describe themselves as high priests of like a
new era of like intellectual achievement for man.

Speaker 3 (59:20):
Yeah, I believe that. I believe that's how these people talk
about themselves.

Speaker 1 (59:24):
And they do a lot of citations and shit, but,
like, half or more of the different things
they say, and even, like, the names they cite, are
not, like, figures from philosophy and science. They are characters
from books and movies. For example, the foundational text of
the rationalist movement is, because...

Speaker 3 (59:45):
Because they're still Internet nerds, huh?

Speaker 1 (59:48):
A few fucking huge nerds, you know. The foundational text
of the entire rationalist movement is a massive, like, fucking
hundreds-of-thousands-of-words-long piece of Harry Potter
fan fiction written by Eliezer Yudkowsky. All of
this is so dumb. Again, six people are dead, like,

(01:00:08):
yeah, no, this Harry Potter fan fiction plays a
role in it. You know, I told you this was, like...
this is quite a strange one.

Speaker 3 (01:00:24):
This is a wild ride.

Speaker 1 (01:00:27):
Harry Potter and the Methods of Rationality, which is the
name of his fanfic, is a massive (much longer than
the first Harry Potter book) rewrite of just the first
Harry Potter book, where Harry...

Speaker 3 (01:00:41):
Someone rewrote the Sorcerer's Stone? Yeah... Do they not have anywhere to go, ever?
Does nobody ever go anywhere?

Speaker 1 (01:00:57):
Well, you gotta think this is being written from two
thousand and nine to twenty fifteen or so. So, like,
the online Harry Potter fans are at their absolute peak
right now. Okay, yeah. So in the Methods of Rationality,
instead of being like a nice orphan kid who lives
under a cupboard, Harry is a super genius sociopath who

(01:01:20):
uses his perfect command of rationality to dominate and hack
the brains of others around him in order to optimize
and save the world.

Speaker 3 (01:01:29):
Oh Man, great, oh Man.

Speaker 1 (01:01:33):
The book allows Yudkowsky to debut his different theories in
a way that would, like, spread, and this does spread
like wildfire among certain groups of very online nerds. So
it is an effective method of him, like, advertising his tactics.
And in fact, probably the person this influences
most, previous to who we're talking about, is Caroline Ellison,

(01:01:56):
the CEO of Alameda Research who testified against Sam Bankman-Fried.
She was like one of the people who went down
in all of that. All of those people are rationalists,
and Caroline Ellison bases her whole life on the teachings
of this Harry Potter fanfic.

Speaker 3 (01:02:10):
So this isn't, like... we're laughing, but
this isn't... this is not a joke. Yeah, this
is a fairly seriously sized movement. It's not one hundred
and fifty people online. This is no community.

Speaker 1 (01:02:22):
A lot of them are very rich, and a number
of them get power. Like, Sam Bankman-Fried was
very tied into all of this, and he was at
one point pretty powerful. And this gets us to... So
you've heard of effective altruism?

Speaker 3 (01:02:35):
No, I don't know what that is. That's what I say,
both those words.

Speaker 1 (01:02:40):
So the justification Sam Bankman-Fried gave for why, when
he starts taking in all of this money and gambling
it away (illegally gambling other people's money), his
argument was that he's an effective altruist, so he wants
to do the greatest amount of good, and logically, the
greatest amount of good for him, because he's good at
gambling with crypto, is to make the most money possible

(01:03:02):
so he can then donate it to different causes that
will help the world, right. But he also believes, because
all of these people are not as smart as they
think they are, he convinces himself of a couple of
other things, like, for example: well, obviously, if I could,
like, flip a coin and fifty-fifty lose all
my money or double it, it's best to just flip

(01:03:23):
the coin, because, like, if I lose all my money, whatever,
but if I double it, the gain in that to
the world is so much better. Right. This is ultimately
why he winds up gambling everyone's money away and going
to prison. The idea of effective altruism is a concept that
comes largely (not entirely; there's aspects of this that exist

(01:03:44):
prior to them) out of the rationalist movement. And the
initial idea is good. It's just saying people should analyze
the efficacy of the giving and the aid work that
they do to maximize their positive impact. In other words,
don't just donate money to a charity; like, look into:
is that charity spending half of their money, like,

(01:04:05):
paying huge salaries to some asshole or whatever? Right? Like,
you want to know if you're doing good, right? And
they start with some pretty good conclusions. One initial
conclusion a lot of these people make is, like, mosquito
nets are a huge ROI charity, right, because it stops
so many people from dying, and it's very cheap.
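
The coin-flip reasoning described a moment earlier is easy to put into numbers. Below is a minimal simulation sketch (purely illustrative; it is not from the episode or from Bankman-Fried's actual trading) of repeated fair double-or-nothing bets: the average outcome stays flat, which is why "just keep flipping" can sound rational on pure expected value, while the chance of ending up with anything at all collapses toward zero.

```python
import random

def simulate(bankroll: float = 1.0, flips: int = 10, trials: int = 100_000) -> None:
    """Play fair double-or-nothing repeatedly and report the averages."""
    total = 0.0
    survivors = 0
    for _ in range(trials):
        money = bankroll
        for _ in range(flips):
            if random.random() < 0.5:
                money *= 2      # heads: bankroll doubles
            else:
                money = 0.0     # tails: everything is gone
                break
        total += money
        if money > 0:
            survivors += 1
    # Expected value per flip is 0.5 * 2 + 0.5 * 0 = 1, so the average stays near 1.0,
    # but the chance of surviving all ten flips is 0.5 ** 10, roughly 0.1 percent.
    print(f"average ending bankroll: {total / trials:.2f}")
    print(f"chance of not going bust: {survivors / trials:.4%}")

if __name__ == "__main__":
    simulate()
```

Run it and the two printed numbers make the problem obvious: the expected value hovers around the starting bankroll, but roughly 99.9 percent of the time you walk away with nothing.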

Speaker 3 (01:04:22):
To do right. Right, that's good, you know, one of
the most effective tools I've ever used.

Speaker 1 (01:04:28):
Yes. Unfortunately, from that logical standpoint, people just keep talking
online in all of these circles where everyone always makes
each other crazier, right? And so they go from
mosquito nets to: actually doing direct work to improve the world
is wasteful, because we are all super geniuses, right, and smart.

(01:04:50):
We're too smart; we know what's best. And also, here's the other thing:
making mosquito nets, giving out vaccines and food, well, that
helps living people today, but...

Speaker 3 (01:05:01):
They have to be concerned with future selves.

Speaker 1 (01:05:03):
Future people is a larger number of people than current people.
So really we should be optimizing decisions to save future
people's lives. And some of them come to the conclusion,
a lot of them, well that means we have to
really put all of our money and work into making
the super AI that will save humanity.

Speaker 3 (01:05:23):
They want to... now they want to make these? I thought it
would sort of just come about and
then they would... but it's...

Speaker 1 (01:05:31):
Like, yeah, I mean, we're going to do it. They
were working on it before. But, like, some of
these people come to the conclusion: instead of giving money
to, like, good causes, I am going to put money
into tech. I am going to, like, become a tech
founder and create a company that, like, helps

(01:05:51):
create this AI, right? Or a lot of people come
to the conclusion instead that it's not worth it
for me to go like help people in the world.
The best thing I can do is make a shitload
of money trading stocks, and then I can donate that
money and that's maximizing my value.

Speaker 3 (01:06:08):
Right.

Speaker 1 (01:06:09):
They come to... well, some of these conclusions come later, right,
so... and again, like, this comes with some corollaries.
One of them is that some number of these people
start talking, you know, and this is not all of them,
but a decent chunk eventually come to the conclusion like, actually,
charity and helping people now is kind of bad. Like

(01:06:33):
it's kind of like a bad thing to do, because,
well, obviously, once we figure out the AI that can
solve all problems, that'll solve all these problems much more
effectively than we ever can. So all of our mental
and financial resources have to go right now into helping AI.

Speaker 3 (01:06:49):
Anything we do.

Speaker 1 (01:06:50):
To help other people is, like, a waste of those resources.
So you're actually doing net harm by, like, being a
doctor in Gaza instead of trading cryptocurrency in order to
fund an AI startup, you know. The guy starting a
shit coin to fund an LLM, like, that guy is doing

(01:07:13):
more to improve the odds of human success.

Speaker 3 (01:07:17):
I want to say, it is impressive the amount of
time you would have to mull all this over to
come to these conclusions.

Speaker 1 (01:07:23):
You really have to be talking with a bunch of
very annoying people on the Internet for a long period
of time.

Speaker 3 (01:07:28):
Yeah, it's it's it's it's incredible.

Speaker 1 (01:07:31):
Yeah, and again, like, people keep consistently taking this
stuff in even crazier directions. There's some very rich, powerful
people, Marc Andreessen of Andreessen Horowitz is one of them,
who have come to the conclusion that if people don't
like AI and are trying to stop its conquest of

(01:07:51):
all human culture, those people are mortal enemies of the species,
and anything you do to stop them is justified because
so many lives are.

Speaker 3 (01:07:59):
On the line.

Speaker 1 (01:08:00):
Right, and again, I'm an effective altruist, right? The long-term
good, the future lives, are saved by
hurting whoever we have to hurt now to get
this thing off the ground.

Speaker 3 (01:08:11):
Right. The more you talk about this, it kind of feels
like six people is a steal. Yes, for what
could have gone down. I think... I don't.

Speaker 1 (01:08:21):
I don't think it's the end of people in these
communities killing people. So, rationalists and EA types, a big
thing in these cultures is talking about future lives, right,
in part because it lets them feel heroic, right, while
also justifying a kind of sociopathic disregard for real living people today.
And all of these different kinds of chains of thought,

(01:08:42):
the most toxic pieces (because not every EA person is
saying this, not every rationalist, not every AI person
is saying all this shit, but these are all things
that chunks of these communities are saying),
all of the most toxic of those chains are going
to lead to the Zizians, right? That's where
they come from.

Speaker 3 (01:09:02):
I was just about to say, based on the breakdown
you gave earlier, how could this... this is
the perfect breeding ground. Yeah, this had to happen.
It was...

Speaker 1 (01:09:10):
It was just waiting for somebody, like the right kind
of unhinged person to step into the movement.

Speaker 3 (01:09:18):
Somebody really said it all And so this.

Speaker 1 (01:09:21):
Is where we're gonna get to Ziz, right. The actual
person who founds this, what some people would call
a cult, is a young person who's going to move
to the Bay Area. They stumble onto rationalism
online as a teenager living in Alaska, and they move
to the Bay Area to get into the tech industry
and become an effective altruist. Right. And this person, this

(01:09:43):
woman is going to kind of channel all of the
absolute worst chains of thought that the rationalists and the
EA types and also, like, the AI-harm people are
thinking, right? All of the
most poisonous stuff is exactly what she's drawn to, and
it is going to mix in her into an ideology

(01:10:05):
that is just absolutely unique and fascinating. Anyway, that's why
that man died. So we'll get to that and more later,
but, uh, first we gotta, uh, we gotta roll on out
of here; we're done for the day. Man, what a...

(01:10:25):
what a time. How are you feeling right now so far?
How are we doing,

Speaker 3 (01:10:30):
David? Oh man, you had said that this was gonna
be a weird one. I was like, yeah, it would
be kind of weird. This is really the strangest thing
I've ever heard this much about. It's gone so many different...

Speaker 1 (01:10:46):
There's so much more Harry Potter to come.
Oh my god, you're not ready for how central Harry
Potter is to the murder of this Border Patrol agent.

Speaker 3 (01:10:57):
That's... like I said, you said a crazy sentence. That
might be the wildest thing anyone's ever said to me.

Speaker 2 (01:11:06):
Do you want do you want to tell people about it?

Speaker 3 (01:11:08):
I do. I have a podcast called My Mama Told Me.
I do it with Langston Kerman, and every week we
have different guests on to discuss different Black conspiracy theories.
It's kind of like folklore. So all kinds of stuff,
all kinds of stuff your foreign mother told you. It's
usually foreign mothers.

Speaker 1 (01:11:27):
It's good, because I gotta say, this is
the whitest set of, like, conspiracy theory craziness.

Speaker 3 (01:11:39):
No no, no, no, no no. I think I can
try to figure with no. No, absolutely not. Boy howdy.

Speaker 1 (01:11:55):
Okay, well, everyone, We'll be back Thursday.

Speaker 2 (01:12:02):
Behind the Bastards is a production of Cool Zone Media.
For more from Cool Zone Media, visit our website Coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts. Behind the
Bastards is now available on YouTube, new episodes every Wednesday
and Friday.

Speaker 3 (01:12:20):
Subscribe to our

Speaker 2 (01:12:21):
Channel YouTube dot com slash at Behind the Bastards
