
April 25, 2025 88 mins
Have you ever read Harry Potter and the Methods of Rationality? Perhaps spent too much money on a self-help workshop seminar? Join us as we talk about Eliezer Yudkowsky and his masterpiece of fiction. Where will this story truly lead us in this tale of rational magic and science? As with our last episode on the topic, trigger warning for some bad mental health.

Thanks for listening and remember to like, rate, review, and email us at: cultscryptidsconspiracies@gmail.com or tweet us at @C3Podcast. We have some of our sources for research here: http://tinyurl.com/CristinaSources

Also check out our Patreon: www.patreon.com/cultscryptidsconspiracies. Thank you to T.J. Shirley for our theme.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:29):
Hello my husband.

Speaker 2 (00:31):
Oh hi Chelsea, that's not how I expect you to
start this episode. Hi.

Speaker 1 (00:36):
Okay, Well I was trying to be sweet but whatever.

Speaker 2 (00:39):
I mean, yeah, you could be sweet, but like usually
it's like Hi Chelsea, Hi Christina, and yeah, you know,
there's like a form to it. Christina always talks about
how there's like an order of operations the way we
do this, and you just knock that out.

Speaker 3 (00:52):
Christina's not here.

Speaker 2 (00:53):
Yeah, that's true, listeners.

Speaker 3 (00:55):
Adult is gone.

Speaker 2 (00:56):
Yes, the adult of the podcast, indeed is gone.

Speaker 1 (01:00):
She's in Japan.

Speaker 3 (01:01):
She's having a good time.

Speaker 2 (01:03):
She's been there for like seven weeks. Jesus, we'll miss her. Yeah,
I haven't seen her.

Speaker 1 (01:07):
He's never coming back.

Speaker 2 (01:08):
Yeah, okay. If, say, we replaced you with Tom Holland on the podcast, who would we replace Christina with?

Speaker 1 (01:15):
I thought, wait, am I the one that's being replaced
by Tom Holland?

Speaker 2 (01:19):
Yeah, Tom Holland replaces you.

Speaker 1 (01:21):
Well, I guess Zendaya, since they're together.

Speaker 3 (01:24):
I don't know, what, do you?

Speaker 2 (01:26):
I wonder what they would talk about in a podcast.
I'm very curious.

Speaker 1 (01:29):
I want to know what they talk about on a
podcast that specifically has these subjects on it.

Speaker 2 (01:33):
Not justies. Yeah, yeah, yeah, yeah.

Speaker 1 (01:35):
I don't like, I don't want them to just talk
on a like that's fine, but I get that in interviews.
I want them just on a podcast talking about weird shit.

Speaker 2 (01:45):
I see it. I mean, yeah, honestly, a lot of people don't think that actors and actresses are, like, into weird shit. But like, if you get them to talk about their hyperfixations, it's so funny to watch. Watching
Henry Cavill just talk about Warhammer is incredibly enjoyable.

Speaker 1 (02:00):
As we were talking about Tom Holland though, I do
have to say I found a video that someone posted
so Zendaya apparently brought him to like a family gathering in Oakland, and I'm just like, oh, this poor white British boy dropped in the middle of Oakland, and the look on his face is like he's discovered culture.

Speaker 2 (02:19):
It's very different.

Speaker 3 (02:20):
Yeah, I forgot, I forget.

Speaker 1 (02:21):
He's British, He's British and Zendaya's from Oakland.

Speaker 3 (02:24):
That I love that.

Speaker 1 (02:26):
That makes me proud to be an East Bay native.

Speaker 3 (02:29):
But this is culture.

Speaker 2 (02:30):
Cryptic Conspiracy is the podcast we're talking about.

Speaker 3 (02:32):
Those in the fake one.

Speaker 2 (02:33):
Yeah, well, we're not Christina. Okay, Christina is going to hear this and she's going to be like, how dare they not even make an attempt? But I'm just like, I don't know what you want me to say. This is cult cryptic conspiracies. It will never not be cult character conspiracies. We're talking about cults, cryptids, conspiracies, religious miscellanea, weird funky people in history, and we have a
good time of it. It's topics that we like to
talk about, and I'm really really into it. We are

(02:57):
coming up on four hundred episodes, which I know is not... Jesus, we are. That's not the eight-hundred mark, by the way, but that is, that is like the four-hundred mark. I'm sorry, the eight-year mark. Jesus Christ. Are you good? It's been a long workday for both.

Speaker 1 (03:13):
Of us, So you know what, I'm just gonna put
you out of your misery. Uh, there are updates. There are some bummers today.

Speaker 2 (03:27):
You know what, better that you let me go through
this than Christina? Go ahead, Yeah, that's fine.

Speaker 1 (03:32):
So first off, good, I mean, I don't know how
to how to categorize this. It's a good thing.

Speaker 3 (03:38):
It's a good thing, but it shouldn't.

Speaker 1 (03:39):
Have happened in the first place. We talked on the
podcast, Mal will insert the episode here.

Speaker 2 (03:44):
And that would be episode...

Speaker 1 (03:47):
I don't know where we talked about Lori Vallow, Lori Vallow Daybell, and her second, third husband or whatever, who ran afoul of a Mormon doomsday prepper cult and ended up murdering her ex-husband and then

Speaker 3 (04:05):
Both her children.

Speaker 2 (04:06):
I don't remember this.

Speaker 1 (04:07):
Okay, this was a while ago, but she was found... the jury, an Arizona jury, found Lori Vallow Daybell guilty of conspiring to murder her estranged husband. So she was found
guilty on the murder charge for her husband. I believe
she already was found guilty on the murder charge for
her children.

Speaker 3 (04:24):
Yeah.

Speaker 1 (04:25):
Yeah. And then our dear friend Kristen sent us a
news report about how UK scientists are aiming to dim
sunlight to fight climate change.

Speaker 2 (04:37):
I don't know if that's I get. I mean, I
guess that will help.

Speaker 3 (04:43):
Well, I don't know.

Speaker 2 (04:44):
I'm not sure.

Speaker 1 (04:45):
It feels very Futurama. It does, because in Futurama they
were fighting climate change. Well, there was the one where they
fought it by just dropping increasingly larger ice blocks into
the ocean. But then there was another time where they're
just like, what if we move the earth?

Speaker 3 (05:00):
Like what was.

Speaker 1 (05:01):
It five five feet to the left or whatever.

Speaker 2 (05:06):
I don't remember. I don't remember Futurama.
is that show still on?

Speaker 1 (05:10):
It's like back, I don't really know in what capacity?

Speaker 2 (05:13):
Back everybody?

Speaker 3 (05:15):
Yeah, but I remember there.

Speaker 1 (05:17):
Being some problems with the original voice cast or something.
I don't know. I haven't watched the new stuff.

Speaker 2 (05:25):
Well, I'll try to... sorry, it's been a lot of a workday. I'll try to put in the, I'll try to put in the episode for Lori Vallow.

Speaker 1 (05:32):
Yeah, Lori Vallow Daybell.

Speaker 2 (05:34):
I'll look into that. But other than that, well, you
do have a podcast. I'm very excited about the other
half of your topic. I've already edited and heard it.
And I also wasn't here for it. You were. But a very interesting, very, very... I like how it opens
up the possibility for more topics later.

Speaker 3 (05:48):
So it does.

Speaker 2 (05:50):
Unless there's anything else you have to say, babe, that
was it.

Speaker 1 (05:54):
That was all I have. Hang on, let me just
check real quick, because we do have.

Speaker 2 (05:57):
It wasn't really an invitation. It was more like a transit...
It was like a transition to the next part of
the podcast.

Speaker 3 (06:04):
Bad.

Speaker 1 (06:04):
I'm just looking at what our listeners have posted on
on our Discord server and in what context well, because
sometimes they post updates there too. I see, I see
someone said someone accidentally sent a thing in our oh

(06:27):
because the Pope died?

Speaker 2 (06:29):
Oh right, you were... what was it? You showed me a meme about the GTA. Yeah? What GTA
games were launched during which Pope's reign?

Speaker 3 (06:39):
Yeah, and there were none. The last one was during the last Pope.

Speaker 2 (06:42):
It's just so weird to think about.

Speaker 1 (06:45):
But keynote posted in the server but posted in the
wrong part of the server because we have a parlor room.
And he said, or they said, I'm two for two now.
I made a meme document about Kissinger not being dead, and the very next day he kicked it. And Saturday I made one about the Pope dying next week. Uh,
and they were like, hey, I'm sorry, I meant to
post this in general not here, and I said, no, wait,

(07:06):
I need a romantasy Death Note now. But yeah, the Pope died. JD Vance killed him.

Speaker 2 (07:12):
Wait, like literally? Or... well, so.

Speaker 1 (07:15):
What happened was, JD Vance... like, the Pope came out and said that JD Vance is a fucking idiot. Obviously he didn't say that specifically, but he came out against JD Vance, saying, like, what you're doing is... because JD Vance is a reformed Catholic, I don't know what the fuck he is.

Speaker 2 (07:32):
Going on... Wait, the pillow guy? Huh, is he the pillow guy?

Speaker 3 (07:35):
JD Vance is our vice president.

Speaker 2 (07:37):
Oh so is he the pillow guy?

Speaker 1 (07:39):
No, he's the one that fucked a couch.

Speaker 2 (07:41):
Ah okay, sorry, thank you. I was trying to remember
which part of furniture he was into.

Speaker 1 (07:45):
Yeah, and uh. Allegedly he went to meet with the Pope.
He actually met with like the Pope's number two on Saturday.

Speaker 2 (07:54):
What's Pope's number two?

Speaker 3 (07:55):
Car? I don't remember that guy's name.

Speaker 1 (07:57):
I don't remember Junior, no Supreme cart, I don't remember
door at anyway. And then I guess he threw a
tantrum about it. So on Easter Sunday he met with
the Pope, and the Pope then told him like hey, personally,
like you're you are not doing right by Christ. Oh hell,

(08:19):
and then like twelve hours later the Pope was dead.

Speaker 2 (08:22):
So, so his final shot across the bow was, you know what, you're not doing right by Christ. And then
he says, peace out. Christ came to pick me up
in a limo and then he's out of here.

Speaker 1 (08:31):
Yeah, exactly.

Speaker 2 (08:32):
Wow, that's so funny.

Speaker 1 (08:34):
So yeah, well JD. Vance killed the pope.

Speaker 2 (08:36):
I mean, the pope is old.

Speaker 1 (08:38):
The pope was old, he had pneumonia, he had some
health problems over the last month. But JD Vance killed the Pope.

Speaker 2 (08:43):
How many popes have been assassinated? Not many, right, we.

Speaker 1 (08:47):
Talked about that on this podcast. But I don't because
there was an assassination attempt and that's how the popemobile happened. Right.

Speaker 2 (08:54):
No, I don't mean attempts like actual assassination.

Speaker 1 (08:56):
I don't know, none of modern times as far as
I know.

Speaker 2 (08:58):
Because a lot of popes die of like pneumonia sickness
at old age.

Speaker 3 (09:02):
Which they're all fucking old.

Speaker 2 (09:03):
Well, I mean, that's a way to go naturally. That's when you just get sick and your body can't defend against it. But anyway, the Pope's dead. Let's
get to the podcast.

Speaker 1 (09:09):
Let's get to the podcast. Let's get to part two
of how Harry Potter fan fiction led to six people
being murdered.

Speaker 2 (09:14):
But first, let's have a word from our sponsors.

Speaker 3 (09:25):
We're back at it again at the Krispy Kreme. We're back at it again. So, where we last week left off.

Speaker 1 (09:31):
Yes, we were just talking about Harry Potter fan fiction.

Speaker 3 (09:35):
We were talking about... what's the dude's name again? It's Eliezer.

Speaker 1 (09:39):
Wait, what's his last name?

Speaker 3 (09:41):
I remember Eliezer.

Speaker 1 (09:43):
Yeah, it's Eliezer Yudkowsky.

Speaker 3 (09:46):
Eliezer Yudkowsky, who is a big proponent of rationalism, has a big fear of artificial intelligence, and is also a Harry Potter fan who writes a multi-novel-length fan fiction to talk
about his special interests in rationality.

Speaker 1 (10:06):
Not only did he write a multi novel length fan fiction,
but it is also one of the most popular Harry
Potter fan fictions to ever exist.

Speaker 3 (10:12):
I wonder why that is. Yeah, I don't know. It's
one of those things where like, is it just heavily advertised?
Did he get people to because the thing is what
you said, is it's the third most reviewed?

Speaker 1 (10:24):
Yes?

Speaker 3 (10:25):
But is it the third most read? I don't know.

Speaker 1 (10:28):
It's a different question... fan fiction, because this was originally published on FanFiction.net. AO3 has...

Speaker 3 (10:32):
View counts? It does, it gives you, like, reading stats. Well, it's also like there's different metrics, because on AO3
there's the concept of kudos. And for those of you
who are not in the fandom space, we're talking about
the fan fiction website Archive of Our Own, which is a forum similar to, like... it's not like

(10:53):
a blogging place like LiveJournal was in the past,
but it is a forum for authors. You don't have
to write fan fiction. It could be any fiction; people write original fiction on it.

Speaker 1 (11:03):
They do that on dot net too, which is kind of crazy.

Speaker 3 (11:06):
And I think that the mainstream ones at this moment
in time for fan fiction on the internet are Archive of Our Own and Wattpad. Surprisingly, we've, yeah, looked at Wattpad

Speaker 1 (11:16):
Before just because I was curious.

Speaker 3 (11:17):
I couldn't figure it out. Wattpad apparently is really
heavily pushing independent like original works.

Speaker 1 (11:23):
It's also very like for the fan fiction works, it's
very, like, self-insert, like, you and your favorite character.

Speaker 3 (11:28):
And you kind yeah yeah, yeah, yeah, which I'm not into.
I'm also not I don't want to be in the story.

Speaker 1 (11:34):
Yeah. I Also, I've talked in this podcast before how
I hate second person writing. That's true, Yeah, that's true.
I don't like this.

Speaker 3 (11:41):
Yeah, that's very fair. So when discussing this, we're talking
about these online forums. Fanfiction dot net is a place
where people could go make an account and then write
stories published under their name on this account.

Speaker 1 (11:54):
It used to be the biggest fan fiction website. Some
stuff happened, yeah, in the two thousands that started its decline,
and then AO three overtook it in the twenty tens.

Speaker 3 (12:05):
Yes, Hugo Award-winning Archive of Our Own, yes, which
is fantastic, but yes. So on AO three there are
multiple metrics for how you would rate a story because
it gives you some statistics like it will show you
the word count, it'll show you the chapter count. But
it will also show you kudos, which is essentially like

(12:25):
how many people liked this?

Speaker 1 (12:27):
Yeah, And you can leave kudos as a guest or
as a signed-in user.

Speaker 3 (12:31):
Yeah. You don't need to have an account. You need
to have an account to leave a review, I believe.

Speaker 1 (12:36):
Usually. Sometimes you can, actually, on AO3.

Speaker 3 (12:39):
Yeah, oh I.

Speaker 1 (12:39):
Can see it, so that you can you can leave
an it, but that's like an option.

Speaker 3 (12:42):
Anonymous people can leave a comment, yeah. Oh interesting, okay,
the more you know. But it's similar to like replying
to something on social media versus just liking it. Yeah,
like leaving a comment versus just liking it. And oftentimes
it is in much like other social media, the amount
of comments something gets the amount of reviews something gets

(13:04):
is not always indicative of it being good, because sometimes
those reviews or comments can be angry. Yes. Or negative.

Speaker 1 (13:12):
So I don't know. Uh, I don't because I didn't Again,
I didn't read it, so I didn't really look it up.
It's the third thing that pops up when you look
up rationality. I'm not sure about its view count, or... because it's
also now been cross posted to several places.

Speaker 3 (13:27):
Right, and it is probably this point a little bit
infamous as well.

Speaker 1 (13:30):
Yeah, so I don't know.

Speaker 3 (13:32):
I know it has a Goodreads page, doesn't it?
It does?

Speaker 1 (13:35):
Uh, there are quite a few popular fan fictions that
have Goodreads pages, though. Goodreads is a wild West
to me, and Goodreads is interesting. I'm off Goodreads now because.

Speaker 3 (13:43):
It's owned by Amazon. I'm on what are we on now? Storybook.
I don't trust Goodreads because a lot of the books
that we read for par were rated very highly on that.
That's true, and I'm like, what are you all smoking?

Speaker 1 (13:54):
StoryGraph is the one I'm using now. But I will say, as far as Goodreads goes, when I
was using it, there were some books that I read
that I rated highly because I'm like... and I would put in my review, listen, I'm not saying this book
is good, but.

Speaker 3 (14:05):
You did enjoy it. But I enjoyed it, and that's
I feel like valid to say. Yeah.

Speaker 1 (14:09):
So I think that's what we were running into with
good Reads.

Speaker 3 (14:13):
When we look up the reviews for things. Some people
put their full chest out there, you know, Yeah, They're like, no,
this is the best work ever, and I'm like, I
think that you need to see more of the world.

Speaker 1 (14:24):
There's a website called romance dot io, which is just
for romance books. Okay, A Court of Mist and Fury is
one of the top reviewed books on that website, which
I guess isn't super shocking.

Speaker 3 (14:36):
Top like the highest reviewed, highest rating, as opposed to
like having the most reviews. Yeah, highest rated.

Speaker 1 (14:42):
Well, it's like the the ratio of like how many
reviews there are and how many of them are good?

Speaker 3 (14:49):
Yes, yes, is yes, and like that's what I meant
to say, more of like the stars out of five
versus yeah, so this many people had things to say
about it, because that's the thing. Just because your art
moves people to speak on it, doesn't mean that they're
speaking positively about your art. That's true. That's my whole thing.

Speaker 1 (15:06):
So most popular, actually, no, I take that back, most
popular romance books.

Speaker 3 (15:10):
It's number one. Oh dang, guys, humanity, we can do better.
I liked it, and that's my favorite one of the series.
Would you say it's the best romance novel you've ever read?

Speaker 1 (15:21):
That I read? Though, Now, okay, we can do better,
but I can understand why it's number one?

Speaker 3 (15:26):
Well, I say that. Would I say that? You know, like,
I'm not... Pride and Prejudice is number four. Here's the thing, here's

Speaker 2 (15:34):
The thing.

Speaker 3 (15:36):
I personally, I've read Pride and Prejudice. I like Pride
and Prejudice. I understand why it is a staple of
the romance genre. Again, is it the greatest romance novel
I've ever read? I don't think so.

Speaker 1 (15:48):
What's the greatest romance novel you've ever read?

Speaker 3 (15:50):
Honestly, off the top of my hat, I couldn't tell you, yeah,
because there's not a lot of standouts in my brain
where I'm like, this is the greatest thing ever. Because
I also often don't read romance novels. Just what part
of the paranormal would have you think... Well, you

Speaker 1 (16:02):
Do often read romance novels, but they're.

Speaker 3 (16:04):
Not They're not my genre of choice. It's not where
I would live if given the choice. But I also
think that there is something to be said again about
infamy in a sense where if something is known for something,
it is more likely to be read and thus known
and continue. It's like a snowball going down a mountain.
You know it's gonna keep getting bigger and collecting more

(16:25):
snow because it's got that momentum. And I think Pride
and Prejudice is like that. Do I think it is.
I'm not saying it doesn't warrant the good vibes, the
good acclaim. I think it does. But if people are
saying that it's in the top five of all time,
that's fair. Do I think it's not as good as
Mist and... whatever, Mist and Fury? No? I think it was.

(16:49):
It's more romantic than Mist and Fury. I feel like I
can say that confidently.

Speaker 1 (16:55):
I feel like you haven't read Pride and Prejudice, okay,
but you know the story. I know the story.

Speaker 3 (17:00):
We all know this story. There's been so many different
variations of Pride and Prejudice that we've all absorbed through our lives. Yeah, yeah, yeah, I own it. I just
haven't read it.

Speaker 1 (17:09):
But yeah, anyway, that's a completely different tangent that we.

Speaker 3 (17:12):
Just went on. My apologies. That's fine, I went somewhere.

Speaker 1 (17:16):
I'm gonna start this second part of the topic off
with another quote from another article, and this one's from
Vice. Okay. And it's The Rationality Workshop That Teaches People to Think More Like Computers, by Louise Matsakis, okay, from twenty sixteen. Melissa Beswick, a research coordinator at the
University of Pennsylvania and one of my closest friends, has

(17:37):
tried for years to force herself into the habit of
swimming laps. There have been times when I've tried to
swim consistently, but it's only lasted a couple weeks, never more,
she told me. Finally that's changed. Beswick now swims two
to three times a week, and she's confident she can
stick with it. She credits her newfound motivation, at least
in part, due to a curious trip she took out
to the Bay Area last fall. She flew across the
country to spend five days in a cramped hotel with

(17:58):
about three dozen others, mostly young tech workers, all there
to attend a workshop hosted by the Center for Applied
Rationality, or CFAR. As far as self-help seminars go, CFAR is definitely unique. Instead of invoking spirituality or pointing toward a miracle cure, the organization emphasizes thinking of your brain as a kind of computer. Okay. Throughout the workshop,
participants and facilitators described their thinking patterns using programming and

(18:21):
AI terms. Beswick said, one of the first things we
had to do is create a bugs list, Beswick told me,
meaning a list of personal thinking errors. The exercise was
a nod towards the term programmers used to refer to
problems in computer code. We had all noticed, in
different ways, in different contexts, that being smart, and being
well educated, and even being really well intentioned was far
from a guarantee against making what turned out to be

(18:43):
really stupid decisions. The names of the classes even are
sometimes derived from tech terms. One class, dubbed Propagating Urges, comes from the machine learning term backpropagation. CFAR likes to
ask if something either human or AI were to make
a perfectly rational choice, what would that look like? Okay,
this is another deep dive into the workshops themselves. Now

(19:05):
I told you that in this episode I would tell
you about why I think rationality is in fact a cult.

Speaker 3 (19:10):
Yes.

Speaker 1 (19:10):
Now, I don't think Eliezer, however, has the charisma
be a cult leader. I think he thinks very highly
of himself. I think he has a lot of ego.
I think for the most part, this whole thing is
sort of spiraled out of his own control. Right.

Speaker 3 (19:24):
Well, I mean, he's also not in charge of CFAR, correct. The Center for Applied Rationality is the spinoff, yes, which has spun off as a nonprofit.

Speaker 1 (19:33):
Well, there's Machine... it's the Machine Intelligence Research Institute, so MIRI, okay,
but I feel like he renamed it to something else,
but I can't remember what it is, so we'll just call.

Speaker 3 (19:43):
It, MIRI. Okay.

Speaker 1 (19:45):
One of the things that has that has definitely like
started and like grown more and more over the years,
especially after you know, twenty fifteen, the fanfic ends, so
we're past the fanfic portion of this journey. Eliezer's getting more and more into, like, going on a
lot of podcasts still to this day about the dangers
of AI, and we're not again, we're not talking about

(20:06):
generative AI. He's talking literal, like, Terminator-style AI.

Speaker 3 (20:11):
The concept of a machine consciousness. Yeah, a computer which
has sorry, choking, a computer which has the capability of
making decisions and calculations and things much more quickly than
we can, but which does not have the additional like
you know, empathy and all of that the humanity within

(20:34):
it to prevent it from killing us, all I guess, yes, Yeah.

Speaker 1 (20:40):
Basically, what Eliezer is saying is that we are imminently bound for a future in which AI that humans create will become sufficiently intelligent to then be independently intelligent, apart from humans, and kind of decide that humans aren't
worth it.

Speaker 3 (21:00):
Very Ultron. It's an interesting thing. There's a concept of it.
Might be easier to talk about it in the sense
of artificial intelligence versus artificial consciousness, okay, because having knowledge
is not necessarily the same as like having personhood or

(21:22):
like the understanding of how to apply that knowledge, and
almost in a similar say, like in D and D
wisdom versus intelligence kind of thing. But in this case,
artificial intelligence is basically just like a database that is
making statistical calculations. It's guessing at the future based on
what it has been told of the past.

Speaker 1 (21:41):
This is an artificial consciousness that he is talking about.

Speaker 3 (21:44):
Correct, which is a thing that is able to make
judgments and have independent thoughts different from what it has
been shown.

Speaker 1 (21:53):
Yes, so one of the things that he talks about
is one of the reasons that we need to look
out for artificial consciousness. I guess which again everyone was
talking about it like Terminator, but in my head, as
the Marvel fangirl that I've always been, it very much
is very Ultron, because, again, it also
talks a lot about like the machines taking over and
deciding that humanity isn't worth it.

Speaker 3 (22:14):
Or like I, Robot, or very also Matrix-y. That's true. Yeah, yeah,
there's a lot of dystopian Yeah, that kind of like
dystopian artificial intelligence. Like there is even the movie AI,
which is kind of different. It's it's different, it's different,
but it's it is. There's a lot of fiction that
explores these ideas.

Speaker 1 (22:32):
Yeah, so where we start getting, where we start diverging
from just rationality as a concept of like try and
only make decisions based on the like proof that's in
front of you in rational thought into what it then
becomes is there's this whole thing of like we have
to look out for artificial intelligence. Again, he's saying

(22:53):
it's artificial intelligence, sure. So that's how I'm going to refer
to it, because that's how he refers to it.

Speaker 3 (22:58):
That is the term for it. It's artificial intelligence. I
only propose the term artificial consciousness as a way of
for the purpose of this conversation. Yeah, kind of highlighting
the difference between the idea of like what they're doomsday
speculating versus like the reality of it. Yeah. Yeah, but yes,
you know, everyone calls it artificial intelligence.

Speaker 1 (23:17):
That's the term, correct. He says. One of the problems
is that all of human like development currently is kind
of like computer and online based. So there was one
podcast where he kind of goes off on a tangent
about how if an AI is able to take control
and has its own consciousness, it can create viruses that

(23:38):
spread across the human race and wipe us out or
control us.

Speaker 3 (23:42):
Yeah, there's a whole thing where he talks about, like, literal non-computer viruses, because he's referring to, not a computer virus, a biological virus. Yes.

Speaker 1 (23:50):
Yeah, there's also a whole thing where he talks about
brain chips being implanted, which unfortunately is the real thing,
because Elon Musk has decided it's a real thing. Don't
do it.

Speaker 3 (24:00):
They're trying, but they've failed. Like, they've killed a lot of chimps. Yeah, they've killed a lot of apes. It's
it's not something that is being successfully done well, with
the exception of like there's like cochlear implants and stuff
like that.

Speaker 1 (24:14):
Sure, which has its own controversy surrounding it.

Speaker 3 (24:17):
That's its own thing. It's a different thing than they're
thinking about like computer chips in your brain to make
you smarter, better, or whatever. Anyway.

Speaker 1 (24:23):
Yeah. Yeah, so this has all come around to... obviously this is the crux of Eliezer's whole work. So, like, he started rationality, or he started, like, the modern version of rationality. He, he popularized a modern version of rationality, yes, which is the version that we're talking about,
and he used that as a medium to then promote

(24:47):
his blog, which is very anti AI. And now we're
at this point where it has now evolved into, well,
if we need to combat AI, then we need to
have a set of people who are smarter that can
then control the AI.

Speaker 3 (25:03):
When it when or if it happens.

Speaker 1 (25:05):
Okay, yeah, so now we're getting into these are the
smart people in the world, and that includes like people
like Elon Musk and Peter Teel are smarter than us
and therefore they deserve to have the wealth that they
have because they're the ones that are going to save us.

Speaker 3 (25:21):
Oh yeah, he's sucking tech bro dick.

Speaker 1 (25:24):
I don't think he likes Elon Musk anymore. There's quite a few tech bros that, like, he's fallen out of favor with since generative
AI has become a thing because those companies have decided
to embrace the use of generative.

Speaker 3 (25:39):
AI, which he's which is.

Speaker 1 (25:42):
Something I can agree with him on because fuck that too. Sure, Honestly,
for sure, I'm glad he's consistent in that at least
because he also is like, hey, this is dangerous, and
he's not wrong. Now, where he thinks it ends up
and why it's dangerous. We differ on those opinions, but
we can at least both agree.

Speaker 3 (25:59):
That AI is dangerous.

Speaker 1 (26:00):
Yes, so his whole thing is like, we need to
promote people who have the intelligence to really put forth
either creating the AI that will make it so that
the god machine that we make will be good instead
of evil, or you need to be doing something to

(26:22):
promote those people.

Speaker 3 (26:23):
And actually that.

Speaker 1 (26:25):
Was his whole what is it called, I can't remember
the term, but it was something altruism essentially. Okay, so
we give money to charity, we give money, we give
food food, well you should give money to food banks,
not food, but like we give money to these organizations
that help the disenfranchise and people around us in our community. Personally,

(26:48):
I think it's better to do things within that smaller
community scale. And it has been proven time and time
again that when you do donations and volunteer work with
a smaller community kind.

Speaker 3 (26:59):
Of scale, well there ends up.

Speaker 1 (27:01):
Being more good done because you're doing things like you're
targeting a smaller group of people, and therefore you're able
to actually help more people because you're not stretching yourself thin.

Speaker 3 (27:10):
It's a higher impact, it's a more efficient impact. Yes, yeah, okay,
makes sense.

Speaker 1 (27:15):
However, the rationalism movement. The people who are currently within
it would disagree. One of those people is actually Sam
Bankman-Fried, who, if you don't remember, Sam Bankman-Fried,
was one of the people who ended up bankrupting a
shit ton of people from crypto. He created a Ponzi
scheme from crypto.

Speaker 3 (27:36):
Yes, he created... this was the thing that blew up relatively recently because of his online crypto wallet service. Basically, he was... was he not, like, embezzling from it or something? Yeah, he was, like, stealing money, basically. Yes. Also, crypto as a concept is a scam.

Speaker 1 (27:53):
Yeah. Sam Bankman-Fried is, was, I mean, he's still alive, in jail, a rationalist, and his rationalist thought process with what he did was essentially, well, I as
myself can do more good doing this ponzi scheme and
getting more money because in the future I can then

(28:15):
donate more money to charity. Right, And that is the
kind of mindset that is now prevailing amongst the rationalism community,
the idea of like, I'm smarter than you, and therefore
I deserve this resource because I'm going to do more
good in the future. And there's this whole thing where
it's like instead of doing things to help people now,

(28:36):
you're doing things to set up because it's like, well,
if I help people now, that's a smaller group of
people than people in the future.

Speaker 3 (28:42):
Right, Therefore, I am going to stop donating to charity
now and I'm just going to focus on amassing as
much wealth as possible in order to donate that money
in the future. Which as a concept, because this is
the problem with rationalism as an idea is that you
can rationalize anything, yep, and people do like all the time.
That's the whole concept of, like, frigging, on a lighthearted bent, girl math, which was just kind of like, oh,

(29:05):
I can rationalize buying this because if you think about it,
it's actually free if you think about this, that whole
thing which is funny and silly and that's lighthearted, as
opposed to like this instance where it's like I'll rationalize
stealing from people because I can use that money better
than they can. But the problem is that they are not.
They're kind of telling on themselves with that a

(29:26):
little bit, because the point of the rationalist movement is
supposed to be you don't do anything that has not
been proven. All of your thoughts and all of your
actions are supposed to be based off of proven fact.
How do you know how have you proven that you
are smarter than these people? Like just that you think
that you are from your own perception, where's the proof

(29:48):
of that? And then also, how do you prove that
you're going to do a better job than these other
people will? But also like why are you saving money
to donate more later? You could get hit by a bus?
Like how do you know that in the future you're
going to be able to do better? Like there's it's
a bunch of this irrational thought process that they are ignoring,
and that is kind of why.

Speaker 1 (30:09):
Yeah, I will say one of the things that they
say proves that they're smarter is that they have more
Like that's like a self I know.

Speaker 3 (30:15):
We both know that stupid babies are born with money. Yep,
there are babies. There are infant children who get born
with a lot of money because their parents.

Speaker 1 (30:23):
Had a lot of money.

Speaker 3 (30:24):
There are rulers of nations that are born into a
lot of money but are in fact very dumb. Yeah,
owning money does not mean that you were smart.

Speaker 1 (30:35):
We you and I both know that. Yeah, But to
these people, Elon Musk is God because he's the richest man.

Speaker 3 (30:42):
But it's mostly because he started with a lot of money. Yeah,
I know, Yeah, I know, I'm saying it more. Yeah,
we're on the same page. We're on the same You
and I are holding hands looking over this abyss being like.

Speaker 1 (30:52):
Why why so in a move that is not going
to shock. After I've explained this aspect that we're going
to delve more into too, I have a couple trigger
warnings for this episode. I have a trigger warning for suicide,
for sexual assault, and for murder.

Speaker 3 (31:10):
Roll that stinger. We're going to Bummersville, yep.

Speaker 1 (31:19):
Now, I don't want to delve too much into the Zizians,
who are the ones that committed the murders, right, but
I will like kind of lightly speak on them at
the end of this episode because obviously that's going to
be a topic for a different time, because there's a
lot to talk about that, yes, especially with Zizz herself.
But just know that the Zizians are an offshoot of

(31:43):
rationality because they did not think that the rationality movement
went far enough to save humanity. Because it's this again
whole thing where it's like, we have to save humanity
from the AI in order to do so, we need
to amass the smartest people in order to take control, right,
and which again gives people like Sam Bankman Freed the

(32:04):
rational thought that it's like, well, then I deserve the
money because I'm going to do the best with it.

Speaker 3 (32:09):
Now more like the rationalization. The rationalization yeah, yeah, yeah.

Speaker 1 (32:13):
One of the things that ends up happening when you
are deciding that either you are part of the solution,
like as far as within the rationalism movement itself, you
are either part of the solution, so you were actively
one of the people working on this god AI that
is going to be essential, that is going to be good,

(32:33):
or you are part of the support, which is could
be anything from financial it's mostly financial. It could be financial,
or it could be some other kind of support like
like working or whatever. Okay, Now, Ziz was a rationalist
and actually did attend workshops. At one such workshop, she

(32:55):
approached someone who was running the either they were the
person who was running it, or they were one of
the people who.

Speaker 3 (33:02):
Were like in charge an organizer a number of staff,
and asked that person if they are doing a net
good or net negative, because at this point you have
people in the rationalism movement, justifying their existence by saying
either they are a net good or a net negative,

(33:24):
meaning that their existence right is either a net good
or a net negative.

Speaker 1 (33:28):
Yeah, Ziz approached someone at this conference, again, one of
the rationalism conferences that we were just talking about that
costs like four thousand dollars a weekend, and asked a
person if they thought she was a net negative or
a net positive, and that person told her she was
a net negative. And then she said, well, I'm going

(33:48):
to get four other people at this conference to tell
me if they think I am a net positive or
a net negative, and if the general consensus is that
I'm negative, I'm going to leave the rationalism movement and
like moved to Seattle or or something like Portland or
something like that, which would have probably been the better outcome,
because that's not what happened, right. Ziz is a very

(34:11):
intelligent human being. Was actually like getting a lot of
experience within the tech community from a pretty young age,
because I think she's about my age, Okay, so when
a lot of this was happening, when a lot of
this started, she was in her like mid twenties. Okay.
She was told that like because she was so smart,
there were organizers within this conference or within this organization

(34:34):
that were like, hey, she's someone who could possibly make
a lot of money later, so we have to grab
her now so that she can give us her money.

Speaker 3 (34:43):
Okay, yeah, yeah, okay, yep.

Speaker 1 (34:47):
So that is like Ziz's origin story in like a
very small nutshell sure as far as the rationalism community goes,
because she started there and then snowballed into where we have.

Speaker 3 (34:57):
She's having her own cult, yes, killing people, Yeah yeah.

Speaker 1 (35:00):
And that's sort of like it's so important to understand
how rationalism has since evolved and how we've gotten to
this point because the reason those six people are dead
is literally like one of them was a land was
their landlord. There's only two that have been proven, and
it's the landlord and the border patrol agent. But the
other four are like members of her cults, like parents

(35:24):
who weren't super supportive, and their whole thing was like, Hey,
if we kill your parents, you'll get their life insurance
and then you can give their life insurance money to me, right,
because their whole thing was like we are net positive,
we are net good, and therefore we matter more as
human beings than these other people who are not helping

(35:46):
advance civilization to the point where we can what's the
word, positively impact the future god AI.

Speaker 3 (35:53):
And it's one of those things again where you're dehumanizing people.
You're reducing people down to their good or negative by
your own judgment to the grander world.

Speaker 1 (36:04):
So obviously rationalism the mainstream does not advocate for murdering people. No,
but rationalism does have a problem with a spate of
suicides because one of the things that was debated I
don't know if it was at a rationalism conference or
on a message board somewhere, but it's within some mainstream
rationalism or like event or some forum, some forum, some

(36:30):
some larger forum of where rationalists conversation happened where people
are asking, like, hey, if I am not actively contributing
to the development of this god Ai or actively contributing
to make sure that the evil AI doesn't happen, then
what is my worth? Am I worth more dead, because I

(36:50):
can give my life insurance money to the cause. Now
I don't know if that ever actually happened, but there
have been people within the rationalism movement that have committed
suicide because their reasoning was that they are not providing
a net good for the future of humanity, and therefore
they killed themselves. Right, And that is actually something that

(37:13):
has happened quite a bit. I don't have the hard numbers,
but it happening even a couple of times is more than enough.

Speaker 3 (37:19):
No more than even once is more than it should
be happening, Yes, for sure.

Speaker 1 (37:24):
And now obviously you've already got people who are deciding
that they aren't good.

Speaker 3 (37:29):
Enough and committing suicide.

Speaker 1 (37:31):
On the other end of that coin, you have people
who have decided they are the end all be all
of who's going to save civilization, and therefore they can
get away with whatever they want. Yeah, that's how we
get into the sexual assault allegations.

Speaker 3 (37:42):
Oof.

Speaker 1 (37:43):
So this was actually from the Less Wrong blog. There's
a lot of newer nuances within the rationalism community that
I'm not actually sure if Eliezer is fully, like, into. Like I said, he doesn't really have the charisma
of a cult leader. The whole thing is kind of
spiraled out of his control. This all started with him,

(38:03):
but I don't think any of it is necessarily under
his purview at this point.

Speaker 3 (38:08):
With this particular branch of rationalists.

Speaker 1 (38:11):
Yes, yeah, this is what it's evolved into. But I
don't necessarily think he had a hand in evolving it
up to there. I think the only way into which
he had a hand is one he started this new
movement right and made it popular with Harry Potter fan fiction,
and then.

Speaker 3 (38:27):
He popularized it.

Speaker 1 (38:28):
He popularized it, and then he also brought AI into
the conversation. Okay, and through those two things we have
now gotten to this point. So this is this is
actually from the LessWrong blog, and it's called Abuse in LessWrong and Rationalist Communities in Bloomberg News. So this was not a blog post written by Eliezer himself.
This was from one of the other contributors who was

(38:49):
just going by a username called whistleblower sixty seven. Okay,
so a few quotes from the Bloomberg article. At the
same time, she started to pick up weird vibes. One
rationalist man introduced her to another as perfect rat bait, rat as in rationalist. She heard stories of sexual misconduct involving
male leaders in the scene, but when she asked around

(39:11):
her peers waved the allegations off as minor character flaws,
unimportant when measured against the threat of an AI apocalypse. Eventually,
she began dating an AI researcher in the community. She
alleges that he committed sexual misconduct against her, and she
filed a report with the San Francisco Police. Like many
women in her position, she asked that the man not
be named to shield herself from possible retaliation. Her allegations

(39:33):
polarized the community, she says, and people questioned her mental
health as a way to discredit her. Eventually, she moved
to Canada, where she's continuing her work in AI and
trying to foster a healthier research environment. Of the subgroups
in this scene, effective altruism. That was the word I
was trying to find. Okay, effective altruism had by far
the most mainstream cachet and billionaire donors behind it, so

(39:53):
that shift meant real money and acceptance. In twenty sixteen,
Holden Karnofsky, then the co-chief executive officer of Open Philanthropy, an EA nonprofit funded by Facebook co-founder Dustin Moskovitz,
wrote a blog post explaining his new zeal to prevent
AI doomsday. In the following years, Open Philanthropy's grants for longtermist causes rose from two million in twenty fifteen to more than one hundred million. In twenty
(40:16):
twenty fifteen to more than one hundred million. In twenty
twenty one, Open Philanthropy gave seven point seven million to MIRI, which is the... I said it earlier, it's the Machine Intelligence Research Institute, which was, which is Eliezer's, it's

Speaker 3 (40:31):
Non profit, it's his brain child. Yet, okay, think tank
is that the word thinktake? I think is better? Yeah? Okay.

Speaker 1 (40:37):
So they gave seven point seven million to MIRI in twenty nineteen, and Buterin, Vitalik Buterin, gave five million worth of cash and crypto, but other individual donors were soon dwarfed by Bankman-Fried, a longtime EA who created the
crypto trading platform FTX and became a billionaire in twenty
twenty one. Before Bankman-Fried's fortune evaporated last year, he'd convened a group of leading EAs to run his one

(40:59):
hundred million dollar a year Future Fund for longtermist causes. Even leading EAs

Speaker 3 (41:04):
Have doubts about the shift towards AI.

Speaker 1 (41:07):
Larissa Hesketh-Rowe, chief operating officer at Leverage Research and the former CEO of the Centre for Effective Altruism, says that she was never clear how someone could tell their work was making AI safer. When high-status people in the community said AI risk was a vital research area, others deferred, she says. No one thinks it explicitly, but you'll be drawn to agree with the people who, if you agree with them,

Speaker 3 (41:26):
You'll be in the cool kids group. She says.

Speaker 1 (41:28):
If you didn't get it, you weren't smart enough, or
you weren't good enough. Hesketh-Rowe, who left her job in twenty nineteen, has since become disillusioned with EA
and believes the community is engaged.

Speaker 3 (41:37):
In a kind of herd mentality.

Speaker 1 (41:39):
So a lot of these men who are then seen
as you know, the most important minds, sure are able
to get away with anything. There is another story from
a woman where this man basically was saying, like
there's no reason why a forty year old man cannot
date and sleep with a.

Speaker 3 (41:56):
Twelve year old girl.

Speaker 1 (41:58):
Gross yep ugh. Also a lot of stories of people
at these conferences, young women at these conferences basically being
given to older men who are seen as pillars of
the community, as like a prize, like you are clearly.

Speaker 3 (42:12):
One of our smarter thinkers. So you get this.

Speaker 1 (42:15):
Young hot woman, yeah, to be with you, right, as a prize for that. It's also gone into... there was a woman
who worked with Eliezer at his nonprofit, and I don't
remember her name, but it was within the last like
five years. She was a rationalist, again working for Eliezer

(42:36):
at Less Wrong, I believe, either Less Wrong or MIRI.
She ended up being hospitalized after having a mental breakdown
after believing that she had been destroying the world with
her demonic powers. Oh, because she had then become convinced
that she was one of the net negatives. Yeah, there
was probably drugs involved.

Speaker 3 (42:56):
There's a lot of mental illness involved. That sounds like yeah. Yeah.

Speaker 1 (43:00):
If you go to the website for these workshops where
you have to again pay like four thousand dollars, although
there's probably some that you can get into for free,
I imagine they're not all costing four thousand dollars because
it's like with the Scientology thing where it's like the
first one's free.

Speaker 3 (43:14):
Sure. And also, if that was, like, a corporate workshop... so I'm sure that there was, yeah, probably some kind of markup for that, yeah, I imagine.

Speaker 1 (43:21):
So on the website it actually says like if you
are someone who is prone to anxiety or depression, you
should not come because they're going to be talking about
the doomsday AI, and if you're going to be part
of the solution, then your mental illness cannot be a
factor within it.

Speaker 3 (43:40):
You are a burden. Ableist. Yep. Okay. But also,
honestly though, I would agree with that whole thing if
you had just stopped at We're gonna be talking about
freaking the AI doomsday. So if you have anxiety, you
shouldn't come here. If they had just.

Speaker 1 (43:53):
Stopped there, I'd be like, agreed, I don't wanna hear about
that shit.

Speaker 3 (43:59):
Yeah, No, this is gonna be something that spirals you
into a deeper anxiety pit that's not helpful for anybody.
But the idea of like, so you are clearly a
burden because of this, is like, that's ableist as hell.
All right?

Speaker 1 (44:10):
Yeah, I mean a lot of this is ableist as hell
because you got to think there's probably not a whole
lot of people that are disabled within this community, or
if they are, they're seen as a net negative because
it's like you're a burden on society, right unless you
unless you're using your brain, unless you're like Stephen Hawking,
who's using their brain to like actively help the advancement
of AI towards a net good, then you are not.

Speaker 3 (44:31):
Then you are a burden and you.

Speaker 1 (44:32):
Are not helpful, and you'd probably and it's like they
don't necessarily they're not necessarily saying like kill yourself outright,
but it's heavily implied that, like maybe you'd be better
if you were dead.

Speaker 3 (44:44):
Gross.

Speaker 1 (44:45):
Yeah, yeah, so yeah, there's a lot of that happening.
There's a lot of rationalizing the people who are higher
up in the organization sexually assaulting others. And it's like
not only that, but for the women especially who would
then come out and be like, hey, this is fucked

(45:06):
up and like report it, they would immediately be blacklisted
from the tech community, since the tech community is largely
within this movement, right, because they're like, well, you're not
pulling your weight, like you just reported someone who was
very high up and important.

Speaker 3 (45:21):
So clearly that's negative and you can no longer get
a job, right.

Speaker 1 (45:26):
Yeah, so yeah, that's actually not the sexual assault, but
that is what happened to Ziz.

Speaker 3 (45:33):
Ziz ended up.

Speaker 1 (45:34):
Being employed at a company and she this is this
is a problem with tech in general.

Speaker 3 (45:40):
This isn't just a rationalism thing or.

Speaker 1 (45:42):
A rationalism thing, but it's definitely like rooted within that.
Ziz got a job at a company and then left
after eight hours, and her boss fired her because she
wasn't putting in enough work even though it's like you're
only paid for eight hours, they're expecting you to put
in unpaid overtime.

Speaker 3 (45:58):
Right.

Speaker 1 (45:58):
Not only was she fired, she was blacklisted.

Speaker 3 (46:01):
That's that's the toxic like exploitative environment. Yes, like the
Silicon Valley exploitation, where they again they don't think of
other people as human beings. Yeah, and they don't care
about other people's humanity. Yeah, that's the problem.

Speaker 2 (46:18):
Yeah.

Speaker 1 (46:19):
So as far as the sexual assault and the suicides go,
that is still an ongoing problem. And obviously, like even
with the Zizians that just happened in January, that is
something that is still ongoing. And that is a reason
why I don't necessarily want to get super into the
Zizians in this episode, because that's going.

Speaker 3 (46:38):
To be a whole other thing. Yeah.

Speaker 1 (46:40):
But the thing that you do have to understand is that the Zizians, all of them, did start within the rationalism movement. And actually, one of their first things... they do not themselves call themselves Zizians. That was a name given to them by another rationalist. Because what happened is, Ziz and her friends, her followers, ended up protesting at a rationalism conference because they said

(47:06):
that the rationalists that were attending this conference were not doing enough to stop the AI doomsday. Now that you have that information about the rationalists, how do you
feel about them being called a cult?

Speaker 3 (47:18):
No. I still don't think they're a cult. And this is why. Yes, okay, because when I think of a cult, when we're talking about the classical definition of a cult, as we've discussed on this podcast, it does require, like, a centralized character, a centralized figure that is being followed. I will agree that there are some cult-like components of this. For instance,

(47:39):
they do have this like doomsday esque ideology where they
are following this thought of like there is either going
to be this evil god AI that's going to kill us all, or they need to develop the good god AI,
which that whole idea I think is so stupid, where
it's like do you even freaking understand what AI is?

Speaker 1 (47:57):
No, no. Again, Eliezer is the one who is technically a central figure, but isn't really. But he's not even a cult leader, yeah, because there's a lot of people who are in this space who don't care about him or, like, follow him. Well, the thing, well,
it's like so complicated because like he's he's no longer

(48:18):
necessarily controlling the narrative or where the movement goes. But
they're still they're still doing things based off of his
description of like what AI is going to do, because
he's still the one going out there saying like the
Doom's Day is coming, right.

Speaker 3 (48:31):
But just because they are taking inspiration from things he said,
doesn't mean that he is again in charge.

Speaker 1 (48:36):
He's a figurehead, but he's not.

Speaker 3 (48:39):
It's more that he's not a cult leader. It's more in the sense of, like, they are referencing his works, they're
referencing his body of work, yeah, which includes podcasts that
he still records today, Yeah, and blog posts that he
still writes. He's a very influential figure in the movement. Yes,
and I agree again, I do agree that there are
cultish elements to it in the sense that there are
there is an internal hierarchy clearly, and there are doomsday

(49:03):
thought processes. There are there's emotional manipulation and there is
this kind of like level of control. So what I
think of There's a phrase that I have heard more
frequently in reference to cults, because when you think of cult,
you think explicitly religious, But there are other groups like,
for instance, NXIVM, Mal, insert that episode here, and that

(49:25):
would be episode fifty, which is not religious based, but
it is considered a cult, and frequently people refer to
it with the phrase high control group because this is
a group that the members are very highly regulated, to
the point where it's like we tell you where to eat,

(49:46):
we tell you what to eat, we tell you where
to live, we tell you what to wear, we can
control every single part of your life, and there is
a centralized figure or maybe a couple, because sometimes there's
like a you know, it's a it's a one or
two figureheads of a group. There are the clearly defined
leaders who pick what the ideology is, who control the

(50:08):
lives of all of their followers to a certain extent.
And if you came to me and said, hey, actually
there is a little pocket within this rationalist movement where
that does exist, I'd be like, okay, yeah, that's a cult.

Speaker 1 (50:20):
Well, because I'm even talking about the sexual assault stuff
especially I think.

Speaker 3 (50:25):
Because the thing about it is that seems to me
just more of like an abusive community of silence, okay,
because that the whole idea of like these guys are
so amazing, like they they're the best, and they can
get away with murder. Like that ideology for sure is
being taught to all of these people. But again,

(50:50):
this is not something where these specific people are in charge, in a sense, because by their own ideology, anybody who rolls up with money is suddenly smarter. Yes, so these people don't have that kind of hold, especially as long as somebody is giving money to them, exactly, where it's like, anybody

(51:11):
who rolls up with money and influence. There are not, like, specific people who have the quote-unquote divine right to be in charge above all others. And I can go so far as to say they are cultish, okay. But because they are not centralized, I would say that they're not a cult, okay.

(51:33):
But also, they do have, like, an in-group, out-group thing, in the sense that there are people who are good or not bad, but their definition of that seems to be fairly subjective.

Speaker 1 (51:44):
Well, yeah, they don't think it's subjective, but yeah, we can say from the outside it definitely is. And Ziz ended up taking that way farther. Yeah, but again, we're not necessarily talking about that at the moment.

Speaker 3 (51:57):
And for sure, when we're talking about the Zizians, I'm a lot more comfortable saying that is a cult, because we have one centralized figure who is making the ideology and making the rules for control for the rest of the group, which is Ziz herself. Whereas within the rationalist movement we're mentioning Eliezer, but he is not the

(52:18):
person who is in charge of all of this stuff happening. He's, like I said, influential, just as you said, but he is not the one who's at the top of the food chain. And it doesn't appear that the people these rationalists are idolizing, like, they're talking about Peter Thiel, talking about Elon Musk, maybe those are not the figures at the top anymore, but those guys are not necessarily.

Speaker 1 (52:42):
I mean, until recently, Sam Bankman-Fried. I think the only reason he's not is because he is in jail.

Speaker 3 (52:47):
For sure. So, Sam Bankman-Fried would be, like, the exception to that, where he is being idolized within the group but is also actually in the group. Yes. Where, like, I don't think that at any point these tech billionaires were actually in it, he definitely was, okay, but, like, for instance, Mark Zuckerberg, yes,

(53:10):
was he in it?

Speaker 1 (53:12):
Far as I know, he did attend conferences and then
like his co founder was definitely in it and was
doing a co founder was in it for sure.

Speaker 3 (53:19):
Yeah, yeah. But, I don't know, again, in a similar way, there's a lot of overlap with things like Scientology in the way that the money part of it works. Yeah. But because it's not centralized, that's what has me being like, it doesn't feel like it's a cult.

Speaker 1 (53:34):
Okay, I disagree, okay, just because of all of these ideas that they're putting into people and the way that they treat those within their circle. Even though there isn't a centralized figure, I think almost every other box ticks.

Speaker 3 (53:48):
Off. When it comes to, like, well, when it comes to how they get people in, do you have any idea of, like, how they indoctrinate people into

Speaker 1 (53:58):
This? Well, like I said, a lot of it's literature, and it's not just the fan fiction, although a lot of people are still coming into it from the fan fiction. Actually, there's a lot of other works that are considered, like, rationalist works that are mostly science fiction stuff. And the thing is, a lot of it ends up coming to, like, I think it's

(54:19):
really funny, because both Strange Aeons and Robert Evans from Behind the Bastards pointed out that it seems to be a lot of, like, cringy teenagers finding this on the Internet when they're searching for something else, possibly, and then it kind of snowballing from there. So a lot of it seems to be from people researching, like, being edgy and researching it, and then.

Speaker 3 (54:40):
Or, like, in some way being disenfranchised. Again, we've talked about that, where it's like, it's a counterculture group in the sense that they view themselves as different from the mainstream, or view themselves as superior to the mainstream, but maybe they don't view themselves as, like, being persecuted. Which is the other thing that often differs with cults is

(55:01):
that cults create an in-group and out-group, often through persecution and the idea of being persecuted, where it's like, they will try and wrong us somehow. Yeah, which it doesn't sound like this group is doing.

Speaker 1 (55:18):
It's not necessarily that these groups will try and wrong us. It's more that, like, these groups, these people,

Speaker 3 (55:23):
are a net negative, and therefore they're going to hinder us from stopping the AI doomsday. Right. But are they being isolated from their friends and family, for instance?

Speaker 1 (55:32):
Within mainstream rationalism, I don't think so. Okay. Within the offshoots, like the Zizians, oh, for sure, yes, yes, and I don't argue that.

Speaker 3 (55:42):
Yeah, we're both heavily in agreement there.

Speaker 1 (55:43):
Yeah.

Speaker 3 (55:44):
Yeah. And that's another thing where it's like, there's no centralized figure. The followers of it aren't necessarily being isolated.

Speaker 1 (55:51):
Not being isolated, but I feel like there's sort of a self-inflicted isolation, because they're like, I'm so much smarter than you, why am I even associating with you?

Speaker 3 (55:59):
True, but they're not being discouraged from, you know, talking to their mom. As far as I know, that's not happening. Yeah. So I would argue, for me, because the points that you've made I think are really valid, and when you're saying, like, I view it as a cult, I can't say that you're completely wrong. In my opinion, I would say it's more borderline. I wouldn't

(56:21):
really call it a cult, okay, but I would say that, again, there are definitely, like, cultish things in here. Yeah. And I can see how this spun out into a cult, because, yeah, there are some really toxic mindsets, and this whole ideology of valuing people by whether they have a net positive or negative benefit to your specific project

(56:44):
about a thing that you are sure is going to happen. But also, like, by what, again, what proof do you have of this happening? Because a bunch of people like Isaac Asimov wrote stories that scared you? Like, what's up?

Speaker 1 (56:59):
Like, well, someone was doing it positively, but I find it a negative, comparing Harry Potter and the Methods of Rationality to Ayn Rand. Oh dang, okay. It's like, because, and it's like, oh, that's true, but you're thinking of it as a positive, and I hear it as a negative, because their whole thing was like, oh, they're basically using the characters as, like, a vehicle to be

(57:23):
talking about their philosophy, which is the same thing that Ayn Rand did. And I'm like, yeah. And she said, I think you're right, I think it is Ayn.

Speaker 3 (57:33):
Yeah, but no, but yes, that whole idea, it's just a lot of dehumanizing crap. It's a lot of reducing people down to, like, cause and effect, yeah, but also really subjectively doing so, yeah, and telling yourself you're right. Yeah. So here's more.

Speaker 1 (57:49):
From LessWrong, which is quoting from the Bloomberg article about it. This is where the story about the worker, so this is, I have her name now. In extreme pockets of the rationality community, researchers believed their apocalypse-related stress was contributing to psychotic breaks. MIRI employee Jessica Taylor had a job that sometimes involved imagining extreme AI torture scenarios.

(58:10):
As she described it in a post on LessWrong. Jesus. And I don't remember what the name of this story or this book is, but apparently there is a book where, like, AI ends up, I think it was one of the, what's it called, precursors or, like, influences, that's the word I'm looking for,

(58:31):
of, I think, Terminator, where basically what happens is there is an AI that does gain sentience, and there's a bunch of them, and they decide that humanity is no longer worth it after everything that humanity has forced

Speaker 3 (58:45):
The machines to do.

Speaker 1 (58:47):
So they kill every human except for five, which they keep alive just to torture. Why? To pay for all of the harm that they've done to the machines.

Speaker 3 (58:56):
Now here's the thing.

Speaker 2 (58:57):
Here's the thing.

Speaker 3 (58:58):
Though, you are saying, you're arguing, that AIs are these super-rational computers and we need to be thinking like them. That thought process is inherently irrational. If we're saying that an AI is achieving intelligence and consciousness, and it is able to think very logically and thus outsmart and kill all the humans, then saying we are going to keep five humans alive to pay for the crimes of all the

(59:19):
others is so stupid and makes no sense, because in what way does that do that? It does not

Speaker 1 (59:27):
in fact accomplish the goal that you're saying. Again, this is nothing against the story; it's everything against these people now being like, ah, but that could be true, and therefore we have this employee whose entire function is to think up torture scenarios.

Speaker 3 (59:40):
But it's like, why? That doesn't make sense. So that you can preplan how the robots won't torture people? Yeah, but the thing is, why would the robot want to do that to begin with? That doesn't even freaking make sense.

Speaker 1 (59:51):
So at work, she says, she and a small team
of researchers believed we might make God, but we might
mess up and destroy everything. In twenty seventeen, she was
hospitalized for three weeks with delusions that she was intrinsically
evil and had destroyed significant parts of the world with
my demonic.

Speaker 3 (01:00:05):
Powers, she wrote in her post.

Speaker 1 (01:00:07):
Although she acknowledged taking psychedelics for therapeutic reasons, she also attributed the delusions to her job's blurring of nightmare scenarios and real life. Yeah. In an ordinary patient, having fantasies about being the devil is considered megalomania, she wrote. Here, the idea naturally followed from my day-to-day social environment. It was central to my psychotic breakdown. Taylor's experience wasn't

(01:00:27):
an isolated incident. It encapsulates the cultural motifs of some rationalists, who often gathered around MIRI or CFAR, lived together, and obsessively pushed the edges of societal norms, truth, and even conscious thought. I do love that on Behind the Bastards, Robert Evans is like, I think a lot of this ends up happening because whenever you get people with these kinds of cult-like mentalities and

(01:00:48):
you put them all living together, that's.

Speaker 3 (01:00:50):
The psychotic echo chamber.

Speaker 1 (01:00:51):
Yeah, and the problem being Bay Area rental markets because they.

Speaker 3 (01:00:56):
Have to live together because they can't afford the rent. Otherwise,
this is this is what's happening. Is that this is
what happens when you don't have rent control. Cults, colts
happen when you don't have rent control because you force
all these people to live together, and then you get
an echo chamber and then suddenly people are like, I'm
the devil. Yeap.

Speaker 1 (01:01:13):
So they referred to outsiders as normies and NPCs, or non-player characters, as in the tertiary townsfolk in a video game who have only a couple of things to say and don't feature in the plot. At house parties, they spent time debugging each other, engaging in a confrontational style of interrogation that would supposedly yield more rational thoughts.

Speaker 3 (01:01:29):
Hey, what does that sound like? That sounds a lot like NXIVM. Honestly, though, I'd say Synanon on that too. No, like, this part does sound super culty. Yeah, yeah, no, I'll give you that.

Speaker 1 (01:01:39):
Sometimes, to probe further, they experimented with psychedelics and tried jailbreaking their minds to crack open their consciousness and make them more influential, or agentic. Several people in Taylor's sphere had similar psychotic episodes. One died by suicide in twenty eighteen and another in twenty twenty one.

Speaker 3 (01:01:55):
Jesus.

Speaker 1 (01:01:56):
Within the group, there was an unspoken sense of being the chosen people, smart enough to see the truth and save the world, of being cosmically significant, says Qiaochu Yuan, a former rationalist. Yuan started hanging out with the rationalists in twenty thirteen as a math PhD candidate at the University of California at Berkeley. Once he started sincerely entertaining the idea that AI could wipe out humanity in twenty years,

(01:02:18):
he dropped out of school, abandoned the idea of retirement planning,
and drifted away from old friends who weren't dedicating their
every waking moment to averting global annihilation.

Speaker 3 (01:02:25):
Okay, now this is sounding more isolating. Yeah, and not one that is self-inflicted. Yeah.

Speaker 1 (01:02:30):
Quote, you can really manipulate people into doing all sorts
of crazy stuff. You can convince them that this is
how you can help prevent the end of the world.
Once you get into that frame, it really disorients your
ability to care about anything else. There's a whole lot
of stories that are very similar to this, but if
I tried reading them all to you, we would be
here forever. But I think you have a sense of
like how the rationality movement has since evolved and why

(01:02:52):
I believe it is now very much a cult.

Speaker 3 (01:02:54):
I mean, I'm drifting more towards you the more you're talking about this part of it, where it's like, yeah, now we're explicitly getting, like, here's an example of a person. There's still no one centralized figure, sure, which is another part of the borderline thing. Yeah, but I am starting to agree with you more, because of that whole ideal of, like, no, you need to devote all of your attention to this. And apparently

(01:03:17):
there is now a culture of isolation.

Speaker 1 (01:03:19):
There's a culture of isolation too. And I don't mean to be doing this, because I absolutely, one hundred percent agree it's a cult. But to more of your point before, it is self-inflicted. It's not like the group is saying you must isolate yourselves. They're doing it to themselves because they believe it's for the greater good, to prevent doomsday.

Speaker 3 (01:03:34):
But to that point, also to your credit, yeah, why do they think that? Yeah, because they're being told that. Yes. And just because they're not explicitly saying you can't talk to your mom doesn't mean that somebody didn't say, oh, is your mom, like, helping the cause? Yeah, it doesn't seem like she's helping the cause.

Speaker 1 (01:03:50):
That's where the whole thing comes from, because this started, yeah, in a sort of non-cult-like way of being like, other people are NPCs, yes. And we talked about that in the last episode even, yeah. And this is just kind of going to show that, like, all these things, it's silly to think of, like, this started with Harry Potter fan fiction and all this stuff, like bringing up NPCs in the fan fiction itself.

(01:04:14):
But the problem is that the trajectory of these ideas has now led us to this point where people are committing suicide, right, and six people are dead because they, well, more than six people, but six people were murdered.

Speaker 3 (01:04:25):
Because they have been told to view themselves as not a net benefit to the quote-unquote cause, which is greater than themselves. Yes. Or they have been taught that they are so great that their actions, the ends justify the means kind of thing, it doesn't matter what you do to other people, because your end accomplishment will

(01:04:45):
be so beneficial.

Speaker 1 (01:04:47):
Now, like I said, there was a rationalist who wrote, I think, on LessWrong about the Zizians, and that's how they got their name, basically putting it out there that they are former rationalists, they aren't part of the movement anymore, and kind of putting the word out there to try and warn other rationalists against joining them. Okay. But when it comes down to it, like, yeah, you can say that this specific group is way more toxic,

(01:05:09):
and you're right. But they got to that idea, they got to the ideas they got to, because of the toxic ideas within your mainstream movement itself.

Speaker 3 (01:05:17):
And that's a worthwhile thing to, like, even though the mainstream movement has not gone that far, it's still worthwhile to investigate: okay, well, what was it about us that they took that got them there?

Speaker 1 (01:05:30):
And not only that, but there are people within the mainstream movement who are committing suicide. This isn't a Zizian thing, this is a rationalism thing.

Speaker 3 (01:05:37):
Well, I will say, because it's not like, you know, only rationalists kill themselves, sadly. But again, anytime that happens, it is worthwhile to investigate the reason why. And if part of it was because, yeah, because this person learned to devalue their own life, then maybe that's a bad thing. Maybe we shouldn't be telling people that.

Speaker 1 (01:05:57):
Yeah, one day we'll get more into the Zizians.

Speaker 3 (01:06:01):
But that is not this day.

Speaker 1 (01:06:03):
That is just the framework of how we got to the point where six people have been murdered. Started with Harry Potter fan fiction, ended with six people being murdered: the rationalism movement. What's your takeaway, Christina?

Speaker 3 (01:06:14):
I mean, my takeaway is similar to last time. Sort of at the beginning of this episode, I think I said that humans can rationalize anything. Mm hmm. It's one of those things where, if you put yourself in a position of superiority, then you start thinking that you're right about stuff, and it's worthwhile to acknowledge that, like, you may be better at some things, but you're worse
(01:06:36):
at others. And it's this whole concept of, like, when you're thinking about, oh god, what's the phrase, it's in, for instance, Star Wars: only a Sith deals in absolutes.

Speaker 1 (01:06:48):
Kind of thing. That's really funny you bring that up, because, and again, I'm not getting too into the Zizians, but Ziz refers to herself as, like, a rationalist Sith, because she's an absolutist.

Speaker 3 (01:06:59):
Yes, it's like, it doesn't matter what bad things I do, because the good things I do outweigh it. It's like, I am a doctor who does heart transplants, so it's fine if I kill a guy, like, for fun. I should be allowed to kill a guy because I've saved, like, six lives, so I should get one for free, five lives in the balance. It's like, hi, that doesn't,

(01:07:20):
that's not how the math works.

Speaker 1 (01:07:22):
It's Sam Bankman-Fried saying, I'm allowed to embezzle from my company because I'm going to help prevent

Speaker 3 (01:07:27):
The AI doomsday.

Speaker 1 (01:07:28):
Fuck these people who lose all of their savings.

Speaker 3 (01:07:31):
And then he doesn't even do it, because now he's in prison. Yeah. Idiot. Stupid. He did, before he went to prison, donate money. He donated money. But again, we don't even know if that's gonna actually do anything, because, hey, guess what, they're stupid and they don't know how AI works. AI is a machine that we are teaching, and generative

(01:07:51):
AI can't even do things right.

Speaker 1 (01:07:54):
It can't do things without stealing from other people.

Speaker 3 (01:07:57):
Well, that's the thing. Like, even the things that it does are not necessarily correct. And even if it's doing things, it doesn't know it's doing them, because it doesn't have a brain. It doesn't have a mind. You are basically telling a calculator, give me a random number, and sometimes it's the number you're thinking of, and then you're saying, oh, the calculator can read my mind. That's
(01:08:18):
not how that works. Or like a magician cold reading, exactly. Or, it's not even that, because it's like the freaking, uh, psychic that is in the box. What's it called, the mechanical one? Zoltar, I think? Mindfreak? No, there's a literal, there's a mechanical, like, psychic, who

(01:08:40):
wears a turban, yeah. I just remember him from Big. He's in Big. But that's a real thing, like, I know, I know who you're talking about. There's a series of animatronic psychics, and it's something like that. Yeah, it's something like that, where it pops out a randomly determined fortune for you

(01:09:02):
that is worded vaguely enough that it can apply to your life. And that is what AI is, yeah, except that you can get a little bit more specific about it, because you can say, hey, I want a fortune that says these things, and it will be like, okay, here is a word salad that I don't actually understand, but it has those words in it, and probably it makes sense.

(01:09:27):
But it doesn't actually know anything. It doesn't actually have a mind. It only knows as much as we tell it, and it doesn't even understand the stuff we tell it. It's just spitting out things that we've already said.

Speaker 1 (01:09:35):
I'm actually gonna do a call-out real quick about generative AI, because Gen Z apparently, like, uses ChatGPT, which, like, guys, come on. But there is, I've talked about him on the podcast before, there's a content creator named Miniminuteman who does all the, like, he has an archaeology degree. Yeah, an archaeology degree.

Speaker 3 (01:09:56):
He does.

Speaker 1 (01:09:57):
Occasionally he'll do just, like, silly videos. He did one on the big Bass Pro pyramid, and then he recently did one on the chain Buc-ee's, just as a joke, basically saying that they're the center of conspiracies that are stupid, because he's just pointing out that, like, this is what the conspiracy theorists sound like. But in both cases, they used ChatGPT to gather

(01:10:20):
the numbers that they needed for stuff. And I'm like, guys, no. And apparently that's, like, a big thing with Gen Z. And I don't know if I'm sounding like an old person by doing this.

Speaker 3 (01:10:28):
Wait, who's gathering the numbers?

Speaker 1 (01:10:30):
So, Miniminuteman, Milo, and his friend, who do the silly ones. He doesn't do this for his

Speaker 3 (01:10:36):
like, actual archaeology ones. But when they're doing those, the content creators are asking ChatGPT, like, hey, what are the measurements of the big Bass Pro pyramid? I see. And I'm like, apparently that's a big thing with Gen Z.

Speaker 1 (01:10:51):
They'll be like, oh yeah, ChatGPT sucks, I only use it to do math because I hate math. And everyone else is like, guys, no, don't use it.

Speaker 3 (01:10:59):
It doesn't know math. Yeah, it doesn't actually know math.

Speaker 1 (01:11:02):
There are better tools out there for you to use for that stuff.

Speaker 3 (01:11:06):
Wolfram Alpha, actually. Yeah, a lot better as a calculator, and also isn't destroying the rainforest to do it. Well, that's true, yeah, no. But beside the blanket negative effect of generative AI and how it is currently being used, it doesn't understand math. Yeah. Generative AI, if you're asking it, like, a math equation, it is giving you what

(01:11:27):
is statistically the most likely answer it has been fed to that equation, and it is often wrong, and it is often wrong because it's not actually doing the math, because

Speaker 1 (01:11:37):
At thinkso point three of a percentage.

Speaker 3 (01:11:40):
It's like, what? No. It's a large language model, so it is looking at its language base, and it is saying, oh, when somebody types in two plus two, most often what follows is equals four, so that must be the answer. But if you, as a goof, say two plus two equals five, and that happens often enough

(01:12:01):
in the language base, then the machine thinks the answer is five, because it is just going by what it has been told a bunch of times. So it's just going by statistics. It's word salad. It doesn't know math. Don't ask it math questions. Don't ask it how big the Bass Pro, whatever pyramid, is, because it

(01:12:24):
doesn't know. It doesn't know things. Yeah. People have been taught that it's like a search engine, because that's what the propaganda of it is, that it's a search engine, and it's not. It's not a search engine. Don't. Please. I'm so tired. I don't care if I sound like a dumb older generation. Stop using it, just because

(01:12:44):
it's not what they think it is. And that's the thing, and that's the thing with these rationalists, where they're like, oh, it's God. It's not freaking God. It's not gonna be God. It's a machine. Stop thinking the machine is going to be God. Oh my Jesus Christ. Well, The Terminator is, like, a lesson. In the future, if you want to worry about something, the problem that we have to worry about

(01:13:08):
is relying on AI to the point that this flawed
machine is able to make very serious mistakes that have
far reaching consequences. If you give AI control of your
house thermometer, your house thermostat, and it doesn't know what
it's doing because it's not a thing that has a brain,

(01:13:31):
and for some reason it just starts turning the heat up to, like, eighty-nine degrees all the time, and all of your crap gets damaged by the heat, or, like, your computer. So, like, we have a Nest thermostat. Yeah, and we had a Nest thermostat in the first place we lived in in LA, too, and I didn't realize that Nest thermostats have a setting to, like, learn your habits,

(01:13:55):
and so for a while we kept being like, why
the fuck is it so hot in here?

Speaker 1 (01:13:58):
And it was because the thermostat had learned that at
certain times a day it was supposed to be warmer
from like before we moved in, and we had to
turn that off to be like, hey, fucker, stop it.

Speaker 3 (01:14:08):
Because it doesn't know things. Exactly, exactly. And if you give it that power, then when that mistake has consequences, yeah, it's gonna ruin lives, because you gave the stupid machine power. Yeah, stop doing that. Don't give the stupid machine power. Yeah, that was another thing. Sorry, I'm just going off on a rant now, because there's also

(01:14:30):
a bunch of crap with, like, the image-generating AI and how it's racist, because the people who taught it are racist, or there's the inherent bias in the learning model that it's learning from. And it's not that the computer is now discriminating; it's that the information you gave it was flawed. Yeah, fix ourselves, that's what I'm saying. Yeah,

(01:14:53):
there are good and helpful uses for AI, and I'm not saying we should ban all of them, right, because I think that there are a lot of good and helpful uses for AI. I think anybody who says AI is going to become god and it's going to kill us all is an idiot, sure, yeah, or just doesn't understand what AI actually is and maybe needs to sit down and actually learn about it. Or people who

(01:15:17):
are like, AI is amazing and we should give it control of all things: also shut up, also an idiot, stop doing that. AI is a tool, and like all tools, it needs safety barriers. That's why they freaking make you wear the protective glasses when you go into wood shop. Yeah, same thing: protective glasses for AI. That's the end of my rant. I'm done.

Speaker 2 (01:15:39):
Okay, I'm gonna.

Speaker 3 (01:15:40):
Get heated about it. I've already done it.

Speaker 1 (01:15:41):
That's all I got for this. I've gotten heated. You've got your thoughts; these are my thoughts.

Speaker 3 (01:15:48):
Is that the whole premise? Listen, I'm here for the idea that things should be based on evidence and rational thought. I'm here for that. The second you start being like, so, I'm a more valuable part of humanity and I should be able to kill people if I want.

Speaker 1 (01:16:04):
So weird, how all the more important people of humanity
are generally white men, right.

Speaker 3 (01:16:09):
Yeah, because they've got money, or maybe not even, but, like, God, it's so stupid. It's so stupid. People, interrogate your inherent biases.

Speaker 1 (01:16:18):
Yeah, just be nice to each other.

Speaker 3 (01:16:20):
All human life has value. Yeah, Jesus Christ, calm down.
You can't predict the future.

Speaker 1 (01:16:27):
All right, let's I'm so upset.

Speaker 3 (01:16:29):
I'm so upset. Thank you, Chelsea. You're welcome. I appreciate you. Let's get to correspondence and corrections. Yes, but first let's have a brief word from our sponsors.

Speaker 1 (01:16:45):
And you're back, with Mal and I. Hi.

Speaker 2 (01:16:48):
We're doing the outro as well, because, again, Christina's not here. She's still in Japan. Funny enough, a whole episode has happened and she still hasn't returned. Still in Japan. She still won't return. Spoiler alert:

Speaker 1 (01:16:58):
She still won't be here next week either, neither will
I because I've decided I'm taking a break.

Speaker 2 (01:17:03):
Hold on, what if we're recording next week?

Speaker 1 (01:17:08):
Then uh, well last week we just said that we
were going to take a break.

Speaker 3 (01:17:13):
We me and Christina. Christina's gone.

Speaker 1 (01:17:16):
I mean, maybe it was, like, Mal said something about recording, and I'm like, that's on him. If he wants to record with someone else, he's free to do

Speaker 3 (01:17:23):
So.

Speaker 1 (01:17:24):
I'm tired.

Speaker 2 (01:17:25):
I have somebody within my sights who I'm trying to corner.
Like when you meet someone at a party and you
just like find them in a room and you're just like, hey,
can I tell you about my gumball collection. I'm trying
to find this person, just corner them.

Speaker 1 (01:17:36):
So there may or may not be an episode next week.
There will be a parlor next week, so there's that
at least, but there may or may not be an episode.
But either way, Christina and I will not.

Speaker 3 (01:17:47):
Be on that episode.

Speaker 2 (01:17:48):
Will need a good rest.

Speaker 1 (01:17:50):
So anyway, let's start with Bluesky. At Rosary Snow says: fun fact, this week at work I got to promote the pod to one of our tarot readers, because we got on the topic of Ernest Shackleton and the Endurance wreckage. In other news, I get to make bath bombs for work now, and I'm going to be sneaky sneaky naming them after cryptids. I love that. Can you send me some?

Speaker 2 (01:18:10):
I like the concept. Again: Ernold, Schackerl, Shackleford, Shackleford, shackle, yeah. Shackleton versus the, wait, which one's the evil twin, Ernest or Ernst? Ernst, yeah. Ernst Shackleford is, like, the fake one.

Speaker 1 (01:18:25):
And then Ernest Shackleton is the real one.

Speaker 2 (01:18:27):
Yeah, it's his alter ego.

Speaker 1 (01:18:30):
And then also, pet tax offering: though he's not really my pet, I'm his second-favorite person. Introducing Kaiku. He's a one-year-old baby Maltese and belongs to said reader I mentioned before, and gets to come hang out with us at the shop. Look at this baby. That's a dog. That's a dog, yeah, is what you would call it?

Speaker 3 (01:18:46):
Purse dog?

Speaker 2 (01:18:47):
What was it? I was about to say, that's a
purse dog. Yeah, I was trying to think of the word.

Speaker 1 (01:18:50):
Because Louis, who's my mom's dog. Louis part Maltese.

Speaker 2 (01:18:53):
Oh yeah, yeah, yeah, I could totally see it.

Speaker 3 (01:18:55):
Yeah.

Speaker 1 (01:18:56):
And then Blank Space Eighty Four sent us a TikTok. I watched it already, and it's about someone remembering that that guy still has eels in his basement. Yeah, that guy still has eels in his basement. How are the eels doing?

Speaker 2 (01:19:09):
I do want to know. Yeah, put a microphone in
the water. How you doing?

Speaker 1 (01:19:12):
How you living? American Random sends: apropos of nothing, except maybe a little whiskey, I decided to watch The Terror. I saw a couple episodes when it first aired years ago, but I never finished it. Looking at the time setting of spring eighteen forty-seven, I suddenly find myself thinking of contemporaneous events. Contemporaneous, yeah, I think I said that right.

Speaker 2 (01:19:31):
Are we talking about the troubles or what are we
talking about.

Speaker 1 (01:19:33):
It's called the Terror. I don't know what that is.
I've never heard of this show. Interesting, at this time,
the Mexican American War was going on, which would culminate
in the US taking everything from Texas to California. Europe
was getting ready to explode the following year. Unfortunately, I
don't have many specifics offhand about other places at this time.
It suddenly occurred to me that these guys would have been, the show is based on a novel, as I understand it,

(01:19:56):
fighting the Arctic for their lives when the rest of
their world was exploding, which I think was a bit
bit of a thing in polar exploration.

Speaker 2 (01:20:03):
I see.

Speaker 1 (01:20:03):
I think it was the Shackleton expedition that occurred during
World War One, which must have been the nearest the
average person could come to being stuck in a time
machine or temporal displacement field or something. Wait, I nearly froze to death for months and you idiots did what?

Speaker 2 (01:20:18):
There's a new country.

Speaker 1 (01:20:19):
Yeah, Shackleton was aware of World War One, I believe it. No,
not his brother so much. Well, no, his brother, the
Irish jewel thief.

Speaker 2 (01:20:33):
Allegedly, right, allegedly. No, Ernest was aware

Speaker 3 (01:20:37):
Of World War One.

Speaker 1 (01:20:38):
I think he left. I guess during World War One,
maybe on his expedition. I can't quite recall.

Speaker 2 (01:20:42):
He was, from whatever we called it, I remember from whenever we covered it, he didn't avoid the, it wasn't

Speaker 1 (01:20:48):
A draft because he was European. It was something else,
but I think he did. He wasn't I don't remember.
I talked about it on the episode, but I can't
remember what was going on there. I vaguely recall that
there are some Inuit slash yepeeks, slash somebody or other
legends that come up on this show. I don't remember
if the original author made them up or if they're
real traditional stuff. Either way should be interesting. Let us

(01:21:10):
know who it is. Addy So the Fox says: what's the trope name for when someone finds out they're the chosen one? This is from Tumblr. When someone finds out they're the chosen one, and it's like, no thank you, and goes and does something else. And this is common sense, gifted-kid burnout. But then someone replies: refusal of the call is an actual trope name. That is, that's, yeah,

(01:21:30):
usually followed by the tropes of The Call Knows Where You Live and You Can't Fight Fate, and that's also true.

Speaker 2 (01:21:35):
Yeah. Stories where, regardless of your decisions, you have to fight the big bad whether you like it or not, those are interesting.

Speaker 3 (01:21:44):
Well.

Speaker 1 (01:21:44):
Both The Fellowship of the Ring and The Hobbit start with Frodo and Bilbo refusing the call.

Speaker 2 (01:21:49):
They do refuse the call, yes, and then the.

Speaker 1 (01:21:51):
Call is trying to contact you about your Destiny's extended warranty.
Mailman Dan sends us a mug. It's a mug with Azriel, Lucien, Tamlin, Cassian, and Rhysand, but Tamlin's crossed out and it says, not you. This is for my Parlor folks out

Speaker 2 (01:22:09):
There, got it? Okay?

Speaker 1 (01:22:10):
Yeah? I was just like, who the fuck are these people?

Speaker 2 (01:22:13):
I'm just disassociating.

Speaker 3 (01:22:14):
Yeah, that's fine, Chelsea.

Speaker 2 (01:22:16):
Chelsea asked me one day, I don't know if we've discussed this, asked me one day if she could put

Speaker 1 (01:22:21):
some ACOTAR art that is really pretty, and I want to support artists and I want to buy it, and you won't let me.

Speaker 2 (01:22:28):
That's not actually what we talked about. That's not what was said, and you're painting me in the wrong light here. You said, can I have some art on the wall of a sexy hot man from the ACOTAR series? And I was like, no, because I don't know anything about it and I'm not really into the series. And you were like, okay, that's very fine and valid. And then we've come to now, where you're just painting me as the villain.

Speaker 1 (01:22:47):
I, okay, I did not say sexy hot man. Now, that's implied because of who, uh huh, I want to put up on the walls.

Speaker 3 (01:22:56):
Yeah.

Speaker 2 (01:22:56):
No, you informed me who he is. Your little meow meow, my little mew. Yeah, what's his name again? Yeah, no, uh huh. The guy you are unhealthily interested in. Yeah, it's fine. We don't need to have him on our walls.

Speaker 1 (01:23:10):
The one who I'm like, you kind of remind me
of him.

Speaker 2 (01:23:12):
And then and then both Christina and I were like
what are you talking about?

Speaker 1 (01:23:15):
No, no, no, because Christina was like, Mal wouldn't kill someone if they insulted you?

Speaker 2 (01:23:19):
And you were like what, moving on?

Speaker 1 (01:23:22):
Thank you for proving my point.

Speaker 2 (01:23:24):
Moving moving on, We're not going to have that on
our walls.

Speaker 1 (01:23:27):
You don't love me as much as you should.

Speaker 2 (01:23:31):
You know what, it's on recording: I don't love you as much as I should. It's recorded and going out to the public.

Speaker 1 (01:23:37):
All right, Would you mind listening to an email?

Speaker 2 (01:23:42):
I like how it would have been, would you mind reading an email? I don't have the emails, though.

Speaker 1 (01:23:46):
No, I know you, I know, all right, So let's go.
Let's go back.

Speaker 3 (01:23:52):
Oh from Duncan.

Speaker 2 (01:23:53):
Oh?

Speaker 1 (01:23:53):
Hey, what's up, Dunc? All right.

Speaker 2 (01:23:55):
Hello Demon, good.

Speaker 1 (01:23:57):
Day even This is from December sixteenth.

Speaker 2 (01:23:59):
Sorry, I love the show, Duncan, I really do.

Speaker 1 (01:24:04):
I had some thoughts or notes on the intro to
this week's episode about the stone tape theory. The first point,
what percentage of sandwich is a hot dog?

Speaker 2 (01:24:12):
Yes?

Speaker 1 (01:24:13):
As Christina pointed out, there seems to be no real meter by which we should measure the sandwichness of a hot dog, and due to this fact, the inevitable conclusion is that everyone will decide their own models of classification by which to begin the assessment of said hot dog's qualifications.

Speaker 3 (01:24:26):
True, the two most natural.

Speaker 1 (01:24:27):
Paths to go down would be either (a) go from zero to one hundred percent on a loose food scale, where all items on the scale would be regarded as a food item or a meal, or (b) anything and everything can exist on the scale of zero to sandwich.
Both of these options will lead to very different answers,
so the most natural solution would be to provide an
answer for both and move on. Here are my thoughts on that. A: a hot dog is already considered a

(01:24:49):
sandwich by some, especially if the thin, loose, sad excuse
for a connection of bread inevitably breaks, which allows the bun to become two pieces of bread. That happens a
lot to us because we get those brioche buns from
Trader Joe's.

Speaker 2 (01:24:59):
But they're so we do make hot dog sandwiches. Basically, yeah,
they're so good.

Speaker 3 (01:25:03):
Though.

Speaker 1 (01:25:04):
Thus, I'd say on the food scale, hot dogs at the very least hit the ninety-five percent mark. The well-structured alpha chad, sorry, of a hot dog, better known as a bratwurst, usually has a sturdier New England roll that is better put together and is therefore less of a sandwich, so it would be perhaps eighty-five or ninety percent sandwich.

Speaker 2 (01:25:18):
Dunk and I need you to preach, I need you
to spread beep.

Speaker 1 (01:25:22):
Hot dogs are about ninety-nine point nine nine nine repeating percent a sandwich. They are just the closest
technically non sandwich thing to a sandwich as far as
I am aware. The other piece of the intro I
wanted to discuss was the part about opposites. There are
a few definitions, but the common element is in the
word itself, a pose. Opposites are two things or concepts
that oppose each other. So I do not think the
opposite of fire is no fire, because the absence of

(01:25:44):
a thing does not oppose the thing. Water opposes fire,
but it isn't the only opposition to fire, so the
correct answer is indeed subjective. As far as the opposite
of sandwich, well, this concept is as nebulous and potentially
abstract as it gets. The first thought would be a
food item like the Almighty soup, for example, but even
that isn't far enough in opposition. One could say that the opposite might be a random non-edible thing

(01:26:05):
like a garbage truck or a two-million-dollar condo in New York City. But I'd go so far as to say that the truest opposition to a sandwich would be an alternate universe in which sandwiches simply do not exist. That, or a person that just hates sandwiches in all their forms, to which I say: weird.

Speaker 2 (01:26:18):
But wait, hold on, we've got to pause here. There was a movie, it was the Buzz Lightyear movie, I don't remember who the actor was, but it was a Buzz Lightyear movie, and, spoiler alert, the plot is that they crash-land on a planet and he tries to use warp technology to

Speaker 1 (01:26:36):
It was Chris Evans that was voicing Buzz Lightyear.

Speaker 2 (01:26:38):
The voice. Well, they try to use warp technology to leave the planet. Anyway, a lot of time passes and it's like one hundred years in the future, and the way that they eat sandwiches is to put two slices of ham on the outside, right, so it's meat and then bread in the center. And the thing is, he's eating it, and I remember the line is like, this isn't how you eat a sandwich, and everyone's like, that's always how you eat a sandwich, and then

(01:26:59):
he eats it and he's like, okay, maybe it's not that bad, actually. Yeah. Anyway, that's disgusting. Let's not do that.

Speaker 3 (01:27:05):
Well.

Speaker 1 (01:27:05):
As you can see, the intro for me was certainly some food for thought, haha, and I couldn't help but throw in my two cents. Thanks for doing what you do, Three C's team. Duncan.

Speaker 2 (01:27:13):
Thanks Duncan, Thank you, Duncan.

Speaker 3 (01:27:14):
Send more pictures of your cats.

Speaker 2 (01:27:17):
We require cat photos, yes. Now, honestly, we do require your cats.

Speaker 1 (01:27:20):
We require photos of your.

Speaker 3 (01:27:22):
Cats, Chelsea.

Speaker 2 (01:27:24):
Yes, how do you feel about that?

Speaker 3 (01:27:26):
Do you think we're good?

Speaker 1 (01:27:28):
I think you know you're the one that has to
edit it, so that's on you.

Speaker 2 (01:27:31):
I do. I do have to edit it. I think
I would like to just get to editing the podcast.

Speaker 1 (01:27:34):
All right, that's fair.

Speaker 2 (01:27:35):
Yeah, and thank you, listeners. We do love you very much. I want to make sure to get this episode out for you so that way you can enjoy the rest, or, technically, well, now you've already heard the rest of the tale, now that I think about it. Yes, by the time you hear this, you've heard it. But yeah, Chelsea, I love you.

Speaker 1 (01:27:49):
I love you too, Babe.

Speaker 2 (01:27:50):
Christina, we love you.

Speaker 1 (01:27:52):
I love you, Christina. Come back.

Speaker 2 (01:27:54):
I also, I said, we, we love Christina, and then, we, comrade, have a friend. Okay, why can't we have a communist relationship in relation to Christina?

Speaker 1 (01:28:07):
No, I have to win. I have to win.

Speaker 2 (01:28:10):
Okay, all right, that's pretty much you all the time anyway,
thank you listeners, and thank you Chelsea.

Speaker 3 (01:28:17):
Thank you.

Speaker 1 (01:28:18):
I love you.

Speaker 2 (01:28:19):
Yeah, sure you do. Bye. Bye, Chelsea.