Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Chris (00:24):
Welcome to Cult or Just Weird. I'm Chris.
Kayla (00:28):
Are you confused about it?
Chris (00:30):
No, I just don't know how to start.
Kayla (00:33):
I can start. You want me to start it?
Chris (00:34):
Yeah, go ahead.
Kayla (00:35):
I'll just start it with the fucking intro regular.
Chris (00:37):
Oh, you just. Well, okay. Well, in that case, welcome to Cult or Just Weird. I'm Chris.
Kayla (00:40):
Say a little better. Say it like this. Say it like a podcast. Hey, guys, welcome to.
Chris (00:49):
Oh, no, that's good. That's good. Keep doing it.
Kayla (00:51):
Terrible.
Chris (00:51):
No, that's very TikTokable. That's so TikTokable. Hey, guys, welcome to Cult or Just Weird. I'm Chris. I'm here showing you some food videos that are way too rapid.
Kayla (01:02):
And I'm Kayla. I'm a tv writer.
Chris (01:05):
Your energy was not up to par. Okay.
Kayla (01:08):
Welcome to Cult or Just Weird, the podcast where we talk about cults and weirds. That's Chris. He is a data scientist and a game designer.
Chris (01:15):
And that's Kayla. She is a tv writer.
Kayla (01:18):
And welcome to the season finale for season six of Cult or Just Weird.
Chris (01:24):
Yay.
Kayla (01:26):
Little bit of a surprise, little bit of a curveball we are throwing at you there. But as we've been doing these episodes on the TESCREAL bundle, which we'll define for anybody who might be tuning in to our show for the first time, and it happens to be the finale, we are at the point where we are to evaluate TESCREAL based on our "is it a cult or just weird" criteria. And that kind of felt like a good stopping point for the season.
Chris (01:50):
Yeah, I don't think that was where we necessarily saw the season going back in, you know, December or January, but that's definitely where the. That's definitely where the story took us, is down this sort of, like, transhumanist extra. So we'll just define it now. It's an acronym standing for transhumanism, extropianism, scientology. What's the. Oh, no. Singul. I'm going to keep that one in. Singularitarianism. Cosmism. I'm, like, sitting here having to remember this.
Kayla (02:24):
I'm having to remember how to spell.
Chris (02:26):
Yeah. Rationalism. That's T, E, S, C, R. Then EA is effective altruism. That's one thing. And then the L is long term.
Kayla (02:35):
Ism, basically what the Silicon Valley guys are into.
Chris (02:38):
It's that. And that's kind of the thing. That's why we ended up kind of thinking, like, oh, maybe this is important, actually.
Kayla (02:44):
Yeah.
Chris (02:45):
Anyway, yeah. So we're here to evaluate whether TESCREAL is a cult or not. This is definitely the longest we've gone between using our criteria. It's like our gimmick, and we didn't do it almost at all. What did we do it for this season? Do you remember?
Kayla (03:00):
We must have done it for cryonics.
Chris (03:02):
We did it for cryonics. I don't remember what we decided.
Kayla (03:05):
Not a cult. We decided it was not a cult.
Chris (03:07):
That sounds right. And then I think we also did it for less.
Kayla (03:10):
Wrong, probably.
Chris (03:12):
And I think we also decided. No, that one. We said it was a cult, but that's okay.
Kayla (03:18):
And we also did it with the Church of Perpetual Life.
Chris (03:20):
Oh, yes, we did. You're right. You're right. Okay, so now is when we're gonna just bring everything together and talk about TESCREAL as a whole. Before we do that, though, I think we have just a couple little admin things to get out of the way just to tell you guys about. You're the one with the notes, so.
Kayla (03:37):
Oh, I didn't know what you were doing. He was pointing his little fingers at me. I just wanna tell you guys about how kind of the rest of the year is going to play out for Cult or Just Weird. So, like we said, this is the official season finale, but there are a few more stories that we want to tell that didn't fit neatly into the season: things that are non-TESCREAL related, and then some side journeys, some avenues we want to go down that are related to TESCREAL.
Chris (04:00):
So there's one thing that I want to talk about that is, like, maybe could be, like, a lowercase letter in the acronym, one that I think should be there.
Kayla (04:09):
So what we're going to be doing is we're going to be releasing these stories as we get to them kind of through the end of the year, not on a regular schedule. But what we do want to do is make sure they're available to our Patreon patrons first, and then they will publish on the main feed after that. And then, of course, we will also be catching up on our Patreon exclusive bonus episodes. So if you are a patron and you're wondering where those are, they're coming. And if you are not yet a patron, go to patreon.com/cultorjustweird so you can make sure you don't miss any Cult or Just Weird content.
Chris (04:41):
And if you are not chatting with us on discord, the link for that is in the show notes. You should be chatting with us on discord because we talk about all the stuff that we talk about on the show.
Kayla (04:51):
There are some great conversations happening in there, and they are the things we talk about here.
Chris (04:57):
Yeah. I'm bleeding right now.
Kayla (04:58):
I wouldn't be recording a podcast if I weren't.
Chris (05:01):
Okay, so I think that now is the time for us to actually do our gimmick and go to... drumroll. Maybe I'll insert a drumroll here. Oh, God. You can tell when we don't have a script, because we're just so lame.
Kayla (05:18):
You can tell we don't have a script and we've just had a nap because we're all like, oh, I have energy.
Chris (05:24):
Anyway with that, let's talk about whether TESCREAL is a cult or not. Kayla, what's the first criterion?
Kayla (05:32):
He keeps saying cult or not. And you're missing an opportunity.
Chris (05:34):
Excuse me. Excuse me. You're right. Cult or whether it's just weird.
Kayla (05:38):
There we go. Thank you.
Chris (05:39):
Because it's definitely at least weird.
Kayla (05:42):
Everything is, actually.
Chris (05:43):
Do you want to do the thing where we. Like I ask you? So what's your initial? Before we go through the criteria?
Kayla (05:50):
Yeah. Always ask me that.
Chris (05:51):
Yeah, well, what is it? Cult, cult, cult.
Kayla (05:54):
But I also feel like, I say everything's a cult, and then I go through the criteria and then it convinces me that it's not.
Chris (05:59):
I mean, I think that guilty until proven innocent is perfectly valid.
Kayla (06:04):
That's what's made this country great.
Chris (06:06):
That's right. Okay, so you think Cult. I think.
Kayla (06:10):
I think it's a network of cults.
Chris (06:11):
Frankly, I think it's too big to be anything but.
Kayla (06:15):
Like, it's too big to be anything but TESCREAL.
Chris (06:18):
Yeah. I don't know. I'll say cult.
Kayla (06:20):
Did doctor Torres have an opinion?
Chris (06:24):
So we did talk about that. I might try to find that clip and splice it in here.
Dr Emile Torres (06:30):
I would say that there is a good case to be made that a lot of these communities are run like a cult. It might sound hyperbolic, but I really don't think it is. So take EA, for example. It is very hierarchical. There's a lot of hero worship. I mean, by their own admission, you can find them talking about this. People in the EA community have said there's way too much hero worship. There's a lot of worship of MacAskill and Toby Ord, Eliezer Yudkowsky and Nick Bostrom and so on. You know, if you're on one of the lower rungs of this hierarchy, you oftentimes need some kind of permission to do certain things. You know, the Center for Effective Altruism has guidelines for what people in the EA community should and should not say to journalists.
(07:20):
They tested a ranking system, secretly tested an internal ranking system of members of the EA community based in part on IQ. So members who have an IQ of less than 100 get points removed. Once they have an IQ of 120, then they can start to have points added. All of this sounds very cultish. They even do something. Many members participate in what they call doom circles, where you get in a circle and you criticize each other one at a time. And if you.
Chris (07:54):
Oh man, yeah, that is for sure.
Dr Emile Torres (07:57):
Right? And if you go on the EA forum website, you can find the statement that they begin each critique with. And it is like, you know, for the greater good, for the benefit of this individual and the aim of improving themselves and their epistemic hygiene and so on. It sounds so bizarre. I mean, there are many examples. There's a lot of myth-making in the community. You know, leading figures like MacAskill basically propagated lies about Sam Bankman-Fried, one of the saints of the EA rationalist, longtermist community: he drove a beat-up Toyota Corolla and lived this very humble lifestyle. What they didn't tell you, what they knew, was that Bankman-Fried lived in a huge mansion and owned $300 million in Bahamian real estate.
(08:47):
So, just like a lot of the televangelists who preach one thing and then are flying around in private jets without anybody really being supposed to know about that. So anyways, there's probably 50 more things to say about that. But it is very cultish in many ways. And in fact, when I was part of the community, we had a number of conversations about whether or not we were part of a cult. Here's the last thing I'll say. This is 100% true. I've had family members of some of the leading EAs contact me in private to say that they're worried their family member is part of a cult.
Chris (09:28):
Oh my gosh. I mean, you're. Those. Some of the things you mentioned are pretty classic.
Kayla (09:37):
Well, I think the first criteria. Criterion, criteria.
Chris (09:41):
Eight criteria.
Kayla (09:43):
First criteria. It was going great... this one's difficult. And I don't know how we would define who the charismatic leader of TESCREAL is.
Chris (09:58):
Yeah, I think this kind of goes directly to the. It's so big, right?
Kayla (10:03):
Because I can name.
Chris (10:03):
This is what she said.
Kayla (10:05):
I can name a bunch of guys, which is also what she said. And I don't know if I can, like. Is it Ray Kurzweil? I don't know. Is it Elon Musk? I don't know. Is it Nick Bostrom? I don't know. Is it William MacAskill? I don't know. I could probably make an argument for any of those. Probably least of all Elon Musk, frankly.
Chris (10:24):
I think so. And then there's Julian Huxley, who coined the term. And then there's Max More, who coined other terms like extropianism. So we could Voltron all these white dudes together into, like, a giant nerd.
Kayla (10:40):
I was just going to say, is the acronym itself the charismatic leader?
Chris (10:46):
So it's Emile's fault.
Kayla (10:47):
It's all doctor Torres fault?
Chris (10:50):
No, I think let's pick someone.
Kayla (10:53):
And I mean, I think that not being able to pick someone is also an answer. If there's not a definitive person, is that the answer? That there is no good, singular charismatic leader. But then that makes me question the criteria, because there have certainly been cults with more than one leader. Thinking on theme, Heaven's Gate for most of its life had two leaders, Ti and Do, right? So I don't know.
Chris (11:19):
Yeah, so I think you're right that, like, the more diffuse it is, the less this particular criterion hits.
Kayla (11:27):
Criterias.
Chris (11:28):
Sorry, criterias hits. But I think that if I did have to pick one, and maybe we can talk about multiple people as being charismatic leaders within the movement. Like, you know, we could talk about MacAskill, I think would be a good choice. I think Max More would be a good choice. But I think if I really had to pick someone, I would pick Nick Bostrom.
Kayla (11:46):
Interesting.
Chris (11:47):
And it's unfortunate that we didn't talk more about him on the show. He came up a couple times, but I really feel like he should have come up more because of how influential he has been across all of the letters. Because remember, we're not talking about just transhumanism. We're not talking about just EA. We're not talking about just singularitarianism. Right. Like, if it was the T, I would say maybe Max More. The E, Max More. The S, I'll give you Ray Kurzweil for that one, for sure. And he's definitely charismatic. The C, we'd have to maybe go to Ben Goertzel. He's, like, the father of modern cosmism and also invented the term AGI, artificial general intelligence. If it was R, rationalism, maybe you could say. See, now that I'm going through them, though, I'm like, each one has its own guy.
Kayla (12:39):
Each one has its own guy.
Chris (12:40):
And like, it's okay. I'm like, picturing, like, a Power Rangers set up here.
Kayla (12:44):
It's funny now, because now I've gone back to convincing myself in my head that the best argument is actually Elon Musk.
Chris (12:52):
See, that's why the best argument for me is Nick Bostrom, because Elon Musk feels more like a receiver than a. Than a giver in terms of these philosophies. Wait, let me finish the letters, though. Let me finish the letters. Okay, so you got R. That's rationalism. That's LessWrong. That's Eliezer Yudkowsky. Then you got EA. That's William MacAskill. Then you got L. That's also William MacAskill and Nick Bostrom. Yeah, but now that we're picking one for each, let's say the EA is Sam Bankman-Fried. Let's say it's Sam Bankman-Fried.
Kayla (13:24):
I don't think. Yeah, wait, sorry. SBF for EA. Too many syllables. Too many acronyms.
Chris (13:31):
It's a very acronym heavy season.
Kayla (13:33):
I think MacAskill is the EA and.
Chris (13:37):
The L. Okay, so we'll count him for both of those. And then, like, so those are the Power Rangers, or those are the Voltron characters, and the "I'll form the head" one, for anybody who's watched Voltron, is going to be, like, one person here. But the main guy that's sort of, like, sitting above it all seems to me to be Nick Bostrom.
Kayla (13:56):
Nick Bostrom is.
Chris (13:58):
He touches all of those letters.
Kayla (14:00):
Possibly not even arguably the most, like, preeminent philosopher on this stuff. Like, if you were to get into the philosophy world and say, who's the most famous guy talking about x-risk and AI and.
Chris (14:13):
Right, but he's also talked about transhumanism. He's also talked on EA.
Kayla (14:18):
It would be him. Yeah, but do you want to hear my Elon Musk argument?
Chris (14:21):
Yeah.
Kayla (14:22):
I really don't want to give him the credit for this, however.
Chris (14:25):
Well, it's about time he gets credit for something, Kayla.
Kayla (14:28):
He gets credit for something. I think, of all of those names that we've said, the person who has made TESCREAL the most visible to the people, to the non-Silicon Valley people, to the non-academia people, to the non-upper-echelon elitists, to us plebs out here listening to podcasts, is potentially Elon Musk.
Chris (14:51):
Okay.
Kayla (14:53):
And so I think that I could see the argument here of, like, yeah, sure. He's not. He's definitely not coming up with these ideas. He's not a philosopher. He's not an academic. However, does the person coming up with the ideas have to be the number one, or can they be the number two? There definitely are instances we can think of in cults where there is a figurehead and a number two that maybe one is actually coming up with the ideas more than the other. But I don't think these guys. I also. I have to say this. I think a lot of these guys probably don't like Elon Musk.
Chris (15:23):
Probably. I imagine that.
Kayla (15:25):
I think Eliezer and Elon get into it on Twitter.
Chris (15:28):
Yeah. And I imagine that, like, a guy like William MacAskill probably doesn't care for him. But I don't want to put words in anybody's mouth. Okay, so let's sort of summarize here. I think there are, like, potential arguments for Nick Bostrom, potential arguments for Elon, and then each individual.
Kayla (15:44):
I'd rather it be Nick Bostrom, 'cause fuck Elon Musk. But also fuck Nick Bostrom.
Chris (15:48):
Well, yeah, fuck both. They're both shitty. And then each individual letter kind of has its own little leader. So I kind of feel like that means, to me, that means overall charismatic leader is high. You have a group of charismatic leaders, and then you have an overlord thing.
Kayla (16:03):
Charismatic someone is the overlord. It reminds me, then, of Hare Krishna, of the International Society for Krishna Consciousness, in which, after the overlord, Srila Prabhupada, passed, instead of the group then passing to a second in command, it went to ten guys. And so there were, like, ten. And they all had individual factions and were their own kinds of charismatic leaders.
Chris (16:27):
And their own flavors of ISKCON, while our guys have their own flavors of TESCREAL.
Kayla (16:33):
So it reminds me of that a little bit.
Chris (16:34):
Okay. For the art for this, I want to draw these guys faces in Power Rangers outfits.
Kayla (16:40):
Please do. Wait, why Power Rangers? Why not Voltron?
Chris (16:43):
Well. Cause it's easy. I mean, what am I gonna do? Like, stick them in one of those giant metal tigers for that? I don't know. How's that gonna come out?
Kayla (16:49):
I don't know. It sounds cool, though.
Chris (16:50):
It does. Okay, so first criteria, we're gonna say high: charismatic leader. What do we have up next?
Kayla (16:56):
Expected harm.
Chris (16:57):
Expected harm. Oh, boy. Ugh. Some of these things feel more harmful than others.
Kayla (17:05):
I don't think that most of these guys are gonna, like. I'm trying to think of, like, other things that have had expected harm. They're not gonna take, like, a bath with a mud product that was accidentally made from petroleum and say that it's, like, healing them. I don't think they're going to eat borax. I don't think they're going to.
Chris (17:22):
Yeah. And I guess if we're saying expected harm, like, yeah, we're talking about rank and file, not leaders. So if you are a person that's doing ea stuff, I don't think it's going to be harmful to you for the most part.
Kayla (17:34):
Well, I don't think physically, but I think there are other harms there, especially in, like, the LessWrong communities.
Chris (17:40):
Right. Roko's basilisk and all the people that donated money to MIRI after that, the.
Kayla (17:44):
Psychological harm, the financial harm. I do think if we do start talking about, you know, we already evaluated church of perpetual life on its own, but if we talk about the folks in this bundle who get into some of this life extension longevity stuff, there is potential for harm there. And a lot of these tuscarial guys do kind of end up being like antithetical to quote unquote big pharma, which I don't fault them for, but that can wade into dangerous territory. And then I also think just like social harm, you and I have heard stories of it's impossible to go to a party in Silicon Valley area anymore without people just talking about aih.
Chris (18:21):
What about AI?
Kayla (18:22):
And like that.
Chris (18:23):
You're supposed to talk about the weather.
Kayla (18:25):
Okay, I'd rather talk about the weather at this point. If that's the only social skill that is being built up in this population to the point where you would not be able to then go to a party in Boston. That feels like a harm to me.
Chris (18:40):
Well, you can't go to a party in Boston without talking about the pats, so it's kind of similar.
Kayla (18:43):
I guess that's true.
Chris (18:44):
Yeah. So I see what you're saying. I think.
Kayla (18:49):
How do you think actually, can I interrupt you please?
Chris (18:52):
Because I don't even know what the fuck I'm thinking.
Kayla (18:55):
Not to keep appealing to authority, but how do you think Dr. Torres would answer this question? Dr. Torres, who is one of the academics who coined the phrase?
Chris (19:06):
Well, they think that it's a tremendously dangerous ideology.
Kayla (19:09):
That says something to me.
Chris (19:10):
Yeah. And that's part of what I was going to get at: it kind of depends on what we're talking about. First of all, which letter we're talking about. One letter might be more harmful than another letter. But if we're talking about TESCREAL, if we're talking about the big Voltron, then it does dip into at least these ideas that allow billionaires to say, well, it's okay if we nuke most of the world as long as it doesn't threaten the project. I should say, specifically, that was Eliezer, who is not a billionaire. Point stands. There are some ideas there that they seem rather comfortable with: lots of bad things happening to currently living people in order to preserve this bizarre vision of a long, long-term future.
Kayla (20:00):
Yeah. Interestingly, the expected harm for this is high for everyone, even people not in it, which is different from most of the groups we've talked about.
Chris (20:10):
And you know what I'm thinking of now, which either sucks or is really interesting?
Kayla (20:14):
Qanon.
Chris (20:16):
They are both suck and interesting.
Kayla (20:18):
I just mean that QAnon has, like, oh, that's gonna hurt everybody, not just the people in it.
Chris (20:23):
Right, right. Yeah. No, I'm thinking, like, for us to evaluate expected harm, we kind of have to do the math. You know, we kind of have to sit here and go, like, well, it's pretty unlikely that these guys are gonna cause nuclear war, but if they do, the punishment is very high. So when you multiply those two together, it equals this. Like, I kind of felt myself doing that in my head, and I'm like, no, don't do that. That's them. That's what they do. But that being said, I don't know, there's also a place and a time for that, and maybe this is one of those. So I'm gonna say relatively high, although the big harms definitely have a lower chance.
Kayla (21:04):
I think that's valid. Okay, the next one's easy. Cause it's kind of our gimme question.
Chris (21:09):
Okay.
Kayla (21:10):
Presence of ritual. Oh, yeah, it's so high, it couldn't be higher.
Chris (21:14):
Oh, my God.
Kayla (21:14):
And I think that you and I potentially need to reevaluate this criteria just.
Chris (21:19):
Because it's just on everything.
Kayla (21:21):
Have we ever talked about something that didn't have high presence of ritual? I don't know.
Chris (21:25):
I don't know. I kind of want to go back and we've actually talked about this before where, like, this criterion criterias. Excuse me. This is usually like the canary in the coal mine.
Kayla (21:34):
You know, this is what usually gets us into a topic.
Chris (21:37):
Yeah. Like, with Irvine, it's like, why does Irvine feel weird? Well, it's got a weird vibe, but, like, mainly it's the logo, right? You know, like, it's always something like that. Like, what are those people doing? Why are they chanting that weird chant? Why are they all wearing the same clothes? Why do they use that weird language? Right. And that's one of the things that Dr. Shore has talked about, too: when talking to these outsiders who are worried about their insider friends and family, they'll say, like, oh, they use this weird vocabulary now.
Kayla (22:05):
Yeah, the jargon in this, the jargon.
Chris (22:08):
In LessWrong was insane.
Kayla (22:09):
And the jargon doesn't just mean, like, oh, they use a word as slang. No, when you're talking about LessWrong, it's like you have to have entire volumes of context in order to understand the phrases and words that are being used in other posts and contexts.
Chris (22:27):
Yeah. Okay, ritual is high.
Kayla (22:29):
What other examples of ritual do you think? Because it's definitely jargon, but there's other stuff too.
Chris (22:34):
I mean, if we're going all the way back to our cryonics episodes, which I think we should, I think that cryonics does fit as a puzzle piece into the TESCREAL landscape. There's plenty of stuff there. I mean, essentially it is a death ritual in order to preserve your body, to be resurrected in the future. There's all kinds of procedures. I don't know, maybe it's not a ritual if it's science based and it's medical based, but I don't know.
Kayla (23:04):
I think another example of presence of ritual, and we didn't necessarily touch on this a lot this season, is that this is a very conference-based community. There's a lot of symposiums and conferences and talks.
Chris (23:19):
We talked about that in the humanity plus episode a little.
Kayla (23:21):
The act of getting together and being in community. There's Singularity Summits, there's effective altruism summits, there's RAADfests, there's all this stuff that is, yes, an opportunity to spread the good word, but it's also reifying the community and taking part in actions that solidify your identity as a member of this group. I think that's high ritual.
Chris (23:48):
Yeah. Now that we're kind of going through it a little more, I don't know if this is the highest ritual we've ever seen. Now that we're actually talking about individual examples, I'm like, okay, yeah, there's some ritual going on with that. And if you add them all together, it's quite a bit. And certainly it depends, again, on the letter, right? If it was just the rationalists, then I would say it's much higher. But if it's, like, the extropians, then, you know, I don't know. They had a listserv, they had a group; I don't know if they had, like, logos and chants and songs.
Kayla (24:19):
They definitely don't have chants and songs, unfortunately.
Chris (24:22):
So I think it's definitely there. I just. But I'm not sure that it's, like, the highest thing that is ever.
Kayla (24:28):
Are we talking ourselves into a medium?
Chris (24:30):
I think I'm talking. Yeah, I think we talked ourselves into a medium ritual or, like, medium.
Kayla (24:34):
Well, I'll agree to that.
Chris (24:36):
Okay, what's the next one?
Kayla (24:38):
I think another fairly simple answer: niche within society. Actually, this is not simple. No, this is tremendously difficult, because the actual people who would identify as a TESCREAList, and again, they wouldn't identify as a TESCREAList, but you know what I mean, that's niche within society. The wide-ranging effects are massive.
Chris (24:58):
Yeah. It's so strange, because if you don't either, A, live in Silicon Valley, or, B, listen to this podcast, then it's likely you don't know what the hell this is. Even my mom was telling me the other day, she's like, I was listening to some of your recent episodes, and I was kind of lost. And I was like, yeah, that makes sense. You're not online. For a lot of normal people that aren't living on Twitter or engaging with this type of content and these types of movements, it does feel kind of like, wait, what? What are you talking about? What the hell is that? And Silicon Valley itself, in terms of number of people, is a small number of people. A very small number of people.
Kayla (25:36):
Right.
Chris (25:37):
But they control the world.
Kayla (25:39):
So the people we're talking about are Elon Musk and Marc Andreessen and Peter Thiel and Ray Kurzweil, people who have the ear of the most powerful governments in the world, and the most powerful entertainment industries in the world, and the most powerful industries in the world, affecting all of them.
Chris (25:56):
Sitting on top of giant influence platforms like Facebook and Twitter.
Kayla (26:00):
Like, yes. You walk down the street and ask people, do you know what TESCREAL is? They wouldn't know what the fuck you're talking about. You ask them what AI is. Everybody would know.
Chris (26:07):
Also do note, TESCREAL as an acronym is only, like, what, two years old? Year and a half?
Kayla (26:13):
That's the niche within society of that.
Chris (26:15):
Yeah. So I think I am going to call this niche, because I'm going to grade it on how many people we might put in the cult, how many members there are. I'm gonna grade it on that, not on its outsized influence.
Kayla (26:29):
This analogy might help: if you were to evaluate whether the Illuminati were real, in the way that conspiracy theorists think it's real, would you evaluate the Illuminati as.
Chris (26:40):
This is just as hard.
Kayla (26:41):
I don't know, as niche within society? Because they say it's a small group of elites that are controlling the world.
Chris (26:46):
The world.
Kayla (26:47):
And I think that still makes it niche within society, because it is a secret society. It is a hidden activity, and that kind of feels more similar to what the TESCREALists are doing. I'm not saying the Illuminati is real. I don't think the Illuminati is real.
Chris (27:00):
No. I'm actually going the other direction in my head. And I'm like, I kind of see some of the critique of TESCREAL being a conspiracy theory. I don't think it is, but I think it has some elements, and it should definitely raise some pink flags at least.
Kayla (27:14):
I just think, make sure you don't fall into the idea that there is a grand unifying theory of, hello, we are the council of TESCREAL elders, and this is our unified goal, and we are using all of these arms to make sure that it happens. That's the conspiracy theory. It's not a conspiracy theory to say that Elon Musk uses his money to exert influence. That's not one. Right.
Chris (27:34):
And it's not a conspiracy theory to say one of these ideas influences the next. It's just not a cabal. Okay. I think I would still say niche for the Illuminati, although maybe a little less here, because, like, most people know that word now, at least. But I would still say niche. I mean, what would you say? I don't think that's definitively the correct way to look at it.
Kayla (27:54):
I think I would call. I think if I were to evaluate the Illuminati, I would call it a niche group.
Chris (27:57):
Okay, then I think this is, too.
Kayla (28:00):
I think there's exclusivity, and I think there's exclusivity in TESCREAL.
Chris (28:03):
Yep. Yep.
Kayla (28:05):
Different kind of exclusivity, because you don't have to, like, do a secret handshake, but they should probably come up with.
Chris (28:10):
Yeah, you don't have to do, like, a satanic ritual to kill children or whatever, like you have to do with the Illuminati.
Kayla (28:15):
Yeah. Next one is antifactuality.
Chris (28:18):
Actually, you probably do. All right, sorry. Antifactuality. This one, I'm just gonna go ahead and say is relatively. I don't know.
Kayla (28:30):
I would call it high man as.
Chris (28:33):
I'm saying the sentence out loud. My brain is going through each letter.
Kayla (28:37):
I know.
Chris (28:38):
And transhumanists are like, we want to transcend our human limitations. I'm like, okay, that's just a thing that you want to do. That's not antifactual. But then by the time my brain gets to longtermism, I'm like, there's a bunch of crap there.
Kayla (28:52):
I think the buck pretty easily stops with effective altruism in that, like we talked about in the episode, I think they very clearly downplay facts, and they very clearly downplay patterns of history. In order to keep the glue holding the philosophy together, you have to ignore the way wealthy classes and power structures have existed in our societies forever. And that, to me, is antifactual. And I would love to have a conversation with William MacAskill about this, because I cannot believe that he has not thought about that. I do believe that he has thoughts and feelings about it. I just don't know what they are.
Chris (29:35):
Yeah, that would kind of go into, I think. Yeah, some of the blind spots part of antifactuality. I don't know about, like, logical fallacies, but I do think that there are some. Like, I don't. I don't necessarily believe them when they say each dollar is spent is worth, like, x number of lives. Like, I. I get the impulse to do something like that, but, like, I don't know. That doesn't feel, like, super based in reality. I don't know about, though, about, like, other logical fallacies. I mean, if we're gonna link together transhumanism and eugenics, which we kind of did these last two episodes, there's a ton there. Right. So to the extent that TESCREALists are into the eugenicsy type stuff, which, personally, that's part of, I think, of how I would.
(30:21):
Part of how I differentiate TESCREALists from, like, other people who are just into one or more of the letters, is that they have a more eugenicsy sort of bent.
Kayla (30:30):
Right?
Chris (30:31):
So, for example, when a Nick Bostrom, who we have called a charismatic leader, talks about dysgenic pressures, that's when you go like, oh, okay. You got some blind spots in your facts.
Kayla (30:43):
That makes me go back to expected harm. Yeah, high.
Chris (30:46):
Yeah. So what are we saying on this one, then? This topic is so big. It's too big.
Kayla (30:54):
TESCREAL is not antifactual in the way that, like, an MLM is antifactual.
Chris (30:59):
Right. Or in the way that QAnon is. Yeah.
Kayla (31:02):
I think that we can sit here. I think what's happening more with TESCREAL, and this is kind of getting into a personal place, is that it's a bundle of philosophies that have reached different conclusions, re: philosophy, than, like, you or I would. And maybe that's not antifactual.
Chris (31:21):
That's true.
Kayla (31:22):
I think it's a lot of people with the same information reaching different conclusions. And that doesn't necessarily mean the whole thing is antifactual. We can definitely find instances of, like, this is wrong. This is wrong.
Chris (31:33):
But I think the dysgenic pressures and scientific racism, there's actual evidence against that, whereas some of the other stuff is just, like, you've reached different conclusions.
Kayla (31:43):
Right.
Chris (31:44):
Yeah. And I think that's important because antifactual can't just mean, like, I disagree with your ideas.
Kayla (31:50):
Right.
Chris (31:50):
Right. It has to be, like, some specific. There's evidence against this. I am noting the presence of logical fallacies and blind spots.
Kayla (31:59):
So, like, I don't agree with the conclusions that longtermists have drawn, and I think that's. I think I'm right, but I don't think that makes them antifactual.
Chris (32:10):
Okay.
Kayla (32:11):
Which I could be wrong, honestly. This is one that I really want to hear from, like, people on. So, like, please, if you're listening right now and you're, like, screaming at your podcast, please let us know. Cause this one's hard.
Chris (32:21):
Discord call to action. Come join us on discord. Talk about it.
Kayla (32:24):
Do we want to just say low for, like, funsies? Okay, that sounds low to me.
Chris (32:30):
It's, like, low, but it's not in the. It's just above a quarter.
Kayla (32:35):
Right.
Chris (32:36):
So that's, you know, it's science.
Kayla (32:39):
Science. We're doing science here. Percentage of life consumed or life consumption.
Chris (32:45):
Oh, God. Again, this topic is big. Depending on who we're talking about and which letter, it could be a lot or a little. If it's cryonics, it consumes your whole life.
Kayla (32:56):
No, it doesn't. And also, we evaluated cryonics on its own, and it's not part of the TESCREAL bundle. I would honestly.
Chris (33:04):
It's a little barnacle on the TESCREAL ship.
Kayla (33:06):
It's like. It's. The Venn diagram is quite, like, the overlap of the Venn diagram is quite small, actually.
Chris (33:12):
I don't know about that, my dude. Like, there's a lot of TESCREALists that are signed up for cryonics.
Kayla (33:17):
I think that there's more cryonics people, people signed up for cryonics, that would subscribe to a TESCREAL philosophy than there are independent TESCREALists that are signed up for cryonics. I don't think I agree with you in saying that there's a lot of TESCREALists signed up for cryonics.
Chris (33:33):
I see it as, like, if TESCREAL is, like, nerd culture, then cryonics is, like, Magic: The Gathering. Nerd culture includes a bunch of different things. Yeah, there's definitely. There's a lot of nerds that play Magic, but not every nerd plays Magic. Some nerds play Pokemon.
Kayla (33:50):
Okay, well, this is not about cryonics. You're just trying to. You're just trying to avoid answering the question.
Chris (33:55):
Yeah, I don't want to.
Kayla (33:56):
Percentage of life. I'll answer it. I think it's high.
Chris (33:58):
Why do you think it's high?
Kayla (34:00):
I think it's high because I'm basing it off of a lot of, like, the folks, the specific folks that we've talked about. If you're talking about William MacAskill, this is his entire.
Chris (34:08):
But those are the leaders.
Kayla (34:09):
I know. I will land the plane. Nick Bostrom, William MacAskill. The names that we said, like, 100%, like, this is their entire lives. I'm sure they have hobbies as well that don't have anything to do with it.
Chris (34:20):
Magic: The Gathering.
Kayla (34:21):
I think that the stereotype that you and I have come across of, like, if you're a Silicon Valley person, this is all you can talk about, while it may not 100% reflect reality, reflects a trend, reflects a trope, reflects something that is going on in the culture of TESCREAL, in which this becomes a very big part of one's life. I think the folks that are on LessWrong spend a lot of time thinking and learning about LessWrong and rationalist stuff. I think the potential for a large portion of your life to be dedicated to this is very high, and the TESCREAL bundle has on-ramps that I think that you can, like, kind of. I don't want to keep comparing it to QAnon because there are huge, huge differences.
(35:07):
But I think there's a similar on-ramp, a similar onboarding, where, like, you can kind of take a small step in, and it's really easy to snowball that into. This is my everything.
Chris (35:18):
Both QAnon and TESCREAL have letters in their titles.
Kayla (35:21):
It's just the acronyms. Yeah. So I guess QAnon's not an acronym.
Chris (35:25):
Not really, no. Yes. I think I agree with you, and I think Emile also was kind of talking about that with the, like, I know people who emailed me. Like, that suggests that there's, like, a. It's not family separation, but it's like, maybe step one of that. It's like dipping your toes into family separation, where somebody's starting to spend a lot of time doing this other thing. That being said, I don't know. There's no compounds. There's a lot of conferences that might take up most of your life, going to all these different conferences.
Kayla (35:55):
It does seem like if you're into this, you might be going to a.
Chris (35:59):
Lot of conferences, eating a lot of hotel food.
Kayla (36:02):
Yeah, yeah, you're right. It's not the same as. I'm trying to think of something that has a compound. It's not the same as the Hare Krishnas, where people were leaving their families to go live communally. It's not the same as an MLM, where it literally does require you to have your entire life be consumed by the project. But I'm thinking again about, if we go back to Russian cosmism, which is one of the forefathers of this, the father of Russian cosmism very much had the belief that this should be.
Chris (36:34):
Everybody's dedicated to the task. The sort of, like, oh, my God, this is the most important thing ever. That thread definitely carried through into TESCREAL at large, I think. I'm gonna say medium. Medium.
Kayla (36:53):
Because the potential's there, but it doesn't have to be. I think you can be a casual TESCREAL guy.
Chris (36:58):
Casual TESCREAL.
Kayla (36:59):
I think you can casually be like, wow, this effective altruism thing sounds cool. Or you can casually read a Nick Bostrom book. We have a Nick Bostrom book on our bookshelf. You can casually dip your toe into these ideas.
Chris (37:09):
Yeah, I'm not gonna read it, though.
Kayla (37:11):
I think I had this last.
Chris (37:13):
Yeah, maybe I do want to.
Kayla (37:14):
Let's at least take it off our bookshelf. Cause now I'm embarrassed when people come over and see it sitting next to, like, our Ray Kurzweil one, which I'm.
Chris (37:20):
And they listen to the show. Like, people that come to the. Come to our apartment also listen to the show. So they'll just be like, I thought you hated that guy.
Kayla (37:26):
I'd rather keep Ray Kurzweil on our bookshelf than Nick Bostrom. And, like, frankly, Ray Kurzweil is way more of a magical prophet than. And, like, a factual source.
Chris (37:36):
Yeah, I still like him, though. I still like Ray Kurzweil. You're not going to get me to not like the guy.
Kayla (37:42):
He's got a lot of ideas that are.
Chris (37:44):
Yeah. Oh, yeah.
Kayla (37:46):
Not based on anything.
Chris (37:48):
Agreed.
Kayla (37:49):
And spends a lot of time disseminating those ideas.
Chris (37:52):
Okay. So because we have the ability for people that are into maybe one of the letters, to get into all of the letters and be consumed by the letters, then we're gonna say, what medium?
Kayla (38:04):
Medium's good.
Chris (38:04):
Medium. Okay. Great.
Kayla (38:06):
Dogmatic beliefs.
Chris (38:07):
Are you keeping track of all these?
Kayla (38:08):
Yes.
Chris (38:11):
We're right. Everybody else is wrong. Kick you out if you dissent in any way. Is that gonna depend on the letter?
Kayla (38:17):
Because I feel like, LessWrong. No, I feel like there's room for dissent.
Chris (38:21):
LessWrong was very low dogma.
Kayla (38:22):
But I feel like. I feel like. And this is so mean. This is not like, let's take this out of the evaluation, but I just wanna say it. I feel like I could have a. This is just based on vibes. I feel like I could have a conversation with William MacAskill in which I could challenge his ideas. I don't know if I could have a conversation with Nick Bostrom and challenge his ideas. I don't think he.
Chris (38:41):
Why is that mean?
Kayla (38:41):
I'm keeping that because that's literally based on nothing. And I'm basing a lot of things.
Chris (38:47):
You said feel.
Kayla (38:48):
I know, and I think it's mean to say, like, I feel like this guy's a dick.
Chris (38:52):
Yeah. Okay. Fair. It's not mean. It's maybe, like, a little antifactual.
Kayla (38:56):
It's definitely antifactual, and that's why I want to say it. Still, let's strike it from the record. Separate it from the criteria. There's no evidence for this, whereas we have the evidence for. There's room for dissent on LessWrong. I think there's even room. I mean, I'm in the cryonics discord. There's room for dissent there. I think that also.
Chris (39:15):
Now you're bringing up cryonics when it's supported.
Kayla (39:17):
You said we was allowed.
Chris (39:19):
Just throwing it around whenever you think it's useful.
Kayla (39:22):
I think there's room for dissent in the effective altruist community. I've seen examples of differing philosophers and academics and people being able to talk about this, and I can't imagine that if you go to a Silicon Valley party where everyone's talking about AI, they all have the exact same thought about AI. Even within TESCREAL, there are differing, diametrically opposed ideas about AI specifically. And those people are all part of TESCREAL.
Chris (39:47):
Right? If you're gonna call Eliezer Yudkowsky and Marc Andreessen TESCREALists, which I think we are, then you have to say that the dogma is low, because they have some very different views about how to go about this.
Kayla (40:00):
Right.
Chris (40:01):
Low dogma.
Kayla (40:02):
Ugh. That feels weird.
Chris (40:04):
Too bad. We thought about it. We said it. We are infallible.
Kayla (40:10):
That feels weird. But yeah.
Chris (40:12):
Yeah. Well, low dogma is just weird. And that's what's coming up on the.
Kayla (40:16):
End here, you got two more. Chain of victims.
Chris (40:19):
I mean, I had you read The Singularity Is Near, so I. Yeah.
Kayla (40:22):
Can I confess something?
Chris (40:23):
I got you into it.
Kayla (40:24):
I never finished that book.
Chris (40:26):
What?
Kayla (40:27):
I never. It's fabulous, Kayla. It's so big. I never finished.
Chris (40:30):
It is pretty big. It's a big book.
Kayla (40:31):
I've read portions of it from beginning to end. I referenced it for these episodes. I've Wikipedia'd it. It's like when you watch a movie on TNT. I've seen it.
Chris (40:42):
Okay, no, that is not at all like that. So if you read little chunks of The Singularity Is Near, like, small little chunks over time, then, like, eventually enough of them overlap that you know the whole movie.
Kayla (40:54):
Yeah, I think it's that.
Chris (40:55):
Then it's like WarGames. I mean, TNT.
Kayla (40:57):
Yeah. Yes. It's like Lord of the Rings.
Chris (41:00):
Lord of the Rings is your TNt example.
Kayla (41:03):
I just think it's interesting. Before I cut the cord, it was, like, always on TNT. Oh, man.
Chris (41:08):
No, for me, maybe it's just, like, an age gap thing, but for me, it's either WarGames or Tremors.
Kayla (41:14):
I just think we didn't have TNT when I was growing up. I think that was, like, a teenage thing.
Chris (41:18):
And now we have all the Gen Zs listening. What the hell's a TNT?
Kayla (41:23):
Okay, so chain of victims. Is it recruity?
Chris (41:29):
I don't get the sense that it compels anyone to want to recruit people into the fold.
Kayla (41:35):
It's not an MLM in that regard. It's not like, here's a codified recruitment chain, and it's not.
Chris (41:42):
It's not QAnon.
Kayla (41:43):
I gotta bring it up again. It's not QAnon in which it, like, you can see the virality. I think that this has the potential for virality, and that's probably the way it spreads more in that a book gets published and someone reads it, and then you recommend it to your friend. Or you're on Twitter and you see a thing and you're like, oh, what's that? Or you see an interview with Elon Musk, and he talks about population stuff that's going on, I think it has the potential for that virality, but it doesn't seem to grab hold the same way that QAnon did.
Chris (42:10):
Yeah. What about something like. Cause I would call NXIVM high chain of victims, too, because.
Kayla (42:14):
Yeah, but that started as an.
Chris (42:15):
It's hard to differentiate, but that's true. But it's hard to differentiate there between victim and perpetrator, especially at those mid to high levels, where they still got recruited in. They're still victims, but then they also did bad things to people.
Kayla (42:30):
I think that this. Having a diffuse, even though it's high, having diffuse charismatic leaders helps with that a little bit, and that there's not one point that is forcing everybody to revolve around them and requiring this top down structure. I don't think that this requires. I think if you really got into it, you could make an argument, but it would be an exonym, not an endonym, if that makes sense.
Chris (42:54):
Yeah.
Kayla (42:54):
It would be something that we are a framework that we are assigning rather than a framework that has been designed or emerged.
Chris (43:01):
I think also, I think this criteria is sort of inextricably linked to the expected harm one. Because if there's harm and you bring somebody in, then you start to get that ambiguity between victim and perpetrator here. I'm not feeling the harm on the individual member level nearly as much as I'm feeling the harm on a global influence level.
Kayla (43:27):
We are all chained victims in this.
Chris (43:29):
Right. That's true. We are the victims of TESCREAL. So I'm gonna say low on chain of victims.
Kayla (43:34):
I do think there's a little bit of virality with, again, going back to our favorite man, I think that Elon Musk has a little bit of a viral reach.
Chris (43:41):
Sure.
Kayla (43:41):
And I think that there's some virality in AI stuff. But again, just because I can go, like, well, this and this doesn't mean that TESCREAL as a whole.
Chris (43:51):
Right. And even with Elon, like, he has a big reach. Is it a viral reach? Yeah, I guess maybe people do like to share his shit, but, like, usually because it's stupid, I don't know. Yeah, I think. But I think overall, low.
Kayla (44:07):
And this is the last one. The final criteria. We know that.
Chris (44:13):
I think people are going to come away from this and, like, start using.
Kayla (44:15):
The word experience is the wrong word. Safe or unsafe exit.
Chris (44:20):
I don't feel like it's unsafe to.
Kayla (44:22):
I don't feel like it's unsafe at all. And if you're a TESCREAList and you want to leave, come, leave. You'll be fine. We'll take care of you. We'll welcome you into the fold of crip theory. And watching tv on the couch and not talking about AI.
Chris (44:33):
Yeah. You know, I lurked around LessWrong, and then I stopped. And, like, I didn't get shunned by anybody. Of course, granted, I didn't make any friends.
Kayla (44:41):
No. And we were singularitarians once upon a time, and it was easy to get in, easy to get out.
Chris (44:45):
Yeah. But again, I don't know. Like, if we were friends with Ray Kurzweil, do you think he'd shun us now if we were like, dude, I don't know.
Kayla (44:52):
If I got shunned by Ray Kurzweil, that would be the greatest thing to ever happen to me. Imagine getting to put that on your, like, resume.
Chris (45:02):
LinkedIn.
Kayla (45:03):
On your LinkedIn.
Chris (45:03):
Shunned by Ray Kurzweil.
Kayla (45:04):
Shunned by Ray Kurzweil.
Chris (45:05):
Would you put, like, a date there? You know, like, 2005. Attended so and so college and shunned.
Kayla (45:10):
By Ray Kurzweil. Put it under special skills.
Chris (45:12):
Special skills. Okay.
Kayla (45:15):
I think that the exit is safe. You're not gonna be cut off from your friends and family. You're not gonna be. Yeah. Shunned as in the Amish. You're not gonna lose your entire community, as in QAnon. I think that the exit is safe here.
Chris (45:28):
And, of course, your mileage may vary. I'm sure that has happened to some people where they were, like, really into EA, and they and their EA friends didn't care for it when they stopped being into it. I'm sure that exists. But from our research on these letters, seems low, so. Okay, so you know what's weird is that, like, the first few were, like, so high.
Kayla (45:52):
Yeah.
Chris (45:52):
And I was like, oh, we are just heading straight towards a cult evaluation. And then it just kind of petered out. And the last few were low.
Kayla (46:01):
So just a reminder for everybody. Charismatic leader, high. Expected harm, high. Presence of ritual, medium. Maybe medium well. Niche within society. Yes, it is niche. Antifactuality, low. Life consumption, medium. Dogmatic beliefs, low. Chain of victims, low. Safe or unsafe exit, low.
Chris (46:17):
It's almost split down the middle.
Kayla (46:18):
This, my friends, I think, is just weird.
Chris (46:22):
It's. Oh, man, I really was not expecting that, but it has such potential for danger and harm and perniciousness.
Kayla (46:29):
I think if we were to desire to punish our listeners, we'd go through every letter and evaluate every letter, and we're not.
Chris (46:40):
We should have done that at the time. That ship has sailed.
Kayla (46:41):
We're not doing that. We love our listeners. We're not going to do that. We could probably make individual cases that are different. Like, maybe the T is a little higher in this and the L is a little lower in this, and the L is a cult and the EA is just weird. We could probably play that game. But because we've discovered that this thing is a bundle, that the TESCREAL acronym is a terror management theory bundle, we're evaluating it as one. And I think you're right that it split down the middle. But I think that there's enough lows that I'm hitting just weird, and I don't want to say that.
Chris (47:14):
Yeah. And I also think, like, for me, this needs to, like, for me to call something a cult with these criteria, it needs to be, like, more than simple majority. It needs to be, like, most of the criteria. Like, you know, maybe with like one or two or three that don't. So I agree. Based on our criteria, it's not a cult. It's just weird. And I. You know, and I think about, like, okay, well, does anything make me balk about that? The harmfulness, like, things can be harmful without being a cult, right? Things can have, like, top down leaders and, you know, people influencing other people without being a cult.
Kayla (47:49):
But I wouldn't call, like. Is that the case? Is Chevron a cult? Probably. I don't know. Companies are cults. We did that.
Chris (47:57):
Yeah, we did. Well, I mean, that's the risk of a show like this, right? Is that we're like, you know, everything's a cult, right? Like, we could easily fall into that. And I think we shouldn't.
Kayla (48:07):
Man, I really don't want to call it just weird, but I think that. I think that TESCREAL as a whole, while it has the potential to grow and blossom into something more, currently is just weird and has some cult-like tendencies, for sure, as we learned from Doctor Torres and just from other people we've talked to. But, yeah, I think that's our definitive answer this season.
Chris (48:26):
I'm gonna tell Emile. They're gonna be so upset. They're gonna disown us on Twitter.
Kayla (48:30):
If you are in the same camp as Doctor Emile Torres and you disagree with us again, please let us know, because we're wrong all the time.
Chris (48:38):
Yeah.
Kayla (48:39):
And we wanna be corrected.
Chris (48:40):
No, we want to be less wrong.
Kayla (48:43):
Oh, God.
Chris (48:43):
Okay, so at the end of all this. Cause, like, you mentioned this earlier, we've kind of self-identified in the past as singularitarians, and in general, we kind of like this stuff. Like, in general, we kind of like cool, sort of, like, futurist cyborg stuff. We would like to not necessarily have to die at age 77. We would like to have more control over when we die. There's a lot of things that are appealing, that have traditionally, in the past, been appealing to us. Maybe still are. So after doing this season of the show, how do you feel about TESCREAL? And I'll say TESCREAL. I know you've never been like, I'm an Extropian, but how do you feel about that part of your identity and belief system.
Kayla (49:34):
Now calling me a TESCREAList?
Chris (49:38):
Yeah.
Kayla (49:41):
I don't think that you have to be a TESCREAList in order to be a futurist. Futurist is not part of the TESCREAL acronym. It's not fTESCREAL. It probably could be, maybe, because a lot of TESCREAL people are futurists. But I don't think you have to be a TESCREAList to be a futurist. To be somebody who thinks about, is interested in, is excited about, is passionate about, maybe has a reverent relationship with what the future might hold in terms of, like, our technology and our expansion and our growth and what that means for, like, human beings and humankind and going off planet. I don't think you have to be a TESCREAList to be interested in those things. And I think that it's a little bit unfortunate that TESCREAL is such a big chunk of that community these days.
(50:28):
And it reminds me about how effective altruism got a little bit bastardized by the community that kind of took it over. It started as like, oh, this is people who want to be, like, frugal and ascetic and donate lots of their.
Chris (50:42):
Money to things versus, and to be fair, had, like, a naivete about what, you know, the power structures are that enabled them to have that sort of takeover.
Kayla (50:52):
They didn't build. William MacAskill didn't build a philosophy around, like, how can Elon Musk get billions of dollars?
Chris (50:58):
Right.
Kayla (50:59):
It just kind of got, because of the blind spots in EA, co-opted by the billionaires.
Chris (51:04):
Sure. Yeah.
Kayla (51:05):
And I think that futurism, or, like, thinking about the future and thinking about technology and thinking about cyborgs and thinking about all these cool things, have gotten a bit co-opted, via the blind spots, by TESCREAL.
Chris (51:17):
Right.
Kayla (51:17):
And so I think that I can still identify as somebody who is a futurist.
Chris (51:24):
Would you, if somebody asked, identify as a TESCREAList? If somebody asked you, were you a transhumanist, what would you say?
Kayla (51:29):
No, no, I think because you introduced crip theory in the last episode, and that gives me something that I align with more to talk about.
Chris (51:41):
Then you could say you're into crip technoscience. Yeah, yeah, I kind of. So I agree with all that.
Kayla (51:49):
I'm not saying all TESCREALists are bad and I don't talk to them. I'd love to have these conversations with people. And there's definitely, you know, take what you like and leave the rest. There's definitely things there that I can take, and there's a lot of things that I can leave.
Chris (52:00):
Yeah, I think take what you like and leave the rest is a big part of this show. We've definitely talked over the years about sort of being anti the concept that, like, oh, you're brainwashed when you're in the cult, and then when you're out of the cult, you're super not brainwashed. In that sort of, like, binary of, like, oh, my God, I believe everything these guys are saying, and the rest is evil. And then when you get out, oh, my God, those guys were evil, and I believe the opposite of everything. I think that what we advocate, or think is a better model for understanding, is a more integrative approach. For example, I'm really glad that we did those episodes on Ayn Rand and objectivism, because I think that's a good example.
(52:40):
There's definitely still some things that I think I keep in me from that, even though a lot of it is crap, a lot of it is horrible. And I just think that's a more healthy relationship with some of our past engagements with things that we may not like now. The sort of, like, again, take what you want. It kind of sounds so cliche because it is a cliche, but, like, I think it's the best approach, because if I sat here now and was like, everything about objectivism was bad, and now everything I think is right, then, like, I would just be running into the same. I would have the same blind spots. I would have the same, like, I can't be wrong. Everything I believe right now must be correct. And then how. What good is that?
Kayla (53:24):
And I think that having been a part of some of these communities in some ways better positions you to be able to criticize from within. Part of why we wanted to bring doctor Torres on the show as an authority is not simply because they helped coin the term TESCREAL and has done this research, but also because they have a deep background in being a member of that.
Chris (53:46):
Yeah, they helped edit Ray Kurzweil's second book.
Kayla (53:50):
And I think that people who have that lived experience are better positioned to offer honest and valid criticism to a movement.
Chris (54:01):
Yeah. So with that in mind, would I call myself a transhumanist? I don't know. Maybe I would say I'm a crip technoscientist instead. But if somebody said yes or no, just transhumanist, and didn't ask me to give an alternative, then I might still say yes. I think there's a lot of danger here. I think that doing the eugenics episodes definitely opened my eyes to the idea, and I think this is partially because of us talking about crip theory and crip technoscience, that the how, how you go about achieving a goal, really does matter. Not just the goal. Like, you might say that this is me putting words in people's mouths, so this is just my thought. But you might say, like, a cyborgist and a eugenicist both have this goal of, like, transcending their biology. Right.
(55:00):
But, like, one is going about it via very pernicious, bad means that have been proven to be extremely dangerous, to the genocide level. And then the other one is doing it in a manner that is, like, you know, creative and individual focused and blah, blah. So I think that, like, the how makes a big difference. And that's why I think I'd be comfortable saying, I'm a transhumanist. I'd be like, I'm a transhumanist, but I don't like Idiocracy. I'm a transhumanist, but I don't believe in dysgenic pressures.
Kayla (55:32):
So it's a little bit of my relationship to the word vegetarian.
Chris (55:35):
Mm.
Kayla (55:36):
I don't like to call myself that.
Chris (55:38):
And part of that vegetarianism is a dysgenic pressure.
Kayla (55:41):
It's absolutely a cult. No, part of that is because of the stigma that's associated with it, which is definitely less now. Like, I'm somebody who, a, my vegetarianism has waxed and waned through my life, but I came up as a vegetarian in a time when it was, like, really not looked upon kindly.
Chris (55:58):
Well, you also lived in a pretty red meat sort of area.
Kayla (56:01):
Yeah. And so there was stigma attached. And then also there is internal stigma. There are people who are vegetarians and vegans who I am not in community with because I don't fuck with the way that they talk about the underpinning philosophy of vegetarianism. And veganism. So I have a little bit of a complicated relationship with that label. And I think that mirrors a little bit, maybe how you're feeling about something like TESCREAL or transhumanism.
Chris (56:29):
Yeah. And I don't think we're nearly at the place where somebody's going to ask me if I'm a TESCREAList. It's not nearly that much in the lexicon yet. But I think somebody might ask you if you're a transhumanist. And certainly I think. I think our friends might ask us how we feel about the whole thing now that we've done these episodes this season. And I think that's my answer. I had already kind of given up on a few of the singularitarian things that feel a little more fantastical.
Kayla (56:58):
No hope in my life.
Chris (57:03):
But I really have. Until I started really engaging with some of the TESCREAL stuff from Timnit and Emile, I didn't see as much harm as I see now, and as much potential harm. And I think that is useful. When these ideas are so influential in Silicon Valley, they're so influential in the most influential place that has ever existed, I think it's important to know those things.
Kayla (57:34):
So, Chris, I think that's a good place to kind of transition us into the latter. The second half of this episode or.
Chris (57:42):
Second part of this episode to Transhuman.
Kayla (57:45):
To transfer what we've been talking about. We started this season by talking about death. We were really captivated by what cryonics meant about our relationship to death when we started the season and before we started the season, and we wanted to extrapolate that and explore that. And that led us to TESCREAL, which I would still argue is very much about the fear of death and is very much about managing terror regarding our mortality. So much of this has to do with immortality. And not just immortality of the self, but immortality of the species, ensuring that humanity exists forever. People who are talking about x-risk, existential risk, are managing their.
Chris (58:27):
Their terror about the species. Yeah. And transhumanism is like, we want to transcend human biology and do all these things and make people better in so many ways, but mostly we want to have longevity.
Kayla (58:39):
Right? And so I kind of want to. You asked me a question, so I'm gonna ask you a question.
Chris (58:43):
No, please don't.
Kayla (58:44):
I think we've distilled what we're saying about, like, TESCREAL. What are we saying about the fear of death with this season? Like, what is your takeaway on that? And I have a second question to transition into that. So if you want me to ask you the second half, I can. Or you can answer this part.
Chris (59:01):
Go ahead and lay the second half on me.
Kayla (59:04):
I want to talk to you about how maybe your relationship to your fear of death has been altered, or not, by this season.
Chris (59:15):
Honestly, I think I had. If this season had taken away my dreams of living in a San Junipero, of uploading my brain to a heaven adjacent techno heaven.
Kayla (59:28):
Right.
Chris (59:28):
So that I could.
Kayla (59:29):
Persistent world. Or persistent digital self.
Chris (59:31):
Right. Of pleasure and joy or whatever, then I think I would have a different answer than I do. But I think I had already kind of... before we started this season, I was already pretty skeptical that the rapture for nerds was actually going to become a reality, right? So I don't think it's really changed my relationship to death as much. I do think that the last episode of last season did, though. The death cafes. The death cafes did. And I don't necessarily think... I'm a little torn. Inside me, there are two wolves, as always, and one wolf is like, hey, cryonics might be cool, and it would be nice to live as long as you want and choose your own time of passing. That would be cool. And then there's another wolf inside me that's like, that fear has led humans down some weird paths.
(01:00:26):
According to terror management theory, all of the paths we have gone down over the centuries and millennia have been because of our fear of death. So there's this other wolf that's kind of going like, yeah, but maybe it's better to somehow find a way to make peace, to come to terms with that reality. And I think that I definitely get more of that from the death cafe. One of the biggest takeaways I have from that is how often we talk about living well and how rarely we talk about dying well. And so, at least over the past year, I've come to think that if somebody dies well, that's important. I don't know. Sometimes we joke about this, right? Like, okay, Evel Knievel does 47 flips over the Grand Canyon while he's on fire, and cool.
(01:01:18):
And we're like, well, but what if he dies? And I'm like, what a way to go. Like, after going to death cafe and sort of having that mindset changed a little bit about dying well, right? Now, when I say what a way to go, I'm, like, kind of serious.
Kayla (01:01:32):
Yeah.
Chris (01:01:33):
You know, like, if that's. If your life's work is doing crazy shit like that, and that's how you.
Kayla (01:01:40):
Go out... if your life's work is dying well, yeah, that's tremendous.
Chris (01:01:47):
Yeah. So, like, sometimes now when I see things where it's like, well, that's risky, you know, rather than being like, oh, no, the most. The overriding, most important thing is to not die.
Kayla (01:01:57):
Right.
Chris (01:01:58):
But maybe we should make some room for dying.
Kayla (01:02:02):
Well, not that we're advocating everybody go out and jump the Grand Canyon on motorcycles. I'm still. I totally see what you're saying, and I still will probably continue to live my life with, like, oh, no, that's scary. That's risky. I don't want to.
Chris (01:02:14):
No. Yeah, yeah. I think that having an appreciation for risk and safety is also important.
Kayla (01:02:19):
It makes me think more about, like, you know, I don't. No spoilers, but it makes me think about a recent tv show that we've watched in which a survival scenario was presented to a group of people, and lots of those people did not survive. Most of the people did not survive. And many of those people went out in ways that, while fictional, were beautiful. Defending other people, helping other people, you know, making meaning out of their death in ways that, frankly, made them more immortal than simple immortality would.
Chris (01:02:54):
Yeah. Yeah. So I know that this season has definitely changed some of my perspectives. I just don't know if it's changed my perspective on death. It's changed my perspective on TESCREAL. Has it changed yours? I'm still pretty afraid of dying.
Kayla (01:03:09):
Me too.
Chris (01:03:10):
The only thing that has been able to alleviate that a little bit for me was in the brief moments that our son was alive.
Kayla (01:03:20):
Yeah. So not the podcast.
Chris (01:03:24):
So not the podcast. Sadly, it's a good podcast. It's not that good.
Kayla (01:03:30):
I think, for me, it has affected my fear of death.
Chris (01:03:34):
Really?
Kayla (01:03:35):
And I think when I say that, I mean, not that it's lessened my fear at all. I'm still terrified. But it's made me more comfortable in, like, knowing that I kind of have to build a relationship with that fear and not try to manage it away. Like, yes, according to terror management theory, that's kind of what we're always doing.
Chris (01:03:57):
The desire to not manage away your terror is itself terror management.
Kayla (01:04:04):
I think it's just made me... When I first learned about cryonics, it instilled a panic in me, really. I became panicked to a degree of, like, oh, fuck, I have to do this, and I have to do this now, and I have to do this fast, and I need to get you signed up, and, like, should we have done this for our son? Am I a bad parent? Am I... do I need to get my family involved in this? Like, we need to do this now. Now. Like, I felt a panic of, like, if I don't do this, then I'm going to die and be dead, and then that's it. And there's no... it's like, it's the worst thing I can think of. And to me, that was terror management. That was me going, like, well, Kayla.
Chris (01:04:38):
That was just your hyper fixation.
Kayla (01:04:39):
Yeah. I've discovered a way to solve this unsolvable thing. I've discovered a way to surmount death. And so I have to do it now. I could die at any time. I have to do it now.
Chris (01:04:48):
Yeah.
Kayla (01:04:48):
And going through this season, I don't feel that anymore. I actually feel much more calm in a weird way where.
Chris (01:04:57):
Cause we don't wanna wake up alongside Peter Thiel and Nick Bostrom.
Kayla (01:05:01):
Peter Thiel and Nick Bostrom. That does help, I'm not gonna lie. And also, I think just talking about Crown X and talking about the Church of Perpetual Life and talking about longtermism and EA and transhumanism and extropianism and cosmism, and how much, like, panic and anxiety are behind these things, makes me realize that I can either spend my life in panic and anxiety and try and stop death for myself, or not. And kind of either one's okay. And that's... I'm seeing a possibility that there's just as much value in going, I'm not going to solve my own death, and that's okay, as there is in going, I think I found a way, and that's what I want to do. And I'm more comfortable right now with the part of me that's going, I can have a relationship with this fear of death.
(01:05:57):
And if I don't sign up for cryonics, I'm committing to the finality rather than leaving an ellipsis. I feel more comfortable about that than I did when we started the season. I feel a lot more comfortable about that.
Chris (01:06:09):
Yeah. Dying well can be as valuable as living forever.
Kayla (01:06:14):
Yeah.
Chris (01:06:16):
Would you still sign up for cryonics? Say you had money to spare?
Kayla (01:06:21):
If I had money... if I had fuck-you money? Yes.
Chris (01:06:24):
Okay.
Kayla (01:06:25):
I would sooner sign up for cryonics than buy a fancy car. But considering that I don't.
Chris (01:06:29):
What if you had, like, screw you, buddy money?
Kayla (01:06:31):
Considering that I don't. And so I'm evaluating my life as the person I am now who has to make budgetary decisions and decide, okay, do I want to live in a one bedroom apartment and not have children and live as frugally as I possibly can in this life for the possibility of an extension. I'm not currently willing to make that trade off, and I'm. And I feel comfortable with that. I feel comfortable with living well now or living as well as I can, rather than trying to tie my life to the possibility of more.
Chris (01:07:08):
What if you woke up in a solarpunk utopia, though? Well, you wouldn't. You wouldn't, because of the Peter Thiels and Nick Bostroms.
Kayla (01:07:17):
Yeah, that's cool. And also, like, what if they weren't?
Chris (01:07:20):
What if it was, like, people that you liked were all signed up and all these douchebags weren't? Would that actually make a difference?
Kayla (01:07:27):
Yeah, kind of would. Yes, it would. But that's not the reality.
Chris (01:07:32):
Yeah, I know.
Kayla (01:07:33):
And I also think that, like, right now, where I am talking about this, I'm feeling less attachment to the idea that the me, the "me" that is the "I" that I am right now, has to experience that or else it's not valuable. Or else... I'm a little detached. I'm a little less attached to the FOMO. And maybe that's just because, I don't know, am I about to have a midlife crisis? I don't know. But I think that I can find... I think part of that is talking this season about how so many of these people are, like I said, not all cryonicists. So many of these people are not signing up for forever life for themselves.
(01:08:16):
So many of these people are talking about long termism and talking about the future of humanity, not the future of themselves.
Chris (01:08:22):
Right.
Kayla (01:08:22):
And so if I can do... if I can take what I like from that and leave the rest, it's okay that other people will get to experience that and not me. And I find a value in that, even though my consciousness does not experience it. Like, we're all just pieces of the universe waking up to itself. We're all that. Do I find some comfort in thinking about how somebody else will experience that? And in a way, that's a humanity thing, not a me thing. Does that make sense?
Chris (01:08:58):
It does. I don't know if this is for the show. I'm kind of going to burst your bubble.
Kayla (01:09:01):
No. Why?
Chris (01:09:06):
That's great. I love it.
Kayla (01:09:10):
But cool story, bro.
Chris (01:09:13):
Cool story, bro. No, it's beautiful. But isn't that also true for people who get put in bronze bulls and tortured and.
Kayla (01:09:22):
What do you mean?
Chris (01:09:23):
Horrible things happen, too. It's kind of like at the end of season one of True Detective, bro. For every good thing, for every part of the universe that's waking up experiencing something cool, there's also the negative one.
Kayla (01:09:35):
You can't ask me, what if you woke up in a solarpunk utopia, wouldn't that be great? And then when I say, like, yeah, I hope other people get to experience that, you can't gotcha me and say, like, well, what if there's bad stuff? You presented me a scenario and I answered it.
Chris (01:09:50):
What if the solarpunk utopia doesn't have SPF? Then what are you gonna do with all the solars? Yeah, no, I think that's a good answer. That is a beautiful answer.
Kayla (01:10:04):
And I'm not saying I don't want to live forever. I'm just saying I'm a little more okay after this season with acknowledging and accepting that it may not happen. When I learned about the singularity and I learned about, oh, my God, I could live forever, I got kind of really attached to that, and I'm unattaching now.
Chris (01:10:26):
Yeah. Just to answer my own question, I think I would still sign up for cryonics. I don't know. I think it's basically very similar to what you said, where there are budgetary constraints for us. We don't have fuck-you money. But if the budgetary constraints were relaxed for whatever reason, because we do get fuck-you money or whatever, if I could.
Kayla (01:10:47):
Get a good life insurance policy, which unfortunately, I probably can't, it's kind of.
Chris (01:10:51):
Like the might as well thing. I'm going to be dead anyway. Might as well roll the dice.
Kayla (01:10:56):
And I think coming at it from a might as well perspective versus a panic and anxiety perspective feels fine.
Chris (01:11:03):
And I think I had a bit of that sort of transition, like you were talking about more with the death cafe than with all this stuff.
Kayla (01:11:12):
Yeah. Are you feeling done?
Chris (01:11:17):
What I'm feeling is, like, I've had a million thoughts over the past, like, five months about all of this, and now, like, I don't know if I'm, like, now the panic I have is there's, like, just all these really good thoughts up in my head that I just can't remember.
Kayla (01:11:34):
That's why we're going to leave room open for future episodes this season, even though this is the finale.
Chris (01:11:41):
Oh, wow. What a good. I didn't even do that on purpose.
Kayla (01:11:44):
I know. This is called making accommodations for ADHD. We're an accessible podcast.
Chris (01:11:49):
That's right.
Kayla (01:11:50):
I guess if we're ready to close out, I kind of have just a little food for thought to maybe close us out.
Chris (01:11:56):
Ooh, what is it? Is it Mac and cheese?
Kayla (01:11:58):
Well, I just... I think that something that is cool about TESCREAL and futurism and cryonics and all of this that we've talked about this season is that it presents the opportunity for a grand adventure. This is all grand adventure stuff.
Chris (01:12:11):
Even though I'm waking up next to Peter Thiel, it might be in this, like, crazy Samurai Jack post-modern weird.
Kayla (01:12:19):
World. And the things that transhumanism may do to our bodies that we can't even conceive of, and thinking about what the long-term future is and what humanity might look like when it spreads out across the stars.
Chris (01:12:29):
This is grand adventure stuff.
Kayla (01:12:31):
Yeah.
Chris (01:12:31):
I would like to visit Mars.
Kayla (01:12:33):
It's very cool.
Chris (01:12:34):
I know there's a lot to dislike about all this. There's a lot to criticize, but, man, I really want to visit Mars.
Kayla (01:12:39):
And I think that's something, again, that I will take from TESCREAL: that there's really cool things to think about here. Sci-fi is cool for a reason, and this is why. The grand adventure. But I came across a quote from Jane Goodall recently, and it kind of offered an alternative to that as the only great adventure. So I'm just going to read this quote to you and the rest of our listeners.
Chris (01:13:04):
Okay. Is it ook?
Kayla (01:13:07):
Is that how chimpanzees talk?
Chris (01:13:09):
I think they do sign language, actually. So you're gonna sign this to me?
Kayla (01:13:12):
I'm not gonna sign this to you.
Chris (01:13:14):
Okay. That would not be good on a podcast. That would be actually not accessible.
Kayla (01:13:18):
So here's the quote: "My next great adventure, aged 90, is going to be dying. There's either nothing or something. If there's nothing, that's it. If there's something, I can't think of a greater adventure than finding out what it is."
Chris (01:13:36):
And thank you to all of our wonderful listeners for going on this adventure with us, this TESCREAL terror management season of Cult or Just Weird.
Kayla (01:13:47):
This is Kayla, and this is Chris.
Chris (01:13:49):
And this has been season six of.
Kayla (01:13:52):
Cult or Just Weird.