Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Emile Torres (00:01):
In the near future, where we go from our current human state to this radically different post human state, in a sense, that's the end of the world. The end of the world as we know it, at least. And that transition itself will be. There's this utopian promise on the other side of it, but we have to cross this field of landmines to get there. And that's sort of like the apocalypse, right? Right. One possible outcome of trying to build this utopian post human future is human extinction.
Chris (00:48):
All right, let's get started. Welcome to Cult or Just Weird. This is Chris.
Kayla (00:52):
This is Kayla.
Chris (00:53):
I forgot to give my credentials. I'm a game designer and a data scientist, even though those credentials are totally worthless for what I'm doing.
Kayla (01:00):
This is Kayla. I already said that. That's my biggest credential. And then my next credential is that I'm a tv writer. And then my third credential is that I host a podcast.
Chris (01:10):
Well, welcome to Cult or Just Weird. We'll get right into it here. But before we do that, just a few quick items of business, or maybe just one item of business, unless you have anything. But we have a new Patreon subscriber, so gotta do our shout out. Thank you to Jim Fingal.
Kayla (01:26):
Hell, yeah.
Chris (01:26):
For subscribing to our patreon. Hopefully you join us on our discord. You don't need to be in our patreon to join us on our discord. It'll just give you access to exclusive chat rooms in the discord.
Kayla (01:39):
Our discord is a lot of fun, my friends. There's a lot of really good memes in there.
Chris (01:44):
It's all memes that I've stolen from Elon Musk because he steals them from everybody else. Yeah. So if you just go to our show notes, there's a link to the discord there. It's also on our website and pretty much everywhere else that we've done anything on the Internet. All right, Kayla, now we need to transition to talking about transhumanism. That got you. Really? I thought that was pretty lame.
Kayla (02:07):
You're fired. I'm quitting.
Chris (02:10):
Well, so we're not just talking about transhumanism. Actually, if you've been with us for the last several episodes, what, like five episodes now? I think five or six. Something like that. Really?
Kayla (02:20):
This whole season, more than we anticipated. This season took some turns. I mean, we're still on theme, which I'm very proud of us for. But we did end up taking some turns that I don't think were.
Chris (02:31):
Expecting, well, this is one of those serious rabbit hole situations. This rabbit hole is deep with many chambers and it's confusing and it's hard to know. It's hard to orient yourself or get a framework for understanding any of this stuff. So what do we do when we are confused about something and we want to learn more about it?
Kayla (02:51):
Well, go to Twitter.
Chris (02:53):
We go to Twitter. We look at my mom's Facebook posts. Or alternatively, we look for an expert to talk to, which is exactly what we did here. So the next three episodes will be an interview between myself and Doctor Emile Torres. They were kind enough to speak with me about a concept that. Well, we'll get to that. But basically it's an effort to tie a lot of the concepts that we've already talked about in these last few episodes together.
Kayla (03:23):
There's an umbrella here. I just don't know what that umbrella is.
Chris (03:26):
It prevents you from getting rained on. Oh, yeah, I'm just cribbing this directly from the Wikipedia article on.
Kayla (03:35):
That's how you know you've made it, is when you got a Wikipedia.
Chris (03:39):
I know. Which is like, it's actually not that you've got a Wikipedia, it's that you have maintained a Wikipedia page.
Kayla (03:45):
They haven't stolen it away from you because they've deemed you unworthy.
Chris (03:48):
Yeah. So Doctor Emile Torres, who does have a Wikipedia page, that, to my knowledge, has lasted at least a few weeks since I've been doing this research. According to the Wikipedia page, they are an American philosopher, intellectual historian, author and postdoctoral researcher at Case Western Reserve University. Their research focuses on eschatology, existential risk and human extinction.
Kayla (04:13):
Cool.
Chris (04:14):
Along with computer scientist Timnit Gebru, Torres coined the acronym. Well, we'll get to that.
Kayla (04:20):
Like how you keep hiding it.
Chris (04:21):
I know, I know. It's like, going to be so anticlimactic. Oh, whatever. But that's what we're here to talk about. So anyway, without further ado, here is the first part of my interview with Doctor Emile Torres. I'm super excited to talk to you because. Yeah, I've followed you on Twitter and on your other socials for a little while now and I'm sort of like a fan of singularity type stuff. But I also agree with what you're saying. So I've just been really curious to, like, I don't know, help me unpack some of this stuff.
Emile Torres (04:53):
Yeah, sure.
Chris (04:54):
So let's just start with just the real basic stuff. If you could introduce yourself for our audience, if you want to talk about your work or even plug anything you're working on right now?
Emile Torres (05:03):
Sure. I'm Emile P. Torres, and my work over the past decade plus has focused on existential threats to humanity and civilization. And my most recent book was on the history of thinking about human extinction within the western tradition and the ethics of human extinction. More recently, I've done a lot of AI ethics work, including publishing a paper with Doctor Timnit Gebru on what we call the TESCREAL bundle of ideologies, from transhumanism, the T, to longtermism, the L, the TESCREAL bundle. So that's who I am. That's what I've been up to.
Chris (05:38):
That sounds like a fascinating line of work to research human extinction. How did you get into that?
Emile Torres (05:47):
So I think what piqued my curiosity and whetted my appetite for this topic was probably growing up in a deeply religious community where there was lots of talk about the end of the world and the end of humanity. So there was the expectation that the rapture was going to happen at any point, the Antichrist was going to emerge.
Chris (06:11):
What religion were you brought up in, if you don't mind my asking?
Emile Torres (06:14):
I don't mind at all. It was, yes, a very fundamentalist evangelical community. I attended a Baptist church for many years. And so, yeah, there was this expectation that the Rapture was imminent. The 1990s is when the Left Behind series was.
Chris (06:32):
Oh, right.
Emile Torres (06:34):
Remember that? I mean, that series, I mean, they sold. I can't remember exactly what the numbers were, but, I mean, it was a massive success. Like, people don't realize how many books. I think it sold as many books as were sold in total the previous three decades, something like that. Yeah, it was incredibly successful. Yeah, I'm awful with the details, but, I mean, massively popular. So I was very much sort of caught up in that, and I think that planted the seeds of interest in the future of humanity. And the end of the age where you.
Chris (07:07):
Sorry, that age where you. You were a believer, too.
Emile Torres (07:11):
Absolutely, yes.
Chris (07:12):
Okay.
Emile Torres (07:12):
Very much so, yes. I mean, I sort of had, you know, some dreams of possibly being a preacher. So, yeah, I was very much a believer. But ultimately, you know, by the early two thousands, I had pretty much left religion. The problem of evil, as philosophers call it, was a main reason. I just couldn't understand how God would allow there to be so much or even any evil in the world if God really was omnibenevolent, perfectly good, and omnipotent, all powerful. So that's what led me out. But nonetheless, what was left behind was a void that took the shape of eschatology, and that is just the study of last things. So it's been a branch of theology since forever.
Chris (08:05):
Yeah, yeah. Nice callback to left behind there, by the way.
Emile Torres (08:08):
Yeah, yeah, right. Yeah. So, yeah, I don't know. Then I stumbled upon these writings from transhumanists and longtermists, although the word longtermism hadn't been coined at that point. And they basically were talking about some radical transformation in the near future where we go from our current human state to this, you know, radically different post human state. In a sense, that's the end of the world. The end of the world as we know it, at least. And that transition itself will be. There's this utopian promise on the other side of it, but we have to cross this field of landmines to get there. And that's sort of like the apocalypse.
Chris (08:54):
Right? Right.
Emile Torres (08:55):
And one possible outcome of trying to build this utopian post human future is human extinction. So that's how I got into it. And I was very much involved in that transhumanist, longtermist movement for a decade or more and then became a critic of it. Felt like their whole approach to thinking about the future of humanity is misguided. But still, I mean, you can be interested in human extinction and the long term future of humanity without subscribing to any of these ideologies. So that's sort of where I am now. Anti-transhumanist, anti-longtermist, but nonetheless super interested in this possibility that our species may cease to exist at some point in the future.
Chris (09:36):
It seems like maybe mission accomplished on the preacher thing there.
Emile Torres (09:40):
Yeah, right.
Chris (09:40):
Just a different topic.
Emile Torres (09:42):
Yeah, yeah, absolutely. I mean, I talked to a journalist just the other day who asked me if it would be appropriate to refer to some leading figures in the transhumanist or longtermist movement as prophets. And it's like, kind of. I mean, you know, they'd object to that term, of course, because they don't want to be associated with, you know, traditional organized religion.
Chris (10:04):
But Ray Kurzweil, for sure.
Emile Torres (10:07):
Yeah, exactly. I mean, even down to his specific predictions about when the singularity, sometimes called the techno rapture, will occur: 2045, you know, which is a very specific date. Dates like that for the second coming of Christ or the rapture, I mean, those have been very common throughout the history of religion. You know, the Millerites famously thought 1844, the second coming would happen. Harold Camping, not that long ago, predicted, I think it was 2011, the rapture was going to happen.
Chris (10:40):
So at least there's a track record, honestly.
Emile Torres (10:42):
Yeah, right. There are a lot of examples. Yeah.
Chris (10:46):
I do want to get back to your journey, because I also have questions about my own journey with these ideologies. But first, I want to level set a little bit. We've been talking on the show the previous few episodes about transhumanism. I've mentioned extropianism, but I kind of want to level set with you. What is TESCREAL? What does that acronym mean?
Emile Torres (11:09):
So, the reason that Timnit Gebru and I came up with that term was we were writing a paper trying to understand the ideologies that have driven and shaped the current race to build artificial general intelligence, or AGI. So what are these? Where did this race to build AGI come from? Because ten years ago, maybe even really five years ago, basically, nobody was talking about AGI.
Chris (11:37):
That's kind of a good point. Yeah.
Emile Torres (11:39):
Right. So it sort of came out of nowhere. Even ten years ago, a lot of people in the field that we would now consider to be AI wouldn't use the term AI. They were just embarrassed by the long history of unfulfilled promises. So there was this. Okay, we're working on, like, machine learning or natural language processing.
Chris (12:01):
Yeah, you're right. The language was different. And, I mean, I've. Sorry to interrupt, but machine learning is something that I'm pretty familiar with from my work with big data analysis, and. Yeah, that's all you heard. Like, you know, you never heard AI. You only heard machine learning.
Emile Torres (12:18):
Yeah. So that's a fascinating transformation of the field that is very recent. And so I don't want to speak for Gebru, but my understanding is that part of the reason that she got in touch with me in the first place was because she was trying to understand: why is it that all of a sudden everybody's talking about AGI, and these companies that have billions and billions of dollars in funding or investments have as their explicit goal the creation of these artificial general intelligence systems? How did that happen? And so our claim in the paper that we just recently published, there are two interpretations of it.
(12:58):
One is the weak thesis, and that is just the claim that if you want to understand the origins of the race to build AGI, if you want to understand where these companies came from in the first place, a complete explanation requires reference to seven ideologies, and these are the TESCREAL ideologies. So there's transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism. I know that it's a mouthful. The big polysyllabic words.
Chris (13:32):
No, you just gave me the rest of the episodes for the season. So there you go.
Emile Torres (13:35):
Yeah, great. I mean, without a doubt, like, a book could be written on each of these ideologies. So the weak thesis is just saying, like, if you want to give a complete explanation, you have to reference these ideologies. The strong thesis says that actually. So the strong thesis accepts the weak thesis, but goes beyond it in saying that actually we should conceptualize these ideologies as forming a single cohesive bundle. You know, it's basically just one tradition of thinking that has these various little branches, these variations. And so that is what we defend in the article. Transhumanism is sort of the backbone of the TESCREAL bundle, pretty much. But, I mean, all of the other ideologies grew out of the transhumanist movement, either entirely or at least partly. So extropianism, singularitarianism, and cosmism are just versions of transhumanism.
(14:34):
Rationalism was founded by a transhumanist, extropian, singularitarian who has close connections to cosmism. Effective altruism came out of the transhumanist movement and was developed within the rationalist blogosphere, on the blogs run by rationalists. And longtermism is really just a kind of ethical interpretation of the vision that is at the heart of the worldviews of a lot of transhumanists. We go out, colonize space, we create this sprawling, huge, multi galactic civilization. It's very similar to cosmism in terms of its vision of the future. So that's the strong thesis. And I think the evidence for the weak thesis is unambiguous and overwhelming. Those ideologies played an integral role in the race to build AGI. And I also think the evidence for the strong thesis is extremely strong. That's my contention, at least.
Chris (15:40):
Right? Yeah, it's interesting. You're kind of getting into my follow-up question. I was gonna ask why you coined this term. Like, does it convey something important that simply saying transhumanist or singularitarian doesn't convey by themselves?
Emile Torres (15:58):
Yeah, yeah. Great question. So I would say, like, part of the importance of the TESCREAL concept is the following. I think a lot of companies within our capitalist system, their behavior can be explained fully and completely simply by referencing the profit motive. Right. So why are the fossil fuel companies destroying the world? Well, they're just trying to maximize profit.
Chris (16:27):
Maximize shareholder value, line go up.
Emile Torres (16:30):
Yeah. Their only obligation is to their shareholders, not to future generations or, you know, whatever. So part of the claim then is that the AI companies are a weird anomaly. They're not like other companies. Without a doubt, there are larger companies that are pouring money into these AI companies, like Microsoft and Google, Amazon and so on. And why are they doing that? Because they expect to make billions in profits. But the founding of these AI companies in the first place really didn't have that much to do with profit. It had to do with this techno utopian vision that lies at the very heart of the TESCREAL bundle and has been given slightly different interpretations by some of these different ideologies. Singularitarianism emphasizes the singularity a bit more than extropianism or cosmism.
(17:27):
But really, there's just tremendous overlap in terms of the communities that have coalesced around each of these ideologies. In terms of the value commitments, the intellectual tendencies, and the visions of the future, these ideologies overlap significantly. And then historically, they just grew out of each other. Extropianism was the first organized modern transhumanist movement. It gave birth to singularitarianism and cosmism. And then, as I mentioned before, it was out of this early TESCREAL movement that you get rationalism, effective altruism, then ultimately longtermism. So the work that this concept of TESCREALism is doing is trying to provide an explanatory framework for understanding these phenomena. And, you know, if you ask philosophers, they'll say that one sort of theory about what makes an explanation in science, for example, a good explanation is its capacity to unify a bunch of disparate phenomena, right?
(18:34):
So, like, evolutionary theory does this. It's like you've got all this evidence from the fossil record, you've got biogeographical data, you know, some species over here. And evolutionary theory just brings it all together, explains why you see what you see. So I think that one of the virtues of the TESCREAL framework or explanation is that it does precisely this. You might think, well, you know, like, what exactly does longtermism have to do with transhumanism or cosmism? The TESCREAL framework aims to explain that and thereby bring together these various phenomena into one single coherent framework.
Chris (19:09):
Yeah, if it catches on when I use it, just talking with my co host, it definitely reminds me that there are multiple ideologies that we're talking about. It reminds me that they're tied together, and it reminds me that there's a history, like you said, of one growing out of the other. So it kind of puts that by saying that the one word to encompass all of it. It puts that all at the forefront. So that makes sense.
Emile Torres (19:37):
Absolutely. That is one of the aims with this concept, to foreground these particular ideologies, to make them explicit, to name them in a way that they weren't before. So, for example, there's a cultural critic named Douglas Rushkoff, and he published this fantastic book, maybe in 2022, I can't remember, but quite recently, called Survival of the Richest.
Chris (20:02):
Oh, I've heard of that book.
Emile Torres (20:03):
Yeah, yeah, it's really good. So he follows a bunch of, like, tech billionaires around and asks them questions about their views of the likelihood that civilization will collapse. And they. All right, yeah.
Chris (20:16):
This is the one where he talks to everyone about their, like, mega bunkers in New Zealand.
Emile Torres (20:22):
Yes.
Chris (20:22):
Which is so weird, by the way. Sorry. The whole, like, oh, man, the world's going to end, and we can't. We, the most powerful people on earth, can't do anything about it. So let's. Let's have mega bunkers in New Zealand. Oh, my God. Sorry. Yeah, go ahead.
Emile Torres (20:37):
No, I highly recommend it. It is very amusing. And he is extremely critical. And, you know, I think he had extraordinary access to all of these really powerful, super richest individuals. And he completely burned all those bridges, I think by intention.
Chris (20:53):
Right? Yeah.
Emile Torres (20:54):
Basically kind of mocking them because they just, you know, like, hard not to. Right? Yeah, yeah. He has some questions, like, well, you know. Okay, so you could hire these security guards to keep your bunker.
Chris (21:05):
Oh, he was the security guard guy. Yep, I remember that. Please tell a story. I love that story.
Emile Torres (21:10):
Yeah. I don't remember the exact details, which are juicy and very amusing, but it was something like, well, when civilization collapses and money is worthless, how exactly are you going to prevent them from just killing you? And he has stories of these billionaires going like, oh, shit, I hadn't really thought about that.
Chris (21:30):
Right. Yeah, well, you can't, sir. Have you tried being friends with them?
Emile Torres (21:40):
So, ultimately, the reason I mentioned this is a lot of the billionaires that he talks to have this vision of the future. Like, okay, maybe civilization could collapse, but we're going to survive the apocalypse. And ultimately, their vision of the future is that they will digitize their brains, live in, you know, live on the cloud in some computer simulation, and that is the future. And so Rushkoff calls this The Mindset, with a capital T and a capital M. And basically it's more or less synonymous with TESCREAL. And so I think one of the virtues of TESCREALism, and I suspect, I don't want to speak for Rushkoff, but I suspect he would agree with this. One of the virtues is that it foregrounds, like I said, it foregrounds these ideologies. What is The Mindset?
(22:27):
The Mindset is the vision built on transhumanism and a kind of cosmist or longtermist view of the longer term future of humanity whereby we re-engineer ourselves, we upload our minds to computers, we spread into space and so on. So, yeah, I think that's partly why, insofar as the concept is valuable, I think that's one reason.
Chris (22:51):
So does that explain it all, Kayla? Pretty much. Do you understand all of the. Everything now?
Kayla (22:55):
All of the everything. Everything everywhere all at once.
Chris (22:58):
All of it, yeah.
Kayla (22:59):
Thanks for listening to Cult or Just Weird.
Chris (23:01):
We did it. We explained everything.
Kayla (23:03):
I think that was the appetizer course.
Chris (23:08):
Yeah.
Kayla (23:08):
So there's two more.
Chris (23:09):
Yeah. Two more episodes coming.
Kayla (23:11):
An entree and then a dessert. Is that what's coming?
Chris (23:15):
As long as there's not a requirement.
Kayla (23:16):
Or is it more like tapas?
Chris (23:17):
I think it's more like tapas. Cause I'm just concerned that the final episode may not be sweet, it may be more savory.
Kayla (23:23):
So this is the dim sum or sour episode runs. Okay. Got it.
Chris (23:27):
Well, cuz, you know, we used to do these episodes where it was like, here's a two hour long interview, and then it was like us talking for 2 hours. So we're trying to break it up into smaller chunks.
Kayla (23:36):
Well, I took. I took a lot of notes while we were listening because there's a lot of. There's a lot of things that.
Chris (23:40):
You took notes.
Kayla (23:41):
Of course you did too. Don't hate.
Chris (23:42):
Yeah, it's horrible.
Kayla (23:43):
You took more than me.
Chris (23:44):
I took more notes than I ever did in school.
Kayla (23:46):
Oh, my God. Like, combined, there were a lot of things that the two of you talked about that I just wanted to be like, ooh. So I'll just mention some of those things. I don't necessarily have a lot to add. I have a lot to talk about when it comes to survival of the richest. But, you know, we'll get to that.
Chris (23:59):
A lot of takes.
Kayla (24:00):
A lot of takes to give. I just think it's interesting how we've drawn the parallel before in this season, but how much of a parallel there is between millenary. I can't ever say the word millenarian.
Chris (24:12):
Millenarian.
Kayla (24:13):
Millenarian religious beliefs.
Chris (24:15):
Millie Brownian.
Kayla (24:16):
Thank you. Millenarian religious beliefs and transhumanist, singularity, longtermist future beliefs.
Chris (24:24):
Yeah. And like, we've seen it in other areas, too, and in other groups we've covered on the show to the point where I'm just kind of like. I think, like, millenarianism is just something that. It's like an archetype, you know?
Kayla (24:36):
I think there's some sort of fundamental human thing. Exactly like furries, like how we decided there was something fundamentally human about furry fandom. I think furry fandom and millenarian beliefs, like those are really two things that.
Chris (24:50):
Define you as a human being. I agree. You know, we mentioned the left behind books and how that kind of, like, planted some of those seeds in eschatology. I just want to mention a couple things. First of all, there was an rts made by the left behind guys.
Kayla (25:04):
That's a type of video game.
Chris (25:05):
Yeah. I think it was basically just Command & Conquer. But then, like, you could also use God powers or something.
Kayla (25:10):
Did we say what Left Behind is? The conceit of Left Behind is that it's post. It's the world post-rapture.
Chris (25:15):
Yeah, yeah.
Kayla (25:15):
So the rapture happens. Those that have been left behind.
Chris (25:18):
Right. And there's a million books. I tried, because it was just, like, ridiculously popular. I tried reading the first one and I just couldn't do it. Was it dog shit? I don't. Like, if anybody's out there listening, it was not for me. If you like the series, I'm not trying to shit on you or anything, but it just. I could not. I didn't even get past the first chapter.
Kayla (25:36):
I think kind of staying on this, like, talking about the religious aspect and the faith aspect. The two of you talked about prophets in TESCREAL and whether or not prophets exist in TESCREAL. And you mentioned Ray Kurzweil and just. I think we could name a lot of people. I think that Max More could be granted prophet status. I think that would definitely be exonyms. I doubt that folks within the community would be interested in having those labels applied to themselves.
Chris (26:06):
Depends on how tongue in cheek they want to be about themselves.
Kayla (26:09):
There's a future episode coming up after these interviews where they use that term. I don't want to spoil too much of the episode, but literally, futurists, longtermists, singularitarians that use the term prophet to describe the futurist thinkers of the past.
Chris (26:26):
Yeah, I mean, it can be pejorative, but at the same time, futurist and prophet are both just. They're two people that are trying to make predictions about the future. They just get their information from different sources.
Kayla (26:38):
Allegedly. In one reality, Roko of Roko's Basilisk is simply a prophet. It's simply Roko's prophet.
Chris (26:44):
He's a prophet.
Kayla (26:44):
Simply Basilisk's prophet.
Chris (26:46):
Right. He's more of, like, a herald of Roko's Basilisk. He's the basilisk's herald. It's like Galactus.
Kayla (26:53):
H a r o l d. Harold.
Chris (26:56):
Harold. Yeah. Harold. Emile also mentioned how their first chink in the armor of religion was theodicy. It was the problem of evil. And I just thought that was like, oh, yeah, that's everyone's, right? Like, isn't that everyone?
Kayla (27:10):
This is, again, why I say I want to have, like. And I've done no work to make this happen, but I say, like, oh, I want to have, like, a religious academic or a religious scholar or, like, a priest. Like, somebody who's very learned in the, like, academics of faith, and not in a way that Emile Torres is, but I mean, like, somebody who, like, is within the church. To fucking talk to me about that, because, come on. I need to. If this is such a. If this is the turning point for so many people, there has to be a lot of philosophical thought about it.
Chris (27:46):
If this were a business, there would be a lot of research done on. This is what's causing people to unsub from our app.
Kayla (27:53):
Yeah.
Chris (27:54):
Yeah. Overall, I think, you know, the. Obviously, the takeaway from. From this episode and really, the reason that I had Emile on the show, the reason that I talked to Emile was because of this whole TESCREAL thing, because the ideologies that you and I have been sort of, like, rabbit holing down are so. There's, like, so many of them, and they're, like, all same but all different.
Kayla (28:16):
Right.
Chris (28:16):
And so I was, like. I was getting overwhelmed. And I just, like, the. The framework of TESCREAL helps me think about this stuff easier. And, like they were saying, it forefronts the. It reminds you that when you're talking about one thing, you're also kind of talking about the other. Right. Like, it reminds you that when you're talking about so and so, you know, Elon Musk and his SpaceX cosmism, you know, he's also got this, like, effective altruism slash singularitarian thing, too. So. And also that bit about, quote, unquote, The Mindset. I think that's. And we'll get into more of that in future episodes. But just how influential this stuff is, is the other big, like, why I wanted to talk to Doctor Torres, right.
(29:03):
Because if this was just, like, four people in Nebraska that were just like, we have a club to talk about robots. Who cares? But this is the most powerful people in the world that are into the mindset that he's talking about is the Silicon Valley ruling class.
Kayla (29:18):
And I think that's maybe the thing that is the fundamental human component. The common ground between that Silicon Valley mindset that begets TESCREAL and millenarian religious beliefs is, I think, that something that sets us apart as human beings from other creatures, other animals, is that we're not just thinking about our own mortality. We're cursed with thinking about and considering the extinction of our species, which is kind of inevitable, like, in the grand scheme of things.
Chris (29:54):
No, it's not kind of inevitable.
Kayla (29:57):
I'm softening the blow. It's an inevitability, y'all.
Chris (30:00):
Everyone is gonna eventually die.
Kayla (30:01):
Everyone, including the entire species, will be no more. And so I think that maybe that's the fundamental human condition at the core here. And, like, of course, that's terrifying. And of. I don't really blame anybody for trying to come up with more, quote, unquote, science minded approaches to that feeling, that fear, doing something about it.
Chris (30:26):
Would you say they are using those approaches to manage their terror?
Kayla (30:30):
I totally forgot about terror management theory, which is underpinning.
Chris (30:33):
There's only a few episodes ago.
Kayla (30:35):
I know. I got a lot of my mind. Yeah. I think that terror management. And I don't ever want to. I don't ever want terror management to take away from why somebody is doing something.
Chris (30:49):
Sure.
Kayla (30:51):
But I think it's very present here.
Chris (30:53):
Right. I agree. I think that it's not a coincidence that we wound up here talking about this stuff. Right. We're like, hey, let's do a season where we kind of talk about death and overcoming death and, oh, cryonics. It's really quite a natural progression to where we are when that's what you start with.
Kayla (31:13):
Right. I mean, look, two of the folks we've talked to this season already were brought up in Baptist households, right?
Chris (31:18):
Yeah.
Kayla (31:19):
Specifically Baptist households.
Chris (31:22):
So Emile mentioned. I think they mentioned all of the different names of the things in the acronym.
Kayla (31:28):
Yes.
Chris (31:29):
I'll just go over it again one more time just for, like, a.
Kayla (31:33):
Please.
Chris (31:33):
It's good to repeat stuff. Right? All right, so TESCREAL. So, luckily, it starts with transhumanism, because that's what we started with.
Kayla (31:40):
Right.
Chris (31:41):
And then the E is extropianism. We mentioned that on the show a few episodes ago, but you were kind of actually just talking about this. You were talking about, like, the end of the human lifespan and also species extinction. And extropians are kind of like, we're anti that. It was coined to sort of be like an opposite to entropy, right? If entropy is going to kill everyone eventually, then can we reverse that? Can we prevent it? And also an alternative to utopia, because of all the negative things that utopianism carries with it. So that's the E in TESCREAL. The S is singularitarianism. The singularity. Singularitarians, people that are into the techno rapture. That's basically what that is. That's like, we're all going to upload our minds and there's going to be a super AI that enables us to do cool stuff like that.
(32:37):
C is cosmism.
Kayla (32:39):
That's one I don't quite understand.
Chris (32:40):
Yeah, cosmism is, at least in my estimation, it's kind of like the black sheep of the acronym. Not that it doesn't belong, but it's kind of harder to fit that puzzle piece in. But don't you worry, Kayla. We are going to do that. We're going to jam that puzzle piece in there for everyone. We'll get to that. But anyway, just think of it as like, our destiny is to colonize space and then, like, apply other parts of TESCREAL to it. So our destiny is to colonize space and, you know, with robots.
Kayla (33:12):
Cool.
Chris (33:13):
The R is rationalism. We just filled the airwaves with three episodes' worth of rationalism when we talked about LessWrong and the rationalist diaspora. The E and the A are both effective altruism. We'll get to that as well. And the L is longtermism. Longtermism is super weird, you guys. I think we'll probably get to that too in a future episode. But I just wanted to say all the words in the acronym so that we can kind of level set. So are you ready to go live in a mega bunker in New Zealand then?
Kayla (33:45):
No, because I don't think we should even talk about it, because we're trying to keep these episodes digestible. And I could talk about that for a long time. And also, like, where are my stories about that kind of dystopian setting? I want, like, a Sci-Fi book about.
Chris (34:02):
What happens, like, when the billionaires actually are the only ones.
Kayla (34:05):
The billionaires are in the bunkers and their people turn on them. Yeah, that would be great. Where is it? Somebody who writes books, do it.
Chris (34:13):
If only we had a tv writer that could write a television show about this thing.
Kayla (34:17):
I got too many projects already, none of them paid. Somebody hire me. I think that it's one of the. We were talking a little bit as we were supposed to be listening to the interview and waiting to talk on the air, but we didn't wait. Cause we're really bad at it. Of just how, like, I don't even know what the correct word is, just how impressive it is that the billionaires of the world, the most powerful people on planet Earth, which could mean the most powerful people in the universe, the most powerful in the solar system, all.
Chris (34:54):
Those people in the solar system cannot.
Kayla (34:57):
Solve the problems of climate change, of disaster, of, like, oh, no, human extinction is upon us. Cannot solve that problem in any way beyond. I gotta get up in a bunker and, like, figure out how to, like, not be killed by my own security. Like, that's impressive.
Chris (35:17):
I remember reading that. So I think that the book Survival of the Richest was actually based on an article. And I want to say Slate or something.
Kayla (35:26):
I feel like you and I read this article.
Chris (35:28):
I think everyone read this article because I remember talking about it. It was like, water cooler topic for a hot minute there. But, yeah, it's crazy. This guy went and talked to the world's richest people, the world's most powerful people, and they were like, we need your advice, man. Like, how do we keep our security guards from turning on us when money isn't a thing, when the current money system goes away? And it was kind of like, you can't. What are you talking about?
Kayla (35:54):
You could try and stop that from happening with your extreme wealth.
Chris (36:00):
Yeah. And this also goes back to what you and I were discussing while we were listening to the interview, as I'm.
Kayla (36:06):
Lecturing all the billionaires listening to our show.
Chris (36:08):
Elon, listen here. If you just give this, if you donate to culture, just weird. You can avert this whole problem.
Kayla (36:14):
There we go.
Chris (36:15):
So we were talking about how it's interesting that they feel sort of almost trapped by the same system that we feel trapped by. And I think that the problem is just that it's one of those passing through the eye of the needle things, where the only way to do it is to drop the thing that is precious to you. So, like, for these elite billionaires, right? It's like, hey, how do we prevent the world from collapsing? Is there a way? I'll do anything, right? And then the answer is like, yeah, share. And they're like, oh, is that the only way? Yeah, I'm afraid so. And the consequences. What? You said it was extinction. Oh, that's a tough call. I don't know, man. I don't think I can do that.
Kayla (37:05):
I actually feel like that weirdly does and doesn't feel TESCREAL-aligned. Like, how is that aligned with extropianism? Like, how is that aligned with the tech utopia ideals? It's not. But I'm not surprised that mentality is part of this mindset. But it doesn't really align with what is preached by, what I seem to understand as being preached by, the greater community here.
Chris (37:35):
Yeah, I agree. This is going to be a recurrent theme. And this is one thing that we did talk about in the Humanity Plus episode, was that there does seem to be at least one large fracture within the sort of TESCREAL type community. And also, by the way, Doctor Torres themselves say this. Not everyone who is a transhumanist, for example, or not everyone who is a cosmist would they classify as a TESCREAList. So it's not like a one size fits all. And they're very clear about that. I've seen them do other talks where they're very clear about how diverse the TESCREAL, transhumanist, blah. I won't say all the words. How diverse that community is.
(38:18):
And I think that's something we're going to keep coming back to, because it does feel like there is at least one major fracture between the sort of like, libertarian Peter Thiel Elon Musk, like, ultra wealthy set.
Kayla (38:30):
Right?
Chris (38:31):
And then people who were like, I think it would be cool if there were robots and I could upload my mind to San Junipero. That'd be neat.
Kayla (38:40):
Oh, San Junipero. I forgot about that. That was nice.
Chris (38:44):
That's a reference for.
Kayla (38:45):
That's the one nice episode of Black.
Chris (38:48):
Mirror, where people uploaded their minds to, like, a. It's like a sick utopia. Anyway, just to answer your question, I think that there is kind of a fracture there, right? And I think that's a theme that is just a recurrent theme with this whole topic. Next time on Cult or Just Weird: how are all these poor billionaires gonna save themselves when the shit hits the fan, the screen.
Kayla (39:15):
I like it. I didn't know if I'm gonna say anything else. The way for them to do that is to, like, share and subscribe.
Chris (39:22):
That's how you save the world.
Kayla (39:24):
If you want to save the world, come hang out on our discord. The link is in the show notes. If you want to save the world even harder, you can go to patreon.com/cultorjustweird, get access to bonus content and behind the scenes stuff. And, you know, just keep listening and we'll see you next time.
Chris (39:38):
Kayla, please, okay? We're not trying to save the world. We're trying to save the billionaires.
Kayla (39:43):
We're trying to save ourselves.
Chris (39:44):
When the billionaires are what's important, everyone else can die.
Kayla (39:49):
That's what they think. And that's why I hate them.
Chris (39:51):
This is Chris, this is Kayla, and this has been Cult or Just TESCREAL?
Kayla (40:14):
Subscribe.
Emile Torres (40:19):
Our channel.