
May 28, 2024 41 mins

Wanna chat about the episode? Or just hang out?

Come join us on Discord!

 

---

Have you not read The Sequences?

 

Chris & Kayla do their first ever CoJW topic revisit and discuss an online community dedicated to Rationalism.

 

---

*Search Categories*

Anthropological; Science / Pseudoscience; Common interest / Fandom; Internet Culture

 

---

*Topic Spoiler*

LessWrong

 

---

*Further Reading*

https://www.lesswrong.com/

https://www.reddit.com/r/LessWrong/

https://rationalwiki.org/wiki/LessWrong

https://en.wikipedia.org/wiki/LessWrong

The Sequences

just some Harry Potter fanfic

https://www.lesswrong.com/posts/DyG8tzmj3NnGRE8Gt/explaining-the-rationalist-movement-to-the-uninitiated

https://www.lesswrong.com/posts/LgavAYtzFQZKg95WC/extreme-rationality-it-s-not-that-great (the community is remarkably self-aware)

Yudkowsky on cults!

Yudkowsky on cults again!

the podcast eats its tail (CoJW's previous LessWrong episode)

https://rationalwiki.org/wiki/Roko's_basilisk

https://news.ycombinator.com/item?id=8060440

https://www.reddit.com/r/askphilosophy/comments/19cmro3/should_i_read_lesswrong_wiki_or_not/

https://www.reddit.com/r/askphilosophy/comments/2ddfs6/comment/cjp22pf/

Slate Star Codex (LessWrong 2.0)

Something Awful has a bitch-thread about LessWrong

https://www.reddit.com/r/SneerClub/ (LessWrong haters' group)

https://rationalwiki.org/wiki/Bayesian

https://en.wikipedia.org/wiki/Functional_Decision_Theory

https://en.wikipedia.org/wiki/Newcomb%27s_paradox

Time platforms Yudkowsky's doomerism

Receipts for how bad Yudkowsky's PR is


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The word rational sends up little signals for me these days. And so I just wanted to. I just wanted to say that I.

Speaker 2 (00:05):
Think this whole site and even all of this research in these episodes are kind of like, riding that razor's edge of, like, the more rational you think you are, the more of a blind spot you have. But also, it's good to try to cultivate your rationalities. So, like, there's. I think that's part of why I've felt so, like, bizarrely torn is because this is just a tearing sort of thing that they're talking about. Right? Like, if you're trying to improve your own thinking, there is that feedback loop of, like, I'm thinking about thinking now I'm thinking about thinking about. And then you also have the. The paradox of, like, the more rational I become, the more rational I think I become, the more irrational I might actually be because of my overconfidence.

Speaker 1 (01:04):
I have to tell you, as your wife and as the person who's been married to you for ten years, you just breathe loud out of your nose. Sometimes at night, I have to force myself to not freak out at how loud you're breathing.

Speaker 2 (01:17):
Well, I'm in my CPAP, so.

Speaker 1 (01:19):
No, no. Before that. Like, when you're just, like, playing your games or whatever.

Speaker 2 (01:21):
Oh, really?

Speaker 1 (01:22):
Sometimes when you're playing a game, you breathe really intense.

Speaker 2 (01:27):
That's probably when the game gets intense, if I had to guess. Oh, boy. Just podcast. Okay, so is this. But I've been doing this that we've been recording for so long. You have no idea how. How you said it forever. I've been doing. I've just always.

Speaker 1 (01:43):
You're just a breather.

Speaker 2 (01:44):
Loud.

Speaker 1 (01:44):
You just breathe loud through your nose. I think it's because you have a deviated septum that the doctors refuse to diagnose.

Speaker 2 (01:49):
Right. And then they make fun of me.

Speaker 1 (01:51):
Yeah.

Speaker 2 (01:52):
Well, aside from that, I am also a game designer and data scientist.

Speaker 1 (01:58):
What's your name?

Speaker 2 (01:58):
I guess I shouldn't be a podcaster. No, if I breathe loudly. But my name is Chris.

Speaker 1 (02:04):
My name's Kayla.

Speaker 2 (02:05):
Hello.

Speaker 1 (02:08):
Shrug.

Speaker 2 (02:09):
Okay.

Speaker 1 (02:10):
I'm a TV writer, and I should be doing a podcast. And that's why you're here listening to us talking on Cult or Just Weird. Welcome back to the show. We're so happy to have you. If you want to support the show, you can find us on patreon.com, Cult or Just Weird. And if you want to chat with us and other fans about the show and whatever weird cults you have in your life, you can find us on Discord; that's linked in the show notes.

Speaker 2 (02:34):
Yeah, we talk about the episodes. We post funny memes or memes that I think are funny, and everybody else laughs obligingly. It's a good time. You should definitely check it out. It's in our show notes. And with that, let's get going. So I did want to ask you, though, how's your living forever project going? Is it going good? Are you at least managing your terror?

Speaker 1 (02:58):
Oh, no, I'm done not managing my. Remember, I gave up on living forever because we had that conversation of how disorienting it would be to, I guess, live forever via cryonics, not through transhumanism.

Speaker 2 (03:09):
Oh, so what was enough to torpedo the whole cryonics thing for you is the oops, I wake up in the future?

Speaker 1 (03:14):
Yeah, I think so.

Speaker 2 (03:15):
No way. Really?

Speaker 1 (03:17):
I think it'd be. I think it'd be too traumatic.

Speaker 2 (03:19):
It's fine. You've had surgery. Yeah, it's like that. It's like you go to sleep, and then you wake up and no time has passed.

Speaker 1 (03:26):
Do you remember that one time I had surgery, and I was in the recovery room, and then there was a person next to me was also coming out of surgery, and they were just sobbing uncontrollably. Not because of, like, medical news they were getting or, like, the alcohol, just simply because the act of coming out of anesthesia was so, like, traumatic for them and, like, it can make you weirdly emotional.

Speaker 2 (03:47):
Really? Yeah.

Speaker 1 (03:48):
So if that's happening, that's temporary, though.

Speaker 2 (03:50):
Okay, look, that's not what this topic is about. I'm sorry that you. You're not gonna, you know, join us in the frozen future, Kayla. And I'm sorry that you're not managing your terror as much. So, yeah, two weeks ago now, we started tugging at the loose thread of a particular movement that is managing its own existential terror in a frankly straightforward way, that is trying not to die. The movement known as transhumanism is based around trying to expand human limits beyond what our accident of birth and biology sticks us with. Especially that one really sticky bit where we die at the end.

Speaker 1 (04:27):
I hate that bit.

Speaker 2 (04:28):
It's the worst bit. We also briefly mentioned that if you are particularly interested in not dying, you might also call yourself an extropian. Especially if you see future human progress as an ever-receding horizon and not a utopian goal. And especially if the thought of immortality makes you consider long-term, entropy-type challenges. After all, if the universe isn't immortal, then it doesn't matter how long you live. You aren't either.

Speaker 1 (04:52):
Do you think people really want to live literally forever?

Speaker 2 (04:56):
Some people think about it like, some people think about that problem.

Speaker 1 (04:59):
But do you think anybody can come to the conclusion, if you think about, ooh, if we suddenly had the chance to become functionally immortal? No. Truly immortal? Does anybody come to the conclusion of anything other than spoilers for The Good Place, the ending of The Good Place?

Speaker 2 (05:15):
I don't see how you can.

Speaker 1 (05:16):
And so what we're talking about here is that if humans were able to become immortal, would we not eventually choose some sort of end of life, some sort of suicide, some sort of finality, potentially maybe hundreds of thousands or millions or billions or trillions of years in the future? I just wonder if there's anybody who would be like, no, I want to live. I've thought about it. I want to live forever, ever.

Speaker 2 (05:44):
I cannot speak on behalf of most extropians or transhumans. I imagine there's probably some folks that do feel that way, or at least think they feel that way. My sense is that, by and large, when people say immortality or live forever, they're just thinking of, like, they get to choose when to die, or they're overcoming senescence and old age. Not like, yes, I would like to see what the infinity universe looks like infinity years from now, because that's too much for one brain to handle. But I don't know. There might be some people that are like that. Either way, I'm just sort of, like, mentioning extropians because it was a couple weeks ago that we talked about it on the show, and we only talked about it briefly.

(06:36):
So I wanted to rehash: transhumanism, which is the idea that we want to improve our biology, improve and take control of our accident of biology, and then extropians are kind of like that, except they're more about the, like, you know, the immortality bit. And I think in general, like, a lot of those folks will think about things like, well, you know, the sun's gonna blow us up and, you know, forget forever. Right? The sun's gonna blow us up in 30 million years. What are we gonna do about that? Got that number wrong. Thanks. The sun's pretty good, but eventually it's gonna swallow us. So in the last episode, we also talked about an organization called Humanity Plus, which seems to, like, overall, like, their ideas are pretty cool, but it also doesn't seem like much is actually, you know, happening.

(07:24):
Like, it's a bunch of nerds. And I say this, I say nerds as a nerd with the utmost goodwill and pot calling kettle. But it does just kind of. It seems like a bunch of nerds doing classic, like, Internet circle jerk. Like, lots of talking about the cool thing, some advocacy. But it was kind of hard to tell if anyone was, like, working on the cool thing. It was more like a fan club. I don't know. But if you like Internet circle jerks.

Speaker 1 (07:48):
Oh, God, that's my favorite thing, then.

Speaker 2 (07:50):
You have come to the right podcast.

Speaker 1 (07:52):
Thank God.

Speaker 2 (07:54):
In fact, let's come full circle from a previous episode of this very show, Cult or Just Weird, because today we're doing something that we actually haven't done before, ever. We are doing a full topic revisit. Today we are talking about the former home of Eliezer Yudkowsky, Roko, and his basilisk: lesswrong.com.

Speaker 1 (08:19):
What do you mean by former?

Speaker 2 (08:21):
Well, we'll get to that. But he hasn't hung out there for quite a while. Why that is, is a big ol'. We'll get to that.

Speaker 1 (08:29):
I wanna know.

Speaker 2 (08:30):
Yeah, it does involve one of your favorite words, diaspora. But that's just a little pinch, little spice for what we're getting to in the future. Anyway, I know that everyone listening has, like, a perfect eidetic memory of every recorded second of every episode of Cult or Just Weird. I know that's all just in your brains, but just in case there's, like, one or two of you that don't, I'll go ahead and recap what less wrong is. Now, of course, if you want to listen to the episode on less wrong and Roko's basilisk, you can go check it out. It's the fourth episode of our second season, and it's called The Information Hazard. It's slightly dated because.

Speaker 1 (09:09):
It's slightly dated because at the time, were like, oh, my God. Oh my God, this is so scary. And now I'm like, oh, my God. I could not care less.

Speaker 2 (09:17):
Yes, it's dated mostly because I was.

Speaker 1 (09:20):
So scared of Roko's basilisk.

Speaker 2 (09:22):
Yeah.

Speaker 1 (09:23):
I could not give less of a shit now.

Speaker 2 (09:28):
Yeah, I was, too. And now I don't care if I get tortured by a future superintelligence, it's already happening. Which, if you're not following, by the way, first of all, good for you. That means you're not too online. But the future superintelligence torture reference is sort of like. So Roko's basilisk is like a what-if thought experiment posited by a user named Roko on the lesswrong.com forum back in 2010. I'll talk about it a little bit more, a little bit later. But that's kind of what our other episode was about, where we really go more into depth on it. But in terms of this episode, I'm just mentioning it because it was birthed on lesswrong.com. But aside from a .com, aside from a website, what is less wrong?

(10:12):
Well, there's the literal definition part, which will only take a minute, and then after that we kind of need to talk because like, I'm just having a really tough time with.

Speaker 1 (10:24):
You've been having a really hard time these last couple weeks.

Speaker 2 (10:26):
I'm having a tough time with like, how to feel about it.

Speaker 1 (10:29):
Yeah, I'll help you. Facts are picky.

Speaker 2 (10:31):
Please tell me how to feel about it. The literal definition is easy, though, so less wrong is a website. It offers a community forum space very similar to Reddit or like other community forums. It has posts and threads of comments and a humongous backlog of tagged threads for your incredulous browsing pleasure. But it was first created, though, as a repository for the musings of one Eliezer Yudkowsky by, let me check my notes here, oh yeah, Eliezer Yudkowsky. Yeah, he created it himself for his own musings and thoughts.

Speaker 1 (11:04):
Is he Roko and his less wrong?

Speaker 2 (11:06):
The basilisk. Definitely. Back in 2009, there was a blog site known as Overcoming Bias, and Mister Yudkowsky had made a bunch of blog posts there about the site's theme, overcoming human cognitive biases. He then scooped up these posts and went and created his own forum for people to read and comment on them, and perhaps make their own posts for others to read and comment on. This site was lesswrong.com. Now, in case you've been wondering where the forum's title comes from, this is actually one of the things that I do really like. Remember, I'm confused about this, right? This is one of the things I like about the community, actually, is they do seem pretty committed to self improvement and combating human cognitive bias. And one of the biggest biases you can have is to, like, think that you're smart, right?

(11:53):
And paradoxically, the more you work on refining your own thought and combating your own biases, the smarter you will think you are. And to some degree you're right, because you are putting work into it. But that also carries with it that, like, I'm too smart to be tricked by XYZ blind spot. So that's why they call their site less wrong. It's an acknowledgement that they're all biased and wrong, but it is possible and desirable to lessen that incrementally if you work at it. And I honestly, I think that's, like, really smart and capital g, good insight. And also, honestly, like, it's something I think about all the time working on this podcast. Like, we're always talking about biases and pitfalls and thinking on cult or just weird.

(12:35):
We talk about, like, apophenia, conspiracy-style thinking, closed logical loops, motivated reasoning, so on and so forth. When we're doing this podcast, I'm constantly worried about, like, does researching and talking about this stuff make me smarter? And then does that previous sentence make me more blind? So, like, honestly, you can really twist your brain into knots when you're thinking about thinking, because there's a feedback loop here, and the feedback loop is in your own brain. Meet thinking about thinking. And if you spend any time on less wrong, which I have spent a lot of time there recently for this episode, you'll see that, like, brain twisting induced by thinking about thinking is actually kind of common there. Yudkowsky himself even writes in several places that the business of helping each other's brains think is a pretty dangerous business.

(13:22):
Kayla, what do you think about the bias bit there? Like, is that something. How do you feel about that sort of bias, that, like, that blind spot that we have doing this show? Do you think that you're infallible?

Speaker 1 (13:34):
This is what I was gonna say. Sorry.

Speaker 2 (13:35):
You have a gigantic blind spot because of how highly you think of yourself.

Speaker 1 (13:39):
The reason why I'm laughing is Chris is asking me this question as he asked me this question, just a slow, like, Grinch style smile is spreading across my face, and it's because this is my biggest bias. And I know it sounds very egotistical of me, but, yeah, I'm like, oh, my God, I'm so smart. I, like, my information is the best information, and I'm so objective, and blah, blah. And, like, good lord, do you have to keep that in check? And honestly, in some strange ways, like, still being on things like social media, which is really bad for this, because the bubble is real, the echo chamber is real, being served propaganda and biased content that serves to support your biases.

Speaker 2 (14:24):
And if you ever see anything outside your bubble, guaranteed gonna be, like, the dumbest thing from the other side. So you're gonna be like, look at those morons.

Speaker 1 (14:35):
But lately, it's been very helpful for me to see to be able to have real time evidence of my biases. Because being on TikTok, it's so easy to scroll and you see something and you're like, what? Oh, that's crazy. And then if you do take one moment to go to recognize, like, I'm having a reaction, I'm having a click triggered response. Triggered response. That means I should go look up this thing. That means I should go figure out what exactly I'm being served and why. And then nine times out of ten, you'll figure out it's something that's totally out of context or fucking fake, and that's enraging. And also, it's a way to have that real time feedback of, look how not immune you are. No one is immune to propaganda. No one is immune to misinformation.

(15:27):
So, like, I've been very humbled by social media recently by taking that step of like, oh, maybe I should. Maybe I should look this up, like, having. And I don't know how I got there, and I don't know that I'm always there because I definitely still see things and go like, oh, my gosh. But, yeah, I'm really susceptible to this stuff.

Speaker 2 (15:50):
Yeah, combating your own biases is actually tremendously difficult. And if you don't go, and this is part of, again, why I like the site's title, is because if you don't go into it knowing that it's essentially impossible, which they are well cognizant of on less wrong, then that's its own bias. You're never going to get there. You can only become incrementally better. And I think that's the only way to properly think about it. It's like, well, I've learned a lot of stuff doing this show, right? And if I ever feel like I've arrived at the perfect knowledge, then that's definitely a bias, right? Like, that's definitely going to blind me. We also talk, you and I talk a lot about this in terms of, like, falling, quote unquote, falling for cult stuff, right?

(16:34):
Like, the minute you think that you're immune to cult stuff is the minute that you are susceptible. So I think that falls neatly into that sort of bias category, too.

Speaker 1 (16:42):
All of this said, and granted, I know a little bit more about less wrong than the last time we talked about it. And just. I probably should say this later, but I'll say it right now, like, I definitely dabbled in. I looked at a lot of less wrong when I was doing research for cryonics because, obviously, cryonics comes up on the website, and they have a lot of really great information on cryonics. Like, kind of the gold standard for, okay, how do I actually, like, sign up for this and, like, get the insurance and blah, blah, is there's a repository of, like, step-by-step information on less wrong. Like, it's a great resource.

Speaker 2 (17:09):
So you went to less. Okay, so interesting. The fact that you've been there is interesting. Interesting. How did you feel about your research there?

Speaker 1 (17:16):
I mean, I was literally. I don't want to say too much. I don't want to say too much. We'll get to it.

Speaker 2 (17:21):
You scraped the surface?

Speaker 1 (17:22):
Just scraped the surface. Some of the things I saw there, I am enraged by. I have rage voice even. We'll talk about it later. I will say that less wrong gives off a lot of yellow and red flags to me. And there's something so sinister in life where it's like, this should be a really great place, and I can't trust it.

Speaker 2 (17:48):
Yeah.

Speaker 1 (17:48):
Like, this should. And I feel like that's like, okay.

Speaker 2 (17:50):
So you kind of know how I felt the last.

Speaker 1 (17:53):
And it feels like I keep running into that, of, like, oh, this should be the safe space. And then it's. And then it's not. And then you find out that there's, like, I don't know. I don't. I don't know how to combat that. I don't know what to do about it. I don't know if that's just, like, reality, but there's something that I just. You just can't. You can never rest on your laurels of, like, a group of people talking about things on the Internet. You have to always have, like, I don't know. You can never. You can never just go, like, I'm here. I've arrived.

Speaker 2 (18:21):
Yeah. I mean, with. This is the place with the thing you're researching, too.

Speaker 1 (18:24):
Yeah.

Speaker 2 (18:24):
Yeah. Not just your own brain. When you said, all that being said, I thought the next thing you were gonna say was, I actually am smarter than everyone else.

Speaker 1 (18:32):
No, I'm nothing.

Speaker 2 (18:35):
Okay, well, if you thought that way, then you would probably be Eliezer Yudkowsky himself. But anyway, we'll get to all that. And the thing is, Kayla, like, you'd know all this stuff if you had read the sequences.

Speaker 1 (18:48):
What the fuck's the sequences?

Speaker 2 (18:50):
What?

Speaker 1 (18:51):
I hate this already so much.

Speaker 2 (18:52):
You haven't. Oh, you went on less wrong and you didn't read the sequences.

Speaker 1 (18:56):
See, this is what I don't like about this shit. This is what I don't like about this shit, is that this is like, geez, Kayla. Fucking nerds who, like, have felt probably like outcasts IRL getting together on the Internet and then, like, recreating the patterns of the way society is, like, shitty and isolating. Like, I don't want to go onto a forum and have to deal with cliques and, like, in-groups and out-groups and, like, oh, you don't know this? Oh, I can't handle it anymore.

Speaker 2 (19:26):
If you want to engage with any of the cult, the, sorry, the community members of less wrong, you have to read the scripture. Sorry, you have to read the sequences first. I mean, if you don't read the sequences, how will you and other less wrongers be on the same page?

Speaker 1 (19:41):
But that's also true.

Speaker 2 (19:41):
You have to read your catechism to get a common basis of understanding and vocabulary.

Speaker 1 (19:47):
This enrages me, but it's also like, I can see that being not wrong. I've definitely been in conversations with people where I'm like, we just have different levels of information on this thing and I can't really.

Speaker 2 (19:57):
You're coming at it from different basis of understanding.

Speaker 1 (19:59):
Yeah, I can't really engage with them until I read this book.

Speaker 2 (20:01):
I know, I know. That's why I'm like, on one hand I'm like, oh, there's scriptures for a cult. But on the other hand, the reason that people who have read them say, like, why you should read the sequences, is exactly that. It's like, well, it'll be easier for us to have a conversation if we have a similar basis of understanding. And boy, oh boy, do you need to have a similar basis of understanding on less wrong. They have so much vocabulary. We actually talked about this on the aforementioned previous episode of this show about less wrong. But man, you would be hard pressed to find a place with more jargon, in-group idioms, and internal links all around to the other posts on their own site and other parts of the sequences.

(20:47):
Actually, one place that I can think of, the one place I can think of that's more self-referential: bodybuilding.com. Okay, two places. The other one is tvtropes.com.

Speaker 1 (20:57):
Oh, okay. Yep. Yes.

Speaker 2 (20:59):
So less wrong is a lot like tv tropes. If you've ever spent any time on.

Speaker 1 (21:03):
That site and maintained your sanity.

Speaker 2 (21:05):
And maintained your sanity, like, it's impossible to parse at first because every other word in an article on tv tropes is just another link that links to another tv tropes page about something else. But then you get to that page. And that's all just links to other things, like every single. So you can't even read sentences there unless you already know what's going on. And eventually you'll read enough pages on tv tropes that you're sort of able to start parsing the stuff that you're actually there to read. But by then, it's too late. Your mind has been poisoned, and now you're just thinking in tv tropes language. And also, 4 hours has gone by. Less long. Less long.

Speaker 1 (21:45):
Less long. That's the porn version.

Speaker 2 (21:49):
I'm keeping that. Less wrong is a lot like that, a lot like tv tropes. And one thing I ran across this time that I unfortunately didn't last time when I was doing research was that some less wrongers seem to be, like, pretty well aware of this fact and will sometimes even reference tv tropes themselves in their posts and comments.

Speaker 1 (22:06):
Like reference the site or reference actual tv tropes?

Speaker 2 (22:10):
Kind of both.

Speaker 1 (22:11):
Okay.

Speaker 2 (22:11):
It's a little bit. That line is a little blurred.

Speaker 1 (22:14):
Yes, it is.

Speaker 2 (22:14):
As you know. Yes, it is. But yeah, or reference the site. But, you know, again, Kayla, if you had just read the sequences, you would know all this.

Speaker 1 (22:23):
Are you exaggerating? Are you being facetious? Or is that kind of how the sequences are actually talked about on the site?

Speaker 2 (22:30):
Yeah, no, that's how it's talked about. And it's also talked about in a manner of, like, because Eliezer wrote them and they are so influential. And he is so influential. There's definitely a blind spot there of, like, Eliezer is a bit. I don't want to say the word infallible, but, like, a little bit infallible. Like, referenced a lot as an authority on XYZ thing that you're talking about. And, like, linked to this part of the sequences where he talked about this.

Speaker 1 (23:00):
I feel like in the spirit of talking about our biases, I should be upfront. I have a very strong bias coming into this episode in that I really don't have great feelings about Eliezer Yudkowsky personally.

Speaker 2 (23:14):
I mean, you're not alone.

Speaker 1 (23:16):
I'm excited to get to that. I know that I'm also probably a little bit too one-sided and with not as much information as I should have. I haven't read the sequences, but from what I have gleaned from less wrong, I don't think he and I would really get along. I don't think we would hang out well.

Speaker 2 (23:35):
Yeah, he doesn't strike me as the kind of guy that finds it easy to get along with people because of sort of his. His way of moving about the world and his, like, sort of insane, pedantic overconfidence. And he's very, like, r/iamverysmart, like, walking around as a human, which is weird because he's also, like, I don't know.

Speaker 1 (23:55):
He is very smart.

Speaker 2 (23:56):
We'll get to this because it's confusing, because, like, when I read the stuff he's written, I'm like, that's pretty reasonable. He does talk about biases in a pretty, like, thoughtful, intelligent manner. And he is constantly, like, we'll get to some of the quotes, but, like, I. There's a lot to like about what he has to say. But then, like, some of the ways that he says it are, like, very off putting, especially to people that are very well versed in domains that he isn't, that he talks about.

Speaker 1 (24:22):
Right. Also, just to be a little fair, this is just a guy, and we're reacting to his Internet persona. And as much as the Internet is real life, it is also, there is some level of separation. I don't know what he's like.

Speaker 2 (24:34):
No, the Internet sucks.

Speaker 1 (24:35):
I don't know what he's like off the Internet. The Internet can bring out the worst in a lot of us. It's hard to pick up on context. It's hard to pick up on attitude.

Speaker 2 (24:42):
Kayla, the Internet enabled us to make this podcast.

Speaker 1 (24:44):
I love the Internet.

Speaker 2 (24:44):
Unequivocally bad.

Speaker 1 (24:45):
And also, like, my bias against Eliezer Yudkowsky is against the persona that's presented on the Internet. I don't actually know the guy.

Speaker 2 (24:54):
That's true. He might be in real life. He might be like, a perfectly, in fact, probably is a perfectly personable dude. I don't know him or people that do know him. And actually, I've even seen people that criticize him. He does get critique from people that do like him, that are like, look, he's good in all these ways, but then these ways, he kind of sucks. So he's a mixed bag. We all fucking are. I'm surprised you haven't asked me, what are the sequences?

Speaker 1 (25:22):
I figured at one point you were going to go, okay, but what are the sequences?

Speaker 2 (25:27):
That is almost word for word what I have written in the script right here. But it's, like, a little bit on purpose, because that means, like, you have a little taste of how it felt for me to try to dive into less wrong at first. Like, lots of confusing, like, unexplained, but linked to references. So I was just like, okay, I keep trying to figure out what this site is about and like, people keep saying like, well, you should read the sequences. Not directly to me, but like in the posts that I'm discovering and whatever.

Speaker 1 (25:55):
Did you post anything?

Speaker 2 (25:57):
I didn't. I thought about it. I thought about reaching out to a few people. I quickly became overwhelmed. We will talk to someone who is not a less wronger, but I'm still maybe thinking of reaching out to someone. But I also don't want to spend like too much time on the topic.

Speaker 1 (26:10):
Sure, sure.

Speaker 2 (26:10):
Because there's a lot to dig into. Anyway, I eventually figured out the sequences thing and it wasn't too hard. It was just like a little bit opaque at first. And actually, I've already explained here today what they are. Just without naming them. They are Eliezer Yudkowsky's first blog posts on less wrong that we talked about earlier in the show. The ones that came from his posts over on Overcoming Bias.

Speaker 1 (26:36):
Gotcha.

Speaker 2 (26:37):
Also, they're kind of like a Bible for less wrongers. As we talked about, the teachings of Eliezer are referenced frequently there. Here's a quote from a less wrong post entitled Highlights from the Sequences. The Sequences is a series of essays by Eliezer Yudkowsky. They describe how to avoid the typical failure modes of human reason and instead think in ways that more reliably lead to true and accurate beliefs. These essays are the foundational texts of less wrong. To provide a quick and accessible introduction to the sequences, we have selected 50 of the best essays that capture some of the seminal ideas. Estimated read: 8 hours. Art by DALL-E 2. There's a little art bit there in that article. This is by DALL-E. Just. Just want to throw that in there.

Speaker 1 (27:22):
I don't want to read them.

Speaker 2 (27:26):
They are fucking long. Here's another less wrong post, simply entitled Sequences. Quote: the original sequences were written by Eliezer Yudkowsky with the goal of creating a book on rationality. They have since been collated and edited into the book Rationality: From AI to Zombies. If you are new to less wrong, this book is the best place to start.

Speaker 1 (27:51):
I want to say something.

Speaker 2 (27:53):
You're currently saying something. Good job.

Speaker 1 (27:55):
I want to say another thing.

Speaker 2 (27:56):
Oh, okay.

Speaker 1 (27:57):
And now another. Sorry. Oh boy.

Speaker 2 (28:02):
This is good content right here.

Speaker 1 (28:06):
The word rational.

Speaker 2 (28:09):
Mm.

Speaker 1 (28:11):
Is a yellow flag word for me now.

Speaker 2 (28:14):
That's because you're irrational.

Speaker 1 (28:16):
Because I'm so irrational that I can't even hear it without going crazy. It's just because it is simply a word, and there is no credentialing for being able to say x is rational and y is not. And I think that most of us can probably think of times in which the word rational was used to label one thing just because that person wanted it to be rational versus another thing just because they wanted it to be irrational.

Speaker 2 (28:42):
Go back to the episodes last season about objectivism.

Speaker 1 (28:44):
Yeah. Simply because you label something as rational doesn't make it rational. And, like, I also think there's a lot of fallacies over, like, well, something that follows, like, this style of logic is rational or something that addresses the existence of emotions is irrational. And the word rational sends up little signals for me these days. And so I just wanted to say.

Speaker 2 (29:09):
That I think this whole site and even all of this research in these episodes are kind of, like, riding that razor's edge of, like, the more rational you think you are, the more of a blind spot you have. But also, it's good to try to cultivate your rationalities. So, like, I think that's part of why I've felt so, like, bizarrely torn is because this is just a tearing sort of thing that they're talking about. Right. Like, if you're trying to improve your own thinking, there is that feedback loop of, like, I'm thinking about thinking now. I'm thinking about thinking now. I'm thinking about. And then you also have the paradox of, like, the more rational I become, the more rational I think I become, the more irrational I might actually be because of my overconfidence. It's. It's also a word fucking weird that.

Speaker 1 (29:56):
Has been, like, used. Ugh. And I'm gonna, like, get into, like, jarring a now. And I just. Please don't. Please don't all shut your ears to me just because I'm saying the words that I'm saying.

Speaker 2 (30:04):
I've never opened my ears to you, not once.

Speaker 1 (30:06):
I think that, like, the word rational has also frequently been, like, weaponized against marginalized populations. Like, I'm thinking, like, women or queer people or people of color, especially when, like, those people are advocating for, like, the rights of their group. Like that.

Speaker 2 (30:24):
Oh, women especially.

Speaker 1 (30:25):
Yeah. The word rational is, like, well, the opposite. What you're saying is irrational, and what I'm saying is rational. And we're simply saying you're irrational because we don't want you to have rights. And we're saying we're rational because we want to preserve the status quo. And that's another reason why that word is a little tricky for me these days.

Speaker 2 (30:44):
I don't have a big chunk of podcasts devoted to this. But now that we're talking about it, I just really do want to mention it. So we talked about here. This quote said that he made a book called Rationality: From AI to Zombies, which is essentially just a collection of all those sequences. He also wrote a book that's like, oh, that's cute. Foundational, A to Z.

Speaker 1 (31:04):
Sorry, sorry.

Speaker 2 (31:05):
AI to zombies.

Speaker 1 (31:06):
Don't put any of that in. Sorry, please continue.

Speaker 2 (31:09):
Too late. I'm gonna. He also wrote a book. I guess it's considered fanfic, technically cool. And it's also considered somewhat foundational. Like, people definitely reference it and talk about it. Harry Potter and the Methods of Rationality, where, like, I couldn't be less interested. Harry and Hermione and what's his face all are like. Harry's like this super rational person that discovers all of these ways to think less biasedly. And then he's also getting in discussions with the Snapes and the whatevers of the world. The conflict between magic and a rational world. I didn't read any of it. Well, I read little snippets of it.

Speaker 1 (31:55):
I hope that was helpful to some people. I will never be looking at that, not even once.

Speaker 2 (31:59):
It's like, what if we took two really cringey things and mush them together into a cringe Voltron? That's what that is. Anyway, one of the things I was wondering, I'm not sure if you're wondering at this point, but why are they called sequences?

Speaker 1 (32:14):
Yeah.

Speaker 2 (32:15):
The fundamental answer is, I don't really know. It's another bit of less wrong jargon. But it comes from referring to, quote unquote, sequences of site posts. So, like a series of site posts that all cover a similar theme, like, say on less wrong. If there's a bunch of related posts talking about confirmation bias. Okay, that might be called a sequence of posts on confirmation bias, and they might, you know, tag it. So there's like a little like, hey, do you want to talk about confirmation bias? Go to these sequences. But there's also, like, this unacknowledged bit, which is like, there's this huge separation between sequences, lowercase, and the Sequences, capital T, capital S. Okay. With sequences just being like any old collection of themed posts and the Sequences being Yudkowsky's original writings. Definitely a huge separation there.

Speaker 1 (33:08):
I'm just getting. I'm not. Objectivism. Objectivism. It just feels like an objectivism-style group.

Speaker 2 (33:18):
It's definitely. I mean, there's overlap there too. Like, we talked about this a little bit in the Humanity Plus episode, but, like, there is a sort of libertarian-ish wing of transhumanism, less wrong type people, and there's a more progressive type wing. But also, this stuff is really orthogonal to the traditional left-right culture war stuff that we're so fucking flooded with every day. I think that's also part of what's disorienting about it is because you know what?

Speaker 1 (33:51):
It's scary, but it's also a little refreshing.

Speaker 2 (33:53):
Yeah, it's both. Totally. It's like, this disorients me, but also. Oh, man, that's kind of nice. Now, speaking of his original writings, are all of them about rationality and human cognitive biases?

Speaker 1 (34:05):
Yes.

Speaker 2 (34:07):
No, Kayla, of course not. They're, like, partially one man's treatise on how he thinks of cognitive biases and the art of rationality. And by the way, I'll say I did read a lot of, like, okay, I read what I consider to be a lot of them, but it was a drop in the bucket because of how many there are, how long it is. I found them useful and interesting for the most part, but they're partially that treatise on cognitive biases and also partially, like, a guru self help book about how to think better yourself. And then as that kind of thing tends to happen, where the guru talks about things that aren't exactly human cognition, he'll sort of branch off into politics and this and that. Politics is its own whole thing. On the site, they say politics is the mind killer.

Speaker 1 (34:57):
What does that mean? What is.

Speaker 2 (34:59):
We'll get to that. We'll get to that.

Speaker 1 (35:02):
I'm gonna keep.

Speaker 2 (35:02):
But he talks about, like, a lot of stuff that isn't just biases and human thinking.

Speaker 1 (35:07):
Okay.

Speaker 2 (35:07):
Even though it's mostly that. Okay, so obviously then, Mister Yudkowsky's background is in philosophy, then?

Speaker 1 (35:15):
Are you asking me? Are you saying me? I would say it's almost certainly not, no.

Speaker 2 (35:21):
Okay. So then, like, what do you think? Maybe psychology?

Speaker 1 (35:24):
I don't. If I had to guess, I would say, like, history or something.

Speaker 2 (35:31):
Like, wow, history.

Speaker 1 (35:32):
That's like some sort of, like, academic career where he had to read a lot of books. Like this kind not philosophy, because that's too obvious to. But, like, not psychology, but, like, something where he's had to read a lot of these guys, and then went, I could be these guys.

Speaker 2 (35:53):
So here's the interesting thing, is that his, and many less wrongers', interest in these cognitive and philosophical topics comes directly from working on the problem of artificial intelligence. Like, that's.

Speaker 1 (36:05):
Oh, so he's a computer.

Speaker 2 (36:06):
He's an AI guy.

Speaker 1 (36:07):
Like, he makes an AI?

Speaker 2 (36:09):
He talks about it.

Speaker 1 (36:12):
Well, what does he, what is it? What's the answer here? What is he?

Speaker 2 (36:17):
Hold on.

Speaker 1 (36:17):
Where did he come from?

Speaker 2 (36:18):
So, like, if you're trying to build a human equivalent brain, then, like, you basically, you necessarily run into all this kind of stuff about how do people think? And how should I make this thing think?

Speaker 1 (36:28):
And I'm really glad that people that are making, that are in the AI world are thinking about this stuff. Yeah, that seems super important.

Speaker 2 (36:36):
Yeah. And if you're thinking about all that, then, I don't know, maybe it's a good idea to say, like, hey, I've been thinking about all these cognitive things and cognitive biases working on this AI problem. So, like, maybe you guys would benefit from it, too. So I'll make a website. So. All right, last part of the question then. Not philosophy or psychology or whatever. Where did Yudkowsky get his background for working on the AI problem? Did he get his degree in computer science or natural language processing, maybe, or something?

Speaker 1 (37:04):
Say, yeah, he's like a computer science engineer. Probably lives in the Bay area.

Speaker 2 (37:12):
Also. No.

Speaker 1 (37:13):
Okay.

Speaker 2 (37:14):
He did not attend college and is considered, that's fine, an autodidact.

Speaker 1 (37:19):
See, I hate that word. I hate it.

Speaker 2 (37:21):
One of those guys is autodidact.

Speaker 1 (37:24):
The one I hate, or is there another one?

Speaker 2 (37:26):
No, you hate autodidact.

Speaker 1 (37:27):
I know. I hate that, too. But I think there's another word that also means the same thing. But I really don't like that word.

Speaker 2 (37:32):
Yes, I really hate it. I know you hate that word, and.

Speaker 1 (37:36):
I shouldn't, because I think that informal education is extremely valuable and important. I just don't like that word.

Speaker 2 (37:44):
He also didn't attend high school. He does not have a high school degree either. Okay, so just going back to what you were saying a second ago, I do want to put a huge disclaimer here that this is not an educational elitism podcast.

Speaker 1 (37:59):
No. Some people don't get to, or some people don't choose to go to high school.

Speaker 2 (38:02):
That's right. And that is more than perfectly fine. You can be an extremely smart person, extremely successful, smart in whatever way you want to define it, book smart, street smart, emotionally smart, whatever, without going to high school. That's totally possible.

Speaker 1 (38:18):
It is just an unusual thing to come across when we're talking from the perspective of two Americans, where it's more rare to come across someone who hasn't attended high school.

Speaker 2 (38:30):
And we'll get to this a little bit, but it makes you kind of run into the exact thing that less wrongers talk about when you go to their site, which is like, well, you know, it'll help if you read the existing sequences so that we can talk about something and we don't reinvent the wheel and blah, blah. That's exactly what he didn't do. Because not only did he not go to college for philosophy or natural language processing, he also kind of doesn't have a lot of regard for the previous work in that area.

Speaker 1 (39:05):
Okay, there's two things going on there, because not going to college or high school doesn't necessarily mean you don't have regard for.

Speaker 2 (39:11):
Exactly. But he also doesn't have that institutional learning.

Speaker 1 (39:14):
But so you're saying in addition to not having attended institutional learning facilities, he also is like, oh, that's a. Yeah, yeah.

Speaker 2 (39:22):
And like, I think that if he had. If that wasn't true, then it would be kind of like, okay, well, who cares if he didn't go to college? He still read all the stuff and is engaging with the community of philosophers in like, on the terms that have been.

Speaker 1 (39:35):
So how is he an autodidact? Like, what did he do to didact himself?

Speaker 2 (39:39):
I think he just thinks about things a lot.

Speaker 1 (39:40):
That's not an autodidact then. Wait, isn't Ida. Hold on.

Speaker 2 (39:43):
Listen.

Speaker 1 (39:44):
Is an autodidact somebody who, like, self-educates?

Speaker 2 (39:46):
Self educates? I don't know.

Speaker 1 (39:48):
There's a difference between self educating and saying a lot of words.

Speaker 2 (39:51):
I agree. I cannot tell you, with the research that I did, why he and others refer to him as an autodidact.

Speaker 1 (40:00):
He doesn't lead with things like, I've read a lot of this or I read a lot of that. It's like, that's not even part of it. Can it be okay that somebody's credentials are simply the body of work that they produce, even if that body of work is not necessarily informed by the existing knowledge?

Speaker 2 (40:19):
I think that is a very high bar to cross, if it's crossable at all. And we will definitely get to that.

Speaker 1 (40:29):
That's making me confront some of my biases.

Speaker 2 (40:32):
There's actually a lot more to talk about regarding the less wrong community, including the really important bit of what movements it fits in with. Spoiler. It's, you know, transhumanism and extropianism, which is why we did those episodes in sequence. Oh, and the influence that less wrong, in turn, has had. Spoiler. It spawned the modern rationalism movement. That's what we're going to get into in the next episode. But before we get to that episode, for the next one, I just need to, like, I just need to vent, man. I need some processing time that's just, like, private between you, me, and everyone that can download our publicly available podcast. Until next time. I'm Chris.

Speaker 1 (41:17):
I'm Kayla.

Speaker 2 (41:18):
And this has been.

Speaker 1 (41:19):
I don't know anymore.

Speaker 2 (41:20):
Cult or just rational thinking weirdness.