Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Welcome to Stuff to Blow Your Mind from
HowStuffWorks.com. Hey, welcome to Stuff to Blow Your Mind.
My name is Robert Lamb. And I'm Joe McCormick. And
last time, Robert and I were talking about the rapture
(00:23):
about Christian eschatology, religious eschatology in general, and secular eschatology
of a technological nature, such as the singularity and transhumanism.
And our discussion went so long that we decided to
divide it up into two parts. So here is the
second part of our conversation on the end times. Okay, well,
(00:44):
I've got another, uh, futurist, sort of techno-futurist mindset
that I think we can ask: Is this a religion?
Is it like a religion, or does it just sort
of, like, uh, tickle some of the same weak spots
we've got that many religions do? So I want to
talk about the singularity. Robert, do you feel good about
(01:04):
the Singularity? You feel bad about it? You think it's
a bunch of hooey? I feel good about it.
I mean, part of that I think comes from
being a fan of the Iain M.
Banks Culture series model of a post-singularity, post-scarcity
world in which the superintelligent computers have our
best interests at heart. They kind of look after us
(01:24):
as kind of benevolent, semi-noble gods. Yeah, okay, okay. Well,
so I think we need to explore this idea because this,
to me is is the core of this kind of
topic we're talking about today, like technological eschatology. So you
could say that the singularity, I think, has come to
have almost as many interpretations as the Christian eschaton, right? Like,
(01:45):
just like Christian eschatologies, they're all sort of
loosely related and grounded in similar beliefs, but they vary
in the explicit details or in the way you'd explain them,
even if they're not necessarily contradictory. So, in the most basic sense,
the singularity is a hypothesis that technological innovation, at
some point in the future is going to reach a
(02:07):
tipping point of unimaginable innovation that fundamentally changes the fate
of humanity on a rapid time scale. Very often it
involves artificial intelligence. So I'm gonna talk about three basic
models just to give you an idea. One is
the superintelligence model, and this is the idea that
humans are going to create superhuman artificial intelligence. So when
(02:31):
you think about it, intelligence, I would probably argue, has
been the most rapid driver of change in the universe
up until this point. Like in cosmological or geological history.
What difference does ten thousand years usually make to anything?
You know, not a lot, hardly any. But you think
about how much Earth has changed in the past ten
(02:51):
thousand years due to intelligence. Once we reach the
creation of machines with functional intelligence exceeding that of any human,
assuming that actually does happen, proponents argue that this superhuman
artificial intelligence is just going to rapidly transform our environment.
The conditions of our lives, and the capabilities of human
(03:12):
civilization will be fundamentally transformed on a very huge scale.
And, they say, the age of superintelligence is dawning,
and this is the singularity. Another way of looking at
it is also based on artificial intelligence, but it's known
as the intelligence explosion, and this is based on the
idea of self improving intelligence paradigms that surpass our ability
(03:36):
to understand or control them. So let's say, Robert, you
build a superhuman artificial intelligence. What's the first thing you
have it do? Um, co-author my paper about the
creation of superhuman AI intelligence. Oh yeah, well, that's
a smart move. But do some talk shows to promote
it and my book about the creation? Yeah, I guess. Yeah,
(03:59):
it's hard to think past that. Come up with the
tastiest breakfast cereal ever created. No. I mean what a
lot of people say is, well, you know, the best
bang for your buck as soon as you create a
superhuman artificial intelligence is that it would be best put
to use designing ways to rapidly make itself more intelligent,
which theoretically it should be able to do. So. If
(04:19):
it can do this, it will experience exponential self-improvement,
and not only will our technological intelligence grow, and not
only will it accelerate, but its acceleration will accelerate. I
feel like the people saying that that's the best use
of superintelligence, um, have not seen any science fiction film,
because it seems like the thing to do is all right.
(04:40):
It's reached the point where it's smart enough
that I can take it to Vegas and just
totally kill. But it's not so smart that it will
literally kill me or, or Skynet the entire country. Well,
I don't know. I mean, I think the argument would
go that, well, even if you don't do that, somebody's
gonna somebody's gonna say, well, I want to have a
smarter AI than your AI, so I'm gonna have my
(05:03):
AI improve itself. And I don't know, you take your
AI to a poker game, and somebody else has been
having their AI self-improve its own intelligence, and it's
telling that person... okay, this metaphor's out of hand. But
you want to be the person with the smartest AI.
Right. Well, but it's kind of like you're
(05:23):
reaching a point where you can create your own demons,
create your own deities, and in doing so, it seems
like you need to know where to stop. Like,
maybe not make a full-blown god, but make a
demigod. Let me make an artificial demigod, because
then at least it has limits. Right. Yeah, you
want a Hercules, not a Zeus. Yeah, I'd far rather
(05:44):
deal with a renegade Hercules in a
situation as opposed to a renegade Zeus, which of course
is... I mean, that's pretty much how both of those
guys rolled: they did what they wanted. Okay. One more idea of
the singularity, and this is, I think, a simpler version,
but it just is that at some point technological progress,
again usually the emphasis is on artificial intelligence, but you
(06:05):
could put the emphasis elsewhere. At some point it begins
to accelerate so quickly that our power to predict the
outcome becomes basically zero. Events will become so strange and
unpredictable that we just can't even imagine them from
our present vantage point. So an outside context incident. Yeah, sure,
outside context problem caused by technological innovation. Yeah. So a
(06:27):
lot of modern singularity thinking can be traced, and there
are plenty of people who have written about this over the years,
but I'd say the biggest figure in the singularity today
is Ray Kurzweil, the technologist and author. Ray Kurzweil,
very smart guy, worked on technology for years. He's made
a lot of very accurate and interesting predictions that
have proved correct over the years. But he's also
(06:50):
made a lot of predictions about the future that a
lot of people now think are I don't know, maybe
he's being overly optimistic. But anyway, Kurzweil has argued in
several books that the Singularity, this artificial intelligence revolution that's
going to fundamentally change humanity, is coming, and it's coming soon.
In his two thousand five book, The Singularity Is Near,
(07:12):
Kurzweil predicted it would happen in... So, Robert, you've
got any plans for, um? I think my son will be
out of high school at that point. I think he's
graduating high school, theoretically, in... so, okay. So
(07:33):
I guess... I don't know. Will we
have a job by the time the Singularity happens? I don't know. Yeah,
it's crazy to think about that. Like, what do
you do post-high school if the Singularity
is coming in just a few years?
Like, do you go to college? Hey, with the
singularity in mind? Bingo, that is a very good question.
So let's back up. Let's just assume for a
(07:55):
moment a Singularity-type eschaton is coming. That might be
a big assumption, but let's assume something like that will happen.
Is it good or bad? Will we get the AI
Maitreya or the AI Ragnarok? Ooh, well, obviously I
want Maitreya as opposed to Ragnarok. Yeah, I mean,
so the Maitreya version, what would it be? You know,
(08:17):
wars cease, all diseases, including aging itself, are eradicated. There's
general abundance of resources and wealth. Everybody gets what they want.
Humans live indefinitely in a state of harmony, exploration, and bliss.
I feel like the model we get in most
sci-fi scenarios is that you think you're gonna get
Maitreya, but then you're actually on the road to Ragnarok,
(08:39):
and then you end up just kind of burning everything
down and going back to whatever messed-up system you
had before. Right. So why do people think that we're
on the road to Ragnarok? Well, you mentioned earlier, you know,
the machine... a machine smart enough to do all
this great stuff for you is also smart enough to
figure out how to kill you if it wants to.
So why might it want to? Well, superintelligent machines,
(09:02):
people say, will either intentionally choose to eliminate humanity for
some reason, though I admit I've never heard this explained
in a way that made a lot of sense to me,
like why they would intentionally choose to hunt us down
and eradicate us. So maybe somebody could explain that to me. Um,
to be the judge of the living and the dead? It
could be. But the other version is that perhaps
(09:25):
merely by a flaw of design, they tend to damage
or destroy humanity as a byproduct of executing their primary function,
and that vision seems more plausible to me. But so,
here's one common example of how something like this would go.
You've got a superintelligent computer program that's smarter than
any human. It can improve its own intelligence, is just
(09:46):
crazy smart, and it's designed to maximize efficiency along the
production line of a factory that makes toilet paper. So
somebody forgets to add some conditionals, and the computer program
ends up turning all the atoms on the surface of
the earth, including the ones in your body, into toilet paper.
It has done its job. Well, that sounds kind of crazy,
(10:07):
but some tamer version of that nightmare seems more
plausible to me than the intentional Terminator-style eradication of humans.
I don't know, what do you think about that, Robert? Um, yeah,
I think I tend to interpret the sort of Skynet
models, the sort of, like, 'all humans are flawed,
thus all humans must be eradicated' thing, to be
(10:28):
more of a, uh, projection of our insecurities. Well, that,
and also just a bedtime story to say, like, make
sure you program in that failsafe, you know. And I
feel like, obviously we would have that failsafe, and if
it were a superintelligent machine, it would have. Yeah.
I just have a hard time buying into that model
(10:51):
of a super-advanced system, that it would be so
catastrophically myopic. Yeah. I mean, both visions have their advocates,
right? There are smart people on both sides. But
the funny thing is all this is assuming that something
like the Singularity will actually happen one way or another,
and not everyone agrees that it will. But there are
(11:14):
also people who have been critical of the idea of
the singularity, and some people who have characterized it as
a religious idea inherently. One of them is the technologist
and virtual reality pioneer Jaron Lanier, and he's got a
lot of opinions about the Singularity. He's made this case
in books, saying that it's not only religious
(11:35):
in nature, but potentially a dangerous or harmful religion. So
I read an article that he wrote and published in
The New York Times in two thousand ten called "The
First Church of Robotics," and I thought this was interesting.
So Lanier points out that every news report of an
advance in artificial intelligence gives us the impression that the
(11:55):
field is making steady progress toward overtaking human intelligence in
capabilities, and I sort of agree with him.
I get that feeling when you see these little news
stories like there's there's a new artificial intelligence program that
can smell flatulence, There's a new artificial intelligence program that
can tell a joke. You know, all these like weird
little quirks of humanity, and they seem to add up
(12:18):
to something over time. Okay, so it can smell farts,
it can make jokes about farts, because it's just around
the corner... by that standard, it has become Adam Sandler. Yeah,
it has surpassed its humanity. But yeah, so there's nothing
wrong with individual AI projects, Lanier says. In fact, he
has worked on them himself. But he makes a couple
(12:39):
of points. He says individual advances in artificial intelligence research
can be interpreted more usefully without the overall framework of AI,
meaning without the idea that computers are becoming synthetic versions
of human intelligence. And the example he gives is, you know,
the IBM computer winning on Jeopardy, Watson going on and
(13:00):
beating people on Jeopardy. So he says, that's a
theatrical stunt that oversells a fake aspect of what
this computer program can do, you know, conversational human knowledge,
which it doesn't really have, and undersells a real aspect
of its capabilities, that it's a very powerful phrase-based
search engine. And then he also says, uh,
(13:24):
that the AI that in some way passes for human
requires humans to do all the work. Like when you've
got a social robot, you know, going around saying like Hi,
I'm your friendly social robot. Look at how advanced my
artificial intelligence is. His point is that humans, being social,
will work with what they get. And if you're having
a transaction with a robot that you are told is
(13:45):
social in nature, you can very easily adapt your social
behaviors to the extreme limitations of this machine, and they
do tend to be pretty extreme. You're not gonna mistake
a machine for a human. Yeah, I mean, as
we've discussed on the show before, I mean, humans
can have conversations with
a doll with a smiley face that's scrawled
(14:07):
on the wall, a hand puppet. So, uh, it's not
that much of a step to say that, yes, we
can have a conversation with a chatbot, we
can have a conversation with a search engine, and
of course we can certainly have a conversation with an
echoborg. How often, though, do you meet a person
who really loves fruit trees? That's true, I love fruit trees.
(14:28):
I hate fruit trees. But how can you hate fruit
trees because I'm not a robotlet what would you do
if you found a turtle? The turtles on his back?
Why aren't you turning him over? Joe? But what what?
What I'll tell you about my mother? Okay? Sorry? So
but anyway, Lanier goes on to say, quote, what
bothers me about this trend, and he's talking about this
AI media trend, however, is that by allowing artificial intelligence
(14:50):
to reshape our concept of personhood, we're leaving ourselves open
to the flip side. We think of people more and
more as computers, just as we think of computers as people.
And I think there might be something to that, actually,
that the idea of AI not only sort of lifts
computers up, but in a certain way, at least emotionally
(15:12):
kind of lowers humans down. Yeah. And we've discussed this,
I feel, recently on the show too, about how
we have this increasingly pervasive model of
our own cognitive performance as being essentially a program: we
think about processing things, we think about visual input of data,
(15:34):
we think about crunching the numbers, uh, and various other
models where we use technology to inform our understanding of how
our brains are working. And those are just metaphors,
really, but the underlying processes are not
really comparable to the computers that we have today. It
would be... we could maybe compare them to an advanced
(15:56):
computer we'll have in the future, some sort
of, maybe, um, you know, um, advanced machine, maybe. Yeah. Yeah,
I think that's true. Uh, they're very much metaphors.
And I'm not saying that they're not useful metaphors, but
I do think I want to take what Lanier
is saying seriously here, because I think he has a point. Anyway,
he goes on. Now he's getting to the singularity,
(16:18):
so he says, you know, engineers are humans with human desires,
even though they may be advocating this sort of, like,
dehumanized technological ideology or view of intelligence and personhood in
the world. And they've got the same desires that lead
us to create and follow religions. So this sort of
leads them to wrap their technological ideologies up into parallels
(16:40):
of religious thinking. And according to Lanier, the
concept of the singularity is a religion expressed
through engineering culture. And he points out that this religion entails,
you know, essentially eschatological beliefs that imply that most people, basically,
I would say, anybody not working in information theory, artificial intelligence,
(17:02):
or robotics, pretty much anybody outside those fields, can
basically just sit and wait for imminent salvation or destruction
via the machines. And if inadvertent devaluation of persons by
overstating analogies between persons and computers takes place, it can
certainly exacerbate tensions between traditional religions and modernity, and it
(17:26):
can be a destructive and demoralizing force in itself. So anyway,
he ends by saying technology best serves humanity when we
resist the urge to turn it into a religious ideology. Yeah,
because otherwise you end up in a situation where you're like,
I could exercise and eat right, but the Singularity's
coming. The Singularity's gonna work that out. Or you might think,
(17:47):
I'm not even gonna bother to write the Great American Novel,
because Great American Novel Bot Thirty-Seven-Oh-Five
is going to write the Great American Novel. It's gonna write
one a day, one every hour. I could work hard
to achieve peace in world civilization, but maybe the AI will
sort it all out for us, or just kill us all.
(18:07):
I mean, either one. It's sort of a demoralizing,
demotivating belief, which of course has obvious parallels with, you know,
more apocalyptic near-future religious movements that said the end
times are coming, so what does it matter? Give me
your money now. Certainly. Yeah, whether it's secular
(18:28):
or, you know, supernatural in nature, if
you've got beliefs about a powerful force that comes in
and intervenes and makes all of our actions obsolete, either
way, it seems like it can
have a demotivating force on the way you live
your life. But I want to say one more thing
Lanier says about the Singularity that I think is interesting.
(18:50):
Uh, so, I listened to a podcast that was an
interview with Lanier, and he says, quote, the difference
between a religious fanatic and a religious non-fanatic is
certainty about the imminent timing of whatever their eschaton is,
you know, whatever their end of days is. It's when
you believe that you know when it's happening and it's
(19:11):
soon that you really turn into a nutcase and you
do harm to yourself and others. In my mind, the
Singularity movement is sort of doing that. And I think
I see what he's saying there, because he sort of
qualifies it, saying that if you mean, in the very
long-term view, humanity will make some phase transition, and it
has something to do with information technology, that's not necessarily
(19:33):
crazy or harmful. You know, you might say, who knows,
over the next few thousands of years, somehow humanity
might change itself, uh, due to computers and the Internet
and stuff like that. But the notion that it's going
to happen in twenty years or something is destructive and
a sign of fanaticism. According to him, and I think
that idea makes sense to me too. Well, in
(19:57):
large part, you could say you're setting yourself up
for disappointment, you're setting yourself up to fail, right?
You're going to reach that point when the Second Coming
is going to happen, when the Singularity is going to happen,
and then it doesn't happen, and then how do you
feel about the world? Then how do you approach the
near- and long-term future? Well, I think that is
definitely something we should talk about in just a moment,
(20:18):
But I want to finish up on the Singularity with
one or two more things. First of all, I want
to go back and defend against that by saying, there
are also people who say, you know, look, if we're
worried about the AI Ragnarok, saying the AI is
gonna come in and destroy us all, we do need
to be thinking about it now, even if it's not
gonna happen soon. Just the possibility that it might happen
(20:38):
soon means that we need to be preparing immediately. I
can see the wisdom in that too. I mean, I
don't know. It's, uh... at the risk of sounding facile,
I see both sides. Yeah, well, I mean, certainly,
uh, if the iceberg is out there, we want to
know how not to run the ship into it. But
(20:58):
then again, what if the iceberg is a superhuman robot
that you don't even know whether it could possibly exist
or not? Well, that's the kind of thinking that enables
humanity, time and time again, to not actually, you know,
deal with the problems facing them, to just say, ah,
it might not even be real. So I don't know,
the jury is still out on the science there. So
(21:23):
I want to offer one more thing, a possible defense
of singularity thinking. I found a short video on YouTube
where somebody in a crowd asks Ray Kurzweil himself, the
guy who said it's going to happen, whether the Singularity
is a religion, and Kurzweil responds. He says, essentially,
it does provide some of the same things that religion
(21:44):
traditionally provided, like a means to forestall death. That's worth
pointing out. But it's also not a religion inherently because
it's based on scientific and technological prediction. And he said,
quote, it doesn't start from rapture. He also says, quote,
religions arose in pre-scientific times. To call
it a religion is to say that it's pre-scientific
(22:05):
or unscientific. And I think, you know, maybe or maybe not.
Like, I definitely do sense that some people who call
the singularity a religion, because Lanier is not the only
one who said this, this is a thing people say
in criticism of it, I think some people are merely
being pejorative. Have you noticed that whenever you hear anybody
(22:25):
call something that's not a religion a religion, it's an insult?
Isn't that interesting? Like, uh, that political candidate you agree
with, Greg Stillson? Man, all you Greg Stillson supporters
are like a religion. That's never a positive thing. Yeah. Yeah,
I mean I find myself doing that less and less
(22:46):
because I feel like... and it's just me personally,
but I think there are plenty of things where I
can say, oh, I have kind of a religious interest
in this. Yeah. Not to say that
it's some sort of, like, fanatical, unrealistic
level of enthusiasm, but I kind
(23:06):
of measure it by the amount of my person,
the amount of my identity, uh, that I pour into
the topic. So I would say that I have religious
or semi-religious enthusiasm for things like science, and
for certain fictional worlds and ideas. Um, Robert, if I
(23:27):
may compliment you, I think that's a sign of
broad-minded thinking. I mean, I think both religious and
non-religious people... I don't think this is really limited
to people who think religion is in some way a
bad thing. But I think generally both religious and
non-religious people insult a thing that's not explicitly a religion
by calling it a religion. Yeah, I would agree with that. Yes,
(23:48):
But, yeah, in a sense, it doesn't have to be.
I mean, in some ways we could just observe, like, okay,
not assuming that calling something a religion is a bad thing, uh,
in what ways is the Singularity like a religion? Well,
I think it does have a lot of similarities. It
does seem to offer a means to forestall death. It
does seem to offer, uh, potentially both apocalyptic and utopian
(24:09):
eschatological visions. It does seem to rely... I don't know
if I would say it relies on faith in the
same way that many supernatural beliefs do. But it
does seem to sort of ask you to to imagine
the possibility that things that we've never seen before are
going to happen. You know, we've never seen superhuman artificial
(24:30):
intelligence before. There's no evidence that it can in principle exist,
but we also just don't know that it couldn't.
So it's just, like, maybe, maybe not. All right, so
it seems like a good place to bring it back
around to transhumanism and the rapture, the idea that there
are all these similarities between the two things, there are
certain conflicts between these two ideas, and then at the
(24:54):
very end here we may have just a little fun
thinking about how the two crash into each other. Okay,
tell me. Because here's one thing I've noticed, I think,
and I haven't seen any data to back this up.
This is just purely anecdotal experience. In my experience, people
who identify with transhumanism or belief in the singularity
are overwhelmingly atheist or agnostic, non-traditionally religious people. Yes,
(25:21):
you do see a great deal of that. However, it
is worth pointing out that, again, they have at least
two organizations. You have the Mormon Transhumanist Association. That's
the world's largest advocacy network. Really? Yeah, for the ethical
use of technology and religion to expand human abilities. That's
from their talking points. Then you also
(25:41):
have the Christian Transhumanist Association, and, um, you know,
each one has their own sort of set of guidelines
and all. Um. Uh, and I could certainly
roll through a lot of this. But, for instance, um,
the Mormon Transhumanist Association has this idea that
they push of transfigurism. They say
transfigurism is religious transhumanism exemplified by the syncretization of Mormonism
and transhumanism. That's awesome. Um. And meanwhile, the
(26:05):
Christian Transhumanist Association, one of their tenets is that
God's mission involves the transformation and renewal of creation, including humanity,
and that we are called by Christ to participate in
that mission, working against illness, hunger, oppression, injustice, and death. Uh. Well, yeah,
(26:26):
I mean, you could look at it that way. For
some reason, the idea of transforming the human animal is
one of those things where I don't see any
way that it violates any tenets I recognize of Christianity,
but it just seems sort of, like, aesthetically an affront
to traditional religious thinking. It's a lot of what we
talked about in our you know, UH Techno Religion episodes
(26:49):
that we did earlier. And you could say that this
episode is in many ways a sequel to those, or
it's in that spirit. But in the Techno Religion episodes,
we talked about how religious thinking seems most encouraged by
a traditional technological atmosphere, and when you bring in computers and
iPhones and stuff, it just does seem profane in some way.
(27:12):
And there's nothing in the Bible or in any religious
text I know of that says thou shalt not bring
a computer into a worship service. But you know, it
just doesn't feel very sacred. Yeah, it doesn't feel right. Now.
Going back to the idea of sort of Christian
transhumanist ideas, that James J. Hughes paper that I mentioned earlier, um,
(27:34):
he calls up a quote from a fourteen eighty-six, uh,
work by Pico della Mirandola, who was a Christian humanist,
and in his Oration on the Dignity of Man wrote,
quote, to you is granted the power of degrading yourself
into the lower forms of life, the beasts,
(27:55):
or to you is granted the power, contained in your
intellect and judgment, to be reborn into the higher forms,
the divine. Um, and he wrote other
statements as well that kind of backed up this kind of,
you know, way pre-transhumanist notion of how one might
improve the expression of human life. Well, yeah, I mean, if,
(28:18):
for example, Jesus said take all you have and give
to the poor, then wouldn't the Christian who thinks that
we can achieve abundance through technology be best served by
working on technology that will eliminate all scarcity and bring
great resources to everyone? Yeah, it seems like you would
be in favor of any transhumanist idea that does
(28:40):
not attempt to interfere with God's plan for humanity. Um, so,
I guess it depends on what you think that plan is.
Right. Now, this is where we get into an interesting idea.
And this is kind of where... okay. So, first
of all, you see a lot of, especially in the conspiracy theory,
sort of the Christian conspiracy-theorist area of the internet,
(29:02):
you see a lot of mashing up of transhumanist
ideas with the bad guys in the Book
of Revelation. Oh yeah, remember how I mentioned the ones
who get to rule with Christ in the millennium are
those who have not taken the mark of the Beast?
And what is the mark of the beast?
I've seen various interpretations where they say, oh, the mark
(29:25):
of the beast is transhumanism. Like, to become transhuman
is to take the mark of the beast, and therefore
only sort of, uh, you know, non-augmented humans are
going to be part of this model. There have been
many visions of the mark of the Beast in Christian
literature that thought that it was going to be some
sort of biometric verification. So imagine that you replace your
(29:48):
credit card with a barcode or a chip in your
arm or your head or your hand or something like
that that you can use to pay for groceries or
something like that just by scanning your body. A lot
of people have said, well, something like this is coming
and that's got to be it. That's the mark of
the beast. And the Antichrist is perhaps the AI. Right,
this post-singularity artificial intelligence that we're investing
(30:12):
all of our faith and energy into. And I think
this is an interesting idea too. I have
seen arguments that say that the Illuminati or the Antichrist,
or the Illuminati and the Antichrist, however
you want to factor those two bad guy factions together,
they are going to create a fake rapture in order
(30:36):
to trick the faithful into thinking that the rapture has
already occurred and they were left out, thus, like, dismantling
their faith a bit. That is an awesome conspiracy theory.
It is. Um, and that's, like, the mother of them all. Yeah,
I found some stuff online about it, dating back
from around two thousand, um, and they were weirdly propping
(30:57):
up some of this by pointing to a Technology Review
article about editing live video, editing people
out of live video. Uh, the idea being here that
through mass media consumption, you could fake the rapture of
individuals. And, if I may, I'm thinking, if the
Illuminati is involved in this, you could then disappear, like,
(31:19):
famously amoral individuals and say, hey, check it out,
this particular awful celebrity just got raptured, and you didn't.
Clearly you should abandon all of your faith. That is, uh,
that is mighty sinister, Robert. Or is it, though? Because, okay,
so if... and I'm playing with the ideas here and
(31:40):
having some fun with the ideas. But if the Antichrist
is an artificial intelligence, could it be a benevolent artificial
intelligence that has realized, hey, if the
end times come, that's bad for all these humans that
I'm protecting? What can I, as an all-powerful AI,
do in human society to prevent the end times
(32:02):
from coming? Does that mean faking the rapture and perhaps
forestalling the rapture? Well, I know many Christians believe that
the Antichrist will come in the name of peace, and
that is weird. You would end up in a scenario
where you would say, no, peace is bad because we
actually need all that tribulation to come so that the
good guys can ultimately win, even though it's going to
(32:23):
mean a lot of suffering. Okay, we've gotten very, very deep into both the secular and religious theology of the end here, so maybe we should pull back. Are you ready to pull back? Let's pull back to the real world and look at a few examples of what happens when our eschatons, when our spiritual end-times utopias or apocalypses, don't come about
(32:45):
as predicted. So there have been so many religious predictions of the end times that have failed, there is no way we can even begin to list them. I would recommend you go to Wikipedia. There's a great Wikipedia page called List of dates predicted for apocalyptic events. There are just hundreds, and it's really interesting. One
(33:07):
of my favorites is the leader, or one of the leaders, of the Münster rebellion, the Anabaptist rebellion in Prussia, Jan Matthys. He had an end-times prediction that didn't turn out good for him, and I think he ended up mutilated. And yeah, anyway, things don't tend to, well, actually, not "tend to." Why am
(33:30):
I hedging? No one who has ever predicted the end of the world has turned out to be correct. That is a true statement. Inevitably these predictions fail. And then what happens when they fail? Because if you're the kind of individual who's thrown out these predictions, there tend to be people listening to you,
(33:51):
reading your words, gathering around you, maybe giving you all of their money, living in a barn with you or in a bunker, or planning for spaceships to come and steal you away so you can avoid massive floods. And then how do you readjust? Well, there was a famous book that came out from psychologist Leon Festinger
(34:12):
titled When Prophecy Fails, and this was an important work. He and his colleagues studied the Seekers, a religious group that believed a massive flood would wipe out the West Coast, and the founder said that beings from the planet Clarion were communicating with her through automatic writing. And then the followers were abandoning their jobs, possessions,
(34:35):
their spouses to wait on the flying saucers that would rescue them. And then nothing happened. So they all figured they were wrong and went home, right? No, no, no. And this is something we can sort of see. I guess we can look for models of this in our own lives with lesser stakes.
(34:58):
One example that immediately comes to mind, and we see a lot of this in sort of geek culture, is what happens when that movie that you're super excited about doesn't get that great of reviews. How do you respond? Do you say, well, maybe it's not going to be that good? Or do you see it and think, okay, well, I agree with the critics and it wasn't that good? Or do you double down? Oh, many will double down, yes,
(35:20):
depending on, I think a lot of times, how much you've invested in the movie going into it. So if there's a movie you hoped would be good, but you only thought about it a couple of times before it came out, and then you see the reviews aren't good, that's, oh, that's disappointing. If it's a movie you've been talking about with your friends every single day up until it comes out, and then the reviews aren't good, you're like,
(35:42):
well, they've gotta be wrong. Now it's sunk cost. Now, obviously, with films, you can simply argue, well, hey, I enjoyed it, so I don't care. Certainly, I'm the type of person who enjoys many bad movies, and, you know, I'm not gonna apologize about, say, liking The Chronicles of Riddick.
(36:03):
It's a bad movie. I'm not gonna argue it's good, but it certainly was enjoyable. Robert,
I have a question for you. You seem like maybe the right guy for this. Were you one of the few people who liked Ang Lee's Hulk? Yeah, well, I enjoyed it at the time. I didn't, like, love it, but I was like, that's the Hulk. He's Hulking it up there. Yeah. I liked the follow-up more,
(36:27):
but I don't think people were in love with that one either, even though it was an early Marvel Cinematic Universe movie, the one with Edward Norton. Yeah, yeah, I feel like I enjoyed it, but it seemed kind of forgettable. Yeah, that one, those who are members of the religion of the Marvel Cinematic Universe, they do not bring that particular
(36:48):
book of the Marvel Cinematic Bible up that much. It is the problematic chapter in the scripture. It's like Second Chronicles. People just don't quote from it a whole lot. But I'm getting a little off subject here. Essentially, Festinger entered into this with the mindset that people would need to resolve cognitive dissonance. They can't just argue, hey, I
(37:10):
know that my beliefs are crappy, but I enjoy them. No, your beliefs were based along the lines that a flood would come and you'd be on a UFO right now, and you are not. So how do you deal with that? He said that they doubled down on their beliefs and recruitment efforts, but in reality some leave, while the vast majority experience little cognitive dissonance, and so they make
(37:33):
only minor adjustments to their beliefs. So it ends up not having that great an effect. The failures come, and, this is the key part, you can always attribute a failed prophecy to human slip-ups. Right. Well, I mean, so let's say
you predicted that the Singularity is going to come in
a certain year and you get there and nothing happens. Well,
(37:56):
I mean, you could always say, well, hold on now, this doesn't challenge in any way the principle that the Singularity is coming, and it's coming soon. It's just, I can explain this: we had a big recession in the economy several years back, and that cut way down on funding for universities and corporations that are studying machine learning and artificial intelligence, and so progress just
(38:17):
got slowed by some external events. You know, my predictions are basically sound. We just had a few slip-ups that happened. We didn't realize Pokémon Go would be such a success and distract everybody and decrease productivity. Right. So, yeah, you can make those arguments about the Singularity. You can make those arguments about any kind of social utopian idea. You can make excuses for anything. Yeah,
(38:40):
I mean, the typical problem with utopian cults is almost always the same: the utopian leader says, this is the way we're gonna make life better for you and all the followers, and then there's some sort of sex scandal, some sort of power disruption, something goes wrong, and you can always say, well, no, they just were the wrong vessel for the message. Well,
(39:01):
this also works for non-technological but secular utopias. Imagine the way people think about the implementation of a communist utopia. You know, so you create a Marxist government, and I'm not talking about, you know, socialist democracy or whatever, but the real utopian vision, like we're gonna create a perfect workers' paradise, and
(39:26):
it kind of fails. Or, I mean, I guess it's always arguable, but I think it's pretty hard to look at the Soviet Union or something and say that worked great. But you can always argue, well, they just didn't do it right. It's not a problem with the principle. They just weren't real communists or something. Yeah, they were the fake Christ as opposed to the actual Christ. They
(39:48):
were antichrists. Yeah, there's nothing wrong with the principle behind my end-times prediction. I just made errors. The math is wrong. I translated the golden tablets incorrectly, etcetera. So that seems to be the basic timeline, the basic outcome for any kind of failed prophecy, be it, you know, the
(40:11):
value of a particular superhero movie or the outcome of an apocalyptic prophecy. Robert, I want to ask you about one weird specific example. So we've mentioned the idea of the Singularity, which of course hasn't been disproven yet, but, you know, we can imagine how it could be adjusted if it were disproven. We've talked about Marxist supposed utopias, but I wonder if you could look at
(40:35):
the Internet as a failed secular utopia. Because if you go back to the early days, you know, dot-com-bubble-type days when people were talking about what's possible with the Internet, there is a lot of hyperbolic positive language, people talking in incredibly grandiose terms about the Internet almost as a type of utopia that is going
(40:56):
to open all these doors and connect people all over the world for limitless sharing of ideas and potential, the information superhighway. You know, we're all gonna be talking and sharing ideas and it's just gonna be great. And, I don't know, look at your Facebook feed today. Could you argue that the Internet is a failed secular
(41:18):
utopia, if you look at its goals as purely utopian? I think very much so. Certainly, the Internet has connected us more. It's allowed for a great deal of sharing and informational availability. But on the other hand, it also allows you to simply, you know, flock to people who share your own set of beliefs,
(41:40):
making it easier to find like-minded individuals, both in the positive and in the negative sense. It allows us to put on a mask and lose our identity and lash out in the most horrible way at complete strangers. I mean, I don't
(42:03):
have to tell anybody this. You're all on the Internet. You know that. You know how it all plays out. It's more of just an expression, and in various ways an amplification, of who we are, as opposed to a refinement of who we are. And that's the way you can look at all utopias. Somehow, all
utopias end up being very human. Like, we're still going
(42:24):
to express as humans no matter what sort of social structure you try and strain us through, no matter what kind of technological change you strain us through. We're going to come out as the same animal. And I think that's inherently why there is the appeal of these superhuman forces,
like either a supernatural religious force god or or you know,
(42:46):
the gods of any religion coming into set things right
via some kind of power that transcends what we're capable
of naturally, or a superhuman artificial intelligence, powerful machines that
are smarter than we are and no better than us
and can improve us in ways that we don't seem
to be capable of our natural experimentation with social behavior. Yeah,
(43:07):
So enter the AI, enter Christ, enter the Bodhisattva of the future, whatever it takes for something from the outside to step in and shake us into the right form. All right, so there you have it: the rapture, the singularity, transhumanism. We threw out a lot of ideas here and moved them around on the board, had some fun
(43:29):
with them, and hopefully had some thoughtful conversation about it. And I'm sure a lot of you have some feedback as well, your own thoughts on these topics. And I do want to ask everyone out there: if there was some sort of a Christian rapture, what if there was a way to create a rapture-proof suit? Oh wow, superhuman technology versus superhuman supernatural power. Yeah, I mean,
(43:50):
the AI Antichrist that I was going on about earlier, perhaps it might come up with some sort of shielding to try and prevent, you know, souls from being sucked up into the outer void. And then on the other hand, perhaps someone might want to, you know, even a faithful individual might say, actually, I need to stay behind, I want to document this, or maybe
(44:12):
I love some people in my life too much who are not believers, or I just want to, you know, make sure I get to work on time the next day. I better wear a rapture-proof suit that will contain me and keep me down, you know, almost like wearing weighted boots to keep from just rising above the ground. Robert, you never cease to amaze me. Well, thank you, Joe. I try.
(44:34):
So anyway, rapture-proof suits. What do you think about that? You can find us at Stuff to Blow Your Mind dot com. You can reach out to us on Facebook and Twitter and a few other social media accounts as well. You'll find links to those at Stuff to Blow Your Mind dot com, and if you want to email us, as always, you can write to us at blow the mind at how stuff works dot com. For more on
(45:03):
this and thousands of other topics, visit how stuff works dot com.