
July 23, 2024 77 mins

If you met a person from the year 1019, they would be, in many ways, much like you. But what if you met a person from 3019? How different will human beings be one thousand years from today? Will humanity as we understand still be around? Join the guys with special guests John Goforth and Brent Hand, hosts of Hysteria 51, as they explore the strange twists and turns the future may hold for our species... assuming, of course, that we survive.

They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome back to our classic episodes, Fellow conspiracy Realist. In
twenty nineteen, we put our brain trust together and we
asked ourselves what will humanity be like in one thousand years?

Speaker 2 (00:14):
And since it hasn't yet been a thousand years, this
is still very much topical, one would argue.

Speaker 3 (00:22):
Yeah, And we're also joined with some very special guests here,
friends of the show, John Goforth and Brent Hand, who are hosts of Hysteria fifty one, or I guess they were hosts of Hysteria fifty one at the time, but either way, friends of.

Speaker 2 (00:38):
The show. Brent, the Dead Hand, as we like to call him.

Speaker 3 (00:41):
And John Goforth. Tally Goforth, Tally Ho, Goforth.

Speaker 2 (00:45):
Yeah, sure, well yeah, he does tend to move in
a forward direction.

Speaker 1 (00:49):
So our question is in this episode, in this conversation,
what would it be like if you met someone from
the year three zero one nine? How different would
that human being be from the person you find yourself
today in twenty twenty four. We still believe this is

(01:11):
a worthwhile conversation and we look forward to your thoughts,
So tune in and hit us up if anything has
changed on this one: conspiracy at iheartradio dot com.

Speaker 2 (01:23):
What was what was a thousand years in the past?
Can you guys do some cocktail math real quick?

Speaker 4 (01:27):
No, I don't know.

Speaker 3 (01:29):
I'm just getting ready for a project thirty twenty five,
you know what I'm saying.

Speaker 1 (01:33):
There we go, Mandate for Leadership. Also, to answer your question,
it would have been as we're recording now, it would
have been ten twenty four, which was a big year
for humanity in several terrible ways. Yeah.

Speaker 2 (01:47):
What was hot in ten twenty four? Uh?

Speaker 1 (01:49):
Plagues, plagues, nobody was washing their hands. Mmmm, swords. Just a couple of quick hits.

Speaker 2 (01:58):
Henry the second died in ten twenty four and Conrad
the second, first of the Salian dynasty, was elected king
only after some debating among dukes and nobles.

Speaker 1 (02:09):
And ten twenty four CE was a leap year. Put
that in your pipe and smoke it. We're fun at parties.
Here's the episode from UFOs to Psychic Powers and government conspiracies.
History is riddled with unexplained events. You can turn back
now or learn this stuff they don't want you to know.

Speaker 4 (02:44):
Hello, welcome back to the show. My name is Matt. Our

Speaker 1 (02:47):
compatriot Noel is on adventures, but will return soon.

Speaker 5 (02:51):
They call me Ben.

Speaker 1 (02:52):
We are joined as always with our super producer Paul
Mission Control Decant. Most importantly, you are you. You are here,
and that makes this stuff they don't want you to know. However,
you are not alone, fellow conspiracy realist. We are exploring
the idea of humanity in one thousand years with a

(03:13):
ton of asterisks and a dollop of optimism. And we
are not diving into this exploration on our lonesome. No,
quite the opposite. We are joined by some friends of
the show, some personal friends of ours, the host of
our peer podcast, Hysteria fifty one. Check it out if
you haven't yet.

Speaker 3 (03:33):
Everyone, please welcome John Goforth and Brent Hand.

Speaker 5 (03:37):
Gentlemen. Thank you very much.

Speaker 6 (03:39):
Hi guys, I told you I shouldn't bring Brent, but
you still allowed it. I don't know.

Speaker 5 (03:45):
I begged and begged, and you know, just.

Speaker 6 (03:49):
Probably the worst decision was allowing conspiracy Bot here. We
will try to keep him quiet.

Speaker 1 (03:54):
That's that's true. That's true. Conspiracy Bot. Also, thank you
for coming.

Speaker 3 (04:00):
Did you notice that Scully seems to really be getting
along with conspiracy Bot.

Speaker 5 (04:04):
Yeah, it's weird.

Speaker 4 (04:05):
I know.

Speaker 1 (04:05):
This is an audio podcast. Maybe if they do something
that's remotely work appropriate, we'll take a picture and post it.

Speaker 6 (04:12):
But right now, yeah, we can't show it right now.

Speaker 5 (04:14):
We're just... it was a bear getting him through the TSA, so he needs to enjoy himself. This is the last time we're traveling with him.

Speaker 1 (04:20):
Smells like circuits burning. And so, guys, like most people, we spend a great deal of our time here in the present, twenty nineteen.

Speaker 5 (04:29):
As we record this, everyone I know spends their time
in twenty nineteen.

Speaker 1 (04:32):
Yes, yeah, we spend a lot of this time thinking
about what might or might not happen in the future.
And most of the time when we're using these great
predictive computers that we refer to as our brains, we're
using this this imaginative capacity to think of what we
could call small events, not in a diminutive way, just

(04:54):
like things that don't really rock the friggin' timeline.

Speaker 3 (04:57):
You know what where if it's on your calendar?

Speaker 2 (04:59):
Right?

Speaker 1 (04:59):
Stuff on a calendar. Where am I going to go for lunch?
What are my list of errands? You know? How can
I get out of that conference? Call those kind of things?

Speaker 6 (05:10):
When will our sun supernova? I've got that on my calendar.

Speaker 1 (05:14):
Yeah, yeah, exactly exactly because we oh yeah, we have
money on that, right, Yes, so we also have a
tendency and we could even call it a compulsion to
wonder about those large events. Let's define it, very loosely,
arbitrarily as any event that involves a group of people

(05:34):
large enough such that members of the same group may
not know each other and may never meet. So this
is interesting because for both Hysteria fifty one and Stuff They Don't Want You to Know, really for any podcast, that's
a large scale event because a lot of us listening
now are probably not going to meet each.

Speaker 4 (05:53):
Other, right WHOA I never thought about it like that.

Speaker 1 (05:56):
The universe is a cold, lonely place. So you know,
we've got examples of these, like what's a what's a
large event?

Speaker 4 (06:04):
Oh, sure, like a huge election.

Speaker 3 (06:06):
Let's say a primary election or a presidential election. The
people who are actually voting probably aren't going to meet
each other. You're definitely not going to meet everybody who's
voting in that election.

Speaker 6 (06:15):
Nine eleven. Yes, there you go. From the impact on people not being there, even, to the people who were there.
I mean, there were so many people impacted that weren't
just in those buildings and surrounding communities and such and
family members. Not all of them ever met.

Speaker 1 (06:29):
And that's an excellent example because that's a single event
rather than something like a war that goes on for years.

Speaker 3 (06:35):
Well, so that's a large event, and it's also a long term event, right.

Speaker 5 (06:42):
Yeah, we're still reeling and the effects are felt.

Speaker 2 (06:45):
Yeah.

Speaker 5 (06:45):
Yeah.

Speaker 1 (06:46):
And that's because in addition to defining something as a
small or large event, we can also imagine them the
way you said, Matt, short or long term. So this
so nine to eleven clearly a long term event, but
a short term, large scale event would be a celebration
in your city, like the Bulls win in Chicago, because

(07:07):
you guys, you guys live in the Windy City. And
when the Bulls win in Chicago, what do people go nuts?

Speaker 4 (07:13):
Excuse me?

Speaker 6 (07:14):
We call it the lower fourth. Well we haven't seen
that for so long, I'm not sure how. But when
the Hawks won the Cup, yes, when the.

Speaker 5 (07:24):
Hawks won, seven million people I think showed up, one
of the largest human gatherings in history here.

Speaker 2 (07:30):
Wow.

Speaker 1 (07:31):
Yeah. So a long term, small scale event then would
be something we're like, all right, I got this kid, now,
I got to send them to college. At some point,
I better start putting away the scratch, right, And this
can take longer, but it's not really It might not
alter a timeline. Today's episode is about a long term,
large scale event. One of the biggest questions we ever

(07:54):
deal with. What's gonna happen to us?

Speaker 5 (07:57):
You know, the collective us? Yeah, collective us? What are we?
Where are we going?

Speaker 1 (08:03):
Not what's going to happen today or tomorrow, but a
thousand years from now? And off air this is this
is an idea that we were talking about that we
had originally. I don't know if we mentioned this on
Hysteria fifty one. We had originally when we were talking
with each other, we're like, all right, yeah, yeah, one
of us will do an episode on humanity in a
thousand years, and then the other and then we'll do

(08:24):
a companion episode about humanity in ten thousand years. And
we started looking into the research and we were, you know, Paul,
you're gonna have to bleep me on this, but we
were like, holy, ten thousand years.

Speaker 5 (08:34):
Yeah, I mean, you could just say whatever you want.
Maybe you know.

Speaker 6 (08:38):
We're all jellyfish, yeah I know.

Speaker 1 (08:42):
And even with so we just did an excellent episode.
Well I really enjoyed it. I thought you guys did.

Speaker 4 (08:49):
It was good.

Speaker 5 (08:50):
Yeah, and you too.

Speaker 1 (08:52):
Conspiracy and we did an episode on the future of
humanity in one hundred years, and what you guys showed us was that there's quite a lot of stuff down
the pike that not only is possible, but is in
a very real sense inevitable.

Speaker 5 (09:08):
Right, And that's the big I think that's the big
defining difference of you know, when we look at these things,
is what are we controlling and what is just going
to happen? That's you know, this inevitability that is just
we're careening towards these events.

Speaker 6 (09:21):
A lot of times, I think, especially with technology, it
really is the pebble that starts the avalanche. But once
it started, you're not stopping it. Yeah, Like the progression
of our processing power isn't going to even though we're
not following Moore's law, still it is not going to stop.

Speaker 1 (09:39):
Yeah, it's a train with no brakes, right. Yeah. This
is this is the part of the show where we
want to want to let you know that if you
haven't checked out Hysteria fifty one, please do it. In
this episode, we're going to be exploring some things that
dovetail with the previous episode. So if you want to
get the the full cinema for your ears experience, stop now.

(10:06):
We'll wait check out our earlier episode.

Speaker 6 (10:10):
And for you comic book fans, we're referring to this
as a crossover.

Speaker 1 (10:13):
Event, a shared universe.

Speaker 6 (10:16):
Yes, I'm Batman.

Speaker 1 (10:19):
Oh all right, I respect that. I respect that. How
far are we crossing over? Can I go Marvel?

Speaker 5 (10:25):
Yeah? Yeah, there you go.

Speaker 6 (10:28):
Just stay away from image and we'll be fine.

Speaker 1 (10:30):
Oh yeah, they're creator owned.

Speaker 5 (10:32):
Uh, it's true. So all right, let's get to it.

Speaker 3 (10:38):
Sure, So let's talk about where we humans. Everybody in
this room, everyone listening is going to end up in
one thousand years, ten generations ostensibly from now.

Speaker 1 (10:50):
Well to understand that, Matt, don't we have to go
back to Would it be helpful for us to go
back to one thousand years before now?

Speaker 4 (11:01):
I guess?

Speaker 1 (11:02):
Okay, cool, Thanks for saying that, because otherwise.

Speaker 3 (11:06):
All right, that's that's really the only thing we can
do to have an understanding. Right, Well, let's see how
far we've come in a thousand years and then extrapolate
from there.

Speaker 1 (11:15):
And Brent, earlier in our previous episode, you mentioned the phantom time hypothesis, so when we're playing a little bit with history, please write in, by the way, to Hysteria Nation and us and let us know what your
take is. What actually happened in ten nineteen, right?

Speaker 5 (11:35):
What year is it really? What year is it really?

Speaker 1 (11:38):
Yeah, and you can use the Julian calendar if
you want as well. But okay, so here's the here's
the good news. For a lot of people. In broad
evolutionary terms, just basic biology, one thousand years is not
a drop in the bucket. It's not an iota of spit. It's not a snap.

Speaker 6 (11:58):
So you're saying it's not enough time to develop gills
Kevin Costner style.

Speaker 1 (12:04):
The only flaw in Waterworld, the only one.

Speaker 5 (12:07):
It's been worked out behind his ears.

Speaker 1 (12:13):
And for some reason people hated that. They were like,
that's so convenient, it was so great.

Speaker 4 (12:19):
You jerk totally forgot about that.

Speaker 6 (12:21):
I got an idea, guys, mad Max on a water world.

Speaker 1 (12:26):
Now, only if we get the guy from Field of Dreams.
That's really the glue. But yeah, this means what does
this mean? Like, like, what happens if we meet someone
putting aside all the interesting problems with time travel, we
just don't think about it too.

Speaker 5 (12:47):
To all intents and purposes, if you met someone, besides, you know, maybe some language barriers and the way they're dressed,
you wouldn't really know the difference because in the last
thousand years we haven't changed that much.

Speaker 6 (12:59):
I mean, you'd kill them. Like, don't give them a blanket.

Speaker 5 (13:09):
They'd probably be twenty five years old and look seventy. You know,
it'd be a.

Speaker 1 (13:13):
North Sentinel Island situation. Possibly. But yeah, you're
absolutely right, you've nailed it in one. We would still
eat the same food, we still need the same range
of environmental conditions, and if there's a little bit of
romance in the air, you can reproduce.

Speaker 6 (13:29):
There's actually a documentary that explores this. I think the
other member was a little a little older than just
a thousand years old, but explores how they might operate
in modern society. It's called Encino Man.

Speaker 4 (13:41):
Oh god, I.

Speaker 1 (13:43):
I felt... I've got to learn that every time you reference a documentary, it's not. So, it's interesting, there have been some discernible changes. People like doctor Alan Kwan, who works with computational genomics. He has some very specific ones.
One of them is kind of weird, right, did you

(14:04):
you saw this?

Speaker 3 (14:05):
Yeah, yeah, the concept of our foreheads changing.

Speaker 5 (14:09):
They call it the Manning syndrome. Yeah, Peyton Manning.

Speaker 4 (14:14):
You guys are just brutal.

Speaker 1 (14:17):
Remember that old insult where people call someone a forehead? More like a fivehead? Pretty soon, if this trend is correct, a thousand years from now, people will be making fun of threeheads, if your forehead's too small.

Speaker 6 (14:31):
Too small?

Speaker 3 (14:32):
Right?

Speaker 4 (14:32):
Right?

Speaker 1 (14:33):
So we also seem to have a trend of growing
taller as a species, but a lot of that is dicey.
Can we can we call that straight up evolution?

Speaker 6 (14:45):
Or is that just our ability to feed ourselves better
and breed with Scandinavians?

Speaker 1 (14:50):
And yeah, there a tall bunch. There are a tall bunch.
Like how now we have this, we have this conspiracy
that they are Scandinavians, scandidate.

Speaker 3 (15:04):
But the biggest difference between that human and you really
would be things that are a little different. There's societal things, right,
medical care, like advances in technology that have allowed for communication,
Like you said, language barriers are going to be different.
Even just over all the beliefs and values of that
person are going to be vastly different. Sure, but I

(15:27):
mean that's really it.

Speaker 1 (15:28):
It doesn't Yeah, it doesn't mean that they have lower
or higher cognitive ability, right, It just means that their
priorities will also be widely different.

Speaker 5 (15:37):
You're very much you know, you're product of your environment
you're surrounding, and so that's going to play heavily into that.

Speaker 6 (15:44):
You might show them a car and they think it's magic,
but that's you're You're so right about it not being
their cognitive ability. It's why we did an episode not that long ago on Göbekli Tepe and the Pyramids or whatever, and there's all of these anthropologists that say, well, how did they
do this? How could they possibly figure this out? Well,

(16:05):
because they had the same sized brains that we do,
that's how they did it. I mean, a lot of elbow grease,
maybe a few decades to figure it out, and you
can build stuff.

Speaker 1 (16:16):
Yeah, yeah, exactly. And that's why you know, there's a
valid criticism of a lot of ancient alien theories where
it's like, okay, so what's more likely? Is it more
likely that someone came all the way to Earth and
built a vague geometric shape and then left and didn't

(16:36):
you know, they don't want you to know why. Or
is it likely that people were always relatively on the
same cognitive scale. Yes, yes, so this would appear to
be good news at first blush because this means that
for the past thousand years, the biggest changes to our
species have been cultural and technological. Right, even the clear

(16:59):
and excellent example of the diseases that we carry and
are immune to, that is a technological improvement. We didn't
just biologically figure that out. And our population has skyrocketed.
If we were a product at your local grocery store, what,
what's the popular grocery store.

Speaker 5 (17:19):
In Chicago? Mariano's and Jewel. Yeah, okay. Kroger, baby. It's
owned by Kroger.

Speaker 1 (17:31):
Fingers on the hand, I see, well, if we were
a product in Mariano's, we'd still have the same ingredients,
the same great taste, the same oddly fragile packaging. It's
just easier to find us in more places. And we
have a great example. We've pulled this up before.

Speaker 3 (17:48):
Oh, the world population at this moment right now, Yes,
we're recording this. Yes, seven billion, six hundred and eighty
eight million, five hundred and thirty four thousand, four hundred
and twenty nine.

Speaker 5 (17:58):
Yeah, and that's spinning really fast.

Speaker 4 (18:00):
It's going really fast.

Speaker 5 (18:01):
I mean yeah, when you look at that clock, it's
a staggering thing to see. That's how fast people are
coming into this world.

Speaker 1 (18:07):
And we always it's always tempting when we bring that
up in an episode. It's always tempting to check it
again at the end and see how.

Speaker 5 (18:16):
Many people are sixty billion? What the hell?

Speaker 6 (18:20):
So, Yeah, the premise that most of us go with
when we're you know, doing engaging in futurism is that
where will we put all these people?

Speaker 4 (18:32):
Yeah, there is some.

Speaker 6 (18:35):
There's some new studies out that are looking at larger societies,
maybe not the globe as a whole, and saying, actually,
we are not currently running at replacement rate. So although
it's kind of like the fact that we are, population
is still growing is a function of things that happened
fifty years ago, because those people are still alive and

(18:57):
they overpopulated then. And if you look a few generations in the future, you can actually start to see an evening out of this number that keeps growing, an evening out or even a decline. So there's not a consensus in the scientific community that overpopulation is going to continue to exponentially explode the way we've kind of all been fed.

Speaker 1 (19:21):
Like Malthusian reasoning and stuff.

Speaker 6 (19:23):
Yeah, exactly.

Speaker 3 (19:24):
You don't want to go Ozymandias on it and just
wipe out most everybody.

Speaker 6 (19:28):
Yeah right, snappy fingers.

Speaker 1 (19:31):
You know, I'll tell you the thing that made that
guy Thanos a real villain and the biggest plot hole
in that film spoiler is he could have just made
the universe twice as big, or just.

Speaker 5 (19:42):
Doubled everything. Snap it and you make it twice as big, you double all of whatever you need, all the resources.

Speaker 1 (19:48):
I think he was just being super emo, to be
completely honest. But this cosmic Billy Corgan-ness aside, we
can say right now, in twenty nineteen.

Speaker 6 (20:01):
You also opened a T shop also shop.

Speaker 1 (20:06):
We as a species are making some huge waves of
multiple fronts, and Matt and I were trying to figure
out the best way to frame this, and what we
came up with was the good, the bad, and the ugly.

Speaker 4 (20:19):
Right, So there you go.

Speaker 5 (20:23):
I can't do it.

Speaker 1 (20:24):
That's great, Ennio Morricone. Right, good things, we get good things.
When it comes to technology, we're the best in the
biz because we're the only ones we really know, you know,
like some higher order mammals can make tools, corvids can
use sticks or whatever. But as far as we know,

(20:46):
no non human entity has built a computer yet.

Speaker 6 (20:49):
Right, Yes, But I mean I'm giving it a fifty
to fifty on dolphins, but we haven't found them yet.

Speaker 1 (20:55):
Right, right, And it's probably built to do something that
only dolphins.

Speaker 5 (20:57):
Yeah, we wouldn't understand that.

Speaker 6 (20:59):
When you get it, it's for dolphin kisses.

Speaker 1 (21:02):
Right, so some of us listen. Now, we remember a
world before smartphones. We talked about this in our episode
about humanity a century from now. We remember times before
the internet, a world of landlines and paper maps. Somebody
would tell tell you where to go at a certain time,
and you would have to remember it yourself. And we

(21:24):
are also learning more about our own bodies. Genetic research
is definitely going to help us eradicate some genetic diseases,
as well as eventually allowing us to tweak some specific
traits for practical purposes and cosmetic ones. You know what
I mean. You want you want kids with purple eyes?

Speaker 6 (21:43):
Yeah, one of the you mentioned on the technology side.
You know, we talk about AI and we've we've all
done episodes on AI. It's probably not worth diving deep into,
but it is worth talking about that. We need to
be careful as it relates to, you know, creating nanobots
and things of that nature. I recently read are you

(22:03):
guys familiar with Nick Bostrom, the Swedish philosopher? He's got some really cool and interesting stuff on futurism. And I read this thing about the, uh, they call it the paperclip maximizer, and it's basically a thought experiment, that's all
it is. But it's the idea that the most mundane
thing that you could tell an artificial intelligence to do

(22:25):
could turn out really poorly. So here's a quote from him.
Suppose we have an AI whose only goal is to
make as many paper clips as possible. The AI will
quickly realize that it would be much better if there
were no humans, because humans might decide to switch it off,
and their ultimate goal is to make as many paper
clips as possible.

Speaker 5 (22:42):
Okay because cheer.

Speaker 1 (22:45):
Yeah, And this is generalized AI, so it can think
big pictures correct.

Speaker 6 (22:49):
Also, human bodies contain a lot of atoms that could
be made into paper clips, reformatted and turned into paper clips.
The future that that AI would be trying to gear
towards would be one in which they were a lot
of paper clips, but not a lot of humans. And
if you like from a if you hadn't programmed that
robot to value human life or to understand empathy, we're

(23:12):
talking like that could be a problem.

Speaker 1 (23:15):
Yeah, that leads us straight into the bad. Yes, that's
an excellent example. Many of the self same technological innovations
that are looming on our species horizons will probably create
new problems as readily as they solve existing ones. Like John,
your example with the paper clip maximizer, aside from being

(23:38):
frankly terrifying, is spot on because we don't know what
this stuff will be like once we actually created. To
our common knowledge, there is no functioning generalized machine consciousness
generalized AI.

Speaker 2 (23:53):
Right.

Speaker 1 (23:54):
There's stuff, like, no offense, Conspiracy Bot, but there's stuff that can do specific tasks.

Speaker 6 (24:01):
We haven't reached the singularity, right.

Speaker 5 (24:03):
Right, And that's why it's such a broad term, because
we don't It's that we don't know, and that's why
it's so scary for people, and it's so inspiring for people.
And everyone's seen terminator, you know, and so that's always
looming in the back of your mind because that that is,
I guess a possibility.

Speaker 1 (24:21):
Yeah, and that's a good metaphor. But it looks like
we may well have a Terminator situation. It
just won't look human, it'll be gray goo.

Speaker 4 (24:30):
Yes, So this is the thing I wanted to bring up.

Speaker 3 (24:33):
The paper clip problem gets exponentially scarier if it's not
making paper clips, if it's replicating, if it's just making
more of itself, right, oh yeah, and if it's on
a nanoscale technology that's just self replicating.

Speaker 4 (24:46):
That means that the.

Speaker 3 (24:47):
Gray goo that is inside of our heads that can
come up with making a nano bot and then getting
that nanobot to have artificial technology to continue to replicate,
then creates this thing that occurs out in the physical
world that is a gray goo that is just an
entire planet covered in nanobots that have self replicated to
the point where that's all that's left.

Speaker 6 (25:08):
It's all is there.

Speaker 1 (25:09):
Yeah, yeah, exactly. And if we survive all of this,
then the other bad news concerns our bodies and the
environment in which they live. Genetic research loves it. I
think we shouted out Gattaca earlier. It's a fantastic film.
Watch it again if you've already seen it. Like most
advanced technologies, especially in their earlier days, this is going

(25:32):
to be controlled by wealthy institutions at first because of
the way that society functions right now, and so before
it bleeds into the genes of a general population, before
like the descendant of Jeff Bezos scandalizes his or her
family and sleeps with a peasant, this stuff would result

(25:53):
in something, at its most extreme, that we would call man-made speciation. But then there's the chance, we talked about this before, that tinkering with traits and one part of the genetic code could have unintended disastrous consequences. From fiction, a good example of this would be the Eloi and the Morlocks from The Time Machine.

Speaker 5 (26:10):
Right, you know, we had the best of intentions and all of a sudden, now we're Morlocks underneath the ground, because we can't be in the sunlight, just because CRISPR ran...

Speaker 6 (26:20):
That's right, there's a ticking time bomb inside of every cell.
Very A lot of geneticists are like, well, the key
to immortality could be turning off that time bomb. One
of the challenges or scary parts of that is there's
one type of cell today that does not have that
time bomb in it. It's a cancer cell.

Speaker 4 (26:40):
Ah.

Speaker 6 (26:41):
Yes, and so you changed the makeup of a cell
a little too much, you become a big tumor all.

Speaker 1 (26:47):
Of a sudden, right, right, and technically speaking, very technically speaking,
there is one immortal individual, Henrietta Lacks, at least her cancer, right.

Speaker 3 (26:59):
You guys love us no, no, no, Well, listen to
our episode on do we do an episode or.

Speaker 1 (27:04):
Just a... we did an episode on immortality, got it, and the various terribly problematic ways

Speaker 4 (27:11):
To achieve it.

Speaker 3 (27:12):
It's this crazy messed up thing where this one woman's
cancer cells were taken as, uh, samples to be tested, right, for testing, and then all of the other samples that existed in this one building ended up having Henrietta Lacks's cancer cells in
those samples. And then they started realizing, wait a second,

(27:33):
somehow these samples, because they were being shipped across the
planet to be studied, uh, the cancer cells ended up in all of them, or at least the vast majority of them.

Speaker 1 (27:44):
And they they continue to reproduce.

Speaker 5 (27:48):
Yes, yes, wow.

Speaker 1 (27:51):
So technically immortal, but not the kind of immortality you probably.

Speaker 3 (27:55):
Yeah, hoped for. But it goes back to our whole
humanity as a virus thing spreading across the cosmos from
our last episode.

Speaker 5 (28:05):
Isn't that what you want to do? You want to reproduce and you want to live. You know, that's bringing it down to its most basic terms. But I mean, that's the idea. And guess what, I need to leave this body, I need to leave this planet, you know. And they say we're going to have to be an interplanetary species. Well, guess what, viruses spread.

Speaker 1 (28:24):
We get angry when it's cancer, but we're one hundred
percent on board if it's us, right, right, right. So
the last thing is the climate. Yes, the climate. You've
heard it before. It is the tragedy of the commons writ large.
Unless we all at the same time make some massive changes,
the climate seems set to inevitably change in some drastic
ways ocean acidification, temperature changes, deforestation, and so on.

Speaker 6 (28:48):
I have one question. Yes, you guys are familiar with
the Kardashev scale. Yes, we are currently a point seven on the Kardashev scale.

Speaker 5 (28:56):
Point seven two three. Come on now, go team.

Speaker 6 (28:59):
They say that a level one civilization can control all
of the energy on their planet, and as a function of that, they can control all of the weather. Everything pretty
much is happening in their neck of the woods. If
we achieve level one in the next thousand years, don't
we also achieve the ability to go, oh well, we

(29:20):
can just do whatever we need to do to the atmosphere
and make sure that doesn't happen.

Speaker 1 (29:23):
Yes, the problem is getting to that thousand year mark.

Speaker 6 (29:26):
From here to level one.

Speaker 1 (29:29):
Yeah, that is, don't let.

Speaker 5 (29:32):
Anything that we've already done to ourselves erase us before we can get there.

Speaker 4 (29:36):
That's fair.

Speaker 3 (29:37):
If life has existed on other planets and emerged from a goo, from, you know, a chemical reaction that occurred because there just happened to be the right molecules on that planet, don't you think they'd go through a very similar process of figuring it out?

Speaker 5 (29:54):
Yeah, that, like, life is probably very similar. That's the grand equalizer, if they're smart enough to save themselves, because they all probably come up to this part, this stage we're at now. We're very close to the tipping point
we were talking about, you know, a singularity. We're almost
to that point now where we need to decide that

(30:16):
we want to better ourselves, or do we want
to wipe ourselves out?

Speaker 6 (30:21):
I think that's assuming that they're like normal ish whatever
we consider that, like carbon based life forms. Yeah sure,
you know, there's also people that speculate that on gas giants,
sentient life could develop in a way that we don't
even understand.

Speaker 3 (30:36):
Yeah, no, I guess what I just what I mean
is the fuel, the available fuel that exists on whatever
planetary body the life emerges from, ends up being used
up to an extent or altered to an extent, no
matter what it is.

Speaker 5 (30:54):
That for us in a thousand years from now, if
we're going to be able to do the things that
we needed to save our planet from extinction of whatever
might happen. To be able to explore the cosmos, we're
gonna need energy and things like that that come from
far more exotic places, black holes, other dimensions. These are

(31:15):
the things that they say that we're going to be
playing with if we are you know, to be a
you know, a level two or something like that. So
climate in and of itself is nothing.

Speaker 4 (31:27):
Yeah, and yet it's everything.

Speaker 5 (31:28):
Climate exactly.

Speaker 1 (31:29):
Climate is the VHS, to use your earlier example, Matt. The VHS we have to keep in working order until we can afford the LG. So the
last thing we mentioned the good. We talked about the bad,
the big bads. Now we have to talk about the ugly.
Just briefly, there's one ugly thing you should keep in

(31:50):
your mind, folks, as we listen along. The number one ugly thing, the frightening, horrifying anti-Ganesh-like elephant in our species' collective room is, put simply, this: there
is a very high chance that we will not make
it to the one thousand year mark. We were safe in our earlier episode. We said, you know, we're assuming

(32:12):
that nothing terrible happens up to the one hundred year mark.
Now we're doing that ten times.

Speaker 5 (32:18):
Yeah, you know, and.

Speaker 1 (32:21):
In Stephen Hawking's estimation, we absolutely are not gonna make
it unless.

Speaker 5 (32:27):
We get off of Earth. Right.

Speaker 1 (32:29):
Stephen Hawking, not necessarily a curmudgeon, but he calls them like he sees them. He has some statements that are pretty
much along the lines of screw humans.

Speaker 5 (32:40):
Unless they get off Earth. Screw Earth. We need to
leave it.

Speaker 1 (32:43):
You know, no disrespect, but it's not working out. Aliens are real, and screw them too. So we're gonna get AI, and you know what, screw that. I mean, I'm
paraphrasing a little.

Speaker 4 (32:55):
Yeah. Yeah, we learned about a lot of that stuff.

Speaker 3 (32:58):
Did you, have you guys ever spoken with Josh Clark, who did The End of the World, from Stuff

Speaker 4 (33:02):
You Should Know?

Speaker 6 (33:03):
No, we we really want to. We're gonna have him
on to talk the Fermi Paradox.

Speaker 4 (33:07):
Yea, oh that's great.

Speaker 1 (33:08):
Oh, that's gonna be great.

Speaker 4 (33:09):
Yeah.

Speaker 1 (33:10):
So he has a podcast that explores.

Speaker 5 (33:12):
All the chuckles that are that are facing us.

Speaker 1 (33:14):
Oh yeah, it's a cavalcade of comedy. Yeah, but we
recommend that. And with all this in mind, we have
to understand that everything we are about to contemplate about
human civilization one thousand years from now is inherently optimistic,
if only because we are assuming that somewhere, somehow, something
like humanity is still partying on in thirty nineteen.

(33:39):
What are we talking about. We'll tell you after a
word from our sponsor.

Speaker 5 (33:49):
Here's where it gets crazy.

Speaker 1 (33:52):
Humanity one thousand years from now. Welcome to thirty nineteen.
If you're listening to this, then hypothetically you have survived
and out of all the weird trials and travails that
you have encountered in thirty nineteen, you have decided to
take a break and listen to a podcast, and you've

(34:12):
chosen this one. Yeah, thank you.

Speaker 3 (34:16):
Good job doing some ancient recovery of either technology or
I don't know.

Speaker 5 (34:22):
This is part of an archaeological dig.

Speaker 6 (34:25):
This show is actually included in the tesseract.

Speaker 5 (34:29):
I'm sure it's going to go into the Smithsonian at
one point in time or another, so you know it's true. Yeah,
good lord.

Speaker 1 (34:34):
It would be like the regional version, you know, like
how there are TEDx talks, right? Smithsonian X. So let's see, we left off one thousand years ago. We
left off with the fact that there are inevitable things
that will happen in the climate as we knew it
from twenty nineteen, right, So a thousand years later, a

(34:56):
lot of water under the bridge, you know.

Speaker 4 (34:59):
A lot of water over there, over the banks.

Speaker 1 (35:03):
See what he did there. Matt, can you tell us a
little bit about what's happened with the climate?

Speaker 3 (35:10):
Well, odds are humanity continued, despite our best intentions, to use carbon based fuels and other things that put CO two emissions out into the atmosphere. Right, because as of twenty nineteen, CO two concentrations are at
an average of three hundred and eighty five parts per million.

(35:33):
Sounds really low, right per million?

Speaker 4 (35:35):
Yeah, it sounds really low. But here's the deal.

Speaker 3 (35:38):
That's increasing and if that gets up just a little
bit higher to say four hundred and fifty, maybe all
the way up to six hundred parts per million, that could be really bad. Because if it reaches that level as an average, and then if in some way all CO two emissions just stop dead, no more

(36:01):
carbon dioxide is added to the atmosphere.

Speaker 4 (36:04):
This is what would happen.

Speaker 3 (36:06):
There would be persistent decreases in dry season rainfall that would be comparable to the nineteen thirties depression era, like the Dust Bowl that we've all heard of, that would just exist. There would be zones of...

Speaker 6 (36:20):
All your life, now, yeah, it would be.

Speaker 3 (36:23):
There would be zones of this stuff, this terrible situation
all across the planet, everywhere from southern Europe to
northern Africa, southwestern North America, which already you know, is
not doing great in twenty nineteen. If you think about
some of the wildfire situations and other just heat situations,

(36:44):
Southern Africa would be affected, Western Australia. Human water supplies,
specifically fresh water supplies aquifers.

Speaker 5 (36:51):
Which is already suspect right now.

Speaker 3 (36:53):
Yes, that stuff would decrease much further. Again, fires would
just be everywhere.

Speaker 1 (37:00):
If you're one of those people who hates being cold,
thirty nineteen is like your year.

Speaker 3 (37:07):
Yeah, actually several years after twenty nineteen, and then up
until thirty nineteen.

Speaker 4 (37:13):
It's your face.

Speaker 5 (37:14):
Yeah, yeah, it's your era.

Speaker 3 (37:16):
And all kinds of agriculture is going to be hugely
affected by this, just because of the differences in rainfall
at this point, because of these things.

Speaker 6 (37:24):
Now, you said if all the emissions stop, you mean, you're saying that we make progress and they all stop, but these will still be the outcomes?

Speaker 3 (37:35):
Yes. Because it takes that long for a large climate change effect to be reversed, essentially a thousand years, right. That's one of the main issues. It takes forever. The effects will be seen rather quickly, and then it
takes a long time to fix.

Speaker 6 (37:53):
To put the toothpaste back into the tube.

Speaker 3 (37:57):
Yes. That is to say, though, this is without considering that there could be technological advancements up to this point where we could cause that CO two to be taken out of the atmosphere for one reason or another.

Speaker 1 (38:10):
That's our wild card. We're leaning heavy on this one.

Speaker 6 (38:15):
I don't know about you guys, but I've learned most recently that climate change isn't actually a real thing. And no, it's not a real thing. You saw how cold it got in the Midwest this winter.
Obviously things aren't.

Speaker 5 (38:30):
Getting climate change other.

Speaker 6 (38:35):
And as we all know, plants like CO two, so
this isn't a problem. I don't know why we're talking
about this, I can't. But in all seriousness, I do think, if we make it to a thousand years from now, isn't that kind of part and parcel with, we will have the technology to fix this? Like, the in between time, like you mentioned before, Ben, is

(38:55):
the challenge. Yes, but if we make it there... unless, I guess, there's one addendum to that: if we get
reduced to the place where we are, you know, a
bunch of tribes running around again because we've lost technology
because of cataclysmic events. I suppose in that case it wouldn't.
But assuming we're still a relatively normal society, whether it's

(39:16):
a global society or still split up by then, wouldn't we have the tech to effect change, hopefully?

Speaker 1 (39:23):
The question is whether, given our tribalistic nature, we would
be capable of cooperating on a large enough scale to
implement it in a meaningful way. So maybe some country
or some institute, or increasingly what would be likely is
some large private institution, a company the Koch brothers, sure,

(39:44):
the Koch brothers, Nestlé, Unilever, Halliburton, all the hits, all
the good ones. They have... let's say that they have a way to game the weather in a specific region, or they have something like an innovative breakthrough, a weather

(40:04):
dominator. Yeah, oh yeah, yeah, yeah, and throw the wildcard
in and you're fine, right right. The question is what
sort of negotiation or arrangement would they expect people to
enter into in order to gain access to that technology.
That study about CO two, by the way, is from

(40:24):
the National Oceanic and Atmospheric Administration, and according to their study,
they're not a super controversial group by the way.

Speaker 6 (40:33):
No, actually, Noah, I mean getting biblical on me.

Speaker 1 (40:39):
Yeah, they're sort of the bad boys of oceanography. They
they do say in this same study that changes in
surface temperature, this is a quotation, rainfall and sea level
are largely irreversible for more than one thousand years after
CO two emissions are completely stopped. And they if you

(41:00):
look at the study, they've gamed it such that they go from the average now to, maybe they say, it increases just a little bit and then it all stops. It increases toward that more reasonable number that Matt just named, maxing out at six hundred parts per million, and then that just stops. The scary thing is it's not

(41:20):
going to completely stop. Like that's just not how things
of that scale.

Speaker 6 (41:24):
We'll be breathing.

Speaker 1 (41:26):
That's true, Like we emit.

Speaker 6 (41:28):
CO two every time we take a breath.

Speaker 1 (41:29):
Classic us. Um, sorry, I've been really off, away from the podcast, off topic here. You guys ever get caught
in the overuse of a turn of phrase or a
figure of speech, Yeah.

Speaker 5 (41:52):
You can't stop using it.

Speaker 1 (41:53):
They're like, ah, I've got to stop referring to things
as classic whatever.

Speaker 5 (41:58):
The other person just says it's.

Speaker 6 (42:01):
Kind of like when I refer to things as a
documentary when they're not really documentaries.

Speaker 1 (42:04):
Well, you did that so well, because I fell for it
after you had clearly showed us that that was on
the way.

Speaker 5 (42:11):
Yeah.

Speaker 1 (42:12):
The other one is I went through oh, I will
die on this hill, which is a melodramatic way to
apply to anything.

Speaker 6 (42:23):
That aside barring you actually being on a hill and
willing to die over it.

Speaker 1 (42:27):
Right, that actually does not happen yet. So if you're listening in thirty nineteen, write in and let us know what your favorite future turn of phrase is, assuming you're alive.

Speaker 6 (42:40):
It's like, it's like Bill and Ted. You guys literally
saved the universe.

Speaker 1 (42:45):
Well, you're in this too now.

Speaker 3 (42:49):
And all of this is just to say that in
a thousand years, if we don't fix this right, it'll
look this way. So the other thing that we'll see is sea levels rising because of this, and it's directly related to those CO two levels. Yeah, and that's also... it doesn't sound as bad initially when you say
it that if it happens the same way, oceans will

(43:13):
rise at an average of I think one point two
feet or something or what is it?

Speaker 5 (43:18):
A one point three to three point two feet.

Speaker 3 (43:20):
That's exactly what it is, so like a maximum of
a meter, right, But.

Speaker 5 (43:24):
When you think about that though, yes, bye by Florida, Yeah,
by all these coastal areas.

Speaker 1 (43:29):
Solomon Islands, Yeah, a lot of Micronesia. Then the leaders
of those countries, a lot of island nations are very
well aware of this, because even back in twenty nineteen,
they could see this stuff happening and they pled their
case to I think we used to call the United Nations,
which may still be around, I don't know.

Speaker 6 (43:50):
Well speaking of in twenty nineteen, it was in twenty eighteen,
I believe, that a piece of the Antarctic shelf fell
off that was the size of Delaware.

Speaker 5 (43:59):
Jeez, what a wonderfully descriptive size Delaware.

Speaker 1 (44:04):
That's our new unit. Let's just compare things to that measure. I've attached a picture of Delaware for comparison.

Speaker 5 (44:11):
It was three point six delawares. That's great, let's do that.

Speaker 1 (44:18):
So, yes, the ocean will rise, and it will rise
by two meters if CO two peaks at one thousand
parts per million and two meters makes a hell of
a difference. That's that's where you know, the smart money
has already bought something inland, right right, and they're waiting

(44:39):
for the beach to show up.

Speaker 6 (44:41):
That beachfront property you own in Arkansas, that's right, right, right.

Speaker 5 (44:45):
Right right.

Speaker 1 (44:47):
This brings us to, we're talking about the natural world, this brings us to another thing. Even in twenty nineteen, we as a species were undergoing what's called a great extinction. Great extinctions are nothing new. This is not the first,
this will not be the last, but they are brutal

(45:07):
to experience in the moment. You know, a thousand years hence, we will already have lost quite a few wild animals,
quite a few plants, a ton of insects, many of
which were never discovered before they went extinct. But because

(45:27):
we're a thousand years in the future now, we may
also be able to pull a wooly mammoth and bring
them back.

Speaker 5 (45:36):
We just don't know where we will put them.

Speaker 6 (45:38):
Well, Ben, you spent so much time trying to figure
out if you could, you never stopped to think if you.

Speaker 5 (45:45):
Should know.

Speaker 6 (45:50):
A bastardization of it. But yes, yeah, yeah, yeah.

Speaker 5 (45:54):
We look at all these things that they're just talking about.
The bees are now on the endangered species list, and they say, you know, where are we without bees and pollination? I mean, hurting, very much so. So you better hope that
we have that kind of technology or wherewithal now to
change things. Like you said, a hard stop, you know,
but you know there needs to be a lot of

(46:16):
hard stops in almost every one of these camps for
our next thousand years to be something that we want
to be a part of.

Speaker 6 (46:23):
Another threat to throw into the ring, other than
being in an extinction moment, other than where the climate
could go from a warming perspective, we are overdue for
the next ice age. Oh yeah, and in a thousand years it becomes much more likely. Now it's a weird

(46:45):
catch twenty two, because most scientists think that the reason it hasn't happened again yet is because of the global warming. Saving ourselves by killing ourselves. Yeah, it's like cutting the rope.
But you're like hanging over a vat of acid, Like
I'm either going to hang here or I'm going to
fall into the vat of acid. These are not good outcomes.

(47:08):
But and I am I am not a climatologist. I
don't know if I told you that. Guys before we
started the show.

Speaker 1 (47:14):
I'm not you're pretty, You're not like some kind of
weather surgeon or whatever they call it.

Speaker 6 (47:20):
A climatician. But I have to imagine that the balance
there, of global warming versus the Earth really wanting its next ice age, all plays in somehow to a cosmic
soup of not goodness.

Speaker 1 (47:38):
Yes, yeah, that's that's the thing. That's what makes this
a tricky topic. If we're talking about the planet in
one thousand years, it's going to be around. It's it's right,
it's going to be, by and large still home to
some sort of life. The question is whether or not

(47:58):
people are still going to be in the mix. And
for assuming that they are, they're going to be radically
different from what we know. That thing we talked about
where we could speak, well, we could interact with someone
from ten nineteen and they.

Speaker 5 (48:12):
Would look sort of like us.

Speaker 1 (48:14):
That's probably not the case in thirty.

Speaker 5 (48:16):
Nineteen, because of environmental or designer reasons, either or apps.

Speaker 1 (48:21):
I would say, I would argue largely designer reasons because
now we are capable in thirty nineteen of impacting, affecting,
and steering our own evolution and adaptations. Because despite the
Great extinction despite the ups and downs with the climate
or the ecology in which we live. If we're around

(48:44):
and civilization hasn't collapsed, we are going to be doing
amazing science fiction level stuff. Will we be close to
being a Type two civilization? I don't know.

Speaker 5 (48:57):
Type two has a hell of a gap.

Speaker 1 (49:00):
You know, is Type Two the one where you are able to... your star, tap your star, right, harness the...

Speaker 6 (49:07):
Entire thing, basically? Will we have a Dyson sphere?

Speaker 4 (49:11):
Yes?

Speaker 5 (49:12):
Or progress to where we don't need one because we've
come up with some other.

Speaker 1 (49:18):
Yeah, maybe we get to type one and we say, okay,
let's just let's stay here, let's franchise out Mars and
maybe you know the shopping districts. Yeah, if the Martians
want to do it, they can. But we will have
this amazing technology. We will have super fast computers, perhaps

(49:39):
perhaps so so very quick that they are no longer
computers to us. They are part of us, you know,
Like we mentioned with people becoming you know, becoming digital
versions of themselves. This is this is rife for all.

(50:00):
So a side note, a new era of folklore. Can
you imagine all the urban legends? You will literally hear
voices in your head. Your ancestors will not have died, right,
they will still exist. Yeah, you've got beef with your
great uncle. He's still around.

Speaker 6 (50:19):
Now you're saying he's still around because they captured his consciousness. Yes, okay,
it's funny.

Speaker 4 (50:25):
One of the.

Speaker 6 (50:25):
Technologies they're talking about, and this could actually be a lot sooner than a thousand years, is just taking enough
writings and enough other input about a person and their life,
and all of a sudden you input all of that
and you get into augmented or virtual reality and you're
sitting there having a conversation with Samuel Clemens. You know,
it's an approximation obviously thereof, but, I mean, you're actually

(50:47):
talking to the person.

Speaker 1 (50:49):
Right right, the digital essence the closest we could get
perhaps to a soul at that point. But it gets
even weirder, because we will also, like, at this point we're clearly in Ship of Theseus territory, and this means that, you know, maybe, let's say, you have the digital

(51:13):
ghost of your great great grandmother says, you know what, Brent,
I remember the Scorpion Wars and I would love to
go to that museum with you. Just buy me a
body real quick, right, because now it's just hardware.

Speaker 5 (51:29):
Do you know what I mean?

Speaker 6 (51:31):
They call it a sleeve. Was that in Altered Carbon?

Speaker 3 (51:34):
Yes?

Speaker 4 (51:34):
Yes, sleeve?

Speaker 2 (51:36):
Yeah.

Speaker 3 (51:37):
All right, guys, let's stop right there for a second and we'll come right back after a word from our sponsor. All right, strap in, let's get back into this.

Speaker 6 (51:51):
The other wild card here, Yeah, that could impact every
aspect of what we've talked about, whether it be how
we harness power, do we have the technology to create a Dyson sphere, do we have the technology to save our Earth, to do all of these things, would be the interference of another society, extraterrestrials. So,

(52:15):
interdimensional or extradimensional?

Speaker 5 (52:17):
Sure?

Speaker 6 (52:18):
I uh. We talked about this on our show a lot.
Whether we believe in aliens, there's two questions always hidden there.
Do you believe that another form of life exists in
this universe?

Speaker 1 (52:28):
Absolutely?

Speaker 6 (52:29):
Second question, do you think they've been here?

Speaker 5 (52:32):
That's exactly exactly we are.

Speaker 6 (52:36):
I don't. I don't think that they've been here, but
I think we are very very close to them showing up,
Like I think we're getting farther, far enough along to
be able to send some signals out there to be
able to see things. I think we're I think that
we're going to get to a point very soon where
they're just going to show up one day.

Speaker 5 (52:57):
Yeah, have our.

Speaker 6 (53:00):
From Star Trek: First Contact, have our warp drive moment. Now, if that happens, which I think is just as realistic a possibility as us uploading our consciousness, I mean, like, it's just them showing up, hey, they're here, and we do acknowledge they exist. You know, depending on how old they are, their technology, where they are,

(53:20):
if they are able to reach us from something that
we can't see right now, chances are their technology is
so advanced that they could help us, help bring us along.
And I mentioned wild card, that would be the ultimate wild card, because they're introducing new technologies to us that bring us along. I don't think it's the Alien Nation thing where they're just as messed up as we

(53:41):
are and have the same internal strife and, you know,
because then it's just that doesn't align to me. How
could they have gotten here so quickly with such advanced
technology just to be as you know, but then as
we are.

Speaker 5 (53:55):
That opens up that whole other, you know, line of
thinking of if they are that advanced and they're going
to see that. Why do they care? Right right?

Speaker 1 (54:02):
That's the thing. The question then becomes not whether we
would be capable of recognizing this as sentient life, It
would be whether it recognizes us as such.

Speaker 5 (54:15):
Michio Kaku, he was like, everyone sees anthills. When's the last time you said, ants, here are beads and trinkets, take me to your queen? And if you do, the ants don't even know you're talking to them. Now, is the ant dumb because it doesn't understand English? Or are you dumb because you're trying to
talk to an ant?

Speaker 6 (54:33):
Yeah, I'm dumb because I've got the magnifying glass out
and I'm trying to burn them with the sun.

Speaker 1 (54:37):
Yeah right. It's just I mean everything gets very quickly
to the level of stories from the Old Testament, you
know what I mean? Like, we can give favor to these ants and they'll say, oh, food, or like, oh, the other ants that were killing us
are mysteriously gone and the land is poisoned.

Speaker 5 (54:59):
I just wherever we go, if we're ever in that
position to where we can be the ones too out there,
we just build pyramids and leave, you know, and so
that way for a generation.

Speaker 1 (55:10):
I mean, that's a power move, you know what I mean.

Speaker 7 (55:13):
I vote we get weird with it, just confuse the hell out of them. Let's, uh, let's give some very basic and wildly arbitrary, irrelevant rules, you know what I mean? Like, uh, let's ban people from... or let's ban a life form from doing something very specific with its antenna, and it's got

(55:40):
to be something they would normally have never thought of doing.

Speaker 5 (55:42):
Right. Uh, but you're but you're right.

Speaker 1 (55:45):
This span of time, this one thousand years, it inherently includes space exploration at this point, right, very soon. As far as twenty nineteen went, Elon Musk was
planning to have four ships permanently on a circuit to

(56:06):
Mars by twenty twenty-five.

Speaker 5 (56:09):
Twenty twenty-two, they're gonna start sending them up. Twenty twenty-five,
I think, is when that's gonna kick into gear. Yeah.

Speaker 1 (56:14):
Yeah. So even if that estimate is wildly optimistic, which
it is; even if it's so wildly
optimistic that it doesn't happen until twenty-one twenty-five,
that's still another nine hundred years. So if
we're going to get to space, it is going

(56:36):
to be within this time span, which makes it one
of the most important eras in human history. And that
goes to your point, John: it will involve us
putting out so much noise into the universe, even more
than we already have. And we've all read those, you know,
those kinds of bleak, dystopian sci-fi pieces where we

(56:59):
finally get a message from space and it's something cryptic
like, shhh, they'll hear you. And that could be
the thing. Because we could also find ourselves, and we're just
speculating now, there's no basis for this, it's a thought
experiment, we could find ourselves in a situation that
I would argue is more likely

(57:20):
than meeting a biological life form: where we find the
creations of some other organic life form and they're really
into paper clips. We know that Pluto used to be
an entire sphere, but it's slowly disintegrating, and then we
have to do that calculation. That's the thing: given what we understand

(57:44):
about travel across these impossibly vast distances, unless
they have a propulsion or transit system that redefines our
understanding of physics, which they pretty much would have to,
we would be in one of the worst waiting games ever.
We would say, we know something's coming, we know it's

(58:06):
not natural. We have four hundred years together.

Speaker 5 (58:10):
Yeah, so people are gonna build religions around that. I
mean everything. I mean, mass hysteria.

Speaker 1 (58:18):
And a lot of the religions that we
have back in twenty nineteen, and I'm riding this dead horse
into the ground, will have also changed, such that
they may be wildly unrecognizable to people who practice
them today. Right? The Catholic Church has soldiered on throughout the

(58:39):
years and may well still exist, but may also have
a very different set of practices.

Speaker 5 (58:45):
Well, they've come out in recent years and said, like, hey,
if aliens come, it's okay, it's part of God's plan,
it's part of the Bible. You know, all-inclusive, bring
them on in.

Speaker 6 (58:55):
Which you would have been burned for saying not that
long ago, that same thing. But Ben, I think
you bring up a really good point, and you can
almost take it to the next level: not just religion,
but society and species changes. So, continuing along the thought experiment,
as a society, we're probably a global society by that point.

(59:16):
We probably speak one to two languages as a
group; the rest are dead languages. And we probably have,
you know, one currency for the planet. I mean, these
are just things that are almost inevitable over that time period.
If that's the case and we are an interplanetary species,
we're going to have to do some of the things

(59:37):
that we discussed earlier: some gene editing, some addition of cybernetics,
things of that nature. We're going to have to change
to live in different environments. So there's probably going to
be a group of people living out in
the Oort cloud, and there will be a group of
people living near Proxima Centauri. All of these

(01:00:00):
people will have had to change, and could be
gone for long enough periods of time, because of how
long it takes to get there, that their heritage,
the way they look, they would almost dissociate themselves
from the history of man. They are their own thing.

Speaker 1 (01:00:17):
We have created aliens. One thousand years hence, we have
created aliens. The most well known to us are probably Martians, correct?
But the lunar people are pretty weird too. They're like
the new Florida.

Speaker 6 (01:00:33):
Well, it's underwater. They had to go somewhere soon, man. Florida.

Speaker 1 (01:00:37):
Man. Yeah, by which I meant awesome. And Disney owns it.
But that's fantastic, though, because now we may have.

Speaker 5 (01:00:46):
Also, that's the weirdest part of this.

Speaker 1 (01:00:50):
Maybe a thousand years is too small a margin for this,
but what if we had a successful colony on Mars
that somehow survived, not just survived but thrived, and then
civilization on Earth collapsed and we entered a dark age? We
had legends about Martians, but they never screwed up to
the extent that we did. And they decided to come

(01:01:12):
visit us. They're like, wow, our parents' house is trash.
That would be an alien encounter for us.
They would be extraterrestrials, right? But at this point, if
we're around at all, we will
have had people visiting these places and hopefully living there

(01:01:35):
on a semi-permanent basis very quickly. And I think
you raised this point earlier, Brent: they will speciate.

Speaker 5 (01:01:45):
That, or be completely uploaded into some sort of cybernetic,
you know, Westworld, right? And then that negates all
of that, you

Speaker 1 (01:01:55):
Know, so we may have we may have a situation
where everybody on Earth speaks thus one to two languages.
That's incredibly likely.

Speaker 5 (01:02:04):
But we may be in.

Speaker 1 (01:02:05):
A situation where someone says, okay, I have to use
my cybernetic parts, my in-the-cloud software
programming, because I've got to speak with this Martian,
and then who knows what the hell they're talking about.

Speaker 6 (01:02:22):
And it becomes even more likely if we play out
the scenario we discussed earlier, where we start
with self-replicating robots getting us to other places, and
then, if we want to go there ourselves, because we can't
survive, you know, the five-hundred-year trip or whatever,

(01:02:43):
we laser-shoot our consciousness there. Like Michio Kaku, you
mentioned him earlier, Brent: he talks about a day where
we will be cosmic tourists. So, oh, you want
to go, you know, look at this exoplanet that is,
you know, fifty light-years away? We just shoot

(01:03:05):
our consciousness there through a laser. You've got a robot
body waiting there that you rent for the day, and
you go look around and see the big volcanoes, and
then you shoot to the next planet, and so on
and so forth. Now, taking that a step back, just
the ability to get there, to shoot a
large laser beam, shoot our consciousness there: if we shoot

(01:03:25):
it there, the laser itself could take five
hundred years to arrive, depending on how far away it
is, but there's no time involved for you. You get
there, and then, you know, a thousand of your friends,
you start something new, and all of a sudden now we've
got a new species of robot people.

Speaker 1 (01:03:41):
Right, yeah. And again, we're much more likely,
based on what we understand about sentient life, which is
very, very little, we're much more likely to run into
technology of some sort that sentient life created. It's
the V'Ger kind of example from Star Trek, right?
And what's fascinating about that is there's a very important

(01:04:04):
step that we can't miss. And I know,
as some of us were listening along, we said, well,
all right, send your consciousness through this laser, the robot people
come back to Earth, right, return these memories to, you know,
Matt or John Prime or whatever you call yourselves, to
experience. What that means is they're going to have

(01:04:25):
to reverse the cloud process. They're going to take the
relationships between neurons mapped out virtually, see
that and see the difference, and then have your
neurons perform the same dance so that now you can
remember these things. And all you had to do was,
you know, wait, switch bodies, whatever, repair yourself for

(01:04:48):
five hundred to one thousand years. Geez. So it's like,
you know, there's some time involved.

Speaker 6 (01:04:57):
I just want to know one thing: how long
will I need to work to build up that kind
of time off?

Speaker 1 (01:05:03):
Ah, that's the question.

Speaker 6 (01:05:05):
Yeah, I'm gonna put my out-of-office on:
I'll be back in a thousand years.

Speaker 5 (01:05:09):
Yeah, DMV. Still terrible. It's still terrible.

Speaker 3 (01:05:15):
It is crazy to imagine that perhaps one day our
species will think of time on that kind of scale.

Speaker 1 (01:05:22):
I think we won't have to, inevitably, or
it'll just be an illusory thing.

Speaker 5 (01:05:28):
We know that.

Speaker 1 (01:05:29):
We'll have terraforming down pat. It'll still be
an imperfect science, but we'll have a lot
better idea of what we're working with and what the
scale is. We will have finally answered your earlier question, gentlemen,
of whether it's better to use a laser or a nuclear weapon
on the poles of Mars. Yes, we'll have

(01:05:51):
new energy sources, some of which we're going to have
difficulty comprehending in twenty nineteen, harnessing interactions in higher
dimensions or different dimensions. We'll also have utility fog.

Speaker 4 (01:06:09):
What is utility fog?

Speaker 1 (01:06:10):
Yeah, it's like a, well, a thought experiment for now, but it's like,
so, the philosopher's stone in alchemy can transform any substance
into any other substance. Utility cloud, a utility fog, rather,
is kind of like a nanotech version of that. You
can say, well, I enjoy this domicile, but now

(01:06:31):
I want something a bit more, what do they call
it, Southwestern? I want something with a Pueblo feel. Make
it so. And then this fog, these nanobots, would just
rearrange into whatever they thought you meant. Now, hopefully
it won't be as fraught with hilarious error as telling

(01:06:54):
your Amazon or your Google Home in twenty nineteen what
song you want to hear.

Speaker 5 (01:06:59):
Right?

Speaker 6 (01:07:00):
Tea. Earl Grey. Hot.

Speaker 5 (01:07:03):
It's like a crazy three-D printer from
the future. Yes, yeah, just a living, breathing three-D printer.

Speaker 6 (01:07:10):
Well, I just made the quite necessary Star Trek: The
Next Generation reference. Yeah, the better series. Sorry, yeah, I know,
I just made a hot take. Hot take? You just started
a war. Yes, Kirk or Picard, you decide. No, hey, thank you.

Speaker 5 (01:07:31):
Uh? They.

Speaker 6 (01:07:32):
The one thing we haven't talked about with all of
this is, okay, but what if we do achieve faster
than light travel? Or what if we do find a
way to bend spacetime to where you can, you know,
go warp seven, and we just virus out everywhere?

Speaker 1 (01:07:47):
Yeah, that is just, in a word, it's gonna be baller. Yeah,
because we will be able to finally
confirm things that we have wondered about since we
were able to wonder about things.

Speaker 5 (01:08:03):
Are we rare?

Speaker 1 (01:08:04):
Are we alone in the universe? And by the time
we take time out of the equation, which is tricky,
but that is the right way to say it in English,
then all bets are off. Everything
has changed. You know, if you are
warping time and space, then you can visit ten nineteen

(01:08:24):
if you so wished, right? I don't know if
people would, or if we'd recognize them, because at this
point, in thirty nineteen, people are recognizable to us, but
we are early man to them, right? And

(01:08:46):
we use our hands to do things, not out of
some sort of fashion statement, but because we have to.

Speaker 3 (01:08:53):
We still eat a whole bunch of food.

Speaker 5 (01:08:56):
Yeah.

Speaker 1 (01:08:56):
No, here's the thing. So we've talked about this in
the past: human beings now think of ourselves as individuals,
but we're much closer to cities ourselves, given all the
cells that outnumber us, that live inside us. And this
trend will continue. And in a thousand years, again, if

(01:09:16):
people are still around and nothing super horrific and existential occurs,
we are going to redefine ourselves. Our privacy will be long gone.
Oh yeah, privacy is a relatively recent notion in twenty nineteen,
and that's probably when it started to die, too.
But by thirty nineteen, privacy will be a weird, alien,

(01:09:40):
why-would-you-do-that concept. And the concept of
the individual will change as well, because, just the way
that our consciousness now is an agglomeration, the sum total
of interactions between neurons, we
will each individually be a networked node of several different things.
You won't just have one cloud consciousness; you'll have multiple.

(01:10:04):
You'll be a group mind, and at times that group
mind might not agree with what you consider.

Speaker 5 (01:10:10):
You, which is freaky.

Speaker 1 (01:10:12):
I mean, yeah. It's something we've always stated in
multiple languages throughout ancient history. When people say, like, I'm on
the fence, I'm of two minds about that; you will
be of several hundred.

Speaker 3 (01:10:25):
Right. Well, guys, in the end, no matter what, you know,
all the things we've discussed so far in this episode,
all of us in this room, the four of us,
Paul out there, everyone listening in their office right now,
and, oh, Conspiracy Bot, I didn't even see you back there.

Speaker 4 (01:10:43):
He must be recharging. Drunk? Oh, that's what it is.

Speaker 6 (01:10:46):
They eventually pass out.

Speaker 3 (01:10:48):
So all of us, you know, in the end, we're all gonna
get together in twenty nineteen and decide that our species
is going to move forward towards the future. We're gonna
steer our planet and our civilization in the direction of
a utopian thirty nineteen.

Speaker 5 (01:11:04):
Yeah, can we.

Speaker 1 (01:11:05):
Get some inspiring music under that?

Speaker 3 (01:11:09):
I'm totally, totally kidding, guys. All of the decisions that
could lead to any type of utopian future will be
made in corporate boardrooms and in government situation rooms, and
the decisions will be based on profits and margins and
election cycles.

Speaker 4 (01:11:26):
The end. We fizzle out.

Speaker 1 (01:11:29):
I mean, for some point of time. Oh,
I wish we had more time. I find myself, and this is
unusual for us, Matt, I find myself a little
bit more optimistic.

Speaker 4 (01:11:39):
I think it's because of these guys.

Speaker 1 (01:11:40):
It probably is. And speaking of John and Brent, maybe
we should end on a
cooperative note and go around the table. First off, guys,
thank you so much for coming on our show, on
behalf of us and our listeners. Secondly, two questions. We'll

(01:12:01):
start with you, John. One: of the things we've
examined today, what do you find most exciting? And two:
what do you find most terrifying? Oh, if you had to

Speaker 6 (01:12:13):
choose. Let me do it in reverse order. I think the
most terrifying is the idea of moving from where we
are to that next phase, to where we become a
first- or second-class civilization according to the Kardashev scale.

(01:12:33):
I think the road between here and there is littered
with a lot of potholes that could
really, really take us off track and could become one
of those existential threats, whichever one you want to talk about.
I mean, I think they're all just as likely;
one's just as likely as another. You know,

(01:12:53):
that's the most terrifying part. And the reason that's
the most terrifying is that I'm a pretty big optimist when
it comes to these things. I think a lot of
negative human behavior is driven by need and necessity. Certainly
some people are driven by power, but that power is
rooted in the power that other people will give to them.
And if the masses are fairly happy, they're not going

(01:13:17):
to allow a tyrant to run them. That's just my
one guy's thought. So I think that if we are
able to get to a thousand years from now, and
we have made it past these existential threats, I do
see it a little bit more utopian. I think we'll be
interacting with other alien species, and I think

(01:13:38):
we will be a multiplanetary, perhaps multi-galaxial, if that's
not a word, I just made it up, sounds pretty
cool, species. And I think it could be a really,
really cool future.

Speaker 4 (01:13:52):
We just got to get there.

Speaker 5 (01:13:54):
I think, for me, the thing that is the most
exciting is the thought of the exploration out there, what
we could do. You know, it's one of those things where
you can just shut your brain off and think, and
there are really no wrong answers, because we don't know.

Speaker 4 (01:14:08):
And that's what's awesome.

Speaker 5 (01:14:10):
The scariest part ties into that too: I'm
still hung up on the whole uploading of our
consciousness and what we will be. If, in a thousand years,
for us to be able to do those things, will
we still be human?

Speaker 3 (01:14:21):
You know?

Speaker 5 (01:14:21):
And, you know, what does it mean to be human?
I think that's the big question. And for
us to be able to explore the cosmos, to go
out, to survive, we're gonna have to make a lot
of changes, you know, internally, externally, physically, and what that
does to us is a big question mark to me.

Speaker 3 (01:14:40):
Wow, Well, gentlemen, thank you so much for coming on
the show.

Speaker 5 (01:14:44):
Well, thank you for having us, at last.

Speaker 3 (01:14:48):
Just, why don't you tell everybody a little bit more
about Hysteria fifty one: where they can find you, what
they should be listening to, and everything.

Speaker 6 (01:14:56):
So, if you haven't listened to Hysteria fifty one: we
talk about a lot of stuff like this, conspiracy theories, UFOs,
the unexplained, the unexplored. We do it a little bit
differently in that our third host is not Noel.

Speaker 4 (01:15:08):
That'd be weird, but it.

Speaker 5 (01:15:09):
Yeah.

Speaker 6 (01:15:10):
Our third host is an angry robot, who you see
here in the corner, named Conspiracy Bot, and Brent built
him in his lab to help edit and produce the show. Instead,
he just gets drunk, pardon me, instead, he
just gets drunk and threatens to take over the world.
Though he's kind of run by, like, a four eighty-six
computer, so it's not a big threat. Okay, he's

(01:15:32):
not an existential threat. But yeah, you can find
us wherever you listen to your podcasts, or at our website, Hysteria
fifty one dot com.

Speaker 5 (01:15:39):
Yeah, just anywhere: Facebook, Twitter, you know, Instagram, smoke signal.
We'll get to you.

Speaker 1 (01:15:43):
And you have Hysteria Nation, 'Steria Nation, and yeah,
you can search for that on Facebook.

Speaker 5 (01:15:47):
That's our discussion group, and it's pretty active, so we
have a lot of fun in there. You
can tell us, you know, that we're wrong. You can
also do that.

Speaker 1 (01:15:56):
You can also occasionally catch one of us, Matt or myself,
popping in on Hysteria Nation, because we are also fans
of the show.

Speaker 3 (01:16:05):
That's right. And just while we're here, everyone listening,
check out our Facebook group, Here's Where It Gets Crazy.
Again, anything you want to talk about from the show,
you've got questions for these guys, let's just all
have a discussion. Let's talk about the future. Let's talk
about how none of it's going to be good.

Speaker 4 (01:16:22):
In my opinion.

Speaker 5 (01:16:24):
He's such a doubter. It's like, what's the future?

Speaker 1 (01:16:27):
Oh God. As Matt shakes his fist at the sky
and the inevitable slow grind of what we recognize today
as time, we reach the end of our episode, but
not the end of our show. Tune in to our next episode, which,
without laying any spoilers, because we're not quite sure

(01:16:47):
what we're gonna do yet, is going to be very,
very, very, very strange.

Speaker 3 (01:16:52):
And positive, so positive.

Speaker 6 (01:16:54):
Oh it's so good.

Speaker 1 (01:16:56):
Angel farts, trumpets. Yeah, nice, like slow jam numbers, whatever.
But in the meantime, we'd like to hear from you.
You are indeed our favorite part of the show. No offense, John.
No offense, Brent.

Speaker 3 (01:17:09):
And that's the end of this classic episode. If you
have any thoughts or questions about this episode, you can
get in contact with us in a number of different ways.
One of the best is to give us a call.
Our number is one eight three three STDWYTK. If you
don't want to do that, you can send us a
good old fashioned email.

Speaker 1 (01:17:29):
We are conspiracy at iHeartRadio dot com.

Speaker 3 (01:17:33):
Stuff they Don't want you to Know is a production
of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.
