Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is riddled with unexplained events. You can turn back now, or learn this stuff they don't want you to know. A production of iHeartRadio.
Speaker 2 (00:26):
Hello, welcome back to the show. My name is Matt.
My name is Noel.
Speaker 3 (00:30):
They call me Ben. We're joined as always with our super producer Dylan "the Tennessee Pal" Fagan. Most importantly, you are you.
Speaker 4 (00:38):
You are here.
Speaker 3 (00:39):
That makes this the stuff they don't want you to know.
We are recording on Monday, March thirtieth. This should come out a little bit after that. And frankly, we're not sure how much this information will change over time. This is a thought experiment.
This is a really weird one that's captivated us for
(01:02):
a while now. Gentlemen, I ask you, do you have
a favorite monster? Do you have a favorite legend? Fairy tale?
Speaker 2 (01:10):
Tall tale?
Speaker 4 (01:11):
It changes week to week, okay, depending on... that was just a Monster of the Week reference. No favorite monster? Jeez Louise. I really like the... that guy, you
Speaker 3 (01:22):
know, that guy, yeah, from earlier? Yeah, yeah, yeah, I like him.
Speaker 2 (01:27):
The Pale Man, the del Toro creation, for what that is?
Speaker 4 (01:30):
Pan's Labyrinth? I would have expected it.
Speaker 2 (01:32):
Thank you.
Speaker 4 (01:33):
Also, whatever the death demon kind of grim reaper guy
at the end of that is with the crazy wings.
Love that one. Also love a lot of the monsters
in Scrooged. I just really dig the design of the
Scrooged monsters.
Speaker 3 (01:47):
Scrooged is great. I guess you can't go wrong with practical effects. Okay, so ghosts. What about you, Matt?
Speaker 2 (01:53):
Does something like the Philosopher's Stone fit? Like, the legendary thing.
Speaker 3 (02:00):
Chemical?
Speaker 4 (02:00):
Yes, a crossover for sure.
Speaker 2 (02:02):
And those ancient concepts of alchemy that you could somehow,
through man's ingenuity and a little bit of magic, change
one substance into another. And we just saw that occur. Guys,
it really happened at the LHC.
Speaker 3 (02:18):
Yes it has. Yeah, we're talking about the old legends of transmutation, right, and alchemy is of course the granddaddy of modern chemistry. So not all of it was what we would call bunk in today's secular world.
Speaker 2 (02:35):
No, but the concept of turning lead, like, directly into gold was crazy. And we see now that at the LHC, guys, lead-208, a nucleus of one of those was turned into a gold-205 nucleus for about ten to the negative twenty-three seconds.
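For listeners keeping score at home, the transmutation Matt describes is simple nucleon bookkeeping: knocking three protons out of a lead-208 nucleus (82 protons, 126 neutrons) leaves 79 protons, and 79 is gold's atomic number by definition. A minimal sketch of that arithmetic (the function name is our own illustration, not anything from CERN):

```python
# Nucleon bookkeeping for the lead-to-gold transmutation mentioned above.
# Atomic numbers are standard chemistry: lead Z=82, gold Z=79.

def remove_protons(protons: int, neutrons: int, removed: int) -> tuple[int, int]:
    """Return (new atomic number, new mass number) after knocking out protons."""
    new_protons = protons - removed
    mass_number = new_protons + neutrons  # A = Z + N
    return new_protons, mass_number

# Lead-208 has 82 protons and 126 neutrons (82 + 126 = 208).
z, a = remove_protons(82, 126, 3)
print(z, a)  # 79 205 -> atomic number 79 is gold, so this is gold-205
```
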
Speaker 4 (02:53):
Wait a minute... and then it just switched, it flipped back.
Speaker 3 (02:56):
Yes. Like Jim Carrey in Dumb and Dumber saying, you're saying there's a...
Speaker 4 (03:00):
Chance, the chance there is, there's a chance.
Speaker 3 (03:03):
Yeah, this is a pretty bizarre proposition for us, folks,
the larger idea. It's a little bit different from some
of our usual episodes. If you remember, as you're thinking
and hanging out with us today, you remember all those
stories of monsters and myths and legends throughout history. Tonight
we are asking something that's been on our collective minds
(03:26):
and a bit of an obsession. What if technology is
making some of these old supernatural things in practice reality.
I say we get into it.
Speaker 4 (03:37):
We must. But first, a quick word from a sponsor.
Speaker 3 (03:46):
Okay. So, if I'm you, audience member, and I'm listening to this, my first question is, what the heck are these guys talking about? What do we mean? Like, we're describing a general thing. All those old fairy tales you heard, all those legends, those monster stories, whether it be djinn or vampires, or shape shifters, lycanthropes, cryptids, zombies. Most
(04:12):
people have some kind of favorite monster, like, to your point, Noel. Maybe it's a passing fancy, or maybe it's just a certain genre of film that they would enjoy in the realm of fantasy or horror. So some people love vampire films, right, some people love undead or
(04:34):
walking dead zombie films.
Speaker 4 (04:35):
You know, I'm quite partial to the Hellraiser universe, even though maybe only two of those movies are watchable. I just love the Clive Barker, you know, Cenobite designs.
Speaker 2 (04:47):
That always creeped me out the most of all films, I think, truly.
Speaker 4 (04:51):
I did... the whole pain-is-pleasure thing. It just gives me the heebie-jeebies.
Speaker 3 (04:56):
The Hellbound Heart, the novel that birthed the franchise, is such a banger. Clive Barker is such a brilliant, weird dude. He actually wrote, speaking of legends and tall tales, he wrote what I think is one of the best young adult fiction books.
Speaker 4 (05:13):
Oh, The Thief of Always. That's it, The Thief of Always. If you haven't read it, read it now.
Speaker 3 (05:19):
It is so good.
Speaker 4 (05:21):
It's got, like, big Roald Dahl energy, in a way, and it's, like, Neil Gaiman-y, and it's just very, very good, and it's illustrated, and the illustrations are also phenomenal. That's a long-time favorite of mine. Man, I haven't thought about that in a minute.
Speaker 3 (05:35):
Clive Barker is also a well known visual artist and illustrator, and he's super into mythology. And yes, you should read The Thief of Always again. It'll take you, like, an afternoon. It is a stem-to-stern banger, like Missy Elliott's flip it and reverse it. So those are kind of
(05:58):
on par in the Western canon, we could say. But, you know, what I was thinking about, guys, is
that we look back at these things we call myths
and the supernatural, and sometimes we make this unfair assumption
that humans of civilization's past were somehow less intelligent than
(06:22):
modern humanity. And part of that is because they believed
a lot of stuff that current humanity has concluded is
not true. Like for much of human history, a lot
of people genuinely believed in the things we call myths.
Like you're in Greco Roman society. You think Apollo is
(06:45):
a real dude and not just the charming dog that
appears on our show.
Speaker 4 (06:49):
Sometimes mm hmm, I tell you that Apollo has got
an arch nemesis out in the world named Zeus.
Speaker 3 (06:55):
You did, yeah, dog, Yeah.
Speaker 4 (06:57):
They're bitter rivals. Aren't those related? Are they the same pantheon? Or aren't they kind of equal? Like, Apollo... no, Apollo is like the messenger. No, that's Mercury. Gosh darn it, I'm
Speaker 2 (07:08):
Screwing it all up.
Speaker 3 (07:10):
Well, there's the idea...
Speaker 4 (07:11):
There's analogous gods in various pantheons. But to your point, that's part of this too, right? The parallel thinking of personifying fears, hopes, and dreams, and the need for the environment to do things that we need it to do, through various deities and creatures and creations of fantasy that ultimately represent very real parts of being human.
Speaker 3 (07:33):
Oh, one hundred percent, and well put. I also, side note, agree with your earlier theory that there is a trend of small canines getting named after phenomenally dangerous Greco-Roman gods. Yeah. Like, we can still see traces of these earlier beliefs, this
(07:54):
literalism applied to the supernatural. We can still see it in the form of superstitions today. I think a lot of people, a lot of humans, have their own private superstitions. You know, what do you touch with your left hand versus your right, etc. Those things that we treat as silly but quietly reaffirming. Hot take: the atheists in the crowd
(08:18):
are going to argue that religion itself as a concept is a vestigial thing, an artifact of the time when the majority of human beings literally believed in magic. I don't know if I agree with that. We're not here to denigrate people's personal beliefs or faith.
Speaker 4 (08:37):
Well, that is an interesting point, though, Ben, because the question then, for me, becomes: is modern religious belief and faith a form of belief in magic? Because I think there might be some devout religious folks who would consider that to be blasphemy, that magic ain't part of the equation, that it's divine influence, and that calling it magic is in
(09:00):
some way dismissive or lessening of it in its gravity.
Speaker 3 (09:04):
One hundred percent, because for many different religions, the idea of magic is kind of hacking reality in a way that goes against the divine, right? So look no further than the Christian Bible and what it has to say about witches. But for us here, March thirtieth, twenty twenty six:
(09:29):
our main thing as a show is believe what you want. As long as you are not forcing those beliefs on other people, and as long as you are not using your beliefs to harm other people or to harm yourself, go nuts with it. Roll the dice. Think. Do as thou wilt. Nobody has a universal answer at
(09:50):
this point. And, bicameral mind theory aside, people from the ancient past were pretty much the same as people of the modern day. The same hardware, the same basic needs, and the same urge to explain the world in which they lived. That's why today's scholars, and we don't have to get too
(10:13):
into the academic weeds here, today's scholars see a lot of real-world efficacy in the legends of old. So, for example, we all remember warnings about boogeymen or child-snatching monsters in the wild that literally helped make sure that children didn't go too far into the woods. Because
(10:37):
it might not be, like, a magical ogre or a... what's a fun monster?
Speaker 4 (10:44):
No, we can do better.
Speaker 3 (10:46):
We can do... I think we do better. Something weird, like a werebadger or something. It's not that that's going to kill you in the woods. It's a kid dropping into a creek that's deeper than it looks, or breaking their leg.
Speaker 4 (10:59):
Lassie! For that, you need to deploy Lassie to the scene.
Speaker 3 (11:03):
That's a good note. We need to figure out when border collies became a thing.
Speaker 4 (11:06):
Yeah, and all these wells everywhere that little kids are getting trapped in. Matt, you got a fun hybrid monster?
Speaker 2 (11:14):
A creature from the badger lagoon.
Speaker 4 (11:17):
Love it, no notes. A terrifying lagoon. It's a weird badger, like in the scenario of badgers in bags... like, yeah, the implication is that that's not good.
Speaker 2 (11:32):
If it's in a bag, it's a secret it's contained.
Speaker 3 (11:35):
Okay, yeah, yeah. But the thing is, all badgers are not created equal. Shout out to the honey badger, and shout out to the civilian badger.
Speaker 2 (11:45):
The important thing is that the bag man knows that
the badgers are in there, but anybody observing the badger
from the outside has no idea.
Speaker 4 (11:52):
But gosh, I'm the badger, I'm the bag man, we all know. Yeah, that's on record, thank you.
Speaker 3 (11:58):
So we also see other efficacy of these stories humanity told itself, because all human beings... you're essentially a story that you're telling yourself. There are myths about things like changelings, remember those? I'm not referencing the horror film Changeling, but the, I think it's the Irish tradition, the Irish-Scottish-UK
(12:22):
tradition, that infants might be switched out shortly after birth by fairy folk. The Unseelie, yeah, yeah. And then these non-human children, these changelings, were sickly, right? They were not doing well, and they often died. Folklorists believe that
(12:44):
this was an attempt to explain otherwise inexplicable childhood ailments, medical conditions, and so on.
Speaker 4 (12:52):
Well, dude, I mean, like, crib death, or cot death... we still barely understand what causes that. And there's a name for it, but it's like a very mysterious phenomenon still, if I'm not mistaken.
Speaker 3 (13:04):
Well, yes, sudden infant death syndrome.
Speaker 2 (13:06):
And there are all kinds of rituals you have to take part in to increase the chances that your child doesn't experience that, or, you know, that you won't wake up one night and that is happening. You know, there is weird stuff about how to arrange pillows and what to put in the crib or keep out of the crib. It is ritualized in an interesting way.
Speaker 4 (13:24):
Do y'all remember the Chuck Palahniuk novel Lullaby? Yes, that was, like... and not to spoil anything, but that attempted to tell a story explaining crib death, associating it with, I think, certain voodoo rituals and the idea that it could be brought upon somebody, you know, from an enemy or a nemesis.
Speaker 3 (13:43):
Like the earlier legends, Lullaby is acknowledged to be a
work of fiction. I think what we're saying is, look, folks,
even if you personally do not believe in the supernatural,
even if you see yourself as the most skeptical secular
thing walking the planet Earth, there is no denying that
(14:07):
these stories, these perspectives on reality, had a significant and
positive effect on a lot of the societies in which
they were created. Of course, not always. Countless innocent people
have been, you know, tortured and murdered for religion from
Aztec human sacrifices to the actions of extremist terrorists today,
(14:33):
who are very upfront that they are murdering people for
religious ideology.
Speaker 4 (14:39):
And for potential rewards in the afterlife.
Speaker 3 (14:41):
Right, Yeah, that is the case.
Speaker 2 (14:45):
Right.
Speaker 3 (14:45):
No one has conclusively scientifically proven the existence of an
afterlife just yet. Anyway, the setup here is that we
want to acknowledge as a result of these key social functions,
this need to understand and explain reality. It's frankly pretty
(15:07):
difficult to imagine a human world that does not have
some sort of belief in the supernatural, even if it's
only an artifact or vestigial thing from earlier civilizations,
like superstition. Do you guys practice any superstitions that you're
comfortable sharing on air?
Speaker 4 (15:25):
Not really, But Ben, this whole topic reminds me of
a Stephen King novel that I have not personally read.
But there's this wonderful Stephen King podcast called Just King Things.
Where they're going through his whole bibliography, or whatever, canon,
you know, one at a time, and they just got
to the book Revival, which I have not read, and
(15:46):
it sounds like a banger, and I don't mind spoilers,
so I definitely listen to the whole discussion, but it
revolves around the idea of spiritual technology, or this idea of electricity being this remnant of, like, a dead god in some ways that was left behind for humans
to wield, and the antagonist of the whole story is
(16:08):
attempting to figure out if there is an afterlife through
experimentation using electricity or technology. Things like Jacob's Ladders.
Speaker 3 (16:15):
Beautiful. That's perfect, that's a perfect reference. And I agree with you, that is a banger. That is also one of the bleaker King novels.
Speaker 4 (16:27):
In the long line of bleak King novels, that's saying a lot.
Speaker 3 (16:31):
Yeah, the idea that electricity is somehow a Promethean remnant of an earlier thing. But oh man, ooh, dude, just text me when you read that, because... sounds good.
Speaker 4 (16:48):
Yeah, because you're going to be depressed. Well, I'm interested in your thoughts on this. Not to derail, but one of the hosts of that show says he doesn't care about spoilers because he thinks that a work of literature should work whether or not you know what's going to happen. And I tend to agree with that.
Spoilers don't really bother me either.
Speaker 3 (17:05):
One hundred percent. Yeah, you guys know that I usually welcome spoilers. I can't remember who said it, but a good book is like a house, right? It's like a nice house. You become familiar, you walk through your favorite rooms, you admire the view.
Speaker 2 (17:23):
I'm gonna push back, guys, sorry. I think if you
know how something's gonna end, it really does take away
from the in the moment, true experience that you have.
Speaker 4 (17:34):
I agree... I will agree with you in that respect. There's a certain element of surprise and twists and excitement you can only experience once. But then the question becomes, does that mean you're not a rewatcher or a rereader, and does revisiting something have no value to you? No?
Speaker 2 (17:51):
I just... you should definitely rewatch.
Speaker 4 (17:53):
Okay, cool, cool, cool. So there's two different things. And I'm not saying that it's not an awesome experience to not know the twist and to be surprised like that. I'm just saying the work should also stand alone.
Speaker 2 (18:02):
I agree. I do agree. Just as a case in point, I just watched Last One Laughing on... somewhere that you can find it, and it's a...
Speaker 4 (18:12):
Bunch of British delightful humans, right like it is.
Speaker 2 (18:16):
But the second season was first in the order, and I watched the first episode of the second season, and somebody on that season was from the first one, because they had won the first season. I went, oh, crap, I didn't realize that. So I went back and then watched the whole first season, and I was aware of who won the whole time. Which... it didn't ruin it,
(18:37):
but it took away that little... it's like a little spice that you get in the experience. That's all I'm saying.
Speaker 4 (18:44):
I'm with you, man. I understand what you mean. I feel I would feel the same way if I knew who was going to win The Great British Baking Show.
Speaker 3 (18:49):
And yeah, again, you know your beliefs are your own, folks.
We're not going to tell you one is better than
the other.
Speaker 2 (18:56):
I am. I'm saying that.
Speaker 3 (19:00):
Spoilers. Again, like I said earlier, as long as you don't harm other people with your personal beliefs, do as thou wilt. So in this case, harming other people with your personal beliefs would be telling people who don't care for spoilers a spoiler, right? That's unfair to...
Speaker 4 (19:19):
It's removing agency from the individual.
Speaker 3 (19:21):
Right, yeah, that's a good way to put it.
Speaker 4 (19:23):
By just popping it on them.
Speaker 2 (19:24):
You know.
Speaker 4 (19:25):
It's absolutely non-consensual, and we don't like that. We try very hard to always do spoiler alerts, even if it's older.
Speaker 3 (19:32):
Even if it's older. Yeah, yeah. Ford's Theatre, Abraham Lincoln, three, two, one, and we'll count it down.
Speaker 4 (19:38):
But he enjoyed the show. It was a delightful performance.
Speaker 3 (19:41):
Yeah. Other than that, Missus Lincoln, how was the play?
Speaker 4 (19:48):
You know?
Speaker 3 (19:48):
So, the thing is here, though: also, if you are a person who enjoys spoilers, there's nothing wrong with you. That's how you experience reality. Don't let people denigrate you, you know. The big thing here is that, speaking of spoilers, the majority of the world, I think we can agree, is increasingly secular, right? Increasingly dismissive of those
(20:11):
things we called legends and took literally once upon a time. There's scientific inquiry, right? It's a shining lighthouse that has revealed, to a degree, the hidden clockwork and mechanism of the universe, or universes, right? This is why we know that changelings were most likely unfortunate human children suffering
(20:35):
from varieties of ailments, many of which can now be easily cured through modern medicine. But to the LHC comment earlier, we know that scientific inquiry tends to create at least as many questions as it answers, right? Like, how long can we make lead gold? You know, if we really
(20:58):
create a black hole, what happens next?
Speaker 2 (21:02):
If we send a bunch of antimatter down the road
on a huge semi truck, what happens? Are we going
to be okay?
Speaker 3 (21:08):
Well?
Speaker 4 (21:09):
And then the question becomes, are a lot of these
things like the large Hadron Collider and some of the
experiments going on there things that we shouldn't be messing with?
Is that an example of us being Promethean and meddling
with the fabric of reality in a way that is potentially,
you know, catastrophic. I'm not saying that that's true. I
(21:29):
think there are some people that might say that that's
the case, that it's in some way, you know, fiddling
with God's plan or you know, something that something larger
than us that we should just have full belief and
faith in would not have wanted us to mess.
Speaker 3 (21:42):
With perfect, perfect, and well put, because every single technological
innovation of significance in human civilization will ultimately in some
way challenge pre existing frameworks for interacting with reality. That's
an overly fancy way to say it, but it does
(22:03):
hold true. I mean, maybe it is time again for
humanity to reframe its collective perspective and ask not just
whether science is disproving the supernatural, but whether these technological
breakthroughs are making some of those old legends, some of
(22:27):
that old magic real in practice, like making it reality.
Speaker 4 (22:33):
Can I just say this is fun. I'm excited about this.
Speaker 3 (22:38):
We're going to take a pause for the full moon,
We'll have a light transformation, and we'll be right back.
Speaker 4 (22:49):
Here's where it gets crazy.
Speaker 3 (22:51):
Okay. First off, hope everybody had a good break. We don't always do the breaks like Netflix, but guys, how was your werewolf break?
Speaker 4 (23:00):
Had a good howl?
Speaker 3 (23:01):
You had a good howl?
Speaker 2 (23:03):
Mm hmm, fantastic. I feel utterly psychopathic, oh boy?
Speaker 3 (23:10):
Uh, yeah. And check out our episode on AI psychosis.
Speaker 2 (23:14):
As wild and psycho.
Speaker 3 (23:18):
And then every cryptid episode we've done. We've done so many. I revisited some of those, guys, and we were on one a couple times. We're really good at those.
Speaker 4 (23:28):
Well, now, Matt, jumping back onto that: you're talking about this idea of, like, full moon fever, or, like, the idea of werewolves sort of being created as a myth because of the fact that there are instances of people going cuckoo for Cocoa Puffs when the moon is full.
Speaker 2 (23:41):
Gosh, yeah, that was an old, old, old video we made, one of the first ones we released on YouTube back when we had a YouTube channel, when we created our own. It was about, like, lycanthropy and psychopathy, and, you know, the tales of that person that you're not sure... yeah, the person you're not sure who they're gonna be when you encounter them.
(24:01):
And one of those people that is contained within that one human being is, you know, a murderer, and doesn't feel things in the same way that maybe you do.
Speaker 4 (24:10):
I guess that's why I always sort of associated, like, Jekyll and Hyde with werewolfism, because there is that transformative quality and that idea of the beast within and all of that stuff, which is a great way of describing, say, multiple personalities, or just ingrained psychopathy that maybe isn't always, you know, front and center. Right, right.
Speaker 3 (24:32):
See also M. Night Shyamalan's character The Beast. No spoilers on that trilogy, but they did a pretty good job of grounding superhero stuff. Check out Peter Stumpp, the alleged serial killer accused of werewolfery. Before we move to the real crazy stuff... Tennessee, I
(24:56):
saw you, you went off mute for a second. Do you have some hot takes on werewolves and shape shifters?
Speaker 5 (25:02):
Well, I just... I created an AI company for werewolves. It's called Lycanthropic. The product is called Claws.
Speaker 3 (25:12):
The product called Claws? Yeah, they said there's no silver bullet solution to AI, right? Hey, though, it only works on full moons, so check back with us in two weeks.
Speaker 2 (25:29):
Dang, dude, I bet that scarcity makes it wildly popular.
Speaker 3 (25:32):
Of course, like how I can only play Octordle once every twenty-four hours. Tut tut. Look, what we're saying here is, we know we might have sounded a little off or a little strange in our setup, and you might be asking again, like, guys, why are you talking about the role of myths and legends? It's because technology
(25:57):
is magic. Shout out to Arthur C. Clarke. As wild as it might sound, modern technology isn't just getting close to creating in-practice magical things. This is already happening as you listen to tonight's episode. A ton of publicly available, known technology is already like magic. Radar,
(26:20):
for example. Or these smartphones that so many of us are addicted to, you know what I mean?
Speaker 2 (26:25):
Bluetooth! Blue teeth? That's magic. How can these headphones... like, I click a button and somehow I can hear you guys talking? I'm not connected, there's no wires. What the hell?
Speaker 3 (26:39):
Yeah. Or, like, I've never been to Tehran, and it's going to be dodgy for me to get there. But I can do something a lot like clairvoyance now, and I can get knowledge from halfway around the world.
Speaker 2 (26:54):
This man has access to Zoom.
Speaker 3 (26:55):
Sure. And people were always fooling each other about their technological discoveries. You know, one of the great old saws from earlier World Wars... I don't even want to call them one or two anymore, because we're in the third one. You guys remember the story about how carrots improving your eyesight became an
(27:19):
accepted factoid? A factoid is a thing that sounds like a fact but is not actually true. It's because the British, our British cousins, were trying to cover up the deployment of radar, and, "our pilots just love carrots."
Speaker 2 (27:35):
My guy, I thought it was just because my parents
wanted me to eat carrots.
Speaker 4 (27:38):
But that's the thing, though. And it goes on, it gets perpetuated, that kind of thing, like a lot of myths do, because it has utility. That's exactly it. Yeah.
Speaker 3 (27:50):
So we could say also that, with this pattern, innovations in AI and genetic or genomic science are of particular concern, as is big data, predictive modeling, agriculture, medicine, physics, and of course the pharmaceutical industry. Uh, guys, can we
(28:10):
start with AI? Can we start with AI?
Speaker 2 (28:13):
We must?
Speaker 4 (28:14):
I mean, it's the biggest badger-in-the-bag question there.
Speaker 3 (28:18):
Yeah, it's a real honey badger in that bag of
civilian badgers.
Speaker 4 (28:22):
All right.
Speaker 3 (28:23):
True artificial intelligence has yet to occur, or at least to be publicly acknowledged. Still, millions of people and institutions across the planet are using what we call chatbots or large language models for things like therapy or assistance at work
(28:44):
or numerous other tasks. And a lot of shows, including your hopefully favorite hosts... uh, we've collectively warned about the dangers of misusing this stuff. Do you guys remember our earlier episode on AI psychosis?
Speaker 4 (29:00):
Oh my gosh, how could I forget? How could any
of us forget? It's terrifying. But I think we've talked
about too in this whole topic area, how similar it
is to the idea of casting spells, like vibe coding
prompts to generate things that didn't previously exist, and the
dangers of going too far down that rabbit hole, and
(29:21):
how it can totally drive you mad.
Speaker 3 (29:25):
That's perfect, yeah, because I was trying to figure out where we have that side note. This makes so much sense, too. Like you were saying, Noel, just searching the internet, right, figuring out your search terms, just writing code, even before vibe coding, it becomes a lot like stories of wizards casting spells. By which we mean, if you
(29:45):
know the correct words, the correct invocations, rituals, you can achieve amazing results. But like old magic, there are consequences attached sometimes. Well, especially if you say
Speaker 4 (30:01):
the coding sequence incorrectly, you know, put a character in that doesn't belong. It's the same story we always hear in fiction, the sorcerer's apprentice type stuff, when you get it just a little bit wrong and then it goes absolutely haywire.
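The "one wrong character" failure mode described above is easy to make concrete. Here's a toy sketch of our own (the names and the step cap are our illustration, not from the episode): the only difference between the well-behaved countdown and the runaway one is a single comparison operator in the stop condition.

```python
# Toy illustration of the sorcerer's-apprentice failure mode discussed above:
# one character's difference in the stop condition decides whether the loop
# halts or runs away. A step cap stands in for "forever" so this stays safe.

def run_countdown(n: int, stopped, max_steps: int = 1_000_000) -> int:
    """Decrement n by 2 until `stopped(n)` is true; return steps taken."""
    steps = 0
    while not stopped(n) and steps < max_steps:
        n -= 2
        steps += 1
    return steps

safe = run_countdown(7, lambda n: n <= 0)   # robust test: halts after 4 steps
risky = run_countdown(7, lambda n: n == 0)  # 7 skips past 0, never equals it
print(safe, risky)  # 4 1000000 -> the "==" version only stops at the cap
```

With an even starting value either condition behaves; an odd one is the character that doesn't belong.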
Speaker 3 (30:15):
Yeah, like the old bar joke of a bartender having a genie, and he pulls out a tiny guy who plays a miniature grand piano, and he says, do you really think I wished for a twelve-inch pianist?
Speaker 4 (30:32):
That sounds like fun.
Speaker 2 (30:34):
The weird thing about AI and all of this, you know, these versions of technology and all this stuff... the very weird thing to me is that it's all occurring on this black scrying mirror. All of it. And unless the program that I, or all of us, are scrying with is physically attached to some mechanism that can
(30:57):
actually do something physically in this world, it's all happening in this imaginary place. It's not a true, real place. It's like a whole portal to another world where anything is possible, and this thing can do anything you want it to, but it's not actually going to affect you or anyone around you, unless it gets in your head.
Speaker 3 (31:17):
Yeah, yeah, unless it gets in your head. And depending on the user, because another rule of magic is that your mileage may vary, right? Just do a, like... to Matt's point, just do a quick search on the real-world consequences of AI hallucination, as well as what they're calling AI psychosis.
Speaker 4 (31:39):
Well, and to maybe take it a step further, there is a world where it is connected to the real world, say, using it to run weapons systems, for example. And then all of a sudden, all the groundwork that we made that was completely conceptual is now terrifyingly real and no longer in our hands, much like the out
(31:59):
of control, you know, mops and buckets in the Sorcerer's Apprentice.
Speaker 3 (32:03):
Right, shout out to Mickey. Yeah. And see also our earlier comments regarding the massive storm around plagiarism and intellectual property. And there's this cartoonishly annoying tendency of people pretending they made a thing, taking credit for something they didn't really make. And at this point I just got to say,
(32:25):
I think we all have to agree: God bless the teachers and professors who have to deal with, diplomatically put, paradigm changes in things like their students' attention spans, cognition, retention. I mean, it's fascinating from the lens of folklore, and now I want to get to your point, Noel. But the thing is, this interaction, through the lens of
(32:49):
folklore, is increasingly similar to the old stories of summoning demons and angels and djinn, right? With the right rituals, like downloading your app, joining your chat session, doing your prompts correctly, you have a thing at your beck and call, similar to Solomon of old. It can appear to be
(33:11):
a sentient entity. It passes the Turing test for a lot of people. It works for, or is enslaved by, you. It's hidden knowledge, that's what it provides. To Matt's earlier point about it being at first in a separate, digital, non-tangible world, you still get the hidden knowledge, like
(33:32):
the legend of Faustus. But for the institutional or state-level actor, we are getting incredibly close to programming or evoking things that don't just reveal knowledge, they can take direct action.
Speaker 2 (33:49):
Oh dude, there you got. We got there under and
the knowledge that the jin contained within here for just
going along this path. If employed correctly, it can perform
surgery but incredibly well right. It can maximize the output
of specific energy systems, and if you do the right
(34:10):
things with it, it can model out new vertical farming techniques
that might save humanity. But it can also, it can also freaking, you know, destroy all of us, or us individually, the way a jinn can, you know, have those effects. Again,
there are many different types of the smokeless fire ones
(34:31):
out there, but there are versions where that entity possesses you in a way and you become obsessed with it and these other things. I'm just... it matches up way too well with this jinn thing. I don't know if it's the phones, the AI, or maybe the combination
of those together.
Speaker 4 (34:52):
It's all of it is right, It's all And the
question then becomes like, are we working in reverse to
achieve those goals? Are we modeling this stuff after the
arcane tools of old? Is it just somehow embedded in
us as human people? And that's just sort of what
technology becomes inevitably. I don't have an answer to that either.
(35:14):
I'm just kind of positing another fun thought experiment, like the scrying mirror, Matt. Like, the fact... is it a coincidence that the phone is that black mirror?
Speaker 3 (35:25):
I would posit it's not. It's just an escalation of the things human civilization always wanted, had imagined. Exactly right. Right, right. So autonomous drones are on the way
way in all levels of war, and depending on the software,
they can potentially behave a lot, again, religion and lore aside.
(35:49):
In practice, they can function a lot like demons or spirits summoned by a witch, a wizard, a warlock, what have you, a sorcerer, to recap, from afar. And I would say we should go ahead and add Waymo in there as a cherry on top. It's a thing you can summon, similar to a magic carpet, taking you
(36:12):
places, so long as you can pay. So long as you can... so long as you can pay for the consequences.
That's a caveat in old magic, and it's a caveat that magic shares with the modern economy, right? It just...
Speaker 2 (36:31):
Trying to get across the river.
Speaker 4 (36:32):
Styx. Ben, give me them coins, put them right now.
Speaker 3 (36:36):
Give me my two NFTs. Where my two NFTs? I'm not going, I'm not going to the... two bitcoin, please. Yeah, unless they know I'm a fan of Bored Ape. Yeah.
I mean, it doesn't stop there, right? This is exciting stuff, and terrifying stuff. The kissing cousin of the
(36:57):
large language model is the predictive model. This has been
a subject of intense research for far longer than the
public knows, and for far longer than the people in
power will likely admit.
Speaker 4 (37:13):
Now.
Speaker 3 (37:13):
I know, a few years back or over the years
in the course of this show, we have lightly alluded
to different old professors who were able to build out
statistical models of countries, right, like Afghanistan or something.
Speaker 2 (37:30):
Well, we've talked about the video game concepts that
have been created and put out by the military in
conjunction with private companies as ways to build intensely sophisticated
battlefield scenarios. Right, But then combining that with all the
other stuff, Ben that you've mentioned in the past, it
(37:51):
really does feel like at this point we can have
that whole, I can't remember what movie or television show or whatever it is, the version of a tiny model
Earth that is just represented in a large oval room
and you can look down and see what is to
come on the actual Earth.
Speaker 3 (38:12):
Yes, what is past, what is passing, or what has
yet to come?
Speaker 4 (38:16):
Well, is that kind of modeling, at a high enough level of effectiveness, that indistinguishable from telling the future, theoretically?
Speaker 3 (38:27):
That is foreshadowing what is past, is passing, or yet
to come. Right now, we are going to dive into
the next myth that is becoming reality. Oracles. We're back.
We're not talking about Oracle the company necessarily that we
(38:52):
all know. Big tech companies have a habit of naming themselves ambitiously, that startup culture. What you have to understand, folks,
is wherever you live, whatever you do, some sort of
entity is currently monitoring your activity, and on some level
(39:12):
it is attempting to scry your future attitudes and actions
to predict them, right. And now, to be fair... bleep me here, Dylan. Again, sorry, I'm cursing a lot on this one, folks. To be fair, people have been trying to do stuff like this forever.
Speaker 2 (39:32):
Dude, Kroger has been doing it since I've been able
to buy groceries.
Speaker 3 (39:36):
Since you got your Kroger card, right, that's correct?
Speaker 4 (39:38):
Yeah? Uh, get those sweet, sweet discounts that don't really exist.
Speaker 3 (39:44):
Target. Yeah, shout out to Big Brother in Your Grocery Store. Check out that episode, uh, and shout out to Target, a surveillance company disguised as a department store.
Speaker 4 (39:58):
They're very good, they're very... don't ruin Target for me, man.
Speaker 3 (40:02):
I'm sorry. They ruined themselves.
Speaker 2 (40:04):
Man, their campaign contributions.
Speaker 3 (40:07):
No, yeah. Ask yourself how you opt out of the information that is gathered as soon as you walk in. And if you want to have a real fun time, guys, at Target, in a place where cannabis is legal that also has Targets: take an edible, walk in, you know,
(40:28):
after your edible kicks in and look up at the ceiling.
Tell us how many cameras you see? No, seriously, they
make so much money off surveillance tech. Anyway, this level
of surveillance and prediction, it is real. It is a conspiracy.
(40:48):
It is happening to you. This thing becomes more focused
or scopes in further on individuals who live in developed countries,
as well as what you would call the current VIPs
of humanity. You know, the Putins, the Musks, the Thiels, the technocrats, the business tycoons, the high level politicians,
(41:10):
the international criminals. But if you live in Somalia, these
models are going to focus less on you as a
specific person because they have less information. They're going to
focus more on what Asimov would call a psychohistory approach, like,
how do these large scale regional variables change. Yeah, but
(41:33):
we don't live in Somalia. If you live in a highly developed, highly surveilled environment like, say, China, the United States,
Western Europe, or an authoritarian regime, then you specifically are
monitored at a level you may not understand. I think
we saw that just a few days ago the news
(41:54):
published I think it was March twenty six. Again, we're
recording on the thirtieth, that simply using a VPN or a proxy network may subject you to a new level of NSA spying. Did you guys read about this? I know, I know, dude. But, uh, our buddy Snowden, before
(42:18):
he goes to Russia. Uh, he warned the United States
and the developed world about this.
Speaker 2 (42:24):
Uh.
Speaker 3 (42:24):
To paraphrase the old video game... let's see if this meme works. To paraphrase the old video game: all your information are belong to us.
Speaker 4 (42:34):
Do you guys remember that? Belong to us? Right? Yeah?
Speaker 3 (42:38):
Yeah.
Speaker 4 (42:39):
The synthesizer company Moog has a fun bumper sticker that I have that says all your bass are belong to us, as in bass.
Speaker 3 (42:47):
Like, dub dub dub a dub dub. Uh, we're talking... what are we talking about? Like, when we say that certain entities are collecting your information, we mean anything that
you would picture as information about you. Financial records, media consumption,
(43:07):
online activity, your medical history, your connections. Right, do you
hang out with Dylan Fagan? Who does he hang out with?
Six degrees of Kevin Bacon? Is real? Your habits, your hobbies,
your vices, your virtues. It all goes into what Philip
Larkin called the combine harvester.
Speaker 2 (43:27):
Which WiFi you auto-log into, which people auto-log into your house's WiFi, how many people live in
your house on a normal, regular basis literally every item
you've ever purchased, whether for yourself or for someone else.
Speaker 3 (43:43):
Congratulations, congratulations, folks, we did it. We made a surveillance state. Orwell would be insufferably smug at this point.
Speaker 2 (43:54):
The crazy thing is, though, for real, not kidding: outside of the facial recognition and the license plate scanners, if you just leave this dang scrying mirror, you're... you're okay. You're gonna be okay. Just get away from the facial recognition and the license plate scanners.
Speaker 3 (44:13):
Right and disrupt your patterns. That's another big issue. Also,
just use cash or gold. I don't know if we're
going back to gold or like bottle caps or peppercorns.
I can't remember what it is. At this point, I.
Speaker 2 (44:29):
Think about that general mcasslein that we keep talking about.
This is a crazy, well known human being and he
just left his smart watch and his phone at the house.
Speaker 4 (44:39):
And that dude is gone for now.
Speaker 2 (44:42):
I mean, but just, I think if you're super nervous when you're hearing all this, super nervous about the state of the surveillance, just know the key is in that weird black mirror.
Speaker 3 (44:55):
That is definitely one of the primary factors. That's also
why on this show we don't love smart cars. Just
to be honest, it's not that the technology is bad,
it's that society has yet to evolve and step with
that technology. I mean, look, if we're talking about surveillance,
let's say the quiet part out loud. Most people in
(45:17):
the United States, many of whom would even consider themselves conspiracy realists, have encountered something off with surveillance technology in the form of that black box Wild West we call
targeted advertising. Right, if you live in the United States,
the odds are overwhelmingly likely that you've felt some sort
(45:42):
of... an online thing somehow knows you showed interest in something.
Maybe you're just speaking around your black mirror, right, and
all the laws say that, hey, they're not monitoring your vocalizations.
Speaker 4 (45:58):
We must say hogwash to that which we've experienced. Have
you guys noticed that that stuff's getting a lot better too? Yes. It used to be...
Speaker 3 (46:08):
It used to be stuff like, hey, Ben, you bought
you bought a toilet, so here are a lot of
ads for more toilets, Like you're gonna look at this
and say, I just bought one, but I don't know,
why don't I treat myself?
Speaker 4 (46:23):
That's exactly right, Ben, That was an indication I think
of that. It was maybe a little bit more in
its infancy of that stuff, that level of targeting, you know,
being effective. And now I'm not even gonna lie though, guys,
there are parts of me that actually really like it. Like,
I buy a lot of things on Instagram, man, and
most of them are good. Most of them are things
(46:44):
that I enjoyed. I'm sorry, I'm just being, listen, I'm just being the stand-in for the regular folk out there. I know that it's weird, but there is part of me that feels the buy-in of it is, like, worth the increasingly invasive level it's
Speaker 3 (47:02):
getting to. I don't know, man, I'm excited for you.
Speaker 4 (47:09):
No. Look, anything we can do to help us sleep at night in this dystopian hellscape that we find ourselves in.
Speaker 3 (47:18):
Yeah, I think I love that. Yeah, that we did
collectively and I believe.
Speaker 2 (47:25):
Ignorant to what's going on, we're agreeing with you there,
like yeah, yeah, we're literally anything.
Speaker 3 (47:32):
We're collectively agreeing with an ellipsis at the end, we're
just both.
Speaker 2 (47:36):
Like yeah, and there's a there's a hidden at the
end of that.
Speaker 3 (47:41):
Yeah, yes. Write to us for the hidden: conspiracy at iheartradio dot com. Look, it doesn't stop there, because
what I love about Noel's description here is you're talking
about the targeted ads becoming better, which is part of
the surveillance state, but also feels like a win win, right,
(48:02):
a microcosm. It's because the next logical step is to
predict your predilections and your behavior. Right, So now the
ads are going to get better every time we play the game. Companies like Palantir spend so much blood and
treasure figuring out not just where you are right now,
but how often you've been there, where you go next.
(48:25):
We know this conspiracy is genuine. There are increasingly creepy
actions in the US, China, and of course the escalation
of monstrous, unclean activities in the Middle East.
Speaker 2 (48:37):
I just I get these pictures of monsters that are
just growing and feeding off of every piece of input,
every tap on these weird little phones, and just feed.
It's just eating it up, and it's just regurgitating back
at us all the things we like, just like AI generation,
just regurgitating what it thinks we are and what we want,
(48:58):
and then us just eating back that regurgitated slop, and
then we're... It is just this disgusting picture of humanity poisoning itself right now. It's just, sorry, it's just horrifying imagining that. Really well put. And that is going to be getting into war with bombs and ballistic missiles and
(49:21):
drones with explosives.
Speaker 3 (49:23):
And the next step is, you know, I love the
little Shop of Horrors comparison. Feed me Seymour, right, feed
me your information. I'll regurgitate some back to you like
a bird feeding its children. That's the next step. It
is influencing your future actions, ideally without your conscious knowledge.
(49:45):
So humanity is already doing what the oracles of old
did in mythology and folklore, but in this case, specifically guys,
humanity is moving past the abilities of those old oracular
legends like governments, consultants, corporations. They can now play around
(50:07):
with something that gets very close to the microcosmic or
Duplo version of simulation theory. I know we're all fans of video games, like something akin to Civilization or The Sims. Right,
you pour so much real world data into a thing
that what happens in the simulation what happens in the
(50:27):
game can get very close to predicting what happens in
reality over the near horizon. It's nuts. All you need
is enough information, and past a certain threshold, it becomes
possible to push not just individuals, right, I'll buy myself
a second toilet as a treat, but it becomes possible
(50:49):
to push policies, to push populations, events, large scale decisions.
This means that things like the Gulf of Tonkin become
increasingly irrelevant. We have moved from the caffeine of false
flags to the utter crack cocaine of changing the future
(51:09):
to suit our aims. And spoiler, folks, the good guys
are not really in this conversation. It's realpolitik stuff. It's hardcore, like the startup In-Q-Tel marriage, the Venn diagram of private entities and government. It's like Palantir, I mean, and
(51:30):
the Palantir of it all too.
Speaker 4 (51:31):
We had a thing that came up on a recent
conversation that really lit up a light bulb for me.
This idea of enough small scale acts of surveillance stacked
upon one another and collated and scaled then becomes just
as powerful as that high level target that you might
(51:54):
have once thought was the source or the target of
this kind of surveillance. And not even that, we were even talking about taking it a step further, and the idea of kompromat and compromising individual agents who are just civilians.
Enough of those, then you can truly steer the course
of humanity.
Speaker 3 (52:13):
I'm itching my nose, which is bad body language, because we all saw the news about our buddy Kash Patel, right?
Speaker 4 (52:21):
What, about how he's a douchebag? He got hacked, the dude. Yeah, he got... it was an innocuous hack, in that it was just pictures of him in Cuba looking like a... well, we already knew that, but what it represents, Ben,
I'm with you. I'm totally with you. I'm just saying
at this point it was just sort of an embarrassment.
But is that a person who really is capable of
further embarrassment or.
Speaker 3 (52:44):
Scuttlebutt is the Iranian actors found his, uh, adult website handle?
Speaker 4 (52:53):
Yea, yeah, it was cat what is it? Spider cash
with a dollar sign as the.
Speaker 3 (53:00):
Of course that's sick.
Speaker 4 (53:01):
Yeah, spider cash.
Speaker 3 (53:02):
Yeah, so, uh, be safe out there, folks.
Speaker 4 (53:06):
I'm sorry, guys. Hey, politically, can we agree that guy's kind of a, kind of a dweeb? Which is the worst Venn diagram of those two things.
Speaker 3 (53:13):
I worry about him.
Speaker 2 (53:16):
It is tough for me to see him there like that.
Speaker 4 (53:19):
No, no, no. You're not a dweeb, Matt, you're a nerd. That's different.
Speaker 3 (53:23):
The thing is that historically, in the FBI, you had to be qualified to be the director. What does that even mean, right? So I'm a little concerned, you know what I mean. Not the first person to be in over his head at a job, right? Everybody's optimistic on their resume, or they're...
Speaker 2 (53:41):
An actor, like Ronald Wilson Reagan.
Speaker 3 (53:48):
Phenomenal actor. Yeah, yeah, he definitely did a, uh, definitely did a turn acting as the President of the United States. We'll leave it there. And he was awake for a lot of it. Anyway, we'll leave it there. I'm sorry, guys,
(54:10):
I just resent the dude. Anyway. There's another thing. So
we have AI right functioning as the invisible spirits under
command of old. We have big data becoming something very
much like predicting the future, closer and closer every day
to the oracles of ancient times. And now we have
(54:33):
genetic science. This has changed everything: medicine, agriculture, soon enough society. We can create chimera, folks. You know, something like the old myth of the dreaded three-headed, fire-breathing monsters of Greek mythology. And when I say dreaded three-headed,
(54:53):
just to be clear, I mean there were three different animal heads, not a head with dreadlocks. People were scared of the chimera because it looked crazy, but do.
Speaker 4 (55:05):
Tread lightly with white people who possess dreadlocks. It is
a it is a real red flag. And I would
argue too, Ben. I know we're going to get into
more of the genomic science of it all, but even
plastic surgery has evolved to such a level that one
could liken it to the kinds of you know, Devil's
(55:27):
Bargains of the past, of like we could, but I
just mean the idea of like seeking beauty over all
else and how it can become this vicious cycle or
this spiral that ultimately leads to one's downfall or one
becoming sort of a Gollum.
Speaker 3 (55:44):
Yeah, that's a fascinating point, Noel, because it is transformative, right? Just like someone who takes a potion from a spellcaster or a witch or an artist of the magical persuasion, and they take a potion to make them look younger.
Speaker 4 (56:03):
But then they take three of the potions because if
one is good, then three must be great. But then,
of course, all of a sudden they become a baby
or some sort of you know, wet, withered hag or
you know what I mean.
Speaker 3 (56:14):
There's always the drawback, dude. Yeah, there are always the consequences.
I'm laughing because I love that your go to was, uh,
then they become a baby.
Speaker 4 (56:23):
Well, I'm just saying I want to I want to
go back that far, you know.
Speaker 3 (56:28):
Oh yeah, yeah, that's awesome. Yeah. But, so, we're making... current civilization is making real-life chimera, in that now scientists and researchers are able to mix, agglomerate, aggregate, combine cells from two or more distinct organisms or species.
(56:53):
They can inject stem cells. Right, that's our philosopher's stone
in this conversation, the stem cell. They can inject stem
cells from one species into the embryo all another. And
there's been so much headway made in this in just
the last few years that we the public know about.
And spoiler folks, we the public have no idea what's
(57:16):
going on in those top secret labs.
Speaker 2 (57:19):
No, we also don't fully understand mRNA and DNA, like vaccines and gene therapies, CRISPR, things
Speaker 3 (57:29):
That... ooh, I love CRISPR. I love how they took the E out and saved us some money on this, because it's extreme.
Speaker 2 (57:35):
Same. Just this concept that DNA and RNA in creatures, and old stem cells, are being transformed into potentially new materials and creatures. But I just, I keep thinking about, well,
what does that mean for you know, the private sector
when it comes to how we're going to be affected,
(57:56):
or the kids, or is there going to be some
new way to inject something that's going to make all
of us better, you know, this new thing that's going
to change our DNA enough to where we can acclimatize to the warmer temperatures better, and all this other stuff, these things that are probably on the way towards us.
Speaker 4 (58:18):
And it's all because you can pay the blood price.
Speaker 3 (58:21):
Right, let me sell you a door, folks. We've made
some real headway in this field since twenty seventeen, not
again your hopefully favorite podcasters, but human society overall. Researchers
have been able to cure mice of diabetes by growing
(58:44):
pancreas organs from mouse stem cells inserted into rat embryos
and then transplanting those organs into the diabetic mice. Ethical
quandaries aside, pretty interesting stuff. Humans have also injected human
stem cells into pig embryos. That was way back in
(59:07):
twenty twenty one, but the research hasn't stopped. We talked
about it in our Strange News program in twenty twenty two.
The first successful transplant of a genetically modified pig heart
into a human patient occurred. This guy was fifty seven
years old at the time, David Bennett, Senior. He lived
(59:29):
for two months before experiencing heart failure, which, I understand, doesn't sound like a big win. But let's remember Wilbur and Orville Wright at Kitty Hawk. They didn't fly over the Pacific immediately, right? They just got off the ground.
This is getting off the ground. It's a weird comparison,
(59:50):
but yeah, I'm gonna stay with it. I'm gonna stay
with it.
Speaker 2 (59:53):
Yeah, I like the Kitty Hawk of it. Guys, just speaking of something else that's getting off the ground: I
found this in the research for this week, and this
is a thing I think we need to talk about
in full at some point. I just sent a link over there. It is Hyperstealth Biotechnology, and guys, it
looks like a thing that we talked about before with
some of our military technology explorations. But this looks to
(01:00:17):
be some type of film. It almost looks like a window pane, but a more flexible pane of some type
of substance, and it appears to obscure anything that is
in the center of it, so the light travels around
it and it looks like predator vision stuff.
Speaker 4 (01:00:35):
Cool.
Speaker 3 (01:00:35):
So is it similar to those earlier experiments a few
years back with something like an invisibility cloak?
Speaker 2 (01:00:43):
I think this is one of the same companies that
was pioneering this stuff. They claim to have all kinds
of contracts, but it also looks like a kind of
a crappy website, so it's hard to know. Like, we judge companies and their veracity often on how good their app or their website is; at least I know I do. In this case, there's just a little,
(01:01:04):
you know, it could do with a revamp, to clarify.
Speaker 4 (01:01:07):
When you say predator vision, you're talking about the shiny
invisibility the way we see the predator when he's blipped,
and it's sort of there's a reflective haze light warping.
Speaker 2 (01:01:18):
I apologize. More like the cloaking device utilized by the Predator rather than how the Predator sees. Ah.
Speaker 4 (01:01:28):
And there's no reason that that shouldn't exist. We know
that there's much more low tech versions of that kind
of cloaking, like dazzle camouflage, you know, on warships in
the past and mirrored houses for example, or various ways
of employing optics to achieve that kind of goal. So
that is absolutely not beyond the realm of expectation that
(01:01:49):
something like that could be like fine tuned.
Speaker 3 (01:01:50):
It could absolutely happen. It's already happening in experimental phases. It's very much on the way, just like designer babies, which we don't have to relitigate today. I know not everybody loves the term. Exactly. Gattaca, one of my favorite movies, guys, please watch it. This is stuff like preimplantation genetic diagnosis.
Speaker 2 (01:02:15):
Oh.
Speaker 3 (01:02:15):
Also, the fancy word for putting non-human animal organs into human animals is xenotransplantation. But anyway, PGD, preimplantation genetic diagnosis, is a process through which doctors and scientists can analyze multiple human embryos. They can
(01:02:38):
identify genes that might be associated with specific diseases or characteristics, and then select the embryos that match their checklist, their wish list, their bucket list as parents. So you can, you can do this in an embryonic stage. You can also do this through technologies like what we mentioned earlier, CRISPR, directly
(01:03:02):
editing the genome before birth. So, cue Gattaca. This is another early-stages Kitty Hawk kind of thing, but there's a lot of potential for disaster. There's a lot of potential for awesome stuff. We've got to mention it. You guys remember the case of the Chinese renegade scientist He Jiankui?
Speaker 4 (01:03:26):
No sirh recloning.
Speaker 3 (01:03:29):
He was futzing with a lot of stuff. I'm being so cartoonishly diplomatic. He implanted these genetically edited embryos into two women with the assistance of at least two colleagues, and they had modified
Speaker 4 (01:03:47):
This key gene.
Speaker 3 (01:03:50):
In a way that they believed would make people HIV-resistant in life. The big issue is that this modification is against the law in China. It also could be heritable,
by which we mean it could be a modification or
(01:04:12):
a remix or a tweak that can be passed on
to the descendants of any children these humans have. This, again, was already against the law in China. So it turns out
our buddy He forged ethical review documents and lied to the doctors who were doing the procedure. This prompted a deeper conversation
(01:04:36):
about the ramifications of making inheritable genetic tweaks. Ultimately, this
guy gets three years in jail and he gets fined the equivalent of four hundred and twenty nine thousand US dollars, or three million Chinese yuan. It's a pickle, though,
because he definitely managed to do it in secret. He
(01:05:00):
might not have been caught if he didn't go to
a conference and stun everybody by saying, I figured out
a way to stop the HIV crisis.
Speaker 2 (01:05:11):
Well, that's amazing.
Speaker 4 (01:05:14):
Theoretically, yeah. But also know that some positive things came from, like, Nazi experiments, so the ends don't necessarily justify the means. Maybe... some might argue that they do.
Speaker 3 (01:05:26):
Yeah, maybe because they're not living in the time when
those abuses occurred.
Speaker 4 (01:05:31):
Right, They're not on the other end of them, like
in terms of being on the table or being implanted with
Speaker 3 (01:05:35):
The thing exactly.
Speaker 4 (01:05:37):
I mean.
Speaker 3 (01:05:37):
Also, look, this is becoming an arms race, right? There's a drone arms race. Now there's a quote unquote AI arms race. There is a real possibility of a genetic modification arms race on the horizon, and this, this
(01:05:57):
stuff, is it inevitable? Right, not to sound like Thanos or whatever. But gene editing technology exists, and different countries are scrambling to figure out how they can address this technology, to get in front of it through legislation.
Speaker 4 (01:06:16):
You know, it ain't for nothing that we're talking about super soldiers and the supermen and all of this stuff throughout history. And of course, the moment that stuff is available, whoever is able to wield it is going to go for it, right? There's no, like, you know, moral high ground of 'no, should
Speaker 3 (01:06:34):
we?' We always do when we can, you know. I mean, if everybody else is doing it, why can't we?
Speaker 1 (01:06:42):
Right?
Speaker 4 (01:06:42):
Killer Cranberries record. An absolutely appropriate assessment of the situation.
Speaker 3 (01:06:47):
I mean, look, in the US, for example, federal funding currently cannot be used to create certain types of chimeras, including non-human primate embryos with human stem cells. So
despite the fact that the United States so often disagrees
with itself, we have a law that says no humanzees
(01:07:10):
past a certain number of days in the embryonic
Feels like a good move, feels like, you know, that's
a good hustle, right, Like, not to be all big
government about it.
Speaker 4 (01:07:24):
I'm controversial, not controversial.
Speaker 3 (01:07:27):
Let's figure out the Homo sapien primate before we start baking new
Speaker 4 (01:07:32):
ones for funzies. Yeah, that's just reasonable.
Speaker 2 (01:07:35):
That sounds perfectly reasonable. Guys, can I bring one more thing up? Because I want to, I want to
have just one little tiny piece of positivity for my
own brain right here. Yes, please, just quick. It has
nothing to do with what we were talking about, But
I keep thinking, I keep seeing this picture of all
of these incredible things and potentially dangerous things ben that
(01:07:55):
we're talking about, right? The future, with all these technologies, and what could happen if, oh, it warms another couple degrees.
But it's increasingly feeling like we're not gonna make it
that far, just with you know, the number of existential
crises we face right now and the ones that we
at least appear to be on the brink of. There
is a group of students at MIT that created this
(01:08:17):
thing called a passive atmospheric water harvester.
Speaker 4 (01:08:21):
Wow.
Speaker 2 (01:08:22):
And we have talked before on the show about water wars,
the intense struggle to make sure there's enough water on
the planet, because if you don't have water, you can't
do any of these genetic experiments. You can't do any
of the AI stuff, right, we know they need water.
This is an incredible thing that is a window sized
panel of this new material that's the invention, this new
(01:08:45):
water absorbent material, and it captures water out of the air.
They successfully tested it in Death Valley and it just
grabs moisture out of the air and it turns it
into completely pure drinking water. I'm just amazed, guys. A world where technological advancements like that... which is kind of just a... is that a textile advancement or a,
(01:09:07):
I guess that's what you would call that, maybe material
science advancement, this hydrogel stuff, because that's ultimately what it is.
It absorbs water out of the air and then when
it gets filled up, it pushes the water basically out
and that's the stuff you can drink. And it's just
a huge panel of that stuff. It's just to remind
us that there are crazy positive things that science and
(01:09:30):
technology and innovation advancement can do, like give us all
access to water that is all around us at all times,
and we wouldn't have to pay for it. Really, once
you got one of these things, it takes zero power
to run, no power source you just set it up
and now you've got drinking water. Incredible.
Speaker 3 (01:09:48):
Yeah, check out our episode on, is it called, Is There Enough for Everyone? Because this is positive. It reminds
me before we get back to the gene arm race.
It reminds me of that point that I think we're
all becoming increasingly obsessed with. I certainly am technology can
(01:10:09):
evolve at a breakneck pace, but that technology is
only as useful as the society that evolves to recognize it. Right,
So free water can exist, and it should exist. Water
should be a human right. Sorry, Nestlé, you're wrong on that one. Water, access to basic needs, should be a
(01:10:30):
human right. Society has not evolved to acknowledge that.
Speaker 4 (01:10:33):
Well, and then it becomes and I know we're going
to get back into the gene stuff. But to piggyback
on what Matt's saying about the resource extraction and energy,
you know, quandary of it all, like there are our
society in particular seems to be very hesitant to embrace
new forms of magic, newer forms of magic, like trying
(01:10:56):
to get away from fossil fuels, and that's all incredibly intentional,
or so it would seem, a party line that's
being pushed from the highest echelons of, you know,
the corporate world and government, those who have the most
to gain from staying the course and relying on
things like fossil fuels, which, as we're seeing, throw
(01:11:18):
world events into disarray right now with all of the
Iran stuff. And yet we see other countries doing quite
well using wind energy and hydropower and stuff like that.
And yeah, we seem to be wanting to, you know,
go in a different direction for not the best reasons,
not just the betterment of humanity and the betterment of
life on earth, but for all of these kind of
(01:11:40):
shady other reasons that involved you know, certain people getting
paid right.
Speaker 1 (01:11:45):
Yeah.
Speaker 3 (01:11:45):
Our academic or theoretical term for that is path dependence,
getting locked into certain technological pathways or sociological paths because
of the status quo or because of the earlier decisions that
were made. So maybe don't let elderly villains run
(01:12:07):
the world. Anyway, We've got something we've got to get
back to. As we're ending, I can't remember
which of us mentioned the more dystopian idea. Going back
to our point about this technology becoming inevitable:
right now, China is the most publicly welcoming nation for
(01:12:30):
genetic research, the kind of genetic research we're discussing that
creates chimeras. Once one country decides to bend or break
those what Noel calls toothless international treaties, if they create
an army of super soldiers, every other single country is
going to have to figure out some kind of response.
(01:12:52):
If we go full sci fi, shout out Gattaca, we
may see a new stratification of society.
Speaker 4 (01:12:59):
I was reading this.
Speaker 3 (01:13:00):
I've been reading this awesome book series. I can't recommend
it enough. It's called Red Rising, and it's about humanity
expanding out into space and creating a system of genetic discrimination. Literally,
your role, your caste, your class could be hardwired into
(01:13:21):
your body. Members of a new upper class could be
defined by their excellent genes, their immunity to certain diseases
or environmental conditions, and most importantly, the ability to pass
those genes on to their children. And that makes us ask
what happens to the regular folks? You know what I mean?
The teeming masses of the world already have a lot
(01:13:42):
to worry about without this technology becoming magic and harming them.
Speaker 2 (01:13:49):
We're just unnecessary.
Speaker 3 (01:13:51):
Well, the meatbags might not be necessary, but we're a
lot of fun, right. The chatbots say we're a lot
of fun. They think our ideas are always universally great.
Speaker 2 (01:14:02):
Yeah, man, giving us that little hit of validation that
we so crave.
Speaker 3 (01:14:08):
I love validation.
Speaker 4 (01:14:09):
It's like the one thing that I would not be
able to give up.
Speaker 3 (01:14:13):
I would quit coffee today if it meant I have
more validation.
Speaker 4 (01:14:16):
Do you never use the validating Yeah?
Speaker 2 (01:14:21):
Thanks, Bro, that's a great idea, Ben, Why don't we
get started. I can make you a ten point plan.
I got to be more validated, to.
Speaker 3 (01:14:29):
Be more validated, there's so much more ahead, folks. We
hope you are having as much fun as we are
with this trippy thought experiment about technology.
Speaker 4 (01:14:41):
Start fun, legendial.
Speaker 3 (01:14:44):
We had some positive stuff. Oh, speaking of getting dark right,
like the old Gene Wilder Willy Wonka adaptation. There's
no earthly way of knowing which direction we are going.
Stay tuned for our next episode. Will technology help us
bring back the dead? More importantly, should it?
Speaker 4 (01:15:03):
There's a moment when Ben and I were talking with
our friend Gandhi on Sauce on the Side where this
very topic came up, and it is it's very interesting.
I do recommend giving that one a listen.
Speaker 3 (01:15:17):
Yeah, so join us for Sauce on the Side, I
think we'll be hanging out with Gandhi more often in
the near future. Matt, you're going to be coming along
with us. I hope maybe we got you man come on,
and we hope that you come along with us as well.
You are our favorite part of the show. Whether you
(01:15:38):
are human, an Eldritch entity of old, or a new
up and coming chatbot, we want to hear from you.
So find us online, call us on a telephone, and
you can always send us an email.
Speaker 4 (01:15:51):
Oh and if you're into Eldritch type stuff, I highly
recommend the new DLC for Borderlands, for it is eldritch
AF and a good old time in a, you know, looty
shooter environment. If you want to reach out to us,
let us know what you've been playing, what sort of
gene editing you're into or not. You can find us
(01:16:12):
on your social media platform of choice at the handle
Conspiracy Stuff or Conspiracy Stuff Show.
Speaker 2 (01:16:16):
Hey, are you Zach Galifianakis or one of his agents
at UTA? Why not call one eight three three STD
WYTK and set up a time to hang out with
us so we can talk about the gardening show. We
want to do that. If you're not Zach or one
of his agents, why don't you call the number, give
yourself a cool nickname, and let us know what you
(01:16:37):
think about this episode, or maybe give us an idea
for another episode. Any of it. You might find it
on one of our listener mail episodes. If you want
to send us an email, you can do that too.
Speaker 3 (01:16:47):
We are the entities that read each piece of correspondence
we receive. Be well aware, yes, I'm afraid, sometimes the
void writes back, and your email could end up in
our weekly listener mail segment, which informs our episodes
in the future. Ask us for a random fact, or
give us a random fact, we'll give you one in return.
(01:17:07):
For instance, did you know that, prior to being divided
into two countries, the nation of Sudan had more pyramids
than any other country in the world. We're talking two
hundred and fifty five compared to far fewer in Egypt.
We'll see you out here in the dark conspiracy at
iHeartRadio dot com.
Speaker 2 (01:17:46):
Stuff they don't want you to know is a production
of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.