Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
He must have felt
really badly when Chow Yun-fat
got a series in America.
Speaker 2 (00:04):
Gentlemen, let's
broaden our minds.
Speaker 1 (00:07):
Are they in the
proper approach pattern for
today?
Negative, negative.
All weapons! Now charge the lightning field, man.
Speaker 2 (00:33):
Part four, huh? At least, at least part four. Will this be the end, Skip? Is it the end of artificial intelligence for us, you and I?
Speaker 1 (00:42):
It is. It's at the end, my friend. Will it finally take over?
Speaker 2 (00:46):
You don't even need
us.
Yeah, man, we are obsolete.
Let's talk about them, their robots, now. We could go on and on, days and days, weeks and weeks, years and years, before the heat death of the universe even happens. Just talking about robots and artificial intelligence in
(01:08):
popular culture specifically, mostly movies, books. Jesus, we'd be there time immemorial.
Speaker 1 (01:16):
It would be literally
the entire premise of our
podcast.
Speaker 2 (01:19):
Hey, we're moving and shaking, folks. All right, we have more things to talk about, so we can't just spend forever on AI. But there were a few more things we wanted to discuss. We wanted to get out there, we wanted to lay it hard on wax, as they say, that nobody says.
Speaker 1 (01:37):
We have a few more things to say, and by that we mean the main premise of the episode we were trying to do. That turned into many episodes.
Speaker 2 (01:46):
You know, AI is an evolving topic and subject. True, it touches our real lives much more than Rocky does, and there's a lot more philosophical and theoretical implications, much more than Highlander. So I think there was room to talk about this, and it was
(02:07):
something that was both vital and, in our minds, necessary. So let me just get us off the hook there. So now, completely apropos of nothing: droids in the Star Wars universe. Now, this is a fantasy world, however much technology seems to be prevalent in it, but I think droids are maybe one of the most prevalent presences of AI in mass pop culture.
(02:33):
Everybody knows Star Wars, and by knowing that, you know about at least C-3PO and R2-D2, and if you're a little younger than us, you definitely know about them. Droids, like the droid army that fight on, I don't know, Geonosis, and, is it Kashyyyk? Or the Trade Federation, and blah, blah. I don't even know
(02:53):
where those came from. Are they trade droids? You know what? Did they
Speaker 1 (02:57):
make those? Those movies don't make any sense or matter. So, I mean, sure. I mean, yes, I think they're by the Trade Federation, and then... that's...
Speaker 2 (03:11):
I mean, that's smart for them to make these droids and then, I'm assuming, sell them to their partners, because they seem to be fighting everywhere. Roger, roger.
Speaker 1 (03:21):
Yeah, I think they're Trade Federation built. I don't know that they... well, they're paid for. I don't know that they built them, but they never tell you. Either way, I mean, the only real origin thing that you get into is the Jango Fett shit in that movie.
Speaker 2 (03:35):
Right, but that's all about the clones, cloning and the Clone Wars.
Speaker 1 (03:39):
Lucas never actually gives you anything. I don't think they were explored, even in the Genndy Tartakovsky stuff or in the ancillary Clone Wars stuff. I don't think they ever tell you any of that. Probably not.
Speaker 2 (03:52):
I mean, it really doesn't matter. It was a passing thought, but really what I'd like to talk about is the weird depiction of droids in Star Wars.
Speaker 1 (04:02):
So bizarre. It's so odd.
Speaker 2 (04:04):
Again, a real Star Wars podcast probably knows many more examples, has a much more intricate understanding of the moral implications of droids and how they function in the plethora of societies in the Star Wars universe. But what we are exposed to are these seemingly conscious beings
(04:29):
that also maybe are not, maybe just run their specific programming, but they also definitely seem to be in constant slavery to everyone, except for the few times when they are given free rein to live their lives.
So I'm trying... Okay, like 3PO and R2-D2 are definitely slaves.
(04:52):
They are bought and sold. They are owned by, well, Jesus, uh, we don't need to go through that whole pantheon. But in the canonical, when we first meet them, which would be A New Hope, episode four, however you want to think of it, they were owned by Leia and then are sent on a mission, captured by Jawas, then sold to Luke and his uncle and
(05:19):
aunt, and then... Hold on a second. Weren't
Speaker 1 (05:22):
they owned by Captain Antilles, and Leia just reprogrammed them?
Speaker 2 (05:27):
That's true.
Yes, that was Captain Antilles.
Speaker 1 (05:30):
Which I learned mostly from the Droids cartoon TV show.
Speaker 2 (05:36):
That was a thing. But if you own a droid and then someone else is reprogramming that droid, how does that work?
Speaker 1 (05:44):
Right? Is that legal? Trying to figure out how these things work technically in Star Wars is a fool's errand, I think, in general, but we should go down this rabbit hole. They were owned by Captain Antilles, reprogrammed by Leia, and then they found their way to Luke. I don't really want to factor in the idea that Anakin created
(06:07):
those droids, because it's just stupid and it makes Empire not as good.
Speaker 2 (06:12):
There's so many problems there. But despite their lineage, what really crumples me up is how do people interact with them? How are they used? Are they just used? To me, that's almost like they're pets. Okay, sometimes they see them as friends, as something to care for, but they
(06:32):
don't ever free them from bondage. We see that they can live their life when we get to Empire. We do see an android bounty hunter, right? IG-88. Yeah, I used to have that toy.
Speaker 1 (06:45):
I loved that fucking toy. That was so cool. But I lost... yeah, I lost the fucking breather thing that attached to him. The breather thing? I'm thinking of a different droid. Never mind, it doesn't matter. So many, so many droids. There are a lot of droids in Star Wars, just like everything in Star Wars. Plus, maybe for a different podcast.
Speaker 2 (07:06):
But Star Wars is, I think, a real genesis for pumping out content. 100%. For marketability and saleability.
Speaker 1 (07:19):
They weren't the
first, but they were the most
prolific.
Speaker 2 (07:22):
Yes, I think they were a test case for: let's give everything a name, a title, turn it into a toy so that we can sell it. Which has become rampant thereafter, jumped off in mass pop culture, I think, with Star Wars. I think you and I both recognize that that is one of the biggest problems with comic books and with, you know, movies in that sense in general.
Speaker 1 (07:48):
But Star Wars really did, you're right, set that definition. That is the baseline. I mean, you had Evel Knievel before that and you had the Six Million Dollar Man, things like that. But Star Wars really is the reason that you have the economic structure for IP merchandising today.
Speaker 2 (08:11):
That is why we know, like, IG-88. Are you making things to tell the story, or are you making things to sell a lunchbox? Be that your BB-8 or whatever cute little baby Yoda thing. You can look at yourself and say, is this the right way to do things? Maybe not, but my bank account says that I'm going to keep
(08:34):
doing it. So just because you find something immoral doesn't mean you're not going to do it. Yeah, that's true. Speaking of, I'm sure there's a lot of people who find the behavior of, I guess you'd call, flesh beings with the droids in the Star Wars universe possibly reprehensible, but you don't really hear a lot about that.
(08:54):
There is a galactic constitution which declared all sentience equal. It decried memory wipes, maintained to eliminate personality quirks, and questioned why they were recommended if droids truly lacked personalities. The movement, the Droid Rights Movement, also considered the use of restraining bolts a form of slavery and wanted the practice
(09:15):
outlawed. Sure, fine. None of that actually shows up in any of the movies, except for Solo, where you have L3-37. The one that no one likes. Who is in favor of... yes, yeah, well, not the only one that no one likes, but one of the many Star Wars movies that no one likes. Sure, at least it does have the question coming up.
(09:37):
Here's a droid. It's a hodgepodge droid built from different parts of other robots or mechanical beings, and it has a goal to free robots from slavery. But essentially all droids in Star Wars function on some type of slavery basis. You know, whether they're sentient or not, I think that's
(10:00):
debatable, but perhaps that doesn't matter. You know, perhaps we should focus less on sentience, and what matters more is the significance of the relationships that people form with them. I'd just say, is this robot sentient matters less than is this robot my friend, my colleague, a part of my family?
(10:21):
Perhaps the question about relationships and whether they meet personhood is less important than if these relationships are sufficient to grant an important kind of moral status, based inherently on the relationship forming and how they are viewed.
Speaker 1 (10:38):
You mean like the droid in Rogue One?
Speaker 2 (10:40):
Rogue One. I don't remember the droid in Rogue One. The...
Speaker 1 (10:42):
Alan Tudyk droid, who
sacrificed himself so that they
could execute the mission.
Speaker 2 (10:47):
All the protagonists in these movies seem to have a moral relationship with their droid, up until it comes to a point where they want them to do something. They take on the role of, like, the
(11:08):
morally superior being, to sacrifice themselves in some way to acquiesce to their quote-unquote owner's wishes, not because that's what they're programmed to do, but because it's what they've chosen to do for the good of this relationship and the benefit of their owner. Whether that's R2 constantly putting himself at risk to aid in the situation. That's L3-37. That's the Alan Tudyk robot. 3PO constantly acquiesces to
(11:32):
whatever.
Speaker 1 (11:38):
3PO is an interesting case because he doesn't fit into that archetype. He exists on his own, to survive. I make no moral judgments on 3PO for his actions, but he doesn't fit into some of those structures, although he does do things.
Speaker 2 (11:51):
I mean, like when the Ewoks capture the band. Oh, he does lie to the Ewoks, pretends to be a god. That's true, even against his wishes in a way. I mean, he kind of needs to be convinced of it, and he gets R2's advice. But C-3PO has also, I think, at this point, kind of become a classic archetypal character: the bumbling best
(12:15):
friend who is a consistent coward in almost every situation.
Speaker 1 (12:20):
Oh yeah, I mean, he's
a vaudeville character.
Speaker 2 (12:23):
But it's also, like, hard to point out. You know, I think there's that famous Eddie Izzard bit where he, like, compares Scooby and Shaggy, yeah, as these protagonists who are constantly running away from any danger. And you just don't get cowards as your protagonists very often, right? Even in a full secondary or tertiary situation. And C-3PO
(12:44):
is one of those. It's very rare that he stands up to anything or takes action to save the day, but that's also not his role. And we come to the question: is he programmed? Is that self-preservation within his programming? Is that a personality trait of a developed sentient being? It's fascinating. I don't know.
(13:05):
But again, like, none of these are given any weight or thought. Maybe in the comic books, in the novels, maybe in further series. I bailed out on Star Wars a long time ago. So yeah, I mean, other than the main stuff that rises to the surface, I don't partake. It's not for me, and that's fine. It can be for whoever. I like what I like, you can enjoy
(13:28):
what you like. I'm not gonna be like, hey, nobody should watch Star Wars. Hey, watch all the Star Wars you like. Obviously the prequels are not for me, but you know, the generations coming up, they love the prequels. I disagree, but I also don't see the point in arguing about it. You know, yeah, go ahead and like what you like. Well, yeah, it's not hurting me at all. I mean, Star Wars... this is... we could get into, like...
(13:50):
I get it. Yeah, the many problems of Star Wars. That's a whole other podcast series if we wanted.
Speaker 1 (13:55):
Trust me, as a Star Trek fan.
Speaker 2 (13:57):
I understand, we don't need to do that. But the depiction of robots and artificial intelligence in this extremely popular franchise... how do you make sense of it, and how do you find it morally distinguishable, to treat these droids obviously differently than
(14:17):
you treat anybody else?
There is a ton, a ton of slavery, a ton of oppression within the Star Wars universe. I think finding the moral high ground, at almost any level, with almost any set of characters, is a flawed venture within the Star Wars universe, be that Jedi, Rebels, the Galactic Empire, Darth Vader, the droids, anybody. It's a very
(14:43):
gray universe, for how distinctly black and white they try to paint it. But the nature of droids as lesser-than, but obviously, like, befriended... is there a Jim Crow version of droids? Are they three-fifths of a humanoid, these
Speaker 1 (15:00):
droids? It's a good question. You expect us as an audience to respect the personhood, though...
(15:30):
...he has personhood, even in the narrative that they construct. So, like, I think, unfortunately, by nature, this treatment of this kind of AI is rooted in imperialism, colonialism and racism. Because you have to, like, you have to create these strata that you categorize conscious beings in, and it's easy for us to
(15:56):
swallow, because, obviously, you know, white hegemony and American hegemony and whatever, but also classical colonial stratification of races and peoples. It's such an easy thing for us to just go, yeah, cool, that seems right. Mm-hmm. I think we have tried to address that later.
(16:18):
I don't think Star Wars has done enough to address that later, and I'm not trying to be a Trek-versus-Star-Wars guy when I say this, but I think Star Trek actually does do a lot to address that.
Speaker 2 (16:30):
I think there's a specific edict within Star Trek that when sentience and theoretical personhood is ascertained within a being, they are then set aside as not being tools, not being enslaved. They are given the right to choose.
Speaker 1 (16:49):
Except when the Enterprise-D became sentient and they were like, eh, whatever. Eh, I mean, again, yeah, not every time.
Speaker 2 (17:00):
This isn't a hard
rule.
Speaker 1 (17:02):
I had a really good point there, and I fucking lost it. Sorry.
Speaker 2 (17:08):
That's fine, it's fine. And again, there is so much of Star Wars that we could talk about, and it's kind of one of those things when you really break it down. I feel this a lot of times when I break down any fantasy thing. When we were talking, like, I was thinking about other fantasy things and I was like, well, How to Train Your Dragon is a movie that's being live-action, quote-unquote, remade. But you have these dragons who are kind of thought of as pets,
(17:31):
but they definitely know language and have thoughts and wishes of their own, but they are subjugated as riding war vehicles because they need that for the story. There's no, like, hey, we should let them be themselves, except for, like, the end of the series, when, I mean, essentially you're letting your dog go live its own life after you've
(17:55):
already, like, used it up for three movies. But that's like a fantasy thing, you know. But when we start breaking down problems within fantasy worlds, you know, like, well, why does Harry Potter... why do they celebrate Christmas? None of that makes sense, right? If they can fly everywhere, why do they need Floo networks? What has stopped them from using this magic all the time?
(18:16):
At least they do get into elf rights, with their own form of indentured slavery. That's true. Hey, you got one thing right. Kudos for you.
Speaker 1 (18:26):
She didn't get a lot
of other things right.
She can get fucked.
Which of our favorite sci-fi or fantasy writers aren't
extremely problematic?
Maybe Asimov?
It is tough.
Speaker 2 (18:39):
It's a lot of them. Speaking of Asimov, we should at least discuss him a little bit. We won't get into a lot of the books. Just, like, that's a whole... it's a whole thing, another kettle of fish. But I think the Laws of Robotics, right, definitely need to come into play. We at least need to mention them. Absolutely. Offhand, he made these Three Laws of Robotics. One, a robot may not injure a human
(19:03):
being or, through inaction, allow a human being to come to harm. Two, a robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law. And three, a robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
(19:24):
Now, obviously, in his stories we find the conflict inherent in some of these laws, and that their implementation is not without its own consequence. But I think where I would like to take those laws and see them implemented in movies would be the Alien franchise. Hmm. Now, the Alien franchise is, I know you scrunch your... well,
(19:45):
mostly, this is my, um, no, my connective tissue. I'm just curious. I mean, one, you have, in each Alien film, a synthetic person, an artificial being, usually one. They almost always look and act and feel just like other humans. Feel in a touch sense, not necessarily in an emotional sense,
(20:08):
although that does tend to evolve as the franchise expands. We'll discuss in a second. It starts off with Ash in Alien. Then you get Bishop in Aliens, and then Call in Alien Resurrection. Alien 3 is the one that doesn't have one.
(20:28):
I mean, you get a little bit of Bishop, but not really. But it's the one Alien film that doesn't, other than Alien vs. Predator, which we're not counting.
Speaker 1 (20:39):
No, or Aliens vs. Predator: Requiem.
Speaker 2 (21:10):
It is the one Alien film that doesn't have... right, right... that doesn't have a robotic person, an android, integral to the plot. Because... "allow a human being to come to harm", like, specifically quoting Asimov's setup. You don't have synthetic androids, you don't have them where they look and act and seem just like people, to the point
(21:33):
where the fact that they are not human is not a known fact. In the first one, Ash is essentially a secret android. Nobody knows. And, again, he breaks the second law by his programming, because we are led to believe that his programming has caused
(21:53):
him to put the humans he's on board the vessel with in danger, so that he can bring home a sample of xenomorph to the company. The company knows, sends them on this mission. They have Ash there to make sure that their will is carried out. Now he is set up as a mini antagonist in the first one.
(22:16):
Bishop is a complete, friendly protagonist in the second one. You see him, you know, evolve. He obviously doesn't want to put himself in danger, but when he is called to that duty, you know, he recognizes his own desire for self-preservation, but the needs of his human companions outweigh that.
(22:38):
He also seems to come to a culmination of ideas about humanity, seeing the love that Ripley seems to have for Newt, and how far she is willing to go and what she's able to do. He specifically comments to her that, you know, you did pretty good, for a human.
(22:59):
But he is distinctly a positive force throughout the entire film. It is a twist, in that we are kind of set up to think that he will turn, because we saw the synthetic being turn in the first one. Him being a proactive, helpful force is a, oh wow,
(23:19):
he was good. We can like Bishop throughout the whole second film and the third film. In the third film, you don't really get much. He's just kind of set up to be an exposition dump. Well, yeah, so that we can see what happened in between the second and third film. In the fourth film, again, you have a subversive secret robot
(23:41):
in Call, Winona Ryder. I would... yeah, Winona Ryder's character. She is, but I also think she's doing the least out of any of these synthetic beings in the Alien franchise.
Speaker 1 (23:55):
That feels like one
of the Joss Whedon things.
Right, yeah.
Speaker 2 (23:59):
I mean, she kind of
seems like his manic pixie dream
android in a way.
Speaker 1 (24:04):
Well, I mean, what do you think Buffy is? I mean, yes, he created a Buffy to live alongside Ripley because he couldn't deal with Ripley being Ripley. You know what I mean? Like, he needed his own version of Ripley to exist alongside her. Yeah, but let's, let's shrink her down.
Speaker 2 (24:24):
Let's, yeah. You know, way thin. Small. It always reminds me of, like, his Serenity character, where it's like, oh, I have to have the manic pixie dream girl, but she's the perfect weapon as well, right? You
Speaker 1 (24:40):
know. It's like, okay.
Speaker 2 (24:40):
But then I think where the AI question really pops up is in Prometheus and past that. That's when we're introduced to David, what one might call the evil AI, a very dystopian version, and this seems to be a
(25:03):
synthetic person that has chosen his own aims over those of humanity.
Speaker 1 (25:10):
I still don't understand the motivation of that character. I'm sorry, I don't like Prometheus. I don't like any of the subsequent movies. Until you get to Romulus, I just don't. And I know we'll get to it, but I just don't get it. It doesn't make any sense to me.
Speaker 2 (25:26):
I think there is. In the first film, we definitely see that it is his programming by the company that has put him on a course to bring harm to the crew, by letting them die, impregnation, all for the benefit of the corporation. In the second one, we don't have any of that.
(25:46):
It's, you know, a flip-the-script moment where, oh, it's a good synthetic person. Three and four, we kind of flush those. It's not really relevant when we get to David. It depends on your interpretation of David within this world. I think at this point, even though it's a prequel, it seems to
(26:08):
be this moment where, like, he gets to almost surpass humanity, another higher rung on the idea of perfection. And then, by creating that, one, there is a horror element.
(26:40):
When humanity gives birth to something beyond it, there's a natural framework of fear that is created. You know, the Frankenstein, Frankenstein's monster. It's outside of our control and it is beyond us. You know, how do we deal with that? Two, I think, in this dystopian view of what AI
(27:04):
can become, you see that David is looking for perfection in himself. He obviously sees himself as beyond his creators, and he sees the xenomorph as an avenue for even further perfection.
Thus his will does not align with his, I guess, programming. But again, it's hard to understand what he is, quote,
(27:25):
programmed to do. It appears that he's programmed to be, like, an autonomous being of synthetic origin who is meant to bend to the will of the humans in his, not necessarily care, but his service. But I don't know. It's set up as a flaw in his programming that gives him the
(27:46):
will to do whatever he wants, which then turns him against the crew, in searching for his own goals and aims to be enacted, which brings about the destruction of everybody in the film. But the genesis of that is kind of, like, you kind of have to read into it, because it's not well explained.
Speaker 1 (28:04):
You have to do the work for the movie. Yes, and that's a problem. It is.
Speaker 2 (28:09):
I think I'm kind of interpreting it as a kind of jump to conclusions. But it is, you know, in the... I mean, it's called Prometheus for a reason. Well, yeah, no shit. Yeah, it's the creation of this thing that leads to the downfall of them all.
Speaker 1 (28:34):
You know, I mean, he's literally helping his creator die because of the knowledge that he is trying to gain. That was during the Ridley Scott period where he did, like, Hannibal. I've read that book. It's not saying the things it thinks it's saying. And knowing, especially, the behind-the-scenes of how Prometheus was written and rewritten by Damon Lindelof, don't give it too much credit.
Speaker 2 (28:54):
It's... I think, again, it is a bit of a jump-to-conclusions mat for a poorly constructed and even worse written set of films. Yeah, it looks great, it feels like a legitimate Alien movie, and I don't understand why people like it. It retroactively ruins the Alien franchise.
Speaker 1 (29:14):
I don't.
Speaker 2 (29:15):
I don't know if it's retro. I can still like what I like, for what it's bringing to the table itself, if we have to take it as a whole.
Speaker 1 (29:22):
You and I will be
upset that they retroactively
ruined the things that were good.
Speaker 2 (29:27):
Right, it is distinctively a bad idea, bad execution, and it does tarnish, if we're looking at it in totality, it does tarnish that image. But it's not going to stop me from still enjoying the good parts of it. Sure, by themselves. That's all I'm saying.
Speaker 1 (29:44):
Maybe that's a journey people need to take. As a Star Trek fan and a Star Wars fan, I understand both of those.
Speaker 2 (29:50):
I wonder if there's a lot of people who, like, saw Prometheus maybe before they saw Alien.
Speaker 1 (29:54):
Yeah, you're probably
right, I never thought about
that.
Yeah, there probably are, and for them maybe the experience is
completely different.
I don't know.
Speaker 2 (30:02):
Exactly. I don't think they are key to understanding the ideas behind the laws of robotics or our views on sentience within AI in films or fiction, but what I wanted to do is point out that these are some distinct key ideas in the dystopian view of
(30:23):
robots and artificial intelligence within pop culture. So much of what we see, in how robots and AI are depicted, is as a negative influence, an antagonist, something that's going to bring about the downfall of humanity. And I think, when you look at the totality of fiction, I think
(30:44):
in a lot of books AI are seen as beneficial and quite good overall. But especially in movies, I think there is this need to depict them in a negative light. Obviously, you have plenty of good versions. You know, your WALL-E, that we don't want to talk about. I think, you know, something like... you can talk about WALL-E. It's fine. We're not going to talk about WALL-E.
Speaker 1 (31:05):
Okay.
Speaker 2 (31:08):
I mean... about WALL-E, it's fine, we're not going to talk about WALL-E, okay? Um, I don't... I mean, you just think we do not want to talk about WALL-E. We have plenty of other things I would rather talk about. These really are the more exciting versions of AI, and what lead to probably a more dumbed-down take on what mechanical beings might bring to humanity. But they are a, what am I trying to say,
(31:30):
a cathartic expression of humanity's idea for what creating artificial intelligence might bring. Now, again, this does happen with David or Ash, but I think you're gonna hit some bigger ones that really kind of form the mold of what we see as dystopian versions of artificial intelligence.
Obviously, there's just HAL from 2001.
(31:52):
And I mean, you have, you know, Terminator, Ex Machina, Westworld, Dune. But I think we're really going to have to break down, like, why? Why do they have these versions in pop culture? Why is it so popular? What is it saying about us and our thoughts of our creation and what it could do to us? Now, one, there is a natural, inherent fear of the unknown,
(32:18):
I think. Creating artificial intelligence, creating a robot who can think and possibly feel and has a will of its own... we just don't know what that would bring. And that's inherently something that, throughout time immemorial with humanity, anything that we don't know or understand, we fear, and
(32:39):
we tend to alienate and fight against. That is part of our nature. I think that also leads to the fact that humanity in itself is inherently violent, and we are killers from an evolutionary standpoint. Throughout the totality of humanity's history, we have fought, warred, killed and othered
(33:03):
constantly. And to think that what we create wouldn't have similar purpose or course of action in a... What, you're talking about fiction.
Speaker 1 (33:16):
You're getting at original sin, essentially. Yes, I think there is a bit of that: the flaw that, in creating something else, the flaw of the creator carries over.
Speaker 2 (33:29):
Right, right, into his progeny. Right. I think that is a possibility. I think we also feel the need to tell these stories of black and white, and by putting our thoughts, the way that we see everything else, by attaching it to this entity that we don't know and don't understand, it's an easy story for us to tell of the way that we interact
(33:53):
with everything around us, that might be the same thing that happens to us by this other creation. Man, Ronald Moore was onto something, wasn't he? Yeah. Now, refresh my memory of Caprica. Oh, yes, because it's been a while. Sure. But it is the creation of artificial life that then leads
(34:14):
to the Cylons, because they then start making themselves.
Speaker 1 (34:19):
Yes. So Caprica both helps explain, in a really great way, and then also completely ruins Battlestar Galactica in a lot of ways. So what happens is, there's a cybernetic engineer. He creates AI, essentially what we would consider droids, and he's, like, one of the most wealthy and famous people in the
(34:43):
colonies. And his daughter ends up dying, and he tries to upload her consciousness into a cybernetic life form.
Speaker 2 (34:56):
Oh, it's just a... it's a robot. RoboCop, Terminator scenario.
Speaker 1 (35:00):
Yeah, not Terminator, I would say, but, well, RoboCop versus Terminator is in that vein. He uploads his dead daughter's consciousness into one of the cybernetic creations he made, because he modeled the brain, the brain structure, of his daughter. Even though he thought it didn't work, and then it just
(35:31):
sort of all of a sudden worked.
It's very Frankenstein in that sense. They did the thing, they pulled the lever, and then he walked away, and then later it turned out it really did work, you know, and then she comes to life. I'm glad you brought that up, because Caprica is a really weird example of how people don't know how to bridge this gap here.
(35:51):
You watch Battlestar, you can have all sorts of opinions about AI and philosophy and nature and what have you. Caprica does this weird thing where it tries to explain the unexplainable part of that, that is existential. It literally tries to quantify the existential, and it works from a writing standpoint and then doesn't work from a
(36:14):
narrative standpoint. Right, yeah. It gives you an explanation of how these things happen, but it also kind of ruins the existential part of the nature of consciousness that they try to get to in Battlestar. And it's still Ronald Moore. So, like, you're like, dude.
Speaker 2 (36:32):
Right, right. The Cylons are one of those, another key building block. What we see is the Cylons hate humans and are constantly attacking them, and, you know, it gets into, you know, again, another reason why would AI attack humanity, to try and wipe it out? I think Cylons, they fear humans, and they also see themselves as
(36:54):
superior to humans.
Speaker 1 (36:56):
This is the whole philosophical thing they get into in Battlestar, like, I have to destroy my father. It's this Greek tragedy thing that people are obsessed with. Right.
Speaker 2 (37:08):
I mean, I think that's more of a rhetorical flourish that they provide there, because, I mean, they're basically... Yes, they're trying to kill their father, but it's really, I think, about what is stopping them from being themselves, from evolving. It's humans, or whatever, I guess. Do they call themselves humans? Oh, no? Well, no, not the Cylons, no, I know.
Speaker 1 (37:29):
Yeah, humans call themselves humans in the BSG world. I don't, I don't remember, so I don't think they ever actually use the... Well, fuck, they might at some point. I don't think they do off the bat, but... No, they do. They 100% do, because they have to address it at the end when they go to Earth. They 100% do, even if it's later in the show, because they're like...
(37:49):
But I think there is...
Speaker 2 (37:50):
In a lot of these versions, you see that humanity is placed as the enemy, and sometimes that is because they are seen as the one that can stop their progress. Sometimes it is simply the ant on the highway, you know, where, if we are building a highway, we recognize that there
(38:11):
are ants, you know, living their lives, but we do not care about them in the construction of our own synthesis, of our will. They are beneath us, and thus their aims, wills, goals do not matter in the culmination of our own.
Speaker 1 (38:27):
I can tell you've never read The Three-Body Problem, but they address that very, very clearly. They have a whole treatise on this exact concept.
Speaker 2 (38:36):
Yeah, I, I haven't. Well, if you, if you do have something to... well.
Speaker 1 (38:40):
No, I mean, if you want to be really depressed, then go ahead. But if you want... uh. I'm already depressed.
Speaker 2 (38:46):
I don't know if I need more. It's an extremely nihilistic work.
Speaker 1 (38:50):
If you're in a place where you don't want to be really depressed and hate yourself and think there's nothing worth living for, don't read it.
Speaker 2 (38:58):
All right, that's a glowing recommendation, then. Terminator, obviously. I think this is probably the holy grail of the dystopian version of AI and robots, where, essentially, AI is born in Skynet, and Skynet turns against humanity, concocts a scenario where humanity helps to wipe itself out, and then proceeds to
(39:24):
go on a killing rampage to exterminate humanity.
Speaker 1 (39:31):
Um, well, the reasons why. So this is one of the most foundational things about dystopian sci-fi, AI fiction: the AI becomes self-aware. Its programming is how to best serve humanity, how to, you know, how to stop war from happening.
(39:53):
This is the case in at least Joss Whedon's version of Ultron, Skynet, the Matrix, many other, I mean, we talked about in other episodes, other fictional works that deal with this, and their conclusion is, well, just kill humans and then there won't be any war. To protect humanity, you have to get rid of humanity
(40:15):
, because humanity is inherently a threat to itself.
Speaker 2 (40:19):
Right, exactly. So the logic dictates that we need to systematically remove that variable, right?
Speaker 1 (40:26):
Because Skynet, specifically, is a Defense Department AI-driven thing. It's invented by the American Department of Defense, and it's not just some, you know, ambivalent computer that somebody made. It was specifically made for this reason. Right, right.
Speaker 2 (40:43):
Cyberdyne will help AI, which will give rise to Skynet, which will then help humanity destroy itself, then try to crush it all by going to the past to kill the mother of said leader, but in doing so leaves part of itself behind, allows
(41:13):
for the leader to be created and allows for itself to be created, by leaving parts of itself that are then found by Miles Dyson, used by Cyberdyne to then construct the circuitous loop of time travel logic that nobody needs to get into. When it comes to time travel stuff, it doesn't make any sense, really, at
(41:34):
all, but still, narrative-wise, those first two movies, that's pretty great sci-fi stuff, even though they're action films or, in the case of the first one, a total B movie.
Speaker 1 (41:44):
Do they?
Speaker 2 (41:45):
I don't remember. Do they explain? Because, I mean, when they set up Two, are we led to believe that Judgment Day won't happen at the end? Because I feel like it's left ambivalent. It's like they don't know whether it will happen or not. But you know, at least we've grown as people. Yeah, they leave it up in the air, for sure, which, again, I mean, yeah, if you want to talk about a franchise devolving over
(42:08):
time and tarnishing its legacy, you've got Terminator all over that. Yeah, but I would rather take Terminator.
Speaker 1 (42:16):
You know, now, retrospectively, I would much rather watch T3, and/or take it in canon, over any of the other sequels that came out. Would you rather...
Speaker 2 (42:28):
T3 or Prometheus.
Speaker 1 (42:29):
Oh, that is a great question. One's a sequel and one's a prequel, but you're right. I mean, we're dealing with time travel, so okay, okay, that's fair.
Speaker 2 (42:37):
And is T3 your third?
Speaker 1 (42:41):
Well, I mean, of the Terminator movies. I mean, because it's the third one that happened.
Speaker 2 (42:46):
Yeah, I mean... Well, no, I mean in your ranking.
Speaker 1 (42:50):
Well, because I think we both agree that the Terminator franchise has diminishing quality. There has never been one that has risen more; it's not gone up in quality as the sequels have gone on. So, like, yes, by default, Terminator 3 is the third best Terminator movie. But Alien, that's also a good question, because there are
(43:12):
a lot of those that are not good quality either, because that movie went from a B movie to an extremely great genre film. And can we just blame James Cameron for all of this? Can we just, like, shoot him? Can we just, like, put him against the wall and be like, stop it? Well, I mean, what did he do wrong?
(43:33):
Well, I mean, he's, he's the Sarah Connor. If you get rid of James Cameron, we don't have these problems.
Speaker 2 (43:40):
Right, but you also don't have the greatness either.
Speaker 1 (43:48):
I know, but isn't that the whole thing in the Terminators? Yeah, it gives birth to this great dude who does this thing that leads humanity, but then it brings all this strife, and if you just get rid of him, then, you know, the robots win, then the Matrix becomes important. The robots would never, I mean, the robots would never exist if you kill Sarah Connor, according to Terminator 2. Yes, which is great, because that's all... that is a time...
Speaker 2 (44:08):
Yes, you're right, you were correct, that is a time travel... we don't, we don't need to deal with time loops. Yeah, that's a time travel conversation, not an AI conversation. Yeah, but I think it is, again, you know, what Reese says, you know, that it's an unthinking, unfeeling killing machine and it will not stop, ever, until you are dead, which is, like, quintessentially,
(44:30):
what is so terrifying about an AI robot apocalypse coming to destroy humanity.
It's these creations that we have made. Obviously, as they have been in the Terminator universe: one, they are these horrifying endoskeletons made of metal that we can't stop. Two, they have then made them
(44:52):
into cyborgs, so you can't tell it's a Terminator, but it's still coming after you. Three, it has no empathy. We cannot relate to it on any emotional level. Like zombies, they will not stop, good point, ever. They're just going to keep going until you are dead. Right, it is one of these perfect villains that we
(45:13):
can't empathize with, we can't really even understand, and we also can't stop, which is just so perfect for an antagonist.
Speaker 1 (45:23):
It goes to the core of our fears as a species. Yeah, it's great. And you know what? It's the only reason that that movie ended up becoming more mainstream, because that would have normally been, like, a straight-up B movie. I mean, that is a B movie: low budget, coming out of
(45:44):
the Corman camp, essentially.
Speaker 2 (45:47):
Yeah, I mean, it's one of those things that has streaks of genius, coming from humble beginnings in a derided genre that it was able to rise above.
Speaker 1 (46:04):
You know, it's like... it could have started off at some point... Schwarzenegger wasn't even a star then. Like, that should have just been relegated to I Come in Peace status. But, because, you're right, because of those things that it hits on, it became an iconic thing, and you know what, it may have actually legitimized that genre more than without it. Like, that might have actually legitimized the genre because of
(46:27):
the things that it appeals to. Like you're saying, had that movie not existed, I don't know that we would have the geek culture we have today.
Speaker 2 (46:35):
Yeah, I think there's so much that really spawns from that, especially when we deal with our ideas of robots and AI. Whenever we think of the biggest downfall, we point to Skynet, we point to Terminators as a collective ideology, and I think part of that is also, like, they were able to put a face to
(46:57):
our coming robotic destruction. If you look at, like, what HAL did. HAL is also, you know, a progenitor of this, but there is not the anthropomorphism that we get with the killer robots in Terminator. A big box with a red light and a monotone, soothing voice can only terrify so much, and it doesn't hit on certain
(47:19):
levels, and it's harder for us to identify with that. I think that's another reason why that has trouble breaking through in the way that killer robots have, you know, since, especially since, Terminator, but even in, you know, pre-Terminator things. I mean, I think most of the killer robots
(47:40):
that we think of, you know, in your Westworld, your M3GAN, your Battlestar and Terminator, you have these two legs, two arms, a humanoid body, something made in our image, in our likeness, that is out to kill its creator, a Promethean tale coming to its final, conclusive,
(48:16):
combustive end, which I think is kind of, like, what drives a lot of, especially mainstream, thought with this genre. Now, you do have plenty of utopian, positive ideas with AI, you know, from, you know, After Yang or The Creator or Data or Her or Blade Runner. There's plenty of positives we can see, but I think the dominant, you know, tale of the dark possibilities of our
(48:36):
creating something completely out of our control, this Frankensteinian creation that is beyond our logic, our understanding or our control, gives rise to the horror that we see could be, and I think that is what really captures the
(48:58):
imagination of the dark side of our possibilities.
Speaker 1 (49:04):
Mm-hmm.
Speaker 2 (49:05):
Please go away.