
February 29, 2024 40 mins

Gary Goldman was a writer on “Total Recall”, a Philip K. Dick adaptation directed by Paul Verhoeven and starring Arnold Schwarzenegger. It was a big hit. So why do Gary and his writing partner, Angus Fletcher, have so much trouble selling another Philip K. Dick adaptation? They tell Malcolm that it all came down to a roller coaster ride of plot twists that even A-list action actors couldn’t stomach, and an early attempt at AI that was too dumb to pick a smart script.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin.

Speaker 2 (00:23):
Welcome to episode two of Development Hell, our Revisionist History
series based on the radical notion that the best Hollywood
stories are the ones that never got made into movies.
Last time, we kicked off the series with the story
of how my book Blink failed to make it onto
the big screen. If you haven't heard that one yet,
you really, really should, because it explains why failed stories

(00:46):
have such meaning for me. Anyway, this episode is a
wildly convoluted tale, full of big plot twists and huge
moral questions, a story so much about right now that it's
going to get a little uncomfortable. Allow me to introduce

(01:10):
guest number one, Angus Fletcher. He's a screenwriter.

Speaker 3 (01:14):
I'm not a screenwriter, and I've never really wanted to
be a screenwriter. Have no aptitude in screenwriting.

Speaker 2 (01:20):
Don't listen to Angus. Angus is in fact a screenwriter,
as you'll hear. He's also an author, researcher, and professor
of story science at Ohio State, and a longtime friend
of Revisionist History. Some of you will remember him as
the person who provided the intellectual scaffolding for our memorable
revision of The Little Mermaid. Remember this?

(01:44):
So you think we should be able to, we should fix Ursula?

Speaker 3 (01:47):
Well, I think we should just stop her from doing
whatever she's doing. We should have a conversation with her
about maybe why this isn't helpful.

Speaker 4 (01:53):
Do you, Prince Eric, take Ursula to have and to
hold in sickness and in health for as long as
you both shall live.

Speaker 3 (02:04):
Eric looks deep into Ursula's eyes, hypnotized.

Speaker 1 (02:08):
I do.

Speaker 2 (02:11):
If you haven't heard that Little Mermaid series, listen to it. Anyway,
back to today's story.

Speaker 3 (02:17):
I'm just fascinated by this problem that Hollywood has, that
it keeps selling the same story over and over and
over again. And so, you know, back when I was
actually a Shakespeare professor up at Stanford, I placed this
phone call to Pixar, and I was basically like, how
is it you guys are able to make stuff
that's, like, better? This is when they're making movies like
Up, which had just these totally berserk narrative structures. And

(02:39):
so anyway, they let me inside, I learned all
this stuff. I thought it was totally fascinating. I realized
that they were making stories in a totally different way
than Hollywood was. So I went down to Hollywood to
advise Hollywood, and that led to me basically trying to
convince Hollywood by writing a series of screenplays the Pixar way,
because I thought that they would then allow me to
consult on movies. Instead, No, they just hired me to
write screenplays, and that's how I got an agent. And

(03:02):
then one day my agent called me up, because
he knew I'm just obsessed with the original Total Recall,
which had destroyed my mind when I was young, and
he was like, how would you like to work with
the writer of Total Recall, Gary Goldman?

Speaker 1 (03:14):
And so that's how we got together. Gary.

Speaker 2 (03:18):
Wait, Gary, you wrote Total Recall? Meet guest number two,
Gary Goldman. He's also a screenwriter, but unlike Angus, has
no trouble admitting it.

Speaker 1 (03:28):
Yes, I was the last writer on Total Recall. On
the first Total Recall.

Speaker 2 (03:32):
Total Recall, amazing movie. Arnold Schwarzenegger, Sharon Stone, directed by
Paul Verhoeven.

Speaker 3 (03:39):
Yeah.

Speaker 1 (03:39):
So it was about a milquetoast character who finds
out that he's not who he thinks he is, which
is that he's a kind of a super spy.

Speaker 2 (03:45):
But how real?

Speaker 1 (03:46):
Does it seem?

Speaker 2 (03:47):
As real as any memory in your head? Come on,
don't bullshit me.

Speaker 4 (03:51):
No, I'm telling you, Doug, your brain will not know
the difference, and that's guaranteed or your money back.

Speaker 1 (03:56):
What about the guy you lobotomized? Did he get the refund?

Speaker 2 (04:00):
Came out in nineteen ninety. Blockbuster, tons of awards. Total
Recall has a number of... it has a genealogy. It's
not just you. I mean, it has a short story.
The source material is from Philip K.

Speaker 1 (04:11):
Dick. And then there was a fantastic original screenplay by
Dan O'Bannon and Ron Shusett that everybody loved from the beginning,
and this launched the project. In fact, this is sort
of the prime example of this idea of development hell,
which is that you sell something for a lot of

(04:32):
money and everybody's very excited about it, and they tell
you it's going to be a movie right away, and
then it goes into a process of development where everybody
has a hand in revising it and reconceptualizing it and
making it in their image and in the middle. Depending

(04:53):
on your involvement, you can be in development hell along
with the project, or you can be a bystander pushed
to the outside watching it go through the process of
development hell with lots of other people, not including yourself.
I mean, if you're lucky it gets made at all.
The project had been in development for almost eight or
nine years when I got involved.

Speaker 2 (05:12):
Yeah, eight or nine years. Yes, So if you go
back to the original screenplay, do you think that's a
better movie than the final screenplay?

Speaker 3 (05:24):
No?

Speaker 1 (05:25):
Not in my case.

Speaker 2 (05:28):
So you're saying development hell is a bad thing. Except
in the case where I'm involved, which, by the way,
is a legitimate position, right? I'm totally
open to agreeing with that.

Speaker 1 (05:39):
I mean, when I was starting out, I read the
original Total Recall screenplay as a junior executive for a
producer who was considering acquiring it and making it. And
basically the problem that most people felt was that it
fell apart in the third act. And apparently,

(05:59):
it fell apart so disastrously that even though it had
many different stars and directors attached and people always wanted
to make it, it didn't cross the finish line. And
by the time it got to me it was eight
or nine years later. It had been in pre-production
with Bruce Beresford and Patrick Swayze and the producer Dino

(06:22):
De Laurentiis in Australia. And in fact, Bruce Beresford asked me
to come over and do a rewrite on it, and
I said I couldn't because I was working with Paul
Verhoeven on another project. But Dino went bankrupt. Arnold Schwarzenegger
had always loved the project and had been watching it.
He went to the owners of Carolco, the studio that had

(06:43):
done the Rambo movies, and he said, you know, I
want to do this. And they bought it from Dino
De Laurentiis out of bankruptcy for a humongous amount of money,
with the idea that Arnold was attached. Arnold wanted
to work with Paul Verhoeven because Paul had just done
RoboCop, and I was working with Paul Verhoeven on a

(07:03):
movie about out-of-body travel, and Paul said
he didn't think it was ready to shoot and that
he was going to leave our project and go do
another project, and he was apologizing to me for that.
I said, what's the other project? And he said, Total Recall.
I said, well, that's really funny, Paul, because I turned
down the chance to rewrite Total Recall because I
was working with you. And he said, well, what did

(07:25):
you think about it? And I told him my take
and he said, well, you know, that's pretty much how
I see it too. Let me let me see if
I can get you the job to rewrite it.

Speaker 2 (07:32):
Gary, Gary, Gary got worried. We want to talk about
his other project. But now I'm thinking, can we come
back to Total Recall, or is this all leading up
to the other project? Well, they're connected. Yeah. So what's
the connection? The connection is, I would say, Philip K. Dick.
Philip K. Dick, sci-fi master. To name just a
few other Philip K. Dick film adaptations: Blade Runner, Minority Report, A Scanner Darkly,

(07:59):
Philip K. Dick is probably the most important sci-fi
writer ever, or at least of the last hundred years.
After working on Minority Report, Gary was becoming a bit
of a Philip K. Dick specialist, which is not a
bad thing to be in a town that loves sci-fi
adaptations. When we get back, Gary and Angus go

(08:20):
to work on adapting another of his stories. We're back
with Angus Fletcher and Gary Goldman, who've just been set
up to work together on an adaptation of a Philip K.

Speaker 1 (08:41):
Dick story. I've done four projects based on Philip K.
Dick material. This became my calling card in Hollywood:
that I did Philip K. Dick adaptations.

Speaker 2 (08:53):
So, Angus, wait, did you have a particular affection for
Philip K. Dick prior to this project?

Speaker 3 (09:00):
Well, I mean, I loved the movie Total Recall. And
I'm going to acknowledge that I was such a philistine
that I had no idea that it came from Philip K.

Speaker 1 (09:07):
Dick.

Speaker 3 (09:08):
And what happened was, the moment I got the call from
my agent. I then went and read all of Philip K.
Dick and it's surreal. I mean, he sort of went
from science fiction into religion. And I called Gary up
and I was like, well, Gary, I was like, what
do you think you know?

Speaker 1 (09:26):
I'd love to work with you. And the main reason
I wanted to work with Gary is.

Speaker 3 (09:29):
Because I believe that screenwriters had the ability to see
the future, because I think this is a common power
of story. I think that science fiction is always about
intuiting what's going to happen next. And so I said
to Gary, I said, Gary, what do you think the
future is? And he said, Angus, read this story, The Variable

(09:50):
Man. And the premise of The Variable Man is very simple. It's
basically just about a future in which there are these
computers that use statistics to predict everything that's going to happen.
And I said, you know, Gary, this seems weirdly like
the world we're accelerating into now.

Speaker 1 (10:04):
And he said, yeah, he said it is. And he said, and.

Speaker 3 (10:08):
I have a premise or a story that I'd like
to pitch to you that I think will kind of
be as big as, if not bigger than, Total Recall.

Speaker 1 (10:14):
And that's where we got started, was with his pitch
to me. And so The Variable Man story was written
when? Oh, I don't know, about nineteen fifty, probably.

Speaker 2 (10:26):
Oh, I see. Oh, really? So imagining, in nineteen
fifty, a world in which computers predict everything is kind
of great if you're in the twenty... When did
you guys work together on this project?
Was it the early two thousands? Early two thousands. So
were you being faithful
to the Philip K. Dick short story, or were you

(10:47):
changing it in useful ways?

Speaker 3 (10:49):
We were changing it in useful ways. And there were
a couple of big things. So, first of all,
Gary's pitch to me was he was like, imagine a
future in which computers don't just predict things like they
do in the Philip K.

Speaker 1 (11:03):
Dick story, which is who wins this war?

Speaker 3 (11:05):
But he says, you know, imagine that they're able to
predict exactly.

Speaker 1 (11:08):
What food you'd like to eat right now. And then
he said, and then imagine.

Speaker 3 (11:12):
That they're able to predict a gift that you can
give to your best friend for their birthday. And that
gift is so good that your friend not only loves it,
but it seems perfectly like it came from you. And
then he said, imagine the computers can pick your soulmate.
Imagine the computers can pick the person that you love

(11:32):
so totally and so absolutely that you know that.

Speaker 1 (11:36):
This computer knows you better than yourself. And I said,
all right.

Speaker 3 (11:41):
And then he said, and now imagine, after you've given
yourself up totally to these computers, you wake up one morning
and they say to you, in twenty four hours, the
world is going to end because humanity is going to
destroy itself. And I've crunched all the variables and there's
no opportunity for you to do anything. That's the starting

(12:01):
point for the story.

Speaker 2 (12:02):
Oh my god. So wait, how far have we traveled?
We've traveled quite significantly from Philip K. Dick at this
point. He was just imagining a world where there were these
computers that were predicting the outcome of a war, and
the computers run across a problem. That's basically the plot.
Well, right.

Speaker 1 (12:20):
The title, The Variable Man. Because the other thing that Philip K.

Speaker 3 (12:24):
Dick did, which he intuited I think at a moment
of genius, was what would happen if you put a
person who the computers had no information on into the
middle of the statistical world. So these are all these
computers that are able to see the future because they
have all the data, but there's one person who they
have no data on. And so the starting point for

(12:45):
the screenplay that Gary and I did was humanity goes,
oh my goodness, we're going to die tomorrow. Our only
option is to bring in someone who the computer has
no data on, the variable man, the person who is unpredictable, unknown.

Speaker 1 (12:58):
And so, just like in the original Philip K.

Speaker 3 (13:00):
Dick story, we reach back into time, back into history
before the computers, and pluck this person into the present.
And that's basically the beginning of the screenplay is when
the variable man arrives in this world where everything has
been planned, everything has been predetermined, the computers know everything.

Speaker 1 (13:17):
We're all going to die.

Speaker 2 (13:18):
Yeah. Now, by the way, so many, so many wonderful... uh, Gary,
tell me a little bit about your thought process in
moving from the original Philip K. Dick notion to the
one that Angus has described. Tell me about the leaps
that you made there and why you made them.

Speaker 1 (13:38):
All right. So, I suppose I was very interested in AI.
You know, everyone's thinking about it now, but it wasn't
brand new. And I saw this idea of
the computer that can predict everything, and that you have
to give up control to the computer because it's

(14:00):
working so fast. And I wanted to sort of
imagine this trajectory of going from where
we are now to the point where the computer is
in total control. And I was really, and I remain,
concerned about this question of decision making, and how do

(14:24):
we make decisions? And if the computer can make better
decisions than we do, then what is the role of
free will? And if you spend a whole life just
simply doing what the computer tells you to do, what
is identity? So I wanted to put this hero who
comes from the past and who still is accustomed to

(14:45):
having free will, and his power is to have free will,
and to put him in the situation where whenever he
listens to the computer, everything goes right, and whenever he
trusts his own judgment, it's wrong.

Speaker 2 (14:58):
Oh wait, wait, wait. So the AI, the AI is
smart enough to realize that the only way to save
the world is to bring someone in from outside AI? Yeah. Oh,
the AI has a little bit of self-knowledge and humility
in your world.

Speaker 1 (15:12):
Oh yeah, that was one of our best things. We
created this wonderful AI, a wonderful character. We named
him Plato, and, uh, he's, you know, he's sort of
somewhere between a butler and God.

Speaker 2 (15:23):
Yeah, that's that's this is this is getting better and better.

Speaker 3 (15:27):
I was gonna chime in there and uh and just
and just say that basically, in this society, there are
a few people that have tuned out and they're like,
I'm not going to listen to this computer.

Speaker 1 (15:36):
What does this computer know?

Speaker 3 (15:37):
And their lives are horrible and all their decisions are bad,
and they decide to choose their own soulmates, and they
have horrible marriages.

Speaker 1 (15:46):
And then everyone who.

Speaker 3 (15:46):
Listens to exactly what the computer says has this Instagram
life where they are actually so happy and their kids
are so perfect and everything is so amazing.

Speaker 1 (15:56):
And so, you know, Cole comes into this world. Our
hero comes into this world.

Speaker 3 (15:59):
And he of course finds it appalling, as
appalling as we would find it, and he's determined to
ignore Plato. He's determined to ignore the AI that has
brought him into the future to save humanity from itself. So
this is another kind of unfolding paradox: humanity doesn't
want to save itself if it means listening to the computers
on how to save itself.

Speaker 1 (16:19):
Wait, wait, so many questions.

Speaker 2 (16:21):
First of all, tell me a little bit about... so
our variable man is called, as in the Philip K.
Dick story... your variable man is called Cole. Tell me
about your Cole. Where does he come from? What's
his personality

Speaker 1 (16:35):
Like?

Speaker 2 (16:35):
Why is he... why is he the way he is? Why is
he so capable of standing up to AI, even if
AI has all these demonstrable advantages?

Speaker 1 (16:46):
So basically he's just a classic action hero.

Speaker 3 (16:49):
The idea is he embodies kind of our cowboy American
trust in our guts. And so the opening sequence, he's
sent in to stop this hacker, this hacker who's been
creating chaos across the world, who's been sort of, you know,
getting into all these top secret facilities and creating... we
think he's gonna launch World War Three, whatever.

Speaker 1 (17:11):
And so Cole goes in and he's being sort of
instructed what.

Speaker 3 (17:14):
To do, and we already had these computers at this
time that are sort of running the percentages and telling
him do this, do this, do this, do this other thing,
and he's listening to the computers.

Speaker 1 (17:23):
He's breaking in.

Speaker 3 (17:24):
He finally gets to the heart of the lair where
this hacker is, and he kicks down the door and
he points his gun and then the hacker turns around
and it's a child, and Cole realizes, oh, my goodness,
this is the hacker. This is the person who's been
creating all this chaos. And he gets this word on

(17:45):
his earpiece, kill the hacker, and he says, I can't
kill the hacker.

Speaker 1 (17:49):
This is a child.

Speaker 3 (17:50):
And there's this long pause and they say, look, we've
run all the numbers. If you don't kill the kid,
this is going to happen again. Pull the trigger. And
he stands there in the moment: Can I pull the trigger?
Can I pull the trigger? And then he doesn't do it.
He doesn't pull the trigger. He drops the gun, he
grabs the kid, and what essentially happens is he dies

(18:11):
saving the kid's life. There's a fireball that blows him up,
but the kid goes on and the kid lives.

Speaker 1 (18:16):
And because that fireball incinerates him.

Speaker 3 (18:20):
He can be brought into the future without changing time
because he was essentially eliminated from the timeline. So if
you reach into the exact moment just before he dies
and pull him into the future, time isn't changed. And
so he's the perfect person to bring back because he
has demonstrated the ability to say no to the computers,
and also because taking him doesn't change time.

Speaker 2 (18:42):
Oh, I see. So at first, when we meet Cole,
he's part of the present, he's not... he's
just a guy who's standing up to the tyranny of AI.
Here's an action hero whose AI has been corrupted by
this hacker. He's sent to solve the problem. He fails,
he's incinerated. How much time passes between that initial
encounter with the hacker and then when our story is

(19:05):
taking place.

Speaker 3 (19:06):
Well, this is part of the twist because when you
get into the future, you think that an enormous amount
of time has passed because the future is radically different
from when Cole disappeared, and it's so different that he
can't believe it.

Speaker 1 (19:21):
But we start to.

Speaker 3 (19:22):
learn that actually a very small amount of time, much
less time than you might suspect, has passed between
when he died and when he was brought back, because
these computers, as Gary was saying, have accelerated human decision
making to the point that our

Speaker 1 (19:35):
Society has started to run on this.

Speaker 3 (19:37):
Utopia cycle and everything is just going much faster and
much better than anyone could have imagined in our time.

Speaker 1 (19:44):
And it's because of this great genius who was the kid.
The kid is the thing that made the difference
because he didn't kill him. Because Cole didn't kill him,
he survived and he became the great genius that ushered
us into this accelerated utopia.

Speaker 2 (20:00):
Now, one thing I don't understand: if we are
in an accelerated utopia, why do we need saving? Why would
utopia have a kind of expiry date? Isn't the definition
of utopia that it keeps going in perpetuity?

Speaker 3 (20:13):
Well, I mean, so this is part of the great
riddle of the beginning of the story: why is
it exactly that we're going to die? What could possibly
go wrong? Everything seems like it's perfect. What is going
to destroy the world? Well, nobody is able to identify
exactly at the beginning of the movie what's going to
go on. All they know is that the computers are
quite insistent that it is going to happen in the next
twenty four hours. And what happens over the middle parts

(20:37):
of the screenplay is various different alternatives start to emerge.
We start to see all of these different moving pieces,
and we start to realize, oh, sure, you know, it's
a utopia, but there are still humans in it, and there's
still other things going on. And I don't know how
many of the twists you want me to ruin, Malcolm.
But one of the things that starts to become a
question that haunts the hero is: is he the one

(21:02):
who's going to destroy the world? Is what the computers
saw that he would get brought back, and that by
not listening to the computers, he was going to be
the one that blows everything up?

Speaker 2 (21:15):
You know what this is, if I might reduce your
whole premise to an axiom. This is the cinematic version
of the famous adage: in the land of the blind,
the one-eyed man is king, right? Yep. Your Cole
is your one-eyed man who is asked to be
king, right? And what you're describing just there is he

(21:36):
knows he only has one eye, right? He's wrestling with
the fact that he can't see it all. It's sort
of beautiful. Let me give you a little recap of
where we are. There's a perfect utopia where AI runs everything.
People do what computers tell them to do and are
better for it. For reasons we don't quite understand yet,

(21:56):
this utopia is in danger. In order to save it,
the AI, named Plato no less, brings back to life
a guy who defied its commands in the past. After
the break, Gary and Angus try to get the movie made.

(22:25):
So tell me about your, tell me more about your Cole.
So who do you imagine playing your Cole?

Speaker 1 (22:32):
So we actually had we had quite a lot of interest.

Speaker 3 (22:34):
We had interest from Bradley Cooper's people.

Speaker 1 (22:38):
We had interest from Mark Wahlberg's people. But the one
thing that was always common.

Speaker 3 (22:43):
Was that people became increasingly disturbed about the paradox at
the center of Cole's psyche, because if you're going to
play this part, you have to wrestle with all of
these sort of profound questions. And one of the most
profound questions is that what the computers are doing is
they're saying, trust us. And the reason they get us

(23:07):
to trust them is by being able to predict our
most intimate choice.

Speaker 1 (23:11):
They're able to.

Speaker 3 (23:12):
Predict our heart. They're able to know our heart better
than we know our heart. They're able to say, you're
going to love this person, and when you meet this person,
your entire life is going to be changed. And the
computer does this to Cole when he gets into the future.
Is he picks the perfect person for Coal and they
fall in love immediately. In fact, they fall in love
so fast that they don't even know that they were

(23:33):
set up by the computer.

Speaker 1 (23:34):
They only figure it out after the fact.

Speaker 3 (23:37):
So now Cole knows that the computer has told him
this is who you're going to love, this is your soulmate.
And the computer has also told him the world is
going to end, and so essentially the existential choice facing
Cole is: is my heart wrong,

Speaker 1 (23:56):
And I don't love this woman and the world is
going to be fine, or.

Speaker 3 (24:01):
By following my heart and accepting that I love this woman,
is everyone going to die?

Speaker 1 (24:06):
And that he has to listen to the computer when
the computer gives him a mission, and the computer's mission
for him is to kill someone. It's to kill the
person who's going to trigger the singularity.

Speaker 2 (24:18):
Same mission as was given him before he was incinerated.

Speaker 1 (24:22):
That's right, but on a much bigger and more advanced scale. Yeah,
And the twist is that the person who he has
to kill is, of course the person that he saved
when he was a child. And the further twist is
that that person is the father of the girl he's
been matched with, who he's fallen in love with, and
that he's been set up to be with this girl

(24:43):
because she's going to bring him home to Daddy and
that's the only way to get through daddy's security.

Speaker 2 (24:49):
Oh my god, so good, it's so good. Why let's
go back. So you had the interest from these actors
in playing Cole, But why did they have when you
said they had trouble with that twist. It sounds to
me like this is an extraordinary opportunity for a skilled actor.
What do you mean they had trouble with it.

Speaker 3 (25:11):
Well, so I think that when you think about why
Total Recall got made, Arnold Schwarzenegger has the capacity to
be himself and also be meta-Arnold at the same time.
So he has the capacity to kind of laugh at himself
at the same time he's being himself. This part requires
that ability to be yourself and be outside yourself at
the same time, and it forces you also to go

(25:33):
against the kind of core fantasy of Hollywood, which is
that love saves the world.

Speaker 1 (25:42):
If you listen to yourself, that saves the world.

Speaker 3 (25:45):
And so this caused a lot of cognitive dissonance in actors,
and for the same reason that Gary was talking about,
the sort of AI billionaire's daughter also caused cognitive dissonance.
And so that was a real emotional sticking point for
a lot of actors, and I think they were concerned
about the story.

Speaker 1 (26:06):
Yeah, I think you're right, Angus. Ultimately we didn't want
to say, oh yeah, trust your heart and it's all
going to work out. A lot of Hollywood endings
manufacture that ending, and the Hollywood answer is trust
your heart. And society is producing a new answer, which
is trust the AI.

Speaker 2 (26:25):
Can you think of a Hollywood movie previous to
this where, not the narrow question of don't trust your
heart, but the kind of broader question, where love
and the fulfillment of a real emotional need would have catastrophic

(26:48):
consequences outside of it?

Speaker 3 (26:50):
Is?

Speaker 1 (26:50):
That is, is this virgin territory, is what I'm asking, for
a Hollywood film? One example springs to mind: Casablanca. For
Humphrey Bogart and Ingrid Bergman to get together would harm
the world, and so Humphrey Bogart, Rick,
makes the decision not to follow his heart.

Speaker 2 (27:09):
I think you're right, but you will.

Speaker 3 (27:13):
Uh.

Speaker 2 (27:13):
There's a crucial difference, though, and that is: do we
ever really believe that Rick is madly in love with
Ingrid Bergman? Now, what you've described is a situation
where the two parties are genuinely, and have powerful reason
to believe that they are genuinely, in love; they are soulmates.
Was Ingrid Bergman Humphrey Bogart's soulmate in Casablanca? Or was

(27:36):
it, it was like his heart is so kind of hardened
and covered in scar tissue, we kind of accept him
walking away because that's what he does. He walks away, right?
But that's got to be something, a very different dynamic.
You're talking about two people who are soulmates having to
walk away.

Speaker 3 (27:56):
Yeah, and you know, I will say, Malcolm, by the way,
that's a very iconoclastic reading that you just delivered of Casablanca.

Speaker 1 (28:00):
But also, wait, wait, wait, wait what wait?

Speaker 2 (28:03):
But wait, you don't think Humphrey Bogart's heart is
covered in, covered in scar tissue?

Speaker 3 (28:09):
I think the point is that even though it is,
he realizes that the romantic thing to do is to leave.
And so that's the ultimate tragedy of that movie, Malcolm,
which, which you've just desecrated: he's a man
who is reconnected with his heart and, in doing so,
walked away from it.

Speaker 1 (28:28):
You know. Anyway, I think that's the Hollywood schlock reading
of it.

Speaker 3 (28:30):
But but the point is in that movie he walks away.
He does the heroic thing, and and you know, you're asking,
is there a movie where someone has basically willingly destroyed
the world for their own love.

Speaker 1 (28:43):
Has someone willingly made that choice?

Speaker 3 (28:45):
You know?

Speaker 1 (28:46):
And that one, that one did happen in The English Patient.

Speaker 2 (28:51):
Oh, tell me, tell me. My memory of that film
is not strong enough. Tell me how that happens in
The English Patient.

Speaker 1 (28:56):
Well, in the end, the hero basically is a
traitor to the Allies out of love for a woman. Okay,
and usually Hollywood doesn't put the hero in that position.
They don't decide to. I mean, what they will
either do is say there's a greater good here. I

(29:19):
mean, a lot of English movies, you know, even Brief Encounter,
you know, people say, I'm in love, but I'm going
to do this, I'm going to stay with my wife
and kids. Yes, so there is.

Speaker 2 (29:28):
There is this annoying kind of aristocratic English notion.
Remember A Tale of Two Cities, at the end: it is a
far, far better thing that I do than I have ever done,

Speaker 1 (29:39):
Blah blah blah blah.

Speaker 2 (29:40):
They love playing that role. They never play it in
real life, but they love playing it in literature. Yeah, yeah,
they're like, are you kidding me? In real life, he's like, no,
he's not doing that. But no, but wait, wait, I'm
still, I'm not satisfied. So, why, am I...
am I wrong? Why does it seem to me like
you have described in this movie a much more complicated

(30:02):
and interesting and problematic narrative scenario than exists in,
say, Casablanca. Bradley Cooper's not having a problem doing
a remake of Casablanca. He's fine with that, but he
clearly had a problem with this. So describe it to
me in your words. Well, I can see how they
are analogous, but they're not. It's not perfect. Why is
it not perfect?

Speaker 1 (30:23):
It's I don't think it's perfect at all. By the way,
I do, I do.

Speaker 3 (30:26):
want to get on record as saying that you've also
denigrated Charles Dickens, who's another totemic figure of English literature.

Speaker 1 (30:33):
No, I mean, the reason that it's different is, uh,

Speaker 3 (30:39):
I mean, imagine if the character chose his heart and
the Nazis won.

Speaker 1 (30:46):
Yeah, you you choose.

Speaker 3 (30:47):
I mean, imagine The Sound of Music, right, in which
he marries Julie Andrews, and the result of that is
the Second World War is lost. I mean, you know,
I mean that's the kind of, that's the kind of
crisis you're setting up. And so in order to establish
free will, he has to reject Julie Andrews. He has
to reject, you know, his heart, to save, to,
to save the world. Except we haven't even told you

(31:11):
the end of the movie, so you don't even know
what's going to happen, right and when and when actors
got to the end of the movie, the dissonance became
so intense for them that, you know, I think that
they started to say, how can I, in good conscience
play this part? Because I myself don't even understand exactly
the part that I'm playing.

Speaker 2 (31:31):
So, in telling the story of why this movie
doesn't get made, we're listing, we're going to go
through the list of culprits. But is it culprit number one
that the actors themselves couldn't handle it?

Speaker 3 (31:42):
I think that's part of it. But I mean, I
think the core of it is is is.

Speaker 1 (31:48):
Do we trust ourselves? Yeah, And.

Speaker 3 (31:54):
That's a hard place to go in the movies, because
at the end of the day, the movies basically know comedy,
and they know tragedy. And in a comedy, you give
yourself over to a fantasy, and in tragedy, you're like,
it's all over from the beginning, and it's a chance
for you to kind of wallow. So another way to

(32:16):
say this is when we're born, in our brain, there's
a kind of primordial story, and that story is that
I'm good and that life is good. And that primordial
story is so powerful that it allows us to.

Speaker 1 (32:29):
Be open to the world and to explore the world.

Speaker 3 (32:32):
And then what happens over time is that story runs
up against our experiences of life, many of which are negative,
and this other story starts to occur in our brain,
which is actually, life is not so good, and I'm
also not so good, and we start to believe that
story more, and that story starts to become more primordial
for us, and the good story starts to become more

(32:52):
superficial and fantasy for us, and we get into kind
of a dissociated place. And in that dissociated place, which
I think most humans exist most of the time, they
really believe that life is pretty bad and that they're
pretty terrible, but they're always trying to actively convince themselves
that life is good and that they're a nice person.
And this is a product of you know, mindfulness, positive

(33:13):
thinking all of these things which kind of infest our
society now. And when people go to the movies, they
want to indulge either the fantasy life of no, I'm
a good person and everything works out, or they want
to wallow in this negative space of no, you know what,
this is a safe space for me to admit that
everything's terrible and I'm a horrible human being. And that's

(33:33):
why we have, like, The Dark Knight and, you know,
Christopher Nolan and all this kind of stuff, you know,
kind of like the dark, sludgy stuff. And this movie
is constantly forcing you back and forth across the threshold.
This movie is not allowing you to say, you know what, yes,
everything is bad, life is bad, life is awful. Nor
is it saying there's a kind of simple easy answer here.

(33:55):
It's pushing you back and forth to kind of reconfront:
can you keep going?

Speaker 1 (34:00):
And at the bottom of.

Speaker 3 (34:01):
The movie is this idea that only intelligence is going
to solve things. Not emotion, not wishing, not wanting, but being smart.
And are you the audience smart enough to solve this problem?
Because if you, the audience, are not smart enough to
solve this problem, the computers are going to solve it
for you. And so the existential challenge for the movie

(34:22):
essentially is are you smart enough to keep ahead of
the movie, in which case you know what, you're smart
enough to live into the future, or are you going
to submit and become passive and be drawn into the
engine of the story. And then that's the future of AI,
in which you no longer have any autonomy and you've
given up your will to the plot and the machinery.

(34:42):
So I think that's why people find the movie unnerving,
because it really does put you in that existential place
where you feel that your autonomy is being stripped. But
I mean, there are also a lot of sort of funny
and sort of dire studio stories involved, so I
think ultimately the studios were also alarmed by this story.
I mean, this is a big budget movie, and the

(35:04):
thing is, if you're going to do a big budget
movie like Total Recall, like this, you need to get
that actor on board to try and secure

Speaker 1 (35:12):
The money.

Speaker 3 (35:13):
And then do you want to put one hundred and
twenty million dollars into something that has this premise?

Speaker 1 (35:17):
I think it was too demanding. But I think that
our main problem here was really that we didn't get
a director. We didn't get a strong director. A project
like this requires a director with a very strong
vision, and who the people with the money trust. And
I suppose I had the hubris to think that after

(35:39):
doing these three Philip K. Dick movies, they would trust me.
But actually that wasn't enough. The movie's fate in the
real world precisely parallels its narrative on the page. In
other words, that actually happened. So one of

Speaker 3 (35:57):
the things that happened is, after we went out to
all of these actors, I then was on another screenplay project,
which was going well, and it was produced by Bob Shaye,
who did Lord of the Rings, all these big kind
of movies. And so I showed the script to Bob,
and Bob was like, oh, the script is amazing. I
love the script. We've got to get the script produced.
And so I said, okay, great. But it all culminated

(36:18):
in Bob calling me up one day and saying, Angus,
we've got financing for this movie. I've almost entirely
secured financing for this, this movie.

Speaker 1 (36:28):
I said, oh, okay, I said, what's going on? He said, well,
there's this company in town.

Speaker 3 (36:32):
They have a computer that they feed all of the
scripts into and the algorithm tells them how much money
the script is going to make.

Speaker 1 (36:43):
And we've already given.

Speaker 3 (36:47):
The script to the executives there and they love this
and they're going to give it to the algorithm, and
I just know that this is going to work out.

Speaker 1 (36:53):
So I said, Okay, Bob, this sounds great. You know.

Speaker 3 (36:55):
And this company, I should say, they were very wealthy, very successful.

Speaker 1 (37:00):
They'd invested in all these movies.

Speaker 3 (37:02):
You know, they had the sort of top floor
of the biggest luxury car dealership. You, like, walked
in through all these luxury cars to get to their office,
and sort of like typical Hollywood stuff.

Speaker 1 (37:12):
Right.

Speaker 3 (37:14):
So I get a call a week or two later
from Bob, and he's like, unfortunately, I've
got bad news for you. And I was like, oh
my god. I was like, did the computers read our
script and not like it? He goes, no, no. He's like,
that's not the problem. He's like, the problem is the
company just went bankrupt. And they went bankrupt because the
last script their computers picked was a bomb, and so
now they'd

Speaker 1 (37:33):
Have no money.

Speaker 3 (37:34):
And so we were unable to make our script about
the infinitely intelligent computer, because a stupid computer was unable
to pick the right scripts to make.

Speaker 2 (37:45):
A stupid computer. Is it any wonder Angus has gone
on to become a pioneer in story science? In fact,
Angus has gone on to become a leading critic of AI.
He thinks only humans will ever be able to tell
good stories. One of the papers he's published is called
Why Computer AI Will Never Do What We Imagine It Can.

(38:08):
I'm reading now from the abstract: computers contain a hardware
limit that renders them permanently incapable of reading or writing narrative.
This article draws upon the author's work with deep neural
networks, Judea Pearl's do-calculus, GPT-3, and other current
generation AI to logically demonstrate that no computer AI, quantum

(38:33):
or otherwise, has ever learned, or will ever learn, to
produce or process novels or any other kind of narrative,
including scripts, short fiction, political speeches, business plans, scientific hypotheses,
technology proposals, military strategies, and plots to take over the world.

(38:54):
So what do we have here? We have a script
about AI killed by an AI company that went bankrupt,
whose co-author goes on to write the definitive debunking
of AI. Who's writing that screenplay? Guys, this has been fantastic.
If my vote is in at all, to all moguls out

(39:18):
there in our listening audience, it's clear what the people want.
What the people want is, what did you call it, The
Variable Man? Yeah, the people want The Variable Man.

Speaker 1 (39:27):
They do. Fingers crossed.

Speaker 2 (39:33):
Next week, we'll be back with another story from the
depths of development hell. This episode was produced by Nina
Bird Lawrence with Tali Emlen and Ben Naddaff-Hafrey. Editing by

(39:54):
Sarah Nicks, original scoring by Luis Guerra, engineering by Echo Mountain.
Our executive producer is Jacob Smith. I'm Malcolm Gladwell.