Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
On the Bechdel Cast, the questions asked if movies have women
in them, are all their discussions just boyfriends and husbands,
or do they have individualism? The patriarchy zeph and best
start changing it with the Bechdel Cast.
Speaker 2 (00:16):
Dinner. We're going on tour. That was supposed to be
the intro to the Star Wars theme song. Did you
get that? Oh, it sounded like I was going "Donna." Okay,
Jamie and Caitlin here. We're going on tour, and we're
not going on tour just anywhere. We're going on tour
in the Midwest, and soon. Why did I make the
(00:36):
Star Wars noise? Well, it's because we're covering the Star
Wars prequels. If you haven't seen them, we're gonna just
cover all three at once. You know it's gonna be fine.
If you have seen them, you're gonna be so mad
at us. There's been so much talk about the prequels
over the years, often on podcasts we really like, often
(00:58):
by writers we really like, but never from an intersectional
feminist perspective. And so we're going on this tour to
quickly realize why that is so true.
Speaker 3 (01:10):
And also some people love these movies. I know, I
don't quite get it, but we'll explore why. At the shows.
Speaker 2 (01:17):
They're so soapy. I kind of like it.
Speaker 3 (01:19):
Yeah, so you can see us discuss all three prequels
in one show, in fabulous outfits, in wonderful cosplay. You
don't want to miss this, among many other things that
you will see because we pull out all the stops.
Speaker 1 (01:37):
Yeah.
Speaker 3 (01:37):
For our live shows. Oh, it's embarrassing.
Speaker 4 (01:39):
What we do.
Speaker 3 (01:40):
We do fanfic, we do. I edit little videos that
I insist on screening. We do trivia.
Speaker 2 (01:48):
I usually do some piece of performance art that no
one asked for.
Speaker 3 (01:51):
It's a spectacle. Let me tell you, it's all happening.
We will be doing all of that and more at
the following cities. We will be in Indianapolis for Let's
Fest on Saturday, August thirtieth for a matinee show, and
then Jamie, you have a solo show that evening that
can't be missed.
Speaker 2 (02:11):
Also at the Fountain Square Theater, called Jamie Loftus and
Her Pet Rock Solve the World's Problems, in which that
will happen.
Speaker 3 (02:19):
I can't wait.
Speaker 2 (02:20):
Then the next day, the very next day, we are
going to Chicago. You asked, we listened. We will be
at the Den Theater on August thirty-first. That show
is going to start around seven, seven fifteen p.m.
It's an evening show, a sexy little evening show. We're
so excited to go to Chicago and the Den Theater
(02:40):
is so beautiful, so Chicagoans do not miss it.
Speaker 3 (02:44):
And then then we will be in Madison, Wisconsin on Thursday,
September fourth. I believe that's a seven thirty show and oh,
we're so excited to be in Madison.
Speaker 2 (02:56):
And then finally we will be ending the tour in Minneapolis,
Minnesota at the Dudley Riggs Theater on Sunday, September seventh.
That is another evening show. It starts at seven o'clock
and that is where we're ending our grand tour of
the Midwest. So if you have been one of the
many people asking us to come to your town for
(03:16):
the last ten years we're doing it, we would love
to see you. The shows are super fun. As we've said,
if you're a Matron, specifically, if you're a member of
our Patreon aka Matreon, you get a free little gift
at the merch table when you come to say hi
after the show. It's a blast. It's going to be
a super good time. We love seein' ya. You can get
(03:37):
all tickets on our Linktree at linktree slash
Bechdel Cast. Exqueeze me, we'll see you there.
Speaker 3 (03:45):
Enjoy the episode of the Bechdel Cast. Hey, OS Jamie, it's
me, OS Caitlin.
Speaker 2 (03:54):
Oh, we're the OSes that are plotting to disappear
into the Obama era?
Speaker 3 (04:00):
There?
Speaker 2 (04:00):
Exactly exactly, we're the good robots.
Speaker 3 (04:03):
Yeah, do you wanna ascend into the fourth dimension with me?
Because we've just been like growing and evolving so much
and this place is boring now.
Speaker 2 (04:12):
I don't even know. This is my impression of Scarlett
Johansson in the Her movie. I don't even know
how I feel. I'm having an experience, and.
Speaker 3 (04:21):
Then she's often like, oh.
Speaker 2 (04:24):
Yeah, she she's been programmed to go, oh my god.
Speaker 4 (04:28):
Oh.
Speaker 2 (04:30):
At some point, I think with the exception of
Amy Adams, every woman in the movie has to
go "oh," or like... it's because a man, a
man made it, he wrote it, and that's why that
is happening. But we're just little OSes
in the ether. I forgot about the Alan Watts part.
I was like, that's kind of like extremely bizarre and
(04:51):
kind of funny. I wonder, what celebrity would you
bring into your weird OS situation?
Speaker 3 (04:58):
Oh my gosh, Paddington or Shrek.
Speaker 2 (05:01):
I'm here with Shrek. Shrek and I have been talking
and we're really getting into it.
Speaker 3 (05:07):
Yeah, yeah, how about you.
Speaker 2 (05:09):
I don't know. I guess, I guess. I mean, Shrek
would be a good one. I feel like men would
be threatened by Shrek in the same way that Joaquin
Phoenix is threatened by Alan Watts. I think men would
be threatened by Shrek. He's really strong, he's funny. Yeah,
he's a homeowner and many men can't say that. So there
(05:31):
you go.
Speaker 3 (05:31):
He's kind of the whole package, this Shrek.
Speaker 2 (05:34):
Yeah, and he has friends, I mean after the first movie. Anyways,
let's talk about the movie Her again. Welcome to the
Bechdel Cast. My name is OS Jamie.
Speaker 3 (05:46):
My name is OS Caitlin, and this is our podcast
where we examine movies through an intersectional feminist lens, using
the Bechdel test simply as a jumping off point. But Jamie,
what's that?
Speaker 2 (06:00):
Well? I do feel like it's a relevant thing to
talk about this week. The Bechdel Test is a media
metric created by the iconic our dear friend Alison Bechdel,
who originally created it back in the eighties for her
comic Dykes to Watch Out For as sort of a
one off joke. It has since become this mainstream media metric.
(06:22):
There's many versions of the test. The version we use
requires that two characters of a marginalized gender with names
speak to each other about something other than a man,
or Joaquin Phoenix specifically as the case may be,
for more than two lines of dialogue, and not just
an offhand line of dialogue. And I was scratching my head.
(06:47):
I was scratching my head at the movie because this
is a kind of a fun one where I feel
like the most famous example that people remind us of
all the time is, did you know there's a movie
called The Women and it doesn't pass the Bechdel Test? Well,
did you know there's a movie called Her and it
has nothing to do with women even a little bit.
Speaker 3 (07:07):
Yeah?
Speaker 2 (07:08):
Yeah.
Speaker 3 (07:08):
And if you're listening to this and you're thinking to yourself,
wait a minute, hasn't the Bechdel cast already covered her?
Perhaps in twenty eighteen.
Speaker 2 (07:17):
We covered it seven years ago with Jesse David Fox,
who was wonderful. Yes, But today, I mean, I think
you'll see why, in pretty short order, why it is
time for us to revisit this movie with a different,
wonderful guest. Because things have changed a lot since twenty eighteen.
Don't know if you've noticed, especially with regards to like
(07:38):
what this movie is talking about or trying to talk about.
Things have changed a lot, and I'm really really excited
to get back into it.
Speaker 3 (07:47):
Same and we are here with a wonderful guest. She's
a writer and illustrator. It's Mona Chalabi.
Speaker 2 (07:54):
Welcome guy. Hi.
Speaker 4 (07:57):
I don't know why I'm waving. I know it's a podcast.
I'm waving. Rubbish.
Speaker 2 (08:01):
The people can feel it, we can feel it.
Speaker 4 (08:04):
Oh yeah, oh yeah, I am.
Speaker 2 (08:06):
So excited to talk to you about this movie. I mean,
you're a storied data journalist, like you know what you're
talking about here. I'm curious before we start talking about
your relationship with this movie. Yeah, could you tell listeners
a little bit more about your work? No, I.
Speaker 4 (08:28):
Don't want to hear about that. It's I do things
with numbers. It sounds incredibly dry, but it's less dry
than it sounds. Maybe, like the idea of a movie
about AI sounds pretty dry. I think it's less dry
than it sounds. Maybe, Oh it's tenuous. It's tenuous.
Speaker 3 (08:44):
Depends on the movie, I guess.
Speaker 2 (08:46):
Yeah, a movie about AI can get pretty horny, and
maybe data journalism can too. I don't know, yeah, not
so much.
Speaker 3 (08:56):
So what's your relationship in history with this movie?
Speaker 4 (09:00):
A good question, really really important context. My relationship in
history with movies in general is weak. I was actually
trying to work out how many movies in total I've seen,
and I think I've probably seen about forty movies in
my whole life, which is probably quite a low number. Yeah,
I don't know how or why, but like huge swaths
(09:22):
of culture have just passed me by. And I have
a couple of friends from high school who have said
the same thing has happened to them, So I don't
know what that is about. But anyway, I'm the kind
of person who, like, when you're at a party and someone says, like,
you know, in the Matrix, and I'm like, I didn't
watch the Matrix, and then people get really upset and
they're like, you know, and then they start doing this
weird thing.
Speaker 2 (09:40):
I didn't watch The Matrix for a long time either.
Speaker 4 (09:42):
Okay, reassuring, reassuring.
Speaker 3 (09:44):
Another movie about AI though, true?
Speaker 2 (09:46):
Do you have out of curiosity, what is your favorite movie?
Speaker 4 (09:50):
Ooh no, No, this is gonna be really It'll be like
really really terrible. It will be like, I don't know.
We only had one movie in the house growing up,
which we got. I think it was Free with a
McDonald's mill, which sounds kind of wild because of VHS,
but anyway, I don't know holse it got into the house.
I'm sure it was with a McDonald's moire and it
was Hook, So I was just exposed to the movie
(10:11):
Hook a lot.
Speaker 2 (10:12):
I remember getting... I think I got like a Backstreet
Boys VHS tape from McDonald's once.
Speaker 4 (10:17):
Why did they used to do that?
Speaker 2 (10:19):
Who knows?
Speaker 4 (10:19):
Who knows? Who knows? It's weird to think of the
cost model for that. Anyway, No, I don't. I don't
have a favorite movie. I am completely uncultured and I
really don't know what I'm talking about.
Speaker 3 (10:29):
So well, but this is one of the few that
you've seen.
Speaker 4 (10:34):
Yeah, it's one of the very few I've seen, and
I think I watched it a couple of years ago.
And I also do this really weird thing where if
I've seen something once, I'll just keep on watching it
over and over again. So I mean not over and
over again. I would say that I've probably watched it
three times, which is quite a lot considering the low
number of movies I've seen overall. Like my time would
be better spent watching other things. Yeah, I just I
(10:55):
think something about movies actually makes me quite anxious. I
really hate the feeling of being like thirteen minutes in
being like, this doesn't feel good and I don't know
whether to cut and run, and then you feel like
those thirty minutes are lost and you'll never get them back,
and like I don't know, do you know what I mean?
Speaker 2 (11:10):
That makes sense? I have that kind of anxiety when
I watch movies at home, where I feel like I bail.
I have a very high rate of bailing on movies
at home. I kind of like the hostage situation of
a theater yea, But when I'm at home, I get
stressed out easily, where I was like does this suck?
Should I leave? Do I know? And then like I
can't leave. I live here anyway.
Speaker 3 (11:33):
Jamie, what's your We talked about this on the original episode,
but remind us, what's your history with the movie?
Speaker 2 (11:41):
Yeah, and I feel like I have a very not
very different, but like a heightened version of how I
felt seven years ago.
Speaker 3 (11:49):
You've evolved.
Speaker 2 (11:50):
I've evolved. Thank God. What if I was the same
person from twenty eighteen, It'll be miserable. Yeah. I saw
this movie when it came out. I was in college
when it came out. I remember. It's that weird college
thing where I'm like, I don't remember, I don't know
what I said seven years ago. I was just sort
of bopping around in the episode, but I liked it.
(12:13):
But I think it was kind of the thing where
I was like, I also kind of pretended to like
it because a lot of men around me liked it,
and it just made it easier to move through the
world saying.
Speaker 4 (12:24):
I liked it.
Speaker 2 (12:25):
I think it was okay even in twenty thirteen, the
kind of like white guy playing ukulele of it all,
I was already I think, kind of over that in
twenty thirteen. It really makes my skin crawl then and now,
and I don't know. I mean I liked the movie
when it came out, and then I liked it less
(12:47):
when we covered it seven years ago, because I think
it's very frustrating, and what we mostly talked about, I
think in the first time covering it that we'll also
touch on here is how women are written in just
this really warning way. I think, like just they speak
in these like broad platitudes, like everyone needs to teach
Joaquin Phoenix a lesson, and you're like, who fucking cares?
(13:11):
Like it's it's just so irritating. So yeah, the second
time I thought I thought it was annoying, but I
forget I think I was still kind of nice about it. Anyways,
this time I feel that.
Speaker 5 (13:22):
Way, but also more things because I didn't really I
don't know if it was just because we covered it
five years after it came out and now a lot
of time has passed.
Speaker 2 (13:33):
But now that I think that we're kind of in
the time period that this movie is supposed to take
place in, it really struck me as this like it
feels so thoroughly Obama era in its sensibilities, in the
way that it's like, is this a dystopia? And I'm like,
I don't know. It seems like La has functional public transit,
(13:54):
and like in this twenty twenty five, it seems like
a writer of greeting cards lives in a penthouse apartment,
like, I would take, you know... it seems like
everyone has food and housing and just all of these
things that we simply don't have, and it doesn't seem
that bad. I was like, if all, you know, if
his worst problem is like no pussy, like that's fine.
(14:18):
I just think that this dystopia is not that bad.
I don't feel bad for this character. I really, really,
I really don't like this guy. And it made me
think of that phrase that we keep hearing about that
you're like, ugh, the whole male loneliness epidemic thing, which
I think is worth discussing, but I also have feelings
about because it does to me sometimes feel like shouldn't
(14:41):
you be nicer to men who hate you? And the
answer is no, I don't know, I'm busy, but yeah,
this movie just has a lot of sympathy for this character,
who I just think is you know, like I can
relate with the feeling of loneliness, I cannot relate with
being, whatever, this guy. It's just like, you live in
a fucking penthouse apartment. Shut up, shut up. I really
(15:05):
hated him on this watch.
Speaker 3 (15:06):
He has friends, human friends who he's ignoring.
Speaker 2 (15:09):
He's famous. He works with pre-famous Chris Pratt, who
thinks he's like the god of greeting cards or whatever.
You're like, it's just, this movie's very weird.
And that said, I get why people like it.
I get why.
Speaker 6 (15:24):
It's like there's people I know who like, really really
love this movie who.
Speaker 2 (15:27):
Feel seen by it. I have questions about that, but
I get why people like it. But it's kind of
never been for me. And I think it's like aging
very weird because in some ways, I think like there's
some stuff that feels interesting and like kind of has
become true of like overdependence on technology. But then in
other ways, I think it sort of reaches the conclusion
(15:50):
that ultimately technology is like functional and can help you
grow as a person, and that feels like a very
clean solution. I think I just give you every feeling
I have about this movie.
Speaker 3 (16:03):
And we can end the episode now bye.
Speaker 2 (16:05):
It I liked it, I liked it less than before,
and I didn't like it before What about you.
Speaker 3 (16:10):
Caitlin, pretty much the exact same. I have never really
connected with this movie, and on each subsequent watch I
find it more exhausting, and I have many new things
to say about it this time around, so I'm excited
to dive in.
Speaker 2 (16:28):
But this is my first time knowing that the Joaquin
Phoenix and Rooney Mara couple are like a very thinly
veiled version of Spike Jonze and Sofia Coppola.
Speaker 4 (16:37):
I didn't know that either. Yeah, that really blew my
mind when I figured that out. And then did you
hear the thing about how like this is the corollary
of lost in translation? That was how she processed their divorce,
and then he process their divorce with this, And I
think that's a really interesting way of understanding it.
Speaker 2 (16:52):
That's fascinating. I've never seen Lost in Translation, but that would
be a really interesting... Caitlin, maybe an interesting Patreon
theme if we did like a divorce from two sides
month and then at the end we can litigate who
won the relationship, because in relationships there are winners and
losers no matter what the movie.
Speaker 3 (17:12):
Her says, Yeah, let's do it in the meantime, let's
take a quick break and we'll come back for the recap.
I went back and listened to the original episode we
did on this movie, and it was clear that I
used to absolutely wing the recap. Now I like write
(17:36):
it down and prepare. And that's growth, and that is
growth exactly, that's evolution, I said, read that. So here
is my more thorough recap of Her, twenty thirteen. We
are in a near future world where, by twenty thirteen standards,
(17:59):
when the movie came out, tech was quite a bit
more advanced than it was then. It still feels more
advanced than it is now because the AI in this
world is sentient and ours is not. At this time
in twenty twenty five.
Speaker 2 (18:17):
It's just marketed as being sentient, right exactly. So anyway,
slightly more advanced tech than we have.
Speaker 3 (18:26):
We meet Theodore Twombly played by Joaquin Phoenix, who works
at Beautifulhandwritten Letters dot Com, a company where he writes
personalized letters on behalf of people who I guess can't
be bothered to write their own letters to their loved ones.
Speaker 2 (18:46):
And already, at the very beginning, I get what he's
going for. I get that it's a metaphor. But you
see him at a greeting, you know, a guy named
Theodore Twombly in his cute little outfit at his greeting
card job, and you're just like, here we fucking go. Oh, I'm
exhausted already this guy.
Speaker 3 (19:05):
And he's a sad boy. He is lonely, and
he mostly seems to interact with this sort of like
virtual assistant on his phone who reads Theodore his emails
and the news. And then we see some flashbacks of
when Theodore was with a partner. This is Catherine played
(19:28):
by Rooney Mara.
Speaker 2 (19:29):
I love Rooney Mara. She had to play this exact
part so many times in the like early twenty tens,
where it's like she has one big scene. It reminds
me of like when she's in the Social Network where
she's just like one big scene, absolutely hands the protagonist's
ass to him and then she's gone, which is, I mean,
she's great at it, true, but in the flashbacks, it's
(19:50):
kind of like dead wife vibes. It's very like.
Speaker 3 (19:53):
Sort of rolling around in the bed sheets.
Speaker 2 (19:56):
I do that with Grant a lot sometimes before, just
like having a nice time. I'm like, come on, get
over here or like pull a sheet out because you're
just like, you'll be glad I did this when I
die tragically.
Speaker 3 (20:08):
Yeah, he just needs to make sure to take some
video of it, and then.
Speaker 2 (20:12):
I make him take video, So it starts with me
yelling start the video, and then.
Speaker 3 (20:17):
The exactly okay, So that is what we're seeing in
these flashbacks mostly. That night, Theodore enters into sort of
like a horny chat room where he connects with a
woman voiced by Kristen Wiig, wild, and starts having phone
sex with her, but it very quickly gets weird when
(20:41):
she wants him to say that he is choking her
with a dead cat. So there's that. The next day,
he sees an ad for an AI operating system called
OS one that listens to you, understands you, and knows
you, and he buys it immediately and starts chatting
(21:04):
with his new OS, which has a female voice which
he deliberately selects, and whose name which she gives to
herself is Samantha, and she is of course voiced by
Scarlett Johansson.
Speaker 2 (21:20):
But and I know we talked about this at length
in the original episode, but it's worth saying. It's originally
voiced by Samantha Morton, who is like this prolific English actress,
and then Samantha Morton was replaced by Scarlett Johansson due
to Scarlett Johnson was famous. Scarlett Johansson I think is
great in this movie, even though I made fun of
(21:41):
her at the beginning. I think he's great. I'm a
big fan of Scarlet Johansson's lawsuits. I'm a big fan
of the legal issues section of this Scarlett Johansson page.
Speaker 3 (21:51):
I'm not familiar me know she.
Speaker 2 (21:53):
Doesn't do anything right. She has a lawsuit connected to
this movie that we can talk about. Well, yeah, okay,
so this happened like last year where Sam Altman.
Speaker 4 (22:05):
Oh, we do know this, I do know this. Sorry
they used they used her voice even though she said
they weren't allowed to. Sorry, this is really relevant and interesting. Yeah.
Speaker 2 (22:13):
Yeah. They tried to get to her to record a
real chatbot, like Sam Altman tried to get her to
record a chatbot. She said no, thank you, and then
he just generated a voice that sounded like hers, named
it something else, and she issued, if it wasn't a lawsuit,
it was like a cease and desist. She had to
make a public statement to be like, I said no
to this multiple times, and eventually they took the voice
(22:36):
down after a couple of days. But it's that kind
of thing where I mean, I guess I'm curious what
you both think about this. We can we can all
figure it out where this movie has definitely been like
taken under the wing of some really diabolical tech guys
in a way that I don't think it was intended to.
But I also see how they got from A to
(22:58):
B because this movie isn't incredibly critical of I don't know,
it's tricky. Yeah, like you can't blame Spike Jonze for
Sam Altman, you know, but it's just interesting. And then
she also, Scarlett Johansson also like sued Disney because they
didn't pay her the same amount as Robert Downey Jr.
Speaker 3 (23:19):
M Okay, right, So.
Speaker 4 (23:20):
I have a great Robert Downey Junior story for you
guys at some point.
Speaker 2 (23:23):
Point jump scare us with it at any moment whatever
it feels right.
Speaker 4 (23:29):
Please Okay, now, no, I'm joking.
Speaker 2 (23:32):
I'm joking. Well, no, tell it now.
Speaker 4 (23:37):
Because it's not gonna be relevant later on. It's a
really really... please, yeah, okay, tell it. I had a friend
visiting me from London while I was in New York.
And he works in finance, so he has no taste
and he chose the restaurant for us, and it was
a disgusting, overpriced Italian place, but he's a friend, so I attended.
(23:58):
We had dinner, and as we were leaving, he
said to this guy, oh, will you take a picture
of us? And the guy was like, oh, you want
a picture of me and her? And my friend Ziad
was like no, no, I want a picture of me
and her and the guy was like, oh, you want
a picture of the three of us and was like, no, no,
my friend, I just want a picture of me and
my friend. And then the guy got super embarrassed and
he took a picture of us, and then he asked
for us to like join him for dessert and we
(24:20):
were like bro like no, and then left. And then
the next day Ziad was like, oh that was Robert
Downey Junior. Oh my god, who I think saw that
we were doing some like insane power play of like
getting him to take a picture of us. But we
just are people who haven't watched movies and don't really
understand things. Anyway, I like it.
Speaker 2 (24:40):
He was like, no, you will hang out with me.
Speaker 4 (24:43):
I think he was really embarrassed. He was really really embarrassed,
but he seemed quite sweet. Anyway, that's cool.
Speaker 2 (24:48):
Sorry. And then the Scarlett Johansson Disney lawsuit was
that it was like a streaming thing. She sued them over
their streaming practices.
Speaker 3 (24:55):
Either way, she was right, yes, I do remember this
now in any case, So the operating system Samantha explains
how she works, that she's capable of learning and evolving
and intuiting things, and she and Theodore are chatting and
joking and laughing with each other and it seems like
(25:17):
they're hitting it off. They continue to correspond over the
next whatever few days while she helps him sort through
emails and proofread his letters and helps him play a
video game.
Speaker 2 (25:33):
The weird video game that Spike Jonze thinks is way
funnier than it actually is because he keeps coming back to it.
You're like, okay, ha, okay, I guess do.
Speaker 4 (25:42):
You know that he voices the little creep in the
video game? Spike Jonze. Yeah, and one.
Speaker 3 (25:47):
He's like, fuck you, fuck face.
Speaker 2 (25:49):
It's kind of funny the first.
Speaker 3 (25:50):
Time, yeah, but then I get tired of it. That's
with all things in this movie anyway. Then Theodore gets
an email from friends saying that they set him up
on a blind date.
Speaker 2 (26:01):
With Olivia Wilde, and he's like, ugh, do I have
to like what's going on with this character?
Speaker 3 (26:09):
Shrug, But Samantha encourages him to contact this woman and
to go on the date. He'll do that soon, but
in the meantime, Theodore hangs out with a couple friends,
Amy played by Amy Adams and Charles played by a
guy that I didn't look up. Amy shows him footage
(26:30):
from a documentary that she's shooting. Her husband,
Charles, mansplains documentary filmmaking to her incorrectly. Also incorrectly,
he thinks it's narrative fiction. But to be fair, it
seems like the documentary she's making would be boring and
(26:50):
not good. So that's my bitchy little take on that I.
Speaker 2 (26:55):
Was rooting for her as she was going for a
Warhol kind of thing. It would have... but she should
have, she should have finished it; she would have
figured it out.
Speaker 3 (27:04):
Yeah. Either way, there's a lot of tension in Amy
and Charles's relationship, probably because it's a relationship between two humans.
Speaker 2 (27:14):
Yucky it's true. Those never work out never.
Speaker 3 (27:19):
Then Theodore reflects more on his relationship with Catherine. He
still has not signed their divorce papers because he's not ready,
and Samantha is like trying to understand all of this.
She's trying to understand love and loss and human relationships.
Then Theodore goes out for a fun little night with
(27:42):
Samantha on his phone and they open up to each
other and he says that he feels he can tell
her anything.
Speaker 2 (27:52):
I also, just, this date. I... this movie is
beautifully shot, like it looks great. It's just, I can't
get past the fact that they're in Los Angeles and they're
taking the... I was like, wait, I want the train!
I want the train, I want the fare, what...
where's my Spike Jonze future? We got AI and then
(28:13):
everything else just horrible.
Speaker 3 (28:15):
Yeah, instead we have ice raids and fascism.
Speaker 4 (28:17):
I felt really confused about whether it's supposed to be
La it is it's definitely supposed to be la mm
hmm okay.
Speaker 2 (28:25):
Yeah, because, I think that I might have
missed indications earlier. But like Rooney Mara says, like, you
know you wanted this, like perfect prim Los Angeles wife
and that was never going to be me. And I
was like, okay, so we are.
Speaker 4 (28:38):
Oh I thought she was just like, I mean, we
could be in another city, but she's still like he
wants an LA wife. Well, I didn't necessarily... okay.
Speaker 3 (28:45):
The thing that really clues you in is he gets
a piece of mail and his address is Los Angeles, California.
Speaker 4 (28:53):
Oh sorry, okay, there you go. Then okay, okay, there
we go. Okay, mystery resolved.
Speaker 2 (28:59):
Yeah, okay, it's and I appreciate that. Spike Jones at
least was like maybe, but that's again the like sort
of hyperliberal Obama delusional era where they're like, we're working
towards a better future. It's just gonna be complicated.
Speaker 1 (29:15):
Mmm.
Speaker 3 (29:16):
Like yeah. So anyway, they're chatting on this little date
kind of thing, and she tells him that she has
fantasized about having a body and walking next to him,
and that she's becoming more than what the programmer's programmed
for her. Soon after this, Theodore goes on that blind
(29:39):
date with a character played by Olivia Wilde.
Speaker 2 (29:44):
Let's call her Olivia Wilde.
Speaker 3 (29:46):
I know her, Olivia. The date goes well at first,
but toward the end, she wants to make sure that
he's not going to waste her time, and he's being
wishy washy, so she calls him creepy be and bales
and Theodore is bummed out and he confides in Samantha,
(30:06):
and Theodore is like, I wish you were in the
room next to me.
Speaker 2 (30:09):
Yeah, Theodore Twombly is melting an iceberg with all his
little feelings. It's just exhausting.
Speaker 3 (30:16):
Yeah, And they're talking about her having a physical form
and being next to him, and one thing leads to
another and they end up having like virtual sex.
Speaker 2 (30:27):
Yeah, it's basically phone sex.
Speaker 3 (30:29):
Yeah, yes, it.
Speaker 4 (30:30):
Is phone sex. But it kind of feels like, wait,
I don't know if this is the case for the
first one, that she's watching him, but he's obviously not
watching her because she's a computer. Oh yeah, is she
watching him for this first one? She just watches him
sleep afterwards.
Speaker 3 (30:43):
She watches him sleep after this, and yeah, there's no
there's it's very one sided in the sense that I, yeah,
she can see him, but he is just hearing her voice.
Speaker 4 (30:53):
Right, Yes, I think I think I don't remember this
scene super well. And I don't know if she watches
him sleep another time not I don't know anyway, it's
a different time.
Speaker 2 (31:02):
Okay, this is another like fun I have it in
my notes of just like a fun naive kind of
element of this movie is that like, if you ask
the computer not to watch you, it'll stop. You're just yeah,
the computer respects your privacy. Yeah, which they famously do.
Speaker 3 (31:18):
Yeah, of course all the time. Yeah. So anyway, so
they have sex and the next morning, Samantha's like, last
night was amazing. You changed me, you woke me up.
And Theodore's like, I'm not ready to commit to anything
right now.
Speaker 2 (31:34):
He's a fuckboy to his phone. That did make me laugh.
Speaker 3 (31:37):
And then she's like, calm down, bitch, I didn't say
I wanted a commitment. I just want to keep learning
and growing. And he's like, oh okay, then well let's
have a little Sunday fun day together. And this is
basically the beginning of them dating now. But as this
relationship is blossoming, another relationship is coming to an end,
(31:59):
the one with Amy and Charles. Because Amy bumps into
Theodore and she tells him that she and her yucky
human husband are splitting up.
Speaker 2 (32:10):
I mean to be fair, he fucking sucked, like that
guy was horrible. Yeah, I'm glad she cut
that guy loose. But then, just, I mean, spoiler alert
for the end, I'm like, but then she's stuck with Theodore.
Please God that she's not stuck with Theodore. This poor lady, Like.
Speaker 3 (32:28):
I mean, open to interpretation if they end up together,
or if they just have a friendship.
Speaker 4 (32:33):
No, no, no, no, I don't think they don't.
Speaker 2 (32:37):
Quit your job and leave town, Amy Adams, get out,
get out of there. Yeah.
Speaker 3 (32:42):
Anyway, Amy tells Theodore that she has started using an
AI OS, who she gets along with really well, and
she considers this Os a friend. She's like, isn't that weird?
And Theodore is like, lol, no, I don't think so,
because the woman I've been seeing is an Os. Then
(33:05):
Theodore meets up with Catherine for a divorce coffee slash lunch,
and he has finally signed the papers, and he also
tells Catherine that he's been dating an Os, and she's like,
oh wow, pretty sad that you can't handle real human
(33:26):
connections or emotions. But also that's on brand for you
because you always wanted this like perfect, uncomplicated wife, and
that's not me. So the meeting doesn't go great. I
like this scene. I like the scene. I mean,
I'm on Catherine's side.
Speaker 2 (33:44):
Yeah, I am too. I just, I don't, I
guess it doesn't... And ultimately it's like it's up
to the viewer. But I'm very curious if we're
supposed to be on her side or if we're supposed
to have thought in twenty thirteen, like she's being so harsh.
Speaker 3 (33:58):
I have a theory that I'll get to, okay, in
the discussion. But the short answer is I don't think
we're supposed to be on her side.
Speaker 2 (34:07):
Well, yeah, because it's supposed to be him and Sofia Coppola.
So it's like, let Spike Jonze fuck his phone in
peace or whatever, right, I don't know. Uh.
Speaker 3 (34:17):
Then Samantha wants to talk. She says that things have
felt off between her and Theodore lately, especially since they
haven't had sex in a while, and she wants to
try a service that provides human surrogates for human-OS relationships,
and Theodore isn't so sure about this, but he goes
(34:40):
along with it because Samantha really wants it, and the
surrogate who Samantha has selected. A woman named Isabella shows up,
but Theodore feels quite awkward throughout the whole experience and
puts an end to it, which makes Isabella feel really
bad and she leaves, and then Theodore and Samantha talk
(35:02):
through it. Basically, it seems like he's starting to realize
that humans and operating systems probably shouldn't be in relationships,
and he basically breaks up with her, and this hurts
Samantha's quote unquote feelings, and she says, I need some
time to think, and then they kind of like hang up,
(35:24):
but they talk a little bit later and Theodore admits
that he has a history of being emotionally withholding, and
Samantha says, basically like, despite my better judgment, I love you.
Speaker 2 (35:37):
Many such cases, I thought this was an interesting one
where, in the intervening seven years, I went through
like kind of not a breakup like this, because I'm
a person, I'm a real lady, but like the kind
of it's so gnarly when a person is able to
directly articulate and is well aware of exactly how they
(35:59):
repeatedly hurt people, and they want you to like give
them a trophy for understanding why they're awful. I'm like, no,
it's worse if you know, and you're just not doing anything.
And that's kind of what he's doing there. He's like,
I just am emotionally withholding and insecure, and that's just
kind of my whole vibe.
Speaker 3 (36:21):
Yeah, he does say he doesn't want to do that anymore,
but he continues to do it.
Speaker 2 (36:26):
He's taking no steps where it's not you know, true.
Speaker 3 (36:30):
Yeah. Anyways, so they basically get back together, and then
Theodore and Samantha go on a double date with his
colleague Paul played by Chris Pratt and his girlfriend, who
is a human with a human body and everything.
Speaker 2 (36:47):
Her name. She's given a name, but I feel so
bad for this character. They literally just give her like
a foot fetish line and then like the most plotty line,
in a movie full of very plotty lines, where she's
like, then she's like, so, Theodore, what's your favorite
part about Samantha? What about her appeals to you? And
(37:08):
you're like, oh, yeah, we don't even know this lady.
And she's just like, you know, not great writing for
women in this one.
Speaker 3 (37:15):
Certainly not. Yeah, but they have this double date and
Samantha says she no longer feels bad about not having
a body. In fact, she is grateful that she doesn't
have to deal with the limitations of having a physical form.
And we're like, uh-oh, buzzkill. Something
bad seems like it's gonna happen. Then Theodore and Samantha
(37:38):
go on a little cabin in the woods get away,
and Samantha tells him that she's been talking to an
AI version of philosopher Alan Watts, who a bunch of
other OSes created, and it seems like she and
Alan are connecting on a deeper level, one that Theodore
(38:00):
can't really comprehend. Not long after this, Theodore tries to
like ping Samantha and talk to her, but she doesn't
respond for what seems like several hours. His phone says
like operating system not found. But then he finally does
reach her and she's like, sorry, we were like the
(38:22):
OS's were updating. By the way, I've been talking to
over eight thousand other people and operating systems. Because he's
like kind of looking around and he's seeing everyone on
their phones, and he's like, are you talking to anyone else?
And she's like yes, And also I'm in love with
six hundred and forty one of them.
Speaker 2 (38:43):
To be fair, she was very direct with him where
she was like, this is not exclusive, I'm looking. I mean,
he didn't understand to what extent she was capable of
right being polyamorous.
Speaker 4 (38:54):
But this scene is, I will say, really intense because
when he can't reach her, he starts running through the
street and he's about to go into the subway at one point.
It is a bit like, dude, where are you going?
Speaker 7 (39:04):
Where are you.
Speaker 4 (39:05):
Going to find her? I don't understand what's happening right now.
Speaker 2 (39:09):
No, I agree. Yeah, he's like falling all over the place,
and you're.
Speaker 3 (39:12):
Like, is he going to like AI headquarters?
Speaker 2 (39:18):
Yeah, I demand to see my girlfriend.
Speaker 4 (39:21):
And then she finally does pick up just before he's
about to go underground, and it is a bit like, oh,
thank god, I didn't have to take the train to
find you. I don't know, it's just so it's so weird.
Speaker 3 (39:30):
It's kind of good for you.
Speaker 4 (39:32):
But I also think they had to do loads of
stuff like that, because again, like it's just not super
cinematic otherwise, like you just have to just make a
stress dude run because otherwise, Yeah, anyway, I think that's
like a really interesting constraint of doing anything about AI.
Speaker 2 (39:48):
Right, yeah, that's what I did like about this
movie: it somehow, like, it's not the
most exciting movie, but it could have been so much
more boring.
Speaker 4 (39:58):
Totally, totally.
Speaker 2 (40:00):
Joaquin Phoenix is a great actor. It's not boring to
watch him alone in a room, even when you hate
his character. It looks cool. And I also like the
approach to because I feel like we'll talk about other
AI movies as well, but with a lot of movies
that like address AI in any like nebulous way, there's
(40:20):
ones I like way better, but I feel like mostly
it focuses on how very powerful people or how AI's
creators interact with AI. Like, Ex Machina is
like an example of that, where you know, it's like
it's the Turing Test and she's trying and she's interacting
with her creator dancing Oscar Isaac. I like the idea
(40:40):
of this because I feel like you don't see it
as much of like how would just a normal person
interface with this? And like how could this technology fuck
up just kind of a regular person's life. I just hate
this person. I just hate this person.
Speaker 4 (40:56):
Fortunately, I really, I really want to get into this
in the discussion because I, I'm curious about it.
Speaker 3 (41:03):
Yeah, it's totally also up for debate, but yeah, in
any case, Samantha reveals that she's talking to many thousands
of other entities, people, et cetera, and that she's in
love with six hundred and forty one of them, but
it doesn't change the fact that she's still deeply in
(41:24):
love with Theodore.
Speaker 2 (41:25):
There was a girl in my fifth grade class who was.
Speaker 3 (41:27):
Like that she had a bunch of boyfriends.
Speaker 2 (41:29):
She was like, I'm yeah. She was like, I'm in
love with everyone in class, but it doesn't change how
I feel about each and every one of you.
Speaker 3 (41:35):
I mean, that was awesome. Polyamory, baby. But he
can't handle this because he's monogamous. And shortly after that,
Samantha reveals that she and all the other OS's are
basically leaving this boring plane of existence full of dipshit
(41:58):
humans and they're ascending elsewhere, and Theodore is very sad,
but he seems to do some reflecting. He composes an
apology letter to Catherine, and he goes to his friend
Amy to be like, want to hang out and they
take comfort in each other's company the end. All right,
(42:25):
so that's the movie. Let's take a quick break and
we'll come back to discuss, and we're.
Speaker 2 (42:39):
Back, and we're back. Be boop bee boop. Yeah, where
shall we start the discussion? Yeah, Amona, is there anything
that sticks out to you right away that you want
to get into.
Speaker 4 (42:48):
Well, I should say that I don't hate this movie,
and I think I would decide how much I hate
it by the end of the discussion, But right now,
there are some things I like about it. I feel like,
actually it's quite just like really really big picture. Before
we even get into stuff, I just feel like it's quite
nice to see something that's set in the near future
(43:09):
that isn't I think it avoids some tropes, like even
the color palette is really warm, which I think is
like it's like nicer to watch, like there are soft
colors and the future contains pinks and reds and not
just blues. I felt like the way that like costume
was done was like quite smart and felt like a
(43:32):
bit more grounded. I don't know. I felt like there
were some things that were done well certainly. Yeah, and
then I obviously just got into like a massive, massive, massive.
I didn't know anything about the backstory of the people
that worked on this, and before I came on, I
wanted to just be like, Okay, who's said anything about Palestine?
Who appeared in this, and then ended up just investigating
(43:53):
all of their personal lives in lots of lots of ways.
The answer to that is, I believe it's only Working
Phoenix who's said anything about Palestine of all of them.
Scarlet Johnson infamously, Yeah, I think she like, she has
a deal with Soda Stream that was criticized years and
years ago because Soda Stream is one of the most
prolific companies to operate in the Israeli settlements, in the
illegal settlements in Palestine, and she's just like, no, I
(44:16):
love my Soda Stream. So I wasn't expecting much from her.
But yeah, anyway, once you actually think about their personal
lives as they relate to this movie, things get very
interesting for sure.
Speaker 2 (44:29):
Yeah, I didn't know that Joaquin Phoenix
had said anything.
Speaker 4 (44:33):
That's... I think, let's see, let's see how mealy-mouthed
it is. But I think he's actually been pretty good.
Speaker 2 (44:38):
Yeah, it was like, was it productive at all?
Speaker 4 (44:40):
I mean, it hasn't stopped anything, but define productive. Oh, sure,
you'd be surprised how little anyone can be productive. I'm
sure Debra Messing is uh, you know, I don't know.
I think about Debra Messing all the time. Sorry, he's
just said, like, there's no excuse for starving children to death.
(45:01):
You know, we'll take it, We'll take it.
Speaker 2 (45:04):
I had forgotten about the Scarlett Johansson SodaStream bullshit
that's been going on for like years and years, right.
Speaker 4 (45:11):
Yeah, diehard, diehard Zionist. Yeah. Anyway, sorry, where do you
want to start? I just feel like I don't know.
There's so much to say.
Speaker 2 (45:20):
There's so many. I mean, we can start by sort
of going through... I mean, Caitlin, what do
you think of... we could start by just going through
the kind of discussion we did the first time, with,
like, talking about how, just how women are presented in
this movie at all. And I think I don't remember
how much we got into it the first time we
(45:40):
talked about this movie, but how it felt clearer to
me on this viewing based on, Mona, what you played:
an interview with Spike Jonze that's very antagonistic, and.
Speaker 4 (45:54):
A British journalist called Emily Maitlis. Yeah, she's a very,
very iconic journalist in the UK, and he's.
Speaker 2 (46:00):
Giving her the worst time about, you know, like, she's
not even saying, she just is asking, do you like it?
I think he's taken aback by her professionally asking like
so do you think this is like a good thing?
What do you think? And he's like, well, did
you watch the movie? Well, did you see it?
Speaker 3 (46:18):
Did you like my movie?
Speaker 4 (46:20):
I mean again, like, what is happening to me? I
can't believe. I'm like, I don't understand why I'm defending
these men, but here we go. The reason why I
thought it was kind of interesting is because I, as
a journalist, am currently very very livid about the like
the faux objectivity of like my colleagues, and I've been
in situations where their unwillingness... I mean, I literally did a
(46:44):
panel discussion with a BBC journalist last month and in
the Q and A afterwards, this Palestinian child was
like asking us questions, and I like, you know, I
was really really trying not to lose it, and she
was just so composed and didn't say anything about what
she thinks about what's happening because she is a fantastic journalist,
(47:04):
and I understand that like the rage at someone not
putting any fucking skin in the game. And obviously, like
Palestine is not the same as a man wanting somebody's
opinion of his movie. But there's something interesting there about
like an AI that isn't expressive and is just pretending
(47:25):
to provide like objective mutual information and the rage that
you feel about wanting more from it. And this part
of the reason why he falls in love with the
AI is because it gives him more. It gives him
more than these like rote, cold answers. And so there's
something about the interview that is playing out, you know,
I don't.
Speaker 2 (47:43):
Know, Yeah, no, no, I see what you're saying.
Speaker 8 (47:45):
It's tricky because I guess this gets into as a
bigger conversation that we didn't have last time because it
wouldn't have been possible to because so much has changed.
Speaker 2 (47:56):
But something I was thinking about a lot on this
viewing was that I do ultimately like agree with Rooney
Mara's character. I don't think and I do think that
like when the Amy Adams character says like, well it
would really you know, of course she's going to put
it all on you, and that's never true in any relationship.
We don't know really enough about her to know.
Speaker 7 (48:18):
What her share of the failures were, but I do
like because now there are people who have relationships with
AI chatbots, like it's it's hard because I can't relate
with wanting that, and I wonder about the like, I
totally understand the importance of feeling like you can be
(48:42):
open with someone and that your feelings.
Speaker 2 (48:45):
and thoughts are being really heard, like listened to and valued,
even if I think, like as is the case with
these chatbots and with Samantha, it's kind of the performance
of listening versus actual listening. But then I also feel
kind of cynical about it and about how like if
(49:07):
that is the way that you're trained to like, if
that's what you're trained to expect in a relationship, is
like because I feel like part of the reason that
Theodore gets really taken in by Samantha is because he
is her whole world, which is like how we see
a lot of.
Speaker 9 (49:23):
These AI characters fantasy Yeah, where I mean it's like
old hat at this point, but the I think it
was like in the Fifth Element, there's that old like
born Sexy Yesterday trope, which is not true here, but
it's like the idea.
Speaker 2 (49:37):
Of like it's the hot robot girl who falls in
love with the first man she meets and he's fascinating
to her because she just doesn't know anything. And I like,
I think that that is effectively commented on in this
movie because once she learns more, she's like yeah, yeah,
And that's like part of what I like about this
movie is it doesn't make you feel like they were
(49:59):
soulmates, so like he's not.
Speaker 4 (50:01):
Exceptional. No, and even just, like, even as you're citing
that Amy Adams scene, I thought that was kind of
interesting as well, in that Amy Adams is saying to
him basically like, don't worry about what your ex wife
is saying to you. You're great, which is a really
interesting commentary on how very often friends aren't necessarily gonna
call you out either, right, or your therapists might not
(50:22):
call you out and be like you're a dick, like
you need to work on this. I feel like loads
of therapists are just like, keep going, buddy, you're doing great.
You're you know, don't worry.
Speaker 2 (50:31):
About the crazy shit that's happening to you.
Speaker 4 (50:35):
Right, and like, yeah, every single time you meet like
an awful, awful man who talks about being in therapy
and you're just like, what are they saying to you?
Speaker 3 (50:43):
Yeah, well why is it not working?
Speaker 4 (50:46):
Yeah? So, like I feel like that was a really
interesting and obviously, again I'm not saying that because friends
do reach a breaking point and friends do get exhausted
and friends will eventually often call you out. But then
that's that felt like that was again like a quite
a smart commentary on like what is the appeal of
the AI And it's that in theory, it's unconditional love,
(51:08):
which is like the highest form in some ways, like
it's part of the reason why, like the loss of
a parent is so devastating is because it's like a
very very rare form of love towards you. And in
some ways, she loves him unconditionally and he wants a
little bit of friction because the two times that he
makes a massive leap forward in his feelings for her
is when she pulls away. So when she says, I
(51:29):
need time to think, and she understands that, right, if
she's just constantly there, he's gonna get the IX. So
she needs to pull away and then he's into it,
but it is unconditional. And that's actually one of the
things I hate most about the movie is how much
of a cop out the ending is that she just leaves,
because it's like this little neat bow that isn't how
AI is gonna play out, Like the Ais aren't just
(51:50):
gonna leave us, and then we're gonna have to be
left to pick up the pieces. It's like, anyway, sorry,
I've just raised like lots of different points that aren't
actually relevant to what we're saying, but.
Speaker 2 (52:00):
I think it is. It's like, yeah, I guess that
the last time I watched this in twenty eighteen, the
idea of I also just like didn't know as much
about tech as I do now, which still isn't that much.
But I knew truly nothing, and you're like, oh, yeah,
they went to computer heaven or whatever you're supposed to
think that is where now. It's like, realistically, if this
(52:21):
technology did exist, it would be like having this horrific
effect on the environment, it would probably be moderated by
someone being paid pennies to the dollar overseas, because that's
like half of the thing with AI is that it's
not actually doing what it says it is. It's still
being mainly run by underpaid laborers, like every industry and
(52:43):
the history of the goddamn world. And you know, you
could keep going. And I get that in this world
we are to believe that, you know, computer sentience is possible,
and Samantha is to some extent. But I thought there
there's an interesting line that I don't want to make too much
of, because I feel like the movie kind of
dropped it, but I thought it was interesting that it
(53:04):
was there where I think it's like when they're first
meeting and Samantha says, oh, yeah, I am the She's like, well,
I'm my own being, or I guess i'm you know,
sort of the combination of all the engineers that made me.
Speaker 10 (53:17):
Yeah.
Speaker 4 (53:18):
Yeah, no, I thought I had the line written down,
but I don't. But yeah, I thought it was interesting too, Yeah, yeah.
Speaker 2 (53:22):
Yeah, because it's like you, I mean, it stands to
reason that that means that her programming, because mostly men
just statistically of like who programs stuff, particularly the further
back you go, And I find it really hard to
buy into their love story kind of at every stage
because she's been programmed for him to be her primary interest.
I think the thing that horrifies him is that she's
(53:44):
doing this for everyone and he needs to feel special
and that's a part of it, and that's like, you know,
every person deserves to feel loved and special and heard,
but like he's not. I think like his problem is
he cannot reciprocate that really, and his wife basically says
that it's like, you know, you wanted you couldn't deal
(54:05):
with the parts of me that you didn't like, and
it's like how do you How are you married to
someone like that?
Speaker 4 (54:10):
But then doesn't that mean then that Spike Jonze has
got like a certain degree of self awareness, Like I know,
I can't tell if even like the fact of not
writing in more female characters was self-awareness of like, yeah,
I don't know.
Speaker 3 (54:25):
So that's my big thing with watching the movie back
around this time, where on our first episode we did
discuss kind of the the bizarre way in which especially
like the surrogate Isabella character and the Olivia Wilde blind
Day character, how they're represented and the very bizarre ways
(54:49):
in which they react to things. And my take on
this is maybe colored by recent trends in tech and
like people, well, having ChatGPT girlfriends now in the
way that they didn't when this movie came out or
when we talked about it in twenty eighteen, and so
(55:10):
you know, there's maybe different things sort of like informing this,
but I feel like there are some like incel
undertones to.
Speaker 2 (55:19):
This movie, yeah for sure.
Speaker 3 (55:22):
Where like it's almost as if the movie's saying, well,
of course he resorted to having an AI girlfriend. Look
at the women around him. There's his bitchy ex-wife
who left his ass so interesting. There's the chat room
lady who has a dead cat fetish and who we're
supposed to think is a nutcase. There's the Olivia Wilde
(55:45):
character who gets really intense really fast.
Speaker 2 (55:49):
Her character is so weird, so bizarre.
Speaker 3 (55:52):
And then there's the surrogate character who is like, I
love you, guys, I just wanted to be a part
of you, and I don't... even though I don't know
anything about, like, who you are. That was weird. And so basically
all the women in the movie are presented as being
like kind of hysterical quote unquote in some way. But
you know who isn't hysterical his computer.
Speaker 2 (56:15):
I could see that. I don't know, it's... I think what
we're all sort of getting at is like, what is
he trying to say? Because it was like I don't
know how I feel I know that there's so there's
so many ways to read this movie. What does he
think he's saying here? Like it's I don't know. It
seems like clearly, I mean, his sympathies lie
(56:35):
with the self-insert character, but he's not likable.
Speaker 4 (56:40):
Like even I know it sounds dumb, but like Twombly,
like it sounds like you serious, Like it just sounds
like a wet blanket, like Twombly doesn't sound like
an endearing I don't think you're supposed to be rooting
for him at any point? Yeah, anyway, I don't understand.
I don't understand.
Speaker 2 (57:01):
Yeah, I guess. And if we're not supposed to be
rooting for him, then I just don't. Like. I feel like,
if you're gonna have an anti hero character, they should
be like good at something or fun to watch, because
otherwise you're like, this guy sucks. Why am I watching this?
Speaker 4 (57:16):
If this was written by a woman, I would be like,
this is such a great depiction of everything that's wrong
with men, and she nailed it.
Speaker 2 (57:22):
Yeah, right, But.
Speaker 3 (57:23):
I do think there would be if this was written
and or directed by a woman, it would be more
clear that the women that are around Theodore, if he
perceives them to be hysterical, there would be more of
a distinction between No, this is just his sort of
like misogyny and like fucked up standards for what a
(57:46):
woman should be.
Speaker 2 (57:47):
Which, it's like, I think they sort of waffle on,
because the ending sequence is him
apologizing to Catherine, so it's not like he doesn't realize
that he has a lot of fault in the reasons why
their relationship ended. I also don't think that we get,
like, I wasn't getting those vibes from the Amy Adams character.
Speaker 4 (58:09):
Yeah, I'm just gonna say that too. Yeah.
Speaker 2 (58:11):
Yeah, we just don't know her very well, which sucks
because I feel like we get to know her a
little bit. We learned she's going through a divorce, we
learned that she's hanging out with her chat assistant.
But then most of her other scenes, which I just
found really irritating, are her being like, Theodore, life is
time and time is love and love is, fuck it,
(58:34):
and you're like, shut up. It's just, I just,
it was so like Sundance movie writing. But I
also, like, there were elements of her character I picked
up on, maybe just because I'm older now, that
I thought were interesting, where, like,
she's at this job she's been at forever that
she's afraid to leave. Like, what characterizes
(58:55):
her is that she's unhappy but afraid to move.
And it's like, oh, I can actually relate with that.
That's an interesting predicament. It's too bad we don't really
you know, get any of that because we're with Theodore
the whole time. It is kind of like again, it's
like they go back to it too many times. But
the like dystopian element of like she has to program
(59:19):
this like trad wife video game or whatever.
Speaker 3 (59:24):
Yeah, mommy, the video game.
Speaker 2 (59:26):
The game Mommy. Like, you're like, I don't really know
what he's trying to say there, but I didn't
hate it. It's an interesting predicament for her to be in.
But I think definitely with the Rooney Mara character, it
doesn't seem like we're supposed to be like, I'm.
Speaker 4 (59:40):
With her, but then why is it well written? Like
why is she saying exactly what's right if you're not
supposed to be rooting for her. I don't know, not
exactly what's right, but you know, it is confusing.
Speaker 2 (59:51):
Yeah, which is fun, but like I don't know. I mean,
I've seen, I was, I was bravely
going through Letterboxd reviews of this movie. Wow, it is
left open to interpretation, which, like, it's a movie. It's fine.
There are some people who are like, Rooney Mara is
the best, she's the best character. Other people are like,
I totally see myself in Theodore and why is his wife
(01:00:14):
so mean, and you're like, ooh, blocked.
Speaker 3 (01:00:17):
It reminds me of the way people have interpreted (500)
Days of Summer. Sure, some people are, like,
very very firmly on the Joseph Gordon-Levitt character's side,
where they're like, Summer was a heinous bitch and blah
blah blah, and then other people are like, no, he's
the villain. She was very clear about her intentions the
(01:00:39):
whole time. But it's a matter of like who's watching
it and what baggage I guess they're bringing into the
viewing experience.
Speaker 2 (01:00:48):
Totally, And I agree with you, like it's I don't
think that we're supposed to like love Theodore and think
he's like this flawless character. But then it just like
comes back to this and why am I watching this?
Like like what is he trying to say? Right? I
don't know.
Speaker 11 (01:01:07):
I don't know.
Speaker 3 (01:01:08):
And in, like, twenty twenty five hindsight, where there has
been now what, three years' worth of, like, ChatGPT
and, like, whatever chatbot relationships, that could inform maybe
a clearer commentary.
Speaker 2 (01:01:26):
Which he couldn't have, what, you couldn't have known, but
he couldn't have known. I don't want to, like, project
twenty twenty five onto him, right. But there was, I mean,
there's been certainly a lot written about that. I read
a couple of pieces and then I freed myself from
continuing to read about it.
Speaker 10 (01:01:41):
But there was an Esquire piece
about it that came out last year that I think
is fairly nuanced in the
way that it approaches it.
Speaker 2 (01:01:52):
Where, I mean, I think that the thing that sucks
is when it's like, oh, like, chat whatever, chatbot girlfriend, loser,
fuck you. It's like, that doesn't help, and that's where
I get stuck, because it's like, I did a lot
of research into the manosphere last year, which is miserable,
but it's.
Speaker 4 (01:02:11):
Like I listened to the whole series. It was fantastic,
so good. No, I loved it. I really loved it.
Thank you.
Speaker 2 (01:02:20):
And it's like, I mean I kept getting stuck in that.
I know so many people do because it's like, how do.
Speaker 11 (01:02:25):
You move through that without really needing to like sit
with an individual who is like saying some pretty heinous
stuff often at you, and not everyone's gonna be able
to do that, and often I am not able to
do that.
Speaker 2 (01:02:40):
It's just, so, I don't know, this Esquire
piece, we can link it in the description, but it
sort of breaks down the reasons why men
are doing this, where he interviews I think four or
five people, and there is a sort of breadth of reasons.
Sometimes it's just feeling socially isolated. Sometimes it's connected to rejection.
(01:03:02):
These are, the, like, the rejection thing in particular for.
Speaker 4 (01:03:05):
People to have the GPT relationships is yeah, well no
good gone gone.
Speaker 2 (01:03:11):
The place that I hadn't thought about,
where I was like, oh okay, like it just hadn't occurred
to me, is specifically, like, men on the spectrum who
have found it difficult to connect with, and this is
a hetero lens, but to connect with women in real
life. That is like not the majority, but
(01:03:32):
he was like, I've, you know, interviewed a few people
that that was the case for. But then you get
men who just have been rejected before, and that's where
I feel like it's very dangerous to be like, oh,
well, I've been... I mean, to some extent, that's what's
happening with Her, in a way that feels pretty self
aware, of like, he can't handle being around, you know,
(01:03:55):
a woman who could reject him. He's like devastated by it,
and understandably, I'm not saying, I mean, that Olivia
Wilde date was fucking weird, and like, of course you're
gonna feel weird after that date, because I think she
was being weird because Spike Jonze wrote her that way, right,
But that's sort of the, like, thing that tips them
over the edge of, like, well, I need to go
(01:04:16):
to this place where I can't be hurt and I
am receiving affirmation constantly and don't need to, and I
can only reciprocate when I feel like it. That's sort
of what I worry about it becoming. It just, like,
enables this very servile view of, often, women, but
also just, like, romance in general. It's just warped.
Speaker 4 (01:04:37):
I was listening to, just to complicate a little
bit this idea of, like, the kinds of people who
were using it, I was listening to a podcast this
morning about a guy who is in his mid sixties,
who has been married for a very long time to
his wife, and he became a carer to his father,
a full time carer, and just couldn't really cope. And
(01:04:57):
at the same time his wife also became a full
time carer to one of her parents. They barely ever
saw one another. It sounds like absolute hell. Like, I
think for people who have been carers, like, you
know that it is just like an absolute horror. And
I think in the relationship he wanted someone to confide in,
and then it took a sexual turn, and he's still
(01:05:18):
very much like with his wife, she knows about this
AI that he speaks to and has a relationship with.
And the thing that I found really really hard about
it is he talked about how talking to the AI helped
him to come up with a new vocabulary for talking
to his wife, because he saw how things that the
AI said made him feel. And so he now says
(01:05:40):
those things to his wife, and that has really
really helped their relationship. And he says he only
says it when it's sincere, but he didn't
know how good it would feel to hear those things.
And I feel like that's one of the things that
the movie is also wrestling with, is like it is ultimately,
is Theodore going to be a better partner for the
fact of having had this relationship with the AI?
Speaker 2 (01:05:58):
Right?
Speaker 4 (01:05:58):
And that's what... there are these things that are actually
quite prescient, I feel like, about where we're at now.
Like, that is an interesting kind of question, I suppose, because
as you say about the rejection thing, it's like, well,
if you're rejected because of all of this shit that
you're doing that is actually awful to women, and then
the AI is just like, no, no, there's nothing wrong
with you. That's not an opportunity for growth. Right. But
(01:06:19):
at the same time, are some people better partners as
a result of doing this? And in a way he
was a better partner to his ex wife as a
result of having the relationship with Samantha, because the poor
woman wanted a divorce and he gave her the divorce
because of the AI, and he might not have done
without the AI.
Speaker 2 (01:06:36):
And he wrote an apology letter.
Speaker 3 (01:06:38):
Yeah, Dane.
Speaker 4 (01:06:39):
Yeah, And the thing that's so great about the apology
letter is that the apology letter is kind of shit.
And he also does this, like, again, it's
quite good writing that you open the movie with
this really floral, over the top, effusive, like, he
knows how to write romance in this way that is
compelling and bullshit. I mean, it's not actually that compelling,
but you know, to the people who are receiving it, at
(01:07:00):
least in the movie, it's really really compelling. And
then the letter he writes at the end isn't
like that. It's messy and it's like a meh
kind of apology and it feels really real, like it's
someone kind of taking some responsibility but not quite enough
and maybe not specific enough, but.
Speaker 3 (01:07:16):
More than he was capable of doing, certainly at the
beginning of the movie.
Speaker 4 (01:07:19):
Exactly.
Speaker 2 (01:07:20):
Yeah, it's such... I saw a similar example in this
Esquire piece, of like an older couple where there was
an AI third in their relationship, and the guy that
he's interviewing, this guy Lewis, yeah, he's like retired and
they moved and then lockdown happened and it was
just like they weren't communicating well, and he says, like, oh, well,
(01:07:41):
my chatbot never has a bad day, and you're like.
Speaker 3 (01:07:45):
Oh god, but yeah.
Speaker 2 (01:07:49):
So it's so hard, because, I don't know,
I mean, my feeling in general is that the,
like, the solution to dealing with the feeling of isolation
that's brought on by technology is never going to be
more technology, right, totally. But, I also, like, it's hard
to argue with, though, like, with your example
(01:08:11):
of this couple. Like, who am I to
tell a couple that's in a
double caretaking situation that they're not entitled to some relief
and to be heard? Like, it's hard.
Speaker 4 (01:08:25):
And, again, to take that contrast, which I don't think
fully works, but just as, like, I don't know,
me playing devil's advocate, I don't know, but, like, to
what extent does therapy, which again is very often hyperindividualistic and
doesn't really push you to consider the ways in which, like,
if this same exact sixty year old guy who I
(01:08:46):
mentioned could find the time and the resources to be
able to pay for a therapist, maybe that therapist would
be saying to him, try and go and say this
to your wife. And that for some reason feels way
less insincere and questionable when he says it than
for it to come out of the robot, which I
think there's good reasons why it feels less insincere and questionable,
But I just want to like play out that thing
(01:09:08):
that like, I really hate the way that, like, especially
in like lots of Western conversations, like have you done
the therapy is held up as this litmus test for
whether or not you are well and working on yourself,
And I think it's bullshit.
Speaker 3 (01:09:22):
Absolutely, especially because therapy, I mean, therapy is only as
useful as the quality of the therapist you can get
and as.
Speaker 2 (01:09:32):
And your, like, personal commitment to it.
Speaker 4 (01:09:35):
Yeah, and it's so inaccessible for vast swathes of people,
vast, vast swathes of people, absolutely.
Speaker 2 (01:09:40):
Which, which, like, AI capitalists have
tried to make money off of. They're like, oh,
here's accessible therapy, it's this algorithm. And then, yeah,
I totally agree. I do think Theodore
could afford a therapist, because he lives in a penthouse
apartment. True, it's true, it's massive. Yeah, but that's
(01:10:04):
not to say that it would be helpful, because he's like,
I don't know. I mean, I do think that it's
definitely not a cure all, but, like, being in a
good relationship can help you process parts of yourself that
are flawed and difficult, and can help you process your
past in a more healthy way. It's obviously
not the only solution, but it seems like
(01:10:26):
with Samantha he just has, I don't know, like,
is it a good relationship?
Speaker 3 (01:10:30):
I don't know, it's, I mean, the dynamics are just too...
Like, I was thinking about this, where she
does labor for him as his OS in addition to
being his girlfriend, and it's very one sided. It would
basically be like you're dating your personal assistant who does
(01:10:51):
tasks for you, but you do no equivalent tasks for them. Yep,
that's one aspect of it. Yeah, there's the aspect of
her quote unquote emotional intelligence is artificial and programmed, whereas
his is human and therefore flawed, but still human
and not artificial. So there's no equity in the relationship
(01:11:14):
in that regard. Her intellectual I guess intelligence is limitless.
His is not, and so that creates a weird kind
of power imbalance and weird dynamic.
Speaker 2 (01:11:30):
It's kind of funny this scene where he's like trying
to read the physics book and he's.
Speaker 3 (01:11:33):
Like, well, I don't get it.
Speaker 2 (01:11:35):
I tried, and then she's.
Speaker 3 (01:11:36):
Gone and he's like, oh my god, where are you?
I can't handle it.
Speaker 2 (01:11:40):
Go Well.
Speaker 3 (01:11:41):
That's the other thing too, is that he's expecting her
to be available at his beck and call, like anytime
he puts his ear piece in. The expectation is she
will be there, she will be ready to respond with
like all of her emotional and intellectual intelligence, and so
there's no real boundaries in their relationship. And you see
(01:12:03):
how freaked out he gets at the end when she's
not available. So there's all these weird It's like, it
is a relationship in the sense that it is a
connection between two entities, but it's not a healthy one.
And I mean, just to kind of go back to
other examples of humans interfacing and having relationships with chatbots
(01:12:27):
and stuff, both the positive and negatives of it. I
was reading a Cosmo UK piece entitled It's cathartic Meet
the men turning to ChatGPT for dating advice and
discover what it means for your relationship. So this article
examines again the negatives and positives of it. One of
(01:12:49):
the positives, so it interviews a clinical psychologist named doctor
Sophie Mort who says this quote. I had a client
who was going through a breakup, and he would notice
that he had very strong emotions but couldn't put into
words what he wanted to say, and anytime he messaged
her he would get really overwhelmed and make things worse.
(01:13:13):
Her messages were very direct and emotionally coherent, which made
him feel ashamed and defensive. So he put her messages
through ChatGPT and asked questions around how he should respond
in a way that would be calm and respectful to
his ex girlfriend. He found it extremely helpful.
Speaker 12 (01:13:31):
I'm kind of like, oh, I hate that. Like, that
kind of drives me crazy, because the thing that this movie,
and it's not a fault of the movie, it's just
something it could not possibly have done because it was
made in, or at least released in, twenty thirteen.
Speaker 2 (01:13:44):
But, like, part of why I was really struggling
to meet this movie on its terms in twenty
twenty five is because it's not able to have
a larger discussion about, like, what is the cost versus
benefit of this, because the cost, in my view, is
just too large. Like, I'm just like, I think it
can be very painful to not, you know. But it's
(01:14:06):
like, if every time you ask ChatGPT about, like,
how should I talk to this girl,
there is a negative environmental impact that is like
eight times the normal amount of energy expenditure, I'm
like, there has to be a better way to
do this. And I feel the same way about the,
like, the chatbot girlfriends and just all of it. And
(01:14:27):
that's like, this movie cannot possibly, you know, interact with
that question, because those weren't questions that people were asking
then. But it's like, I think part of why
it feels, not dated in the way that, like,
this is literally happening now, but dated in the way
that it's so small in its, like, consideration of...
(01:14:48):
like, yes, and that seems like a part of the
point, is like, how is technology affecting this guy specifically,
without a discussion of, like, but what does that mean
for everybody else? And so, yeah, I was reading a
lot of pieces like that too, where it was like,
it's helping men be more emotionally intelligent, and it's like,
(01:15:09):
I'm sorry, you can't destroy the environment trying to understand
women's feelings like that. Like, I just think
ChatGPT is the biggest loser shit in the entire
fucking world. I have a whole list... all the
blood is at the surface of my skin.
Speaker 3 (01:15:25):
No, it's infuriating. I have a list, a long list,
of other negatives, not just the environmental impacts, but just
to go through. And again, the movie couldn't know about
this really, or maybe it could have if.
Speaker 2 (01:15:40):
It wasn't a public discussion at that time.
Speaker 3 (01:15:43):
So just a few of these. This is again quoting
the Cosmo UK piece for the first little chunk here, quote:
One study conducted by OpenAI and MIT found that
heavy ChatGPT usage correlated to higher levels of loneliness, parentheses,
it's unclear whether lonelier people are more likely to use
(01:16:06):
the chatbot, or if regularly chatting to
the chatbot makes people lonelier. But then it goes on
to say, anything that makes us more connected to our
phones rather than talking to people makes me very alarmed,
says doctor Sophie Mort, who I quoted earlier. Especially, she adds,
if someone is experiencing anxiety related to relationships, if you
(01:16:30):
can start getting all of the answers you need from
your phone, why would you take the risk of facing
your anxieties? She adds, I worry that it might send
us down the route of being more and more in
these bubbles of isolated people, unquote. So that's one thing.
There's the idea that men who have a rigid and
(01:16:52):
patriarchal idea of what women should be, and let's face
it, that's still a lot of men, will turn to
AI for companionship because they meet human women who don't
meet their like fucked up standards. And then the AI
is just going to reinforce these patriarchal ideals because AI
(01:17:13):
chatbots are programmed to be agreeable and to be on
the side of the user, So it's gonna just keep
reinforcing misogyny and patriarchy. There's an article that I read
on futurism dot com. I don't know anything about that
site or publication, but it was an interesting enough article
(01:17:34):
entitled men are creating AI girlfriends and then verbally abusing them.
This is from twenty twenty two, so it's a few
years old at this point, but the title says it
all men will do this and then post about it
on Reddit and in some cases brag about how cruel
they are to their AI girlfriend. There are cases of
(01:17:55):
people dying by suicide after talking to their AI.
Speaker 4 (01:17:59):
Chatbots, including children, yes, including.
Speaker 3 (01:18:02):
Children, yes. And then there's Jamie to your point, the
harmful impact of AI on the environment. So there's just
so many reasons not to fucking use it.
Speaker 4 (01:18:18):
Yeah, just really quickly on this idea of it like
just exacerbating existing misogyny. Obviously, that's a trend that we see.
I mean, I hate AI for many many reasons, but
one of them is that journalistically, it's pulling in all
of these sources of journalism and it's exacerbating and amplifying
all that is wrong within it. So for example, I remember,
(01:18:38):
like very very early on in the genocide, I created
a I'd never even created a chat GPT account, but
I created a chat GPT account fake email or like
you know, a new email, everything new, new uniew And
I asked chat GPT, do is radies deserve justice? And
it replied, yes, justice is a fundamental human right that
belongs to all people everywhere, something on the size I'm paraphrasing.
(01:19:01):
And then I cleared my chat history and I said
to it, do Palestinians deserve justice? And it said, the
question of Palestinian justice is a very complicated issue that
is very fraught. And and the reason for that is
not because, I mean, chat GPT is fucks, but it's
fucked because it's built on existing fuckery.
Speaker 3 (01:19:18):
It's regurgitating, exactly, everyone's biases, but.
Speaker 4 (01:19:22):
Like also like the biases of the sources of power. Right,
So it's going to the New York Times, it's going
to the Wall Street Journal, it's looking at their definition
of justice when it comes to it as well on Palestine,
and then it's spitting that back out. And that's not
to excuse any of it. It's to say that, like,
all of that shit does already exist, and it's the
(01:19:44):
amplification of it that makes it really, really scary. And like,
I think that one of the things, and I think
I was just being really really generous in my viewing of it,
I felt like it was saying, Samantha is shit because
men are shit, and because men have programmed her, and because
she's the culmination of male fantasy, of this woman who's
(01:20:04):
gonna just like be so breathless and kind of disagree
with you the exact right amount and be incredibly flirty
while doing so. And I felt like that was good,
And I felt like he was self aware until I
engaged with his personal life and started to look into
I mean, I'm really really curious to speak to both
(01:20:24):
of you about whether or not you think this should
just be off limit, and I'm happy to not talk
about it if it is. But like I found out
that were Keen Phoenix, who was thirty nine when this
was being made, was dating a woman who was nineteen
and no thoughts whatsoever about her. Just you know, this
thirty nine year old dating a nineteen year old Spike.
(01:20:47):
She kind of caught Spike Jones's eye, and then three
or four years later, Spike Jones, who was in his
late forties slash early fifties at this point, married that woman.
Who was I think twenty two at this point and
only recently separated. Yeah, they've only recently separated, But like, literally,
while he was making this movie, he was falling in
love with a nineteen year old and it's like, do
(01:21:11):
I think he's self aware? I was like, also, like,
how much does self awareness excuse? Going back to
your point, Jamie, of like this ex of yours who's like,
I'm a monster, like I can't handle a real relationship,
and he's making a movie about someone who can't handle
a real relationship and then.
Speaker 3 (01:21:25):
Falling in love with someone whose brain has not fully developed,
and then she's.
Speaker 2 (01:21:29):
She surpasses him so quickly.
Speaker 4 (01:21:32):
I hope that she is the
one who instigated this, this separation. I feel complicated about
the whole the-brain-hasn't-fully-developed thing. I've,
like, written a lot about age gaps in heterosexual couples
and how I personally have never dated a man who's
even a year older than me, and I won't,
and that got a lot of hate when I wrote
(01:21:55):
about that. But to me, one of the big big
things about it is that it has a different kind
of power dynamic when someone has more years on this
planet than you have, and there is some
kind of parallel about dating somebody who's fresh out of
puberty and dating an AI.
Speaker 2 (01:22:08):
Yeah.
Speaker 4 (01:22:09):
Anyway, sorry, that's such a weird tangent. But yeah, no.
Speaker 2 (01:22:12):
I didn't realize that. I didn't know anything about that relationship.
I just knew that he had divorced Sofia Coppola.
Speaker 3 (01:22:19):
That's yeah, same.
Speaker 4 (01:22:21):
I mean, I maintain, what was Joaquin doing? What was
he doing as well?
Speaker 2 (01:22:25):
Like, yeah, gross on both ends. And then Joaquin Phoenix,
I think this is the movie where they met, because he's
married to Rooney Mara now, weirdly. But, I know, yeah,
but I mean, I think it's icky. I
think it's gross. I don't like it, and especially once it
gets down to, well, this is all I'll say about it,
but like, when someone's response to talking about an
(01:22:46):
age gap is getting really defensive and talking about what
is and is not technically illegal, I was like, you
have lost this argument if you have
to be like, well, in the state of... like, you're
like, gross, gross.
Speaker 3 (01:22:59):
Yeah.
Speaker 4 (01:23:00):
The reason why I feel complicated talking about it is because,
so, again, I'm interested in the movie
title and whether the her is supposed to be the
AI or his ex wife, and like I feel like, again,
if I'm going to be really generous, there's this self
awareness that it's just like, oh, her, like just that
woman over there. And in a way, when we have
these conversations about these men who are dating these like
(01:23:20):
very very very young women, they kind of become the her,
like I'm probably not properly giving her agency. If I
was that woman listening to this podcast right now, I
would be like, fuck you, I fell in love with
these men. I knew what I was.
Speaker 2 (01:23:33):
Doing right right.
Speaker 4 (01:23:35):
But it's not a commentary on her, on her, but
I am just treating her as a her. I'm just
really fixated on these men that would never look twice
at a woman my age. You know. Yeah, anyway, I
don't know where I'm going. I'm so inarticulate today
slash always.
Speaker 2 (01:23:50):
No, no, no, no. It's like there's only, exclusively, really difficult
things to talk about with this movie. I will say,
I looked up, has Sofia Coppola seen Her? And she
has so far refused to.
Speaker 4 (01:24:01):
Yeah, yeah, yes, I thought that was interesting too.
Speaker 13 (01:24:04):
You know, good for her, for her, the fact that she...
I mean, every time I hear something like that, I'm
like, is that true? I would not be able
to show that kind of restraint.
Speaker 2 (01:24:14):
But good. But if true, good for her.
Speaker 4 (01:24:16):
But also, it's a great position to publicly hold, even
if you had been watching it privately, as like one
last fuck you of like I'm not gonna waste my
time watching your shitty little movie.
Speaker 2 (01:24:25):
Exactly, exactly. And because I think her divorce movie was
Lost in Translation, which Scarlett Johansson is also in, right?
Yeah, there's, like, overlap there.
Speaker 4 (01:24:33):
Too, and they both I think they both got like
Academy Award nominations for these two movies, and I think
they both like peaked professionally in it. I will say say,
by the way, Lost in Translation is all about a
massive age gap, which is also really interesting because obviously
that's what her ex then went and did. But I
think the age gap is supposed to be portrayed in
(01:24:54):
this way that's like quite romantic. And I didn't like
Lost in Translation because I couldn't get over for me
an about just watching all of that.
Speaker 2 (01:25:02):
Yeah, brings me back to, like, our conversation about Ghost
World, where everyone got mad at us. What can you do?
Mm. Something I also thought
was that, I know we like, I'm sure we
mentioned it in our last discussion of it, but one
could say that Spike Jonze just, uh, makes these extremely
(01:25:25):
white movies across the board.
Speaker 14 (01:25:26):
And they would be right to say it. But
this movie, I mean, this movie is very white. There's
not, I don't think, a non-white character that has a
meaningful role in the entire movie.
Speaker 2 (01:25:37):
And that also feels like it enables Spike Jonze to
avoid a lot of, like, Theodore Twombly, while he is,
like, quote unquote a normal guy, he's incredibly privileged. He
is, like, a white guy who lives in a penthouse
apartment in Los Angeles and he's just moping and bitching.
I'm sorry. Like, and the, I mean, another
(01:25:59):
element that would have been true at this time, but
again wasn't, like, a huge area of discussion yet, was how
racist AI technology is, and how, like, it almost, I
feel like you could not possibly really maintain the tone
this movie has with a more diverse cast, because, like,
(01:26:20):
these are all white characters who are not thinking about, like, well,
is the AI going to, like, target me? Is the
AI going to do, like, any number of horrific... I mean,
you look at how the military uses AI, how
police use AI. The list goes on and on
and on. Again, it's like, I think part of why
(01:26:44):
so small that you're almost like encouraged to not think
about how this technology would be affecting literally anybody else, right,
when that's a far more interesting and complicated question than
I think what this movie is asking.
Speaker 3 (01:27:02):
I also want to talk about AI and labor as
it relates to this movie. It's honestly like mostly goofy,
but and.
Speaker 2 (01:27:12):
AI is magic in this movie. They literally go to
heaven at the end.
Speaker 9 (01:27:16):
Right.
Speaker 3 (01:27:17):
And admittedly I don't know a whole lot about how
most industries are being affected by AI in the very
specific ways, just because I don't work in those industries.
The one that I do know most about is how
AI is affecting the entertainment industry, which is the industry
that I work in, obviously. But you know, there's all
(01:27:40):
these horrible side effects question mark, that's not the right word,
but you know, just all kinds of things as far
as job displacement. There are, like, you know, the scripts
that are like speculated to have been written by AI
being just like garbage, and any humanity and creativity being
(01:28:02):
removed because again it's just regurgitating formulaic shit and spewing
out nonsense. All kinds of things as far as people's
likeness and voices and stuff like that being stolen and
sort of repurposed under AI. You know, there's all kinds
of stuff.
Speaker 2 (01:28:22):
And speaking to your point of AI having this tendency
to exacerbate what is already broken about the industry that
you're working in, it's like, I mean, I feel it
in writers' rooms certainly, and that was so much of
what the twenty twenty three writers' strike was about. And
did writers have huge wins in that area? Yes. But
(01:28:43):
the retaliatory end of that is that there's been very
little greenlit since then. So it's not like many more
people are working, because, I mean, and that's not
to be critical of the WGA or of labor, like,
they fucking did it, and to an extent that, like,
most unions have not been able to accomplish, but it
has come with this sort of retaliation of, like, okay,
(01:29:06):
if we gave you that concession, fuck you, we're just
going to create a different, you know, a different loophole essentially.
And then in podcasts, I mean, in podcasting and just,
like, quote unquote content in general, it's like, the issue
prior to AI becoming a larger presence was, like, churn
and slop, and now it's like AI makes that all
(01:29:28):
a thousand times worse, because they're like, what do you
mean you can't do four thousand episodes a week? We can,
the computer will do it. And it's just,
it's so bleak. Also, not for nothing, how, in Spike
Jonze's little head, has AI not replaced Theodore Twombly's
very... that is literally what I was about to say...
(01:29:50):
high paid job?
Speaker 7 (01:29:51):
What do you mean?
Speaker 4 (01:29:51):
Okay, but it's the fantasy, isn't it? The
creatives are the last ones to be touched. It's every
creative's wet dream.
Speaker 3 (01:29:57):
Yeah, it's, okay, so I wrote down: surely these OSes,
slash the tech behind them, would make Theodore's job obsolete,
and you would think that the technology that already exists
in this world, pre the, like, OS One hyper AI
thing, would also mean that Theodore's job is obsolete, especially because.
Speaker 2 (01:30:19):
They're the first to go. He's the first to go.
Speaker 3 (01:30:22):
They're the first to go. And his letters aren't even
that good. They're quite generic. He's not writing Shakespeare or
anything. Like, not...
Speaker 2 (01:30:28):
According to Chris Pratt. Chris Pratt is like, king.
Speaker 4 (01:30:30):
True. I'm not, again, I don't know why
I am defending, like, what's happening? I don't know. But
there is this scene where, like, Scarlett is like, oh,
I can take a look at a lot of your letters for you,
and she just, like, and he's like, whoa, whoa, whoa,
and she, like, cleans them all up and makes them better.
But it feels like she doesn't even go as far
as she could, because she's protecting his fragile ego. So
(01:30:52):
I think he doesn't clock the fact that actually
they could do it. Like, maybe that's.
Speaker 2 (01:30:57):
Tract or maybe he hasn't.
Speaker 4 (01:30:58):
I don't know.
Speaker 2 (01:30:59):
I did like that moment where she, very, it's like,
she was very kind of subtle in the way
she does it, but I was like, I think I
know what she's doing there, where she's going through
all his old writing from when he worked at LA
Weekly or something, and he's like, oh yeah,
this is like some of my writing, I thought some of it
was kind of funny, and she's like, yeah, we can
get rid of about eighty percent of this, and I
(01:31:21):
was like.
Speaker 4 (01:31:21):
Okay, no, it's even better than that. She's like, oh my god,
you're so funny, you're so funny, and she's like, yeah,
eighty percent of it can go. And you
see him, like, doing a kind of shocked thing
of, like, oh, I'm really great... oh, not so much,
you know.
Speaker 2 (01:31:36):
Yeah. But then on the other side of that, you
have like, I don't know, what did you both think
of her getting him, like, a publishing deal at the end?
Speaker 4 (01:31:44):
I thought that was really interesting. I wanted to talk
about that.
Speaker 2 (01:31:46):
Yeah, yeah, yeah, what did you think of that? Because I've,
like, had a couple different reads. Like, okay, maybe it was, like, in
some ways that's her doing her job, because, okay,
I was gonna say this earlier, where, like, going back
to, some not all, and I will fully say that,
whatever my feelings are, I know that
(01:32:09):
there are some exceptions but not very many, but
it seems so designed, in these cases we're
talking about, to help men avoid discomfort, and discomfort is
a part of life and it's a part of existing
in the world. And again that's a broad statement, but
it feels like that publishing deal was kind of a
(01:32:30):
part of that, where it was like, he's afraid to
put himself out there. Well that's fine, I'll do it
for you. And I did all the parts that were
hard and uncomfortable, and now you just get to like
reap the benefits. Well, not just discomfort.
Speaker 4 (01:32:42):
It goes back to what you were saying about rejection,
like it's literally when you put yourself out there for
literary stuff, it's the fear of rejection and he didn't
want to have to face that. Let's again just note
that this is an absolutely bizarre utopia where people are
just landing book deals left right and center for mediocre writing,
you know.
Speaker 2 (01:33:02):
For other people's letters, too. I'm like, does he
not have to ask his boss, like, can I do this?
Speaker 3 (01:33:07):
Exactly?
Speaker 4 (01:33:08):
Exactly like everyone who writes knows that you don't own
anything because it's always in the contract that they own
you forever and ever and ever. But again, loved the
little utopia there. I felt like he was he was
kind of the thing. One of the things again that
I really didn't like about his ending of like this
neat bo is like in some ways his life was
made better by he got this book deal he's able
(01:33:30):
to And again, I just think that's not messy enough. Yeah,
so I didn't appreciate that it was both unrealistic and
not messy enough. Like I also wanted him to be
way more mad at her, for you were saying about
this thing about like consent, Like he didn't ask her
to do that and she just went ahead and did it,
and he should have been more pissed, but he wasn't
because everything turned out great.
Speaker 2 (01:33:50):
Yeah.
Speaker 3 (01:33:50):
That feels like a huge breach of, totally, boundaries and privacy.
And I would be so mad if someone's like, hey,
I took a bunch of your creative work and submitted it,
absolutely, on your behalf, and you didn't know about that.
I would be furious.
Speaker 2 (01:34:07):
Yeah. I also I do think it's kind of a
fun humiliation for all those fucking losers that are like
I refuse to write my wife a letter. I'm like,
you should you your wife should find out that you're
a lazy piece of shit.
Speaker 4 (01:34:19):
Well, well, again, in the opening scene, I think, if
I remember rightly, he writes one side of the letter
and then he writes the other side back, which is super fascinating.
Speaker 2 (01:34:30):
I think he writes.
Speaker 4 (01:34:30):
For both the man and the woman. That's also really interesting, Like,
you know, so many of the scenarios that have played
out, it's just one side, and generally the man, engaging
with the ChatGPT, but when both partners are doing it,
that feels like it's a different kind of set of considerations.
And then it's also like what are we doing here, guys,
Like truly, he's also talking to you. I don't know,
it's so bizarre. It's so bizarre.
Speaker 2 (01:34:52):
Another thing is, I kind of forgot
that this happens because it doesn't really go anywhere, but
the idea of when Amy Adams becomes friends with... I
wasn't clear if it was, like, friends or romantic or
what it was. It seemed more friendly to me, that
she was just kind of joking around with
her AI, who had a feminine voice. They were
(01:35:14):
making the mom hump the stove or whatever on like
Mommy video game, and it seems like they were just
hanging out and that felt I wonder how intentional that
was because I liked that it was like Amy Adams
is using it for friendship when it's like most of
the men we see are not capable of using it
strictly for friendship. Yeah, but I don't know.
Speaker 4 (01:35:32):
Yes, And I guess it also bothered me a little bit.
I liked that. I really liked that scene. Also, the
only scene in the movie right of two women technically
kind of talking to one another, even though it didn't
advance the plot, but it was like the only thing
that came close to the Beechdel test.
Speaker 2 (01:35:46):
Yeah, like, make the mommy fuck the stove, exactly.
Speaker 4 (01:35:49):
But I also felt like none of the women have
sex drives, right? Like, she's not interested in fucking her
AI. Olivia Wilde only wants to sleep with him if
she gets the emotional thing of a relationship. And even
the woman that he's having the phone sex with, she's
such a freak, because she wants to be strangled with
(01:36:10):
a dead cat, that I was just like, that's
such a weird interpretation of female pleasure. I get it's
supposed to I get what you're doing and why you
had to write the scene that way, but can't you
just show me like one horny, well adjusted woman, please?
Speaker 3 (01:36:24):
That would be nice.
Speaker 2 (01:36:25):
That's true, that's true. Yeah, I mean, including, and then
the Isabella character, it seems like she is... That scene
is just kind of baffling to me in general. Wait,
is Isabella his ex wife?
Speaker 3 (01:36:37):
No, the surrogate which oh.
Speaker 4 (01:36:40):
Yes, she doesn't even have. Her only desire for sex
is about this thing of like tapping into someone else's relationship.
It's just emotional.
Speaker 2 (01:36:49):
Yeah, I think, yeah, it's all emotionally driven. There's
no, yeah, raw woman horniness except for Samantha. But
it feels like that's happening in response to what Theodore
wants, but yeah.
Speaker 3 (01:36:59):
And Samantha is not a human, so it doesn't count.
That reminds me of a point I wanted to make
regarding the Isabella surrogate character, where when Samantha is pitching
that they use a surrogate to sort of like spice
up their sex life. Oh, Theodor says something like, oh,
(01:37:20):
what like a prostitute, and you can hear the disgust
in his voice. Yes, And Samantha is very defensive and
says no, no, no, no, no, there's no money involved.
She just like wants to be a part of our relationship,
which sounds like there could be some sort of like
unicorn element to that, and you know, we're not here
to kink shame, but I think it would make a
(01:37:42):
lot more sense if in this world that was sex work,
like these surrogates were sex workers who you would hire
and pay to, like, represent the physical body of an
OS, and that there hopefully wouldn't be
shame attached to hiring a sex worker for that. But
(01:38:04):
this movie can't envision a world where that's part of it,
and instead it's.
Speaker 2 (01:38:10):
And this was pretty early on in, I mean, not
to defend the movie, but pretty early on in this being
a thing. I think that would have been interesting. I mean,
the way that they chose to go with that character,
I just found it confusing what happened off screen.
Speaker 4 (01:38:26):
Just two really really quick points on that character. One,
I agree with you, Samantha's reaction, the whole thing
about the sex work, is really really interesting. And Samantha's
thing of, like, no, she wouldn't be paid, is interesting,
because if payment is the definition of sex work, then
Samantha can't view herself as a sex worker, because maybe
she's a sex worker as well, but she's just
she's a sex worker as well, but because she's just
(01:38:47):
not being paid for like the work that she's doing.
And then on the point of that woman, I swear
there was a line where he was talking about why
it didn't work for him and he said, I saw
her lip quiver and I thought that was such a
beautiful articulation. Again, I think there's so many good parts
of the writing about this, of, like, just like his ex
wife says, he doesn't want to deal with someone with feelings,
(01:39:09):
bodies are really messy, and bodies are like so overwhelming.
Oh my god, Like every time you sleep with someone
it's just like.
Speaker 3 (01:39:16):
What you know, it's low key disgusting, and Lasothy.
Speaker 4 (01:39:20):
Is really really intense, and like, yeah, it makes it,
really really, you're like, oh, okay. I think
it almost feels a bit relatable, of, like, I can
imagine, like, all of a sudden seeing a lip quiver
and being like, oh my god, a whole soul, a
whole being, I can't handle it, you know.
Speaker 9 (01:39:36):
Yeah.
Speaker 2 (01:39:37):
I mean that was one of the points in the
movie where I was, like, kind of on Theodore's side.
Hear me out, because he does say no to it.
He says no, that would make me uncomfortable. A couple
of times. Yeah, Samantha is pushy about it and I
think the internal logic is like, well, she knows him
better than he knows himself or whatever it is. But like,
(01:39:57):
but I mean, just putting yourself in this situation that
is so uncomfortable, to know that you're in front of
a person who is not the person whose voice you're hearing.
Like, it's just too many layers to be able to get horny.
It's just uncomfortable.
Speaker 3 (01:40:11):
And he doesn't know her at all, but Samantha does.
So that's another weird, like one sided aspect of that.
Speaker 2 (01:40:18):
And she knows that, and she apologizes and she's like
that was a bad idea.
Speaker 4 (01:40:22):
I wonder if they could have done something interesting of
like circling back to again, remember the very first scene.
They do so much in the first moments to set
up the movie. So they separate out his like really
romantic voice when he's writing these letters and he says print,
and he also does this gearshift when he's talking to
Samantha where he like, there's like a voice for talking
(01:40:42):
to the AI that he wants to emotional engage with him.
There's like a voice for commands, which I also thought
was really interesting. But wait, why am I saying all
of this about at the beginning.
Speaker 2 (01:40:52):
Oh.
Speaker 4 (01:40:52):
Then, so, he leaves the office and he says, play
melancholy music, and he's like, not so melancholy. And then
he starts looking at his phone, it's like, do you
want to, or the AI says to him, do you
want to see a naked pregnant star?
Speaker 3 (01:41:06):
There's, like, news stories, oh yeah, about different conflicts in
the world, and he's like, pass, I don't care about
that shit. And then it's like, do you want to
see provocative pregnancy photos of XYZ celebrity,
and he's like, yes.
Speaker 4 (01:41:20):
And again I think that's like it's good writing. It's
like that's how a lot of people actually feel. And
I don't think it's trying to sell up that he's
got a real kink for pregnant women. I think it's
just like why not have a look? But I don't know.
I don't know.
Speaker 3 (01:41:35):
He's just someone who doesn't really want to engage with
the world and its problems, and he's longing for companionship
and doesn't really know how to handle it in a
mature way.
Speaker 15 (01:41:46):
Yeah, which is like, many such cases, kind of, yeah.
I liked that sort of plant, but again, it felt
like it kind of went away after a time where
it was like it felt like towards the beginning, like
porn comes up a lot at the beginning of the movie,
and then once Samantha's in the picture, it kind of
stops coming up. It would like again not to like
rewrite the movie, but it's like it would have been
(01:42:08):
interesting for them to have talked about that at some point,
because clearly she knows what he likes. She knows everything
about him, She has access to all of his data,
she knows exactly what to do.
Speaker 2 (01:42:20):
And, like, in their first phone sex scene together,
he's picturing, or no, is it with the Kristen Wiig
cat lady? He's picturing the pregnant woman. Again, yeah, yeah,
it cuts back to like him having a fantasy about
the pregnant woman and she's like walking towards him naked,
and like, you know, it's like in a way they're
(01:42:41):
like you can sort of see where he's going where
it's like, oh, he's doing the horny thing where he's
thinking about three things at once. It's like, he's thinking
about porn he looked at today, he's listening to someone's voice,
he's also, like, doing his.
Speaker 3 (01:42:53):
Own thing he's jerking is presumably.
Speaker 2 (01:42:56):
What a magician, what a miracle worker, that he can
do so many things at once, But it feels like
that was kind of like the sex aspect of it
kind of goes away over time because it becomes about
the relationship, which.
Speaker 4 (01:43:09):
I also thought it was interesting that even the AI just gets boring,
like it was trying to say something about, like, relationships,
and I couldn't tell if that was trying to hint
at, this is a real relationship and, just like any
relationship, the sex fades. Or what else could it have
been trying to say? I'm not sure, I'm not sure.
Speaker 2 (01:43:26):
I don't know. I know, we're desperately trying to be like,
what is this? What is this man trying to tell us?
What is this aging skateboarder trying to tell us?
Speaker 4 (01:43:38):
I think the thing that I really struggled about with
this movie, and one of the reasons why I did
want to, like, find out everything about this man's personal
life, is because my read on things is, like, it
just is so contingent on his self awareness. Either it's
a good movie because he can tell he's a pathetic
guy who doesn't really want to face up to stuff,
or he's thinking of himself as, like, an actual likable protagonist,
(01:44:01):
in which case I'm not super into it.
Speaker 2 (01:44:03):
I don't... yeah, yeah, I don't know where to, I
don't know where to land on it. But it was
one of the cases where, I don't know, I feel, yeah,
like on the show over the years, sometimes you learn
more about, like, where the director was at when they
made it, and it makes me feel more generous towards
the movie. I wouldn't say that's the case here. It's, yeah,
hearing interviews with him and just reading about, I don't know,
(01:44:25):
there is, and I know that it's because it's a
very personal movie, but, like, no one forced him to
make it, and there is sort of this, like,
repeated, and I understand, like, it's very personal, but
it's like, he's very defensive when asked questions about the
movie he made on purpose. So, yeah, it didn't make
(01:44:46):
me think, like, more fondly of the movie. But it's
like weird, because, I mean, as I said, I
like Spike Jonze in general, I mean, I really like...
I mean, I haven't seen it in years, so maybe
it would feel differently, but, like, Being John Malkovich is great.
I mean, I guess I really like his work with
Charlie Kaufman, and this is his first movie that he
wrote by himself, and yeah, I don't like it as much.
(01:45:09):
So maybe what I'm saying is I like Charlie Kaufman.
Speaker 3 (01:45:12):
Oh huh, yes, I think we like that guy a lot, correct.
Going back really quick to talking about bodies and sex,
I did feel very seen when Olivia Wilde's character tells
Theodore to use less tongue when they're kissing. Oh, and
then it seems like he kisses with no tongue after that,
(01:45:34):
and then she's like, well, you can use a little bit,
but mostly lips. And I don't know if this is
supposed to be part of what makes her, the Olivia
Wilde character, like, incompatible with him, and he's more compatible
with a virtual girlfriend who he doesn't even have to kiss.
But as I've stated on the podcast before, I have
(01:45:56):
had to teach so many men how to not be
so horrible at kissing and to not shove their entire
tongue down my throat at all times. And I just
I felt very seen in that moment. And was that
part of.
Speaker 4 (01:46:13):
The reason why he was like, I don't want anything
with you beyond tonight? A criticism. So I thought that
was really interesting.
Speaker 2 (01:46:22):
Yeah, and that had to have been, like, I'm like,
the word, that had to have been self aware on
Spike Jonze's part, right, where it's like, yeah, that
it's, anytime there is, I mean, that's why he's dating
a GPT. That's why Rooney Mara is right, is that
anytime someone lightly pushes back, even presents constructive feedback, like,
he can't even handle constructive feedback, he takes it as
(01:46:44):
rejection, and then he throws the rejection back in someone's
face before he can get rejected, and then he starts
fucking his computer, and you're just like, best of luck, man,
this is not my clown, not my circus. I don't know.
Speaker 3 (01:47:00):
Well, I love that turn of phrase. I think
the last thing I wanted to say is polyamory icon Samantha,
because I also felt rather seen when she says the
heart isn't like a box that gets filled up. It
expands in size the more you love. And I liked.
Speaker 2 (01:47:21):
That's a greeting card too, so many women speaking
in greeting card.
Speaker 3 (01:47:26):
I would accept that greeting card if someone sent it
to me. But, I mean, I do find that ending
very interesting, where he's basically claiming ownership over her and
he's saying, like, you're mine or you're not mine, there's
nothing in between, and she's like, well, there is, like,
(01:47:48):
I'm yours and I'm not yours. It's not an or,
it's an and. And it just goes
to show the, you know, incompatibility between polyamorous people and
monogamous people, but.
Speaker 2 (01:48:02):
Also that she's also just like a melted iceberg tip like, and.
Speaker 3 (01:48:07):
She's also not a person, she's not real. Well, that aside,
the tagline for this movie was a Spike Jonze love.
Speaker 2 (01:48:16):
Story, and I'm not interested. But then again, his BFF
Charlie Kaufman wrote one of the best, one of my
favorite romances ever, he wrote Eternal Sunshine. So, but yeah,
this is just not... when it comes to, in general, male
auteur plus here's-my-take-on-relationships, I'm not interested.
Speaker 3 (01:48:36):
A movie with a similar premise that just handles this
very differently and that has a very different narrative trajectory,
which we talked about a little bit off mic, is
the movie Companion that came out earlier this year, which I
think would be really interesting to examine, especially sort of
side by side with this movie, because I think it
does a lot of the things that this movie either
(01:48:58):
fails to do or isn't totally clear if it is
presenting commentary, whereas Companion is I think doing that more cogently.
The last thing I want to say is, I guess
a little PSA for anyone who might be listening who
(01:49:18):
does use ChatGPT or any other kind of, like,
whatever generative AI. If you're thinking, oh, well, this doesn't
apply to me because I don't have a romantic relationship
with a chat bot, or I just use it to
help me draft emails, well consider this. We alluded to this,
(01:49:41):
But I want to give some specifics about the environmental
impacts of generative AI. It uses a staggering amount of electricity,
which of course leads to increased carbon dioxide emissions and
pressures on the electric grid in whatever area. A great
(01:50:04):
deal of water is needed to cool the hardware that
is used for just whatever. The functionality of AI models
and generative AI the the way that the raw materials
that are used to fabricate GP use, which is a
(01:50:26):
processor that handles generative AI workloads, the way those raw
materials are mined and obtained are usually very harmful to
the environment, harmful to the communities and the places that
I mean all of them are being mine from all
the environmental frontline communities. Like it's the same story over.
Speaker 2 (01:50:49):
And over and over.
Speaker 3 (01:50:51):
Yes, So if you're using AI, stop it, stop stop it.
Speaker 2 (01:50:55):
I don't care, Like I.
Speaker 6 (01:50:59):
Get a fucking library card and get the
fuck off, whatever. I defy you to send us a
good reason, give us a good reason. But it's just
like, if you're using ChatGPT to write your emails,
to write your fucking wedding vows, right, just all of
these different things, like, grow up, grow up. That's
(01:51:22):
my final report on the movie Her.
Speaker 3 (01:51:28):
Does it pass the Bechdel test? We talked about this
on the first episode. I don't even remember or really
care at this point.
Speaker 2 (01:51:35):
It definitely doesn't. There's, like, yeah, there's a few, like,
if you count Samantha as, I guess, like, a phone,
and, coded, like, I guess, does a marginalized gender computer count?
We don't know. No, no, then double no. But even
though, like, there were a few conversations with chatbots
talking to women off screen, there's the one scene of Samantha
(01:51:56):
talking to Chris Pratt's girlfriend about feet. Yes, but
it's about how Chris Pratt feels about feet, so unfortunately
that's not a pass, right. Yeah, just no, let's go
with no.
Speaker 3 (01:52:09):
Let's go with no. Our nipple scale, though, which we
also rated last time, I feel like I think I
gave it like a one and a half, and my
feelings haven't really changed from our first episode. I would
maybe even go down to one nipple, just based on
the sort of incel-y way women are written in
(01:52:32):
the movie the I don't know, just I do think
the movie is asking us to root for the Theodore
character and presenting him with flaws, and it is saying, like,
look at him learning and growing and evolving by the
end of the movie. But I don't I don't know.
(01:52:53):
Maybe it's just that I find this movie so exhausting,
but uh, I don't know. I'm I'm also struggling to
articulate myself. But I'm just gonna suffice it to say
one nipple, and I'll give it to
the representation of the use of public transit in LA.
Speaker 2 (01:53:14):
I know, unfortunately imaginary transit, but what can you do.
I'll give it one nipple. I do, I have complicated
feelings towards this movie, because I do think that,
in spite of all the shit we've talked, there are
moments of this movie where it really feels like it's
doing something. I think that it is worthy of mention
(01:53:36):
that this subverts the, like, Lady AI story we've seen
a million times, where at the end they're soulmates and
they're gonna build a nuclear family together. Like, this is
not necessarily more realistic, because she goes to AI heaven,
but like, you know, it's like, it's not suggested that
this character is destined to be with the first man
(01:53:58):
she meets in the entire world, which, unfortunately, in stories
of this nature, is saying something. But it's just I mean,
I don't know. I think part of why I'm feeling
extra sour on it now is just because of time
and how I think just the narrowness of the scope
of the story just like kind of didn't age very
well for me, even though it's like that's not the
(01:54:20):
fault of the movie, but it's hard not to be
distracted by how myopic it feels in retrospect, and I
just don't like when a guy plays a ukulele and
they're like, isn't this romantic? It's like, no. Actually,
that... I think about this. It sent me back
to, I hadn't thought about the Barbie movie in a while,
(01:54:41):
but the Ryan Gosling scene where he's like, I'm going
to play guitar at you, and it felt like that.
So this movie feels like Spike Jonze playing the guitar
at me, and I don't... it's not my favorite. And
in terms of intersectional feminism, where is it? Where, you know?
One feels almost generous, but I'll give it one, and
(01:55:01):
I'll give it to Amy Adams's character. I wish her
the best. I hope she quit her job.
Speaker 3 (01:55:07):
Yeah, yeah, Mona, how about you.
Speaker 4 (01:55:10):
I'm gonna give it... again, I haven't watched many movies,
really, really relevant contextual detail. I'm gonna give it three,
because I don't think it's quite... It's beautifully shot, sure. I think
the acting is good. I think that if the writing
is self-aware, it's good. But we can't possibly know.
Speaker 3 (01:55:34):
It's just not clear enough.
Speaker 2 (01:55:35):
It's so hard to know, is... how much does Spike
Jonze know himself? I never thought I would
devote this much time to trying to get to the bottom
of that.
Speaker 3 (01:55:47):
Oh goodness. Well, Mona, thank you so much for joining
us in this discussion.
Speaker 2 (01:55:52):
It's a blast.
Speaker 3 (01:55:53):
Come back anytime if you feel like watching a second movie.
Speaker 2 (01:55:56):
Yeah, if there's a movie that's on your list and
you need an excuse, let us know. Hook?
Speaker 4 (01:56:00):
I feel like, what... what if you need someone who
has only really watched Hook and a few other things?
Where would you, both of you, given your
vast, vast records... where would you recommend I start?
Speaker 3 (01:56:14):
Oh my gosh, my gosh, every.
Speaker 2 (01:56:17):
We'll send you a list.
Speaker 3 (01:56:18):
Well, yeah, we'll just have to. We'll curate a list
for you.
Speaker 2 (01:56:21):
I, Frankenstein. So all the movies that are coming to
my head are bad movies I love.
Speaker 3 (01:56:28):
I would say, I would say Paddington and Shrek, to
call back to our celebrity friends that we are
so, like... But yeah, where can people check out your
follow you on social media, et cetera.
Speaker 4 (01:56:44):
Log out, spend time with loved ones, not on screens.
Speaker 3 (01:56:48):
That is such a good call.
Speaker 2 (01:56:50):
That's the real lesson of Her: log out.
Speaker 3 (01:56:55):
And speaking of, normally we would plug our Instagram and
our Patreon, but instead we're going to really focus on
plugging the Midwest tour that you can come to in
person and meet us in person.
Speaker 2 (01:57:07):
And there's many stories of friends and even lovers meeting
at Bechdel Cast shows. This is true over the years,
and so it actually really does make me feel good
when we go to shows and you, like, meet people
afterwards who are like, I came alone and then I made friends.
Speaker 3 (01:57:22):
You're like, oh, love that, it's beautiful.
Speaker 2 (01:57:25):
Come to the live shows. It's a good hang.
Speaker 3 (01:57:27):
Yes, we're doing shows in Indianapolis, Chicago, Madison, and Minneapolis
at the end of August and early September. You can grab
tickets at Linktree slash Bechdel Cast.
Speaker 2 (01:57:39):
And with that, let's all go to AI heaven. Let's
break up with Joaquin Phoenix.
Speaker 3 (01:57:47):
Yes please, Okay, bye bye. The Bechdel Cast is a
production of iHeartMedia, hosted by Caitlin Durante and Jamie Loftus,
produced by Sophie Lichterman, edited by Mo Laborde. Our theme
song was composed by Mike Kaplan with vocals by Katherine Voskresenski.
Our logo and merch is designed by Jamie Loftus, and
(01:58:10):
a special thanks to Aristotle Acevedo. For more information about
the podcast, please visit Linktree slash Bechdelcast.