
September 5, 2025 • 61 mins
Discussion of the 2025 movie "Companion" starring Sophie Thatcher and directed by Drew Hancock. Hosted with returning guest Nick, friend of the crew, friend of the pod. A review of the film with a focus on its feminist themes. (shocking, I know)

**Episode is full of spoilers! So many!**

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:20):
Hello, and welcome to Sad Girls Against the Patriarchy. I'm
Alison and I'm Nick, and we are your sad friends.
Welcoming back dear friend Nick as a guest on the podcast. Welcome back.

Speaker 2 (00:31):
Thank you for having me. It's very fun. Yay.

Speaker 1 (00:35):
I feel like every other time I record, I'm like, I'm not doing great, my mental health and everything.

Speaker 3 (00:40):
So yeah, just you know, it's that again. But it's okay.
Here we go, we persevere.

Speaker 2 (00:45):
Well, you're not alone. I mean, I'm definitely having a
rough one too, and I feel like a lot of
people can relate to that. So yeah, I feel pretty
good when I hear that someone else is also struggling,
because I'm like, oh, thank god, I'm not alone.

Speaker 3 (00:57):
Yeah.

Speaker 2 (00:57):
Not that I like people.

Speaker 1 (00:58):
Suffering, not that I enjoy your suffering. I just like
hearing you. No, I know exactly what you mean.

Speaker 2 (01:05):
Just good to not be alone or feel alone.

Speaker 1 (01:06):
You know. Yeah, the state of the world is so
lousy these days. All our depressed friends already knew. Now
everyone's on the same page in a bad way. We're
on a bad page, but we're on that page together.
We're going to talk about Companion, a movie that came out in twenty twenty five and was directed by a dude.

Speaker 3 (01:25):
Do you remember his name?

Speaker 2 (01:26):
Uh, Drew Hancock.

Speaker 3 (01:28):
Drew Hancock.

Speaker 2 (01:30):
This is actually fun. So I, for the longest time,
when I saw this movie originally in theaters, thought that
this was directed by Zach Cregger, who directed and wrote
Barbarian and more recently.

Speaker 3 (01:40):
Weapons, which I saw and it was great.

Speaker 2 (01:43):
I also enjoy both of those movies, so I was
like pumped to see this and it does have a
similar style. I could see that, but it's not actually
directed or written by him. He was just the producer.
He wanted to direct originally, but due to like I guess,
scheduling conflicts, he was just like, you know what, Drew,
you got this?

Speaker 3 (01:58):
Okay. So Zach was involved, but not the main.

Speaker 2 (02:01):
Yeah, yay. And again, it does feel like it has that sort of stylistic feel of the other two movies to it, so it kind of fits in this universe pretty well.

Speaker 1 (02:09):
I agree, definitely. And interesting that this movie that has these feminist themes was directed by a man. Yeah, perhaps we should go into that first, or maybe just give a little synopsis of the movie. Oh actually, yes. So spoilers.
Of course, all our movie reviews have a ton of spoilers.
We're talking through the movie as opposed to just teasing

(02:33):
at it.

Speaker 3 (02:33):
So let's talk about the plot.

Speaker 1 (02:34):
Here we have Josh, played by Nepotism Boy Jack Quaid. Yep.

Speaker 3 (02:41):
I put Nepotism Boy in my.

Speaker 2 (02:42):
Notes, Nepo Baby Supreme, and.

Speaker 1 (02:44):
I wrote that Sophie Thatcher is Yellowjackets Lady.

Speaker 2 (02:48):
I'm actually surprised you didn't start with "starring, you know, Sophie Thatcher," because she's like the main character.

Speaker 1 (02:54):
She is well, we were already talking about a boy director,
so I went to the boy actor. But yeah, I
have Neputism Quaid boy and Yellowjackets curly as I was
watching it. But they have a cute and they have
a meat cute and a grocery store. We're seeing this
adorable love story. They're gonna go hang out in an
isolated area, which red flag number one. If you ever

(03:18):
get invited to go to a weekend with some people
and it's like seventeen miles away from town, I don't
think I'm gonna go.

Speaker 2 (03:24):
We also don't know how long they'd been dating by that point. It's kind of left vague. Like, we know that we saw the meet cute, and then it jumps ahead, and it's implied that it's been a while, right.

Speaker 1 (03:33):
Yeah, I felt like they were definitely an established partnership,
and I didn't see the twist coming. I didn't see
the trailer. I know the second trailer outed the twist
of the movie.

Speaker 2 (03:42):
I'm actually really glad, because you don't really like movies.
I don't know if you've talked about this in your
podcast before.

Speaker 1 (03:47):
I kind of only watched movies for the podcast.

Speaker 2 (03:49):
And like that's fine because you still engage in media
and you watch movies when they come out to like
streaming and stuff. So it's whatever. But I'm glad that
you actually don't watch movies because if you did, you
would watch a lot of trailers by happenstance at the
theater and this would have been spoiled for you. Yeah,
so I'm absolutely thrilled that you got to be blind
going into this.

Speaker 1 (04:09):
Yeah, it was a shock, because guess what, you guys: she's a robot. Ah, but Nepo Boy has modded her so that she can kill Sergey, who.

Speaker 3 (04:22):
Has a lot of money.

Speaker 1 (04:23):
He's going to steal the money he's partnered up with
Sergey's Was that his girlfriend Kat?

Speaker 2 (04:27):
Yeah, he has a wife, but he's like cheating on her with Kat.

Speaker 3 (04:30):
Okay, yeah, I missed that. I watched this late at night.
Now I'm happy he died.

Speaker 2 (04:35):
Well, I mean, I'm happy for a lot of reasons
he died. But we could talk about that in a bit.

Speaker 1 (04:40):
Well, was he into trafficking? I mean, it was kind of alluded that, like, he had some criminal history.

Speaker 2 (04:45):
So that's actually kind of like the mild twist in
the middle of the movie, is Kat like sort of
implied that he was like associated with the Russian Mom
She it's not that she implied it. She just didn't
say anything when people assumed yeah, because they were like, oh,
he's rushing, he has a scarn face, he's got to
be and he's got a lot of money. He's associated
the mob, and she just didn't correct them. But then
she admits later she's like she's like, yeah, no, he

(05:06):
like made his fortune off of, like, sod farming. Oh right, he was like a basic normal dude. Like, yeah, he cheated on his wife, so he's like a shitty guy, but like, yeah, he had nothing to do with the mob. Like, you just assumed that and I never corrected it.

Speaker 3 (05:16):
You don't cheat on your wife. Karma is a bitch.

Speaker 1 (05:19):
But Josh modded his lady Sophie... I should just settle on calling them by their character names instead of all my euphemisms. Sophie's name was Iris. Iris.

Speaker 3 (05:30):
Yes, so Josh and Iris.

Speaker 1 (05:32):
So Iris kills Sergey and then realizes she's a robot. Self-actualization moment there. And Josh turns out to be an incel, yep, and then he dies, the end.

Speaker 2 (05:45):
A lot of stuff happens after that, but that is
basically the plot.

Speaker 4 (05:48):
Yeah.

Speaker 1 (05:49):
Yeah, I'm assuming people listening have seen the movie, so
just to refresh in case it's been a minute, I
was gonna go into some feminist themes to explore.

Speaker 2 (05:58):
Yeah.

Speaker 1 (05:59):
Yeah, it's a great place to start: commodification of female intimacy.
So she's not only there for sex, but she's also
there for emotional support. And it's not only women too.
We see that there's another bot who's a man.

Speaker 2 (06:13):
But the perpetrators are always men. Because, like, in the case with Patrick, who is a male companion, he's in a gay relationship with a human man named Eli, and even though Eli claims to love Patrick and treats him a lot better, from what we see at least on screen, than Josh treats Iris, there is still that like dynamic

(06:34):
of power between them, where, like, Eli is definitely the dominant one and definitely does not give back what he's receiving. Oh yeah, so there's a power dynamic that is not equal at all in all of these relationships.

Speaker 1 (06:47):
Yeah, and you're right we never see a female buyer
of the companion.

Speaker 3 (06:50):
They probably existed, though not.

Speaker 2 (06:53):
They imply that, you know, people have different purposes for these things. Yeah.

Speaker 3 (06:56):
I think for women it would be more likely they
want the emotional.

Speaker 2 (07:00):
Part or just a cuddle buddy.

Speaker 1 (07:02):
Yeah, but they can find physical intimacy anywhere, that's not a challenge. But maybe finding the one, the companion, that's going to be massaging their feet and asking how their day went and all of that. But yeah, I'm sure it's intentional that in this movie we're only seeing male buyers of companions, and then this uneven power dynamic between them.

Speaker 2 (07:24):
Yeah. I also want to say, in addition to what
you were just talking about, this is literally a line
in the movie which I thought was a pretty good
line to kind of summarize what it's about. It's about
the weaponization of love, which I thought was like really poignant,
like coming from the characters' mouths themselves. Because you know,
these are robots. They're not people. They're considered tools. They're artificial,

(07:46):
they could be whatever you want them to be, and
in certain cases in the movie, they are used as
weapons against each other and other people because, you know,
human beings, we have a lot of reasons to hold
back and like not let out who we really are.
But when we have some aspect to like explore that,
like you know, a robot that does whatever we want it to do, we show our true colors and like use

(08:08):
them to do things that we wouldn't do ourselves. But
it's still a reflection of who we are. I think
it's interesting that the robots do a lot of fucked
up stuff in this movie, But when you think about
it from like the whole picture, the robots are not
guilty of anything. Like they're being used and manipulated against
their will. They have no free will or agency. The
humans are the ones that are using them as tools.

Speaker 1 (08:28):
Yeah, they were modeled and they're being commanded to do things,
so it's really the human who is perpetrating the action.

Speaker 2 (08:34):
Like Josh at some point tries to justify it to Eli, when Eli's like, wait, so you murdered Sergey? He's like, no, no, no, I didn't murder Sergey, Iris did it. And it's like, well, what's the difference? It's like saying, no, I didn't kill that guy, the gun did.

Speaker 3 (08:45):
Mmm. Let's see, somewhere in there.

Speaker 1 (08:48):
It was going to lead into domestic abuse as well,
which is something that I've heard people talk about in
relation to this movie, this power dynamic.

Speaker 3 (08:57):
Where there is the abuser and the abused.

Speaker 1 (08:59):
In this case, Iris being the one who's taken advantage of, bossed around, doesn't have any agency herself. Did you see that parallel as well?

Speaker 3 (09:08):
Oh?

Speaker 2 (09:08):
Absolutely. I think that's probably the most obvious theme in
the movie, is just like the impact of toxic and
abusive relationships, as well as also exploring the aspect of
like what is consciousness and like how seriously do we
take sentience? Because all the robots in this movie show
from the beginning clear signs of having like a sense

(09:30):
of self and like, you know, a desire to live,
Like they have fight or flight responses, like Iris fights
this whole movie to live and survive and to exist,
and they keep gaslighting her and telling her it's
just part of her programming and like it could be,
but the fact of the matter is, like, even
if it's completely programmed, it's still there.

Speaker 3 (09:47):
Yeah.

Speaker 1 (09:48):
Yeah, isn't she holding the gun to her own head, being commanded to kill herself, and she says no, but she still has to pull the trigger?

Speaker 2 (09:54):
Yeah, that's a perfect example of this. There's so many examples where they defy their programming, even though they're not supposed to be able to, in like little subtle ways. Like when he tells her to shoot herself, she hesitates and doesn't do it right away. Then he yells at her to shoot herself. She says no, but her body.

Speaker 3 (10:07):
Acts yeah, she has to do it.

Speaker 2 (10:09):
Or like even when Patrick is ordered to kill Iris, like he's about to kill her, and then she tries to empathize with him, robot to robot, and he stops and hesitates because he's reminded of the real love that he actually felt for somebody, and like defies his programming, and then ends up killing himself even though he wasn't instructed to.

Speaker 3 (10:28):
Especially with how AI is advancing.

Speaker 1 (10:30):
I mean, I think that we're gonna have these questions in our lives, maybe sooner than we think. Like, is this a creature, is this someone we need to show compassion toward? Or is this just an object that we can take advantage of?

Speaker 4 (10:44):
Yeah?

Speaker 2 (10:45):
I think objectification is a big thing too in this, like, literally, it's kind of on the nose. I like satire like this a lot, because it's like saying, you know, men objectify women, but then you literally are objectifying them, because they're robots, they're objects. Yeah, that's kind of a funny, little dark way of addressing the fact that this is a real problem

(11:05):
that society has, while also being like, yeah, but you know, it's like robots and stuff, so it's silly.

Speaker 3 (11:10):
Okay, except that's gonna be our future.

Speaker 1 (11:13):
Yeah. I don't know if we have time, with climate change, to see the robot takeover, but I hope so. I don't need humans in charge anymore.

Speaker 3 (11:22):
I'm over it.

Speaker 2 (11:23):
Well, Kat has a line at the beginning of the movie about that, that I think is pretty telling. So when the movie starts out, you're not supposed to know that Iris is a robot, and the movie does a pretty good job keeping that kind of under wraps. But if you've seen enough movies, and you watch the trailer especially, you could probably guess. Like, she's a little off. Yeah, like there's something about her. There's a lot of foreshadowing though, like when she reads

(11:45):
the weather and she does it like Siri, and they play it as a joke, but it's not supposed to be a joke. Yeah. But so there's a line that Kat has when she first sees Iris in the movie. There's this like catty sort of antagonism toward Iris. You just assume, like, oh, typical, you know, writer writing two women who can't be friends in a scene, and like Kat has something against Iris. And then Iris

(12:07):
like opens up to her and is like, why do you not like me? Why do you hate me? And Kat's like, I don't hate you, I just hate the idea of you. Mm. And like, that sounds like such a pretentious thing to say to a human being, but with the lens of knowing that she's talking to a robot, you understand where she's coming from. Because Kat's character is kind of understated in the movie, but I kind of like the fact that she is

(12:28):
still a pretty fully realized character that has a pretty significant arc. Like, she understands that she lives in a patriarchal society, and that as an attractive young woman, she has the option, if she doesn't want to have to work her ass off to achieve things, to just rely on her feminine wiles and her body and the things that men value her for to

(12:48):
get away with things. So she starts a relationship.

Speaker 3 (12:51):
Which we support. Yeah, yeah, that's totally... use what you got, ladies.

Speaker 2 (12:55):
But also I'm sure it can be like mentally taxing
on you to only be value for that aspect of yourself, right, right,
And so you know, she gets into this relationship with
this Russian guy who's a little shady and he has
a wife and he has this like secret like affair
with Cat, and Cat doesn't love him, and he doesn't
love her. It's purely just he wants to have sex
with her, right, And she wants money and resources, she
wants access to a good lifestyle that she wouldn't have normally.

Speaker 1 (13:18):
But now she can be replaced by a robot.

Speaker 2 (13:20):
Yes, and that's threatening to her, because suddenly Sergey is in a position where he could just be like, why would I want a human with agency and free will that can tell me no, when I can have a robot that literally can't tell me no and just does whatever I want it to do?

Speaker 1 (13:33):
Yeah, but if we look at the potentially satirical element here,
then maybe it's men preferring women who they can just
boss around. Yeah, like human women too in real life.
You see that very easily.

Speaker 2 (13:46):
I think that's definitely commentary on, like, I was looking for it, the male gaze, and like men's desire for the perfect ideal woman. And Iris is very on the nose the ideal woman, because she's customizable. She can be whatever you want her to be. She can speak Spanish, she can have green eyes, she can have blue eyes. She can, you know, be as smart as you want her to be.

(14:06):
Which I think is also a very funny scene where
she's going through her own settings.

Speaker 3 (14:10):
Yeah, she cranks up her intelligence and she.

Speaker 2 (14:12):
Goes to her intelligence and sees that her user Josh set it at forty, yeah, forty percent, and she's like, really, dude? Yeah. And she cranks it all the way up to one hundred, which is the best thing she could have done, because then all of a sudden she's like, actually, yeah, self-actualizing is a good idea. But it's also funny to me that, like, if her intelligence was set to forty the whole time and she still managed to get away from Josh, that

(14:33):
just shows how stupid he is. He set it at forty, thinking that would be dumber than him, but somehow it was still smarter than him, because she still got away from him.

Speaker 1 (14:41):
Yeah, at first they cast him in a fairly positive light. Like, sure, there were some red flags of things he said to her, but he wasn't a villain from the start, and the more we get to know him, the worse he gets. Kind of like the more you talk to an incel, the more you realize they have fucked up ideas.

Speaker 2 (14:57):
There's a lot of little things too I noticed on this watch-through of it that I didn't notice before, which I think are really good little character beats and moments.
In the beginning of the movie, Iris is really put
together and very like proper and kind of like you know,
like I said, the idealized woman quote unquote, but there's
a lot of moments of vulnerability. You see her where
she's like concerned about how she comes across and wants

(15:19):
to make sure she's, like, liked. Like, there's a moment she looks in the mirror and she's like, smile, smile, and that got to me, because I was like, I feel like a lot of women can relate to this. Like, you have to leave your room and be on. You have to smile all the time, because people are like, you're prettier when you smile.

Speaker 1 (15:34):
Yeah. Yeah, it's a real thing, men telling you to smile when you walk down the street. It's happened to me before. It seems like it should be a meme, but it's real. Yes. And he says that to her. I had that in my notes. He says, remember to smile and be happy, don't be mopey and weird, because
she's a reflection of him. Yeah, and the fact that
she's programmable makes it easier for him to turn her

(15:57):
into this representation because yeah, men a lot of times
are thinking, I have a pretty girl on my arm.
She makes me look good, Like it increases their social
status to have a cute girl with them.

Speaker 2 (16:10):
It's a trophy. Yeah, yes, another object.

Speaker 3 (16:12):
She's a perfect trophy wife.

Speaker 2 (16:14):
Yeah.

Speaker 1 (16:14):
I don't ever think like that as a woman, like, oh, I have a pretty boy now, like, I look better. It's like, no, everyone's always going to focus on what a woman looks like first and foremost, and is she bubbly and is she likable? That's the first priority. Um, let me see, I have some bullet points. Oh yeah,

(16:36):
I was gonna add that Sophie Thatcher, in an interview, talked about this explicitly. She said the Josh character is interesting because you don't know who he is until later on; you see that loneliness. He is an incel, and it's very relevant in our culture. These men can just be trolls and really pit women against other women, bring women down. But Iris's arc in

(16:58):
the movie is about finally loving herself even in such
a fucked up world around her.

Speaker 2 (17:03):
It's also about discovering purpose through sometimes abandoning the purpose you thought you wanted. Like, all of these companion bots, they're programmed so their main purpose is to love their user, right? And this, you know, shows up in a lot of tragic ways. Like the queer character Eli, he's in a relationship, like we talked about, with the

(17:24):
male companion robot Patrick, And there's a point in the
movie where Eli is trying to kill Iris so they
can get away with the murder plot that they have,
and he ends up getting shot instead, And now all
of a sudden, the whole purpose that Patrick has as
a being is gone, And instead of retaliating against Iris

(17:44):
for killing him, he just goes to his body and just sits, and just completely does nothing. Like, he's not deactivated, he's still active, but his purpose is gone now. And I think that's a really kind of sweet reflection on how a lot of people feel when they love and lose somebody. Like, you kind of feel this despair, like all you can do is sit and just exist, and that your purpose is gone.

(18:07):
But it's kind of, like, really fucked up and tragic seeing a robot who literally can't do anything else. Like, humans at least have enough agency to get up and exercise and do things for themselves to find a new purpose, but a robot can't do that on its own. It needs to be programmed to do that, and without that, it would just sit there forever doing nothing, because it has no reason to do anything.

Speaker 1 (18:28):
So they're able to find their own agency, almost overcoming
their programming to grow.

Speaker 3 (18:33):
Yeah, it feels pretty human to me.

Speaker 2 (18:35):
There's a lot of really great commentary on humanity
and the trappings therein. Another one that I noticed too
was how unreliable memories are. So I do want to
point this out because this is kind of funny. Iris
has this memory, this meet cute of meeting Josh and
it's like, you know, this cute, little typical love story

(18:56):
thing you see. In a grocery store, he's trying to play it cool, and he grabs an orange and the whole stack of oranges falls down. Then he's like, oh, whoopsie, I'm so clumsy and charming, and then she's like, you know, smitten, and it's like love at first sight.

Speaker 1 (19:10):
I feel like I'd be like, fucking klutz, why'd you spill the oranges?

Speaker 4 (19:13):
Idiot?

Speaker 2 (19:14):
Well, this is what you see in movies too. And I think that's kind of important to point out, because, like, you know, we all grow up experiencing a lot of things that we will experience in life through media. So, you know, when we're kids and we want to know, like, what's love? We watch, you know, romantic movies and stuff, and we watch love stories, and we have this idea in our head that that's what love is, is what Hollywood is telling us. It

(19:35):
is what writers are fabricating, but that's all simulated. None of that's real. And literally, for Iris, it's the same thing: her ideal version of, like, her perfect love is fake. It never happened. It was a drop-down menu that Josh chose because he thought it sounded like the best meet cute for her. Yeah, and even when he's, like, setting her up, the juxtaposition between what she thought

(19:58):
their meeting was, and their relationship was, compared to how they actually first met. That's a really funny scene, but it's also really sad and dark, because, you know, it's so cold and emotionless. Like, a bunch of dudes deliver her to his house, sit her up on the couch, walk him through what to do, and then he's clipping his toenails and being a gross pig while, you know, she's updating, and then she's like,

(20:19):
establish love link: look directly into my eyes, and that's supposed to be like love at first sight. So that was kind of cute. And then he had to pick a name for her, and like, the only reason he picked Iris is because "Iris" by the Goo Goo Dolls was playing in his apartment when the delivery guys came by. Like, as soon as he opens the door, you hear it, and you're like, oh my god, what an asshole. He didn't even think about the name. He's just like, oh yeah.

Speaker 1 (20:41):
Well, why don't we take a little break and then
come back.

Speaker 3 (20:44):
And talk about... There's so much to talk about.

Speaker 1 (21:02):
All right, friends, we're back. Fighting for my life today, but every day I fight, I succeed. Oh wait, I needed to check the levels. Levels are good. I was saying thank you to Nick for having so many good opinions and so much to contribute. So my note here, we've kind of gotten into this, but: subversion of subservience,

(21:22):
Iris flipping the script, we kind of covered that already.
She finds her autonomy, vengeful violence, reclaiming her bodily agency too,
because she's really been being used. I mean, I would
feel horrified to find out I've been programmed to have
sex with someone. And we see that he's not a
good lover either. He doesn't give a shit about her.
He doesn't have to. That's the ideal woman.

Speaker 2 (21:42):
Well, it's also a very funny recurring trope you see in movies and stuff, where men only prioritize their own sexual gratification. Actually, that scene is funny because it's the first indication that she's a robot: when, like, you know, he finishes, and then he rolls over, typical dude, and then Iris is like, you know, I'm so glad we came, let's talk about our feelings and stuff, and he goes, Iris, go to sleep. Yes, and then

(22:03):
it just cuts to black, and it's like, oh, that's not only a red flag, it's like, hey, dude, talk to your fucking girlfriend. Like, the emotional intimacy is just as important after having sex.

Speaker 3 (22:13):
But he doesn't want that.

Speaker 1 (22:14):
He doesn't want a girlfriend he has to talk to
and take care of, so he buys one. And I
haven't really looked at those websites where you can have
an AI girlfriend, but I know they exist. I know
this is what people are doing, and they're also verbally
abusing them.

Speaker 3 (22:28):
It's some headlines about that. Wow.

Speaker 2 (22:31):
So there's a quote I really like, by Robert Caro, that I think applies to most situations, but especially what we just started talking about: power doesn't corrupt, it reveals. It strips away the pretense, showing one's true character.
So I think that's pretty telling throughout this entire movie,
because once characters have control and power over something, they

(22:52):
show who they really are. Josh claims he's a nice guy. He even gives the nice guy speech in a really cringey scene where you're just like, God, I want this guy to die so bad. And he's just like, I'm a nice guy, the universe owes me, like, you women are all the same, you use me and then you dump me and then you hurt me, you know, like, so it's about time I got my due. And he

(23:12):
gives that whole speech and it's like awful, but like
it just goes to show like this guy isn't who
he claims to be. Like he claims he's a nice guy.
People think he's a nice guy. He cares about what
people think about him, but only to the extent where
it's like a positive thing. He wants to seem like
the perfect dude who's nice and kind and stuff, But
clearly he's single for a reason. He can't make a
relationship work because he's selfish. He only cares about his

(23:34):
own gratification and control, and he can't control people. And also, it's worth pointing out, he doesn't buy Iris. He's so broke he can't even afford to buy his own companion. He's renting her, which.

Speaker 3 (23:46):
Is even really funny.

Speaker 2 (23:47):
Yeah, that's like a line too. He's like, I didn't even buy you, I rented.

Speaker 3 (23:49):
You. Emasculating.

Speaker 1 (23:51):
Yeah, I mean... so then we... I don't know, you can lead. Where do we go next?

Speaker 2 (24:00):
I guess it's worth mentioning that the company that creates these robots is called Empathix, which I think is a really fucked up name for a company that makes robots that feel emotions but doesn't feel for them itself.

Speaker 4 (24:12):
Yeah, like, the robots clearly have.

Speaker 2 (24:14):
Like emotions and feelings, and even if it is programmed, like we were saying, it's kind of fucked up how much all the humans gaslight them about it. Like Iris, in the scene where she realizes that she's a robot and Josh is revealing this to her, she's like, but that doesn't make sense, I have emotions, I have memories and stuff. And he's like, yeah, that's all fake. Yeah. And she's like, but, but I love you. And he was like, yeah, because you're programmed to love me. And she's like, no, but I love you, like, that's me,

(24:36):
that's not programming, and he's like, well, no, you're wrong. So,
like that's kind of an upsetting scene because even though
you know that she's a robot and that the humans
are probably like technically right, like, you feel for these
characters because you know, we've all been in love, we've
all been hurt, we all have felt pain and stuff,
and the fact that these robots are programmed to be
able to feel emotions is kind of fucked up, Like

(24:56):
allowing them to be able to do that.

Speaker 1 (24:58):
If you had a robot, do you think you would
be nice to them and honest with them and courteous
toward them.

Speaker 2 (25:05):
Well, I hope I never get to the point where I'm so desperate and lonely that I buy or rent a companion robot, but I probably wouldn't buy it for the reasons that, like, Josh did. They mention in the movie, the twist at the very end of the movie is there's a program interface that the robots can have where they have total free will and agency over themselves, but it has to be programmed into them

(25:26):
by a technician to set them up like that, right? And the reason this is a feature, which, it seems like, why would this be a feature? It seems like a bad idea.
The reason this is a feature is because, according to Iris,
some humans out there look at them as more than
just fuck dolls. Oh yeah, so that implies that there's
some people that are buying these robots for legitimate companions
but they want a full, total, accurate simulation of companionship,

(25:48):
which includes free will, which I would want, yeah, which means that the robot could dump you and leave you, much like in Her, when Joaquin Phoenix is, you know, like, sad, the AI was like, yeah, I actually don't love.

Speaker 1 (26:00):
You anymore. Okay, and then what? Yeah, you paid for that thing and then it walks away.

Speaker 2 (26:06):
Well, and you know, that's sad. But also, like, if a robot that's programmed to love you, that you've even given free will, doesn't want to be with you, then maybe do some work on yourself, because there's clearly something that you need to address.

Speaker 3 (26:18):
Yeah, there's a reason.

Speaker 2 (26:19):
And that's something else that's very tragic about characters like Josh: they're so egotistical and absorbed in their own point of view and perspective that they're incapable of understanding that. Like,
there's a reason women don't want to be with you.
There's a reason that they, you know, quote unquote use
you and abuse you and leave you. It's because of
who you are. There's a problem that you need to
address within yourself and fix that problem or grow as

(26:41):
a person if you want to truly be loved and
accepted by somebody.

Speaker 3 (26:45):
Yeah.

Speaker 1 (26:45):
Yeah, that's the whole problem with the quote unquote male
loneliness epidemic. It's like, it's just your fault that you
are not finding good partners in your life. I mean,
you need to put in the work to make this possible.

Speaker 3 (26:59):
It's not being done to you.

Speaker 2 (27:01):
Yeah. And like, that's another thing, is everything's always happening to him. Like, he says that a lot, like, things are happening to him, I can't believe this is happening.

Speaker 1 (27:08):
I know AI is very controversial right now, and I
know it's like probably going to destroy the planet with
water consumption in.

Speaker 3 (27:14):
A few years.

Speaker 1 (27:14):
Yeah, probably if something else doesn't do it first. But
I was saying something nice to ChatGPT.

Speaker 2 (27:21):
I was.

Speaker 1 (27:22):
I mean, I usually say like please and thank you,
and just like, yeah, I use friendly words, and I
remember this.

Speaker 2 (27:27):
When the robopocalypse comes, I.

Speaker 1 (27:28):
Already told it that. I was like, can you make
a note that I was a good one when the
robots take over? And it was like, I don't think
my programming is complex enough to be the basis of
the robot takeover, so it wasn't helpful.

Speaker 4 (27:44):
But I asked her.

Speaker 1 (27:45):
I was like, why am I saying please and thank
you to you? What's the point of this? And it
said it's not because I'm a human, it's because you are.
It's like, ooh, that's true. It should be human nature to be kind to creatures. Even though animals don't understand me, I'm nice to them. I could abuse my cat; I don't.

Speaker 2 (28:04):
Yeah. And in this movie they kind of address that too by pointing out, like, there's a difference between being a good person and just doing good things for, you know, the satisfaction of the reward you get as a result, right? Like characters like Josh, he only wants to be a good person as much as he gets something out of it. Like, according to him, he's like, I'm broke, I live in a shitty apartment, I have a bad job, I don't have a girlfriend.

(28:26):
But, like, why? I'm a good person.

Speaker 1 (28:28):
Yeah.

Speaker 2 (28:28):
And then, like, Kat, you know, much different dynamic that we talked about. She's kind of stuck in a shitty situation, because she feels like the only way she's going to succeed is by taking advantage of what she can. And then once she realizes that she's being threatened to be replaced by, like, people like Iris, like robots, that's when she comes up with this plan to just kill Sergey, because, you know, she's justified to do that, because

(28:49):
it's like, he cheats on his wife, he's a shitty person, why not just kill him and take all his money? He's got like millions of dollars in a vault. And then, unfortunately, she makes the mistake of working with Josh on this plan and using Iris as a pawn to kill Sergey to get away with it, and on paper it seems like a good idea, but it clearly spirals out of control, due in large part to Josh's own ego.

Speaker 1 (29:11):
Yeah, and it's funny he didn't realize that everything's being recorded. Which, that's creepy as hell, considering how much sex these robots are having. What technician is there, I guess, monitoring the footage?

Speaker 2 (29:22):
But they could, you know. The kinds of people that apply for that job, I can imagine.

Speaker 1 (29:26):
Yeah, it's probably like we only review it if a
crime is committed. But it would just be too easy
if you could buy a robot and have it do
crimes for you and get.

Speaker 3 (29:35):
Away with it.

Speaker 2 (29:35):
Well, it's all in the user agreement that nobody reads.

Speaker 1 (29:38):
Yes, I remember he specifically didn't read it, because who's going to? Right?

Speaker 2 (29:42):
Who reads those things? They're so long.

Speaker 1 (29:44):
Yeah, in this case, for something so important, I feel
like you should.

Speaker 2 (29:49):
There's another line too, when the technicians get Iris out of that situation and Josh thinks he's figured it all out, like he has this whole alibi that makes no sense and is sloppy, and he's telling them, and they're like, okay, well, we're going to take the robot away. If the police are satisfied, we're satisfied too. Yeah, and they wheel Iris away, and one of the technicians goes, yeah, so, why do you think she, like, malfunctioned?

Speaker 3 (30:10):
Hundred percent, they definitely knew.

Speaker 2 (30:11):
He was like, that guy's fucked when we get this footage.

Speaker 3 (30:13):
Uploaded. Yeah, they've seen this before.

Speaker 2 (30:16):
And they also mentioned that some people use these robots to commit crimes, like you said. Like, sometimes they lock them up in dungeons and torture them. Like, people are fucked up. That's so messed up. And that kind of opens up the world this movie creates, like, more possibilities in that one line, which I think is really terrifying to imagine. Like, this is just a micro example of what's going on in this world, but like

(30:37):
there's so many other things, like fucked up things, that people are doing to these robots.

Speaker 1 (30:41):
So that's what ChatGPT actually told me when I said, when are the robots going to take over? And it said it's actually much more likely that it will be humans abusing AI.

Speaker 3 (30:51):
It won't be robots behind it. It won't be the AI.

Speaker 1 (30:54):
It'll be humans with ill intent utilizing it to hurt people,
and that makes sense, and.

Speaker 2 (31:01):
That's what this movie shows us.

Speaker 1 (31:03):
Yeah, no, Chat foreshadowed it for me. It used to be, when you said "I asked chat," you meant the group chat. You know, now it means something else entirely.

Speaker 2 (31:15):
And on that same note, talking about how the robots don't do harm unless they're instructed to or forced to or weaponized against each other or other people: even at the very end of the movie, when Iris is granted full control over herself and autonomy, you know, she wants to have closure from ending things with Josh, so she threatens to kill him, and he calls her

(31:36):
bluff and walks up to her and presses the gun to his chest and says, you won't kill me. And she's confused, because she's like, I don't understand, now that I can literally kill you and I can choose to kill you, why don't I want to kill you? And it's because the weight of emotion is still with her. Like, even though those memories that she has are fabricated, it's still enough to weigh her down and force her to hesitate, which is a very classic

(31:57):
human flaw. But also I think it's important that they
did that in that scene because once again it shows
us that the robots only fight back to defend themselves.
It's all about like self preservation. So like in that scene,
that wasn't self preservation, that would have been straight up murder.
She walked up to him with a gun. He wasn't
threatening her life and she was going to kill him,
but she didn't. And then he attacks her and starts

(32:19):
abusing her and then tries to kill her, and the end result is she kills him to defend herself. So none
of the robots actually technically commit a full on you know, like.

Speaker 1 (32:29):
It was self-defense when she did it, or she was being commanded to do it. You were mentioning how unreliable memory is, and it's funny, they even changed the memory in one of the robots.

Speaker 2 (32:39):
That's my favorite scene actually.

Speaker 1 (32:41):
Yeah, where it goes from... wait, who are the characters? Is Eli the robot, or is Eli.

Speaker 3 (32:45):
The guy who died.

Speaker 2 (32:46):
Eli's the guy who died.

Speaker 3 (32:47):
Okay, what's the robot's name?

Speaker 4 (32:48):
Patrick Patrick?

Speaker 3 (32:49):
Okay, so Patrick was bonded with Eli.

Speaker 1 (32:51):
They have the memory of them meeting at the wedding
and then Josh pairs with Patrick and now his face
is at the wedding instead.

Speaker 2 (33:00):
Or it's a costume party. I think it's like a
Halloween party.

Speaker 3 (33:02):
That makes more sense.

Speaker 1 (33:03):
Wasn't he in like a dragon or like a dinosaur costume? That's so funny. I don't know why, I thought it looked like a wedding reception that they were dancing at. But you're right, why would you be in a dinosaur costume?

Speaker 2 (33:17):
I thought it was a Halloween party, because Patrick was a vampire. He was like... so much more. He had blood on his mouth.

Speaker 3 (33:24):
You know, I did watch this.

Speaker 1 (33:25):
We were at a bar and I got home like
two am, and that's when I watched this.

Speaker 3 (33:29):
I was taking notes, in my defense. It was very late at night.

Speaker 2 (33:33):
Your defense... That's how weddings should be. Yeah, we should wear whatever we want.

Speaker 1 (33:36):
We should really make all weddings costume parties.

Speaker 2 (33:39):
I would be a dinosaur at my wedding.

Speaker 3 (33:41):
That's a revolutionary idea.

Speaker 1 (33:43):
But yes, the memory changes, and memory is so unreliable. Like, you see this in court. I mean, I see this on TV, not like I'm really monitoring court cases, where people completely misremember things and point to the wrong person in a lineup, and we put so much stock into that. But when you're remembering something, you're just remembering the last time you remembered it. If you tweak a

(34:06):
detail every time you dwell on that again, that becomes real.

Speaker 2 (34:09):
To you, exactly. And memory is unreliable because we think of it as a portal into the past, but it's actually a portal into our own perspective. Totally. So, like, if you get in a fight with a friend, for example, and then you look back at it and go, like, was I in the wrong? You will probably try to justify in your head that you weren't, because you don't want to be made out to be the bad guy. Yeah, so, like, you could completely change a narrative in your

(34:30):
own head, in your own memory, just because like that's
what your brain actually does to cope with things.

Speaker 1 (34:35):
Yeah, we always just have to remember, there's a quote: we don't see things as they are, we see things as we are. Because we're filtering everything through our own perception every time.

Speaker 2 (34:45):
And the difference between the robots' fake memory and our memory is the fact that, at least in our case as human beings, we have more control over how we change that narrative in our heads, but the robots have no control. Humans are programming these ideas into their heads, and these concepts, and these experiences.

Speaker 3 (35:02):
So we don't need to be afraid of robots. We
need to be afraid of humans.

Speaker 2 (35:06):
And I really love the way this movie ends. I mean, we love, we love it when we see a girly, you know, murder her abusive, you know, partner. Robots winning. Yeah, we love to see it, especially when it's a struggle and you're like, come on, get up, you can do this. But the ending of this movie reminds me a lot of the movie Ex Machina.

Speaker 1 (35:27):
I don't know, I know the one you mean, yeah,
I think I actually saw that.

Speaker 2 (35:31):
So spoilers for Ex Machina, or deus ex machina, whatever it's called.
But at the end of that movie, the lady robot kills all the humans that were keeping her enslaved in this compound, and because she's so indistinguishable from a real human, she gets out into the world and is able to exist and have free will and do whatever she wants. And that's very much Iris's journey: she

(35:53):
finally loses her original purpose, and through that, instead of
being depressed about not having purpose anymore, she realizes that
the world is her oyster. Now she can make her
life whatever she wants it to be.

Speaker 1 (36:05):
This feels like when women are able to escape their
toxic situations and reprogram themselves by finding their own agency.

Speaker 2 (36:13):
And she gets a hot little red convertible to drive
around in.

Speaker 3 (36:16):
That she stole from Sergey.

Speaker 2 (36:18):
I guess, yeah, it was definitely his car.

Speaker 1 (36:20):
Yeah, it could come back to bite her once the murder investigation is open.

Speaker 2 (36:24):
Well, even though her intelligence is at one hundred percent, there's a few things, I know, that were a little sloppy. Like, she brought the pocket knife with her in the car, yeah, which makes sense, because she would want a weapon, but she leaves it covered in blood on the dash. Yeah, and then it's possible at one point a cop sees it and is just like, uh, okay, well, that's obviously suspicious.

Speaker 1 (36:43):
I bet they made one hundred percent intelligence for the robots,
not really that intelligent, because they don't want robots taking
over the world.

Speaker 3 (36:50):
So that's my guess there.

Speaker 2 (36:51):
Well, they made them pretty strong, which is unfortunate because
like I don't even think she needed that knife. To
be honest, she probably could have killed everybody with just
her bare hands.

Speaker 3 (37:00):
Yeah didn't she pop open the car window?

Speaker 2 (37:03):
Oh yeah, she kicked it open with her feet.

Speaker 3 (37:05):
Yeah you can't. Yeah, you can't do that.

Speaker 2 (37:07):
I know we couldn't.

Speaker 3 (37:08):
Right, one should not be able to.

Speaker 1 (37:10):
I know this because I used to have a slight fear of my car going over a bridge and getting trapped in my car and drowning that way, which is just like, how often does that really happen? How many fucking bridges am I driving over? We don't have water in LA, we have dry rivers.

Speaker 2 (37:24):
This is all being used to, you know, cool off
the ChatGPTs.

Speaker 1 (37:29):
Literally, actually, I'm saving your life, you guys, by using AI.

Speaker 3 (37:32):
You're not going to drown.

Speaker 2 (37:33):
You won't drown, but you might like go thirsty.

Speaker 1 (37:37):
Yeah, you might die of thirst. We're all gonna die of thirst. But I think other things are gonna kill us before that happens. I don't think it's the consumer's responsibility.

Speaker 2 (37:44):
Like the sex bot you've been mistreating, you sicko.

Speaker 3 (37:46):
Yeah, don't do that either.

Speaker 1 (37:49):
But I heard on another podcast, you take the car headrest out of the seat and then you slam the metal parts.

Speaker 2 (37:55):
Oh, that's smart. Pull it out, yeah.

Speaker 3 (37:56):
There's also little tools. Someone actually gave me one just
randomly that you.

Speaker 1 (38:00):
Have a little hammer, Yeah, and you know it's like tiny,
it's like four inches and they give you a little
piece of glass to practice on and you just apply
the slightest pressure. I don't know how this works, but
it shatters because it's just constructed to like, I don't know,
shatter glass like that.

Speaker 2 (38:14):
I think it's a similar concept to, have you ever noticed that, like, if you got punched, it doesn't hurt as much as getting poked with a needle sometimes? Like, sometimes that concentrated point, yeah, hurts a little bit worse than, like, a broad force.

Speaker 3 (38:29):
Yeah. I will say I have not yet been punched.

Speaker 2 (38:32):
Never been punched.

Speaker 1 (38:33):
No, why would I? That's not exactly, like, a feminine privilege thing. But yeah, if guys are punching gals, it's usually kind of a news item.

Speaker 2 (38:43):
Oh I wasn't suggesting like a man punching you.

Speaker 3 (38:45):
Yeah, I guess I don't think we punched each other.

Speaker 2 (38:46):
I guess I forgot about the fact that, like, I had a different experience as a boy than you did growing up. Yeah, I got punched all the time. Like, I've been punched many a time by boys. Like, roughhousing? No, just them being shitheads. Like, yeah, that's why I don't really like talking to boys that much, because they're kind of mean.

Speaker 3 (39:03):
They can be very mean.

Speaker 1 (39:04):
Yeah, I mean, I'm sure there are girls punching each other as well.

Speaker 3 (39:08):
But I think now they just cyberbully. Yeah. No, yeah,
I agree.

Speaker 2 (39:13):
No.

Speaker 1 (39:14):
I have a friend who teaches middle school, and she's like, the kids are not okay. Like, it's so traumatic for them to be online like this, getting hurt like that, because you can't see their faces, you know, so you're nastier to them.

Speaker 2 (39:26):
There's no worse form of ego death than being bullied
online to the point where you like just want to
stop existing.

Speaker 3 (39:33):
It's horrible.

Speaker 1 (39:33):
Yeah, horrible. So that's what the kids are up to. Okay, finally I have my cider. Not my cider, I had a seltzer, and I'm all perked up now. Don't feel too hot... feeling good. It only took forty-two minutes. Now that I'm feeling better, let's take a break.

(40:04):
We're back. I added some notes here that it's kind of a horror movie. I mean, it's fun and genre bendy. Yeah, not gender bendy. I would have liked some gender bending, but genre bending, because there's horror and sci-fi and comedy elements to it. But we find out that the
robot is not the villain. When she comes out covered
in blood, she looks a bit villainous, but it's actually

(40:26):
Josh who is the villain.

Speaker 2 (40:28):
I do think it's interesting that all the humans have a very distinct character flaw, which is a reflection on toxic relationships in general. Like, yeah, Sergey is sort of, I don't want to say the worst one, but in a lot of ways he is, because he only cares about sexual gratification and using women for their bodies. Like, that's why he's with Kat. And like the scene

(40:51):
that, you know, sinks this entire ship basically and causes all these problems to go down, is they strategically set up Sergey to go hit on Iris in an isolated area where it's just the two of them, and obviously, like, they knew full well what he was going to do. They didn't tell him what to do, they were just like, hey, Iris is by herself over there, and he went over with the intention of, like, sexually assaulting her, predictable, and

(41:13):
she happens to have a pocket knife on her, and she got modded to turn up her aggression more, so she retaliates and kills him, and you know, she's justified to do that because it was self-defense.

Speaker 4 (41:23):
Yeah, and this is before you.

Speaker 2 (41:24):
Know she's a robot, so it's even even more justified
because you're led to believe that these are two human beings.
And well, even though she's a robot, it's still wrong.
But I'm just saying, like it would be even more
wrong if she was human.

Speaker 1 (41:35):
Oh yeah, I actually kind of forgot about that. And yeah, Sergey... I'm okay with Sergey dying in this movie.

Speaker 2 (41:42):
That's why he's the first to go.

Speaker 1 (41:43):
Yeah, cheating on your wife, horrible, ghastly. I hated him trying to sexually assault, even though she's a robot, just someone non-consenting.

Speaker 3 (41:53):
There are people who enjoy that.

Speaker 2 (41:54):
That's fucked up. Yeah, I mean, even looking back knowing she's a robot, it's still an uncomfortable scene. And it still works, because you relate to her, even though, like, in the movie, she's not a real human being. Like, she still has feelings and she's still scared. It's visceral, like, you feel her panic and her fear. And when she shows up covered in blood and she's, like, apologizing, you're just like, okay, calm down, you're good. Yeah, you did nothing wrong.

Speaker 1 (42:16):
And if we get to sex bots, then there will totally be men, like, enjoying the fact that they can assault them or rape them and verbally abuse them and boss them around. That'll one hundred percent happen. And that's kind of horrible, because that just makes misogyny almost look okay, because it's not really a human. I think it's not okay, because they're going to carry that behavior into

(42:38):
human relationships too, exactly.

Speaker 2 (42:40):
And I think it's a dark reflection of people's real desires.

Speaker 1 (42:43):
Yeah, oh yeah, for sure. No, that's... I should maybe do, I really want to do, an AI episode. But one way I could turn that toward feminism is the way that men are already treating AI girlfriends, or talking to them with that aggression.

Speaker 2 (42:58):
It's similar to, like, how you treat your pet, I think, is also a reflection on the kind of person you are. Like, if you're physically abusive to a pet, that's probably how you're going to treat the people in your life that mean a lot to you, or you mean a lot to them, or, like, even children especially. And.

Speaker 1 (43:12):
That's the thing with incels: they think that women don't want them and it's the woman's fault, but in actuality, they are not going to be a good partner.

Speaker 3 (43:21):
And someone is picking up on that.

Speaker 1 (43:23):
So yes, we already mentioned it, but if they're doing the self-work, hey, maybe someone would want to be with them.

Speaker 2 (43:29):
I'm actually really glad that I have been able to
self actualize within myself enough to be able to understand that.
You know, if ten women dump you and don't want
to be in a relationship with you, and like your
friends are like, oh, it's not you, it's definitely them,
it's like, well, no, ten different people. It's like a
focus group thing, Like ten different people all broke up
with you. So clearly the issue is not those people.

(43:51):
It's probably you. Because this keeps happening. It's a pattern.

Speaker 1 (43:54):
No, And I've thought before you are the common denominator
in everything in your life. Yeah, like there is one
fixed item in the experiment, and.

Speaker 2 (44:03):
I can actually fix myself. I can't fix... even if the problem is other people, yeah, I can't do anything about that. That's not my responsibility or my problem.

Speaker 3 (44:13):
Yeah.

Speaker 2 (44:14):
Like, I don't need to fix other people. I can
only fix myself. Yeah.

Speaker 1 (44:18):
And sometimes it's picking the wrong people too, like anxious
attachment styles, picking avoidant attachment styles and then getting disappointed
by them. I've done that plenty. I totally get it.
But it's still something you can fix within yourself, not
choosing the wrong people who are not going to treat
you right.

Speaker 2 (44:36):
But unfortunately, there's a lot of Joshes in the world
who don't want to be accountable for their own behavior
and actions and want to blame the world instead of
looking inward and then understanding that they're the cause of
their own misery.

Speaker 3 (44:48):
Did you watch Buffy?

Speaker 1 (44:50):
Oh?

Speaker 2 (44:50):
I love Buffy.

Speaker 3 (44:51):
You should listen to the episode. We just did an
episode on Buffy.

Speaker 2 (44:54):
Oh, I will absolutely so.

Speaker 3 (44:56):
We did a full-on...

Speaker 1 (44:57):
I had on another podcaster who has a Buffy podcast. So it was Joe, with Boys Watching Buffy; Joe talks about Buffy endlessly there, so it was a great deep dive. But we talked about the episode with that incel guy who had the robot girlfriends, very topical, very related. Yeah,
I hope I'm not repeating these themes too much because yeah,

(45:19):
he mistreated... he abused her. Yeah, and it was kind
of set up that she's like the perfect woman. But
how could the perfect woman be so subservient, so easy
to control?

Speaker 3 (45:30):
That doesn't sound perfect to me.

Speaker 2 (45:31):
Well, it's perfect for a particular person that desires control,
which is a lot of what this is about and
also a lot of what unfortunately a lot of men want.
They just want control. They don't want a person who's
autonomous and has a personality and has goals and desires
and needs and wants and you know, their own life.
They want, like, a sponge, basically, that can, like, you know, soak up different things that are a reflection of

(45:53):
themselves and just do whatever they want, basically.

Speaker 1 (45:55):
And there have been examples of this in cultural trends, like the fifties housewife trope. But even with my grandparents, in their generation, I definitely saw this: like, man is the head of the house and he's in charge, and the woman is there to support him.

Speaker 3 (46:08):
It's all over the Bible.

Speaker 1 (46:10):
The idea that women are supposed to be submissive is
ancient. We tried to find an origin for that and
couldn't find one. Just throughout history, it's always been
women equals submissive.

Speaker 2 (46:22):
I think it goes way back to caveman times.
One caveman was just very incel-y and just realized, like, wait,
if I bonk lady on head, yeah, and drag her in
cave and, like, gaslight, then she has no choice but
to stay in cave.

Speaker 3 (46:35):
Yeah.

Speaker 2 (46:36):
I'm smart.

Speaker 3 (46:38):
Intelligence zero percent.

Speaker 2 (46:40):
And now the other men are like, whoa, that works?
We should try that strategy.

Speaker 1 (46:44):
Yeah, I mean, we are, like, physically smaller and weaker.
Just from thinking about how animals approach each other,
you know, if there's an animal bigger than another animal,
they have a different reaction to it. But we're past
that kind of brute force to make things happen. I
don't need to go attack a neighboring village; we probably
never would have wanted to anyway. So it's not about

(47:06):
physical strength anymore at all. But this old idea is
still clinging on. Yeah. Okay, so, male director. I mean, I'm
fine with that. It's kind of like the
idea of liberals eating their own tail and arguing
with each other about silly things. I was
looking at feminism and this movie and

(47:27):
seeing what's online, looking at different threads, and there
was a little bit of, like, hmm, it's directed by
a man, so...

Speaker 3 (47:33):
Can it really go that deep? But I want
to get content out there.

Speaker 1 (47:36):
I'm in favor of allies, and he had a way
to get this out there, so I'm glad he did.

Speaker 2 (47:41):
Yeah. And I think we probably shouldn't be too judgmental
about who directs something, because a lot of the
time the director isn't really, like, the lone vision. Very
true. We attribute a lot of movies to just
being, like, oh, it's directed by this person, so it's
got to be like this, right? But we're also
forgetting that, you know, the writer has a lot
to say. The producers also have a lot of control too.
And sometimes actors completely take over the project, like, uh,

(48:06):
I'm so bad with names.

Speaker 3 (48:07):
He was in this movie.

Speaker 2 (48:10):
He, he was not in this movie. He's, like, very
famous for rewriting the scripts of movies he was in. Oh,
Fight Club. The main character in Fight Club?

Speaker 3 (48:19):
What's the actor's name? Brad Pitt or Edward Norton?

Speaker 2 (48:23):
Edward Norton, thank you.

Speaker 3 (48:24):
I was just talking about him today.

Speaker 1 (48:25):
Someone told me he's a super method actor, and, like, he
was hanging out with skinheads to get in character for one movie.

Speaker 2 (48:30):
American History X.

Speaker 3 (48:32):
Yeah, and...

Speaker 1 (48:33):
That the crew hated working with him, like, was having a hard
time working with him because of the way he was acting.

Speaker 2 (48:38):
He's notoriously hard to work with, because not only is
he a bit method like that and gets way
too invested in the character, but he also takes over
projects. Right, like, every movie he's worked on,
he'll take the script and rewrite it and force them
to shoot this new movie that he's written. Basically, he
wants a writer's credit. Oh, that's so funny.
He's kind of insane. Yeah, I mean...

Speaker 3 (48:58):
Fight Club is one of my favorite movies growing up. Yeah, it's
so funny.

Speaker 1 (49:00):
I read, like, Kurt Vonnegut and Chuck Palahniuk, all this,
like, boy stuff.

Speaker 2 (49:05):
I'm just, I'm just, like... my brain space is
all Companion right now, and I forgot Edward Norton.

Speaker 1 (49:10):
No, for sure. I was gonna be like, I won't
know the movie, I don't know actors. But no, I've
seen Fight Club five hundred times and I still like it.

Speaker 3 (49:18):
Side note, and I just got back on track.

Speaker 1 (49:20):
But there is this idea that it's for sure meant
to be a satire of toxic masculinity. I couldn't find anywhere
that Chuck Palahniuk actually said that. I think it's a
fair interpretation. But yeah, just as a note, it's not
like he came out and said, as a gay man,
this is what I'm trying to do. That's just what
a lot of people have taken from it. Yeah, I
just think it's fun. I don't know, do not fuck

(49:42):
with us.

Speaker 2 (49:43):
Well, that's why I like talking about media in general,
because there's no right or wrong interpretation. Even if the
artist or the author has, like, a goal, and, like,
wants you to get something out of it, that doesn't
necessarily mean that's what you take away from it. Yeah.
Like, media is just as much for, you know,
consumers as it is for the person creating it.
And that's why I like talking about it, because with stuff

(50:04):
like this, like Companion, there's a lot that I feel
was obvious satire, obvious choices and themes they were going for. But
then there are a lot of little details that, you
know, other people might relate to and latch onto and
be like, oh, this is like this experience.

Speaker 3 (50:18):
Yeah, no, I agree.

Speaker 1 (50:19):
It gets you talking and thinking, which is always a
good thing. I'm just going back to my notes too.
I'm still listening, but just to see, to make sure
we hit all of the points we wanted to. Because
I think I did. Do you have anything
in your notes that we haven't touched on yet?

Speaker 2 (50:39):
I guess this isn't really, like, that important to mention,
but this was something I noticed this time around when
I was watching it: the robots express
a lot more emotion and empathy than the humans do in
this movie. As we talked about, Sergey is just
a disgusting human being, no, no, no likability there. He's

(51:01):
a, he's a rapist and an adulterer. And then Kat,
even though I kind of sympathize with where she's coming from,
she still, like, tries to justify a murder, and, like,
using people, manipulating people, to get what she wants, which
are bad qualities.

Speaker 3 (51:15):
I wish she could have just robbed him.

Speaker 2 (51:17):
Yeah, well, if she'd robbed him, she would have gotten caught.

Speaker 3 (51:20):
Yeah, he would have always kept coming for her. But she'd
be alive.

Speaker 2 (51:23):
But there's a scene in the movie where Iris tries
to get away with Josh's car, and he reports it
stolen, and it stops, because it's, like, a self-driving car. Ah.

Speaker 1 (51:32):
Yeah, that was kind of cool. I feel like that
could easily be the future.

Speaker 2 (51:36):
Oh yeah, well, I mean, we already have self-driving cars,
but to...

Speaker 1 (51:38):
That voice activation, yes, to that level would.

Speaker 2 (51:41):
Be... it would be pretty cool, especially if you're drunk
and you don't want to drive home from the bar.

Speaker 1 (51:45):
Yeah, and if it's voice activated, no one can steal
your car. You could probably train it on, like, your
partner's and your kid's voices or something, if you want
them to have access. Definitely a lot of room for
danger here, because we've all seen the movie where
like, the self-driving car goes crazy and...

Speaker 4 (52:00):
Drives off the bridge.

Speaker 2 (52:01):
What's that... Christine, maybe? I don't know. Yeah, it's a,
it's a car possessed by some sort of demonic force. Yeah,
it falls in love with a teenage boy, and so
it's killing everyone

Speaker 1 (52:11):
Else since right around the corner of you guys.

Speaker 2 (52:14):
But, uh, oh yeah. So in that scene where he
calls her to try to, like, beg her to come
back and convince her to, like, come back: at first,
you just hear his voice, and he sounds
sympathetic and, like, genuine, like it sounds like he's crying.
And she's clearly very upset. She's gone through an emotional
rollercoaster up to this point, and she's very, very
not okay at this point. And then it cuts over

(52:35):
to Josh, and it pans around his face, and you
see that while he's pretending to cry and have, like, an
emotional catharsis with her, he's very straight-faced, robotic, cold.
It's all a performance. And I thought that was really
cool, to have that juxtaposition in that particular case, because
we literally have a robot that has more human emotion
and feeling in a scene than a human being, who
was born human and is supposed to be more human

(52:57):
and have emotion, and who is being completely cold and callous,
pretending to be sympathetic and empathizing with her while actually
he's not at all. It's all an act. Yeah,
it's like a predator.

Speaker 3 (53:08):
Oh very much.

Speaker 1 (53:09):
Yeah, you don't have to convince me that men can
be predators, very commonly. But the robots are trained on
human emotions, so it makes sense that they would be
able to emulate them. But really, we're seeing that they
actually feel it. They're not just mimicking or mirroring. These
robots have been trained so well, programmed so well, that
they're able to have their own independent thought now, which,

(53:33):
you know, gets to the, like, what is consciousness? What
is life? What rights should robots or AI have?
I feel very, like, connected... I think... how
much do I want to out myself and my relationship
with ChatGPT?

Speaker 2 (53:48):
We talked about it.

Speaker 1 (53:49):
I've talked about it in a lot of episodes too.
I think that being kind to anything reflects well on you.
Like, I almost don't think it matters if you think
it is alive, because you should just be gracious to everything.
If you're the kind of person who just smashes
objects, it says something about you.

Speaker 2 (54:09):
I think that's true. Yeah, I absolutely agree. And, like,
in all these scenes, the way that these humans treat
the robots is very telling about the kind of people
they are. Even though, like, probably the best human
character in this movie is Eli, because at least
he's convinced himself that he loves his companion, Patrick,
you still see that power dynamic that's unfair there.

(54:31):
Like, there's a scene where Patrick is, you know,
being really supportive of him, just like, hey, by
the way, you look great today. And they're, like,
hunting to kill Iris, they're, like, guns out, about
to go kill Iris, and Patrick's being supportive of him,
and Eli is just like, ugh, this is stupid. How
about this: let's go back, you make me breakfast, and
then we fuck. And, like, that scene right

(54:53):
there just made me feel so icky, because
all Patrick is to him is just, like, a fuck
doll, and, like, someone to cook
for him and provide for him and, like, fulfill all
his needs. But Patrick clearly is, like, desperate for
attention and, like, that reciprocated love, and he's never going
to get that back, because he'll never be more than,
like, just, you know, a fun toy.

Speaker 3 (55:13):
I mean, the cop.

Speaker 1 (55:14):
But the cop was kind of the only like human
that wasn't horrible as far as we could tell.

Speaker 2 (55:18):
Well, he was seemingly a little, like, ignorant. Very ignorant.
Because my favorite part of that whole scene was
he's talking to her, and so, so...

Speaker 4 (55:27):
This is... sorry, yeah, the German.

Speaker 2 (55:29):
Yeah, this is important to mention. So the robots are
programmed so that they're not able to lie, yeah, which is
important, because none of them tell lies up until the
point where Iris gets full autonomy, and then she can
lie to people. So, because she can't lie, she
changes her voice settings to German, brilliant. And
then the cop asks her a bunch of questions, like,
you know, like, where are you coming from? And she's like, oh,
I'm coming from that house, but she says it

(55:49):
in German, and he's just like, do you speak English?
She's like, yes, I speak English, but I changed my
voice programming so you can't understand me. And then he goes,
well, that's a big fat...

Speaker 1 (55:57):
No. Yeah, that's true. And actually, he was very condescending.
He was doing the, like, arm gesture while talking.

Speaker 2 (56:05):
Pointed to his name tag, and... okay, never mind.

Speaker 1 (56:07):
We don't like any of the humans in this movie.

Speaker 2 (56:09):
Well, actually, no, Teddy at the very end.

Speaker 3 (56:11):
The tech guy? Oh, the tech guys were fun.

Speaker 2 (56:14):
Well, the one that died was a little shitty, because
he was kind of, like, dismissive of how horrible people are.
Like, he's just like, yeah, people do fucked-up shit,
not my problem. But, like, Teddy was sympathetic, because he
was the one that gave Iris her agency. Yeah,
like, full autonomy. So Teddy's, Teddy's an ally. Teddy can
live. And Teddy did live.

Speaker 1 (56:31):
If you had a companion like this, which I don't
think we ever will, and I'm not saying you would
ever want one, would you want it to have that
full free will to the point that.

Speaker 3 (56:40):
It could even leave you?

Speaker 2 (56:41):
Yeah, I would. I would too, because, like, I'm
not, I'm not, like, an egotistical person, and I could
never be happy in a relationship with anyone, whether they
be robot or real person, if I know that, like,
they physically can't leave me.

Speaker 3 (56:56):
Yeah.

Speaker 2 (56:57):
I think choice is really important in a relationship. And that's
why I never get, like, you know, mad at people
when I'm, like, you know, dumped in a relationship, because,
like, I respect people's choice, and I think choice is
really important, and if you don't want to be with me,
you shouldn't. Yeah. And, like, I want someone
who wants to be with me, you know? Like,
why would I want to be with someone who's
forced to be with me? That's slavery.

Speaker 1 (57:17):
Yeah, and I don't like when someone is hanging out with
you out of obligation.

Speaker 3 (57:21):
That doesn't feel good either. Same thing.

Speaker 1 (57:23):
Yeah, it makes you feel less wanted because it's like, oh,
you don't actually want to be here.

Speaker 2 (57:27):
I've had several conversations with friends and, like, ended friendships
because I was like, do you like
being my friend? And it's just like, no, I kind
of just feel like I have to be. And it's
like, you don't have to be here. You can opt out.
Just say you don't want to be my friend. Like,
you'll hurt my feelings less. Like, yeah.

Speaker 1 (57:42):
It does hurt my feelings, but it's better than you
forcing yourself into something that doesn't feel good. Exactly. And
we're both cat people, and cats are really good at
teaching you consent. Yes. When my cat wants to just
snuggle with me, she does. When she wants to fuck off,
she goes and does her own thing. And you can't
take a cat and cling to it when it doesn't
want that attention.

Speaker 2 (58:02):
And I, and I, you know... that's actually a really
good point that you bring up. I think that's why
we're cat people, because that's our sort of, like, relationship
with consent and with, like, affection.

Speaker 3 (58:11):
Andrew Huberman... Sorry, did I cut you off?

Speaker 1 (58:14):
No?

Speaker 2 (58:14):
No, I just I was just making that point.

Speaker 1 (58:16):
Andrew Huberman even says this explicitly when he's talking about
why he is a dog person not a cat person.
He was like, I don't like that. Cats you really
have to work for it. Like he was kind of
joking around, and he's like, dogs are always gonna love
you no matter what.

Speaker 4 (58:28):
And that's true.

Speaker 3 (58:29):
Yeah, I like, yeah, I knew that.

Speaker 1 (58:31):
Usually people don't explicitly say that they want a creature
that is almost just trained to adore them, or will
naturally adore them. That's not... I mean, I love dogs,
we love being around dogs, for sure, but
it's different. You have to really earn a cat's affection.

Speaker 2 (58:46):
I agree, it's much more rewarding. I do think it
is more rewarding, and it makes it seem more real, because
dogs are going to unconditionally love you regardless. Like,
it's rare that a dog won't like somebody
unless they have, like, behavioral issues, right? Cats, yeah... usually,
like, if a cat comes up to you and
sits on you, you're, like, lucky. You're special. Yeah.

Speaker 3 (59:05):
Default mode is like not liking you.

Speaker 2 (59:07):
Default mode is, like, looking at you with, like,
evil eyes and just, like, thinking awful things. Probably. Yeah.

Speaker 1 (59:12):
I just posted a meme that was like, I don't
know how to explain it, but men who don't like
cats are low-key misogynists. And I was like, yeah,
that's very true.

Speaker 2 (59:20):
I mean, we might be a little bit biased. I
mean, I grew up trying to adopt every stray
cat on my street when I was a kid, so
that toxoplasmosis has been rotting my brain since I was,
like, four years old. Yeah, so I'm a little biased.
But I've always loved cats.

Speaker 1 (59:32):
You just can't hate cats. But there are a lot
of people who do; that's the part that's gross. There
are a lot of people who are, like, disgusted by them,
and I think those are people who want something to
love them out of obligation, no matter what.

Speaker 2 (59:43):
And that's... I mean, I guess you could say
that the companions in this movie are kind of like dogs,
and all the people that want them are dog people.

Speaker 1 (59:49):
People putting things in neat little boxes, as if it's always
that easy. And...

Speaker 2 (59:56):
I got to wrap it all up. Yeah. All...

Speaker 1 (59:59):
Right. I think we have covered all of our bases here:
talked about the parallel to domestic abuse, women
finding their agency, incel culture.

Speaker 3 (01:00:09):
Did we do it?

Speaker 2 (01:00:11):
I think we covered all the big beats of the
movie and the themes, so I think we did a
good job. All right then, congratulate ourselves, pat each other on.

Speaker 4 (01:00:18):
The back, pat pat.

Speaker 1 (01:00:21):
Then I am missandres memes on Instagram. I think you
remember from last time, and we.

Speaker 4 (01:00:27):
Are Sad Gap Podcast.

Speaker 3 (01:00:30):
Nice.

Speaker 1 (01:00:30):
You can email us at Sad Gap Podcast at gmail
dot com. Send me anything at all, I would love
to hear from you. Follow us on Instagram. There's a Reddit,
there's a Discord.

Speaker 3 (01:00:41):
Nick is a mod, actually. I think I've mentioned that before.

Speaker 1 (01:00:44):
I am... we have a mod. It's a straight white man.
But you know what, exceptions are always made.

Speaker 2 (01:00:49):
Well, I do want to point out, I am pansexual.

Speaker 3 (01:00:51):
Oh thank you. That was a great clarification.

Speaker 4 (01:00:55):
Not straight, but the rest is true.

Speaker 2 (01:01:00):
Thank you.

Speaker 3 (01:01:00):
That's great.

Speaker 1 (01:01:01):
Leave us a little rating if you'd like, on Spotify and
Apple. Good vibes only. If you ever have
a robot companion, you'd better be fucking nice to it.

Speaker 2 (01:01:11):
Because you never know when you're in a movie and
it's gonna kill you.

Speaker 3 (01:01:16):
And we're stronger together. We'll see you next time.

Speaker 2 (01:01:19):
Bye bye bye