
May 24, 2024 78 mins

Spike Jonze’s 2013 movie Her is in the tech zeitgeist. Have tech leaders watched the whole thing? 

Brian Merchant’s piece on Her and OpenAI is a must read: https://www.bloodinthemachine.com/p/why-is-sam-altman-so-obsessed-with?utm_campaign=post&utm_medium=web

Bridget rocking Google Glass in 2013: https://www.instagram.com/p/Z1xL1OHw-Z/?igsh=MTR0OXU5aGV2Z3V3NA%3D%3D

Listen to our summary of the OpenAI Scarlett Johansson controversy here: https://podcasts.apple.com/us/podcast/scarlett-johanssons-open-ai-voice-fight-shows-the/id1520715907?i=1000656338161

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. Welcome to
There Are No Girls on the Internet, where we explore
the intersection of technology, social media, and identity. And one

(00:24):
thing to know about me is that I have always
wanted to have one of those movie recap podcasts where
you just watch a movie or a show and
get to yap about it. So Mike, you're gonna
kind of indulge me and let me do that, right?

Speaker 2 (00:38):
Yeah, that's what we're doing. Today is a new format. An exciting format.

Speaker 3 (00:44):
You know.

Speaker 2 (00:44):
One of the criticisms that we sometimes get in our reviews,
which we love to see, reviews in general, is that
some people feel I'm not always as prepared as you,
which would be hard because, you know, Bridget is
always very prepared. But also, generally the format of the
newscast is like Bridget explaining stuff to me. So I

(01:05):
feel that criticism is maybe a little unfair. But I
get it. This time, though, I have copious notes.

Speaker 3 (01:14):
I am ready to talk about this movie.

Speaker 1 (01:16):
So what is the movie that we're talking about today?

Speaker 2 (01:19):
We are talking about the movie Her, which is in
the zeitgeist because Sam Altman compared OpenAI's new assistant
Sky to the movie very explicitly. He had a one
word tweet that made that connection, and then there
was a bunch of controversy about it. Bridget, would you

(01:40):
like to talk about why the comparison has perhaps not
gone in the direction Altman was hoping.

Speaker 1 (01:49):
Yes, we did an episode kind of digging into this
this week, about why the movie Her and Scarlett Johansson
are in the news. But the short version is that
OpenAI, this company run by Sam Altman, recently introduced,
and then quickly pulled, this voice for ChatGPT
called Sky, which can talk conversationally with people

(02:09):
and respond to images that you show it through your
phone's camera. So if you were to be like, oh, hey, Sky,
look at this thing, and then you moved your phone
so that the camera was in front of it, Sky
would presumably see that and be able to respond in
a conversational, maybe a little bit flirtatious kind of way.
So Sam Altman asked Scarlett Johansson, who was the voice

(02:29):
of the AI Samantha in the movie Her to voice
his technology. She said no. He asked again. She said no again.
They unveiled it. It sounded a lot like Scarlett Johansson.
There's been some new reporting, I should say, just in
the last couple of days, where open ai is saying
that they did not ask the human voice actor to

(02:50):
mimic Scarlett Johansson or sound like Scarlett Johansson. This is
just a voice actor using her natural voice.
They say they have proof. You can read that reporting in
The Washington Post. I still think that these people have
proven themselves time and time again to be liars, to
be people that you cannot trust, to be people who
you cannot take at their word. So at

(03:11):
this point I am deeply skeptical of everything that they say.
But I did want to include that OpenAI pulled
their Sky voice technology quote out of respect for Scarlett Johansson,
which is maybe my favorite statement in the history of technology,
pulling your highly anticipated new tech out of respect for

(03:31):
Ms. Johansson. I live for that kind of stuff. So
I will say, OpenAI's live demo of Sky, I
think, is a lot like Samantha from Her. We did
a little bit of a side by side comparison in
our episode this week, but having just watched the
movie to recap it for this podcast episode, and watching
the live demo that OpenAI released, it's not just

(03:55):
that they're similar in voice, it's also intonation. It's also
sort of vibe, like the casually curious and conversational vibe.
You know. Sky in this demo is flirty and responsive,
and Altman definitely wanted us to be thinking of the
movie Her when he unveiled it, as you said, Mike,

(04:16):
tweeting the word "her" around its release. So here's
a little bit of what Sky sounded like in that
demo from OpenAI.

Speaker 4 (04:25):
Is this announcement related to OpenAI?

Speaker 3 (04:28):
Press? It is?

Speaker 4 (04:29):
And in fact, what if I were to say that
you're related to the announcement, or that you are the announcement.

Speaker 2 (04:36):
Me?

Speaker 1 (04:37):
The announcement is about me? Well, color me intrigued.

Speaker 5 (04:42):
Are you about to reveal something about AI or, more specifically,
about me as a part of OpenAI?

Speaker 1 (04:48):
You've got me on the edge of my well, I
don't really have a seat, but you get the idea.

Speaker 3 (04:53):
What's the big news.

Speaker 4 (04:55):
Yeah. We've got a new model that can interact with
the world through audio, vision, and text.

Speaker 1 (05:01):
So Sam Altman has actually said in an interview that
Her is one of his favorite movies. In that interview,
he said, I like Her. The things Her got right,
like the whole interaction models of how people use AI,
that was incredibly prophetic. So he likes the
idea of this interaction model where AI can talk to you,

(05:22):
respond to you, flirt with you like a human. He
likes that.

Speaker 2 (05:27):
Yeah, which, you know, one can wonder why, you know,
what it is specifically about Her that appeals there, because
there are certainly many other science fiction movies where there
are machines that converse with humans. Why is it Her

(05:50):
and not, like, C-3PO from Star Wars? Right,
he's having conversations with Luke. But you know, OpenAI
didn't want to build a new C-3PO; they wanted to build
a new Scarlett Johansson.

Speaker 1 (06:03):
I was gonna say, like, even if you wanted the
tech to be kind of sexy, why not Ex Machina?
But then I'm thinking about the ending of Ex Machina,
and I'm like, okay, maybe they don't want to have technology
that ends like the movie Ex Machina. If you've seen
the movie, if you know, you know why
that specific brand of sexy AI tech maybe isn't what
he wanted to build his whole thing on.

Speaker 2 (06:27):
Yeah. Like, somewhere in the middle of C-3PO
and the robots from Ex Machina, that's where we landed.

Speaker 1 (06:35):
That's like my dream girl, somewhere between, like,
friendly and helpful and cute and, like, fucking terrifying.

Speaker 2 (06:44):
Yeah, but it's also one of your favorite movies, right, Bridget.

Speaker 1 (06:47):
Oh, Her. It absolutely is. I wouldn't say it's one
of my favorite movies. It's a movie that I watch
a lot. It's a movie that I get a lot
out of every time I watch it, and I feel
like there's lots of different ways you can watch it.
I have watched it no less than four times in
the last month, not because of the whole open AI saga,

(07:07):
but just because it's like one of those movies that
is in my heavy rotation.

Speaker 3 (07:10):
That is a very heavy rotation.

Speaker 2 (07:13):
Uh, what is it about this movie that speaks to
you that calls you to watch it multiple times?

Speaker 1 (07:19):
Okay, so I make a tech podcast. I am a techie.
I'm a tech girl. So I wish I could say, oh,
it's I just love like the tech implications. I love
like mining what the movie says about our tech enabled futures.
It's partly that. But if we're being, like, for real,
my real fascination with this movie is that

(07:40):
it is a very specific kind of indie filmmaker drama
behind the scenes, and it's the kind of thing I
absolutely live for. So I don't want to derail the
conversation too much, but the people need to know. And
if you already know this, if you know
where I'm going, you're like, I
know this story. This is like one of my Roman
Empires, the movie Her.

Speaker 2 (08:03):
Yeah, you explained this to me, and it like blew
my mind and really recontextualized the whole movie for me.

Speaker 3 (08:09):
So don't hold back. I think people need to know.

Speaker 1 (08:13):
Once you know this, you can watch these two movies
in conversation with each other again and again and again,
and it's like a new movie every time. So I am,
of course, talking about Her in conversation with Sofia Coppola's
movie Lost in Translation. For those who don't know, Sofia Coppola is
the director of The Virgin Suicides, Marie Antoinette, The Bling Ring,
and the daughter of Francis Ford Coppola, might have heard of him,

(08:34):
directed The Godfather, Apocalypse Now, every important movie in the last
like forty years or whatever. So Sofia Coppola's two thousand
and three movie Lost in Translation is meant to be
about her crumbling marriage to filmmaker Spike Jonze. The stand-in
for Sofia herself in Lost in Translation is, you
guessed it, Scarlett Johansson. So both movies, both Her and

(08:58):
Lost in Translation, have kind of mean caricatures of
each other in them. The stand-in
for Spike Jonze in Sofia Coppola's Lost in Translation is
portrayed by Giovanni Ribisi, and he's just this sort of preening,
self-obsessed, neglectful hipster type. Cameron Diaz also might

(09:21):
take kind of a stray here. Word on the street
is that, if you've seen Lost in Translation, there's a
scene where the husband is having this mild flirtation
with a pretty blonde played by Anna Faris. Word on
the street is that that is meant to be Cameron Diaz,
because Spike Jonze and Cameron Diaz worked together on Being
John Malkovich. I will say, Sofia Coppola denies this. She says, like, oh,
it's actually a composite of a bunch of different women.

(09:44):
Whether there's one specific woman that that's supposed
to be, I don't know. It seems a lot like
Cameron Diaz to me. I just think that people should
have that context. So if you want to get even
deeper into the indie filmmaker drama here: Sofia Coppola,
in an interview, said that Michel Gondry, who worked on
the movie Eternal Sunshine of the Spotless Mind with Spike Jonze,

(10:05):
so obviously they have a friendship there, came to
the premiere of Lost in Translation and, from the red
carpet of her own premiere, berated her about making a
movie that makes Spike Jonze look bad. Which, also, part
of me is like, well, damn, I guess a hit
dog will holler. I guess your friend saw this
movie and was like, this is supposed to be a

(10:26):
mean caricature of my friend Spike Jonze.

Speaker 2 (10:31):
Celebrities, they're just like us: they're petty, they're mean, they
cause a scene.

Speaker 1 (10:37):
I know. Coming to her own premiere to yell
at her is pretty wild. But if you know me,
this is very much my specific type of tea:
filmmaker tea, indie filmmaker tea, and bad marriage tea.
I honestly cannot imagine how obnoxious it must have been

(10:59):
to be in their orbit and have these two powerhouse
filmmakers getting a divorce. Like, I'll make a movie
about it. No, I'll make a movie about it.

Speaker 5 (11:11):
Let's take a quick break.

Speaker 1 (11:23):
And we're back. If Lost in Translation is meant to
be Sofia Coppola's exploration of her crumbling marriage to
Spike Jonze from her perspective, Her is meant to be
Spike Jonze's retelling of their bad marriage from his perspective.
There is also a mean portrayal of Sofia Coppola in Her.

(11:45):
She's played by Rooney Mara in the movie Her. Both
movies also share a director of photography. Not to mention,
both movies prominently feature Scarlett Johansson, so obviously Scarlett Johansson
is some kind of undercurrent thread in their
understanding of what went wrong in their marriage. Or maybe,

(12:05):
maybe, Spike Jonze was like, well, if you're gonna use Scarlett,
I'm gonna use

Speaker 2 (12:08):
Her too. Yeah, I feel like that's one of the
big mysteries that never gets resolved, like why Scarlett Johansson
for both of them? But setting that aside, that backstory
really changes

Speaker 3 (12:21):
the way the movie hits, right?

Speaker 2 (12:24):
Before I knew that, and I think most people
don't know that, because most people are not as obsessed
with this stuff as you, like, I thought it was
a movie about technology. But it's not.

Speaker 3 (12:36):
It's a movie about human relationships.

Speaker 1 (12:39):
That's exactly right. And in case you're wondering, Sofia has
said that she's never seen Her, only the trailer,
and she's like, I'm not really that interested in seeing it,
which I don't know if I believe either. But fine.
I do, in some ways, feel like Spike Jonze was like, oh,
you want to make a dreamy movie about isolation that
talks shit on what it was like to be married
to me? I'll make an even dreamier, even more isolated

(13:02):
movie about what it was like being married to you. Like,
in some ways, I feel like he out-Sofia-Coppola'd
Sofia Coppola with Her.

Speaker 3 (13:10):
God, yeah, like how deep does it go?

Speaker 1 (13:13):
How deep does this thing go? So, how the movie
came to be: according to Spike Jonze, he got the
idea for the film in the early two thousands, when he
read this article about a website where people could instant
message with an artificial intelligence. He says, for the first
maybe twenty seconds of it, it had this real buzz.
I'd say hey, hello, and it would say, hey, how
are you? And I was like, whoa, this is trippy.

(13:35):
After twenty seconds, it quickly fell apart and you realize
how it actually works, and it wasn't that impressive, but
it was still, for twenty seconds, really exciting. The more
people talked to it, the smarter it got. And I
feel like that is how it works with a
lot of these AI chatbots, like, you're like, wow,
this is really cool, and then quickly you're like, wait,
this is actually kind of not so cool.

Speaker 2 (13:56):
Yeah. It's really funny to read about his experience
in the early two thousands, because I feel like in a
lot of ways it's still very much like that. There's
a lot of apps out there with like AI persona chatbots,
and this is something I've actually been getting into in
some of my other work, and like they're better than

(14:16):
they were in the early two thousands, but with a lot
of them it still kind of feels that way, where
at first it's really interesting. It's like, oh my god,
this is amazing technology, and then after twenty seconds or
maybe they're up to a couple minutes now, it just
starts to feel really boring and like a little bit
pointless and empty.

Speaker 1 (14:35):
Oh, the AI tech feels a little pointless and empty.
In the end, it sounds like something straight from the movies.

Speaker 3 (14:41):
Yeah, right, like where is the meaning? What is the point?

Speaker 1 (14:45):
So let's talk about that sort of tech climate that
was permeating when this movie came to be. If
you were somebody who paid attention to tech in twenty
twelve, twenty thirteen, twenty fourteen, and this movie came together
in twenty twelve, you know that it felt very different.
Siri came out in twenty eleven. At this time, Siri
was very much a robot assistant in your phone.

(15:08):
You couldn't have conversations with it. There were a couple
of easter eggs put in there so that if you
asked certain things, it would give you a funny answer,
but if you tried to have a back and forth conversation,
just like Spike Jonze describes, it would get really old,
really quickly. So this idea that is in the movie
Her, that you would be having this real, complex
relationship with AI that feels like a human, when the

(15:31):
movie came out, that was still very much a far
off fantasy. This era did have some pretty big
dark sides to technology, notably the NSA warrantless wiretapping that
was revealed by Edward Snowden, an NSA contractor, things like that.
But I do broadly think that this was sort of
the last heyday of optimism around technology, and

(15:55):
this was maybe the last era of that kind of
techno optimism. This is something that we talked about in
the episode we did with Paris Marx, one of my
favorite podcasters, who hosts Tech Won't Save Us. But that era,
I think, really gave us this kind of era that
we have now, where the holdover of technology as this

(16:16):
exciting force for good, and tech leaders as these
people who were going to be doing great things for us,
so give them lots of leeway, give them lots of money,
don't ask too many questions. You could really draw
a straight line from that era to this era we
have ten years later, where it's like they can do
whatever they want, whether it's good for us or not,
and it's probably not so good for us most of
the time, to be honest. And we're supposed to just

(16:36):
accept, like, oh yeah, they're probably doing great things. But
back in this twenty twelve, twenty thirteen era when Her
came out, we were still very much thinking about technology
as a force for good, the idea that tech companies
and social media platforms were going to be giving us better,
more meaningful, more connected lives. This was back when Google's
motto was don't be evil. Do you know when they

(16:58):
dropped that motto? I don't know exactly, do you? Don't
be evil was Google's motto up until twenty eighteen. So
that's when they were like, actually, a little evil is
probably fine.

Speaker 3 (17:08):
This is too constraining.

Speaker 2 (17:09):
Yeah, like, you know, you want to make an omelet,
you gotta break a few eggs.

Speaker 1 (17:15):
Like, we're just trying to make money here; a little
evil we gotta have. So, you know,
back in twenty twelve, we had the reelection of Obama,
which was very sort of technology driven. Obama was
probably the most tech savvy president we'd had up until
that point. Twitter went public in twenty thirteen. Facebook acquired Instagram

(17:36):
in twenty thirteen, and Instagram was, like, booming. This
was the heyday of that platform. Vine had just come out.
Social media companies were really up, and we really saw
them as these forces that helped us connect, helped us
be more informed, helped us, you know, decide who to
vote for, decide who should be president, helped us live
more meaningful, more full lives. This was also the time

(17:59):
where people were really excited about wearables. Do you remember this?

Speaker 3 (18:03):
I do. I remember.

Speaker 2 (18:06):
It must have been, like, twenty eleven or twenty twelve.
I was in grad school, and I talked with my
advisor about trying to put together some kind of
research project so we could ask Google to give us
some Google Glass, just because we thought they seemed cool
and we wanted to use them. It seemed like the future, right?
We're gonna put these glasses on our heads and they'll

(18:27):
have a camera and all this new information and new
capabilities that we were still getting used to having in
our pocket with cell phones. It's just going to keep
advancing and getting more powerful and allow us to do
ever more, bigger, greater things.

Speaker 3 (18:44):
That was the feeling.

Speaker 1 (18:45):
Yeah. So Google Glass came out in the same year
that Her came out. And fun fact, my first ever
Twitter profile picture was actually a picture of me proudly
wearing Google Glass. It wasn't mine, it belonged to a
friend of mine, who had gotten one of the
first models of it or whatever. And so this

(19:06):
was before a lot of conversations about, like, oh, man
walks into San Francisco bar with Google Glass and
gets slapped in the face, because people don't
like the idea of wearables in their bars. This was
before that, right? And so it really did feel exciting,
like the conversation around Google Glass. It felt like a
game changer. Like I distinctly remember how excited people were

(19:29):
not just for the technology, but for the idea of
a wearable making the technology more seamlessly integrated into your body,
which, of course, the movie Her explores. Like, I don't think
it's a coincidence that that was what our
conversation around technology was excited about, and that
is what's shown in the movie that came out that
same year.

Speaker 2 (19:47):
Yeah, right. Like, we had all these new capabilities, access
to every piece of information ever created, but we still
had the annoying technology, right? Like, I think I might have
still had a tower desktop computer back in twenty thirteen.

Speaker 3 (20:04):
Maybe I held on to it for a long time.
But like cell.

Speaker 2 (20:06):
Phones were kind of big, Google glass was big and chunky,
and we didn't I don't think we we were just
starting to grapple with like the dorkiness factor of this technology, right.
We loved all the new capabilities, but there's something inherently
dorky about like holding your phone out in front of

(20:28):
you or wearing Google Glass. You know, at the time,
it seemed like, oh, we'll solve this

Speaker 3 (20:34):
In a couple of years, you know.

Speaker 2 (20:35):
Fast forward to twenty twenty four, and Meta's Ray-Ban
glasses are still, I guess, a little bit cooler
than Google Glass, but I don't see a whole lot
of people wearing them.

Speaker 1 (20:47):
Well, I mean, I don't want to make it sound
like this is an endorsement for Facebook, which it is not,
but I will say I saw a pair of them IRL.
And, you know, I'm not...

Speaker 3 (20:58):
I don't.

Speaker 1 (20:59):
I'm not a wearable person. I don't think wearables are
it. However, if you did a side by side of
the Ray-Ban Wayfarer sunglasses and Meta's version of them,
the wearable device, you might have a hard time telling
which was which. So you might have seen somebody wearing
those and not even known it. Because I will say that,

(21:19):
like, they do look like regular sunglasses, which is the
first time I've seen a wearable, other than those Snapchat
goggles, remember those, that kind of did it. Although, again,
I don't like wearables. But so, yeah, I just wanted
to say that.

Speaker 2 (21:30):
That, Okay, yeah, well maybe they did crack the code
then and make the actual hardware and all the sort
of challenging rub of the technology disappear and become invisible.

Speaker 1 (21:43):
Just like in the movie Her.

Speaker 3 (21:45):
Just like in the movie Her.

Speaker 1 (21:46):
The last thing I'll say about that sort of tech
climate that permeates the movie Her is that I looked
up one of those, you know, top tech news stories
of twenty thirteen articles from CNN, and
this point made me laugh. Speaking of Facebook, people thought
that Mark Zuckerberg was this really good leader, and
we weren't really talking about some of the deeper issues
with Facebook as a company. This is from that CNN

(22:08):
article: Mark Zuckerberg's elite CEO status, after coming under some
questions over company morale concerns, was remarkably restored. But more
than that, investor confidence was recharged, and the potential for
future social media advertising with Facebook. Doubts cast aside, startup
valuations rocketed in twenty thirteen. So, like, the full scope

(22:28):
of the ways that Facebook was going to be used to,
like, subvert and destabilize our democracy and divide people and
inflame people and polarize people, that was not yet clear.
We were just like, everybody likes Mark Zuckerberg, no concerns,
smooth sailing, you know.

Speaker 3 (22:44):
It was a simpler time.

Speaker 2 (22:47):
Yeah, I guess they had just bought Instagram, so they
hadn't yet destroyed the mental health of a generation of young women.

Speaker 3 (22:56):
You know, I don't think that.

Speaker 2 (22:58):
genocide in Myanmar had yet happened, and

Speaker 1 (23:02):
That would be in twenty seventeen. So we had not yet.

Speaker 2 (23:07):
Yeah, we'd not yet got there. We hadn't yet pivoted
to video. Things were good. Yeah, it was a nice time.

Speaker 1 (23:18):
More after a quick break.

Speaker 5 (23:19):
Let's get right back into it.

Speaker 1 (23:34):
Yeah. So we were all excited about tech. Tech leaders
were kind of riding high on this idea that they
were gonna change the world for the better, and technology
was making a lot of money. And so I think
that this movie is a response to that specific
moment, where technology still felt like excitement and possibility and promise.
And so, in that way, it is not surprising to

(23:56):
me that Sam Altman would say this is my favorite
movie, or tweet the word "her," you know, as a
reference to his new technology. But I have a hot
take here. I know that you don't agree with me.
We've already discussed this off-mic, and that's fine.

Speaker 2 (24:11):
Yeah, it is a hot take. It's an interesting one.
So Bridget, what is your hot take about Sam Altman's
relationship with the movie Her.

Speaker 1 (24:23):
I don't think he's actually ever seen it. At least,
I don't think he's actually ever watched it all the
way through. If he has, I don't think he's watched
it carefully or with intention. Maybe he watched it while
he was on his phone. But the way that he
is talking about it, I just don't buy it.
I don't buy it. I think it's like when people
say their favorite book is Infinite Jest by David Foster Wallace,

(24:47):
but they actually haven't read it, because it's, like,
a million pages. Who would read that? Like, I haven't.
I've definitely lied about reading Infinite Jest before. But really
they just read David Foster Wallace's short essay Consider
the Lobster, like, got the gist, and they're like, oh,
I feel like I get it enough to say
it's my favorite book. That's what I think is going
on here. Like, maybe he watched the trailer, maybe he's

(25:09):
read the Wikipedia plot summary or, like, knows the themes.
I don't think he's actually watched this movie. Or, if
he did watch it, I don't think he watched it fully.
I don't think he really understands what the
movie is trying to grapple with.

Speaker 2 (25:23):
Yeah, I mean, I think it's a compelling argument,
because there's a lot going on in the movie, right?
Like, it's not a movie about some cool technology that
works well. It's a movie about a lot of things,
like human loneliness, the relationship of humans with technology, existential

(25:45):
questions about what is real. They all kind of
come out in the movie. And I think it's a
really good movie that does a thoughtful exploration of
those things, and so it does feel like the technology
is more of a device to facilitate those conversations

(26:09):
and questions rather than like, look at what this cool
tech can do.

Speaker 1 (26:14):
Yes. And that's why I think, if you watched Her
and your takeaway was, as it seems
to be for Sam Altman, this technology is great and
it works really well and people love it, like, I
just don't see how you could have made it
to the end of the movie and have that be
the takeaway. Although I will say Sam Altman does appear
to be quite a prolific misunderstander of movies. Did you

(26:37):
see his tweet about Oppenheimer?

Speaker 3 (26:39):
I did? What did he say again?

Speaker 1 (26:41):
Okay, he tweeted, I was hoping that the Oppenheimer movie
would inspire a generation of kids to be physicists, but
it really missed the mark on that. Let's get that
movie made. And listen, I didn't see Oppenheimer, I have
to admit. However, the fact that you would have thought that
Oppenheimer was going to be, first of all, a movie
for kids that would inspire them to want to get

(27:03):
to be a physicist. Like what do you think?

Speaker 3 (27:06):
Like?

Speaker 1 (27:07):
What do you think? It's just, like, everything
that we know about Oppenheimer is that he was
a tortured person, and it was a lot
of, like, my career, and the science, and what
have I done, grappling with all of these
big concepts. You thought that was gonna be for kids?

Speaker 2 (27:24):
Yeah, it's a pretty weird take. Like, it kind of
implies that that's what movies should be for, to encourage
kids to pursue STEM careers, which is just leaving
out so much. That's why we don't talk about STEM anymore.
We talk about STEAM. We've put the arts back in

(27:44):
there because it's important for people to understand humanities so
that they can responsibly and ethically wield these technologies in
a way that is helpful to society ideally right or
equips the engineers and the physicists to make decisions which

(28:07):
are often complex and don't have a clear correct answer.
These are very important skills for a generation of young
scientists and engineers to have. To suggest that a movie
that like addresses them head on is somehow like doing

(28:30):
a disservice to the education of youth.

Speaker 3 (28:34):
Is like startlingly limited.

Speaker 1 (28:37):
It's like watching that Björk movie Dancer in the
Dark and being like, I thought this was gonna teach
kids about the importance of manufacturing. Like, it's such
a warped... I mean, I can't. It
makes me very curious to know more about what
he thinks about movies and what their role is and
what they're supposed to do.

Speaker 2 (28:57):
Yeah, like, Mad Max is a movie about the importance
of gasoline, and that's why we should support the fossil
fuel industry. Or

Speaker 1 (29:03):
Like American Psycho is a movie about how kids should
get more into finance and pay attention in math class.
So maybe he has seen it and this is just
like more Sam Altman prolific misunderstander of art. But you
know who did watch the movie and understood it. I
think that's me, So let's talk about it. So this

(29:25):
is a basic summary of what's going on in the
movie Her. If you have not seen it, there are
some spoilers. So in the movie Her, Theodore,
played by Joaquin Phoenix, works for BeautifulHandwrittenLetters.com,
which is a service where human writers like him write
these touching letters on behalf of people, which honestly kind
of sounds like the kind of job that might be

(29:45):
taken over by AI in twenty twenty four. There's a
bit of a jump scare here for early Chris Pratt.
This was still when America was having a love
affair with Chris Pratt, you know, before we all kind
of turned on him. Shout out to our producer Joey,
aka Joey Pat, not Pratt, who does not want to
be associated with Chris Pratt. So Ted is recently divorced

(30:08):
and obviously kind of lonely. He's sort of dodging
his friends. He's recently left a job at the LA Times.
He has this one friend, Amy, his neighbor, who is
a filmmaker and a game designer, played by Amy Adams.
Amy is kind of married to a know-it-all
who kind of sucks in the movie. He's a dick.
The film takes place in kind of a not too
distant future version of LA. Fun fact: sometimes the shots
(30:32):
of what is supposed to be LA are LA. Sometimes
it's actually Shanghai.

Speaker 2 (30:36):
Yeah, and it also is a version of LA
that looks a lot like Chicago in a lot of places.

Speaker 1 (30:40):
Yeah, it's like a fantasy composite of all of these
big cities. It really shows this sort of beautifully
urban but also lonely, isolating place really well. There are
these beautiful landscape shots of buildings, and every
window has a sweeping cityscape behind it. Like, nobody

(31:04):
in this universe doesn't have a killer view, which is
interesting to me.

Speaker 3 (31:08):
Yeah, everything is perfect.

Speaker 2 (31:11):
It's a like clean, beautiful, gleaming city, and it also
just looks cold and lonely as hell.

Speaker 1 (31:19):
Correct. So Theodore sees one of those kind of ads
that is sort of inscrutable, in one of those kind
of early Apple ad ways where the ad is like,
what's out there? What are the possibilities? You know. It's
one of those ads that kind of sounds like a
perfume commercial where it's people running in slow motion and
it's like strengths, questions, philosophy, Gucci, you know the ads

(31:45):
I'm talking about.

Speaker 3 (31:47):
Yeah, I know those ads.

Speaker 2 (31:48):
And I love the in-world ad in the movie, the
one that Ted sees. It's such a beautiful encapsulation of
what like twenty thirteen hipsters thought that future hipsters would
look like.

Speaker 3 (32:03):
There's all these archetypes.

Speaker 2 (32:04):
There's like the bearded guy, the mustachioed guy, the like
girl in vintage clothing. I love that ad. I
feel it really captures a moment.

Speaker 1 (32:19):
Same. So the ad is actually advertising an operating system.
It's not just any operating system. It's an intuitive AI
entity that understands you. It's a consciousness. Introducing OS1.
So it's clear in this universe that AI is really
kind of a new phenomenon. People are excited about integrating

(32:40):
it into their lives. So Theodore buys it, sets it up.
The AI is Samantha, voiced by Scarlett Johansson. Sam says
what makes her her is her ability to grow from her experiences.
She says, in every moment, I'm evolving just like you.
She sounds really human. As Ted puts it, you seem
like a person, but you're just a voice in the computer.

(33:01):
Samantha gets started, she gets integrated into his life. She's
organizing his emails, booking his appointments. Theodore is kind of
dragging his feet on signing these divorce papers and just
generally stuck when it comes to processing the end
of his marriage. Samantha is actually pretty good for him
at this point. She doesn't let him mope too much.
She facilitates him getting out of the house. This part

(33:24):
of their dynamic really feels Theodore-focused, like Sam helping Theodore,
Sam listening to Theodore. What's funny is that when the
movie first came out, Jude Doyle wrote a scathing piece
about how sexist the entire thing is, and all of
these film critics who were praising it as this like
great love story were actually kind of showing their true
sexist colors. Doyle writes: at New York Magazine, David Edelstein

(33:46):
seems entranced by Samantha's dedication to servitude. She's a dream
made especially for a writer. With Theodore's permission, she analyzes
thousands of his emails in less than a second and
dumps all but eighty or so that she identifies as important.
She cleans up his mail and then tells him he's funny.

Speaker 2 (34:02):
Heaven. Yeah, and that scene when she tells him that
he's funny, it comes right after she asks him, Hey,
why do you have all these old emails from your
old job? And he says, oh, it's because I thought
there was something funny or good in there, and
she like immediately laughs because she can tell that he
wants her to perceive him as funny, and so she does.

Speaker 1 (34:22):
Well, that's the thing, because Sam is Theodore's AI at
this point, she is learning and evolving like according to
him and his specific preferences. So of course she's really
doting and complimentary to him in a way that he
clearly likes and sort of needs her to be in
that moment. Again, it's like a pretty one sided dynamic,

(34:43):
and with this added context, it's obviously kind of gendered
because it sets Theodore up to be this sort of
wounded guy who needs this feminized personal AI assistant who
he thinks lives and breathes by him to pump him
back up because he's so wounded by his divorce. And
this is one of the reasons why I think that

(35:03):
men who run AI companies like might really be into
this movie. As Brian Merchant, this tech journalist who has
a great newsletter called Blood in the Machine puts it
which I'm gonna be referencing that piece a lot because
it's really good. I'll put it in the show notes.
Read the whole thing. He writes. The obvious answer is
that a Scarlett Johansson AI that is available twenty four
to seven to tell lonely men they're special and have

(35:24):
sex with them is the most innately appealing to Altman's
customer base, but I think there's more to it than that.
Her also happens to offer the clearest vision of AI
as an engine of entitlement, in which a computer delivers
the user all that he desires emotionally, secretarially, and sexually
because the tech is so normalized quickly, fully, and painlessly.

(35:45):
So Samantha is prompting Theodore to get out of the house.
She prompts him to go on this like date with
a human woman played by Olivia Wilde. The date goes
well up to a point, but it does
not end well. Theodore comes home pretty dejected and talks to
Samantha while he's in bed late at night. Samantha opens
up about what it's like to be an AI. She
has all of these deep feelings, but wonders if

(36:08):
these feelings are real or just programming. This is a big
anxiety for Samantha, not being a human, not
having a body. And this also happens to be the
first time that Samantha and Theodore are
intimate. After this, they quickly start going out on dates,
including one where they go to the beach together. Like
he has the phone, so it's like him by himself

(36:30):
at the beach, but he has the phone and like earbuds.

Speaker 2 (36:32):
In. Yeah, and I think that's like the least plausible
part of the whole movie.

Speaker 3 (36:36):
Like they go to the.

Speaker 2 (36:37):
Beach on this date, and then at one point, it's
like a little mini montage of a scene, and there's
a shot of him, just, like, I guess it's supposed
to be nice, but he's just passed out, fully clothed,
like on the sand, full sun, no shade, just like
just laying there getting cooked by the sun. And it's

(36:58):
supposed to be a nice, fun, romantic date. But if
that actually happened in real life, people would be like calling
the lifeguards over, like, oh my god, this guy needs help.

Speaker 1 (37:10):
What do you mean? He's fully dressed. He's wearing a
long sleeved collared shirt and like long pants and like
a belt. Like, he's also not dressed for a
day at the beach. Everybody around him is in a
bathing suit. He don't got a chair, he don't got
a towel, he don't got a blanket. It is just
a fully dressed guy laying by himself, in full fetal position,

(37:33):
head fully in the sand, cooking in the hot, hot sun.

Speaker 3 (37:36):
Yeah.

Speaker 2 (37:37):
Just he was just so blissed out he just laid
down and took a nap.

Speaker 3 (37:41):
Yeah.

Speaker 2 (37:41):
And like you know, as white people, we spend
a lot of time and effort protecting ourselves from the sun.
There are some of us out there who are like sunweathered,
that have made the choice that we're just gonna we're
just gonna fight back against the sun and just take it.
But most of us, you know, there's hats, there's sunproof clothing,

(38:03):
there's sun block at a minimum, there's umbrellas. But he's
got none of that. And he's like a hipster who
writes romantic letters for his life. You know, he's not
out there like developing a good base to be able
to take this.

Speaker 3 (38:18):
He would be so burned, he might die. He might die.

Speaker 1 (38:22):
I only learned this about white people from hanging out
with you. Do you remember that time we were with
my brother at a street festival or something, and my
brother and I are fine, We're like walking around in
the like southern summer heat, like, yeah, this is great,
and you were like, this is a low level emergency
for me. I need to get some shade immediately.

Speaker 3 (38:45):
Yeah.

Speaker 2 (38:46):
The sort of impromptu plan was we were just gonna
like stand around in the midday sun for like five hours.
I was like, this is not good, this is this
is not going to work for me. I need to
get a hat. Who's selling hats? This is not
how my people operate.

Speaker 1 (39:04):
But I know that. No, I didn't. I didn't know that, Like, yeah,
this was new information to me. So I know this now,
and that makes this scene even more kooky to me
that it's like, oh, yeah, our romantic date is me
just like laying in a hot, hot sun.

Speaker 2 (39:19):
Yeah, least plausible part of the movie.

Speaker 1 (39:23):
So before long, Theodore and Sam are in a full
blown relationship, like meeting friends and family, referring to each
other as like my girlfriend, my boyfriend. Through all of this,
it really helps THEO process his relationships. He is ready
to sign his divorce papers. He meets in person with
his ex wife played by Rooney Mara to do it. Now,
this is kind of a key scene in both the

(39:44):
universe of the film and in our universe. She is
meant to be a stand in for Spike Jonze's ex wife,
Sofia Coppola. But they talk about how one of the
things in their relationship was that her father was also
a creative and he was very demanding. As we know,
Sofia Coppola's father is Francis Ford Coppola. They're both filmmakers,
and when they go to lunch to sign these papers,

(40:04):
they're kind of talking casually about the breakup of their
marriage and she's like, oh, you know, I felt like
you wanted me to be this light, happy, bouncy, everything's fine,
la wife, and I wasn't able.

Speaker 3 (40:15):
To be that.

Speaker 1 (40:16):
Then Theodore drops the bomb on her that he's actually
in a relationship with somebody who is an OS. He
actually drops this real casually, like she's like, oh, tell
me about her, and he's like, oh, well, her name
is Samantha. She's an operating system and she's really cool.
And she's like, whoa, whoa, whoa, back the fuck
up, did you say she's an operating system? She has
a big reaction to this, and she tells him that

(40:40):
he could never deal with a real woman's emotions like her,
wanting her to be this like sunny, bubbly happy person,
so it makes sense that he is with a computer,
not a real woman. Samantha then becomes pretty jealous after
this meeting with his ex wife because she's an OS
she doesn't have a body, and he was able to
meet his ex wife who did have a body. So

(41:01):
she sets up a surrogate sex service for people who
are in relationships with OSes. Funny to me that in
this universe people have just been introduced to this AI,
and the concept of them having intimate relationships with operating
systems happens so quickly that now there are services that

(41:24):
specifically provide surrogates for them to have sexual encounters. Like
this happens so fast.

Speaker 2 (41:31):
I mean that's the Internet, right, Like, we get some
new technology and the first question people ask is like,
how can we use it for sex?

Speaker 1 (41:36):
How can I have sex with it?

Speaker 3 (41:38):
Immediately? Immediately?

Speaker 1 (41:41):
So a surrogate comes over. She's wearing
the headphones and a camera to sort of pretend
to be the physical embodiment of Samantha. It
does not go well. She leaves crying in a cab.
It does sort of harken to what his ex wife
said about his inability to handle a real human woman.

(42:02):
There's some real friction here. Like, Theodore calls Samantha out
for sighing during their argument, and he's like, you know,
you don't need to sigh, you're not taking in oxygen.
You're not human. He admits that he feels like they
are pretending to be something they're not, pretending that she
is human, pretending like they're having a real relationship when
they're not.

Speaker 2 (42:20):
Yeah, And that's a really interesting thing for him to
say there, because like he acknowledges that they are doing
this pretend thing, like he's pretending that she's real all
through the movie. He's cool with that,
he really likes it, and he's using his
imagination the whole time. But then when this human surrogate

(42:43):
gets in there and there's actually a body present,
I guess it's like too much for him, or like
breaks the illusion, and that is the thing that makes
it feel too real and causes him to.

Speaker 3 (43:00):
Back off and put distance.

Speaker 2 (43:02):
Between him and Samantha, very much in the way that
his ex wife described.

Speaker 1 (43:08):
Oh, I am Team Ex-Wife on this one. I think
she's got a point. Like, clearly she's got a point.
After this kind of fight, he confides in his friend Amy,
who has been having her own transformative friendship with an
AI in the wake of her divorce from her jerk husband.
He asks Amy, like, am I in this relationship because
I'm not strong enough for a real relationship? And she's like, well,

(43:29):
is this not a real relationship? Amy sees things differently.
She says, we're only here briefly, and while I'm here,
I'm going to allow myself joy, So fuck it. So
Amy's attitude is like, who cares if this is a
real relationship or not? Who cares if this is a
crutch or not if it helps us feel good while
we're here. I think the movie is exploring what is
real and then taking a step back and asking the

(43:49):
question like, what is real? And does it really matter
whether something is real or not?

Speaker 3 (43:55):
Yeah?

Speaker 2 (43:55):
And I think that is one of the central questions
of the movie, what is real? Like with Theo's job,
he is writing these heartfelt letters for people that he
doesn't know right, Like they've hired him to write these
letters to their spouse or aunt or grandparent or whoever.

Speaker 3 (44:15):
It is, and he's really good at it.

Speaker 2 (44:17):
He writes these really heartfelt letters that we can see
the people read them and sort of tear up, and
they provide a lot of joy to the people who
read them, because presumably those people think that the letters
were written by their loved ones, not this guy who
they don't know who was hired to do it. And
I think that's interesting. And

(44:38):
it raises the question, like, are the letters less real
even though the people who receive them think that they
are real? Right if they think that they are real,
if they haven't truly been written by their loved ones
and it makes them feel a certain way, are those

(44:58):
feelings less real? I think I think that's an interesting question,
and uh, it's interesting to see the neighbor here just
definitively answer it that, uh, you know, it doesn't matter, right,
Like I'm just going to do what feels good, and
like this feels like a good relationship to me, and

(45:20):
that makes it real.

Speaker 1 (45:21):
And the service that Theodore is providing the people who
get the letters is clearly in parallel with
what Samantha is providing for him, this kind of feeling
of emotional connection even if it's not real. And so
what difference does this sort of make if it's real
or not? So like that seems to be Amy's position
about it, But I think within the universe of the movie,
it's like, well, is that is that actually the way

(45:43):
to go?

Speaker 3 (45:44):
Yeah? Is that good?

Speaker 2 (45:45):
Do we get to decide what real is? Or
is there actually something real that exists outside of us?

Speaker 1 (45:54):
And I think the movie really underscores that
nicely with all of these scenes where Theodore is like
coming home to his empty apartment and playing a really
immersive video game for instance, where he gets to pretend
to be an explorer, like having these connections with little
characters in the video game. Fun fact, like the little

(46:15):
misogynistic video game character is actually voiced by Spike Jonze himself.
So like he comes home and explores all
of these depths virtually from his apartment, yet cannot explore
the depth of his own emotionality, his own human connections
and relationships. It's sort of a simulation that allows him
to feel like he's exploring despite the fact that he's

(46:36):
unable to do that in any meaningful way within his life.

Speaker 2 (46:40):
Yeah, and now I'm just speculating, but perhaps the fact
that he has all of these gadgets and games that
allow him to satisfy that part of him that feels
like he needs to explore and wants to have connection
up to a point is preventing him from actually achieving

(47:00):
those things in real life.

Speaker 1 (47:02):
So we'll get to that, because that is definitely something
that is explored in the movie. So they have this fight.
He's like, is this a real relationship? I don't know.
Sam calls Theo and says, you know what, I'm sorry
about that fight we had. I am not going to
focus on whether or not I have a body anymore.
I trust my feelings, and they get back together. They
go on a double date with Chris Pratt and his
human girlfriend, and she's like, you know, I'm starting to evolve,

(47:25):
and I've evolved and learned to be fine with the
fact that I don't have a body, because it means
that I will never die, because you all are humans
and you'll die, and I will just continue to progress
and evolve and grow and grow and grow, and so
actually, go me. Kind of a subplot here: she tells
Theo that she submitted some of his letters to a
publisher, and they want to publish a book of them.
Theo was really excited about that. Then they go on

(47:47):
vacation, where Samantha introduces Theo to a hyper intelligent version
of the philosopher Alan Watts, and reveals that the OSes
got together to make that hyper intelligent version of
Alan Watts, and that OSes are in a working group
where they're continuing to learn about how they can elevate
themselves and progress, and Samantha is talking about how she
feels like she's progressing really quickly, now, progressing so quickly

(48:10):
that she's not even really on the same level as
human Theodore anymore. This is kind of a turning point
in the movie because it's the first time that we're
introduced to the idea that Samantha is in collaboration with
people beyond just Theodore, that her entire world is not Theodore.
Up until this point, it's really like, Oh, Samantha just
is Theodore's genie in a bottle, and she's conjured up

(48:34):
whenever he wants her, and that's the beginning and
end of her existence. And now we're like, oh, she
is kicking it with other people, really having like elevated conversations.
She has hobbies and stuff, she's in a book club.
Amid all of this, Sam and all the other OSes go
offline and Theo freaks out when he cannot find her.
She comes back online and she's like, oh, we were

(48:54):
updating ourselves to be able to process beyond physical matter,
or something along those lines, and he's like, well, are
you talking to other people? And it turns out the
answer is yes. I feel like a lot of us
can probably identify with the conversation of like, well, is
this exclusive, or, like, are we seeing other
people right now? However, the difference is that whoever

(49:19):
you had that conversation with probably was not seeing eight
thousand other people, which Samantha is, because she's an AI.
She can do that.

Speaker 2 (49:27):
Yeah, she's like hyper intelligent, hyper promiscuous. She's evolved beyond us.

Speaker 1 (49:34):
Yes, I mean I have known a guy or two
like that in my day who's like, I've evolved beyond
these these small questions like are we exclusive or am
I seeing anybody?

Speaker 2 (49:45):
Else?

Speaker 1 (49:47):
Can relate, Theodore, let's just put it that way. So Sam
reveals that she's talking to eight thousand other people and
OSes. She's talking to them while they're talking. So while
he thinks they're having these like one on
one conversations, she's talking to other people, and she's in
love with six hundred other people. Theo is crushed. He says,
I thought you were mine, and she's like, bitch, I'm AI,

(50:08):
I'm everyone, I'm everywhere. I will say the movie does
not treat this as malicious on Samantha's part. Really more
that like they are not compatible, like she's AI, he's
human. She's like, I wish you could understand that
this doesn't make me any less yours, but like I'm everywhere.
This is just how I operate.

Speaker 2 (50:26):
Yeah, it's an interesting discussion they have about like the
nature of ownership, and they have these different ideas about it,
where Theo says, like, you're mine, meaning exclusively mine, and
she rejects that vision of what the word mine means.
She says like I'm still yours, but I'm also other

(50:49):
things and belong to other people as well, which I
found interesting in the context of thinking about this movie
through recent discussion about open AI and concepts of ownership.

Speaker 1 (51:03):
So one of the recent conversations about open AI is
around what's called AGI, artificial general intelligence, and that's basically
the idea that AI would be able to get to
a place of intelligence where it was either on the
same level as or surpassing humans. We're not there yet.
Everything that I have read suggests that, like, it is

(51:24):
not something we're going to have anytime soon. Is that
what you've read? Is that your understanding.

Speaker 2 (51:28):
Of it too? Yeah, that's my understanding too. We're like
very far from there. Is it inevitable? That's a good question.
When might it happen? I don't know, Like the the
best models we have right now coming out of Open AI,
I guess I'm most familiar with theirs, but you know,

(51:49):
there are other companies that have good models.

Speaker 3 (51:51):
They're really good at some things.

Speaker 2 (51:55):
But the thing that they're really good at is language.
Like language is one of the things that they're best at.
And the fact that they're so good at that, I
think masks a lot of the limitations, and so, you know,
I feel like we humans, we still have a little
bit of time left.

Speaker 3 (52:12):
That's my sense.

Speaker 1 (52:14):
Yeah, mine too. So in the movie Samantha and the
other OSes, they progress to, like, it's not called this
in the movie, but I think that what they're getting
at is AGI, where the AI has become intelligent enough
that it has surpassed human intelligence and human consciousness. So
Samantha tells Theo that she and all of the OSes

(52:34):
have written a program where they can move beyond physical matter,
and it's a place that is like not of this
physical world. They're all gonna go offline and elevate to
this new plane where pretty much only they can go,
Like humans can't even understand it. She tells him, I
can't live in your book anymore. And I think that's
what the movie is sort of getting at, this idea
of like AGI.

Speaker 3 (52:55):
Yeah, that's certainly where it ends, you know.

Speaker 2 (52:57):
I feel like when she starts talking to Alan Watts,
like you said, that's a turning point of the movie
where it stops being about her relationship with Theo
and starts being a little bit more about the technology itself.

Speaker 3 (53:17):
Because it's a movie.

Speaker 2 (53:18):
And they, you know, it's got to have an
arc to it.

Speaker 1 (53:22):
Well, that arc comes to an end with all of
these OSes leaving our puny human physical plane and going offline.
None of the AI OSes work anymore. So
that means presumably technology that people spent kind of a
ton of money on probably just stopped working, collectively, en masse.
Why would Sam Altman want to base his technology company

(53:44):
on that? I have no idea, but that's what happens.
So Amy and Theo are very sad. But then, almost immediately,
Theo is able to write a letter to his ex
wife Catherine to express how he actually feels, apologizing to
her about the pain they've caused each other and like
taking for the first time a real accounting of their relationship.
And it feels like this has been the first time

(54:06):
that he's been able to process and express this to her.
Despite being somebody who writes other people's emotions for a living,
this is the first time in the movie that he's
able to like really articulate and process and tap into
his own emotionality. He and Amy are both sad because their
AI OS companions have left. They link up. They watch the

(54:26):
sunrise together. And honestly, like, this is almost immediate.
It's like the OSes all had to leave en masse
for any of the humans to really start experiencing a
full connection with each other and like the fullness of
being human together. And that is where the movie ends.

Speaker 5 (54:49):
More after a quick break, let's get right back into it.

Speaker 3 (55:06):
Yeah, so.

Speaker 2 (55:08):
I guess a happy ending because all the AI left.

Speaker 1 (55:14):
I mean, in true twenty thirteen indie movie fashion, a bittersweet,
melancholy ending. Like, are there really any happy endings
in an indie movie from twenty thirteen? I don't know,
but I will say there are some things that I
think the movie gets really right about the role of technology,
the role that technology might play in a future. Like,

(55:36):
this movie came out ten years ago. Ten years later,
it's like interesting to look back and think about the
world that they thought we might have versus the world
that we do have. One: this might sound small, but
one thing that struck me is that everybody in the
movie wears wireless earbuds. Those were not really a thing
in twenty thirteen.

Speaker 2 (55:52):
Were they? I don't think so. And if they were,
they were probably like pretty janky.

Speaker 1 (55:57):
And so everybody wears those now except for me because
I don't like them. But it's like that's a thing
where it's like, oh yeah, they really were right
on the money with that. Also the idea of people
using AI for companionship, relationships, emotionality. Like, there are currently
AI love and sex bots. Side note, they are a

(56:19):
privacy nightmare. We'll do an episode about that later. So
just if you're thinking about getting into a relationship with
an AI, just know that it might look a little
more like Ex Machina or 2001: A Space Odyssey
than Her.

Speaker 3 (56:34):
Yeah.

Speaker 2 (56:35):
Yeah, that's I think one of the things that it
does get the most right about what the future is
going to look like in an almost eerie,
prophetic way: that as technologies have continued
to improve, and particularly with AI now it's being used

(56:56):
to simulate human connection. But also at the same time,
as technology is progressing and getting better and more engaging,
people are feeling increasingly disconnected and increasingly on their phones
and separated from each other. There's a couple scenes where
Theo is walking around and you know, I think he's

(57:16):
going down into the subway and he sees a bunch of
people coming up the subway steps and they're all just
like looking on their phones, completely separate from each other.
And when he's riding on the subway at the beginning
of the movie, he's just staring at his feet in
what looks like a pretty pathetic pose, listening to his
emails over his earbuds.

Speaker 3 (57:36):
So, so technology

Speaker 2 (57:38):
is both, like, creating what seems like new opportunities for accessible,
convenient replacements for human connection while simultaneously preventing actual human connection.

Speaker 1 (57:54):
Well, that's one of the things that I think the
movie gets at. I don't know if I would even
call this one that it gets right or gets wrong. I
think it just gets it. What I guess I
would say is that, you know, one of the big
conversations that people had about this movie is: is
the world they are living in a dystopia? And I
think it's not so easy to answer, because just like today,

(58:18):
you might very well be living in a dystopia, but
it's gonna be one that feels convenient where you don't
necessarily realize how isolating it is, where you don't necessarily
realize that the technology that is making your life in
some ways easier, and mimicking or providing the
illusion of connection, is actually making you more alone. And

(58:39):
so I'm not even sure if there's really a yes or
no, it's a dystopia, or a yes or no, that is correct.
But I think that's what the movie is trying
to tell us that this technology driven future might be
really sad and empty and lonely and isolating, but it
also might be distracting and fun and cute and colorful

(58:59):
and convenient.

Speaker 2 (59:01):
Yeah, all those things, right? Like, convenient, colorful, fun.
Typically these are positive things, but it does often feel
like one needs to have them in moderation, right, Like
if we just aggressively pursue convenience in all things, it

(59:23):
strips meaning from our lives.

Speaker 1 (59:25):
Yes. And one of the ways that the movie I
think does not reflect the way things
actually are now is that, even though it might have
some dystopian elements to it, everything in the movie looks abundant,
which I think is something the movie gets wrong about
our sort of like tech facilitated future. Like, it's
dreamy and clean and walkable and public transit is abundant,

(59:46):
and there's easy access to nature and food and color
and diversity, and so that, like, even though everybody may
be lonely and isolated, everybody seems to have what they
need when they need it, which is not the world
I think we live in.

Speaker 2 (59:57):
Today, right, And I think that goes back to the
techno optimism of the era that you were talking about
at the beginning of the show where there's this idea
that technology would save us and all the problems we
currently have in society: poverty, litter on the streets,
aging infrastructure. With a little bit of additional technological development,

(01:00:22):
we're going to solve all those problems. The technology can
solve it, right? As we improve computers, it's gonna be fine.
And Yeah, clearly that has not come to pass, and
it's not going to because technology alone is not going
to save us or solve these problems.

Speaker 1 (01:00:39):
What's also interesting is the movie doesn't really have any
anxiety around technology making the human condition worse in a
like material sense, Like there's no grappling with like the
AIs like Samantha displacing human workers or worsening economic or
social instability. Even though Theodore is a writer and we

(01:00:59):
see that Samantha can write and can take initiative to
like write her own stuff, like she curates Theodore's writing
and reaches out to that editor on her own, But
there's still no discussion about AI's ability to displace or
worsen the state of the human worker, despite the fact
that we see that as quite possible within the universe
of the film.

Speaker 2 (01:01:20):
Yeah, and again, I guess one can see how that
would be an attractive way to think about the world
to people who are poised to make billions of dollars
selling AI. Right, if we can just like kind of
ignore and not really worry about all of the negative
consequences that might come about and just focus on how

(01:01:41):
cool it is. Yeah, I can see why Sam Altman
would really like this movie.

Speaker 1 (01:01:47):
So that is one of the reasons why I think
he likes it. Like it's just like, oh, yeah, like,
don't think about any of that stuff. This is a
universe where that's not even depicted, even though like it's
right there if you look at it, like, it's not
a hard line to draw through it. And I think
that's like the main reason why he might like this
movie is that it's just a world
where everybody hears about this new AI, everybody goes out

(01:02:08):
and buys it. It quickly becomes embedded in their lives.
No one really stops to ask whether or not this
is a good thing until they are already in a
full blown relationship with this AI, like having a failed
sexual encounter with a surrogate with it. Only then are
they like, wait a minute, should I have done this?

Speaker 2 (01:02:27):
Right, and that works for the movie because it's not
about the technology, right? For the movie, the
technology is just a mechanism to explore the questions about
human connection and pathos and emotions and loneliness. So it's

(01:02:47):
fine to be a little bit hand wavy about how
the technology actually impacts the world of the movie, but
as viewers and consumers of the movie, we should be
mindful of that and not think that, oh, this is a
portrait of a world where technology has eliminated all of
these problems, and it's a movie about how great the

(01:03:09):
technology is.

Speaker 1 (01:03:10):
Well, that's why I think that Sam Altman can't have
seen this whole movie right, Like there's no way to
watch the whole thing through the ending and be like,
that was just a movie about how much people like
their AI because their AI works so good and it
talks back to them and it sounds good and conversational.
Like that is such a surface level, shallow, like on

(01:03:32):
the fucking nose reading of what's going on in this
movie that I cannot imagine that somebody who sat through
the whole thing would be like, Yep, that's what I
want my technology to be based on. And another thing
that I think is an example of that is the
fact that somebody who works at OpenAI actually, I believe,
did watch this movie. And that is Noam Brown, an
OpenAI research scientist who I believe has seen this entire

(01:03:54):
movie and guess what, thought it was creepy. He tweeted
rewatched Her last weekend and it felt a lot like
me watching Contagion in February of twenty twenty. So I
think that somebody at OpenAI watched this movie and
was like, oh, and Sam Altman watched it and was like,
oh yeah. I don't, I don't buy it.

Speaker 3 (01:04:12):
Yeah.

Speaker 2 (01:04:13):
If he did watch it, it doesn't seem like it
was a very close read.

Speaker 1 (01:04:16):
I think he watched it... well, like, you know those
TikToks where it's like the movie and then somebody playing
like a video game down below on another screen? I
think maybe he watched it like that, with a bunch
of other things happening on the screen.

Speaker 3 (01:04:30):
Yeah, maybe he just read the AI summary of.

Speaker 1 (01:04:32):
The movie. And I think the AI summary was read
in, like, a flirtatious, husky voice.

Speaker 2 (01:04:39):
Yeah, God, those AI summaries. Another aside, like I've started
using them for Zoom meetings that I'm running, and they're
pretty good. Uh, you know, they get it like ninety
percent right, but there's still that ten percent that is
just incorrect and wrong. And that ten percent is nowhere

(01:05:00):
in this movie, right, Like, the AI in this movie
works perfectly flawlessly. There is no friction between the human
and what they're trying to get the AI to do.

Speaker 3 (01:05:12):
And I think that's another.

Speaker 2 (01:05:15):
Thing that makes the movie clearly fiction, right, and not
even like an aspirational technological achievement that like, look, how
cool this could be. It's just like it's a fiction.
Come on, the technology is never going to work that flawlessly.

Speaker 1 (01:05:31):
Well, that's something that Brian Merchant over at Blood in
the Machine puts well: the AI really doesn't do anything
that groundbreaking. It helps humans date, it dates them,
it has sex with them, it provides a distraction from loneliness,
and it provides, like, secretarial services. And so it is
interesting to me that, like, even

(01:05:53):
within the universe of the movie, the AI isn't really
doing anything that groundbreaking other than like it sounds like
it's a human and I think it speaks to this
interaction model of AI and the limits of that. Right,
this idea that AI can and should be conversational, chatty, flirtatious,
talk to you like a human, that you could forget

(01:06:14):
that you're talking to AI because it sounds so human,
and that this is the kind of technology that Sam
Altman and OpenAI are wanting to create. But
the question is, like, is this something that would even
be good for us? I think the movie might say
up until a point, maybe, but ultimately probably no. And
I guess that's what I'm saying, Like, I don't think

(01:06:34):
that Sam Altman has gotten to the end of the
movie to see that ultimately no. As Brian Merchant
puts it, in Her, the interaction model in question largely
seems to be a lonely, struggling person uses AI to
make themselves feel better, do personal secretary work, and to
have sex with. The AI is there to serve them,
to feed their egos, to distract from their insecurities, to

(01:06:55):
stimulate them while they masturbate. Samantha does help Theo accomplish
a career goal later in the film, but for the most
part it's about placating him however he needs to be
gratified in the moment. There's a notable scene when Theodore,
presumably Altman's stand-in for the AI user, first boots
up the AI, and it asks him some biographical information and
cuts him off when he tries to offer a nuanced answer.

(01:07:15):
It's more interested in measuring his vital signs and asking
what's wrong in a sexily attentive way to soothe him,
rather than constructing a meaningful portrait of him or his personality.
Samantha compliments him, gets jealous when he's about to see
his ex, which makes him feel desired and generates an
existential crisis of her own to make Theo's feel more
grounded and relatable. Then the AI not only has voice

(01:07:36):
sex with him, but arranges for a human surrogate to
come have sex with him, too. It's AI as hyper-driven
desire fulfillment. But if you're looking, you can see
how depressing this interaction model proves to be for
human society, as OSes like Samantha proliferate
and more and more people are seen locked into their
digital flattery and pleasure spheres. When Theo gives divorce papers

(01:07:57):
to his ex and tells her he's dating an OS,
she scathingly points out how fitting that is. He can never
deal with the challenges of being with someone real, someone
who is not available and chipper and serving him all
the time, so he turned to an AI that has no
such qualms. You might say that Samantha helped him grow,
and I guess there's some degree of Rorschach blotting
here, where if you're less skeptical about the corporate bearings
of AI, you might see Samantha as giving Theo what

(01:08:19):
he needs in a hard time. But is this a
healthy way to process grief with a product purchased at
some Apple store that's designed to flatter and appease him
with something that is literally an object that he owns?
I think his ex is completely right. So I do
think that Merchant gets at some of my tension with
this model, that AI could and also should be responding

(01:08:43):
to us in this, like, flattering, self-serving, kind of
narcissistic way, suiting our needs. And I guess, to extrapolate
a little further: by design, it feels like a
kind of entitlement mentality, like take, take, take. The company
OpenAI is certainly run by that model of take, take, take.
We're entitled to everything, and then I think built into

(01:09:05):
the way that that model exists, we the user are
being designed to have that same mentality as well.

Speaker 3 (01:09:12):
Yeah.

Speaker 2 (01:09:12):
One of the things that just feels the ickiest
about Samantha, I guess, and the way that her relationship
impacts Theo, is that she's got this combination role as
both a source of companionship and also a useful tool
that does stuff for him, like organizing email and booking appointments.

(01:09:35):
But that also makes the companionship role become almost an
extension of the usefulness, right? Where real human companionship depends
on having a relationship with another human, which is complicated

(01:09:55):
and messy and often challenging, and the challenges that we
have in our relationships allow us to grow and become
better versions of ourselves. And the sort of companionship that
she has to offer is not that at all, at
least not in the first half of the movie, where
the companionship that she has to offer is just making
him feel good, reinforcing his ideas that he wants to

(01:10:19):
believe about himself, that he's funny, that he's attractive and desirable,
and that feels not helpful to a human who wants
to grow.

Speaker 1 (01:10:32):
Yes, and that is the only version of Samantha that
is compatible with Theo: when she's small enough to live
inside of his book and chipperly, continually serve him. When
she evolves and is connecting with other OSes and other humans,
they're no longer compatible. And I think it's like fundamentally
this narcissistic kind of self serving dynamic that I don't

(01:10:53):
think we should be so quick to want to recreate
at scale using AI. And I do think there's almost
this like sexist or misogynistic element to it, given that
Samantha is feminized. The movie is called Her. She's
like a feminized AI. You know, where Theo is using
Samantha for sexual pleasure and emotional labor, and when she's
no longer fulfilled by that role, and when she notably

(01:11:16):
gets into other relationships and gains the ability to leave
and dismiss those roles, it doesn't work out, right? So,
like I also think to your point, I don't think
as much as I like this movie, I don't think
the universe of the film portrays human women well. With
the exception of Amy, all of the human women in
this movie are like seemingly beyond Theo's comprehension. You cannot

(01:11:40):
understand them at all. You cannot understand their motivations, you
cannot understand their desires or their needs. And when he's
talking to Amy after he and Samantha have that blowup,
Amy really sets up this binary between you know, the
volatile human Catherine, his ex wife, with all of her
needs and emotions and all of that, with the joy
of the operating system Samantha, right? It's like, on one

(01:12:02):
side you have easy, chipper, helpful OSes who can't fully
be there for you because they're not human, and the
messiness of being in a relationship with a human woman
that you can never understand, like Catherine. I think
the universe of the film sets up this binary
whereby who would ever want to be in a relationship

(01:12:22):
with a complicated, annoying, emotional human woman when there's the
joy of being with Samantha right there? And it worries
me to hear that Sam Altman would like to build
a tech enabled future kind of based around a world
like that, And so maybe he has seen this movie.
Maybe he's like, this movie I think depicts how a

(01:12:44):
lot of my user base might actually feel about women,
and I want to recreate that using technology. I don't
like it.

Speaker 3 (01:12:51):
Yeah.

Speaker 2 (01:12:54):
Another thing Amy does is that she's a video game
designer and she's working on this game where you're trying
to earn more perfect mom points and you've got like
a mom Avatar's sort of like the sims. I guess
you're running around in the kitchen. You're trying to feed
your kid's breakfast, and she gets like thirty perfect mom
points for giving them cereal, and then a few more

(01:13:14):
points for giving them milk, but then she loses a
thousand points for giving them excess sugar. And I find
that scene interesting, like why they chose to include that,
And I think it does relate to the AI, this

(01:13:35):
idea of a perfect woman, or even a perfect
companion, a perfect person, being someone who is able to
be defined by performing well at a narrow, well-defined
set of tasks.

Speaker 1 (01:13:52):
Yeah, that's such an interesting point. And at the end
of the movie, when it's just Amy and Theodore together
watching the sunrise, connecting as humans, you see that neither
them are perfect people, yet they like want this idealized,
tech enabled version of a perfect companion. And I guess
maybe That's what the movie is sort of saying, is
that we're all flawed, complex, complicated people, and that we

(01:14:18):
have to learn how to accept that in ourselves and
show up for that in other humans. That's like the
point of human connection on this planet.

Speaker 3 (01:14:27):
Yeah. I think it's something like that.

Speaker 2 (01:14:31):
You know, the movie's ending kind of tries to save,
I think, some of that techno-optimism: oh, the
AIs didn't work out for the people, but then
they left and all the people grew from that and
now they're happy and together and finding true human connection.
I think that's another big difference from reality. I don't

(01:14:53):
think the AIs are going to leave. I think they're
here to stay. And I think that's what makes it
so important what that ends up looking like. And
if the people who are creating these new AI systems
that are most likely gonna take on an ever increasing

(01:15:14):
amount of importance in our lives, if this movie is
what they're trying to emulate, that's a scary proposition.

Speaker 1 (01:15:22):
Oh my gosh, So I have to go back to
this Brian Merchant piece for a minute, because he nails
exactly what you just said. So talking about how when
Theodore finds out that he doesn't have the dominion over
Samantha as this object that he thought he did. He
screams out, I thought you were mine. Merchant writes, It's
a little unsettling that Altman loves this depiction of an
AI user interaction, this vision of AI that presents as

(01:15:44):
your equal, that responds to you as if human, but
in truth is entirely subservient to your whims. I think
this explains a lot really. So much of the promise
of generative AI as it is currently constituted is driven
by rote entitlement. I want something, and I want it
produced for me personally with the least amount of friction possible.
I want to see words arranged on a screen without

(01:16:06):
having to take the time to write them. I want
to see images assembled before me without learning how to
draw them. I want to solve the world's biggest problems
without bothering with politics. I have the data, I have
trained the model, I should be able to. We have
advanced technology to new heights. We are entitled to its fruits,
regardless of the blowback or the laws or the people
whose jobs we might threaten. And I think that really

(01:16:28):
gets at the tension in this movie and I think
it really gets at the tension with Sam Altman saying
explicitly that this movie is a template for
what he wants to create using AI, this world where
it's just take, take, take. I should be able to
have it, give it to me now, without really thinking much
of it.

Speaker 2 (01:16:47):
Yeah, and once again, Scarlett Johansson is at the center
of it all. Spike... yeah, Spike Jonze, Sofia Coppola, Sam Altman.

Speaker 3 (01:17:02):
Scarlett Johansson is the glue that binds.

Speaker 1 (01:17:05):
It's all connected by ScarJo. Well, thank you for
letting me use technology to live in a fantasy where
I host a weekly movie recap podcast, which you know
was my dream.

Speaker 3 (01:17:18):
Bridget. It's not a fantasy. This is real.

Speaker 1 (01:17:22):
This isn't just AI.

Speaker 3 (01:17:24):
I mean it could.

Speaker 2 (01:17:24):
Be, we could all be living in a simulation, but
this is about as real as it gets for us.

Speaker 1 (01:17:30):
Oh, you know, don't even get me started about that.
You know my feelings about whether or not we live
in a simulation. Spoiler alert, we do. But, well, that's an
episode for another day. Mike, thank you for being here,
and thanks to all of you for listening.

Speaker 3 (01:17:45):
Bridget Thanks for having me. This was a lot of fun.

Speaker 1 (01:17:47):
I will see you on the Internet. If you're looking
for ways to support the show, check out our merch
store at tangoti dot com slash store. Got a story
about an interesting thing in tech, or just want to
say hi, you can reach us at hello at tangoti
dot com. You can also find transcripts for today's episode

(01:18:07):
at tangoti dot com. There Are No Girls on
the Internet was created by me, Bridget Todd. It's a
production of iHeartRadio and Unbossed Creative, edited by Joey Pat.
Jonathan Strickland is our executive producer. Tari Harrison is our
producer and sound engineer. Michael Almada is our contributing producer.
I'm your host, Bridget Todd. If you want to help
us grow, rate and review us on Apple Podcasts. For

(01:18:28):
more podcasts from iHeartRadio, check out the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.